Friday, August 12, 2005

Before Insert

One of my developers came to me with this problem. He is trying to manipulate the data coming into a table by using an INSERT TRIGGER. For the life of me, I can't figure out why this doesn't work (I must be missing something simple):

SQL> drop table xyz
2 /

Table dropped.

SQL> create table xyz
2 (
3 x number(10),
4 y varchar2(20),
5 z varchar2(4))
6 /

Table created.

SQL>
SQL> create or replace trigger xyz_bi
2 before insert on xyz
3 for each row
4 begin
5
6 :new.z := '777';
7
8 end;
9 /

Trigger created.

SQL>
SQL> insert into xyz values (1, '123456789','123');

1 row created.

SQL> commit;

Commit complete.

SQL> select * from xyz;

         X Y                    Z
---------- -------------------- ----
         1 123456789            777

SQL>

OK, so far so good. The trigger populated the Z value like I expected. However, when I pass in a string that is longer than the Z field, I get:

SQL> insert into xyz values (1, '123456789','123456789');
insert into xyz values (1, '123456789','123456789')
*
ERROR at line 1:
ORA-12899: value too large for column "JEFFH"."XYZ"."Z" (actual: 9, maximum: 4)


SQL>
SQL> commit;

Commit complete.

I can't explain why, with a BEFORE INSERT trigger, Oracle would care about the length of the string before the trigger even fires. Any hints?

7 comments:

Anonymous said...

The :NEW and :OLD values are constrained by the table definition. To see this in action, change the trigger code from ... '777' to '77777'.
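A quick sketch of that experiment, reusing the xyz table from the post (the exact run-time error is my assumption):

```sql
create or replace trigger xyz_bi
before insert on xyz
for each row
begin
  :new.z := '77777';  -- five characters, but :new.z is typed varchar2(4)
end;
/

-- The trigger still compiles; the failure only shows up at run time,
-- when the assignment to the constrained :new.z field is attempted:
insert into xyz values (1, 'abc', 'x');
-- expect something like ORA-06502: PL/SQL: numeric or value error
```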

Heath Sheehan said...

Row-level triggers fire once for each row affected by the triggering statement. That implies the row's values have to be valid for their columns before the trigger fires: any explicit or implicit data conversion has to produce a value that fits the column's datatype.

If the data isn't valid, the row isn't inserted and there's nothing for which the trigger should fire.
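One way to see that the trigger never fires for the failing insert is a sketch reusing the xyz table above, with a diagnostic message added to the trigger body:

```sql
set serveroutput on

create or replace trigger xyz_bi
before insert on xyz
for each row
begin
  dbms_output.put_line('xyz_bi fired');  -- prove whether we ever get here
  :new.z := '777';
end;
/

insert into xyz values (1, '123456789', '123456789');
-- ORA-12899 is raised and "xyz_bi fired" never prints: the incoming
-- value is bound to :new.z (a varchar2(4)) before the trigger body runs.
```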

Jeff Hunter said...

Heath Sheehan said...
See part II

Jeff Hunter said...

Anonymous said...

The :NEW and :OLD values are constrained by the table definition. To see this in action, change the trigger code from ... '777' to '77777'.

Well, that I could see, although the trigger is still created.

Gary Myers said...

On deeper thinking, it isn't so surprising.
Firstly, we'd expect the trigger to 'see' the correct datatype.
For example, if you insert a value of '01-jan-2000' into a DATE column, you'd expect the implicit character-to-date conversion to happen before the trigger fires.
The same goes for casting to a specific number precision.

set serveroutput on size 10000

create table test_a
(col_1 number, col_2 number(3,2));

create or replace trigger x_test_a before insert on test_a for each row
begin
dbms_output.put_line(:new.col_2);
end;
.
/

insert into test_a values (1,1.234);

You'll see the value 1.23 displayed.
The data type of the variable :NEW.COL_2 doesn't allow the extra digit, so the value is rounded to fit when it is assigned.

Personally, I don't like the thought of a trigger where someone specifies a value in an INSERT and it gets overridden. If the value is valid it should go in the table, and if not it should be rejected. It's a shame there's no way to avoid it when an INSERT explicitly specifies a NULL for a column value.
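A sketch of the compromise Gary alludes to, assuming the xyz table from the post: default Z only when it arrives NULL, so a valid explicit value is never silently overridden. As he notes, this still can't tell an explicit NULL in the INSERT from an omitted column.

```sql
create or replace trigger xyz_bi
before insert on xyz
for each row
begin
  -- only fill in the default when no value was supplied;
  -- a valid explicit value is left alone
  :new.z := nvl(:new.z, '777');
end;
/
```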

Jeff Hunter said...

Gary said...
Personally, I don't like the thought of a trigger where someone specifies a value in an INSERT and it gets overridden.

Nor do I, but this was a case of someone trying to put bad data into a table and I thought a trigger could fix it to make it valid.

Pradeep said...

I think this is happening during the parse stage in memory... a very, very wild guess...