Friday, March 23, 2012

GETDATE() overflowing datetime

I have a weird error that just started showing up. This process has run many times before, and just today it started failing.

The error I get is:

There was an error with input column "dtInsertTime" (242) on input "OLE DB Destination Input" (146). The column status returned was: "Conversion failed because the data value overflowed the specified type.".

The weird thing is that the column is added to the data flow via a derived column just before the destination, and it's set to GETDATE(). The destination column for that field is datetime not null with the same name. I have literally hundreds of packages that do the same thing (add a column set to GETDATE() of type DT_DBTIMESTAMP going into a SQL 2005 datetime column) and have never run into this. It's frustrating: the job will run for half an hour inserting records just fine and then BAM, it fails.
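For context, SQL Server's datetime type only covers January 1, 1753 through December 31, 9999, so this overflow error means the value arriving in the buffer fell outside that range - something GETDATE() by itself can never produce. A quick T-SQL sketch of the boundary:

    -- datetime rejects anything before 1753-01-01 or after 9999-12-31
    DECLARE @d DATETIME;
    SET @d = CAST('9999-12-31' AS DATETIME);  -- fine: the upper bound
    SET @d = CAST('1752-12-31' AS DATETIME);  -- fails: out-of-range value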

I'm completely out of ideas... On most destinations I don't use the fast load option, so I am running it right now with fast load off to see if that makes a difference (other than making it slower). Previously I had it set to "keep nulls" and "table lock" but not "check constraints".
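For reference, those fast-load checkboxes correspond to bulk-load hints on the insert the OLE DB destination performs under the covers - roughly the same options a BULK INSERT statement takes (the table and file path below are made up for illustration):

    -- the fast-load checkboxes map to bulk-load hints like these
    BULK INSERT dbo.DestTable
    FROM 'C:\staging\rows.dat'
    WITH (
        TABLOCK,           -- "Table lock"
        KEEPNULLS,         -- "Keep nulls"
        CHECK_CONSTRAINTS  -- "Check constraints"
    );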

Edit: I'll have to abandon my test without fast load... my load times went from under 30 seconds to 2-3 minutes per set of records.

|||

That sounds strange. Can you use an error output on the destination component to capture the erroring records?
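SSIS appends ErrorCode and ErrorColumn columns to rows on an error output, so redirecting them into a holding table would show exactly which rows and which column are failing. A sketch, with the data columns as placeholders for whatever your flow carries:

    -- hypothetical catch table for rows redirected from the destination's error output
    CREATE TABLE dbo.LoadErrors (
        ErrorCode    INT,      -- added by SSIS on error outputs
        ErrorColumn  INT,      -- lineage ID of the offending column
        dtInsertTime DATETIME  -- plus the rest of the data flow's columns
    );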

-Jamie

|||

I checked all my other packages with fast load and noticed the others all had "check constraints" turned on, so I turned it on for this package and it ran fine all night (still going). Very strange, but at least now if someone else sees this happen, they can try turning on constraint checking.
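If you set this from the Advanced Editor's property grid rather than the destination editor, the checkboxes land in the destination's custom properties; mine ended up looking roughly like this (a sketch from memory):

    AccessMode        = OpenRowset Using FastLoad
    FastLoadKeepNulls = True
    FastLoadOptions   = TABLOCK,CHECK_CONSTRAINTS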

Jamie, I'd love to - however, I am on a deadline to get this data processed. This process runs once a month, so hopefully next month I can give it a try and see if I can get some data on the failed rows. (That said, I started running this process in September and ran it in September and October without issue.)

|||

Chris, have you recently installed any service packs or made other changes?

Thanks
Mark

|||

Not any on SQL/SSIS. I don't have visibility into or control of the OS level. I do have some nonpublic patches for SSIS installed (provided to me by Microsoft) to combat a specific problem I encountered (memory corruption on packages with lots of sorts), but those patches have been installed since August.

FYI, this package is running on a four-processor, dual-core 64-bit box with 16 GB of RAM. For this package everything is local (the source DB, SSIS, SQL Server, and the destination DB).

|||

FYI, I found more strangeness in regards to this error... when the package failed with that error, it had also inserted thousands of rows of garbage data (15,000 per failure, to be exact): all int/decimal columns were 0 and all string columns were empty strings (some of those ints were lookups where 0 is not a possible lookup value, and dates converted to ints were affected as well). The only way I could trace these records to this failure is that the insert time was correctly populated - I went back through my audit history and matched the insert time to the failure time of the job.
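If you need to hunt for the same thing, a query along these lines is how I'd look for the garbage rows (the table and column names are placeholders for your own):

    -- find suspect rows by matching the insert time to the job's failure window
    DECLARE @failStart DATETIME, @failEnd DATETIME;
    SET @failStart = '2012-03-23 02:00';  -- failure time from the audit history
    SET @failEnd   = '2012-03-23 02:05';

    SELECT *
    FROM dbo.DestTable
    WHERE dtInsertTime BETWEEN @failStart AND @failEnd
      AND SomeIntLookup = 0      -- a lookup where 0 is impossible
      AND SomeStringCol = '';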

So if this happens to you, be on the lookout for this!!!

It almost appears that SSIS somehow got ahead of itself, inserting rows before it had actually finished getting the data from the source, populating dtInsertTime, and so on.

Honestly, this is quite scary...

|||

We have been getting random errors with that message too. We are copying records from a 64-bit SQL 2005 Enterprise Edition server to a 64-bit Standard Edition server and are stumped as to why it would happen. The column it occurs on has the same value for all rows. We can run the package again and not have any problems.
