Showing posts with label process.

Friday, March 23, 2012

GETDATE() overflowing datetime

I have a weird error that just started showing up. This process has run many times before, and just today it started erroring.

The error I get is:

There was an error with input column "dtInsertTime" (242) on input "OLE DB Destination Input" (146). The column status returned was: "Conversion failed because the data value overflowed the specified type.".

The weird thing is that the column is added to the data flow via a Derived Column transform just before the destination, and it's set to GETDATE(). The destination column for that field is datetime NOT NULL with the same name. I have literally hundreds of packages that do the same thing (add a column set to GETDATE() of type DT_DBTIMESTAMP going into a SQL 2005 datetime column) and have never run into this. It's frustrating: the job will run for a half hour inserting records just fine and then BAM, it fails.
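
For context, SQL Server's datetime type only accepts values between 1753-01-01 and 9999-12-31, so a genuine GETDATE() value can never overflow it; an overflow here suggests the destination received something other than the current date. A minimal sketch of that boundary, using .NET's SqlDateTime (which mirrors the datetime range):

using System;
using System.Data.SqlTypes;

class DatetimeRangeDemo
{
    static void Main()
    {
        // SqlDateTime mirrors SQL Server's datetime range.
        Console.WriteLine(SqlDateTime.MinValue); // 1/1/1753 12:00:00 AM
        Console.WriteLine(SqlDateTime.MaxValue); // 12/31/9999 11:59:59 PM

        try
        {
            // A zeroed or corrupted buffer can decode to a date below the
            // minimum, e.g. .NET's DateTime.MinValue (0001-01-01), which overflows:
            SqlDateTime bad = new SqlDateTime(DateTime.MinValue);
        }
        catch (SqlTypeException ex)
        {
            Console.WriteLine(ex.Message); // "SqlDateTime overflow. Must be between ..."
        }
    }
}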

I'm completely out of ideas... On most destinations I don't use the fast load option, so I am running it right now with fast load off to see if that makes a difference (other than making it slower). Previously I had fast load set to "keep nulls" and "table lock" but not "check constraints".

Edit: I'll have to abandon my test without fast load... my load times went from under 30 seconds to 2-3 minutes per set of records.

Chris Honcoop wrote:

I have a weird error that just started showing up. This process has run many times before, and just today it started erroring. [...]

That sounds strange. Can you use an error output on the destination component to capture the erroring records?

-Jamie

|||

I checked all my other packages that use fast load and noticed they all had "check constraints" turned on, so I turned it on for this package and it ran fine all night (still going). Very strange, but if anyone else sees this happen, try turning on constraint checking.
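
For anyone setting this programmatically rather than in the designer: the fast-load checkboxes map to custom properties on the OLE DB destination, with the options themselves collected in a comma-separated FastLoadOptions string. The working configuration above would correspond to something like the following (treat the exact values as a sketch to verify, not a tested setup):

FastLoadKeepNulls = True
FastLoadOptions = "TABLOCK,CHECK_CONSTRAINTS"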

Jamie, I'd love to - however, I am on a deadline to get this data processed. This process runs once a month, so hopefully next month I can give it a try and see if I can get some data on the failed rows. (That said, I started running this process in September and ran it in September and October without issue.)

|||

Chris, have you recently installed any service packs or made other changes?

Thanks
Mark

|||

Not any on SQL/SSIS. I don't have visibility into or control of the OS level. I do have some nonpublic patches for SSIS installed (provided to me by Microsoft) to combat a specific problem I encountered (memory corruption on packages with lots of sorts), but those patches have been installed since August.

FYI, this package is running on a four-processor, dual-core, 64-bit box with 16 GB RAM. For this package everything is local (the source DB, SSIS, SQL Server, and the destination DB).

|||

FYI, I found more strangeness related to this error: each time the package failed with it, it had also inserted thousands of rows of garbage data (15,000 per failure, to be exact). All int/decimal columns were 0 and all string columns were empty strings (some of those ints were lookup keys where 0 is not a possible value, and dates converted to ints came through as well). The only way I could trace these records to this failure is that the insert time was correctly populated - I went back through my audit history and matched the insert time to the failure time of the job.

So if this happens to you, be on the lookout for this!

It almost appears that SSIS somehow got ahead of itself - inserting rows before it had actually finished reading the data from the source, populating dtInsertTime, and so on...

Honestly, this is quite scary...

|||

We have been getting random errors with that message too. We are copying records from a 64-bit SQL 2005 Enterprise Edition server to a 64-bit Standard Edition server and are stumped as to why it would happen. The column it occurs on has the same value for all rows. We can run the package again and not have any problems.


Monday, March 12, 2012

Get the variable from Execute Process Task to C#

Hi!

I need help with some C# code. I have built an SSIS package with an Execute Process Task. I need to send dynamic values into my C# program, so I thought it was a good idea to use the StandardInputVariable.

How do I get the variable in my C# code?
Thanks

Carl

|||

Your Main method's parameter collection?

|||

Peter K wrote:

Your Main method's parameter collection?

Yes, provided that this functionality has been built into the C# code.

|||

Thanks for your help.

I tried to get the variable through the Main method, but it doesn't work; the only thing I got was the argument.
My test code:
static void Main(string[] args)
{
    // Echo each command-line argument.
    for (int i = 0; i < args.Length; i++)
    {
        Console.WriteLine(args[i]);
        Console.ReadLine();
    }
}

Carl

|||

ctsand wrote:

Thanks for your help. I tried to get the variable through the Main method, but it doesn't work; the only thing I got was the argument. [...]

Did you include the variable as an argument to your C# executable in the Execute Process Task?

|||

I have a static argument and a variable in StandardInputVariable. I put a value in the variable for testing, but it will be dynamic.

|||

ctsand wrote:

I have a static argument and a variable in StandardInputVariable. I put a value in the variable for testing, but it will be dynamic.

Is this necessary? Can't you just use a dynamically updated SSIS variable when calling your executable in the Execute Process Task?

|||

My intention was to have the argument make the code jump to a specific method. The variable will carry information about which rows in the table the code should read and process.

In any case, can I put a dynamic variable in the argument?

Carl
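
As an aside, the Arguments property of the Execute Process Task can itself be driven by a property expression, so a dynamic value can be passed on the command line instead of through standard input. A sketch of such an expression (assuming a string variable User::test):

Arguments = "fakt " + @[User::test]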

|||

The code below shows how to execute a package from managed code; the lines that assign the variable near the end show how to edit it dynamically. I think this method can get the variable, but I didn't try it - please try it. (Note the sample is VB.NET.)

' Add a reference to "Microsoft.SqlServer.ManagedDTS" (in Microsoft.SqlServer.ManagedDTS.dll).
Imports Microsoft.SqlServer.Dts.Runtime

Dim pkg As String = "package path"

Dim app As Application = New Application()
Dim p As Package = app.LoadPackage(pkg, Nothing)
p.InteractiveMode = True

' Dump the package properties and their values.
Dim pty As DtsProperty
Dim strPty As String
For Each pty In p.Properties
    strPty = pty.Name & ":"
    Try
        If pty.Get Then strPty &= pty.GetValue(pty).ToString()
    Catch
    End Try
    Console.WriteLine(strPty)
Next

' Dynamically set a package variable, then execute the package with it.
Dim vir As Variables = p.Variables
vir("strFile").Value = "C:\MyApp2.txt"

Console.WriteLine(p.Execute(Nothing, vir, Nothing, Nothing, Nothing).ToString())
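
Note that this sample sets the variable from the parent side - a .NET host loading and executing the package - rather than reading a variable from inside a program launched by an Execute Process Task, which is what the original question asks about.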

|||

ctsand wrote:

My intention was to have the argument make the code jump to a specific method. The variable will carry information about which rows in the table the code should read and process.

understood

In any case, can I put a dynamic variable in the argument?

Carl

I don't know. Did you try?

|||

I tried it, but it didn't work. The only thing I got was the name of the variable.

You wrote earlier:

Is this necessary? Can't you just use a dynamically updated SSIS variable when calling your executable in the Execute Process Task?

What did you mean by that?

|||

ctsand wrote:

I tried it, but it didn't work. The only thing I got was the name of the variable. [...]

What are the option settings on the Process page of the Execute Process Task editor?

|||

RequireFullFileName = True

Executable = M:\Program\Person.exe

Arguments = fakt

WorkingDirectory = M:\Program

StandardInputVariable = User::test

StandardOutputVariable =

StandardErrorVariable =

FailTaskIfReturnCodeIsNotSuccessValue = True

SuccessValue = 0

TimeOut = 0

TerminateProcessAfterTimeOut = True

WindowStyle = Normal
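
Those settings point at the answer: StandardInputVariable pipes the value of User::test to the process's standard input, not to its argument list, so the receiving program has to read stdin rather than look in args. A minimal sketch of the receiving side (the class and output format here are assumptions; the stdin/argument split is the point):

using System;

class Person
{
    static void Main(string[] args)
    {
        // The task's Arguments property ("fakt") arrives as a command-line argument.
        string mode = args.Length > 0 ? args[0] : "";

        // The value of User::test arrives on standard input,
        // because it is supplied via StandardInputVariable.
        string input = Console.In.ReadToEnd();

        Console.WriteLine("Argument: " + mode);
        Console.WriteLine("Stdin: " + input);
    }
}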

|||

What is the value and data type of User::test immediately before the Execute Process Task starts executing? Is the value of this variable correct?

|||

The data type is String and the value is "testing".

I have a question for you.

Do you mean that I should use the Main method to get the argument and the variable?

Carl