While debugging the other problems, I noticed that the Peek stage output spaces for a CHAR column even though the very same value written to the database was NULL. A surprising twist...
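A quick illustration of why this is easy to miss: in fixed-width CHAR output, a NULL rendered as blanks and a genuinely space-padded value print identically. The sketch below is plain Python, not DataStage; the function names are hypothetical and only mimic peek-style formatting.

```python
# Hypothetical peek-style formatter: pads CHAR(5) values with spaces,
# so NULL (None) and an empty string render identically.
def peek_char(value, width=5):
    text = "" if value is None else value
    return text.ljust(width)

# Explicit variant that keeps NULL distinguishable from blanks.
def peek_char_explicit(value, width=5):
    return "<NULL>" if value is None else value.ljust(width)

print(repr(peek_char(None)))           # padded blanks
print(repr(peek_char("")))             # identical padded blanks
print(repr(peek_char_explicit(None)))  # '<NULL>'
```

Printing an explicit marker for NULL is the simplest way to tell the two cases apart when peeking at data.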
Actually, this warning is quite simple to fix. Apparently, DataStage expects the last record of a file to terminate with the record delimiter in any case - maybe there is a setting to alter this behaviour, but I did not deem the effort of looking for it worthwhile. So the solution is simply to terminate the file with a single empty line. Obviously, this case covers Unix-like line breaks; the other delimiters are handled analogously.
Well, you checked and double-checked the existence of the respective field, and it is there! Yeah, I stumbled over it again. And again!

Possible solutions

You need to compile jobs explicitly. You're done then. Try it. Go! What are you waiting for?

If you are really, extra sure you compiled the job, then double-check the presence of the respective field again - but not in the stage DataStage points you to; check the very source at the start of the dataflow where this field comes from. This happens if the column name in your SQL is not identical to the propagated field name. I absolutely do not know why: compilation does not reveal the problem, and DataStage misses the correct problem location by far.
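Such a mismatch between SQL column names and propagated field names can be caught with a quick check outside DataStage. The sketch below is a naive, hypothetical helper (the parser only handles simple `SELECT a, b AS c FROM t` lists); the field and table names are made up:

```python
import re

# Extract the column names (or aliases) a simple SELECT statement exposes,
# to compare them against the field names the downstream stages expect.
def select_columns(sql):
    cols = re.search(r"select\s+(.*?)\s+from", sql, re.I | re.S).group(1)
    names = []
    for item in cols.split(","):
        parts = item.strip().split()
        # use the alias if present, else the last dotted part of the column
        name = parts[-1] if len(parts) > 1 else parts[0].split(".")[-1]
        names.append(name.upper())
    return names

expected = ["CUST_ID", "CUST_NAME"]  # fields propagated in the job
sql = "SELECT c.id AS cust_id, c.name FROM customers c"
missing = [f for f in expected if f not in select_columns(sql)]
print(missing)  # the SQL exposes NAME, not CUST_NAME
```

A mismatch reported here is exactly the situation where compilation stays silent but the job later complains about a missing field.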
I came across this when I wanted to tighten the use of jobs by using lists in job parameters. DataStage is a bit funny there: a list is apparently a data type of its own, even though a list can only hold strings. One cannot pass the value of a list parameter to a list parameter of a contained job. If the latter is true, how is it possible to run into this problem at all? Well, it is possible if one changes the data type of a parameter of a contained job while that parameter has been fed by the value of a list parameter.