My question is: I have created two sessions for one mapping. My requirement is that if the number of source records matches the number of records loaded into the target, the first session should execute; if some rows were rejected due to transformation logic, the second session should execute instead. Please clarify.
This is the approach we use for audit balancing in our project. At the start of the load, capture the source row count and a checksum of a field and write them to a file. After the load completes, count the rows in the target and write that value to a second file. Once the target-load session is over, use a Command task to run a script that compares the two files: if the counts match, the script succeeds; otherwise it fails. In the workflow link conditions, check the task status, for example session.Status = SUCCEEDED to take the first branch and session.Status = FAILED to take the second.
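The compare step in the Command task above can be sketched as a small script. This is a minimal sketch under assumptions not stated in the answer: the file names (`src_count.txt`, `tgt_count.txt`) and the format of one integer row count per file are hypothetical.

```python
# Minimal sketch of the audit-balance check a Command task could run.
# Assumptions (not from the original answer): each file holds a single
# integer row count written by the earlier session steps.

def counts_match(src_file: str, tgt_file: str) -> bool:
    """Return True when the row counts recorded in the two files agree."""
    with open(src_file) as f:
        src_count = int(f.read().strip())
    with open(tgt_file) as f:
        tgt_count = int(f.read().strip())
    return src_count == tgt_count

# In the actual Command task the script would exit 0 on a match (task
# succeeds) and non-zero otherwise (task fails), which is what the
# workflow link conditions then evaluate:
#   sys.exit(0 if counts_match("src_count.txt", "tgt_count.txt") else 1)
```

The exit code, not the printed output, is what the Command task reports back to the workflow, so the success/failure link conditions fire from it.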
How do your source files arrive on your ETL server? At which stage of your mapping does this happen?
An Aggregator transformation has fields a, b, c, d, e, with Group By enabled on a, b, c and Sorted Input checked. How does the Aggregator transformation process the input data, i.e., in what order does the input reach the Aggregator?
What is a repository manager?
What is the difference between Connected and Unconnected Lookup transformations? Please give one or two examples.
How to generate sequence numbers?
What are shared lookup and persistent lookup caches?
I have 10 flat files as source, all named abc.txt but with different timestamps, and I need to load them into an Oracle target table. The job fails mid-execution and some rows are not loaded into the target. How can I get those rows loaded into the target even though the job failed?
Is a snowflake or a star schema used? If a star schema, why?
Are there Informatica and data warehousing courses in Pune?
What is the function of union transformation?
How do you split dimensions into sub-dimensions?
How can you load data into a target table without leading zeros?