When the load type is selected as Bulk or Normal at the session level, what is the internal process? Please explain with an example.
Answer Posted / skg
A: Normal load: when you run a session with Normal load, the Integration Service (IS) commits data into the target table and also commits the row ID into a log table.
For example: you have 10,000 records in your source. If the session fails at any point while loading data, there is scope for recovery. After you fix the issue and rerun the job, you need to ensure the session still runs with Normal load; instead of loading from scratch, the IS connects to the Repository Service, which connects to the repository database, checks the last row ID committed in the log table, and starts loading from max row ID + 1 onwards.
Checkpoint at the session level: Resume from Last Checkpoint.
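For illustration only, here is a rough SQL sketch of the recovery lookup described above. The table and column names (recovery_log, row_id, session_id) are hypothetical; the actual repository tables and recovery mechanism the Integration Service uses differ.

    -- hypothetical log table tracking the last committed row for a session
    SELECT MAX(row_id) AS last_committed_row
    FROM recovery_log
    WHERE session_id = 1234;
    -- the IS would then resume loading from last_committed_row + 1 onwards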
Bulk load: when you run a session with Bulk load, the IS loads the data into the target table but does not commit any row ID into the log table. Performance is better because only one operation happens here, but if the session fails at any point, there is no scope for recovery.
Note: when you work with Bulk load, you need to ensure there are no indexes on the target table; if it has indexes, your session will fail.
Q: I want to use Bulk load but I have indexes on the target table. Can I still use it?
A: Yes, you can achieve this by setting the following options on the target:
Pre-SQL: DROP INDEX index_name;
Post-SQL: CREATE INDEX index_name ON table_name (column_name);
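For example, assuming an Oracle-style target with a hypothetical index idx_cust_id on a table tgt_customer (the index, table, and column names here are illustrative, not from the original answer), the target session properties could look like this:

    -- Pre-SQL: runs before the bulk load so the target has no index during the load
    DROP INDEX idx_cust_id;
    -- Post-SQL: runs after the load completes and rebuilds the index
    CREATE INDEX idx_cust_id ON tgt_customer (customer_id);

Note that DROP INDEX syntax varies by database (for example, SQL Server requires DROP INDEX idx_cust_id ON tgt_customer), so adjust the statements for your target database.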
Please let me know how to perform encryption and decryption, with an example?
What is meant by target load plan?
What are deployment groups in Informatica, and how are they used by developers?
Design a mapping to load the cumulative sum of salaries of employees into target table?
How can informatica be used for an organization?
How to load data in Informatica?
What is a rank transform?
How does a rank transform differ from aggregator transform functions max and min?
What are junk dimensions?
What is decode in static cache?
What is a standalone command task?
What are the various test procedures used to check whether the data is loaded in the backend, performance of the mapping, and quality of the data loaded in informatica?
What is aggregator transformation in informatica?
How to load the data from a flat file into the target where the source flat file name changes daily?
Can I use the same persistent cache (X.dat) for 2 sessions running in parallel? If it is not possible, why? If yes, how?