How can we speed up the loading of 100 million distinct records in Informatica?
(Initially they are loaded into the target without using any transformation, taking 2 hours.)
Answer Posted / nitin tomer
1) Load the data using bulk load. For that we have to drop the indexes on the target table, but recreating the indexes after the load will also take a good amount of time.
2) Create a sequence from 1 to 100 million using a Sequence Generator and create pass-through partitions to load the data. For example, we can create 10 partitions, each covering 10 million keys (1 to 10,000,000, and so on).
Partitioning will definitely give a huge performance improvement.
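The key-range arithmetic behind step 2 can be sketched as below. This is an illustrative helper, not Informatica code: the function name `partition_ranges` and the column name `seq_key` are assumptions, and in practice each range would become the filter or SQL override for one pass-through partition in the session.

```python
def partition_ranges(total_rows, num_partitions):
    """Split the key space 1..total_rows into contiguous ranges,
    one per partition."""
    size = total_rows // num_partitions
    ranges = []
    start = 1
    for i in range(num_partitions):
        # the last partition absorbs any remainder
        end = total_rows if i == num_partitions - 1 else start + size - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

# Each range maps to one partition's source filter, e.g.
# "WHERE seq_key BETWEEN 1 AND 10000000" (column name assumed).
for lo, hi in partition_ranges(100_000_000, 10):
    print(f"WHERE seq_key BETWEEN {lo} AND {hi}")
```

With 10 partitions reading disjoint 10-million-key ranges in parallel, the total load time is bounded by the slowest partition rather than the full scan.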
What is the function of look up transformation?
Can you use flat files in mapplets?
Suppose on 1st Nov 2010 you created a mapping that includes huge aggregator calculations, and it is still processing two days later. On the 3rd day you notice it is still calculating. Without changing the logic or the mapping, how will you troubleshoot or run that mapping? Explain the steps.
What are the different types of transformations available in Informatica, and which are the most commonly used among them?
What is a repository manager?
What is mapping debugger?
Which transformation is needed while using COBOL sources as source definitions?
What are the performance considerations when working with aggregator transformation?
Write the unconnected lookup syntax and how to return more than one column.
Explain the informatica workflow?
What happens if you have 3 ports in the Source Qualifier and 4 ports in its SQL override (provided all ports are in the same order and connected to the proper source and target)? Also, what happens when I have 4 ports but extract only 3 values in the SQL override? What will be the value in the 4th port?
What are the guidelines to be followed while using union transformation?
What are the prerequisite tasks to achieve session partitioning?
What is complex mapping?