3. If our source contains 1 terabyte of data, what should we keep in mind while loading the data into the target?

Answers were sorted based on users' feedback




Answer / chinni

1 TB is a huge amount of data, so a normal load takes a very long time. To overcome this it is better to use bulk loading, and setting a large commit interval also gives good performance. One more technique is external loading.


Answer / james

I agree with you, Chinni.
But my suggestion is that instead of loading the huge amount of data at once, we can split the source file using Unix, then extract the data from all the resulting source files using the indirect file type and load it into the target.

Please let me know if you have any suggestions.

Thanks.
James.
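The split-and-indirect approach above can be sketched as a small shell script. This is a minimal demonstration with assumed file names (`big_source.dat`, `part_*`, `filelist.txt`); the file list it produces is what an Informatica session with source filetype "Indirect" would point at.

```shell
#!/bin/sh
# Sketch (assumed file names): split a large source file into
# smaller chunks and build a file list for Informatica's
# "Indirect" source file type.

# Create a small sample source file for demonstration.
printf '%s\n' "row1" "row2" "row3" "row4" > big_source.dat

# Split into chunks of 2 lines each, named part_aa, part_ab, ...
split -l 2 big_source.dat part_

# The indirect file lists one chunk file name per line;
# the session then reads every file named in the list.
ls part_* > filelist.txt

cat filelist.txt
```

In a real 1 TB scenario the chunk size would be far larger (e.g. `split -l 10000000`), and each chunk could also feed a separate session partition.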


Answer / vasu

Partitioning is always the best option for huge data volumes.


Answer / chiranjeevi k

We can use pushdown optimization to handle the huge volume of data.

Thanks
Chiranjeevi


Answer / bakshu shaik

One more thing we need to take into consideration, along with the above suggestions:

If the target has an index, loading a huge amount of data takes much longer and hurts performance. So it is always better to drop the index before loading (for example, via a Stored Procedure transformation or a pre-session SQL command) and recreate it after the load completes.
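As a sketch of the drop/recreate step, the script below generates the pre- and post-session SQL files such a session would run. The index and table names (`idx_sales_cust_id`, `sales`, `cust_id`) are assumptions for illustration only.

```shell
#!/bin/sh
# Sketch (assumed index/table names): generate the pre- and
# post-session SQL used to drop an index before a bulk load
# and recreate it afterwards.

cat > pre_load.sql <<'EOF'
-- Run before the load: dropping the index avoids per-row
-- index maintenance during the bulk insert.
DROP INDEX idx_sales_cust_id;
EOF

cat > post_load.sql <<'EOF'
-- Run after the load completes: rebuild the index once.
CREATE INDEX idx_sales_cust_id ON sales (cust_id);
EOF

cat pre_load.sql post_load.sql
```

Rebuilding the index once after the load is far cheaper than maintaining it row by row across a terabyte of inserts.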


Answer / sudheer113

Along with the above suggestions:

It is better to use partitioning (single or multiple threads) and increase the commit interval to around 50 lakh (5 million) rows. It can also help to load into a flat file first instead of loading directly into a table; then, if your database is Oracle, use SQL*Loader to load the data from the file into the table, and if it is DB2, use the DB2 load utility.
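A minimal SQL*Loader setup for that file-to-table step might look like the following. The table, column, and file names are assumptions; the `sqlldr` invocation is shown commented out since it needs an Oracle client and credentials.

```shell
#!/bin/sh
# Sketch (assumed table/file names): a minimal SQL*Loader setup
# for bulk-loading a flat file into an Oracle table.

# Control file describing the flat file layout.
cat > load_sales.ctl <<'EOF'
LOAD DATA
INFILE 'sales.dat'
APPEND
INTO TABLE sales
FIELDS TERMINATED BY ','
(cust_id, cust_name, amount)
EOF

# A direct-path load (direct=true) bypasses the SQL engine,
# which is what makes SQL*Loader fast for huge volumes.
# sqlldr userid=scott/tiger control=load_sales.ctl direct=true

cat load_sales.ctl
```

The same pattern applies to DB2 with its `LOAD` utility in place of `sqlldr`.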


Answer / upendra

There is a large amount of data in the source file, so we can divide it into partitions, set the commit interval as we wish, and connect the partitions to different target tables; each target table is then better loaded with bulk load.
