

If our source contains 1 terabyte of data, what are the things we should keep in mind while loading the data into the target?





Answer / chinni

1 TB is a huge amount of data, so with the normal load type it takes a very long time. To overcome this, it is better to use bulk loading. Taking large commit intervals also gives good performance, and one more technique is external loading.
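For illustration, here is a minimal sketch of what bulk loading with a large commit interval means at the database level, assuming an Oracle target and made-up table names; in PowerCenter itself these are session settings (Target load type = Bulk, Commit Interval) rather than code:

# Hypothetical names throughout. The APPEND hint makes this a
# direct-path ("bulk") insert, and there is a single commit at the
# end rather than one every few rows.
sqlplus -s "$DB_USER/$DB_PASS" <<'SQL'
INSERT /*+ APPEND */ INTO tgt_big SELECT * FROM src_big;
COMMIT;
SQL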


Answer / james

I agree with you, Chinni.
But my suggestion is that instead of loading the huge amount of data in one go, we can split the source (if it is a file) using Unix, then read all of the split files with the Indirect source filetype and load them into the target.
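As a sketch of the Unix side of this (GNU split; all filenames are made up), the split plus the file list that a session with the Indirect source filetype reads could look like:

# Cut the 1 TB extract into ~4 GB pieces; -C avoids splitting a
# line (record) across two files. Filenames are hypothetical.
split -C 4G --numeric-suffixes source_1tb.dat chunk_

# The file list an Informatica session reads when its source
# filetype is set to "Indirect":
ls chunk_* > source_filelist.txt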

Please let me know if you have any suggestions.

Thanks.
James.


Answer / vasu

Partitioning is always the best option for huge volumes.
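For illustration only (inside PowerCenter you would instead add partition points in the session), the same idea can be sketched in Unix by loading the split chunks from the earlier answer in parallel; sqlldr and its parameters are real, but the credentials, control file, and filenames are placeholders:

# One loader per chunk, all running at once; parallel=true allows
# several direct-path loads to write the same table concurrently.
# load.ctl is a SQL*Loader control file like the one sketched further down.
for f in chunk_*; do
  sqlldr "$DB_USER/$DB_PASS" control=load.ctl data="$f" direct=true parallel=true &
done
wait   # continue only after every parallel load has finished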


Answer / chiranjeevi k

We can use pushdown optimization to handle the huge volume of data.

Thanks
Chiranjeevi
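With pushdown optimization the Integration Service translates the mapping logic into SQL and lets the database run it, so the rows are transformed where they already sit instead of travelling through the ETL server. A hand-written sketch of the kind of statement full pushdown would generate, with hypothetical table and column names:

# All names are hypothetical; the point is that filter and expression
# logic execute inside the database, not in the Integration Service.
sqlplus -s "$DB_USER/$DB_PASS" <<'SQL'
INSERT INTO tgt_orders (order_id, cust_name, amount_usd)
SELECT order_id,
       UPPER(cust_name),          -- Expression transformation
       amount * fx_rate           -- Expression transformation
FROM   src_orders
WHERE  order_status = 'CLOSED';   -- Filter transformation
COMMIT;
SQL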


Answer / bakshu shaik

One more thing we need to take into consideration, along with the above suggestions:

If the target has an index, load time for a huge amount of data increases and performance suffers, so it is always better to drop the index before loading (for example, through a Stored Procedure transformation) and create it again after the load completes.
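A minimal sketch of that pattern with hypothetical index and table names; in practice the two calls would typically sit in the session's pre- and post-SQL or, as noted above, in a Stored Procedure transformation:

# Hypothetical names. Drop the index first so the load does not
# have to maintain it row by row:
sqlplus -s "$DB_USER/$DB_PASS" <<'SQL'
DROP INDEX tgt_big_ix;
SQL

# ... run the load here (session, sqlldr, etc.) ...

# Rebuild once, after all the rows are in:
sqlplus -s "$DB_USER/$DB_PASS" <<'SQL'
CREATE INDEX tgt_big_ix ON tgt_big (id);
SQL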


Answer / sudheer113

Along with the above suggestions:

It is better to use partitioning (single or multiple threads) and to increase the commit interval, for example to 50 lakh (5 million) rows. It is also better to load into a flat file first instead of straight into the table; then, if your database is Oracle, use SQL*Loader to load the data from the file into the table. If it is DB2, use the DB2 load utility.
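A minimal sketch of the Oracle route described here, with hypothetical file, table, and column names; in direct-path mode, rows= is SQL*Loader's data-save interval, which plays the role of the 50-lakh commit level mentioned above:

# A minimal SQL*Loader control file (hypothetical names):
cat > load.ctl <<'CTL'
LOAD DATA
INFILE 'chunk_00'
APPEND
INTO TABLE tgt_big
FIELDS TERMINATED BY ','
(id, cust_name, amount)
CTL

# Direct-path run with a 50-lakh (5,000,000-row) data-save interval:
sqlldr "$DB_USER/$DB_PASS" control=load.ctl direct=true rows=5000000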


Answer / upendra

There is a large amount of data in the source file, so we can divide it into partitions, set the commit interval as we wish, and connect the pipelines to different target tables; each target table then works better with bulk load.

