How do you estimate the number of partitions that a mapping really requires? Is it dependent on the machine configuration?
Answer Posted / tcs
There is no particular rule that says a mapping has to be
partitioned, but we can partition it to improve performance. Yes, it
depends on the machine configuration, since the system memory is
used to run those partitioned sessions.
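A rough back-of-the-envelope way to see why the machine configuration matters is to bound the partition count by both CPU cores and available memory. The sketch below is illustrative only (not Informatica syntax); the per-partition memory figure and the cores reserved for the OS are hypothetical assumptions, not Informatica defaults.

def estimate_max_partitions(total_cores, total_ram_gb,
                            memory_per_partition_gb=1.5,
                            reserved_cores=2):
    """Conservative partition estimate bounded by CPU and memory (assumed figures)."""
    cpu_bound = max(total_cores - reserved_cores, 1)            # leave headroom for the OS
    memory_bound = int(total_ram_gb // memory_per_partition_gb) # partitions that fit in RAM
    return max(min(cpu_bound, memory_bound), 1)

# Example: the 24-CPU / 24 GB machine mentioned in the related questions.
print(estimate_max_partitions(total_cores=24, total_ram_gb=24))  # -> 16

With these assumed numbers the memory bound (16), not the CPU bound (22), limits the partition count, which is the point of the answer: the right number depends on the resources of the machine running the sessions.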
How do you extract data from different data sources? Explain with an example.
Assume you have a 24-CPU machine with 24 GB RAM. Suggest how you would configure Informatica: number of concurrent sessions, RAM requirements, and the maximum partitions you would permit per mapping.
What is the ETL process? How many steps does ETL contain?
What are the different steps for data modeling?
What is cube grouping?
1. Identify and discuss the problems that occur during data quality verification for single-source and multi-source problems. 2. Testing has a very important role in a DWH. In the ETL testing phase, how can we perform integration testing and regression testing? 3. What are the prerequisites of system testing and regression testing in the ETL phase?
Explain the enterprise scalability and ROI of a data integration suite.
Suppose I am loaded with questions, as I am an experienced ETL coder but not an analytical report builder. I am using Analysis Services to build a cube but am trying to choose the reporting architecture. Can someone please confirm whether Reporting Services (using the Business Intelligence report) will allow for slicing and dicing, or is it only a static report builder? Also, if I am using the cube browser in Analysis Services, where can I set the Non Empty option so that I don't see records that are blank?
What is the ETL process?
What is data modeling and data mining?
How can I test the accuracy of an ETL migration? I am very new to data warehousing. We are writing ETL scripts using the Scriptella tool. How can I test the correctness of the data? We are also generating reports using Pentaho. Is there any easy way to test Pentaho, and how can I test these ETL scripts written in Scriptella? Thanks in advance.
Is Hadoop an ETL tool?
When you connect to the repository for the first time, it asks for the user name and password of both the repository and the database, but on subsequent connections it asks only for the repository password. Why?
What do you do when the DB time takes longer in SAP BI?