In a mapping, which is faster: flat file as source and flat file as target, or flat file as source and Oracle as target? I mean, which process completes first?
Answers were Sorted based on User's Feedback
Answer / sai krishna karri
Flat file as source and flat file as target will be the faster
process, because writing to a flat file is faster than
writing into a database.
| Is This Answer Correct ? | 8 Yes | 0 No |
Answer / ramesh
Flat files are faster than Oracle, because flat files have
no constraints.
| Is This Answer Correct ? | 3 Yes | 0 No |
There are n flat files of exactly the same format placed in a folder. Can we load the data from these flat files one by one into a single relational table with a single session?
What are variable ports? List two situations in which they can be used.
My source is like: id, name, sal -- 1 aa 1000, 2 bb 2000, 3 cc 3000, 4 dd 4000, 5 ee 5000, 6 ff 6000, and so on. My requirement is: id, name, sal, up_sal -- 1 aa 1000 null, 2 bb 2000 1000, 3 cc 3000 2000, 4 dd 4000 3000, 5 ee 5000 4000, 6 ff 6000 5000, and so on (up_sal is the previous row's sal). How can I get this? Please reply as soon as possible. Thanks in advance.
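One way to get the previous row's salary, assuming the data sits in a relational table (emp_src and its columns are placeholder names), is Oracle's LAG analytic function; inside an Informatica mapping the same effect is usually achieved with variable ports in an Expression transformation. A minimal sketch:

SELECT id, name, sal,
       LAG(sal) OVER (ORDER BY id) AS up_sal
FROM   emp_src;
-- LAG returns NULL for the first row and the previous row's sal for every other row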
How to configure a mapping in Informatica?
Consider two cases: (1) Power Center Server and Client on the same machine; (2) Power Center Server and Client on different machines. What is the basic difference between these two setups, and which is recommended?
How can you validate all mappings in the repository simultaneously?
What is workflow? What are the components of the workflow manager?
How can we create a list file containing millions of flat file names for indirect loading in Informatica? In indirect file loading, if we have a small number of flat files we can enter the file names manually when creating the list file. If there are millions of files, how can we enter the flat file names in the list file?
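For reference, an indirect-load list file is just a plain text file with one source file path per line; for very large numbers of files it is normally generated by a script or an operating system command rather than typed by hand. A minimal sketch (the paths are hypothetical):

/data/input/sales_001.dat
/data/input/sales_002.dat
/data/input/sales_003.dat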
Hi experts, can anyone tell how much we use PL/SQL in real-time projects?
If sal is null, replace it with min(sal). Can anyone write a query for this in Oracle? Thanks in advance.
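A minimal sketch in Oracle, assuming a table emp_src with a sal column (the names are placeholders): NVL substitutes the minimum non-null salary wherever sal is null.

SELECT id, name,
       NVL(sal, MIN(sal) OVER ()) AS sal
FROM   emp_src;

-- or, to change the stored data rather than just the query result:
UPDATE emp_src
SET    sal = (SELECT MIN(sal) FROM emp_src)
WHERE  sal IS NULL;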
Explain grouped cross tab?
1) What is an alternative to the Update Strategy transformation? 2) Out of 1000 records, the session failed after loading 200 records. How do you load the rest of the records? 3) What is the use of a lookup override? (See the sketch below.)
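For 3), a lookup SQL override replaces the default query that the Integration Service generates for a Lookup transformation, for example to filter rows or add a join before caching. A minimal sketch, with table and column names as assumptions (the selected columns must match the lookup ports):

SELECT CUSTOMER_ID, CUSTOMER_NAME, LAST_UPDATED
FROM   DIM_CUSTOMER
WHERE  ACTIVE_FLAG = 'Y'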