Without using the Funnel stage, how do you populate data from
different sources into a single target?
Answer Posted / vijay
Hi Kiran,
We can populate data from multiple sources into a single target
without using the Funnel stage by using the Sequential File stage.
Let me explain.
The Sequential File stage has a property called "File". First,
give one file name and load the data.
Then, in the same stage, click the "File" property on the right
side again; it will prompt for another file name. Give the next
file name (do not load the data again), and in the same way you
can add as many files as you have.
Finally, run the job; the data from all the files will be appended automatically.
Thanks
What is the purpose of interprocessor stage in server jobs?
How do you reject records in a transformer?
What is the use of a surrogate key in DataStage?
How and where you used hash file?
How many areas for files does datastage have?
How do you read a sequential file from job control?
Does datastage support slowly changing dimensions ?
What is the difference between orabulk and bcp stages?
Why do you need stage variables?
How you can fix the truncated data error in datastage?
What is data partitioning?
1) What is the size of a fact table and a dimension table?
2) How do you find the size of a fact table and a dimension table?
3) How do you implement a surrogate key in the Transformer stage?
4) Write the configuration file path.
5) How many types of datasets are there? Explain.
6) What is the difference between development projects and migration projects?
7) How do you delete the header and footer of a sequential file?
8) How can you call DataStage parameters in a Unix environment?
9) How much data are you getting daily?
10)
Field, NVL, INDEX, REPLACE, TRANSLATE, COALESCE
If I have two tables:
table1: 1 a; 1 b; 1 c; 1 d; 2 a; 2 b; 2 c; 2 d; 2 e
table2: 1 a,b,c,d; 2 a,b,c,d,e
How can I get the data back the same as in the tables? How can I implement SCD Type 1 and Type 2 in both server and parallel jobs?
field1, field2, field3:
suresh , 10,324 , 355 , 1234
ram , 23,456 , 450 , 456
balu ,40,346,23 , 275, 5678
How do you remove the duplicate rows in the fields?
How can we perform a second extraction from the client database without picking up the data that was already loaded in the first extraction?