The source table has a single column and a single record, and the value contains a space.
Load that source record into the target, which has two columns.
SOURCE TABLE:
COL1
BHANU PRASAD

TARGET TABLE:
COL1    COL2

Load the record into the target table like this:
COL1    COL2
BHANU   PRASAD

How can this be done?
Let the source table be called "Name", which has COL1, and the data in it is BHANU PRASAD. (For easier understanding, instead of the space between BHANU and PRASAD, let's use an underscore, i.e. BHANU_PRASAD.)
Source ----> Exp t/r ----> target
In the Expression t/r, take "Name" as an input-only port and create two output ports, COL1 and COL2.
In COL1, give the expression:
substr(Name,1,instr(Name,'_',1)-1)
In COL2, give the expression:
substr(Name,instr(Name,'_',1)+1)
Now connect the COL1 port from the Expression t/r to the COL1 port in the target, and do the same for COL2. The output in the target table will then be:
COL1    COL2
BHANU   PRASAD
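If you want to sanity-check the split logic outside PowerCenter, here is a minimal sketch in Python (purely illustrative, not part of the mapping itself) that mimics what the two SUBSTR/INSTR expressions do. With the real data you would pass ' ' (a space) to instr() instead of the '_' used above for readability; the function and column names below just follow this answer.

# Minimal sketch: mimics the Expression t/r logic
#   COL1 = substr(Name, 1, instr(Name, ' ', 1) - 1)
#   COL2 = substr(Name, instr(Name, ' ', 1) + 1)
def split_name(name, sep=" "):
    """Split a single value into (COL1, COL2) at the first separator."""
    pos = name.find(sep)              # like instr(), but 0-based
    if pos == -1:                     # no separator: everything goes to COL1
        return name, ""
    return name[:pos], name[pos + 1:]

print(split_name("BHANU PRASAD"))       # ('BHANU', 'PRASAD')
print(split_name("BHANU_PRASAD", "_"))  # ('BHANU', 'PRASAD')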
How are indexes created after completing the load process?
How do you handle decimal places while importing a flat file into Informatica?
What are active and passive transformations?
Can anyone explain, with a flow, how to implement SCD Type 1 and SCD Type 2 in a single mapping? SCD Type 1 has to be applied to some fields and SCD Type 2 to other fields.
How do you implement the following query in Informatica: select * from emp where sal > (select min(sal) from emp)?
What is the Informatica ETL tool?
In which scenario did you use a mapping variable?
If I have 10 records in my source and I use a Router t/r with the conditions i>2, i=5 and i<2 in different groups, what is the output in the target?
Please explain in detail, with examples: 1. Conformed dimension 2. Junk dimension 3. Degenerate dimension 4. Slowly changing dimensions.
How can the loading of 100 million distinct records be made faster in Informatica? (Initially they are loaded into the target without using any transformation, taking 2 hours.)
How is source-side pushdown optimization different from just providing a SQL override in the Source Qualifier transformation?
What is a pre-defined event and user-defined event?