Hi,
If I have a source with unique and duplicate records, like 1,1,2,3,3,4,
then I want to load the unique records (2,4) into one target and
the duplicate records (1,1,3,3) into another. Can anybody please
send me the scenario? My mail id is
shek.inform@gmail.com
Answers were Sorted based on User's Feedback
Answer / shridhar kasat
For unique and duplicate, we can go for the same flow:
Source --> Source Qualifier --> Aggregator (group
by this column and find the count) --> then use a Joiner (join
the data from the Source Qualifier with the data from the
Aggregator on source_qualifier.col = aggregator.col) -->
Router (insert into t1 for count > 1 and into t2 for
count = 1). t1 is for the duplicate records and will contain
1,1,3,3, and t2 will have 2,4.
Records in the Source Qualifier:
col
1,1,2,3,3,4
The records after the Aggregator will look like this:
col count
1 2
2 1
3 2
4 1
Records after the Joiner (source_qualifier.col = aggregator.col):
col count
1 2
1 2
2 1
3 2
3 2
4 1
These records flow into the Router, where we define two groups:
one for count > 1 (the duplicate records) and one for
count = 1 (the unique records).
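Outside Informatica, the same Aggregator --> Joiner --> Router logic can be sketched in a few lines of Python (an illustrative sketch, not the tool itself):

```python
from collections import Counter

# Source rows (the column we group on)
source = [1, 1, 2, 3, 3, 4]

# Aggregator: group by the column and compute a count per value
counts = Counter(source)  # {1: 2, 2: 1, 3: 2, 4: 1}

# Joiner: join each source row back to its aggregated count
joined = [(value, counts[value]) for value in source]

# Router: count > 1 goes to t1 (duplicates), count = 1 to t2 (unique)
t1 = [value for value, count in joined if count > 1]
t2 = [value for value, count in joined if count == 1]

print(t1)  # [1, 1, 3, 3]
print(t2)  # [2, 4]
```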
Hope this answer meets your requirements.
Regards
Shri
Is This Answer Correct ? | 23 Yes | 0 No |
Answer / suman chattopadhyay
Shirish,
Part 1 of your answer is not correct. If you use the distinct
option in the Sorter you will get all the distinct values, even
for the duplicates; i.e., if the values are 1,1,2,3,3,4, you
will get 1,2,3,4 in the target.
Part 2 of your answer will do the trick for both the duplicate
and the unique ones. If you group by the column and count the
number of occurrences, you will get a count of 1 for unique
values and more than 1 for duplicate values. Pass the
records with count 1 to one target and the rest to the 2nd
target.
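To illustrate the Part 1 point (a quick Python sketch, not Informatica): distinct collapses the duplicated values down to one copy instead of removing them.

```python
values = [1, 1, 2, 3, 3, 4]

# A Sorter with the distinct option keeps one copy of every value,
# so the duplicated values 1 and 3 still appear in the output:
distinct = sorted(set(values))
print(distinct)  # [1, 2, 3, 4], not the wanted [2, 4]
```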
Hope this helps.
Suman
Is This Answer Correct ? | 10 Yes | 1 No |
Answer / sanju_yu
Source --> Dynamic Lookup --> Router, with 2 conditions in the Router:
1. If the column_lkp port is null, insert into
target1 (unique).
2. If the column_lkp port is not null, insert into
target2 (duplicates).
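The dynamic-lookup behaviour can be sketched with a seen-set in Python (illustrative only, assuming the lookup cache starts empty): the first occurrence of each value misses the cache, later occurrences hit it.

```python
source = [1, 1, 2, 3, 3, 4]

cache = set()             # stands in for the dynamic lookup cache
target1, target2 = [], []

for value in source:
    if value not in cache:      # lookup port is null: not seen before
        cache.add(value)        # the dynamic lookup inserts the new row
        target1.append(value)
    else:                       # lookup port is not null: already cached
        target2.append(value)

print(target1)  # [1, 2, 3, 4]
print(target2)  # [1, 3]
```

Note that target1 receives the first occurrence of every value (1,2,3,4), not only the truly unique 2,4, and target2 gets only the repeats (1,3), so this flow answers a slightly different question than the one asked.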
Please let me know if I am wrong.
Sanjay
Is This Answer Correct ? | 7 Yes | 4 No |
Answer / shirish
1. For unique: Source --> Source Qualifier --> Sorter (use the
distinct option) --> Target 1.
2. For duplicate: Source --> Source Qualifier --> Aggregator (group
by this column and find the count) --> Filter (count
greater than 1) --> left outer join with the Source Qualifier
--> Target 2.
Hope I am clear.
Is This Answer Correct ? | 3 Yes | 3 No |
Answer / rahul
Hi Sanjay,
will this logic work when we are loading for the first time?
Is This Answer Correct ? | 0 Yes | 1 No |