

The source is a flat file, and we want to load unique records and duplicate
records separately into two separate targets. How can this be done?

Answers were Sorted based on User's Feedback




Answer / nitin

Create the mapping as below to load unique records and duplicate records into separate targets:

Source -> SQ -> Sorter -> Aggregator -> Router -> Tgt_Unique
                                               -> Tgt_Duplicate

In the Aggregator, group by all ports and define an output port OUTPUT_COUNT = COUNT(any_port). In the Router, define two groups: OUTPUT_COUNT = 1 and OUTPUT_COUNT > 1. Connect the OUTPUT_COUNT = 1 group to Tgt_Unique and the OUTPUT_COUNT > 1 group to Tgt_Duplicate.
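The Aggregator/Router logic above can be sketched in plain Python for clarity (illustrative only, not Informatica code; note that because the Aggregator groups by all ports, the duplicate target receives one representative row per duplicate group, not every occurrence):

```python
from collections import Counter

def split_by_group_count(rows):
    # Aggregator with group-by on all ports: one output row per
    # distinct record, with OUTPUT_COUNT = number of occurrences.
    counts = Counter(rows)
    # Router: OUTPUT_COUNT = 1 -> Tgt_Unique, OUTPUT_COUNT > 1 -> Tgt_Duplicate
    tgt_unique = [r for r, c in counts.items() if c == 1]
    tgt_duplicate = [r for r, c in counts.items() if c > 1]
    return tgt_unique, tgt_duplicate

unique, duplicate = split_by_group_count([1, 2, 1, 2, 3])
# unique -> [3]; duplicate -> [1, 2] (one row per duplicate group)
```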

Is This Answer Correct ?    1 Yes 0 No


Answer / ankit kansal

Hi,
What I understand from your problem is this: if your source contains 1, 2, 1, 2, 3, then only 3 is taken as unique, and every occurrence of 1 and 2 is considered a duplicate.

SRC -> SQ -> Sorter -> Expression (set duplicate flags) -> Router -> Joiner -> Expression -> Router -> 2 targets
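This interpretation differs from the Aggregator approach: because the Joiner brings every source row back, all copies of a repeated value go to the duplicate target. A Python sketch of that routing (illustrative only, not Informatica code):

```python
from collections import Counter

def route_all_occurrences(values):
    # Expression flags duplicates; the Joiner rejoins every source
    # row, so all copies of a repeated value are routed to the
    # duplicate target, not just one representative per group.
    counts = Counter(values)
    unique = [v for v in values if counts[v] == 1]
    duplicates = [v for v in values if counts[v] > 1]
    return unique, duplicates

unique, duplicates = route_all_occurrences([1, 2, 1, 2, 3])
# unique -> [3]; duplicates -> [1, 2, 1, 2] (every occurrence)
```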

http://deepinopensource.blogspot.in/

Is This Answer Correct ?    1 Yes 1 No


Answer / mohank106

Refer to the link below; the answer is explained clearly there:

http://www.bullraider.com/database/informatica/scenario/11-informatica-scenario3

Is This Answer Correct ?    0 Yes 0 No


Answer / rani

Take a Source Qualifier, then place a Sorter transformation, select the Distinct option in the Sorter, and load the result into Unique_Target.

In another pipeline, take a Lookup transformation on the target and compare it with the source. Where a record occurs more than once, delete it from the target using an Update Strategy with DD_DELETE, and load those records into Duplicate_Target. Alternatively, use an unconnected Lookup with a lookup override along the lines of COUNT(*) ... HAVING COUNT(*) > 1 to identify the duplicates and load them into Duplicate_Target.
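The two-pipeline idea above (distinct rows to one target, repeated values to the other) can be sketched in Python (illustrative only; target names mirror the answer above):

```python
from collections import Counter

def split_with_distinct(values):
    # Pipeline 1: Sorter with Distinct keeps one copy of each value
    # for Unique_Target.
    unique_target = list(dict.fromkeys(values))
    # Pipeline 2: values whose total count exceeds 1 go to
    # Duplicate_Target, mirroring the COUNT(*) ... HAVING > 1 override.
    counts = Counter(values)
    duplicate_target = [v for v, c in counts.items() if c > 1]
    return unique_target, duplicate_target

u, d = split_with_distinct([1, 2, 1, 2, 3])
# u -> [1, 2, 3] (de-duplicated); d -> [1, 2]
```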

Is This Answer Correct ?    0 Yes 2 No


More Informatica Interview Questions

My Source Qualifier has empno and sal. My mapping is: SQ(EMPNO) -> AGGR -> EXP -> TARGET, while SAL is connected directly to TARGET. Is this mapping valid, or are there any issues with designing it like this?

3 Answers   Span Systems,


So many times I have seen "$PM parser error". What is meant by PM?

1 Answers  


What is Data Caches size?

0 Answers   Informatica,


Hi, in a Router transformation I created two groups: the first is Passthrough => TRUE, and the second is CorrectIds => Invest > 50000. Here I have one doubt: can't I treat the default group as the Passthrough group (the first group)? Is there any difference between the default group and the Passthrough group in this scenario? Let me know if you need more information. Thanks in advance.

3 Answers   IBM,


How many ways are there to remove duplicate records in Informatica?

0 Answers  


Describe data concatenation?

0 Answers  


What is the cumulative sum and moving sum?

0 Answers  


In which scenario did you use pushdown optimization?

1 Answers  


Why is the Union transformation an active transformation?

0 Answers   Informatica,


What is meant by lookup caches?

2 Answers   Cap Gemini, Informatica,


Tell me about the MD5 function in Informatica.

0 Answers  


Explain the tuning parameters.

0 Answers   TCS,

