How can I catch the duplicate rows from the Sorter transformation in a separate target table?
Answer / krithi
When you enable the distinct option, the Sorter transformation discards the duplicate records.
If you want to trace the duplicate records instead, send the sorted data into an Expression transformation. Add a variable port that holds the key of the previous row and compare it with the current row's key. If they match, flag the record as a duplicate; otherwise pass the row through with the default flag. Then load all the flagged records into a separate target table.
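The comparison logic described above is easy to see in plain code. What follows is a minimal Python sketch of the idea, not Informatica port syntax; the key column cust_id and the sample rows are made-up assumptions for illustration.

def flag_duplicates(sorted_rows, key="cust_id"):
    # Mimics an Expression transformation that compares each row's key
    # with the key seen on the previous row of the sorted input.
    prev_key = None  # plays the role of the variable port holding the prior key
    for row in sorted_rows:
        is_dup = row[key] == prev_key   # True when the key repeats
        prev_key = row[key]             # update the "variable port" for the next row
        yield row, is_dup

rows = [{"cust_id": 1}, {"cust_id": 1}, {"cust_id": 2}]
for row, is_dup in flag_duplicates(rows):
    print(row, "DUPLICATE" if is_dup else "UNIQUE")

Because the input is sorted, a repeated key always arrives immediately after its first occurrence, which is why a single "previous key" variable is enough.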
Answer / sudhar
You can use the distinct option to eliminate the duplicate records.
If you want to send the duplicates to another table instead, sort the data first, then use an Expression transformation to find the repeated values and assign different constant flags to the duplicate rows and the original rows. Finally, use a Router transformation to send the duplicate and original records to their respective target tables.
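For the Router step described above, here is a similarly hedged Python sketch that splits flagged rows into two groups, standing in for two Router output groups wired to separate targets; the dup_flag field and the target names are illustrative assumptions.

def route(flagged_rows):
    # Split rows into two groups, like two Router output groups
    # wired to separate target tables.
    originals, duplicates = [], []
    for row in flagged_rows:
        (duplicates if row["dup_flag"] == 1 else originals).append(row)
    return originals, duplicates

flagged = [
    {"cust_id": 1, "dup_flag": 0},
    {"cust_id": 1, "dup_flag": 1},  # repeated key flagged by the Expression step
    {"cust_id": 2, "dup_flag": 0},
]
originals, duplicates = route(flagged)
print("main target:", originals)
print("duplicate target:", duplicates)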
What is a domain and what is a gateway node?
How to convert a row into a column and a column into rows?
Name all DTM threads. Which threads stop when we issue STOP or ABORT?
How to pass the value of a variable generated in a mapping in one session to another session in the same workflow?
What are the types of partitioning you know, and how do you apply them in real time? Can partitioning be applied to an Expression transformation, and how?
Suppose I have 10,000 records. In the first run I have to load records 1 to 1000, in the second run records 1000 to 2000, and in the third run records 2000 to 3000. How can this be achieved?
What is the method of separating unique and duplicate records at the session level?
What is the difference between normal and bulk loading? Which one is recommended?
What is the difference between connected and unconnected lookups?
What is an active transformation?
What are pre-defined events and user-defined events?
What are the Kimball and Inmon methodologies?
In SCD Type 1, the source has 10 billion records. On the first day the load completes successfully, but on the second day it takes a long time because some records need to be updated and new records inserted. As a developer, what would be the better solution for this?
What happens when a mapping variable or mapping parameter is not defined or given a value? Where do you use mapping variables and mapping parameters?
One of the optimization techniques to improve session performance is pushdown optimization: we push as much transformation logic as possible to the source or target database. But this can degrade the database's performance; how can this be overcome?