Source (flat file) - Target (1-1 mapping). How do we load the first or last 20 records to the target?
Answers were sorted based on users' feedback
Answer / joe
We cannot use a Sorter because it will shuffle the data.
For the last 20 rows: in a separate pipeline, use an Aggregator and compute COUNT(key) and MAX(key) to get the total number of rows.
Use output ports o_count and o_max to pass these values to an Expression transformation placed before the Filter.
o_count_last_20 = o_count - 20
i.e. if there are 35 records we need records 16 to 35.
In the Filter, check that NEXTVAL is greater than o_count_last_20 and not greater than o_max.
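As a rough SQL equivalent of this count-and-filter logic (src_rows and seq_no are assumed names; seq_no stands for the consecutive row number the Filter compares):

-- Keep only the rows whose consecutive number falls in the last 20
-- (assumes seq_no runs 1..COUNT(*) with no gaps).
SELECT *
FROM   src_rows
WHERE  seq_no > (SELECT COUNT(*) - 20 FROM src_rows);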
Is this answer correct? 3 Yes | 0 No
Answer / sunny
Hello Joe, your solution is partially correct. What if the key values are not a consecutive sequence, so that MAX minus 20 does not give the right cutoff? I guess this logic can be applied only after passing the rows through a Sequence Generator, so that each row gets a sequence number.
Is this answer correct? 2 Yes | 0 No
Answer / joe
1. Use a Sequence Generator to associate a NEXTVAL with each row.
2. Filter condition: NEXTVAL <= 20
For a database source, filter at the Source Qualifier instead by adding the condition WHERE ROWNUM <= 20 to the SQL override.
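A minimal sketch of such an override, assuming an Oracle source and a hypothetical ORDERS table:

-- ROWNUM is Oracle syntax; it limits the result to the first 20 rows read.
SELECT *
FROM   orders
WHERE  ROWNUM <= 20;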
Is this answer correct? 1 Yes | 0 No
Answer / sri
For this you can write an SQL query using the TOP clause (as a Source Qualifier override), or else you can use a Rank transformation.
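A minimal sketch of the TOP-style query, assuming a SQL Server source and a hypothetical ORDERS table:

-- TOP is SQL Server/Sybase syntax; use ROWNUM or FETCH FIRST on other databases.
SELECT TOP 20 * FROM orders;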
Is this answer correct? 0 Yes | 3 No
Answer / raju
You can also achieve this using the following flow in the mapping (a rough SQL analogue is sketched after the steps):
1) Create an output port SNO in an Expression transformation and connect it to the NEXTVAL port of a Sequence Generator transformation.
2) Use a Sorter transformation with SNO as the key column and the direction set to Descending.
3) Use a Filter transformation with the filter condition SNO <= 20.
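A rough SQL analogue of these three steps (ORDERS is an assumed table name):

-- Number each row, keep SNO <= 20, and sort by SNO descending.
-- Note that this returns the first 20 source rows in reverse order.
SELECT *
FROM   (SELECT t.*, ROWNUM AS sno FROM orders t)
WHERE  sno <= 20
ORDER  BY sno DESC;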
Is this answer correct? 0 Yes | 3 No
If the source has duplicate records on the id and name columns, with values 1 a, 1 b, 1 c, 2 a, 2 b, and the target should be loaded as 1 a+b+c or 1 a||b||c, what transformations should be used for this?
My SQL queries are: 1. SELECT 1+x FROM dual; 2. SELECT 1+'x' FROM dual; 3. SELECT x+1 FROM dual; What is the output of the above queries?
What is the method for loading 5 flat files having the same structure into a single target, and which transformations can I use?
How to generate or load values into the target table based on a column value using the Informatica ETL tool?
Hi all, can you please send me the dimension and fact tables which are used in a mutual fund project, and please send me a brief summary of the project? Please do the needful.
Performance-wise, which one is better: the Joiner or the Lookup transformation?
Does the Aggregator transformation ignore null values or consider them? Advance thanks, Manojkumar
Can you copy the batches?
Write a query to remove the null values from the following table: columns col1, col2, col3 with rows (dinesh, null, null), (null, suresh, null), (null, null, prakesh). I want the output to be a single row: col1 = dinesh, col2 = suresh, col3 = prakesh.
1. What is the SDLC of a project? Explain in detail. 2. What are dimensions and facts? What is a conformed dimension? 3. What is the need for OLTP when we have data warehousing?
Explain a reference cursor.
How to join three sources using a Joiner transformation?