If a session fails after loading 10,000 records into the
target, how can we load from the 10,001st record when we run
the session the next time?
Answers were sorted based on users' feedback
Answer / rama krishna
Hey man... first of all, you need to mention whether you
applied any commit or not. Without a commit, how can you
recover the session? As stated, the question is incomplete.
If you really want to improve recoverability and
performance, you must use commit points and commit
intervals.
If you did apply a commit and the session failed at 10,000
records, then you can recover the remaining records by
using the OPB_SRVR_RECOVERY table.
Friends, am I correct? Think it over.
Is This Answer Correct ? | 11 Yes | 0 No |
Answer / srikanth
Set the recovery strategy in the session properties to
"Resume from last checkpoint".
Is This Answer Correct ? | 10 Yes | 4 No |
Answer / guest
Use session recovery to load the target. There are three
common approaches:
1) Enable the recovery option in the session.
2) Set a commit interval in the session.
3) Truncate the target and load it again from the start.
Is This Answer Correct ? | 3 Yes | 0 No |
Answer / lathagarat
First, find how many records were already loaded into the
target, i.e. the maximum loaded key (we can get it with an
Aggregator transformation, or a MAX() query on the target),
and continue loading from there onwards.
Is This Answer Correct ? | 4 Yes | 1 No |
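The max-key approach above can be sketched outside Informatica as plain SQL logic: query the target for the highest committed key, then restrict the source read to rows above it. A minimal sketch using SQLite, with hypothetical table and column names:

```python
import sqlite3

# In-memory stand-ins for the source and target tables (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 21)])

# Simulate a failed run that committed only the first 10 records.
conn.execute("INSERT INTO tgt SELECT * FROM src WHERE id <= 10")

def resume_load(conn):
    """Load only the records the failed run did not commit."""
    # MAX(id) over the target tells us where the previous run stopped.
    last = conn.execute("SELECT COALESCE(MAX(id), 0) FROM tgt").fetchone()[0]
    conn.execute("INSERT INTO tgt SELECT * FROM src WHERE id > ?", (last,))
    return conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

print(resume_load(conn))  # all 20 source rows present after the resumed run
```

Note this only works when the target has a monotonically increasing key that matches the source load order; otherwise session recovery or a full truncate-and-reload is safer.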
Answer / chaitanya
By marking the lookup cache as persistent, the same cache can be reused by the next session run as well.
Is This Answer Correct ? | 3 Yes | 0 No |
Answer / gayathri
The session properties include a "Suspend on error" option.
With it, the run restarts from the failed task, not from the beginning.
Is This Answer Correct ? | 7 Yes | 7 No |
Answer / basu
I think the question is about incremental loading, so you can go for change data capture (CDC) by using a mapping variable/parameter in the Source Qualifier query override.
Is This Answer Correct ? | 0 Yes | 0 No |
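The mapping-variable CDC pattern described above keeps a "last extracted" watermark and filters the source query with it, much as a `$$LastRunTime` variable would be used in a Source Qualifier override. A rough Python analogue of that logic (names and schema are illustrative, not Informatica APIs):

```python
import sqlite3

# Hypothetical source table with a change-timestamp column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [(1, "2024-01-01"), (2, "2024-01-02"), (3, "2024-01-03")])

# Persisted value of the watermark from the previous run (illustrative).
watermark = "2024-01-01"

def extract_changes(conn, watermark):
    """Return rows changed since the watermark, plus the advanced watermark."""
    rows = conn.execute(
        "SELECT id, updated_at FROM src WHERE updated_at > ? ORDER BY updated_at",
        (watermark,)).fetchall()
    # Like advancing a mapping variable: move the watermark to the newest
    # change seen, so the next run only picks up later rows.
    new_wm = rows[-1][1] if rows else watermark
    return rows, new_wm

rows, watermark = extract_changes(conn, watermark)
print(len(rows), watermark)  # 2 2024-01-03
```

Running the extract a second time with the advanced watermark returns no rows, which is exactly the behavior wanted for an incremental load.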
Answer / nidhi
In my view Mr. Rama Krishna is right.
Ms. Latha, the question already states that 10,000 records
were loaded, so there is no need to check max(records)
again; but incremental loading is also one of the ways,
I guess. So the options are:
1) Session recovery
2) Incremental loading
If anyone thinks I am wrong, please let me know.
Is This Answer Correct ? | 1 Yes | 6 No |