

I am running a job with 1000 records. If the job aborts after loading 400 records into the target, I want the restarted run to continue loading from record 401. How can this be done? This scenario is not about a sequence job; it is a single job, e.g. Seq File --> Transformer --> Dataset.

Answers were Sorted based on User's Feedback




Answer / sree

We can get the answer by using a Lookup stage. There are two tables: the source table with 1000 records and the target table that already holds the 400 loaded records.

Take the source table as the primary input and the 400-record table as the reference input to the Lookup stage, and load only the records that do not find a match:

            reference (400 loaded records)
                        |
source ------------> Lookup ------------> target

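A minimal Python sketch of the filtering logic this Lookup design performs (the file names and the "key" column are assumptions for illustration; in the actual job the unmatched rows simply flow from the Lookup stage to the target):

# Sketch: keep only source rows whose key is not already in the target.
import csv

# Rows already loaded before the abort (roughly 400 of them).
with open("target_loaded.csv") as f:
    loaded_keys = {row["key"] for row in csv.DictReader(f)}

# Pass the 1000 source rows through; write only the ones the reference does not match.
with open("source.csv") as src, open("to_load.csv", "w", newline="") as out:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if row["key"] not in loaded_keys:
            writer.writerow(row)  # records 401..1000 end up here

This only works if the key column uniquely identifies each record in both the source and the target.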
Is This Answer Correct ?    10 Yes 4 No


Answer / phani kumar

With the help of the environment variable APT_CHECKPOINT_DIR, we can run the job for the remaining records.

This is available in DataStage Administrator: project-wide defaults for general environment variables are set per project in the Projects tab, under Properties -> General tab -> Environment variables.

Here we can enable this checkpoint variable, and the job can then load the remaining records.

Is This Answer Correct ?    6 Yes 3 No


Answer / bharath

In DataStage Administrator we can enable the "add checkpoint on failure" option, and the job can then start from record 401.

Is This Answer Correct ?    7 Yes 5 No


Answer / shar

Say the original job, job1, has the stages Seq --> Tx --> Dataset1.

Now take job2 (with a job dependency from job1 to job2). In job2, read the data from the sequential file through a Transformer as input 1 (primary), and use Dataset2 as the secondary (reference) input, pointing it at the file path of Dataset1. Connect both inputs to a Lookup stage and reject the unmatched records. Connect the reject link to Dataset3, whose file name is the same as Dataset1 and whose update policy is Append (see the sketch below).

That's it.

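A rough Python sketch of that reject-and-append step, assuming CSV files named dataset1.csv and source.csv and a "key" column (in the actual job the Lookup stage's reject link and the data set's Append update policy do this work):

# Sketch: append only the rows the lookup rejects (unmatched) to what job1 already loaded.
import csv

with open("dataset1.csv") as f:  # output of job1, read as the reference
    loaded_keys = {row["key"] for row in csv.DictReader(f)}

with open("source.csv") as src, open("dataset1.csv", "a", newline="") as out:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)  # no header row: we are appending
    for row in reader:
        if row["key"] not in loaded_keys:  # "rejected" by the lookup: no match in the reference
            writer.writerow(row)  # append keeps the first 400 rows intact

The logic is the same anti-join as in the first sketch above; the difference is that the rejects are appended to the existing data set instead of being written to a new one.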
Is This Answer Correct ?    2 Yes 0 No


Answer / abc

Our design: Src --> Tfm --> Data Set.
To load the records from 401 onwards, just set the option Use existing (discard records) in the Data Set stage's update action property.

Is This Answer Correct ?    1 Yes 0 No


Answer / vinay sharma

1) First read the sequential file, add one extra column whose value increments by one for each record, and send it to the target.
2) Now store the maximum value of that extra column and pass it to the Transformer through a hash file, adding it to the extra column, i.e. (SNO + MAX), as in the sketch below.

Example:
SNO: 1, 2, 3, 4, 5, 6, 7
Here the maximum value is 7. Now add the maximum value, i.e. add 7 to each SNO:
1+7, 2+7, 3+7, 4+7, 5+7, 6+7, 7+7

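A minimal Python sketch of the row-numbering idea behind this answer (file names and the SNO column are assumptions): number the incoming rows, find the highest number already in the target, and load only the rows beyond it.

# Sketch: resume loading after the highest row number already present in the target.
import csv

with open("target.csv") as f:  # rows loaded before the abort, each carrying an SNO column
    max_loaded = max((int(r["SNO"]) for r in csv.DictReader(f)), default=0)  # e.g. 400

with open("source.csv") as src, open("target.csv", "a", newline="") as out:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(out, fieldnames=["SNO"] + reader.fieldnames)
    for sno, row in enumerate(reader, start=1):  # the extra incrementing column
        if sno > max_loaded:  # skip the records already loaded
            writer.writerow({"SNO": sno, **row})

Note that this only resumes correctly if the source rows arrive in the same order on every run.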
Is This Answer Correct ?    2 Yes 3 No


Answer / jagannath karmakar

Using a save point can prevent this problem.

Is This Answer Correct ?    3 Yes 7 No


Answer / venkat

By using an option called "cleanup on failure".

Is This Answer Correct ?    1 Yes 6 No


Answer / mukteswara rao

By using:
1. job level
2. job sequencing level

Is This Answer Correct ?    1 Yes 8 No
