Explain the scenario for bulk loading and the normal loading option in Informatica Workflow Manager.
Answers were Sorted based on User's Feedback
Answer / rekha
Normal load: loads the records one by one and writes a log entry for each row, so it takes more time to complete.
Bulk load: loads many records at a time; it does not write log files or follow trace levels, so it takes less time.
Use bulk mode to improve session performance.
Is This Answer Correct ? | 49 Yes | 4 No |
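The contrast above can be sketched outside Informatica. This is only an analogy, not Informatica code: a row-by-row insert with a commit per record stands in for a "normal" load, while a single batched insert stands in for a "bulk" load. Table and column names are made up.

```python
import sqlite3

rows = [(i, f"name_{i}") for i in range(1000)]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE tgt_normal (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt_bulk (id INTEGER, name TEXT)")

# "Normal" style: one statement (and one logged commit) per record.
for r in rows:
    cur.execute("INSERT INTO tgt_normal VALUES (?, ?)", r)
    conn.commit()

# "Bulk" style: hand the whole batch to the database in one call.
cur.executemany("INSERT INTO tgt_bulk VALUES (?, ?)", rows)
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM tgt_normal").fetchone()[0])  # 1000
print(cur.execute("SELECT COUNT(*) FROM tgt_bulk").fetchone()[0])    # 1000
```

Both tables end up with the same rows; the difference is how many round trips and log writes it took to get there.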
Answer / addy
Hi
I would like to mention that, apart from the logging differences, there is a crucial difference: if we run the session in bulk target load mode, we cannot recover the session from the point of failure when we run it the next time, whereas if we run the session in normal target load mode, we can recover the session provided the output being loaded to the target is deterministic. This is a really helpful feature for applications handling real-time data.
-addy
Is This Answer Correct ? | 31 Yes | 4 No |
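The recovery point above can be illustrated with a toy sketch (not Informatica's actual recovery mechanism, and all names here are made up): because each row of a "normal"-style load is committed with the log, a restarted run can ask the target how far the previous run got and resume from there, as long as the source order is deterministic.

```python
import sqlite3

source = [(i, i * 10) for i in range(100)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val INTEGER)")

def load(rows, fail_at=None):
    """Commit per row; optionally simulate a crash partway through."""
    for n, row in enumerate(rows):
        if fail_at is not None and n == fail_at:
            raise RuntimeError("session failed")
        conn.execute("INSERT INTO target VALUES (?, ?)", row)
        conn.commit()

try:
    load(source, fail_at=40)   # first run dies after 40 committed rows
except RuntimeError:
    pass

done = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
load(source[done:])            # recovery: resume from the last committed row

print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 100
```

With a bulk-style load there is no per-row commit record to consult, which is why such a session has to be restarted from the beginning.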
Answer / shashank
In normal loading, the target writes all the rows to the database log, while in bulk loading the database log does not come into the picture (that step is skipped). So when a session fails, we can easily recover it with the help of the database log; in the case of bulk loading we cannot.
However, normal loading is very slow compared to bulk loading.
Is This Answer Correct ? | 14 Yes | 3 No |
Answer / sujith
Answers 5, 6 & 7 above are correct for this question.
Is This Answer Correct ? | 8 Yes | 1 No |
Answer / thiru
In normal loading, the Integration Service writes to the log file before loading the target. It takes more time, but session recovery is available.
In bulk loading, the Integration Service bypasses the log file and loads directly into the target. No session recovery is available, but performance increases.
Is This Answer Correct ? | 7 Yes | 2 No |
Answer / sankar
Normal loading: the Integration Service writes to the database log before loading data into the target database, so
-- the Integration Service can perform rollback and session recovery.
Bulk loading: the Integration Service invokes the database bulk utility and bypasses the database log.
-- This improves data-loading performance.
-- Rollback cannot be performed.
Is This Answer Correct ? | 6 Yes | 2 No |
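For context on "the database bulk utility": Oracle's SQL*Loader is one such utility, and its direct-path mode bypasses much of the conventional logging, matching the trade-off described above. A minimal, illustrative control file and invocation follow; the file, table, and credential names are placeholders, not anything from this question.

```
-- employees.ctl (illustrative)
LOAD DATA
INFILE 'employees.dat'
INTO TABLE employees
FIELDS TERMINATED BY ','
(emp_id, emp_name)

-- invoked with direct-path loading enabled:
-- sqlldr userid=scott/tiger control=employees.ctl direct=true
```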
1) Bulk load & Normal load
Normal: in this case the server manager allocates resources (buffers) as per the parameter settings. It creates the log files in the database.
Bulk: in this case the server manager allocates the maximum resources (buffers) available, irrespective of the parameter settings. It does not create any log files in the database.
In the first case, the data-loading process takes more time but other applications are not affected, while bulk data loading is much faster but other applications are affected.
Is This Answer Correct ? | 12 Yes | 13 No |
Answer / jyothsna katakam
When you select normal, it checks the primary-key and foreign-key relationships while running the mapping, but when you select bulk, it does not check any primary-key or foreign-key relationships.
Is This Answer Correct ? | 7 Yes | 27 No |