Explain the testing process of your company?
Answer / nag
SYSTEM TESTING PROCESS
Test Initiation
Test Planning
Test Design
Test Execution
Test Reporting
Test Closure
Software Development Process along with Software Testing Process:
1. Requirements Gathering (BRS)
2. Analysis and Project Planning (SRS & Project Plan)
3. Development Team: Design & Reviews, Coding & Unit Testing, Integration Testing, Initial Build
4. System Testing Team (in parallel): Test Initiation, Test Planning, Test Design, Test Execution, Test Reporting, Test Closure
5. User Acceptance Testing
6. Sign Off (Release the Build to Customer)
1. System Test Initiation:
Generally, every organization's system testing process starts with test initiation. In this phase the Project Manager (or Test Manager) prepares the test strategy document. This document defines the testing approaches to be followed by the testing team, expressed through the components of the test strategy below.
Components in Test Strategy:
1. Scope and Objective: The importance of testing and its milestones
2. Business Issues: Cost allocation between the development process and the testing process.
3. Test Approach: The selected list of test factors (or test issues) to be applied by the testing team to the corresponding software build. This selection depends on the requirements of the software, the scope of those requirements, and the risks involved in testing the project.
4. Roles & Responsibilities: The names of the jobs in the testing team and their responsibilities.
5. Communication & Status Reporting: The required negotiation between every two consecutive testing jobs in the testing team.
6. Test Automation & Testing Tools: The purpose of automation and the tools available in the organization.
7. Defect Reporting & Tracking: The required negotiation between the testing team and the development team to review and resolve defects during testing.
8. Testing Measurements & Metrics: The set of measurements and metrics the testing team uses to estimate quality, capability, and status.
9. Risks and Assumptions: The expected list of problems and the solutions to overcome them.
10. Change and Configuration Management: Managing the
development and testing deliverables for future reference
11. Training Plan: The required number of training sessions for the testing team to understand the customer requirements (or business logic).
12. Test Deliverables: The names of the test documents to be prepared by the testing team during testing, e.g. test plan, test cases, test log, defect reports, and summary reports.
Test Factors (or) Test Issues:
1. Authorization: Validity of users
2. Access Control: Permission of users to use specific services/functionality
3. Audit Trail: The correctness of metadata
4. Continuity of Processing: Integration of programs
5. Data Integrity: Correctness of input data
6. Correctness: Correctness of output values and manipulations (e.g. mail compose works correctly or not)
7. Coupling: Co-existence with other software to share common resources
8. Ease of Use: User-friendly screens
9. Ease of Operation: Installation, uninstallation, dumping, downloads, uploads, etc.
10. Portability: Runs on different platforms
11. Performance: Speed of processing
12. Reliability: Recovery from abnormal situations
13. Service Levels: Order of functionalities or services to support customer-site people
14. Maintainability: Whether the software is serviceable to customer-site people over the long term or not
15. Methodology: Whether the testing team is properly following the pre-defined approach or not
Test Factors vs. Testing Techniques:
A test factor indicates a testing issue or topic. To test every topic in the project, the testing team follows a set of testing techniques.
1. Authorization: Security Testing
2. Access Control: Security Testing
3. Audit Trail: Functional Testing
4. Continuity of Processing: Integration Testing
5. Data Integrity: Functionality Testing
6. Correctness: Functionality Testing
7. Coupling: Inter-System Testing
8. Ease of Use: User Interface (or) Manual Support Testing
9. Ease of Operation: Installation Testing
10. Portability: Compatibility & Configuration Testing
11. Performance: Load, Stress & Data Volume Testing
12. Reliability: Recovery & Stress Testing
13. Service Levels: Regression or Software Change Testing (C.C.B)
14. Maintainability: Compliance Testing
15. Methodology: Compliance Testing
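The factor-to-technique table above is essentially a lookup; a minimal Python sketch (names are illustrative, not from any real tool):

```python
# Hypothetical mapping of the 15 test factors to the techniques listed above.
FACTOR_TO_TECHNIQUE = {
    "Authorization": "Security Testing",
    "Access Control": "Security Testing",
    "Audit Trail": "Functional Testing",
    "Continuity of Processing": "Integration Testing",
    "Data Integrity": "Functionality Testing",
    "Correctness": "Functionality Testing",
    "Coupling": "Inter-System Testing",
    "Ease of Use": "User Interface / Manual Support Testing",
    "Ease of Operation": "Installation Testing",
    "Portability": "Compatibility & Configuration Testing",
    "Performance": "Load, Stress & Data Volume Testing",
    "Reliability": "Recovery & Stress Testing",
    "Service Levels": "Regression Testing",
    "Maintainability": "Compliance Testing",
    "Methodology": "Compliance Testing",
}

def techniques_for(factors):
    """Return the distinct testing techniques needed to cover the given factors."""
    return sorted({FACTOR_TO_TECHNIQUE[f] for f in factors})
```

For example, covering both Authorization and Access Control needs only one technique, Security Testing.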
Compliance Testing:
Checks whether the project team is following the company's standards or not.
Case Study:
Test Factors: 15
minus 4 (depending on project requirements) = 11
plus 1 (depending on scope of requirements) = 12
minus 3 (depending on risks in testing) = 9 (finalized factors / issues)
In the above example the Project Manager / Test Manager finalized 9 testing topics / issues to be applied by the testing team on the software build.
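The case-study arithmetic can be expressed as a one-line helper (a sketch; the counts are the manager's judgment calls, not computed values):

```python
def finalize_factors(total=15, dropped_for_requirements=4,
                     added_for_scope=1, dropped_for_risks=3):
    """Mirror the case study above: 15 - 4 + 1 - 3 = 9 finalized factors."""
    return total - dropped_for_requirements + added_for_scope - dropped_for_risks
```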
2. Test Planning:
After the test strategy is finalized, test-lead-level people develop the test plan documents. At this stage the test lead prepares the system test plan and divides it into module test plans. Every test plan defines "What to Test?", "How to Test?", "When to Test?", and "Who will Test?".
To develop these test plans, test leads follow the approach below.
Approach:
1. Testing team formation (based on the project plan)
2. Identify tactical risks
3. Prepare test plans (using the development documents (SRS) and the test strategy)
4. Review the test plans
Note: system test plans are compulsory; module test plans are optional.
a. Testing Team Formation:
Generally the test-planning task starts with testing team formation. At this stage, the test lead depends on the factors below:
- Project size (e.g. lines of code or function points)
- Available number of test engineers
- Test duration
- Test environment resources (e.g. testing tools)
Case study:
Client/server, web site, ERP: 3 to 5 months of system testing
System software (embedded, mobile, ...): 7 to 9 months of system testing
Machine-critical software (AI, robots, satellites): 12 to 15 months of system testing
b. Identify Tactical Risks:
After testing team formation, the test lead concentrates on risk analysis and assumptions with respect to the formed testing team.
Risk 1: Lack of knowledge of the test engineers on that project
Risk 2: Lack of time
Risk 3: Lack of documentation
Risk 4: Delay in delivery
Risk 5: Lack of development process rigor
Risk 6: Lack of resources
Risk 7: Lack of communication
c. Prepare Test Plan:
After testing team formation and risk analysis, the test lead concentrates on test plan document preparation. At this stage the test lead uses the IEEE-829 test plan format (IEEE: Institute of Electrical & Electronics Engineers).
Format:
1. Test Plan ID: The title of test plan documents for future
reference
2. Introduction: About Project
3. Test Items: List of modules in our project
4. Features to be tested: List of modules or functions to be
tested
5. Features not to be tested: List of modules that were already tested in previous version testing.
Fields 3 to 5 answer "What to Test?"
6. Approach: List of testing techniques to be applied on
above (selected)
Modules
7. Test deliverables: Required testing documents to be
prepared by test engineer
8. Test Environment: Required hardware and software to conduct testing on the above modules.
9. Entry Criteria: Test engineers can start test execution after meeting the criteria below:
- Test cases developed and reviewed
- Test environment established
- Software build received from developers
10. Suspension Criteria: Sometimes test engineers stop test execution temporarily because:
- The test environment is not working
- Too many defects (quality gaps, job gaps) are pending on the development side
11. Exit Criteria: Defines the exit point of the test execution process:
- All requirements tested
- All major bugs resolved
- Final build stable with respect to customer requirements
Fields 6 to 11 answer "How to Test?"
12. Staff & Training Needs: The selected test engineers' names and the number of training sessions required for them
13. Responsibilities: The mapping between the names of the test engineers and the requirements in the project.
Fields 12 to 13 answer "Who will Test?"
14. Schedule: Dates and times ("When to Test?")
15. Risks & Assumptions: List of analyzed risks and their
assumptions to overcome
16. Approvals: Signatures of P.M or T.M & Test Lead
d. Review Test Plan:
After the test plan document is prepared, the test lead conducts a review meeting to estimate the completeness and correctness of that document. The testing team members selected for the project are also involved in this review meeting.
3. Test Design:
After test planning, the selected test engineers concentrate on test design, test execution, and test reporting. Generally the selected test engineers start the testing job with test design in every project. In test design, every test engineer studies all the requirements of the project and prepares test cases for the selected requirements only, with respect to the test plan.
In test design, test engineers use three types of test case design methods to prepare test cases for their responsible requirements.
1. Functional & System Specification Based Test Case Design
2. Use Cases Based Test Case Design
3. Application Build Based Test Case Design
Test Case:
Every test case defines a unique test condition. Every test case is self-standing and self-cleaning. To improve understandability in test design, test engineers start every test case title with the English words "verify" or "check". Every test case is traceable to a requirement in the project.
1. Functional & System Specification Based Test Case Design:
The development flow is: BRS, then SRS (functional & system requirements), then HLD & LLD, then coding, producing the executable build; test cases are prepared from the SRS and then run (executed) against the build.
Test engineers prepare the maximum number of test cases depending on the functional & system requirements in the SRS. In this type of test case writing, test engineers follow the approach below.
Approach for Writing Test Cases:
Step 1: Collect the functional and system specifications for the responsible requirements (modules)
Step 2: Select one specification from that list
  2.1. Identify the entry point (start)
  2.2. Identify the inputs required
  2.3. Study the normal flow
  2.4. Identify outputs and outcomes
  2.5. Identify the exit point (end)
  2.6. Identify alternative flows and exceptions (rules)
Step 3: Prepare test case titles or test scenarios
Step 4: Review the test case titles for completeness and correctness
Step 5: Prepare a complete document for every test case title
Step 6: Go to Step 2 until all specifications are studied and all test cases written
Functional Specification 1:
A login process allows a user id and password to authorize users. The user id takes alphanumeric characters in lower case, 4 to 16 characters long. The password takes alphabetic characters in lower case, 4 to 8 characters long.
Prepare test case titles or scenarios:
Test Case 1: Verify user id value
BVA (boundary value analysis, on size):
Min:   4 chars  - pass
Max:   16 chars - pass
Min-1: 3 chars  - fail
Min+1: 5 chars  - pass
Max-1: 15 chars - pass
Max+1: 17 chars - fail
ECP (equivalence class partitioning, on type):
Valid: a-z, 0-9
Invalid: A-Z, special characters, blank field
Test Case 2: Verify password value
BVA (boundary value analysis, on size):
Min:   4 chars - pass
Max:   8 chars - pass
Min-1: 3 chars - fail
Min+1: 5 chars - pass
Max-1: 7 chars - pass
Max+1: 9 chars - fail
Test Case 3: Verify login operation
Decision Table:
User id | Password | Criteria
Valid   | Valid    | Pass
Valid   | Invalid  | Fail
Invalid | Valid    | Fail
Value   | Blank    | Fail
Blank   | Value    | Fail
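The BVA and ECP rules for this login specification can be sketched in Python (the function names are illustrative):

```python
import string

def bva_sizes(min_len, max_len):
    """Boundary value analysis on field length: returns the lengths that
    should pass (min, min+1, max-1, max) and fail (min-1, max+1)."""
    passing = sorted({min_len, min_len + 1, max_len - 1, max_len})
    failing = sorted({min_len - 1, max_len + 1})
    return passing, failing

def is_valid_user_id(value):
    """ECP for the user id: lowercase letters and digits only (the valid
    classes), with a length inside the BVA pass range of 4 to 16."""
    allowed = set(string.ascii_lowercase + string.digits)
    return 4 <= len(value) <= 16 and all(c in allowed for c in value)
```

`bva_sizes(4, 16)` reproduces the user id table, and `bva_sizes(4, 8)` the password table.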
Functional Specification 2:
In an insurance application, users can apply for different types of policies. When a user selects Type A insurance, the system asks for that user's age. The age value should be greater than 16 years and less than 80 years.
Prepare test case titles or scenarios:
Test Case 1: Verify the selection of Type A insurance
Test Case 2: Verify focus on age when the user selects Type A insurance
Test Case 3: Verify the age value
BVA (boundary value analysis, on range):
Min:   17 years - pass
Max:   79 years - pass
Min-1: 16 years - fail
Min+1: 18 years - pass
Max-1: 78 years - pass
Max+1: 80 years - fail
ECP (equivalence class partitioning, on type):
Valid: 0-9
Invalid: A-Z, a-z, special characters, blank field
Functional Specification 3:
A door opens when a person comes in front of the door, and the door closes when that person goes inside.
Prepare test case titles or scenarios:
Test Case 1: Verify door open operation
Decision Table:
Person  | Door   | Criteria
Present | Open   | Pass
Present | Closed | Fail
Absent  | Open   | Fail
Absent  | Closed | Pass
Test Case 2: Verify door close operation
Decision Table:
Person | Door   | Criteria
Inside | Open   | Fail
Inside | Closed | Pass
(For a person outside there are more possibilities, but they are covered by the test case above.)
Test Case 3: Verify door operation when the person is standing in the middle of the doorway
Functional Specification 4:
A computer shutdown operation.
Prepare test case titles or scenarios:
Test Case 1: Verify shutdown option using start menu
Test Case 2: Verify shutdown option using Alt + F4
Test Case 3: Verify shutdown operation
Test Case 4: Verify shutdown operation when a process is running
Test Case 5: Verify shutdown operation using power off button
Functional Specification 5:
In a shopping application users purchase different types of items. The purchase order allows the user to select an item number and enter a quantity up to 10, and returns the total amount along with the unit item price.
Prepare test case titles or scenarios:
Test Case 1: Verify the selection of item number
Test Case 2: Verify quantity value
BVA (boundary value analysis, on range):
Min:   1  - pass
Max:   10 - pass
Min-1: 0  - fail
Min+1: 2  - pass
Max-1: 9  - pass
Max+1: 11 - fail
ECP (equivalence class partitioning, on type):
Valid: 0-9
Invalid: A-Z, a-z, special characters, blank field
Test Case 3: Verify calculation such as total = price * Qty
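Test Case 3's calculation, together with the quantity rule above, can be sketched as a hypothetical helper (not taken from any real application):

```python
def total_amount(unit_price, quantity):
    """Purchase-order rule from the spec: quantity must be 1..10,
    and the total is unit price times quantity."""
    if not 1 <= quantity <= 10:
        raise ValueError("quantity must be between 1 and 10")
    return unit_price * quantity
```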
Functional Specification 6:
Washing machine operation
Prepare test case titles or scenarios:
Test Case 1: Verify power supply
Test Case 2: Verify door open
Test Case 3: Verify water filling with detergent
Test Case 4: Verify clothes filling
Test Case 5: Verify door closing
Test Case 6: Verify door closing due to clothes overflow
Test Case 7: Verify washing settings
Test Case 8: Verify washing operation
Test Case 9: Verify washing operation with improper power
supply (low voltage)
Test Case 10: Verify washing operation with clothes overload
inside
Test Case 11: Verify washing operation with door open in
middle of the process
Test Case 12: Verify washing operation with lack of water
Test Case 13: Verify washing machine with water leakage
Test Case 14: Verify washing operation with improper settings
Test Case 15: Verify washing operation with machinery problems
Functional Specification 7:
In an e-banking application, users connect to the bank server through an Internet connection. Users fill in the fields below to connect to the bank server:
Password: 6-digit number
Area code: 3-digit number, optional
Prefix: 3-digit number that does not start with 0 or 1
Suffix: 6-character alphanumeric value
Commands: cheque deposit, money transfer, mini statement, bill pay
Prepare test case titles or scenarios:
Test Case 1: Verify password value
BVA (boundary value analysis, on size; Min = Max = 6):
6 chars - pass
5 chars - fail
7 chars - fail
ECP (equivalence class partitioning, on type):
Valid: 0-9
Invalid: A-Z, a-z, special characters, blank field
Test Case 2: Verify area code value
BVA (boundary value analysis, on size; Min = Max = 3):
3 chars - pass
2 chars - fail
4 chars - fail
ECP (equivalence class partitioning, on type):
Valid: 0-9
Invalid: A-Z, a-z, special characters
Test Case 3: Verify prefix value
BVA (boundary value analysis, on range):
Min:   200  - pass
Max:   999  - pass
Min-1: 199  - fail
Min+1: 201  - pass
Max-1: 998  - pass
Max+1: 1000 - fail
ECP (equivalence class partitioning, on type):
Valid: 0-9
Invalid: A-Z, a-z, special characters, blank field
Test Case 4: Verify suffix value
BVA (boundary value analysis, on size; Min = Max = 6):
6 chars - pass
5 chars - fail
7 chars - fail
ECP (equivalence class partitioning, on type):
Valid: a-z, 0-9
Invalid: special characters, blank field
Test Case 5: Verify connection to bank server
Field Values                          | Criteria
All values valid                      | Pass
Any one value invalid                 | Fail
Any one field blank, except area code | Fail
All valid & area code blank           | Pass
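The field rules and the connection decision table can be combined into one validation sketch (the function and argument names are assumptions for illustration, not part of the application):

```python
import re

def can_connect(password, prefix, suffix, area_code=None):
    """Test Case 5 criteria: pass only when every field is valid;
    area code may be blank (None) because it is optional."""
    if not re.fullmatch(r"[0-9]{6}", password):
        return False                        # password: 6-digit number
    if area_code is not None and not re.fullmatch(r"[0-9]{3}", area_code):
        return False                        # 3 digits when provided
    if not re.fullmatch(r"[2-9][0-9]{2}", prefix):
        return False                        # 3 digits, not starting with 0 or 1
    if not re.fullmatch(r"[0-9a-z]{6}", suffix):
        return False                        # 6-character alphanumeric
    return True
```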
Functional Specification 8:
A computer restart operation.
Functional Specification 9:
Money withdrawal from an ATM machine.
Test Case 1: Verify card insertion
Test Case 2: Verify card insertion at a wrong angle or improperly
Test Case 3: Verify card insertion with improper account
Test Case 4: Verify pin number entry
Test Case 5: Verify operation when you entered wrong pin
number 3 times
Test Case 6: Verify language selection
Test Case 7: Verify account type selection
Test Case 8: Verify operation when you selected invalid
account type w.r.p.t that inserted card
Test Case 9: Verify withdrawal option selection
Test Case 10: Verify amount entry
Test Case 11: Verify withdrawal operation: correct amount dispensed, right receipt, and the card returned
Test Case 12: Verify withdrawal operation with wrong denominations in the amount
Test Case 13: Verify withdrawal operation when our amount >
possible balance
Test Case 14: Verify withdrawal operation when there is a lack of money in the ATM
Test Case 15: Verify withdrawal operation when the requested amount is greater than the day limit
Test Case 16: Verify withdrawal operation when the current transaction number exceeds the day limit on the number of transactions
Test Case 17: Verify withdrawal operation when we have
network problem
Test Case 18: Verify cancel after insertion of card
Test Case 19: Verify cancel after entry of pin number
Test Case 20: Verify cancel after selection of language
Test Case 21: Verify cancel after selection of account type
Test Case 22: Verify cancel after entry of amount.
Test Case Documentation Format:
After selecting test case titles or scenarios, test engineers document each test case with complete information, using the IEEE-829 format.
Format:
1. Test Case ID: Unique No. or name
2. Test Case Name: The title or scenario of corresponding
test case.
3. Feature to be Tested: Corresponding module or function or
service
4. Test Suite ID: The name of the test batch of which this test case is a member (a dependent group of test cases)
5. Priority: The importance of the test case in terms of functionality
P0: basic functionality (core functionality of the project)
P1: general functionality (compatibility, reliability, performance, ...)
P2: cosmetic functionality (usability of the project)
6. Test Environment: The required hardware and software to execute the test case on the application build.
7. Test Effort: Expected time to execute the test case on the build (per ISO standards), e.g. 20 minutes on average manually, or 5 minutes using a tool.
8. Test Duration: Approximate Date & Time.
9. Precondition (or Test Setup): Necessary tasks to do before starting test case execution
10. Test Procedure (or) Data Matrix:
Format for Test Procedure:
Step No. | Action | Input Required | Expected | Actual | Result | Defect
(the first four columns are filled during test design; the last three during test execution)
Format for Data Matrix:
Input Object | ECP (Type): Valid / Invalid | BVA (Size/Range): Min / Max
11. Test Case Pass (or) Fail Criteria: When this case is considered passed and when it is considered failed.
Note:
1. The 11-field test case format above is not mandatory, because some fields' values are common to most test cases and some fields' values are easy to remember or derive.
2. Generally, test cases cover objects and operations (more than one object). If a test case covers an input object's values, test engineers prepare a Data Matrix.
3. If a test case covers an operation or function, test engineers prepare a Test Procedure from base state to end state.
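The note's object-vs-operation distinction suggests a single record type that carries either a data matrix or a procedure; a minimal sketch (field names are illustrative, not an IEEE-mandated schema):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestCase:
    """Cut-down sketch of the test case fields described above."""
    case_id: str
    name: str
    suite_id: str
    priority: str                                  # "P0", "P1", or "P2"
    precondition: str = ""
    data_matrix: Optional[dict] = None             # used for input-object cases
    procedure: List[str] = field(default_factory=list)  # used for operation cases

# An input-object case carries a data matrix and no procedure:
tc = TestCase("TC_LOGIN_1", "Verify user id", "TS_LOGIN", "P0",
              precondition="User id object takes values from the keyboard",
              data_matrix={"valid": "a-z, 0-9",
                           "invalid": "A-Z, special characters, blank",
                           "min": 4, "max": 16})
```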
Functional Specification 10:
A login process allows a user id and password to authorize users. The user id takes alphanumeric characters in lower case, 4 to 16 characters long. The password object takes alphabetic characters in lower case, 4 to 8 characters long.
Prepare Test Case Document 1:
1. Test Case ID: TC_LOGIN_OURNAME_1 (all capital letters)
2. Test Case Name: Verify user id
3. Test Suite ID: TS_LOGIN
4. Priority: P0
5. Precondition: User id object takes values from the keyboard
6. Data Matrix:
Input Object: User id | ECP Valid: a-z, 0-9 | ECP Invalid: A-Z, special characters, blank field | BVA Min: 4 | BVA Max: 16
Prepare Test Case Document 2:
1. Test Case ID: TC_LOGIN_OURNAME_2 (all capital letters)
2. Test Case Name: Verify password
3. Test Suite ID: TS_LOGIN
4. Priority: P0
5. Precondition: Password object takes values from the keyboard
6. Data Matrix:
Input Object: Password | ECP Valid: a-z | ECP Invalid: 0-9, A-Z, special characters, blank field | BVA Min: 4 | BVA Max: 8
Prepare Test Case Document 3:
1. Test Case ID: TC_LOGIN_OURNAME_3 (all capital letters)
2. Test Case Name: Verify login operation
3. Test Suite ID: TS_LOGIN
4. Priority: P0
5. Precondition: A registered user id & password are available to the tester
6. Test Procedure:
Step 1: Focus on the login window | input: none | expected: user id object focused
Step 2: Fill the fields | input: user id & password | expected: "OK" button enabled
Step 3: Click "OK" | expected results:
  Valid id, valid password: next message
  Valid id, invalid password: error message
  Invalid id, valid password: error message
  Valid id, blank password: error message
  Blank id, valid password: error message
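Step 3's decision table can be encoded directly (a sketch; the state labels are illustrative):

```python
def login_result(user_id_state, password_state):
    """Expected outcome per the test procedure above: only a valid user
    id with a valid password succeeds; every other combination errors."""
    if (user_id_state, password_state) == ("valid", "valid"):
        return "next message"
    return "error message"
```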
2. Use Cases Based Test Case Design:
An alternative method for test case selection is use-case-based test case design. This method is preferable for outsourced testing companies. Generally most testing people prepare test cases depending on the functional & system specifications in the corresponding project SRS; sometimes testing people prepare test cases depending on use cases instead. Use cases are more elaborate and more understandable than functional and system specifications.
In functional & system specification based design, test cases are derived from the SRS (BRS, then SRS, then HLDs & LLDs, then coding); in use-case-based design, test cases are derived from use cases prepared alongside the SRS. The test team receives the use cases from project management. Every use case describes a functionality with all the information required to prepare test cases, and every use case follows a standard format, unlike a free-form functional specification.
Format:
1. Use Case Name: The name of the use case, for future reference
2. Use Case Description: Summary of the functionality
3. Actors: Names of the actors participating in the corresponding functionality
4. Related Use Cases: Names of related use cases that have a dependency on this use case
5. Preconditions: List of necessary tasks to do before starting to test this functionality in the project
6. Activity Flow Diagram: The graphical notation of the corresponding functionality
7. Primary Scenario: Step-by-step actions to perform the corresponding functionality
8. Alternative Scenarios: Alternative lists of actions to perform the same functionality
9. Post Conditions: Specifies the exit point of the corresponding functionality
10. UI Mock-up: Model screen or prototype
11. Special Requirements: List of rules to be followed, if possible
Conclusion:
With the use case format above, project management provides documentation for every functionality with complete details. Depending on those use cases, test engineers prepare test cases using the IEEE-829 format.
3. Application Build Based Test Case Design:
Generally test engineers prepare test cases depending on functional & system specifications or use cases. After the maximum number of test cases has been selected, test engineers prepare some additional test cases depending on the application build received from the development team. These new test cases concentrate only on the usability of the screens in the application build. They cover:
1. Ease of use
2. Look & feel
3. Speed of the interface
4. Correctness of user manuals (help documents)
Example Test Cases:
Test Case-1: Verify spelling in every screen.
Test Case-2: Verify contrast of each object in every screen
Test Case-3: Verify alignment of objects in every screen.
Test Case-4: Verify color commonness in all screens
Test Case-5: Verify font commonness in all screens
Test Case-6: Verify size commonness in all screens
Test Case-7: Verify functionality-grouped objects in screens
Test Case-8: Verify borders of functionality-grouped objects.
Test Case-9: Verify tool tips (e.g. messages about icons in screens)
Test Case-10: Verify the placement of multiple-data objects in screens (e.g. list boxes, combo boxes, table grids, ActiveX controls, menus, ...)
Test Case-11: Verify scroll bar.
Test Case-12: Verify that labels of objects in every screen use initial capitals
Test Case-13: Verify keyboard access in the application build
Test Case-14: Verify abbreviations in all screens (e.g. shortcuts)
Test Case-15: Verify information repetition in screens
Test Case-16: Verify help documents (Help menu contents)
Note:
Generally test engineers prepare the maximum number of test cases depending on the functional & system specifications in the SRS; the remaining test cases are prepared using the application build, because the functional & system specifications do not provide complete information about every small issue in the project.
Sometimes testing people use use cases instead of the functional & system specifications in the SRS.
Review Test Cases:
After test case selection and documentation, the test lead conducts a review meeting with the test engineers. In this review the test lead concentrates on the completeness and correctness of the test cases the test engineers prepared. In this coverage analysis the test lead uses two types of coverage:
- Requirement-based test case coverage
- Testing-technique-based test case coverage
After this review meeting, the test engineers concentrate on test execution.
4. Test Execution:
In test execution, test engineers concentrate on test case execution and on defect reporting and tracking. At this stage the testing team conducts a small meeting with the development team for version control and establishment of the test environment.
1. Version Control:
During test execution, development people assign a unique version number to each software build after performing the required changes. This version numbering system must be understandable to the testing people.
For build version control the development people use version control software, e.g. VSS (Visual SourceSafe).
2. Levels of Test Execution:
The development team releases an initial build, on which the testing team performs Level-0 (sanity / smoke) testing. On the stable build the testing team performs Level-1 (comprehensive / real) testing and reports defects. After bug fixing, the development team releases a modified build, on which the testing team performs Level-2 (regression) testing of the resolved bugs. Finally the testing team performs Level-3 (final regression / post-mortem) testing.
3. Levels of Test Execution vs. Test Cases:
Level-0 (initial build): selected test cases for basic functionality (sanity/smoke testing)
Level-1 (stable build): all test cases, in order to detect defects (comprehensive testing)
Level-2 (modified build): selected test cases with respect to the modifications (regression testing)
Level-3 (master build): selected test cases with respect to bug density (final regression)
After that, the golden build (ready for UAT) is released to the customer.
1. Level-0 (Sanity / Smoke Testing): Generally testing people start test execution with Level-0 testing. It is also known as sanity/smoke testing, tester acceptance testing (TAT), build verification testing, or testability testing.
At this level, test engineers concentrate on the 8 factors below while operating the initial build:
- Understandable
- Operable
- Observable
- Controllable
- Consistent
- Simple
- Maintainable
- Automatable
Operation + Observation = Testing
Programmer: expects logic and develops functionality.
Tester: expects (verifies) customer requirements.
2. Level-1 (Comprehensive Testing):
After receiving a stable build from the development team, test engineers execute all test cases sequentially, either manually or with automation.
In manual test execution, the test engineer compares the expected values specified in the test cases with the actual values produced by the build, and records the outcomes in a "test log" document. This document consists of 3 types of entries:
Passed: All expected values of the test case are equal to the actual values of the build.
Failed: Any one expected value differs from any one actual value of the build.
Blocked: Execution of dependent test cases is postponed to the next cycle (after a modified build) because a parent functionality failed.
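The three test-log entry types can be sketched as a small classifier (hypothetical helper, not a real tool's API):

```python
def log_entry(expected, actual, parent_failed=False):
    """Classify one test-log entry as described above: Blocked when the
    parent functionality failed, Passed when every expected value equals
    the build's actual value, Failed otherwise."""
    if parent_failed:
        return "Blocked"   # execution postponed to the next cycle
    return "Passed" if list(expected) == list(actual) else "Failed"
```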
3. Level-2 (Regression Testing):
During the Level-1 comprehensive testing above, testing people report mismatches between the test cases' expected values and the build's actual values to the development team as defect reports. After reviewing and resolving a defect, the development people release a modified build to the testing team, along with a "release note". The responsible test engineers study that release note to understand the modifications in the modified build, and then concentrate on regression testing of the modified build, with respect to the modifications mentioned in the release note, to validate those modifications.
Study the release note and consider the severity of the resolved bug:
High severity: all P0, all P1, and maximum P2 test cases
Medium severity: all P0, maximum P1, and some P2 test cases
Low severity: some P0, some P1, and some P2 test cases
(re-executed on the modified build)
Case 1: If the severity of the bug resolved by the development team is high, test engineers re-execute all P0, all P1, and a carefully selected maximum of P2 test cases on the modified build, with respect to the modifications mentioned in the release note.
Case 2: If the resolved bug's severity is medium, test engineers re-execute all P0, a carefully selected maximum of P1, and some P2 test cases.
Case 3: If the resolved bug's severity is low, test engineers re-execute carefully selected P0, P1, and P2 test cases.
Case 4: If the testing team received the modified build due to sudden changes in customer requirements, test engineers re-execute all P0, all P1, and maximum P2 test cases.
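Cases 1-3 amount to a severity-to-priorities selection rule, sketched below (the exact "some"/"maximum" counts remain the test lead's judgment):

```python
def regression_selection(severity):
    """Which test-case priorities to re-execute on a modified build,
    following Cases 1-3 above."""
    selections = {
        "high":   {"P0": "all",  "P1": "all",     "P2": "maximum"},
        "medium": {"P0": "all",  "P1": "maximum", "P2": "some"},
        "low":    {"P0": "some", "P1": "some",    "P2": "some"},
    }
    try:
        return selections[severity]
    except KeyError:
        raise ValueError("severity must be high, medium or low")
```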
5. Test Reporting:
During Level-1 and Level-2 test execution, test engineers report mismatches between test case expected values and build actual values to the development team as defect reports.
The development people receive defect reports from the testing team in a standard format, followed by every test engineer during test execution.
IEEE-829 Defect Report Format:
1. Defect ID: Unique no./name for future reference.
2. Description: Summary about defect.
3. Build Version: The version number of the current build, in which the test engineer detected this defect.
4. Feature: The name of the module/function in which the test engineer found this defect.
5. Test Case Name: The name of the failed test case, during whose execution the test engineer found this defect.
6. Status: "New" when reporting for the first time; "Re-open" when re-reporting.
7. Reproducible: "Yes" if the defect appears every time in test case execution; "No" if the defect appears only rarely.
8. If Yes: attach the test procedure.
9. If No: attach a snapshot and strong reasons.
10. Severity: The seriousness of the defect in terms of functionality.
High: not able to continue testing without resolving this defect (show-stopper)
Medium: able to continue the remaining testing, but resolution is mandatory
Low: able to continue the remaining testing; the defect may or may not be resolved
11. Priority: Importance of defect to resolve in terms of
customer (High, Medium, Low)
12. Detected by: Name of test engineer.
13. Assigned To: The name of responsible person at
development side to receive this defect report.
14. Suggested Fix (Optional): Reasons to accept and resolve
this defect.
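The fields above can be collected into a simple record type. This is a hedged sketch: the class and field names are illustrative choices for the example, not a schema mandated by IEEE 829.

```python
from dataclasses import dataclass

# Illustrative record mirroring the defect-report fields listed above.

@dataclass
class DefectReport:
    defect_id: str
    description: str
    build_version: str
    feature: str
    test_case_name: str
    status: str = "New"        # "New" on first report, "Re-open" on re-report
    reproducible: bool = True  # True -> attach test procedure; False -> snapshot + reasons
    severity: str = "Medium"   # High / Medium / Low: impact on functionality
    priority: str = "Medium"   # High / Medium / Low: urgency for the customer
    detected_by: str = ""
    assigned_to: str = ""
    suggested_fix: str = ""    # optional

# Hypothetical defect for illustration only
bug = DefectReport("DR-101", "Login fails with valid credentials",
                   "1.2.0", "Login", "TC_Login_03",
                   severity="High", priority="High",
                   detected_by="tester", assigned_to="dev-lead")
print(bug.status)  # -> New
```

Keeping severity and priority as separate fields matches the distinction drawn in items 10 and 11: one measures functional impact, the other the customer's urgency.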
Resolution Type:
After receiving a defect report from the testing team, the responsible development people conduct a review meeting and send a resolution type back to the testing team. Common resolution types include:
1. Enhancement: the reported defect is rejected because it relates to future requirements of the customer.
2. Duplicate: the reported defect is rejected because it has already been reported.
3. Hardware Limitations: the reported defect is rejected because it arises from limitations of hardware devices.
4. Software Limitations: the reported defect is rejected because it arises from limitations of the software technologies used (e.g. MS Access).
5. Not Applicable: the reported defect is rejected because it has no proper meaning.
6. Functions as Designed: the reported defect is rejected because the coding is correct with respect to the design documents.
7. Need More Information: the reported defect is neither accepted nor rejected; the developers require more information to understand the defect.
8. Not Reproducible: the reported defect is neither accepted nor rejected; the developers require the correct procedure to reproduce the defect.
9. No Plan to Fix It: the reported defect is neither accepted nor rejected; the development people require some extra time.
10. Open: the reported defect is accepted and the development people are ready to resolve it through changes in the code.
Answer / hasini
PET PROCESS (a refinement of the V-model):
Test Initiation
Test Planning
Test Design
Test Execution
Test Reporting
Test Closure
Answer / rajesh
We can, and should, explain the test process in an easy and understandable manner, like a great test engineer would:
1. Test Initiation
2. Test Planning
3. Test Design
4. Test Execution
5. Defect Reporting
6. Test Closure
That's it, friends.
Answer / prasannat
In my company the testing process starts with TEST INITIATION. In this phase the Project Manager prepares the test methodology for the corresponding project. He decides on the reasonable tests depending on the requirements and releases the document to the Test Lead.
To prepare the TEST PLAN, my Test Lead studies the BRS, SRS, design documents, development plan, and test strategy. He goes to the HR Manager to discuss team formation. After completing the testing-team formation and risk analysis, he prepares the complete TEST PLAN and the detailed test plans. The Test Lead decides the schedule of the different tests: what to test, when to test, how to test, and who is to test. After the test plan is completed and reviewed, he takes approval from the Project Manager and provides training to the selected test engineers.
After the required training, people like me concentrate on test-case outlines. Based on the outlines, we prepare in-depth test-case documents. After receiving the build from the developers and performing sanity testing, we start test execution to detect defects. We report the defects in an Excel format. After modifications, the development people release the modified build, on which we conduct regression testing. After all reasonable tests are completed and the defects are closed, management concentrates on User Acceptance Testing, in which feedback is collected from real or model customers. After completion of UAT, my Test Lead prepares the final test summary report, which the TL or PM submits to the customer.
Answer / lakshmi
In our organization, after the statement of work is signed:
1. The team lead starts preparing the test plan and sends it for the client's approval.
2. If the client needs any enhancements, the changes are made and the plan is sent again for the client's approval.
3. Once the test plan is baselined, we start authoring the test cases.
4. These test cases are sent for the client's approval.
5. The client sends a QA plan.
6. Based on that, the team lead assigns tasks, i.e. distributes the work among the team members (effort estimates are done here).
7. At each phase, peer reviews etc. are conducted.
8. Once the test cases are authored, we start filling in the RTM (Requirements Traceability Matrix) immediately.
9. We download the build.
10. Perform a smoke test.
11. Perform a sanity check.
12. Start executing all the test cases.
13. If any bugs are found during execution, log them using a bug-tracking tool.
14. During the entire process, all weekly and daily status reports are maintained.
15. When you get a modified build, perform regression testing to make sure all the fixed bugs are correct and there are no side effects, i.e. the old functionality is not affected by the new changes.
16. After this cycle is completed, i.e. the build is ready for release, we call it a "release candidate" or "gold release".
17. We perform a User Acceptance Test on this gold release.
18. Finally, the team lead prepares the test summary report.
This is the process. If any corrections are needed, please let me know.
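Step 8 of the process above, filling in the Requirements Traceability Matrix, can be illustrated with a minimal sketch. The requirement and test-case IDs here are made up for the example.

```python
# Minimal RTM sketch: map each requirement ID to the test cases
# that cover it, then flag requirements with no coverage.

rtm = {}  # requirement id -> list of covering test-case ids

def trace(requirement_id, test_case_id):
    """Record that a test case covers a requirement."""
    rtm.setdefault(requirement_id, []).append(test_case_id)

trace("REQ-01", "TC-001")
trace("REQ-01", "TC-002")
trace("REQ-02", "TC-003")

# Requirements with no covering test case signal a coverage gap.
all_requirements = ["REQ-01", "REQ-02", "REQ-03"]
uncovered = [r for r in all_requirements if r not in rtm]
print(uncovered)  # -> ['REQ-03']
```

Filling the RTM immediately after authoring the test cases, as the answer suggests, is what makes such coverage gaps visible before execution starts.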