Monday, 29 December 2014

TESTING DOCUMENT


Testing Documents

- Company level: Test Policy, Test Strategy
- Project level: Test Methodology, Test Plan, Test Cases, Test Procedures
- Test Engineer level: Test Scripts, Defect Reports, Final Test Summary Report


I.       TEST POLICY:

This document is developed by quality control people (almost management level). In this document QC defines the "Testing Objectives".

Testing Definition       :  Verification & Validation

Testing Process          :  Proper planning before starting testing.

Testing Standard         :  1 defect per 250 lines of code / 1 defect per 10 function points.

Testing Measurement      :  Quality Assessment Measurement (QAM), Test Management Measurement (TMM), Process Capability Measurement (PCM).

                                                                                                    (C.E.O.)




II.    TEST STRATEGY:

It is a company level document, developed by Quality Analyst / Project Manager category people. This document defines the testing approach.

Components

1.         Scope & Objective: Definition & purpose of testing in organization.
2.         Business Issues: Budget control for testing.

Example budget split: out of 100% of the project budget, 64% goes to development & maintenance and 36% to testing.

3.      Test Approach: Mapping between development stages and testing issues.

Testing Issues | IG & Analysis | Design | Coding | System Testing | Maintenance
---------------|---------------|--------|--------|----------------|----------------------------
Ease of use    | X             | X      | ✓      | ✓              | Depends upon change request
Authorization  | ✓             | ✓      | ✓      | ✓              |

This matrix is known as the "TEST RESPONSIBILITY MATRIX" (TRM).

4.      Test Deliverables: Required testing documents to be prepared.
5.      Roles & Responsibilities: Names of jobs in the testing team and their responsibilities.
6.      Communication & Status Reporting: Required negotiations between two consecutive jobs in the testing team.
7.      Defect Reporting & Tracking: Required negotiations between the testing team and the development team during test execution.
8.      Automation & Testing Tools: Purpose of automation and possibilities to go for test automation.
9.      Testing Measurements & Metrics: QAM, TMM, PCM.
10.  Risks & Mitigations: Possible problems that can come up in testing and solutions to overcome them.
11.  Change & Configuration Management: How to handle change requests during testing.
12.  Training Plan: Required training sessions for the testing team before starting the testing process.

Testing Issues:

To define quality software, organizations use at most 15 testing issues.

     QC     : Quality
     QA/PM  : Test Factor
     TL     : Testing Technique
     TE     : Test Cases

From the above model, a quality software testing process is formed with the 15 testing issues below.

1.      Authorization: Whether a user is valid or not to connect to the application.
2.      Access Control: Whether a valid user has permission to use a specific service or not.
3.      Audit Trail: Maintains metadata about user operations in our application.
4.      Continuity of Processing: Inter-process communication (module to module).
5.      Correctness: Meets customer requirements in terms of functionality.
6.      Coupling: Co-existence with other existing software to share resources.
7.      Ease of Use: User friendliness of the screens.
8.      Ease of Operation: Installation, uninstallation, dumping, downloading, uploading, etc.
9.      File Integrity: Creation of backups.
10.  Reliability: Recovery from abnormal states.
11.  Performance: Speed of processing.
12.  Portability: Runs on different platforms.
13.  Service Levels: Order of functionalities.
14.  Maintainability: Whether our application build is serviceable for a long time at the customer site or not.
15.  Methodology: Whether our testers are following standards or not during testing.

Test Factors vs Black Box Testing Techniques

1.      Authorization              → Security Testing
                                   → Functionality Testing

2.      Access Control             → Security Testing
                                   → Functionality Testing

3.      Audit Trail                → Functionality Testing
                                   → Error Handling Testing

4.      Continuity of Processing   → White Box Testing
                                   → Execution Testing
                                   → Operations Testing

5.      Correctness                → Functionality Testing
                                   → Requirements Testing

6.      Coupling                   → Inter-System Testing

7.      Ease of Use                → User Interface Testing
                                   → Manual Support Testing

8.      Ease of Operation          → Installation Testing

9.      File Integrity             → Functionality Testing
                                   → Recovery Testing

10.  Reliability                   → Recovery Testing (1 user)
                                   → Stress Testing (peak hours)

11.  Performance                   → Load Testing
                                   → Stress Testing
                                   → Storage Testing
                                   → Data Volume Testing

12.  Portability                   → Compatibility Testing
                                   → Configuration Testing

13.  Service Levels                → Functionality Testing
                                   → Stress Testing

14.  Maintainability               → Compliance Testing
                                   → Management-level Testing

15.  Methodology                   → Compliance Testing



III. TEST METHODOLOGY:

It is a project level document. The methodology provides the required testing approach to be followed for the current project. At this level, QA / PM selects the possible approaches for the corresponding project's testing through the below procedure.

Step 1: Acquire Test Strategy
Step 2: Determine project type.

Type          | IG & Analysis | Design | Coding | System Testing | Maintenance
--------------|---------------|--------|--------|----------------|------------
Traditional   | ✓             | ✓      | ✓      | ✓              | ✓
Off-the-Shelf | X             | X      | X      | ✓              | X
Maintenance   | X             | X      | X      | X              | ✓

Note: Depending on the project type, QA/PM decreases the number of columns in the TRM.

Step 3: Determine the project requirements.
Note: Depending on the project requirements, QA/PM decreases the number of rows in the TRM.

Step 4: Identify the scope of the application.
Note: Depending on expected future enhancements, QA/PM adds back some of the previously deleted rows and columns.

Step 5: Identify tactical risks.
Note: Depending on the analyzed risks, QA/PM decreases the number of selected issues (rows) in the TRM.

Step 6: Finalize the TRM for the current project.

Step 7: Prepare the system test plan.

Step 8: Prepare module test plans if required.

Testing Process:

PET Process (Process Expert Tools and Techniques):

It is a refined form of the V-model. It defines the mapping between development stages and testing stages. Following this model, organizations maintain a separate team for functionality and system testing; the remaining stages of testing are done by development people. This model was developed at HCL and is recognized by the QA Forum of India.

Development stages and the corresponding testing stages:

- Information Gathering (BRS) & Analysis (S/W RS)   →  Test Initiation
- Design                                            →  Test Planning & Training
- Coding, Unit & Integration Testing                →  Test Design, then Test Case Selection Closure
- Initial Build                                     →  Sanity / Smoke / TAT / BVT (Level 0)
- Test Automation: create test scripts / test batches / test suites
- Select a batch and start execution (Level 1)
- If a test engineer gets a mismatch in an independent batch: suspend that batch, report the defect to the developers, and wait for defect fixing and resolving
- On the modified build: run Regression testing (Level 2), then take up the next batch
- Otherwise: Test Closure
- Final Regression / Release Testing / Pre-Acceptance / Post Mortem (Level 3)
- User Acceptance Testing
- Sign Off

IV. TEST PLANNING:

After finalization of the possible tests to be applied for the corresponding project, test lead category people concentrate on test plan document preparation to define work allocation in terms of "what to test?", "who is to test?", "when to test?" and "how to test?".

To prepare the test plan document, the test plan author follows the approach below, taking the finalized TRM and the development documents as inputs and producing the system test plan as output:

-          Team Formation
-          Identify Tactical Risks
-          Prepare Test Plan
-          Review Test Plan

1.      Team Formation:

In general, test planning starts with testing team formation. To define a testing team, the test plan author depends on the below factors.
i.        Availability of testers
ii.      Test duration
iii.    Availability of test environment resources

Case Study:

Test Duration:
-          Client/Server or Web or ERP projects    - 3 to 5 months of functional & system testing
-          System software                         - 7 to 9 months of functional & system testing
-          Mission-critical software               - 12 to 15 months of functional & system testing
                                                     (robots, satellites, etc.)
-          Team Size                               - 3 : 1 (developers : testers)

2.    Identify Tactical Risks:

After completion of testing team formation, the test plan author analyses possible risks and mitigations.

Example:

Risk 1 :           Lack of knowledge of test engineers in that domain.
Risk 2 :           Lack of resources.
Risk 3 :           Lack of budget (time).
Risk 4 :           Lack of test data (sometimes test engineers conduct ad-hoc testing based on past experience).
Risk 5 :           Lack of development process rigor (seriousness).
Risk 6 :           Delays in delivery.
Risk 7 :           Lack of communication (between testing team & test lead / developers / testing team).

3.    Prepare Test Plan:

After completion of testing team formation and risk analysis, the test plan author concentrates on test plan documentation in the "IEEE" format.

Format:

1.      Test Plan ID: Unique number / name.
2.      Introduction: About the project.
3.      Test Items: Modules / functions / services / features.
4.      Features to be Tested: Modules selected for test design.
5.      Features not to be Tested: Which ones and why not.
Note: 3-5 → What to test?
6.      Approach: List of selected techniques to be applied to the above specified modules (from the finalized TRM).
7.      Feature Pass/Fail Criteria: When a feature passes and when a feature fails.
8.      Suspension Criteria: Possible abnormal situations that can arise during the above features' testing.
9.      Test Environment: Required hardware & software to conduct testing on the above features.
10.  Test Deliverables: Required testing documents to be prepared during testing.
11.  Test Tasks: Necessary tasks to do before starting every feature's testing.
Note: 6-11 → How to test?
12.  Staff & Training Needs: Names of the selected test engineers and their training requirements.
13.  Responsibilities: Work allocation to the above selected staff members.
Note: 12 & 13 → Who is to test?
14.  Schedule: Dates & times.
Note: 14 → When to test?
15.  Risks & Mitigations: Possible testing-level risks and solutions to overcome them.
16.  Approvals: Signatures of the test plan author and PM/QA.

4.      Review Test Plan:

After completion of the plan document preparation, the test plan author conducts a review for completeness and correctness. In this review the plan author follows "Coverage Analysis".

→           BR-based coverage (What to test? review)
→           Risk-based coverage (When and who is to test? review)
→           TRM-based coverage (How to test? review)

Case Study:

Deliverable                                    | Responsibility                                   | Completion Time
-----------------------------------------------|--------------------------------------------------|----------------
Test case selection                            | Test engineer                                    | 30 to 40 days
Test case review                               | Test lead / engineer                             | 4 to 5 days
Requirements traceability matrix               | Test lead                                        | 1 to 2 days
Test automation (including sanity testing)     | Test engineer                                    | 10 to 20 days
Test execution (including regression testing)  | Test engineer                                    | 40 to 60 days
Defect reporting                               | Test engineer / everyone                         | Ongoing
Communication & status reporting               | Test lead                                        | Twice weekly
Test closure & final regression                | Test lead / test engineer                        | 4 to 5 days
User acceptance testing                        | Customer site people / involvement of test team  | 4 to 5 days
Sign off                                       | Test lead                                        | 1 to 2 days


V.      Test Design:

After completion of test planning and the required training for the testing team, the corresponding testing team members prepare a list of test cases for their responsible modules. There are three types of test case design methods to cover core-level testing (usability & functionality testing).

1.    Business logic based test case design
2.    Input domain based test case design
3.    User interface based test case design

1.      Business Logic Based Test Case Design:

In general, test engineers write a set of test cases based on the use cases in the S/W RS. Every use case describes functionality in terms of input, process and output. Based on these use cases, test engineers write test cases to validate that functionality.

        BRS
          ↓
        Use Cases / Functional Specs  →  Test Cases
          ↓
        HLD
          ↓
        LLDs
          ↓
        Coding (.EXE)

From the above model, test engineers prepare test cases from the corresponding use cases, and every test case defines a test condition to be applied.

To prepare test cases, test engineers study the use cases using the below approach.

Step 1:  Collect the use cases of our responsible modules.
Step 2:  Select a use case and its dependencies from that list (a determinant use case and its dependent use cases).

Step 2.1:  Identify the entry condition (base state).
Step 2.2:  Identify the input required (test data).
Step 2.3:  Identify the exit condition (end state).
Step 2.4:  Identify the output and the outcome (expected results).



Example: in a "Multiply" screen, the user enters Input 1 and Input 2 and clicks OK; the Result appears in the same screen. In a login operation, the user enters UID and PWD and clicks OK; a new window (the inbox) appears. A result shown in the same screen is called the output; a result that appears in another screen or process is called the outcome.

Step 2.5:  Identify the normal flow (navigation).
Step 2.6:  Identify alternative flows and exceptions (protocols).
Step 3  :  Write test cases based on the above information.
Step 4  :  Review the test cases for completeness and correctness.
Step 5  :  Go to Step 2 until completion of all use cases.

Use Case 1:

A login process takes UID & PWD to validate users. During this validation, the login process allows UID as alphanumeric, 4 to 16 characters long, and PWD as lower-case alphabets, 4 to 8 characters long.

Test Case 1: Successful entry of UID.

BVA (Size):
Min     = 4     → Pass
Max     = 16    → Pass
Min - 1 = 3     → Fail
Min + 1 = 5     → Pass
Max - 1 = 15    → Pass
Max + 1 = 17    → Fail

ECP (Type):
Valid   : a-z, A-Z, 0-9
Invalid : special characters, blank

Test Case 2: Successful entry of PWD.

BVA (Size):
Min     = 4     → Pass
Max     = 8     → Pass
Min - 1 = 3     → Fail
Min + 1 = 5     → Pass
Max - 1 = 7     → Pass
Max + 1 = 9     → Fail

ECP (Type):
Valid   : a-z
Invalid : A-Z, 0-9, special characters, blank

Test Case 3: Successful login operation.

UID     | PWD     | Criteria
--------|---------|---------
Valid   | Valid   | Pass
Valid   | Invalid | Fail
Invalid | Valid   | Fail
Value   | Blank   | Fail
Blank   | Value   | Fail
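
The BVA rows above follow a fixed pattern around the boundaries. As a minimal illustration (not part of the original material), here is a Python sketch that generates the six standard boundary checks for any size range, reproducing the UID and PWD tables:

```python
def bva_size_cases(min_len, max_len):
    """Generate the six standard boundary-value checks for a size range."""
    return [
        ("Min",     min_len,     True),   # minimum length is accepted
        ("Max",     max_len,     True),   # maximum length is accepted
        ("Min - 1", min_len - 1, False),  # just below minimum fails
        ("Min + 1", min_len + 1, True),   # just above minimum passes
        ("Max - 1", max_len - 1, True),   # just below maximum passes
        ("Max + 1", max_len + 1, False),  # just above maximum fails
    ]

# UID: 4 to 16 characters; PWD: 4 to 8 characters (from Use Case 1)
for field, lo, hi in [("UID", 4, 16), ("PWD", 4, 8)]:
    for label, length, should_pass in bva_size_cases(lo, hi):
        print(f"{field}: {label} = {length:2d} -> {'Pass' if should_pass else 'Fail'}")
```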

Use Case 2:

In a shopping application, a user can create different purchase orders. Every purchase order allows selection of an item number and entry of a quantity up to 10. The system returns the price of one item and the total amount based on the given quantity.

Test Case 1: Successful selection of item number.
Test Case 2: Successful entry of QTY.







ECP (Type):
Valid   : 0-9
Invalid : A-Z, a-z, special characters, blank

BVA (Range):
Min     = 1     → Pass
Max     = 10    → Pass
Min - 1 = 0     → Fail
Min + 1 = 2     → Pass
Max - 1 = 9     → Pass
Max + 1 = 11    → Fail
Test Case 3: Successful calculation: Total = Price × QTY.

Use Case 3:

In an insurance application, a user can apply for different types of insurance policies. When they select insurance type B, the system asks for the age of the customer. The age should be > 18 years and < 60 years.

Test Case 1: Successful selection of type B insurance.
Test Case 2: Successful focus on the age field.
Test Case 3: Successful entry of age.






ECP (Type):
Valid   : 0-9
Invalid : A-Z, a-z, special characters, blank

BVA (Range):
Min     = 19    → Pass
Max     = 59    → Pass
Min - 1 = 18    → Fail
Min + 1 = 20    → Pass
Max - 1 = 58    → Pass
Max + 1 = 60    → Fail

Use Case 4:

A door opens when a person comes in front of it, and closes after the person comes in.

Test Case 1: Successful door opening when a person comes in front of the door.
Test Case 2: Unsuccessful door opening due to the absence of a person in front of the door.
Test Case 3: Successful door closing after the person gets in.
Test Case 4: Unsuccessful door closing due to a person standing in the doorway.
Use Case 5: Prepare test cases for a washing machine operation.

Test Case 1:  Successful power supply.
Test Case 2:  Successful door opening.
Test Case 3:  Successful filling of water.
Test Case 4:  Successful dropping of detergent.
Test Case 5:  Successful filling of clothes.
Test Case 6:  Successful door closing.
Test Case 7:  Unsuccessful door closing due to overflow of clothes.
Test Case 8:  Successful selection of washing settings.
Test Case 9:  Successful washing operation.
Test Case 10: Unsuccessful washing due to wrong settings.
Test Case 11: Unsuccessful washing due to lack of power.
Test Case 12: Unsuccessful washing due to lack of water.
Test Case 13: Unsuccessful washing due to water leakage.
Test Case 14: Unsuccessful washing due to the door opening in the middle of the process.
Test Case 15: Unsuccessful washing due to a machinery problem.
Test Case 16: Successful drying of clothes.

Use Case 6:
Prepare test cases for money withdrawal from an ATM.

Test Case 1:  Successful insertion of card.
Test Case 2:  Unsuccessful operation due to wrong angle of card insertion.
Test Case 3:  Unsuccessful operation due to an invalid card.
Test Case 4:  Successful entry of PIN number.
Test Case 5:  Unsuccessful operation due to entry of a wrong PIN number three times.
Test Case 6:  Successful selection of language.
Test Case 7:  Successful selection of account type.
Test Case 8:  Unsuccessful operation due to invalid account type selection.
Test Case 9:  Successful selection of the withdrawal option.
Test Case 10: Successful entry of amount.
Test Case 11: Unsuccessful operation due to wrong denominations.
Test Case 12: Successful withdrawal (correct amount, right receipt and card comes back).
Test Case 13: Unsuccessful withdrawal due to amount > available balance.
Test Case 14: Unsuccessful withdrawal due to amount > day limit (including multiple transactions).
Test Case 15: Unsuccessful transaction due to lack of money in the ATM.
Test Case 16: Unsuccessful transaction due to server failure.
Test Case 17: Unsuccessful transaction due to clicking cancel after inserting the card.
Test Case 18: Unsuccessful transaction due to clicking cancel after inserting the card & PIN.
Test Case 19: Unsuccessful transaction due to clicking cancel after inserting the card, PIN & language selection.
Test Case 20: Unsuccessful transaction due to clicking cancel after inserting the card, PIN, language & account type selection.
Test Case 21: Unsuccessful transaction due to clicking cancel after inserting the card, PIN, language, account type & amount.
 







Use Case 7:

In an e-banking application, users can connect to the bank server using their personal computers. In this login process the user fills in the below fields.

Password                : 6-digit number
Area code               : 3-digit number, allows blank
Prefix                  : 3-digit number, does not begin with 0 or 1
Suffix                  : 6-character alphanumeric
Commands                : Check deposit, money transfer, bill pay and mini statement

Test Case 1: Successful entry of password.

ECP (Type):
Valid   : 0-9
Invalid : A-Z, a-z, special characters, blank

BVA (Size):
Min = Max = 6    → Pass
Min - 1   = 5    → Fail
Min + 1   = 7    → Fail

Test Case 2: Successful entry of area code.

ECP (Type):
Valid   : 0-9, blank
Invalid : A-Z, a-z, special characters

BVA (Size):
Min = Max = 3    → Pass
Min - 1   = 2    → Fail
Min + 1   = 4    → Fail

Test Case 3: Successful entry of prefix.

ECP (Type):
Valid   : 0-9
Invalid : A-Z, a-z, special characters, blank

BVA (Range):
Min     = 200   → Pass
Max     = 999   → Pass
Min - 1 = 199   → Fail
Min + 1 = 201   → Pass
Max - 1 = 998   → Pass
Max + 1 = 1000  → Fail

Test Case 4: Successful entry of suffix.

ECP (Type):
Valid   : 0-9, A-Z, a-z
Invalid : special characters, blank

BVA (Size):
Min = Max = 6    → Pass
Min - 1   = 5    → Fail
Min + 1   = 7    → Fail

Test Case 5: Successful selection of commands such as check deposit, money transfer, bill pay and mini statement.
Test Case 6: Successful connection to the bank server with all valid values.
Test Case 7: Successful connection to the bank server without filling in the area code.
Test Case 8: Unsuccessful operation due to not filling in any field other than the area code.

Test Case Format:

During test design, test engineers write the list of test cases in the IEEE format.

1.      Test Case ID                 : Unique number or name.
2.      Test Case Name               : The name of the test condition to be tested.
3.      Feature to be Tested         : Module / function / feature.
4.      Test Suite ID                : Batch ID of which this case is a member.
5.      Priority                     : Importance of the test case.
                                       P0 : Basic functionality
                                       P1 : General functionality (I/P domain, error handling, compatibility, inter-systems, etc.)
                                       P2 : Cosmetic (user interface)
6.      Test Environment             : Required hardware and software to execute this test case.
7.      Test Effort (person-hours)   : Time to execute this test case (e.g., 20 minutes max).
8.      Test Duration                : Date & time.
9.      Test Setup                   : Required testing tasks to do before starting this case's execution.
10.  Test Procedure                  : Step-by-step procedure to execute this test case.

      Format:

Step No | Action | I/P Required | Expected | Actual | Result | Comments

(Step No, Action, I/P Required and Expected are filled in during test design; Actual, Result and Comments are filled in during test execution.)

11.  Test Case Pass/Fail Criteria: When this case passes and when this case fails.

Note: In general, test engineers write the list of test cases along with the step-by-step procedure only.

Example: Prepare the test procedure for the below test case: "Successful file save in Notepad".

Step No | Action                        | I/P Required     | Expected
--------|-------------------------------|------------------|------------------------------------------
1       | Open Notepad                  | -                | Empty editor
2       | Fill with text                | -                | Save icon enabled
3       | Click save icon               | -                | Save window appears
4       | Enter file name & click save  | Unique file name | File name appears in the editor title bar

Example 2: Prepare the test scenario with expected results for the below test case: "Successful mail reply" in Yahoo.

Step No | Action                        | I/P Required         | Expected
--------|-------------------------------|----------------------|-----------------------------------------------
1       | Login to site                 | Valid UID, valid PWD | Inbox appears
2       | Click Inbox                   | -                    | Mailbox appears
3       | Click mail subject            | -                    | Mail message appears
4       | Click Reply                   | -                    | Compose window appears with To: received mail ID; Sub: received mail subject; CC: off; BCC: off; MSG: received message with comments
5       | Type new message & click Send | -                    | Acknowledgement from web server
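
To make the step-table idea concrete, here is a small illustrative Python sketch (not from the original material) that represents a test procedure as a list of steps and derives the Result column by comparing Expected with Actual:

```python
# A minimal representation of an IEEE-style test procedure table.
steps = [
    # (step_no, action, expected)
    (1, "Open Notepad", "Empty editor"),
    (2, "Fill with text", "Save icon enabled"),
    (3, "Click save icon", "Save window appears"),
    (4, "Enter file name & click save", "File name appears in title bar"),
]

# Actual results as observed during test execution (hard-coded for illustration).
actuals = {
    1: "Empty editor",
    2: "Save icon enabled",
    3: "Save window appears",
    4: "File name appears in title bar",
}

for step_no, action, expected in steps:
    actual = actuals.get(step_no, "")
    result = "Pass" if actual == expected else "Fail"
    print(f"Step {step_no}: {action:32s} -> {result}")
```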


2.      Input Domain Based Test Case Design:

In general, test engineers write most test cases based on the use cases / functional specs in the S/W RS. These functional specifications provide functional descriptions with inputs, outputs and process, but they are not responsible for providing information about the size and type of input objects. To collect this type of information, test engineers study the "data model" of their responsible modules (E-R diagrams in the LLDs).

During the data model study, a test engineer follows the below approach.

Step 1:  Collect the data model of the responsible modules.
Step 2:  Study every input attribute in terms of size, type and constraints.
Step 3:  Identify the critical attributes in that list, which participate in manipulations and retrievals.
Step 4:  Identify the non-critical attributes, which are just input/output type attributes.

Example: in a bank account form, attributes such as A/C No, A/C Name, Balance and A/C Orders are classified into critical and non-critical attributes.

Step 5:  Prepare BVA and ECP for every input object.

I/P Attribute | ECP Valid | ECP Invalid | BVA Min | BVA Max
--------------|-----------|-------------|---------|--------

Note: In general, test engineers prepare step-by-step procedure based test cases for functionality testing, and valid/invalid data-matrix based test cases for input domain testing.
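
As a rough illustration of how such a data matrix could drive checks (a sketch with hypothetical helper names, not from the original document), one might encode the ECP classes and BVA limits per attribute and classify candidate values:

```python
import string

# Hypothetical data matrix for one attribute: ECP classes plus BVA size limits.
data_matrix = {
    "Customer Name": {
        "valid_chars": set(string.ascii_lowercase),  # ECP valid: a to z
        "min_size": 1,                               # BVA min
        "max_size": 256,                             # BVA max
    },
}

def classify(attribute, value):
    """Classify a candidate value as Valid/Invalid against the data matrix."""
    spec = data_matrix[attribute]
    if not (spec["min_size"] <= len(value) <= spec["max_size"]):
        return "Invalid (size)"
    if not set(value) <= spec["valid_chars"]:
        return "Invalid (type)"
    return "Valid"

print(classify("Customer Name", "ramesh"))   # Valid
print(classify("Customer Name", "Ramesh1"))  # Invalid (type)
print(classify("Customer Name", ""))         # Invalid (size)
```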


Case Study:

Prepare test cases with the required documentation for the below scenario.

In a bank automation software, fixed deposit is a functionality. A bank employee operates the functionality with the below inputs.
Ø  Customer Name    → Alphabets in lower case
Ø  Amount           → Rs 1500 to 100000.00
Ø  Tenure           → Up to 12 months
Ø  Interest         → Numeric with decimal

From the functional specification (use cases): if the tenure is > 10 months, the interest must be > 10%.

Test Case 1:

Test Case ID:              TC_FD_1
Test Case Name:            Successful entry of customer name

Data Matrix:

I/P Attribute | ECP Valid | ECP Invalid                          | BVA Min (Size) | BVA Max (Size)
--------------|-----------|--------------------------------------|----------------|---------------
Customer Name | a to z    | A to Z, 0 to 9, special chars, blank | 1 character    | 256 characters

Test Case 2:

Test Case ID:              TC_FD_2
Test Case Name:            Successful entry of amount

Data Matrix:

I/P Attribute | ECP Valid | ECP Invalid                          | BVA Min (Range) | BVA Max (Range)
--------------|-----------|--------------------------------------|-----------------|----------------
Amount        | 0-9       | A to Z, a to z, special chars, blank | 1500            | 100000

Test Case 3:

Test Case ID:              TC_FD_3
Test Case Name:            Successful entry of tenure

Data Matrix:

I/P Attribute | ECP Valid | ECP Invalid                          | BVA Min (Range) | BVA Max (Range)
--------------|-----------|--------------------------------------|-----------------|----------------
Tenure        | 0-9       | A to Z, a to z, special chars, blank | 1               | 12

Test Case 4:

Test Case ID:              TC_FD_4
Test Case Name:            Successful entry of interest

Data Matrix:

I/P Attribute | ECP Valid        | ECP Invalid                          | BVA Min (Range) | BVA Max (Range)
--------------|------------------|--------------------------------------|-----------------|----------------
Interest      | 0-9 with decimal | A to Z, a to z, special chars, blank | 1               | 100

Test Case 5:

Test Case ID:              TC_FD_5
Test Case Name:            Successful fixed deposit operation

Test Procedure:

Step No | Action                     | I/P Required | Expected
--------|----------------------------|--------------|----------------------------------
1       | Login to bank software     | Valid ID     | Menu appears
2       | Select fixed deposit       | -            | FD form appears
3       | Fill all fields & click OK | All valid    | Acknowledgement from bank server
        |                            | Any invalid  | Error message from bank server

Test Case 6:

Test Case ID:              TC_FD_6
Test Case Name:            Unsuccessful fixed deposit operation due to tenure > 10 months & interest < 10%

Test Procedure:

Step No | Action                     | I/P Required                                                    | Expected
--------|----------------------------|-----------------------------------------------------------------|----------------------------------
1       | Login to bank software     | Valid ID                                                        | Menu appears
2       | Select fixed deposit       | -                                                               | FD form appears
3       | Fill all fields & click OK | Valid customer name, amount and tenure > 10 with interest > 10 | Acknowledgement from bank server
        |                            | Valid customer name, amount and tenure > 10 with interest < 10 | Error message from bank server
Test Case 7:

Test Case ID:              TC_FD_7
Test Case Name:            Unsuccessful fixed deposit operation due to not filling in all fields

Test Procedure:

Step No | Action                     | I/P Required                                                        | Expected
--------|----------------------------|---------------------------------------------------------------------|--------------------------------
1       | Login to bank software     | Valid ID                                                            | Menu appears
2       | Select fixed deposit       | -                                                                   | FD form appears
3       | Fill all fields & click OK | Valid customer name, amount, tenure, interest, but some left blank | Error message from bank server

Note:               Test cases 1 - 4 → I/P domain
                    Test cases 5 - 6 → Functionality
                    Test case 7      → Error handling
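
The business rule in test case 6 (tenure > 10 months requires interest > 10%) is easy to express directly. A minimal sketch, assuming a hypothetical validate_fd helper:

```python
def validate_fd(amount, tenure_months, interest_pct):
    """Validate a fixed deposit request against the rules in this case study."""
    if not (1500 <= amount <= 100000):
        return "Error: amount out of range"
    if not (1 <= tenure_months <= 12):
        return "Error: tenure out of range"
    # Business rule from the functional spec: tenure > 10 months needs interest > 10%.
    if tenure_months > 10 and interest_pct <= 10:
        return "Error: interest must be > 10% when tenure > 10 months"
    return "Acknowledgement"

print(validate_fd(5000, 11, 12.5))  # Acknowledgement (TC_FD_6, positive row)
print(validate_fd(5000, 11, 9.0))   # Error (TC_FD_6, negative row)
```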

3.      User Interface Based Test Case Design:

To conduct usability testing, test engineers write a list of test cases based on our organization's user interface conventions, global interface rules and the interests of customer site people.

Examples:

Test Case 1:   Spell check.
Test Case 2:   Graphics check (screen-level alignment, font, style, colour, size (object width and height) and the Microsoft 6 rules).
Test Case 3:   Meaningful error messages.
Test Case 4:   Accuracy of data displayed.

Example: an Amount field should indicate its currency (e.g., $), and a DOB field should indicate its expected format (DD/MM/YY); a screen that omits these displays data inaccurately.

Test Case 5:   Accuracy of data in the database as a result of user inputs.

Test Case 6:   Accuracy of data in the database as a result of external factors.

Example:         File attachments, one-year greetings, etc.

Test Case 7:   Meaningful help menus (manual support testing).

Review Test Cases:

After writing all possible test cases for the responsible modules, the testing team concentrates on a review of the test cases for completeness and correctness. In this review the testing team applies coverage analysis.

        BR-based coverage
        Use case based coverage
        Data model based coverage
        User interface based coverage
        Test responsibility based coverage

At the end of this review, the test lead prepares the "Requirements Traceability Matrix" (also called the "Requirements Validation Matrix").

Business Requirement | Sources (use cases, data model, etc.) | Test Cases
---------------------|---------------------------------------|-----------
(Login)              | ...                                   | ...
(Mail Open)          | ...                                   | ...
(Mail Compose)       | ...                                   | ...
(Mail Reply)         | ...                                   | ...

From the above model, the traceability matrix defines the mapping between customer requirements and the test cases prepared to validate those requirements.
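
As an illustrative sketch (the names are made up), a traceability matrix can be modeled as a mapping from requirements to test case IDs, which makes uncovered requirements easy to spot:

```python
# Hypothetical requirements traceability matrix: requirement -> test case IDs.
rtm = {
    "Login":        ["TC_LOGIN_1", "TC_LOGIN_2", "TC_LOGIN_3"],
    "Mail Open":    ["TC_OPEN_1"],
    "Mail Compose": ["TC_COMPOSE_1", "TC_COMPOSE_2"],
    "Mail Reply":   [],  # not yet covered
}

# Coverage check: every business requirement needs at least one test case.
uncovered = [req for req, cases in rtm.items() if not cases]
print("Uncovered requirements:", uncovered or "none")
```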

VI. TEST EXECUTION:

After completion of test case selection & review, the testing team concentrates on receiving the build released by development and executing tests on that build.

1.      Test Execution Levels / Phases:

- Development releases a stable build → testing runs Level 0 (Sanity / TAT / BVT), followed by test automation.
- Level 1 (Comprehensive testing): test engineers report mismatches to development as defect reports.
- Development fixes and resolves the defects and releases a modified build → testing runs Level 2 (Regression).
- At the end, testing runs Level 3 (Final Regression).
2.      Test Execution Levels Vs Test Cases:

Level 0 → P0 test cases
Level 1 → All P0, P1 and P2 test cases as batches
Level 2 → Selected P0, P1 and P2 test cases w.r.t modifications
Level 3 → Selected P0, P1 and P2 test cases w.r.t critical areas in the master build.

3.      Build Version Control:

In general, test engineers receive the build from development in the below mode:

        Build  →  Server (Soft Base)  →  FTP (File Transfer Protocol)  →  Test Environment

In this approach, test engineers download the application build from the server to their local hosts through FTP. "Soft Base" means the server's collection of software builds.

During test execution, test engineers receive modified builds from the soft base. To distinguish old builds from new builds, the development team gives each build a unique version number that is understandable to testers.

For this version controlling, developers also use version control tools (e.g., VSS - Visual SourceSafe).
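
As a minimal sketch of the FTP step (the server address, credentials and file name are placeholders, not from the original document), Python's standard ftplib could be used to pull a build into the test environment:

```python
from ftplib import FTP

# Placeholder server details; real values come from the build release note.
HOST = "builds.example.com"
USER = "tester"
PASSWORD = "secret"
BUILD_FILE = "app_build_1.2.3.zip"

ftp = FTP(HOST)                 # connect to the soft base server
ftp.login(USER, PASSWORD)
# Download the build binary from the server to the local test environment.
with open(BUILD_FILE, "wb") as f:
    ftp.retrbinary(f"RETR {BUILD_FILE}", f.write)
ftp.quit()
print(f"Downloaded {BUILD_FILE}")
```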

4.      Level 0 (Sanity / TAT / BVT):

After receiving the initial build, test engineers concentrate on the basic functionality of that build to estimate its stability for complete testing. In this sanity testing, test engineers try to execute all P0 test cases to cover the basic functionality. If functionality is not working or is missing, the testing team rejects that build. If the testers decide the build is stable, they concentrate on executing all test cases to detect defects.

During this sanity testing, test engineers observe the below factors on the build.

                        → Understandable
                        → Operable
                        → Observable
                        → Consistent
                        → Controllable
                        → Simplicity
                        → Maintainable
                        → Automatable

From the above 8 testable issues, sanity testing is also known as Testability Testing or Octangle Testing.



5.      Test Automation:

If test automation is possible, the testing team concentrates on test script creation using the corresponding testing tool. Every test script consists of navigational statements along with checkpoints. Automation at this stage is selective: all P0 and carefully selected P1 test cases are automated on the stable build.
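
A test script with navigational statements and checkpoints can be sketched in plain Python as below; the application driver and its methods are hypothetical stand-ins for a real testing tool's API:

```python
class FakeApp:
    """Stand-in for a testing tool's application driver (hypothetical API)."""
    def __init__(self):
        self.window = "login"
        self.fields = {}

    def type(self, field, value):
        self.fields[field] = value          # navigation: fill a field

    def click(self, button):
        # Simulate the application: valid credentials lead to the inbox.
        if button == "ok" and self.fields.get("uid") and self.fields.get("pwd"):
            self.window = "inbox"

def login_script(app):
    """Navigational statements followed by a checkpoint."""
    app.type("uid", "testuser")
    app.type("pwd", "secret")
    app.click("ok")
    # Checkpoint: the expected window must appear after login.
    assert app.window == "inbox", "Checkpoint failed: inbox not shown"
    print("Checkpoint passed: inbox shown")

login_script(FakeApp())
```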

6.      Level 1 (Comprehensive Testing):

After completion of sanity testing and possible test automation, the testing team concentrates on forming test batches of dependent test cases. A test batch is also known as a test suite or test set. During the execution of these test batches, the test engineer prepares a test log document; this document consists of three types of entries.

                        → Passed   : all expected values equal the actual values
                        → Failed   : any one expected value differs from the actual value
                        → Blocked  : the corresponding parent functionality failed

The batches are executed as comprehensive test cycles.
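
These three outcomes can be derived mechanically from a batch run. An illustrative sketch (the data structure is invented for the example):

```python
def log_entry(expected, actual, parent_failed=False):
    """Classify one test case result as Passed / Failed / Blocked."""
    if parent_failed:
        return "Blocked"   # parent functionality failed, case not executable
    if len(expected) == len(actual) and all(e == a for e, a in zip(expected, actual)):
        return "Passed"    # all expected == actual
    return "Failed"        # any one expected != actual

print(log_entry(["inbox appears"], ["inbox appears"]))    # Passed
print(log_entry(["inbox appears"], ["error page"]))       # Failed
print(log_entry(["mail opens"], [], parent_failed=True))  # Blocked
```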
7.      Level 2 (Regression Testing):

During comprehensive test execution, test engineers report mismatches as defects to developers. After receiving the modified build from them, test engineers concentrate on regression testing to verify the bug-fixing work and to look for side effects.

Test cases re-executed on the modified build, by resolved bug severity:

    High    : All P0, all P1, carefully selected P2
    Medium  : All P0, carefully selected P1, carefully selected P2
    Low     : Some P0, some P1, some P2
Case 1:

If the development team's resolved bug impact (severity) is high, test engineers re-execute all P0, all P1 and carefully selected P2 test cases on that modified build.

Case 2:

If the resolved bug impact (severity) is medium, test engineers re-execute all P0, carefully selected P1 and some P2 test cases on that modified build.

Case 3:

If the resolved bug impact (severity) is low, test engineers re-execute some P0, P1 and P2 test cases on that modified build.

Case 4:

If the development team released the modified build due to sudden changes in the project requirements, test engineers re-execute all P0, all P1 and carefully selected P2 test cases w.r.t. those requirement modifications.
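
The four cases reduce to a small lookup. A sketch of this selection policy (the function name is invented):

```python
def regression_selection(trigger):
    """Map the resolved-bug severity (or change trigger) to the cases to re-execute."""
    policy = {
        "high":   {"P0": "all",  "P1": "all",                "P2": "carefully selected"},
        "medium": {"P0": "all",  "P1": "carefully selected", "P2": "some"},
        "low":    {"P0": "some", "P1": "some",               "P2": "some"},
        # Sudden requirement changes are treated like a high-impact modification.
        "requirement change": {"P0": "all", "P1": "all", "P2": "carefully selected"},
    }
    return policy[trigger]

print(regression_selection("medium"))
```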

VII.    TEST REPORTING:

During comprehensive testing, test engineers report mismatches as defects to developers in the IEEE format.

1.      Defect ID             : Unique number / name.
2.      Description           : Summary of the defect.
3.      Feature               : Module / function / service in which the test engineer found this defect.
4.      Test Case Name        : The corresponding failed test condition.
5.      Reproducible          : Yes / No (Yes: the defect appears every time; No: the defect appears rarely).
6.      If Yes                : Attach the test procedure.
7.      If No                 : Attach a snapshot and strong reasons.
8.      Status                : New / Reopen (New: the defect appears for the first time; Reopen: reappearance of a defect that was once closed).
9.      Severity              : Seriousness of the defect w.r.t. functionality.
           High   → Without resolving this defect the test engineer is not able to continue testing (show stopper).
           Medium → Able to continue testing, but mandatory to resolve.
           Low    → May or may not be resolved.
10.  Priority                 : Importance of the defect w.r.t. the customer (high, medium, low).
11.  Reported By              : Name of the test engineer.
12.  Reported On              : Date of submission.
13.  Assigned To              : Name of the responsible person on the development side (PM).
14.  Build Version ID         : The version of the build in which the test engineer found this defect.
15.  Suggested Fix            : The tester tries to provide suggestions to solve this defect (optional).

Filled in by developers:

16.  Fixed By                 : PM / team lead.
17.  Resolved By              : Programmer's name.
18.  Resolved On              : Date of resolving.
19.  Resolution Type          : (See the list of resolution types below.)
20.  Approved By              : Sign of PM.

Defect Age:
The time gap between "Reported On" and "Resolved On".
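
A defect record and its age can be modeled directly; a minimal sketch (the field subset was chosen for the example):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Defect:
    defect_id: str
    description: str
    severity: str                        # High / Medium / Low
    reported_on: date
    resolved_on: Optional[date] = None

    def age_days(self):
        """Defect age: the gap between 'Reported On' and 'Resolved On'."""
        if self.resolved_on is None:
            return None                  # still open
        return (self.resolved_on - self.reported_on).days

bug = Defect("D-101", "Login fails for 16-char UID", "High",
             date(2014, 12, 1), date(2014, 12, 5))
print(bug.age_days())  # 4
```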
Defect Submission Process:

Large-Scale Organizations:

    Test Engineer → Test Lead → Test Manager → Project Manager → Team Lead → Developer

    (If a high-severity defect is rejected, it is escalated to QA.)

Medium & Small-Scale Organizations:

    Test Engineer → Test Lead → Project Manager → Team Lead → Developer

Defect reports travel between the teams as transmittal reports.

Defect Status Cycle:

    New
     ↓
    Open / Rejected / Deferred (the defect is accepted, but there is no interest in resolving it in this version)
     ↓
    Closed
     ↓
    Reopen



Defect Life Cycle / Bug Life Cycle:

    Detect Defect → Reproduce Defect → Report Defect → Fix Defect → Resolve Defect → Close Defect


Defect Resolution Type:

After receiving defect reports from testers, developers review each defect and send a resolution type to the testers as a reply.

1.      Duplicate: Rejected because this defect is the same as a previously reported defect.
2.      Enhancement: Rejected because this defect relates to a future requirement of the customer.
3.      Hardware Limitation: Rejected because this defect arises from limitations of hardware devices.
4.      Software Limitation: Rejected because this defect arises from limitations of software technologies.
5.      Not Applicable: Rejected because this defect has no proper meaning.
6.      Functions as Designed: Rejected because the coding is correct w.r.t. the design documents.
7.      Need More Information: Neither accepted nor rejected; the developer requires extra information to understand the defect.
8.      Not Reproducible: Neither accepted nor rejected; the developer requires the correct procedure to reproduce the defect.
9.      No Plan to Fix It: Neither accepted nor rejected; the team wants extra time to fix it.
10.  Fixed: The developer accepts the defect as one to be resolved.
11.  Fixed Indirectly: Accepted, but not to be resolved in this version (deferred).
12.  User Misunderstanding: Resolved through extra negotiations between the testing and development teams.


Types of Defects:

1.      User Interface Bugs: Low severity

Ex 1: Spelling mistake → high priority
Ex 2: Improper alignment → low priority

2.      Boundary Related Bugs: Medium severity

Ex 1: Does not allow a valid type → high priority
Ex 2: Allows an invalid type also → low priority

3.      Error Handling Bugs: Medium severity

Ex 1: Error message window not provided → high priority
Ex 2: Improper meaning of error messages → low priority

4.      Calculation Bugs: High severity

Ex 1: Final output is wrong → low priority
Ex 2: Dependent results are wrong → high priority

5.      Race Condition Bugs: High severity

Ex 1: Deadlock → high priority
Ex 2: Improper order of services → low priority

6.      Load Condition Bugs: High severity

Ex 1: Does not allow multiple users to operate → high priority
Ex 2: Does not allow the customer-expected load → low priority

7.      Hardware Bugs: High severity

Ex 1: Does not handle a device → high priority
Ex 2: Wrong output from a device → low priority

8.      ID Control Bugs: Medium severity

Ex: Logo missing, wrong logo, version number mistake, copyright window missing, developers' names missing, testers' names missing.

9.      Version Control Bugs: Medium severity

Ex: Unexplained differences between two consecutive build versions.

10.  Source Bugs: Medium severity

Ex: Mistakes in help documents.
 

VIII.       TEST CLOSURE:

After completion of all possible test cycle executions, the test lead conducts a review to estimate the completeness & correctness of testing. In this review the test lead considers the below factors along with the test engineers.

1.      Coverage Analysis:

                        → BR-based coverage
                        → Use case based coverage
                        → Data model based coverage
                        → UI-based coverage
                        → TRM-based coverage

2.      Bug Density:

Ex:                Module A → 20%
                   Module B → 20%
                   Module C → 40%  ← selected for final regression
                   Module D → 20%

3.      Analysis of Deferred Bugs:

Are the deferred bugs really deferrable or not?

At the end of this review, the testing team concentrates on final regression testing of the high bug density modules, if time is available.










Level 3: (Final Regression / Pre-Acceptance Testing)

IX. USER ACCEPTANCE TESTING:

After completion of the final regression cycles, our organization's management concentrates on user acceptance testing to collect feedback. There are two approaches to conduct this testing: alpha (α) testing and beta (β) testing.

X.    SIGN OFF:

After completion of user acceptance testing and the resulting modifications, the test lead concentrates on creating the final test summary report. It is a part of the software release note. This final test summary report consists of the below documents.

→ Test Strategy / Methodology (TRM)
→ System Test Plan
→ Requirements Traceability Matrix
→ Automated Test Scripts
→ Bugs Summary Report

Bug Description | Feature | Found By | Severity | Status (Closed/Deferred) | Comments
----------------|---------|----------|----------|--------------------------|---------



Auditing:

To audit the testing process, quality people use three types of measurements & metrics.

1.      QAM (Quality Assessment Measurement):

These measurements are used by quality analysts / PM during the testing process (monthly once).

Stability:

(Figure: number of defects on the y-axis versus time on the x-axis; the defect arrival rate falls as testing progresses.)

20% of testing → 80% of defects
80% of testing → 20% of defects

Sufficiency:

→ Requirements coverage
→ Type-trigger analysis

Defect Severity Distribution:

→ Organization trend limit check.

2.      TMM (Test Management Measurement):

These measurements are used by the test lead during the testing process (twice weekly).

Test Status:

→ Completed
→ In progress
→ Yet to execute

Delays in Delivery:

→ Defect arrival rate
→ Defect resolution rate
→ Defect age

Test Efficiency:

→ Cost to find a defect (number of defects / person-day)
3.      PCM (Process Capability Measurement):

These measurements are used by project management to improve the capability of the testing process, based on customer feedback on existing software under maintenance.

Test Effectiveness:

→ Requirements coverage
→ Type-trigger analysis

Defect Escapes (missed defects):

→ Type-phase analysis

Test Efficiency:

→ Cost to find a defect (number of defects / person-day)
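
As a small illustration of the TMM-style numbers above (the formulas are assumed from their names, and the log data is invented), defect arrival rate, test efficiency and defect age can be computed from a defect log:

```python
from datetime import date

# Hypothetical defect log: (reported_on, resolved_on or None).
defects = [
    (date(2014, 12, 1), date(2014, 12, 4)),
    (date(2014, 12, 2), date(2014, 12, 3)),
    (date(2014, 12, 10), None),  # still open
]

testing_days = 10
person_days = 20

arrival_rate = len(defects) / testing_days          # defects found per testing day
efficiency = len(defects) / person_days             # defects per person-day
ages = [(res - rep).days for rep, res in defects if res]  # ages of resolved defects

print(f"Defect arrival rate: {arrival_rate:.2f}/day")
print(f"Test efficiency: {efficiency:.2f} defects/person-day")
print(f"Average defect age: {sum(ages) / len(ages):.1f} days")
```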



