Software Test Plan
            | Name               | Role            | Date
Prepared By | AKHIL REDDY        | Test Lead       | 16/07/08
Reviewed By | P Rama Krishna     | Test Manager    | 21/07/08
Approved By | SUMAN CHAKRAVARTHY | Project Manager | 25/07/08
A-Added, M-Modified, D-Deleted

Sno | Date | Version No | Page No | Change Mode (A/M/D) | Brief Description of Change
Table of Contents
1. Introduction
1.1 Overview
1.2 Audience of this document
1.3 References for this document
2. Scope
2.1 Product Scope
2.2 Out of Scope
3. QA Release Team
4. Features to be tested/not to be tested
4.1 Features to be tested
4.2 Features not to be tested
5. Approach
5.1 Gather Test Requirements
5.2 Test Planning and Scripting Test Cases
5.3 Test Execution
6. Defect Tracking
7. Schedule
8. Entry/Exit criteria
8.1 Gather Test Requirements
8.2 Test Planning and Test Case Scripting
8.3 Test Execution
9. Item pass/fail criteria
10. Responsibilities
10.1 Development Team (IBEE Solutions (P) Ltd)
10.2 Test Lead
10.3 Team Members
11. Suspension criteria and resumption requirements
12. Test deliverables
13. Application Environment
14. Contingencies/ Issues & Risks
1. Introduction
The Software Test Plan (STP) is designed to prescribe the scope, approach, resources, and schedule of all testing activities. The plan must identify the items to be tested, the features not to be tested, the types of testing to be performed, the personnel responsible for testing, the resources and schedule required to complete testing, and the risks associated with the plan.
1.1 Overview
This document outlines the overall test strategy for the testing to be performed on the IBEE eCommerce Portal.
This plan supports the following objectives:
· To identify the release content and briefly describe all supported features.
· To describe the activities required to prepare for and conduct testing.
· To present the testing team's test strategy and schedule to the related departments.
· To describe the approach and level of testing for the IBEE eCommerce Portal.
· To identify assumptions, risks and contingencies that could impact the test schedule.
· To identify the entry and exit criteria of the testing phases.
· To identify the equipment requirements and associated software and hardware test tools.
1.2 Audience of this document
This document is intended for the Nrstt (P) Ltd testing team and its Project Manager, the IBEE Solutions development team and its Project Manager, and the Next box testing team (the UAT team of IBEE).
1.3 References for this document
(List of all documents that support this test plan.)
§ IBEE Solutions Project Plan
§ Requirements specifications and Use Cases
§ Prototype Screens
§ Development and Test process standards
§ Corporate standards and guidelines
2. Scope
Testing will be performed at several points in the life cycle as the product is constructed. Testing is a very “dependent” activity; as a result, test planning is a continuing activity performed throughout the system development life cycle. Test plans must be developed for each level of testing.
The scope of this Test Plan document is the entire System Testing of the IBEE eCom Portal, 1st Version. As the IBEE eCom Portal is a product, testing is not inherently limited; however, in the 1st Version, testing is limited to the following scenarios only.
2.1 Product Scope
· Front End (Customer Storefront)
o Browsing Products Info
o Searching Products
o Registering with the site
o Buying Products through different payment options
o Submitting feedback
o Participating in online polls
o Reading the news
· Back End (Administrative Resources)
o Storing Registered Info
o Defining Categories and Products
o Configuring Categories and Products details
o Updating details of Categories and Products
o Defining the Polls
o Providing News
o Importing & Exporting Products Details
o Getting various Reports
o Defining Discounts
o Getting Status
2.2 Out of Scope
o Products Comparison
o Choosing Currency
o Payment gateways
o Inventory
o Shipping Management
o Financial Accounts
o Taxes
o External Systems Integration, etc.
3. QA Release Team
The table below shows the IBEE eCom Portal release team:

Name | Role | Responsibilities
P Rama Krishna | Test Manager | Responsible for the release; reviews the QA progress
Koteswar | Development Site Coordinator | Coordination with the development team and ongoing quality assurance of release deliverables
Narayana Murthy | Test Lead | Track status and schedule of the overall project activities
Sushant | Module Lead | Responsible for overall testing activities in the Admin module; track status and schedule of the Admin module
Himabindu | Module Lead | Responsible for overall testing activities in the User module; track status and schedule of the User module
4. Features to be tested/not to be tested
4.1 Features to be tested
Front end (customer storefront)
- Products Catalog
- Customer’s registration
- Customer account
- Products Search
- Advanced Search
- Price list
- News
- Feedback
- Shopping cart
- Checking out
- Polls
Back end (administrative resources)
- Login
- Managing products catalog
- Adding new categories/subcategories
- Viewing/Editing/deleting existing categories
- Adding new products
- Viewing/Editing/deleting existing product entry
- Table of products
- Importing products
- Exporting products
- Special offers
- Defining Polls
- Adding news
- Reports
4.2 Features not to be tested
None.
5. Approach
5.1 Gather Test Requirements
The major tasks for the test engineers during the planning phase are the following:
· Analyse the Requirements Specification and design documents
· Prepare Review Reports for the respective documents or work packages
· Schedule the test environment
· Configure the Defect Tracking System (Bugzilla); a configuration sketch follows this list
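As an illustration of the Bugzilla configuration task above, the sketch below captures, as plain Python data, the product, component and priority classification the team might set up before execution begins. All names and values are assumptions for illustration only; they are not taken from an actual Bugzilla installation.

# Illustrative sketch only: the product/component split mirrors the module
# split described in this plan (Front End / Back End); none of these values
# come from a real Bugzilla instance.
BUGZILLA_CLASSIFICATION = {
    "product": "IBEE eCom Portal",
    "version": "1.0",
    "components": [
        "Front End (Customer Storefront)",      # User module
        "Back End (Administrative Resources)",  # Admin module
    ],
    "default_priority": "P3 (Medium)",
    "priorities": ["P1 (Showstopper)", "P2 (Severe)", "P3 (Medium)", "P4 (Minor)"],
}

if __name__ == "__main__":
    for component in BUGZILLA_CLASSIFICATION["components"]:
        print(f"Component to be created in Bugzilla: {component}")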
5.2 Test Planning and Scripting Test Cases
QA engineers are expected to participate in walkthroughs provided by the IBEE QA team and seek clarifications before proceeding to document the test plan and test cases.
The objective of this phase is to prepare the release for formal testing:
· Review the use cases and design documents, which outline the detailed requirements to be implemented
· Develop the test plan, test scenarios and test cases to verify that the services work as specified in the use cases (an illustrative test case sketch follows this list)
· Detail the test scenarios from the use cases
· Conduct a review of the test scenarios and test cases
· Set up the environment
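The sketch below shows one way a scripted test case for the customer registration scenario could be captured. The field names, steps and scenario details are illustrative assumptions; they are not taken from the actual use cases or from the team's TCD template.

# A minimal sketch of a scripted test case record; illustrative only.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str                  # unique identifier used in the TCD
    use_case: str                 # use case the test traces back to
    description: str
    steps: list = field(default_factory=list)
    expected_result: str = ""
    priority: str = "P3 (Medium)" # defaults to the plan's medium priority

# Hypothetical example for the Front End registration feature.
TC_REG_001 = TestCase(
    case_id="TC_REG_001",
    use_case="Customer registration (Front End)",
    description="Register a new customer with valid mandatory details",
    steps=[
        "Open the customer storefront home page",
        "Navigate to the registration page",
        "Enter valid values for all mandatory fields and submit",
    ],
    expected_result="The customer account is created and a confirmation is shown",
)

if __name__ == "__main__":
    print(TC_REG_001)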
5.3 Test Execution
During this phase, test cases are selected for execution and gaps between the requirements and the tests are identified.
The test execution phase includes system testing of the different business scenarios as per the identified drop plan for all the services in the IBEE eCom Portal.
Our goal is to execute 100 percent of the test cases during each individual drop. The testing team will also provide weekly status reports as requested, as well as bug reports, during this phase.
In order for the testing team to achieve this goal within the estimated schedule for the test execution phase, the application has to meet certain criteria, such as release code being complete on the dates specified in the schedule and patch releases being provided with an impact analysis.
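As a small illustration of how the per-drop execution figure and the weekly status numbers could be derived, the sketch below computes execution and pass percentages from raw counts. The counts shown are placeholders, not results from any actual build.

# Placeholder figures only; not results from any actual build of the portal.
def execution_status(executed: int, total: int, passed: int) -> dict:
    """Return per-drop execution and pass percentages plus the remaining backlog."""
    return {
        "executed_pct": round(100.0 * executed / total, 1) if total else 0.0,
        "pass_pct": round(100.0 * passed / executed, 1) if executed else 0.0,
        "not_yet_executed": total - executed,
    }

if __name__ == "__main__":
    # Hypothetical counts for one drop; the goal in this plan is 100 percent execution.
    print(execution_status(executed=180, total=200, passed=165))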
6. Defect Tracking
Test logs will be created for each test design document and will record the test results during test execution, together with the PRs and issues raised.
Any faults found during testing will be raised as Problem Reports (PRs).
Problem report priorities are defined as follows:
Priority 1 (Showstopper): Unable to continue with testing.
Priority 2 (Severe): Critical to the execution of a particular function, but testing can continue by bypassing that particular function.
Priority 3 (Medium): Problem with standing data as opposed to execution of a function; workarounds are possible.
Priority 4 (Minor): Cosmetic issues.
Defects raised by the team members will be assigned to the concerned person. The assignee will fix the defect, update the defect status and give an analysis for the defect raised.
The offshore testing team will retest the raised defect and close it.
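The sketch below expresses the priority scheme above as a simple lookup that could be used, for example, to validate the priority field before a PR is filed in Bugzilla. The structure is purely illustrative; it is not Bugzilla's own schema or API.

# Illustrative lookup of the PR priority scheme defined in this section.
PR_PRIORITIES = {
    "P1": "Showstopper - unable to continue with testing",
    "P2": "Severe - critical to a particular function, but testing can continue by bypassing it",
    "P3": "Medium - problem with standing data rather than function execution; workarounds possible",
    "P4": "Minor - cosmetic issues",
}

def validate_priority(priority: str) -> str:
    """Return the description for a known priority, or raise ValueError."""
    if priority not in PR_PRIORITIES:
        raise ValueError(f"Unknown PR priority: {priority}")
    return PR_PRIORITIES[priority]

if __name__ == "__main__":
    print(validate_priority("P2"))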
7. Schedule
Sno | Task | Schedule (in Days) | From Date | To Date
1 | Requirements Understanding & Review and RR preparation | 5 days | 22-Aug-08 | 27-Aug-08
2 | Test Scenarios Preparation | 3 days | 28-Aug-08 | 30-Aug-08
3 | TCD Preparation | 5 days | 01-Sept-08 | 06-Sept-08
4 | Peer Reviews | 1 day | 08-Sept-08 | 08-Sept-08
5 | Refinement and Baseline of TCD | 1 day | 09-Sept-08 | 09-Sept-08
6 | Project Meeting | 1 day | 10-Sept-08 | 10-Sept-08
7 | Build#1 Execution & Bug Reporting | 3 days | 11-Sept-08 | 13-Sept-08
8 | Refinement and Baseline of TCD | 2 days | 15-Sept-08 | 16-Sept-08
9 | (Patch) Build#2 Execution & Regression Testing | 2 days | 17-Sept-08 | 18-Sept-08
10 | Project Meeting | 1 day | 19-Sept-08 | 19-Sept-08
13 | Automation with QTP | 15 days | 20-Sept-08 | 07-Oct-08
14 | Project Meeting | 1 day | 08-Oct-08 | 08-Oct-08
8. Entry/Exit criteria
8.1 Gather Test Requirements
8.1.1 Entry Criteria
· The requirement documents, design documents and use cases are received by the testing team well in advance, at least 2 weeks before the start date of system testing.
· Walkthrough sessions are conducted for the team members for the different documents.
· Basic training is given to the testing team for the different kinds of services.
8.1.2 Exit Criteria
· Query registers prepared for all the documents
· All the queries raised by the team members are closed
8.2 Test Planning and Test Case Scripting
8.2.1 Entry Criteria
· All the release dates and test deliverables have been formally agreed
· Resources identified for all the testing activities
· Team members have received sufficient training on the functionality of the eCommerce (Business to Customer) Portal
· Service Level Agreements signed
· All the risks that can delay the testing activity have been identified
8.2.2 Exit Criteria
· Test cases reviewed by the Offshore (Testing) Team Lead and the Solution Designer
· The Test Plan is signed off by the Development Team (Client)
· The identified test scenarios and test cases get a sign-off from the Development Company (Client)
8.3 Test Execution
8.3.1 Entry criteria
· Unit testing done by the development team
· Sign-off for the test plan and test cases, documented by the testing team
· Testing environment readiness and availability for test case execution
8.3.2 Exit Criteria
· All test cases have been executed
· All test-blocking bugs (P1) have been fixed
· Defects filed
9. Item pass/fail criteria
A summary of the test coverage, outstanding faults and the restrictions to testing will be produced at the end of the system test window.
10. Responsibilities
10.1 Development Team (IBEE Solutions (P) Ltd)
· The queries raised on the documents by the testing team will be resolved within the stipulated time
· The defects assigned to the development team members will be given a root cause analysis after fixing
· The defect status is to be updated in the defect tracking tool ‘Bugzilla’ as soon as an action is taken by the development team
· The logs generated in Nrstt will be stored on a day-to-day basis with a date stamp against them
· The planned downtime of the testing environment will be communicated in advance to the testing team
· Any design change will be formally communicated to the testing team
· The turnaround time for queries raised during execution will be minimized to the maximum possible extent
· All the patch releases will be given an Impact Analysis
10.2 Test Lead
· Continuously track the efforts of all the team members
· Any deviations from the planned schedule to be escalated immediately as per the identified escalation hierarchy
· The Test Plan to be prepared taking the timelines into consideration
· Plan the efforts required for the testing
· Allocate work to the team members
· Co-ordinate with the development team to get resolutions for the queries raised by the team members
· Conduct team meetings for knowledge transfer
· Identify the necessity for different trainings
10.3 Team Members
· All the activities assigned by the team lead to be done in the stipulated time
· Test scenarios and test cases to be scripted as per the identified standards
· All the logs to be maintained as per the identified standards
· Issues to be escalated as per the identified hierarchy
11. Suspension criteria and resumption requirements
Testing may be suspended under the following circumstances:
· The delivered software is not functioning well enough for the tests to be meaningful.
· A fault is found in the software that blocks further test scenarios.
· A fault is found in the software that affects further tests or renders them obsolete.
· Infrastructure problems make testing invalid.
· Testing environment non-availability or environment downtime.
The following activities must be carried out on resumption of testing:
Checks must be carried out on any corrections that have been made, to ensure that the corrective action does not affect other parts of the system.
12. Test deliverables
The following deliverables will be produced during the test cycle:
1) SRSS Test Plan
2) Test Design Documents (Test Outlines, Test Cases and Test Data)
3) Test Logs
4) Defect Reports
5) Automated Tests (Functional & Performance)
6) Test Metrics
7) Test Summary Report
13. Application Environment
13.1) Development Environment
Technologies:
- HTML for web page design
- HTML with JavaScript for client-side validations
- PHP for server-side scripting
- Adobe Dreamweaver for debugging
- Apache Web Server
- MySQL as the database server (database design & data storage)
- Windows 2003 Server as the operating system
13.2) Testing Environment
13.2.1 Server side:
- Operating systems: Windows 2003 Server (web server), Linux Server (database server & Bugzilla) and Microsoft Windows 2003 Server (QC and VSS)
- IBEE eCom Portal, Apache web server, PHP suite (runtime)
- VSS (Server Edition) and MS Office
- QTP, LoadRunner
- Quality Center
- Application documents, testing documents & testing templates
- Kaspersky Antivirus software
13.2.2 Client Side:
- Operating system: Windows 2000 Professional + SP4
- VSS (Client Edition) (configuration management) and MS Office (documentation)
- QTP, LoadRunner (Test Automation)
- Quality Center Remote Agent (Test Management)
- Mozilla Firefox Browser (Browser Compatibility)
- Kaspersky Antivirus Software
13.2.3 Hardware:
13.2.3.1 Workstations
- IBM Compatible PCs
- Processor 1.2GHz or Higher
- RAM 512MB or Higher
- Hard disk 40GB or Higher
13.2.4 Others:
§ Intranet (LAN) web environment
§ List of users
§ List of URLs
§ Access to the IBEE Solutions server
§ Access to the web server
§ Access to the database server
§ Access to Bugzilla (a reachability-check sketch for these access points follows this list)
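The sketch below is a minimal readiness check for the access points listed above, attempting a TCP connection to each server before execution starts. The host names and ports are placeholder assumptions; the real values would come from the environment hand-over.

import socket

# Placeholder hosts and ports; the real servers and URLs come from the
# environment hand-over, not from this sketch.
ACCESS_POINTS = {
    "Web server (Apache)": ("webserver.example.local", 80),
    "Database server (MySQL)": ("dbserver.example.local", 3306),
    "Bugzilla": ("bugzilla.example.local", 80),
}

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, (host, port) in ACCESS_POINTS.items():
        state = "reachable" if is_reachable(host, port) else "NOT reachable"
        print(f"{name}: {state}")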
13.3) Production Environment
13.3.1 Client Side:
- Operating System: Win98 / Windows 2000 Professional / XP / Windows 2000 Server / Windows 2003 Server and higher
- Browser: Internet Explorer / Mozilla Firefox / Opera / AOL / Google Chrome, etc.
13.3.2 Server Side:
- IBEE eCom Portal, Apache web server, PHP suite (runtime) including MySQL
14. Contingencies/ Issues & Risks
No | Contingency | Mitigation Plan
1 | Downtime and non-availability of the testing environment | Any planned downtime will be communicated beforehand to the testing team; restricted access to the server to be planned, with the resources working on the server assigned and authorized; server restarts will be done in a standardized manner only after informing all the dependent teams; server restarts to be done only after receiving requests from authorized personnel
2 | Delay in executing the test cases due to connectivity problems | Take the help of the Development Site Coordinator, in discussion with the SRSS Test Manager, and decide the new timelines accordingly
3 | Delay in executing the test cases due to design change | Any design change that the application undergoes needs to be formally communicated to the Nrstt testing team
4 | Late delivery of any test item or any part of the test environment will impact the time available for testing | Managed by reducing the range of testing, in agreement with the Test Manager, Release Manager & Project Manager
5 | The quality of a delivered item may be such as to render it impossible to test | The item would have to be rejected by the testing team, and the testing schedule and sign-off date recalculated after the correction and re-delivery of the item
6 | Availability of interfaces | Managed by pursuing the appropriate support teams
< End of the document >