Testing Service

Delivering software solutions that are business-relevant and reliable is vital to any company as it scales up. The cost of correcting errors detected at a later stage of the lifecycle increases exponentially.

INBISS, through its Verification and Validation Services, builds specialized testing and validation skills that it offers as a service to its customers.

All of INBISS’s service offerings in testing are geared to help customers gain an edge in time-to-market, quality, productivity, and cost-effective rollout.

Testing and Verification Methodology

The processes by which INBISS engages in testing for companies are varied and focused on the particular needs of the customer. The steps below describe INBISS’s testing methodology.

For each level of testing, as appropriate, the following activities are performed. The sections below describe the inputs, outputs, and processes executed at each step of the testing methodology.

Step 1 - Create Test Strategy:

Inputs for this process:

  • Functional and technical requirements of the application (Requirements, Change Requests, Technical and Functional Design Documents).
  • System limitations, i.e., requirements that the system cannot provide.

Outputs of this process:

  • An approved and signed-off test Statement of Work that describes the testing strategy, test plan, test scenarios, test conditions, test cases, and deliverables.
  • Required hardware and software components, including test tools (Test Environment, Test Tool Data), and the roles and responsibilities of the resources.

Process:

  • A test strategy is developed for all levels of testing, as required (a minimal sketch appears after this list).
  • The INBISS Test Team analyzes the requirements, writes the test strategy, and reviews the plan with the Quality Assurance Manager and the project team.
  • The test plan describes the overall plan for the test cycles, with the related test scenarios in a particular test environment.
  • Test scenarios include test conditions and cases, the testing environment, a list of testing-related tasks, pass/fail criteria, and a testing risk assessment.
  • The test schedule identifies all tasks required for a successful testing effort, a schedule of activities, and resource requirements.
  • A feasibility study is carried out with the automated testing tool for that specific application.
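
The source does not prescribe a format for these deliverables; purely as an illustration, the sketch below shows how a test strategy might be captured as structured data in Python. All field names and values are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class TestStrategy:
        """Hypothetical container for the Step 1 deliverables described above."""
        levels: list           # levels of testing covered, e.g. unit, system, acceptance
        environment: str       # required test environment
        tools: list            # automated test tools found feasible for the application
        roles: dict            # resource -> responsibility
        pass_fail_criteria: str

    # Example: a strategy drafted after analyzing the requirements.
    strategy = TestStrategy(
        levels=["unit", "integration", "system", "acceptance"],
        environment="QA",
        tools=["regression-suite"],  # placeholder tool name
        roles={"QA Manager": "review and sign-off", "Test Team": "author scenarios"},
        pass_fail_criteria="all high-severity defects resolved",
    )
    print(strategy)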

Step 2 - Create Test Scenarios:

Inputs for this process:

  • Automated testware and previously developed scripts, if applicable (Test Tools).
  • Test document problems uncovered as a result of testing (Test Document Problems).
  • Understanding of software complexity and module path coverage derived from General and Detailed Design documents (Software Design, Code, and Complexity Data).

Outputs of this process:

  • Problems with the design to be fed back to the developers (Software Design, Code Issues).
  • Approved test scenarios, conditions, and scripts (Test Design, Cases, Scripts), along with test data.

Process:

  • Test scenarios and cases are prepared by reviewing the functional requirements of the release and preparing logical groups of business functions that can be further broken down into test scripts.
  • Tests define test conditions, data to be used for testing, and expected results (database updates, file outputs, report results, etc.).
  • Test scenarios are designed to represent both typical and unusual situations that may occur in the application.
  • The Test Team develops test scenarios/cases for GUI and functional testing with assistance from developers and clients.
  • The client develops acceptance test cases with help from the project and Test Team.
  • Test scenarios are executed through the use of test scripts. Scripts define a series of steps necessary to perform one or more test scenarios.
  • A test script usually represents a transaction or process that can occur during normal system operations.
  • Test scripts include the specific data that is used for testing the process or transaction.
  • Test scripts cover multiple test scenarios and include run / execution / cycle information.
  • Test scripts are mapped back to the requirements via traceability matrices to ensure each test is within scope.
  • Test data is captured and base-lined prior to testing. This data serves as the foundation for system testing and is used to exercise system functionality in a controlled environment (see the sketch after this list).
  • Some output data is also base-lined for future comparisons.
  • Base-lined data is used to support future application maintenance via regression testing.
  • A pre-test meeting is held to assess the “readiness” of the application, and the environment and data to be tested.
  • A test readiness document is created to indicate the status of the entrance criteria of the release.
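
As an illustration of how a test script ties scenarios, test data, base-lined expected results, and traceability together, here is a minimal Python sketch. The transaction under test, the scenarios, the requirement IDs, and the baseline values are all hypothetical.

    def apply_discount(order_total, discount_pct):
        """Hypothetical transaction under test."""
        return round(order_total * (1 - discount_pct / 100), 2)

    # Base-lined expected results, captured prior to testing (illustrative values).
    BASELINE = {"typical": 90.0, "zero_discount": 100.0, "full_discount": 0.0}

    # Traceability: each scenario maps back to a requirement (IDs hypothetical).
    TRACEABILITY = {"typical": "REQ-101", "zero_discount": "REQ-102", "full_discount": "REQ-103"}

    def run_script():
        """One test script covering several scenarios; results are compared to the baseline."""
        cases = {  # test conditions and data for typical and unusual situations
            "typical": (100.0, 10),
            "zero_discount": (100.0, 0),
            "full_discount": (100.0, 100),
        }
        for scenario, (total, pct) in cases.items():
            actual = apply_discount(total, pct)
            status = "PASS" if actual == BASELINE[scenario] else "FAIL"
            print(f"{TRACEABILITY[scenario]} {scenario}: "
                  f"expected={BASELINE[scenario]} actual={actual} {status}")

    run_script()

Because the expected results are base-lined rather than hard-coded per run, the same script can be replayed for regression testing after each fix, and the traceability mapping confirms every test stays within scope.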

Step 3 - Execute Test Cases:

Inputs for this process:

  • Approved test documents (Test Plan, Cases, and Procedures).
  • Automated testware and developed scripts, if applicable (Test Tools).
  • Changes to the design (Change Requests).
  • Test data.
  • Availability of the test and project teams (Project Staff, Test Team).
  • General and Detailed Design Documents (Requirements, Software Design).
  • A complete development environment that has been migrated to the test environment via the Configuration/Build Manager.
  • Test Readiness Document.
  • Updated documents.

Outputs of this process:

  • Changes to the code (Test Fixes).
  • Test document problems uncovered as a result of testing (Test Document Problems).
  • Problems with the design fed back to the developers and clients (Requirements, Design, Code Issues).
  • Formal record of test incidents (Problem Tracking - PT).
  • Base-lined package ready for migration to the next level (Tested Source and Object Code).
  • Log and summary of the test results (Test Report).
  • Approved and signed-off revised testing deliverables (Updated Deliverables).

Process:

  • Checkpoint meetings are held throughout the execution phase, daily if required, to address and discuss testing issues, status, and activities.
  • Tests are executed methodically against the test documents. As each package of test procedures is performed, an entry is recorded in a test execution log noting the execution of the procedure and whether it uncovered any defects. The output from the execution of test procedures is referred to as test results (a sketch of such a log appears after this list).
  • The appropriate project members evaluate test results, applicable to the level of test, to determine whether the expected results were obtained. All discrepancies/anomalies are logged, discussed with the Software Development Manager/Programmer, and documented for further investigation and resolution. (Each client may have a different process for logging and reporting bugs/defects uncovered during testing; verify the process with the Configuration Management (CM) group.)
  • Pass/Fail criteria are used to determine the severity of the problem, and results are recorded in a test summary report.
  • The severity of a problem found during system testing is defined in accordance with the customer’s risk assessment and recorded in the tracking tool suggested or selected by the customer or INBISS.
  • Proposed fixes are delivered to the testing environment based on the severity of the problem. Fixes are regression tested, and verified fixes are migrated to the new baseline. Following the completion of the test, members of the Test Team prepare a summary report, which the Project Manager, clients, Software Quality Assurance (SQA), and/or Test Team Lead review.
  • After a particular level of testing has been certified, the Configuration Manager coordinates the migration of the release software components to the next test level as documented in the Configuration Management Plan. The software will only be migrated to the production environment after the client’s formal acceptance.
  • The Test Team reviews the test document problems identified during testing and updates documents where appropriate. Some problems may be the result of inconsistencies or modifications between the Technical and Functional Design documents.
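
To make the execution log and summary report concrete, the sketch below records hypothetical test results and tallies failures by severity. The log fields and severity levels are illustrative assumptions, not a prescribed INBISS format.

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class LogEntry:
        """One entry in the test execution log (fields are illustrative)."""
        procedure: str
        passed: bool
        severity: str = "none"  # assigned per the customer's risk assessment

    def summarize(log):
        """Produce the summary report reviewed by the PM, client, SQA, and Test Team Lead."""
        failures = Counter(entry.severity for entry in log if not entry.passed)
        print(f"procedures executed: {len(log)}, failed: {sum(failures.values())}")
        for severity, count in failures.items():
            print(f"  severity {severity}: {count}")

    log = [
        LogEntry("login transaction", passed=True),
        LogEntry("order entry", passed=False, severity="high"),
        LogEntry("report generation", passed=False, severity="low"),
    ]
    summarize(log)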

Tools Used for Testing

INBISS has expertise in using various testing tools in different types of projects. Some of them are:

  • Regression test tools - Rational Robot, SilkTest, WinRunner, QTP
  • Performance testing tools - Rational PerformanceStudio, LoadRunner, WebLOAD
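
These are commercial tools; purely to illustrate the kind of measurement a performance testing tool automates, the sketch below times repeated requests using only the Python standard library. The endpoint is hypothetical.

    import time
    import urllib.request

    def measure_latency(url, runs=10):
        """Time repeated GET requests and report min/avg/max latency in milliseconds."""
        samples = []
        for _ in range(runs):
            start = time.perf_counter()
            urllib.request.urlopen(url).read()
            samples.append((time.perf_counter() - start) * 1000)
        print(f"min={min(samples):.1f} ms  "
              f"avg={sum(samples) / len(samples):.1f} ms  "
              f"max={max(samples):.1f} ms")

    # Hypothetical endpoint under test; uncomment to run against a real URL:
    # measure_latency("http://localhost:8080/health")

In practice, INBISS selects from the commercial tools above based on the feasibility study carried out in Step 1.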