UAT Test Planning

Test planning for MuktaSoft V1.0, HUDD

1.0 Objective

The purpose of this document is to outline the User Acceptance Testing (UAT) process for the MUKTA application developed on the DIGIT-Works platform for the implementation of the MUKTA scheme notified by the Government of Odisha. This will enable the ULB and CBO users to validate that the MuktaSoft application meets the agreed-upon acceptance criteria for various functionalities.

It will ensure that the system satisfies the needs of the business as specified in the functional requirements (PRD/user stories) and provides confidence in its use. It will also help to identify defects with associated risks, suggest enhancements (change requests), communicate all known defects/enhancements to the project team, and ensure that all issues are addressed in an appropriate manner prior to go-live.

2.0 UAT Methodology

During the testing process, a pre-approved list of test cases/scripts will be executed to verify the behaviour and performance of various functionalities. The observations from the testing will be noted and classified as defects, enhancements, or clarifications. Some of the enhancements may be addressed through configuration changes, while others (change requests) may have to go through a change management process.

UAT observation classification:

The UAT team will execute all the test scripts referenced in section 6.0. Users may also perform additional tests that are not detailed in the plan but remain relevant and within the scope of the project.

Users will report feedback to the eGov team for documentation and escalation using a Google Sheet. Defects will be described, prioritised, and tracked using screen captures, descriptions, and the steps needed for the development team to reproduce them. For change requests, the requirements will be described, analysed, prioritised, and tracked. Information on defect and CR processing can be found in section 7.0.

2.1 Test Phases

The various phases of UAT are illustrated in the diagram below:

2.2 Scope for UAT

UAT is conducted by distributing all test case scenarios to various roles. The UAT session will start with a product demo-cum-brief training for all the users.

  1. MuktaSoft web application for employees.

  2. MuktaSoft web (Mobile First) and mobile (Android) application for CBOs.

  3. Well-defined functional scope.

  4. Well-defined role-wise functional test case scenarios covering end-to-end flow.

  5. Well-defined system configuration and master data.

  6. Identified application users from ULBs and CBOs.

  7. Performing the test with the real-world data from ULB’s files.

  8. Capturing test case results and observations.

  9. Resolution of issues reported as agreed upon.

Out of scope for UAT

For the below-mentioned functionalities, separate demos and/or training sessions will be conducted for the targeted user groups.

  1. Helpdesk for complaints and user management

    • User management module to add/update/ activate/deactivate users and role mapping

    • Handling complaints via inbox functionality

  2. HUDD and ULB dashboards and reports.

  3. Project closure, work order revision, and bill PDFs.

  4. JIT-FS and Aadhaar integrations.

During UAT, the team will walk through the end-to-end flow and features of the application to verify:

  • End-to-end business flow for each of the identified user flows

  • All the required fields are present

  • The system accepts only valid values

  • Users can save after filling in the mandatory fields in a form

  • Correctness of labels and their translations

2.3 Prerequisites for UAT

Following is the list of prerequisites for conducting the UAT:

  1. The MuktaSoft mobile app for UAT deployed in the UAT environment

  2. Installation of the MuktaSoft app on CBO user mobiles (Android 10 and Above)

  3. Configuration of MuktaSoft mobile app in UAT environment with master data

  4. Readiness of handouts

    • Test case document

    • Defect/change request reporting template

  5. Availability of teams from HUDD, SUDA, UMC and CBOs for user acceptance testing

  6. Nomination of participants for the UAT session so that test accounts can be created by eGov.

  7. Configuration of ticket tracking tool for UAT (JIRA)

3.0 UAT Environment

The UAT environment is similar to the production environment in terms of specifications and is provided by eGov, so that accurate assumptions can be drawn regarding the application’s performance.

Applicable IP addresses and URLs will be provided by the eGov team to the UAT team, and all mobile devices will be configured for access to the test environment.

3.1 UAT Process

Each test participant will be provided with a checklist to verify access to the applications within the defined scope of testing. The tester then logs in, performs the prescribed steps, and records the outcome of each activity against the expected results. Any missing feature or bug identified while testing in the UAT environment should be reported to eGov.

3.2 UAT Data

Access to test data is a vital component in conducting a comprehensive test of the system. All UAT participants will use test accounts, along with actual project files from the ULBs, as test data. All user roles should fully emulate production in the UAT environment. Test accounts and role mapping shall be set up by eGov for the identified users. Following are sample test data for UA testing:

  1. Sample Master Data

    • Tenant configuration data

    • Location/boundary master (Ward/Locality)

    • Application configuration master data

      • Project Type

      • Target Demography

      • Unit of Measurement

      • Overheads

      • Skill Category

      • Skills

      • CBO Role

      • Deductions

      • Organization Type

      • Organization Sub Type

      • Organization Functional Category

      • Organization Class/ Rank

      • ULBs Sections/ Departments

      • Designations

    • User data for test login creation

      • Employees

      • Community-Based Organizations

    • Wage seeker’s master data

  2. Testing Data

    • MUKTA works manual files – the data in these files will be used as test data

4.0 Roles and Responsibilities

HUDD

  • Nomination of stakeholders for executing the UAT

  • Verification of product based on UAT scenarios

  • Decision regarding date and venue

  • Arrangement of logistics (venue)

  • Providing feedback after the UAT session and sign off

  • Provisioning of devices (15 laptops for master trainers and ULB employees, plus Android phones for the 8 SHG members)

eGov

  • Demo before the UAT event

  • Training before UAT initiation

  • Framing test scenarios

  • Creation of UAT Plan

  • Collection and triaging of the UAT feedback

UMC

  • Review of translated UAT feedback (as provided by HUDD Odia cell)

  • Nominate master trainers to participate in the UAT

4.1 UAT Team

The test team is composed of members who possess knowledge of the operations and processes on the ground.

  1. These team members will be able to initiate test input and review the results.

  2. They have prior experience and learnings from the on-ground implementation of MUKTA.

The eGov team will present all team members with an overview of the test process and their specific roles in UAT. The eGov team will oversee testing by assigning scripts to testers, providing general support, and serving as the primary UAT contact point throughout the test cycle.

5.0 UAT Deliverables

The following sections detail milestones crucial to the completion of the UAT phase of the project. Once all dependent milestones have been completed, HUDD will formally sign off on the system’s functionality and distribute an email to all project stakeholders.

5.1 UAT Definition

Detailed Functional Scope

The employee portal and CBO app are deployed in the UAT instance with the below-listed features:

5.2 UAT - Session structure

The UAT session will be conducted physically in Bhubaneswar, Odisha at the HUDD office. The detailed plan for the UAT session is as follows:

The CR analysis may take longer than the anticipated timelines. If the analysis extends beyond Day 2, the analysis results, acceptance, and prioritisation will be discussed in a subsequent meeting.

5.3 UAT Sign-off Checklist

  • The sign-off shall be provided at the end of UAT by HUDD via email communication or official orders/memo to eGov.

  • All UAT test cases completed: Verified that all UAT test cases, as defined in the UAT plan, have been executed and completed

  • Business requirements validated: Validated that all business requirements, as defined in the requirements documentation - all features, functions, workflows, calculations, and translations have been thoroughly tested and verified

  • Compatibility tested: Verified that the application has been tested on the specified devices, operating systems (Windows 10 Pro, Android 10 and above), and browser (Chrome, latest version), and that any compatibility issues have been addressed

  • All feedback has been identified and documented, and its priority agreed upon

  • Documentation updated: Ensured that all relevant documentation, including user manuals, has been updated post-UAT and reflects changes identified and acted on during UAT

  • UAT summary report prepared: Prepared a UAT summary report that includes the UAT results, findings, and any outstanding issues or risks identified during UAT

  • The mutually agreed defects/CR from UAT will be resolved and demonstrated to users within the agreed time frame.

  • Post resolution and demonstration of all the issues the application will be ready for deployment in the production environment

5.4 UAT Success Criteria

  • Training on the UAT environment is completed with 100% participation as per the plan. If some participants are not available, prior intimation with the reason is to be captured.

  • No P1 bugs are found during the execution of the test scenarios.

  • At least 90% of the total test cases are executed successfully, with the observed behaviour matching the expected results.

  • Post-training and UAT sessions, a quiz will be administered to check the participants' familiarity with MuktaSoft. The quiz must be completed with at least 75% correct responses.
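The success criteria above can be sketched as a simple check. This is an illustrative helper, not part of MuktaSoft: the thresholds (90% pass rate, zero P1 bugs, 75% quiz score) come from this plan, while the function and its inputs are assumptions.

```python
# Illustrative check of the section 5.4 success criteria.
# Thresholds are from the plan; the function itself is hypothetical.

def uat_successful(total_cases, passed_cases, p1_bugs, quiz_scores):
    """Return True only if every success criterion is met."""
    if total_cases == 0:
        return False
    pass_rate = passed_cases / total_cases
    # Every participant must score at least 75% on the quiz
    quiz_ok = all(score >= 0.75 for score in quiz_scores)
    return pass_rate >= 0.90 and p1_bugs == 0 and quiz_ok

print(uat_successful(100, 92, 0, [0.80, 0.95, 0.75]))  # True
print(uat_successful(100, 92, 1, [0.80, 0.95, 0.75]))  # False: a P1 bug blocks sign-off
```

Any single failed criterion (a P1 bug, a pass rate below 90%, or one participant under 75% on the quiz) is enough to withhold sign-off.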

6.0 UA Test Scenarios

Test cases provide a high-level description of the functionality to be tested. All regression and new functionality test cases are contained in the Excel spreadsheet “UA Test Cases” available at: [UAT Test Scenarios].

The team will leverage relevant QA test cases for project-specific functionality. Each test case based on new functionality will reference a specific functional requirement.

Each script contains: the test case number, test description, requirement number, action to be performed, test data to be utilized, expected results, error descriptions (if applicable), pass/fail results, date tested, and any additional comments from the UA tester.
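As a sketch, one row of such a script can be modelled as the following record. The class and field names are illustrative assumptions mirroring the fields listed above, not the actual spreadsheet template.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record mirroring the fields each UA test script captures;
# names are illustrative, not the real template's column headers.
@dataclass
class TestScriptRow:
    test_case_number: str
    description: str
    requirement_number: str
    action: str
    test_data: str
    expected_result: str
    error_description: Optional[str] = None  # filled only on failure
    passed: Optional[bool] = None            # pass/fail result
    date_tested: Optional[str] = None
    comments: str = ""                       # additional UA tester comments

row = TestScriptRow(
    test_case_number="TC-001",
    description="Create a work order as a ULB employee",
    requirement_number="REQ-12",
    action="Fill mandatory fields and submit",
    test_data="Sample ward/locality master data",
    expected_result="Work order is saved and appears in the inbox",
)
print(row.passed is None)  # True: no result recorded yet
```

Keeping the pass/fail result, tested date, and error description optional makes the distinction between "not yet executed" and "executed and failed" explicit when results are collated.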

The UA test scripts are contained within the UAT test case spreadsheet and can be accessed via hyperlinks from each individual test case.

7.0 UAT Feedback

Defects and change requests are entered and tracked in JIRA by the eGov team during the UAT process. Each entry includes detailed information about each defect/CR.

UAT Defect/CR Tracking

At the beginning of the UAT session, the eGov team provides the test team with instructions on how to effectively execute test scripts and how to identify, capture, and report defects/observations. The test team presents its findings during the UAT session.

UAT Defect Life Cycle

Defects must be clearly captured and escalated to ensure prompt resolution by development. Each defect submitted by UAT will be assigned a priority, worked on by development, resolved, and re-tested by UAT prior to closure. The following is a snapshot of the standard defect lifecycle:

eGov and HUDD together will prioritize and classify defects. Defects found in UAT can be assigned one of three (3) levels of severity:

  • P1, Work Halted – Defects that, due to the complexity of the function or the scheduled dates, put the implementation date at risk. No workaround exists.

  • P2, Work Slowed – Defects occurring in a less complex function of the application, with sufficient time to resolve before the implementation date, but which must be fixed as scheduled. A workaround has been identified and is listed in the defect.

  • P3 and lower, Work Unaffected – Defects occurring in a function that is simple to fix, or that could be excluded if not resolved by the scheduled implementation date.

Response (acknowledgement of the issue) Commitments for Defects

As a non-profit, we are unable to make any commitment on how long issues will take to fix and deploy in production. We will endeavour to resolve P1 issues as quickly as possible.

UAT Change Request Life Cycle

CRs must be clearly captured and reported to the eGov team for analysis of effort and impact. Each CR submitted will be validated and categorised for acceptance, then assigned a priority. The development team will work on it, and the change will be made available for testing. Following is a snapshot of the standard CR lifecycle:

Categorisation

eGov in consultation with HUDD will decide acceptance and categorisation of change requests. Change requests found in UAT can be assigned one of three (3) levels of category:

  • Must Have – Change requests that are needed for the success of the scheme. No workaround exists.

  • Should Have – Change requests that are required for better tracking and monitoring and will increase the ease of use of the system for users. A workaround has been identified and is listed in the CR.

  • Good to Have – Change requests that are simply for better visualisation and reporting; they could be excluded if not resolved by the scheduled implementation date.

eGov will endeavour to cover Must Have changes before distribution. Lower-priority changes will be taken through the eGov Gating process for planning subsequent releases.

8.0 References

The following are reference documents which have been leveraged for project-specific information in the design of the UAT plan:


All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.