a1qa ensures the quality of unique billing solution

The client was creating a single billing platform to replace existing systems and turned to a1qa to ensure the quality of the new billing solution and carry out successful data migration.
Migration testing
Performance testing
QA consulting
Test automation


…expresses its sincere thanks to the a1qa team for the quality assurance services provided in the course of the billing solution transformation project. Timely testing made it possible to detect and fix defects in the new solution, while test automation accelerated the validation of new releases and greatly sped up time to market.

Director of Infrastructure Projects

The client is a telecommunications company providing cellular and local telephone service, broadband Internet access, and cable television services.

The customer was creating a single billing platform to replace the standalone billing systems used by its branches and subsidiaries and turned to a1qa to ensure the quality of the new billing solution. The software product covers dozens of business processes, such as customer care, payments, and billing.

Successful data migration was another core objective of a1qa's involvement.

Services offered

Test automation
Migration testing
System testing
QA audit

Project scope

The client launched a unique large-scale project to upgrade the billing architecture into a single solution for all branches.

By the time the project started, each of the client’s eight branches had a separate billing system with its own characteristics and product lines.

As a result, customers connected to one branch could not receive full-fledged service by contacting another branch.

In order to meet the client’s business objectives, the expert team developed a turnkey QA approach.

Considering the overall complexity of the solution, a1qa specifically highlighted the significance of a safe and complete data migration process.

Therefore, rigorous attention was paid to ensuring the quality of this core aspect.

The client chose a1qa for the following reasons:

  1. A highly skilled team with extensive expertise in regression, system, integration, and E2E testing of OSS/BSS solutions.
  2. Compliance of employees’ qualifications with the client’s requirements.
  3. Specialized Center of Excellence in Telecom.
  4. Team scalability upon request.
  5. Professional QA services delivered to 5 developers of OSS/BSS solutions.
  6. Load testing expertise in Telecom.

System testing

The prime task was to check the compliance of the product with the initial requirements within the integrated environment and the configured products.

To begin with, the a1qa engineers were to build a testing methodology. Therefore, the team:

  1. Developed over 2,000 test cases
  2. Executed them in the client’s environment
  3. Analyzed test results and assessed their quality
  4. Prepared a report on test coverage and overall effectiveness
  5. Made changes to the previously created test suite based on the results of the analysis.

The test methodology was recorded in Confluence.

UAT checks were taken as the basis for testing. They were expanded with alternative ways of completing checks, negative scenarios, and tests not tied to business processes.

All tests contained the following fields:

  • Title and description
  • Preconditions for starting a test (test environment status description)
  • Steps to implement (with an explicit indication of interactive subsystems)
  • Expected result
  • ‘Sequence diagram’ revealing the interaction of the subsystems participating in the test
  • Link to documentation and business process
  • Scripts for the preparation of test data (if necessary).
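A test case following this template can be represented as a simple record. The sketch below is illustrative only: the field and class names are assumptions, not the project's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestCase:
    """One system test following the template above (illustrative field names)."""
    title: str
    description: str
    preconditions: str                      # required test environment state
    steps: List[str]                        # each step names the subsystems it touches
    expected_result: str
    sequence_diagram: Optional[str] = None  # link to the subsystem interaction diagram
    doc_links: List[str] = field(default_factory=list)         # documentation / business process
    data_prep_scripts: List[str] = field(default_factory=list) # optional test data scripts

# Hypothetical example of a filled-in test case
tc = TestCase(
    title="Charge monthly fee",
    description="Verify the monthly fee is charged for an active B2C subscriber",
    preconditions="Subscriber is active; balance covers the monthly fee",
    steps=["Trigger billing cycle (Billing -> Rating)", "Fetch invoice via API"],
    expected_result="Invoice contains exactly one monthly-fee charge",
)
```

Keeping every test in one uniform structure like this is what makes the suite readable for business users and straightforward to automate later.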

The test model was gradually expanding. The a1qa engineers were updating the test suite in line with further improvement of current business processes and use cases.

System testing by a1qa helped achieve the following benefits:

  • Tests were tagged to determine the model (B2B/B2C/B2B+B2C).
  • Priority was set to fulfil business-critical tests first with the possibility to allocate the minimum set for conducting smoke tests.
  • The hierarchical test structure was simple to use, easily readable by business users, and cost-effective to maintain.
  • Test storage structure was developed considering business processes, which highlighted the risk of software inoperability in terms of business objectives.
  • Well-thought-out conventions for detailing and formatting test steps were introduced so that tests could be both run manually and automated (API requests with input and output parameters described, SQL queries to the database, etc.).
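The tagging and prioritization described above can be sketched as a simple filter that extracts the minimal business-critical set for smoke testing. The tag and priority values below are assumptions for illustration, not the project's actual taxonomy.

```python
# Select subsets from a tagged, prioritized test suite.
# Tags (B2B/B2C) and numeric priorities are illustrative assumptions.
tests = [
    {"name": "Create B2C subscriber", "tags": {"B2C"}, "priority": 1},
    {"name": "B2B bulk invoice",      "tags": {"B2B"}, "priority": 1},
    {"name": "Change tariff plan",    "tags": {"B2C"}, "priority": 2},
    {"name": "Hybrid account merge",  "tags": {"B2B", "B2C"}, "priority": 3},
]

def select(tests, model=None, max_priority=None):
    """Filter tests by business-model tag and/or a priority ceiling."""
    result = tests
    if model is not None:
        result = [t for t in result if model in t["tags"]]
    if max_priority is not None:
        result = [t for t in result if t["priority"] <= max_priority]
    return result

smoke = select(tests, max_priority=1)   # minimal business-critical smoke set
b2c_full = select(tests, model="B2C")   # everything tagged B2C
```

The same mechanism lets the team run the B2B, B2C, or combined model on demand without maintaining separate suites.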

Throughout the whole development cycle, the a1qa engineers ensured the quality of builds by conducting 2-3 iterations of testing, which included:

  • Functional testing
  • Results analysis
  • Preparation of the Product Quality report
  • Correction and updating of tests.

Based on the results of the work carried out, the client received a document describing the testing methodology and the structure of test scenarios.

Test automation

At the pre-release stage of development, it was necessary to automate testing within the integrated environment on the operator side.

Test automation objectives:

  • Speeding up time to market
  • Increasing test coverage.

Python, Robot Framework, and REST API were applied to achieve this aim.

To develop an effective testing solution, a1qa formed cross-functional teams combining the system testing engineers and test automation specialists.

What was the trigger for this decision? When working separately, the testing teams encountered a range of difficulties:

  1. The first concerned the time gap between completing test development and starting automation: early testing could not confirm that a test was still relevant and that there were no errors either in the software or in the test area where automation was run.
  2. It took up to one month to develop an automated test (from creating a manual test to its full automation).
  3. Sometimes it was necessary to redesign the automated tests due to code modifications.

Since the tests covered chains of systems involved in executing business scenarios, test steps were often repeated.

To avoid code duplication and simplify test maintenance in the future, the keyword-driven approach was applied.

The keywords were described in the manual tests, and the base of defined keywords was stored in TestRail. During automation, the keywords were coded and then reused across tests.
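A keyword-driven setup of this kind can be sketched in plain Python: each keyword is a reusable function, and a test is just an ordered list of keyword calls. The keyword names and the in-memory `billing` dict below are illustrative stand-ins, not the project's actual keywords or system.

```python
# Minimal keyword-driven sketch: keywords are reusable functions,
# tests are sequences of (keyword, args). All names are illustrative.
KEYWORDS = {}

def keyword(fn):
    """Register a function as a reusable keyword."""
    KEYWORDS[fn.__name__] = fn
    return fn

billing = {}  # stand-in for the system under test

@keyword
def create_subscriber(sub_id, balance):
    billing[sub_id] = {"balance": balance}

@keyword
def charge_fee(sub_id, amount):
    billing[sub_id]["balance"] -= amount

@keyword
def check_balance(sub_id, expected):
    assert billing[sub_id]["balance"] == expected, sub_id

def run_test(steps):
    """Execute a test written as a sequence of keyword steps."""
    for name, *args in steps:
        KEYWORDS[name](*args)

# The same keywords are reused across many tests, avoiding duplicated code.
run_test([
    ("create_subscriber", "sub-1", 100),
    ("charge_fee", "sub-1", 30),
    ("check_balance", "sub-1", 70),
])
```

Robot Framework provides this pattern out of the box, which is presumably why it was chosen: once a keyword is coded, every test that references it in TestRail can be automated without rewriting the step logic.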

Load testing methodology audit

The client was performing load testing of the newly developed solution.

Its creation involved the integration of a geographically distributed infrastructure into a single system.

It was vital for the system to withstand the load that previously was distributed between 25 systems.

The client wanted to make sure that load testing would bring informative results and asked a1qa to audit the load testing methodology.

a1qa assigned a lead performance testing engineer to run it.

During the audit, it was required to identify the weak points of the testing processes, check the results and evaluate them for compliance with the client’s key performance indicators (KPI), as well as offer recommendations for improvement.

The audit included checking the program code, the test methodology, the test monitoring system, and the obtained results.

The a1qa engineer:

  1. Examined the architecture of the complex and all related documentation
  2. Analyzed the methodology, scripts, and load generators
  3. Monitored the load/system status/equipment during the testing period.

All the results were successfully presented to the client’s Board of Directors.

After 4 months of rigorous audit, the dedicated a1qa expert identified more than 100 bottlenecks, suggested recommendations for improvement, and assessed their effectiveness.

Data migration testing

The transformation of a billing solution always requires migrating large amounts of data from the source system to the target one.

Migration should not affect the routine actions of subscribers.

Therefore, the tariffs and terms of service should remain at the same level. Even a slight increase in bills or calculation errors will cause user dissatisfaction and the growing number of customer support calls.

To fulfill the task successfully, QA consultants have developed a migration testing strategy using Parallel and Dry Run methods.

Parallel Run testing

Parallel Run is an approach in which two billing systems process the same input data, while the output results are compared and the differences are analyzed.

The properly operating source system serves as the reference. The target system, the one being introduced, is compared against this reference.

The expected result is that the same actions by migrated clients produce an equal effect: the same amount of charges for the same services, identical display of charges and payments on the balance, and identical billing and fee calculation. All traced discrepancies are potential defects in product configuration, migration, or functionality.
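The comparison step can be sketched as feeding identical events to both systems and diffing the per-subscriber charges. Everything below is a simplified illustration: the event format, rates, and the deliberately misconfigured tariff are assumptions, not the client's actual data.

```python
# Parallel Run sketch: run the same events through the source (reference)
# and target billing, then diff the per-subscriber charges. Illustrative only.

def run_source(events):
    """Reference system: flat 10 units per call event."""
    charges = {}
    for e in events:
        charges[e["sub"]] = charges.get(e["sub"], 0) + 10
    return charges

def run_target(events):
    """New system under test; the 12-unit rate models a configuration defect."""
    charges = {}
    for e in events:
        rate = 12 if e["sub"] == "sub-2" else 10
        charges[e["sub"]] = charges.get(e["sub"], 0) + rate
    return charges

def compare(events):
    """Return {subscriber: (source_charge, target_charge)} for mismatches only."""
    src, tgt = run_source(events), run_target(events)
    return {s: (src[s], tgt.get(s)) for s in src if src[s] != tgt.get(s)}

events = [{"sub": "sub-1"}, {"sub": "sub-2"}, {"sub": "sub-2"}]
discrepancies = compare(events)  # each entry is a potential defect
```

Subscribers whose charges match in both systems drop out of the report; every remaining entry is triaged as a configuration, migration, or functionality defect.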

How is Parallel Run fulfilled? The process consists of two stages: preliminary and regular.

The objectives of the preliminary stage are:

  • Identification and recovery of errors connected with product mapping, incomplete transfer of attributes of subscribers and clients, data synchronization between billing subsystems, functionality errors.
  • Debugging comparison and analysis scripts.
  • Team preparation for the analysis of discrepancies to launch testing in regular mode considering plans to initiate industrial migration of branches.

The regular stage included identifying and fixing the issues regarding:

  • Product mapping, incomplete transfer of attributes of subscribers and clients
  • Data synchronization between the subsystems involved in the business processes being checked, functionality errors.

At the preliminary stage, only a list of selected clients to be migrated is used for comparison; at the regular stage, all clients are compared.

Dry Run testing

Dry Run is the process of preparing data for the Parallel Run, during which the part of the subscriber base to be migrated is selected.

For example, at the initial project stage, clients with debts on the balance may be restricted to migrate until the debts are fully paid off.

After testing, a discrepancy analysis is carried out. All the detected defects are registered in the bug-tracking system.

Further on, the specialists gather statistics of discrepancies in terms of the verified business processes, describe the effect on the overall result of the most critical defects, and provide a detailed final report.
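Gathering discrepancy statistics per business process can be sketched with a simple counter over the registered defects. The process names and defect records below are hypothetical, used only to illustrate the shape of the report.

```python
from collections import Counter

# Tally registered discrepancies by the business process they affect.
# Defect records and process names are illustrative assumptions.
defects = [
    {"id": "BUG-1", "process": "fee calculation", "critical": True},
    {"id": "BUG-2", "process": "payments display", "critical": False},
    {"id": "BUG-3", "process": "fee calculation", "critical": False},
]

by_process = Counter(d["process"] for d in defects)   # statistics per process
critical = [d["id"] for d in defects if d["critical"]]  # defects to describe in detail
```

Aggregating this way gives the final report its two key views: which business processes accumulate the most discrepancies, and which individual defects materially affect the overall result.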

The advantage of Parallel Run is the extended coverage of the subscriber base and the configuration of products.

The client was satisfied with the way the a1qa team established and streamlined the QA and testing processes of the new solution and ensured faultless data migration to the revamped billing solution.

Technologies & tools

  • SQL Developer
  • Citrix
  • Robot Framework
  • TestRail
  • PuTTY
  • Fiddler
  • Postman
  • Atlassian Suite


  • The a1qa team met the client’s core business objective and helped improve the quality of the released system versions.
  • The QA specialists provided faultless data migration.
  • Due to the introduced test automation approach, the client managed to shorten time to market.
  • The a1qa engineers defined bottlenecks in the load testing methodology and proposed ways to eliminate them.

In numbers

years of the project duration
system tests were created
bottlenecks were detected during the audit of the load testing methodology
