Crafted performance testing service to ensure smooth SaaS platform operation for international cloud provider

PRODUCT OVERVIEW

The client serves companies across multiple industries worldwide, helping its customers streamline decision-making and adapt rapidly to constantly evolving business environments.

The developed software, a cloud SaaS platform, enables end users to perform financial and operational planning, business modeling, and predictive analytics. With this solution, companies can create, share, and maintain applications for tasks such as budgeting, sales, operational and strategic planning, demand forecasting, and reporting.

The platform allows each end user to create a customized model and perform multi-dimensional planning across dimensions such as time, product, customer, location, and currency.

Each tailored model therefore had to be tested to maintain high performance and prevent any interruptions in its operation.

PROJECT SCOPE

To reinforce the client's in-house QA teams, a1qa assigned six performance testing engineers.

Before testing the platform, the QA professionals studied its features in depth so they could test independently and quickly spot bottlenecks in the architecture configuration.

The next step in assuring the quality of the system was choosing the proper testing strategy. It was based on a user behavior approach that simulates the work of real consumers within the platform by means of transactions: separate actions performed by users. Between transactions, "think time" was inserted to simulate natural delays and produce the most realistic profile for executing the tests.
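The user-behavior approach above can be sketched in a few lines. The transaction names and think-time bounds below are illustrative assumptions, not the client's actual journey; in a real load test each action would issue requests against the platform.

```python
import random
import time

def think_time(min_s=2.0, max_s=8.0, rng=random):
    """Randomized pause, in seconds, inserted between transactions
    to mimic a real user's natural delay (an assumed range)."""
    return rng.uniform(min_s, max_s)

def run_user_journey(transactions, pause=think_time, sleep=time.sleep):
    """Execute each transaction (a named user action) in order,
    pausing with think time between them, and collect results."""
    results = []
    for name, action in transactions:
        results.append((name, action()))
        sleep(pause())  # simulate the user reading / deciding
    return results

# Hypothetical journey; real transactions would call the platform's API.
journey = [("login", lambda: "ok"), ("open_model", lambda: "ok")]
```

Injecting `pause` and `sleep` as parameters keeps the profile tunable per test and makes the journey easy to dry-run without real delays.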

To protect highly sensitive customer data, concurrency testing could begin only after all models had been sanitized and copied into the testing environment.

Having prepared the necessary scope of real-life user journeys and ensured that the models were configured correctly, the engineers designed scripts with complex logic and then executed the tests. To launch them from multiple geographical regions, the team introduced BlazeMeter, which provided real-time data on ongoing activities: transaction errors, response time (RT), and the number of queries per second.
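The per-run metrics named above (errors, RT, queries per second) can be aggregated from raw samples roughly as follows. This is a plain illustrative sketch, not BlazeMeter's API; a sample here is assumed to be a `(response_time_ms, is_error)` pair.

```python
from dataclasses import dataclass

@dataclass
class RunMetrics:
    avg_rt_ms: float          # average response time
    error_rate: float         # share of failed transactions
    queries_per_second: float # throughput over the run

def summarize(samples, duration_s):
    """Aggregate raw (response_time_ms, is_error) samples into the
    headline metrics of one test run."""
    total = len(samples)
    if total == 0 or duration_s <= 0:
        raise ValueError("need at least one sample and a positive duration")
    avg_rt = sum(rt for rt, _ in samples) / total
    errors = sum(1 for _, err in samples if err)
    return RunMetrics(
        avg_rt_ms=avg_rt,
        error_rate=errors / total,
        queries_per_second=total / duration_s,
    )
```

In practice a tool like BlazeMeter computes these continuously during the run; the sketch only shows what the numbers mean.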

The performance testing service package delivered by a1qa encompassed four types of checks:

  • Peak load tests — to measure the average RT and the throughput of the model at a given number of concurrent users.
  • Stress tests — to define the upper limit of the model by gradually increasing the load as well as analyze its dependence on the number of concurrent users, requests, or transactions.
  • Endurance tests — to track characteristic changes of the model over time (namely 9 hours) and find out any long-term stability issues. It was vital to make sure no system failures like a memory leak or increased RT would arise.
  • Customized tests — to create complex scenarios covering requirements not addressed by the previous checks. Their focus and details were agreed with the client.

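The load profiles behind these checks can be sketched as simple `(minute, users)` schedules. The step sizes and user counts below are illustrative assumptions; only the 9-hour endurance window comes from the description above.

```python
def peak_profile(users, duration_min):
    """Peak load: hold a fixed number of concurrent users to measure
    average RT and throughput at that level."""
    return [(0, users), (duration_min, users)]

def stress_profile(start_users, step_users, steps, step_min):
    """Stress: gradually increase load in steps to find the model's
    upper limit and its dependence on the number of concurrent users."""
    return [(i * step_min, start_users + i * step_users)
            for i in range(steps + 1)]

def endurance_profile(users, hours=9):
    """Endurance: sustained load over a long window (9 hours here) to
    surface stability issues such as memory leaks or creeping RT."""
    return peak_profile(users, hours * 60)
```

A load generator would interpolate between these points to ramp virtual users up and down.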
Each test run was executed in isolation, with performance data captured and the test environment monitored throughout, so that results were not skewed by other activities in the environment. At the end of each testing iteration, the QA engineers gathered the test results and presented them in reports containing the main performance metrics, analysis, and conclusions. After executing tests and reporting the results, the team held demo meetings to answer any of the client's questions.

The team's flexibility deserves special mention: the engineers adapted to a changing schedule, carrying out tasks even during off-hours and thus meeting all deadlines.

To strengthen the current project scope, a1qa introduced several improvements:

  • Design of a tailored course to help newcomers quickly grasp a technically complex product.
  • Process optimization by creating automated scripts to configure the models swiftly.
  • Smart team scalability depending on the amount of work to be fulfilled.

Cooperation with the client continues to evolve. The number of projects a1qa supports each month varies depending on many factors, averaging 5-8 projects per month.

SERVICES OFFERED
  • Performance testing
TECHNOLOGIES & TOOLS
  • Apache JMeter
  • Gatling
  • WebSocket
  • IntelliJ IDEA
  • BlazeMeter
  • Splunk
  • Fiddler
  • Charles
  • Java, Python
RESULTS
  • Strengthened customer experience (CX) thanks to smooth platform performance under the desired load.
  • An established load testing process, with a standardized approach to load tests across all in-house QA teams.
  • Internal workforce freed up to focus on the quality of internal services.
  • Increased value obtained by the client reselling a1qa's performance testing services to its own customers.
IN NUMBERS
  • 2
    years of productive business cooperation
  • 6
    QA engineers were cherry-picked by a1qa