Serving companies across multiple industries worldwide, the client helps its customers streamline decision-making and rapidly adapt to constantly evolving business environments.
The developed software, a cloud SaaS platform, enables end users to perform financial and operational planning, business modeling, and predictive analytics. With this solution, companies can create, share, and maintain applications for tasks such as budgeting, sales planning, operational and strategic planning, demand forecasting, and reporting.
The platform allows each end user to create a customized model and perform multi-dimensional planning across dimensions such as time, product, customer, location, and currency.
Therefore, each tailored model had to be tested to maintain high performance and prevent any interruptions in service.
To support the client’s in-house QA teams, a1qa assigned six performance testing engineers.
Before testing the platform, the QA professionals studied it in depth so they could perform independent testing and quickly identify bottlenecks in the architecture configuration.
The next step in assuring the quality of the system was choosing a suitable testing strategy. It was based on a user behavior approach aimed at simulating the work of real consumers within the platform by means of transactions (discrete actions performed by users). Between transactions, “think time” was applied to simulate natural delays and produce the most realistic test execution profile.
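The transaction-plus-think-time approach can be sketched in a few lines of Python. This is a minimal illustration, not the team’s actual scripts: the transaction names and the 2–8 second think-time range are assumptions chosen for the example.

```python
import random
import time

# Hypothetical transactions making up one user journey; the real platform
# actions (login, opening a model, etc.) are assumptions for illustration.
TRANSACTIONS = ["login", "open_model", "update_forecast", "generate_report"]

def think_time(min_s=2.0, max_s=8.0):
    """Random pause mimicking a real user's natural delay between actions."""
    return random.uniform(min_s, max_s)

def run_user_journey(execute, sleep=time.sleep):
    """Run each transaction in order, pausing with think time between them.

    `execute` performs one named transaction; `sleep` is injectable so the
    journey can be dry-run instantly in tests. Returns per-transaction timings.
    """
    timings = {}
    for name in TRANSACTIONS:
        start = time.perf_counter()
        execute(name)                       # perform the transaction itself
        timings[name] = time.perf_counter() - start
        if name != TRANSACTIONS[-1]:
            sleep(think_time())             # user "thinks" before the next step
    return timings
```

Injecting `sleep` keeps the journey definition identical whether it is being load-tested for real or dry-run during script development.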
To protect highly sensitive customer data, concurrency testing was performed only after all models had been sanitized and copied into the test environment.
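A common way to implement such sanitization is to replace sensitive values with irreversible hashes before the copy is made. The sketch below assumes a simple record schema and field list; the client’s real models and sanitization policy are not described in the source.

```python
import hashlib

# The set of sensitive fields is an assumption for illustration; a real
# model would carry its own schema and data-classification policy.
SENSITIVE_FIELDS = {"customer_name", "email", "account_number"}

def sanitize_record(record):
    """Return a copy of `record` with sensitive values replaced by short,
    irreversible SHA-256 digests, so the test environment never holds
    real customer data while non-sensitive planning figures stay usable."""
    clean = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            clean[key] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        else:
            clean[key] = value
    return clean
```

Hashing (rather than deleting) keeps distinct customers distinct, so concurrency scenarios that depend on record counts and joins still behave realistically.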
Therefore, having prepared the necessary scope of real-life user journeys and ensured that the models were configured correctly, the engineers proceeded with designing scripts with complex logic and, later on, executing the tests. To launch them from multiple geographical regions, the BlazeMeter tool was introduced. It allowed the team to receive relevant real-time data on ongoing activities: transaction errors, response time (RT), and the number of queries per second.
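The three metrics named above can be derived from raw per-transaction samples. The helper below is a hedged sketch of that aggregation, not BlazeMeter’s internals; the sample format and the 90th-percentile choice are assumptions for the example.

```python
from statistics import mean

def summarize_results(samples, window_s):
    """Aggregate raw samples into the tracked metrics.

    `samples` is a list of (elapsed_seconds, ok) tuples collected over a
    test window of `window_s` seconds. Returns transaction errors,
    average and 90th-percentile response time, and queries per second.
    """
    elapsed = sorted(s for s, _ in samples)
    errors = sum(1 for _, ok in samples if not ok)
    p90 = elapsed[max(0, int(len(elapsed) * 0.9) - 1)]  # nearest-rank P90
    return {
        "transactions": len(samples),
        "errors": errors,
        "avg_rt_s": round(mean(elapsed), 3),
        "p90_rt_s": p90,
        "qps": round(len(samples) / window_s, 2),  # throughput over the window
    }
```

Percentile response times are reported alongside the average because a handful of slow outliers can hide behind an otherwise healthy mean.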
The performance testing service package delivered by a1qa encompassed four types of checks, listed below:
Performance data was captured and the test environment was monitored during each test run; runs were executed in isolation so the results would not be affected by other activities in the test environment. At the end of each testing iteration, the QA engineers gathered the test results and presented them in reports containing the main performance metrics, an analysis of the results, and conclusions. After executing the tests and reporting the results, the team held demo meetings to answer any of the client’s questions.
The team’s flexibility deserves special mention: they adapted to the changing schedule, carrying out tasks even during off-hours, and thus met all deadlines.
To strengthen the current project scope, a1qa has introduced several improvements:
Cooperation with the client continues to evolve. The number of projects a1qa supports each month varies and depends on many factors, but on average the team handles about 5–8 projects per month.