Anaplan serves companies across multiple industries worldwide, streamlining decision-making so its customers can adapt rapidly to constantly evolving business environments.
Its software, a cloud SaaS platform, lets end users perform financial and operational planning, business modeling, and predictive analytics. With it, companies can create, share, and maintain applications for tasks such as budgeting, sales planning, operational and strategic planning, demand forecasting, and reporting.
The platform allows each end user to create a customized model and perform multi-dimensional planning across dimensions such as time, product, customer, location, and currency.
Each tailored model therefore had to be tested to maintain high performance and prevent any interruptions in its functioning.
To help Anaplan and its in-house QA teams get things done, a1qa assigned six performance testing engineers.
Before testing the platform, the engineers studied it in depth. This upfront exploration of the internal models' architecture later enabled them to test independently and spot configuration bottlenecks at a glance.
The next step in assuring the quality of the system was choosing the right testing strategy. It was based on a user-behavior approach: simulating the work of real users within the platform by means of transactions, i.e., separate actions performed by users. Between transactions, "think time" was applied to simulate natural delays and produce the most realistic profile for executing tests.
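The user-behavior approach can be sketched in a few lines of Python. This is a minimal illustration only: the transaction names, timings, and think-time bounds below are assumptions for the demo, not Anaplan's actual user journeys or a1qa's scripts (which were built in Apache JMeter).

```python
import random
import time

# Hypothetical user journey: these transaction names are illustrative
# assumptions, not the platform's real operations.
USER_JOURNEY = ["open_model", "update_cell", "run_forecast", "export_report"]

def think_time(min_s: float, max_s: float) -> float:
    """Random pause between transactions to mimic a real user's pacing."""
    return random.uniform(min_s, max_s)

def run_journey(journey: list[str]) -> dict[str, float]:
    """Execute each transaction in order and record its response time."""
    timings = {}
    for transaction in journey:
        start = time.perf_counter()
        time.sleep(0.01)  # placeholder for the real request to the platform
        timings[transaction] = time.perf_counter() - start
        # Pause before the next action; bounds shortened for the demo.
        time.sleep(think_time(0.01, 0.05))
    return timings

if __name__ == "__main__":
    for name, rt in run_journey(USER_JOURNEY).items():
        print(f"{name}: {rt * 1000:.1f} ms")
```

In a real load test, many such journeys run concurrently, and the think-time distribution is tuned to match observed user pacing rather than fixed bounds.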
To protect highly sensitive customer data, concurrency testing was performed only after all models had been sanitized and copied into the testing environment.
Having prepared the necessary scope of real-life user journeys and ensured that the models were configured correctly, the engineers proceeded with designing scripts with complex logic in Apache JMeter and, later on, executing the tests.
To launch them from multiple geographical regions, the BlazeMeter tool was introduced. It gave the team relevant data on ongoing activities in real time: transaction errors, response time (RT), and the number of queries per second.
The performance testing service package delivered by a1qa encompassed four types of checks:
Performance data were captured and the test environment was monitored during each test run. Each run was executed in isolation so that other activities in the test environment would not skew the results.
At the end of each testing iteration, the engineers gathered the test results and presented them in two report types: a spreadsheet report containing the main performance metrics and an executive summary report comprising conclusions and analysis of the results.
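The kind of aggregation behind such a metrics report can be sketched as follows. The sample data and field names are hypothetical, chosen only to show how raw transaction records roll up into the headline figures (request count, error rate, average and percentile response times); this is not a1qa's actual report schema.

```python
import statistics

# Hypothetical captured transaction results (illustrative values only).
SAMPLES = [
    {"transaction": "open_model", "rt_ms": 420, "error": False},
    {"transaction": "open_model", "rt_ms": 510, "error": False},
    {"transaction": "open_model", "rt_ms": 1340, "error": True},
    {"transaction": "run_forecast", "rt_ms": 980, "error": False},
    {"transaction": "run_forecast", "rt_ms": 1105, "error": False},
]

def summarize(samples: list[dict]) -> dict:
    """Aggregate core metrics typically shown in a summary report."""
    rts = [s["rt_ms"] for s in samples]
    errors = sum(1 for s in samples if s["error"])
    return {
        "requests": len(samples),
        "error_rate_pct": round(100 * errors / len(samples), 1),
        "avg_rt_ms": round(statistics.mean(rts), 1),
        # statistics.quantiles with n=10 yields deciles; the last cut
        # point is the 90th-percentile estimate.
        "p90_rt_ms": round(statistics.quantiles(rts, n=10)[-1], 1),
    }

if __name__ == "__main__":
    print(summarize(SAMPLES))
```

Reporting percentiles alongside the mean matters here: a handful of slow transactions can leave the average looking healthy while real users still experience long delays.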
After executing the tests and reporting the results, the team organized demo meetings to answer the client's questions.
The team's flexibility in adapting to a changing schedule deserves special mention: they carried out tasks even during off-hours, thus meeting every deadline.
To strengthen the current project scope, a1qa introduced several improvements:
Cooperation with Anaplan continues to evolve. The number of projects a1qa supports each month varies with many factors, but on average the team handles about five to eight projects per month.