
Performance testing service crafted to ensure smooth SaaS platform operation for Anaplan

Performance enhancement for a global leader in connected planning, simplifying decision-making for businesses.

Overview

Serving companies across multiple industries worldwide, Anaplan streamlines decision-making to help its customers adapt rapidly to constantly evolving business environments.

The developed software, a cloud SaaS platform, enables end users to perform financial and operational planning, business modeling, and predictive analytics. With this solution, companies can create, share, and maintain applications for multiple tasks: budgeting, sales, operational, and strategic planning, demand forecasting, reporting, and more.

The platform allows each end user to create a customized model and perform multi-dimensional planning across such dimensions as time, product, customer, location, and currency.
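
To make "multi-dimensional" concrete, here is a toy Java sketch, not the platform's actual data model (which the source doesn't describe): a planning value addressed by a tuple of dimensions, all names and figures invented.

    import java.util.HashMap;
    import java.util.Map;

    // Toy illustration of a multi-dimensional planning cell: a value looked up
    // by a combination of dimensions. All dimension values are hypothetical.
    public final class MultiDimDemo {
        record CellKey(String time, String product, String customer,
                       String location, String currency) {}

        public static void main(String[] args) {
            Map<CellKey, Double> plan = new HashMap<>();
            plan.put(new CellKey("2024-Q1", "WidgetA", "Acme", "EMEA", "EUR"), 125_000.0);

            Double value = plan.get(new CellKey("2024-Q1", "WidgetA", "Acme", "EMEA", "EUR"));
            System.out.println("planned revenue = " + value);
        }
    }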

Each tailored model therefore had to be tested to maintain high performance and prevent any interruptions to its operation.

Services offered

Performance testing

Project scope

To support Anaplan and its in-house QA teams, a1qa assigned 6 performance testing engineers.

Before getting down to checking the platform, the engineers studied its inner workings. This commitment to exploring the architecture of internal models later made it possible to test independently and spot configuration bottlenecks at a glance.

The next step in assuring the quality of the system was choosing a proper testing strategy. It was based on a user behavior approach, which simulates the work of real users within the platform through transactions (separate actions performed by users). Between transactions, “think time” was applied to imitate natural delays and produce the most realistic test execution profile.
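
The source doesn’t include the scripts themselves, but the approach maps naturally onto modern load testing DSLs. Below is a minimal, purely illustrative sketch in the Gatling Java DSL (Gatling appears in the project’s tool list), with hypothetical endpoints and timings: each exec step is one transaction, and pause models the think time between them.

    import static io.gatling.javaapi.core.CoreDsl.*;
    import static io.gatling.javaapi.http.HttpDsl.*;

    import io.gatling.javaapi.core.ScenarioBuilder;
    import io.gatling.javaapi.core.Simulation;
    import io.gatling.javaapi.http.HttpProtocolBuilder;

    import java.time.Duration;

    public class PlannerJourneySimulation extends Simulation {

        // Hypothetical base URL; the real platform endpoints are not public here.
        HttpProtocolBuilder protocol = http.baseUrl("https://planning.example.com");

        // Each exec(...) is one "transaction"; pause(...) models think time.
        ScenarioBuilder journey = scenario("Planner user journey")
            .exec(http("open model").get("/models/demo"))
            .pause(Duration.ofSeconds(3), Duration.ofSeconds(8)) // random 3-8 s think time
            .exec(http("run forecast").post("/models/demo/forecast"))
            .pause(Duration.ofSeconds(5)); // fixed 5 s think time

        {
            // Ramp 50 virtual users over 5 minutes; numbers are illustrative.
            setUp(journey.injectOpen(rampUsers(50).during(Duration.ofMinutes(5))))
                .protocols(protocol);
        }
    }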

To protect highly sensitive customer data, concurrency testing was carried out only after all models had been sanitized and copied into the testing environment.

Having prepared the necessary scope of real-life user journeys and ensured the models were configured correctly, the engineers proceeded to design scripts with complex logic in Apache JMeter and then to execute the tests.
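
What “complex logic” looked like in the actual scripts isn’t shown in the source. As a hedged illustration of one common pattern, correlating a value from a response and branching on it, here is a sketch in the same Java DSL; the paths and JSON fields are invented.

    import static io.gatling.javaapi.core.CoreDsl.*;
    import static io.gatling.javaapi.http.HttpDsl.*;

    import io.gatling.javaapi.core.ChainBuilder;

    public final class ComplexLogicSteps {

        // Pull a model id out of one response, then branch: the export step
        // only runs when the id was actually captured. Fields are invented.
        public static final ChainBuilder correlatedFlow =
            exec(http("list models")
                    .get("/models")
                    .check(jsonPath("$[0].id").saveAs("modelId")))
            .doIf(session -> session.contains("modelId")).then(
                exec(http("export model").get("/models/#{modelId}/export")));

        private ComplexLogicSteps() {}
    }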

To launch tests from multiple geographical regions, the team introduced BlazeMeter. The tool provided relevant data on ongoing activities in real time: transaction errors, response time (RT), and the number of queries per second.
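
BlazeMeter reports such metrics out of the box; purely to make the two headline numbers concrete, here is a toy Java sketch that computes average RT and throughput from a handful of hypothetical samples captured in a fixed measurement window.

    import java.util.List;

    // Toy metric computation over hypothetical recorded samples: each entry
    // is a response time in milliseconds observed during the test window.
    public final class QuickMetrics {
        public static void main(String[] args) {
            List<Long> responseTimesMs = List.of(120L, 250L, 95L, 310L, 180L);
            long windowSeconds = 10; // length of the measurement window

            double avgRt = responseTimesMs.stream()
                .mapToLong(Long::longValue).average().orElse(0);
            double throughput = (double) responseTimesMs.size() / windowSeconds;

            System.out.printf("avg RT = %.1f ms, throughput = %.2f req/s%n",
                avgRt, throughput);
        }
    }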

The performance testing service package crafted by a1qa encompassed 4 types of checks, listed below (a load-profile sketch follows the list):

  • Peak load tests: measuring the average RT and throughput of the model at a given number of concurrent users.
  • Stress tests: defining the model’s upper limit by gradually increasing the load and analyzing how it depends on the number of concurrent users, requests, or transactions.
  • Endurance tests: tracking how the model’s characteristics change over an extended run (9 hours) to uncover long-term stability issues, making sure no failures such as memory leaks or growing RT would arise.
  • Customized tests: building complex scenarios that cover requirements not addressed by the previous checks, with their focus and details agreed with the client.
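
As a rough illustration of how the first three check types might translate into load profiles, here is a hedged Gatling Java DSL sketch; all user counts and durations are invented, and in the project each check ran as its own isolated execution.

    import static io.gatling.javaapi.core.CoreDsl.*;

    import io.gatling.javaapi.core.ChainBuilder;
    import io.gatling.javaapi.core.ScenarioBuilder;
    import io.gatling.javaapi.core.Simulation;

    import java.time.Duration;

    public class LoadProfilesSimulation extends Simulation {

        ChainBuilder steps = exec(session -> session); // journey steps omitted

        ScenarioBuilder peak = scenario("peak load").exec(steps);
        ScenarioBuilder stress = scenario("stress").exec(steps);
        ScenarioBuilder soak = scenario("endurance").exec(steps);

        {
            // The three profiles are combined here only to compare shapes;
            // in practice each would be a separate, isolated run.
            setUp(
                // Peak load: hold a fixed number of concurrent users.
                peak.injectClosed(
                    constantConcurrentUsers(200).during(Duration.ofMinutes(30))),
                // Stress: step the arrival rate upward to find the upper limit.
                stress.injectOpen(incrementUsersPerSec(5).times(6)
                    .eachLevelLasting(Duration.ofMinutes(10)).startingFrom(10)),
                // Endurance: moderate load held for the full 9-hour window.
                soak.injectClosed(
                    constantConcurrentUsers(50).during(Duration.ofHours(9)))
            );
        }
    }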

Performance data was captured and the test environment monitored during each run. Every test was executed in isolation so that the results would not be skewed by other activities in the environment.

At the end of each testing iteration, the engineers gathered the test results and presented them in two report types: a spreadsheet report containing the main performance metrics and an executive summary report with the conclusions and analysis of the results.
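
The exact layout of the spreadsheet report isn’t disclosed in the source. As a minimal sketch under that caveat, the main metrics could be exported to a CSV file that opens as a spreadsheet; the column names and values below are illustrative.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;

    // Minimal sketch: write headline performance metrics to a CSV file.
    // Column names and figures are invented for illustration only.
    public final class SpreadsheetReport {
        public static void main(String[] args) throws IOException {
            List<String> rows = List.of(
                "transaction,avg_rt_ms,p90_rt_ms,errors,throughput_rps",
                "open model,180,310,0,12.4",
                "run forecast,950,1600,2,3.1");
            Files.write(Path.of("performance-report.csv"), rows);
            System.out.println("wrote performance-report.csv");
        }
    }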

After executing the tests and reporting the results, the team held demo meetings to answer the client’s questions.

The team’s flexibility in adapting to a changing schedule deserves special mention: engineers carried out tasks even during off-hours, meeting every deadline.

To strengthen the ongoing project scope, a1qa introduced several improvements:

  • Design of a tailored course helping newcomers quickly get to grips with a technically complex product.
  • Process optimization through automated scripts that configure the models swiftly (see the sketch after this list).
  • Smart team scaling depending on the amount of work at hand.
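
The configuration scripts themselves aren’t shown in the source; the sketch below only illustrates the general shape of such automation, a small Java helper posting a model configuration to a hypothetical REST endpoint. The URL, token, and payload are all invented.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Hypothetical sketch of an automated model-configuration step; every
    // endpoint, header value, and field here is invented for illustration.
    public final class ModelConfigurator {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            String payload = "{\"dimensions\": [\"time\", \"product\", \"location\"]}";

            HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://planning.example.com/api/models/demo/config"))
                .header("Authorization", "Bearer <token>") // placeholder credential
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

            HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("status = " + response.statusCode());
        }
    }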

Cooperation with Anaplan keeps evolving. The number of projects a1qa supports each month varies and depends on many factors, but the team handles about 5-8 projects per month on average.

Technologies & tools

  • Apache JMeter
  • Gatling
  • WebSocket
  • IntelliJ IDEA
  • BlazeMeter
  • Splunk
  • Fiddler
  • Charles
  • Java, Python

Results

  • Improved CX was achieved by ensuring smooth platform performance under the desired load.
  • A load testing process was established by designing a standardized approach to load tests shared by all in-house QA teams.
  • Additional workforce was freed up to focus on the quality of internal services.

In numbers

  • 5+ years of productive business cooperation
  • 6 engineers cherry-picked by a1qa
