

The client operates in the shoppable media technology sector, owning a platform that leverages machine learning to transform the way people shop through video.
The product recognizes items inside video and overlays on-player cards with names, prices, and direct purchase links, so viewers can browse and buy without leaving the stream. It allows brands to embed interactive elements, making it a powerful tool for boosting engagement and driving conversions across digital channels.
The client was transforming its platform, shifting to a new architecture, moving to the cloud, and integrating with external services. As a result, they required specialized testing to sustain daily release velocity, protect stability during change, and detect and fix issues early in every phase.
Before testing began, a1qa’s team assumed full responsibility for QA management and implemented a Kanban workflow to smoothly handle daily QA activities. This approach enabled the engineers to adapt quickly to shifting priorities and accelerate the delivery of tasks into production. In addition, a1qa managed a cross-functional QA specialist on the client’s side, while on a1qa’s side a dedicated manager coordinated workloads with two development managers.
Although the client worked across multiple time zones, an efficient communication framework was established. a1qa held weekly meetings with business stakeholders, daily stand-ups, status meetings for project managers, and bug triage sessions with the client’s product owner to maintain transparency throughout the project, stay aligned, and resolve issues productively.
To improve planning and effectively allocate QA engineers across features, a1qa developed a Gantt chart. The team estimated testing hours for each feature, and the manager mapped tasks to specialists in a chart that showed assignments, timelines, and supported release forecasting. On this basis, the client was provided with a release plan including preliminary delivery dates. This approach optimized resource usage and ensured smooth, predictable testing workflows.
These preliminary workflow setup activities laid the foundation for well-coordinated testing efforts, which included:
The client aimed to deliver numerous customized features simultaneously, which made a single environment insufficient. To speed up the process, the team scaled to 30–40 developers and over 10 testers at peak, while the project maintained 8–10 test environments, supported by dedicated DevOps engineers (up to two at peak). This setup enabled daily releases (except on Fridays) and ensured rapid delivery of high-quality functionality.
QA specialists performed full-cycle functional testing — covering everything from requirements testing to regression testing — to ensure stability and reliability across frequent releases, maintain high product quality, and minimize risks in delivering customized features.
What’s more, before a release, each feature went through two types of collaborative sessions with developers, business analysts, and QA engineers. First, a grooming session let QA assess risks and suggest improvements. Second, a pre-demo after development gathered BA and QA feedback, with comments documented instead of formal bug reports to speed up refinement. This process reduced the time from idea to release, enabling fast and flexible iterations.
Additionally, a1qa’s QA engineers validated an object detection solution that used neural networks to identify fashion items in videos and images and match them against large eCommerce catalogs. For each test, a1qa built a dataset by selecting the most relevant items for a given video, such as matching a dress worn by a model to similar products in partner stores. The system’s recommendations were then compared against this ground truth to confirm accuracy. Testing covered multiple retailers and a wide assortment of items, ensuring that the feature delivered precise product matches and enhanced the shopping experience.
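The core of such a check is comparing the matcher’s output against the curated ground truth. The sketch below illustrates the idea with a simple precision metric; the item IDs and the `precision` helper are illustrative assumptions, not the client’s actual data or tooling.

```java
import java.util.List;
import java.util.Set;

// Hypothetical sketch: scoring the detector's product recommendations
// against a tester-curated ground-truth set for one video.
public class MatchAccuracyCheck {

    // Precision: fraction of recommended items that appear in the ground truth.
    static double precision(List<String> recommended, Set<String> groundTruth) {
        if (recommended.isEmpty()) return 0.0;
        long hits = recommended.stream().filter(groundTruth::contains).count();
        return (double) hits / recommended.size();
    }

    public static void main(String[] args) {
        // Ground truth: catalog items a tester judged to match the dress in the video
        Set<String> groundTruth = Set.of("dress-001", "dress-014", "dress-027");
        // What the neural-network matcher actually returned
        List<String> recommended = List.of("dress-001", "dress-014", "shoe-090", "dress-027");

        double p = precision(recommended, groundTruth);
        System.out.printf("precision = %.2f%n", p); // 3 of 4 recommendations are correct
    }
}
```

In practice one would also track recall (how many ground-truth items were found at all) and repeat the comparison per retailer and per item category.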
To support continuous delivery, allow for early detection of regressions, and significantly reduce manual testing efforts, a1qa implemented a robust Java- and Kotlin-based test automation strategy integrated into the CI/CD pipeline. Test environments were provisioned automatically using Infrastructure as Code (IaC), ensuring consistent and reproducible setups for reliable test execution.
Automation was tightly integrated with the Kanban workflow, enabling seamless validation of tasks moving toward release, and ensuring high confidence in build quality across changing priorities. a1qa’s test automation efforts yielded a positive outcome, enabling the execution of approximately 1,200 tests in just 1.5 hours.
The client was preparing for a major event, so performance testing was vital to ensure the system could handle a load of up to 100,000 concurrent users.
Two of a1qa’s experts collaborated closely with the DevOps engineers and the client’s project architect to prepare all the necessary scripts and carry out load and stress testing, reproducing the actions of real users as realistically as possible to determine how the system would behave under extreme load and to prevent degradation. The resulting reports and metrics confirmed that the software was stable and ready for the upcoming event.
Additionally, a1qa handed over all the scripts to the client so they could independently run the tests whenever needed.
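The essence of such a load script is spawning many virtual users and counting how many of their requests succeed. The stub below is a minimal sketch of that loop; `doRequest()` stands in for a real HTTP call, and the user counts are illustrative, not the project’s actual 100,000-user scenarios.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Minimal load-generation sketch: N virtual users each issue one request,
// and we count successful responses to compute an error rate.
public class LoadSketch {

    static boolean doRequest() {
        // Placeholder for a real HTTP call; always "succeeds" in this sketch.
        return true;
    }

    static int run(int virtualUsers) {
        ExecutorService pool = Executors.newFixedThreadPool(Math.min(virtualUsers, 64));
        AtomicInteger ok = new AtomicInteger();
        for (int i = 0; i < virtualUsers; i++) {
            pool.execute(() -> { if (doRequest()) ok.incrementAndGet(); });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(1, TimeUnit.MINUTES);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return ok.get();
    }

    public static void main(String[] args) {
        System.out.println(run(1000)); // number of successful simulated requests
    }
}
```

Dedicated tools such as JMeter or Gatling implement this pattern with ramp-up profiles, think times, and response-time percentiles on top.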
a1qa’s specialists performed OWASP-guided penetration testing, using semi-automated tooling for routine checks. The team uncovered several high-severity issues that could allow a remote, unauthorized actor to access APIs and harvest data, trigger denial-of-service, and cause other serious impacts.
The experts also logged and validated defects, compiled a detailed report with findings and recommendations, and presented a demo to QA teams, developers, and client representatives. Additionally, a1qa provided a privilege matrix describing the roles and clarifying which features were accessible, restricted, or limited in accordance with the system’s original design.
In the scope of the project, a1qa’s team also successfully coped with several tech-related challenges, namely:
Changes in the client’s plans and task priorities demanded strong adaptability from a1qa’s management team. Shifting priorities meant reassessing the impact of changes, reassigning tasks, and cancelling outdated ones. This also affected regression testing, which had to remain lightweight even though its suite was assembled from a pool of 8,000 test cases carefully selected to provide the necessary coverage.
To solve this issue, a1qa’s testers and the client’s development teams collaborated closely, with engineers flagging risk areas during feature discussions and grooming sessions. A detailed test model in TestRail allowed custom assembly of regression scopes, while automation suites were categorized to quickly generate the right set of tests. Moreover, automation engineers taught functional testers to run automated checks locally, further accelerating regression cycles and helping the team keep pace with fast-changing priorities.
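Assembling a custom regression scope from a categorized test model boils down to filtering cases by the risk areas flagged during grooming. The sketch below shows that idea; the case IDs, tag names, and the `TestCase` record are invented for illustration and do not reflect the project’s actual TestRail structure.

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Sketch of building a targeted regression scope from a tagged test model,
// similar in spirit to filtering a TestRail suite by component.
public class RegressionScope {

    record TestCase(String id, Set<String> tags) {}

    // Select every case that touches at least one flagged risk area.
    static List<String> scopeFor(List<TestCase> model, Set<String> riskAreas) {
        return model.stream()
                .filter(tc -> tc.tags().stream().anyMatch(riskAreas::contains))
                .map(TestCase::id)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<TestCase> model = List.of(
                new TestCase("C101", Set.of("player", "overlay")),
                new TestCase("C102", Set.of("checkout")),
                new TestCase("C103", Set.of("overlay", "catalog")));

        // A release touching the overlay pulls in only the overlay-related cases.
        System.out.println(scopeFor(model, Set.of("overlay"))); // [C101, C103]
    }
}
```

Categorizing automated suites the same way lets the pipeline run only the subset relevant to a given change, which is what keeps regression lightweight under shifting priorities.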
With a customer base that was still taking shape, a1qa’s team worked in a highly agile, adaptable mode. Planning horizons were shorter, and the system was tuned continuously to meet near-term client needs. At times, in-progress features were paused or reprioritized to address urgent requests.
a1qa’s onboarding process aimed to select candidates who were technically proficient and adaptable to a fast-paced workflow. The company reviewed candidates’ backgrounds, spoke with their managers, and used scenario questions to assess their ability to shift priorities.
Considering the complex nature of the project’s technical and business logic, a1qa prioritized onboarding highly skilled QA engineers who could handle the technical demands of the work.
Slow release cycles were a major bottleneck. The team needed a way to accelerate the delivery of new features and fixes without compromising quality. The traditional separation of roles, where automated tests were solely the responsibility of test automation engineers, was no longer effective.
To support rapid releases, a1qa assigned dedicated manual testers who knew how to run automated suites and analyze the results. This expanded scope strengthened their skills and made the group a critical link in the delivery chain.