


To accelerate releases and further streamline QA workflows, a1qa created a test automation framework for us to build upon. Once the framework was complete, the a1qa team provided training and supported our team through an augmented model to continue extending the automation suite’s capabilities…
… their proactivity – they are not just “extra hands”; they approach assignments thoughtfully and always propose options, best practices, performance improvements, and more.
VP Product Operations
The client designs innovative iGaming experiences across both physical venues and digital platforms. Its holdings span a spectrum of leisure offerings: expansive gaming floors teeming with electronic terminals, a meticulously crafted golf facility in a dense urban center, and a historic equestrian track with decades of legacy. The company seamlessly integrates real-world and virtual loyalty programs, bridging traditional experiences and mobile-based entertainment.
The client developed and operates a platform for sports wagering, along with web and mobile products that include casino games and third-party integrations. To support expansion into additional US states and achieve faster time to value, the client engaged a QA partner to strengthen delivery across a multi-vendor setup.
To support frequent releases in a Scrumban workflow, a1qa assigned a dedicated team that scaled with the product and resolved issues as new functionality shipped.
Prior to testing, the QA engineers created and maintained detailed test cases and scenarios to ensure clear coverage of functional requirements and support a structured testing process. They tested new features to confirm expected behaviour and alignment with business rules. Regression testing was carried out regularly to confirm that existing functionality (registration, bet placement, browsing the list of players and events, online casino games, payments, etc.) remained stable and unaffected by changes introduced in new releases, with runs covering critical user paths and previously delivered core features.
The team logged all defects with clear reproduction steps, expected results, and environment details to speed up triage and fixes. They also retested fixes to confirm resolution and check for side effects.
Additionally, a1qa helped the client comply with the latest regulatory policies and directives of the United Kingdom Gambling Commission (UKGC). During registration, end users must confirm their identity by providing the relevant documents before gaining full access to the account and platform functionality. Because this confirmation takes time, the QA engineers checked that the identity verification cycle operates as intended (e.g., that functionality is limited during the confirmation period).
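The "limited functionality during verification" behaviour above can be sketched as a simple feature-gating check. This is an illustrative model only; the type and function names below are hypothetical and not the client's actual API.

```typescript
// Hypothetical sketch of feature gating by identity-verification status.
// Action names and the gating policy are illustrative assumptions.
type VerificationStatus = "pending" | "verified" | "rejected";

const ALL_ACTIONS = ["browse", "deposit", "placeBet", "withdraw"] as const;
type Action = (typeof ALL_ACTIONS)[number];

// While documents are still under review, only read-only actions are available.
function allowedActions(status: VerificationStatus): Action[] {
  switch (status) {
    case "verified":
      return [...ALL_ACTIONS];
    case "pending":
      return ["browse"]; // limited functionality during the confirmation period
    case "rejected":
      return [];
  }
}
```

A test suite can then assert, for each status, that exactly the expected actions are exposed in the UI and accepted by the backend.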
The QA engineers helped the client to ensure consistent system behaviour and user experience across a wide range of environments. Browser compatibility testing was conducted on multiple desktop browsers to verify correct rendering, functionality, and responsiveness under different configurations. In parallel, the team validated software behaviour across various iOS and Android mobile devices and tablets, confirming that features operated as expected on different screen sizes, operating systems, and hardware profiles. These activities helped identify environment-specific issues early and supported reliable product performance across platforms.
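A browser/device matrix like the one described is commonly expressed as a Playwright configuration. The following config fragment is a minimal sketch; the project names and device choices are illustrative, not the client's real setup.

```typescript
// playwright.config.ts — minimal sketch of a cross-browser/device test matrix
// (project names and devices are illustrative examples).
import { defineConfig, devices } from "@playwright/test";

export default defineConfig({
  projects: [
    { name: "chromium-desktop", use: { ...devices["Desktop Chrome"] } },
    { name: "firefox-desktop", use: { ...devices["Desktop Firefox"] } },
    { name: "webkit-desktop", use: { ...devices["Desktop Safari"] } },
    { name: "ios-safari", use: { ...devices["iPhone 13"] } },
    { name: "android-chrome", use: { ...devices["Pixel 5"] } },
  ],
});
```

Each project runs the same test suite under a different browser engine or emulated device profile, which is how environment-specific issues surface early.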
The specialists used a1qa’s in-house pool of real mobile devices to increase testing accuracy and uncover more defects. They focused primarily on the iOS version because of its development specifics. Beyond the major functionality, a1qa’s engineers performed targeted verifications, such as behaviour under interruptions and other external actions.
The engineers calculated the uplift from increased test coverage and the ROI of automation to quantify effort saved and improved defect-detection capability. The experts then proceeded with C#-based automation of integration, end-to-end, and API tests, as well as database operations.
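An ROI calculation of this kind typically compares the cost of manual regression passes against the cost of building and maintaining the suite. The formula and parameter names below are a simplified illustration, not a1qa's actual model.

```typescript
// Simplified automation-ROI model (illustrative, not a1qa's actual formula):
// ROI = (manual cost saved per year − automation cost) / automation cost.
function automationRoi(opts: {
  manualHoursPerRun: number; // effort of one manual regression pass
  runsPerYear: number;       // how often the pass is repeated
  hourlyRate: number;        // blended cost per engineering hour
  automationCost: number;    // cost to build and maintain the suite
}): number {
  const saved = opts.manualHoursPerRun * opts.runsPerYear * opts.hourlyRate;
  return (saved - opts.automationCost) / opts.automationCost;
}
```

For example, 40 manual hours per run, 26 runs a year at $50/hour against a $26,000 automation budget yields an ROI of 1, i.e. the investment pays back twice over.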
To support these tasks, a1qa implemented a tailored test automation solution using Playwright as the primary framework, including environment setup, with tests written in TypeScript by cross-functional QA engineers. The engineers automated scenarios following a behaviour-driven development (BDD) approach to verify the different ways users interact with the software and to make sure the system works precisely as the business defines. Serenity and Java-based automation were also part of earlier onboarding, reflecting how the client’s automation strategy evolved over time.
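The core idea of the BDD approach is mapping business-language steps to executable code. The sketch below is a deliberately minimal, dependency-free step registry; a real project would use a framework such as Cucumber together with Playwright, and the step texts here are invented examples.

```typescript
// Minimal BDD-style step registry (illustrative sketch, not the client's
// framework): business-language lines are matched to executable step functions.
type StepFn = (world: Record<string, unknown>) => void;
const steps = new Map<RegExp, StepFn>();

function defineStep(pattern: RegExp, fn: StepFn): void {
  steps.set(pattern, fn);
}

// Run a plain-text scenario by matching each line against registered steps.
function runScenario(lines: string[]): Record<string, unknown> {
  const world: Record<string, unknown> = {}; // shared scenario state
  for (const line of lines) {
    const pattern = [...steps.keys()].find((p) => p.test(line));
    if (!pattern) throw new Error(`Undefined step: ${line}`);
    steps.get(pattern)!(world);
  }
  return world;
}

// Example steps for a betting flow, expressed in business language.
defineStep(/a logged-in user/, (w) => { w.loggedIn = true; });
defineStep(/places a bet/, (w) => { w.betPlaced = w.loggedIn === true; });
```

Keeping scenarios in business language lets product owners review exactly what the automation verifies.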
To accelerate time to market and increase service agility, the engineers established a nightly benchmark build process. Each night, the CI/CD system pulls the latest code changes, spins up a fresh environment, and builds and deploys the full application to run the tests against it. Beyond ensuring a smooth delivery process, this allowed a1qa to pinpoint failing tests and raise the overall quality level.
a1qa integrated TestRail to make test management and results tracking easier. The team also implemented and upgraded the in-house Aquality Tracking system to produce clear reports with statistics and screenshots. After each build and automated test run, reports were generated and attached.
One challenge involved a browser game built by a third-party vendor. Standard locator-based UI automation was not feasible, so the team used OpenCV computer vision to identify elements in screenshots, calculate coordinates, and trigger actions such as button clicks.
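The technique behind this workaround is template matching: sliding a reference image of the element across the screenshot and picking the best-scoring position. The team used OpenCV for this; the dependency-free TypeScript sketch below shows the underlying idea with a naive sum-of-absolute-differences score, where real code would call OpenCV's `matchTemplate`.

```typescript
// Dependency-free sketch of template matching (the idea behind OpenCV's
// matchTemplate): slide the template over the screenshot and pick the
// position with the smallest sum of absolute differences (SAD).
type Gray = { width: number; height: number; data: number[] }; // 0–255 per pixel

function findTemplate(image: Gray, tpl: Gray): { x: number; y: number } {
  let best = { x: 0, y: 0, score: Infinity };
  for (let y = 0; y + tpl.height <= image.height; y++) {
    for (let x = 0; x + tpl.width <= image.width; x++) {
      let score = 0;
      for (let ty = 0; ty < tpl.height; ty++) {
        for (let tx = 0; tx < tpl.width; tx++) {
          const px = image.data[(y + ty) * image.width + (x + tx)];
          score += Math.abs(px - tpl.data[ty * tpl.width + tx]);
        }
      }
      if (score < best.score) best = { x, y, score };
    }
  }
  // The click target is the centre of the best match.
  return { x: best.x + Math.floor(tpl.width / 2), y: best.y + Math.floor(tpl.height / 2) };
}
```

Once the coordinates are known, the automation can dispatch a click at that point instead of relying on DOM locators.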
To verify that the software can withstand heavy load and remain operable in the long run, a1qa tested the performance of the SOAP-based API. On the client side, the engineers tested user registration, login, betting, web browsing, and more.
The team also verified the admin panel for managers who create events, serve end users, interact with their profiles, give out free bets, and more. For that, the QA team took the following approach.
To test each scenario, a1qa derived targets from the production environment, including transactions-per-second targets for each component. When the tests passed at those targets, the team increased the load by 50% to identify system limits and performance bottlenecks. After creating tickets for developers to fix the issues, a1qa reran the tests and validated the fixes.
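The ramp logic described above, starting from a production-derived transactions-per-second target and raising it by 50% per passing step, can be sketched as a small helper. The function name and rounding choice are illustrative assumptions.

```typescript
// Sketch of the load-target ramp: start from a production-derived TPS target
// and raise it by 50% per step to probe a component's limits.
function rampSchedule(baseTps: number, steps: number, factor = 1.5): number[] {
  const targets: number[] = [];
  let tps = baseTps;
  for (let i = 0; i < steps; i++) {
    targets.push(Math.round(tps)); // whole-number TPS targets for the tool
    tps *= factor;
  }
  return targets;
}
```

For a component handling 100 TPS in production, three steps would test it at 100, 150, and 225 TPS, stopping at whichever level first exposes a bottleneck.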
Script creation was complex because each component had unique requirements. Instead of recording user sessions, the team generated load directly through API requests and used Java-based parameterisation to meet the test design requirements.
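The parameterisation idea is to synthesise a unique, data-driven request per virtual user rather than replaying a recorded session. The team did this in Java; the sketch below shows the same pattern in TypeScript, with hypothetical field names that are not the client's actual API.

```typescript
// Sketch of data-driven (parameterised) load generation: build a unique API
// payload per virtual user instead of replaying recorded sessions.
// Field names (userId, eventId, stake) are illustrative assumptions.
type BetRequest = { userId: string; eventId: number; stake: number };

function buildBetRequests(userCount: number, eventIds: number[]): BetRequest[] {
  return Array.from({ length: userCount }, (_, i) => ({
    userId: `vu-${i + 1}`,                  // one virtual user per request
    eventId: eventIds[i % eventIds.length], // cycle through the test events
    stake: 1 + (i % 10),                    // vary stakes deterministically
  }));
}
```

A load tool then fires these payloads at the API at the target rate, so every request exercises realistic, non-identical data.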