
Testing Ten Applications at the Same Time

“I’m running out of time and I haven’t been able to test yet.” We’ve all been there. In our case, at one point we needed to test ten applications simultaneously: a few of them were about to go into production while others were already live. How could we deliver on time while ensuring quality? Could we meet the client’s expectations with such a small automation team? To answer those questions, we set out to check the status of the team, examine the technologies in use and review our work methodologies.

Scanning the Applications  

First, we analyzed the applications and discovered that some UI components were common to all of them. Such components were built in React Redux, and since they were very dynamic, they could be activated and disabled intermittently in the Document Object Model (DOM), something that represented a challenge for automation. One thing is to automate an object that’s going to remain stable the whole time and quite another is to automate an object that’s dynamic due to the nature and technology of the application. 
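
To give a sense of the problem, here is a minimal sketch, in the C#/Selenium stack described later, of how a component that React may detach and re-attach during a re-render can be waited for. The locator strategy and timeout are illustrative assumptions, not the actual application code:

```csharp
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;

public static class DynamicElements
{
    // Re-resolves the element on every polling attempt, so it tolerates
    // React re-renders that detach and re-attach the node in the DOM.
    public static IWebElement WaitForComponent(IWebDriver driver, By locator, int timeoutSeconds = 10)
    {
        var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(timeoutSeconds));
        wait.IgnoreExceptionTypes(typeof(StaleElementReferenceException));

        return wait.Until(d =>
        {
            var element = d.FindElement(locator);        // keeps polling while the node doesn't exist yet
            return element.Displayed ? element : null;   // and until it is actually visible
        });
    }
}
```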

All of the applications were domain specific, that is, each one handled one specific task. When a new application appears in the market, it’s essential to understand its business logic rather than its style, because they can all look more or less the same in that respect.

Analyzing Manual Testing 

Then, we analyzed manual testing to observe where the bottlenecks appeared.

Due to continuous data mutation, the client required us to run daily checks, which consisted of testing different features of the applications every morning through smoke tests and long, open-ended regression tests that took a lot of time and demanded a lot of effort. How can you do that if you haven’t automated testing, or haven’t been able to automate what you need?

Part of the testing involved evaluating whether new documents were regularly indexed into the system (which we referred to as Documents Count) and checking whether the user experience and usability were maintained (a process we called User Experience Monitoring).
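
As an illustration, a daily Documents Count check can be expressed as a short NUnit test that queries the indexing service through RestSharp. This is a sketch only: the endpoint, the response format and the assertion are assumptions, and it uses the classic (v106-style) RestSharp API.

```csharp
using NUnit.Framework;
using RestSharp;

[TestFixture]
public class DailyChecks
{
    // Hypothetical indexing-service endpoint, for illustration only.
    private const string BaseUrl = "https://index-service.example.com/api";

    [Test]
    public void DocumentsCount_NewDocumentsWereIndexed()
    {
        var client = new RestClient(BaseUrl);
        var request = new RestRequest("documents/count", Method.GET);

        var response = client.Execute(request);

        Assert.IsTrue(response.IsSuccessful, "The indexing service did not respond successfully.");

        // Assumes the endpoint returns a plain integer body with today's document count.
        var documentsToday = int.Parse(response.Content);
        Assert.Greater(documentsToday, 0, "No new documents were indexed today.");
    }
}
```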

Checking the Technology in Use 

  • .NET Framework: the solution was created in .NET and written in C#, with NUnit as the testing library.  
  • Page Object Model (POM) in Selenium for the UI and RestSharp for the Web API, using Page Factory to implement the POM (see the sketch after this list).  
  • Jenkins: to organize the jobs in charge of running the different automation processes and creating the HTML reports to be sent by email.  
  • Amazon Web Services: with EC2 instances to run tests. 
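
For reference, a page object in this stack could look roughly like the Page Factory sketch below. The page and locators are invented for illustration; in recent Selenium .NET versions the PageFactory and [FindsBy] attribute live in the DotNetSeleniumExtras package.

```csharp
using OpenQA.Selenium;
using SeleniumExtras.PageObjects;   // PageFactory and [FindsBy] for .NET live here in recent versions

public class LoginPage
{
    // Illustrative locators; the real pages used application-specific selectors.
    [FindsBy(How = How.Id, Using = "username")]
    private IWebElement UserNameInput { get; set; }

    [FindsBy(How = How.Id, Using = "password")]
    private IWebElement PasswordInput { get; set; }

    [FindsBy(How = How.CssSelector, Using = "button[type='submit']")]
    private IWebElement SubmitButton { get; set; }

    public LoginPage(IWebDriver driver)
    {
        // Wires up proxies for the [FindsBy] members when the page object is created.
        PageFactory.InitElements(driver, this);
    }

    public void LogIn(string user, string password)
    {
        UserNameInput.SendKeys(user);
        PasswordInput.SendKeys(password);
        SubmitButton.Click();
    }
}
```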

Setting Goals 

Based on all that, we were able to set the following testing goals: 

For the team: 

  • Identify and reduce manual testing bottlenecks.  
  • Automate daily checks, smoke and regression tests, and administrative tasks.  
  • Adopt the continuous delivery/continuous integration methodology by creating a centralized dashboard connected to Jenkins pipelines and QA processes.  

For the client:  

Generate clear reports showing the coverage and status of the solution.  

Demonstrate that the processes being followed are adequate.  

Establishing a Roadmap 

Afterwards, we set a roadmap to meet those goals: 

  • We refactored the Page Factory page objects into a dynamic POM. Page Factory gives you better control over element instances but, if a component changes in the DOM after initialization, the instance has to be refreshed. With a dynamic POM, instead, we delegate control of the instances to the elements themselves: each one is resolved through a lambda expression at the moment it’s used, which greatly reduces stale-element exceptions (see the sketch after this list).   
  • We implemented tools like Allure, which provides everything we need to create quality reports: history, reproduction steps, screenshots, videos, coverage templates, timelines, etc. 
  • We set up a system of clusters to run a large number of tests in less time. We started using Selenoid, which runs browser- and mobile-based tests inside Docker containers and has a lot of scalability potential. By contrast, Selenium Grid can be less stable, needs more maintenance, is harder to autoscale and doesn’t perform as well. 
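
To make the dynamic POM idea concrete, here is a minimal sketch of what such a page object can look like: each element is exposed as a lambda that re-resolves its locator on every access, instead of holding a reference captured at initialization. The page, locators and timeout are illustrative assumptions.

```csharp
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;

public class SearchPage
{
    private readonly IWebDriver _driver;

    public SearchPage(IWebDriver driver) => _driver = driver;

    // Each element is a lambda evaluated on every access, so the lookup
    // always reflects the current state of the DOM.
    private Func<IWebElement> SearchBox => () => Find(By.CssSelector("input[name='q']"));
    private Func<IWebElement> SearchButton => () => Find(By.CssSelector("button.search"));

    public void Search(string term)
    {
        SearchBox().SendKeys(term);
        SearchButton().Click();
    }

    // Small retry helper that ignores stale-element errors while polling.
    private IWebElement Find(By locator)
    {
        var wait = new WebDriverWait(_driver, TimeSpan.FromSeconds(10));
        wait.IgnoreExceptionTypes(typeof(StaleElementReferenceException));
        return wait.Until(d => d.FindElement(locator));
    }
}
```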
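
And here is a sketch of how tests can be pointed at a Selenoid hub, using the Selenium 4 .NET remote driver API; the hub URL and capability values are assumptions for illustration.

```csharp
using System;
using System.Collections.Generic;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;
using OpenQA.Selenium.Remote;

public static class SelenoidDriverFactory
{
    // Assumed hub address; in practice this would come from configuration.
    private const string SelenoidUrl = "http://localhost:4444/wd/hub";

    public static IWebDriver Create()
    {
        var options = new ChromeOptions();

        // Selenoid reads its container-level settings (VNC, video recording, etc.)
        // from the vendor-prefixed "selenoid:options" capability.
        options.AddAdditionalOption("selenoid:options", new Dictionary<string, object>
        {
            ["enableVNC"] = true,
            ["enableVideo"] = false
        });

        // Each session gets a fresh browser container scheduled by Selenoid.
        return new RemoteWebDriver(new Uri(SelenoidUrl), options);
    }
}
```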

We established clear frameworks, with a custom dashboard centralizing every report (one per application) and a versioned progression chart from which jobs could be executed by category. We also migrated to Kanban for more flexibility.  

To sum up, our technology stack was .NET and C# with NUnit, Selenium with a dynamic POM, RestSharp, Allure, Selenoid with Docker, Jenkins, and AWS EC2 instances. 

Return on Investment 

You may be wondering about the results. In the past two years we’ve seen a 75% increase in the performance of the solution, and regression testing went from 3 hours to 1 hour per application, with a total run time of 3 hours for all 10 applications (using 10 containers), compared to the 12 hours it took before. 

Insights 

Many clients want to solve issues quickly by resorting to paid products or services. One of the strong points of paid options is the vendor support they offer, while open source tools and technologies rely on community support, which can be a deal breaker when it comes to choosing a solution. Except for Amazon Web Services, our entire solution was built with open source libraries, tools and technologies.   

During this process we learned: 

  • How to automate smart: There’s no need to automate everything, only the processes that help increase the team’s bandwidth.  
  • How to think big: Running 1 million tests simultaneously? Yeah, why not? Trial and error are essential if we want to learn how to create scalable solutions. 
  • Developers are our friends: When you work side by side with developers, interaction between teams can only improve. 

What’s your experience automating simultaneous tests?  What technologies did you use? What did you learn? We look forward to reading your comments. 

Facundo Alarcón

Facundo Alarcón has been QA Automation Lead at intive since November 2017. A Systems Engineering student at the Universidad de la Marina Mercante, Facundo is also a professional webmaster certified by the Universidad Tecnológica Nacional (UTN). A gamer from a young age (“I’ve liked video games for as long as I can remember”) and a student of martial arts (wing tsun and fencing), he recently began learning to make sushi. A whiskey fan, he is co-founder of the office’s whiskey “after office” brigade.

Sebastián Zapata

Sebastián Zapata has worked on the QA Automation team at intive since March 2018. He has 2 years of experience with QA automation, Selenium, ChromeDriver and end-to-end testing. He defines himself as a lover of technology and software development. Seba specializes in programming languages and software architecture for automation testing. His hobbies are video games and motorcycles. 
