eHarmony Engineering

Collaboration Yields Automation Success

March 2, 2015

At eHarmony, automation plays an integral role in our QA process. Continuous tests execute across our products, enabling us to catch regression defects early in the QA cycle while freeing up additional focus for new features and functionality. In this post, I reflect on what it takes for automation to succeed in an organization.

Brent Hohn

Vision and Plan

How many times have you heard or said, “yes, we have automation, but”? Automation is fraught with perils and pitfalls and, like anything else, fails if not done properly. Maybe it’s clunky, or complicated in a way that hinders its adoption by other testing or development groups. Bad choices in tool selection, an undefined vision, and poor framework design and implementation are all factors that can derail your test automation. However, the prime reason automation fails is a lack of collaboration between SDETs and QA engineers.

Before embarking on an automation endeavor, define a vision and create a plan. Present both to management to get the support you need. It is fairly easy to make the case for automation; just make sure management is on board and understands the commitment and resources required to make it work.

Vision: Improve quality by finding defects early, and reduce product time to market, by automating a percentage of regression tests on a specific platform (APIs, native apps, or desktop browsers).

There, that was easy!

Now create a plan. There are quite a few considerations when creating a plan, including:

  • Identify the target platform for automation: APIs, front-end desktop, native app, or mobile web.
  • Select tools.
  • Design the framework. How will it manage library dependencies, project builds, and the abstraction and encapsulation of common methods and classes? Also consider reporting requirements.
  • Decide where the automated tests will run: on a grid, locally, or in the cloud?
  • Target areas to automate. Is there a test case management system to identify candidate test cases? Start with high-priority areas: anything to do with getting into your site and getting paid, such as subscription, login, and registration.
  • Define your target audience. Who will be using the automation? Will they execute the tests via a script or Jenkins, or will the tests be triggered as part of a CI process?
  • Define project milestones: for example, 50% completion at three months, 100% at six.
  • Determine how much coverage automation will provide: Safari, Chrome, Firefox, Mobile Safari?
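To make the framework-design bullet concrete, here is a minimal page-object sketch in Python. This is only one way to approach the "abstraction and encapsulation of common methods and classes"; the `BasePage`/`LoginPage` names, the locators, and the tiny driver interface are all hypothetical stand-ins for whatever WebDriver-style client your tool selection produces.

```python
class BasePage:
    """Common behavior shared by every page object."""

    def __init__(self, driver):
        # `driver` is any WebDriver-style client exposing get/type/click
        # (a hypothetical interface used here for illustration).
        self.driver = driver

    def open(self, path):
        self.driver.get(path)


class LoginPage(BasePage):
    """Encapsulates the login flow so tests never touch raw locators."""

    PATH = "/login"              # hypothetical route
    USERNAME_FIELD = "username"  # hypothetical locator
    PASSWORD_FIELD = "password"  # hypothetical locator
    SUBMIT_BUTTON = "submit"     # hypothetical locator

    def login(self, username, password):
        self.open(self.PATH)
        self.driver.type(self.USERNAME_FIELD, username)
        self.driver.type(self.PASSWORD_FIELD, password)
        self.driver.click(self.SUBMIT_BUTTON)
```

Test cases then call `LoginPage(driver).login(user, password)`; when the login markup changes, only this class changes, not every test that logs in.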

Once a formalized plan has been developed, share it with your stakeholders, including the QA, product, and development teams. Gather feedback and concerns, and modify your plan accordingly.

Execute Your Plan

Now that your vision and plan are defined and communicated, it’s time to execute. Schedule a team planning session to discuss framework creation. The framework is the backbone that drives your tests, and is an entirely separate topic not covered in this article. At eHarmony, we follow an Agile development methodology to plan, create, and deliver automation and framework enhancements in two-week sprints.

We start with a pile of stories in a backlog and, with the assistance of QA and the Product team, identify and prioritize the regression areas. Having a test case management system with test cases already created and identified makes this quite easy, and provides the user story content needed to create the automation.

Integrating Automation into the QA Process

Collaboration between SDETs and QA engineers is absolutely essential for automation to succeed. Once the framework is complete and a fair number of automated test cases exist, it’s time to integrate automation into the test plan. This is not the time to throw it over to the QA team and say “have at it”. Instead, provide demos to your target audience: show them what has been built, cover general usage of the test harness, and give a high-level view of how it works and its new role in the test plan.

Pick a sprint or a project to pilot the automation. Work with the QA engineer(s) responsible for executing the test plan, and let them know you will be running automation against the builds they are testing and providing them with pass/fail results for the areas under automation.
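As a sketch of what those per-build pass/fail results might look like, here is a hypothetical summary helper; the function name and the input shape (test name mapped to "pass"/"fail") are invented for illustration, not eHarmony's actual reporting.

```python
def summarize(results):
    """Return a one-line report a QA engineer can scan per build.

    `results` maps each automated test name to "pass" or "fail"
    (a hypothetical shape chosen for this sketch).
    """
    failed = sorted(name for name, status in results.items() if status == "fail")
    passed = len(results) - len(failed)
    report = f"{passed}/{len(results)} passed"
    if failed:
        report += "; failing: " + ", ".join(failed)
    return report
```

For example, `summarize({"login": "pass", "registration": "fail", "subscription": "pass"})` returns `"2/3 passed; failing: registration"`, which is enough for the pilot team to know where to look.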

This is a critical stage for automation’s success; we begin to see its role defined as it takes a foothold in the test plan. Furthermore, the collaboration between QA and SDETs accomplishes the key objectives: confidence and trust. QA engineers and SDETs gain confidence in the automation as they begin to see results, especially when regression defects are found. Trust builds as QA sees that the SDETs are fully invested in the testing process, and the SDETs see that QA is relying on automation results.

Ultimately, this collaboration creates a critical feedback loop between SDETs and QA that identifies gaps in automation coverage, and it generates more interest in automation, expanding automation’s role in the test plan.

Share Your Results

At the end of each sprint, it’s imperative to share your results with your stakeholders, showing the impact automation has on the project. Some key metrics to share are:

  • Percentage of tests that are automated
  • Amount of regression testing time saved with the automation
  • Defects found or missed by automation (be honest)
  • Areas where automation can be improved
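The first two metrics are simple arithmetic. A minimal sketch, with invented sample numbers standing in for your real test counts:

```python
def automation_metrics(automated, total, manual_minutes_per_test):
    """Percent of regression tests automated, and a rough estimate of
    manual testing minutes saved per regression cycle."""
    percent_automated = 100.0 * automated / total
    minutes_saved = automated * manual_minutes_per_test
    return percent_automated, minutes_saved

# Sample numbers only: 120 of 300 regression tests automated,
# ~10 minutes of manual effort per test.
pct, saved = automation_metrics(automated=120, total=300, manual_minutes_per_test=10)
print(f"{pct:.0f}% of regression automated, ~{saved / 60:.0f} hours saved per cycle")
```

Crude as it is, an estimate like "40% automated, roughly 20 hours saved per cycle" is the kind of concrete figure stakeholders remember.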

In conclusion, automation can be a critical tool for finding regression defects early in the QA cycle while increasing velocity and shortening product time to market. It can help improve quality by affording QA engineers more time for boundary, negative, and edge-case testing on new features and components.

Creating a vision and a plan helps you navigate to the end goal, but ultimately automation success hinges on collaboration between SDETs, automation engineers, and QA.