
2020 Testing predictions

Antoine Aymer, Sogeti’s Global Strategic Portfolio Director for Testing, offers his top predictions for the world of Testing in 2020.

Antoine is a passionate, market-driven technologist and researcher. He has been exploring how brands are transforming their business to create new, personal interactions. His research aims to help mobile teams solve the triple quality-time-cost constraint through innovative testing strategies, an equation that the development of mobile apps makes even more subtle.

1. Model-based testing (MBT) drives efficiency

The quality of testing depends significantly on the ability to understand the required outcomes from the outset. Currently, too much time is spent using test management tools as the starting point. The result? You end up with far more test cases than required, with lots of overlap and redundancy. This inflates regression testing and makes it hard for teams to prioritize. Our industry is reconsidering model-based testing, in which the full, requirements-led flow of user stories is modelled up front. It allows even "de-skilled" automation to create rigorous tests for each code commit, executed in the same iteration. To be valuable, model-based testing will need to serve as the one-stop shop for maintaining test cases, scripts, data and virtual data. Expect to see more of MBT in 2020.
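To make the idea concrete, here is a minimal sketch of the principle behind model-based test generation: a user journey is described once as a model, and executable test cases are derived from it rather than written one by one. The checkout-flow steps and the path-enumeration approach below are purely illustrative assumptions, not the method of any particular MBT tool.

```python
# A minimal sketch of deriving test cases from a model, assuming a checkout
# journey described as a directed graph of steps. Step names are invented.

# The model: each step maps to the steps reachable from it; an empty list
# marks a terminal step.
CHECKOUT_MODEL = {
    "open_app": ["login", "browse_as_guest"],
    "login": ["browse"],
    "browse_as_guest": ["browse"],
    "browse": ["add_to_cart"],
    "add_to_cart": ["checkout"],
    "checkout": ["pay_by_card", "pay_by_wallet"],
    "pay_by_card": ["confirmation"],
    "pay_by_wallet": ["confirmation"],
    "confirmation": [],
}

def generate_paths(model, step, path=()):
    """Enumerate every path from `step` to a terminal step of the model."""
    path = path + (step,)
    if not model[step]:          # terminal step: the path is one test case
        yield path
        return
    for nxt in model[step]:
        yield from generate_paths(model, nxt, path)

if __name__ == "__main__":
    for number, steps in enumerate(generate_paths(CHECKOUT_MODEL, "open_app"), 1):
        # Each end-to-end path through the model becomes one test case.
        print(f"TC-{number:02d}: " + " -> ".join(steps))
```

Because the model, not the individual scripts, is the single source of truth, adding or changing one step regenerates every affected test case, which is where the maintenance saving comes from.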

2. Rise of synthetic data

Many teams waste too much time creating test data, searching through large copies of production data, and manually creating missing data combinations. Even then, there are not enough data sets for parallel testing and execution, and the available data often proves invalid, which destabilizes test automation. Add to this the very real risk of regulatory non-compliance around data, and the exposure to heavy fines that follows. To address these challenges and enable more rigorous testing, I believe companies will look at the full lifecycle of test data, not just the provisioning step, more seriously. More specifically, with artificial intelligence now ready to be applied to this domain, 2020 should be the golden year of synthetic data generation: data that preserves all the characteristics of 'real' data while meeting data privacy and security needs.
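As a rough illustration of the idea, the sketch below builds fake customer records from aggregate statistics rather than from real rows. The field names and profiled values are invented assumptions, and real tooling would fit far richer distributions (often with ML models), but the principle is the same: the synthetic data keeps the shape of production data without containing any of it.

```python
# A minimal sketch of synthetic test data generation, assuming a customer
# table whose statistical shape we want to preserve. Field names and the
# profiled values are illustrative, not taken from any real system.
import random
import statistics
import uuid

# Pretend these aggregates were profiled from production; no real records
# ever leave the production environment.
PROFILE = {
    "age_mean": 41.5,
    "age_stdev": 12.3,
    "countries": {"FR": 0.5, "DE": 0.3, "NL": 0.2},  # observed frequencies
}

def synthetic_customer(profile):
    """Build one customer record that is statistically plausible but fake."""
    return {
        "customer_id": str(uuid.uuid4()),                     # no real IDs
        "age": max(18, round(random.gauss(profile["age_mean"],
                                          profile["age_stdev"]))),
        "country": random.choices(list(profile["countries"]),
                                  weights=list(profile["countries"].values()))[0],
        "email": f"user{random.randint(1, 10**6)}@example.test",
    }

if __name__ == "__main__":
    batch = [synthetic_customer(PROFILE) for _ in range(5)]
    for record in batch:
        print(record)
    # Quick sanity check: synthetic ages should track the profiled mean.
    print("mean age:", statistics.mean(r["age"] for r in batch))
```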

3. Stronger focus on security

While security is not currently a central focus for Agile and DevOps teams, I predict developers will take on increased responsibility for this constantly evolving topic. Given the perceived importance of security, it may seem surprising that this shift into the mainstream has not gathered momentum before now. Developers will not become security experts, but they will gain the fundamental skills to become instinctively more secure coders. It is telling that this year's World Quality Report urges testing teams to factor in security and security testing at the earliest stages of the design cycle, making it an integral part of development.
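One way this shows up in practice is security assertions living in the same unit-test suite as functional checks. The sketch below is a hypothetical pytest example, not taken from the World Quality Report or any specific project: it asserts that untrusted input is always passed as a query parameter, so a classic injection payload is treated as data rather than as SQL.

```python
# A minimal sketch of a security check pulled into the earliest test stage.
# `fetch_user` and the table layout are invented for illustration.
import sqlite3
import pytest

def fetch_user(conn, username):
    # Parameterized query: the driver, not string formatting, handles input.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()

@pytest.fixture
def conn():
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    db.execute("INSERT INTO users (name) VALUES ('alice')")
    yield db
    db.close()

def test_injection_payload_returns_nothing(conn):
    # A classic injection string must not widen the query.
    assert fetch_user(conn, "' OR '1'='1") == []

def test_legitimate_user_is_found(conn):
    assert fetch_user(conn, "alice") == [(1, "alice")]
```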

4. Analytics delivers rich information

To maximize the accuracy and efficiency of our quality validation process, I expect a stronger reliance on the depth of production information, and more specifically on embedded libraries and other types of analytics. We will rely more on the richness of the events and data analytics they provide to increase visibility throughout the lifecycle. Metadata on end users' context and settings will be used to create more realistic automated scripts and to ensure crashes can be easily reproduced, reducing the mean time to fix. In the coming years, I anticipate (and hope) that real user journeys will be correlated with the flows designed through model-based testing, increasing the accuracy of the test cases.
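A small, hypothetical sketch of what this feedback loop can look like: crash events exported from analytics carry the end user's device, OS and locale, and a parameterized test re-creates that exact context before replaying the failing step. The event records and the `configure_session` helper below are invented for illustration, not the API of any particular analytics or automation tool.

```python
# A minimal sketch of feeding production analytics back into test design.
# The crash-event records are invented; in practice they would come from
# whatever crash-reporting or analytics export the team already uses.
import pytest

# Context metadata captured alongside each crash: device, OS, locale, and
# the last user action before the failure.
CRASH_EVENTS = [
    {"device": "Pixel 4", "os": "Android 10", "locale": "fr_FR",
     "last_action": "open_cart"},
    {"device": "iPhone 8", "os": "iOS 13.3", "locale": "de_DE",
     "last_action": "apply_coupon"},
]

def configure_session(device, os, locale):
    """Stand-in for pointing an automation framework at a given context."""
    return {"device": device, "os": os, "locale": locale}

@pytest.mark.parametrize("event", CRASH_EVENTS,
                         ids=lambda e: f"{e['device']}-{e['last_action']}")
def test_crash_context_is_reproducible(event):
    # Re-create the exact end-user context before replaying the failing step.
    session = configure_session(event["device"], event["os"], event["locale"])
    assert session["locale"] == event["locale"]
    # In a real suite, the replay of event["last_action"] would go here.
```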


Antoine Aymer
Strategic Portfolio Director at Sogeti
+33767793048