Derk-Jan de Grood, who works for Valori as senior test manager and agile transition coach, will open this track with a presentation entitled "Grip on Your Test Maturity Using the Ambition Chart". Agile test teams assess their effectiveness every sprint. The power of retrospectives, when done well, is that they focus on improving in small, achievable steps. This ensures that progress is actually made rather than remaining a faraway dream. In this presentation he will explain how to create and use an ambition chart, describe situations in which it can be beneficial, and share some examples of focus areas.

Niv Segev will give the second presentation. Drawing on his own experience as testing manager at Telefónica Digital, he will discuss how his team tested a telco app used by hundreds of millions of people around the world: an app serving multiple distributed geographies (influenced by locale settings, culture, and third-party infrastructure) across a seemingly never-ending list of platforms (web, desktop, iOS, Android, and Windows Phone), taking into account all relevant hardware models and OS versions, all with just 14 QA engineers.

Avi Babani will be the third speaker of this track. He will talk about Intel Active Management Technology (AMT), a hardware and firmware technology for remote out-of-band management of personal computers that allows them to be monitored, maintained, updated, upgraded, and repaired. For many years, testers working on AMT were each given their own CPU board to use in testing. The board sat on their personal lab bench, and engineers got used to having their "own" hardware. Since these development boards are expensive, his team looked for ways to share the hardware between engineers.

And last, Viviane Lyrio and Higgor Valença will share their experience with using user experience (UX) as an indicator for assessing performance risks. Due to constant pressure to release features as soon as possible, many tests are deprioritized because the features are considered simple. However, these features can have non-functional requirements that affect the user's impression of the system. Their paper proposes an approach to performance testing that uses UX as an indicator of what should be prioritized.