
Programme

Developing a System for Performance Testing of Intel® Computer Vision SDK on Computer Vision and Deep Learning Workloads from Scratch

Mikhail Treskin    
Intel - Russia

Presentation abstract

This presentation addresses the problem of measuring the performance of computer vision (CV) and deep learning (DL) algorithm implementations. For two years we have been developing, from scratch, an automated system for collecting, storing, and reporting performance data on CV and DL workloads. Performance testing of Intel® CV SDK started with manually running samples and copy-pasting the results into reports, and it has evolved into an extensible automated test system capable of collecting data from tests, samples, and benchmarks of various types. Along the way, we faced many challenges, problems, and pitfalls. We would like to share our vision of, and our experience with, automated performance testing of CV and DL workloads, especially with respect to handling test data:

  • Data collecting. How to collect accurate performance data, and how to configure test systems so that they produce stable numbers (a rough timing-harness sketch follows this list). 
  • Data storing. Which database type and structure enable efficient collection, storage, and processing of performance data (see the schema sketch below). 
  • Data processing. How to represent performance data. We receive many requests for different representations of performance data, because different types of performance reports require different groupings of the data (see the grouping sketch below). We are going to present some of the results of our work.
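
The details are the subject of the talk itself; purely as an illustration of the first point, a timing harness for noisy workloads typically discards warm-up runs and summarizes repeated measurements with a robust statistic. The sketch below is a hypothetical minimal example in Python, not the speaker's actual system; the workload callable and the warm-up/run counts are placeholders.

    import statistics
    import time

    def measure(workload, warmup=5, runs=30):
        """Time a callable: discard warm-up iterations, then report the
        median and spread of the remaining samples. The median is less
        sensitive to outliers (OS jitter, frequency scaling) than a mean."""
        for _ in range(warmup):  # warm caches, drivers, lazy initialization
            workload()
        samples = []
        for _ in range(runs):
            start = time.perf_counter()
            workload()
            samples.append(time.perf_counter() - start)
        return {"median_s": statistics.median(samples),
                "stdev_s": statistics.stdev(samples)}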
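
For the second point, one common pattern (here purely hypothetical, with table and column names of my own invention) is a relational schema that separates test runs from individual measurements, so that results can later be grouped by workload, platform, or build. A minimal SQLite sketch:

    import sqlite3

    conn = sqlite3.connect("perf.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS run (
        run_id     INTEGER PRIMARY KEY,
        build      TEXT NOT NULL,   -- SDK build under test
        platform   TEXT NOT NULL,   -- host/target description
        started_at TEXT NOT NULL
    );
    CREATE TABLE IF NOT EXISTS measurement (
        run_id   INTEGER REFERENCES run(run_id),
        workload TEXT NOT NULL,     -- e.g. a CV sample or DL topology
        metric   TEXT NOT NULL,     -- e.g. 'latency_ms' or 'fps'
        value    REAL NOT NULL
    );
    CREATE INDEX IF NOT EXISTS idx_meas ON measurement(workload, metric);
    """)
    conn.commit()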
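
And for the third point, different reports need different groupings of the same raw data. A deliberately small illustration (again hypothetical, with row dictionaries shaped like the schema above):

    from collections import defaultdict

    def mean_by(rows, key):
        """Group measurement rows by an arbitrary key ('workload',
        'metric', 'platform', ...) and return the mean value per group."""
        groups = defaultdict(list)
        for row in rows:
            groups[row[key]].append(row["value"])
        return {k: sum(v) / len(v) for k, v in groups.items()}

    # Example: mean_by(rows, "workload") for a per-workload report,
    # mean_by(rows, "platform") for a per-platform report.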

Speakers information

Mikhail Treskin holds a Master's degree in radiophysics and started his career in the IT industry in 2011. He now works on validation of computer vision and deep learning products at Intel. His responsibilities include, but are not limited to, test and CI process automation, enabling and supporting performance and accuracy validation, and organizing the infrastructure for storing and presenting test results.


