Behind the scenes: How quality assurance drives the DPI technology of ipoque


By Thomas Meissner
Published on: 26.06.2023

Quality assurance (QA) is an all-important tenet of any deep packet inspection (DPI) solution. After all, DPI technology is often tasked with monitoring critical gateways that carry traffic across network borders. Any shortfall in DPI’s ability to identify the underlying packets as they stream through the network can have a disastrous impact on the network and erode DPI’s effectiveness in monitoring and safeguarding network assets.

At ipoque, we take pride in our high QA standards. We follow an ethos that puts quality assurance at the center of all our efforts to create each new version of our DPI software. Both R&S®PACE 2 and its VPP-based counterpart, R&S®vPACE, are testaments to this, with a string of technological breakthroughs ensuring our software delivers unparalleled accuracy and reliability.

The race against time

Why is QA so important for DPI technology? Unlike most static technologies, DPI software must respond instantly to the constant change in today’s networking environment; otherwise product quality deteriorates. As networks evolve to include different IP connectivity stacks, as devices proliferate and as new architectures, such as cloud-native ones, become more prevalent, visibility into traffic flows becomes ever more crucial. New applications are introduced every day, leaving network administrators clueless about the packets that traverse their networks. New forms of cyberattacks are unleashed every week, resulting in many unknown threats lurking in the network.

Most networking technologies take weeks, if not months, to respond to these changes. DPI software, however, does not have that privilege. The slightest change in traffic signatures must be reflected quickly so that networks remain traffic-aware. At ipoque, we ensure the fastest turnaround times, with signature updates applied every week. We achieve this by having dedicated expert teams who continuously gather and analyze global traffic flows to identify even the most minute changes in signatures. These findings are thoroughly investigated, implemented, tested and eventually deployed in customer networks. This weekly cycle is repeated continuously to create the most comprehensive and up-to-date signature repository, ensuring the highest level of classification accuracy with virtually zero false positives and false negatives.

The rigor

Underpinning each cycle of signature updates is ipoque’s comprehensive QA methodology, which comprises two phases. The first phase leverages our capacity to generate real network traffic in a controlled environment, a process carried out in our test labs by highly experienced experts. We currently create up to 1,000 traces each week and analyze these traffic captures to determine the latest classification updates. We emulate different networks, such as radio access networks, to re-create realistic traffic scenarios that help us understand how network characteristics influence application behavior.
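To give an idea of what a first-pass analysis of such a capture can look like, the sketch below reads a trace file and condenses it into per-flow packet and byte counts, the kind of summary from which a change in an application’s traffic pattern would become visible. It is a minimal illustration using the open-source scapy library and a hypothetical file name, not the actual tooling used in our labs.

```python
# Minimal sketch: summarize a captured trace into per-flow statistics.
# Assumes scapy is installed; "example_trace.pcap" is a hypothetical file name.
from collections import defaultdict
from scapy.all import rdpcap, IP, TCP, UDP

def flow_key(pkt):
    """Return a (src, dst, sport, dport, proto) tuple for TCP/UDP packets, else None."""
    if not pkt.haslayer(IP):
        return None
    ip = pkt[IP]
    if pkt.haslayer(TCP):
        l4, proto = pkt[TCP], "TCP"
    elif pkt.haslayer(UDP):
        l4, proto = pkt[UDP], "UDP"
    else:
        return None
    return (ip.src, ip.dst, l4.sport, l4.dport, proto)

def summarize(pcap_path):
    """Aggregate packet and byte counts per flow found in the capture file."""
    stats = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for pkt in rdpcap(pcap_path):
        key = flow_key(pkt)
        if key is None:
            continue
        stats[key]["packets"] += 1
        stats[key]["bytes"] += len(pkt)
    return stats

if __name__ == "__main__":
    for key, s in summarize("example_trace.pcap").items():
        print(key, s["packets"], "packets", s["bytes"], "bytes")
```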

As part of our signature update exercise, we track thousands of protocols, applications and service types, focusing on what is most relevant for our customers. Mobile traffic, in particular, undergoes higher scrutiny due to the relentless stream of new applications and version updates. We address this with our Mobile Automation Framework (MAF), which auto-generates traffic for a selected set of 150 high-priority applications, whose daily updates trigger test runs around the clock. Our test team has produced more than 500,000 test traces since 2012, half of them in the last four years. We also take into account the special attributes of regional traffic with MAF nodes distributed across key regions such as the Americas, Europe, Africa and Asia. These nodes are connected through secure networking infrastructure from Rohde & Schwarz to create region-specific data and transfer it back into our central trace management system in Europe. Given our advancements in test automation, we also expect a strong need for clean, controlled yet realistic machine learning (ML) training data in the coming years, especially as more traffic becomes encrypted or obfuscated.
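The sketch below illustrates, in heavily simplified form, how update-triggered test runs of this kind could be orchestrated: a scheduler compares the currently published version of each tracked application with the version last tested and enqueues a capture job whenever they differ. The application list, the version lookup and the job queue are hypothetical stand-ins, not ipoque’s actual MAF implementation.

```python
# Simplified sketch of update-triggered test scheduling (hypothetical, not the actual MAF).
import time

# Hypothetical set of tracked applications and the versions last tested.
TRACKED_APPS = ["app_a", "app_b", "app_c"]
last_tested = {app: None for app in TRACKED_APPS}

def fetch_published_version(app):
    """Stand-in for querying an app store or update feed for the current version."""
    return "1.0.0"  # placeholder value

def enqueue_capture_job(app, version):
    """Stand-in for scheduling a traffic-capture run on a regional node."""
    print(f"scheduling capture for {app} version {version}")

def poll_once():
    """One polling pass: trigger a test run for every app whose version changed."""
    for app in TRACKED_APPS:
        version = fetch_published_version(app)
        if version != last_tested[app]:
            enqueue_capture_job(app, version)
            last_tested[app] = version

if __name__ == "__main__":
    # In production such a loop would run continuously; here we poll a few times.
    for _ in range(3):
        poll_once()
        time.sleep(1)
```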

Prevention better than cure

The second phase of our QA process focuses on delivering a fully updated classification portfolio every week. To meet such stringency, we have implemented built-in QA based on prevention rather than cure, systematically minimizing the incidence of bugs and code errors. As part of this commitment, we have integrated automated regression testing into our continuous integration/continuous deployment (CI/CD) pipelines. This enables 100% classification test coverage for each code merge and allows us to focus on specific anomalies. As a result, we are able to process 3.5 terabytes of test material up to 100 times per week, with millions of test results analyzed automatically (see Picture 1 for an example). Our built-in QA machines utilize distinct sets of test material curated to reflect application behavior in different environments. This gives every DPI engineer the capacity to focus on detected anomalies.
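Conceptually, such a regression check boils down to classifying every curated trace with the newly built library and comparing the result against the expected outcome recorded for that trace, so that only deviations need human attention. The sketch below shows this comparison in its simplest form; the classifier call and the expectations file are hypothetical placeholders for the real pipeline.

```python
# Minimal sketch of a classification regression check (hypothetical file names and classifier).
import json

def classify_trace(trace_path):
    """Placeholder for running the DPI library under test on a single trace."""
    return {"application": "unknown", "bytes": 0}

def run_regression(expectations_path):
    """Compare fresh classification results with stored expectations and collect anomalies."""
    with open(expectations_path) as f:
        # e.g. {"traces/imo_001.pcap": {"application": "IMO", "bytes": 52310}, ...}
        expected = json.load(f)

    anomalies = []
    for trace_path, exp in expected.items():
        result = classify_trace(trace_path)
        if result != exp:
            anomalies.append((trace_path, exp, result))
    return anomalies

if __name__ == "__main__":
    for trace, exp, got in run_regression("expected_results.json"):
        print(f"ANOMALY {trace}: expected {exp}, got {got}")
```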

Picture 1: Improvement of the IMO application classification. Detected byte sizes changed and our test tool instantly reported this back to development.

Assurance for all environments

Aware of the computing intensity involved in running DPI products, our QA infrastructure also performs extensive memory alignment tests. This guarantees a minimal memory footprint, making sure our solutions stay lightweight and compatible with any environment. Additionally, we run simulations of scenarios involving quality of service (QoS) degradation and malicious traffic to improve the stability and robustness of our DPI software. These robustness tests take place in the background, around the clock.
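As a drastically simplified illustration of a memory-budget check, the sketch below runs a workload under Python’s tracemalloc and compares its peak allocation against a fixed budget. The workload function and the budget value are hypothetical, and a real footprint test for a native DPI library would measure the library’s own allocations rather than Python objects.

```python
# Simplified memory-budget check (hypothetical workload and budget; Python-level allocations only).
import tracemalloc

MEMORY_BUDGET_BYTES = 50 * 1024 * 1024  # hypothetical 50 MB budget

def process_traces():
    """Stand-in for feeding a batch of test traces through the library under test."""
    return [bytes(1024) for _ in range(1000)]  # allocate roughly 1 MB as a dummy workload

def check_memory_footprint():
    """Run the workload and assert that its peak memory usage stays within budget."""
    tracemalloc.start()
    process_traces()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    assert peak <= MEMORY_BUDGET_BYTES, f"peak usage {peak} bytes exceeds budget"
    return peak

if __name__ == "__main__":
    print("peak memory during run:", check_memory_footprint(), "bytes")
```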

All these enhancements naturally set our DPI technology apart from competitors and from other in-house and open-source DPI software. We believe that without such commitment to quality, networks will grapple endlessly with new waves of applications and cyber threats, challenged by outdated intelligence and many visibility gaps.

Tackling tomorrow’s visibility needs today

The very nature of DPI software necessitates a product strategy that is strongly focused on the visibility needs of the future. We expect applications to become more complex, with entirely new attributes and behaviors. The use of encryption and obfuscation will also increase, creating the need for other advanced inspection techniques, including machine learning and deep learning. DPI solutions must also adapt to the new platforms on which they are released, with compatibility across new software stacks and network architectures.

We have already been able to enhance our classification portfolio with machine learning models, and this share is expected to keep growing. Ensuring the software quality of an ML-based DPI library will, in turn, reshape our software development processes and our software quality assurance methods. Despite the latest achievements in AI applications, their reliability will remain a challenge for the foreseeable future. As a test department, we also have to feed the hunger of our machine learning models for training data: for every single trace needed as the basis for a traditional classification implementation, we will need roughly 1,000 labeled traces as training data to implement a new classification based on machine learning. This places considerable infrastructure responsibilities on our QA team, and we are not only managing them already but are also prepared for an ML-driven future.
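To make the scale of that ratio concrete, the short calculation below works through a hypothetical example. Only the 1:1,000 ratio comes from the paragraph above; the number of new ML-based classifications per update cycle is an illustrative assumption.

```python
# Back-of-the-envelope estimate of ML training-data volume (illustrative numbers only).
TRACES_PER_CLASSIFICATION_TRADITIONAL = 1     # one reference trace per classical signature
LABELED_TRACES_PER_CLASSIFICATION_ML = 1000   # ratio stated in the text above
NEW_ML_CLASSIFICATIONS_PER_CYCLE = 20         # hypothetical assumption

traditional_volume = NEW_ML_CLASSIFICATIONS_PER_CYCLE * TRACES_PER_CLASSIFICATION_TRADITIONAL
ml_volume = NEW_ML_CLASSIFICATIONS_PER_CYCLE * LABELED_TRACES_PER_CLASSIFICATION_ML

print(f"traditional approach: {traditional_volume} traces per cycle")
print(f"ML-based approach: {ml_volume} labeled traces per cycle")
```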

At ipoque, we envisage a fast-changing future where nothing is constant except change. Recognizing the demands of a world driven by digital connectivity, we have perfected our QA with unrivalled testing capacity and unmatched speeds at which we update our classification portfolio. This, coupled with the rigor of our QA processes, is the reason why ipoque remains the partner of choice for our customers and why our DPI engines, R&S®PACE 2 and R&S®vPACE, are always a step ahead of the game.


Thomas Meissner

Contact me on LinkedIn

Thomas is Head of Quality Assurance at ipoque and holds a degree in sociology. With over 10 years of experience in test management, Thomas is responsible for the software quality assurance and continuous improvement of the deep packet inspection product series R&S®PACE 2 and R&S®vPACE at ipoque. As a QA advocate, his mission is to create a collaborative environment between testers, developers, and operations to provide a quality assessment for everyone as fast as possible. When he’s not breaking any computer systems, he likes to escape into his library.

