AnalyticsOps & Automated Dashboard Tests

Marc Polizzi
July 30, 2024

Businesses are experiencing perpetual growth in complexity, underscoring the crucial necessity to meticulously analyze and comprehend their current operational landscape. In this milieu, operational decisions are no longer solely reliant on raw datasets; instead, they are propelled by the insights gleaned from dashboards, analytics, key performance indicators (KPIs), and charts. Consequently, the imperative for decision-makers lies in placing unwavering trust in the accuracy and reliability of these analytics and dashboards.

This confidence can be established through Analytics governance, also known as AnalyticsOps, which encompasses a suite of operational and administrative procedures, tools, and tasks including automated testing, content lifecycle management, and continuous monitoring.

AnalyticsOps in 2024

In 2021, David Menninger from Ventana Research, in his paper “Analytics Ops: The Last Mile of Data Ops”, predicted that:

By 2024, one-third of organizations will adopt an analytic operations approach similar to, and integrated with, their data operations processes to enhance responsiveness and agility.

Contrary to that prediction, AnalyticsOps does not yet seem to have attained the same level of robust development and widespread adoption as DevOps and DataOps. This is possibly because AnalyticsOps necessitates a broader approach than DevOps and DataOps. Nonetheless, there is no justification for delaying the implementation of automated testing for analytics.

Certainly, testing lies at the core of AnalyticsOps and is pivotal for achieving agility. Once implemented, it enables the automated validation of dashboards, reports, and other Business Intelligence (BI) content for accuracy, user experience, security/authorization, performance, and regressions. This results in fewer errors and fosters greater trust in the analytics infrastructure.

Analytics and Dashboard Tests

Testing is crucial for the success of any software development project, including Analytics and Dashboard projects. It validates the accuracy and reliability of the Dashboards and Analytics tools, which are essential for decision-makers who rely on them for their daily operations.

A schematic view of how testing works

Identifying and addressing issues as early as possible is significantly more efficient than allowing problems to reach production and impact customers.

Testing Challenges

Testing a dashboard is inherently complex due to the need to validate various visualizations such as charts, tables, graphs, and images. Nowadays, the widespread embedding of analytics and dashboards into existing applications introduces additional layers of potential issues. And given that these issues can manifest at both the technical and business levels, finding individuals proficient in both aspects becomes increasingly challenging.

To illustrate these challenges, here is a non-exhaustive list of potential sources of issues, which helps identify the tests required to detect them:

  • Rendering Glitch: Visual discrepancies, though they may not hinder functional usability, can be vexing and detract from the perceived quality of dashboards. These may manifest as misaligned labels, overcrowded charts, incorrect color schemes, and other such anomalies.
  • Rendering Glitch While Printing: When printed, a dashboard adopts a layout that is not exactly identical to its desktop rendering, which can result in rendering glitches unique to the print format.
  • Functional Glitch: Such errors significantly impede the functional usability of the dashboards. Examples include a date picker displaying incorrect values, a multi-selection filter behaving as a single selection, a chart missing a required selection, or a broken drill-down feature within a table, among others.
  • Data Regression: The dashboards may fail to use the correct data, often due to unforeseen alterations in the data source (such as database updates or changes in IoT provider APIs), glitches in the Extract, Transform, Load (ETL) process, or similar factors.
  • Query / Custom Calculation Regression: Comparable to data regressions and traditional coding errors, queries and custom calculations may unexpectedly exhibit erroneous behavior.
  • Missing Dashboards: At first, such errors may seem peculiar; however, when managing a considerable number of pre-defined (and potentially automatically generated) dashboards, it is not always obvious that all of them are available as expected.
  • Security / Authorization: The presence of diverse user security profiles and authorization levels can produce a variety of errors, such as those mentioned earlier (e.g., dashboards inaccessible due to access-rights restrictions, regressions in queries or calculations caused by the inability to access certain data, etc.).
  • Performance Regression: An unforeseen surge in data volume and/or in the number of users accessing the analytics can slow down queries, causing dashboards to open and render unacceptably slowly.
  • System Load Regression: Much like performance regression, this form of regression concerns the system’s capacity to withstand an escalation in data volume, user numbers, and similar factors.

This complexity underscores the importance of thorough testing procedures to ensure the accuracy and reliability of dashboards, and likely contributes to the continued reliance on manual testing methods. This sentiment is echoed in the feedback received from icCube customers asking for advice on automated testing to improve the quality of their dashboards.

Automated vs. Manual Testing

Before delving into the specifics of the various types of tests, it is worth recalling several reasons why automated testing should be favored over manual testing as much as possible.

  • Slow: Manual testing is slow by nature: there is a hard limit on how fast a user can interact with dashboards and execute the various procedural steps of the tests. This is quite pertinent because multiple testing cycles are usually required when an issue is detected, and it is a limiting factor for achieving a swift time-to-market for the solution.
  • Monotonous: Manual testing can be tedious and monotonous, which can result in a lack of motivation amongst testers. This diminished enthusiasm adversely affects the overall quality of the testing cycle.
  • Documentation: Manual testing requires extensive documentation describing the various steps of the tests at both the technical and business levels. Maintaining this documentation poses a challenge, particularly in a dynamic environment where the product undergoes rapid evolution.
  • Hard-to-Detect Data Errors: Certain errors prove challenging to detect through manual testing. For instance, identifying invalid data within a table chart can be particularly arduous, especially when testers lack the requisite business knowledge to discern inconsistencies in the data.
  • Highly Skilled Workers: As previously noted, manual testing necessitates proficiency in both the technical and business domains. Individuals possessing such highly specialized skill sets are scarce and are often allocated to other areas of the organization, where their talents can be applied to more creative and innovative endeavors.
  • High Cost: The aforementioned factors collectively contribute to heightened costs, as substantial manpower is required to test the product efficiently.
  • Error Prone: Manual testers are, well… human, and therefore make mistakes; manual testing is thus inherently susceptible to errors, which adversely affects the overall quality of the testing.
  • Scalability Issues: Manual testers lack the capacity to replicate themselves in order to simulate a specific number of users for performance and stress testing, nor can they operate continuously on a 24x7 basis, unless you happen to have an army of ChatGPT agents at your disposal ;-)
  • CI/CD Pipeline Integration: Finally, integrating manual testing into the existing DevOps and DataOps pipelines of the organization presents a challenge, thereby impeding the delivery of the Dashboards and Analytics project and affecting its time-to-market.

Automated Tests

In this section, we will examine various types of automated tests that can be employed to effectively tackle each previously identified issue. While this list is not exhaustive, it provides valuable insights into actions that can be taken to enhance the project’s quality.

GitHub Public Project

Acknowledging the necessity and complexity of testing, icCube has released the ic3-analytics-ops project on GitHub with the following goals in mind:

- permissive license to reuse and extend at will,
- flexible enough to integrate into the existing CI/CD pipeline,
- automated server tests,
- automated data model tests,
- automated data authorization tests,
- automated dashboard tests.

This framework provides the building blocks for creating various types of tests.

Functional Testing

The test runner operates the dashboards as a user would, verifying that the application functions as intended. Cypress is the chosen tool for conducting this type of testing within the project. The tests are scripted in JavaScript, leveraging a set of predefined helper functions specifically tailored for testing the dashboards generated by icCube. These functions are already used internally by icCube to test the dashboard application itself.
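To give a flavor of what such a test looks like, here is a minimal, hypothetical Cypress spec in TypeScript; the dashboard URL, selectors, and widget names are invented for illustration and do not reflect the actual icCube helper functions or viewer markup:

```typescript
// Hypothetical functional test: open a dashboard, apply a filter and
// check that a chart reacts. The URL and selectors are placeholders.
describe('Sales dashboard - functional checks', () => {
  it('applies a country filter and updates the revenue chart', () => {
    // Open the (hypothetical) dashboard page.
    cy.visit('/viewer/sales-dashboard');

    // Select a value in a (hypothetical) country filter widget.
    cy.get('[data-test="filter-country"]').click();
    cy.contains('Switzerland').click();

    // The revenue chart should be rendered and reflect the selection.
    cy.get('[data-test="chart-revenue"]')
      .should('be.visible')
      .and('contain.text', 'Switzerland');
  });
});
```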

Regression Testing

The test runner simulates user actions on interactive dashboards, executing queries to confirm the anticipated outcomes. Tests are scripted declaratively using the JSON5 file format. The project facilitates the generation of expected results, which are subsequently reused to ensure non-regression. Non-regression testing for visual rendering remains an ongoing area of development. Further details will be provided as progress unfolds.
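While the project itself uses JSON5 files, the underlying idea of snapshot-based non-regression testing can be sketched in TypeScript as follows; `runQuery` is a placeholder for whatever executes a query against the BI server, and the snapshot path is arbitrary:

```typescript
// Sketch of snapshot-based non-regression testing: the first run records
// the expected result, later runs compare against it.
import { existsSync, readFileSync, writeFileSync } from 'node:fs';

// Placeholder: call the BI server's query API here.
async function runQuery(query: string): Promise<unknown> {
  throw new Error('not implemented');
}

export async function checkNoRegression(name: string, query: string): Promise<void> {
  const snapshotPath = `snapshots/${name}.json`;
  const actual = JSON.stringify(await runQuery(query), null, 2);

  if (!existsSync(snapshotPath)) {
    // First run: record the expected result for later comparisons.
    writeFileSync(snapshotPath, actual);
    return;
  }

  const expected = readFileSync(snapshotPath, 'utf8');
  if (expected !== actual) {
    throw new Error(`Regression detected in query "${name}"`);
  }
}
```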

Performance Testing

During non-regression data testing, the test runner can validate timing-based outcomes to verify that the system’s performance aligns with expectations. This may include assessing metrics such as dashboard opening and printing times, drilldown round trip durations to the server, and similar performance indicators.
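As a simple sketch, a timing assertion can be expressed as follows; `openDashboard` stands in for the action of opening and fully rendering a dashboard, and the time budget is an arbitrary example:

```typescript
// Placeholder: open and fully render the named dashboard.
async function openDashboard(name: string): Promise<void> {
  // ...
}

// Fail the test when opening the dashboard exceeds the given time budget.
export async function assertOpensWithin(name: string, budgetMs: number): Promise<void> {
  const start = performance.now();
  await openDashboard(name);
  const elapsed = performance.now() - start;

  if (elapsed > budgetMs) {
    throw new Error(`"${name}" took ${Math.round(elapsed)} ms (budget: ${budgetMs} ms)`);
  }
}
```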

Stress Testing

The tests, as executed by the test runner, delineate one or more actors, with each actor assigned a set of tasks to perform. These actors operate autonomously in their own threads of control, rendering them well-suited for replication either on a single tester machine or across multiple tester machines. This allows for the simulation of a substantial user load, facilitating the exploration of the system’s capacity limits. Moreover, such tests can be conducted continuously, 24x7, to ascertain system stability as an additional benefit.
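A minimal sketch of this actor pattern, assuming each task is an asynchronous operation against the server (open a dashboard, apply a filter, drill down, and so on):

```typescript
type Task = () => Promise<void>;

// One actor runs its task list repeatedly, independently of the others.
async function actor(tasks: Task[], iterations: number): Promise<void> {
  for (let i = 0; i < iterations; i++) {
    for (const task of tasks) {
      await task();
    }
  }
}

// Launch many concurrent actors to simulate a user load; the same pattern
// can be replicated across several tester machines for heavier loads.
export async function stressTest(tasks: Task[], actors = 50, iterations = 100): Promise<void> {
  await Promise.all(Array.from({ length: actors }, () => actor(tasks, iterations)));
}
```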

Security/Authorization Testing

The test runner possesses the capability to authenticate users using various credentials, thereby ensuring that the expected results align with each authorization profile. In today’s landscape, such tests hold heightened significance, particularly given the extensive integration of embedded analytics, which contributes to the complexity of the overall system under scrutiny.
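As an illustration, the sketch below runs the same query under two hypothetical authorization profiles and verifies that the restricted profile only returns a subset of what the unrestricted profile sees; `loginAs` and `queryRows` are invented placeholders, not an actual API:

```typescript
interface Credentials { user: string; password: string; }

// Placeholder: authenticate against the BI server and return a session token.
async function loginAs(credentials: Credentials): Promise<string> {
  return 'session-token';
}

// Placeholder: execute the query with the given session and return row keys.
async function queryRows(session: string, query: string): Promise<string[]> {
  return [];
}

export async function checkRowLevelSecurity(query: string): Promise<void> {
  const adminSession = await loginAs({ user: 'admin', password: '***' });
  const emeaSession = await loginAs({ user: 'sales-emea', password: '***' });

  const allRows = await queryRows(adminSession, query);
  const emeaRows = await queryRows(emeaSession, query);

  // The restricted profile must only return rows the admin can also see.
  const unexpected = emeaRows.filter((row) => !allRows.includes(row));
  if (unexpected.length > 0) {
    throw new Error(`Unexpected rows for restricted profile: ${unexpected.join(', ')}`);
  }
}
```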

Dev vs. QA vs. Prod Environment Testing

The test runner is not confined solely to the development environment; it is imperative to execute tests across all environments, including development, QA, and ideally, even in the production environment. While constraints on accessing confidential production data may limit certain types of tests, conducting at least some form of smoke testing is essential to ensure a baseline functionality of the system. Additionally, efforts can be made to assist your end-users in establishing their own testing scenarios.
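One possible way to parameterize such a smoke test across environments, with placeholder URLs and an assumed TEST_ENV variable selecting the target (the global fetch requires Node 18+):

```typescript
// Placeholder base URLs for each environment.
const targets: Record<string, string> = {
  dev: 'https://dev.example.com/icCube',
  qa: 'https://qa.example.com/icCube',
  prod: 'https://bi.example.com/icCube',
};

// Minimal smoke test: the server answers and the landing page loads.
export async function smokeTest(): Promise<void> {
  const env = process.env.TEST_ENV ?? 'dev';
  const baseUrl = targets[env];
  if (!baseUrl) throw new Error(`Unknown environment: ${env}`);

  const response = await fetch(`${baseUrl}/`);
  if (!response.ok) {
    throw new Error(`${env} smoke test failed: HTTP ${response.status}`);
  }
}
```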

Upgrade / Migration Testing

With comprehensive tests implemented and increased confidence in the application’s quality, you will be more inclined to upgrade the application to deliver new features to end-users. This is accompanied by the notable benefit of significantly reducing the time-to-market. This shift can be game-changing in our rapidly evolving and competitive landscape.

Conclusion

Hopefully, you are now convinced that automated (as opposed to manual) testing is key to the success of your Dashboards and Analytics project, and that you can implement it right now with existing tools.

At icCube, we understand that testing complexities can be daunting. That’s why the Data Analytics Boutique Services are here to help, so do not hesitate to contact us; we’d love to hear from you.

Medium Posts

The following Medium stories provide detailed coverage and insightful perspectives on this blog post, offering additional context for readers interested in exploring the topic further:

- AnalyticsOps and Automated Dashboard Tests
- Testing your Analytics and Dashboards
- Load Testing your Analytics and Dashboards
- Analytics/Dashboards: Version Update Validation
