
Modern Day Data Testing

In this new video series, Pragmatic Works Founder Brian Knight and Software Architect Manuel Quintana discuss Modern Day Data Testing. Brian and Manuel have heard from hundreds of customers how they’re doing testing today, including the regulatory issues surrounding it, the common mistakes and oversights that lead to failure, and the best practices developed through successful data testing implementations.

Data-driven testing is a fairly new concept for many organizations. To prevent a false sense of security, it’s important to avoid the common pitfalls that lead to data testing failures. Watch as Brian and Manuel count down the most common mistakes they hear customers make:

5. Mistake: Not Communicating the Test Results

Repeated studies by Gartner have shown that 40% of business intelligence projects fail, and even the ones that succeed have a trust problem: one in three business leaders don’t trust the data they’re looking at in those reports. When you build a testing plan, make sure you communicate the testing results. This will help build trust in the data.

Tweet: “Build trust by communicating test results.” – @BrianKnight on avoiding common #data testing pitfalls http://bit.ly/2gP5ehA

For example, Pragmatic Works recently went through a commission system overhaul. As part of that, we ran both systems in parallel and used LegiTest to validate the results of both. This ensured that every order made it into both the old and new systems and that it was commissioned properly. We wanted the sales team to trust the new commissioning system, so each day we gave them access to a LegiTest dashboard through a webpage that showed the validated results along with the failures we needed to correct. When everything showed green for a week, we launched a fully validated, vetted system that people trusted across the board.
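
LegiTest drives this kind of reconciliation through its point-and-click interface, so the sketch below is only an illustration of the underlying comparison. The databases, table and column names, and tolerance are hypothetical; none of it comes from LegiTest itself.

```python
# Minimal sketch of a parallel-run reconciliation between an old and a new
# commission system. All table/column names, file paths, and the tolerance
# are hypothetical; LegiTest performs this kind of comparison through its UI.
import sqlite3

QUERY = "SELECT order_id, commission_amount FROM commissions ORDER BY order_id"

def load_commissions(db_path):
    """Return {order_id: commission_amount} from one system's database."""
    with sqlite3.connect(db_path) as conn:
        return dict(conn.execute(QUERY).fetchall())

def reconcile(old_db, new_db, tolerance=0.01):
    """Compare the two systems: missing orders and mismatched commission amounts."""
    old, new = load_commissions(old_db), load_commissions(new_db)
    missing_in_new = sorted(set(old) - set(new))
    missing_in_old = sorted(set(new) - set(old))
    mismatched = [
        (oid, old[oid], new[oid])
        for oid in set(old) & set(new)
        if abs(old[oid] - new[oid]) > tolerance
    ]
    return missing_in_new, missing_in_old, mismatched

if __name__ == "__main__":
    missing_new, missing_old, bad = reconcile("old_system.db", "new_system.db")
    print(f"Orders missing from new system: {missing_new}")
    print(f"Orders missing from old system: {missing_old}")
    print(f"Commission mismatches (order, old, new): {bad}")
```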

4. Mistake: Not Understanding Who is Responsible

When we first set out to build LegiTest, we presented the product to Professional Association for SQL Server (PASS) conference attendees. Overwhelmingly, they told us that the QA folks tested the data. We then went to a QA conference, where they told us that the DBAs tested the data. We scratched our heads and eventually figured out that at most companies, the applications are tested but the data is not. It’s no wonder that CEOs don’t trust their data!

As part of a cohesive plan, you need to know who is responsible for data integrity and who is responsible for fixing things when they go bump in the night. Build a plan that includes not only an escalation path in production but also a definitive testing plan with clear owners in development.

3. Mistake: Not Doing Negative Testing

Most testing that we do is positive testing, where you look for an expected result. Negative testing checks the opposite. For example, you might have logic that averages the price per order. What would happen if both the total price and the number of orders were 0? Negative testing ensures that you catch those nasty bad-data problems early in development, before they show up in production.
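
To make the idea concrete, here is a minimal sketch of a positive test alongside a negative test for that averaging logic. The function and its zero-handling are illustrative assumptions, not code from LegiTest.

```python
# Negative-testing sketch: feed the averaging logic bad inputs on purpose.
# average_price_per_order is an illustrative function, not part of LegiTest.
import pytest

def average_price_per_order(total_price, order_count):
    """Return the average price per order, treating zero orders as 0.0."""
    if order_count == 0:
        return 0.0  # guard against dividing by zero
    return total_price / order_count

def test_positive_case():
    # Positive test: the expected, well-formed input.
    assert average_price_per_order(100.0, 4) == 25.0

def test_zero_total_and_zero_orders():
    # Negative test: both inputs are 0 -- must not raise ZeroDivisionError.
    assert average_price_per_order(0, 0) == 0.0
```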

2. Mistake: Not Testing in Production

Most companies do some level of testing of their data in QA and development. Production is considered sacred and never to be touched. I would argue that your sacred ground is constantly getting ruined by bad data.

Imagine you receive a file from a partner every day. 95% of the time, things work great. You’ve built processes that look for errors in the load but nothing that looks for errors in the data. Then the partner sends you the wrong file by mistake, and you successfully load bad data into production. Each day, you should have a dashboard or email that recaps the quality of your production environment and tells users whether they can trust their data.
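
A rough sketch of that daily recap might look like the following; the checks, thresholds, and record fields are all assumptions chosen for illustration, not LegiTest functionality.

```python
# Sketch of a daily production data quality recap. The checks, thresholds,
# and record fields are illustrative assumptions, not LegiTest functionality.
from datetime import date

def run_quality_checks(rows, expected_min_rows=1000):
    """Run a handful of sanity checks over today's loaded rows."""
    return {
        "row count looks normal": len(rows) >= expected_min_rows,
        "no missing customer ids": all(r.get("customer_id") for r in rows),
        "no negative order amounts": all(r.get("amount", 0) >= 0 for r in rows),
    }

def recap(results):
    """Build the text of a daily recap email or dashboard from check results."""
    lines = [f"Production data quality recap for {date.today()}:"]
    for name, passed in results.items():
        lines.append(f"  [{'PASS' if passed else 'FAIL'}] {name}")
    return "\n".join(lines)

if __name__ == "__main__":
    sample_rows = [{"customer_id": 1, "amount": 42.0}] * 1200  # stand-in for today's load
    print(recap(run_quality_checks(sample_rows)))
```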

1. Mistake: Spot Checking Data

The most common answer we hear during customer interviews is that customers only spot-check data. We all know this is not the right answer, yet we keep making the mistake. It’s the equivalent of testing one page of a program and assuming everything else works. We need to test every variant our data loads might encounter, and keep testing even after the tests pass.
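
One way to move past spot checks is a data-driven, parameterized test that runs the same assertion over every input variant the load might see. The pytest sketch below shows the pattern with made-up variants and an illustrative cleaning function; it is not LegiTest’s wizard, just the general idea.

```python
# Data-driven testing sketch: the same check runs over every input variant,
# instead of eyeballing one or two rows. The variants and function are made up.
import pytest

def clean_country_code(raw):
    """Illustrative load logic: normalize a country code, defaulting to 'UNKNOWN'."""
    if raw is None or not raw.strip():
        return "UNKNOWN"
    return raw.strip().upper()

@pytest.mark.parametrize(
    "raw, expected",
    [
        ("us", "US"),          # happy path
        (" us ", "US"),        # stray whitespace
        ("", "UNKNOWN"),       # empty string
        (None, "UNKNOWN"),     # missing value
        ("gbr", "GBR"),        # longer code
    ],
)
def test_clean_country_code(raw, expected):
    assert clean_country_code(raw) == expected
```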

Over the past few years, we’ve been perfecting LegiTest, a solution for automated, data-driven testing. With LegiTest, you can:

  • Set up tests in minutes with our point-and-click wizards
  • Implement automated data-driven testing
  • Reconcile your databases
  • Verify your data – in production, QA and development

With a 14-day risk-free trial of LegiTest, you can see how easy it is to build trust in your data.


Try LegiTest