Perhaps this post is a bit late to the party. The NSA spying scandal is already a distant memory for many, having been replaced by subsequent news cycles.

But as I see an increasing number of our own Testuff customers move their applications, testing, and development into the cloud, it’s hard for me to find true closure. The NSA spying was far-reaching, and the implications extend well beyond phone records and emails. In the age of big data, the tools we use to communicate, create, and execute are all under constant potential threat.

So many questions keep coming to mind.

As privacy and security concerns evolve, will end users increasingly expect software developers and testers to protect them from prying eyes? If not the architects of these tools, who is responsible?

One could make the argument that it’s not really our concern. When developing bespoke software, isn’t the job finished when the client is satisfied?

But this logic isn’t terribly gratifying. Most of us in the software testing community hold ourselves (and our work) to very high standards. And we routinely go above and beyond to create products that exceed our own expectations – even if this means working off the clock.

However, this doesn’t answer the larger question.

Assuming that we are the gatekeepers and protecting end users is our responsibility, do we really have options? Is it even within our ability to prevent future NSA spying scandals and government hacking?

I actually think it is within our power. Our entire raison d’être is defined by perpetual improvement and a never-ending game of one-upmanship. New security holes and bugs are discovered, tested, and fixed on a daily basis.

Moreover, we outnumber the governments of the world. True, they sometimes have cooler toys, but then again, we’re the ones who tested and developed them.

QA Testing Tools for Big Data, Big Challenges, and Big Opportunities

Collectively, we may already have the tools and skills to ensure greater privacy. But our industry is fragmented enough that I don’t expect any unified efforts to emerge.

The more likely outcome – in my opinion – is the birth of a new testing breed. One dedicated almost exclusively to the types of “privacy” concerns that the NSA scandal helped uncover.

This is, of course, pure speculation. But history is on my side. There’s already precedent for exactly this kind of organic solution emerging when the need arises.

Over the years we’ve witnessed a shift from debugging-oriented to destruction-oriented to prevention-oriented testing. And with this steady progression:

  • end user expectations have matured
  • new best practices have emerged
  • traditional job duties have evolved

All in keeping with the unique technological challenges and opportunities of the day.

In the age of big data and digital privacy, many of these unique challenges and opportunities now have a social element. And the new breed of testers will need to develop methodologies, tools, and a language to reflect that fact.

Agree? Disagree? Please share your thoughts in the comments below.