The Ethics of Big Data and Privacy

What are an individual’s rights to privacy? There’s no easy answer, as the boundaries of privacy differ among cultures and individuals. At its core, however, there are shared basics and common themes, and in most cultures there is some agreement that individuals are entitled to some level of privacy.

Historically, the protection of private information was the responsibility of the “discloser.” If a person didn’t want information exposed, they simply never disclosed or published the details. This worked well before printing presses, the Pony Express, telephones, the internet and, of course, the “privacy disclosure” agreement that is now part of every online registration process and that no one ever reads.

As I mentioned in my previous post about our first Data Curiosity Roundtable, the issue of big data privacy and the ethics surrounding this topic are a hotbed for discussion. Today, information is fluid, its distribution is instantaneous and global, and usage pattern data from any provider of online services is valuable.

Anyone in marketing will tell you how important it is to understand individual consumer identity and patterns. If your company or marketing firm isn’t doing this, you should be asking questions, because the more you know about buyers, the better your chances of making a tempting offer. This concept is called personalized marketing. To do it effectively, however, you need to know a lot about a person and how they fit a pattern observed in others you’re targeting with marketing campaigns and promotions.

For most people, disclosing a little information for a perceived benefit is no big deal, right? The issue arises when that data is bought and sold, then recombined by new and separate institutions that merge it with other pattern data to infer or detect non-disclosed information. This point is at the heart of the privacy matter—and the ethics debate.

Through these data acquisition practices, can a firm detect something about a person that was never disclosed? Yes. Is that merely the application of a generalized consumer pattern, or does it uncover a legally protected, non-disclosed personal attribute? Maybe.
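To make the mechanism concrete, here is a minimal sketch of how such an inference works in principle. Everything in it is invented for illustration: the category names, the weights, and the threshold are hypothetical, not any retailer’s actual model. The point is simply that a non-disclosed attribute can be scored from innocuous individual purchases once they are combined.

```python
# Hypothetical signal weights linking purchase categories to an
# undisclosed attribute. Real systems learn these from historical data;
# these values are made up for demonstration.
SIGNAL_WEIGHTS = {
    "unscented lotion": 2.0,
    "prenatal vitamins": 4.0,
    "cotton balls": 1.0,
    "large handbag": 0.5,
}

def likelihood_score(purchases):
    """Sum the weights of a shopper's purchases that match known signals."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchases)

def flag_shoppers(histories, threshold=3.0):
    """Return IDs of shoppers whose combined signal crosses the threshold."""
    return [shopper_id for shopper_id, items in histories.items()
            if likelihood_score(items) >= threshold]

# No single purchase is revealing; the combination is what gets flagged.
histories = {
    "A": ["prenatal vitamins", "cotton balls"],  # score 5.0
    "B": ["large handbag"],                      # score 0.5
}
print(flag_shoppers(histories))  # ['A']
```

Note that shopper A never disclosed anything; the inference falls out of aggregation, which is exactly why the ethics question is about the analyst’s design choices rather than the shopper’s disclosures.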

What constitutes going too far? When does it get downright creepy? That is tricky, because a company can follow the letter of the law and still use Big Data and large-scale analytics in a way that steps outside the public’s collective comfort zone. Consider the cases of two major retailers—Target and Nordstrom.

Target’s practice of data mining backfired when it sent coupons for pregnancy-related items to a teenager, which prompted an angry father to berate a local store manager before learning his daughter was indeed pregnant. While Target did nothing technically wrong by using buying patterns to identify a particular customer as pregnant, and then mailing her associated materials, it nonetheless created a firestorm by identifying something private. Similarly, Nordstrom was called out for using a service that collected information from unsuspecting customers’ smartphones when they connected to the store’s WiFi service. Each of these situations hit a public nerve, and the brands were negatively impacted by these practices.

Because of the blow-back to these brands, I posit that the responsibility for securing privacy has transitioned from the discloser to the analyst wielding these powerful analytical tools. The responsibility for not going too far, even if you can, lies with the design and use of the analytical tool.

One could say technology has simply outpaced societal norms. Consumers haven’t had enough time to express their comfort zones over the use of their personal and usage pattern data. Analysts are also treading new ground now that they have the tools to reach further than they could under the do-not-disclose paradigm. Furthermore, acceptable social and business behaviors—or ethics—have yet to form. Without careful consideration by the person designing the use of the system, over-stepping of acceptable boundaries can, and will, happen.

It’s time to find the right balance. Let’s collectively continue the ethics discussion, so there are no more issues to tarnish the industry and its capabilities. What’s your take? Drop me a line at Matt_Wolken@Dell.com or on Twitter at @matthewwolken.

About the Author: Matt Wolken