“Churning” © Joe Honton

Ethical Design in the Orwellian Commons

Choosing the right approach to personal data versus big data

by Joe Honton

This world is being assembled, in bits and pieces, from our interactions with technology: the tracking and surveillance of our movements; the communications and transactions of our commerce; the words and pictures of our inner circle; the desires and fears of our search for knowledge; and even the patterns and habits of our free time.

This is a dark thought, perhaps one that we'd rather not contemplate. And yes, it's uncomfortable — it reveals that we haven't been thinking deeply enough about the work we do, that we haven't been capturing, measuring, and evaluating the metrics of technology's by-products and the harms they can cause.

And worse, it's uncomfortable because it's not us versus them. It's just us — designers, marketers, engineers, and all the rest of us building out this new age of information.

But I believe it's not too late to stop the march if we start to act with deliberate intent.


How we came to glorify big data

In 2008 researchers at Google announced that they could predict the occurrence of influenza by analyzing the pulse of search terms coursing through their servers. This intriguing experiment was one in a long line of modeling efforts by statisticians using data captured from unwitting participants. But because of its apparent beneficial nature, it became the poster child for big data.

We reveled in its glory. It gave us license to imagine all sorts of other experiments that could be conducted with data that was otherwise lying fallow. Suddenly everybody was capturing, storing, and trading this new commodity called data.

And today all of that data is being hurled at machine learning and artificial intelligence with no thought given to its potential side effects.

It all starts innocently enough:

  • We directly fill in forms requesting basic information about ourselves.
  • We research things that are important to us.
  • We use a credit card or electronic wallet to purchase something.
  • We take photographs and videos of ourselves and those we know.
  • We use cellphones to give us the location of things near us.
  • We choose what television program to watch, what books to read, what music to listen to.

But it quickly gets a bit iffy when we look at what's happening as a by-product of this pattern:

  • By supplying our name and other basic information on forms, we're providing a digital fingerprint of ourselves, one that can be used in big data analytics to amalgamate the bits and pieces into a comprehensive profile.
  • By looking for information online, we're feeding search engines with highly specific information about our interests and desires.
  • By using electronic payments for purchases we're giving researchers clues to our age, gender, family status, wealth, and even our values.
  • By sharing photographs of ourselves, we're providing facial recognition software with the fuel it needs to be able to identify us in photos taken by people we don't know.
  • By enabling our cellphone's GPS, we're giving data providers a minute-by-minute trace of everywhere we've been.
  • By turning on Netflix or Kindle Cloud Reader or Spotify, we're revealing how much of each television show we've watched, how much of each book we've read, and how often we've listened to a music recording.

These are simple things, showing how easy it is to paint a picture of who you are, where you've been, and what you do. The focus becomes even sharper when law enforcement turns its spotlight on someone with a geofence search warrant.

Eventually the Google Flu Trends experiment was determined to be a failure. Seven years after it started, in 2015, the program was quietly abandoned.

It seems that big data is not all it was hyped up to be.


The Orwellian Commons

In George Orwell's dystopian novel Nineteen Eighty-Four, the Thought Police could peer into the corners of every room with a type of technological wizardry that is on par with what we have today. Except today we go even further with surveillance.

In our quest for simplicity we allow cameras to act as toll-booth collectors on the roads we travel. In our quest for convenience we allow transit cards to record our departures and arrivals. In our quest for security we allow badge-reading turnstiles to be the gatekeepers to company headquarters. And on and on. Just moving from here to there, we lose our anonymity. These are not optional. We can't "opt out" of this type of surveillance.

But it's not just Big Brother that's watching us. Today we voluntarily give Big Tech much more than Orwell's totalitarian leaders took by force. Multiple times a day we voluntarily check the box that says, "Yes it's OK to gather information about what I'm reading, what I'm looking for, what I'm purchasing."

It feels like the GDPR experiment is merely a bureaucratic whitewash of the true protections we need. Every new website that we visit informs us that they use tracking technologies to improve the customer experience and to make their product better. We see these notices and automatically grant them permission.

We check those boxes with resignation and a bit of exasperation, because we know that the alternative is not an alternative at all. Saying no and opting out of their data gathering schemes doesn't mean that the website works with reduced capability. It means that the website just doesn't work. Bye-bye and good luck elsewhere.


The Mad Men of Silicon Valley

Oh how we love to say that we hate advertising. But there's a contradiction behind that claim. In fact, we love advertising too. The whole of the last two centuries confirms this.

The success of the long-running Farmers' Almanac was not due to its weather predictions, but to its ability to promote new products that we didn't even know we wanted. Decade by decade, that quaint approach to advertising has given way to ever more sophisticated approaches, so that today we even celebrate that sophistication.

The first week of every February is when Madison Avenue turns the spotlight on itself, when Super Bowl ads are more hyped than the game itself, when our love of advertising is at its zenith. Advertising informs and entertains in equal measure. We love it.

When ads migrated from print media to radio, to television, and to the internet, we accommodated ourselves to their presence. But when computer tracking algorithms began to fingerprint our identities and follow us around, we finally began to say enough is enough.

What we are railing against is not the advertising, but the individual targeting. Nobody wants to be accosted on the sidewalk by a pushy pamphleteer shoving paper handouts our way. Nobody wants to be shadowed around the used car lot by a salesman extolling this one over that one.

So too, nobody in the new age of digital fingerprinting wants to be stalked by advertisers who follow us from site to site. We were OK with advertising as a broadcast message, but we're still queasy about advertising by companies that seem to know more about us than we do ourselves.


Virtue ethics

The traces of data that we leave behind, like breadcrumbs scattered on the trail of our daily lives, have become troubling — much more troubling than the topics of surveillance, advertising, or behavioral prediction would each suggest on their own.

We have an ethical dilemma: how can we protect an individual's personal data from misuse, while still providing a useful experience?

There are aspects to this that need our thoughtful consideration: data as a commodity, data amalgamation without express consent, data archival for indefinite periods of time, the expungement of data to protect our youth, data custodianship versus ownership, and others like these.

In short, for each new tech project we should be asking: Why are we collecting this data? How long do we really need to keep it? And what harm could result from its misuse, by us and by others?

In days gone by, public domain records were limited to birth, wedding, and death certificates. For anything else, the picture was filled out by family albums and newspaper clippings. Data archiving was limited to what the archivist could squeeze into his cinder-block storage facility. Data retention was fallible: fires, floods, and mold took turns at reducing history to a sketch.

In contrast, today it seems that everything we've ever done, said, posted, or photographed is being captured and archived, unable to fade from memory.

But some things should be forgotten, otherwise our children won't be able to grow up and experiment and make mistakes. Teenagers need to be protected from themselves. We need to have regulations in place to allow them to expunge the data of their misdeeds, their bad choice of words, their poor judgment with cameras. How else can we protect them from being forever dragged down after they've matured?

And we need to have better rules about the ownership of data we generate as a by-product of our tech activities. Who should have ultimate control over the ownership and retention rights of such data: the individual, or the business that stores it? Who really owns it? If internet companies are custodians of that data, what responsibilities do they have to the people whose lives are being recorded?

We need to develop a social contract in which tech companies and the users they serve agree that personal data is not a commodity. At a minimum it should begin with:

  • A non-amalgamation clause in our end-user license agreements that restricts or prevents us from joining data that we collect with data obtained from third parties, without express consent from the user.
  • A data retention clause that prohibits the archival of transaction data for unlimited periods of time without the express consent of the user, and that mandates that we erase those transactions when that period expires.
  • A data collection clause that restricts us from capturing or storing ephemeral data, such as GPS waypoints.
  • A data purging clause that specifies how long usage data — such as movies seen, books read, or music listened to — can be kept before mandatory deletion must occur.

When we start putting these kinds of guidelines in place, we'll begin to regain the trust that's been lost.
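
To make the retention and purging clauses a bit more concrete, here is a minimal sketch, in TypeScript, of what mechanical enforcement could look like. Everything in it is hypothetical: the RetentionPolicy shape, the UsageStore interface, and the table names are invented for illustration. The point is simply that a retention period, once the user has consented to it, can be enforced by a routine job rather than left as an aspiration.

    // A minimal sketch of enforcing a data-retention clause.
    // The policy shape, store interface, and table names are hypothetical.

    interface RetentionPolicy {
      table: string;       // where the usage data lives, e.g. "viewing_history"
      maxAgeDays: number;  // how long the user has agreed to let us keep it
    }

    interface UsageRecord {
      id: string;
      userId: string;
      recordedAt: Date;    // when the event was captured
    }

    interface UsageStore {
      findOlderThan(table: string, cutoff: Date): Promise<UsageRecord[]>;
      remove(table: string, ids: string[]): Promise<void>;
    }

    // Delete every record that has outlived the period the user consented to.
    async function purgeExpired(store: UsageStore, policies: RetentionPolicy[]): Promise<void> {
      const now = Date.now();
      for (const policy of policies) {
        const cutoff = new Date(now - policy.maxAgeDays * 24 * 60 * 60 * 1000);
        const expired = await store.findOlderThan(policy.table, cutoff);
        if (expired.length > 0) {
          await store.remove(policy.table, expired.map(r => r.id));
        }
      }
    }

    // Example: viewing history kept for 90 days, listening history for 30.
    const policies: RetentionPolicy[] = [
      { table: "viewing_history", maxAgeDays: 90 },
      { table: "listening_history", maxAgeDays: 30 },
    ];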


Accepting responsibility

So whose job is it to consider and make decisions about the ethical implications of our work? Marketers? Product designers? Software engineers?

It's easy to point the finger at marketing when we watch the eroding values of Don Draper, in Mad Men, as he blazes through Madison Avenue leaving a path of destruction. But it's a cheap shot. Marketing is not so much about balancing good and evil as it is about accentuating the positive.

Maybe product designers should bear the responsibility. After all, they're the ones telling us to capture and store all that personal data because it might help improve the product.

Then there are the foot soldiers: the software engineers who write the forms, who build the database, who analyze the data, and who return the results back to the marketers and designers.

Or perhaps it's the CEO — the one who's going to be called before the hearing committees after it's all too late.

But in the end, if we're all part of the cycle, shouldn't we all bear some of the responsibility?


Perhaps Bret Taylor, chief operating officer at Salesforce, has best summed up what we're all beginning to feel: “‘We’re just a platform’ is no longer an acceptable excuse. We need to not only build technology but also consider how it's applied.”
