I’ve been fascinated lately by the targeting technology that exists today. As ArCompany discovers more sophisticated ways to uncover human intent through online data, we welcome the accuracy and speed with which data enables businesses to make actionable decisions from real-time information. On the other side of the coin, we are also aware of the inherent dangers of an increased reliance on big data.
Truth: What Big Data knows about us is becoming creepier
I saw this article on the web a few weeks ago and posted it on my Facebook status. The article implied that Google’s ad-targeting algorithm serves ads with a sexist bias: if you are a woman, you are less likely to see an ad for executive job opportunities.
The discussion on my stream revealed some disagreement about whether the culprit was the algorithm or the advertiser. Since retargeting is largely about displaying a “personalized” ad based on where a user has visited, one would assume there is no real gender bias. Yet this isn’t the case. When the team at Carnegie Mellon probed Google’s ad targeting with their own tool, AdFisher, which simulates user browsing behavior, they noticed the following:
Google showed the ads 1,852 times to the male group — but just 318 times to the female group.
You decide: Was there gender bias in Google’s algorithm?
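One way to frame that question is statistically: could a gap of 1,852 versus 318 impressions plausibly be chance? Below is a minimal two-proportion z-test sketch. The impression counts come from the CMU finding above, but the number of page loads per group is an assumption I made for illustration, not a figure from the study:

```python
import math

# Impression counts reported by the Carnegie Mellon team.
male_ads, female_ads = 1852, 318

# Assumed (not from the study): each group generated the same
# number of ad-eligible page loads.
page_loads = 10000

p_male = male_ads / page_loads
p_female = female_ads / page_loads

# Pooled proportion and standard error for a two-proportion z-test.
p_pool = (male_ads + female_ads) / (2 * page_loads)
se = math.sqrt(p_pool * (1 - p_pool) * (2 / page_loads))
z = (p_male - p_female) / se

print(f"z = {z:.1f}")  # anything beyond ~3 is far outside chance
```

Under any reasonable choice of page-load totals, a disparity that lopsided produces a z-score far beyond conventional significance thresholds, which is why the finding drew so much attention.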
Social Profiling: Have you been to CrystalKnows?
It provides a social profile of people you know or don’t know, predicting an individual’s personality from their online footprint. More fascinating still, it predicts the dynamics between two individuals, should they enter into a working relationship.
The data is aggregated from “publicly available” online resources written by or about the individual. The accuracy score improves each time people answer questions about the individual. The data is then run through the True Colours personality profiling system, a more refined version of the Myers-Briggs Type Indicator (MBTI).
Its business value: I can use this system to refine my communication to a prospect based on how they prefer to receive emails, or how they prefer to be approached by phone or in person. If it helps close the sale, why wouldn’t I want this information at my disposal?
And that is the holy grail for marketers:
The marketing industry is attempting to profile and classify us all, so that advertising [messaging] can be customized and targeted as precisely as possible.
Algorithms aren’t capable of discretion
Many algorithms also build in heuristics that allow the machine to learn from patterns in the data and optimize to improve predictive scores. However, the starting point is always human. This article alludes to St. George’s Hospital Medical School, which used an algorithm to streamline and “automate its admissions process.” Because the original program relied on historical admissions data that tended to “favor” white males, it effectively encoded a bias against women and minorities.
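The St. George’s failure mode is easy to reproduce in miniature. A minimal sketch, with invented data: a naive frequency model “trained” on biased historical decisions simply learns to repeat them.

```python
from collections import Counter

# Invented historical decisions, skewed the way the St. George's
# data was: one group was accepted more often in the past.
historical = [
    ("male", "accept"), ("male", "accept"), ("male", "reject"),
    ("female", "accept"), ("female", "reject"), ("female", "reject"),
]

# "Learn" an acceptance score per group, as a naive frequency
# model would, with no notion of why the past looked that way.
totals, accepts = Counter(), Counter()
for group, decision in historical:
    totals[group] += 1
    accepts[group] += decision == "accept"

rates = {g: accepts[g] / totals[g] for g in totals}
# The learned scores mirror the historical bias:
# male ~ 0.67, female ~ 0.33
```

Nothing in the code is malicious; the bias arrives entirely through the training data, which is exactly the point.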
While the intent is for algorithms to move towards objectivity, they exercise no discretion when their outputs cause harm.
- Consider what Harvard Professor Latanya Sweeney discovered when viewing online ads for companies offering background-check services: “racially associated names” triggered ads linked to criminal activity.
- Remember when Target developed a pregnancy-prediction model based on a slew of customer buying data? It backfired when the company sent a coupon mailer to a teenage girl, revealing to her angry father, who had not yet been told, that she was indeed pregnant.
- Staples was charged with price discrimination when it was discovered the company offered varied pricing based on customers’ “estimated income levels.” Though unintended, the higher prices were shown to people in rural areas with significantly lower incomes than those who received the discounted prices.
Target collects as much data as possible on everyone who purchases from its stores:
Whenever possible, Target assigns each shopper a unique code — known internally as the Guest ID number — that keeps tabs on everything they buy. “If you use a credit card or a coupon, or fill out a survey, or mail in a refund, or call the customer help line, or open an e-mail we’ve sent you or visit our Web site, we’ll record it and link it to your Guest ID,” Pole said. “We want to know everything we can.”
In addition, Target purchases external data to augment this information, then analyzes customers at different life stages to identify common triggers in behaviour and purchase habits that may signal an expected child.
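The mechanics Pole describes amount to record linkage: every interaction that carries any identifier gets resolved to one Guest ID. A minimal sketch, with invented identifiers and structure (Target’s actual system is of course not public):

```python
from collections import defaultdict

# Assumed mapping from known identifiers (card numbers, emails,
# coupon codes...) to a single internal Guest ID.
id_to_guest = {
    "card:4412": "G001",
    "email:ann@example.com": "G001",
}

# Every guest's accumulated event history.
history = defaultdict(list)

def record(identifier: str, event: str) -> None:
    """Attribute an event to a guest if the identifier is linkable."""
    guest = id_to_guest.get(identifier)
    if guest is not None:
        history[guest].append(event)

record("card:4412", "in-store purchase: prenatal vitamins")
record("email:ann@example.com", "opened promotional email")
# history["G001"] now holds both events under one profile
```

Seen this way, the “creepy” part isn’t any single event; it’s that unrelated touchpoints quietly collapse into one profile.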
In the case of Latanya Sweeney and the racially associated ads, Cynthia Matuszek, a professor of computer ethics at the University of Maryland, said this:
It’s part of a cycle: How people perceive things affects the search results, which affect how people perceive things
People clicking on ads with the “racially associated names” increases the relevancy score, which in turn, perpetuates the racial bias.
There is Big Money in Profiling
The day is coming when companies will realize that reach and impressions fail to yield significant business results. Meanwhile, the more information we reveal about ourselves on our mobile phones and within our social channels, the more complete this aggregation and analysis of data becomes, until, as individuals, we are fully exposed to any organization that wants to know us.
Acxiom, a data aggregator, claims to have over 500 million consumer profiles and “offers its data in aggregated form to anyone who will pay for it, from websites to banks to insurance companies, and even to a US Army contractor.” BlueKai offers similar opt-in services. Don’t discount the hundreds of millions of dollars both Google and Facebook have invested in artificial intelligence to garner more robust audience information and user propensities. Make no mistake: as long as you are signed into Facebook, your actions both within and outside of Facebook are attributed to you. Google is the same; it records your “entire” search history while you’re logged in. The business of profiling is unfathomable. Consider this:
The organization Europe Versus Facebook, founded by law student Max Schrems, has publicized the extent of Facebook’s data collection. With the help of EU laws, he obtained Facebook’s internal record of him, a thousand-page dossier containing more or less everything he had ever done on Facebook: invites, pokes, chats, logins, friendings, unfriendings and so on.
And while I continue to believe in the value of one-to-one marketing, there needs to be some human intervention to truly determine context. Individually customized, real-time messaging is not that far away.
DataXu co-founder Michael Baker wrote,
It is no longer difficult to imagine a time in the not so distant future when all media—TV, radio, outdoor—is digital and addressable and capable of being purchased in an auction on an individual impression-by-impression basis.
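Baker’s “impression-by-impression” auction is not hypothetical machinery; real-time bidding commonly runs a second-price auction for each individual impression. A toy sketch, with invented bidders and bids:

```python
def run_auction(bids: dict) -> tuple:
    """Second-price auction: highest bidder wins but pays the
    runner-up's bid, as is common in real-time bidding."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# One impression, three hypothetical bidders, bids in dollars CPM.
winner, price = run_auction({"brand_a": 2.50, "brand_b": 1.75, "brand_c": 0.90})
# brand_a wins the impression and pays 1.75, the second-highest bid
```

Multiply that by billions of impressions a day, each priced against a profile of the person behind the screen, and the scale of the market becomes clear.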
Consumers naturally find this vision disturbing; as a consumer, I do too. Even if I were careful with each footprint I leave behind, I remain vulnerable to the black boxes that determine how industry and marketers perceive me. That is out of my control.
Clearly, technology now exists that pushes the boundaries of identifying not only who we are, but also the complexities of human emotion and motivation. Regardless of what technology provides, businesses must be responsible for what information they extract and how they use it. Period.
ArCompany uses social data insights as the foundation of our content and community strategies. Discover more about your customers by contacting us today.
Image Source: Wikipedia