Finding the Humanity in Data: Privacy, Identity, and Anonymity in 2015


Privacy appears to be on the ropes.

Whether you are aware of it or not (and there is evidence that many of us are not), your data is collected and used by a wide swath of companies with very few restrictions or guidelines.

This happens not just when you’re actively browsing: if you carry a mobile device or tablet with you, your apps and devices are constantly collecting and transmitting information, even when you think you are browsing anonymously.

There’s more than just consumer privacy at stake: inherently human concepts like anonymity and identity are at play as well.

The social web is the most obvious place for this battle. There have been recent, and jarring, cases of algorithms going wrong, along with less obvious but similarly questionable misuses of data by Uber and Google.

Part of the problem is that there really aren’t many restrictions on how your information is used. Individual companies typically develop their own standards for handling digitally transmitted data, and most of them simply say they can and will use it any way they please (something startups and newer technology companies almost always write into the fine print). This reflects the unfortunate reality that industry groups and government regulation simply haven’t caught up yet. And when they do catch up, it will raise another set of daunting issues around detection, enforcement, and effective penalties.

Another issue is that there are currently very few ways for us to control how our own data is used, and efforts like Do Not Track are quickly rendered ineffective or simply brushed aside by companies eager to exploit visitors to a website or app.
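To make that mechanism concrete, here is a minimal sketch of what honoring Do Not Track could look like on the server side. It assumes a plain Python handler that receives request headers as a dictionary; the `DNT` header name comes from the proposed standard, while everything else (function names, the analytics payload) is purely illustrative:

```python
def visitor_opted_out(request_headers):
    """Return True if the visitor sent the Do Not Track signal (DNT: 1).

    HTTP header names are case-insensitive, so normalize before looking up.
    """
    headers = {name.lower(): value for name, value in request_headers.items()}
    return headers.get("dnt") == "1"


def build_analytics_payload(request_headers, page_url):
    """Attach identifying analytics only when the visitor has not opted out."""
    if visitor_opted_out(request_headers):
        # Respect the signal: record nothing that identifies the visitor.
        return {"url": page_url, "tracked": False}
    # Hypothetical identifying payload a site might otherwise collect.
    return {"url": page_url, "tracked": True, "client_id": "example-visitor-id"}


if __name__ == "__main__":
    print(build_analytics_payload({"DNT": "1"}, "/article"))  # tracked: False
    print(build_analytics_payload({}, "/article"))            # tracked: True
```

The point is that honoring the signal is technically trivial; the problem described above is that most companies simply choose not to.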

Of course, it’s not all bad: there are benefits to how your data is used, and there are companies that are more transparent about how they use it. Kindara, for example, is a Boulder-based startup that helps women track and optimize their fertility and menstrual cycles, and it has a clear, easy-to-understand privacy policy.

But the overwhelming reality is that while things like programmatic ads and other automated serving of content have a ton of potential, there are far more companies asking “what can we do with your data?” than there are asking “what should we do with it?”

The latter approach – the one that will ultimately lead to a sustainable web – requires more than quick, exploitative tactics. It means engaging in what our own Susan Silver calls “finding the humanity in data.”

Clearly we’re at a point of convergence. While we have access to staggering amounts of data, we are only beginning to scratch the surface of how and when we should use it… not to mention combating misconceptions about what data can actually do.

So, what comes next? And how will privacy, identity, and anonymity evolve in the coming year?

While large companies like Uber, Google, and Facebook have all recently encountered criticism, there have been problems across the ecosystem. During Gamergate, for example, questions of privacy quickly hit the headlines when a large number of women were publicly harassed, and some even had their lives threatened.

Along with privacy, 2014 saw a rise in demand for social platforms that rely less on static identity and give users the option of remaining anonymous or semi-anonymous. Alternatives like Yik Yak, Ello, and Whisper quickly gained traction and funding, although questions remain as to whether they can be trusted to keep their users’ data secret.

We asked the team here at ArCompany to talk about changes they see coming, and how both platform & product companies will need to adapt their efforts around research / business intelligence, content, and community.

Susan’s take:

My ideas around these issues have changed drastically over the past year. I’ve always been aware of how public our lives are because of the internet. In this medium I have met like-minded individuals and even built a career. What is not obvious is the trade-off, and the leeway we have given the companies that enable us to connect. The infatuation period with these technologies is quickly waning as we come to understand just how deeply they affect our lives and what their repercussions might be.

In 2014 I witnessed these issues first hand as friends and colleagues endured retaliation for their identities and for choosing to speak out about their harassment. Twitter has a well-known problem with protecting users from abuse and has no real way to enforce its own policies on the matter. If companies are going to use data so closely tied to my personal identity, then I should at least feel safe disclosing it and be able to control its visibility.

Protecting users starts with having a community and moderation team committed to doing the work when these issues become apparent. It is certainly possible, and beneficial, to set boundaries early on and enforce them so that users police themselves. This creates a safe and inclusive space. It also relates to privacy, because this team is best able to educate users, build relationships, answer concerns, and respond to breaches. When users are empowered to take action and direct their experience, everyone wins.

We need to mature out of this stage where companies dictate to users what will be done with their information. Let’s move on to the next stage, where we work in collaboration and build awesome things together. As a marketer who uses social intelligence, I want information to remain free-flowing, but it should be limited by the individual, based on how much they value the relationship they have with me and how they choose to exert their will within it.

Hessie’s take:

I’m caught between my role as a marketer and my role as an everyday user of technology. As a marketer, I want to know more about what motivates people, and about the impulses and external influences that make them behave the way they do. If it helps me sell more product, then of course I want to know this information. As a user of technology, I was aware (or thought I was aware) of the information being collected about me.

The reality is that privacy disclosures were meant to be convoluted and long. No one reads them, and that’s one issue. The other side of this is that applications like Whisper are not upfront about this information, which raises the question: does disclosure really matter? I doubt that anonymity will thrive in a data-rich environment. So much technology already exists that pushes the boundaries of pinning down not only identity, but also the complexities of human emotion and motivation. Regardless of what technology makes possible, the business side should be responsible for what information it extracts and how it uses it. Period.

Amy’s take:

I’m torn over this issue, just as I am, to a degree, over users (as in us) being anonymous on sites like Whisper; it’s easier to bully when there is no accountability.

But on this issue, here’s where I’m stuck: I’m a marketer AND a consumer who benefits in both roles from algorithms based on my online behavior. As a consumer, I also suffer when the data collected is clumsily used. With regard to privacy (and the Uber example is perfect), yes, I have deep concerns, and I’ve mentioned Cuban’s Cyber Dust app many times as an option for covering our tracks. Here’s the wrench: the problem is not simply that our user experience will be less tailored if we don’t give up our data; it’s that people with evil intentions will be harder to track. For example, many child pornography sites are monitored by law enforcement to weed out pedophiles and other criminals. If we all get to be totally anonymous online, we lose some of our ability to stamp out crime as well.

The only answer to me is one that many web experts fear: regulation, with accountability by businesses to a set of rules… but that brings with it a wave of other questions.

Joe’s take:

I think the problem is that we’re still stuck thinking about the web from a largely binary perspective. I don’t support anonymity without limits, but privacy and identity are essential human concepts. Human behavior can’t be absolutely controlled without negative results, nor should we give up completely on creating both actual architecture on the web and social norms that are thoughtful and useful. I believe that if we build good structure, people will respond accordingly.

For brands and marketers, this means thinking carefully about your internal and external technology stacks and letting your audience and customers guide you. Building healthy social norms and social structure is absolutely critical, and when you’re unsure about something, don’t be afraid to seek a wide variety of perspectives from your community. For technology companies in the apps and/or service layer, it means building tools that are consumer-centric.

It’s also absurd to me that companies like Verizon, Turn, and Facebook can secretly collaborate to track and control what ads users see, while the best they can come up with for users is an opt-out that requires you to accept cookies… and, as it turns out, that doesn’t really opt you out at all (for the relevant screengrab and a link to the full Orwellian doublespeak, click here). The FTC should hold the adtech industry’s feet to the fire, and the industry players should stop being lazy and start creating technology to help us control how our data is stored and accessed.
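To illustrate why a cookie-based opt-out is such a weak mechanism, here is a minimal sketch; the cookie name and values are hypothetical, and the point is only that the preference lives in the very storage users are often trying to clear or block:

```python
def is_opted_out(cookies):
    """Cookie-based ad opt-out: the preference itself is stored in a cookie.

    The cookie name "ad_opt_out" is hypothetical.
    """
    return cookies.get("ad_opt_out") == "1"


# The fragility: clearing or blocking cookies silently re-enrolls the user.
print(is_opted_out({"ad_opt_out": "1"}))  # True  -> opted out of targeting
print(is_opted_out({}))                   # False -> cookies cleared, tracking resumes
```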

Bob’s take:

There will always be individuals, organizations, or countries out there with an “Evil Empire” agenda. Witness several of the examples my associates have already mentioned. Now add other instances, like North Korea’s cyber attack on Sony over the planned release of the comedy movie “The Interview,” which exposed corporate information including personnel records. And continue right on down to cyber crimes committed by individuals for a wide range of reasons. Clearly there is a need for policing and enforcement, and, perhaps more importantly, for education and prescriptive measures to combat these “evil” situations. Such measures often take time to demonstrate results, because a cultural change is required, and successful culture change is usually a struggle led from within.

Although we need to continue the charge in dealing with the above “evil” situations, we may have the opportunity and ability to make dramatic changes in the inappropriate use of individual data through the business community. Historically, as businesses get larger they seem to lose touch with their customers (both B2C and B2B). They begin to convince themselves that they have to surreptitiously “monitor and control” their customers in order to sell their products and services. However, and this may sound rather simplistic, it may come as a shock to businesses that the overwhelming majority of their customers want their products and services and are capable of rational decisions.

Today, more than at any other time in history, consumers have the tools and access to information to help them decide what to purchase, thanks to the internet and the thousands of social media services, forums, blogs, and more. Businesses need to start thinking in terms of permission marketing, the approach coined by Seth Godin.

Permission marketing is an approach to selling goods and services in which a prospect explicitly (and maybe even excitedly) agrees in advance to receive marketing information. Part of the “deal” of granting permission may also require the business to keep the person’s identity secret or to protect their personal data. As an example, think of providing healthcare or pharmaceutical products or advice to someone who is anonymous, where the service provider (the business) can see their health attributes (age, weight, medical conditions, etc.) through an intermediary-managed Personal Health Record (PHR). If the business can provide useful information, the prospect will become a willing customer while still protecting whichever pieces of their personal data they choose.
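As a rough illustration of that intermediary model, here is a minimal sketch, assuming a simple in-memory record and hypothetical field names. The business receives only the attributes the person has consented to share, addressed by an opaque token rather than their identity:

```python
# Minimal sketch of an intermediary-managed record: the business only ever
# sees the attributes the person has explicitly consented to share.
# All field and function names here are hypothetical.

PHR = {
    "identity": {"name": "Jane Doe", "email": "jane@example.com"},
    "attributes": {"age": 42, "weight_kg": 68, "conditions": ["asthma"]},
    "consented_attributes": {"age", "conditions"},  # chosen by the customer
}


def view_for_business(record):
    """What the intermediary releases to the service provider."""
    shared = {key: value for key, value in record["attributes"].items()
              if key in record["consented_attributes"]}
    # Identity is never included; the business addresses an opaque token instead.
    return {"subject": "anon-7f3c", "attributes": shared}


print(view_for_business(PHR))
# {'subject': 'anon-7f3c', 'attributes': {'age': 42, 'conditions': ['asthma']}}
```

The design choice is the one the paragraph above describes: the customer, not the business, decides which attributes flow across the boundary.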

The future is not about interruptive marketing and catching people’s attention by “getting in their face.” The future is about introducing yourself to targeted prospects and providing information in a consultative, relationship-based manner that the recipient considers valuable and that supports their decision making.

 

 

Featured image courtesy of commons.wikimedia.org (CC, some rights reserved).
