Listen – Do You Want to Know a Secret?
by Andrew Rudin
Do You Want to Know a Secret? When the Beatles released this song in 1963, the world was different. Less frantic, and more naïve.
Listen / Do you want to know a secret / Do you promise not to tell? / Whoa, oh, oh
Let me whisper in your ear / Say the words you long to hear / I’m in love with you!
Fifty years of social progress has obviated the need for partners to wax poetic. In 2014, Good2Go, an iPhone application, was launched to help people skip romance and jump to a lascivious endgame.
Shortly afterward, Apple kicked Good2Go off its platform, citing application guidelines that prohibit objectionable or crude content. There were few protests, however, because, as it turned out, Good2Go wasn’t a hit with customers. Among the comments: “Even scarier than talking about sex,” and “worse than nothing.”
No secrets, and no promises not to tell. When it was originally released, Good2Go required users to provide the identity of their partner du jour, the time consent was given, and each person’s state of sobriety. The ostensible purpose for creating Good2Go was to help people document encounters before things got steamy, should allegations of impropriety crop up later on.
Once the data escaped the user’s keyboard, it was captured. By whom, or by what? Where was it kept? And just what were the intentions of whatever entity might be storing it? I don’t know, exactly. Nor did almost anyone else. One thing’s for certain: it wasn’t just going to sit there, unanalyzed and unshared.
By pressing Submit, data about the hookup breezed into the cloud. No strings attached, just like the liaison. Good2Go offered a curt privacy disclaimer: “We may not be able to control how your personal information is treated, transferred, or used.” Paul McCartney, step aside! There’s nothing like legal fine print to stoke the fires of passion.
If you’re a control freak about your personal information, some free advice: chuck your computer and iPhone immediately, and get off the grid. Wait about 20 years. After that, there will be no secrets. Personal privacy will be a quaint anachronism, and everyone will be part of a scandal. Don’t stress, though. By that time, nobody will care.
Whenever we give up personal details – whether online, over the phone, or on paper – the best we can hope for is that the custodians of our data will behave ethically, and take adequate measures to conceal our information from those who might abuse it. Recently, the Equifax data hack made it clear those are dangerous assumptions. The company’s massive databases contained millions of individual social security numbers, birth dates, driver’s license numbers and credit scores, but Equifax executives didn’t give a tinker’s damn whether consumers were protected.
The company squandered opportunities to prevent outsiders from hacking into this sensitive information, allowing 143 million private records to fall into nefarious hands. “The Equifax hacks are a case study in why we need better data breach laws . . . Equifax handled a disastrous hack poorly. But the core of their behavior isn’t unusual,” according to Vox, an online media website.
Breaches happen more than most people know. Your right to be informed depends on where you live. “The clamor for a standardized data breach notification requirement has become almost as quotidian as a data breach itself. Companies no longer wonder whether they will ever have to notify consumers of a breach but rather when they will do so. Incident response planning, however, is complicated by 47 different state breach notification laws and those of additional jurisdictions such as D.C., New York City, Puerto Rico, Guam and the Virgin Islands. The variety is no doubt confusing and increases the compliance costs for companies,” according to another article, Examining the President’s Proposed National Data Breach Notification Standard Against Existing Legislation.
That article was written in 2015, when Obama was president, and before the government stopped caring about protecting consumers. With Trump in office, don’t hold your breath waiting for this initiative. In the meantime, if you want to navigate the thicket of data breach notification laws state-by-state, click here. Please remind me who has information power: Consumers – or the companies that use their data? I keep forgetting.
Sometimes, a company has good reasons to delay disclosure about a data breach. For example, a forensic investigation might require secrecy. Or, a company might need to learn the full extent of a breach before sharing information with customers. But proving malfeasance can be difficult. Equifax hasn’t disclosed exactly why it waited weeks to inform customers about the breach, but during that time, its senior executives sold millions of dollars of Equifax stock. A company spokesperson told The Washington Post that at the time, the company’s executives had no knowledge of the breach.
“No knowledge” . . . of 143 million stolen records . . . That sounds far-fetched to me, too. The hacking and the timing of the stock sale brought Equifax CEO Richard F. Smith in front of Elizabeth Warren for a Senate hearing. He had some ’splaining to do, and he didn’t project well in front of the cameras. In characteristic fashion, Senator Warren emasculated him the same way she did Wells Fargo CEO John Stumpf. In a company press release from September 7, after the breach was publicly disclosed, Smith said, “We pride ourselves on being a leader in managing and protecting data, and we are conducting a thorough review of our overall security operations.” I reluctantly believe the second part of that sentence, but I’m calling BS on the first.
Personal privacy faces unprecedented threats. In the digital era, almost everything we do leaves a trail of recorded transactions and events that is harvested, curated, and maintained as a fungible asset. After it’s released into the cloud, it’s hard to know who – or what – controls it. We now have far less control over our data privacy than we did 30 years ago, when smaller amounts of our data were housed in cumbersome silos. But the silos came down, along with the cost of storing data. Most significantly, companies have developed sophisticated tools and algorithms to systematically exploit that data. As a result, corporations and government agencies have steadily gained hegemony over our data and our privacy. Today, they control five powerful variables that profoundly affect our privacy: capture, use, retention, protection, and ownership of consumer data.
Capture: Though we might not be aware, we routinely give up a mother lode of personal details every second through our mobile phones, wearable sensors, and the IoT (Internet of Things). Even more digital exhaust gets captured and recorded through devices like Amazon’s Alexa and Google Home, which are, at their core, eavesdropping devices. But for now, I’ve stopped fussing, because doing so is like peeing into the wind. Nobody wants to give up the latest digital gadgetry, or to suffer the indignities of not having “personalized experiences.” I concede that the data capture pig has permanently left his pen. He has grown too strong to restrain, too fat to ever fit back through the gate.
From here, I’ll focus on manageable issues:
Use: Companies can – and should – be held accountable for how they use personal information. “Compliance with federal and state laws” does not absolve them from providing responsible governance and strong consumer protection.
Retention: How long should companies keep personal data? Do customers have a right to be forgotten, as Europe’s General Data Protection Regulation (GDPR) will provide? What should companies disclose to consumers? And is it ethical for companies to charge customers for deleting personal information? These are not hypothetical questions: they were at the center of the Ashley Madison hacking case, in which the company assessed its customers a fee for expunging their personal data. Rather than deleting the data as promised, Ashley Madison simply changed the record status to “inactive” and moved the files to a backend server. Those records were stolen in the hack, along with the active ones.
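To see why that distinction matters, here is a minimal sketch in Python, using an invented table and invented field names, of a “soft delete” of the kind Ashley Madison reportedly performed versus an actual erasure. A record that is merely flagged inactive is still sitting in storage, and still there for an intruder to steal:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (id INTEGER PRIMARY KEY, email TEXT, status TEXT)")
conn.execute("INSERT INTO members VALUES (1, 'user@example.com', 'active')")

def soft_delete(member_id):
    # What a paid "delete" can amount to: the row is flagged, not removed.
    conn.execute("UPDATE members SET status = 'inactive' WHERE id = ?", (member_id,))

def hard_delete(member_id):
    # Actual erasure: the personal data is gone from this table.
    # (Backups and downstream copies still need their own retention policy.)
    conn.execute("DELETE FROM members WHERE id = ?", (member_id,))

soft_delete(1)
print(conn.execute("SELECT * FROM members").fetchall())   # row still present, still stealable
hard_delete(1)
print(conn.execute("SELECT * FROM members").fetchall())   # []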
Protection: As data has increased in value, it has become a more attractive target for theft. The threats are significant and omnipresent. But regulation and corporate risk mitigation measures such as cybersecurity haven’t kept pace. The nine largest consumer data hacks leading up to Equifax in 2017 illustrate the outcomes when companies are lackadaisical about data security. The direct costs are substantial, the indirect costs incalculable.
Ownership: The question of data ownership is central to preserving consumer privacy. But sometimes, provenance and ownership are difficult to track. Further, Terms of Service statements aren’t explicit, and once consumers have created data, they rarely control it. And since data can be copied, sold, repackaged, and re-sold, custodianship and responsibility for protecting consumer data become murky. Yet privacy preservation in the digital age demands clarity over this basic matter.
The road ahead. “The increased amount of and use of data calls into question pressing issues of fairness, responsibility and accountability, and whether existing legislation is fit to safeguard against harm to an individual or group’s privacy, welfare or physical safety,” according to the Open Data Institute’s September 13, 2017 report, Ethical Data Handling. Public safety should not be taken for granted, and the benevolence of government never assumed. As I wrote in an article, The Dark Side of Online Lead Generation, companies routinely use data to exploit the most vulnerable consumers – who can also be the most profitable.
Access to consumer information confers an obligation on an organization to
1. be transparent about ownership,
2. control how the data is used,
3. ensure the data is protected from unintended use, and
4. ensure that consumers will not be harmed.
This is a monumental order. Why would any company voluntarily sign up? Because it’s the ethical, customer-centric thing to do. But there are other advantages, too: First, without adopting constraints, businesses will undermine their revenue generation efforts. Ethical data governance enables trust. Second, companies that demonstrate strong data governance will achieve competitive advantages over ones that are sloppy and uncaring.
To create good privacy outcomes for customers, executives must answer five data-governance questions:
1. Does our use of the data reflect consumer preferences?
2. Is our intended use for this data ethical?
3. Is our intended use fair and respectful to our customers and prospects?
4. Have our customers been provided any control over how their data is collected, stored, and used?
5. Is our organization appropriately transparent about our intentions, policies, and safeguards?
The as-is state, circa February 2019. A passage extracted from a privacy statement that recently landed on my desk serves as a shiny emblem for how far we need to go with data governance:
“To protect your personal information from unauthorized access and use, we use security measures that comply with federal and state laws. These measures include computer safeguards and secured files and buildings. We limit access to your information to those who need it to do their job.”
These strictures, if you want to call them that, offer scant solace.
When a company captures or requests information from customers, it should reveal:
1. what data is being collected
2. the entity or company that owns the data
3. who has access to that data
4. specifics regarding how the data will be used
5. existing internal measures that protect confidentiality
6. whether the data will be shared with third parties, which ones, and for what purpose(s)
7. the length of time that data will be retained
8. customer rights for data erasure and/or amendment
9. where to go within the organization for redress of consumer issues regarding data
10. the federal, state, and local laws that govern the company’s use of that data
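None of this has to stay buried in legalese. As a thought experiment – not a standard, and with field names I have invented for illustration – those ten disclosures could even be published as a simple, machine-readable record that a customer or auditor could inspect:

# A hypothetical disclosure record covering the ten points above.
# Field names and values are invented for this sketch; they reflect no real standard or company.
data_disclosure = {
    "data_collected": ["name", "email", "purchase history"],
    "data_owner": "Example Retail Co.",
    "who_has_access": ["customer service", "fraud prevention team"],
    "intended_uses": ["order fulfillment", "fraud detection"],
    "internal_protections": ["encryption at rest", "role-based access control"],
    "third_party_sharing": [{"party": "payment processor", "purpose": "billing"}],
    "retention_period_days": 730,
    "customer_rights": ["erasure on request", "correction on request"],
    "redress_contact": "privacy@example.com",
    "governing_laws": ["applicable state breach notification statutes", "GDPR, where applicable"],
}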
“Business data is everything. Protect it well,” reads a full-page ad for Carbonite, a data security company. In previous articles, I have complained about companies trivializing C-Level roles (do companies really need Chief Listening Officers?). But the tocsin that just sounded from the Equifax hack tells us that it’s past time to give Data Governance a chair in the C-Suite. (Note: the GDPR mandates that certain companies appoint a Data Protection Officer, or DPO.) Should a new position be created: CDGO, Chief Data Governance Officer? I’ll make the case in a future article.
A regulated market? “A regulated national information market could allow personal information to be bought and sold, conferring on the seller the right to determine how much information is divulged,” Kenneth Laudon of New York University wrote in a 1996 article titled Markets and Privacy. He was ahead of his time. “More recently, the World Economic Forum proposed the concept of a data bank account. A person’s data, it suggested, should ‘reside in an account where it would be controlled, managed, exchanged and accounted for.’ The idea seems elegant, but neither a market nor data accounts have materialized yet,” according to The Economist (Fuel of the Future, May 6, 2017).
Ethical data governance: the way forward. A white paper, Guiding Principles for the Ethical Use of Data by Jennifer Glasgow and Sheila Colclasure, offers a clear case for corporate data governance: “As in any relationship, business or otherwise, trust needs to be earned, sustained and nurtured over time. To succeed in the long run, brands have to first be accountable. Therefore, a common understanding of what it means to act ethically with consumer data is required. Without a common set of rules or proper governance, it’s unrealistic to assume brands across a vast marketplace can meet this expectation and maintain the trust of the consumers they serve over time.”
Sounds like common sense. Why, then, do so many companies choose a riskier, ethically shaky path? Greed? Naivete? Stupidity? Lack of will? It’s hard to say, exactly.
Listen / Do you want to know a secret / Do you promise not to tell?
Ethical data governance will help companies fulfill this critically important consumer expectation.
Author’s note: To read the previous articles in this series about data privacy and risk, please click the links below:
In the Digital Revolution, Customers Have Nothing to Lose But Their Privacy
Companies That Abuse Consumer Privacy Might Feel Their Fury – Again