With the ownership of data resting squarely with consumers, a question arises over whether they're getting their money's worth. All organisations benefit from using data, from delivering more targeted marketing to improving supply chain efficiency, workforce optimisation and customer support. Consumers share in some of these perks; however, the value exchange sometimes feels weighted in the business's favour.
New ways to monetise data
Then there are the organisations that sell their users' data for profit. Facebook and Google are well-known examples, but even public bodies such as HMRC are in on the act. Reports suggest that MPs are looking at new ways to sell taxpayer data to third parties. Under such circumstances, surely consumers deserve a slice of the profits?
In response, some consumers are exploring other ways of monetising their data – namely, selling it to the highest bidder.
Selling your data soul
Dutch student Shawn Buckles kicked off the debate on selling personal data by offering his 'data soul' at auction. The winning bidder would receive a bundle of Buckles' private information, including his email conversations and browsing history. The Next Web won the auction for £288.
Data selling services
Hot on his heels, a number of specialised services have opened to capitalise on this growing market, though many are still in their infancy. Ocean Protocol, for example, is a decentralised data exchange that enables users to share and monetise their data while adding a layer of transparency, accountability and compliance to proceedings.
Datacoup is a more established company, founded in 2012. It advertises itself as the "World's first personal data marketplace" and allows consumers to connect different data sources, such as apps, to a 'data profile' that then earns them money. Different data is worth varying sums, based on demand from the marketplace. However, the total earnings are likely to be small.
Little monetary gain
This is something that Buckles also highlighted when selling his data. Individual data isn’t worth very much to buyers. Brands don’t purchase individual data. Instead, they buy bundles – and like most bulk purchasing, that makes the individual units of data very cheap. Cambridge Analytica (of the Facebook/Cambridge Analytica scandal) reportedly paid 75 cents to $5 for each piece of personal data.
Some companies don't offer money at all. Blockchain-based Wibson offers its users a 'Wibson token' in exchange for their data. This cryptocurrency can be traded on four different platforms for other cryptocurrencies and services.
Selling data consent
Another proposed monetisation method is to sell the consent surrounding personal data. Professor Mindaugas Kiskis suggests this angle, as consent would make the data legitimate and increase its value. He points out: "Contrary to data, it's not easy to get consents without us. By offering and pooling consents, each of us may get part of the profits in data business, and more control over it."
However, this may infringe the GDPR. Under the Regulation, consent must be "freely given", meaning it is offered without coercion or undue incentives. Breaching the GDPR would render such data worthless.
A wider shift in data value
Selling personal data is unlikely to make consumers millionaires. Yet it signals a wider shift at play: one where consumers are more aware of the value contained within their data. Business leaders would do well to heed this awareness, because if you fail to value a customer's data, they will be quick to take it elsewhere.
You don’t have to be the highest bidder. Just the one that consumers trust and respect the most.
Although the General Data Protection Regulation (GDPR) has clarified data ownership in Europe and the UK, there is still some confusion about who owns data and how that ownership can be enforced. As Mike Dougherty, CEO of adtech company Jelli, explains, "Under GDPR law, the individual owns the rights to their data, with a few exceptions. They ultimately have the final say, not the company that possesses it – whether obtained through consent or not."
This is seconded by Julia Stead, VP of Marketing at analytics company Invoca, who notes that tech companies are quick to avoid the thorny issue of data ownership. "Data giants like Google and Facebook are very careful to avoid mentioning ownership in their data collection policies; they focus on collecting and storing user data," she explains.
Under each interpretation, it appears that the consumer does have ultimate control over their personal data and can make requests… to a certain degree.
Organisations are responsible for data
So, GDPR gives consumers more rights over their personal data. However, once they share it, their control over it lessens and other entities become responsible for it. Companies that use personal data become stewards of it – a position that requires ethical data use and effective data security.
Failing consumers
Thus far, organisations haven't lived up to this responsibility. Data breaches are frighteningly common. Recently, pregnancy club Bounty was fined £400,000 for sharing the personal data of 14 million members with credit reference and marketing agencies. In particular, the company was reprimanded by the Information Commissioner's Office for selling the information of potentially vulnerable new mothers and mothers-to-be, as well as the birth dates of newborns.
Microsoft has recently admitted that hackers gained access to some Outlook users' emails. Accounts were compromised for three to six months, and the company has yet to reveal the number of customers affected.
Even the UK Government is not immune. The Home Office has issued an apology for a breach of EU citizens' information, after it unwittingly shared the details of 240 people seeking 'settled' status post-Brexit. An email to the individuals failed to use the 'BCC' field, revealing their details to every other recipient of the message.
Each case reveals a different way in which organisations have failed to use and protect personal data responsibly. It highlights how easy it is for companies to fall short when handling consumer data. For the public to trust organisations with their personal data, issues like data breaches and hacks must be resolved swiftly. They cannot remain as commonplace as they are today.
The data value exchange
Which brings us back to the question of ownership. Individuals hold ultimate ownership of their data, but they relinquish its protection to organisations. They do so as part of a value exchange: the business gains insights to improve its operations, and individuals benefit from tailored marketing, product recommendations and so forth.
The pressure on businesses to secure that data effectively is only going to grow. Post-Cambridge Analytica, consumer trust is on shaky ground and expectations are high. Organisations must recognise this and meet those expectations. Strong relationships between data owners (consumers) and stewards (organisations) are key to the future of data. If one party doesn't trust the other, then neither can unlock value from personal data.
Technology leaders hold tremendous power to shape the world in a positive way. Responsible use of technology will support society’s needs and ensure a sustainable future. However, thus far, the irresponsible and unethical use of technology has marred the industry.
Data breaches are a common scandal. In 2018, T-Mobile suffered a hack that exposed the details of two million customers, including encrypted passwords. British Airways was also hacked, with 380,000 payment details stolen. The scandal left panicked customers scrambling to contact their banks and credit card providers. Recently, hackers published the personal details of thousands of FBI agents and law enforcement officers, making them and their families vulnerable to attack and potentially blowing their cover. Breaches are a worrying development, with new reports of attacks and leaks surfacing every few weeks.
Scandals that damage trust
Then there is the unethical use of data, highlighted by the Cambridge Analytica scandal, which resulted in deep public distrust of data use. Only a fifth of the UK public trust organisations to store and use their data. That trust has been further weakened by Facebook continuing to share data with third parties without explicit consumer knowledge or consent. Sharing arrangements with 150 organisations were revealed after the Cambridge Analytica scandal, including retailers, other tech companies, media organisations, publishers and automotive manufacturers.
Time to rebuild the relationship
To rebuild trust in data and the wider tech sector, tech leaders must do more to address consumer concerns. They also have to go a step further, ensuring sustainable business practices and actively tackling society's most critical issues. Some 76% of the public want to see CEOs actively driving change in society, instead of waiting for governments to impose it.
Positive uses are overshadowed
Overshadowed by scandal are many positive uses of technology. AI is being used in medical research to detect breast cancer with 99% accuracy, for example. It also has potential applications in the circular economy, with AI designing food waste out of supply chains – value creation estimated at US$127 billion a year by 2030.
Communicating such benefits and positive advances would go a long way towards regaining public trust in the sector.
Increased regulation
Concerns around the impact of technology are translating into increased scrutiny and regulation. The EU's General Data Protection Regulation (GDPR) was introduced to give individuals greater power over, and ownership of, their personal data. Likewise, governments are debating regulations for the sharing economy, gig workers, the policing of online content and the breaking up of big tech. All are products of technology, each bringing its own pros and cons.
Hard to predict tech’s impact
Indeed, few tech leaders could have foreseen the long-term implications of Facebook, Uber, Airbnb and Twitter when they were first founded. We couldn't have predicted the influence Facebook would have on the U.S. elections or the Brexit vote. Nor could we have envisioned the widespread disruption to the travel and transport industries caused by Uber and Airbnb.
With more advanced developments on the horizon, such as deep learning and autonomous vehicles, the onus is on tech companies to make changes that have a net-positive impact on society. The time is now for technology leaders to ask reflective questions about the use and role of technology, and to learn from the unintended outcomes of unchecked technology development and use.
Place humanity at the centre
It is time for humans to be placed squarely at the centre of all tech development: not creating newer and more advanced technology simply to prove that we can, but developing technology that builds a better world for generations to come.
Oversight and accountability
To achieve this, we need oversight and accountability. This is something governments are trying to implement, but they lack the detailed understanding and insight of those within the industry. Therefore, it falls to tech leaders to regulate themselves and hold one another accountable.
Technology should showcase best practice. It must be a symbol of purpose and positive societal progress. Do that, and profits will follow. Do the opposite, and the power of technology will be forever undermined. We're in this industry to change the world. Make sure it's for the greater good.