“Don’t follow the crowd” is a good theory, but devilishly difficult in practice when all your rivals appear to be leaping ahead.
Cloud-based communication platform Slack, owned by the proudly “trust first” Salesforce, found itself in precisely this awkward situation when angry customers pointed out that its privacy policy was vague about what the company was doing with their sensitive data.
Because millions of businesses use Slack to discuss confidential internal topics, the suggestion that these conversations might be fed into large language models (LLMs), and potentially surface publicly, spooked many of its customers.
Slack’s privacy principles stated: “Machine Learning (ML) and Artificial Intelligence (AI) are useful tools that we use in limited ways to enhance our product mission. To develop AI/ML models, our systems analyse Customer Data (e.g. messages, content, and files) submitted to Slack as well as other information (including usage information) as defined in our privacy policy and in your customer agreement.”
Meanwhile, its website said, “Work without worry. Your data is your data. We don’t use it to train Slack AI.”
Slack said this policy was written a long time ago, before LLMs became a huge technology, but that it would update these standards to be more in line with the age of artificial intelligence (AI).
Slack dealt with the problem quickly, but this story offers crucial lessons for how businesses should be thinking about data if they decide to use it in any capacity.
For a start, the race is now on to develop the best LLMs, and the profit motive mixed with the first-mover dynamic means companies are adopting an “act now, ask forgiveness later” approach to all things AI. That’s just how business operates, unfortunately.
Rather than complain about some corporations acting unethically, it is much smarter for businesses to assume that anything shared over the internet may be compromised in some way. We wish this weren’t the case, but story after story proves that companies can no longer be trusted to guarantee customer privacy when handling personal data.
The only option to minimise the risk of private messages leaking to the public is to train staff never to discuss sensitive information online.
This sounds like terrible advice, but did you know it is still possible for companies to communicate without using the internet? About ten years ago, major banks like Goldman Sachs and Morgan Stanley were worried about the direction of the internet and decided that all discussions above a certain level of sensitivity should be conducted by fax.
That’s right. Just when everyone thought the days of the landline were over, it turns out that fax machines were among the only remaining communication tools that do not store a copy of the information passing through their wires. Goldman Sachs no doubt still uses third-party apps for most internal discussions, but for its most sensitive information, the fax machine remains the tool of choice.
If faxing documents in 2024 is good enough for the world’s largest banks, surely it’s good enough for your company, too?
The other lesson of the Slack story is that the more companies break trust with their customers, the bigger the market becomes for any firm that can protect customer privacy.
Part of Slack’s updated privacy policy was that users who are concerned about their data can simply opt out. An opt-out clause allows third-party companies to say, plausibly, that customers were warned in the terms and conditions (which no one reads), so they can continue using customer data under the full protection of the law.
This is a horrible strategy for building the intangible asset of trust.
A more trustworthy strategy is to set an opt-in clause for using customer data instead. This would force the software company to be honest about how it is using any private data while also bringing its customers along for the journey.
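The practical difference between the two clauses comes down to a default. A minimal sketch of that default, using hypothetical field and function names (nothing here is from Slack’s actual systems), might look like this:

```python
from dataclasses import dataclass

@dataclass
class UserPrivacySettings:
    # Opt-in model: use of customer data for AI training is OFF
    # unless the user explicitly switches it on.
    allow_ai_training: bool = False

def can_use_for_training(settings: UserPrivacySettings) -> bool:
    """Data may only be used if the customer has actively consented."""
    return settings.allow_ai_training

# A brand-new user is protected by default...
new_user = UserPrivacySettings()

# ...and only an explicit choice changes that.
opted_in_user = UserPrivacySettings(allow_ai_training=True)
```

Under an opt-out model, the default would simply flip to `True`, and silence would count as consent — which is exactly why opt-out erodes trust while opt-in builds it.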
There’s also a high chance that customers will happily choose to opt in if the incentives are well-structured. For example, it may be smart to offer customers a discount on purchases or subscription prices if they allow the company to use their personal data on upcoming projects.
Another angle would be to change the narrative about why people’s personal data is so useful for both the company and the customer. Presently, the narrative is that this data is valuable only to businesses, so why should a customer contribute her personal data if it is just going to line the pockets of shareholders?
But a better narrative about “opting in” is that a customer’s data helps the company create a better product. Since the customer is already paying for the product, and contributing their data could make the next version much better, why not do it? The business might get more from this trade than the customer does. But just because one side gains more does not mean the other side loses. A “win less/win more” situation is still a win-win.
An opt-in policy means the customer can always say stop. It’s up to each customer to weigh “getting a better product” against “the risk of exposing their private data.” Some will lean one way, some the other.
These approaches to data are about enhancing trust with customers, which is a fundamental intangible asset. The Slack story reinforces the imperative that CEOs must think deeply about using data wisely, especially as they race to catch up with their peers and rivals.