Slack Faces Backlash Over Controversial AI Training Policy

Concerns have been mounting over how major tech companies use individual and business data to train their AI services, and the latest flashpoint is Slack. Users are voicing frustration over the Salesforce-owned platform's aggressive push into AI.

Like many companies, Slack is using its own user data to train some of its new AI services. But it turns out that if you want Slack to stop using your data, you have to email the company to opt out, a requirement many users find surprising and inconvenient.

The details of the opt-out process were buried in an out-of-date, confusingly written privacy policy that had gone largely unnoticed. That changed when a disgruntled user posted about it on Hacker News, a forum popular with developers. The post quickly gained traction, sparking a wider discussion of Slack's AI training practices.

The thread began when a Hacker News user called out Slack's AI training practices with a straightforward link to its privacy principles. The page reveals that Slack enrolls users in AI data training by default, requiring an email from anyone who wants to withdraw consent. That prompted a flood of questions across various platforms, particularly about a relatively new feature called "Slack AI": many users were baffled that the privacy policy makes no explicit mention of the feature or of how it relates to user data.

Compounding the confusion, users are irritated at having to send an email to opt out, especially when Slack's own marketing insists that "You control your data." The reaction may be new, but the policies are not: according to the Internet Archive, these terms have been in effect since at least September 2023. We have reached out to Slack for confirmation.

As outlined in the privacy policy, Slack uses customer data to train "global models" that power features such as channel and emoji recommendations and search results. A Slack representative clarified that this data usage has specific boundaries.

“Slack employs platform-level machine learning models for features like channel recommendations and search results. We do not design these models to learn, retain, or reproduce any customer data,” the spokesperson stated. However, the policy lacks clarity on the broader scope of the company’s AI training strategies.

Slack says that users who opt out still benefit from its "globally trained AI/ML models," which raises a further question: if the models are already globally trained, why does Slack need customer data for features like emoji recommendations at all?

Furthermore, the company maintains that customer data is not used to train Slack AI.

“Slack AI is an optional add-on that utilizes large language models (LLMs) without incorporating customer data. These LLMs are hosted on Slack's AWS infrastructure, ensuring that user data remains within the organization and is not shared with external LLM providers, keeping customer data under their control,” said the spokesperson.

Some of the ambiguity around how Slack uses information is likely to be resolved soon. Responding to criticism on Threads, Slack engineer Aaron Maurer acknowledged that the company needs to update its privacy policy to clarify how it applies to Slack AI. Maurer noted that the existing terms were written before Slack AI launched and were focused on search and recommendations, so forthcoming revisions should help address the current confusion around Slack's AI work.

Slack's situation is a potent reminder that, in the fast-moving world of AI, user privacy must remain a priority, and that companies owe their users transparent terms of service spelling out how and when their data is used.

