Clubhouse's security and privacy lag behind its explosive growth

Clubhouse still has a long way to go in assuring its users that its privacy and security policies are fully in place.

Carsten Koall | Getty Images

In recent months, the audio-based social media app Clubhouse has emerged as Silicon Valley's latest disruptive darling. The format feels familiar: part Twitter, part Facebook Live, part talking on the phone. But as Clubhouse has continued to expand, its security and privacy failings have come under increasing scrutiny, sending the company scrambling to correct problems and manage expectations.

Clubhouse, still in beta and available only on iOS, offers its users "rooms" that are essentially group audio chats. Rooms can also be set up as public addresses or panel discussions, where some users are "speakers" and the rest are audience members. The platform is reported to have more than 10 million users and is valued at $1 billion. Since last year it has been an exclusive haven for the Silicon Valley elite and celebrities, including an appearance by Elon Musk earlier this month. But the company has struggled with concrete security issues and lingering questions about how much privacy its users should expect.

"With smaller, newer social media platforms, we need to be on our guard about our data, especially when one is going through huge growth that tests a lot of its controls," says security researcher Robert Potter. "Things you may have gotten away with when there were only 100,000 people on the platform: you increase the numbers tenfold, and the level of exposure goes up, the threat goes up, the number of people probing your platform goes up."

Recent concerns about Clubhouse's security run the gamut from specific vulnerabilities to questions about the app's underlying infrastructure. A little over a week ago, researchers at the Stanford Internet Observatory put a spotlight on the platform when they found that the app was transmitting users' Clubhouse identifiers and chat room IDs unencrypted, meaning a third party could potentially have tracked your actions in the app. The researchers further pointed out that some of Clubhouse's infrastructure is run by a company based in Shanghai, and that the app's traffic appeared to be routed at least partly through China, potentially exposing users to targeted or even widespread Chinese government surveillance. Then on Sunday, Bloomberg confirmed that a third-party website was scraping and compiling audio from Clubhouse conversations. Early Monday came further revelations that Clubhouse discussions were being scraped for an unaffiliated Android app, allowing users on that operating system to listen along in real time.

Potter, one of the researchers who investigated Clubhouse's various data-scraping projects, explains that these apps and websites did not seem malicious; they simply wanted to make Clubhouse content available to more people. But the developers could only do so because Clubhouse lacked anti-scraping mechanisms that could have stopped them. Clubhouse, for example, did not limit how many rooms a single account could stream simultaneously, allowing someone to build an application programming interface to stream every public channel at once.
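The missing control is simple to sketch. Below is a minimal, hypothetical illustration (not Clubhouse's actual code, and the limit value is invented) of a server-side cap on how many rooms one account can stream at the same time:

```python
# Hypothetical sketch of a per-account concurrent-stream cap, the kind of
# anti-scraping control the researchers say Clubhouse lacked.

MAX_CONCURRENT_STREAMS = 2  # illustrative limit, not a real Clubhouse value

class StreamLimiter:
    def __init__(self, max_streams=MAX_CONCURRENT_STREAMS):
        self.max_streams = max_streams
        self.active = {}  # account_id -> set of room_ids currently streamed

    def start_stream(self, account_id, room_id):
        rooms = self.active.setdefault(account_id, set())
        if len(rooms) >= self.max_streams:
            return False  # reject: account is already at its limit
        rooms.add(room_id)
        return True

    def stop_stream(self, account_id, room_id):
        self.active.get(account_id, set()).discard(room_id)

limiter = StreamLimiter()
assert limiter.start_stream("acct1", "room_a")
assert limiter.start_stream("acct1", "room_b")
assert not limiter.start_stream("acct1", "room_c")  # third stream rejected
```

Without a check like this, one account can fan out across every public room at once, which is exactly what the stream-rebroadcasting projects did.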

Mature social networks like Facebook have more developed mechanisms for locking down their data, both to prevent violations of users' privacy and to defend the data they hold as an asset. But even they can still have potential exposure to creative scraping techniques.

Clubhouse has also faced scrutiny for its aggressive collection of users' contact lists. The app strongly encourages all users to share their address book data so Clubhouse can help you connect with people you know who are already on the platform. It also requires you to share your contact list in order to invite other people to the platform, as Clubhouse is still invite-only, which lends an air of exclusivity and privacy. Many users have pointed out, though, that when you go to invite others, the app also makes suggestions based on which phone numbers in your contacts are also in the contacts of the largest number of Clubhouse users. In other words, if you and your local friends all use the same florist, doctor, or drug dealer, that person may very well show up on your list of suggested people to invite.
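Mechanically, a suggestion feature like that amounts to an intersection count over uploaded address books. The sketch below is an illustrative reconstruction under that assumption, not Clubhouse's actual algorithm:

```python
# Illustrative reconstruction (not Clubhouse's real code) of a contact-based
# invite ranking: numbers in your address book are ranked by how many
# existing users also have them saved.
from collections import Counter

def suggest_invites(my_contacts, user_contact_lists):
    """my_contacts: set of phone numbers from your address book.
    user_contact_lists: contact sets uploaded by existing users."""
    counts = Counter()
    for contacts in user_contact_lists:
        for number in set(contacts) & set(my_contacts):
            counts[number] += 1
    # Most-shared numbers first: this is why a florist or doctor whose
    # number many users have saved can float to the top of the suggestions.
    return [number for number, _ in counts.most_common()]

existing_users = [
    {"555-0100", "555-0199"},   # user A's uploaded contacts
    {"555-0100", "555-0142"},   # user B
    {"555-0100"},               # user C
]
print(suggest_invites({"555-0100", "555-0142", "555-0777"}, existing_users))
# "555-0100" ranks first: all three existing users have it saved
```

The privacy problem is visible in the data model itself: the ranking can only be computed because everyone's address book has been uploaded and cross-referenced, including numbers belonging to people who never joined the platform.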

Clubhouse did not respond to WIRED's request for comment on its recent security issues. In a statement to the Stanford Internet Observatory researchers, Clubhouse outlined specific changes it planned to make to strengthen its security, including cutting off pings to servers in China and strengthening its encryption. The company also said it would work with a third-party security firm to implement the changes. In response to the unaffiliated website that rebroadcast Clubhouse discussions, the company told media outlets that it had permanently banned the user behind it and would add additional "safeguards" to keep the situation from recurring.

Although Clubhouse seems to be taking researchers' feedback seriously, the company has not been specific about all the security improvements it has implemented or plans to add. Since the app does not appear to offer end-to-end encryption to its users, researchers say there is still a sense that Clubhouse has not given sufficient thought to its security posture. And that's even before you wrestle with some of the fundamental privacy questions the app raises.

When you start a new Clubhouse room, you can choose from three settings: an "open" room is accessible to any user on the platform, a "social" room only admits people you follow, and a "closed" room restricts access to invitees. Each comes with its own implicit level of privacy, which Clubhouse could make more explicit.

"I think for public rooms, Clubhouse should give users the expectation that public is public to all users, since anyone can join and record, take notes, etc.," says David Thiel, chief technology officer of the Stanford Internet Observatory. "For private rooms, they can convey that, as with any communication mechanism, an authorized member can record contents and identities, so make sure you both establish expectations and trust the participants."

Like any prominent social network, Clubhouse has also struggled to deal with abuse on its platform. The app's terms of service have banned hate speech, racism, and harassment as of November, and the platform offers some moderation features, such as the ability to block users or flag a room as potentially abusive. But one of Clubhouse's biggest draws is also a problem for abuse: people can use the platform without the accountability of having their contributions automatically saved as posts. This may embolden some users to make abusive or derogatory remarks, thinking they won't be recorded and won't face consequences.

Stanford's Thiel says Clubhouse currently retains recordings of discussions temporarily to review in case of abuse claims. However, if the company were to implement end-to-end encryption for security, it would have a much harder time holding abusers accountable, because it would not be able to make those recordings so easily. Every social media platform faces some version of this tension, but security experts agree that, where relevant, the benefits of adding end-to-end encryption are worth the challenge of developing more sophisticated and creative anti-abuse solutions.

Even end-to-end encryption would not eliminate the additional possibility that any Clubhouse user could externally record the conversation they're in. That isn't something Clubhouse can easily solve. But it can at least set expectations accordingly, no matter how friendly and off-the-record a conversation feels. "Clubhouse just needs to be clear about what it's going to contribute to your privacy," Potter says, "so you can set what you're going to talk about accordingly."

This story originally appeared on wired.com.
