
Clubhouse has the whole internet talking, but not always for a good reason

There is no question that technology has achieved some pretty cool things. When the pandemic shut down the world, technology made sure things didn’t go completely dark. Our devices became our window to the outside world. Through video conferences, we were able to see the family and friends we were prohibited from visiting. Through social media like Facebook, Twitter, and Instagram, we were able to connect with strangers all sharing similar experiences of lockdown isolation, boredom, and sheer loneliness.

So it’s no wonder TikTok boomed in popularity, or that YouTube viewership increased, or that Twitter users set a record during the pandemic. Podcasts have done particularly well, too; it seems like everyone with a platform started a podcast in 2020, and it kept a lot of people sane.

With all the different corners of today’s internet, who could possibly think of another app that is unique and original enough to draw throngs of users? 

Entrepreneurs Paul Davison and Rohan Seth, that’s who. In 2020, these tech creatives launched Clubhouse. With people stuck at home and craving something new, the app couldn’t have come at a better time. 

Clubhouse is an audio-only app, designed to bring people together while avoiding the artifice that accompanies pic-posting, gaudy profile headers, and long, preachy posts. In fact, users must provide their full names to make the experience as authentic as possible. Bringing people together to have real conversations (instead of text-only shouting matches on Twitter) is what Clubhouse is all about. Think of it as a virtual room where you can have an intimate phone conversation with as many (or as few) people as you’d like. It all depends on the room’s moderator/host and, of course, the preferences of the audience members.

The catch to all this inclusivity? It’s exclusive to iPhones. And it’s invite-only. 

Doesn’t this make Clubhouse’s come-one-come-all vibe an oxymoron? Despite receiving attention from the giants of Silicon Valley and the financial support of over 180 investors (in January it was valued at $1 billion), the app is still in beta testing. So right now, it’s all about who you know; if you don’t have a friend who is an existing user to shoot you an invite link, you get stuck on the purgatory waiting list like me.

But there are other issues that users have forced the app to address, even as it remains in testing. During a September chat hosted by activist Ashoka Finley about anti-Semitism and the Black community, the panel was infiltrated by the very thing panelists condemned: anti-Semitic slurs. Although the community guidelines allow users to report each other for harassment or threats, the app does not address how difficult it is to screen or silence users when a chat room has so many participants. Because of this, many users no longer felt safe.

Furthermore, this was not an isolated incident. As recently as April 18, Clubhouse again had to shut down multiple chat rooms hosting anti-Semitic discussions, where people slammed “Jewish privilege” and promoted stereotypes. Clubhouse was quick to condemn these chat rooms, noting that hate speech violates its Community Guidelines. But just like that, Clubhouse’s conversation-embracing idealism crumbled.

This exemplifies the importance of policing hate speech online, particularly on an app that can have up to 2 million weekly users who can say anything at any time before being blocked. While apps like Twitter and Facebook allow users to block others before seeing offensive content, Clubhouse does not. Despite the ability to report users on Clubhouse and the way moderators can pre-screen the individuals they allow into their chat rooms, the app faces a unique challenge in moderating offensive language. Because the app is audio-only, there is no way to block hate speech through text filtering, as most social media platforms do. Words, once spoken, are irretrievable.


Moderators can wield much control over who listens in their chat room, but the problem lies in just that. Not everyone who moderates a chat room knows how to police hate speech and keep rowdy individuals under control, making it difficult for other listeners to avoid hearing triggering or offensive sentiments. Currently, moderators are simply other people who use the app: not necessarily trained, or even prepared, for the task.

Moderation should be the app’s burden, not that of users hosting a session. While moderators have the ability to mute listeners (you have to request to speak in order to be heard), many do not want to be seen as silencing people and undermining Clubhouse’s appeal as a home for open conversation. This raises the question of whether moderators themselves end up policing freedom of speech.

Does policing contribute to cancel culture? Perhaps it does, but it’s a necessary evil if it means ensuring that everyone has a safe space. Because Clubhouse (and the internet itself) welcomes everyone, do safe spaces truly exist online? 

If you violate the Community Guidelines, Clubhouse can suspend or terminate your account. The company also acknowledges how hard it is to track users who spread hate speech. Its current system records sessions to review for violations of the Community Guidelines. However, the recording automatically stops and deletes itself if no user files a report during the live event itself; if a user wants to make a report after the event, the recording will have already been deleted.

As a result, the system puts the burden of reporting entirely on users, who may themselves be the victims of hate speech, to report right then and there. In many of these Clubhouse rooms, it’s likely that offensive remarks won’t be reported immediately, accounts will remain in place, and hate speech will stay on a platform that is supposed to shield communities from feeling unwelcome.

According to Reuters, Davison has been employing “safety teams” in multiple languages to investigate troubling incidents. A Clubhouse representative also shared that the app uses internal and external services and advisors to filter content that violates the rules, but they did not provide specifics as to how they operate. It is expected that the app will eventually have more tools for detecting and blocking hate speech, to be used by both session hosts and audience members. As of right now, all of these plans are still in the trial phase. After all, the app is still in beta testing.

But while Clubhouse grapples with living up to its promise, it already has big competition. On its heels, major companies have jumped on the audio-only bandwagon, with Facebook building a similar rooms feature and Twitter launching Spaces. Clubhouse, although already backed by major Silicon Valley players, will have to keep up with the current elites. And speaking as someone who created an account but is still on the waiting list, I’d say the app has a long way to go.


By Laurie Melchionne

Laurie Melchionne is the editor in chief at The Argo, Stockton University's independent student newspaper. Laurie majors in Literature with a double minor in Journalism and Digital Literacy/Multimedia Design. With a concentration in creative writing, Laurie loves all things editorial and communications, and believes in people sharing their voices through the written word.