Anonymous social apps are on the rise again, but there are risks

Have you ever told a stranger a secret about yourself online? Did you feel a certain freedom in doing so, precisely because the context was removed from your daily life? Personal disclosure and anonymity have long been a potent mix running through our online interactions.

We’ve seen this recently with the resurgence of anonymous question apps targeting young people, including Sendit and NGL (which stands for “not gonna lie”). The latter has been installed 15 million times worldwide, according to recent reports.

These apps can be linked to users’ Instagram and Snapchat accounts, allowing them to post questions and receive anonymous replies from followers.

While they are trending right now, it’s not the first time we’ve seen them. Early examples include ASKfm, launched in 2010, and Spring.me, launched in 2009 (as “Fromspring”).

These platforms have a troubled history. As a technology sociologist I have studied human-technology encounters in controversial environments. Here’s my take on why anonymous question apps have taken the internet by storm again and what their impact could be.

Why are they so popular?

We know that teens are attracted to social platforms. These networks connect them with their peers, support their journeys towards identity formation and offer them space for experimentation, creativity and bonding.

We also know that they manage online disclosures of their identities and personal lives through a technique sociologists call “audience segregation,” or “code switching.” This means they are likely to present themselves differently to their parents than to their peers online.

Digital cultures have long used online anonymity to separate real identities from online personas, both for privacy and in response to online surveillance. And research has shown that online anonymity improves self-disclosure and honesty.

It is important for young people to have online spaces to express themselves, away from the gaze of adults. Anonymous question apps provide this space. They promise to offer exactly what young people are looking for: opportunities for self-expression and authentic encounters.

Risky by design

We now have a generation of children growing up with the internet. On the one hand, young people are hailed as pioneers of the digital age – and on the other, we fear them as innocent victims.

A recent TechCrunch article described the rapid adoption of anonymous question apps by young users and raised concerns about transparency and security.

NGL exploded in popularity this year, but has not solved the problem of hate speech and bullying. The anonymous chat app Yik Yak was shut down in 2017 after becoming riddled with hate speech, but has since returned.

These apps are designed to engage users. They draw on platform design principles such as interactivity and gamification (introducing a form of “play” to non-gaming platforms) to provide a highly engaging experience.

Moreover, given their experimental nature, they are a good example of how social media platforms have historically developed with a “move fast and break things” attitude. This approach, first articulated by Meta CEO Mark Zuckerberg, has arguably reached its use-by date.

Breaking things in real life is not without consequences. Likewise, breaking free of important safeguards online is not without social consequences. Rapidly developed social apps can have harmful effects on young people, including cyberbullying, cyber dating abuse, image-based abuse and even online grooming.

In May 2021, Snapchat suspended its integrated anonymous messaging apps Yolo and LMK after being sued by the distraught parents of teens who died by suicide after being bullied through the apps.

Yolo’s developers overestimated the capacity of their automated content moderation to identify malicious messages.

In the wake of these suspensions, Sendit rocketed through the app store charts as Snapchat users sought a replacement.

Snapchat then banned anonymous messaging from third-party apps in March this year to limit bullying and harassment. Even so, Sendit still appears to link to Snapchat as a third-party app, so the ban seems to be loosely enforced.

Are children manipulated by chatbots?

It also appears these apps could include automated chatbots masquerading as anonymous responders to instigate interactions — or at least that’s what TechCrunch staff discovered.

While chatbots can be harmless (or even helpful), problems arise when users cannot see whether they are interacting with a bot or with a person. At the very least, it’s likely that the apps don’t screen bots out of conversations effectively.

Users can’t do much either. If comments are anonymous (with no profile or post history attached to them), there’s no way to know whether they’re communicating with a real person or not.

It’s hard to confirm whether bots are rife on anonymous query apps, but we’ve seen them cause massive problems on other platforms, opening opportunities for deception and exploitation.

For example, in the case of Ashley Madison, a dating and hook-up platform that was hacked in 2015, bots were used to chat with human users to keep them engaged. These bots used fake profiles created by Ashley Madison employees.

What can we do?

Despite all of the above, some research has shown that many of the risks that teens experience online have only short-term negative effects, if any. This suggests that we may be placing too much emphasis on the risks young people face online.

At the same time, implementing parental controls to mitigate online risks is often at odds with young people’s digital rights.

So the way forward is not easy. And just banning anonymous question apps isn’t the answer.

Instead of avoiding anonymous online spaces, we should trudge through them together — all the while demanding as much accountability and transparency from tech companies as we can.

For parents, there are helpful resources for guiding kids and teens to navigate fraught online environments wisely.


Alexia Maddox, Research Associate, Blockchain Innovation Hub, RMIT University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
