
Employee claims TikTok is not equipped to be more than “an app for dance crazes”


A BBC Three investigation has found that content on TikTok is the source of anti-social frenzies that have engulfed the real world. According to former employees who spoke to the BBC, the platform has been reluctant to address the issue out of concern for hindering its business growth.

It added that TikTok drove disproportionate engagement towards certain topics. To arrive at this conclusion, the BBC said it interviewed ex-staff members and app users and conducted an analysis of broader social media data.

The investigation by the BBC also revealed that TikTok’s algorithm and user interface design lead users to encounter videos they wouldn’t typically be exposed to, and that this exposure then incentivizes them to create unconventional content of their own on the platform.

TikTok has previously distanced itself from incidents of disorder, such as the recent threats of looting on London’s Oxford Street, which politicians attributed to the billion-user app.


More on the BBC TikTok investigation

The BBC said it examined four episodes in recent months where disproportionate engagement on TikTok was connected to harmful behaviour. They include:

  • An online obsession with a murder case in Idaho, USA, which led to innocent people being falsely accused
  • Interference in the police investigation of Nicola Bulley, who went missing in Lancashire, UK
  • School protests involving vandalism spreading across the UK
  • Fanning the flames of riots in France, which spread with unusual intensity and to unexpected locations

According to the BBC, a spokesperson for TikTok said that its “algorithm brings together communities while prioritising safety”, adding that the platform recommends different types of content to interrupt repetitive patterns, removes harmful misinformation and reduces the reach of videos with unverified information.

The reporter behind the article, Marianna Spring, who said she had never heard of Moscow, Idaho, before November last year, claimed that her TikTok feed became “flooded with details of the murder of four students in their bedrooms while two surviving housemates slept.”

“Speculative theories around who committed the murders gripped TikTok, without any evidence to back them up. TikTok users were uniquely obsessed. Videos I found about the case racked up two billion views from November 2022 to August this year, compared to just 80,000 on YouTube,” she added.


One TikToker, Olivia, who was interviewed for the article, said that her videos did well when she was at the murder scene in person. At least one of her videos about the Idaho murders reached 20 million views.

“I felt this need to go out there and dig for answers and see if I can help out in any way,” she said.


The report claims that a protest around the dress code at Rainford High School in Merseyside in February this year caught fire because a video of teachers checking the length of girls’ skirts was posted on the platform. According to the BBC, “Within three days, students at over 60 schools had held and filmed their own versions of the protest. After a week, students at over 100 schools had gotten involved.

“In some cases, they also got out of hand: windows were smashed, trees were set on fire and teachers were assaulted,” the report claimed.

Several former TikTok employees in the US and UK said that “limiting these frenzies of harmful content was not a priority for the social media company, because it could slow down the app’s meteoric growth.”

One of the employees, who worked in data strategy and analysis at the company, claimed that TikTok is “not equipped to become more than just an app for dance crazes.”

“It grew so fast that they couldn’t possibly keep up with or predict every single way the app was going to go… But in terms of dangerous content, at least I never heard of them trying to proactively prevent them from getting big. And in general, they don’t want to, they don’t want to stand in the way of entertainment growing quickly on their platform,” he said.
