Last year, Facebook launched live video streaming to the one billion people using the site. Chief executive Mark Zuckerberg said he wanted a product that would support the “personal and emotional and raw and visceral” ways people communicate.
Today, live-streaming has become a major trend on social media, used for everything from family backyard barbecues to news broadcasts. Even businesses have found ways to capitalize on the format.
Unfortunately, like most social media features, live streaming has its drawbacks. Few could have predicted the raw, visceral, and dark moments that have begun to plague various live-streaming platforms. People with darker intentions have taken advantage of the growing trend, using live streams to broadcast graphic content such as assaults, rapes, murders, and suicides in real time.
With an accessible platform that is intended to invite online visitors for a closer look at a broadcaster’s life, is it possible that social media companies have bitten off more than they can chew?
As the amount of graphic content being live-streamed continues to grow, more online communities are criticizing social media companies for their lax regulations and slow reaction times when explicit material goes out live.
In 2016, an investigation was launched after a 19-year-old French woman live-streamed her suicide on Periscope. The woman, who called herself Océane, engaged with her audience up until her death.
The New York Times reported that although the video is no longer available on Periscope, excerpts still circulating on YouTube show Océane saying, “The video [I’m] doing right now is not made to create buzz, but rather to make people react, to open minds, and that’s it.”
Authorities reported that at 4:29 p.m., the young woman threw herself underneath an oncoming train while the stream was still running. At that moment the screen went dark, and there was silence for approximately five minutes. The last sound and image recovered before the video’s abrupt end were a glimpse of the train, followed by an off-screen worker who can be heard saying, “I am under the train with the victim; I need to move the victim.”
Viewers wondered why it had taken so long for moderators to stop the stream.
One of the more recent live-streamed crimes occurred on Easter Sunday, when Cleveland killer Steve Stephens filmed himself murdering 74-year-old Robert Godwin, Sr. The disturbing footage was then posted to Stephens’ personal Facebook page, where he also live-streamed his confession to the crime.
It took Facebook more than two hours to delete the video and deactivate Stephens’ page; by then, the footage had been viewed over 150,000 times and shared on other social media outlets. Many critics demanded to know why the company had taken so long to respond.
According to the Washington Post, Periscope, YouNow, and Twitch have relied mainly on user-submitted reports to alert moderators. This overdependence on users is a major weakness when it comes to user safety.
The spread of violent content poses a danger to the general public, as most of these graphic videos go up live without trigger warnings and without a reliable alert system through which moderators can quickly intervene in a dangerous broadcast.
Research suggests that explicit and graphic material, such as depictions of self-harm and suicide, can incite violent thoughts in certain individuals and even prompt them to act on those triggers. Should social media companies take responsibility and develop better systems to combat violent live streams? Many believe so.
In May 2017, Facebook responded to this issue by announcing its intention to hire an additional 3,000 people expressly to monitor for harmful videos.
Zuckerberg wrote, “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner—whether that’s responding quickly when someone needs help or taking a post down.”
Whether this move will be a temporary fix or a lasting one remains to be seen, but it is a start toward resolving the problem.