

Horrific videos like the one posted by the Christchurch mosque shooting suspect Brenton Tarrant are geared to appeal to the morbidly curious, and appeal it did. Dozens of copies of what appears to be footage from a helmet-mounted camera are circulating in the darker corners of the internet and are being persistently posted to more mainstream platforms such as YouTube, Twitter and Facebook, which don't always manage to catch the video before it goes up. And you don't even need to be on 8chan to stumble on the footage: search engines' predictive search will actively encourage you to browse for it when you type a related term. (I'm looking at you, Google and YouTube, not to mention Facebook, which hosted the live stream to begin with.)

You don't need to be an 8chan denizen to be tempted by firsthand footage of an event dominating the news cycle, just as most people probably wouldn't look away if they came upon the scene of an attack, or even a particularly bad accident, in real life. Some of us, like journalists and police, are professionally obliged to view distressing imagery to try to discern valuable new information, whether for investigation purposes or to better inform debate. A small minority of us might be scanning the footage in the desperate hope of establishing the whereabouts of our loved ones. But plenty of people are looking at the Christchurch video today for no good reason, simply because of the draw of the drama and the apparent safety of viewing it from miles away, behind a computer screen.

Allyn reports that social media companies usually are not held liable for what they don't police on their sites. "The social media platforms that profit from their existence need to be responsible for monitoring and having surveillance, knowing that they can be, in a sense, an accomplice to a crime like this, perhaps not legally but morally," Hochul said.

Experts say social media companies could do more. Social media companies used to take a mostly hands-off approach to moderating content on their sites, but now more than ever they are trying to manage the societal problems their sites create, reports Allyn. Facebook, Twitter and other sites like them have teams of thousands working to moderate content and block violent media from reaching people. Twitch, the site the Buffalo shooter livestreamed on, for example, could make it harder for people to open accounts and instantly upload live videos. Other video-streaming sites like TikTok and YouTube require users to have a certain number of followers before they're able to stream live, reports Allyn.
