The live footage of Friday's attacks, New Zealand's worst-ever mass shooting, was first posted to Facebook and has since been shared on Twitter, Alphabet Inc's YouTube, and Facebook-owned WhatsApp and Instagram.
At one point, the shooter even paused to give a shout-out to one of YouTube's top personalities, known as PewDiePie, who has tens of millions of followers and has made jokes criticized as anti-Semitic and posted Nazi imagery in his videos.
According to authorities, a shooter appeared to livestream the attack on Facebook from a first-person perspective, showing the drive to the Al Noor Mosque and then the gunman walking in from the car and opening fire.
Other violent crimes that have been live-streamed on the internet include a father in Thailand in 2017 who broadcast himself killing his daughter on Facebook Live.
Exact matches of removed material cannot be uploaded again to YouTube or Facebook. Still, the problem persists.
Facebook, Twitter and YouTube yesterday all said they were taking action to remove the videos. "We're also removing any praise or support for the crime and the shooter or shooters as soon as we're aware", Facebook said in a statement. "We also cooperate with law enforcement to facilitate their investigations as required". "We are working to have any footage removed", wrote New Zealand police on Twitter.
Iterations of the video, however, continued to spread on social media after the attack.
"What's going on here?" she said, referring to the shooter's ability to livestream for 17 minutes. "I think this will add to all the calls around the world for more effective regulation of social media platforms", she added.
After Facebook stopped the Christchurch livestream, it told moderators to delete any copies of the video as well as comments praising the attack.
Users intent on sharing the violent video took several approaches, at times with near-military precision.
Facebook yesterday acknowledged the challenge and said it was responding to new user reports. Facebook says it does not want to act as a censor, as videos of violence, such as those documenting police brutality or the horrors of war, can serve an important goal.
"We urge people to report all instances to us so our systems can block the video from being shared again".
Reddit - which has over 20 investors, including Conde Nast owner Advance Publications - said it was actively monitoring the situation in New Zealand. Tech companies have pledged to improve their filtering and prevention efforts while balancing those measures against the drive to protect the open spirit of the platforms, which enabled them to grow so explosively over the past decade.
The rampage's broadcast "highlights the urgent need for media platforms such as Facebook and Twitter to use more artificial intelligence as well as security teams to spot these events before it's too late", Ives said.
Britain's interior minister also spoke out.