Facebook failed to block 300,000 videos of New Zealand attack

Facebook claims to have removed 1.5 million videos of the Christchurch mosque attack globally within the first 24 hours. The attacker, who killed at least 50 people, live-streamed his assault on two New Zealand mosques on Facebook.

In a series of tweets, Facebook’s Mia Garlick confirmed that a total of 1.5 million videos were removed in the first 24 hours. She added that the company will continue removing such violating content with the help of automated technologies and human content moderators.

Despite these safety measures, Facebook failed to detect around 300,000 such videos being uploaded to its platform, representing a 20 per cent failure rate, according to TechCrunch. Several videos of the terror attack were still being posted to Facebook more than 12 hours after the attack.

Critics are calling on Facebook to release engagement figures, such as how many views, shares and reactions the videos received before they were taken down, arguing that these numbers would give a more precise picture of how far the videos spread.

The 28-year-old shooter targeted worshippers at two mosques during morning prayers in Christchurch, New Zealand, live-streaming the attack on Facebook using a head-mounted camera. Police said they apprehended the shooter about half an hour after reports of the first attack came in.

The attacker, who has been charged with murder, described himself as a fascist in a manifesto he posted shortly before the attacks.

On Sunday, New Zealand Prime Minister Jacinda Ardern said social media giants like Facebook would face “further questions” about their response to the event.