2019.09.18 12:04 World eye

Facebook partners with UK police to prevent livestreaming of terror videos

[San Francisco, AFP-Jiji] Facebook, the largest social networking service, announced on the 17th that it will partner with London's Metropolitan Police as part of efforts to strengthen measures against the livestreaming of terror attacks. (Photo: the Facebook logo)
In Christchurch, New Zealand, this March, a man professing to be a white supremacist attacked two mosques (Islamic places of worship), killing many people. He filmed the attack with a head-mounted camera and livestreamed it on Facebook.
Describing the move as a new partnership with law enforcement, Facebook said the two sides would jointly train its software to more quickly detect and remove violent footage posted by "dangerous" groups and individuals.
On its blog, Facebook explained: "Our automatic detection systems did not respond to the video of the Christchurch attack because we did not have enough first-person footage of violent incidents to effectively train our machine learning technology."
It added: "That is why we are working with governments and police in the US and UK to obtain footage of firearms training, which will be a valuable source of data for training our systems."
Facebook and services such as the video-sharing site YouTube have come under fierce criticism for failing to detect the attack broadcast early on, and for failing to swiftly remove copies of the video that subsequently proliferated across the internet. [Translated and edited by AFPBB News]
[AFP-Jiji] (2019/09/18-12:04)

Facebook taps London police to track terror livestreams


Facebook on Tuesday teamed up with the London police to help its artificial intelligence tools track livestreams of terror attacks such as the New Zealand mosque massacre.
A self-professed white supremacist used a head-mounted camera in March to broadcast live footage on Facebook of him attacking two mosques in the city of Christchurch.
Facebook and platforms such as YouTube came under intense criticism for initially failing to detect the broadcast and then struggling to take down its uploads that proliferated online.
New Zealand's Jacinda Ardern and other world leaders in May launched a Christchurch Call to Action against online extremism -- a campaign major platforms joined later that month.
The California-based social media behemoth said Tuesday it was in the process of updating and refining its policies for dealing with extremism and online hate.
"Some of these changes predate the tragic terrorist attack in Christchurch, New Zealand, but that attack, and the global response to it in the form of the Christchurch Call to Action, has strongly influenced the recent updates to our policies and their enforcement," the company said.
- Machine learning -
London's Metropolitan Police said the initiative will see it provide Facebook with footage of training by its firearms command unit.
The videos will be captured on body cameras provided by Facebook that London's Firearms Command officers wear during exercises.
This will help Facebook capture the volume of images needed to train its machine learning tools, the company said, meaning its AI tools will be able to more accurately and rapidly identify real-life first-person shooter incidents and remove them from the platform.
The London police said its footage will be combined with video Facebook is already using from law enforcement agencies in the United States.
The new technology will also significantly help prevent the glorification of such acts and the promotion of the toxic ideologies that drive them, Britain's Special Operations assistant commissioner Neil Basu said.
The Metropolitan Police said Facebook decided to ask London for help because it has created the world's first counter-terror internet response team focused on online hate.
The machine learning tools will also be applied to Facebook's hugely successful Instagram platform as it captures more and more younger users worldwide.
The London police said it will further share its training footage with the UK interior ministry so that it can then offer it to other interested social media networks as the initiative grows.
The Firearms Command regularly trains in how to respond to a wide variety of scenarios, from terrorist incidents to hostage situations, on land, public transport and water, the London police said.
The footage they provide will show a 'shooter' perspective in a broad range of situations.
- 'Crisis intervention' -
The speed with which the videos spread and Facebook's initial inability to track them all down redoubled public and government scrutiny of the world's biggest social media company.
The Christchurch attack was broadcast live for 17 minutes -- and remained online for a further 12 minutes -- before a user alerted Facebook, which took the video down.
Yet millions of uploads and shares continued to spread for days.
Facebook on Tuesday defended its track record but conceded that bad actors would continue to try to get around its systems.
It reported banning 200 white supremacist organisations and removing 26 million pieces of content from terrorist organisations such as the Islamic State group.
Facebook said Tuesday that it was also expanding to Australia and Indonesia a US programme in which users who search for extremist content on the platform are directed to a special support group.
The US group, founded by former violent extremists, provides crisis intervention, education, support groups and outreach, Facebook said.
