Daniel Halliday
Sep 15

Do tech companies need to rethink online content moderation?

Following the white supremacist terrorist attack on a mosque in New Zealand that was live-streamed to Facebook, and the extreme or harmful media continuously uploaded to a multitude of online platforms, do tech companies need to rethink online content moderation? Currently, most content moderation is subcontracted to smaller companies, but this practice is controversial in its own right: until artificial intelligence improves, human moderators must watch flagged content all day to decide what should be removed. With the horror of an act of terrorism streamed to Facebook, the media and politicians internationally called on tech companies to do more to protect users by moderating content more swiftly. What could be done differently? Should private companies, the state, or neither be responsible for online censorship? Do social networking companies need to rethink their approach to moderation, or is enough being done already?

thehill.com/policy/technology/434346-live-video-of-new-zealand-shooting-puts-tech-on-defensive
Stats of Viewpoints
No - the question of liability
1 agrees
0 disagrees
Viewpoints

No - the question of liability

Private companies cannot be held responsible for the actions of a very few extremists. In the United States, Section 230 of the Communications Decency Act shields social network providers from liability, because they are not the “content provider” of extremist or illegal material. Social networking services may have to remove content that is sensitive or illegal, much as they remove copyrighted content, and much has already been done to counter and take down extreme content more quickly. But the prevalence of terrorism remains a problem of the society affected, not of the social media platforms used to spread such content; users are simply exploiting new broadcasting tools. Tech companies are therefore not liable for this issue: it is a societal problem and requires a societal response.

technology.findlaw.com/modern-law-practice/understanding-the-legal-issues-for-social-networking-sites-and.html
