Use of Social Media in Spreading Extremism: Legal Accountability
The use of social media as a tool for spreading extremism and inciting violence has become a growing global concern. In recent years, terrorist groups, hate organizations, and individuals have exploited platforms such as Facebook, Twitter, and YouTube to recruit, radicalize, and coordinate violent activities. Legal accountability for these actions involves both domestic and international legal frameworks, including criminal law, counterterrorism legislation, and human rights law.
This section will explore the legal implications of social media use in spreading extremism through detailed case law, illustrating the legal responsibilities of individuals, companies, and states in addressing this challenge.
1. R v. Anwar and Others (UK, 2016)
Case Summary:
In the United Kingdom, the case of R v. Anwar involved a series of radicalizing posts made on social media platforms, leading to the conviction of several individuals for terrorism-related offenses. Mohammed Anwar, alongside several other defendants, used platforms like Facebook to promote extremist ideologies and incite violence. Their social media activity included sharing terrorist propaganda, extremist speeches, and encouraging violence against the West.
Legal Accountability:
Section 58 of the Terrorism Act 2000: Anwar and others were convicted under Section 58 of the Terrorism Act 2000, which criminalizes collecting or possessing information of a kind likely to be useful to a person committing or preparing an act of terrorism, a category that can extend to extremist material stored and shared via social media. The court found that their posts were not merely expressions of political views but amounted to encouragement of terrorism.
Encouragement of Terrorism: Under the Terrorism Act 2006, encouraging acts of terrorism and disseminating terrorist publications are criminal offences, and these provisions apply to conduct carried out through social media. The prosecution argued that the defendants used social media platforms to inspire violent acts and to support terrorist groups.
Significance: The R v. Anwar case demonstrated how social media could be used to recruit and radicalize individuals, and it highlighted the responsibility of individuals to ensure that their online behavior does not contribute to the spread of extremist ideologies. It also raised questions about how social media companies should monitor and control extremist content.
2. United States v. Al-Arian (U.S., 2008)
Case Summary:
In United States v. Al-Arian, Sami Al-Arian, a former professor at the University of South Florida, was convicted of conspiring to provide support to a designated foreign terrorist organization, Palestinian Islamic Jihad (PIJ). Al-Arian used online platforms and public forums to advocate for the PIJ and to raise funds for the group's operations. Although the case primarily revolved around fundraising and speech, his use of digital communication channels played a key role in his conviction.
Legal Accountability:
Material Support to Terrorism: Al-Arian's conviction rested on providing material support to a foreign terrorist organization, which is prohibited under 18 U.S.C. § 2339B, a provision enacted in 1996 and later strengthened by the USA PATRIOT Act. While the case involved more than just social media, it highlighted how digital platforms can be used to fund, organize, and spread extremist views in support of violent groups.
Free Speech vs. National Security: The case sparked a debate about the limits of free speech in the context of national security. While Al-Arian argued that he was exercising his right to free expression, the court held that the promotion of terrorist organizations and the incitement of violence fell outside the protections afforded by the First Amendment.
Impact on Social Media Regulation: Al-Arian's case emphasized the need for a broader regulatory framework to combat the use of social media by extremists. It raised questions about whether social media companies should be more proactive in monitoring and removing content that supports terrorism.
3. The 2015 Paris Attacks (France)
Case Summary:
The Paris attacks of November 2015 were carried out by a group of terrorists associated with the Islamic State (ISIS). In the months leading up to the attack, the attackers used online platforms to communicate, coordinate, and radicalize others. They relied on encrypted messaging apps such as Telegram to plan the operation, while ISIS propaganda was widely disseminated on mainstream platforms like Twitter and Facebook.
Legal Accountability:
Terrorism and Incitement: After the attacks, French authorities took legal action against several individuals and social media platforms for their involvement in spreading extremist content. French law criminalizes the incitement to terrorism, which includes using social media to promote terrorist activities, recruit, and radicalize individuals.
Role of Social Media Platforms: The Paris attacks brought international attention to the role of social media platforms in enabling the spread of extremist ideologies. In the aftermath, French authorities pressured social media companies to take more aggressive actions to remove terrorist-related content.
French Law on Online Extremism: Following the attacks, France adopted the Loi Avia (Avia Law) in 2020, which required social media platforms to remove reported hate speech within 24 hours and terrorist content within one hour. Most of these obligations were subsequently struck down by the French Constitutional Council as a disproportionate restriction on freedom of expression, but the law remains a notable attempt to hold social media companies accountable for the content they host.
International Cooperation: The Paris attacks also led to greater international cooperation on counterterrorism, with countries working together to identify and dismantle terrorist networks operating on social media. The European Union later adopted the Terrorist Content Online Regulation (Regulation (EU) 2021/784), which requires hosting platforms to remove terrorist content within one hour of receiving a removal order from a competent national authority. A simplified sketch of how a platform might track such statutory deadlines follows below.
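In operational terms, the removal deadlines discussed above are service-level obligations that a platform's moderation pipeline must track. The minimal Python sketch below shows one hypothetical way a platform could flag takedown requests that have passed their statutory deadline; the class and function names are invented for illustration, and the deadline values simply mirror the one-hour and 24-hour figures cited above rather than any specific platform's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Illustrative statutory deadlines: the EU Terrorist Content Online Regulation
# gives hosting providers one hour to act on a removal order; the Avia Law as
# originally drafted set 24 hours for reported hate speech.
DEADLINES = {
    "eu_removal_order": timedelta(hours=1),
    "avia_hate_speech_report": timedelta(hours=24),
}

@dataclass
class TakedownRequest:
    content_id: str          # platform-internal identifier for the content
    kind: str                # key into DEADLINES
    received_at: datetime    # when the order/report reached the platform

    def deadline(self) -> datetime:
        return self.received_at + DEADLINES[self.kind]

    def is_overdue(self, now: datetime) -> bool:
        return now > self.deadline()

def overdue(requests: list[TakedownRequest], now: datetime) -> list[TakedownRequest]:
    """Return requests whose statutory removal deadline has passed."""
    return [r for r in requests if r.is_overdue(now)]

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    queue = [
        TakedownRequest("vid-001", "eu_removal_order", now - timedelta(minutes=90)),
        TakedownRequest("post-002", "avia_hate_speech_report", now - timedelta(hours=2)),
    ]
    for req in overdue(queue, now):
        print(f"OVERDUE: {req.content_id} (deadline was {req.deadline().isoformat()})")
```

In practice a check like this would feed an alerting or escalation system rather than a print statement, but it illustrates why statutory removal windows translate directly into engineering requirements for platforms.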
4. The Case of Brenton Tarrant (New Zealand, 2019)
Case Summary:
Brenton Tarrant, the Australian national responsible for the Christchurch mosque shootings in New Zealand in 2019, livestreamed the attack on Facebook, and the video was shared widely across social media platforms. Tarrant had been radicalized through online content, including extremist forums, and had used social media to promote his far-right views before committing the attack.
Legal Accountability:
Criminal Responsibility for Terrorist Acts: Tarrant pleaded guilty and was convicted of 51 counts of murder, 40 counts of attempted murder, and one charge of engaging in a terrorist act under New Zealand's Terrorism Suppression Act 2002. The case highlighted the intersection of online radicalization and real-world violence, with social media playing a key role in both the planning and the publicizing of the attack.
Use of Social Media for Incitement: Tarrant’s use of social media to broadcast the attack and spread his manifesto led to widespread discussions about the role of digital platforms in facilitating the promotion of hate and extremist ideologies. The immediate spread of the video raised questions about the responsibility of platforms to prevent such content from being disseminated.
New Zealand's Response: In the wake of the attack, New Zealand passed the Terrorism Suppression (Control Orders) Act 2019, expanding the scope of its counterterrorism measures, and the Chief Censor classified the livestream footage and the attacker's manifesto as objectionable, making it an offence to possess or distribute them. The Christchurch Call, a global initiative to eliminate terrorist and violent extremist content online, was also launched by New Zealand and France, calling on tech companies to take more responsibility for moderating content. One practical line of work that followed is industry hash-sharing to detect re-uploads of known violent material, illustrated in the sketch below.
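To make the re-upload problem concrete, the following toy Python sketch shows the basic idea behind hash-based blocklists of the kind coordinated through industry bodies such as the GIFCT after the Christchurch attack. It is an illustration under simplifying assumptions: real deployments use perceptual hashes (for example PDQ or TMK+PDQF) so that re-encoded or slightly altered copies still match, whereas the plain SHA-256 digest used here only catches byte-identical uploads; all class and function names are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest used as the content's fingerprint (exact-match only)."""
    return hashlib.sha256(data).hexdigest()

class HashBlocklist:
    """Toy blocklist: stores fingerprints of known violating content and
    checks new uploads against them before publication."""

    def __init__(self) -> None:
        self._known: set[str] = set()

    def add_known_violating_content(self, data: bytes) -> None:
        self._known.add(fingerprint(data))

    def should_block(self, upload: bytes) -> bool:
        return fingerprint(upload) in self._known

if __name__ == "__main__":
    blocklist = HashBlocklist()
    blocklist.add_known_violating_content(b"<bytes of a known violating video>")
    print(blocklist.should_block(b"<bytes of a known violating video>"))  # True
    print(blocklist.should_block(b"<an unrelated upload>"))               # False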
5. The Case of the "Online Caliphate" (U.S., 2019)
Case Summary:
In 2019, a group of ISIS supporters in the U.S. was indicted for conspiring to promote ISIS propaganda and recruit new members through social media platforms. The group used encrypted messaging apps and social media accounts to spread extremist ideologies, recruit fighters, and raise funds for ISIS operations. The individuals involved were arrested and charged with providing support to a foreign terrorist organization.
Legal Accountability:
Material Support and Recruitment: The defendants were charged under U.S. federal law, including 18 U.S.C. § 2339B, which criminalizes providing material support to foreign terrorist organizations. The indictment highlighted how social media platforms can be used for recruitment and fundraising even when those involved are far removed from any physical battlefield.
Encryption and Anonymity: The use of encrypted communication tools raised legal challenges for law enforcement agencies in monitoring and identifying those involved in online extremism. However, the case demonstrated how authorities can still pursue legal action against those who exploit social media for extremist purposes, even when encryption makes tracking difficult.
Social Media Accountability: The case also underscored the growing role of social media platforms in terrorist networks. The legal responsibilities of platforms to monitor, report, and remove terrorist-related content are critical in preventing the spread of extremism.
Conclusion
The cases discussed highlight the complex intersection of social media, extremism, and legal accountability. Social media platforms are not only facilitators of communication but also powerful tools for radicalizing individuals, recruiting for terrorist groups, and inciting violence. Legal frameworks, including counterterrorism laws, criminal codes, and human rights protections, are evolving to address these challenges, but significant gaps in enforcement and regulation remain. The responsibility of social media companies to monitor and remove harmful content, and the accountability of individuals who spread extremism, are both central to combating the threat of online radicalization.