Digital Content Moderation – NITDA
The National Information Technology Development Agency (NITDA) has emphasized the need for digital content moderation in Nigeria to address online issues like hate speech, misinformation, and cyberbullying.
Call of duty
NITDA’s Director-General, Kashifu Inuwa, highlighted the importance of collaboration among stakeholders to enhance the Nigerian digital space.
The agency has introduced a "Code of Practice" for Interactive Computer Service Platforms/Internet Intermediaries. The code guides the operations of major social media platforms in Nigeria, such as Twitter, Facebook, WhatsApp, Instagram, Google, and TikTok, and aims to enforce digital safety and global best practices in content moderation.
Digital Rights in Nigeria:
A 2023 report by FRCN (Federal Radio Corporation of Nigeria) highlights the emerging issues surrounding digital rights in Nigeria. One key point is the need for a balanced approach that protects users from harmful content while also upholding freedom of expression. The report mentions the role of stakeholders, including the government, in establishing responsible practices for online content.
Goals of SRAP 2.0:
The agency's Strategic Roadmap and Action Plan 2.0 (SRAP 2024-2027) focuses on:
- Fostering digital literacy
- Enhancing cybersecurity
- Building a robust technology research ecosystem
- Advancing the country’s digital infrastructure
Moderating online content:
This 2021 OHCHR (Office of the United Nations High Commissioner for Human Rights) article explores the global debate on online content moderation. It highlights the challenges of balancing free speech with protecting users from harmful content like hate speech and misinformation. The article suggests focusing on improving content moderation processes rather than adding content-specific restrictions.
Roundtable discussion on impact of digital media:
A 2020 roundtable hosted by the Policy for a Multipolar World (PMG) in South Africa discussed the impact of digital media on misinformation and the challenges of content moderation. While not specific to Nigeria, the discussion reflects broader concerns about online content in Africa.
TikTok has expressed support for NITDA’s initiatives, aligning its policies with the agency’s vision.
Mrs. Tokunbo Ibrahim, TikTok's head of Government Regulation and Public Policy for Nigeria and West Africa, noted that the platform prioritizes online safety and collaborates with NITDA to empower content creators and promote safe digital practices.
Digital content moderation in Nigeria is gaining traction due to several concerns:
- Misinformation and disinformation: Fake news and misleading information can have a significant impact on everything from elections to public health.
- Hate speech and incitement to violence: Online hate speech can create divisions and even lead to violence.
- Cyberbullying and harassment: Online harassment can have a serious negative impact on mental health.
This call is part of broader efforts to create a safer cyberspace, particularly to protect minors and ensure digital safety for all users.
Director-General Kashifu Inuwa made the call on Monday when a team from TikTok visited him in Abuja.
Concerns about content moderation:
- Potential for censorship: Overly restrictive moderation practices can hinder free expression and suppress opposing views.
- Lack of transparency: The processes social media platforms use to moderate content are often opaque, raising concerns about fairness and bias.
The way forward:
- Digital content moderation in Nigeria must be tailored to the country's context and legal framework.
- Government, civil society, and private sector collaboration is crucial for developing effective, responsible content moderation.
- Public education about digital literacy and online safety is also important for creating a healthier online environment.