
Responsible Social Media Use Should not Justify Censorship


The Kenyan government plans to tighten regulation of social media platforms to curb what it terms “misuse” of social media in the country. In a directive issued by the Ministry of Interior and National Administration on its X account, social media platforms are now required to set up offices in the country to ensure responsibility and accountability, citing the rise of mis- and disinformation, hate speech, and online safety concerns. Some political leaders support this move, stating that social media platforms are being used to spread disinformation, facilitate cyberbullying and disseminate other harmful content. These calls for regulation also coincide with a petition calling for the banning of TikTok for promoting sexually explicit content and threatening Kenya’s cultural and religious values.

Undoubtedly, social media platforms have an obligation to moderate content hosted on their platforms. They also have a responsibility to respect users’ freedom of expression, as outlined in the UN Guiding Principles on Business and Human Rights. Furthermore, UNESCO’s Guidelines for the governance of digital platforms offer guidance and a framework on how platforms should conduct human rights due diligence and adhere to international human rights standards in content moderation. These guidelines are critical given that content moderation actions, such as post takedowns, shadow banning, account suspensions, and content labelling, can limit freedom of expression.

Content moderation is a complex, context-dependent process that should ideally be carried out by trained individuals because of its impact on free speech. In practice, platforms employ a mix of algorithmic decision systems and human content moderators. Sometimes legal content is removed, and governments are increasingly requesting that social media companies take down content. Social media platforms must report transparently on the nature of these requests and how they are handled, recognizing that such requests can arise from political pressure that is contrary to international human rights standards on the limitation of freedom of expression.

The call for responsible social media use comes at a time when social media users who criticize the government have been abducted or even found murdered. On 24th December 2024, cartoonist Kibet Bull was abducted by individuals suspected to be security operatives. It is speculated that his abduction was related to a series of silhouette images of President William Ruto shared on X and other social media platforms. Kibet’s abduction comes hot on the heels of a wave of abductions of government critics since the June 2024 anti-finance bill protests.

But the government’s increased intolerance of criticism is not confined to the national level; it extends to county governments as well. For instance, on 6th December 2023, the body of Benard Muthiani, a popular Meru political blogger known as Sniper, was found by a river with marks suggesting he had been strangled. On 7th April 2024, the body of a popular Kisii-based political blogger was found hanging with a rope around his neck. Both bloggers paid a heavy price for criticizing local politicians online.

Conclusion

While the government is keen on the removal of harmful content online, responsible social media use can instead be encouraged through:

  • National digital and media literacy training to fill skills gaps in identifying disinformation and manipulated content and in verifying information.
  • Encouraging media diversity and access to information to stem disinformation campaigns, since disinformation thrives where there are no alternative sources of information for verification.
  • Addressing regulatory gaps on platform accountability in Kenya by putting in place guidelines to ensure tech companies respect human rights throughout their operations, including product design, terms of use, content moderation and complaints-handling mechanisms.
  • Developing a robust competition framework for digital markets to discourage monopolization, giving consumers more choices and access to better services.

Authored by Angela Minayo, Program Officer, Digital Rights at ARTICLE 19 Eastern Africa

The views expressed in this blog are the author’s own and do not represent those of the members of the FECoMo coalition.

Disability Mainstreaming, Content Moderation and Freedom of Expression


PRESS RELEASE

Background

Local communities continue to experience the power of digital connectivity. As a result, the freedom to access and share information has been greatly enhanced. However, concerns about potentially harmful content amplified by emerging technologies, such as hate speech, disinformation and misinformation, are not new. Social media platforms and governments have employed content moderation practices and laws, respectively, in an effort to regulate the vast amounts of content generated online and to comply with international human rights standards. It is noteworthy that UNESCO has recently published the Guidelines for Digital Platforms[1] and is leading global engagements on the Draft Code of Ethics for the Information Society[2], which addresses the freedom of expression challenges posed by content moderation.

Communication and technology innovations and systems are advancing so rapidly that persons with disabilities risk being excluded from the digital evolution due to the lack of integration of assistive technology. Assistive technologies, ICT accessibility standards and the inclusion of persons with disabilities in the digital world are being overlooked despite the existence of legislative frameworks that ensure and promote a barrier-free technological environment, thereby exacerbating the digital divide. Kenya ratified the Protocol to the African Charter on Human and Peoples’ Rights on the Rights of Persons with Disabilities in Africa[3] and became the first country in Africa to enact ICT Accessibility Standards.[4] But the fight for inclusive technologies should not stop there. Opportunities exist for more advocacy for disability mainstreaming in technology innovation,[5] especially now that digitalization is taking root in critical government service delivery.

Call to action

To reap the digital dividends and the internet’s limitless possibilities, local communities should be empowered to identify and tackle potentially harmful content. At the same time, digital technologies must be designed with persons with disabilities in mind. Recognizing the potential dangers of online hate speech and disinformation, UNESCO and ARTICLE 19 launched Social Media 4 Peace, an EU-funded initiative that targets post-conflict countries to build resilience against potentially harmful online content. The project focuses on promoting peace through digital technologies, particularly social media, by establishing multistakeholder coalitions. In Kenya, the National Coalition on Freedom of Expression and Content Moderation (hereinafter “FECoMo”) was established to bring together tech stakeholders, academia, government, think tanks and civil society organizations to foster collaboration in addressing harmful online content, particularly disinformation and hate speech, while safeguarding freedom of expression.

However, knowledge gaps exist within FECoMo. Therefore, ARTICLE 19 Eastern Africa and the National Council for Persons with Disabilities, with the support of UNESCO, will hold a two-and-a-half-day workshop on Disability Mainstreaming, Content Moderation and Freedom of Expression. Following the training, participants will be able to:

  1. Strengthen the capacities of FECoMo members with appropriate skills and knowledge to ensure the application of the basic principles of disability mainstreaming in exercising their duties.
  2. Actively participate in the development of strategies, policies, activities, and advocacy campaigns aimed at content moderation in relation to disability.
  3. Advocate for a human rights-centered approach to content moderation.
  4. Understand the roles and human rights responsibilities of social media companies to uphold freedom of expression while mitigating the spread of harmful content and ensuring adherence to international human rights standards.

[1] https://unesdoc.unesco.org/ark:/48223/pf0000387339

[2] https://unesdoc.unesco.org/ark:/48223/pf0000187196

[3] https://www.socialprotection.go.ke/dsd-achievements

[4] https://dig.watch/updates/accessibility-for-ict-products-and-services-for-persons-with-disabilities-standard-launched-in-kenya

[5] https://thedocs.worldbank.org/en/doc/123481461249337484-0050022016/original/WDR16BPBridgingtheDisabilityDividethroughDigitalTechnologyRAJA.pdf

FECoMo Participates in Consultations on UN Global Principles for Information Integrity in Kenya


Two weeks prior to the UN Summit of the Future taking place in New York, the National Coalition on Freedom of Expression and Content Moderation in Kenya (FECoMo), established under the UNESCO Social Media 4 Peace project, as well as other stakeholders, convened at the UN Office in Nairobi (UNON) for a follow-up discussion on the UN Global Principles for Information Integrity and their application to the local context. The group had previously met in December 2023 for an initial consultation on the Policy Brief: Information Integrity on Digital Platforms, part of the UN Secretary-General’s 2021 “Our Common Agenda“, which focused on the impacts of misinformation, disinformation, and hate speech on global and national sustainable development agendas.

In the opening session, Ms Misako Ito, UNESCO’s Regional Advisor for Communication and Information in Africa, provided an overview of the December 2023 expert consultation on the Voluntary Code of Conduct on Information Integrity on Digital Platforms. She highlighted the growing spread of misinformation and disinformation online and emphasized the need for compliance by digital platforms, positive framing of the Code of Conduct in local contexts, and strong linkages to existing international frameworks such as UNESCO’s Internet for Trust Guidelines for the Governance of Digital Platforms. Ms Ito also underscored the importance of ensuring the Code’s sustainability through periodic reviews and updates.

The consultation was part of the series of expert consultations conducted by the UN Information Service aimed at shaping the development of the voluntary UN Code of Conduct for Information Integrity, now referred to as the UN Global Principles for Information Integrity.

Ms Misako Ito, UNESCO’s Regional Advisor for Communication and Information in Africa

The Global Principles were introduced to the participants in the following session by Ms Sandra Macharia, Director of the UN Information Service in Nairobi. These principles recommend actionable steps for various stakeholders, including technology companies, AI actors, media organizations, researchers, civil society, governments, and the UN, across five key areas: societal trust and resilience; healthy incentives; public empowerment; independent, free and pluralistic media; and transparency and research.

Ms Sandra Macharia, Director of the UN Information Service in Nairobi UNIS

“The United Nations Global Principles for Information Integrity aim to empower people to demand their rights (…). At a time when billions of people are exposed to false narratives, distortions and lies, these principles lay out a clear path forward, firmly rooted in human rights, including the rights to freedom of expression and opinion.” 

António Guterres, UN Secretary General

During plenary discussions, participants were invited to address the five areas of the recommendation by focusing on three questions: Are the Global Principles ambitious enough for their context? How can the recommended actions be applied within their sectors? What opportunities exist for measuring the impact of these actions? The discussions highlighted the transformative potential of the recommendations but stressed the need for local implementation, digital infrastructure development, youth empowerment, and stronger involvement of major tech companies in applying the principles. They also called for cross-sectoral cooperation, enhanced media independence, regular policy updates to keep pace with digital evolution, and integration of the principles into education, training, and research, particularly with an emphasis on youth engagement and the inclusion of offline populations. The operationalization of these principles was recognized as having the potential to accelerate local innovation, contributing to job creation and enabling better measurement of impact.

Participants at the consultation on the UN Global Principles for Information Integrity in Kenya UNIS and UNESCO

In the final session, Ms Sandra Macharia provided an overview of the Summit of the Future, describing it as a once-in-a-generation chance to rebuild trust and demonstrate the effectiveness of international cooperation in tackling global challenges such as the stagnation of the SDGs, climate change, and unchecked technological growth. The summit will involve the endorsement by world leaders of an inter-governmentally negotiated Pact for the Future, and its two annexes, the Global Digital Compact and the Declaration on Future Generations, based on the UN Global Principles for Information Integrity.

In her closing remarks, Ms Misako Ito reaffirmed UNESCO’s commitment to promoting access to information and combating disinformation, with FECoMo continuing to lead these efforts through the Social Media 4 Peace project and by engaging in the global discussions shaping frameworks and standards. As Ms Sandra Macharia stated in her closing remarks, global issues can only be addressed with global solutions, which requires a commitment to international cooperation, the enhancement of intersectoral partnerships, and the continuous engagement of all stakeholders in these discussions.

As discussions on the UN Global Principles for Information Integrity advance, FECoMo seeks to play a crucial role in Kenya by strengthening cross-sectoral partnerships, enhancing digital platform governance, and promoting media and information literacy. FECoMo will advocate for integrating these principles into national policies and promoting media and information literacy competencies within local communities and society at large, helping to create a more secure and inclusive digital environment in Kenya.

Strengthening Design Governance for Safer Online Spaces in Kenya: Insights from DataFest Africa 2024


In an era where digital platforms dominate our daily interactions, ensuring safe and inclusive online spaces is paramount. At the forefront of this is user empowerment through capacity development and awareness creation. DataFest Africa 2024 hosted a transformative two-day workshop titled “Strengthening Design Governance for Safer Online Spaces in Kenya” to address this critical issue. This workshop, a collaborative effort by Ushahidi, Pollicy, UNESCO and FECoMo Kenya, aimed to equip participants with actionable toolkits for design governance and regulatory compliance tailored to the Kenyan context.

Reuben Kihiu and Wangu Mwenda from Ushahidi facilitated the two-day workshop, held at Aga Khan University. Ushahidi is a Kenyan-founded nonprofit with a 16-year track record in digital public good (DPG) initiatives. The workshop brought together a diverse group of participants including designers, developers, legal experts and tech enthusiasts.

The workshop was conducted in person and consisted of facilitator-led sessions and group activities. The first day of the workshop highlighted the challenges digital platforms face today, such as misinformation, ideological polarization and the lack of transparency and accountability. A lack of African contextualization of software was spotlighted. A multistakeholder approach was emphasized as crucial for effective digital governance. Recommendations brought forth included collaborative governance, inclusion of diverse stakeholders especially vulnerable groups, institutionalizing checks and balances and promoting cultural diversity.

  1. Balancing Freedom of Expression with Content Moderation in Reference to Article 33 of the Kenyan Constitution

Participants explored the delicate balance between upholding freedom of expression and implementing content moderation strategies on digital platforms. Recommendations included the development of content moderation tools that can accurately process and understand local languages and dialects. This would help in the detection and mitigation of harmful content without infringing on users’ rights to free expression. The importance of incorporating fact-checking mechanisms was emphasized to combat misinformation and disinformation while preserving the integrity of information shared online.

  2. Ensuring Non-Discrimination in Content Moderation in Reference to Article 27 of the Kenyan Constitution

In discussions around non-discrimination, the focus was on creating inclusive digital spaces that respect diversity. This involved the development of AI language models that support a wide range of local languages and dialects, ensuring that content moderation does not inadvertently marginalize certain groups. Participants also highlighted the need for platforms to contextualize user personas by considering different age groups, making sure that content and interactions are appropriate for all users, regardless of their background or identity.

  3. Protecting Consumers through Content Moderation in Reference to the Consumer Protection Guidelines, Competition Authority of Kenya (2017)

Protecting consumers in digital spaces was a key concern, with recommendations focusing on ensuring the integrity and trustworthiness of online marketplaces. Participants suggested thorough vetting of prospective merchants to maintain a reliable marketplace environment. Additionally, they advocated for the use of technologies like computer vision and machine learning to verify the authenticity and quality of products sold online. The idea of allowing third-party social proof through reviews and endorsements was also proposed to enhance credibility and reliability in online transactions.

  4. Protecting Children in Digital Spaces in Reference to Article 53 of the Kenyan Constitution

The protection of children online was discussed with a focus on ensuring access to age-appropriate content. Participants recommended the enforcement of rigorous age verification methods, such as requiring users to upload IDs or selfies to confirm their age. They also emphasized the importance of establishing strict digital boundaries between adults and children on co-shared platforms, to prevent exposure to inappropriate content and interactions. This was seen as crucial in creating a safer digital environment for younger users.

Participants created user personas to identify the motivations, frustrations and skill levels of hypothetical users. This produced clear problem statements reflecting user needs and challenges. Participants then brainstormed UI/UX solutions addressing specific user frustrations. Ideas ranged from using different servers in online games depending on user age to colour-sensitive websites for differently-abled persons. Teams presented their UI prototypes and received peer and expert feedback, fostering an environment of constructive critique and collaborative improvement.

The workshop effectively equipped participants with a thorough grasp of the legal limits of content moderation and promoted a user-oriented approach to addressing issues within their unique contexts. A key outcome was the emphasis on leveraging these insights to establish secure online environments in Kenya that honour and safeguard the rights of all users.

After the workshop, participants expressed a deepened understanding of the importance of an inclusive design and development process. They emphasized how critical it is to create digital products that are not only secure but also accessible and equitable for all users. This realization underscored their commitment to ensuring that future digital platforms are both safe and inclusive, catering to the diverse needs of all communities.