
SM4PII and KenSafeSpace: Stakeholder Reflections on Promoting Digital Safety, Inclusion, and Rights in Kenya


As Kenya’s digital ecosystem continues to expand, the urgency of fostering safer, more inclusive, and rights-respecting online spaces cannot be overemphasized. To address emerging issues in this space, the FECoMo Coalition, through UNESCO’s Social Media for Peace (SM4P) Phase II project, in collaboration with Internews and KICTANet under the KenSafeSpace Project, both funded by the European Union, convened a training workshop from May 7-9, 2025, in Naivasha. The initiative brought together over 50 stakeholders affiliated with the FECoMo coalition and the KenSafeSpace network. Participants included regulators, human rights defenders, research institutions, tech actors, and grassroots organisations working across key thematic areas related to digital rights, safety, inclusion, and governance.

The convening came at a critical time, following the 2024 anti-finance bill protests in Kenya, which saw a rise in digital activism, with digital platforms playing a significant role in mobilising civic action and driving public discourse. While this was a key indicator of the possibilities that digital platforms offer, it also exposed significant threats within the space, which manifested in the form of disinformation, cyberbullying, doxxing, hate speech, and tech-facilitated gender-based violence (TFGBV). Such public interest events, including Kenya’s upcoming general elections, underscore the need to enhance stakeholder understanding, foster strategic networks, and develop practical approaches to promoting digital safety, inclusion, and responsible online engagement. The workshop provided a timely platform to discuss key issues, including recommendations and action points for key stakeholders in the digital space.

Key discussion points and outcomes from the workshop

  • A major focus of the workshop was on the challenges posed by emerging technologies such as AI and surveillance systems. These technologies, while transformative, present pressing concerns around data protection, content governance, and digital exclusion, particularly for marginalised populations. Stakeholders called for more agile, human rights-aligned legal frameworks, improved enforcement capacity, and greater investment in digital literacy and public awareness.
  • A key session on data privacy encouraged participants to reflect on how personal information is collected, stored, and used, particularly in programmes targeting at-risk or marginalised communities. Through a practical group exercise, participants examined common gaps in consent, security, and ethical data use. The discussion reinforced the importance of adopting privacy-by-design principles, limiting data collection to what is necessary, and ensuring communication is clear, culturally appropriate, and trauma-informed.
  • The question of regulation, “who should regulate digital platforms and how”, sparked rich dialogue. While many participants supported a co-regulation model involving both government and tech companies, concerns were raised about government overreach and opaque platform moderation practices that often result in unaccountable power. Concerns about overregulation were also raised, with proposals for guidelines or a code of conduct put forward to encourage responsible online behaviour. The need for localised moderation tools and accountability mechanisms was a recurring theme.
  • One of the workshop’s standout moments was the Kenya ICT Policy Reviews session, where participants proposed urgent legal reforms, including the review of Section 23 of the Computer Misuse and Cybercrimes Act, stronger legal recognition of TFGBV, and amendments to the Data Protection Act to better safeguard user rights.
  • Participants were also introduced to the Phoenix User Manual Tool, an open-source platform by BuildUp that enables real-time monitoring of online trends such as hate speech and disinformation. This tool was recognised as a vital asset for civil society organisations engaged in digital threat detection and advocacy.
  • Regulatory approaches from other jurisdictions were explored, drawing lessons on how global frameworks can inform local policy development. In a related session, the ICT policy-making process in Kenya was unpacked, highlighting the importance of inclusive consultations, evidence-based decision-making, and adapting international norms to fit local contexts.
  • Participants were encouraged to engage in policy and legal processes at all levels, for example, contributing to the African Commission’s AI study, the UN Digital Compacts, and Africa’s AI Continental Strategy. Success was defined as Kenyan voices shaping global digital policy, safer online spaces for vulnerable groups, and a resilient, informed coalition of rights defenders. The call to action was clear: (i) stay active in regional networks, (ii) share grassroots insights, (iii) respond to public consultations, and (iv) use digital platforms to amplify advocacy and drive inclusive policy change.
  • The workshop also facilitated a review of the Draft National Guidelines for Addressing Disinformation, Misinformation, and Hate Speech, whose development was led by NCIC with support from FECoMo. Participants provided feedback on language clarity, rights-based safeguards, and practical applicability in local contexts. These guidelines, set for launch in June, aim to ensure inclusive, transparent, and effective responses to online harms in Kenya.

Beyond reflective learning and providing proactive recommendations, the workshop highlighted the critical role of coalitions in amplifying digital rights advocacy. Representatives from FECoMo, KenSafeSpace, and TrustLab shared lessons on building decentralised, values-driven networks. Emphasis was placed on joint advocacy, mentorship for grassroots organisations, and aligned work plans to increase impact and resilience across all levels. These collaborative approaches strengthen resilience, foster knowledge sharing, and enhance influence at national, regional, and global levels.

Participants shared upcoming initiatives and reaffirmed their commitment to working collaboratively to promote safer, more inclusive online environments across Kenya.

About the SM4P Project

The Social Media 4 Peace project, currently in Phase II, is an EU-funded UNESCO initiative that seeks to strengthen the resilience of societies to potentially harmful content spread online, in particular hate speech inciting violence, while protecting freedom of expression and enhancing the promotion of peace through digital technologies, notably social media.

FECoMo, the National Coalition on Freedom of Expression and Content Moderation, is a multi-stakeholder coalition formed under the SM4P Project that brings together tech stakeholders, academia, government, think tanks, and civil society organisations to foster collaboration in addressing harmful online content, particularly disinformation and hate speech, while safeguarding freedom of expression in Kenya.

About KenSafeSpace Project

The Kenya Safe and Inclusive Digital Space (KenSafeSpace) Action is a 30-month project funded by the European Union, implemented by Internews, Internet Without Borders (IWB) and KICTANet. This initiative aims to amplify Kenyan human rights organisations’ voice, capacity, and influence to advocate for a democratic, safe, and inclusive digital environment. The consortium also collaborates with organisations such as the Bloggers Association of Kenya (BAKE), Tribeless Youth, Mzalendo Trust, and Watoto Watch Network (WWN) to support advocacy and research activities.

Authored by Noreen Wekati, Research Analyst, ACEPIS

FECoMo joins UNESCO for the Launch of Phase II of Social Media 4 Peace Project in Kenya


On 7 April 2025, members of the National Coalition on Freedom of Expression and Content Moderation in Kenya (FECoMo) joined UNESCO to officially launch Phase II of the Social Media 4 Peace (SM4P) project in Nairobi, Kenya.

This new phase builds on the achievements of Phase I and will continue to support national stakeholders in developing and applying evidence-based, context-relevant mechanisms to curb the spread and impact of harmful online content, including disinformation and hate speech. It aims to strengthen capacities and enhance cooperation with state authorities, members of parliament, and the judiciary to safeguard freedom of expression and promote access to information; it will also promote the active participation of marginalized and vulnerable groups in the governance of the information ecosystem in Kenya.

The event attracted 24 participants representing the 17 FECoMo member organizations, who collectively presented and discussed ideas and plans to continue building on the results achieved in the last three years, as well as the priorities that will guide their work during Phase II of the project, which will run until December 2027. The launch also marked an opportunity to co-create priority areas for Phase II of the SM4P project.


In her remarks, UNESCO Regional Director for Eastern Africa, Louise Haxthausen, commended Kenya’s commitment to the project, stating that:

“Kenya’s commitment to the Social Media for Peace initiative has been exceptional, and the establishment of a dynamic, engaged coalition (FECoMo) is one of our key successes. This coalition is not just symbolic; it is actively driving change. Our goal is to continue fostering a digital landscape that prioritizes human rights, peaceful discourse, and collective resilience.”

Ms Haxthausen emphasized the importance of expanding the coalition’s membership and increasing its diversity by including more members from vulnerable communities, women, and youth, who are among the groups most affected by increasing hateful content and online disinformation.

John Okande, Programme Officer at UNESCO, indicated that key actions of Phase II of SM4P will include:

  • Fostering stronger partnerships with national actors, including content creators and digital platforms
  • Creating tools and knowledge products to improve human rights-based content moderation practices; and
  • Increasing youth engagement in addressing harmful content on digital platforms and promoting peacebuilding narratives.

A major development announced during the meeting was the development of the Draft National Guidelines for Disinformation and Hate Speech in Kenya, undertaken with support from the SM4P project. Kyalo Mwengi, Deputy Director of Legal Services at the National Cohesion and Integration Commission (NCIC), underscored the importance of this initiative, especially ahead of the 2027 elections.

“Hate speech, misinformation, and disinformation have increasingly moved to social media, where news spreads rapidly. We are majorly concerned about data security, especially with AI tools and other emerging technologies. As we approach the election period, it is crucial to build the capacity of youth to identify and use accurate information,” Mwengi emphasized.

The developed guidelines emphasize compliance with existing laws, user responsibility in flagging false content, and strong data protection. They advocate for evidence-based research, platform accountability, and timely removal of harmful content, grounded in human rights, transparency, and safety.

Misako Ito, Regional Advisor for Communication and Information in Africa at UNESCO, reaffirmed the importance of anchoring digital governance frameworks in international human rights standards, stating:

“To address online harms, we must ensure that we align existing legal tools with international human rights frameworks. For instance, UNESCO’s guidelines on governance of digital platforms offer valuable guidance that would help us create accountable and inclusive digital spaces. By implementing these frameworks, we can protect users’ rights and address emerging online threats in Kenya’s growing digital landscape.”

Sharon Kechula, Programme Manager at the Association of Media Women in Kenya (AMWIK), highlighted the need for stronger protections against Technology-Facilitated Gender-Based Violence (TFGBV), particularly for women and marginalized groups.

“We must intensify our efforts to ensure that women and marginalized communities are protected from digital harm,” she said, stressing the urgency of creating stronger legal frameworks as Kenya approaches the 2027 elections.

The event also explored mechanisms for creating synergies between UNESCO’s SM4P Phase II project and the KenSafeSpace initiatives, both funded by the European Union. These collaborative efforts aim to maximize impact in curbing harmful online content through shared resources and expertise.

Miriam Beatrice Wanjiru, Programme Officer for Eastern Africa at the Paradigm Initiative, emphasized the need to expand the project’s outreach beyond Nairobi to ensure inclusive participation from local organizations and counties within Kenya.

“As we move into the next phase, it is crucial to take the conversation beyond Nairobi and engage local organizations. Decentralizing our efforts will ensure more inclusive participation in the project.”

She added that the coalition should strengthen its role in legal and litigation processes to create lasting change.

The first phase of SM4P, launched in 2021, laid the foundation for addressing the challenges posed by social media, such as disinformation, hate speech, and other harmful content. It strengthened journalists’ capacities, equipped youth with media and information literacy (MIL) skills, and promoted dialogue between digital platforms and local stakeholders.

The launch also featured an interactive working group session, where participants from various FECoMo member organizations shared their priority intervention areas. The session aimed to identify opportunities for synergy and collaboration, with discussions centered on how partners can align their strategies and pool resources such as human capital, technical expertise, and financial support to amplify the impact of their initiatives. This collaborative approach is expected to enhance the coalition’s effectiveness and foster more coordinated responses to common challenges.

Social Media 4 Peace is a UNESCO initiative, funded by the EU Service for Foreign Policy Instruments.

Responsible Social Media Use Should not Justify Censorship


The Kenyan government plans to tighten regulation of social media platforms to curb what it terms “misuse” of social media in the country. In a directive issued by the Ministry of Interior and National Administration on its X account, social media platforms are now required to set up offices in the country to ensure responsibility and accountability, citing the rise of mis- and disinformation, hate speech, and online safety concerns. Some political leaders support this move, stating that social media platforms are being used to spread disinformation, facilitate cyberbullying, and disseminate other harmful content. These calls for regulation of social media platforms also coincide with a petition calling for a ban on TikTok for promoting sexually explicit content and threatening Kenya’s cultural and religious values.

Undoubtedly, social media platforms have an obligation to moderate content hosted on their platforms. They also have a responsibility to respect users’ freedom of expression, as outlined in the UN Guiding Principles on Business and Human Rights. Furthermore, UNESCO’s Guidelines for the governance of digital platforms offer guidance and a framework on how platforms should conduct human rights due diligence and adhere to international human rights standards in content moderation. These guidelines are critical given that content moderation actions, such as post takedowns, shadow banning, account suspensions, and content labelling, can limit freedom of expression.

Content moderation is a complex process involving context-dependent factors, which ideally should be handled by trained individuals given its impact on free speech. In practice, platforms employ a mix of algorithmic decision systems and human content moderators. Sometimes legal content is removed, and governments are increasingly requesting that social media companies take down content. Social media platforms must be transparent in reporting on the nature of these requests and how they are handled, recognizing that such requests can sometimes arise from political pressure and be contrary to international human rights standards on the limitation of freedom of expression.

The call for responsible social media use comes at a time when social media users who criticize the government have been abducted or even found murdered. On 24th December 2024, cartoonist Kibet Bull was abducted by individuals suspected to be security operatives. It is speculated that his abduction was linked to a series of silhouette images of President William Ruto shared on X and other social media platforms. Kibet’s abduction came hot on the heels of a wave of abductions of government critics since the June 2024 anti-finance bill protests.

But the government’s increased intolerance of criticism is not confined to the national level; it extends to county governments as well. For instance, on 6th December 2023, the body of a popular Meru political blogger, Benard Muthiani, popularly known as Sniper, was found at a river with marks suggesting he had been strangled. On 7th April 2024, the body of a popular Kisii-based political blogger was found dangling with a rope around his neck. Both these bloggers paid a heavy price for criticizing local politicians online.

Conclusion

While the government is keen on the removal of harmful content online, responsible social media use can be encouraged through:

  • National digital media literacy training to fill skills gaps in identifying disinformation and manipulated content and in verifying information.
  • Encouraging media diversity and access to information to stem disinformation campaigns, since disinformation thrives where there are no alternative sources of information for verification.
  • Establishing regulatory guidelines on platform accountability in Kenya to ensure tech companies respect human rights across the entire cycle of their operations, including product design, terms of use, content moderation, and complaints-handling mechanisms.
  • Developing a robust competition framework for digital markets to discourage monopolization and, in turn, give consumers more choices and access to better services.

Authored by Angela Minayo, Program Officer, Digital Rights at ARTICLE 19 Eastern Africa

The views expressed in this blog are the author’s own and do not represent those of FECoMo coalition members.

Social Media for Peace (SM4P) Project Supports the Development of National Guidelines for Addressing Disinformation and Hate Speech on Digital Platforms in Kenya


On 12 March 2025, the National Cohesion and Integration Commission (NCIC) convened a one-day expert review session on the Draft National Guidelines on Disinformation and Hate Speech. The proposed guidelines aim to establish clear guiding principles for promoting online information integrity while upholding the constitutional freedoms guaranteed under Article 33 of Kenya’s 2010 Constitution, which provides for freedom of expression.

This consultation brought together key stakeholders from the Communications Authority, Media Council of Kenya, Parliamentary Service Commission, Law Society of Kenya, Office of the Data Protection Commissioner, Internews, Article 19 Eastern Africa, and members of the National Coalition on Freedom of Expression and Content Moderation in Kenya. It followed preliminary consultations held in January 2025 in Naivasha, marking a significant step in the development of the guidelines.

“The development of these guidelines is a significant step as it will play a pivotal role towards curbing the harmful effects of misinformation, disinformation and hate speech on digital platforms in Kenya. This ongoing process reflects our shared commitment towards promoting online information integrity and protecting the rights of all Kenyans in the digital age.” Rev. Dr. Samuel Kobia, Chairman, National Cohesion and Integration Commission

The NCIC is also reviewing the National Cohesion and Integration Act (2008) to ensure it is responsive to emerging challenges to national unity and social cohesion in the digital age. The review will lay the ground for the ultimate implementation of the new guidelines.

“The Communications Authority recognises the transformative power of digital technologies while also acknowledging the potential for misuse. These guidelines demonstrate a crucial step in establishing a framework that fosters innovation responsibly, ensuring a safer and more inclusive digital environment for all Kenyans.” David Mugonyi, EBS, Director General and Chief Executive Officer of the Communications Authority of Kenya

While technological advancements have expanded access to information and communication, they have also increased the spread and impact of harmful content.

“As we work towards an ‘internet of trust,’ digital platforms must respect human rights, promote transparency, and ensure accountability. This includes protecting fundamental freedoms of expression, access to information, and addressing harmful online content. UNESCO, therefore, hopes that this process will result in a code that strengthens information integrity and a healthy digital ecosystem in Kenya.” Misako Ito, UNESCO Regional Advisor for Communication and Information in Africa

The guidelines are grounded in several key principles, including:

  1. Respect for human rights and fundamental freedoms;
  2. Transparency and accountability by tech companies;
  3. Continuous multistakeholder engagement and cooperation;
  4. User empowerment on media and information literacy.

“The responsibility of tech companies and social media providers to take primary duty for proper action against harmful content cannot be gainsaid. Harmful content is a threat to young democracies like Kenya. Therefore, social media companies must be joined by civil society and MDAs to work together in a multipronged approach to conclusively deal with the menace, through deliberate content moderation, partnerships, and digital education.” Leo Mutisya, Manager, Media Council of Kenya

The guidelines will align with national and international legal frameworks, including Kenya’s Constitution, the Kenya Information and Communications Act, and global instruments like the Universal Declaration of Human Rights (UDHR). They also complement the ongoing implementation of UNESCO’s Guidelines for the Governance of Digital Platforms, which promote information integrity and safeguard human rights online.

“We call for a human rights-centered approach in tackling disinformation and hate speech in Kenya that protects citizens’ freedom of expression online. This can be achieved through a multi-stakeholder model in policy and legislative processes, ensuring a robust business and human rights framework to guide companies on respect for human rights in their content moderation processes, and digital media literacy training. ARTICLE 19 Eastern Africa is grateful for the opportunity to participate in this process and will continue to give technical expertise on balancing freedom of expression while addressing potentially harmful content online.” Mugambi Kiai, Regional Director, ARTICLE 19 Eastern Africa

Kenya is taking decisive steps to address the growing challenges of disinformation and hate speech on digital platforms – issues that continue to threaten social cohesion, undermine democratic processes, and fuel distrust and conflict. The development of national guidelines aimed at promoting online information integrity, advocating for the responsible use of digital platforms, and safeguarding fundamental rights and freedoms represents a significant and commendable milestone in this effort.

This process is supported by UNESCO’s EU-funded Social Media 4 Peace initiative, which aims to strengthen the resilience of societies to harmful content online, particularly hate speech inciting violence, while protecting freedom of expression and promoting peace through digital technologies.