Strengthening Design Governance for Safer Online Spaces in Kenya: Insights from DataFest Africa 2024

In an era where digital platforms dominate our daily interactions, ensuring safe and inclusive online spaces is paramount. At the forefront of this effort is user empowerment through capacity development and awareness creation. DataFest Africa 2024 hosted a transformative two-day workshop titled “Strengthening Design Governance for Safer Online Spaces in Kenya” to address this critical issue. The workshop, a collaborative effort by Ushahidi, Pollicy, UNESCO and FECoMo Kenya, aimed to equip participants with actionable toolkits for design governance and regulatory compliance tailored to the Kenyan context.

Reuben Kihiu and Wangu Mwenda from Ushahidi facilitated the two-day workshop, held at Aga Khan University. Ushahidi is a Kenyan-founded nonprofit with a 16-year track record in digital public good (DPG) initiatives. The workshop brought together a diverse group of participants including designers, developers, legal experts and tech enthusiasts.

The workshop was conducted in person and consisted of facilitator-led sessions and group activities. The first day highlighted the challenges digital platforms face today, such as misinformation, ideological polarization and the lack of transparency and accountability. The lack of African contextualization in software was also spotlighted. A multistakeholder approach was emphasized as crucial for effective digital governance. Recommendations included collaborative governance, inclusion of diverse stakeholders (especially vulnerable groups), institutionalized checks and balances and the promotion of cultural diversity.

  1. Balancing Freedom of Expression with Content Moderation in Reference to Article 33 of the Kenyan Constitution

Participants explored the delicate balance between upholding freedom of expression and implementing content moderation strategies on digital platforms. Recommendations included the development of content moderation tools that can accurately process and understand local languages and dialects. This would help in the detection and mitigation of harmful content without infringing on users’ rights to free expression. The importance of incorporating fact-checking mechanisms was emphasized to combat misinformation and disinformation while preserving the integrity of information shared online.
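
As an illustration of the local-language recommendation (not a tool produced at the workshop), the sketch below detects a post's language and checks it against a per-language review list; the langdetect library, the placeholder term lists and the flag_for_review function are all assumptions made for the example.

```python
# A minimal sketch, assuming `pip install langdetect`. The term lists and the
# flag_for_review helper are illustrative placeholders, not workshop outputs.
from langdetect import detect
from langdetect.lang_detect_exception import LangDetectException

# Hypothetical per-language terms that should trigger human review.
REVIEW_TERMS = {
    "sw": {"example_term_sw"},
    "en": {"example_term_en"},
}

def flag_for_review(post_text: str) -> bool:
    """Return True if the post should be routed to a human moderator."""
    try:
        language = detect(post_text)
    except LangDetectException:
        return True  # undetectable text is reviewed by a person, not auto-removed
    terms = REVIEW_TERMS.get(language, set())
    words = {word.strip(".,!?").lower() for word in post_text.split()}
    return bool(words & terms)
```

Routing flagged posts to human reviewers rather than deleting them automatically is one way to keep moderation from silently curtailing lawful expression.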

  2. Ensuring Non-Discrimination in Content Moderation in Reference to Article 27 of the Kenyan Constitution

In discussions around non-discrimination, the focus was on creating inclusive digital spaces that respect diversity. This involved the development of AI language models that support a wide range of local languages and dialects, ensuring that content moderation does not inadvertently marginalize certain groups. Participants also highlighted the need for platforms to contextualize user personas by considering different age groups, making sure that content and interactions are appropriate for all users, regardless of their background or identity.
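
One way to make the non-marginalization point concrete is a pre-deployment coverage check, sketched below with assumed language sets: posts in languages the moderation model cannot handle go to human reviewers instead of being auto-flagged.

```python
# A minimal sketch; both language sets are illustrative assumptions.
MODEL_LANGUAGES = {"en", "sw"}                    # languages the hypothetical model supports
PLATFORM_LANGUAGES = {"en", "sw", "luo", "kam"}   # languages observed on the platform

unsupported = PLATFORM_LANGUAGES - MODEL_LANGUAGES
if unsupported:
    # Incomplete coverage: send these posts to human review rather than the model,
    # so speakers of unsupported languages are not disproportionately flagged.
    print(f"Route posts in {sorted(unsupported)} to human reviewers.")
```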

  3. Protecting Consumers through Content Moderation in Reference to the Consumer Protection Guidelines, Competition Authority of Kenya (2017)

Protecting consumers in digital spaces was a key concern, with recommendations focusing on ensuring the integrity and trustworthiness of online marketplaces. Participants suggested thorough vetting of prospective merchants to maintain a reliable marketplace environment. Additionally, they advocated for the use of technologies like computer vision and machine learning to verify the authenticity and quality of products sold online. The idea of allowing third-party social proof through reviews and endorsements was also proposed to enhance credibility and reliability in online transactions.
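
The social-proof idea could be prototyped as simply as a weighted review score, as in the sketch below; the double weighting of verified purchases and the sample data are assumptions for illustration, not a workshop deliverable.

```python
# A minimal sketch of third-party social proof for an online marketplace.
from statistics import fmean

reviews = [
    {"rating": 5, "verified_purchase": True},
    {"rating": 1, "verified_purchase": False},
    {"rating": 4, "verified_purchase": True},
]

def merchant_score(reviews: list[dict]) -> float:
    """Weighted average rating; verified purchases count double (an assumed policy)."""
    weighted = []
    for review in reviews:
        weight = 2 if review["verified_purchase"] else 1
        weighted.extend([review["rating"]] * weight)
    return fmean(weighted) if weighted else 0.0

print(round(merchant_score(reviews), 2))  # 3.8
```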

  4. Protecting Children in Digital Spaces in Reference to Article 53 of the Kenyan Constitution

The protection of children online was discussed with a focus on ensuring access to age-appropriate content. Participants recommended the enforcement of rigorous age verification methods, such as requiring users to upload IDs or selfies to confirm their age. They also emphasized the importance of establishing strict digital boundaries between adults and children on co-shared platforms, to prevent exposure to inappropriate content and interactions. This was seen as crucial in creating a safer digital environment for younger users.
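
A minimal sketch of the boundary idea follows, assuming the ID or selfie check is handled by a separate verification provider and only a verified birthdate reaches the platform; the space names and the 18-year threshold are illustrative.

```python
from datetime import date

ADULT_AGE = 18  # assumed threshold for this sketch

def age_on(today: date, birthdate: date) -> int:
    """Whole years between birthdate and today."""
    return today.year - birthdate.year - ((today.month, today.day) < (birthdate.month, birthdate.day))

def allowed_space(verified_birthdate: date) -> str:
    """Route a verified user to the adult-only or minor-only space."""
    if age_on(date.today(), verified_birthdate) >= ADULT_AGE:
        return "adult_space"
    return "minor_space"

print(allowed_space(date(2010, 6, 1)))  # "minor_space" until this user turns 18
```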

Participants created user personas to identify the motivations, frustrations and skill levels of hypothetical users. This brought about clear statements reflecting user needs and challenges. Participants then brainstormed UI/UX solutions addressing specific user frustrations. Ideas ranged from the use of different servers in online games depending on user age to colour-sensitive websites for differently-abled persons. Teams presented their UI prototypes and received peer and expert feedback, fostering an environment of constructive critique and collaborative improvement.
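
The sketch below shows one way a persona from this exercise might be captured as structured data and turned into a need statement; every field and the example persona are illustrative assumptions rather than material produced by the teams.

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str
    age_group: str
    skill_level: str
    motivations: list[str] = field(default_factory=list)
    frustrations: list[str] = field(default_factory=list)

    def need_statement(self) -> str:
        """Condense the persona into the kind of user-need statement teams drafted."""
        return (f"{self.name} ({self.age_group}, {self.skill_level} user) needs a way to "
                f"{self.motivations[0]} without {self.frustrations[0]}.")

persona = Persona(
    name="Wanjiku",
    age_group="18-24",
    skill_level="novice",
    motivations=["sell her crafts to a wider market"],
    frustrations=["being targeted by fraudulent buyers"],
)
print(persona.need_statement())
```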

The workshop equipped participants with a thorough grasp of the legal limits on content moderation and promoted a user-oriented approach to addressing issues within their unique contexts. A key outcome was the emphasis on building secure online environments in Kenya that honour and safeguard the rights of all users.

After the workshop, participants expressed a deepened understanding of the importance of an inclusive design and development process. They emphasized how critical it is to create digital products that are not only secure but also accessible and equitable for all users. This realization underscored their commitment to ensuring that future digital platforms are both safe and inclusive, catering to the diverse needs of all communities.
