June 13 2024 GM

Kwanele's Journey Navigating Ethical Tech for GBV Justice

Join us on June 13 to hear from Leonora and Lebogang, currently the Managing Director and Lead Developer of Kwanele South Africa, who will be talking about:

  • How Kwanele approaches co-creating technology to support survivors of gender-based violence (GBV).
  • The features and impact of the mobile application and chatbot, which provide crucial resources and assistance to those in need.
  • Kwanele's ethical approach to building technology aimed at increasing access to justice for survivors of GBV.

Lebo reviews data for Kwanele at App Developer Studio (ADS) and helps connect ADS and Kwanele. She facilitates data set collection and works with their youth advisory committee.

Leonora is the founder of Kwanele. She comes from a background in the NPO sector and is a passionate advocate for justice, equality, and women's rights.

What is Glitter Meetup?

Glitter Meetup is the weekly town hall of the digital rights and Internet Freedom community at the IF Square on the TCU Mattermost, at 9am EDT / 1pm UTC. It is a text-based chat where digital rights defenders can share regional and project updates and expertise, ask questions, and connect with others from all over the world! Do you need an invite? Learn how to get one here.

Notes

Does GRIT stand for something more than the word?
  • Yes! Gender Rights in Tech. We were Kwanele for the last 3 years but just changed our name, as we are looking to scale in Africa.
Can you give us a brief overview of what GRIT/Kwanele does, including how you got to designing a technology for GBV Justice?
  • GRIT, formerly known as Kwanele, is a South African-based organization dedicated to leveraging technology to increase reporting and access to services for survivors of gender-based violence (GBV). Our mission emerged from deep frustration with the South African justice system, where access to justice and safety was often reserved for a small percentage of the population.
  • To address these challenges, we developed several innovative solutions:
    • Emergency App: Our app includes a panic button for immediate help and a secure vault for storing evidence, and it is data-free to ensure accessibility for all users. Note that the panic button currently works only in South Africa, as it is linked to armed response. The bot is also still in beta mode, so feel free to test her, break her, and send feedback. As an NPO, we cover all the costs; the app's running costs, the armed response, and the data costs are among the things we fundraise for.
    • WhatsApp Line: This service provides legal assistance, counseling, and access to emergency relief, making essential support easily reachable.
    • AI Chatbot Zuzi: Designed to guide survivors through the justice system, sexual and reproductive health (SRH), and access to abortion services, Zuzi uses youth-friendly language to ensure clarity and comfort. The chatbot is still in beta testing and will be available on the web, Facebook, and WhatsApp.
  • GRIT was born out of a need to make justice and services more accessible to everyone, especially those who have been marginalized. By using technology, we aim to bridge the gap and provide crucial support to GBV survivors across South Africa.
When will the AI chatbot be available to test, break, and provide feedback on?
  • It's ready to be tested. She can be tested on questions around safety planning and protection orders.
  • Please note that Zuzi (the chatbot) is trained on South African law, so most of the time you will pick up on that. The reason is that we want to ensure accuracy on certain topics first before we spread our wings to others.
Have the panic buttons improved the ease of reporting emergencies and the response from South African emergency services?
  • Yes, the panic button definitely improves response times, but we need to scale up our app user base now.
Do you have connections with social media platforms, or even a "workflow" with them, for when you need them to take action for the survivors of GBV you support?
  • Yes, we are on a journey with Meta around this, and part of our scaling now is in the area of technology-facilitated GBV (TFGBV).
  • We have seen a huge increase in requests for services in this area, and we are about to launch an 8-month child/youth-led research project on it.
Lebo, how did you learn about digital rights? And as a developer relatively new to them, how do you reconcile incorporating rights-based insights into your work with the seemingly rigid and different priorities of the computing world?
  • I studied data science, so I knew how to handle data in many ways. Later, I learned web development and got a job at ADS as a data reviewer. ADS makes apps, and I wasn't sure if they cared about digital rights until Kwanele asked for an app and I had workshops with them. During the workshops, digital rights were explained clearly. I asked my company if we follow digital rights, and they said it's very important and a top priority. They make sure to comply with digital rights when posting apps to the Play Store or App Store. This reassured me that our work aligns with digital rights standards.
  • To be honest, my understanding of digital rights was very basic compared to what I learned with Kwanele. I took that as a free course.
Can you tell us how you go about actually creating technology informed by local communities and GBV survivors in South Africa?
  • Firstly, none of this part of our journey would have been possible without Bobi from Mozilla; she guided and upskilled me 100% on everything I know in this process.
  • At GRIT, we're all about creating technology that's genuinely informed by local communities and GBV survivors in South Africa. Our design approach is rooted in the “terms we serve with,” which balances legal requirements with user-centered needs. Everyone knows those huge terms and conditions documents that no one ever reads. But with our tech solutions, where people share deeply personal information about some of the worst moments of their lives, we wanted to ensure accessibility, even for those with low tech literacy.
  • We took the legal requirements and held focus groups with womxn and youth to hear what they wanted included. Through many revisions, we built the terms we serve, making sure children and young people understood what was being done with their data and felt safe. This approach was also applied to our app and bot. It’s a lengthy and more expensive process, but the result is a tool that truly speaks to the user.
  • For our AI chatbot, Zuzi, we conducted extensive focus groups on her persona, look, and tone. During testing, we found that people wanted a chatty tone. With help from youth nationwide, we're currently revising the prompts and dataset to be much more conversational (see the sketch after this list). This iterative process required multiple changes and a lot of humility—no egos allowed!
  • By involving the community every step of the way, we ensure our tech solutions are not only legally sound but also empathetic and user-friendly. This co-design approach might take longer and cost more, but it results in tools that genuinely resonate with and serve the people who need them most.
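To make the prompt revision mentioned above concrete, here is a hypothetical before/after pair of system prompts in Python. Zuzi's actual prompts are not described in these notes, so the wording below is purely illustrative of the kind of tone shift involved.

    # Hypothetical sketch only; not Zuzi's real prompt. It illustrates
    # revising a system prompt from a formal register to the chatty,
    # youth-friendly tone that focus groups asked for.
    FORMAL_PROMPT = (
        "You are a legal information assistant. Provide accurate, formal "
        "guidance on protection orders under South African law."
    )

    CONVERSATIONAL_PROMPT = (
        "You are Zuzi, a friendly guide for young people in South Africa. "
        "Explain protection orders in plain, chatty language, one small step "
        "at a time, and check in with the user before moving on."
    )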
Are these the builders' internal egos, or egos that could be mirrored in the chatbot's responses?
  • Builders. Often you go into a community session with your own thoughts, or those of our builders or our team, and you quickly see that, while well intended, an idea was a very bad one, and you can't feel offended by that.
How do you build trust among your intended users? That is a problem we often face as developers of digital tools, with limited time and funds to actually engage with our users.
  • Leonora says: We do a lot of work in the community, whether through NPOs, schools, community leadership, chiefs, or the tribal council. Our goal this year is also to train up 120 community-based advocates nationwide who will then do peer-to-peer onboarding. But it takes time. People are hesitant. The digital divide in South Africa is also HUGE.
  • Lebo adds: The first thing we do is teach people about digital rights and let them know that our apps don't require much of your information, which could put you in danger. Another thing is that we teach them how we keep their data private; for instance, for the chatbot we only need phone numbers, which we receive as tokens.
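As a rough illustration of what receiving phone numbers as tokens can look like in practice, here is a minimal Python sketch of keyed-hash pseudonymization. This is an assumed technique, not GRIT's actual implementation; the key handling and normalization shown are hypothetical.

    import hmac
    import hashlib

    # Hypothetical sketch: derive a stable, non-reversible token from a phone
    # number so the service can recognize a returning user without storing the
    # number itself. In practice the key would come from a secrets manager,
    # never be hard-coded.
    SECRET_KEY = b"replace-with-a-key-from-a-secrets-manager"

    def tokenize_phone_number(phone_number: str) -> str:
        normalized = phone_number.strip().replace(" ", "")
        digest = hmac.new(SECRET_KEY, normalized.encode("utf-8"), hashlib.sha256)
        return digest.hexdigest()

    # The token, not the number, is what gets stored.
    print(tokenize_phone_number("+27 82 000 0000"))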
What lessons, including frictions, have you learned over the course of this project that have been beneficial to GRIT's growth, and that technologists or social scientists in digital rights can take note of for their own work?
  1. User-Centered Design is Crucial: One of the most important lessons we've learned is the value of a user-centered design approach. Involving actual users, especially those from marginalized communities, in the design process ensures that the technology meets their needs and is accessible. Technologists and social scientists should prioritize this approach to create more effective and impactful tools.
  2. Balancing Legal Requirements with Accessibility: Creating terms and conditions that are both legally sound and user-friendly was a significant challenge. Through multiple revisions and user feedback, we found a balance that ensured users understood their rights and felt secure. This lesson emphasizes the importance of making legal information accessible, especially in contexts where tech literacy may be low.
  3. Tech Literacy vs. Security: We struggled extensively with the trade-off between security and accessibility. The more secure you make a tool, the more exclusive it can become, potentially alienating users with low tech literacy. Finding a balance where the tool remains secure yet easy to use is crucial. This is a significant consideration for technologists aiming to create inclusive digital solutions.
  4. Iterative Development and Flexibility: Building technology this way is time-consuming and often requires multiple changes. Being flexible and open to continuous feedback is crucial. We learned that having no ego in the process and being willing to adapt based on user feedback leads to better outcomes. For technologists, this means embracing iterative development and being ready to pivot when necessary.
  5. Investing in Community Engagement: Our extensive focus groups and continuous engagement with the community have been vital. This investment has not only improved our products but also built trust and credibility with our users. Social scientists and technologists should recognize the value of deep community engagement and the long-term benefits it brings.
  6. Addressing Tech Literacy Gaps: Understanding that not all users are tech-savvy was key. Simplifying interfaces and providing clear, easy-to-understand information ensured that our tools were accessible to everyone, regardless of their tech skills. Technologists should always consider the varying levels of tech literacy in their user base.
  7. Creating a Safe and Inclusive Environment: Ensuring that users, especially children and young people, felt safe and understood how their data was being used was critical. Building trust through transparency and inclusivity is essential. For those in digital rights, creating safe spaces where users feel respected and heard can significantly enhance engagement and trust.
  8. Longer and Costlier Processes Yield Better Results: While our co-design approach was longer and more expensive, the results were worth it. Tools that speak directly to the user and address their real needs are far more effective. This lesson underscores the importance of investing time and resources into thorough, user-focused development processes.
Regarding point 3, "Tech Literacy vs. Security": did you find the balance eventually?
  • No, it's a huge challenge.
What were some of the expertise you have needed to bring together or learn yourself to build the technology? Are those changing for you?
  • Building the technology for GRIT was a very steep learning curve for me, coming from an NPO background rather than a tech background. Learning about AI was particularly challenging, but we were very blessed with the support from the online community, which was incredibly open and helpful in upskilling us. Mozilla played a crucial role in bridging a lot of our learning during the “terms we serve” process and ethical AI considerations. Additionally, support from the Gates Foundation provided technical assistance in coding and deployment.
  • One significant challenge we faced was testing our technology in grassroots communities. The existing ecosystem is not designed for such environments, and the testing rubrics provided to us were far too abstract for areas with low tech literacy. We had to adapt these rubrics significantly to ensure we captured the most valuable input from our users.
  • Our expertise needs are always evolving. Initially, we needed to understand AI and ethical considerations, along with coding and deployment. Now, we are focusing on language accuracy in African languages and more community-based UX design. We are also revisiting our testing methodologies to better suit the environments we work in, ensuring that our tools are truly user-friendly and effective.
  • Despite the challenges, the learning process has been incredibly rewarding. It has highlighted the importance of adaptability and community engagement in developing technology that serves marginalized groups effectively. As we continue to grow, we remain committed to learning and evolving to meet the needs of the communities we serve.
How can people interested in your work reach you?
  • If you're interested in our work at GRIT, we would love to hear from you! You can reach out directly via email at [email protected] or connect with us on our social media channels at @grit_gbv.
  • We are also currently recruiting for our Tech Advisory Board, so if you're passionate about using technology for good and want to help us grow, please get in touch.