July 13 2023 GM


Glitter Meetup is the weekly town hall of the Internet Freedom community at the IF Square on the TCU Mattermost, at 9am EDT / 1pm UTC. Do you need an invite? Learn how to get one here.

How to Tame the Algorithm: Recommendations to Improve the Wellbeing and Context of Delivery App Workers

Learn about the Karisma Foundation (a well-known Colombian digital rights organization) and its recent research and recommendations for taming the algorithms behind delivery applications (such as Uber, Rappi, Cabify, and Didi) to make them fairer for the people they employ and to support the overall wellbeing of delivery app workers. In addition, find out how the issues they uncovered impact delivery workers not just in Latin America but in other parts of the world as well.

Juan de Brigard is the Coordinator for Autonomy and Dignity at Karisma, where he studies algorithmic discrimination, AI, and digital identity, among other topics. He studied philosophy at the Universidad de los Andes and did a master's degree in bioethics at the Universidad Javeriana in Colombia.

Notes

Could you introduce yourself to the folks who are arriving in the chat room and tell us more about yourself and your work?
  • My undergraduate degree was in Philosophy, and I did a master's in Bioethics after that. I'm now part of Fundación Karisma, a CSO based in Bogotá, Colombia.
  • We look into all sorts of technology- and human-rights-related issues, but today we will be talking about delivery apps.
We know Karisma is a well-known organization working on digital rights from Colombia, but for those who did not know about the initiative, could you introduce Karisma’s work and objectives?
  • Yes. Karisma is turning 20 years old this year (we're getting older! haha), and at the beginning the foundation's work was more oriented towards Creative Commons and access to knowledge.
  • Nowadays we have 4 research lines: Social inclusion, civic participation, democratization of culture and knowledge, and the one I coordinate is called autonomy and dignity.
  • Apart from that, we have two laboratories: the k-lab, that supports the work of the lines and does digital security research and advocacy, and the appropriation lab, which works mainly with communities and ways of connecting rural areas.
What are the objectives and main projects of the “Autonomy and Dignity” area in Karisma?
  • The Autonomy and Dignity line works mainly on three fronts: digital identity, automated decision-making systems (which sort of overlaps with AI as a whole), and data governance.
  • This particular project falls under the second category, as automated decision-making systems are a key part of how delivery apps work, of course.
  • Here's some of our research regarding digital ID, if you want to look into it.
  • And all across those issues, we try to understand the consequences of introducing new technology on human rights. This may mean access to services, living conditions, disproportionate risks, etc.
It raises an interesting question of how algorithms are fundamental to this kind of work. Could you share your definition of algorithm with us?
  • Well, I like to use two quite simple definitions:
    • 1) a set of rules that, given an input, will give you an output; and
    • 2) a mathematical function.
  • I think if you start with these very basic ways of defining them, you can start building complexity. For instance, you can start questioning whether the set of rules is known, and whether it is fixed or changes over time.
  • You can also think in terms of the input and output:
    • Is the input biased in any way?
    • Is it what you need in order to arrive at the outcome?
    • Is the outcome a decision? Is it new information?
    • Does it affect someone's life?
  • Of course, once you get into machine learning and deep learning algorithms, one of the key questions becomes how the algorithm was produced and what sort of process or outcome you are optimizing for.
  • In the case of delivery apps, for instance, it's quite clear that you are optimizing for the income generated for the owners of the apps, which leads one to question what is being left behind as you optimize for that end.
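The two definitions above can be sketched in a few lines of code. This is a toy illustration only, with invented names and data; it is not any real delivery app's logic.

```python
# 1) An algorithm as a set of rules: given an input, produce an output.
def assign_order(riders, order_location):
    """Rule: assign the order to the nearest available rider."""
    available = [r for r in riders if r["available"]]
    return min(available, key=lambda r: abs(r["position"] - order_location))

# 2) An algorithm as a mathematical function, f(rider, order) -> number.
#    A platform optimizing for speed/income would maximize a score like this.
def score(rider, order_location):
    return -abs(rider["position"] - order_location)

riders = [
    {"name": "Ana", "position": 2, "available": True},
    {"name": "Luis", "position": 9, "available": True},
    {"name": "Sol", "position": 5, "available": False},
]

chosen = assign_order(riders, order_location=4)
print(chosen["name"])  # Ana (the nearest available rider)
```

Even this tiny rule set invites the questions raised above: the input may be biased (who counts as "available"?), and the chosen objective silently decides what is being optimized for.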
And could you describe how the algorithms are impacting our work and daily life?
  • Well, this depends a lot on where you live and what your lifestyle is, but a very straightforward example I'm sure you're aware of is algorithmic recommendations on social media and their influence on the culture/content we consume and on our consumption practices (for instance, the addictions that form around them).
  • But beyond that we can mention other interesting (and sometimes worrying) applications such as welfare/benefits allocation.
  • There's a system in place in Colombia which does precisely this, and it is NOT exempt from bias. We've done quite a lot of work on it.
  • Here's one of the reports (in Spanish): Experimentando con la pobreza.
  • We looked into the potential beneficiaries of the Sisbén system in Colombia. They are surveyed by the national government in order to place them in a tier which grants them access to different state programs based on their income/vulnerability. The thing is, the sorting algorithm is automated, and it is very unclear to people why they end up in one group or another.
  • This means some people are left out and unable to correct the categorization, or end up with the feeling that it was biased or unfair.
  • The algorithm they use is kept secret as a way of "protecting national resources," as they think people are going to take advantage of it if it is disclosed.
  • We're fighting for the algorithm to be made public so that it can be audited, and so that people can better understand the government's decisions and be able to contest them.
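A deliberately simplified sketch can show why a secret tier-sorting rule is hard to contest. The real Sisbén algorithm is not public; the formula and cutoffs below are invented purely for illustration.

```python
# Invented cutoffs: combined score below the cutoff -> that tier.
THRESHOLDS = [(30, "A"), (60, "B"), (100, "C")]

def assign_tier(income_score, vulnerability_score):
    # Invented formula: a weighted combination of two survey scores.
    combined = 0.7 * income_score + 0.3 * vulnerability_score
    for cutoff, tier in THRESHOLDS:
        if combined < cutoff:
            return tier
    return "D"

# Two nearly identical households land in different tiers:
print(assign_tier(40, 5))  # 0.7*40 + 0.3*5 = 29.5 -> "A"
print(assign_tier(41, 5))  # 0.7*41 + 0.3*5 = 30.2 -> "B"
```

When the weights and thresholds are hidden, the second household has no way to see that a one-point difference in one survey answer moved them across a cutoff, which is exactly why auditability matters.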
The Colombian Government sent their labor reform to the Congress in March 2023, which included a section related to platform workers' rights. What stage of discussion or voting is the bill in the Congress? Will it have options to become a law?
  • The current government is putting forward A LOT of reforms (on health, the judicial system, the electoral code, etc.), and labour was just one of them. Given the amount of work that implied, congress pushed back (for political reasons) on some of those reforms. One of them was the labour reform.
  • But the reasons were political and had very little to do with the gig economy. The main tension was between private interests and guarantees for employees.
  • The government will present the bill again, we'll have to see how it changed and try and figure out whether it passes on this second try, but that will be next semester, at best.
  • Regarding our particular topic, it is hard to tell what place the gig economy will have in the new draft.
  • The legislature just ended, I think the question of when to present it again is very political, so it is not easy to forecast... It could be as soon as August/September but it could be later as well.
  • They have said they are working on it, but unless you have some definite majorities (a more robust coalition) in congress it is unlikely that they will even attempt it again.
Could you summarize the main recommendations of your report "Taming the algorithm: recommendations to improve welfare of riders of delivery apps" regarding the welfare of delivery app riders in Colombia?
  • We looked specifically into delivery apps here in Colombia. We were trying to come at the issue from another angle: usually the discussion on the gig economy focuses on labour rights (as we mentioned) and, although that is probably the most important question to answer regarding riders' wellbeing, this approach can lead to overlooking the technology.
  • Since our expertise at Karisma is precisely technology, we offered an alternative route for regulating platforms: regulating the technology itself. The ministry for Technology, Information and Communications (MinTIC) has the competence to demand certain standards of platforms, and that's what we're aiming for.
  • It's not necessary for a whole legislative agenda to move forward in order to start improving riders' lives.
  • So our report looks into particular characteristics of the algorithm (which is also secret, but we can learn its functionalities by inquiring into the experiences of riders) and how they affect the fairness of the work that riders do.
  • We're aiming at getting MinTIC to set some standards for:
    • 1) precision regarding the distance traveled by riders
    • 2) for it to incorporate or take into consideration risks (weather, security, etc.)
    • 3) to include a support system for beneficiaries that allows them to communicate with "their bosses" at all times, and
    • 4) accountability and clear explanations for instances such as banning people from the platform.
  • For instance, it is clear that there is not a very efficient or open communication line between riders and administrative personnel. They are frequently left on their own or punished without understanding why.
  • A very grim example: one rider was on his way to deliver an order when his friend (who was also working on his bike) got into a crash. The first rider stopped to help him and get him an ambulance, and as a result of his delay, he was banned from the platform for several days. His friend ended up losing his life, and he had no way whatsoever to communicate this situation so that he could be treated more humanely. It's quite a sad example, but it shows how deep something as simple as open communication can go.
You also included weather conditions in your recommendations?
  • Yes, the weather has, to some extent, already been included. Riders get paid a slightly higher fee when it is raining. But they don't get more flexible delivery times, for instance.
  • And since the decision was made by the company, they can roll it back whenever they decide it's not necessary for profit...
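The asymmetry described here can be sketched as follows. All names and numbers are invented for illustration; this is not the apps' actual pricing logic.

```python
BASE_FEE = 5000          # COP, invented figure
BASE_DEADLINE_MIN = 30   # minutes, invented figure

def current_terms(raining: bool):
    """What the interview describes: rain adjusts the fee only."""
    fee = BASE_FEE * (1.15 if raining else 1.0)  # small rain bonus
    deadline = BASE_DEADLINE_MIN                 # unchanged either way
    return fee, deadline

def risk_aware_terms(raining: bool):
    """One possible shape of the report's risk recommendation:
    adjust the deadline as well as the fee under risky conditions."""
    fee, deadline = current_terms(raining)
    if raining:
        deadline += 10  # extra time when conditions are risky
    return fee, deadline
```

The point of the sketch is that fee and deadline are independent parameters: a platform can compensate one risk dimension while leaving the other, and the time pressure, untouched.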
Would you like to share your contact (mail, Twitter, or what works better for you) to the community members, just in case somebody wants to reach you?
  • You can contact me through my email: juan.de.brigard@karisma.org.co