As Donna Haraway advised in Staying with the Trouble, the solution may not be a solution in itself. Rather, we can use the problem as compost (here I imagine technological "cagadas", literally "technological turds", meaning "technological screw-ups") in which to plant new forms of affect. In this text I wrote about some non-utilitarian initiatives around the problem of "engineered oppression and imperialist logic embedded in mainstream software" that you mention. Here is a list of them:
1. From Artistic Practice: The Library of Missing Data
2. From Activism and Empathy: Domestic Data Streamers
3. From the Poetics: Myriad (Tulips)
4. From Speculation: Xenoimage Dataset
The conclusion of the text points in this direction: justice in the content of a dataset is a matter of politics. When we do politics, we try to convince someone that one solution is better than another. If, instead, we show imaginary scenarios, we don't need to convince anyone, yet those scenarios can serve as a guide. We can ask open questions like: 'What would you, with your convictions, think in that scenario?' A shortage of imaginaries is what can lead to hegemonic and discriminatory thinking. Devising operational protocols that make it possible to hack the hegemonic visual imaginary through the refunctionalization of image databases can activate the visual future in the field of the disruption of hierarchies, relying on AI as a new sensorial tool for visualization and starting from a xenofeminist perspective. Algorithmic reprogramming will proceed through hyperstitional operators: tools for thinking other futures in which identity traits are not eradicated but deactivated as motors of oppression and inequality; speculative futures that resist and imagine beyond the consummation of the automatic organisation of life.