The Ministry of Equality has launched a new report criticizing algorithms and artificial intelligence for sexist behavior. Irene Montero's department maintains that the latest technologies "obscure discriminatory dynamics through the use of mathematical language." "The regulations meant to make algorithms feminist are cosmetic recommendations. Gender discrimination is treated as a mere statistical matter. The current recommendations do not question the logic or the rules of the game," it denounces.
This is stated in the document entitled Preliminary Report with an Intersectional Perspective on Gender Biases in Artificial Intelligence. The objective of this commission, which adds to other Government actions on the same issue, is to "promote knowledge and dissemination of the presence of women in the information society."
"Artificial intelligence (AI) is a booming process automation and organization technology with great social impact. However, empirical research and academic reports over the past five years have documented numerous cases of intersectional gender discrimination in AI," says the official report, a 42-page document signed by Lorena Jaume-Palasí. "You cannot separate technology from the social context that produces it and reduce the evaluation and prevention of gender discrimination to a mere statistical or mathematical matter," the authors insist, pointing directly at various technology giants.
Among the conclusions of the report, examples of "discriminatory gender impact" are cited, among them "the temperature of an intelligent building or personnel selection processes." It maintains that such job interviews are automated with AI programs that "have the middle-aged white male as the standard." "The rest becomes a statistical deviation within the system, producing a discriminatory gender impact, especially at the intersection with categorizations such as race, age, etc.," it theorizes.
Technology, they say, has a "patriarchal gaze" that commits "racism" and other "isms" such as "Islamophobia, transphobia, fatphobia and ageism," among others. The report also criticizes these systems for relying on the "gender binary," that is, the idea, which Equality rejects, that there are only men and women in society.
Nature
On the other hand, the report points out that environmental impact "should also form part of the criteria for assessing the impact of gender discrimination in algorithmic systems." For example, it asserts that "the European continent depends on a quantity of critical raw materials, located mostly on other continents, for the manufacture of these technologies, a living example of technologies that transcend borders." It describes as "key" that national and international regulations develop "mechanisms that protect against the discriminatory gender impact produced by the environmental consequences of Artificial Intelligence."
Dependence
It also conveys a very critical vision of these technologies. "Any creation of infrastructure for complex automation systems is in itself the creation of a medium- and long-term dependency," the report states.
"An agenda with a vision of gender justice must go beyond a reactive approach and offer a social vision that centers feminist interests and ideas. Although regulation already exists in this regard, these measures are partially cosmetic or are reduced to mere recommendations and self-assessments in the hands of the same companies that develop algorithmic systems," it adds.
In the last decade, they recall, there has been a regulatory shift: before, legislation was sectoral (pensions, banking, energy, transport), and now a more transversal (horizontal) regulation is being pursued, as with the General Data Protection Regulation, the Digital Services Act and the Digital Markets Act, or the Artificial Intelligence Act bill. Podemos complains that in this way the context disappears, "and it is the context that helps to develop ethical principles, evaluation methods and concrete measures."