
While there is no specific legally binding international human rights instrument on AI, soft law instruments play a relevant role in a) specifying the content of long-established principles such as non-discrimination, b) supporting the development of trustworthy and ethical AI, and c) guaranteeing the safety and fundamental rights of people.

3. FINAL CONSIDERATIONS

It is undeniable that AI has emerged as one of the disruptive technologies of this century, with the potential to trigger significant transformations. The question is no longer whether we should welcome this type of technology, but under what conditions and with what safeguards. Recognizing that this emerging technology should not be embraced uncritically, systematic theoretical and empirical investigations are required to examine the potential negative consequences of AI and to avoid discrimination. This chapter sparks the discussion and offers initial thoughts through case studies.

With regard to AI and environmental racism, preliminary research reveals the likelihood of discrimination against social groups in situations of vulnerability, particularly in low-income neighborhoods with higher proportions of racial and ethnic minorities. The three case studies presented in this chapter showed an ambiguous relationship between AI and environmental sustainability.

Regarding data-driven environmental initiatives and AI, a literature review indicates that AI is a promising alternative for improving predictions of a variety of high-impact events. For example, it allows the development of programs that aid disaster relief by providing faster and more accurate weather forecasts. Positive outcomes in terms of environmental racism were also found, as early warnings and efficient post-event relief reduce the likelihood of casualties. Yet while climate AI is meant to address social and environmental inequities, in many ways it might reproduce them. As negative outcomes, the research finds geographical or population biases, non-representative data, and inequalities in the prioritization of resources. For example, coverage gaps in weather radars can inadvertently under-represent some populations, such as indigenous people and
