As practitioners and consumers of data science, how do we make ethical decisions when confronted with thorny dilemmas in artificial intelligence (AI)? How do we even recognize those dilemmas before they become public relations catastrophes? The humanities and social sciences provide productive perspectives to help us navigate ethical conundrums in AI. At first glance these disciplines seem far outside the STEM fields traditionally associated with data science, but including them on your data science team allows powerful ethical questions to be asked and gives you the tools to answer them.
Philosophers, historians, and others with humanistic perspectives foster an ethos of asking "what if" questions that stretch the mind to possibilities not yet experienced. These voices are key to understanding the future of AI ethics. Social scientists, such as anthropologists, sociologists, and psychologists, have the tools and research expertise to explore solutions to ethical problems by focusing on the human element.
One example of collaboration among researchers in the humanities, social sciences, and data science is a study of 2.3 million individuals across the globe that addressed a critical ethical issue faced by AI in self-driving cars: should a car swerve to avoid hitting a jaywalking pedestrian if doing so jeopardizes the lives of the car's passengers? It turns out the socially acceptable answer to that question is not universal; it varies across regions of the globe. This suggests that how an AI should respond to that scenario, if it is to provide the most ethical response, should also vary from region to region. Getting that insight required a transdisciplinary team whose combined perspectives allowed them to think outside the box of standard STEM fields.
In a recent Nature Machine Intelligence article, a pair of NYU researchers argue that qualitative research of the kind found in the social sciences and humanities can aid the development of AI in three fundamental ways:
- Qualitative social research can help us understand the categories through which we make sense of social life and which are being used in AI.
- A qualitative data-collection approach can establish protocols that help diminish bias.
- Qualitative research typically requires researchers to reflect on how their interventions affect the world in which they make their observations.
All of these strengths are essential to an ethical AI.
The importance of incorporating the social sciences and humanities into STEM-related products and solutions is not a new concept, but the divide between these spheres is growing, even as the need for these perspectives in artificial intelligence becomes ever more important. Northwestern University and the University of Illinois are hoping to combat this issue with "CS + X" programs, in which computer science majors can gain specializations outside traditional roles. These programs are the future of artificial intelligence. While STEM skills are essential to furthering data science and AI, the qualitative perspectives of the social sciences and humanities are essential for an ethical future of AI.
Most leaders want to make the upstanding choice, but determining which decisions are the most ethical, given the resource constraints organizations face, requires nuanced thinking beyond traditional STEM perspectives. At Pandata, our data scientists come from diverse backgrounds in STEM, the humanities, and the social sciences. We have the skills needed to address questions in an ethical and responsible manner, always looking to protect our clients and society from unintended consequences.
We empower innovative organizations to design and develop human-centered AI and machine learning solutions.
Our deep expertise in human behavior is why we shine at designing and developing ethical, human-centered AI and ML solutions for innovative teams. This expertise is instrumental in helping our clients navigate the complexities of regulated and protected data sources to mitigate risk and protect their most priceless asset: their reputation.