CS professor co-teaches course on ethics of artificial intelligence
Over the past decade, Artificial Intelligence (AI) technology has progressed in leaps and bounds, becoming woven into the fabric of our everyday lives and a frequent topic of popular discourse. Even as tech enthusiasts such as Ray Kurzweil envision a rosy future in which AIs power an existential upgrade for the human species, others, like Oxford philosopher Nick Bostrom, warn us against the emergence of a malevolent superintelligence. Industry leaders and scholars including Elon Musk, Bill Gates and Stephen Hawking have called for ethical and engineering standards to prevent the development of powerful AIs that could one day supplant humanity.
While some focus on such future superintelligences, we currently live in a world where narrow, special-purpose AIs already have important positive and negative real-world implications for sectors such as transportation, manufacturing, finance, medicine and law.
Will AI software displace humans in various job categories? How will we allocate responsibility for accidents and errors that are caused, or prevented, by AI tools such as medical diagnostic systems and autonomous vehicles? How can we address biased learning in AI behavior, which can reinforce racial or gender prejudice in human decision making? How can we create AIs that complement, rather than replicate, human intelligence, so as to mitigate human myopia and support a sustainable, just world?
The Ethics of Artificial Intelligence University Course, which launched this semester, considers the immediate moral and legal repercussions of AI's presence in our society, as well as the possibility of consciousness, cognition, conation and emotion in an artificial being of the future, and the implications of that possible reality. Our aim is to equip students with the scientific and technological knowledge, and the critical capacity, to engage in discourse about rapidly evolving AI technology and its societal implications.
We have organized the course into weekly modules that address issues at the nexus of AI and fields ranging from the humanities to engineering. The weekly topics pair AI with Personhood; Smart Cities; the Arts; Social Justice; Law; Education; Environmental Sustainability; Superintelligence; Economy, Business, and Finance; Divinity; Warfare; and Medicine.
We have chosen an ordering that interlaces broad philosophical, societal and theological issues with engineering and applied areas, so that students drawn to either kind of topic will not have to wait long between the weeks that address it.
Because this is a University Course, we want to feature university faculty members. Each week a guest faculty discussant will participate in our Tuesday class discussions, not as a lecturer but as an embedded expert. For a full description of the course, visit our website.
The idea for the course grew out of two earlier projects we considered, one offering a more general overview of AI and the other concerning the aesthetics of algorithms. We decided to focus on the ethical dimension of AI after realizing that AI discourse and practice in the conceptual and artistic domains are inevitably and necessarily entangled with the axiological dimension of life.
Our course will address the near-term, pragmatic implications of AI in areas such as warfare, automation, law and transportation (a focus of Cornell’s and EdX’s courses, for example), as well as longer-term moral questions, including the implications of AI for human perceptions of personhood, material also covered in courses such as Edinburgh’s.
Unlike other courses at Vanderbilt and peer institutions, however, ours aspires both to transmit non-trivial computational intuitions about the operations of AI to those without computing experience and to exercise and grow competencies in literature, history, moral philosophy and business.