The medical profession has an ethic: first, do no harm. Silicon Valley has an ethos — build it first and ask for forgiveness later.

Now, in the wake of fake news and other troubles at tech companies, universities that helped produce some of Silicon Valley’s top technologists are hustling to bring a more medicine-like morality to computer science.

This semester, Harvard University and the Massachusetts Institute of Technology (MIT), both in the US, are jointly offering a new course on the ethics and regulation of artificial intelligence. The University of Texas at Austin, US, has just introduced a course titled “Ethical Foundations of Computer Science” — with the idea of eventually making it mandatory for all computer science majors.
And at Stanford University, US, the academic heart of the industry, three professors and a research fellow are developing a computer science ethics course for next year. They hope several hundred students will enrol.
The idea is to train the next generation of technologists and policymakers to consider the ramifications of innovations — such as autonomous weapons or self-driving cars — before those products go on sale.
“It’s about finding or identifying issues that we know in the next two, three, five, 10 years, the students who graduate from here are going to have to grapple with,” said Mehran Sahami, a popular computer science professor at Stanford who is helping to develop the course. He is renowned on campus for bringing Mark Zuckerberg to class.
“Technology is not neutral,” said Sahami, who formerly worked at Google as a senior research scientist. “The choices that get made in building technology then have social ramifications.”
The courses are emerging at a moment when big tech companies have been struggling to handle the side effects — fake news on Facebook, fake followers on Twitter, lewd children’s videos on YouTube — of the industry’s build-it-first mindset. They amount to an open challenge to a common Silicon Valley attitude that has generally dismissed ethics as a hindrance.
“We need to at least teach people that there’s a dark side to the idea that you should move fast and break things,” said Laura Norén, a postdoctoral fellow at the Center for Data Science at New York University who began teaching a new data science ethics course this semester. “You can patch the software, but you can’t patch a person if you, you know, damage someone’s reputation.”
To be accredited by ABET, a global accreditation group for university science and engineering programmes, computer science programmes are required to ensure that students understand ethical issues related to computing. Some computer science departments have folded the topic into a broader class, while others offer stand-alone courses.
But until recently, ethics did not seem relevant to many students.
“Compared to transportation or doctors, your daily interaction with physical harm or death or pain is a lot less if you are writing software for apps,” said Joi Ito, director of the MIT Media Lab.
One reason that universities are pushing tech ethics now is the popularisation of powerful tools like machine learning — computer algorithms that can autonomously learn tasks by analysing large amounts of data.
Because such tools could ultimately alter human society, universities are rushing to help students understand the potential consequences, said Ito, who is co-teaching the Harvard-MIT ethics course.
“As we start to see things, like autonomous vehicles that clearly have the ability to save people but also cause harm, I think that people are scrambling to build a system of ethics,” he said.
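The "learning from data" idea behind machine learning can be sketched in a few lines. This is a toy illustration with invented numbers, not code from any of the courses described: a single parameter is adjusted by gradient descent until a simple model fits a handful of data points.

```python
# Toy machine-learning sketch: learn w so that y ≈ w * x.
# The data points are invented for illustration (roughly y = 2x).
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]

w = 0.0  # the parameter the algorithm "learns"
for _ in range(200):
    # Average gradient of the squared error over the data.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= 0.01 * grad  # step the parameter toward lower error

print(round(w, 2))  # settles near 2.0, the slope hidden in the data
```

Real systems learn millions of parameters from far larger data sets, which is exactly why their behaviour, and its consequences, can be hard to anticipate in advance.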
This past fall, Cornell University introduced a data science course where students learned to deal with ethical challenges — such as biased data sets that include too few lower-income households to be representative of the general population. Students also debated the use of algorithms to help automate life-changing decisions such as hiring or college admissions.
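The kind of sampling bias described above is easy to demonstrate. The following sketch uses invented numbers, not Cornell's course material: a data-collection process that under-samples lower-income households produces a data set whose composition no longer matches the population it is meant to represent.

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# Hypothetical population: 40% lower-income households (invented figure).
population = ["lower"] * 400 + ["higher"] * 600

# A biased collection process: each lower-income record has only a
# 20% chance of making it into the data set.
sample = [h for h in population
          if h == "higher" or random.random() < 0.2]

def share_lower(records):
    """Fraction of records that are lower-income households."""
    return sum(r == "lower" for r in records) / len(records)

print(f"population share: {share_lower(population):.2f}")  # 0.40
print(f"sample share:     {share_lower(sample):.2f}")      # well below 0.40
```

Any algorithm trained on the biased sample would treat lower-income households as far rarer than they actually are, which is the kind of everyday pitfall the course asked students to confront.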
“It was really focussed on trying to help them understand what in their everyday practice as a data scientist they are likely to confront, and to help them think through those challenges more systematically,” said Solon Barocas, an assistant professor in information science at Cornell who taught the course.
In another Cornell course, Karen Levy, also an assistant professor in information science, is teaching her students to focus more on the ethics of tech companies. “A lot of ethically charged decision-making has to do with the choices a company makes: what products they choose to develop, what policies they adopt around user data,” Levy said. “If data science ethics training focuses entirely on the individual responsibility of the data scientist, it risks overlooking the role of the broader enterprise.”