Harvard is “in dire need” of classes that help steer technology in the right direction, Nikola Jurkovic ’25 told me in an interview.
“There’s just so much energy being put into making the technology more powerful — and just so few resources being put into shaping the technology positively,” Jurkovic said.
The social risks of emerging technologies are greater than ever and, more critically, often unpredictable. Industry leaders are aware of this dilemma, with major names in technology — Elon Musk and Steve Wozniak among them — calling in March for a six-month pause on training the most powerful AI systems.
Harvard’s primary (and self-touted) approach to tackling this issue so far has been the development of Embedded EthiCS — a collaborative program between the Departments of Philosophy and Computer Science with the aim of “bringing ethical reasoning into the computer science curriculum.”
I’ve spent the last few weeks speaking to College students studying technology about their impressions of the Embedded EthiCS program. I’ve been trying to answer the question: Is Harvard adequately preparing its students for this dangerous and uncertain technical future?
Right now, social responsibility feels like an afterthought to the University’s flashy innovation initiatives in technology. Even Embedded EthiCS is hampered by its limited reach and secondary status to course content, according to many of the students I spoke with.
Embedded EthiCS faces challenges arising directly from its implementation. Instead of standalone courses — which “send the message that ethics is something that you think about after you’ve done your ‘real’ computer science work,” per the initiative’s co-founder — the program brings guest lecturers into certain computer science courses each semester. However, this strategy hinges on the willingness of Computer Science faculty to collaborate on relevant case studies and extend the discussion beyond just one lecture, Embedded EthiCS program director Matt C. Kopec told me in an interview.
Relatedly, Teresa Lu-Romeo ’25 told me that the positive interactions that she’s had with the program have been due to the “goodwill” of the professors hosting the guest lecturers.
Lu-Romeo said that ideally, ethics should be treated as “intertwined, kind of inextricably so, with technical things.”
In her experience with Embedded EthiCS, this has only been possible when “professors have clearly put in the work to integrate it into a topic,” Lu-Romeo said.
Swati Goel ’25 told me that, in her experience with Embedded EthiCS, “the intentionality is good,” but that in implementation, “there’s a little bit of a disconnect.” She recalled skipping the Embedded EthiCS module while enrolled in Computer Science 121: “Introduction to Theoretical Computer Science.” Since the assignment wasn’t graded, she said that, like many of her classmates, she simply had no incentive to do it.
Across all my interviews, I found this sentiment to be common. Few disagreed with the mission of Embedded EthiCS — but the embedded format of its interventions often leaves ethics feeling sidelined, which in turn casts doubt on how essential the computer science curriculum really considers ethics to be.
Conan Lu ’26 told me about his experience in Computer Science 51: “Abstraction and Design in Computing,” where he said a lack of serious engagement with the Embedded EthiCS module “points to an overall de-prioritization from course staff.”
His assignment for the module, an essay, was never graded.
“The professor, I don’t know what he did with it,” he told me.
I brought some of these concerns to Kopec, the director of the program. While he emphasized the program’s structure of working with both philosophy and computer science experts to generate technically relevant curricula, he also acknowledged the limitations of the Embedded EthiCS approach.
“We’re unfortunately under the restrictions that we need the help of the CS professor for us to be able to integrate ethics material more seamlessly. And they just have limited time,” he said.
The program’s structure, per Kopec, is largely intended to conserve resources: its modular design allows for “a lot of content” with “a smaller amount of resources,” he said. To that end, the program’s teaching lab consists of one faculty member and a group of postdoctoral fellows, who deliver module content.
“If they gave us a computer science faculty line and a philosophy faculty line, and one of the ideas is that they’re going to teach courses like interdisciplinary courses and computing ethics, that solves a lot of problems,” Kopec told me.
But even when Embedded EthiCS effectively delivers on its mission, its scope remains narrowly focused on ethical interventions for computer scientists, according to Kopec. To create leaders — of all disciplines — who can navigate the transformative effects of emerging technologies, Harvard needs to do more.
Embedded EthiCS, while a laudable development, must be complemented by other programming — such as accessible CS classes for policy students, seminars uniting engineers and social scientists, or even a new interdisciplinary concentration — to provide comprehensive opportunities for the crucial study of technology and society.
Across the University, many students are approaching the issue of technology’s social impact from their own perspectives. Unfortunately, those whom I spoke to relayed that curricular support from the College often falls flat, leaving these students to forge their own path on an issue that’s increasingly vital.
The lack of accessible technical courses, for example, means that social scientists of technology are often left without options to gain technical perspectives that would augment their understanding of the subject. Allegra Wong ’26 told me that this often leads to classes that “teach you the buzzwords side of things, of how to talk in a politician-esque way” on issues of technology.
“There’s CS 50 or CS 32 — and then it gets really hard really fast,” Wong said. “So I almost think there’s a cap on how much technical skills that I can get.”
Sophia C. Weng ’24 echoed this sentiment.
“I’ve had so many inane conversations in classes about ChatGPT that are hair-pulling, with people who don’t actually understand the technology,” she told me.
“Not that I really do, but I know what I know enough to know what I don’t know,” Weng added.
So what should Harvard do? Offering courses that explain the mechanisms of AI and machine learning would help more politically oriented students understand what is technically possible in terms of transparency and fairness. These courses would go beyond new General Education offerings in this space — courses that, while laudable, are too broad to provide rigorous training.
Moreover, Harvard should build on interdisciplinary academic spaces, where engineers and social scientists alike can learn from each other. Sherry X. Liu ’24, a teaching fellow for Computer Science 105: “Privacy and Technology,” sees potential in courses like the one she teaches, which is a discussion seminar capped at 48 students. The strength of the class, in her view, is bringing together students who have “dedicated a lot of time to knowing a lot in their own respective disciplines.”
Finally, Harvard should consider a dedicated undergraduate concentration grounded in both technology and social science, akin to MIT’s Science, Technology and Society program or Stanford’s Symbolic Systems major. Almost all students I talked to said they knew people who would be interested in a similar course of study. In fact, Jurkovic said that he’s currently exploring a special concentration about “ensuring AI has a positive impact.”
Meanwhile, Julia High ’26 told me that to the best of her knowledge, some professors have been pushing for similar courses of study, including Latanya A. Sweeney, the Daniel Paul Professor of the Practice of Government and Technology at the Harvard Kennedy School and in the Harvard Faculty of Arts and Sciences.
“They talked about trying to make public interest tech more of an academic thing and trying to make it its own academic interdisciplinary discipline,” High said.
Regardless of its ultimate manifestation, Harvard must devote significant resources to equipping undergraduates with the skills and knowledge to address the wide-ranging effects of technology. Indeed, as The Crimson’s Editorial Board has noted before, ethics cannot be an afterthought of innovation, lest we all pay the consequences.
Correction: October 18, 2023
A previous version of this article misspelled Swati Goel’s name.
A previous version of this article also misstated that the program's teaching lab team was only made up of postdoctoral fellows. In fact, the teaching lab's staff also includes one faculty member.
Andy Z. Wang ’23, an Associate News Editor, is a Social Studies and Philosophy concentrator in Winthrop House. His column, “Cogito, Clicko Sum,” runs on triweekly Wednesdays.