Democratizing Deep Learning: How Michigan is Teaching AI to Arts & Sciences Undergrads
Artificial intelligence, particularly deep learning, is reshaping society, demanding broader understanding beyond computer science specialists. Recognizing this imperative, the University of Michigan's Department of Statistics embarked on an ambitious mission: to create a rigorous yet accessible deep learning course for undergraduates in the College of Literature, Science, and the Arts (LSA). Led by Professor Ambuj Tewari, the course's journey over three years reveals valuable lessons in making complex AI concepts approachable while maintaining intellectual depth.
The Accessibility Challenge and Pilot Pivot
Deep learning's foundations—multivariable calculus, linear algebra, probability, and Python programming—pose significant barriers for students outside engineering. Michigan's solution was bold: drastically reduce prerequisites to single-variable calculus, introductory programming (any language), and one statistics/probability course. The Winter 2022 pilot, limited to 40 students, initially followed a traditional structure: concept-focused lectures and programming labs.
"After a few weeks, I started to feel uneasy. Students seemed overwhelmed, not just by the math-heavy lectures, but also by the programming assignments... It was not the students’ fault; my instruction had not fully adjusted to the realities of the reduced prerequisites," admits Tewari.
The small cohort proved crucial. Tewari paused the curriculum to introduce an intensive linear algebra bootcamp paired with hands-on NumPy and TensorFlow tensor labs. This mid-course correction, driven by direct student feedback, filled critical knowledge gaps and turned a potential failure into a success. Students progressed from linear regression viewed through a deep learning lens to fully connected neural networks, CNNs for images, and RNNs for time series.
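To make "linear regression viewed through a deep learning lens" concrete, here is a lab-style sketch (illustrative, not the actual course materials): linear regression framed as the simplest neural network, a single layer with no activation, trained by gradient descent in NumPy. The data and learning rate are invented for the example.

```python
import numpy as np

# Synthetic regression data (illustrative values, not course data).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))              # 100 samples, 3 features
true_w, true_b = np.array([2.0, -1.0, 0.5]), 0.3
y = X @ true_w + true_b + 0.01 * rng.normal(size=100)

# The "network": one linear layer with weights w and bias b.
w, b = np.zeros(3), 0.0
lr = 0.1
for _ in range(500):
    y_hat = X @ w + b                      # forward pass
    grad = y_hat - y                       # d(MSE/2)/d(y_hat)
    w -= lr * (X.T @ grad) / len(y)        # gradient step on weights
    b -= lr * grad.mean()                  # gradient step on bias

print(np.round(w, 2), round(b, 2))         # should approach true_w, true_b
```

The same forward-pass / loss / gradient-step loop carries over unchanged to deeper networks, which is what makes linear regression a natural on-ramp.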
Scaling Up and Evolving with the Field
Demand exploded. Enrollment surpassed 100 students in Winter 2023. Building on pilot learnings, Tewari embedded the linear algebra bootcamp upfront and ingeniously redesigned backpropagation instruction using only single-variable calculus and the chain rule, eliminating the need for multivariable calculus. The core solidified into four modules: Intro, Fully Connected Nets, CNNs, and RNNs.
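The single-variable reframing of backpropagation can be illustrated with a scalar "network" (this is an illustration of the idea, not the course's actual derivation): because every quantity is a single number, each gradient is just a product of ordinary one-variable derivatives via the chain rule.

```python
# Scalar network: h = w1*x, a = relu(h), y = w2*a, L = (y - t)^2 / 2.
def forward(w1, w2, x, t):
    h = w1 * x
    a = max(h, 0.0)                        # ReLU
    y = w2 * a
    L = 0.5 * (y - t) ** 2
    return h, a, y, L

w1, w2, x, t = 0.7, -1.3, 2.0, 1.0
h, a, y, L = forward(w1, w2, x, t)

# Backward pass: one single-variable derivative per step, multiplied together.
dL_dy = y - t                              # dL/dy
dy_da = w2                                 # dy/da
da_dh = 1.0 if h > 0 else 0.0              # derivative of ReLU
dh_dw1 = x                                 # dh/dw1
dL_dw1 = dL_dy * dy_da * da_dh * dh_dw1    # chain rule

# Sanity check against a numerical derivative.
eps = 1e-6
L_plus = forward(w1 + eps, w2, x, t)[3]
print(abs((L_plus - L) / eps - dL_dw1) < 1e-4)
```

No multivariable calculus is needed until the scalars become vectors, at which point the same products of local derivatives become the usual matrix form.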
The AI field's rapid evolution forced further adaptation. Post-ChatGPT, student interest surged in transformers. The Winter 2024 edition replaced RNNs with a transformer module. Simultaneously, a teaching grant enabled a novel foray: machine olfaction.
The Scent of Innovation: Smell as Data
Students engage in a deep learning lecture hall. (Source: CACM)
Unlike vision or language, machine learning's impact on smell (olfaction) remains nascent. Tewari introduced students to this frontier. Google Brain spin-off Osmo, aiming to give computers a sense of smell and advised by Geoffrey Hinton, became a case study. Osmo CEO Alexander Wiltschko (a Michigan alum) guest-lectured on graph neural networks for molecular tasks like predicting odor from chemical structure.
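The core idea behind graph neural networks for molecules can be sketched in a few lines of NumPy (a minimal illustration of message passing, not Osmo's actual model): atoms are nodes with feature vectors, bonds are edges, and each round updates every atom by aggregating its neighbors' features through a shared weight matrix, after which node vectors are pooled into one molecule-level embedding.

```python
import numpy as np

# Toy 4-atom "molecule": symmetric adjacency matrix (bonds) and
# one-hot atom-type features. All values are illustrative.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
H = np.eye(4)                                # node features (4 atoms x 4 dims)

rng = np.random.default_rng(1)
W = rng.normal(scale=0.5, size=(4, 4))       # shared message weights

deg = A.sum(axis=1, keepdims=True)           # node degrees for averaging
for _ in range(2):                           # two message-passing rounds
    H = np.tanh((A @ H) / deg @ W)           # average neighbors, transform

graph_embedding = H.mean(axis=0)             # pool nodes -> molecule vector
print(graph_embedding.shape)
```

A prediction head on top of `graph_embedding` would then map the molecule vector to odor descriptors; in a real system the features, weights, and pooling are learned end to end from labeled molecules.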
Theory met practice in a unique lab. Sensory expert Michelle Krell Kydd ("the nose of Ann Arbor") guided students in attentively smelling and describing substances. A subset then participated in blind-smelling 20 monomolecular odorants, labeling each (e.g., "sweet," "fruity"). The resulting dataset revealed fascinating individual variations alongside shared perceptions – nearly all identified vanillin as "vanilla."
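Turning such free-form descriptions into model-ready targets is itself instructive. One common approach (sketched below with invented odorants and labels, not the class's actual data) is to encode each odorant as a multi-hot vector over a shared descriptor vocabulary:

```python
# Illustrative blind-smelling labels (hypothetical, not the course dataset).
ratings = {
    "vanillin": ["sweet", "vanilla"],
    "limonene": ["citrus", "fruity"],
    "benzaldehyde": ["sweet", "almond"],
}

# Shared descriptor vocabulary, sorted for a stable column order.
vocab = sorted({d for descs in ratings.values() for d in descs})

def multi_hot(descs, vocab):
    """Encode a list of descriptors as a 0/1 vector over the vocabulary."""
    return [1 if d in descs else 0 for d in vocab]

targets = {name: multi_hot(descs, vocab) for name, descs in ratings.items()}
print(vocab)
print(targets["vanillin"])
```

Disagreement between raters then shows up directly as targets that differ across people for the same molecule, which is exactly the data-collection challenge the lab surfaced.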
"The goal was twofold: to introduce an emerging research area and to expose students to real-world data collection challenges," explains Tewari. This hands-on experience underscored the complexities of translating sensory phenomena into data usable by deep learning models.
Lessons Learned and The Road Ahead
Now established as DATA SCI 315, the course exemplifies how to scale deep learning education beyond traditional CS:
1. Prerequisite Pragmatism: Rigor is achievable without traditional heavy math gates through targeted bootcamps and conceptual reframing (e.g., single-variable backpropagation).
2. Flexibility & Feedback: Small pilots allow crucial mid-course corrections based on direct student interaction. This agility remains vital even at larger scales.
3. Evolution is Mandatory: Curriculum must adapt swiftly to technological shifts (e.g., replacing RNNs with Transformers post-ChatGPT).
4. Novel Applications Engage: Integrating cutting-edge, tangible research areas like machine olfaction boosts engagement and illustrates AI's expanding frontiers.
Looking to Fall 2025, Tewari plans to incorporate generative models and experiments with olfactory mixtures, pushing models beyond single molecules. Michigan's experiment demonstrates that deep learning literacy for diverse students isn't just possible—it's essential. By demystifying the foundations and showcasing novel applications, they are equipping a broader generation to understand, critique, and shape the AI systems transforming our world.
Source: Adapted from 'Scaling Deep Learning Education in a College of Arts & Sciences' by Ambuj Tewari (University of Michigan), Communications of the ACM (CACM), August 2025.