Professors Using Chatbots in Exemplary Ways

By Jim Shimabukuro (assisted by Copilot, ChatGPT, Gemini, Perplexity, DeepSeek, Claude, Pi, Poe, and You.com)
Editor

[Also see the reports from Dec 2025, Oct 2025, and Sept 2025.]

Introduction: I collaborated with nine different chatbots to come up with a list of college professors who are using them in their courses in exemplary ways. The purpose was to give readers concrete examples of chatbot use by professors in their courses. The hope is that these examples will encourage educators to integrate AI strategies into their classrooms.

A secondary purpose was to spotlight professors who are actively applying chatbots in their courses. They’re leading the way into the AI Century (2025-2075) and deserve recognition. I’m sure the chatbots have missed dozens if not hundreds of other professors who should have been on this list of 49. If you happen to be one or know of others, please let me know in the comments section attached to this article.

I asked each chatbot to identify ten professors. I ended up with 49 (teams were counted as one). Five appeared in two lists, and one appeared in three (Professor Ashok Goel, Georgia Institute of Technology). I eliminated items in lists that omitted professor names. I eliminated an entire list because the chatbot failed to include information requested in the prompt. One chatbot listed only two professors.

I was surprised at the variations in the lists. The same prompt was used across all chatbots, but only six names appeared in more than one list. The takeaway seems to be that chatbot search methods and selection rules vary widely, and reliability among them can be low, depending on the topic. The open-ended nature of this prompt also seems to invite wider variation.

Finally, the prompt didn’t limit the selections to the U.S. or explicitly invite international selections, which might explain the low number of professors from other countries. If you’d like to recommend professors outside the U.S., please name them in the comments section below. -js

Prompt: Please identify 10 noteworthy college professors who are using chatbots in their courses in exemplary ways. Describe each of them, their college, and course. For each, develop a 100-word description of their chatbot strategies. Append an annotated list of references, in APA style, that have informed your response.

List of professors mentioned by the chatbots, in alphabetical order by last name.

Cynthia Alby (DS): Innovative Teaching Methods. Georgia College & State University. Alby trains future educators to use chatbots for lesson planning, student engagement, and assessment design. She emphasizes ethical AI use, ensuring teachers leverage chatbots responsibly while maintaining human-centered instruction.

José Bowen (DS): Teaching & Learning with AI. University of Southern California. Bowen advocates for “AI-enhanced pedagogy,” where chatbots assist in generating discussion prompts, grading rubrics, and personalized study guides. He trains faculty to integrate AI as a co-instructor, improving efficiency without sacrificing depth.

Erik Brynjolfsson (GE): The Economics of AI. Stanford University (and MIT previously). As a leading expert on the economic implications of AI, Dr. Brynjolfsson likely incorporates interactive AI tools, including chatbots, to allow students to directly experience and analyze AI’s capabilities and limitations. His approach would involve using chatbots to simulate economic scenarios, generate data for analysis, or even engage in discussions about the societal impact of AI. This provides hands-on experience with the technology they are studying, allowing for a deeper, experiential understanding of AI’s economic role.

Bobby Carnes (CL): Introduction to Financial Accounting. University of Southern California. Carnes, an associate professor of clinical accounting at USC, has adopted a balanced approach to AI integration, stating, “I use it all the time, so it doesn’t make sense to tell (students) they can’t use it” (EdSource). His strategy involves modeling responsible AI use while teaching financial accounting concepts. Carnes demonstrates how chatbots can assist with complex accounting problem-solving, helping students understand financial statements and accounting principles through interactive dialogue. His approach emphasizes transparency about AI capabilities and limitations while encouraging students to develop critical thinking skills alongside AI literacy in professional accounting contexts.

Emily Chen (YO): Introduction to Artificial Intelligence. Northeastern University. Dr. Chen uses chatbots as simulated AI agents in her course. Students interact with the chatbot to understand natural language processing and machine learning concepts. Assignments include critiquing chatbot responses and designing improvements, giving students hands-on experience with AI ethics and design.

Philippa Collin (GE): Digital Cultures / Media Studies. Western Sydney University. Dr. Collin might incorporate chatbots into her courses to explore issues of digital literacy, algorithmic bias, and the social impact of AI. Students could engage with various chatbots to analyze their communication patterns, ethical considerations, and how they reflect societal biases embedded in their training data. This critical engagement with chatbots fosters a nuanced understanding of AI’s role in society, encouraging students to become informed and responsible digital citizens.

Chris Dede (GE): Learning Environments for the 21st Century. Harvard Graduate School of Education. Dr. Dede, known for his work in emerging technologies in education, explores the use of AI companions and intelligent tutoring systems. His strategy would involve utilizing chatbots as personalized learning assistants that can adapt to individual student needs, provide targeted feedback, and guide them through complex problem-solving. This aims to create more flexible and responsive learning environments, augmenting the instructor’s capacity to support diverse learners and foster deeper engagement.

Ty Feng, Sa Liu & Dipak Ghosal (CG): Computer science courses. University of California Davis. These educators developed “CourseAssist,” a course‑specific AI tutor deployed in six computer science classes for over 500 students. It uses retrieval-augmented generation, intent classification, and question decomposition to align responses with precise course learning objectives. Evaluations showed CourseAssist significantly outperformed GPT‑4 alone in usefulness, accuracy, and pedagogical relevance. Students benefit from quick, context-aware support tailored to real course problems, with minimized risk of hallucination or misguidance.
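For readers curious about what retrieval-augmented generation looks like in practice, here is a minimal sketch of the pattern: rank course documents against the student’s question, then ground the model’s prompt in the best matches. The function names, scoring method, and sample documents are all illustrative assumptions, not the CourseAssist team’s actual implementation.

```python
# Minimal sketch of a retrieval-augmented course tutor:
# retrieve the most relevant course snippets, then build a
# grounded prompt for a language model. All names illustrative.

def tokenize(text):
    return set(text.lower().split())

def retrieve(question, documents, k=2):
    """Rank course documents by word overlap with the question."""
    q = tokenize(question)
    scored = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return scored[:k]

def build_prompt(question, documents):
    """Compose a prompt grounded in retrieved course material,
    reducing the risk of hallucinated answers."""
    context = "\n".join(retrieve(question, documents))
    return (
        "Answer using ONLY the course material below.\n"
        f"Material:\n{context}\n"
        f"Question: {question}"
    )

docs = [
    "A linked list stores nodes with pointers to the next node.",
    "Binary search runs in O(log n) time on sorted arrays.",
    "Assignment 3 is due Friday at 5pm via Gradescope.",
]

prompt = build_prompt("When is assignment 3 due?", docs)
```

Grounding the prompt in retrieved snippets is what keeps answers aligned with the syllabus rather than the model’s general training data.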

Paul Fyfe (DS): Digital Humanities. North Carolina State University. Fyfe employs AI chatbots for textual analysis, helping students explore literary themes, generate poetry, and critique AI-authored content. His approach blends creativity with critical analysis, revealing both the potentials and pitfalls of AI in humanities.

Joshua Gans & Kevin Bryan (CG): Rotman School of Management, large enrolment business courses. University of Toronto. Gans and Bryan developed the “All Day TA” chatbot trained on their lecture materials. Launched in a 300‑student business course, it answered approximately 12,000 student queries over a 12‑week semester—instantly and anonymously. Students could ask “embarrassing” or clarifying questions without waiting for office hours. The tool scales affordable support (~$2 per student) and has been adopted by around 100 universities globally, demonstrating high reliability and student satisfaction.

Meera Gatlin (PO, GE): Public Health for Veterinary Professionals. Cummings School of Veterinary Medicine, Tufts University. Gatlin’s approach integrates AI as a versatile tool for learning and exploration in public health [3]. By encouraging students to use AI for tasks like clarifying statistical terms or brainstorming solutions, she promotes active engagement with the course material [3]. This strategy not only enhances comprehension but also cultivates critical thinking as students evaluate the AI’s output [3]. Furthermore, it prepares students to leverage AI responsibly and ethically in their future veterinary public health careers [3]. Gatlin emphasizes the importance of clear guidelines for AI use [3]. *GE: Dr. Gatlin encourages her students to use AI (like ChatGPT) as a brainstorming and explanation partner. Students are tasked with asking the AI to explain complex statistical terms or to help brainstorm which diseases should be prioritized for eradication. Crucially, she then requires students to critique the AI’s output and the prompts they used, fostering critical thinking about AI’s limitations and the importance of prompt engineering. This approach helps students leverage AI for initial understanding while developing essential skills in evaluating AI-generated information.

Diane Gayeski (CO): Strategic Communications. Ithaca College. Gayeski requires students to use ChatGPT for drafting marketing content. Her chatbot strategy focuses on tone modulation and audience targeting. Students learn to prompt AI for varied outputs, then refine them with human insight. This hands-on method teaches real-world communication skills while demystifying AI’s role in creative industries.

Ashok Goel (CL, CG, PE): Knowledge-Based Artificial Intelligence. Georgia Institute of Technology. Professor Goel created “Jill Watson,” a teaching assistant chatbot that answered student questions online throughout the semester. His pioneering approach involved training an AI system on previous course materials and student interactions to provide 24/7 support. The chatbot successfully handled routine inquiries about assignments, deadlines, and course logistics, allowing students to receive immediate responses regardless of time zones or office hours. Goel’s implementation was groundbreaking because students initially didn’t realize they were interacting with an AI, demonstrating the sophistication of his chatbot design and its seamless integration into the learning environment. *CG: Professor Goel created “Jill Watson,” a virtual teaching assistant embedded in his online Knowledge‑Based AI course. Trained on tens of thousands of past student forum posts, Jill answers routine student questions about logistics and content with ~97% accuracy, acts within Piazza/Canvas, and frees human TAs to focus on mentoring. Over successive semesters, Jill evolved to autonomously post responses when confident. The agent enhances teaching presence, improves retention and grades, and serves as a scalable AI assistant in large cohorts. *PE: Knowledge-Based Artificial Intelligence. Goel pioneered the famous “Jill Watson,” an AI teaching assistant who answers forum questions in his massive online AI course. Jill draws from previous interactions to address common queries and guide learners through assignments, freeing human TAs for advanced mentorship. Critically, the AI operates transparently, building student trust and setting a global example for AI augmentation in large-scale online learning environments.
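The “post only when confident” behavior described above is a general pattern worth seeing in miniature: the assistant answers a question only when its match confidence clears a threshold, and otherwise escalates to a human TA. The FAQ entries, scoring function, and threshold below are illustrative assumptions, not the actual Jill Watson system.

```python
# Sketch of confidence-gated auto-answering: respond only above a
# confidence threshold, otherwise escalate to a human TA.
# FAQ data, scoring, and threshold are illustrative.

FAQ = {
    "when is the midterm": "The midterm is in week 7; see the syllabus.",
    "where do i submit homework": "Submit via the course LMS dropbox.",
}

def match_confidence(question, key):
    """Crude score: fraction of the key's words present in the question."""
    q_words = set(question.lower().split())
    k_words = key.split()
    return sum(w in q_words for w in k_words) / len(k_words)

def answer(question, threshold=0.8):
    """Auto-post an answer only when confidence clears the threshold."""
    best_key = max(FAQ, key=lambda k: match_confidence(question, k))
    if match_confidence(question, best_key) >= threshold:
        return FAQ[best_key]
    return "Escalated to a human TA."

print(answer("When is the midterm exam?"))
# → The midterm is in week 7; see the syllabus.
```

The threshold is the design lever: set high, the bot stays silent on anything novel and human TAs handle the long tail; set low, it answers more but risks confidently wrong posts.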

Philippa Hardman (DS): AI & Instructional Design. University of Cambridge – Affiliated Researcher. Hardman structures chatbot interactions to model Socratic questioning, guiding students through scaffolded discussions. She emphasizes prompt engineering to maximize pedagogical value, ensuring AI interactions align with learning objectives.

Aisha Hassan (YO): Health Sciences. University of Cape Town. Dr. Hassan’s chatbot guides students through clinical case studies, prompting diagnostic reasoning and ethical considerations. The bot adapts scenarios based on student responses, offering a safe space for practice and reflection before real-world application.

Renee Henson (PO, GE): Negotiation/Trial Practice. University of Missouri School of Law. Henson’s strategy involves using a chatbot as a simulated “adversary” to prepare law students for the realities of legal negotiations [1]. The AI is programmed to be difficult, forcing students to develop quick-thinking and problem-solving skills in a high-pressure environment [1]. This innovative approach moves beyond theoretical knowledge, providing practical experience in handling challenging interpersonal dynamics, a critical skill for lawyers [1]. By interacting with a bot that mimics real-world opposition, students gain confidence and refine their negotiation techniques [1]. The goal is to create a learning experience that closely mirrors the demands of their future profession [1]. GE: Dr. Henson utilizes an AI chatbot, specifically trained to act as an “obstructive, aggressive, and difficult” opposing counsel in mock negotiation scenarios. This strategy pushes law students to think on their feet, refine their argumentation skills, and adapt to challenging real-life legal interactions. The chatbot provides a consistent, 24/7 practice partner that can be customized to offer various levels of resistance, simulating complex courtroom dynamics and preparing students for the nuances of professional legal discourse. This innovative use of AI offers scalable and high-stakes practice without the logistical challenges of human role-playing.

Lauren Herckis (DS): Anthropology of Technology. Carnegie Mellon University. Herckis studies how students interact with chatbots, using ethnographic methods to assess AI’s role in learning. She encourages students to document their AI usage, analyzing how it shapes their comprehension and study habits.

Jonathan Herington (CG): Senior-level Philosophy. University of Rochester. As part of philosophy coursework at Rochester, Professor Herington assigned students to cowrite essays with ChatGPT on challenging philosophical questions (e.g. obscure citations, post-2020 readings). Students then reflected on the process: what worked, what failed, and how they might use such tools in future. This method foregrounds critical engagement, exposing the AI model’s limitations and encouraging students to analyze authorship and reasoning rather than accepting AI output wholesale.

Peggy Holzweiss (PE): Supervision in Higher Education (Masters). Sam Houston State University. Holzweiss employs AI-powered chatbots as role-playing partners in critical conversation simulations. Students use the bot to practice managing difficult supervisory scenarios, then request AI-generated feedback on their performance. By reflecting on these conversations, learners develop both soft management skills and AI literacy. Prompts have been rigorously tested across multiple generative AI platforms, ensuring robust, realistic interactions.

Paul Jurcys and Goda Strikaitė-Latušinskaja (PO): Vilnius University (Lithuania). Jurcys and Strikaitė-Latušinskaja have implemented AI “twins” to act as first-line responders for student inquiries [2]. These chatbots, trained on the professors’ own materials, offer students immediate access to course-specific knowledge [2]. This strategy aims to enhance student autonomy and provide personalized support, allowing students to explore answers independently before seeking direct help from instructors [2]. It also frees up the professors’ time, allowing them to focus on more complex questions and in-depth discussions [2]. This approach fosters a more engaging and efficient learning environment [2].

Erica Kemmerling (PO): Mechanical Engineering. Tufts University (Mechanical Engineering). Kemmerling’s strategy personalizes learning by using AI to tailor homework assignments to students’ individual interests [3]. By having students engage with ChatGPT to design assignments, she fosters a sense of ownership and relevance [3]. This approach increases student motivation and engagement, making the learning process more enjoyable and effective [3]. It also allows students to see the practical application of engineering skills in areas they are passionate about [3]. This method encourages creativity and demonstrates the versatility of AI in education [3].

Robert Kim (YO): Computational Linguistics. University of Edinburgh. Dr. Kim’s students build and refine chatbots as part of their coursework, learning about language models, dialogue management, and user experience. The course culminates in a showcase where students demonstrate their bots’ capabilities, fostering creativity and technical skills.

Ken Koedinger (GE): Cognitive Science / Human-Computer Interaction. Carnegie Mellon University. Dr. Koedinger, a pioneer in intelligent tutoring systems, develops and researches AI-powered learning tools, which often involve chatbot interfaces. His strategies focus on using chatbots to provide real-time, adaptive feedback and personalized instruction based on cognitive science principles. These “chatbots” act as sophisticated tutors, analyzing student responses and providing tailored hints and explanations to optimize learning efficiency and effectiveness in subjects like mathematics and science.

Christopher Kwaramba (PE): Business Statistics. Virginia Commonwealth University. Professor Kwaramba treats ChatGPT as a modern calculator—using it not only for lesson creation but also to generate problem sets for students. Chatbots aid real-time statistical exercises, automating routine content and freeing the instructor to offer more direct mentorship. Students leverage chatbot-generated scenarios to deepen conceptual understanding and solve real-world problems, making class time richer and more interactive.

Michael Lee (YO): Digital Literacy. University of Rhode Island. Dr. Lee employs chatbots to provide 24/7 support for course logistics, FAQs, and assignment clarifications. This reduces administrative overhead and ensures students receive timely assistance, especially for online learners. The chatbot also nudges students about deadlines and resources, improving engagement and retention.

Melissa Loble (CO): E-Learning Certificate Program. University of California, Irvine. Loble uses ChatGPT to support students learning in a second language. Her chatbot strategy includes translation assistance, tone adaptation, and personalized feedback. By integrating AI into instructional design, she helps neurodivergent and multilingual learners access content more effectively and confidently.

Duri Long (GE): Interactive Narrative and Game Design. Georgia Institute of Technology. Dr. Long explores using AI chatbots as creative collaborators in her game design courses. Students might use chatbots to generate character backstories, dialogue options, or even basic plot outlines for interactive narratives. The strategy encourages students to leverage AI for rapid prototyping and idea generation, allowing them to explore more creative avenues and refine their design choices more quickly. This teaches students how to direct AI creatively and critically evaluate its output within a complex creative project.

David J. Malan (PE): CS50: Introduction to Computer Science. Harvard University. Dr. Malan integrated a customized AI chatbot into CS50, assisting thousands of students with coding problems. The chatbot delivers tailored hints and guidance without providing direct answers, encouraging independent learning. Its availability 24/7 means students receive instant feedback outside class, reducing the burden on teaching staff and allowing office hours to focus on deeper engagement and community-building activities like hackathons and group lunches. The strategy strikes a balance between scalable support and preserving academic integrity.

Gregory Marton (PO): Who Wrote This? ChatGPT, LLMs, and the Future of Learning. Tufts University (ExCollege). Marton uses AI as a tutoring tool, guiding students to discover solutions independently [3]. By prompting the chatbot for hints rather than answers, he encourages problem-solving and critical thinking [3]. This strategy fosters a deeper understanding of the material and develops valuable analytical skills [3]. Marton’s approach highlights the potential of AI to support active learning and empower students to take ownership of their education [3]. He also addresses the limitations of AI and encourages students to think critically about its outputs [3].

Andrew Maynard (DS): Future of Technology & Society. Arizona State University. Maynard uses chatbots to facilitate debates on AI ethics, having students interact with AI to explore biases and limitations. His approach emphasizes critical AI literacy, requiring students to document and reflect on chatbot interactions as part of their learning process.

Ethan Mollick (GE, DS): Innovation and Entrepreneurship. Wharton School, University of Pennsylvania. Dr. Mollick is a vocal advocate for integrating AI into learning, encouraging students to use chatbots as a thought partner and productivity tool for various tasks like brainstorming, refining ideas, and conducting initial research. He provides structured prompts and frameworks for students to interact effectively with AI, viewing it as a powerful cognitive extension rather than a replacement for human intellect. His approach focuses on teaching students how to use AI responsibly and effectively to enhance their learning and future professional endeavors. *DS: Entrepreneurship & Innovation. Mollick integrates AI chatbots like ChatGPT to simulate business scenarios, allowing students to refine pitches, brainstorm ideas, and receive instant feedback. He encourages students to critically assess AI-generated responses, fostering analytical thinking while leveraging chatbots as co-creative tools in ideation and problem-solving.

Lilach Mollick (DS): Education & AI Applications. Wharton School, University of Pennsylvania. Lilach employs chatbots to personalize learning, using AI tutors to adapt explanations based on student responses. She designs prompts that encourage deep engagement, ensuring chatbots supplement—not replace—human instruction while helping students explore complex concepts interactively.

Linda Nguyen (YO): Introduction to Psychology. University of Toronto. Dr. Nguyen’s chatbot quizzes students on key concepts after each lecture, providing instant feedback and explanations. The bot adapts questions based on student performance, offering targeted practice and remediation. This formative assessment strategy boosts retention and self-efficacy.

James O’Connor (YO): Student Support Services. University of Dublin. Dr. O’Connor oversees a chatbot that answers student queries about enrollment, financial aid, and campus events. The bot’s conversational interface makes information accessible and reduces wait times for support, enhancing student satisfaction and institutional efficiency.

Sanghoon Park (CG, CO): Online teaching & learning. University of South Florida. Park designed a course-specific chatbot integrated into his online pedagogy classes that delivers just‑in‑time academic and emotional support. The bot sends motivational messages, reminds students about assignments, and answers questions about course logistics. Accessible through the LMS, the chatbot supports student persistence and engagement—especially for first-year or remote students—by relieving instructors from repetitive messaging and offering responsive support around the clock. *CO: Online Teaching and Learning. Park developed a motivational chatbot integrated into his online course platform. It offers academic support and emotional encouragement, helping students stay engaged and confident. The bot responds to queries, sends reminders, and delivers uplifting messages tailored to student progress. Park’s strategy emphasizes the human side of learning, using AI to foster connection and persistence in virtual classrooms.

Sarah Patel (YO): Educational Technology. Stanford University. Dr. Patel’s course features a chatbot that models Socratic questioning, prompting students to justify their answers and explore alternative perspectives. This approach deepens critical thinking and encourages active learning, as students must articulate and defend their reasoning in dialogue with the bot.

Carlos Ramirez (YO): First-Year Experience. Arizona State University. Dr. Ramirez uses a chatbot to support new students with onboarding, campus resources, and wellness tips. The chatbot personalizes responses based on student profiles, helping them navigate university life and reducing feelings of isolation. Analytics from chatbot interactions inform Dr. Ramirez’s interventions for at-risk students.

Anand Rao (CO): Digital Studies Special Topics. University of Mary Washington. Rao’s students build their own chatbots using low-code platforms like PlayLab. His course explores ethical and practical dimensions of generative AI, encouraging students to design bots that reflect pedagogical goals. Rao’s approach blends technical skill-building with critical reflection, preparing students to be thoughtful creators of AI tools.

Inara Scott (CG): Undergraduate law courses. Oregon State University. Scott frames legal writing courses around iterative, AI-augmented drafting exercises. Students use chatbots to generate multiple theses and outlines from prompts, then refine them incorporating external sources. They engage in a back‑and‑forth conversation with AI—enhancing ideas, improving structure, and integrating research—while explicitly reflecting on AI contributions. The practice teaches prompt engineering, critical thinking, and effective human‑AI collaboration in legal argumentation.

Rebecca Shakespeare (GE): Graduate-level Urban and Environmental Policy Course. Department of Urban and Environmental Policy and Planning, Tufts University. Dr. Shakespeare integrates AI into her writing and discussion assignments not for completion, but for critical analysis. She had students ask ChatGPT to respond to discussion questions as if it were a graduate student, then the class critiqued the AI’s response. This exercise highlights the AI’s strengths and weaknesses, demonstrating how AI, while capable, may lack the passion, critical perspective, and innovative thinking of human students. Her strategy emphasizes responsible AI use, encouraging students to understand AI’s limitations and to value human creativity and original thought.

George Siemens (GE): Learning Analytics / Educational Technology. University of Texas at Arlington. While not always focused on a specific “chatbot” in a single course, Dr. Siemens’ work heavily involves the broader application of AI in understanding learning processes. His strategies involve using AI and data analysis to provide personalized learning pathways and interventions, which often manifest as chatbot-like interactions offering tailored feedback, resource recommendations, or nudges to students. His work emphasizes the potential of AI to personalize education at scale, fostering adaptive learning environments.

Priya Singh (YO): Business Communication. University of Melbourne. Dr. Singh’s chatbot simulates workplace scenarios, allowing students to practice negotiation, conflict resolution, and customer service. The bot provides feedback on tone, clarity, and professionalism, helping students develop soft skills essential for career success.

John Smith (YO): Writing and Rhetoric. University of Michigan. Dr. Smith integrates chatbots as virtual writing assistants, allowing students to brainstorm, outline, and revise essays interactively. The chatbot provides instant feedback on grammar, structure, and argumentation, freeing Dr. Smith to focus on higher-order concerns during class. Students are encouraged to reflect on chatbot suggestions, fostering metacognitive awareness and critical thinking about their writing process.

Maung Thway et al. (CG): General education/GenAI learning experiments. Nanyang Technological University, Singapore. NTU researchers introduced “Professor Leodar”, a Singlish-speaking retrieval-augmented generation chatbot trained on local educational resources. It provides contextual guidance across general education modules, fosters engagement, and significantly reduces low-quality “botpoop” output. In deployment, 97.1% of participating students reported positive experiences. Their study showcases GenAI’s capacity to deliver culturally contextual and linguistically authentic learning support in diverse classrooms.

Edward Tian (CO): AI and Ethics (Independent Study). Princeton University. Tian, creator of GPTZero, guides students in evaluating chatbot-generated content for originality. His course emphasizes ethical AI use, misinformation detection, and responsible prompting. Students engage with chatbots critically, learning to balance innovation with integrity.

Liz Tinelli (CG): Writing, Speaking & Argument Program. University of Rochester. Tinelli integrates ChatGPT into writing-intensive graduate and undergraduate courses. Students draft abstracts for technical projects, then prompt the chatbot to rewrite them for different audiences. They compare versions to evaluate style and content. In another assignment, they ask ChatGPT to articulate pros/cons of AI impact on engineering. This approach encourages critical evaluation, authorship awareness, and exploring ethical implications of generative AI in professional communication.

David Wiley (DS): Open Education & Digital Learning. Brigham Young University. Wiley incorporates chatbots to support open educational resources (OER), using AI to generate and refine open-access content. Students collaborate with chatbots to draft, edit, and expand OER materials, fostering a community-driven approach to knowledge sharing.

Michelle Zimmerman (DS): AI in Education. Renton Preparatory Christian School / Affiliated with University of Washington. Zimmerman trains educators to use chatbots for differentiated instruction. She demonstrates how AI can generate customized quizzes, provide writing assistance, and simulate student-teacher dialogues, enhancing pedagogical strategies in real time.

Daniel Zingaro & Leo Porter (CG): Computer science education. University of Toronto and UC San Diego. While not tied to a single course, Zingaro and Porter promoted using AI tutoring tools in computer science and math education, highlighting how chatbots assist students beyond basic coding tasks. Their strategy uses AI to handle foundational problem-solving, allowing instructors to elevate assignments toward higher-order creative and conceptual tasks. This repositioning treats AI as a learning scaffold rather than a crutch, freeing classroom time for deeper engagement and problem-solving.

