Talks Practice—AI

T-05: AI in the Classroom

June 04, 2025 | 10:30–11:45 AM | Dawson College
Presentation

Using AI Tools to Enhance Students' Learning Experience of Mathematical Proofs

We present a study in which a generative artificial intelligence (genAI) tool, ChatGPT, was used to produce mathematical proofs that were then presented to small groups of students during some of their weekly scheduled laboratories/tutorials in a third-year mathematics course. Students were asked to review and evaluate these proofs, identifying and correcting the mistakes in the arguments. These activities encouraged peer interaction and strengthened students' problem-solving skills.

Presentation

Voltaire: An AI Chatbot to Enhance Evidence-Based Argumentation on Technology and Society

The social impact of technology is a key topic in engineering education, requiring students to engage critically with societal, ethical, and environmental issues. In *Environnement, technologie et société* (TIN503) at ÉTS, a customized AI chatbot, Voltaire, was developed as a skeptical interlocutor. It challenges students to construct evidence-based arguments, provides real-time feedback, and supports iterative learning. This presentation will outline the chatbot’s development, discuss its initial classroom implementation, and highlight key insights on student engagement, argumentation challenges, and feedback quality. The findings offer broader implications for integrating AI-driven critical dialogue in engineering education.
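The abstract does not disclose how Voltaire is implemented. As a rough illustration of the "skeptical interlocutor" pattern it describes, such a chatbot can be driven by a system prompt that instructs the model to demand evidence rather than supply answers. The persona text and function name below are hypothetical sketches, not the actual TIN503 configuration:

```python
def build_voltaire_messages(student_argument: str) -> list[dict]:
    """Assemble a chat-completion message list for a skeptical-interlocutor bot.

    The persona wording here is a hypothetical illustration, not
    Voltaire's actual prompt.
    """
    system_prompt = (
        "You are Voltaire, a skeptical interlocutor in an engineering "
        "course on technology and society. Challenge the student's claims, "
        "ask for evidence supporting each assertion, and point out "
        "unsupported generalizations. Reply with probing questions, "
        "not with answers."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": student_argument},
    ]
```

The resulting message list follows the widely used chat-completion format and could be sent to any compatible model endpoint; the actual feedback loop used in the course may differ.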

Presenter(s)

Eric Francoeur

SALTISE Committee Member, École de technologie supérieure, Montreal

Presentation

Automated Generation of Challenge Questions for Student Code Evaluation Using Abstract Syntax Tree Embeddings and RAG

This paper presents an exploratory study on detecting learning gaps in student-submitted code by generating automated challenge questions. The method compares the abstract syntax trees (ASTs) of student code with those of class-taught examples using embeddings and retrieval-augmented generation. The approach identifies the most structurally deviant sections of student code and generates challenge questions targeting untaught coding techniques, such as function pointers and variadic functions. The evaluation demonstrates the effectiveness of the selection process and the quality of the generated questions. This work highlights the potential of structural analysis and automated challenge-question generation to improve student assessment in coding education.
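The core comparison step can be sketched in simplified form. The study uses learned embeddings and retrieval-augmented generation; the stand-in below instead represents each snippet as a bag of AST node types and flags the student snippet least similar to any taught example. The function names and the similarity measure are illustrative assumptions, not the paper's method:

```python
import ast
from collections import Counter
from math import sqrt

def node_type_vector(code: str) -> Counter:
    """Crude AST 'embedding': a bag of node-type names (stand-in for a
    learned embedding)."""
    tree = ast.parse(code)
    return Counter(type(node).__name__ for node in ast.walk(tree))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def most_deviant(student_snippets: list[str], taught_examples: list[str]) -> str:
    """Return the student snippet least similar to any taught example;
    this is the candidate for a targeted challenge question."""
    taught_vecs = [node_type_vector(t) for t in taught_examples]
    def best_similarity(snippet: str) -> float:
        vec = node_type_vector(snippet)
        return max(cosine(vec, t) for t in taught_vecs)
    return min(student_snippets, key=best_similarity)
```

In the full pipeline described by the abstract, the deviant section would then be passed to a retrieval-augmented generator to produce a challenge question about the untaught construct it uses.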

Additional Information

Organizer
SALTISE