The Team:

Project Coordinator: Kevin Lenton, Physics Teacher (Vanier College)

I have been teaching at Vanier College for just over 12 years, and I have previous experience teaching at the university level. I have taught the entire college physics curriculum, including bridging and technology courses. I have been involved in several research projects with Liz Charles and SALTISE: myDALITE, Epistemic Artifacts, Feedback Loops, the S4 project, and the SSHRC College Social Innovation Grant.

Clearly, our whole profession is geared towards helping students to learn. In the physics context, students have trouble relating the motion around them to the physics they learn. All of us have ingrained concepts about how the world works, and often this “common sense” is wrong. One of the goals of physics education is getting students to move beyond their preconceptions to a Newtonian model, in which they are constantly aware of the forces operating in the environment. Getting students to notice and to think differently is difficult in any discipline, even when coupled with active learning pedagogies. Active learning pedagogies demand that teachers act and react differently in their new role as coach rather than sage on the stage. This is particularly true in active learning classrooms: designed spaces which promote student interaction in groups.

These rooms can be challenging for teachers to teach and coach in, and sometimes the room design can work against how teachers want to use the room. I have become increasingly interested in how teachers interact with active learning classrooms.

Although I (Kevin Lenton) am the primary driver of the project, particularly in acquiring the hardware, it is intended to be implemented in classrooms with other CEGEP teachers, notably those with whom I interact the most on other projects, including Liz Charles, Michael Dugdale, Nathaniel Lasry, Chris Whittaker, Yann Brouillette, and Rhys Adams. In addition, I have made contact with Ryan Cooke of IEEE Concordia to help with the electronics.

 

The Project:

[Figure: Classrooms A and B]

A problem with Active Learning (AL) pedagogies and classrooms is understanding how teacher implementation (orchestration) depends on the resources [defined broadly] in the classroom. These extra resources can be difficult for AL teachers to manage. This includes where and how teachers move, i.e. how the physical design of an AL classroom improves or impedes teacher feedback, and why they move, i.e. what cues teachers pay attention to. These cues can be aural, but also visual: what attracts a teacher’s attention?

A logical extension of active learning pedagogies (e.g., Chickering & Gamson, 1987) is the active learning classroom (e.g., the SCALE-UP and TEAL models). Just as the learning locus has shifted from the teacher as sage on the stage to the teacher as facilitator supporting students’ activity (King, 1993), the architecture of classrooms must also change. Active Learning Classrooms (ALCs) can be defined as technology-rich collaborative learning environments that support students’ learning experiences. These innovative spaces are intended to create a student-centered environment that encourages collaboration and communication among learners. Learning becomes distributed across the physical space because there is no definite “front” to the classroom: the teacher desk is often re-positioned to the center of the room, if it exists at all, and rows of desks are replaced with group tables. Because adapting to students’ needs drives the learning agenda, teachers no longer fully control what will happen in the classroom. Teachers must now manage feedback from multiple streams (visual, aural, oral, technological) and react adaptively. Such work can be characterized as orchestration: the real-time management of activity, along with the management of classroom resources (e.g., Dillenbourg & Jermann, 2010). As a research topic, orchestration has been gaining much interest in the CSCL community (Dillenbourg, 2013). This moment-to-moment management of the constraints of the classroom ecosystem, coupled with the management of the learning, places greater demands on the teacher than traditional classrooms and traditional instruction. Physical space and layout are important orchestrational considerations (Dillenbourg & Jermann, 2010). Where the teacher is located and what the teacher can access make a difference to the possible interactions and feedback to learners, and this forms the focus of this project.

To date, eye trackers have provided a method to document and assess teacher orchestration (Prieto, Sharma & Dillenbourg, 2015). In this method, teachers wear a complicated headpiece which tracks eye movement. This is costly, both in the financial sense and in its disruption of the natural flow of the teachers involved.

This project aims to build on previous work to examine one part of the orchestration puzzle: how the teacher moves in the learning space as a function of time, and what attracts their attention.

We have already been looking at a method of tracking teacher position in the classroom using the physics Tracker application and video of the classroom, a very simple method: https://goo.gl/vAxXJu. This has given very interesting data but has been difficult to automate, which limits its scalability. This project aims to take the basic idea and automate it with an easy-to-wear, scalable technology.
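To give a sense of what this kind of position-versus-time data supports, here is a minimal sketch (in Python) of two simple summaries: total distance walked and a map of where the teacher spends time. It assumes the Tracker export is a comma-separated table with columns t (s), x (m), y (m); the file name and bin size are placeholders, not part of the actual workflow.

    # Sketch: summarizing teacher-position data exported from the Tracker video-analysis tool.
    # Assumes a comma-separated export with a header row and columns t, x, y (hypothetical file name).
    import numpy as np

    data = np.loadtxt("teacher_track.csv", delimiter=",", skiprows=1)
    t, x, y = data[:, 0], data[:, 1], data[:, 2]

    # Total distance walked during the class.
    steps = np.hypot(np.diff(x), np.diff(y))
    print(f"Total distance walked: {steps.sum():.1f} m")

    # Occupancy map: seconds spent in each 0.5 m x 0.5 m cell of the classroom floor.
    dt = np.diff(t, append=t[-1])
    xbins = np.arange(x.min(), x.max() + 0.5, 0.5)
    ybins = np.arange(y.min(), y.max() + 0.5, 0.5)
    occupancy, _, _ = np.histogram2d(x, y, bins=[xbins, ybins], weights=dt)
    print("Longest dwell time in a single cell:", occupancy.max(), "s")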

This automated teacher tracker will simply track position in the classroom as a function of time using a tag in the teacher’s pocket. The tag reports to a base unit in the classroom. Secondly, another gadget [worn behind the ear] will track head orientation in 3D. Together with the position information, the gaze direction of the teacher can be computed. The assumptions are that teachers look at what they are noticing, and that head orientation is a proxy for gaze direction.
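As an illustration of how position and head orientation could be combined, here is a minimal sketch assuming the tag reports an (x, y) position in classroom coordinates and the head-worn sensor reports a yaw angle. The table layout, angles, and field-of-view threshold are invented for the example, not taken from the actual hardware.

    # Sketch: estimating what a teacher is facing from tag position and head yaw.
    # Table positions, sensor readings, and the field of view are hypothetical placeholders.
    import math

    TABLES = {"table_1": (1.0, 4.0), "table_2": (4.0, 4.0), "table_3": (4.0, 1.0)}  # (x, y) in m

    def facing_target(pos, yaw_deg, fov_deg=40.0):
        """Return the table (if any) within +/- fov/2 of the head direction.

        pos     -- (x, y) teacher position from the position tag, in metres
        yaw_deg -- head orientation from the ear-worn sensor, degrees from the +x axis
        """
        gaze = (math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg)))
        best, best_angle = None, fov_deg / 2.0
        for name, (tx, ty) in TABLES.items():
            to_table = (tx - pos[0], ty - pos[1])
            angle = math.degrees(math.atan2(to_table[1], to_table[0]) -
                                 math.atan2(gaze[1], gaze[0]))
            angle = (angle + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
            if abs(angle) <= best_angle:
                best, best_angle = name, abs(angle)
        return best

    print(facing_target((2.5, 2.5), yaw_deg=45.0))  # -> "table_2" in this toy layout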

The end goal would be to match this data with video of classrooms and observe how classroom resources, e.g. smart boards, whiteboards, and the activity design itself, can give feedback to teachers. An extension of the project could be to examine where students are looking, potentially as a measure of student engagement.

Sources:

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3-7.

Dillenbourg, P. (2013). Design for classroom orchestration. Computers & Education, 69, 485-492.

Dillenbourg, P., & Jermann, P. (2010). Technology for classroom orchestration. In New science of learning (pp. 525-552). Springer New York.

King, A. (1993). From sage on the stage to guide on the side. College Teaching, 41(1), 30-35.

Prieto, L. P., Sharma, K., & Dillenbourg, P. (2015). Studying teacher orchestration load in technology-enhanced classrooms. In Design for Teaching and Learning in a Networked World (pp. 268-281). Springer, Cham.

 

October 2018 Update:

Theoretical Update

Over the summer, I had the opportunity to interact with Dr. Kenneth Holstein at Carnegie Mellon University, who, it turns out, is interested in this project because he is collecting similar data, but not in active learning classrooms [more like traditional classrooms], and with much younger students (middle-school age).

However, his experience gives a good indication of how the data from this project could be used.

He has been using VR-type headsets to track position and student-teacher interactions. These recordings can be “replayed” with the teachers involved to further pedagogical development goals.

Almost immediately, they noticed that teachers tended to actively monitor their students in concentrated bursts, interleaved with (often lengthy) idle periods in which the teacher monitors the whole class from a fixed position in the room or attends to an unrelated activity. During periods in which teachers were walking around the classroom, they occasionally provided students with apparently unsolicited feedback (i.e. feedback that was not preceded by the student raising her/his hand) based on observations made while watching a student’s work on a computer screen. Teachers appeared to selectively monitor certain students while consistently passing others by. In interviews, some of these teachers noted that they monitor their students strategically during computer lab sessions, relying on prior knowledge of their students’ abilities and behavioral tendencies, and on whether they expect more off-task behaviour (e.g. browsing external websites) from these students. In addition, teachers tended to neglect certain regions of the classroom and overlooked students who actually tend towards greater time off-task.

The striking thing about these observations is that some of the preliminary data from this study [using video tracking only, not automated] also shows this type of teacher pattern, but only in certain classrooms. This leads to the hypothesis that classroom design can impact how teachers use the space, and that teachers need to know how they use the space.

Hardware Update

So the goal is to design an automated teacher tracker, that is to say, a system that tracks the teacher’s physical position in a classroom as a function of time, preferably with an accuracy of ±50 cm, and that can function in different classrooms at different institutions.

What technologies are available?

GPS: relies on timing signals from a constellation of orbiting satellites. The main problem with this technology is that the satellite signals are not reliable indoors, and even with good signals, the position is usually only accurate to within about ±3 m, not nearly accurate enough for this study.

Wifi: This method uses wifi-signal-strength mapping to determine position. Nowadays, in our cities, we sit within a myriad of different wifi signals, with the signal strength a function of distance to the wifi router and of any signal-blocking objects such as walls. This means that within a room, just about every position has a unique wifi signature. However, to link the wifi signature to a position within 50 cm, a time-consuming calibration [“fingerprinting”] has to be done by documenting the wifi signature at multiple points within the room. In addition, the signatures may change from day to day. This is not something that you can walk into a room and set up within minutes.
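To make the fingerprinting idea concrete, here is a toy sketch of the matching step, assuming a tiny, invented survey of signal strengths (RSSI, in dBm) at four points in a room. A real calibration would need many more survey points, which is exactly the time cost described above.

    # Sketch: wifi "fingerprinting" in miniature. The calibration fingerprints and the
    # live scan are toy RSSI values (dBm); a real survey would cover many room positions.
    import math

    FINGERPRINTS = {
        (0.0, 0.0): {"ap_a": -40, "ap_b": -70, "ap_c": -60},
        (0.0, 5.0): {"ap_a": -55, "ap_b": -45, "ap_c": -65},
        (5.0, 5.0): {"ap_a": -70, "ap_b": -50, "ap_c": -42},
        (5.0, 0.0): {"ap_a": -60, "ap_b": -72, "ap_c": -50},
    }

    def locate(scan, k=2):
        """Estimate (x, y) as the average of the k survey points whose
        fingerprints are closest (in RSSI space) to the live scan."""
        def distance(fp):
            return math.sqrt(sum((fp[ap] - scan.get(ap, -100)) ** 2 for ap in fp))
        nearest = sorted(FINGERPRINTS.items(), key=lambda item: distance(item[1]))[:k]
        xs = [point[0] for point, _ in nearest]
        ys = [point[1] for point, _ in nearest]
        return sum(xs) / k, sum(ys) / k

    print(locate({"ap_a": -42, "ap_b": -68, "ap_c": -58}))  # averages the two nearest survey points -> (2.5, 0.0)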

Bluetooth Beacons: Sort of similar to GPS, but using beacons within the room and estimating distance from the received signal strength. This is sensitive to the room layout and furniture and therefore does not have the necessary accuracy.

DecaWave Technology: This is in many ways also like a miniature GPS system. This ultra-wideband (UWB) technology transmits very short pulses using one of the six RF bands available between 3.5 GHz and 6.5 GHz. The base of the system is a set of Anchors positioned in the room (compare to the satellites in GPS); they are the reference. The other part of the system is one or more Tags (compare to the GPS receiver) attached to the object to be tracked. By sending short, high-frequency radio messages between the Anchors and Tags, the system measures the distance from each Anchor to each Tag and calculates the position of the Tag from that information. The difference here is that the ultra-wideband pulses can easily pass through objects over a short range, so the classroom geometry is not a limiting factor, and the system gives a position accuracy within about 10 cm.
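For illustration only, here is a sketch of how a set of anchor-to-tag ranges turns into a position; the Pozyx/DecaWave hardware performs this calculation itself, and the anchor coordinates and measured ranges below are toy values for a tag at roughly (3 m, 2 m).

    # Sketch: turning anchor-to-tag ranges into a 2D position (illustration only; the
    # UWB system does this on-board). Anchor coordinates and ranges are toy values.
    import numpy as np

    ANCHORS = np.array([[0.0, 0.0], [8.0, 0.0], [8.0, 6.0], [0.0, 6.0]])  # room corners (m)
    RANGES = np.array([3.61, 5.39, 6.40, 5.00])                           # measured distances (m)

    def multilaterate(anchors, ranges):
        """Linearised least-squares estimate of the 2D tag position."""
        # Subtracting the first anchor's circle equation from the others gives a linear system.
        A = 2.0 * (anchors[1:] - anchors[0])
        b = (ranges[0] ** 2 - ranges[1:] ** 2
             + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
        position, *_ = np.linalg.lstsq(A, b, rcond=None)
        return position

    print(multilaterate(ANCHORS, RANGES))  # approximately [3.0, 2.0] for these toy ranges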

So Decawave technology it is!

The company identified as having the best product is Pozyx: https://www.pozyx.io/Documentation/faq

The developer’s kit has been ordered but has yet to arrive, so until it does, the project cannot proceed in earnest.