Implementation/Application Articles

New approaches for studying multidimensional use of classroom space: A pilot study of three classroom technologies

Authors: Tripp Harris (Indiana University), Tracey Birdwell (Indiana University)

Abstract

This classroom technology pilot explored the ways in which three different tools yielded insights into instructors’ and students’ use of physical classroom space in a higher education environment. The tools of interest were a ceiling heatmap camera from Panasonic, small ultra-wideband movement trackers from Pozyx, and an audio sensor from TSI. The major goals of this pilot study were to determine the nature of the information that these tools could provide about use of classroom space, separately and in concert, and to document the process of piloting new classroom technologies designed to study physical classroom spaces and the teaching and learning taking place therein. This pilot experience contributes to foundations for using these three tools—and others designed to collect similar data—to study teaching and learning in physical classroom space and for applying the insights they yield to the design of new classroom spaces. The pilot also contributes lessons learned about introducing new tools into classroom spaces and about bringing instructors, researchers, and technology experts together to use and study classroom technology that captures auditory, movement, and positioning data in order to learn more about use of physical learning space.

Keywords: classroom technology, movement, audio, heatmap, active learning

How to Cite:

Harris, T., & Birdwell, T. (2023). “New approaches for studying multidimensional use of classroom space: A pilot study of three classroom technologies”. Issues and Trends in Learning Technologies, 10(2). doi: https://doi.org/10.2458/itlt.5227


Published on 29 Jan 2023
Peer Reviewed
New approaches for studying multidimensional use of classroom space: A pilot study of three classroom technologies

Researchers have explored questions surrounding the relationships between teaching, learning, and use of physical classroom space in detail in recent years (e.g., Park & Choi, 2014; Baepler et al., 2016; Holec & Marynowski, 2020). Yet most research at these intersections has focused on manually collected data (Aguillon et al., 2020; Zhu & Basdogan, 2021). Technologies designed to capture classroom use data automatically may offer new insights into how students and instructors use their learning spaces. The goal of this study is to pilot three tools in live classroom settings as part of our Faculty Fellows program’s ongoing research on use of physical classroom space and faculty instructional support, and to study the extent to which the tools yield insights into instructors’ and students’ use of the classroom.

The tools employed in this pilot study included a ceiling heatmap camera from Panasonic, small positioning trackers from Pozyx, and an audio level sensor from TSI. The heatmap camera—a tool that produces a color-based visualization of where people are physically present in a space based on a camera recording of the space—shows where people most often stood, sat, or moved throughout the classroom, which offers a sense for the areas of the room that students and instructors most frequently used. The positioning trackers—small wearable or holdable devices that track a person or object’s physical location relative to a predetermined reference point such as a desk or podium—show precise movements and locations of the person holding or wearing the tracker, and the audio sensor measures levels of classroom-wide audio. Given the visual nature of their outputs, each of these tools supports a data visualization approach for studying use of space. Moreover, these tools support automated data collection as opposed to manual, time-intensive methods for collecting classroom use data as seen in several studies that rely on audio and video analysis and hands-on qualitative coding of data (e.g., Lim et al., 2012; Gurzynski-Weiss et al., 2015).

This was a pilot study in the sense that it was a preliminary effort to familiarize ourselves with the tools and their capabilities. We were not yet concerned with specific empirical research questions, but rather with broad questions relevant to how these tools work and what they can tell us. Our research questions are the following:

  • What can each of these tools tell us about the use of physical classroom space?
  • What can using these tools together tell us about use of physical classroom space?

Background

This pilot study exists at the intersections of research concerned with instructional improvement, use of physical classroom space, and development of multistakeholder networks that bridge divides between institutional units. In our institutional context, our pilot extends on the ongoing efforts of our Faculty Fellows program (hereafter referred to as ‘the Program’) to support instruction in physical classroom spaces. These efforts have largely focused on supporting instructional improvement in space through faculty development and networking initiatives centered around use of classroom space (Birdwell & Uttamchandani, 2019; Copridge et al., 2021; Birdwell & Harris, 2022). This pilot extends on this earlier instruction-focused research to better understand how technology might tell us more about ways in which students and faculty use classrooms.

Prior research outside of our institutional and programmatic context has also informed our efforts to pilot these tools. Our pilot is best positioned among research interested in use of learning spaces in higher education (e.g., Yang et al., 2013; Ellis & Goodyear, 2016). More specifically, our focus on technologies for studying specific elements of classroom use aligns with research on what has become known as classroom proxemics. Proxemics is defined as “the study of how teachers and students use [classroom] space, and the impact of the spatial design” on behavioral patterns and learning activities taking place within classrooms (Martinez-Maldonado, 2019, p. 21). An et al. (2020) characterize proxemics as explicitly applied to instruction; in their words, proxemics is the study of “how teachers allocate time and attention to interact with students in different regions of the classroom during pedagogical activities” (p. 2). Recent scholarship on proxemics has employed heatmap and positioning technology to study the use of both formal and informal learning spaces in higher education (e.g., Martinez-Maldonado et al., 2018; Martinez-Maldonado et al., 2022).

Our pilot extends on existing proxemics research to investigate the utility of audio sensing technology in concert with tools designed to capture classroom movement and positioning data. Extensive prior research has utilized audio sensors to study classroom activity (e.g., Owens et al., 2017; Gordy et al., 2018). Few studies, though, have linked classroom movement data to audio data (e.g., Howard et al., 2018; Bosch et al., 2018; Ahuja et al., 2019), and no published research has used established commercial heatmap cameras or positioning technology aligned with audio sensors to study spatial use in educational settings. Our pilot offers a foundation for drawing explicit connections between proxemics—as captured via heatmap and positioning data—and classroom audio levels.

The secondary goal of this pilot study was to document and troubleshoot collaborative and logistical challenges inherent to planning and conducting research among personnel from different institutional units. This project was informed by—and aims to contribute to—research on infrastructuring emerging from design-based educational research (Penuel, 2019), as well as intra-institutional knowledge sharing, community building, and research collaboration in higher education (Njiraine, 2019; Pelaez et al., 2018; Leahey, 2016).

Procedure

Our research procedure consisted of selecting and acquiring technologies of interest, choosing a classroom conducive to the installation and use of the technologies, obtaining IRB approval, recruiting and obtaining written consent from potential faculty and student participants, and collecting data.

Acquire tools

Researchers with the Program worked closely with Learning Technologies personnel to identify three tools to capture unique aspects of ways in which students and instructors use physical classroom space. Through internal funding, the Program acquired the Panasonic 360 heatmap camera, Pozyx UWB positioning tags, and TSI Sound Quest Pro audio sensor during the fall 2021 semester (prior to the pilot semester in spring 2022). Once a classroom of interest was identified and an instructor participant teaching in the classroom was recruited, Learning Technologies personnel installed the Panasonic 360 camera in the center of the classroom’s ceiling. The Pozyx tags and TSI audio sensor did not require installation prior to the pilot semester—during pilot class sessions, the audio sensor was placed in the center of the classroom, and a member of the research team gave each instructor a Pozyx tag to carry throughout the class meeting.

Room Selection

We selected our classroom of interest from among several active learning classrooms on the university system’s flagship campus. Active learning classrooms are formal learning spaces enriched with digital and non-digital technologies and designed to support active and engaged pedagogies. Personnel from the university’s Learning Technologies division installed the Panasonic heatmap camera during the semester break prior to the beginning of the pilot in the spring 2022 semester. After the camera was installed, Program and Learning Technologies staff members met with one of the instructors in the classroom to demo the technology and discuss how the instructor might use the positioning tags during pilot class periods.

Faculty Recruitment

During the fall 2021 semester, the Program director reached out to a past Faculty Fellow who was scheduled to teach a course in the selected classroom during the pilot semester. This instructor had previously completed a one-year term as a Faculty Fellow with the Program and was already familiar with the research and faculty development goals of the Program. Program staff successfully recruited an additional faculty member at the beginning of the pilot semester who had not participated in the Faculty Fellows program. The faculty members were scheduled to teach back-to-back class sessions in the classroom during the pilot semester—the first class meeting from 9:45am to 11:00am, and the second from 11:30am to 12:45pm. Program staff met with both instructors separately early in the semester to schedule dates for piloting the tools during their class meetings. After the researchers informed students of the tools and obtained their consent to participate, the tools were piloted over the course of three class meetings with one of the faculty members and two class meetings with the other participating faculty member.

IRB Protocol Approval

This pilot required moderate interaction with human participants. We carefully followed guidelines for ethical human subjects research after our Institutional Review Board approved the protocol for our pilot. The two faculty participants were recruited via email, and their emailed agreement to participate served as their consent. Even though none of the tools produced outputs that yielded identifiable images or recordings of students (or disrupted classroom instruction in any way), we gave potential student participants the opportunity to opt into or out of the pilot experience. To obtain consent from all students in each class, a Program staff member addressed students at the beginning of each class session to describe the pilot study, the tools, and the nature of the data that the tools would collect. Each student was provided with an Informed Consent form that contained further details about the pilot study and the tools. Students were instructed to read and sign the form if they fully understood the details of the pilot study and agreed to participate. Although students were given the option to opt out of participation, all students in both class sections agreed to participate.

Program staff discussed possibilities for ensuring students’ privacy and confidentiality with both instructors separately. Both instructors agreed that, in the event some students withheld consent, those students would be given a space to sit together in the classroom that—while not out of the view of the heatmap camera—would be blacked out manually by the Learning Technologies staff member who had remote access to the heatmap camera and its outputs. This would add a layer of security and confidentiality for students who chose not to participate (in addition to the non-identifiable images produced by the camera). We discuss further details of this challenge—and potential future changes—in the section on lessons learned below.

Data Collection

We conducted the pilot throughout five class meetings in total between the two faculty members’ courses—three separate class meetings for one instructor and two class meetings for the other instructor. The initial plan was for a Program staff member to appear in class once for each instructor’s section to explain the pilot and collect consent signatures. However, since some students in each instructor’s section were absent on days when consent signatures were collected, the Program staff member spoke briefly at the beginning of each of the five class periods to ensure that all students—not just those present on the first day of the pilot—had been given opportunities to review, sign, or decline to sign the Informed Consent form.

Data collection days required coordination between Learning Technologies personnel, participating instructors, and Program staff to ensure all equipment was set up prior to the start of class. On the three pilot days, Program and Learning Technologies staff members met in the classroom 15 minutes prior to the beginning of the first class of the day to set up the audio sensor and to power on the positioning tags. On each data collection day, a Program staff member remained near the classroom throughout both class meetings to troubleshoot technology in case issues emerged. At the conclusion of the second class on each data collection day, Learning Technologies personnel returned to the classroom to collect the positioning tags and to break down and store the audio sensor equipment.

After collecting data during the three pilot days, Program staff members met via Zoom with Learning Technologies personnel to discuss the data outputs from each tool. The Learning Technologies staff member with remote access to the heatmap camera output captured ‘staying’ and ‘passing’ outputs for each of the five pilot class meetings, broken up into 15-minute time segments, and sent these outputs to Program staff in JPEG format. Another Learning Technologies staff member—who controlled remote access to the positioning and audio data—transferred both sets of data to Program staff in PDF format.

Discussion

What the Tools Tell Us and Lessons Learned

Our findings from this pilot experience are twofold. First, from an educational technology perspective, our pilot yielded insights into the nature and types of information these tools can provide about use of physical classroom space—both when used separately and when used together. Second, this experience offered valuable lessons for planning and conducting future research that requires extensive collaboration between teaching faculty and professionals from multiple institutional units. Findings in both veins contribute to foundations for studying classroom use and designing new classroom spaces conducive to active learning.

Panasonic Heatmap Camera

The Panasonic Heatmap Camera was affixed to the center of the active learning classroom’s ceiling. This positioning allowed the camera to capture the entirety of the classroom. The camera is set to capture heatmap data at all times—for this pilot study, Learning Technologies staff allowed the camera to record constantly and subsequently accessed data for class periods of interest at the conclusion of the pilot semester. As an alternative to this approach, personnel with remote access to the camera can manually turn it on and off or schedule it to record ahead of time to gather data at specific start and end times.

The camera produces two separate heatmap outputs simultaneously—one output showing the most trafficked areas within the classroom (passing view), and the other showing the areas in the room where students and instructors were most consistently positioned (staying view). The passing view gives us a sense for the areas of the room through which students and instructors move most frequently. The passing view from this pilot, for example, showed heavy traffic for both class sections through the main aisles that separated the three sections of desks in the classroom. This was especially the case for the time segments that corresponded with the beginning and end of class meetings—times in which students entered and exited the room (shown by dark red coloring in the heatmap output in Figure 1). Throughout both class sections, blue coloring in the heatmap outputs indicated moderate movement in the area across the front of the classroom for most time blocks during the class sessions (see Figure 2). This appears to correspond with instructor movement near the classroom’s whiteboard and podium. The staying view outputs, as expected, demonstrate the locations where students sat most consistently during the class meetings.

Figure 1
Heatmap output showing high levels of foot traffic near the front of the room

Figure 2
Heatmap output showing low to moderate foot traffic throughout the room

As Figures 1 and 2 demonstrate, the data gathered from the heatmap camera aligns with what we hoped the tool would tell us: where people were physically present during the class periods, and which areas of the room saw the most use and foot traffic. Such data has implications for choosing tools and furniture for classrooms. Ultimately, we hope to use the camera to gather insight into relationships between use of physical classroom space and instructors’ use of various pedagogies designed to support active learning—and how pedagogies in the context of the space are associated with positive learning outcomes. Additionally, we hope to use heatmap data from the camera to inform future design conversations about our physical classroom space. The findings from this pilot experience suggest that the heatmap camera will be a foundational tool for pursuing these goals.
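
Because the heatmap outputs arrived as JPEG images broken into 15-minute segments (see Data Collection), regional comparisons can also be made quantitative offline. The short sketch below is a rough illustration only and is not part of Panasonic's software: it assumes a hypothetical file name (passing_0945_1000.jpg), a simple 3x3 grid over the room, and a red-minus-blue "warmth" heuristic that may not match the camera's actual color encoding.

# Rough sketch (assumptions: hypothetical file name, 3x3 grid, and a
# red-minus-blue heuristic for "warmth"; not Panasonic's software or API).
import numpy as np
from PIL import Image

def cell_traffic_scores(jpeg_path, rows=3, cols=3):
    """Return a rows x cols array of mean warm-color intensity per grid cell."""
    img = np.asarray(Image.open(jpeg_path).convert("RGB"), dtype=float)
    h, w, _ = img.shape
    warmth = img[:, :, 0] - img[:, :, 2]  # red channel minus blue channel
    scores = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            cell = warmth[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            scores[r, c] = cell.mean()  # higher values suggest heavier use of that cell
    return scores

# Example: compare regions of the room for one 15-minute 'passing' segment.
print(cell_traffic_scores("passing_0945_1000.jpg"))

Scores of this kind could then be compared across 15-minute segments, or between the passing and staying views, to support the design conversations described above.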

Pozyx UWB Positioning Tags

The Pozyx indoor-positioning tags use ultra-wideband technology to track precise physical locations of people and/or moving objects. The tags tracked instructors’ movements from location to location throughout the class sessions—offering a view of instructor movement that was significantly more precise than the view generated by the whole-classroom passing heatmap outputs. The positioning tags were attached to lanyards that allowed instructors to easily carry or wear them as they moved around the classroom while teaching. The instructor for a class of interest can carry a tag in their hand, in their pocket, or wear one around their neck while they teach.

The positioning tags can generate heatmap outputs as well as “spaghetti” outputs that show extremely precise (+/- 10 centimeters) movements and locations of whoever is carrying or wearing them. To complement the heatmap output generated by the Panasonic camera, we focused on the spaghetti outputs from the positioning tags. The spaghetti outputs show lines representing movement relative to a predetermined “anchor” in the classroom. Figure 3, for example, shows the instructor’s movement relative to the anchor positioned at the front of the classroom. In this case, it appears that the instructor rarely moved away from the vicinity of the podium and whiteboard during this particular class session.

Figure 3
“Spaghetti” output showing the instructor’s movement around the front of the room

Like the heatmap data produced by the Panasonic ceiling camera, the spaghetti outputs shown in Figure 3 align with our goals of studying instructor movement and use of physical classroom space while teaching. In a broad sense, the positioning tags yielded similar general insights about instructor movement as the heatmap outputs from the ceiling camera—outputs from both tools show where the instructors moved throughout class periods, as well as how much they moved (represented by color in the Panasonic heatmap output and by the concentration of lines in the Pozyx tag output). These insights, though, are not necessarily redundant. The heatmap from the ceiling camera is useful for gathering generalized, whole-class views of where people (i.e., not just instructors or those holding tags) positioned themselves and moved throughout class sessions. The positioning tags, on the other hand, provide a much more precise view of the movements and locations of specific individuals—in this case, instructors carrying the tags. Thus, these tools could be used in a complementary fashion.
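
Beyond the vendor's own dashboards, the spaghetti view can be reproduced offline once tag coordinates are exported. The sketch below is a minimal illustration under assumptions rather than Pozyx's API: it presumes a hypothetical CSV export (tag_session1.csv) with timestamp, x_mm, and y_mm columns measured relative to a front-of-room anchor at the origin.

# Minimal sketch (assumed CSV export with timestamp, x_mm, y_mm columns
# relative to a front-of-room anchor at (0, 0); not Pozyx's API).
import pandas as pd
import matplotlib.pyplot as plt

positions = pd.read_csv("tag_session1.csv", parse_dates=["timestamp"])

fig, ax = plt.subplots(figsize=(6, 6))
# Draw the instructor's path as a connected "spaghetti" line, converted to meters.
ax.plot(positions["x_mm"] / 1000, positions["y_mm"] / 1000, linewidth=0.5)
ax.scatter([0], [0], marker="s", label="anchor (front of room)")
ax.set_xlabel("x (m)")
ax.set_ylabel("y (m)")
ax.set_title("Instructor movement relative to the front-of-room anchor")
ax.legend()
fig.savefig("spaghetti_session1.png", dpi=150)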

TSI Sound Quest Pro

The TSI Sound Quest Pro, unlike the ceiling heatmap camera or positioning tags, captures audio data rather than movement or location data. The audio sensor measures audio levels in the classroom over the course of a class period. The sensor was assembled on a tripod stand, placed in a centralized location within the classroom, and manually powered on prior to the beginning of class on each pilot day (and manually paused between classes). The sensor captures audio levels in decibels and provides the minimum and maximum average noise levels in the room in five-second increments. Outputs from the sensor show data in three different views: a detailed data log showing the sound measurements in five-second timestamped increments, a chart with aggregated measurements from the data log, and a graph showing the percentages of class time during which a range of decibel levels was recorded. Figure 5 shows the percentage view, through which we can see that the audio level most commonly captured throughout this particular class was 58 decibels (for 12% of the class session). Figure 4 shows the aggregated audio measurements derived from the timestamped data log. Information represented in this view includes the equivalent continuous sound level (Leq-1), maximum peak sound pressure (Lpk-1), and maximum and minimum sound levels (Lmax-1, Lmin-1) measured over each ten-second increment (derived from the five-second increments in the timestamped data log).

Figure 4
Graphical output showing specific audio events in the classroom for a given period

Figure 5
Graphical output showing average decibel levels in the classroom for a given period

Figures 4 and 5 offer examples of ways in which the audio sensor provides a rich array of information on classroom audio levels with significant detail. The tool’s capabilities open doors for exploring a number of questions relevant to teaching and learning in various classroom spaces. Possible questions include those surrounding the extent to which whole-class audio levels are associated with students’ perceptions of academic engagement, learning, and interaction. As with the Panasonic heatmap camera and the Pozyx tags, we hope to contextualize our use of the TSI sensor within instructional approaches designed to support active learning. For example, the sensor could allow us to explore relationships between modes of instruction (e.g., lecture, class discussion, small-group activities, etc.), classroom audio, and measures of perceived learning. We also aim to continue employing these three tools while concurrently gathering data on relationships between use of physical classroom space and classroom noise levels. It is important to note that, as with the heatmap camera and positioning tags, the audio sensor does not produce any personally identifiable data. The sensor captures only audio levels, not audio details such as student or instructor speech.
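
Because decibel values are logarithmic, aggregates such as Leq cannot be computed as simple arithmetic means of the five-second readings; the equivalent continuous sound level is an energy average, Leq = 10 log10((1/N) Σ 10^(Li/10)). The sketch below shows how the ten-second Leq, Lmax, and Lmin view described above could be reproduced from an exported data log; the file name and column names (audio_log.csv with timestamp and db) are assumptions, and this is not TSI's software.

# Minimal sketch (assumed export audio_log.csv with 'timestamp' and 'db'
# columns holding the five-second readings; not TSI's software or export format).
import numpy as np
import pandas as pd

log = pd.read_csv("audio_log.csv", parse_dates=["timestamp"])

def leq(levels_db):
    """Equivalent continuous sound level: an energy average, not an arithmetic mean."""
    return 10 * np.log10(np.mean(10 ** (levels_db / 10)))

# Roll the five-second readings up into ten-second windows, mirroring the
# aggregated Leq/Lmax/Lmin view described above.
windows = log.set_index("timestamp")["db"].resample("10s")
summary = pd.DataFrame({
    "Leq": windows.apply(leq),
    "Lmax": windows.max(),
    "Lmin": windows.min(),
})
print(summary.head())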

Lessons Learned

This pilot experience taught us valuable lessons about the potential of the heatmap camera, positioning tags, and audio sensor for helping us better understand how instructors and students use physical classroom space. The pilot produced evidence that these tools—when used individually and simultaneously—can provide valuable insights that will bolster research, teaching and learning, and future efforts to imagine and design new learning spaces. Each tool provides affordances to produce data visualizations for unique aspects of classroom usage and enables automatic rather than manual data collection. This pilot experience also served our study’s secondary goal of learning more about the logistical and collaborative challenges that characterize research designs spanning institutional units. Our experience gave us opportunities to think critically about how to organize for future research using classroom technology of this kind. These logistical considerations include ideas for fine-tuning IRB protocol development, faculty recruitment, scheduling, and coordination between various stakeholders to plan and conduct research efficiently.

Lessons for Technology Use in Research or Evaluation

Our technology pilot helped us imagine new ways of studying classroom space and the teaching and learning that takes place therein. Regarding our first research question, we developed a sense for how each of these three tools can provide useful information when used on its own. The heatmap camera shows which areas of the classroom are most heavily used. The staying view shows us where students tend to sit, where instructors position themselves, and which analog and digital technological features see the most use throughout a given class session. The camera’s passing view gives us a sense for the extent to which students and instructors move during class, as well as the areas of the room in which they move (and perhaps are ‘actively learning’) most often. The positioning tags, while also generating data on instructor movement in the classroom, offer a much more precise, quantifiable understanding of movement patterns.

We also discovered what the audio sensor could tell us when used in isolation. It could be useful for analyzing noise levels in classrooms and, when its outputs are studied alongside indicators of student engagement and/or performance, may open doors for exploring relationships between communication, active learning, and positive learning outcomes. Since the sensor can be easily set up in different classrooms, it could also be possible to study noise levels in classrooms with different designs (e.g., lecture halls vs. traditional classrooms vs. active learning classrooms).

With respect to our second research question, this pilot uncovered ways in which using these tools together—in the same classroom and at the same time—could create foundations for surfacing rich data that tells us more about use of physical classroom space (through automated data collection) and informs the designs of new classrooms. Each of the three tools offers data distinct from that offered by the others, and together they provide a multidimensional view of physical and verbal activity taking place in the classroom. Lessons gleaned from using these tools together may translate to other technologies designed to study unique aspects of students’ and instructors’ use of classroom space. These lessons could be integrated alongside other data collection approaches or methodological frameworks (e.g., Lee et al., 2017) to study classrooms and the activities taking place within them at scale.
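
In practice, one way to study the tools together is to align their timestamped exports on a common clock so that, for example, an instructor's position can be examined alongside the classroom's loudness at the same moment. The sketch below joins the hypothetical positioning and audio exports assumed in the earlier sketches using a nearest-timestamp merge; it is illustrative only and does not reflect any vendor's tooling.

# Minimal sketch: join positioning and audio streams on nearest timestamps.
# File and column names (x_mm, y_mm, db) are the same assumptions as above.
import pandas as pd

positions = pd.read_csv("tag_session1.csv", parse_dates=["timestamp"]).sort_values("timestamp")
audio = pd.read_csv("audio_log.csv", parse_dates=["timestamp"]).sort_values("timestamp")

# Attach the closest audio reading (within five seconds) to each position sample,
# yielding one multidimensional record per moment in the class session.
merged = pd.merge_asof(
    positions, audio, on="timestamp",
    direction="nearest", tolerance=pd.Timedelta("5s"),
)

# Example exploratory question: is the instructor's distance from the
# front-of-room anchor associated with classroom loudness?
merged["dist_from_anchor_m"] = (merged["x_mm"] ** 2 + merged["y_mm"] ** 2) ** 0.5 / 1000
print(merged[["dist_from_anchor_m", "db"]].corr())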

Lessons for Research Design

The logistical lessons and design implications learned throughout this pilot study were just as significant as those learned about the technologies themselves. Ideas for upholding research ethics, specifically related to recruitment of student and instructor participants, were especially salient. We chose to collect consent signatures from students enrolled in both instructors’ class sections, even though none of the tools collected identifiable data. We wanted to give students a chance to opt out of participation, but we also wanted to clearly explain to them that if they did choose to participate, it would not be possible to identify them from any of the tools’ data outputs. Our university’s IRB recommended that we ask students to read informed consent forms explaining the nature of the pilot study and the data collected by the tools, and to sign them if they agreed to participate. To facilitate this process (and to answer questions, further explain the tools, and distribute and collect informed consent forms), a member of the research team spoke to each class section prior to the start of class.

Even though all potential student participants signed forms agreeing to participate, there were several logistical issues worth addressing for future studies with these tools. The original plan was for the research team member to appear in each class section once, during the first of three pilot days. Given unpredictable patterns of attendance among students in both classes, the researcher ultimately spoke at the beginning of each class section on all three days of the pilot study. This ensured that students who missed their chance to read the informed consent form and opt out on the first day of the pilot were still able to do so. This was perhaps an inefficient use of time for both the researcher and instructor (and for students who attended all classes and thus sat through repetitive explanations of the pilot study); these extra steps were necessary, though, to uphold standards for ethical research. Researchers exploring similar technologies during live class sessions may benefit from working with instructors to communicate the nature of the study to students beforehand via email, and possibly to collect consent signatures via email if permitted by the university’s IRB. This would prevent researchers from having to introduce the study to the same class sections on multiple occasions and would give all students the chance to read about the study and provide or decline their consent to participate regardless of their class attendance.

Since the heatmap camera was designed to capture a whole-classroom view, and since the audio sensor does not allow for setting a finite range, determining how to accommodate students who might choose not to participate in this pilot study was a significant challenge. Even though all students ultimately chose to participate, there needed to be a plan in place for those who opted out of the pilot experience. We discussed several alternative plans in case some students decided to opt out of the study. One potential option was to have non-participants sit at the same table in a predetermined area of the classroom, and for the Learning Technologies personnel to manually black out that area after data collection (but before the data was accessed by the research team). Another option was to have non-participants leave the classroom altogether and work at a table in the hallway directly outside the classroom door. This option, while an easy solution for keeping students out of the view of the heatmap camera, was pedagogically undesirable. Future research would benefit from deliberate, advance coordination between instructors, researchers, and technology professionals to determine how to organize ethical and pedagogically sound research designs.

Implications for Future Research

It is worth emphasizing that this pilot study was carried out without instructional context. We did not pair our data collection with specific learning activities to determine how they were implemented, nor did we gather measured or perceived learning outcomes. Since this stage of our research was only focused on how the tools function and the nature of the data they can collect, we cannot make instructionally contextualized claims based on the data collected for this pilot study. Future research with these tools will focus on instruction and will explore relationships between audio, movement, and positioning data and instructional activities and outcomes.

Research involving these tools could also open doors for different stakeholders to work together on researching, evaluating, and designing classrooms and other campus learning spaces—including personnel not traditionally involved in research and design of classroom space. Future research focused on collaboration and instructional change could benefit from drawing on team-based approaches for bringing personnel from various institutional units together (Elrod & Kezar, 2017; Olmstead et al., 2019); such collaborative approaches could prove more effective for instructional improvement than traditional professional development programs (Gallagher, 2016). Personnel from teaching centers, facilities operations, capital planning, academic departments, and other institutional units could use these tools collaboratively to explore a range of questions and ideas relevant to use of learning spaces.

These technologies could also offer opportunities for exploration that is removed from traditional research designs focused on student and instructor perceptions and students’ grades as outcome measures; these tools could open doors for integrative mixed methods approaches for studying classroom spaces. Example research questions of both broad and narrow scope could include:

  • How is a specific classroom used? What tools are getting used? How might the classroom be used more effectively?
  • What kinds of changes to physical classroom features affect or improve the use of the classroom space?
  • How do instructors make use of specific classroom features (e.g., podiums, boards, and desks)?
  • How might instructors’ use of classroom features influence students’ use of classroom space?
  • What kinds of instructional approaches are associated with different patterns of movement and/or classroom use?
  • How does an active learning classroom’s design influence the degree to which students interact with each other?

Pursuing these questions will involve qualitatively and quantitatively exploring the relationships between patterns of classroom space use, classroom noise levels, and academic outcomes such as achievement and student engagement. We will also work closely with instructors on future projects to devise research questions related to specific pedagogical approaches and their relationships with use of physical learning space. We hope to recruit additional faculty participants from different disciplines to expand the scope of our exploration into use of the space.

To prepare for future research with these tools, we will expand coordination among researchers, instructor participants, and technology personnel to include tool demos in the classroom of interest and Zoom meetings devoted to finalizing scheduling logistics, plans for non-participating students, instructional/pedagogical plans, and research questions. Given the need to coordinate such a broad range of logistical and research design considerations, pre-study coordination should take place across multiple meetings—but with all stakeholders present at each meeting in order to communicate and plan efficiently.

Future research may consider developing a protocol for efficient planning, coordination, and research design relevant to studying classroom space with new and existing technologies. Such a protocol could support researchers and instructors in institutional contexts as they pilot similar tools and conduct research into teaching and learning in space, and ultimately support the design and installation of novel learning spaces on their campuses.

Conclusion

This classroom technology pilot uncovered answers to our initial research questions concerning the nature of the information these tools could produce, both individually and in concert. Individually, the heatmap camera provides useful data visualization outputs that tell us which features of the classroom students and instructors tend to use most heavily relative to other areas of the room. Likewise, the positioning trackers yield another unique visual representation of classroom use, giving us a sense for the extent to which instructors move throughout the classroom. Taken together, outputs from the heatmap camera and positioning trackers provide a holistic yet detailed view of spatial occupation and movement throughout the classroom space.

The audio sensor, while not intended to record speech-specific details, offers time-specific data that shows classroom noise levels at any given time during a class meeting of interest. When the audio sensor is used alongside the heatmap camera and positioning trackers, these three tools provide a multidimensional view of how instructors and students use physical classroom space via automated data collection. The findings of this pilot study—relevant to both technological affordances and research design—offer a promising outlook for using such tools to study teaching and learning in space.

Acknowledgements

We deeply appreciate the technological guidance and support that Mark Russell, Chris Golden, and Joshua Foster provided throughout this pilot study. We also appreciate the support of the instructors who agreed to participate in this pilot study, Dr. J Duncan and Dr. Nikolaus Zirogiannis.

References

Aguillon, S. M., Siegmund, G. F., Petipas, R. H., Drake, A. G., Cotner, S., & Ballen, C. J. (2020). Gender differences in student participation in an active-learning classroom. CBE—Life Sciences Education, 19(2).

Ahuja, K., Kim, D., Xhakaj, F., Varga, V., Xie, A., Zhang, S., ... & Agarwal, Y. (2019). EduSense: Practical classroom sensing at scale. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, USA, 3(3), 1-26.

An, P., Bakker, S., Ordanovski, S., Paffen, C. L., Taconis, R., & Eggen, B. (2020). Dandelion diagram: aggregating positioning and orientation data in the visualization of classroom proxemics. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, USA, 1-8.

Baepler, P., Walker, J. D., Brooks, D. C., Saichaie, K., & Petersen, C. I. (2016). A guide to teaching in the active learning classroom: History, research, and practice. Stylus Publishing, LLC.

Birdwell, T., & Harris, T. (2022). Active learning classroom observation tool: Improving classroom teaching and supporting institutional change. Journal of Learning Spaces, 11(1).

Birdwell, T., & Uttamchandani, S. (2019). Learning to teach in space: Design principles for faculty development in active learning classrooms. Journal of Learning Spaces, 8(1).

Bosch, N., Mills, C., Wammes, J. D., & Smilek, D. (2018). Quantifying classroom instructor dynamics with computer vision. Proceedings of the International Conference on Artificial Intelligence in Education, AIED’18 (pp. 30-42). Springer.

Copridge, K. W., Uttamchandani, S., & Birdwell, T. (2021). Faculty reflections of pedagogical transformation in active learning classrooms. Innovative Higher Education, 46(2), 205-221.

Ellis, R. A., & Goodyear, P. (2016). Models of learning space: Integrating research on space, place and learning in higher education. Review of Education, 4(2), 149-191.

Elrod, S., & Kezar, A. (2017). Increasing student success in STEM: Summary of a guide to systemic institutional change. Change: The Magazine of Higher Learning, 49(4), 26-34.

Gallagher, H. A. (2016). Professional development to support instructional improvement: Lessons from research. Menlo Park, CA: SRI International.

Gordy, X. Z., Jones, E. M., & Bailey, J. H. (2018). Technological innovation or educational evolution? A multi-disciplinary qualitative inquiry into active learning classrooms. Journal of the Scholarship of Teaching and Learning, 18(2), 1-23.

Gurzynski-Weiss, L., Long, A. Y., & Solon, M. (2015). Comparing interaction and use of space in traditional and innovative classrooms. Hispania, 98(1), 61-78.

Holec, V., & Marynowski, R. (2020). Does it matter where you teach? Insights from a quasi-experimental study on student engagement in an active learning classroom. Teaching and Learning Inquiry, 8(2), 140-164.

Howard, S. K., Yang, J., Ma, J., Ritz, C., Zhao, J., & Wynne, K. (2018). Using data mining and machine learning approaches to observe technology-enhanced learning. Proceedings of the 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE) (pp. 788-793). IEEE.

Leahey, E. (2016). From sole investigator to team scientist: Trends in the practice and study of research collaboration. Annual Review of Sociology, 42, 81-100.

Lee, D., Arthur, I. T., & Morrone, A. S. (2017). Using video surveillance footage to support validity of self-reported classroom data. International Journal of Research & Method in Education, 40(2), 154-180.

Lim, F. V., O’Halloran, K. L., & Podlasov, A. (2012). Spatial pedagogy: Mapping meanings in the use of classroom space. Cambridge Journal of Education, 42(2), 235-251.

Martinez-Maldonado, R. (2019). "I spent more time with that team": Making spatial pedagogy visible using positioning sensors. Proceedings of the 9th International Conference on Learning Analytics & Knowledge (pp. 21-25). ACM.

Martinez-Maldonado, R., Echeverria, V., Santos, O. C., Santos, A. D. P. D., & Yacef, K. (2018). Physical learning analytics: A multimodal perspective. Proceedings of the 8th International Conference on Learning Analytics & Knowledge (pp. 375-379). ACM.

Martínez-Maldonado, R., Yan, L., Deppeler, J., Phillips, M., & Gašević, D. (2022). Classroom analytics: Telling stories about learning spaces using sensor data. In E. Gil, Y. Mor, Y. Dimitriadis, & C. Köppe (Eds.), Hybrid learning spaces (pp. 185-203). Springer.

Njiraine, D. (2019). Enabling knowledge sharing practices for academic and research in higher education institutions. Journal of Information and Knowledge Management, 9(3).

Olmstead, A., Beach, A., & Henderson, C. (2019). Supporting improvements to undergraduate STEM instruction: an emerging model for understanding instructional change teams. International Journal of STEM Education, 6(1), 1-15.

Owens, M. T., Seidel, S. B., Wong, M., Bejines, T. E., Lietz, S., Perez, J. R., ... & Tanner, K. D. (2017). Classroom sound can be used to classify teaching practices in college science courses. Proceedings of the National Academy of Sciences, USA, 114(12), 3085-3090.

Park, E. L., & Choi, B. K. (2014). Transformation of classroom spaces: Traditional versus active learning classroom in colleges. Higher Education, 68(5), 749-771.

Pelaez, N., Anderson, T. R., Gardner, S. M., Yin, Y., Abraham, J. K., Bartlett, E. L., ... & Stevens, M. T. (2018). A community-building framework for collaborative research coordination across the education and biology research disciplines. CBE—Life Sciences Education, 17(2), es2.

Yang, Z., Becerik-Gerber, B., & Mino, L. (2013). A study on student perceptions of higher education classrooms: Impact of classroom attributes on student satisfaction and performance. Building and Environment, 70, 171-188.

Zhu, M., & Basdogan, M. (2021). Examining social learning in an active learning classroom through pedagogy-space-technology framework. Journal of Learning Spaces, 10(1).