Research Articles

Student Choice, Instructor Flexibility: Moving Beyond the Blended Instructional Model

Authors: Jackie B. Miller (The Ohio State University), Mark D. Risser (The Ohio State University), Robert P. Griffiths (The Ohio State University)

Abstract

Due to the rapid increase in online course enrollments, online and blended education have received much research attention. However, a paucity of research exists on the Hybrid-Flexible (HyFlex) instructional model. This model allows students flexibility in how they participate in lecture and is geared toward providing students with educational choices and incorporating instructional technologies that mirror the personal technologies students use every day. This article outlines the development and testing of a modified HyFlex instructional model specifically designed for large, on-campus courses, in which students had three attendance mode choices (live online, face-to-face, or viewing a recorded session). To support curricular goals, the instructor implemented technology affording live lecture streaming, polling, and backchannel communication at negligible cost to students and little cost to the department. Highlighted results indicate the modified HyFlex instructional model had no negative impact on student performance in the class, both in overall learning and on individual grades. Furthermore, students greatly enjoyed the educational choices and overwhelmingly reported that the incorporation of technology increased their participation in class and comprehension of course content. The authors discuss the findings, address study limitations, and offer suggestions for future HyFlex research.

Keywords: Educational environments, Pedagogy, Teaching methods, Technology, HyFlex, Audience Response Systems

How to Cite:

Miller, J. B., Risser, M. D., & Griffiths, R. P. (2013). "Student Choice, Instructor Flexibility: Moving Beyond the Blended Instructional Model." Issues and Trends in Learning Technologies, 1(1). doi: https://doi.org/10.2458/azu_itet_v1i1_16464

Published on
21 May 2013
Peer Reviewed

Student Choice, Instructor Flexibility:
Moving Beyond the Blended Instructional Model

Jackie B. Miller
The Ohio State University

Mark D. Risser
The Ohio State University

Robert P. Griffiths
The Ohio State University

As postsecondary enrollments in the United States continue to increase—2000 to 2010 saw a 41% increase in undergraduate enrollments (Knapp, Kelly-Reid, & Ginder, 2012)—there is pressure on traditional brick-and-mortar universities to provide more diverse delivery options for their courses. Given the popularity of strictly online education, one choice for these institutions is to use hybrid learning methods, in which courses consist of synchronous online and face-to-face components. One promising hybrid learning model is the HyFlex (hybrid, flexible) model, originally designed by Dr. Brian Beatty (2010) for his graduate courses at San Francisco State University.

The HyFlex Model

The primary feature of the HyFlex model is to combine synchronous online and face-to-face components ("hybrid") in a single course and allow students to choose when and how they attend ("flexible"). Beatty (2010) defines HyFlex courses to be those that "enable a flexible participation policy for students whereby students may choose to attend face-to-face synchronous class sessions or complete course learning activities online without physically attending class" (Introduction, para. 1).

In a HyFlex course, the choice of participation mode can be made independently by each student. The instructor provides lecture content, structure, and activities to meet the goals of the syllabus. This must be done in such a way as to give "equivalency," so that students can experience the course content and complete the course requirements in comparable ways whether attending online, face-to-face, or in some combination of the two (Beatty, 2010). A true HyFlex course should also maximize accessibility by ensuring that students are equipped with the technological skills needed to access the participation choices (Beatty, 2010).

Though it has existed for nearly a decade (Beatty, 2006), relatively little research exists on the effectiveness of the HyFlex model for student learning in higher education. Thus far, much of the information about HyFlex stems from Beatty’s work, which has been featured at a variety of conferences, including the 2010 Educause Learning Initiative, and was identified as an Effective Practice in 2008 by the Sloan Consortium (2012). The formal applications of HyFlex are mostly concentrated in the California State University system, where the model is useful because many students work full time or commute long distances to attend class (Beatty, 2010). There is little research into the application of HyFlex to large courses, where it can arguably provide significant cost savings and other benefits. The question of how to provide an engaging, interactive experience in a course with several hundred students remains unanswered—particularly when a portion of those enrolled are not sitting in the physical lecture space.

Comparisons between HyFlex and related models

A variety of other pedagogical models have experimented with ideas similar to those used in the HyFlex model. Live lecture streaming (LLS), which involves the "broadcasting of a lecture over the internet [sic] at the same time as it is being delivered in the traditional lecture theater" (Fernando, Cole, Tan, & Freitas, 2011, p. 2), is a course model similarly motivated by cost, demand, and cohort-size issues. Lecture recordings are made available for students who are unable to attend lecture during its scheduled time, as well as to enable remediation (Fernando et al., 2011). The more generically titled "video streaming" model implemented by Buhagiar and Potter (2010) delivers a fully online, synchronous video lecture to enrolled students, but also gives students the option of attending the live (traditional) version of the class that is being broadcast.

These models have many of the same motivations and components as the HyFlex model, but many studies in this area focus on distance learning, not hybrid methods for local students (Abdous & Yoshimura, 2010). HyFlex also enrolls all students in a single course section, instead of separating sections by delivery mechanism, as both LLS and the video streaming model do. HyFlex allows flexibility in attendance method on a daily basis, accommodating a broad variety of learning preferences and, more practically, the diversity of college students’ busy schedules. Furthermore, all students in a HyFlex course have equal opportunity to interact with the instructor and other classmates; the extent and nature (i.e., in person or online) of this interaction can also vary based on individual preference and need. Therefore, while containing some overlap, the HyFlex model is sufficiently distinct as to warrant independent treatment.

Backchannel Communication and Audience Response Systems

Educators can choose from numerous technologies when implementing the HyFlex model. These choices will have a major impact on the suitability of the model for a given situation, and on the ultimate success of the implementation. This study supplemented basic video streaming and recording tools with two additional technologies intended to amplify the viability and equivalency of available attendance options: backchannel communication and an audience response system (ARS).

Backchannel communication. "Backchannel" is a mode of communication created by audience members to connect with other observers both inside and outside of the presentation space, with or without the speaker’s awareness (Atkinson, 2009). The backchannel—often a web-based or mobile chat room—facilitates real-time conversation among students (and possibly the instructor) during a traditional lecture (Educause Learning Initiative, 2010), and "can entice students who rarely raise a hand to express themselves via a medium they find as natural as breathing" (Trip, 2011, para. 5). Previous research has shown that backchannel use increases students’ feelings of connectedness in lecture and allows students to provide real-time feedback within a "micro discussion" context, similar to text messaging or micro-blogging such as Twitter (Aargard, Bowen, & Olesova, 2010). The backchannel affords audience engagement by reporting, enhancing, and commenting on course content via a public website, and offers the presenter more direct and immediate feedback (Atkinson, 2009).

Audience response systems. Polling via an audience response system has also received much attention in the educational community. Put simply, "[audience] response systems are instructional technologies that allow instructors to rapidly collect and analyze student responses to questions posed during class" (Bruff, 2009, p. 1). ARS, also known as "clickers," have more recently come to include web-based polling using text messaging and mobile devices. Research about polling has shown that it yields more active student involvement, increased attention levels and interaction, and increased comfort in answering questions (Filer, 2010). Furthermore, while use of an ARS did not increase post-lecture quiz scores, students using an ARS reported significantly higher course satisfaction scores (Filer, 2010). An ARS gives instructors the ability to "assess students’ existing knowledge and competencies" as well as "provide immediate and appropriate feedback during instruction to help clarify student understanding of class content" (Kyei-Blankson, 2009, Conclusion, para. 4). Finally, an ARS can be used to acknowledge each individual’s response and draw attention to the diversity of responses, all while keeping responses anonymous and leaving instructors with quantitative information about student comprehension (Campt & Freeman, 2009).
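
At its core, an ARS aggregates anonymous answers into a distribution the instructor can act on immediately. The following minimal Python sketch illustrates that tallying step only; it is not Poll Everywhere’s actual API, and the answer options and responses are hypothetical.

```python
from collections import Counter

# Hypothetical poll: four answer options, as in a multiple-choice question.
poll_options = ["A", "B", "C", "D"]

# In a live system these would arrive via text message or a web form;
# they are hard-coded here purely for illustration.
incoming_responses = ["B", "A", "B", "C", "B", "D", "B", "A"]

# Tally anonymous responses, ignoring anything outside the valid options.
tally = Counter(r for r in incoming_responses if r in poll_options)

# Display the distribution the instructor would see, with no identities attached.
for option in poll_options:
    count = tally.get(option, 0)
    print(f"{option}: {count} responses ({count / len(incoming_responses):.0%})")
```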

Polling and the backchannel provide both pedagogical and practical benefits in the classroom. For example, both technologies can be used to identify common errors or misconceptions and allow immediate instructor feedback by way of additional explanations or clarifying examples (Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010). The anonymity provided by polling removes the "embarrassment factor" and gives students a safe space to answer questions honestly (Bruff, 2009), which promotes an engaging classroom environment (Ambrose et al., 2010). Moreover, the backchannel provides meaningful technology use by students during class. As Derek Bruff told The New York Times, "professors could reduce [tuning out, checking email, and online shopping] by giving students something class-related to do on their mobile devices" (Trip, 2011, para. 10).

Together with the HyFlex model, the technologies mentioned above can be leveraged to attain the scalability needed for increased course enrollment, while also meeting the demand among current college students for autonomy and choice in the consumption of educational content (Dahlstrom, 2012). While this particular study was conducted in a general education Statistics course, the results and recommendations are geared for broader application in any academic discipline.

Method

Setting

A total of 161 undergraduate students enrolled in the pilot section of Statistics 145 (Autumn 2011) at a large, public Midwestern university participated in the study (98.8% of the course enrollment). Statistics 145 is an algebra-based introduction to data analysis, experimental design, sampling, and linear regression. The course satisfies the "data analysis" portion of the general education requirement for students in several programs across the university. For comparative purposes, the control group for this study consisted of the 168 undergraduate students enrolled in two additional sections of Statistics 145: one daytime section with 114 students and one evening section with 54 students (the pilot section was held during the day). At the time of this study, Statistics 145 was a 10-week course with three 48-minute lectures and two 48-minute recitations per week. As will be outlined, the pilot study focused almost entirely on the lecture component of the course.

Following Beatty’s HyFlex model, the primary goal for this study was to provide students with attendance options. Specifically, the live attendance options made available to students in this HyFlex model were face-to-face (traditional) attendance and synchronous online attendance. The course was structured so students could choose how they wished to attend lecture on a daily basis; no requirements were imposed on how often students attended online or traditionally. For example, students could attend completely face-to-face, completely online, or some combination of the two. The only mandated face-to-face portions of the class were recitation periods and examinations.

Choice of technology. This study was initially motivated by a university-specific semester conversion, which created higher demand for large lecture spaces. Another primary goal, then, was to serve more students while requiring less large-lecture space; the HyFlex model allowed the registrar to book a large space for two lecture days per week instead of the typical three. An implied third goal, consistent with the concept of HyFlex equivalency, was to standardize students’ experience of the class as much as possible across the attendance methods (Beatty, 2010). Planning for and implementation of technology became a large component of the project, as a variety of tools were needed to enable students attending synchronously online to participate fully in lecture meetings. To this end, a backchannel and an ARS were introduced; the specific technology chosen for this project was Poll Everywhere for both the ARS and the backchannel. The authors based this choice on factors including technological compatibility, cost, and accessibility. The usage and implementation of Poll Everywhere will be described more fully below; however, ongoing expansion of this research includes studying the viability of other technologies, such as Twitter and Top Hat Monocle.

Procedures. The non-pilot sections did not offer attendance choices, an ARS, or a backchannel, but aside from these differences the three sections of Statistics 145 referenced in this paper (one pilot section, two traditional/control sections) were highly similar. A different lecturer taught each section, but all three used the same PowerPoint slides, provided students with the same note outlines, received common training from the course coordinator, and participated in weekly meetings to discuss the pace and progress of the course. All sections were taught with the same course objectives: to introduce students to the process of doing statistics, collecting data, using statistical tools for data presentation and analysis, and learning common statistical procedures and summaries. Homework assignments and exams were common to the three sections, and the learning activities administered in the small-group recitation time were identical. Some variability was introduced by the team of graduate teaching assistants who led the recitation sections.

Lecture attendance was not required in either the pilot or non-pilot sections, but recitation attendance was required in all cases and contributed to a participation component of students’ grades. An additional portion of each student’s participation grade came from a completion-based activity, and this activity was different for the pilot and control sections. As in previous terms, students in the control sections were required to complete and turn in a worksheet based on the day’s recitation activity, which was graded for completion. Students in the pilot section, on the other hand, were required to complete a "lecture review quiz" after each lecture and before the following recitation meeting. This was a three- to five-question quiz, administered on the course management website, and was designed to reiterate large-scale concepts from lecture. This assessment had two goals: to ensure that students experienced the lecture content before they attended recitation, and to track how students were attending lecture. Lecture review quizzes were graded for completion.

HyFlex Model Implementation

Figure 1. Screenshot of Adobe Connect. This screen is what students see when attending synchronously; here, in solving a worked problem, the instructor's writing appears in real time.

During each lecture, both Miller (course coordinator, lecturer, and primary instructor) and graduate research associate Risser were present, with Miller teaching at the front of the classroom and Risser situated with a laptop among the students to assist with setup and backchannel management. The HyFlex model was introduced on the first day of lecture, during which attendance methods and technological implementations were outlined.

Figure 2. Screenshot of the backchannel, using Poll Everywhere. Responses with "(Risser)" refer to commentary provided by the research associate and classroom assistant.

To make synchronous online attendance possible, both lecture slides and an audio feed were streamed live using Adobe Connect web conferencing software. Adobe Connect was available for the project thanks to a university license, and served the needs of the HyFlex model well. Lecture slides were projected in class via a tablet PC, and the instructor wore a lapel microphone to provide an audio feed. Students attending online watched a live view of the instructor’s computer screen (Figure 1), most often in full-screen presentation mode showing PowerPoint slides with real-time writing. As well as being streamed live, lectures were recorded and posted on the course management website to enable future viewing and reviewing.

A backchannel and an ARS were incorporated to allow interaction between the instructor and students attending class synchronously. Using Poll Everywhere, the ARS consisted of web-based polling, and student participation was made possible through use of text messaging or Internet-enabled mobile devices. The backchannel allowed students viewing online to submit a question at any time during the lecture. Submissions would then scroll down the public backchannel site much like a news or Twitter feed (Figure 2), visible to other students attending synchronously. The graduate research associate moderated the backchannel, either responding to questions directly (also through the backchannel) or repeating the question orally in class for the instructor to provide a more thorough response.

Figure 3. Screenshots of three Poll Everywhere multiple-choice questions (a-c). The questions highlight how the poll allowed the instructor to address the majority of incorrect responses: in 3(a), the second choice is actually correct; in 3(b), the third choice is correct; in 3(c), the first choice is correct.

In general, the primary function of Poll Everywhere is to provide a forum for multiple-choice, true/false, and short-answer questions, which are administered and viewed online (see Figure 3). On average, between one and four questions were posed to students each lecture. The specific questions were designed to mirror the pedagogical foundations referenced in the introduction; namely, to reveal common misconceptions held by students in the classroom and to subsequently allow students who answered incorrectly to see that they are not alone in their misunderstanding.

Incorporation of the backchannel and online polling also benefited students who chose to attend face-to-face. Both the backchannel and the online polls allowed contributions via any device (e.g., text message or web browser), and students were able to contribute to these forums regardless of attendance mode. Participation in both the poll questions and backchannel was neither encouraged nor discouraged. Student participation in the polls and backchannel was anonymous—a function of the constraints of Poll Everywhere, but a reality that the authors were content with.

Findings

Data collected from the pilot study included self-reported attendance information, grades, an end-of-term survey, and focus groups. Additionally, students enrolled in all sections of Statistics 145 for the Autumn 2011 term completed a pre-test and post-test to assess progress during the term.

Comparison to Non-Pilot Sections

The first research question involved an overall comparison of learning between students involved in the pilot section of Statistics 145 and students in one of the two other Statistics 145 sections offered during Autumn 2011. To address this question, student learning was measured by the START test, a standardized test which assesses students’ statistical reasoning. The 14-question START test was developed as a shorter alternative to the 40-item Comprehensive Assessment of Outcomes in a First Statistics Course (CAOS) test, developed by the ARTIST team at the University of Minnesota as part of a National Science Foundation grant (Delmas, Garfield, Ooms, & Chance, 2007). Students in both pilot and non-pilot sections of Statistics 145 were encouraged to take this test at the beginning and end of the course; the variable of interest was the difference between the post-class score and pre-class score. A total of 42.9% of the consenting students (69 of 161) in the pilot section took both pre- and post-tests, as did 67 of the students from the non-pilot sections; analysis for this first question involves these 136 students.

Define µ_pilot to be the true mean difference between post-class and pre-class scores for students in the pilot section; define µ_control to be the corresponding true mean difference for students in the non-pilot sections. A Welch two-sample t-test for equality of means was not statistically significant, with t(132.8) = -1.0447, p = .2967, and a 95% confidence interval for µ_pilot - µ_control of [-1.401, .431] (see Table 1). With a p-value of .2967, we do not reject the null hypothesis of equality of means; the HyFlex model performs no differently from the traditional classroom model with respect to student learning. Note that because the confidence interval includes more negative values (and the estimate itself is negative), the scores in the control sections are estimated to be better than the scores from the pilot section, although this difference is small and not significant.
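
For readers who wish to reproduce this kind of analysis, the sketch below runs a Welch two-sample t-test and builds the matching confidence interval in Python. The simulated score differences are placeholders (the study's raw START data are not published here), so the printed statistics will not match the exact values above.

```python
import numpy as np
from scipy import stats

# Placeholder data: simulated post-minus-pre score differences with the
# reported group means and sizes; the true per-student data are not available.
rng = np.random.default_rng(0)
pilot_diffs = rng.normal(loc=0.754, scale=3.0, size=69)    # HyFlex pilot section
control_diffs = rng.normal(loc=1.239, scale=3.0, size=67)  # traditional sections

# equal_var=False requests Welch's test (no pooled-variance assumption).
t_stat, p_value = stats.ttest_ind(pilot_diffs, control_diffs, equal_var=False)
print(f"t = {t_stat:.4f}, p = {p_value:.4f}")

# 95% CI for the difference in means (pilot - control), using the
# Welch-Satterthwaite approximation to the degrees of freedom.
n1, n2 = len(pilot_diffs), len(control_diffs)
v1, v2 = pilot_diffs.var(ddof=1), control_diffs.var(ddof=1)
se = np.sqrt(v1 / n1 + v2 / n2)
df = (v1 / n1 + v2 / n2) ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
lo, hi = stats.t.interval(0.95, df, loc=pilot_diffs.mean() - control_diffs.mean(), scale=se)
print(f"95% CI for difference in means: [{lo:.3f}, {hi:.3f}]")
```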

Comparison Within Pilot Section

To determine whether attendance methods have an impact on student success in the course, we next turn to a within-pilot-section comparison. The proxy chosen for student success was course grades; specifically, the average of 12 homework assignments, the midterm exam score, and the final course grade. Each of these three categories was analyzed separately. To formally address this question, students were placed in categories by primary and secondary attendance method (measured by attendance mode frequency, self-reported in daily lecture review quizzes; see Table 2). The data used to address this second research question come from the 161 students in the pilot section who allowed their grades to be used for analysis.
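
The categorization step itself is straightforward to express in code. The following pandas sketch is a hypothetical reconstruction (the study's processing scripts are not published): it assigns each student the mode they reported most often as their secondary category, then collapses secondary categories into the primary "Live"/"Recording" split used below.

```python
import pandas as pd

# Hypothetical self-reports from the daily lecture review quizzes;
# student IDs and responses are invented for illustration.
quiz_responses = pd.DataFrame({
    "student_id": [101, 101, 101, 102, 102, 103, 103, 103],
    "attendance": ["Face to Face", "Face to Face", "Adobe Connect",
                   "Adobe Connect", "Adobe Connect",
                   "Watched Recording", "Plans to Watch", "Watched Recording"],
})

# Secondary category: the mode each student reported most often
# (ties broken arbitrarily by taking the first mode).
secondary = quiz_responses.groupby("student_id")["attendance"].agg(
    lambda reports: reports.mode().iloc[0]
)

# Primary category: collapse the secondary labels into "Live" vs. "Recording".
primary_map = {
    "Face to Face": "Live",
    "Adobe Connect": "Live",
    "Plans to Watch": "Recording",
    "Watched Recording": "Recording",
}
primary = secondary.map(primary_map)
print(pd.DataFrame({"secondary": secondary, "primary": primary}))
```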

First, a comparison was made between the two secondary categories of "Live" attendance. For each grade category (homework average, midterm, final grade), define µ_f2f to be the true average grade for students attending most often face-to-face and µ_AC to be the true average grade for students attending most often synchronously via Adobe Connect. For each of the three grade categories, a two-sample test for equality of means was again not statistically significant, with p-values of .7312, .2724, and .3448, respectively (see Table 3 for the full results and confidence intervals). With such large p-values we cannot reject the null hypotheses of no difference in means, and we conclude that there is no significant difference in homework grades, midterm scores, or final course grades between the two secondary attendance groups. This conclusion is again expected given that each confidence interval includes zero; as before, note that each interval contains more positive values, which indicates that the face-to-face grades are higher, but not significantly so.

Because no difference could be detected between the two groups of students who most often attended lecture live, it is safe to treat these students as a single group, which allows us to make the same comparisons as above based on the two primary categories of "Live" and "Recording" (see Table 4 for complete results). Again, large p-values and confidence intervals containing zero allow us to conclude that there is no significant difference in homework grades, midterm scores, or final course grades between these two primary attendance groups.

Affective Responses from Students

Both the end-of-term survey and focus groups addressed students’ affective responses to the HyFlex classroom model and provided qualitative information on what they did and did not like about the new course structure and inclusion of more learning technology. The end-of-term survey was aligned with university course satisfaction surveys; specifically, students were asked 30 questions about the class, with responses solicited on a 4-point Likert scale (strongly agree, agree, disagree, strongly disagree). Most of the questions addressed the technology used in the course, including both the technologies used previously (e.g., online course management system and publisher-created software used for online homework) and the new "instructional technologies" (IT) implemented in the pilot (IT was defined to be any combination of tablet-based slides in PowerPoint, statistical applets, Poll Everywhere, and Adobe Connect).

Ease of use and perceived benefits. A total of 77 students in the pilot study (47.9%) responded to the end-of-term survey. In general, students found the IT easy to use and helpful to their learning. For example, 80% of students either "agreed" or "strongly agreed" that it was easy to access the online lectures through Adobe Connect (another 12% marked "not applicable"); approximately the same percentage found it easy to respond to a lecture poll. About 95% "agreed" or "strongly agreed" that IT made the course materials more interesting and increased their understanding of the course concepts, and more than 70% said technology increased their participation in class over what they expected coming into the course. A subset of the results is available in Tables 6 and 7.

Desire for flexibility. The feedback from the focus groups contained much overlap with the comments shared in the end-of-term survey. The first major theme in the focus groups involved the attendance choices. Most students chose their daily attendance based on whatever worked with their schedule on that particular day. Weather conditions, other class responsibilities, extracurricular commitments, and the need for extra sleep motivated Adobe Connect attendance; a preference for consistency and face-to-face instruction motivated in-class attendance. Interestingly, different students favored different attendance methods. Some preferred face-to-face attendance to online synchronous attendance because of the increased interaction with the instructor, the ability to speak with her after class, and an overall more engaging experience. Others disagreed, saying online synchronous attendance was better than attending face-to-face, as they were able to stay more alert, focused, and comfortable outside of the classroom. One student commented, "I enjoyed being able to watch class on my laptop sitting in my bedroom with a cup of coffee, in my pajamas. I learn better when I am comfortable, especially with morning classes."

Figure 4. Final question from the end-of-term survey; highlights a preference for lectures that involve technology (face-to-face lectures with IT, 57%; online lectures, 38%).

Level of participation. Students overwhelmingly replied that instructional technology increased their participation in class. Having lecture recordings for review enabled one student to go back and re-listen to how the instructor explained a difficult topic, a remediation tool not previously available ("I really loved how if I didn’t understand a lecture, I could just go back and re-watch the lecture since they are all online."). Some students found the course difficult but "manageable" thanks to the use of instructional technology; students found the in-class polls helpful, engaging, and a fun way to check comprehension without the pressure of getting an answer wrong. One student even commented that using Poll Everywhere gave the feeling of being in a small class, where problems are worked out on a chalkboard.

The last question on the end-of-term survey asked students to select how they would most likely attend lecture if given a choice (Figure 4). Only five percent of respondents indicated that they would prefer face-to-face lectures with no instructional technology, while an overwhelming majority (57%) would select face-to-face lectures with instructional technology. A slightly smaller proportion (38%) would prefer completely online lectures.

Discussion

Based on the research foundations outlined in the introduction and the results and analyses shared in the Findings, a strong argument has been made in favor of the HyFlex model, especially for large, on-campus courses. Given the current state of the postsecondary education system, the flavor of blended learning described in this paper has great potential, as it reduces the pressures placed on brick-and-mortar universities by increased enrollment and the rising popularity of online education. It is notable that after experiencing a 10-week HyFlex course, only five percent of respondents to the end-of-term survey said they would prefer a class that was fully face-to-face with limited instructional technology. The HyFlex model additionally satisfies the needs and desires of today’s college students by providing educational choices, opportunity for a student-driven curriculum, an engaging classroom atmosphere, and a concrete framework for incorporating the technology students already use every day. For example, a 2010 Pew survey reported that 54% of all teens contact their friends daily using text messaging and 25% do the same with a social network site; only 33% of teenagers talk to their friends face-to-face on a daily basis (Lenhart, 2010). Furthermore, the HyFlex model accomplishes all of this while maintaining quality instruction and avoiding a decrease in course outcomes (operationalized by grades).

In the previous section, we discussed the ways in which this pilot study improved the quality of our course. The feedback received from students emphasized an increase in participation, increased interest in the course materials, and increased understanding of core concepts. The new components of this model were easy for students to use, from participating with in-class polls to accessing online homework assignments. Additionally, students were grateful for the availability of lecture recordings, which were posted shortly after class each day. Initially, the recordings were posted so that students who missed class could still experience the lecture content; however, we found that students were using the recordings as a remediation tool, going back through the videos to review difficult concepts. The impetus for increased participation in class came from a removal of the "embarrassment factor," made possible by anonymous responses to the web polls and backchannel. Students were more willing to take risks and commit to an answer even if they were not completely confident in their selection.

Technical Challenges and Self-Reported Attendance

However, our HyFlex curriculum model is by no means perfect, and several difficulties should be noted. The most common problems were technical difficulties, which, while disruptive to the flow of lecture, were not overly detrimental. Additionally, as the term unfolded, it became obvious that collecting attendance data through self-reporting was less than ideal. On average, only 75-85% of the class completed lecture review quizzes for each session; however, each student filled out enough review quizzes over the course of the term to enable categorization by most common attendance method.

More seriously, anecdotal evidence indicated misreporting of attendance ("I know Dr. Miller said my responses won’t affect my grade, but I’m still not sure I feel comfortable honestly reporting how I attend class") in spite of repeated encouragement to honestly report attendance. Fortunately, although we cannot verify individual student responses, we are able to compare reported attendance numbers with actual data on synchronous Adobe Connect attendance and lecture recording views for nearly all class sessions (see Table 5). The good news is that students were quite honest in reporting their synchronous Adobe Connect attendance: on average, true attendance actually exceeded reported attendance. On the other hand, reported views of the lecture recordings were much higher than the actual views. Thus, while the reported numbers of lecture recording views are suspect, the tallies for synchronous Adobe Connect attendance seem to be quite reliable. In any case, future research would do well to establish a more accurate and objective method for collecting attendance data.
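
As a small worked example of that comparison, the sketch below recomputes the reported-minus-actual differences for a few class dates taken from Table 5; the column names are our own.

```python
import pandas as pd

# Four illustrative rows from Table 5 (dates, recording views, and
# synchronous Adobe Connect attendance).
attendance = pd.DataFrame({
    "date": ["26-Sep", "14-Oct", "28-Oct", "2-Dec"],
    "actual_views": [34, 48, 44, 35],
    "reported_views": [13, 66, 87, 76],
    "actual_connect": [36, 27, 12, 16],
    "reported_connect": [25, 25, 17, 15],
})

# Positive differences indicate over-reporting; negative differences
# indicate that actual numbers exceeded what students reported.
attendance["views_diff"] = attendance["reported_views"] - attendance["actual_views"]
attendance["connect_diff"] = attendance["reported_connect"] - attendance["actual_connect"]
print(attendance[["date", "views_diff", "connect_diff"]])
```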

Combined with subjective daily observation, the self-reported data seemed to indicate that the total number of students actually experiencing the lecture content was comparable to other terms. Nonetheless, an important question to address as this project moves forward is: what is the best way to motivate students to experience the lecture, either through face-to-face or online attendance? Measuring student engagement is something that might inform this question, but just how to do so was not addressed in this project.

Next Steps

Despite the success of this project, the issues encountered throughout the HyFlex pilot study leave many research questions for future investigation. For example, better ways to monitor attendance and develop the desired classroom community are needed. A more controlled research design might eliminate the option to view a recorded lecture, allowing all students to interact synchronously regardless of attendance method. Providing ways for students to feel socially connected with their classmates becomes particularly important within a HyFlex model where some students are not physically present in the classroom. Furthermore, the technologies used in this pilot study have been somewhat piecemeal, and we hope to find a more cohesive technology to both simplify setup and provide students with a more seamless experience. As technologies continue to advance, we look for ways to find the appropriate technologies for both our needs and the needs of our students.

Development of the Backchannel

Based on feedback from the end-of-term survey and personal pedagogical philosophy, another future goal of this project is continued development of the backchannel. The backchannel is not currently in its optimal form, which would be a real-time, in-class forum for students to ask questions, answer other students’ questions, and host relevant discussion of lecture content. Students are not currently required to bring a laptop to class, which would be necessary for the backchannel to reach this form. While Poll Everywhere and Adobe Connect will continue to be used in the short term, additional exploration of available technologies is motivated by this problem, and finding a more all-encompassing technology (as mentioned above) would greatly improve the usefulness of the backchannel and contribute to feelings of classroom community.

Similarly, a better logistical arrangement for monitoring the backchannel needs to be developed. As described in the Method section, the current version of the HyFlex model requires two people to be present in the classroom: one to lecture, and one to monitor the technology. This is unrealistic; in future terms the instructor will be required to monitor the backchannel while lecturing. It remains to be seen whether this additional responsibility will prove to be too great a distraction.

Continued research is needed to improve this model. The results demonstrate some benefits, with little to no harm compared to the traditional lecture model. As HyFlex gains traction, it will be important to determine best practices that maintain quality while reducing the physical and personnel resources associated with large courses. The authors invite others to join the conversation and to work to advance the HyFlex model.

References

Aargard, H., Bowen, K., & Olesova, L. (2010). Hotseat: Opening the backchannel in large lectures. Educause Quarterly, 33(3). Retrieved from http://www.educause.edu/ero/article/hotseat-opening-backchannel-large-lectures

Abdous, M., & Yoshimura, M. (2010). Learner outcomes and satisfaction: A comparison of live video-streamed instruction, satellite broadcast instruction, and face-to-face instruction. Computers & Education, 55(2), 733-741.

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco: Jossey-Bass.

Atkinson, C. (2009). The backchannel: How audiences are using Twitter and social media and changing presentations forever . Berkeley: New Riders Press.

Beatty, B. (2006). Designing the HyFlex world [PowerPoint slides]. Retrieved from http://itec.sfsu.edu/hyflex/beatty_hyflex_aect2006.ppt

Beatty, B. (2010). Hybrid courses with flexible participation: The HyFlex design (Draft v2.2). Retrieved from http://itec.sfsu.edu/hyflex/hyflex_course_design_theory_2.2.pdf

Bruff, D. (2009). Teaching with classroom response systems: Creating active learning environments. San Francisco: Jossey-Bass.

Buhagiar, T., & Potter, R. (2010). To stream or not to stream in a quantitative business course. Journal of Instructional Pedagogies, 3. Retrieved from http://www.aabri.com/manuscripts/09417.pdf

Campt, D., & Freeman, M. (2009). Talk through the hand: Using audience response keypads to augment the facilitation of small group dialogue. International Journal of Public Participation, 3(1), 80-107.

Dahlstrom, E. (2012). ECAR study of undergraduate students and information technology, 2012 (Research Report). Educause Center for Applied Research. Retrieved from http://www.educause.edu/library/resources/ecar-study-undergraduate-students-and-information-technology-2012

Delmas, R., Garfield, J., Ooms, A., & Chance, B. (2007). Assessing students’ conceptual understanding after a first course in statistics. Statistics Education Research Journal, 6(2), 28-58.

Educause Learning Initiative (ELI). (2010). Seven things you should know about backchannel communication. Educause Library. Retrieved from http://net.educause.edu/ir/library/pdf/ELI7057.pdf

Fernando, N. J. S., Cole, J. V., Tan, P. L., & Freitas, J. C. (2011, June). Live lecture streaming for distributed learning. Paper presented at Scanning the Horizons: Institutional Research in a Borderless World, Higher Education Institutional Research Network Conference, Kingston University. Retrieved from http://www.heir2011.org.uk/conference-papers/papers/Fernando%20HEIR2011.pdf

Filer, D. (2010). Everyone’s answering: Using technology to increase classroom participation. Nursing Education Perspectives, 31(4), 247-250.

Knapp, L. G., Kelly-Reid, J. E., & Ginder, S. A. (2012). Enrollment in postsecondary institutions, fall 2010; Financial statistics, fiscal year 2010; and Graduation rates, selected cohorts, 2002-2007. U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubs2012/2012280.pdf

Kyei-Blankson, L. (2009). Enhancing student learning in a graduate research and statistics course with clickers. Educause Quarterly, 32(4). Retrieved from http://www.educause.edu/ero/article/enhancing-student-learning-graduate-research-and-statistics-course-clickers

Lenhart, A. (2010). Text messaging becomes centerpiece communication. Pew Research Center Publications. Retrieved from http://pewresearch.org/pubs/1572/teens-cell-phones-text-messages

Sloan Consortium. (2012). Effective practices award winners. Retrieved from http://sloanconsortium.org/ep_award_winners

Trip, G. (2011, May 12). Speaking up in class, silently, using social media. New York Times. Retrieved from http://www.nytimes.com/2011/05/13/education/13social.html?_r=2&pagewanted=all

Tables

Table 1

Test for the equality of means between pilot and non-pilot sections

Classroom model         n    Mean difference in pre- and      95% CI for the          p-value
                             post-class START test scores     difference in means
HyFlex (pilot)          69   .754                             [-1.401, .431]          .2967
Traditional (control)   67   1.239

Note: CI = confidence interval. Testing the hypothesis of no difference in effect of classroom model (pilot versus traditional) on student learning. The large p-value indicates that classroom model had no significant effect on student learning.

Table 2

Categorization of students in the pilot section, by the method students most often attended lecture

Primary Category   Secondary Category   Category Description                                                       n
Live               Face to Face         Students attending lecture in person.                                      56
Live               Adobe Connect        Students attending lecture synchronously online.                           21
Recording          Plans to Watch       Students reporting their attendance before watching the recorded video.    41
Recording          Watched Recording    Students reporting their attendance after watching the recorded video.     39
Did Not Attend                          Students neither attending lecture nor planning to watch the recording.    4

Table 3

Test for the equality of means between face-to-face and Adobe Connect attendees

Grade Category                   Most frequent attendance method   n    Average grade   95% CI (f2f - AC)   p-value
Homework (20 points total)       Face to face                      56   17.21           [-1.418, 2.001]     .7312
                                 Adobe Connect                     21   16.919
Midterm Exam (50 points total)   Face to face                      56   42.728          [-1.476, 5.074]     .2724
                                 Adobe Connect                     21   40.929
Final Course Grade               Face to face                      56   81.854          [-4.223, 11.731]    .3448
                                 Adobe Connect                     21   78.1

Note: CI = confidence interval; f2f = face to face; AC = Adobe Connect. Within-section grade comparisons between the secondary attendance groups "Face to Face" and "Adobe Connect." Large p-values are desirable here; they indicate that choice of attendance method does not negatively impact grades.

Table 4

Test for the equality of means between Live and Recording attendees

Grade Category                   Most frequent attendance method   n    Average grade   95% CI (Live - Recording)   p-value
Homework (20 points total)       Live                              77   17.131          [-1.343, .635]              .4806
                                 Recording                         80   17.485
Midterm Exam (50 points total)   Live                              77   42.237          [-2.862, .823]              .2761
                                 Recording                         80   43.256
Final Course Grade               Live                              77   80.83           [-6.315, 2.167]             .3354
                                 Recording                         80   82.904

Note: CI = confidence interval. Within-section grade comparisons between the primary attendance groups "Live" and "Recording." Again, large p-values are desirable; they indicate that choice of attendance method does not negatively impact grades.

Table 5

Reported versus actual attendance numbers, by class session

Class Date   Actual rec. views   Reported rec. views   Diff.   Actual AC attendees   Reported AC attendees   Diff.
21-Sep       36                  N/A                   N/A     N/A                   N/A                     N/A
23-Sep       49                  N/A                   N/A     28                    N/A                     N/A
26-Sep       34                  13                    -21     36                    25                      -11
28-Sep       42                  35                    -7      30                    21                      -9
30-Sep       46                  36                    -10     26                    21                      -5
3-Oct        41                  34                    -7      39                    35                      -4
5-Oct        39                  37                    -2      32                    26                      -6
7-Oct        50                  57                    7       43                    24                      -19
10-Oct       45                  35                    -10     36                    31                      -5
12-Oct       47                  45                    -2      30                    24                      -6
14-Oct       48                  66                    18      27                    25                      -2
17-Oct       39                  27                    -12     42                    37                      -5
21-Oct       56                  61                    5       22                    25                      3
24-Oct       54                  63                    9       22                    23                      1
26-Oct       42                  63                    21      19                    16                      -3
28-Oct       44                  87                    43      12                    17                      5
31-Oct       50                  70                    20      18                    17                      -1
2-Nov        47                  51                    4       28                    20                      -8
4-Nov        54                  69                    15      16                    17                      1
7-Nov        45                  64                    19      30                    25                      -5
9-Nov        39                  60                    21      27                    21                      -6
14-Nov       45                  68                    23      23                    17                      -6
16-Nov       42                  67                    25      21                    14                      -7
18-Nov       43                  71                    28      18                    17                      -1
21-Nov       48                  57                    9       21                    18                      -3
23-Nov       50                  75                    25      15                    16                      1
28-Nov       36                  52                    16      20                    16                      -4
30-Nov       37                  61                    24      22                    14                      -8
2-Dec        35                  76                    41      16                    15                      -1

Note: rec. = lecture recording; AC = Adobe Connect; Diff. = reported minus actual. Actual (unique) recording views and synchronous Adobe Connect attendance compared with reported numbers. Reported recording views change drastically in mid-October, which corresponds to the time of the midterm exam.

Table 6

Question                                                                                  SA/A     D/SD     N/A
It was easy for me to respond to a lecture poll using Poll Everywhere.                    83.1%    11.7%    5.2%
It was easy for me to submit a question to the backchannel using Poll Everywhere.         67.5%    10.4%    22.1%
It was easy to access online lectures with Adobe Connect.                                 81.8%    6.5%     11.7%
It was easy to access lecture recordings in the university learning management system.    84.4%    3.9%     11.7%

Note: SA/A = strongly agree or agree; D/SD = disagree or strongly disagree. End-of-term survey for Autumn 2011 pilot section; questions focusing on backchannel, polls, and live streaming. Response rate 47.2%.

Table 7

Question                                                                                                       SA/A     D/SD
Instructional technology (IT) helped me prepare for recitations.                                               88.3%    11.7%
IT helped make the materials and activities in the course more interesting.                                    96.1%    3.9%
IT helped increase my interest in Statistics.                                                                  71.4%    28.6%
IT helped me increase my understanding of the concepts in the course.                                          94.7%    5.3%
IT was worth the time I spent using it.                                                                        92.2%    7.8%
IT took away from my learning in the course.                                                                   9.1%     90.9%
IT helped me increase my participation in class over what I would have expected coming into the course.        71.4%    28.6%
IT in this course helped meet my preferred learning style.                                                     83.1%    16.9%
I would recommend this course to other students.                                                               97.4%    2.6%

Note: SA/A = strongly agree or agree; D/SD = strongly disagree or disagree. End-of-term survey for Autumn 2011 pilot section; questions focusing on general use of instructional technology (IT), defined to be any combination of tablet-based slides in PowerPoint, statistical applets, Poll Everywhere, and Adobe Connect. Response rate 47.2%.