Closing the gap between online and classroom student outcomes

For the past few years, community colleges have shifted their focus from access to completion. Offering online programming is a great way to provide access to higher education, but closing the gap between online and classroom student outcomes remains an ongoing challenge.

eLearning via web-conferencing – CC-BY Wolfgang Greller on Flickr

Efforts to bridge the gap have mostly centered on learner characteristics: GPA and SAT scores, previous experience with online learning, capacity for self-regulated learning (self-efficacy, time management, organizational skills), and so on. Although these can be helpful indicators in predicting online student achievement, another promising approach focuses on the at-risk course.

Ferris State University has offered Structured Learning Assistance (SLA) for the past 25 years, focusing not on the at-risk student but on the course at high risk for failure. Although SLA has not been offered as an online option, its impact on student success has been consistent: pass rates more than 10% higher than in sections without supplemental instruction. By targeting courses with a history of high failure and withdrawal rates, SLA supports students with an additional 45 hours of supplemental instruction.

A recent study at Borough of Manhattan Community College at CUNY looks at online course-level predictors of learning outcomes (Wladis et al., 2015). The study found a significant gap in course completion between online courses taken as electives and those required for a given major. Lower-level courses also had a much higher attrition rate than higher-level courses. The study suggests that interventions such as embedded supplemental instructional support (tutoring, mentoring, advising, extra technical assistance) within the more challenging courses could significantly improve – and possibly even eliminate – the performance gap between online and face-to-face outcomes.

A few years ago I attended the Online Learning Consortium (OLC) conference and sat in on a panel presentation describing strategies for keeping students enrolled in online learning. The panel was made up of faculty and staff from Penn State World Campus. The strategies included embedding tutors in some of the more challenging online courses; a single embedded tutor might support as many as eight sections of a course.

Tutors met virtually with students one-on-one or in groups by appointment as many as five or six days a week, including evening and weekend hours. They also scheduled “drop-in sessions” to go over some of the more challenging concepts, answer questions, and provide more detail on upcoming course assignments. In addition to the virtual meetings, tutors posted helpful tips on study skills and supplemental web resources. The goal of the initiative was to increase retention by 2% per year over a five-year period. The results, however, showed a 75% reduction in withdrawals and late drops and a 15% reduction in course failure rates.

Implementing an online supplemental instruction program is a big undertaking, but focusing on the most challenging online courses is a great place to begin.

References:

Wladis, C., Conway, K., & Hachey, A. C. (2015). Using course-level factors as predictors of online course outcomes: A multilevel analysis at a U.S. urban community college. Studies in Higher Education, 42(1). Taylor & Francis Online. http://www.tandfonline.com/doi/abs/10.1080/03075079.2015.1045478


How do the best online teachers bridge the transactional distance between instructor and student?

Orcutt and Dringus (2017) share the results of their study on what experienced online instructors do to establish teaching presence and create a climate of academic intellectual curiosity.

Source: Beyond Being There: Practices that Establish Presence, Engage Students and Influence Intellectual Curiosity in a Structured Online Learning Environment

Their findings demonstrate the effectiveness of developing teaching presence well before the course actually begins. By initiating “authentic relationships” through welcome posts or emails, students are able to connect with their instructors before the first day of class. This proactive approach establishes teaching and social presence and helps learners develop a sense of “being there” – closing the perceived transactional distance typical of the virtual learning environment.

Stitz & Zeager Open Algebra 3rd Edition

Can OER improve learning outcomes?

The June issue of the International Review of Research in Open and Distributed Learning (IRRODL) is dedicated to Open Educational Resources (OER). Several of the studies focus specifically on OER and student learning outcomes.

In Exploring Open Educational Resources for College Algebra, Marcela Chiorescu of Georgia College describes a case study in which the instructor of a hybrid college algebra course switched from a commercial textbook and software after the first two semesters to an open text and supplemental low-cost software for one semester, and then back again to the commercial content.

For the semester using OER, Chiorescu estimates that students collectively saved over $13,500 – spending approximately 75% less than students using the commercial text and software. Certainly one of the main advantages of adopting OER is lowering costs to students. However, there may be other advantages more closely tied to student success.

Analyzing the grade distribution over the four semesters, Chiorescu found that the percentage of students earning a grade of C or better was significantly higher (84.3%) in the OER semester than in the previous or subsequent semesters. The percentage of students earning an A was also higher for those using OER, and the OER sections reported significantly fewer withdrawals.
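Chiorescu reports these differences as statistically significant. As a rough illustration of what such a comparison involves – not the study’s actual method or data – here is a minimal Python sketch of a two-proportion z-test, using invented enrollment counts alongside the reported 84.3% C-or-better rate:

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test (normal approximation); returns (z, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Placeholder counts (NOT the study's data): an OER semester at roughly 84% C-or-better
# versus a comparison semester at 70%, with about 100 students in each group.
z, p = two_proportion_z(84, 100, 70, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # about z = 2.35, p = 0.019 with these invented numbers
```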

The decision to return to commercial content after one semester was related to technical issues with the low-cost software (quizzes locking up and slow download speeds). Chiorescu was also concerned that the low-cost software was not comparable to the commercial version, due to a “lack of resources”. To compensate for the deficiency in course materials, she developed a LibGuide to accompany the course, including supplemental videos and tutorials.

In another study, The Impact of Enrollment in an OER Course on Student Learning Outcomes, Kim Grewe and Preston Davis of Northern Virginia Community College compared learning outcomes for students enrolled in an online history course using OER with a similar number of non-OER sections over two semesters. The study took into account students’ cumulative GPAs and, as expected, found a correlation between prior academic achievement and course achievement. However, an even stronger correlation was found between student achievement and enrollment in an OER section.

Both studies build upon previous research on the efficacy of OER and student achievement (Grewe & Davis), finding that students enrolled in courses using OER perform as well as, if not better than, students enrolled in non-OER courses. In addition, OER-supported courses are more affordable, and students are more likely to enroll in a higher number of credit hours per semester – and thereby achieve their academic goals in a more timely manner.

Another subtle, but important, takeaway was the use of LibGuides to supplement OER textbooks. One of the challenges of adopting OER is that the open textbook may not include all of the supplemental materials that oftentimes accompany the commercial texts. LibGuides offer the opportunity to engage the academic librarian in the course design process and potentially improve the overall quality of OER course offerings.

References:

Stitz, C., & Zeager, J. (2013). College Algebra, 3rd edition. Available at http://www.stitz-zeager.com/

Chiorescu, M. (2017). Exploring Open Educational Resources for College Algebra. The International Review of Research in Open and Distributed Learning, 18(4). http://www.irrodl.org/index.php/irrodl/article/view/3003/4223 doi:10.19173/irrodl.v18i4.3003

Grewe, K., & Davis, W. P. (2017). The Impact of Enrollment in an OER Course on Student Learning Outcomes. The International Review of Research in Open and Distributed Learning, 18(4). http://www.irrodl.org/index.php/irrodl/article/view/2986/4211 doi:10.19173/irrodl.v18i4.2986

Keeping score at a relay race for the Melrose Running Club

Measuring Student Success

For many years, community colleges have focused on providing access to everyone seeking a post-secondary education. More recently, the narrative has shifted to completion and how long it takes students to complete. The literature tells us the longer a student is in college, the less likely she is to complete. Completion rates for first-time, full-time degree-seeking students are low – especially for community college students. Generally speaking, when we measure student success, what we are talking about is completion rates.

The fact is many – if not most – of our students are not first-time, full-time degree-seeking students. Many have already completed a college education, some having attained bachelor’s and master’s degrees, before coming back to start over in a new career. More and more of our student population are enrolling part-time. Others are not seeking a degree at all but instead are looking to update their job skills by taking a course or two. How do we measure student success for these learners?

California Community Colleges studied these non-completers over a two-year period. Referred to as Skills Builders, these students enrolled in a limited number of courses for the express purpose of enhancing their job skills or moving up the ladder within their careers. The study found that the average Skills Builder improved their salary by 13.6% – an average of $4,300 per year! Clearly, these students were successful in meeting their educational goals.

EDUCAUSE Review recently published a collection of short essays entitled Student Success: Mission Critical. In the introduction, John O’Brien writes, “If students don’t succeed, colleges and universities don’t succeed. Our full attention must be concentrated on the mission-critical goal of helping students define – and meet – their educational goals.”

To my way of thinking, this is what we should be talking about when we talk about student success: “helping students define – and meet – their educational goals”.

Regardless of whether students come to college to take a few courses to improve their employability, complete a program of study, or transfer to university, achieving their educational goals requires persistence. We often hear the terms “persistence” and “retention” used synonymously. The National Center for Education Statistics (NCES), however, differentiates between the terms: “retention” as an institutional measure and “persistence” as a student measure. In other words, institutions retain and students persist.

This is an important distinction when it comes to measuring student success. As an institution, we can measure retention: did the student drop or complete? Has the student continued their program of study? However, whether the learner persists or not is really up to the student. What we can, and should, do is create a learning environment that encourages student persistence.

“Tinto posits that students are more likely to remain enrolled in an institution if they become connected to the social and academic life of that institution” – Community College Research Center (CCRC).

The CCRC study found that community college students who make academic connections… “create a sense of belonging and attachment that seems to encourage persistence” – CCRC.

In Student Success: Mission Critical, George Mehaffy reminds us not to avoid the academic heart of the enterprise: “the core of the enterprise is the curriculum and particularly the classroom. Some people avoid tackling that area because it is likely the most difficult. However, substantive change in student success outcomes must include attention to what happens to students in classrooms and in their academic journeys.”

We can measure student success in different ways: retention is measured by the quantity of students who continue and complete, but persistence is measured by the quality of the student’s experience – whether they feel they belong here and how much we care.

References:

http://www.aacc.nche.edu/AboutCC/Trends/Pages/completion_report.aspx

https://www.insidehighered.com/news/2016/03/04/california-community-colleges-find-new-way-measure-success-noncompleters

http://er.educause.edu/articles/2017/5/student-success-mission-critical

http://ccrc.tc.columbia.edu/media/k2/attachments/exploration-tintos-integration-framework.pdf

Teen on Smartphone

Smartphone Learning

For the past several years, the Horizon Report has listed mobile learning in one form or another (e.g., mobile computing, mobile apps, social media, BYOD) as an emerging educational technology. Mobile technologies have changed over the years: from the early PDAs, BlackBerrys, and feature phones with texting capability and cameras, to tablets and eReaders, to the ubiquitous smartphones of today. According to the ECAR 2016 Study of Undergraduate Students and Information Technology, 96% of undergraduate students now own a smartphone. Smartphones have clearly emerged as the mobile technology of choice, while tablet, eReader, and wearable technology ownership has dropped off.

Undergraduate Smartphone Ownership

Despite near-universal device ownership, students have yet to fully embrace the smartphone as a tool for learning. The ECAR study indicates that most students (approximately 80%) do use their smartphones for one or more classes, while only 46% consider them “essential for coursework,” compared with 93% for laptops. This is understandable given that many online courses tend to be reading- and writing-intensive. The size of the screen and the necessity of typing on a virtual keyboard can make reading and writing on a smartphone a laborious task.

The top three ways students report using academic technology are making it easier to access coursework (72%), increasing communication with other students (65%), and increasing communication with their instructors (60%) – in other words, student-to-content, student-to-student, and student-to-instructor interactivity. Anderson’s Interaction Equivalency Theorem states that “deep and meaningful formal learning is supported as long as one (or more) of the three forms of interaction is at a high level”.

What if we were to design the course with the smartphone learner in mind? Not necessarily that the course must be taken using a smartphone, but that the learner who uses a smartphone as their primary technology would not be disadvantaged. What would we then need to do differently?

Considering student-to-content interaction, rather than delivering content primarily in text, the use of video and/or audio formats might prove more mobile-friendly. Smartphones are great for watching short videos or listening to music. Video and audio files (podcasts) can be easily created using various mobile apps or web-conferencing solutions (e.g., Voice Recorder, Zoom.us, Skype). By using Google Drive or Archive.org, media can be made available for students to download for offline use when they are without a WiFi connection.

The ability to capture and share images, audio, and video via the smartphone camera can be a powerful tool for both student-to-content and student-to-student interactivity. By sharing or attaching photos, screenshots, video, or audio files, learners can create authentic artifacts. Such media can be submitted to an e-portfolio or blog (e.g., Tumblr) for peer review or assessment of learning.

Most social media technologies (SMT) are designed to work with the smartphone as well as with desktop browsers. By replacing LMS threaded discussions with SMT (e.g., GroupMe), messaging, engaging in group discussions, and sharing news, scholarly articles, video, and more become simple and familiar processes.

Scheduling virtual office hours using Skype, Zoom, or Hangouts can increase student-to-instructor interactivity and improve student satisfaction. Skype can also be used for asynchronous video and audio communication, supporting teaching presence and instructor immediacy.

Despite the pervasiveness of smartphone ownership among today’s undergraduate students, their use of the technology for academic purposes has not kept pace with the rate of adoption. One reason students may not leverage their mobile devices for formal learning is that educators have yet to fully “harness” the affordances of the technology for teaching and learning.

References:

Brooks, D.C. (2016). ECAR Study of Undergraduate Students and Information Technology 2016. EDUCAUSE.

Anderson, T. (2003). Getting the Mix Right Again: An Updated and Theoretical Rationale for Interaction. IRRODL.

Cochrane, T., & Bateman, R. (2010). Smartphones give you wings: Pedagogical affordances of mobile Web 2.0. Australasian Journal of Educational Technology.

Can Transactional Distance Theory inform instructional design for CBE?

For the past several years, online learning in higher education has focused on delivering a highly structured learning environment. Courses are typically designed to deliver content sequentially (week one, week two, etc.) with required reading assignments, discussion forums, quizzes, and so on. This approach has helped new-to-online learners navigate the online course, learn to use the most common learning management system (LMS) tools, and interact with their classmates in ways that model the traditional classroom. However, with emerging delivery models such as Competency-Based Education (CBE) and Massive Open Online Courses (MOOCs) gaining traction in higher education, it may be time to consider design approaches that better support the autonomous learner.

Billiards Table – CC-BY by Oliver Clark on Flickr

Transactional distance doesn’t refer to the distance between instructor and student in terms of space or time, but rather to the distance in terms of transactions or interactions between the learner, instructor, and content. According to Transactional Distance Theory, the “degree of structure and dialogue [required] in a course depends on the students’ ability to learn autonomously”. Students who lack the necessary skills to self-regulate their learning may require a more structured learning environment, whereas autonomous learners are better positioned to succeed in a less structured environment (Koslow & Piña, 2015).

The inverse relationship between learner autonomy and course structure is important to keep in mind as we design for emerging distance learning models. In the CBE environment, students enter a program of study at different times and progress at different rates. In order to support the flexibility required by CBE, we may need to abandon the lock-step approach traditionally used for designing online learning environments in higher ed.

Koslow and Piña (2015) suggest that in order for autonomous learners to be successful, they must possess self-regulated learning (SRL) strategies. The self-regulated learner has the ability to plan their own approach to learning as well as to review and evaluate their own understanding.

Online courses designed with SRL in mind might begin by offering pre-assessment opportunities for students to discover what they do or do not know on a given subject. Assessment feedback could include contextual links to additional resources / material for students to review. The use of practice quizzes along with digital badging systems and other formative assessment tools can assist students in measuring their own progress, as well as providing motivational support. Journals, blogs, and e-portfolios could replace the discussion forum commonly used in the “traditional” online course – providing an autonomous learning tool to assist with learner reflection.
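As a concrete (if simplified) illustration of this design idea, here is a minimal Python sketch of pre-assessment feedback with contextual resource links and a simple badge – the topic names, URLs, and mastery threshold are hypothetical, not features of any particular LMS:

```python
# Hypothetical sketch: map missed pre-assessment topics to review resources
# and award a simple badge when a learner demonstrates mastery.

REVIEW_LINKS = {                                   # hypothetical topics and URLs
    "factoring": "https://example.edu/resources/factoring",
    "linear-equations": "https://example.edu/resources/linear-equations",
    "graphing": "https://example.edu/resources/graphing",
}

def pre_assessment_feedback(results, mastery_threshold=0.8):
    """results maps topic -> True/False for whether its items were answered correctly."""
    missed = [topic for topic, correct in results.items() if not correct]
    score = 1 - len(missed) / len(results)
    return {
        "score": round(score, 2),
        "review": {t: REVIEW_LINKS[t] for t in missed if t in REVIEW_LINKS},
        "badge": "Ready to start" if score >= mastery_threshold else None,
    }

print(pre_assessment_feedback(
    {"factoring": True, "linear-equations": False, "graphing": True}))
# -> score 0.67, a contextual review link for linear-equations, and no badge yet
```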

As more colleges explore delivery models offering less structure and providing fewer opportunities for dialog, we need to consider instructional design approaches that can support student success in environments with greater learner autonomy.

References:

Koslow, A., & Piña, A. (2015). Using Transactional Distance Theory to inform online instructional design. International Journal of Instructional Technology and Distance Learning, 12(10). http://www.itdl.org/Journal/Oct_15/Oct15.pdf#page=67&zoom=auto,87,688

Weimer, M. (2010). What It Means to Be a Self-Regulated Learner. Faculty Focus, Magna Publications. http://www.facultyfocus.com/articles/teaching-and-learning/what-it-means-to-be-a-self-regulated-learner/

Using LMS Data to Improve Self-regulated Learning

A recent study examined learning management system (LMS) log files to look at course interactions for 530 online students and found that the students’ self-regulated study habits significantly influenced course achievement. The study focused on those interactions related to self-regulated learning, including such habits as maintaining a regular study schedule, timely submission of assignments, frequency of course logins, and proof of reading course content (You, 2016).
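To make the idea of mining log files for these habits more concrete, here is a minimal sketch that computes rough proxies for them from a hypothetical LMS event log – the event names and the specific proxies are my own assumptions, not the variables used in You’s study:

```python
import statistics
from datetime import datetime

# Hypothetical LMS event log: (student_id, event_type, ISO timestamp).
events = [
    ("s1", "login", "2016-02-01T09:00"), ("s1", "login", "2016-02-03T09:15"),
    ("s1", "content_view", "2016-02-03T09:20"),
    ("s1", "submission_late", "2016-02-05T23:59"),
    ("s1", "login", "2016-02-08T10:00"),
]

def srl_indicators(student_id):
    """Rough proxies for the habits named in the study, computed from the log."""
    logins = sorted(datetime.fromisoformat(ts)
                    for sid, ev, ts in events if sid == student_id and ev == "login")
    gaps = [(b - a).days for a, b in zip(logins, logins[1:])]
    submissions = [ev for sid, ev, ts in events
                   if sid == student_id and ev.startswith("submission")]
    return {
        "login_count": len(logins),                                   # login frequency
        "gap_stdev_days": statistics.pstdev(gaps) if gaps else None,  # study regularity
        "on_time_rate": (sum(ev == "submission_on_time" for ev in submissions)
                         / len(submissions) if submissions else None),
        "content_views": sum(1 for sid, ev, _ in events               # proof of reading
                             if sid == student_id and ev == "content_view"),
    }

print(srl_indicators("s1"))
```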

It may seem obvious that students who possess good study habits are more likely to succeed in the online learning environment (or any learning environment for that matter). However, for me, the important take-away from the study is the potential for leveraging the data collection systems and early alert functionality within our LMS toward reinforcing self-regulating learning habits with those students who may be at-risk of dropping or failing their online course.

The Retention Center is an early alert feature included in our campus LMS, Blackboard Learn. It comes set up with four default rules: Course Access, Activity, Grades, and Missed Deadlines. Although these rules align fairly well with the study habits mentioned in the study, we are able to customize them, as well as design new rules that can be added to the Risk Table, to further support and reinforce good study habits.

Blackboard Retention Center – Rules Customization

The Course Access and Activity rules align with the frequency-of-logins habit. The default Course Access rule is five days since last access, but in light of the research, I suggest shortening it to two or three days. The User Activity default rule is set to twenty percent below average for a week – again, I would suggest changing the window to three or four days. Monitoring timely submission of assignments and proof of reading course content depends on setting up corresponding columns in the grade book. By developing assignments that require students to review material (possibly video content) and then complete a related assessment, grade alerts can be triggered in the corresponding grade book column. By tweaking the default rules or adding new ones, the instructor can quickly identify students who display poor study habits and immediately reach out to reinforce good habits that support student success.
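Stepping outside Blackboard for a moment, the customized rules described above boil down to a few simple threshold checks. A minimal sketch, assuming a hypothetical per-student activity summary (the field names, dates, and function are mine, not part of the Retention Center):

```python
from datetime import date

# Hypothetical per-student activity summary pulled from LMS reports.
students = [
    {"name": "Avery", "last_access": date(2017, 9, 18), "weekly_activity": 40,
     "missed_deadlines": 1, "grade_pct": 58},
    {"name": "Blake", "last_access": date(2017, 9, 20), "weekly_activity": 95,
     "missed_deadlines": 0, "grade_pct": 88},
]

def at_risk_flags(student, today, class_avg_activity,
                  max_days_inactive=3, activity_floor=0.8, grade_floor=65):
    """Return the rule names a student trips, mirroring the customized rules above."""
    flags = []
    if (today - student["last_access"]).days > max_days_inactive:
        flags.append("course access")        # more than 3 days since last login
    if student["weekly_activity"] < activity_floor * class_avg_activity:
        flags.append("activity")             # more than 20% below the class average
    if student["missed_deadlines"] > 0:
        flags.append("missed deadline")
    if student["grade_pct"] < grade_floor:
        flags.append("grade")                # below a passing threshold
    return flags

avg = sum(s["weekly_activity"] for s in students) / len(students)
for s in students:
    print(s["name"], at_risk_flags(s, date(2017, 9, 22), avg))
# Avery trips all four rules; Blake trips none.
```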

In addition to the Retention Center, the Performance Dashboard tool can be used to view the content items the student has accessed along with the number and length of posts a student has submitted to course discussion forums. By requiring students to review content and submit substantive posts within a discussion forum, the instructor can encourage the reading of course content – another study habit predictive of student achievement. Encouraging students to subscribe to the discussion forums can further support regular and substantive interaction with classmates.

In a survey of unsuccessful online students at Monroe Community College, students reported that the number one challenge they experienced in their online courses was “I got behind and it was too hard to catch up” (Fetzner, 2013). By designing online courses that leverage LMS analytics features to identify and support at-risk students within the first few weeks, we can help new-to-online students develop the skills and habits required to be successful in the online environment.

References:

You, J. W. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. The Internet and Higher Education, 29, 23-30.

Fetzner, M. (2013). What Do Unsuccessful Online Students Want Us to Know?. Journal of Asynchronous Learning Networks, 17(1), 13-27.

The Ninety-day Action Plan

A recent article in Forbes encourages business leaders to consider putting together 90-day action plans, to be reviewed and updated every 90 days, in an effort to improve organizational agility.

Falguni Desai: “Strategic plans that cast a five year look into the future provide a sense of calm. They coax management teams into thinking that there is still time to put the plan into action.” – The Digital Era is Crippling the Five-year Strategic Plan

The author points out that in this digital age the role of the strategist is changing: if we are to keep up with change, we need to become “futurists” – keeping up with current trends and considering shorter planning cycles that allow us to be better prepared to change course if necessary.

Speaking of looking toward the future, one of the publications I look forward to reading each year is the Horizon Report put out by the New Media Consortium (NMC). The Horizon Report is put together by a panel of experts offering insights into what is emerging in educational technology on the college campus. The report offers key trends and significant challenges in technology adoption, as well as emerging technologies for the immediate future (one year or less), mid term (two to three years), and longer term (four to five years) in higher education.

Horizon Report 2016 Trends, Challenges, and Technologies for Higher Ed – CC-BY New Media Consortium

Bring Your Own Device (BYOD), Learning Analytics, and Adaptive Learning are listed in the current report as this year’s emerging technologies. Augmented and Virtual Reality and Makerspaces have a time-to-adoption of two to three years. At four to five years out, Affective Computing and Robotics are on the list.

The Horizon Report is not meant to replace the strategic plan, but it can offer some useful insight into current and emerging trends in higher ed. In fact, our campus has acted on a number of these same trends: in response to BYOD, we have implemented responsive web design and adopted mobile apps for our LMS. We are also planning to include a Makerspace as part of the Health Technologies capital project over the next two years.

Six years ago I would have agreed with the 2010 Horizon Report that Open Educational Resources were set to take off, but even today most campuses have yet to fully embrace OER, and it is no longer listed in the report. Five years ago Competency-Based Education (CBE) was not listed as one of the trends, but with today’s focus on the completion agenda, getting students to a degree faster has become a priority – and that means considering new online delivery models.

The five-year strategic plan may no longer make sense for ed tech. Perhaps by employing such practices as the twelve- to twenty-four-month strategic plan and ninety-day action planning cycles, we can improve institutional agility and realize our efforts to embrace change and foster innovation on the college campus.

Advancing a Culture of Innovation

According to the 2017 Horizon Report, “Advancing Cultures of Change and Innovation” is one of the long-term trends to watch for in Higher Ed over the next five years.

“It will require visionary leadership to build higher education environments that are equipped to quickly change processes and strategies as startups do. If these organizational models are designed well, universities can experience more efficient implementation of new practices and pedagogies.” -2016 Horizon Report

Changing Direction by Kapapuka Argazkiak on Flickr, CC-BY-NC-SA

The report references Eric Ries’s book, The Lean Startup (2011), as an example of an approach educators may employ to advance cultures of change on the college campus. The process is a business model for entrepreneurs to rapidly design and develop new ventures; it involves a cycle of deploying “lean” (less than fully developed) prototypes, followed by collecting feedback from consumers, which in turn informs the next step in the development and design process. Fully developed products and services may have undergone several iterations, oftentimes resulting in a final product that has changed significantly from the original prototype but has proven more attractive to the consumer.

This idea of rapidly cycling through numerous iterations is also used in DevOps – an approach to application development that brings together software developers (coders) and information technology (operations) around the goals of improving quality and shortening time to market. An important characteristic of the DevOps approach is its focus on cultural change.

The word “culture” comes from the Latin cultura, meaning to cultivate or prepare for growth. It seems to me this serves as an excellent metaphor for fostering change in the organization.

“When a college is undertaking a broader reform effort, a culture of inquiry can be used to define a framework for action, cultivate the engagement of a broad range of practitioners and identify discrete action steps at various levels of the institution.”

The Research and Planning Group (RPgroup) of California Community Colleges published a paper, Building a Culture of Inquiry: Using a Cycle of Exploring Research and Data to Improve Student Success (2010). The project was funded by Completion by Design and describes the use of an Applied Inquiry Framework: a cycle of evidence-based improvement consisting of five stages:

  1. Defining a focus of inquiry
  2. Gathering relevant and meaningful evidence
  3. Engaging a broad range of practitioners and exploring the evidence
  4. Translating collective insight into action
  5. Measuring the impact of action

This cycle of evidence can certainly be applied to advancing a campus culture of innovation. As an example, consider how the adoption of Open Educational Resources (OERs) may impact online student success. This would serve as stage one – defining our focus of inquiry. In stage two, we gather research about OERs and student success (e.g., A multi-institutional study of the impact of open textbook adoption on the learning outcomes of post-secondary students, Fischer, Hilton, & Wiley, 2015).

The third stage, where we bring together a broad range of practitioners to explore the evidence, is critical in advancing a culture of innovation. It is at this stage that we share insights and explore and challenge our collective beliefs and assumptions in an effort to get to the fourth stage, where we translate this collective insight into action.

In our scenario we would invite faculty who use OERs as well as those who are reluctant to adopt open texts for whatever reasons. Instructional designers, librarians, and others would be invited to the table as well to engage in discourse and inquiry. Unfortunately, in higher education we often work in isolation. Even our classrooms, both virtual and physical, are essentially closed environments. However, they could potentially become environments of inquiry and experimentation, where not only students learn, but the faculty and the campus community learn as well.

Stage four is where we test our assumptions and collect feedback and data. If we already have instructors using OERs, what do students think of the course and materials? Is there any data on student outcomes that we can compare to similar courses where the materials are not used? Such feedback need not be especially burdensome; the “lean” approach applies here, and a simple survey or focus group may provide enough information for the next stage.

The fifth stage, measuring the impact of our action, is not actually the final stage. In a culture of inquiry and innovation, the feedback we collect is used to inform subsequent iterations of our innovation. We may find that students appreciate access to the free digital text but are, in fact, printing out each chapter as the course progresses. How does this inform our next iteration? Should we consider offering a low-cost print alternative?

It seems to me the Applied Inquiry Framework is similar in many respects to Design-Based Research (DBR) – a qualitative research approach used in authentic educational settings. The goal of DBR is to learn about learning in real-world settings, which are often complex and unique environments. The virtual classroom is such a setting, and improving learning in the online environment is an iterative process. If our goal is to advance a culture of change and innovation, we will need to adopt an approach that fosters experimentation and to share with others what we are learning, even as we are learning it.

Virtual Office Hours

In their paper, “Using Virtual Office Hours to Enhance Student-Faculty Interaction,” Lei Li and Jennifer Pitts (2009) from Colorado State University found that students enrolled in courses that offered virtual office hours experienced a higher level of satisfaction than students enrolled in traditional on-campus courses. Apparently, just making virtual office hours an option for students, regardless of whether they availed themselves of the opportunity, was enough to increase students’ comfort level with the instructor and the online course.

Skype & Coffee by I. Keller on Flickr, CC-BY-NC-SA

Li, L., & Pitts, J. (2009). Does it really matter? Using virtual office hours to enhance student-faculty interaction. Journal of Information Systems Education, 20(2). http://jise.org/Volume20/20-2/Pdf/V20N2P175-abs.pdf