Category Archives: Persistence

Persistence is a learner's continued pursuit of their educational goals until those goals are achieved. It has been said that retention is the role of the institution and persistence is the role of the student. The better we understand persistence, the better we can support student success.

Equity, inclusion and the use of Open Educational Resources (OER)

Sports teams switch sides at half-time to ensure neither team has an unfair advantage in the event one side of the field slopes more than the other, one team faces the sun or runs against the wind, and so on. This is where the need to “level the playing field” comes from.

Men playing soccer (CC-BY Ninian Reid on Flickr)

For many low-income, minority, and part-time students the playing field may not be all that level. The cost of attending college takes a larger bite out of a low-income household budget, and it includes more than tuition, fees, and textbooks: attending school, even part-time, may mean fewer hours available for employment, increased transportation costs, child-care expenses, and more.

A recent study at the University of Georgia, Athens (UGA) looked at outcomes for students taking courses that used traditional commercial textbooks versus open educational resources (OER). The study considered student income, ethnicity, gender, and full- versus part-time status.

This research suggests OER is an equity strategy for higher education: providing all students with access to course materials on the first day of class serves to level the academic playing field in course settings. – Colvard & Watson

Researchers Nicholas B. Colvard (University of Georgia) and C. Edward Watson (Association of American Colleges and Universities) looked at learning outcomes for 21,822 UGA students, with 10,141 enrolled in courses using OER and 11,681 in courses using traditional commercial textbooks. While previous studies have investigated OER and student achievement, they did not break out the results by household income, ethnicity, or part-time status.

For the purposes of the UGA study, income status was defined by Pell Grant eligibility. The study found that students enrolled in OER sections (both Pell-eligible and non-Pell-eligible) earned significantly higher final grades. For Pell-eligible students enrolled in OER courses, moreover, the average final grade was significantly higher than for non-Pell-eligible students in non-OER courses. Similar results were found for D, F, and withdrawal (DFW) rates: both Pell and non-Pell student groups using OER experienced lower DFW rates than those using commercial texts.

Both white and non-white student groups enrolled in courses using OER saw improved grades and lower DFW rates. In the case of part-time versus full-time status, part-time students enrolled in OER courses showed significantly more improvement in both final grades and DFW rates than their full-time counterparts.

Admittedly, the results of this study, completed at a large research university, cannot simply be generalized to the average community college. Nevertheless, the demographics of the students who benefited most from OER in this study match those of a large percentage of community college students. These results hold promise that OER may help level the playing field for many underrepresented students.

References:

Colvard, N. B., & Watson, C. E. (2018). The impact of open educational resources on various student success metrics. International Journal of Teaching and Learning in Higher Education, 30(2). http://www.isetl.org/ijtlhe/pdf/IJTLHE3386.pdf


Closing the gap between online and classroom student outcomes

For the past few years community colleges have shifted their focus from one of access to one of completion. Offering online programming is a great way to provide access to higher education but closing the gap between online and classroom student outcomes is an ongoing challenge.

Student studying at a computer – eLearning (CC-BY Wolfgang Greller on Flickr)

Efforts to bridge the gap have mostly centered on learner characteristics: GPA and SAT scores, whether students have previous experience with online learning, their capacity for self-regulated learning (self-efficacy, time management, organizational skills), and so on. Although these can be helpful indicators in predicting online student achievement, another approach that offers promise focuses on the at-risk course.

Ferris State University has offered Structured Learning Assistance (SLA) for the past 25 years, focusing not on the at-risk student but on the course at high risk for failure. Although SLA has not been offered as an online option, its impact on student success has consistently meant pass rates more than 10% higher than in sections without supplemental instruction. By focusing on courses with a history of high failure and withdrawal rates, SLA has been able to support students with an additional 45 hours of supplemental instruction.

A recent study at Borough of Manhattan Community College (CUNY) looked at online course-level predictors of learning outcomes (Wladis et al., 2015). The study found a significant gap in course completion between online courses taken as electives and those required for a given major; lower-level courses also had a much higher attrition rate than higher-level courses. The study suggests that interventions such as embedded supplemental instructional support (tutoring, mentoring, advising, extra technical assistance) within the more challenging courses could significantly improve – and possibly even eliminate – the performance gap between online and face-to-face outcomes.

A few years ago I attended the Online Learning Consortium (OLC) conference and sat in on a panel presentation describing strategies to keep students enrolled in online learning. The panelists were faculty and staff from Penn State World Campus. The strategies included embedding tutors in some of the more challenging online courses; a single embedded tutor might support as many as eight sections of a course.

Tutors met virtually with students one-on-one or in groups by appointment as many as five or six days a week, including evening and weekend hours. The tutors also scheduled “drop-in sessions” to go over some of the more challenging concepts, answer questions, and provide more detail on upcoming course assignments. In addition to the virtual meetings, tutors posted helpful tips on study skills and supplemental web resources. The goal of the initiative was to increase retention by 2% per year over a five-year period; the results, however, showed a 75% reduction in withdrawals and late drops and a 15% reduction in course failure rates.

Implementing an online supplemental instruction program may sound like a big undertaking, but the most challenging online courses are a great place to begin.

References:

Wladis, C., Conway, K., & Hachey, A. C. (2015). Using course-level factors as predictors of online course outcomes: A multilevel analysis at a U.S. urban community college. Studies in Higher Education, 42(1). Taylor & Francis Online. http://www.tandfonline.com/doi/abs/10.1080/03075079.2015.1045478

Keeping score at a relay race for the Melrose Running Club

Measuring Student Success

For many years, community colleges have focused on providing access to everyone seeking a post-secondary education. More recently the narrative has changed to completion and how long it takes the student to complete. The literature tells us the longer a student is in college, the less likely she is to complete. Completion rates for first-time, full-time, degree-seeking students are low – especially for community college students. Generally speaking, when we measure student success, what we are talking about is completion rates.

The fact is many – if not most – of our students are not first-time, full-time, degree-seeking students. Many have already completed a college education, some having attained bachelor's and master's degrees, before coming back to start over in a new career. More and more of our students are enrolling part-time. Others are not seeking a degree at all but are instead looking to update their job skills by taking a course or two. How do we measure student success for these learners?

California Community Colleges studied these non-completers over a two-year period. Referred to as Skills Builders, these students enrolled in a limited number of courses for the express purpose of enhancing their job skills or moving up within their careers. The study found that the average Skills Builder improved their salary by 13.6% – an average of $4,300 per year! Clearly these students were successful in meeting their educational goals.

EDUCAUSE Review recently published a collection of short essays entitled Student Success: Mission Critical. In the introduction, John O’Brien says, “If students don’t succeed, colleges and universities don’t succeed. Our full attention must be concentrated on the mission-critical goal of helping students define – and meet – their educational goals.”

To my way of thinking, this is what we should be talking about when we talk about student success: “helping students define – and meet – their educational goals”.

Regardless of whether students come to college to take a few courses to improve their employability, complete a program of study, or transfer to university, achieving their educational goals requires persistence. We often hear the terms “persistence” and “retention” used synonymously. The National Center for Education Statistics (NCES), however, differentiates between the terms: “retention” is an institutional measure and “persistence” is a student measure. In other words, institutions retain and students persist.

This is an important distinction when it comes to measuring student success. As an institution, we can measure retention: did the student drop or complete? Has the student continued their program of study? Whether the learner persists, however, is really up to the student. What we can, and should, do is create a learning environment that encourages student persistence.

“Tinto posits that students are more likely to remain enrolled in an institution if they become connected to the social and academic life of that institution” – Community College Research Center (CCRC).

The CCRC study found that community college students who make academic connections… “create a sense of belonging and attachment that seems to encourage persistence” – CCRC.

In Student Success: Mission Critical, George Mehaffy reminds us not to “avoid the academic heart of the enterprise… But the core of the enterprise is the curriculum and particularly the classroom. Some people avoid tackling that area because it is likely the most difficult. However, substantive change in student success outcomes must include attention to what happens to students in classrooms and in their academic journeys.”

We can measure student success in different ways: retention is measured by the quantity of students who continue and complete, but persistence is measured by the quality of the student’s experience – whether they feel they belong here and how much we care.

References:

http://www.aacc.nche.edu/AboutCC/Trends/Pages/completion_report.aspx

https://www.insidehighered.com/news/2016/03/04/california-community-colleges-find-new-way-measure-success-noncompleters

http://er.educause.edu/articles/2017/5/student-success-mission-critical

http://ccrc.tc.columbia.edu/media/k2/attachments/exploration-tintos-integration-framework.pdf

 

Using LMS Data to Improve Self-regulated Learning

A recent study examined learning management system (LMS) log files to look at course interactions for 530 online students and found that the students’ self-regulated study habits significantly influenced course achievement. The study focused on interactions related to self-regulated learning, including habits such as maintaining a regular study schedule, submitting assignments on time, logging in to the course frequently, and showing evidence of reading course content (You, 2016).

It may seem obvious that students who possess good study habits are more likely to succeed in the online learning environment (or any learning environment, for that matter). For me, however, the important take-away from the study is the potential for leveraging the data collection and early-alert functionality within our LMS to reinforce self-regulated learning habits with those students who may be at risk of dropping or failing their online course.
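
To make this concrete, here is a minimal sketch of what mining an LMS activity export for these indicators might look like. The file layout, column names, and event types (student_id, event, timestamp, due_date) are hypothetical – they are not from the You (2016) study or any particular LMS – but the two indicators computed (days with at least one login and on-time submission rate) mirror the habits the study identified.

```python
# Illustrative sketch only: the CSV columns below are hypothetical, not taken
# from the You (2016) study or any specific LMS export format.
import csv
from collections import defaultdict

def summarize_activity(log_path):
    """Compute simple self-regulation indicators per student from an activity log."""
    login_days = defaultdict(set)   # student_id -> dates with at least one login
    submissions = defaultdict(int)  # student_id -> total assignment submissions
    late = defaultdict(int)         # student_id -> submissions after the due date

    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            sid = row["student_id"]
            if row["event"] == "login":
                login_days[sid].add(row["timestamp"][:10])  # keep the YYYY-MM-DD part
            elif row["event"] == "submission":
                submissions[sid] += 1
                if row["timestamp"] > row["due_date"]:  # ISO timestamps compare as strings
                    late[sid] += 1

    report = {}
    for sid in set(login_days) | set(submissions):
        on_time = 1 - late[sid] / submissions[sid] if submissions[sid] else None
        report[sid] = {"days_active": len(login_days[sid]), "on_time_rate": on_time}
    return report
```

The point is not the code itself but that the raw material for these indicators already sits in the LMS logs.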

The Retention Center is an early-alert feature included in our campus LMS, Blackboard Learn. The Retention Center comes set up with four default rules: Course Access, Activity, Grades, and Missed Deadlines. Although these rules align fairly well with the study habits identified in the study, we are able to customize them, as well as design new rules to add to the Risk Table, to further support and reinforce good study habits.

Screenshot: Blackboard Retention Center – rules customization

The Course Access and Activity rules align with the login-frequency habit. The default Course Access rule flags students five days after their last access; in light of the research, I suggest shortening that to two or three days. The default User Activity rule flags activity twenty percent below the course average over a week – again, I would suggest shortening that window to three or four days. Monitoring timely submission of assignments and the reading of course content depends on setting up corresponding columns in the grade book. By developing assignments that require students to review material (possibly video content) and then complete a related assessment, grade alerts can be triggered in the corresponding grade book column. By tweaking the default rules or adding new rules, the instructor can quickly identify students who display poor study habits and immediately reach out to reinforce good habits that support student success.
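
As a rough illustration of what those two rules amount to, the sketch below flags students using the tweaked thresholds suggested above – no access in the last three days, or activity more than twenty percent below the class average. It is a stand-alone approximation of the logic, not Blackboard’s actual Retention Center configuration or API.

```python
# A minimal sketch of the course-access and activity rules described above;
# the thresholds mirror the suggested tweaks, not Blackboard's defaults or API.
from statistics import mean

def flag_at_risk(students, days_since_access_limit=3, activity_shortfall=0.20):
    """Return ids of students who trip either the course-access or the activity rule.

    `students` maps id -> {"days_since_access": int, "weekly_activity": float}.
    """
    class_avg = mean(s["weekly_activity"] for s in students.values())
    flagged = []
    for sid, s in students.items():
        stale = s["days_since_access"] > days_since_access_limit
        inactive = s["weekly_activity"] < (1 - activity_shortfall) * class_avg
        if stale or inactive:
            flagged.append(sid)
    return flagged

# Example: student "b" has not logged in for five days and is well below average.
roster = {
    "a": {"days_since_access": 1, "weekly_activity": 42},
    "b": {"days_since_access": 5, "weekly_activity": 12},
    "c": {"days_since_access": 2, "weekly_activity": 38},
}
print(flag_at_risk(roster))  # -> ['b']
```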

In addition to the Retention Center, the Performance Dashboard tool can be used to view the content items the student has accessed along with the number and length of posts a student has submitted to course discussion forums. By requiring students to review content and submit substantive posts within a discussion forum, the instructor can encourage the reading of course content – another study habit predictive of student achievement. Encouraging students to subscribe to the discussion forums can further support regular and substantive interaction with classmates.
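
For those who prefer to work from a forum export rather than the dashboard, a simple count of posts and average post length per student is enough to spot who is not engaging. The CSV columns and thresholds below are hypothetical and only illustrate the idea; the Performance Dashboard surfaces the same counts directly in the interface.

```python
# Illustrative only: the export format and thresholds here are hypothetical.
import csv
from collections import defaultdict

def low_forum_engagement(posts_csv, min_posts=2, min_avg_words=50):
    """Flag students whose discussion-forum participation falls below simple thresholds."""
    post_count = defaultdict(int)
    word_count = defaultdict(int)
    with open(posts_csv, newline="") as f:
        for row in csv.DictReader(f):
            sid = row["student_id"]
            post_count[sid] += 1
            word_count[sid] += len(row["post_text"].split())

    flagged = []
    for sid in post_count:
        if post_count[sid] < min_posts or word_count[sid] / post_count[sid] < min_avg_words:
            flagged.append(sid)
    # Students with no posts at all will not appear in the export and need a separate check.
    return flagged
```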

In a survey of unsuccessful online students at Monroe Community College, students reported that the number one challenge they experienced in their online courses was, “I got behind and it was too hard to catch up” (Fetzner, 2013). By designing online courses that leverage the LMS analytics features to identify and support at-risk students within the first few weeks, new-to-online students can develop the skills and habits required to be successful in the online environment.

References:

You, J. W. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. The Internet and Higher Education, 29, 23-30.

Fetzner, M. (2013). What do unsuccessful online students want us to know? Journal of Asynchronous Learning Networks, 17(1), 13-27.

Virtual Office Hours

In their paper, “Using Virtual Office Hours to Enhance Student-Faculty Interaction”, Lei Li and Jennifer Pitts (2009) of Columbus State University found that students enrolled in courses offering virtual office hours experienced a higher level of satisfaction than students enrolled in traditional on-campus courses. Apparently, just making virtual office hours an option for students, regardless of whether they availed themselves of the opportunity, was enough to increase the students’ comfort level with the instructor and the online course.

Skype & Coffee by I. Keller on Flickr, CC-BY-NC-SA

Li, L., & Pitts, J. (2009). Does it really matter? Using virtual office hours to enhance student-faculty interaction. Journal of Information Systems Education, 20(2). http://jise.org/Volume20/20-2/Pdf/V20N2P175-abs.pdf

 

What can we learn from the unsuccessful online student?

We have quite a bit of information on what it takes to be a successful online student but we may also be able to learn a few things from a couple of studies focusing on the unsuccessful online student.

Great Expectations by Greg Myers on Flickr, CC-BY-NC-SA

Monroe Community College in Rochester, NY, surveyed 438 unsuccessful online students over a ten-year span about their online learning experience. “Unsuccessful students” were defined as those who either failed or withdrew from the online course (Fetzner, 2013).

Online course retention rates are, on average, between 5 and 10% lower than those of on-campus courses. Findings from the study show that the group of students least likely to complete was the first-time, full-time students, with a 32% difference in the success rate. The top three reasons the unsuccessful students gave for dropping or failing their courses were falling behind in coursework, personal problems, and conflicts with work and/or family responsibilities (Fetzner, 2013).

In another study examining student engagement and online persistence, researchers from the University of Georgia collected data on 423 students enrolled in thirteen online course sections over three semesters. The withdrawal rate for the asynchronous online courses was very high at 32%. Of those students who stayed enrolled, only 75% successfully completed the course (Morris, Finnegan, & Wu, 2005).

Findings from the University of Georgia study (2005) indicated that unsuccessful completers (those finishing the course with a D or F grade) were much less likely to participate or engage meaningfully in course discussions and postings.

It is apparent from both studies that students may have very unrealistic expectations of what it takes to succeed going into the online learning environment. I have heard anecdotally on numerous occasions of students who have enrolled in an online course for the first time because they thought it would be “easier”.

Our campus is developing an “Introduction to Distance Learning” module in our LMS for students enrolling in an online or blended course for the first time at our college. The moment they register, they will be automatically enrolled in the module, and an email message will alert them that they need to complete it before their online course begins. The purpose of the module is to help students better understand what to expect in an online course with regard to organizational skills, time on task, the amount of reading and writing required, and their access to, and comfort with, technology.

I would be interested to hear what other colleges have done to help mitigate the unrealistic expectations of first-time online students.

References:

Fetzner, M. (2013). What do unsuccessful online students want us to know? Journal of Asynchronous Learning Networks, 17(1), 13-27.

Morris, L., Finnegan, C., & Wu, S. (2005). Tracking student behavior, persistence, and achievement in online courses. The Internet and Higher Education, 8, 221-231.

 

Approaching the finishing line… by Sumeet Mulani on Flickr

Can OER improve time-to-degree?

I recently overheard a student complain about being required to purchase a new textbook for their business course. The book cost about one hundred dollars new, and they were hoping to save money by purchasing used. Unfortunately, used wasn’t an option, as the text included an access code for ancillary publisher materials made available online. Since the access codes are non-transferable, only new texts are made available in the bookstore for students enrolling in the class. Another student said they were hoping to get by without the text, as it was “too expensive”. Although I don’t know how frequently this happens, it’s not the first time I’ve heard of students trying to manage without the required textbooks for their college courses.

In a recent experimental study published in the Journal of Computing in Higher Education, researchers found that students using open educational resources (OER), including open textbooks, performed as well as or better than students using commercial textbooks with regard to completion rates and final grades (Fischer et al., 2015).

The study included a sample of 16,727 students from four universities and six community colleges, with 4,909 students in the “treatment condition” using OER for their courses and 11,818 in the control group using commercial textbooks.

The study compared student course completion, passing grades, and the number of credits students took during the semester. Results indicated that completion rates were approximately the same for both the control and treatment groups, with the exception of a couple of courses in which attrition was somewhat higher for students in the control group. Grades were also mostly similar for students in both groups across the majority of courses, although in a few courses students using the commercial text scored somewhat higher.

The most significant difference between the treatment and control groups was in credit load: students in the treatment condition averaged 13.29 credit hours, while the control group averaged 11.14. It may be that the savings students realized by using OER instead of commercial texts freed up resources for tuition.

It would be interesting to know some of the details in the cases where students using OER outperformed those using commercial texts. Perhaps these are the students who would otherwise have tried to get by without a “too expensive” textbook and later in the semester dropped or failed. Regardless of the reason, students using OER stayed the course, and by enrolling in more courses may indeed cross the finish line that much earlier.

References:

Fischer, L., Hilton III, J., Robinson, T. J., & Wiley, D. A. (2015). A multi-institutional study of the impact of open textbook adoption on the learning outcomes of post-secondary students. Journal of Computing in Higher Education. Springer. Retrieved from http://link.springer.com/article/10.1007%2Fs12528-015-9101-x