Sunday, 16 September 2012

What we don’t know about moving to distance education, and the challenge of comparing apples to apples

The Georgia legislature has been considering a bill that would require high school students to take online courses as a graduation requirement. Maureen Downey of the AJC had a piece in last Monday’s column which reported on a study by the National Education Policy Center at the University of Colorado about how well such requirements were faring: Minnesota, which has tripled its full-time virtual high school enrollment, found that online students scored lower on state tests and dropped out of school at higher rates; a quarter of online seniors dropped out, compared to only 3 percent of their peers.

A study of Colorado’s full-time cyber-students noted similar performance lags. Once in the virtual school, students scored lower on state reading exams, with scores declining the longer they were in the program. An analysis by the I-News Network and Education News Colorado found that Colorado’s virtual high schools produced three times more dropouts than graduates, the exact reverse of the state average, in which there were three graduates for every dropout.

Distance education is important to develop and explore, but we can’t realistically ask broad questions like, “Does distance education work?” Distance education (or virtual high schooling) is not just one thing. The evidence is strong that the Open University UK works, but it works because its courses are well-designed and well-tested. We don’t yet know enough about how to design distance education well, or about which factors influence its success. It is reasonable to ask about the impact of current practice, but in that case the specifics of the practice and the context of the study matter.
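To make the Colorado ratio reversal concrete, here is a minimal sketch in Python. The cohort counts are hypothetical, chosen only to reproduce the reported 3:1 ratios; the analysis itself does not give raw counts here.

    # Hypothetical cohort counts for illustration only -- these are NOT
    # figures from the I-News Network / Education News Colorado analysis.
    virtual_schools = {"dropouts": 300, "graduates": 100}
    state_average = {"dropouts": 100, "graduates": 300}

    def dropouts_per_graduate(cohort):
        """Ratio of dropouts to graduates in a cohort."""
        return cohort["dropouts"] / cohort["graduates"]

    print(f"Virtual schools: {dropouts_per_graduate(virtual_schools):.1f} dropouts per graduate")
    print(f"State average:   {dropouts_per_graduate(state_average):.1f} dropouts per graduate")
    # Virtual schools: 3.0 dropouts per graduate
    # State average:   0.3 dropouts per graduate

The point of the arithmetic is the scale of the reversal: three dropouts per graduate in the virtual schools against roughly one dropout per three graduates statewide, about an order of magnitude apart.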

In her blog, Downey recently considered the flaws of the 2009 US Department of Education meta-study on distance education programs. I had critiqued the meta-study earlier for ignoring drop-out rates. It turns out that the definition of a distance education “course” varied considerably in the 2009 report, and that all of the fully-online studies were conducted at universities, where students are much more motivated to complete than in high school or community college.

Nice try. But that study has serious flaws, especially as it pertains to community colleges. In “Effectiveness of Fully Online Courses for College Students: Response to a Department of Education Meta-Analysis,” Shanna Smith Jaggars and Thomas Bailey of the Community College Research Center at Columbia University point out that only 28 of the 99 studies examined in the Education Department report focused on courses that were fully online. Furthermore, only seven looked at semester-long courses, as opposed to short-term online programs on narrow topics, “such as how to use an Internet search engine.”

In other words, out of all the studies reviewed by the Education Department, only a handful dealt with the kind of fully online, semester-long courses that are being touted as a means of increasing college-completion rates. Even more alarming, for those of us on the front lines at community colleges, is the fact that all seven of those studies were conducted at midsize or large universities, five of which were rated as “selective” or “highly selective” by U.S. News & World Report. Those are not exactly the kinds of places that typically attract at-risk students—the ones least likely to complete their degrees. Community colleges do attract such students, and in large numbers.
