Thomas Ho

Previously, he was Professor of Computer and Information Technology at Indiana University Purdue University Indianapolis (IUPUI). From 1993 to 1994, he was a Senior Fellow in Information Systems and Computer Science at the National University of Singapore, and from 1990 to 1992 he was Director of the Information Networking Institute at Carnegie Mellon University. From 1978 to 1988, he was Head of the Department of Computer and Information Technology at Purdue University, which received the Data Processing Management Association's Four-Year Institution Award for undergraduate computer information systems programs. From 1986 to 1988, he was on loan from Purdue to serve as Executive Director of the INTELENET Commission, which pioneered the INdiana TELEcommunications NETwork. He received his BS, MS, and Ph.D. degrees in computer science from Purdue University.

ETC Publications

Should Students Have a Personal Brand?
Facebook Timeline: They’re Already Telling Us the Story of Their Life…

Franklin P. Schargel

Franklin Schargel, a native of Brooklyn, New York, now residing in Albuquerque, NM, is a graduate of the University of the City of New York. Franklin holds two master's degrees: one in secondary education from City University and one in school administration and supervision from Pace University. His career spans thirty-three years of classroom teaching and school counseling and eight years of school supervision and administration. In addition, Franklin taught a course in Dowling College's MBA program.

ETC Publications

Sound Bites Aren’t the Answer for Reform

Robert Plants

Dr. Plants serves as Assistant Dean, Director of Off-Campus Undergraduate Advising, and Assistant Professor. Bob teaches the School of Education's only technology classes: one online and one a live, lab-based course. When not performing his off-campus duties, he has served as the School of Education's "webmaster" and on its technology committee.

ETC Publications

Living in Glass Schoolhouses: 21st Century Teaching and Learning Is Much More Than Standards
Arne and Michelle vs. Larry: The Statistical Battle
Real Change with 21st Century Learning Communities
Computer Science – A Field of Dreams

Marc Prensky

Guest Author

Marc Prensky is an internationally acclaimed speaker, writer, consultant, and designer in the critical areas of education and learning. He is the author of three books: Teaching Digital Natives — Partnering for Real Learning (Corwin, 2010), Don’t Bother Me Mom — I’m Learning (Paragon House, 2005), and Digital Game-Based Learning (McGraw-Hill, 2001).

Marc is the founder and CEO of Games2train, whose clients include IBM, Nokia, Pfizer, the US Department of Defense, and the L.A. and Florida Virtual Schools.

Marc has created over 50 software games for learning, including the world’s first fast-action videogame-based training tools and world-wide, multi-player, multi-team on-line competitions. He has also taught at all levels. Marc has been featured in articles in The New York Times and The Wall Street Journal, has appeared on CNN, MSNBC, PBS, and the BBC, and was named as one of training’s top 10 “visionaries” by Training magazine. He holds graduate degrees from Yale (Teaching) and Harvard (MBA).

ETC Publications

Simple Changes in Current Practices May Save Our Schools

Jim Shimabukuro – Publications & Presentations

Selected Publications

“Innovate Blog: Need for More Discussion.” Innovate, the journal of online education, February/March 2009 (Volume 5, Issue 3).

“Innovate-Blog: A Step into Blog 2.0.” Innovate, the journal of online education, December 2008/January 2009 (Volume 5, Issue 2).

“Innovate-Ideagora: Introducing a New Feature in Innovate.” Alan McCord, Denise Easton, and James N. Shimabukuro. Innovate, the journal of online education, October/November 2008 (Volume 5, Issue 1).

“Freedom and Empowerment: An Essay on the Next Step for Education and Technology.” Innovate, the journal of online education, June/July 2005 (Volume 1, Issue 5).

“Rising Stars in Virtual Education: A Peek into 2010.” Technology Source, November/December 2002.

“The Evolving Virtual Conference: Implications for Professional Networking.” Technology Source, September/October 2000.

“What Is an Online Conference?” Technology Source, January/February 2000.

“How to Get the Most Out of an Online Conference.” TCC Worldwide Online Conference: Looking Back Towards the Future, April 7-9, 1999.

“How to Survive in an Online Class: Guidelines for Students.” First published at the Fourth Annual TCC Online Conference: Best Practices in Delivering, Supporting, and Managing Online Learning, April 7-9, 1999.

“CMC and Writing Instruction: A Future Scenario.” A chapter in volume 1 of Berge and Collins’ Computer-Mediated Communications and the Online Classroom (Cresskill, NJ: Hampton Press, 1995).

“Stimulating Learning with Electronic Guest Lecturing.” A chapter coauthored with Morton Cotlar in volume 3 of Berge and Collins’ Computer-Mediated Communications and the Online Classroom (Cresskill, NJ: Hampton Press, 1995).

Beyond the Classroom: International Education and the Community College. A four-volume series co-edited with Robert W. Franco (University of Hawaii, 1992).

Selected Presentations

“The Force Is with US—The Teachers: Freedom in the New Classroom.” Keynote presentation. Ninth Annual TCC 2004 Online Conference, Apr. 20, 2004.

“The Evolving Virtual Conference: Trends in an Emerging Medium for Professional Networking.” Keynote presentation. GATE 2000 International Virtual Conference. June 15-16, 2000.

“Teaching a Required Freshman Course Online: Implications for Distance Education.” Presentation. Third Seminar for Presidents of Junior and Community Colleges, June 16, 1997, at the East-West Center, University of Hawaii at Manoa.

David G. Lebow

President, HyLighter, Inc.
ETC Guest Author

USDE 2009 Report on Effectiveness of Online Learning

The following excerpts are from Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies (Washington, D.C., 2009), conducted by the U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. It was prepared by Barbara Means, Yukie Toyama, Robert Murphy, Marianne Bakia, and Karla Jones.


A systematic search of the research literature from 1996 through July 2008 identified more than a thousand empirical studies of online learning. Analysts screened these studies to find those that (a) contrasted an online to a face-to-face condition, (b) measured student learning outcomes, (c) used a rigorous research design, and (d) provided adequate information to calculate an effect size. As a result of this screening, 51 independent effects were identified that could be subjected to meta-analysis. The meta-analysis found that, on average, students in online learning conditions performed better than those receiving face-to-face instruction. The difference between student outcomes for online and face-to-face classes—measured as the difference between treatment and control means, divided by the pooled standard deviation—was larger in those studies contrasting conditions that blended elements of online and face-to-face instruction with conditions taught entirely face-to-face. Analysts noted that these blended conditions often included additional learning time and instructional elements not received by students in control conditions. This finding suggests that the positive effects associated with blended learning should not be attributed to the media, per se. An unexpected finding was the small number of rigorous published studies contrasting online and face-to-face learning conditions for K–12 students. In light of this small corpus, caution is required in generalizing to the K–12 population because the results are derived for the most part from studies in other settings (e.g., medical training, higher education).
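The effect-size metric described above is a standardized mean difference (a Cohen's-d-style statistic): the difference between treatment and control means divided by the pooled standard deviation. A minimal sketch of the calculation follows; the exam scores and section labels are hypothetical illustrations, not data from the studies in the meta-analysis.

```python
import statistics

def effect_size(treatment, control):
    """Standardized mean difference: (treatment mean - control mean)
    divided by the pooled (sample) standard deviation."""
    m_t, m_c = statistics.mean(treatment), statistics.mean(control)
    s_t, s_c = statistics.stdev(treatment), statistics.stdev(control)
    n_t, n_c = len(treatment), len(control)
    # Pool the two sample variances, weighting by degrees of freedom.
    pooled_sd = (((n_t - 1) * s_t**2 + (n_c - 1) * s_c**2)
                 / (n_t + n_c - 2)) ** 0.5
    return (m_t - m_c) / pooled_sd

# Hypothetical exam scores for an online (treatment) and a
# face-to-face (control) section of the same course:
online = [78, 85, 90, 72, 88, 81]
face_to_face = [70, 80, 84, 68, 82, 75]
print(round(effect_size(online, face_to_face), 2))  # → 0.88
```

A positive value favors the online condition; the report's overall mean of +0.24 is an average of many such per-study contrasts.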


Executive Summary

Online learning—for students and for teachers—is one of the fastest growing trends in educational uses of technology. The National Center for Education Statistics (2008) estimated that the number of K-12 public school students enrolling in a technology-based distance education course grew by 65 percent in the two years from 2002-03 to 2004-05. On the basis of a more recent district survey, Picciano and Seaman (2009) estimated that more than a million K–12 students took online courses in school year 2007–08.

Online learning overlaps with the broader category of distance learning, which encompasses earlier technologies such as correspondence courses, educational television and videoconferencing. Earlier studies of distance learning concluded that these technologies were not significantly different from regular classroom learning in terms of effectiveness. Policy-makers reasoned that if online instruction is no worse than traditional instruction in terms of student outcomes, then online education initiatives could be justified on the basis of cost efficiency or need to provide access to learners in settings where face-to-face instruction is not feasible. The question of the relative efficacy of online and face-to-face instruction needs to be revisited, however, in light of today’s online learning applications, which can take advantage of a wide range of Web resources, including not only multimedia but also Web-based applications and new collaboration technologies. These forms of online learning are a far cry from the televised broadcasts and videoconferencing that characterized earlier generations of distance education. Moreover, interest in hybrid approaches that blend in-class and online activities is increasing. Policy-makers and practitioners want to know about the effectiveness of Internet-based, interactive online learning approaches and need information about the conditions under which online learning is effective.

The findings presented here are derived from (a) a systematic search for empirical studies of the effectiveness of online learning and (b) a meta-analysis of those studies from which effect sizes that contrasted online and face-to-face instruction could be extracted or estimated. A narrative summary of studies comparing different forms of online learning is also provided.

These activities were undertaken to address four research questions:
1. How does the effectiveness of online learning compare with that of face-to-face instruction?
2. Does supplementing face-to-face instruction with online instruction enhance learning?
3. What practices are associated with more effective online learning?
4. What conditions influence the effectiveness of online learning?

This meta-analysis and review of empirical online learning research are part of a broader study of practices in online learning being conducted by SRI International for the Policy and Program Studies Service of the U.S. Department of Education. The goal of the study as a whole is to provide policy-makers, administrators and educators with research-based guidance about how to implement online learning for K–12 education and teacher preparation. An unexpected finding of the literature search, however, was the small number of published studies contrasting online and face-to-face learning conditions for K–12 students. Because the search encompassed the research literature not only on K–12 education but also on career technology, medical and higher education, as well as corporate and military training, it yielded enough studies with older learners to justify a quantitative meta-analysis. Thus, analytic findings with implications for K–12 learning are reported here, but caution is required in generalizing to the K–12 population because the results are derived for the most part from studies in other settings (e.g., medical training, higher education).

This literature review and meta-analysis differ from recent meta-analyses of distance learning in that they

• Limit the search to studies of Web-based instruction (i.e., eliminating studies of video- and audio-based telecourses or stand-alone, computer-based instruction);
• Include only studies with random-assignment or controlled quasi-experimental designs; and
• Examine effects only for objective measures of student learning (e.g., discarding effects for student or teacher perceptions of learning or course quality, student affect, etc.).

This analysis and review distinguish between instruction that is offered entirely online and instruction that combines online and face-to-face elements. The first of the alternatives to classroom-based instruction, entirely online instruction, is attractive on the basis of cost and convenience as long as it is as effective as classroom instruction. The second alternative, which the online learning field generally refers to as blended or hybrid learning, needs to be more effective than conventional face-to-face instruction to justify the additional time and costs it entails. Because the evaluation criteria for the two types of learning differ, this meta-analysis presents separate estimates of mean effect size for the two subsets of studies.

Key Findings

The main finding from the literature review was that

• Few rigorous research studies of the effectiveness of online learning for K–12 students have been published. A systematic search of the research literature from 1994 through 2006 found no experimental or controlled quasi-experimental studies comparing the learning effects of online versus face-to-face instruction for K–12 students that provide sufficient data to compute an effect size. A subsequent search that expanded the time frame through July 2008 identified just five published studies meeting meta-analysis criteria.

The meta-analysis of 51 study effects, 44 of which were drawn from research with older learners, found that

• Students who took all or part of their class online performed better, on average, than those taking the same course through traditional face-to-face instruction. Learning outcomes for students who engaged in online learning exceeded those of students receiving face-to-face instruction, with an average effect size of +0.24 favoring online conditions. The mean difference between online and face-to-face conditions across the 51 contrasts is statistically significant at the p < .01 level. Interpretations of this result, however, should take into consideration the fact that online and face-to-face conditions generally differed on multiple dimensions, including the amount of time that learners spent on task. The advantages observed for online learning conditions therefore may be the product of aspects of those treatment conditions other than the instructional delivery medium per se.

• Instruction combining online and face-to-face elements had a larger advantage relative to purely face-to-face instruction than did purely online instruction. The mean effect size in studies comparing blended with face-to-face instruction was +0.35, p < .001. This effect size is larger than that for studies comparing purely online and purely face-to-face conditions, which had an average effect size of +0.14, p < .05. An important issue to keep in mind in reviewing these findings is that many studies did not attempt to equate (a) all the curriculum materials, (b) aspects of pedagogy and (c) learning time in the treatment and control conditions. Indeed, some authors asserted that it would be impossible to have done so. Hence, the observed advantage for online learning in general, and blended learning conditions in particular, is not necessarily rooted in the media used per se and may reflect differences in content, pedagogy and learning time.

• Studies in which learners in the online condition spent more time on task than students in the face-to-face condition found a greater benefit for online learning. The mean effect size for studies with more time spent by online learners was +0.46 compared with +0.19 for studies in which the learners in the face-to-face condition spent as much time or more on task (Q = 3.88, p < .05).

• Most of the variations in the way in which different studies implemented online learning did not affect student learning outcomes significantly. Analysts examined 13 online learning practices as potential sources of variation in the effectiveness of online learning compared with face-to-face instruction. Of those variables, (a) the use of a blended rather than a purely online approach and (b) the expansion of time on task for online learners were the only statistically significant influences on effectiveness. The other 11 online learning practice variables that were analyzed did not affect student learning significantly. However, the relatively small number of studies contrasting learning outcomes for online and face-to-face instruction that included information about any specific aspect of implementation impeded efforts to identify online instructional practices that affect learning outcomes.

• The effectiveness of online learning approaches appears quite broad across different content and learner types. Online learning appeared to be an effective option for both undergraduates (mean effect of +0.35, p < .001) and for graduate students and professionals (+0.17, p < .05) in a wide range of academic and professional studies. Though positive, the mean effect size is not significant for the seven contrasts involving K–12 students, but the number of K–12 studies is too small to warrant much confidence in the mean effect estimate for this learner group. Three of the K–12 studies had significant effects favoring a blended learning condition, one had a significant negative effect favoring face-to-face instruction, and three contrasts did not attain statistical significance. The test for learner type as a moderator variable was nonsignificant. No significant differences in effectiveness were found that related to the subject of instruction.

• Effect sizes were larger for studies in which the online and face-to-face conditions varied in terms of curriculum materials and aspects of instructional approach in addition to the medium of instruction. Analysts examined the characteristics of the studies in the meta-analysis to ascertain whether features of the studies’ methodologies could account for obtained effects. Six methodological variables were tested as potential moderators: (a) sample size, (b) type of knowledge tested, (c) strength of study design, (d) unit of assignment to condition, (e) instructor equivalence across conditions, and (f) equivalence of curriculum and instructional approach across conditions. Only equivalence of curriculum and instruction emerged as a significant moderator variable (Q = 5.40, p < .05). Studies in which analysts judged the curriculum and instruction to be identical or almost identical in online and face-to-face conditions had smaller effects than those studies where the two conditions varied in terms of multiple aspects of instruction (+0.20 compared with +0.42, respectively). Instruction could differ in terms of the way activities were organized (for example as group work in one condition and independent work in another) or in the inclusion of instructional resources (such as a simulation or instructor lectures) in one condition but not the other.
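Several of the findings above rest on Q statistics for moderator tests (e.g., Q = 3.88 for time on task, Q = 5.40 for curriculum equivalence). As a hedged illustration of how a between-groups Q can be computed under one common fixed-effect formulation (not necessarily the exact model the report's analysts used), the sketch below compares two subgroups of hypothetical per-study effect sizes and standard errors; none of these numbers come from the report.

```python
def pooled_mean(effects):
    """Inverse-variance weighted mean of (effect_size, standard_error) pairs."""
    weights = [1 / se**2 for _, se in effects]
    return sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)

def q_between(groups):
    """Between-groups Q: each subgroup's total weight times the squared
    deviation of its pooled mean from the overall pooled mean.
    Larger Q means the subgroup means differ more than chance would suggest."""
    all_studies = [s for g in groups for s in g]
    overall = pooled_mean(all_studies)
    q = 0.0
    for g in groups:
        w_g = sum(1 / se**2 for _, se in g)
        q += w_g * (pooled_mean(g) - overall) ** 2
    return q

# Hypothetical (effect size, standard error) pairs for studies where
# online learners had more time on task vs. equal time on task:
more_time = [(0.50, 0.20), (0.42, 0.25), (0.46, 0.30)]
equal_time = [(0.22, 0.20), (0.15, 0.25), (0.20, 0.30)]
print(round(q_between([more_time, equal_time]), 2))
```

With one degree of freedom (two subgroups), the resulting Q would be compared against a chi-square distribution to decide whether the moderator is statistically significant.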

The narrative review of experimental and quasi-experimental studies contrasting different online learning practices found that the majority of available studies suggest the following:

• Blended and purely online learning conditions implemented within a single study generally result in similar student learning outcomes. When a study contrasts blended and purely online conditions, student learning is usually comparable across the two conditions.

• Elements such as video or online quizzes do not appear to influence the amount that students learn in online classes. The research does not support the use of some frequently recommended online learning practices. Inclusion of more media in an online application does not appear to enhance learning. The practice of providing online quizzes does not seem to be more effective than other tactics such as assigning homework.

• Online learning can be enhanced by giving learners control of their interactions with media and prompting learner reflection. Studies indicate that manipulations that trigger learner activity or learner reflection and self-monitoring of understanding are effective when students pursue online learning as individuals.

• Providing guidance for learning for groups of students appears less successful than does using such mechanisms with individual learners. When groups of students are learning together online, support mechanisms such as guiding questions generally influence the way students interact, but not the amount they learn.


In recent experimental and quasi-experimental studies contrasting blends of online and face-to-face instruction with conventional face-to-face classes, blended instruction has been more effective, providing a rationale for the effort required to design and implement blended approaches. Even when used by itself, online learning appears to offer a modest advantage over conventional classroom instruction.

However, several caveats are in order: Despite what appears to be strong support for online learning applications, the studies in this meta-analysis do not demonstrate that online learning is superior as a medium. In many of the studies showing an advantage for online learning, the online and classroom conditions differed in terms of time spent, curriculum and pedagogy. It was the combination of elements in the treatment conditions (which was likely to have included additional learning time and materials as well as additional opportunities for collaboration) that produced the observed learning advantages. At the same time, one should note that online learning is much more conducive to the expansion of learning time than is face-to-face instruction.

In addition, although the types of research designs used by the studies in the meta-analysis were strong (i.e., experimental or controlled quasi-experimental), many of the studies suffered from weaknesses such as small sample sizes; failure to report retention rates for students in the conditions being contrasted; and, in many cases, potential bias stemming from the authors’ dual roles as experimenters and instructors.

Finally, the great majority of estimated effect sizes in the meta-analysis are for undergraduate and older students, not elementary or secondary learners. Although this meta-analysis did not find a significant effect by learner type, when learners’ age groups are considered separately, the mean effect size is significantly positive for undergraduate and other older learners but not for K–12 students.

Another consideration is that various online learning implementation practices may have differing effectiveness for K–12 learners than they do for older students. It is certainly possible that younger students could benefit more from a different degree of teacher or computer-based guidance than would college students and older learners. Without new random assignment or controlled quasi-experimental studies of the effects of online learning options for K–12 students, policy-makers will lack scientific evidence of the effectiveness of these emerging alternatives to face-to-face instruction.


U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies, Washington, D.C., 2009.