Learning Online
At a time when more and more of what people learn both in formal courses
and in everyday life is mediated by technology, Learning Online provides a
much-needed guide to different forms and applications of online learning. This
book describes how online learning is being used in both K-12 and higher
education settings as well as in learning outside of school. Particular online
learning technologies, such as MOOCs (massive open online courses), multi-
player games, learning analytics, and adaptive online practice environments,
are described in terms of design principles, implementation, and contexts
of use.
Learning Online synthesizes research findings on the effectiveness of
different types of online learning, but a major message of the book is that
student outcomes arise from the joint influence of implementation, context,
and learner characteristics interacting with technology—not from technology
alone. The book describes available research about how best to implement
different forms of online learning for specific kinds of students, subject areas,
and contexts.
Building on available evidence regarding practices that make online and
blended learning more effective in different contexts, Learning Online draws
implications for institutional and state policies that would promote judicious
uses of online learning and effective implementation models. This in-depth
research work concludes with a call for an online learning implementation
research agenda, combining education institutions and research partners in
a collaborative effort to generate and share evidence on effective practices.
Dr. Barbara Means directs the Center for Technology in Learning at SRI
International. Dr. Means is an educational psychologist whose research focuses
on ways to use technology to support students’ learning of advanced skills
and the revitalization of classrooms and schools. A fellow of the American
Educational Research Association, she is regarded as a leader in defining
issues and approaches for evaluating the implementation and efficacy of
technology-supported educational innovations.
Dr. Marianne Bakia is a senior social science researcher with SRI
International’s Center for Technology in Learning, where she leads research
and evaluation projects that explore online learning and other educational
technology policies and programs. Prior to joining SRI, Dr. Bakia worked
at the Federation of American Scientists and the Education Unit of the World
Bank.
Barbara Means,
Marianne Bakia, and
Robert Murphy
Center for Technology in Learning
SRI International
First published 2014
by Routledge
711 Third Avenue, New York, NY 10017
and by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2014 Taylor & Francis
The right of Barbara Means, Marianne Bakia, and Robert Murphy
to be identified as authors of this work has been asserted by
them in accordance with sections 77 and 78 of the Copyright, Designs
and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or
reproduced or utilized in any form or by any electronic, mechanical,
or other means, now known or hereafter invented, including
photocopying and recording, or in any information storage or
retrieval system, without permission in writing from the publishers.
Trademark notice: Product or corporate names may be trademarks or
registered trademarks, and are used only for identification and
explanation without intent to infringe.
Library of Congress Cataloging in Publication Data
Means, Barbara, 1949–
Learning online : what research tells us about whether, when and how /
Barbara Means, Marianne Bakia, Robert Murphy.
pages cm
Includes bibliographical references and index.
1. Internet in education. 2. Distance education. 3. Educational
technology. I. Bakia, Marianne. II. Murphy, Robert, 1962– III. Title.
LB1028.3.M415 2014
371.33'44678—dc23    2013036601
is changing, educators will always have to make decisions about whether and
how to implement the newest technology-based innovations in the absence of
rigorous research on that innovation implemented in their own particular
context. For this reason, decision makers have to depend on the “best available
evidence.” The online learning research base is, and will always be, imperfect.
Educators and policy makers need summaries of the research evidence and
help drawing reasonable inferences to inform the decisions they need to make
in the absence of the perfect study.
This book offers a balanced reporting of available evidence that takes the
reader beyond the question of whether online learning can be as effective as
face-to-face instruction to examine the research base for specific design
principles and implementation practices as they are used with different types of
learners (for example, high versus low achievers). We describe the ways that
online learning is being used in a wide range of contexts—in elementary and
secondary schools, colleges and universities, and in the home and wherever
else people have access to Internet connections.
We wrote this book with multiple audiences in mind. We have brought
together research on the use of online learning across the spectrum from
elementary school to university studies, to provide the most comprehensive
review possible in a way that is useful for courses on the role of technology in
learning and education. Chapters of the book would be useful also in educational
leadership courses, methodology courses addressing meta-analysis, courses on
the economics of education, and education policy courses.
Throughout, we address issues of practice and policy, drawing out the
implications of the research for people who need to make decisions today, and
who do not want to hear that “more research is necessary.”
We delve into apparently contradictory studies concerning the effectiveness
of online learning, making sense of the pattern of findings by looking more
closely at the nature of the learning experiences and of the types of learners
being compared. We note that many studies have fallen into the trap of treating
the online aspects of a course or other learning experience as if they were self-
contained, ignoring the broader context in which learning takes place and the
relationship between online and offline learning activities.
Many studies focused on measuring impact do not even specify the essential
features of the online experience they are investigating. In part, this neglect
reflects the field's lack of a common conceptual framework and vocabulary—
two gaps we attempt to fill by offering a framework and set of key design
features for online learning.
This book provides a framework for thinking about the essential conditions
that support online learning. We summarize empirical studies on the effects of
designing those conditions into online systems. We also describe and discuss
research on the use of online learning with learners who are vulnerable to
school failure, examining the complex relationships among learning system
Preface xi
Introduction
Wicks (2010) estimated that there were 1.5 million K-12 course enrollments in
2009–10.
As of late 2010, 48 states and Washington, D.C. supported online educational
programs, including state virtual schools that provided courses to supple-
ment conventional offerings, state-led initiatives that provided schools with
online resources and tools, and full-time online schools (Watson et al., 2010).
The largest state virtual school, the Florida Virtual School (FLVS), had over
260,000 course enrollments in 2010–11.
Christensen, Horn, and Johnson (2008) predict that by 2019 half of all
U.S. high-school enrollments will be online. That is a long way from the
estimated 1.5 million online K-12 course enrollments in 2009–10 (Wicks,
2010), but the shape of the K-12 online learning growth curve from 2001 to
2010 now resembles the exponential growth we saw for school acquisition
of computers from 1985 to 1995 and for K-12 Internet connections from 1995
to 2005.
Five states (Alabama, Florida, Idaho, Michigan, and Virginia) require a
student to take an online course in order to earn a high-school diploma, as
do a large number of districts (Evergreen Education Group, 2012).
What is driving this rapid growth in online learning? We see four interrelated
trends pushing the world in this direction.
First, as technology capabilities have expanded and information technology
has become more affordable and mobile, people live more of their lives online.
Why should people who schedule dinner reservations, stay in touch with
friends, get experts’ responses to their questions, and find their way through
unfamiliar cities with online resources forgo such capabilities when they want
to take a course or learn about a new field? Many parents and students are
asking for online learning to supplement what is done in classes. In one survey,
43 percent of middle and high-school students identified online classes as an
essential component of their ideal school (Project Tomorrow, 2011).
In states that now require students to take an online course to earn a diploma,
the rationale is that they will need to know how to learn online because they
will be required to do so in college and throughout their careers. In a world
where the state of the art is advancing at an unprecedented rate, people will
need to expand their skills repeatedly in order to stay employable.
The second impetus for online learning is the belief that it can address some
of education’s persistent and emerging challenges. These include the
achievement gap and the rate at which students—especially poor and non-
Asian minority students—leave high schools and colleges without a diploma.
The graduation rate in the U.S. (77 percent) is below the Organisation for
Economic Co-operation and Development (OECD) average and well
below that of countries with the highest graduation rates (like Germany and
Japan, with graduation rates of 97 percent and 95 percent, respectively). The
average graduation rate in U.S. public schools was less than 80 percent in
technology is at the core of virtually every aspect of our daily lives and
work, and we must leverage it to provide engaging and powerful learning
experiences and content, as well as resources and assessments that measure
student achievement in more complete, authentic, and meaningful ways.
Technology-based learning and assessment systems will be pivotal in
improving student learning and generating data that can be used to
continuously improve the education system at all levels.
(U.S. Department of Education, 2010b, p. xi)
At the same time that these motivations for online learning have fueled its
growth, that growth has given rise to questions about the quality of learning
experiences offered online. For example, some Florida parents expressed
concern and displeasure when they were surprised to find out that their
children would be taking one of their courses online (Herrera, 2011). Quality
concerns are especially acute in the case of K-12 and higher education degree
programs conducted entirely online. Some two dozen states have prohibited
cyber-schools and high-school degree programs offered entirely online (Glass
& Welner, 2011).
Considering the potential scale and impact of online learning, as well as the
controversies surrounding it, research-based guidance regarding effective
online learning practices and their implementation in different contexts is
urgently needed.
More Definitions
As we have defined it, online learning was not possible prior to the inception
of the World Wide Web. Certainly, there were technology-based learning
options much earlier. By the 1970s, Patrick Suppes and his colleagues at
Stanford University were offering computer-based mathematics instruction
that was the forerunner of many of the online math offerings we have today
(Suppes, 1965). But computer- or server-based instructional offerings lacked
the reach, affordability, and flexibility that are possible today with instruction
taking place over the Internet.
Distance learning is a broader concept, as it encompasses any instruction in
which the learner and the instructor are physically separated. Because distance
learning includes other technologies, going all the way back to print-based cor-
respondence courses, we treat online learning as a subset of distance learning
rather than a synonym of it. But some of the research we will examine comes
from the distance learning literature, and it remains an important source of
insights.
Another important trend related to online learning is that of open edu-
cational resources. These resources, available over the Internet for free or at
nominal cost, may include courses or course modules (the use of which we
would consider online learning), but also may include course syllabi, lesson
plans, and other instructional resources intended for use by instructors
rather than for direct use by learners. (Though of course instructors can learn
online too.)
Our definition of online learning differs from that of open educational
resources in that it does not require that the learning experience be offered for
free, and indeed, there are many for-profit companies that have become very
active in providing online courses and credentials.
[Figure: the book's conceptual framework, relating design, implementation, and context to outcomes]
and an agreed set of more specific features within each dimension to generate
comprehensive descriptions of the interventions they studied.
The set of essential online learning intervention features that have emerged
from our own work are shown in Table 1.1. These features and terms will
appear prominently in the chapters that follow.
Context
In describing the components of each dimension in our conceptual framework,
we will begin generally with broad considerations and then move toward finer
levels of detail. Under the dimension context, we consider first the field of use:
whether the online learning application is intended for higher education, K-12
(primary/secondary) education, military or job training, or self-initiated
learning. The resources we describe as “mixed field of use” are designed for
use in more than one of these fields. A related dimension is the provider type:
online learning is offered by district and state public K-12 education institutions
(e.g., the Michigan Virtual School, Riverside Virtual School); by for-profit
vendors (e.g., K12 Inc., University of Phoenix); by public or private nonprofit
higher education institutions (Arizona State University); by other types of
nonprofit institutions (National Geographic Society); by government agencies
(e.g., the U.S. Department of Energy’s Online Learning Center); and by
consortia of multiple organizations.
Third, we consider the breadth of the online offering: whether it is a full
certificate or degree program, a formal course or training experience, a unit or
Table 1.1 Essential online learning intervention features

Context
  Field of use: K-12, higher education, postsecondary training, self-initiated, mixed
  Provider: district, state, for-profit vendor, nonprofit higher education institution, other nonprofit, government agency, consortium
  Breadth: whole program, course, portion of course, brief episode
  Learner’s level of preparation: weak, adequate, strong

Design features
  Modality: fully online, blended, Web-enabled
  Pacing: independent mastery-paced, class-paced, mixture
  Pedagogy: expository, practice environment, exploratory, collaborative
  Online communication synchrony: asynchronous, synchronous, both
  Intended instructor role online: active instruction, small presence, none
  Intended student role online: listen and read; complete problems and answer questions; explore simulation and resource; collaborate with peers in building knowledge
  Role of online assessments: determine if a student is ready for new content, tell system how to support student, provide student and teacher with information about learning state, calculate student’s risk of failure
  Source of feedback: automated, teacher, peers, mixed, none

Implementation
  Learning location: school, home, other, mixed
  Co-located facilitator: primary instructor, monitor and facilitator, absent
  Student–instructor ratio
  Level of online student–content interaction: high, medium, low
  Level of online student–instructor interaction: high, medium, low
  Level of online student–student interaction: high, medium, low

Intended outcomes
  Cognitive: declarative knowledge, procedural skills, problem solving and strategies
  Engagement: primary goal, secondary goal, not explicit goal
  Productivity: course pass rate, graduation rate, time to completion, cost
  Learning to learn: self-regulation, new media skills
Instructional Design
There is an almost infinite number of possible features for the design of an
online learning experience, but our conceptual framework is limited to features
that some research suggests influence the outcomes of online learning. First
among these is what we call modality: the distinction between online, blended,
and Web-enabled learning experiences discussed above.
Next, there is the pacing of instruction. Allowing students to begin learning
and to proceed to the next learning module when (and only when) they have
mastered the current module is a practice incorporated into many online
learning systems, as it was in the computer-assisted learning systems of earlier
decades. It is also possible to have a fixed or class-based schedule for when
students are to be online and when learning components are supposed to be
completed, much as the typical classroom-based course is run. Finally, many
instructors and online learning providers are experimenting with various
strategies falling between these two pacing options, with some required times
for online interaction or some completion deadlines but more flexibility than
found in traditional classroom-based courses.
A related design feature is the synchrony provided by the technology used in
the online learning system. In the earlier days of distance learning, some
systems were designed to give learners in all locations the sense of being in the
classroom, and they provided for synchronous (same time, different place)
communication only. Other learning systems relied entirely on asynchronous
(different time, different place) communication using materials posted online
and discussion boards. Some researchers found that learning interventions
using asynchronous communication were more effective than those using
synchronous communication, but interest in the topic has receded with the
dominance of modern, Web-based learning systems that support both
synchronous and asynchronous interactions.
Describing aspects of the design dimensions becomes more complex as we
move to consideration of the nature of the instructor and student roles. The
intended instructor role online may be to lead instruction and conduct
Implementation
The third dimension in our model is implementation. No matter how a course
or learning system has been designed, students may have different experiences
depending on how it is implemented in practice. Typically, many decisions
about implementation are made by schools or teachers, but some are made, at
least in part, by students. The first feature under this dimension is the learning location.
Outcomes
Finally, in describing and evaluating online learning resources it is important
to keep in mind the intended outcomes. Most of the time we think about the
cognitive outcomes valued by schools and colleges. But there are different
kinds of cognitive outcomes, and research and theory suggest that different
kinds of learning experiences best enhance the different types, which can be
described as declarative knowledge (e.g., learning the motor vehicle laws for
your state), procedural skills (e.g., fluency solving algebra word problems), or
problem solving and strategies for future learning (e.g., the split-half strategy
for troubleshooting computer systems).
Another important category of outcomes has to do with the learner’s
affective responses and engagement in the online activity. For much self-
initiated learning and for some of the activities selected by teachers, the extent
of student engagement is valued as much as, or more than, cognitive outcomes.
From an education policy perspective, one of the most important classes of
outcomes are productivity measures. These include things such as the course
pass rate, a school’s graduation rate, the time it takes a student to complete a
program of study, or the costs of obtaining each course completion.
Finally, technology advocates believe that online learning experiences are
vital for obtaining learning-to-learn outcomes. Two major classes of these
outcomes dominate the literature. The first has to do with what is sometimes
called self-regulation—the ability to plan and execute learning activities
independently without needing someone else to tell you what to do and when
to do it. Self-regulation skills include having an awareness of what you do and
do not understand or know how to do and being able to set up and adhere to a
schedule that meets completion deadlines. The other important class of
learning-to-learn outcomes concerns the use of the new, Internet-based media
themselves. As these media have become such a large part of our lives—
socially and professionally—the mastery of online learning and communication
skills has become a valued outcome in its own right.
• “finding the best ways to engage with people with different interests,
passions, and ways of thinking”—principal Paul Lorette on his Ready
for Learning blog
• “empower[ing] each student to learn the way they learn best—when,
where, and how they want”—Pearson Learning Solutions Web site
• “providing adjustments to what is learned, the pace at which learning
happens and the methods of learning based on students’ needs and
performance”—Desire2Learn Web site
Online learning inspires strong views both pro and con. Depending on which
media accounts you attend to, the growing use of online instruction portends
either a transformational increase in educational access and personalization
(Collins & Halverson, 2009; Swan, 2003) or the cynical degradation of
educational quality at public expense (Glass, 2009).
The purpose of this chapter is first to summarize available empirical research
on the effectiveness of online and blended learning approaches, and second to
describe why the field needs to move beyond studies that ask whether online
learning “works” and address principles for designing and implementing
online learning for different purposes, circumstances, and types of learners.
An advantage of using the standardized effect size metric is that it makes it possible
to combine results across studies that examined different outcomes and used
different measurement instruments.
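In most comparative studies, the standardized effect size is a standardized mean difference such as Cohen's d: the difference between the two groups' mean scores divided by their pooled standard deviation. A minimal sketch in Python (our illustration, not the authors' procedure; all study numbers are hypothetical):

```python
import math

def cohens_d(mean_online, sd_online, n_online,
             mean_f2f, sd_f2f, n_f2f):
    """Standardized mean difference (Cohen's d) between two groups.

    Dividing the raw mean difference by the pooled standard deviation
    puts studies that used different outcome measures on a common
    scale, which is what allows results to be combined across studies.
    """
    pooled_var = ((n_online - 1) * sd_online ** 2 +
                  (n_f2f - 1) * sd_f2f ** 2) / (n_online + n_f2f - 2)
    return (mean_online - mean_f2f) / math.sqrt(pooled_var)

# Hypothetical contrast: online section mean 78 (SD 10, n = 40)
# versus face-to-face section mean 75 (SD 10, n = 40).
print(round(cohens_d(78, 10, 40, 75, 10, 40), 2))  # 0.3
```

On this scale, the .20 average advantage the authors report for online conditions means the typical student in an online condition scored about a fifth of a standard deviation above the face-to-face mean.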
Meta-analysis has a number of advantages:
From the 1,132 abstracts, 99 online learning research studies were identified
that compared an online condition (either entirely online or a blended condi-
tion with online and face-to-face components) with face-to-face instruction
and used an experimental or controlled quasi-experimental design with an
objective measure of student learning outcomes.
Of the 99 studies comparing online and face-to-face conditions, 45 provided
sufficient data to compute or estimate 50 independent effect sizes. The types of
learners in the meta-analysis studies were about evenly split between students
in college or earlier years of education and learners in graduate programs or
professional training. The average learner age in a study ranged from 13 to 44,
but only five studies with seven independent effect size estimates dealt with
K-12 education.
The studies encompassed a wide range of learning content, including com-
puter science, teacher education, mathematics, and languages, with medicine
or health care being the most common subject area.
We computed the average difference between learning outcomes in
conditions incorporating online learning and those employing business-as-
usual face-to-face instruction. We computed an average effect size for the
entire set of 50 contrasts and then separately for those involving fully online
instruction and those contrasting blended learning with wholly face-to-face
conditions.
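Pooling many contrasts into a single average is conventionally done with an inverse-variance weighted mean, so that larger, more precise studies count more toward the estimate. A sketch of that fixed-effect pooling step (our illustration of the standard method, not necessarily the authors' exact computation; the numbers below are hypothetical):

```python
def weighted_mean_effect(effects, variances):
    """Fixed-effect meta-analytic average: each effect size is weighted
    by the inverse of its sampling variance, so an imprecise small study
    moves the pooled estimate less than a precise large one."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Three hypothetical contrasts with their effect sizes and variances.
print(round(weighted_mean_effect([0.10, 0.35, 0.20],
                                 [0.04, 0.02, 0.04]), 2))  # 0.25
```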
Meta-analysis of all 50 effects found that, on average, students in conditions
that included significant amounts of learning online performed better than
students receiving face-to-face instruction by .20 standard deviations, a modest,
but statistically significant, amount. However, the subset of studies employing
blended learning approaches was entirely responsible for the observed online
advantage: the online advantage relative to purely face-to-face instruction was
statistically significant in those studies contrasting blended learning with
traditional face-to-face instruction but not in those studies contrasting purely
online learning with face-to-face conditions. The magnitude of the blended
learning advantage was +.35, about one-third of a standard deviation.1
Researchers at the Community College Research Center (Jaggers & Bailey, 2010) point out that such
short-term interventions may not represent the results for entire courses. Differences
may not show up during a short intervention that would be apparent over a full
academic term.
Part of the reason that few controlled studies of online and blended learning
involve entire courses and large samples is that it is difficult to obtain
permission to conduct such studies (Bowen et al., 2012; Figlio, Rush, & Yin,
2010). As a matter of policy, many colleges and universities will not allow
students to be assigned to an online course section against their will, thus ruling
out random assignment for any students except those who volunteer to let a
researcher assign them to an online or face-to-face section at random. Figlio,
Rush, and Yin (2010) found that a relatively small proportion of students
volunteer under such circumstances, even when offered extra credit toward
their course grade for doing so.
the field if they describe these features for all of the instructional conditions
being compared (including classroom-based “business as usual”).
Table 2.2 illustrates how the dimensions in Table 2.1 could be used to
describe typical examples of three currently popular forms of online instruction
(independent learning online, online community of learners, and MOOCs).
a. This description is based on the best-known MOOCs, those provided by Coursera
and Udacity. Other forms of MOOCs, sometimes called "Canadian" or "cMOOCs,"
have quite different instructional practices, as discussed in Chapter 3.
field, so that principles are available to guide future online learning design, this
situation is unfortunate.
The sheer size of this list (as well as the need for each of these terms to be
defined) attests to the complexity of the idea of personalization.
In addition to the many features of the learning experience that can be
customized to individual learners’ needs or tastes, there are also multiple
possibilities for how differentiations along these dimensions are made. Some
systems let the learner choose different objectives, content, or pedagogy
(usually referred to by a more user-friendly term such as “mode”).
Some systems are set up to give the student’s teacher control of the level of
content and support each student receives. Although letting the student or the
teacher match instructional mode to the student’s preference or perceived
learning “style” sounds attractive, meta-analyses of this kind of matching
suggest that it is a weak intervention at best (Aiello & Wolfle, 1980; Hattie,
2009; Slemmer, 2002).
Some systems administer a diagnostic test or assessment of learner preferences
when learners first start with the system (U.S. Department of Education,
2010b). The most sophisticated learning systems perform “assessment on the
fly” with the system itself using the learner’s prior actions to determine the
approach presumed to be the best fit for that learner (U.S. Department of
Education, 2013, Chapter 2).
Another frequently touted virtue of commercially produced online learning
resources and systems is their use of “rich media.” Marketing materials often
imply “the more media the better,” but the research literature actually suggests
a much more complex set of design principles.
Humans have separate channels for processing visual and verbal infor-
mation (pictures and words), and learning can be enhanced by exploiting both
channels, but humans can only process a certain amount of information at any
one time. Further, the extent to which learning occurs and is retained is a
function of the extent to which the learner actively engages with the material
to be learned.
With these three characteristics of human information processing interacting,
the results of any particular configuration of text, animation, voiceover,
and graphics become much harder to predict. In many cases, simple schematic
diagrams are as effective as or more effective than more elaborate visuals.
Mayer and his colleagues have demonstrated repeatedly that adding music,
video clips, and interesting facts to instruction on science
phenomena actually reduces the effectiveness of instruction (Mayer, 2008).
Mayer has conducted meta-analyses on a number of theoretically grounded
principles for multimedia instruction. He identifies five principles, each of
which has been shown to improve learning in most or all experimental tests:
But the only way to know whether the use of media in a particular online learn-
ing resource is in fact optimal is to try out different versions with learners.
Although promotional materials characterize qualities like personalization,
game-like design, and rich media as “research-based,” the available supporting
research typically is limited to specific applications of some version of the
principle in interventions that may be quite different from those of the product
being promoted. Potential adopters need evidence of a positive impact of the
way the principle has been implemented in the particular product under
consideration.
An implication of the gap between the loose usage of learning terms in
everyday discussion and product promotion and the much narrower,
operational definitions used in research is that it is difficult to translate research
into terms that make sense to practitioners and product designers.
Individual research studies involve particular instantiations of theory-based
concepts, such as immediate feedback or spaced practice. The details of how
these concepts are operationalized are important to researchers and can affect
learning outcomes. Given the number of different ways in which an instructional
design principle can be operationalized in different contexts, interpreting
research investigating principles for designing online learning can be daunting.
One approach to dealing with this complexity is to promote a common
set of terms and definitions organized as a hierarchy of terminology. Yael Kali
of the Israel Institute of Technology and Marcia Linn from the University of
California, Berkeley, have done this with their online design principles
database. This resource was developed to support those engaged in designing
learning environments in applying research-based principles and describing
their products in terms of a standard set of features. As illustrated in
Figure 2.1, from Kali and Linn (2010), the Design Principles Database
classifies principles at three different levels of abstraction
(see edu-design-principles.org). A meta-principle is a generalization based on the basic research
literature about how people learn. A sample meta-principle would be “Make
thinking visible.” Pragmatic principles are more specific descriptions of how
the meta-principle could be instantiated in a piece of learning software. Two
pragmatic principles for “make thinking visible” are “Provide a visual overview
of the inquiry cycle” and “Provide dynamic visual aids for the perception of
[Figure 2.1: The Design Principles Database (Kali & Linn, 2010)]
Notes
1 Cohen (1988) describes effect sizes of .20 as “small,” .50 as “medium,” and .80 as
“large” for social science research. Hattie (2009), who has synthesized results of over
800 meta-analyses on variables related to educational achievement, suggests that an
effect size of .40 represents a level “where the effects of the innovation enhance
achievement in such a way that we can notice a real-world difference” (p. 17).
2 Although Jaggars provides examples of better and more poorly controlled studies,
she does not report the criteria she used to classify studies.
3 Most learning technology researchers regard this as a fairly meaningless comparison
(Kozma, 1994) since they regard the point of using technology as providing
experiences that are not possible in conventional classroom instruction.
4 Looking just at the 11 contrasts with the largest positive effects for online or blended
learning, retention rates were higher for the face-to-face condition in two, for the
online condition in two, equivalent in the two conditions in two, and not reported in
five.
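To make the effect size conventions in note 1 concrete: Cohen's d is the difference between two group means divided by their pooled standard deviation. The sketch below computes it from scratch; the exam scores are invented purely for illustration.

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: (mean1 - mean2) / pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Sample variances (dividing by n - 1), then pooled across both groups.
    var1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    var2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical exam scores for an online and a face-to-face section.
online = [78, 85, 90, 72, 88, 81]
face_to_face = [75, 80, 84, 70, 79, 78]
print(round(cohens_d(online, face_to_face), 2))  # -> 0.8, "large" by Cohen's benchmarks
```

By Hattie's rule of thumb in the same note, anything above .40 would register as a noticeable real-world difference.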
Chapter 3
Online and Blended Learning in Higher Education
Colleges and universities have led the way in adopting online learning. Over
the last decade, enrollment in online college courses has been increasing faster
than enrollment in higher education as a whole (Allen & Seaman, 2013).
Tertiary education institutions have advantages over elementary and secondary
schools as contexts for Web-based learning because they have greater resources
to invest in a technology infrastructure. In addition, they deal with more mature
learners, who can be expected to do more of their learning independently
without close monitoring by an instructor. Finally, experience with distance
learning at institutions such as the Open University in the United Kingdom and
Athabasca University in Canada provided proof points that learning at a
distance could serve the needs of many students and yielded insights into
how it could be implemented.
future; by 2012 the proportion had risen to nearly 70 percent (Allen & Seaman,
2013). The surveys have also documented a trend toward greater confidence that online instruction is as effective as traditional face-to-face instruction. In
the first survey conducted in 2002, 57 percent of responding chief academic officers said that online learning was either equal or superior to traditional
methods in effectiveness. By 2012, that proportion had risen to 77 percent
(Allen & Seaman, 2013).1
A number of different factors appear to have contributed to putting
online learning into the higher education spotlight. First, the technology
infrastructure—the systems that campuses can afford to purchase and the
hardware devices that students bring to campus—reached a “tipping point,”
enabling course designers at many campuses to assume every student would
have access to anything on the Web. Another enabler has been the increased
supply of free or “open” learning management systems and learning resources
and commercial products. This supply has been stimulated by a recent influx of
both philanthropic and commercial investment capital into companies offering
services, infrastructure, or content for higher education learning online. In
addition, some of the for-profit postsecondary education providers, such as
the University of Phoenix and Walden University, achieved tremendous
year-to-year growth in enrollments during the first decade of the twenty-first
century, putting pressure on public university systems to compete by offering
courses and programs that met the needs of busy adults with home or work
responsibilities.2
In the U.S., the increased activity of for-profit providers and venture capital
investment in online learning systems and resources occurred at the same time
that state budgets for higher education were being cut significantly (State Higher Education Executive Officers, 2013). Budgetary pressures led higher
education administrators, governors, and legislatures to look for ways to
continue to deliver higher education services but at lower cost. Many looked to
online education as the way to escape their service-cost dilemmas.
In California, for example, Governor Jerry Brown has pushed for more online
courses in the state’s college systems as a means of reducing costs. Speaking
before the University of California Regents, Brown asserted, “So there isn’t the
luxury of sitting in the present trajectory unless you don’t mind paying ever
increasing tuition” (Melendez, 2013). A bill that went before California’s state
legislature in 2013 called for the development of 50 online versions of the most
highly oversubscribed required lower-division college courses. Students at
the University of California, California State, or community college campuses
would be able to take these online courses, or approved equivalent online
courses from commercial providers or out-of-state colleges, if they were not
able to get into the classroom-based course on their own campus.
Walsh (2011) quotes William G. Bowen, the former president of Princeton
University, who said, “I think that present and prospective economic realities
dictate that there be a serious rethinking of the way some forms of instruction
are provided, especially in those parts of the public sector that have been hurt
the most by funding cuts.”
Taking a more international view, we are in a period of rising demand
for higher education worldwide. Atkins, Brown, and Hammond (2007)
assert that global demand for higher education cannot possibly be met
without online learning. They illustrate the scope of this need with remarks by
Sir John Daniel, former vice-chancellor of the Open University and CEO of Canada’s Commonwealth of Learning. Daniel asserted that there are 30 million more people in the world qualified for a university education than there are places in higher education institutions. Daniel predicted that this
number would soon grow to 100 million and estimated that fully meeting the
worldwide demand for higher education would require opening a major
university every week. Clearly, this level of unmet need cannot be satisfied
with brick-and-mortar institutions of the same kind we have had for the last
300 years.
Finally, the spectacular enrollment numbers for MOOCs—with hundreds of
thousands of people signing up to take a single online course—have triggered
huge amounts of publicity and a “gold rush” mentality among colleges and
universities eager to get in on the opportunity. Colleges feel they need to
participate or risk being overshadowed by institutions with online offerings.
One university president was even fired by her board of trustees, apparently for
failing to move fast enough to implement MOOCs as other elite universities
had done (Webley, 2012).3 Both irrational exuberance and deep-seated fear
concerning online learning are running high (Brooks, 2012; Christensen &
Horn, 2013; Shullenberger, 2013; Yuan & Powell, 2013).
Influential Precedents
As noted above, learning at a distance, whether through televised lectures,
radio, or more recently online courses, has had a presence within higher
education for decades. But the rise of the World Wide Web and the emergence
of Web 2.0 capabilities have brought learning at a distance from instructors
into a new era. Harasim (2001) traces the origins of online courses both before
and after the introduction of the World Wide Web.
While the invention of email and computer conferencing had launched an
unprecedented level of social interaction, communication, and collaboration,
the invention of the Web led to a phenomenal amount of self-publishing. Two
basic models of online courses thus emerged: one based on collaborative
learning and interaction, and the other based on publishing information online
(course materials, lecture notes, student assignments, and so on). The second,
based on the old model of transmission of information or lecture mode, seemed
to flourish during the late 1990s, but then its weaknesses became evident.
At the same time, new tools and environments customized for education
In designing MIT OCW, the university was clear that it was putting course
materials on the Web—syllabi, reading, lecture notes, and so on—not providing
online instruction per se. Part of the rationale for this decision was a desire not
to “dilute the brand” of MIT instruction offered to its own students on campus
and part was a recognition of the fact that posting their course materials
required much less of faculty than redesigning their courses as online instruction
would. MIT expected that the major users of OCW would be faculty at other
colleges and universities who would have access to the course content and
supporting materials used at MIT and could incorporate these materials into
their own courses (Walsh, 2011).
In 2001 MIT made content from an initial set of 50 courses freely available
online. Another 500 courses were added during a second phase. Eventually, the
number of OCW courses exceeded 1,200.
A survey of MIT OCW users conducted in 2009 found that usage was
worldwide with 54 percent of OCW visits coming from outside the U.S.
(Walsh, 2011). Contrary to expectation, only 9 percent of users were educators:
42 percent of OCW users described themselves as students, and 43 percent as
“self-learners.” Course content that MIT had expected to be taken up by and
interpreted by faculty was instead being consumed directly by learners—an
outcome totally in keeping with the emerging philosophy of resource sharing
that came to be characterized as “Web 2.0.”
From the beginning, MIT had hoped that its OCW initiative would serve as
an example to other universities around the world. With a grant from the
Hewlett Foundation, MIT launched the OCW Consortium in 2005. A decade
later, more than 200 other higher education institutions from around the world
had joined the OCW Consortium, each pledging to make at least ten of its
courses available in open form (Smith, 2009). The biggest enthusiasm for joining the OCW Consortium came from Asian universities; U.S. institutions did not join in large numbers (Walsh, 2011).
analysis of the subject domain. The courses also include short tests interspersed
with the learning and practice activities. These tests are self-assessments designed to help students reflect on their learning rather than tests that serve as inputs into a course grade. Some of the assessment items call for reflections,
such as “What was hardest to understand?”
Although the cost of developing an OLI course has dropped from the initial
million dollars to something in the neighborhood of half of that (Walsh, 2011),
the expertise and labor going into these courses is far from typical for the
development of online college courses.
vary for different students, personalized learning systems seek to hold the level
of mastery constant and vary learning time.
In the 1980s mastery learning approaches were used most often in the U.S.
in compensatory education services for younger students receiving remediation
in basic reading and mathematics skills. More recently, online learning systems have brought mastery learning approaches into greater use in higher education as well as K-12. Mastery learning has particular appeal in
remedial courses addressing mathematics and language arts skills that students
should have attained prior to entering college and in mathematics and math-
related subjects more generally.
A high-profile application of mastery learning for this purpose is the “emporium model” of mathematics instruction popularized by Virginia Polytechnic
Institute and State University (“Virginia Tech”). Motivated by the desire to
improve student performance in mathematics courses and to reduce costs,
Virginia Tech set up a large open space with clusters of computers to serve as
an “emporium” for mathematics learning. Currently housed in a 60,000-square-
foot off-campus space that formerly served as a grocery store, Virginia
Tech’s Math Emporium serves over 8,000 students a semester. Equipped with
550 computers arranged in six-station hubs, the Math Emporium is open
24/7 for students wanting to work on their online math courses. Students use
streaming video, audio, or text explanations of math concepts and then take
online quizzes. Once they have mastered the content in all the quizzes, they
take a test composed of new mathematics problems on the same learning
objectives. Students who would like personal assistance while working in the
Math Emporium can get it from one of the instructors, graduate students, or
advanced undergraduates who staff the Emporium 15 hours a day. Because
fewer and less-expensive staff are used in the Emporium than in traditional
lecture-based classes, the Math Emporium has reduced the cost of delivering
math courses by 75 percent at Virginia Tech (Robinson & Moore, 2006). The university reports that students who took their mathematics classes through the Math Emporium do as well as or better than students who took the classes in a traditional classroom-based format when they reach more advanced mathematics classes (Robinson & Moore, 2006).
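The mastery gate at the heart of the emporium model, in which students retake quizzes until every learning objective is mastered and only then unlock the proctored test, can be sketched in a few lines. All names and the threshold here are hypothetical, not drawn from Virginia Tech's actual system.

```python
def ready_for_test(quiz_scores: dict, mastery_threshold: float = 0.8) -> bool:
    """Return True only when every quiz meets the mastery threshold.

    quiz_scores maps each learning objective's quiz to the best score so far;
    students may retake quizzes, so only the best attempt matters.
    """
    return all(score >= mastery_threshold for score in quiz_scores.values())

progress = {"linear_equations": 0.9, "inequalities": 0.85, "graphing": 0.7}
print(ready_for_test(progress))   # graphing below threshold -> False
progress["graphing"] = 0.95       # after more practice and a retake
print(ready_for_test(progress))   # all objectives mastered -> True
```

The test itself then presents new problems on the same objectives, so passing the gate reflects mastery rather than memorized quiz answers.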
The emporium model for mathematics instruction has been adopted by other
universities, reportedly resulting in larger proportions of their students earning
a C or better in remedial or entry college-level mathematics courses (Twigg,
2011). The model is currently being expanded to 38 two-year or community
colleges under the auspices of the National Center for Academic Transformation
(Twigg, 2011).
A more general concept that subsumes mastery learning is that of adaptive
instruction. A learning system is considered adaptive if it uses information
gained as the student is learning with it, rather than pre-existing information
such as the learner’s age, gender, or prior achievement test score, to change the
way in which the system presents instruction to better meet the learner’s needs.
Systems can be adaptive in the way they represent a concept (e.g., through text
or diagrams or formulae), the difficulty level of the text or problems they
provide, the sequencing of material, or the nature of hints and feedback given.
While the mastery learning systems of the 1980s varied the pace with which
students moved through a curriculum, they still had all students work through
the same content in the same sequence. Newer adaptive learning systems with
artificial intelligence capabilities are able to mimic the more complex variations in instructional styles that a human tutor would use with a student (Koedinger & Corbett, 2006; U.S. Department of Education, 2013). The Open Learning Initiative courses described above incorporate artificial intelligence techniques to make their instruction adaptive.
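The defining property of an adaptive system described above, using information gained while the student is learning rather than pre-existing data, can be illustrated with a toy selector. Everything here (class name, learning rate, difficulty bands) is a hypothetical sketch, not any real tutoring system's algorithm.

```python
class AdaptiveItemSelector:
    """Toy adaptive selector: varies difficulty and hints with recent performance.

    Maintains a running skill estimate updated only from observed answers
    (never from age, gender, or prior test scores) and uses it to choose
    the next item's difficulty and whether to offer a hint.
    """

    def __init__(self):
        self.skill = 0.5          # running estimate in [0, 1], neutral start
        self.learning_rate = 0.2  # how quickly the estimate moves

    def record_response(self, correct: bool) -> None:
        # Nudge the estimate toward 1 on a correct answer, toward 0 otherwise.
        target = 1.0 if correct else 0.0
        self.skill += self.learning_rate * (target - self.skill)

    def next_difficulty(self) -> str:
        # Present harder material as the estimate rises.
        if self.skill < 0.4:
            return "easy"
        if self.skill < 0.7:
            return "medium"
        return "hard"

    def offer_hint(self) -> bool:
        # Struggling learners get hints proactively.
        return self.skill < 0.4

selector = AdaptiveItemSelector()
for answer in [True, True, True, False, True]:
    selector.record_response(answer)
print(selector.next_difficulty())  # -> medium
```

Real intelligent tutors replace this scalar estimate with richer models (e.g., knowledge tracing over many skills), but the control loop, observe, update, adapt, is the same.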
As education systems adopt individualized, mastery-based approaches to
instruction, a very natural extension is the move into competency-based
approaches to structuring learning and learning certification. Building a
mastery-based instructional system requires specifying the content and the
level of performance that students must master to complete the course and
developing a valid, reliable way of assessing whether the stipulated competencies have been attained. If such assessments are in fact valid measures of
competency attainment, one can then think about systems in which students
can take the assessment any time they feel they are ready, much as student
drivers choose when to take their driving test. Logically, what should matter is
competency attainment, not the number of hours one sat in a classroom to
attain it. Competency-based systems determine both a student’s place in an
education program and when the student has completed the program on the
basis of the demonstration of competencies, rather than the number of hours of
instruction or the month of the year.
Providers of online degree programs (see Chapter 6) are enthusiastic
about competency-based approaches to documenting learning because such
approaches free learners from fixed timeframes for academic terms. Once the certification of learning is decoupled from time spent in the classroom,
all kinds of new models of higher education become possible.
An early effort to provide competency-based education online was Western
Governors University, which, as described in Chapter 6, offers degrees based
on competency rather than seat time.
Under the Obama Administration, the move toward competency-based
higher education has received encouragement from the federal government. In
an interview with The New York Times, U.S. Education Secretary Arne Duncan
highlighted the competency-based programs of Western Governors University
(WGU) and said that in the future he would like to see them become the norm
(Lewin, 2011).
Currently, a number of public higher education institutions are working to
pioneer competency-based programs, an effort that requires overcoming the
hurdles presented by accrediting processes and federal financial aid regulations
that were written around the notion of measuring education by class hours (the
“Carnegie unit”).4 In fall 2012 Southern New Hampshire University received
approval from its regional accrediting agency for its proposed College for
America, which offers online assessments of 120 competencies. There are no
required courses in this program; rather, students attempt the task associated with each competency, and faculty use a rubric to judge whether or not each student’s product provides sufficient evidence that the student possesses the target competency. Competencies are
organized into three levels, from simpler to more complex, and a student may
take as much or as little time as needed to perform the tasks at the required
level of competency. Tuition is set at $2,500 a year, so a student could earn an
associate’s degree for as little as $1,250 if she could demonstrate all the
required competencies in six months’ time (LeBlanc, 2013).
The U.S. Department of Education subsequently approved Southern New
Hampshire University’s program for student receipt of ¿nancial aid based on
direct assessment of competency rather than credit hour. The Secretary of
Education, Arne Duncan, wrote:
The Bill & Melinda Gates Foundation is another organization that has been
trying to leverage online learning and competency-based programs to create
new higher education structures with dramatically lower costs and better
student graduation rates. The foundation has funded experimental programs
within existing colleges and universities and start-up higher education institutions to develop “breakthrough delivery models” for cost-effective higher education. One of the foundation’s grantees, Northern Arizona University, has
partnered with Pearson to provide competency-based undergraduate programs
in computer information technology, business administration, and liberal arts,
starting in 2013. Courses for these programs were redesigned around outcomes
and competencies. The programs were designed for self-motivated learners
who could test out of some of the instructional units by demonstrating the
target competencies on a pre-assessment. The program design allows students
to start a course at any time during the year, with tuition set at a flat rate of
$2,500 for six months.5
Blended Learning
A second major trend in higher education, as in K-12 education, is the use of
blended learning approaches involving various combinations of online and
transform student outcomes in higher education, the great majority of the proposals it received, and most of the 29 it chose to fund, called for blended rather
than fully online models of instruction (Means, Shear et al., 2013). Many of
these projects involved making a set of resources and expertise available to
faculty wishing to redesign their courses to use blended learning (grants
to Abilene Christian University, Bryn Mawr College, Cerritos College, the
Missouri Learning College, OhioLINK, and the SUNY Learning Network).
Other projects developed specific online resources for use in blended learning courses. One grant supported the implementation of SimSchool, a game-like environment that uses simulations of classroom situations that can be incorporated into teacher training courses. A grant to Carnegie Learning supported
development of a collection of games designed to support students’ math
fluency and strategy development for rapid decision making in procedural
areas. The games were designed to be incorporated into developmental
mathematics classes. Similarly, a grant to the University of Massachusetts
supported implementing their Wayang Outpost online environment in college
mathematics classes. Wayang Outpost was designed to support students’ mathematics learning by having them apply math in the context of “adventures” in
a virtual environment, complete with “learning companions” who provide
affective encouragement as students progress through the material.
While learning analytics has already been used in admissions and fundraising efforts on several campuses, “academic analytics” is just beginning
to take shape.
(Johnson et al., 2011, p. 28)
The 2013 edition of the Horizon Report placed learning analytics two to three
years in the future and noted:
This year, the rise of big data was the subject of discussions across many
campuses, and educational data scientists all over the world are beginning
to look at vast sets of data through analytical methods pioneered by
businesses to predict consumer behaviors.
(Johnson et al., 2013, p. 24)
Learning analytics, and the related field of educational data mining, borrow many of the techniques developed by Web-based businesses such as Amazon, Google, and Netflix. While the latter seek a better understanding of their
customers in order to better tailor their online experiences in ways that will
maintain engagement, and ultimately lead to sales, applications of these
techniques in education have a different set of priorities. A brief published by
the U.S. Department of Education (Bienkowski, Feng, & Means, 2012)
characterized the kinds of questions addressed by learning analytics as:
Life, or blogs. Papers and a ¿nal project were required and evaluated only for
those students taking the course for credit (Parry, 2010).
Siemens (2012) contrasts the MOOCs pioneered in Canada and at Brigham
Young University and the University of Florida with those that came later out
of Stanford and edX, referring to the former as “cMOOCs.” According to
Downes and Siemens, the instructional philosophy behind the Canadian
cMOOCs reflects connectivism, a philosophy of open learning through
collaborative online networks. Rather than conveying a body of instructor-
developed content in top-down fashion, the cMOOCs seek to seed online
discussions and collaborations through which the networked community of
learners will build knowledge and understanding. It is for this reason that
cMOOC participants are free to share material and collaborate using any
technological tools they like.
Siemens calls the newer MOOCs that burst on the scene in fall 2011 and led
The New York Times to call 2012 “The Year of the MOOC” (Pappano, 2012)
“xMOOCs.” One commentator described xMOOCs as emerging “at the intersection of Wall Street and Silicon Valley” (Caulfield, 2012). While the original
xMOOCs shared with cMOOCs the notion of free worldwide participation in
a course without credit, they differed in employing a course management platform, in many ways not unlike Blackboard or Moodle, and instructor-led
instruction in a traditional hub-and-spoke model. Nevertheless, there are
some distinctive, characteristic features of xMOOCs that emerged from the
predilections of the Stanford faculty who began the xMOOC movement.
In the summer of 2011, Stanford University research professor Sebastian
Thrun and Peter Norvig, Google’s Director of Research, began planning the
course Introduction to Artificial Intelligence, which they would co-teach that
fall. Thrun and Norvig wanted to use the course to explore a better way to
extend high-quality instruction to large numbers of students at no cost. Thrun
sent out an email announcing that this class would be offered to anyone free of
charge. The New York Times picked up the news, and it “went viral.”
Norvig had expected 1,000 students to sign up for the online version of the
course, which would be offered also in face-to-face format for Stanford
students. The more optimistic Thrun predicted 10,000 online enrollments,
without really believing that prognosis himself. Both instructors—and
Stanford—were amazed when 160,000 people from 190 countries signed up to
take the course online. Online students were not offered Stanford college
credits, but those who successfully completed all course requirements would
get a “letter of accomplishment” signed by the instructors.
Thrun and Norvig acknowledged MIT’s role in leading the effort of high-prestige universities to make course materials available for free online. But
they both felt that the MIT online materials were too dominated by videotapes
of hour-long university lectures (Leckhart & Cheshire, 2012). Thrun and
Norvig did not think this was the best strategy for engaging online learners;
they regarded the Khan Academy’s five-minute videos as closer to the mark. In
addition, Thrun and Norvig wanted to preserve some elements of the assignment
deadlines and synchronous interactions among people in a face-to-face class.
Accordingly, they chose to use very short pieces of videotaped explanations
(about a minute in length) as one of the professors sketched out an idea,
followed by interactive exercises that asked the online student to respond to
questions about what he had just seen. Students might respond by choosing
one of several options or by entering a value right on the same drawing.
Responses were scored for accuracy automatically so that the student could get
immediate feedback. Online students could also submit questions, and artificial intelligence software sifted through the submissions to identify the most common questions, which Thrun or Norvig then addressed in future videotaped
segments. Online and Stanford-based students did the same homework, due
every Friday, and took the same exams. Peer-to-peer interaction was achieved
through online discussion forums. Some 2,000 volunteer translators helped
translate the course materials into 44 languages (Thrun, 2012).
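The exercise loop described above, a short video followed by a question scored automatically so the student gets immediate feedback, can be sketched as below. The item format, function name, and tolerance are hypothetical illustrations, not the actual Stanford course platform.

```python
def score_response(item: dict, response) -> str:
    """Score one exercise automatically and return immediate feedback.

    Supports the two response types mentioned above: choosing one of
    several options, or entering a value (graded with a tolerance, since
    hand-entered numbers rarely match exactly).
    """
    if item["kind"] == "choice":
        correct = response == item["answer"]
    elif item["kind"] == "numeric":
        correct = abs(float(response) - item["answer"]) <= item.get("tolerance", 0.01)
    else:
        raise ValueError(f"unknown item kind: {item['kind']}")
    return "Correct!" if correct else item["feedback"]

quiz_item = {
    "kind": "numeric",
    "prompt": "P(A) = 0.3 and P(B) = 0.4, independent. P(A and B)?",
    "answer": 0.12,
    "tolerance": 0.005,
    "feedback": "For independent events, multiply the probabilities.",
}

print(score_response(quiz_item, "0.12"))  # -> Correct!
```

Scoring on the server side like this is what made feedback instantaneous at a scale of tens of thousands of students, with only open-ended questions needing the AI triage described above.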
The Introduction to Artificial Intelligence course will be remembered as a
landmark in the move to open access to higher education, but Thrun and Norvig
viewed it as a pilot effort. Norvig points out that the 160,000-person enrollment
numbers are somewhat misleading; many of the people who signed up for the
course never did anything with it again (Norvig, 2012a). Three weeks into the
course, there were 45,000 students participating online—still more students
than Stanford’s entire student body. Of those online students, 21,000 ended up
successfully completing all of the course requirements and 3,000 completed a
“basic” track with less programming.
And what of the 200 students taking the course at Stanford? Thrun notes that
by the end of the term, the face-to-face lectures were attended by only about
30 students (Thrun, 2012). Most of the Stanford students preferred the online
short videos and associated exercises.
When it came to students’ performance in the course, Norvig describes the
distribution of points for online and Stanford students as being overall quite
similar (Norvig, 2012b). Stanford students had a slightly higher mean, and
there was more variance among online students. Thrun points out that the
248 students with perfect scores were all taking the course online (Salmon,
2012). Norvig speculates that this may be because Introduction to Artificial Intelligence was the only class some of the online students were taking, and
they could spend a lot of time to get things just right. Some cynics suggested
that there may have been some online cheating.
In January 2012 Thrun launched a Web-based start-up using the same
learning platform as the Stanford arti¿cial intelligence course. He named his
new enterprise Udacity—a combination of “university” and “audacity.” By the
fall of 2012 venture capitalists had ponied up $21.5 million to fund Thrun’s
start-up, according to The Wall Street Journal (Clark, 2012).
At roughly the same time, two other Stanford professors, Daphne Koller and
Andrew Ng, announced the formation of another MOOC start-up called
Coursera. Unlike Udacity, which concentrated on mathematics and information technology subjects, Coursera announced its intention to work with all
kinds of higher education content. Coursera received $16 million in venture
capital funding in April 2012 and announced its partnership with four elite
universities (Stanford, University of Michigan, Princeton, and the University
of Pennsylvania). By fall 2012, Coursera was offering 100 courses. By the end
of 2012 another 29 universities had agreed to partner with Coursera and by
April 2013, the anniversary of the company’s founding, the Coursera Web site
boasted 3.2 million users of its courses.
Although it started like Udacity by offering its courses for free on a noncredit basis, Coursera moved quickly to open up options for its MOOC participants to receive college credit. It began talking to the American Council on Education (ACE) about reviewing five of its first courses to certify them as being on the college level. In spring 2013 the ACE recommended that its
member colleges and universities accept these courses (Algebra, Pre-Calculus,
Calculus, Genetics and Evolution, and Bioelectricity) for transfer credit. ACE course approval opened the door for students to receive credit for course completion, but students taking one of these MOOCs for college credit would have to pay to have their examinations proctored by ProctorU, a company that connects students and proctors through Web cameras, and to petition for credit from the specific college or university they attend. ACE subsequently reviewed
four of Udacity’s courses for credit as well.
The MOOC providers’ move into offering courses for credit was apparent in a new partnership: with political support from the California governor’s office and financial support from the Gates Foundation, Udacity and San Jose State University (SJSU) struck a deal to develop and implement a set of MOOCs on that state university campus. The experience with one of these MOOCs, that
for teaching remedial mathematics, is described in some detail in Chapter 7.
In addition to the two Stanford MOOC spinoff companies, 2012 also saw the
establishment of edX as a joint project between MIT and Harvard University
to offer free online courses on a massive scale on their own MOOC platform.
The nonprofit edX has tended to portray itself as the more academic, deliberate player in the MOOC sector, using the tag line “take great online courses from the world’s best universities.” EdX was able to draw on an even larger set of
resources than the Stanford startups, with MIT and Harvard pledging $30
million each to establish the nonprofit organization. Subsequent partners have
included the University of California, Berkeley, the University of Texas
system, and a host of U.S. and international partners (Australian National
University, TU Delft, Ecole Polytechnique Fédérale de Lausanne, Georgetown
University, McGill University, Rice University, the University of Toronto,
Wellesley College, Kyoto University, Seoul National University, Berklee
around the world, the average question receives an answer within 22 minutes
(Koller & Ng, 2012).
As the xMOOC companies have gained experience with course outcomes and with trying to obtain faculty buy-in, they have departed from their initial concept of instruction offered totally online and started moving toward blended models of instruction. After the developmental mathematics MOOC piloted at SJSU in spring 2013 resulted in a drastic drop in student pass rates, a blend of MOOC-based and classroom-based instruction in developmental mathematics was tried out the following summer.
Some observers see supporting education in the developing world as the
major potential contribution of MOOCs (Liyanagunawardena, Williams, &
Adams, 2013), and blended learning versions appear most likely to take
hold in those settings as well. In Rwanda, the new Kepler University has
opened with a design based on using MOOCs supplemented by classroom
sessions with local instructors.
It should be remembered that MOOCs are in a very early stage of evolution.
Colleges and universities are just starting to gain experience with them, and
research on how they operate and their effects on learning is just now under
way (see the MOOC Research Hub at www.moocresearch.com/).
To gain another perspective on where MOOCs may go in the future, we return to Peter Norvig, Google’s director of research and one of the instructors of the Stanford Introduction to Artificial Intelligence MOOC that triggered so much attention. An idea about improving the MOOC experience came from Peter Norvig’s response when we asked him what he would do differently if he were offering another MOOC:
It should be remembered that despite all the media attention, MOOCs had a
relatively limited footprint in higher education in 2012. According to the
Babson Survey Research Group, only 2.6 percent of U.S. higher education
institutions were offering MOOCs that year and only another 9.4 percent
described themselves as engaged in planning to do so (Allen & Seaman, 2013).
Moreover, the vast majority of MOOC participants were not enrolled students
taking the course for credit.
Evidence of Effectiveness
With all this activity around online courses in higher education, a natural
question is how well do students learn from these courses? Are cohort-based
MOOCs using a broadcast model more or less effective than mastery-based
learning systems? How important is it to have a face-to-face component
supplementing learning activities that occur online?
Unfortunately, we do not yet have a straightforward answer to any of these
questions. The meta-analysis we conducted for the U.S. Department of
Education suggested that blended approaches typically have an advantage over
conventional face-to-face instruction while purely online courses produce
equivalent learning outcomes. However, that meta-analysis did not include any
courses developed after 2008.
More recently, we had the opportunity to examine outcomes for a variety of online learning interventions funded by the Bill & Melinda Gates Foundation through the Next Generation Learning Challenges (NGLC) initiative. The first wave of grants awarded under this initiative went to interventions designed to improve student outcomes by applying one or more of four technology-based strategies: blended learning, learning analytics, open courseware for a core academic course, or experiences to promote deeper learning. The rationale behind the grant-making was that the “pockets of innovation” in colleges and universities typically do not spread to improve outcomes at scale, in part because of a “not invented here” attitude on the part of higher education institutions and in part because the inventors of new approaches lack funding and prior experience in bringing innovations to scale. The question the foundation asked us to address as evaluators for this set of grants was whether or not the student outcomes achieved on the campus originally developing the innovation, or wherever it had first been used, could be replicated on multiple campuses with larger numbers of students.
For a variety of reasons, few of the NGLC projects set up random-assignment
experiments to provide a rigorous test of the effectiveness of their online
learning innovations. In some cases, technology-based and conventional
versions of a course ran concurrently so that student outcomes could be
compared. More often, the best data available were course outcomes for the
newly designed course compared to those for the same course taught in prior
years to different student cohorts (and not necessarily by the same instructor).
Hence, the individual projects’ reported outcomes could easily be confounded
by differences between the students taking the two versions of the course or by
conditions contrasted by Figlio et al. were two versions of a course with the only difference being whether the lectures were watched over the Internet or in the classroom. Few technology proponents would expect a difference in learning outcomes for such a weak manipulation, and indeed there were no statistically significant differences between the two groups in terms of performance on course examinations. Figlio et al. went on to examine differences between the two conditions for different student subgroups defined by race, gender, and prior college grade point average. They found that male students, Hispanics, and students with below-average grade point averages did better if they were assigned to the live-lecture condition than they did in the online lecture condition. The authors conjecture that Hispanic students may still be learning English and consequently have some difficulty understanding the videotaped lectures and that males and lower-achieving students may be more prone to procrastination and last-minute cramming when given the option of listening to lectures whenever they want. In discussing their findings, Figlio et al. strike a warning about higher education’s rush to embrace online learning:
Our strongest findings in favor of live instruction are for the relatively low-achieving students, male students, and Hispanic students. These are precisely the students who are more likely to populate the less selective universities and community colleges. These students may well be disadvantaged by the movement to online education and, to the extent that it is the less selective institutions and community colleges that are most fully embracing online education, inadvertently they may be harming a significant portion of their student body.
(Figlio, Rush, & Yin, 2010, p. 21)
Course completion rate comparisons will be fairer when students in MOOC and conventional versions of the same course all come from the same population and all take the course for credit.
Early results leaked to the press from SJSU’s pilot Udacity MOOCs are not
encouraging (Fujimoto & Cara, 2013). The percentage of SJSU students
earning course credit with a C or better in the spring 2013 MOOCs was
44 percent in college algebra and 51 percent in statistics (compared with usual
rates of 74 percent in classroom versions of both courses). In the developmental
mathematics course for students who had already failed the course the prior
semester, the results were even more discouraging: only 29 percent of SJSU
students earned the required C or better in this course, compared with its usual
completion rate of 80 percent when taught as a classroom-based mastery
learning course with online math practice and assessment software.6
Notes
1 The same series of surveys suggests that university faculty are not necessarily buying in. As late as 2012, only 30 percent of chief academic officers reported that their faculty see the value and legitimacy of online learning (Allen & Seaman, 2013).
2 This interest in online education as a potentially lucrative business opportunity is not limited to the U.S. In Brazil, the for-profit distance learning company Anhanguera is valued at $1.4 billion (Barber, Donnelly, & Rizvi, 2013, p. 34).
3 Subsequently, she was reinstated after strong demonstrations of support from faculty, alumni, and students, some of whom spray-painted “GREED” on the columns of a campus building (Webley, 2012).
4 The Carnegie unit for measuring college credit, which is based on the number of
hours spent in the classroom, has come under increasing criticism in recent years. In
December 2012 the Carnegie Foundation for the Advancement of Teaching
announced that it had received funding from the Hewlett Foundation to study the
past, present, and future role of the Carnegie unit in education. In a press release the
foundation states “as expectations for schools and students have risen dramatically
and technology has revealed the potential of personalized learning, the Carnegie
Foundation now believes it is time to consider how a revised unit, based on
competency rather than time, could improve teaching and learning in high schools,
colleges, and universities.”
5 See http://www4.nau.edu/insidenau/bumps/2012/7_9_12/pl.html (accessed May 26,
2013).
6 In the two mathematics MOOCs that were also taken by students outside of San Jose State for a whole range of purposes, the completion rate for non-San Jose State students was just 12 percent (Fujimoto & Cara, 2013).
Chapter 4
Interest-Driven Learning
Online
Collins and Halverson predict that online learning will one day displace formal schooling. In the new era they envision, learning will happen in multiple venues, be computer-mediated, and focus on generic skills such as problem solving, communicating in multiple media, and being able to find learning resources when you need them, rather than on having a standard body of knowledge in your head. Regardless of whether or not formal schooling gets displaced as Collins and Halverson predict, there is ample evidence that formal schooling and training programs are not the only learning game in town.
Increasingly, policy documents are acknowledging the importance of learning outside of school and throughout the lifespan. The most recent National Education Technology Plan (NETP) from the U.S. Department of Education (2010b), for example, explicitly sets the nation’s goal as providing the infrastructure and opportunities to learn anywhere, anytime, and throughout the lifespan—something much broader than the goal of providing technology in K-12 schools that was the focus of earlier NETPs developed in 2000 and 2008. A recent report by the OECD calls for the recognition of competencies gained through non-formal and informal learning (Werquin, 2010).
Several commentators (Conner, 1997–2013; Sefton-Green, 2010) have suggested placing informal learning into the conceptual landscape defined by two dimensions—one dealing with the extent to which the learning is planned and structured and the other dealing with the setting. Sefton-Green (2004) describes a continuum of settings with formal settings such as schools or training programs on one end, through intermediate categories such as museums, clubs, and camps, to the totally informal settings we think of as everyday life, in which we interact with family, friends, and passers-by. His other dimension involves content, with organized curriculum at one end of the continuum and “casual” learning at the other, as illustrated in Figure 4.1.
In contrast to the last chapter and Chapters 5–8, which deal with formal
settings and planned instruction, this chapter focuses on settings outside of
school and on types of learning that are voluntary and often loosely structured.
We provide examples of self-initiated learning occurring outside of classrooms,
organizing their presentation according to the primary motivation driving the
learning activity. We then discuss what we see as an emerging trend to try to
connect school and informal learning activities in order to leverage the interest
and persistence that learners show in informal settings for school-based
learning. This discussion is followed by a description of research studies that
have examined the learning outcomes of self-initiated informal learning
activities and a cautionary note on applying research conclusions drawn from
studies of informal learning to the design of formal learning environments.
Finally, we take up the question of whether the “digital divide” is greater in
self-directed learning than in school-based technology-based learning.
[Figure 4.1: Examples of learning activities arrayed along two dimensions—setting (formal to informal) and content (formal curriculum to casual). Examples plotted include formal school curriculum, athletic training, science center exhibits, afterschool clubs, educational games in the classroom, and multi-user virtual environments.]
play called Breezy Point was written in 1898 by one of America’s first woman playwrights and netted a copy of the Breezy Point script, scanned by Google and available from the Harvard University library.2
Such situation-driven learning serves no obvious practical purpose but clearly enriches our lives. Through such activities, we gain a fuller appreciation of the world and, through the accumulation of such experiences over time, we can accrue social capital and an expanded sense of our own knowledge resources. As Renninger (2000) points out, interest-driven learning combines the affective and the cognitive aspects of our nature.
Learning as Entertainment
When you ask a young child to explain which things are “school” and which are “fun,” the answer you get is essentially that “fun things are what I do that nobody makes me do.” By this definition, self-initiated learning is always fun. Our category of learning as entertainment is a bit narrower, however. In this category we place learning activities undertaken without the expectation that there will be any tangible short- or long-term benefit other than pleasure. Important subcategories of online learning for entertainment are online games; Web 2.0 opportunities to create and share content; and the online opportunities provided by museums, civic groups, and clubs.
Because Web 2.0 tools allow people without programming skills to create and post Internet content, information and learning resources are no longer limited by the number of professional content producers or their judgment about what will attract a significant audience. As a consequence, our ability to become knowledgeable about very specialized topics that interest us has been magnified exponentially.
In addition to its role as a source of information, the Internet can provide models of others who are expert in the topic of interest and enable the learner to communicate with distant experts. Web 2.0 tools allow us to go beyond receiving interesting information on the Internet to becoming producers of content that reflects our interests and creativity. As Gee (2013) explains, “Digital tools . . . allow people to collaboratively engage in their own designs, critique, and discussions of news, games, media, science, policy, health, civic participation, or any other domain or topic one can imagine” (p. 8).
As people use the Internet to pursue information and communicate about
these topics they become more and more knowledgeable about them. Over
time, such self-initiated learning can lead to real expertise. A number of
scholars are examining the role of online learning activities in developing
expertise that becomes incorporated into the learner’s sense of identity. In their
book A New Culture of Learning (2011), Thomas and Seely Brown write about
three kinds of learning: “learning about” and “learning to do,” which are the
focus of formal schooling and training, and “learning to be,” which can occur
when learning for fun leads to sustained engagement. The ethnographic work
of Mimi Ito and her colleagues (2009) illustrates how out-of-school experiences
involving use of technology can foster what Thomas and Seely Brown call
“learning to be.” Ito and colleagues offer numerous examples of how an
accumulation of knowledge about a given area, such as Japanese anime, can
become an important part of a young person’s emerging identity, in turn
creating the motivation for more learning. Barron’s (2006) studies of young people’s interest-driven learning emphasize the connections between online and offline learning activities as well as the role of family and peers in creating opportunities for building knowledge over time.
doing so, even paying real money to obtain virtual artifacts that will enhance
their online game experience.
Gee (2009) describes the essential attributes of games and their connection to learning: “Digital games are, at heart, problem solving spaces that use continual learning and provide pathways to mastery through entertainment and pleasure” (p. 67). The most obvious learning that occurs is simply how to play the game, but games may also call on learning content (e.g., Trivial Pursuit), skills, values, and conceptual content. Commercial games like Civilization and Sim City can be thought of as having “serious” subject matter (history, culture, and the environment). But regardless of its topical focus, gaming involves figuring out rules that can be used to one’s advantage to accomplish goals that one is ‘personally and emotionally’ attached to. Gaming is always about problem solving, but unlike school assignments targeting problem solving skills, in gaming the problem solving is ‘integrated with self interest.’ Games provide a sense of “microcontrol” by giving the player power over movement and actions of an avatar or other game components at a fine-grained level of detail, leading to a sense of power and “embodied cognition.” The player feels that his body is part of the virtual world of the game. Well-designed games lead players to induce the game’s patterns and rules in order to prepare themselves for more advanced levels of the game. Games capitalize on the learning principles of ample practice and immediate feedback.
One of the best-known examples of an online game that attracts sustained engagement and serves as a “passionate affinity space” is World of Warcraft, a multiplayer game that 10 million people pay subscription fees to play. Players choose characteristics of their avatar and explore a complex world in which they can learn trades, slay monsters, and acquire possessions, working either singly or in teams (Steinkuehler & Duncan, 2008).
Online learning can support better living by helping people learn how to maintain their homes or gardens, find recipes or learn advanced cooking techniques, acquire how-to tips or language skills for foreign travel, and understand and improve their own health.
The Web site videojug was designed for this kind of learning; it encourages
us to “get good at life” by taking advantage of its 60,000 free videos and guides
in areas spanning beauty and style, family and careers, do-it-yourself and
home, sports and outdoor, and technology and cars.
The Duolingo website and associated smartphone applications offer free
language instruction in Spanish, English, French, German, Portuguese, and
Italian.3 Duolingo was founded by CMU professor Luis von Ahn and one of his
graduate students, using funds from von Ahn’s MacArthur “genius award” and
a grant from the National Science Foundation. Duolingo reported achieving
3 million users by May 2013—just 11 months after its launch in June 2012.
For cooks (or gourmets), there is Epicurious, a Web site with recipes,
interviews with famous chefs, and articles on food preparation and restaurants
that was launched by Condé Nast in 1996. In 2009 Epicurious released a
mobile app. The Condé Nast Web site stated in July 2013 that 8.6 million
people use Epicurious every month and that the Epicurious app has been
downloaded 6.9 million times. That is a lot more people than ever attended the
Cordon Bleu cooking school!
But cooking Web sites have nothing on those devoted to health. The Yahoo!
Health Web site receives an estimated 21.5 million unique visitors a month,
and the National Institutes of Health and Web MD are not far behind with
20 million and 19.5 million monthly users, respectively.
Although the aforementioned websites were produced professionally, there are even more better-living resources offered online by volunteer enthusiasts. By making it possible for people without high levels of technical skill to be developers of Internet content, Web 2.0 has released a flood of person-to-person instruction and advice giving, much of it provided with little or no monetary exchange.
Often this learning is done in a just-in-time fashion. Your fancy European
dishwasher is not draining properly. You know the repairman will charge
hundreds of dollars just to come look at it. You look online and learn about a
common problem with this brand. There is even a link to a YouTube video
demonstrating how to repair it yourself. More and more people are turning to
the increased access to information, demonstrations, and encouragement
available online to learn how to do new things. In contrast to school-based
learning in which the learning objectives are typically set by someone else
and the learner may not see the relevance of the learning content, this kind of
just-in-time learning is self-initiated and of obvious value to the learner.
One popular Web site, Skillshare, brings together people who have a
skill they are willing to teach online and learners who are willing to pay
a modest fee (say $20) to take a skills class. Featured classes in July 2013
included “Meatball Making with the Meatball Shop,” “HTML and CSS
from Scratch,” “Rock Poster Design,” and “Illustrating Your Favorite
Runway Looks.” People who sign up for classes watch video lessons, produce
a project, and receive feedback from peers. A learner can start any time
after a “class” has launched and has ongoing access to the class resources with
no end date.
The Instructables Web site appeals to the “maker culture,” providing a place
where people can share photographs and instructions on how to make all kinds
of things from a radio-controlled World War I tank to homemade sparklers to a
decorative wall vent cover.
And finally, for those who are not sure what they need to learn to improve their lives, the Marc and Angel Hack Life site offers 50 Things Everyone Should Know How to Do (www.marcandangel.com/2008/06/02/50-things-everyone-should-know-how-to-do/).4
with realistic business and networking scenarios and challenges them to make
decisions and complete projects for virtual clients.
Students who complete a Cisco Academy course and pass a proctored examination receive industry certification. With 10,000 academies in 165 countries, the Cisco Networking Academy has provided learning opportunities to over 4 million students.5 Cisco has reported that two-thirds of 1,500 alumni of the Cisco Academy responding to a survey said they had found at least one job as a direct result of their Cisco training, and 20 percent said they had obtained a higher-level job than they could have gotten without the training (Littlefield, n.d.).
Oracle Academy offers a similar program but with more emphasis on “twenty-first-century skills.” Courses include Introduction to Computer Science, Advanced Computer Science, and Enterprise Business Applications. Microsoft Learning provides training to prepare for certification examinations for the full array of Microsoft products. Learners can take the training online with a virtual instructor.
Other players are now joining IT companies in providing online training in computer programming languages, Web design, and entrepreneurship. The MOOC company Udacity was founded in 2012 with an emphasis on cutting-edge computer science skills and the precept that its courses, while not earning credits from an education institution, would be attractive to employers. When he announced that he was leaving his Stanford professorship to found Udacity, Sebastian Thrun described his motivation: “My real goal is to invent an education platform that has high quality to it . . . that enables students . . . to be empowered to find better jobs” (NPR interview quoted in Fast Company, 2012).
After co-teaching the MOOC on Artificial Intelligence at Stanford, Thrun selected advanced topics for the first two Udacity courses: how to build a search engine and how to program a self-driving car. In October 2012 Udacity’s course coverage was described as computer science, mathematics, general sciences, programming, and entrepreneurship, all areas with economic value in the workplace. Thrun argues that technology is moving so fast that universities cannot keep up with workforce demands because their faculty do not have the latest technical skills.
Udacity’s original business model called for offering its MOOCs for free and obtaining revenue by charging modest fees for taking certified examinations and accepting referral fees from employers eager to hire the best of the MOOC graduates. In an April 2013 interview with PandoDaily, Sebastian Thrun explained that companies such as Google, AutoDesk, and Microsoft were funding the development of Udacity courses to teach skills, such as HTML5 (the markup language underlying mobile Web applications), that are in short supply in the labor market (Lacy, 2013). Although Udacity has developed classes in subjects as far afield from computer science as
The teachers are good, but they can’t go at a pace that’s . . . like perfect for
everyone . . . I like the concept of knowing something in class but actually
going back [using Khan Academy videos] and pushing pause or rewind
and actually getting a deeper understanding of it.
Worldwide, many parents seek private tutoring to help their children keep up and excel in school. A 2011 report by Global Industry Analysts, Inc. predicted that the private tutoring market would exceed $100 billion by the year 2017. Many students, especially teens, prefer to receive their tutoring online. The largest provider of online private tutoring, Tutor.com, acts as a marketplace where independent tutors can offer their online services. Although tutors from all over the world are now coaching students online, the online tutoring market has been a poster child for the “flat world” global economy. In high-wage countries, such as the U.S. and the U.K., the price of a private tutor is beyond
the reach of many families. Entrepreneurs figured out that by providing private tutoring over the Internet, a company could offer much better rates by using tutors from a country with lower wages. As an English-speaking country with a large educated population, India quickly became the source of a large proportion of the online private tutoring for the U.S. and U.K. markets. Growing Stars, founded in 2000 with a base in Kochi, India, employs a set of online tutors, all of whom hold a master’s degree in the subject being tutored as well as a teaching credential. TutorVista.com, based in Bangalore, India, offers unlimited tutoring in 30 subjects for a flat monthly fee.
Homework Assistance
Another way that learners initiate learning to support their success in school is by seeking online help for completing their homework. HippoCampus is a website devoted to homework help resources for elementary and secondary school students. HippoCampus provides access to collections of open education resources, such as the Khan Academy videos and exercises, and the PhET science simulations. Students who are having difficulty with a homework topic can use online resources related to that topic to sharpen their understanding.
For those looking for human interaction rather than course-related resources,
the Internet can give students an alternative peer community or study group.
OpenStudy, a company started in 2010 as a spinoff of the Georgia Institute of
Technology and Emory University, leverages social networking to support
learners working with online courses, such as those of the OCW Consortium.
The site is free to users who are matched up with others working on the same
course in a kind of virtual, global study group. The idea is that by assembling
large groups of students from around the world, there will be a peer available
to help you no matter what time of day or night you are studying. When
users have a question, OpenStudy sends it to the appropriate user group.
Users can earn OpenStudy badges by providing helpful answers to other
students. By 2013, OpenStudy was reporting over 500,000 unique visitors to
its site per month.
Clubhouse participants attend the program at least once a week, and half of
them attend every school day (Gallagher, Michalchik, & Emery, 2006).
Another prominent example of a third space using online learning is the Digital Youth Network (DYN) in Chicago. DYN offers two-hour weekly afterschool sessions for students in Chicago middle and high schools, during which students are challenged and supported in creating digital media products that reflect their character and surroundings. DYN mentors help students acquire skills in creating their own movies, music videos, and commercials. An important component of DYN is Remix World, a private social learning network on which DYN students from all the partner schools can share their products, comment on each other’s work, and participate in blogs and discussion threads (Kennedy Martin et al., 2009).
The advantages of third spaces as sites for learning that prepare young people for the interdisciplinary thinking needed to address issues of the twenty-first century were highlighted in research by DeVane, Durga, and Squire (2009). They chose to study what students learn playing the game Civilization in an afterschool setting because academic classes in specific subjects do not combine ecological, economic, and political concepts in the way that the game does. Science teachers likely would view time students spend working with economic and political concepts as a diversion, and economics teachers would be apt to feel the same way about devoting class time to biology. Those who design learning activities for third spaces can focus on what will interest and foster intellectual development in their students rather than covering a broad set of mandated curriculum content.
At the same time, third spaces can have drawbacks as learning sites if their activities are not well designed and implemented. Many afterschool settings are staffed by a combination of volunteers and low-paid staff with no training on how to select or design learning activities or how to guide students in ways that support their learning as they engage in those activities (Lundh et al., 2013). Some of the early research on student learning with educational computer games found that students often play the games in ways that circumvent the educational content, preferring to use the game for entertainment, for example, by purposely answering questions incorrectly if a wrong response leads to an amusing explosion.
For people who go online to satisfy their curiosity, whether they find the information they are looking for may be a more suitable measure of effectiveness than any knowledge test. The number of times that people turning to online activities as a source of entertainment return, the amount of time they spend on each visit, and their willingness to pay money for the experience all attest that the online experiences are fulfilling their goals.
Other types of informal online learning, such as those designed to improve our lives, have goals whose attainment we might be able to measure, but in most cases, the difficulty of obtaining the data would outweigh the benefit to be derived. We do not know how many people have learned how to do home repair tasks from the Internet, for example, or how competent they have been when they attempted those tasks. But if people keep returning to the Internet to get home repair tips, they are “voting with their clicks,” and for many situations, online usage rates and reviews are reasonable evidence of effectiveness from the consumer’s standpoint (U.S. Department of Education, 2013).
In the area of online learning to support professional advancement, providers
are more inclined to publicize testimonials from individuals who obtained
good jobs than to publish overall employment statistics for those who used
their materials and services. And even if they do publish statistics, there is no
way to know what percentage of the individuals would have got the same kinds
of jobs without the online learning experience.
In contrast to formal education, most informal learning activities of the sorts
described in this chapter do not receive public funding, and hence there is less
pressure to provide evaluation findings. Exceptions are the non-formal envi-
ronments such as museums, planetariums, and afterschool programs, which
may receive government funding to support their educational missions. For a
long time the value of such activities was largely accepted as a given, and pro-
grams presented attendance rates as evidence of their worth. More recently,
cash-strapped governments and philanthropic organizations have started
demanding evidence that people are not just engaging in activities in these set-
tings but that they are learning something from them, and providing this kind
of evidence is a challenge very much on the minds of the nonprofit organiza-
tions providing publicly and philanthropically funded informal learning
experiences.
Organizations that support informal learning opportunities point to the basic
incompatibility between the nature of informal learning and the requirements
of rigorous educational research designs. Controlled educational studies
require samples of learners experiencing contrasting learning treatments, each
of which is defined and implemented in a standard fashion and all of which
are intended to produce the same learning outcomes. The specification of outcomes
and treatments by the researcher, as required by experimental research designs,
is antithetical to the self-initiated and emergent qualities of informal learning.
Moreover, controlled studies require that all participants take before and after
tests. Studies of informal online learning face additional measurement challenges:
• For many of the kinds of learning outcomes that researchers want to study
and that designed learning environments hope to foster, there are no
existing outcome measures with known validity and reliability.
• The individuals for whom data are available are often not representative of
the entire population using the online learning resources.
• By definition, people engage in self-initiated learning only as long as they
want to, and this may not be enough time to accrue any learning benefits,
and especially not the kinds of competencies that have been observed for
the most active users and participants in online communities.
Placement score norms for semesters 2 through 4 of college Spanish were well
established for this test and could be used in interpreting the progress made by
learners using Duolingo.
A notice on the Duolingo website, made visible to users who registered for
Spanish instruction, offered a $20 gift certificate to qualifying learners willing
to take a Spanish placement examination before starting their Duolingo study
and then again after two months. The invitation page was viewed by 727
people, of whom 556 completed a short required survey describing their
backgrounds and reason for wanting to study Spanish. The characteristics of
this group of volunteers are described in the top row of Table 4.2.
Table 4.2 also describes the researchers’ subsequent steps to remove
volunteers who would not be appropriate for the study and to draw a random
sample of sufficient size to determine learning effects.
Study participants were urged to spend at least 30 hours on the Duolingo
website during the two months the study was being conducted and received
weekly notices of the time they had spent.
At the end of the eight-week study period, the researchers examined the
website’s backend usage data for the study participants. Only a quarter of them
had spent the recommended 30 or more hours on the Duolingo website. People
who had spent less than two hours on Duolingo over the eight-week period
were eliminated from the data set on the grounds that they had not made a
serious effort to participate. Even this very low bar (two hours out of the
recommended 30) resulted in removing 115 people from the study sample,
illustrating the third challenge listed above: the difficulty of obtaining samples
with enough exposure to the self-initiated online learning to have reason to
expect significant learning impacts.
As shown in Table 4.2, the final group of 88 people whose assessment results
were analyzed by Vesselinov and Grego differed from those who initially
volunteered (as well as from the general population), illustrating the issue of
representativeness.
The average amount of time study participants had spent with Duolingo was
22 hours across the eight-week study period, but the range of study times was
fairly broad. A fourth of the participants spent between two and eight hours;
another fourth spent 30 hours or more, with one individual putting in 133 hours.
Vesselinov and Grego argue that given the vastly different amounts of time
spent by these self-initiated learners, the appropriate learning outcome is assess-
ment score gain corrected for time spent. Accordingly, they computed the number
of points gained on the placement examination per hour spent with Duolingo.
Using this calculation, they estimated that the average person knowing no Spanish
could make the equivalent of a semester’s worth of progress in the language by
spending 34 hours using Duolingo.
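The time-normalized outcome Vesselinov and Grego used can be expressed as a simple calculation. The sketch below shows points gained per hour and the implied hours needed for a semester's worth of progress; all of the score and time values, including the 60-point semester threshold, are hypothetical placeholders rather than the study's actual data or the placement test's actual scale.

```python
# Illustrative sketch of the study's time-normalized outcome measure:
# placement-exam points gained per hour of use. All numbers below are
# hypothetical, not Vesselinov and Grego's data.

def points_per_hour(pretest, posttest, hours):
    """Exam points gained per hour spent on the system."""
    return (posttest - pretest) / hours

# Hypothetical participant records: (pretest, posttest, hours of use).
records = [(10, 55, 22.0), (5, 20, 4.5), (0, 90, 30.0)]

gains = [points_per_hour(pre, post, h) for pre, post, h in records]
avg_gain_per_hour = sum(gains) / len(gains)

# Assumed number of placement points equal to one college semester.
SEMESTER_POINTS = 60
hours_per_semester = SEMESTER_POINTS / avg_gain_per_hour

print(round(avg_gain_per_hour, 2), round(hours_per_semester, 1))
```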
Although this statistic seems impressive, it should be remembered that
despite being urged to spend 30 hours on the website, fewer than one out of eight
of the 203 randomly selected study participants actually spent this much time
over the two months the study was ongoing. Another way of looking at the
study findings is that 11 percent of the participants gained a semester or more
of Spanish placement from pretest to posttest.
The Duolingo study was conducted thoughtfully, and the organization is to
be commended for evaluating their product’s effectiveness and making the
data public. At the same time, the evaluation illustrated just how difficult it is
to do this kind of research well.
With home access to computers and the Internet slowly but steadily
increasing, policymakers may also believe that youth will learn whatever
they need to know about technology in home environments, under the
myth that all youth are digital natives . . . who can effortlessly absorb
advanced media skills on their own or from friends, thus making
community centers redundant. We hope that this review has demonstrated
the naiveté of such beliefs and the necessity of providing enhanced social
support, such as that offered in youth media programs, if we are to
seriously tackle inequity in use of technology and the outcomes associated
with such use.
(Warschauer & Matuchniak, 2010, p. 218)
With respect to digital divide trends, Gee (2009) goes even further, arguing that
the digital divide is growing, not shrinking, because those with greater literacy
skills and more access to supports for learning how to use new technologies are
obtaining larger and larger learning benefits not available to people of limited
means. He suggests that the most empowering aspect of digital participation
lies in the Web 2.0 capabilities to create or modify online content. Gee asserts
that young people from less-privileged backgrounds have less opportunity for
this kind of activity because of the combination of limited reading and writing
skills and lack of access to mentoring (Gee, 2009, p. 13). Similarly, Warschauer
and Matuchniak (2010) describe the importance of the “social envelope”
surrounding technology use—the support of peers and family members who
can serve as role models, mentors, and technical support for more advanced
uses of technology.
Warschauer (2012) goes even further, citing evidence that students who do
not have this supporting social envelope for using technology for learning tend
to use their computing devices and Internet access for activities such as playing
simple games or searching for celebrity sites, which may undermine rather
than enhance educational attainment.
These trends have motivated efforts to create “third spaces,” such as clubs
and community centers, with rich technology resources and the necessary
human supports for using them in ways that support learning. These efforts are
one strategy for providing increased online informal learning opportunities for
people who otherwise would not have them in the home or at school and who
lack the resources to take advantage of fee-charging venues. There is some
evidence that such a strategy can be effective. In one study, sixth-grade
Digital Youth Network (DYN) participants in Chicago described themselves as
fluent users of a greater variety of technology tools than did a sample of more
affluent Silicon Valley students in grades 6–8, with greater access to technology
at home and at school
(Barron et al., in press).
Interest-Driven Learning Online 97
Conclusion
In considering the shortage of rigorous quantitative research demonstrating
learning outcomes for self-initiated online learning activities, it should be
remembered that the kinds of outcomes typically measured in research studies
are beside the point for many of these activities. Many of these learning
activities are highly idiosyncratic, undertaken just in time to increase our
enjoyment of what we are viewing on an excursion or to satisfy a momentary
curiosity. Many of the more enduring informal learning activities, such as
participation in online communities with people with like interests or playing
multiplayer online games, are undertaken for pleasure. They are learning, but
the participants think of them as fun. Developers of these online communities
and games measure their success by the number of people who use their
system, the length of their engagement with the website, and the number of
repeat visits. If the goal is interest and enjoyment, these are surely relevant
measures of success.
Elsewhere we have argued for an expanded view of the research methods
appropriate for judging the quality of digital learning resources (Means &
Harris, 2013). The backend data available from digital systems provide a
window into an individual user’s pattern of interaction and can be analyzed to
reveal learning over time (U.S. Department of Education, 2013). Advances in
analyzing log file data will make it easier to obtain longitudinal measures of
learning on much larger samples of learners than it has been feasible to study
through qualitative case study work of the sort described by Barron (2006) and
Ito and her colleagues (2010).
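As a rough illustration of what such log-file analysis involves, the sketch below turns raw event records into a simple longitudinal measure: the number of distinct skills each user first mastered in each week of use. The field names and records are hypothetical, since backend schemas vary from system to system.

```python
# Minimal sketch (hypothetical schema) of deriving a longitudinal
# learning measure from backend event logs.

from collections import defaultdict

# Each log record: (user_id, week_of_use, skill, mastered?)
events = [
    ("u1", 1, "fractions", True),
    ("u1", 1, "fractions", True),   # repeated mastery events are ignored
    ("u1", 2, "decimals", True),
    ("u2", 1, "fractions", False),
    ("u2", 3, "fractions", True),
]

def mastery_trajectory(events):
    """Map each user to {week: number of distinct skills first mastered}."""
    first_mastered = {}  # (user, skill) -> earliest week showing mastery
    for user, week, skill, mastered in events:
        if mastered:
            key = (user, skill)
            first_mastered[key] = min(week, first_mastered.get(key, week))
    trajectories = defaultdict(lambda: defaultdict(int))
    for (user, _skill), week in first_mastered.items():
        trajectories[user][week] += 1
    return trajectories

traj = mastery_trajectory(events)
print(dict(traj["u1"]))  # one skill first mastered in week 1, one in week 2
```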
Because informal learning situations are not subject to the mandated
curricula, accountability pressures, and strict time schedules that characterize
schools, those who design online learning activities for these settings have
greater degrees of freedom than those who design learning products for
schools. We believe that this reduced set of constraints explains why some of
the most innovative uses of technology for learning have come from the field
of virtual gaming environments and other informal learning products. Because
people are choosing to use or not use these products and there are no serious
consequences for performing poorly or withdrawing from the activity, the
stakes are low. A person can try one of these online resources, particularly the
many free ones, and simply withdraw if it is not enjoyable or does not seem
worthwhile.
Self-initiated learning is powered by affective as well as cognitive engage-
ment. The element of choice and learner control is important for creating a
sense of empowerment. As we have noted above, informal learning enthusiasts
are promoting the idea that these elements can be brought into classrooms and
other formal learning settings. Although we endorse the effort, we think it is
important to point out that many things change once you make a learning
experience mandatory. When it is not self-chosen, it may not have the same emotional power.
Notes
1 We use the term “intentional” not to denote careful planning but rather in the
psychological sense as the antonym for “implicit learning.” Implicit learning occurs
without any deliberate effort to learn something. Implicit learning, which happens
throughout our waking hours as we read, hear, or observe, whether in the physical
world or through electronic media, is a major subject in its own right, and is beyond
the scope of this volume.
2 The play itself, full of outmoded social attitudes, is hardly impressive as literature.
3 Duolingo also crowd-sources language translation by combining translation activi-
ties with language learning. Although the language lessons are free and without
advertising, Duolingo charges for language translation.
4 The top 50 include building a fire, driving a stick shift, performing cardio-pulmonary
resuscitation, and—of course—using Google’s advanced search functions.
5 See http://www.cisco.com/web/learning/netacad/academy/index.html (accessed on
July 5, 2013).
6 Although Whyville science activities are designed to enhance academic learning, the
descriptions of how tweens spend their time in Whyville by Kafai and her colleagues
suggest that the environment’s appeal stems more from the opportunities it affords
for social interactions and identity exploration than from a desire for academic
learning. The researchers describe how Whyville users (70 percent of whom are girls)
establish an online presence, play with others by tossing projectiles back and forth,
exclude someone from some portion of the Whyville world by throwing many
projectiles at him, and flirt by throwing a heart or a kiss.
Chapter 5
Blending Teacher and Online Instruction in K-12 Schools
K-12 classrooms at all grade levels are incorporating online instruction into the
school day. With increased access to high-speed bandwidth, low-cost devices,
and a flood of new educational apps and open educational resources, schools
and teachers are looking for ways to leverage and integrate the best online
activities in their instruction. Over the last decade, districts and schools have
made significant investments in learning management systems to streamline
administrative functions, and many states and districts have set up online
virtual schools. Only recently have we seen signs of similar levels of
investment directed toward the use of online resources to augment classroom
instruction within traditional brick-and-mortar schools. The Web-enhanced
learning environments of charter schools such as Rocketship Education and
Carpe Diem, and those in clusters of innovation in public schools such as
New York City’s iZone schools and the League of Innovative Schools, are
receiving tremendous media coverage. They have become the new poster
children for the promise of technology in the classroom and, arguably,
represent the future of K-12 education.
In this chapter we explore prevalent and emerging models of the use of
online learning in K-12 classrooms, the role these models play in teaching and
learning, and practices that support more effective adoption of these models.
For our purposes, a simpler definition of blended learning will suffice: “the use
of online learning in conjunction with traditional teacher-led forms of
instruction” (if for no other reason than that we can remember it without
writing it down). All of the models and examples described in this chapter
qualify as “blended learning,” using our de¿nition.
Horn and Staker’s blended learning report also defines six different blended
learning models, later consolidated into four models with four different
variations on one of them—the “rotation model” (Staker & Horn, 2012). One
kind of rotation blended learning model has students moving across multiple
activity stations, at least one of which involves online learning, in their regular
classroom. In the lab rotation model, online instruction occurs in a computer or
learning lab, separate from the classroom where core, teacher-led instruction
occurs. The students’ regular teacher may lead lab activities or students may
Blending Teacher and Online Instruction in K-12 Schools 101
Like any set of categories used to describe educational practice, these are
imperfect, and it is easy to come up with examples addressing more than one
of these goals, as will be illustrated by the examples we use below.
By the end of the year, more than 96 percent of these students displayed
proficient or advanced literacy skills for students their age. Certainly multiple
factors within this KIPP school are likely to be contributing to this academic
success, but the school regards its blended instructional model and the small-
group instruction it makes possible as important factors.
Enabling teachers to spend more time interacting with individual students
and with small groups is also one of the ideas behind the notion of the “flipped
classroom.” Sal Khan made this concept popular by presenting it in a TED talk.
In the flipped classroom model, instead of focusing their classroom time on
lecturing or other forms of content presentation, teachers assign Web-based
videos to introduce new concepts or background knowledge as homework—
sometimes videos of themselves lecturing and sometimes videos from third-
party providers such as the Khan Academy. Teachers can then use class time to
work on deepening students’ understanding of the ideas and procedures in the
video lectures. The original concept was to have students practice applying the
concepts presented by the video in class, working on what traditionally might
have been homework, so that the teacher can observe where students struggle
and offer just-in-time individual assistance.
As the flipped classroom concept has spread and evolved, some teachers
prefer to use the class time freed up by having students view content presenta-
tions at home on follow-up discussions or on interactive activities or projects.
Attempts at flipping the classroom are still at the exploratory stage. But there
is a growing community of like-minded teachers teaching a wide range of
subject areas and grade levels who are experimenting with different ways to
organize their instruction by sending students to the Web for a good portion
of their learning and focusing their own interactions with students on the
development of deeper learning skills.
regular monitoring of student progress on the online system, and offering mini-
lessons or tutorials to small groups of students struggling with the same
concepts.
For the past two years, we have been studying the use of the Khan Academy
online learning system in 21 schools in the San Francisco Bay Area. The Khan
Academy started as a collection of a few hundred YouTube videos on a range
of math problem types created by Sal Khan himself. Over time, with extensive
philanthropic funding, it has developed into a free online digital learning
system incorporating more than 3,000 videos and hundreds of problem sets
covering the majority of K-12 mathematics topics.
As students work on Khan Academy problem sets, they receive immediate
feedback on the correctness of their answers and embedded supports, including
links to related videos and the ability to view the steps in the correct solution
path, to help them learn from their mistakes. While students are using Khan
Academy, teachers can monitor their progress online, seeing for each student
and each skill area the problem sets the student has attempted, the amount of
time spent on each, and whether or not each skill has been mastered.
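A minimal sketch of the kind of per-student, per-skill record such monitoring could be built on appears below. The mastery rule used here, five consecutive correct answers, is an assumption for illustration and not necessarily Khan Academy's actual criterion; the attempt data are likewise invented.

```python
# Hedged sketch (assumed mastery rule, hypothetical data) of a
# per-student, per-skill progress record behind a teacher dashboard.

from dataclasses import dataclass

STREAK_TO_MASTER = 5  # assumed rule: five consecutive correct answers

@dataclass
class SkillProgress:
    attempts: int = 0     # problems attempted in this skill area
    streak: int = 0       # current run of consecutive correct answers
    minutes: float = 0.0  # time spent on this skill

    def record(self, correct: bool, minutes: float) -> None:
        """Log one problem attempt with immediate right/wrong feedback."""
        self.attempts += 1
        self.minutes += minutes
        self.streak = self.streak + 1 if correct else 0

    @property
    def mastered(self) -> bool:
        return self.streak >= STREAK_TO_MASTER

# One student's attempts on a single skill (True = correct answer).
progress = SkillProgress()
for answer in [True, True, False, True, True, True, True, True]:
    progress.record(answer, minutes=1.5)

print(progress.attempts, progress.minutes, progress.mastered)
```

A dashboard like the one described would aggregate such records across every student and skill area, which is exactly the view the teachers in these examples use to form small groups.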
In Algebra Readiness, the class designed for the students entering ninth
grade with the least math preparation, the first semester is geared toward
making up all of the math learning that should have happened between
kindergarten and grade 8. The teacher relies on students’ performance on Khan
Academy problem sets to identify which skills each student does and does not
possess. Reports generated by the Khan Academy system identify those
students lacking the most basic skills, such as dividing double-digit numbers.
The teacher creates small groups to teach these very basic math skills while
other students work on Khan Academy exercises dealing with the more
advanced skills they are ready to master.
In the Algebra class, the teacher gives a daily lesson and then has students
work on Khan Academy problem sets related to the lesson’s topic. After
giving the lesson, the teacher distributes mini tablet computers, and students
work on the relevant Khan Academy exercises for the remainder of the period.
Students know that any unfinished exercises in the Khan Academy unit will
have to be done as homework to avoid detention.
Generally, teacher-led instruction lasts for about 20 minutes and practice on
Khan Academy consumes the other 20 minutes of the class period. These time
allocations, along with the structure of class, are highly consistent throughout
the school year, and have proved to be an efficient means to work through the
course content. The algebra teacher informed us that his two Algebra I classes
using this blended learning model moved through the content so efficiently that
they were able to advance far beyond the point reached by his previous algebra
classes, enabling him to spend significant time on later material, like geometry.
The third context for using Khan Academy at this school is the learning lab,
a mandatory second period of mathematics for all freshmen at this school. The
entire 40 minutes in learning lab are spent on computers. Students are given a
list of Khan Academy goals to complete for the week. When they enter the lab
space, their small laptops are already set up at their desks and they simply
check their “playlists” and begin independent work.
The advantage of mastery-based learning is that students work at their own
place in the curriculum and move at their own pace. Although it is theoretically
possible to implement mastery-based models without technology, advances in
online instructional programs and Web-based learning management tools
make it much easier to manage. It is also possible to incorporate other
information about students, such as their preferences for different instructional
formats, into blended learning models with mastery-based learning, as
described below.
The School of One model has been piloted in a number of New York middle
schools as a strategy for helping teachers differentiate instruction for students
with a range of achievement levels and different learning preferences.
During two back-to-back math periods each day, School of One students are
exposed to a variety of different instructional modes, including online learning,
guided by a “playlist” that is customized for each child. The playlists, accessed
by students when they log in to their assigned computer stations, show the
child the skill he or she will be working on that day and the type of instructional
activities he or she will engage in. The activities include a mix of teacher-led,
online, independent, and collaborative learning. Instruction takes place in a
large, open classroom with a student–teacher ratio of 10:1.
As students enter the room, large monitors around the classroom display the
opening activities and skills each student will work on to begin the day. For the
first 10 minutes of class, students check in with their homeroom math teacher
who is responsible for monitoring individual student progress in the program.
On any given day, a student may participate in a teacher-led large or small
group, an online live-tutorial, collaborative project work with other students,
and independent online or workbook practice. At the end of each daily
instructional block, students take an online assessment on the day’s skill; if
they pass, they move to the next skill in the sequence on the next day of class.
If they do not pass, the next day they receive a different mix of instructional
activities on their playlist to help them master the skill, sometimes including
live one-on-one online tutoring.
The School of One model requires teachers to change their role in the
classroom and the way they plan lessons. Teachers need to be able to provide
instruction across a variety of modalities and to learn to use technologies that
allow them to monitor and grade their students and help prepare their lessons.
School of One teachers are expected to deliver instruction to large and small
groups, facilitate peer learning activities, and support students as needed when
they are working independently. Each evening, teachers receive their
assignments for the following day, including the skills they will teach, the
students they will be teaching, whether students have been exposed to the
content before, and links to lesson plans from a variety of textbooks. Teachers
then modify their lessons based on the needs of the students. At any time,
teachers can log in to a portal to see which skills they will likely be teaching
over the next several days.
At the core of the School of One model is a computer-based learning
algorithm that creates each student’s daily playlist. The algorithm takes into
account the student’s prior academic performance and diagnostic assessment
scores, and the school’s available resources, including classroom space, teacher
time, and technology. Teachers can modify the playlists for individual students
based on their knowledge of the student and the instructional modes they
believe will be most effective for that student.
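In spirit, such an algorithm matches each student's next unmastered skill with whatever instructional resources remain available that day. The sketch below is speculative and much simplified; the mode names, priority order, and seat counts are all illustrative assumptions, not School of One's actual logic.

```python
# Speculative, simplified playlist scheduler in the spirit of the one
# described. Mode names, priorities, and capacities are assumptions.

MODE_PRIORITY = ["teacher_led", "online_tutorial", "collaborative", "independent"]

def next_skill(skill_sequence, mastered):
    """Return the first skill in the curriculum sequence not yet mastered."""
    for skill in skill_sequence:
        if skill not in mastered:
            return skill
    return None

def assign_mode(seats):
    """Pick the highest-priority mode with a seat left, consuming the seat."""
    for mode in MODE_PRIORITY:
        if seats.get(mode, 0) > 0:
            seats[mode] -= 1
            return mode
    return "independent"  # independent practice never runs out of room

def daily_playlist(student, skill_sequence, seats):
    skill = next_skill(skill_sequence, student["mastered"])
    return {"student": student["name"], "skill": skill,
            "mode": assign_mode(seats)}

# Hypothetical classroom state for one day.
seats = {"teacher_led": 1, "online_tutorial": 2}
students = [{"name": "Ana", "mastered": {"fractions"}},
            {"name": "Ben", "mastered": set()}]
sequence = ["fractions", "decimals", "percents"]

playlists = [daily_playlist(s, sequence, seats) for s in students]
print(playlists)
```

A teacher override, as described above, would simply replace the computed mode or skill for a given student before the playlist is published.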
A study published in June 2012 by the Research Alliance for New York City
Schools based at New York University examined the impacts of the adoption
of the School of One model in three middle schools during 2010–11, its first
year of implementation as a schoolwide math program (Cole, Kemple, &
Segeritz, 2012). The study found that overall there were no differences in math
achievement on state assessments between students experiencing the School of
One program and similar students attending other New York City public middle
schools. However, the study reports that effects varied by school, with one
school showing a positive effect, one no difference from math achievement in
other middle schools, and one a negative effect. We second the report authors’
caution against drawing any firm conclusions about the potential effectiveness
of the School of One model, given the early stage of its development and
implementation within these schools, and their recommendation that the
project increase its focus on implementation research to better understand
the sources of the variation in effectiveness across different schools.
from teachers. The level of conversation among students is high, more akin to
what one would expect in a high-school cafeteria than in a classroom. But if
you listen carefully, most of the conversation is about math or involves one
student helping another navigate the learning management system or use a
particular digital resource.
None of the usual markers of the conventional classroom are present in
Summit’s personalized learning space. Each day, in two different shifts, 200
students from mixed grade levels assemble for their math instruction
during a two-hour instructional block. At least one hour of this instruction is
student-directed “personalized learning time” guided by playlists created by
teachers and accessed by students through the school’s learning management
system.
Students are at different points in the curriculum from the very start of the
school year, based on their results on a diagnostic assessment administered at
the beginning of the year. At any one time, half of the students, 100 or so, are
working on their playlists, arranged in small groups of four to six students,
each with his own laptop and working at his own pace.
The playlists comprise multiple digital resources, among which Khan
Academy content is prominent, and cover specific skills such as calculating
the circumference and area of a circle or factoring polynomials. At the end
of each playlist, the student takes an assessment on the targeted skill that must
be passed before starting the next playlist in the curriculum sequence. In addi-
tion to Khan Academy videos and problem sets, the playlists include both free
open educational resources and subscription-based digital resources from other
sources as well as teacher-developed worksheets.
During personalized learning time, Summit students who are struggling to
learn a new concept or skill have several options for getting support. The
teachers and learning coaches encourage students to seek help from their peers,
and most of them do so. The din of students talking to each other about math is
the result. In addition, two “tutoring bars” are set up in the center of the large
personalized learning time space, and teachers and adult volunteers are there to
answer student questions.
Students’ progress within the self-directed curriculum is closely monitored
by learning coaches who receive nightly reports, generated by the learning
management system, showing the number of assessments each student has
successfully passed. Teachers work with any student who falls behind to create
a “back-on-track” plan that outlines the steps that she will take to catch up.
Teachers and learning coaches regularly monitor the progress of students with
back-on-track plans, using the nightly reports and, if necessary, scheduling
daily check-ins.
At any one time, half of the students are working through their playlists
in the personalized learning time space, while the other half are in adjoining
rooms
experiencing teacher-facilitated instruction. This classroom-based instruction
that their products were used in almost 20 percent of U.S. schools and that they
had millions of users a month.
fundamental to the school’s design and organization for instruction (as in the
cases of School of One and Summit Schools), an expedient for providing a
class not otherwise available (as in the grade 8 Algebra I example), or a set of
resources that individual teachers decide how to integrate into their practice (as
in the games and TELS examples).
made by the system’s adaptive algorithms. While teachers generally like the
adaptive nature of these programs that allow students to proceed at their own
pace, they still want greater control of the content presented to their students so
that they can integrate the use of the online resources with their classroom
lessons.
supplements the courses offered by the teachers it has trained with courses
offered by Connections Inc.
There is considerable variation from course to course within a virtual school,
but each of these three well-established virtual schools—CDLI, FLVS, and
VHS—has its own characteristic approach to instruction that shapes the online
student’s course experience. The vignettes in Figures 6.1 through 6.3 provide
portraits of the self-paced, asynchronous communication typical of FLVS, the
active role of the teacher in synchronous and asynchronous online activities in
VHS, and the synchronous instruction that typifies CDLI.
As the vignettes in Figures 6.1 and 6.2 illustrate, VHS and FLVS represent
very different instructional models. In designing their courses and training
their online instructors, FLVS started with a model with its roots in traditional
correspondence courses, while VHS sought to create online equivalents to
traditional classroom practices. FLVS uses a mastery learning approach, with
students advancing to the next module of a course only after demonstrating
mastery of the content of the current module, and allows students to start
a course at any time and finish it whenever all course objectives have
been mastered. In contrast, VHS is cohort-based, with the class of online
students progressing together and a focus on synchronous and asynchronous
discussion and peer interaction.
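The mastery-based progression described here can be made concrete with a short sketch. This is an illustration of the general mastery-learning idea, not FLVS's actual rules; the module names and the 80 percent threshold are assumptions.

```python
# Minimal sketch of a mastery-learning progression gate: a student is routed
# to the first module not yet mastered, and advances only after demonstrating
# mastery. The threshold and module names are illustrative assumptions.

MASTERY_THRESHOLD = 0.80  # assumed passing score, not an FLVS figure

def next_module(modules, scores):
    """Return the first module not yet mastered, or None if all are done.

    modules: ordered list of module names
    scores:  dict mapping module name -> best assessment score (0.0-1.0)
    """
    for module in modules:
        if scores.get(module, 0.0) < MASTERY_THRESHOLD:
            return module
    return None

course = ["Linear Equations", "Quadratic Equations", "Polynomials"]
progress = {"Linear Equations": 0.92, "Quadratic Equations": 0.55}

# The student stays on Quadratic Equations until mastery is shown.
print(next_module(course, progress))  # Quadratic Equations
```

A cohort-based model like VHS's, by contrast, would advance all students together on calendar dates regardless of individual scores.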
Until very recently, neither FLVS nor VHS offered high-school diplomas.
Students got credit for completing their online courses from their homeschool
Working at home, the student dons her headphones, logs onto the system, and
finds her place in her self-paced Algebra 1 course. A dashboard shows her which
modules she has completed and which she still has to do. The modules are
aligned with Florida’s curriculum standards and include problem solving as well
as learning concepts and procedures. Lessons include slideshow presentations
with associated audio, opportunities to practice, and formative and summative
assessments. The student views material in her current module, using the head-
phones for audio content. After completing an assessment that demonstrates her
knowledge of the content, she then calls up the next assignment, Quadratic
Equations, and begins viewing introductory multi-media materials. Finding part of
the assignment difficult to understand, the student sends a message to the
course instructor requesting a phone call for the next day. The teacher, who is
responsible for about 165 students taking the course, sends an email reply
accepting the suggested time for the call. The teacher plans to use the
opportunity not just to respond to the student’s questions about the assignment,
but also to perform one of the FLVS-required check-ins, discussing some
of the content the student has recently completed to make sure she really
understands it.
The VHS Bioethics course uses elements of self-pacing and classroom pacing. The
student logging on to the system clicks his online course and finds a photo of the
teacher and other students as well as the course syllabus, assignments, and
assignment due dates. Using online content and a textbook, the student reads
and works on the week’s assignment. This week the student is required to select
a recent news article about bioethics, write a synopsis of the article, and post
the synopsis online for other students and the teacher to see. For next week, the
student needs to review the synopses of articles posted by other students and
make comments on those articles. The teacher reviews the students’ online dis-
cussion, once in a while making a comment to correct a factual error or to press
for deeper thinking. The teacher also provides written feedback on each of the
20 students’ synopses.
Students sitting in a classroom log into the various online courses they are
taking. Several students taking French for English speakers enter their virtual
classroom where they have direct messaging and virtual hand-raising capabilities.
The other classmates and the teacher are online, working from their own
physical schools located throughout the province. The online French teacher
starts the class by asking, “Quel temps fait-il chez toi?” Some students respond
orally using their microphones. Others type in their answers using the direct
messaging tool. The teacher then presents slides about different kinds of weather
using the online whiteboard. She asks questions and presents new vocabulary.
Later, she divides her class of 25 into smaller groups, sending each group to
a different virtual room where they can interact with each other, practicing their
French discussion skills using the system’s audio capabilities and giving each
other feedback.
Figure 6.3 A Centre for Distance Learning and Innovation Course Experience
Source: Based on descriptions in Murphy and Coffin (2003).
online course providers, and there is rapid growth in enrollments on the demand
side. Watson et al. (2012) estimate that 275,000 U.S. K-12 students were
attending fully online schools in school year 2011–12. Others put the enrollment
figure somewhat lower, at 200,000 (Miron, Horvitz, & Gulosino, 2013).
Twenty-seven states have sponsored their own virtual schools, with active
programs including those of Alabama, Georgia, Michigan, and Montana in
addition to Florida’s FLVS described above.
The trend for states to set up their own virtual schools appears to have lost
steam in recent years, however. One explanation is the growing number of
private providers of K-12 online courses and high-school programs with a
national reach coupled with a growing number of programs offered by
individual school districts and consortia (Watson et al., 2012).
An example of the latter is the Riverside Unified School District, which
began offering online courses to increase access to courses required to qualify
for admission to California’s universities (the “A-G requirements”). Many
students, particularly those from rural, low-income, or immigrant backgrounds,
find that they have no chance for admission to a state university because the
required courses are not even offered at their high schools. This
concern led the district to start offering selected online courses in 2005. The
success of these courses led the district to create the Riverside Virtual School
in collaboration with Riverside’s neighboring districts in 2008. In addition to
making online courses available to their own students, the Riverside Virtual
School cooperating districts now make complete curricula for grades 3–12
available to homeschoolers.
A second explanation for the declining role of state virtual schools is the
severe cuts to state education budgets experienced after the end of the federal
stimulus package that followed the 2008 economic crisis. A number of states
lost funding or were severely underfunded for their virtual schools during this
period. For example, FLVS recently cut about a third of its workforce, mostly
teaching adjuncts, to accommodate a decline in year-to-year enrollments and a
state-legislated reduction in per-pupil expenditures for students enrolled part-
time in FLVS (Herald, 2013).
Another common strategy is for entities other than states and districts to
establish virtual schools as charter schools (Miron & Urschel, 2012; Watson
et al., 2012). State laws regarding charter schools vary markedly; some states
do not permit charter schools and some cap the number of charter schools or
the total charter school enrollment. These state regulations affect the level of
opportunity for online schools as well. States also regulate virtual schools
explicitly and may disallow them or cap the number of students who may
participate. By 2012, 31 states plus the District of Columbia permitted the
operation of fully online schools within their jurisdictions. However, a number
of states considering legislation to permit the operation of charter schools in
recent years decided to postpone any action until more information concerning
the effectiveness of these schools becomes available.
Table 6.1 U.S. Virtual School Student Enrollment in School Year 2010–11, by
Provider Type
Source: Miron, Horvitz, & Gulosino (2013) using data from the National Center for
Education Statistics.
Many programs of the latter type are run by school districts. In states where
districts receive per-student funding for online enrollments, retaining students
or luring back those who have dropped out with an online option is a way to
increase enrollment and hence funding. The National Center for Education
Statistics reports that 62 percent of U.S. school districts offer online courses
for credit recovery (Queen & Lewis, 2011). In addition, more and more school
districts have turned to online courses as a strategy for saving money by
replacing traditional summer school courses.
While it is very likely that students enroll in virtual schools with multiple
and distinct motivations, getting a clear handle on the proportion of students
with particular goals and intentions is not easy. Kim, Kim, and Karimi (2012)
assert that students “who choose online learning do so because they do not like
traditional schools’ structure and conventional learning materials.” In their
study of students’ experiences in FLVS Algebra I and English I, Bakia et al.
(2011) found another reason that students choose online courses—a significant
portion had failed the course previously and were taking it again to qualify for
graduation. In responding to criticism concerning their low graduation rates,
representatives of virtual schools in Colorado reported that in the early days of
these schools most enrollees “tended to have at least one involved parent and
homeschooling experience” but that their recent school populations included
more students turning to online schools as a last resort.
obstacles that online K-12 schools face are even more challenging. Every
state has its own set of laws and regulations concerning the conduct and
funding of education. In some states, students may be taught only by a teacher
with a credential from that state, undermining the potential economies
of scale that a national or regional online school could have. Some states
limit the number of students per course, and impose the same limits on
online courses as for face-to-face courses. In California, concern about
online schools “poaching” students (and hence the per-pupil aid that follows
them) from school districts led to a regulation allowing an online school to
serve students only from its own district and those districts geographically
contiguous, a requirement that has limited the growth of the Riverside
Virtual School.
Teachers’ unions, while not opposed to blended learning approaches, have
voiced their disapproval of virtual schools. The California Federation of
Teachers, for example, has stated that online learning should be used only in
cases where attending a brick-and-mortar school is not feasible. Further, it has
developed contract language, for its local affiliates to use when negotiating
their contracts with districts, with provisions to prevent a district from using
online learning to eliminate or consolidate faculty positions.
Students who took FLVS courses in mathematics or English also earned higher
scores on the corresponding portions of the state test than did students taking
these classes in the traditional format. Unfortunately, these analyses did not
control for pre-existing differences between the two sets of students or for the
higher rate of course withdrawal for FLVS.
A subsequent study by Bakia et al. (2011) attempted to address these
deficiencies in an examination of the performance of FLVS students in English
I and Algebra I (two of FLVS’s highest enrollment courses) relative to students
taking the same courses in traditional classrooms. Bakia et al. found that
students taking English and algebra through FLVS were as likely as those
taking these courses in regular classrooms to pass the course and that they
achieved higher scores on the corresponding portion of the state test. This same
pattern was found after controlling for student differences in prior achievement
and ethnicity.
On the negative side, a study of Colorado virtual schools by Hubbard and
Mitchell (2011) and a series of reports from the University of Colorado’s
National Education Policy Center (NEPC) raise concerns about the outcomes
students of virtual schools achieve. Reports on virtual schools produced by
NEPC include a study of schools operated by K12 Inc. (Miron & Urschel,
2012), a 2013 report on U.S. virtual schools in general (Miron, Horvitz, &
Gulosino, 2013), and several earlier think pieces (Glass, 2009; Glass &
Welner, 2011).
Glass and Welner (2011) expressed alarm at policies made to accommodate
online schools at a time when “Little or no research is yet available on the
outcomes of such full-time schooling” (p. i). Hubbard and Mitchell (2011)
examined records for 2,400 online Colorado students who had taken the state
achievement test the prior year. They reported that the likelihood of scoring
proficient for these students actually went down. (They do not explain that the
state’s test changes from grade to grade, so “proficiency” is not necessarily
equivalent in the two years.) Hubbard and Mitchell point out also that online
schools had among the weakest state rankings on graduation and dropout
rates—with a virtual school student being three times more likely than a regular
school student to drop out.
Defenders of Colorado’s virtual schools have argued that their institutions
were the last chance for many at-risk students, an assertion that may be true but
that state data on student poverty rates and achievement prior to entering
virtual schooling suggest cannot explain the entire graduation gap (Hubbard &
Mitchell, 2011). Another argument is that virtual schools are less likely than
brick-and-mortar schools to know when a student has switched to another
school and hence can be counted as a transfer rather than a dropout. The extent
of this problem is not known.
Another NEPC study looked at the 48 full-time virtual schools operated by
K12 Inc. in 2010–11. Miron and Urschel (2012) found that students in K12 Inc.
virtual schools had average test scores lower than the averages in the states
where they operate. The gap ranged from 2 to 11 percentage points in reading
and from 14 to 36 in mathematics. The NEPC analyses used school-level data
rather than analyzing effects for individual students and controlling for
differences in the prior achievement and educational histories of the students
being compared.
Similarly, an NEPC analysis of all U.S. virtual schools by Miron,
Horvitz, and Gulosino (2013) compared three years of data for virtual schools
with data for all U.S. public schools on whether or not the school made
adequate yearly progress, whether or not the school was deemed “academically
acceptable” by its state, and the school’s on-time graduation rate. The
researchers compared data for the two types of schools without considering
differences in the types of students served or the very large state differences
in adequate yearly progress criteria, the rigor of their assessments of
proficiency, or their graduation requirements. Nevertheless, the differences
between the rates for virtual schools and those for traditional brick-and-
mortar schools, all favoring the latter, are so large (for example, an on-time
graduation rate of 38 percent in virtual schools versus 79 percent in U.S.
high schools overall) that it is unlikely that rigorous controls for state
practices and assessments and student characteristics would eliminate the gaps
completely.
Miron, Horvitz, and Gulosino (2013) did not disaggregate their data by the
nature of the virtual school operator, making it difficult to say anything about
the relative effectiveness of virtual schools run by nonprofit organizations or
public agencies (such as states and districts). One thing we know from our own
experience is that virtual schooling providers often maintain very little data
about the prior educational experiences of the students who enroll in their
courses. This practice makes student-level longitudinal analyses for students
moving into and out of virtual schools of the sort conducted by Bakia et al.
(2011) very resource-intensive to conduct, as student data must be obtained
from both state education agencies and virtual school providers and then linked
through student-level matching that usually requires hand-checking to get an
acceptable match rate. Even when school districts or states offer virtual
school options, their student data systems may not distinguish between courses
taken online and those taken in regular classrooms, making comparative
analyses impossible. Under these circumstances, it is not surprising that
Education Week reports that only 16 percent of school districts have compared
student outcomes for their online courses to those for their conventional
courses (Ash, 2012).
The controversy over the effectiveness of virtual schools underscores
the need for better data systems that would permit tracking individual
students longitudinally and identifying the portions of their education program
taken online.
These authors note that the characteristics associated with success in an online
program are those that distinguish the adult learner from the less-mature
learner, and that online learning technologies and practices have their roots in
higher education, where learners are adults. They note the need for more
Because the disruptive innovation can offer a lower price point, Christensen
et al. argue that inevitably it will win the lion’s share of a transformed market.
Based on the quantitative model they developed studying disruptive innova-
tion in other industries, Christensen and his colleagues predicted in 2008 that
50 percent of all high-school courses in the U.S. would be taken online by the
year 2019 (Christensen, Horn, & Johnson, 2008). In 2011 they predicted that
half of all higher education course enrollments would be online by 2014
(Christensen et al., 2011).
Another tenet of Christensen’s theory is that disruptive innovations are
never led by a mainstream organization offering the traditional product or
service:
In this analysis, the independent virtual schools and online universities, many
of which are for-profit companies, are the potentially disruptive force in
education. Christensen et al. acknowledge the significant barriers that
regulatory restrictions can pose for disruptive innovations, but they argue
that innovators find ways to work around those barriers.
In higher education, accrediting agencies, which base their judgments of
quality largely on measures like the faculty-student ratio that are linked to the
amount spent per student, have posed challenges for online universities. The
fact that higher education funding in the U.S. is largely in the form of Pell
grants that are given to students who can use them at any accredited institution
means that online universities need accreditation to survive, but that once they
attain that accreditation, they can compete with traditional colleges and
universities for these students.
In K-12 education, state and district laws and regulations pose different
barriers and complexities in different states. Online providers and political
proponents of school choice have been lobbying for changes that will make it
easier for virtual schools to operate. And they have had some success. For
example, a number of states have passed legislation allowing students to take
online courses from a combination of different providers, making it easier for
a student to receive a high-school diploma on the basis of courses taken online
(Evergreen Education Group, 2013).
To make smart policies about virtual schools, education systems should
have good longitudinal data about their students’ achievement outcomes and
subsequent education. Unfortunately, the data systems in most states and
districts have not been structured in a way that permits rigorous analyses that
Credit Recovery
Credit recovery programs give students the opportunity to pass and receive
credit for a course they attempted previously without performing well enough
to earn credit toward graduation (Watson & Gemin, 2008). Most students who
drop out of high school failed one or more of the courses they attempted in
grade 9; most credit recovery efforts are aimed at helping students recoup their
lost credits and stay with their high-school class (Watson & Gemin, 2008).
Many districts are turning to online courses to support their credit recovery
efforts. Blackboard (2009) reports that more than 60 percent of the 4,000
district technology directors responding to a survey said they use a learning
management system for credit recovery. FLVS, highlighted in Chapter 6,
reports that a third of its course enrollments are for students who previously
attempted but failed the course they are taking online (Dessoff, 2009).
An example of a large-scale credit recovery program using online learning
is the Online Learning Program of the Los Angeles Unified School District.
The courses offered for credit recovery use an active online instructor and
course content segmented into chunks, permitting the differentiation of
instruction. Students can test out of portions of the course they have already
mastered from their first experience with it, and focus entirely on those chunks
they still need to work on. The district works with individual schools to set up
a course management system and Web conferencing so that courses can use
both synchronous and asynchronous communication. Online teachers often
start a unit with a diagnostic test followed by flexible combinations of group
work and individual tutoring for different students.
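The test-out mechanism in such credit recovery courses, in which a diagnostic score per chunk decides what a student must retake, can be sketched as follows. The chunk names and the 70 percent cutoff are hypothetical, chosen only to illustrate the idea.

```python
# Hedged sketch of "testing out" of already-mastered chunks: a diagnostic
# score per content chunk determines which chunks the student must retake.
# The cutoff and chunk names are illustrative assumptions.

TEST_OUT_CUTOFF = 0.70  # assumed diagnostic score needed to skip a chunk

def chunks_to_retake(diagnostic):
    """diagnostic: dict of chunk name -> diagnostic score (0.0-1.0)."""
    return [chunk for chunk, score in diagnostic.items()
            if score < TEST_OUT_CUTOFF]

diag = {"Fractions": 0.85, "Ratios": 0.40, "Percentages": 0.65}
print(chunks_to_retake(diag))  # ['Ratios', 'Percentages']
```

The student in this example skips Fractions entirely and focuses instruction time on the two chunks still below the cutoff.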
Oliver et al. (2009) surveyed students taking courses through the North
Carolina virtual schools and found some interesting differences between
responses of credit recovery students and students in general, honors, or AP
online courses. The credit recovery students were more likely than all the
others to express the opinion that they learned more online than in classroom-
based courses. They were also more positive about the instructional skills of
their online teachers. On the other hand, they were more likely to express a
lack of confidence that they were technically prepared to take a course online
(60 percent) and to report limitations in their access to Internet connectivity at
home and at school and to a computer at school (Oliver et al., 2009).
Alternative Education
While credit recovery programs are typically structured around specific courses
and may be of relatively short duration, alternative education programs focus
on taking the student all the way through to the diploma with a different
learning environment, and typically a greater network of support, than is found
in conventional classrooms and schools.
An example of a model for using blended learning in alternative education
is the Performance Learning Center program developed by Communities in
Schools, a nonprofit dropout prevention organization that first implemented the
program in Georgia in 2002 (Kronholtz, 2011). Communities in Schools
promotes this program for students who have performed poorly in regular
schools, and have poor attendance, low motivation, and challenges such as
poverty or pregnancy. The four performance learning centers operated in
Virginia reported that in 2009–10 a third of the students at these centers were
two or more years behind their grade level in terms of academic credits earned
(Kronholtz, 2011). Performance learning centers operate as small schools with
four or five teachers for fewer than 100 students, who work in four or five
classrooms, depending on the number of teachers. Most of the instruction is
carried out through online courses, but a teacher is present in each classroom
to act as a coach and answer questions.
The Bridge Program of the Salem-Keizer school district in Oregon is another
alternative school model combining online and classroom-based instruction.
Bridge Program students take one course at a time and attendance is mandatory
for the two hours a day they work in a computer classroom. Students are
expected to undertake additional work online from home or other offsite
locations, but program hours are flexible so students can stay in school while
also
working. The Bridge Program staff consists of two online teachers and two
assistants who work in the computer lab to provide on-site help and keep
students on track.
Student Outcomes
The research literature lacks rigorous experimental tests comparing student
outcomes for credit recovery and alternative education programs with and
without an online component, but districts are reporting numbers of students
recovering credits and receiving diplomas after participating in these programs
(Archambault et al., 2010; Kronholtz, 2011; Watson & Gemin, 2008). In many
cases, it is likely that districts would not be able to offer such programs at all if
they did not have the option of using online learning.
Recommended Practices
Based on their experience trying out and refining these programs, practitioners
have started to develop a consensus around important elements of programs
for students who have fallen behind grade-level expectations or for
other reasons are at risk of not completing their high-school program. These
include:
community colleges: in those settings about two out of three students are taking
an online course in any given academic term (Pearson Foundation, 2011).
Online instruction is attractive to community colleges for multiple reasons.
The motivation cited most often by academic officers is online learning’s
capability to make courses available to people with work or family
responsibilities that make attending scheduled sessions on campus difficult
(Parsad & Lewis,
2008). Other attractions of online courses are the potential for saving costs
and for obtaining greater consistency across different sections of a course.
out or fail to earn a C or better in their online courses than in their classroom-
based courses.
lower likelihood of successful completion than those taking the same courses
in a conventional classroom-based format. Researchers have considered
alternative sources for this poor performance in online courses: It may have
something to do with the characteristics or life circumstances of that portion of
the community college student population electing to take coursework online,
or it may stem from weaknesses in the online courses themselves.
Starting with the second of these, there is some indication that the
typical community college online courses, if those studied by the CCRC are
representative, were far from exemplary in the years before and shortly after
2005. Researchers at the CCRC conducted qualitative studies of 23 online
introductory academic courses at two of the Virginia community colleges
that were part of the analyses conducted by Xu and Jaggers (2011b), inter-
viewing students and course instructors as well as reviewing course
materials. They found that some of the so-called online courses consisted
solely of posting a course syllabus online and collecting assignments through
the course management system (Bork & Rucks-Ahidiana, 2012). Many
of the courses were primarily textbook driven and contained no multimedia
elements. In the language of academic understatement, Jaggers and Xu (2012)
sum up these qualitative data by stating that “many online courses in the
community college setting are not thoughtfully designed” (p. 1). Such
findings suggest that the research comparing outcomes for students in online
and classroom-based community college courses, while providing sobering
insights into what has been done in the past, tells us little about what could be
done with well-designed online or blended courses, a topic we will take up
later in this chapter.
Whether or not community colleges are offering high-quality online courses,
the characteristics and life circumstances of students enrolling in these courses
may undermine odds for success. In the opinion of experienced online
instructors, key factors in online learning success are time management skills,
facility with technology, initiative, and communication competency (Mandernach,
Donnelli, & Dailey-Hebert, 2005). Data-based inquiries using higher education
student data systems show that prior grade point average, number of online
courses taken previously, and successful prior completion of an online course
all predict successful online course completion (Bernard et al., 2004; Cheung
& Kan, 2002; Dupin-Bryant, 2004; Xu & Jaggers, 2011a).
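Predictive inquiries of this kind typically combine such variables into a logistic model of completion. The sketch below shows the shape of such a model using the three predictors named here; the coefficients are invented for illustration and are not drawn from any of the cited studies.

```python
# Hedged sketch: a logistic-style estimate of online course completion built
# from the predictors named in the text (prior GPA, number of prior online
# courses, prior successful online completion). All coefficients are
# hypothetical; a real model would be fit to institutional student data.
import math

def completion_probability(gpa, prior_online_courses, completed_one_before):
    # Invented weights, for illustration only.
    z = (-3.0
         + 0.9 * gpa
         + 0.2 * prior_online_courses
         + 0.8 * (1 if completed_one_before else 0))
    return 1 / (1 + math.exp(-z))  # logistic link

p = completion_probability(gpa=3.2, prior_online_courses=2,
                           completed_one_before=True)
print(f"{p:.2f}")
```

Whatever the fitted weights, the qualitative pattern reported in the literature is that each predictor raises the estimated probability of completion.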
The literature generally points to four categories of student characteristics
potentially accounting for community college students’ weak performance in
online courses:
Competing Priorities
The community college students who opt for online options are more likely
than their same-college peers in all classroom-based courses to have extensive
weekly work hours, family responsibilities, and financial aid (Hachey, Conway,
& Wladis, 2013; Hagedorn, 2010). Community colleges want to serve such
students—doing so, they say, is their primary motivation for offering online
course options—but these life circumstances also increase the risk of non-
completion (Conklin, 1997; Grimes & Antworth, 1996; Hachey, Conway, &
Wladis, 2013).
Several of the studies described above attempted to control for such life
circumstances in their analyses and still found poorer performance in online
courses (Carpenter, Brown, & Hickman, 2004; Xu & Jaggers, 2011a, 2011b).
Xu and Jaggers (2011b), for example, limited their study sample to students
who had enrolled in at least one online course between 2004 and 2008 and
compared their course completion rates for their online and classroom-based
courses. This strategy assures that we are looking at the same kinds of students
in the online and comparison courses—in fact, we are looking at the very same
students. Xu and Jaggers found that students had a higher completion rate for
their classroom-based courses than for their online courses. It could still be
argued, however, that even though Xu and Jaggers examined the results for the
same people in the two types of courses, personal circumstances may not have
been stable over a four-year period. Taking on a new job or care of a sick
family member is the kind of event that could lead a student to choose an
online course option and at the same time create a level of demand and stress
that would work against course completion.
they have learned if they know they will not be assessed on it again. This is one
of the flaws of some mastery learning systems, which assume that once a student
has earned a passing score on an assessment of some piece of content, that
learning will be retained and does not need to be reinforced or re-assessed.
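One simple remedy for this flaw, sketched here as a hypothetical extension rather than a feature of any system discussed in this chapter, is to re-queue mastered content for a later retention check. The 14-day interval is an arbitrary illustrative choice.

```python
# Hedged sketch: scheduling a retention re-check after a module is passed,
# the reinforcement step the text notes some mastery systems omit.
# The review interval is an arbitrary illustrative assumption.
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=14)  # assumed spacing, not a cited value

def schedule_review(passed_on):
    """Given the date a module was passed, return the retention-check date."""
    return passed_on + REVIEW_INTERVAL

print(schedule_review(date(2014, 3, 1)))  # 2014-03-15
```

A fuller design would lengthen the interval each time the re-check is passed, in the manner of spaced-repetition systems.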
problems. Several studies show that online students report greater enjoyment
and better learning in courses in which they feel they are interacting with “real
people” (Akyol et al., 2009; Swan & Shih, 2005). Liu, Gomez, and Yen (2009)
found that the more students sense “social presence” in their online course, the
more likely they are to finish the course and the higher their final grade.
goal is to identify any difficulties in time to help students recover and earn
course credit.
During the 2013–14 academic year, Rio Salado College began to pilot new
support services for students in its associate of arts and associate of general
studies degree programs. An entering student will be paired with a personal
advisor, who will work with her using the RioPACE analytics that track
the completion of course work and performance on assessments for all her
courses.
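An analytics-based early-warning check of the kind such dashboards perform can be sketched simply: flag a student whenever pace or performance falls below a threshold. The thresholds and field names below are assumptions for illustration, not RioPACE's actual logic.

```python
# Hedged sketch of an early-warning flag in the style of learning-analytics
# dashboards: a student is flagged for advisor follow-up when coursework
# completion pace or average assessment performance drops below a threshold.
# The cutoffs are illustrative assumptions.

def at_risk(completion_rate, avg_assessment_score,
            min_completion=0.60, min_score=0.70):
    """Flag a student whose pace or performance suggests intervention."""
    return completion_rate < min_completion or avg_assessment_score < min_score

print(at_risk(completion_rate=0.45, avg_assessment_score=0.82))  # True
print(at_risk(completion_rate=0.90, avg_assessment_score=0.75))  # False
```

The first student is scoring well but falling behind pace; the second is on track on both measures.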
Small group work helps because one student knows what to do and
reinforces their own learning as he or she explains to other group members.
It’s good for the others to see that a peer has gotten it, and they feel
comfortable asking a fellow student to repeat and explain.
matter of time spent working on it: “For somebody who does not see the
numbers, you need to practice until you do. There is no magic; you just need to
practice.” An emphasis on the important learning skill of reflecting on your
learning and being able to express it verbally was cited by seven interviewees.
The same teacher described above exhorted her students, “You have to [be able
to] explain what prompted you to do what you did. You need to talk to your-
self when you solve the problem.” Later, one of us watched students in her
class work on problems in small groups. Four young women moved to the
whiteboard to begin working on the algebra problem they had been assigned.
One student did the writing, but others gave input. When they had finished,
one of the young women asked, “Can we explain it in case she chooses one of
us to explain?” Clearly these students were accustomed to the expectation that
they show how they arrived at an answer and explain the reasoning behind
each step.
A developmental course improvement practice recommended by Jaggars
(2011) is integration with the college’s general support programs. This practice
appeared to be relatively uncommon among the blended learning courses
covered by our interviews; only four interviewees cited any such integration.
We heard more about flexibility than about lowered institutional barriers.
It was not uncommon for courses to have policies of letting students take
assessments when they were ready or retake assessments with only the higher
grade counting. Many of the developmental programs also had intensification
strategies allowing students to complete two semesters of work in a single
term or to take a brief, intense course in preparation for the college’s math
placement exam. But strategies requiring change on a larger scale, such as
changes to course sequences, pairing courses, or allowing partial credit, were
less common.
data show that a student’s sense of “social belonging” within the course
is the strongest survey-based predictor of course completion (Silva &
White, 2013).
Finally, the Pathways Project has made a major effort to deal with the non-
cognitive aspects of success. Drawing on research by psychologists such as
Carol Dweck and James Stigler, Pathways researchers and instructors have
designed and tried out a number of techniques for dealing with the need to
change self-perceptions around mathematics and develop academic persistence.
Over 70 percent of students entering a Pathways course come in with doubts
about their ability to succeed in math. The Pathways courses commence with
three weeks of “starting strong” activities, which include reading and writing a
response to an article based on Carol Dweck’s “growth mindset” work about
how the brain changes in response to mental effort and practice. Carnegie
researchers worked with the instructor at one community college to assign
students randomly to either read this article or to read another article about the
structure of the brain. They found a major effect of this simple manipulation:
students who read the growth mindset article were twice as likely as those who
read the parts of the brain article to finish their developmental mathematics
course—and they earned significantly higher grades (Silva & White, 2013).
Pathways instructors and researchers try out innovations such as these,
collecting data on the results and sharing findings with other members of the
collaboration. Carnegie’s term for this data-based continuous improvement
approach is “improvement science.” The idea is that researchers and
practitioners work together, developing a testable hypothesis about what might
improve student success rates, trying it out, and collecting the data that will
help them determine whether or not their approach was an improvement. Data
about successes and failures are shared across the network so that effective
practices can spread rapidly.
The same process was used in developing the course software. A number of
two-year college faculty tried out some of the original Statway modules with
their classes in fall 2010. Based on this experience, the project team realized
that the materials needed a major revision before being implemented on a
wider scale. A team of mathematics faculty from multiple colleges came to
Carnegie to work together on redesigning the course, and the result was
Statway Version 1.5, which was tried out by 21 colleges in school year
2011–12. Prior to implementing the Pathways approach, the collaborating
colleges saw just 6 percent of their entering students requiring developmental
mathematics earn a college-level math credit within 12 months of continuous
enrollment. Among colleges using Statway in 2011–12, 51 percent of Pathway
students met this goal (Strother, Van Campen & Grunow, 2013). Early results
for Quantway appear even more impressive—56 percent of Quantway students
completed their developmental mathematics requirement in a single semester
(Silva & White, 2013).
A case can be made that the productivity of the K-12 education system in the
U.S. has been declining for decades. Government and student spending on
K-12 education has increased by 70 percent over the last 30 years with very
little improvement in outcomes (U.S. Department of Education, 2010b). In
higher education, the costs of tuition and fees have increased an astonishing
274.7 percent between 1990 and 2009 (Christensen et al., 2011).
Economists describe this situation as the “Cost Disease,” a dilemma endemic
to service industries that depend on highly skilled labor. Like a symphony
orchestra, a university cannot escape the need to have skilled employees in a
range of different areas by substituting equipment or other capital outlays.
Such institutions must pay increasing wages for skilled labor and cannot cut costs without
cutting the number of employees, which would bring associated declines in
output (Bowen et al., 2012; Hill & Roza, 2010).
The most common purposes for adopting online learning have been
expanding access, improving quality, and reducing costs. These three factors—
cost, quality, and access—are often referred to as the “iron triangle” of
education, suggesting that a positive change in any one is achieved at the
expense of a negative impact on the others. In other industries, technology has
been the key to raising productivity, and a number of policy makers are making
the argument that the state of the art in online learning has now risen to a level
where it can do so in education (U.S. Department of Education, 2010b; Wise &
Rothman, 2010).
U.S. Secretary of Education Arne Duncan has been a major proponent of the
concept of measuring and improving educational productivity, asserting in a
2010 speech at the American Enterprise Institute that in what he called “the
New Normal” K-12 and postsecondary educators were going to face the chal-
lenge of “doing more with less.” Duncan urged educators to see this challenge
as an opportunity for innovation and highlighted the role of technology:
They refer to the first of these as increasing efficiency and to the second as
increasing effectiveness. It is important to note that measures of productivity
can be estimated only if data on both costs and outcomes are available and can
be used only when two or more alternatives are being compared to each other
(Levin & McEwan, 2001).
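A minimal sketch can make this point concrete: a cost-effectiveness comparison is just an outcome-per-dollar ratio computed for each alternative and is meaningful only in the comparison. Every figure below is invented purely for illustration:

```python
# Cost-effectiveness in the Levin & McEwan sense: outcome gained per dollar
# spent, computed for two or more alternatives and then compared.
# All numbers here are hypothetical.

def effect_per_dollar(outcome_gain, cost_per_student):
    """Ratio of learning outcome gain to per-student cost."""
    return outcome_gain / cost_per_student

alternatives = {
    "conventional course": effect_per_dollar(outcome_gain=10, cost_per_student=1000),
    "blended redesign":    effect_per_dollar(outcome_gain=12, cost_per_student=800),
}

# The more cost-effective alternative is the one with the larger ratio.
best = max(alternatives, key=alternatives.get)
print(best)  # blended redesign
```

Note that neither ratio is interpretable on its own; without the second alternative (or without either the cost or the outcome data), no productivity claim can be made, which is exactly the constraint noted above.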
We have discussed online and blended learning outcomes at some
length in previous chapters. In this chapter, we turn to the topic of reducing
costs with online technology and the relationship between costs and student
outcomes.
The cost estimates for K-12 full-time virtual high-school degree programs
provided in the literature vary considerably from program to program and
study to study as shown in Figure 8.1, which contains estimates from seven
different reports (none of which appeared in a peer-reviewed journal).
There are many possible sources for the variations in estimates of face-to-
face and online per-pupil costs. As Watson (2004) notes, studies of public
virtual school costs tend to use district and state budgets as their source. An
exception is Watson’s (2004) Colorado estimate, which was based on the
estimated value of a set of resources that teachers and administrators identified
as essential to the program, an approach more closely aligned with standard
education cost estimation procedures (Levin & McEwan, 2001). The data
presented by Cavanaugh (2009) are 2008 cost per full-time online student
estimates obtained from a survey by the Center for American Progress of
20 virtual school directors in 14 different states.
More recently, as part of a project sponsored by the Thomas B. Fordham
Foundation, Battaglino, Haldeman, and Laurans (2012) investigated the costs
of K-12 online learning. They interviewed 50 entrepreneurs, policy experts,
and school leaders to obtain information on costs in the categories of labor,
content acquisition, technology, operations, and student support services. They
estimated the average annual cost for a student in a full-time virtual high school
to be $6,400 (with a range of $5,100 to $7,700). With the average cost of
educating a student in a brick-and-mortar school (after removing administrative
costs) estimated at around $10,000 a year, the virtual school costs do indeed
[Figure 8.1 Estimates of per-pupil costs for face-to-face versus online
programs, including Colorado (Watson, 2004), Ohio (Watson, 2007), Florida
(Florida Tax Watch, 2007), Utah (2009), Wisconsin (Stuiber et al., 2010), and
an expert cost estimate (Battaglino, Haldeman, & Laurans, 2012).]
School of One and New York City planned to reach much greater scale
over time.
Hollands calculated the ongoing costs of implementing School of One at a
hypothetical middle school with 480 students. Hollands’s estimate included
the costs of renovating a school space to provide the large room needed for the
School of One model, an in-house digital content manager, professional devel-
opment for staff, hardware, expected licensing fees for School of One content,
and the costs of the online tutors that are part of the model. Her estimate
excluded development costs, which the school would not be expected to pay.
By her calculations, replicating School of One would add $1,352 per student
over and above the normal per-pupil expenditures. By performing a break-
even analysis, Hollands calculated that School of One would need to produce
a mathematics achievement gain almost twice as large as that produced by the
average school program to be cost effective. As noted in discussing School of
One in Chapter 5, this kind of impact has not been demonstrated thus far.
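The logic of such a break-even calculation can be sketched in a few lines. Only the $1,352 per-student add-on comes from the text above; the baseline cost and gain are hypothetical placeholders chosen for illustration:

```python
# Break-even logic: if an intervention costs more per student than the average
# program, its achievement gain must exceed the average gain in the same
# proportion to be equally cost-effective.
# baseline_cost and baseline_gain are hypothetical placeholders.

baseline_cost = 1400.0   # assumed per-student cost of the average program
baseline_gain = 1.0      # the average program's gain, in arbitrary units
added_cost = 1352.0      # School of One add-on per student (Hollands, 2012)

breakeven_gain = baseline_gain * (baseline_cost + added_cost) / baseline_cost
print(round(breakeven_gain, 2))  # 1.97
```

Under these assumed baseline figures, the intervention would need to produce nearly twice the average gain to break even, which is the shape of the conclusion Hollands reaches.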
redesigned 14 courses and piloted them in the spring or fall of 2012. In one
example, the faculty member redesigned a mathematics course to be taught in
a large computer lab space with 75 students working online with mastery
learning software, with graduate students and adjunct faculty available to
provide assistance. The Missouri Learning Commons reported that costs were
reduced in ten of their 14 course redesign efforts, and that 12 of these courses
produced the same or better student pass rates as prior versions of the courses.
While NCAT's work is very forward-looking, it has some noteworthy
limitations. The reported results are based on self-report data. The data are
sometimes estimates, with a noted tendency to underestimate actual costs. For
example, developing online learning resources tends to take longer and cost
more than expected. Others have noted that the pre-post design used by NCAT
to measure learning outcomes lacks any experimental or statistical control
(Lack, 2013).
Despite these methodological weaknesses, the NCAT course redesign
models demonstrate the potential of using blended learning models to improve
institutional productivity by replacing staff time with a combination of
technology and the labor of relatively less-expensive staff, such as teaching or
graduate assistants. Bowen et al. (2012) reach the same conclusion on the basis
of their work with the OLI statistics course implemented in blended learning
models at six public universities. As described in earlier chapters, technology
can reduce staff time in a variety of ways—through performing some of
the functions related to content delivery, student assessment, feedback, and
communication, or through automating administrative tasks such as attendance
and other record keeping. As various tasks are shifted to technology, skilled
staff can be redeployed for more complex tasks, such as facilitating students’
work with complex concepts.
In the case of open educational resources like the Khan Academy, content
acquisition is “free” in the sense that Khan Academy does not collect a fee for
use of its materials. However, district technology officers are fond of pointing
out that free digital learning resources are “free like a puppy.” You may not
have to pay a provider to use them, but you still need to pay for upkeep—the
technology infrastructure to support their use and supports for teachers who
need to learn how to use them well.
We examined the cost implications of one school district’s use of Khan
Academy resources. In this particular district, start-up costs were minimal
because the district had a robust technology infrastructure that could be
deployed for using Khan Academy. This would not be the case in many school
districts, however. In addition, even though no direct cash outlays were
required, the district did reallocate resources to implement Khan Academy
systematically. Staff time was required to locate those Khan Academy resources
that fit with the district's curriculum and its objectives for Khan Academy use.
Teachers had to devote time to planning lessons that incorporated those
Khan Academy resources. There were also opportunity costs associated with
the use of the district’s technology infrastructure, including technical support,
and use of about 450 computers and schools’ 100-megabit connections for this
activity.
Process Redesign
Research across many industries suggests that the introduction of technology
by itself will not increase productivity. Rather, to attain improvements in
productivity, an institution needs to couple technology with organizational
changes such as reengineering of key business processes and industry structures
(Athey & Stern, 2002; Atkinson & McKay, 2007; Brynjolfsson & Hitt, 2000;
McKinsey Global Institute, 2000, 2002).
An online course that involves teachers in replicating traditional lecture
formats and delivering the bulk of instructional content verbally at the
same teacher–student ratio—but doing so online—incurs additional costs
and is unlikely to garner any compensating increases in student learning.
Such applications may be justified on the basis of increasing access if
they bring the course to learners who would not otherwise experience it,
however.
Many have argued that the true productivity potential of educational
technologies is in their ability to transform core instructional practices,
including the use of student and teacher time. The availability of micro-data
about student learning processes online can not only make the online
system adaptive in ways that enhance learning but also inform teachers and
counselors of the nature of needed corrective actions to avoid course failure
(U.S. Department of Education, 2012).
Scale of Implementation
The term “economies of scale” captures the idea that per-unit costs are reduced
as the scale of production increases. Conceptually, this is possible by leveraging
high investment or “fixed” costs with low variable, recurrent costs. In the case
of online learning, investment costs may be relatively big-ticket items related
to infrastructure, such as hardware, connectivity, and content
development. Course development may constitute a large or small portion of
fixed costs depending on the instructional model (Anderson et al., 2006).
Recurrent costs represent annual ongoing expenses such as instructor salaries
and technology maintenance.
One way to improve productivity is to increase the numbers of students
whom high-skilled personnel can serve. Personnel (what economists call
“labor”) represent a significant proportion of ongoing cost in education.
Personnel costs typically include the time of teachers, teaching assistants,
developers, administrators, and any other staff involved in creating or running
an educational program (online or otherwise). Per-student labor costs can be
reduced if teachers and other staff can serve more students by changing what
they do and how they do it. The NCAT work described above suggests several
strategies for redeploying staff.
In cases with a high proportion of fixed costs, per-student costs go down as
more students are served, since each student has to assume a smaller and
smaller portion of the fixed costs. In the case of Hollands’s examination of
School of One costs described earlier, the project’s ambitious software
development goals combined with a need for a large bank of learning materials
and assessments made the initial development costs fairly high (over $5,000
per child). If they were allocated to just the 1,500 pupils experiencing School
of One in its pilot year, their impact on the cost of educating each student would be
substantial. If the software were used with 150,000 students, on the other hand,
the per-pupil additional cost would be around $52.
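This fixed-cost arithmetic is easy to check. The total development cost below is a rough figure inferred from the per-pupil amounts in the text, not a number reported by Hollands:

```python
# Allocating a fixed development cost across more students drives the
# per-pupil share down. A total of roughly $7.8 million is consistent with
# "over $5,000 per child" at 1,500 pupils.

fixed_cost = 7_800_000  # assumed total development cost, in dollars

def per_pupil_share(fixed_cost, n_students):
    """Each student's share of a fixed cost spread over n_students."""
    return fixed_cost / n_students

print(per_pupil_share(fixed_cost, 1_500))    # 5200.0 -- over $5,000 per child
print(per_pupil_share(fixed_cost, 150_000))  # 52.0   -- around $52 per pupil
```

The hundredfold increase in students served produces exactly a hundredfold drop in the per-pupil share of the fixed cost, which is the economies-of-scale argument in its simplest form.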
Compared with conventional instruction, online learning usually incurs
higher investment costs associated with designing a new program, developing
new curriculum, and selecting or developing digital resources. A few studies in
postsecondary education have found online learning to be an expensive
alternative because of these initial development costs and the ongoing personnel
costs for delivering instruction (Jones, 2001; Ramage, 2005; Smith & Mitry,
2008).
There are often-cited structural incentives of digital media that argue in
favor of scale—once the investment in development has been made, why not
get the course out to as many people as possible? Although online course
content can be expensive to develop, once created it has the potential to be
distributed to large numbers of students with relatively little additional expense
(Adsit, 2003; Christensen, Horn, & Johnson, 2008; Watson, 2004). Once an
[Figure: Cost of program, components, context, and outcomes.]
Although specific costs vary by program, the categories of costs that need to
be considered are summarized in the ¿gure:
• Other costs include those needed for the successful and legal operation of
a program that are not included in the categories above, including student
support services.
Conclusion
Online and blended learning can be more or less cost effective than conven-
tional classroom-based education, depending on the particular alternatives
being compared. Costs vary dramatically depending on the implementation
model, the number of students served, and the size of the investment in soft-
ware development or subscription fees. We need to look at the particular online
or blended learning model under consideration to assess whether or not it
enhances educational productivity relative to a specific alternative, and unfortunately
there are very few cases in which we have both good data about costs
and good data about student outcomes.
Online and blended learning implementation costs are highly dependent on
factors such as the number of students in a course or program and the way in
Conclusion
In this capstone chapter, we take a step back from the particulars of online
learning in the different settings and applications discussed in chapters 3–8 to
highlight some of the major themes running throughout the book. Building on
these themes, we will conclude with a set of areas we judge ripe for research
that could improve the design of online learning activities on the one hand and
inform decisions about when and how to implement online and blended
learning on the other.
part to the accountability pressures these schools face. Feeling under the gun to
avoid being labeled as in need of improvement, schools tend to respond with a
relentless focus on language arts and mathematics test preparation, to the
neglect of more advanced skills and other subjects.
making decisions about the design of future online learning activities and
environments for different purposes and types of learners. Most providers of
online learning have yet to figure out how to structure the clickstream data
their systems generate in a way that is useful for analysis (Bakia et al., 2013),
but work in this area is ongoing, and we expect major strides in the next
five years.
Research Agenda
We will close with a consideration of future directions for research, taking into
account gaps in our current knowledge and trends in the way online learning
and blended learning are being used. Below we discuss seven topics, each of
which could be the centerpiece for a whole program of research.
peer assessment of students’ work in humanities courses, but the results have
not always been satisfactory. Students do not necessarily value the feedback
of peers assessing their products, and indeed not every student is equipped to
provide good feedback in a field in which he or she is a novice. This area
is ripe for further investigation and innovative design. The design of highly
effective blended learning models for these subject areas appears rather
elusive as well. The integration of online learning into blended courses in
the NGLC initiative, for example, produced positive impacts on average
for mathematics, social science, and business courses, but negative impacts for
English and humanities classes (Means, Shear et al., 2013). Defining and
testing design principles for effective learning of the important ideas and forms
of argumentation in these fields would be a productive challenge spurring
innovation.
education practice. The ability of online learning systems to collect and display
information about each student’s online work and proficiencies has outstripped
educators’ capacity, or inclination, to use the available data. In our research
on the implementation of blended learning in K-12 schools, we are finding
that many teachers and administrators are not using data available on their
online systems to understand what students know and can do and where they
need help.
There are many reasons for this, and the problem can be addressed from
multiple angles. Some teachers question the meaningfulness of system-
generated data on skill mastery; some teachers believe that their own
observations of students’ performance in in-class activities or in conversations
provide a better window into student competencies. Other teachers cite the
time required to wade through the large amounts of data as formatted by
the learning system they use. This problem is exacerbated in
classes where multiple learning systems are in use, each with a different set of
data and data displays. Administrators and teachers want to go to one place for
all their learning information for a particular student. Finally, some teachers
lack clear ideas about the instructional decisions they could make on the basis
of student data, suggesting the need for teacher professional development or
coaching around different instructional strategies appropriate for students with
different patterns of assessment data. Design studies examining the instructional
decisions teachers make day to day and testing the usability and utility of
different kinds of, and formats for, student online learning data could help
bridge this gap between potential and reality.
Final Words
Finally, we close with the advice that development, implementation, and
research on new online teaching and learning innovations need to go hand in
hand to a much greater extent than they have in the past. Online learning effec-
tiveness studies too often fail to specify the key features of the learning
experience design, and treat the online aspects of a course or other learning
experience as if they were self-contained, ignoring the broader context in
which learning takes place and the relationship between online and offline
learning activities. Bringing all of these features into a systems framework can
help guide the design of new learning interventions and research on key design
features and implementation factors. Only by undertaking careful work at this
level of detail can we build an evidence-based understanding of how to design
and implement online and blended learning for different purposes, kinds of
learners, and settings.
References
Adsit, J. (2003). Funding online education: A report to the Colorado Online Education
Programs Study Committee. Retrieved from http://www.cde.state.co.us/edtech/
download/osc-VW.pdf.
Aiello, N. C. & Wolfle, L. M. (1980). A meta-analysis of individualized instruction in
science. Paper presented at the annual meeting of the American Educational Research
Association, Boston, MA.
Akyol, Z., Arbaugh, B., Cleveland-Innes, M., Garrison, R., Ice, P., Richardson, J., &
Swan, K. (2009). A response to the review of the Community of Inquiry Framework.
Journal of Distance Education, 23(2): 123–136. Retrieved from http://www.jofde.ca/
index.php/jde/article/view/630/885.
Allen, I. E. & Seaman, J. (2008). Staying the course: Online education in the United
States. Needham, MA: Sloan Consortium.
Allen, I. E. & Seaman, J. (2013) Changing course: Ten years of tracking online education
in the United States. Babson College, MA: The Sloan Consortium. Retrieved from
http://sloanconsortium.org/publications/survey/pdf/learningondemand.pdf.
American Society for Training and Development (ASTD) (2010). 2009 State of the Industry
Report. Alexandria, VA.
Anderson, A. J., Augenblick, D., DeCesare, D., & Conrad, J. (2006). Costs and funding
of virtual schools: An examination of the costs to start, operate, and grow virtual
schools and a discussion of funding options for states interested in supporting virtual
school programs. Report prepared for BellSouth Foundation. Denver, CO:
Augenblick, Palaich, and Associates. Retrieved from http://www.inacol.org/research/
docs/CostsandFunding.pdf.
Anderson, R. & Ainley, J. (2010). Technology and learning: Access in schools around
the world. In P. Peterson, E. Baker, & B. McGaw (Eds.), International encyclopedia
of education. 3rd edition. Amsterdam: Elsevier.
Anderson, T. (2004). Toward a theory of online learning. In T. Anderson & F. Elloumi,
The theory and practice of online learning. Athabasca, Alberta: Athabasca University.
Aragon, S. R. & Johnson, E. S. (2008). Factors influencing completion and noncompletion
of community college courses. American Journal of Distance Education, 22,
146–158.
Archambault, L., Diamon, D., Brown, R., Cavanaugh, C., Coffey, M., Foures-Aalbu, D.,
Richardson, J., & Zygouris-Coe, V. (2010). An exploration of at-risk learners and
online education. Retrieved from http://files.eric.ed.gov/fulltext/ED509620.pdf.
Evergreen Education Group. (2013). 10-year anniversary issue: Keeping pace with
K-12 online and blended learning: An annual review of policy and practice. Durango,
CO: Evergreen Education Group. Retrieved from http://kpk12.com/reports/.
Fain, P. (2013a). Kaplan 2.0. Inside Higher Ed. August 15. Retrieved from
http://www.insidehighered.com.
Fain, P. (2013b). Possible probation for Phoenix. Inside Higher Ed. February 26.
Retrieved from http://www.insidehighered.com.
Fast Company. (2012). Can Google’s Thrun create the first real online college degree?
Retrieved from fastcoexist.com/1679192/can-googles-thrun-create-the-first-real-
online-college-degree.
Figlio, D. N., Rush, M., & Yin, L. (2010). Is it live or is it Internet? Experimental
estimates of the effects of online instruction on student learning. NBER working
paper 16089. Cambridge, MA: National Bureau of Economic Research.
Fishman, B. & Dede, C. (in preparation). Teaching and technology: New tools
for new times. To appear in D. Gitomer & C. Bell (Eds.), Handbook of research
on teaching, 5th edition. Washington, DC: American Educational Research
Association.
Florida Tax Watch. (2007). Final report: A comprehensive assessment of Florida
Virtual School. Tallahassee, FL: Florida Tax Watch. Retrieved from http://www.
inacol.org/docs/FLVS_Final_Final_Report(10-15-07).pdf.
Florida Virtual School. (n.d.). Florida Virtual School district enrollment summary:
2011–2012.
Fujimoto, K. & Cara, E. (2013). Massive open online courses (MOOC) mashup:
San Jose State University: Udacity experiment with online courses fizzles. San Jose
Mercury News, July 25.
Gallagher, L., Michalchik, V., & Emery, D. (2006). Assessing youth impact of the
Computer Clubhouse Network. Menlo Park, CA: SRI International.
Gee, J. P. (2009). Deep learning properties of deep digital games: How far can they go?
In U. Ritterfield, M. J. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and
effects (67–82). New York and London: Taylor & Francis.
Gee, J. P. (2013). Digital media and learning: A prospective retrospective, unpublished
paper. Tempe, AZ: Arizona State University.
Gibbons, J., Pannoni, R., & Orlin, J. (1996). Tutored video instruction: A distance
education methodology that improves training results. Paper presented at the
international conference of the American Society for Training and Development,
Orlando, FL.
Glass, G. & Welner, K. (2011). Online K-12 schooling in the U.S.: Uncertain private
ventures in need of public regulation. Boulder, CO: National Education Policy
Center.
Glass, G. V. (2009). The realities of K-12 virtual education. Retrieved from http://nepc.
colorado.edu/publication/realities-K-12-virtual-education.
Graesser, A. (2009). Inaugural editorial for Journal of Educational Psychology. Journal
of Educational Psychology, 101(2), 259–261.
Graf, S. & Kinshuk. (2013.) Dynamic student modeling of learning styles for advanced
adaptivity in learning management systems. International Journal of Information
Systems and Social Change, 4(1), 85–100.
Graham, C. R., Allen, S., & Ure, D. (2005). Benefits and challenges of blended learning
environments. In M. Khosrow-Pour (Ed.), Encyclopedia of information science and
technology (253–259). Hershey, PA: Idea Group.
Grimes, S. K. & Antworth, T. (1996). Community college withdrawal decisions:
Student characteristics and subsequent reenrollment patterns. Community College
Journal of Research and Practice, 20(4), 345–361.
Grubb, W. N., Boner, E., Frankel, K., Parker, L., Patterson, D., Gabriner, R., Hope, L.,
Schiorring, E., Smith, B., Taylor, R., Walton, I., & Wilson, S. (2011a). Understanding
the “crisis” in basic skills: Framing the issues in community colleges. Basic Skills
Instruction in California Community Colleges Report No. 1. Sacramento, CA: Policy
Analysis for California Education.
Grubb, W. N., Boner, E., Frankel, K., Parker, L., Patterson, D., Gabriner, R., Hope, L.,
Schiorring, E., Smith, B., Taylor, R., Walton, I., & Wilson, S. (2011b). Basic skills
instruction in community colleges: The dominance of remedial pedagogy. Basic
Skills Instruction in California Community Colleges Report No. 2. Sacramento, CA:
Policy Analysis for California Education.
Hachey, A. C., Conway, K. M., & Wladis, C. W. (2013). Community colleges and
underappreciated assets: Using institutional data to promote success in online
learning. Online Journal of Distance Learning Administration, 16(1).
Hagedorn, L. S. (2010). Introduction to the issue: Community college retention—
an old problem exacerbated in a new economy. Journal of College Student Retention,
12(1), 1–5.
Hanford, E. (2013). The story of the University of Phoenix. Retrieved from http://
americanradioworks.publicradio.org/features/tomorrows-college/phoenix/story-of-
university-of-phoenix.html.
Harasim, L. (2001). Shift happens: Online education as a new paradigm in learning.
Internet and Higher Education, 3, 41–61.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to
achievement. London and New York: Routledge.
Haynie, D. (2013). What employers really think about your online bachelor’s degree.
U.S. News and World Report, July 1.
Heller, N. (2013). Laptop U. New Yorker, May 20, 80–91.
Heppen, J. B., Walters, K., Clements, M., Faria, A., Tobey, C., Sorensen, N., & Culp, K.
(2012). Access to Algebra I: The effects of online mathematics for grade 8 students.
NCEE 2012–4021. Washington, D.C.: National Center for Education Evaluation and
Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Herold, B. (2013). Florida Virtual School, other e-schools face difficult times. Digital
Education blog, EdWeek. Retrieved from http://blogs.edweek.
org/edweek/DigitalEducation/2013/08/Florida_Virtual_School_Other_E-Schools_
Face_Difficult_Times.html.
Herrera, L. (2011). In Florida, virtual classrooms with no teachers. New York Times,
January 17.
Hidi, S. & Renninger, K. A. (2006). The four-phase model of interest development.
Educational Psychologist, 41(2), 111–127.
Hill, P. (2012). Online educational delivery models: A descriptive view. EDUCAUSE
Review, November/December, 85–97.
References 197
Hill, P. & Roza, M. (2010). Curing Baumol's disease: In search of productivity gains in
K-12 schooling. CRPE White Paper 2010_1. Center on Reinventing Public
Education. Seattle, WA: University of Washington. Retrieved from http://www.crpe.
org/cs/crpe/download/csr_files/whp_crpe1_baumols_jul10.pdf.
Hollands, F. (2012). Using cost-effectiveness analysis to evaluate School of One (So1).
New York: Center for Benefit-Cost Studies of Education, Teachers College, Columbia
University.
Horn, M. B. & Staker, H. (2011). The rise of K-12 blended learning. Innosight Institute.
Retrieved from http://www.innosightinstitute.org/innosight/wp- content/uploads/
2011/01/The-Rise-of-K-12-Blended-Learning.pdf.
Hubbard, B. & Mitchell, N. (2011). Achievement of online students drops over time,
lags state averages on every indicator. Education News Colorado. Retrieved from
www.ednewscolorado.org.
Ingersoll, R. M. (2003). Is there really a teacher shortage? Philadelphia and Seattle:
Consortium for Policy Research in Education and Center for the Study of Teaching
and Policy.
International Telecommunication Union. (2013). ICT Facts and Figures. Geneva:
Telecommunication Development Bureau, ICT Data and Statistics Division.
Ito, M., Horst, H., Bittanti, M., Boyd, D., Herr-Stephenson, B., Lange, P. G., Pascoe,
C. J., & Robinson, L. (2009). Living and learning with new media: Summary of
findings from the Digital Youth Project. Chicago: MacArthur Foundation.
Ito, M., Baumer, S., Bittanti, M., Boyd, D., Cody, R., Herr-Stephenson, B., & Tripp, L.
(2010). Hanging out, messing around, and geeking out: Living and learning with new
media. Cambridge, MA: MIT Press.
Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., Schor, J.,
Sefton-Green, J., & Watkins, C. (2013). Connected learning: An agenda for research
and design. Irvine, CA: Digital Media and Learning Research Hub.
Jaggars, S. S. (2011). Online learning: Does it help low-income and underprepared
students? CCRC Brief 52. New York: Columbia Teachers College, Community
College Research Center.
Jaggars, S. S. & Bailey, T. (2010). Effectiveness of fully online learning courses for
college students: Response to a Department of Education meta-analysis. New York:
Columbia Teachers College, Community College Research Center.
Jaggars, S. S. & Xu, D. (2012). Predicting student outcomes from a measure of course
quality. Paper presented at the annual meeting of the American Educational Research
Association.
Jenkins, H. (2006). Confronting the challenges of participatory culture: Media
education for the 21st century. Chicago, IL: John D. & Catherine T. MacArthur
Foundation.
Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H.
(2013). NMC Horizon report: 2013 Higher education edition. Austin, TX: New
Media Consortium.
Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon
report. Austin, TX: New Media Consortium.
Jones, D. (2001). Technology costing methodology handbook. Boulder, CO: Western
Cooperative for Educational Telecommunications.
Kafai, Y. B. (2010). The world of Whyville: Living, playing, and learning in a tween
virtual world. Games and Culture, 5(1).
Kali, Y. & Linn, M. (2010). Curriculum design as subject matter: Science. In B.
McGraw, E. Baker, & P. Peterson (Eds.), International encyclopedia of education
(3rd edition) (468–474). Oxford: Elsevier.
Kalyuga, S., Chandler, P., Tuovinen, J., & Sweller, J. (2001). When problem solving is
superior to studying worked examples. Journal of Educational Psychology, 93,
579–588.
Kamenetz, A. (2013). San Jose State MOOC missteps easy to see. Retrieved from
http://diverseeducation.com/article/54903/#.
Kelderman, E. (2011). Online programs face new demands from accreditors. Chronicle
of Higher Education. Retrieved from http://chronicle.com/article/Online-Programs-
Face-New/129608/
Kennedy Martin, C., Barron, B., Austin, K., & Pinkard, N. (2009). A culture of sharing:
A look at identity development through the creation and presentation of digital media
projects. International Conference on Computer-Supported Education (Lisbon).
Kim, P., Kim, F. H., & Karimi, A. (2012). Public online charter school students: Choices,
perceptions, and traits. American Educational Research Journal, 49(3), 521–545.
doi: 10.3102/0002831212443078.
Kluger, A. N. & DeNisi, A. (1996). The effects of feedback interventions on
performance: A historical review, a meta-analysis, and a preliminary feedback
intervention theory. Psychological Bulletin, 119(2), 254–284.
Koedinger, K. R. & Corbett, A. (2006). Cognitive tutors: Technology bringing learning
sciences to the classroom. In R. K. Sawyer (Ed.), The Cambridge handbook of the
learning sciences (61–77). New York: Cambridge University Press.
Koedinger, K. R., Corbett, A. T., & Perfetti, C. (2012). The Knowledge-Learning
Instruction (KLI) framework: Bridging the science-practice chasm to enhance robust
student learning. Cognitive Science, 36(5).
Koller, D. & Ng, A. (2012). Log on and learn: The promise of access in online education.
Forbes, September 19.
Koller, D., Ng, A., Do, C., & Chen, Z. (2013). Retention and intention in massive open
online courses: In depth. EDUCAUSE Review Online.
Kolowich, S. (2013a). The professors who make the MOOCs. Chronicle of Higher
Education, March 18.
Kolowich, S. (2013b). San Jose State U. puts MOOC Project with Udacity on hold.
Chronicle of Higher Education, July 19.
Kozma, R. B. (1994). Will media influence learning? Reframing the debate. Educational
Technology Research and Development, 42(2), 7–19.
Kronholz, J. (2011). Getting at-risk teens to graduation. Education Next, 11(4), 24–31.
Retrieved from http://educationnext.org/¿les/ednext_20114_feature_kronholz.pdf.
Kulik, C. L. C., Kulik, J. A., & Bangert-Drowns, R. L. (1990). Effectiveness of mastery
learning programs: A meta-analysis. Review of Educational Research, 60(2),
265–299.
Lack, K. A. (2013). Current status of research on online learning in postsecondary
education. ITHAKA. Retrieved from http://www.sr.ithaka.org/research-publications/
current-status-research-online-learning-postsecondary-education.
Lacy, S. (2013). Udacity’s answer to Silicon Valley’s computer science problem. Video
interview with Sebastian Thrun. Pandodaily.com, April 29.
LeBlanc, P. (2013). Accreditation in a rapidly changing world. Inside Higher Ed.
Retrieved May 26, 2013, from insidehighered.com/views/2013/01/31/competency-
based-education-and-regional-accreditation.
Leckhart, S. & Cheshire, T. (2012). University just got flipped: How online video is
opening up knowledge to the world. Wired Magazine, May.
Leddy, T. (2013). Are MOOCs good for students? Boston Review, June 14.
Lederman, D. (2006). In search of “big ideas.” Inside Higher Ed. Retrieved from
http://www.insidehighered.com.
Levin, H. & McEwan, P. (2001). Cost-effectiveness analysis: Methods and applications.
2nd edition. Thousand Oaks, CA: Sage.
Lewin, T. (2011). Official calls for urgency on college costs. New York Times.
November 29.
Linn, M. C., Lee, H. S., Tinker, R., Husic, F., & Chiu, J. L. (2006). Teaching and
assessing knowledge integration in science. Science, 313(5790), 1049–1050.
Lipsey, M. W. & Wilson, D. B. (2001). Practical meta-analysis (Vol. 49). Thousand
Oaks, CA: Sage.
Littlefield, J. (n.d.). Cisco Networking Academy: A model for blended, interactive
learning. About.com Distance Learning. Retrieved from http://distancelearn.about.
com/od/onlinecourses/a/Cisco-Networking-Academy.htm.
Liu, S., Gomez, J., & Yen, C. (2009). Community college online course retention and
final grade: Predictability of social presence. Journal of Interactive Online Learning,
8(2), 165–182.
Liyanagunawardena, T., Williams, S., & Adams, A. (2013). The impact and
reach of MOOCs: A developing countries’ perspective. eLearning Papers, 33,
1–8.
Lovett, M., Meyer, O., & Thille, C. (2008). The Open Learning Initiative: Measuring
the effectiveness of the OLI statistics course in accelerating student learning. Journal
of Interactive Media in Education, May.
Lundh, P., House, A., Means, B., & Harris, C. J. (2013). Learning from science:
Case studies of science offerings in afterschool programs. Afterschool Matters, 18,
33–41.
Macfadyen, L. & Dawson, S. (2010). Mining LMS data to develop an “early warning
system” for educators: A proof of concept. Computers & Education, 54(2),
588–599.
Mandernach, B. J., Donnelli, E., & Dailey-Hebert, A. (2005). Learner attribute research
juxtaposed with online instructor experience: Predictors of success in the accelerated,
online classroom. Journal of Educators Online, 3(2).
Marshall, J. (2012). Victory for crowdsourced biomolecule design. Nature, January 22.
doi:10.1038/nature.2012.9872.
Maxwell, W., Hagedorn, L. S., Cypers, S., Moon, H. S., Brocato, P., Wahl, K., &
Prather, G. (2003). Community and diversity in urban community colleges: Course
taking among entering students. Community College Review, 30(4), 1–21.
Mayer, R. E. (2008). Applying the science of learning: evidence-based principles for
the design of multimedia instruction. American Psychologist, 63(8), 760–769.
Project Tomorrow. (2011). The new 3 E’s of education: Enabled, engaged and
empowered: how today’s students are leveraging emerging technologies for learning.
Congressional Briefing—Release of Speak Up 2010 National Data for Students
and Parents. Retrieved from http://www.tomorrow.org/speakup/speakup_reports.
html.
Queen, B. & Lewis, L. (2011). Distance education courses for public elementary and
secondary school students: 2009-10, First Look. U.S. Department of Education,
National Center for Education Statistics. Washington, D.C.: U.S. Government
Printing Office.
Rainie, L. (2010). Internet, broadband, and cell phone statistics. Washington, DC: Pew
Internet. Retrieved from http://pewinternet.org/Reports/2010/Internet-broadband-
and-cell-phonestatistics.aspx.
Rainie, L. (2012). Changes to the way we identify Internet users. Pew Research Center.
Retrieved July 12, 2013, from http://www.pewinternet.org/Reports/2012/Counting-
internet-users.aspx.
Ramage, T. (2005). A system-level comparison of cost-efficiency and return on
investment related to online course delivery. E-Journal of Instructional Science and
Technology, 8(1). Retrieved from http://spark.parkland.edu/ramage_pubs/2.
Renninger, A. (2000). Individual interest and development: Implications for theory and
practice. In C. Sansone and J. M. Harackiewicz (Eds.), Intrinsic and extrinsic
motivation: The search for optimal motivation and performance (375–404). New
York: Academic Press.
Resnick, M., Maloney, J., Monroy-Hernández, A., Rusk, N., Eastmond, E., Brennan,
K., & Kafai, Y. (2009). Scratch: Programming for all. Communications of the ACM,
52(11), 60–67.
Richards, J., Stebbins, L., & Moellering, K. (2013). Games for a digital age: K-12
market map and investment analysis. New York: The Joan Ganz Cooney Center.
Robinson, B. L. & Moore, A. H. (2006). Chapter 42. The Math Emporium. In
D. G. Oblinger (Ed.), Learning spaces. EDUCAUSE.
Roblyer, M. D. & Davis, L. (2008). Predicting success for virtual school students:
Putting research-based models into practice. Online Journal of Distance Learning
Administration, 11(4).
Rusk, N., Resnick, M., & Cooke, S. (2009). Origins and guiding principles of the
Computer Clubhouse. In Y. Kafai, K. A. Peppler, & R. N. Chapman (Eds.), The
Computer Clubhouse: Constructionism and creativity in youth communities. New
York: Teachers College Press, 17–25.
Russell, A. (2008). Enhancing college student success through developmental
education. A Higher Education Policy Brief. Washington, D.C.: American Association
of State Colleges and Universities.
Salmon, F. (2012). Udacity and the future of online universities, Web log post. January
23. Retrieved from http://blogs.reuters.com/felix-salmon/2012/01/23/udacity-
and-the-future-of-online-universities/
Sefton-Green, J. (2004). Literature review in informal learning with technology outside
school. FutureLab Series Report 7. Bristol: FutureLab.
Sefton-Green, J. (2010). Learning at not-school: A review of study, theory, and advocacy
for education in non-formal settings. Cambridge, MA, and London: MIT Press.
Shechtman, N., DeBarger, A., Dornsife, C., Rosier, S., & Yarnall, L. (2013). Grit,
tenacity, and perseverance in 21st-century education: State of the art and future
directions. Menlo Park, CA: SRI International.
Sheehy, K. (2013). Online course enrollment climbs for 10th straight year. U.S. News &
World Report. January 28. Retrieved from http://www.usnews.com.
Shullenberger, G. (2013). The MOOC revolution: A sketchy deal for higher education.
Dissent Magazine, February 12.
Siemens, G. (2012). MOOCs are really a platform. Elearnspace. Retrieved from http://
www.elearnspace.org/blog/2012/07/25/moocs-are-really-a-platform/
Silva, E. & White, T. (2013). Pathways to improvement: Using psychological studies to
help college students master developmental math. Stanford, CA: Carnegie Foundation
for the Advancement of Teaching.
Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative
effectiveness of Web-based and classroom instruction: A meta-analysis. Personnel
Psychology, 59, 623–664.
Slemmer, D. L. (2002). The effect of learning styles on student achievement in various
hypertext, hypermedia, and technology-enhanced learning environments: A meta-
analysis. Unpublished doctoral dissertation, Boise State University, ID.
Sloan, J. & Mackey, K. (2009). VOISE Academy: Pioneering a blended-learning model
in a Chicago public high school. Innosight Institute. Retrieved from http://www.
innosightinstitute.org.
Smerdon, E. T. (1996). Lifelong learning for engineers: Riding the whirlwind. The
Bridge, 26(1/2).
Smith, D. & Mitry, D. (2008). Investigation of higher education: The real costs
and quality of online programs. Journal of Education for Business, 83(3),
147–152.
Smith, M. L. & Glass, G. V. (1977). Meta-analysis of psychotherapy outcome studies.
American Psychologist, 32, 752–760.
Smith, M. S. (2009). Opening education. Science, 323, 89–93.
Smith, R., Clark, T., & Blomeyer, R. (2005). A synthesis of new research on K-12 online
learning. Learning Point Associates.
Spielhagen, F. R. (2006). Closing the achievement gap in math: The long-term effects
of eighth-grade algebra. Journal of Advanced Academics, 18, 34–59.
Squire, K. & Durga, S. (2013). Productive gaming: The case for historiographic game
play. In R. E. Ferdig (Ed.), Design, utilization, and analysis of simulations and game-
based educational worlds. Hershey, PA: Information Science Reference.
Staker, H. & Horn, M. (2012). Classifying K-12 blended learning. San Mateo, CA:
Clayton Christensen Institute for Disruptive Innovation.
State Higher Education Executive Officers. (2013). State higher education finance: FY
2012. Boulder, CO: State Higher Education Executive Officers.
Steinkuehler, C. & Duncan, S. (2008). Scientific habits of mind in virtual worlds.
Journal of Science Education and Technology, 17(6), 530–543. doi:10.1007/
s10956-008-9120-8.
Strother, S., Van Campen, J., & Grunow, A. (2013). Community College Pathways:
2011–2012 descriptive report. Stanford, CA: Carnegie Foundation for the Advancement
of Teaching.
Stuiber, P. K., Hiorns, K., Kleidon, K., La Tarte, A., & Martin, J. (2010). An evaluation
of virtual charter schools. Wisconsin Department of Public Instruction. Retrieved
from http://www.legis.wisconsin.gov/lab.
Summerlin, J. A. (2003). A comparison of the effectiveness of off-line Internet and
traditional classroom remediation of mathematical skills. Unpublished doctoral
dissertation. Baylor University, Waco, TX.
Suppes, P. (1965). Computer-based mathematics instruction: The first year of the
project. Bulletin of the International Study Group for Mathematics Learning, 3,
7–22.
Swan, K. (2003). Learning effectiveness: What the research tells us. In J. Bourne &
J. C. Moore (Eds.), Elements of quality online education, practice and direction
(13–45). Needham, MA: Sloan Center for Online Education.
Swan, K. & Shih, L. F. (2005). On the nature and development of social presence in
online course discussions. Journal of Asynchronous Learning Networks, 9(3),
115–136.
Sweller, J. & Cooper, G. A. (1985). The use of worked examples as a substitute for
problem solving in learning algebra. Cognition and Instruction, 2, 59–89.
Thomas, D. & Seely Brown, J. (2011). A new culture of learning: Cultivating the
imagination for a world of constant change. CreateSpace Independent Publishing
Platform.
Thrun, S. (2012). University 2.0. Presentation at the Digital-Life-Design (DLD) 2012
conference, Munich.
Trautman, T. & Lawrence, J. (n.d.). Credit recovery: A technology-based intervention
for dropout prevention at Wichita Falls High School. Retrieved from http://www10.
ade.az.gov/AIMSDPToolkit/TrautmanAndLawrenceFormatted.aspx.
Twigg, C. A. (1999). Improving learning and reducing costs: Redesigning large-
enrollment courses. Troy, NY: National Center for Academic Transformation,
Rensselaer Polytechnic Institute. Retrieved from http://www.thencat.org/
Monographs/ImpLearn.html.
Twigg, C. A. (2003). Improving learning and reducing costs: New models for online
learning. EDUCAUSE Review, 38(5), 28–38.
Twigg, C. A. (2004). Improving quality and reducing costs: Lessons learned from
Round III of the Pew Grant Program in Course Redesign. National Center for
Academic Transformation. Retrieved April 6, 2011, from http://www.thencat.org/
PCR/RdIIILessons.pdf.
Twigg, C. A. (2005). Course redesign improves learning and reduces costs. Policy
Alert. National Center for Public Policy and Higher Education. Retrieved from http://
www.highereducation.org/reports/pa_core/.
Twigg, C. A. (2011). The Math Emporium: Higher education’s silver bullet. Change,
May–June.
Tyler-Smith, K. (2006). Early attrition among first time e-learners: A review of factors
that contribute to dropout, withdrawal and non-completion rates of adult learners
undertaking e-learning programs. Journal of Online Learning and Teaching, June.
U.S. Department of Commerce. (1995). Falling through the net: A survey of the “have
nots” in rural and urban America. Retrieved from National Telecommunications &
Information Administration website: http://www.ntia.doc.gov/ntiahome/fallingthru.
html.
CO: Evergreen Education Group. Retrieved October 29, 2011, from http://www.
kpk12.com/cms/wp-content/uploads/KeepingPace09-fullreport.pdf.
Watson, J., Murin, A., Vashaw, L., Gemin, B., & Rapp, C. (2010). Keeping pace with
K-12 online learning: An annual review of state-level policy and practice. Durango,
CO: Evergreen Education Group. Retrieved October 29, 2011, from http://www.
kpk12.com/wp-content/uploads/KeepingPaceK12_2010.pdf.
Watson, J., Murin, A., Vashaw, L., Gemin, B., & Rapp, C. (2012). Keeping pace with
online and blended learning 2012: An annual review of policy and practice.
Durango, CO: Evergreen Education Group. Retrieved from http://kpk12.com/
Watters, A. (2012). The LMS infrastructure enters the MOOC fray. Web log post,
October 31.
Webley, K. (2012). University uproar: Ouster of UVA president draws fire. Time, June
20. Retrieved from http://content.time.com/time/nation/article/0,8599,2117640,00.
html.
Wenglinsky, H. (1998). Does it compute? The relationship between educational
technology and student achievement in mathematics. Princeton, NJ: Educational
Testing Service.
Wenglinsky, H. (2005). Technology and achievement: The bottom line. Educational
Leadership, 63(4), 29–32.
Werquin, P. (2010). Recognition of non-formal and informal learning: Country
practices. Organisation for Economic Co-operation and Development.
Wicks, M. (2010). A national primer on K-12 online learning. Version 2. Retrieved
from http://www.inacol.org/research/docs/iNCL_NationalPrimerv22010-
web.pdf.
Wise, R. & Rothman, R. (2010). The online learning imperative: A solution to three
looming crises in education. Issue Brief. Washington, D.C.: Alliance for Excellent
Education.
Wojciechowski, A. & Palmer, L. B. (2005). Individual student characteristics: Can any
be predictors of success in online classes? Online Journal of Distance Learning
Administration, 8(2). Retrieved from http://www.westga.edu/%7Edistance/ojdla/
summer82/wojciechowski82.htm.
Wu, J. (n.d.). Revolutionize education using learning analytics. University of California,
Berkeley School of Information.
Xu, D. & Jaggars, S. S. (2011a). Online and hybrid course enrollment and performance
in Washington State community and technical colleges. CCRC Working Paper 31.
New York: Columbia Teachers College, Community College Research Center.
Xu, D. & Jaggars, S. S. (2011b). The effectiveness of distance education across
Virginia's community colleges: Evidence from introductory college-level math and
English courses. Educational Evaluation and Policy Analysis, 33(3), 360–377.
Yuan, L. & Powell, S. (2013). MOOCs and open education: Implications for higher
education. JISC CETIS Centre for Educational Technology & Interoperability
Standards. Retrieved from http://publications.cetis.ac.uk/2013/667.
Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, H. S. (2005). What makes the difference? A
practical analysis of research on the effectiveness of distance education. Teachers
College Record, 107(8), 1836–1884.
Zucker, A. & Kozma, R. (2003). The virtual high school: Teaching generation V. New
York: Teachers College Press.
Zuckerman, M., Porac, J., Lathin, D., Smith, R., & Deci, E. L. (1978). On the importance
of self-determination for intrinsically motivated behavior. Personality and Social
Psychology Bulletin, 4(3), 443–446.
Index
students 144–6, 146–7, 149; and MOOCs 64–6; and online universities 125
compulsory learning: for high school diplomas in certain states 3; mandatory orientation courses 152–3; vs self-initiated 97–8; see also college credit; developmental education
computer-based learning: as distinct concept from "online learning" 6–7; early offerings of 8
Computer Clubhouses 87–8
ConnectED 182
Conner, M. 72
constructionist philosophy of knowledge 87–8
content producers, learners as: and "better living" learning 80; and the digital divide 95–6; and learning as entertainment 76, 77–8
context: as dimension of online learning systems 8–11; grounding content in concrete context 154; providing multiple contexts 154
continuous improvement processes 161–3
control variables 22, 26, 30
CoreSkillsMastery 83
correlation vs cause 30–2
cost (to institution) of online learning: cost effectiveness of online learning 165–77; cost savings of online courses 4–5; funding issues over MOOCs 68; future research agenda on cost effectiveness 189; productivity of K-12 education in recent years 165; seen as a way to limit costs 41; virtual vs regular schools 132–3, 139
cost (to user) of learning 8, 61, 170
counseling services 157–9
course completion rates: and flexibility of online programs 123, 149; and less-prepared students 144–6, 146–7, 149; and MOOCs 64–6; and online universities 125
Coursera 57, 58–9, 64, 84
credit recovery students 4, 131, 140–1, 164
Crowley, K. 75
curiosity, episodic learning to satisfy 73–6, 89
cyber learning, as synonym for "online learning" 6
cyber schools see virtual schools
D'Angelo, C. 21, 24
Daniels, John 42
dashboards 112, 127
data mining and learning analytics 24, 51–4, 158, 163, 172, 183–4, 187–8
data systems, need for improved 139, 158, 187
Davis, L. 152
Dawley, L. 77, 78
declarative knowledge 13, 21
Dede, C. 24, 95, 114
definitions of online learning 6–8
design issues: correlation vs cause of effects 31; design as its own field 69; design as way to construct knowledge 87–8; as dimension of online learning systems 8–9, 11–12, 26–8, 34–8; and improvement of outcomes for less-prepared students 151–9; as key variable in research 23–4; and learning outcomes 34, 148; need for systemic approach 183–4
Design Principles Database 34
Desire2Learn 58
DeVane, B. 88
developing world education 59
developmental (remedial) education 47, 100, 141–7, 150, 153, 159–64
diagnostic testing 15, 33, 110
differentiation, definitions 14–15
digital divide: and credit recovery students 141; and less-prepared students 149–50; need for attention to 94–6, 182–3
disruptive innovation theory 116, 137–9, 179–80
distance learning: definition of 8; as distinct concept from "online learning" 6–7, 8; long popularity of 42
"dosage" (time spent using online system) 31
Downes, Stephen 54
DreamBox 183
Duncan, Arne 5, 48, 49, 166
Duncan, S. 91
Duolingo 80, 92–4
Durga, S. 88, 92