
Michael Feldstein

What We Are Learning About Online Learning...Online

LMS Market Updates, Dec 2015

Thu, 2015-12-10 18:37

By Phil Hill

There has been a series of news items and analyses on the higher education LMS market worth summarizing.

Major Adoption News

I posted last weekend about the University of Phoenix (UoP) and their LMS. UoP is well-known for being the biggest user of a homegrown LMS for well over a decade, but in the past several years they rolled out “Classroom”, an entirely new adaptive-learning-based design. In a major strategic change, UoP is abandoning this effort and moving to a commercial provider.

What we can now confirm at e-Literate is that the “learning platform” selected by the University of Phoenix is Blackboard Learn Ultra. This is the cloud-based redesign of Learn that Michael and I have described in several posts. Even with the University of Phoenix’s reduced enrollment, I consider this news to be the most important new client acquisition for Blackboard since at least 2011.

Today Campus Technology reported that Stanford is moving to adopt Canvas as their campus-wide LMS. Previously Stanford was a founding member of Sakai, with its implementation called CourseWork.

The university has been piloting Instructure Canvas since the 2014-2015 academic year. The vice provost for teaching & learning (VPTL) said in a statement that about 80 percent of faculty in the pilot reported being “very or somewhat satisfied” with the new platform; even more students (94 percent) found it “very or somewhat easy” to use.

Alongside the pilot, two Stanford schools had already adopted the application independently. The Graduate School of Education moved to Canvas in 2013-2014, and the Graduate School of Business did so in 2014. Both adoptions were considered successes.

During this school year, the migration was accelerated. Some 300 classes switched to Canvas. And the plan is to migrate the remaining 4,200 classes still using the legacy LMS software over the next academic year.

Moodle Moves

Michael wrote a week ago about two significant developments that will impact the future of Moodle – the launch of the Moodle Association and POET (an alliance of Moodle service providers) entering the Apereo Foundation’s incubation program.

Phil and I have written about the growing tension between the interests of Moodle HQ and those of a couple of the bigger Moodle Partners, most notably Blackboard. There are a number of ways that this tension could be resolved, but one of the more dramatic possibilities would be a fork of Moodle. While we are not predicting it will happen, a couple of developments hit the wires last week that give us some idea of what the world might look like if there were a real and permanent split between the two groups.

Summary Posts and Data

George Kroner at Edutechnica wrote a year-end review of the LMS market, continuing a long-standing (2 year) tradition. Topics included:

  • New and evolving takes on what an LMS should be and do
  • New entrants and indirect competition
  • A growing realization that course design is more important than the LMS
  • Moodle maneuver mania
  • and more

Justin Menard at LISTedTECH put out another great market visualization based on implementation and decommissioning dates at schools. The interactive visualization can display counts or percentages, along with a choice to show combined Blackboard data (combining Learn with ANGEL and WebCT) or separate product views. Here is a teaser:

LMS Historical Market Share
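
That combined-versus-separate toggle is essentially a product-to-vendor rollup applied before charting. Here is a minimal sketch of the idea in Python; the mapping and function names are mine, for illustration only, and not anything from the LISTedTECH site:

    # Roll acquired products up to a combined vendor label before charting.
    # The mapping reflects Blackboard's acquisitions of WebCT and ANGEL.
    ROLLUP = {
        "Blackboard Learn": "Blackboard (combined)",
        "WebCT": "Blackboard (combined)",
        "ANGEL": "Blackboard (combined)",
    }

    def series_name(lms, combined=True):
        """Return the label to chart: combined vendor view or raw product view."""
        return ROLLUP.get(lms, lms) if combined else lms

    print(series_name("WebCT"))                  # Blackboard (combined)
    print(series_name("WebCT", combined=False))  # WebCT
    print(series_name("Canvas"))                 # Canvas (no rollup applies)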

Justin’s concluding comments are worth considering:

As a side note: We know that Canvas is currently being piloted in 60+ higher education institutions. Those numbers are not reflected here, but we will be looking at this in our next post: who are the institutions that are piloting canvas and what LMS they are currently using.

That is interesting both for Canvas and for the prospect of forward-looking market data that goes beyond anecdotes.


Plus Ca Change: About that ed tech adoption curve

Mon, 2015-12-07 13:59

By Phil Hill

All the same we take our chances
Laughed at by time
Tricked by circumstances
Plus ca change
Plus c’est la meme chose
The more that things change
The more they stay the same

– Rush, Circumstances

Over the past few years I have increased my use of the technology adoption curve – originating with Everett Rogers and extended by Geoffrey Moore – to explain some of the tension faced by ed tech support organizations. In a nutshell, adoption patterns for pedagogical and technology-enabled changes in education are determined more by social change issues than by the innovation (whether purely pedagogical or tech-based) itself. It’s not technology, it’s people. One graphic in particular that I use is based on the notion that while Moore presented a chasm in the technology adoption curve, describing solutions moving across the chasm to reach majority markets, a bigger issue in education is that we will always have innovators, early adopters, majority, and laggard groups, with a constant flux of teaching and learning innovations. Therefore the issue is straddling the chasm, helping both sides.
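
For readers who want the math behind that curve: Rogers’ adopter categories are conventionally cut from a normal distribution of adoption times at standard-deviation boundaries, which is where the familiar 2.5 / 13.5 / 34 / 34 / 16 percent splits come from, and Moore’s chasm sits at the early adopter / early majority boundary. A quick sketch using only Python’s standard library:

    from statistics import NormalDist

    # Rogers segments a normal adoption-time curve by standard deviations
    # from the mean: innovators adopt more than 2 sd early, early adopters
    # between 2 and 1 sd early, and so on. Moore's chasm falls at -1 sd.
    curve = NormalDist(mu=0.0, sigma=1.0)

    segments = {
        "innovators": (float("-inf"), -2.0),
        "early adopters": (-2.0, -1.0),
        "early majority": (-1.0, 0.0),
        "late majority": (0.0, 1.0),
        "laggards": (1.0, float("inf")),
    }

    for name, (lo, hi) in segments.items():
        share = curve.cdf(hi) - curve.cdf(lo)
        print(f"{name:>14}: {share:6.1%}")

    # Prints roughly 2.3%, 13.6%, 34.1%, 34.1%, and 15.9% -- the values
    # Rogers rounded to the familiar 2.5 / 13.5 / 34 / 34 / 16 split.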

When Michael and I gave a keynote at last month’s OpenEd conference, the always interesting Alan Levine pointed out via Twitter (and referencing a blog post of his) that this perspective reminded him of one he read years ago and rediscovered in 2014. I found the article (and the author, I think), and it is a fascinating read that shows just how strong the pattern of innovation in ed tech is. The 1994 article could be written today with just a few changes to examples used.

Whatever Happened to Instructional Technology?

William Geoghegan was working in academic technology consulting at IBM in 1994, and he wrote the seminal article “Whatever Happened to Instructional Technology?” as a paper presented at the 22nd Annual Conference of the International Business Schools Computing Association. Keep in mind that 1994 was just one year after the first web browser, so the Internet had not yet become the dominant communication medium. As Alan noted in his post, a few excerpts from this 21-year-old paper are “quite revealing and relevant”.

The advent of digital computers on college campuses more than three decades ago brought with it a growing belief that this new technology would soon produce fundamental changes in the practice, if not the very nature, of teaching and learning in American higher education. It would foster a revolution where learning would be paced to a student’s needs and abilities, where faculty would act as mentors rather than “talking heads” at the front of an auditorium, where learning would take place through exploration and discovery, and where universal educational access, transcending barriers of time and space, would become the norm. This vision of a pedagogical utopia has been in circulation for at least three decades, enjoying a sort of perpetual imminence that renews itself with each passing generation of technology.

So the vision of personalized learning and its attendant faculty role change is not quite as new as might be assumed.

Alan’s post provides quite a few excerpts and comments, but I’m going to focus specifically on the Rogers / Moore perspective.

But there’s a problem. Despite massive technology expenditures over the last decade or so, the widespread availability of substantial computing power at increasingly reasonable prices, and a growing “comfort level” with this technology among college and university faculty, information technology is not being integrated into the teaching and learning process nearly as much as people have regularly predicted since it arrived on the educational scene three or four decades ago. There are many isolated pockets of successful technology implementations. But it is an unfortunate fact that these individual successes, as important and as encouraging as they might be, have been slow to propagate beyond their initiators; and they have by no means brought about the technologically inspired revolution in teaching and learning so long anticipated by instructional technology advocates.

That is exactly the problem that I believe is primary. It’s important to understand that the goal is not technology integration per se; Geoghegan’s critique is that instructional technology has largely failed to lead to improved instructional practices. His focus is on understanding why that is.

The instructional technology problem, in other words, is not simply a matter of technology being unavailable to faculty. It is not attributable to faculty discomfort with the technology itself, nor to faculty disenchantment with the potential benefits of information technology to instruction. In fact, the best evidence we have available today suggests that desktop computing is being widely used by faculty and, more importantly, that it is being used in support of teaching. The problem is that this support is for the most part logistical in nature: preparation of lecture notes, handouts, overhead transparencies, and other types of printed and display material that substitute for the products of yesterday’s blackboard and typewriter technologies. Such usage may enhance faculty productivity, and it may even help student learning (by substituting neatly printed transparencies for blackboard scribbles, if nothing else); but it does little or nothing to exploit the real value of the technology as an aid to illustration and explanation, as a tool that can assist in analysis and synthesis of information, as an aid to visualization, as a means of access to sources of information that might otherwise be unavailable, and as a vehicle to enable and encourage active, exploratory learning on the part of the student. The technology is being used logistically, in other words, but it is only occasionally being utilized as a medium of delivery, and to even a lesser extent do we find it deeply woven into the actual fabric of instruction.

Geoghegan then describes the problems leading to this past and current situation, and then uses the Rogers / Moore technology adoption curve and chasm to explain why. The chasm between early adopters and early majority is key, with the different needs of each side of the chasm summarized in this table.

Sides of Chasm

This perspective is the basis of the graphic “Straddling The Chasm” I have shared at several conferences this year.


In pointing out why instructional technology has not dramatically impacted the mainstream of faculty, Geoghegan describes four factors:

  • Ignorance of the gap;
  • An unholy alliance of faculty innovators, campus ed tech support staff, and technology vendors focusing on the left side only;
  • Alienation of the mainstream faculty; and
  • A lack of a compelling reason to adopt.

That third factor is quite interesting and worth quoting.

Differences between the visionaries and the early majority can produce a situation in which the successes of the early adopters actually work to alienate the mainstream. A good application of technology to instruction, for example – one that promises a radical improvement in some aspect of teaching or learning, and that is produced by technically comfortable and self-sufficient visionaries under risky experimental conditions – can attract considerable attention, and it can set what potential mainstream adopters may perceive as unreasonably high expectations that they may be unable to meet. Moore also points out that the “overall disruptiveness” of early adopter visionaries can alienate and anger the mainstream (Moore 1991:59). The early adopters’ high visibility projects can soak up instructional improvement funds, leaving little or nothing for those with more modest technology-based improvements to propose; and their willingness to work in a support vacuum ignores the needs of mainstream faculty who may find themselves left with responsibility for the former’s projects after the developer has moved on to other things. And, finally, the type of discontinuous change favored by the early adopter has a tendency to produce disruptive side-effects that magnify the overall cost of adoption.

So much for the idea that ed tech hype and pushback started with MOOCs.

To Be Fair . . .

There have been some real changes in the past 21 years, and I read this paper as describing a common pattern faced in ed tech rather than an indictment of all innovations failing. Online education is now mainstream with about one out of three US higher ed students taking at least one course online, often allowing a new level of access to working adults in particular. Competency-based education has further served this same demographic in a significant and growing manner. The vast majority of faculty use an LMS (for better or worse; the issue is that this innovation has diffused). The Internet has completely changed the access and availability of free and openly-licensed content.

However, the pattern remains, and the human element of diffusing innovations is the primary issue to address to improve learning outcomes.

In the meantime, William Geoghegan retired from IBM in 2007. He did not respond to an interview request, but I do want to thank him for this great and still-timely paper.


Exclusive: University of Phoenix moving from homegrown platform to Blackboard Learn Ultra

Sat, 2015-12-05 06:46

By Phil Hill

The University of Phoenix has a history of using its scale to develop and rely on homegrown platforms, including the adaptive learning platform branded as “Classroom”. I wrote about this investment in 2013.

The full significance of the University of Phoenix bet on adaptive learning platforms goes beyond pure dollars and became clear when the school announced the closure of 115 of its 240 locations. The stated use of the savings from campus closures is primarily to further invest in the platform, as described by the Phoenix Business Journal.

In early 2015 the University of Phoenix experienced “significant disruption with respect to our new online classroom platform”. By summer, the university’s parent company The Apollo Group announced to investors that it planned “to move away from certain proprietary and legacy IT systems”. CEO Greg Cappelli further described these changes in broad terms in the most recent earnings call.

Finally, as we discussed last quarter, we continue to build self-service capabilities for many of our student service applications, and we’re making progress in moving away from more costly proprietary and legacy IT systems that will greatly improve cost and efficiency. We’re still on track to begin to roll out the new Learning Management System for incoming students in partnership with a leading provider using their newly designed state-of-the-art LMS by the end of fiscal 2016 [end of August].

What we can now confirm at e-Literate is that the “learning platform” selected by the University of Phoenix is Blackboard Learn Ultra. This is the cloud-based redesign of Learn that Michael and I have described in several posts. Even with the University of Phoenix’s reduced enrollment, I consider this news to be the most important new client acquisition for Blackboard since at least 2011.

Interview

The Apollo Group’s Mike Sajor, CIO, and David Fraser, Senior Director of Academic and Instructional Technology, agreed to an exclusive interview with e-Literate to describe the changes.

The driver for the University of Phoenix to move away from their homegrown system to Blackboard is an initiative called Project Bedrock. The concept is a pyramid of functionality, where the bottom level is necessary infrastructure that can be provided with a commodity SaaS approach (e.g. HR systems with little differentiation), and the top level is unique value-add functionality where it is worth investing in a custom solution (e.g. learning analytics targeted at individual students in their specific demographics). The LMS, with its rosters and grade books, is no longer viewed as worthy of internal development and investment. As Mike Sajor explained, they could develop the world’s greatest grade book, and even if people acknowledged that it was indeed a great grade book, people would still not care that much, nor would students see significant benefits.

Project Bedrock also assumes that commodity and off-the-shelf software should focus on configuration and not customization. Mike Sajor said that he was under no illusions that there would be no customizations, but they clearly want to minimize this work, even by changing internal processes. Put another way, the University of Phoenix will likely be using the commercial, standard version of Learn Ultra.

I pointed out that this was a major change in company strategy compared to recent huge investments in a homegrown learning platform, and they agreed. As David Fraser described, however, when they took a hard look at the problem, the continuing LMS investment was not worth it.

The University of Phoenix selected Blackboard’s Learn Ultra as their new LMS based on two primary factors – a comprehensive set of functionality and the redesigned user experience of Ultra. David Fraser described how the academic units and leadership looked at their mission and came up with a comprehensive checklist of functionality, and Blackboard hit those checklist items.

Where they want to invest more is in analytics. Mike Sajor described his view that there are two flavors of learning analytics – aggregate data in reports or visualizations across student groups, and predictive behavioral models at the individual level. Apollo plans to “take what Blackboard has and use the heck out of it”, but they have not yet determined what types of analytics will end up in Blackboard or in custom development. They did explain that Blackboard’s Analytics mostly addresses the aggregate data problem. Again, this area of learning analytics is where Apollo wants to invest and provide unique functionality.

The decision thus far has been specific to the University of Phoenix. While other Apollo Group brands (e.g. Apollo Global and Western International) are going through similar re-evaluations based on the same principles as Project Bedrock, they have not chosen to go with Blackboard. Yet.

David Fraser described that moving to Blackboard Learn Ultra is a disruptive change affecting a large number of faculty and students. The first communication went out in July notifying faculty of the change, and the executive team is holding a road show of sorts, visiting local campuses to describe changes and get feedback.

On the subject of expectations, I asked about adaptive learning, which was a major focus of the homegrown Classroom system and is not native to Learn Ultra. David Fraser said that faculty have largely become accustomed to the individual analytics and adaptive approach of the legacy Classroom system, and they will expect the same in Learn Ultra. While details have not been determined yet, they are mostly planning on adaptive applications integrated with Learn Ultra to provide similar functionality.

Notes

Given the strong focus on analytics at the University of Phoenix, several of Blackboard’s recent hires and acquisitions now make even more sense. Last year Blackboard hired John Whitmer from California State University, and he has been deeply involved in data research including user experience issues such as how different reports and graphs are presented in context. After the two founders of iStrategy, the company Blackboard acquired at the end of 2010 to provide the foundation for their Analytics platform, left the company in 2014, there was a hole to fill on how to design and implement the platform. This is where Blackboard’s acquisition of Blue Canary, announced just this week, comes in. I suspect that Blue Canary was mostly an acqui-hire of Mike Sharkey in particular – in fact, his new title is VP of Analytics. Mike actually worked at the University of Phoenix on their analytics strategy from 2002 to 2010. John and Mike are two of the most knowledgeable and practical people in the field of learning analytics.

There is another story to consider in that one of the world’s largest educational institutions was unable to successfully develop and deploy a custom learning platform despite massive investment. In the interview, the decision was chalked up to their new IT strategy, but the earnings call transcripts make it clear that there were also real problems with Classroom.

This move also raises the question of how significant this LMS selection is to Blackboard themselves. We have extensively documented at e-Literate the market share losses over the years, the bold effort to redesign as Ultra, and the subsequent delays in completing Ultra. For Blackboard to remain viable as a long-term learning platform company, they need two things: international growth (which they appear to be gaining both for Learn and for Moodle Solutions) and successful completion and acceptance of Ultra particularly for the US market. Blackboard now has a huge anchor client for Learn Ultra with the University of Phoenix, and this client is one of the most demanding customers in higher education.

What are the risks? The most obvious one is the delays in Ultra coming at a time of aggressive cost-cutting and layoffs. Blackboard faces the challenge of needing to invest (to complete Ultra) and to cut costs (to improve their financial metrics for debt markets and for corporate acquisition). This is a difficult challenge, and the risk is now shared by Apollo Group. The sharing of risk, however, actually makes it more likely that Blackboard will maintain focus, complete the major functionality needed in Ultra, and deliver to the market in summer 2016. Another risk is that the University of Phoenix, Apollo Group, and Blackboard will need to provide similar levels of learning analytics and adaptive learning capabilities as provided by the legacy Classroom platform, and these capabilities will need to be pulled together from multiple sources – Blackboard software, integrated third-party software, and internal development.

We will continue to track this development and its impact on the University of Phoenix, Blackboard and the broader ed tech market. For now, count this as a major win for Blackboard and a major change in strategy for the University of Phoenix.


Moodle Moves Give Hints of What a Post-Fork World Could Look Like

Thu, 2015-12-03 13:22

By Michael Feldstein

Phil and I have written about the growing tension between the interests of Moodle HQ and those of a couple of the bigger Moodle Partners, most notably Blackboard. There are a number of ways that this tension could be resolved, but one of the more dramatic possibilities would be a fork of Moodle. While we are not predicting it will happen, a couple of developments hit the wires last week that give us some idea of what the world might look like if there were a real and permanent split between the two groups.

The Moodle Association Launches

On the Moodle HQ side of things, last week marked the official launch of the Moodle Association. Up until now, the way that Moodle has been funded is as follows:

  1. Moodle schools hire Moodle Partners to provide hosting and/or services.
  2. Moodle Partners send a portion of their gross Moodle-related revenues (10% in most cases) to Moodle HQ.
  3. Moodle HQ then does the substantial majority of development of core Moodle functionality.
  4. There is some direct code contribution to Moodle by trusted members of the Moodle community, as overseen by Moodle HQ.


All of the cash and most of the total resources have flowed through the Moodle Partners on the way to HQ. The non-profit Moodle Association provides a second route. Individuals and institutions can join the Association directly by paying a membership fee. That fee buys them votes toward possible Moodle development projects. The projects that get voted up, subject to a feasibility and relevance review by the Association’s governing body, get funded for development by HQ using membership dollars. Moodle News has written a post that provides a back-of-the-envelope analysis of the potential for revenue to flow into core Moodle development through this model, comparing the launch of the association to an IPO. It’s not a perfect analogy; for example, the revenues aren’t really equivalent to the market capitalization of a publicly traded company. But it gives a good example of the kind of math you need to do in order to see how much of an impact the association could have on Moodle development resources, which would be particularly important if one or more of the larger companies were to leave the Partner program.
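
To make “the kind of math” concrete, here is a minimal back-of-the-envelope sketch in Python. The membership tiers, fees, member counts, and developer cost below are entirely hypothetical placeholders of mine, not the Association’s actual fee schedule; only the shape of the estimate matters:

    # Hypothetical estimate of Moodle Association dues and what they buy.
    # Every number below is a made-up placeholder, not actual Association data.
    tiers = {
        # tier: (annual fee in dollars, assumed number of members)
        "individual": (100, 500),
        "institutional": (1_000, 100),
        "large organization": (10_000, 10),
    }

    annual_dues = sum(fee * members for fee, members in tiers.values())

    # Assume (again, hypothetically) a fully loaded developer costs $100k/year.
    developer_cost = 100_000
    funded_developers = annual_dues / developer_cost

    print(f"Estimated dues: ${annual_dues:,} per year, "
          f"funding about {funded_developers:.1f} core developers")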

POET Enters Apereo Incubation

The Partners of Open source Educational Technologies (POET) Working Group is an alliance of Moodle service providers that currently collaborate on Moodle-related projects of mutual interest. From the press release:

POET’s goals are to help establish a focus for plugin reliability, security, and innovation for the community within the Moodle project….POET’s initial focus will be to help Moodle with improvements to the quality assurance programs for Moodle plug-ins by creating open testing and review processes and tools to be used by the Moodle community. Beyond that, POET plans to improve developer documentation, create new development training resources, create testing environments for high usage/high-performance Moodle installations, and participate in the co-development of new plug-ins focused on integrations with other open source systems.

Who are the founding members of POET?

  • Moodlerooms, now owned by Blackboard
  • Remote-Learner UK, now owned by Blackboard
  • Netspot, now owned by Blackboard
  • Nivel Siete, now owned by Blackboard
  • Remote-Learner.net, which has left the Moodle Partner program
  • Lambda Solutions [Corrected.]

The new executive director of POET is Mike Churchward, who also works at Remote-Learner.net. While there is a rationale for a group of large Moodle hosts to collaborate on testing plugins with an eye toward requirements that other Moodle users might not have, this is also clearly a group that is not exactly in perfect alignment with Martin Dougiamas and Moodle HQ. Their decision to go through the Apereo incubation process is interesting. If they were just looking to check the box required by the open source license, they could have just put the code up on GitHub. By applying to Apereo, a non-profit organization run by colleges and universities with an Apache-like multi-project portfolio and incubation process, they are sending a signal that they are serious about doing true community-based open source work and are willing to go through an incubation process to ensure it. Now, this could be nothing more than a good community-oriented business move for the plugin work they wanted to do anyway. But if the group should decide to fork Moodle together at some point, they would have a home for the project that provides them with some open source credibility as well as a community of potential adopters and participants.

In the short term, the big winner in all of this is Apereo. Neither of its predecessor organizations—the Sakai Foundation or the Jasig Foundation—would have been a good fit for POET. But post-merger, the educational focus of the Sakai Foundation combined with the ecumenicalism of the Jasig Foundation has made Apereo a logical home for an open source project around just about any learning tool. POET’s selection of the foundation vindicates the merger strategy and raises Apereo’s profile as an honest broker and facilitator for educational open source.


Is Open Education a Movement?

Tue, 2015-12-01 09:41

By Michael Feldstein

Phil and I had a wonderful time co-keynoting the OpenEd 2015 conference. It was a hoot.

Here is the spectacular graphic depiction of the keynote done by Tracy Kelly from BC Campus:

And here is the post-keynote conversation that we had with the Virtually Connecting gang:

There have also been some other interesting virtual conversations growing around the conference, first over Twitter and now in the blogosphere. Some of the ones worth reading that I have seen so far include those from Maha Bali, Clint Lalonde (parts one and two), Robert Farrow (who also captures some of the tweets), and Robin DeRosa.

Phil and I approached our keynote as analysts because we figured that’s probably the main value that we have to offer. We wouldn’t presume to tell people what open education “should” be about, but given any particular “should,” we’re pretty good at thinking through how to achieve it. For my part of the talk I used the analogy of building the climate action movement because the open education community often speaks of itself in the language of a movement and because I happen to have some experience in that area. And if you read some of the posts I mentioned above, you will indeed see varying degrees of movement language.

But there is an underlying question that really should be addressed explicitly: To what extent is it accurate or useful to think of open education as a movement?

There are several factors to consider, starting with scale of ambition. One of the points that I made early in the presentation is that the degree to which you need a strategy—whether it is a movement organizing strategy or some other kind—depends on the scale of your ambition. Different participants could come to open education with different sized goals:

List of goals, from low ambition to high ambition

If your goals are near the bottom of that list, then you don’t need much of a strategy, which means you don’t need (or want) a movement. If your goals are near the top, then you definitely need a strategy and you might need a movement. One of the subtleties of movement building is that you are always trying to move people further along the list from low ambition to high ambition. (Maybe that’s why they call it a movement.) If you’re an open education organizer, maybe somebody comes to your meeting for the first time because she is interested in using OER for her own class. You want to get her hooked. Maybe you want her to try open teaching practices in her own class. And then maybe you want her to show her colleagues what she’s doing and convince them to do the same. And then maybe you want her to organize her colleagues to go to the administration and advocate for college-level support for fostering open teaching practices. And so on. Your goal is to encourage her to move up from the thing that she cares about—the thing that brought her in the door—to the things that you and she both care about. Bigger things that you can only accomplish together. For me, part of what separates an actual movement from movement talk is having a clear ladder of ambition that forms the framework for movement organizing activities.

I want to pick on a widely noted tweet by Jim Groom here because it makes a direct claim on this kind of movement building. He wrote,

The unfortunate equation of open education w/ free text books has made the movement seem more and more myopic and less and less compelling.

— Jim Groom (@jimgroom) November 19, 2015

There are actually a number of claims packed into this tweet, including that open education is being equated with OER, that OER is being equated with free, and that free curricular content is being equated with free textbooks. And then there’s the question of what it means for one to be “equated” with the other. Equated is a strong verb. But I want to focus for the moment on his conclusion that, assuming that open education is being equated with free textbooks, “the movement” will be less compelling. I interpret “less compelling” in the context of the scenario I just presented to mean that you (as an open education movement organizer) are less likely to get people to come to your meetings and less likely to get them to climb that ladder of ambition. Is Jim’s claim true? Probably not for educators in the developing world. If you live in a developing nation where the choice for your students is a stark, binary one between free educational resources or none at all, then free curricular materials are not a narrow issue. This is probably why the first global declaration on OER was written in Cape Town, South Africa. The question of whether OER are compelling enough to be worthy of attention is a First World problem. But Jim lives in the First World, and there is a legitimate question of the degree to which a focus on OER makes open education more or less compelling to First World educators who might be interested in open education. Let’s look at both sides of it.

I’m going to return to my keynote analogy of climate organizing because, again, it’s something that I have some experience with. A natural gas company is proposing to run a major pipeline through our area. In our local climate change organizing meetings, we have a choice about how much of our time and energy we want to focus on fighting the construction of that pipeline as opposed to, for example, pushing for state-level carbon pricing, fostering local uptake of renewables and energy efficiency, or advocating for fossil fuel divestment. Will a strong focus on the pipeline increase or decrease participation? On the one hand, we have found there are a lot of people in our area who may care about climate change but are less motivated to take substantial and sustained action to prevent it than they are to prevent the construction of a massive pipeline that would carry up to a billion cubic feet a day of pressurized, explosive methane gas near their homes, schools, and parks. There is no question that the pipeline has been a huge driver for our growth. On the other hand, it is also true that if we only talked about the pipeline, if we “equated” climate activism with pipeline activism, then other participants would not show up. (I am one of those other participants.) It’s not an either/or situation and a lot hangs on the meaning of that verb. But on balance, we have had ample evidence that a focus on the campaign to stop the pipeline has done much more good than harm, especially because, while we do focus on the pipeline, we do not equate the campaign goal with the movement goal. We think globally but act locally.

My sense is that a focus on OER can and often does drive participation in open education in a similar way, despite the fact that I don’t yet see a clear, movement-defining ladder of ambition. My sense of #OpenEd2015 is that there was widespread interest in ambitions beyond open textbooks but, lacking a clearly articulated ladder of ambition, there wasn’t a lot of focus on it.

But there is a deeper question related to the appropriateness of the “movement” label for the open education community represented at #OpenEd2015: What’s at the top of the ladder? If free textbooks are not the ultimate goal, then what is? I have yet to hear one clearly articulated. What is open education? How will we know when it has been achieved? I hear movement talk about more radical ambitions, but I don’t know what’s at the top of the ladder. And to the degree that people don’t know what’s at the top, they are less likely to start climbing. Sometimes I think that open education, writ large, is actually the opposite of a movement. It is a critique in the critical theory sense of the word. Maybe the OpenEd community is partly an affinity group for people who feel a deep sense of dissatisfaction with the state of our modern education system. People who feel that it is canned, prepackaged, and dehumanizing for students and teachers alike. They don’t like that courses are often about transmission of content that is packaged up by a corporate entity and sold to students at great cost. They don’t like that large computer systems created by other corporate entities and purchased (or, more accurately, rented) at great cost by the university become the virtual classrooms and shape pedagogical affordances. They don’t like that, in this model, faculty are reduced to repeaters of content and students are reduced to memorizers of content. They believe that these are all related symptoms of larger structural problems in our educational system. They see clearly the things that they don’t like. The better alternative system and means for achieving it are less clear. “Openness” seems to capture the spirit of values that various participants in the community are trying to articulate, so that becomes the banner. But to the degree that OpenEd is fundamentally based in critique, critique is much better at taking things apart than it is at putting them together. And wrapping that critique in movement talk doesn’t change that fact.

If you want a case in point of how wrapping a critique in movement talk will fail to create a movement, look no further than Occupy. What’s the main thing that people remember about it? What’s the first thing that the average person who wasn’t an Occupy protester will tell you if you ask them about Occupy? People in tents in city parks. What were they trying to accomplish? I doubt that even one out of ten people can answer that question. And once the tents were packed up, the “movement” disappeared. Oh, there were a few local victories and successor organizations that have done some good, but those mostly exist where very specific goals, e.g., labor contracts, were presented under the banner of Occupy. The successful branches were thinking globally but acting locally. The larger thing just disintegrated, because it was not a movement but a critique of the current global distribution of wealth and the power structures that sustain it. Regardless of the merits of that critique, it had no ladder of ambition or strategy for getting people to climb that ladder. A ladder of ambition is a ladder of participation. It is a structure for bringing more people together to achieve more ambitious but commonly held goals. There is a reason that the core activity of a movement is called “organizing.”

It is possible to have both critique and movement that are complementary. But they are not the same. I worry that complaints that OpenEd is not sufficiently radical may equate with the complaint that any set of goals specific enough to form rungs on a ladder of ambition is not radical enough to support the expansiveness of the critique. This is one way that movements commit suicide. But another way is to focus on the instrumental and achievable while taking for granted that you will somehow get around to the bigger goals at some indeterminate time in the future, or that they will take care of themselves. Instrumentalism and aspiration must be connected with a strategy.

OpenEd does not have to define itself as a movement in order to be useful or successful or worthwhile. It could be an affinity group driven by critique. It could be a community of practice. It could be a professional development conference. It could be some mix of those. Those are all fine things to be. It can also be a movement. But to the degree that it wants to be one, then it must be organized around a nested set of increasingly ambitious goals and strategies that are clearly defined and achievable in scope.


Student Course Evaluations and Impact on Active Learning

Mon, 2015-11-30 10:51

By Phil Hill

The Chronicle has an article out today, “Can the Student Course Evaluation Be Redeemed?”, that rightly points out how student course evaluations are often counter-productive to improving teaching and learning. The article refers to a Stanford professor’s call for an instructor-completed “inventory of the research-based teaching practices they use”, but most of the article centers on a revised course evaluation tool from a Kansas State University spin-off (the IDEA Center). One of the key problems described is that “administrators often take their results as numerical gospel”, along with faculty misapplying the results.

However they’re used, a lot of course evaluations simply aren’t very good, [IDEA president] Mr. Ryalls says.

But as flawed as they are, faculty members still turn to them as some gauge of effectiveness in the classroom. About three-quarters of instructors use formal evaluations and informal feedback “quite a bit” or “very much” when altering their courses, according to the Faculty Survey of Student Engagement.

One limitation of many tools is that they ask students things they don’t really know. A frequent example: Was your instructor knowledgeable about course content?

There is one additional problem with most student course evaluations that is not explicitly covered in the Chronicle article – students newly involved in active learning approaches often rate the course and instructor poorly even if they end up learning more effectively. We saw this in our e-Literate TV case study at UC Davis. In a previous post we highlighted how the routine hard work required of students in active learning courses can lead to poor evaluations, but later in the interview student course evaluations came up as a major barrier to improving teaching practices.

Phil Hill: Catherine, especially with even more of a firsthand view, what do you see as the biggest barrier?

Catherine Uvarov: Well, in a way, I was fortunate because I was more a newbie instructor, so I didn’t have like 20 years of experience where I had done it this other way. Just coming in and telling instructors, “Hey, that thing that you’ve been doing for 20 years. You could be doing it better.” They don’t want to hear that. They have worked very hard over the past 15-, 20-plus years to optimize their instructional methods to the best of their ability within their set of norm practices.

Chris Pagliarulo: And the feedback that they were getting.

Catherine Uvarov: And the feedback, so there is a huge emphasis on student evaluations and how much students like you, which is not really correlated at all with how much they’re actually learning. So, if the only measure of student learning—or of anything in the class—is student evaluations, then that’s what the instructor is tuning for.

They’re not really figuring out if their students are learning or turning the mirror on themselves and saying, “What can I do to improve my student’s learning?” They’re just saying, “What can I do to make my students like me better?”

Phil Hill: Actually, I’d like you to go a little bit more detail on course evaluations as they’re currently used. I think I heard you say those are more based on, “Do students like me?” So, what do the current course evaluations really measure? What direction does it push faculty?

Catherine Uvarov: In my opinion, the student evaluations are pretty much worthless because the questions that they ask are very generic. It’s like, “Does the person speak loud? Are their visual aids clear?” It’s very generic and bland, and then it gets down to the only question that they really care about—rate the overall performance of this instructor.

What we have found in my flipped class and in any of these where the lecturer is changing their style and making the emphasis more on the students, the students are thinking, “Well, I learned all of the material on my own, so the instructor didn’t teach me that material. I’m going to rate the instructor lower because they were not as valuable to me.”

Erin Becker: When you make the students do more work, they don’t like you as much, and that hurts your course evaluations, which in turn feeds back in to the incentivization issue.

Marc Faciotti: It’s a challenge. If you’re not thinking about education all day—and most of us have research labs that occupy a lot of time as well (administrative duties and all that type of thing)—so if you don’t have training there, there’s a lot of catching up to do. Most institutions have great resources on campus. There’s people dying here at iAMSTEM to help and to catalyze some of these things. So, seek help, be realistic about how much you’re going to change the first time around, and have kind of a long-term plan for what you’d like to achieve.

Marco Molinaro: I think the biggest barrier we have right now is that the faculty rewards system doesn’t yet take in to account this type of experimentation and doesn’t really promote a faculty member based on the quality of their instruction and the effects that they’ve had on student learning.

Later in the Chronicle article there is a discussion about whether to scuttle student evaluations altogether. I strongly agree with this conclusion:

For Mr. Ryalls, of IDEA, the problems with students’ evaluations shouldn’t scuttle their use altogether. “What drives me crazy,” he says, “is this notion that students don’t know what the hell they’re talking about.” They spend more time than anyone else watching faculty members teach, he says. “Student voice matters.”


New Visual From LISTedTECH Shows LMS Market By New Implementations

Mon, 2015-11-23 19:15

By Phil Hill

Justin Menard and his team at LISTedTECH have produced a great new visual on the LMS market in North America. Using their wiki-based data covering 4,000+ institutions, the chart shows the percentage of new LMS implementations per year (scaled to 100% for each year). While we are used to seeing LMS market share in terms of the number or percentage of institutions using each LMS as their primary system, this view highlights the dynamics of the market – which systems are getting adopted. See the original post for the full chart and additional description.
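
As a sketch of how a chart like this can be derived from the underlying records: count new implementations per LMS per year, then scale each year’s counts so they sum to 100%. A minimal pandas version with toy rows standing in for the wiki data (the column names are my own illustration, not LISTedTECH’s actual schema):

    import pandas as pd

    # Toy records standing in for the 4,000+ institution wiki data.
    df = pd.DataFrame(
        [
            ("School A", "Canvas", 2014),
            ("School B", "Canvas", 2014),
            ("School C", "Blackboard Learn", 2014),
            ("School D", "Canvas", 2015),
            ("School E", "Moodle", 2015),
        ],
        columns=["institution", "lms", "year_implemented"],
    )

    # Count new implementations per LMS per year...
    counts = df.groupby(["year_implemented", "lms"]).size().unstack(fill_value=0)

    # ...then scale each year (row) to 100% so years with different volumes
    # of activity are comparable, as in the chart.
    share = counts.div(counts.sum(axis=1), axis=0) * 100
    print(share.round(1))

    # A stacked area plot of `share` reproduces the chart's basic structure:
    # share.plot.area()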

LMS Providers Market Share by Year

A few notes:

  • We’ve come a long way since the days of simple bar charts for market data.
  • Keep in mind that this data shows new implementations and not LMS decisions, per se. If a school goes through an RFP process and selects their current vendor, this data will likely not show up as a new implementation.
  • Also keep in mind that while 4,000 institutions is impressive, it is not exhaustive and not bias-free. Read this as the best data available on new implementations, painting a picture of trends in the market. Note the description at the bottom of Justin’s post on limitations of the data.
  • We have covered at e-Literate that Blackboard’s Moodle-based clients are growing in number (over 1,400 as of this fall), especially outside of the US. But note that this growth is not occurring in North America.
  • The dominant player in new implementations is obviously Canvas. As of last year (and I expect that 2015 will be even stronger for Instructure, maker of Canvas), Canvas had more North American new implementations than Blackboard, Moodle, and D2L combined. We have not seen this dominance of new implementations since Blackboard back in the 1998 – 2002 years, when the market was a land grab.
  • Notice how Blackboard, WebCT, and ANGEL are all colored shades of blue – based on Blackboard’s acquisition of the other two companies. This allows you to also see the combined company effect.

More description of the data in the original post.


Rapid Fire Feedback from #WCET15

Mon, 2015-11-16 18:58

By Sasha Thackaberry

ed. We welcome Sasha Thackaberry with this post summarizing observations at last week’s WCET conference in Denver. We met Sasha while consulting at Cuyahoga Community College (Tri-C) and were impressed with her breadth of knowledge. We’ve asked her to provide her own take on the WCET conference.

It was my first time at the WCET Annual Meeting in Denver. I was extremely fortunate to attend and thrilled to present with Luke Dowden, whom I met for the first time at the conference. To hang on to some of the great learning, I wanted to create a brief recap of the conference and some thoughts on the next evolution of edtech in higher ed.

One theme that emerged from the conference was that higher education is moving beyond an era of traditional online courses and programs toward a more simultaneously disaggregated and integrated future. This evolution is creative, messy, and occurring on multiple fronts, with the twin drivers of tightening budgets and the need for job-ready graduates propelling new models of learning and teaching. Individual and highly specialized innovations are converging to create entirely new ecosystems of learning, both pedagogically and architecturally.

The overall tenor of the conference was future-forward. Rather than focusing on limitations, the sessions and keynotes centered on a common theme: We can rebuild it. We have the technology…. We can make it better, stronger, faster (and cheaper).

1. We’re beyond “traditional online courses.”

Though this may seem like a “duh” moment to many of us in edtech, this is still news to parts of the higher education industry. There is a wide spectrum of implementation of online courses and programs in higher ed – anywhere from cohesive online programs that have a formalized course creation, redesign, and QA process to those institutions that we like to refer to as the “Wild West” of online learning, where courses are created at will without formalized quality review, partnership with instructional designers or developers, or in some cases even required faculty training.

However, on the other end of the spectrum, that Wild West concept of online courses is so last week. On the new frontier of digital learning innovation in higher ed we’re seeing modularization at a new level: competency-based education, yes, but also microcredentials, integration of alternative providers like boot camps, and incorporation of open learning like MOOCs. We’re talking convergence of digital innovation. Many national initiatives are supporting this evolution in various ways (see #2, about ecosystems).

And how is this evolution occurring? These alternative learning models may seem like they’re fringe, outside of the mainstream, branded with the stamp of the all-too-overused “Disruptive Innovation.” However, we’re increasingly seeing these alternative models being intentionally incorporated into the fabric of higher education. The institutions leading the charge are strategically taking smart risks to be at the front-end of this evolutionary spectrum, meeting the needs of an increasingly diverse population of lifelong learners.

Check out Wichita State University’s Stackable Badges for awarding credit. The Colorado Community College System is conducting a system-wide digital badging initiative. There are initiatives to gain credentialing coherence like “Connecting Credentials” and the associated Credential Transparency Initiative. Colleges are increasingly focused on graduating students who can actually get jobs (note this is not a knock on a liberal arts education, which I passionately believe in; folks need to have a family-supporting wage too). Adaptive learning was a big topic too – Colorado Technical University’s Intellipath is an example. Prior Learning Assessment (PLA) was a topic, and though it’s not new by any means, it is gaining additional adoption as colleges struggle to reconcile how credit will be awarded and transcribed. Competency-Based Education (CBE) was highlighted, with UW’s Flexible Option taking front and center for a session. There’ve been a bunch of new convenings and conferences popping up around this accelerating model, including CBExchange in Phoenix and CBE4CC in Denver. There were also the more traditional concerns about state authorization and student authentication, as well as protecting student data. Less fun for my particular brand of geekiness, but no less critical to the success of our endeavors. There was also the groundbreaking work of scaling Open Educational Resources (#OER) that University of Maryland University College shared, which won them a WOW Award. Dreaming the dream of a no-textbook-cost degree! All of these conversations – from talented colleagues – had one thing in common: they are deeply connected and intertwined.

2. We’re talking ecosystems and acceleration.

The portfolio approach is passé – the digital learning ecosystem is the new portfolio approach (for those of you unfamiliar, that was the world’s most awkward “Orange is the New Black” parallel).

The interconnected nature of each of these elements means that institutions need to make decisions about scope and identity. First, institutions need to know where they are on the spectrum of innovation in digital learning. Secondly, they need to decide where they want to be. And – this is important – institutions that don’t make that decision consciously may find themselves overtaken by the institutions that have.

There are initiatives cropping up all over the place to bring the effectiveness of these new models to scale, including the Competency-Based Education Network (#C-BEN), CBEinfo.org (another great organization/resource site), and IMS Global, which is attempting to solve the substantial system integration challenges that this ecosystem introduces. I could list them all, but it would be a long list, and it would be incomplete – it seems like new initiatives crop up every day.

This new digital ecosystem is like a living thing – it adapts, grows, rewires itself, makes new connections, mutates. The tightening of state budgets and the calls for job-ready graduates are the climate change of this digital ecosystem. Dramatic shifts in the environment are necessitating adaptation and change.

Though “evolve or die” may be a little dramatic, operating on the same assumptions that sustained higher ed a generation ago will not be successful for most institutions. Figuring out how to dig in with both hands and get messy and creative will be a successful strategy for institutions that are evolving, and improving student success as they go.

Interested in learning more about culture change and innovation? Check out the asynch resources from the presentation I did with Luke Dowden, “Strategic Innovation: Working Through the Strategy and the Skeptics.” Here’s the Google Site we used to curate resources and the strategies themselves: www.tinyurl.com/Strategy2Innovate. If you’re one of my Tweeple, you can catch all the action by searching #Strategy2Innovate on Twitter.

3. Oh the humanity – yep, we’ve got that too.

One of the challenges with “traditional” online education has long been “student engagement.” Student engagement is the strange and wondrous, amorphous term that we in education use to describe the human element – interactive communication, a sense of connection, a feeling of belonging and responsiveness that comes when learners – and college staff and administration – feel that they are part of a community. The complaint about asynchronous learning has long been that it lacks student engagement, and at least some data support the claim that student grades and retention rates are higher in face-to-face and blended/hybrid courses than in fully online courses. This has been particularly pronounced at the community college level, though students who take online courses also graduate in greater percentages than those who do not. More than one study has found no difference overall in learning outcomes. And indeed, studies have revealed the importance of engagement within online courses for student success.

How do we square this lack of student engagement with the world in which we live where Twitter, Facebook, Instagram, Tumblr and a dozen other ubiquitous social media channels are a core part of how we stay connected to friends and family?

Here’s the secret answer – there’s a better way to do it.

When over one third of marriages in the U.S. start online, I think we can squarely throw out the hypothesis that online interactions – in and of themselves – are less satisfying, less connecting, and less effective for learning than those in the ostensible “real world.” Our core business in higher education has not been wholly dependent on creating systems and processes to support meaningful interactions online. Harmony.com, however, has quite a bit invested in meaningful interactions online. Colleges and universities have long been in the business of information transference – a totally different purpose, and one that is being fractured as our understanding of effective learning and teaching transforms. The legacy systems that still dominate the marketplace, like LMSs, are not designed for connection; they’re designed for information delivery and assessment.

Student readiness and instructional design are, of course, the other two core components of the triad of effective elearning, but this engagement component can be, will be, and is being tackled. As more colleges experiment with what student support, coaching, and advising mean, they’re finding huge dividends in virtual connections. And we’re talking beyond email or phone calls – we’re talking mobile apps, texting alerts, and a Skype, FaceTime, Google Hangouts, WebEx, and Adobe Connect world. A panel discussion about virtual student coaching raised many of these concerns and showed how some companies and institutions are making online coaching effective and engaging. Engaging faculty – both full-time and, particularly, adjuncts – is critical too. Excelsior College shared strategies for engaging adjunct professors, while UC Online and Case Western Reserve University shared their alternatives to sit-and-get training.

The point of the technology is not technology itself. It is a tool that enables rich learning.

This human digital experience is indeed “sometime in the future” for many colleges and universities. Meanwhile, innovative institutions are evolving their very DNA, creating agile processes that support turning on a dime. Size, funding, and stature no longer guarantee success for institutions. In an increasingly unbundled world of learning, students have a level of choice like never before. They will vote with their mice, their tablets, and their thumbs.

4. Insert pithy leadership-related title here….

The second day of the conference kicked off with a fantastic panel on leadership in edtech. Ron Legon from Quality Matters, Loraine Schmitt from Portland Community College, and Ray Schroeder from UPCEA’s Center for Online Leadership shared diverse and convergent perspectives, facilitated by Michael Abbiatti from WCET. High on the radar were the need to mentor and coach the next generation of leaders and the important – and often neglected – necessity of succession planning. Those of us Gen-Xers looking to get into the C-suite had our hopes buoyed by the demographics: the number of baby boomers retiring will leave a leadership void in higher education, and the visionaries in our field are both concerned about and committed to preparing the next generation.

This message was very similar to one I heard at the ACE Women’s Network of Ohio conference at Corporate College East at Cuyahoga Community College the week before last. Women are a large part of edtech, and anyone familiar with the #LeanIn movement knows that tech in general needs more women leaders. Under the school of “chance favors the prepared,” I – who am working on my dissertation in #CBE – met several other women at the conference, also middle managers, who are doing the same, prepping for the next step. The research being pursued is meaningful and exciting, and it was great to connect with so many like-minded, curious, and passionate people – among them Phil Hill, Ron Legon, Kelvin Bentley (my old boss), Luke Dowden, and new friends Brenda Perea, Cali Morrison, Meg Raymond, Dale Johnson, Amanda Shaffer, Catrina Roitman, Wendy Bullock, and Tammy Vercauteren. I also enjoyed meeting Vernon Smith, Michael Abbiatti, and Russ Adkins – follow-up is on the way, so that’ll be fun as well.

Wrapping it up….

This was my first time at #WCET15. I will definitely be back. There were so many great learning opportunities and sessions that they couldn’t all be mentioned here – this post simply traces the path I took through the conference.

The current state of edtech in higher ed is messy and looking to get messier. It’s in that interesting place of explosive creativity, where everyone gets all the materials out, looks at the current piece of art, and decides how to deconstruct and then reconstruct it. This time we’re doing it with the student – not the institution – squarely in the middle of the picture. And then we’re making an infographic of it. And PDFing it. And making a text-only accessible version. And an audio-version. And an animation of it.

Because that’s just how we roll.

Right over to meet the student where they are.

The post Rapid Fire Feedback from #WCET15 appeared first on e-Literate.

Live Stream for Wednesday’s OpenEd Keynote

Mon, 2015-11-16 16:29

By Michael FeldsteinMore Posts (1055)

Phil and I will be giving a joint keynote to the OpenEd conference the day after tomorrow—Wednesday, 11/18—at 8:30 AM Pacific Standard Time. You can watch the live stream here. If you miss it live, don’t worry. The video will be available afterward at the same URL. For a modest license fee, all rights reserved.

Kidding about that last part.

The post Live Stream for Wednesday’s OpenEd Keynote appeared first on e-Literate.

Interview with Josh Coates, CEO of Instructure, on today’s IPO

Fri, 2015-11-13 17:01

By Phil HillMore Posts (382)

Instructure, maker of the Canvas (higher ed and K-12 markets) LMS and Bridge (corporate learning market) LMS, held their Initial Public Offering today. Prior to the IPO, Wall Street analysts focused on the company’s growth, its large losses, and the challenges of the education market. The company was priced on the lower end of its range ($16.00), and closed up 12.5% at $18.00.

This IPO and its lead-up have been closely watched, particularly given the rapid growth in ed tech investments and questions about whether real businesses will emerge from those investments. I had the opportunity to interview CEO Josh Coates today. What follows is an edited version of the interview, focusing mostly on how Instructure’s IPO will impact education markets and existing customers. I tried to capture as much of the Q&A verbatim as was feasible, but treat the answers below as a paraphrase.

Q. What are your impressions on how the IPO has gone so far?

A. The market in general has been a blood bath [ed. Dow down 585, or 3.3%, for the week], but we’re doing well so far. Given market conditions right now, we’re pleased as punch. We priced in range [ed. $16 – $18], and the market is responding well. As a company we’re really focused on 6-12 months down the road, but it is nice to get this IPO feedback.

Q. The use of funds in your S-1 filing indicates more of the same but with an additional war chest. Do I read this correctly to say that you do not plan to change your limited mergers and acquisitions (M&A) approach? If it’s just more of the same, what is the biggest impact existing customers should expect (besides Jared Stein showing up in a black Chevy Suburban with an entourage)?

A. We have a bias against M&A other than very limited cases. You are right that we plan no change of strategy with our usage of the funds [ed. they raised $70 million with the IPO]. Honestly, customers should expect no real change other than that they can now dig into our financials.

Q. Some of your competitors have been suggesting that the consistent losses listed in your S-1 means that you will have to raise prices. How do you respond? Will you be able to make additional revenue from existing clients?

A. Our prices are at market levels and we intend to keep them at market. We have a fundamentally strong business that works, and it’s a healthy business, so we won’t have to do anything unnatural. [ed. I pushed back that there must be pressure to make additional revenue and upsell to existing clients]. Our upsell approach right now includes getting customers to add premium support. But we are a software company. Customers should expect us to create new independent software products every 12 – 18 months. Some existing customers will use them, some won’t. That’s the strategy – create new business by building great new software.

Q. What is the relative importance of growth in K-12 vs Higher Ed vs Corporate Learning for your model over the next two years?

A. [ed. Josh did not directly answer this one but offered the following observations.] We have four markets that we serve – domestic higher ed, domestic K-12, domestic corporate learning, and international markets. Right now our fastest growth is in corporate learning, but that product, Bridge, was just released in February. Just behind that market in terms of growth is domestic K-12, which is largely a green-field market; we’ve just gotten started. It’s interesting, but by customer count, domestic K-12 is our largest market. We have to do well and grow in all four markets.

Q. Do you have any plans you can share on how long you’ll be at the company?

A. I will stick around as long as the board will keep me. I love the job, have a lot more work to do, and have no active plans to leave.

Q. How will your IPO affect the broader ed tech market?

A. Everything is connected. The effect will be generally positive, providing an example of a little VC-backed company that launched in 2011 and has become a healthy independent company. This is a good proof point that education and training markets can support this type of company and investment.

Q. When will you aggressively target non-English-speaking international markets?

A. Both Canvas & Bridge are available in 15 languages, and Instructure has clients in 25 countries. We provide phone support in English and Spanish, and soon Portuguese. We’re adding offices in multiple international cities, including in Brazil. But we’re doing this carefully. [ed. I pushed back that in my visit to Latin America, very few people in ed tech had any real knowledge of Instructure.] You’re right – we’re just at the point of setting up legal entities in Latin America and have done no marketing. We’re in the early days.

Given the nature of an IPO and SEC rules, some of these answers are necessarily not very specific and must be taken in good faith. We’ll keep this interview in mind here at e-Literate, however, to see if the answers are backed up by future action.

The post Interview with Josh Coates, CEO of Instructure, on today’s IPO appeared first on e-Literate.

Data To Back Up Concerns Of Textbook Expenditures By First-Generation Students

Thu, 2015-11-12 15:49

By Phil HillMore Posts (381)

David Wiley has added to the conversation[1] over the use of data on college textbook pricing and student spending patterns with “The Practical Cost of Textbooks”. His key argument is to go beyond prices and spending to the most direct measure: asking students themselves how textbook costs have affected them. He then looks at the Florida Virtual Campus surveys (also included in my post), concluding:

What impact does the cost of textbooks have on students? Textbook costs cause students to occasionally or frequently take fewer courses (35% of students), to drop or withdraw from courses (24%), and to earn either poor or failing grades (26%). Regardless of whether you have historically preferred the College Board number or the student survey number, a third fact that is beyond dispute is that surveys of students indicate that the cost of textbooks negatively impacts their learning (grades) and negatively impacts their time to graduation (drops, withdraws, and credits).

And yes, we need to do something about it.

Amen. With more than 18,000 students surveyed, the FVC surveys are quite important and should be on everyone’s radar.

More Out Of Data

I strongly feel that this type of discussion (as well as media quotes, policy, and legislation) should use the best data available, describe that data accurately, and ask for more data where there are holes. It often takes multiple views into multiple data sources to get a complete picture. For the College Board / IPEDS data, what it says is indisputable, but descriptions of the source should note that it comes from financial aid offices and not from student spending or budgets. For the student spending data (Student Monitor, NACS, Cal State), the data is solid and accurately described (in what I have seen), but as reported it is missing segmentation across key student demographics.

In his post, Mike Caulfield argues that it’s all well and good that students are creatively reducing their textbook expenses by renting, borrowing, etc., but that this data can mask other important factors.

You could use skills like this to cobble together that bag of books and say *that’s* the true cost. Look what you can get your books for if you game the system right!

But to say that, you’d have to have learned nothing in the past decade about why students fail. Requiring a non-traditional student to cobble together a bag of half-priced textbooks the way a second-generation student might is setting them up for failure.

Since I have argued that the student spending data is critical and much more relevant than the College Board data, let’s explore that data further – asking for more. Richard Hershman and NACS were kind enough to agree to requests for cross-tabs on two of their questions against sector and first-generation status.

For Fall 2014, students were asked how much they spent on required course materials.

[Chart: Fall 2014 student spending on required course materials, broken down by sector and first-generation status]

It turns out that Mike is right. First-generation students spend 10% more, acquire 6% fewer textbooks, and end up paying 17% more per textbook than non-first-generation students do. This data could be used as a starting point for policy that addresses the problem – explaining it and providing a measurement for improvement.
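Those three percentages hang together arithmetically: spending 10% more while acquiring 6% fewer books implies a per-book cost ratio of 1.10 / 0.94 ≈ 1.17. A minimal sanity check in Python (a sketch of the arithmetic only, not part of the NACS analysis):

    spend_ratio = 1.10   # first-generation students spend 10% more overall
    count_ratio = 0.94   # ...while acquiring 6% fewer textbooks
    per_book_ratio = spend_ratio / count_ratio
    print(f"{per_book_ratio - 1:.0%} more per textbook")  # prints "17% more per textbook"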

For Spring 2015, students were asked to list their top 3 cost concerns for college.

[Chart: Spring 2015 survey of students’ top 3 college cost concerns, broken down by sector and first-generation status]

First-generation students are somewhat more likely to worry about course materials (41% vs. 37%) than non-first-generation students, but the view across sectors is more telling. 2-year college students are much more likely to worry about course materials (50% vs. 37%) than 4-year college students. Tuition is lower at 2-year schools, and fewer students live on campus or away from home, so it makes sense that course material concerns would rise in relative terms (the percentage listing them among their top 3 concerns). It also makes sense that car payments, insurance, and gas would rank as more important.

These are real issues of equity, and an accurate understanding of multiple data sources is more likely to lead to effective policy decisions than reliance on a single source. And you get that understanding by exploring the data further and seeing where it takes you.

  1. My initial post, Mike Caulfield’s response, Bracken Mosbacker’s response, my response to Mike, and Mike’s follow-up

The post Data To Back Up Concerns Of Textbook Expenditures By First-Generation Students appeared first on e-Literate.

What Schoology’s Venture Funding Means for the LMS Market

Wed, 2015-11-11 11:05

By Michael FeldsteinMore Posts (1055)

LMS vendor Schoology just raised $32 million in Series D venture capital funding, bringing the total that they’ve raised to just over $57 million. If you’ve never heard of them, that’s because they have mostly been focused on K12, where they are doing very well. But they have turned their attention to US higher ed recently. They had a surprisingly big presence at EDUCAUSE, where CEO Jeremy Friedman told me that they are prepared to make an aggressive push. Their ability to get major funding was probably helped by Instructure going to market, and possibly by the leak that Blackboard is putting itself on the block as well. I don’t generally take money guys too seriously in their ability to predict ed tech, but they may be lucky on this one. I think there may be an opening in the US higher ed LMS market for a new entrant.

LMS selection for schools often works a little like the selection process that high school students typically go through when picking a college. Students looking at colleges usually have a favorite going in. Maybe their friends are going there. Or their big brother or sister. Or maybe they just heard that it’s cool. But they don’t apply to just one college, in case it doesn’t work out for one reason or another. So they have a second tier of schools that might be OK too. Generally, they don’t know much about their favorite school going in, and they know even less about the “might be OK” schools. Depending on how cautious they are, they might throw in one or two “safety” schools that they really don’t want to go to but that they feel (or their parents feel) should be included for the sake of completeness.

Likewise, colleges and universities frequently go into an LMS evaluation process with a favorite. Because the selection is generally done by a committee of stakeholders rather than just one person, there might be conflicting opinions on what the favorite is. But more often than not, there is a nascent majority or a consensus opinion about the likely winner, at least among the core selection committee. Back in the early to mid-aughts, the default favorite was usually Blackboard because it was considered to be the safe alternative that everybody was using. When Blackboard faltered, the favorite began to split between D2L and Moodle—and occasionally Sakai, particularly for larger public universities—with type of school and geography having a big influence on which one was likely to be the frontrunner. These days, the schools that Phil and I talk to report Instructure as the starting frontrunner at least four times out of five, across school types or geographies.

But LMS selection processes still need their “might be OK” candidates. For one thing, most of them are mandated by policy or by law to do a real multi-vendor evaluation. And most evaluation committees genuinely do want to look at alternatives. Just because they have a sense going in of which alternative is most likely to be the best doesn’t mean that they are closed-minded. The trouble is that there aren’t many alternatives that selection committees feel hopeful about these days. Increasingly, Sakai and Moodle aren’t even making it to the serious evaluation stage in US higher ed evaluations; and even when they do, they are often treated like safety schools. Blackboard never fully recovered from the reputational damage done under Michael Chasen, and their failure to deliver on Ultra this year was a huge setback. At the moment, they are being treated like a safety school as often as not. If Ultra slips further—and maybe even if it doesn’t—they could start losing significant numbers of customers again. And we haven’t run into many schools that are particularly excited about D2L either. Probably the best I can say for them is that they are the least likely of the LMS companies that are not Instructure to be dismissed out of hand.

I think there’s an opportunity for a new entrant to get a fair hearing from selection committees that want a real horse race but aren’t excited about any of the incumbents. Ironically, the rise and success of Instructure has probably reduced schools’ aversion to going with a scrappy start-up. I don’t know if Schoology is going to be the one that gets a foothold in the market because of this opening, but their timing is definitely good.

The post What Schoology’s Venture Funding Means for the LMS Market appeared first on e-Literate.

New Column At EdSurge

Tue, 2015-11-10 16:22

By Phil HillMore Posts (379)

Starting today, Michael and I are publishing a three-post series on personalized learning at EdSurge. Depending on how that goes, we could end up providing a regular column there.

The first post today is titled “Why Personalized Learning Matters to a New Generation of College Students”.

As we talk to the people on the front lines of these differentiated instructional approaches—students, faculty and staff with real-world experiences, both good and bad—the most significant theme that emerged was the challenge of helping students that come to class with wildly different starting knowledge and skill levels. Personalized learning should be seen as an effort for institutions to directly support students across the spectrum.


We’re excited to be working with EdSurge, helping them expand their coverage of higher education and helping us to share analysis and e-Literate TV content with a broader audience.

You can read the whole article here.

The post New Column At EdSurge appeared first on e-Literate.

Asking What Students Spend On Textbooks Is Very Important, But Insufficient

Tue, 2015-11-10 08:37

By Phil HillMore Posts (379)

Mike Caulfield responded to my post on data usage to understand college textbook expenditures. The core of my argument is a critique of commonly cited College Board data. That data, which originates from financial aid offices, leads to the conclusion that students on average either spend or budget $1,200 per year, with that number rising, while more reliable data originating from students shows the number to be half that amount and dropping.

In his response post yesterday, Mike generally agreed with the observation but worried that “readers of that piece are likely to take away the wrong conclusion from Phil’s figures (even if Phil himself does not)”. There is a risk that people see the lower numbers and conclude the “crisis is overblown”, leading to this observation:

If we’re looking to find out if prices for some set of goods are too high, then by definition we cannot look at what people are spending as a reliable gauge, because one of the big effects of “prices too high” is that people can’t afford what they need.

If you don’t pay attention to this you get in all sorts of tautologies.

In the specific world of textbooks, Mike considers the lower-cost method of renting used textbooks, noting:

So which figure do we use here? The chances of getting everything you need as a rental are low. Sure, you could be the super-prepared student who knows how to work the system and get them *all* as rentals — but not every student can be first in line at the bookstore. And the ones at the back of the line — guess their socio-economic class and first generation status?

This is an important issue, and I appreciate Mike’s understanding that I am not arguing that college textbook pricing is an overblown crisis. I agree that the crisis is real and that the hardest-hit are likely students from lower socio-economic backgrounds and first-generation students.

But let’s move past these agreements and drop the gloves.

I do not believe that we are solely “looking to find out if prices for some set of goods are too high”. What we need is a richer picture that captures both prices and expenditures. This past spring, on the same College Board data subject, I wrote:

It is important to look at both types of data – textbook list prices and student expenditures – to see some of the important market dynamics at play. All in all, students are exercising their market power to keep their expenditures down – buying used, renting, borrowing, obtaining illegally, delaying purchase, or just not using at all. And textbook publishers are suffering, despite (or largely because of) their rising prices.

But there are downsides for students. There are an increasing number of students just not using their required course materials, and students often delay purchase until well into the academic term. Whether from perceived need or from rising prices, this is not a good situation for student retention and learning.

This is not too far from Mike’s parable of the Perdimians and the concept of a food (or textbook) deficit. The richer picture should include some measurement of what is needed (the bag of textbooks), what is spent (the expenditures), and whether there are gaps. If we want to use College Board data to understand pricing, however, then we will need better guidance from the Education Department on what to include and not include (for consistency), and we need the data broken down by student demographics. The data right now is fundamentally flawed.

On the subject of impacts of pricing, Kim Thanos from Lumen Learning pointed out a Florida report from 2012 that asked students about this deficit. The findings:

[Table: Florida 2012 survey findings on the impacts of textbook costs on students]

That’s a high percentage of students avoiding textbooks, avoiding or dropping courses, and performing poorly due to costs. And this type of data is only available by asking students.

But overall, I suspect Mike and I are on the same page in terms of concerns – the crisis is real, we need both pricing and expenditure data, and poor or first-generation students might not have access to the same cost-cutting measures. On that note, if you accept 2-year vs. 4-year schools as a proxy for poor / first-generation status, the NACS data backs up Mike’s point. 2-year students are far less likely to get course materials for free.

[Chart: NACS data on course material acquisition methods, by sector]

As part of his argument, Mike took a “first year suggested schedule for a math education major” and looked up bookstore list pricing and used-textbook rental rates for those courses over a year:

  • One year of new textbooks: $1430.68
  • One year of rentals (mostly): $712.46
  • Mix, half rentals, half new: $1071.57
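As a quick sanity check, the “mix” figure is just the simple average of the all-new and all-rental totals (a minimal sketch in Python; the dollar amounts are Mike’s, not mine):

    new_total = 1430.68     # one year of new textbooks
    rental_total = 712.46   # one year of rentals (mostly)
    mix = (new_total + rental_total) / 2
    print(round(mix, 2))    # 1071.57 - matches the half-and-half figure above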

Getting back to the College Board figure of $1,225 for books and supplies, and using the mix of half rentals, half new, Mike observes:

What I think you’ll find out is that although there is a lot of variability in the cost of textbooks for a year in different degrees, the College Board estimate is much closer to estimating what students actually need than other measures offered.

Yes, $1,225 is closer to $1,072 than are the NACS and Student Monitor figures of $530 – $640. But the benefit of asking students for data, rather than relying on financial aid self-reporting, is that the data is already far richer than the headlines. First-year students spend much more than others, as seen in the 2014 NACS report:

[Chart: 2014 NACS report data on highest and lowest annual student spending, by segment]

It turns out that $1,019 is very close to Mike’s number of $1,072.

We need data on both the pricing of required course materials (including local substitutes) and expenditures by students. The poor / first-generation problem should be addressed with better data from students – cross-tabs on student expenditures and direct questions about the impacts of pricing – rather than by falling back on the College Board’s use of financial aid office reporting.

Asking students how much they spend on college textbooks is an excellent question, but it is insufficient.

Thanks to Mike for extending this conversation and adding in the challenge of poor / first-generation students.

The post Asking What Students Spend On Textbooks Is Very Important, But Insufficient appeared first on e-Literate.

Bad Data Can Lead To Bad Policy: College students don’t spend $1,200+ on textbooks

Sun, 2015-11-08 20:56

By Phil HillMore Posts (379)

The average US college student does not spend or budget more than $1,200 for textbooks, with that number rising each year, as commonly reported in the national media. The best data available continues to show that students spend roughly half of that amount, and that number is going down over time, not up.

Last spring I wrote a post documenting that the College Board is not a reliable source for college textbook expenditures. With last week’s release of College Board data, it is worth repeating that data for their “Books and Supplies” category are:

average amounts allotted in determining total cost of attendance and do not necessarily reflect actual student expenditures.

Much more reliable data from the National Association of College Stores (NACS) and the Student Monitor consistently show that students on average spend between $530 – $640 per year for textbooks or “required course materials”.[1]

There is also fairly clear data from NACS and Student Monitor showing that student expenditures on textbooks or “required course materials” are going down[2].

[Chart: NACS student expenditures per year, through 2015]

[Chart: Student Monitor expenditure trend, as reported by NPR]

This week Campus Technology published an article on California State University’s Affordable Learning Solutions project, extensively quoting assistant vice chancellor for academic technology services Gerry Hanley. This article included an estimate from the system that further backs up the $530 – $640 figures.

[Hanley’s] rough estimate: As of a few years ago, learners at the 23-campus, 460,200-student university system were spending $300 million a year on course materials — about $651 per student per school year.
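That per-student figure is consistent with the system-wide numbers quoted (a trivial check using the enrollment figure from the article):

    system_spend = 300_000_000   # rough annual course material spending estimate
    students = 460_200           # CSU system enrollment
    print(round(system_spend / students))   # 652 - in line with the reported $651 per student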

That $1,200 Figure

In its recent report, the College Board data shows the “Books and Supplies” category ranging from $1,249 to $1,364 across the college sectors.

[Table: College Board Fall 2015 “Books and Supplies” budget figures by sector]

Furthermore, the College Board data shows this category rising year over year. Why is that? The reason is that this data originates with campus financial aid offices, which self-report the books and supplies budget estimates used for the National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System (IPEDS); the College Board in turn draws on that NCES data. Financial aid offices often want to maximize student eligibility[3] for loans and grants[4].

Update: Edited paragraph above to clarify that CB gets data from NCES which in turn gets data from financial aid offices and their self reporting. NPSAS does not act as a collection point of the data.

So we have the College Board non-data implying that students spend more than $1,200 per year on textbooks, with that number rising each year, when the best data shows expenditures of roughly half that amount, $600, with that number dropping.

This is why it is frustrating to see the continued usage of College Board data in the media and now in a bill introduced in the US Senate. SB2176 is the “Affordable College Textbook Act” sponsored by Senators Durbin, Franken and King. As described in the bill:

According to the College Board, during the 2014–2015 academic year, the average student budget for college books and supplies at 4-year public institutions of higher education was $1,225.

I should note that the description given to the media by OER advocates has changed since last spring from “students typically spend” and “average student spends as much as” to the more accurate “student budget for”. But this is a technicality that, while less misleading, is still misleading. I also have not seen anything indicating that the $1,225 number has led to bad policy in this bill.

The Affordable College Textbook Act was introduced in 2013, did not get out of committee, and was reintroduced with minor changes a month ago. The bill would create a grant program for OER pilot programs, targeted at proposals that can:

demonstrate the greatest potential to— (1) achieve the highest level of savings for students [etc]

The primary purpose of the bill is to create savings for students; it should start with understanding what students actually spend, not with a somewhat arbitrary financial aid number.

For what it’s worth, it should be noted that NACS is one of the groups actively supporting this bill.

Why Does This Matter?

I have had several people argue that this distinction does not really matter, but I maintain that it does for two reasons.

The first reason is simply a matter of transparency. For those advocating for open educational resources (OER), open should include open process and open data sources. The primary barrier to adoption of OER is faculty acceptance and not the existence of pilot programs. If faculty by and large start to believe that advocates are using misleading data, that will not help with getting more faculty to use open resources.

The second reason is that we have moved beyond the $1,200+ number being used just in media circles and now having it potentially embedded in legislation and policy decisions. What happens when policy makers or even vendors start to base real decisions on this number? What if a program or institution or state body decided that enough is enough, we need to cut that $1,200 average down to $700 per student by executive decision? Or what if a vendor convinced an administrator that they could supply all texts, digitally, for $700 per student as long as the school required 100% adoption? Those are hypotheticals, but if this scenario played out in some form, students would actually end up having to spend more money, not less, on textbooks or required course materials. We would also have missed the chance to spend policy capital on efforts that do help students save money.

Cal State and the Affordable Learning Solutions program should be commended for using actual student expenditure data as they set goals of reducing student expenses by 50%. Had they instead used the College Board data as the baseline and then surveyed actual student expenditures to determine savings, Cal State would have appeared to have nearly met their goal already, and there would have been little reason to push any further for savings.

There is growing acceptance of the potential for OER, which means that decisions with bigger impact are likely to be made in the future. Having misleading information that ignores the actual expenditures by students can lead to bad policy decisions.

Updates:

  1. Read the spring post and a postscript if you’d like to see the details.
  2. Note that NACS used to be a biannual study and does not have data for AY2009 and AY2011.
  3. Update: Note that this is but one example of flawed data. There is little federal guidance on how to determine the Books & Supplies category and no consistency in how financial aid offices do so.
  4. See postscript post for more details.

The post Bad Data Can Lead To Bad Policy: College students don’t spend $1,200+ on textbooks appeared first on e-Literate.

Instructure Dodges A Data Bullet

Wed, 2015-11-04 10:23

By Phil HillMore Posts (378)

Last week’s EDUCAUSE conference was relatively news-free, which is actually a good thing, as overall ed tech hype levels have come down. Near the end of the conference, however, I heard from three different sources about a growing backlash against Instructure over its developing plans for Canvas Data and real-time events. “They’re Blackboarding us”, “the honeymoon is over”, “we’re upset and that is on the record”. By all appearances, this frustration, coming mostly from R1 institutions, was likely to become the biggest PR challenge for Instructure since their 2012 outage, especially considering their impending IPO.

The first complaint centered on Instructure plans to charge for daily data exports as part of Canvas Data, which Instructure announced at InstructureCon in June as:

a hosted data solution providing fully optimized data to K-12 and higher education institutions capturing online teaching and learning activity. As a fundamental tool for education improvement, the basic version of the service will be made available to Canvas clients at no additional cost, with premium versions available for purchase.

What that last phrase meant was that monthly data access was free, but institutions had to pay for daily access. By the EDUCAUSE conference, institutions in the self-organized “Canvas R1 Peers” group were quite upset that Instructure was essentially selling their own data back to them, and arguments about additional infrastructure costs were falling flat.

Vince Kellen, CIO of the University of Kentucky, was quite adamant on the principle that vendors should not sell institutional data back to schools – that data belongs to the schools. At most, vendors should charge for infrastructure.

The second complaint involved a product under development – not yet in beta – called Live Events. This product will provide access to clickstream data and live events, ideally following IMS standards and supporting the Caliper framework. Unizin is the primary customer driving this development, but the Canvas R1 Peers group is also playing an active role. The concern is that which data will be made available in real time, and how that data will be structured to allow realistic access by the schools analyzing it, have not yet been defined to a level that satisfies Unizin and the Peers group.
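For readers unfamiliar with Caliper, an event in that framework is a small structured record of a single learner action. Here is a purely illustrative sketch in Python – the field names, context URL, and identifiers are my assumptions for demonstration, not Instructure’s actual Live Events schema (their documentation is linked below in Amin Qazi’s statement):

    # Illustrative only: the rough shape of an IMS Caliper-style event record.
    # All identifiers here are hypothetical, not Canvas's actual schema.
    event = {
        "@context": "http://purl.imscaliper.org/ctx/caliper/v1/Context",  # assumed Caliper 1.0 context
        "type": "NavigationEvent",
        "actor": "urn:example:user:12345",            # hypothetical student ID
        "action": "navigatedTo",
        "object": "urn:example:course:678/quiz/42",   # hypothetical course resource
        "eventTime": "2015-10-30T14:07:00Z",
    }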

I contacted the company Friday midday while also conducting interviews with the schools and with Unizin. Apparently the issues quickly escalated within the company, and Friday evening I got a call from CEO Josh Coates. He said that they had held an internal meeting and decided that their plans were wrong and had to change. They would no longer charge for daily access to Canvas Data. On Monday they posted a blog announcing this decision.

tl;dr: Canvas Data is out of beta. This means free daily data logs are available to all clients. [snip]

We just took Canvas Data out of beta. A beta is a chance to test, with actual clients, the technology, the user experience, and even possible for-cost add-on features. Some of the things we learned from the 30 beta institutions were that once-a-month updates aren’t enough (Canvas Data “Basic”), and charging extra for daily updates is kinda lame (Canvas Data “Plus”).

“Kinda lame” is not the Canvas Way. So we changed it: No more Canvas Data Basic vs. Plus; It’s now just Canvas Data, with daily updates of downloadable files, at no additional cost, for everyone.

Checking back with schools from the Canvas R1 Peers group and Unizin, I was told that Instructure really did defuse the Canvas Data issue with that one quick decision.

On the Live Events issue, the Canvas R1 Peers group put together a requirements document over the weekend that collected data needs from Berkeley, UT Austin, U Kentucky, and the University of Washington[1]. This document was shared with Instructure through Internet2 based on the Net+ contract with Instructure, and they are now working out the details.

Vince Kellen indicated that “Live Events is real minimal start in the right direction”, but that Instructure will need to figure out how to handle, within the same system, both transactional events that allow no data loss and clickstream data that does not require the same fidelity.

Additional sources confirmed that the Canvas Data issue was resolved and that Instructure was on the right path with Live Events, although there is still a lot of work to be done.

Amin Qazi, CEO of Unizin, stated in an email:

Yes, Unizin had an agreement which allowed access to the daily Canvas Data files without our members paying any additional fees. My understanding of the new pricing model is all Instructure Canvas customers now have a similar arrangement.

Unizin is only beginning to explore the benefits of Live Events from Canvas. We are transporting the data from Instructure to our members via cloud-based infrastructure Unizin is building and maintaining, at no cost to our members. We have started developing some prototypes to take advantage of this data to meet our objective of increasing learner success.

Unizin has had, and plans to have, discussions with Instructure regarding the breadth of the data available (current: https://canvas.beta.instructure.com/doc/api/file.live_events.html), the continued conformity of that data to the IMS Global standards, and certain aspects of privacy and security. Unizin believes these topics are of interest to all Instructure Canvas customers.

We understand this is a beta product from Instructure and we appreciate their willingness to engage in these discussions, and potentially dedicate time and resources. We look forward to working with Instructure to mature Live Events.

In the end, there is work remaining for Instructure to support institutions that want to access and analyze their learning data from the LMS, but Instructure dodged a bullet through quick decision-making.

Additional Notes
  • I am still amazed that Instructure’s competitors do not understand how Instructure’s rapid and non-defensive acknowledgement and resolution of problems is a major factor in their growth. There were no excuses given this weekend, just decisions and clear communication back to customers.
  • This is the clearest demonstration of value by Unizin that I have seen. Amin’s explanation goes beyond the vague generalities that have plagued Unizin over the past 18 months and is specific and real.
  1. There might be other schools involved.

The post Instructure Dodges A Data Bullet appeared first on e-Literate.

EDUCAUSE and Robot Tutors In The Sky: When investors are your main customers

Tue, 2015-10-27 15:46

By Phil HillMore Posts (377)

Yippie i ohhh ohh ohh
Yippie i aye ye ye
Robot tutors in the sky

Before I head out to Indianapolis for the EDUCAUSE conference, I keep thinking back to a comment someone made in response to Michael’s description of Knewton marketing as “selling snake oil”. I can’t find the exact quote, but the gist was:

This is what happens when you start to see VCs as your main customers.

This viewpoint could be applied well beyond Knewton, as they have successfully parlayed their marketing hype into raising more than $100 million to date (I suspect with another round in the works, based on the aggressive marketing). Martin Weller has a post out today looking back at the MOOC investment mania and lessons learned, such as “Don’t go cheap – they won’t respect you” and “Big rhetoric wins – allied with the fear factor”. The post is somewhat tongue-in-cheek and cynical in nature . . . but spot on.

Update: Ray Henderson shared a recent WSJ story about Chegg and how the company is suffering from its attempt to boost its market valuation with a “ratchet”.

Tech startups eager to land sky-high valuations from investors might want to heed the cautionary tale of Chegg Inc., the textbook rental service whose stock has languished since its IPO in 2013.

In a candid interview, an early investor in Chegg revealed how the company gunned for the highest possible valuation in several funding rounds ahead of its public offering. Chegg in exchange granted venture capitalists a favorable term called a “ratchet” that guaranteed the share price in the IPO would be higher than what they paid.

The move backfired. When Chegg went public, it was motivated to set an IPO price that met the terms of the covenant, or Chegg would have to pay the difference in shares to the early investors. The stock plummeted on the first day of trading and hasn’t recovered.

The entire ed tech market finds itself in the interesting position where it is easier to raise large sums of money from VCs or private equity or strategic buyers than it is to establish real business models with paying customers.

On one hand:

  • Ed Tech private investment (seed, angel, VC, private equity) has hit an all-time high of $3.76 billion for the first 9 months of 2015, according to Ambient Insight; and
  • M&A activity in ed tech is even higher, with $6.8 billion in Q3 of 2015 alone, according to Berkery Noyes.

On the other hand:

  • In the LMS market, Blackboard is laying off staff and its owners are trying to find an exit, while D2L has hit a plateau despite massive investment. Instructure, while set for a half-billion+ IPO later this year, has yet to set concrete plans to become profitable – and they are by far the hottest company in this market.
  • In the MOOC market, Coursera is just now getting to a repeatable revenue model, yet that is likely $20 million per year or less.
  • Other than ALEKS and MyLabs (owned by McGraw-Hill and Pearson, respectively), it is unlikely that any of the adaptive software providers have yet become profitable.
  • Etc, etc.

I am not one to argue against investment in ed tech, and I do think ed tech has growing potential when properly applied to help improve educational experiences and outcomes. However, there is a real danger when, for an extended period, it is much easier for companies to raise private investment or get bought out at high multiples than to establish real revenue models with end-user customers – mostly institutions. The risk is that the VCs and private equity funders become the main customers, and company marketing and product plans center on pleasing investors more than educators and students.

Knewton has fallen into this trap (although at $100 million+ you could argue it is not a trap from their perspective), as have many others.

What the market needs is more focus on companies finding and simply delighting customers. This is a balance, as there is a trap on the other side of merely supporting the status quo. But the balance right now is heavily tilted toward pleasing investors.

This is one of the main issues I plan to watch at the EDUCAUSE conference – how much companies’ messages and products are targeted at educators and students vs. how much they are targeted at investors.

The post EDUCAUSE and Robot Tutors In The Sky: When investors are your main customers appeared first on e-Literate.