
Michael Feldstein

What We Are Learning About Online Learning...Online

An Employee View of Textbook Publishers and Ed Tech Companies

Sun, 2014-02-02 12:48

Update: For reference, 2U has pointed out to me that they were recently recognized by Glassdoor as the 17th-best medium-sized company to work for in 2014. Their average rating of 4.2 (based on 21 reviews) puts them on top of the other ed tech companies discussed here. The only other education company on Glassdoor's best-places lists is Edmentum, which, oddly, has only a 3.1 average rating based on 37 reviews.

Second Update: All the education companies that have scored well on Glassdoor are coming out of the woodwork now. W.W. Norton has pointed out to me that they have a 4.2 average rating, based on 47 reviews. That ties them with 2U for first among education companies (that I know of).

One way to get a sense of how the companies whose products you depend on are doing is to talk to their employees. This is particularly helpful at times like this, when huge changes are underway. So I thought it would be interesting to take a survey of the employee reviews on Glassdoor of some of the major companies in the industry. The results are revealing, if not entirely surprising.

Methodology

I looked at the three biggest US LMS players (Blackboard, Desire2Learn, and Instructure) and four of the biggest textbook providers (Pearson, McGraw Hill, Cengage, and Wiley). I added the best- and worst-rated companies on Glassdoor to provide reference points. I threw in Rutgers University (my alma mater) and Coursera just for fun. I have ordered the presentation from lowest rating to highest. For easy scanning, I put the average rating in red if it was below the middle 3-out-of-5 score, orange if it was on the line, and green if it was above the line. In addition to showing any summary review and rating information Glassdoor provides on each company, I picked the most recent review with four stars or higher and the most recent review with two stars or lower to give examples of positive and negative reviews. (I chose a mechanical approach to avoid having to make a judgment about what counts as “representative,” which would be difficult or impossible.) At the end of each, I added a few observations of my own.

Note that this is highly unscientific, starting with the fact that we have widely varying sample sizes from company to company. This should be read as impressionistic rather than a true data-driven evaluation.
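
To make the selection rules concrete, here is a minimal sketch in Python of the mechanical approach described above. The review fields and function names are hypothetical illustrations, not the actual process used to pull the Glassdoor data.

```python
# A minimal sketch of the mechanical selection rules described above, assuming
# a hypothetical list of review dicts with "date", "stars", and "text" fields.

def color_code(average_rating: float) -> str:
    """Red below the middle 3-of-5 score, orange on the line, green above it."""
    if average_rating < 3.0:
        return "red"
    if average_rating == 3.0:
        return "orange"
    return "green"

def pick_example_reviews(reviews):
    """Return the most recent review with 4+ stars and the most recent with 2 or fewer."""
    newest_first = sorted(reviews, key=lambda r: r["date"], reverse=True)
    positive = next((r for r in newest_first if r["stars"] >= 4), None)
    negative = next((r for r in newest_first if r["stars"] <= 2), None)
    return positive, negative
```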

The Rankings

DISH Network

Average Rating: 2.4

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

To give you a sense of just how bad DISH really is, working there ranks worse than working at either Sears or Radio Shack (although, to be fair, it looks like the number of positive reviews has ticked up in recent months). So the benchmark for the absolute bottom of the barrel is an average score of 2.4.

Cengage Learning

Average Rating: 2.7

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

To give you a sense of just how bad this looks for Cengage, the worst company on Glassdoor has an average rating of 2.4, the ninth worst has an average rating of 2.5, and Cengage’s number is 2.7. Thirty-three percent of DISH employees would recommend the company to a friend, while 35% of Cengage employees would recommend their employer to a friend. Cengage is going through bankruptcy and has had numerous reorgs over the last few years, so a low number is not terribly surprising. But this rating, combined with the tone of the reviews, suggests that the employees do not feel that the company is finding its way out of the wilderness yet. Unfortunately, that is consistent with Phil’s and my observations of the company. We can tell you what Pearson’s strategy is for the digital transition, and we think we can tell you what McGraw Hill’s strategy is, but there is no clear indicator of Cengage’s strategy—or even an indicator that they have one. Whether you look at their product releases, their partnership announcements, or their hiring and promotion moves, it’s hard to see a pattern.

McGraw Hill Education

Average Rating: 3.0

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

McGraw Hill falls into the bottom end of what I call the “I Want To Believe” (IWTB) category of education companies. Having recently gone through a sale to private equity, and facing the declining sales and relatively low salaries that all the textbook publishers are struggling with, there is understandable demoralization. At the same time, their Connect and LearnSmart product lines have been successful and the company is beginning to form a digital strategy to build on those successes. There is a sense of optimism among some (though by no means all) of these reviews. Fifty percent of McGraw Hill employees who completed reviews would recommend the company to a friend, as compared to Cengage’s 35%.

Desire2Learn

Average Rating: 3.2

Overview

[Glassdoor overview screenshot]

Most Recent Positive Comment

[Review screenshot]

Most Recent Negative Comment

[Review screenshot]

My Comments

The two comments listed above were literally the most recent two comments on the D2L page as of this writing, reflecting that the company is solidly in the IWTB category. As Phil and I have reported, the company has recently gone through some moderate downsizing (admittedly as what appears to be a course correction in the midst of a larger hiring spree) and management changes. Although they haven’t made any announcements about their intentions, they are clearly gearing up for an IPO. But rather than being a cause for excitement for the employees, the Glassdoor reviews and our own research indicate that employees tend to see this drive as a net negative. First, D2L has a reputation for paying low salaries, and the indications are that employees don’t necessarily believe they will benefit from any payout that goes with an IPO. Second, there are indications that employees have low confidence in current management. If you look at the scores in the overview section, the “senior management” and “comp & benefits” categories both average below 3.0. And one highly rated review of the company was entitled, “Quo Vadis Desire2Learn?” I wish that Glassdoor provided a tool to look at historical trends, because my guess is that D2L’s score was probably higher a year ago.

Pearson

Average Rating: 3.2

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

Pearson is clearly another IWTB company. Their overall rating is identical to Desire2Learn’s, but a higher percentage of employees would recommend the company to a friend (62% versus 58%) and a lower percentage approve of the CEO (53% versus 62%). Some of this is just the difference between a huge public company with a new CEO and a smaller private company with a founder CEO. Given Pearson’s massive reorg and aggressive new strategy, it will be interesting to see what their numbers look like a year from now.

Blackboard

Average Rating: 3.3

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

If you want to get a sense of what reviews from a true IWTB company read like, then spend some time looking through the Blackboard reviews. They are all over the place, which is highly consistent with what Phil and I are hearing. Everyone agrees that there are major cultural changes happening in the transition from Michael Chasen to Jay Bhatt. What they don’t agree on is whether those changes are leading the company in the right direction overall.

Wiley

Average Rating: 3.3

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

Wiley is more of a niche player than the other publishers covered in this post, and a big chunk of their revenue comes from areas other than college textbooks. So, on the one hand, they are at least somewhat more insulated from the downturns than the others. On the other hand, they have fewer resources to go after a bold new strategy. What you get when you read the Wiley Glassdoor reviews is a sense of what it’s like to work at a second-tier, family-run textbook company. Times aren’t great, money isn’t great, but overall people still seem to have a positive impression of the company.

Coursera

Average Rating: 3.7

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

Given that there is a grand total of three Coursera employee reviews on Glassdoor, we can’t really draw any conclusions here. I just wanted to put an ed tech startup in for reference, and Coursera had at least some data. (Udacity, for example, has no reviews at all.)

Rutgers University

Average Rating: 3.8

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

I conducted an unscientific sampling of a bunch of colleges and universities of various sizes and shapes and found the typical rating on Glassdoor to range between 3.5 and 4.0. The mix of reviewers definitely tilts toward staff over faculty, but there are some faculty (and grad students, and undergrad work-study students) in the mix.

Instructure

Average Rating: 3.9

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

Somebody really did not enjoy working at Instructure. That comment was so long it took me a while to figure out how to capture it on the screen. Anyway, that aside, I know a bunch of Instructure employees and have had a difficult time getting them to say anything negative about the company. The overall feeling there seems to be one of optimism. Of course, Instructure is the only company in the education group here (other than Coursera) that has not gone through a sale, bankruptcy, or significant reorg, so that may be part of it. Whatever the reason, employees seem to feel that it is a well-run company on the whole, putting it outside the IWTB bubble that most of these other companies are in.

LinkedIn

Average Rating: 4.6

Overview

[Glassdoor overview screenshot]

Most Recent Positive Review

[Review screenshot]

Most Recent Negative Review

[Review screenshot]

My Comments

LinkedIn shares Glassdoor’s highest average rating of 4.6 with two other companies. I have been to the LinkedIn campus, and it certainly seems like a happy place. (And a well-fed place, too; their cafeteria is amazing.) That said, LinkedIn shares one very important feature with the other two 4.6-rated companies—Bain and Twitter—namely, it is highly lucrative to work there. While I’m sure that the company’s rating would still be quite good had LinkedIn’s shares not shot through the stratosphere in the years since their not-so-long-ago IPO, it’s hard to say just how much higher it would be than our ed tech high scorer, Instructure, at 3.9.


Response from Babson Survey author on differences with IPEDS

Sat, 2014-02-01 18:14

I have written a series of posts on the new IPEDS data, including two that showed how this data seems to be quite different from the pervasive Babson Survey Research Group (BSRG) data (formerly known as the Sloan Survey). In particular, there were two findings, one on the number of students taking online courses:

And no, there aren’t 7.1 million [from Babson] US higher ed students taking at least one online course. There are closer to 5.5 million [from IPEDS] as of Fall 2012.

and one on the number of institutions offering online education:

The big difference that should be obvious is that the Babson data shows fewer than half as many institutions with no online offerings as the IPEDS data – 15% compared to 31%.

Who has an online offering?

I have been in contact with Jeff Seaman, one of the two authors of the BSRG reports, to get his analysis on the differences, including sharing my spreadsheets used for analysis of the two data sets. Jeff has graciously reviewed the data and provided the following analysis of why BSRG data is so different from IPEDS data.

Preliminary IPEDS data includes an indicator if an institution offers distance education. The Babson Survey Research Group surveys of online learning have contained a similar measure for the past eleven years. The two are not measuring the same thing.

When the Alfred P. Sloan Foundation approached the Babson Survey Research Group to conduct the first of these reports in 2003, the hypothesis was that the most important transition point for an institution was when it moved from having NO online offerings to having ANY such offering. As such, the measure of “online offerings” was defined as broadly as possible – any online offering of any length to any audience at any time. IPEDS on the other hand only counts undergraduate courses for “A student enrolled in a 4- or 5-year bachelor’s degree program, an associate’s degree program, or a vocational or technical program below the baccalaureate.” Students who are not enrolled in a program, but are just taking courses do not count in the IPEDS definition. Non-credit courses, continuing education courses, courses for alumni, and courses for students not registered for a degree program at the institution are not counted.

For larger institutions with well-developed online programs the difference in these definitions has little impact – they may have offerings outside of the IPEDS definition but they also have courses that do meet the IPEDS definition. The BSRG and IPEDS measures agree very well for institutions with more than 1500 total enrollments. However, for schools with less than 1500 total students, the BSRG measure includes far more institutions than does the IPEDS measure. Most of these differences occur at the lower end of this spectrum; schools with only a few hundred total enrollments.

Many of these institutions do not meet the IPEDS definition for providing distance offerings but claim to meet the BSRG definition. These schools typically lack the resources to launch significant online offerings, but have consistently reported that online education is critical for them and that they [offer] it. Their online offerings are often very small (sometimes only two or three students), and rarely part of their core program. Their offerings may be shorter than a full-length course and are rarely for credit. They might not even be a “course.” It is very rare for these to be part of a degree-program.

To understand the differences in the two measures it may be helpful to think of IPEDS as the measure of providing distance courses for those pursuing a degree and BSRG as the measure of providing ANY online offering of any type for any participant. Institutions that meet the IPEDS definition will also meet the BSRG definition, but the reverse is not necessarily true.

To help visualize the issue Jeff describes, it is useful to see the two charts comparing institutions that do and do not offer online courses, broken down by enrollment (in the interactive versions you can hover over the bars to get actual numbers), followed by a summary table.

[Chart: institutions with no online offerings vs. those with online offerings, by total enrollment]

One clarification I would add is that IPEDS measures not only students enrolled in a degree program but also those in certificate programs. I believe Jeff’s point about students who take online courses without seeking credit still stands; it is just that credit can encompass courses leading to either a degree or a certificate.
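
To make the audience distinction concrete, here is a minimal sketch in Python. The Offering fields and helper names are hypothetical, and the rules are simplified readings of the two definitions quoted above rather than either organization's actual methodology.

```python
# A minimal sketch of the audience-based difference described above: an
# offering that counts for IPEDS also counts for BSRG, but not the reverse.
from dataclasses import dataclass

@dataclass
class Offering:
    is_online: bool
    for_credit: bool           # counts toward a degree or certificate
    enrolled_in_program: bool  # students are enrolled in a degree/certificate program

def counts_for_bsrg(o: Offering) -> bool:
    """BSRG: any online offering of any length to any audience at any time."""
    return o.is_online

def counts_for_ipeds(o: Offering) -> bool:
    """IPEDS: online offerings for credit to students enrolled in a program."""
    return o.is_online and o.for_credit and o.enrolled_in_program

# A short, non-credit online offering for community members counts for BSRG
# but not for IPEDS, which is how a small school can appear in one tally
# and not the other.
community_course = Offering(is_online=True, for_credit=False, enrolled_in_program=False)
assert counts_for_bsrg(community_course) and not counts_for_ipeds(community_course)
```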

Definition of Online Course

The other issue raised in our discussions involved the differing BSRG and IPEDS definitions of what constitutes an online course, independent of students’ degree- or certificate-seeking status. The BSRG definition (page 6 of the report):

An online course is defined as one in which at least 80 percent of the course content is delivered online. Face-to-face instruction includes courses in which zero to 29 percent of the content is delivered online; this category includes both traditional and web facilitated courses. The remaining alternative, blended (or hybrid) instruction, has between 30 and 80 percent of the course content delivered online.

While IPEDS defines a distance education course as follows:

A course in which the instructional content is delivered exclusively via distance education.  Requirements for coming to campus for orientation, testing, or academic support services do not exclude a course from being classified as distance education.

Could this play a major role in the difference in data between the BSRG survey and IPEDS? Jeff answered that this distinction is secondary.

The 80% portion of the definition has an impact. [snip]  The primary factor is the issue of audience – that IPEDS only counts courses for those enrolled in degree programs and most of these very small schools are using their online for other purposes.  The 80% factor plays a role, but only for those that would otherwise meet the IPEDS definitions.
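
To see how the course-level definitions differ, here is a minimal sketch in Python. The 80 and 30 percent thresholds come from the BSRG definition quoted above; reading the IPEDS "exclusively via distance education" rule as 100 percent online is a simplification, and the function names are hypothetical.

```python
# A minimal sketch contrasting the two course definitions quoted above.

def bsrg_category(percent_online: float) -> str:
    """BSRG: online at 80%+, blended/hybrid at 30-79%, otherwise face-to-face
    (which includes web-facilitated courses)."""
    if percent_online >= 80:
        return "online"
    if percent_online >= 30:
        return "blended"
    return "face-to-face"

def ipeds_distance_education(percent_online: float) -> bool:
    """IPEDS: a distance education course delivers its instructional content
    exclusively online (campus visits for orientation or testing do not
    disqualify it)."""
    return percent_online == 100

# Example: a course that is 85% online is "online" for BSRG but is not a
# distance education course for IPEDS, which is one way the counts can diverge.
assert bsrg_category(85) == "online" and not ipeds_distance_education(85)
```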

Comment

First of all, this whole interaction has partially restored my confidence in university-based research. Michael and I (as well as Al Essa and Mike Caulfield) have described the disturbing trend where some universities are all too willing to present research findings when it helps the institutional image, but can be quite reluctant to be transparent when there are legitimate questions that should cause them to re-examine their findings. Jeff has been quite willing to admit the differences between IPEDS and BSRG and to spend time trying to understand them.

Jeff also mentioned that they are looking at questions to ask next year.

We have been thinking about what questions to put in next year’s survey to get at these differences.  The number of schools counted as having online may be different for the two measures, but the impact is really very small when it comes to students – since these schools are so small and teach such small numbers of students online.

I encourage BSRG to follow through on this plan. The BSRG survey is the basis for a great deal of commentary and decision-making in US higher ed, and with the new data IPEDS will also play an influential role. It is important that the two data sources cohere, combining the best of required, non-survey reporting with long-term trend analysis.


Update: Blackboard and Washington Post change the employee count

Wed, 2014-01-29 21:34

Well that was a major change. As I noted yesterday, Blackboard described its reorganization efforts to the Washington Post for its Saturday profile of the company and CEO Jay Bhatt.

Blackboard today is completely reorganized, compared with a year ago, a process that required layoffs in some departments and new hires in others, Bhatt said. The company counts roughly 2,200 employees to date.

This was interesting to me, since in Fall 2012 Blackboard gave information to both the Washington Business Journal and the Washington Post stating that the company had 3,000 employees. I noted this in my post yesterday:

That is a significant change, if these stories are accurate, going from 3,000 employees to 2,200 in less than 18 months …

The Washington Post just issued a correction to their story today that changes the numbers significantly:

Blackboard today is completely reorganized, compared with a year ago, a process that required layoffs in some departments and new hires in others, Bhatt said. The company counts roughly 3,000 employees to date.

There is almost no explanation for the change in numbers, other than the following:

An earlier version of this story incorrectly spelled Adrenna, the learning management platform. It also incorrectly stated how many people are employed at Blackboard. This version has been corrected.

Blackboard, through a company spokesperson, let me know that part of the change in numbers was that in 2012 they had included call center staff in the employee count, while this weekend they did not (which I noted in an update to yesterday’s post). They also confirmed that they provided the 2,200 number to the Washington Post. I have requested that Blackboard provide precise numbers and accurate descriptions (the number of employees, counting or not counting call center staff, for both November 2012 and today, or the total number of staff layoffs over the past 18 months), but they have declined to do so.

Nevertheless, the basis of my analysis was public information provided by Blackboard to the Washington Business Journal and Washington Post. These numbers have changed, so my analysis has to change.

I am no longer reporting that Blackboard has dropped 26% of its workforce nor am I reporting that “a large number” have been cut. How much has the workforce changed? According to the new public numbers, not a lot, but the real answer is that we do not know.

Given this significant change, I feel I owe readers some more context, through a timeline, so you can decide how to interpret the multiple posts and updates:

  • Sunday, Jan 26th: Blackboard is profiled by Washington Post, stating it has “roughly 2,200 employees”.
  • Tuesday, Jan 28th (morning): I write a post analyzing the changes, stating that “Blackboard seems to have cut more than 26% of workforce”.
  • Tuesday, Jan 28th (afternoon): After talking to a Blackboard spokesperson, I revise my headline to “Blackboard seems to have cut large amount of workforce” and add the update. I ask for precise numbers from Blackboard.
  • Tuesday, Jan 28th (evening): Blackboard talks to the Washington Post and asks them to change the number of employees.
  • Wednesday, Jan 29th (morning): The Washington Post changes its number to “roughly 3,000 employees”.
  • Wednesday, Jan 29th (afternoon): I talk to the Blackboard spokesperson again, asking for precise numbers; she cannot provide these, but she shares as much general information as she is allowed.
  • Wednesday, Jan 29th (evening): I write this post.

In my opinion Blackboard is making a mistake here. I understand the argument that they are private and don’t like to share details . . . but they certainly seem to like sharing numbers if no one calls them on it and were in no rush to correct the information from Sunday through Tuesday. Making an exception and being precise this week would answer the question on how much the workforce has changed and put this issue to bed. In the end it is Blackboard’s call on how much information to release, but it is also the general market’s call to figure out how much confidence to assign to that information.


e-Literate TV Update and Request for Help

Wed, 2014-01-29 11:09

We are fast approaching the release of our first e-Literate TV (ETV) series. Our latest estimate is that we will go live in mid-February with a collection of interview episodes filmed at EDUCAUSE 2013 covering the following topics:

  1. An overview of challenges in higher education that are driving conversations about educational technology
  2. A review of the online learning landscape
  3. A discussion of MOOC mania
  4. An exploration of the genre of courseware
  5. A conversation about the basics of learning analytics and adaptive learning
  6. A review of one faculty member’s experience with classroom flipping

Meanwhile, we are already hard at work on the second series, which will be called “Massive and Open: What Are We Learning?” Filmed primarily at the MOOC Research Initiative Conference, the series takes a deeper dive into how people are actually using the tools that MOOCs offer in different contexts and discusses their promise as well as their problems. We expect this to be the model for future series, where we explore one topic from a number of angles rather than offering a smorgasbord like the pilot series. We have a long list of topics we’d like to cover in future series, ranging from increasing educational access to personalized learning to the promise and perils of big data in education and more.

With the release of the first series around the corner, Phil and I would like to tell you more about our ideas for ETV and request your feedback and (verbal) support.

The Problem We’re Trying to Solve

During a conversation about ETV with a handful of colleagues at the last Sloan-C conference, Mike Caulfield nicely summed up the challenge we’re trying to tackle in one word: solutionism. We seem to go through endless hype cycles for technology-enabled education solutions, whether those solutions are general online learning programs, MOOCs, adaptive learning, competency-based education, et cetera and so on, ad nauseam. And the context in which these solutions are discussed is usually either a pitch by a vendor or some breathless bit of fluff utopianism in the tech media or the mainstream punditry. There are usually very few information sources that provide in-depth and independent analysis of any given solution. But perhaps more importantly, the conversation itself on campus is framed by the vendor marketing and media hype. Conversations tend to be collapsed into which thing we should buy rather than what’s the best way to solve the problem we’re worried about or reach the goal we’re aspiring to achieve.

Given that situation, educators generally have three choices. First, they can trust most everything the vendors and media say and hope that their vendor partners are steering them well. Second, they can distrust most everything the vendors and media say and approach technology-enabled solutions in general with deep skepticism. Third, they can put together a review process which attempts to codify the differences amongst the solution candidates. Unfortunately, what often happens is a combination of the worst of all three of these options. Somebody will make a (labor-intensive) attempt to codify every feature of all the solution candidates and make the vendors respond to a massive RFP. This has the effect of filtering out smaller vendors who can’t afford the sales support staff required to respond to long RFPs as well as unusual solutions that don’t fit into the box. But because the RFP is usually written to document differences among products rather than illuminate important features of the problem being solved that might recommend one solution over another, the results of the process often don’t change many minds. So people who entered the process thinking one of the products is a cure for cancer generally leave the process thinking it is a cure for cancer. Similarly, people who start the process thinking one of the products causes cancer generally end the process still thinking that the product causes cancer. So either the vendor that checks the most boxes wins or the vendor who had the most support going into the process wins. Obviously, this is not a good situation for educators, students, or schools.

Nor is it a good situation for vendors—even the big ones who can afford to jump through the hoops. If the wrong stakeholders at a school dislike the product (or the vendor) going into the conversation, then the vendor loses. Period. No matter how good their solution is. On the other hand, if the school (or individual educator) adopts a product uncritically without fully understanding its strengths and limitations, then that implementation is ultimately likely to fail, thus harming future sales.

But the biggest problem that solutionism causes is that it distracts us from focusing on the problems the solutions are supposed to solve. Who cares about adaptive learning, really? What we care about is student success. Adaptive learning is interesting to the degree that it helps specific kinds of students achieve specific kinds of educational goals. One of the interesting aspects we’ll be highlighting with our ETV series on MOOCs is that “massive” and “open” have a lot of really interesting and exciting educational applications. It’s just that one of those applications probably isn’t “disrupting the university.” But because disruption hype from folks like Thomas Friedman and Sebastian Thrun has so dominated the discussion, we have missed opportunities on campuses to talk about the problems that MOOCs can help with.

Obviously, campus politics can be an aggravating condition to solutionism. Content—even interactive content—can only have a limited impact on this. But to the degree that we can generate dialog on campuses before these issues become politicized (e.g., before somebody puts a stake in the ground and declares, “We are doing a MOOC!”), we might be able to reduce the role that politics play in decision-making.

How We Plan to Do It

We’re very lucky that we have a readership at e-Literate that includes a lot of knowledgeable and passionate campus ed tech evangelists. Their official roles can vary widely. Sometimes they are directors of centers for teaching and learning. Sometimes they are passionate individual faculty or department chairs. Sometimes they are CIOs. What they tend to have in common is a clear-eyed interest in how technology can help solve important educational problems. At e-Literate, one of our main goals is to serve these people by providing them with relevant information and analysis. Unfortunately, this usually isn’t enough to enable them to solve the problem of solutionism on their campus. They need more tools in their toolboxes. Specifically, they need an information source that can function as an anti-hype, not in the sense of being as negative about a solution as the hype is positive, but in the sense of encouraging educators within a campus community who are not ed tech geeks to think for themselves and talk to each other about what they need in a solution, and which problems they need to solve in the first place. The most recent example of this spirit is the piece we did on adaptive learning for the American Federation of Teachers. We want to do that in a programmatic way.

Enter ETV. We start with a series of short videos on a topic. These videos are discussion-based, exploratory, and aimed at non-technologists. They are produced by experienced film makers using three cameras, so they hopefully will carry an air of respectability for those stakeholders who are sensitive to those sorts of authority markers. But they don’t provide any easy answers. In fact, each one ends with a question. We try to provide just enough information to convince the viewers that the topic we are covering in a segment is worthy of discussion. The videos are designed to function like mini-lectures at the beginning of a discussion-based workshop or an unconference session. We don’t expect the stakeholders we are trying to reach will watch every video. But we are hoping that our ed tech evangelists can persuade them to at least watch enough of one video to start a conversation. The videos will be embedded in a blog where we hope to encourage discussion among stakeholders of all types across institutions, but we expect that a lot of the conversations will happen within individual colleges and universities and outside of our software. All the videos will be released under a Creative Commons Attribution license.

We also want to encourage people to dig deeper into the content by getting into their zone of proximal curiosity. Using the platform provided by In the Telling, we will be linking to blog posts, news articles, and other resources around the web in the specific context of particular moments in the videos. If a viewer finds a point being made in the discussion to be particularly interesting and wants to learn more, then more will be only one click away. We want to create lots of different entry points into the content so that different people can focus on the aspects that are most relevant to them in the level of detail that is most useful to them. We call this “differentiated engagement.” Our goal is not to get everybody to know the same things about the topic but to get enough people on campus interested in and knowledgeable about different aspects of the topic that there is an opportunity for some productive and enjoyable conversation. It is connectivist in its design in that sense.

What I’ve described so far is what we will be releasing in February with the first series. Soon after, we hope to build out a community space around the content using BuddyPress and other WordPress extensions. In the short term, we will focus on creating a space for ed tech evangelists at different campuses to share ideas and ask each other questions about how to foster conversation (although we hope that ETV will encourage increasing numbers of people to see themselves as ed tech evangelists or, at least, ed tech ambassadors). We don’t pretend to have all the best ideas for catalyzing these conversations, and we want to encourage those of you who are out there doing this work every day to share your best ideas with your peers. It will take us some time to build this out, but we are excited by the prospect of it.

All of this will be free, the content will be openly licensed, and much of it will be available without a registration or login. (Discussion participation will probably require some sort of registration process, whether we use a distributed ds106-style method or a centralized method like bbPress.) We intend to pay for it through PBS News Hour-style sponsorship. And while Phil and I do have some long-term ideas for paid content and services that ETV community members might be interested in (which I will blog about in a separate post in the near future), the primary purpose of ETV is not to generate sales. We expect that a substantial majority of ETV users will never pay us a dime, and we are fine with that. Our goal is to help campus communities have better conversations that lead to better educational decisions. If we can do that, then we are confident that we will earn the good reputation we need in order for people to come to us for consulting.

And this brings us to the part where we need your help.

(How) Will You Use This?

We are in the process of talking with sponsors now, including a meeting that we will be having with a major foundation in a few weeks. The first questions they will undoubtedly ask are, “How many people will actually use this, and how will they use it?” Obviously, we have these questions ourselves as well. We have talked to some friends and colleagues to get input, but we need more. We’re looking for two kinds of help from you:

1. Your Specific Ideas About How You Might Use ETV in Your Context

What we’re looking for are statements like, “I run a monthly lunch and learn for about 20 faculty at Rural Community College, and I would consider using ETV episodes as the basis for some of these sessions if the content was right.” Or, “Our college is planning a major smart classroom initiative, and I would use an ETV series on learning spaces to facilitate an all-day workshop if such a series existed.” Or “If I had had the MOOC series when my Provost was making a decision to launch an edX course, here is how I might have approached the conversation differently.” Or “I could never get my Provost/CIO/faculty/whoever to watch a ten-minute video, but if I had a two-page article like the one you pointed to on adaptive learning, I would pass that around as part of our monthly e-newsletter to the Faculty Senate.” Or even, “I would never use this because….” The more specific you can be, the more helpful it will be for us. We want your ideas for what ETV could do for you and how you would use it, if only we design it right. Please brainstorm with us.

2. Expressions of Interest from Your Campus

We’re also looking for schools to express a more official interest in adopting ETV programmatically. We’re not looking for a commitment, and we’re certainly not asking for money. What we’re hoping is that some of you would say something like, “Not only am I interested in principle in encouraging all of my department heads to watch ETV episodes; I’d like to discuss the possibility of using a MOOC series as the basis for an unconference with them.” This will both help us to shape ETV as we move forward and assure potential sponsors that there is real interest on campuses in making meaningful use of ETV if we can get the funding for it. For these more specific conversations, it’s probably best for you to email me at michael [at] mindwires [dot] com.

As always, thanks for your honest feedback.


Coursera blocked in Iran due to human grading and discussion forums?

Tue, 2014-01-28 23:49

Update: I just received confirmation from a Coursera spokesperson that the description of facts about Coursera’s block of student access in this and my previous blog post is accurate. They cannot comment on the rest of the professor’s statement nor on Udacity and edX.

Yesterday I wrote about Coursera having to block access to its courses from students in Iran, Syria, Cuba, and Sudan. Kris Olds covered this subject in more detail this morning, as did Carl Straumsheim at Inside Higher Ed. The nub of the issue is that the US government (the State Department, I believe) has declared MOOCs to be educational services rather than information or informational materials, and thus the courses are subject to US embargoes of the four countries in question.

According to IHE, Udacity “has not received official word about enrolling students in countries subject to economic sanctions”, and EdX has a license to operate in Iran, Syria and Cuba. So the issue is mostly with Coursera, although not by their decision.

If a recent post from a Coursera instructor is accurate, we may now know what triggered the classification of MOOCs as a service subject to sanctions. I was notified by a trusted reader that Ebrahim Afsah from the University of Copenhagen (yes, the same one from the story about abusive discussion forum posts) sent a message this afternoon to all students in his MOOC “Constitutional Struggles in the Muslim World” (registration required). He quotes a message from Coursera to MOOC instructors that includes the new information [emphasis added].

As some of you already know, certain U.S. export control regulations prohibit U.S. businesses, such as Coursera, from offering services to users in sanctioned countries (Cuba, Iran, Sudan, and Syria). The interpretation of the export control regulations in the context of MOOCs has been ambiguous up until now, and we had been operating under one interpretation of the law. Last week, Coursera received definitive guidance indicating that access to the course experience is considered a service, and all services are highly restricted by export controls.

In particular, the notion of “services” includes offering access to human grading of quizzes and assessments, peer-graded homework, and discussion forums. Regrettably, Coursera must therefore cease offering MOOC access to users in certain sanctioned countries in order to ensure compliance with these U.S. laws and to avoid serious legal ramifications.

Some notes (assuming the authenticity of this message):

  • Whatever you think about the wisdom of sanctions on the four countries in question, we have reached a level of bureaucratic irrationality with this ruling (“I swear, officer, I only got machine grading of my essays.” “OK, carry on.”).
  • Given this definition, especially access to discussion forums, it’s hard to see how Udacity will be treated differently than Coursera.
  • It appears that edX has been more successful in getting around the sanctions, either due to a better legal approach or due to their non-profit status. As Carl noted in his article:

The only option for students in the sanctioned countries may be edX, the MOOC provider founded in partnership between Harvard University and the Massachusetts Institute of Technology. Tena Herlihy, edX’s general counsel, said the company has since last May worked with the U.S. State Department and the Treasury Department’s Office of Foreign Assets Control, and has so far applied for and received company-specific licenses for its MOOCs to enroll students in Cuba and Iran (a third license, for Sudan, is still in the works).

“We want to provide education to anyone with an internet connection,” Herlihy said. “We do not want to withhold education from someone just because of the country they live in.”

In addition to applying for licenses, the office also issues general licenses that don’t apply to a specific person or organization. Herlihy said edX is operating under such a license in Syria, but declined to elaborate. One such license, however, exempts “services incident to the exchange of personal communications over the Internet” from the sanctions.

Update: Nina Curley, author of the original story at Wamda, was notified by Coursera that it has reinstated access in Syria. She also confirmed that it was the State Department specifically making the ruling and that it was based on the services, not Coursera’s for-profit status. See full update at Nina’s original story.

*********

Although this is somewhat of a different subject (my primary interest is in the ruling and what it means for online education providers), I will also note that the professor is not a fan of the US or its sanctions. From the class email:

I write this email under protest and with a considerable degree of anger and sadness. Few things illustrate the bone-headedness, short-sightedness, and sheer chauvinism of the political structure of the United States better than the extent to which its ideologues are willing to go to score cheap domestic political points with narrow interests in the pursuit of a sanctions regime that has clearly run its course. [snip]

But you will now be interested to hear that also my course (and anything else Coursera offers) has been classified, if not a weapon that could be misused, then at least a “service” and as such must not fall into the hands of anybody happening to live in the countries that the United States government doesn’t like. I have thus been informed that my students in Cuba, Syria, Sudan and my homeland will no longer be able to access this course. I leave it to you to ponder whether this course is indeed a weapon and if so against what and what possible benefit the average American citizen could possibly derive from restricting access to it.

Be this as it may, I invite those students affected to use services such as hola.org or VPN routers to circumvent these restrictions.

Let me reiterate that I am appalled at this decision. Please note that no-one at Coursera likely had a choice in this matter!

At any rate, rest assured that these are not the values of the University of Copenhagen, of its Faculty of Law, and most assuredly not mine!

This makes me question the professor’s previous admonition to students from December 10, 2013:

I will not tolerate abusive language, self-righteous posturing, baseless assertions just for the sake of being ‘right’ and anything likely to damage the enjoyment of this course for all.


Blackboard seems to have cut large amount of workforce

Tue, 2014-01-28 14:24

Update: Please see new post with updated information.

Update: I heard back from the company that part of the discrepancy in numbers is that public statements about the employee count may have differed in whether they included call center employees (whose numbers vary seasonally). If I can get some hard numbers from Blackboard, I will publish a new post with more accurate information. For now, please note that the 26% number may be based on inconsistent definitions. Accordingly, I have changed the post title and am bumping this post.

The Washington Post ran a piece over the weekend about Blackboard’s reorganization efforts since Jay Bhatt took over as CEO.

Blackboard has upended its corporate structure and strategy behind closed doors since chief executive Jay Bhatt took over the private company a year ago with a mandate to reinvigorate one of the District’s oldest and most recognized technology brands.

The changes come after years of eroding market share for Blackboard, a pioneer in online learning management software. Bhatt said the changes made in the past year provide a foundation on which to grow the business.

Michael and I have both noted some of the layoffs that have taken place as well as the reorganization and removal of silos. Alert former employee George Kroner, however, noted on Twitter just how significant the layoffs have been. This nugget from the WaPo story is the key:

Blackboard today is completely reorganized, compared with a year ago, a process that required layoffs in some departments and new hires in others, Bhatt said. The company counts roughly 2,200 employees to date.

Compare this to the first story on company layoffs from September 2012, from Bill Flook at the Washington Business Journal. Note that Michael Chasen was still CEO at this point, as Jay Bhatt took over at the end of December 2012.

In a statement, Blackboard spokesman Matthew Maurer said the company has “seen strong growth this year in terms of revenue and in the acquisition of new businesses that have opened up new markets for us.” The company’s total workforce now stands at 3,000 globally, “even with the recent elimination of a small number of roles,” he said.

That is a significant change, if these stories are accurate, going from 3,000 employees to 2,200 in less than 18 months – a reduction of more than 26% of the workforce [see update above].

I’m sure that not all of these losses have come from layoffs, as a fairly significant number of employees have likely left of their own volition. This is fairly typical within companies making such significant changes, however.

For now, it’s worth noting that there are big changes happening at the two biggest commercial LMS providers (Desire2Learn recently laid off 7% of its workforce).


Coursera and Udacity, but not edX, blocked in Syria and Iran

Mon, 2014-01-27 13:26

One of the most-cited aspects of Massive Open Online Courses (MOOCs) is their openness – anyone can access the courses with just Internet access and an email address. But recent actions suggest that this openness may not extend to all countries.

The media site Wamda, which focuses on supporting entrepreneurs in the Middle East and North Africa (MENA) region, has a story out today describing how Coursera and Udacity MOOCs have apparently been blocked in Syria and Iran [see update below on Syria].

Just over six months ago, online education platform Coursera thanked a Syrian doctor for a poignant blog post that spoke directly to its mission to change the world by educating the masses. [snip]

Yet today, Dr. Angrimi no longer has that lifeline, as Coursera appears to have blocked Syrian IPs since Friday.

“Our system indicates that you are attempting to access the Coursera site from an IP address associated with a country currently subject to U.S. economic and trade sanctions. In order for Coursera to comply with U.S. export controls, we cannot allow you access to the site.”

At the end of the article, Ahmad Sufian Bayram, the MENA Connector for French collaborative consumption network OuiShare, shared that Coursera is not alone.

Coursera isn’t alone in that predicament; Coursera competitor Udacity, which is also for-profit, has been banned for a while in Syria, says Bayram. One site that still works is EdX, the nonprofit MOOC platform founded by MIT, which offers courseware for free.

Yet EdX only offers a few courses, says Bayram. For the foreseeable future, there’s no solution. “We rely on these courses to improve our thinking and to get more jobs,” he explains. “This is going to be difficult.”

Despite the article title, it appears that this is not a story about the MOOC providers deciding on their own to block access to their site. Rather, this is a matter of US export control policy.

While I do not have direct knowledge of the decision, it is worth understanding official US export policy for both Syria and Iran. The summary document from the Treasury Department on Iran sanctions has the following section [emphasis added]:

• EXPORTS TO IRAN – In general, unless licensed by OFAC, goods, technology, or services may not be exported, reexported, sold or supplied, directly or indirectly, from the United States or by a U.S. person, wherever located, to Iran or the Government of Iran. The ban on providing services includes any brokering function from the United States or by U.S. persons, wherever located. For example, a U.S. person, wherever located, or any person acting within the United States, may not broker offshore transactions that benefit Iran or the Government of Iran, including sales of foreign goods or arranging for third-country financing or guarantees. [snip]

Donations of articles intended to relieve human suffering (such as food, clothing, and medicine), gifts valued at $100 or less, licensed exports of agricultural commodities, medicine, and medical devices, and trade in “information and informational materials” are permitted. “Information and informational materials” are defined to include publications, films, posters, phonograph records, photographs, microfilms, microfiche, tapes, compact disks, CD ROMs, artworks, and news wire feeds, although certain Commerce Department restrictions still apply to some of those materials. To be considered informational material, artworks must be classified under chapter subheadings 9701, 9702, or 9703 of the Harmonized Tariff Schedule of the United States.

The Syria document prohibits:

The direct or indirect exportation, reexportation, sale, or supply of any services to Syria from the United States or by a U.S. person, wherever located;

What is not clear is whether Coursera and Udacity are considered “information and informational materials” under the Iran document’s definition. It is interesting how outdated that definition is (where did I put that microfiche – is it under my typewriter?). It is also not clear why edX is still available.

There is some small irony in US export control policy preventing the access to Coursera and Udacity MOOCs, as the State Department recently sanctioned several people in Iran for Internet censorship.

The move against Communications Minister Reza Taghipour came after he was blamed for ordering the jamming of international satellite TV broadcasts and restricting Internet access, a State Department official said.

The United States was determined to stop the “Iranian government from creating an ‘electronic curtain’ to cut Iranian citizens off from the rest of the world,” said State Department spokeswoman Victoria Nuland.

Note that I’m trying to track down whether encryption technologies might be the explanation for the blocking of these sites.

If anyone has more information on these restrictions, let me know via the comments or through the e-Literate contact form. For now, count this story as another example of the regulatory challenges that online education providers encounter.

Update: I just found this from Coursera’s help page:

As you may know, certain United States export control regulations prohibit U.S. businesses, such as Coursera, from offering services to users in sanctioned countries (Cuba, Iran, Sudan, and Syria). The interpretation of export control regulations as they related to MOOCs was unclear for a period of time, and Coursera had been operating under one interpretation of the law.

Recently, Coursera received a clear answer indicating that certain aspects of the Coursera MOOC experience are considered “services” (and all services are highly restricted by export controls). While many students from these countries were previously able to access Coursera, this change means that we will no longer be able to provide students in sanctioned countries with access to Coursera moving forward.

Students in affected countries attempting to log in will see the following message:

[Screenshot: Coursera sign-in screen displaying the export-restriction message]

We truly regret that this is the case and apologize to users who were caught off guard by this change, but we must comply with U.S. laws regarding economic and trade restrictions.

This likely means, as stated by @GlobalHigherEd, that “education provided by a MOOC is officially a tradable ‘service’ vs information”.

Update: Nina Curley, author of the original story at Wamda, was notified by Coursera that it has reinstated access in Syria. She also confirmed that it was the State Department specifically making the ruling and that it was based on the services, not Coursera’s for-profit status. See full update at Nina’s original story.


Twice as many institutions as previously reported have no online courses

Thu, 2014-01-23 20:15

Recently I pointed out that the widely-quoted Babson survey on online learning estimates 7.1 million US higher ed students taking at least one online course while the new IPEDS data indicates the number as 5.5 million. After looking deeper at the data, it appears that the difference in institutions (whether or not an institution offers any online courses) is even greater than the difference in students. This institutional profile is important, as the Babson report (p. 13) noted that institutions offering no online courses had very different answers than others, a theme that ran through much of the report: [emphasis added]

The results for 2013 represent a marked change from the pattern of responses observed in previous years. In the past, all institutions have consistently shown a similar pattern of change over time. Different groups of institutions typically reported the same direction of change – if one group noted an improvement on a particular index, all other groups would show a similar degree of improvement. The overall level of agreement with a particular statement might vary among different groups, but the pattern of change over time would be similar. This is not the case for 2013.

As noted above, there was a year-to-year change in the overall pattern of opinions on the strategic importance of online education, and on the relative learning outcomes of online instruction, as compared to face-to-face instruction. In both cases, the historic pattern of continued improvement took a step back for 2013, and all of the changes are accounted for in a single group of institutions: those that do not have any online offerings.

Institutions with no online offerings represent a small minority of higher education – how are they different?

Let’s look at the IPEDS data on institutions versus the Babson data, first by institutional control. I took the data on page 32 of the Babson report and recreated the graph, then I ran the same analysis using IPEDS data. (NOTE: these interactive charts do not come through on RSS feeds, so you will probably have to click through to the post to see them.)

[Chart: institutions with no online offerings vs. those with online offerings, by institutional control]

The Babson report also evaluates these institutions by basic Carnegie classification and institutional enrollment. I did not evaluate the former (too messy), but I did run the same analysis by enrollment.

[Chart: institutions with no online offerings vs. those with online offerings, by total enrollment]
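
For readers who want to reproduce this kind of breakdown themselves, here is a minimal sketch using pandas. The column names, sample rows, and enrollment buckets are hypothetical placeholders, not the actual IPEDS field codes or the cutoffs used in the charts.

```python
# A minimal sketch of tabulating the share of institutions with no online
# offerings by institutional control and by enrollment size.
import pandas as pd

# Hypothetical frame: one row per institution.
institutions = pd.DataFrame({
    "control": ["Public", "Private not-for-profit", "Private for-profit", "Public"],
    "total_enrollment": [12000, 900, 300, 25000],
    "offers_distance_ed": [True, False, False, True],
})

# Share of institutions with no online offerings, by institutional control.
by_control = (
    1 - institutions.groupby("control")["offers_distance_ed"].mean()
).rename("share_with_no_online")

# Same share, broken down by (illustrative) enrollment buckets.
buckets = pd.cut(
    institutions["total_enrollment"],
    bins=[0, 1500, 5000, 15000, float("inf")],
    right=False,
    labels=["under 1,500", "1,500-4,999", "5,000-14,999", "15,000 and up"],
)
by_enrollment = (
    1 - institutions.groupby(buckets, observed=False)["offers_distance_ed"].mean()
).rename("share_with_no_online")

print(by_control, by_enrollment, sep="\n\n")
```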

Some notes:

  • While I have been able to recreate the universe of 4,726 institutions referenced on page 29 of the report, I cannot get the same total enrollment figure. The Babson data indicates 21.3 million students compared to the IPEDS figure of 20.6 million. I don’t believe this 3% difference is that meaningful.
  • While the Babson data refers to a universe of 4,726 institutions, the data provided is based on 4,332 and 4,269 institutions, primarily because far fewer for-profit institutions are included. There is no explanation for these different numbers, but keep in mind that Babson’s data comes from a survey that extrapolates to estimate the universe.
  • The big difference that should be obvious is that the Babson data shows less than half as many institutions with no online offerings as the IPEDS data – 15% compared to 31%.

Not only does the IPEDS data indicate that twice as many institutions have no online courses as previously reported, but I also question the finding that “institutions with no online offerings represent a small minority of higher education”. 31% is not a small minority.

I am not questioning the research methods of the Babson Survey Research Group nor the value of their annual survey. It is just that we now have a new source of data that must be accounted for. While I do not think the IPEDS data is flawless, it is better than the survey-based data used by Babson. Jeff Seaman, one of the two Babson researchers, said as much in this Chronicle article:

So which number is correct?

The lower one, probably. The Education Department data are more likely to be accurate, “given that they are working from the universe of all schools,” says Mr. Seaman by email. [snip]

The reporting requirements for the department “are such that I would always trust their numbers over ours,” he wrote. “However, I still believe that the trends we have reported for the past 11 years are very much real.”

I hope the analysis I’m doing based on IPEDS data doesn’t come off as nitpicking or attacking the Babson survey. The annual survey has been a very useful source of information, and the trend data as well as attitudinal data cannot be replicated by IPEDS. These Babson reports have enormous influence on the higher education community, being the most widely-quoted source on just how prevalent online education is in the US. It is very important to adjust our thinking based on new information and to be transparent with research data.

The post Twice as many institutions as previously reported have no online courses appeared first on e-Literate.

Gen Ed and Competency-Based Education

Thu, 2014-01-23 08:04

Inside Higher Ed has a write-up today on an effort by the Association of American Colleges & Universities (AAC&U) to develop a competency-based framework for general education called General Education Maps and Markers (GEMs), funded by a multi-million-dollar Gates Foundation grant. I am honored to report that I have been invited to participate on one of the GEMs committees.

To be honest, I have mixed feelings about competency-based education (CBE). On the one hand, I do think that an earnest attempt to have a norming conversation and set some standards for what we hope students will learn at college is very important and badly overdue. On the other hand, it’s all too easy to apply CBE in a mechanistic and reductive way that ignores the important but hard-to-measure skills precisely because they’re hard to measure. It’s essentially the same worry that a number of readers expressed in the comments on my post about Pearson’s efficacy initiative. That said, AAC&U has a good shot at assembling the right stakeholders to grapple with this complexity appropriately. I do think there are models out there. For example, ASU’s Habitable Worlds class, which is both competency-based and mastery-based, is not at all dumbed down and could even be further expanded to include writing and similar open-ended analytic and communications skills. (I interviewed Habitable Worlds co-creator Ariel Anbar last year.) But there are many challenges.

Anyway, the first meeting of the GEMs group is coming up in a couple of weeks. I’ll let you know what I think after spending a couple of days talking about the issues with the working group.

The post Gen Ed and Competency-Based Education appeared first on e-Literate.

New IPEDS Data: A graphical view of online ed by state and by sector

Thu, 2014-01-23 00:07

Update: I should have guessed this, but the visualizations don’t come through on RSS feeds, so you’ll need to click through to the article.

Reader Mike Himmelstein has rightly pointed out that our analysis of the new IPEDS data would benefit from using visualization tools instead of just tables. This comment led me to a multi-day investigation of which data visualization tool would best integrate into a WordPress blog while maintaining interactive data exploration. I tried MicroStrategy (a great tool, but charts cannot be shared without a login), IBM Many Eyes (a good public tool, but limited in formatting), and several variations of Google Charts (not as rich in features as MicroStrategy, but close, and it supports public sharing). In the end I settled on the Visualizer plugin to display Google Charts. All data below is for degree-granting institutions.

Let’s first look at the state-by-state data in a Geochart. This data tracks online ed in public higher ed institutions as of Fall 2012. I’m showing the data broken out by undergraduate and graduate students. The color scale is based on the percentage of students taking at least one online course, but if you hover over a state you can also see the percentage of students taking all of their courses online.

For the basic data combining undergraduate and graduate students in a table format, see this post.

Percentage of Public Higher Ed Undergrad Students Taking Online Courses, by State

Percentage of Public Higher Ed Grad Students Taking Online Courses, by State

Profile of All Degree-Seeking Higher Ed by Sector

In another post I showed the number and percentage of students taking online courses by sector – public, private and for-profit for both 4-year+ and 2-year. Hovering over the sector bars will show the data.

The post New IPEDS Data: A graphical view of online ed by state and by sector appeared first on e-Literate.

Clarification: No, there aren’t 7.1 million students in US taking at least one online class

Thu, 2014-01-16 11:06

I’ve written several posts recently looking at the new IPEDS data on distance education (for the most part equivalent to online education). In one post I pointed out [emphasis added]:

Previously, the best data available on total student counts came from the Babson Survey Research Group with their annual survey (prior to 2012 called the Sloan survey). This is the survey tracking the total number of students taking at least one course online. When I talked to the researchers earlier this year, they mentioned that they hoped the new IPEDS data would ‘put them out of business’. I hope this comment was half in jest, as their survey measures much more information than just total student counts and they have very useful longitudinal data. I have asked BSRG for an updated statement on their plans but have not been able to get a response yet.

Some additional notes on the data:

According to the IPEDS data, 5.5 million (26%) degree-seeking students in the US took at least one online course in Fall 2012, which is significantly less than that reported by Babson (6.7 million / 32% for Fall 2011 data). Keep in mind the different methodologies involved – IPEDS collects data reported directly from colleges and universities, while Babson is based on a representative survey; IPEDS measures “distance education”, while Babson measures “online courses”. I would look forward to this year’s Babson survey to explain the differences, but for now, I’ll just note the difference.

The new Babson Survey came out yesterday morning, and I was disappointed to find out that they did not address or acknowledge the differences in the data. With the new survey data (p. 15, using Fall 2012 data just like IPEDS), they came to a dramatically different conclusion [emphasis added]:

There were 412,000 more online students in fall 2012 than in fall 2011, for a new total of 7.1 million students taking at least one online course. This year-to-year change represents the smallest numeric increase in the past five years. The growth rate of 6.1 percent in students taking at least one online course also represents the lowest percentage increase since these reports began tracking online enrollments.

The difference between 5.5 million and 7.1 million is quite significant – Babson’s estimate from the survey is 29% higher than the IPEDS data. The Babson survey is the most widely-quoted source for answering the question “how many students take online courses in the US”, and these differences are important for policy makers and planners. And as a note, the Babson survey also looks specifically at degree-granting institutions (see p. 33 of the survey report).

While the initial articles in the Chronicle and Inside Higher Ed did not cover this disparity, Steve Kolowich at the Chronicle did talk to Jeff Seaman (one of the report authors) to get a statement for this morning’s post:

But how many American students are taking at least one online course right now?

The answer, according to the latest figures from the Babson Survey Research Group, is about 7.1 million.

Or is it?

For the last decade, researchers and journalists have relied on the Babson group and its annual survey to measure the scale and growth of online higher education in the United States. With backing from the Sloan Consortium and others, the Babson surveyors have been taking the temperature of online education in the United States since 2002, when they estimated that 1.6 million students were taking at least one online course.

The article goes on to answer the question:

So which number is correct?

The lower one, probably. The Education Department data are more likely to be accurate, “given that they are working from the universe of all schools,” says Mr. Seaman by email. [snip]

The reporting requirements for the department “are such that I would always trust their numbers over ours,” he wrote. “However, I still believe that the trends we have reported for the past 11 years are very much real.”

It’s good to see Steve’s follow-up and Jeff’s response.

I certainly agree that the differences between the Babson Survey and IPEDS data point out the challenges we have had with inconsistent definitions (what should trigger a course to be classified as online versus face-to-face or hybrid) as well as with inconsistent data collection by colleges and universities. Jeff makes a great point that the new IPEDS data will force institutions to officially track the data in student systems (rather than shadow systems) and to use a consistent definition.

I think that the Babson Survey Research Group should have acknowledged these differences publicly along with the release of the report, as we need to have more confidence in the impartiality of our data collection. Even with the IPEDS data, the Babson survey remains valuable for the attitudinal data and the consistent trend data it provides. But kudos to them for the quick explanation.

What we still need is guidance on how to translate the data into an understandable form. One possible solution is for Babson to do a one-time adjustment based on a new bias factor, going back to 2002, to give new time-series data. While there will likely be media confusion for a short time, we would benefit from coordinated data in the long term.

The Babson survey is very useful – and will remain so – even with the release of IPEDS data. We need to have confidence in the data, however, and this clarification is important. I am in contact with one of the Babson authors and expect to be able to provide a deeper explanation soon.

And no, there aren’t 7.1 million US higher ed students taking at least one online course. There are closer to 5.5 million as of Fall 2012.

The post Clarification: No, there aren’t 7.1 million students in US taking at least one online class appeared first on e-Literate.

Evolve Conference on Friday: Engaging students in the online conversation

Wed, 2014-01-15 18:27

On Friday, January 17th, I’ll be in LA to help moderate the Evolve Conference put on by the 20 Million Minds Foundation. Despite all the talk about the role of online education in higher ed, most conferences and events – GSV Education Innovation Summit, EDUCAUSE, WCET, Sloan Consortium, and others – include institutional leaders and staff, faculty, technologists, investors, and consultants, but not students. Last year’s Re:Boot conference did include a student panel, but we need more focus on the student perspective. I don’t fault these conferences, as it is very hard to get student input outside of a survey, but it is still a missing piece of the puzzle – arguably the biggest piece.

This student focus gets to the core purpose of the Evolve Conference:

Major stakeholders have weighed in on the role of online and educational technologies–we have heard from faculty, unions, system leaders, policy makers, and leading ed-tech innovators. However, the ever-critical student perspective has largely been absent.

Building on the success of the 20MM “Re:Boot California Higher Education” symposium of January 2013, “EVOLVE California Higher Education” is the second installment of the collaborative discussion series that will shift the focus to ensure that the student voice is given prominent placement in the state and national conversations surrounding online learning and technology. Student presenters will be joined by a myriad of post secondary leaders and technology innovators.

I’ll co-moderate the panels with Isa Adney, a recent (read that as ‘much younger than Phil’) community college graduate who went on to get her bachelor’s and master’s degrees. She has written a book on the community college experience and is now a speaker and advocate for students. I look forward to working with Isa at the conference.

The program is built around a student panel, whose members will start by sharing their personal experiences with online education and educational technologies. After this first panel, there will be a series of presenter panels that will speak to the students and field questions from the ongoing student panel. The topics include:

  • How will online offerings increase access to undergraduate and bottleneck courses?
  • How can online offerings support the needs of a diverse student population?
  • How will faculty use online platforms to design robust course offerings?
  • What policy recommendations can support online initiatives that increase graduation rates?

There is an obvious bias toward assuming that online offerings can address these problems if done properly, so much of the conversation should focus on how, not whether.

The event will be live-streamed on Friday, January 17th from 9:30am – 2:00pm PST. Check out the conference web site for more information on attending the conference in person or by live stream. Update: the live stream information is not on the site yet but will be by Friday.

The post Evolve Conference on Friday: Engaging students in the online conversation appeared first on e-Literate.

New IPEDS Data: State by state online ed comparison / public institutions

Tue, 2014-01-14 14:55

It’s Tuesday, so it must be time for my daily post on the new IPEDS data on online education. There are so many ways to slice this data that just were not possible before, and with a little spreadsheet engineering, I’m finding it easy to come up with new views. Today let’s look at the adoption of online education per state for public institutions (both 4-year and 2-year schools). Note that this data does not include private non-profit or for-profit institutions, as I wanted to focus on official state involvement in online ed.

IPEDS tracks information now about students taking exclusively distance education courses, students taking some but not all distance education courses, and students not taking any distance education courses. Please note the following:

  • I am using the terminology “online courses” rather than “distance education”. For the most part these terms are interchangeable, but they are not equivalent as “distance education” can include courses delivered by a medium other than the Internet (e.g. correspondence course). Part of the reason I stick with “online course” is that there is a growing use case where students local to a campus choose to take a course online, which is not really at a distance.
  • I have combined the fields for students taking “exclusively online courses” and “some but not all online courses” to derive “at least one online course”, as this roughly equals the data reported by the Sloan survey / Babson Survey Research Group survey. (A minimal sketch of this derivation follows below.)
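To make that derivation concrete, here is a rough sketch. The column names and numbers are invented for illustration, not the actual IPEDS values, and the state figures shown are placeholders only.

```python
# Hedged sketch of the derivation: "at least one online course" = "exclusively online"
# + "some but not all online", then divided by total enrollment to get the chart's
# percentage. Column names and numbers are invented, not real IPEDS data.
import pandas as pd

state_data = pd.DataFrame({
    "state": ["AZ", "NC", "DE"],
    "exclusively_online": [90000, 60000, 3000],
    "some_online": [52000, 45000, 2500],
    "total_enrollment": [340000, 380000, 55000],
})

state_data["at_least_one_online"] = state_data["exclusively_online"] + state_data["some_online"]
state_data["pct_at_least_one"] = (
    100 * state_data["at_least_one_online"] / state_data["total_enrollment"]
).round(1)
print(state_data[["state", "pct_at_least_one"]])
```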

Online ed by state

Some notes on the data:

  • There is quite a variation by state, with 42% of public higher ed students in Arizona taking at least one online course but only 10% in Delaware and 3% in Washington, DC (yes, I know it’s not really a state).
  • Not surprisingly, the greatest adoption of online education in public higher ed is in the larger western states (Arizona, North Dakota, Alaska, Idaho, Oklahoma, etc.), with the least adoption in smaller eastern states (DC, Delaware, Connecticut, Vermont, New Hampshire, etc.). The biggest outlier is North Carolina, which has the third-highest adoption despite being medium-sized geographically and located in the east.
  • I was surprised to see that more than one in five students at Kentucky public institutions are taking all of their courses online.
  • Two of the states getting most of the exposure in online education, California and New York, are 43rd and 44th in terms of current online enrollment; meanwhile Florida is 13th.

I’m sure there are many other interesting observations possible with this data.

The post New IPEDS Data: State by state online ed comparison / public institutions appeared first on e-Literate.

New IPEDS Data: Profile of online education in US / Fall 2012

Sun, 2014-01-12 23:31

Last week I mentioned how the National Center for Education Statistics (NCES) and its Integrated Postsecondary Education Data System (IPEDS) are now providing preliminary data for the Fall 2012 term that for the first time includes online education. Let’s look at a profile of online education in the US for degree-granting colleges and universities, broken out by sector and undergrad / grad degrees.

IPEDS tracks information now about students taking exclusively distance education courses, students taking some but not all distance education courses, and students not taking any distance education courses. Please note the following:

  • I am using the terminology “online courses” rather than “distance education”. For the most part these terms are interchangeable, but they are not equivalent as “distance education” can include courses delivered by a medium other than the Internet (e.g. correspondence course). Part of the reason I stick with “online course” is that there is a growing use case where students local to a campus choose to take a course online, which is not really at a distance.

Here is the new IPEDS data for undergraduate and graduate students and courses:

Undergrad by sector

Grad by sector

 

Now let’s combine the data, undergraduate and graduate students and courses:

Totals by sector

Previously, the best data available on total student counts came from the Babson Survey Research Group with their annual survey (prior to 2012 called the Sloan survey). This is the survey tracking the total number of students taking at least one course online. When I talked to the researchers earlier this year, they mentioned that they hoped the new IPEDS data would ‘put them out of business’. I hope this comment was half in jest, as their survey measures much more information than just total student counts and they have very useful longitudinal data. I have asked BSRG for an updated statement on their plans but have not been able to get a response yet.

Some additional notes on the data:

  • According to the IPEDS data, 5.5 million (26%) degree-seeking students in the US took at least one online course in Fall 2012, which is significantly less than that reported by Babson (6.7 million / 32% for Fall 2011 data). Keep in mind the different methodologies involved – IPEDS collects data reported directly from colleges and universities, while Babson is based on a representative survey; IPEDS measures “distance education”, while Babson measures “online courses”. I would look forward to this year’s Babson survey to explain the differences, but for now, I’ll just note the difference.
  • For-profit schools (66% of undergrad and 82% of grad students taking at least one online course) and public 2-year colleges (27% of students taking at least one online course) have the highest usage of online courses, followed by public 4-year institutions (22% of undergrad and 24% of grad students taking at least one online course).
  • Graduate students have a higher rate of taking online courses (22% exclusively online and 30% taking at least one online course) than do undergraduates (11% exclusively online and 25% taking at least one online course); however, there is a higher percentage of undergrads taking some, but not all, of their courses online (14%) than of grad students (8%). In other words, grad programs appear to be fully face-to-face or fully online whereas undergrad programs have a higher mix-and-match percentage.
  • Overall, there were slightly more students taking some, but not all, of their courses online than students taking exclusively online courses. This group of students, who are augmenting their face-to-face programs with online courses, gets far less coverage and analysis than the fully online students. I suspect that this number will grow over time.

The post New IPEDS Data: Profile of online education in US / Fall 2012 appeared first on e-Literate.

Opportunities and Predictions, 2014 A.D.

Thu, 2014-01-09 15:54

I make fun of predictions all the time (see, for example, Education Analysts Have Predicted 7 of the last 0 Mobile Revolutions). But heck, why not jump in?

Here’s how I see the world. Mostly, being in a field doesn’t give you predictive powers — it just makes you aware of where the real opportunities are.

But opportunities, even great ones, aren’t always seized upon. There are good opportunities that people will take. There are good opportunities people will ignore. And then there are the dead-ends everyone else is in love with. Here’s my take on what will have more prominence in a year’s time, more or less in those categories.

Good Opportunities That Will Be Taken Seriously by the Powers That Be

1. Synchronous Online

usplus
Image from US+ project

Synchronous online is largely dismissed — the sexy stuff is all in programmed, individuated learning these days, and individuated is culturally identified with asynchronous. That’s a mistake. First, tens of thousands of students (hundreds of thousands?) are still suffering through traditional multi-campus distance learning sessions where they wait patiently in largely empty classrooms for the classroom camera to grind over to them so they can ask a question to a teacher teaching on some other satellite campus 150 miles away. These sessions have no flow, no life, and take up valuable room assignments. And they aren’t cost-neutral — the associated equipment either needs to be updated and replaced, or we need to find new ways of doing this. My sense is if better online options were provided, a lot of campuses would ditch the traditional equipment and go with something more fluid.

Second, and more importantly, *millions* of students suffer through lousy Adobe Connect and Bb Collaborate sessions every semester. I’ve talked about this before — but the problem with these sessions is they ditch the affordances that physical space provides to structure classroom discussion without taking advantage of the unique affordances of the Net. Something as simple as pairing off students and walking around to “eavesdrop” remains extremely clunky in these systems (yes, I know about “breakout rooms”, but the fluidity is not there), and more advanced techniques such as “speed-dating reviews”, fishbowl discussions, and peer instruction are just plain impossible. And there’s no reason for that — it’d be trivial to model any number of research-tested approaches to classroom discussion using things like timed-turns, auto-pairing, random prompts and the like.
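To be clear about what “trivial” means here: the pairing-and-timing logic itself is a few lines of code; the hard part in a real product is the video and session plumbing around it. Here is a toy sketch of a “speed-dating review” rotation, with invented names and a naive shuffle (a real tool might use a round-robin schedule to avoid repeat partners).

```python
# Toy sketch of auto-pairing with timed turns for a "speed-dating review" activity.
# Names are invented; a naive shuffle is used, so partners can repeat across rounds.
import random

def speed_dating_rounds(students, num_rounds):
    """Yield one list of (a, b) pairs per round."""
    roster = list(students)
    if len(roster) % 2:
        roster.append(None)  # odd class size: one student sits out each round
    for _ in range(num_rounds):
        random.shuffle(roster)
        yield [(roster[i], roster[i + 1]) for i in range(0, len(roster), 2)]

for round_num, pairs in enumerate(speed_dating_rounds(["Ana", "Ben", "Chen", "Dee", "Eli"], 3), 1):
    print(f"Round {round_num} (5-minute turns): {pairs}")
```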

In other words, synchronous online is where asynchronous online was a number of years back, when the LMS was a glorified content management system. Then, suddenly, the long-commodified LMS market realized that if vendors focused on student learning and success rather than treating student and content management as a corporate function, they’d have a different pitch. Here’s my take on that LMS shift from back in 2011, as Canvas entered to redefine the market:

But, as it turns out, the real discussion had hardly begun. The LMS wars had all been on Blackboard’s turf —  as David Wiley has pointed out, they exhibited the classic attributes of “Hat Fail”. Instead of managing the interaction piece of learning — the individual path of students through the course — they conceptualized themselves as content repositories, and became irrelevant as both quality content and quality content publishing approached zero cost.

Adobe Connect and Blackboard Collaborate are, I think, in a similar place. They are perfect tools for sales presentations, but they remain education-illiterate products. They don’t help structure interaction in helpful ways. I sincerely doubt that either product has ever asked for the input of experts on classroom discussion on how net affordances might be used to produce better educational interaction, and I doubt there’s all that much more teacher input into the products either. The first people to bother to talk to experts like Stephen Brookfield on what makes good discussion work *pedagogically* and implement net-based features based on that input are going to have a different pitch, a redefined market, and the potential to make a lot of money. For this reason, I suspect we’ll see increasing entrants into this space and increasing prominence of their offerings.

Suggested tag line: “We built research-driven video conferencing for educators, and that is sadly revolutionary.”


2. Small Data Products

nic-cage-netflix

Big Data is data so big you can’t fit it in a commercial database. Small Data is data small enough to fit in a spreadsheet. Big Data may indeed be the future. But Small Data is the revolution in progress.

Why? Because the two people most able to affect education in any given scenario are the student and the teacher. And the information they need to make decisions has to be grokable to them, and fit with their understanding of the universe.

Want to know a great example of Small Data? A local school district here runs a “one-F” analysis on student grades. It’s just what it sounds like — it pulls a list of students that have A’s and B’s in all their classes, but an F in one. And then they sit down with the teacher and the student and ask what’s going on.
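The query itself is about as simple as institutional analytics get. A minimal sketch, assuming a gradebook table with one row per student per course; the layout and names are hypothetical, not any district’s actual system.

```python
# Hedged sketch of a "one-F" report: students with A's and B's in every course
# except exactly one F. The gradebook layout and names are hypothetical.
import pandas as pd

grades = pd.DataFrame({
    "student": ["s1", "s1", "s1", "s2", "s2", "s2"],
    "course":  ["ENG", "BIO", "MTH", "ENG", "BIO", "MTH"],
    "grade":   ["A",   "B",   "F",   "C",   "F",   "F"],
})

def one_f_students(gradebook):
    """Return students whose grades are all A/B except for exactly one F."""
    def is_one_f(g):
        return (g == "F").sum() == 1 and g.isin(["A", "B"]).sum() == len(g) - 1
    return [s for s, g in gradebook.groupby("student")["grade"] if is_one_f(g)]

print(one_f_students(grades))  # -> ['s1']
```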

There are no complex correlations here, because complex correlations that aren’t understood by end users don’t result in meaningful action, a fact that even Big Data darling Netflix learned the hard way. One-F analysis works because both people involved in the situation can think meaningfully about the pattern and alter their behavior to affect it. If I sit a student or a professor down and explain the one-F situation, they can immediately theorize how their behavior, competency, or history might be contributing to it.

That’s not to say that complex math doesn’t have a place. Point biserial correlations run on multiple-choice tests can identify questions where success doesn’t tend to correlate with success on other questions. The math is not hard, but it’d freak your average professor out. The point, however, is that I can run a correlation on a professor’s test with a fairly small sample size, identify potentially bad questions, and explain the problem to the professor as “Your ‘A’ students did no better on this question than your ‘F’ students — I think there might be a problem with it.” And they’ll get that.
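For the curious, here is a hedged sketch of that item analysis using scipy’s point-biserial correlation on invented data. The “rest score” (total minus the item itself) is used as the criterion so an item doesn’t inflate its own correlation.

```python
# Hedged sketch of item analysis on an invented multiple-choice result matrix:
# correlate each item's right/wrong column with students' rest scores. An r near
# zero (or negative) flags a question where strong students did no better than weak ones.
import numpy as np
from scipy.stats import pointbiserialr

# rows = students, columns = items; 1 = correct, 0 = incorrect (made-up data)
responses = np.array([
    [1, 1, 0],
    [1, 1, 1],
    [1, 0, 1],
    [0, 0, 1],
    [0, 0, 0],
])
totals = responses.sum(axis=1)

for item in range(responses.shape[1]):
    rest = totals - responses[:, item]  # exclude the item from its own criterion
    r, _ = pointbiserialr(responses[:, item], rest)
    print(f"item {item + 1}: point-biserial r = {r:.2f}")
```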

What will the Small Data products look like? It’s likely to be dashboard-style stuff: visualizations, metrics, alerts. But the key will be removing the “secret sauce” mentality that most vendors have about the scoring of their analytics. If schemes like Course Signals persist (and they will), they will have to be made meaningful. Netflix tells me that it suggested Dodgeball because I liked Zoolander, and it does that in the pretty low-stakes arena of weekend movie selection. And as the article linked above demonstrates, Netflix is moving more and more in this direction, creating tens of thousands of “micro-genres” to help customers better understand the meaning behind the math. (And yes, you really need to read that article.)

In higher education the stakes are somewhat greater, yet many analytics are stuck where Netflix was four years ago, telling you that there was a 95% chance you would give Movie X a 3.5 rating, and watching you not care in the least. That was a Big Data way of thinking. For the moment, however, small is beautiful, and future analytics products will have to identify to students and professors whether a poor prognosis is due to prior GPA, class attendance, or LMS logins.
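This is not how any particular vendor’s product works, but as an illustration of the kind of transparency being asked for: with a simple additive model, each input’s contribution to a student’s risk score can be reported directly. The weights, baseline, and feature names below are all invented.

```python
# Illustrative sketch only: an additive risk score whose per-feature contributions
# can be shown to a student or professor. All weights, baselines, and values are invented.
FEATURE_WEIGHTS = {"prior_gpa": -0.8, "absences": 0.15, "lms_logins_per_week": -0.1}
BASELINE = {"prior_gpa": 3.0, "absences": 2.0, "lms_logins_per_week": 5.0}

def explain_risk(student):
    """Return the total risk score (relative to a baseline student) and each feature's share."""
    contributions = {
        name: FEATURE_WEIGHTS[name] * (student[name] - BASELINE[name])
        for name in FEATURE_WEIGHTS
    }
    return sum(contributions.values()), contributions

score, parts = explain_risk({"prior_gpa": 2.1, "absences": 6, "lms_logins_per_week": 1})
print(f"risk relative to baseline: {score:+.2f}")
for name, value in sorted(parts.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {value:+.2f}")
```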

Potential hurdle: Vendors love them some secret sauce, and hate for you to know that 95% of their “complex” prediction is just prior GPA. It kills the mystery. So this one’s less certain.

3. Local Online

This is an old prediction of mine, going back a couple years now, and you might argue that it’s already happened — at any given university with an online arm you’ll find that 25-40% of the online classes are taken by students who are already on campus. So the trend is here. It’s arrived.

That said, there have been surprisingly few concerted attempts to restructure online to serve this incredibly important population. At many universities, students who take online classes are actually treated as if they are enrolled in a separate “campus”, despite the fact that they are being taught by the same people who teach their face-to-face courses and are taking the courses for a degree program in which they are enrolled locally.

In a way, this is related to the Synchronous Online prediction. For a long time the assumption has been that people take “inferior” online courses primarily because these people are

  • Distant
  • Schedule-constrained

For an increasing audience of students this just isn’t true anymore. People take online courses due to course scarcity, or because they like an online experience “in the mix”. Or maybe the online section has their favorite professor, or a really cool experience. Maybe they *could* get up at 8 a.m. if they had to, but they don’t trust themselves to do that religiously. Maybe they have time for the synchronous experience, but the scheduling gods have deigned to put the two courses they need to graduate at the same time on the same day.

Schools have been shipping these students the same product as their more traditional distant, time-constrained brethren. Why? Why take such huge local advantages and opportunities and throw them away just because the course is “online”?

There’s been a lot of scattered experimentation on this front, but I think this is the year a college or two really cracks the code on this, weaving local online into their global curriculum intentionally, and (and this part’s the big part) making their unique approach to local online part of the marketing campaign of the college. And on the trailing edge, this may push schools who have put their online and face-to-face efforts in completely different units to move towards integration and realignment.

I’m sure a couple of schools will show up in the comments and say they’ve started that already. That’s good, because these sorts of realignments take time, and if this prediction is going to turn out well, you’ll have to be chugging into the marketing campaign in about eight months…

Good Opportunities that Will Likely Not Be Taken Seriously by the Powers That Be

1. Return of Education 2.0

seinfeld

Hey, I hated the term too. I wrote a screed on it. But term aside, the use of social media — blogs, wikis, social bookmarking — was treated over the last couple of years a bit like a phase we went through on our way to MOOCdom. Now that the dust has cleared, I’m seeing institutions that have made huge strides in the past three years while everyone was looking elsewhere. While the University of British Columbia was getting into MOOC production on one end of the organization, smart people on the other end were plugging away at the UBC Wiki. While UMW’s excellent ds106 garnered much attention, UMWBlogs continued to grow.

The social climate has changed as well. “Hashtag” became a term so well known that it jumped the shark. Social bookmarking returned in the form of Pinterest, a site so popular that one out of every three American women on the internet uses it. Teens left the suburbia of Facebook for the selfie-foodie kingdom of Instagram and the, um — something — that is Tumblr. The “teens don’t tweet” conversation of a couple years ago has morphed into “Teens love them some pubsub architecture.”

Able critiques of the like/retweet web aside, part of what is happening is that the architecture of the web is moving into areas that do not require writing or video production for participation. And that’s good. Because when I can get in front of faculty and explain that the same Pinterest architecture that showed them how to cook a better turkey last Thanksgiving can show them how to build a better class, that’s an unqualified win for everyone. We can work on the writing part later.

Professionally, what I’ve noticed is this — in the communities I’m in, we’ve moved from evangelism (You need to get your students onto blogs, and all your problems are illusions!) to a much more reflective state. There are people around who have been working with student blogging and student wikis for more than a decade now, and that changes things in a profound way. I think we are getting closer to the point where the service-wrapped free-software model being used by David Wiley and Kim Thanos at Lumen could be profitably applied to some of these blurred-boundary classroom efforts, because the body of knowledge exists; it’s just not equally distributed.

So why the dismal forecast for this year? History, I guess. Ultimately this is not a story about a product, but a story about a way of working with one another. It’s about thinking like the web. That remains a tough sell.

Almost everyone I know that has really advanced “learning 2.0″ has got into it because tapping into web communities changed their professional, personal, or political life and they wondered why the heck student experiences couldn’t be like that. The people that excel at putting these sorts of educational experiences together don’t teach this as a skill, but as a lifestyle. And to do that, you must live it.

We’re getting there in our personal lives, via Pinterest, via Tumblr, via Twitter. But most people still do not have the experience of building and tapping into strong online networks of expertise for their job, and in fact are actively dissuaded from doing so. Most administrators still see Twitter as a productivity suck or a toy. Until this changes, the learning 2.0 community will keep on truckin’, building ways of teaching that might have a chance for broad adoption in 2015 or 2016. But not this year. For this year expect the actual people in power to engage in more dumb discussions about “net civility”, “social media safety”, and FERPA while the rest of us build the education you will use a decade from now.

2. Privacy Products

faceblock

I’d like to see last year’s revelations about NSA spying as a watershed moment where people move to more distributed, client-managed solutions. I really would.  But I just don’t see it.

The problem with consumer privacy is it isn’t a consumer problem, like a preference for products with rich APIs. If you want to know why, read Unsafe at Any Speed and replace “car industry” with “Silicon Valley” and “safety” with “privacy”. You can’t buy your way out of a lousy car industry, and you won’t be able to boycott Silicon Valley into truly respecting your privacy.

The way we will get privacy, if we get it at all, is public pressure followed by legislation that sets up systems of oversight and accountability. But given that we couldn’t even get minor gun legislation passed after Newtown, that’s not happening this year. Not with this administration, not with this Congress.

bills-side.r

That said, the platform dreams of Facebook, Google+, and Twitter will begin a death spiral this year as the services integration point moves back into the operating system. As you increasingly use apps to log into Facebook, Gmail, and Twitter, these companies won’t be able to use the web’s horrible, broken cookie-based persistence system to track your behavior across other parts of the web. That means the global view of your behavior is going to have to be at the OS level, and people may react differently to a phone that they paid for (and pay for) tracking them than to a free service where “that was just the deal.”

But probably not. Just pray for Congress to get better so we can get on with the legislative efforts. Until then, expect more of the same.

Dead-ends that everyone else will be talking about

friedman

Mobile learning will continue to be a meaningless term that means both everything and nothing.

Gamification will continue to be a bizarre term denoting principles derived from 70 years of educational research on motivation and competency, as told to us by the people who created Farmville.

For-profit Bundled Service Providers like 2U, Academic Partnerships, and others will continue to find non-profit institutions willing to auction off their name for a cut of online action, and Thomas Friedman will be there to cover it as some gosh-darn American private sector ingenuity.

Competency certification efforts will speed up, which is *not* a dead end. But along with this, the GED for Higher Education will begin to take shape, although no one will call it that, since the GED was a policy disaster that did horrible damage to our educational system. Not like this new thing, right?

———————–

Image Credits

“You’ve been automuted because you’re talking too much”: US+ project, via FastCo.labs

“Movies featuring an Epic Nicolas Cage meltdown”: Netflix’s April Fool’s Joke, via Salon

“I will be part of the last generation to get Seinfeld”: Anonymous, via Quickmeme

“I tried to deactivate my Facebook profile”: Arrested Development, Season 4, via the Online Privacy Blog

“Bills Enacted Into Law, 1947 to 2013″: Boston Globe

“Dial M for Moustache”: This Modern World

The post Opportunities and Predictions, 2014 A.D. appeared first on e-Literate.

Boundless Is Totally #Winning

Thu, 2014-01-09 14:09

I was contacted by Boundless CEO Ariel Diaz regarding a concern he had with my blog post about the lawsuit outcome. This was not entirely surprising, and I was curious to see which aspects of the post concerned him. Was it my characterization of Boundless as not a content company? Was it my speculation that Boundless was really designed to ride on Chegg’s coattails with the VC community? No, none of those. It was my claim that Boundless lost the legal fight.

You may wonder, “How can a company that was required to pay $600,000 to the plaintiffs, destroy three of the products it had created, and accept a permanent injunction on how it does business going forward, have actually won?” I wondered that too.

Diaz’s argument was essentially that (1) Boundless is still in business (and growing!), and (2) they are still offering textbooks that are “aligned” with those of the publishers and marketing them direct to students. When I asked him how he could do the latter without violating the terms of the injunction, his response was that he could not comment on the details of the confidential settlement, but he urged me to look again at the details of the consent and injunction judgment.

Huh.

Rereading said document, we find the following clause:

7. Boundless, and its officers, agents, servants, employees, attorneys in fact, and all those in active concert or participation with them, are hereby PERMANENTLY ENJOINED from:

a. Aligning any of its products, services, or content to Plaintiffs’ Copyrighted Works without prior written permission of the Plaintiff whose work it is [emphasis added]

This suggests that the private settlement includes some sort of a licensing agreement by which Boundless pays the publishers in return for being able to “align” their books and advertise them as aligned. Why would the publishers go along with this? I can only speculate. One possibility is that it was worth it for them to give up a concession in return for getting a legal result that affirmed their copyright protection. e-Literate reader Don Gorges (who, it must be said, clearly does not think highly of Boundless) put me onto this article by the law firm Jones and Day describing the intricacies of the textbook publishers’ legal burden. The publishers had a pretty good case, but it wasn’t open and shut. (By the way, this article is also a good read for OER advocates who want to understand the broader implications and what the legal threat is and is not to the broader OER project. It is also something I truly hope that Boundless’ investors looked into before writing a check.)

But also, as long as the copyright issue was put to bed, allowing Boundless to publish knock-off texts isn’t giving up anything of great value to the publishers, particularly if they get a royalty in return. Remember, their legacy textbook business is falling off a cliff anyway. The publishers are looking to get out of it and into other products and services as quickly as possible. In the meantime, they are fighting a rear-guard action by finding ways in which they can extract some small fee from students who otherwise might not pay them anything at all for their books. For example, on Amazon, Cengage offers a rental of Mankiw’s Principles of Economics for $39.35. How does the quality compare between the $39.35 rental option and the $20 Boundless option? Our Boundless critic Don Gorges has shared these helpful comparison links:

When thinking about the strength of Boundless’ business prospects, you need to look at the quality of the Boundless “aligned” version relative to the original and then think about the value, not of a $130 price difference, but of a $20 price difference.

As an aside, for those of you who feel that you would commit to the product on principle to support a company that is developing OER, you might want to read the Boundless Terms of Service:

While much of the Content is Creative Commons Content, the rest of the Content available through our Website is not necessarily licensed under the Creative Commons license. For clarity and without limitation, the following are not licensed under the Creative Commons license:

I. THE NAMES, TRADEMARKS AND LOGOS OF BOUNDLESS LEARNING AND ANY OTHER THIRD PARTY LISTED ON THE WEBSITE ARE NOT INCLUDED IN THE CONTENT THAT IS LICENSED UNDER THE CREATIVE COMMONS TERMS; AND

II. ANY OTHER CONTENT ON THE WEBSITE THAT IS NOT IDENTIFIED ON THE WEBSITE AS COMING FROM A CREATIVE COMMONS SOURCE.

For Content of this Website that is not Creative Commons Content, users have a personal, non-transferable, non-exclusive right to access and use the Content solely for user’s personal, non-commercial use, subject to these Terms of Service.

The Website contains both Creative Commons Content, Content proprietary to Boundless as well as user-contributed Content (“Contributions”). Contributions, include, but are not limited to, educational content that you submit to us, your postings, messages, commentary, text, images, photographs, videos, audio files, or other materials whether submitted to us through our Services or otherwise. The Content may contain typographical errors, other inadvertent errors or inaccuracies. We reserve the right to make changes to document names and content, descriptions or specifications of products or services, or other information without obligation to issue any notice of such changes. We reserve the right to take down Content from our Website if we no longer offer or replace such Content with a different offering.

You may view, copy, and print Content that is available on this Website, subject to the following conditions:

I. THE CONTENT MAY BE USED SOLELY FOR INTERNAL INFORMATIONAL PURPOSES. NO PART OF THIS WEBSITE OR ITS CONTENT MAY BE REPRODUCED OR TRANSMITTED IN ANY FORM, BY ANY MEANS, ELECTRONIC OR MECHANICAL, INCLUDING PRINTING, PHOTOCOPYING AND RECORDING FOR ANY OTHER PURPOSE; PROVIDED THAT NO SUCH RESTRICTION APPLIES TO ANY CONTENT THAT IS CREATIVE COMMONS CONTENT.

II. THE CONTENT, OTHER THAN CREATIVE COMMONS CONTENT, MAY NOT BE MODIFIED. [Emphasis in original.]

Update: I see that some, but by no means all, Boundless titles are released in toto under a CC-BY-SA license. 

Anyway, to recap:

  • Boundless paid $600,000, or approximately 6% of their venture funding, in penalty fees to the three publishers. (This doesn’t include their legal fees.)
  • They had to destroy three of their products.
  • They are permanently enjoined from marketing their products in the ways that they had previously.
  • In return, they apparently won the privilege of selling knock-offs that compete against relatively competitively priced originals in a dying product category, provided that they get prior permission from the owners of said originals, and possibly with the requirement that they pay a license fee for each sale.

Look, I’m a utilitarian and an optimist when it comes to ed tech entrepreneurship. I believe that startups can genuinely contribute to goals of improving educational access and quality. But not because entrepreneurs have magical startup fairy dust and magic disruptive innovation wands. We have to be clear-eyed and honest about what these ventures do, as well as how they do it. Personally, when I read a puff piece in VentureFizz saying “Boundless might be THE pillar EdTech startup; not just in Boston, but anywhere,” it makes me want to puke. Nobody, including the investors, is going to get value from ed tech startups if we don’t apply some critical analysis of just what they are doing and what reason we have to believe that they will be successful or, for that matter, good for education.

The post Boundless Is Totally #Winning appeared first on e-Literate.

Instructure releases their third public security audit

Wed, 2014-01-08 19:35

In the fall of 2011 I made the following argument:

We need more transparency in the LMS market, and clients should have access to objective measurements of the security of a solution.  To paraphrase Michael Feldstein’s suggestions from a 2009 post:

  • There is no guarantee that any LMS is more secure just because they say they are more secure
  • Customers should ask for, and LMS vendors should supply, detailed information on how the vendor or open source community has handled security issues in practice
  • LMS providers should make public a summary of vulnerabilities, including resolution time

I would add to this call for transparency that LMS vendors and open source communities should share information from their third-party security audits and tests.  All of the vendors that I talked to have some form of third-party penetration testing and security audits; however, how does this help the customer unless this information is transparent and available?  Of course this transparency should not include details that would advertise vulnerabilities to hackers, but there should be some manner to be open and transparent on what the audits are saying. [new emphasis added]

Inspired by fall events and this call for transparency, Instructure (maker of the Canvas LMS) decided to hold a public security audit using a white hat testing company, where A) the results of the testing would be shared publicly, and B) I would act as an independent observer to document the process. The results of this testing are described in two posts at e-Literate and in a post at Instructure.

Instructure has kept up the practice and just released their third public security audit.

To be clear, we are continually performing security audits on Canvas. Occasionally, our customers even call for their own third-party audits, which we fully support. But once a year, we bring in a third party for an annual public audit, which helps us remain objective and committed to the security of your information.

This year we retained the company Secure Ideas, a network security consulting firm based in Orange Park, Florida. Their security consultants have spent years researching various exploits and vulnerabilities, building toolsets, and helping organizations secure their networks.

This year’s audit started in November 2013. Secure Ideas spent three weeks doing penetration testing and conducting a general review of Canvas’ security architecture. They presented their findings in this Final Summary Report. In short, they found 0 critical, 1 high, 1 medium, and 2 low priority vulnerabilities. Details of fixes can be found in our Security Notes Forum.

No other LMS vendor has taken up this call for public security testing to my knowledge, and I attempted to describe some of the arguments against the practice here.

While I obviously have not had the same insight into the second and third annual public audits (you can review the results in the public report), I am impressed to see that the company has kept their word.

As such, we see no reason why all LMS providers in the market shouldn’t provide open security audits on an annual basis.

I still think it would help the market in general if more LMS providers adopted this practice of public security audits – it would be useful for higher ed clients and it would be good for the providers themselves.

The post Instructure releases their third public security audit appeared first on e-Literate.

Lessons from the Boundless Copyright Infringement Suit

Tue, 2014-01-07 15:42

On December 17th, the Boundless OER-based textbook startup issued a press release describing the settlement they had reached with Pearson, Cengage, and Macmillan in the lawsuit those three companies had filed against the company. (Full disclosure: Pearson has been a client of MindWires Consulting.) Actually, a lot of the press release wasn’t really about the lawsuit, and the description of the settlement consisted of the following:

Today, we’re excited to announce that we’ve settled the lawsuit. In agreeing to a confidential settlement agreement, along with a public judgment and injunction entered by the Court, the parties have resolved the dispute. The resolution allows the parties to move forward and focus on their mutually shared goal of helping students learn. Boundless now has a clear path for building and marketing its OER-driven textbook alternatives without treading upon the Plaintiffs’ rights, and it is confident that it is in compliance and will not have further legal issues with the Plaintiff publishers. In turn, Plaintiffs have reinforced the strong protection they have in and to their copyrighted works and the related goodwill that they and their authors have established, and look forward to Boundless operating its business within the agreed upon framework.

This seemed like a strangely muted ending to a strange story. It’s hard to tell from the press release what actually happened. But having read the consent decree and injunction, I have come to two conclusions. First, Boundless lost. Second, the suit and its outcome tell us very little about the future of OER but rather more about business strategy for ed tech startups.

Boundless Copyright?

Viewed at a distance, the lawsuit by the publishers looked preposterous. Boundless was publishing textbooks that competed with the publishers’ popular titles but were built using OER content. Everybody agreed that the words in Boundless’ books did not copy the words of the publishers’ products, and yet the publishers sued for copyright infringement. How could that be? Were they claiming that they owned copyright for the table of contents of, say, a standard calculus textbook? Aren’t these books structured by widely shared learning objectives? Were they claiming to own the very idea of a calculus course?

But the truth revealed in the court document is rather more complex than that. First, Boundless wasn’t just marketing their products as competitive with those of the other publishers. They were marketing their books as, for example, “the Boundless version” of Mankiw’s Principles of Economics (published by Cengage). And they used a picture of the Mankiw book cover when advertising the “Boundless version.” The textbook publishers also alleged that Boundless “copied the selection, coordination and/or arrangement of these textbooks, including with respect to topics, sub-topics, sub-sub-topics and photos, figures, illustrations, and examples.” In other words, the Boundless products were being designed and marketed as meticulous paraphrases of entire books. The fact that they happened to use OER as raw materials was incidental, except insofar as it helped keep their product costs down. There was no talk, for example, of the remixing value that OER advocates tout. To the contrary, the whole point of the Boundless products was that they were exactly like the name-brand products they were genericizing. The company’s business strategy was never about capitalizing on the values of Openness; it was about capitalizing on the valuation of Chegg.

“It’s like Thelma and Louise, only with aliens!”

According to CrunchBase, Boundless received its seed funding in April 2012, a month after Chegg received $25 million in their F round and at a time when it was clear that the used textbook insurgent was preparing for an IPO. The VC pitch pretty much writes itself. “Imagine Chegg, but with no warehouse to maintain and no physical books to ship!” VCs tend to love this kind of pitch, for several reasons. First, in a complex industry where it’s often very difficult to recognize a good bet, a safe strategy is to copy something that is already a hit. Second, VCs tend to be deeply suspicious of any business plan that comes within 100 miles of a bureaucratic process (or a union representative). By marketing direct to students, Boundless planned to avoid having to deal with the complexities of faculty adoption (including the need for an army of sales reps). This was a pure consumer play. And production costs were low. After all, they were taking content created by other people and fitting it into a detailed structure created by still other people. To be clear, I don’t have any problem at all with re-using openly licensed content in order to lower costs for students (and for the companies that distribute that content). My point is that Boundless was never intended to be a content company, so they weren’t burdened with high content creation costs.

So, to recap: Low product creation costs, no distribution costs and, because they are marketing directly to internet-savvy and price-sensitive students while drafting behind the textbook companies that were driving the faculty adoptions, low sales and marketing costs. Plus a sales model that’s based pretty heavily on one of the few education startup success stories of the time. Boundless was perfectly engineered to attract VC money. Unfortunately for the founders, it was less perfectly engineered to withstand a copyright suit from the publishers whose books they were openly imitating.

Live to fight another day (and way)?

The net result of the suit is the following:

  • Boundless is enjoined from selling their products that are “aligned with” those of the plaintiffs and must destroy all copies of the books and the marketing materials.
  • The company is enjoined from selling “aligned” products, using the images of their competitors’ products, or describing their own products as a “version,” “copy,” or “equivalent” to the plaintiffs’ products.
  • Boundless will pay $200,000 to each of the plaintiffs.

That was the public part. There was a private settlement as well.

Boundless says they have “increased their usage and reach” to 3 million users of 21 titles (according to their press release), raised a total of $9.7 million in funding, and executed a pivot while all of this was going on. If you go to their website today, the headline on the home page reads, “Introducing Boundless for Educators.” Rather than doing an end run around faculty, they are now marketing directly to them. In fact, they look a lot like the current incarnation of FlatWorld Knowledge. With the cloud of the lawsuit removed, they can now focus on trying to drive that new strategy forward (and potentially raising more investment money). What they’ll be able to do with that start is not clear. But this is one “pivot” that was predictable, probably from the very beginning.

Like I said, I don’t think there’s much of a lesson here for the OER community, but there may very well be one for the VC community.

Here’s the full text of the settlement:

Download (PDF, 241KB)

The post Lessons from the Boundless Copyright Infringement Suit appeared first on e-Literate.

New IPEDS Data: Top 20 online US institutions by sector

Mon, 2014-01-06 18:49

Update: The original version had incorrect totals for total enrollment including non-online students. I have removed those columns until I can verify that data. I apologize for the mistake and any confusion.

The National Center for Education Statistics (NCES) and its Integrated Postsecondary Education Data System (IPEDS) provide the most official data on colleges and universities in the United States. At long last, they have started to include data fields for online education (technically distance education, the vast majority of which is online), starting with the preliminary data released for the Fall 2012 term. Despite all of the talk about data and measuring online programs, only now can we start to get official information from a central source rather than relying on surveys or institutional data.

As an example, let’s look at the top 20 online programs (in terms of total number of students taking at least one online course) for various sectors. Some notes on the data:

  • I have combined the categories ‘students exclusively taking distance education courses’ and ‘students taking some but not all courses as distance education’ into ‘total students taking online courses’.
  • IPEDS tracks data by accredited institution, which can differ for multi-campus systems. For example, the University of Phoenix puts all of its online students into its Online Campus, while DeVry, Kaplan, and Heald assign their online students a home campus. I manually added the DeVry, Kaplan, and Heald totals, but I’m sure there are other cases where the data should be combined (see the sketch after this list for how this kind of aggregation might be done).
  • I have not been able to get WordPress to show these tables in a usable format while also allowing copy / paste / sort, so for now they are images.
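For readers who want to reproduce this kind of aggregation from the raw IPEDS files, here is a minimal sketch in Python/pandas. The file name, column names, and the campus-to-system mapping are all illustrative assumptions (the actual IPEDS field names differ); it shows only the general combination steps described in the notes above, not the exact process used to build these tables.

```python
# Minimal sketch of the aggregation steps described above.
# File name, column names, and the campus->system mapping are
# illustrative assumptions, not the actual IPEDS field names.
import pandas as pd

# Load a hypothetical extract of the Fall 2012 distance education data.
df = pd.read_csv("ipeds_fall2012_distance_ed.csv")

# 1. Combine "exclusively distance education" and "some but not all
#    distance education" into a single "students taking online courses" total.
df["students_taking_online"] = (
    df["exclusively_distance_ed"] + df["some_distance_ed"]
)

# 2. Roll branch campuses up to their parent institution where IPEDS
#    reports them separately (hand-maintained, illustrative mapping).
campus_to_system = {
    "DeVry University-Illinois": "DeVry University",
    "Kaplan University-Davenport Campus": "Kaplan University",
}
df["institution_group"] = df["institution_name"].replace(campus_to_system)

# 3. Sum within each sector/institution group, then take the top 20 per sector.
totals = (
    df.groupby(["sector", "institution_group"], as_index=False)
      ["students_taking_online"].sum()
)
top20 = (
    totals.sort_values("students_taking_online", ascending=False)
          .groupby("sector")
          .head(20)
)
print(top20)
```

The branch-campus mapping has to be maintained by hand, which is exactly why there may be other systems whose totals should be combined but are not.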
Public 2-Year Institutions – Fall 2012

Public 2-year

 

Private 2-Year Institutions – Fall 2012

Private 2-year

For-Profit 2-Year Institutions – Fall 2012

For-profit 2-year

Public 4-Year or Above Institutions – Fall 2012

Public 4-year

Private 4-Year or Above Institutions – Fall 2012

Private 4-year

For-Profit 4-Year or Above Institutions – Fall 2012

For-profit 4-year

This is great data that should support much better analysis. Kudos to the Department of Education and NCES.

The post New IPEDS Data: Top 20 online US institutions by sector appeared first on e-Literate.

CCSF Accreditation Injunction: The decisions and implications

Mon, 2014-01-06 12:31

Last week, as expected, a California superior court judge ruled on whether to allow the Accrediting Commission for Community and Junior Colleges (ACCJC) to end accreditation for City College of San Francisco (CCSF) as of July 31, 2014. As reported in multiple news outlets, the judge granted an injunction preventing ACCJC from stripping CCSF’s accreditation at least until a trial on the city of San Francisco’s lawsuit, which would occur in summer 2014 at the earliest. This means that CCSF will stay open for at least one more academic term (fall 2014), and it is possible that ACCJC would have to redo its accreditation review.

What was the actual decision and what are the implications for other schools?

Lawsuits

The original issues found by ACCJC were raised in the 2006 review, leading to multiple follow-up reports and actions. By summer 2012, ACCJC issued a Show Cause ruling based on a new review – the one at the crux of the lawsuits and the injunction. The full 2012 report documented the new evaluation, finding that in order to “fully meet each ACCJC Accreditation Standard and Eligibility Requirements [sic]”, the college had to follow 14 recommendations by March 2013 to keep its accreditation. CCSF did not meet this timeline, and in July 2013 ACCJC sent a letter stating that CCSF’s accreditation would be revoked as of July 31, 2014. For full background, read this post.

Despite the seven-year buildup, CCSF finally got serious about changes in summer 2013, when its elected Board of Trustees was replaced with a “special trustee” (Robert Agrella) “with unilateral powers to try and save the school from losing accreditation in one year”.

As CCSF is the largest college in California (85,000 students before 2012) and potentially the largest college ever to lose accreditation, the issue quickly became political. Three groups filed lawsuits seeking to force ACCJC to maintain CCSF’s accreditation: the City of San Francisco’s attorney, Dennis Herrera; the American Federation of Teachers (AFT) Local 2121 together with the California Federation of Teachers (CFT); and the Save City College Coalition (which was not part of last week’s ruling). Much of the argument in these three lawsuits was based on a Department of Education notification from August 2013 that ACCJC was “out of compliance in several areas related to its sanctioning of City College”.

Both the CCSF administration and California Community College system officials accepted the ACCJC ruling and decided to work within the system, even arguing against the three lawsuits. As described in the San Francisco Chronicle:

“The ruling doesn’t affect me at all,” said Robert Agrella, the special trustee appointed by the state to replace the elected Board of Trustees last summer. “I was brought in to meet the accreditation standards, and that is exactly what we’re doing.”

Brice Harris, chancellor of the statewide community college system, agreed. In a letter to Herrera on Thursday, Harris expressed dismay that the courts had gotten involved at all.

“Court intervention is not necessary to keep City College open,” Harris wrote. “Characterizations that the cases before the court are a ‘last-ditch’ effort to ‘save’ City College are inaccurate and will do additional damage to the college’s enrollment.”

He then listed nine areas in which the college had made significant progress, including hiring a permanent chancellor, hiring a collection agency to recoup millions of dollars in student fees it never collected, and mapping out progress on each of the 357 steps needed to fully comply with accreditation standards.

In fact, CCSF has maintained a public spreadsheet detailing its efforts:

CCSF_Accreditation_-_Roadmap

 

Nevertheless the city kept up the pressure through its lawsuit, as described by the San Francisco Chronicle:

The city’s suit says the commission allowed political bias and conflicts of interest to influence not only its decision to revoke the college’s accreditation next summer, but also its entire evaluation of the college that began in March 2012.

The suit alleges that the commission unfairly stacked its evaluation team with supporters of a statewide initiative called the Student Success Task Force that sought to limit college access for thousands of students whose academic goals did not include a degree or transfer to a four-year college. The commission’s president, Barbara Beno, also wrote letters to the state in support of the initiative, which became law. At the same time, City College students and faculty members were among the most outspoken critics of the idea.

The suit also claims the evaluation team had too few faculty members and should not have included Beno’s husband, Peter Crabtree.

Ruling

What the judge ruled:

  • To prevent ACCJC from finalizing its revocation of accreditation for CCSF until a trial can be held based on the city’s lawsuit;
  • To deny the city’s request for a broader injunction blocking ACCJC accreditation rulings for all of the colleges it reviews (mostly California community colleges);
  • To deny the CFT request for injunction based on alternate legal theories; and
  • To deny two ACCJC requests to throw out the city and CFT lawsuits.

Reading the ruling itself, it becomes apparent that its basis was California’s Unfair Competition Law (UCL), which governs “unfair”, “unlawful” and “fraudulent” practices. The City Attorney claimed in a press release that:

the court recognized that Herrera’s office is likely to prevail on the merits of his case when it proceeds to trial,

yet the actual language of the ruling was that (p. 41):

In short, I conclude there is some possibility that the City Attorney will ultimately prevail on the merits, because there is some possibility that he will establish some Commission practices (i) have zero utility and so demonstrating their unfairness, or others (ii) are illegal.

The injunction is really based on the harm that would be done to CCSF by allowing revocation to proceed before the lawsuit goes to trial; it does not give significant insight into whether the lawsuit will ultimately prevail.

Why did the judge grant an injunction based on the city lawsuit but not the faculty union? He explained that this was mostly a matter of the attorney’s role, as shown in the Plain English Summary (p. 54):

Back to this case. Some of the plaintiffs (the union, teachers and students) have a problem with their case. They have probably shown enough to conclude that the Commission imposed unfair procedures, but they have not shown that those procedures led to the Commission’s adverse decision. As far as the evidence presented to the court shows, the Commission might have issued exactly the same decisions with fair procedures. The plaintiffs have argued that they can win by just showing unfair procedures, and it doesn’t matter if the Commission would have done the same thing or not. But under the UCL, it does matter, at least as far as the union, teachers, and students are concerned. They have at least to show they were harmed by the specific acts they say were unfair or illegal under the UCL. They didn’t do that. It’s not good enough to argue that the Commission’s ultimate decisions (for example, threatening to terminate accreditation) causes harm.

The situation is different with respect to the case brought by the City Attorney. As a law enforcement officer he is empowered, along with other City Attorneys and the state’s Attorney General, to enforce the UCL without showing that any particular person was harmed.

As I mentioned last week:

What is remarkable for such a significant decision is that the CCSF deficiencies are not related to academic quality, and no one (CCSF, City of San Francisco, faculty union) has argued that the actual accreditation findings are in error. We’re facing the biggest accreditation shut down in history, and the issue is whether procedures were followed in evaluating non-academic management. Go figure.

Significance Inside and Outside California

The reason I am covering this case in so much detail is that it gives insight into the external pressures on higher education institutions. The real significance of the CCSF injunction is that it opens the door to direct political action to change the accreditation processes. Yes, there have been other cases where a court granted an injunction to delay revocation of accreditation, but to my knowledge all previous cases have involved motions coming from the affected institution itself (e.g. St Paul). With CCSF we have state and city politicians who went to court and prevailed (at least in their motion) despite the school and the system accepting the decision.

Accreditation is a hot political issue, and there is now blood in the water. Politicians can prevail through direct action on accreditation, not just through indirect pressure, and not even through arcane accreditation procedures (the CCSF ruling was based on California law). As the San Francisco Bay Guardian described the political stakes in California:

The ACCJC has come under increasing fire from state education advocates, a bipartisan coalition of state legislators and U.S. Rep. Jackie Speier for its controversial advocacy to dramatically restrict the mission of California’s community colleges by focusing on degree completion to the detriment of vocational, remedial and non-credit education.  The accrediting body’s political agenda — shared by conservative advocacy organizations, for-profit colleges and student lender interests — represents a significant departure from the abiding “open access” mission repeatedly affirmed by the California legislature and pursued by San Francisco’s Community College District since it was first established.

And there is great interest in changing accreditation processes even at the federal level. Just last month, the President’s Council of Advisors on Science and Technology made a specific recommendation on the subject:

2. Encourage accrediting bodies to be flexible in response to educational innovation. College degrees in the United States are accredited primarily by regional nonprofit organizations whose members collaborate in accrediting one another.19 These organizations, on the whole, do a reasonably good job of quality assurance, but they have many standards (concerning the adequacy of physical facilities, library collections, etc.) that are irrelevant to providers of online courses and degrees. The Federal Government (and in particular, the U.S. Department of Education) should continue to encourage the regional accrediting bodies to be flexible in recognizing that many standards normally required for an accredited degree should be modified in the online arena; it should also encourage such flexibility in state oversight of education.20 If the bar for accreditation is set too high, the infant industry developing MOOC and related technology platforms may struggle to realize its full potential.

How would the Tiffin University / Ivy Bridge College case have turned out if Ohio politicians had taken an approach similar to San Francisco’s? CCSF has had seven years’ warning to deal with its issues, yet Ivy Bridge College was shut down, and Altius Education broken apart and sold off, based on just a few months’ notice from its accrediting agency, HLC. From the press release [emphasis added]:

Today, Tiffin University announced to students a directive from the Higher Learning Commission (HLC) that the school must discontinue offering associate degree programs through Ivy Bridge College as of October 20, 2013. Ivy Bridge College, a college within Tiffin University, has offered online associate degree programs to students across the U.S. since its creation in 2008. The HLC directive, which was issued on July 25, was unexpected by Tiffin University, and Ivy Bridge College is now intensely focused on ensuring that its students’ progress towards a degree won’t be interrupted by the decision despite the very short timeline.

I agree with Audrey Watters’ take last week:

As I noted in one of my year-end review posts, I predict this and other accreditation battles will dominate the headlines in 2014.

The post CCSF Accreditation Injunction: The decisions and implications appeared first on e-Literate.