
Michael Feldstein

What We Are Learning About Online Learning...Online

On ECAR data and ed tech purgatory

Fri, 2014-08-22 09:24

Recently I wrote a post about many ed tech products being stuck in pilots without large-scale adoption.

In our consulting work Michael and I often help survey institutions to discover what technologies are being used within courses, and typically the only technologies that are used by a majority of faculty members or in a majority of courses are the following:

  • AV presentation in the classroom;
  • PowerPoint usage in the classroom (typically via those classroom projectors);
  • Learning Management Systems (LMS);
  • Digital content below the level of a full textbook (through the open Internet, library, publishers, other faculty, or OER); and
  • File sharing applications. [snip]

This stuck process ends up as an ed tech purgatory – with promises and potential of the heaven of full institutional adoption with meaningful results to follow, but also with the peril of either never getting out of purgatory or outright rejection over time.

With the Chronicle’s Almanac coming out this week, there is an interesting chart that on the surface might contradict the above information, showing ~20 technologies with above 50% adoption.

Note: Data are drawn from responses by a subset of more than 500 of the nearly 800 institutions that participated in a survey conducted from June to October 2013. Reported statistics are either an estimated proportion of the population or an estimated median.
Source: Educause Center for Analysis and Research [ECAR]

The difference, however, is that ECAR (through The Chronicle) asked how many institutions have different ed tech products and our survey asked how many courses within an institution use different ed tech products.

There are plenty of technologies being piloted but few hitting the mainstream, and adoption within an institution is one of the key indicators to watch.

The post On ECAR data and ed tech purgatory appeared first on e-Literate.

OER and the Future of Knewton

Mon, 2014-08-18 11:41

Jose Ferreira, the CEO of Knewton, recently published a piece on EdSurge arguing that scaling OER cannot “break the textbook industry” because, according to him, it has low production values, no instructional design, and is not enterprise grade. Unsurprisingly, David Wiley disagrees. I also disagree, but for somewhat different reasons than David’s.

When talking about Open Educational Resources or, for that matter, open source software, it is important to distinguish between license and sustainability model, as well as distinguishing between current sustainability models and possible sustainability models. It all starts with a license. Specifically, it starts with a copyright license. Whether we are talking about Creative Commons or GPL, an open license grants copyright permission to anyone who wants it, provided that the people who want to reuse the content are willing to abide by the terms of the license. By granting blanket permission, the copyright owner of the resource chooses to give up certain (theoretical) revenue earning potential. If the resource is available for free, then why would you pay for it?

For any resource that needs to be maintained and improved over time, this raises the question of how it will be supported. In the early days of open source, projects were typically supported by individual volunteers or small groups of volunteers, which limited the kinds and size of open source software projects that could be created. This is also largely the state of OER today. Much of it is built by volunteers. Sometimes it is grant funded, but there typically is not grant money to maintain and update it. Under these circumstances, if the project is of the type that can be adequately maintained through committed volunteer efforts, then it can survive and potentially thrive. If not, it will languish and potentially die.

But open resources don’t have to be supported through volunteerism. It is possible to build revenue models that can pay for their upkeep. For example, it is possible to charge for uses of materials other than those permitted by the open license. Khan Academy releases their videos under a Creative Commons Attribution-NonCommercial-ShareAlike (CC BY-NC-SA) license. Everyday students and teachers can use them for free under normal classroom circumstances. But if a textbook publisher wants to bundle that content with copyrighted material and sell it for a fee, the license does not give them permission to do so. Khan Academy can (and, as far as I know, does) charge for commercial reuse of the content.

Another possibility is to sell services related to the content. In open source software, this is typically in the form of support and maintenance services. For education content, it might be access to testing or analytics software, or curriculum planning and implementation services. This is a non-exhaustive list. The point is that it is possible to generate revenue from open content. And revenue can pay for resources to support high production values, instructional design, and enterprise scaling, particularly when paired with grant funding and volunteer efforts. These other options don’t necessarily generate as much revenue as traditional copyright-based licensing, but that’s often a moot point. Business models based on open licenses generally get traction when the market for licensed product is beginning to commodify, meaning that companies are beginning to lose their ability to charge high prices for their copyrighted materials anyway.

That’s the revenue side. It’s also important to consider the cost side. On the one hand, the degree to which educational content needs high production values and “enterprise scaling” is arguable. Going back to Khan Academy for a moment, Sal Khan popularized the understanding that one need not have an expensive three-camera professional studio production to create educational videos that have reach and impact. That’s just one of the better-known of many examples of OER that is considered high-quality even though it doesn’t have what publishing professionals traditionally have thought of as “high production values.” On the other hand, it is important to recognize that a big portion of textbook revenue goes into sales and marketing, and for good reason. Despite multiple efforts by multiple parties to create portals through which faculty and students can find good educational resources, the adoption process in higher education remains badly broken. So far, with a few exceptions, the only good way to get widespread adoption of curricular materials still seems to be to hire an army of sales reps to knock on faculty doors. It is unclear when or how this will change.

This brings us to the hard truth of why the question of whether OER can “win” is harder than it seems. Neither the OER advocates nor the textbook publishers have a working economic model right now. The textbook publishers were very successful for many years but have grown unsustainable cost structures which they can no longer prop up through appeals to high production values and enterprise support. But the OER advocates have not yet cracked the sales and marketing nut or proven out revenue models that enable them to do what is necessary to drive adoption at scale. If everybody is losing, then nobody is winning. At least at the moment.

This is where Knewton enters the picture. As you read Jose’s perspective, it is important to keep in mind that his company has a dog in this fight. (To be fair, at the risk of stating the obvious, so does David’s.) While Knewton is making noises about releasing a product that will enable end users to create adaptive content with any materials (including, presumably, OER), their current revenues come from textbook publishers and other educational content companies. Further, adaptive capabilities such as the ones Knewton offers add to the cost of an educational content product, both directly through the fees that the company charges and indirectly through the additional effort required to design, produce, and maintain adaptive products.

To me, the most compelling argument David makes in favor of OER “winning” is that it is much easier to lower the price of educational materials than it is to increase their efficacy. So if you’re measuring the value of the product by standard deviations per dollar, then the smart thing is to aim for the denominator (while hopefully not totally ignoring the numerator). The weak link in this argument is that it works best in a relatively rational and low-friction market that limits the need for non-product-development-related expenses such as sales and marketing. In other words, it works best in the antithesis of the conditions that exist today. Knewton, on the other hand, needs there to be enough revenue for curricular materials to pay for the direct and indirect costs of their platform. This is not necessarily a bad thing for education if Knewton-enhanced products can actually raise the numerator as much as or more than OER advocates can lower the denominator. But their perspective – both in terms of how they think about the question of value in curricular materials and in terms of how they need to build a business capable of paying back $105 million in venture capital investment – tilts toward higher costs that one hopes would result in commensurately higher value.

All of this analysis assumes that in David’s ratio of standard deviations per dollar, all that matters is the ratio itself, independently of the individual numbers that make it up. But that cannot be uniformly true. Some students cannot afford educational resources above a certain price no matter how effective they are. (I would love to lower my carbon footprint by buying a Tesla. Alas….) In other cases, getting the most effective educational resources possible is most important and the extra money is not a big issue. This comes down to not only how much the students themselves can afford to pay but also how education is funded and subsidized in general. So there are complex issues in play here regarding “value.” But on the first-order question of whether OER can “break the textbook industry,” my answer is, “it depends.”

The post OER and the Future of Knewton appeared first on e-Literate.

D2L raises $85 million but growth claims defy logic

Fri, 2014-08-15 09:57

Yesterday D2L announced a second round of investment, this time raising $85 million (a mix of debt and equity) to go with their $80 million round two years ago (see EDUKWEST for a useful roundup of news and article links). While raising $165 million is an impressive feat, does this funding give us new information on the LMS market?

First, here are the claims by D2L as part of this round of financing, from EdSurge:

The deal comes on the heels of what the company calls “a year of record growth in the higher education, K-12 and corporate markets.” John Baker, founder and CEO, says the company currently serves 1,100 institutions and 15 million learners–up from 850 and 10 million, respectively, at this time last year. The company also recently opened offices in Latin America, Asia Pacific and Europe.

That’s a 29% growth in the number of institutions and a 50% growth in the number of learners in just one year. Quite impressive if accurate.
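Both percentages follow directly from the quoted figures; here is a quick sanity check (a minimal sketch, using only the numbers from the EdSurge quote):

```python
# Figures from the EdSurge quote above.
institutions_2013, institutions_2014 = 850, 1_100
learners_2013, learners_2014 = 10_000_000, 15_000_000

def growth_pct(old: float, new: float) -> float:
    """Year-over-year growth, as a percentage."""
    return (new - old) / old * 100

print(f"Institutions: {growth_pct(institutions_2013, institutions_2014):.0f}%")  # 29%
print(f"Learners:     {growth_pct(learners_2013, learners_2014):.0f}%")          # 50%
```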

Yet the company went through a significant round of layoffs in late 2013 that let go more than 7% of its workforce, and according to both LinkedIn data and company statements they have had no significant growth in number of employees over the past year. According to the EdSurge article, the company does plan to use the new money to hire more staff [emphasis added].

This time, the company says it will play it cool. “There are no planned acquisitions at this stage,” Baker tells EdSurge. “At this point, we’re primarily focused on building out our learning platform to support our clients and thousands of integration partners.” To do so, the company will grow its team of 783 full-time employees. “We are actively looking for dozens of new positions; over 60 in R&D alone,” shares Baker.

Note this slide from John Baker’s FUSION keynote one year ago:

John Baker keynote slide from FUSION conference July 2013

If you take the information above (800+ employees last year, 783 today) at face value, D2L has actually dropped in employee headcount. Does it make sense that a company could grow 50% in terms of learners without growing company employment, especially between two massive funding rounds?

Second, what about the claim of “thousands of integration partners”? D2L is claiming more than twice as many integration partners as actual clients.

The other issue is market share. It is clear that D2L is planning to grow in corporate (10% of their business according to WSJ), K-12, and international higher ed markets; however, their largest business is still US higher ed. And here they have actually shown signs of no real growth, and for community colleges even dropping market share.

For the first time in an LMS market survey that I am aware of, Desire2Learn has actually lost market share. In fact, Desire2Learn is now lower than both Moodle and Canvas for community colleges according to this survey. This is a topic worth exploring further, especially in relation to last year’s layoffs.

Edutechnica ran the numbers for US higher education in October 2013.[1]

Edutechnica data from Oct 13 for US institutions with more than 2,000 FTE

Edutechnica ran the numbers again at the end of June for institutions with 2,000 FTE and above (to allow an apples-to-apples comparison with Oct 2013), but they have not yet published the results. George did agree to share preliminary information with me, and D2L came out with 225 institutions and 2,084,089 enrollments.[2] The Edutechnica numbers lead to an increase of 3% in the number of US institutions and 2% in enrollment (number of learners) over the past 10 months. If D2L had grown its total number of learners by 50% over the past year, we would expect to see very different numbers in its largest market.

In another interview with local media outlet The Star, CEO John Baker described growth this way:

“We’re seeing very rapid growth in Europe, we’ve seen triple-digit growth in Latin America and Asia Pacific. In terms of new accounts we’re seeing great growth basically everywhere we look,” Baker said. Desire2Learn is prioritizing growth in “key hubs,” including Brazil, Mexico, the U.S. and Singapore, he said.

This raises some questions:

  • They mention growth everywhere they look, including the US. Where is this growth that is not showing up in market data?
  • What percentage of their business – in terms of revenue, customers or learner counts – comes from international markets? The company press releases mention their investments in international hubs but I can find no significant news on new accounts with huge numbers.

D2L did not respond to several requests for comment or clarification for this post.

My intention in this and previous posts is to explain what I am seeing in the market and to challenge the marketing claims; education institutions need an accurate understanding of what is happening in the LMS market. It is worth noting that not a single media outlet listed by EDUKWEST or quoted above (WSJ, Reuters, Bloomberg, re/code, EdSurge, The Star) challenged or even questioned D2L’s bold claims. It would help if more media outlets didn’t view their job as paraphrasing press releases.

  1. Edutechnica also ran an update in May 2014, but that used a different criterion of ‘more than 1,000 FTE’.
  2. By the way, think of how useful the Edutechnica data approach is compared to annual surveys, with the ability to adjust variables and update results so quickly.

The post D2L raises $85 million but growth claims defy logic appeared first on e-Literate.

Pilots: Too many ed tech innovations stuck in purgatory

Tue, 2014-08-12 13:44

Steve Kolowich wrote an article yesterday in the Chronicle that described the use of LectureTools, a student engagement and assessment application created by faculty member Perry Samson at the University of Michigan. These two paragraphs jumped out at me.

The professor has had some success getting his colleagues to try using LectureTools in large introductory courses. In the spring, the software was being used in about 40 classrooms at Michigan, he says.

Adoption elsewhere has been scattered. In 2012, Mr. Samson sold LectureTools to Echo360[1], an education-technology company, which has started marketing it to professors at other universities. The program is being used in at least one classroom at 1,100 institutions, according to Mr. Samson, who has kept his title of chief executive of LectureTools. But only 80 are using the software in 10 or more courses.

93% of LectureTools client institutions use the tool in fewer than 10 courses total, meaning that the vast majority of customers are still running pilot projects almost two years after the company was acquired by a larger ed tech vendor.
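The 93% figure follows from the numbers in the quote; a minimal sketch of the arithmetic (figures from the Chronicle article quoted above):

```python
# Figures from the Chronicle article quoted above.
total_clients = 1_100   # institutions using the tool in at least one classroom
broad_adopters = 80     # institutions using it in 10 or more courses

pilot_share = (total_clients - broad_adopters) / total_clients
print(f"Institutions still at pilot scale: {pilot_share:.0%}")  # 93%
```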

We are not running out of ideas in the ed tech market – there are plenty of new products being introduced each year. What we are not seeing, however, are ed tech innovations that go beyond a few pilots in each school. Inside Higher Ed captured this sentiment when quoting a Gallup representative after the GSV+ASU EdInnovations conference this year:

“Every one of these companies has — at least most of them — some story of a school or a classroom or a student or whatever that they’ve made some kind of impact on, either a qualitative story or some real data on learning improvement,” Busteed said. “You would think that with hundreds of millions of dollars, maybe billions now, that’s been plowed into ed-tech investments … and all the years and all the efforts of all these companies to really move the needle, we ought to see some national-level movement in those indicators.”

In our consulting work Michael and I often help survey institutions to discover what technologies are being used within courses[2], and typically the only technologies that are used by a majority of faculty members or in a majority of courses are the following:

  • AV presentation in the classroom;
  • PowerPoint usage in the classroom (typically via those classroom projectors);
  • Learning Management Systems (LMS);
  • Digital content below the level of a full textbook (through the open Internet, library, publishers, other faculty, or OER); and
  • File sharing applications.

Despite the billions of dollars invested over the past several years, the vast majority of ed tech is used in only a small percentage of courses at most campuses.[3] Most ed tech applications or devices have failed to cross the barriers into mainstream adoption within an institution. This could be due to the technology not really addressing problems that faculty or students face, a lack of awareness and support for the technology, or even faculty or student resistance to the innovation. Whatever the barrier, the situation we see far too often is a breakdown in technology helping the majority of faculty or courses.

Diffusion of Innovations – Back to the basics

Everett Rogers literally wrote the book on the spread of innovations within an organization or cultural group: Diffusion of Innovations. Rogers’ work introduced many concepts that we now seem to take for granted, such as the S-curve of adoption:

Source: The Diffusion of Innovations, 5th ed., p. 11

leading to the categorization of adopters (innovators, early adopters, early majority, late majority, laggards), and the combined technology adoption curve.

Source: The Diffusion of Innovations, 5th ed., p. 281

But Rogers did not set out to describe the diffusion of innovations as an automatic process following a pre-defined path. The real origin of his work was trying to understand why some innovations end up spreading throughout a social group while others do not, somewhat independent of whether the innovation could be thought of as a “good idea”. From the first paragraph of the 5th edition:

Getting a new idea adopted, even when it has obvious advantages, is difficult. Many innovations require a lengthy period of many years from the time when they become available to the time when they are widely adopted. Therefore, a common problem for many individuals and organizations is how to speed up the rate of diffusion of an innovation.

Rogers defined diffusion as “a special type of communication in which the messages are about a new idea” (p. 6), and he focused much of the book on the Innovation-Decision Process. This gets to the key point that availability of a new idea is not enough; rather, diffusion is more dependent on the communication and decision-process about whether and how to adopt the new idea. This process is shown below (p. 170):

Source: The Diffusion of Innovations, 5th ed., p. 170

What we are seeing in ed tech, I would argue, is that in most cases the new ideas (applications, products, services) are stuck in the Persuasion stage. There is knowledge and application amongst some early adopters in small-scale pilots, but the majority of faculty members either have no knowledge of the pilot or are not persuaded that the idea is to their advantage, and there is little support or structure to get the organization at large (i.e., the majority of faculty at a traditional institution, or perhaps the central academic technology organization) to make a considered decision. It’s important to note that in many cases the innovation should not spread to the majority, whether because it is a poor solution or because of organizational dynamics tied to how the innovation was introduced.

The Purgatory of Pilots

This stuck process ends up as an ed tech purgatory – with promises and potential of the heaven of full institutional adoption with meaningful results to follow, but also with the peril of either never getting out of purgatory or outright rejection over time.

Ed tech vendors can be too easily persuaded by simple adoption numbers such as 1,100 institutions or total number of end users (millions served), but meaningful adoption within an institution – actually affecting the majority of faculty or courses – is necessary in most cases before there can be any meaningful results beyond anecdotes or marketing stories. The reason for the extended purgatory is most often related to people issues and communications, and the ed tech market (and here I’m including vendors as well as campus support staff and faculty) has been very ineffective at dealing with real people at real institutions beyond the initial pilot audience.

Update: Added a parenthetical to the last sentence to clarify that I’m not just talking about vendors as key players in diffusion.

  1. Disclosure: Echo360 was a recent client of MindWires
  2. For privacy reasons I cannot share the actual survey results publicly.
  3. I’m not arguing against faculty prerogative in technology adoption and for a centralized, mandatory approach, but noting the disconnect.

The post Pilots: Too many ed tech innovations stuck in purgatory appeared first on e-Literate.

Federal Reserve Board backs up e-Literate in criticism of Brookings report on student debt

Sat, 2014-08-09 13:30

I have been very critical of the Brookings Institution report on student debt, particularly in my post “To see how illogical the Brookings Institution report on student loans is, just read the executive summary”.

D’oh! It turns out that real borrowers with real tax brackets paying off real loans are having real problems. The percentage at least 90 days delinquent has more than doubled in just the past decade. In fact, based on another Federal Reserve report, the problem is much bigger for the future: “44% of borrowers are not yet in repayment, and excluding those, the effective 90+ delinquency rate rises to more than 30%”.

More than 30% of borrowers who should be paying off their loans are at least 90 days delinquent? It seems someone didn’t tell them that their payment-to-income ratios (at least for their mythical average friends) are just fine and that they’re “no worse off”.

Well, now the Federal Reserve Board itself weighs in on the subject with a new survey, as described by an article in The Huffington Post. I have read the Fed report and concur with HuffPost’s analysis – it does argue against the Brookings findings.

Among the emerging risks spotlighted by the survey is the nation’s $1.3 trillion in unpaid student debt, suggesting that high levels of student debt are crimping the broader economy. Nearly half of Americans said they had to curb their spending last year in order to make payments on student loans, adding weight to the fear among federal financial regulators that the burden of student debt on households will depress economic growth for years to come.

Some 35 percent of survey respondents who are paying back student loans said they had to reduce their spending by “a little” over the past year to keep up with their student debt payments. Another 11 percent said they had to cut back their spending by “a lot.”

The Fed’s findings appear to challenge recent research by a pair of economists at the Brookings Institution, highlighted in The New York Times and cited by the White House, that argues that households with student debt are no worse off today than they were two decades ago.

The full Fed report can be found here. Much of the survey was focused on borrowers and their perceptions of how their student loans impact them, which is much more reliable than Brookings’ assumptions on how convoluted financial ratios should affect borrowers. In particular, consider this table:

Fed Table 11

Think about this situation: amongst borrowers who have completed their degrees, nearly as many think the financial benefits of a degree outweigh the costs as think the opposite (41.5% vs. 38.1%). I don’t see this as an argument against getting a degree, but rather as clear evidence that the student loan crisis is real and will have a big impact on the economy and on future students’ decision-making.

Thanks to the Federal Reserve Board for helping us out.

Update: Clarified that this is Federal Reserve Board and not NY Fed.

The post Federal Reserve Board backs up e-Literate in criticism of Brookings report on student debt appeared first on e-Literate.

Update on 2U: First full quarterly earnings and insight into model

Wed, 2014-08-06 19:09

2U, the online service provider that went public in the spring, just released its financial report for the first full quarter of operations as a public company. The company beat estimates on total revenue and also lost less money than expected. Overall, it was a strong performance (see WSJ for basic summary or actual quarterly report for more details). The basics:

  • Revenue of $24.7 million for the quarter and $51.1 million for the past six months, year-over-year increases of 32% and 35%, respectively;
  • EBITDA losses of $7.1 million for the quarter and $10.9 million for the past six months, year-over-year changes of -2% and 12%, respectively; and
  • Enrollment growth of 31–34% year-over-year.
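Working backward from the reported year-over-year changes gives a rough picture of the prior-year baseline; a back-of-envelope sketch (reported 2014 figures are from the quarterly release above; the implied 2013 figures are my own arithmetic, not reported numbers):

```python
# Invert a reported year-over-year percentage change to recover
# the implied prior-year figure.
def implied_prior(current: float, growth_pct: float) -> float:
    """current = prior * (1 + growth_pct/100), solved for prior."""
    return current / (1 + growth_pct / 100)

print(f"Implied Q2 2013 revenue: ${implied_prior(24.7, 32):.1f}m")  # ~$18.7m
print(f"Implied H1 2013 revenue: ${implied_prior(51.1, 35):.1f}m")  # ~$37.9m
```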

Per the WSJ coverage of the conference call:

“I’m very pleased with our second quarter results, and that we have both the basis and the visibility to increase all of our guidance measures for 2014,” said Chip Paucek, 2U’s Chief Executive Officer and co-founder. “We’ve reached a turning point where, even with continued high investment for growth, our losses have stopped accelerating. At the midpoint of our new guidance range, we now expect our full year 2014 adjusted EBITDA loss to improve by 17% over 2013. Further, we’ve announced a schedule that meets our stated annual goal for new program launches through 2015.”

The company went public in late March at $14 per share and is still trading in that range ($14.21 before the quarterly earnings release – it might go up tomorrow). As one of only three ed tech companies to have gone public in the US over the past five years, 2U remains worth watching both for its own news and as a bellwether of the IPO market for ed tech.


The financials provide more insight into the world of Online Service Providers (OSP, aka Online Program Management, School-as-a-Service, Online Enablers, the market with no name). On the conference call 2U’s CEO Chip Paucek reminded analysts that they typically invest (money spent minus revenue) $4–9 million per program in the early years and do not start to break even until years 3–4. 2U might be on the high side of these numbers given their focus on small class sizes at big-name schools, but this helps explain why the OSP market typically focuses on long-term contracts of 10+ years. Without such a long-term revenue-sharing contract, it would be difficult for an OSP to ever break even.

As the market matures – with more competitors and with schools developing their own experience in online programs – it will become more and more difficult for companies to maintain these commitments from schools. We have already seen signs over the past year of changes in institutional expectations.

2U, meanwhile, has positioned itself at the high-end of the market, relying on high tuitions and brand-name elite schools with small classes. The company for the most part will not even compete in a Request for Proposal process, avoiding direct competition with Embanet, Deltak, Academic Partnerships and others. Their prospects seem much stronger than the more competitive mainstream of OSP providers.

See the posts here at e-Literate for more background.

2U has changed one aspect of their strategy, as noted by Donna Murdoch on G+. At least through 2012 the company positioned itself as planning to work with one school per discipline (or vertical in their language). Pick one school for Masters of Social Work, one for MBA, etc. As described in Jan 2012:

“As we come into a new vertical, 2tor basically partners with one great school per vertical. We find one partner, one brand that is world-class. We partner with that brand over a long time period to create the market leader in that space for that discipline.”

2U now explicitly plans for multiple schools in the same vertical, as can be seen in the press release put out today:

Programs Aug 2014

Note the duplication of Social Work between USC and Simmons, Nursing between Georgetown and Simmons, and Data Science between Berkeley and SMU. And note the new approach, from page 20 of the quarterly report:

As described above, we have added, and we intend to continue to add, degree programs in a number of new academic disciplines each year, as well as to expand the delivery of existing degree programs to new clients.

View Into Model

Along with the first-quarter release (which was not based on a full quarter of operations as a public company), 2U released some interesting videos that give a better view into their pedagogical approach and platform. In this video they describe their “Bi-directional Learning Tool (BLT)”:

This image is from a page on the 2U website showing their approach, with a view of the infamous Brady Bunch layout for live classes (synchronous).

Live Courses

We’ll keep watching 2U and share significant developments as we see them.

The post Update on 2U: First full quarterly earnings and insight into model appeared first on e-Literate.

Full Disclosure

Sat, 2014-08-02 12:41

As you probably know, we run a consulting business (MindWires Consulting) and sometimes work with the companies and schools that we write about here. Consequently, we periodically remind you and update you on our conflict of interest policies. We do our best to avoid or minimize conflicts of interest where we can, but since our system isn’t perfect, we want you to understand how we handle them when they arise so that you can consider our analysis with the full context in mind. We value your trust and don’t take it for granted.

We talk a lot with each other about how to deal with conflicts of interest because we run into them a lot. On the one hand, we find that working with the vendors and schools that we write about provides us with insight that is helpful to a wide range of clients and readers. There just aren’t too many people who have the benefit of being able to see how all sides of the ed tech relationships work. But along with that perspective comes an inevitable and perpetual tension with objectivity. When we started our business together 18 months ago, we didn’t have a clear idea where these tensions would show up or how big of an issue they might turn out to be. We originally thought that our blogging was going to remain an addiction that was subsidized but somewhat disconnected from our consulting. But it turns out that more than 90% of our business comes from readers of the blog, and a significant portion of it comes out of conversations stimulated by a specific post. Now that we understand that relationship better, we’re getting a better handle on the kinds of conflict of interest that can arise and how best to mitigate them. Our particular approach in any given situation depends a lot on whether the client wants analysis or advice.


In many cases, clients want us to provide deeper, more heavily researched, and more tailored versions of the analysis that we’ve provided publicly on this blog. In this situation, there isn’t a strong direct conflict of interest between providing them with what they are asking for and writing public analysis about various aspects of their business. That said, no matter how hard we try to write objectively about an organization that is, was, or could be a client, human nature being what it is, we can’t guarantee that we will never be even subconsciously influenced in our thinking. That is why we have a policy to always disclose when we are blogging about a client. We have done this in various ways in the past. Going forward, we are standardizing on an approach in which we will insert a disclosure footnote at the end of the first sentence in the post in which the client is named. It will look like this.[1] (We are not fully satisfied that the footnote is prominent enough, so we will be investigating ways to make it a little more prominent.) We will insert these notices in all future posts on the blog, whether or not we are the authors of those posts. In cases where the company in question is not currently a client but was recently and could be again in the near future, we will note that the company “was recently a client of MindWires Consulting”.


Sometimes the client wants not only analysis but also strategic advice. Those situations can be trickier. We want to avoid cases in which we blog in praise (or condemnation) of a company for taking an action that they paid us to tell them to take. Our policy is that we don’t blog about any decisions that a company might make based on our advice. There are some theoretical situations in which we might consider making an exception to that rule, but if they ever do come up in reality, then the disclosure principle will apply. We will let you know if, when, and why we would make the exception. Aside from that currently theoretical exception, we recuse ourselves from blogging about the results of our own consulting advice. Furthermore, when potential clients ask us for advice that we think will put us into a long-term conflict of interest regarding one of our core areas of analysis, we turn down that work. Analysis takes precedence over advice.

Getting Better at This

We’re going to continue thinking about this and refining our approach as we learn more. We also have some ideas about business models that could further minimize potential conflicts in the future. We’ll share the details with you if and when we get to the point where we’re ready to move forward on them. In the meantime, we will continue to remind you of our current policy periodically so that you are in a better position to judge our analysis. And as always, we welcome your feedback.


  1. Full disclosure: Acme Ed Tech Company is a client of MindWires Consulting, the sponsor of e-Literate.

The post Full Disclosure appeared first on e-Literate.

Mobile-first learning platform EmpoweredU acquired by Qualcomm

Wed, 2014-07-30 17:49

Qualcomm, the giant $26 billion wireless technology conglomerate, acquired EmpoweredU – a mobile-first learning platform available for the education market. What does this acquisition mean?

Who is EmpoweredU?

The company was created by CEO Steve Poizner in 2011 in partnership with Creative Artists Agency, the world’s largest sports and talent agency, under the name “Encore Career Institute”. The initial work was to offer continuing ed classes targeted at Baby Boomers through the UCLA extension (these are certificate programs for $5,000 – $10,000 total tuition). In essence, this was an Online Service Provider (OSP) model similar to Embanet, Deltak, Academic Partnerships and 2U. As described by the San Francisco Chronicle in 2011:

Poizner, in an interview at the firm’s headquarters this week, said the company combines “three of California’s greatest assets” – its famed public university system, the creative know-how of its technology center, Silicon Valley, and the cutting-edge marketing savvy of Hollywood. [snip]

In addition to its employment potential for Baby Boomers, Poizner said, the collaboration could bring new revenue for cash-strapped UCLA and thousands of new students from around the nation to its online courses.

The company changed names to Empowered Careers and then eventually settled on EmpoweredU.

In the meantime they figured out that the OSP model is high risk and expensive, often requiring investments of $1 – $10 million per program by the OSP, with revenue-sharing profits occurring several years later. EmpoweredU has pivoted over the past year to become a mobile-first learning platform with content services.

The platform is built on top of the Canvas open source version offered by Instructure and started pilots at 15+ schools this spring (including specific programs at USC, UC Berkeley, U of San Francisco, etc). This may be the most significant use of open source Canvas, and it might end up competing with Canvas, at least indirectly.

As we’ll see later, EmpoweredU is also attempting to create a learning ecosystem that can combine multiple technologies.

Why is Qualcomm making an ed tech acquisition?

I interviewed Vicki Mealer (Senior Director, Business Development, Qualcomm Labs, which is the unit acquiring EmpoweredU) and Steve Poizner today. Vicki’s description of Qualcomm’s interest in ed tech is that they are all about mobile technology, and they have had a philanthropic interest in education for years (donating over $240 million cash to various institutions). Qualcomm wants to be a behind-the-scenes cheerleader, but they also need an ecosystem to form in each market. Qualcomm Labs started looking at education a year ago, trying to identify and overcome barriers for adoption of mobile technology. Some of the perceived barriers:

  • The digital divide leading to students having gaps in their connectivity (wi-fi vs. cellular);
  • Vendor lock-in and lack of modularity, causing school leaders to have painful technology replacement decisions to move into a mobile strategy; and
  • A lack of software and tools for instructors to take advantage of mobile features and be able to develop curriculum that leverages the technology – partially to have instructors catch up to where the students are.

For Qualcomm Labs, EmpoweredU can provide the modular ecosystem for education and shares their device-agnostic views. This will help them accelerate adoption of mobile in education.

Steve is becoming the SVP of a new business unit within the Labs, called Qualcomm Education. The EmpoweredU unit will combine with a separate Mobile Learning Framework initiative and broaden its focus to K-20.

Should we care?

I visited the company in May of this year and saw a very different design approach than the current generation of browser-based learning platforms that have added mobile features as after-the-fact enhancements. At this point EmpoweredU is a niche player only targeting specific academic programs that can afford an iPad one-to-one approach or similar methods to ensure that all students have tablets. Longer term they see this need broadening out to entire institutions. The technology has a full browser interface, so the company could target institution-wide opportunities should they choose.

What is meant by mobile-first in this case is that the platform was conceived and designed around the iPad, directly integrating device features such as location as well as camera and microphone input. In addition, the platform uses push notifications to alert students to assignments or due dates.

Main UI 1

One feature that I find quite important for the mobile world is automatic caching to allow offline access. The default setup syncs the previous, current, and next week’s materials to the device while connected, allowing offline work that is re-synced when back on the network.
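The windowed caching policy described here amounts to a simple date filter: while online, keep the previous, current, and next week's materials on the device. A sketch of the idea (the function names and data shapes below are invented for illustration; this is not EmpoweredU's actual API):

```python
from datetime import date, timedelta

def weeks_to_cache(today):
    # Start of the current week (Monday), plus the previous and next weeks.
    monday = today - timedelta(days=today.weekday())
    return {monday - timedelta(weeks=1), monday, monday + timedelta(weeks=1)}

def select_for_offline(materials, today):
    # materials: list of (week_start_date, item) pairs; keep only items
    # whose week falls inside the three-week caching window.
    window = weeks_to_cache(today)
    return [item for week_start, item in materials if week_start in window]

wk = date(2014, 7, 28)  # a Monday
materials = [(wk - timedelta(weeks=2), "old unit"),
             (wk, "this week"),
             (wk + timedelta(weeks=1), "next week")]
print(select_for_offline(materials, wk))  # ['this week', 'next week']
```

The appeal of a window like this is that the sync set stays small and predictable, so a student on spotty wi-fi always has the material immediately surrounding the current point in the course.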

While the platform was written originally for the iPad, they now support multiple devices and have one pilot that is web only.

In a nod to their OSP origins and content-generating experience, EmpoweredU offers “content sherpas” and a content authoring system. The idea is to support faculty and designers who are attempting to design courses and content that take advantage of the mobile platform.


They released initial analytics support in the spring.

During the interview, it became apparent that Qualcomm is interested not just in the learning platform, but in EmpoweredU’s broader plans to create an ecosystem.


I pushed them to describe who would be their competitors, either in higher ed or K-12, but they would not directly answer. They kept coming back to the ecosystem and the ability to provide a modular approach and not force rip and replace strategies. I can see this in theory but question what this means in reality.

From an initial look at the company, it will be interesting to watch to see if Qualcomm’s financial backing will allow EmpoweredU to move beyond a niche provider for select programs and attempt to directly compete in the LMS market for institutions or at least compete more broadly. It will also be interesting to see if they are successful in their entrance to the K-12 market. If so, the learning platform market will get even more interesting.

As for the full ecosystem, there are not enough details to understand how seriously to take this approach. Are schools even ready for this approach? How does this ecosystem relate to the LTI specifications that are fundamentally changing the ed tech market? I have many questions in this area that we’ll have to watch over time.

Update: Corrected University of San Francisco reference (and not UC San Francisco) per comments below.

The post Mobile-first learning platform EmpoweredU acquired by Qualcomm appeared first on e-Literate.

To see how illogical the Brookings Institution report on student loans is, just read the executive summary

Tue, 2014-07-29 17:13
il·log·i·cal /i(l)ˈläjikəl/ adjective
  1. lacking sense or clear, sound reasoning. (From Google’s definition)

There have been multiple articles both accepting the Brookings argument that “typical borrowers are no worse off now than they were a generation ago” and those calling out the flaws in the Brookings report. I have written two articles here and here criticizing the report. The problem is that much of the discussion is more complicated than it needs to be. A simple reading of the Brookings executive summary exposes just how illogical the report is.

College tuition and student debt levels have been increasing at a fast pace for at least two decades. These well-documented trends, coupled with an economy weakened by a major recession, have raised serious questions about whether the market for student debt is headed for a crisis, with many borrowers unable to repay their loans and taxpayers being forced to foot the bill.

The argument is set up – yes, tuition and debt levels are going up, but how is a crisis defined? It’s specifically about “many borrowers unable to repay their loans”. Is there a crisis? That’s not a bad setup, and it is a valid question to address.

Our analysis of more than two decades of data on the financial well-being of American households suggests that the reality of student loans may not be as dire as many commentators fear. We draw on data from the Survey of Consumer Finances (SCF) administered by the Federal Reserve Board to track how the education debt levels and incomes of young households evolved between 1989 and 2010. The SCF data are consistent with multiple other data sources, finding significant increases in average debt levels, but providing little indication of a significant contingent of borrowers with enormous debt loads.

This is an interesting source of data. Yes, the Federal Reserve Board’s Survey of Consumer Finances tracks student debt, but this data is almost four years old due to the triennial survey method. [1]

But hold on – now we’re talking about “significant contingent of borrowers with enormous debt loads”? I thought the issue was ability to repay. What does “enormous” even mean other than being a scary word?

First, we find that roughly one-quarter of the increase in student debt since 1989 can be directly attributed to Americans obtaining more education, especially graduate degrees. The average debt levels of borrowers with a graduate degree more than quadrupled, from just under $10,000 to more than $40,000. By comparison, the debt loads of those with only a bachelor’s degree increased by a smaller margin, from $6,000 to $16,000.

A fair enough point to start, noting that a quarter of debt growth comes from higher levels of education, including grad school. Average debt loads have gone up more than 2.5x for undergrads, and that certainly sounds troublesome given the report’s main point of “no worse off”. Using the ‘but others are worse off, so this is not as bad’ argument, Brookings notes that grad students had their debt go up by 4x. The argument here appears to be that 2.5 is less than 4.[2]

Second, the SCF data strongly suggest that increases in the average lifetime incomes of college-educated Americans have more than kept pace with increases in debt loads. Between 1992 and 2010, the average household with student debt saw an increase of about $7,400 in annual income and $18,000 in total debt. In other words, the increase in earnings received over the course of 2.4 years would pay for the increase in debt incurred.

Despite the positioning of the report that a small portion of borrowers skews the data and coverage, Brookings resorts to using the mythical “average household”. For that mythical entity, they certainly seem to have the magical touch to not pay any taxes and obtain zero-interest loans.[3]
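The 2.4-year claim is easy to reproduce, and just as easy to break. Here is both versions of the arithmetic in a quick sketch (the 25% effective tax rate and 6% loan interest rate below are illustrative assumptions for the sketch, not figures from the Brookings report):

```python
# Brookings' implied arithmetic: extra debt divided by extra (pre-tax) income.
extra_income = 7_400   # increase in annual household income, 1992-2010 (Brookings)
extra_debt = 18_000    # increase in total household debt over the same period

naive_years = extra_debt / extra_income
print(round(naive_years, 1))  # 2.4 -- only with untaxed income and a zero-interest loan

# Redo it with an assumed 25% effective tax rate and 6% loan interest:
after_tax_income = extra_income * (1 - 0.25)
balance, years = extra_debt, 0
while balance > 0:
    balance = balance * 1.06 - after_tax_income  # accrue a year of interest, then pay
    years += 1
print(years)  # 4 -- the payoff stretches well past 2.4 years
```

Under even modest assumptions about taxes and interest, the “increase in earnings pays for the increase in debt in 2.4 years” framing falls apart.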

Nonetheless, we’ve now changed the issue again – first it was the ability to repay, then whether the loan is “enormous”, and now how long a mythical payoff takes.

Third, the monthly payment burden faced by student loan borrowers has stayed about the same or even lessened over the past two decades. The median borrower has consistently spent three to four percent of their monthly income on student loan payments since 1992, and the mean payment-to-income ratio has fallen significantly, from 15 to 7 percent. The average repayment term for student loans increased over this period, allowing borrowers to shoulder increased debt loads without larger monthly payments.

Small issue, but we’ve now gone from average household as key unit of measurement to median borrower? Two changes from one paragraph to the other – average to median and household to borrower?

OK, now we have replaced the scary “enormous” with “borrowers struggling with high debt loads”. Although not in the executive summary, the analysis of the report seems to define these large debts as $100,000 or more. Doesn’t it matter who the borrower is? A humanities PhD graduate working as an adjunct for $25,000 a year might view $20,000 debt as enormous.

Brookings introduces a new measure, and this one does at least take into account the difference in borrowers: payment-to-income ratios of median borrowers. If I’m reading the argument correctly (this took a while, given that key measures and terms change from paragraph to paragraph), not only should there be no crisis, but the situation might actually be improving.

These data indicate that typical borrowers are no worse off now than they were a generation ago, and also suggest that the borrowers struggling with high debt loads frequently featured in media coverage may not be part of a new or growing phenomenon. The percentage of borrowers with high payment-to-income ratios has not increased over the last 20 years—if anything, it has declined.

So I was reading it correctly: “typical borrowers are no worse off” and the percentage of borrowers with high ratios has declined.[4] The only problem, however, is that if we go back to the original setup of the issue, “many borrowers unable to repay their loans”, there might be a much more direct measurement. How about actually seeing if borrowers are failing to repay their loans (aka being delinquent)?

The Brookings report does not analyze loan delinquency at all – the word “default” is mentioned only three times (once referring to home mortgages and twice referring to interest rates), and the word “delinquent” never appears. What do actual delinquency rates show us?

It turns out that we can go to the same source of data and find out. Here is the New York Fed report from late 2013:


D’oh! It turns out that real borrowers with real tax brackets paying off real loans are having real problems. The percentage at least 90 days delinquent has more than doubled in just the past decade. In fact, based on another Federal Reserve report, the problem is much bigger for the future: “44% of borrowers are not yet in repayment, and excluding those, the effective 90+ delinquency rate rises to more than 30%”.

More than 30% of borrowers who should be paying off their loans are at least 90 days delinquent? It seems someone didn’t tell them that their payment-to-income ratios (at least for their mythical average friends) are just fine and that they’re “no worse off”.
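The Fed’s “effective” rate in that quote is straightforward denominator arithmetic: drop the borrowers who are not yet required to make payments and recompute. A sketch (the 17% overall delinquency share is an illustrative value consistent with the quoted numbers, not a figure taken from either report):

```python
# Restrict the delinquency denominator to borrowers actually in repayment.
overall_90plus = 0.17    # share of ALL borrowers 90+ days delinquent (assumed here)
not_in_repayment = 0.44  # share of borrowers not yet required to pay (Fed)

effective_rate = overall_90plus / (1 - not_in_repayment)
print(round(effective_rate, 2))  # roughly 0.3 -- the "more than 30%" effective rate
```

This is exactly the kind of direct repayment measure the Brookings executive summary never engages with.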

Back to the Brookings report:

This new evidence suggests that broad-based policies aimed at all student borrowers, either past or current, are likely to be unnecessary and wasteful given the lack of evidence of widespread financial hardship. At the same time, as students take on more debt to go to college, they are taking on more risk. Consequently, policy efforts should focus on refining safety nets that mitigate risk without creating perverse incentives.

Despite the flawed analysis that changed terms, changed key measures, and failed to look at any data on delinquencies, Brookings now calls out a “lack of evidence of widespread financial hardship”. How can we take their recommendations seriously when the supporting analysis is fundamentally illogical?

At least the respectable news organizations will do basic checking of the report before parroting such flawed analysis.

The worries are exaggerated: Only 7% of young adults with student debt have $50,000 or more.

— David Leonhardt (@DLeonhardt) June 24, 2014

ICYMI=>The Student Debt Crisis Is Being Manufactured To Justify Debt Forgiveness #tcot #taxes

— Jeffrey Dorfman (@DorfmanJeffrey) July 5, 2014


  1. Also note that we’re skipping the years with the highest growth in student debt.
  2. This argument also ignores or trivializes the issue that grad students are indeed students.
  3. There is no other way to get to the 2.4 year payoff.
  4. And yet another change – from average to median to typical.

The post To see how illogical the Brookings Institution report on student loans is, just read the executive summary appeared first on e-Literate.

Early Review of Google Classroom

Mon, 2014-07-28 16:36

Meg Tufano is co-Founder of SynaptIQ+ (think tank for social era knowledge) and leader of McDermott MultiMedia Group (an education consulting group focused on Google Apps EDU). We have been checking out Google Classroom – with her as the teacher and me as the student. I include some of Meg’s bio here as it is worth noting her extensive experience designing and teaching online courses for more than a decade.

Meg posted a Google Slides review of her initial experiences using Google Classroom from a teacher’s perspective, which I am sharing below with minimal commentary. The review includes annotated slides showing the various features and Meg’s comments.

I have not done as much work to show the student view, but I will note the following:

  • The student view does not include the link to the Chrome Store that Meg finds to be too confusing.
  • The biggest challenge I’ve had so far is managing my multiple Google accounts (you have to be logged into your Google Apps for Edu account as your primary Google account to enter Classroom, which is not that intuitive for students).
  • I wonder if Google will continue to use Google tools so prominently in Classroom (primarily GDrive, YouTube, and GDocs) or if the full release will make it easier to embed non-Google tools.
  • I have previously written “Why Google Classroom won’t affect institutional LMS market … yet”, and after initial testing, nothing has changed my opinion.
  • I have one other post linking to video-based reviews of Google Classroom here.

The post Early Review of Google Classroom appeared first on e-Literate.

CCSF Update: Accreditation appeal denied, but waiting for court date

Wed, 2014-07-23 13:30

It looks like I’ll have the California trifecta for the past week, having already posted on Cal State and University of California news recently. Maybe I should find a Stanford or some other private university story.

In my last post on CCSF from January:

Last week, as expected, a California superior court judge ruled on whether to allow the Accrediting Commission for Community and Junior Colleges (ACCJC) to end accreditation for City College of San Francisco (CCSF) as of July 31, 2014. As reported in multiple news outlets, the judge granted an injunction preventing ACCJC from stripping CCSF’s accreditation at least until a court trial based on the city of San Francisco lawsuit, which would occur in the summer 2014 at the earliest. This means that CCSF will stay open for at least another academic term (fall 2014), and it is possible that ACCJC would have to redo their accreditation review.

In the meantime, ACCJC reviewed CCSF’s appeal of the accrediting decision, and ACCJC is sticking to its guns, as described in the San Francisco Chronicle:

City College of San Francisco remains out of compliance with eight accreditation standards, so the threat to revoke its accreditation stands, said the commission that set July 31 for the action that would shut the college down.

Accreditation won’t be revoked on that date, however, because a judge delayed the deadline until an October trial can determine if the Accrediting Commission for Community and Junior Colleges properly conducted its 2012 evaluation of City College.

In other words, ACCJC has not changed its determination that CCSF should lose accreditation. There are only two caveats at this point:

  • The injunction that prevents ACCJC from revoking accreditation until the October court date; and
  • A new loophole called “restoration status”.

From the SF Chronicle again:

Besides pinning its hopes on the lawsuit – which could trigger a completely new evaluation – the college has one more option, made possible in June when the U.S. Department of Education firmly explained to the reluctant commission that it had the power to extend the revocation deadline.

As a result of that intervention, the commission created a new “restoration status” for City College – and any other college that finds itself in such a precarious position – giving it two more years to improve and comply with a new range of requirements.

City College would have to apply for the new status by July 31.

But Phil, you say, I am fascinated by the accreditation review process and want more! To keep you going, here is the letter from ACCJC to CCSF rejecting the appeal. In the letter ACCJC calls out the areas where CCSF is still not in compliance:

I.B   Improving Institutional Effectiveness

II.A  Instructional Programs

II.B  Student Support Services

II.C  Library and Learning Support Services

III.B Physical Resources

III.C Technology Resources

III.D Financial Resources

IV.B Board and Administrative Organization

For historical context of how we got here, see this post.

The high-profile game of Chicken continues.

The post CCSF Update: Accreditation appeal denied, but waiting for court date appeared first on e-Literate.

University of California’s $220 million payroll project reboot

Mon, 2014-07-21 16:11

Chris Newfield has an excellent post at Remaking the University about the University of California’s budget situation and how it relates to the recent Moody’s negative outlook on higher education finances. The whole article is worth reading, but one section jumped off the page for me [emphasis added].

The sadder example of ongoing debt is the request for “external financing for the UCPath project.” UC Path was UCOP’s flagship solution to UC inefficiencies that were allegedly wasting taxpayers’ money–in other words, new enterprise software for the systemwide consolidation of payroll and human resources functions. This is boring, important back office stuff, hardly good material for a political campaign to show the state “UC means business,” but that’s what it became. Rather than funding each campus’s decades-old effort to upgrade its systems on its own, UCOP sought centralization, which predictably introduced new levels of cost, complexity, and inefficiency, since centralization is often not actually efficient.

I had heard nothing good about UC Path from people trying to implement it on campuses, and have tried to ignore it, but this week it has resurfaced as a problem at the Regental level. The project timeline has grown from 48 to 72 months, and its costs are said to be $220 million (it had spent $131 million by May 2014). Worse, the repayment schedule has mushroomed from seven to twenty years. Annual payments are to be something like $25 million. Campuses are to be taxed to pay for 2015-era systems until 2035, which is like taking out a twenty year mortgage to pay for your refrigerator, except that your fridge will be working better in 2035 than next year’s PeopleSoft product. Since the concurrent budget document notes efficiency savings of $30 million per year (top of page 4), UCOP may be spending $220 million to save a net $5 million per year over a couple of decades–and going into debt to do it. In the end, an efficiency measure has turned into a literal liability.

What the hell – a $220 million project to save money? How did this project get in this much trouble?

The UCPath project concept originated in 2009 with the project announcement coming in late 2011. The goal is to replace the Payroll Personnel System (PPS) that runs separately for each of the 11 UC locations with Oracle’s PeopleSoft payroll and HR systems. PPS is over 30 years old, and there are major risk issues with such an old system as well as a host of inefficient processes. The original project plans were based on a $170 million budget[1] with the first wave of go-live for the Office of the President and 3 campuses scheduled for early 2013. All campuses would be live on the new system by late 2014.[2]

In a presentation to the Board of Regents in January 2012:

Over the same period, cost reductions are expected to be approximately $750 million from technology efficiency gains, process standardization and consolidation of transactional activities into a UC-wide shared services center. Overall, the project has a net present value of approximately $230 million (at a nine percent discount rate) with breakeven in year 5.
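A net present value claim like this is simple to check when the year-by-year cash flows are known; the problem is that UC never published them. A minimal sketch of the calculation with an invented cash-flow series (the figures below are purely illustrative, not UC's actual numbers):

```python
def npv(rate, cashflows):
    # cashflows[0] occurs now (year 0); cashflows[t] at the end of year t.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical project (in $ millions): spend 170 up front, save 60 per year
# for ten years. These figures are invented for illustration only.
flows = [-170] + [60] * 10
print(round(npv(0.09, flows), 1))  # 215.1 at the 9% discount rate UCOP cited
```

The point is that the $230 million NPV and year-5 breakeven are only as good as the assumed savings stream, which is exactly the number that later came into question.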

Subsequent promises were made in March of 2012:

“We think this project is likely to pay for itself within five years, and UC could be accruing over $100 million in annual savings by the eighth year,” said Peter Taylor, UC’s chief financial officer. “We also expect to deliver HR and payroll services with increased efficiency, accuracy and quality.”

At the Board of Regents’ meeting last week, the project team gave the first update to the regents since January 2012 (itself a troubling sign). See this Sharestream video from 2:56:10 – 3:22:40.

By Fall 2013 the project was in trouble, and UC leadership brought in new leadership for the project: Mark Cianca as Deputy CIO and Sabu Varghese as Program Director. Their first act was to do a health check on the project, and the results were not pretty (as described in last week’s Board of Regents’ meeting).

  • The project team and implementation partner (Oracle) had treated the project as a software replacement rather than a fundamental business transformation initiative.
  • The individual campuses had not been consulted on changes in business processes, and in fact they had not even been asked to sign off on future state business processes that each campus would have to run to stay in operation.
  • The new project team had to go through more than 100 future state processes with campuses and get agreement on how to proceed.

The result, as described by UC President Janet Napolitano at last week’s meeting, was the team having to “reboot the entire project”.

Based on the reboot, the current plan is $220 million with first wave complete by February 2016 and all campuses live by mid 2017. That’s $50 million over budget and 24 months over schedule.

Deployment Schedule Jul 2014

But the planning is not complete. They are working up their “final” replan of budget and timeline, which they will present in January 2015.

Topics for Jan 2015

How solid is the current estimate? The implementation schedule is listed as the highest risk, even with the delays.

Major Risks Jul 2014

The project financing has changed so much that UC is now facing the need to use external financing over a much longer term, as described in the material for last week’s board meeting.

Therefore, this item seeks approval to refinance the UCPath loan out of CapEquip and into external financing to achieve the financing customization required. As indicated above, the original repayment plan based on the $220.5 million budget was expected to have been repaid with annual debt service of $25 million. This would have resulted in a 12-year loan term once principal was to be repaid. In January 2015, UCPath project leadership plans to present a revised project timeline, a revised project budget and a revised estimated loan repayment schedule. Project leadership will work with the campus budget officers (and campus budget department staff) to develop: (1) an appropriate campus cost allocation strategy; (2) an estimated repayment schedule that will reflect commencement of principal repayments in conjunction with the final campus deployment (estimated to be early 2017); and (3) an estimated 15-20 year loan repayment period.
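The relationship between the $220.5 million principal, the $25 million annual debt service, and the 12-year term is standard amortization arithmetic; interest-free, the loan would clear in about nine years, so the extra years come from interest. A sketch (the 5% rate is an assumed value chosen to reproduce the 12-year term, not UC's actual borrowing cost):

```python
def years_to_repay(principal, annual_payment, rate):
    # Count the annual payments needed to amortize the loan
    # (assumes the payment exceeds each year's accrued interest).
    years = 0
    while principal > 0:
        principal = principal * (1 + rate) - annual_payment
        years += 1
    return years

print(years_to_repay(220.5, 25, 0.00))  # 9 -- interest-free baseline
print(years_to_repay(220.5, 25, 0.05))  # 12 -- matches the original repayment plan
```

Stretching the same principal over the proposed 15-20 year term means either smaller annual payments or substantially more total interest paid by the campuses, which is why the financing change deserves more scrutiny than it received.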

  • The new project team seems quite credible, and for the most part they addressed the right points during the briefing. Kudos to UC for making this change in leadership.
  • This is a major project turnaround (or reboot, in Napolitano’s words), but I’m not sure that UC had communicated the significance of the project changes to system campuses (and certainly not to the media).
  • I would view the current plan of $220 million and Q1 2017 full deployment as a best-case scenario – the team told the regents that they were going to update the plan, and ERP projects almost never come in earlier than planned.
  • The actual amount is much higher than $220 million based on this footnote: “The $10 million in tenant improvements approved for the UCPath Center Riverside site as well as the $17.4 million purchase of the facility (UCPath is currently projected to use no more than 50 percent of the building) are not included in the figures above.”
  • How do you go 2.5 years between updates on what is now a quarter-billion-dollar project?
  • What about the current estimate of benefits – is it $30 million per year as Chris described or closer to $100 million per year? One big concern I have is that the information on project benefits was not updated, was not presented to the regents, and was not questioned by the regents. While I question the $25 million financing and $30 million benefits numbers, I think Chris got it exactly right by noting how UC administration is failing to ask hard questions:

Moving forward, I’m afraid that officials are going to have to get much better at admitting mistakes like UCPath, and then actually undoing them. I couldn’t listen to the recording of the UCPath conversation, but Cloudminder made it sound like a lot of restrained finger-pointing with no solution in sight. Did anyone say, “well, this seemed like a good idea at the time, but it’s not. Let’s just cancel it, figure out where we went wrong, and come up with something better”?

It is possible that continuing with the rebooted project is the right answer, but UC is not even asking the question. Failing to ask whether 15-20 year financing of a new ERP makes sense seems like a major oversight. Won’t this lock UC into an Oracle system that is already antiquated for another two decades or more? It seems stunning to me that UC is planning to commit to $220 million of external financing without asking some basic questions.

  1. One regent last week stated the original request was actually $156 million.
  2. All public projects should fear the Wayback Machine for checking old web pages.

The post University of California’s $220 million payroll project reboot appeared first on e-Literate.


Desire2Wha?

Sun, 2014-07-20 08:16

It would be deeply unfair of me to mock Blackboard for having a messy but substantive keynote presentation and not give equal time to D2L’s remarkable press release, pithily entitled “D2L Supercharges Its Integrated Learning Platform With Adaptive Learning, Robust Analytics, Game-Based Learning, Windows® 8 Mobile Capabilities, And The Newest Education Content All Delivered In The Cloud.” Here’s the first sentence:

D2L, the EdTech company that created the world’s first truly integrated learning platform (ILP), today announces it is supercharging its ILP by providing groundbreaking new features and partnerships designed to personalize education and eliminate the achievement gap.

I was going to follow that quote with a cutting remark, but really, I’m not sure that I have anything to say that would be equal to the occasion. The sentence speaks for itself.

For a variety of reasons, Phil and I did not attend D2L FUSION this year, so it’s hard to tell from afar whether there is more going on at the company than meets the eye. I’ll do my best to break down what we’re seeing in this post, but it won’t have the same level of confidence that we have in our Blackboard analysis.

Let me get to the heart of the matter first. Does it look to us like D2L has made important announcements this year? No, it does not. Other than, you know, supercharging its ILP by providing groundbreaking new features and partnerships designed to personalize education and eliminate the achievement gap. They changed their product name to “Brightspace” and shortened their company name to D2L. The latter strikes me as a particularly canny PR move. If they are going to continue writing press releases like their last one, it is probably wise to remove the temptation of the endless variety of potential “Desire2” jokes. Anyway, THE Journal probably does the best job of summarizing the announcements. For an on-the-ground account of the conference and broader observations about shifts in the company’s culture, read D’Arcy Norman’s post. I’ve been following D’Arcy since I got into blogging ten years ago and have learned to trust his judgment as a level-headed on-the-ground observer.

From a distance, a couple of things jump out at me. First, it looks to me like D2L is trying to become a kind of a content player. Having acquired the adaptive platform in Knowillage, they are combining it with the standards database that they acquired with the Achievement Standards Network. They are also making a lot of noise about enhancements to and content partnerships for their Binder product, which is essentially an eBook platform. Put all of this together, and you get something that conceptually is starting to look (very) vaguely like CogBooks. It wants to be an adaptive courseware container. If D2L pulls this off it will be significant, but I don’t see signs that they have a coherent platform yet—again, acknowledging that I wasn’t able to look at the strategy up close at FUSION this year and could easily be missing critical details.

Second, their announcement that they are incorporating IBM’s Cognos into their Insights learning analytics platform does not strike me as a good sign for Insights. As far as we have been able to tell from our sources, that product has languished since Al Essa left the company for McGraw Hill. One problem has been that their technical team was unable to deliver on the promise of the product vision. There were both data integrity and performance issues. This next bit is speculation on my part, but the fact that D2L is announcing that they plan to use the Cognos engine suggests to me that the company has thus far failed to solve those problems and now is going to a third party to solve them. That’s not necessarily a bad strategy, but it reinforces our impression that they’ve lost another year on a product that they hyped to the heavens and raises questions about the quality of their technical leadership.

The post Desire2Wha? appeared first on e-Literate.

It’s The End of Cal State Online As We Know It . . .

Sat, 2014-07-19 08:48

In a letter to campus leaders last month, the Cal State University system office announced that Cal State Online will no longer operate as originally conceived. Emphasis added below.

As the CSU continues to expand its online education strategies, Cal State Online will evolve as a critical component. An early Cal State Online goal will continue: to increase the quality and quantity of fully online education offerings to existing and prospective CSU students, resulting in successful completion of courses and graduation.

The re-visioning of Cal State Online was recommended by the Council of Presidents and approved by the chancellor. This will include a shift to a communication, consultation and services’ strategy for fully online campus degree programs, credentials, certificates and courses supported by opt-in shared services. Cal State Online’s shared services will be designed, delivered and managed to:

1. Make it easy for prospective and existing students to discover, decide, enroll and successfully complete their CSU online education opportunities.

2. Make it more cost-effective for CSU campuses to develop, deliver and sustain their high- quality fully online degree, credential and certificate programs and courses.

Background in a nutshell

In early 2010 a subset of the Cal State presidents – the Technology Steering Committee (TSC) – came up with a plan to aggressively push online education across the system. In fall 2011 the group commissioned a consultant’s set of reports to help them pick an operating model, with the reports delivered in February 2012. This study led to the creation of CSU Online, conceived as a separate 501(c)(3) non-profit group[1] run by the system, with the plan to use a for-profit Online Service Provider (OSP).[2] Early on they realized that Colorado State University was already using the CSU Online name, and the initiative was renamed Cal State Online. The idea was to offer fully online programs from individual campuses in a one-stop shop. Based on an RFP process, in August 2012 Cal State Online selected Pearson as their OSP partner.

Some media coverage of the initiative:

The March IHE article quoted official Cal State documents to describe the initiative.

“The goal of Cal State Online is to create a standardized, centralized, comprehensive business, marketing and outreach support structure for all aspects of online program delivery for the Cal State University System,” says the draft RFP. In the open letter, the executive director offers assurances that “participation is optional” for each of the system’s nearly two dozen campuses, “all programs participating in Cal State Online are subject to the same approval processes as an on-campus program,” and “online courses will meet or exceed the quality standards of CSU face-to-face courses.”

What has changed?

This change is significant and recent, meaning that Cal State likely does not yet have full plans for what will happen next. For now:

  • Cal State Online will no longer be a separate operating entity, and the remnant, or “re-visioned” services will be run by the existing Academic Technology Services department within the Chancellor’s Office.

The team re-visioning Cal State Online will be led by Gerry Hanley (Assistant Vice Chancellor for Academic Technology Services) along with Sheila Thomas (State University Dean, Extended and Continuing Education).

  • Pearson is no longer the OSP; in fact, they had already changed their role many months ago[3] to remove the on-site team and become more of a platform provider for the LearningStudio (aka eCollege) LMS and supporting services.
  • Cal State is no longer attempting to provide a centralized, comprehensive support structure “for all aspects of online program delivery” but instead will centrally provide select services through the individual campuses.
  • It is clear that Cal State is positioning this decision to show as much continuity as possible. They will continue to provide some of the services started under Cal State Online and will continue to support the programs that have already been offered through the group.

Some services will continue and CSU may keep the name, but it’s the end of Cal State Online as we know it.

I am working on a longer post to explain what happened, including (hopefully) some interviews for supporting information . . . stay tuned.

Update: Changed description of Pearson change and added footnote.

  1. I have not independently verified that the organization truly was set up as a 501(c)(3).
  2. Pearson had a team in place at Cal State providing LMS, implementation and integration services, enrollment management & marketing, course design support, analytics and reporting, learning object repository, help desk and technical support, training and faculty support.
  3. I believe this occurred Feb 2014 but am not sure.

The post It’s The End of Cal State Online As We Know It . . . appeared first on e-Literate.

Blackboard’s Big News that Nobody Noticed

Fri, 2014-07-18 16:57

This week was both D2L’s FUSION conference and Blackboard’s BbWorld. The conventional wisdom going around is that there was no big news out of either conference. In Blackboard’s case, that’s just not true. In fact, there was an astonishing amount of very significant news. It’s just that Blackboard didn’t do a very good job of explaining it to people. And that, by itself, is also news.

The big corporate keynote had to be one of the strangest I’ve ever seen. CEO Jay Bhatt ran through a whole long list of accomplishments for the year, but he only gave each one a few seconds as he rattled through the checklist. He mentioned that the company has a new mission statement but didn’t bother to explain it. It took nearly an hour of mostly talking about big macro trends in education and generalities about the categories of goals that the company has set before he finally got around to new product announcements. And then commenced what I can only describe as a carpet bombing run of announcements—a series of explosions that were over by the time you realized that they had started, leaving you to wonder what the heck had just happened. Vice President of User Experience Stephanie Weeks gave a 10-minute talk that was mostly platitudes and generalities about goals for students while some truly significant UX work that her team had done played on the video screen in the background, largely unexplained. There was something mentioned about cloud. Collaborate without a Java plugin! A new mobile app. Wait, another new mobile app, but something about jobs. Wait! Go back to the last slide! I think that was…. Is it over already? It seemed like simultaneously the longest and shortest keynote ever.

Phil and I had a chance to talk to Jay about it later in the day and asked him (politely) what he was thinking. He said, “I don’t view BbWorld as a selling conference. At all.”

Wait. What? This is the Blackboard conference, right?

Apparently it was. This executive team is nothing if not earnest about wanting to talk about the real issues in education. In fact, they’re so earnest about it that they’d rather talk about that than sell you their product. As a result, what was announced in Vegas stayed in Vegas. They made a serious mistake with their keynote plan. But as far as serious mistakes go, it was kind of awesome. And revealing. In and of itself, it is a strong indicator that, having begun a major cultural shift under Ray Henderson, the Blackboard of today under Jay Bhatt is a very different beast than the Blackboard of five or six years ago. Many of your assumptions about what the company is and what you can expect from them probably aren’t safe ones to make anymore.

Anyway, it’s not surprising that people observing the conference from afar (and even from anear) missed the announcements. So what were they?

Major UX Overhaul

In the past, a “major UX overhaul” for Blackboard typically meant “we moved around some stuff in the admin panel and put on a skin that looks 5 years out of date rather than 15.” Not this time. The new UX is very different. It takes a lot of design cues from iOS (and, to a certain degree, from Windows Mobile). Forget about the 15 different submenus. They’re moving everything to a single-page model with contextual overlays that fly in when you need them. Workflows have been greatly simplified, and many of them rethought. As I sat in on a demo later in the day, I’m pretty sure that the woman in the row in front of me started crying when she saw how much easier it is to import content from an old course.

To be fair, this isn’t shipping code. “Oh, Michael,” you’re thinking about now, “How can you be such a sucker as to fall for the old vaporware bait and switch?” Well, Phil and I spent some time in their UX lab. We were given access to what was clearly a live system (as was anyone else who came to the UX lab). The UX guy managing the lab gave us a script and warned us that this is still a system in development so if we wanted to see what is actually working today we should stick to the script. But of course, we didn’t. The workflows covered by the script were significant, and a lot that wasn’t on the script was also actually already working. This is real, folks. It may not be done yet, but it’s credible. And if the alpha we saw was any indication, it’s not crazy to imagine that Blackboard could raise the bar on LMS UX design by the time that they release. I kid you not.

Underneath all of this, some serious technical work has been done. Blackboard UX is now 100% separated from the business logic, using Node.js to deliver it and putting presentation code in the browser. Also, the new UX is fully responsive. It dynamically adjusts to the size of the browser window (and device).

Even more impressive was the overhaul of Blackboard Collaborate. The Java plugin is gone.[1] It’s been replaced by a simple—dare I say elegant?—WebRTC-based UX. We saw a live demo of it. If Google had designed Hangouts specifically for education, they probably would have built something like what Blackboard is showing off. And it works. We saw it in action.

The UX overhaul would be a pretty significant development all by itself. But it wasn’t all by itself.

Blackboard Learn Is Going to the Cloud

Wait. What?

Phil and I are still trying to nail down some of the details on this one, particularly since the term “cloud” is used particularly loosely in ed tech. For example, we don’t consider D2L’s virtualization to be a cloud implementation. But from what we can tell so far, it looks like a true elastic, single-instance multi-tenant implementation on top of Amazon Web Services. It’s kind of incredible. And by “kind of incredible,” I mean I have a hard time believing it. Re-engineering a legacy platform to a cloud architecture takes some serious technical mojo, not to mention a lot of pain. If it is true, then the Blackboard technical team has to have been working on this for a long time, laying the groundwork long before Jay and his team arrived. But who cares? If they are able to deliver a true cloud solution while still maintaining managed hosting and self-hosted options, that will be a major technical accomplishment and a significant differentiator.

This seems like the real deal as far as we can tell, but it definitely merits some more investigation and validation. We’ll let you know more as we learn it.

Bundled Products

This one may sound like a trivial improvement (unless you’ve ever actually dealt with Blackboard’s sales force) and a trivial one to implement (unless you’ve ever worked in a big software company with lots of business units), but Blackboard has ended the practice of separately licensing 57 different products, each with its own sales rep and price sheet. In some cases—like xpLOR and myEDU—they’re merging the functionality into the core product. In others, they’re creating tiers of service.

Here’s how their website currently describes the tiers:

  • Learning Core: Bb Learn. (But remember, they’re merging previously separate offerings into it.)
  • Learning Essentials: Everything in Core plus Collaborate.
  • Learning Insight: Everything in Essentials plus Analytics for Learn.
  • Learning Insight & Student Retention: Everything in Insight plus “retention services.” I didn’t catch this at the conference, but if it’s what it sounds like then the company is beginning to move away from differentiating between products and services and toward integrated solutions.

This should deliver more value to customers with less hassle.

Other Stuff

Those were the big announcements, but there was a lot of other stuff that floated by. It seems like they’re doing significant work on their mobile app, separate from the responsive UX work. I didn’t get a chance to even see what that is about. They’re working on a content store in partnership with MBS Books that could be more significant than it looks at a glance. There was some sort of jobs or career mobile app that whizzed by in the keynote. And who knows what else.

When I take a step back and look at this as a whole, a few thoughts run through my head. First comes, “Yeah, they had to do most of this in order to compete with Instructure. The holes they are filling are fairly clear.” Next comes, “I really didn’t believe they could pull some of this off at all, never mind as quickly and well as they seem to be doing it. Time will tell but…wow.” Then comes, “How the hell did they manage to get through a keynote with all of this in it and not blow people out of their chairs?” And finally, “Who would have thought in a million years that the LMS space could become interesting again?”

But there you have it. This is just a news post; the implications for Blackboard and the market are many and significant. Phil and I will have more to say about it in the days and weeks ahead. For now, the take-home message can be summed up thusly:

Game on.

  1. Many Bothans died to bring you this enhancement.

The post Blackboard’s Big News that Nobody Noticed appeared first on e-Literate.

NPR and Missed (Course) Signals

Mon, 2014-07-14 12:45

Anya Kamenetz has a piece up on NPR about learning analytics, highlighting Purdue’s Course Signals as its centerpiece. She does a good job of introducing the topic to a general audience and raising some relevant ethical questions. But she missed one of the biggest ethical questions surrounding Purdue’s product—namely, that some of its research claims are likely false. In particular, she repeats the following claim:

Course Signals…has been shown to increase the number of students earning A’s and B’s and lower the number of D’s and F’s, and it significantly raises the chances that students will stick with college for an additional year, from 83% to 97%. [Emphasis added.]

Based on the work of Mike Caulfield and Al Essa summarized in the link above, it looks like that latter claim is probably the result of selection bias rather than a real finding. So who is at fault for this questionable claim being repeated without challenge in a popular venue many months after it has been convincingly challenged?

For starters, Purdue is. They never responded to the criticism, despite confirmation that they are aware of it—for one thing, they got contacted by us and by Inside Higher Ed—and despite the fact that they apparently continue to make money off the sales of the product through a licensing deal with Ellucian. And the uncorrected paper is still available on their web site. This is unconscionable.

Anya clearly bears some responsibility too. Although it’s easy to assume from the way the article is written that the dubious claim was repeated to her in an interview by Purdue researcher Matt Pistilli, she confirmed for me via email that she took the claim from the previously published research paper and did not discuss it with Pistilli. Given that this is her central example of the potential of learning analytics, she should have interrogated it a little more, particularly since she had Matt on the phone. Mike Caulfield also commented to me that any claim of such a dramatic increase in year-to-year retention should automatically be subject to additional scrutiny.

I have to put some blame on the higher ed press as well. Inside Higher Ed covered the story (and, through them, Times Higher Education). In fact, Carl Straumsheim actually advanced the story a bit by putting the question to researcher Matt Pistilli (who gave a non-answer). The Chronicle of Higher Education did not cover it, despite having run a puff piece on Purdue’s claims the same day that Mike Caulfield wrote his original piece challenging the results. It is very clear to Phil and me that we are read by the Chronicle staff, in part because they periodically publish stories that have been obviously influenced by our earlier coverage. Sometimes without attribution. I don’t care that much about the credit, but if they thought Purdue’s claims were newsworthy enough to cover in the first place, then they should have done their own reporting on the fact that those claims have been called into question. If they had been more aggressive in their coverage, then mainstream press reporters who find Course Signals would be more likely to find the other side(s) of the story as well. Outside of IHE, I’m having trouble finding any coverage, never mind any original reporting, in the higher ed or ed tech press.

I have a lot of respect for news reporters in general, and I think that most people grossly underestimate how hard the job is. I think highly of Anya as a professional. I like the reporters I interact with most at the Chronicle as well. Nor will I pretend that we are perfect here at e-Literate. We miss important angles and get our fair share of details wrong. For example, I doubt that I would have caught the flaw in Purdue’s research if Mike hadn’t brought it to my attention. But collectively, we have to do a better job of providing critical coverage of topics like learning analytics, particularly at a time when so much money is being spent and our entire educational system is starting to be remade on the premise that this stuff will work. And there is absolutely no excuse whatsoever for a research university to not take responsibility for its published research on a topic that is so critical to the future of universities.

The post NPR and Missed (Course) Signals appeared first on e-Literate.

A response to Bloomberg article on UCLA student fees

Sat, 2014-07-12 13:56

Megan McArdle has an article that was published in Bloomberg this week about the growth of student fees. The setup of the article was based on a new “$4 student fee to pay for better concerts”.

To solve this problem, UCLA is introducing a $4 student fee to pay for better concerts. That illuminates a budgeting issue in higher education — and indeed among human beings more generally.

That $4 is not a large fee. Even the poorest student can probably afford it. On the other hand, collectively, UCLA’s student fees are significant: more than $3,500, or about a quarter of the mandatory cost of attending UCLA for a year.

Those fees are made up of many items, each trivial individually. Only collectively do they become a major source of costs for students and their families and potentially a barrier to college access for students who don’t have an extra $3,500 lying around.

I’m sympathetic to the argument that college often costs too much and that institutions can play revenue games to avoid the appearance of raising tuition. I also think that Megan is one of the better national journalists on the topic of higher education finances.


However, this article is somewhat sloppy in a way that harms the overall message. I would like to clarify the student fees data to help show the broader point.

Let’s look at the actual data from UCLA’s web site. I assume that Megan is basing this analysis on in-state undergraduate full-time students. The data is listed per quarter, and UCLA has three quarters in a full academic year. I have summarized the data below, summing the three quarters into yearly totals, and you can:

  • Hover over each measure to see the fee description from UCLA’s fee description page;
  • Click on each category that I added to see the component fees;
  • Sort either column; and
  • Choose which rows to keep or exclude.
  • NOTE: A static image is shown above if you cannot see the interactive graphics.

UCLA Fees for In-State Undergrads (Total $3,749.97)

Some Clarifications Needed
  • The total of non-tuition fees is $3,750 per year, not $3,500; however, Megan is right that this represents “about a quarter of the mandatory cost of attending UCLA for a year” ($3,750 out of $14,970).
  • The largest single fee is the UC health insurance fee (UC-SHIP), which is more than half of the total non-tuition fees. This fact (noted by Michael Berman on Twitter) should have been pointed out, given the significant percentage of the total.
  • With the UC-SHIP at $1,938 and the student services fee at $972, I hardly consider these “trivial individually”.
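For what it’s worth, the corrected figures hold together; a quick sketch using only the numbers quoted in this post:

```python
# All figures come from the post above (UCLA in-state undergrad, per year).
total_fees = 3749.97        # total non-tuition fees
mandatory_cost = 14970.00   # total mandatory cost of attendance
uc_ship = 1938.00           # UC health insurance fee (UC-SHIP)

fee_share = total_fees / mandatory_cost   # roughly 0.25, i.e. "about a quarter"
ship_share = uc_ship / total_fees         # more than half of all non-tuition fees
```

So Megan’s “about a quarter” characterization survives the corrected total, while the UC-SHIP line item alone accounts for over half of the fees.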
Broader Point on Budgeting

The article’s broader point is that using extraneous fees to create additional revenue leads to a flawed budgeting process.

As I’ve written before, this is a common phenomenon that you see among people who have gotten themselves into financial trouble — or, for that matter, people who are doing OK but complain that they don’t know where the money goes and can’t save for the big-ticket items they want. They consider each purchase individually, rather than in the context of a global budget, which means that they don’t make trade-offs. Instead of asking themselves “Is this what I want to spend my limited funds on, or would I rather have something else?” they ask “Can I afford this purchase on my income?” And the answer is often “Yes, I can.” The problem is that you can’t afford that purchase and the other 15 things that you can also, one by one, afford to buy on your income. This is how individual financial disasters occur, and it is also one way that college tuition is becoming a financial disaster for many families.

This point is very important. Look at the Wooden Center fee, described here (or by hovering over chart):

Covers repayment of the construction bond plus the ongoing maintenance and utilities costs for the John Wooden Recreation Center. It was approved by student referendum. The fee is increased periodically based on the Consumer Price Index.

To take Megan’s point, this fee “was approved by student referendum”, which means that UCLA has moved budgeting responsibility away from a holistic approach to saying “the students voted on it”. This makes no financial sense, nor does it make sense to shift bond repayment and maintenance and utilities cost onto student fees.

While this article had some sloppy reporting in terms of accurately describing the student fees, it does highlight an important aspect of the budget problems in higher education and how the default method is to shift the costs to students.

The post A response to Bloomberg article on UCLA student fees appeared first on e-Literate.

Blackboard’s Perceptis Acquisition Offers Clues into Company’s Strategy

Wed, 2014-07-09 08:47

Yesterday Blackboard announced that they acquired Perceptis, a provider of help desk and financial aid support services for colleges and universities. In and of itself, this is not a huge acquisition. Perceptis has 33 clients, offers services that Blackboard was already offering, and has no substantial new technology. But as we approach BbWorld next week, the move provides some early hints into the strategic direction that the company may highlight at the conference.

I had the opportunity to talk with Blackboard’s Vice President of Education Services Katie Blot about the move.

There are a couple of different ways to frame help desk services, so I was curious to hear how Blackboard would position it. Katie talked about being “very, very focused on end-to-end learner-centric support” and “supporting learner pathways” for “non-traditional and post-traditional students.” And in the acquisition announcement, Jay Bhatt is quoted as saying,

By combining the Blackboard and Perceptis teams, we will enhance a service model that the industry needs: one that fully supports students from the first moment they are interested in a school to the day they graduate. This is yet another way Blackboard is reimagining education.

While “reimagining education” may be laying it on a little thick in the context of acquiring a help desk service, the reframing of the company mission as supporting students from orientation to graduation is a significant change. I always got the feeling that former CEO Michael Chasen’s role model was Oracle’s Larry Ellison. If you need a piece of software to help you do something important, Ellison will get it and sell it to you. It doesn’t matter too much what kind of software it is, as long as you’re the kind of customer he wants to have. There’s nothing wrong with that per se, but it leads to particular types of business decisions. A friend who used to work at Georgetown University liked to joke that Blackboard probably had some useful insights about his bowel health because he had to swipe his Blackboard-vended key card every time he used the faculty bathroom. Barring an uncommonly expansive definition of what it means to “fully support students,” this is just not the kind of business that the company Jay Bhatt is describing would be likely to get into (although, for the record, Blackboard currently still owns this business).

Interestingly, this is a point that Katie brought up unprompted. She took great pains to emphasize how they are building a “new Blackboard” (which, by implication, is importantly not like the old Blackboard). In the old days, she said, the company made acquisition decisions based primarily on the financial case. “We bought a lot of companies that were not closely aligned with the core.” I would put it slightly differently. I would say that Blackboard did not have the same core that the company leadership is articulating today.

And what is that core? What is the company trying to become? We will likely know more after next week, but by doubling down on support services and positioning them the way they are, the company is trying to move up the value chain, away from being perceived as a software vendor and toward being perceived as a student success-related services vendor. According to Katie, their services business has tripled in the three years since Chasen got Blackboard into the call center support business by acquiring Presidium. The Perceptis move can be seen as doubling down. This puts them in an increasingly crowded space, particularly in online education, with competitors that range from Pearson to 2U to Hobsons. When I asked Katie how the company intends to differentiate itself, she cited two factors. First, they take an a la carte approach and are avoiding moves that they believe would either put them in direct competition with their customers or cannibalize the schools' core competencies. They are staying out of certain services businesses—she didn't specify, but I imagine that curriculum development is a good example of what she means—while in others she said they take a "teach to fish" approach, moving toward the consulting end of the spectrum rather than the outsourcing end. This is not terribly different from the marketing message that Instructure deployed against the MOOC providers when announcing the Canvas Network, and it may be effective against the textbook publishers and more full-service outsourcing companies.

The second differentiator was interesting too. While Katie emphasized the a la carte message and specifically mentioned that Perceptis was attractive to the company because it served non-Blackboard customers and reinforced the message that they want to provide services to schools using other LMSs, she also said that Blackboard's knowledge of the learning technology stack and, more importantly, the learning data, gives them an edge in helping their customers make data-driven decisions. There aren't many service providers who can make that claim right now. To be honest, I'm not sure that Blackboard can either, yet. As I have written previously, the heritage of Blackboard's analytics product is not really in learning analytics, and they are still in the early stages of moving into this space. That said, Phil and I are impressed with their decision to hire John Whitmer as Director for Platform Analytics and Educational Research. As Phil has observed, Instructure has gotten strong benefits from hiring the academic Jared Stein. Likewise, Al Essa led some pretty major conceptual work on analytics at Desire2Learn before they lost him to McGraw Hill. John is a solid researcher in the field of learning analytics and just the sort of guy that Blackboard needs to help them figure out how to deliver on their claims that they understand how educational data can provide insights enabling better student support.

Obviously, I’m reading tea leaves here. Speaking of data, Phil and I will both be at BbWorld next week and should have more concrete moves by Blackboard to analyze.

The post Blackboard’s Perceptis Acquisition Offers Clues into Company’s Strategy appeared first on e-Literate.

Roundtable Discussion on Integrative Education July 9th at 2pm EDT

Tue, 2014-07-08 16:19

Tomorrow, July 9th at 2:00pm EDT, I’ll join a great cast to discuss Reinvent the University for the Whole Person: Principles Driving Policy, and I hope many of you can watch. The other participants:

  • Randy Bass (Vice Provost for Education and Professor of English at Georgetown University)
  • Martha Kanter (Distinguished Visiting Professor of Higher Education at New York University & former U.S. Under Secretary of Education)
  • Robert Groves (Provost at Georgetown University)
  • Jeffrey Selingo (Author of College (Un)Bound: The Future of Higher Education and What It Means for Students)
  • Tia Brown McNair (Senior Director for Student Success at the Association of American Colleges & Universities)
  • Anthony Carnevale (Director of the Center on Education & the Workforce at Georgetown University)


The core idea for the series:

American higher education rarely has been more in the national spotlight than with the arrival of new digital technologies and new for-profit education businesses, among other big trends. In this rapidly changing landscape, the old model looks increasingly outmoded and many efforts are underway to begin to transform the system for the 21st century. Most efforts are focusing on making the system more efficient and producing a larger number of graduates to fit in a changing economy.

Very little thought is going into other valuable contributions that universities have provided in the past. Universities also produce future citizens, problem–solvers, leaders – not to mention knowledge that can drive innovation and economic growth. How do we ensure that these other critical outcomes will continue in the future? How can we build on new insights about learning and invent new ways to deliver and measure education that matters for a lifetime? How can we use new tools and approaches that are only available now to carry out the mission of educating for the whole person even better than before?

For the roundtable tomorrow, we’ll discuss:

What are the opportunities for shaping public policy for integrative education in a world that also needs more access, lower costs and workplace preparation? How do we ensure this focus is not elitist?

You can access the discussion on the Reinventors website here.

You can access the discussion within Google+ here.

The post Roundtable Discussion on Integrative Education July 9th at 2pm EDT appeared first on e-Literate.

Instructure’s CTO Joel Dehlin Abruptly Resigns

Mon, 2014-07-07 16:32

One week after the conclusion of Instructure's Users' Conference, CTO Joel Dehlin abruptly resigned from the company for a new job. Joel took the CTO job with Instructure in summer 2013, around the same time that co-founder Devlin Daley departed. Joel's resignation comes as a surprise, especially given his prominent placement as the technology lead for the Canvas LMS. As recently as InstructureCon on June 27th, Joel gave the product update presentation.

I noticed the change while viewing the new Instructure leadership page (nice page design, btw), where Joel was no longer listed. I contacted Devin Knighton, Director of Public Relations for Instructure, who confirmed that the resignation was unexpected and was Joel's decision. I am not sure how significant this resignation is for the company. What we do know is that Joel has not been replaced as CTO, but that Jared Stein (VP of Research and Education), Trey Bean (VP of Product), David Burggraaf (VP of Engineering), and Zach Willy (Chief Architect) will cover the CTO responsibilities in the near term. I would have more details, but Devin is on family vacation, and I did not want to push for him to send me an official email.

We’ll keep you posted if we find out more information (assuming it is newsworthy).

Update: Corrected second paragraph on VP of Product and VP of Engineering per Devin Knighton comment below.

The post Instructure’s CTO Joel Dehlin Abruptly Resigns appeared first on e-Literate.