
Michael Feldstein

What We Are Learning About Online Learning...Online

Federal Reserve Board backs up e-Literate in criticism of Brookings report on student debt

Sat, 2014-08-09 13:30

I have been very critical of the Brookings Institution report on student debt, particularly in my post “To see how illogical the Brookings Institution report on student loans is, just read the executive summary”.

D’oh! It turns out that real borrowers with real tax brackets paying off real loans are having real problems. The percentage of borrowers at least 90 days delinquent has more than doubled in just the past decade. In fact, based on another Federal Reserve report, the problem is much bigger for the future: “44% of borrowers are not yet in repayment, and excluding those, the effective 90+ delinquency rate rises to more than 30%”.

More than 30% of borrowers who should be paying off their loans are at least 90 days delinquent? It seems someone didn’t tell them that their payment-to-income ratios (at least for their mythical average friends) are just fine and that they’re “no worse off”.

Well, now the Federal Reserve Board itself weighs in on the subject with a new survey, at least as described by an article in The Huffington Post. I have read the Fed report and concur with the HuffPost analysis: it does argue against the Brookings findings.

Among the emerging risks spotlighted by the survey is the nation’s $1.3 trillion in unpaid student debt, suggesting that high levels of student debt are crimping the broader economy. Nearly half of Americans said they had to curb their spending last year in order to make payments on student loans, adding weight to the fear among federal financial regulators that the burden of student debt on households will depress economic growth for years to come.

Some 35 percent of survey respondents who are paying back student loans said they had to reduce their spending by “a little” over the past year to keep up with their student debt payments. Another 11 percent said they had to cut back their spending by “a lot.”

The Fed’s findings appear to challenge recent research by a pair of economists at the Brookings Institution, highlighted in The New York Times and cited by the White House, that argues that households with student debt are no worse off today than they were two decades ago.

The full Fed report can be found here. Much of the survey was focused on borrowers and their perceptions of how their student loans impact them, which is much more reliable than Brookings’ assumptions on how convoluted financial ratios should affect borrowers. In particular, consider this table:

Fed Table 11

Think about this situation: among borrowers who have completed their degrees, almost as many think the costs of a degree outweigh the financial benefits as think the opposite (38.1% versus 41.5%). I don’t see this as an argument against getting a degree, but rather as clear evidence that the student loan crisis is real and will have a big impact on the economy and on future student decision-making.

Thanks to the Federal Reserve Board for helping us out.

Update: Clarified that this is Federal Reserve Board and not NY Fed.

The post Federal Reserve Board backs up e-Literate in criticism of Brookings report on student debt appeared first on e-Literate.

Update on 2U: First full quarterly earnings and insight into model

Wed, 2014-08-06 19:09

2U, the online service provider that went public in the spring, just released its financial report for the first full quarter of operations as a public company. The company beat estimates on total revenue and also lost less money than expected. Overall, it was a strong performance (see WSJ for basic summary or actual quarterly report for more details). The basics:

  • Revenue of $24.7 million for the quarter and $51.1 million for the past six months, representing year-over-year increases of 32% and 35%;
  • EBITDA losses of $7.1 million for the quarter and $10.9 million for the past six months, representing year-over-year changes of -2% and 12%; and
  • Enrollment growth of 31 – 34% year-over-year.

Per the WSJ coverage of the conference call:

“I’m very pleased with our second quarter results, and that we have both the basis and the visibility to increase all of our guidance measures for 2014,” said Chip Paucek, 2U’s Chief Executive Officer and co-founder. “We’ve reached a turning point where, even with continued high investment for growth, our losses have stopped accelerating. At the midpoint of our new guidance range, we now expect our full year 2014 adjusted EBITDA loss to improve by 17% over 2013. Further, we’ve announced a schedule that meets our stated annual goal for new program launches through 2015.”

The company went public in late March at $14 / share and is still in that range ($14.21 before the quarterly earnings release – it might go up tomorrow). As one of only three ed tech companies to have gone public in the US over the past five years, 2U remains worth watching both for its own news and as a bellwether of the IPO market for ed tech.


The financials provide more insight into the world of Online Service Providers (OSP, aka Online Program Management, School-as-a-Service, Online Enablers, the market with no name). On the conference call 2U’s CEO Chip Paucek reminded analysts that they typically invest (money spent – revenue) $4 – $9 million per program in the early years and do not start to break even until years 3 – 4. 2U might be on the high side of these numbers given their focus on small class sizes at big-name schools, but this helps explain why the OSP market typically focuses on long-term contracts of 10+ years. Without such a long-term revenue-sharing contract, it would be difficult for an OSP to ever break even.

As the market matures – with more competitors and with schools developing their own experience in online programs – it will become more and more difficult for companies to obtain these commitments from schools. We have already seen signs over the past year of changes in institutional expectations.

2U, meanwhile, has positioned itself at the high-end of the market, relying on high tuitions and brand-name elite schools with small classes. The company for the most part will not even compete in a Request for Proposal process, avoiding direct competition with Embanet, Deltak, Academic Partnerships and others. Their prospects seem much stronger than the more competitive mainstream of OSP providers.

See the posts here at e-Literate for more background.

2U has changed one aspect of their strategy, as noted by Donna Murdoch on G+. At least through 2012 the company positioned itself as planning to work with one school per discipline (or vertical in their language). Pick one school for Masters of Social Work, one for MBA, etc. As described in Jan 2012:

“As we come into a new vertical, 2tor basically partners with one great school per vertical. We find one partner, one brand that is world-class. We partner with that brand over a long time period to create the market leader in that space for that discipline.”

2U now explicitly plans for multiple schools in the same vertical, as can be seen in the press release put out today:

Programs Aug 2014

Note the duplication of Social Work between USC and Simmons, Nursing between Georgetown and Simmons, and Data Science between Berkeley and SMU. Note the new approach from page 20 of the quarterly report:

As described above, we have added, and we intend to continue to add, degree programs in a number of new academic disciplines each year, as well as to expand the delivery of existing degree programs to new clients.

View Into Model

Along with the first quarter release (which was not based on a full quarter of operations as a public company), 2U released some interesting videos that give a better view into their pedagogical approach and platform. In this video they describe their “Bi-directional Learning Tool (BLT)”:

This image is from a page on the 2U website showing their approach, with a view of the infamous Brady Bunch layout for live classes (synchronous).

Live Courses

We’ll keep watching 2U and share significant developments as we see them.

The post Update on 2U: First full quarterly earnings and insight into model appeared first on e-Literate.

Full Disclosure

Sat, 2014-08-02 12:41

As you probably know, we run a consulting business (MindWires Consulting) and sometimes work with the companies and schools that we write about here. Consequently, we periodically remind you and update you on our conflict of interest policies. We do our best to avoid or minimize conflicts of interest where we can, but since our system isn’t perfect, we want you to understand how we handle them when they arise so that you can consider our analysis with the full context in mind. We value your trust and don’t take it for granted.

We talk a lot with each other about how to deal with conflicts of interest because we run into them a lot. On the one hand, we find that working with the vendors and schools that we write about provides us with insight that is helpful to a wide range of clients and readers. There just aren’t too many people who have the benefit of being able to see how all sides of the ed tech relationships work. But along with that perspective comes an inevitable and perpetual tension with objectivity. When we started our business together 18 months ago, we didn’t have a clear idea where these tensions would show up or how big an issue they might turn out to be. We originally thought that our blogging was going to remain an addiction that was subsidized but somewhat disconnected from our consulting. But it turns out that more than 90% of our business comes from readers of the blog, and a significant portion of it comes out of conversations stimulated by a specific post. Now that we understand that relationship better, we’re getting a better handle on the kinds of conflict of interest that can arise and how best to mitigate them. Our particular approach in any given situation depends a lot on whether the client wants analysis or advice.


In many cases, clients want us to provide deeper, more heavily researched, and more tailored versions of the analysis that we’ve provided publicly on this blog. In this situation, there isn’t a strong, direct conflict of interest between providing them with what they are asking for and writing public analysis about various aspects of their business. That said, no matter how hard we try to write objectively about an organization that is, was, or could be a client, human nature being what it is, we can’t guarantee that we will never be even subconsciously influenced in our thinking. That is why we have a policy to always disclose when we are blogging about a client. We have done this in various ways in the past. Going forward, we are standardizing on an approach in which we will insert a disclosure footnote at the end of the first sentence in the post in which the client is named. It will look like this.[1] (We are not fully satisfied that the footnote is prominent enough, so we will be investigating ways to make it a little more prominent.) We will insert these notices in all future posts on the blog, whether or not we are the authors of those posts. In cases where the company in question is not currently a client but was recently and could be again in the near future, we will note that the company “was recently a client of MindWires Consulting”.


Sometimes the client wants not only analysis but also strategic advice. Those situations can be trickier. We want to avoid cases in which we blog in praise (or condemnation) of a company for taking an action that they paid us to tell them to take. Our policy is that we don’t blog about any decisions that a company might make based on our advice. There are some theoretical situations in which we might consider making an exception to that rule, but if they ever do come up in reality, then the disclosure principle will apply. We will let you know if, when, and why we would make the exception. Aside from that currently theoretical exception, we recuse ourselves from blogging about the results of our own consulting advice. Furthermore, when potential clients ask us for advice that we think will put us into a long-term conflict of interest regarding one of our core areas of analysis, we turn down that work. Analysis takes precedence over advice.

Getting Better at This

We’re going to continue thinking about this and refining our approach as we learn more. We also have some ideas about business models that could further minimize potential conflicts in the future. We’ll share the details with you if and when we get to the point where we’re ready to move forward on them. In the meantime, we will continue to remind you of our current policy periodically so that you are in a better position to judge our analysis. And as always, we welcome your feedback.


  1. Full disclosure: Acme Ed Tech Company is a client of MindWires Consulting, the sponsor of e-Literate.

The post Full Disclosure appeared first on e-Literate.

Mobile-first learning platform EmpoweredU acquired by Qualcomm

Wed, 2014-07-30 17:49

Qualcomm, the giant $26 billion wireless technology conglomerate, acquired EmpoweredU – a mobile-first learning platform available for the education market. What does this acquisition mean?

Who is EmpoweredU?

The company was created by CEO Steve Poizner in 2011 in partnership with Creative Artists Agency, the world’s largest sports and talent agency, under the name “Encore Career Institute”. The initial work was to offer continuing ed classes targeted at Baby Boomers through UCLA Extension (certificate programs for $5,000 – $10,000 total tuition). In essence, this was an Online Service Provider (OSP) model similar to Embanet, Deltak, Academic Partnerships and 2U. As described by the San Francisco Chronicle in 2011:

Poizner, in an interview at the firm’s headquarters this week, said the company combines “three of California’s greatest assets” – its famed public university system, the creative know-how of its technology center, Silicon Valley, and the cutting-edge marketing savvy of Hollywood. [snip]

In addition to its employment potential for Baby Boomers, Poizner said, the collaboration could bring new revenue for cash-strapped UCLA and thousands of new students from around the nation to its online courses.

The company changed names to Empowered Careers and then eventually settled on EmpoweredU.

In the meantime they figured out that the OSP model is high risk and expensive, often requiring investments of $1 – $10 million per program by the OSP, with revenue-sharing profits occurring several years later. EmpoweredU has pivoted over the past year to become a mobile-first learning platform with content services.

The platform is built on top of the Canvas open source version offered by Instructure and started pilots at 15+ schools this spring (including specific programs at USC, UC Berkeley, U of San Francisco, etc). This may be the most significant use of open source Canvas, and it might end up competing with Canvas, at least indirectly.

As we’ll see later, EmpoweredU is also attempting to create a learning ecosystem that can combine multiple technologies.

Why is Qualcomm making an ed tech acquisition?

I interviewed Vicki Mealer (Senior Director, Business Development, Qualcomm Labs, which is the unit acquiring EmpoweredU) and Steve Poizner today. Vicki’s description of Qualcomm’s interest in ed tech is that they are all about mobile technology, and they have had a philanthropic interest in education for years (donating over $240 million in cash to various institutions). Qualcomm wants to be a behind-the-scenes cheerleader, but they also need an ecosystem to exist for each market. Qualcomm Labs started looking at education a year ago, trying to identify and overcome barriers to adoption of mobile technology. Some of the perceived barriers:

  • The digital divide leading to students having gaps in their connectivity (wi-fi vs. cellular);
  • Vendor lock-in and lack of modularity, causing school leaders to have painful technology replacement decisions to move into a mobile strategy; and
  • A lack of software and tools for instructors to take advantage of mobile features and be able to develop curriculum that leverages the technology – partially to have instructors catch up to where the students are.

For Qualcomm Labs, EmpoweredU can provide the modular ecosystem for education and shares their device-agnostic views. This will help them accelerate adoption of mobile in education.

Steve is becoming the SVP of a new business unit within the Labs, called Qualcomm Education. The EmpoweredU unit will combine with a separate Mobile Learning Framework initiative and broaden its focus to K-20.

Should we care?

I visited the company in May of this year and saw a very different design approach from the current generation of browser-based learning platforms that have added mobile features as after-the-fact enhancements. At this point EmpoweredU is a niche player targeting only specific academic programs that can afford an iPad one-to-one approach or similar methods to ensure that all students have tablets. Longer term they see this need broadening out to entire institutions. The technology has a full browser interface, so the company could target institution-wide opportunities should they choose.

What is meant by mobile-first in this case is that the platform was conceived and designed around the iPad, directly integrating device features such as location as well as camera and microphone input. In addition, the platform uses push notifications to alert students to assignments or due dates.

Main UI 1

One feature that I find quite important for the mobile world is automatic caching to allow offline access. The default setup syncs the previous, current, and next week’s material to the device while connected, allowing offline work that is re-synced when back on the network.
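As a rough illustration of the described policy – cache the previous, current, and next week’s materials while connected – the selection logic might look something like the sketch below. This is our own illustration; the function and field names are hypothetical, not EmpoweredU’s actual API.

```python
# Sketch of the described sync policy: while online, cache the previous,
# current, and next week's course materials for offline use.
# (Illustrative only; names are our own, not EmpoweredU's.)
from datetime import date, timedelta

def week_start(d: date) -> date:
    """Monday of the week containing d."""
    return d - timedelta(days=d.weekday())

def weeks_to_sync(today: date) -> list[date]:
    """Week-start dates whose materials should be cached locally."""
    this_week = week_start(today)
    return [this_week - timedelta(weeks=1), this_week, this_week + timedelta(weeks=1)]

def select_for_cache(materials: list[dict], today: date) -> list[dict]:
    """Filter course materials (each with a 'due' date) to the three-week window."""
    window = set(weeks_to_sync(today))
    return [m for m in materials if week_start(m["due"]) in window]
```

On reconnect, anything modified offline would be diffed against this same window and pushed back to the server.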

While the platform was written originally for the iPad, they now support multiple devices and have one pilot that is web only.

In a nod to their OSP origins and content-generating experience, EmpoweredU offers “content sherpas” and a content authoring system. The idea is to support faculty and designers who are attempting to design courses and content that take advantage of the mobile platform.


They released initial analytics support in the spring.

During the interview, it became apparent that Qualcomm is interested not just in the learning platform, but in EmpoweredU’s broader plans to create an ecosystem.


I pushed them to describe who their competitors would be, either in higher ed or K-12, but they would not directly answer. They kept coming back to the ecosystem and the ability to provide a modular approach rather than forcing rip-and-replace strategies. I can see this in theory but question what it means in practice.

From an initial look at the company, it will be interesting to watch to see if Qualcomm’s financial backing will allow EmpoweredU to move beyond a niche provider for select programs and attempt to directly compete in the LMS market for institutions or at least compete more broadly. It will also be interesting to see if they are successful in their entrance to the K-12 market. If so, the learning platform market will get even more interesting.

As for the full ecosystem, there are not enough details to understand how seriously to take this approach. Are schools even ready for this approach? How does this ecosystem relate to the LTI specifications that are fundamentally changing the ed tech market? I have many questions in this area that we’ll have to watch over time.

Update: Corrected University of San Francisco reference (and not UC San Francisco) per comments below.

The post Mobile-first learning platform EmpoweredU acquired by Qualcomm appeared first on e-Literate.

To see how illogical the Brookings Institution report on student loans is, just read the executive summary

Tue, 2014-07-29 17:13
il·log·i·cal /i(l)ˈläjikəl/ adjective
  1. lacking sense or clear, sound reasoning. (From Google’s definition)

There have been multiple articles both accepting the Brookings argument that “typical borrowers are no worse off now than they were a generation ago” and calling out the flaws in the Brookings report. I have written two articles here and here criticizing the report. The problem is that much of the discussion is more complicated than it needs to be. A simple reading of the Brookings executive summary exposes just how illogical the report is.

College tuition and student debt levels have been increasing at a fast pace for at least two decades. These well-documented trends, coupled with an economy weakened by a major recession, have raised serious questions about whether the market for student debt is headed for a crisis, with many borrowers unable to repay their loans and taxpayers being forced to foot the bill.

The argument is set up – yes, tuition and debt levels are going up, but how is a crisis defined? It’s specifically about “many borrowers unable to repay their loans”. Is there a crisis? That’s not a bad setup, and it is a valid question to address.

Our analysis of more than two decades of data on the financial well-being of American households suggests that the reality of student loans may not be as dire as many commentators fear. We draw on data from the Survey of Consumer Finances (SCF) administered by the Federal Reserve Board to track how the education debt levels and incomes of young households evolved between 1989 and 2010. The SCF data are consistent with multiple other data sources, finding significant increases in average debt levels, but providing little indication of a significant contingent of borrowers with enormous debt loads.

This is an interesting source of data. Yes, the Federal Reserve Board’s Survey of Consumer Finances tracks student debt, but this data is almost four years old due to the triennial survey cycle. [1]

But hold on – now we’re talking about “significant contingent of borrowers with enormous debt loads”? I thought the issue was ability to repay. What does “enormous” even mean other than being a scary word?

First, we find that roughly one-quarter of the increase in student debt since 1989 can be directly attributed to Americans obtaining more education, especially graduate degrees. The average debt levels of borrowers with a graduate degree more than quadrupled, from just under $10,000 to more than $40,000. By comparison, the debt loads of those with only a bachelor’s degree increased by a smaller margin, from $6,000 to $16,000.

Fair enough point to start, noting that a quarter of debt growth comes from higher levels of education including grad school. Average debt loads have gone up more than 2.5x for undergrads, and that certainly sounds troublesome given the report’s main point of “no worse off”. Using the ‘but others are worse off, so this is not as bad’ argument, Brookings notes that grad students had their debt go up by 4x. The argument here appears to be that 2.5 is less than 4.[2]

Second, the SCF data strongly suggest that increases in the average lifetime incomes of college-educated Americans have more than kept pace with increases in debt loads. Between 1992 and 2010, the average household with student debt saw an increase of about $7,400 in annual income and $18,000 in total debt. In other words, the increase in earnings received over the course of 2.4 years would pay for the increase in debt incurred.

Despite the positioning of the report that a small portion of borrowers skews the data and coverage, Brookings resorts to using the mythical “average household”. For that mythical entity, they certainly seem to have the magical touch to not pay any taxes and obtain zero-interest loans.[3]
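To make the arithmetic concrete: 2.4 years is just $18,000 divided by $7,400, which only works if the extra income is untaxed and the loan carries no interest. A quick sanity check, using assumed values of a 25% marginal tax rate and a 6.8% unsubsidized Stafford rate (neither figure appears in the Brookings report), shows how quickly the payoff stretches:

```python
# Rough check of Brookings' "2.4 years of income growth pays off the debt increase" claim.
# Assumptions (ours, not Brookings'): 25% marginal tax rate, 6.8% loan interest rate.
import math

debt_increase = 18_000    # increase in total debt (from the report)
income_increase = 7_400   # increase in annual income (from the report)

naive_years = debt_increase / income_increase   # pre-tax, zero-interest: ~2.4
after_tax = income_increase * (1 - 0.25)        # assumed 25% tax rate
r = 0.068                                       # assumed annual interest rate

# Standard amortization formula: years to repay principal P with annual payment A at rate r
years_with_interest = -math.log(1 - r * debt_increase / after_tax) / math.log(1 + r)

print(f"naive: {naive_years:.1f} years")                          # ~2.4
print(f"with taxes and interest: {years_with_interest:.1f} years")  # ~3.8
```

Even under these mild assumptions the payoff period grows by more than half, which is the point of the "magical touch" jab above.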

Nonetheless, we’ve now changed the issue again: first from ability to repay, then to whether the loan is “enormous”, and now to how long a mythical payoff takes.

Third, the monthly payment burden faced by student loan borrowers has stayed about the same or even lessened over the past two decades. The median borrower has consistently spent three to four percent of their monthly income on student loan payments since 1992, and the mean payment-to-income ratio has fallen significantly, from 15 to 7 percent. The average repayment term for student loans increased over this period, allowing borrowers to shoulder increased debt loads without larger monthly payments.

Small issue, but we’ve now gone from average household as key unit of measurement to median borrower? Two changes from one paragraph to the other – average to median and household to borrower?

OK, now we have replaced the scary “enormous” with “borrowers struggling with high debt loads”. Although not in the executive summary, the analysis of the report seems to define these large debts as $100,000 or more. Doesn’t it matter who the borrower is? A humanities PhD graduate working as an adjunct for $25,000 a year might view $20,000 debt as enormous.

Brookings introduces a new measure, and this one does at least take into account the difference in borrowers: payment-to-income ratios of median borrowers. If I’m reading the argument correctly (this took a while based on key measures and terms changing paragraph to paragraph), not only should there be no crisis, but the situation might actually be improving.

These data indicate that typical borrowers are no worse off now than they were a generation ago, and also suggest that the borrowers struggling with high debt loads frequently featured in media coverage may not be part of a new or growing phenomenon. The percentage of borrowers with high payment-to-income ratios has not increased over the last 20 years—if anything, it has declined.

So I was reading it correctly: “typical borrowers are no worse off” and the percentage of borrowers with high ratios has declined.[4] The only problem, however, is that if we go back to the original setup of the issue, “many borrowers unable to repay their loans”, there might be a much more direct measurement. How about actually seeing if borrowers are failing to repay their loans (aka being delinquent)?

The Brookings report does not analyze loan delinquency at all. The word “default” is mentioned only three times: once referring to home mortgages and twice referring to interest rates. The word “delinquent” does not appear even once. What do actual delinquency rates show us?

It turns out that we can go to the same source of data and find out. Here is the New York Fed report from late 2013:


D’oh! It turns out that real borrowers with real tax brackets paying off real loans are having real problems. The percentage of borrowers at least 90 days delinquent has more than doubled in just the past decade. In fact, based on another Federal Reserve report, the problem is much bigger for the future: “44% of borrowers are not yet in repayment, and excluding those, the effective 90+ delinquency rate rises to more than 30%”.

More than 30% of borrowers who should be paying off their loans are at least 90 days delinquent? It seems someone didn’t tell them that their payment-to-income ratios (at least for their mythical average friends) are just fine and that they’re “no worse off”.
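The quoted effective rate is simple arithmetic once you note that borrowers not yet in repayment cannot be delinquent. As a sanity check, assume a headline 90+ day rate of about 17% across all borrowers (our assumption, implied by the quote rather than stated in it):

```python
# Back-of-the-envelope check of the quoted effective delinquency rate.
# Assumption (ours): headline 90+ day delinquency across ALL borrowers is ~17%.
headline_rate = 0.17       # share of all borrowers 90+ days delinquent (assumed)
not_in_repayment = 0.44    # share of borrowers not yet in repayment (from the quote)

# Only borrowers actually in repayment can be delinquent, so condition on them:
effective_rate = headline_rate / (1 - not_in_repayment)
print(f"{effective_rate:.0%}")  # ~30%
```

In other words, a headline rate that sounds modest conceals a much higher rate among the borrowers who are actually supposed to be paying.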

Back to the Brookings report:

This new evidence suggests that broad-based policies aimed at all student borrowers, either past or current, are likely to be unnecessary and wasteful given the lack of evidence of widespread financial hardship. At the same time, as students take on more debt to go to college, they are taking on more risk. Consequently, policy efforts should focus on refining safety nets that mitigate risk without creating perverse incentives.

Despite the flawed analysis that changed terms, changed key measures, and failed to look at any data on delinquencies, Brookings now calls out a “lack of evidence of widespread financial hardship”. How can we take their recommendations seriously when the supporting analysis is fundamentally illogical?

At least the respectable news organizations will do basic checking of the report before parroting such flawed analysis.

The worries are exaggerated: Only 7% of young adults with student debt have $50,000 or more.

— David Leonhardt (@DLeonhardt) June 24, 2014

ICYMI=>The Student Debt Crisis Is Being Manufactured To Justify Debt Forgiveness #tcot #taxes

— Jeffrey Dorfman (@DorfmanJeffrey) July 5, 2014


  1. Also note that we’re skipping the years with the highest growth in student debt.
  2. This argument also ignores or trivializes the issue that grad students are indeed students.
  3. There is no other way to get to the 2.4 year payoff.
  4. And yet another change – from average to median to typical.

The post To see how illogical the Brookings Institution report on student loans is, just read the executive summary appeared first on e-Literate.

Early Review of Google Classroom

Mon, 2014-07-28 16:36

Meg Tufano is co-Founder of SynaptIQ+ (think tank for social era knowledge) and leader of McDermott MultiMedia Group (an education consulting group focused on Google Apps EDU). We have been checking out Google Classroom – with her as the teacher and me as the student. I include some of Meg’s bio here as it is worth noting her extensive experience designing and teaching online courses for more than a decade.

Meg posted a Google Slides review of her initial experiences using Google Classroom from a teacher’s perspective, which I am sharing below with minimal commentary. The review includes annotated slides showing the various features and Meg’s comments.

I have not done as much work to show the student view, but I will note the following:

  • The student view does not include the link to the Chrome Store that Meg finds to be too confusing.
  • The biggest challenge I’ve had so far is managing my multiple Google accounts (you have to be logged into your Google Apps for Edu account as your primary Google account to enter Classroom, which is not that intuitive for students).
  • I wonder if Google will continue to use Google tools so prominently in Classroom (primarily GDrive, YouTube, and GDocs) or if the full release will make it easier to embed non-Google tools.
  • I have previously written “Why Google Classroom won’t affect institutional LMS market … yet”, and after initial testing, nothing has changed my opinion.
  • I have one other post linking to video-based reviews of Google Classroom here.

The post Early Review of Google Classroom appeared first on e-Literate.

CCSF Update: Accreditation appeal denied, but waiting for court date

Wed, 2014-07-23 13:30

It looks like I’ll have the California trifecta for the past week, having already posted on Cal State and University of California news recently. Maybe I should find a Stanford or some other private university story.

In my last post on CCSF from January:

Last week, as expected, a California superior court judge ruled on whether to allow the Accrediting Commission for Community and Junior Colleges (ACCJC) to end accreditation for City College of San Francisco (CCSF) as of July 31, 2014. As reported in multiple news outlets, the judge granted an injunction preventing ACCJC from stripping CCSF’s accreditation at least until a court trial based on the city of San Francisco lawsuit, which would occur in summer 2014 at the earliest. This means that CCSF will stay open for at least another academic term (fall 2014), and it is possible that ACCJC would have to redo their accreditation review.

In the meantime, ACCJC reviewed CCSF’s appeal of the accrediting decision and is sticking to its guns, as described in the San Francisco Chronicle:

City College of San Francisco remains out of compliance with eight accreditation standards, so the threat to revoke its accreditation stands, said the commission that set July 31 for the action that would shut the college down.

Accreditation won’t be revoked on that date, however, because a judge delayed the deadline until an October trial can determine if the Accrediting Commission for Community and Junior Colleges properly conducted its 2012 evaluation of City College.

In other words, ACCJC has not changed its determination that CCSF should lose accreditation. There are only two caveats at this point:

  • The injunction that prevents ACCJC from revoking accreditation until the October court date; and
  • A new loophole called “restoration status”.

From the SF Chronicle again:

Besides pinning its hopes on the lawsuit – which could trigger a completely new evaluation – the college has one more option, made possible in June when the U.S. Department of Education firmly explained to the reluctant commission that it had the power to extend the revocation deadline.

As a result of that intervention, the commission created a new “restoration status” for City College – and any other college that finds itself in such a precarious position – giving it two more years to improve and comply with a new range of requirements.

City College would have to apply for the new status by July 31.

But Phil, you say, I am fascinated by the accreditation review process and want more! To keep you going, here is the letter from ACCJC to CCSF rejecting the appeal. In the letter ACCJC calls out the areas where CCSF is still not in compliance:

  • I.B  Improving Institutional Effectiveness
  • II.A Instructional Programs
  • II.B Student Support Services
  • II.C Library and Learning Support Services
  • III.B Physical Resources
  • III.C Technology Resources
  • III.D Financial Resources
  • IV.B Board and Administrative Organization

For historical context of how we got here, see this post.

The high-profile game of chicken continues.

The post CCSF Update: Accreditation appeal denied, but waiting for court date appeared first on e-Literate.

University of California’s $220 million payroll project reboot

Mon, 2014-07-21 16:11

Chris Newfield has an excellent post at Remaking the University about the University of California’s budget situation and how it relates to the recent Moody’s negative outlook on higher education finances. The whole article is worth reading, but one section jumped off the page for me [emphasis added].

The sadder example of ongoing debt is the request for “external financing for the UCPath project.” UC Path was UCOP’s flagship solution to UC inefficiencies that were allegedly wasting taxpayers’ money–in other words, new enterprise software for the systemwide consolidation of payroll and human resources functions. This is boring, important back office stuff, hardly good material for a political campaign to show the state “UC means business,” but that’s what it became. Rather than funding each campus’s decades-old effort to upgrade its systems on its own, UCOP sought centralization, which predictably introduced new levels of cost, complexity, and inefficiency, since centralization is often not actually efficient.

I had heard nothing good about UC Path from people trying to implement it on campuses, and have tried to ignore it, but this week it has resurfaced as a problem at the Regental level. The project timeline has grown from 48 to 72 months, and its costs are said to be $220 million (it had spent $131 million by May 2014). Worse, the repayment schedule has mushroomed from seven to twenty years. Annual payments are to be something like $25 million. Campuses are to be taxed to pay for 2015-era systems until 2035, which is like taking out a twenty year mortgage to pay for your refrigerator, except that your fridge will be working better in 2035 than next year’s PeopleSoft product. Since the concurrent budget document notes efficiency savings of $30 million per year (top of page 4), UCOP may be spending $220 million to save a net $5 million per year over a couple of decades–and going into debt to do it. In the end, an efficiency measure has turned into a literal liability.

What the hell – a $220 million project to save money? How did this project get in this much trouble?

The UCPath project concept originated in 2009 with the project announcement coming in late 2011. The goal is to replace the Payroll Personnel System (PPS) that runs separately for each of the 11 UC locations with Oracle’s PeopleSoft payroll and HR systems. PPS is over 30 years old, and there are major risk issues with such an old system as well as a host of inefficient processes. The original project plans were based on a $170 million budget[1] with the first wave of go-live for the Office of the President and 3 campuses scheduled for early 2013. All campuses would be live on the new system by late 2014.[2]

In a presentation to the Board of Regents in January 2012:

Over the same period, cost reductions are expected to be approximately $750 million from technology efficiency gains, process standardization and consolidation of transactional activities into a UC-wide shared services center. Overall, the project has a net present value of approximately $230 million (at a nine percent discount rate) with breakeven in year 5.
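Claims like these can at least be sanity-checked with a standard net-present-value calculation. A minimal sketch follows; UC did not publish its year-by-year projections, so the cash-flow profile below is an illustrative placeholder (heavy up-front spending, later savings), not UC's actual numbers.

```python
# Generic NPV calculation; the cash-flow profile is a made-up illustration,
# NOT UC's actual figures (those were never published year by year).

def npv(rate, cash_flows):
    """Discount yearly cash flows (year 0 first) back to present value."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Hypothetical profile in $ millions over ten years.
cash_flows = [-85, -85, 50, 100, 100, 100, 100, 100, 100, 100]
print(round(npv(0.09, cash_flows), 1))
```

At a nine percent discount rate, even modest slips in when the savings arrive move the NPV substantially, which is one reason the later schedule slippage matters financially, not just operationally.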

Subsequent promises were made in March of 2012:

“We think this project is likely to pay for itself within five years, and UC could be accruing over $100 million in annual savings by the eighth year,” said Peter Taylor, UC’s chief financial officer. “We also expect to deliver HR and payroll services with increased efficiency, accuracy and quality.”

At the Board of Regents’ meeting last week, the project team gave the first update to the regents since January 2012 (itself a troubling sign). See this Sharestream video from 2:56:10 – 3:22:40.

By Fall 2013 the project was in trouble, and UC brought in new project leadership: Mark Cianca as Deputy CIO and Sabu Varghese as Program Director. Their first act was to run a health check on the project, and the results were not pretty (as described in last week’s Board of Regents’ meeting).

  • The project team and implementation partner (Oracle) had treated the project as a software replacement rather than a fundamental business transformation initiative.
  • The individual campuses had not been consulted on changes in business processes, and in fact they had not even been asked to sign off on future state business processes that each campus would have to run to stay in operation.
  • The new project team had to go through more than 100 future state processes with campuses and get agreement on how to proceed.

The result, as described by UC President Janet Napolitano at last week’s meeting, was the team having to “reboot the entire project”.

Based on the reboot, the current plan is $220 million with first wave complete by February 2016 and all campuses live by mid 2017. That’s $50 million over budget and 24 months over schedule.

[Slide: Deployment Schedule, July 2014]

But the planning is not complete. They are working up their “final” replan of budget and timeline, which they will present in January 2015.

[Slide: Topics for January 2015]

How solid is the current estimate? The implementation schedule is listed as the highest risk, even with the delays.

[Slide: Major Risks, July 2014]

The project financing has changed so much that UC is now facing the need to use external financing over a much longer term, as described in the material for last week’s board meeting.

Therefore, this item seeks approval to refinance the UCPath loan out of CapEquip and into external financing to achieve the financing customization required. As indicated above, the original repayment plan based on the $220.5 million budget was expected to have been repaid with annual debt service of $25 million. This would have resulted in a 12-year loan term once principal was to be repaid. In January 2015, UCPath project leadership plans to present a revised project timeline, a revised project budget and a revised estimated loan repayment schedule. Project leadership will work with the campus budget officers (and campus budget department staff) to develop: (1) an appropriate campus cost allocation strategy; (2) an estimated repayment schedule that will reflect commencement of principal repayments in conjunction with the final campus deployment (estimated to be early 2017); and (3) an estimated 15-20 year loan repayment period.
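The quoted 12-year term is easy to reproduce with a simple amortization simulation. A sketch, assuming a 5 percent annual interest rate (my assumption; the actual borrowing rate is not stated in the board materials):

```python
# Year-by-year amortization: how long does a fixed annual payment take to
# retire a loan? The 5% rate is an assumed placeholder, not UC's actual rate.

def years_to_repay(principal, annual_payment, rate):
    """Return years until the balance reaches zero at a fixed annual payment."""
    balance, years = principal, 0
    while balance > 0:
        balance = balance * (1 + rate) - annual_payment
        years += 1
        if years > 100:  # guard: payment doesn't even cover interest
            raise ValueError("payment never retires the loan")
    return years

print(years_to_repay(220.5, 25, 0.05))  # → 12 ($ millions, hypothetical 5% rate)
```

Stretching the same principal to the 15-20 year repayment period now being discussed implies either smaller annual payments or a higher total cost of borrowing, which is exactly the trade-off the January 2015 replan should spell out.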

  • The new project team seems quite credible, and for the most part they addressed the right points during the briefing. Kudos to UC for making this change in leadership.
  • This is a major project turnaround (or reboot, in Napolitano’s words), but I’m not sure that UC had communicated the significance of the project changes to system campuses (and certainly not to the media).
  • I would view the current plan of $220 million and Q1 2017 full deployment as a best-case situation – the team told the regents that they were going to update the plan, and ERP projects almost never come in earlier than planned.
  • The actual amount is much higher than $220 million, based on this footnote: “The $10 million in tenant improvements approved for the UCPath Center Riverside site as well as the $17.4 million purchase of the facility (UCPath is currently projected to use no more than 50 percent of the building) are not included in the figures above.”
  • How do you go 2.5 years between updates from what is now a quarter billion dollar project?
  • What about the current estimate of benefits – is it $30 million per year as Chris described or closer to $100 million per year? One big concern I have is that the information on project benefits was not updated, not presented to the regents, and not asked about by the regents. While I question the $25 million financing and $30 million benefits numbers, I think Chris got it exactly right by noting how UC administration is failing to ask hard questions:

Moving forward, I’m afraid that officials are going to have to get much better at admitting mistakes like UCPath, and then actually undoing them. I couldn’t listen to the recording of the UCPath conversation, but Cloudminder made it sound like a lot of restrained finger-pointing with no solution in sight. Did anyone say, “well, this seemed like a good idea at the time, but it’s not. Let’s just cancel it, figure out where we went wrong, and come up with something better”?

It is possible that continuing with the rebooted project is the right answer, but UC is not even asking the question. Failing to ask whether 15-20 year financing of a new ERP makes sense seems like a major oversight. Won’t this lock UC into an Oracle system that is already antiquated for another two decades or more? It seems stunning to me that UC is planning to commit to $220 million of external financing without asking some basic questions.

  1. one regent last week stated the original request was actually $156 million.
  2. All public projects should fear the Wayback Machine for checking old web pages.

The post University of California’s $220 million payroll project reboot appeared first on e-Literate.


Desire2Wha?

Sun, 2014-07-20 08:16

It would be deeply unfair of me to mock Blackboard for having a messy but substantive keynote presentation and not give equal time to D2L’s remarkable press release, pithily entitled “D2L Supercharges Its Integrated Learning Platform With Adaptive Learning, Robust Analytics, Game-Based Learning, Windows® 8 Mobile Capabilities, And The Newest Education Content All Delivered In The Cloud.” Here’s the first sentence:

D2L, the EdTech company that created the world’s first truly integrated learning platform (ILP), today announces it is supercharging its ILP by providing groundbreaking new features and partnerships designed to personalize education and eliminate the achievement gap.

I was going to follow that quote with a cutting remark, but really, I’m not sure that I have anything to say that would be equal to the occasion. The sentence speaks for itself.

For a variety of reasons, Phil and I did not attend D2L FUSION this year, so it’s hard to tell from afar whether there is more going on at the company than meets the eye. I’ll do my best to break down what we’re seeing in this post, but it won’t have the same level of confidence that we have in our Blackboard analysis.

Let me get to the heart of the matter first. Does it look to us like D2L has made important announcements this year? No, it does not. Other than, you know, supercharging its ILP by providing groundbreaking new features and partnerships designed to personalize education and eliminate the achievement gap. They changed their product name to “Brightspace” and shortened their company name to D2L. The latter strikes me as a particularly canny PR move. If they are going to continue writing press releases like their last one, it is probably wise to remove the temptation of the endless variety of potential “Desire2” jokes. Anyway, THE Journal probably does the best job of summarizing the announcements. For an on-the-ground account of the conference and broader observations about shifts in the company’s culture, read D’Arcy Norman’s post. I’ve been following D’Arcy since I got into blogging ten years ago and have learned to trust his judgment as a level-headed on-the-ground observer.

From a distance, a couple of things jump out at me. First, it looks to me like D2L is trying to become a kind of a content player. Having acquired the adaptive platform in Knowillage, they are combining it with the standards database that they acquired with the Achievement Standards Network. They are also making a lot of noise about enhancements to and content partnerships for their Binder product, which is essentially an eBook platform. Put all of this together, and you get something that conceptually is starting to look (very) vaguely like CogBooks. It wants to be an adaptive courseware container. If D2L pulls this off it will be significant, but I don’t see signs that they have a coherent platform yet—again, acknowledging that I wasn’t able to look at the strategy up close at FUSION this year and could easily be missing critical details.

Second, their announcement that they are incorporating IBM’s Cognos into their Insights learning analytics platform does not strike me as a good sign for Insights. As far as we have been able to tell from our sources, that product has languished since Al Essa left the company for McGraw Hill. One problem has been that their technical team was unable to deliver on the promise of the product vision. There were both data integrity and performance issues. This next bit is speculation on my part, but the fact that D2L is announcing that they plan to use the Cognos engine suggests to me that the company has thus far failed to solve those problems and now is going to a third party to solve them. That’s not necessarily a bad strategy, but it reinforces our impression that they’ve lost another year on a product that they hyped to the heavens and raises questions about the quality of their technical leadership.

The post Desire2Wha? appeared first on e-Literate.

It’s The End of Cal State Online As We Know It . . .

Sat, 2014-07-19 08:48

In a letter to campus leaders last month, the Cal State University system office announced that Cal State Online will no longer operate as originally conceived. Emphasis added below.

As the CSU continues to expand its online education strategies, Cal State Online will evolve as a critical component. An early Cal State Online goal will continue: to increase the quality and quantity of fully online education offerings to existing and prospective CSU students, resulting in successful completion of courses and graduation.

The re-visioning of Cal State Online was recommended by the Council of Presidents and approved by the chancellor. This will include a shift to a communication, consultation and services’ strategy for fully online campus degree programs, credentials, certificates and courses supported by opt-in shared services. Cal State Online’s shared services will be designed, delivered and managed to:

1. Make it easy for prospective and existing students to discover, decide, enroll and successfully complete their CSU online education opportunities.

2. Make it more cost-effective for CSU campuses to develop, deliver and sustain their high- quality fully online degree, credential and certificate programs and courses.

Background in a nutshell

In early 2010 a subset of the Cal State presidents – the Technology Steering Committee (TSC) – came up with a plan to aggressively push online education across the system. In fall 2011 the group commissioned a consultant’s set of reports to help them pick an operating model, with the reports delivered in February 2012. This study led to the creation of CSU Online, conceived as a separate 501(c)3 non-profit group[1] run by the system, with the plan to use a for-profit Online Service Provider (OSP).[2] Early on they realized that Colorado State University was already using the CSU Online name, and the initiative was renamed Cal State Online. The idea was to offer fully-online programs offered by individual campuses in a one-stop shop. Based on an RFP process, in August 2012 Cal State Online selected Pearson as their OSP partner.

Some media coverage of the initiative:

The March IHE article quoted official Cal State documents to describe the initiative.

“The goal of Cal State Online is to create a standardized, centralized, comprehensive business, marketing and outreach support structure for all aspects of online program delivery for the Cal State University System,” says the draft RFP. In the open letter, the executive director offers assurances that “participation is optional” for each of the system’s nearly two dozen campuses, “all programs participating in Cal State Online are subject to the same approval processes as an on-campus program,” and “online courses will meet or exceed the quality standards of CSU face-to-face courses.”

What has changed?

This change is significant and recent, meaning that Cal State likely does not have full plans on what will happen in the future. For now:

  • Cal State Online will no longer be a separate operating entity, and the remnant, or “re-visioned” services will be run by the existing Academic Technology Services department within the Chancellor’s Office.

The re-visioning of Cal State Online will be led by Gerry Hanley (Assistant Vice Chancellor for Academic Technology Services) with Sheila Thomas (State University Dean, Extended and Continuing Education).

  • Pearson is no longer the OSP, and in fact, they had already changed their role many months ago[3] to remove the on-site team and become more of a platform provider for the LearningStudio (aka eCollege) LMS and supporting services.
  • Cal State is no longer attempting to provide a centralized, comprehensive support structure “for all aspects of online program delivery” but instead will centrally provide select services through the individual campuses.
  • It is clear that Cal State is positioning this decision to show as much continuity as possible. They will continue to provide some of the services started under Cal State Online and will continue to support the programs that have already been offered through the group.

Some services will continue and CSU may keep the name, but it’s the end of Cal State Online as we know it.

I am working on a longer post to explain what happened, including (hopefully) some interviews for supporting information . . . stay tuned.

Update: Changed description of Pearson change and added footnote.

  1. I have not independently verified that the organization truly was set up as a 501(c)3.
  2. Pearson had a team in place at Cal State providing LMS, implementation and integration services, enrollment management & marketing, course design support, analytics and reporting, learning object repository, help desk and technical support, training and faculty support.
  3. I believe this occurred Feb 2014 but am not sure.

The post It’s The End of Cal State Online As We Know It . . . appeared first on e-Literate.

Blackboard’s Big News that Nobody Noticed

Fri, 2014-07-18 16:57

This week was both D2L’s FUSION conference and Blackboard’s BbWorld. The conventional wisdom going around is that there was no big news out of either conference. In Blackboard’s case, that’s just not true. In fact, there was an astonishing amount of very significant news. It’s just that Blackboard didn’t do a very good job of explaining it to people. And that, by itself, is also news.

The big corporate keynote had to be one of the strangest I’ve ever seen. CEO Jay Bhatt ran through a whole long list of accomplishments for the year, but he only gave each one a few seconds as he rattled through the checklist. He mentioned that the company has a new mission statement but didn’t bother to explain it. It took nearly an hour of mostly talking about big macro trends in education and generalities about the categories of goals that the company has set before he finally got around to new product announcements. And then commenced what I can only describe as a carpet bombing run of announcements—a series of explosions that were over by the time you realized that they had started, leaving you to wonder what the heck had just happened. Vice President of User Experience Stephanie Weeks gave a 10-minute talk that was mostly platitudes and generalities about goals for students while some truly significant UX work that her team had done played on the video screen in the background, largely unexplained. There was something mentioned about cloud. Collaborate without a Java plugin! A new mobile app. Wait, another new mobile app, but something about jobs. Wait! Go back to the last slide! I think that was…. Is it over already? It seemed like simultaneously the longest and shortest keynote ever.

Phil and I had a chance to talk to Jay about it later in the day and asked him (politely) what he was thinking. He said, “I don’t view BbWorld as a selling conference. At all.”

Wait. What? This is the Blackboard conference, right?

Apparently it was. This executive team is nothing if not earnest about wanting to talk about the real issues in education. In fact, they’re so earnest about it that they’d rather talk about that than sell you their product. As a result, what was announced in Vegas stayed in Vegas. They made a serious mistake with their keynote plan. But as far as serious mistakes go, it was kind of awesome. And revealing. In and of itself, it is a strong indicator that, having begun a major cultural shift under Ray Henderson, the Blackboard of today under Jay Bhatt is a very different beast than the Blackboard of five or six years ago. Many of your assumptions about what the company is and what you can expect from them probably aren’t safe ones to make anymore.

Anyway, it’s not surprising that people observing the conference from afar (and even from anear) missed the announcements. So what were they?

Major UX Overhaul

In the past, a “major UX overhaul” for Blackboard typically meant “we moved around some stuff in the admin panel and put on a skin that looks 5 years out of date rather than 15.” Not this time. The new UX is very different. It takes a lot of design cues from iOS (and, to a certain degree, from Windows Mobile). Forget about the 15 different submenus. They’re moving everything to a single-page model with contextual overlays that fly in when you need them. Workflows have been greatly simplified, and many of them rethought. As I sat in on a demo later in the day, I’m pretty sure that the woman in the row in front of me started crying when she saw how much easier it is to import content from an old course.

To be fair, this isn’t shipping code. “Oh, Michael,” you’re thinking about now, “How can you be such a sucker as to fall for the old vaporware bait and switch?” Well, Phil and I spent some time in their UX lab. We were given access to what was clearly a live system (as was anyone else who came to the UX lab). The UX guy managing the lab gave us a script and warned us that this is still a system in development so if we wanted to see what is actually working today we should stick to the script. But of course, we didn’t. The workflows covered by the script were significant, and a lot that wasn’t on the script was also actually already working. This is real, folks. It may not be done yet, but it’s credible. And if the alpha we saw was any indication, it’s not crazy to imagine that Blackboard could raise the bar on LMS UX design by the time that they release. I kid you not.

Underneath all of this, some serious technical work has been done. Blackboard UX is now 100% separated from the business logic, using Node.js to deliver it and putting presentation code in the browser. Also, the new UX is fully responsive. It dynamically adjusts to the size of the browser window (and device).

Even more impressive was the overhaul of Blackboard Collaborate. The Java plugin is gone.[1] It’s been replaced by a simple—dare I say elegant?—WebRTC-based UX. We saw a live demo of it. If Google had designed Hangouts specifically for education, they probably would have built something like what Blackboard is showing off. And it works. We saw it in action.

The UX overhaul would be a pretty significant development all by itself. But it wasn’t all by itself.

Blackboard Learn Is Going to the Cloud

Wait. What?

Phil and I are still trying to nail down some of the details on this one, particularly since the term “cloud” is used particularly loosely in ed tech. For example, we don’t consider D2L’s virtualization to be a cloud implementation. But from what we can tell so far, it looks like a true elastic, single-instance multi-tenant implementation on top of Amazon Web Services. It’s kind of incredible. And by “kind of incredible,” I mean I have a hard time believing it. Re-engineering a legacy platform to a cloud architecture takes some serious technical mojo, not to mention a lot of pain. If it is true, then the Blackboard technical team has to have been working on this for a long time, laying the groundwork long before Jay and his team arrived. But who cares? If they are able to deliver a true cloud solution while still maintaining managed hosting and self-hosted options, that will be a major technical accomplishment and a significant differentiator.
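For readers unfamiliar with the term, “single-instance multi-tenant” means one shared application and data store serving every customer, with isolation enforced in software rather than by running a separate copy per school. A toy sketch of the pattern (a generic illustration only, not Blackboard’s actual architecture, which we have not seen):

```python
# Generic sketch of single-instance multi-tenancy: one shared store, with
# every operation scoped to a tenant_id. Illustrative only -- this is the
# pattern, not Blackboard's implementation.

class MultiTenantStore:
    def __init__(self):
        self._rows = []  # shared storage for all tenants

    def insert(self, tenant_id, record):
        self._rows.append({"tenant_id": tenant_id, **record})

    def query(self, tenant_id):
        # Tenant isolation is enforced here, in one shared code path,
        # rather than by provisioning a separate instance per customer.
        return [r for r in self._rows if r["tenant_id"] == tenant_id]

store = MultiTenantStore()
store.insert("campus_a", {"course": "BIO 101"})
store.insert("campus_b", {"course": "ENG 200"})
print(store.query("campus_a"))  # only campus_a's rows come back
```

The payoff of this model is elasticity: capacity can be added for everyone at once, which is what distinguishes it from the per-customer virtualization approach mentioned above.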

This seems like the real deal as far as we can tell, but it definitely merits some more investigation and validation. We’ll let you know more as we learn it.

Bundled Products

This one may sound like a trivial improvement unless you’ve ever actually dealt with Blackboard’s sales force, and a trivial one to implement unless you’ve ever worked in a big software company with lots of business units. Blackboard has ended the practice of separately licensing 57 different products, each with its own sales rep and price sheet. In some cases—like xpLOR and myEDU—they’re merging the functionality into the core product. In others, they’re creating tiers of service.

Here’s how their website currently describes the tiers:

  • Learning Core: Bb Learn. (But remember, they’re merging previously separate offerings into it.)
  • Learning Essentials: Everything in Core plus Collaborate.
  • Learning Insight: Everything in Essentials plus Analytics for Learn.
  • Learning Insight & Student Retention: Everything in Insight plus “retention services.” I didn’t catch this at the conference, but if it’s what it sounds like then the company is beginning to move away from differentiating between products and services and toward integrated solutions.

This should deliver more value to customers with less hassle.

Other Stuff

Those were the big announcements, but there was a lot of other stuff that floated by. It seems like they’re doing significant work on their mobile app, separate from the responsive UX work. I didn’t get a chance to even see what that is about. They’re working on a content store in partnership with MBS Books that could be more significant than it looks at a glance. There was some sort of jobs or career mobile app that whizzed by in the keynote. And who knows what else.

When I take a step back and look at this as a whole, a few thoughts run through my head. First comes, “Yeah, they had to do most of this in order to compete with Instructure. The holes they are filling are fairly clear.” Next comes, “I really didn’t believe they could pull some of this off at all, never mind as quickly and well as they seem to be doing it. Time will tell but…wow.” Then comes, “How the hell did they manage to get through a keynote with all of this in it and not blow people out of their chairs?” And finally, “Who would have thought in a million years that the LMS space could become interesting again?”

But there you have it. This is just a news post; the implications for Blackboard and the market are many and significant. Phil and I will have more to say about it in the days and weeks ahead. For now, the take-home message can be summed up thusly:

Game on.

  1. Many Bothans died to bring you this enhancement.

The post Blackboard’s Big News that Nobody Noticed appeared first on e-Literate.

NPR and Missed (Course) Signals

Mon, 2014-07-14 12:45

Anya Kamenetz has a piece up on NPR about learning analytics, highlighting Purdue’s Course Signals as its centerpiece. She does a good job of introducing the topic to a general audience and raising some relevant ethical questions. But she missed one of the biggest ethical questions surrounding Purdue’s product—namely, that some of its research claims are likely false. In particular, she repeats the following claim:

Course Signals…has been shown to increase the number of students earning A’s and B’s and lower the number of D’s and F’s, and it significantly raises the chances that students will stick with college for an additional year, from 83% to 97%. [Emphasis added.]

Based on the work of Mike Caulfield and Al Essa summarized in the link above, it looks like that latter claim is probably the result of selection bias rather than a real finding. So who is at fault for this questionable claim being repeated without challenge in a popular venue many months after it has been convincingly challenged?
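To see why selection bias matters here: even a tool with zero effect will appear to boost retention if “users” are defined by how many courses they took with it, because students who stay enrolled longer take more courses. A toy simulation (illustrative numbers only, not Purdue’s data):

```python
# Simulate students where the tool has NO effect on retention, then measure
# retention among "heavy users" (5+ courses). Illustrative numbers only.
import random

random.seed(0)
students = []
for _ in range(10_000):
    retained = random.random() < 0.83  # baseline rate; no treatment effect at all
    # Retained students simply accumulate more courses over time.
    courses = random.randint(2, 8) if retained else random.randint(1, 5)
    students.append((retained, courses))

overall = sum(r for r, _ in students) / len(students)
heavy = [r for r, c in students if c >= 5]  # "used the tool in 5+ courses"
print(round(overall, 2), round(sum(heavy) / len(heavy), 2))
```

The subgroup number comes out well above the baseline even though retention was generated independently of any “treatment,” which is the shape of the critique Caulfield and Essa raised.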

For starters, Purdue is. They never responded to the criticism, despite confirmation that they are aware of it—for one thing, they got contacted by us and by Inside Higher Ed—and despite the fact that they apparently continue to make money off the sales of the product through a licensing deal with Ellucian. And the uncorrected paper is still available on their web site. This is unconscionable.

Anya clearly bears some responsibility too. Although it’s easy to assume from the way the article is written that the dubious claim was repeated to her in an interview by Purdue researcher Matt Pistilli, she confirmed for me via email that she took the claim from the previously published research paper and did not discuss it with Pistilli. Given that this is her central example of the potential of learning analytics, she should have interrogated it a little more, particularly since she had Matt on the phone. Mike Caulfield also commented to me that any claim of such a dramatic increase in year-to-year retention should automatically be subject to additional scrutiny.

I have to put some blame on the higher ed press as well. Inside Higher Ed covered the story (and, through them, Times Higher Education). In fact, Carl Straumsheim actually advanced the story a bit by putting the question to researcher Matt Pistilli (who gave a non-answer). The Chronicle of Higher Education did not cover it, despite having run a puff piece on Purdue’s claims the same day that Mike Caulfield wrote his original piece challenging the results. It is very clear to Phil and me that we are read by the Chronicle staff, in part because they periodically publish stories that have been obviously influenced by our earlier coverage. Sometimes without attribution. I don’t care that much about the credit, but if they thought Purdue’s claims were newsworthy enough to cover in the first place then they should have done their own reporting on the fact that those claims have been called into question. If they had been more aggressive in their coverage, then the mainstream press reporters who find Course Signals would be more likely to find the other side(s) of the story as well. Outside of IHE, I’m having trouble finding any coverage, never mind any original reporting, in the higher ed or ed tech press.

I have a lot of respect for news reporters in general, and I think that most people grossly underestimate how hard the job is. I think highly of Anya as a professional. I like the reporters I interact with most at the Chronicle as well. Nor will I pretend that we are perfect here at e-Literate. We miss important angles and get details wrong our fair share of the time. For example, I doubt that I would have caught the flaw in Purdue’s research if Mike hadn’t brought it to my attention. But collectively, we have to do a better job of providing critical coverage of topics like learning analytics, particularly at a time when so much money is being spent and our entire educational system is starting to be remade on the premise that this stuff will work. And there is absolutely no excuse whatsoever for a research university not to take responsibility for its published research on a topic that is so critical to the future of universities.

The post NPR and Missed (Course) Signals appeared first on e-Literate.

A response to Bloomberg article on UCLA student fees

Sat, 2014-07-12 13:56

Megan McArdle has an article that was published in Bloomberg this week about the growth of student fees. The setup of the article was based on a new “$4 student fee to pay for better concerts”.

To solve this problem, UCLA is introducing a $4 student fee to pay for better concerts. That illuminates a budgeting issue in higher education — and indeed among human beings more generally.

That $4 is not a large fee. Even the poorest student can probably afford it. On the other hand, collectively, UCLA’s student fees are significant: more than $3,500, or about a quarter of the mandatory cost of attending UCLA for a year.

Those fees are made up of many items, each trivial individually. Only collectively do they become a major source of costs for students and their families and potentially a barrier to college access for students who don’t have an extra $3,500 lying around.

I’m sympathetic to the argument that college often costs too much and that institutions can play revenue games to avoid the appearance of raising tuition. I also think that Megan is one of the better national journalists on the topic of higher education finance.


However, this article is somewhat sloppy in a way that harms the overall message. I would like to clarify the student fees data to help show the broader point.

Let’s look at the actual data from UCLA’s web site. I assume that Megan is basing this analysis on in-state undergraduate full-time students. The data is listed per quarter, and UCLA has three quarters in a full academic year. I have summarized the data below, summing the three quarters into yearly totals, and you can:

  • Hover over each measure to see the fee description from UCLA’s fee description page;
  • Click on each category that I added to see the component fees;
  • Sort either column; and
  • Choose which rows to keep or exclude.
  • NOTE: A static image is shown above if you cannot see the interactive graphics.

UCLA Fees for In-State Undergrads (Total $3,749.97)

Some Clarifications Needed
  • The total of non-tuition fees is $3,750 per year, not $3,500; however, Megan is right that this represents “about a quarter of the mandatory cost of attending UCLA for a year” ($3,750 out of $14,970).
  • The largest single fee is the UC health insurance fee (UC-SHIP), which is more than half of the total non-tuition fees. This fact (noted by Michael Berman on Twitter) should have been pointed out in the article, given its significant percentage of the total.
  • With the UC-SHIP at $1,938 and the student services fee at $972, I hardly consider these fees “trivial individually”.
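The clarifications above are easy to verify with a bit of arithmetic. In this sketch, the two named fees are from UCLA’s published schedule as quoted in this post; the “other mandatory fees” line is simply the remainder needed to reach the $3,749.97 total, not an actual line item:

```python
# Quick check of the fee figures cited above.
fees = {
    "UC-SHIP health insurance": 1938.00,
    "Student services fee": 972.00,
    "Other mandatory fees": 839.97,  # remainder of the $3,749.97 total
}

total_fees = sum(fees.values())
print(round(total_fees, 2))  # 3749.97 -- i.e., roughly $3,750, not $3,500

# Share of the mandatory cost of attending UCLA for a year ($14,970)
print(round(total_fees / 14970 * 100, 1))  # 25.0 -- "about a quarter"

# UC-SHIP alone as a share of the non-tuition fees
print(round(fees["UC-SHIP health insurance"] / total_fees, 2))  # 0.52
```

So Megan’s “about a quarter” framing holds up even though her $3,500 figure is off, and UC-SHIP alone accounts for just over half of the non-tuition fees.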
Broader Point on Budgeting

The article’s broader point is that using extraneous fees to create additional revenue leads to a flawed budgeting process.

As I’ve written before, this is a common phenomenon that you see among people who have gotten themselves into financial trouble — or, for that matter, people who are doing OK but complain that they don’t know where the money goes and can’t save for the big-ticket items they want. They consider each purchase individually, rather than in the context of a global budget, which means that they don’t make trade-offs. Instead of asking themselves “Is this what I want to spend my limited funds on, or would I rather have something else?” they ask “Can I afford this purchase on my income?” And the answer is often “Yes, I can.” The problem is that you can’t afford that purchase and the other 15 things that you can also, one by one, afford to buy on your income. This is how individual financial disasters occur, and it is also one way that college tuition is becoming a financial disaster for many families.

This point is very important. Look at the Wooden Center fee, described here (or by hovering over chart):

Covers repayment of the construction bond plus the ongoing maintenance and utilities costs for the John Wooden Recreation Center. It was approved by student referendum. The fee is increased periodically based on the Consumer Price Index.

To take Megan’s point, this fee “was approved by student referendum”, which means that UCLA has moved budgeting responsibility away from a holistic approach to saying “the students voted on it”. This makes no financial sense, nor does it make sense to shift bond repayment and maintenance and utilities costs onto student fees.
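Megan’s budgeting trap is, at bottom, simple arithmetic: every fee passes an item-by-item “can I afford this?” test, yet the total is a quarter of the cost of attendance. In this toy sketch, the $4 concert fee, $1,938 UC-SHIP fee, and $972 student services fee are real figures from the post; the other line items and the affordability threshold are invented purely for illustration:

```python
# Toy illustration of the item-by-item budgeting trap described above.
fees = [4, 1938, 972, 113, 52, 671]  # dollars per year; last three invented
per_item_threshold = 2000            # hypothetical "trivial individually" bar

each_looks_affordable = all(fee <= per_item_threshold for fee in fees)
total = sum(fees)

print(each_looks_affordable)  # True: no single fee trips the test
print(total)                  # 3750: collectively, a major cost
```

No single purchase fails the test, so nothing ever forces a trade-off; only a global budget reveals the problem.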

While this article had some sloppy reporting in terms of accurately describing the student fees, it does highlight an important aspect of the budget problems in higher education and how the default method is to shift the costs to students.

The post A response to Bloomberg article on UCLA student fees appeared first on e-Literate.

Blackboard’s Perceptis Acquisition Offers Clues into Company’s Strategy

Wed, 2014-07-09 08:47

Yesterday Blackboard announced that they acquired Perceptis, a provider of help desk and financial aid support services for colleges and universities. In and of itself, this is not a huge acquisition. Perceptis has 33 clients, offers services that Blackboard was already offering, and has no substantial new technology. But as we approach BbWorld next week, the move provides some early hints into the strategic direction that the company may highlight at the conference.

I had the opportunity to talk with Blackboard’s Vice President of Education Services Katie Blot about the move.

There are a couple of different ways to frame help desk services, so I was curious to hear how Blackboard would position it. Katie talked about being “very, very focused on end-to-end learner-centric support” and “supporting learner pathways” for “non-traditional and post-traditional students.” And in the acquisition announcement, Jay Bhatt is quoted as saying,

By combining the Blackboard and Perceptis teams, we will enhance a service model that the industry needs: one that fully supports students from the first moment they are interested in a school to the day they graduate. This is yet another way Blackboard is reimagining education.

While “reimagining education” may be laying it on a little thick in the context of acquiring a help desk service, the reframing of the company mission as supporting students from orientation to graduation is a significant change. I always got the feeling that former CEO Michael Chasen’s role model was Oracle’s Larry Ellison. If you need a piece of software to help you do something important, Ellison will get it and sell it to you. It doesn’t matter too much what kind of software it is, as long as you’re the kind of customer he wants to have. There’s nothing wrong with that per se, but it leads to particular types of business decisions. A friend who used to work at Georgetown University liked to joke that Blackboard probably had some useful insights about his bowel health because he had to swipe his Blackboard-vended key card every time he used the faculty bathroom. Barring an uncommonly expansive definition of what it means to “fully support students,” this is just not the kind of business that the company Jay Bhatt is describing would be likely to get into (although, for the record, Blackboard currently still owns this business).

Interestingly, this is a point that Katie brought up unprompted. She took great pains to emphasize how they are building a “new Blackboard” (which, by implication, is importantly not like the old Blackboard). In the old days, she said, the company made acquisition decisions based primarily on the financial case. “We bought a lot of companies that were not closely aligned with the core.” I would put it slightly differently. I would say that Blackboard did not have the same core that the company leadership is articulating today.

And what is that core? What is the company trying to become? We will likely know more after next week, but by expanding the support services business and positioning it the way they are, the company is trying to move up the value chain, away from being perceived as a software vendor and toward being perceived as a student success-related services vendor. According to Katie, their services business has tripled in the three years since Chasen got Blackboard into the call center support business by acquiring Presidium. The Perceptis move can be seen as doubling down. This puts them in an increasingly crowded space, particularly in online education, with competitors that range from Pearson to 2U to Hobsons. When I asked Katie how the company intends to differentiate itself, she cited two factors. First, they provide an a la carte approach and are avoiding making moves that they believe would potentially either put them in direct competition with their customers or otherwise cannibalize the schools’ core competencies. They are staying out of certain services businesses—she didn’t specify, but I imagine that curriculum development is a good example of what she means—while in others she said they take a “teach to fish” approach, moving more toward the consulting than the outsourcing end of the spectrum. This is not terribly different from the marketing message that Instructure deployed against the MOOC providers when announcing the Canvas Network and may be effective against the textbook publishers and more full-service outsourcing companies.

The second differentiator was interesting too. While Katie emphasized the a la carte message and specifically mentioned that Perceptis was attractive to the company because it served non-Blackboard customers and reinforced the message that they want to provide services to schools using other LMSs, she also said that Blackboard’s knowledge of the learning technology stack and, more importantly, the learning data, gives them an edge in helping their customers make data-driven decisions. There aren’t many service providers who can make that claim right now. To be honest, I’m not sure that Blackboard can either yet. As I have written previously, the heritage of Blackboard’s analytics product is not really with learning analytics, and they are still in the early stages of moving into this space. That said, Phil and I are impressed with their decision to hire John Whitmer as Director for Platform Analytics and Educational Research. As Phil has observed, Instructure has gotten strong benefits from hiring academic Jared Stein. Likewise, Al Essa led some pretty major conceptual work on analytics at Desire2Learn before they lost him to McGraw Hill. John is a solid researcher in the field of learning analytics and just the sort of guy that Blackboard needs to help them figure out how to deliver on their claims that they understand how educational data can provide insights enabling better student support.

Obviously, I’m reading tea leaves here. Speaking of data, Phil and I will both be at BbWorld next week and should have more concrete moves by Blackboard to analyze.

The post Blackboard’s Perceptis Acquisition Offers Clues into Company’s Strategy appeared first on e-Literate.

Roundtable Discussion on Integrative Education July 9th at 2pm EDT

Tue, 2014-07-08 16:19

Tomorrow, July 9th at 2:00pm EDT, I’ll join a great cast to discuss Reinvent the University for the Whole Person: Principles Driving Policy, and I hope many of you can watch. The other participants:

  • Randy Bass (Vice Provost for Education and Professor of English at Georgetown University)
  • Martha Kanter (Distinguished Visiting Professor of Higher Education at New York University & former U.S. Under Secretary of Education)
  • Robert Groves (Provost at Georgetown University)
  • Jeffrey Selingo (Author of College (Un)Bound: The Future of Higher Education and What It Means for Students)
  • Tia Brown McNair (Senior Director for Student Success at the Association of American Colleges & Universities)
  • Anthony Carnevale (Director of the Center on Education & the Workforce at Georgetown University)


The core idea for the series:

American higher education rarely has been more in the national spotlight than with the arrival of new digital technologies and new for-profit education businesses, among other big trends. In this rapidly changing landscape, the old model looks increasingly outmoded and many efforts are underway to begin to transform the system for the 21st century. Most efforts are focusing on making the system more efficient and producing a larger number of graduates to fit in a changing economy.

Very little thought is going into other valuable contributions that universities have provided in the past. Universities also produce future citizens, problem–solvers, leaders – not to mention knowledge that can drive innovation and economic growth. How do we ensure that these other critical outcomes will continue in the future? How can we build on new insights about learning and invent new ways to deliver and measure education that matters for a lifetime? How can we use new tools and approaches that are only available now to carry out the mission of educating for the whole person even better than before?

For the roundtable tomorrow, we’ll discuss:

What are the opportunities for shaping public policy for integrative education in a world that also needs more access, lower costs and workplace preparation? How do we ensure this focus is not elitist?

You can access the discussion on the Reinventors website here.

You can access the discussion within Google+ here.

The post Roundtable Discussion on Integrative Education July 9th at 2pm EDT appeared first on e-Literate.

Instructure’s CTO Joel Dehlin Abruptly Resigns

Mon, 2014-07-07 16:32

One week after the conclusion of Instructure’s Users’ Conference, CTO Joel Dehlin abruptly resigned from the company for a new job. Joel took the CTO job with Instructure in summer 2013, around the same time as Devlin Daley’s departure (Devlin was co-founder). Joel’s resignation comes as a surprise, especially given his prominent placement as the technology lead for the Canvas LMS. As recently as InstructureCon on June 27th, Joel gave the product update presentation.

The change became apparent when I viewed the new Instructure leadership page (nice page design, btw) and noticed that Joel was not included. I contacted Devin Knighton, Director of Public Relations for Instructure, who confirmed that the resignation was unexpected and was Joel’s decision. I am not sure how significant this resignation is for the company. What we do know is that Joel has not been replaced as CTO, but that Jared Stein (VP of Research and Education), Trey Bean (VP of Product), David Burggraaf (VP of Engineering), and Zach Willy (Chief Architect) will cover the CTO responsibilities in the near term. I would have more details, but Devin is on family vacation, and I did not want to push for him to send me an official email.

We’ll keep you posted if we find out more information (assuming it is newsworthy).

Update: Corrected second paragraph on VP of Product and VP of Engineering per Devin Knighton comment below.

The post Instructure’s CTO Joel Dehlin Abruptly Resigns appeared first on e-Literate.

Adaptive Learning Market Acceleration Program (ALMAP) Summer Meeting Notes

Mon, 2014-07-07 05:04

I recently attended the ALMAP Summer Meeting. ALMAP is a program funded by the Gates Foundation, with the goals described in this RFP webinar presentation from March 2013:

We believe that well implemented personalized & adaptive learning has the potential to dramatically improve student outcomes

Our strategy to accelerate the adoption of Adaptive Learning in higher education is to invest in market change drivers… …resulting in strong, healthy market growth

As the program is at its midpoint (without real results to speak of yet), I’ll summarize it Tony Bates-style, with a summary of the program and some notes at the end. Consider this my more-than-140-character response to Glenda Morgan:

@PhilOnEdTech was the agenda of the Gates Summit online at all?

— Glenda Morgan (@morganmundum) June 30, 2014

Originally planned for 10 institutions, the Gates Foundation funded 14 separate grantees at a level of ~$100,000 each. The courses must run for 3 sequential semesters with greater than 500 students total (per school), and the program will take 24 months total (starting June 2013). The awards were given to the following schools:

Gates has also funded SRI International to provide independent research on the results of each grant.

The concept of an accelerator as used by the Gates Foundation is to push adaptive learning past the innovators’ adoption category into the majority category (see RFP webinar).

ALMAP accelerator

The meeting was organized around quick updates from most of the grantees along with panels of their partner software providers (Knewton, ALEKS, CogBooks, Cerego, OLI, ASSISTments, Smart Sparrow), faculty, and several local students. Here is a summary of the meeting agenda.

ALMAP Agenda


Adaptive Learning has become an increasingly hot topic in higher education, and I expect that we will hear more from ALMAP as the results come in. In the meantime, here are some preliminary notes from the meeting (some are my own observations, some are points from group discussions that struck me as very important).

  • Despite the potential importance of this funding program, I can only find one full article (outside of Gates publications) about the program. Campus Technology had an article in April titled “The Great Adaptive Learning Experiment”. David Wiley referred to the program in his take on the risks of adaptive learning. Scientific American (among a few others) described ALMAP in one paragraph of a larger story on Adaptive Learning.
  • We really need a taxonomy to describe Adaptive Learning and Personalized Learning as both terms are moving into buzzword and marketing-speak territory. During the break out groups, it seemed there was unanimous agreement on this problem of a lack of precise terminology. While the Gates Foundation also funded two white papers on Adaptive Learning, I did not hear the ALMAP participants using the embedded taxonomy (see below) to improve language usage. I’m not sure why. I provided a short start in this post before EDUCAUSE, but I think Michael and I will do some more analysis on the field and terminology soon. Michael also has a post that was published in the American Federation of Teachers publication AFT On Campus, titled “What Faculty Should Know About Adaptive Learning”, that is worth reading.
  • The above problem (lack of accepted taxonomy, different meanings of adaptive), along with faculty flexibility in determining how to use the software, will make the research challenging, at least in terms of drawing conclusions across the full set of experiments. SRI has its work cut out for them.
  • There appears to be a divide in the vendor space between publisher models, where the content is embedded with the platform, and a platform-only model, where content is provided from external sources. Examples of the former include ALEKS, Adapt Courseware and OLI. Examples of the latter include ASSISTments, Smart Sparrow, CogBooks, Cerego. Cerego might be the only example where they provide “starter” content but also allow the user to provide or integrate their own content. Credit to Neil Heffernan from WPI and ASSISTments for this observation over drinks.
  • Programs of this type (pushing innovation and driving for changes in behavior) should not be judged by the first semester of implementation, while faculty are still figuring out the new approach. Real results should be judged starting in the second semester, and one attendee even recommended avoiding publication of results until the third semester. This is the primary reason I am choosing not to describe the individual programs or early results yet.
  • Kudos to the Gates Foundation for including a student panel (like 20MM Evolve and upcoming WCET conference). Below are a few tweets I sent during this panel.

Student on panel: Profs matter a lot – could tell the ones who don't like teaching. Ones who love teaching are contagious, her best classes.

— Phil Hill (@PhilOnEdTech) June 27, 2014

Conversely, fac who use tech poorly – don't understand, no instructions, no effort to use well – have very negative impact on students

— Phil Hill (@PhilOnEdTech) June 27, 2014

Whether it's from prof or from adaptive sw (or both), student panel wants clear instructions on assignments, timely feedback

— Phil Hill (@PhilOnEdTech) June 27, 2014

Expect to hear more from e-Literate as well as e-Literate TV not only on the ALMAP awardees and their progress, but also from the general field of personalized and adaptive learning.

Below is the taxonomy provided as part of the Gates-funded white paper from Education Growth Advisors.

AL Whitepaper Taxonomy


Update: I did not mention the elephant in the room for adaptive learning – whether software will replace faculty – because it was not an elephant in this room; however, this is an important question in general.

@ricetopher Good point. Unclear if gates funded automation would eliminate teachers… Are we becoming the machine? @PhilOnEdTech

— Whitney Kilgore (@whitneykilgore) July 7, 2014

At the ALMAP meeting, I believe that most grantees had faculty members present. From these faculty members (including a panel specifically on faculty experiences), there were discussions about changing roles (“role is facilitator, coach, lifeguard in a sense”), the fact that faculty were requested to participate rather than initiate the change, and the challenge of getting students to come to class for hybrid models. One faculty member mentioned that the adaptive software allows more instruction on real writing and less on skill-and-drill activities.

But the way the grantees implemented adaptive learning software was not based on replacing faculty, at least for this program.

The post Adaptive Learning Market Acceleration Program (ALMAP) Summer Meeting Notes appeared first on e-Literate.

Fall 2012 US Distance Education Enrollment: Now viewable by each state

Wed, 2014-07-02 23:15

Starting in late 2013, the National Center for Educational Statistics (NCES) and its Integrated Postsecondary Education Data System (IPEDS) started providing preliminary data for the Fall 2012 term that for the first time includes online education. Using Tableau (thanks to Justin Menard for prompting me to use this), we can now see a profile of online education in the US for degree-granting colleges and universities, broken out by sector and for each state.

Please note the following:

  • For the most part the terms distance education and online education are interchangeable, but they are not equivalent, as DE can include courses delivered by a medium other than the Internet (e.g. correspondence courses).
  • There are three tabs below – the first shows totals for the US by sector and by level (grad, undergrad); the second also shows the data for each state (this is new); the third shows a map view.


The post Fall 2012 US Distance Education Enrollment: Now viewable by each state appeared first on e-Literate.

Is the DOE backing down on proposed State Authorization regulations?

Thu, 2014-06-26 08:25

Now witness the firepower of this fully written and delivered WCET / UPCEA / Sloan-C letter!

- D. Poulin

One of the policies that we’re tracking at e-Literate is the proposed State Authorization regulation that the US Department of Education (DOE) has been pushing. The latest DOE language represents a dramatic increase in federal control of distance education and in bureaucratic compliance required of institutions and states. In the most recent post we shared a letter from WCET, UPCEA and Sloan-C to Secretary Duncan at the DOE.

What does it take to get all of the higher education institutions and associations to agree? Apparently the answer is for the Department of Education to propose its new State Authorization regulations. [snip]

Here’s what is newsworthy – the idea and proposed language are so damaging to innovation in higher ed (which the DOE so fervently supports in theory) and so burdensome to institutions and state regulators that three higher ed associations have banded together to oppose the proposed rules. WCET (WICHE Cooperative on Educational Technologies), UPCEA (University Professional and Continuing Education Association) and Sloan-C (Sloan Consortium) wrote a letter to Secretary Arne Duncan calling for the DOE to reconsider their planned State Authorization regulations.

While it is unclear how direct an impact the letter had, yesterday brought welcome news from Ted Mitchell at the DOE: they have effectively paused their efforts to introduce new State Authorization regulations. As described at Inside Higher Ed:

The Obama administration is delaying its plan to develop a controversial rule that would require online programs to obtain approval from each and every state in which they enroll students, a top Education Department official said Wednesday.

Under Secretary of Education Ted Mitchell said that the administration would not develop a new “state authorization” regulation for distance education programs before its November 1 deadline.
“We, for all intents and purposes, are pausing on state authorization,” Mitchell said during remarks at the Council for Higher Education Accreditation conference. “It’s complicated, and we want to get it right.”

Mitchell said he wanted to make sure the regulation was addressing a “specific problem” as opposed to a general one. The goal, he said, should be to promote consumer protection while also allowing for innovation and recognizing that “we do live in the 21st century and boundaries don’t matter that much.”

It gets better. Mitchell made this statement while at a workshop for the Council for Higher Education Accreditation, and his speech mentioned his desire to clean up some of the regulatory burden on accrediting agencies. As described at the Chronicle:

Ted Mitchell, the under secretary of education, told attendees at a workshop held by the Council for Higher Education Accreditation that accreditors’ acceptance of more responsibility over the years for monitoring colleges had created “complicated expectations for institutions, regulators, politicians, and the public.”

Much of the work accreditors do to ensure that colleges comply with federal regulations is “less appropriate to accreditors than it may be to the state or federal government,” said Mr. Mitchell, who is the No. 2 official in the Department of Education and oversees all programs related to postsecondary education and federal student aid.

“If I could focus on a spot today,” he said, “it would be the compliance work and seeing if we could relieve accreditors of the burden of taking that on for us.”

This is just a speech, and we do not know what the DOE will eventually propose (or not) on State Authorization. But it is certainly a welcome sign that the department has heard the concerns of many in the higher education community.

Update: See Russ Poulin’s blog post at WCET with more context and inside info.

WCET joined with Sloan-C and UPCEA to write a letter to Education Secretary Arne Duncan and Under Secretary Mitchell about our concerns with the direction the Department was taking and to give recommendations on how the Department might proceed. I have also been talking with numerous groups and individuals that have been writing their own letters or have used their contacts.

On Tuesday of this week, Marshall Hill (Executive Director of the National Council on State Authorization Reciprocity Agreements) and some high-ranking members of the National Council leadership board met with Mr. Mitchell. According to Marshall, Mr. Mitchell was aware of many of the concerns that they raised and was very supportive of reciprocity. From that meeting, Mr. Mitchell indicated that more work needed to be done, but did not suggest the delay.

Mr. Mitchell’s reference in the Inside Higher Ed article about addressing a “specific problem” showed that our message was being heard.

The post Is the DOE backing down on proposed State Authorization regulations? appeared first on e-Literate.

WWW-based online education turns 20 this summer

Tue, 2014-06-24 17:01

I’m a little surprised that this hasn’t gotten any press, but Internet-based online education turns 20 this summer. There were previous distance education programs that used networks of one form or another as the medium (e.g. University of Phoenix established its “online campus” in 1989), but the real breakthrough was the use of the world wide web (WWW), effectively creating what people most commonly know as “the Internet”.

To the best of my knowledge (correct me in comments if there are earlier examples), the first accredited school to offer a course over the WWW was the Open University in a pilot Virtual Summer School project in the summer of 1994. The first course was in Cognitive Psychology, offered to 12 students, as described in this paper by Marc Eisenstadt and others involved in the project (the HTML no longer renders):

In August and September 1994, a Virtual Summer School (VSS) for Open University undergraduate course D309 Cognitive Psychology enabled students to attend an experimental version of summer school ‘electronically’, i.e. from their own homes using a computer and a modem. VSS students were able to participate in group discussions, run experiments, obtain one-to-one tuition, listen to lectures, ask questions, participate as subjects in experiments, conduct literature searches, browse original journal publications, work in project teams, undertake statistical analyses, prepare and submit nicely formatted individual or joint written work, prepare plenary session presentations, and even socialize and chit-chat, all without ever leaving their homes. The term ‘Virtual Summer School’ was used to mean that the software packages supplied to students emulate many aspects of a residential summer school, but without requiring physical attendance. As with many other Open University activities, we feel that face-to-face tuition and peer group interaction would still be preferable if it could be achieved. However, there are sometimes circumstances which preclude physical attendance, so we want to provide the best possible alternative. Virtual Summer School was a first step in this direction. This year, it was only an experimental option for a dozen already-excused students, which gave us a low-risk entry in order to assess the viability of the approach.

There is even a concept video put together by the Open University at the end of 1994 that includes excerpts of the VSS course.

And now for your trip down memory lane, I have taken the paper, cleaned up the formatting, and fixed / updated / removed the links that no longer work. The modified paper is below for easier reading:


Virtual Summer School Project, 1994



One of the great strengths of the UK’s Open University is its extensive infrastructure, which provides face-to-face tuition through a network of more than 7000 part-time tutors throughout the UK and Europe. This support network, combined with in-house production of high-quality text and BBC-produced videos, provides students with much more than is commonly implied by the phrase ‘distance teaching’! Moreover, students on many courses must attend residential schools (e.g. a one-week summer school to gain experience conducting Biology experiments), providing an additional layer of support. About 10% of students have genuine difficulty attending such residential schools, and increasingly we have started to think about addressing the needs of students at a greater distance from our base in the UK. This is where the Virtual Summer School comes in.

The Cognitive Psychology Virtual Summer School

In August and September 1994, a Virtual Summer School (VSS) for Open University undergraduate course D309 Cognitive Psychology enabled students to attend an experimental version of summer school ‘electronically’, i.e. from their own homes using a computer and a modem. VSS students were able to participate in group discussions, run experiments, obtain one-to-one tuition, listen to lectures, ask questions, participate as subjects in experiments, conduct literature searches, browse original journal publications, work in project teams, undertake statistical analyses, prepare and submit nicely formatted individual or joint written work, prepare plenary session presentations, and even socialize and chit-chat, all without ever leaving their homes. The term ‘Virtual Summer School’ was used to mean that the software packages supplied to students emulate many aspects of a residential summer school, but without requiring physical attendance. As with many other Open University activities, we feel that face-to-face tuition and peer group interaction would still be preferable if it could be achieved. However, there are sometimes circumstances which preclude physical attendance, so we want to provide the best possible alternative. Virtual Summer School was a first step in this direction. This year, it was only an experimental option for a dozen already-excused students, which gave us a low-risk entry in order to assess the viability of the approach.

Below we describe the technology involved, evaluation studies, and thoughts about the future.

The Technology

Three main categories of technology were required: communications & groupwork tools, support & infrastructure software/hardware, and academic project software.

Communications and Groupwork
  • Email, Usenet newsgroups, live chat lines and low-bandwidth (keyboard) conferencing: this technology was provided by FirstClass v. 2.5 from SoftArc in Toronto, and gave students a nice-looking veneer for many of their day-to-day interactions. A ‘Virtual Campus’ map appeared on their desktops, and folder navigation relied on a ‘room’ metaphor to describe crucial meeting places and bulletin boards.
  • WWW access: NCSA Mosaic 1.0.3 for Macintosh was provided for this purpose [in the days before Netscape was released]. Students had customized Hotlists which pointed them to academically-relevant places (such as Cognitive & Psychological Sciences on The Internet), as well as some fun places.
  • Internet videoconferencing: Using Cornell University’s CU-SeeMe, students with ordinary Macs or Windows PCs (even over dial-up lines from home) were able to watch multiple participants around the world. Video transmission from slightly higher-spec Macs & PCs was used for several Virtual Summer School events, including a Virtual Guest Lecture by Donald A. Norman, formerly Professor of Psychology at the University of California at San Diego (founder of its Cognitive Science Programme), and now an Apple Fellow.
  • Remote presentation software: we used a product called ‘The Virtual Meeting’ (from RTZ in Cupertino), which allowed synchronized slide & movie presentations on remote Macs & PCs distributed across local, wide, or global (including dial-in) networks, displayed images of all remote ‘participants’, and facilitated moderated turn-taking, ‘hand-raising’, interactive whiteboard drawing & question/answer sessions.
  • Mobile telephone support and voice conferencing: every VSS student was supplied with an NEC P100 cellular phone, so that they could use it while their domestic phone was busy with their modem (some day they’ll have ISDN or fibre optic lines, but not this year). Audio discussions were facilitated by group telephone conference calls, run concurrently with CU-SeeMe and other items shown above. Our largest telephone conference involved 17 participants, and worked fine given that basic politeness constraints were obeyed.
  • Remote diagnostic support and groupwork: Timbuktu Pro from Farallon, running over TCP/IP, enabled us to ‘cruise in’ to our students’ screens while chatting to them on their mobile phones, and to help them sort out specific problems. Students could also work in small self-moderated groups this way, connecting as observers to one user’s Macintosh.
Support and infrastructure software/hardware
  • Comms Infrastructure: TCP/IP support was provided by a combination of MacTCP, MacPPP, VersaTerm Telnet Tool on each student’s machine, plus an Annex box at The Open University connecting to a Mac Quadra 950 running a FirstClass Server and 3 Suns running cross-linked CU-SeeMe reflectors.
  • Tutorial Infrastructure: each student was supplied with HyperCard, MoviePlay, and SuperCard 1.7 to run pre-packaged tutorial and demonstration programs, some of which were controlled remotely by us during group presentations. Pre-packaged ‘guided tour’ demos of all the software were also provided (prepared with a combination of MacroMind Director and CameraMan). To help any computer-naive participants ‘bootstrap’ to the point where they can at least send us an email plea for help, we also supplied a short video showing them how to unpack and connect all of their equipment, and how to run some of the demos and FirstClass.
  • Hardware: one of our aims was to foreshadow the day in the near future when we can presuppose that (a) most students will be computer-literate, (b) students will have their own reasonable-specification hardware, (c) bandwidth limitations will not be so severe, and (d) all of our software will be cross-platform (e.g. Mac or Windows). We could only approximate that in 1994, so we supplied each VSS student with a Macintosh LC-II with 8MB of RAM, a 14.4Kbps modem, a StyleWriter-II printer, 13″ colour monitor, mobile phone and extra mobile phone battery. Students were given a conventional video cassette showing how to set up all the equipment (see tutorial infrastructure above).
Academic project software

Our students had four main support packages to help them in their Cognitive Psychology studies:

  • a custom-built ‘Word Presentation Program’, which allowed them to create stimuli for presentation to other students and automatically record data such as reaction times and button presses (they could create a turnkey experiment package for emailing to fellow students, and then have results emailed back);
  • a HyperCard-based statistics package, for analysing their data;
  • MacProlog from Logic Programming Associates in the UK, for writing simple Artificial Intelligence and Cognitive Simulation programs;
  • ClarisWorks, for preparing reports and presentations, reading articles that we emailed to them as attachments, and doing richer data analyses.
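The ‘Word Presentation Program’ above was a custom Mac application, but its core loop is easy to picture. Here is a minimal modern sketch in Python (all names hypothetical, and `input` standing in for the original’s on-screen stimulus display): present each word, log the response and reaction time, and collect results that could be sent back to the experimenter.

```python
import random
import time

def run_word_trials(words, present=input):
    """Present each word in random order and record the response and
    reaction time -- loosely mimicking the VSS Word Presentation
    Program's stimulus-presentation and data-logging loop."""
    trial_words = list(words)          # copy so the caller's list is untouched
    random.shuffle(trial_words)        # randomise presentation order
    results = []
    for word in trial_words:
        start = time.perf_counter()
        response = present(f"{word} -> ")          # participant responds
        rt = time.perf_counter() - start           # reaction time in seconds
        results.append({"stimulus": word,
                        "response": response,
                        "rt_seconds": rt})
    return results    # in 1994 these data were emailed back to fellow students
```

A turnkey experiment, in these terms, is just this loop bundled with a fixed stimulus list, so another student can run it and mail the `results` back.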
Timetable and evaluation

Students had a three-week warmup period in order to become familiar with their new equipment and run some trial (fun) activities with every piece of software, and formal academic activities took place from August 27th – Sept. 9th, 1994, mostly in the evenings. Thus, the conventional one-week residential summer school was stretched out for two weeks to allow for part-time working. During week one the students concentrated on experimental projects in the area of “Language & Memory” (typically demonstrating inferences that “go beyond the information given”). During week two the students wrote simple AI programs in Prolog that illustrate various aspects of cognitive processing (e.g. simulating children’s arithmetic errors). They were supplied with Paul Mulholland’s version of our own Prolog trace package (see descriptions of our work on Program Visualization) to facilitate their Prolog debugging activities.
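The students’ Prolog programs simulated children’s arithmetic errors in the tradition of Brown and Burton’s “buggy procedures” work. As a flavour of the exercise, here is a sketch in Python (the students used MacProlog; this translation is mine) of one classic bug, ‘smaller-from-larger’: in each column the child subtracts the smaller digit from the larger instead of borrowing.

```python
def smaller_from_larger(a, b):
    """Simulate the classic 'smaller-from-larger' subtraction bug:
    the child takes the absolute difference of the digits in each
    column and never borrows, so 81 - 38 yields 57 instead of 43.
    Assumes a has at least as many digits as b."""
    da = str(a)
    db = str(b).rjust(len(da), "0")    # pad b with leading zeros
    digits = [abs(int(x) - int(y)) for x, y in zip(da, db)]
    return int("".join(str(d) for d in digits))
```

Comparing the buggy output against correct subtraction over a set of problems is exactly the kind of diagnostic modelling the week-two projects explored.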

A detailed questionnaire was supplied both to the Virtual Summer School students and to conventional summer school students taking the same course. We looked at how students spent their time, which activities were beneficial for them, and many other facets of their Virtual Summer School experience.

[removed reference to Kim Issroff's paper and student interviews, as all links were broken]

The future

The Virtual Summer School finished on 9th September 1994 (following our Virtual Disco on 8th September 1994, incidentally…. we told students about music available on the World Wide Web for private use). What happens next? Here are several issues of importance to us:

  • We must lobby for ever-increasing ‘bandwidth’ [i.e. channel capacity, reflected directly in the amount and quality of full-colour full-screen moving images and quality sound that can be handled]. This is necessary not only for Open University students, but also for the whole of the UK, and indeed for the whole world. As capacity and technology improve, so do public expectation and need [analogous to the way the M25 motorway was overfull with cars the first day it opened: the technology itself helps stimulate demand]. Whatever the current ‘Information SuperHighway’ plans are [just like motorway construction plans], there is a concern that they don’t go far enough.
  • We must RADICALLY improve both (i) the user interfaces and (ii) the underlying layers of communications tools. Even with the excellent software and vendor support that we had at our disposal, all the layers of tools needed (TCP/IP, PPP, Communications Toolbox, etc.) made a veritable house of cards. The layers of tools were (i) non-trivial to configure optimally in the first place (for us, not the students); (ii) non-trivial to mass-install as ‘turnkey’-ready systems for distribution to students; (iii) non-trivial for students to use straight ‘out of the box’ (naturally almost everything in the detailed infrastructure is hidden from the students, but one or two items must of necessity rear their ugly heads, and that gets tricky); and (iv) ‘temperamental’ (students could get interrupted or kicked off when using particular combinations of software). We were fully prepared for (iv), because that’s understandable in the current era of communicating via computers, but (i), (ii), and (iii) were more surprising. [If anyone doubts the nature of these difficulties, I hereby challenge them to use Timbuktu Pro, a wonderful software product, with 4 remotely-sited computer-naive students using TCP/IP over a dial-up PPP connection.] We can do better, and indeed we MUST do better in the future. Many vendors and academic institutions are working on these issues, and they need urgent attention.
  • We must obtain a better understanding of the nature of remote groupwork. Our students worked in groups of size 2, 3, or 4 (depending on various project selection circumstances). Yet even with pre-arranged group discussions by synchronous on-line chat or telephone conference calls, a lot of fast-paced activity would suddenly happen, involving just one student and one tutor. For example, student A might post a project idea to a communal reading area accessible only to fellow project-group students B and C and also tutor T. Tutor T might post a reply with some feedback, and A might read it and react to it before B and C had logged in again. Thus, A and T would have inadvertently created their own ‘shared reality’: a mini-dialogue INTENDED for B and C to participate in as well, yet B and C would get left behind just because of unlucky timing. The end result in this case would be that students A, B, and C would end up doing mostly individual projects, rather than a group project. Tutors could in future ‘hold back’, but this is probably an artificial solution. The ‘shared reality’ between A and T in the above scenario is no different from what would happen if A cornered T in the bar after the day’s activities had finished at a conventional Summer School. However, in that situation T could more easily ensure that B and C were brought up to date the next day. We may ultimately have to settle for project groups of size 2, but not before doing some more studies to try to make larger groups (e.g. size 4) much more cohesive and effective.
  • We need to improve ‘tutor leverage’ (ability to reach and influence more people). Let’s suppose that we have thoroughly researched and developed radical improvements for the three items above (more bandwidth, nice user interfaces with smooth computer/communications infrastructure, happy cohesive workgroups of size 4). It would be a shame if, after all that effort and achievement, each tutor could only deal with, say, 3 groups of 4 students anywhere in the world. The sensory overload for tutors at the existing Virtual Summer School was considerable… many simultaneous conversations and many pieces of software and technology running at once. The 1994 Virtual Summer School was (of necessity) run by a self-selecting group of tutors who were competent in both the subject matter and the technology infrastructure. Less technologically-capable tutors need to be able to deal with larger numbers of students in a comfortable fashion, or Virtual Summer School will remain quite a ‘niche’ activity.
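The ‘hold back’ idea mentioned in the groupwork discussion can be made concrete with a toy model. Nothing like this existed in FirstClass; the sketch below (Python, entirely hypothetical names) simply illustrates one mechanical way to prevent a tutor/student ‘shared reality’: replies to a group posting are queued until every member of the project group has read the original.

```python
from dataclasses import dataclass, field

@dataclass
class GroupThread:
    """Toy model of a 'hold back' rule: replies stay invisible until
    every group member has seen the original posting, so no subset of
    the group races ahead of the others."""
    members: set
    seen_original: set = field(default_factory=set)
    held_replies: list = field(default_factory=list)
    visible_replies: list = field(default_factory=list)

    def read_original(self, student):
        """Record that a student has read the original posting; once
        everyone has, release any queued replies."""
        self.seen_original.add(student)
        if self.seen_original >= self.members:
            self.visible_replies.extend(self.held_replies)
            self.held_replies.clear()

    def post_reply(self, text):
        """Queue the reply unless the whole group has caught up."""
        if self.seen_original >= self.members:
            self.visible_replies.append(text)
        else:
            self.held_replies.append(text)
```

As the paper itself notes, this kind of rule is probably artificial; the real research question is how to make larger groups cohesive without throttling the fast-paced exchanges that made the tutoring valuable.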

The four areas above (more bandwidth, better computer/comms interfaces, larger workgroups, increased tutor leverage) are active areas of research for us…. stay tuned (and see what we’re now doing in KMi Stadium)!

Who made it work?
  • Marc Eisenstadt: VSS Course Director, Slave Driver, and Fusspot
  • Mike Brayshaw: VSS Tutor & Content Wizard
  • Tony Hasemer: VSS Tutor & FirstClass Wizard
  • Ches Lincoln: VSS Counsellor and FirstClass Guru
  • Simon Masterton: VSS Academic Assistant, Mosaic Webmaster, and Mobile Phone Guru
  • Stuart Watt: VSS Mac Wizard
  • Martin Le Voi: VSS Memory/Stats Advisor & Unix Guru
  • Kim Issroff: VSS Evaluation and Report
  • Richard Ross: VSS Talking Head Guided Tour
  • Donald A. Norman (Apple, Inc.): VSS Virtual Guest Lecturer
  • Blaine Price: Unix & Internet Guru & Catalyst
  • Adam Freeman: Comms & Networking Guru
  • Ian Terrell: Network Infrastructure Wizard
  • Mark L. Miller (Apple, Inc.): Crucial Guidance
  • Christine Peyton (Apple UK): Support-against-all-odds
  • Ortenz Rose: Admin & Sanity Preservation
  • Elaine Sharkey: Warehousing/Shipping Logistics

Update: Changed title and Internet vs. WWW language to avoid post-hoc flunking of Dr. Chuck’s IHTS MOOC.

The post WWW-based online education turns 20 this summer appeared first on e-Literate.