
Michael Feldstein

What We Are Learning About Online Learning...Online

e-Literate TV Preview: Essex County College and changing role of faculty

Wed, 2015-02-25 17:58

By Phil Hill

As we get closer to the release of the new e-Literate TV series on personalized learning, Michael and I will be posting previews highlighting some of the more interesting segments from the series. When we first talked about the series with its sponsors, the Bill & Melinda Gates Foundation, they agreed to give us the editorial independence to report what we find, whether it is good, bad, or indifferent.

In this video preview (about 4:18 in duration), we hear from two faculty members who have first-hand experience in using a personalized learning approach as well as a traditional approach to remedial math. We also hear from students on what they are learning about learning. In our case studies so far, the real faculty issue is not that software is being designed to replace faculty, but rather that successful implementation of personalized learning necessarily changes the role of faculty. One of our goals with e-Literate TV is to allow faculty, staff and students to describe direct experiences in their own words. Take a look.

[Embedded video]

Stay tuned for the full episodes to be released on the In The Telling platform[1]. You can follow me (@PhilOnEdTech), Michael (@mfeldstein67), or e-Literate TV (@eLiterateTV) to stay up to date. You can also follow the e-Literate TV YouTube channel. We will also announce the release here on e-Literate.

  1. ITT is our partner in developing this series, providing video production as well as the platform.

The post e-Literate TV Preview: Essex County College and changing role of faculty appeared first on e-Literate.

First View of Bridge: The new corporate LMS from Instructure

Tue, 2015-02-24 04:41

By Phil Hill

Last week I covered the announcement from Instructure that they had raised another $40 million in venture funding and were expanding into the corporate learning market. Today I was able to see a demo of their new corporate LMS, Bridge. While Instructure has very deliberately designed a separate product from Canvas, their education-focused LMS, you can see the same philosophy of market strategy and product design embedded in the new system. In a nutshell, Bridge is designed to be a simple, intuitive platform that moves control of the learning design away from central HR or IT control and closer to the end user.

While our primary focus at e-Literate is on higher ed and even some K-12 learning, the professional development and corporate training markets are becoming more important even in the higher ed context. At the least, this is important for those who are tracking Instructure and how the company’s plans might affect the future of education platforms.

The core message of Instructure regarding Bridge – just as with Canvas – is that it is focused on ease-of-use whereas the entrenched competition has fallen prey to feature bloat driven by edge cases. But despite this claim, and despite Instructure’s track record with Canvas, what does it actually mean? I’m pretty sure every vendor out there claims ease-of-use, whether their designs are elegant or terrible[1].

Based on the demo, Bridge appears to define ease-of-use in three distinct areas – streamlined, clutter-free interface for learners, simple tools for content creation by business units, and simple tools for managing learners and content.

Learner User Experience

Bridge has been designed over the past year based on Instructure’s decision to avoid force-fitting Canvas into corporate learning markets. The core use cases of this new market are far simpler than education use cases, and the resultant product has fewer bells and whistles than Canvas. In Instructure’s view, the current market has such cumbersome products that learning platforms are mostly used just for compliance – take this course or you lose your job – and not at all for actual learning. The Bridge interface (shown below on both mobile and laptop) is simple.

[Screenshot: mobile interface matching the laptop view]

[Screenshot: learner progress view]

While this is a clean interface, I don’t see it as being that big of a differentiator or rationale for a new product line.

Content Creation

The content creation tools, however, start to show Instructure’s distinctive approach. They have made their living on being able to say no – refusing to let user requests for additional features change their core design principles. The approach for Bridge is to assume that content creators have no web design or instructional design experience, providing them with simple formatting and suggestion-based tools to make content creation easy. The formatting looks to be on the level of Google Docs or basic WordPress, rather than Microsoft Word.

[Screenshot: content authoring tool]

When creating new content, the Bridge LMS even puts up prompts for pre-formatted content types.

[Screenshot: prompts for pre-formatted content types]

When creating quizzes, they have an interesting tool that adds natural language processing to facilitate simple questions that can be randomized. The author could write a simple sentence of what they are trying to convey to users, such as “Golden Gate Bridge is in San Francisco”. The tool selects each word and allows the author to add alternative objects that can serve in a quiz, such as suggesting San Mateo or San Diego (it is not clear if you can group words to replace the full “San Francisco” rather than “Francisco”). The randomized quiz questions could then be automatically created.

[Screenshot: quiz creation tool]
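As a rough illustration of how such a tool might work, here is a minimal Python sketch (all names are hypothetical, and this is not Bridge’s actual implementation; for simplicity it treats a multi-word phrase like “San Francisco” as a single unit):

```python
import random

def make_quiz_items(sentence, answers_by_word, n_items=3, seed=None):
    """Generate simple fill-in-the-blank items from one sentence.

    answers_by_word maps a word (or phrase) in the sentence to a list
    of plausible-but-wrong alternatives (distractors).
    """
    rng = random.Random(seed)
    items = []
    for word, distractors in answers_by_word.items():
        blanked = sentence.replace(word, "_____", 1)  # blank out the target
        options = [word] + list(distractors)
        rng.shuffle(options)  # randomize option order per item
        items.append({"prompt": blanked, "options": options, "answer": word})
    rng.shuffle(items)  # randomize item order across attempts
    return items[:n_items]

items = make_quiz_items(
    "Golden Gate Bridge is in San Francisco",
    {"San Francisco": ["San Mateo", "San Diego"]},
    seed=42,
)
# items[0]["prompt"] → "Golden Gate Bridge is in _____"
```

The seed parameter is there only to make the randomization reproducible for testing; a real tool would shuffle differently for each learner.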

For content that is more complex, Instructure is taking the approach of saying ‘no’ – go get that content via a SCORM/AICC import from a more complex authoring tool.

Learner Administration Tools

Rather than relying on complex HR systems to manage employees, Bridge goes with a CSV import tool that reminds me of Tableau in that it pre-imports, shows the fields, and allows a drag-and-drop selection and re-ordering of fields for the final import[2].

[Screenshot: CSV learner import tool]
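The workflow described – preview the fields first, then let the user select and re-order columns before the final import – can be sketched in a few lines of Python using the standard csv module (a hypothetical illustration, not Bridge’s actual code):

```python
import csv
import io

def preview_fields(csv_text):
    """Read only the header row so the user can see what's in the file."""
    reader = csv.reader(io.StringIO(csv_text))
    return next(reader)

def import_learners(csv_text, field_map):
    """Import rows, keeping only the selected columns in the chosen order.

    field_map is an ordered mapping {source_column: target_field},
    standing in for the user's drag-and-drop selection.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        {target: row[source] for source, target in field_map.items()}
        for row in reader
    ]

raw = "Email,First,Last,Dept\nana@example.com,Ana,Lopez,Sales\n"
preview_fields(raw)  # → ['Email', 'First', 'Last', 'Dept']
learners = import_learners(raw, {"Last": "last_name", "Email": "email"})
# → [{'last_name': 'Lopez', 'email': 'ana@example.com'}]
```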

The system can also create or modify groups based on rules.

[Screenshot: group creation tool]

To pull this together, Bridge attempts to automate as much of the background process as is feasible. To take one example, when you hire a new employee or change the definition of groups, the system retroactively adds the revised list of learners or groups to assigned courses.
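One way to picture this rule-based, retroactive behavior: groups are defined as predicates over learner attributes, and membership plus course assignments are simply recomputed whenever learners or rules change. A minimal Python sketch (hypothetical names, not Bridge’s actual design):

```python
def rebuild_groups(learners, group_rules):
    """Recompute every group from its rule, so membership stays current
    when learners are hired or rules are redefined."""
    return {
        name: [l["id"] for l in learners if rule(l)]
        for name, rule in group_rules.items()
    }

def assignments(groups, course_groups):
    """Expand group-to-course mappings into per-learner course assignments."""
    out = {}
    for course, group_names in course_groups.items():
        out[course] = sorted({lid for g in group_names for lid in groups.get(g, [])})
    return out

learners = [
    {"id": "ana", "dept": "Sales", "hired": 2015},
    {"id": "bo", "dept": "Engineering", "hired": 2014},
]
rules = {
    "sales": lambda l: l["dept"] == "Sales",
    "new_hires": lambda l: l["hired"] >= 2015,
}
groups = rebuild_groups(learners, rules)
# → {'sales': ['ana'], 'new_hires': ['ana']}
assignments(groups, {"Compliance 101": ["new_hires"]})
# → {'Compliance 101': ['ana']}
```

Because everything is derived from the rules, hiring a new employee or editing a rule and re-running the computation automatically updates who is assigned to which course, which matches the retroactive behavior described above.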

For live training, you can see where Bridge takes the opposite approach to Canvas. In Canvas (as with most education LMSs), it is assumed that more time in the system means more time learning – the core job of learners. In Bridge, however, the assumption is that LMS time-on-task should be minimized. For compliance training in particular, you want the employee to spend as little time as reasonable training so they can get their real job done. Bridge focuses not on the live training itself but rather on the logistics tasks in setting up the course (scheduling, registering, taking attendance).

[Screenshot: live training tools]

Prospects and Implications

Taken together, the big story here is that Instructure seeks to change the situation where learning management in corporations is cloistered within HR, IT and instructional design units. As they related today, they want to democratize content creation and center learning in the business units where the subject matter experts reside.

Their future plans focus on engagement – getting feedback and dialogue from employees rather than just one-way content dissemination and compliance. If they are successful, this is where they will gain lasting differentiation in the market.

What does this mean from a market perspective? Although I do not have nearly as much experience with corporate training as I do with higher education, this LMS seems like a real system and a real market entry into corporate learning. The primary competitors in this space are not Blackboard, as TechCrunch and Buzzfeed implied, but are Saba, SumTotal, SuccessFactors, Cornerstone, etc. Unlike education, this is a highly fragmented market. I suspect that this means that the growth prospects for Instructure will be slower than in education, but real nonetheless. Lambda Solutions shared the Bersin LMS study to give a view of the market.

[Chart: Bersin corporate LMS market study, via Lambda Solutions]

This move is clearly timed to help with Instructure’s planned IPO that could happen as soon as November 2015[3]. Investors can now see potential growth in an adjacent market to ed tech where they have already demonstrated growth.

I mentioned in my last post that the biggest risk I see is management focus and attention. I suspect with their strong fund-raising ($90 million to date) that the company has enough cash to hire staff for both product lines, but senior management will oversee both the Canvas and the Bridge product lines and markets.

  1. Although I would love to see the honest ad: “With a horrible, bloated user interface based on your 300-item RFP checklist!”
  2. I assume they can integrate with HR systems as well, but we did not discuss this aspect.
  3. Note this is based on my heuristic analysis and not from Instructure employees.

The post First View of Bridge: The new corporate LMS from Instructure appeared first on e-Literate.

ITC #eLearning2015 Keynote Video and Material

Sat, 2015-02-21 17:20

By Phil Hill

This past week I had the opportunity to provide the keynote at the Instructional Technology Council (ITC) eLearning2015 conference in Las Vegas. ITC is a great group that provides leadership and professional development to community and junior college faculty and staff working in online education, and increasingly in hybrid course models. To save time on individual sharing, I have included most of the material below.

Here is the MediaSite recording of the keynote:

And here are the slides in SlideShare:

And here is the YouTube channel for e-Literate TV. The Essex County College clip is a sneak preview of an upcoming e-Literate TV case study on personalized learning (more on that in the next post).

[Embedded video]

Finally, here are the two clips from the WCET14 student panel:

Need for some level of standardization:

[Embedded video]

Need for interaction:

[Embedded video]

And last, but certainly not least, the infamous Underpants Gnome video:

[Embedded video]

The post ITC #eLearning2015 Keynote Video and Material appeared first on e-Literate.

What TechCrunch Got Wrong (and Right) About Instructure Entering Corporate Learning Market

Thu, 2015-02-19 17:08

By Phil Hill

After yesterday’s “sources say” report from TechCrunch about Instructure – maker of the Canvas LMS – raising a new round of financing and entering the corporate LMS space, Instructure changed plans and made their official announcement today. The funding is to both expand the Canvas team and to establish the new corporate LMS team. I’m not a fan of media attempts to get a scoop based purely on rumors, and in this case TechCrunch got a few items wrong that are worth correcting.

  • Instructure raised $40 million in new financing (series E), not “between $50 to $70 million”. TechCrunch did hedge their bets with “low end of the range at over $40 million”.
  • The primary competition in the corporate LMS space is Saba, SumTotal, Skillsoft, Cornerstone – and not Blackboard.
  • The Canvas LMS was launched in 2010, not 2011. (OK, I’ll give them this one, as even Instructure seems to use the 2011 date).

TechCrunch did get the overall story of fund-raising and new corporate product right, but these details matter.

Instructure’s new product for the corporate learning market is called Bridge, with its web site here. This is an entirely new product, although it does share a similar product architecture with Canvas, the LMS designed for the education market (including being based on Ruby on Rails). Unlike Canvas, Bridge was designed mobile-first, with all mobile capabilities embedded in the product and not as separate applications. In an interview with Josh Coates, CEO of Instructure, he described their motivation for this new product.

We like the idea of building software that helps people get smarter. Post education there is a void, with bad corporate software.

The design goal of Bridge is to make the creation and consumption of learning content easy, although future directions for the company will emphasize employee engagement and two-way conversations within companies. According to Coates, this focus on engagement parallels their research for future emphasis in the education market.

[Image: Bridge]

The Bridge product line will have a separate sales team and product team. From the press release:

Foundation partners include CLEARLINK, OpenTable and Oregon State University.

Oregon State University is an interesting customer of both products – they are adopting Canvas as part of their Unizin membership, and they are piloting Bridge as an internal HR system for training staff. This approach will likely be adopted by other Canvas education customers.

Given the self-paced nature of both Competency-Based Education (CBE) and corporate learning systems, I asked if Bridge is targeted to get Instructure into the CBE land grab. Coates replied that they are researching whether and how to get into CBE, but they are first exploring if this can be done with Canvas. In other words, Bridge truly is aimed at the corporate learning market.

While Instructure has excelled at maintaining product focus and simplicity of user experience, this move outside of education raises the question of whether they can maintain company focus. The corporate market is very different from the education market – different product needs, fragmented vendor market, different buying patterns. Many companies have tried to cross over between education and corporate learning, but most have failed. Blackboard, D2L and Moodle have made a footprint in the corporate space using one product for both markets. Instructure’s approach is different.

As for the fund-raising aspects, Instructure has made it very clear they are planning to go public with an IPO sometime soon, as reported by Buzzfeed today.

CEO Josh Coates told BuzzFeed today that the company had raised an additional $40 million in growth funding ahead of a looming IPO, confirming a rumor that was first reported by Tech Crunch yesterday. The company has now raised around $90 million.

Given their cash, a natural question is whether Instructure plans to use this to acquire other companies. Coates replied that they get increasingly frequent inbound requests (for Instructure to buy other companies) that they evaluate, but they are not actively pursuing M&A as a key corporate strategy.

I have requested a demo of the product for next week, and I’ll share the results on e-Literate as appropriate.

Update: Paragraph on organization corrected to point out separate product team. Also added sentence on funding to go to both Canvas and Bridge.

The post What TechCrunch Got Wrong (and Right) About Instructure Entering Corporate Learning Market appeared first on e-Literate.

NGDLE: The quest to eat your cake and have it too

Tue, 2015-02-17 05:51

By Phil Hill

And I’m going old school and sticking to the previous saying.

[Chart: Google Ngram Viewer]

Today I’m participating in the EDUCAUSE meeting on Next Generation Digital Learning Environments, funded by the Bill & Melinda Gates Foundation[1]. From the invitation:

The purpose of the panel is to identify potential investment strategies that are likely to encourage and hasten the arrival of “next-generation digital learning environments,” online learning environments that take us beyond the LMS to fully support the needs of today’s students and instructors. [snip]

It is clear that to meet the needs of higher education and today’s learner, the NGDLE must support a much wider range of functionality than today’s LMS, including different instructional modes, alternative credit models, personalized learning, robust data and content exchange, real-time and iterative assessment, the social web, and contemporary software design and usability practices. The policy and cultural context at our colleges and universities must also adapt to a world in which all learning has a digital component.

As I’m making an ill-timed trip from sunny California to snow-ravaged DC for a reduced-attendance meeting, I should at least lay down some of my thoughts on the subject in writing[2].

There is potential confusion of language here in implying that the NGDLE is an environment to replace today’s LMS. Are we talking about new, monolithic systems that replace today’s LMS but also have a range of functionality to support new needs, or are we talking about an environment that allows reasonably seamless integration and navigation between multiple systems? Put another way, investing in what?

To get at that question we should consider the current LMS market.

Current Market

Unlike five years ago, market dynamics are now leading to systems that better meet the needs of students. Primarily driven by the entrance of the Canvas LMS, the end of the Blackboard – Desire2Learn patent lawsuit, and new ed tech investment, today’s systems are lower in cost than previous systems and have much better usability. Canvas changed the standard of what an LMS can be for traditional courses – competitors that view it as just the shiny new object and not a material difference in usability have done so at their own peril. Blackboard is (probably / eventually / gosh I hope) releasing an entirely new user experience this year that seems to remove much of the multiple-click clunkiness of the past. Moodle has eliminated most of the scroll of death. Sakai 10 introduced a new user interface that is far better than what they had in the past.

At seemingly every school I visit and in every report I read, students are asking for consistency of usage and navigation along with more usable systems. This is, in fact, what the market is finally starting to deliver. It’s not a perfect market, but there are real changes occurring.

I have already written about the trend of the LMS, particularly based on IMS standards, to go from a walled garden approach:

[Diagram: the LMS as a walled garden]

to an open garden approach that allows the coordination of the base system with external tools.

[Diagram: the LMS as an open garden coordinating external tools]


Largely due to adoption of the Learning Tools Interoperability (LTI) specifications from IMS Global, it is far easier today to integrate different applications with an LMS. Perhaps more importantly, the ability to move the integration closer to end users (from central IT to departments and faculty) is getting closer and closer to reality. Michael has also written about the potential of the Caliper framework to be even more significant in expanding interoperability.

The LMS is not going away, but neither is it going to be the whole of the online learning experience anymore. It is one learning space among many now. What we need is a way to tie those spaces together into a coherent learning experience. Just because you have your Tuesday class session in the lecture hall and your Friday class session in the lab doesn’t mean that what happens in one is disjointed from what happens in the other. However diverse our learning spaces may be, we need a more unified learning experience. Caliper has the potential to provide that.

At the same time there is a new wave of learning platforms designed specifically for new educational delivery models. I have started to cover the CBE platforms recently, as Motivis, Helix, FlatWorld, LoudCloud Systems, and others have been introduced with radically different features and capabilities. At e-Literate TV we are learning more about adaptive and personalized systems such as ALEKS, Smart Sparrow, OLI, Cerego and others that design around the learning.

If you look at this new wave of learning environments, you’ll see that they are designed around the learner instead of the course and are focused on competencies or some other form of learning outcomes.

In a sense, the market is working. Better usability for traditional LMS, greater interoperability, and new learning platforms designed around the learner. There is a risk for NGDLE in that you don’t want to screw up the market when it’s finally moving in the right direction.

And Yet . . .

The primary benefit of today’s LMS remains administrative management of traditionally-designed courses. From last year’s ECAR report on the LMS, faculty and students rated their LMS satisfaction highest for the basic administrative functions.

[Chart: faculty satisfaction with the LMS (ECAR)]

[Chart: student satisfaction with the LMS (ECAR)]

Opponents of the traditional LMS are right to call out how its design can stifle creativity and prevent real classroom engagement. Almost all capabilities of the LMS are available on the free Internet, typically in better-designed tools.

This situation leads to three challenges:

  • The community has discussed the need for direct teaching and learning support for years, yet most courses only use the LMS for rosters, grade book and document sharing (syllabus, readings, assignments). The market changed en masse to call their systems Learning Management Systems in the late 2000s, but the systems mostly remain Course Management Systems as previously named. Yes, some schools and faculty – innovators and early adopters – have found ways to get learning benefits out of the systems, but that is secondary to managing the course.
  • New educational delivery models such as competency-based education (CBE) and personalized learning require a learner-centric design that is not just based on adding some features on top of the core LMS. It is worth noting that the new learning platforms tend to be wholesale replacements for the LMS in specific programs rather than expansions of its capabilities.
  • The real gains in learner-specific functionality have arisen from applications that don’t attempt to be all things to all people. In today’s world it’s far easier to create a new web and mobile-based application than ever before, and many organizations are taking this approach. Any attempt to push platforms into broader functionality creates the risk of pushing the market backwards into more feature bloat.

Back to the NGDLE

I won’t go into investment strategies for NGDLE, as that is the topic for group discussions today. But I think it is worth calling out the need to support two seemingly incompatible needs.

  • Given the very real improvements in the LMS market, we should not abandon the gains made by institutions and faculty that have taken ~15 years to achieve.
  • The market should not just evolve – new educational models require new ground-up designs, and we need far more emphasis on learning support and student engagement.

Is it possible to eat your cake and have it, too? In my opinion, our best chance is through the encouragement and support of interoperability frameworks that allow a course or learner hub / aggregator – providing consistent navigation and support for faculty not looking to innovate with technology – along with an ecosystem of true learning applications and environments. This is the move to learning platforms, not just as a marketing term but as true support for an integrated world of applications.

  1. Disclosure: Our upcoming e-Literate TV series has also received a grant from the Gates Foundation.
  2. Now that I’ve gone down for breakfast, the 2-inch snowfall would be somewhat embarrassing if not for the city being shut down.

The post NGDLE: The quest to eat your cake and have it too appeared first on e-Literate.

What Does Unizin Mean for Digital Learning?

Mon, 2015-02-16 13:41

By Michael Feldstein

Speaking of underpants-gnome sales pitches, Phil and I spent a fair amount of time hearing about Unizin at the ELI conference. Much of that time was spent hearing friends that I know, trust, and respect talk about the project. At length, in some cases. On the one hand, it is remarkable that, after these long conversations, I am not much clearer on the purpose of Unizin than I was the week before. On the other hand, being reminded that some of my friends really believe in this thing helped me refill my reservoir of patience for the project, which had frankly run dry.

Alas, that reservoir was largely drained away again during a Unizin presentation with the same title as this blog post. I went there expecting the presenters to answer that question for the audience.

Alack.

The main presentation, given by Anastasia Morrone of IUPUI, was probably the most straightforward and least hype-filled presentation about Unizin that I have heard so far. It was also short. Just when I was warming to it and figuring we’d get to the real meat, her last slide came up:

Split into groups of 5-7 people and discuss the following:

How can faculty, teaching center consultants, and learning technologists contribute to best practices with the evolving Unizin services?

Wait. What?

That’s right. They wanted us to tell them what Unizin means for digital learning. That might have been a good question to ask before they committed to spend a million dollars each on the initiative.

I joined one of the groups, resolving to try as hard as I could to keep my tongue in check and be constructive (or, at least, silent) for as long as I could. The very first comment in my group—not by me, I swear—was, “Before I can contribute, can somebody please explain to me what Unizin is?” It didn’t get any better from there. At the end of the breakout session, our group’s official answer was essentially, “Yeah, we don’t have any suggestions to contribute, so we’re hoping the other groups come up with something.” None of them did, really. The closest they came were a couple of vague comments on inclusive governance. I understand from a participant in one of the other groups that they simply refused to even try to answer the question. It was brutal.

[Embedded video]

Still, in the spirit of the good intentions behind their request for collaborative input, I will list here some possible ways in which Unizin could provide value, in descending order of credibility.

I’ll start with the moderately credible:

  • Provide a layer of support services on top of and around the LMS: This barely even gets mentioned by Unizin advocates but it is the one that makes the most sense to me. Increasingly, in addition to your LMS, you have a bunch of connected tools and services. It might be something basic like help desk support for the LMS itself. It might be figuring out how an external application like Voicethread works best with your LMS. As the LMS evolves into the hub of a larger ecosystem, it is putting increasing strain on IT departments in everything from procurement to integration to ongoing support. Unizin could be a way of pooling resources across institutions to address those needs. If I were a CIO in a big university with lots of demands for LMS plug-in services, I would want this.
  • Provide a university-controlled environment for open courses: Back when Instructure announced Canvas Network, I commented that the company had cannily targeted the issue that MOOC providers seemed to be taking over the branding, not to mention substantial design and delivery decisions, from their university “partners.” Canvas Network is marketed as “open courses for the rest of us.” By adopting Canvas as their LMS, Unizin gets this for free. Again, if I were a CIO or Provost at a school that was either MOOCing or MOOC-curious, I would want this.
  • Provide buying power: What vendor would not want to sew up a sales deal with ten large universities or university systems (and counting) through one sales process? So far it is unclear how much Unizin has gained in reality through group negotiations, but it’s credible that they could be saving significant money through group contracting.
  • Provide a technology-assisted vehicle for sharing course materials and possibly even course cross-registrations: The institutions involved are large, and most or all probably have specialty strengths in some curricula area or other. I could see them wanting to trade, say, an Arabic degree program for a financial technology degree program. You don’t need a common learning technology infrastructure to make this work, but having one would make it easier.
  • Provide a home for a community researching topics like learning design and learning analytics: Again, you don’t need a common infrastructure for this, but it would help, as would having courses that are shared between institutions.

Would all of this amount to a significant contribution to digital learning, as the title of the ELI presentation seems to ask? Maybe! It depends on what happens in those last two bullet points. But the rollout of the program so far does not inspire confidence that the Unizin leadership knows how to facilitate the necessary kind of community-building. Quite the opposite, in fact. Furthermore, the software has only ancillary value in those areas, and yet it seems to be what Unizin leaders want to talk about 90%+ of the time.

Would these benefits justify a million-dollar price tag? That’s a different question. I’m skeptical, but a lot depends on specific inter-institutional intentions that are not public. A degree program has a monetary value to a university, and some universities can monetize the value better than others depending on which market they can access with significant degrees of penetration. Throw in the dollar savings on group contracting, and you can have a relatively hard number for the value of the coalition to a member. I know that a lot of university folk hate to think like that, but it seems to be the most credible way to add the value of these benefits up and get to a million dollars.

Let’s see if we can sweeten the pot by adding in the unclear or somewhat dubious but not entirely absurd benefits that some Unizin folk have claimed:

  • Unizin will enable universities to “own” the ecosystem: This claim is often immediately followed by the statement that their first step in building that ecosystem was to license Canvas. The Unizin folks seem to have at least some sense that it seems contradictory to claim you are owning the ecosystem by licensing a commercial product, so they immediately start talking about how Canvas is open source and Unizin could take it their own way if they wanted to. Yet this flies in the face of Unizin’s general stated direction of mostly licensing products and building connectors and such when they have to. Will all products they license be open source? Do they seriously commit to forking Canvas should particular circumstances arise? If not, what does “ownership” really mean? I buy it in relation to the MOOC providers, because there they are talking about owning brand and process. But beyond that, the message is pretty garbled. There could be something here, but I don’t know what it is yet.
  • Unizin could pressure vendors and standards groups to build better products: In the abstract, this sounds credible and similar to the buying power argument. The trouble is that it’s not clear either that pressure on these groups will solve our most significant problems or that Unizin will ask for the right things. I have argued that the biggest reason LMSs are…what they are is not vendor incompetence or recalcitrance but that faculty always ask for the same things. Would Unizin change this? Indiana University used what I would characterize as a relatively progressive evaluation framework when they chose Canvas, but there is no sign that they were using the framework to push their faculty to fundamentally rethink what they want to do with a virtual learning environment and therefore what it needs to be. I don’t doubt the intellectual capacity of the stakeholders in these institutions to ask the right questions. I doubt the will of the institutions themselves to push for better answers from their own constituents. As for the standards, as I have argued previously, the IMS is doing quite well at the moment. They could always move faster, and they could always use more university members who are willing to come to the table with concrete use cases and a commitment to put in the time necessary to work through a standards development process (including implementation). Unizin could do that, and it would be a good thing if they did. But it’s still pretty unclear to me how much their collective muscle would be useful to solve the hard problems.

Don’t get me wrong; I believe that both of the goals articulated above are laudable and potentially credible. But Unizin hasn’t really made the case yet.

Instead, at least some of the Unizin leaders have made claims that are either nonsensical (in that they don’t seem to actually mean anything in the real world) or absurd:

  • “We are building common gauge rails:” I love a good analogy, but it can only take you so far. What rides on those rails? And please don’t just say “content.” Are we talking about courses? Test banks? Individual test questions? Individual content pages? Each of these has very different reuse characteristics. Content isn’t just a set of widgets that can be loaded up in rail cars and used interchangeably wherever they are needed. If it were, then reuse would have been a solved problem ten years ago. What problem are you really trying to solve here, and why do you think that what you’re building will solve it (and is worth the price tag)?
  • “Unizin will make migrating to our next LMS easier because moving the content will be easy.” No. No, no, no, no, no, no, no. This is the perfect illustration of why the “common gauge rails” statement is meaningless. All major LMSs today can import IMS Common Cartridge format, and most can export in that format. You could modestly enhance this capability by building some automation that takes the export from one system and imports it into the other. But that is not the hard part of migration. The hard part is that LMSs work differently, so you have to redesign your content to make best use of the design and features of the new platform. Furthermore, these differences are generally not ones that you want to stamp out—at least, not if you care about these platforms evolving and innovating. Content migration in education is inherently hard because context makes a huge difference. (And content reuse is exponentially harder for the same reason.) There are no widgets that can be neatly stacked in train cars. Your rails will not help here.
  • “Unizin will be like educational moneyball.” Again with the analogies. What does this mean? Give me an example of a concrete goal, and I will probably be able to evaluate the probability that you can achieve it, its value to students and the university, and therefore whether it is worth a million-dollar institutional investment. Unizin doesn’t give us that. Instead, it gives us statements like, “Nobody ever said that your data is too big.” Seriously? The case for Unizin comes down to “my data is bigger than yours”? Is this a well-considered institutional investment or a midlife crisis? The MOOC providers have gobs and gobs of data, but as HarvardX researcher Justin Reich has pointed out, “Big data sets do not, by virtue of their size, inherently possess answers to interesting questions….We have terabytes of data about what students clicked and very little understanding of what changed in their heads.” Tell us what kinds of research questions you intend to ask and how your investment will make it possible to answer them. Please. And also, don’t just wave your hands at PAR and steal some terms from their slides. I like PAR. It’s a Good Thing. But what new thing are you going to do with it that justifies a million bucks per institution?

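The Common Cartridge point is worth making concrete: the reason "moving the content" is the easy part is that a cartridge export is just a zip archive with an imsmanifest.xml at its root describing the package's resources. Below is a minimal Python sketch of cracking one open and listing what's inside. The toy-cartridge builder is purely illustrative; real cartridges carry far more structure and metadata than this.

```python
# Minimal sketch: inspect an IMS Common Cartridge export (.imscc).
# A cartridge is a zip archive with an imsmanifest.xml at its root;
# the manifest lists every resource in the package.
import io
import zipfile
import xml.etree.ElementTree as ET

def list_cartridge_resources(cartridge_bytes: bytes) -> list[str]:
    """Return the identifiers of all <resource> entries in the manifest."""
    with zipfile.ZipFile(io.BytesIO(cartridge_bytes)) as zf:
        manifest = zf.read("imsmanifest.xml")
    root = ET.fromstring(manifest)
    # Match <resource> elements regardless of the manifest's namespace.
    return [
        el.attrib.get("identifier", "")
        for el in root.iter()
        if el.tag.split("}")[-1] == "resource"
    ]

def make_demo_cartridge() -> bytes:
    """Build a toy cartridge in memory (illustration only, not a full CC package)."""
    manifest = (
        '<manifest identifier="demo">'
        '<resources>'
        '<resource identifier="res1" type="webcontent" href="page1.html"/>'
        '<resource identifier="res2" type="webcontent" href="page2.html"/>'
        '</resources>'
        '</manifest>'
    )
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("imsmanifest.xml", manifest)
        zf.writestr("page1.html", "<p>hello</p>")
        zf.writestr("page2.html", "<p>world</p>")
    return buf.getvalue()
```

Which is exactly the point: the plumbing is trivial. Once the package is open, someone still has to rework every one of those resources for the conventions and features of the destination platform, and no rails help with that.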
I want to believe that my friends, who I respect, believe in Unizin because they see a clear justification for it. I want to believe that these schools are going to collectively invest $10 million or more doing something that makes sense and will improve education. But I need more than what I’m getting to be convinced. It can’t be the case that the people not in the inner circle have to convince themselves of the benefit of Unizin. One of my friends inside the Unizin coalition said to me, “You know, a lot of big institutions are signing on. More and more.” I replied, “That means that either something very good is happening or something very bad is happening.” Given the utter disaster that was the ELI session, I’m afraid that I continue to lean in the direction of badness.

 

The post What Does Unizin Mean for Digital Learning? appeared first on e-Literate.

Wanted – A Theory of Change

Sun, 2015-02-15 14:25

By Michael Feldstein | More Posts (1014)

Phil and I went to the ELI conference this week. It was my first time attending, which is odd given that it is one of the best conferences that I’ve attended in quite a while. How did I not know this?

We went, in part, to do a session on our upcoming e-Literate TV series, which was filmed for use in the series. (Very meta.) Malcolm Brown and Veronica Diaz did a fantastic job of both facilitating and participating in the conversation. I can’t wait to see what we have on film. Phil and I also found that an unusually high percentage of sessions were ones that we actually wanted to go to and, once there, didn’t feel the urge to leave. But the most important aspect of any conference is who shows up, and ELI did not disappoint there either. The crowd was diverse, but with a high percentage of super-interesting people. On the one hand, I felt like this was the first time that there were significant numbers of people talking about learning analytics who actually made sense. John Whitmer from Blackboard (but formerly from CSU), Mike Sharkey from Blue Canary (but formerly from University of Phoenix), Rob Robinson from Civitas (but formerly from the University of Texas), Eric Frank of Acrobatiq (formerly of Flat World Knowledge)—these people (among others) were all speaking a common language, and it turns out that language was English. I feel like that conversation is finally beginning to come down to earth. At the same time, I got to meet Gardner Campbell for the first time and ran into Jim Groom. One of the reasons that I admire both of these guys is that they challenge me. They unsettle me. They get under my skin, in a good way (although it doesn’t always feel that way in the moment).

And so it is that I find myself reflecting disproportionately on the brief conversations that I had with both of them, and about the nature of change in education.

I talked to Jim for maybe a grand total of 10 minutes, but one of the topics that came up was my post on why we haven’t seen the LMS get dramatically better in the last decade and why I’m pessimistic that we’ll see dramatic changes in the next decade. Jim said,

Your post made me angry. I’m not saying it was wrong. It was right. But it made me angry.

Hearing this pleased me inordinately, but I didn’t really think about why it pleased me until I was on the plane ride home. The truth is that the post was intended to make Jim (and others) angry. First of all, I was angry when I wrote it. We should be frustrated at how hard and slow change has been. It’s not like anybody out there is arguing that the LMS is the best thing since sliced bread. Even the vendors know better than to be too boastful these days. (Most of them, anyway.) At best, conversations about the LMS tend to go like the joke about the old Jewish man complaining about a restaurant: “The food here is terrible! And the portions are so small!” After a decade of this, the joke gets pretty old. Somehow, what seemed like Jack Benny has started to feel more like Franz Kafka.

Second, it is an unattractive personal quirk of mine that I can’t resist poking at somebody who seems confident of a truth, no matter what that truth happens to be. Even if I agree with them. If you say to me, “Michael, you know, I have learned that I don’t really know anything,” I will almost inevitably reply, “Oh yeah? Are you sure about that?” The urge is irresistible. If you think I’m exaggerating, then ask Dave Cormier. He and I had exactly this fight once. This may make me unpopular at parties—I like to tell myself that’s the reason—but it turns out to be useful in thinking about educational reform because just about everybody shares some blame in why change is hard, and nobody likes to admit that they are complicit in a situation that they find repugnant. Faculty hate to admit that some of them reinforce the worst tendencies of LMS and textbook vendors alike by choosing products that make their teaching easier rather than better. Administrators hate to admit that some of them are easily seduced by vendor pitches, or that they reflexively do whatever their peer institutions do without a lot of thought or analysis. Vendors hate to admit that their organizations often do whatever they have to in order to close the sale, even if it’s bad for the students. And analysts and consultants…well…don’t get me started on those smug bastards. It would be a lot easier if there were one group, one cause that we could point to as the source of our troubles. But there isn’t. As a result, if we don’t acknowledge the many and complex causes of the problems we face, we risk having an underpants gnomes theory of change:

Click here to view the embedded video.

I don’t know what will work to bring real improvements to education, but here are a few things that won’t:

  • Just making better use of the LMS won’t transform education.
  • Just getting rid of the LMS won’t transform education.
  • Just bringing in the vendors won’t transform education.
  • Just getting rid of the vendors won’t transform education.
  • Just using big data won’t transform education.
  • Just busting the faculty unions won’t transform education.
  • Just listening to the faculty unions won’t transform education.

Critiques of some aspect of education or other are pervasive, but I almost always feel like I am listening to an underpants gnomes sales presentation, no matter who is pitching it, no matter what end of the political spectrum they are on. I understand what the speaker wants to do, and I also understand the end state to which the speaker aspires, but I almost never understand how the two are connected. We are sorely lacking a theory of change.

This brings me to my conversation with Gardner, which was also brief. He asked me whether I thought ELI was the community that could…. I put in an ellipsis there both because I don’t remember Gardner’s exact wording and because a certain amount of what he was getting at was implied. I took him to mean that he was looking for the community that was super-progressive that could drive real change (although it is entirely possible that I was and am projecting some hope that he didn’t intend). It took me a while to wrap my head around this encounter too. On the one hand, I am a huge believer in the power of communities as networks for identifying and propagating positive change. On the other hand, I have grown to be deeply skeptical of them as having lasting power in broad educational reform. Every time I have found a community that I got excited about, one of two things inevitably happened: either so many people piled into it that it lost its focus and sense of mission, or it became so sure of its own righteousness that the epistemic closure became suffocating. There may be some sour grapes in that assessment—as Groucho Marx said, I don’t want to belong to any club that would have me as a member—but it’s not entirely so. I think communities are essential. And redeeming. And soul-nourishing. But I think it’s a rare community indeed—particularly in transient, professional, largely online communities, where members aren’t forced to work out their differences because they have to live with each other—that really provides transformative change. Most professional communities feel like havens, when I think we need to feel a certain amount of discomfort for real change to happen. The two are not mutually exclusive in principle—it is important to feel like you are in a safe environment in order to be open to being challenged—but in practice, I don’t get the sense that most of the professional communities I have been in have regularly encouraged creative abrasion. At least, not for long, and not to the point where people get seriously unsettled.

Getting back to my reaction to Jim’s comment, I guess what pleased me so much is that I was proud to have provided a measure of hopefully productive and thought-provoking discomfort to somebody who has so often done me the same favor. This is a trait I admire in both Jim and Gardner. They won’t f**king leave me alone. Another thing that I admire about them is that they don’t just talk, and they don’t just play in their own little sandboxes. Both of them build experiments and invite others to play. If there is a way forward, that is it. We need to try things together and see how they work. We need to apply our theories and find out what breaks (and what works better than we could have possibly imagined). We need to see if what works for us will also work for others. Anyone who does that in education is a hero of mine.

So, yeah. Good conference.

 

The post Wanted – A Theory of Change appeared first on e-Literate.

e-Literate TV Case Study Preview: Middlebury College

Sun, 2015-02-15 10:48

By Michael Feldstein | More Posts (1013)

As we get closer to the release of the new e-Literate TV series on personalized learning, Phil and I will be posting previews highlighting some of the more interesting segments from the series. Both our preview posts and the series itself start with Middlebury College. When we first talked about the series with its sponsors, the Bill & Melinda Gates Foundation, they agreed to give us the editorial independence to report what we find, whether it is good, bad, or indifferent. And as part of our effort to establish a more objective frame, we started the series by going not to a school that was a Gates Foundation grantee but to the kind of place that Americans probably think of first when they think of a high-quality personalized education outside the context of technology marketing. We decided to go to an elite New England liberal arts college. We wanted to use that ideal as the context for talking about personalizing learning through technology. At the same time, we were curious to find out how technology is changing these schools and their notion of what a personal education is.

We picked Middlebury because it fit the profile and because we had a good connection through our colleagues at IN THE TELLING.[1] We really weren’t sure what we would find once we arrived on campus with the cameras. Some of what we found there was not surprising. In a school with a student/teacher ratio of 8.6 to 1, we found strong student/teacher relationships and empowered, creative students. Understandably, we heard concerns that introducing technology into this environment would depersonalize education. But we also heard great dialogues between students and teachers about what “personalized” really means to students who have grown up with the internet. And, somewhat unexpectedly, we saw some signs that the future of educational technology at places like Middlebury College may not be as different from what we’re seeing at public colleges and universities as you might think, as you’ll see in the interview excerpt below.

Jeff Howarth is an Assistant Professor of Geography at Middlebury. He teaches a very popular survey-level course in Geographic Information Systems (GIS). But it’s really primarily a course about thinking about spaces. As Jeff pointed out to me, we typically provide little to no formal education on spatial reasoning in primary and secondary schooling. So the students walking into his class have a wide range of skills, based primarily on their natural ability to pick them up on their own. This broad heterogeneity is not so different from the wide spread of skills that we saw in the developmental math program at Essex County College in Newark, NJ. Furthermore, the difference between a novice and an expert within a knowledge domain is not just about how many competencies they have racked up. It’s also about how they acquire those competencies. Jeff did his own study of how students learn in his class, which confirmed broader educational research showing that novices in a domain tend to start with specific problems and generalize outward, while experts (like professors, but also like more advanced students) tend to start with general principles and apply them to the specific problem at hand. As Jeff pointed out to me, the very structure of the class schedule conspires against serving novice learners in the way that works best for them. Typically, students go to a lecture in which they are given general principles and then are sent to a lab to apply those principles. That order works for students who have enough domain experience to frame specific situations in terms of the general principles but not for the novices who are just beginning to learn what those general principles might even look like.

When Jeff thought about how to serve the needs of his students, the solution he came up with—partly still a proposal at this point—bears a striking resemblance to the basic design of commercial “personalized learning” courseware. I emphasize that he arrived at this conclusion through his own thought process rather than by imitating commercial offerings. Here’s an excerpt in which he describes deciding to flip his classroom before he had ever even heard of the term:

Click here to view the embedded video.

In the full ten-minute episode, we hear Jeff talk about his ideas for personalized courseware (although he never uses that term). And in the thirty-minute series, we have a great dialogue between students and faculty as well as some important context setting from the college leadership. The end result is that the Middlebury case study shows us that personalized learning software tools do not just have to be inferior substitutes for the real thing that are only for “other people’s children” while simultaneously reminding us of what a real personal education looks like and what we must be careful not to lose as we bring more technology into the classroom.

  1. Full disclosure: Since filming the case study, Middlebury has become a client of MindWires Consulting, the company that Phil and I run together.

The post e-Literate TV Case Study Preview: Middlebury College appeared first on e-Literate.

California Community College OEI Selects LMS Vendor

Thu, 2015-02-12 13:53

By Phil Hill | More Posts (289)

The Online Education Initiative (OEI) for California’s Community College System has just announced its vendor selection for a Common Course Management System (CCMS)[1]. For various reasons I cannot provide any commentary on this process, so I would prefer to simply direct people to the OEI blog site. Update: To answer some questions, the reason I cannot comment is that CCC is a MindWires client, and I facilitated the meetings. Based on this relationship we have a non-disclosure agreement with OEI.

Here is the full announcement.

The California Community Colleges (CCC) Online Education Initiative (OEI) announced its intent to award Instructure Inc. the contract to provide an online course management system and related services to community colleges statewide.

Support for Instructure’s Canvas system was nearly unanimous among the OEI’s Common Course Management System (CCMS) Committee members, with overwhelming support from student participants, officials said. Canvas is a course management platform that is currently being used by more than 1,000 colleges, universities and school districts across the country.

“Both the students and faculty members involved believed that students would be most successful using the Canvas system,” said OEI Statewide Program Director Steve Klein. “The student success element was a consistent focus throughout.”

The announcement includes some information on the process as well.

A 55-member selection committee participated in the RFP review that utilized an extensive scoring rubric. The decision-making process was guided by and included the active involvement of the CCMS Committee, which is composed of the CCMS Workgroup of the OEI Steering Committee, the members of OEI’s Management Team, and representatives from the eight Full Launch Pilot Colleges, which will be the first colleges to test and deploy the CCMS tool.

The recommendation culminated an extremely thorough decision-making process that included input from multiple sources statewide, and began with the OEI’s formation of a CCMS selection process in early 2014. The selection process was designed to ensure that a partner would be chosen to address the initiative’s vision for the future.

  1. Note that this is an Intent to Award, not yet a contract.

The post California Community College OEI Selects LMS Vendor appeared first on e-Literate.

A Sneak Preview of e-Literate TV at ELI

Tue, 2015-02-10 00:58

By Michael Feldstein | More Posts (1013)

Phil and I will be chatting with Malcolm Brown and Veronica Diaz about our upcoming e-Literate TV series on personalized learning in a featured session at ELI tomorrow. We’ll be previewing short segments of video case studies that we’ve done on an elite New England liberal arts college, an urban community college, and a large public university. Audience participation in the discussion is definitely encouraged. It will be tomorrow at 11:45 AM in California C for those of you who are here at the conference, and also webcast for those of you registered for the virtual conference.

We hope to see you there.

The post A Sneak Preview of e-Literate TV at ELI appeared first on e-Literate.

Flat World and CBE: Self-paced does not imply isolation

Mon, 2015-02-09 07:48

By Phil Hill | More Posts (287)

As competency-based education (CBE) becomes more and more important to US higher education, it is worth exploring the learning platforms in use. While there are cases of institutions using their traditional LMS to support a CBE program, a new market is developing around learning platforms designed specifically for self-paced, fully online, competency-framework-based approaches.

Several weeks ago Flat World announced their latest round of funding, $5 million of debt financing, bringing their total to $40.7 million. The company started out by offering full e-textbooks (and was previously named Flat World Knowledge), developing 110 titles that included 25 of the 50 most-used lecture courses. The e-textbook market was not working out, however, and the company pivoted to competency-based education around the time that Chris Etesse became CEO two years ago. Now the company is developing a combined CBE learning platform with integrated course content – much of it repurposing the pre-existing e-textbook materials. Their first academic partner for CBE is Brandman University, a non-traditional part of the Chapman University system and currently a member of the CBEN network.

One central tenet of the Flat World approach follows from that history and pivot: a tight integration of content and platform. As Etesse describes it, content is a first-class citizen in their system, whereas loosely-coupled approaches that do not tie content and platform together can be difficult to navigate and make it hard to collect learning analytics. For Brandman, approximately 70% of the content comes from the pre-existing Flat World texts, 25% comes from various OER sources, and about 5% has been custom-designed for Brandman.

In other words, this is very much a walled garden by design. While there is support for outside content, I believe this integration must be done by Flat World designers.

As was the case for the description of the Helix CBE-based learning platform, my interest here is not merely to review one company’s products, but rather to illustrate aspects of the growing CBE movement using the demo.

CBE programs by their very nature tend to be self-paced. One criticism or line of questions I’m seeing more often deals with the nature of self-paced learning itself. Are students just plugging through mindless e-text and multiple-choice assessments in isolation? What Flat World illustrates – as with other major CBE learning platforms – is that self-paced does not imply isolation, either from a student-teacher or from a student-student perspective. New approaches that are different than simple discussion forums are required, however.

Flat World shows several learning activities:

  • Reading text and viewing multi-media content adaptively presented based on a pretest and progress against competencies;
  • Taking formative assessments primarily through multiple-choice quizzes;
  • Interacting with students and with faculty;
  • Working through project-based assignments;
  • Taking summative assessments through a proctored, webcam-streaming approach.

The activities and assessments do not have to consist of students working in isolation on multiple-choice questions. For example, project-based work can be included, and assignments can call for submission of written reports or responses to short-form prompts. As can be seen below, assessments can be based on submitted written work, which faculty grade and use for feedback.

[Screenshot: Flat World demo, submitting an assessment]

For communication with others, students are tracked in how active they are in communicating with faculty and even with other students (called ‘social’), as seen below.

[Screenshot: Flat World demo, 7-day activity breakdown]

One challenge of a self-paced program such as a CBE approach is figuring out how to encourage students to interact with others. There is not a simple cohort to work with; the interaction instead will often be based on content: who else is working through the same material in roughly the same time period?

Flat World uses an approach that is very similar to Stack Overflow, where students can ask and answer questions over time, and the answers are voted up or down so that the best answers rise to the top. The question-and-answer board is moderated by faculty at Brandman. This not only lets students working on the same competencies at roughly the same time interact, but it also allows interaction with students on similar competencies separated in time.
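The voting mechanics are simple to picture in code. Here is a toy Python sketch of the general Stack Overflow pattern (an illustration only, not Flat World's actual implementation):

```python
# Toy sketch of vote-based answer ranking: answers accumulate up/down
# votes, and the board surfaces the highest-scoring answers first.
from dataclasses import dataclass

@dataclass
class Answer:
    author: str
    text: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Net score: upvotes minus downvotes.
        return self.upvotes - self.downvotes

def rank_answers(answers: list[Answer]) -> list[Answer]:
    """Sort answers so the best ones rise to the top, regardless of age."""
    return sorted(answers, key=lambda a: a.score, reverse=True)
```

Because ranking depends only on votes and not on when an answer was posted, a strong answer written months ago still surfaces first for a student hitting the same competency today, which is exactly what makes the interaction work across time.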

[Screenshot: Flat World discussion boards]

 

[Screenshot: Flat World social interaction view]

There certainly is a tendency in many CBE programs to stick to multiple-choice assignments and quizzes and to avoid much social interaction. That method is a whole lot easier to design, and with several hundred new programs under development, I think the overall quality can be quite low in many programs, particularly those looking for a quick-win CBE introduction, essentially trying to jump on the bandwagon. You can see the tendency toward multiple-choice in the Flat World system as well.

But self-paced does not imply isolation, and the Flat World implementation of the Brandman University program shows how CBE can support project-based work, written assignments and assessments, and interaction between students and faculty as well as between multiple students.

The post Flat World and CBE: Self-paced does not imply isolation appeared first on e-Literate.

Instructure Releases 4th Security Audit, With a Crowd-sourcing Twist

Sat, 2015-02-07 12:17

By Phil Hill

Phil is a consultant and industry analyst covering the educational technology market primarily for higher education. He has written for e-Literate since Aug 2011. For a more complete biography, view his profile page.

Web | Twitter | LinkedIn | Google+ | More Posts (286)

In the fall of 2011 I made the following argument:

We need more transparency in the LMS market, and clients should have access to objective measurements of the security of a solution. To paraphrase Michael Feldstein’s suggestions from a 2009 post:

  • There is no guarantee that any LMS is more secure just because they say they are more secure
  • Customers should ask for, and LMS vendors should supply, detailed information on how the vendor or open source community has handled security issues in practice
  • LMS providers should make public a summary of vulnerabilities, including resolution time

I would add to this call for transparency that LMS vendors and open source communities should share information from their third-party security audits and tests.  All of the vendors that I talked to have some form of third-party penetration testing and security audits; however, how does this help the customer unless this information is transparent and available?  Of course this transparency should not include details that would advertise vulnerabilities to hackers, but there should be some manner to be open and transparent on what the audits are saying. [new emphasis added]

Inspired by fall events and this call for transparency, Instructure (maker of the Canvas LMS) decided to hold a public security audit using a white hat testing company, where A) the results of the testing would be shared publicly, and B) I would act as an independent observer to document the process. The results of this testing are described in two posts at e-Literate and in a post at Instructure.

Instructure has kept up the process, this year with a crowd-sourcing twist:

What was so special about this audit? For starters, we partnered with Bugcrowd to enlist the help of more than 60 top security researchers. To put that number in context, typical third-party security audits are performed by one or two researchers, who follow standard methodologies and use “tools of the trade.” Their results are predictable, consistent, and exactly what you’d want and expect from this type of service. This year, we wanted an audit that would produce “unexpected” results by testing our platform in unpredictable ways. And with dozens of the world’s top experts, plus Bugcrowd’s innovative and scrappy crowdsourcing approach, that’s exactly what we got.

So while last year’s audit found six issues, this year’s process unearthed a startling 59. (Yeah, you read that right. Fifty-nine.) Witness the power of crowdsourcing an open security audit.

The blog post goes on to state that all 59 issues have been fixed with no customer impacts.

I harp on this subject not just to congratulate Instructure on keeping up the process, but to maintain that the ed tech world would benefit from transparent, open security audits. Back in 2011 there were ed tech executives who disagreed with the approach of open audits.

There are risks, however, to this method of public security testing. Drazen Drazic, the managing director of Securus Global, indicated that in talking to people around the world through security-related social networks, no other companies have chosen to use an independent observer for this testing. This is not to argue that no one should do it, but clearly we are breaking new ground here and need to be cautious.

One downside of public security assessments is that the act of publicizing results can in fact increase the likelihood that vulnerabilities would be exploited by hackers. As one executive from a competitive LMS put it to me, we need to focus on security consistently and not as a once-a-year exercise. Any public exposure of vulnerabilities can increase the likelihood of hackers exploiting those vulnerabilities, so the trick is to not disclose specific pathways to exploitation. In our case, I described the category of vulnerability found, and I avoided disclosing any information on the critical and high-risk vulnerabilities until after they had been remediated. Still, this is a tricky area.

Two competitive LMS vendors have criticized these tests as a marketing ploy that could be dangerous. In their opinion, student and client data is best protected by keeping the testing process out of the public domain. I cannot speak for Instructure’s motivations regarding marketing, but I did want to share these criticisms.

We are now in the fourth year of Instructure providing transparent security audits, and I would note the following:

  • The act of publicizing the results has not in fact enabled hackers to exploit the security vulnerabilities identified.
  • While I am sure there is marketing value to this process, I would argue that the primary benefits have been enhanced security of the product and, more importantly, better information for the institutions evaluating or even using Canvas.

I repeat my call for more ed tech vendors to follow this type of process. I would love to cover similar stories.

The post Instructure Releases 4th Security Audit, With a Crowd-sourcing Twist appeared first on e-Literate.

Babson Study of Online Learning Released

Wed, 2015-02-04 23:52

By Phil Hill

Phil is a consultant and industry analyst covering the educational technology market primarily for higher education. He has written for e-Literate since Aug 2011. For a more complete biography, view his profile page.


Babson Survey Research Group (BSRG) just released its annual survey of online learning in US higher education (press release here). This year they have moved from use of survey methodology for the online enrollment section to use of IPEDS distance education data. Russ Poulin from WCET and I provided commentary on the two data sources as an appendix to the study.

The report highlights the significant drop in growth of online education in the US (which I covered previously in this e-Literate post). Some of the key findings:

  • Previous reports in this series noted the proportion of institutions that believe that online education is a critical component of their long-term strategy has shown small but steady increases for a decade, followed by a retreat in 2013.
  • After years of a consistently growing majority of chief academic officers rating the learning outcomes for online education “as good as or better” than those for face-to-face instruction, the pattern reversed itself last year.
  • This report series has used its own data to chronicle the continued increases in the number of students taking at least one online course. Online enrollments have increased at rates far in excess of those of overall higher education. The pattern, however, has been one of decreasing growth rates over time. This year marks the first use of IPEDS data to examine this trend.
  • While the number of students taking distance courses has grown by the millions over the past decade, it has not come without considerable concerns. Faculty acceptance has lagged, concerns about student retention linger, and leaders continue to worry that online courses require more faculty effort than face-to-face instruction.

BSRG looked at the low growth (which I characterized as ‘no discernible growth’ due to noise in the data) and broke down trends by sector.

Growth by sector

The report also found that more institutions are viewing online education as ‘critical to the long term strategy of my institution’.

Strategic online

 

There’s lots of good data and analysis available – read the whole report here.

I’ll write more about the critique of data sources that Russ and I provided in the next few days.

We are especially pleased that Phil Hill and Russ Poulin have contributed their analysis of the transition issues of moving to IPEDS data. Their clear and insightful description will be of value for all who track distance education.

I want to personally thank Jeff Seaman for the opportunity he and his team provided for us to provide this analysis.

The post Babson Study of Online Learning Released appeared first on e-Literate.

Is Standardized Testing a Pediatric Disease?

Sat, 2015-01-24 14:09

In my last post, I wrote about the tension between learning, with the emphasis on the needs and progress of individual human learners, and education, which is the system by which we try to guarantee learning to all but which we often subvert in our well-meaning but misguided attempts to measure whether we are delivering that learning. I spent a lot of time in that post exploring research by Gallup regarding the workplace performance of adults, various dimensions of personal wellbeing, and the links of both to each other and to college experiences. One of Gallup’s findings was that workers who are disengaged with their work are less healthy. They are more likely to become clinically depressed, more likely to develop heart conditions, and more likely to die young. I then made a connection between disengaged adults and disengaged students. What I left implicit was that if being disengaged as an adult is bad for one’s health, it stands to reason that being disengaged as a child is also bad for one’s health. We could literally be making our children sick with schooling.

I am in the midst of reading Anya Kamenetz’s new book The Test. It has convinced me that I need to take some time making the connection explicit.

In that previous post, I wrote,

Also, people who love their jobs are more likely both to keep working longer and to live longer. In a study George Gallup conducted in the 1950s,

…men who lived to see 95 did not retire until they were 80 years old on average. Even more remarkable, 93% of these men reported getting a great deal of satisfaction out of the work they did, and 86% reported having fun doing their job.

Conversely, a 2008 study by the company found a link between employee disengagement and depression:

We measured their engagement levels and asked them if they had ever been diagnosed with depression. We excluded those who reported that they had been diagnosed with depression from our analysis. When we contacted the remaining panel members in 2009, we again asked them if they had been diagnosed with depression in the last year. It turned out that 5% of our panel members (who had no diagnosis of depression in 2008) had been newly diagnosed with depression. Further, those who were actively disengaged in their careers in 2008 were nearly twice as likely to be diagnosed with depression over the next year. While there are many factors that contribute to depression, being disengaged at work appears to be a leading indicator of a subsequent clinical diagnosis of depression.

Which is obviously bad for employer and employee alike.

In some cases, Gallup went all in with physiological studies. For example, they “recruited 168 employees and studied their engagement, heart rate, stress levels, and various emotions throughout the day,” using heart rate monitors, saliva samples, and handheld devices that surveyed employees on their activities and feelings of the moment at various points in the day.

After reviewing all of these data, it was clear that when people who are engaged in their jobs show up for work, they are having an entirely different experience than those who are disengaged. [Emphasis in original.] For those who were engaged, happiness and interest throughout the day were significantly higher. Conversely, stress levels were substantially higher for those who were disengaged. Perhaps most strikingly, disengaged workers’ stress levels decreased and their happiness increased toward the end of the workday….[P]eople with low engagement…are simply waiting for the workday to end.

From here, the authors go on to talk about depression and heart attacks and all that bad stuff that happens to you when you hate that job. But there was one other striking passage at the beginning of this section:

Think back to when you were in school sitting through a class in which you had very little interest. Perhaps your eyes were fixed on the clock or you were staring blankly into space. You probably remember the anticipation of waiting for the bell to ring so you could get up from your desk and move on to whatever was next. More than two-thirds of workers around the world experience a similar feeling by the end of a typical workday.

I then went on to a point about preparing students to be engaged workers, but it’s worth pausing here and thinking for a moment. Schooling is the model, the archetype, for the workplace experience that literally causes people to lead shorter, sadder, sicker lives. Is that hyperbole? Is it a caricature of modern schooling? Actually, thanks to the current American obsession with standardized testing, the stereotype may actually understate the case.

In The Test, Kamenetz quotes the blog of a Chicago parent who had assisted her daughter’s class with computer-based testing. On the way home from the second day (?!) of testing, her daughter broke down in the car:

“I just can’t do this,” she sobbed. The ill-fitting headsets, the hard-to-hear instructions, the uncooperative mouse, the screen going to command modes, not being able to get clarification when she asked for it….It took just two days of standardized testing to doubt herself. “I’m just not smart, Mom. Not like everyone else. I’m just no good at kindergarten, just no good at all.”

I have read this paragraph a half dozen times now, and I still can’t get through it without tearing up.

Kamenetz then goes on to say that teachers and parents throughout the United States—especially the ones with elementary school-aged children—“report students throwing up, staying home with stomach aches, locking themselves in the bathroom, crying, having nightmares, and otherwise acting out on test days.”

A bit later in the book, she writes about a couple of Great Depression-era researchers named Harold Skeels and Harold Dye. They took two one-year-old babies in an orphanage who had tested as “moderately to severely retarded” and moved them to a ward for mentally disabled young women, because the children were viewed as hopeless cases. Fourteen and sixteen months old, these girls were already discarded. But what happened next was anything but what the researchers expected. The girls were adopted by the residents and attendants of the ward. Kamenetz notes, “After just six months their IQ scores had improved to 77 and 87, and a few months after that their scores had climbed into the mid-90s, near average levels.”

The researchers were so taken aback that they repeated the experiment, bringing 13 “retarded” one- and two-year-old girls from orphanages to the adult women’s institution, where they were given foster mothers.

According to an article discussing the case, the toddlers at the adult women’s home had toys bought for them by the attendants and clothes made for them by the residents. Their “mothers” cheerfully competed over which ones could be made to walk and talk first.

Meanwhile, a control group of supposedly low-IQ girls stayed at the orphanage, presumably living under the conditions one imagines in the kind of orphanage that would let some of its children be condemned to live out their lives in a mental institution when they were just 14 months old. What were the results?

The children [who were transferred to the mental institution] remained on the ward for a mean of nineteen months. All but two of the eleven gained more than 15 IQ points during that time. Once they tested at average intelligence they were moved to regular foster homes. A year after the experiment ended, of the thirteen original children, none was still classified as “feeble-minded.” At the first follow-up two and a half years later, in 1943, the mean IQ of the experimental group was exactly average, 101.4. Meanwhile the control group left at the orphanage had shown “marked deterioration” and now had an average IQ of 66.1, down from 86 at the beginning of the study.

Staying in the orphanage was actually more harmful to the young girls than putting them in an adult mental institution. This was not a short-term difference, either. In the 1960s, the researchers followed up with the girls from the original study.

Of the thirteen girls who had been adopted, first informally by developmentally disabled women[1] in the institution and then by families in the outside world, all of them were self-supporting. Eleven of them were married. They had a mean of 11.68 years of education. They earned an average wage of $4,224, which was in the range of average annual earnings for men in Iowa, their home state—not bad for a group of women from an institutional background in the 1960s.

Of the twelve girls in the control group, only four of them had jobs, all of them working in the institutions where they lived. Only three had been married. On average they had less than four years of schooling. The cost savings to the state for rescuing the girls who went on to live healthy, productive lives was approximately $200 million in today’s dollars.

Anya’s primary point for telling this story is to review the history of evidence that standardized tests are poor predictors of human potential. But the story is also a compelling illustration of the long-term harm to health and wellbeing that we do to humans when we subject them to inhumane conditions (and, on a more hopeful note, how just a little bit of human love and understanding can be so transformative in a person’s life). Note that the Gallup research shows long-term health effects for work situations that are likely a lot less stressful than those of living in a Depression-era orphanage and almost certainly not worse than the kind of stress that Chicago kindergartener endured.

As I was pondering this story, I was reminded of FDA Commissioner David Kessler. (Bear with me on this.) Kessler successfully argued that nicotine addiction is a pediatric disease based on the long-term harm that it does to children. On that basis, he was able to establish that regulating tobacco falls under the purview of the FDA and was therefore able to put a collar on the powerful tobacco industry and regulate it for the first time. Given the severe and long-term stress that American children endure today due to a testing regime that takes up to 25% of students’ total schooling time, I wonder whether similarly compelling evidence could be gathered showing that forcing students to endure endless rounds of high-stakes standardized testing has effects analogous to long-term exposure to hazardous waste.

  1. Michael’s note: Given the rest of the story that Anya is telling here, it makes one wonder how many of those women were really developmentally disabled.

The post Is Standardized Testing a Pediatric Disease? appeared first on e-Literate.

About Inside Higher Ed Selling Majority Stake

Sun, 2015-01-18 01:20

Update 1/21: See link and blurb at bottom of post from new Editor’s Note at Inside Higher Ed.

Last week the Huffington Post ran an article by David Halperin breaking the news that the private equity firm Quad Partners had acquired a controlling interest in Inside Higher Ed.

Quad Partners, a New York private equity firm that is invested heavily in the for-profit college industry, and whose founder has aggressively opposed regulation of that troubled industry, has acquired a controlling stake in the respected trade publication Inside Higher Ed (IHE), which often reports on for-profit colleges and the policy disputes surrounding them. There has been no public announcement, but the Quad Partners website now lists Inside Higher Ed as one of its investments, among a range of education-related companies, including for-profit trade schools Beckfield College, Blue Cliff College, Dorsey Schools, Pacific College of Oriental Medicine, and Marinello Schools of Beauty.

Doug Lederman, one of IHE’s two top editors, confirmed to me that Quad purchased a majority interest in IHE in November.

Quad Partner James Tieng is now an IHE board member. Quad also owns the influential college admissions management company Noel-Levitz and other education technology companies that contract with colleges and universities — another sector that IHE covers.

The rest of the article then goes full conspiracy theory, building off the for-profit connections of both Quad Partners and its founder. Halperin seems to believe that mere indirect association with for-profits is compromising in and of itself; he does not identify any actual changes or compromises in IHE coverage.

The bigger issue in my mind was described by Keith Button at Education Dive.

While the list of potential conflicts of interest in such a sale is long, the fact that the deal wasn’t announced and the potential news coverage issues weren’t publicly addressed up-front raises more questions.

This issue of disclosure was partially addressed in the original article:

“I would expect people to be watching us” in light of this purchase, says Lederman. “Our credibility is hugely important to us, and ultimately it will rise or fall on the nature and tenor of our coverage.” He says IHE will go on as before: “The proof will be in what we publish.” If there are significant references in IHE to specific Quad-owned companies, the publication will disclose the relationship.

In my mind, IHE made a serious mistake by not publicizing the acquisition back in November and not issuing a blanket disclosure. I don’t fault them for selling the controlling stake in the company, especially given the lack of a paywall. But I do fault them for not realizing how the lack of disclosure created the opportunity for an advocate to publicly challenge them. It’s actually ironic to see a full-fledged advocate (Halperin writes extensively attacking the for-profit sector as part of his funding and openly calls himself an advocate) require 100% pure financial independence for IHE.

There are two types of disclosure that are relevant – a blanket disclosure announcing a key event such as the sale of the majority of company shares, proactively distributed and available; and article-specific disclosures if IHE articles reference companies tied to their owners. IHE seems to be relying on the latter, but their credibility will take a hit by not doing the former.

IHE was caught off guard by the Huffington Post article, and they seem to have quickly put up an Ownership Statement on the same day the article ran (Jan 14th).

Inside Higher Ed is an independent journalism organization. The journalistic independence is critical in ensuring the fairness and thoroughness of our higher education coverage.

Inside Higher Ed Inc. is owned by its three founders, other individual investors, and Quad Partners, a private equity firm that invests in the education space. Quad purchased a controlling share of Inside Higher Ed in November 2014 from a group of venture capital firms that invested in the company originally a decade earlier.

Owners of Inside Higher Ed stock who are not editors play no role in the editorial policies of the company.

The problem is the following:

  • This statement comes across as a reaction to Halperin – you got us – leading to the appearance that IHE had something to hide; and
  • IHE has done little to actually disclose this ownership, as the statement is only linked on the About Us page and Doug Lederman’s page (no articles or prominent placement of significant news event).

I read and research quite a bit of higher ed news and it took me a while to find this statement, despite the fact that I was specifically looking for information. With the current placement, very few people would have seen it.

This news is relevant, more for Quad Partners’ ownership of Noel-Levitz than for its ownership of Marinello Schools of Beauty. Higher ed enrollment in the US has been declining for the past two years, and this change is shaping up to be one of the biggest drivers of change initiatives for institutions and associated markets. There might be no organization more influential in enrollment management than Noel-Levitz. In the past 12 months Inside Higher Ed has written eight articles in which Noel-Levitz plays an important role, and this prominent Fortune article profiling the company states:

Noel-Levitz might be the most influential force in higher education pricing that you’ve never heard of, empowering what’s become a three-stage, market-distorting game for college administrators.

Readers should know about the ownership connection given the importance of enrollment management and college pricing, and readers should not have to find this if and only if they read an article with direct references.

Do I believe that Quad Partners has or will change IHE coverage, especially on enrollment management and pricing? No. In my experience, IHE’s leadership and the reporters I’ve dealt with have been very ethical and honest. Furthermore:

Lederman says that at the insistence of IHE, the purchase agreement includes a clause that precludes Quad Partners from any involvement in editorial operations. IHE was launched by Lederman and two co-founders in 2004, with a modest investment from three Washington DC-area venture funds, including the owners of the lead generation company Double Positive. Those three investors, who sold their shares to Quad in November, also had no role in editorial operations, says Lederman.

IHE does a great job covering important stories in higher ed, including a watchdog role of exposing problems that arise. We need them to be trusted, and they should quickly correct the mistake. My unsolicited advice:

  • Write an article disclosing the sale and linking to the Ownership Statement – don’t make this information hard to find;
  • Quote a portion of the purchase agreement clause in the article to clarify their statement of editorial independence; and
  • Create a separate page of editorial policies.

Update 1/19: In a separate Education Dive post from the weekend:

A top editor of Inside Higher Ed said Friday that, in hindsight, he wished there had been more transparency about the sale of the publication’s controlling interest to a private equity firm that has invested heavily in for-profit education.

“We were founded without any support, then we had one set of investors and we had never said anything about them,” Scott Jaschik, an Inside Higher Ed founder and editor, told Education Dive. “In hindsight, I wish we had, because clearly this is of interest to people.” [snip]

“I guess I would just say to anyone who has questions, read us and read our coverage and call me if you think we’re doing anything that we shouldn’t,” he said.

Excellent work by Education Dive, by the way. As for IHE, I still think they would benefit from a blanket disclosure.

Update 1/21: Inside Higher Ed has now posted a full blanket disclosure note. Good for them.

Some of you may have seen some recent blog posts and discussion on Twitter or elsewhere about Inside Higher Ed Inc.’s ownership status. We wanted you to have more information directly from us. [snip]

In November 2014, Quad Partners, a private equity firm that invests in numerous companies in the education space, including some small for-profit colleges, bought a controlling interest in our company by purchasing shares of Inside Higher Ed Inc.’s stock from our previous investors.

Quad intends to help Inside Higher Ed expand its staff, extend its reach, and improve its coverage and services. Its goal is to help Inside Higher Ed do what it does better. And yes, like all investors, it wants to make money.

Owners of Inside Higher Ed Inc. stock who are not editors play no role in the editorial policies of the company. Quad acknowledged explicitly in its agreement to invest in Inside Higher Ed Inc. that it would be precluded from any involvement in editorial operations.

The post About Inside Higher Ed Selling Majority Stake appeared first on e-Literate.

No Discernible Growth in US Higher Ed Online Learning

Tue, 2015-01-06 11:34

By 2015, 25 million post-secondary students in the United States will be taking classes online. And as that happens, the number of students who take classes exclusively on physical campuses will plummet, from 14.4 million in 2010 to just 4.1 million five years later, according to a new forecast released by market research firm Ambient Insight.

- Campus Technology, 2011

On the positive side, Moody’s notes that the U.S. Department of Education projects a 20-percent growth in master’s degrees and a 9-percent growth in associate degrees, opportunities in both online education and new certificate programs, and a rising earnings premium for those with college degrees.

- Chronicle of Higher Ed, 2014

Q.  How likely would it be that this fraction [% students taking online courses] would grow to become a majority of students over the next five years? A [from institutional academic leaders]. Nearly two-thirds responded that this was “Very likely,” with an additional one-quarter calling it “Likely.” [That’s almost 90% combined]

- Grade Change, Babson Survey 2013

More than two-thirds of instructors (68 percent) say their institutions are planning to expand their online offerings, but they are split on whether or not this is a good idea (36 percent positive, 38 percent negative, 26 percent neutral).

- Inside Higher Ed 2014

Still, the [disruptive innovation] theory predicts that, be it steam or online education, existing consumers will ultimately adopt the disruption, and a host of struggling colleges and universities — the bottom 25 percent of every tier, we predict — will disappear or merge in the next 10 to 15 years.

- Clayton Christensen in NY Times 2013

You could be forgiven for assuming that the continued growth of online education within US higher ed was a foregone conclusion. We all know it’s happening; the question is how to adapt to the new world.

But what if the assumption is wrong? Based on the official Department of Education / NCES new IPEDS data for Fall 2013 term, for the first time there has been no discernible growth in postsecondary students taking at least one online course in the US.

From 2002 through 2013 the most reliable measure of this metric has been the Babson Survey Research Group (BSRG) annual reporting. While there are questions about the absolute numbers, owing to differing definitions of what makes a course “online”, the year-over-year growth numbers have been quite reliable and are the most-referenced numbers available. Starting last year, with the Fall 2012 term, the official IPEDS data began tracking online education, and last week they put out Fall 2013 data – allowing year-over-year comparisons.

I shared the recent overall IPEDS data in this post, noting the following:

By way of comparison, it is worth noting the similarities to the Fall 2012 data. The percentage data (e.g. percent of a sector taking exclusive / some / no DE courses) has not changed by more than 1% (rounded) in any of the data. This unfortunately makes the problems with IPEDS data validity all the more important.

It will be very interesting to see the Babson Survey Research Group data that is typically released in January. While Babson relies on voluntary survey data, as opposed to mandatory federal data reporting for IPEDS, their report should have better longitudinal validity. If this IPEDS data holds up, then I would expect the biggest story for this year’s Babson report to be the first year of no significant growth in online education since the survey started 15 years ago.

I subsequently found out that BSRG is moving this year to use the IPEDS data for online enrollment. So we already have the best data available, and there is no discernible growth. Nationwide there are just 77,493 more students taking at least one online class, a 1.4% increase.

Y-o-Y Analysis

Why The Phrase “No Discernible Growth”?

Even though there was a nationwide increase of 77,493 students taking at least one online course, representing 1.4% growth, there is too much noise in the data for this to be considered real growth. Even with the drop in total enrollment, the percentage of students taking at least one online course only changed from 26.4% to 27.1%.

Just take one school – Suffolk County Community College – which increased by roughly 21,600 student enrollments taking at least one online course from 2012 to 2013 due to a change in how they report data and not from actual enrollment increases. More than a quarter of the annual nationwide increase can be attributed to this one reporting change[1]. These and similar issues are why I use the phrase “no discernible growth” – the year-over-year changes are now lower than the ability of our data collection methods to accurately measure.
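To make the noise-versus-signal point concrete, here is a minimal sketch (Python; the variable names are mine, and the Suffolk figure is the rough estimate quoted above) of how much of the national year-over-year increase a single reporting change can account for:

```python
# Figures quoted above from the Fall 2012 -> Fall 2013 IPEDS comparison.
national_increase = 77_493   # more students taking at least one online course
suffolk_change = 21_600      # rough increase from one school's reporting change,
                             # not from actual enrollment growth

share_from_one_school = suffolk_change / national_increase
print(f"{share_from_one_school:.0%}")  # -> 28%, i.e. more than a quarter
```

With more than a quarter of the national increase traceable to a single reporting artifact, the 1.4% "growth" sits well within the measurement noise.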

Combine Babson and IPEDS Growth Data

While we should not directly compare absolute numbers, it is reasonable to combine the BSRG year-over-year historical growth data (2003 – 2012) with the new IPEDS data (2012 – 2013).

Y-o-Y Growth Chart

One thing to notice is that this is really part of a long-term trend of declining growth in online enrollment. In last year’s BSRG report, the authors specifically called out this trend.

The number of additional students taking at least one online course continued to grow at a rate far in excess of overall enrollments, but the rate was the lowest in a decade.

What has not been acknowledged or fully understood is the significance of this rate hitting zero, at least within the bounds of the noise in data collection.

Implications

Think of the implications here if online education has stopped growing in US higher education. Many of the assumptions underlying institutional strategic plans and ed tech vendor market data are based on continued growth in online learning. It is possible that there will be market changes leading back to year-over-year growth, but for now the assumptions might be wrong.

Rather than focusing just on this year, the more relevant questions concern the future, particularly if you look at the longer-term trends. Have we hit a plateau in terms of the natural level of online enrollment? Will the trend continue to the point where online enrollment growth drops below overall enrollment growth? Will online enrollments bottom out and start to rise again once we get the newer generation of tools and pedagogical approaches such as personalized learning or competency-based education beyond pilot programs?

I am not one to discount the powerful effect that online education has had and will continue to have in the US, but the growth appears to be at specific schools rather than broad-based increases across sectors. Southern New Hampshire, Arizona State University, Grand Canyon University and others are growing their online enrollments, but University of Phoenix, DeVry University and others are dropping.

One issue to track is the general shift from for-profit enrollment to not-for-profit enrollment, even if the overall rates of online course-taking have remained relatively stable within each sector. There are approximately 80,000 fewer students taking at least one online course at for-profit institutions while there are approximately 157,000 more students in the same category in the public and private not-for-profit sectors.
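As a rough consistency check (a sketch using only the approximate sector figures quoted above), the sector shifts nearly net out to the reported nationwide increase:

```python
# Approximate changes in students taking at least one online course, by sector.
for_profit_change = -80_000        # fewer students at for-profit institutions
not_for_profit_change = 157_000    # more students at public and private
                                   # not-for-profit institutions combined

net_change = for_profit_change + not_for_profit_change
print(net_change)  # -> 77000, close to the reported national increase of 77,493
```

That the sector-level shifts roughly sum to the national figure is what you would expect if the story is redistribution across sectors rather than broad-based growth.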

I suspect the changes will continue to happen in specific areas – number of working adults taking courses, often in competency-based programs, at specific schools and statewide systems with aggressive plans – but it also appears that just making assumptions of broad-based growth needs to be reconsidered.

Update: Please note that the data release is new and these are early results. If I find mistakes in the data or analysis that changes the analysis above, I’ll share in an updated post.

  1. Russ Poulin and I documented these issues in a separate post showing the noise is likely in the low hundreds of thousands.

The post No Discernible Growth in US Higher Ed Online Learning appeared first on e-Literate.

Fall 2013 IPEDS Data: New Profile of US Higher Ed Online Education

Mon, 2015-01-05 14:48

The National Center for Education Statistics (NCES) and its Integrated Postsecondary Education Data System (IPEDS) provide the most official data on colleges and universities in the United States. Last year they released data on distance education (essentially online courses) starting with the Fall 2012 term. Last week they released new data for the Fall 2013 term. Below is a profile of online education in the US for degree-granting colleges and universities, broken out by sector and for each state.

Please note the following:

  • For the most part distance education and online education terms are interchangeable, but they are not equivalent as DE can include courses delivered by a medium other than the Internet (e.g. correspondence course).
  • I have provided some flat images as well as an interactive graphic at the bottom of the post. The interactive graphic has much better image resolution than the flat images.
  • There are three tabs below in the interactive graphic – the first shows totals for the US by sector and by level (grad, undergrad); the second also shows the data for each state; the third shows a map view.
  • Please note that Russ Poulin of WCET and I have identified some significant problems with IPEDS data validity – see this article for more information.


Fall Profile by Sector

By way of comparison, it is worth noting the similarities to the Fall 2012 data. The percentage data (e.g. percent of a sector taking exclusive / some / no DE courses) has not changed by more than 1% (rounded) in any of the data. This unfortunately makes the problems with IPEDS data validity all the more important.

It will be very interesting to see the Babson Survey Research Group data that is typically released in January. While Babson relies on voluntary survey data, as opposed to mandatory federal data reporting for IPEDS, their report should have better longitudinal validity. If this IPEDS data holds up, then I would expect the biggest story for this year’s Babson report to be the first year of no significant growth in online education since the survey started 15 years ago.

Fall 2012 Profile by Sector

If you select the middle tab, you can view the same data for any selected state. As an example, here is data for California.

Sector and State CA Example

There is also a map view of state data colored by number of, and percentage of, students taking at least one online class for each sector. If you hover over any state you can get the basic data. As an example, here is a view highlighting North Carolina public 4-year institutions.

Map NC public 4-year

 

Interactive Graphic

For those of you who have made it this far, here is the interactive graphic. Enjoy the data.


The post Fall 2013 IPEDS Data: New Profile of US Higher Ed Online Education appeared first on e-Literate.

Fall 2013 IPEDS Data: Top 30 largest online enrollments per institution

Mon, 2015-01-05 05:18

The National Center for Educational Statistics (NCES) and its Integrated Postsecondary Education Data System (IPEDS) provide the most official data on colleges and universities in the United States. Last year they released data on distance education (essentially online courses) starting with the Fall 2012 term. Last week they released new data for the Fall 2013 term.

Let’s look at the top 30 online programs for Fall 2013 (in terms of total number of students taking at least one online course) while showing both 2012 and 2013 data. Some notes on the data source:

  • I have combined the categories ‘students exclusively taking distance education courses’ and ‘students taking some but not all distance education courses’ to obtain the ‘at least one DE’ category;
  • The ‘All Students’ category combines those taking DE course with those taking no DE courses;
  • I have highlighted in red the not-for-profit sectors;
  • IPEDS tracks data based on the accredited body, which can differ for systems – I manually combined most for-profit systems into one institution entity as well as Arizona State University[1]; and
  • Please note that Russ Poulin of WCET and I have identified some significant problems with IPEDS data validity – see this article for more information.
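The category arithmetic described in the notes above can be sketched in a few lines. This is a minimal illustration, not the author's actual workflow, and the field names below are hypothetical placeholders rather than real IPEDS variable codes:

```python
# Sketch of how the two IPEDS distance-education categories combine into
# the derived figures used in the table. Field names are hypothetical.

def summarize(exclusively_de, some_de, no_de):
    """Return the derived ('at least one DE', 'All Students') counts.

    'At least one DE' = exclusively DE + some-but-not-all DE.
    'All Students'    = at least one DE + no DE.
    """
    at_least_one_de = exclusively_de + some_de
    all_students = at_least_one_de + no_de
    return at_least_one_de, all_students

# Example with made-up enrollment numbers:
at_least_one, total = summarize(exclusively_de=1200, some_de=800, no_de=3000)
print(at_least_one, total)  # 2000 5000
```

The same combination is applied per institution (with for-profit systems manually merged into single entities) before ranking by the "at least one DE" column.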

Fall 2013 Top 30 Largest Online Enrollments Per Institution All Students Taking At Least One Online Course (Graduate and Undergraduate, including Non-degree or certificate-seeking)

Top 30

First Impressions
  • With the exception of Grand Canyon University overall and Argosy University for DE, the for-profit sector took a beating with significant drops in both total enrollment and DE enrollment.
  • Western Governors University, University of Central Florida, Southern New Hampshire University, Suffolk County Community College and Northern Virginia Community College seem to have the biggest DE enrollment increases; Suffolk's change in DE is so dramatic without a corresponding change in total enrollment, however, that I assume they changed how they report the data (see the article mentioned above).

We’ll add more data tables and graphics breaking apart this data over the next few weeks here at e-Literate.

Update: Fixed the rank ordering for Suffolk County Community College and Northern Virginia Community College.

  1. ASU splits into various entities although the online programs are coordinated.

The post Fall 2013 IPEDS Data: Top 30 largest online enrollments per institution appeared first on e-Literate.

The Quotable Justin Reich: MOOC research needs to reboot

Sat, 2015-01-03 18:19

Thanks to Audrey Watters I just read a new article in Science Magazine, publicly posted here by Justin Reich, the lead researcher for HarvardX (Harvard’s implementation of edX and associated research team)[1]. Justin calls out the limitations of current MOOC research: its focus on A/B testing and engagement instead of learning, its single-course context, and its reliance on post hoc analysis rather than analysis built into course design. While praising the field for making cleansed data available for any type of analysis, his core argument is that the field needs new approaches that research teams alone cannot provide.
Update: Added link to publicly-available DOCX article.

While the whole article is worth reading, here are quite a few insightful quotes for those who cannot get past the journal paywall.

  • Big data sets do not, by virtue of their size, inherently possess answers to interesting questions.
  • We have terabytes of data about what students clicked and very little understanding of what changed in their heads.
  • It does not require trillions of event logs to demonstrate that effort is correlated with achievement.

  • One reason that early MOOC studies have examined engagement or completion statistics is that most MOOCs do not have assessment structures that support robust inferences about learning.
  • Distinguishing between engagement and learning is particularly crucial in voluntary online learning settings, because media that provoke confusion and disequilibrium can be productive for learners.
  • Boosting motivation in well-designed courses is good, but if a MOOC’s overall pedagogical approach is misguided, then plug-in experiments can accelerate participation in ineffective practices.
  • For the first MOOC researchers, getting data cleaned for any analysis was an achievement. In early efforts, following the path of least resistance to produce results is a wise strategy, but it runs the risk of creating path dependencies.
Some e-Literate Context

This article is a welcome statement from one of the leading MOOC researchers, and it connects with some earlier posts and interactions at e-Literate. In June 2014 I wrote a post contrasting the MOOC research results with the approach taken at the University of Phoenix.

Beyond data aggregated over the entire course, the Harvard and MIT edX data provides no insight into learner patterns of behavior over time. Did the discussion forum posts increase or decrease over time, did video access change over time, etc? We don’t know. There is some insight we could obtain by looking at the last transaction event and number of chapters accessed, but the insight would be limited. But learner patterns of behavior can provide real insights, and it is here where the University of Phoenix (UoP) could teach Harvard and MIT some lessons on analytics. [snip]

UoP recognizes the value of learner behavior patterns, which can only be learned by viewing data patterns over time. The student’s behavior in a course is a long-running transaction, with data sets organized around the learner.[2]

Two days later I wrote a follow-up post based on commenters speculating that Harvard and MIT might have learning data that was just not released.

Granted, I am arguing without definitive proof, but this is a blog post, after all. I base my argument on two points – there is no evidence of HarvardX or MITx pursuing learner-centered long-running data, and I believe there is great difficulty getting non-event or non-aggregate data out of edX, at least in current forms.

Justin Reich replied in the comments, essentially agreeing about the lack of learner-centered long-running data analysis but disagreeing with my arguments on the effect of MOOC architecture and data availability. This comment from June aligns quite well with the current Science Magazine article.

My research presentation was not exhaustive, although generally belies my belief that we need advances in instrumentation and assessment. Fancy manipulations of predictors (from the click stream) may be limited in value if we don’t have good measures of learning, or a rich understanding of the context of data. But I’m super excited, too, about people doing great work with the edX event log data, and it’ll get out.

It is very encouraging to see the HarvardX team pushing to move beyond clicks-as-engagement and get to actual learning analysis.

Additional Notes

Some additional notes:

  • I still maintain that the course-centric transactional design of MOOCs (as with most LMSs) plays a role in the current, limited MOOC research analysis. I have spoken to many MOOC researchers who lament the enormous amount of time it takes to parse JSON files to try and recreate patterns based on individual learners over time. While I believe that Harvard, MIT, and Stanford have research teams capable of this extraction, a learner-centered system architecture would do wonders to advance the state of the art for learning analytics.
  • As mentioned above, I believe that the standard usage of an LMS in online or blended courses leads to many of the same limitations in learning analytics. You could apply many of Justin’s quotes outside of the MOOC world.
  • I wish Justin had moved beyond formal journal and conference proceedings articles in his references and included the results from the MOOC Research Initiative.[3] Although not peer-reviewed, several of these reports addressed deficiencies such as being discipline-specific and even including the assessment considered in the MOOC design (as opposed to post hoc analysis). These reports do not negate the points made in the Science Magazine article, but it would have been useful to include this set of reports as a basis to understand the current state of research.
  1. Note that Science Magazine access requires a subscription or purchase or individual article.
  2. Note that I based this argument on what UoP claims to be producing internally without being able to validate the results.
  3. Disclosure – MRI was funded by the Gates Foundation, which is also a sponsor of the next e-Literate TV series.

The post The Quotable Justin Reich: MOOC research needs to reboot appeared first on e-Literate.

Harmonizing Learning and Education

Thu, 2015-01-01 16:20

I’m the Whether Man, not the Weather Man, for after all it’s more important to know whether there will be weather than what the weather will be.

The Phantom Tollbooth

Dave Cormier has written a couple of great posts on our failure to take learner motivation seriously and the difference between improving learning and improving education. In the latter post—a response to Stephen Downes’ comment on the former post—Dave writes about the tension between improving an individual’s learning and improving our system of education, essentially positing that the reason we as a society often fail to take learner engagement sufficiently seriously is that we become preoccupied with making the educational system accountable, a goal that we would be irresponsible not to take on but that we are also essentially doomed to fail at. (I may be putting words in his mouth on that last bit.) Dave writes,

There’s definitely something wrong if people are leaving their first degree and are not engaged in learning. We certainly need to address it. We totally want to be in the business of helping people do what they want to do. Try it. No really. Just try it. Sit down with a child and help them do what they want to do. And i don’t mean “hey this child has shown up with a random project they are totally passionate about and are asking me a question” I mean “stop them at a random time, say 8:25am, and just start helping them.” You will get blank stares. You’ll get resistance. You’ll get students who will say anything you want if it means you will go away/give them a grade. You will not enjoy this process. They will also not enjoy it.

There is something wrong. The problem is that we have built an education system with checks and balances, trying to make it accountable and progressive (in some cases), but we are building it without knowing why. We have not built an education system that encourages people to be engaged. The system is not designed to do it. It’s designed to get people to a ‘standard of knowing.’ Knowing a thing, in the sense of being able to repeat it back or demonstrate it, has no direct relationship to ‘engagement’. There are certainly some teachers that create spaces where engagement occurs, but they are swimming upstream, constantly battling the dreaded assessment and the need to cover the curriculum. The need to guarantee knowing.

He suggests that we need to redesign our education system around the goal of getting students to start caring and keep caring about learning. And his argument is interesting:

Give me a kid who’s forgotten 95% of the content they were measured in during K-12 and I will match that with almost every adult i know. Give me a kid who cares about learning… well… then i can help them do just about anything.

This is partly a workplace argument. It’s an economic value argument. It’s a public good argument. If Dave is right, then people who care about learning are going to be better at just about any job you throw at them than people who don’t. This is a critical argument in favor of public funding of a liberal arts education, personalized in the old-fashioned sense of having-to-do-with-individual-persons, that much of academia has ceded for no good reason I can think of. The sticky wicket, though, is accountability, which, as Dave points out, is the main reason we have a schism between learning and education in the first place. Too bad we can’t demonstrate, statistically, that people who are passionate about learning are better workers. It’s a shame that we don’t have good data linking being excited about learning, being a high-performer in your job, and being a happy, fulfilled and economically well-off person. If we had that, we could largely resolve the tension between improving learning and improving education. We could give a compelling argument that it is in the taxpayers’ interest to build an education system whose purpose, as Dave suggests, is to increase the chances that students will start to care and continue to care about learning. It’s a tragedy that we don’t have proof of that link.

Oh, wait.

The Intuition Behind the Argument

Before I get into the numbers, I think it’s important to articulate the argument in a way that makes intuitive sense even to skeptics. As Dave points out, everybody agrees with the proposition that students should love learning if that proposition is presented to them as a platitude. Where people start to waffle is when we present the proposition to them as a priority, as in, “It is more important for students to learn to develop and nurture a passion for learning than it is for them to learn any particular thing.” And in order to resolve the tension between learning and education, we need to make an even stronger proposition: “A student who develops a passion for learning about subjects that are unrelated to her eventual career will, on balance, be a better employee and more successful professional than the same student who has studied content directly related to her eventual career with relative indifference.” Do you believe this proposition? Here’s a test:

Imagine that you could go back in time and choose an undergraduate major that was exactly tailored to the job that you do today. Would you be better or worse at your job than you are now? Would you be more or less happy?

Obviously, this test won’t work for people whose undergraduate major was the perfect pre-professional major for what they are doing now, which will include most faculty. But it should work for a majority of people, including lots of folks in business and government. In my case, I was a philosophy major, which prepared me well for a career in anything except philosophy. If I could have precognitively created a major for myself in educational technology back in the late 1980s, would I be more successful today? Would I be happier? The answer to both of those questions is almost certainly “no.” In fact, there is a good chance that I would have been less successful and less happy. Why? For one thing, I didn’t care about educational technology back then. I cared about philosophy. I pursued it with a passion. This gave me three things that I still have today. First, I have the intellectual tools of a philosopher. I don’t think I would have held onto the tools of another discipline if I didn’t care about them when I was learning about them. Second, I know what it feels like to pursue work that I am passionate about. I am addicted to that feeling. I am driven to find it in every job, and I am not satisfied until I do. This makes me more selective about the jobs I look at and much, much better at the ones that I take. And finally, though it was a long and winding road, my interest in philosophy led me to my interest in instructional technology in many ways. We tend to have a rather stunted notion of what it means for a subject we study to be “related” to our work. In my philosophy classes, I spent a lot of time thinking about what it means to “know” something, what it means to “learn” something, and what it means for something to be “good.” I got to see how these words are tangled up in logic, language, and culture, and how our notions of them change over time. 
I learned how to write and how to think, while I was simultaneously studying the first principles of language and cognition. All of these experiences, all of this knowledge, all of these skills have been directly valuable to me in my career as a professional non-philosopher (or a standup philosopher, as Mel Brooks might call me). I wouldn’t have them if I had majored in educational technology. I would have other things, but honestly, there are no deep skills in my work that I wish I had acquired through earlier specialization. Everything that I have needed to learn, I have been able to learn on the job. As Dave wrote, “Give me a kid who cares about learning… well… then i can help them do just about anything.”

If you are one of those people who majored in exactly what you ended up doing as a career, then try reversing the thought experiment. Suppose you could go back in time and major in anything you wanted. Something that you were passionate about, but something different from what you ended up majoring in. Would it have made a difference? Would you have been more or less successful in your current career? Would you have been more or less happy than you are now? For some folks, that pre-professional major was exactly what they needed to be doing. But I bet that, for a lot of folks, it wasn’t.

Survey says…?!

If any of this resonates with you at all, then you really must read the 2014 Gallup Purdue Index Report. You’ll have to register to get it, but trust me, this one is worth it. Gallup is most widely known for their political polling, but more broadly, their business is in collecting data that links people’s attitudes and beliefs to observable behaviors and objective outcomes. How likely is a person who thinks the “country is on the wrong track” to vote for the incumbent? Or to vote at all? Does believing that your manager is incompetent correlate with an increased chance of a serious heart problem? And conversely, does “having fun” at your job correlate with a higher chance of living into your 90s? Does having a “manager that cares about me as a person” mean that I am more likely to be judged a “top performer” at work and reduce the likelihood that I will be out sick? Does having a teacher who “makes me feel excited about learning” correlate with better workplace engagement when I graduate?

Ah. There it is.

To get the full impact of Gallup’s research, you have to follow it backwards from its roots. The company does significant business in employee satisfaction surveys. As with schooling, managers know that employee engagement matters but often fail to take it seriously. But according to research cited in Gallup’s book Wellbeing: The Five Essential Elements (which I also recommend), employees who could answer “yes” to the question about whether their manager cares about them as a person are “more likely to be top performers, produce higher quality work, are less likely to be sick, less likely to change jobs, and less likely to get injured on the job.” Also, people who love their jobs are more likely to both stay working longer and live longer. In a study George Gallup conducted in the 1950s,

…men who lived to see 95 did not retire until they were 80 years old on average. Even more remarkable, 93% of these men reported getting a great deal of satisfaction out of the work they did, and 86% reported having fun doing their job.

Conversely, a 2008 study by the company found a link between employee disengagement and depression:

We measured their engagement levels and asked them if they had ever been diagnosed with depression. We excluded those who reported that they had been diagnosed with depression from our analysis. When we contacted the remaining panel members in 2009, we again asked them if they had been diagnosed with depression in the last year. It turned out that 5% of our panel members (who had no diagnosis of depression in 2008) had been newly diagnosed with depression. Further, those who were actively disengaged in their careers in 2008 were nearly twice as likely to be diagnosed with depression over the next year. While there are many factors that contribute to depression, being disengaged at work appears to be a leading indicator of a subsequent clinical diagnosis of depression.

Which is obviously bad for employer and employee alike.

In some cases, Gallup went all in with physiological studies. For example, they “recruited 168 employees and studied their engagement, heart rate, stress levels, and various emotions throughout the day,” using heart rate monitors, saliva samples, and handheld devices that surveyed employees on their activities and feelings of the moment at various points in the day.

After reviewing all of these data, it was clear that when people who are engaged in their jobs show up for work, they are having an entirely different experience than those who are disengaged. [Emphasis in original.] For those who were engaged, happiness and interest throughout the day were significantly higher. Conversely, stress levels were substantially higher for those who were disengaged. Perhaps most strikingly, disengaged workers’ stress levels decreased and their happiness increased toward the end of the workday….[P]eople with low engagement…are simply waiting for the workday to end.

From here, the authors go on to talk about depression and heart attacks and all that bad stuff that happens to you when you hate that job. But there was one other striking passage at the beginning of this section:

Think back to when you were in school sitting through a class in which you had very little interest. Perhaps your eyes were fixed on the clock or you were staring blankly into space. You probably remember the anticipation of waiting for the bell to ring so you could get up from your desk and move on to whatever was next. More than two-thirds of workers around the world experience a similar feeling by the end of a typical workday.

And here’s what Dave said in his first post:

Students separate into two categories… those that care and those that don’t care.

Our job, as educators, is to convince students who don’t care to start caring, and to encourage those who currently care, to continue caring.

All kinds of pedagogy happens after this… but it doesn’t happen until this happens.

So. In this case, we’re trying to make students move from the ‘not care’ category to the ‘care’ category by threatening to not allow them to stay with their friends. Grades serve a number of ‘not care to care’ purposes in our system. Your parents may get mad, so you should care. You’ll be embarrassed in front of your friends so you should care. In none of these cases are you caring about ‘learning’ but rather caring about things you, apparently, already care about. We take the ‘caring about learning’ part as a lost cause.

The problem with threatening people is that in order for it to continue to work, you have to continue to threaten them (well… there are other problems, but this is the relevant one for this discussion). And, as has happened, students no longer care about grades, or their parents believe their low grades are the fault of the teacher, then the whole system falls apart. You can only threaten people with things they care about.

I’m not suggesting that we shouldn’t hold kids accountable, but if we’re trying to encourage people to care about their work, about their world, is it practical to have it only work when someone is threatening them? Even if you are the most cynical person imaginable, wouldn’t you like people to be able to do things when you aren’t actually threatening them? Are we promoting a ‘creative/knowledge economy’ by doing this? Are we building democracy? Unless you are a fascist (and i really mean that, unless you want a world where a couple of people tell everyone exactly what to do) you can’t really want the world to be this way.

It turns out that Dave actually overstates the case for Fascism. Fascist bosses get bad results from employees (in addition to, you know, killing them). If you want high-performing workers, you need engaged workers. And you can’t force people to engage.

Wellbeing isn’t just about work. It looks at five different types of personal “wellbeing”—career, social, financial, physical, and community—and shows how they are related to each other, to overall wellbeing, and to performance at work and in the world. (By the way, there’s a lot of good stuff in the sections on social and community wellbeing for the connectivists and constructionists in the crowd.)

We Don’t Need No Education

The Gallup Purdue Index Report picks up where Wellbeing leaves off. Having established some metrics that correlate both with overall personal happiness and success as well as workplace success, Gallup backs up and asks the question, “What kind of education is more likely to promote wellbeing?” They surveyed a number of college graduates in various age groups and with various measured levels of wellbeing, asking them to reflect back on their college experiences. What they didn’t find is in some ways as important as what they did find. They found no correlation between whether you went to a public or private, selective or non-selective school and whether you achieved high levels of overall wellbeing. It doesn’t matter, on average, whether you go to Harvard University or Podunk College. It doesn’t matter whether your school scored well in the U.S. News and World Report rankings. Student debt levels, on the other hand, do matter, so maybe that Harvard vs. Podunk choice matters after all. And, in a finding that will cheer my philosophy professors, it turns out that “[s]lightly more employed graduates who majored in the arts and humanities (41%) and social sciences (41%) are engaged at work than either science (38%) or business (37%) majors.”

What factors did matter? What moved the needle? Odds of thriving in all five areas of Gallup’s wellbeing index were

  • 1.7 times higher if “I had a mentor who encouraged me to pursue my goals and dreams”
  • 1.5 times higher if “I had at least one professor at [College] who made me excited about learning”
  • 1.7 times higher if “My professors at [College] cared about me as a person”
  • 1.5 times higher if “I had an internship or job that allowed me to apply what I was learning in the classroom”
  • 1.1 times higher if “I worked on a project that took a semester or more to complete”
  • 1.4 times higher if “I was extremely active in extracurricular activities and organizations while attending [College]”

Again, the institution type didn’t matter (except for students who went to for-profit private colleges, only 4% of whom were found to be thriving on all five measures of wellbeing). It really comes down to feeling connected to your school work and your teachers, which does not correlate well with the various traditional criteria people use for evaluating the quality of an educational institution. If you buy Gallup’s chain of argument and evidence this, in turn, suggests that being a hippy-dippy earthy-crunchy touchy-feely constructivy-connectivy commie pinko guide on the side will produce more productive workers and a more robust economy (not to mention healthier, happier human beings who get sick less and therefore keep healthcare costs lower) than being a hard-bitten Taylorite-Skinnerite practical this-is-the-real-world-kid type career coach. It turns out that pursuing your dreams is a more economically productive strategy, for you and your country, than pursuing your career. It turns out that learning a passion to learn is more important for your practical success than learning any particular facts or skills. It turns out that it is more important to know whether there will be weather than what the weather will be.

So…what do we do with all this ed tech junk we just bought?

This doesn’t mean that ed tech is useless by any means, but it does mean that we have to think about what we use it for and what it can realistically accomplish. Obviously, anything that helps teachers and advisers connect with students, students connect with each other, or students connect with their passions is good. There’s also nothing inherently wrong with video lectures or adaptive learning programs as long as they are used as informational supplements once students start caring about what they learn or as tools to keep them caring about what they learn rather than substitutes for real engagement that shovel content in the name of “competency.” I’m interested in “flipping,” fad or no fad, because it emphasizes using the technology to clear the way for more direct human-to-human interactions with the students. Competencies themselves should be used more as markers of progress down a road that the student has chosen to travel rather than a set of hoops that the student must jump through (like a trained dog). Another thing that technologies can do is help students with what may be the only prerequisite to having passion to learn, which is believing that you can learn. In the places where I’ve seen adaptive learning software employed to most impressive effect, it has been in concert with outreach and support designed to help students who never learned to believe in themselves discover that they can, in fact, make progress in their education. Well-designed adaptive software lets them get help without feeling embarrassed and, perhaps more importantly, enables them to arrive at a confidence-building feeling of success and accomplishment quickly.

The core problem with our education system isn’t the technology or even the companies. It’s how we deform teaching and learning in the name of accountability in education. Corporate interests amplify this problem greatly because they sell to it, thus reinforcing it. But they are not where the problem begins. It begins when we say, “Yes, of course we want the students to love to learn, but we need to cover the material.” Or when we say, “It’s great that kids want to go to school every day, but really, how do we know that they’re learning anything?” It’s daunting to think about trying to change this deep cultural attitude. Nor does embracing Gallup’s train of evidence fully get us out of the genuine moral obligation to find some sort of real (but probably inherently deforming) measure of accountability for schools. But the most interesting and hopeful result from the Gallup research is this:

You don’t have to have every teacher make you feel excited about learning in order to have a better chance at a better life. You just need one.

Just one.

The post Harmonizing Learning and Education appeared first on e-Literate.