By Phil Hill
Yesterday morning Barnes & Noble Education, the owner of 743 college bookstores and related services, announced the acquisition of LMS provider LoudCloud Systems for $17.9 million in cash. LoudCloud provides a traditional LMS for general higher education and K-12 needs, and they are best known for their customer Grand Canyon University, a for-profit provider and partial investor. LoudCloud from the beginning touted a modular approach to learning management systems, and one of their key modules is LoudBooks, an “adaptive and social” eReader of digital course materials. More recently, LoudCloud entered the competency-based education (CBE) market by releasing FASTRAK, an LMS specifically designed for CBE programs.
According to LISTedTECH data, LoudCloud is used in a half dozen colleges and university accounts in the US, although these are mostly for-profits with multiple campuses.
I have long heard discussions that LoudCloud was looking for an exit, so the acquisition of the company is not a surprise, per se. But why Barnes & Noble Education? I think the timing of the announcement helps answer that question.
Barnes & Noble Education just announced their quarterly earnings, and it is not a pretty picture.
Barnes & Noble Education Inc. (BNED) reported a third-quarter loss of $3.6 million, or 7 cents per share, versus a profit of $8.7 million, or 9 cents per share, for the same period last year. Sales for the quarter were $518.4 million, down from $521.6 million last year.
In the conference call CEO Max Roberts positioned the LoudCloud acquisition as a response to their needs in courseware, fighting off the flagging sales of campus stores.
As we look at sales for the quarter by merchandise category, our textbook sales declined 5.4% on a comparable basis, primarily due to the later Rush period and decreased enrollments in community colleges. Aggressive online marketplace pricing, digital direct courseware sales, OER content, along with students’ reluctance to purchase textbooks, is providing some headwinds for course material sales. In order to combat these headwinds, we have successfully piloted our textbook price-matching program on multiple campuses this fall, and again during the Spring Rush.
While we believe sales and rental of the printed textbook has a long tail, our investment in Faculty Enlight, which is now being used by over 225,000 faculty for the adoption of publisher platform digital content, either for an individual student or on a class-subscription basis. Along with our investment in LoudCloud, will provide us the necessary platform and tools to effectively compete for digital courseware, OER content and services sales.
Later in the call Roberts elaborated on the acquisition.
LoudCloud will provide a foundational asset for digital product development, and also accelerates the development of our advanced learning platform. This acquisition provides us immediate access to advanced technology and a data analytics platform. The software is cloud-based, software-as-a-service offering, and is easily configured, and is scalable. LoudCloud was founded by Manoj Kutty, along with a talented team of technologists. LoudCloud has a compelling suite of products, including a multi-learning management system, courseware development and eReading, an analytics to support competency-based education, aimed at developing improved student outcomes and retention.
LoudCloud has an established revenue stream and a low-cost expense structure. Our immediate priority will be to use the LoudCloud platform to deliver content and courseware solutions to our and their existing clients. LoudCloud uses its software to capture and analyze key behavioral and performance metrics from students, enabling educators to monitor and improve outcomes, ultimately improving students’ and institutional success.
My summary of this acquisition:
- This is a semi-distressed sale. LoudCloud Systems raised $11 million in two rounds according to SEC documents and sold for $17.9 million. While the company has had some interesting products, it has never turned them into a growing market position in either the traditional LMS market or the CBE market – at least not yet. CEO Manoj Kutty told me last year that their CBE play was a long-term approach not expected to produce significant revenue for several years.
- This company has confused me for years. I expected them to go further in the market than they have. The architecture and ideas in their platforms seem compelling, but they do not have a good track record of getting out of pilots and generating real sales.
- The primary rationale for the purchase is a pivot of sorts as Barnes & Noble addresses flagging textbook sales and the need to develop a courseware and digital content strategy. This likely means that LoudBooks is the key technology Barnes & Noble needed: an internal asset driving the company’s evolving digital content strategy, or what they call a “foundational asset for digital product development”.
- Beyond courseware, however, Barnes & Noble plans to continue operating LoudCloud as an LMS provider both for traditional markets and CBE markets. The open question is whether Barnes & Noble will invest in the LMS market, trying to accelerate LoudCloud’s growth beyond usage at the handful of schools using the system currently.
"2016 is going to be an eventful year for the LMS" ® by @mfeldstein67
— Phil Hill (@PhilOnEdTech) February 19, 2016
Expect more market news soon.
Update: Clarified language on higher ed clients.
Update: Changed the total equity funding to $11 million instead of $15 million and changed source of data. Crunchbase had the data links correct but the data on round 1 was just plain wrong. I apologize for not catching this earlier.
The post Barnes & Noble Education Buys LMS Provider LoudCloud Systems For $17.9 Million appeared first on e-Literate.
By Michael Feldstein
A while back, I wrote a rant about why LMSs fail to evolve. It got some…notoriety. Since Phil and I are now writing a bi-weekly column for the Chronicle‘s new Re:Learning section, I thought I’d try trimming down the piece and rewriting it for a general audience. It was an interesting exercise. On e-Literate, we have the luxury of writing pieces that are as long as we want them to be and of assuming that our audience is pretty knowledgeable about a wide range of tech issues. We can also choose to target different subgroups of our audience with each post. Honestly, there have been times when I’ve written a post that was primarily intended to persuade one (influential) person, although I always try to make my posts interesting to a broader range of people. When we write for the Chronicle, we are always writing for a broad audience of non-techies, and we have to keep our ideas small and focused enough that we can adequately express them in 1500 words or less. But to the degree that we succeed, we have an opportunity to be ambassadors for ed tech issues to broader academia.
Since many of you play that role of ambassador every day, I’d be grateful for your feedback. If you have a moment, please go read the article at the Chronicle and then come back here and comment on it. Did I pick the right aspects of the issue to focus on? Do you think this piece would resonate with your campus stakeholders? Would it be helpful to you in your conversations at work?
By Michael Feldstein
As many of you know, Phil and I have been exploring the topic of “personalized learning” for the past year, trying to figure out what the heck it means and what it’s really good for, separate from the hype. We’ve boiled down the idea into the beginnings of a framework, which we’ve just had published in EDUCAUSE Review. The central focus is to think of “personalized learning” as a set of technology-supported teaching practices that un-depersonalize teaching. While we’re exploring that, we also dive into the question of why hype happens, why the hype cycle exists, and how we can try to kill it.
You can read it here.
By Phil Hill
Chegg, which is in the midst of a dramatic change in their business model by moving from textbook rentals to digital student services, got slammed last week in the stock market. After reporting mixed results of better-than-expected earnings yet worse-than-expected revenues, their stock price lost 35% in one day (Feb 22). But this is not a story about Chegg or stock prices. What I find fascinating is an explanation that Chegg CEO Dan Rosensweig provided about e-textbooks in his discussion with analysts.
Far too often people assume that digital equals low cost, even for textbooks. Then we get reports and surveys looking at digital textbooks as a method to “save money”, where it is almost assumed that digital textbooks do save money; it’s just a question of whether faculty take this fact into consideration. Or stock market analysts make the same assumption, which was the topic of Rosensweig’s discussion on Mad Money. In this conversation, as described at Seeking Alpha, Rosensweig made a very interesting observation.
Another misunderstanding is how e-textbooks affect Chegg’s revenue. Chegg has historically recognized 100% of the revenue from e-textbook sales. Interestingly, Rosensweig claims that e-textbooks used to be growing at 60% a year but have since slowed to 0%. In his Mad Money interview, Rosensweig explained that this decrease does not actually stem from volume, rather it is due to an unforeseen imbalance in the price of textbooks.
Students are choosing to rent textbooks in print rather than e-textbooks because the former are far cheaper. Rosensweig exemplifies this with Chegg’s most popular textbook, Campbell’s Biology. The price to rent this book in print is $20, compared to $107 for the e-book version. Who would purchase an e-book when he or she could rent the paper version for one-fifth the price?
Taking a look at Chegg’s site we see this example.
Is the problem just Chegg? No, as the pricing is largely driven by the publishers. While Amazon’s difference is not as great, it is still dramatic ($23 hardcover rental vs. $93 digital textbook rental):
The market for textbooks is distorted – there is absolutely no reason that a digital textbook rental should cost five times what a physical textbook rental costs. This is not a market where you can make otherwise common sense assumptions such as digital being lower cost, or assumptions that a decrease in adoption means that students do not want more digital options.
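The distortion is easy to quantify. Here is a quick back-of-the-envelope check using the single-title prices quoted above for Campbell’s Biology (these are point-in-time examples from the post, not a systematic sample of the market):

```python
# Back-of-the-envelope check of the print-vs-digital rental gap.
# Prices are the single-title examples quoted in the post, not a survey.
prices = {
    "Chegg":  {"print_rental": 20.0, "digital_rental": 107.0},
    "Amazon": {"print_rental": 23.0, "digital_rental": 93.0},
}

for vendor, p in prices.items():
    ratio = p["digital_rental"] / p["print_rental"]
    print(f"{vendor}: digital rental costs {ratio:.1f}x the print rental")
```

That works out to roughly 5x at Chegg and 4x at Amazon for the same title, which is the inversion of the usual “digital is cheaper” assumption.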
Update: See Brad Zurcher G+ comment below. His first point (which I have quoted), if we substantiate, would change the story somewhat. Actually, if we could substantiate his first point, that would imply somewhat deceptive statements by Chegg’s CEO and lead to another story. If anyone knows how to find historical rental pricing from Chegg or Amazon, let me know.
Comparing physical rental prices in late February to digital rental prices is not quite comparing apples to apples. I don’t have screen grabs, but I’m willing to bet that Chegg and Amazon’s prices for 180-day rentals in mid-December were 3-4X what they are today. It is an inventory management tactic at this point. For Chegg, it’s better to rent a book at a loss than it is to leave the book sitting on the warehouse shelf collecting dust. The fact that there is any discrepancy is illogical, but I don’t think it is quite as extreme as indicated above.
The post A Big Reason That Digital Textbooks Are Misunderstood appeared first on e-Literate.
By Phil Hill
This week Jim Julius asked on Twitter if anyone had data on average length of time US higher ed institutions have had their current LMS.
@PhilOnEdTech Do you know of any stats on avg length of time US colleges/unis have been using their current primary LMS?
— Jim Julius (@jjulius) February 23, 2016
That’s an interesting question, as it should give insight into whether schools are unusual in how long or short they’ve been using one solution. It should also give some insight into the market itself.
I have previously shared how we are getting great new ed tech market data from non-survey methods – primarily LISTedTECH and Edutechnica. Michael and I have been working more deeply with the LISTedTECH guys on a new project we plan to roll out soon, taking advantage of the rich data that includes current and historical adoption information on LMS solutions. Thanks to this work, we can start to answer Jim’s question about the age of current LMS implementations, based on primary institutional adoption.
The horizontal bars on the left show average implementation time across US and Canadian higher ed institutions (the LISTedTECH guys are from Ottawa, so I’m sure Jim will forgive the reinterpretation of the question). Update: Average time is in years. The bars on the right side are for the number of institutions on each system.
We have additional data showing the distribution of these implementation times but have yet to figure out the presentation in a non-cluttered, intuitive manner. But we’re working on it.
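For those curious about the mechanics, the underlying computation is straightforward. Here is a rough sketch with made-up records (the institution names, record format, and numbers are all hypothetical; LISTedTECH’s actual dataset is far richer): keep each institution’s most recent adoption, then average the age of that current implementation by LMS.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical adoption records: (institution, lms, adoption_year).
adoptions = [
    ("State U",   "Moodle",     2006),
    ("Tech Inst", "Blackboard", 2002),
    ("City Coll", "Canvas",     2013),
    ("Tech Inst", "Canvas",     2014),  # Tech Inst later switched systems
]
CURRENT_YEAR = 2016

# Keep only each institution's most recent (i.e., current) adoption.
current = {}
for inst, lms, year in adoptions:
    if inst not in current or year > current[inst][1]:
        current[inst] = (lms, year)

# Average age of the current implementation, grouped by LMS.
ages = defaultdict(list)
for lms, year in current.values():
    ages[lms].append(CURRENT_YEAR - year)

for lms, a in sorted(ages.items()):
    print(f"{lms}: avg {mean(a):.1f} years across {len(a)} institution(s)")
```

The key step is the deduplication to the most recent adoption; without it, an institution that switched systems would be counted twice.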
I hope this helps answer the question and provide new data on the market. Expect to see more updates and a better description of this work over the next 6-8 weeks.
By Michael Feldstein
Blackboard’s VP of Teaching and Learning, Valerie Schreiner, was kind enough to give Phil and me a fairly thorough update on the Ultra strategy on Wednesday. Their strategy is clearer to me now. That strategy, which I am about to share with you, is fairly coherent, reasonably interesting, and practically plausible in principle. There are still big execution questions that won’t be answered until we see what is released at BbWorld and talk to some customers. (More on this later.) But at least we have a better sense of what they are trying to do now.
We don’t usually spend this much sustained time focusing on one company and product, but since we are getting a lot of feedback from readers that it is helpful and timely for them, and since Blackboard has been extremely responsive in providing us with clarifications, we’re going to stay on this one until we feel that it’s been adequately covered.

Ultra Isn’t a Product
One of the reasons that the company is struggling to explain their strategy is that Ultra isn’t really a product. It’s not a thing that you buy. It’s more a set of design goals. That’s hard enough to communicate to customers when everyone in the company understands it, but I don’t think they really do—at least, not uniformly across the company. As a result, the “Ultra” label gets attached to both products and aspects of products, with the result of it seeming like Ultra is both everything and nothing.
While Valerie didn’t explicitly describe Ultra for us in quite the way that I’m about to, my takeaway is that Ultra is the following set of design goals across all their products:
- Responsive design and mobile first everywhere
- Stream-based activity views so that people don’t always have to go poking around in individual course shells to find out what’s going on
- Redesign of the web-based course experience, and particularly the course authoring experience, around user workflows
I would add embedded analytics to this list, based on conversations that we’ve had with other Blackboard folks.
Creating a brand for a set of design goals is inherently fraught. Let’s look at two examples of how it makes communication of Blackboard’s strategy tricky for them. First, there’s mobile. Blackboard came out last year with a mobile app called Bb Student. It provides students with that activity stream view across courses and, of course, it’s mobile-first. (In fact, it’s mobile-only at the moment.) Furthermore, the company has made the product available for both traditional 9.x customers (which at this point is pretty much everybody) and their SaaS customers. People inside the company feel like they should be getting more credit for delivering on two major design goals (mobile first and stream-based activity views) as well as for delivering it to customers on the 9.x platform (which was a more significant technical achievement than is immediately obvious).
There are a few reasons why they are not. First, LMS vendors have thrown out mobile apps in the past like they’re conference swag. They are shiny and cool, but they are not really presented as central. Customers don’t take them too seriously. Second, “student-centered,” like “mobile,” is treated as a marketing thing. Schools are not often good at evaluating what “student-centered” really looks like. Bb Student will likely get credit with the customer base only after it has been out in the market for a while and schools actually see it making a difference for students.
But the biggest reason that Blackboard is not getting a lot of credit for Bb Student is that they made the mistake of mixing up their Ultra messaging with their SaaS messaging. Blackboard (and Jay Bhatt in particular) has sold Ultra as this great SaaS-based thing that is made possible by a shiny new architecture. The inference that both we and the customers we have talked to drew from that message is that anything coming on the current architecture is therefore some kind of “Ultra lite.” A pale imitation. Accordingly, Bb Student on 9.x came across as a backport for those poor folks who are stuck on the old platform. Which is everybody, since the new platform wasn’t available yet. Which reminds everybody that the new platform, the one they don’t have yet, the one that they hear is going to be so much better than the one they actually have, is overdue. Which reminds them that, while they are waiting for the new hotness that they’re not sure will ever come, they have complaints about the old and busted thing they have now. And they begin to suspect that those complaints are not getting addressed because everybody at Blackboard is busy working on the new hotness. The thing that they don’t have.
So, yeah. They really didn’t notice that student app thing, for the most part.
This is also what bit Blackboard hard on their gluteus maximus with the “Ultra theme” business. Valerie shared some survey data and other feedback showing that customers really are very interested in the Ultra design goals but are concerned about change management. The 9.x Ultra theme, which Blackboard created in response to this recent feedback, is intended to provide a stepping stone for faculty to experiment with aspects of the new course experience even while the school is on the 9.x platform. (I’ll share more details about what’s in 9.x and what’s not a little later in this post.) Again, folks inside Blackboard thought they would get credit for being responsive to customers and delivering on some of the design goals. And again, because the Ultra message got all mixed up with the SaaS message, what it sounded like—particularly in the absence of good communication even with their best customers about what’s happening with the larger Ultra picture—was “schools think Ultra is too simplistic and it’s not ready yet anyway so we’ll give them a theme that looks a little bit like Ultra.”
In order to understand what’s really going on, we have to disentangle the Ultra design goals from the SaaS product delivery. So what’s really going on with Ultra and SaaS delivery? I’m glad you asked….

Ultra and SaaS Delivery
Forget about Ultra for a moment. Let’s talk SaaS. Back in the summer of 2014, Blackboard promised a SaaS version of Learn. In general, well-run SaaS provides unsexy but very important benefits like better uptime, better ability to handle peak usage loads, and easier, more frequent updates. The company actually quietly piloted the SaaS architecture as early as October of 2014 with their MOOC platform. When it worked, they rolled out a limited release of the SaaS platform to some early customers. There were two major limitations to this release which, in turn, limited its appeal. First, they provided little in the way of integration or customization options. Second, the company was actively tinkering with the platform as it was being used. Most university customers wouldn’t be comfortable with these limitations. The reason that Phil and I have had trouble finding pilot customers is that most of them were K12. The next release of the platform, which would have the stability and flexibility that higher ed customers typically need, was slated for release in October of 2015. They missed delivery by three months and released in January 2016. That’s what the SaaS Plus/Advantage announcement was all about. The problem for them is that they missed the window for spring pilots. So now the earliest that we will see customers doing real production pilots on the SaaS platform is May.
Here again, their messaging killed them, though in a different way. The schedule I just presented in the last paragraph is entirely reasonable for a rollout this complicated. And in that context, a three-month miss is not a huge deal in terms of trust. If Blackboard had originally telegraphed this kind of a timeline when they announced their intention to deliver SaaS in 2014, they would have been fine. But they didn’t. They set the expectation that they would deliver in 2015. And then, when they missed, they didn’t provide any specific information about their progress. No customers we have talked to knew anything. Nor did we. While Jay told us back in the summer of 2015 that there were pilots, he never mentioned that they were K12 customers, or that the MOOC platform was running the SaaS architecture, or what the plan or rationale was. For some reason, the company chose to announce a delivery schedule that they should have known from the beginning that they had no chance of making. Then, having failed to deliver, they also failed to communicate the realistic and sensible plan they apparently had in their back pocket all along.
It’s hard to tell how much of this is because Blackboard as a whole was groping its way toward coherence and how much of it was because the executive team wasn’t listening to their staff. Something clearly went off the rails internally.
But let’s set that aside for the moment. Blackboard is on a path toward SaaS. They have released the first version that they believe is higher ed customer-ready and will have real customers piloting it in production by the summer. Separately from that, they have been on a path toward design goals that they have been calling “Ultra.” They released Bb Student, which delivers on some of those design goals for customers on both current and SaaS versions of Learn, and they have announced the intention to release a theme for 9.x that delivers on some of the design goals for the web version of the product. Where do these two lines intersect?
We have known for some time that the SaaS product will allow for both Ultra and traditional Blackboard course experiences, on a course-by-course basis. Here is some new information:
So what will Ultra on SaaS have that Ultra on Learn 9.x will not? According to this slide from Valerie, two things:
- The activity stream like the one that is already in Bb Student (and therefore available on mobile for 9.x customers)
- The new, simpler workflows
The 9.x customers will get the mobile applications, responsive design (albeit delivered a little later than SaaS customers), and nicer, cleaner look and feel (again, a little later than SaaS customers). I also believe, based on the briefing that we got last summer, that sooner or later the SaaS customers will get some analytics that will not be possible to deliver on 9.x. But Blackboard isn’t announcing anything on that front at the moment.
We also asked about the positioning of Ultra as being for teachers with simpler needs. Is this the vision for Ultra in the long term, or is it more about the fact that they don’t have all the workflows in place yet? The answer we got was mostly the latter. When it became clear that customers were asking for stepping stones to help faculty make the change over to the new experience incrementally, developer resources were diverted from new workflows to smoothing the path. Ultra on SaaS is expected to meet more sophisticated needs over time. That said, it has always been the intention of the company to use Ultra as an opportunity to strip out bloat. But one person’s bloat is another person’s essential feature. They intend to leave the traditional Blackboard course experience as an option for those faculty who need the niche capabilities that they want to move out of the way of mainstream customers.

What to Make of All of This
My first instinct when I saw the Ultra designs in 2014 was that Blackboard was engaging in some deep design thinking that could result in some meaningful differentiation that the market desperately needs. I still believe that. You can’t really get it from buzz phrases like “activity streams,” “mobile-first,” and “student-centered.” And you certainly can’t get it from the bloody mess of a job they’ve done with communication. But I have long had the sense that there is some really good thinking going on inside Blackboard, and Valerie went a long way toward reassuring me that, while it’s not yet clear that I was right, at least I was not crazy.
There are three big unknowns. First, however interesting their design thinking may be, is it productive? Are their ideas good ones? The main way that customers can test that out today is by taking the Bb Student app more seriously. Do some pilots. Actively encourage students to try it out. And if you do, let us know what you find out. The second unknown is, however good their ideas are, can they deliver polished products that work at scale? Bb Student is again one test of that, but we won’t really have the big test until there are customers using Ultra on SaaS in production, and until we see more workflows for Ultra being rolled out at a steady pace. And finally, however good their product turns out to be, will Blackboard be able to communicate to customers about what they are doing and what to expect? Given the new CEO, we will probably need a year to fully suss that out. BbWorld 2016 will be an important test, but there will be other tests before and after. If you count our conversation with Valerie Schreiner as one such test, she passed that one with flying colors. More of that, please.
Despite how tough we’ve been on them, Phil and I really want to see Blackboard succeed because we want to see all these companies and open source projects succeed. We want colleges and universities to have multiple good options. The company may have one more run at this before the customers start running for the exits. Our advice to customers is to pay close attention to what happens this year.
By Michael Feldstein
After Monday’s post on my confusion with Blackboard’s overall Learn strategy, I thought I would follow up with a reminder that there is one really important area where there are strong early signs that Blackboard is doing something right: learning analytics. Learning analytics is one of those areas where there are many, many people talking and very few who are actually making sense. Blackboard has been hiring people that I usually call up when I have learning analytics questions and want to talk to somebody whose answers will actually make sense. To start with, they hired John Whitmer, who I have praised on this blog before. To get a flavor of who he is, here’s a talk that he gave back when he was still working at CSU:
If you’ve heard John give a talk post-hire (as you will have a chance to do a little later in this post), you’ll know that he is just as straight-talking, funny, and insightful now as then. And from our observations both at BbWorld 2015 and since, he appears to be in a position of significant influence within the company.
More recently, the company has hired Mike Sharkey to run their whole analytics group as part of their acquisition of his company, Blue Canary. Like John, he is one of those all too rare people who is both very good at explaining how learning analytics work and very comfortable calling BS on hype. It was a little tougher to find a recording of one of Mike’s talks for some reason, but here’s one of him when he was at the University of Phoenix:
Also recently, the company acquired X-Ray, a learning analytics product that was developed for Moodle and that Blackboard eventually intends to make available for Learn. Although the product itself is interesting, one of the motivations for the acquisition was that it included creator Sasha Dietrichson. I haven’t met Sasha yet but he has a reputation similar to that of John and Mike. I actually had the pleasure of attending Blackboard’s international launch event for X-Ray. I held off blogging about it because, in my opinion, the product still needed a little polish as of that event, but it’s worth bringing up in this context. You can see the whole event, including John’s presentation of the product, here:
(I recommend that you fast forward past the annoying animated commercial at the beginning.)
We haven’t yet seen these hires bear fruit in big ways (except arguably with X-Ray for the Moodle customers), but there are all the early indicators of a coherent investment in an analytics strategy that could turn into a major differentiator. So far, most of Instructure’s analytics work has been limited to making data available for others to use. And the last time we checked, D2L’s analytics strategy was stuck in the mud (although we are overdue for an update in that department and will be looking into it again soon). Furthermore, one major advantage of the new Learn SaaS architecture is that it provides infrastructure for learning analytics that would just not be possible on the older architecture. It’s hard for the company to tout that now when they don’t yet have products that show off the benefits. There is little question at this point that Blackboard grossly underestimated the amount of time it would take them to get the new architecture ready for prime time.
It’s possible that part of what we are seeing going on with Blackboard’s communications…er…strategy is that their product announcements (if, indeed, Ultra is a product) were so far out in front of delivery of any demonstrable benefits that it’s hard for anyone to explain the point of them without sounding completely pie-in-the-sky. Jay Bhatt felt a need, for whatever reason, to make grand pronouncements about how Blackboard was going to “transform education.” We criticized him for those comments at the time, but it’s looking increasingly likely that the damage he did through this penchant for grandiosity was more far-reaching than we imagined. But my larger point here is that, when I say that I am confused about what is going on in Blackboard, I really mean it. There may yet be a baby floating in this tub of stinky bathwater.
By Michael Feldstein
The good news is that Blackboard, after going quiet for a while, is out giving updates again. The bad news is that the more they talk, the less I understand. A year and a half ago, I thought that I understood their Ultra strategy and had a pretty good guess about their odds of executing it. Last summer, I felt much less sure about the execution but still reasonably comfortable that I understood the basic strategy. Now, after Phil’s update on their strategy, I am forced to admit that I understand what is happening in Blackboard right now about as well as I understand what is happening in the Republican Presidential primary. I’m generally pretty good at following this stuff, and even I have lost the thread. The messaging has gotten so garbled that I have lost confidence in my understanding of the company’s product vision and strategy.

What is Ultra?
One of the telling aspects of the FAQ that Phil quoted in his post is that it includes a fairly extensive glossary. It is never a good sign when you have to provide a basic explanation of terms to describe your basic product matrix to experienced industry analysts and (presumably) long-time customers. Nor is this a new problem. Back in July, I felt compelled to offer to Blackboard a mini-glossary I made up myself because their product communication was so confusing. Here’s how I defined Ultra at the time:
Ultra is Blackboard’s new product philosophy and user interface. Rather than just sticking in a new tab or drop-down menu and a new bill from a new sales team every time we add new capabilities, we’re trying to design these capabilities into the core product experience in ways that fit with how customers would naturally use them. So rather than thinking about separate products living in separate places—like Collaborate, Community, Analytics, and Content, for example—you can think about synchronous collaboration, non-course groups, student progress tracking, and content sharing naturally when and where you need those capabilities in your daily academic life.
But is that what Ultra is intended to be, really? Did Blackboard intend Ultra to be what I described at one point but changed their minds? I can’t tell. On the one hand, when they tout the Blackboard Student mobile experience as being stream- and task-based, and independent of courses, it sounds a little like that vision of a unified experience. Of course, Blackboard immediately muddies the waters by saying that it is available for both Ultra and 9.x. So apparently Ultra both is and isn’t the LMS flavor. Now Blackboard muddies the waters further by suggesting that an “Ultra Course”—is that an Ultra Ultra course, an Ultra themed course, or an Ultra view?—is for instructors who are somehow less advanced.
Previously, I never got the impression Ultra was supposed to be an LMS with training wheels. Has the company drawn the lesson from customer feedback that any function they choose to leave out in an effort to streamline workflows will be perceived by customers as dumbing down or weakening the product? Sadly, there is some basis for that inference. That said, Blackboard Learn is perceived by the market as being old, clunky, and bloated. I don't think that they can dig themselves out of that hole by positioning their next-generation experience as, basically, good enough for newbies and lightweights. Further, as with much else here, the messaging problem makes it hard to tell whether there is also a deeper product management problem. In my experience, the category of "power users" is often invoked when the product team lacks a deep understanding of the segmentation that drives different feature needs. What makes a user a "power user"? What are they using that power for? Is this about people who teach online? Is it about people who teach courses that make heavy use of computer testing? Is it about departments that manage multiple sections of the same course? Or is a "power user" just any user who complains about the removal of any niche feature in the current product?
But maybe that's not what Blackboard meant anyway. Maybe they meant that the current version of Ultra is suited for basic needs because Blackboard hasn't yet completed whole areas of functionality like tests, group management, and LTI. But that doesn't make a lot of sense either. According to Phil, the feedback they got from early adopters (whoever they are) was "come back when it's finished." An LMS course shell without those feature areas is not "finished." And by the way, why has Blackboard made no apparent progress on delivering those areas in the last seven months, which is also seven months after the company had originally promised to deliver a finished product? What have they been doing over there? Let me sum up:
- I don’t know what Ultra is.
- I don’t know who Ultra is supposed to be for or how it is supposed to help them, now or three years from now.
- Whatever it is, I don’t know when Ultra is supposed to be “finished” or how I will be able to tell when it is.
- I don’t know what an Ultra “theme” or “view” is and how it will differ from just plain “Ultra,” now or a year or three years from now.
- I don’t know what to expect at BbWorld 2016 or what Blackboard’s LMS product lineup is intended to be two or three years from now.
- I don’t know if Blackboard is making any progress toward releasing its next-generation product at all.
- I don’t know how SaaS impacts the specific functionality that is or will be in Ultra versus the Ultra “theme” or “view” and therefore how much customers who have been interested in Ultra should care about SaaS.
- I don’t understand what Blackboard’s vision is.
- I don’t understand what Blackboard’s capacity for execution is.
It is possible that this crescendo of confusion is a sign that the company, having twice set public goals for itself that were unrealistic, is in the midst of retrenching and establishing a more realistic path forward. It is also possible that there is a sound product strategy in here somewhere that is just getting buried in the confused communications. And it is possible that their communication is a total train wreck because their product strategy is a total train wreck. I just don’t know. I am losing confidence in my own ability to separate the signal from the noise here.
I’ve looked at clouds from both sides now,
From up and down,
And still, somehow,
It’s clouds’ illusions I recall.
I really don't know clouds at all
– Joni Mitchell
By Phil HillMore Posts (391)
I was planning to write a descriptive post about the new UNC Learning Technology Commons, but there is already some excellent coverage. UNC’s Matthew Rascoff wrote a blog post on Medium that captures the basics quite well:
A compelling recent report from EDUCAUSE proposes that the “Next Generation Digital Learning Environment” will be based on a “‘Lego’ approach,” in which “components … allow individuals and institutions the opportunity to construct learning environments tailored to their requirements and goals.” The authors, Malcolm Brown, Joanne Dehoney, and Nancy Millichap, envision an app-like ecosystem of interoperable ed tech tools, each of which does one thing well, rather than the monolithic collection of functionality that is the learning management system.
Today, with the launch of the UNC Learning Technology Commons, the University of North Carolina system is taking a step in the direction of this more modular, flexible, and learner- and educator-centered approach.
We’re reimagining the way university faculty and staff buy the ed tech they need to help their students learn. Our goal is to empower educators with the best instructional tools available — in the classroom or online.
The UNC Learning Technology Commons is a system-wide effort to curate an annotated catalogue of digital learning products available for accelerated purchase by the 20,000 faculty members of the UNC system, and to build a community of educators who share (anonymized, aggregated) learning outcomes and user experiences with those products.
Upon launch of the commons last week, EdSurge published a very useful summary that hit on the procurement angle in more depth.
The Learning Technology Commons aims to iron out inefficiencies in edtech procurement. Rascoff recalls overhearing faculty from one UNC campus praise one tool, while instructors at another institution said they weren’t allowed to use it because their legal department thought it wasn’t compliant with FERPA.
With the Commons, faculty will be able to select tools that have already been approved by the UNC system. Any software, app, game or edtech provider can apply to join the network. They must agree to a standard set of requirements defined by UNC, which include agreeing to protect student privacy, comply with relevant laws and regulations, and share pricing at different volume levels. The idea is to make the decision-making process for faculty “feel more like downloading an app on your phone and less like doing a giant RFP for an enterprise planning system,” Rascoff says.
Today, the Chronicle described the commons with an emphasis on faculty-driven reviews of the apps.
The idea for the commons came out of a conversation at a summer event that included an associate professor and program director in the program in instructional systems technology at UNC-Charlotte. While talking with Mr. Rascoff and colleagues from other campuses, the professor, Florence Martin, said she realized that there was little communication across the university system about effective instructional technology and a lot of duplication of efforts.
The biggest advantage of the learning-technology commons, Ms. Martin said, is that she will be able to connect with professors in similar disciplines on other campuses and more easily find tools to use in her classroom.
“Colleagues can share experiences with others and learn from them,” she said. “That will be really meaningful.”
Jim Groom jumped in today, comparing the approach in North Carolina with the corporate–state collaboration seen in Virginia.
During my time at OpenVA one of the things I really wanted to see happen was various institutions around the state working together to share technical infrastructure. How could we think about scaling something like Domain of One's Own for the VCCS? Or share that model with Virginia Tech or Northern Virginia Community College? Even beyond that, how could we rethink sharing machine images of various tools and applications these schools currently use to give all Virginia schools greater access to a diversity of educational technologies. A clearing house for educational technologies that might be inspired by and built on the model of something like AWS. [snip]
In terms of impact. Video conferencing? Really? The program switched up to start providing grants and course redesigns, but even that was crazy. 15 course redesigns in the 2014-2015 academic year to the tune of $500,000, or $33,000 for each course redesign—those are almost MOOC-like costs. In short, 4VA was a brilliant example of how corporate and state collaborations ultimately become a drain on the taxpayer, and as much as $2 million in infrastructure costs squandered on technology you could reproduce well-enough with virtually free applications like Google Hangouts, Skype, or even appear.in.
All four posts are worth reading for those interested in North Carolina's approach to empowering faculty-driven learning apps usage and discipline-specific sharing. I'd like to add some notes on two aspects of this effort that bear watching.
How Procurement Processes Favor Big, Established Vendors
In a presentation titled “Fixing the broken edtech procurement system: How the good guys and gals can win” shared with GSV Labs last week, there was an emphasis on the distorted ed tech market. Quoting one vendor:
Several of our sales processes are four years in the making. Our average sales cycle is about eight months to get some of the schools in the district as customers and about 18 months to expand to the whole district. Fellow education entrepreneurs tell us that this is comparatively fast…
Slow sales cycles make things hard for entrepreneurs who need capital to keep operating, and who need to prove to investors that they have created something worth supporting. The typical investor does not understand that, in education, a product could be 'hot' and yet take two years to sell.
The Learning Technology Commons' app-marketplace approach, driven by end-user reviews, aims to level the playing field and avoid the problem of only enterprise systems making it through procurement, disconnected from what faculty really want to use. Jim Groom's description of 4VA's multi-million dollar video conferencing is a prime example. The Commons takes a lightweight approach that should make it easy to get apps approved under consistent terms and conditions, with less of the filtering that favors big companies.
This approach mirrors some of the work at UMassOnline during the Patrick Masson days to implement Needs Identification Framework for Technology Innovation (NIFTI).
How do we foster collaboration and communication around educational technologies such as e-Portfolio, Learning Activity Management Systems (LAMS), and Open Educational Resources (OER)? While UMassOnline has taken on many new technology initiatives in the past, the process in which the technology needs are identified for emerging tools has been missing. A methodology to identify the needs for new technologies was created in the summer of 2010. The Needs Identification Framework for Technology Innovation (NIFTI) or the "volcano" process includes a workflow to fill in the missing technology gaps from our campuses and allow for exploration of emerging technologies all while UMassOnline takes on the financial and technological responsibilities.
While NIFTI did not set up a commons or app center approach, it did acknowledge the wide-scale usage of learning apps and provide a support structure for many faculty-initiated adoptions of specific tools.
Next Generation Digital Learning Environment (NGDLE)
As mentioned in the Medium post, the commons was set up to move in the direction described by the EDUCAUSE NGDLE report. One thing North Carolina is doing is turning the typical LMS-driven procurement approach on its head. When I asked Rascoff how the apps would be pulled together, he said that the primary plan was to set up all accepted apps with Single Sign-On (SSO) capabilities. Rascoff explained that since the LMS is not where most learning occurs, his team is leaving that decision up to the campuses and focusing its efforts on the learning apps.
I have heard frequent complaints about institutional innovation centers that just promote the known systems such as the LMS, and UNC is pushing a different approach as did UMassOnline. Will this effort get any traction on the UNC campuses? Will faculty both drive higher adoption of various learning applications as well as provide pedagogically-driven user reviews? Will this approach lead to multi-institution discipline-specific networking around the use of ed tech?
It’s obviously too early to tell, but the UNC Learning Technology Commons is worth watching.
The post UNC Learning Technology Commons: Easing the procurement problem with NGDLE appeared first on e-Literate.
By Phil HillMore Posts (390)
A full seven months after BbWorld 2015, Blackboard has yet to move any new functionality for Learn Ultra from "in development" or "in research" to "available," according to a new public webinar. That is, Learn Ultra is no further along from a customer delivery perspective than it was in July 2015, when it was already a year late. Furthermore, Blackboard promised at BbWorld 2015 that Learn Ultra would be available to all customers in technical preview mode by Fall 2015 and available for active pilots with students and course content by Spring 2016, yet I have talked to several of Blackboard's best long-term customers who have no access to Ultra and no updates on plans. What is becoming clear is that Learn Ultra as originally envisioned will not be in General Availability by BbWorld 2016.
I asked Blackboard for statements on the current Ultra status and was able to interview Mark Strassman, Senior VP of Product and Marketing, late last week. Mark described that Blackboard has decided to change their product strategy based on selected customer feedback in technical preview. According to Strassman:
Based on customer feedback on our Learn SaaS offering, we have evolved our strategy to make it easier for institutions to move to Learn SaaS and adopt Learn with the Ultra Experience. Later this year we will provide a new Ultra-inspired theme that customers can use with either Learn 9.1 or Learn SaaS with the Original Experience. This theme will give users of our current product an updated look-and-feel, and make it easier for them to deploy Learn Ultra, either standalone, or alongside existing courses in the Original Experience.
What this means is that Learn 9.1 Original will start to look more like Ultra and Ultra will start to look more like 9.1 Original. Let's unpack that statement a little.
Evolved Strategy
Basically, Blackboard has decided to reconceptualize Ultra. Originally conceived as a minimalist, simple, and elegant LMS for the majority of faculty, Ultra drew feedback from technical preview schools amounting to "tell me when it's done." Blackboard leadership believes that schools are insisting on feature parity between Ultra and Learn 9.1. So the plan now is to find a more cohesive mix of Learn experiences.
To get Learn 9.1 to look more like Ultra, Blackboard is now developing an "Ultra-inspired theme" that can be applied to 9.1 Original to give a more similar look-and-feel and consistent navigation to traditional courses. For Learn self-hosted and managed hosting customers, there will be similar navigation and responsive design, and for Learn SaaS customers (i.e., cloud customers), there will be additional usage of panels that are similar in nature to the Ultra design. I captured the following from a recent public webinar.
According to an internal FAQ shared by Strassman:
Our plan is to release this new theme for Learn SaaS customers by Bb World 2016. For those not moving to SaaS, Learn 9.1 self-hosted and managed-hosted customers will also have access to the new Ultra-inspired theme with our Q4 2016 release.
To get Ultra to look more like Learn 9.1, Blackboard is revamping how many features or workflows go into Ultra long-term. While I have not seen an example of what this means, the minimalist design of Ultra will have to change to add these additional features.
To make this blending of products or user experiences into more of a cohesive experience and easier migration path, Blackboard is using what they call “Dual Course Mode”. This is for Learn SaaS customers only who have enabled Ultra. They can then choose either the Original Course View (aka Learn 9.1) or the Ultra Course View on a course-by-course, or program-by-program basis. In the webinar, Blackboard showed how they view Ultra as more applicable to light users of an LMS and Original as more applicable to power users.
Again from the FAQ:
One LMS solution does not fit every type of user. The Original Course View is the perfect solution for power users who want a greater degree of control over their courses; they get Learn’s original workflows and extensive capabilities with a modern look-and-feel. The Ultra Course View is the perfect solution for the more casual or new user who does not want or need that level of control; they get essential workflows in a brand new, modern, fully responsive view.
Likewise, system administrators can "control who gets what Course Type" by instructor, program, or institution.
Execution
Let’s go back to my statements about further product delays with Ultra. Last year at BbWorld 2015, Blackboard shared a roadmap of sorts listing what was available then, in development, and in research. Michael described the plans as of last summer in this post.
Fast forward to the late January webinar and view the current roadmap status. Note that the list of “Available Now (Technical Preview)” has not changed at all.
There are still essential components that I would consider necessary for real usage in courses that are not ready – tests, groups, integrations for publishers and LTI. I suspect these were also big issues in the customer feedback that Mark Strassman described.
Update (2/16): I just found this tweet from Eric Silva that captures Blackboard's message to tech preview customers. Someone get a copy editor for Blackboard, please.
Some time has passed since we last communicated, and now we have more information to share with you. First, thank you for your patience during this process. We had planned to open the Tech Preview of Blackboard Learn’s Ultra experience up to you all by the end of 2015 but we wanted to make sure that the environment met our quality standards and we were able to properly support the thousands of people who we expect to experiment with it. It’s also taken longer because we’ve evolved our product strategy based on your direct feedback and have been building this technology in a way we feel will best fit with your teaching pedagogy and design. You can also view a recording of last week’s roadmap here.
In regards to the Tech Preview (which will now be called the Educator Preview), we plan to open up access in late March, but we will let you be the FIRST to gain access to this site a few weeks earlier. At that time, you'll get a invite that allows you to self register and automatically receive your login credentials. There will be no waiting list and no delay.
Impact and Messaging
On the positive side, I have heard a lot of frustration from current Blackboard customers that they would not get the benefits of Ultra unless they move to Learn SaaS. The development of the Ultra-inspired theme, if executed well and delivered by this summer, may address some of this frustration and ease a migration path towards eventual adoption of Learn SaaS and more Ultra-inspired benefits. At the same time, there would be less pressure on Ultra being delivered as a minimum viable product (MVP), as at least some of the Ultra benefits will flow to customers even in 9.1 Original Experience.
Michael and I will likely add more analysis, especially if we get access to talk to customers who have explored Ultra in Technical Preview and if we can get a more in-depth demo on the Ultra theme and full Ultra experience with updates since July. For now, I’ll close with another section of the FAQ to help keep the terminology straight, as it is confusing.
- Self-hosted, managed hosted, and SaaS are all deployment methods.
- SaaS Standard, SaaS Advantage, and SaaS Plus are all options for SaaS deployment that give customers an increasing degree of flexibility and control. For example, SaaS Standard provides a standard set of Building Blocks for all customers; SaaS Advantage and SaaS Plus allow institutions control over installing additional Building Blocks. SaaS Plus and Advantage also allow customers some control over when updates are deployed. SaaS Advantage adds service delivery management and direct data access. Learn 9.1 is the version of our flagship LMS that customers use if they are a self-hosted or managed hosted customer.
- Learn with the Original Experience is the traditional user experience (i.e. Learn 9.1) available with Learn SaaS.
- Learn with the Ultra Experience is Blackboard’s new user experience designed and built around the Learner’s needs. The Ultra Experience is characterized by integrated workflows and delivers an entirely new, delightful user interface. It is only available in SaaS deployments of Learn.
- Dual Course Mode refers to the two options that an institution can make available to their faculty when the Ultra Experience is enabled: the Original Course View and the Ultra Course View. This approach was developed based on research and feedback from instructors, who have different levels of experience and comfort with LMS technology, and thus have different course needs.
- The Learn Original Course view provides all of the traditional Learn workflows and is known for its unmatched breadth and depth of functionality that allows for full control of the course design and presentation of content. This course view is available with both the Original and Ultra experiences.
- The Ultra Course View is a new Blackboard Learn course view that includes the most valued Learn functionality redesigned to provide simplified workflows, a modern look and feel, and a fully responsive interface on all devices. This course view is only available with the SaaS deployment of Learn when the Ultra experience is enabled.
- Institutions will have the option to make either the Original or Ultra course views available to instructors, departments, or programs of their choosing.
- Learn 2016 theme is the Ultra-inspired theme that will be available for the Original Course View and Learn 9.1 hosted and self-hosted implementations.
Here is the full webinar from late January:
As we’ve already noted, 2016 is shaping up as quite an eventful year for the LMS market.
The post After Customer Feedback And Further Delays, Blackboard Changes Learn Ultra Strategy For LMS appeared first on e-Literate.
By Phil HillMore Posts (389)
Two weeks ago Michael and I posted a third article on EdSurge that described an encouraging course redesign for STEM gateway courses.
In our e-Literate TV series on personalized learning, we heard several first-hand stories about the power of simple and timely feedback. As described in the New York Times, administrators at the University of California, Davis, became interested in redesigning introductory biology and chemistry courses, because most of the 45 percent of students who dropped out of STEM programs did so by the middle of their second year. These students are the ones who typically take large lecture courses.
The team involved in the course-redesign projects wanted students to both receive more individual attention and to take more responsibility for their learning. To accomplish these goals, the team employed personalized learning practices as a way of making room for more active learning in the classroom. Students used software-based homework to experience much of the content that had previously been delivered in lectures. Faculty redesigned their lecture periods to become interactive discussions.
The UC Davis team focused first on redesigning the lab sections to move away from content delivery (TAs lecturing) to interactive sessions where students came to class prepared and then engaged in the material through group discussions (read the full EdSurge article for more context). In the UC Davis case, this interactive approach was based on three feedback loops:
- Immediate Feedback: The software provides tutoring and “immediate response to whether I push a button” as students work through problems, prior to class.
- Targeted Lecture and Discussion: The basic analytics showing how students have done on the pre-lab questions allows the TA to target lecture and discussion in a more personal manner—based on what the specific students in that particular section need. “I see the questions that most of my class had a difficulty with, and then I cover that in the next discussion,” Fox says.
- Guidance: The TA “would go over the answers in discussion.” This occurs both as she leads an interactive discussion with all students in the discussion section and as she provides individual guidance to students who need that help.
The opportunity to make the lab sections truly interactive, and not just one-way content delivery through lectures, is not unique to the UC Davis example. Shortly after publishing the article, I found another course redesign that plays on some of the same themes. This effort at Cal State Long Beach (CSULB) was described in the Press-Telegram article:
Sitting near a skeleton in a Cal State Long Beach classroom last week, Professor Kelly Young dissected a course redesign that transformed a class from a notorious stumbling block to a stepping stone toward graduation.
Young has reduced the number of students failing, withdrawing or performing below average in Bio 208: Human Anatomy from 50 percent to fewer than 20 percent in about four years, and poorly performing students have watched their grades climb, with continued improvement on the horizon.
That statistic is worth exploring, especially when considering that 500-600 students take this class each year at CSULB.
Thanks to the CSULB Course Redesign work, this work in Bio 208 has some very useful documentation available on the MERLOT repository. Like the UC Davis team, the CSULB team first redesigned the lab sections, “flipping them” to enable a more personalized approach within the small sections. Unlike UC Davis, CSULB centered the content on videos and podcasts.
While we have been working on refining the lecture over the past several years, the Redesign Project has allowed us to get serious about redesigning the laboratory (the source of low grades for most of the students). During the semester, students learn over 1,500 structures just in the laboratory portion of the course. Despite asking them to look at the material before class, students would routinely come to the laboratory session totally unprepared. Flipping the class was an enticing solution to increase preparedness- and therefore success.
After trial and error over a few years, the team has created a series of “Anatomy on Demand” annotated videos. But as the team pointed out, this is not the actual important factor.
While the videos often get attention in a flipped classroom proposal, the true focus of our project is what we do with the newly-created class time in the laboratory provided by flipping the lectures. The most important aspect of this project is our new interactive laboratory sessions that serve to deepen understanding of the material. The idea is that a student will watch the relevant short videos (usually 5-7 per week) prior to coming to the laboratory, arrive prepared to their laboratory, take a short quiz that is reduced in rigor but assures readiness, and then spend at least two hours in the laboratory exploring the structures in detail at interactive small group stations.
The effect has been that students move from merely being introduced to material to participating in critical thinking in the lab.
This new method allows prepared students to deeply interact with the material, as opposed to merely being introduced to it. In previous years, we hoped to have students leave the laboratory with some rote memorization of the structures complete. In contrast, when students arrive with a basic understanding of the structures, we are able to use laboratory time to ask application and critical thinking questions.
After applying multiple redesign elements and interventions, the CSULB team started seeing impressive results, especially starting in Spring 2014. This is where they are tracking the reduction in the percentage of students getting a D or F or withdrawing from almost 50% to approximately 20%.
Both of these course redesigns were led by university faculty and staff and are showing impressive results, not just in grades but in deeper student learning. Kudos to both the UC Davis team and the CSULB team.
The post Making Lab Sections Interactive: More evidence on potential of course redesign appeared first on e-Literate.
By Phil HillMore Posts (389)
It’s worth giving credit where credit is due, and the US Department of Education (ED) has fixed a problem that Russ Poulin and I pointed out where they had previously left ~700 colleges out of the College Scorecard.
When the College Scorecard was announced, Russ noticed a handful of missing schools. When I did the whole data OCD thing, I discovered that more than 700 2-year institutions were missing, including nearly 1-in-4 community colleges. Eventually we published an article in the Washington Post describing this (and other) problems.
The missing community colleges were excluded on purely statistical grounds. If the college granted more certificates (official awards of less than a degree) than degrees in a year, then they were excluded as they were not “primarily degree-granting” institutions. We label this the “Brian Criterion” after the person authoring two discussion board posts that explained this undocumented filter.
This was a statistical decision, because certificate-heavy institutions complicate graduation-rate calculations, but it leaves students wondering why so many colleges cannot be found. Consider Front Range Community College in Colorado, with 1,673 associate's degrees granted in 2012-13. Because the college also awarded 1,771 certificates, the Scorecard filtered it out of the consumer website.
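The exclusion rule is simple enough to express directly. Here is a minimal sketch of the "Brian Criterion" as described above; the function name and signature are my own illustration, not anything from the Scorecard's actual codebase.

```python
# Hypothetical sketch of the "Brian Criterion": a college was dropped from the
# consumer site whenever it granted more certificates than degrees in a year,
# on the theory that it was not "primarily degree-granting."
def brian_criterion_excludes(degrees_awarded: int, certificates_awarded: int) -> bool:
    """Return True if the college would be filtered out of the consumer site."""
    return certificates_awarded > degrees_awarded

# Front Range Community College, 2012-13: 1,673 associate's degrees but
# 1,771 certificates, so it was excluded despite clearly granting degrees.
print(brian_criterion_excludes(1673, 1771))  # True
```

Note how a school's mission works against it here: the more certificates it stacks on top of its degree programs, the more likely it is to vanish from the consumer site.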
Largely due to their community-serving mission, community colleges and other two-year institutions were primarily affected. By our calculations, approximately one in three two-year colleges were excluded (more than 700), including approximately one in four community colleges (more than 250).
It is ironic that the most-penalized institutions were community colleges and, in particular, those innovating with interim certificates and stackable credentials; the White House has been explicitly promoting both of these groups.
We never heard from the ED officially but had some backchannel communications from others that there were some fixes being considered.
On Wednesday I got a message from the infamous Brian on a Stack Exchange thread letting me know that ED had changed their approach.
The Department recently added institutions to the consumer site such that institutions that predominantly award certificates (PREDDEG=1) are included IF the highest degree is at least an Associate’s (HIGHDEG>=2 ) AND the institution offers an associate’s or bachelor’s degree (CIPxxASSOC>0 OR CIPxxBACHL>0)
In English, this means that the ED took out their artificial criterion and fixed this issue. Colleges that award degrees no longer get excluded from the College Scorecard because they award even more certificates.
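The new inclusion rule Brian quoted can be sketched as a simple predicate. This is an illustrative reconstruction using the Scorecard data dictionary's field names (PREDDEG, HIGHDEG, CIPxxASSOC, CIPxxBACHL), not ED's actual code:

```python
# Illustrative sketch of ED's new inclusion rule for the consumer site,
# reconstructed from Brian's description above -- not ED's actual code.
# Field names follow the College Scorecard data dictionary.

def included_in_consumer_site(school):
    """Apply the new rule to one institution record (a dict of fields)."""
    if school["PREDDEG"] != 1:
        # Only predominantly-certificate schools were ever filtered out.
        return True
    # Highest award is at least an associate's degree (HIGHDEG >= 2) ...
    highest_is_degree = school["HIGHDEG"] >= 2
    # ... and at least one program awards an associate's or bachelor's.
    offers_degree_program = any(
        value > 0
        for field, value in school.items()
        if field.startswith("CIP") and field.endswith(("ASSOC", "BACHL"))
    )
    return highest_is_degree and offers_degree_program

# A Front Range Community College analogue: certificates outnumber degrees
# (PREDDEG=1), but it awards associate's degrees, so it is now included.
frcc_like = {"PREDDEG": 1, "HIGHDEG": 2, "CIP24ASSOC": 1, "CIP24BACHL": 0}
print(included_in_consumer_site(frcc_like))  # True under this sketch
```

Under the old "Brian Criterion," the same school would have been dropped simply because certificates outnumbered degrees.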
It was a little tricky verifying the fix, as they have also changed how the College Scorecard classifies schools. Previously they let the user filter on associate's programs, leading to institutions that predominantly award associate's degrees. Now the Scorecard will show you all institutions that award associate's degrees. So the checksum activity must be done at a higher level. Lo and behold, the count of public institutions in the Scorecard approximately matches the count from IPEDS. I also did spot checks on a dozen institutions that had previously been missing, and they are now in the Scorecard.
The other issues in the Washington Post article remain, but this headline problem has been fixed, albeit very quietly. I cannot find any announcement from ED, just this one line in their release notes:
Update national statistics to include certificate schools
So consider this blog post as the official ED press release, I guess. Thanks for fixing.
The post College Scorecard: ED quietly adds in 700 missing colleges appeared first on e-Literate.
By Michael Feldstein
Phil and I will be writing a twice-monthly column for the Chronicle’s new Re:Learning section. In my inaugural column, “Muy Loco Parentis,” I write about how schools make data privacy decisions on behalf of the students that the students wouldn’t make for themselves, and that may even be net harmful for the students. In contrast to the ways in which other campus policies have evolved, there is still very much a default paternalistic position regarding data.
But the one example that I didn’t cover in my piece happens to be the one that inspired it in the first place. A few months back at the OpenEd conference, I heard a presentation from CMU’s Norm Bier about the challenges of getting different schools to submit OLI student data to a common database for academic research. Basically, every school that wants to do this has to go through its own IRB process, and every IRB is different. Since the faculty using the OLI products usually aren’t engaged in the research themselves, it generally isn’t worth the hassle to go through this process, so the data doesn’t get submitted and the research doesn’t get done. Note that Pearson and McGraw Hill do not have this problem; if they want to look at student performance in a learning application across various schools, they can. Easily. Something is wrong with this picture. I proposed in Norm’s session that maybe students could be given an option to openly publish their data. Maybe that would get around the restrictions. David Wiley, who does a lot more academic research than I do, seemed to think this wasn’t a crazy idea, so I’ve been gnawing on the problem since then.
I have talked to a bunch of researchers about the idea. The first reaction is often skepticism. IRB is not so easy to circumvent (for good reason). What generally changed their minds was the following thought experiment:
- Suppose that, in some educational software program, there was a button labeled “Export.” Students could click the button and export their data in some suitably anonymized format. (Yes, yes, it is impossible to fully de-identify data, but let’s posit “reasonably anonymized” as assessed by a community of data scientists.) Would giving students the option to export their data to any server of their choosing trigger the requirement for IRB review? [Answer: No.]
- Suppose the export button offered a choice to export to CMU’s research server. Would giving students that option trigger the requirement for IRB review? [Answer: Probably not.]
There are two shades of gray here that are complications. First, researchers worry about the data bias that comes from opt-in. And the further you lead students down the path toward encouraging them to share their data, such as making sharing the default, the more the uneasiness sets in. Second and relatedly, there is the issue of informed consent. There was a general feeling that, even if you get around IRB review, there is still a strong ethical obligation to do more than just pay lip service to informed consent. You need to really educate students on the potential consequences of sharing their data.
That’s all fair. I don’t claim that there is a silver bullet. But the thought experiment is revealing. Our intuitions, and therefore our policies, about student data privacy are strongly paternalistic in an academic context but shift pretty quickly once the institutional role fades and the student’s individual choice is foregrounded. I think this is an idea worth exploring further.
By Phil Hill
Pearson has notified customers that LearningStudio will be shut down as a standalone LMS over the next 2-3 years. Created from the Pearson acquisition of both eCollege and Fronter, LearningStudio has been targeted primarily at fully-online programs and associated hybrid programs – not for simple augmentation of face-to-face classes. The customer base has mostly included for-profit institutions as well as not-for-profit programs that are often packaged with an online service provider model (e.g. Embanet customers). As of this year, LearningStudio has approximately 110 customers with 1.2 million unique student enrollments.
This decision is not one isolated to LearningStudio, as the end-of-life notification caps a series of moves by Pearson to get out of the LMS market in general.
Less than a year ago I wrote a post about Texas Christian University claiming that Pearson was “getting out of the LMS market”, although during my research for that story the administrator requested a correction in the campus newspaper.
“Pearson is out of the learning management system game,” Hughes said. “We need something to evolve with the Academy of Tomorrow and where we’re moving to at TCU.”

Hughes said Pearson withdrew from the LMS search process for TCU but remains an LMS provider.
From 2007 through 2012, Pearson aggressively moved into the LMS market. In 2007 the company acquired eCollege for $477 million, taking it private. In 2008 Pearson acquired the European LMS provider Fronter. In 2009 Pearson announced LearningStudio as the rebranded combination of eCollege and Fronter, predominantly from eCollege. Then the big PR move came in 2011 with the splashy announcement of OpenClass, a “completely free” and “amazing” LMS that dominated the discussion at EDUCAUSE that year, partially due to “misleading headlines” implying a partnership with Google.
In the past year, however, Pearson has reversed all of these strategic moves. Announced last September, OpenClass will no longer be available as of January 2018. In November Pearson sold Fronter to itsLearning. And now LearningStudio (and in effect eCollege) is being retired. To be more precise, LearningStudio is being retired as a standalone LMS. What is not publicized is that LearningStudio internally provides the infrastructure and platform support for Pearson’s MyLab & Mastering courseware. That internal platform will remain, but the external product will go away.
For this story Michael and I interviewed Curtiss Barnes, Managing Director of Technology Products for Pearson Global Higher Education. Barnes confirmed the story and said that all LearningStudio customers have been notified, and that there are no plans for a public announcement or press release. Barnes said the decision to get out of the LMS category was based on Pearson’s continuing efforts to reorganize and streamline the diversified company, and being competitive in the LMS market just doesn’t help meet corporate goals.
So what platforms and technology products do meet corporate goals? Barnes said that Pearson does courseware really well, with over 12 million students on these platforms overall and approximately 2 million per day. He sees large distinctions between content-agnostic LMS solutions and courseware. Courseware might require certain features that overlap LMS features, but the fundamentals of what’s being delivered go well beyond content management, calendaring, and other LMS basics to include instrumentation of content and science-based learning design. Barnes said that learning design is the key element they’re looking for as a company.
The front page for OpenClass now describes Pearson’s view on LMS and courseware markets.
On January 1, 2018, OpenClass will no longer be available to faculty, students, or administrators, and as of today, no new accounts will be created. You will be able to sign in and access OpenClass and we will maintain SLAs until January 1, 2018. We will also continue to provide Community Forum support and OpenClass Knowledge Base until this date.
At Pearson, we are relentlessly committed to driving learner outcomes and we see a bigger opportunity to provide value to our customers via programs such as MyLab & Mastering and REVEL, and through our professional services, such as curriculum design and online program management.
While the LMS will endure as an important piece of academic infrastructure, we believe our learning applications and services are truly “where the learning happens.” In short, withdrawing from the crowded LMS market allows us to concentrate on areas where we can make the biggest measurable impact on student learning outcomes.
Pearson has told customers that they still have engineers and operations teams to fully support continuing operations and mitigate bugs or issues affecting LearningStudio, but they are not developing new features. LearningStudio will remain available for customers through their existing contracts, but the earliest loss of support for any customer will be December 31, 2017, giving customers whose contracts expire before then extra time to select a different LMS and migrate their courses.
Michael and I pressed during the interview to see if Pearson is favoring one solution over another in their discussions with customers, but Barnes said that Pearson has decided to remain neutral. Customers are not being given recommendations on alternate solutions.
This move out of the LMS market by Pearson has a parallel with last year’s sale of PowerSchool, a Student Information System for the K-12 market. Pearson acquired PowerSchool from Apple in 2006, but it no longer made sense to try and be competitive in the SIS market.
Like the forced migration caused by WebCT and ANGEL end-of-life notices, there will now be more than 100 LMS changes triggered by this announcement. While the for-profit sector has taken big hits in enrollments over the past 3-4 years, there are still some very large online programs that now have to select a new LMS.
This has been an eventful year for the LMS market already, and it’s only one month old. Expect to see more movement and changes.
- Disclosure: Pearson is a client of MindWires Consulting on a separate project.
The post LearningStudio and OpenClass End-Of-Life: Pearson is getting out of LMS market appeared first on e-Literate.
By Michael Feldstein
Today we have a prime example of how Blackboard has been failing by not succeeding fast enough. The company issued a press release announcing “availability of new SaaS offerings.” After last year’s BbWorld, I wrote a post about how badly the company was communicating with its customers about important issues. One of the examples I cited was the confusion around their new SaaS offerings versus managed hosting:
What is “Premium SaaS”? Is it managed hosting? Is it private cloud? What does it mean for current managed hosting customers? What we have found is that there doesn’t seem to be complete shared understanding even among the Blackboard management team about what the answers to these questions are.
A week later, (as I wrote at the time), the company acted to clarify the situation. We got some documentation on what the forthcoming SaaS tiers would look like and how they related to existing managed hosting options. Good on them for responding quickly and appropriately to criticism.
Now, half a year after the announcement, the company has released said SaaS offerings. Along with it, they put out an FAQ and a comparison of the tiers. So they said what they were going to do, they did it, and they said what they did. All good. But half a year later?
In my recent post about Blackboard’s new CEO, I wrote,
Ballhaus inherits a company with a number of problems. Their customers are increasingly unhappy with the support they are getting on the current platform, unclear about how they will be affected by future development plans, and unconvinced that Blackboard will deliver a next-generation product in the near future that will be a compelling alternative to the competitors in the market. Schools going out to market for an LMS seem less and less likely to take Blackboard seriously as a contender, which is particularly bad news since a significant proportion of those schools are currently Blackboard schools. The losses have been incremental so far, but it feels like we are at an inflection point. The dam is leaking, and it could burst.
The post Blackboard Did What It Said It Would Do. Eventually. appeared first on e-Literate.
By Phil Hill
The Moodle Users Association (MUA), a crowd-funding group for additional Moodle core development, announced today that it is open for members to join. Technically, membership was announced on the MUA web site last Friday, but the press release came out today. As of this writing (Thurs evening PST), 44 members have signed up: 37 at the individual level, 1 at the bronze level, 3 at the silver level, 2 not sharing details, and Moodle Pty as the trademark holder. This equates to $8,680 – $22,580 of annual dues, depending on what level the two anonymous members chose.
Update (1/25): As of Monday morning, the numbers are 51 members: 44 at the individual level, 1 bronze, 3 silver, 2 not sharing details, and Moodle Pty as trademark holder; leading to $8,960 – $22,860 of annual dues.
Moodle News was the first outlet to describe the new organization (originally called Moodle Association but changed to Moodle Users Association when the organization was formalized), and in early December they summarized the motivation:
As mentioned recently in an article on e-Literate, it’s possible that a majority of all funding to Moodle.org originates from Blackboard. While this may be ironic, knowing the history of Blackboard, it is appropriate since the LMS company has quietly become the largest Moodle Partner by number of clients through acquisitions and growth over the last few years.
The Moodle Partner network is the lifeblood of Moodle HQ/Moodle.com and funds all of the full-time staff at Moodle HQ. Ten percent of all profit from Moodle partners is contributed back to fund the HQ’s staff, who then in turn organize developers around the world, coordinate QA cycles, keep releases on schedule, provide training and a free trial option for individual Moodle users.
In early 2015, Martin Dougiamas unveiled a plan to diversify Moodle funding by garnering support from non-Moodle Partners: organizations, users, and schools who were interested in contributing to Moodle.org, but were not providing Moodle related services, and perhaps wanted to get a little more out of contributions than a strict donation might give. The MUA is scheduled to kick off this December with its inaugural committee and funding which will be used to drive, in part, the development of the Moodle project through a democratic mechanism.
Since this past June, we have covered at e-Literate the inflection point faced by what is likely the world’s most-used LMS. The inflection point comes from a variety of triggers:
- Blackboard acquisition of several Moodle Partners causing Moodle HQ, other Moodle Partners, and some subset of users’ concerns about commercialization;
- Creation of the Moodle Users Association as well as Moodle Cloud services as alternate paths to Moodle Partners for revenue and hosting; and
- Several Moodle-derivative efforts in 2015 such as Remote-Learner leaving the Moodle Partner program, Totara forking its Moodle code, and creation of the POET working group hosted by Apereo.
To be fair, Martin does not agree with my characterization of such an inflection point as described in this interview from September 2015:
Martin: Sorry, I don’t really agree with your characterization. Unlike nearly all other LMS companies, Moodle is not profit-focussed (all our revenue goes into salaries). We are an organisation that is completely focussed on supplying a true open source alternative for the world without resorting to venture capital and the profit-driven thinking that comes with that. [snip]
Phil: My note on “inflection point” is not based on a profit-driven assumption. The idea is that significant changes are underway that could change the future direction of Moodle. A lot depends on Blackboard’s acquisition strategy (assuming it goes beyond Remote-Learner UK and Nivel Siete), whether other Moodle Partners follow Remote-Learner’s decision, and whether Moodle Association shows signs of producing similar or larger revenues than the Moodle Partner program. What I don’t see happening is extension of the status quo.
Martin: Moodle’s mission is not changing at all, we are just expanding and improving how we do things in response to a shifting edtech world. We are starting the Moodle Association to fill a gap that our users have often expressed to us – they wanted a way to have some more direct input over major changes in core Moodle. There is no overlap between this and the Moodle Partners – in fact we are also doing a great deal to improve and grow the Moodle Partner program and as well as the user experience for those who need Moodle services from them.
I maintain my belief that there are big changes afoot in the Moodle community. The two primary drivers of how these changes might impact product development of Moodle core are whether and how Blackboard and Moodle HQ “mend fences” in their business relationship and whether the Moodle Users Association shows signs of producing significant revenue and significant influence on product development within the next 1 – 2 years. With the formal announcement today, let’s look at more details on MUA and its prospects.
There are four levels of MUA membership, ranging from 100AUD to 10,000AUD annual dues (roughly $70 to $7,000 in US dollars).
When Moodle News surveyed their readers about plans for membership (results in early December post), they found 70% planning to join at individual level, 23% at bronze, 4% at silver, and 2% at gold. Thus far the actuals from the public members (excluding Moodle Pty and the two anonymous members) are 90% individual, 2% bronze, 7% silver, and 0% gold. It is quite possible, however, that organizations choosing bronze, silver, and gold levels will take longer to decide than individuals. Nevertheless, it initially appears that the majority of members will be at the individual level. Given the disparity in dues, however, the actual public revenue commitments are 30% individual, 7% bronze, 63% silver, and 0% gold. If either or both of the two anonymous members are organizations, this would tilt the numbers even further.
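The contrast between member counts and revenue shares is easy to reproduce. The tier dues below are my own assumed values within the 100AUD – 10,000AUD range cited above, not the MUA's published price list, so the exact percentages are illustrative:

```python
# Count-vs-revenue contrast for the public MUA members. Tier dues are
# assumed values within the 100AUD-10,000AUD range cited above, not the
# MUA's actual published price list.
ASSUMED_DUES_AUD = {"individual": 100, "bronze": 1000, "silver": 2500, "gold": 10000}
public_members = {"individual": 37, "bronze": 1, "silver": 3, "gold": 0}

total_members = sum(public_members.values())
total_dues = sum(ASSUMED_DUES_AUD[t] * n for t, n in public_members.items())

for tier, n in public_members.items():
    member_share = n / total_members
    dues_share = ASSUMED_DUES_AUD[tier] * n / total_dues
    print(f"{tier:<10} {member_share:5.0%} of members, {dues_share:5.0%} of dues")
```

Even with these assumed dues, the pattern holds: individuals are roughly 90% of members but only about 30% of dues, while the three silver members carry well over half the revenue.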
What would it take to generate “significant revenue”? As of a year ago, Moodle HQ had 34 employees, which would require $3 – $5m of revenue to continue support. For MUA to have significant impact, I would say that it needs minimum revenues of $500k – $1m within a couple of years. Without this level, the revenue itself is just in the noise and revenue has not been diversified. This minimum level would require 200+ organizational memberships in rough numbers.
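These back-of-the-envelope figures check out. The per-employee cost comes from the footnote's $90k – $150k assumption; the average organizational dues figure is my own assumption for illustration, not an MUA number:

```python
# Rough check of the revenue figures above, using the footnote's assumed
# $90k-$150k fully-loaded cost per employee. The average organizational
# dues figure is my own assumption for illustration.
employees = 34
run_rate_low = 90_000 * employees     # $3.06M
run_rate_high = 150_000 * employees   # $5.1M

significant_low, significant_high = 500_000, 1_000_000
avg_org_dues_usd = 3_500  # assumed blend of bronze/silver/gold tiers

orgs_needed_low = significant_low // avg_org_dues_usd
orgs_needed_high = significant_high // avg_org_dues_usd
print(f"Moodle HQ run rate: ${run_rate_low/1e6:.1f}M - ${run_rate_high/1e6:.1f}M")
print(f"Organizational memberships needed: {orgs_needed_low} - {orgs_needed_high}")
```

At an assumed average of $3,500 per organization, the $500k – $1m band works out to roughly 140 – 285 organizational members, consistent with the "200+" rough figure.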
Moving forward, it will be important to track organizational memberships primarily, seeing if there is a clear path for MUA to reach at least 200 in a year or two.
What would it take to generate “significant influence on product development”? Well first it is worth understanding how this works. Twice a year the MUA will “work through a cycle of project proposal development and project voting in three phases”.
The number of votes in the process are determined by membership levels. The pot of money to allocate to the specific development projects is based on the revenue. What this means is that MUA is not voting, per current rules, on overall Moodle Core development – they are voting on how to allocate MUA funds within Moodle Core development. Again, if there is not some reasonable level of revenue ($500k +), then the impact of this development will be minor.
I’m not suggesting that $500k is a hard delimiter of significant or not; it is just a rough number to help us understand MUA’s future importance.
We should not forget the introduction of MoodleCloud:
MoodleCloud is our own hosting platform, designed and run by us, the people who make Moodle.
The revenue to cover the costs of this service is provided by ads or by a 5AUD / month payment to avoid ads. The service is limited to sites with 50 users or less. As currently designed, MoodleCloud would primarily pay for itself – scale does not necessarily create funds to pay for product development. But this service does provide a potential pathway for additional revenue with just a policy change.
A strategic inflection point is a time in the life of business when its fundamentals are about to change. That change can mean an opportunity to rise to new heights. But it may just as likely signal the beginning of the end.
– Andy Grove, Only the Paranoid Survive
More broadly, the issue is whether MUA and even MoodleCloud will change the dynamics to allow Moodle to modernize and compete more effectively within the overall LMS market. The fundamentals are changing for Moodle.
- Note: I do not have actual numbers on revenue amounts and am just using rough industry metrics of $90k – $150k per employee, fully loaded with office and equipment.
- I am not aware if Moodle Partners have any limitations on MoodleCloud terms in their contracts.
The post Launch Of Moodle Users Association: 44 members sign up, mostly as individuals appeared first on e-Literate.
By Phil Hill
One year ago I wrote a post critical of Inside Higher Ed for not doing a blanket disclosure about the sale of a majority stake to a private equity firm with other education holdings (most notably Ruffalo Noel Levitz).
Subsequent to the disclosure from the Huffington Post, IHE put up an ownership statement disclosing the ownership change and calling out that only editors are involved in editorial policies. The About Us page prominently links to this ownership statement.
In an interview with Education Dive, Scott Jaschik (an Inside Higher Ed founder and editor) noted his regret for not disclosing the sale up front while concluding:
“I guess I would just say to anyone who has questions, read us and read our coverage and call me if you think we’re doing anything that we shouldn’t,” [Jaschik] said.
In the past year I have done exactly that – watching carefully for editorial shifts, complaining publicly about one article, and privately emailing Jaschik on another issue.
My conclusion? Inside Higher Ed has shown no bias and no change in editorial policies based on the new ownership – they are living up to their word. IHE [Jaschik in particular] has also been quite good in discussing any questions or issues based on their coverage. IHE should be commended for their quality coverage of higher education news.
The post Inside Higher Ed: One year after selling majority stake in company appeared first on e-Literate.
By Phil Hill
I’ve never been a big TEDtalks fan, but recently I’ve been exploring some of the episodes, partially based on peer pressure.
In the process I ran across a talk from Sebastian Wernicke, who has a bioinformatics background but now seems to specialize in giving talks. The talk in question is “How to use data to make a hit TV show”, which starts by looking at two data approaches to binge TV production – Amazon’s use of data analysis to choose a new show concept, leading to Alpha House, and Netflix’s use of data to look at lots of show components but then to let humans make conclusions and “take a leap of faith”, leading to House of Cards. The anecdotes set up his description of where data fits and where it doesn’t, and this mirrors what Michael and I are seeing in the broad application of personalized learning.
As we described in our most recent EdSurge article:
Bottom Line: Personalized learning is not a product you can buy. It is a strategy that good teachers can implement.
While Wernicke is not addressing education, he describes the same underlying issue in a memorable way (starting at 8:18 in particular).
Now, personally I’ve seen a lot of this struggle with data myself, because I work in computational genetics, which is also a field where lots of very smart people are using unimaginable amounts of data to make pretty serious decisions like deciding on a cancer therapy or developing a drug. And over the years, I’ve noticed a sort of pattern or kind of rule, if you will, about the difference between successful decision-making with data and unsuccessful decision-making, and I find this a pattern worth sharing, and it goes something like this.
So whenever you’re solving a complex problem, you’re doing essentially two things. The first one is, you take that problem apart into its bits and pieces so that you can deeply analyze those bits and pieces, and then of course you do the second part. You put all of these bits and pieces back together again to come to your conclusion. And sometimes you have to do it over again, but it’s always those two things: taking apart and putting back together again.
And now the crucial thing is that data and data analysis is only good for the first part. Data and data analysis, no matter how powerful, can only help you taking a problem apart and understanding its pieces. It’s not suited to put those pieces back together again and then to come to a conclusion. There’s another tool that can do that, and we all have it, and that tool is the brain. If there’s one thing a brain is good at, it’s taking bits and pieces back together again, even when you have incomplete information, and coming to a good conclusion, especially if it’s the brain of an expert.
And that’s why I believe that Netflix was so successful, because they used data and brains where they belong in the process. They use data to first understand lots of pieces about their audience that they otherwise wouldn’t have been able to understand at that depth, but then the decision to take all these bits and pieces and put them back together again and make a show like “House of Cards,” that was nowhere in the data. Ted Sarandos and his team made that decision to license that show, which also meant, by the way, that they were taking a pretty big personal risk with that decision. And Amazon, on the other hand, they did it the wrong way around. They used data all the way to drive their decision-making, first when they held their competition of TV ideas, then when they selected “Alpha House” to make as a show. Which of course was a very safe decision for them, because they could always point at the data, saying, “This is what the data tells us.” But it didn’t lead to the exceptional results that they were hoping for.
So data is of course a massively useful tool to make better decisions, but I believe that things go wrong when data is starting to drive those decisions. No matter how powerful, data is just a tool . . .
We are not the only people to describe this distinction. Tony Bates’ latest blog post describes a crossroads we face in automation vs. empowerment:
The key question we face is whether online learning should aim to replace teachers and instructors through automation, or whether technology should be used to empower not only teachers but also learners. Of course, the answer will always be a mix of both, but getting the balance right is critical.
What I particularly like about the Wernicke description is that he gets to the difference between analysis (detailed examination of the elements or structure of something, typically as a basis for discussion or interpretation) and synthesis (combination or composition, in particular). Data is uniquely suited to the former, the human mind is uniquely suited to the latter.
This is not to say that the use of data and analytics can never be used to put information back together, but it is crucial to understand there is a world of difference in data for analysis and data for synthesis. In the world of education, the difference shows up in whether data is used to empower learners and teachers or whether it is used to attempt automation of the learning experience.
- Using Google’s definitions.
The post It’s Called Data Analysis And Not Data Synthesis For A Reason appeared first on e-Literate.
By Michael Feldstein
To recap what’s happened so far:
- Audrey Watters called our attention to a patent filing by Khan Academy.
- I expressed my concerns about the continuing patent problem that we have in educational technology.
- Carl Straumsheim explained the defensive use of patents in more detail and in the process motivated me to take a look at the specifics of the patent agreement that Khan Academy signed.
- I took an initial look at said agreement, known as the Innovator’s Agreement, and concluded that it was a step in the right direction but that I still had concerns.
Since then, I had a little more time to look at the actual legal language of the agreement and reflect on the larger edupatent problem. And I’ve come to the conclusion that Khan Academy did the right thing by adopting the agreement. We should feel good about what they’ve done. And given the realities that software patents exist and defensive patents are therefore a necessary evil, we should encourage other educational patent holders to do as Khan has done and adopt the same agreement.
The Innovator’s Agreement is actually quite clever. To recap the basic idea, companies that adopt the agreement give the inventors who are named on the patent application veto power over the patent’s assertion, except in cases where the company is acting in self-defense in response to legal action against it. More than just a pledge, it is a legally binding document. (Text of the agreement is here.)
The agreement travels with the patent, so if the company sells it then the new owner will still be bound by the agreement:
Assignee acknowledges and agrees that the above promises are intended to run with the Patents and are binding on any future owner, assignee or exclusive licensee who has been given the right to enforce any claims of the Patents against third parties. Assignee covenants with Inventors that any assignment or transfer of its right, title, or interest herein will be conveyed with the promises herein as an encumbrance.
The inventors do get to pass along assertion veto rights to their heirs:
[T]he license shall pass to the heirs of an inventor in the case that the inventor is deceased[…]
But if I’m reading the whole passage on those rights correctly, they can’t pass it along in a way that would damage the original intent (like selling it to a patent troll, for example), and there is a poison pill that basically says any protection from patent assertion that the inventor has a right to confer is invalid if it is granted under duress (for example, as a settlement payment in a threatened lawsuit):
Any sublicense granted by the Inventors under this section must be without threat or additional consideration; otherwise, the sublicense will be considered void ab initio. This license to the Inventors is not assignable, although the license shall pass to the heirs of an inventor in the case that the inventor is deceased, and the inventors, individually or jointly, may appoint a representative who may act on their behalf in granting sublicenses under this section. Assignee acknowledges and agrees that the promises in section 2 and 4 are intended to benefit third parties, except in the case of an assertion of claims of the Patents authorized under section 2.
There's even a provision that says the company that holds the patent can assert it in defense of third parties that are being sued for patent infringement:
[The Company can assert the patent] against an Entity that has filed, maintained, or voluntarily participated in a patent infringement lawsuit against another in the past ten years, so long as the Entity has not instituted the patent infringement lawsuit defensively in response to a patent litigation threat against the Entity.
Overall, the Innovator’s Agreement is a pretty potent tool for deterring patent assertion. And while I would prefer that the power granted by the agreement be in the hands of a trusted third party, the protection of this agreement is still a big step forward, particularly if it is adopted widely enough that there are many parties holding such rights to different patents. The biggest thing that is missing is a strong motivation for the patent holders to assert their patents in the defense of a third party. For example, would Big LMS Company Patent Holder assert a patent in defense of Little Ed Tech Startup if the latter were being sued by Big Textbook Company that happened to also be a major business partner of Big LMS Company Patent Holder? I doubt it. In fact, I doubt that the third-party defense is likely to ever be invoked, for a variety of reasons. Second, I’m not sure that the engineers named on the patents are always the best-suited defenders of education against patent assertion.
On the other hand, the Innovator’s Agreement has several virtues that my proposal does not. First, it already exists and has been vetted by Twitter’s undoubtedly super-expensive lawyers. Second, nobody would have to create a trust, fund it, and convince various patent holders to put their faith in it.
Under the circumstances, I think Khan Academy did the right thing by adopting the Innovator’s Agreement, and I think we should all encourage other holders of education-relevant patents to do the same. And by “encourage,” I mean both praise those that do adopt it and pressure those that don’t. Schools could even go so far as to make adoption of the agreement a contractual requirement. Creation of a trust is always a possibility down the line, using the Innovator’s Agreement as a template. (Twitter was kind enough to release the text of the agreement under a Creative Commons license.)
The post Patents Rethought: Khan Academy Did the Right Thing appeared first on e-Literate.
By Michael Feldstein
Carl Straumsheim has a good piece out today in Inside Higher Ed on the Khan Academy patent. Much of it is a primer on the uses and limitations of defensive patents, but there is a piece on the specific nature of the patent pledge that Khan Academy has signed that I had missed. The pledge, originally created by Twitter, is quite similar to my own proposal in a number of ways. It turns the decision-making regarding offensive use of the patent over to another party and, importantly, the agreement travels with the patent, even if it changes hands:
The IPA is a new way to do patent assignment that keeps control in the hands of engineers and designers. It is a commitment from Twitter to our employees that patents can only be used for defensive purposes. We will not use the patents from employees’ inventions in offensive litigation without their permission. What’s more, this control flows with the patents, so if we sold them to others, they could only use them as the inventor intended.
Shame on me for not doing my homework.
The big difference between this pledge and the one I propose is that I am suggesting that the third party be a trust rather than the inventing engineer. This has several virtues. First, engineers die, and not all of them are going to be equally vigilant in protecting education. Can the engineer sell this right to somebody else? Can the right be inherited? If it isn’t inherited, is the patent then unencumbered? Giving the rights to a trust lays these concerns to rest. It also creates a proactive deterrent because the trust could sue anybody who is asserting an ed tech patent.
What I take from the details of Twitter’s pledge is that my proposal is probably legally viable. The original pledge just needs to be adapted to serve the specific needs of education.