Michael Feldstein

What We Are Learning About Online Learning...Online

Personalized Learning is Hard

Sat, 2015-08-29 10:04

By Michael Feldstein

Paul Fain has written a really good, nuanced article at IHE covering the update that Essex County College gave on their developmental math adaptive learning pilot at a recent conference in Washington, DC. (We did a two-part case study on ECC in our e-Literate TV series.) The headline results are as follows:

  • In the first year, the pass rate was worse than in the traditional classes. (The first semester was “disastrous.”)
  • This year—the second year—the pass rate is coming closer to the traditional class but is still underperforming.
  • The article seems to imply that students who earn a C in the personalized learning class do better than students who earn a C in the traditional class, but the article is not explicit about that.

There is no magic pill. As Phil and I have been saying all along—most recently in my last post, which mentioned ECC’s use of adaptive learning—the software is, at best, an enabler. It’s the work that the students and teachers do around the software that makes the difference. Or not. In ECC’s case, they are trying to implement a pretty radical change in pedagogy with an at-risk population. It’s worth digging into the details.

Let’s start by reviewing the basics of their situation:

  • ECC has a 50% pass rate in their lowest level developmental math class, and a 50% pass rate in the next developmental math class up. Since a substantial majority of ECC students place into developmental math, a big part of ECC’s college completion problem can be traced to students failing developmental math.
  • ECC believes that a big reason they have a high failure rate is that students come into that class with an incredibly wide range of prior skills and knowledge—wide enough that a traditional lecture-based class would not address the needs of a majority of the students.
  • They decided to try a radical change in the way the developmental math course was structured.
    • Students would work self-paced on a mastery learning curriculum in labs using McGraw Hill’s ALEKS adaptive learning software. Students could ask each other or the roving instructor for help.
    • Students also met with a teacher each week, separately from the lab sessions, to report their progress of the week, assess the success or failure of their learning strategies, and set new strategies and goals for the next week.

So why does ECC think that they are not getting the results that they hoped for? Doug Walcerz, ECC’s Vice President for Planning, Research, and Assessment, offered a few observations. From the article:

  • “[A]daptive courses provide less “accountability.” That’s because students move through content at different paces and it’s harder to make sure they master concepts by a certain point. ‘There is no classwide mile post.'”
  • “[T]he college leaned heavily on graduate students from nearby Rutgers University at Newark and the New Jersey Institute of Technology to teach parts of the adaptive courses during the first year.”
  • “’We underestimated the skill that you would need as a teacher to deliver that content,’ he said.”
  • “Faculty buy-in has also been a challenge. In adaptive courses, instructors do not give lectures or teach in the traditional format. Instead, they circulate among students who are working on computer-based courseware, offering help when needed, much like tutors. That feels like a job ‘below faculty status’ for some instructors, Walcerz said.”

Putting this all together, here is what I see:

  • ECC is starting with an at-risk population, a large portion of which probably has not been taught good meta-cognitive skills or help-seeking behaviors.
  • They are putting those students into a curriculum which, whatever its other virtues may be, puts a higher demand on those meta-cognitive and help-seeking behaviors than a traditional class would.
  • The burden of addressing that weakness in the course design falls on the faculty. But ECC has been working with untrained and inexperienced adjuncts—in fact, graduate students—as well as some faculty who may be hostile to the project. (ECC has since moved away from using graduate students, according to the article.)

There may or may not also be problems with the software. For what it’s worth, Walcerz seems to think highly of the software and doesn’t believe that it is contributing to the poor results. Personally, I think the problems with the match between the student skills and the course design are sufficient to explain the problem. The kind of burden that a self-paced program like this puts on these students is somewhat analogous to the burden that an online course puts on them. We know that the type of population that would be enrolled in a developmental math course in a community college in Newark, NJ typically does not do well in online courses. The difference is that, in ECC’s design, there actually are faculty there to intervene and coach the students personally. It stands to reason that the quality of that coaching would be a critical success factor.

Does this mean that ECC’s approach was a bad idea? I don’t think so. Differentiated instruction is a logical pedagogical response to a heterogeneous class problem. But it can only work in their environment if they have appropriately skilled, trained, and motivated faculty. ECC made substantial investments in software and facilities, but this result highlights that in many cases the critical success factor will be a substantial investment in appropriate professional development for faculty and in a motivating compensation and promotion plan. It sounds like they have come to realize that and are taking some steps in that direction.

Truly effective innovation in education is hard. As Phil likes to stress, it takes both brutal honesty regarding the results and a commitment to iterate when the results are almost inevitably not what we hoped for on the first try. A while back, I blogged about an interesting case study at MSU where they did exactly that with a psychology class. If you read the comments thread in the follow-on post, you’ll see that Mike Caulfield brought up a potentially new insight that the course’s DWF pattern may be related to interactions between the course’s absence policy and the blended format. Course problems (and course successes) can be subtle and hard to tease out.

There. Is. No. Magic. Pill.

The post Personalized Learning is Hard appeared first on e-Literate.

The Fraught Interaction Design of Personalized Learning Products

Wed, 2015-08-26 12:49

By Michael Feldstein

David Wiley has a really interesting post up about Lumen Learning’s new personalized learning platform. Here’s an excerpt:

A typical high-level approach to personalization might include:

  • building up an internal model of what a student knows and can do,
  • algorithmically interrogating that model, and
  • providing the learner with a unique set of learning experiences based on the system’s analysis of the student model

Our thinking about personalization started here. But as we spoke to faculty and students, and pondered what we heard from them and what we have read in the literature, we began to see several problems with this approach. One in particular stood out:

There is no active role for the learner in this “personalized” experience. These systems reduce all the richness and complexity of deciding what a learner should be doing to – sometimes literally – a “Next” button. As these systems painstakingly work to learn how each student learns, the individual students lose out on the opportunity to learn this for themselves. Continued use of a system like this seems likely to create dependency in learners, as they stop stretching their metacognitive muscles and defer all decisions about what, when, and how long to study to The Machine.
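
To make the pattern David is criticizing concrete, here is a minimal sketch of that three-step loop in code. The class names, the mastery-update rule, and the activity data are all illustrative assumptions of mine rather than anyone’s actual product; the point is simply that the learner’s entire role in this design reduces to answering items and clicking “Next.”

```python
# A minimal sketch of the "student model -> algorithm -> Next button" loop.
# All names and the mastery-update rule are illustrative assumptions,
# not any vendor's actual implementation.

class StudentModel:
    """Tracks an estimated mastery level (0.0 to 1.0) for each skill."""

    def __init__(self, skills):
        # Start every skill at a neutral prior of 0.5.
        self.mastery = {skill: 0.5 for skill in skills}

    def update(self, skill, correct):
        # Crude running estimate: nudge mastery up on a correct answer,
        # down on an incorrect one.
        delta = 0.1 if correct else -0.05
        self.mastery[skill] = min(1.0, max(0.0, self.mastery[skill] + delta))


def choose_next_activity(model, activities):
    """Interrogate the model and return the activity targeting the weakest
    skill. The learner never sees the reasoning -- they just get "Next"."""
    weakest_skill = min(model.mastery, key=model.mastery.get)
    return next(a for a in activities if a["skill"] == weakest_skill)


activities = [
    {"id": "frac-01", "skill": "fractions"},
    {"id": "dec-01", "skill": "decimals"},
]
model = StudentModel(["fractions", "decimals"])
model.update("fractions", correct=False)

# The learner's entire role: answer, click Next, repeat.
print(choose_next_activity(model, activities))  # -> the fractions activity
```

Even a far more sophisticated student model changes nothing about the interaction design; the learner is still on the outside of the loop.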

Instructure’s Jared Stein really likes Lumen’s approach, writing,

So much work in predictive analytics and adaptive learning seeks to relieve people from the time-consuming work of individual diagnosis and remediation — that’s a two-edged sword: Using technology to increase efficiency can too easily sacrifice humanness — if you’re not deliberate in the design and usage of the technology. This topic came up quickly amongst the #DigPedNetwork group when Jim Groom and I chatted about closed/open learning environments earlier this month, suggesting that we haven’t fully explored this dilemma as educators or educational technologist.

I would add that I have seen very little evidence that either instructors or students place a high value on the adaptivity of these products. Phil and I have talked to a wide range of folks using these products, both in our work on the e-Literate TV case studies and in our general work as analysts. There is a lot of interest in the kind of meta-cognitive dashboarding that David is describing. There is little interest in, and in some cases active hostility toward, adaptivity. For example, Essex County College is using McGraw Hill’s ALEKS, which has one of the more sophisticated adaptive learning approaches on the market. But when we talked to faculty and staff there, the aspects of the program that they highlighted as most useful were a lot more mundane, e.g.,

It’s important for students to spend the time, right? I mean learning takes time, and it’s hard work. Asking students to keep time diaries is a very difficult ask, but when they’re working in an online platform, the platform keeps track of their time. So, on the first class day of the week, that’s goal-setting day. How many hours are you going to spend working on your math? How many topics are you planning to master? How many classes are you not going to be absent from?

I mean these are pretty simple goals, and then we give them a couple goals that they can just write whatever they feel like. And I’ve had students write, “I want to come to class with more energy,” and other such goals. And then, because we’ve got technology as our content delivery system, at the end of the week I can tell them, in a very efficient fashion that doesn’t take up a lot of my time, “You met your time goal, you met your topic goal,” or, “You approached it,” or, “You didn’t.”

So one of the most valuable functions of this system in this context is to reflect back to the students what they have done in terms that make sense to them and are relevant to the students’ self-selected learning goals. The measures are fairly crude—time on task, number of topics covered, and so on—and there is no adaptivity necessary at all.
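
It is worth noticing just how little machinery that kind of feedback requires. Here is a minimal sketch of the weekly goal report the instructor describes; the field names, the 80% “approached it” threshold, and the sample numbers are my own assumptions, not ALEKS’s actual data model.

```python
# Minimal sketch of a weekly goal report: compare a student's self-set goals
# against what the platform logged. Field names, the 80% "approached it"
# threshold, and the sample data are illustrative assumptions.

def weekly_report(goals, logged):
    lines = []
    for measure in ("hours", "topics_mastered", "classes_attended"):
        target, actual = goals[measure], logged[measure]
        if actual >= target:
            status = "you met your goal"
        elif actual >= 0.8 * target:
            status = "you approached it"
        else:
            status = "you didn't meet it"
        lines.append(f"{measure}: goal {target}, actual {actual} -> {status}")
    return "\n".join(lines)


goals = {"hours": 6, "topics_mastered": 8, "classes_attended": 2}
logged = {"hours": 5, "topics_mastered": 9, "classes_attended": 2}
print(weekly_report(goals, logged))
```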

But I also think that David’s post hints at some of the complexity of the design challenges with these products.

You can think of the family of personalized learning products as having potentially two components: diagnostic and prescriptive. Everybody who likes personalized learning products in any form likes the diagnostic component. The foundational value proposition for personalization (which should not in any way be confused with “personal”) is having the system provide feedback to students and teachers about what the student does well and where the student is struggling. Furthermore, the perceived value of the product is directly related to the confidence that students and teachers have that the product is rendering an accurate diagnosis. That’s why I think products that provide black box diagnoses are doomed to market failure in the long term. As the market matures, students and teachers are going to want to know not only what the diagnosis is but also what the basis of the diagnosis is, so that they can judge for themselves whether they think the machine is correct.

Once the system has diagnosed the student’s knowledge or skill gaps—and it is worth calling out that many of these personalized learning systems work on a deficit model, where the goal is to get students to fill in gaps—the next step is to prescribe actions that will help students to address those gaps. Here again we get into the issue of transparency. As David points out, some vendors hide the rationale for their prescriptions, even going so far as to remove user choice and just hide the adaptivity behind the “next” button. Note that the problem isn’t so much with providing a prescription as it is with the way in which it is provided. The other end of the spectrum, as David argues, is to make recommendations. The full set of statements from a well-behaved personalized learning product to a student or teacher might be something like the following:

  1. This is where I think you have skill or knowledge gaps.
  2. This is the evidence and reasoning for my diagnosis.
  3. This is my suggestion for what you might want to do next.
  4. This is my reasoning for why I think it might help you.

It sounds verbose, but it can be done in fairly compact ways. Netflix’s “based on your liking Movie X and Movie Y, we think you would give Movie Z 3.5 stars” is one example of a compact explanation that provides at least some of this information. There are lots of ways that a thoughtful user interface designer can think about progressively revealing some of this information and providing “nudges” that encourage students on certain paths while still giving them the knowledge and freedom they need to make choices for themselves. The degree to which the system should be heavy-handed in its prescription probably depends in part on the pedagogical model. I can see something closer to “here, do this next” feeling appropriate in a self-paced CBE course than in a typical instructor-facilitated course. But even there, I think the Lumen folks are 100% right that the first responsibility of the adaptive learning system should be to help the learner understand what the system is suggesting and why so that the learner can gain better meta-cognitive understanding.
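
One way to picture the difference between a black-box “Next” button and a well-behaved recommendation is as a data structure that always carries its own evidence and rationale. Here is a minimal sketch; the fields are just my shorthand for the four statements above, not any vendor’s schema.

```python
# Minimal sketch of a "transparent" recommendation: every suggestion carries
# the diagnosis, the evidence behind it, and the reasoning for the suggestion.
# Field names and the example content are illustrative assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class Recommendation:
    diagnosis: str       # 1. where the system thinks the gap is
    evidence: List[str]  # 2. the basis for that diagnosis
    suggestion: str      # 3. what the learner might do next
    rationale: str       # 4. why the system thinks it will help

    def explain(self) -> str:
        return (
            f"Gap: {self.diagnosis}\n"
            f"Because: {'; '.join(self.evidence)}\n"
            f"Suggestion: {self.suggestion}\n"
            f"Why: {self.rationale}"
        )


rec = Recommendation(
    diagnosis="adding fractions with unlike denominators",
    evidence=["missed 4 of 5 unlike-denominator items",
              "strong on like-denominator items"],
    suggestion="review worked examples on common denominators, then try a short practice set",
    rationale="errors cluster on finding common denominators, not on the addition step",
)
print(rec.explain())
```

However the interface chooses to surface it (progressively, as a nudge, or as a full explanation), the learner can always ask the system to show its work.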

None of which is to say that the fancy adaptive learning algorithms themselves are useless. To the contrary. In an ideal world, the system will be looking at a wide range of evidence to provide more sophisticated evidence-based suggestions to the students. But the key word here is “suggestions.” Both because a critical part of any education is teaching students to be more self-aware of their learning processes and because faulty prescriptions in an educational setting can have serious consequences, personalized learning products need to evolve out of the black box phase as quickly as possible.

The post The Fraught Interaction Design of Personalized Learning Products appeared first on e-Literate.

Inside View Of Blackboard’s Moodle Strategy In Latin America

Wed, 2015-08-26 11:45

By Phil Hill

One year ago Blackboard’s strategy for Moodle was floundering. After the 2012 acquisition of Moodlerooms and Netspot, Blackboard had kept its promises of supporting the open source community – and in fact, Blackboard pays much more than 50% of the total revenue going to Moodle HQ[1] – but that does not mean they had a strategy. Key Moodlerooms employees were leaving, and the management was frustrated. Last fall the remaining Moodlerooms management put together an emerging strategy to invest in (through corporate M&A) and grow the Moodle business, mostly outside of the US.

In just the past twelve months, Blackboard has acquired three Moodle-based companies – Remote-Learner UK (Moodle Partner in the UK), X-Ray Analytics (learning analytics for Moodle), and Nivel Siete (Moodle Partner in Colombia). When you add organic growth to these acquisitions, Blackboard has added ~450 new clients using Moodle in this same time period, reaching a current total of ~1400.

This is a change worth exploring. To paraphrase Michael’s statements to me and in his recent BbWorld coverage:

If you want to understand Blackboard and their future, you have to understand what they’re doing internationally. If you want to understand what they’re doing internationally, you have to understand what they’re doing with Moodle.

Based on this perspective, I accepted an invitation from Blackboard to come visit Nivel Siete last week to get a first-hand view of what this acquisition means. I also attended the MoodleMoot Colombia #mootco15 conference and talked directly to Moodle customers in Latin America. Let’s first unpack that last phrase.

  • Note that due to the nature of this trip, I “talked directly” with Blackboard employees, Nivel Siete employees, Blackboard resellers, and Nivel Siete customers. They did give me free access to talk privately with whoever I wanted to, but treat this post as somewhat of an inside view rather than one that also includes perspectives from competitors.
  • “Moodle” is very significant in Latin America. It is the default LMS that dominates learning environments. The competition, or alternative solution, there is Blackboard Learn or . . . another route to get Moodle. In this market D2L and Canvas have virtually no presence – each company has just a couple of clients in Latin America and is not currently a factor in LMS decision-making. Schoology has one very large customer in Uruguay serving hundreds of thousands of students. Blackboard Learn serves the top of the market – e.g., the top 10% of Colombian institutions in terms of revenue, where it already serves the majority of that sub-market according to the people I talked to. For the remaining 90%, it is pretty much Moodle, Moodle, alternate applications that are not LMSs, or nothing.[2]
  • I chose “customers” instead of “schools” or “institutions” for a reason. What is not understood in much of the education community is that Moodle has a large footprint outside of the higher ed and K-12 markets. Approximately 2/3 of Nivel Siete’s clients are in corporate learning, and several others are government. And this situation is quite common for Moodle. In the US, more than 1/3 of Moodlerooms’ and approximately 1/2 of Remote-Learner’s customers are in corporate learning. Phill Miller, the VP of International for Moodlerooms, said that most of the Moodle hosting and service providers he has met also serve corporate clients in numbers similar to their education clients.
  • I chose “Latin America” instead of “Colombia” for a reason. While all but ~12 of Nivel Siete’s existing clients are in Colombia, Blackboard bought the company to act as a center of excellence or support service company for most of Latin America – Colombia, Mexico, Brazil, and Peru in particular. Cognos Online, their current local reseller in Latin America for core Blackboard products (Learn, Collaborate, etc.), will also become the reseller for their Moodle customers. Nivel Siete will support a broader set of clients. In other words, this is not a simple acquisition of customers – it is an expansion of international presence.

And while we’re at it, the conference reception included a great opera mini flash mob (make sure to watch past 0:37):

Nivel Siete

Nivel Siete (meaning Level 7, a reference to two of the founders’ college days when a professor talked about the need to understand deeper levels of the technology stack than just the top-level applications that customers see) is a company of just over 20 employees in Bogota. They have 237+ clients, and that number is growing. During the three days I was there, they signed several new contracts. They offer Moodle hosting and service in a cloud environment based on Amazon Web Services (AWS) – not true SaaS, as they allow multiple software versions in production and have not automated all provisioning or upgrade processes. What they primarily offer, according to the founders, is a culture of how to service and support using cloud services and specific marketing and sales techniques.

In Latin America, most customers care more about the local sales and support company than they do about the core software. As one person put it, they believe in skin-to-skin sales, where clients have relationships they trust as long as solutions are provided. Most LMS customers in Latin America do not care as much about the components of that solution as they do about relationships, service, and price. And yet, due to open source software and lightweight infrastructure needs, Moodle is dominant as noted above. The Moodle brand, code base, and code licensing do not matter as much as the Moodle culture and ecosystem. From a commercial standpoint, Nivel Siete’s competitors include a myriad of non-Moodle-Partner hosting providers – telcos bundling in hosting, mom-and-pop providers, self-hosting – or non-consumption. For a subset of the market, Nivel Siete has competed with Blackboard Learn.

Beyond Cognos Online, Blackboard has another ~9 resellers in Latin America, and Nivel Siete (or whatever they decide to name the new unit) will support all of these resellers. This is actually the biggest motivation other than cash for the company to sell – they were seeking methods to extend their influence, and this opportunity made the most sense.

Blackboard Learn and Ultra

What about that Learn sub-market? Most clients and sales people (resellers as well as Blackboard channel managers) are aware of Learn Ultra, but the market seems to understand already that Ultra is not for them . . . yet. They appear to be taking a ‘talk to me when it’s done and done in Spanish’ approach and are not basing current decisions on Ultra. In this sense, the timing for Ultra does not matter all that much, as the market is not waiting on it. Once Ultra is ready for Latin America, Blackboard sales (channel managers and resellers) expect the switchover to be quicker than in the US, as major LMS upgrades (involving major UI and UX changes) or adoptions tend to take weeks or months instead of the year or more we often see in the States. At least in the near term, Learn Ultra is not a big factor in this market.

What Blackboard is best known for in this market is the large SENA contract running on Learn. SENA (National Service for Learning) is a government organization that runs the majority of all vocational colleges – providing certificates and 2-year vocational degrees mostly for lower-income students, a rising-middle-class pathway that is important in developing countries. Blackboard describes SENA as having 6+ million total enrollment, with ~80% in classrooms and ~20% in distance learning.

Integration

The challenge Blackboard faces is integrating its Learn and Moodle operations through the same groups – Nivel Siete internal group, Cognos Online and other resellers serving both lines – without muddling the message and go-to-market approach. Currently Learn is marketed and sold through traditional enterprise sales methods – multiple meetings, sales calls, large bids – while Nivel Siete’s offering of Moodle is marketed and sold with more of a subscription-based mentality. As described by ForceManagement:

A customer who has moved to a subscription-based model of consumption has completely different expectations about how companies are going to interact with them.

How you market to them, how you sell to them, how you bill them, how you nurture the relationship – it’s all affected by the Subscription Economy. The customer’s idea of value has changed. And, if the customer’s idea of value has changed, your value proposition should be aligned accordingly. [snip]

The subscription-based sales process relies less on the closing of a sale and more on the nurturing of a long-term relationship to create lifetime customer value.

One of Nivel Siete’s most effective techniques is their e-Learner Magazine, which highlights customers telling their own stories and lessons in a quasi-independent fashion. The company has relied on inbound calls and quick signups and service startups. There is quite a cultural difference between enterprise software and subscription-based approaches. While Blackboard itself is facing such changes due to Ultra and newly-offered SaaS models, the group in Latin America is facing the challenge of two different cultures served by the same organizations today.

To help address this challenge, Cognos Online is planning to have two separate teams selling / servicing mainline Blackboard products and Moodle products. But even then, CEO Fernery Morales said that their biggest risk is muddling the message and not integrating appropriately.

Moodle Strategy and Risk

At the same time, this strategy and growth come as the Moodle community at large appears to be at an inflection point. That inflection point, as I see it, comes from a variety of triggers:

  • Blackboard acquisitions raising concerns about commercialization among Moodle HQ, other Moodle Partners, and some subset of users;
  • Creation of the Moodle Association as well as Moodle Cloud services as alternate paths to Moodle Partners for revenue and setup; and
  • Remote-Learner leaving the Moodle Partner program and planning to join the Moodle Association, with the associated lost revenue and public questioning of the partner program’s value.

I don’t have time to fully describe these changes here, but Moodle itself is both an opportunity and a risk, mostly based on its own success globally. More on that in a future post.

What Does This Mean Beyond Latin America?

It’s too early to fully know, but here are a few notes.

  • Despite the positioning in the US media, there is no single “international” market. There are multiple local or regional markets outside of the US that offer tremendous growth opportunities for US and other companies outside of those immediate markets. Addressing these markets puts a high premium on localization – having feet on the ground with people who know the culture and can be trusted in the region, and including product customizations meant for those markets. Much of the ed tech investment boom is built on expectations of international growth, but how many ed tech companies actually know how to address local or regional non-US markets? This focus on localizing for international markets is one of Blackboard’s greatest strengths.
  • Based on the above, at least in Latin America, Blackboard is establishing itself as the status quo before other learning platforms really get a chance to enter the market strategically. For example, Instructure has clearly not chosen to go after non-English-speaking international markets yet, but by the time they do push Canvas into Latin America, if Blackboard has successfully integrated Nivel Siete, Instructure will likely face an entrenched competitor and potential clients who by default assume Moodle or Learn as the solution.
  • Blackboard as a company has one big growth opportunity right now – the collection of non-US “international” markets that represent just under 1/4 of the company’s revenue. Domestic higher ed is not growing, K-12 is actually decreasing, but international is growing. These growing markets need Moodle and traditional Learn 9.1 much more than Ultra. I suspect that this growing importance is creating more and more tension within Blackboard, as the company needs to balance Ultra with traditional Learn and Moodle development.
  • While I strongly believe in the mission of US community colleges and low-cost 4-year institutions, in Latin America the importance of education in building up an emerging middle class is much greater than in the US. We hear “importance of education” and “building of the middle class” used in generic terms regarding ed tech potential, but seeing this connection more closely by being in country is inspiring. This is a real global need that future investment in people and technology can and should address.
  1. This information is based on a tweet last spring showing that Moodlerooms + Netspot combined were more than 50% of revenue, and on the fact that the next largest Moodle Partner, Remote-Learner, has left the program. Since last year I have confirmed this information through multiple sources.
  2. Again, much of this information is from people related to Blackboard, but it also matches my investigation of press releases and public statements about specific customers of D2L and Instructure.

The post Inside View Of Blackboard’s Moodle Strategy In Latin America appeared first on e-Literate.

Ed Tech Evaluation Plan: More problems than I initially thought

Mon, 2015-08-24 14:21

By Phil Hill

Late last week I described the new plan from the US Department of Education (ED) and their Office of Educational Technology (OET) to “call for better methods for evaluating educational apps”. Essentially the ED is seeking proposals for new ed tech evaluation methods so that they can share the results with schools – helping them evaluate specific applications. My argument [updated DOE to be ED]:

Ed tech apps by themselves do not “work” in terms of improving academic performance. What “works” are pedagogical innovations and/or student support structures that are often enabled by ed tech apps. Asking if apps work is looking at the question inside out. The real question should be “Do pedagogical innovations or student support structures work, under which conditions, and which technology or apps support these innovations?”. [snip]

I could see that for certain studies, you could use the ED template and accomplish the same goal inside out (define the conditions as specific pedagogical usage or student support structures), thus giving valuable information. What I fear is that the pervasive assumption embedded in the program setup, asking over and over “does this app work,” will prove fatal. You cannot put technology at the center of understanding academic performance.

Upon further thought, as well as prompting from the comments and private notes, this ED plan has even more problems than I initially thought.

Advocate or Objective Evaluator

There is a real problem with this plan coming out of the Office of Educational Technology due to their mission.

The mission of the Office of Educational Technology (OET) is to provide leadership for transforming education through the power of technology. OET develops national educational technology policy and establishes the vision for how technology can be used to support learning.

The OET strongly advocates for the use of ed tech applications, which I think is a primary cause of their inside-out, technology-first view of the world. They are not an objective organization in terms of whether and when technology should be used, but rather an advocate that assumes technology should be used and asks only that it be made effective. Consider these two statements, the first from the National Technology Plan and the second from the paper “Learning Technology Effectiveness” [emphasis added]:

  • The plan calls for applying the advanced technologies used in our daily personal and professional lives to our entire education system to improve student learning, accelerate and scale up the adoption of effective practices, and use data and information for continuous improvement.
  • While this fundamental right to technology access for learning is nonnegotiable, it is also just the first step to equitable learning opportunities.

I have no problem with these goals, per se, but it would be far more useful to not have advocates in charge of evaluations.

A Better View of Evaluation

Richard Hershman from the National Association of College Stores (NACS) shared with me an article that contained a fascinating section on just this subject.

Why Keep Asking the Same Questions When They Are Not the Right Questions?

There are no definitive answers to questions about the effectiveness of technology in boosting student learning, student readiness for workforce skills, teacher productivity, and cost effectiveness. True, some examples of technology have shown strong and consistent positive results. But even powerful programs might show no effects due to myriad methodological flaws. It would be most unfortunate to reject these because standardized tests showed no significant differences. Instead, measures should evaluate individual technologies against specific learning, collaboration, and communication goals.

The source of this excellent perspective on evaluating ed tech? An article called “Plugging In: Choosing and Using Educational Technology” from the North Central Regional Educational Laboratory and commissioned by the US Department of Education in 1995.

As Richard Parent commented in my recent post:

You’re exactly right to reframe this question. It’s distressing when the public demands to know “what works” as if there are a set of practices or tools that simply “are” good education. It’s downright depressing when those who should be in the know do so, too.

Update: This does not fully rise to the level of a response, but Rolin Moe got Richard Culatta to respond to his tweet about the initial article.

Rolin Moe (@RMoeJo): Most important thing I have read all year – @philonedtech points out technocentric assumptions of US ED initiative

Richard Culatta (@rec54), August 25, 2015: @RMoeJo it’s true. I believe research has to adapt to pace of tech or we will continue to make decisions about edu apps with no evidence

The post Ed Tech Evaluation Plan: More problems than I initially thought appeared first on e-Literate.

US Department of Education: Almost a good idea on ed tech evaluation

Fri, 2015-08-21 16:53

By Phil Hill

Richard Culatta from the US Department of Education (DOE, ED, never sure of proper acronym) wrote a Medium post today describing a new ED initiative to evaluate ed tech app effectiveness.

As increasingly more apps and digital tools for education become available, families and teachers are rightly asking how they can know if an app actually lives up to the claims made by its creators. The field of educational technology changes rapidly with apps launched daily; app creators often claim that their technologies are effective when there is no high-quality evidence to support these claims. Every app sounds world-changing in its app store description, but how do we know if an app really makes a difference for teaching and learning?

He then describes the traditional one-shot studies of the past (control group, control variables, year or so of studies, get results) and notes:

This traditional approach is appropriate in many circumstances, but just does not work well in the rapidly changing world of educational technology for a variety of reasons.

The reasons?

  • Takes too long
  • Costs too much and can’t keep up
  • Not iterative
  • Different purpose

This last one is worth calling out in detail, as it underlies the assumptions behind this initiative.

Traditional research approaches are useful in demonstrating causal connections. Rapid cycle tech evaluations have a different purpose. Most school leaders, for example, don’t require absolute certainty that an app is the key factor for improving student achievement. Instead, they want to know if an app is likely to work with their students and teachers. If a tool’s use is limited to an after-school program, for example, the evaluation could be adjusted to meet this more targeted need in these cases. The collection of some evidence is better than no evidence and definitely better than an over-reliance on the opinions of a small group of peers or well-designed marketing materials.

The ED plans are good in terms of improving the ability to evaluate effectiveness in a manner that accounts for rapid technology evolution. The general idea of ED investing in the ability to provide better decision-making information is a good one. It’s also very useful to see ED recognize the context of effectiveness claims.

The problem I see, and it could be a fatal one, is that ED is asking the wrong question for any technology or apps related to teaching and learning. [emphasis added]

The important questions to be asked of an app or tool are: does it work? with whom? and in what circumstances? Some tools work better with different populations; educators want to know if a study included students and schools similar to their own to know if the tool will likely work in their situations.

Ed tech apps by themselves do not “work” in terms of improving academic performance[1]. What “works” are pedagogical innovations and/or student support structures that are often enabled by ed tech apps. Asking if apps work is looking at the question inside out. The real question should be “Do pedagogical innovations or student support structures work, under which conditions, and which technology or apps support these innovations?”.

Consider our e-Literate TV coverage of Middlebury College and one professor’s independent discovery of flipped classroom methods.

How do you get valuable information if you ask the question “Does YouTube work to increase academic performance?” You can’t. YouTube is a tool that the professor used. Now you could get valuable information if you ask the question “Does the flipped classroom work for science courses, and which tools work in this context?” You could even ask, “For the tools that support this flipped classroom usage, does the choice of tool (YouTube, Vimeo, etc.) correlate with changes in student success in the course?”
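
To see why the framing matters, here is a toy illustration with invented numbers (my own sketch, not real data): grouping the same hypothetical course outcomes by app shows essentially nothing, while grouping them by pedagogy reveals the signal.

```python
# Toy illustration of the "inside out" framing (my own sketch; the course
# results below are invented for illustration, not real data).
from collections import defaultdict
from statistics import mean

courses = [
    {"app": "YouTube", "pedagogy": "flipped", "pass_rate": 0.78},
    {"app": "YouTube", "pedagogy": "lecture", "pass_rate": 0.61},
    {"app": "Vimeo",   "pedagogy": "flipped", "pass_rate": 0.80},
    {"app": "Vimeo",   "pedagogy": "lecture", "pass_rate": 0.60},
]

def grouped_mean(rows, key):
    """Average pass rate for each value of the grouping key."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row["pass_rate"])
    return {k: round(mean(v), 2) for k, v in groups.items()}

print("By app:     ", grouped_mean(courses, "app"))       # nearly identical: the app tells you nothing
print("By pedagogy:", grouped_mean(courses, "pedagogy"))  # the flipped vs. lecture gap is the signal
```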

I could see that for certain studies, you could use the ED template and accomplish the same goal inside out (define the conditions as specific pedagogical usage or student support structures), thus giving valuable information. What I fear is that the pervasive assumption embedded in the program setup, asking over and over “does this app work,” will prove fatal. You cannot put technology at the center of understanding academic performance.

I’ll post this as a comment to Richard’s Medium post as well. With a small change in the framing of the problem, this could be a valuable initiative from ED.

Update: Changed DOE to ED for accuracy.

Update: This does not fully rise to the level of a response, but Rolin Moe got Richard Culatta to respond to his tweet about this article.

Rolin Moe (@RMoeJo): Most important thing I have read all year – @philonedtech points out technocentric assumptions of US ED initiative

Richard Culatta (@rec54), August 25, 2015: @RMoeJo it’s true. I believe research has to adapt to pace of tech or we will continue to make decisions about edu apps with no evidence

  1. And yes, they throw in a line that it is not just about academic performance but also about administrative claims. But the whole setup is about teaching and learning usage, which is the primary focus of my comments.

The post US Department of Education: Almost a good idea on ed tech evaluation appeared first on e-Literate.

68 Percent of Statistics Are Meaningless, Purdue University Edition

Fri, 2015-08-21 10:13

By Michael Feldstein

I don’t know of any other way to put this. Purdue University is harming higher education by knowingly peddling questionable research for the purpose of institutional self-aggrandizement. Purdue leadership should issue a retraction and an apology.

We have covered Purdue’s Course Signals extensively here at e-Literate. It is a pioneering program, and evidence does suggest that it helps at-risk students pass courses. That said, Purdue came out with a later study that is suspect. The study in question claimed that students who used Course Signals in consecutive classes were more likely to see improved performance over time, even in courses that did not use the tool. Mike Caulfield looked at the results and had an intuition that they were actually caused by selection bias. Students who stuck around to take courses in consecutive semesters were more likely to…stick around and take more courses in consecutive semesters. So students who stuck around to take Course Signals courses in consecutive semesters would, like any persistent students, be more likely to stick around and take more courses. Al Essa ran a mathematical simulation that confirmed Mike’s intuition: Purdue’s results could be the product of selection bias. Mike wrote up a great explainer here on e-Literate that goes into all the details.

If there was indeed a mistake in the research, it was almost certainly an honest one. Nevertheless, there was an obligation on Purdue’s part to re-examine the research in light of the new critique. After all, the school was getting positive press from the research and had licensed the platform to SunGard (now Ellucian). Furthermore, as a pioneering and high-profile foray into learning analytics, Course Signals was getting a lot of attention and influencing future research and product development in the field. We needed a clearer answer regarding the validity of the findings.
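
Mike’s argument is easy to see in a toy simulation. The sketch below is my own illustration of the selection-bias mechanism, not Al Essa’s actual model: Course Signals is given zero effect, yet the students who end up taking two or more “Signals courses” still graduate at a much higher rate, simply because persistent students both take more courses and graduate more often.

```python
# Toy illustration of the selection-bias argument (my own sketch, not Al
# Essa's actual simulation). Course Signals has NO effect here, yet students
# who take 2+ "Signals courses" still show much higher graduation rates.
import random

random.seed(42)

def simulate(n_students=100_000, signals_share=0.5):
    grads = {"2+ Signals courses": [0, 0], "0-1 Signals courses": [0, 0]}
    for _ in range(n_students):
        persistence = random.random()             # latent trait in [0, 1]
        courses_taken = 1 + int(persistence * 6)  # persistent students take more courses
        signals_courses = sum(random.random() < signals_share
                              for _ in range(courses_taken))
        graduated = random.random() < persistence  # graduation depends only on persistence
        group = "2+ Signals courses" if signals_courses >= 2 else "0-1 Signals courses"
        grads[group][0] += graduated
        grads[group][1] += 1
    return {g: n_grad / n for g, (n_grad, n) in grads.items()}

for group, rate in simulate().items():
    print(f"{group}: {rate:.1%} graduated")
# The gap appears even though Signals did nothing, because persistence drives
# both how many courses a student takes and whether the student graduates.
```

Controlling for something like the number of courses attempted, or for a baseline measure of persistence, is exactly the kind of check that a raw comparison of graduation rates skips.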

Despite our calls here on the blog, our efforts to contact Purdue directly, and the attention the issue got in the academic press, Purdue chose to remain silent on the issue. Our sources informed us at the time that Purdue leadership was aware of the controversy surrounding the study and made a decision not to respond. Keep in mind that the research was conducted by Purdue staff rather than faculty. As a result, those researchers did not have the cover of academic freedom and were not free to address the study on their own without first getting a green light from their employer. To make matters more complicated, none of the researchers on that project still works at Purdue. So the onus was on the institution to respond. They chose not to do so.

That was bad enough. Today it became clear that Purdue is actively promoting that questionable research. In a piece published today in Education Dive, Purdue’s “senior communications and marketing specialist” Steve Tally said:

the initial five- and six-year raw data about the impact of Signals showed students who took at least two Signals-enabled courses had graduation rates that were 20% higher. Tally said the program is most effective in freshman and sophomore year classes.

“We’re changing students’ academic behaviors,” Tally said, “which is why the effect is so much stronger after two courses with Signals rather than one.” A second semester with Signals early on in students’ degree programs could set behaviors for the rest of their academic careers.

It’s hard to read this as anything other than a reference to the study that Mike and Al challenged. Furthermore, the comment about “raw data” suggests that Purdue has made no effort to control for the selection bias in question. Two years after the study was challenged, they have not responded, have not looked into it, and continue to use it to promote the image of the university.

This is unconscionable. If an academic scholar behaved that way, she would be ostracized in her field. And if a big vendor like Pearson or Blackboard behaved that way, it would be broadly vilified in the academic press and academic community. Purdue needs to come clean. They need to defend the basis on which they continue to make claims about their program the same way a scholar applying for tenure at their institution would be expected to be responsible for her claims. Purdue’s peer institutions likewise need to hold the school accountable and let them know that their reputation for integrity and credibility is at stake.

The post 68 Percent of Statistics Are Meaningless, Purdue University Edition appeared first on e-Literate.

Challenge Of Student Transition Between Active And Passive Learning Models

Thu, 2015-08-20 08:59

By Phil Hill

Last week the Hechinger Report profiled an innovative charter school in San Diego called High Tech High (insert surfer jokes here) that follows an active, project-based learning (PBL) model. The school doesn’t use textbooks and doesn’t base its curriculum on testing. The question the article asks is whether this approach prepares students for college.

As a result, for [former HTH student Grace] Shefcik, college – with its large classes and lecture-based materials – came as a bit of a shock at first. At the University of California, Santa Cruz, she is one of more than 15,000 undergraduates, and her assignments now usually consist of essays and exams. At High Tech High, Shefcik had just 127 students in her graduating class, allowing her to form close relationships with peers and teachers.

The premise of the article is that PBL prepares students for life but maybe not for college. Grace described the big difference between high school, with constant feedback and encouragement, and college, where you rarely get feedback. Other students describe their frustration in not knowing how to study for tests once they get to college.

After a recent screening of “Most Likely to Succeed” at the New Schools Summit in Burlingame, California, High Tech High CEO Larry Rosenstock told an audience, “We actually find that many of our students find themselves bored when they get to college.”

Teachers and administrators at High Tech High don’t tell many stories about their students reporting boredom, but they do hear about experiences like Shefcik’s. They say students find themselves overwhelmed by the different environment at college and have a difficult time making the transition to lecture-hall learning.

Students do tend to adjust, but this process can take longer than it does for traditionally-taught students.

But sometimes it takes High Tech High graduates a semester or a year at college or university before they feel like they’ve cracked the code.

“I had a harder time transitioning than other students,” said Mara Jacobs, a High Tech High graduate who just finished her second year at Cornell University in Ithaca, New York, and is the daughter of major donors Gary and Jerri-Ann Jacobs. “I couldn’t just do the work if I wasn’t bought into how I was being taught.”

My problem with the article is that it makes the assumption that all colleges outside of small private institutions base their entire curriculum on passive lectures and testing, not acknowledging many of the innovations and changes coming from these same colleges. We have profiled personalized learning approaches in our e-Literate TV series, including a PBL approach at Arizona State University for the Habitable Worlds course (see this episode for in-depth coverage).

Nevertheless, the general point remains that it is difficult for students to transition between active learning models and passive lecture and test models. The Hechinger Report calls out the example of K-12 students moving into college, but we talked to faculty and staff at UC Davis who saw the flip side of that coin – students used to passive learning at high school trying to adapt to an active learning science course in college.

Phil Hill: While the team at UC Davis is seeing some encouraging initial results from their course redesign, these changes are not easy. In our discussions, the faculty and staff provided insight into the primary barriers that they face when looking to build on their success and get other faculty members to redesign their courses.

Catherine Uvarov: Well, I have had some very interesting experiences with students. Last quarter, my class was mostly incoming freshman, and it’s like their very first quarter at UC Davis, so they have never taken a UC Davis class before. My class is pretty different from either classes they’ve taken in high school or other classes that they were still taking in their first quarter at Davis because these changes are not as widespread as they could be.

Some students push back at first, and they’re like, “Oh, my, gosh, I have to read the book. Oh, my, gosh, I have to open the textbook. Oh, my, gosh, I have to do homework every week. I have to do homework every day.” They kind of freaked out a little bit in the beginning, but as the quarter progressed, they realized that they are capable of doing this type of learning style.

There’s more info in both the Hechinger Report article and the ASU and UC Davis case studies, but taken together they point out the challenges students face when transitioning between pedagogical models. These transitions can occur between high school and college, but more often they occur from course to course. Active learning and PBL are not just minor changes away from lecture and test – they require a new mindset and set of habits from students.

The post Challenge Of Student Transition Between Active And Passive Learning Models appeared first on e-Literate.

Reuters: Instructure has filed for IPO later this year

Thu, 2015-08-13 10:29

By Phil Hill

Reuters is on a breaking news roll lately with ed tech. This time it is about Instructure filing for an initial public offering (IPO).

Instructure is planning an initial public offering later this year that could value the education software company at $500 million to $800 million, according to people familiar with the matter.

Instructure, based in Salt Lake City, has hired Morgan Stanley (MS.N) and Goldman Sachs (GS.N) to help prepare for the IPO, which has been filed confidentially, the people said. They requested anonymity because the news of the IPO was not public.

Under the Jumpstart Our Business Startups Act, new companies that generate less than $1 billion in revenue can file for IPOs with the U.S. Securities and Exchange Commission without immediately disclosing details publicly.

Instructure has long stated its plans to eventually IPO, so the main question has been one of timing. Now we know that it is late 2015 (assuming the Reuters story is correct, though they have been quite accurate with similar stories).

Michael and I have written recently about Instructure’s strong performance, including this note about expanding markets and their consistent growth in higher ed, K-12 and potentially corporate learning.

InstructureCon 2015 Growth Slide

Taken together, what we see is a company with a fairly straightforward strategy. Pick a market where the company can introduce a learning platform that is far simpler and more elegant than the status quo, then just deliver and go for happy customers. Don’t expand beyond your core competency, don’t add parallel product lines, don’t over-complicate the product, don’t rely on corporate M&A. Where you have problems, address the gap. Rinse. Repeat.

Instructure has now solidified their dominance in US higher ed (having the most new client wins), they have hit their stride with K-12, and they are just starting with corporate learning. What’s next? I would assume international education markets, where Instructure has already started to make inroads in the UK and a few other locations.

The other pattern we see is that the company focuses on the mainstream from a technology adoption perspective. That doesn’t mean that they don’t want to serve early adopters with Canvas or Bridge, but Instructure more than any other LMS company knows how to say ‘No’. They don’t add features or change designs unless the result will help the mainstream adoption – which is primarily instructors. Of course students care, but they don’t choose whether to use an LMS for their course – faculty and teachers do. For education markets, the ability to satisfy early adopters rests heavily on the Canvas LTI-enabled integrations and acceptance of external application usage; this is in contrast to primarily relying on having all the features in one system.

Combine this news with that of Blackboard being up for sale and changes in Moodle’s approach, and you have some big moves in the LMS market that should have long-term impacts on institutional decision-making. Watch this space for more coverage.

The post Reuters: Instructure has filed for IPO later this year appeared first on e-Literate.

Blackboard Acquires Large Latin American Moodle Provider

Thu, 2015-08-13 10:13

By Michael Feldstein

In my first post-BbWorld blog post, I noted that the international market is hugely important for Blackboard and Moodle is hugely important for their international strategy. Nearly a quarter of the company’s revenue and much of their growth comes from their international business, where they seem to be way ahead of their main North American competition in many countries. Learn has some really large contracts—whole country contracts, in some cases—but Moodle has a lot of smaller contracts. In some countries, you just can’t get traction in the LMS space unless you have a Moodle offering. In my post, I predicted that we would see continuing investments in Moodle, based on what we heard from senior management.

Today, Blackboard announced that they have acquired Nivel Siete, a Colombia-based Moodle hosting and services provider with over 200 customers in Latin America. This follows their acquisition of Remote-Learner UK, a company that serviced about 100 UK- and Ireland-based Moodle schools at the time of the acquisition, and their acquisition of the X-Ray learning analytics company that currently is focused on Moodle. These are all in the last year. And they are on top of the original acquisition of Moodlerooms and Netspot, two of the biggest Moodle providers around. There are some interesting—and complicated—long-term implications here for the governance and financial structure of the Moodle ecosystem that Phil and I will eventually write about, but for now it’s worth noting that Blackboard is making serious investments in Moodle and international growth.

The post Blackboard Acquires Large Latin American Moodle Provider appeared first on e-Literate.

ED and CBE: Example of higher ed “structural barrier to change” that is out of institutions’ control

Thu, 2015-08-13 09:55

By Phil Hill

There has been a great conversation going on in the comments to my recent post “Universities As Innovators That Have Difficulty Adopting Their Own Changes” on too many relevant issues to summarize (really, go read the ongoing comment thread). They mostly center on the institution and faculty reward system, yet those are not the only sources of structural barriers to change that lead institutions to this “difficulty adopting their own changes”. Increasingly there are outside forces that both encourage change and resist change, and it is important to recognize the impact of the entire higher education ecosystem.

Yesterday Amy Laitinen from New America wrote an excellent article titled “Whatever Happened to the Department’s Competency-Based Education Experiments?” highlighting just such an example.

About this time two years ago, President Obama went on his college affordability bus tour and unveiled his plan to take on the rising costs of higher education in front of thousands of students at SUNY Buffalo. Promoting innovation and competition was a key part of his plan and President Obama held competency-based education (CBE) up as one of the “innovative new ways to prepare our students for a 21st century economy and maintain a high level of quality without breaking the bank.” The President touted Southern New Hampshire University’s College for America CBE approach. The university “gives course credit based on how well students master the material, not just on how many hours they spend in the classroom,” he explained. “So the idea would be if you’re learning the material faster, you can finish faster, which means you pay less and you save money.” This earned applause from students in the audience as well as from CBE practitioners around the country. [snip]

The problem is that day was nearly two years ago and the CBE experimental sites are not yet off the ground. It’s not because institutions aren’t ready and willing. They are. But the Department of Education has been dragging its feet. It took the Department nearly a year after the President’s announcement to issue a notice describing what the experiments would look like. Perhaps this could have been done more quickly, but CBE is complicated and it’s understandable that the Department wanted to be thorough in its review of the relevant laws and regulations (they turned out much more forward-thinking than I would have imagined). But the notice did go out, schools did apply, and schools were accepted to participate. But the experiment hasn’t started, because schools haven’t received guidance on how to do their experiments.

Amy goes on to describe how schools are repeatedly asking for guidance and how foundations like Lumina and Gates are doing the same, yet the Education Department (ED) has not provided, or will not provide, such guidance.

Matt Reed, writing at Inside Higher Ed this morning, asks why the ED has not stepped up to move the program along, offers some possible answers, and solicits input:

  • They’re overwhelmed. They approved the concept of CBE without first thinking through all of the implications for other policies, and now they’re playing catchup. This strikes me as highly likely.
  • They’re focused more on “gainful employment,” for-profit providers, student loan issues, and, until recently, the effort to produce college ratings. With other things on fire, something like CBE could easily get overshadowed. I consider this possibility entirely compatible with the previous one.
  • They’re stuck in a contradiction. At the very same time that they’re trying to encourage experimentation with moving away from the credit hour in the context of CBE, they’re also clamping down on the credit hour in the context of online teaching. It’s possible to do either, but doing both at the same time requires a level of theoretical hair-splitting far beyond what they’re usually called upon to do. My guess is that an initial rush of enthusiasm quickly gave way to dispirited foot-dragging as they realized that the two emphases can’t coexist.
  • Their alien overlords in Area 51, in conjunction with the Illuminati and the Trilateral Commission… (You can fill in the rest. I’m not a fan of this one, but any explanation of federal government behavior on the Internet has to include at least one reference to it. Let’s check that box and move on.)

Rather than add my own commentary or conjecture on the subject, I would prefer to just highlight this situation and note how we need to look beyond just colleges and universities, and even faculty reward systems, to understand the structural barriers to change for higher education.

The post ED and CBE: Example of higher ed “structural barrier to change” that is out of institutions’ control appeared first on e-Literate.

Online Teaching Conference (#CCCOTC15) Keynote

Mon, 2015-08-10 18:04

By Phil Hill | More Posts (356)

Back in June, I had the pleasure of giving the keynote at the Online Teaching Conference (#CCCOTC15) in San Diego, put on by the California Community College system. There was quite a bit of valuable backchannel discussion as well as sharing of the slides. The theme of the talk was:

Emerging Trends in Online / Hybrid Education and Implications for Faculty

As online and hybrid education enter their third decade, there are significant efforts to move beyond the virtualization of the traditional face-to-face classroom and move more towards learner-centric approaches. This shift has the potential to change the discussion from whether online and hybrid approaches “can be as good as” traditional approaches to how online and hybrid approaches “can provide better learning opportunities”.

For those who would like to see the keynote, I am including the video and slides below. Pat James’ introduction starts at 05:50, and my keynote starts at 09:15.

And my apologies for fumbling on the slide / website / video switches. I was not prepared for the mandatory use of a PC instead of a Mac.

Some key sections that seemed to resonate during the talk:

Historical Context and Unbundling (~13:15)


Differentiated Instruction Including Laura Gibbs Example Course (~32:00)


LMS as Minivan of Education (~47:39)


Necessity to Support Both Sides of Chasm in Education (~51:45)


Full Slide Deck:

The post Online Teaching Conference (#CCCOTC15) Keynote appeared first on e-Literate.

Blackboard: Ask and Ye Shall Receive (Better Answers)

Sat, 2015-08-08 10:24

By Michael Feldstein | More Posts (1041)

About a week ago, I complained about Blackboard’s lack of clarity in messaging about their platform in general and the implications for managed hosting customers in particular. I wrote, in part,

What is “Premium SaaS”? Is it managed hosting? Is it private cloud? What does it mean for current managed hosting customers? What we have found is that there doesn’t seem to be complete shared understanding even among the Blackboard management team about what the answers to these questions are.

The problem with this oversight is deeper than just leaving managed hosting customers in the dark. Blackboard is asking customers (and prospects) to have patience as the company performs a major retooling on their platforms. In order to get that patience, they need for customers to understand (and believe) that this really is a major retooling, what is being retooled (at a high level), and what they will get that’s different from other platforms at the end of the process. This is a hard communication challenge, but it is also Blackboard’s live-or-die challenge. They really need to convince customers and prospects that the platform has a bright future, and to do that, they have to communicate nuances and technical issues that are not easy to communicate to executives. This is not something that can be fixed with a couple of DevCon sessions.

That’s why I was happy to see Blackboard respond this week with more clarity.

For starters, a couple of days after the post, I got a couple of Tweets from Vivek Ramgopal, Blackboard’s Director of Product Marketing:

@mfeldstein67 Hi Michael. Based on customer feedback, we changed the naming of the SaaS tiers after Dec. Here you go. pic.twitter.com/A4D3yu6stz

— Vivek Ramgopal (@TweetsByVivek) August 3, 2015

Here’s a bit more of the exchange:

@mfeldstein67 Working on a few things to help clarify – but you're right, actions speak louder than words.

— Vivek Ramgopal (@TweetsByVivek) August 3, 2015

True to his word, less than a week after my original post, Blackboard put up a post entitled “The Ultra experience for Blackboard Learn: What does it Mean for self-hosted and managed deployments?” Let’s start with the title. They are referring to the “Ultra experience,” which suggests that Ultra is about user experience. So there’s a bit of clarity right off the bat. Unfortunately, they  muddy it up a bit again pretty quickly when they write, “The Ultra experience, which is consistent across Learn, Collaborate, and our new Bb Student app, is a foundational element of the New Learning Experience we introduced at BbWorld.” What is the difference between Ultra and the New Learning Experience? Why do we need two different marketing terms? Nevertheless, two steps forward and one step back is net one step forward.

It gets better from there. Here’s the meat of it:

During BbWorld, many Learn customers asked us when the Ultra experience will be coming to self-hosted and Managed Hosting deployments. Blackboard is exploring the ability to bring Learn SaaS and thus the Ultra experience to Blackboard-managed data centers and perhaps even customers’ own data centers. However, this does not mean that the Ultra experience is coming to self-hosted or Managed Hosting implementations as you know them today.

Part of the challenge in communicating this has been that for most of Blackboard’s history, “deployed in a Blackboard-managed data center” has meant our Managed Hosting infrastructure and “deployed in a customer’s own data center” has meant the traditional enterprise Blackboard Learn technology stack. With the introduction of Learn SaaS and the Ultra experience, though, we are talking about a new architecture and a new infrastructure (even though it might be in the same physical location).

As noted above, the Ultra experience was built on – and thus must sit on top of – a cloud computing architecture. The Learn 9.1 Managed Hosting deployment option available today does not use that type of cloud architecture. The same is true of self-hosted Learn 9.1 implementations. Therefore, it is not possible to bring the Ultra experience to self-hosted or Managed Hosting implementations as you know them today.

And here’s the summary at the bottom of the article:

  • The Ultra experience sits on top of a cloud architecture that is different from what self-hosted and Managed Hosting customers currently use today.
  • Thus, the Ultra experience will not be coming to Managed Hosting or self-hosted implementations as you know them today.
  • A cloud architecture that can support Learn SaaS and the Ultra experience will be coming to Blackboard-managed data centers in some regions and potentially even to customers’ own data centers in the future.
  • However, this will be a different infrastructure than today’s Managed Hosting and self-hosting options.

It is also important to note that we are fully committed to Learn 9.1 as well as self-hosted and Managed Hosting deployments.  We have quality improvements, workflow enhancements, and entirely new features (like improved grade exchange and competency-based education (CBE) tools) on the Learn 9.1 roadmap.

This is much, much better and forms the foundation of the communication they need to be having with customers and prospects. In addition to this, they need to continue to demonstrate what “Ultra” means in practical terms for teachers and students and talk about how the new capabilities are intimately connected to the new architecture. They need to play variations of this theme over and over and over again. And then they also need to have a constant drumbeat of both Ultra and non-Ultra updates, announced very clearly and loudly but without fanfare or hype. In other words, they need to make sure customers see that (a) they are making steady progress on both the present and the future for their customers, and (b) they understand this progress is the proof they need to demonstrate in order to win and hold customer trust rather than some super-fantabulous revolution. That’s pretty much the whole enchilada for them for at least the next 12 months.

A while back, Phil wrote about Instructure,

Despite Canvas LMS winning far more new higher ed and K-12 customers than any other vendor, I still hear competitors claim that schools select Canvas due to rigged RFPs or being the shiny new tool despite having no depth or substance. When listening to the market, however, (institutions – including faculty, students, IT staff, academic technology staff, and admin), I hear the opposite. Canvas is winning LMS selections despite, not because of, RFP processes, and there are material and substantive reasons for this success.

I think one of the reasons that competitors are sometimes perplexed by Instructure’s gains is that they make the mistake of believing that they are primarily competing on the strength of the product. Canvas is good, but it’s not so much better that it justifies the tectonic shift in market share that they are getting, and their competitors know it. What their competitors (and fans of other platforms) don’t seem to have noticed is that Instructure’s communications with customers have been freaky good from the very beginning. The detractors got misled by the early flash and the cheek—the flame thrower videos, Josh, the party thrown right across the hall from their competitor’s conference, Josh, the snarky T-shirts, Josh, and so on. What they missed in all of that was the consistent clarity of messaging. Instructure’s SVP of Marketing Misty Frost is just ridiculously good at finding the one narrative thread that customers need to hear and making sure that they hear that, loud and clear, with no clutter. (I mean, c’mon. “Misty Frost?” If you were going to invent a comic book superhero head of marketing—or, for that matter, a supervillain head of marketing—you would name her Misty Frost.)

I would go one step further. This isn’t “just” about making sure customers understand the most important things about your product and your company (as if that weren’t vital enough). The number one way that I judge whether a company is likely to make a good product in the future is to look for signs that they are listening to their customers. Where does that show up first? In their communications. It is possible to be a good product company with lousy messaging, but it is impossible to be a great product company with lousy messaging, because great product companies weave what they are learning about their customers into the fabric of the culture. If Marketing can’t tell Product Management’s story, that means there is weak internal alignment.

That’s one reason why I take messaging so seriously and also one reason why I’m really pleased to see them respond to criticism of their communications so quickly and clearly.

The post Blackboard: Ask and Ye Shall Receive (Better Answers) appeared first on e-Literate.

Universities As Innovators That Have Difficulty Adopting Their Own Changes

Thu, 2015-08-06 19:16

By Phil Hill | More Posts (355)

George Siemens made an excellent point in his recent blog post after his White House meeting.

I’m getting exceptionally irritated with the narrative of higher education is broken and universities haven’t changed. This is one of the most inaccurate pieces of @#%$ floating around in the “disrupt and transform” learning crowd. Universities are exceptional at innovating and changing.

While I agree with his primary point about false narratives with simplistic no-change assumptions, I think there is a risk of going too far in the other direction. Universities have certainly changed, and there are many innovations within universities, but universities are not very good at diffusing the innovations that they do make. I made this same argument here and here.[1] Campus changes are apparent, but too often I see innovative course designs showing real results while courses in the same department remain unchanged.

In my opinion, universities are exceptional at innovating, but they are not exceptional at changing.

In our e-Literate TV series on personalized learning, every case we reviewed was university-driven, not vendor- or foundation-driven. The universities drove the changes, and much of what we saw was very encouraging. But that does not mean that universities don’t face barriers in getting more faculty and course offerings to adopt changes that work. Take the University of California at Davis, where they are transforming large lecture introductory STEM courses into active learning laboratories that get students to really learn concepts and not just memorize facts. I’ve highlighted what they’re doing and how they’re doing it, but episode 3 of the case study also highlights the key barriers they face in adopting their own changes. I do not think UC Davis is unique here, just very open about it. The following is an interview with the iAMSTEM group that is supporting faculty and teaching assistants with the changes.

Phil Hill: But the biggest barrier might be with faculty members. Too often, the discussion is about resistance to new ideas without addressing the existing structural barriers.

It sounds like there are some very exciting changes—boisterous students, people wanting to learn—is some of what I’m hearing. What’s the biggest barrier that you guys face in terms of getting more of this cultural change to go throughout UC Davis? What do you see as the biggest barrier moving forward?

Erin Becker: Can I take this one?

Chris Pagliarulo: I think we all have some in mind.

Phil Hill: I’ll ask each one of you, so Erin?

Erin Becker: Incentivizing good teaching at the university—as it currently stands, most incentives that are built into the tenure package are based on research quality not on teaching quality.

So, asking instructors to put a lot of time and effort and energy into making these big instructional changes—it’s hard to incentivize that. If they’re going up for tenure, they want to spend more time in the lab.

Chris Pagliarulo: It’s risky.

Phil Hill: So, it’s the faculty compensation or reward system is not in alignment with spending time on improving teaching. Is that an accurate statement?

Chris Pagliarulo: Yep, that’s a key structural barrier.

Phil Hill: So, Chris, what would you say? Even if it’s the same thing, what do you see as the biggest barrier to this cultural shift?

Chris Pagliarulo: The next step would be, let’s imagine it was incentivized. It takes a lot of work to transform your instruction, and it’s also a bit of an emotional rollercoaster. When you change out of a habitual behavior, they call it the “J curve”. Immediately, your performance goes down, your attitude and affect goes down, and it takes somebody there to help you through both that process—and we need expertise, so there’s a major resource deficit that we have now.

If everyone was intellectually and emotionally ready to transform their instruction, it’s going to take a lot of work and a lot of resources to get there. So, that’s another thing that we would need to ramp up.

In other parts of the same episode, the UC Davis team talks about student expectations (active learning is hard and requires accountability for students, which is not easy at first) and student course evaluations (designed more for ‘do you like the teacher and their style’ than ‘is this an effective course’). In separate interviews with two faculty members (Marc Facciotti and Michelle Igo) who not only are teaching the redesigned courses but were key parts of the design process (you know, innovating), they both talked about how much time this takes. They have to get up to speed on pedagogical design, teach the course, sit in their peers’ courses to watch and learn, adjust their own courses, and improve each semester. They described not only the time commitments but also the risk to their own careers by spending this much time on course redesign.[2]

There is nothing new here, just the opportunity to hear it from first-hand participants.

The point is, universities are not exceptional at adopting their own changes, as there are structural barriers such as faculty reward systems, student expectations, and student course evaluations. Change happens, but it is difficult and slow. The faculty who lead change often do so at their own risk and in spite of, not in support of, their career needs. None of this obviates George’s frustration at the no-change, “disrupt and transform” learning crowd (and I agree that is a big problem). But let’s not adopt the opposite viewpoint that all is well with the change process.

Note that I do not think that George is actually arguing for the all-is-well point, as evidenced in the Chronicle article about his blog post.

“Admittedly colleges have been slower to respond than corporations have” to changes in technology, Mr. Siemens added. But that’s how it should be, he argued. “When a university takes a big pedagogical risk and fails, that’s impacting someone’s life.” He admitted that colleges could be moving faster, but he felt that it is disingenuous to ignore the changes that are happening.

  1. The first article had more of a technology focus, but the same applies to the pedagogical side of change.
  2. Unfortunately these parts of the interviews ended up on the cutting room floor and are not in the videos.

The post Universities As Innovators That Have Difficulty Adopting Their Own Changes appeared first on e-Literate.

IBM’s Misleading or Just Incorrect National Ad on Student Retention

Tue, 2015-08-04 19:29

By Phil Hill | More Posts (354)

Thanks to Chris Edwards for alerting me to this one. In a nutshell, IBM launched a national ad campaign last month that included commercials during Wimbledon on the weekend. They’re spending big money on this campaign about big data, learning analytics, and reducing “dropout rates” [emphasis added below].

In the US, 3 in 10 college students drop out, but how can you spot who is at risk? The one who lives far from campus? The one who works the night shift? The one with new responsibilities? One thing can’t tell you, but the right combination can. Universities are using IBM Analytics to understand pressures in and out of the classroom. Some expect to cut dropout rates by 25%. IBM Analytics is working to make education smarter every day.

If you follow the URL listed at the end of the YouTube video and select ‘Education’, you get to the main page that highlights the same video as above. For further explanation:

Using the IBM Exceptional Student Experience, universities can deploy predictive analytics running on the SoftLayer® cloud. They analyze more deeply what they always have—things like poor attendance and failing grades. But they are also beginning to factor in new kinds of data: how far students live from campus, how many hours a week they work, even how often they log on to the university Wi-Fi network. Universities can then customize a course of action for each student. Even something as simple as providing transportation vouchers can be beneficial. The platform gives professors, advisors and students themselves a view of personalized performance, so students can get help—before it’s too late.
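To make that description concrete, here is a minimal sketch of the kind of “at-risk” scoring the copy describes: combining signals such as distance from campus, work hours, attendance, and Wi-Fi logins into a single dropout-risk estimate. To be clear, this is my own generic illustration on synthetic data, not IBM’s product or methodology, and every feature name in it is hypothetical.

```python
# A generic sketch of a dropout-risk model -- NOT IBM's actual system.
# All feature names and data are synthetic and for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-student signals of the kind the marketing copy mentions
X = np.column_stack([
    rng.uniform(0, 50, n),     # miles from campus
    rng.uniform(0, 40, n),     # work hours per week
    rng.uniform(0.5, 1.0, n),  # attendance rate
    rng.poisson(10, n),        # Wi-Fi logins per week
])
y = (rng.random(n) < 0.3).astype(int)  # synthetic outcome: 1 = dropped out

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a hypothetical student: far from campus, working long hours, low attendance
student = np.array([[35.0, 30.0, 0.6, 2.0]])
print(f"Estimated dropout risk: {model.predict_proba(student)[0, 1]:.0%}")
```

Building a score like this is the easy part; the “25%” question is whether anyone measured retention before and after acting on such scores and documented the results.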

With a little searching, you can find a series of education case studies which are quite well documented. But searching through them reveals no data to back up the claim made above. The closest one is Hamilton County, TN.

Through this innovation [predictive analytics], Hamilton County Schools achieved the best ‘No Child Left Behind’ results in its history, reducing its annual dropout rate by 25 percent and increasing its success rates by eight percent.

But that is a K-12 example. On the higher ed side, there is one case study for Ithaca College about increasing retention rates by 4% and another for Brockenhurst College about their expectation “to improve student recruitment and retention by 15 percent over a five-year period”. The University of Western Sydney is:

running a pilot program with the School of Business, where they have already designed specific retention strategies for each category. We’re really excited to see the impact this has on their attrition rates over the next year.

While IBM’s use of documented case studies is commendable, none of these back up the claims in the national ad. I contacted IBM to get more information, and they agreed to look into the matter. A representative for IBM’s Global Industry Marketing for Government and Education shared the same case studies on retention that I had already found. We had a phone call where I described the problem that nothing backed up the claim, and he said he would find out what the basis was for the commercial.

Eighteen days later, there has been no further response, despite several email reminders.

Let’s look at the claim in more detail.

“Some”

Some universities, some people, some marketing folks? Given the context of the ad, which is all about higher ed, I would assume that this means “some colleges and universities”, but we don’t know.

“Expect”

Does this mean a group of universities has piloted the use of IBM Analytics, and the early results predict that campus-wide deployment might reduce dropouts by 25% (best case)? Or does this mean some guy they met at Starbucks expects some big results after reading IBM’s marketing? Or is someone expecting that Hamilton County K-12 results will do the same for higher ed? We don’t know.

“To Cut Dropout Rates”

The phrase “dropout rates” is used more commonly in K-12 than in higher ed. In higher ed, the issue is captured more by retention and persistence. Does IBM mean that specific course drops will decrease by 25%? Given the intro about 3 in 10 college students dropping out, I assume this means 25% fewer students will drop out of college – stop attending. In the Hamilton County K-12 example, they mean students dropping out of high school altogether. But we don’t know.

“By 25%”

This is the most troubling point. IBM uses a specific, performance-sounding number that leads the viewer to believe there are real results backing up the claim. But when pressed, even IBM executives cannot justify that number.

Two Conclusions, Neither Good

@PhilOnEdTech I mean, if a national ad budget is fueling unrealistic expectations about how to help solve a pressing #EdTech problem…

— Chris Edwards (@chris3edwards) July 11, 2015


I appreciate IBM’s initial willingness to talk to me, but Chris is right about fueling unrealistic expectations on a serious education problem. Given the lack of explanation, there are only two conclusions that I can draw from this experience. One is that IBM has some data buried somewhere to back up the claim, but no one can find it and therefore it is not documented. That is the best case scenario. The other, more problematic, conclusion is that IBM put out an ad without any data to back it up. Either way, the ad on student retention cannot be backed up by any documented case study. That is wrong.

My offer to IBM to get their explanation in public still stands. If they find some data to back up the ad, I will post the information here on e-Literate. If not, IBM should take down the ad, revise their web page, and publish an explanation of the changes.

The post IBM’s Misleading or Just Incorrect National Ad on Student Retention appeared first on e-Literate.

Blackboard Potential Sale: Market timing, financials, and some thoughts on potential buyers

Tue, 2015-08-04 06:45

By Phil Hill | More Posts (353)

With Reuters’ story last week that Blackboard is putting itself up for sale through an auction, one question to ask is ‘why now?’. As Michael has pointed out, Blackboard is in the midst of a significant, but incomplete and late, re-architecture of its product line.

Bottom line: If you think that Ultra is all about playing catch-up with Instructure on usability, then the company’s late delivery, functionality gaps, and weird restrictions on where the product can and cannot be run look pretty terrible. But that’s probably not the right way to think about Ultra. The best analogy I can come up with is Apple’s Mac OS X. In both cases, we have a company that is trying to bring a large installed base of customers onto a substantially new architecture and new user experience without sending them running for the hills (or the competitors). This is a really hard challenge.

Market Timing

On the surface, it seems to be a high-risk move to try to sell a company before the changes are solidly in place and customers have demonstrated that they will move to the new architecture rather than “running for the hills”.

Assuming that the Reuters story is accurate, I believe the answer to the question of ‘why now’ is that this move is about market timing – Blackboard wants to ride the current ed tech investment wave, and Providence Equity Partners (their owners) believe they can get maximum value now. This consideration trumps the otherwise logical strategy of waiting until more of the risk from the new user experience and cloud platform roll-out is removed by getting real products into a significant number of customers’ hands. VC investment and M&A activity are at high and potentially unsustainable levels. 2U has shown that ed tech companies can go public and be a success. Lynda.com has shown that relatively mature companies can be acquired for very high valuations. Instructure is likely to go public in early 2016. If you want to get a high price, sometimes it’s worth going to market while it’s hot, before addressing most of the re-architecture risk.

Key Financials

There was one piece of information from the Reuters story that has gotten very little attention in follow-up reporting and analysis.

Blackboard has annual earnings before interest, tax, depreciation and amortization of around $200 million, some of the people added.

Two of the people said that Blackboard could fetch a valuation between 14 times to 17 times EBITDA, up to $3.4 billion, based on current multiples of subscription-based software companies.

Moody’s placed Blackboard’s EBITDA at roughly $180 million based on $1.4 billion of total rated debt and a 7.8 debt-to-EBITDA ratio. I suspect that the $200 million figure is based on forward-looking estimates while the $180 million is actuals. This Moody’s rating does give rough confirmation of the Reuters numbers.

For non-startups, one of the most common valuation metrics is Enterprise Value / EBITDA.[1] Based on Blackboard’s earnings reports when it was publicly traded as well as financial analyst estimates, the company’s EBITDA was $96 million in 2010, an estimated $120 million in 2011, and was expected to rise to $154 million in 2012. Current EBITDA of $200 million would indicate a roughly 30% increase in profitability in the past three years. That increase is far less than the 60% EBITDA growth in the previous two years, but it is healthy.

Blackboard’s revenue in 2010 was $447 million, and the current EBITDA estimates would indicate current revenues between $600 million and $700 million.
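As a sanity check, here is the back-of-envelope arithmetic behind those figures. This is my own sketch: the inputs are the numbers cited above, and the EBITDA margin range is an assumption implied by the $600 – $700 million revenue estimate rather than a reported figure.

```python
# Back-of-envelope checks on the figures cited above (all amounts in $ millions).
rated_debt = 1_400    # Moody's total rated debt
debt_to_ebitda = 7.8  # Moody's leverage ratio
print(f"Implied EBITDA from Moody's figures: ~${rated_debt / debt_to_ebitda:.0f}M")  # ~$179M

ebitda_2012 = 154     # analyst estimate from when Blackboard was public
ebitda_now = 200      # per Reuters' sources
print(f"EBITDA growth since 2012: ~{ebitda_now / ebitda_2012 - 1:.0%}")  # ~30%

# Implied current revenue, assuming an EBITDA margin around 30% (an assumption)
for margin in (0.29, 0.33):
    print(f"Revenue at a {margin:.0%} margin: ~${ebitda_now / margin:.0f}M")
```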

Financial or Strategic Buyers

Industry medians for financial buyers (e.g. another private equity firm buying Blackboard from Providence) and subscription-based software companies are over 10x EBITDA (let’s say 10x – 13x, but likely not the 14x – 17x mentioned in the article), even in the ed tech market (the company got just over 12x in 2011). This would lead to an expected valuation of roughly $2 – $2.5 billion, far less than the desired $3 billion but more than the $1.64 billion price paid in 2011. If Blackboard really thinks they can get up to $3 billion based on the Reuters quote above, then I would assume that they are thinking more of a strategic buyer (e.g. another, larger company buying Blackboard for some strategic asset such as an installed customer base or rapidly growing international presence). The challenge is that $3 billion is a big price tag, and there are not that many education-related companies that could afford this purchase. Pearson, Google, Microsoft, and LinkedIn are example companies with some foothold in education and the financial ability to make a purchase (speculation alert, as I am not saying I think any of those companies will make a bid).

To get a sense of the bet that Blackboard might be making, consider the case of Lynda.com being purchased by LinkedIn for $1.5 billion. Founded in 1995, Lynda.com’s 2014 revenues were only $150 million with EBITDA of $7.5 – $15 million. That’s an EBITDA multiple of at least 100x. Only a strategic buyer could justify those numbers.
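For illustration, here is what those multiples imply in dollar terms. The arithmetic is mine; the multiples and EBITDA figures are the ones discussed above, and the scenario labels are my own shorthand.

```python
# Rough EV/EBITDA valuation scenarios (dollar amounts in millions).
ebitda = 200  # Blackboard EBITDA per Reuters' sources

scenarios = {
    "Financial buyer (10x - 13x)": (10, 13),   # the post rounds this to roughly $2 - $2.5 billion
    "Reuters' sources (14x - 17x)": (14, 17),  # "up to $3.4 billion"
}
for label, (low, high) in scenarios.items():
    print(f"{label}: ${ebitda * low / 1000:.1f}B - ${ebitda * high / 1000:.1f}B")

# For comparison, the multiple LinkedIn paid for Lynda.com
lynda_price = 1_500         # purchase price
lynda_ebitda = (7.5, 15.0)  # EBITDA range cited above
print(f"Lynda.com multiple: {lynda_price / max(lynda_ebitda):.0f}x - {lynda_price / min(lynda_ebitda):.0f}x")
```

Even at the top of the Reuters range, Blackboard’s implied multiple is nowhere near what a strategic buyer paid for Lynda.com, which is the point of the comparison.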

My suspicion is that Blackboard’s investment bankers have already done their research and have a rough idea of who has enough money and who might make a bid. This also leads to a question of timing.

How Would This Affect Customers?

Unlike 2011, we are not seeing the potential of a publicly traded company going private, which was a far greater risk for customers worried about change. This time, the relative risk depends on who, if anyone, acquires Blackboard. If Blackboard settles for a far lower price in the auction and goes with a financial buyer, I would assume that the new owners would be betting on the same strategy that the current ownership and management team has in place (remove silos, streamline operations, grow internationally, re-architect the product line), but they would plan to subsequently take Blackboard public in a few years.[2]

If, however, Blackboard is able to find a strategic buyer willing to pay the higher price, then the effect on current and prospective customers would largely depend on why that company is making the purchase. This scenario would lead to a greater chance of significant change for customers, whether for good or bad (or both).

It will be interesting to find out if the Reuters story is indeed accurate and if Blackboard does get acquired. We’ll keep watching.

Update 8/4: Clarified language on EBITDA multiples for software industry.

  1. From Wikinvest: EBITDA, which stands for “Earnings Before Interest, Taxes, Depreciation, and Amortization”, is a way of evaluating a company’s profitability which excludes items that make comparisons across companies difficult and which are viewed as not central to the company’s core operations.
  2. Credit to Michael Feldstein for the thought process in this paragraph.

The post Blackboard Potential Sale: Market timing, financials, and some thoughts on potential buyers appeared first on e-Literate.

Using TAs As Key Component Of Active Learning Transformation at UC Davis

Mon, 2015-08-03 17:07

By Phil Hill | More Posts (353)

Last week I described how UC Davis is making efforts to personalize one of the most impersonal of learning experiences – large lecture introductory science courses. It is telling that the first changes that they made were not to the lecture itself but to the associated discussion sections led by teaching assistants (TAs). It is well known that much of the instruction in lower division classes at large universities is led not by faculty but by TAs. This situation is often seen as a weakness of the business model of a research university, but it can also be leveraged as an opportunity to lead educational change. Consider this interview with staff from the iAMSTEM group at UC Davis from our e-Literate TV series on personalized learning:


Phil Hill: In biology, the UC Davis team applied the same learning principles but also included personalized learning software in the mix. The Open Learning Initiative, or OLI, is a grant-funded group at Carnegie Mellon University that started in 2001. UC Davis is using the OLI platform within the discussion sections of the redesigned biology course.

Erin Becker: In biology, we started at the level of the teaching assistants who are the ones that the students spend the majority of their face time with. They’re also the ones that we had the most control over because we had kind of more say over what the teaching assistants do than what the instructors do.

We went into the introductory biology course here and did a rigorous practice-based training program where we trained the TAs on various techniques that have evidence of effectiveness in increasing student learning. From that, we then have expanded into working with the instructors in that same biology course.

Phil Hill: If you had to summarize the learning-based changes that have been made, how would you describe them?

Erin Becker: I guess if I had to sum it up, I’d say keeping the students accountable in class. We trained the TAs to (it sounds very simple) call on the students; make the students know that they are responsible for having the knowledge in class time in real time.

Chris Pagliarulo: Come prepared.

The interview then went on to discuss the role of the OLI software in this teaching transformation for TAs. I found it interesting that the TA (Amanda) positioned the software as a feedback mechanism – automatic feedback freeing up TA grading time and immediate feedback for learning.

Amanda Fox: Where now, we have Online Learning Initiative (OLI) as a pre-lab, and this is a set of somewhere between 15 to maybe 25 questions that they do online on their own. It doesn’t take the time from the TAs to grade that, so that there’s more questions that they can be asked, and be doing on their own time.

Phil Hill: Do you also get feedback from the system?

Amanda Fox: The OLI System?

Phil Hill: Yeah, from OLI?

Amanda Fox: We have a head TA who’s in charge of going over OLI, but I myself don’t do the grading for it, but I do get a readout of what questions they’re having the most difficult time with and what questions they felt comfortable with. Then I look at that. From each week we have a set of pre-discussions due, and I look at the results from that, and then I see the questions that most of my class had a difficulty with, and then I cover that in the next discussion.

I address maybe the top three questions out of fifteen or so that they had problems with, and I go over, “Do you understand why this is the correct answer, and why what you said is not correct?” One thing that’s awesome is that they get immediate response to whether I push a button, and it tells me right there: Did I push the correct button for the correct answer or not?

That immediate feedback, I think, is very helpful because otherwise these students were answering these questions maybe a week before they would turn it in to me. I would go over the answers in discussion, but that lapse in time between when they first thought about the question, and when they get the answer to it—I think it’s really good to have that immediate response between the two.

Software as feedback is only one component of the redesign, however, as a different TA (Guy) described the iAMSTEM training in effective teaching styles.

Guy Shani: One of the biggest concepts that have been emphasized towards us—we have TA meetings for two hours every week before our discussions. And part of that is to go over the material for the week and especially with some of the TAs that are not in the exact field that that week’s discussion is on—we need a little bit of review on that.

But the other part is sort of understanding how we should be approaching teaching because a lot of the time (a lot of discussions), especially early on in lower-division classes, the approach has frequently been just lecture. Basically, having the students standing up with our backs to the class writing up on the board, and that’s not the most effective way to communicate an idea to the students.

So, we go over techniques like cold calling where we actively involve the students. I will say most of a sentence and call on a student at random to complete it, or I will ask a question and involve the students. This gives me both a good way to gauge whether they understand, and it keeps them all on their toes.

This theme of leveraging TAs for change by combining pedagogical training with focused software usage for immediate feedback to students is not unique to UC Davis. We saw a similar approach at Arizona State University.

The post Using TAs As Key Component Of Active Learning Transformation at UC Davis appeared first on e-Literate.

Blackboard’s Messaging Problems

Fri, 2015-07-31 15:07

By Michael Feldstein | More Posts (1040)

There are a lot of things that are hard to evaluate from the outside when gauging how a company is doing under new management in the midst of a turnaround with big new products coming out. For example, how good is Ultra, Blackboard’s new user experience? (At least, I think the user experience is what they mean by “Ultra.” Most of the time.) We can look at it from the outside and play around with it for a bit, but the best way to judge it is to talk to a lot of folks who have spent time living with it and delivering courses in it. There aren’t that many of those at the moment. Blackboard has offered to put us in touch with some of them, and we will let you know what we learn from them after we talk to them. How likely is Blackboard to deliver the promised functionality on their Ultra to-do list to other customers on schedule (or at all)? Since this is a big initiative and the company doesn’t have much of a track record, it’s hard to tell in advance of them actually releasing software. We’ll watch and report on it as it comes out. How committed is Blackboard to self-hosted customers on the current platform? We have their word, and logical reasons why we believe they mean it when they say they want to support those customers, but we have to talk to a bunch of customers to find out what they think of the support that they are getting, and even then, we only know about Blackboard’s current execution, which is not the same as their future commitment. So there are a lot of critical aspects about the company that are just hard and time-consuming to evaluate and will have to wait on more data.

But not everything is hard to evaluate. Communication, for example, is pretty easy to judge. Last year I mocked Jay Bhatt pretty soundly for his keynote. (Of course, we have hit D2L a lot harder for their communication issues because theirs have been a lot worse.) In some ways, it is so easy to critique communication that we have to be careful not to just take cheap shots. Everybody loves to mock vendors in general and LMS vendors in particular. We’re mainly interested in communications problems that genuinely threaten to hurt their relationship with their customers. Blackboard does have serious customer communication problems at the moment, and they do matter. I’m going to hit on a few of them.

Keynote Hits Sour Notes

Since I critiqued last year’s keynote, an update in that department is as good a place to start as any. It’s sort of emblematic of the problem. This year’s keynote was better than last year’s, but that doesn’t mean it was good. Of the two-hour presentation, only the last twenty minutes or so directly addressed the software. The rest was about values and process.

I get why the company is doing this. As I said in last year’s review, they are nothing if not earnest. So, for example, when Jay Bhatt says that we need to start a “revolution” in education and that Blackboard is inviting “you”—presumably the educators in the room—to join them, it doesn’t carry the sinister tone of the slick Sillycon Valley startup CEO talking about “disrupting” education (by which they generally mean replacing dumb, mean, unionized bad people teachers with slick, nice, happy-faced software). Jay comes across as a dad and a former teacher who honestly cares about education and wants very much to do his part to improve it. But his pitch is tone deaf. No matter how earnest you are, you can’t take center stage as the CEO of a software company that has a long and infamous reputation for disregarding customers and making education worse rather than better and then, with your giant face projected on the jumbotron and simulcast on the web, convince people that you are just a dad who wants to make education better. It doesn’t work. It’s not going to win over skeptical customers, never mind skeptical prospective customers. No matter how much you sincerely mean it. No matter how much it is said with the best of intentions.

You also can’t spend the first 90+ minutes of the keynote talking about process and then get around to admitting that your revolutionary software is a year late. Phil and I both give Jay and Blackboard tons of credit for being forthright about the delay in the keynote, and for generally showing a kind of honesty and openness that we don’t see very often from big ed tech vendors. Really, it’s rare, it’s important, and it deserves more credit than it will probably be given by a lot of people. But owning up to your delivery problems in the last 10 minutes of a two-hour keynote, most of which was not spent talking about the stuff that customers most immediately care about, will not have the intended effect on the audience. The reason Blackboard went through that first 90 minutes is that they really, really want to tell you, with all their hearts, that “Gee whiz, gang, we really do care and we really are trying super-hard to create something that will make students’ lives better.” But if the punchline, after 90+ minutes, is “…and…uh…we know we told you we’d have it done a year ago, but honestly, we mean it, we’re still working on it,” you will not win converts.

The one thing I did like very much, besides the honesty about missing their delivery dates, was the day-in-the-life walk-throughs of the software. They very compactly and effectively conveyed the quality of thought and concern for the student that the first 90 minutes of process talk did not. If you want to convince me that you really care about students, then don’t talk to me about how much you really care about the students. Show me what you have learned from them. Because talk is cheap. I won’t believe that you really care about students in a way that affects what you do in your business until you show me that you have developed a clear and actionable understanding of what students need and want and care about. That is what the walk-throughs accomplished (although they would have been even more effective with just a smidge less “golly gee” enthusiasm).

There’s one simple thing Blackboard could do that would vastly improve their keynotes and make a host of rhetorical sins more forgivable. They could bring back Ray Henderson’s annual report card. Every year, Ray would start the keynote by laying out last year’s publicly declared goals, providing evidence of progress (or not) toward those goals—quantitative evidence, whenever possible—and setting the goals for the new year. This set the right tone for the whole conference. “I made you some promises. Here’s how I did on those promises. Here’s what I’m going to do better this year. And here are some new promises.” As a customer, I will hear whatever else you have to say to me next much more charitably if you do that first.

For example, Phil and I have heard a number of customers express dissatisfaction with the length of time it takes to fix bugs. At a time when Blackboard is trying to convince self-hosted customers that they will not be abandoned, it is particularly important not to let this get out of hand, because every customer who has an unaddressed bug will be tempted to read it as evidence that the company is secretly abandoning 9.1 and just lying about it. But if Blackboard leadership got up on stage—as they used to—and said, “Here’s the number of new bugs we had in the past year, here’s the average length of time that P1s go unaddressed, here’s the trend line on it, here’s our explanation of why that trend line is what it is, and here’s our promise that we will give you an update on this next year, even if it looks bad for us,” then customers are going to be much more likely to give the company the benefit of the doubt. If you’ve addressed my concerns as a customer and said your “mea culpas” first, then I’m going to be more inclined to believe that anything else you want to tell me is truthful and meant for my benefit.

What Is Ultra and What Does It Mean For Me?

[Image: Ultra Man]

Another problem Blackboard has is that it is very hard to understand what they mean by “Ultra.” Sometimes they mean an architecture that enables a new user experience. Sometimes they mean the new user experience that may or may not require the architecture. And at no time do they fully clarify what it means for hosting.

Here’s a webinar from last December that provides a pretty representative picture of what Blackboard’s Ultra talk is like:

Most of the Ultra talk is about the user experience. So it makes sense to infer that Ultra is a new user experience which, for those with any significant experience with Blackboard or many of the other LMS providers, would suggest a new skin (or “lipstick on a pig,” as Phil recently put it). And yet, Ultra doesn’t run on the self-hosted version of Blackboard. Why is that? A cynical person would say (and cynical customers have said) that Blackboard is just trying to push people off of self-hosting. No, says Blackboard, not at all. Actually, the reason we can’t do self-hosted Ultra is because Ultra requires the new cloud architecture, which you can’t self-host.

Except for Ultra on mobile. You can experience Ultra on mobile today, even if you are running self-hosted 9.1.

Huh?

OK, so if I want to run Ultra, I can’t run it self-hosted (except for mobile, which is fine). What if I’m managed hosted? Here’s the slide from that webinar:

[Slide: hosting options, from the webinar]

There you go. Clear as mud. What is “Premium SaaS”? Is it managed hosting? Is it private cloud? What does it mean for current managed hosting customers? What we have found is that there doesn’t seem to be complete shared understanding even among the Blackboard management team about what the answers to these questions are.

Update: I received a tweet from a Blackboard marketing person with an updated SaaS slide:

@mfeldstein67 Hi Michael. Based on customer feedback, we changed the naming of the SaaS tiers after Dec. Here you go. pic.twitter.com/A4D3yu6stz

— Vivek Ramgopal (@TweetsByVivek) August 3, 2015

While this still doesn’t address the managed hosting customers’ questions, it does provide some good additional information.

Based on what Phil and I have been able to glean about the true state of affairs, here’s how I would explain the situation if I were a Blackboard executive:

  • Ultra is Blackboard’s new product philosophy and user interface. Rather than just sticking in a new tab or drop-down menu and a new bill from a new sales team every time we add new capabilities, we’re trying to design these capabilities into the core product experience in ways that fit with how customers would naturally use them. So rather than thinking about separate products living in separate places—like Collaborate, Community, Analytics, and Content, for example—you can think about synchronous collaboration, non-course groups, student progress tracking, and content sharing naturally when and where you need those capabilities in your daily academic life.

  • Blackboard Learn Cloud [Note: This is my made-up name, not Blackboard’s official product name] is the new architecture that makes Ultra possible for Learn. It also enables you to gain all of the benefits of being in the cloud, like being super-reliable and less expensive. But with regard to Ultra, we can’t create that nifty integrated experience without adding some new technical infrastructure. Learn Cloud enables us to do that. Update: Ultra is still a work in progress and may not be appropriate for all professors and all courses in its current form. Luckily, Learn Cloud also runs the traditional Learn user experience that is available on Learn Enterprise. So you can run Learn Cloud now without impacting your faculty and have them switch over to the Ultra experience—on the same platform—whenever they are ready for it and it is ready for them.

  • Blackboard Learn Enterprise [another Feldstein-invented name] is the classic architecture for Learn, currently on version 9.1. We think that a significant number of customers, both in the US and abroad, will continue to want to use the current architecture for a long time to come, in part because they want or need to self-host. We are committed to actively developing Learn Enterprise for as long as a significant number of customers want to use it. Our published road maps go out two years, but that doesn’t mean we only plan to develop it for another two years. It just means that it’s silly to create technology road maps that are longer than two years, given how much technology changes. Because Learn Enterprise shares a lot of code with Learn Cloud, we actually can afford to continue supporting both as long as customers are buying both in numbers. So we really do mean it when we say we plan to keep supporting Enterprise for the foreseeable future. We will also bring as much of the Ultra experience to Enterprise as the technology allows. That won’t be all or most, but it will be some. The product will continue moving forward and continue to benefit from our best thinking.

  • Self-hosted Learn Cloud isn’t going to happen any time soon, which means that self-hosted Ultra isn’t going to happen any time soon. It is possible that the technologies that we are using for Blackboard Cloud will mature enough in the future that we will be able to provide you with a self-hosted version that we feel confident that we can support. (This is a good example of why it is silly to create technology road maps that are more than two years long. Who knows what the Wizards of the Cloud will accomplish in two years?) But don’t hold your breath. For now and the foreseeable future, if you self-host, you will use Learn Enterprise, and we will keep supporting and actively developing it for you.

  • Mobile is a special case because a lot of the functionality of the mobile app has lived in the cloud from Day 1 (unlike Learn Enterprise). So we can deliver the Ultra experience to your mobile apps even if you are running Learn Enterprise at home.

  • Managed hosting customers cannot run Ultra on Learn for the same reason that self-hosted customers cannot: They are currently using Learn Enterprise. They can continue to use Learn Enterprise on managed hosting for as long as they want, as long as they don’t need Ultra. We will, eventually, offer Learn Private Cloud [yet another Feldstein-invented name]. Just as it sounds, this will be a private, Blackboard-hosted instance of Blackboard Cloud. Managed Hosting clients are welcome to switch to Learn Private Cloud when it becomes available, but it is not the same as managed hosting and may or may not meet your needs as well as other options. Please be sure to discuss it with your representative when it becomes available. In the meantime, we’ll provide you with detailed information about what would change if you moved from managed hosting of Blackboard Enterprise to Blackboard Cloud, along with detailed information about what the migration process would be like.

To be clear, I’m not 100% certain that what I’ve described above is factually correct, in part because Phil and I have heard slightly different versions of the story from different Blackboard executives. (I’m fairly sure it’s at least mostly right.) The main point is that, whatever the truth is, Blackboard needs to lay it out more clearly. Right now, they are missing easy wins because they are not communicating well.

Time will tell whether Ultra pays off. I’m actually pretty impressed with what I’ve seen so far. But no matter how good it turns out to be, Blackboard won’t start winning RFPs in real numbers until they start telling their story better.

The post Blackboard’s Messaging Problems appeared first on e-Literate.

Reuters: Blackboard up for sale, seeking up to $3 billion in auction

Tue, 2015-07-28 13:48

By Phil Hill | More Posts (350)

As I was writing a post about Blackboard’s key challenges, I got notice from Reuters (anonymous sources, so interpret accordingly) that the company is on the market, seeking up to $3 billion. From Reuters:

Blackboard Inc, a U.S. software company that provides learning tools for high school and university classrooms, is exploring a sale that it hopes could value it at as much as $3 billion, including debt, according to people familiar with the matter.

Blackboard’s majority owner, private equity firm Providence Equity Partners LLC, has hired Deutsche Bank AG and Bank of America Corp to run an auction for the company, the people said this week. [snip]

Providence took Blackboard private in 2011 for $1.64 billion and also assumed $130 million in net debt.

A pioneer in education management software, Blackboard has seen its growth slow in recent years as cheaper and faster software upstarts such as Instructure Inc have tried to encroach on its turf. Since its launch in 2011, Instructure has signed up 1,200 colleges and school districts, according to its website.

This news makes the messaging from BbWorld, as well as their ability to execute on strategy – particularly delivering the new Ultra user experience across all product lines, including the core LMS – much more important. I’ll get to that subject in the next post.

This news should not be all that unexpected, as one common private equity strategy is to reorganize and clean up a company (reduce headcount, rationalize management structures, reorient the strategy) and then sell within 3 – 7 years. As we have covered here at e-Literate, Blackboard has gone through several rounds of layoffs, and many key employees have already left the company due to new management and restructuring plans. CEO Jay Bhatt has been consistent in his message about moving from a conglomeration of silo’d mini-companies based on past M&A to a unified company. We have also described the significant changes in strategy – both adopting open source solutions and planning to rework the entire user experience.

Also keep in mind that there is massive investment in ed tech lately, not only from venture capital but also from M&A.

Update 1: I should point out that the part of this news that is somewhat surprising is the potential sale while the Ultra strategy is incomplete. As Michael pointed out over the weekend:

Ultra is a year late: Let’s start with the obvious. The company showed off some cool demos at last year’s BbWorld, promising that the new experience would be Coming Soon to a Campus Near You. Since then, we haven’t really heard anything. So it wasn’t surprising to get confirmation that it is indeed behind schedule. What was more surprising was to see CEO Jay Bhatt state bluntly in the keynote that yes, Ultra is behind schedule because it was harder than they thought it would be. We don’t see that kind of no-spin honesty from ed tech vendors all that often.

Ultra isn’t finished yet: The product has been in use by a couple of dozen early adopter schools. (Phil and I haven’t spoken with any of the early adopters yet, but we intend to.) It will be available to all customers this summer. But Blackboard is calling it a “technical preview,” largely because there are large swathes of important functionality that have not yet been added to the Ultra experience–things like tests and groups. It’s probably fine to use it for simple (and fairly common) on-campus use cases, but there are still some open manholes here.

Update 2: I want to highlight (again) the nature of this news story. It comes from Reuters, based on multiple anonymous sources. While Reuters is generally trustworthy, note that the story has not yet been confirmed.

Update 3: After contacting Blackboard, I received the following statement (which does not answer any questions, but which I am sharing nonetheless).

Blackboard, like many successful players in the technology industry, has become subject of sale rumors. Although we are transparent in our communications about the Blackboard business and direction when appropriate, it is our policy not to comment on rumors or speculation.

Blackboard is in an exciting industry that is generating substantial investor interest. Coming off a very successful BbWorld 2015 and a significant amount of positive customer and market momentum, potential investor interest in our company is not surprising.

We’ll update as we learn more, including if the news is confirmed by anyone outside of Reuters and its sources.

The post Reuters: Blackboard up for sale, seeking up to $3 billion in auction appeared first on e-Literate.

UC Davis: A look inside attempts to make large lecture classes active and personal

Mon, 2015-07-27 13:16

By Phil Hill | More Posts (350)

The core argument of my recent keynote for the Online Teaching Conference was as follows:

While there will be (significant) unbundling around the edges, the bigger potential impact [of ed innovation] is how existing colleges and universities allow technology-enabled change to enter the mainstream of the academic mission.

Let’s look at one example. Back in December the New York Times published an article highlighting work done at the University of California at Davis to transform large lecture classes into active learning formats.

Hundreds of students fill the seats, but the lecture hall stays quiet enough for everyone to hear each cough and crumpling piece of paper. The instructor speaks from a podium for nearly the entire 80 minutes. Most students take notes. Some scan the Internet. A few doze.

In a nearby hall, an instructor, Catherine Uvarov, peppers students with questions and presses them to explain and expand on their answers. Every few minutes, she has them solve problems in small groups. Running up and down the aisles, she sticks a microphone in front of a startled face, looking for an answer. Students dare not nod off or show up without doing the reading.

Both are introductory chemistry classes at the University of California campus here in Davis, but they present a sharp contrast — the traditional and orderly but dull versus the experimental and engaging but noisy. Breaking from practices that many educators say have proved ineffectual, Dr. Uvarov’s class is part of an effort at a small but growing number of colleges to transform the way science is taught.

This article follows the same argument laid out in the Washington Post nearly three years earlier.

Science, math and engineering departments at many universities are abandoning or retooling the lecture as a style of teaching, worried that it’s driving students away. [snip]

Lecture classrooms are the big-box retailers of academia, paragons of efficiency. One professor can teach hundreds of students in a single room, trailed by a retinue of teaching assistants.

But higher-education leaders increasingly blame the format for high attrition in science and math classes. They say the lecture is a turn-off, higher education at its most passive, leading to frustration and bad grades in highly challenging disciplines.

What do these large lecture transformations look like? Our recent e-Literate TV case study gave us an inside look at the work done at UC Davis (episode 1, episode 2, episode 3), including first-person accounts from faculty members and students.

The organizing idea is to apply active learning principles such as the flipped classroom to large introductory science classes.

Phil Hill: It sounds to me like you have common learning design principles that are being implemented, but they get implemented in different ways. So, you have common things of making students accountable, having the classes much more interactive where students have to react and try to apply what they’re learning.

Chris Pagliarulo: Yeah, the main general principle here is we’re trying to get—if you want to learn something complex, which is what we try to do at an R1 university, that takes a lot of practice and feedback. Until recently, much of that was supposed to be going on at home with homework or whatnot, but it’s difficult to get feedback at home when the smart people aren’t there that would help you—either your peers or your professor.

So, that’s the whole idea of the flipped classroom: you come prepared with some basic understanding and take that time where you’re all together to do the high-quality practice and get the feedback while we’re all together. Everything that we’re doing is focused on that sort of principle—getting that principle into the classroom.

Professor Mitch Singer then describes how he got involved in the redesign.

Phil Hill: Several years ago, the iAMSTEM group started working with the biology and chemistry departments to apply some of these learning concepts in an iterative fashion.

Mitch Singer: My (hopefully) permanent assignment now, at least for the next five years, will be what we call “BIS 2A,” which is the first introductory course of biology here at UC Davis. It’s part of a series, and its primary goal is to teach fundamentals of cellular and molecular biology going from origins up to the formation of a cell. We teach all the fundamentals in this class: the stuff that’s used for future ones.

About three to four years ago, I got involved in this class to sort of help redesign it, come up with a stronger curriculum, and primarily bring in sort of hands-on, interactive learning techniques, and we’ve done a bunch of experiments and changed the course in a variety of ways. It’s still been evolving over the last several years. The biggest thing that we did was add a discussion section, which is two hours long, where we’ve done a lot of our piloting for this interactive, online, personalized learning (as the new way of saying things, I guess). This year (last quarter in the fall) was the first time we really tried to, quote, flip part of the classroom.

That is, make the students take a little bit more responsibility for their own reading and learning, and then the classic lecture is more asking questions, trying to get them to put A and B together to come up with C. It’s sort of that process that we’d like to emphasize and get them to actually learn, and that’s what we want to test them on, not so much the facts, and that’s the biggest challenge.

If you want to see how technology-enabled change might transform the core of the academic mission, it is crucial to look at large lecture classes and how to make them more effective. The UC Davis case study highlights what is actually happening in the field, with input from real educators and students.

The post UC Davis: A look inside attempts to make large lecture classes active and personal appeared first on e-Literate.

Blackboard Ultra and Other Product and Company Updates

Sat, 2015-07-25 08:58

By Michael Feldstein | More Posts (1039)

Phil and I spent much of this past week at BbWorld trying to understand what is going on there. The fact that their next-generation Ultra user experience is a year behind is deservedly getting a lot of attention, so one of our goals going into the conference was to understand why this happened, where the development is now, and how confident we could be in the company’s development promises going forward. Blackboard, to their credit, gave us tons of access to their top executives and technical folks. Despite the impression that a casual observer might have, there is actually a ton going on at the company. I’m going to try to break down much of the major news at a high level in this post.

The News

Ultra is a year late: Let’s start with the obvious. The company showed off some cool demos at last year’s BbWorld, promising that the new experience would be Coming Soon to a Campus Near You. Since then, we haven’t really heard anything. So it wasn’t surprising to get confirmation that it is indeed behind schedule. What was more surprising was to see CEO Jay Bhatt state bluntly in the keynote that yes, Ultra is behind schedule because it was harder than they thought it would be. We don’t see that kind of no-spin honesty from ed tech vendors all that often.

Ultra isn’t finished yet: The product has been in use by a couple of dozen early adopter schools. (Phil and I haven’t spoken with any of the early adopters yet, but we intend to.) It will be available to all customers this summer. But Blackboard is calling it a “technical preview,” largely because there are large swathes of important functionality that have not yet been added to the Ultra experience–things like tests and groups. It’s probably fine to use it for simple (and fairly common) on-campus use cases, but there are still some open manholes here.


Ultra is only available in SaaS at the moment and will not be available for on-premise installations any time soon: This was a surprise both to us and to a number of Blackboard customers we spoke to. It’s available now for SaaS customers and will be available for managed hosting customers, but the company is making no promises about self-hosted installations. The main reason is that they have added some pretty bleeding-edge new components to the architecture that are hard to wrap up into an easily installable and maintainable bundle. The technical team believes this situation may change over time as the technologies they are using mature—to be clear, we’re talking about third-party technologies like server containers rather than homegrown Blackboard technologies. If that happens, it may become practical for schools to self-host Ultra, if they still want to by that time. But don’t expect to see this happen in the next two years.

Ultra is much more than a usability makeover and much more ambitious than is commonly understood: There is a sense in the market that Ultra is Blackboard’s attempt to catch up with Instructure’s ease of use. While there is some truth to that, it would be a mistake to think of Ultra as just that. In fact, it is a very ambitious re-architecture that, for example, has the ability to capture a rich array of real-time learning analytics data. These substantial and ambitious under-the-hood changes, which Phil and I were briefed on extensively and which were also shared publicly at Blackboard’s Devcon, are the reason why Ultra is late and the reason why it can’t be locally installed at the moment. I’m not going to have room to go into the details here, but I may write more about it in a future post.
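To make “a rich array of real-time learning analytics data” a bit more concrete, here is a minimal, purely illustrative sketch of what streaming a learning event from an LMS might look like. This is not Blackboard’s actual schema or API; the field names, event shape, and analytics endpoint are all hypothetical, loosely inspired by the kinds of activity events (viewed, submitted, graded) that standards such as IMS Caliper and xAPI describe.

```typescript
// Purely illustrative sketch – NOT Blackboard's actual schema or API.
// It shows the general shape of a real-time learning analytics event
// and how an LMS component might stream it to an analytics service.

interface LearningEvent {
  actor: string;      // pseudonymous learner identifier (hypothetical format)
  action: string;     // e.g. "Viewed", "Submitted", "Graded"
  object: string;     // the course item the learner interacted with
  course: string;     // course identifier
  timestamp: string;  // ISO 8601 time of the interaction
}

// Hypothetical endpoint; a real system would also handle auth, retries,
// and batching rather than firing one HTTP request per event.
async function emitEvent(endpoint: string, event: LearningEvent): Promise<void> {
  await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(event),
  });
}

// Example: record that a student opened an assignment, as it happens.
emitEvent("https://analytics.example.edu/events", {
  actor: "student:12345",
  action: "Viewed",
  object: "assignment:problem-set-3",
  course: "course:intro-chem-fall-2015",
  timestamp: new Date().toISOString(),
});
```

The point of the sketch is the architectural shift it implies: capturing events as they happen, rather than mining the database after the fact, is part of what makes the re-architecture more than a facelift.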

Blackboard “Classic” 9.x is continuing under active development: If you’re self-hosted, you will not be left behind. Blackboard claims that the 9.x code line will continue to be under active development for some time to come, and Phil and I found their claims to be fairly convincing. To begin with, Jay got burned at Autodesk when he tried to push customers onto a next-generation platform and they didn’t want to go. So he has a personal conviction that it’s a bad idea to try that again. But also, Blackboard gets close to a quarter of its revenue and most of its growth from international markets now, and for a variety of reasons, Ultra is not yet a good fit for those markets and probably won’t be any time soon. So self-hosted customers on Learn 9.x will likely get some love. This doesn’t mean development will be as fast as they would like; the company is pushing hard in a number of directions, and we get the definite sense that there is a strain on developer resources. But 9.x will not be abandoned or put into maintenance mode in the near future.

[Slide 09]

If you want to get a sense of what Ultra feels like, try out the Blackboard Student mobile app: The way Blackboard uses the term “Ultra” is confusing, because sometimes it means the user experience and sometimes it means the next-generation architecture for Learn. If you want to try Ultra the user experience, then play with the Student mobile app, which is in production today and which will work with Learn 9.x as well as Learn Ultra. Personally, I think it represents some really solid thinking about designing for students.

[Slide 60]

[Slide 66]

Moodle may make a comeback: One of the reasons that Moodle adoption has suffered in the United States over the past few years is that it has lacked an advocate with a loud voice. Moodlerooms used to be the biggest promoter of the platform, and when Blackboard acquired them, they went quiet in the US. But, as I already mentioned, the international market is hugely important for Blackboard now, and Moodle is the cornerstone of the company’s international strategy. They have been quietly investing in the platform, making significant code contributions and acquisitions. There are signs that Blackboard may unleash Moodlerooms to compete robustly in the US market again. This would entail taking the risk that Moodle, a cheaper and lower-margin product, would cannibalize their Learn business, so file this under “we’ll believe it when we see it.” But Apple has killed the taboo of self-cannibalization when the circumstances are right, and the circumstances may well be right in this situation.

Collaborate Ultra is more mature than Learn Ultra but still not mature: This is another case where thinking about Ultra as a usability facelift would hugely underestimate the ambition of what Blackboard is trying to do. The new version of Collaborate is built on a new standard called WebRTC, which enables webconferencing directly in the browser rather than through Flash or Java plug-ins. This is extremely hard stuff that big companies like Google, Microsoft, and Apple are still in the process of working out right now. It is just this side of crazy for a company the size of Blackboard to try to release a collaboration product based heavily on this technology. (And the only reason it’s not on the other side of crazy is that Blackboard acquired a company with one of the world’s foremost experts on WebRTC.) Phil and I have used Collaborate Ultra a little bit. It’s very cool but a little buggy. And, like Learn Ultra, it’s still missing some features. At the moment, the sweet spot for the app appears to be online office hours.

[Slide 53]
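For readers who have not run into WebRTC before, here is a minimal sketch of the browser-native plumbing it provides – again, this is not Collaborate Ultra’s code, and the video element ID and STUN server URL are placeholders. The point is that camera capture and peer connections are handled by standard browser APIs, with no Flash or Java plug-in involved; the genuinely hard part, which the sketch omits, is the signaling and media infrastructure a vendor has to build around these calls.

```typescript
// Minimal, illustrative WebRTC sketch – not Collaborate Ultra's code.
// Standard browser APIs only: getUserMedia for camera/mic capture and
// RTCPeerConnection for the peer link. Signaling (exchanging the offer,
// answer, and ICE candidates between participants) is omitted; that is
// where most of a real webconferencing product's complexity lives.

async function startLocalVideo(): Promise<MediaStream> {
  // Ask the browser for camera and microphone access.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  // Show a local preview in a <video id="local-video"> element (placeholder ID).
  const video = document.querySelector<HTMLVideoElement>("#local-video");
  if (video) {
    video.srcObject = stream;
    await video.play();
  }
  return stream;
}

async function connectToPeer(stream: MediaStream): Promise<RTCPeerConnection> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.example.org" }], // placeholder STUN server
  });
  // Send our audio/video tracks to the remote participant.
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  // Create an SDP offer; a real app would send this to the other
  // participant through its own signaling channel.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  return pc;
}

startLocalVideo().then(connectToPeer).catch(console.error);
```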


My Quick Take

I’m trying to restrain myself from writing a 10,000-word epic; there is just a ton to say here. I’ll give a high-level framework here and come back to some aspects in later posts.

Bottom line: If you think that Ultra is all about playing catch-up with Instructure on usability, then the company’s late delivery, functionality gaps, and weird restrictions on where the product can and cannot be run look pretty terrible. But that’s probably not the right way to think about Ultra. The best analogy I can come up with is Apple’s Mac OS X. In both cases, we have a company that is trying to bring a large installed base of customers onto a substantially new architecture and new user experience without sending them running for the hills (or the competitors). This is a really hard challenge. Hardcore OS X early adopters will remember that 10.0 was essentially an unusable technology preview, 10.1 was usable but painful, 10.2 was starting to feel pretty good, and 10.3 was when we really began to see why the new world was going to be so much better than the old one.

If I am right, Ultra will go through the same sort of evolution. I don’t know that these stages will each be a year long; I suspect that they may be shorter than that. But right now we are probably partway through the 10.0 era for Ultra. As I mentioned earlier in the post, Phil and I still need to talk to some Ultra customers to get a sense of real usage, and, of course, since it will be generally available to SaaS customers for use in the fall semester, we’ll have more folks to talk to soon. We will be watching closely to see how big the gaps are and how quickly they are filled. For example, how long will it take Blackboard to get to the items labeled as “In Development” on their slides? Does that mean in a few months? More? And what about the “Research” column?

Based on these slides and our conversations, I think the best-case scenario is that we reach the 10.2 era—where the platform is reasonably feature-complete, usable, and feeling pretty good overall—by BbWorld 2016, with some 10.3-type new and strongly differentiating features starting to creep into the picture. Or they could fall flat and utterly fail to deliver. Or something in between. I’m pretty excited by the scope of the company’s ambition and am willing to cut them some slack, partly because they persuaded me that what they are trying to do is pretty big and partly because they persuaded me that they probably know what they are doing. But they have had their Mulligan. As the saying goes (when properly remembered), the proof of the pudding is in the eating. We’ll see what they deliver to customers in the next 6–12 months.

Watch this space.

The post Blackboard Ultra and Other Product and Company Updates appeared first on e-Literate.