
Michael Feldstein

What We Are Learning About Online Learning...Online

Giving D2L Credit Where Credit Is Due

Thu, 2015-07-23 21:20

By Phil Hill

Michael and I have made several specific criticisms of D2L’s marketing claims lately, culminating in this blog post about examples based on work at the University of Wisconsin-Milwaukee (UWM) and California State University at Long Beach (CSULB).

I understand that other ed tech vendors make marketing claims that cannot always be tied to reality, but these examples cross a line. They misuse and misrepresent academic outcomes data – whether based on public research or internal research – and essentially take credit for their technology “delivering results”.

This week brought welcome updates from D2L that go a long way towards addressing the issues we raised. As of Monday, I noticed that the ‘Why Brightspace? Results’ page now has links to supporting material for each claim, and the UWM claim has been reworded. Today, D2L released a blog post explaining these changes and admitting the mistakes. D2L even changed the web page to allow text selection for copy / paste. From the blog post:

Everyone wants more from education and training programs—so it’s critical that our customers are part of the process of measurement and constant improvement.

At Fusion, our customers came together to share new ideas and practices to push education forward. They like to hear about the amazing results, like U-Pace, which we post on our website. In our excitement to share the great results our customers are seeing through their programs, we didn’t always provide the details around the results. When we make mistakes, it’s our job to fix it—as we are doing now.

U-Pace is the specific program at UWM (course redesign from large lecture to self-paced / mastery approach), and D2L now links to a documented case study and quotes this case study in the blog post.

We have a Customer Success Program in place where approvals from our clients are acquired before we post anything about them. Stories are revisited every six months to make sure that they’re still valid and accurate. However, a recent customer success story was mistakenly posted on our website without their permission or knowledge. We will be doubling down on our efforts to help ensure that this doesn’t happen again, and we will work harder to provide citations for all the facts.

This “without their permission or knowledge” paragraph refers to a claim about CSULB.

Make no mistake, we’re extremely proud of what our clients are accomplishing. Our customers’ innovation, dedication, and just plain awesomeness is making a huge difference—and we’re proud to be a part of it. We will continue to measure and improve our offerings, listen to our community for suggestions, and when warranted, share their results. Here’s to them!

Kudos to D2L for these admissions and changes. Well done.

Notes and Caveats

While the overall change is very positive, I do have a few additional notes and caveats to consider.

  • The blog post today should have come from Renny Monaghan (Chief Marketing Officer) or John Baker (CEO). The blog post was written by Barry Dahl[1], and unless I misunderstand, he is their lead for community engagement – building a user community that is mostly behind-login and not public-facing. The “mistakes” were made in official marketing and company communications. The leader of the department in charge of official messaging (Renny) or the company leader (John) should have taken ownership of what happened in the past and of the corrections being made.
  • In the blog post section describing the U-Pace program at UWM, I would have included a description of the move from large lecture to a self-paced / mastery approach. That change should not be embedded as one of “many factors that came together for UWM to achieve the results that they did, and that the increases in student success are not all attributed to their use of Brightspace.” The change to self-paced / mastery was the intervention; all other factors are secondary. The case study describes the program quite well, but such an omission in the blog post is misleading.
  • The blog post only references UWM and CSULB examples, yet the ‘Why Brightspace? Results’ page added links to all claims. Changing them all was the right move.
  • Apparently, specific criticisms do not carry a CC-BY license.

These are welcome changes.

  1. For what it’s worth, Barry does great work for the company.


Unizin Updates on Florida State System and Acquisition of Courseload

Wed, 2015-07-22 19:29

By Phil Hill

I’m not sure when e-Literate was awarded the exclusive rights for non-PR Unizin coverage, but there were two announcements this week to cover.

State University System of Florida Joins

The first announcement is an update and confirmation of my recent post about the new associate membership option. If a member institution (one of the 11 members paying $1.050 million) sponsors its statewide system, that system can join Unizin as an “associate member” for $100 thousand per year, but without a board seat and vote on product direction. This week the State University System of Florida (SUSFL) announced it is joining Unizin.

Building on its growing record of collaboration, the State University System of Florida, comprised of Florida’s 12 public universities, has joined Unizin, a group with a mission to have more control and influence over the digital learning ecosystem.

The decision helps secure Florida’s leadership in the realm of digital learning and gives access to tools under development, including a Digital Objects Repository and Learning Analytics. Florida is the first State University System to join the collaborative organization, which is a consortium of major research universities. The University of Florida is a founding member, alongside other top universities such as Pennsylvania State University, Ohio State University and the University of Michigan. The organization is a not-for-profit service operation and its membership is by invitation only.

It is not clear which of the 12 public universities beyond the University of Florida are actually planning to participate in Unizin. If you want details on the SUSFL plans and what associate membership means, go read the earlier post.

Courseload Acquisition And Content Relay

The second update is that Unizin acquired the IP, trademark, and remains of Courseload, a provider of an e-reader platform for digital textbooks. From the announcement:

Unizin announced the acquisition of the Courseload software today. Courseload includes an eText reader platform and collaborative learning tools for the delivery of digital learning materials including Open Educational Resources, faculty-authored course packs, and publisher content. The addition of Courseload is a vital component for connecting content to learners in Unizin’s digital learning ecosystem.

This move now determines the second component of Unizin, as the plan is for the acquired Courseload employees to modify and develop a portion of their software to become the basis for the Content Relay. Previously, Unizin had been planning to license or contract with another organization to provide the Content Relay.

This acquisition means that Unizin will now be in the software development business, not just integrating various products. It changes what had previously been a plan not to develop product, as Unizin co-founder and co-chairman of the board Brad Wheeler shared with me last year.

Unizin is not a Community Source effort in the way that I understand Community Source as we started applying the label 10+ years ago. Unizin is better understood, as you have reported, as a cloud-scale service operator somewhat like I2 [Internet2]. It does not plan to do lots of software development other than as needed for integrations. No biggie, just a nuanced observation from the end of the story.

When I asked Brad if this means that Unizin is ruling out product development, he replied:

Unizin is working on its roadmap for each area. If we do need to head down some development approach that is more than integration, we’ll give thought to the full range of options for best achieving that, but there is no plan to begin an open/community source effort at this time.

Courseload is based in Indianapolis, IN while Unizin is based in Austin, TX. This creates an interesting situation where a new organization will be managing a remote development team that likely outnumbers the pre-existing Unizin employees.

Common Origins

The Chronicle described the origins of Courseload in 2010.

Courseload, the e-book broker, started in 2000, when a co-founder, Mickey Levitan, a former Apple employee inspired by the company’s transformative role in the music industry, devised the idea and teamed up with a professor at Indiana University at Bloomington to try it. But the company failed to find enough takers, and it all but shut down after a brief run.

Then last year an official at Indiana, Bradley C. Wheeler, called Mr. Levitan and talked him into trying again.

Update (7/23): The following paragraph has been revised based on private communication from a source, who pointed out that the Crunchbase data is wrong in this case.

In 2012 the Innovate Indiana Fund, an organization that represents Indiana University’s push for economic development, joined other investment groups in helping to fund the new Courseload. The IIF investment was in the low single digits as a percentage of the total raised. The tight relationship with IU was further described in the Innovate Indiana end-of-year 2012 report.

In 2000, Mickey Levitan and IU Professor Alan Dennis had an idea that was ahead of its time. Through Courseload, the start-up learning platform company they cofounded, the two endeavored to make college course materials accessible online.

A decade later, Indiana University became the first customer, implementing the Courseload platform across all its campuses. Now with 50 clients and 32 employees, Courseload is leading the online course text revolution—lowering costs for students and providing capabilities that can improve educational outcomes, while offering professors the discretion to use the platform on a course-by-course basis. [snip]

Levitan is grateful for the company’s broad-reaching partnership with IU. Early support from [VP of IT Brad] Wheeler was critical to the company’s success, Levitan says. “He’s a wonderful partner and an extraordinary leader—a visionary who is ready to go out and shape the world rather than be shaped by it.”

Levitan is also grateful for the company’s early and ongoing relationship with the IU Research and Technology Corporation (IURTC). Tony Armstrong, president and CEO of the IURTC, identified an early funding opportunity for Courseload through the Innovate Indiana Fund. Kenneth Green, manager of the Innovate Indiana Fund, sits on Courseload’s board of directors.

This Inside Higher Ed article from 2012 highlights the common origins of Unizin and Courseload – the common founder, the Internet2 connection, and the common justification. As a reminder, Unizin technically operates as part of Internet2.

In a session at the 2011 Educause conference in October, Bradley Wheeler, the chief information officer at Indiana University, issued a challenge to his colleagues. Unless universities assert their power as customers, the vendors that sell them products and services will continue squeezing those institutions for cash while dictating the terms under which they go digital.

That conversation revolved around expensive, institution-level investments such as learning-management platforms and enterprise resource planning software. Now Wheeler and his colleagues are looking to apply the same principles of “aggregated demand” to help students save money on electronic textbooks.

Internet2, a consortium of 221 colleges and universities, which last year brokered landmark deals with Box.com and Hewlett-Packard that gave its members discounts on cloud computing services, announced today that it had entered into a contract with McGraw-Hill, a major textbook publisher, aimed at creating similar discounts for students on digital course materials.

Moving Ahead

Unizin is now up to 11 full member institutions and 1 state-wide system associate member. Despite or because of the tangled paths of Unizin and Courseload, we finally have some clarity on the second component (the Content Relay) of the consortium’s services. It’s not what I would have guessed ahead of time, but I have to admit there seems to be a willing list of schools ready to join.


Release of University of California at Davis Case Study on e-Literate TV

Sun, 2015-07-19 16:55

By Phil Hill

Today we are thrilled to release the fifth and final case study in our new e-Literate TV series on “personalized learning”. In this series, we examine how that term, which is heavily marketed but poorly defined, is implemented on the ground at a variety of colleges and universities. We plan to cap off this series with two analysis episodes looking at themes across the case studies.

We are adding three episodes from the University of California at Davis (UC Davis), a large research university with a strong emphasis on science, technology, engineering, and math (STEM) fields. The school has determined that the biggest opportunity to improve STEM education is to improve the success rates in introductory science classes – the ones typically taught in large lecture format at universities of this size. Can you personalize this most impersonal of academic experiences? What opportunities and barriers do institutions face when they try to extend personalized learning approaches?

You can see all the case studies (either 2 or 3 episodes per case study) at the series link, and you can access individual episodes below.

UC Davis Case Study: Personalizing The Large Lecture Class

UC Davis Case Study: Intro to Biology and Intro to Chemistry Examples

UC Davis Case Study: Opportunities and Barriers to Extending Personalization

e-Literate TV, owned and run by MindWires Consulting, is funded in part by the Bill & Melinda Gates Foundation. When we first talked about the series with the Gates Foundation, they agreed to give us the editorial independence to report what we find, whether it is good, bad, or indifferent.

As with the previous series, we are working in collaboration with In the Telling, our partners providing the platform and video production. Their Telling Story platform allows people to choose their level of engagement, from just watching the video to accessing synchronized transcripts and transmedia. We have added content directly to the timeline of each video, bringing up further references, like e-Literate blog posts or relevant scholarly articles, in context. With In The Telling’s help, we are crafting episodes that we hope will be appealing and informative to those faculty, presidents, provosts, and other important college and university stakeholders who are not ed tech junkies.

We welcome your feedback, either in comments or on Twitter using the hashtag #eLiterateTV. Enjoy!


Unizin Perspective: Personalized learning’s existence and distance education experience

Wed, 2015-07-15 18:46

By Phil Hill

By reading the Unizin pitch for the State University System of Florida shared yesterday, we can see quite a few claims about the (potential) benefits to be provided by the consortium. “Make sure that the universities were not cut out of [distance ed] process”; “Secure our foothold in the digital industry”; “Promote greater control and influence over the digital learning ecosystem”; Provide “access to the Canvas LMS at the Unizin price”; Provide “access to tools under development, including a Learning Object Repository and Learning Analytics”; Provide “potential for cooperative relationships to ‘share’ digital instruction within and across the consortium”.

I want to pick up on University of Florida provost Joe Glover’s further comment on Learning Analytics, however.

The third goal for Unizin is to acquire, create, or develop learning analytics. Some of the learning management systems have a rather primitive form of learning analytics. Unizin will build on what they have, and this will go from very mechanical types of learning analytics in terms of monitoring student progress and enabling intrusive advising and tutoring; all the way up to personalized learning, which is something that really does not exist yet but is one of the objectives of Unizin.

Personalized learning “really does not exist yet”? You can argue that personalized learning as a field is evolving and mostly in pilot programs, or that it is poorly defined and understood, or that there are not yet credible studies independently reviewing the efficacy of this family of approaches. But you cannot accurately say that personalized learning “really does not exist yet”. And is Unizin claiming that the consortium is key to making personalized learning a reality? This seemed to be one of the arguments in the pitch.

If A Tree Falls In A Different Sector . . .

There are multiple examples of personalized learning in practice, particularly at community colleges to deal with developmental math challenges. I have written about the massive investment in the emporium approach at Austin Community College’s ACCelerator Lab.

Rather than running a pilot program – an approach that I have argued plagues higher ed and prevents diffusion of innovations – Austin CC has committed to A) a big program up front (~700 students in the Fall 2014 inaugural semester and ~1,000 students in Spring 2015), while B) offering students the choice of traditional or emporium. To me, this offers the best of both worlds, allowing a big bet that doesn’t get caught in the “purgatory of pilots” while offering student choice.

We also shared through e-Literate TV an entire case study on Essex County College, showing their personalized learning approach.

In another e-Literate TV case study, one that does not focus on developmental math, we shared the personalized learning program at Empire State College, which has been trying various personalized approaches for more than 40 years.

If A Tree Falls In A Non-Unizin Campus . . .

Personalized learning does exist, and Unizin schools could learn from the pioneers in this field. It would be wonderful if Unizin ends up helping to spread innovative teaching & learning practices within the research university community, and there are already some great examples in that group of schools (including at Arizona State University, UC Davis, and even at Unizin member Penn State). For that matter, the University of Florida would do well to travel two hours south and see the personalized learning programs in place at the University of Central Florida.

If this “consortium of large public universities that intends to secure its niche in the evolving digital ecosystem” means that the schools want to learn primarily among themselves, then Unizin will be falling prey to the same mistake that the large MOOC providers made – ignoring the rich history of innovation in the field and thinking they are creating something without precedent by leveraging their unique insight.

If A Tree Falls In A Distance Forest . . .

While Unizin has never claimed to be focused only on distance education, Glover does bring up the topic twice as the core of his argument.

That is a situation that we got ourselves in by not looking ahead to the future. We believe we are in a similar position with respect to distance learning at this point. [snip]

Every university in some sense runs a mom & pop operation in distance learning at this point, at least in comparison with large organizations like IBM and Pearson Learning that can bring hundreds of millions of dollars to the table. No university can afford to do that.

Let’s ignore the non sequitur about IBM for now. A few notes:

While there are larger non-profit online programs, it is not accurate to say that “every university in some sense runs a mom & pop operation”. It might be accurate based on the Unizin member institutions’ experience, however. And the University of Florida did recently sign a long-term contract with Pearson Embanet to create its UF Online program, largely based on the school’s inexperience (beyond some masters programs) with fully online education.

In the graph below, taken from the Fall 2013 IPEDS data, the Y axis is the ratio of students taking exclusively DE courses (fully online programs), and the X axis is the ratio of students taking some, but not all, DE courses (online courses within a f2f program).

[Figure: DE comparison, Unizin members vs. public 4-year institutions (Tableau Public)]

We see that the University of Florida and Penn State have a fairly high percentage of students taking some online courses, that Penn State World Campus is fully online (I am not sure whether World Campus is part of Unizin, but I included it to be safe), and that Oregon State has some fully online presence. In general, though, Unizin schools are not leaders in distance learning compared to other public 4-year universities. This is not a solid basis for thinking the consortium holds the answers on distance learning needs.
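For anyone who wants to reproduce this comparison, the two ratios are straightforward to compute from an IPEDS fall enrollment extract. A minimal sketch, assuming a CSV export with hypothetical file and column names (the actual IPEDS variable names differ by survey year):

```python
# Minimal sketch: compute the two DE ratios described above from an
# IPEDS Fall 2013 extract. The file name, column names, and membership
# flag are hypothetical stand-ins, not real IPEDS variable names.
import pandas as pd

df = pd.read_csv("ipeds_fall2013_de.csv")

# Y axis: share of students enrolled exclusively in DE courses.
df["pct_exclusive_de"] = df["exclusive_de_students"] / df["total_students"]

# X axis: share of students taking some, but not all, courses via DE.
df["pct_some_de"] = df["some_de_students"] / df["total_students"]

# Compare Unizin members against all public 4-year institutions.
unizin = df[df["unizin_member"]]
print(unizin[["institution", "pct_some_de", "pct_exclusive_de"]])
```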

Look Outward, Not Inward

In my mind, Unizin is looking in the wrong direction. The money and focus thus far have gone toward the 10 (now 11) member institutions looking inward – form a club, talk amongst themselves, and figure out what should happen in a digital ecosystem. A different, more useful approach would be to look outward: get the club together and look beyond their own successes (e.g. Penn State World Campus), go visit schools that are at the forefront of digital education, invite them in to present and share, and learn from others.

What I’m suggesting is that Unizin should focus a lot more on participating in open communities and focus a lot less on forming an exclusive club. If the schools then set the consortium’s mission as leading instructional change within the member institutions, and forming a support community based on the similar profile of schools, then we might see real results.


Unizin Offering “Associate” Membership For Annual $100k Fee

Tue, 2015-07-14 16:33

By Phil Hill

Alert unnamed readers prompted me after the last post on the Unizin contract to pursue the rumored secondary method of joining for $100k. You know who you are – thanks.

While researching this question, I came across a presentation by the University of Florida provost to the State University System of Florida (SUSFL) seeking to get the system to join Unizin under these new terms. The meeting was March 19, 2015; the video archive is here (first 15 minutes), and the slide deck is here. The key section (full transcript below):

Associate Membership FLSUS

Joe Glover: One of the things that Unizin has done – as I’ve said it consists of those 10 large research universities – is that the Unizin board decided that member institutions may nominate their system – in this case the state university system of Florida – for Associate Membership for an annual fee of $100,000 per system.

For $100,000 the entire state university system of Florida (SUSFL) could become an associate member of Unizin and enjoy all the benefits that Unizin brings forward, whether it’s reduced pricing of products that it’s licensing, or whether it’s products that Unizin actually produces. Associate Membership does not qualify for board representation, but as I mentioned you do enjoy the benefits of Unizin products and services.

This section reminded me of one item I should have highlighted in the contract. In appendix B:

The annual membership fees are waived for Founding Investors through June 30, 2017.

Does this mean that founding institutions that “invested” $1.050 million over three years will have to start paying annual fees of $100,000 starting in June 2017? That’s my assumption, but I’m checking to see what this clause means and will share at e-Literate.

Update (7/17): I talked to Amin Qazi today (CEO of Unizin) who let me know that the annual membership fee for institutional members (currently the 11 schools paying $1.050 million) has not been determined yet.

What is clear is that Unizin considers the board seat – and therefore input on the future direction and operations of Unizin – to be worth $750,000.[1]

Full Transcript

The presentation is fascinating in its entirety, so I’m sharing it below. There are many points that should be analyzed, but I’ll save that for other posts and for other people to explore.

Joe Glover: I’d like to begin by explaining the problem that Unizin was created to try and avoid, and I’m going to do it by analogy with the publishing problem with scientific journals. About 30 years ago there was a plethora of publishing companies that would take the intellectual property being produced by universities in the form of journal articles, and they would print them and publish them. There was a lot of competition, prices were relatively low to do that.

Then in the ensuing 30 years there was tremendous consolidation in that industry to the point that there are only three or four major publishers of scientific articles. As a consequence they have a de facto monopoly, and they’re in the position of now taking what we produce, packaging it, and selling it back to the libraries of universities basically at whatever price they want to charge. This is a national problem. It is not a problem that is unique to Florida, and I think that every state in the nation is trying to figure out how to resolve this problem because we can’t afford to continue to pay exorbitant prices for journals.

That is a situation that we got ourselves in by not looking ahead to the future. We believe we are in a similar position with respect to distance learning at this point.

We have a plethora of universities and commercial firms, all trying to get into the digital space. Most of us believe that over the next 10 – 15 – 20 years there will be tremendous consolidation in this industry, and it is likely that there will emerge a relatively small number of players who control the digital space.

This consortium of universities wanted to make sure that the universities were not cut out of this process or this industry in much the same way that they had been cut out of scholarly publishing.

Every university in some sense runs a mom & pop operation in distance learning at this point, at least in comparison with large organizations like IBM and Pearson Learning that can bring hundreds of millions of dollars to the table. No university can afford to do that.

So a consortium of major research universities in the country, in an effort to look down the road and to avoid this problem, and to secure our foothold in the digital industry, formed a consortium called Unizin. I’m going to go briefly through this to tell you what this is, and then to lay before you an opportunity that the state university system can consider for membership in this consortium to enjoy the advantages that we expect it to bring.

Slide 1

This consortium is very new – it was launched in 2014. Its current membership is by invitation only. You cannot apply to become a member of this consortium, it is by invitation. As I mentioned, its objective is to promote greater control and influence over the digital learning ecosystem.

Its governance is fairly standard. It has a board of directors that is drawn from the founding members. It has a CEO. It has a staff and it is acquiring more staff. As a legal entity it is a not-for-profit service operation which is hosted by Internet2.

Slide 2

Its current members include the universities that you see listed on this screen. These are 10 major universities in the nation – they’re all large research universities. There are other research universities that are considering joining. Unizin actually started out with four universities and quickly acquired the other six that are on this list.

Associate Membership FLSUS

The primary goals for Unizin as defined by its board of directors are the following. To acquire a learning management system that will serve as the foundation for what Unizin produces and performs. Secondly, to acquire or create a repository for digital learning objects. At the moment we are all producing all sorts of things, ranging from videos to little scientific clips, demonstrations, to illustrations, to lectures, notes, in all sorts of different formats – some retrievable, some not retrievable, some shareable, some not shareable. None of which is indexed, none of which I can see outside the University of Florida.

We believe there needs to be a repository that all of the members of Unizin can place the objects that they create to promote digital learning into, with an index. And in principle there will be developed a notion of sharing of these objects. It could be free sharing, it could be licensing, it could be selling. That’s something to be discussed in the future.

The third goal for Unizin is to acquire, create, or develop learning analytics. Some of the learning management systems have a rather primitive form of learning analytics. Unizin will build on what they have, and this will go from very mechanical types of learning analytics in terms of monitoring student progress and enabling intrusive advising and tutoring; all the way up to personalized learning, which is something that really does not exist yet but is one of the objectives of Unizin.

Those are the three primary goals for Unizin. If you believe that those are three important elements of infrastructure then you are probably interested in Unizin.

I have alluded to the possibility of a club, or of sharing content. We could think about sharing content. We could think about sharing courses. We could think about sharing degree programs. That is not really Unizin’s objective at this point. I will tell you that the universities that form the board for Unizin are in conversation about that, and we expect that to be one of the things that Unizin enables us to do as we create this repository, as we develop learning analytics we expect to be able to begin to collaborate with these universities. There are a lot of interesting questions as you approach that frontier, and by no means have these been resolved, but we believe it is inevitable and important for universities to begin sharing what they do in the digital learning space, and so Unizin would form the foundation for that.

One of the things that Unizin has done – as I’ve said it consists of those 10 large research universities – is that the Unizin board decided that member institutions may nominate their system – in this case the state university system of Florida – for Associate Membership for an annual fee of $100,000 per system.

For $100,000 the entire state university system of Florida (SUSFL) could become an associate member of Unizin and enjoy all the benefits that Unizin brings forward, whether it’s reduced pricing of products that it’s licensing, or whether it’s products that Unizin actually produces. Associate Membership does not qualify for board representation, but as I mentioned you do enjoy the benefits of Unizin products and services.

Slide 4

The potential benefits to the state university system I believe are the following. Unizin has settled on Canvas as the learning management system which would underlie the Unizin projects of building a repository and learning analytics. If you did not use Canvas you would still enjoy the benefits of Unizin and their products, but the use of them would not be as seamless as if you were on Canvas. You would have to build a crosswalk from the Unizin products to whatever LMS you are using. If you happen to be using Canvas you would enjoy the benefits of the Unizin products in a seamless fashion.

Unizin has negotiated a discount with Canvas. And so actually the University of Florida had signed the contract with Canvas before Unizin even existed. As soon as Unizin was created and negotiated a contract with Canvas, we actually received a discount from the price that we had negotiated. Because there were 10 large universities working on this, and there is some power in purchasing.

The second benefit, or second potential benefit which I think the system could enjoy is access to the tools which are under development as I’ve mentioned, including a digital repository and learning analytics.

Third, the system would enjoy membership in a consortium of large public universities that intends to secure its niche in the evolving digital ecosystem. As I have mentioned, we do see some potential risk as the industry consolidates, that we could be cut out of this industry if we don’t take the proper precautions.

Finally, as I’ve mentioned, there is the potential for cooperative relationships within the consortium to share digital instruction and to share digital objects and courses and degrees. That is really at the beginning conversation stage, that is not a goal of the Unizin organization itself but is a goal of the universities that underpin Unizin.

Q. I guess the real question is, tell me to what extent you can, how this will benefit each of the other universities who are not members at this time. And number two, could some of our other universities eventually become members?

A. Thank you for that question because I didn’t clarify one point that the question gives me the opportunity to clarify. Additional universities could be members of Unizin, and there are some universities in conversation with Unizin at this point. However, there is a larger charge for universities to become full board members of Unizin. University of Florida committed a million dollars over three years as part of the capitalization of Unizin. Every board member has done exactly the same. If a university in the system were interested in joining Unizin as a board member to help direct Unizin’s goals and operations, we could talk about that, but it would involve that level of investment.

At the lower level of investment, the $100,000 level which would be for the whole system – let’s say you join tomorrow – then an individual university would immediately have access to the preferred pricing for the Canvas learning management system. That would be a benefit to individual universities in the system who already are on Canvas or are considering going on Canvas. As the other tools or products are either acquired or developed by Unizin, the individual campuses would have access to those as well.

Q. I’d like to hear from John Hitt [president of UCF]. How does your university look at this proposal as it relates to online?

JH. I think the group membership for the system makes sense. I don’t think that it would make a lot of sense to have multiple institutions paying in a million bucks apiece. We would probably be interested in the $100,000 share. I doubt we would go for the full membership.

Q. Do you see the benefits they’re offering to benefit to UCF at this point, or would you use it?

JH. Yes, I think we would use some of it. We have more enthusiasm for some aspects of the membership than others. Yes, I think it would be useful.

There were no further questions, but it was apparent that some board members were not sure if they were being asked to pay $1 million for each campus or $100,000. Despite this short questioning, the motion passed as shared in the meeting minutes.

Chair Hosseini recognized Mr. Lautenbach for the Innovation and Online Committee report. Mr. Lautenbach reported the Committee heard an update from Provost Joe Glover on the Unizin Consortium and the Committee directed Chancellor Criser to work with university leadership in pursuing membership for the State University System in the consortium.

  1. The $1.050 million investment over three years minus the alternate cost of $100,000 per year for those same three years: $1,050,000 − (3 × $100,000) = $750,000.


Instructure Is Truly Anomalous

Tue, 2015-07-14 08:54

By Michael Feldstein

Phil started his last post with the following:

I’m not sure which is more surprising – Instructure’s continued growth with no major hiccups or their competitors’ inability after a half-decade to understand and accept what is at its core a very simple strategy.

Personally, I vote for Door #1. As surprising as the competition’s seeming sense of denial is, Instructure’s performance is truly shocking. After five years, I continue to be surprised by it. It’s not just how well they are executing. It’s that they seem to defy the laws of physics in the LMS market. We had no reason to believe that any LMS company could rack up the numbers they are showing—in several different areas—no matter how well they execute.

Back in late 2010, I wrote a two-part series on LMS market share. For context, this was a year after Blackboard acquired ANGEL, a month before Instructure recorded its first clients on the growth graph in Phil’s previous post, six months before we wrote our first post about Instructure on e-Literate, and two years before WebCT was officially killed off. At that time, Blackboard still had dominant market share—over 50%—but it was starting to become clear for the first time that their dominance might not last forever. The posts were my attempt to figure out what might happen next. Here’s what the non-Blackboard LMS market looked like then:

Here’s what the market share looked like when the then-present trends were projected out to 2014:

What we see here is a steady decline of Blackboard’s market share getting spread out among multiple platforms. It’s worth calling out a problem with the data that we had at that time. Campus Computing, the source of the market share information in this graph, tracks market share by company, not by platform. So we had no way of knowing how much of their market share was from their Learn platform and how much of it was from WebCT. This was crucial (or, at least, it seemed crucial at the time) because Blackboard was force-migrating WebCT customers to their Learn product. The rate at which Blackboard’s market share got distributed to other platforms depended on how much of the attrition was from WebCT CE customers, how many were WebCT Vista customers, and how many were Blackboard Learn customers.

The CE customers tended to be small schools with small contracts, and Blackboard wasn’t making much of an effort to keep them. To the degree that Blackboard’s losses were confined to CE going forward, the company would do just fine. On the other hand, to the degree that Blackboard lost customers from its core Learn platform, it would be a sign of impending catastrophe. LMS migrations were so hard and painful that very few schools migrated unless they felt that they absolutely had to. CE customers left Blackboard in part because it was clear that Blackboard didn’t care about them and that they therefore would never get the quality of product and service (and pricing) that they needed. Blackboard was making a real effort to keep Vista customers, but it was an open question as to whether the forced migration would cause Vista schools to look around at other options, or whether Blackboard could keep the pain of migration low enough that it would be easier to just roll over to Learn than to move to something else. If, on the other hand, Blackboard started losing Learn contracts, it would mean that customers on their core platform felt that the pain of staying was worse than the pain of leaving.

At the time, there was strong anecdotal evidence that CE customers were leaving in droves, moderate anecdotal evidence that Vista customers were preparing to leave, and little evidence that Learn customers were leaving. My sense at the time was that Blackboard would probably lose a bunch of customers through the WebCT sunset in 2012 and then the market would more or less settle back into stasis.

That’s not what happened. To begin with, Instructure roared onto the scene in 2011 and ended up stealing the lion’s share of the market share that Blackboard was leaking. But that’s not all. Take a look at the graph Josh Coates presented at the most recent Instructurecon:

As Phil wrote,

There appear to be three periods of growth here:

  • From introduction (roughly Jan 2011) until May 2012: Average growth of ~65 clients per year;
  • From May 2012 until May 2014: Average growth of ~140 clients per year;
  • From May 2014 until present: Average growth of ~190 clients per year.

So Instructure’s growth has accelerated since the end of 2012, which is the opposite of what I would have expected. Where is that growth coming from? It’s hard to tell. Unfortunately, the data we have on LMS market share is not as good as one would hope. The best indications we have right now are that they are primarily coming from former Blackboard Learn and ANGEL customers. Switching data sources from Campus Computing to Edutechnica, here’s Phil’s September 2014 analysis:

  • Blackboard’s BbLearn and ANGEL continue to lose market share in the US – Using the 2013 to 2014 tables (> 2000 enrollments), BbLearn has dropped from 848 to 817 institutions and ANGEL has dropped from 162 to 123. Using the revised methodology, Blackboard market share for > 800 enrollments now stands at 33.5% of institutions and 43.5% of total enrollments.
  • Moodle, D2L, and Sakai have no changes in US – Using the 2013 to 2014 tables (> 2000 enrollments), D2L has added only 2 schools, Moodle none, and Sakai 2 schools.
  • Canvas is the fastest growing LMS and has overtaken D2L – Using the 2013 to 2014 tables (> 2000 enrollments), Canvas grew ~40% in one year (from 166 to 232 institutions). For the first time, Canvas appears to have larger US market share than D2L (13.7% to 12.2% of total enrollments using the table above).

But even if you assume that Instructure picked up 100% of the Learn and ANGEL customers—which is plausible, given these numbers—that’s still only 70 new customers (the 31 Learn departures plus the 39 ANGEL departures). That’s half the ~140 new customers per year that Instructure is reporting. Could the rest be international? Maybe, although we have little reason to believe that to be the case. In the Edutechnica post that Phil references for the market share information, George Kroner does provide a little bit of information about Instructure’s international growth in the form of a graph of LMS market share in a few different countries:

We would need to see fully 50% of Instructure’s growth reflected in non-US markets to make the numbers square. We don’t see anything like that here. Of course, there are many other non-US markets. Maybe Canvas is all the rage in Turkmenistan. But it’s hard to square the circle. I just don’t know how to account for the company’s growth. I don’t doubt Instructure’s numbers. It’s just that there’s no way I can find to make sense of them with our current data about the market.

Beyond the numerical mystery, there seems to have been a change in market attitudes about LMS migration. Schools seem to be willing to look at alternatives even when they don’t have to. Nobody likes to migrate, of course, but a variety of factors, ranging from improved standards that make moving content easier to more technology maturity and experience among university faculty and staff, have reduced vendor lock-in. It’s a more fluid market now. I had hoped that would be the case someday but, in my heart of hearts, I really didn’t expect it. And at the moment, pretty much all of that new fluidity is flowing into Instructure—at least in US higher education.

Overall, Instructure’s growth is hard to explain. But there’s also another number that I can’t account for. I am in the process of writing an update to my post on the Glassdoor ratings of ed tech companies. At the moment, Instructure’s rating is 4.7. Out of 5. For reference, LinkedIn, which I used as context in last year’s post because it had one of the highest employee ratings on Glassdoor, currently rates only a 4.5. I have been to both Instructure’s and LinkedIn’s offices. LinkedIn’s is nicer. A lot nicer. I’m sure that their salaries are a lot higher as well. Instructure may be buoyed at the moment by the likelihood that they will have an IPO in the next year or two. But still. Instructure may be the highest rated company on Glassdoor right now, not just in ed tech but among all companies.

Also weird is the fact that we don’t hear any major complaints about them from anywhere. People tell us stuff. Customers, former employees, and current employees come to us often to dish dirt. What we end up publishing is only the tip of the iceberg because we don’t publish anything unless we feel we have strong confirmation (which usually means multiple sources), we can protect our sources by preserving their anonymity, we believe the information is truly newsworthy, and so on. We hear a lot of dirt. But we hear very little about Instructure. When we poke around, we can get people to tell us things that they’re not happy with, but it’s all normal stuff—I really wish they had this feature, that feature doesn’t work as well as it could, the sales rep was a little annoying or a little unresponsive, and so on. And almost always, the person reporting the problem takes pains to tell us that he or she is generally happy with the company. As Phil wrote,

Companies change as they grow, and I have covered when the company lost both founders and a high-profile CTO. The company moves on, however, and I cannot find customers complaining (at least yet) that the company has changed and is ticking them off. They do have customer challenges, but so far these have been manageable challenges.

Pop quiz: Name the highest profile customer disaster (outage during exams or first week, broken implementation, major bugs, etc.) for Canvas.

It’s not normal. And it can’t last forever. Sooner or later, gravity will assert itself and the company will start screwing up. They all do, eventually. But right now, Instructure’s performance is so good by multiple measures that it is almost literally unbelievable.


Instructure: Accelerating growth in 3 parallel markets

Mon, 2015-07-13 18:50

By Phil Hill

I’m not sure which is more surprising – Instructure’s continued growth with no major hiccups or their competitors’ inability after a half-decade to understand and accept what is at its core a very simple strategy. Despite Canvas LMS winning far more new higher ed and K-12 customers than any other vendor, I still hear competitors claim that schools select Canvas due to rigged RFPs or because it is the shiny new tool with no depth or substance. When listening to the market (institutions – including faculty, students, IT staff, academic technology staff, and admin), however, I hear the opposite. Canvas is winning LMS selections despite, not because of, RFP processes, and there are material and substantive reasons for this success.

The only competitor I see that seems to understand the depth of the challenge they face is Blackboard. Other LMS solutions are adding “cloud” options or making incremental improvements to usability, but only Blackboard is going for wholesale changes to both its User Experience (UX) and cloud hosting architecture. Unfortunately, I question whether Blackboard will be able to execute this strategy, but that is a story for another post.

As with last year’s post about InstructureCon, I believe that the company growth chart[1] gives a lot more information than just “gosh, we’re doing well”.

InstructureCon 2015 Growth Slide

Education Market Growth – Canvas

The use of Canvas in higher ed (shown in blue above) has grown steadily, but not exponentially, since the product’s introduction more than 4 years ago. There appear to be three periods of growth here:

  • From introduction (roughly Jan 2011) until May 2012: Average growth of ~65 clients per year;
  • From May 2012 until May 2014: Average growth of ~140 clients per year;
  • From May 2014 until present: Average growth of ~190 clients per year.

The use of Canvas in K-12 (shown in red above) has grown much faster; in fact, Instructure has more K-12 clients than higher ed clients and more sales people in K-12 than in higher ed. Let that sink in for a moment – it is a point that is not well understood by the market. Over the same three periods:

  • From introduction (roughly Jan 2011) until May 2012: Average growth of ~20 clients per year (much lower than higher ed);
  • From May 2012 until May 2014: Average growth of ~135 clients per year (almost the same as higher ed);
  • From May 2014 until present: Average growth of ~340 clients per year (far exceeds higher ed).

It should be noted, however, that K-12 clients tend to have fewer students per contract and tend to spend far less per student. I don’t have exact numbers, but we could assume the following:[2]

  • Instructure has more than 50% of its clients in K-12;
  • Instructure has 30 – 40% of its student counts in K-12; and
  • Instructure makes 25 – 33% of its revenue in K-12.

Corporate Market Growth – Bridge

Actually, the client numbers (shown in green above) do not show significant growth in corporate markets yet – just slow growth of ~30 clients per year. I wrote about the recent product introduction of Bridge (their LMS for corporate markets) here and here. This is a different strategy from that of other LMS providers with higher ed origins – Blackboard, D2L, and Moodle all use the same LMS for both education and corporate markets.

In discussions at the conference, however, the company certainly believes they are about to experience real growth in the corporate market with the new product, and they are hiring the sales force to lead this effort. It will be interesting to watch over the next year to see if the company succeeds in getting similar levels of growth as in higher ed and K-12.

Product Announcements

There were two main product announcements at the conference:

  • After a half-decade on the market, Canvas is gradually moving to a new UX design. I’ll cover that more in a second post.
  • Instructure introduced Canvas Data, a hosted data solution that addresses the biggest weakness in Canvas (not in terms of leapfrogging competition but rather trying to close the gap or to remove the weakness).

At its core, Canvas Data is an easily accessible native-cloud service, delivered on Amazon Web Services through Redshift. Canvas Data provides clients access to their data, including course design features, course activity, assessment and evaluation, user and device characteristics and more.
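Since Redshift speaks the PostgreSQL wire protocol, clients should be able to point standard SQL tooling at a Canvas Data warehouse. A speculative sketch – the endpoint, credentials, and table and column names below are illustrative assumptions on my part, not Instructure’s published schema:

```python
# Speculative sketch of querying a Canvas Data warehouse on Redshift.
# Redshift is PostgreSQL-compatible, so psycopg2 works as a client.
# The host, credentials, and table/column names are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,                 # Redshift's default port
    dbname="canvas_data",
    user="analyst",
    password="********",
)

with conn, conn.cursor() as cur:
    # For example: rank courses by submission activity.
    cur.execute("""
        SELECT course_id, COUNT(*) AS submissions
        FROM submission_fact   -- hypothetical fact table
        GROUP BY course_id
        ORDER BY submissions DESC
        LIMIT 10;
    """)
    for course_id, n in cur.fetchall():
        print(course_id, n)
```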

Both announcements are interesting, but mostly as they further illuminate the company’s strategy.

Market Strategy

Taken together, what we see is a company with a fairly straightforward strategy. Pick a market where the company can introduce a learning platform that is far simpler and more elegant than the status quo, then just deliver and go for happy customers.  Don’t expand beyond your core competency, don’t add parallel product lines, don’t over-complicate the product, don’t rely on corporate M&A. Where you have problems, address the gap. Rinse. Repeat.

Instructure has now solidified their dominance in US higher ed (having the most new client wins), they have hit their stride with K-12, and they are just starting with corporate learning. What’s next? I would assume international education markets, where Instructure has already started to make inroads in the UK and a few other locations.

The other pattern we see is that the company focuses on the mainstream from a technology adoption perspective. That doesn’t mean that they don’t want to serve early adopters with Canvas or Bridge, but Instructure more than any other LMS company knows how to say ‘No’. They don’t add features or change designs unless the result will help mainstream adoption – and the mainstream here is primarily instructors. Of course students care, but they don’t choose whether to use an LMS for their course – faculty and teachers do. For education markets, the ability to satisfy early adopters rests heavily on Canvas’ LTI-enabled integrations and acceptance of external application usage; this is in contrast to relying primarily on having all the features in one system.
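To make the LTI point concrete: an LTI 1.x launch is just an OAuth 1.0a-signed form POST from the LMS to the external tool, which is why third-party applications can plug into Canvas without per-vendor custom work. A rough sketch, with hypothetical key, secret, endpoint, and launch values (a real launch carries more parameters from the spec):

```python
# Rough sketch of an LTI 1.x launch: an OAuth 1.0a-signed form POST.
# The key, secret, endpoint, and launch values are all hypothetical.
from urllib.parse import urlencode

from oauthlib.oauth1 import Client, SIGNATURE_TYPE_BODY

launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-101-link-1",
    "user_id": "student-42",
    "roles": "Learner",
}

client = Client(
    "tool-consumer-key",                 # key the LMS shares with the tool
    client_secret="tool-shared-secret",
    signature_type=SIGNATURE_TYPE_BODY,  # LTI signs the form body itself
)

# sign() adds the oauth_* fields (nonce, timestamp, HMAC-SHA1 signature);
# the tool provider recomputes the signature to verify the launch.
uri, headers, body = client.sign(
    "https://tool.example.com/launch",   # hypothetical tool endpoint
    http_method="POST",
    body=urlencode(launch_params),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(body)  # signed form body, ready to auto-POST from the browser
```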

Avoid Problems

From the beginning, Instructure designed its products to fully utilize a cloud architecture, and the same approach applies to product management and support services. Instructure has run essentially one software version for each product[3] from the beginning, and unlike most other higher ed LMS providers, it reaps the benefits of simpler software release management and bug fixing. Cloud is not just an issue of cost-effective scaling; it is also a matter of getting the software out of the way – just have it work.

Companies change as they grow, and I have covered when the company lost both founders and a high profile CTO. The company moves on, however, and I cannot find customers complaining (at least yet) that the company has changed and is ticking them off. They do have customer challenges, but so far these have been manageable challenges.

Pop quiz: Name the highest profile customer disaster (outage during exams or first week, broken implementation, major bugs, etc.) for Canvas.

It’s Not Complicated

I suspect that everything covered in this blog post has been said before, including at e-Literate. There is nothing complex or even nuanced here.

My biggest criticism at this year’s conference is that the keynotes were unfocused and didn’t share enough information about product roadmaps. It’s fine to not focus everything on technology and products, but come on, if you’re going to talk about empathy then tie it explicitly to how that concept affects your company’s approach to student-centered learning.

But despite the weak keynote and despite Josh Coates’ reputation as a jerk (he even referenced this in the keynote), consider the observation Michael made to me that Instructure is one of the very few companies whose employee reviews at Glassdoor rival (or even exceed) LinkedIn’s reviews. Trust me, this is not true for other ed tech companies.

[Figure: Instructure reviews on Glassdoor]

I typically don’t write blog posts this positive about ed tech companies, but at this point I think the market needs to realize just how well-managed Instructure is and how positive schools are as they adopt and use its LMS. So far Instructure has been a net positive for higher ed and K-12, but change has come too slowly to the rest of the ed tech market in response to Canvas. Competition is good.

  1. The chart shows the number of clients, which is essentially the number of contracts signed with institutions, school districts, or statewide systems adopting either Canvas or Bridge LMS products.
  2. Note: this includes some personal bar-napkin estimates; student counts and revenue are not reported by the company.
  3. It’s a little more complicated than just one software version based on test servers and client acceptance of changes, but the general idea holds in terms of understanding strategy.


Promising Research Results On Specific Forms Of Adaptive Learning / ITS

Fri, 2015-07-10 12:45

By Phil Hill

Recently I described an unpublished study by Dragan Gasevic and team on the use of the Knowillage / LeaP adaptive platform.[1] The context of the article was D2L’s misuse of the results, but the study itself is interesting in its findings that adaptive learning usage (specifically LeaP in addition to Moodle within an Intro to Chemistry course) can improve academic performance. I will share more when and if the results become public.

If we look to published research reports, there are other studies that back up the potential of adaptive approaches, but the most promising results appear to be for a subset of adaptive systems that provide not just content selection but also tutoring. Last year a research team from Simon Fraser University and Washington State University published a meta-analysis on Intelligent Tutoring Systems (ITS), which they described as having origins in 1970 with the development of SCHOLAR.[2] The meta-analysis looked at 107 studies involving 14,321 participants and found:

The use of ITS was associated with greater achievement in comparison with teacher-led, large-group instruction (g = .42), non-ITS computer-based instruction (g = .57), and textbooks or workbooks (g = .35). There was no significant difference between learning from ITS and learning from individualized human tutoring (g = –.11) or small-group instruction (g = .05). Significant, positive mean effect sizes were found regardless of whether the ITS was used as the principal means of instruction, a supplement to teacher-led instruction, an integral component of teacher-led instruction, or an aid to homework. Significant, positive effect sizes were found at all levels of education, in almost all subject domains evaluated, and whether or not the ITS provided feedback or modeled student misconceptions. The claim that ITS are relatively effective tools for learning is consistent with our analysis of potential publication bias.
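
A quick note on notation for readers outside the meta-analysis literature: these g values are standardized mean differences (Hedges’ g, if the report follows the usual convention), i.e., the gap between the treatment and comparison group means divided by the pooled standard deviation:

g = \frac{M_{\text{ITS}} - M_{\text{comparison}}}{SD_{\text{pooled}}}

So g = .42 against teacher-led, large-group instruction means ITS students scored roughly four-tenths of a pooled standard deviation higher – a moderate effect.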

Relationship of ITS and Adaptive Learning Software

Unlike most marketing and media descriptions of Adaptive Learning, the report is quite specific on defining what an Intelligent Tutoring System is and isn’t.

An ITS is a computer system that for each student:

  1. Performs tutoring functions by (a) presenting information to be learned, (b) asking questions or assigning learning tasks, (c) providing feedback or hints, (d) answering questions posed by students, or (e) offering prompts to provoke cognitive, motivational or metacognitive change
  2. By computing inferences from student responses constructs either a persistent multidimensional model of the student’s psychological states (such as subject matter knowledge, learning strategies, motivations, or emotions) or locates the student’s current psychological state in a multidimensional domain model
  3. Uses the student modeling functions identified in point 2 to adapt one or more of the tutoring functions identified in point 1

There are plenty of computer-based instruction (CBI) methods out there, but ITS relies on a multidimensional model of the student in addition to a model of the subject area (domain model). The report also calls out that CBI approaches that only model the student in one dimension of item response theory (IRT, more or less the model of a student’s ability to correctly answer specific questions) are not ITS in their definition. IRT can be one of the dimensions but not the only dimension.
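
To make the distinction concrete, here is a minimal illustrative sketch – my own, not drawn from the report or any actual product – of a one-dimensional IRT-style model next to the kind of multidimensional student model the report requires of an ITS:

from dataclasses import dataclass, field

@dataclass
class IRTOnlyModel:
    # A single dimension: estimated ability to answer items correctly.
    # Under the report's definition, a system tracking only this
    # is non-ITS CBI.
    ability: float = 0.0

@dataclass
class ITSStudentModel:
    # A persistent, multidimensional model of the student's psychological
    # states, inferred from responses. IRT ability can be one of the
    # dimensions, but not the only one.
    knowledge: dict = field(default_factory=dict)    # e.g., {"stoichiometry": 0.7}
    strategies: dict = field(default_factory=dict)   # e.g., {"self-explanation": 0.4}
    motivation: float = 0.5                          # hypothetical 0-1 scale
    affect: str = "neutral"                          # e.g., "frustrated", "engaged"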

A 2014 meta-analysis referred to by the above report further clarifies the conditions for a system to be an ITS as follows [emphasis added]:

VanLehn (2006) described ITS as tutoring systems that have both an outer loop and an inner loop. The outer loop selects learning tasks; it may do so in an adaptive manner (i.e., select different problem sequences for different students) based on the system’s assessment of each individual student’s strengths and weaknesses with respect to the targeted learning objectives. The inner loop elicits steps within each task (e.g., problem-solving steps) and provides guidance with respect to these steps, typically in the form of feedback, hints, or error messages.

For the sloppy field of Adaptive Learning, this means that the study looks at systems that model students, provide immediate feedback to students, and provide hints and support to students as they work through a specific task (inner loop). Adaptive Learning systems that only change the content or tasks presented to students adaptively (outer loop) do not qualify. Some examples of Adaptive Learning / ITS systems include McGraw-Hill’s ALEKS and AutoTutor. Knowillage / LeaP is an example of a system that is not an ITS.
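
For readers who want the two loops spelled out, here is a rough, runnable sketch; the function names and the toy mastery arithmetic are my own illustration, not any vendor’s actual algorithm:

def select_next_task(student_model, tasks):
    # Outer loop policy: pick the first task whose skill the model
    # rates below a mastery threshold; None means everything is mastered.
    for task in tasks:
        if student_model.get(task["skill"], 0.0) < 0.9:
            return task
    return None

def run_session(student_model, tasks, answer_fn):
    while (task := select_next_task(student_model, tasks)) is not None:
        for step in task["steps"]:  # inner loop: step-level tutoring
            response = answer_fn(step["prompt"])
            correct = response == step["expected"]
            # Immediate feedback, hints, or error messages per step.
            print("Correct!" if correct else "Hint: " + step["hint"])
            # Update the student model from the step-level evidence.
            skill = task["skill"]
            delta = 0.3 if correct else -0.1
            student_model[skill] = min(1.0, max(0.0, student_model.get(skill, 0.0) + delta))

tasks = [{"skill": "fractions",
          "steps": [{"prompt": "1/2 + 1/4 = ?", "expected": "3/4",
                     "hint": "Convert to a common denominator first."}]}]
run_session({}, tasks, answer_fn=lambda prompt: "3/4")

A system that implemented only select_next_task – swapping content without the step-level feedback inside the inner loop – would fall outside the ITS definition used in the meta-analysis.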

Promising Findings

The results showed “the use of ITS produced moderate, statistically significant mean effect sizes” compared to large-group human instruction, individual CBI, and textbooks / workbooks. The results showed no statistically significant mean effect sizes compared to small-group human instruction and individual tutoring. In other words, the study shows improvements of ITS over large lecture classes, non-ITS software tools, and textbooks / workbooks but no real difference with small classes or individual tutors.

[Figure 1 from the ITS meta-analysis]

What is quite interesting is that the results hold across multiple intervention approaches. Using ITS as principal instruction, integrated class instruction, separate in-class activities, supplementary after-class instruction, or homework gives similar positive results.

Why Does ITS Give Positive Results?

The report hypothesizes that the primary reasons ITS provides positive results are as follows [formatting added, excerpted]:

[ITS shared characteristics with other forms of CBI] Specifically, they have attributed the effectiveness of CBI to:

  • greater immediacy of feedback (Azevedo & Bernard, 1995),
  • feedback that is more response-specific (Sosa, Berger, Saw, & Mary, 2011),
  • greater cognitive engagement (Cohen & Dacanay, 1992),
  • more opportunity for practice and feedback (Martin, Klein, & Sullivan, 2007),
  • increased learner control (Hughes et al., 2013), and
  • individualized task selection (Corbalan, Kester, & Van Merriënboer, 2006).

[snip] The prior quantitative reviews also concluded that using ITS is associated with greater achievement than using non-ITS CBI. We hypothesize that multidimensional student modeling enables ITS to outperform non-ITS CBI on each of its advantages cited in the previous paragraph.

[snip] ITS may also be more effective than non-ITS CBI in the sense that ITS can extend the general advantages of CBI to a wider set of learning activities. For example, the ability to score and provide individualized comments on a student’s essay would extend the advantage of immediate feedback well beyond what is possible in non-ITS CBI.

Student modeling also enables ITS to interact with students at a finer level of granularity than test-and-branch CBI systems.

These are very encouraging results for the field of ITS and for a subset of Adaptive Learning. I view the results not as saying that adaptive learning is the way to go, but rather as evidence that adaptive learning applied in a tutoring role can improve academic performance in the right situations.

We need more evidence-based evaluation of different teaching strategies and edtech applications.

  1. When the study started, Knowillage was an independent company; mid-way through the study, D2L bought Knowillage and renamed the product LeaP.
  2. I would link to the G+ post by George Station here if it were not for the ironic impossibility of searching within that platform.


Unizin One Year Later: View of contract reveals . . . nothing of substance

Thu, 2015-07-09 09:18

By Phil Hill | More Posts (343)

I’ve been meaning to write an update post on Unizin, as we broke the story here at e-Literate in May 2014 and Unizin went public a month later. It’s one year later, and Unizin remains the most expensive method of getting the Canvas LMS. There are also plans for a Content Relay and an Analytics Relay, as seen in an ELI presentation, but the actual dates keep slipping.

[Slide: Unizin roadmap]

e-Literate was able to obtain a copy of the Unizin contract, at least for the founding members, through a public records request. There is nothing to see here – because there is nothing to see here. The essence of the contract is that a university pays $1.050 million to become a member. The member university then has a right (but not an obligation) to select and pay for actual services. Based on the contract, membership gets you . . . membership. Nothing else.

What is remarkable to me is the portion of the contract spelling out obligations. Section 3.1 calls out that “As a member of the Consortium, University agrees to the following:” and lists:

  • complying with Unizin bylaws and policies;
  • paying the $1.050 million; and
  • designating points of contact and representation on the board.

Unizin agrees to nothing. There is literally no description of what Unizin provides beyond this description [emphasis added]:

This Agreement establishes the terms of University’s participation in the Consortium, an unincorporated member-owned association created to provide Consortium Members access to an evolving ecosystem of digitally enabled educational systems and collaborations.

What does access mean? For the past year the only service available has been Canvas as an LMS. When and if the Content Relay and Analytics Relay become available, member institutions will have the right to pay for those. Membership in Unizin gives a school input into defining those services as well.

As we described last year, paying a million dollars to join Unizin does not give a school any of the software. The school has to pay licensing & hosting fees for each service in addition to the initial investment.

The contract goes out of its way to point out that Unizin actually provides nothing. While this is contract legalese, it’s important to note this description in section 6.5 [original emphasized in ALL CAPS but shared here at lower volume].[1]

Consortium operator is not providing the Unizin services, or any other services, licenses, products, offerings or deliverables of any kind to University, and therefore makes no warranties, whether express or implied. Consortium Operator expressly disclaims all warranties in connection with the Unizin services and any other services, licenses, products, offerings or deliverables made available to University under or in connection with this agreement, both express and implied, …[snip]. Consortium Operator will not be liable for any data loss or corruption related to use of the Unizin services.

This contract appears to be at odds with the oft-stated goal of giving institutions control and ownership of their digital tools (also taken from the ELI presentation).

We have a vested interest in staying in control of our data, our students, our content, and our reputation/brand.

I had planned to piece together clues and speculate on what functionality the Content Relay will provide, but given the delays it is probably best to just wait and see. I’ve been told by Unizin insiders and heard publicly at conference presentations since February 2015 about the imminent release of Content Relay, and right now we just have slideware. I have asked for a better description of what functionality the Content Relay will provide, but this information is not yet available.

Unizin leadership and board members understand this quandary. As Bruce Maas, CIO at U Wisconsin, put it to me this spring, his job promoting and explaining Unizin will get a lot easier when there is more to offer than just Canvas as the LMS.

For now, here is the full agreement as signed by the University of Florida [I have removed the signature page and contact information page as I do not see the need to make these public].

Download (PDF, 587KB)

  1. Also note that Unizin is an unincorporated part of Internet2. Internet2 is the “Consortium Operator” and signer of this agreement.


The Importance Of Student Control Of Learning, Especially For Working Adults

Tue, 2015-07-07 13:32

By Phil Hill | More Posts (343)

When giving keynotes at conferences over the past two years, I have observed that some of the best non-verbal feedback comes when I point out that personalized and adaptive learning does not equal black-box algorithms choosing content for students. Yes, there are plenty of approaches pitching that solution (Knewton in its early days being the best-known, if not the most current, example), but there are other approaches designed to give faculty or instructional designers control over learning paths, or even to give students control. There seems to be a sense of relief, particularly from faculty members, when discussing the latter approach.

In the Empire State College case study on e-Literate TV, I found the conversation Michael had with [faculty member] Maya Richardson to be a great example of not just giving faculty insight into student learning but also giving students control over their own learning. As Maya explains, this is particularly important for the working adult population going back to school. The software used in this pedagogical approach is CogBooks.

Michael Feldstein: While so-called personalized learning programs are sometimes criticized for moving students lockstep through a linear process, Maya emphasizes the choice and control that students have regarding how they go through the content.

Maya Richardson: What it is—it’s a concept mapping, so they take concepts here, concepts here, and then there’s a split-off, and those concepts then split off and then split off and split off. And then, depending on the student, now students can go, “OK, I understood that concept. I already know that concept, so I don’t need to go to that one right now. I can skip and go here.” This is where the individualized and personalized learning comes in—like a smorgasbord, you pick and choose what you want to learn.

And then you come in; you do the discussion, and you either have more to add to it and a greater enrichment of the experience for yourself but also for your classmates. Then there are those who go, “OK, I need to go through each one of these, step by step, and learn each one, and then move down to learning these and then these and then these and then these,” and then at the end, they’ve gotten so much more out of it.

Maya then goes on to describe her visibility enabled by this pedagogical approach – not just of which concepts the student has mastered but also the learning process and choices that the student makes.

Maya Richardson: It’s that kind of opportunity that I can now watch and go, “OK, so you’re the kind of learner that I can just basically let you go and do what you need to do. I am not going to be interrupting your learning path because you have a very positive learning path. I can watch you do this. It’s a great pattern. You’re going for it,” and I’m just going, “Wonderful. Just come in, do the discussion, do your test,” and I’m like, “A-student, perfect, great, way to go.” Then I see the ones that are sort of sporadic. They come in, they touch and go, and I go, “OK, let me see how you’re doing.”

There’s a lot more in this conversation, but I want to skip ahead a minute or so in the conversation to this key point about student control, or agency.

Michael Feldstein: Maya and her colleagues are thoughtful about how this kind of software fits with the holistic approach that ESC takes towards education.

Maya Richardson: The personalized learning part of it is taking ownership. I think it motivates. As an adult learner, it’s really important to find that you have some control over—when I go in, I know what I want to learn. I hope I know what I want to learn, and I hope I learn it at the end.

There are disciplines and contexts where it makes sense to have adaptive algorithms choose appropriate content, but I find that too often this is assumed for all of personalized learning. This example from Empire State College illuminates the importance of student control, especially for the growing population of working adults.


D2L Again Misusing Academic Data For Brightspace Marketing Claims

Thu, 2015-07-02 05:56

By Phil Hill | More Posts (342)

At this point I’d say that we have established a pattern of behavior.

Michael and I have been quite critical of D2L and their pattern of marketing behavior that is misleading and harmful to the ed tech community. Michael put it best:

I can’t remember the last time I read one of D2L’s announcements without rolling my eyes. I used to have respect for the company, but now I have to make a conscious effort not to dismiss any of their pronouncements out-of-hand. Not because I think it’s impossible that they might be doing good work, but because they force me to dive into a mountain of horseshit in the hopes of finding a nugget of gold at the bottom. Every. Single. Time. I’m not sure how much of the problem is that they have decided that they need to be disingenuous because they are under threat from Instructure or under pressure from investors and how much of it is that they are genuinely deluding themselves. Sadly, there have been some signs that at least part of the problem is the latter situation, which is a lot harder to fix. But there is also a fundamental dishonesty in the way that these statistics have been presented.

Well, here’s the latest. John Baker put out a blog post called This Isn’t Your Dad’s Distance Learning Program with this theme:

But rather than talking about products, I think it’s important to talk about principles. I believe that if we’re going to use education technology to close the attainment gap, it has to deliver results. That — as pragmatic as it is — is the main guiding principle.

The link about “deliver results” leads to this page (excerpted as it existed prior to June 30th, for reasons that will become apparent).

[Screenshot of D2L’s “Why Brightspace? Results.” page]

So the stage is set – use ed tech to deliver results, and Brightspace (D2L’s learning platform, or LMS) delivers results. Now we come to the proof, including these two examples.

[Screenshot: CSULB and UWM claims from the Results page]

According to California State University-Long Beach, retention has improved 6% year-over-year since they adopted Brightspace.[snip]

University of Wisconsin-Milwaukee reported an increase in the number of students getting A’s and B’s in Brightspace-powered courses by over 170%

Great results, no? Let’s check the sources. Ah . . . clever marketing folks – no supporting data or even hyperlinks to learn more. Let’s just accept their claims and move along.

. . .

OK, that was a joke.

CSU Long Beach

I contacted CSU Long Beach to learn more, but I could find no one who knew where this data came from or even that D2L was making this claim. I shared the links and context, and they went off to explore. Today I got a message saying that the issue has been resolved, but that CSU Long Beach would make no public statements on the matter. Fair enough – the observations below are my own.

If you look at that Results page now, the CSU Long Beach claim is no longer there – down the memory hole[1] with no explanation, replaced by a new claim about Mohawk College.

[Screenshot: revised Results page, with Mohawk College replacing CSULB]

While CSU Long Beach would not comment further on the situation, there are only two plausible explanations for the issue being resolved by D2L taking down the data. Either D2L was using legitimate data that they were not authorized to use (best case scenario) or D2L was using data that doesn’t really exist. I could speculate further, but the onus should be on D2L since they are the ones who made the claim.

UW Milwaukee

I also contacted UW Milwaukee to learn more, and I believe the data in question is from the U-Pace program, which has been fully documented.[2][3]

The U-Pace instructional approach combines self-paced, mastery-based learning with instructor-initiated Amplified Assistance in an online environment.

The control group was traditionally-taught (read that as large lecture classes) for Intro to Psychology.

From the EDUCAUSE Quarterly article on U-Pace, the number of A’s and B’s for disadvantaged students increased 163%. This is the closest data I can find to back up D2L’s claim of a 170% increase.

[Chart: U-Pace results from the EDUCAUSE Quarterly article]

There are three immediate problems here (ignoring the fact that I can’t find improvements of more than 170% – I’ll take 163%).

  1. First, the data claim is missing the context of “for underprepared students” who exhibited much higher gains than prepared students. That’s a great result for the U-Pace program, but it is also important context to include.
  2. The program is an instructional change, moving from large lecture classes to a self-paced, mastery-learning approach. That instructional change is the intervention, not the use of the LMS. In fact, D2L was the LMS used in both the control group and the U-Pace treatment group.
  3. The program goes out of its way to call out the minimal technology needed to adopt the approach, and they even list Blackboard, Desire2Learn, and Moodle as examples of LMS’s that work with the following conditions:

[Excerpt: U-Pace LMS requirements]

This is an instructional approach that claims to be LMS neutral with D2L’s Brightspace used in both the control group and treatment group, yet D2L positions the results as proof that Brightspace gets results! It’s wonderful that Brightspace LMS worked during the test and did not get in the way, but that is a far cry from Brightspace “delivering results”.

The Pattern

We now have to add these two cases to the Lone Star College and LeaP examples. In all cases, there is a pattern.

  1. D2L makes a marketing claim implying that its LMS, Brightspace, delivers results, referring to academic outcomes data without supporting data or references.
  2. I contact school or research group to learn more.
  3. The data is either misleading (the treatment is not LMS usage but rather an instructional approach, adaptive learning technology, or student support software) or just plain wrong (and has since been taken down).
  4. In all cases, the results could have been presented honestly, showing the appropriate context, links for further reading, and explanation of the LMS role. But they were not presented honestly.
  5. e-Literate blog post almost writes itself.
  6. D2L moves on to make their next claim, with no explanations.

I understand that other ed tech vendors make marketing claims that cannot always be tied to reality, but these examples cross a line. They misuse and misrepresent academic outcomes data – whether based on public research or on internal research – and essentially take credit for their technology “delivering results”.

This is the misuse of someone else’s data for corporate gain. Institutional data. Student data. That is far different than using overly-positive descriptions of your own data or subjective observations. That is wrong.

The Offer

For D2L company officials, I have an offer.

  1. If you have answers or even corrections about these issues, please let us know through your own blog post or comments to this blog.
  2. If you find any mistakes in my analysis, I will write a correction post.
  3. We are happy to publish any reply you make here on e-Literate.

  1. Their web page does not allow archiving with the Wayback Machine, but I captured screenshots in anticipation of this move.
  2. Note – While I assume this claim derives from U-Pace, I am not sure. It is the closest example of real data that I could find, thanks to a helpful tip from UW-M staff. I’ll give D2L the benefit of the doubt despite their lack of reference.
  3. And really, D2L marketing staff should learn how to link to external sources. It’s good Internet practice.


U of Phoenix: Losing hundreds of millions of dollars on adaptive-learning LMS bet

Tue, 2015-06-30 09:17

By Phil Hill | More Posts (341)

It would be interesting to read (or write) a post mortem on this project some day.

Two and a half years ago I wrote a post describing the University of Phoenix investment of a billion dollars on new IT infrastructure, including hundreds of millions of dollars spent on a new, adaptive-learning LMS. In another post I described a ridiculous patent awarded to Apollo Group, parent company of U of Phoenix, that claimed ownership of adaptive activity streams. Beyond the patent, Apollo Group also purchased Carnegie Learning for $75 million as part of this effort.

And that’s all going away, as described in this morning’s Chronicle article on the company’s plan to shrink to just 150,000 students (from a high of 460,000 several years ago).

And after spending years and untold millions on developing its own digital course platform that it said would revolutionize online learning, Mr. Cappelli said the university would drop its proprietary learning systems in favor of commercially available products. Many Apollo watchers had long expected that it would try to license its system to other colleges, but that never came to pass.

I wonder what the company will do with the patent and with Carnegie Learning assets now that they’re going with commercial products. I also wonder who is going to hire many of the developers. I don’t know the full story, but it is pretty clear that even with a budget of hundreds of millions of dollars and adjunct faculty with centralized course design, the University of Phoenix did not succeed in building the next generation learning platform.

Update: Here is the full quote from the earnings call:

Fifth. We plan to move away from certain proprietary and legacy IT systems to more efficiently meet student and organizational needs over time. This means transitioning an increased portion of our technology portfolio to commercial software providers, allowing us to focus more of our time and investment on educating and student outcomes. While Apollo was among the first to design an online classroom and supporting system, in today’s world it’s simply not as efficient to continue to support complicated, custom-designed systems particularly with the newer quality systems we have more recently found with of the self providers that now exist within the marketplace. This is expected to reduce costs over the long term, increase operational efficiency and effectiveness while still very much supporting a strong student experience.


ASU Is No Longer Using Khan Academy In Developmental Math Program

Mon, 2015-06-29 17:37

By Phil Hill | More Posts (340)

In these two episodes of e-Literate TV, we shared how Arizona State University (ASU) started using Khan Academy as the software platform for a redesigned developmental math course[1] (MAT 110). The program was designed in Summer 2014 and ran through Fall 2014 and Spring 2015 terms. Recognizing the public information shared through e-Literate TV, ASU officials recently informed us that they had made a programmatic change and will replace their use of Khan Academy software with McGraw-Hill’s LearnSmart software that is used in other sections of developmental math.

To put this news in context, here is the first episode’s mention of Khan Academy usage.

Phil Hill: The Khan Academy program that you’re doing, as I understand, it’s for general education math. Could you give just a quick summary of what the program is?

Adrian Sannier: Absolutely. So, for the last three-and-a-half years, maybe four, we have been using a variety of different computer tutor technologies to change the pedagogy that we use in first-year math. Now, first-year math begins with something we call “Math 110.” Math 110 is like if you don’t place into either college algebra, which has been the traditional first-year math course, or into a course we call “college math,” which is your non-STEM major math—if you don’t place into either of those, then that shows you need some remediation, some bolstering of some skills that you didn’t gain in high school.

So, we have a course for that. Our first-year math program encompasses getting you to either the ability to follow a STEM major or the ability to follow majors that don’t require as intense of a math education. What we’ve done is create an online mechanism to coach students. Each student is assigned a trained undergraduate coach under the direction of our instructor who then helps that student understand how to use the Khan Academy and other tools to work on the skills that they show deficit in and work toward being able to satisfy the very same standards and tests that we’ve always used to ascertain whether a student is prepared for the rest of their college work.

Luckily, the episode on MAT 110 focused mostly on the changing roles of faculty members and TAs when using an adaptive software approach, rather than on Khan Academy itself. After reviewing the episode again, I believe that it stands on its own and is relevant even with the change in software platform. Nevertheless, I appreciate that ASU officials were proactive in letting me know about this change, so that we can document it here and in the e-Literate TV transmedia.

The Change

Since the change has not been shared outside of this notification (limiting my ability to do research and analysis), I felt the best approach would be to again interview Adrian Sannier, Chief Academic Technology Officer at ASU Online. Below is the result of an email interview, followed by short commentary [emphasis added].

Phil Hill: Thanks for agreeing to this interview to update plans on the MAT 110 course featured in the recent e-Literate TV episode. Could you describe the learning platforms used by ASU in the new math programs (MAT 110 and MAT 117 in particular) as well as describe any changes that have occurred this year?

Adrian Sannier: Over the past four years, ASU has worked with a variety of different commercially available personalized math tutors from Knewton, Pearson, McGraw Hill and the Khan Academy applied to 3 different courses in Freshman Math at ASU – College Algebra, College Math and Developmental Math. Each of these platforms has strengths and weaknesses in practice, and the ASU team has worked closely with the providers to identify ways to drive continuous improvement in their use at ASU.

This past year ASU used a customized version of Pearson’s MyMathLab as the instructional platform for College Algebra and College Math. In Developmental Math, we taught some sections using the Khan Academy Learning Dashboard and others using McGraw Hill’s LearnSmart environment.

This Fall, ASU will be using the McGraw Hill platform for Developmental Math and Pearson’s MyMathLab for College Algebra and College Math. While we also achieved good results with the Khan Academy this past year, we weren’t comfortable with our current ability to integrate the Khan product at the institutional level.

ASU is committed to the personalized adaptive approach to Freshman mathematics instruction, and we are continuously evaluating the product space to identify the tools that we feel will work best for our students.

Phil Hill: I presume this means that ASU’s usage of McGraw Hill’s LearnSmart for Developmental Math will continue and also expand to essentially replace the usage of Khan Academy. Is this correct? If so, what do you see as the impact on faculty and students involved in the course sections that previously used Khan Academy?

Adrian Sannier: That’s right Phil. Based on our experience with the McGraw Hill product we don’t expect any adverse effects.

Phil Hill: Could you further explain the comment “we weren’t comfortable with our current ability to integrate the Khan product at the institutional level”? I believe that Khan Academy’s API approach is more targeted to B2C [business-to-consumer] applications, allowing individual users to access information rather than B2B [business-to-business] enterprise usage, whereas McGraw Hill LearnSmart and others are set up for B2B usage from an API perspective. Is this the general issue you have in mind?

Adrian Sannier: That’s right Phil. We’ve found that the less cognitive load an online environment places on students the better results we see. Clean, tight integrations into the rest of the student experience result in earlier and more significant student engagement, and better student success overall.

Notes

Keep in mind that ASU is quite protective of its relationship with multiple software vendors and that they go out of their way to not publicly complain or put their partners in a bad light, even if a change is required as in MAT 110. Adrian does make it clear, however, that the key issue is the ability to integrate reliably between multiple systems. As noted in the interview, I think a related issue here is a mismatch of business models. ASU wants enterprise software applications where they can deeply integrate with a reliable API to allow a student experience without undue “cognitive load” of navigating between applications. Khan Academy’s core business model relies on people navigating to their portal on their website, and this does not fit the enterprise software model. I have not interviewed Khan Academy, but this is how it looks from the outside.

There is another point to consider here. While I can see Adrian’s argument that “we don’t expect any adverse effects” in the long run, I do think there are switching costs in the short term. As Sue McClure told me via email, as an instructor she spent significantly more time than usual on this course due to course design and ramping up the new model. In addition, ASU added 11 TAs for the course sections using Khan Academy. These people have likely learned important lessons about supporting students in an adaptive learning setting, but a great deal of their Khan-specific investment of time is now lost. Plus, they will need to spend time learning LearnSmart before getting fully comfortable in that environment.

Unfortunately, with the quick change, we might not see hard data to determine whether the changes were working. I believe ASU’s plan was to analyze and publish the results from this new program after the third term, which now will not happen.

If I find out more information, I’ll share it here.

  1. The terms remedial math and developmental math are interchangeable in this context.


Google Classroom Addresses Major Barrier To Deeper Higher Ed Adoption

Mon, 2015-06-29 11:28

By Phil Hill | More Posts (339)

A year ago I wrote about Google Classroom, speculating whether it would affect the institutional LMS market in higher education. My initial conclusion:

I am not one to look at Google’s moves as the end of the LMS or a complete shift in the market (at least in the short term), but I do think Classroom is significant and worth watching. I suspect this will have a bigger impact on individual faculty adoption in higher ed or as a secondary LMS than it will on official institutional adoption, at least for the next 2 – 3 years.

And my explanation [emphasis added]:

But these features are targeted at innovators and early adopter instructors who are willing to fill in the gaps themselves.

  1. The course creation, including setting up of rosters, is easy for an instructor to do manually, but it is manual. There has been no discussion that I can find showing that the system can automatically create a course, including roster, and update over the add / drop period.
  2. There is no provision for multiple roles (student in one class, teacher in another) or for multiple teachers per class.
  3. The integration with Google Drive, especially with Google Docs and Sheets, is quite intuitive. But there is no provision for PDF or MS Word docs or even publisher-provided courseware.
  4. There does not appear to be a gradebook – just grading of individual assignments. There is a button to export grades, and I assume that you can combine all the grades into a custom Google Sheets spreadsheet or even pick a GAFE gradebook app. But there is no consistent gradebook available for all instructors within an institution to use and for students to see consistently.

Well, today Google announced a new Google Classroom API that directly addresses the limitation in bullet #1 above and indirectly addresses #4.

The Classroom API allows admins to provision and manage classes at scale, and lets developers integrate their applications with Classroom. Until the end of July, we’ll be running a developer preview, during which interested admins and developers can sign up for early access. When the preview ends, all Apps for Education domains will be able to use the API, unless the admin has restricted access.

By using the API, admins will be able to provision and populate classes on behalf of their teachers, set up tools to sync their Student Information Systems with Classroom, and get basic visibility into which classes are being taught in their domain. The Classroom API also allows other apps to integrate with Classroom.

Google directly addresses course roster management in the announcement; in fact, this appears to be the primary use case they had in mind. I suspect this by itself will have a big impact in the K-12 market (I would love to hear John Watson’s take if he addresses it in his blog), making district-wide and school-wide Google Classroom adoptions far more manageable.
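
To make the admin use case concrete, here is a minimal sketch of what SIS-driven provisioning could look like using the current Python client libraries; the domain, accounts, and credential file here are hypothetical, and the exact details may shift while the API is in developer preview (this assumes a service account with domain-wide delegation granted by the Apps admin):

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = [
    "https://www.googleapis.com/auth/classroom.courses",
    "https://www.googleapis.com/auth/classroom.rosters",
]

creds = service_account.Credentials.from_service_account_file(
    "classroom-sync.json", scopes=SCOPES
).with_subject("admin@example-district.edu")  # act as a domain admin

classroom = build("classroom", "v1", credentials=creds)

# Provision a course on behalf of a teacher...
course = classroom.courses().create(body={
    "name": "Algebra I - Period 2",
    "ownerId": "teacher@example-district.edu",
    "courseState": "PROVISIONED",
}).execute()

# ...and enroll a student pulled from the SIS roster.
classroom.courses().students().create(
    courseId=course["id"],
    body={"userId": "student@example-district.edu"},
).execute()

A nightly job that walks the SIS enrollment feed with these two calls would cover the add / drop problem flagged in bullet #1 above.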

The potential is also there for a third party to develop and integrate a viable grade book application available to an entire institution. While this could partially be done by the Google Apps for Education (GAFE) ecosystem, that is a light integration that doesn’t allow deep connection between learning activities and grades. The new API should allow for deeper integrations, although I am not sure how much of the current Google Classroom data will be exposed.

I still do not see Google Classroom as a current threat to the higher ed institutional LMS market, but it is getting closer. Current ed tech vendors should watch these developments.

Update: Changed Google Apps for Education acronym from GAE to GAFE.


How Student and Faculty Interviews Were Chosen For e-Literate TV Series

Mon, 2015-06-29 06:47

By Phil Hill | More Posts (338)

As part of our e-Literate TV set of case studies on personalized learning, Michael and I were fully aware that Arizona State University (ASU) was likely to generate the most controversy due to ASU’s aggressive changes to the concept of a modern research university. As we described in this introductory blog post:

Which is one reason why we’re pretty excited about the release of the first two case studies in our new e-Literate TV series on the trend of so-called “personalized learning.” We see the series as primarily an exercise in journalism. We tried not to hold onto any hypothesis too tightly going in, and we committed to reporting on whatever we found, good or bad. We did look for schools that were being thoughtful about what they were trying to do and worked with them cooperatively, so it was not the kind of journalism that was likely to result in an exposé. We went in search of the current state of the art as practiced in real classrooms, whatever that turned out to be and however well it is working.

As part of the back-and-forth discussions with the ASU case study release, John Warner brought up a good point in response to my description that our goal was “Basically to expose, let you form own opinions”.

@PhilOnEdTech Can't form opinion without a more thorough accounting. Ex. How did you choose students and fac. to talk to?

— John Warner (@biblioracle) June 1, 2015

Can’t form opinion without a more thorough accounting. Ex. How did you choose students and fac. to talk to?

Let’s explore this subject for the four case studies already released. Because the majority of interviewees shared positive experiences, I’ll highlight some of the skeptical, negative, or cautionary views captured in each case study.

Our Approach To Lining Up Interviews

When we contacted schools to line up interviews on campus, it was natural to expect that the staff would tend to find the most positive examples of courses, faculty, and students to share. As described above, we admit that we looked for schools with thoughtful approaches (and therefore courses), but we needed to try to expose some contrary or negative views as well. This is not to play gotcha journalism, nor to create a false impression of equally good / equally bad perspectives. But it is important to capture that not everyone is pleased with the changes, and these skeptics are a good source for exposing risks and issues to watch. Below is the key section of the email sent to each school we visited.

The Case Study Filming Process
Each case study will include a couple of parts. First, we will interview the college leadership—whoever the school deems appropriate—to provide an overview of the school, its mission and history, its student body, and how “personalized education” (however that school defines the term) fits into that picture. If there are particular technology-driven initiatives related to personalized learning, then we may talk about those a bit. Second, we will want to talk with some teachers and students, probably in a mixed group. We want to get some sample reactions from them about what they think is valuable about the education they get (or provide) at the school, how “personalization” fits into that, and how, when, and why they use or avoid technology in the pursuit of the educational goals. We’re not trying either to show “best/worst” here or to provide an “official” university position, but rather to present a dialog representing some of the diverse views present on the campus.

Campus Input on the Filming
In order for the project to have integrity, MindWires must maintain editorial independence. That said, our goal for the case studies is to show positive examples of campus communities that are authentically engaged in solving difficult educational challenges. We are interested in having the participants talk about both successes and failures, but our purpose in doing so is not to pass judgment on the institution but rather to enable viewers to learn from the interviewees’ experiences. We are happy to work closely with each institution in selecting the participants and providing a general shape to the conversation. While we maintain editorial control over the final product, if there are portions of the interviews that make the institution uncomfortable then we are open to discussing those issues. As long as the institution is willing to allow an honest reflection of their own challenges and learning experiences as an educational community, then we are more than willing to be sensitive to and respectful of concerns that the end product not portray the institution in a way that might do harm to the very sort of campus community of practice that we are trying to capture and foster with our work.

As an example of what “willing to be sensitive to and respectful of concerns” means in practice, one institution expressed a concern that they did not want their participation in this personalized learning series to be over-interpreted as a full-bore endorsement of pedagogical change by the administration. The school was at the early stages of developing a dialog with faculty on where they want to go with digital education, and the administration did not want to imply that they already knew the direction and answers. We respected this request and took care to not imply any endorsement of direction by the administration.

Below are some notes on how this played out at several campuses.

Middlebury College

As described in our introductory blog post:

Middlebury College, the first school we went to when we started filming, was not taking part in any cross-institutional (or even institutional) effort to pilot personalized learning technologies and not the kind of school that is typically associated with the “personalized learning” software craze. Which is exactly why we wanted to start there. When most Americans think of the best example of a personalized college education, they probably think of an elite New England liberal arts college with a student/teacher ratio of under nine to one. We wanted to go to Middlebury because we wanted a baseline for comparison. We were also curious about just what such schools are thinking about and doing with educational technologies.

Middlebury College staff helped identify one faculty member who is experimenting with technology use in his class, with some interesting student feedback, which we highlighted in Middlebury Episode 2. They also found two faculty members for a panel discussion, along with two students who had previously expressed strong opinions on where technology does and does not fit in their education. The panel discussion was highlighted in Middlebury Episode 3.

As this case study did not have a strong focus on a technology-enabled program, we did not push the issue of finding skeptical faculty or students; instead, we showed that technology was not absent from the campus conversation about how to improve education.

The administration did express some cautionary notes on the use of technology to support “personalized learning” as captured in this segment:

Essex County College

By way of contrast, our second case study was at Essex County College, an urban community college in Newark, New Jersey. This school has invested approximately $1.2 million of its own money along with a $100 thousand Gates Foundation grant to implement an adaptive learning remedial math course designed around self-regulated learning. Our case study centered on this program specifically.

Of course, the place where you really expect to see a wide range of incoming skills and quality of previous education is in public colleges and universities, and at community colleges in particular. At Essex County College, 85% of incoming students start in the lowest level developmental math course. But that statistic glosses over a critical factor, which is that there is a huge range of skills and abilities within that 85%. Some students enter almost ready for the next level, just needing to brush up on a few skills, while others come in with math skills at the fourth grade level. On top of that, students come in with a wide range of metacognitive skills. Some of them have not yet learned how to learn, at least this subject in this context.

Given the controversial nature of using adaptive learning software in a class, we decided to include a larger number of student voices in this case study. Douglas Walcerz, the faculty and staff member who designed the course, gave us direct access to the entire class. We actively solicited students to participate in interviews, as one class day was turned over to e-Literate TV video production and interviews, with the rest of the class watching their peers describe their experiences.

As we did the interviews, almost all students had a very positive view of the new class design, particularly the self-regulated learning aspect and the resulting sense of empowerment. What was missing were the voices of students who were not comfortable with the new approach. For the second day, we actively solicited students who could provide a negative view. The result was shared in this interview:

As for faculty, it was easier to find some skeptical or cautionary voices, which we highlighted here.

As described above, our intent was not to present a false balance but rather to include diverse viewpoints to help other schools know the issues to explore.

Arizona State University

At ASU we focused on two courses in particular, Habitable Worlds highlighted in episode 2 and remedial math (MAT 110) using Khan Academy software highlighted in episode 3.

We did have some difficulty getting on-campus student interviews due to both of these being online courses. For MAT 110 we did find one student who expressed both positive and negative views on the approach, as shown in this episode.

Empire State College

Like ASU, Empire State College presented a challenge for on-campus video production given the nature of its all-online courses. We worked with ESC staff to line up students for interviews, with the best stories coming from the effects of prior learning assessment on students.

It was easier and more relevant to explore the different perspectives on personalized learning from faculty and staff themselves, as evidenced by the following interview. ESC offered him up – proudly – knowing that he would be an independent voice. They understood what we meant in that email and were not afraid to show the tensions they are wrestling with on-camera. Not every administration will be as brave as ESC’s, but we are finding that spirit to be the norm rather than the exception.

Upcoming Episodes

It’s also worth pointing out the role of selecting colleges in the first place, which is not just about diversity. We know that different schools are going to have different perspectives, and we pick them carefully to set up a kind of implicit dialog. We know, for example, that ASU is going to give a full-throated endorsement of personalized learning software used at scale. So we balance them against Empire State College, which has always been about one-on-one mentoring in its design.

Hopefully this description of our process will help people like John Warner who need more information before forming their own opinions. At the least, consider this further documentation of the process. We are planning to release one additional case study – the University of California at Davis, in early July – as well as two analysis episodes. We’ll share more information once new episodes are released.


Prior Learning Assessments Done Right

Sun, 2015-06-28 21:53

By Michael Feldstein | More Posts (1033)

This post has nothing to do with educational technology but everything to do with the kind of humane and truly personal education that we should be talking about when we throw around phrases like “personalized education.” Prior Learning Assessments (PLAs) go hand-in-glove with the trendy Competency-Based Education (CBE). The basic idea is that you test students on what they have learned in their own lives and give them credit toward their degrees based on what they already know. But it is often executed in a fairly mechanical way. Students are tested against the precise curriculum or competencies that a particular school has chosen for a particular class. Not too long ago, I heard somebody say, “We don’t need more college-ready students; we need more student-ready colleges.” In a logical and just world, we would start with what the student knows, rather than with what one professor or group of professors decided one semester would be “the curriculum,” and we would give the student credit for whatever college-level knowledge she has.

It turns out that’s exactly what Empire State College (ESC) does. When we visited the college for an e-Literate TV case study, we learned quite a bit about this program and, in particular, about their PLA program for women of color.

But before we get into that, it’s worth backing up and looking at the larger context of ESC as an institution. Founded in 1971, the school was focused from the very beginning on “personalized learning”—but personalized in a sense that liberal intellectuals from the 1960s and 1970s would recognize and celebrate. Here’s Alan Mandell, who was one of the pioneering members of the faculty at ESC, on why the school has “mentors” rather than “professors”:

Alan Mandell: Every single person is called a mentor.

It’s valuable because of an assumption that is pretty much a kind of critique of the hierarchical model of teaching and learning that was the norm and remains the norm where there is a very, very clear sense of a professor professing to a student who is kind of taking in what one has to say.

Part of the idea of Empire State, and other institutions, more and more, is that there was something radically wrong with that. A, that students had something to teach us, as faculty, and that faculty had to learn to engage students in a more meaningful way to respond to their personal, academic, professional interests. It was part of the time. It was a notion of a kind of equality.

This was really interesting to me actually because I came here, and I was 25 years old. Every single student was older than I was, so the idea of learning from somebody else was actually not very difficult at all. It was just taken for granted. People would come with long professional lives, doing really interesting things, and I was a graduate student.

I feel, after many years, that this is still very much the case—that this is a more equal situation of faculty serving as guides to students who bring in much to the teaching and learning situation.

Unlike some of the recent adoptions of PLA, which are tied to CBE and the idea of getting students through their degree programs quickly, Empire State College approaches prior learning assessment in very much the spirit that Alan describes above. Here’s Associate Dean Cathy Leaker talking about their approach:

Cathy Leaker: What makes Empire State College unique, even in the prior learning assessment field, is that many institutions that do prior learning assessment do what’s called a “course match.” In other words, a student would have to demonstrate—for example, if they wanted to claim credit for Introduction to Psychology, they would look at the learning objectives of the Introduction to Psychology course, and they would match their learning to that. We are much more open-ended, and as an institution, we really believe that learning happens everywhere, all the time. So, we try to look at learning organically, and we don’t assume that we already know exactly what might be required.

One of my colleagues, Elana Michelson, works on prior learning assessment. She started working in South Africa where they were—there it’s called “recognition for prior learning.” And she gives the example of some of the people who were involved in bringing down Apartheid, and how they, sort of as an institution working with the government, thought it might be ridiculous to ask those students to demonstrate problem solving skills, right? How the institution might look at problem-solving skills, and then if there was a strict match, they would say, “Well, wait a second. You don’t have it,” and yet, they’re activists that brought down the government and changed the world.

Those are some examples of why we really think we need to look at learning organically.

Students like Melinda come to us, talk about their learning, and then we try to help them identify it, come up with a name for it, and determine an amount of credit before submitting it for evaluation.

This is not personalized in the sense of trying to figure out which institution-defined competencies you can check off on your way to an institution-defined collection of competencies that they call a “degree.” Rather, it’s an effort to have credentialed experts look at what you’ve done and what you know to find existing strengths that deserve to be recognized and credentialed. The Apartheid example is a particularly great one because it shows that traditional academic institutions may be poorly equipped to recognize and certify real-world demonstrations of competencies, particularly among people who come from disadvantaged or “marked” backgrounds. Here’s ESC faculty member Frances Boyce talking about why the school recognized a need to develop a particular PLA program for women of color:

Frances Boyce: Our project, Women of Color and Prior Learning Assessment, is based on a 2010 study done by Rebecca Klein-Collins and Richard Olson, “Fueling the Race to Success.” That found that students who do prior learning assessments are two and a half times more likely to graduate. When you start to unpack that data and you look at the graduation rates for students of color, for African American students the graduation rate increases fourfold. For Latino students it increases eightfold. Then, when you look at it in terms of gender, a woman who gets one to six credits in prior learning assessment will graduate more quickly than her male counterpart given the same amount of credit.

That seemed very important to us, and we decided, “Well, let’s see what we could do to improve the uptake rate for women of color.” So, we designed four workshops to help women of color, not only identify their learning—the value of their learning—but identify what they bring with them to the institution.

What’s going on here? Why is PLA more impactful than average for women and people of color? In addition to the fact that our institutions are not always prepared to recognize real-world knowledge and skills, as in the Apartheid example, people in non-privileged positions in our society are tacitly taught that college is not “for them.” That they don’t have what it takes to succeed there. By recognizing that they have, in fact, already acquired college-level skills and knowledge, PLA helps them get past the insults to their self-image and dignity and helps them to envision themselves as successful college graduates. Listen to ESC student Melinda Wills-Stallings’ story:

Michael Feldstein: I’m wondering if you can tell me, do you remember a particular moment, early on, when the lightbulb went off and you said to yourself, “Oh, that thing that’s part of my life counts”?

Melinda Wills-Stallings: I think when I was talking to my sons about the importance of their college education and how they couldn’t be successful without it and them saying to me, “But, Mom, you are successful. You run a school. You run a business.” To be told on days that I wasn’t there, the business wasn’t running properly or to be told by parents, “Oh, my, God. We’re so glad you’re back because we couldn’t get a bill, we couldn’t get a statement,” or, “No one knew how to get the payroll done.”

That's when I knew. OK, but being told by an employer that I wasn't needed and I wasn't relied on, I came to realize that it flipped on me. And I realized that's what I had been told to keep me in my place, to keep me from aspiring to do the things that I knew that I was doing or I could do.

The lightbulb for me was when we were doing the interviews in Women of Color PLA, and Frances said to me, "That's your navigational capital." We would do these roundtables where you would interview with one mentor, and then you would go to another table. Then I went to another table, and she said, "Well, what do you hope to do with your college degree?" And I said, "I hope to pay it forward: to go continue doing what I love to do, but to come back to other women with like circumstances and inspire them and encourage them and support them in getting their college degrees, and always to be better today than I was yesterday." And she said, "So, that's your aspirational capital." And I went, "Oh, OK." So, I have aspirational capital also, and then I went to the next table, and I couldn't wait to get to the next table because every table I went to, I walked away with one or two prior learning assessments.

And then to go home and to be able to put it into four- or five-page papers to submit that essay and to have it recognized as learning.

I was scared an awful lot of times about coming back to school because I felt, after I graduated high school and started college and decided I wanted to get married and have a family, that I had missed the window to come back and get my college education. The light bulb was, "It's never too late," and that's what I tell women who ask me, and I talk to them all the time about our school and our program. Like, "It's never too late. You can always come back and get it done."

Goals and dreams don't have caps on them even though where I was, my employer had put a cap on where I could go in my salary and my position. Your goals and dreams don't have a cap on them, so I think that was the light bulb for me—that it wasn't too late.

It's impossible to hear Melinda speak about her journey and not feel inspired. She built up the courage to walk through the doors of the college, despite being told repeatedly by her employer that she was not worthy. The PLA process quickly affirmed for her that she had done the right thing. At the same time, I recognize that traditionalists may feel uncomfortable with all this talk of "navigational capital" and "aspirational capital" and so on. Is there a danger of giving away degrees like candy and thus devaluing them? First, I don't think there's anything wrong with granting a person degree certification if they have become a genuine expert in a college-appropriate subject through their life experience. In some ways, we are all the Scarecrow, the Tin Man, or the Cowardly Lion, waiting for some wizard to magically bestow upon us a symbol that confers legitimacy upon our hard-won skills and attributes, somehow making them more real. But also, a funny thing happens when you treat a formal education as a tool for helping an individual reach her goals rather than a set of boxes that must be checked. Students start thinking about the work that education entails as integral to achieving those goals rather than as a set of obstacles they have to get around in order to get the piece of paper that is the "real" value of college. Listen to ESC student Jessi Colón, a professional dancer who chose not to get all the credits she could have gotten for her dance knowledge because she wanted to focus on what she needed to learn for her next career working in animal welfare:

Jessi Colón: It was a little bit tricky, especially because I had really come here with the intention of maximizing and capitalizing on all this experience that I had. Part of the prior learning assessment and degree planning process is looking at other schools that may have somewhat relevant programs and trying to match what your learning is to those. As I was looking at other programs outside of New York or at other small, rural schools that do these little animal programs, I found that there were a lot of classes that I really wanted to take.

One of the really amazing things about Empire State is that they can also give you individualized courses, and I did a lot of those. So, once I saw these at other schools, I was like, "Man, I really want to take a class in animal-assisted therapy, and would I like to really, really indulge myself and do that, or should I write another essay on jazz dance composition?" I knew that one would be more of a walk in the park than the other, but I was really excited about my degree, and having this really personal degree allowed me to get excited about it. So, it made sense, though it was hard to let go of that prior learning in order to opt for the classes.

I could’ve written 20 different dance essays, but I wanted to really take a lot of classes. So, I filled that with taking more classes relevant to my degree, and then ended up only writing, I think, one or two dance-relevant essays.

It turns out that if you start from the assumption that the education they are coming for—not the certification, but the learning process itself—can and should have intrinsic value to them as a tool for pursuing their own ambitions, then people step up. They aspire to be more. They take on the work. If the education is designed to help them by recognizing how far they have come before they walk in the door and by focusing on what they need to learn in order to do whatever it is they aspire to do after they leave, then students often come to see that gaming the system is just cheating themselves.

There are many ways to make schooling more personal but, in my opinion, what we see here is one of the deepest and most profound. This is what a student-ready college looks like. And in order to achieve it, there must be an institutional commitment that precedes the adoption of any educational technology. The software is just an enabler. If a college community collectively commits to true personalization, then technology can help with that. If the community does not make such a commitment, then "personalized learning" software might help achieve other educational ends, but it will not personalize education in the sense that we see here.

I'm going to write a follow-up post about how ESC is using that personalized learning software in their context, but you don't have to wait to find out; you can just watch the second episode of the case study. While you're at it, you should go back and watch the full ETV episode from which the above clips were excerpted. In addition to more great interview content, you will find a bunch of related links that will let you dig deeper into many of the topics covered in the discussions.

The post Prior Learning Assessments Done Right appeared first on e-Literate.

Release of Empire State College Case Study on e-Literate TV

Fri, 2015-06-26 15:03

By Phil Hill | More Posts (337)

Today we are thrilled to release the fourth case study in our new e-Literate TV series on “personalized learning”. In this series, we examine how that term, which is heavily marketed but poorly defined, is implemented on the ground at a variety of colleges and universities.

We are adding two episodes from Empire State College (ESC), a school that was founded in 1971 as part of the State University of New York. The school was designed to serve, through a lot of one-on-one student-faculty interaction, the needs of students who don't do well at traditional colleges. What problems are they trying to solve? How do students view some of the changes? What role does the practice of granting prior-learning assessments (PLA) play in non-traditional students' education?

You can see all the case studies (each with either two or three episodes) at the series link, and you can access individual episodes below.

ESC Case Study: Personalized Prior Learning Assessments

ESC Case Study: Personalizing Personalization

e-Literate TV, owned and run by MindWires Consulting, is funded in part by the Bill & Melinda Gates Foundation. When we first talked about the series with the Gates Foundation, they agreed to give us the editorial independence to report what we find, whether it is good, bad, or indifferent.

As with the previous series, we are working in collaboration with In the Telling, our partners providing the platform and video production. Their Telling Story platform allows people to choose their level of engagement, from just watching the video to accessing synchronized transcripts and transmedia. We have added content directly to the timeline of each video, bringing up further references, like e-Literate blog posts or relevant scholarly articles, in context. With In the Telling's help, we are crafting episodes that we hope will be appealing and informative to those faculty, presidents, provosts, and other important college and university stakeholders who are not ed tech junkies.

We will release one more case study in early July, and we also have two episodes discussing the common themes we observed on the campuses. We welcome your feedback, either in comments or on Twitter using the hashtag #eLiterateTV.

Enjoy!

The post Release of Empire State College Case Study on e-Literate TV appeared first on e-Literate.

68% of Statistics Are Meaningless, D2L Edition

Wed, 2015-06-24 17:27

By Michael Feldstein | More Posts (1033)

Two years ago, I wrote about how D2L’s analytics package looked serious and potentially ground-breaking, but that there were serious architectural issues with the underlying platform that were preventing the product from working properly for customers. Since then, we’ve been looking for signs that the company has dealt with these issues and is ready to deliver something interesting and powerful. And what we’ve seen is…uh…

…uh…

Well, the silence has ended. I didn’t get to go to FUSION this year, but I did look at the highlights of the analytics announcements, and they were…

…they were…

OK, I’ll be honest. They were incredibly disappointing in almost every way possible, and good examples of a really bad pattern of hype and misdirection that we’ve been seeing from D2L lately.

You can see a presentation of the “NEW Brightspace Insights(TM) Analytics Suite” here. I would embed the video for you but, naturally, D2L uses a custom player from which they have apparently stripped embedding capabilities. Anyway, one of the first things we learn from the talk is that, with their new, space-age, cold-fusion-powered platform, they “deliver the data to you 20 times faster than before.” Wow! Twenty times faster?! That’s…like…they’re giving us the data even before the students click or something. THEY ARE READING THE STUDENTS’ MINDS!

Uh, no. Not really.

A little later on in the presentation, if you listen closely, you'll learn that D2L had been running a batch process to update the data once every 24 hours. Now, two years after announcing their supposed breakthrough data analytics platform, they are proud to tell us that they can run a batch process every hour. (Going from one update every 24 hours to one every hour is presumably where the "twenty times faster" number comes from.) As I write this, I am looking at the real-time analytics feed on my blog, watching people come and go. Which I've had for a while. For free. Of course, saying it that way, a batch process every hour, doesn't sound quite as awesome as

TWENTY TIMES FASTER!!!!!

So they go with that.

There was an honest way in which they could have made the announcement and still have sounded great. They could have said something like this:

You know, when LMSs were first developed, nobody was really thinking about analytics, and the technology to do analytics well really wasn't at a level where it was practical for education anyway. Times have changed, and so we have had to rebuild Brightspace from the inside out to accommodate this new world. This is an ongoing process, but we're here to announce a milestone. By delivering regular, intra-day updates, we can now make analytics far more valuable to you. You can respond more quickly to student needs. We are going to show you a few examples of it today, but the bigger deal is that we have this new structural capability that will enable us to provide you with more timely analytics as we go.

That’s not a whole lot different in substance than what they actually said. And they really needed to communicate in a hype-free way, because what was the example that they gave for this blazing fast analytics capability? Why, the ability to see if students had watched a video.

Really. That was it.

Now, here again, D2L could have scored real points for this incredibly underwhelming example if they had talked honestly about Caliper and its role in this demo. The big deal here is that they are getting analytics not from Brightspace but from a third-party tool (Kaltura) using IMS Caliper. Regular readers know that I am a big fan of the standard-in-development. I think it's fantastic that an LMS company has made an early commitment to implement the standard and is pushing it hard as a differentiator. That can make the difference between a standard getting traction and remaining an academic exercise. How does D2L position this move? From their announcement:

With our previous analytics products, D2L clients received information on student success even before they took their first test. This has helped them improve student success in many ways, but the data is limited to Brightspace tools. The new Brightspace Insights is able to aggregate student data, leveraging IMS Caliper data, across a wide variety of learning tools within an institution’s technology ecosystem.

We’ve seen explosive growth in the use of external learning tools hooked into Brightspace over the past eighteen months. In fact, we are trending toward 200% growth over 2014. [Emphasis added.] That’s a lot of missing data.

This helps create a more complete view of the student. All of their progress and experiences are captured and delivered through high performance reports, comprehensive data visualizations, and predictive analytics.

Let’s think about an example like a student’s experiences with publisher content and applications. Until now, Brightspace was able to capture final grades but wouldn’t track things like practice quizzes or other assessments a student has taken. It wouldn’t know if a student didn’t get past the table of contents in a digital textbook. Now, the new Brightspace Insights captures all of this data and creates a more complete, living, breathing view of a student’s performance.

This is a big milestone for edtech. No other LMS provider is able to capture data across the learning technology ecosystem like this. [Emphasis added.]

I have no problem with D2L crowing about being early to market with a Caliper implementation. But let's look at how they positioned it. First, they talked about 200% growth in use of external learning tools in 2015. But what does that mean? Going from one tool to three tools? And what kind of tools are they? And what do we know about how they are being used? OK, on that last question, maybe analytics are needed to answer it. But the point is that D2L has a pattern of punctuating every announcement or talk with an impressive-sounding but meaningless statistic to emphasize how awesome they are. Phil recently caught John Baker using…questionable retention statistics in a speech he gave. In that case, the problem wasn't that the statistic itself was meaningless but rather that there was no reason to believe that D2L had anything to do with the improvement in the case being cited. And then there's the sleight-of-hand that Phil just called out regarding their LeaP marketing. It's not as bad as some of the other examples, in my opinion, but still disturbingly consistent with the pattern we are seeing. I am starting to suspect that somebody in the company literally made a rule: Every talk or announcement must have a statistic in it. Doesn't matter what the statistic is, or whether it means anything. Make one up if you have to, but get it in there.

But back to analytics. The more egregious claim in the quote above is that "no other LMS provider is able to capture data across the learning technology ecosystem like this [example that we just gave]," because D2L can't either yet. They have implemented a pre-final draft of a standard which requires both sides to implement in order for it to work. I don't know of any publishers who have announced they are ready to provide data in the way described in D2L's example. In fact, there are darned few app providers of any kind who are there yet. (Apparently, Kaltura is one of them.) Again, this could have been presented honestly in a way that made D2L look fantastic. Implementing first puts them in a leadership position, even if that leadership will take a while to pay practical dividends for the customer. But they went for hype instead.
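To make concrete what that would actually involve, here is a rough sketch of the kind of event the standard traffics in. To be clear, this is my own illustration: the field names follow the general shape of the Caliper draft, but the context URL, types, and IDs are assumptions for illustration, not D2L's or Kaltura's actual implementation.

    import json
    from datetime import datetime, timezone

    # A Caliper-style media event, sketched from the draft spec's general
    # shape; treat the @context URL and field names as illustrative
    # assumptions rather than a validated payload.
    event = {
        "@context": "http://purl.imsglobal.org/ctx/caliper/v1/Context",
        "@type": "MediaEvent",
        "action": "ended",  # the learner finished watching the video
        "actor": {"@id": "https://example.edu/users/554433", "@type": "Person"},
        "object": {
            "@id": "https://media.example.com/videos/intro-lecture",
            "@type": "VideoObject",
            "duration": 1200,  # length in seconds
        },
        "eventTime": datetime.now(timezone.utc).isoformat(),
    }

    # The tool provider (here, a video platform) would POST events like this
    # to an event store. The LMS can only report on them if both sides have
    # implemented the same profile of the standard, which is the point.
    print(json.dumps(event, indent=2))

Multiply that little payload across every tool in an institution's ecosystem and the vision really is compelling, which is exactly why the overclaiming is so unnecessary.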

I can’t remember the last time I read one of D2L’s announcements without rolling my eyes. I used to have respect for the company, but now I have to make a conscious effort not to dismiss any of their pronouncements out-of-hand. Not because I think it’s impossible that they might be doing good work, but because they force me to dive into a mountain of horseshit in the hopes of finding a nugget of gold at the bottom. Every. Single. Time. I’m not sure how much of the problem is that they have decided that they need to be disingenuous because they are under threat from Instructure or under pressure from investors and how much of it is that they are genuinely deluding themselves. Sadly, there have been some signs that at least part of the problem is the latter situation, which is a lot harder to fix. But there is also a fundamental dishonesty in the way that these statistics have been presented.

I don’t like writing this harshly about a company—particularly one that I have had reason to praise highly in the past. I don’t do it very often. But enough is enough already.


The post 68% of Statistics Are Meaningless, D2L Edition appeared first on e-Literate.

About The D2L Claim Of BrightSpace LeaP And Academic Improvements

Wed, 2015-06-24 16:07

By Phil Hill | More Posts (335)

Recently I wrote a post checking up on a claim by D2L that seems to imply that their learning platform leads to measurable improvements in academic performance. The genesis of this thread is a panel discussion at the IMS Global conference where I argued that LMS usage in aggregate has not improved academic performance but is important, or even necessary, infrastructure with a critical role. Unfortunately, I found that D2L’s claim from Lone Star was misleading:

That’s right – D2L is taking a program where there is no evidence that LMS usage was a primary intervention and using the results to market and strongly suggest that using their LMS can “help schools go beyond simply managing learning to actually improving it”. There is no evidence presented[2] of D2L’s LMS being “foundational” – it happened to be the LMS during the pilot that centered on ECPS usage.

Subsequently I found a press release at D2L with a claim that appeared to be more rigorous and credible (written in an awful protected web page that prevents select – copy – paste).

D2L Launches the Next Generation of BrightSpace and Strives to Accelerate the Nation’s Path to 60% Attainment

D2L, the EdTech company that created Brightspace, today announces the next generation of its learning platform, designed to develop smarter learners and increase graduation rates. By featuring a new faculty user interface (UI) and bringing adaptive learning to the masses, Brightspace is more flexible, smarter, and easier to use. [snip]

D2L is changing the EdTech landscape by enabling students to learn more with Brightspace LeaP adaptive learning technology that brings personalized learning to the masses, and will help both increase graduation rates and produce smarter learners. The National Scientific Research Council of Canada (NSERC) produced a recent unpublished study that states: “After collating and processing the results, the results were very favourable for LeaP; the study demonstrates, with statistical significance, a 24% absolute gain and a 34% relative gain in final test scores over a traditional LMS while shortening the time on task by 30% all while maintaining a high subjective score on perceived usefulness.”

I asked the company to provide more information on this “unpublished study”, and I got no response.

Hello, Internet search and phone calls – time to do some investigation to see if there is real data to back up the claims.

Details on the Study

The Natural Sciences and Engineering Research Council of Canada (NSERC) is somewhat similar to the National Science Foundation in the US – it is a funding agency. When I called them, they made it perfectly clear that they don't produce any studies as claimed; they only fund them. I would have to find the appropriate study and contact the lead researcher. Luckily they shared the link to their awards database, and I did some searching on relevant terms. I eventually found some candidate studies and contacted the lead researchers. It turns out that the study in question was led by none other than Dragan Gasevic, founding program co-chair of the International Conference on Learning Analytics & Knowledge (LAK) in 2011 and 2012, and he is now at the University of Edinburgh.

The grant was one of NSERC's Engage grants, which pair researchers with companies, and the partner was Knowillage, maker of an adaptive learning platform. D2L acquired Knowillage in the middle of the study, and they currently offer the technology as LeaP. LeaP is integrated into the main D2L learning platform (LMS).

The reason the study was not published was simply that Dragan was too busy, including with his move to Edinburgh, to complete and publish it, but he was happy to share information by Skype.

The study was done on an Introduction to Chemistry course at an unnamed Canadian university. Following ~130 students, the study looked at test scores and time to complete, with results reported for two assessments – the class midterm and the class final. This was a controlled experiment looking at three groupings:

  • A control group with no LMS, using just search tools and loosely organized content;
  • A group using Moodle as an LMS with no adaptive learning; and
  • A group using Moodle as an LMS with Knowillage / LeaP integrated following LTI standards.

Of note, this study did not even use D2L’s core learning platform, now branded as BrightSpace. It used Moodle as the LMS, but the study was not about the LMS – it was about the pedagogical usage of the adaptive engine used on top of Moodle. It is important to call out that to date, LeaP has been an add-on application that works with multiple LMSs. I have noticed that D2L now redirects their web pages that called out such integrations (e.g. this one showing integration with Canvas and this one with Blackboard) to new marketing just talking about BrightSpace. I do not know if this means D2L no longer allows LeaP integration with other LMSs or not. Update 6/25: Confirmed that LeaP is still being actively marketed to customers of other LMS vendors.
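As a reminder of why such an add-on is LMS-agnostic: an LTI 1.x tool only needs the host LMS to send it a signed launch request, so the same engine can plug into any compliant platform. Here is a minimal sketch of such a launch; the keys, URLs, and IDs are invented for illustration, and this is emphatically not Knowillage's actual integration code.

    from urllib.parse import urlencode
    from oauthlib.oauth1 import Client, SIGNATURE_TYPE_BODY

    # Hypothetical LTI 1.x basic launch: the LMS signs a form POST with a
    # shared key/secret, and the tool verifies the signature on receipt.
    launch_params = {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": "chem101-unit-3",  # the placement in the course
        "user_id": "student-42",
        "roles": "Learner",
    }

    client = Client("lms-consumer-key", client_secret="shared-secret",
                    signature_type=SIGNATURE_TYPE_BODY)

    uri, headers, body = client.sign(
        "https://tool.example.com/lti/launch",
        http_method="POST",
        body=urlencode(launch_params),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

    # `body` now carries the launch params plus the oauth_* signature
    # fields; any LMS that can produce this POST can host the tool.
    print(body)

Nothing in that handshake cares whether the platform on the other end is Moodle, Canvas, Blackboard, or Brightspace.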

The study found evidence that Knowillage / LeaP allows students to have better test scores than students using just Moodle or no learning platform. This finding was significant even when controlling for students’ prior knowledge and for students’ dispositions (using a questionnaire commonly used in Psychology for motivational strategies and skills). The majority of the variability (a moderate effect size) was still explained by the test condition – use of adaptive learning software.

Dragan regrets the research team's terminology of "absolute gain" and "relative gain", but the research did clearly show increased test score gains from use of the adaptive software.
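For readers puzzling over the press release's numbers, the two figures can be reconciled with simple arithmetic. The baseline below is my own assumption; the study is unpublished, so the actual control-group mean is not public, but a baseline around 70% is the kind of number that makes both reported gains consistent.

    # Hypothetical arithmetic only: the scores are assumptions, chosen so
    # that the reported 24% absolute and 34% relative gains agree.
    control_mean = 70.0    # assumed control-group final score (%)
    leap_mean = 94.0       # assumed Moodle+LeaP final score (%)

    absolute_gain = leap_mean - control_mean            # in percentage points
    relative_gain = absolute_gain / control_mean * 100  # relative to baseline

    print(f"absolute gain: {absolute_gain:.0f} points")  # -> 24 points
    print(f"relative gain: {relative_gain:.0f}%")        # -> 34%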

The results were quite different between the midterm (no significant difference between the Moodle+LeaP group and the Moodle-only group or control group) and the final (significant improvements for Moodle+LeaP well over the other groups). Furthermore, the Moodle-only group and the control group with no LMS reversed gains between the midterm and the final. To Dragan, these are study limitations that should be investigated in future research. He still would like to publish these results soon.

Overall, this is an interesting study, and I hope we get a published version soon – it could tell us a bit about adaptive learning, at least in the context of Intro to Chemistry usage.

Back to D2L Claim

As with the Lone Star example, I find a real problem with misleading marketing. D2L could have been more precise and said something like the following:

We acquired a tool, LeaP, that when integrated with another LMS was shown to improve academic performance in a controlled experiment funded by NSERC. We are now offering this tool with deep integration into our learning platform, BrightSpace, as we hope to see similar gains with our clients in the future.

Instead, D2L chose to use imprecise marketing language that implies, or allows the reader to conclude that their next-generation LMS has been proven to work better than a traditional LMS. They never come out and say “it was our LMS”, but they also don’t say enough for the reader to understand the context.

What is clear is that D2L's LMS (the core of the BrightSpace learning platform) had nothing to do with the study, that the actual gains were recorded with LeaP integrated into Moodle, and that the study was encouraging for adaptive learning and LeaP but limited in scope. We also have no evidence that the BrightSpace integration gives any different results than Moodle or Canvas or Blackboard Learn integrations with LeaP. For all we know, given the scope of the study, it is entirely possible that there was something unique about the Moodle / LeaP integration that enabled the positive results. We don't know that, but we can't rule it out, either.

Kudos to D2L for acquiring Knowillage and for working to make it more available to customers, but once again the company needs to be more accurate in their marketing claims.

The post About The D2L Claim Of BrightSpace LeaP And Academic Improvements appeared first on e-Literate.

An Example Why LMS Should Not Be Only Part of Learning Ecosystem

Tue, 2015-06-23 11:51

By Phil Hill | More Posts (333)

In Michael’s initial post on the Post-LMS, he built on this central theme:

Reading Phil’s multiple reviews of Competency-Based Education (CBE) “LMSs”, one of the implications that jumps out at me is that we see a much more rapid and coherent progression of learning platform designs if you start with a particular pedagogical approach in mind.

The idea here is not that the traditional LMS has no value (it can be critical infrastructure, particularly for mainstream faculty adoption), but rather that in the future we both expect to see more learning platform designs tied to specific pedagogies. This idea is quite relevant given the ongoing LMS users' conferences (InstructureCon last week, D2L Fusion this week, BbWorld next month, and Apereo / Sakai as well as iMoot in the past two months).

Later in the post Michael mentions ASU’s Habitable Worlds as an example of assessing the quality of students’ participation instead of direct grading.

A good example of this is ASU’s Habitable Worlds, which I have blogged about in the past and which will be featured in an episode of the aforementioned e-Literate TV series. Habitable Worlds is roughly in the pedagogical family of CBE and mastery learning. It’s also a PBL [problem-based learning] course. Students are given a randomly generated star field and are given a semester-long project to determine the likelihood that intelligent life exists in that star field. There are a number of self-paced adaptive lessons built on the Smart Sparrow platform. Students learn competencies through those lessons, but they are competencies that are necessary to complete the larger project, rather than simply a set of hoops that students need to jump through. In other words, the competency lessons are resources for the students.

In our recent case study on ASU, Lev Horodyskyj shared his experiences helping to design the course. He specifically called out the difficulties they faced when initially attempting this pedagogical approach with a traditional LMS.

Phil Hill: But the team initially found that the traditional technologies on campus were not suited to support this new personalized learning approach.

Lev Horodyskyj: Within a traditional system it was fairly difficult. Traditional learning management systems aren’t really set up to allow a lot of interactivity. They’re more designed to let you do things that you would normally do in a traditional classroom: multiple choice tests; quizzes; turning in papers; uploading, downloading things.

Especially when you’re teaching science, a range of possibilities are viable answers, and oftentimes when we teach science, we’re more interested in what you’re not allowed to do rather than what you’re allowed to do.

Traditional LMSs don't allow you to really program in huge parameter spaces that you can work with. They're basically looking for, "What are the exact correct answers you are allowed to accept?"

I was brought into the picture once Ariel decided that this could be an interesting way to go, and I started playing around with the system. I instantly fell in love with it because it was basically like PowerPoint. I could drop whatever I wanted wherever I wanted, and then wire it up to behave the way I wanted it to behave.

Now, instead of painstakingly programming all the 60 possible answers that a student might write that are acceptable, I can all of a sudden set up a page to take any answer I want and evaluate it in real time. I no longer have to program those 60 answers; I can just say, "Here is the range of answers that are acceptable," and it will work with that.

Phil Hill: And this was the Smart Sparrow system?

Lev Horodyskyj: This was the Smart Sparrow system, correct. It was really eye-opening because it allowed so many more possibilities. It was literally a blank canvas where I could put whatever I wanted.
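To make Lev's point concrete, here is a toy sketch of the difference between enumerating acceptable answers and evaluating against a parameter range. This is purely illustrative; it is not Smart Sparrow's actual evaluation engine.

    # Toy example: grade a numeric free-response answer two ways.

    EXACT_ANSWERS = {"0.5", "1/2", ".5"}  # the old way: enumerate everything

    def grade_by_enumeration(answer: str) -> bool:
        """Accept only spellings someone thought to list in advance."""
        return answer.strip().lower() in EXACT_ANSWERS

    def grade_by_range(answer: str, low: float = 0.45, high: float = 0.55) -> bool:
        """Accept any numeric answer that falls inside the acceptable range."""
        try:
            value = float(answer)
        except ValueError:
            return False
        return low <= value <= high

    print(grade_by_enumeration("0.50"))  # False: nobody enumerated this spelling
    print(grade_by_range("0.50"))        # True: it falls inside the range

The first approach breaks the moment a student writes an unanticipated but correct answer; the second accepts the whole parameter space, which is the capability Lev describes.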

This pedagogical approach, supported by appropriate learning platform design, seems to lead to conceptual understanding.

Eric Berkebile: My experiences were very similar. What amazed me the most about it was how the course was centered upon building concepts. It wasn't about hammering in detail. They weren't trying to test you on, "How much can you remember out of what we're feeding you?"

You go through the slides, you go through the different sections, and you are building conceptual knowledge while you are doing it. Once you've demonstrated that you can actually apply the concept that they are teaching you, then you can move forward. Until that happens, you're going to be stuck exactly where you are, and you're going to have to ask for help from other students in the class; you're going to have to use the resources available.

They want you to learn how to solve problems, they want you to learn how to apply the concepts, and they want you to do it in a way that’s going to work best for you.

Phil Hill: So, it's multidisciplinary, with various disciplines all held together by project-based problem-solving around Drake's equation?

Todd Gilbert: Yeah. One concept really ties it all together, and if you want to answer those questions around that kind of problem, like, “Is there life out there? Are we alone?” you can’t do that with just astronomy, you can’t do that with just biology. It touches everything, from sociology down to physics. Those are very, very different disciplines, so you have to be adaptable.

But I mean if you rise to that kind of a challenge—I can honestly say, this is not hyperbole or anything. It is my favorite class I've taken at this college, and it's a half-semester online course.

Eric Berkebile: By far the best course I’ve taken, and I’ve recommended it to everybody I’ve talked to since.

This approach is not mainstream in the sense that the vast majority of courses are not designed as problem-based learning, so I am not arguing that all LMSs should change accordingly or that Smart Sparrow is a superior product. I do, however, think that this episode gives a concrete example of how the traditional LMS should not be the only platform available in a learning ecosystem and how we will likely see more development of platforms tied to specific pedagogical approaches.

The post An Example Why LMS Should Not Be Only Part of Learning Ecosystem appeared first on e-Literate.