Michael Feldstein

What We Are Learning About Online Learning...Online

Student Course Evaluations and Impact on Active Learning

Mon, 2015-11-30 10:51

By Phil Hill

The Chronicle has an article out today, “Can the Student Course Evaluation Be Redeemed?”, that rightly points out how student course evaluations are often counter-productive to improving teaching and learning. The article refers to a Stanford professor’s call for an instructor-completed “inventory of the research-based teaching practices they use”, but most of the article centers on a revised course evaluation tool from a Kansas State University spin-off (the IDEA Center). One of the key problems described is that “administrators often take their results as numerical gospel”, along with faculty misapplying the results.

However they’re used, a lot of course evaluations simply aren’t very good, [IDEA president] Mr. Ryalls says.

But as flawed as they are, faculty members still turn to them as some gauge of effectiveness in the classroom. About three-quarters of instructors use formal evaluations and informal feedback “quite a bit” or “very much” when altering their courses, according to the Faculty Survey of Student Engagement.

One limitation of many tools is that they ask students things they don’t really know. A frequent example: Was your instructor knowledgeable about course content?

There is one additional problem with most student course evaluations that is not explicitly covered in the Chronicle article – students newly involved in active learning approaches often rate the course and instructor poorly even if they end up learning more effectively. We saw this in our e-Literate TV case study at UC Davis. In a previous post we highlighted how the routine hard work required of students in active learning courses can lead to poor evaluations, but later in the interview student course evaluations came up as a major barrier to improving teaching practices.

Phil Hill: Catherine, especially with even more of a firsthand view, what do you see as the biggest barrier?

Catherine Uvarov: Well, in a way, I was fortunate because I was more a newbie instructor, so I didn’t have like 20 years of experience where I had done it this other way. Just coming in and telling instructors, “Hey, that thing that you’ve been doing for 20 years. You could be doing it better.” They don’t want to hear that. They have worked very hard over the past 15-, 20-plus years to optimize their instructional methods to the best of their ability within their set of norm practices.

Chris Pagliarulo: And the feedback that they were getting.

Catherine Uvarov: And the feedback, so there is a huge emphasis on student evaluations and how much students like you, which is not really correlated at all with how much they’re actually learning. So, if that’s the only measure of student learning or a student—anything in the class—is student evaluations, then that’s what the instructor is tuning for.

They’re not really figuring out if their students are learning or turning the mirror on themselves and saying, “What can I do to improve my student’s learning?” They’re just saying, “What can I do to make my students like me better?”

Phil Hill: Actually, I’d like you to go a little bit more detail on course evaluations as they’re currently used. I think I heard you say those are more based on, “Do students like me?” So, what do the current course evaluations really measure? What direction does it push faculty?

Catherine Uvarov: In my opinion, the student evaluations are pretty much worthless because the questions that they ask are very generic. It’s like, “Does the person speak loud? Are their visual aids clear?” It’s very generic and bland, and then it gets down to the only question that they really care about—rate the overall performance of this instructor.

What we have found in my flipped class and in any of these where the lecturer is changing their style and making the emphasis more on the students, the students are thinking, “Well, I learned all of the material on my own, so the instructor didn’t teach me that material. I’m going to rate the instructor lower because they were not as valuable to me.”

Erin Becker: When you make the students do more work, they don’t like you as much, and that hurts your course evaluations, which in turn feeds back into the incentivization issue.

Marc Faciotti: It’s a challenge. If you’re not thinking about education all day—and most of us have research labs that occupy a lot of time as well (administrative duties and all that type of thing)—so if you don’t have training there, there’s a lot of catching up to do. Most institutions have great resources on campus. There’s people dying here at iAMSTEM to help and to catalyze some of these things. So, seek help, be realistic about how much you’re going to change the first time around, and have kind of a long-term plan for what you’d like to achieve.

Marco Molinaro: I think the biggest barrier we have right now is that the faculty rewards system doesn’t yet take into account this type of experimentation and doesn’t really promote a faculty member based on the quality of their instruction and the effects that they’ve had on student learning.

Later in the Chronicle article there is a discussion about whether to scuttle student evaluations altogether. I strongly agree with this conclusion:

For Mr. Ryalls, of IDEA, the problems with students’ evaluations shouldn’t scuttle their use altogether. “What drives me crazy,” he says, “is this notion that students don’t know what the hell they’re talking about.” They spend more time than anyone else watching faculty members teach, he says. “Student voice matters.”


New Visual From LISTedTECH Shows LMS Market By New Implementations

Mon, 2015-11-23 19:15

By Phil Hill

Justin Menard and his team at LISTedTECH have produced a great new visual on the LMS market in North America. Using his wiki-based data covering 4,000+ institutions, he shows the percentage of new LMS implementations per year (scaled to 100% for each year). While we are used to seeing LMS market share in terms of the number or percentage of institutions using each LMS as their primary system, this view highlights the dynamics of the market – which systems are getting adopted. See the original post for the full chart and additional description.
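
To make the “scaled to 100% for each year” framing concrete, here is a minimal sketch of that normalization in pandas, using made-up counts rather than LISTedTECH’s actual dataset:

    import pandas as pd

    # Hypothetical new-implementation counts by year and LMS
    # (illustrative numbers only, not LISTedTECH's data)
    counts = pd.DataFrame(
        {"Canvas": [40, 120, 260], "Blackboard": [90, 80, 70], "Moodle": [70, 60, 50]},
        index=[2012, 2013, 2014],
    )

    # Scale each year (row) so the shares across systems sum to 100%,
    # which is the "percentage of implementations per year" view.
    shares = counts.div(counts.sum(axis=1), axis=0) * 100
    print(shares.round(1))

Each row of the resulting table then reads as that year’s split of new implementations across systems, independent of how many total implementations the year had.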


A few notes:

  • We’ve come a long way since the days of simple bar charts for market data.
  • Keep in mind that this data shows new implementations and not LMS decisions, per se. If a school goes through an RFP process and selects its current vendor, that decision will likely not show up here as a new implementation.
  • Also keep in mind that while 4,000 institutions is impressive, it is not exhaustive and not bias-free. Read this as the best data available on new implementations, painting a picture of trends in the market. Note the description at the bottom of Justin’s post on limitations of the data.
  • We have covered at e-Literate that Blackboard’s Moodle-based clients are growing in number (over 1,400 as of this fall), especially outside of the US. But note that this growth is not occurring in North America.
  • The dominant player in terms of new implementations is obviously Canvas. As of last year (and I expect that 2015 will be even stronger for Instructure, maker of Canvas), Canvas had more North American new implementations than Blackboard, Moodle, and D2L combined. We have not seen this kind of dominance in new implementations since Blackboard back in the 1998 – 2002 years, when the market was a land grab.
  • Notice how Blackboard, WebCT, and ANGEL are all colored shades of blue – based on Blackboard’s acquisition of the other two companies. This allows you to also see the combined company effect.

More description of the data in the original post.


Rapid Fire Feedback from #WCET15

Mon, 2015-11-16 18:58

By Sasha Thackaberry

ed. We welcome Sasha Thackaberry with this post summarizing observations at last week’s WCET conference in Denver. We met Sasha while consulting at Cuyahoga Community College (Tri-C) and were impressed with her breadth of knowledge. We’ve asked her to provide her own take on the WCET conference.

It was my first time at the WCET Annual Meeting in Denver. I was extremely fortunate to attend and thrilled to present with Luke Dowden, whom I met for the first time at the conference. To hang on to some of the great learning, I wanted to create a brief recap of the conference and some thoughts on the next evolution of edtech in higher ed.

One theme that emerged from the conference was that higher education is moving beyond an era of traditional online courses and programs toward a more simultaneously disaggregated and integrated future. This evolution is creative, messy, and occurring on multiple fronts, with the twin drivers of tightening budgets and the need for college-ready graduates propelling new models of learning and teaching. Individual and highly specialized innovations are converging to create entirely new ecosystems of learning, both pedagogically and architecturally.

The overall tenor of the conference was future-forward. Rather than dwelling on limitations, the sessions and keynotes centered on a common theme: We can rebuild it. We have the technology…. We can make it better, stronger, faster (and cheaper).

1. We’re beyond “traditional online courses.”

Though this may seem like a “duh” moment to many of us in edtech, this is still news to parts of the higher education industry. There is a wide spectrum of implementation of online courses and programs in higher ed – anywhere from cohesive online programs that have a formalized course creation, redesign, and QA process to those institutions that we like to refer to as the “Wild West” of online learning, where courses are created at will without formalized quality review, partnership with instructional designers or developers, or in some cases even required faculty training.

However, at the other end of the spectrum, that Wild West concept of online courses is so last week. On the new frontier of digital learning innovation in higher ed we’re seeing modularization at a new level: competency-based education, yes, but also microcredentials, integration of alternative providers like boot camps, and incorporation of open learning like MOOCs. We’re talking convergence of digital innovation. Many national initiatives are supporting this evolution in various ways (see #2 about ecosystems).

And how is this evolution occurring? These alternative learning models may seem like they’re fringe, outside of the mainstream, branded with the stamp of the all-too-overused “Disruptive Innovation.” However, we’re increasingly seeing these alternative models being intentionally incorporated into the fabric of higher education. The institutions leading the charge are strategically taking smart risks to be at the front-end of this evolutionary spectrum, meeting the needs of an increasingly diverse population of lifelong learners.

Check out Wichita State University’s Stackable Badges for awarding credit. The Colorado Community College System is conducting a system-wide digital badging initiative. There are initiatives to bring coherence to credentialing, like “Connecting Credentials” and the associated Credential Transparency Initiative. Colleges are increasingly focused on graduating students who can actually get jobs (note this is not a knock on a liberal arts education, which I passionately believe in; folks need to have a family-supporting wage too). Adaptive learning was a big topic too – Colorado Technical University’s Intellipath is an example. Prior Learning Assessment (PLA) was a topic, and though it’s not new by any means, it is gaining additional adoption as colleges struggle to reconcile how credit will be awarded and transcribed. Competency-Based Education (CBE) was highlighted, with UW’s Flexible Option taking front and center for a session. There’ve been a bunch of new convenings and conferences popping up around this accelerating model, including CBExchange in Phoenix and CBE4CC in Denver. There were also the more traditional concerns about state authorization and student authentication, as well as protecting student data. Less fun for my particular brand of geekiness, but no less critical to the success of our endeavors. There was also the groundbreaking work of scaling Open Educational Resources (#OER) that University of Maryland University College shared, which won them a WOW award. Dreaming the dream of a no-textbook-cost degree! All of these conversations – from talented colleagues – had one thing in common: they are deeply connected and intertwined.

2. We’re talking ecosystems and acceleration.

The portfolio approach is passé – the digital learning ecosystem is the new portfolio approach (for those of you unfamiliar, that was the world’s most awkward “Orange is the New Black” parallel).

The interconnected nature of each of these elements means that institutions need to make decisions about scope and identity. First, institutions need to know where they are on the spectrum of innovation in digital learning. Secondly, they need to decide where they want to be. And – this is important – institutions that don’t make that decision consciously may find themselves overtaken by those institutions who have.

There are initiatives cropping up all over the place to bring the effectiveness of these new models to scale, including the Competency-Based Education Network (#C-BEN), which is another great organization/resource site, and IMS Global, which is attempting to solve the substantial system integration challenges that this ecosystem introduces. I could list them but it would be a long list, and it would be incomplete – it seems like new initiatives crop up every day.

This new digital ecosystem is like a living thing – it adapts, grows, rewires itself, makes new connections, mutates. The tightening of state budgets and the calls for job-ready graduates are the climate change of this digital ecosystem. Dramatic shifts in the environment are necessitating adaptation and change.

Though “evolve or die” may be a little dramatic, operating on the same assumptions that sustained higher ed a generation ago will not be successful for most institutions. Figuring out how to dig in with both hands and get messy and creative will be a successful strategy for institutions that are evolving, and improving student success as they go.

Interested in learning more about culture change and innovation? Check out the asynch resources from the presentation I did with Luke Dowden, “Strategic Innovation: Working Through the Strategy and the Skeptics.” Here’s the Google Site we used to curate resources and the strategies themselves. If you’re one of my Tweeple, you can catch all the action by searching #Strategy2Innovate on Twitter.

3. Oh the humanity – yep, we’ve got that too.

One of the challenges with “traditional” online education has long been “student engagement.” Student engagement is the strange, wondrous, amorphous term that we in education use to describe the human element – interactive communication, a sense of connection, a feeling of belonging and responsiveness that comes when learners – and college staff and administration – feel that they are part of a community. The complaint about asynchronous learning has long been that it lacks student engagement, and at least some data supports the claim that student grades and retention rates are higher in face-to-face and blended/hybrid courses than in fully online courses. This has been particularly pronounced at the community college level, though students who take online courses also graduate in greater percentages than those who do not. More than one study has found no difference overall in learning outcomes. And indeed, studies have revealed the importance of engagement within online courses for student success.

How do we square this lack of student engagement with the world in which we live where Twitter, Facebook, Instagram, Tumblr and a dozen other ubiquitous social media channels are a core part of how we stay connected to friends and family?

Here’s the secret answer – there’s a better way to do it.

When over one third of marriages in the U.S. start online, I think we can squarely throw out the hypothesis that online interactions – in and of themselves – are less satisfying, less connecting, and less effective for learning than those in the ostensible “real world.” Our core business in higher education has not been wholly dependent on creating systems and processes to support meaningful interactions online. The online dating industry, however, has quite a bit invested in meaningful interactions online. Colleges and universities have long been in the business of information transference – a totally different purpose, and one that is being fractured as our understanding of effective learning and teaching transforms. The legacy systems that still dominate the marketplace, like LMSs, are not designed for connection; they’re designed for information delivery and assessment.

Student readiness and instructional design are, of course, the other two core components of the triad of effective elearning, but this engagement component can be, will be, and is being tackled. As more colleges experiment with what student support, coaching, and advising mean, they’re finding huge dividends in virtual connections. And we’re talking beyond email or phone calls; we’re talking mobile apps, texting alerts, and a Skype, Facetime, Google Hangout, WebEx, and Adobe Connect world. A panel discussion about virtual student coaching brought up many of these concerns and how some companies and institutions are making online coaching effective and engaging. Engaging faculty – both full-time and particularly adjuncts – is critical too. Excelsior College shared strategies for engaging adjunct professors, while UC Online and Case Western Reserve University shared their alternatives to sit-and-get training.

The point of the technology is not technology itself. It is a tool that enables rich learning.

This human digital experience is indeed “sometime in the future” for many colleges and universities. Meanwhile, innovative institutions are evolving their very DNA, creating agile processes that support turning on a dime. Size, funding, and stature no longer guarantee success for institutions. In an increasingly unbundled world of learning, the student has a level of choice like never before. They will vote with their mouses, with their tablets and thumbs.

4. Insert pithy leadership-related title here….

The second day of the conference kicked off with a fantastic panel on leadership in edtech. Ron Legon from Quality Matters, Loraine Schmitt from Portland Community College, and Ray Schroeder from UPCEA’s Center for Online Leadership shared diverse and convergent perspectives, facilitated by Michael Abbiatti from WCET. High on the radar were the need to mentor and coach the new generation of leaders and the important – and often neglected – necessity of succession planning. Those of us Gen-Xers looking to get into the C-suite had our hopes buoyed by the demographics. The number of baby boomers retiring will leave a leadership void in higher education, and the visionaries in our field are both concerned about and committed to preparing the next generation.

This message was very similar to one that I heard at the ACE Women’s Network of Ohio conference at Corporate College East at Cuyahoga Community College the week before last. Women are a large part of edtech, and anyone familiar with the #LeanIn movement knows that tech in general needs more women leaders. Under the school of “Chance Favors the Prepared,” I – who am working on my dissertation on #CBE – met several other women at the conference, also middle managers, who are doing the same, prepping for the next step. The research being pursued is meaningful and exciting, and it was great to connect with so many other like-minded, curious, and passionate people – among them Phil Hill, Ron Legon, Kelvin Bentley (my old boss), Luke Dowden, and new friends Brenda Perea, Cali Morrison, Meg Raymond, Dale Johnson, Amanda Shaffer, Catrina Roitman, Wendy Bullock and Tammy Vercauteren. I also enjoyed meeting Vernon Smith, Michael Abbiatti and Russ Adkins – follow-up is on the way, so that’ll be fun as well.

Wrapping it up….

This was my first WCET conference, and I will definitely be back. There were so many great learning opportunities and sessions that they couldn’t all be mentioned here – this post simply traces the path I took at the conference.

The current state of edtech in higher ed is messy and looking to get messier. It’s in that interesting place of explosive creativity, where everyone gets all the materials out, looks at the current piece of art, and decides how to deconstruct and then reconstruct it. This time we’re doing it with the student – not the institution – squarely in the middle of the picture. And then we’re making an infographic of it. And PDFing it. And making a text-only accessible version. And an audio-version. And an animation of it.

Because that’s just how we roll.

Right over to meet the student where they are.


Live Stream for Wednesday’s OpenEd Keynote

Mon, 2015-11-16 16:29

By Michael Feldstein

Phil and I will be giving a joint keynote to the OpenEd conference the day after tomorrow—Wednesday, 11/18—at 8:30 AM Pacific Standard Time. You can watch the live stream here. If you miss it live, don’t worry. The video will be available afterward at the same URL. For a modest license fee, all rights reserved.

Kidding about that last part.


Interview with Josh Coates, CEO of Instructure, on today’s IPO

Fri, 2015-11-13 17:01

By Phil Hill

Instructure, maker of the Canvas (higher ed and K-12 markets) LMS and Bridge (corporate learning market) LMS, held their Initial Public Offering today. Prior to the IPO, Wall Street analysts focused on the company’s growth, its large losses, and the challenges of the education market. The company was priced on the lower end of its range ($16.00), and closed up 12.5% at $18.00.

This IPO and its lead-up have been closely watched, particularly given the rapid growth in ed tech investment and questions about whether real businesses will emerge from those investments. I had the opportunity to interview CEO Josh Coates today. What follows is an edited version of the interview, focusing mostly on how Instructure’s IPO will impact education markets and existing customers. I tried to capture as much of the Q&A verbatim as was feasible, but treat the answers below as a paraphrase.

Q. What are your impressions on how the IPO has gone so far?

A. The market in general has been a blood bath [ed. Dow down 585, or 3.3%, for the week], but we’re doing well so far. Given market conditions right now, we’re pleased as punch. We priced in range [ed. $16 – $18], and the market is responding well. We’re really focused as a company 6-12 months down the road, but it is nice to get this IPO feedback.

Q. The use of funds in your S-1 filing indicates more of the same but with an additional war chest. Do I read this correctly to say that you do not plan to change your limited mergers and acquisitions (M&A) approach? If it’s just more of the same, what is the biggest impact existing customers should expect (besides Jared Stein showing up in a black Chevy Suburban with an entourage)?

A. We have a bias against M&A other than very limited cases. You are right that we plan no change of strategy with our usage of the funds [ed. they raised $70 million with the IPO]. Honestly, customers should expect no real change other than that they can now dig into our financials.

Q. Some of your competitors have been suggesting that the consistent losses listed in your S-1 mean that you will have to raise prices. How do you respond? Will you be able to make additional revenue from existing clients?

A. Our prices are at market levels and we intend to keep them at market. We have a fundamentally strong business that works, and it’s a healthy business, so we won’t have to do anything unnatural. [ed. I pushed back that there must be pressure to make additional revenue and upsell to existing clients.] Our upsell approach right now includes getting customers to add premium support. But we are a software company. Customers should expect us to create new independent software products every 12 – 18 months. Some existing customers will use them, some won’t. That’s the strategy – create new business by building great new software.

Q. What is the relative importance of growth in K-12 vs Higher Ed vs Corporate Learning for your model over the next two years?

A. [ed. Josh did not directly answer this one but offered the following observations.] We have four markets that we serve – domestic higher ed, domestic K-12, domestic corporate learning, and international markets. Right now our fastest growth is in corporate learning, but that product, Bridge, was just released in February. Just behind that market in terms of growth is domestic K-12, which is largely a green-field market; we’ve just gotten started. It’s interesting, but by customer count, domestic K-12 is our largest market. We have to do well and grow in all four markets.

Q. Do you have any plans you can share on how long you’ll be at the company?

A. I will stick around as long as the board will keep me. I love the job, have a lot more work to do, and have no active plans to leave.

Q. How will your IPO affect the broader ed tech market?

A. Everything is connected. The effect will be generally positive, providing an example of a little VC-backed company that launched in 2011 and has become a healthy independent company. This is a good proof point that education and training markets can support this type of company and investment.

Q. When will you aggressively target non-English-speaking international markets?

A. Both Canvas and Bridge are available in 15 languages, and Instructure has clients in 25 countries. We provide phone support in English and Spanish, and soon Portuguese. We’re adding offices in multiple international cities, including in Brazil. But we’re doing this carefully. [ed. I pushed back that in my visit to Latin America, very few people in ed tech had any real knowledge of Instructure.] You’re right – we’re just at the point of setting up legal entities in Latin America and have done no marketing. We’re in the early days.

Given the nature of an IPO and SEC rules, some of these answers are necessarily not very specific, and I take them in good faith. We’ll keep this interview in mind here at e-Literate, however, to see if the answers are backed up by future action.


Data To Back Up Concerns Of Textbook Expenditures By First-Generation Students

Thu, 2015-11-12 15:49

By Phil Hill

David Wiley has added to the conversation[1] over the use of data on college textbook pricing and student spending patterns with “The Practical Cost of Textbooks”. His key argument is to go beyond prices and spending and look at the most direct measure: asking students themselves how textbook costs have impacted them. He then looks at the Florida Virtual Campus surveys (also included in my post), concluding:

What impact does the cost of textbooks have on students? Textbook costs cause students to occasionally or frequently take fewer courses (35% of students), to drop or withdraw from courses (24%), and to earn either poor or failing grades (26%). Regardless of whether you have historically preferred the College Board number or the student survey number, a third fact that is beyond dispute is that surveys of students indicate that the cost of textbooks negatively impacts their learning (grades) and negatively impacts their time to graduation (drops, withdraws, and credits).

And yes, we need to do something about it.

Amen. With over 18,000 students surveyed, the FVC surveys are quite important and should be on everyone’s radar.

More Out Of Data

I strongly feel that this type of discussion (as well as media quotes, policy, and legislation) should use the best data available, describe that data accurately, and ask for more data where there are holes. And it often takes multiple views into multiple data sources to get a complete picture. For the College Board / IPEDS data, what it says is indisputable, but descriptions of the source should note that it comes from financial aid offices and not from student spending or budgets. For the student spending data (Student Monitor, NACS, Cal State), the data is solid and accurately described (in what I have seen), but as reported it is missing segmentation across key student demographics.

In Mike Caulfield’s post he argues that it’s all well and good that students are creatively reducing their textbook expenses by renting, borrowing, etc, but that this data can mask other important factors.

You could use skills like this to cobble together that bag of books and say *that’s* the true cost. Look what you can get your books for if you game the system right!

But to say that, you’d have to have learned nothing in the past decade about why students fail. Requiring a non-traditional student to cobble together a bag of half-priced textbooks the way a second-generation student might is setting them up for failure.

Since I have argued that the student spending data is critical and much more relevant than the College Board data, let’s explore that data further – asking for more. Richard Hershman and NACS were kind enough to agree to requests for cross-tabs on two of their questions against sector and first-generation status.

For Fall 2014, students were asked how much they spent on required course materials.

[Chart: Spending and cost concerns (1) – Fall 2014 spending on required course materials]

It turns out that Mike is right. First-generation students spend 10% more, acquire 6% fewer textbooks, and end up paying 17% more per textbook than non-first-generation students do. This data could be used as a starting point for policy that addresses the problem – explaining it and providing a measurement for improvement.
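
For readers checking the arithmetic, here is a quick back-of-the-envelope sketch of how the per-textbook premium follows from the spending and acquisition gaps; it uses only the rounded percentages above, not the underlying NACS tables:

    # Back-of-the-envelope check using the rounded percentages above
    # (illustrative only; not the NACS microdata)
    spending_ratio = 1.10      # first-generation students spend ~10% more overall
    acquisition_ratio = 0.94   # ...while acquiring ~6% fewer textbooks

    price_per_book_ratio = spending_ratio / acquisition_ratio
    print(f"Implied per-textbook premium: {price_per_book_ratio - 1:.0%}")  # roughly 17%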

For Spring 2015, students were asked to list their top 3 cost concerns for college.

[Chart: Spending and cost concerns (3) – Spring 2015 top three cost concerns for college]

First-generation students are somewhat more likely to worry about course materials (41% vs. 37%) than non-first-generation students, but the view across sectors is more telling. 2-year college students are much more likely to worry about course materials (50% vs. 37%) than 4-year college students. Tuition is lower at 2-year schools, and fewer students live on campus or away from home. So it makes sense that course material concerns would rise in relative terms (the percentage listing them among the top 3 concerns). It also makes sense that car payments / insurance / gas would be more important.

These are real issues of equity, and accurate understanding of multiple data sources is more likely to lead to effective policy decisions than using single sources. And you get that data by exploring it further and seeing where it takes you.

  1. My initial post, Mike Caulfield’s response, Bracken Mosbacker’s post, my response to Mike, and Mike’s follow-up


What Schoology’s Venture Funding Means for the LMS Market

Wed, 2015-11-11 11:05

By Michael Feldstein

LMS vendor Schoology just raised $32 million in Series D venture capital funding, bringing the total that they’ve raised to just over $57 million. If you’ve never heard of them, that’s because they have mostly been focused on K-12, where they are doing very well. But they have turned their attention to US higher ed recently. They had a surprisingly big presence at EDUCAUSE, where CEO Jeremy Friedman told me that they are prepared to make an aggressive push. Their ability to get major funding was probably helped by Instructure going to market, and possibly by the leak that Blackboard is putting itself on the block as well. I don’t generally take money guys too seriously in their ability to predict ed tech, but they may be lucky on this one. I think there may be an opening in the US higher ed LMS market for a new entrant.

LMS selection for schools often works a little like the selection process that high school students typically go through when picking a college. Students looking at colleges usually have a favorite going in. Maybe their friends are going there. Or their big brother or sister. Or maybe they just heard that it’s cool. But they don’t apply to just one college, in case it doesn’t work out for one reason or another. So they have a second tier of schools that might be OK too. Generally, they don’t know much about their favorite school going in, and they know even less about the “might be OK” schools. Depending on how cautious they are, they might throw in one or two “safety” schools that they really don’t want to go to but that they feel (or their parents feel) should be included for the sake of completeness.

Likewise, colleges and universities frequently go into an LMS evaluation process with a favorite. Because the selection is generally done by a committee of stakeholders rather than just one person, there might be conflicting opinions on what the favorite is. But more often than not, there is a nascent majority or a consensus opinion about the likely winner, at least among the core selection committee. Back in the early to mid-aughts, the default favorite was usually Blackboard because it was considered to be the safe alternative that everybody was using. When Blackboard faltered, the favorite began to split between D2L and Moodle—and occasionally Sakai, particularly for larger public universities—with type of school and geography having a big influence on which one was likely to be the frontrunner. These days, the schools that Phil and I talk to report Instructure as the starting frontrunner at least four times out of five, across school types or geographies.

But LMS selection processes still need their “might be OK” candidates. For one thing, most of them are mandated by policy or by law to do a real multi-vendor evaluation. And most evaluation committees genuinely do want to look at alternatives. Just because they have a sense going in of which alternative is most likely to be the best doesn’t mean that they are closed-minded. The trouble is that there aren’t many alternatives that selection committees feel hopeful about these days. Increasingly, Sakai and Moodle aren’t even making it to the serious evaluation stage in US higher ed evaluations; and even when they do, they are often treated like safety schools. Blackboard never fully recovered from reputational damage done under Michael Chasen and their failure to deliver on Ultra this year was a huge setback. At the moment, they are being treated like a safety school as often as not. If Ultra slips further—and maybe even if it doesn’t—they could start losing significant numbers of customers again. And we haven’t run into many schools that are particularly excited about D2L either. Probably the best I can say for them is that they are the least likely of the LMS companies that are not Instructure to be dismissed out-of-hand.

I think there’s an opportunity for a new entrant to get a fair hearing from selection committees that want a real horse race but aren’t excited about any of the incumbents. Ironically, the rise and success of Instructure has probably reduced schools’ aversion to going with a scrappy start-up. I don’t know if Schoology is going to be the one that gets a foothold in the market because of this opening, but their timing is definitely good.


New Column At EdSurge

Tue, 2015-11-10 16:22

By Phil Hill

Starting today, Michael and I are publishing a three-post series on personalized learning at EdSurge. Depending on how that goes, we could end up providing a regular column there.

The first post today is titled “Why Personalized Learning Matters to a New Generation of College Students”.

As we talk to the people on the front lines of these differentiated instructional approaches—students, faculty and staff with real-world experiences, both good and bad—the most significant theme that emerged was the challenge of helping students that come to class with wildly different starting knowledge and skill levels. Personalized learning should be seen as an effort for institutions to directly support students across the spectrum.


We’re excited to be working with EdSurge, helping them expand their coverage of higher education and helping us to share analysis and e-Literate TV content with a broader audience.

You can read the whole article here.


Asking What Students Spend On Textbooks Is Very Important, But Insufficient

Tue, 2015-11-10 08:37

By Phil Hill

Mike Caulfield responded to my post on the use of data to understand college textbook expenditures. The core of my argument is a critique of commonly cited College Board data. That data, originating from financial aid offices, leads to the conclusion that students on average either spend or budget $1,200 per year with that number rising, while more reliable data originating from students shows the number to be half that amount and dropping.

In Mike’s response post yesterday, he generally agreed with the observation but is concerned that “readers of that piece are likely to take away the wrong conclusion from Phil’s figures (even if Phil himself does not)”. There is a risk that people see the lower numbers and conclude the “crisis is overblown”, leading to this observation:

If we’re looking to find out if prices for some set of goods are too high, then by definition we cannot look at what people are spending as a reliable gauge, because one of the big effects of “prices too high” is that people can’t afford what they need.

If you don’t pay attention to this you get in all sorts of tautologies.

In the specific world of textbooks, Mike considers the lower-cost method of renting used textbooks, noting:

So which figure do we use here? The chances of getting everything you need as a rental are low. Sure, you could be the super-prepared student who knows how to work the system and get them *all* as rentals — but not every student can be first in line at the bookstore. And the ones at the back of the line — guess their socio-economic class and first generation status?

This is an important issue, and I appreciate Mike’s understanding that I am not arguing that college textbook pricing is an overblown crisis. I agree that the crisis is real and that the hardest-hit are likely low socio-economic class and first generation students.

But let’s move past these agreements and drop the gloves.

I do not believe that we are solely “looking to find out if prices for some set of goods are too high”. What we need is a richer picture that captures both prices and expenditures. This past spring, on the same College Board data subject, I wrote:

It is important to look at both types of data – textbook list prices and student expenditures – to see some of the important market dynamics at play. All in all, students are exercising their market power to keep their expenditures down – buying used, renting, borrowing, obtaining illegally, delaying purchase, or just not using at all. And textbook publishers are suffering, despite (or largely because of) their rising prices.

But there are downsides for students. Increasing numbers of students are just not using their required course materials, and students often delay purchase until well into the academic term. Whether from perceived need or from rising prices, this is not a good situation for student retention and learning.

This is not too far from Mike’s parable of the Perdimians and the concept of a food (or textbook) deficit. The richer picture should include some measurement of what is needed (the bag of textbooks) and what is spent (the expenditures) and if there are gaps. If we want to use College Board data to understand pricing, however, then we will need better guidance from the Education Department on what to include and not include (for consistency) and we need the data broken down by student demographics. The data right now is fundamentally flawed.

On the subject of impacts of pricing, Kim Thanos from Lumen Learning pointed out a Florida report from 2012 that asked students about this deficit. The findings:

[Table 6 from the Florida Virtual Campus survey: impacts of textbook costs on students]

That’s a high percentage of students avoiding textbooks, avoiding or dropping courses, and performing poorly due to costs. And this type of data is only available by asking students.

But overall, I suspect Mike and I are on the same page in terms of concerns – the crisis is real, we need both pricing and expenditure data, and poor or first-generation students might not have access to the same cost-cutting measures. On that note, if you accept 2-year vs. 4-year schools as a proxy for poor / first-generation students, the NACS data backs up Mike’s point. 2-year students are far less likely to get course materials for free.

[Chart: course material acquisition methods by sector]


As part of his argument, Mike looked at a “first year suggested schedule for a math education major” and looked up bookstore list pricing and used-textbook rental prices for those courses over a year.

  • One year of new textbooks: $1430.68
  • One year of rentals (mostly): $712.46
  • Mix, half rentals, half new: $1071.57

Getting back to the College Board usage of $1,225 for books and supplies and using the mix of half rentals, half new, Mike observes:

What I think you’ll find out is that although there is a lot of variability in the cost of textbooks for a year in different degrees, the College Board estimate is much closer to estimating what students actually need than other measures offered.

Yes, $1,225 is closer to $1,072 than are the NACS and Student Monitor figures of $530 – $640. But the benefit of asking students for data, rather than relying on financial aid self-reporting, is that the data is already far richer than the headlines. First-year students spend much more than others, as seen in the 2014 NACS report:

[Chart: highest and lowest spending segments from the 2014 NACS report]

It turns out that $1,019 is very close to Mike’s number of $1,072.

We need data on both the pricing of required course materials (including local substitutes) and expenditures by students. The poor / first-generation problem should be addressed with better data from students – cross-tabs on student expenditures and, in particular, asking them about the impacts of pricing – rather than falling back on the College Board’s use of financial aid office reporting.

Asking students how much they spend on college textbooks is an excellent question, but it is insufficient.

Thanks to Mike for extending this conversation and adding in the challenge of poor / first-generation students.


Bad Data Can Lead To Bad Policy: College students don’t spend $1,200+ on textbooks

Sun, 2015-11-08 20:56

By Phil Hill

The average US college student does not spend or budget more than $1,200 for textbooks, with that number rising each year, as commonly reported in the national media. The best data available continues to show that students spend roughly half of that amount, and that number is going down over time, not up.

Last spring I wrote a post documenting that the College Board is not a reliable source for college textbook expenditures. With last week’s release of College Board data, it is worth repeating that data for their “Books and Supplies” category are:

average amounts allotted in determining total cost of attendance and do not necessarily reflect actual student expenditures.

Much more reliable data from the National Association of College Stores (NACS) and the Student Monitor consistently show that students on average spend between $530 – $640 per year for textbooks or “required course materials”.[1]

There is also fairly clear data from NACS and Student Monitor showing that student expenditures on textbooks or “required course materials” is going down[2].

[Chart: NACS data on student expenditures per year, 2015]

[Chart: Student Monitor expenditure data, via NPR]

This week Campus Technology published an article on California State University’s Affordable Learning Solutions project, extensively quoting assistant vice chancellor for academic technology services Gerry Hanley. This article included an estimate from the system that further backs up the $530 – $640 figures.

[Hanley’s] rough estimate: As of a few years ago, learners at the 23-campus, 460,200-student university system were spending $300 million a year on course materials — about $651 per student per school year.

That $1,200 Figure

The College Board’s recent report shows the “Books and Supplies” category ranging from $1,249 to $1,364 across the college sectors.

[Chart: College Board “Books and Supplies” budgets, Fall 2015]

Furthermore, the College Board data shows this category rising year over year. Why is that? The reason is that the College Board gets this data from its member institutions, which provide the books and supplies budget estimates they use for the National Center for Education Statistics’ (NCES) Integrated Postsecondary Education Data System (IPEDS). These estimates are self-reported by campus financial aid offices, which often want to maximize student eligibility[3] for loans and grants[4].

Update: Edited paragraph above to clarify that CB gets data from NCES which in turn gets data from financial aid offices and their self reporting. NPSAS does not act as a collection point of the data.

So we have the College Board non-data implying that students spend more than $1,200 per year on textbooks, with that number rising each year, when the best data shows expenditures of roughly half that amount, $600, with that number dropping.

This is why it is frustrating to see the continued usage of College Board data in the media and now in a bill introduced in the US Senate. SB2176 is the “Affordable College Textbook Act” sponsored by Senators Durbin, Franken and King. As described in the bill:

According to the College Board, during the 2014–2015 academic year, the average student budget for college books and supplies at 4-year public institutions of higher education was $1,225.

I should note that the description given to the media by OER advocates has changed since last spring from “students typically spend” and “average student spends as much as” to the more accurate “student budget for”. But this is a technicality that, while less misleading, is still misleading. I also have not seen anything indicating that the $1,225 number has led to bad policy in this bill.

The Affordable College Textbook Act was introduced in 2013, did not get out of committee, and was reintroduced with minor changes a month ago. The bill would create a grant program for OER pilot programs, targeted at proposals that can:

demonstrate the greatest potential to— (1) achieve the highest level of savings for students [etc]

The primary purpose of the bill is to create savings for students; it should start with understanding what students actually spend and not a somewhat arbitrary financial aid number.

For what it’s worth, it should be noted that NACS is one of the groups actively supporting this bill.

Why Does This Matter?

I have had several people argue that this distinction does not really matter, but I maintain that it does for two reasons.

The first reason is simply a matter of transparency. For those advocating for open educational resources (OER), open should include open process and open data sources. The primary barrier to adoption of OER is faculty acceptance and not the existence of pilot programs. If faculty by and large start to believe that advocates are using misleading data, that will not help with getting more faculty to use open resources.

The second reason is that we have moved beyond the $1,200+ number being used just in media circles and now having it potentially embedded in legislation and policy decisions. What happens when policy makers or even vendors start to base real decisions on this number? What if a program or institution or state body decided that enough is enough, we need to cut that $1,200 average down to $700 per student by executive decision? Or what if a vendor convinced an administrator that they could supply all texts, digitally, for $700 per student as long as the school required 100% adoption? Those are hypotheticals, but if this scenario played out in some form, students would actually end up having to spend more money, not less, on textbooks or required course materials. We would also have missed the chance to spend policy capital on efforts that do help students save money.

Cal State and the Affordable Learning Solutions program should be commended for using actual student expenditure data as they set goals of reducing student expenses by 50%. If they had used the College Board data as the baseline and then surveyed actual student expenditures to determine savings, Cal State would have appeared to have almost met their goal already, and there would be little reason to push any further for savings.

There is growing acceptance of the potential for OER, which means that decisions with bigger impact are likely to be made in the future. Having misleading information that ignores the actual expenditures by students can lead to bad policy decisions.


  1. Read the spring post and a postscript if you’d like to see the details.
  2. Note that NACS used to be an every-other-year study and does not have data for AY2009 and AY2011.
  3. Update: Note that this is but one example of flawed data. There is little federal guidance on how to determine the Books & Supplies category and no consistency in how financial aid offices do so.
  4. See postscript post for more details.


Instructure Dodges A Data Bullet

Wed, 2015-11-04 10:23

By Phil Hill

Last week’s EDUCAUSE conference was relatively news-free, which is actually a good thing, as overall ed tech hype levels have come down. Near the end of the conference, however, I heard from three different sources about a growing backlash against Instructure over its developing plans for Canvas Data and real-time events. “They’re Blackboarding us”, “the honeymoon is over”, “we’re upset and that is on the record”. By all appearances, this frustration, mostly from R1 institutions, was likely to become the biggest PR challenge for Instructure since the 2012 outage, especially considering the impending IPO.

The first complaint centered on Instructure plans to charge for daily data exports as part of Canvas Data, which Instructure announced at InstructureCon in June as:

a hosted data solution providing fully optimized data to K-12 and higher education institutions capturing online teaching and learning activity. As a fundamental tool for education improvement, the basic version of the service will be made available to Canvas clients at no additional cost, with premium versions available for purchase.

What that last phrase meant was that monthly data access was free, but institutions had to pay for daily access. By the EDUCAUSE conference, institutions that are part of the self-organized “Canvas R1 Peers” group were quite upset that Instructure was essentially selling their own data back to them, and arguments about additional infrastructure costs were falling flat.

Vince Kellen, CIO of the University of Kentucky, was quite adamant on the principle that vendors should not sell institutional data back to the schools that own it. At most, vendors should charge for infrastructure.

The second complaint involved a product under development – not yet in beta – called Live Events. This product will provide access to clickstream data and live events, ideally following IMS standards and supporting the Caliper framework. Unizin is the primary customer driving this development, but the Canvas R1 Peers group is also playing an active role. The concern is that which data will be made available in real time, and how that data will be structured to allow realistic access by schools analyzing it, have not yet been defined to a level that satisfies Unizin and the Peers group.
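
For readers unfamiliar with Caliper, here is a rough sketch of the kind of event envelope a Caliper-style clickstream feed carries. The fields are condensed from the IMS Caliper model and the identifiers are hypothetical; this is not Instructure’s actual Live Events schema.

    import json
    from datetime import datetime, timezone

    # Simplified Caliper-style clickstream event (illustrative only;
    # identifiers are made up and fields are condensed from the IMS model)
    event = {
        "type": "NavigationEvent",
        "action": "NavigatedTo",
        "actor": {"id": "urn:example:user:12345", "type": "Person"},
        "object": {"id": "urn:example:course:678/pages/syllabus", "type": "Page"},
        "eventTime": datetime.now(timezone.utc).isoformat(),
    }
    print(json.dumps(event, indent=2))

The disagreement among the R1 schools is less about this basic envelope and more about which events get emitted, how they are structured, and with what delivery guarantees.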

I contacted the company midday Friday while also conducting interviews with the schools and with Unizin. Apparently the issues quickly escalated within the company, and Friday evening I got a call from CEO Josh Coates. He said that they had held an internal meeting and decided that their plans were wrong and had to change. They would no longer charge for daily access to Canvas Data. On Monday they posted a blog announcing this decision.

tl;dr: Canvas Data is out of beta. This means free daily data logs are available to all clients. [snip]

We just took Canvas Data out of beta. A beta is a chance to test, with actual clients, the technology, the user experience, and even possible for-cost add-on features. Some of the things we learned from the 30 beta institutions were that once-a-month updates aren’t enough (Canvas Data “Basic”), and charging extra for daily updates is kinda lame (Canvas Data “Plus”).

“Kinda lame” is not the Canvas Way. So we changed it: No more Canvas Data Basic vs. Plus; It’s now just Canvas Data, with daily updates of downloadable files, at no additional cost, for everyone.

Checking back with schools from the Canvas R1 Peers group and Unizin, I was told that Instructure really did defuse the Canvas Data issue with that one quick decision.

On the Live Events issue, the Canvas R1 Peers group put together a requirements document over the weekend that collected data needs from Berkeley, UT Austin, U Kentucky, and the University of Washington[1]. This document was shared with Instructure through Internet2 based on the Net+ contract with Instructure, and they are now working out the details.

Vince Kellen indicated that “Live Events is [a] real minimal start in the right direction”, but that Instructure will need to figure out how to handle, within the same system, transactional events that allow no data loss alongside clickstream data that does not require the same fidelity.

Additional sources confirmed that the Canvas Data issue was resolved and that Instructure was on the right path with Live Events, although there is still a lot of work to be done.

Amin Qazi, CEO of Unizin, stated in an email:

Yes, Unizin had an agreement which allowed access to the daily Canvas Data files without our members paying any additional fees. My understanding of the new pricing model is all Instructure Canvas customers now have a similar arrangement.

Unizin is only beginning to explore the benefits of Live Events from Canvas. We are transporting the data from Instructure to our members via cloud-based infrastructure Unizin is building and maintaining, at no cost to our members. We have started developing some prototypes to take advantage of this data to meet our objective of increasing learner success.

Unizin has had, and plans to have, discussions with Instructure regarding the breadth of the data available, the continued conformity of that data to the IMS Global standards, and certain aspects of privacy and security. Unizin believes these topics are of interest to all Instructure Canvas customers.

We understand this is a beta product from Instructure and we appreciate their willingness to engage in these discussions, and potentially dedicate time and resources. We look forward to working with Instructure to mature Live Events.

In the end, there is work remaining for Instructure to support institutions wanting to access and analyze their learning data from the LMS, but Instructure dodged a bullet by quick decision-making.

Additional Notes
  • I am still amazed that Instructure’s competitors do not understand how Instructure’s rapid and non-defensive acknowledgement and resolution of problems is a major factor in their growth. There were no excuses given this weekend, just decisions and clear communication back to customers.
  • This is the clearest demonstration of value by Unizin that I have seen. Amin’s explanation goes beyond the vague generalities that have plagued Unizin over the past 18 months and is specific and real.
  1. There might be other schools involved.

The post Instructure Dodges A Data Bullet appeared first on e-Literate.

EDUCAUSE and Robot Tutors In The Sky: When investors are your main customers

Tue, 2015-10-27 15:46

By Phil HillMore Posts (377)

Yippie i ohhh ohh ohh
Yippie i aye ye ye
Robot tutors in the sky

Before I head out to Indianapolis for the EDUCAUSE conference, I keep thinking back to a comment someone made in response to Michael’s description of Knewton marketing as “selling snake oil”. I can’t find the exact quote, but the gist was:

This is what happens when you start to see VCs as your main customers.

This viewpoint could be applied well beyond Knewton, which has successfully parlayed its marketing hype into more than $100 million raised to date (I suspect with another round in the works, based on the aggressive marketing). Martin Weller has a post out today looking back at the MOOC investment mania and lessons learned, such as “Don’t go cheap – they won’t respect you” and “Big rhetoric wins – allied with the fear factor”. The post is somewhat tongue-in-cheek and cynical in nature . . . but spot on.

Update: Ray Henderson shared a recent WSJ story about Chegg and how it is suffering from trying to increase its market valuation by granting a “ratchet”.

Tech startups eager to land sky-high valuations from investors might want to heed the cautionary tale of Chegg Inc., the textbook rental service whose stock has languished since its IPO in 2013.

In a candid interview, an early investor in Chegg revealed how the company gunned for the highest possible valuation in several funding rounds ahead of its public offering. Chegg in exchange granted venture capitalists a favorable term called a “ratchet” that guaranteed the share price in the IPO would be higher than what they paid.

The move backfired. When Chegg went public, it was motivated to set an IPO price that met the terms of the covenant, or Chegg would have to pay the difference in shares to the early investors. The stock plummeted on the first day of trading and hasn’t recovered.

The entire ed tech market finds itself in the interesting position where it is easier to raise large sums of money from VCs or private equity or strategic buyers than it is to establish real business models with paying customers.

On one hand:

  • Ed Tech private investment (seed, angel, VC, private equity) has hit an all-time high of $3.76 billion for the first 9 months of 2015, according to Ambient Insight; and
  • M&A activity in ed tech is even higher, with $6.8 billion in Q3 of 2015 alone, according to Berkery Noyes.

On the other hand:

  • In the LMS market, Blackboard is laying off staff while its owners try to find an exit, and D2L has hit a plateau despite massive investment. Instructure, while set for a half-billion-plus IPO later this year, has yet to lay out concrete plans to become profitable – and it is by far the hottest company in this market.
  • In the MOOC market, Coursera is only now getting to a repeatable revenue model, and even that is likely $20 million per year or less.
  • Other than ALEKS and MyLabs (owned by McGraw-Hill and Pearson), it is unlikely that any of the adaptive software providers have yet become profitable.
  • Etc, etc.

I am not one to argue against investment in ed tech, and I do think ed tech has growing potential when properly applied to help improve educational experiences and outcomes. However, there is a real danger when, for an extended period of time, it is much easier for companies to raise private investment or get bought out at high multiples than it is to establish real revenue models with end-user customers – mostly institutions. The risk is that the VCs and private equity funders become the main customers, and company marketing and product plans center on pleasing investors more than educators and students.

Knewton has fallen into this trap (although at $100 million-plus you could argue it is not a trap from their perspective), as have many others.

What the market needs is for more companies to focus on finding and simply delighting customers. This is a balance, as there is a trap on the other side of merely supporting the status quo. But the balance right now is heavily tilted toward pleasing investors.

This is one of the main issues I plan to watch for at the EDUCAUSE conference – how much the company messages and products are targeted at educators and students vs. how much they are targeted at investors.

The post EDUCAUSE and Robot Tutors In The Sky: When investors are your main customers appeared first on e-Literate.

The Starling: Pre-K Ed Tech

Tue, 2015-10-20 13:10

By Michael FeldsteinMore Posts (1053)

The product I am going to tell you about here was created by two of my former seventh and eighth grade students. I love these guys. So yes, I am biased. But that knowledge also presents an opportunity. I am 100% confident that they have only the best of intentions. With that in mind, I can look at the genesis of an idea for pre-K ed tech—a particularly fraught corner of a fraught field—knowing that there is no scam or hidden agenda here and see some of the challenges that arise when trying to go from best of intentions to implemented product.

I think Starling is a great idea that has the potential to do a lot of good in the world. But I also think it will raise some eyebrows.

The Boggiano brothers started their current journey when they learned that the number of words that children hear spoken to them in the first four years of their lives has an enormous cognitive, emotional, and social impact for the rest of their lives. If a child isn’t getting the right kinds of social interaction as an infant and toddler, then intervention in kindergarten may already be too late in some respects.

There is a strong socioeconomic element to this effect. Parents who have to work multiple jobs are going to have fewer opportunities to engage with their children. Parents who are not literate, or who were not read to as children, are going to be less likely to read to their children. And so on. This “word gap” perpetuates and reinforces socioeconomic gaps across generations. Furthermore, as even middle class parents are working longer hours, spending more time on smart phones, and so on, it seems like the word gap could be a potential problem for just about any parent.

So the brothers got an idea: Why not build a kind of Fitbit for infant language exposure? Why not create a device that helps parents keep track of how much they are talking to their children, set goals, and see and improve their progress toward their goals? They got some guidance from the Language Learning Lab at Stanford’s Center for Infant Studies and, along with co-founder Nicki Boyd, set out to design the product. Long story short, they have built prototypes, piloted them, and now are seeking funding to scale up production via Indiegogo.

I’ll set aside the technical complexities of designing such a product for now, but there are some business and implementation complexities worth exploring. First, in order to change the world, they have to stay in business. Who is likely to buy the product first? Not the families who have the biggest word gaps. It’s much more likely to be the families that are already giving their children every advantage possible. So that’s who they are marketing to as their first customers. There’s a Tesla-like business strategy here, not in terms of price point but in terms of trying to create a market by targeting the demographic with the most disposable income and the ability to be trend setters and taste makers. But there is a danger here that Tesla doesn’t have, particularly when you add the product’s data collection capabilities into the mix. Starling could easily be pigeon-holed—pardon the bird pun—as a product that rich parents, whose main concern is getting their child into that exclusive kindergarten program, buy to make sure that their nannies are speaking the correct number of words to the children. This is not a product design problem. It’s not even primarily a marketing problem. It’s just one of the complexities that arises when product meets world.

On the social mission end of things, simply providing economically challenged families with a gadget, however well designed it may be, is not going to magically transform families. Starling’s creators are well aware of that problem. That is why they are working with an existing non-profit called Literacy Lab that works directly with underprivileged families and is focused on the word gap and early childhood language learning problems. The idea is that Starling will be a tool that parents and their support organization can use together as part of a more holistic approach. Their Indiegogo page gives you an option to donate a Starling to a family in need being served by Literacy Lab. That’s the option I chose. But I suspect that the niche where Starling has the most potential for social impact is working class families that are economically (relatively) stable but where the parents have limited literacy, or no family history of college education, or no experience being read to, or came from a family where their own parents were working multiple jobs a week. Those are the families that can probably get the most benefit with the least amount of extra help.

The other complexity worth mentioning here is privacy. Starling doesn’t record conversations, and it’s hard for me to imagine what advertisers could do with information about how many words you spoke to your one-year-old even if the company chose to sell that information. But it’s hard not to feel at least a little uneasy about the idea of recording your interactions with your child pretty much from birth and putting that information “in the cloud.” I actually don’t know the degree to which the product uploads info to the internet at the moment. It appears that the device stores the information locally until you access it with your smart phone, which wouldn’t necessarily require any of the data to go outside of your local area network at all. But, for example, if the product is going to be used as a communication tool between parents and support folks like those at Literacy Lab, that starts pushing product development in the direction of online dashboards.

I believe in what the founders are trying to accomplish with Starling and I trust them. I have contributed to their project and I hope that you will too. But Starling is also an excellent case study in just how complicated ed tech can be, even when you come to it with a good idea and the best of intentions.

The post The Starling: Pre-K Ed Tech appeared first on e-Literate.

Why Is Blackboard Laying Off Staff Despite Improved Market Share Position?

Mon, 2015-10-19 16:12

By Phil HillMore Posts (377)

Over the past two weeks Blackboard had another round of layoffs, likely due to the company missing financial targets. While one estimate places the number at roughly 200, from what I have heard the number is closer to 90 – 100 people let go. I asked the company for commentary on the layoffs and associated reorganization. By email they declined to comment on the number of employees let go but added this comment:

These changes included the elimination of select positions across the company. We deeply appreciate the contributions made by the affected employees and are supporting them in their transition.

This is not the first layoff at Blackboard since they were taken private in 2011; rather, this is the latest in a series of cuts that have gone well beyond “trimming the fat”. Posts on sites such as Glassdoor paint a picture of high attrition due to routine layoffs and many staff leaving of their own accord. We have written about several of these events here at e-Literate. 90 here, 100 there, 74 . . . it adds up, especially when combined with staff departures.

To get another view into the company downsizing, consider that Blackboard recently signed a lease that will trim its corporate headquarters by 37%:

Founded in a Dupont Circle row house in 1997, Blackboard, which has occupied 111,895 square feet at 650 Massachusetts Ave. since 2008, will move into Ogilvy Public Relations Worldwide’s former space on the eighth through 10th floors [70,482 square feet] beginning in December 2015.

This follows a recent move in its Reston, Virginia facility that cuts its office space there by over 50%.

In an ironic turn of events, the new headquarters move will put the company into the same building it occupied before 2008, and their neighbors in the building will include former CEO Michael Chasen’s SocialRadar and CBE provider FlatWorld.

Why More Layoffs When US Market Share Finally Stabilizes?

Last week I shared a post on the current state of the US higher education LMS market – Blackboard’s core market – and noted that the company has at least for now “stopped its well-documented losses in US higher ed and has even risen in the past year”. Blackboard is also seeing real growth in international markets for both Learn and Moodle Solutions. So why would they have to resort to more layoffs? I believe the answer lies in the nature of their private equity ownership and the necessary efforts to manage the associated debt, as foretold by Michael in 2011:

The most fundamental fact you need to know about companies like Providence is that they use large amounts of debt to purchase other companies. Blackboard, as a publicly traded company, has lots of people who own little pieces of it in the form of stock shares. Some of those stock shares have been purchased with borrowed money, but a lot of them have been purchased with cash. It’s the stock market’s system of aggregating a large number of buyers with smaller amounts of cash that enables companies to get the cash they need to grow big quickly. When a private equity firm like Providence purchases a publicly traded company like Blackboard, they do so by essentially taking out a giant mortgage to buy up all the shares that are floating around in the market.

Why does this matter? It matters because giant mortgages come with giant mortgage payments. Private equity firms are able to get very good interest rates, but even a low interest rate on the purchase of a $1.5 billion company adds up to something substantial. There is a cost structure that is inherent in these deals, and that structure puts some constraints on what the new owners can do with the company.

In February of this year Blackboard acquired Schoolwires for $92 million, of which $85 million was in the form of new debt. This led Moody’s to revisit their rating on Blackboard’s corporate debt that same month, reaffirming their B2 corporate family rating but changing their outlook to negative. This essentially means that Moody’s believes it is more likely that they will have to downgrade Blackboard’s debt rating in the near future. In March they reaffirmed this position, and in April Moody’s put out a summary credit opinion (a document that requires purchase). The following notes are pulled from these three documents.

  • Estimated revenue for 2015 is $685 million, compared to $481 million for the last 12 months when they were public (July 1, 2010 through June 30, 2011) and $630 million in 2013 (estimated by Moody’s).
  • However, revenues are “stagnating … over the past several quarters, after good growth in prior years”.
  • Total rated (public) debt of $1.3 billion.
  • Estimated EBITDA (earnings before interest, taxes, depreciation & amortization) of $170 million, based on an adjusted debt/EBITDA ratio of 7.8 (see the arithmetic sketch just after this list), compared to estimated EBITDA of $120 million in 2011 [Note: some estimates place the expected EBITDA at $180 – $190 million for 2015].
  • K-12 revenue is approximately $88 million (15% of total), and it is falling by 4-6% per year.
  • “Adequate” liquidity as of March 31, 2015, based on $32 million cash, expected $30 million free cash flow per year, and availability of $100 million revolving debt.
  • The rating “could be upgraded if the company were to demonstrate meaningful revenue growth, if free-cash-flow-to-debt reaches double-digit percentages, and adjusted debt-to-EBITDA were to fall to 4.5 times on a sustained basis.”
  • The rating “could be downgraded if the company fails to show progress in reducing debt-to-EBITDA toward 7.0 times, liquidity declines materially, or the company pursues further acquisitions that add to financial leverage.”
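
As a rough check on those figures (using only the numbers Moody's published; note that Moody's "adjusted debt" may differ somewhat from the rated debt used here), the implied EBITDA and the earnings gap to the ratings thresholds work out as follows:

```python
# Back-of-the-envelope check of the Moody's figures listed above.
rated_debt = 1_300_000_000  # total rated (public) debt
leverage = 7.8              # adjusted debt-to-EBITDA ratio per Moody's

implied_ebitda = rated_debt / leverage
print(f"Implied EBITDA: ${implied_ebitda / 1e6:,.0f}M")  # ~$167M, i.e. roughly the $170M cited

# EBITDA needed, holding debt constant, to reach the ratings thresholds:
for target in (7.0, 4.5):  # ~7.0x avoids a downgrade; ~4.5x (among other factors) supports an upgrade
    needed = rated_debt / target
    gap = needed - implied_ebitda
    print(f"At {target}x leverage: ${needed / 1e6:,.0f}M EBITDA (about ${gap / 1e6:,.0f}M more)")
```

If actual EBITDA is already a bit above the $170 million estimate, the gap to the 7.0x downgrade threshold lands in the same $10 – $15 million range discussed later in this post.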

For the purposes of both selling the company (a sale often driven by EBITDA ratios) and avoiding debt rating problems, Blackboard is highly motivated to improve earnings. They are not growing in K-12 and their overall revenues are stagnating, so the best option is to cut costs. My answer to the question of ‘why layoffs now, when market share finally improves’ is ‘the need to manage debt ratios’.

Impact of Ultra Timing

In the same email exchange with Blackboard I asked about the Ultra delays and their relationship to the layoffs, both as cause and effect.

Q. As we have reported on the negative effect of the delays in releasing Ultra, what are the risks that this round of layoffs will cause further delays to Learn Ultra? Do you still expect to have Learn Ultra available for pilot release at the end of the year and general release summer 2016?

A. We are extremely proud of Learn with our Ultra experience and it is being very well-received. We recently made the Ultra experience of Learn available in technical preview with a number of elegantly designed and fully responsive workflows to support students and instructors.

We have had about 1,000 users sign up for the technical preview and we have enabled the technical preview for our more than 50 customers on Learn SaaS. We’re also gearing up to open a Learn Ultra Trial site soon and plan to continue to build on this capability and be ready for pilots with a select set of customers at the beginning of 2016.

As Michael described, the Ultra user experience upgrades are more than a year late, and general availability for Learn Ultra (the new-look core LMS product) is now forecast for mid-2016. This means that revenue gains from the company’s investment in Ultra, which Michael and I consider a significant and well-designed change built on a cloud technology stack, are not likely to occur for another year. I note that Blackboard’s spokesman did not directly answer my question about whether the layoffs risk further delays – it takes a lot of people to get a product line ready for full release, and all of the attrition at Blackboard has to have an effect.

Investment and Cost Cutting

Blackboard is caught between the need to invest in and complete a highly complex and aggressive product re-architecture, and the financial requirements of highly leveraged private equity ownership. The need to invest and the need to cut costs. While some people view the potential sale as a new risk to evaluate, it is probably better to view Blackboard’s current financial situation as the result of the risk it took on with its previous sale to private equity ownership in 2011.

Blackboard therefore finds itself in a tough situation. Assuming debt stays roughly constant, they would need to significantly increase earnings (on the order of another $10 – $15 million) while not pursuing any new debt-financed acquisitions. If they acquire another company with enough size to help their finances in the short term, their credit rating would likely be downgraded. If they do not acquire, they have to rely on current products (not including Learn Ultra) to improve their bottom line. And note that 100 employees at a software firm, including benefits, cost more than $10 million per year.

Given the company’s turnaround situation and these limitations, it is actually quite surprising how much Blackboard has achieved in getting rid of company silos and cutting headcount and costs, while simultaneously re-architecting a product line and supporting current customer configurations. The news that they are up for sale and resorting to further layoffs indicates to me that the company’s financials are much higher on the priority list as Providence looks for an exit.


Associated with the current round of layoffs, Blackboard has reversed some of its recent moves to centralize the company and is instead (re)building the product teams to have dedicated sales and marketing support. The Washington Post summarized the centralization strategy in January 2014:

Employees who were once divided among Blackboard’s individual products are now grouped based on the company’s target markets: higher education, K-12 education and international education.

Blackboard confirmed this move to focus more on product teams, including dedicated sales and marketing staff:

First, We have recently made some organizational changes that will simplify our operations, provide more accountability and focus on our products as well as help us deliver on our customers’ expanding needs. These changes are aligned with our annual goals and priorities and are designed to accelerate the business. Blackboard is dedicated to leading a transformation in education and to the success of our customers. These changes will not impact our ability to continue serving our customers in the manner to which they are accustomed or to do what is right for the education industry.

One specific area that is likely to benefit from the reorganization is Blackboard’s Moodle groups, as described in Phill Miller’s recent blog.

A few weeks ago, I blogged about the massive growth that Blackboard has seen in its Moodle client base. To further support this growth we have created a new, more focused team called Open Solutions for Moodle. Why?

In the past few months we have concluded three acquisitions in the Moodle space: Remote-Learner UK, X-Ray Analytics, and Nivel Siete. By bringing these companies together in a unified team, we are able to stop duplicating efforts and focus more on improving the experience for our clients and innovating on top of the Moodle platform.

Blackboard now has the largest team in the world working on Moodle, from developers and quality assurance people to graphic designers, support people and consultants, sales people, trainers and more. Bringing these teams together means that the Nivel Siete team in Bogota works seamlessly with the Remote-Learner UK team in England, the NetSpot team in Adelaide and the greater Moodlerooms team in Baltimore and throughout the world. The combination of these groups means that we can create a truly global community among our Moodle clients, not just focusing on Moodle but on everything that happens around it, from learning analytics or real time collaboration tools to accessibility and improving course design.

Difficult Market

There is a lot of volatility in the LMS market – Blackboard’s situation, Moodle’s inflection point, Instructure’s impending IPO, and new learning platforms from CBE, courseware and adaptive software providers. It will be interesting to get updates from Blackboard at next week’s EDUCAUSE conference and to get a better sense of customer reactions to Learn Ultra, Learn 9.1 and Moodle Solutions.

The post Why Is Blackboard Laying Off Staff Despite Improved Market Share Position? appeared first on e-Literate.

The Rise of Antisocial Deconstructivism

Sat, 2015-10-17 08:58

By Michael FeldsteinMore Posts (1052)

Phil and I gave our first ever joint keynote at the OLC conference this week. We didn’t want to just do dueling PowerPoints, so instead we tried a format that I have been calling a social constructivist keynote. Each of us would present on a topic for a few minutes, and then the two of us would talk about it for a few minutes. We planned the arc of the topics we would cover in advance, but we didn’t rehearse the talks for each other or script the conversation. The discussions came pretty close to the kind of bull shooting conversations that we have all the time. We set a time limit of eight minutes for each segment so that we could get through our presentation with enough time to bring the audience into the conversation at the end. It seemed to work pretty well.

In the course of the conversation, we spontaneously came up with a term that we both like and that seemed to resonate with the audience: antisocial deconstructivism.[1] It’s the approach of breaking learning down into teeny, tiny bits, tied to fine-grained competencies and micro-assessments, that students learn on their own by following a prescription that is created for them, possibly with the help of a robot. To be clear, the term isn’t entirely meant to mock. There are times when antisocial deconstructivism is an appropriate pedagogical technique. For example, it’s pretty good for helping nursing students memorize medical terminology or IT students learn the basic components of a network. It can be good for learning some kinds of math skills, depending on your philosophy of math education. Any situation in which you are working fairly low on Bloom’s Taxonomy might be OK for it as an approach. Procedural knowledge that doesn’t require higher-order problem-solving skills, or where problem-solving skills are best built incrementally by slowly increasing problem complexity, is a particularly good candidate for antisocial deconstructivism.

But we do mock it when it is presented not as a pedagogical technique but as a pedagogical ideology. It’s the idea that anything worth learning can be learned best, most cheaply, and “at scale” this way. It’s the fetishization of one tool in the teaching toolbox as the technological society’s Great Leap Forward. The worst, crudest examples of MOOCs, Competency-Based Education, and personalized learning software hype are all manifestations of this stunted (and self-interested) view of education. Antisocial deconstructivism is like botulism. A little bit injected in just the right spot by a trained expert can smooth out some wrinkles that bother you, treat a chronic headache, or refocus a lazy eye. A little more injected in the wrong places and you can quickly start to look like a parody of the thing that you are trying to be. Any more than that, and what you have is not a tool but a toxin.

You might be suffering from antisocial deconstructivist toxicity if you find yourself believing any of the following:

  • Short videos of lectures by Ivy League professors, coupled with little quizzes at the end, will almost always provide a better education than a class taught by a live, human, non-Ivy League professor.
  • Short vendor-produced articles or animations, coupled with little quizzes at the end, will almost always provide a better education than a class taught by a live, human, non-Ivy League professor.
  • We think of our product like a robot tutor in the sky that can semi-read your mind and figure out what your strengths and weaknesses are, down to the percentile.

If you exhibit any of these symptoms, then get yourself to a great teacher immediately and have them demonstrate for you what it is that videos, quizzes and robots cannot do.

  1. “Antisocial deconstructivism” should not be confused with “antisocial deconstructionism,” the latter of which is redundant. Anyone who writes the way Derrida did is clearly actively hostile to the idea of shared meaning making as something that provides net positive value, or even to the desire to communicate with other humans.

The post The Rise of Antisocial Deconstructivism appeared first on e-Literate.

College Scorecard Article Published In Washington Post

Fri, 2015-10-16 14:19

By Phil HillMore Posts (375)

I have written several posts looking at the new College Scorecard and its inherent data flaws, often starting with observations from Russ Poulin at WCET. Today Susan Svrluga, education reporter at the Washington Post, posted a new article co-written by me and Russ, titled “Hundreds of colleges missing from Obama’s College Scorecard?”. The gist of the article is a holistic view of the data problems and why they exist.

In a nutshell, the College Scorecard combines data from multiple sources – primarily the Education Department’s own Integrated Postsecondary Education Data System (IPEDS) and National Student Loan Data System (NSLDS) – and publishes the results both as a consumer-facing Web site and as an analyst-friendly data download. The essence of the problem is that throughout this process the data is filtered based on questionable assumptions, leading to a fuzzy lens that shows only subsets of the real data.
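
For illustration only, here is a minimal sketch (not the Department of Education's actual pipeline; the field names and values are invented) of how joining two reporting systems and then filtering on assumptions about institution type can silently shrink the set of schools that end up in the consumer view:

```python
import pandas as pd

# Invented field names and values, for illustration only.
ipeds = pd.DataFrame({
    "unit_id": [1, 2, 3, 4],
    "name": ["State U", "Community College", "Branch Campus", "Liberal Arts College"],
    "predominant_degree": ["bachelor", "associate", "certificate", "bachelor"],
})
nslds = pd.DataFrame({
    "unit_id": [1, 2, 4],          # no federal-loan cohort reported for unit 3
    "median_debt": [27000, 12000, 24500],
})

# Inner-joining the two systems silently drops any school missing from either source...
merged = ipeds.merge(nslds, on="unit_id", how="inner")

# ...and filtering on assumptions (here, predominant degree level) drops still more,
# so the consumer-facing view ends up showing only a subset of the real institutions.
scorecard_view = merged[merged["predominant_degree"].isin(["bachelor", "associate"])]

print(f"IPEDS universe: {len(ipeds)} schools -> Scorecard view: {len(scorecard_view)} schools")
```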

Poulin Hill College Scorecard Graphic

The post College Scorecard Article Published In Washington Post appeared first on e-Literate.

State of the US Higher Education LMS Market: 2015 Edition

Wed, 2015-10-14 16:42

By Phil HillMore Posts (375)

I shared the most recent graphic summarizing the LMS market in October 2014, and thanks to revised data sources it’s time for an update. As with all previous versions, the 2005 – 2009 data points are based on the Campus Computing Project and therefore reflect US adoption at non-profit institutions. This set of longitudinal data provides an anchor for the summary.

The primary data source for 2013 – 2015 is Edutechnica, which not only measures a larger number of schools more directly (viewing all schools in the IPEDS database with more than 700 FTE enrollments) but also allows scaling based on enrollment per institution. This means that the latter years now more accurately represent how many students use a particular LMS. This post at Edutechnica provides the most recent data for Fall 2015.
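
To see why enrollment weighting matters, here is a minimal sketch, using made-up institutions and numbers rather than Edutechnica's actual data or methodology, of the difference between counting institutions and counting the students they enroll:

```python
import pandas as pd

# Made-up institutions and enrollments, in the rough shape of an institution-level LMS census.
schools = pd.DataFrame({
    "institution": ["A", "B", "C", "D"],
    "lms": ["Canvas", "Blackboard Learn", "Moodle", "Canvas"],
    "fte": [35_000, 4_000, 2_500, 18_000],  # FTE enrollment per institution
})

# Unweighted share: every institution counts once, regardless of size.
share_by_institution = schools["lms"].value_counts(normalize=True)

# Enrollment-weighted share: large schools count proportionally more, which better
# approximates how many students actually use each LMS.
share_by_enrollment = schools.groupby("lms")["fte"].sum() / schools["fte"].sum()

print(share_by_institution.round(2))
print(share_by_enrollment.round(2))
```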

A few items to note:

  • The most important recent feature of the market – the rapid rise of Canvas, surpassing D2L and quickly closing in on Moodle – is quite visible now.
  • Blackboard Learn appears to have stopped its well-documented losses in US higher ed and has even risen in the past year. Its market share is far smaller than at its 2009 peak, but the company is no longer losing large numbers of clients each year.
  • D2L, Sakai, and Moodle have risen ever so slightly, but in effect have hit a plateau.
  • ANGEL and eCollege have lost market share.
  • There is a growing area of “Alternative Learning Platforms” that includes OpenEdX, 2U, Helix and Motivis (the newly commercialized learning platform from College for America).
  • While the data is more solid than 2012 and prior years, keep in mind that you should treat the graphic as telling a story of the market rather than being a chart of exact data.
  • Update (10/16): Removed “Curtin University” as origin of Moodle based on Martin’s comments. Mea culpa – should have caught this earlier.
  • Update (10/23): Changed graphic title to refer to US and not North America.


The post State of the US Higher Education LMS Market: 2015 Edition appeared first on e-Literate.

Yes, I did say that Knewton is “selling snake oil”

Tue, 2015-10-13 10:15

By Michael FeldsteinMore Posts (1050)

My comments in today’s NPR article about Knewton are getting some attention on Twitter. One comment in particular, actually. The one where I accused Knewton CEO Jose Ferreira of selling snake oil. To understand the basis of the comment, you have only to look at the quote from Jose earlier in that selfsame article:

We think of it like a robot tutor in the sky that can semi-read your mind and figure out what your strengths and weaknesses are, down to the percentile.

Can it read what I’m thinking now?

And then there’s this:

We can take the combined data power of millions of students — all the people who are just like you — [who] had to learn a particular concept before, that you have to learn today — to find the best pieces of content, proven most effective for people just like you, and give that to you every single time.

Really? Every single time?

Objectively speaking, I don’t know much about Knewton’s platform or the value that it adds. The only efficacy studies I have seen are for Pearson products, and those studies do not differentiate between Knewton features and Pearson-native features in terms of impact. If you dig deep enough on their site, you can find some information on the techniques that they use. For example, this overview paper, while not perfect, is a credible attempt to explain the techniques that the platform employs and the benefits to the users in layperson’s terms. If Knewton were doing more of this, I wouldn’t be as critical. There are still questions about the value of the product, but they are on the same level with many other products in the adaptive learning category. A company like Knewton should be working hard to demystify what their product does and providing hard, reproducible evidence that they add value.

But much of what Jose says, at least to the media, is the opposite. No responsible educator or parent should adopt a product—even if it is free—from a company whose CEO describes it as a “robot tutor in the sky that can semi-read your mind” and give you content “proven most effective for people like you every single time.” I’m sorry, but this sort of quasi-mystical garbage debases the very notion of education and harms Knewton’s brand in the process.

If you want to sell me a product that helps students to learn, then don’t insult my intelligence. Explain what the damned thing does in clear, concrete, and straightforward language, with real-world examples when possible. I may not be a data scientist, but I’m not an idiot either. If you can translate the technical mumbo jumbo into sensible teaching insights for me, then you just might make a sale. But most people wouldn’t buy a used car from somebody who describes their product using language as hyperbolic as “robot tutors in the sky.” The same principle should hold for educational technology.

The post Yes, I did say that Knewton is “selling snake oil” appeared first on e-Literate.

Instructure Files for IPO

Sun, 2015-10-11 11:50

By Michael FeldsteinMore Posts (1050)

This doesn’t exactly come as a shock, but Instructure has filed for an IPO, and is expecting a post-IPO valuation of somewhere between $500 million and $800 million. Whenever a private company does this, they have to file a form called an S-1 with the SEC, which contains all kinds of financial and strategic information. You can find Instructure’s here.

Also not a big surprise, but the numbers in the filing show big growth. Subscription revenue rose 72% from 2013 to 2014. Coincidentally, Edutechnica just published its 3rd annual LMS Data Update. As you can see, Canvas went through the roof in US higher ed while the other major LMS players were either flat or close to it:

But in order to achieve that growth, they have been running at a loss. The problem seems to be in high expenses rather than low revenues, which suggests that the losses are coming from the company bulking up rather than from them undercutting the competition on price. Also, most of those revenues still come from the US. Instructure has offices in London, Hong Kong, and Sydney. That’s not a huge international presence. And as it happens, Edutechnica has data for the UK and Australia:

You can see that Instructure has not made a major dent yet in either country. (You can also see that their biggest competitor by far in those two countries is Blackboard, particularly when you consider that Blackboard now owns major Moodle hosting operations in both countries.)

One last bit that jumped out at me after a quick scan of the S-1 wasn’t financial. Their characterization of analytics was interesting. Throughout the document, they try to make the case that high user engagement and utilization lead to better learning analytics. For example,

Strong User Engagement Leads to Robust Data Analytics

Given today’s focus on accountability and performance, both academic institutions and companies are striving to improve learning outcomes. To do so, an organization must first understand the variables that impact results, such as attendance metrics, user engagement, and the efficacy of various learning content and technologies for individual learners. A learning management system has the potential to provide significant insight to educators and administrators on their students’ and employees’ progress toward meeting learning objectives and the factors impacting performance. In addition, such learning management systems can facilitate insightful benchmarking to allow organizations to explore other drivers of learning outcomes.

Strong user engagement with learning management systems is critical to maximize the potential of data analytics. High utilization enables the learning management system to capture more data, and leads to more insightful analyses on user behavior, quality of individual courses and effectiveness of digital content. Better analytics enables instructors and administrators to make more informed decisions about instruction and materials that in turn drive improved learning outcomes and performance for individuals and companies. This virtuous cycle among user engagement, data analytics and learning outcomes represents the evolution of learning technology. We believe that the market increasingly is demanding learning management software that delivers both robust analytics and strong user engagement.

They later go on to say,

Over eight million instructors, students and employees have used our software over the 12 months ended June 30, 2015. According to self-reported data in an ECAR 2014 survey, 58% of faculty in higher education use a learning management system to share content with students, while our internal analysis of higher education institutions using Canvas shows that 71% of faculty use Canvas to share content with students.

But their claim in terms of what they actually have for data analytics is fairly weak:

Our platform provides users with open API access to data analytics. We deliver the analytics in an easy to understand and consumable way, that is optimized for independent analysis. This open visibility allows learners to view their own progress in real-time, educators to adjust programs and personalize curricula for maximum effectiveness and organizations to benchmark user data internally and respond to patterns observed.

They are not arguing that they have good analytics, but rather that the high utilization of the software plus open APIs will enable customers to build good analytics themselves.
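
For a sense of what "customers build good analytics themselves" looks like in practice, here is a minimal sketch of pulling course data from Canvas's documented REST API. The /api/v1 paths and Link-header pagination follow Canvas's published conventions; the domain, token, and what you do with the results are placeholders.

```python
import requests

# Placeholders: use your own Canvas domain and an API token generated in your account settings.
BASE_URL = "https://your-institution.instructure.com/api/v1"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}


def list_courses(per_page: int = 50):
    """Yield courses visible to the token's user, following Canvas's Link-header pagination."""
    url = f"{BASE_URL}/courses?per_page={per_page}"
    while url:
        resp = requests.get(url, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        yield from resp.json()
        url = resp.links.get("next", {}).get("url")  # None when there are no more pages


if __name__ == "__main__":
    for course in list_courses():
        print(course.get("id"), course.get("name"))
```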

The post Instructure Files for IPO appeared first on e-Literate.

Response to Robert Talbert: Pedagogical change is difficult, many need support

Wed, 2015-10-07 12:51

By Phil HillMore Posts (372)

On Monday Robert Talbert, associate professor at Grand Valley State University and author of the Casting Out Nines blog, wrote a provocative and important post titled “Active learning as an ethical issue”. Robert noted:

The recent Proceedings of the National Academy of Sciences study stands out among these recent studies. It is a meta-study of 225 prior studies on active learning, and the results are bracing: students in these studies who were in classes focused on lecture and direct instruction in the classroom were 55% more likely to fail their courses than their counterparts in active learning focused classes, and scored almost half a standard deviation lower than their active learning counterparts on exams.

This sentence from the PNAS study stopped me in my tracks when I first read it:

“If the experiments analyzed here had been conducted as randomized controlled trials of medical interventions, they may have been stopped for benefit—meaning that enrolling patients in the control condition might be discontinued because the treatment being tested was clearly more beneficial.”

Robert’s central point is that active learning should be thought of as an ethical issue, where it could be considered unethical to withhold treatment. He then asks why faculty might withhold active learning and listed four reasons: self-preservation, laziness, a weird and irrational superiority complex, and legitimate external forces (such as overly controlling school structure).

The argument is an interesting and compelling one based on the study, and it is worth reading the whole article and his follow-up post. I wish we treated teaching and learning more often as an ethical issue, but I would add one additional reason that the active learning treatment is not more prevalent. This one comes from our discussions with faculty and support staff as part of our e-Literate TV series on personalized learning, and Michael and I summarized the point in the introduction episode. In a nutshell, changing to active learning designs (described as personalized learning in the series, but the term overlaps with active learning in the context of this discussion) often or usually comes along with a fundamental change in the role of the faculty and TAs involved. This changing role is profound and not easy, especially if faculty try to make changes on their own without peer or staff support.

Michael Feldstein: And going along with that was a willingness for faculty and for students to really ask some hard questions about the roles that they needed to take in the classroom, right? This is no longer, “I go up as a faculty member, and I lecture. I tell you what you know. And you, as a student, dutifully write it down and regurgitate it on the test.”

Faculty have to be comfortable letting go of a certain amount of control. We heard that over and over again. And students need to be comfortable and confident taking a certain amount of control over their education.

Phil Hill: Just taking a step back, I can’t emphasize enough what we’re talking about, the fact that this personalized learning, where it’s done in an appropriate manner, absolutely doesn’t replace the faculty. So, we just need to move past a lot of that part of the discussion.

But that change in role is significant. So, when you say, “You’ve got to give up some control,” we’re not just talking a slight adjustment in your teaching; you’re talking about something deeply held, internal beliefs. And part of the implication there is not just that it’s significantly different, but it also means it takes a lot of work and a lot of time to make that transition in a role.

And then, to pick up on your other point, students are coming in, and they need to be much more part of an active learning experience. Well, they’ve gone through, likely, the K–12 system, where they’ve almost been taught to be passive learners, or that’s sort of their expectations.

But now they’re coming in, and they’re being asked to do a lot of active work—to really stay up to speed, not put off work and cram right before the exams, but come in prepared to the classes. And a lot of times, they’re teaching themselves. So, those two changes in roles are very significant, and they take time for people to deal with.

Michael also noted this challenge of knowing how to change in his recent post on lectures.

Following the IHE piece on Essex County College’s struggles to get good outcomes from their personalized learning program in developmental math, and following my blog post on the topic, Phil and I had an interesting exchange about the topic in email with ECC’s Vice President for Planning, Research, and Assessment Doug Walercz. With his permission, I’d like to share some of his observations with you. One of the big takeaways from the conversation, for me, is that our cultural notion of the pedagogical work that happens in a good lecture is pretty impoverished relative to the reality. We don’t have a clear understanding of all the things that a good lecture accomplishes, and therefore we often lose valuable elements of student support when we try to replace it. This has pretty serious implications for MOOCs, flipped classrooms, personalized learning, and a wide array of pedagogical approaches that replace a traditional in-person lecture with something else.

Michael then shared Doug’s email describing his viewpoints on expert blindness and the nature of combining non-cognitive aspects of teaching with content delivery. Michael concludes:

We don’t fully understand what we are doing well now. Therefore, when we attempt to deconstruct it and then reconstruct it in a different environment, we don’t really know what we will miss or how we will need to retrain our instructors so that we won’t miss it. That’s why it is so important to undertake these sorts of experiments thoughtfully, self-critically, and iteratively.

The point is that changing to any new pedagogy – active learning, adaptive, personalized, etc. – changes the role of faculty and the methods of providing support to students in significant ways. I would add this difficulty with understanding and implementing change to Robert’s list of reasons why the active learning treatment has been withheld. Using Robert’s argument that this is an ethical issue, this reason should not prevent such a change, but it does help explain why many faculty have not yet changed and why additional support for faculty might be needed to allow more extensive adoption of active learning.

Some faculty will be able to make these changes to active learning on their own – think of them as autodidacts in learning about learning – but if we want deeper changes, then we need to acknowledge that many, if not most, faculty will need support to do so.

In Robert’s follow-up post, he makes an important point about assessment and effectiveness:

But also do this: Gather formative assessment data on a regular basis and see what students are actually learning. Don’t try to base the effectiveness of your teaching on how much passion and verve you appear to bring to lectures; don’t base it on summative assessments where the data come too late for students to act on them; don’t base it on how many students talk in your discussions or how bright and bushy tailed they appear to be. Base it on data that you collect about student learning.

Then do this: Analyze your assessment data when you get it, and objectively decide whether your teaching is helping students learn. And if it isn’t, consider how you might change, and then make the change.

The post Response to Robert Talbert: Pedagogical change is difficult, many need support appeared first on e-Literate.