
Michael Feldstein

What We Are Learning About Online Learning...Online

ASU, edX and The Black Knight: MOOCs are not dead yet

Wed, 2015-04-22 18:24

By Phil Hill

In 2012 I wrote a post during the emergence of MOOC mania, pointing out some barriers that must be overcome for the new model to survive.

So what are the barriers that must be overcome for the MOOC concept (in future generations) to become self-sustaining? To me the most obvious barriers are:

  • Developing revenue models to make the concept self-sustaining;
  • Delivering valuable signifiers of completion such as credentials, badges or acceptance into accredited programs;
  • Providing an experience and perceived value that enables higher course completion rates (most today have less than 10% of registered students actually completing the course); and
  • Authenticating students in a manner that satisfies accrediting institutions or hiring companies that the student's identity is actually known.

[Figure: MOOC evolution diagram from the original 2012 post]

Since that time, of course, the MOOC hype has faded away, partly because the above barriers were not overcome.

Today, Arizona State University (ASU) and edX announced a new program, Global Freshman Academy, that takes direct aim at all four barriers and could be the most significant MOOC program yet. From the New York Times story:

Arizona State University, one of the nation’s largest universities, is joining with edX, a nonprofit online venture founded by M.I.T. and Harvard, to offer an online freshman year that will be available worldwide with no admissions process and full university credit.

In the new Global Freshman Academy, each credit will cost $200, but students will not have to pay until they pass the courses, which will be offered on the edX platform as MOOCs, or Massive Open Online Courses.

Later in the article we find out more details on pricing and number of courses.

The new program will offer 12 courses — eight make up a freshman year — created by Arizona State professors. It will take an unlimited number of students. Neither Mr. Agarwal nor Mr. Crow would predict how many might enroll this year.

The only upfront cost will be $45 a course for an identity-verified certificate. Altogether, eight courses and a year of credit will cost less than $6,000.
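The pricing above can be sanity-checked with some quick arithmetic. A minimal sketch, assuming (hypothetically, since the article does not state credit hours) that each course carries three credits:

```python
# Rough sketch of the Global Freshman Academy pricing described above.
# Assumption (not stated in the article): each course is worth 3 credits.
CREDIT_PRICE = 200       # dollars per credit, paid only after passing a course
CERT_FEE = 45            # upfront fee per course for an identity-verified certificate
COURSES = 8              # courses making up a freshman year
CREDITS_PER_COURSE = 3   # hypothetical value for illustration

upfront = CERT_FEE * COURSES
credit_cost = CREDIT_PRICE * CREDITS_PER_COURSE * COURSES
total = upfront + credit_cost

print(f"Upfront certificates: ${upfront}")
print(f"Credit charges if all courses passed: ${credit_cost}")
print(f"Total for the freshman year: ${total}")
```

Under that hypothetical credit load the total comes to $5,160, consistent with the article's "less than $6,000" figure; the actual amount depends on the real credit hours per course.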

ASU will pay for the course development and edX will pay for the platform. They eventually hope to get foundation funding, but ASU president Michael Crow promised that “we’re going ahead no matter what”.

This is a big commitment, and it will be interesting to see the results of a program that addresses revenue models, identity verification, completion rates and awarding actual credit. As Crow described:

“We were not big believers in MOOCs without credit, courses without a connection to degrees, so we focused our attention on building degree programs,” Mr. Crow said.

Pay attention to this one, whether you’re a MOOC fan or not.

The post ASU, edX and The Black Knight: MOOCs are not dead yet appeared first on e-Literate.

Cisco’s Collaborative Knowledge: Further blurring of higher ed & professional dev lines

Tue, 2015-04-21 17:31

By Phil Hill

Cisco, which at one time was the most valuable company in the world, made an announcement that apparently got no one’s attention (outside of the venerable e-Literate). Cisco[1] released a new product, Collaborative Knowledge (CK), designed to allow companies to access real-time expertise and enable collaborative work based on employees’ expertise, or, in a word, competencies. From the press release (because I cannot find an independent news article to reference):

To be positioned for growth, performance and productivity, organizations must transform into digital workplaces where knowledge sharing, learning and talent innovation are able to occur in real-time, anytime, anywhere.

Cisco Collaborative Knowledge integrates best-in-class consumer and business technologies to enable capabilities such as highly secure knowledge sharing, expert identification, continuous learning, social networking and analytics into one complete and end-to-end enterprise knowledge exchange. With Cisco Collaborative Knowledge, workers are able to benefit from these continuous learning features, helping organizations innovate and solve real-world business challenges.

Beyond the Buzzwords, What Is It?

The key description here is “knowledge sharing, expert identification, continuous learning, social networking and analytics”. The best way to conceptualize this product is by *not* viewing it as an LMS, which in corporate circles tends to be designed around formal learning programs and learning administrators’ needs. Like Instructure’s Bridge product, the new Cisco offering is designed around end-user needs, and it seems to be a very different approach – not knowledge management, but employee access to knowledge, learning and networking based on expertise.

[Dilbert comic: http://dilbert.com/strip/2000-06-13]

Unlike an LMS, Cisco CK attempts to leverage informal, or tacit, knowledge by building up profiles of employees that include endorsed knowledge maps.

[Screenshot: example employee profile with endorsed knowledge map]

In aggregate, a company builds up a knowledge map that allows employees to browse and search.

[Screenshot: company-wide knowledge map]

One of the core use cases is for an employee to do a context-sensitive universal search across employees, communities, libraries and training catalogs. Once users with the endorsed skills matching a search are identified, there is a built-in capability to contact those employees by phone, email, or WebEx virtual discussion.

[Screenshot: expert search results with WebEx contact button]

In another tab of results, you can find communities – which include discussions, blogs & wikis.

[Screenshots: community search results and an example community]

This product seems to hit the right notes in terms of helping end users – employees – get their jobs done, in contrast to historical learning or knowledge systems that feel like they exist to make some learning department's job easier. Probably the most interesting aspect to me, in terms of corporate knowledge & collaboration, is how a full implementation of Cisco CK could reorganize a company along lines of personal knowledge, networking and experience and away from hierarchies and linear control.

Use in Higher Ed

During the demo, the group mostly described usage of Cisco CK within companies, or, perhaps as a nod to me being on the call, “also in higher ed”. To be honest, I don’t see that a straightforward implementation of the product suite makes sense within a college or university. While the concept makes sense on paper, universities (especially faculty) are organized into semi-autonomous departments, divisions or colleges where cross-campus collaboration is not encouraged unless it is for a defined academic program. I could see faculty viewing this as a time sink, not wanting to be “catalogued”, and not wanting people to be able to access them with one click. I could be wrong here, but it seems like a cultural mismatch.

I could see Cisco CK applied across a discipline-specific group, but in many cases it would be difficult to know who the purchasing entity is and who would administer the system. Cisco’s example video released along with the product announcement was based on the New York Academy of Sciences, a scientific society, which somewhat backs up this supposition. There might be other direct uses in education, but likely not in higher education institutions. Let me know in the comments if I’m missing something.

What is relevant to higher ed in my mind, however, is not the idea of institutional implementations but rather the set of uses that could be enabled by connecting to external data sources. During the demo the team explained that the system does allow access to multiple sites, but some integrations are not there yet. If a user searches for a particular skill or competency, one of the search results will include relevant sections of the training catalog. I believe the primary source here is intended to be the corporate LMS. But what if the “catalog” includes continuing education courses offered by partner institutions? What about MOOCs targeted at professional development – particularly following the concept of the Open Education Alliance or Coursera’s Specializations?

This track seems to be a real opening for educational providers – whether institutions in a continuing education role or alternative providers – to more directly connect to employers and their money. The service might not just be for courses but also for external experts as shown in the video above. This move could further blur the line between higher education and professional development.

[Screenshot: training catalog search results]

Furthermore, let’s look at the knowledge map in each employee’s profile. Right now it seems set up as an internal database, with Cisco providing an internal LinkedIn-style service for its customers. I asked if Cisco had plans to allow external definitions of the knowledge map, such as direct integration with LinkedIn. They indicated ‘it is on the roadmap’. If that happens, you could see a direct mapping of actual competencies from someone’s education into company-endorsed expertise. You could be known within a company not just as ‘Sarah with an accounting degree working in corporate finance’, but as ‘Sarah with expertise in amortization and competitive analysis’.

I do not know enough about the corporate knowledge / training market to judge whether Cisco CK will be a success, but the product is intriguing. If they go down the path of integrating external sources of training or education opportunities, and if they go down the path of acknowledging a LinkedIn definition of skills (or perhaps competencies coming from a CBE degree), then this announcement could be quite significant. It would accelerate the move towards companies defining, from the demand side, what educational opportunities they want for their employees and what skills or competencies they want from college graduates.

There is a growing movement among companies, especially technology companies, to value skills and competencies. What Cisco CK offers is a view of how this valuation is not just a matter of hiring college graduates: it is moving into how a company operates and how employees are valued over time based on their acquired knowledge. Cisco CK also has the potential to offer a valuable marketplace for post-degree or alternative-to-degree education providers.

From a long-term perspective, count Cisco CK as a view towards a redefinition of what institutions and alternative educational providers produce as outputs – not just degrees and grades, but also skills and competencies and lifelong learning opportunities.

  1. Disclosure: Cisco, through a different division, is a client of MindWires Consulting.


2U Learning Platform Update: Removal of Moodle, addition of accessibility options

Mon, 2015-04-20 06:25

By Phil Hill

2U has now been a public company for over a year, and it has had what is easily the most successful education IPO in recent history. Shares have almost doubled from $13.00 at IPO to $25.50 last week. At the same time, there is a swirl of news around their new partner Yale and its Physician Assistant program – first the announcement of a program from one of the elite of elite schools, then the news that accreditation approval for the new program is not going to be as easy as hoped.

While both aspects are newsworthy, I’d like to dive deeper into their infrastructure and learning platforms. The company is far from complacent, as they continue to make significant changes.

One emerging trend that both Michael and I have been covering is the growing idea that there are real benefits to be gained when pedagogy and platform are developed in parallel. From Michael’s intro to the Post-LMS series:

Reading Phil’s multiple reviews of Competency-Based Education (CBE) “LMSs”, one of the implications that jumps out at me is that we see a much more rapid and coherent progression of learning platform designs if you start with a particular pedagogical approach in mind. CBE is loosely tied to family of pedagogical methods, perhaps the most important of which at the moment is mastery learning. In contrast, questions about why general LMSs aren’t “better” beg the question, “Better for what?” Since conversations of LMS design are usually divorced from conversations of learning design, we end up pretending that the foundational design assumptions in an LMS are pedagogically neutral when they are actually assumptions based on traditional lecture/test pedagogy. I don’t know what a “better” LMS looks like, but I am starting to get a sense of what an LMS that is better for CBE looks like. In some ways, the relationship between platform and pedagogy is similar to the relationship former Apple luminary Alan Kay claimed between software and hardware: “People who are really serious about software should make their own hardware.” It’s hard to separate serious digital learning design from digital learning platform design (or, for that matter, from physical classroom design). The advances in CBE platforms are a case in point.

2U is following the same concept. Their pedagogy is based on small discussion sections (they boast an average class size of ~11 students) within masters-level programs, combining synchronous discussions in a Brady Bunch-style video grid.

[Screenshot: live course session]

They also use a Bi-directional Learning Tool (BLT). The following video references the ill-fated Semester Online program, but the tool applies to all their customers.

2U’s approach also adds in custom-developed video segments that act as case studies.

Learning Platform Keeps Connect, Removes Moodle

Initially 2U patched together Moodle as an LMS and Adobe Connect for web conferencing in the video sessions, developing custom tools and applications to tie it all together. In addition to the learning platforms used within the courses, 2U also developed custom enrollment projection, marketing, support and application services, but in this post I’m going to focus on the learning components.

In an interview with James Kenigsberg, CTO, and Rob Cohen, President & COO, they described the rationale for the recent changes as architectural in nature – moving to a more modular approach and improving reliability. James and Rob said that their learning platforms are absolutely a pairing of technology and pedagogy; in their terms, agnostic platforms don’t accomplish much.

James described their origins of using Moodle with the belief that it’s “OK to start with a bowl of spaghetti code if you understand what you want”, and said that this is their second refactoring of the code in the past six years. They had already heavily customized the Moodle code, but now 2U will have all Moodle components out of the platform by the end of CY 2015. In their description, Moodle was great to start with as the base, but now they need a different approach.

2U relies heavily on Adobe Connect, with access to video tools and rooms available throughout the overall learning platform. The rationale for Adobe Connect (vs. Blackboard Collaborate, for example) is that Connect provides a persistent “room” for each faculty member, allowing them to customize it, add their own content & quizzes, set up polls, and handle general configuration[1]. This room is then available to them throughout their courses. Other tools tend to have separate meeting instances, such that the content, configuration and setup are no longer available after the meeting. For general configuration of the room, faculty members using 2U’s platform can make choices such as only allowing students to speak in the virtual room when they raise their hand vs. letting everyone talk unmuted.

For the technology stack, 2U is based on Amazon Web Services (AWS), with files saved to Amazon’s S3 storage service. The BLT is built on AngularJS.

Accessibility

2U has also taken advantage of the combined platform + pedagogy approach to make some improvements in accessibility as well. In this area, however, the benefit comes more from combining platform and content than from pure pedagogy.

For sight-impaired students, there is already compatibility with screen readers such as JAWS, but there is also a new audio-overlay feature that is interesting. For the case study videos, 2U enables an option for students to hear a narrated audio track in parallel with the recorded video’s playback. For example, in this video from the social work program at USC, the Abby character is talking to a social worker. The audio track option adds descriptions to give the video context for sight-impaired students, such as:

Later, Abby rushes into Carol’s office. [dialogue] Abby sits down. [dialogue]

[Screenshot: Abby before the flashback]

During one transition, Abby describes her memories from childhood, and the audio overlay describes the scene:

In a flashback, ten year old Abby lies across her bed doing homework. Fran looks in. [dialogue] Abby sits up and gathers her books. [dialogue]

[Screenshot: Abby flashback scene]

This tight integration works because the same people working on the platform are also working on the course material.

For hearing-impaired students, 2U has added two different transcript capabilities. One choice is a full transcript below the video[2].

[Screenshot: full transcript displayed below the video]

Another choice is to overlay the transcript as in closed-caption style.

[Screenshot: closed-caption-style transcript overlay]

As there are more efforts to create online courses and programs, the topic of accessibility is becoming more important. Just this month, edX settled with the Department of Justice, even as there are lawsuits against Harvard and MIT for their usage of the platform.

EdX, an online learning platform that Harvard co-founded with MIT in 2012, entered into a settlement agreement with the Department of Justice on Thursday and will address alleged violations of the Americans with Disabilities Act. That settlement could come to bear on a separate but similar lawsuit against Harvard that revolves around issues of accessibility online.

Namely, the edX settlement will require the platform to become accessible for people with disabilities—including those who are deaf or visually impaired. [snip]

The settlement comes as the National Association of the Deaf sues Harvard and MIT for allegedly discriminating against the deaf and hard of hearing by not providing online captioning both for the courses they offer through edX and the rest of their online content. The private lawsuit, filed in February, accuses the University of violating both the American with Disabilities Act and the Rehabilitation Act, which requires that educational institutions that receive federal funding provide equal access to disabled individuals. Legal experts have said that the suits against Harvard and MIT has merit.

This challenge of supporting students with disabilities within online courses has been a difficult one to solve, particularly as real solutions require the platform to have generic capabilities, the content (often created by individual faculty on their own prerogative) to follow appropriate guidelines, and transcripts / captions and audio to be added.

2U has the benefit of being directly involved in all three areas and by having their learning platforms designed and customized for their specific pedagogical approach.

Standing Apart in Crowded Market

2U’s approach is unique in the crowded market of Online Service Providers, or “enablers”. 2U is vertically integrated and focused on niche programs – high-tuition masters programs at elite institutions. Most of the competition – Pearson EmbaNet, Wiley Deltak, LearningHouse, Academic Partnerships, etc – are going in different directions that include broad offerings (masters, bachelors, broad range of pedagogy).

I was a little late in covering 2U, largely because of my discomfort with two interdependent aspects of their business.

Furthermore, this vertically-integrated company goes against much of the movement towards interoperability and breaking down walled gardens. But the company is growing and seems to be quite successful, and I do like the strong focus on academic quality and student support. It is worth understanding how this tight combination of platform and pedagogy within the company plays out.

  1. Note: I believe that Bb Collaborate has an option for persistent faculty sessions, but the core design is based on events.
  2. In both cases I’m showing the mouse hover to also show the platform selection tool.


Interesting Comment on Pearson’s LMS Plans From Customer

Thu, 2015-04-16 06:20

By Phil Hill

On April 1, long-time eCollege (aka Pearson’s LearningStudio) customer Texas Christian University (TCU) gave an update on their LMS selection process to the student newspaper TCU360. In this article there was an interesting statement[1] worth exploring [emphasis added].

“eCollege” will soon be a thing of the past.

TCU has narrowed its search for a Learning Management System to two platforms, Blackboard and Desire2Learn (D2L).

“We’ve had feedback, from faculty specifically, that it’s time for change,” Assistant Provost of Educational Technology and Faculty Development Romy Hughes said.

TCU has used Pearson’s Learning Studio system since 1999.

“Pearson is out of the learning management system game,” Hughes said. “We need something to evolve with the Academy of Tomorrow and where we’re moving to at TCU.”

That last comment got my attention. The eCollege / LearningStudio platform has been around for a long time, and there have been questions about where Pearson was going in the LMS market based on 2011’s introduction of OpenClass. Would OpenClass replace LearningStudio over time, and would it strongly change the LMS market? Would both OpenClass and LearningStudio continue as standalone LMS products? It is quite clear by now that OpenClass itself has not changed the market, but LearningStudio has a long-time customer base of fully online programs – many in the for-profit sector.

Furthermore, with Pearson’s reorganization around efficacy, their core document states (p. 14):

The overarching idea was that our investments should be driven towards those products which deliver the highest impact for learners while sustaining us financially so we can continue to invest in new models and improvements.

There is a question of whether Pearson’s internal reviews around LearningStudio and OpenClass are leading to strategic changes around their position in the LMS market.

I asked for Pearson to provide official comment, and David Daniels, president of Pearson Education, responded with the following clarification.

Pearson has not left the LMS space and will continue to invest in our current generation MyLabs and support our many customers on LearningStudio into the future. Pearson’s Learning Studio still powers over 3 Million enrollments annually in the fully remote, online learning space. Our commitment to servicing these students and their institutions is unwavering. Our focus has been and will be on how we support these students within the learning environment. Our range of support services includes learning design and assessment support, integration, data and analytics, student retention, tutoring, and technical support.

This statement is quite clear that there is no imminent end-of-life for LearningStudio, and it is also quite clear about Pearson's focus on the “fully remote, online learning space”. The system is primarily used by fully online programs, but there have been a handful of campus-wide clients such as TCU still using the system from the early days. That Pearson LearningStudio would not be appropriate for TCU’s future is partially explained by this focus on fully online programs.

The statement does make an interesting distinction, however, between investing in MyLabs and supporting LearningStudio. My read is that Pearson is not investing in LearningStudio in terms of major product advances and next generation plans but is continuing to fully support current customers. My read is also that Pearson would add new customers to LearningStudio if part of a broader deal tied to content or online “enabling” services (such as Embanet), but that there is no plan for the company to compete in pure LMS competitions.

To help back up this reading, I discovered that the TCU360 article was updated as follows:

“Pearson is out of the learning management system game,” Hughes said. “We need something to evolve with the Academy of Tomorrow and where we’re moving to at TCU.”

Hughes said Pearson withdrew from the LMS search process for TCU but remains an LMS provider.

At TCU, at least, the competition is down to Blackboard and D2L, with D2L in the driver’s seat. This competition is also notable for Canvas not being one of the finalists (a situation I haven’t seen lately).

One final note on TCU’s selection process described in the article.

These percentages were based on a 214-item questionnaire called the Review Request for Information (RFI) document. These questions were used to assess whether or not a system had the features that TCU was looking for.

“Most LMS vendors told us it took them exactly three months to complete [the questionnaire] because there were so many specific details we were looking for,” Hughes said.

I’ve said it before and I’ll say it again – making a strategic platform selection by a laundry list of hundreds of detailed feature requirements is not a healthy process. I would not brag that it took vendors three full months to complete a questionnaire. But we have one more example to clarify Michael’s classic “Dammit, the LMS” post.

Do you want to know why the LMS has barely evolved at all over the last twenty years and will probably barely evolve at all over the next twenty years? It’s not because the terrible, horrible, no-good LMS vendors are trying to suck the blood out of the poor universities. It’s not because the terrible, horrible, no-good university administrators are trying to build a panopticon in which they can oppress the faculty. The reason that we get more of the same year after year is that, year after year, when faculty are given an opportunity to ask for what they want, they ask for more of the same.

I’d be willing to bet that the vast majority of those 214 items in the RFI are detailed features or direct derivatives of what TCU already has. Even if I’m wrong, it makes little sense for a school to specify the future with detailed requirements; they’re selecting a vendor, not specifying a new design. I wish TCU the best in their LMS selection process, but I would recommend that they put more emphasis on strategic analysis and less on counting check-boxes.

  1. Statement from the original article before it was updated.


Ellucian Buys Helix LMS, But Will It Matter?

Wed, 2015-04-15 09:14

By Phil Hill

At this year’s Ellucian users’ conference #elive15, one of the two big stories has been that Ellucian acquired the Helix LMS, including taking on the development team. I have previously described the Helix LMS in “Helix: View of an LMS designed for competency-based education” as well as the subsequent offer for sale in “Helix Education puts their competency-based LMS up for sale”. The emerging market for CBE-based learning platforms is quickly growing, at least in terms of pilot programs and long-term potential, and Helix is one of the most full-featured, well-designed systems out there.

The Announcement

From the announcement:

Ellucian has acquired Helix Education’s competency-based education LMS and introduced a 2015 development partner program to collaborate with customers on the next-generation, cloud-only solution.

As the non-traditional student stands to make up a significant majority of learners by 2019, Ellucian is investing in technologies that align with priorities of colleges and universities it serves. CBE programs offer a promising new way for institutions to reduce the cost and time of obtaining a high-quality degree that aligns with the skills required by today’s employers.

I had been surprised at the announcement of intent-to-sell in December, noting:

The other side of the market effect will be determined by which company buys the Helix LMS. Will a financial buyer (e.g. private equity) choose to create a standalone CBE platform company? Will a traditional LMS company buy the Helix LMS to broaden their reach in the quickly-growing CBE space (350 programs in development in the US)? Or will an online service provider and partial competitor of Helix Education buy the LMS? It will be interesting to see which companies bid on this product line and who wins.

And I am surprised at the answer – a private-equity-owned ERP vendor. Throughout the mid 2000s there was talk about ERP vendors like SunGard Higher Education (SGHE) (which combined with Datatel in 2011 and was renamed Ellucian in 2012) and Oracle entering the LMS market by acquisition, yet this did not materialize beyond the dreaded strategic partnership... until perhaps this week. But the Helix LMS was designed specifically for CBE programs, not general usage, so is this really a move into the broader LMS market?

When I interviewed Helix Education about the LMS last summer, they stated several times that the system could be used for non-CBE programs, but there is no evidence that this has actually occurred. I’ll admit that it is more likely to expand a CBE system into general usage than it is to convert a course-based traditional LMS into a CBE system, but it is not clear that the end result of such an expansion would remain a compelling product with user experience appreciated by faculty and students. The path is not risk-free.

Based on briefings yesterday at #elive15, there is evidence that:

  • Ellucian plans to expand the Helix LMS (which will be renamed) beyond CBE; and
  • Ellucian understands that there is development still remaining for this broader usage[1].

[Slide photo courtesy Ryan Schwiebert: Ellucian LMS – “Support for broad set of delivery models: CBE, Online, Hybrid, Blended, Traditional, CE/WFD”]

One Challenge: Strategy

But there are already signs that Ellucian is not committed to delivering an LMS with “support for broad set of delivery models”. As described at Inside Higher Ed:

At its user conference in New Orleans, Ellucian announced the acquisition of Helix Education’s learning management system. The company will “blend” the software, which supports nontraditional methods of tracking student progress, into its student information system, said Mark Jones, chief product officer at Ellucian. While he stressed that the company is not planning to become a major learning management system provider, Ellucian will make the system available to departments interested in offering competency-based education.

“The initial goal and focus is on enabling competency-based education programs to flourish,” Jones said. “In terms of being a broader L.M.S. solution, if our customers find value… we will certainly have that conversation.”

I asked Jim Ritchey, president of Delta Initiative, who is attending the conference, for his reaction to Ellucian’s strategy. Jim noted that the reaction at the conference to the news “seemed to be more of a curiosity than interest”, and then added:

To me, one of the key questions is how Ellucian will “blend” the software. Do they mean that schools will be able to post the results of the competency based courses to the SIS, or are they talking about leveraging other products within the LMS? For example, some of the capabilities of Pilot could be leveraged to deliver additional capabilities to the LMS. The concern I would have is that tying the LMS to other products will cause the LMS development to be dependent on the roadmaps of the other products. Ellucian will need to find the right level of independence for the LMS so it can grow as a solution while using other products to enhance capabilities. Will the LMS get lost?

In addition, there is the differing nature of the products to consider. The Helix LMS is centered on the learner and the learner’s schedule, while Banner, Colleague, and PowerCampus are centered on academic terms and courses. These differing design concepts could cause the blending process to remove some of the unique value of the LMS.

Another Challenge: Execution

On paper, this deal seems significant. The company with arguably the greatest number of US higher ed clients now owns an LMS that not only has a modern design but also is targeted at the new wave of CBE programs. The real question, however, is whether Ellucian can pull this off based on their own track record.

Since the 2011 acquisition of SGHE by the private equity firm Hellman & Friedman, Ellucian has endured wave after wave of layoffs and cost cutting measures. I described in 2011 how the SGHE acquisition could pay for itself.

If Hellman & Friedman can achieve reasonable efficiencies by combining SGHE with Datatel, this investment could potentially justify itself in 5 – 7 years by focusing on cash flow operating income, even without SGHE finding a way to reverse its decline in revenue.

Add to this Ellucian’s poor track record of delivering on major product upgrades. The transition from Banner 8 to Banner 9, or later to Banner XE, was described in 2008, promised in 2010, re-promised in 2011, and updated in 2012 / 2013. Banner XE is actually a strategy and not a product. To a degree, this is more a statement about the administrative systems / ERP market in general than about Ellucian specifically, but the point is that this is a company in a slow-moving market. Workday’s entry into the higher education ERP market has shaken up the current vendors – primarily Ellucian and Oracle / Peoplesoft – and I suspect that many of Ellucian’s changes are in direct response to Workday’s new market power.

Ellucian has bought itself a very good LMS and a solid development team. But will Ellucian have the management discipline to finish the product development and integration that hits the sweet spot for at least some customers? Furthermore, will the Ellucian sales staff sell effectively into the academic systems market?

A related question is why Ellucian is trying to expand into this adjacent market. Ellucian already seems to suffer from having too many products, and an LMS addition that requires a new round of development from the outset could be a distraction. As Ritchey described after the 2012 conference (paraphrasing what he heard from other attendees):

The approach makes sense, but the hard decisions have not been made. Supporting every product is easy to say and not easy to deliver. At some point in time, they will finalize the strategy and that is when we will begin to learn the future.

In The End . . .

The best argument I have read for this acquisition was provided by Education Dive.

Ellucian is already one of the largest providers of cloud-based software and this latest shift with Banner and Colleague will allow its higher education clients to do even more remotely. Enterprise resource planning systems help colleges and universities increase efficiency with technology. Ellucian touts its ERPs as solutions for automating admissions, creating a student portal for services as well as a faculty portal for grades and institutional information, simplifying records management, managing records, and tracking institutional metrics. The LMS acquisition is expected to take the data analytics piece even further, giving clients more information about students to aid in retention and other initiatives.

But these benefits will matter if and only if Ellucian can overcome its history and deliver focused product improvements. The signals I’m getting so far are that Ellucian has not figured out its strategy and has not demonstrated its ability to execute in this area. Color me watchful but skeptical.

  1. See the “development partner program” part of the announcement.

The post Ellucian Buys Helix LMS, But Will It Matter? appeared first on e-Literate.

GSV 2015 Review

Sun, 2015-04-12 11:06

By Michael FeldsteinMore Posts (1024)

The basic underlying theme of the 2015 GSV Ed Innovation conference is “more is more.” There were more people, more presentations, more deal-making, more celebrities…more of everything, really. If you previously thought that the conference and the deal-making behind it were awesome, you would probably find this year to be awesomer. If you thought it was gross, you would probably think this year was grosser. Overall, it has gotten so big that there is just too much to wrap your head around. I really don’t know how to summarize the conference.

But I can give some observations and impressions.

More dumb money: Let’s start with a basic fact: There is more money coming into the market.

If there is more total money coming in, then it stands to reason that there is also more dumb money coming in. I definitely saw plenty of stupid products that were funded, acquired, and/or breathlessly covered. While it wasn’t directly conference-related, I found it apropos that Boundless was acquired right around the time of the conference. I have made my opinions about Boundless clear before. I have no opinion about Valore’s decision to acquire them, in large part because I don’t know the important details. It might make sense for a company like Valore to acquire Boundless for their platform—if the price is right. But this doesn’t appear to be a triumph for Boundless or their investors. To the contrary, it smells like a bailout of Boundless’ investors to me, although I admit that I have no evidence to prove that. If the company were doing so awesomely, then I don’t think the investors would have sold at this point. (Boundless, in typical Boundless fashion, characterizes the transaction as a “merger” rather than an “acquisition.” #Winning.) Of course, you wouldn’t know that this is anything less than the total takeover of education from the breathless press coverage. Xconomy asks whether the combined company will be the “Netflix of educational publishing.”

Really?

So yeah, there’s plenty of dumb money funding dumb companies, aided and abetted by dumb press coverage. But is there proportionally more dumb money, or is there just more dumb money in absolute terms as part of the overall increase in investment? This is an important question, because it is a strong indicator of whether the idiocy is just part of what comes when an immature industry grows or whether we are in a bubble. This particular kind of market analysis is somewhat outside my wheelhouse, but my sense, based on my fragmented experience of the conference added to other recent experiences and observations, is that it’s a bit of both. Parts of the market have clearly gotten ahead of themselves, but there are also some real businesses emerging. Unsurprisingly, some of the biggest successes are not the ones that are out to “disrupt” education. Apparently the ed tech company that got the most money last year was Lynda.com, which, in addition to being a good bet, doesn’t really compete head-on with colleges (and, in fact, sells to schools). Phil has written a fair bit about 2U; that company only exists because they have been able to get high-end schools to trust them with their prestige brands. This brings me to my next observation:

More smart money: 2U is a good example of a company that, if you had described it to me in advance, I probably would have told you that it never could work. The companies that do well are likely to be the ones that either figure out an angle that few people see coming or execute extremely well (or, in 2U’s case, both).[1] 2U is also one of the very few ed tech companies that have made it to a successful IPO (although there are more that have been successfully sold to a textbook publisher, LMS vendor, or other large company). I am seeing more genuinely interesting companies getting funding and recognition. Three recent examples: Lumen Learning getting angel funding, Acrobatiq winning the ASU-GSV Return on Education Award, and Civitas closing Series C funding a couple of months ago. I also had more interesting and fewer eye-rolling conversations at the conference this year than in past years. Part of that is because my filters are getting better, but I also think that the median educational IQ of the conference attendees has risen a bit as at least some of the players learn from experience.

Textbooks are dead, dead, dead: McGraw Hill Education CEO David Levin was compelled to start his talk by saying, essentially, “Yeah yeah yeah, everybody hates textbooks and they are dying as a viable business. We get it. We’re going to have all digital products for much less money than the paper textbooks very soon, and students will be able to order the paper books for a nominal fee.” He then went on to announce a new platform where educators can develop their own content.

Pay heed, OER advocates.

I saw Mark Cuban: He has noticeably impressive pecs. Also,

Arizona is nicer than Massachusetts in early April.

  1. Corollary: Companies trying to be the “Netflix of education” or the “Uber of education” or the “Facebook of education” will usually turn out to be as ridiculous—meaning “worthy of ridicule”—as they sound.

The post GSV 2015 Review appeared first on e-Literate.

Why LinkedIn Matters

Sat, 2015-04-11 12:54

By Michael FeldsteinMore Posts (1024)

A few folks have asked me to elaborate on why I think LinkedIn is the most interesting—and possibly the most consequential—company in ed tech.

Imagine that you wanted to do a longitudinal study of how students from a particular college do in their careers. In other words, you want to study long-term outcomes. How did going to that college affect their careers? Do some majors do better than others? And how do alumni fare when compared to their peers who went to other schools? Think about how you would get the data. The college could ask alumni, but it would be very hard to get a good response rate, and even then, the data would go stale pretty quickly. There are governmental data sources you could look at, but there are all kinds of thorny privacy and regulatory issues.

There is only one place in the world I know of where bazillions of people voluntarily enter their longitudinal college and career information, keep it up-to-date, and actually want it to be public.

LinkedIn.

LinkedIn is the only organization I know of, public or private, that has the data to study long-term career outcomes of education in a broad and meaningful way. Nobody else comes close. Not even the government. Their data set is enormous, fairly comprehensive, and probably reasonably accurate. Which also means that they are increasingly in a position to recommend colleges, majors, and individual courses and competencies. An acquisition like Lynda.com gives them an ability to sell an add-on service—“People who are in your career track advanced faster when they took a course like this one, which is available to you for only X dollars”—but it also feeds their data set. Right now, schools are not reporting individual courses to the company, and it’s really too much to expect individuals to fill out comprehensive lists of courses that they took. The more that LinkedIn can capture that information automatically, the more the company can start searching for evidence that enables them to reliably make more fine-grained recommendations to job seekers (like which skills or competencies they should acquire) as well as to employers (like what kinds of credentials to look for in a job candidate). Will the data actually provide credible evidence to make such recommendations? I don’t know. But if it does, LinkedIn is really the only organization that’s in a position to find that evidence right now. This is the enormous implication of the Lynda.com acquisition that the press has mostly missed, and it’s also one reason of many why Pando Daily’s angle on the acquisition—“Did LinkedIn’s acquisition of Lynda just kill the ed tech space?“—is a laughable piece of link bait garbage. The primary value of the acquisition wasn’t content. It was data. It was providing additional, fine-grained nodes on the career graphs of their users. 
Which means that LinkedIn is likely to do more acquisitions and more partnerships that help accomplish the same end, including providing access to that data for companies and schools to do their own longitudinal outcomes research. Far from “killing ed tech,” this is the first step toward building an ecosystem.

Credit: Social Media Delivered


The post Why LinkedIn Matters appeared first on e-Literate.

LinkedIn: I Told You So (Sorta)

Sat, 2015-04-11 05:54

By Michael FeldsteinMore Posts (1024)

In December 2012, I tweeted:

Let it be known that I was the first to predict that Coursera will be acquired by LinkedIn.

— Michael Feldstein (@mfeldstein67) December 5, 2012

At the time, Coursera was the darling of online ed startups. Since then, it has lost its way somewhat, while Lynda.com has taken off like a rocket. Which is probably one big reason why LinkedIn chose to acquire Lynda.com (rather than Coursera) for $1.5 billion. I still think it’s possible that they could acquire a MOOC provider as well, but Udacity seems like a better fit than Coursera at this point.

I’ve said it before and I’ll say it again: LinkedIn is the most interesting company in ed tech.

The post LinkedIn: I Told You So (Sorta) appeared first on e-Literate.

About the Diverging Textbook Prices and Student Expenditures

Mon, 2015-03-30 16:56

By Phil HillMore Posts (307)

This is part 3 in this series. Part 1 described the most reliable data on A) how much US college textbook prices are rising and B) how much students actually pay for textbooks, showing that the College Board data is not reliable for either measure. Part 2 provided additional detail on the data sources (College Board, NCES, NACS, Student Monitor) and their methodologies. Note that the textbook market is moving into a required course materials market, and throughout this series I use both terms somewhat interchangeably based on which source I’m quoting. They are largely equivalent, but not identical.

Based on the most reliable data we have, average college textbook prices are rising at three times the rate of inflation while average student expenditures on textbooks are remaining flat or even falling, in either case below the rate of inflation. Average student expenditures of approximately $600 per year are about half of what gets commonly reported in the national media. The combined chart comes from this GAO Report (using CPI data) and this NPR report (using Student Monitor data).

Combined Chart

Does this indicate a functioning market, and does this indicate that we don’t have a textbook pricing problem? No, and no.

Why Are Student Expenditures Not Rising Along With Prices?

The answer to this question can be partly found in the financials of the major publishing companies. If students were buying new textbooks at the same rate as they used to, publishing companies would be thriving instead of cutting thousands of employees or even resorting to bankruptcy to stay afloat. Students are increasingly choosing not to buy new textbooks.

Let’s look at the NACS data (this one from Fall 2013 data, new data coming out later this week):

NACS 2013 Did Not Acquire

Notes

A few notes to highlight:

  • 30% of surveyed students chose not to acquire at least one required course material. On average, these students skipped acquiring three textbooks in just one term.
  • The top reason in this report is not based on price: 38.5% chose not to acquire required course materials because they felt the materials were not needed or wanted, and 30.2% chose not to acquire based on price.
  • Combining answer categories, 38.5% chose to borrow the course materials or found them “available elsewhere without purchase”.
  • From the following page (not shown), when asked what students used to substitute for non-acquired course materials:
    • 57.1% just used notes from class;
    • 46.5% borrowed material from friends or libraries; and
    • 19.1% got the chapter or material illegally.

Average expenditures don’t capture the full story, and later in the report it is noted that:

  • Students at two-year colleges spent 31% more than the average on required course materials;
  • Overall first year students spent 23% more than the average on required course materials; and
  • Overall second year students spent 10% more than the average on required course materials.

In other words, the high enrollment courses in the first two years lead to the highest student expenditures on textbooks. Note that we’re still not talking about $1,200 per year spending as often reported based on College Board data, even for these first two years.
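Using the roughly $638 per-capita annual figure from the NACS Student Watch data (an assumption carried over from the survey report, not stated alongside these percentages), the “more than the average” figures translate into approximate dollar amounts with a back-of-the-envelope sketch:

```python
# NACS-reported per-capita annual spend on required course materials
# (assumed baseline; the survey percentages are relative to this average)
average = 638

premiums = {
    "Two-year college students": 0.31,
    "First-year students": 0.23,
    "Second-year students": 0.10,
}

# Translate "X% more than the average" into approximate dollars
for label, premium in premiums.items():
    print(f"{label}: ~${average * (1 + premium):.0f}")
```

This works out to roughly $836, $785, and $702 per year respectively, which is the point of the paragraph above: even the highest-spending segments remain well under the $1,200 figure commonly cited.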

Student Monitor also captures some information of note.

  • They report identical data – 30% choosing not to acquire at least one textbook.
  • 29% of students report they bought ‘required course materials’ that ended up not being used. Of these students, 52% say they will be more likely to “wait longer before purchasing course materials”.
  • They categorize the reasons for not acquiring textbooks differently; professor not using the “required” material was listed by 22% of students, lower than affordability at 31%.
  • 73% of students who downloaded textbooks illegally did so to “save money”.
Negative Impact on Students

It is important to look at both types of data – textbook list prices and student expenditures – to see some of the important market dynamics at play. All in all, students are exercising their market power to keep their expenditures down – buying used, renting, borrowing, obtaining illegally, delaying purchase, or just not using at all. And textbook publishers are suffering, despite (or largely because of) their rising prices.

But there are downsides for students. An increasing number of students are simply not using their required course materials, and students often delay purchase until well into the academic term. Whether from perceived lack of need or from rising prices, this is not a good situation for student retention and learning.

The post About the Diverging Textbook Prices and Student Expenditures appeared first on e-Literate.

Postscript on Student Textbook Expenditures: More details on data sources

Fri, 2015-03-27 12:20

By Phil HillMore Posts (307)

There has been a fair amount of discussion around my post two days ago about what US postsecondary students actually pay for textbooks.

The shortest answer is that US college students spend an average of $600 per year on textbooks despite rising retail prices.

I would not use College Board as a source on this subject, as they do not collect their own data on textbook pricing or expenditures, and they only use budget estimates.

<wonk> I argued that the two best sources for rising average textbook price are the Bureau of Labor Statistics and the National Association of College Stores (NACS), and when you look at what students actually pay (including rental, non-consumption, etc) the best sources are NACS and Student Monitor. In this post I’ll share more information on the data sources and their methodologies. The purpose is to help people understand what these sources tell us and what they don’t tell us.

College Board and NPSAS

My going-in argument was that the College Board is not a credible source on what students actually pay:

The College Board is working to help people estimate the total cost of attendance; they are not providing actual source data on textbook costs, nor do they even claim to do so. Reporters and advocates just fail to read the footnotes.

Both the College Board and National Postsecondary Student Aid Study (NPSAS, official data for the National Center for Education Statistics, or NCES) currently use cost of attendance data created by financial aid offices of each institution, using the category “Books and Supplies”. There is no precise guidance from DOE on the definition of this category, and financial aid offices use very idiosyncratic methods for this budget estimate. Some schools like to maximize the amount of financial aid available to students, so there is motivation to keep this category artificially high.

The difference is three-fold:

  • NPSAS uses official census reporting from schools, while the College Board gathers data from a subset of institutions – their member institutions;
  • NPSAS reports the combined “Average net price” and not the sub-category “Books and Supplies”; and
  • College Board data is targeted at full-time freshman students.

From an NCES report just released today, based on 2012 data (footnote to figure 1):

The budget includes room and board, books and supplies, transportation, and personal expenses. This value is used as students’ budgets for the purposes of awarding federal financial aid. In calculating the net price, all grant aid is subtracted from the total price of attendance.

And the databook definition used, page 130:

The estimated cost of books and supplies for classes at NPSAS institution during the 2011–12 academic year. This variable is not comparable to the student-reported cost of books and supplies (CSTBKS) in NPSAS:08.

What’s that? It turns out that in 2008 NCES actually used a student survey – asking students what they spent rather than asking financial aid offices for a net-price budget calculation. NCES fully acknowledges that the current financial aid method “is not comparable” to student survey data.

As an example of how this data is calculated, see this guidance letter from the state of California [emphasis added].

The California Student Aid Commission (CSAC) has adopted student expense budgets, Attachment A, for use by the Commission for 2015-16 Cal Grant programs. The budget allowances are based on statewide averages from the 2006-07 Student Expenses and Resources Survey (SEARS) data and adjusted to 2015-16 with the forecasted changes in the California Consumer Price Index (CPI) produced by the Department of Finance.

The College Board asks essentially the same question of the same sources. I’ll repeat again – the College Board is not claiming to be an actual data source for what students actually spend on textbooks.

NACS

NACS has two sources of data: bookstore financial reporting from member institutions and a Student Watch survey report put out in the fall and spring of each academic year. NACS started collecting student expenditure data in 2007, initially every two years, then every year, then twice a year.

NACS sends their survey through approximately 20 – 25 member institutions to distribute to the full student population for that institution or a representative sample. For the Fall 2013 report:

Student WatchTM is conducted online twice a year, in the fall and spring terms. It is designed to proportionately match the most recent figures of U.S. higher education published in The Chronicle of Higher Education: 2013/2014 Almanac. Twenty campuses were selected to participate based on the following factors: public vs. private schools, two-year vs. four-year degree programs, and small, medium, and large enrollment levels.

Participating campuses included:

  • Fourteen four-year institutions and six two-year schools; and
  • Eighteen U.S. states were represented.

Campus bookstores distributed the survey to their students via email. Each campus survey fielded for a two week period in October 2013. A total of 12,195 valid responses were collected. To further strengthen the accuracy and representativeness of the responses collected, the data was weighted based on gender using student enrollment figures published in The Chronicle of Higher Education: 2013/2014 Almanac. The margin of error for this study is +/- 0.89% at the 95% confidence interval.
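The quoted margin of error is consistent with the standard formula for a survey proportion. A quick sketch, assuming simple random sampling and the conservative p = 0.5 (the report does not spell out its exact method, and its gender weighting is ignored here):

```python
import math

# Margin of error for a proportion at the 95% confidence level
# (z = 1.96), using p = 0.5 for maximum variance.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

# NACS Fall 2013: 12,195 valid responses
print(f"{margin_of_error(12_195):.2%}")  # prints 0.89%
```

The same formula applied to Student Monitor's 1,200 respondents gives roughly ±2.8%, in the neighborhood of (though not identical to) the ±2.4% that survey reports.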

I interviewed Rich Hershman and Liz Riddle, who shared the specific definitions they use.

Required Course Materials: Professor requires this material for the class and has made this known through the syllabus, the bookstore, learning management system, and/or verbal instructions. These are materials you purchase/rent/borrow and may include textbooks (including print and/or digital versions), access codes, course packs, or other customized materials. Does not include optional or recommended materials.

The survey goes to students who report what they actually spent. This includes the categories of sharing materials, choosing not to acquire, rental, purchase new and purchase used.

The data is aggregated across full-time and part-time students, undergraduates and graduates. So the best way to read the data I shared previously ($638 per year) is as per-capita spending. The report breaks down further by institution type (2-yr public, etc.) and acquisition method (purchase new, rental, etc.). The Fall 2014 data is being released next week, and I’ll share more breakdowns with this data.

In future years NACS plans to expand the survey to go through approximately 100 institutions.

Student Monitor

Student Monitor describes their survey as follows:

  • Conducted each Spring and Fall semester
  • On campus, one-on-one intercepts conducted by professional interviewers during the three week period March 24th to April 14th, 2014 [Spring 2014 data] and October 13th-27th [Fall 2014 data]
  • 1,200 Four Year full-time undergrads (Representative sample, 100 campuses stratified by Enrollment, Type, Location, Census Region/Division)
  • Margin of error +/- 2.4%

In other words, this is an intercept survey conducted with live interviews on campus, targeting full-time undergraduates. This includes the categories of sharing materials, choosing not to acquire, rental, purchase new and purchase used.

In comparison to NACS, Student Monitor tracks more schools (100 vs. 20) but fewer students (1,200 vs. 12,000).

Despite the differences in methodology, Student Monitor and NACS report spending that is fairly consistent (both on the order of $600 per year per student).

New Data in Canada

Alex Usher from Higher Education Strategy Associates shared a blog post in response to my post that is quite interesting.

This data is a little old (2012), but it’s interesting, so my colleague Jacqueline Lambert and I thought we’d share it with you. Back then, when HESA was running a student panel, we asked about 1350 university students across Canada about how much they spent on textbooks, coursepacks, and supplies for their fall semester. [snip]

Nearly 85% of students reported spending on textbooks. What Figure 1 shows is a situation where the median amount spent is just below $300, and the mean is near $330. In addition to spending on textbooks, another 40% or so bought a coursepack (median expenditure $50), and another 25% reported buying other supplies of some description (median expenditure: also $50). Throw that altogether and you’re looking at average spending of around $385 for a single semester.

Subtracting out the “other supplies” that do not fit in NACS / Student Monitor definitions, and acknowledging that fall spending is typically higher than spring due to full-year courses, this data is also in the same ballpark of $600 per year (slightly higher in this case).

Upcoming NPSAS Data

The Higher Education Act of 2008 required NCES to add student expenditures on course materials to the NPSAS database, but this has not been added yet. According to Rich Hershman from NACS, NCES is using a survey question quite similar to the NACS one and is field testing it this spring. The biggest difference will be that NPSAS is annual data, whereas NACS and Student Monitor send out their surveys in fall and spring (then combine the data).

Sometime in 2016 we should have better federal data on actual student expenditures.

</wonk>

Update: Mistakenly published without reference to California financial aid guidance. Now fixed.

Update 3/30: I mistakenly referred to the IPEDS database for NCES when this data is part of National Postsecondary Student Aid Study (NPSAS). All references to IPEDS have been corrected to NPSAS. I apologize for confusion.

The post Postscript on Student Textbook Expenditures: More details on data sources appeared first on e-Literate.

How Much Do College Students Actually Pay For Textbooks?

Wed, 2015-03-25 07:16

By Phil HillMore Posts (306)

With all of the talk about the unreasonably high price of college textbooks, the unfulfilled potential of open educational resources (OER), and student difficulty in paying for course materials, it is surprising how little is understood about student textbook expenses. The following two quotes illustrate the most common problem.

Atlantic: “According to a recent College Board report, university students typically spend as much as $1,200 a year total on textbooks.”

US News: “In a survey of more than 2,000 college students in 33 states and 156 different campuses, the U.S. Public Interest Research Group found the average student spends as much as $1,200 each year on textbooks and supplies alone.”

While I am entirely sympathetic to the need and desire to lower textbook and course material prices for students, no one is served well by misleading information, and this information is misleading. Let’s look at the actual sources of data and what that data tells us, focusing on the aggregate measures of changes in average textbook pricing in the US and average student expenditures on textbooks. What the data tells us is that students spend on average $600 per year on textbooks, not $1,200.

First, however, let’s address the all-too-common College Board reference.

College Board Reference

The College Board positions itself as the source for the cost of college, and their reports look at tuition (published and net), room & board, books & supplies, and other expenses. This chart is the source of most confusion.

College Board Chart

The light blue “Books and Supplies” data, ranging from $1,225 to $1,328, leads to the often-quoted $1,200 number. But look at the note right below the chart:

Other expense categories are the average amounts allotted in determining total cost of attendance and do not necessarily reflect actual student expenditures.

That’s right – the College Board just uses budget estimates for the books & supplies category, and this is not at all part of their actual survey data. The College Board does, however, point people to one source that they use as a rough basis for their budgets.

According to the National Association of College Stores, the average price of a new textbook increased from $62 (in 2011 dollars) in 2006-07 to $68 in 2011-12. Students also rely on textbook rentals, used books, and digital resources. (http://www.nacs.org/research/industrystatistics/higheredfactsfigures.aspx)

The College Board is working to help people estimate the total cost of attendance; they are not providing actual source data on textbook costs, nor do they even claim to do so. Reporters and advocates just fail to read the footnotes. The US Public Interest Research Group is one of the primary reasons that journalists use the College Board data incorrectly, but I’ll leave that subject for another post.

The other issue is the combination of books and supplies. Let’s look at actual data and sources specifically for college textbooks.

Average Textbook Price Changes

What about the idea that textbook prices keep increasing?

BLS and Textbook Price Index

The primary source of public data for this question is the Consumer Price Index (CPI) from the Bureau of Labor Statistics (BLS). The CPI sets up a pricing index based on a complex regression model. The index is set to 100 for December 2001, when they started tracking this category. Using this data tool for series CUUR0000SSEA011 (college textbooks), we can see the pricing index from 2002 – 2014[1].

CPI Annual

This data equates to roughly 6% year-over-year increases in the price index of new textbooks, roughly doubling every 12 years. But note that this data is not inflation-adjusted, as the CPI is used to help determine the inflation rate. Since US inflation over 2002 – 2014 averaged roughly 2%, this means that textbook prices are rising at roughly three times the rate of inflation.
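The arithmetic behind those two claims is straightforward; a quick sketch using the approximate rates quoted above:

```python
import math

nominal_growth = 0.06  # ~6% annual increase in the textbook CPI series
avg_inflation = 0.02   # ~2% average US inflation over 2002-2014

# Years for the price index to double at a constant 6% annual rate
doubling_years = math.log(2) / math.log(1 + nominal_growth)
print(f"doubling time: {doubling_years:.1f} years")  # prints 11.9 years

# Ratio of the nominal growth rates quoted in the text
print(f"growth ratio: {nominal_growth / avg_inflation:.0f}x inflation")  # prints 3x
```

(The familiar "rule of 72" shortcut gives the same ballpark: 72 / 6 = 12 years to double.)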

NACS and Average Price Per Textbook

NACS, as its name implies, surveys college bookstores to determine what students spend on various items. The College Board uses them as a source. This is the most concise summary, also showing rising textbook prices on a raw, non-inflation-adjusted basis, although at a lower rate of increase than the CPI.

The following graph for average textbook prices is based on data obtained in the annual financial survey of college stores. The most recent data for “average price” was based on the sale of 3.4 million new books and 1.9 million used books sold in 134 U.S. college stores, obtained in the Independent College Stores Financial Survey 2013-14.

NACS Avg Textbook Price

Other Studies

The Government Accountability Office (GAO) did a study in 2013 looking at textbook pricing, but their data source was the BLS. This chart, however, is popularly cited.

GAO Chart

There are several private studies done by publishers or service companies that give similar results, but by definition these are not public.

Student Expenditure on Books and Supplies

For most discussion on textbook pricing, the more relevant question is what do students actually spend on textbooks, or at least on required course materials. Does the data above indicate that students are spending more and more every year? The answer is no, and the reason is that there are far more options today for getting textbooks than there used to be, and one choice – choosing not to acquire the course materials – is rapidly growing. According to Student Monitor, 30% of students choose to not acquire every college textbook.

Prior to the mid 2000s, the rough model for student expenditures was that roughly 65% purchased new textbooks and 35% bought used textbooks. Today, there are options for rentals, digital textbooks, and courseware, and the ratios are changing.

The two primary public sources for how much students spend on textbooks are the National Association of College Stores (NACS) and The Student Monitor.

NACS

The NACS also measures average student expenditure for required course materials, which is somewhat broader than textbooks but does not include non-required course supplies.

The latest available data on student spending is from Student Watch: Attitudes & Behaviors toward Course Materials, Fall 2014. Based on survey data, students spent an average of $313 on their required course materials, including purchases and rentals, for that fall term. Students spent an average of $358 on purchases of “necessary but not required” technology, such as laptops and USB drives, over the same period.

NACS Course Material Expenditures

Note that by the nature of analyzing college bookstores, NACS is biased towards traditional face-to-face education and students aged 18-24.

Update: I should have described the NACS methodology in more depth (or probably need a follow-on post), but their survey is distributed through the bookstore to students. Purchases through Amazon and Chegg, rentals, and decisions not to purchase are all captured in that study. It’s not flawless, but it is not just for purchases through the bookstore. From the study itself:

Campus bookstores distributed the survey to their students via email. Each campus survey fielded for a two week period in October 2013. A total of 12,195 valid responses were collected. To further strengthen the accuracy and representativeness of the responses collected, the data was weighted based on gender using student enrollment figures published in The Chronicle of Higher Education: 2013/2014 Almanac. The margin of error for this study is +/- 0.89% at the 95% confidence interval.
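For reference, the stated ±0.89% margin is consistent with a standard 95% confidence interval for a simple random sample of that size, assuming the worst-case proportion of 0.5. This is a back-of-the-envelope check, not a reconstruction of NACS’s actual weighting methodology:

```python
import math

n = 12195  # valid responses reported in the study
p = 0.5    # worst-case proportion, which maximizes the margin of error
z = 1.96   # z-score for a 95% confidence interval

# Standard margin of error for a proportion from a simple random sample
margin = z * math.sqrt(p * (1 - p) / n)
print(f"margin of error: +/-{margin:.2%}")  # +/-0.89%
```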

Student Monitor

Student Monitor is a company that provides syndicated and custom market research, and they produce extensive research on college expenses in the spring and fall of each year. This group interviews students for their data rather than analyzing college bookstore financials, which is a different methodology than NACS. Based on the Fall 2014 data specifically on textbooks, students spent an average of $320 per term ($640 per year), which is quite close to the $638 per year calculated by NACS. Based on information from page 126:

Average Student Acquisition of Textbooks by Format/Source for Fall 2014

  • New print: 59% of acquirers, $150 total mean
  • Used print: 59% of acquirers, $108 total mean
  • Rented print: 29% of acquirers, $38 total mean
  • eTextbooks (unlimited use): 16% of acquirers, $15 total mean
  • eTextbooks (limited use): NA% of acquirers, $9 total mean
  • eTextbooks (file sharing): 8% of acquirers, $NA total mean
  • Total for Fall 2014: $320 mean
  • Total on Annual Basis: $640 mean
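The per-format figures above are additive: each “total mean” is an average across all surveyed students, so summing them reproduces the $320 term total, and doubling approximates the annual figure. A minimal sketch (omitting the file-sharing format, which reports no dollar mean):

```python
# Per-format "total mean" spending from the Student Monitor Fall 2014 data above
spend_by_format = {
    "new print": 150,
    "used print": 108,
    "rented print": 38,
    "etextbooks (unlimited use)": 15,
    "etextbooks (limited use)": 9,
}

fall_total = sum(spend_by_format.values())
annual_total = fall_total * 2  # two terms per year, per the post's annualization

print(fall_total, annual_total)  # 320 640
```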

Note, however, that the Fall 2014 data ($640 annual) represents a steep increase from the previous trend as reported by NPR (but based on Student Monitor data). I have asked Student Monitor for commentary on the increase but have not heard back (yet).

NPR Student Monitor

Like NACS, Student Monitor is biased towards traditional face-to-face education and students aged 18-24.

Summary

I would summarize the data as follows:

The shortest answer is that US college students spend an average of roughly $600 per year on textbooks despite rising retail prices.

I would not use College Board as a source on this subject, as they do not collect their own data on textbook pricing or expenditures, and they only use budget estimates.

I would like to thank Rob Reynolds from NextThought for his explanation and advice on the subject.

Update (3/25): See note on NACS above.

Update (3/27): See postscript post for additional information on data sources.

  1. Note that BLS has a category CUSR0000SEEA (Educational Books & Supplies) that has been tracked far longer than the sub-category College Textbooks. We’ll use the textbook sub-category to simplify comparisons.

The post How Much Do College Students Actually Pay For Textbooks? appeared first on e-Literate.

Austin Community College’s ACCelerator: Big bet on emporium approach with no pilots

Sun, 2015-03-22 14:55

By Phil HillMore Posts (305)

While at SXSWedu, I was able to visit Austin Community College’s ACCelerator lab, which got a fair bit of publicity over the past month. While the centerpiece of ACCelerator usage is developmental math, the 600+ workstation facility spread over 32,000 square feet also supports tutoring in a variety of subjects, first year experience, group advising, academic coaching, adult education, continuing education, college readiness assessment preparation, and student skills workshops.

ACCelerator

But it is the developmental math course that has received the most coverage.

Austin Community College welcomed second lady Dr. Jill Biden and Under Secretary of Education Dr. Ted Mitchell on Monday, March 9, to tour the Highland Campus’ ACCelerator and meet with students and faculty of the college’s new developmental math course, MATD 0421. [snip]

“I teach a lot of developmental students,” says Dr. Biden. “The one stumbling block does seem to be math and math anxiety and ‘Can I do it?’. This (course) seems to be so empowering and so positive. Students can see immediate success.”

MATD 0421 is a self-paced, emporium-style course that encompasses all three levels of developmental math. Paul Fain at Inside Higher Ed had an excellent article that included a description of the motivation.

Dismal remedial success rates have been a problem at Austin, which enrolls 60,000 students. So faculty members from the college looked around for alternative approaches to teaching math.

“Really, there’s nothing to lose,” said [Austin CC president] Rhodes.

The Highland Campus, where the ACCelerator lab is located, is built in a former shopping mall. Students at Austin CC can choose courses at any of the 8 campuses or 5 centers. All developmental math at the Highland Campus is run through MATD 0421, so students across the system can choose traditional approaches at other campuses or the emporium approach at Highland.

Austin CC picked this approach after researching several other initiatives (Fain describes Virginia Tech and Montgomery College examples). The IHE article then describes the design:

Austin officials decided to try the emporium method. They paired it with adaptive courseware, which adjusts to individual learners based on their progress and ability to master concepts. The college went with ALEKS, an adaptive software platform from McGraw-Hill Education.

Fain describes the personalization aspect:

The new remedial math course is offered at the ACCelerator. The computer stations are arranged in loose clusters of 25 or so. Faculty members are easy to spot in blue vests. Student coaches and staff wear red ones.

This creates a more personalized form of learning, said Stacey Güney, the ACCelerator’s director. That might seem paradoxical in a computer lab that has a bit of a Matrix feel. But Güney said that instead of a class size of 25 students per instructor, the course features 25 classes of one student.

“In here there is no back of the classroom,” she said.

While the program is fairly new (second term), there are some initial results described by the official site:

In MATD 0421’s inaugural semester:

  • The withdrawal rate was less than half the rate for traditional developmental math courses.
  • 75 percent of the students completed the equivalent of one traditional course.
  • Nearly 45 percent completed the equivalent to a course and one-half.
  • Over 14 percent completed the equivalent to two courses.
  • 13 students completed all the equivalent of three courses.

Go read the full IHE article for a thorough description. I would offer the following observations.

  • Rather than a pilot program, which I have argued plagues higher ed and prevents diffusion of innovations, Austin CC has committed to A) a big program up front (~700 students in the Fall 2014 inaugural semester and ~1,000 students in Spring 2015), while B) offering students the choice of traditional or emporium. To me, this offers the best of both worlds: a big bet that doesn’t get caught in the “purgatory of pilots” while preserving student choice.
  • While the computer lab and software are easy headlines, I hope people don’t miss the heavy staffing that is a central feature of this lab – there are more than 90 faculty and staff working there, teaching the modular courses, roving the aisles to provide help, and staffing help desks. The ACCelerator is NOT an exercise in replacing faculty with computers.
  • During my tour, instructor Christie Allen-Johnson and associate professor Ann P. Vance described their plans to perform a more structured analysis of the results. Expect to see more validated outcomes starting at the end of CY2015.
  • When and if Austin CC proves the value and results of the model, that would be the time to migrate most of the remaining developmental math courses into this emporium model.
  • The one area that concerns me is the lack of structured time for students away from the workstations. Developmental students in community colleges often have not experienced academic success – knowing how to succeed, learning how to learn, believing in their ability to succeed – and often this non-cognitive aspect of math is as important as the actual coursework. Allen-Johnson described the availability of coaching that goes beyond coursework, but that is different than providing structure for coaching and self-regulated learning.

The post Austin Community College’s ACCelerator: Big bet on emporium approach with no pilots appeared first on e-Literate.

Our Policy on Cookies and Tracking

Thu, 2015-03-19 10:00

By Michael FeldsteinMore Posts (1024)

In the wake of the Pearson social media monitoring controversy, edubloggers like Audrey Watters and D’arcy Norman have announced their policies regarding code that can potentially track users on their blogs. This is a good idea, so we are following their example.

We use Google Analytics and WordPress analytics on both e-Literate and e-Literate TV. The main reason we do so is that we believe the information these packages provide helps us create more useful content. Even after a decade of blogging, we are still surprised sometimes by which posts earn your attention and which ones don’t. We look at our analytics results fairly regularly to see what we can learn about writing more content that you find to be worth your time. This is by no means the only or even the main way that we decide what we will write, but we think of it as one of relatively few clues we have to understand which posts and topics will have the most value to you. We do not run ads and have no intention of doing so in the future. In the case of e-Literate TV, where the content is expensive to make, we may also use information regarding the number of viewers of the episodes in the future to demonstrate to sponsors that our content is having an impact. We make no effort to track individuals and, in fact, have always had a policy of letting our readers comment on posts without registering on the site. But Google in particular is likely making more extensive use of the usage data that they gather.

In addition to the two analytics packages mentioned above, we do embed YouTube videos and use social media buttons, which may carry their own tracking code with them from the companies that supply them. Unfortunately, this is just part of the deal with embedding YouTube videos or adding convenient “Tweet this” links. The tracking code (which usually, but not always, means the same thing as “cookies”) on our site is pretty typical for what you will find for any site that provides these sorts of conveniences.

But that doesn’t mean that you have to allow yourself to be tracked if you prefer not to be. There are a number of excellent anti-tracking plugins available for the mainstream browsers, including Ghostery and Disconnect. If you are concerned about being tracked (here or anywhere), then we recommend installing one or more of these plugins, and we also recommend spending a little time to learn how they work and what sorts of tracking code are embedded on the different sites you visit so that you can make informed and fine-grained decisions about what information you do and do not want to share. These tools often let you make service-by-service and site-by-site decisions, but they generally start with the default of protecting your privacy by blocking everything.

To sum up and clarify our privacy policies:

  • We do use Google Analytics and WordPress analytics.
  • We do embed social media tools that in some cases carry their own tracking code.
  • We do not make any effort to track individuals on our sites.
  • We do not use or plan to use analytics for ads or in any way sell the information from our analytics to third parties, including but not limited to ads.
  • We may in the future provide high-level summaries of site traffic and video views to e-Literate TV sponsors.
  • We do support commenting on blog posts without registration.[1]
  • We do provide our full posts in our RSS feed, which excludes most (but not all) tracking code.
  • We do provide CC-BY licensing on our content so that it can be used on other sites, including ones that do not have any tracking code.
  1. Note: We do require an email address from commenters for the sole purpose of providing us with a means of contacting the poster in the event that the person has written something uncivil or marginally inappropriate and we need to discuss the matter with that person privately before deciding what to do about moderation. In the 10-year history of e-Literate, this has happened about three or four times. There are two differences relevant to reader privacy between requiring the email address and requiring registration. First, we allow people to use multiple email addresses or even temporary email addresses if they do not wish that email to be personally identifiable. We only require that the email address be a working address. Second and probably more importantly, without registration, there is no mechanism to link comments to browsing behavior on the site.

The post Our Policy on Cookies and Tracking appeared first on e-Literate.

Back To The Future: Looking at LMS forecasts from 2011 – 2014

Wed, 2015-03-18 18:11

By Phil HillMore Posts (303)

At today’s Learning Analytics and Knowledge 2015 conference (#LAK15), Charles Severance (aka Dr. Chuck) gave the morning keynote organized around the theme of going back in time to see what people (myself and Richard Katz primarily) were forecasting for education. By looking at the reality of 2015, we can see which forecasts were on track and which were not. I like this concept, as it is useful to go back and see what we got right and wrong, so this post is meant to provide some additional context particularly for LMS market. Chuck’s keynote also gives cover for doing so without seeming too self-absorbed.

But enough about me. What do you think about me?

I use the term forecast since I tend to describe patterns and trends and then try to describe the implications. This is different than the Katz video which aimed to make specific predictions as a thought-provoking device.

Pre-2011

I introduced the LMS squid diagram in 2008 as a tool to help people see the LMS market holistically rather than focusing on detailed features. Too many campus evaluations then (and even now) missed the big picture that there were only a handful of vendors and some significant market dynamics at play.

A 2009 presentation, by the way, was the basis for Michael and me connecting for the first time. Bromance.

2011

In early 2011 I wrote a post on Visigoths at the LMS Gates, noting:

I am less inclined to rely on straight-line projections of market data to look ahead, and am more inclined to think the market changes we are seeing are driven by outside forces with potentially nonlinear effects. Rome may have been weakened from within, but when real change happened, the Visigoths made it happen. [snip]

Today, there is a flood of new money into the educational technology market. In addition to the potential acquisition of Blackboard, Instructure just raised $8M in venture funding and is vying for the role of Alaric in its market positioning, Pearson has been heavily investing in Learning Studio (eCollege for you old-timers), and Moodlerooms raised $7+M in venture funding. Publishing companies, ERP vendors, private equity, venture funding – these are major disruptive forces. And there are still significant moves being made by technology companies such as Google.

In August I started blogging at e-Literate with this post on Emerging Trends in LMS / Ed Tech Market. The trends I described (summary here, see post for full description):

From my viewpoint in 2011, the market has essentially moved beyond Blackboard as the dominant player driving most of the market dynamics.

  • The market is more competitive, with more options, than it has been for years.
  • Related to the above, there is a trend towards software as a service (SaaS) models for new LMS solutions.
  • Also related to the above, the market is demanding and getting real Web 2.0 and Web 3.0 advances in LMS user interfaces and functionality. We are starting to see some real improvements in usability in the LMS market.
  • The lines are blurring between content delivery systems (e.g. Cengage MindTap, Pearson MyLabs, etc) and LMS.
  • Along those same lines, it is also interesting in what is not being seen as a strategic blurring of lines – between LMS and student information systems.
  • Analytics and data reporting are not just aspirational goals for LMS deployments, but real requirements driven by real deadlines.

Looking back at the 2011 posts, I would note the following:

  • I think all of the basic trends have proven to be accurate, although I over-stated the analytics importance of “real requirements driven by real deadlines”. Analytics are important and some schools have real requirements, but for most schools analytics is not far beyond “aspirational goals”.
  • Chuck over-interpreted the “it’s all about MyLabs”. The real point is the blurring of lines between previously distinct categories of delivery platforms and digital content. I would argue that the courseware movement as well as most CBE platforms shows this impact in 2015. MyLabs was just an example in the graphic.
  • My main message about outside forces was that the internal players (Blackboard, Desire2Learn, Moodle, etc) were not going to be the source of change, rather “new competitors and new dynamics” would force change. Through the graphic, I over-emphasized the ERP and big tech players (Oracle, Google, Pearson & eCollege, etc) while I under-emphasized Instructure, which has proven to be the biggest source of change (although driven by VC funding).
  • I still like the Rome / Visigoths / Alaric metaphor.
2012

In early 2012 I had a post Farewell to the Enterprise LMS, Greetings to the Learning Platform that formed the basis of the forecasts Chuck commented on in the LAK15 keynote.

In my opinion, when we look back on market changes, 2011 will stand out as the year when the LMS market passed the point of no return and changed forever. What we are now seeing are some real signs of what the future market will look like, and the actual definition of the market is changing. We are going from an enterprise LMS market to a learning platform market.

In a second post I defined the characteristics of a Learning Platform (or what I meant by the term):

  1. Learning Platforms are next-generation technology compared to legacy LMS solutions arising in the late 1990’s / early 2000’s. While many features are shared between legacy LMS and learning platforms, the core designs are not constrained by the course-centric, walled-garden approach pioneered by earlier generations.
  2. Learning Platforms tend to be SaaS (software as a service) offerings, based in a public or private cloud on multi-tenant designs. Rather than being viewed as an enterprise application to be set up as a customized instance for each institution, there is a shared platform that supports multiple customers, leveraging a shared technology stack, database, and application web services.
  3. Learning Platforms are intended to support and interoperate with multiple learning and social applications, and not just as extensions to the enterprise system, but as a core design consideration.
  4. Learning Platforms are designed around the learner, giving a sense of identity that is maintained throughout the learning lifecycle. Learners are not just pre-defined roles with access levels within each course, but central actors in the system design.
  5. Learning Platforms therefore are social in nature, supporting connections between learners and customization of content based on learner needs.
  6. Learning Platforms include built-in analytics based on the amalgamation of learner data across courses, across institutions, and even beyond institutions.
  7. Learning Platforms allow for the discovery of instructional content, user-generated content, and of other learners.

Going back to the Farewell post, the forecast was:

Another trend that is becoming apparent is that many of the new offerings are not attempting to fully replace the legacy LMS, at least all at once. Rather than competing with all of the possible features that are typical in enterprise LMS solutions, the new platforms appear to target specific institutional problems and offer only the features needed. Perhaps inspired by Apple’s success in offering elegant solutions at the expense of offering all the features, or perhaps inspired by Clayton Christensen’s disruptive innovation model, the new learning platform providers are perfectly willing to say ‘no – we just don’t offer this feature or that feature’.

Looking back at the 2012 posts, I would note the following:

  • I still see the move from enterprise LMS to learning platform, but it is happening slower than I might have thought and more unevenly. The attributes of SaaS and fewer features has happened (witness Canvas in particular), and the interoperability capabilities are occurring (with special thanks to Chuck and his work with IMS developing LTI). However, the adoption and true usage of multiple learning and social applications connected through the platform is quite slow.
  • The attributes of learner-centric design and built-in analytics can be seen in many of the CBE platforms, but not really in the general LMS market itself.
2013 / 2014

In 2013 and 2014 I updated the LMS squid graphic.

 

  • Chuck was right to point out the revision that I no longer included the outside forces of ERP & big tech. The key point of 2011 forecasts was outside forces making changes, but by 2013 it was clear that ERP & big tech were not part of this change.
  • There is also a big addition of homegrown solutions, or alternative learning platforms, that is worth noting. The entrance of so many new CBE platforms designed from the ground up for specific purposes is an example of this trend.
Overall Notes

Thanks to Chuck, this has been informative (to me, at least) to go back and review forecasts and see what I got right and what I got wrong. Chuck’s general point on my forecasts seems to be that I am over-emphasizing the emergence of learning platforms, at least as a distinct category from enterprise LMS, and that we’re still seeing an LMS market, although with changed internals (fewer features, more interoperability). I don’t disagree with this point (if I am summarizing accurately). However, if you read the actual forecasts above, I don’t think Chuck and I are too far apart. I may be more optimistic than he is and need to clarify my terminology somewhat, but we’re in the same ball park.

Now let’s turn the tables. My main critique with Dr. Chuck’s keynote is that he just didn’t commit on the song. We know he is willing to boldly sing, after all (skip ahead to 1:29).

Click here to view the embedded video.

Update: Clarified language on LTI spec

The post Back To The Future: Looking at LMS forecasts from 2011 – 2014 appeared first on e-Literate.

Blackboard Brain Drain: One third of executive team leaves in past 3 months

Tue, 2015-03-17 10:02

By Phil HillMore Posts (302)

In August 2013 Michael described Ray Henderson’s departure from an operational role at Blackboard. As of the end of 2014, Ray is no longer on the board of directors at Blackboard either. He is focusing on his board activity (including In The Telling, our partner for e-Literate TV) and helping with other ed tech companies. While Ray’s departure from the board did not come as a surprise to me, I have been noting the surprising number of other high-level departures from Blackboard recently.

As of December 24, 2014, Blackboard listed 12 company executives in their About > Leadership page. Of those 12 people, 4 have left the company since early January. Below is the list of the leadership team at that time along with notes on changes:

  • Jay Bhatt, CEO
  • Maurice Heiblum, SVP Higher Education, Corporate And Government Markets (DEPARTED February, new job unlisted)
  • Mark Belles, SVP K-12 (DEPARTED March, now President & COO at Teaching Strategies, LLC)
  • David Marr, SVP Transact
  • Matthew Small, SVP & Managing Director, International
  • Gary Lang, SVP Product Development, Support And Cloud Services (DEPARTED January, now VP B2B Technology, Amazon Supply)
  • Katie Blot, SVP Educational Services (now SVP Corporate Strategy & Business Development)
  • Mark Strassman, SVP Industry and Product Management
  • Bill Davis, CFO
  • Michael Bisignano, SVP General Counsel, Secretary (DEPARTED February, now EVP & General Counsel at CA Technologies)
  • Denise Haselhorst, SVP Human Resources
  • Tracey Stout, SVP Marketing

Beyond the leadership team, there are three others worth highlighting.

  • Brad Koch, VP Product Management (DEPARTED January, now at Instructure)
  • David Ashman, VP Chief Architect, Cloud Architecture (DEPARTED February, now CTO at Teaching Strategies, LLC)
  • Mark Drechsler, Senior Director, Consulting (APAC) (DEPARTED March, now at Flinders University)

I mentioned Brad’s departure already and the significance in this post. Mark is significant in terms of his influence in the Australian market, as he came aboard from the acquisition of NetSpot.

David is significant as he was Chief Architect and had the primary vision for Blackboard’s impending move into the cloud. Michael described this move in his post last July.

Phil and I are still trying to nail down some of the details on this one, particularly since the term “cloud” is used particularly loosely in ed tech. For example, we don’t consider D2L’s virtualization to be a cloud implementation. But from what we can tell so far, it looks like a true elastic, single-instance multi-tenant implementation on top of Amazon Web Services. It’s kind of incredible. And by “kind of incredible,” I mean I have a hard time believing it. Re-engineering a legacy platform to a cloud architecture takes some serious technical mojo, not to mention a lot of pain. If it is true, then the Blackboard technical team has to have been working on this for a long time, laying the groundwork long before Jay and his team arrived. But who cares? If they are able to deliver a true cloud solution while still maintaining managed hosting and self-hosted options, that will be a major technical accomplishment and a significant differentiator.

This seems like the real deal as far as we can tell, but it definitely merits some more investigation and validation. We’ll let you know more as we learn it.

This rollout of new cloud architecture has taken a while, and I believe it is hitting select customers this year. Will David’s departure add risk to this move? I talked to David a few weeks ago, and he said that he was leaving for a great opportunity at Teaching Strategies, and that while he was perhaps the most visible face of the cloud at Blackboard, others behind the scenes are keeping the vision. He does not see added risk. While I appreciate the direct answers David gave me to my questions, I still cannot see how the departure of Gary Lang and David Ashman will not add risk.

So why are so many people leaving? From initial research and questions, the general answer seems to be ‘great opportunity for me professionally or personally, loved working at Blackboard, time to move on’. There is no smoking gun that I can find, and most departures are going to very good jobs.

Jay Bhatt, Blackboard’s CEO, provided the following statement based on my questions.

As part of the natural evolution of business, there have been some transitions that have taken place. A handful of executives have moved onto new roles, motivated by both personal and professional reasons. With these transitions, we have had the opportunity to add some great new executive talent to our company as well. Individuals who bring the experience and expertise we need to truly capture the growth opportunity we have in front of us. This includes Mark Gruzin, our new NAHE/ProEd GTM lead, Peter George, our new head of product development and a new general counsel who will be starting later this month. The amazing feedback we continue to receive from customers and others in the industry reinforces how far we’ve come and that we are on the right path. As Blackboard continues to evolve, our leaders remain dedicated to moving the company forward into the next stage of our transformation.

While Jay’s statement matches what I have heard, I would note the following:

  • The percentage of leadership changes within a 3 month period rises above the level of “natural evolution of business”. Correlation does not imply causation, but neither does it imply a coincidence.
  • The people leaving have a long history in educational technology (Gary Lang being the exception), but I have not seen the same in the reverse direction. Mark Gruzin comes from a background in worldwide sales and the federal software group at IBM. Peter George comes from a background in Identity & Access Management as well as Workforce Management companies. They both seem to be heavy hitters, but not in ed tech. Likewise, Jay himself along with Mark Strassman and Gary Lang had no ed tech experience when they joined Blackboard. This is not necessarily a mistake, as fresh ideas and approaches were needed, but it is worth noting the stark differences between the people leaving and the people coming in.
  • These changes come in the middle of Blackboard making huge bets on a completely new user experience and a move into the cloud. These changes were announced last year, but they have not been completed. This is the most important area to watch – whether Blackboard completes these changes and successfully rolls them out to the market.

We’ll keep watching and update where appropriate.

The post Blackboard Brain Drain: One third of executive team leaves in past 3 months appeared first on e-Literate.

Rutgers and ProctorTrack Fiasco: Impact of listening to regulations but not to students

Mon, 2015-03-16 13:07

By Phil HillMore Posts (302)

If you want to observe the unfolding impact of an institution ignoring the impact of policy decisions on students, watch the situation at Rutgers University. If you want to see the power of a single student saying “enough is enough”, go thank Betsy Chao and sign her petition. The current situation is that students are protesting Rutgers’ usage of ProctorTrack software in online courses – software that costs students $32 in additional fees, accesses their personal webcams, automatically tracks face and knuckle video, and monitors browser activity. Students seem to be outraged at the lack of concern over student privacy and the additional fees.

Prior to 2015, Rutgers already provided services for online courses to comply with federal regulations to monitor student identity. The rationale cited [emphasis added]:

The 2008 Higher Education Opportunity Act (HEOA) requires institutions with distance education programs to have security mechanisms in place that ensure that the student enrolled in a particular course is in fact the same individual who also participates in course activities, is graded for the course, and receives the academic credit. According to the Department of Education, accrediting agencies must require distance education providers to authenticate students’ identities through secure (Learning Management System) log-ins and passwords, proctored exams, as well as “new identification technologies and practices as they become widely accepted.”

This academic term, Rutgers added a new option – ProctorTrack:

Proctortrack is cost-effective and scalable for any institution size. Through proprietary facial recognition algorithms, the platform automates proctoring by monitoring student behavior and action for test policy compliance. Proctortrack can detect when students leave their space, search online for additional resources, look at hard notes, consult with someone, or are replaced during a test.

This occurred at the same time as the parent company Verificient received a patent for their approach, in January 2015.

A missing piece not covered in the media thus far is that Rutgers leaves the choice of student identity verification approach up to the individual faculty member or academic program [emphasis added].

In face-to-face courses, all students’ identities are confirmed by photo ID prior to sitting for each exam and their activities are monitored throughout the exam period. To meet accreditation requirements for online courses, this process must also take place. Rutgers makes available electronic proctoring services for online students across the nation and can assist with on-site proctoring solutions. Student privacy during a proctored exam at a distance is maintained through direct communication and the use of a secure testing service. Students must be informed on the first day of class of any additional costs they may incur for exam proctoring and student authentication solutions.

The method of student authentication used in a course is the choice of the individual instructor and the academic unit offering the course. In addition to technology solutions such as Examity and ProctorTrack, student authentication can also be achieved through traditional on-site exam proctoring solutions. If you have any questions, talk to your course instructor.

As the use of ProctorTrack rolled out this term, at least one student – senior Betsy Chao – was disturbed and on February 5th created a petition on change.org.

However, I recently received emails from both online courses, notifying me of a required “Proctortrack Onboarding” assessment to set up Proctortrack software. Upon reading the instructions, I was bewildered to discover that you had to pay an additional $32 for the software on top of the $100 convenience fee already required of online courses. And I’m told it’s $32 per online class. $32 isn’t exactly a large sum, but it’s certainly not pocket change to me. Especially if I’m taking more than one online class. I’m sure there are many other college students who echo this sentiment. Not only that, but nowhere in either of the syllabi was there any inkling of the use of Proctortrack or the $32 charge. [snip]

Not only that, but on an even more serious note, I certainly thought that the delicate issue of privacy would be more gracefully handled, especially within a school where the use of webcams was directly involved in a student’s death. As a result, I thought Rutgers would be highly sensitive to the issue of privacy.

If accurate, this clearly violates the notification policy of Rutgers highlighted above. Betsy goes on to describe the alarming implications relating to student privacy.

On February 7th, New Brunswick Today picked up on the story.

Seven years ago, Congress passed the Higher Education Opportunity Act of 2008, authorizing the U.S Department of Education to outline numerous recommendations on how institutions should administer online classes.

The law recommended that a systemic approach be developed to ensure that the student taking exams and submitting projects is the same as the student who receives the final grade, and that institutions of higher education employ “secure logins and passwords, or proctored exams to verify a student’s identity.”

Other recommendations include the use of an identity verification process, and the monitoring by institutions of the evolution of identity verification technology.

Under these recommendations by the U.S Department of Education, Rutgers would technically be within its right to implement the use of ProctorTrack, or an alternative form of identity verification technology.

However, the recommendations are by no means requirements, and an institution can decide whether or not to take action.

The student newspaper at Rutgers, The Daily Targum, ran stories on February 9th and February 12th, both highly critical of the new software usage. All of this attention thanks to one student who refused to quietly comply.

The real problem in my opinion can be found in this statement from the New Brunswick Today article.

“The university has put significant effort into protecting the privacy of online students,” said the Rutgers spokesperson. “The 2008 Act requires that verification methods not interfere with student privacy and Rutgers takes this issue very seriously.”

The Rutgers Center for Online and Hybrid Learning and Instructional Technologies (COHLIT) would oversee the implementation and compliance with the usage of ProctorTrack, according to Rutgers spokesperson E.J. Miranda, who insisted it is not mandatory.

“ProctorTrack is one method, but COHLIT offers other options to students, faculty and departments for compliance with the federal requirements, such as Examity and ExamGuard,” said Miranda.

Rutgers has also put up a FAQ page on the subject.

The problem is that Rutgers is paying attention to federal regulations and assuming their solutions are just fine, yet:

  • Rutgers staff clearly spent little or no time asking students for their input on such an important and highly charged subject;
  • Rutgers policy leaves the choice purely up to faculty or academic programs, meaning that there was no coordinated decision-making and communication to students;
  • Now that students are complaining, the Rutgers spokesperson has gotten defensive, implying ‘there’s nothing to see here’ and not taking the student concerns seriously;
  • At no point that I can find has Rutgers acknowledged the problem of a lack of notification and new charges for students, nor have they acknowledged that students are saying that this solution goes too far.

That is why this is a fiasco. Student privacy is a big issue, and students should have some input into the policies shaped by institutions. The February 12th student paper put it quite well in conclusion.

Granted, I understand the University’s concern — if Rutgers is implementing online courses, there need to be accountability measures that prevent students from cheating. However, monitoring and recording our computer activity during online courses is not the solution, and failing to properly inform students of ProctorTrack’s payment fee is only a further blight on a rather terrible product. If Rutgers wants to transition to online courses, then the University needs to hold some inkling of respect for student privacy. Otherwise, undergraduates have absolutely no incentive to sign up for online classes.

If the Rutgers administration wants to defuse this situation, they will need to find a way to talk and listen to students on the subject. Pure and simple.

H/T: Thanks to Audrey Watters and to Jonathan Rees for highlighting this situation.

Update: Bumping comment from Russ Poulin into post itself [emphasis added]:

The last paragraph in the federal regulation regarding academic integrity (602.17) reads:

“(2) Makes clear in writing that institutions must use processes that protect student privacy and notify students of any projected additional student charges associated with the verification of student identity at the time of registration or enrollment.”

The privacy issue is always a tricky one when needing to meet the other requirements of this section. But, it does sound like students were not notified of the additional charges at the time of registration.

The post Rutgers and ProctorTrack Fiasco: Impact of listening to regulations but not to students appeared first on e-Literate.

Slides and Follow-up From Faculty Development Workshop at Aurora University

Fri, 2015-03-13 20:41

By Phil HillMore Posts (302)

Today I facilitated a faculty development workshop at Aurora University, sponsored by the Center for Excellence in Teaching and Learning and the IT Department. I always enjoy sessions like this, particularly with the ability to focus our discussions squarely on technology in support of teaching and learning. The session was titled “Emerging Trends in Educational Technology and Implications for Faculty”. Below are very rough notes, slides, and a follow-up.

Apparent Dilemma and Challenge

Building off of previous presentations at ITC Network, there is an apparent dilemma:

  • On the one hand, little has changed: Despite all the hype and investment in ed tech, there is only one new fully-established LMS vendor in the past decade (Canvas), and the top uses of the LMS are for course management (rosters, content sharing, grades). Plus, the MOOC movement fizzled out, at least for replacing higher ed programs or courses.
  • On the other hand, everything has changed: There are examples of redesigned courses such as Habitable Worlds at ASU that are showing dramatic results in the depth of learning by students.

The best lens to understand this dilemma is Everett Rogers’ Diffusion of Innovations and the technology adoption curve and categories. Geoffrey Moore extended this work to call out a chasm between Innovators / Early Adopters on the left side (wanting advanced tech, OK with partial solutions they cobble together, pushing the boundary) and Majority / Laggards on the right side (wanting a full solution – just make it work, make it reliable, make it intuitive). Whereas Moore described Crossing the Chasm for technology companies (moving from one side to the other), in most cases in education we don’t have that choice. The challenge in education is Straddling the Chasm (a concept I’m developing with a couple of consulting clients as well as observations from e-Literate TV case studies):

Straddling the Chasm 3

This view can help explain how advances in pedagogy and learning approaches generally fit on the left side and have not diffused into the mainstream, whereas advances in simple course management generally fit on the right side and have diffused, although we want more than that. You can also view the left side as faculty wanting to try new tools and the right side as faculty just wanting the basics to work.

The market trend away from the walled garden offers education the chance to straddle the chasm.

Implications for Faculty

1) The changes are not fully in place, and it’s going to be a bumpy ride. One example is difficulty in protecting privacy and allowing accessibility in tools not fully centralized. Plus, the LTI 2.0+ and Caliper interoperability standards & frameworks are still a work in progress.

2) While there are new possibilities to use commercial tools, there are also new responsibilities: the left side of the chasm and non-standard apps require faculty and local support (department, division) to pick up the support challenges.

3) There is a challenge in balancing innovation with the student need for consistency across courses, mostly in terms of navigation and course administration.

4) While there are new opportunities for student-faculty and student-student engagement, there are new demands on faculty to change their role and to be available on the students’ schedule.

5) Sometimes, simple is best. It amazes me how often the simple act of moving lecture or content delivery online is trivialized. What is enabled here is the ability for students to work at their own pace and replay certain segments without shame or fear of holding up their peers (or even jumping ahead and accelerating).

Slides

Emerging Trends in Educational Technology and Implications for Faculty from Phil Hill

Follow-Up

One item discussed in the workshop was how to take advantage of this approach in Aurora’s LMS, Moodle. While Moodle has always supported the open approach and has supported LTI standards, I neglected to mention a missing element. Commercial apps such as Twitter, Google+, etc., do not natively follow LTI standards, which are education-specific. The EduAppCenter was created to help with this challenge by creating a library of apps and wrappers around apps that are LTI-compliant.
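To make the wrapper idea concrete: an LTI 1.x launch is just a form POST signed with OAuth 1.0, and the wrapper supplies that signing layer around an otherwise plain web tool. Below is a rough sketch of the signing step using only Python's standard library; the function name and demo values are hypothetical, not taken from EduAppCenter or any vendor's code.

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_lti_launch(url, params, consumer_secret):
    """Compute the OAuth 1.0 HMAC-SHA1 signature for an LTI 1.x launch POST."""
    # Percent-encode and sort all parameters (RFC 5849, section 3.4.1.3).
    encoded = sorted(
        (urllib.parse.quote(k, safe="~"), urllib.parse.quote(str(v), safe="~"))
        for k, v in params.items()
    )
    param_str = "&".join(f"{k}={v}" for k, v in encoded)
    # Signature base string: METHOD & encoded-URL & encoded-parameter-string.
    base_string = "&".join(
        ["POST",
         urllib.parse.quote(url, safe="~"),
         urllib.parse.quote(param_str, safe="~")]
    )
    # Signing key is the consumer secret plus "&" (LTI uses no token secret).
    key = urllib.parse.quote(consumer_secret, safe="~") + "&"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# A minimal launch payload; real launches also carry user, course, and role fields.
launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-101-discussion",
    "oauth_consumer_key": "demo-key",
    "oauth_nonce": "abc123",
    "oauth_timestamp": "1425000000",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
signature = sign_lti_launch("https://tool.example.com/launch", launch, "demo-secret")
```

The tool provider recomputes the same signature from the POSTed parameters and its copy of the shared secret; a match verifies the launch came from a trusted LMS.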

The post Slides and Follow-up From Faculty Development Workshop at Aurora University appeared first on e-Literate.

Brian Whitmer No Longer in Operational Role at Instructure

Wed, 2015-03-11 09:17

By Phil HillMore Posts (302)

Just over a year and a half ago, Devlin Daley left Instructure, the company he co-founded. It turns out that both founders have now made changes: Brian Whitmer, the other co-founder, left his operational role in 2014, though he remains on the board of directors. For some context from the 2013 post:

Instructure was founded in 2008 by Brian Whitmer and Devlin Daley. At the time Brian and Devlin were graduate students at BYU who had just taken a class taught by Josh Coates, where their assignment was to come up with a product and business model to address a specific challenge. Brian and Devlin chose the LMS market based on the poor designs and older architectures dominating the market. This design led to the founding of Instructure, with Josh eventually providing seed funding and becoming CEO by 2010.

Until last year, Brian had a lead role in Instructure’s usability design and in its open architecture and support for LTI standards.

The reason for Brian’s departure (based on both Brian’s comments and Instructure’s statements) is his family. Brian’s daughter has Rett Syndrome:

Rett syndrome is a rare non-inherited genetic postnatal neurological disorder that occurs almost exclusively in girls and leads to severe impairments, affecting nearly every aspect of the child’s life: their ability to speak, walk, eat, and even breathe easily.

As Instructure grew, Devlin became the road show guy while Brian stayed mostly at home, largely due to family. Brian’s personal experiences have led him to create a new company: CoughDrop.

Some people are hard to hear — through no fault of their own. Disabilities like autism, cerebral palsy, Down syndrome, Angelman syndrome and Rett syndrome make it harder for many individuals to communicate on their own. Many people use Augmentative and Alternative Communication (AAC) tools in order to help make their voices heard.

We work to help bring out the voices of those with complex communication needs through good tech that actually makes things easier and supports everyone in helping the individual succeed.

This work sounds a lot like early Instructure, as Brian related to me this week.

Augmentative Communication is a lot like LMS space was, in need of a reminder of how things can be better.

By the middle of 2014, Brian left all operational duties although he remains on the board (and he plans to remain on the board and acting as an adviser).

How will this affect Instructure? I would look at Brian’s key roles in usability and open platform to see if Instructure keeps up his vision. From my view the usability is just baked into the company’s DNA[1] and will likely not suffer. The question is more on the open side. Brian led the initiative for the App Center as I described in 2013:

The key idea is that the platform is built to easily add and support multiple applications. The apps themselves will come from EduAppCenter, a website that launched this past week. There are already more than 100 apps available, with the apps built on top of the Learning Tools Interoperability (LTI) specification from IMS global learning consortium. There are educational apps available (e.g. Khan Academy, CourseSmart, Piazza, the big publishers, Merlot) as well as general-purpose tools (e.g. YouTube, Dropbox, WordPress, Wikipedia).

The apps themselves are wrappers that pre-integrate and give structured access to each of these tools. Since LTI is the most far-reaching ed tech specification, most of the apps should work on other LMS systems. The concept is that other LMS vendors will also sign on to the edu-apps site, truly making them interoperable. Whether that happens in reality remains to be seen.

What the App Center will bring once it is released is the simple ability for Canvas end-users to add the apps themselves. If a faculty adds an app, it will be available for their courses, independent of whether any other faculty use that set up. The same applies for students who might, for example, prefer to use Dropbox to organize and share files rather than native LMS capabilities.

The actual adoption by faculty and institutions of this capability takes far longer than people writing about it (myself included) would desire. It takes time and persistence to keep up the faith. The biggest risk that Instructure faces by losing Brian’s operational role is whether they will keep this vision and maintain their support for open standards and third-party apps – opening up the walled garden, in other words.

Melissa Loble, Senior Director of Partners & Programs at Instructure[2], will play a key role in keeping this open vision alive. I have not heard anything indicating that Instructure is changing, but this is a risk from losing a founder who internally ‘owned’ this vision.

I plan to share some other HR news from the ed tech market in future posts, but for now I wish Brian the best with his new venture – he is one of the truly good guys in ed tech.

Update: I should have given credit to Audrey Watters, who prompted me to get a clear answer on this subject.

  1. Much to Brian’s credit
  2. Formerly Associate Dean of Distance Ed at UC Irvine and key player in Walking Dead MOOC

The post Brian Whitmer No Longer in Operational Role at Instructure appeared first on e-Literate.

Dana Center and New Mathways Project: Taking curriculum innovations to scale

Tue, 2015-03-10 15:01

By Phil HillMore Posts (301)

Last week the University of Texas’ Dana Center announced a new initiative to digitize their print-based math curriculum and expand to all 50 community colleges in Texas. The New Mathways Project is ‘built around three mathematics pathways and a supporting student success course’, and they have already developed curriculum in print:

Tinkering with the traditional sequence of math courses has long been a controversial idea in academic circles, with proponents of algebra saying it teaches valuable reasoning skills. But many two-year college students are adults seeking a credential that will improve their job prospects. “The idea that they should be broadly prepared isn’t as compelling as organizing programs that help them get a first [better-paying] job, with an eye on their second and third,” says Uri Treisman, executive director of the Charles A. Dana Center at UT Austin, which spearheads the New Mathways Project. [snip]

Treisman’s team has worked with community-college faculty to create three alternatives to the traditional math sequence. The first two pathways, which are meant for humanities majors, lead to a college-level class in statistics or quantitative reasoning. The third, which is still in development, will be meant for science, technology, engineering, and math majors, and will focus more on algebra. All three pathways are meant for students who would typically place into elementary algebra, just one level below intermediate algebra.

When starting, the original problem was viewed as ‘fixing developmental math’. As they got into the design, the team restated the problem to be solved as ‘developing coherent pathways through gateway courses into modern degrees of study that lead to economic mobility’. The Dana Center worked with the Texas Association of Community Colleges to develop the curriculum, which is focused on active learning and group work that can be tied to the real world.

The Dana Center approach is based on four principles:

  • Courses students take in college math should be connected to their field of study.
  • The curriculum should be accelerated or compressed to allow students to move through developmental courses in one year.
  • Courses should align more closely with student support, with sophisticated learning support connected to campus support structures.
  • Materials should be connected to a context-sensitive improvement strategy.

What they have found is that there are multiple programs nationwide working roughly along the same principles, including the California improvement project, the accelerated learning project at Baltimore City College, and work in Tennessee at Austin Peay State University. In their view, the fact that independent bodies are coming to similar conclusions adds validity to the overall concept.

One interesting aspect of the project is that it is targeted at an entire state’s community college system – this is not a pilot approach. After winning a Request for Proposal selection, Pearson[1] will integrate the active-learning content into a customized mix of MyMathLab, Learning Catalytics, StatCrunch and CourseConnect tools. Given the Dana Center’s small size, one differentiator for Pearson was its size and ability to help a program move to scale.

Another interesting aspect is the partnership approach with TACC. As shared on the web site:

  • A commitment to reform: The TACC colleges have agreed to provide seed money for the project over 10 years, demonstrating a long-term commitment to the project.
  • Input from the field: TACC member institutions will serve as codevelopers, working with the Dana Center to develop the NMP course materials, tools, and services. They will also serve as implementation sites. This collaboration with practitioners in the field is critical to building a program informed by the people who will actually use it.
  • Alignment of state and institutional policies: Through its role as an advocate for community colleges, TACC can connect state and local leaders to develop policies to support the NMP goal of accelerated progress to and through coursework to attain a degree.

MDRC, the same group analyzing CUNY’s ASAP program, will provide independent reporting of the results. Implementation data should be available by the end of the year, with randomized controlled studies to be released in 2016.

To me, this is a very interesting initiative to watch. Given MDRC’s history of thorough documentation, we should be able to learn plenty of lessons from the state-wide deployment.

  1. Disclosure: Pearson is a client of MindWires Consulting.

The post Dana Center and New Mathways Project: Taking curriculum innovations to scale appeared first on e-Literate.

Blueprint for a Post-LMS, Part 5

Sun, 2015-03-08 17:38

By Michael FeldsteinMore Posts (1021)

In parts 1, 2, 3, and 4 of this series, I laid out a model for a learning platform that is designed to support discussion-centric courses. I emphasized how learning design and platform design have to co-evolve, which means, in part, that a new platform isn’t going to change much if it is not accompanied by pedagogy that fits well with the strengths and limitations of the platform. I also argued that we won’t see widespread changes in pedagogy until we can change faculty relationships with pedagogy (and course ownership), and I proposed a combination of platform, course design, and professional development that might begin to chip away at that problem. All of these ideas are based heavily on lessons learned from social software  and from cMOOCs.

In this final post in the series, I’m going to give a few examples of how this model could be extended to other assessment types and related pedagogical approaches, and then I’ll finish up by talking about what it would take to make the peer grading system described in part 2 be (potentially) accepted by students as at least a component of a grading system in a for-credit class.

Competency-Based Education

I started out the series talking about Habitable Worlds, a course out of ASU that I’ve written about before and that we feature in the forthcoming e-Literate TV series on personalized learning. It’s an interesting hybrid design. It has strong elements of competency-based education (CBE) and mastery learning, but the core of it is problem-based learning (PBL). The competency elements are really just building blocks that students need in the service of solving the big problem of the course. Here’s course co-designer and teacher Ariel Anbar talking about the motivation behind the course:

Click here to view the embedded video.

It’s clear that the students are focused on the overarching problem rather than the competencies:

Click here to view the embedded video.

And, as I pointed out in the first post in the series, they end up using the discussion board for the course very much like professionals might use a work-related online community of practice to help them work through their problems when they get stuck:

Click here to view the embedded video.

This is exactly the kind of behavior that we want to see and that the analytics I described in part 3 are designed to measure. You could attach a grade to the students’ online discussion behaviors, but it’s really superfluous. Students get their grade from solving the problem of the course. That said, it would be helpful to the students if productive behaviors were highlighted by the system in order to make them easier to learn. And by “learn,” I don’t mean “here are the 11 discussion competencies that you need to display.” I mean, rather, that there are different patterns of productive behavior in a high-functioning group. It would be good for students to see not only the atomic behaviors but different patterns, and even how different patterns complement each other within a group. Furthermore, I could imagine that some employers might be interested in knowing the collaboration style that a potential employee would bring to the mix. This would be a good fit for badges.

Notice that, in this model, badges, competencies, and course grades serve distinct purposes. They are not interchangeable. Competencies and badges are closer to each other than either is to a grade. They both indicate that the student has mastered some skill or knowledge that is necessary to the central problem, but they are different from each other in ways that I haven’t entirely teased out in my own head yet. And they are not sufficient for a good course grade. To get that, the student must integrate and apply them toward generating a novel solution to a complex problem.

The one aspect of Habitable Worlds that might not fit with the model I’ve outlined in this series is the degree to which it has a mandatory sequence. I don’t know the course well enough to have a clear sense, but I suspect that the lessons are pretty tightly scripted, due in part to the fact that the overarching structure of the course is based on an equation. You can’t really drop out one of the variables or change the order willy-nilly in an equation. There’s nothing wrong with that in and of itself, but in order to take full advantage of the system I’ve proposed here, the course design must have a certain amount of play in it for faculty teaching their individual classes to contribute additions and modifications back. It’s possible to use the discussion analytics elements without the social learning design elements, but then you don’t get the potential the system offers for faculty buy-in “lift.”

Adding Assignment Types

I’ve written this entire series talking about “discussion-based courses” as if that were a thing, but it’s vastly more common to have discussion and writing courses. One interesting consequence of the work that we did abstracting out the Discourse trust levels is that we created a basic (and somewhat unconventional) generalized peer review system in the process. As long as conversation is the metric, we can measure the conversational aspects generated by any student-created artifact. For example, we could create a facility in OAE for students to claim the RSS feeds from their blogs. Remember, any integration represents a potential opportunity to make additional inferences. Once a post is syndicated into the system and associated with the student, it can generate a Discourse thread just like any other document, and that discussion can be included in the student’s conversational analytics. With a little more work, you could have students apply direct ratings such as “likes” to the documents themselves. Making the assessment work for these different types isn’t quite as straightforward as I’m making it sound, either from a user experience design perspective or from a technology perspective. But the foundation is there to build on.
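The syndication step itself is mechanically simple. As a minimal sketch (the feed, field names, and function are all hypothetical, not from OAE or Discourse), claiming a blog feed and turning each post into a thread seed might look like this:

```python
import xml.etree.ElementTree as ET

# A toy RSS 2.0 feed standing in for a student's claimed blog feed.
RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Student Blog</title>
  <item><title>Week 1 reflection</title><link>https://blog.example.com/p/1</link></item>
  <item><title>Peer feedback notes</title><link>https://blog.example.com/p/2</link></item>
</channel></rss>"""

def threads_from_feed(rss_xml, student_id):
    """Map each syndicated blog post to a seed record for a discussion thread."""
    channel = ET.fromstring(rss_xml).find("channel")
    return [
        {
            "author": student_id,       # association enables later inferences
            "title": item.findtext("title"),
            "source_url": item.findtext("link"),
        }
        for item in channel.findall("item")
    ]

seeds = threads_from_feed(RSS, "student-42")
# Each seed could then be posted to a Discourse-style API to open a thread,
# and the resulting conversation measured like any other.
```

The interesting part is not the parsing but the association: once a post is tied to a student identity, every reply it generates becomes analyzable conversation.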

One of the commenters on part 1 of the series provided another interesting use case:

I’m the product manager for Wiki Education Foundation, a nonprofit that helps professors run Wikipedia assignments, in which the students write Wikipedia articles in place of traditional term papers. We’re building a system for managing these assignments, from building a week-by-week assignment plan that follows best practices, to keeping track of student activity on Wikipedia, to pulling in view data for the articles students work on, to finding automated ways of helping students work through or avoid the typical stumbling blocks for new Wikipedia editors.

Wikipedia is its own rich medium for conversation and interaction. I could imagine taking that abstracted peer review system and just hooking it up directly to student activity within Wikipedia itself. Once we start down this path, we really need to start talking about IMS Caliper and federated analytics. This has been a real bottom-up analysis, but we quickly reach the point where we want to start abstracting out the particular systems or even system types, and start looking at a general architecture for sharing learning data (safely). I’m not going to elaborate on it here—even I have to stop at some point—but again, if you made it this far, you might find it useful to go back and reread my original post on the IMS Caliper draft standard and the comments I made on its federated nature in my most recent walled garden post. Much of what I have proposed here from an architectural perspective is designed specifically with a Caliper implementation in mind.

Formal Grading

I suppose my favorite model so far for incorporating the discussion trust system into a graded, for-credit class is the model I described above, where the analytics act as more of a coach to help students learn productive discussion behavior, while the class grade actually comes from their solution to the central problem, project, or riddle of the course. But if we wanted to integrate the trust analytics as part of the formal grading system, we’d have to get over the “Wikipedia objection,” meaning the belief that vetting by a single expert somehow produces accurate results more reliably than crowdsourcing does. Some students will want grades from their teachers and will tend to think that the trust levels are bogus as a grade. (Some teachers will agree.) To address their concerns, we need three things. First, we need objectivity, by which I mean that the scoring criteria themselves are applied the same way to everyone. “Objectivity” is often about as real in student evaluation as it is in journalism (which is to say, it isn’t), but people do want some sense of fairness, which is probably a better goal. Clear ratings criteria applied to everyone equally give some sense of fairness. Second, the trust scores themselves must be transparent, by which I mean that students should be able to see how they earned their trust scores. They should also be able to see various paths to improving their scores. And finally, there should be auditability, by which I mean that, in the event that a student is given a score by her peers that her teacher genuinely disagrees with (e.g., a group ganging up to give one student thumbs-downs, or a lot of conversation being generated around something that is essentially not helpful to the problem-solving effort), there is an ability for the faculty member to override that score.
This last piece can be a rabbit hole, both in terms of user interface design and in terms of eroding the very sense of trust the network is supposed to build, but it is probably necessary to get buy-in. The best thing to do is to pilot the trust system (and the course design that is supposed to inspire ranking-worthy conversation) and refine it to the point where it inspires a high degree of confidence before you start using it for formal grading.
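The three requirements — objective criteria, transparency, and an auditable faculty override — can be sketched in a few lines of code. This is a toy model, not a design for a real gradebook: the names and the simple vote-sum scoring are my own assumptions, chosen only to show how an override and an audit trail fit together.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TrustScore:
    """A peer-derived trust score with a transparent, auditable history."""
    student: str
    peer_ratings: list = field(default_factory=list)  # (rater, +1/-1) votes
    audit_log: list = field(default_factory=list)     # visible to the student
    override: Optional[int] = None

    def add_rating(self, rater, value):
        self.peer_ratings.append((rater, value))
        self.audit_log.append(f"{rater} rated {value:+d}")

    @property
    def score(self):
        # Auditability: a faculty override, when present, takes
        # precedence over the peer-derived sum.
        if self.override is not None:
            return self.override
        return sum(v for _, v in self.peer_ratings)

    def faculty_override(self, instructor, value, reason):
        # Transparency: the override and its rationale are logged so the
        # student can see exactly why the score changed.
        self.override = value
        self.audit_log.append(f"{instructor} set score to {value}: {reason}")

ts = TrustScore("alice")
ts.add_rating("bob", -1)
ts.add_rating("carol", -1)  # suppose these two ganged up on alice
ts.faculty_override("prof_x", 2, "ratings not reflective of contribution")
```

Every score change lands in the audit log, which is exactly the record a student (or an appeals process) would need to see how a grade came to be.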

That’s All

No, really. Even I run out of gas. Eventually.

For a while.

The post Blueprint for a Post-LMS, Part 5 appeared first on e-Literate.