
Michael Feldstein

What We Are Learning About Online Learning...Online

How Much Do College Students Actually Pay For Textbooks?

Wed, 2015-03-25 07:16

By Phil Hill

With all of the talk about the unreasonably high price of college textbooks, the unfulfilled potential of open educational resources (OER), and student difficulty in paying for course materials, it is surprising how little is understood about student textbook expenses. The following two quotes illustrate the most common problem.

Atlantic: “According to a recent College Board report, university students typically spend as much as $1,200 a year total on textbooks.”

US News: “In a survey of more than 2,000 college students in 33 states and 156 different campuses, the U.S. Public Interest Research Group found the average student spends as much as $1,200 each year on textbooks and supplies alone.”

While I am entirely sympathetic to the need and desire to lower textbook and course material prices for students, no one is served well by misleading information, and this information is misleading. Let’s look at the actual sources of data and what that data tells us, focusing on the aggregate measures of changes in average textbook pricing in the US and average student expenditures on textbooks. What the data tells us is that students spend, on average, roughly $600 per year on textbooks, not $1,200.

First, however, let’s address the all-too-common College Board reference.

College Board Reference

The College Board positions itself as the source for the cost of college, and their reports look at tuition (published and net), room & board, books & supplies, and other expenses. This chart is the source of most confusion.

College Board Chart

The light blue “Books and Supplies” data, ranging from $1,225 to $1,328, leads to the often-quoted $1,200 number. But look at the note right below the chart:

Other expense categories are the average amounts allotted in determining total cost of attendance and do not necessarily reflect actual student expenditures.

That’s right – the College Board simply plugs in budget estimates for the books & supplies category; this figure is not part of their actual survey data at all. The College Board does, however, point people to one source that they use as a rough basis for their budgets.

According to the National Association of College Stores, the average price of a new textbook increased from $62 (in 2011 dollars) in 2006-07 to $68 in 2011-12. Students also rely on textbook rentals, used books, and digital resources. (http://www.nacs.org/research/industrystatistics/higheredfactsfigures.aspx)

The College Board is working to help people estimate the total cost of attendance; they are not providing actual source data on textbook costs, nor do they even claim to do so. Reporters and advocates just fail to read the footnotes. The US Public Interest Research Group is one of the primary reasons that journalists use the College Board data incorrectly, but I’ll leave that subject for another post.

The other issue is the combination of books and supplies. Let’s look at actual data and sources specifically for college textbooks.

Average Textbook Price Changes

What about the idea that textbook prices keep increasing?

BLS and Textbook Price Index

The primary source of public data for this question is the Consumer Price Index (CPI) from the Bureau of Labor Statistics (BLS). The CPI sets up a pricing index based on a complex regression model. The index is set to 100 for December 2001, when the BLS started tracking this category. Using this data tool for series CUUR0000SSEA011 (college textbooks), we can see the pricing index from 2002 – 2014[1].

CPI Annual

This data equates to roughly 6% year-over-year increases in the price index of new textbooks, meaning the index roughly doubles every 12 years. But note that this data is not inflation-adjusted, as the CPI is used to help determine the inflation rate. Since US inflation averaged roughly 2% per year over 2002 – 2014, textbook prices are rising at roughly three times the rate of inflation.
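
As a quick sanity check on those rates, the doubling time and the inflation multiple follow directly from the figures quoted above (a back-of-the-envelope sketch; the 6% and 2% values are the rough approximations in the text, not exact BLS numbers):

```python
import math

annual_increase = 0.06  # ~6% year-over-year rise in the college textbook CPI (from the text)
inflation = 0.02        # ~2% average US inflation over 2002-2014 (from the text)

# Doubling time for steady compound growth: solve (1 + r)^t = 2 for t.
doubling_years = math.log(2) / math.log(1 + annual_increase)
print(f"Doubling time: {doubling_years:.1f} years")  # ~11.9 years

# Comparing nominal rates: textbook prices rise at three times inflation.
print(f"Multiple of inflation: {annual_increase / inflation:.1f}x")  # 3.0x
```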

NACS and Average Price Per Textbook

NACS, as its name implies, surveys college bookstores to determine what students spend on various items; the College Board uses NACS as a source. The summary below is the most concise, also showing rising textbook prices on a raw, non-inflation-adjusted basis, although at a lower rate of increase than the CPI.

The following graph for average textbook prices is based on data obtained in the annual financial survey of college stores. The most recent data for “average price” was based on the sale of 3.4 million new books and 1.9 million used books sold in 134 U.S. college stores, obtained in the Independent College Stores Financial Survey 2013-14.

NACS Avg Textbook Price

Other Studies

The Government Accountability Office (GAO) did a study in 2013 looking at textbook pricing, but their data source was the BLS. The following chart, however, is widely cited.

GAO Chart

There are several private studies done by publishers or service companies that give similar results, but by definition these are not public.

Student Expenditure on Books and Supplies

For most discussions of textbook pricing, the more relevant question is what students actually spend on textbooks, or at least on required course materials. Does the data above indicate that students are spending more and more every year? The answer is no, and the reason is that there are far more options today for getting textbooks than there used to be, and one choice – choosing not to acquire the course materials – is growing rapidly. According to Student Monitor, 30% of students choose not to acquire every college textbook.

Prior to the mid-2000s, the rough model for student expenditures was that roughly 65% of students purchased new textbooks and 35% bought used textbooks. Today, there are options for rentals, digital textbooks, and courseware, and the ratios are changing.

The two primary public sources for how much students spend on textbooks are the National Association of College Stores (NACS) and The Student Monitor.

NACS

The NACS also measures average student expenditure for required course materials, which is somewhat broader than textbooks but does not include non-required course supplies.

The latest available data on student spending is from Student Watch: Attitudes & Behaviors toward Course Materials, Fall 2014. Based on survey data, students spent an average of $313 on their required course materials, including purchases and rentals, for that fall term. Students spent an additional $358 on average on “necessary but not required” technology, such as laptops and USB drives, over the same period.

NACS Course Material Expenditures

Note that by the nature of analyzing college bookstores, NACS is biased towards traditional face-to-face education and students aged 18-24.

Update: I should have described the NACS methodology in more depth (or probably need a follow-on post), but their survey is distributed through the bookstore to students. Purchases through Amazon and Chegg, rentals, and decisions not to purchase are all captured in that study. It’s not flawless, but it is not limited to purchases through the bookstore. From the study itself:

Campus bookstores distributed the survey to their students via email. Each campus survey fielded for a two week period in October 2013. A total of 12,195 valid responses were collected. To further strengthen the accuracy and representativeness of the responses collected, the data was weighted based on gender using student enrollment figures published in The Chronicle of Higher Education: 2013/2014 Almanac. The margin of error for this study is +/- 0.89% at the 95% confidence interval.
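
For what it’s worth, that stated margin of error checks out against the standard formula for a survey proportion at 95% confidence (a minimal sketch assuming the conservative p = 0.5; n is the response count from the quote above):

```python
import math

n = 12195  # valid responses reported by the study
z = 1.96   # z-score for a 95% confidence interval
p = 0.5    # most conservative proportion assumption

margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {margin_of_error:.2%}")  # ~0.89%, matching the study
```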

Student Monitor

Student Monitor is a company that provides syndicated and custom market research, and it produces extensive research on college expenses in the spring and fall of each year. It interviews students for its data rather than analyzing college bookstore financials, a different methodology from that of NACS. Based on the Fall 2014 data specifically on textbooks, students spent an average of $320 per term – roughly $640 per year – which is quite close to the $638 per year calculated by NACS. Based on information from page 126:

Average Student Acquisition of Textbooks by Format/Source for Fall 2014

  • New print: 59% of acquirers, $150 total mean
  • Used print: 59% of acquirers, $108 total mean
  • Rented print: 29% of acquirers, $38 total mean
  • eTextbooks (unlimited use): 16% of acquirers, $15 total mean
  • eTextbooks (limited use): NA% of acquirers, $9 total mean
  • eTextbooks (file sharing): 8% of acquirers, $NA total mean
  • Total for Fall 2014: $320 mean
  • Total on Annual Basis: $640 mean
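
Since the per-format dollar figures above are “total means” across all surveyed students rather than averages among acquirers only, they should sum to the overall term mean, and they do. A quick arithmetic check (a sketch: the file-sharing row is omitted because its mean is NA, and the annual figure simply doubles the fall term, assuming two comparable terms per year):

```python
# Fall 2014 total mean spending per format (Student Monitor, p. 126, listed above)
format_means = {
    "new print": 150,
    "used print": 108,
    "rented print": 38,
    "eTextbooks (unlimited use)": 15,
    "eTextbooks (limited use)": 9,
}

term_total = sum(format_means.values())
print(term_total)      # 320 -> the reported $320 fall-term mean
print(term_total * 2)  # 640 -> the reported $640 annualized mean
```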

Note, however, that the Fall 2014 data ($640 annual) represents a steep increase from the previous trend as reported by NPR (but based on Student Monitor data). I have asked Student Monitor for commentary on the increase but have not heard back (yet).

NPR Student Monitor

Like NACS, Student Monitor is biased towards traditional face-to-face education and students aged 18-24.

Summary

I would summarize the data as follows:

The shortest answer is that US college students spend an average of $600 per year on textbooks despite rising retail prices.

I would not use the College Board as a source on this subject, as they do not collect their own data on textbook pricing or expenditures; they only use budget estimates.

I would like to thank Rob Reynolds from NextThought for his explanation and advice on the subject.

Update (3/25): See note on NACS above.

Update (3/27): See postscript post for additional information on data sources.

  1. Note that BLS has a category CUSR0000SEEA (Educational Books & Supplies) that has been tracked far longer than the College Textbooks sub-category. We’ll use the textbooks series to simplify comparisons.


Austin Community College’s ACCelerator: Big bet on emporium approach with no pilots

Sun, 2015-03-22 14:55

By Phil Hill

While at SXSWedu, I was able to visit Austin Community College’s ACCelerator lab, which has received a fair bit of publicity over the past month. While the centerpiece of ACCelerator usage is developmental math, the 600+ workstation facility spread over 32,000 square feet also supports tutoring in a variety of subjects, first-year experience, group advising, academic coaching, adult education, continuing education, college readiness assessment preparation, and student skills workshops.

ACCelerator

But it is the developmental math course that has received the most coverage.

Austin Community College welcomed second lady Dr. Jill Biden and Under Secretary of Education Dr. Ted Mitchell on Monday, March 9, to tour the Highland Campus’ ACCelerator and meet with students and faculty of the college’s new developmental math course, MATD 0421. [snip]

“I teach a lot of developmental students,” says Dr. Biden. “The one stumbling block does seem to be math and math anxiety and ‘Can I do it?’. This (course) seems to be so empowering and so positive. Students can see immediate success.”

MATD 0421 is a self-paced, emporium-style course that encompasses all three levels of developmental math. Paul Fain at Inside Higher Ed had an excellent article that included a description of the motivation.

Dismal remedial success rates have been a problem at Austin, which enrolls 60,000 students. So faculty members from the college looked around for alternative approaches to teaching math.

“Really, there’s nothing to lose,” said [Austin CC president] Rhodes.

The Highland Campus, where the ACCelerator lab is located, is built in a former shopping mall. Students at Austin CC can choose courses at any of the 8 campuses or 5 centers. All developmental math at the Highland Campus is run through MATD 0421, so students across the system can choose traditional approaches at other campuses or the emporium approach at Highland.

Austin CC picked this approach after researching several other initiatives (Fain describes Virginia Tech and Montgomery College examples). The IHE article then describes the design:

Austin officials decided to try the emporium method. They paired it with adaptive courseware, which adjusts to individual learners based on their progress and ability to master concepts. The college went with ALEKS, an adaptive software platform from McGraw-Hill Education.

Fain describes the personalization aspect:

The new remedial math course is offered at the ACCelerator. The computer stations are arranged in loose clusters of 25 or so. Faculty members are easy to spot in blue vests. Student coaches and staff wear red ones.

This creates a more personalized form of learning, said Stacey Güney, the ACCelerator’s director. That might seem paradoxical in a computer lab that has a bit of a Matrix feel. But Güney said that instead of a class size of 25 students per instructor, the course features 25 classes of one student.

“In here there is no back of the classroom,” she said.

While the program is fairly new (second term), there are some initial results described by the official site:

In MATD 0421’s inaugural semester:

  • The withdrawal rate was less than half the rate for traditional developmental math courses.
  • 75 percent of the students completed the equivalent of one traditional course.
  • Nearly 45 percent completed the equivalent to a course and one-half.
  • Over 14 percent completed the equivalent to two courses.
  • 13 students completed the equivalent of all three courses.

Go read the full IHE article for a thorough description. I would offer the following observations.

  • Rather than a pilot program, which I have argued plagues higher ed and prevents diffusion of innovations, Austin CC has committed to A) a big program up front (~700 students in the Fall 2014 inaugural semester and ~1,000 students in Spring 2015), while B) offering students the choice of traditional or emporium. To me, this offers the best of both worlds, allowing a big bet that doesn’t get caught in the “purgatory of pilots” while preserving student choice.
  • While the computer lab and software make for easy headlines, I hope people don’t miss the heavy staffing that is a central feature of this lab – more than 90 faculty and staff work there, teaching the modular courses, roving the aisles to provide help, and staffing help desks. The ACCelerator is NOT an exercise in replacing faculty with computers.
  • During my tour, instructor Christie Allen-Johnson and associate professor Ann P. Vance described their plans to perform a more structured analysis of the results. Expect to see more validated outcomes starting at the end of CY2015.
  • When and if Austin CC proves the value and results of the model, that would be the time to migrate most of the remaining developmental math courses into this emporium model.
  • The one area that concerns me is the lack of structured time for students away from the workstations. Developmental students in community colleges often have not experienced academic success – knowing how to succeed, learning how to learn, believing in their ability to succeed – and often this non-cognitive aspect of math is as important as the actual coursework. Allen-Johnson described the availability of coaching that goes beyond coursework, but that is different from providing structure for coaching and self-regulated learning.


Our Policy on Cookies and Tracking

Thu, 2015-03-19 10:00

By Michael Feldstein

In the wake of the Pearson social media monitoring controversy, edubloggers like Audrey Watters and D’arcy Norman have announced their policies regarding code that can potentially track users on their blogs. This is a good idea, so we are following their example.

We use Google Analytics and WordPress analytics on both e-Literate and e-Literate TV. The main reason we do so is that we believe the information these packages provide helps us create more useful content. Even after a decade of blogging, we are still surprised sometimes by which posts earn your attention and which ones don’t. We look at our analytics results fairly regularly to see what we can learn about writing more content that you find to be worth your time. This is by no means the only or even the main way that we decide what we will write, but we think of it as one of relatively few clues we have for understanding which posts and topics will have the most value to you. We do not run ads and have no intention of doing so in the future. In the case of e-Literate TV, where the content is expensive to make, we may also use information regarding the number of viewers of the episodes in the future to demonstrate to sponsors that our content is having an impact. We make no effort to track individuals and, in fact, have always had a policy of letting our readers comment on posts without registering on the site. But Google in particular is likely making more extensive use of the usage data that they gather.

In addition to the two analytics packages mentioned above, we do embed YouTube videos and use social media buttons, which may carry their own tracking code with them from the companies that supply them. Unfortunately, this is just part of the deal with embedding YouTube videos or adding convenient “Tweet this” links. The tracking code (which usually, but not always, means the same thing as “cookies”) on our site is pretty typical for what you will find for any site that provides these sorts of conveniences.

But that doesn’t mean that you have to allow yourself to be tracked if you prefer not to be. There are a number of excellent anti-tracking plugins available for the mainstream browsers, including Ghostery and Disconnect. If you are concerned about being tracked (here or anywhere), then we recommend installing one or more of these plugins, and we also recommend spending a little time to learn how they work and what sorts of tracking code are embedded on the different sites you visit so that you can make informed and fine-grained decisions about what information you do and do not want to share. These tools often let you make service-by-service and site-by-site decisions, but they generally start with the default of protecting your privacy by blocking everything.

To sum up and clarify our privacy policies:

  • We do use Google Analytics and WordPress analytics.
  • We do embed social media tools that in some cases carry their own tracking code.
  • We do not make any effort to track individuals on our sites.
  • We do not use or plan to use analytics for ads, nor do we in any way sell the information from our analytics to third parties.
  • We may in the future provide high-level summaries of site traffic and video views to e-Literate TV sponsors.
  • We do support commenting on blog posts without registration.[1]
  • We do provide our full posts in our RSS feed, which excludes most (but not all) tracking code.
  • We do provide CC-BY licensing on our content so that it can be used on other sites, including ones that do not have any tracking code.
  1. Note: We do require an email address from commenters for the sole purpose of providing us with a means of contacting the poster in the event that the person has written something uncivil or marginally inappropriate and we need to discuss the matter with that person privately before deciding what to do about moderation. In the 10-year history of e-Literate, this has happened about three or four times. There are two differences relevant to reader privacy between requiring the email address and requiring registration. First, we allow people to use multiple email addresses or even temporary email addresses if they do not wish that email to be personally identifiable. We only require that the email address be a working address. Second and probably more importantly, without registration, there is no mechanism to link comments to browsing behavior on the site.


Back To The Future: Looking at LMS forecasts from 2011 – 2014

Wed, 2015-03-18 18:11

By Phil Hill

At today’s Learning Analytics and Knowledge 2015 conference (#LAK15), Charles Severance (aka Dr. Chuck) gave the morning keynote, organized around the theme of going back in time to see what people (myself and Richard Katz primarily) were forecasting for education. By looking at the reality of 2015, we can see which forecasts were on track and which were not. I like this concept, as it is useful to go back and see what we got right and wrong, so this post is meant to provide some additional context, particularly for the LMS market. Chuck’s keynote also gives me cover for doing so without seeming too self-absorbed.

But enough about me. What do you think about me?

I use the term forecast since I tend to describe patterns and trends and then try to describe the implications. This is different from the Katz video, which aimed to make specific predictions as a thought-provoking device.

Pre-2011

I introduced the LMS squid diagram in 2008 as a tool to help people see the LMS market holistically rather than focusing on detailed features. Too many campus evaluations then (and even now) missed the big picture: there were only a handful of vendors and some significant market dynamics at play.

A 2009 presentation, by the way, was the basis for Michael and me connecting for the first time. Bromance.

2011

In early 2011 I wrote a post on Visigoths at the LMS Gates, noting:

I am less inclined to rely on straight-line projections of market data to look ahead, and am more inclined to think the market changes we are seeing are driven by outside forces with potentially nonlinear effects. Rome may have been weakened from within, but when real change happened, the Visigoths made it happen. [snip]

Today, there is a flood of new money into the educational technology market. In addition to the potential acquisition of Blackboard, Instructure just raised $8M in venture funding and is vying for the role of Alaric in their marketing position, Pearson has been heavily investing in Learning Studio (eCollege for you old-timers), and Moodlerooms raised $7+M in venture funding. Publishing companies, ERP vendors, private equity, venture funding – these are major disruptive forces. And there are still significant moves being made by technology companies such as Google.

In August I started blogging at e-Literate with this post on Emerging Trends in LMS / Ed Tech Market. The trends I described (summary here, see post for full description):

From my viewpoint in 2011, the market has essentially moved beyond Blackboard as the dominant player driving most of the market dynamics.

  • The market is more competitive, with more options, than it has been for years.
  • Related to the above, there is a trend towards software as a service (SaaS) models for new LMS solutions.
  • Also related to the above, the market is demanding and getting real Web 2.0 and Web 3.0 advances in LMS user interfaces and functionality. We are starting to see some real improvements in usability in the LMS market.
  • The lines are blurring between content delivery systems (e.g. Cengage MindTap, Pearson MyLabs, etc) and LMS.
  • Along those same lines, it is also interesting to note what is not being seen as a strategic blurring of lines – between LMS and student information systems.
  • Analytics and data reporting are not just aspirational goals for LMS deployments, but real requirements driven by real deadlines.

Looking back at the 2011 posts, I would note the following:

  • I think all of the basic trends have proven to be accurate, although I overstated the importance of analytics as “real requirements driven by real deadlines”. Analytics are important and some schools have real requirements, but for most schools analytics are not far beyond “aspirational goals”.
  • Chuck over-interpreted the “it’s all about MyLabs” graphic. The real point is the blurring of lines between previously distinct categories of delivery platforms and digital content. I would argue that the courseware movement as well as most CBE platforms show this impact in 2015. MyLabs was just an example in the graphic.
  • My main message about outside forces was that the internal players (Blackboard, Desire2Learn, Moodle, etc) were not going to be the source of change, rather “new competitors and new dynamics” would force change. Through the graphic, I over-emphasized the ERP and big tech players (Oracle, Google, Pearson & eCollege, etc) while I under-emphasized Instructure, which has proven to be the biggest source of change (although driven by VC funding).
  • I still like the Rome / Visigoths / Alaric metaphor.
2012

In early 2012 I had a post Farewell to the Enterprise LMS, Greetings to the Learning Platform that formed the basis of the forecasts Chuck commented on in the LAK15 keynote.

In my opinion, when we look back on market changes, 2011 will stand out as the year when the LMS market passed the point of no return and changed forever. What we are now seeing are some real signs of what the future market will look like, and the actual definition of the market is changing. We are going from an enterprise LMS market to a learning platform market.

In a second post I defined the characteristics of a Learning Platform (or what I meant by the term):

  1. Learning Platforms are next-generation technology compared to legacy LMS solutions arising in the late 1990s / early 2000s. While many features are shared between legacy LMS and learning platforms, the core designs are not constrained by the course-centric, walled-garden approach pioneered by earlier generations.
  2. Learning Platforms tend to be SaaS (software as a service) offerings, based in a public or private cloud on multi-tenant designs. Rather than being viewed as an enterprise application to be set up as a customized instance for each institution, there is a shared platform that supports multiple customers, leveraging a shared technology stack, database, and application web services (see the sketch after this list).
  3. Learning Platforms are intended to support and interoperate with multiple learning and social applications, and not just as extensions to the enterprise system, but as a core design consideration.
  4. Learning Platforms are designed around the learner, giving a sense of identity that is maintained throughout the learning lifecycle. Learners are not just pre-defined roles with access levels within each course, but central actors in the system design.
  5. Learning Platforms therefore are social in nature, supporting connections between learners and customization of content based on learner needs.
  6. Learning Platforms include built-in analytics based on the amalgamation of learner data across courses, across institutions, and even beyond institutions.
  7. Learning Platforms allow for the discovery of instructional content, user-generated content, and of other learners.
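
A minimal sketch of the multi-tenant idea from item 2: one shared application and database serve many institutions, with every record scoped by a tenant identifier, rather than each school running its own customized instance. All names here are hypothetical and not drawn from any actual LMS codebase:

```python
from dataclasses import dataclass

@dataclass
class Course:
    tenant_id: str  # which institution owns this record
    title: str

# One shared table serves every customer institution.
COURSES = [
    Course("state-u", "Intro to Biology"),
    Course("city-college", "Developmental Math"),
    Course("state-u", "World History"),
]

def courses_for(tenant_id: str) -> list[Course]:
    # Every query is scoped by tenant, so institutions share one
    # stack and database without seeing each other's data.
    return [c for c in COURSES if c.tenant_id == tenant_id]

print([c.title for c in courses_for("state-u")])  # ['Intro to Biology', 'World History']
```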

Going back to the Farewell post, the forecast was:

Another trend that is becoming apparent is that many of the new offerings are not attempting to fully replace the legacy LMS, at least all at once. Rather than competing with all of the possible features that are typical in enterprise LMS solutions, the new platforms appear to target specific institutional problems and offer only the features needed. Perhaps inspired by Apple’s success in offering elegant solutions at the expense of offering all the features, or perhaps inspired by Clayton Christensen’s disruptive innovation model, the new learning platform providers are perfectly willing to say ‘no – we just don’t offer this feature or that feature’.

Looking back at the 2012 posts, I would note the following:

  • I still see the move from enterprise LMS to learning platform, but it is happening more slowly than I might have thought, and more unevenly. The attributes of SaaS and fewer features have materialized (witness Canvas in particular), and the interoperability capabilities are emerging (with special thanks to Chuck and his work with IMS developing LTI). However, the adoption and true usage of multiple learning and social applications connected through the platform is quite slow.
  • The attributes of learner-centric design and built-in analytics can be seen in many of the CBE platforms, but not really in the general LMS market itself.
2013 / 2014

In 2013 and 2014 I updated the LMS squid graphic.


  • Chuck was right to point out the revision: I no longer included the outside forces of ERP & big tech. The key point of the 2011 forecasts was outside forces making changes, but by 2013 it was clear that ERP & big tech were not part of this change.
  • There is also a big addition of homegrown solutions, or alternative learning platforms, that is worth noting. The entrance of so many new CBE platforms designed from the ground up for specific purposes is an example of this trend.
Overall Notes

Thanks to Chuck, this has been an informative exercise (for me, at least) – going back to review forecasts and see what I got right and what I got wrong. Chuck’s general point on my forecasts seems to be that I am over-emphasizing the emergence of learning platforms, at least as a distinct category from the enterprise LMS, and that we are still seeing an LMS market, although with changed internals (fewer features, more interoperability). I don’t disagree with this point (if I am summarizing accurately). However, if you read the actual forecasts above, I don’t think Chuck and I are too far apart. I may be more optimistic than he is and need to clarify my terminology somewhat, but we’re in the same ballpark.

Now let’s turn the tables. My main critique of Dr. Chuck’s keynote is that he just didn’t commit on the song. We know he is willing to boldly sing, after all (skip ahead to 1:29).

[Embedded video]

Update: Clarified language on LTI spec


Blackboard Brain Drain: One third of executive team leaves in past 3 months

Tue, 2015-03-17 10:02

By Phil Hill

In August 2013 Michael described Ray Henderson’s departure from an operational role at Blackboard. As of the end of 2014, Ray is no longer on the board of directors at Blackboard either. He is focusing on his board activity (including In The Telling, our partner for e-Literate TV) and helping with other ed tech companies. While Ray’s departure from the board did not come as a surprise to me, I have been noting the surprising number of other high-level departures from Blackboard recently.

As of December 24, 2014, Blackboard listed 12 company executives on their About > Leadership page. Of those 12 people, 4 have left the company since early January. Below is the list of the leadership team at that time, along with notes on changes:

  • Jay Bhatt, CEO
  • Maurice Heiblum, SVP Higher Education, Corporate And Government Markets (DEPARTED February, new job unlisted)
  • Mark Belles, SVP K-12 (DEPARTED March, now President & COO at Teaching Strategies, LLC)
  • David Marr, SVP Transact
  • Matthew Small, SVP & Managing Director, International
  • Gary Lang, SVP Product Development, Support And Cloud Services (DEPARTED January, now VP B2B Technology, Amazon Supply)
  • Katie Blot, SVP Educational Services (now SVP Corporate Strategy & Business Development)
  • Mark Strassman, SVP Industry and Product Management
  • Bill Davis, CFO
  • Michael Bisignano, SVP General Counsel, Secretary (DEPARTED February, now EVP & General Counsel at CA Technologies)
  • Denise Haselhorst, SVP Human Resources
  • Tracey Stout, SVP Marketing

Beyond the leadership team, there are three others worth highlighting.

  • Brad Koch, VP Product Management (DEPARTED January, now at Instructure)
  • David Ashman, VP Chief Architect, Cloud Architecture (DEPARTED February, now CTO at Teaching Strategies, LLC)
  • Mark Drechsler, Senior Director, Consulting (APAC) (DEPARTED March, now at Flinders University)

I already mentioned Brad’s departure and its significance in this post. Mark is significant in terms of his influence in the Australian market, as he came aboard through the acquisition of NetSpot.

David is significant as he was Chief Architect and had the primary vision for Blackboard’s impending move into the cloud. Michael described this move in his post last July.

Phil and I are still trying to nail down some of the details on this one, particularly since the term “cloud” is used particularly loosely in ed tech. For example, we don’t consider D2L’s virtualization to be a cloud implementation. But from what we can tell so far, it looks like a true elastic, single-instance multi-tenant implementation on top of Amazon Web Services. It’s kind of incredible. And by “kind of incredible,” I mean I have a hard time believing it. Re-engineering a legacy platform to a cloud architecture takes some serious technical mojo, not to mention a lot of pain. If it is true, then the Blackboard technical team has to have been working on this for a long time, laying the groundwork long before Jay and his team arrived. But who cares? If they are able to deliver a true cloud solution while still maintaining managed hosting and self-hosted options, that will be a major technical accomplishment and a significant differentiator.

This seems like the real deal as far as we can tell, but it definitely merits some more investigation and validation. We’ll let you know more as we learn it.

This rollout of new cloud architecture has taken a while, and I believe it is hitting select customers this year. Will David’s departure add risk to this move? I talked to David a few weeks ago, and he said that he was leaving for a great opportunity at Teaching Strategies, and that while he was perhaps the most visible face of the cloud at Blackboard, others behind the scenes are keeping the vision. He does not see added risk. While I appreciate the direct answers David gave me to my questions, I still cannot see how the departure of Gary Lang and David Ashman will not add risk.

So why are so many people leaving? From initial research and questions, the general answer seems to be ‘great opportunity for me professionally or personally, loved working at Blackboard, time to move on’. There is no smoking gun that I can find, and most of those departing are going to very good jobs.

Jay Bhatt, Blackboard’s CEO, provided the following statement based on my questions.

As part of the natural evolution of business, there have been some transitions that have taken place. A handful of executives have moved onto new roles, motivated by both personal and professional reasons. With these transitions, we have had the opportunity to add some great new executive talent to our company as well. Individuals who bring the experience and expertise we need to truly capture the growth opportunity we have in front of us. This includes Mark Gruzin, our new NAHE/ProEd GTM lead, Peter George, our new head of product development and a new general counsel who will be starting later this month. The amazing feedback we continue to receive from customers and others in the industry reinforces how far we’ve come and that we are on the right path. As Blackboard continues to evolve, our leaders remain dedicated to moving the company forward into the next stage of our transformation.

While Jay’s statement matches what I have heard, I would note the following:

  • The percentage of leadership changes within a 3-month period rises above the level of “natural evolution of business”. Correlation does not imply causation, but neither does it imply coincidence.
  • The people leaving have a long history in educational technology (Gary Lang being the exception), but I have not seen the same in the reverse direction. Mark Gruzin comes from a background in worldwide sales and the federal software group at IBM. Peter George comes from a background in Identity & Access Management as well as Workforce Management companies. They both seem to be heavy hitters, but not in ed tech. Likewise, Jay himself, along with Mark Strassman and Gary Lang, had no ed tech experience when they joined Blackboard. This is not necessarily a mistake, as fresh ideas and approaches were needed, but it is worth noting the stark difference between the people leaving and the people coming in.
  • These changes come in the middle of Blackboard making huge bets on a completely new user experience and a move into the cloud. These changes were announced last year, but they have not been completed. This is the most important area to watch – whether Blackboard completes these changes and successfully rolls them out to the market.

We’ll keep watching and update where appropriate.


Rutgers and ProctorTrack Fiasco: Impact of listening to regulations but not to students

Mon, 2015-03-16 13:07

By Phil Hill

If you want to observe the unfolding impact of an institution ignoring the effect of policy decisions on students, watch the situation at Rutgers University. If you want to see the power of a single student saying “enough is enough”, go thank Betsy Chao and sign her petition. The current situation is that students are protesting Rutgers’ use of ProctorTrack software in online courses – software that costs students $32 in additional fees, accesses their personal webcams, and automatically tracks face and knuckle video as well as browser activity. Students seem to be outraged at the lack of concern over student privacy and the additional fees.

Prior to 2015, Rutgers already provided services for online courses to comply with federal regulations to monitor student identity. The rationale cited [emphasis added]:

The 2008 Higher Education Opportunity Act (HEOA) requires institutions with distance education programs to have security mechanisms in place that ensure that the student enrolled in a particular course is in fact the same individual who also participates in course activities, is graded for the course, and receives the academic credit. According to the Department of Education, accrediting agencies must require distance education providers to authenticate students’ identities through secure (Learning Management System) log-ins and passwords, proctored exams, as well as “new identification technologies and practices as they become widely accepted.”

This academic term, Rutgers added a new option – ProctorTrack:

Proctortrack is cost-effective and scalable for any institution size. Through proprietary facial recognition algorithms, the platform automates proctoring by monitoring student behavior and action for test policy compliance. Proctortrack can detect when students leave their space, search online for additional resources, look at hard notes, consult with someone, or are replaced during a test.

This occurred at the same time that the parent company, Verificient, received a patent for their approach in January 2015.

A missing piece not covered in the media thus far is that Rutgers leaves the choice of student identity verification approach up to the individual faculty member or academic program [emphasis added].

In face-to-face courses, all students’ identities are confirmed by photo ID prior to sitting for each exam and their activities are monitored throughout the exam period. To meet accreditation requirements for online courses, this process must also take place. Rutgers makes available electronic proctoring services for online students across the nation and can assist with on-site proctoring solutions. Student privacy during a proctored exam at a distance is maintained through direct communication and the use of a secure testing service. Students must be informed on the first day of class of any additional costs they may incur for exam proctoring and student authentication solutions.

The method of student authentication used in a course is the choice of the individual instructor and the academic unit offering the course. In addition to technology solutions such as Examity and ProctorTrack, student authentication can also be achieved through traditional on-site exam proctoring solutions. If you have any questions, talk to your course instructor.

As the use of ProctorTrack rolled out this term, at least one student – senior Betsy Chao – was disturbed and on February 5th created a petition on change.org.

However, I recently received emails from both online courses, notifying me of a required “Proctortrack Onboarding” assessment to set up Proctortrack software. Upon reading the instructions, I was bewildered to discover that you had to pay an additional $32 for the software on top of the $100 convenience fee already required of online courses. And I’m told it’s $32 per online class. $32 isn’t exactly a large sum, but it’s certainly not pocket change to me. Especially if I’m taking more than one online class. I’m sure there are many other college students who echo this sentiment. Not only that, but nowhere in either of the syllabi was there any inkling of the use of Proctortrack or the $32 charge. [snip]

Not only that, but on an even more serious note, I certainly thought that the delicate issue of privacy would be more gracefully handled, especially within a school where the use of webcams was directly involved in a student’s death. As a result, I thought Rutgers would be highly sensitive to the issue of privacy.

If accurate, this clearly violates the notification policy of Rutgers highlighted above. Betsy goes on to describe the alarming implications relating to student privacy.

On February 7th, New Brunswick Today picked up on the story.

Seven years ago, Congress passed the Higher Education Opportunity Act of 2008, authorizing the U.S Department of Education to outline numerous recommendations on how institutions should administer online classes.

The law recommended that a systemic approach be developed to ensure that the student taking exams and submitting projects is the same as the student who receives the final grade, and that institutions of higher education employ “secure logins and passwords, or proctored exams to verify a student’s identity.”

Other recommendations include the use of an identity verification process, and the monitoring by institutions of the evolution of identity verification technology.

Under these recommendations by the U.S Department of Education, Rutgers would technically be within its right to implement the use of ProctorTrack, or an alternative form of identity verification technology.

However, the recommendations are by no means requirements, and an institution can decide whether or not to take action.

The student newspaper at Rutgers, The Daily Targum, ran stories on February 9th and February 12th, both highly critical of the new software usage. All of this attention thanks to one student who refused to quietly comply.

The real problem in my opinion can be found in this statement from the New Brunswick Today article.

“The university has put significant effort into protecting the privacy of online students,” said the Rutgers spokesperson. “The 2008 Act requires that verification methods not interfere with student privacy and Rutgers takes this issue very seriously.”

The Rutgers Center for Online and Hybrid Learning and Instructional Technologies (COHLIT) would oversee the implementation of and compliance with the usage of ProctorTrack, according to Rutgers spokesperson E.J. Miranda, who insisted it is not mandatory.

“ProctorTrack is one method, but COHLIT offers other options to students, faculty and departments for compliance with the federal requirements, such as Examity and ExamGuard,” said Miranda.

Rutgers has also put up a FAQ page on the subject.

The problem is that Rutgers is paying attention to federal regulations and assuming their solutions are just fine, yet:

  • Rutgers staff clearly spent little or no time asking students for their input on such an important and highly charged subject;
  • Rutgers policy leaves the choice purely up to faculty or academic programs, meaning that there was no coordinated decision-making and communication to students;
  • Now that students are complaining, the Rutgers spokesperson has been getting defensive, implying ‘there’s nothing to see here’ and not taking the student concerns seriously;
  • At no point that I can find has Rutgers acknowledged the problem of a lack of notification and new charges for students, nor have they acknowledged that students are saying that this solution goes too far.

That is why this is a fiasco. Student privacy is a big issue, and students should have some input into the policies shaped by institutions. The February 12th student paper put it quite well in conclusion.

Granted, I understand the University’s concern — if Rutgers is implementing online courses, there need to be accountability measures that prevent students from cheating. However, monitoring and recording our computer activity during online courses is not the solution, and failing to properly inform students of ProctorTrack’s payment fee is only a further blight on a rather terrible product. If Rutgers wants to transition to online courses, then the University needs to hold some inkling of respect for student privacy. Otherwise, undergraduates have absolutely no incentive to sign up for online classes.

If the Rutgers administration wants to defuse this situation, they will need to find a way to talk and listen to students on the subject. Pure and simple.

H/T: Thanks to Audrey Watters and to Jonathan Rees for highlighting this situation.

Update: Bumping comment from Russ Poulin into post itself [emphasis added]:

The last paragraph in the federal regulation regarding academic integrity (602.17) reads:

“(2) Makes clear in writing that institutions must use processes that protect student privacy and notify students of any projected additional student charges associated with the verification of student identity at the time of registration or enrollment.”

The privacy issue is always a tricky one when needing to meet the other requirements of this section. But, it does sound like students were not notified of the additional charges at the time of registration.
