Ray Henderson captured the changing trend of the past two EDUCAUSE conferences quite well.
The #Edu14 drinking game: sure inebriation in 13 from vendor claims of "mooc" "cloud" or "disrupting edu". In 2014: "competency based."
— Ray Henderson (@readmeray) October 3, 2014
Two years ago, the best-known competency-based education (CBE) initiatives were at Western Governors University (WGU), Southern New Hampshire University’s College for America (CfA), and New York’s Excelsior College. In an article this past summer describing the US Department of Education’s focus on CBE, Paul Fain noted [emphasis added]:
The U.S. Department of Education will give its blessing — and grant federal aid eligibility — to colleges’ experimentation with competency-based education and prior learning assessment.
On Tuesday the department announced a new round of its “experimental sites” initiative, which waives certain rules for federal aid programs so institutions can test new approaches without losing their aid eligibility. Many colleges may ramp up their experiments with competency-based programs — and sources said more than 350 institutions currently offer or are seeking to create such degree tracks.
One issue I’ve noticed, however, is that many schools are looking to duplicate the solution of CBE without understanding the problems and context that allowed WGU, CfA and Excelsior to thrive. Looking at the three main CBE initiatives, at least three lessons stand out as significant factors in their success to date, and these lessons are readily available but perhaps not well understood.

Lesson 1: CBE as a means to address a specific student population
None of the main CBE programs were designed to target a general student population or to offer just another modality. In all three cases, their first consideration was how to provide education to working adults looking to finish a degree, change a career, or advance a career.
As described by WGU’s website:
Western Governors University is specifically designed to help adult learners like you fit college into your already busy lives. Returning to college is a challenge. Yet, tens of thousands of working adults are doing it. There’s no reason you can’t be one of them.
As described by College for America’s website:
We are a nonprofit college that partners with employers nationwide to make a college degree possible for their employees. We help employers develop their workforce by offering frontline workers a competency-based degree program built on project-based learning that is uniquely applicable in the workplace, flexibly scheduled to fit in busy lives, and extraordinarily affordable.
As described by Excelsior’s website:
Excelsior’s famously-flexible online degree programs are created for working adults.
SNHU’s ubiquitous president Paul Leblanc described the challenge of not understanding the target for CBE at last year’s WCET conference (from my conference notes):
One of the things that muddies our own internal debates and policy maker debates is that we say things about higher education as if it’s monolithic. We say that ‘competency-based education is going to ruin the experience of 18-year-olds’. Well, that’s a different higher ed than the people we serve in College for America. There are multiple types of higher ed with different missions.
The one CfA is interested in is the world of working adults – this represents the majority of college students today. Working adults need credentials that are useful in the workplace, they need low cost, they need short completion times, and they need convenience. Education has to compete with work and family requirements.
CfA targets the bottom 10% of wage earners in large companies – these are the people not earning sustainable wages. They need stability and advancement opportunities.
CfA has two primary customers – the students and the employers who want to develop their people. In fact, CfA does not have a retail offering, and they directly work with employers to help employees get their degrees.

Lesson 2: Separate organizations to run CBE
In all three cases the use of CBE to serve working adults necessitated entirely new organizations that were designed to provide the proper support and structure based on this model.
WGU was conceived as a separate non-profit organization in 1995 and incorporated in 1997 specifically to design and enable the new programs. College for America was spun out of SNHU in 2012. Excelsior College started 40 years ago as Regents College, focused on both mastery and competency-based programs. The CBE nursing program was founded in 1975.
CBE has some unique characteristics that do not fit well within traditional educational organizations. From a CBE primer I wrote in 2012 and updated in 2013:
I would add that the integration of self-paced programs not tied to credit hours into existing higher education models presents an enormous challenge. Colleges and universities have built up large bureaucracies – expensive administrative systems, complex business processes, large departments – to address financial aid and accreditation compliance, all based on fixed academic terms and credit hours. Registration systems, and even state funding models, are tied to the fixed semester, quarter or academic year – largely defined by numbers of credit hours.
It is not an easy task to allow transfer credits coming from a self-paced program, especially if a student is taking both CBE courses and credit-hour courses at the same time. The systems and processes often cannot handle this dichotomy.
Beyond the self-paced student-centered scheduling issues, there are also different mentoring roles required to support students, and these roles are not typically understood or available at traditional institutions. Consider the mentoring roles at WGU as described in EvoLLLutions:
Faculty mentors (each of whom have at least a master’s degree) are assigned a student caseload and their full-time role is to provide student support. They may use a variety of communication methods that, depending on student preferences, include calling — but also Skype, email and even snail mail for encouraging notes.
Course mentors are the second type of WGU mentor. These full-time faculty members hold their Ph.D. and serve as content experts. They are also assigned a student caseload. Responsibilities of course mentors include creating a social community among students currently enrolled in their courses and teaching webinars focused specifically on competencies students typically find difficult. Finally, they support students one-on-one based on requests from the student or referral from the student’s faculty mentor.

Lesson 3: Competency is not the same as mastery
John Ebersole, the president of Excelsior College, called out the distinction between competency and mastery in an essay this summer at Inside Higher Ed.
On close examination, one might ask if competency-based education (or CBE) programs are really about “competency,” or are they concerned with something else? Perhaps what is being measured is more closely akin to subject matter “mastery.” The latter can be determined in a relatively straightforward manner, using various forms of examinations, projects and other forms of assessment.
However, an understanding of theories, concepts and terms tells us little about an individual’s ability to apply any of these in practice, let alone doing so with the skill and proficiency which would be associated with competence.
Deeming someone competent, in a professional sense, is a task that few competency-based education programs address. While doing an excellent job, in many instances, of determining mastery of a body of knowledge, most fall short in the assessment of true competence.
Ebersole goes on to describe the need to measure true competency, along with an observation I share: many programs confuse the two concepts.
A focus on learning independent of time, while welcome, is not the only consideration here. We also need to be more precise in our terminology. The appropriateness of the word competency is questioned when there is no assessment of the use of the learning achieved through a CBE program. Western Governors University, Southern New Hampshire, and Excelsior offer programs that do assess true competency.
Unfortunately, the vast majority of the newly created CBE programs do not. This conflation of terms needs to be addressed if employers are to see value in what is being sold. A determination of “competency” that does not include an assessment of one’s ability to apply theories and concepts cannot be considered a “competency-based” program.

Whither the Bandwagon
I don’t think that the potential of CBE is limited only to the existing models, nor do I think WGU, CfA, and Excelsior are automatically the best initiatives. But an aphorism variously attributed to Pablo Picasso, the Dalai Lama XIV, or bassist Jeff Berlin might provide guidance to the new programs:
Know the rules well, so you can break them effectively
How many new CBE programs are being attempted that target the same student population as the parent institutions? How many new CBE programs are being attempted in the same organization structure? And how many new CBE programs are actually based on testing only of masteries and not competencies?
Judging by media reports and observations at EDUCAUSE, I think far too many programs are treating CBE as a silver bullet. They are moving beyond the model and lessons of WGU, College for America and Excelsior without first understanding why those initiatives have been successful. I don’t intend to name names here, but the 350 new programs cited in Paul Fain’s article would do well to ground themselves in a solid foundation that understands and builds off of successful models.
The post Competency-Based Education: Not just a drinking game appeared first on e-Literate.
The changes with Kuali are accelerating, and there are some big updates on the strategy.
Earlier this week the Kuali Foundation distributed an Information Update obtained by e-Literate on many of the details of the transition to Kuali 2.0 and the addition of the for-profit KualiCo. Some of the key clarifications:
- KualiCo will be an independent C Corporation with a board of directors. KualiCo will not be a subsidiary of Kuali Foundation. Capital structure, equity allocations, and business plans are confidential and will not be shared publicly for the same reasons these things are rarely shared by private companies. The board of directors will start out with three members and will move to five or seven over time. Directors will include the CEO and an equal number of educational administrators and outside directors. One of the educational administrators will be appointed by the Kuali Foundation. Outside directors will be compensated with equity. Educational administrators will not be compensated in any way and could only serve as a director with the explicit permission of their university administration with attention to all relevant institutional policies.
- KualiCo’s only initial equity investor is the Kuali Foundation. The Kuali Foundation will invest up to $2M from the Foundation’s cash reserves. [snip] For its equity investment, the Kuali Foundation will have the right to designate a director on the KualiCo Board of Directors. The Kuali Foundation, through its director, will have an exceptional veto right to block the sale of the company, an IPO of the company or a change to the open source license. This helps ensure that KualiCo will stay focused on marketplace-winning products and services rather than on flipping the company on Wall Street.
- The Kuali Foundation is not licensing the Kuali software code for Kuali products to KualiCo as Kuali software is already fully open source and could be used by anyone for any purpose — as is already being done today. No license transfer or grant is needed by KualiCo or anyone else.
- The copyright for the AGPL3 software will be copyright KualiCo for the open source distribution that is available to everyone. It would very quickly become untenable to even try to manage multiple copyright lines as various sections of code evolve through the natural enhancement processes of an open source community.
One key point the document describes at length is the lack of financial interest from individuals in the Kuali Foundation and KualiCo, including the uncompensated director position, the lack of equity held by individuals outside of KualiCo, etc.
Two other key points that are particularly relevant to yesterday’s news:
- Each project board will decide if, when, to what extent, and for what term to engage with KualiCo. Project boards could decide to continue on as they currently do, to engage KualiCo in a limited way, or to allow KualiCo to help drive substantial change to the software approach to that product. If a project chooses not to engage KualiCo, KualiCo will have less initial funding to invest in enhancing the product, but will slowly build up those funds over time by hosting the product and enhancing the product for its customers. Choosing to engage with KualiCo in any fashion requires code to be reissued under the AGPL3 license (see Open Source section).
- KualiCo will be working with the Kuali community to make improvements to current Kuali products. In addition to enhancing the current codebase, KualiCo is beginning the re-write of Kuali products with a modern technology stack. The initial focus will be on Kuali Student and then HR. Complete rewrites of KFS and KC will likely not begin for 3-5 years.
With this in mind, the Kuali Student (KS) Project Board met yesterday and made the decision to sunset the current project and to transition to KualiCo development. Bob Cook, CIO at the University of Toronto and chair of the KS Project Board, confirmed by email:
I can say that the Board adopted its resolution because it is excited about the opportunity that KualiCo presents for accelerating the delivery of high quality administrative services for use in higher education, and is eager to understand how to best align our knowledgeable project efforts to achieve that goal. [snip]
In recognition of the opportunity presented by the establishment of KualiCo as a new facet in the Kuali community, the Kuali Student Board has struck a working group to develop a plan for transitioning future development of Kuali Student by the KualiCo. The plan will be presented to the Board for consideration.
While Bob did not confirm the additional level of detail I asked about (“It would be premature to anticipate specific outcomes from a planning process that has not commenced”), my understanding is that it is safe to assume:
- Kuali Student will transition to AGPL license with KualiCo holding copyright;
- KualiCo will develop a new product roadmap based on recoding / additions for multi-tenant framework; and
- Some or all of the current KS development efforts will be shut down over the next month or two.
KS Project Director Rajiv Kaushik sent a note to the full KS team with more details:
KS Board met today and continued discussions on a transition to Kuali 2.0. That thread is still very active with most current investors moving in the Kuali 2.0 direction. In the meantime, UMD announced its intent to invest in Kuali 2.0 and to withdraw in 2 weeks from the current KS effort. Since this impacts all product streams, Sean, Mike and I are planning work over the next 2 weeks while we still have UMD on board. More to come on that tomorrow at the Sprint demo meeting.
I will update or correct this information as needed.
Kuali Student (KS) is the centerpiece of Kuali – it is the largest and most complex project, and the one of most central value to higher education. KS was conceived in 2007. Unlike KFS, Coeus and Rice, Kuali Student was designed from the ground up. The full suite of modules within Kuali Student had been scheduled for release between 2012 and 2015 in a single-tenant architecture. With the transition, a new roadmap will redevelop the product for multi-tenancy and an updated technology stack.
Just how large has this project been? According to a financial analysis of 2009-2013 performed by instructional media + magic inc., Kuali Student had $30 million in expenditures in that five-year span. The 2014 records are not yet available, nor are the 2007-08 records, but an educated guess is that the total is closer to $40 million.
I mention this to show the scope of Kuali Student to date as well as the relative project size compared to other Kuali projects. I wrote a post on cloud computing around the LMS that might be relevant to the future KualiCo development, calling out how cloud technologies and services are driving down the cost of product development and time. In the case of the LMS, the difference has been close to an order of magnitude compared to the first generation:
Think about the implications – largely due to cloud technologies such as Amazon web services (which underpins Lore as well as Instructure and LoudCloud), a new learning platform can be designed in less than a year for a few million dollars. The current generation of enterprise LMS solutions often cost tens of millions of dollars (for example, WebCT raised $30M prior to 2000 to create its original LMS and scale to a solid market position, and raised a further $95M in 2000 alone), or product redesigns take many years to be released (for example, Sakai OAE took 3 years to go from concept to release 1.0). It no longer takes such large investments or extended timeframes to create a learning platform.
Cloud technologies are enabling a rapid escalation in the pace of innovation, and they are lowering the barriers to entry for markets such as learning platforms. Lore’s redesign in such a short timeframe gives a concrete example of how quickly systems can now be developed.
How will these dynamics apply to student information systems? Given the strong emphasis on workflow and detailed user functionality, I suspect that the differences will be less than for the LMS, but still significant. In other words, I would not expect the redevelopment of Kuali Student to take anywhere close to $40 million or seven years, but I will be interested to see the new roadmap when it comes out.
This decision – moving Kuali Student to KualiCo – along with the foundation’s ability to hold on to the current community members (both institutions and commercial affiliates) will be the make-or-break bets that the Kuali Foundation has made with the move to Kuali 2.0. Stay tuned for more updates before the Kuali Days conference in November.
Say what you will about the move away from Community Source, Kuali is definitely not resting on its laurels or being cautious. This redevelopment of Kuali Student with a new structure is bold and high-risk.
- Disclosure: Jim Farmer from im+m has been a guest blogger at e-Literate for many years.
- It’s probably more than that, but let’s use a conservative estimate to set general scope.
The post Kuali Student Sunsetting $40 million project, moving to KualiCo appeared first on e-Literate.
I have long thought that LinkedIn has the potential to be one of the most transformative companies in ed tech for one simple reason: They have far more cross-institutional longitudinal outcomes data than anybody else—including government agencies. Just about anybody else who wants access to career path information of graduates across universities would face major privacy and data gathering hurdles. But LinkedIn has somehow convinced hundreds of millions of users to voluntarily enter that information and make it available for public consumption. The company clearly knows this and has been working behind the scenes to make use of this advantage. I have been waiting to see what they will come up with.
I have to say that I’m disappointed with their decision that their first foray would be a college ranking system. While I wouldn’t go so far as to say that these sorts of things have zero utility, they suffer from two big and unavoidable problems. First, like any standardized test—and I mean this explicitly in the academic meaning of the term “test”—they are prone to abuse through oversimplification of their meaning and overemphasis on their significance. (It’s not obvious to me that they would be subject to manipulation by colleges the way other surveys are, given LinkedIn’s ranking method, so at least there’s that.) Second and more importantly, they are not very useful even when designed well and interpreted properly. Many students change their majors and career goals between when they choose their college and when they graduate. According to the National Center for Education Statistics, 80% of undergraduates change their majors at least once, and the average student changes majors three times. Therefore, telling high school students applying to college which school is ranked best for, say, a career in accounting has less potential impact on the students’ long-term success and happiness than one might think.
It would be more interesting and useful to have LinkedIn tackle cross-institutional questions that could help students make better decisions once they are in a particular college. What are the top majors for any given career? For example, if I want to be a bond trader on Wall Street, do I have to major in finance? (My guess is that the answer to this question is “no,” but I would love to see real data on it.) Or how about the other way around: What are the top careers for people in my major? My guess is that LinkedIn wanted to start off with something that (a) they had a lot of data on (which means something coarse-grained) and (b) was relatively simple to correlate. The questions I’m suggesting here would fit that bill while being more useful than a college ranking system (and less likely to generate institutional blow-back).
Well that was an interesting session at Educause as described at Inside Higher Ed:
It took the Kuali leadership 20 minutes to address the elephant in the conference center meeting room.
“Change is ugly, and change is difficult, and the only difference here is you’re going to see all the ugliness as we go through the change because we’re completely transparent,” said John F. (Barry) Walsh, a strategic adviser for the Kuali Foundation. “We’re not going to hide any difficulty that we run into. That’s the way we operate. It’s definitely a rich environment for people who want to chuck hand grenades. Hey, have a shot — we’re wide open.” [snip]
Walsh, who has been dubbed the “father of Kuali,” issued that proclamation after a back-and-forth with higher education consultant Phil Hill, who during an early morning session asked the Kuali leadership to clarify which parts of the company’s software would remain open source.
While the article describes the communication and pushback issues with Kuali’s creation of a for-profit entity quite well (go read the whole article), I think it’s worth digging into what Carl generously describes as a “back-and-forth”. What happened was that there was a slide describing the relicensing of Kuali code as AGPL, and the last bullet caught my attention:
- AGPL > GPL & ECL for SaaS
- Full versions always downloadable by customers
- Only feature “held back” is multi-tenant framework
If you need a read on the change of open source licenses and why this issue is leading to some of the pushback, go read Chuck Severance’s blog post.
Does ‘held back’ mean that the multi-tenant framework to enable cloud hosting partially existed but is not moving to AGPL, or that the framework would be AGPL but not downloadable by customers, or that the framework is not open source? That was the basis of my question.
Several Kuali Foundation representatives attempted to indirectly answer the question without addressing the license.
“I’ll be very blunt here,” Walsh said. “It’s a commercial protection — that’s all it is.”
The back-and-forth involved trying to get a clear answer, and the answer is that the multi-tenant framework to be developed / owned by KualiCo will not be open source – it will be proprietary code. I asked Joel Dehlin for additional context after the session, and he explained that all Kuali functionality will be open source, but the infrastructure to allow cloud hosting is not open source.
This is a significant clarification on the future model. While Kuali has always supported an ecosystem with commercial partners that can offer proprietary code, this is the first time that Kuali itself will offer proprietary, non open source code.
What is not clear is whether any of the “multi-tenant framework” already exists and will be converted to a proprietary license or if all of this code will be created by KualiCo from the ground up. If anyone knows the answer, let me know in the comments.
“Unfortunately some of what we’re hearing is out of a misunderstanding or miscommunication on our part,” said Eric Denna, vice president of IT and chief information officer at the University of Maryland at College Park. “Brad [Wheeler, chair of the foundation’s board of directors,] and I routinely are on the phone saying, ‘You know, we have day jobs.’ We weren’t hired to be communications officers.”
Suggestion: Simple answers such as “What ‘held back’ means is that the framework will be owned by KualiCo and not open source and therefore not downloadable” would avoid some of the perceived need for communication officers.
- Kuali Foundation is partial owner and investor in KualiCo.
The post Kuali Foundation: Clarification on future proprietary code appeared first on e-Literate.
This article is cross-posted to the WCET blog.
After billions of dollars spent on administrative computer systems and billions of dollars invested in ed tech companies, the U.S. higher education system is woefully out of date and unable to cope with major education trends such as online & hybrid education, flexible terms, and the expansion of continuing and extended education. Based on an investigation of the recently released distance education data for IPEDS, the primary national education database maintained by the National Center for Education Statistics (NCES), we have found significant confusion over basic definitions of terms, manual gathering of data outside of the computer systems designed to collect data, and, due to confusion over which students to include in IPEDS data, the systematic non-reporting of large numbers of degree-seeking students.
In Fall 2012, the IPEDS (Integrated Postsecondary Education Data System) data collection for the first time included distance education – primarily for online courses and programs. This data is important for policy makers and institutional enrollment management as well as for the companies serving the higher education market.
We first noticed the discrepancies based on feedback from analysis that we have both published at the e-Literate and WCET blogs. One of the most troubling calls came from a state university representative who said that the school has never reported any students who took credit-bearing courses through its self-supported, continuing education program. Since it did not include those enrollments in reporting to the state, it did not report them to IPEDS. These were credits toward degrees and certificate programs offered by the university and therefore should have been included in IPEDS reporting, based on the following instructions.
Include all students enrolled for credit (courses or programs that can be applied towards the requirements for a postsecondary degree, diploma, certificate, or other formal award), regardless of whether or not they are seeking a degree or certificate.
Unfortunately, the instructions call out this confusing exclusion (one example out of four):
Exclude students who are not enrolled for credit. For example, exclude: Students enrolled exclusively in Continuing Education Units (CEUs).
How many schools have interpreted this continuing education exclusion to apply to all continuing education enrollments? To do an initial check, we contacted several campuses in the California State University system and were told that all IPEDS reporting was handled at the system level. Based on the introduction of the Fall 2012 distance education changes, Cal State re-evaluated whether to change their reporting policy. A system spokesman explained that:
I’ve spoken with our analytic studies staff and they’ve indicated that the standard practice for data reporting has been to share only data for state-supported enrollments. We have not been asked by IPEDS to do otherwise so when we report distance learning data next spring, we plan on once again sharing only state-supported students.
Within the Cal State system, this means that more than 50,000 students taking for-credit self-support courses will not be reported, and this student group has never been reported.
One of the reasons for the confusion, as well as the significance of this change, is that continuing education units have moved past their roots of offering CEUs and non-credit courses for the general public (hence the name continuing education) and are taking up a new role of offering courses not funded by the state (hence self-support). Since these courses and programs are not state funded, they are not subject to the same oversight and restrictions as their state-funded equivalents, such as maximum tuition per credit hour.
This situation allows continuing education units in public schools to become laboratories and innovators in online education. The flip side is that, given the non-state-funded nature of these courses and programs, it appears that schools may not be reporting these for-credit enrollments through IPEDS, whether or not the students were in online courses. However, the new distance education questions may actually trigger changes in reporting practices.

Do Other Colleges Also Omit Students from Their IPEDS Report?
Given what was learned from the California State University System, we were interested in learning if other colleges were having similar problems with reporting distance education enrollments to IPEDS. WCET conducted a non-scientific canvassing of colleges to get their feedback on what problems they may have encountered. Twenty-one institutions were selected through a non-scientific process of identifying colleges that reported enrollment figures that seemed incongruous with their size or distance education operations. See the “Appendix A: Methodology” for more details.
From early August to mid-September, we sought answers regarding whether the colleges reported all for-credit distance education and online enrollments for Fall 2012. If they did not, we asked about the size of the undercount and why some enrollments were not reported.
Typically, the response included some back-and-forth between the institutional research and distance education units at each college. Through these conversations, we quickly realized that we should have asked a question about the U.S. Department of Education’s definition of “distance education.” Institutions were very unclear about what activities to include or exclude in their counts. Some used local definitions that varied from the federal expectations. As a result, we asked that question as often as we could.

The Responses
Twenty institutions provided useable responses. We agreed to keep responses confidential. Table 1 provides a very high level summary of the responses to the following two questions:
- Counts Correct? – Do the IPEDS data reported include all for-credit distance education and online enrollments for Fall 2012?
- Problem with “Distance Education” Definition? – Although we did not specifically ask this question, several people volunteered that they had trouble applying the IPEDS definition.
Of those that assured us that they submitted the correct distance education counts, some also reported having used their own definitions or processes for distance education. This would make their reported counts incomparable to the vast majority of others reporting.

One institution declined to respond. Given that its website advertises many hundreds of online courses, the distance education counts it reported would lead us to believe that it either: a) under-reported, or b) averages one or two students per online class. The second scenario seems unlikely.

Findings
This analysis found several issues that call into question the usability of IPEDS distance education enrollment counts and, more broadly and more disturbingly, IPEDS statistics in general.
There is a large undercount of distance education students
While only a few institutions reported an undercount, one was from the California State University System and another from a large university system in another populous state. Since the same procedures were used within each system, there are a few hundred thousand students who were not counted in just those two systems.
In California, they have never reported students enrolled in Continuing Education (self-support) units to IPEDS. A source of the problem may be in the survey instructions. Respondents are asked to exclude: “Students enrolled exclusively in Continuing Education Units (CEUs).” The intent of this statement is to exclude those taking only non-credit courses. It is conceivable that some might misinterpret this to mean excluding those in the campus’s continuing education division. What was supposed to be reported was the number of students taking for-credit courses, regardless of which college or institutional unit was responsible for offering the course.
In the other large system, out-of-state students are not reported because those enrollments do not receive funding from state coffers.
It is unclear how large the total undercount would be if we knew the actual numbers across all institutions. Given that the total number of “students enrolled exclusively in distance education courses” for Fall 2012 was 2,653,426, an undercount of a hundred thousand students just from these two systems would be roughly a 4% error. That percentage is attention-getting on its own.
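The arithmetic behind that percentage is simple; a quick sketch (the 100,000-student undercount is an illustrative assumption based on the two systems discussed above, not an official count):

```python
# Fall 2012 IPEDS total of students enrolled exclusively in
# distance education courses, as reported.
reported = 2_653_426

# Assumed undercount from just the two university systems above
# (illustrative figure, not an official count).
undercount = 100_000

# Undercount relative to the reported total; roughly the 4% cited above.
error = undercount / reported
print(f"Undercount relative to the reported total: {error:.1%}")
```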
The IPEDS methodology does not work for innovative programs…and this will only get worse
One institutional respondent, whose institution uses as many as 28 start dates for courses, estimated that there was approximately a 40% undercount in its reported enrollments. A student completing a full complement of courses in a 15-week period might not be enrolled in all of those courses on the census date. With the increased use of competency-based programs, adaptive learning, and innovations still on the drawing board, it is conceivable that the census dates used by an institution (IPEDS gives some options) might not serve every type of educational offering.
The definition of ‘distance education’ is causing confusion
It is impossible to get an accurate count of anything if there is not a clear understanding of what should or should not be included in the count. The definition of a “distance education course” from the IPEDS Glossary is:
A course in which the instructional content is delivered exclusively via distance education. Requirements for coming to campus for orientation, testing, or academic support services do not exclude a course from being classified as distance education.
Even with that definition, colleges faced problems with counting ‘blended’ or ‘hybrid’ courses. What percentage of a course needs to be offered at a distance to be counted in the federal report? Some colleges had their own standard (or one prescribed by the state), and the percentage required for a course to be labeled “distance education” varied greatly. One reported that it included all courses with more than 50% of the course being offered at a distance.
To clarify the federal definition, one college called the IPEDS help desk. Even after escalating the issue to a second-line manager, its staff were still unclear on exactly how to apply the definition.
The Online Learning Consortium is updating its distance education definitions. Its current work could inform IPEDS on possible definitions, but probably contains too many categories for such widespread data gathering.
There is a large overcount of distance education students
Because many colleges used their own definitions, there is a massive overcount of distance education. At least, it is an overcount relative to the current IPEDS definition. This raises the question: is the near-100% standard imposed by that definition useful in interpreting activity in this mode of instruction? Is it the correct standard, given that no one else seems to use it?
In addressing the anomalies, IPEDS reporting becomes burdensome or the problems ignored
In decentralized institutions, or in institutions with “self-support” units that operate independently from the rest of campus, the data systems are often not connected. Institutional researchers are also faced with simultaneously reconciling differing “distance education” definitions. One choice is to knit together numbers from incompatible data systems and/or with differing definitions, often by hand. To their credit, institutional researchers overcome many such obstacles. Whether through misunderstanding the requirements or not having the capacity to perform the work, some colleges did not tackle this burdensome task.

Conclusions – We Don’t Know
While these analyses have shed light on the subject, the biggest finding is that we do not know what we do not know, which brings to mind former Secretary of Defense Donald Rumsfeld’s famous rambling:
There are known knowns. These are things we know that we know. We also know there are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are ones we don’t know we don’t know.
The net effect is not known
Some institutions reported accurately, some overcounted, some undercounted, some did both at the same time. What should the actual count be?
We don’t know.
The 2012 numbers are not a credible baseline
The distance education field looked forward to the Fall 2012 Enrollment statistics, with their distance education numbers, as a welcome baseline for the size and growth of this mode of instruction. That baseline is not possible, and the problems will persist with the Fall 2013 Enrollment report when those numbers are released. These problems can be fixed, but it will take work. When can we get a credible baseline?
We don’t know.
A large number of students have not been included on ANY IPEDS survey, EVER.
A bigger issue for the U.S. Department of Education goes well beyond the laser-focused issue of distance education enrollments. Our findings indicate that there are hundreds of thousands of students who have never been reported on any IPEDS survey that has ever been conducted. What is the impact on IPEDS? What is the impact on the states where they systematically underreported large numbers of students?
We don’t know.
Who is at fault?
Everybody and nobody. IPEDS is faced with institutional practices that vary greatly and often change from year-to-year as innovations are introduced. Institutional researchers are faced with reporting requirements that vary depending on the need, such as state oversight agencies, IPEDS, accrediting agencies, external surveys and ranking services, and internal pressures from the marketing and public relations staffs. They do the best they can in a difficult situation. Meanwhile, we are in an environment in which innovations may no longer fit into classic definitional measurement boxes.
What to expect?
In the end, this expansion of data from NCES through the IPEDS database is a worthwhile effort in our opinion, and we should see greater usage of real data to support policy decisions and market decisions thanks to this effort. However, we recommend the following:
- Between the Fall 2012 and Fall 2013 reporting periods, participating institutions will make significant changes in methodology. Assuming that we get improved definitions over time, there will also be changes in reporting methodology at least through Fall 2015. Therefore we recommend that analysts and policy-makers not put too much credence in year-over-year changes for the first two or three years.
- The most immediate improvement available is for NCES to clarify and gain broader consensus on the distance education definitions. This process should include working with accrediting agencies, whose own definitions influence school reporting, as well as leading colleges and universities with extensive online experience.
Appendix A: Methodology

The selection process for institutions to survey was neither random nor scientific. A multi-step process of identifying institutions that might have had problems in reporting distance education enrollments was undertaken. The goal was to identify twenty institutions to be canvassed. The steps included:
- A first cut was created by an “eyeball” analysis of the Fall 2012 IPEDS Fall Enrollment database to identify institutions that may have had problems in responding to the distance education enrollment question.
- Colleges that reported distance education enrollments that did not appear to be in scope with the size of the institution (i.e., a large institution with very low distance education enrollments) or what we knew about their distance education operations were included.
- Special attention was paid to land grant colleges as they are likely to have self-funded continuing or distance education units.
- Institutions in the California State University system were excluded.
- This resulted in a list of a little more than 100 institutions.
- The second cut was based upon:
- Including colleges across different regions of the country.
- Including a private college and an HBCU as indicators of whether this problem might be found in colleges from those institutional categories.
- Twenty institutions were identified.
- In side discussions, a distance education leader at a public university agreed to participate in the survey. This brought the total to twenty-one institutions.
Questions Asked in the Survey
- Do the IPEDS data reported include all for-credit distance education and online enrollments for Fall 2012?
- If the IPEDS data reported does not include all for-credit distance education and online enrollments for Fall 2012, approximately how many enrollments are under-counted?
- If the IPEDS data reported does not include all for-credit distance education and online enrollments for Fall 2012, why did you not report some enrollments?
The post Investigation of IPEDS Distance Education Data Highlights System Not Ready for Modern Trends appeared first on e-Literate.
In Fall 2013 we saw a rich source of LMS market data emerge.
George Kroner, a former engineer at Blackboard who now works for University of Maryland University College (UMUC), has developed what may be the most thorough measurement of LMS adoption in higher education at Edutechnica (OK, he’s better at coding and analysis than site naming). This side project (not affiliated with UMUC) started two months ago based on George’s ambition to unite various learning communities with better data. He said that he was inspired by the Campus Computing Project (CCP) and that Edutechnica should be seen as complementary to the CCP.
The project is based on a web crawler that starts from national databases to identify each higher education institution, then visits the official school web site to find the official LMS (or multiple LMSs officially used). The initial data covers the Anglosphere (US, UK, Canada, and Australia), but there is no reason the coverage could not expand.
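Edutechnica has not published its code, but the general approach can be sketched: fetch a page from each school’s domain and match it against known LMS fingerprints. The fingerprints and helper below are my own illustrative assumptions, not Edutechnica’s actual detection rules.

```python
# Illustrative LMS "fingerprints": strings that commonly appear in the
# HTML or URLs of each vendor's pages. These patterns are assumptions
# for demonstration, not Edutechnica's actual rules.
FINGERPRINTS = {
    "Blackboard Learn": ["blackboard.com", "webapps/login"],
    "Moodle": ["moodle", "login/index.php"],
    "Canvas": ["instructure.com", "canvas"],
    "Desire2Learn": ["desire2learn", "d2l"],
    "Sakai": ["sakai", "portal/xlogin"],
}

def detect_lms(page_text: str) -> list[str]:
    """Return all LMSs whose fingerprint appears in the fetched page."""
    text = page_text.lower()
    return [lms for lms, patterns in FINGERPRINTS.items()
            if any(p in text for p in patterns)]

# A crawler would fetch each institution's site (seeded from a national
# database such as IPEDS) and run detect_lms on the response.
sample = '<a href="https://myschool.instructure.com/login/canvas">Log in</a>'
print(detect_lms(sample))  # ['Canvas']
```

Detecting multiple official LMSs at one school falls out naturally, since the function returns every match rather than the first.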
There is new data available in Edutechnica’s one-year update, with year-over-year comparisons available as well as improvements to the methodology. Note that the methodology has improved both in terms of setting the denominator and in terms of how many schools are included in the data collection.
The Fall 2014 data, which now includes all schools with more than 800 enrollments:
There’s more data available on the site, including measures of the Anglosphere (combining US, UK, Canada and Australia data) as well as comparison tables for 2013 to 2014. Go read the whole post.
- Blackboard’s BbLearn and ANGEL continue to lose market share in US - Using the 2013 to 2014 tables (> 2000 enrollments), BbLearn has dropped from 848 to 817 institutions and ANGEL has dropped from 162 to 123. Using the revised methodology, Blackboard market share for > 800 enrollments now stands at 33.5% of institutions and 43.5% of total enrollments.
- Moodle, D2L, and Sakai have no changes in US - Using the 2013 to 2014 tables (> 2000 enrollments), D2L has added only 2 schools, Moodle none, and Sakai 2 schools.
- Canvas is the fastest-growing LMS and has overtaken D2L - Using the 2013 to 2014 tables (> 2000 enrollments), Canvas grew ~40% in one year (from 166 to 232 institutions). For the first time, Canvas appears to have larger US market share than D2L (13.7% to 12.2% of total enrollments using the table above).
- BbLearn is popular in the UK while Moodle is the largest provider in Canada and Australia - The non-US numbers are worth reviewing, even without the same amount of detail as we have for US numbers.
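The growth figures above reduce to simple ratios; for example, using the institution counts (> 2000 enrollments) from the 2013-to-2014 tables cited above:

```python
# Institution counts (> 2000 enrollments) from the 2013-to-2014 tables.
canvas_2013, canvas_2014 = 166, 232
bb_2013, bb_2014 = 848, 817

canvas_growth = (canvas_2014 - canvas_2013) / canvas_2013
bb_change = (bb_2014 - bb_2013) / bb_2013

print(f"Canvas:  {canvas_growth:+.0%}")  # roughly +40%
print(f"BbLearn: {bb_change:+.1%}")
```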
While this data is very useful, I will again point out that no one to my knowledge has independently verified the accuracy of the data at this site. I have done sanity checks against Campus Computing and ITC data, but I do not have access to the Edutechnica-specific mechanism for counting systems. In order to gain longer-term acceptance of these data sets, we will need a method to provide some level of verification.
In the meantime, enjoy the new market data.
Update: Allan Christie has a post up questioning the source data for Australia. I hope this information is used to improve the Edutechnica data set or at least leads to clarifications.
Put simply, it is generally accepted that there are 39 universities (38 public, 1 private) in Australia. Given the small number of universities and my knowledge of the sector I know that there are 20 (51%) universities which use Blackboard as their enterprise LMS, 16 (41%) use Moodle, and 3 (8%) use D2L. It is acknowledged that there are some departments within universities that use another LMS but according to Edutechnica’s methodology these were excluded from their analysis.
- Disclosure: Blackboard is a client of MindWires Consulting.
The post New LMS Market Data: Edutechnica provides one-year update appeared first on e-Literate.
D’Arcy Norman started a lively inter-blog conversation like we haven’t seen in the edublogosphere in quite a while with his post on the false binary between LMS and open. His main point is that, even if you think that the open web provides a better learning environment, an LMS provides a better-than-nothing learning environment for faculty who can’t or won’t go through the work of using open web tools, and in some cases may be perfectly adequate for the educational need at hand. The institution has an obligation to provide the least-common-denominator tool set in order to help raise the baseline, and the LMS is it. This provoked a number of responses, but I want to focus on Phil’s two responses, which talk at a conceptual level about building a bridge between the “walled garden” of the LMS and the open web (or, to draw on his analogy, keeping the garden but removing the walls that demarcate its border). There are some interesting implications from this line of reasoning that could be explored. What would be the most likely path for this interoperability to develop? What role would the LMS play when the change is complete? For that matter, what would the whole ecosystem look like?
Seemingly separately from this discussion, we have the new Unizin coalition. Every time that Phil or I write a post on the topic, the most common response we get is, “Uh…yeah, I still don’t get it. Tell me again what the point of Unizin is, please?” The truth is that the Unizin coalition is still holding its cards close to its vest. I suspect there are details of the deals being discussed in back rooms that are crucial to understanding why universities are potentially interested. That said, we do know a couple of broad, high-level ambitions that the Unizin leadership has discussed publicly. One of those is to advance the state of learning analytics. Colorado State University’s VP of Information Technology Pat Burns has frequently talked about “educational Moneyball” in the context of Unizin’s value proposition. And having spoken with a number of stakeholders at Unizin-curious schools, it is fair to say that there is a high level of frustration with the current state of play in commercial learning analytics offerings that is driving some of the interest. But the dots have not been connected for us. What is the most feasible path for advancing the state of learning analytics? And how could Unizin help in this regard?
It turns out that the walled garden questions and the learning analytics questions are related.

The Current State of Interoperability
Right now, our LMS gardens still have walls and very few doors, but they do have windows, thanks to the IMS LTI standard. You can do a few things with LTI, including the following:
- Send a student from the LMS to someplace elsewhere on the web with single sign-on
- Bring that “elsewhere” place inside the LMS experience by putting it in an iframe (again, with single sign-on)
- Send assessment results (if there are any) back from that “elsewhere” to the LMS gradebook.
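Concretely, in LTI 1.1 (the version in wide deployment at the time), the launch is an OAuth 1.0a-signed form POST from the LMS to the tool, and grade return is a small XML (“POX”) message the tool POSTs back. A simplified sketch of the two payloads, with hypothetical values and the OAuth signature and POX header omitted for brevity:

```python
# LTI 1.1 launch: the LMS POSTs form parameters like these (OAuth 1.0a
# signed) to the external tool. All values here are hypothetical.
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-101-blog-activity",
    "user_id": "student-42",
    "roles": "Learner",
    # Present only when the tool is allowed to send grades back:
    "lis_outcome_service_url": "https://lms.example.edu/grade-passback",
    "lis_result_sourcedid": "opaque-grade-handle-for-student-42",
}

def replace_result_xml(sourcedid: str, score: float) -> str:
    """Build a simplified LTI 1.1 'replaceResult' message that a tool
    POSTs to lis_outcome_service_url to write a score (0.0-1.0) into
    the LMS gradebook (POX header omitted)."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<imsx_POXEnvelopeRequest xmlns="http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0">
  <imsx_POXBody>
    <replaceResultRequest>
      <resultRecord>
        <sourcedGUID><sourcedId>{sourcedid}</sourcedId></sourcedGUID>
        <result><resultScore><textString>{score}</textString></resultScore></result>
      </resultRecord>
    </replaceResultRequest>
  </imsx_POXBody>
</imsx_POXEnvelopeRequest>"""

xml = replace_result_xml(launch_params["lis_result_sourcedid"], 0.85)
print("replaceResult" in xml)
```

Note what the launch does not carry: nothing about the tool’s own content or activity comes back except that single score, which is exactly the limitation discussed below.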
The first use case for LTI was to bring a third-party tool (like a web conferencing app or a subject-specific test engine) into the LMS, making it feel like a native tool. The second use case was to send students out to a tool that needed full control of the screen real estate (like an eBook reader or an immersive learning environment) but to make that process easier for students (through single sign-on) and teachers (through grade return). This is nice, as far as it goes, but it has some significant limitations. From a user experience perspective, it still privileges the LMS as “home base.” As D’Arcy points out, that’s fine for some uses and less fine for others. Further, when you go from the LMS to an LTI tool and back, there’s very little information shared between the tools. For example, you can use LTI to send a student from the LMS to a WordPress multiuser installation, have WordPress register that student and sign that student in, and even provision a new WordPress site for that student. But you can’t have it feed back information on all the student’s posts and comments into a dashboard that combines it with the student’s activity in the LMS and in other LTI tools. Nor can you use LTI to aggregate student posts from their respective WordPress blogs that are related to a specific topic. All of that would have to be coded separately (or, more likely, not done at all). This is less than ideal from both user experience and analytics perspectives.

Enter Uniz…Er…Caliper
There is an IMS standard in development called Caliper that is intended to address this problem (among many others). I have described some of the details of it elsewhere, but for our current purposes the main thing you need to know is that it is based on the same concepts (although not the same technical standards) as the semantic web. What is that? Here’s a high-level explanation from the Man Himself, Mr. Tim Berners-Lee:
The basic idea is that web sites “understand” each other. The LMS would “understand” that a blog provides posts and comments, both of which have authors and tags and categories, and some of which have parent/child relationships with others. Imagine if, during the LTI initial connection, the blog told the LMS about what it is and what it can provide. The LMS could then reply, “Great! I will send you some people who can be ‘authors’, and I will send you some assignments that can be ‘tags.’ Tell me about everything that goes on with my authors and tags.” This would allow instructors to combine blog data with LMS data in their LMS dashboard, start LMS discussion threads off of blog posts, and probably a bunch of other nifty things I haven’t thought of.
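Concretely, a Caliper-style event is a small JSON-LD document describing who did what to which resource. The shape below is my own illustrative approximation of the draft specification’s pattern; the exact vocabulary (e.g., the “BlogPost” type) is an assumption, not a normative example:

```python
import json
from datetime import datetime, timezone

# An illustrative, Caliper-style activity event: a student authoring a
# blog post that the LMS can then understand and aggregate. Field names
# follow the JSON-LD pattern of the draft spec; the exact vocabulary
# here is an assumption for demonstration.
event = {
    "@context": "http://purl.imsglobal.org/ctx/caliper/v1/Context",
    "@type": "Event",
    "actor": {"@id": "https://blog.example.edu/users/student-42",
              "@type": "Person"},
    "action": "Created",
    "object": {"@id": "https://blog.example.edu/posts/1138",
               "@type": "BlogPost",   # assumed profile type
               "tags": ["assignment-3"]},
    "eventTime": datetime.now(timezone.utc).isoformat(),
}

payload = json.dumps(event, indent=2)
print(payload.splitlines()[1])  # the @context line
```

Because both sides share the vocabulary declared in `@context`, a dashboard can aggregate these events from blogs, homework tools, and the LMS without point-to-point custom code.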
But that’s not the only way you could use Caliper. The thing about the semantic web is that it is not hub-and-spoke in design and does not have to have a “center.” It is truly federated. Perhaps the best analogy is to think of your mobile phone. Imagine if students had their own private learning data wallets, the same way that your phone has your contact information, location, and so on. Whenever a learning application—an LMS, a blog, a homework product, whatever—wanted to know something about you, you would get a warning telling you which information the app was asking to access and asking you to approve that access. (Goodbye, FERPA freakouts.) You could then work in those individual apps. You could authorize apps to share information with each other. And you would have your own personal notification center that would aggregate activity alerts from those apps. That notification center could become the primary interface for your learning activities across all the many apps you use.

The PLE prototypes that I have seen basically tried to do a basic subset of this capability set using mostly RSS and a lot of duct tape. Caliper would enable a richer, more flexible version of this with a lot less point-to-point hand coding required.

You could, for example, use any Caliper-enabled eBook reader that you choose on any device that you choose to do your course-related reading. You could choose to share your annotations with other people in the class and have their annotations appear in your reader. You could share information about what you’ve read and when you’ve read it (or not) with the instructor or with a FitBit-style analytics system that helps recommend better study habits. The LMS could remain primary, fade into the background, or go away entirely, based on the individual needs of the class and the students.
Caliper is being marketed as a learning analytics standard, but because it is based on the concepts underlying the semantic web, it is much more than that.

Can Unizin Help?
One of the claims that Unizin stakeholders make is that the coalition can accelerate the arrival of useful learning analytics. We have very few specifics to back up this claim so far, but there are occasionally revealing tidbits. For example, University of Wisconsin CIO Bruce Maas wrote, “…IMS Global is already working with some Unizin institutions on new standards.” I assume he is primarily referring to Caliper, since it is the only new learning analytics standard that I know of at the IMS. His characterization is misleading, since it suggests a peer-to-peer relationship between the Unizin institutions and IMS. That is not what is happening. Some Unizin institutions are working in IMS on Caliper, by which I mean that they are participating in the working group. I do not mean to slight or denigrate their contributions. I know some of these folks. They are good, smart people, and I have no doubt that they are good contributors. But the IMS is leading the standards development process, and the Unizin institutions are participating side-by-side with other institutions and with vendors in that process.
Can Unizin help accelerate the process? Yes they can, in the same ways that other participants in the working group can. They can contribute representatives to the working groups, and those representatives can suggest use cases. They can review documents. They can write documents. They can implement working prototypes or push their vendors to do so. The latter is probably the biggest thing that anyone can do to move a standard forward. Sitting around a table and thinking about the standard is good and useful, but it’s not a real standard until multiple parties implement it. It’s pretty common for vendors to tell their customers, “Oh yes, of course we will implement Caliper, just as soon as the specification is finalized,” while failing to mention that the specification cannot be finalized until there are implementers. What you end up with is a bunch of kids standing around the pool, each waiting for somebody else to jump in first. In other words, what you end up with is paralysis. If Unizin can accelerate the rate of implementation and testing of the proposed specification by either implementing themselves or pushing their vendor(s) to implement, then they can accelerate the development of real market solutions for learning analytics. And once those solutions exist, then Unizin institutions (along with everyone else) can use them and try to discover how to use all that data to actually improve learning. These are not unique and earth-shaking contributions that only Unizin could make, but they are real and important ones. I hope that they make them.
Within higher education, we tend to talk about LMS solutions based on an institutional perspective – which systems can serve as the official LMS for an entire institution. While this view is important and forms the basis for my LMS graphics, the emergence of new educational delivery models has led to the development of some interesting program-specific LMS models. One example that I have already written about is 2U’s platform (built on top of Moodle and Adobe Connect) for their specific Online Service Provider (OSP) business.
One educational model that is becoming more and more important is competency-based education (CBE). One of the challenges for this model is that the traditional LMS – based on a traditional model using grades, seat time and synchronous cohort of students – is not easily adapted to serve CBE needs. As described in this CBE primer:
OBE [Outcome-based education] can be implemented in various modalities, including face-to-face, online and hybrid models.
Competency-based education (CBE) is a narrower concept, a subset or instance of OBE, where the outcomes are more closely tied to job skills or employment needs, and the methods are typically self-paced. Again based on the Malan article, the six critical components of CBE are as follows:
- Explicit learning outcomes with respect to the required skills and concomitant proficiency (standards for assessment)
- A flexible time frame to master these skills
- A variety of instructional activities to facilitate learning
- Criterion-referenced testing of the required outcomes
- Certification based on demonstrated learning outcomes
- Adaptable programs to ensure optimum learner guidance
In 2008, Altius Education, started by Paul Freedman, worked with Tiffin University to create a new entity called Ivy Bridge College. The goal of Ivy Bridge was to help students earn associate degrees and then transfer to a four-year program. Altius developed the Helix LMS specifically for this mission. All was fine until the regional accrediting agency shut down Ivy Bridge with only three months’ notice.
The end result was that Altius sold the LMS and much of the engineering team to Datamark in 2013. Datamark is an educational services firm with a focus on leveraging data. With the acquisition of the Helix technology, Datamark could expand into the teaching and learning process, leading them to rebrand as Helix Education – a sign of the centrality of the LMS to the company’s strategy. Think of Helix Education now as an OSP (a la carte services that don’t require tuition revenue sharing) with an emphasis on CBE programs.
I asked the Helix team to give me a demo of their tool, with permission to do screen grabs, to get a better sense of the system design and inter-relationship with CBE. The goal is to understand some of the nuances brought about by a CBE design focus, based on 3 of the 6 components from SPT Malan. I am not trying to judge better or worse in terms of LMS design but rather to get a better visualization of the implications of CBE.
First – the main page:
The learning outcomes are embedded throughout the course, including the ability to allow a pretest and determine the starting set of competencies.
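The pretest-driven starting point is easy to picture as bookkeeping; the sketch below is my own illustration of the idea (the competency names are invented), not Helix’s actual data model:

```python
# Invented competencies for illustration; not Helix's actual data model.
competencies = {
    "C1": "Write a thesis statement",
    "C2": "Cite sources in APA style",
    "C3": "Structure a persuasive argument",
}

def starting_plan(pretest_mastered: set[str]) -> list[str]:
    """Competencies a student still needs after the pretest."""
    return [cid for cid in competencies if cid not in pretest_mastered]

# A student who tested out of C1 starts the course at C2.
print(starting_plan({"C1"}))  # ['C2', 'C3']
```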
From a faculty and / or course designer standpoint, there is also a view to determine the coverage of competencies within the course.
The system gives estimates of the time required for most tasks and allows the student to choose their own course plan. The “schedule” then adjusts based on the course plan.
Because the course is essentially self-paced, Helix adds some features to allow peer assessment and discussions, often in an asynchronous manner (based on current and past discussions of students at the same point in the course, but allowing real-time faculty interaction).
Students can ‘like’ and choose to ‘follow’ different discussions. I asked if students can ‘follow’ other students, and the answer was no (though it may be on the roadmap).
This one was interesting, as Helix allows students to select different levels of guidance – hints, examples, step-by-step, etc.
- I am doing some research to see if I can get other examples from CBE programs, but newer educational models, including CBE, seem increasingly likely to develop program-specific LMS solutions.
- While the Helix team stated that the assessments (formative and summative) can be set up with essays or other non-multiple-choice formats, the overall view of this platform gives a good view of the fairly prescriptive, objectivist nature of CBE.
- Helix stated several times during the demo that the LMS could be applied to non-CBE programs, but my focus was on the CBE nature of the system.
- Helix is building up its management team, including several key hires with a background at eCollege (acquired by Pearson in 2007), Western Governors University and DeVry University.
- Finally, this feature is not CBE-based, but I do like the ability for students to provide feedback on lessons within the course itself.
- More on that sad tale in this post as well as this Chronicle article.
- See Tony Bates article for more on this topic.
The post Helix: View of an LMS designed for competency-based education appeared first on e-Literate.
In yesterday’s post I described where I (and many others) see the LMS market heading in terms of interoperability.
At the same time, the LMS does a very poor job of providing many of the learning technologies desired by faculty and students. There is no way that a monolithic LMS can keep up with the market – it cannot match the functionality of open internet tools, at least not without succumbing to feature bloat.
I would add that part of the cause of the “false binary position” that D’Arcy points out is that much of the public commentary focuses on where the LMS has been rather than where it is going. There is a significant movement based on interoperability that is leading, perhaps painfully and slowly, to a world where the LMS can coexist with open educational tools, with even end users (faculty and students) eventually having the ability to select their tools that can share rosters and data with the institutional LMS.
Coexistence and interoperability, however, should not imply merely having links from the LMS to external tools, as is too often the case.

The Walled Garden
The LMS (which George Station rightly points out was really called the Course Management System in the early years) started out as a walled garden with basic functionality of syllabus sharing, announcements, gradebook, email, and a few other tools.
Over time, as Jared Stein points out in his blog post:
Flash forward to 2005(ish), when “Web 2.0” was on many educators’ minds as a new wave of services that made it easier for anyone to express themselves to anyone who was interested in participating. New web services and social media made the legacy LMS look like what it was: A slow-moving cruise ship that locked passengers in their cabins. It didn’t care about user experience. It didn’t care about integrating with social media. It didn’t care about encouraging novel practices or experimentation. But those were really just symptoms; the sickness was that the LMS vendors didn’t care about what was happening in our culture and in our communities as connectivity and multimedia exploded through the open web.
The LMS vendors did not simply ignore these new services, however; they tried to have their cake and eat it too by creating poor imitations of the external tools and stuffing them inside the LMS.
As Web 2.0 tools proliferated, this approach of maintaining the walled garden was one of the primary causes of feature bloat and poorly-designed learning tools within the LMS.
This situation – a walled-garden LMS with feature bloat and inelegant tools while external tools multiplied – represents the bad side of the ed tech market as it has existed. Despite the weakness of this design approach, the vendors themselves were not the only ones at fault. As Mike Caulfield points out in his description of the “elegant and extensible” Prometheus:
A number of years later I asked a person I knew who worked at Prometheus why Prometheus failed. Did Blackboard crush them?
His answer was interesting. No, it wasn’t Blackboard at all. It was the educational institutions. With the slow, resource-intensive and state-mandated RFP processes, the interminable faculty committees, and the way that even after the deal was signed the institution would delay payment and implementation as long as possible (or suddenly throw it into an unanticipated ‘final review’), it was just not possible to grow a stable business. The process institutions followed was supposed to ensure equitable access to contracts, but what it did was make it impossible for any company not sitting on a pile of cash to stay in business. (I’m extrapolating a bit here, but not much).
I would add that the RFP process also encourages a feature checklist mentality, elevating the importance of being able to say “we have that feature” and minimizing the ability to say “this design doesn’t suck”.
Many institutions have reacted slowly to the proliferation of tools and officially support only the enterprise LMS – often due to FERPA / student privacy concerns but also due to perceived inability of central units to provide support to faculty and students on multiple tools.
But this is a choice, even in the current market with limited interoperability. Other institutions support not only the official enterprise LMS but also multiple learning tools. While institutions have a responsibility to provide baseline LMS services for faculty, there is a strong argument that they also have a responsibility to support the innovators and early adopters who want to experiment with different learning tools, whether or not those tools integrate with the LMS within a course.

Moving Beyond the Wall
But can the market progress such that the enterprise LMS can coexist with open tools even at the course level? The answer in my mind is yes, and the work to move in this direction has been in progress for years. Thanks to the LTI specification, and in the future the Caliper interoperability framework, the vision that George Kroner describes is getting closer and closer.
But the LMSs today won’t be the LMSs of tomorrow. Rather than being a “dumping ground” for content, maybe one possible future for LMSs is as Learning Management Scaffolding – metaphorically supporting learning no matter its shape or form – with content being viewed and activities taking place inside and outside of the LMS. Maybe content will be seamlessly navigable around the LMS and the web – and perhaps in other types of systems like LCMSs – Learning Content Management Systems. Maybe learning tools of all types and sizes – but external to the LMS – will support every long-tail instructional desire imaginable while assessment results feed back into the LMS gradebook. Maybe the LMS will be the storage mechanism for learning analytics as well, but it is more likely that it will become only one source of data feeding into another system better-suited for the task. But try as I might, I fail to imagine a future without some centrally-managed, instructor-accessible system that stores rosters and grades, enforces privacy and security policies, and provides some form of starting-off point for students.
In this developing future market, coexistence of the LMS and Open will include not just links or grudging institutional support; it will also include sharing of rosters, data, and context. Open tools will start with the class roster in place, data on user activity will be shared between apps, and external apps will run in the context of the course design and recent class activities.
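To make that roster-and-context sharing concrete, below is a rough sketch of what an LTI 1.1 launch looks like under the hood: the LMS POSTs a set of form parameters carrying the user, role, and course context to the external tool, signed with OAuth 1.0 (HMAC-SHA1) so the tool can trust them. The tool URL, keys, and IDs are hypothetical, and the signing is simplified relative to the full specification; this is an illustration of the mechanism, not a production implementation.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse
import uuid

def sign_lti_launch(url, params, consumer_key, shared_secret):
    """Build a simplified OAuth 1.0-signed LTI 1.1 launch payload.

    The LMS POSTs these form parameters to the external tool, which
    verifies the HMAC-SHA1 signature using the same shared secret.
    """
    oauth_params = {
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth_params}
    # Signature base string: HTTP method, encoded URL, and the
    # sorted, percent-encoded parameter string (per OAuth 1.0).
    encoded = urllib.parse.urlencode(
        sorted(all_params.items()), quote_via=urllib.parse.quote
    )
    base_string = "&".join([
        "POST",
        urllib.parse.quote(url, safe=""),
        urllib.parse.quote(encoded, safe=""),
    ])
    # LTI has no token secret, so the signing key ends with "&".
    key = urllib.parse.quote(shared_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params

# Hypothetical course context; the parameter names come from the LTI 1.1 spec.
launch = sign_lti_launch(
    "https://tool.example.com/launch",  # hypothetical tool endpoint
    {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": "course-101-widget-1",
        "context_id": "course-101",   # which course the launch came from
        "user_id": "student-42",
        "roles": "Learner",
        # Hypothetical grade-passback endpoint back into the LMS gradebook:
        "lis_outcome_service_url": "https://lms.example.edu/grades",
    },
    consumer_key="demo-key",
    shared_secret="demo-secret",
)
print(launch["oauth_signature_method"])  # → HMAC-SHA1
```

The point of the sketch is that the tool never queries the LMS database directly: the roster and course context travel with the signed launch, which is what lets an external tool “start with the class roster in place.”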
There will be painful implementations – caused both by LMS vendors and by institutions – that will prevent a smooth transition to this breakdown of the walled garden, but it will become increasingly difficult for LMS solutions to survive over time if they don’t adapt. There will also be market niches (e.g. specific online programs) that will retain the walled garden LMS approach, but in general the markets should change.
I personally see the realistic future as having more of a choice of tools rather than a minimal LMS. LMS vendors will continue to have reasons to develop (or acquire) their own internal tools, and there will even be cases where the tight integration and focused development will lead to better tools in the LMS than outside. The key change will be the ability for integration decisions – which tools to use in specific classes or in specific institutions – to be made closer to the faculty and student end users. From LMS vendor to central IT to academic program to even individual faculty – moving closer to those who know the specific needs of the class. Central IT and the institution will remain important in setting policies and permissions to protect student privacy and provide guidance to faculty and course designers who are more conservative in their ed tech usage. But either way (minimal LMS or swappable tool LMS), I think the long-term trend is moving in this direction of LMS and Open tool coexistence.
Update 9/19: Updated graphics to add LMS label, CC license and logo to facilitate sharing outside of blog.
D’Arcy Norman has an excellent blog post up titled “On the false binary of LMS vs. Open” that captures a false framing issue.
We’re pushed into a false binary position – either you’re on the side of the evil LMS, working to destroy all that is beautiful and good, or you’re on the side of openness, love, and awesomeness. Choose. There is no possible way to teach (or learn) effectively in an LMS! It is EVIL and must be rooted out before it sinks its rotting tendrils into the unsuspecting students who are completely and utterly defenseless against its unnatural power!
While D’Arcy is a proponent of open tools, he rightly calls out the need to understand institutional responsibilities.
But. We can’t just abdicate the responsibility of the institution to provide the facilities that are needed to support the activities of the instructors and students. That doesn’t mean just “hey – there’s the internet. go to it.” It means providing ways for students to register in courses. For their enrolment to be automatically processed to provision access to resources (physical classrooms, online environments, libraries, etc…). For students’ grades and records to be automatically pushed back into the Registrar’s database so they can get credit for completing the course. For integration with library systems, to grant acccess to online reserve reading materials and other resources needed as part of the course.
This is a key point: the institutional LMS will not, and should not, go away anytime soon. I have pointed out recently that the LMS is one of the very few technologies now used in a majority of courses within an institution, and the institutional responsibility described above helps to explain why.
In our consulting work Michael and I often help survey institutions to discover what technologies are being used within courses, and typically the only technologies that are used by a majority of faculty members or in a majority of courses are the following:
- AV presentation in the classroom;
- PowerPoint usage in the classroom (obviously connected with the projectors);
- Learning Management Systems (LMS);
- Digital content at lower level than a full textbook (through open Internet, library, publishers, other faculty, or OER); and
- File sharing applications.
At the same time, the LMS does a very poor job of providing many of the learning technologies desired by faculty and students. There is no way that a monolithic LMS can keep up with the market – it cannot match the functionality of open internet tools, especially without adding feature bloat.
I would add that part of the cause of the “false binary position” that D’Arcy points out is that much of the public commentary focuses on where the LMS has been rather than where it is going. There is a significant movement based on interoperability that is leading, perhaps painfully and slowly, to a world where the LMS can coexist with open educational tools, with even end users (faculty and students) eventually having the ability to select their tools that can share rosters and data with the institutional LMS.
Below is a modified presentation I gave at the Apereo Mexico conference in the spring (with a few changes to explain slides without audio). The key point is that there are subtle changes to the LMS market that are significant, and the coexistence of the LMS with open tools will be central to the market’s future.
Will all LMS vendors move this direction? In marketing, yes, but in reality, no. There are different approaches to this coexistence issue from the LMS vendors, ranging from lip service to outright support, and several points in between. But the overall trend is clearly moving in this direction, even if some solutions lose out over time.
The post LMS and Open: The false binary is based on past, not future markets appeared first on e-Literate.
Back around New Year, Michael wrote a post examining Pearson’s efficacy initiative and calling on the company to engage in active discussions with various communities within higher education about defining “efficacy” with educators rather than for educators. It turns out that post got a fair bit of attention within the company. It was circulated in a company-wide email from CEO John Fallon, and the blog post and all the comments were required reading for portions of the company leadership. After a series of discussions with the company, we, through our consulting company, have been hired by Pearson to facilitate a few of these conversations. We also asked for and received permission to blog about them. Since this is an exception to our rule that we don’t blog about our paid engagements, we want to tell you a little more about the engagement, our rationale for blogging about it, and the ground rules.
The project itself is fairly straightforward. We’re facilitating conversations with a few different groups of educators in different contexts. The focus of each conversation is how they define and measure educational effectiveness in their respective contexts. There will be some discussion of Pearson’s efficacy efforts at a high level, but mainly for the purpose of trying to map what the educators are telling us about their practices to how Pearson is thinking about efficacy in the current iteration of their approach. After doing a few of these, we’ll bring together the participants along with other educators in a culminating event. At this meeting, the participants will hear a summary of the lessons learned from the earlier conversations, learn a bit more about Pearson’s efficacy work, and then break up into mixed discussion groups to provide more feedback on how to move the efficacy conversation forward and how Pearson’s own efforts can be improved to make them maximally useful to educators.
Since both e-Literate readers and Pearson seemed to get a lot of value from our original post on the topic, we believe there would be value in sharing some of the ongoing conversation here as well. So we asked for and received permission from Pearson to blog about it. Here are the ground rules:
- We are not getting paid to blog and are under no obligation to blog.
- Our blog posts do not require prior editorial review by Pearson.
- Discussions with Pearson during the engagement are considered fair game for blogging unless they are explicitly flagged as otherwise.
- On the other hand, we will ask Pearson customers for approval prior to writing about their own campus initiatives (and, in fact, will extend that courtesy to all academic participants).
The main focus of these posts, like the engagement itself, is likely to be on how the notion of efficacy resonates (or doesn’t) with various academic communities in various contexts. Defining and measuring the effectiveness of educational experiences—when measurement is possible and sensible—is a subject with much broader applications than Pearson’s product development, which is why we are making an exception to our blogging recusal policy for our consulting engagements and why we appreciate Pearson giving us a free hand to write about what we learn.
In case anyone needed additional information to counter the Brookings-fed meme that “Americans who borrowed to finance their education are no worse off today than they were a generation ago”, the U.S. Government Accountability Office (GAO) released a report yesterday with some significant findings. As reported at Inside Higher Ed by Michael Stratford:
More than 700,000 households headed by Americans 65 or older now carry student debt, according to a report released Wednesday by the U.S. Government Accountability Office. And the amount of debt owed by borrowers 65 and older jumped from $2.8 billion in 2005 to $18.2 billion last year. [snip]
Between 2004 and 2010, for instance, the number of households headed by individuals 65 to 74 with student loan debt more than quadrupled, going from 1 percent to 4 percent of all such families. During that same period, the rate of borrowing among Americans under 44 years old increased between 40 and 80 percent, even though borrowing among that age group is far more prevalent than it is among senior citizens.
I have been highly critical of the Brookings Institution and its report and update. This new information from the GAO goes outside the selective Brookings data set of households headed by people aged 20 – 40, but it should be considered by anyone trying to draw conclusions about student debt holders.
Noting that Brookings analysis is based on “Americans who borrowed to finance their education” and the GAO report is on student debt holders, it is worth asking if we’re looking at a similar definition. For the most part, yes, as explained at IHE:
While some of the debt reflects loans taken out by parents on behalf of their children, the vast majority — roughly 70 to 80 percent of the outstanding debt — is attributable to the borrowers’ own education. Parent PLUS loans accounted for only about 27 percent of the student debt held by borrowers 50 to 64 years old, and an even smaller share for borrowers over 65.
Go read at least the entire IHE article, if not the entire GAO report.
Student debt is a growing problem in the US, and the Brookings Institution conclusions are misleading at best.
These data indicate that typical borrowers are no worse off now than they were a generation ago …
Their data is based on the triennial Survey of Consumer Finances (SCF) by the Federal Reserve Board, with the original report drawing on 2010 data. With the release of the 2013 SCF data, the Brookings Institution put out an update to its report this week, and it continues with the lede:
The 2013 data confirm that Americans who borrowed to finance their educations are no worse off today than they were a generation ago. Given the rising returns to postsecondary education, they are probably better off, on average. But just because higher education is still a good investment for most students does not mean that high and rising college costs should be left unquestioned.
This conclusion is drawn despite the following observations of changes from 2010 – 2013 in their own update:
- The share of young (age 20 – 40) households with student debt rose from 36% to 38%;
- The average amount of debt per household rose 14%;
- The distribution of debt holders rose by 50% for debt levels of $20k – $75k and dropped by 19% for debt levels of $1k – $10k; and
- Wage income is stagnant, at roughly the same level as in ~1999, yet debt amounts have risen by ~50% over that same period (see below).
Brookings’ conclusion from this chart?
The upshot of the 2013 data is that households with education debt today are still no worse off than their counterparts were more than 20 years ago. Even though rising debt continued to cut into stagnant incomes, the average household with debt is better off than it used to be.
The strongest argument that Brookings presents is that the median monthly payment-to-income ratios have stayed fairly consistent at ~4% over the past 20 years. What they fail to mention is that households are taking much longer to pay off student loans now.
More importantly, the Brookings analysis ignores the simple and direct measurement of loan delinquency. See this footnote from the original report [emphasis added]:
These statistics are based on households that had education debt, annual wage income of at least $1,000, and that were making positive monthly payments on student loans. Between 24 and 36 percent of borrowers with wage income of at least $1,000 were not making positive monthly payments, likely due to use of deferment and forbearance …
That’s what I call selective data analysis. In the same SCF report that Brookings used for its update:
The delinquency rate for student loans has gone up ~50% from 2010 to 2013!
How can anyone claim that Americans with student debt are no worse off when:
- More people have student debt;
- The average amount of debt has risen;
- Wage income has not risen; and
- The delinquency rate for student loans has risen.
None of the secondary spreadsheet jockeying from Brookings counters these basic facts. This ongoing analysis by Brookings on student debt is a farce.
The post Brookings Institution analysis on student debt becoming a farce appeared first on e-Literate.
So please, if you’re thinking about writing that post or article, don’t.
I’m begging you.
The post What the Apple Watch Tells Us About the Future of Ed Tech appeared first on e-Literate.
In a post last month I questioned the growth claims that D2L was pushing to the media based on their recent massive funding round. A key part of the article was pointing out the lack of real reporting from news media.
It is worth noting that not a single media outlet listed by EDUKWEST or quoted above (WSJ, Reuters, Bloomberg, re/code, edSurge, TheStar) challenged or even questioned D2L’s bold claims. It would help if more media outlets didn’t view their job as paraphrasing press releases.
I should give credit where it’s due: Education reporter Molly Hensley-Clancy at Buzzfeed has done some solid reporting with her article out today.
In response to detailed questions from BuzzFeed News about figures to back up its claims of record growth in higher education and internationally, the company released a statement to BuzzFeed News, saying “As a private company, D2L does not publicly disclose these details. The past year has been one of record growth for D2L, culminating in the recent $85 million round of financing.” A representative declined to make the company’s CEO, or any other executive, available for an interview related to the company’s growth.
The stonewalling didn’t come as a surprise to former employees with whom BuzzFeed News spoke.
“The picture they’re painting of growth is not accurate,” said one former employee, who left the company within the last year and asked to remain anonymous, citing his confidentiality agreement with the company. “If you look at actual metrics, they tell a different story. They’re very likely not seeing growth in higher education.”
Molly’s article included discussions with three former D2L employees, an interview with CSU Channel Islands CIO Michael Berman, and a D2L official response (in a manner of speaking). Who would have thought that Buzzfeed would be the source of valuable reporting that challenges the all-too-easy headlines provided through press releases?
Me, for one. If you follow the Buzzfeed education articles, you’ll notice a pattern of this type of reporting – mostly focused on the business of education. Consider the following articles:
- The Education Department’s Problematic Billion-Dollar Partnership With Debt Collection Agencies
- Students At For-Profit College Say They Weren’t Told Their School Was Being Sold
- Documents Show What The Department Of Education Knew About Corinthian College’s Financial Situation
In each case, Molly challenges public perceptions, digs up unique information through interviews and document research, and describes the findings in a hard-hitting but balanced article. Buzzfeed is becoming an important source for education news and a welcome addition.
The post Say What? Buzzfeed follows up on D2L story with solid reporting appeared first on e-Literate.
In a recent post on Kuali, I characterized Unizin as a community source initiative. Brad Wheeler, CIO at Indiana University and co-founder of Kuali and Unizin, responded via email (with permission to quote):
Unizin is not a Community Source effort in the way that I understand Community Source as we started applying the label 10+ years ago. Unizin is better understood, as you have reported, as a cloud-scale service operator somewhat like I2. It does not plan to do lots of software development other than as needed for integrations. No biggie, just a nuanced observation from the end of the story.
Brad is correct, and I note that Unizin documentation has been fairly consistent about the lack of plans for software development, as seen in the Unizin FAQs:
Is Unizin another open- or community-source project like Sakai or Kuali?
No – those endeavors focus on building software as a community, for use by individual institutions. Unizin strives to foster a community more concerned with creating and sharing content and improving outcomes.
I have already revised the Kuali post to add a clarification on this point. I asked Brad whether this means that Unizin is ruling out software development. His reply:
Unizin is working on its roadmap for each area. If we do need to head down some development approach that is more than integration, we’ll give thought to the full range of options for best achieving that, but there is no plan to begin an open/community source effort at this time.
All public indications are that Unizin plans to source existing technologies (as they have done with Canvas as the LMS) for content repository and learning analytics functionality, focusing any software development on integrations.

Potential New Consortium Members
Based on two articles about the University of Iowa’s (D2L) exploration of Unizin, the University of Illinois (Blackboard), University of Wisconsin (D2L), University of Minnesota (Moodle), and Ohio State University (D2L) are listed as schools that are “close to joining” Unizin, according to the Iowa CIO and an internal Illinois email.
The original four consortium members are Indiana University, University of Michigan, University of Florida and Colorado State University. Unizin materials originally indicated the following set of potential partners:
- Colorado State University (Blackboard)
- Indiana University (Sakai)
- Oregon State University (Blackboard)
- Purdue University (Blackboard)
- University of Florida (Sakai, with some use of Canvas)
- University of Maryland (Canvas)
- University of Michigan (Sakai)
- University of Texas (Blackboard, with at least one campus in the process of moving to Canvas)
- University of Wisconsin (Desire2Learn)
- University of Utah (Canvas)
Of the new schools mentioned, only Wisconsin appears on the earlier list.
From the first article:
Lon Moeller, the associate provost for undergraduate education, will present the idea to the council. He said he is not yet sure whether the UI will commit to Unizin.
“We’re talking to faculty and administrators across campus to see if there’s interest in Unizin as a program,” he said. “We’re evaluating Unizin as a pilot project.”
From the second article:
“We are discussing doing a pilot project for next spring, but we need to decide if the project is worth our time,” Fleagle [CIO] said.
The articles make it sound like UI officials are pushing for the campus to join Unizin:
If the UI decided to join Unizin, it could have input into the new software as well as help to develop it.
“There is no software yet. The four partners [Colorado State, Indiana, Florida, and Michigan] are working on it,” Fleagle said. “They have groups working on it now. If we wanted to do it, we need to get in so that Iowa’s needs can get met … The disadvantage of waiting is that you don’t get access to those discussions right away. I think we have as good of people at Iowa as anyone else involved influencing it.”
I talked to Maggie Jesse, Senior IT Director for the Office of Teaching, Learning, & Technology, to get a better understanding of their decision process. The picture that emerges is a campus that has been invited to join Unizin, is potentially willing to do a pilot program, but is not in marketing mode.
To set the stage, Iowa recently performed an assessment to get campus feedback on ICON, its branded version of the LMS. The message coming back from the assessment was that faculty were not motivated to make a major change – they were not unhappy and did not need the burden of an LMS change. Furthermore, Iowa just completed an upgrade to ICON, changing the D2L version and reconfiguring the campus hosting and support structure.
In the article quote above, a faculty member had asked whether Iowa should just wait for others to join Unizin and let Iowa decide later. Steve Fleagle answered that the argument for deciding early was to influence Unizin’s direction. The pilot option allows Iowa to work within its usual culture of campus decision making – piloting a system and then taking the outcomes to the campus for a decision. Iowa has developed a communications plan to discuss a possible Unizin pilot with over 20 different campus groups to determine if there is interest in Unizin, and yesterday’s meeting was just one of these.

Illinois Confusion
The first article quoted sections of an email at the University of Illinois that reflected a poor understanding of Unizin [emphasis added].
Charles Tucker, the vice provost for undergraduate education and innovation at the University of Illinois, wrote in an email that the benefits of many universities joining Unizin include being able to band together to “influence the companies that provide software for teaching and learning.” [snip]
Tucker said the system can streamline different technologies to make things simpler for professors and students.
“The Unizin approach is to promote a standards-based approach where the same digital course materials can work in any learning-management system,” he said. “Faculty spend less time making the technology work, and students get a better course.”
This statement is at odds with the description at Inside Higher Ed from July:
The digital learning consortium, announced Wednesday morning, aims to simplify how universities share learning analytics, content and software platforms. But in order to do so, Unizin needs its members to use the same infrastructure. A common learning management system is the first part of that package.
“You don’t really have common infrastructure if you’re saying everything is heterogeneous,” said Brad Wheeler, the Unizin co-founder who serves as vice president for IT and chief information officer at Indiana University. “A lot of these different learning tools — Sakai, Blackboard, Canvas — they all do a bunch of really good stuff. But five universities picking five different ones — what’s the end value in that if they want to do something together?” [snip]
“The presumption is if I join Unizin, I’m going to be using Canvas sooner or later,” said Joel Dehlin, Instructure’s chief technology officer [ed. now CEO of Kuali's for-profit arm]. Joining Unizin without using Canvas, he added, is “like joining a country club and paying for the country club and not using the golf course.”
Unizin’s goals include promoting control over digital course materials and even enabling varied reuse of content. This goal is not based on LMS neutrality driven by standards; it is based on having a common infrastructure that would sit on top of, and benefit from, standards. Note that these are goals and not yet realities. However, Illinois might benefit from making its decision based on a more accurate understanding of Unizin.
Update: See the comment from Rob Abel, CEO of the IMS Global Learning Consortium, below regarding standards.
- Disclosure: I was directly involved as a consultant in the original Iowa decision to adopt D2L. See this link for details.
The post Unizin Updates: Clarification on software development and potential new members appeared first on e-Literate.
Last month I shared the system announcement that the Cal State Online (CSO) initiative is finished. Despite the phrasing of “re-visioning” and the retention of the name, the concept of a standalone unit to deliver and market online programs for the system is gone. Based on documents obtained by e-Literate through a public records request:
- The original concept of “a standardized, centralized, comprehensive business, marketing and outreach support structure for all aspects of online program delivery for the Cal State University System” was defined in summer 2011, formally launched in Spring 2013, and ultimately abandoned in Fall 2013;
- CSO was only able to enroll 130 full-time equivalent students (FTES) in CY2013 despite starting from pre-existing campus-based online programs and despite minimum thresholds of 1,670 FTES in the Pearson contract;
- CSO was able to sign up only five undergraduate degree-completion programs and two master’s programs offered at four of the 23 Cal State campuses;
- Faculty groups overtly supported investments in online education but did not feel included in the key decision processes;
- Pearson’s contract as a full-service Online Service Provider was in place for less than one year before contract renegotiations began, ultimately leading to LMS services only; and
- The ultimate trigger to abandon the original model was the $10 million state funding for online education to address bottleneck courses.
That last one might seem counter-intuitive without the understanding that CSO did not even attempt to support matriculated Cal State students in state-funded programs.
Terminology note: CSO measured course enrollments as “one student registered in one online course”, such that one student taking two courses would equal two course enrollments, etc. Internally CSO calculated 10 course enrollments = 1 FTES.
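As a worked example of that conversion (the figures reuse the numbers cited above; the function and constant names are mine):

```python
# Cal State Online's internal conversion: 10 course enrollments
# (one student registered in one online course) = 1 FTES.
ENROLLMENTS_PER_FTES = 10

def ftes(course_enrollments: int) -> float:
    """Convert course-enrollment counts to full-time equivalent students."""
    return course_enrollments / ENROLLMENTS_PER_FTES

# CSO's CY2013 result of 130 FTES therefore corresponds to roughly
# 1,300 course enrollments, against a Pearson contract minimum of
# 1,670 FTES (about 16,700 course enrollments).
print(ftes(1300))  # → 130.0
```

In other words, CSO delivered well under a tenth of the enrollment volume the Pearson contract thresholds assumed.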
Below is a narrative of the key milestones and decisions as described by the public documents. I’ll share more of my thoughts in a future post.

2011
Based on foundational work done in 2010 by the Technology Steering Committee (TSC), a group of nine campus presidents along with six Chancellor’s Office staff, a contract was awarded to a consultant (Richard Katz and Associates) to produce five reports on online learning (link will download zip file) and Cal State University’s work to date. The TSC then produced an overview document for what would become CSO in June 2011, including 10 guiding principles and the first schedule estimate. An October 2011 update document further clarified the plans. Some key decisions made in 2011 included forming a separate 501(c)3 organization owned by Cal State University and funding the creation of CSO through a contribution of $50,000 from each of the 23 CSU campuses.
Two key decisions from this period are worth highlighting, as they explain much of the trajectory of CSO in retrospect. The first one defined the need for an Online Service Provider (ultimately chosen as Pearson).
A business partner for CSU Online might be needed in order to provide the necessary student support services, including, for example, advising, financial aid, career services, and tutoring. In addition, a business partner could provide the 24/7/365 help desk support absolutely critical for CSU Online. Market research and marketing of programs are other potential areas for the contributions of a business partner. Instructional design support for faculty is another potential area, as is technological support for the effort.
The second decision defined a strategy in terms of which types of online programs to add in which order.
Following from the bedrock of our Principles, the TSC supported a tactical entrance into CSU Online by focusing on those areas in which CSU campuses are already strong and proficient. We believe that it is imperative to start from a position of program strength rather than to straggle into the market in areas as yet not fully defined or ready for implementation. Accordingly, the TSC recommends that CSU Online address six areas, with two ready for immediate roll out.
- The 60 or so Masters level programs that exist throughout the CSU should comprise our initial effort with an eye toward serving the extensive mid-career professional and unemployed adults who are in need of this level of education to advance their careers.
- Our second focus should entail the presentation of two or three degree completion programs in an effort to enhance workforce development.
An important note on both of these areas is that they are self-support programs, offered through continuing or extended education groups and not eligible for state funding. Self-support programs do not face the same constraints on setting tuition and tend to set it significantly higher than state-supported mainline programs.
The overview also estimated the timeline to include an RFP for commercial partner (OSP) to be released in Fall 2011.
By late 2011 there were already signs of faculty discontent over the limited inclusion of faculty in CSO decision-making and over the planned use of a commercial partner. The Cal State Dominguez Hills faculty senate resolved in November:
Growing faculty concerns about the minimal faculty input in the development of the Online Initiative, as well as the direction the Initiative may be taking have led three Academic Senates (CSUSB, CSU Stanislaus, and Sonoma State) to pass resolutions calling for the suspension of the Initiative until basic issues are addressed and approved by campus senates. In addition a “CSU Online Faculty Task Force,” consisting of over 80 faculty across the CSU, has been actively questioning features of the Initiative and has written an open letter to Chancellor Reed expressing opposition to outsourcing to for‐profit online providers or attempts to circumvent collective bargaining.
The task force open letter can be found here.

2012
The RFP was actually released in April 2012. To my reading, the document was disorganized and lacked the structure to let bidders know what to expect or what was needed. On schedule and enrollments, the RFP advised the following:
1.5 Cal State Online expects to officially launch in January 2013, with as many as ten degree programs. For the late fall 2012 term (beginning in late October 2012) Cal State Online anticipates offering two to three courses in several programs in a live beta test term.
1.6 ENROLLMENT PROJECTIONS Vendors should base proposals on 1,000 three unit course enrollments in year one and 3,000 three unit course enrollments in year two.
The RFP evaluation process was described in the first CSO Advisory Board meeting notes from June 2012, showing the final decision to be between Pearson and Academic Partnerships. Pearson was selected as the partner, and its contract contained an unexplained change in enrollment expectations.
The spending amounts detailed below (which may also be increased as appropriate, in Pearson’s discretion) are dependent on Cal State Online meeting the defined Enrollment thresholds for the prior calendar year. If Cal State Online does not meet such thresholds, the spending amounts for the then-current calendar year will be adjusted to reflect the actual number of enrollments achieved during the previous calendar year.
I do not know how the numbers went from an estimate of 1,000 course enrollments for 2013 in the RFP to a minimum of 16,701 course enrollments for 2013 in the contract. In retrospect, this huge increase looks like wishful thinking, perhaps with the goal of making the financial case work for both CSO and Pearson.
The Advisory Board also decided in the June 2012 meeting to set standardized tuition for CSO at $500 per unit (compared with approximately $270 per unit for a traditional campus student taking 12 units per semester).
By October CSO had identified the specific campus programs interested in participating, documented in the Launch Programs Report. The first page called out two of the first programs bringing in 200 students and 20 students – in other words, CSO migrated several hundred existing students to get started.
In the Spring 2013 term, CSO kicked off with the Launch Programs described in the February 2013 Advisory Board meeting minutes.
Launch Programs: 6 Programs from 3 Campuses
- CSU Fullerton launched 3 courses in their online Business BA program January 14th 2013; marketing and recruiting of next group of students in progress. 35 + 18 Existing Students.
- CSU Dominguez Hills will launch their BA MBA and PA MPA online programs in spring 2013; marketing and recruiting students is in progress. BA Applied Studies will launch in summer 2013; first CSU reconnect program.
- CSU Monterey Bay will launch two new masters programs, Technology and MS in IT Management in spring 2013 and MS in Instructional Science and Technology will launch in summer 2013. Marketing to begin ASAP.
The notes also call out a financial model (document not shared with Advisory Board but notes taken) with three scenarios.
- Scenerio [sic] 1: Baseline Growth Modeling where projected enrollments grom [sic] from 188 to 7500; programs grom from 3 to 25; revenues from to over $11 million and additional investment required $2.2 million. Break even in FY 12/14.
- Scenario 2: Break Even in fiscal year 2012/14 Modeling where enrollments from from 188 to 15,750, programs grom from 3 to 30, revenues grom to over 23 million and additional investment required is $1 million.
- Scenario 3: Best/Strong Growth where enrollments grow from 254 to 36,250, programs grow from 3 to 50, revenues grow to over $54 million and additional investment required is $1 million.
The budget planning seems to fall on fiscal years (Jul 1 – Jun 30), whereas all other CSO planning was based on calendar years. Note that the best-case scenario included an additional $1 million in CSU investment, and the baseline scenario estimated 7,500 course enrollments from Fall 13 through Spring 14. According to an email exchange with CSU Public Affairs, Fall 13 saw almost 1,200 course enrollments, meaning Spring 14 would have required roughly a six-fold increase just to meet the baseline scenario.
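A quick back-of-the-envelope check of that claim, using the figures above (Fall 13 was reported as “almost 1,200” course enrollments, so the exact multiple depends on the true count; a six-fold increase follows if the Fall number was somewhat below 1,200):

```python
baseline_total = 7_500   # baseline-scenario course enrollments, Fall 13 + Spring 14
fall_13 = 1_200          # reported Fall 13 course enrollments ("almost 1,200")

# Enrollments Spring 14 would still need, and the implied growth over Fall 13
spring_14_needed = baseline_total - fall_13
multiple = spring_14_needed / fall_13

print(spring_14_needed)    # 6300
print(round(multiple, 2))  # 5.25
```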
Update: Also in February, CSO executive director Ruth Claire Black testified at the Little Hoover Commission (an independent state oversight board in California) describing the CSO initiative as part of discussion on state needs in higher education.
By the April Advisory Board meeting, CSO was seeing some positive interest from campuses, although the numbers were fairly modest compared to previous CSO estimations.
April Launch Report
- Fullerton business degree completion program is making good progress; 83 applications pending, 17 admitted for fall. Heavily oversubscribed for Fullerton. Good review from stundents on coaching. 50% of inquiries are for Fullerton program.
- Dominguez Hills BS Applied Studies program starts May 4. Large cohort of existing students. 13 students admitted for summer; fall 17 students admitted.
- The next undergraduate program will be the Northridge Reconnect program. In the next 30 days website will be updated to reflect Reconnect.
- Fresno MBA 60 inquiries; 1 applicant and 1 admission
- Other 4 grad programs slow build; redirect marketing resources towards masters programs
- Fresno Homeland Security Certificate website and Humboldt Golden Four are up on website. We are seeing equal demand across the courses (3 GE courses)
- Interest list has grown significantly; campuses who are not currently participating Cal State Online is full for fall. If existing Cal State Online campus may have capacity. Sociology at Fullerton. Dominguez Hills QA for fall start. Taking advantage of launch financial model.
The notes showed the group watching new activity from the California state legislature regarding online education, including the infamous SB 520. This raised the question of what Cal State Online’s role should be with this new emphasis. [emphasis added below]
Can Cal State Online fulfill the role of putting all online? Where should we focus? State side or Cal State Online. Chancellor wants this to happen. Ruth and Marge are working on a plan. Need to be cautious to not cause confusion to students and not diminish Cal State Online.
Requirement of bill is that courses must be articulated statewide. Makes sense for Cal State Online to take ownership.
In May the CSU faculty senate passed a resolution calling on Cal State Online to promote all online programs and not just the six run through CSO.
RESOLVED: That all online degree programs offered by CSU campuses be given the same degree of prominence on the Calstateonline.com and Calstateonline.net websites as the online degree programs offered through Cal State Online; and be it further
RESOLVED: That there should be no charge for listing state-support online degree programs on the Calstateonline.com and Calstateonline.net websites;
By the June Advisory Board meeting, there was some progress for Fall enrollments, and there was concern that the state legislature did not understand the bottleneck problem.
Legislature thinks that if students knew about online courses our bottleneck problem would be solved. State is not funding FTES. Enrolling students online will need state subsidy. There is a belief that we can educate students online cheaply. There is a disconnect in Sacramento. Enrollment caps are more the issue, not bottlenecks.
There was also an enrollment presentation for the June meeting:
Despite planned meetings every two months, the CSO Advisory Board did not meet again until October, and in this interim the decision was made to abandon the original concept and to change the Pearson contract. Advisory Board members were not pleased with the process.
In early summer Pearson requested changes in the CSU/Pearson contract; wanted to increase CSU costs for services. The quality of the marketing provided by Pearson was not adequate. There were multiple meetings between Pearson and Cal State Online to resolve concerns resulting in changes to the contract.
The new marketing firm for Cal State Online is DENT; replaces Pearson; started in July 2013. So far there is a high level of satisfaction
A communication was distributed to the Advisory Board and CSU system stakeholders on October 17th regarding the Pearson/Cal State Online contract changes. The communication can be found on the Cal State Online CSYOU site [ed. no longer available].
- Members of the Advisory Board stated that there was little to no communication to them about the changes taking place. The last board meeting was a teleconference call in June, and the August in-person meeting was cancelled.
- There was a need to keep only a small number of people involved during the complicated negotiation process
The CSO entity was never formed as a 501(c)3 organization, and with the summer changes CSO would now report to Academic Affairs. The meeting notes further describe the changes.
The current Cal State Online business model will be in place until the end of 2013 and will then change. The Advisory Board will help identify opportunities and provide direction. It is anticipated that this will result in some changes in current program participation but hope that the current campuses will continue. Since campuses now have the option to use the LMS platform of their choice some campuses may elect to change to their own platform. [snip]
The Governor contributed $10 million to increase online education within the CSU. AB 386 Levine. Public postsecondary education: cross-enrollment: online education at the California State University was approved by the Governor on September 26, 2013 [emphasis added].
- With the changes in the Pearson relationship and the passing of AB 386 we are now taking a much broader view of Cal State Online; will be used as a store front for CSU online courses. All online courses and programs in system will have Cal State Online as the store front.
The CSU faculty senate unanimously passed another resolution related to CSO in November. The resolution applauded the movement of CSO to report to Academic Affairs and the allowance for campus selection of LMS, but the real focus was the lack of faculty input in the decision-making.
RESOLVED: That the Academic Senate of the California State University (ASCSU) express its dismay that recent changes to Cal State Online were announced to system constituencies without review or input from the Cal State Online Advisory Board; and be it further [snip]
RESOLVED: That the ASCSU contend that the dissolution of the Cal State Online Board should not occur until a plan for a new governance structure that includes faculty is established, and be it further
RESOLVED: That the ASCSU recommend the establishment of a newly configured Cal State Online system wide advisory committee to include at least 5 faculty members, and the creation of a charge, in a partnership between the ASCSU and the Academic Affairs division of the Chancellor’s Office;
This issue – involvement in decision-making – was continued at the final Advisory Board meeting just three days after the senate resolution.
Ephraim Smith (VP Academic Affairs): The Cal State Online Board was originally created for a 501c3 organization but there was a change in direction and did not pursue 501c3; board then acted as advisory. Now that Cal State Online hase moved to Academic Affairs the question is how should it interact with constituencies; work through existing committees? Need to discuss.
There are three full pages of notes on the resulting discussion, ending in a plan to form a Commission that looks broadly at online education across the CSU.

2014
Despite the decision being made in Fall 2013 on the major changes to Cal State Online, the systemwide communication listed in my July post was not made until June 2014. The above description is mostly based on CSO documentation, but I plan to add a few of my own thoughts of the lessons learned from this short-lived online initiative in a future post.
- CSU officials did not respond to requests to be interviewed for this story. The offer is still open if someone would like to comment.
- The contract is no longer available in public, so I will only share one excerpt here.
- Disclosure: Michael and I wrote a white paper for 20 Million Minds Foundation calling out how Cal State Online did not attempt to address relieving bottleneck courses for matriculated students, which was the purported goal of much of the state legislative debate.
The post Cal State Online: Public records shed light on what happened appeared first on e-Literate.
August 27, 2014
There’s just a mind-boggling amount of money per student that’s being spent on administration
Andrew Gillen, quoted in “New Analysis Shows Problematic Booming Higher Ed Administrators,” Huffington Post, August 26, 2014
Administrative growth drives up costs at state-owned universities
Debra Erdley, TribLive, July 28, 2013
Across U.S. higher education, nonclassroom costs have ballooned, administrative payrolls being a prime example.
Wall Street Journal as quoted by Phil Hill, e-Literate, January 2, 2013
Administrative costs on college campuses are soaring.
J. Paul Robinson, quoted in “Bureaucrats Paid $250,000 Feed Outcry Over College Costs,” Bloomberg News, November 14, 2012
Administrative Costs Mushrooming
George Leef, John William Pope Center for Higher Education Policy, September 15, 2010
Are these true, or generalizations that lack the rigor of research? What does the data say?
Since 2004, the National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System (IPEDS) financial survey of colleges and universities has reported the costs of Institutional Support in a standard form. This broad category includes “general administrative services, central executive-level activities concerned with management, legal and fiscal operations, space management, employee personnel and records, … and information technology.” In business this is often called “administration.”
Data from NCES’s Digest of Education Statistics 2012 show decreases in cost per student from 2003-2004 through 2010-2011, except for public 4-year colleges and universities, which increased expenses by 4.1%, as shown in Table 1.
Institutional Support per Student

| Sector | Change |
| --- | --- |
| Public 4 year | 4.1% |
| Private 4 year | -15.7% |
| Public 2 year | -16.3% |
| Private 2 year | |

Table 1 – Cost of “administration” per enrolled student
These data are expressed in July 2014 dollars, adjusted using the Consumer Price Index (CPI-U), so the results are unaffected by inflation. The year 2003-2004 was selected for comparison because it was the first year whose data definitions and formats were consistent with 2010-11. Because private colleges and universities do not report operation of plant, that cost was omitted from the percentage computations for both sectors. Headcount was used since administrative expenses are more closely related to enrollment of real students than to a mythical full-time equivalent (FTE).
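The inflation adjustment described above is a straightforward CPI rescaling; a sketch of the computation, where the index values below are illustrative placeholders rather than the actual CPI-U figures used in the analysis:

```python
def to_july_2014_dollars(nominal: float, cpi_then: float, cpi_july_2014: float) -> float:
    """Rescale a nominal dollar amount into July 2014 dollars via a CPI-U ratio."""
    return nominal * (cpi_july_2014 / cpi_then)

# Illustrative placeholder index values (not the actual CPI-U figures):
# suppose CPI-U averaged 188.9 over 2003-04 and stood at 238.3 in July 2014.
adjusted = to_july_2014_dollars(1_000.0, cpi_then=188.9, cpi_july_2014=238.3)
print(round(adjusted, 2))
```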
These data are shown graphically in Figure 1.
Figure 1 – Comparative Administrative Expenses 2003-2004 and 2010-2011
Data showing administration as a percent of institutional expenses, omitting independent organizations, hospitals, and auxiliary enterprises, are shown in Figure 2.
Figure 2 – Administration Expenses as a Percent of Institutional Expenses
The percentages are nearly equal for the two years, even though administration expenses per student declined during this period everywhere except at public 4-year colleges and universities. This reduction, likely true for the cost of instruction as well, reflects enrollment growth against institutional budgets that were typically less than or about the same as in 2003-2004.
The IPEDS revision introduced in the late ’70s and early ’80s was based on program budgeting. The mission of the college or university was considered to be a combination of instruction, research, and public service—sometimes called direct costs. The library and computing were consolidated into academic support on the belief that books would transition into electronic documents. Student services is another indirect category that includes admissions, the registrar, and activities that contribute to students’ emotional and physical well-being, intramural athletics, and student organizations. Intercollegiate athletics and student health services may be included “except when operated as self-supporting auxiliary enterprises.”
IPEDS tried to keep financial aid out of the institutional expenses of mission-based programs since, for example, it is a transfer payment from one student (tuition paid) to another (tuition discount).
NCES now makes the data from these surveys available using several different statistical tools (software).
The NCES data are very useful in analysis and in communicating with a public that seems to be receiving more opinions than facts.
This analysis is an example of verifying assertions that administration expenses are mushrooming, soaring, or ballooning.
Are administrative expenses soaring? The evidence is “no.” But that doesn’t make a sensational headline.
As Phil noted in yesterday’s post, Kuali is moving to a for-profit model, and it looks like it is motivated more by sustainability pressures than by some grand affirmative vision for the organization. There has been a long-term debate in higher education about the value of “community source,” which is a particular governance and funding model for open source projects. This debate is arguably one of the reasons why Indiana University left the Sakai Foundation (as I will get into later in this post). At the moment, Kuali is easily the most high-profile and well-funded project that still identifies itself as Community Source. The fact that this project, led by the single most vocal proponent for the Community Source model, is moving to a different model strongly suggests that Community Source has failed.
It’s worth taking some time to talk about why it has failed, because the story has implications for a wide range of open-licensed educational projects. For example, it is very relevant to my recent post on business models for Open Educational Resources (OER).

What Is Community Source?
The term “Community Source” has a specific meaning and history within higher education. It was first (and possibly only) applied to a series of open source software projects funded by the Mellon Foundation, including Sakai, Kuali, Fedora, and DSpace (the latter two of which have merged). As originally conceived, Community Source was an approach that was intended to solve a perceived resource allocation problem in open source. As then-Mellon Foundation Associate Program Officer Chris Mackie put it,
For all that the OSS movement has produced some runaway successes, including projects like Perl, Linux, and Mozilla Firefox, there appear to be certain types of challenges that are difficult for OSS to tackle. Most notably, voluntaristic OSS projects struggle to launch products whose primary customers are institutions rather than individuals: financial or HR systems rather than Web servers or browsers; or uniform, manageable desktop environments rather than programming languages or operating systems. This limitation may trace to any of several factors: the number of programmers having the special expertise required to deliver an enterprise information system may be too small to sustain a community; the software may be inherently too unglamorous or uninteresting to attract volunteers; the benefits of the software may be too diffuse to encourage beneficiaries to collaborate to produce it; the software may be too complex for its development to be coordinated on a purely volunteer basis; the software may require the active, committed participation of specific firms or institutions having strong disincentives to participate in OSS; and so on. Any of these factors might be enough to prevent the successful formation of an OSS project, and there are many useful types of enterprise software—including much of the enterprise software needed by higher education institutions—to which several of them apply. In short, however well a standard OSS approach may work for many projects, there is little reason to believe that the same model can work for every conceivable software project.
This is not very different from the argument I made recently about OER:
In the early days of open source, projects were typically supported through individual volunteers or small collections of volunteers, which limited the kinds and size of open source software projects that could be created. This is also largely the state of OER today. Much of it is built by volunteers. Sometimes it is grant funded, but there typically is not grant money to maintain and update it. Under these circumstances, if the project is of the type that can be adequately well maintained through committed volunteer efforts, then it can survive and potentially thrive. If not, then it will languish and potentially die.
The Mellon Foundation’s answer to this problem was Community Source, again as described by Chris Mackie:
Under this new model, several institutions contract together to build software for a common need, with the intent of releasing that software as open source. The institutions form a virtual development organization consisting of employees seconded from each of the partners. This entity is governed cooperatively by the partners and managed as if it were an enterprise software development organization, with project and team leads, architects, developers, and usability specialists, and all the trappings of organizational life, including reporting relationships and formal incentive structures. During and after the initial construction phase, the consortial partners open the project and invite in anyone who cares to contribute; over time the project evolves into a more ordinary OSS project, albeit one in which institutions rather than individual volunteers usually continue to play a major role.
A good friend of mine who has been involved in Mellon-funded projects since the early days describes Community Source more succinctly as a consortium with a license. Consortial development is a longstanding and well understood method of getting things done in higher education. If I say to you, “Kuali is a consortium of universities trying to build an ERP system together,” you will probably have some fairly well-developed notions of what the pros and cons of that approach might be. The primary innovation of Community Source is that it adds an open source license to the product that the consortium develops, thus enabling another (outer) circle of schools to adopt and contribute to the project. But make no mistake: Community Source functions primarily like a traditional institutional consortium. This can be best encapsulated by what Community Source proponents refer to as the Golden Rule: “If you bring the gold then you make the rules.”
Proponents of Community Source suggested even from the early days that Community Source is different from open source. Technically, that’s not true, since Community Source projects produce open source software. But it is fair to say that Community Source borrows the innovation of the open source license while maintaining traditional consortial governance and enterprise software management techniques. Indiana University CIO and Community Source proponent Brad Wheeler sometimes refers to Community Source as “the pub between the Cathedral and the Bazaar” (a reference to Eric Raymond’s seminal essay on open source development). More recently, Brad and University of Michigan’s Dean of Libraries James Hilton codified what they consider to be the contrasts between open source and Community Source in their essay “The Marketecture of Community,” which Brad elaborates on in his piece “Speeding Up On Curves.” They represent different models of procuring software in a two-by-two matrix, where the dimensions are “authority” and “influence”:
Note that both of these dimensions are about the degree of control that the purchaser has in deciding what goes into the software. It is fundamentally a procurement perspective. However, procuring software and developing software are very different processes.

A Case Study in Failure and Success
The Sakai community and the projects under its umbrella provide an interesting historical example of how Community Source has worked and where it has broken down. In its early days, Indiana University and the University of Michigan were primary contributors to Sakai and very much promoted the idea of Community Source. I remember a former colleague returning from a Sakai conference in the summer of 2005 commenting, “That was the strangest open source conference I have ever been to. I have never seen an open source project use the number of dollars they have raised as their primary measure of success.” The model was very heavily consortial in those days, and the development of the project reflected that model. Different schools built different modules, which were then integrated into a portal. As Conway’s Law predicts, this organizational decision led to a number of technical decisions. Modules developed by different schools were of differing quality and often integrated with each other poorly. The portal framework created serious usability problems like breaking the “back” button on the browser. Some of the architectural consequences of this approach took many years to remediate. Nevertheless, Sakai did achieve a small but significant minority of U.S. higher education market share, particularly at its peak a few years ago. Here’s a graph showing the growth of non-Blackboard LMSs in the US as of 2010, courtesy of data from the Campus Computing Project:
Meanwhile, around 2009, Cambridge University built the first prototype of what was then called “Sakai 3.” It was intended to be a ground-up rewrite of a next-generation system. Cambridge began developing it themselves as an experiment out of their Centre for Applied Research in Educational Technologies, but it was quickly seized upon by NYU and several other schools in the Sakai community as interesting and “the future.” A consortial model was spun up around it, and then spun up some more. Under pressure from Indiana University and University of Michigan, the project group created multiple layers of governance, the highest of which eventually required a $500K institutional commitment in order to participate. Numbers of feature requirements and deadlines proliferated, while project velocity slowed. The project hit technical hurdles, principally around scalability, that it was unable to resolve, particularly given ambitious deadlines for new functionality. In mid-2012, Indiana University and University of Michigan “paused investment” in the project. Shortly thereafter, they left the project altogether, taking with them monies that they had previously committed to invest under a Memorandum of Understanding. The project quickly collapsed after that, with several other major investors leaving. (Reread Phil’s post from yesterday with this in mind and you’ll see the implications for measuring Kuali’s financial health.)
Interestingly, the project didn’t die. Greatly diminished in resources but freed from governance and management constraints of the consortial approach, the remaining team not only finally re-architected the platform to solve the scalability problems but also have managed seven major releases since that implosion in 2012. The project, now called Apereo OAE, has returned to its roots as an academic (including learning) collaboration platform and is not trying to be a direct LMS replacement. It has even begun to pick up significant numbers of new adoptees—a subject that I will return to in a future post.
It’s hard to look at the trajectory of this project and not conclude that the Community Source model was a fairly direct and significant cause of its troubles. Part of the problem was the complex negotiations that come along with any consortium. But a bigger part, in my opinion, was the set of largely obsolete enterprise software management attitudes and techniques that come along as a not-so-hidden part of the Community Source philosophy. In practice, Community Source is essentially a project management approach focused on maximizing the control and influence of the IT managers whose budgets are paying for the projects. But those people are often not the right people to make decisions about software development, and the waterfall processes that they often demand in order to exert that influence and control (particularly in a consortial setting) are antithetical to current best practices in software engineering. In my opinion, Community Source is dead primarily because the Gantt Chart is dead.

Not One Problem but Two
Community Source was originally developed to address one problem, which was the challenge of marshalling development resources for complex (and sometimes boring) software development projects that benefit higher education. It is important to understand that, in the 20 years since the Mellon Foundation began promoting the approach, a lot has changed in the world of software development. To begin with, there are many more open source frameworks and better tools for developing good software more quickly. As a result, the number of people needed to build software products (including voluntaristic open source projects) has shrunk dramatically—in some cases by as much as an order of magnitude. Instructure is a great example of a software platform that reached first release with probably less than a tenth of the money that it took Sakai to reach the same milestone. But also, we can reconsider that “voluntaristic” requirement in a variety of ways. I have seen a lot of skepticism about the notion of Kuali moving to a commercial model. Kent Brooks’ recent post is a good example. The funny thing about it, though, is that he waxes poetic about Moodle, which has a particularly rich network of for-profit companies upon which it depends for development, including Martin Dougiamas’ company at the center. In fact, in his graphic of an ideal all-open-source world, almost every project listed has one or more commercial companies behind it without which it would either not exist or would be struggling to improve:
BigBlueButton is developed entirely by a commercial entity. The Apache web server gets roughly 80% of its contributions from commercial entities, many of which (like IBM) get direct financial benefit from the project. And Google Apps aren’t even open source. They’re just free. Some of these projects have strong methods for incorporating voluntaristic user contributions and taking community input on requirements, while others have weak ones. But across that spectrum of practices, community models, and sustainability models, they manage to deliver value. There is no one magic formula that is obviously superior to the others in all cases. This is not to say that shifting Kuali’s sustainability model to a commercial entity is inevitably a fine idea that will succeed in enabling the software to thrive while preserving the community’s values. It’s simply to say that moving to a commercially-driven sustainability model isn’t inherently bad or evil. The value (or lack thereof) will all depend on how the shift is done and what the Kuali-adopting schools see as their primary goals.
But there is also a second problem we must consider—one that we’ve learned to worry about in the last couple of decades of progress in the craft of software engineering (or possibly a lot earlier, if you want to go back as far as the publication of The Mythical Man Month). What is the best way to plan and execute software development projects in light of the high degree of uncertainty inherent in developing any software with non-trivial complexity and a non-trivial set of potential users? If Community Source failed primarily because consortia are hard to coordinate, then moving to corporate management should solve that problem. But if it failed primarily because it reproduces failed IT management practices, then moving to a more centralized decision-making model could exacerbate the problem. Shifting the main stakeholders in the project from consortium partners to company investors and board members does not require a change in this mindset. No matter who the CEO of the new entity is, I personally don’t see Kuali succeeding unless it can throw off its legacy of Community Source IT consortium mentality and the obsolete, 1990s-era IT management practices that undergird it.
- No, I did not make that up. See, for example, https://chronicle.com/article/Business-Software-Built-by/49147
Ten years ago, a group of universities started a collaborative software project touted as an alternative to commercial software companies, which were criticized as too costly. On Friday the project’s leaders made a surprising announcement: that it would essentially become a commercial entity. [snip]
The Kuali Foundation will continue to exist as a non-profit, but it will be an investor in a new commercial entity to back the Kuali software development. Leaders insisted that they would maintain the values of the project despite creating the kind of organization that they once criticized. For one thing, the source software will remain free and open, but the company will sell services, like software hosting. On Friday the group issued an FAQ with details about the change.
As Carl Straumsheim put it at Inside Higher Ed:
The Kuali Foundation, after a decade of fighting commercial software vendors as a community source initiative, will launch a commercial company to better fight… commercial software vendors.
Despite the positioning that this change is about innovating into the next decade, there is much more to this change than might be apparent on the surface. The creation of a for-profit entity to “lead the development and ongoing support” and to enable “an additional path for investment to accelerate existing and create new Kuali products” fundamentally moves Kuali away from the community source model. Member institutions will no longer have voting rights for Kuali projects but will instead be able to “sit on customer councils and will give feedback about design and priority”. Given such a transformative change to the underlying model, there are some big questions to address.

Financial Needs
Kuali, being a non-profit foundation, has its financial records available online, and its Form 990 tax filings are easily obtained through sites such as GuideStar. Furthermore, instructional media + magic (im+m) has a public eLibrary where they have shared Kuali documentation over the years. There does not appear to be a smoking gun in the financials to directly explain the need for such a significant change, but there are hints of issues that provide some insight. In a recent analysis of Kuali’s financials from these public sources, im+m noted that Kuali has reserves to survive between 8 and 34 months with no additional income, depending on the percentage of uncollectible accounts receivable. In an article in the Chronicle this past spring, Kuali leaders described their apparent financial strength.
The foundation is in the best financial shape it has ever been, its officials say. Membership dues for small colleges start at a few thousand dollars; some big institutions contribute up to seven figures for specific software projects.
“We are about a $30-million net-asset organization,” says Ms. Foutty, the executive director. “There is not a concern that we are going to lack cash flow or net assets to do what we want to do.”
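Claims like these can be stress-tested with a back-of-the-envelope runway calculation. In the sketch below, the asset figures come from Kuali’s FY2013 financial report; the monthly burn rate is a hypothetical figure of my own, chosen purely for illustration, so the outputs show the shape of the sensitivity rather than im+m’s actual numbers:

```python
# Back-of-the-envelope runway estimate for a foundation whose net assets
# are mostly receivables. Asset figures are from Kuali's FY2013 financial
# report; the monthly burn rate is a hypothetical illustrative figure.

def runway_months(cash_like, receivables, uncollectible_share, monthly_burn):
    """Months of operation with no new income, assuming the stated
    share of receivables is never collected."""
    collectible = receivables * (1.0 - uncollectible_share)
    return (cash_like + collectible) / monthly_burn

NET_ASSETS = 26.4e6                    # total net assets, FY2013 report
RECEIVABLES = 21.3e6                   # "contributions receivable"
CASH_LIKE = NET_ASSETS - RECEIVABLES   # roughly $5.1M in other assets
BURN = 650e3                           # hypothetical monthly spend

for share in (0.0, 0.5, 1.0):
    months = runway_months(CASH_LIKE, RECEIVABLES, share, BURN)
    print(f"{share:.0%} uncollectible -> about {months:.0f} months of runway")
```

Under these hypothetical assumptions the runway swings from roughly 8 months (if nothing is collected) to well over three years (if everything is), which is the same order of sensitivity that im+m’s 8-to-34-month range reflects: the foundation’s apparent strength depends almost entirely on whether the receivables actually come in.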
But what comprises these net assets? It turns out that the vast majority consists of accounts receivable, and more specifically, committed in-kind contributions of project resources from member institutions on the various projects. Looking at the financial report from last year (ended June 30, 2013 – see p. 3), we can see that Kuali had net assets of $26.4 million, of which $21.3 million were “contributions receivable”. I would assume that current assets have approximately the same ratios. What this means is that a foundation such as Kuali is more dependent on member institutions keeping the faith and honoring contribution commitments than on pure dues and hard cash. Kuali cannot afford for too many institutions to pull out of the consortium and write off their commitments, and that faith rests on whether Kuali will deliver the products that the institutions need.

Timing
According to the Kuali web site, the addition of a for-profit entity was based on two community strategy meetings that were held June 25-26 and July 30-31 of this year. Brad Wheeler, chair of the Kuali Foundation and CIO at Indiana University, wrote his summary of the meetings on Aug 1, 2014, including these two prophetic notes:
- We need to accelerate completion of our full suite of Kuali software applications, and to do so we need access to substantially more capital than we have secured to date to meet this need of colleges and universities.
- Kuali should consider any applicable insights from a new breed of “professional open source” firms (ex. RedHat, MySQL, Instructure) that are succeeding in blending commercial, open source, and foundation models. This should include consideration of possibly creating a commercial arm of the Kuali community.
There were also direct notes about the need for cloud services and better project coordination and decision-making.
The changes announced on Friday come less than two months after the first community strategy meeting, so I have trouble seeing the meetings as the cause and the Friday changes as the effect. There is reason to believe that the changes have been in the works prior to June of this year.

Change as an Indicator
When Kuali makes a change this radical (moving away from the community source model) in a timeframe this short (less than two months), I think the best way to view the change is as an indicator that there are bigger issues under the surface. I wrote in a post on Unizin about a key question about the community source model:
Community source has proven its ability to develop viable solutions for known product categories and generally based on existing solutions – consider Sakai as an LMS (heavily based on U Michigan’s CHEF implementation and to a lesser degree on Indiana University’s OnCourse), Kuali Financial System (based directly on IU’s financial system), and Kuali Coeus (based on MIT’s research administration system). When you get rid of a pre-existing solution, the results are less promising. Kuali Student, based on a known product category but designed from the ground up, is currently on track to take almost 8 years from concept to full functionality. Looking further, are there any examples where a new product in an ill-defined product category has successfully been developed in a community source model?
Kent Brooks, CIO of Casper College, wrote a post this morning and called out a critical aspect of why this challenge is so important.
My overall observation is that the 10 year old Kuali project seems to have hit a bit of a lull in new adoptions. Partly is because institutions such as mine provide the next ‘wave of growth’ potential and most are unwilling to listen to the Kuali talk when there is not a Kuali Walk…aka a complete suite of tools with which one can operate the entire institution. It is a deal breaker for the 4000ish small to mid sized institutions in the US alone.
In other words, the vision of Kuali requires the availability of Kuali Student in particular, but also of HR / Payroll. Both of these projects are based on future promises. I strongly suspect that the incomplete suite of tools that Kent mentions is the real driving issue behind the changes.
Kuali must have new investment in order to complete its suite of applications, and the for-profit entity is the vehicle that the Foundation needs to raise the capital. One model that certainly informs this approach is ANGEL Learning, a for-profit entity which was founded and partially owned by the non-profit Indiana University (IU). ANGEL was able to raise additional investment beyond IU, and when ANGEL was sold for $100 million in 2009, IU made approximately $23 million in proceeds from the sale.

Required Change
Although there is a lot still to learn, my view is that the creation of a for-profit entity is not just a choice for acceleration into the next decade but is a change that the Kuali Foundation feels is required. Kuali can no longer bet that the community source model as currently implemented can successfully complete new products not based on pre-existing university applications, and they cannot rely on the current model to attract sufficient investment to finish the job.
Brad Wheeler was quoted at Inside Higher Ed summarizing the changes.
“What we’re really doing is gathering the good things a .com can do: stronger means of making decisions, looking broadly at the needs of higher education and maybe sharpening product offerings a bit more,” Wheeler said. “This is going to be a very values-based organization with patient capital, not venture capital.”
The foundation will fund the launch, Wheeler said. For future funding, the company won’t pursue venture capital or private equity, but money from “values-based investors” such as university foundations. That means Kuali won’t need to be run like a traditional ed-tech startup, he said, as the company won’t be “beholden to Wall Street.”
In a post from this afternoon, Chris Coppola from rSmart (a co-founder of Kuali) provided his summary:
The Kuali mission is unwavering, to drive down the cost of administration for colleges and universities to keep more money focused on the core teaching and research mission. Our (the Kuali community) mission hasn’t changed, but the ability to execute on it has improved dramatically. The former structure made it too difficult for colleges and universities to engage and benefit from Kuali’s work. This new model will simplify how institutions can engage. The former structure breeds a lot of duplicative (and even competitive) work. The new structure will be more efficient.

More to Come
There is a lot of news to unpack here, and Michael and I will report and provide analysis as we learn more. For now, there are some big questions to consider:
- If you read the rest of Kent Brooks’ blog, you’ll see that he is now delaying the decision for his school to join the Kuali community. How many other schools will rethink their membership in Kuali based on the new model? The Kuali FAQ acknowledges that they will lose members but also predicts they will gain new membership. Will this prediction prove to be accurate?
- More importantly, are there already current member institutions providing significant resources that are threatening to pull out of Kuali?
- Given the central need for new, significant investment, will Kuali and the new for-profit entity succeed in bringing in this investment?
- Will the new entity directly address the project challenges and complete the full suite of applications that is needed by the Kuali community?
- What effect will Kuali’s changes have on other community source initiatives such as Sakai / Apereo and Unizin (if it does get into software development)?
Update 8/26: Clarified language on voting rights from ‘customers’ to ‘member institutions’; added qualifier to last question re: Unizin (it would only be community source if it gets into software development).
- Disclosure: Jim Farmer from im+m has been a guest blogger at e-Literate for many years.
The post Kuali For-Profit: Change is an indicator of bigger issues appeared first on e-Literate.