LMS Evaluation Planning Toolkit: Evaluate

This toolkit houses materials created and identified by PALNI's LMS Evaluation Task Force to assist member institutions as they examine their existing Learning Management System (LMS) and/or pursue a change.

Overview

The evaluate phase of the LMS Review Process is the most time-intensive phase, as system administrators determine which systems adequately improve learning outcomes, promote instructional innovation, enhance student engagement, and offer functionality for different modes of course offering.

Use the tabs below to review evaluation considerations.

Evaluation Considerations

Needs Analysis

It is at this evaluation phase that a User Needs Analysis (distinct from the initial user survey analysis in the Prepare Phase tab) helps determine what users need or want from their LMS. Creating a functional requirements list will make the LMS user survey easier to build; use the RFP document from the Initiate tab to assist.

Once an institution has arrived at a set of final LMS vendors to evaluate, consider how to surface the differences between systems. For some institutions, in addition to running pilots, this will entail developing a comprehensive rubric to aid in a more systematic and detailed analysis of a given system's qualities. Creating a technical rubric gives LMS support staff, who are charged with the care and maintenance of the system, a voice in the decision-making process. This type of technical rubric includes a method of scoring the LMS on technical features, functionality, administrative tools, and support criteria. View the resources below to help in this initial user needs analysis process.
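
For institutions building such a rubric, the scoring can be as simple as a weighted average of ratings. Below is a minimal Python sketch of that approach; the criteria, weights, and 1-5 ratings are illustrative assumptions, not values prescribed by this toolkit.

```python
# Minimal weighted-rubric scorer. The criteria, weights, and ratings
# below are hypothetical placeholders -- replace them with your
# institution's functional requirements list and priorities.

CRITERIA_WEIGHTS = {
    "technical_features": 0.30,
    "functionality": 0.30,
    "admin_tools": 0.20,
    "vendor_support": 0.20,
}

def score_lms(ratings: dict) -> float:
    """Combine 1-5 ratings per criterion into one weighted score."""
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items())

# Two hypothetical candidate systems, rated by support staff.
candidates = {
    "LMS A": {"technical_features": 4, "functionality": 3,
              "admin_tools": 5, "vendor_support": 4},
    "LMS B": {"technical_features": 3, "functionality": 5,
              "admin_tools": 3, "vendor_support": 5},
}
for name, ratings in candidates.items():
    print(f"{name}: {score_lms(ratings):.2f} / 5.00")
```

One benefit of explicit weights is that the committee must state its priorities before any candidate scores are compared.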


 

Financial Evaluation Considerations

Institutions will need to have a sense of current LMS costs before they evaluate alternatives. If the institution currently has a vended LMS, the financial considerations process will be straightforward. If, however, the system is a self-hosted, open-source LMS, the institution will need to determine the staff hours spent supporting the LMS, as well as any infrastructure and support costs. These "indirect" costs are challenging to ascertain, but accounting for them makes it easier to compare the true costs of changing systems.

Vended LMS systems are often offered on a cloud or SaaS (software as a service) model in which the vendor carries the burden of maintaining uptime and managing patching and update schedules. Further, cloud-based services tend to have robust backup and redundancy contingencies articulated in SLAs (service level agreements) in case of system or network failures.

LMS vendors will often provide different support packages at varying price points; premium support packages typically include 24/7 dedicated Help Desk service levels. Vended LMS subscriptions are typically priced per student FTE of the institution. It is not uncommon to see multi-year pricing proposals that heavily discount per-FTE fees in earlier years and then increase them dramatically as the contract matures. This approach can be beneficial in that it can offset implementation costs or the cost of running two vended systems concurrently for a period of time.
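
A quick back-of-the-envelope calculation shows how back-loaded fees affect total contract cost. All figures below are hypothetical:

```python
# Hypothetical four-year proposal: per-FTE fees discounted early,
# rising as the contract matures. Substitute your actual quote.
fte = 2000                                      # assumed student FTE
per_fte_by_year = [8.00, 10.00, 14.00, 16.00]   # assumed $/FTE per year

for year, fee in enumerate(per_fte_by_year, start=1):
    print(f"Year {year}: ${fee * fte:,.0f}")
total = sum(per_fte_by_year) * fte
print(f"Total: ${total:,.0f} "
      f"(effective ${total / fte / len(per_fte_by_year):.2f}/FTE/year)")
```

Comparing the effective per-FTE rate across the full term, rather than the year-one rate, keeps discounted proposals comparable.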

Institutions opting for an open-source, locally hosted option will carry the burden of maintaining servers, performing patching, implementing security measures, etc. These costs must be weighed not only in terms of hardware and staff hours, but also in terms of opportunity costs. Cost-related items to consider include the following (see the cost-comparison sketch after this list):

  • system migration cost
    • pricing per FTE
  • training and support cost
    • inefficient processes
    • employee time/effort
  • self-hosted vs. vended cost
    • on-premise/open source
    • hosted cloud-based
    • hardware
    • patching/upkeep
    • all-inclusive and packaged
    • metered pricing
  • consortium support cost
  • marketing and branding cost
  • piloting and running multiple systems at once
    • analytics
    • missing features
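
One way to keep these categories honest is to total direct and indirect costs side by side. The sketch below compares a hypothetical vended option against a hypothetical self-hosted option; every figure is a placeholder for local data.

```python
# Hypothetical annual total-cost-of-ownership comparison.
# All amounts are placeholders -- substitute institutional figures.

vended = {
    "subscription (per-FTE fees)": 24_000,
    "premium support package": 5_000,
    "migration and training": 8_000,
}
self_hosted = {
    "hardware and hosting": 6_000,
    "staff time (est. 0.5 FTE sysadmin)": 35_000,
    "patching, upkeep, and security": 4_000,
    "training and support": 8_000,
}

for label, costs in (("Vended", vended), ("Self-hosted", self_hosted)):
    print(f"{label}: ${sum(costs.values()):,} per year")
    for item, amount in costs.items():
        print(f"  {item}: ${amount:,}")
```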

User Trials & Pilots

By this step, the earlier vetting process to arrive at a short list of systems to evaluate has typically been completed, whether through research, an RFP process, or another method. Institutions should now begin working with the selected vendor(s) to fully evaluate the capacities of their systems. One common way to do this is by running a pilot of the LMS. Keep in mind that vendors usually charge for these pilots. The terms typically include the following stipulations:

  • A cap on how many users will be supported/allowed to participate during the trial period. Pilots involving 400-500 students are not uncommon.
  • The time frame of the trial, which may span both fall and spring semesters or cover only a single semester.

As you plan your trial period with a pilot group of users, consider the following:

  • Vendors will usually provide sandbox course environments where select users can get accounts and begin experiencing the LMS features before students are loaded into the system. 
  • Soliciting feedback from pilot users is imperative to the evaluation process:
    • User surveys (consider the three model approaches outlined in Goodrum's (2016) dissertation, cited in the Literature Review below)
    • Focus groups
    • Technical evaluation via a rubric

 

Pilot participant survey examples offer questions on what users wish the LMS could do and on their experience with the system under review.
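
When survey responses come back, even a simple tally helps the committee see patterns. Below is a minimal sketch, assuming 1-5 Likert-style responses keyed by question; the questions and scores shown are invented.

```python
# Summarize pilot survey results. Questions and scores are made up.
from statistics import mean

responses = {
    "The LMS was easy to navigate": [4, 5, 3, 4, 4],
    "I could find course materials quickly": [5, 4, 4, 5, 3],
    "The gradebook met my needs": [3, 2, 4, 3, 3],
}

for question, scores in responses.items():
    print(f"{question}: mean {mean(scores):.1f} (n={len(scores)})")
```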

Technical & Functional Evaluations

The trial agreement with the vendor provides an opportunity to test backend connections, including authentication. Consider these areas when testing (a basic API check sketch follows the list):

  • Connections between the LMS and the institutional Student Information System (SIS)
  • Authentication connections with the new system and available support
  • Security protocols used for the LMS
  • Updates and patching schedules 
  • API and LTI capacities
  • Access to log reports
  • Non-academic uses of the LMS (campus-wide surveys, campus committee projects, etc.)
  • Vendor support responsiveness
  • Ability for administrators to run comprehensive & custom reports
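
For the API item above, even a single scripted request can confirm that pilot credentials and endpoints behave as documented. The sketch below is hypothetical: the base URL, endpoint path, token, and rate-limit header are placeholders, and the real values must come from your vendor's API documentation.

```python
# Basic REST API capacity check during a pilot. The URL, endpoint,
# and token below are hypothetical placeholders, not a real vendor API.
import requests

BASE_URL = "https://pilot.example-lms.edu/api/v1"   # placeholder
TOKEN = "REPLACE_WITH_PILOT_API_TOKEN"              # placeholder

resp = requests.get(
    f"{BASE_URL}/courses",                          # hypothetical endpoint
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
print(resp.status_code)   # 200 suggests authentication and routing work
print(resp.headers.get("X-Rate-Limit-Remaining"))   # if the vendor exposes it
```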

Decision-Making Process

As the pilot period concludes, institutions should begin collecting feedback from pilot participants. Surveys are a great way to do this and, if carefully crafted, can give a sense of how users regard the systems and how their learning was affected (refer back to the User Trials & Pilots tab for links to survey examples). Themes will emerge from focus groups, and the technical rubric will give insight into the likely support burden of each evaluated system. Analyzing how well these three dimensions align with institutional goals will, ideally, reveal a clear path forward.

As you begin the decision-making process, consider the following action steps (a weighted comparison sketch follows the list):

  • Analyze focus group and survey data from pilots
  • Score technical rubrics
  • Compare costs, keeping in mind opportunities to purchase via consortium pricing (e.g. EandI.org)
  • Prepare institutional reports
  • Arrive at a recommendation and deliver the system choice to decision makers
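
One way to structure the final comparison is a simple decision matrix that weights the three dimensions discussed above. The weights and normalized scores below are hypothetical; set them to reflect institutional goals.

```python
# Decision-matrix sketch combining pilot feedback, the technical
# rubric, and cost. All weights and scores are hypothetical.

WEIGHTS = {"pilot_feedback": 0.4, "technical_rubric": 0.4, "cost": 0.2}

# Normalized 0-1 scores (for cost, higher = lower total cost).
scores = {
    "LMS A": {"pilot_feedback": 0.80, "technical_rubric": 0.70, "cost": 0.60},
    "LMS B": {"pilot_feedback": 0.70, "technical_rubric": 0.85, "cost": 0.75},
}

for name, dims in scores.items():
    total = sum(WEIGHTS[d] * s for d, s in dims.items())
    print(f"{name}: {total:.2f}")
```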

Literature Review

Black, E.W., Beck, D., Dawson, K., Jinks, S., & DiPietro, M. (2007). The other side of the LMS: Considering implementation and use in the adoption of an LMS in online and blended learning environments. TechTrends: Linking Research & Practice to Improve Learning, 51(2), 35-39.

Brown, M., Dehoney, J., & Millichap, N. (2015, April 27). The next generation digital learning environment: A report on research. Educause. Retrieved from https://library.educause.edu/resources/2015/4/the-next-generation-digital-learning-environment-a-report-on-research

Fritz, J. & Whitmer, J. (2017, February 27). Learning analytics research for LMS course design: Two studies. Educause. Retrieved from https://er.educause.edu/articles/2017/2/learning-analytics-research-for-lms-course-design-two-studies

Goodrum, D. (2016, April). Relative utility of three models for user evaluations of learning management systems: A higher-ed institution decision context (Doctoral dissertation). Indiana University, Bloomington, IN. Retrieved from https://scholarworks.iu.edu/dspace/handle/2022/20909

Steel, C. & Levy, M. (2009). Creativity and constraint: Understanding teacher beliefs and the use of LMS technologies. In Proceedings ascilite Auckland 2009. Retrieved from https://www.ascilite.org/conferences/auckland09/procs/steel.pdf

Wright, C., Lopes, V., Montgomerie, T., Reju, S., & Schmoller, S. (2014, April 21). Selecting a learning management system: Advice from an academic perspective. Educause. Retrieved from https://er.educause.edu/articles/2014/4/selecting-a-learning-management-system-advice-from-an-academic-perspective

Change Management & Communication Tips

Communication Tips

The system decision needs to be communicated institutionally from the top down. As you determine your communication channels and craft your messages, consider the following tips:

  • Identify and target layers of leadership to help with the communication process
  • Clearly articulate rationale for the change
  • Create ownership and collective buy-in by recruiting enthusiastic early champions
  • Reinforce core issues consistently
  • Prepare for the unexpected