Selecting non-academic software


Scaling up the use of digital content does not happen with one sweeping gesture. Policymakers and district leaders need to consider several interrelated factors in moving from a textbook-based world to one that is digital.

The process for selecting software for business and school operations, data management and information technology may be somewhat different from the process for academic software. Academic software may be more localized and specific, while the other three usually are applied districtwide and are more complex. Sometimes in school settings, a particular software program may be mandated by an external authority. For example, a state board of education might require that data be submitted for federal and state reporting requirements in a format that is proprietary to a specific vendor's program. Or a district may belong to a regional education cooperative that encourages its school clients to choose among just a couple of software programs for a given function. In cases where choices are limited in these ways, software selection can be an easier process.

More often than not, a school system is left on its own to navigate the process of selecting software. It is important to gather as much information from as many sources as possible before taking software through a formal selection process: reading trade publications, attending conferences, looking closely at websites and talking with peers. Effectiveness data is difficult to come by but is probably the most valuable information when it is available. The evaluation and selection process may follow one of several models, depending on a number of factors: the software functionality sought; the expected cost; district culture; and the impact of the program on overall school operations.


  • A department may perform the evaluation internally.
  • A specific department may perform the evaluation with the assistance of the IT organization.
  • A cross-department selection committee or “task force” may be given the job of evaluating the options and recommending a short list or a finalist program.
  • Personnel from several districts may work together in the selection process to gain vendor attention and leverage buying power.

The selection process for any of these models will follow a similar pattern:

  1. Gather and define needs and requirements for the software solution based on input from department users or the selection committee. This forms the basis of the evaluation rubric, a form listing each criterion and a way to score how well the software meets that requirement or need. (A minimal scoring sketch follows this list.)
  2. Compile a document to communicate those needs and requirements to the school community and external parties. The document should include a list of specific criteria by which the software will be evaluated as well as the evaluation rubric.
  3. Issue an RFI or RFP. For larger purchases (amounts may vary depending upon district size and purchasing laws and practices), the contents of that document may eventually be turned into a request for information or request for proposal, which is posted through normal purchasing channels.
  4. Advertise for possible candidates. In addition to advertising through the normal school district purchasing processes, this step may also include going to peer districts and schools for recommendations or using online resources such as education organizations, software portals or K-12 publications that compile lists and descriptions of potentially useful software.
  5. Develop a “finalist” list of software candidates from submitting providers. The department or selection committee should do an initial run-through of all possible candidates against the evaluation rubric to shorten the list to a manageable group of likely choices. This initial evaluation can be handled by running a demonstration version of the software; reviewing program documentation and product materials; and checking in with peer districts and schools and reference contacts provided by the vendor.
  6. Make the final selection. The handful of candidates that are left to choose from should be re-evaluated against the evaluation rubric based on working with a demonstration version of the software and discussions with the software vendor. For major purchases, the district or school may request that the vendor come on site to deliver a presentation and address questions. The score from the evaluation rubric will determine the final selection.
  7. Provide training. The people who install and maintain the software – system administrators, program administrators and line staff users – will need training and support to ensure that the transition to the new software is as easy as possible and that the software is used effectively over time. This training may be available from the software developer as part of the purchase price or for a separate fee. Training on new software is not the place to save pennies.
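
The rubric scoring described in steps 1 and 6 is simple weighted arithmetic, and it can help to see it worked through. The sketch below is a minimal, hypothetical illustration in Python; the criteria, weights, vendors and 1-5 scale are assumptions made for the example, not part of this guide.

    # Minimal sketch: tallying weighted rubric scores for finalist software.
    # Criteria, weights, vendors and the 1-5 scale are hypothetical.
    CRITERIA_WEIGHTS = {
        "platform_requirements": 3,  # must run on existing hardware and browsers
        "ease_of_use": 2,
        "reporting": 2,
        "cost": 3,
    }

    # Each candidate is scored 1 (poor) to 5 (excellent) on every criterion.
    scores = {
        "Vendor A": {"platform_requirements": 5, "ease_of_use": 3,
                     "reporting": 4, "cost": 2},
        "Vendor B": {"platform_requirements": 4, "ease_of_use": 4,
                     "reporting": 3, "cost": 4},
    }

    def weighted_total(candidate):
        """Sum of (score x weight) across all rubric criteria."""
        return sum(scores[candidate][c] * w for c, w in CRITERIA_WEIGHTS.items())

    # Rank candidates from highest to lowest weighted total.
    for vendor in sorted(scores, key=weighted_total, reverse=True):
        print(f"{vendor}: {weighted_total(vendor)}")

In practice, each committee member completes the rubric separately, so scores would typically be averaged across evaluators before the weighted totals are compared.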

Software evaluation rubrics

Many examples of school district software evaluation rubrics can be found online. The rubric is intended to be used by each person involved in the selection process. Many rubrics list the selection criteria in the left column, followed by columns for a scoring metric, such as a numeric scale.

More complete rubrics also include descriptive criteria under each scoring column to help the evaluator judge how closely each objective is met.
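
For example, one row of a more complete rubric might spell out what each score means for a single criterion (an illustrative row, not drawn from an actual district rubric):

  • Reporting
      1 – Fixed, built-in reports only; output format cannot be changed
      3 – Built-in reports can be customized to the required format
      5 – Users can create new reports as needs evolve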

Although each type of software (financial vs. web filtering vs. learning management system) will have needs and requirements specific to the category, most software evaluation rubrics include some basic elements:

  • Platform requirements: Will the software run on the hardware and operating systems we have? Will it work on the browser we use?
  • Software model: Is the software available in the delivery model we prefer—on-premises vs. software-as-a-service vs. cloud-based?
  • Installation: How easy is the software to deploy?
  • User help: How comprehensive is the documentation or other help guidance?
  • User interface: Is the interface aesthetically pleasing? Are the graphics useful?
  • Ease of use: How easy is it for the user to find what’s needed in the software and interact with the functions of the program?
  • Content: Is the information shown on the screen accurate, relevant, current, complete and grammatical?
  • Performance: Is the software responsive to user input?
  • Reporting: Does the software provide reports in the format or data output required? Is it possible to customize the reporting? Is it possible for the user to create new reports as needs evolve?
  • Technical support: Is the support available in the forms and times required (online text, online forum, live real-time, live delayed response)?
  • Cost: What is the cost for implementation? What is the cost for annual licenses? What are the maintenance or renewal costs? What are the upgrade costs for new versions? Is training available as part of the purchase price or is it an additional fee? Are there cooperative purchase agreements or statewide purchases available to lower costs through volume? (A simple total-cost sketch follows this list.)
  • Company: Does the company have a track record in education? Does it make reference schools or district contacts available for pre-purchase consultation? What is the company’s roadmap for future development of the software?
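
Because cost appears in several pieces (implementation, annual licenses, maintenance, upgrades, training), candidates are easier to compare on total cost over a planning horizon. The sketch below is a hypothetical illustration; the line items and dollar figures are assumptions, not figures from this guide.

    # Minimal sketch: comparing total cost of ownership over a planning horizon.
    # All line items and dollar figures are hypothetical.
    def total_cost(implementation, annual_license, annual_maintenance,
                   upgrade_cost, upgrades_expected, training, years=5):
        """One-time costs plus recurring costs over the planning horizon."""
        one_time = implementation + training + upgrade_cost * upgrades_expected
        recurring = (annual_license + annual_maintenance) * years
        return one_time + recurring

    # Hypothetical comparison: on-premises vs. software-as-a-service.
    on_prem = total_cost(implementation=20_000, annual_license=8_000,
                         annual_maintenance=1_500, upgrade_cost=5_000,
                         upgrades_expected=1, training=4_000)
    saas = total_cost(implementation=5_000, annual_license=12_000,
                      annual_maintenance=0, upgrade_cost=0,
                      upgrades_expected=0, training=6_000)
    print(f"On-premises, 5-year cost: ${on_prem:,}")
    print(f"SaaS, 5-year cost: ${saas:,}")

A longer horizon tends to favor options with higher up-front but lower recurring costs, so the number of years chosen for the comparison matters.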