Collecting and Reporting Data

22 Calculating and Reporting Student Savings 

Jeff Gallant

As discussed in the Data Collection chapter, before you begin collecting data, you should have a plan for what you need to collect, why you need to collect it, and how you will collect it for the purposes of decision-making and reporting. That’s not all, though; the second step in your data strategy is to decide how larger measures, like student savings, are calculated, presented, and analyzed.

Calculation Principles

When you are calculating any savings data to present to the public or to your stakeholders (see Chapter 8, Building Familiarity on Campus, for more on getting to know your stakeholders, and see Chapter 21, Data Collection and Strategies for OER Programs, for more on gathering data to answer stakeholder questions), consider a few key principles in processing this data:

Be able to say “it’s at least this much.” Take every caveat into account when calculating and presenting savings data, and report conservatively. For example, use only the direct reported savings from OER program grantees in calculations, rather than the reported potential for more savings if an entire department adopts the resources. If your actual savings turn out to be higher than what you reported, that’s great news for all when it’s revealed later on!

Use transparent methods whenever possible. Some of the numbers you have will be directly taken from surveys, reports, and faculty-reported savings estimates, but others (such as applying an average textbook cost savings estimate per student in one course) may have to be taken from external sources. If possible, use external sources that report their own data and estimates openly, and link to those sources within your reports and shared spreadsheets.

Adapt your methods to the needs of your stakeholders. If you notice an emerging critical question that you did not anticipate in your collection strategy, try to adapt your calculation methods. For example, in its first year, Affordable Learning Georgia started tracking annual savings estimates from its faculty teams; when it was clear that administrators needed semester-by-semester data to answer their questions about OER use on campus, we changed our new grantees’ reporting methods to include semester-by-semester estimates and checked in with our past grantees on this as well.

Reporting Principles

Be as transparent as possible, whenever helpful. While many of your stakeholders will just need a summary report to keep up-to-date on your program, providing all of the important data in its primary form (spreadsheets, a database) and sharing your methodologies will help in “sharing the work” of how you reached the numbers in your report and assist others in answering their more complex questions. Try to avoid sharing extraneous and/or sensitive data: your tracking sheets may contain things like grants office contacts, invoice and purchase order numbers, or personally identifiable information (PII) that does not need to be shared.

Balance this transparency with up-front simplicity. Media outlets and administrators do not have much time to parse your data, so while having unprocessed or highly technical data available is a plus, providing data in only a non-usable or jargon-heavy way is like laying opacity on top of your intended transparency. If your own communications office or upper administration wants to know about your numbers, consider giving a “short version” in reports and communications first, and then put the details below that or in an attachment for reference.

Provide a slow-moving metric for the public. Similar to keeping things simple up-front, media outlets and administrators will be confused by a constantly moving set of impact numbers. For example, if you add student savings reported by faculty on a rolling basis and then immediately remove savings when faculty report discontinuing OER, newer reports and articles will start looking “wrong” and creating confusion. In this case, you may want to provide consistent numbers annually in reports and keep per-semester numbers in your shared spreadsheets for those who want to dive deeper into the data.

Reach out with good news. You may have wonderful reports that share great news about your program through your website or a newsletter, but sometimes you need that to go further. While much of this chapter focuses on the creation of reports to suit various stakeholders and media outlets, be sure to use all possible channels to market your program’s successes. Provide appropriate visuals: if you’re sharing savings across multiple institutions or departments, use tables; if you’re tracking student savings across a period of time, use line graphs or bar charts; if you’re showing the percentage of your grant teams that saw positive, neutral, or negative changes in outcomes, use pie charts. For more on marketing, see Chapter 7, Marketing Your OER Program.

Using Direct Data from Grants

After planning your data collection strategy for course redesign / OER implementation grants and confirming a fully completed round of projects, using the collected savings data or savings estimates from grant projects seems far simpler than finding an overall per-course or per-textbook average for an entire institution. However, the complexity of using direct project data from grants lies in the details:

What if a grant project is discontinued? Not every grant team will use OER or no-cost or low-cost materials forever, due to individual decisions, changes in leadership, or changes in the commercial market. How will you find out which projects have discontinued the use of these materials?

What if grantees expand their use of OER beyond your estimates? Similar to discontinuations, how will you find out which projects have proliferated the use of OER within the department? ALG addresses this through annual sustainability checks, which are explained in detail further in this chapter.

What if grantees start using different OER materials? Can you still ascribe savings to a grant if the team moves from the original materials to a different set of OER or no-cost or low-cost materials? The grant project most likely introduced them to OER, and they’re now improving their course on their own with that work in mind. ALG does report these same-course cost savings if grantees move to different open materials.

What if grantees expand their use of OER to new courses? Can you ascribe to the grant the savings generated when a team uses OER in a different course that wasn’t part of the grant project? As with the different-materials issue, there’s no easy answer. ALG does not directly report this expanded use of OER in new courses, but the expansion could be measured through accurate no-cost and low-cost course markings.

What happens when grantees, especially project leads, leave their institution? ALG’s discontinued projects have largely been caused by turnover, whether a Project Lead leaving for another institution (the most common cause) or a new Department Chair mandating the use of a particular set of resources for a course (far rarer). Not only are projects discontinued due to turnover, but data is far harder to collect when the Project Lead for a grant team suddenly disappears. How will you manage turnover in grant projects? ALG checks in with Project Leads yearly in sustainability checks, but because Project Lead turnover complicates who is able to respond for a team, we are looking at department-focused methods via campus advocates moving forward.

Adoption Data and Campus Store Partnerships

The University of Missouri’s Affordable and Open Educational Resources Program has an integrated partnership between the program and the Mizzou Stores, the bookstores for the University of Missouri. Beyond the potential for bookstores to provide print-on-demand services for open textbooks, the Mizzou Stores actively reach out to faculty to report OER and no-cost material adoption within their collection of textbook adoption data. This is a highly effective campus partnership: what at first seems like an unlikely pairing is actually an alliance with a common goal, student success (Bell, 2018).

Partnerships like the one at the University of Missouri have a common goal: collecting textbook adoption data per the nationwide mandate for higher education institutions within Section 133 of the United States Higher Education Opportunity Act (HEOA). Section 133 is not just a mandate for collecting adoption data: the section encourages institutions to work together to find ways to reduce the cost of textbooks to students. While campus store partnerships are not always prevalent or possible, it stands to reason that the common goal of reducing textbook costs and the mechanics of reporting adoption data would foster some partnerships between adoption-data collectors (often campus stores) and OER program managers (Stocker, 2018).

Adoption Data Caveats

Like nearly all data collected for reporting, adoption data will come with caveats:

Incomplete Adoption Data: By far the most common issue with using textbook adoption data is getting adoptions reported for 100% of courses. In practice, this is quite difficult, depending on the data collection methods, whether policy makes reporting textbook adoptions mandatory, who reports the adoptions (instructors, chairs, deans, administrative assistants), and how departments account for last-minute instructor assignments to sections and low-enrollment cancellations.

OER Knowledge: If faculty are asked if they adopted an open textbook and haven’t received any training on what “open” means, it stands to reason that this OER adoption data will only be as accurate as each individual faculty member’s own knowledge of what it means for something to be open. Even if textbook adoption data at an institution is at 100%, a faculty knowledge gap on OER will render this data inaccurate.

What’s Actually Required or Optional: If syllabi are being analyzed for adoption data, this can be a more accurate, if more time-consuming, way of collecting textbook adoptions; however, what materials are labeled as “required” and “optional” may differ in practice, and there isn’t a way to determine this from just looking at syllabi.

A Hidden Cost of “Free”: Student Privacy: Just because a resource does not come with a dollar amount does not mean it’s open or even zero-cost. For example, some seemingly zero-cost proprietary platforms might require the student to create an account, and the platform’s company may give this student account data or even behavioral data to a third-party company for monetary compensation. The cost to the student, in this case, would be privacy; truly open educational resources, or even those marked as zero-cost, should not require this.

Print Versions of Open Textbooks: The same print-on-demand advantages of a campus store partnership can complicate adoption reporting as well:

  • If an OpenStax textbook is available as a print copy for $50 but it’s optional due to the digital version being open, does every instructor know and report this as a $0 course material? Do they have guidelines on what to do in this situation?
  • Do some instructors require the printed version of an open textbook?
  • Does the Campus Store make it clear to students at registration and when shopping for texts that a free digital version is available?

Your campus store may be just one example of an on-campus partner that can share helpful data and analysis advice. Affordable Learning Georgia partners with the Research and Policy Analysis office for enrollment data and course marking reports. Departments and offices with a goal of good data stewardship can help strengthen your data collection, analysis, and reporting.

A Common OER Calculation Issue: Using an Average Materials Cost per Course (or Course Section)

When you have institution-wide or system-wide data on how many courses are using OER, or how many courses have zero-cost or low-cost resources, you will often have only this data – not how much it would have cost to use a commercial textbook (Open Oregon Educational Resources 2018). When possible, use actual cost savings as reported by the faculty teaching the course or by your bookstore’s data; otherwise, arriving at a total savings estimate is extremely difficult without an average per-student, per-course savings estimate. Various groups have modeled how to arrive at such an estimate:

  • The Student PIRGs used reports of per-course savings when transforming a course from using a commercial textbook to OER to reach an estimate of $100 average cost savings per course (Student PIRGs 2015).
  • Nicole Allen and David Wiley presented at the 2016 Open Education Conference on multiple cost-savings studies and concluded that $100 was a reasonable per-course savings estimate (Allen and Wiley 2016).
  • In 2018, SPARC and Lumen Learning came to a more specific estimate using disaggregated IPEDS data to reach an estimate of $116.94 per course (Nyamweya 2018).
  • The National Association of College Stores provided an average textbook cost using college store data to reach an $82-per-textbook average. This is slightly different from a per-course average, though, as the average includes low-cost scholarly monographs, novels, and trade publications, which are often assigned in a group of required resources for one course (Open Oregon Educational Resources 2018).
  • OpenStax used the 2015-2016 NCES National Postsecondary Student Aid Restricted-Use Data File and an internally-calculated average of 7 courses per year that would likely require a textbook to reach a $79.37-per-course textbook cost average (Ruth 2018).

Affordable Learning Georgia used the NCES 2016-2017 Baccalaureate and Beyond Codebook, the latest national report available, for the average textbook costs per student per year (NCES 2021). ALG took the average student-reported expenditures per year on digital textbooks and print-only textbooks from the NCES report ($222.75 + $523.04 = $745.79) and divided this per-year average by 7, per OpenStax’s estimate of 7 courses per year, arriving at a per-course savings figure of $106.54 for zero-cost materials courses.

Because Affordable Learning Georgia also has a low-cost designator for course sections with materials costing $40 and under, we subtracted the maximum cost for a low-cost designator from this average ($106.54 − $40.00), arriving at $66.54 in savings for low-cost materials courses.
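The arithmetic above can be sketched in a few lines of Python; the figures come from the NCES codebook and OpenStax estimate as cited in the text, while the variable names are mine:

```python
# ALG's per-course savings estimate, as described above.
# Student-reported average annual textbook spending (NCES 2021
# Baccalaureate and Beyond codebook figures cited in the text):
digital_textbooks = 222.75
print_textbooks = 523.04

# OpenStax's estimate of courses per year likely to require a textbook
courses_per_year = 7

# Per-course savings credited to a zero-cost materials course
per_course_savings = (digital_textbooks + print_textbooks) / courses_per_year
print(round(per_course_savings, 2))  # 106.54

# ALG's low-cost designator caps materials at $40, so low-cost courses
# are credited with the zero-cost average minus that cap
low_cost_cap = 40.00
low_cost_savings = per_course_savings - low_cost_cap
print(round(low_cost_savings, 2))  # 66.54
```

Multiplying these per-course figures by the enrollment of each marked section then yields a conservative total savings estimate.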

Sustainability Checks: An Example from Affordable Learning Georgia

In order to keep our reports as accurate as possible, ALG distributes an annual survey to ensure that grant teams at least one year past the final semester of their project are still using OER and/or no-cost or low-cost materials. If a project has discontinued the use of these materials and has returned to using commercial materials, the project is marked as Discontinued in a sustainability check column for that year, and its savings are zeroed out for the year. To keep these methods conservative, teams that have not responded are marked as Unknown and are also zeroed out, even though they may still be using OER.

To eliminate having to manually zero out discontinued projects or fill in continued projects for each semester’s savings estimates, ALG uses data validation and if-then formulas.

Example: ALG Sustainability Checks and If-Then Mechanics in Excel

Sustainability Check Status Column

Use one column for your annual sustainability check that contains the status of each project. There should be only a few different statuses for each sustainability check, such as Continued, Discontinued, In-Progress, and Unknown. Depending on your situation, you may have a few extra statuses as well (for example, we have Continuous Improvement grants, which do not generate student savings). ALG uses one row per grant project and tracks savings in semester columns.

Eliminate the potential for typographical variations or errors with Excel’s Data Validation feature. Make a list of the terms you want included in the sustainability check column in a separate sheet, and point the Data Validation rule to that list. You can even enable a drop-down option for the column, so that each cell offers the terms from the list in a drop-down menu.
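If you maintain the same tracking data outside of Excel, the equivalent of a Data Validation list is a simple membership check; here is a minimal Python sketch, using the example statuses above (the function name is mine):

```python
# Mirror of an Excel Data Validation list: only these statuses are allowed
# in the sustainability check column (statuses from the ALG example above).
ALLOWED_STATUSES = {"Continued", "Discontinued", "In-Progress", "Unknown"}

def validate_status(value):
    """Reject typographical variants before they reach the tracking data."""
    if value not in ALLOWED_STATUSES:
        raise ValueError(f"Invalid sustainability status: {value!r}")
    return value

validate_status("Continued")    # passes
# validate_status("continued")  # would raise ValueError: the check is
#                               # exact-match, just like an IF formula
```

The exact-match behavior is the point: a stray lowercase or misspelled status would otherwise silently fail the if-then formulas described below and drop a project’s savings from the totals.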

Other Columns

A Sustainability Check may be the appropriate time to update other data in your spreadsheet. Other columns used in a Sustainability Check may include:

  • Average Students Affected Per Semester (updated data)
  • Cost Savings Per Student (updates to commercial textbook prices, updates to low-cost resource prices)
  • Reason for Discontinued Projects (Turnover, Quality Issue, Departmental Mandate, etc.)

Integrating Sustainability Checks into Savings Data with If-Then Formulas

Once your sustainability checks are complete, in the corresponding semesters’ data columns, select one of the multipliers, such as the number of students affected. Instead of just grabbing the per-semester student count, put an if-then formula in the cell:

=IF([the cell in your Sustainability Check column]="Continued", [the cell in your Students per Semester column], 0)

This formula looks at the Sustainability Check cell in your grant project’s row. If the cell says it’s Continued, the formula puts the Students per Semester (as reported by each grant team) into the formula’s cell. If the Sustainability Check cell says anything else, the formula will place a 0.

By using the formula above for all the cells in the column, you have a sustainability check-dependent student count for the whole semester, without the need to fill in individual cells with student numbers.

What’s great about if-then formulas is that your data updates automatically when something changes. Suppose you haven’t heard from a Project Lead for two years and therefore marked their project as Unknown, but they suddenly get back to you and say that the whole team has been using OER all along and is still going. Great! Set the Sustainability Check cells for those years to “Continued,” and the savings will update automatically because of the if-then formulas in each students-per-semester cell.


The way you calculate and report savings data is essential to running and evaluating your OER program. These ongoing tasks let you see the impact of your team’s hard work, and they give you a clear indicator of the milestones your team has achieved. Reporting your data is a way to celebrate the successes of the program and your contributions to the open movement broadly. A set of calculation principles can help you get started, as can existing models from successful OER programs. Whenever you can, look to other departments within your institution or system to build on previous reporting and avoid duplication; knowing how externally reported data is collected and analyzed will also be crucial.

Recommended Resources

For the purposes of these two data chapters, there are CC BY 4.0-licensed files demonstrating many of the above techniques and principles in action:

  • Demo Excel File for Grants Tracking: While the savings and student numbers in the file are randomized numbers within a given range, the formulas are set up to function in the same way as the described methods above.
  • Power BI Dashboard for the Excel File: This should automatically work with the data in the Demo Excel File to create a data dashboard and a one-page flyer. Be sure both files are in the same folder; it’s possible you’ll need to add the Excel file yourself depending on how Power BI is working on your particular computer.
  • Worksheet for the Excel File: Contains five questions that are easily findable, five that will take more time to answer, and five that will require additional information such as qualitative data.

Examples of OER Data Reporting

Key Takeaways
  1. Balancing transparency and simplicity is an essential reporting principle to keep in mind when analyzing your data and constructing reports.
  2. Giving frequently-updated or live statistics can be helpful for key stakeholders who are close to your program, but the public and media outlets need a solid set of slow-moving numbers in order to effectively tell the story of your program.
  3. Instructors and departments will occasionally discontinue the use of OER due to various factors, especially faculty turnover. Be sure to regularly check on the sustainability of each grant project or reported OER adoption in order to provide accurate impact estimates in your reports.
  4. Keep your estimates for average costs as transparent as possible, and be prepared to explain how you estimate these savings.
  5. If possible, partner with the departments and/or organizations which have course materials adoption data, such as a campus store or auxiliary services department.


Bell, Steven. 2018. “Archived webinar: We’re In This Together: Better Library-Bookstore Relationships for Student Academic Success.” Open Oregon Educational Resources.

Open Oregon Educational Resources. 2017. “Is the average cost of a textbook $100?”

Jensen, Kristi and Shane Nackerud, eds. 2018. The Evolution of Affordable Content Efforts in the Higher Education Environment: Programs, Case Studies, and Examples. University of Minnesota.

National Center for Education Statistics. 2018. “2015-16 National Postsecondary Student Aid Study (NPSAS:16) Restricted-Use Data File.”

National Center for Education Statistics. 2021. “Baccalaureate and Beyond: 2016/2017 Codebook By Subject.”

Nyamweya, M. 2018. “A New Method for Estimating OER Savings.” SPARC.

Ruth, David. 2018. “48 Percent of Colleges, 2.2 Million Students Using Free OpenStax Textbooks This Year.” OpenStax.

University of Missouri. 2016. “Affordable and Open Educational Resources.” Accessed January 30, 2022.

U.S. Government Publishing Office. 2008. “Higher Education Opportunity Act.”

Wiley, David and Nicole Allen. 2013. “Save Students a Billion Dollars.” Accessed January 30, 2022.



The OER Starter Kit for Program Managers Copyright © 2022 by Jeff Gallant is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.