14 Assessment
When course marking is well communicated, students have uniform and equitable access to details that allow them to make informed decisions about course registration. From an institution’s perspective, course schedules act, at the most basic level, as a mechanism to organize and manage classroom space and resources. Course schedule policies explicitly aim to maximize the efficient use of space and time to meet student needs (Boise State University 2011; University of Iowa 2020). For example, Drexel University’s (n.d.) course schedule policy states that its policies exist to enable students to create conflict-free schedules and to graduate in a timely manner. For many students, graduating in a timely manner is of paramount importance as the cost of tuition and fees becomes increasingly and prohibitively expensive. Between the 2006/07 and 2016/17 academic years, the cost of undergraduate tuition, fees, and room and board rose by 31% at public institutions and 24% at private institutions (U.S. Department of Education, National Center for Education Statistics 2019). While course markings primarily provide students with information to help them plan their academic programs, institutions can also use them to collect data to evaluate their course marking programs and measure other effects, including teaching loads, student behaviors, and student needs.
Open and affordable course marking initiatives may develop organically within an institution, collaboratively as a part of a consortial effort, or responsively to meet changing state legislative requirements. Recognizing the motivations that drive a particular course marking initiative is imperative in planning for and measuring success. Accordingly, open and affordable course marking initiative assessments will differ.
How the data will be used and to whom the data will be reported will determine what type of data should be collected. For instance, gathering and reporting data within a single institution will likely require codification and coordination between different departments. Standardizing the reporting mechanisms can lead to effective and reliable data collection and should be considered best practice. For those working with a consortial or statewide initiative, assessment may be complicated by the variety of reporting mechanisms and types of data collected. This was the case with the Affordable Learning Georgia (2020) program, whose impetus for implementing course markings was, in part, assessing its system-wide OER grant program. Though all 26 institutions involved in Affordable Learning Georgia use a system-wide registration system, differences in data entry processes and a lack of enforcement made consistent, system-wide data collection impossible (Chae et al. 2019). It is advisable to have initial conversations with relevant stakeholders about what, how, and why data are to be collected and how those data can be mediated among various systems.
Creating best practices for assessing the impact of open and affordable course marking initiatives is complicated by the fact that even the most established open and affordable course marking projects are still—to some extent—under development. This chapter will outline potential strategies for measuring the impact of open and affordable course marking initiatives by discussing assessment methods, awareness, compliance with mandates, cost savings, student success, and enrollment.
Planning for Assessment
Planning for assessment before implementing the initiative is prudent. The planning can be an informal or formal process, depending on the needs, timeline, and resources available. If the effects of a course marking initiative need to be reported to other entities, such as administrators or peer institutions, a structured approach may facilitate the process. Running through the list of questions below might be sufficient for initiatives operating without reporting requirements.
What are the goals of this course marking initiative?
Developing goals early in the process makes it easier to conceptualize and measure their effects. Consider creating goals that are SMART—that is, specific, measurable, achievable, relevant, and time-bound. SMART goals should be specific to the needs and ability of your initiative. A sample SMART goal for an institution looking to start up a data-generating course marking program might be to establish a list of persons responsible for course marking reporting for all academic departments within six months; whereas a sample SMART goal for an institution with an established program might be to double the number of students enrolled in courses using an open or affordable textbook over the course of four semesters.
Who are the stakeholders?
Think about who this initiative impacts. Identifying stakeholders can help identify the target population for a survey or focus group to better identify the strengths and weaknesses of a course marking initiative. Stakeholders may include students, instructors, registrars, or partners from other units. For more detail, Part II (Stakeholders) provides a substantive overview of these groups.
Who are the collaborators?
Collaborators are stakeholders who help implement the initiatives because they have some level of influence or power over the process. Examples of collaborators include administrators, instructors, department heads, libraries, institutional research departments, and registrar and student affairs offices. For example, students are stakeholders as users of course markings, whereas student government is a collaborator through active advocacy and feedback.
What is the timeline of the course marking initiative?
Generally speaking, creating a timeline using established goals can help keep an initiative on track. For evaluative purposes, marking specific times to evaluate and revisit specific goals, collaborations, or processes can be helpful to monitor progress and address changing needs and priorities. Since course marking is implemented in phases and depends on collaborations with others, flexible timelines are vital. For example, since course marking requires working with specific departments, it is helpful to schedule initial meetings with key people in those departments, touch base with them throughout implementation, and check back with a post-implementation follow-up.
How will I measure the data?
Specific questions about measuring data include deciding on what data to collect while considering stakeholders, whether the evaluation will take a quantitative or qualitative approach, and what resources are needed to collect and measure data (e.g., survey instruments, incentives, software).
Collecting and Using Student Data
Administrators, admissions, and enrollment management departments collect a large amount of student information. This information might be used to analyze student academic cycles through student profiles, forecast academic offerings and financial state of an institution, or propose ways to improve student learning. Just as institutional review boards exist to protect research participants, instructors and administrators collecting and using student data should consider whether the collection and use of student data is ethical and necessary for the purposes of their assessment.
Though instructors adhere to the federally mandated Family Educational Rights and Privacy Act, which works to protect the privacy of students’ educational records, some questions about the ethical collection and use of student data remain unanswered (Jones 2019). Newsworthy international cases such as the 2017 Equifax data breach or the 2018 Facebook-Cambridge Analytica data scandal exemplify how large amounts of personal data make valuable and sometimes vulnerable targets for exploitation. The ethical collection and use of student data is especially important to consider when contracting with third-party vendors whose ethics may not necessarily align with those in academia. In August 2019, Senators Dick Durbin, Edward Markey, and Richard Blumenthal sent letters to educational technology companies expressing their concern about these companies’ handling of student data (Durbin 2019).
The proliferation of learning analytics, the process of gathering and analyzing data in order to profile learners, may assist instructors in better understanding the variables that contribute to student success (Alexander et al. 2019). Most of the student data are collected through the digital interfaces of learning management systems such as Blackboard or Canvas. At an individual level, collecting student profiles could potentially help instructors, advisers, and student success staff provide early intervention through customized emails at critical points in the semester based on individual students’ performance and predictive learning analytics (Sclater and Mullan 2017). These methods of early intervention have gained traction as institutions, facilitated by third-party vendors, actively try to figure out best practices to support student persistence, retention, and matriculation. Establishing institutional guidelines for student data collection and use is vital to preserving transparency while optimizing service.
If students’ preferences for open and affordable course markings are collected in student profiles, that data could provide some insight into what types of students—traditional, adult, first-generation, or veteran, for example—might select courses that use open or affordable learning materials. With the potential benefits learning analytics may bring to supporting students, instructors must also critically consider the ways in which learning analytics are susceptible to concerns of consent, bias, privacy, and ethics. Many colleges and universities are already collecting data about students through the learning management system or tracking their location through swipe systems in order to assist students, but benevolent use does not immunize data collection from ethical scrutiny. Understanding the ways in which institutions collect and use data, while respecting students’ autonomy and privacy, can help open educational resources (OER) practitioners better understand how collecting and analyzing open and affordable course marking data can fit into the larger landscape of learning analytics.
Collecting accurate and comprehensive student data is vital in colleges and universities where there is a heightened need for instructors to demonstrate return on investment as a result of neoliberal policies. Neoliberalism in academia conceptualizes higher education as a free market in which students are consumers and education is a commodity rather than a social or public good (Saunders 2007). Given decreases in funding to state colleges and universities over the last few decades (Chronicle of Higher Education 2014; Pew Trusts 2019), public institutions increasingly rely on revenue generated by tuition and other income streams. Institutions have traditionally tracked student factors such as grade point averages, major selections, number of credits enrolled, and number of credits attempted to help determine individual students’ persistence and retention. Marking open and affordable courses can be another factor in attempting to understand student persistence and retention.
Institutions develop initiatives and programs, as well as collect and analyze student data, with the intention to improve students’ higher education experiences. These initiatives can be especially critical in a student’s first year. When administrators mark courses with designations and descriptions, they can track the implementation of institutional initiatives and analyze whether these interventions—such as offering more service learning courses—have had an impact on retention (Gardner 2002, 146). Though having this quantitative data is a valuable piece of the evaluation process for retention initiatives, the student data should be considered in connection with other factors not captured in the student information system (SIS), including external factors at home or work, that also contribute to attrition. Akin to the ways in which service learning designations may function, open and affordable course markings are another form of institutional intervention which may be measured against student retention and persistence.
Analyzing Data
Data for course marking initiatives may come from a variety of sources, such as reports from the SIS or focus groups with stakeholders involved in the course marking process. Since many reports draw data from a complex array of information sources, analyzing data requires a basic understanding of quantitative and qualitative methods and the ability to decide which method is most appropriate to use in the evaluation process. This overview of quantitative and qualitative methods will not be exhaustive or comprehensive, but explains some basic principles one must understand when considering how to evaluate an open or affordable course marking initiative.
Qualitative methods measure observations and data that are not numerical. Some methods of gathering qualitative data include focus groups, interviews, and observation. Using qualitative methods can provide insight into processes, experiences, and perceptions. Researchers often choose qualitative methods to explain and/or create a narrative of an experience or situation. A compelling narrative or case study about student agency can be a strong indicator of the success of open and affordable course marking projects.
Quantitative methods are used to measure countable aspects of a course marking initiative—for example, the number of courses marked, the number of students enrolled in marked courses, or the number of programs or departments involved in the marking initiative. Using quantitative methods can provide valuable insight into the reach of the course marking initiative. Measuring the reach of an initiative can be particularly helpful when demonstrating value to administrators, state officials, or other stakeholders who might be potential advocates or partners. Using quantitative methods requires an understanding of descriptive statistics, inferential statistics, and confounding variables.
Statistics can be categorized as descriptive or inferential, which are used for different purposes. Descriptive statistics report the basics of what is measured. In the case of course marking, descriptive statistics might be as simple as calculating the number of open and affordable course markings. On the other hand, inferential statistics use probability theory to infer other meanings and draw new conclusions from the data set. For example, inferential statistics could be used to identify which types of students (e.g., traditional, adult, first-generation, or veteran) are more likely to enroll in courses using an open or affordable textbook in order to graduate more quickly. Having that type of information is helpful when formulating a marketing plan (e.g., partnering with advisers who work with specific student groups). Descriptive and inferential statistics are valuable for their specific purposes.
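To make the distinction concrete, the short sketch below computes a few descriptive counts and then runs an inferential test on the same hypothetical SIS export. The file name, column names, and student categories are illustrative assumptions, not fields any particular SIS actually produces.

```python
# A minimal sketch contrasting descriptive and inferential statistics for
# course marking data; the file and column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

# Each row is one enrollment record; "marked" is True for sections flagged
# as using open or affordable materials.
enrollments = pd.read_csv("marked_course_enrollments.csv")

# Descriptive statistics: report what was measured, nothing more.
marked = enrollments[enrollments["marked"]]
print("Marked sections:", marked["section_id"].nunique())
print("Students enrolled in marked sections:", marked["student_id"].nunique())

# Inferential statistics: test whether enrollment in marked sections is
# independent of student type (e.g., traditional, adult, first-generation, veteran).
table = pd.crosstab(enrollments["student_type"], enrollments["marked"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
# A small p-value suggests some student groups are more likely than others to
# enroll in marked sections; it does not explain why they do so.
```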
If not accounted for, confounding variables introduce bias into the analysis by suggesting an association where none exists or by distorting one that does. When analyzing the impact that course marking has on outcomes such as student awareness, course selection, and persistence, researchers identify and control for confounding variables to avoid distorting the association between exposure to course markings and an outcome (Pennsylvania State University 2018).
For example, choosing a course marked as a service learning course does not necessarily correlate to a higher interest in service learning per se. The course could be the only one offered during an opening in a student’s schedule, taught by a popular instructor, or offered in a preferred format. The same is true for courses marked as using open and affordable materials. Researchers need to continually identify and control for confounding variables wherever possible to make accurate inferences. Transparency requires disclosure when it is too difficult to control for confounding variables, and this limitation must be mentioned when presenting the data to an audience of stakeholders.
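As a rough illustration of controlling for confounders, the sketch below first makes the naive comparison and then adjusts for two plausible confounding variables with a logistic regression. The data file, variable names, and choice of confounders are assumptions for the example, not a prescribed model.

```python
# A minimal sketch of adjusting for confounders when comparing completion in
# marked versus unmarked sections; file and variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

records = pd.read_csv("course_outcomes.csv")

# Naive comparison: raw completion rates, ignoring confounders.
print(records.groupby("marked")["completed_c_or_better"].mean())

# Adjusted comparison: the coefficient on "marked" is now estimated while
# holding prior GPA and credits attempted constant.
model = smf.logit(
    "completed_c_or_better ~ marked + prior_gpa + credits_attempted",
    data=records,
).fit()
print(model.summary())

# Confounders absent from the data (instructor popularity, schedule fit)
# cannot be adjusted for and should be disclosed as limitations.
```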
Evaluating an initiative may require some familiarity with the principles and methods of data analysis. A number of open textbooks have been published on the subject. Introductory Statistics (Illowsky and Dean 2020) introduces the basic statistics principles necessary for data analysis.
Assessing Open and Affordable Course Markings
By marking open and affordable courses through the SIS, institutions create an opportunity to perform basic assessment of courses that use open and affordable content and to run reports based on student success metrics. Running reports through the SIS reduces the possibility of sampling error and duplicative reporting processes.
The Open Education Group published the Guidebook to Research on Open Educational Resources Adoption, which outlines ways to measure the impact of OER on student and instructor use, cost savings, student outcomes, and perceptions of OER (Hilton et al. 2016a). The guidebook provides specific research questions and measurable variables, identifies the confounds and offers suggestions for controlling for these variables, and indicates statistical methods for analyzing the data.
The data gathered for these processes may be collected via surveys, questionnaires, reports from instructors, or reports from the SIS. The OER Champion Playbook (2017) includes “plays” created to help one identify and measure goals related to the impact of a program, the amount of cost savings, instructor and student satisfaction, progress and completion, as well as student learning and engagement.
Wiley (2019a) of the Open Education Group also created the “OER Adoption Impact Calculator,” an easy-to-use, web-based interface that allows one to enter data fields such as the number of enrollments using OER, the average cost of the textbook(s) replaced, and the average amount students spend when using OER. The tool then calculates the total textbook cost to students, the course throughput rate, additional tuition revenue from increased enrollment intensity, tuition revenue refunded to students who drop, and the net change in institutional revenue. The Guidebook to Research, the “OER Champion Playbook,” and the “OER Adoption Impact Calculator” are great resources for those who are new to conducting basic assessment and research on OER.
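The arithmetic behind such cost savings estimates is simple enough to reproduce locally. The function below is a back-of-the-envelope sketch of one such calculation; it is not the calculator’s actual model, and the dollar figures are illustrative.

```python
# A back-of-the-envelope estimate of student textbook savings; the formula
# and figures are illustrative assumptions, not the calculator's own model.
def estimated_student_savings(enrollments: int,
                              replaced_textbook_cost: float,
                              avg_cost_with_oer: float = 0.0) -> float:
    """Estimated textbook spending avoided by students in OER sections."""
    return enrollments * (replaced_textbook_cost - avg_cost_with_oer)

# Example: 850 enrollments, a $120 commercial textbook replaced, and an
# average of $5 still spent per student (e.g., optional print copies).
savings = estimated_student_savings(850, 120.00, 5.00)
print(f"Estimated student savings: ${savings:,.2f}")  # $97,750.00
```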
Adding a course designation for open and affordable content in the course registration system and schedule of classes provides a mechanism for running reports to track open and affordable usage across an institution. Case studies from Houston Community College and State University of New York explicitly indicate that one of their goals in developing course marking initiatives was to create methods of tracking and reporting open and affordable course material usage. Many instructors independently adopt textbooks without a formal system in place to account for the actual number of courses marked or to assess the impact of open and affordable courses on student success. Marking the courses is the first step in collecting data so that the information can be put to use. Institutions focused solely on cost savings might use descriptive statistics to measure potential savings, whereas other institutions might use statistical inference to measure the impact of courses using open or affordable textbooks.

Evaluating open and affordable course markings is not always straightforward, especially given the conflation of, and low awareness surrounding, the terms “open,” “OER,” and “affordable.” Notably, Houston Community College stopped using OER as a course marker and began marking Low Cost and Zero Cost courses. As more institutions fold OER courses into no-cost and low-cost course markings, researchers lose the ability to distinguish OER courses from non-OER courses through a schedule search, which may have unintended detrimental effects on measuring impact. The focus on student cost savings, as well as the cyclical and internal textbook adoption process, makes it difficult to measure impact factors beyond cost savings.
Open and Affordable Resource Awareness
As nascent course marking initiatives expand and new initiatives are created, program coordinators and researchers should focus on awareness of course marking among students and instructors. Assessing awareness reveals whether students and instructors know about open and affordable course markings and use them in making decisions. Coordinators interested in expanding course marking initiatives might consider collecting information that sheds light on students’ enrollment decision-making processes, which can serve as a compelling argument for expanding course marking to include open and affordable designations.
There are a number of ways in which course marking initiatives can contribute to the general awareness of open and affordable concepts, materials, and programs. The act of marking open and affordable courses naturally leads to more awareness of courses that use open and affordable materials as students discover the markings in the SIS and instructors notice their peers using and talking about open and affordable materials in the classroom. Each institution represented in Part VII (Case Studies) made a conscious decision to use specific terminology when identifying OER, no-cost, and low-cost courses for their institutional audience.
The term “OER” does not mean much to the average student. Whether a course marking includes “OER” as a designator typically depends on policy-driven promotion of OER or open education, or on the prioritization of OER within an affordable content initiative. Since implementing open and affordable course marking, Houston Community College has seen an increase in the number of courses marked. Though the increase could be attributed to a number of external factors, marking these courses is a significant step in furthering discoverability and overall awareness across the institution.
To promote awareness, Central Virginia Community College’s schedule of classes clearly defines OER at the point of use (see fig. 18.1). Despite the prominent OER definition, some students mistakenly believe that courses marked as OER are delivered online. Even instructors who adopt OER may be confused about the term, particularly at institutions that also mark zero- or low-cost materials. At Houston Community College, Nathan Smith notes that the reported number of students enrolled in OER courses is not accurate because many instructors were unaware of the differences between open and other affordable course materials, conflating the terms and perhaps overestimating the number of students actually enrolled in OER courses.
For institutions that also use zero- and low-cost designators, those creating marketing materials and communications must be especially vigilant in preventing confusion around the terms, and program coordinators should develop a regular assessment routine to measure understanding of the terminology. Academic advisers and staff in registrar offices should be prioritized in educational outreach efforts so they can explain the designations to students. Usability tests or cross-sectional focus groups could be useful mechanisms for assessing potential users’ understanding of the language used. Additionally, a brief follow-up survey of users who have encountered marketing materials designed to explain the terminology and branding would be useful in assessing the effectiveness of various outreach techniques.
In the State University of New York system, Tompkins Cortland, Fulton-Montgomery, and Dutchess Community Colleges’ course registration systems provide a clickable link to a definition of OER. At Tompkins Cortland Community College, students can filter courses to “Show only OER courses.” Assessing these awareness strategies may include counting the number of clicks on the definition link or how frequently students used the “Show only OER courses” limiter.
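If those interaction events can be exported from the SIS or a web analytics tool, tallying them is straightforward. The sketch below assumes a hypothetical CSV export with an event_type column; the event names are placeholders, not values any vendor system is known to emit.

```python
# A minimal sketch for tallying awareness signals such as clicks on an OER
# definition link or uses of a "Show only OER courses" limiter.
# The log file, column, and event names are hypothetical.
import csv
from collections import Counter

event_counts = Counter()
with open("registration_events.csv", newline="") as f:
    for row in csv.DictReader(f):
        event_counts[row["event_type"]] += 1

print("Definition link clicks:", event_counts["oer_definition_click"])
print("'Show only OER courses' limiter uses:", event_counts["oer_limiter_applied"])
```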
Mt. Hood Community College and Nicolet College decided not to use OER as a designator. Instead, these campuses identify classes with affordable course materials as either No Cost or Low Cost in their registration systems. Both institutions concluded that students do not clearly understand the term “OER” and do not realize that not all No Cost materials are OER. Erie Community College also avoids the term OER and instead uses a single designation, an “AIM” badge for Affordable Instructional Materials, which encompasses both OER and materials that cost less than $30.
Clear and concise descriptions of open and affordable designations are important in understanding students’ and instructors’ awareness of open and affordable concepts. As stakeholders design awareness surveys and questionnaires, they can easily refer to the clear descriptions used in open and affordable course markings to build the most effective assessment tools.
In Chapter 2 (Legislative Implications), a 2018 report for the Oregon Higher Education Coordinating Commission noted that students were not aware of OER courses and/or the information was not available in a timely manner (Freed et al. 2018). The researchers recommended adopting a common form of designation across the state. A City University of New York survey indicated students were not aware they were enrolled in a zero textbook cost (ZTC) course. These observations support registrar professionals’ assertion that students do not closely read information in the SIS; rather, they are in the registration system to conduct business (Kitch 2015). Measuring awareness should not stop after reporting the survey results. The assessment of open and affordable course marking programs should be iterative and continuously focus on areas where changes can potentially improve awareness. For example, the 2018 Oregon study recommends using a phrase or icon that is easy to understand and using it consistently in more places than just the registration system (Freed et al. 2018).
At Mt. Hood Community College, administrators specifically asked the student government association (SGA) to differentiate between OER, no cost, and low cost before implementing the course markings in the SIS. The SGA recommended definitions for each term, identifying terminology that would be easy for students to understand. This pre-course marking data collection from the target population not only provides evidence supporting the use of one set of terminology over another but also creates an early awareness of open and affordable characteristics among the target population (students) prior to rolling out the course markings.
Sometimes informal conversations with instructors or students about their usage or understanding of open and/or affordable materials can be illuminating. These comments, which are qualitative in nature, add value to an assessment report by providing more insight into the nuances of how aware instructors and students are of open and affordable concepts. Anecdotally, librarians at Lower Columbia College suggest that course marking, and the associated collaboration, outreach, and marketing performed to implement and advertise the initiative, led to greater awareness and visibility of affordable textbook initiatives on campus. Marking the courses and performing the necessary legwork to disseminate and retrieve information from instructors kept the program an active topic of campus conversations (Hicks and Gillaspy-Steinhilper, Personal communication 2018).
In “Participant Experiences and Financial Impacts: Findings from Year 2 of Achieving the Dream's OER Degree Initiative,” responses to surveys and site visit interviews from 2016/17 and 2017/18 indicated students were unaware of the OER course options before they registered for classes (Griffiths et al. 2018). Seven of the colleges included in the research study marked OER and ZTC in the schedule of classes or course catalog at the time of the research, and at least two of these institutions included explanations for the course labels. Twenty-four percent of students reported seeing the OER icon by the course name during registration, and 23% said cost savings were a strong factor in their decision to enroll in the class (Griffiths et al. 2018, 15). For institutions that approach OER course marking as a way to build awareness and promote OER courses to students, the findings from the Achieving the Dream report and several case studies suggest that OER course marking alone is not enough to raise awareness levels.
In addition to SIS analytics, which show enrollment in open and affordable courses, surveys have been developed, implemented, and analyzed to understand student awareness of open and affordable course materials. Several surveys are available at the OER Research Toolkit (Open Education Group 2017), which are designed to identify student use of open and affordable course materials and their perception of the quality of the materials. In Achieving the Dream’s student survey responses, researchers discovered that though student awareness of OER was initially low, a majority would enroll in an OER course again (Griffiths et al. 2018). A combination of SIS data and student survey responses provides a more holistic view of open and affordable course initiatives.
Assessment Tools for Cost Awareness
Assessment Tool | Method | Impact
SIS usage report | Quantitative | Tracks student hits on links/limiters for open and affordable courses
Student survey or questionnaire | Quantitative and qualitative | Descriptive of student awareness (e.g., OER, affordable, and zero-cost courses)
Instructor survey or questionnaire | Quantitative and qualitative | Descriptive of instructor awareness (e.g., OER, affordable, and zero-cost courses)
Compliance with Mandates
Chapter 2 (Legislative Implications) and Chapter 3 (Institutional Policy) discussed state and institutional mandates for institutions to mark courses within the registration system or the schedule of classes. For example, Central Virginia Community College’s OER course marking initiative started because of a grant administered by the Virginia Community College System Chancellor’s Innovation Fund and the Hewlett Foundation, which stipulated the need for the institution to mark the classes for the Virginia Community College System. One way to demonstrate compliance with legislative, institutional, and grant requirements is to run a usage report. Some mandates might require institutions to report the number of courses, the number of programs involved, or the overall student cost savings due to marking open and affordable courses. Marking open and affordable courses to demonstrate compliance often leads institutions to realize that collecting this data is advantageous in other ways. For instance, the grant funding from the Virginia Community College System and the Hewlett Foundation not only allowed Central Virginia Community College to implement the program, but also provided an easier way for students in the system to discover the courses and created a mechanism to report back simple data to the funding sources about adoptions across the institution.
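A compliance-style usage report can often be assembled from a routine SIS course export. The sketch below, with hypothetical file and field names, counts marked sections, participating departments, and enrolled students per term, which are the kinds of figures a funder or state agency might request.

```python
# A minimal sketch of a compliance-style usage report built from an SIS
# course export; file names, field names, and attribute codes are hypothetical.
import pandas as pd

sections = pd.read_csv("course_sections.csv")
marked = sections[sections["attribute"].isin(["OER", "ZTC", "LOW_COST"])]

report = marked.groupby("term").agg(
    marked_sections=("section_id", "nunique"),
    departments=("department", "nunique"),
    enrolled_students=("enrollment_count", "sum"),
)
print(report)
```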
Assessment Tools for Compliance
Assessment Tool | Method | Impact
SIS usage report | Quantitative | Adherence to governmental mandates; demonstrates achievement of institutional benchmarks and/or goals
Survey of instructors (sample survey: Bliss et al. 2013) | Quantitative, qualitative, mixed methods | Number of students using open and affordable materials; cost of previous course material(s)
Students’ Cost Savings
Some institutions use course markings to report on student cost savings of open and affordable materials versus traditional textbooks. Mt. Hood Community College, for example, collects data on courses adopting OER and compares those numbers with campus store data to determine a general estimate of student cost savings. City University of New York’s Open Education Librarian Ann Fiddler notes that though the system has seen a dramatic rise in cost savings and in the number of courses taught using OER, it has been difficult to determine how much of the cost savings can be attributed to the 3,000 ZTC sections (about 5% of the total courses offered) and how much to other more long-standing OER initiatives. Nevertheless, existing ZTC course designations can be used to run reports to measure OER usage and cost savings (Fiddler and McKinney, Personal communication 2018).
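A rough version of this kind of estimate pairs each OER-adopting section with the campus store price of the textbook it replaced. The sketch below assumes two hypothetical exports, one from the SIS and one from the campus store; the column names are illustrative.

```python
# A minimal sketch pairing OER-adopting sections with campus store price data
# to estimate student savings; files and column names are hypothetical.
import pandas as pd

oer_sections = pd.read_csv("oer_sections.csv")          # section_id, enrollment, prior_isbn
store_prices = pd.read_csv("campus_store_prices.csv")   # isbn, new_price

merged = oer_sections.merge(store_prices, left_on="prior_isbn",
                            right_on="isbn", how="left")
merged["estimated_savings"] = merged["enrollment"] * merged["new_price"].fillna(0)
print(f"Estimated total savings: ${merged['estimated_savings'].sum():,.2f}")
```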
The data collected on student cost savings also may feed back into the marketing and communication efforts to promote open and affordable course marking initiatives. Whether the communications plan targets students, instructors, administrators, or external stakeholders, highlighting baseline student cost savings or trends in student cost savings over time can be an appealing part of the messaging, as evidenced by Lower Columbia College’s 2016 campaign to promote the success of their alternative educational resources (fig. 23.1).
Assessment Tools for Cost Savings
Assessment Tool | Method | Impact
SIS usage report | Quantitative | Number of courses using open and affordable materials; number of students enrolled in courses using open and affordable materials
Student survey (sample questionnaire: Florida Virtual Campus 2016) | Quantitative | How much money students spend on textbooks; compare/track cost savings over time
Student Success
Though the question of cost requires access to systems outside of the SIS to calculate estimated savings, outcomes can be measured using information contained within the SIS, such as final grades, drops, and withdrawals, for sections of courses using open and affordable course materials versus those using a traditional textbook. As mentioned in Chapter 9 (Student Information Systems), the registrar, records office, assessment program, and information technology department may have access to the SIS to run reports. In several case studies, including Mt. Hood Community College, Houston Community College, and State University of New York, institutional OER leaders also have access to the SIS to run reports. Student outcomes can also be measured by assessing course throughput rates—drop rates, withdrawal rates, and C or better rates—for sections of OER courses as compared with sections of courses taught with a traditional textbook, as sketched below.
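The sketch below shows one plausible way to compute those rates per group from a hypothetical SIS final-grade export; the grade codes and column names are assumptions, and institutions should substitute their own definitions of drops, withdrawals, and passing grades.

```python
# A minimal sketch of course throughput-style metrics from SIS grade data;
# grade codes ("DR", "W") and column names are hypothetical.
import pandas as pd

grades = pd.read_csv("final_grades.csv")  # columns: section_id, marked, grade

def outcome_rates(group: pd.DataFrame) -> pd.Series:
    return pd.Series({
        "drop_rate": (group["grade"] == "DR").mean(),
        "withdrawal_rate": (group["grade"] == "W").mean(),
        "c_or_better_rate": group["grade"].isin(["A", "B", "C"]).mean(),
        "enrollments": len(group),
    })

# Compare sections marked as open/affordable with traditionally marked sections.
print(grades.groupby("marked").apply(outcome_rates))
```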
Kwantlen Polytechnic University, an institution which added a course attribute field that allows students and other stakeholders to filter courses that are part of their Zed Cred program (the Canadian equivalent of the Z-Degree), has noted the vast potential of using this filtering mechanism to determine the impact of the overall Zed Cred program. For example, reports can be generated each semester comparing courses that have both participating and non-participating sections in the Zed Cred program. Using these reports, insights can be gained into important metrics, including grade distributions, course withdrawals, and course failure rates.
As Hilton and colleagues (2016b, 19) explain, “while cost-savings are important to some instructors, the more vital issue relates to student learning.” Tidewater Community College implemented ZTC courses (or Z courses), designed for the Z-Degree program. Students see and can choose Z courses during registration. During the Fall 2013 through Spring 2015 terms, researchers compared student course throughput rates in Z courses with rates in non-Z courses (Hilton et al. 2016b) using data generated from SIS reports. The authors acknowledge the study design does not indicate causation, but the results of the study align with previous studies that indicate students perform as well or better in courses using OER as in courses using traditional textbooks (Hilton et al. 2016b, 24).
Retention is also a popular metric among institutional administrators. Nathan Smith at Houston Community College describes his close relationship with the Office of Institutional Research in tracking metrics such as grades, drops, and withdrawals. The courses are marked as low-cost, zero-cost, or Inclusive Access courses, and these distinctions enable the institution to compare student success in open and affordable courses with that in courses that use traditional textbooks. Data collected about open and affordable enrollment does not, by itself, indicate an effect on retention. However, open and affordable courses can be a data point in the conversation about student retention, along with student engagement practices such as service learning courses (Bringle, Hatcher, and Muthiah 2010, 45) and students’ personal financial situations (Hope 2015, 12), while still controlling for confounding variables. With access to reports in the SIS, researchers can also assess student success metrics for Pell-eligible students enrolled in OER, no-cost, and low-cost courses, which are important for assessing marginalized students’ persistence.
Assessment Tools for Student Success
Assessment Tool | Method | Impact
SIS reports: final grades, failure rates, withdrawal rates, and Pell eligibility | Quantitative | Compare student success and persistence in open and affordable courses to comparable traditional courses
Student survey (sample questionnaire: Jhangiani et al. 2018) | Qualitative and quantitative | Compare student responses with course performance data
Instructor report: final grades | Quantitative | Compare open and affordable courses to comparable traditional courses
Focus groups or interviews | Qualitative | Descriptive of why students choose courses marked with an open or affordable textbook
Student Enrollment
Few reported assessments have been performed to determine the impact of open and affordable course marking on student enrollment; it is an area that future stakeholders will likely explore. City University of New York stakeholders have begun to consider whether course marking affects student enrollment in certain courses. They report that future analysis for the ZTC initiative will focus on assessing whether students register for courses specifically based on searches for the ZTC designation or whether they enroll in these courses for unrelated reasons. Andrew McKinney, the City University of New York open education coordinator, reports that this analysis could be done by conducting student surveys or by requesting queries from the registrar’s office. While these quantitative and qualitative impact studies are still in the developmental phase, measuring the impact of the ZTC course marking initiative on student enrollment will remain a factor in determining future directions for City University of New York’s OER activities (Fiddler and McKinney, Personal communication 2018).
Since implementing course marking that designates courses using cost-free resources, Kwantlen Polytechnic University has seen wait-lists for Zed Cred courses grow longer than those for equivalent courses not participating in the program. This reflects students’ assertions that courses using cost-free resources are preferable to those with more traditional costs. Using wait-lists for Zed Cred or Z-Degree programs is one mechanism to assess the popularity of, or demand for, courses using open and affordable materials. This type of information can be extremely compelling in demonstrating the value of marking open and affordable courses to administrators and other stakeholders who might be able to assist or expand existing initiatives. While City University of New York and Kwantlen Polytechnic University are exploring this type of assessment, researchers have not yet published results showing improved registration numbers for courses using open and affordable resources versus traditional course materials.
Assessment Tools for Enrollment
Assessment Tool | Method | Impact
SIS usage report | Quantitative | Compare open and affordable enrollment to comparable traditional course enrollment
Student survey or questionnaire | Qualitative and quantitative | Descriptive of student decision-making
Conclusion
Though a handful of institutions that have recently implemented OER have shared assessments of their programs’ impact on students, instructors, and institutions, the available information on the effects of marking open and affordable courses is still scarce. Some institutions have expressed future plans to measure the impact of open and affordable course markings. For example, Nicolet College hopes to determine if a correlation exists between No Cost/Low Cost designations and enrollment, as well as what potential effects course marking may have on the degree pathways of students. It is likely that other recent initiatives are also actively collecting and assessing impact measures to be used internally or shared with the larger academic community at a later date. Thus far, the few programs that have assessed and shared their open and affordable course marking initiatives have measured compliance with mandates, cost savings, effects on student enrollment, and awareness of open and affordable initiatives and programs.
Within the growing literature of open and affordable course marking, initiatives frequently report about their impact using narrative and case study formats, such as those found in Part VII (Case Studies). Using a narrative reporting method allows programs to combine their qualitative and quantitative data in a way that delivers statistically relevant information while also providing critical context connecting the program to local communities. The descriptive nature of the narrative format lends itself well to the potentially disparate audiences of administrators and students alike and supports student outreach and promotional communication activities. See Chapter 13 (Implementation) for more details.
Opportunities exist for institutions as they implement course marking to develop new ways to measure their impact. As these programs evolve and literature is published on the subject, demonstrable effects of course marking may encourage stakeholders at other institutions to consider and develop new open and affordable course marking ventures.
Course markings: Also called attributes, designations, tags, flags, or labels. Specific, searchable attributes or designations that are applied to courses, allowing students to quickly identify important information to aid in their decision making and to plan their academic careers efficiently. Course markings may include letters, numbers, graphic symbols, or colors and can designate any information about a course, including service learning status, additional costs, course sequencing requirements, and whether the course fulfills specific general education requirements.
Learning analytics: The act of gathering and analyzing large amounts of student data via technology with the goal of improving student success and retention. Learning analytics can be gathered through online learning platforms, learning management systems, or other platforms and contexts. With the growing interest in and use of learning analytics in higher education, issues of privacy, consent, and ethics are paramount.
Open educational resources (OER): Free teaching and learning materials that are licensed to allow for revision and reuse.
Neoliberalism: An economic system favoring free market capitalism. Since the 1970s, state governments and higher education institutions have increasingly shifted the burden of tuition costs to students and outsourced institutional services to third-party vendors (e.g., technology infrastructure such as learning management systems, dining services, and university bookstores). Critics charge that by favoring free market economics, neoliberalism impedes diversity, equity, and inclusion efforts and limits access to open and affordable education.
Student information system (SIS): Also called a registration system, course timetable software, or course schedule platform. A web-based application designed to aggregate key information about students, including demographic information, contact information, registration status, degree progression, grades, and other information. Some SISs assist students with enrollment, financial aid processes, and final payment for courses.
Course throughput rate: Measures the combined effects of student responses to courses, including dropping a course, withdrawing from a course, and completing the course with a C or better final grade (Hilton et al. 2016). Researchers use the aggregate course throughput rate to compare student outcomes in course sections using traditional learning materials versus sections using open and affordable materials.
Enrollment intensity: Distinguishes between students who enroll full-time and part-time based on the number of credits taken.
Schedule of classes: Also called course schedule or schedule of courses. A college or university’s listing of courses to be offered each semester or quarter, which includes details on class times, prerequisites, instructor of record, and other information; it is updated for each academic period.
Zero textbook cost (ZTC) or no-cost courses: Courses that do not require students to spend money on textbooks. May be achieved through the use of OER, library-licensed content, or other free resources.
Achieving the Dream: An organization that assists community colleges with sustainable institutional transformation to increase student success, especially among low-income students and students of color. One initiative for its network of community colleges focuses on increased adoption of OER.
General References
Acalog Catalog Management. 2020. Digarc. Catalog software. https://www.digarc.com/solutions/academic-catalog-management
Achieving the Dream. 2020. "Open Educational Resources (OER) Degree Initiative." https://www.achievingthedream.org/resources/initiatives/open-educational-resources-oer-degree-initiative
Affordable Learning Georgia. 2020. "Course Catalog Designators-About." https://www.affordablelearninggeorgia.org/about/course_catalog_designators
Ahearn, L. M. 2001. "Language and Agency." Annual Review of Anthropology 30: 109-137. https://doi.org/10.1146/annurev.anthro.30.1.109
Alexander, Bryan, Kevin Ashford-Rowe, Noreen Barajas-Murphy, Gregory Dobbin, Jessica Knott, Mark McCormack, Jeffery Pomerantz, Ryan Seilhamer, and Nicole Weber. 2019. EDUCAUSE Horizon Report: 2019 Higher Education Edition. https://library.educause.edu/-/media/files/library/2019/4/2019horizonreport.pdf
American Association of Collegiate Registrars and Admissions Officers. 2016. “The Interactive Catalog of the Future.” AACRAO Connect (June 14). https://www.aacrao.org/resources/newsletters-blogs/aacrao-connect/article/the-interactive-catalog-of-the-future
American Library Association. 2012. "Developing and Implementing a Simple Media/Communications Plan." http://www.ala.org/advocacy/advleg/publicawareness/campaign%40yourlibrary/prtools/handbook/media-plan
Babad, Elisha. 2001. “Students’ Course Selection: Differential Consideration for First and Last Course.” Research in Higher Education 42 (4): 469-492. https://doi.org/10.1023/A:1011058926613
Bell, Steven. 2017. “What about the Bookstore: Textbook Affordability Programs and the Academic Library-Bookstore Relationship.” College & Research Libraries News 78 (7). https://crln.acrl.org/index.php/crlnews/article/view/16700/18183
———. 2018. “Course Materials Adoption: A Faculty Survey and Outlook for the OER Landscape.” Choice. https://choice360.org/librarianship/whitepaper
Bliss, T. J., T. Jared Robinson, John Hilton, and David A. Wiley. 2013. “An OER COUP: College Teacher and Student Perceptions of Open Educational Resources.” Journal of Interactive Media in Education 1: Annex A and Annex B. https://jime.open.ac.uk/articles/10.5334/2013-04/
Board of Governors of the Federal Reserve System. 2019. “Report on the Economic Well-Being of U.S. Households in 2018-May 2019.” https://www.federalreserve.gov/publications/2019-economic-well-being-of-us-households-in-2018-student-loans-and-other-education-debt.htm
Boise State University. 2011. “University Policy 4160: Development of Schedule of Classes-General.” Boise State University. https://policy.boisestate.edu/academic-affairs-faculty-administration/policy-title-development-of-schedule-of-classes-general
Bringle, Robert G., Julie A. Hatcher, and Richard N. Muthiah. 2010. “The Role of Service-Learning on the Retention of First-Year Students to Second Year.” Michigan Journal of Community Service Learning 16 (2): 38-49. http://hdl.handle.net/2027/spo.3239521.0016.203
Business Wire. 2020. "Textbook Publishers Sued by College Students Claiming Giants Conspired to Illegally Monopolize Lucrative Market, Law Firm FeganScott Announces." https://www.businesswire.com/news/home/20200513005658/en/Textbook-Publishers-Sued-College-Students-Claiming-Giants
California State University. n.d. "Frequently Asked Questions." Accessed April 15, 2020. https://als.csuprojects.org/
Cangialosi, K. 2018. Twitter post. May 3, 2018, 7:53 p.m., https://twitter.com/karencang/status/992235740854673408
Chae, Boyoung, Kevin Corcoran, Michael Daly, Ann Fiddler, Jeff Gallant, James Glapa-Grossklag, Amy Hofer, and Michelle Reed. 2019. Price Transparency: State Approaches to OER/No Cost/Low Cost Course Schedule Designators. Arlington, TX: Mavs Open Press. https://uta.pressbooks.pub/pricetransparency
Chronicle of Higher Education. 2014. “25 Years of Declining State Support for Public Colleges.” https://www.chronicle.com/interactives/statesupport
Civitas Learning. 2018. "Press Release: Popular Course Planner Embeds New Search Tool to Help Students Identify Classes with Free or Low-Cost Course Materials." https://www.civitaslearning.com/press/press-release-popular-course-planner-embeds-new-search-tool-to-help-students-identify-classes-with-free-or-low-cost-course-materials
Cohen, Arthur M., Florence B. Brawer, and Carrie B. Kisker. 2014. The American Community College. 6th ed. San Francisco, CA: Jossey-Bass.
Colvard, Nicholas, C. Edward Watson, and Hyojin Park. 2018. "The Impact of Open Educational Resources on Various Student Success Metrics." International Journal of Teaching and Learning in Higher Education 30 (2): 262-76. http://www.isetl.org/ijtlhe/pdf/IJTLHE3386.pdf
Cromwell, Josh. 2017. "Connecting Library Textbook Programs to Campus Initiatives." In C. Diaz (Ed.), Affordable Course Materials: Electronic Textbooks and Open Education Resources. Chicago, IL: American Library Association.
CSCU Open Educational Resources. n.d. Open Educational Resources. Accessed April 15, 2020. https://www.ct.edu/oer
Cummings-Sauls, Rebel, Matt Ruen, Sarah Beaubien, and Jeremy Smith. 2018. "Open Partnerships: Identifying and Recruiting Allies for Open Educational Resources Initiatives." In A. Wesolek, J. Lashley, and A. Langley (Eds.), OER: A Field Guide for Academic Librarians. Forest Grove, OR: Pacific University Press. https://doi.org/10.7710/9781945398797
CUNY. 2020. "Guidelines for Designating a Course Section with the 'Zero Textbook Cost' (ZTC/OER) Attribute." https://www.cuny.edu/libraries/open-educational-resources/guidelines-for-zero-textbook-cost-course-designation
CUNYMedia. 2018. "Zero Textbook Cost Courses." Video. https://www.youtube.com/watch?v=jRjTRza_I18
Davis, Martin. 2018. "Thinking Customization? Keep in Mind the Top Challenges of Customizing an ERP System." Ultra Consultants. https://ultraconsultants.com/erp-software-blog/challenges-of-customizing-erp-system/
Drexel University. n.d. “Course Scheduling Policies.” Accessed February 19, 2020. https://drexel.edu/registrar/scheduling/overview/scheduling-policies
Durbin, Dick. 2019. “Durbin, Markey, Blumenthal Request Information on Student Data Collection Practices.” Dick Durbin United States Senator Illinois. https://www.durbin.senate.gov/newsroom/press-releases/durbin-markey-blumenthal-request-information-on-student-data-collection-practices
Eckel, Peter D., and Adrianna Kezar. 2016. "The Intersecting Authority of Boards, Presidents, and Faculty: Toward Shared Leadership." In M. N. Bastedo, P. G. Altbach, and P. J. Gumport (Eds.), American Higher Education in the Twenty-First Century: Social, Political, and Economic Challenges, pp. 155-187. https://repository.upenn.edu/gse_pubs/463
Ellucian. 2020. "GIT Infrastructure Implementation (Banner Technical)." https://training.ellucian.com/course/git-infrastructure-implementation-banner-technical
Ellucian Banner. 2020. Ellucian. Student information system software. https://www.ellucian.com/solutions/ellucian-banner
Ellucian PowerCampus. 2020. Ellucian. Student information system software. https://www.ellucian.com/solutions/ellucian-powercampus
Finkbeiner, Nicole. 2019. "OER Class Schedule Survey (Responses)." https://docs.google.com/spreadsheets/d/1FwWXkOC3mc6NpjK18Tv53Qg2r2b1uL6hupHa9G30a98/
Finkbeiner, Nicole. n.d. "OER Class Schedule Survey." Form. Accessed April 15, 2020. https://docs.google.com/forms/d/e/1FAIpQLScwvUHLcmkObkEUFiGMZ9PrtbcLU_dDLJHNibu3vCQV-OrB8w/viewform
Fischer, L., J. Hilton, T. J. Robinson, and D. Wiley. 2015. "A multi-institutional study of the impact of open textbook adoption on the learning outcomes of post-secondary students." Journal of Computing in Higher Education 27 (3): 159-172. https://doi.org/10.1007/s12528-015-9101-x
Florida Virtual Campus. 2016. 2016 Student Textbook and Course Materials Survey. https://florida.theorangegrove.org/og/file/3a65c507-2510-42d7-814c-ffdefd394b6c/1/2016%20Student%20Textbook%20Survey.pdf
Freed, Brooke, Amber Friedman, Sarah Lawlis, and Angie Stapleton. 2018. Evaluating Oregon’s Open Educational Resources Designation Requirement. Report. Salem, OR: University of Oregon School of Planning, Public Policy and Management. https://www.oregon.gov/highered/research/Documents/Reports/HECC-Final-OER-Report_2018.pdf
Fulton-Montgomery Community College Board of Trustees. 2016. "Motion to Approve Open Educational Resource Course Fee." https://library.fmcc.edu/ld.php?content_id=31016775
G2. 2019. “Best Student Information Systems (SIS).” https://www.g2.com/categories/student-information-systems-sis
Gallant, Jeff, and Marie Lasseter. 2018. 2018 USG Survey Report on Open Educational Resources. Affordable Learning Georgia. https://www.affordablelearninggeorgia.org/documents/2018_USG_OER_Survey.pdf
Gardner, John N. 2002. “What, So What, Now What: Reflections, Findings, Conclusions, and Recommendations on Service-Learning and the First-Year Experience.” In Service-Learning and the First-Year Experience: Preparing Students for Personal Success and Civic Responsibility. Monograph Series No. 34. Edited by Edward Zlotkowski, pp. 141-150. Columbia, SC: University of South Carolina, National Resource Center for The First-Year Experience and Students in Transition.
Gerdes, John H., and Tena B. Crews. 2010. “Developing Course Profiles to Match Course Characteristics with Student Learning Styles.” NACADA Journal 30 (1): 23-33. https://doi.org/10.12930/0271-9517-30.1.23
Goodman, Jennifer. 2017. “Collaborating to Scale OER.” Inside Higher Ed. https://www.insidehighered.com/digital-learning/article/2017/06/28/maricopa-millions-scales-oer-across-11-arizona-community
Griffiths, Rebecca, Shari Gardner, Patrik Lundh, Linda Shear, Alexandra Ball, Jessica Mislevy, Shuai Wang, Donna Desrochers, and Richard Staisloff. 2018. "Participant Experiences and Financial Impacts: Findings from Year 2 of Achieving the Dream’s OER Degree Initiative." Menlo Park, CA: SRI International. https://www.achievingthedream.org/resource/17507/participant-experiences-and-financial-impacts-findings-from-year-2-of-achieving-the-dream-s-oer-degree-initiative
Hewlett Foundation. 2015. Open Educational Resources. https://www.hewlett.org/wp-content/uploads/2016/11/Open_Educational_Resources_December_2015.pdf
Hilton III, John. 2016. “Open Educational Resources and College Textbook Choices: A Review of Research on Efficacy and Perceptions.” Educational Technology Research and Development 64 (4): 573-590. https://doi.org/10.1007/s11423-016-9434-9
Hilton III, John, Lane Fischer, David Wiley, and Linda Williams. 2016. “Maintaining Momentum Toward Graduation: OER and the Course Throughput Rate.” International Review of Research in Open and Distributed Learning 17 (6): 18-27. https://doi.org/10.19173/irrodl.v17i6.2686
Hilton III, John, David Wiley, Lane Fischer, and Rob Nyland. 2016. Guidebook to Research on Open Educational Resources Adoption. Open Education Group. http://openedgroup.org/wp-content/uploads/2016/08/OER-Research-Guidebook.pdf
Hofer, Amy, Zip Krummel, Kristen Kane, John Schoppert, Jennifer Lantrip, Jessica Knoch, and Jenn Kepka. 2017. “OER Impact Research.” Slide presentation. OER Symposium, Portland State University, May 11, 2017. https://pdxscholar.library.pdx.edu/cgi/viewcontent.cgi?referer=https://www.google.com/&httpsredir=1&article=1015&context=oer_symposium
Hope, Joan. 2015. “Collaborate to Provide Comprehensive Support, Boost Retention.” The Successful Registrar 15 (4): 12. https://doi.org/10.1002/tsr.30073
Illowsky, Barbara, and Susan Dean. 2020. Introductory Statistics. Houston, TX: OpenStax. https://openstax.org/details/books/introductory-statistics
Jaggars, Shanna Smith, Amanda L. Folk, and David Mullins. 2018. “Understanding Students’ Satisfaction with OERs as Course Materials.” Performance Measurement and Metrics 19 (1): 66-74. https://doi.org/10.1108/PMM-12-2017-0059
James, Theresa A. 2006. A Handbook for Honors Programs at Two-Year Colleges. Lincoln, NE: National Collegiate Honors Council. http://digitalcommons.unl.edu/nchcmono/6
Jensen, Kristi. 2015. "Keeping Up With... Affordable Course Content." Association of College and Research Libraries. http://www.ala.org/acrl/publications/keeping_up_with/coursecontent
Jenzabar. 2020. Jenzabar. Student information system software. Boston, MA. https://www.jenzabar.com
Jhangiani, Rajiv. 2019a. "Kwantlen Polytechnic University's Zed Cred Initiative." Video. https://media.kpu.ca/media/Kwantlen+Polytechnic+University%27s+Zed+Cred+initiative/0_afre278g
———. 2019b. "Search the Course Timetable for Zero Textbook Cost Courses." Video. Kwantlen Polytechnic University. https://media.kpu.ca/media/Search+the+course+timetable+for+zero+textbook+cost+courses/0_i6w7c3jc
Jhangiani, Rajiv S., Farhad N. Dastur, Richard Le Grand, and Kurt Penner. 2018. “Digital Open Textbook Questionnaire.” http://openedgroup.org/wp-content/uploads/2016/08/Digital-open-textbook-questionnaire.docx
Jones, Kyle M. L. 2019. “Learning Analytics and Higher Education: A Proposed Model for Establishing Informed Consent Mechanisms to Promote Student Privacy and Autonomy.” International Journal of Educational Technology in Higher Education 16 (24): 1-22. https://doi.org/10.1186/s41239-019-0155-0
Kim, Joshua. 2014. “A College Bookstore Q&A.” Inside Higher Ed, June 12, 2014. https://www.insidehighered.com/blogs/technology-and-learning/college-bookstore-qa
Kitch, Rhonda Kay. 2015. “Best Practices for Communicating Critical Messages from a Registrar's Office to Traditional-Aged College Students.” Ph.D. diss., North Dakota State University. https://hdl.handle.net/10365/27870
Klaudinyi, Jen, David Koehler, Jody Potter, and Heather White. 2018. “OER Designations in the Schedule: System Considerations.” Open Oregon Educational Resources. Slidedeck presented online February 21, 2018. Video, 1:12:07. https://openoregon.org/archived-webinar-oer-designations-in-the-schedule/
Kuh, George, Jillian Kinzie, John Schuh, and Elizabeth Whitt. 2005. Student Success in College: Creating Conditions that Matter. San Francisco: Jossey-Bass.
Lashley, Jonathan, Rebel Cummings-Sauls, Andrew Bennett, and Brian Lindshield. 2017. "Cultivating Textbook Alternatives from the Ground Up: One Public University’s Sustainable Model for Open and Alternative Educational Resource Proliferation." The International Review of Research in Open and Distributed Learning 18 (4). https://doi.org/10.19173/irrodl.v18i4.3010
Lengerich, Eugene, and Pennsylvania State University. 2018. “Bias, Confounding and Effect Modification.” In STAT 507: Epidemiological Research Methods. https://newonlinecourses.science.psu.edu/stat507/node/34/
Librarians as Open Education Leaders. 2015. https://libraryasleader.org
Lieberman, Mark. 2017. "OER and Affordable-Textbook Labeling Gains Ground." Inside Higher Ed. https://www.insidehighered.com/digital-learning/article/2017/12/06/states-mandate-oer-and-affordable-textbook-labeling-challenges
Lindgren, Robb, and Rudy McDaniel. 2012. "Transforming Online Learning through Narrative and Student Agency." Educational Technology & Society 15 (4): 344-355.
Long Beach Community College. n.d. Zero Textbook Cost: Course Status Page. Accessed April 15, 2020. https://lbcc.instructure.com/courses/26684/pages/zero-textbook-cost-course-status-page
Lower Columbia College. 2018. “Class Schedule Fall 2018.” https://services4.lowercolumbia.edu/info/webResources2/ClassSchedule/ClassScheduleFall2018.pdf
Lupton, Ellen, and Noel Cunningham. 2014. Type on Screen: A Guide for Designers, Developers, Writers, and Students. New York, NY: Princeton Architectural Press.
Lupton, Ellen, and Jennifer C. Phillips. 2015. Graphic Design: The New Basics. Second ed. New York: Princeton Architectural Press.
Maricopa Community Colleges. 2020. "Open Education Resources." https://www.maricopa.edu/why-maricopa/maricopa-millions
McKenzie, Lindsay. 2020. “A Legal Challenge for Inclusive Access.” Inside Higher Ed, January 27, 2020. https://www.insidehighered.com/news/2020/01/27/independent-bookstores-mount-inclusive-access-lawsuit
Mt. Hood Community College. 2019. "How to Identify Low-Cost and No-Cost Textbooks on the Course Schedule." Video. https://youtu.be/4u2NiOZTAl4
———. 2020a. "Course Section Reporting Form & FAQ." https://www.mhcc.edu/CourseSectionReporting/
———. 2020b. "Textbook Affordability Team." https://www.mhcc.edu/TATeam/
Munro, Daniel, Jenna Omassi, and Brady Yano. 2016. OER Student Toolkit. Victoria, BC: BCcampus. https://opentextbc.ca/studenttoolkit
National Association of College Stores. 2017. “Highlights from Faculty Watch Attitudes and Behaviors toward Course Materials 2016-17 Report.” http://www.nacs.org/research/FacultyWatchKeyFindings2017.aspx
National Council for Voluntary Organisations. 2019. "Developing a Communications Strategy." https://knowhow.ncvo.org.uk/campaigns/communications/communications-strategy
New Media Consortium, and EDUCAUSE Learning Initiative. 2016. The NMC Horizon Report: 2016 Higher Education Edition. Austin, TX. https://www.sconul.ac.uk/sites/default/files/documents/2016-nmc-horizon-report-he-EN-1.pdf
OER Champion Playbook. 2017. Portland, OR: Lumen Learning. https://lumenlearning.com/champion-playbook
Open Education Group. 2017. OER Research Toolkit. https://openedgroup.org/toolkit
———. 2019. “The Review Project.” http://openedgroup.org/review
Open Oregon. 2019. “Small Dollar Amounts Are Significant.” Open Oregon Educational Resources. https://openoregon.org/small-dollar-amounts-are-significant/
Open Textbook Alliance. 2016. Student Government Toolkit: Making Textbooks Affordable. https://studentgovresources.org/wp-content/uploads/2016/09/Open_Textbook_Alliance_Toolkit.pdf
Open Washington. 2016. "Open Course Library-Open Washington: Open Educational Resources Network." https://www.openwa.org/open-course-library
Pearson, Victoria, and Carolyne Culver. 2016. Writing a Communications Strategy. PowerPoint. Oxford, UK: University of Oxford. https://www.ox.ac.uk/sites/files/oxford/media_wysiwyg/Writing%20a%20communications%20strategy%20%2818.02.16%29.pdf
Pew Trusts. 2019. "Two Decades of Change in Federal and State Higher Education Funding." https://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2019/10/two-decades-of-change-in-federal-and-state-higher-education-funding
Portland Community College. 2016. "Material Costs in the Schedule FAQ." https://docs.google.com/document/d/13JiRACnJ6u4zgeH7kvsTL2HK3ET711C3cgt2txHE5b0/edit
Reed, Michelle. 2018. "Open for UTA Students: Find OER Courses." Arlington, TX: University of Texas at Arlington. http://libguides.uta.edu/students/MyMav
———. 2019. Texas Toolkit for OER Course Markings (A Living Guide). Arlington, TX: UTA Libraries. https://libguides.uta.edu/TXtoolkit
Romo, Vanessa. 2018. "Hunger and Homelessness Are Widespread among College Students, Study Finds." The Two-Way. NPR. April 3, 2018. https://www.npr.org/sections/thetwo-way/2018/04/03/599197919/hunger-and-homelessness-are-widespread-among-college-students-study-finds
Saunders, Daniel. 2007. "The Impact of Neoliberalism on College Students." Journal of College and Character 8 (5): 1-9. https://doi.org/10.2202/1940-1639.1620
Sclater, Niall, and Joel Mullan. 2017. "Jisc Briefing: Learning Analytics and Student Success: Assessing the Evidence." http://repository.jisc.ac.uk/6560/1/learning-analytics_and_student_success.pdf
Seaman, Julia, and Jeff Seaman. 2017. Opening the Textbook: Educational Resources in U.S. Higher Education. Report. https://www.onlinelearningsurvey.com/reports/openingthetextbook2017.pdf
———. 2018. Freeing the Textbook: Educational Resources in U.S. Higher Education, 2018. Report. https://www.onlinelearningsurvey.com/reports/freeingthetextbook2018.pdf
Senack, Ethan. 2014. “Fixing the Broken Textbook Market: How Students Respond to High Textbook Costs and Demand Alternatives.” U.S. PIRG Education Fund and the Student PIRGS. https://uspirg.org/reports/usp/fixing-broken-textbook-market
SPARC. 2019a. "2018-2019 Connect OER Report." https://sparcopen.org/our-work/connect-oer/reports
———. 2019b. "OER State Policy Tracker." Accessed February 19, 2020. https://sparcopen.org/our-work/state-policy-tracking/
Sparcopen. 2020. State of OE Policies. Repository. https://github.com/sparcopen/open-education-state-policy-tracking/commits/master/StateOEPolicies.csv
Student PIRGs. 2004. "New Report Shows College Textbooks Are 'Ripoff 101': Publishers Increase Prices through Gimmicks, Faculty Are Concerned." Press Release. https://studentpirgs.org/2004/01/29/new-report-shows-college-textbooks-are-ripoff-101/
SUNY Erie. 2014. "Welcome to Online Course Blueprint." https://www.ecc.edu/blueprint/
SUNY OER Services. 2020. "SUNY OER Ready-to-Adopt Courses." https://oer.suny.edu/
University of Iowa. 2020. “Fall and Spring: Policies and Procedures.” https://registrar.uiowa.edu/fall-and-spring-policies-and-procedures
U.S. Bureau of Labor Statistics. 2016. “College Tuition and Fees Increase 63 Percent since January 2006.” The Economics Daily, August 30, 2016. https://www.bls.gov/opub/ted/2016/college-tuition-and-fees-increase-63-percent-since-january-2006.htm
U.S. Department of Education. 2008. Higher Education Opportunity Act of 2008. https://www2.ed.gov/policy/highered/leg/hea08/index.html
U.S. Department of Education, National Center for Education Statistics. 2019. "Digest of Education Statistics, 2017 (NCES 2018-070)." https://nces.ed.gov/fastfacts/display.asp?id=76
U.S. Government Accountability Office. 2013. College Textbooks: Students Have Greater Access to Textbook Information. https://www.gao.gov/assets/660/655066.pdf
Walz, Anita, Kristi Jensen, and Joseph Salem. 2016. SPEC Kit 351: Affordable Course Content and Open Educational Resources. Washington, DC: Association of Research Libraries. https://publications.arl.org/Affordable-Course-Content-Open-Educational-Resources-SPEC-Kit-351/
Washington Community and Technical Colleges. 2019. Implementation Guide of OER and Low-Cost Labeling Policies for Washington Community and Technical Colleges. Report. Washington State Board for Community and Technical Colleges. https://docs.google.com/document/d/1FDMutJccGdEZ1mtB-4eoAbvCmkS7t-tvqGoBrAIopO0
Wiley, David. 2014. “The Access Compromise and the 5th R.” Iterating toward Openness, March 5, 2014. https://opencontent.org/blog/archives/3221
———. 2019a. “The OER Adoption Impact Calculator.” Version 1.2. http://impact.lumenlearning.com
———. 2019b. “On ZTC, OER, and a More Expansive View.” Iterating toward Openness, August 27, 2019. https://opencontent.org/blog/archives/6117
———. n.d. Defining the "Open" in Open Content and Open Educational Resources. Accessed April 15, 2020. https://opencontent.org/definition
Also called a Zed Cred: a degree, certificate, or curriculum path that has completely adopted free or zero-cost course materials, so that students do not pay for course materials as they progress through the program. All courses within the degree program must commit to zero-cost materials in order for the degree to be designated a Z-Degree.
A marketing term describing an agreement between textbook publishers and professors or institutions under which all students enrolled in a specific course are automatically charged for course materials through institutional fees. In the United States, organizations are legally required to give students the option to opt out of automatic purchasing programs. Multiple lawsuits have been filed against publishers and bookstores over such programs, including a class-action lawsuit filed in April 2020 by FeganScott on behalf of college students against Cengage Learning, McGraw Hill, Pearson Education, Follett Higher Education Group, and Barnes & Noble College Booksellers.