21. Collecting and Reporting Data

Jeff Gallant

Why Collect Data?

When running an OER program, time is often scarce, and a program manager may wonder why data collection deserves a significant share of this always-limited resource. Beyond extrinsic motivators, such as a demand for data in a report to stakeholders, data is also the peripheral nervous system of your program: it is how a program senses successes, failures, emergent ideas, and urgent issues to address.

How Can Data Help?

Data collection, analysis, and reporting can yield multiple benefits for your OER program. Data can guide strategic decision-making when a program wants to adjust, adapt, and move forward. A report based on collected and analyzed data can help tell the story of how your program is making, or has the potential to make, a difference at your institution or in your system. Data can also indicate a need for targeted open education programming and funding: for example, data on high textbook costs, students’ financial needs, or faculty awareness of OER.

Data Collection and Ethics

Before collecting data, be sure to keep some basic ethical guidelines in mind:

Privacy

Data privacy may already be a priority at your institution, given the potential for cybercrime and the existence of legal requirements for data privacy, such as the Family Educational Rights and Privacy Act (FERPA) in the United States. Alongside these, be sure that you have a reason to collect and/or report any personally identifiable information (PII) before doing so. For example, if an awarded grant proposal requires contact information, the recipient’s institutional email address (which is likely already displayed on their campus website) is probably sufficient; collecting a phone number or a physical address may not be necessary.

Consent

Anyone who submits data to you, or whose data is collected passively via web or platform analytics, should know who the data goes to and why it is being collected. When someone gives informed consent in submitting data to you, they not only acknowledge that their data is being collected but also understand your intent in collecting and using it. Transparency about your data needs before collection, and in your reporting after collection, will help.

Anonymity

Alongside data privacy, be sure to anonymize, or “de-identify,” the data you collect whenever necessary. For example, in the United States, institutions often meet FERPA guidelines in collecting student data by replacing PII with an anonymized identifier with no meaning to anyone except the data collectors and analyzers (OECD 2016).
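
As a concrete illustration of this kind of de-identification, here is a minimal Python sketch assuming a hypothetical CSV export with a student_id column; the file and field names are placeholders, not a standard your institution will necessarily use:

```python
import csv
import uuid

# Hypothetical file and column names for illustration; replace
# "student_id" with whatever PII field your export actually uses.
with open("oer_course_data.csv", newline="") as f:
    records = list(csv.DictReader(f))

# Map each real ID to a random identifier. The resulting IDs have no
# meaning to anyone except the data collectors and analyzers.
crosswalk = {}
for row in records:
    real_id = row["student_id"]
    if real_id not in crosswalk:
        crosswalk[real_id] = uuid.uuid4().hex
    row["student_id"] = crosswalk[real_id]

with open("oer_course_data_deidentified.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(records)

# Keep the crosswalk separate, under much stricter access controls,
# or delete it entirely if re-identification is never needed.
with open("crosswalk_restricted.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["student_id", "anonymized_id"])
    writer.writerows(crosswalk.items())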

Equity

Equity is often an implicit goal in OER programs; any program that attempts to close the gap in educational materials costs and barriers to access is essentially addressing educational equity (DOERS3 2021). Therefore, be sure to collect, analyze, and report your data through the lens of educational equity; if you can analyze how the implementation of OER affects marginalized demographic groups at your institution, that analysis will help your initiative tell a more focused and comprehensive story of how your program works. See the Disaggregating Your Data section later in the chapter for more details.

Data Collection and Institutional Review Boards (IRB)

Institutional Review Boards (IRBs) are essential for ensuring that research complies with both legal and institutional guidelines. In the United States, institutional review boards are built into federal regulations and ensure protections for all human research subjects. Under these regulations, educational research on the effectiveness of instructional strategies, curriculum changes, and classroom management methods is typically exempt, so long as the research does not impair students’ ability to learn or instructors’ ability to teach (U.S. Department of Health and Human Services 2018).

Surveys, interviews, and focus groups can be exempt, but the more easily identifiable the subjects are on a personal level, the more likely it is that the research will not be exempt. Keeping privacy, consent, and anonymity in mind will help with this, as will awareness and transparency of any third parties that may be responsible for the collection and/or protection of your data.

This will likely cover most open education research, but it is still good to do due diligence with your IRB; there may be an outstanding issue with a particular study or further institutional guidelines which need to be met. Once you have a plan for a research study or a large data collection effort, contact your IRB for your institution’s own details and guidelines.

An OER Data Workflow, From Strategy to Collection

When you are starting your OER program’s data collection, don’t start with the collecting itself: instead, start by figuring out exactly why you need data and how you’ll use it once it’s collected. Only then move to the what, the actual data collection. To model this process, this chapter works in chronological order and discusses data collection at the end. You may encounter some data-specific terminology in the examples that you’re unsure of at first, and this is okay; data terminology is discussed in the Data Collection section at the end.

Step 1: Creating a Critical Questions List

Because the needs of your stakeholders should drive your decision-making in an OER program, start with these needs when determining what type of data to collect. You can do this by looking at each stakeholder group for the program and drafting a list of critical questions your key stakeholders will ask; each question establishes a need to collect the data required to answer it (see Step 3).

Please note that you first need to know your stakeholders and their environments. See Chapter 8, Building Familiarity on Campus, for ways to familiarize yourself with the needs of diverse stakeholders at your institution. This familiarity will also help you know how to obtain data from different departments or offices at your institution, and possibly how easy or difficult that process will be.

Stakeholder Example: Executive Administration

  • How much in textbook cost avoidance have you saved students over the past academic year?
  • How many students has this program affected?
  • What are the savings numbers for last semester?
  • Has the implementation of OER affected student retention at a course level? A degree program level?
  • Has the implementation of OER affected student success? Is this effect larger for first-generation students?
  • How long should we expect savings to continue due to one award? Do faculty turn to commercial resources after a certain period of time? If so, why?
  • How do faculty feel about OER? Does that differ by department, or between introductory and advanced courses?
  • Do students think that the cost of materials is an important issue for our institution to address?

Stakeholder Example: Instructional Faculty (Instructors, Instructional Designers)

  • Has the implementation of OER affected student success? Is this a same-instructor comparison, or an aggregate of all instructors?
  • Has the implementation of OER affected student success in the College of Arts and Sciences?
  • Has the implementation of OER affected student success in our IT degree programs?
  • For all the OER used in the Biology department, which textbook is used most for Concepts of Biology?
  • How do students feel about the OER materials they’ve used?
  • Do enough of us know about OER to get started with implementation? How do faculty feel about OER once they get to know it?

Stakeholder Example: Students and Student Government Associations

  • We are looking to support the implementation of OER campus-wide. Which faculty are already adopting OER?
  • If all of our World History I sections had no-cost OER instead of commercial textbooks, how much would this save students over the next academic year?
  • Is a student who takes an OER course in Electrical Engineering at our technical college more or less likely to be hired directly after graduation?
  • What’s keeping our faculty from adopting OER? How can we help with any barriers they’re facing?
  • How do students feel about OER once they’ve used it?

Stakeholder Example: Campus Stores

  • What percentage of students on campus are interested in a print-on-demand program for OER?
  • Do bookstore employees know about OER? What do they think about it?
  • What do students think about our new low-cost mathematics platform?
  • How are students performing with our new low-cost psychology adaptive platform?
  • If we do a print service for open textbooks, what percentage of students in the course would want a printed textbook?

Step 2: Exploring Data Types

Once you have a comprehensive list of critical questions, turn these lists into tables with a column for the data you need to collect. These tables can set the standards for how and when you collect various data for each project within your OER program, along with the partnerships you need to build to gather data outside of your direct access and/or control.

Because data collection may be a new practice to a first-time OER program manager, we will first discuss some basic data methodologies and ways to gather useful, analyzable OER-related data at your institution or system.

Quantitative Methods: Getting Impactful Numerical Data

Quantitative methods of data collection result in data that can be represented by and condensed into numbers (Blackstone 2012). Quantitative data may have a reputation for being a less human way of looking at a program, as it’s often seen as the “hard” or “objective” kind used exclusively in impact or accountability reports. This isn’t the whole story for OER program managers: quantitative data can reveal the magnitude of the effect of particular OER programs or projects, the most pressing needs of an institution or a department when selecting course materials, or the different rates at which introductory courses at your institution have adopted OER.

OER programs are, by default, focused on educational equity, and these programs can be sidetracked by the perceived “objectivity” of quantitative data. All quantitative data, including OER program data, will have its flaws, and too much reliance on quantitative data may steer a program into the illusion of pure objectivity (Armor 1998). Be sure to take into account how power, privilege, and inequity could interact and intersect with your data and how you analyze it. Qualitative data can also help with this, as discussed in the next section.

Quantitative Surveying

In quantitative surveying, a form with responses that can be quantified is sent to instructors, staff, and/or students. Quantitative response formats include numerical responses, multiple-choice responses, rating scales, and rankings.

Examples of quantitative survey data gathered and analyzed for OER programs include:

  • Percentage of higher education faculty who are aware of OER (Seaman and Seaman 2018, p.11)
    • Keep in mind: How aware is “aware” when it’s self-reported? Are you defining terminology before asking the question? Do faculty know that awareness goes beyond knowing these definitions?
  • Ranking of the most serious barriers to adopting OER (Seaman and Seaman 2017, p.30)
    • Keep in mind: If you have a “quality” item ranked in here, how do faculty define what quality is? Open-ended qualitative responses will help here.
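
To make one of these concrete, here is a minimal sketch of summarizing a single quantitative survey item, assuming a hypothetical export with an awareness column; the answer choices, and the decision about which levels count as “aware,” are illustrative analysis choices rather than a standard:

```python
import csv
from collections import Counter

# Hypothetical export: one row per respondent, with a multiple-choice
# "awareness" column (wording loosely modeled on awareness surveys).
with open("faculty_survey.csv", newline="") as f:
    responses = [row["awareness"] for row in csv.DictReader(f)]

counts = Counter(responses)
total = len(responses)

# Deciding which self-reported levels count as "aware" is an analysis
# choice; define it before collection and report it with the number.
aware = counts["Very aware"] + counts["Somewhat aware"]
print(f"Faculty aware of OER: {aware / total:.1%} of {total} respondents")
for level, n in counts.most_common():
    print(f"  {level}: {n / total:.1%}")
```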

Web Analytics

If you have a website for your initiative that offers impactful opportunities like professional development, grant applications, or OER discovery assistance, analytics can help you understand which pages your stakeholders are visiting, what they are prioritizing, and how long they are spending on the site. Analytics may also be available from third parties in your web-based OER repositories and textbook platforms. Be sure to take privacy into account when addressing analytics: are you (or third parties) collecting more than you need?

Examples of web analytics data gathered and analyzed for OER programs include:

  • Number of unique users visiting a web page per time period (day, week, month, etc.)
    • Keep in mind: These are often anonymized, but they’re based on unique internet protocol (IP) addresses. Are these IP addresses deleted, or are they stored somewhere? Stored addresses could be personally identifiable information (PII) that would need to be protected.
  • Top regions/countries with OER downloads or views in a repository
    • Keep in mind: Are these places more likely to use your OER because they have English as a first or second language? What would happen if you offered translations?
  • Most-downloaded open resources in a repository
    • Keep in mind: This may be because of one gigantic supersection adoption, or it may be many individuals from all walks of life downloading a particular resource. Due to PII protections, you may never know the difference. An adoption survey can help, but it’s tough to get a high response rate on those surveys.
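
If you process raw logs yourself, one hedge against over-collection is to count unique visitors from salted hashes rather than stored IP addresses. A minimal sketch, assuming simple (date, IP) log entries as made-up data:

```python
import hashlib
import secrets
from collections import defaultdict

# Hypothetical log entries as (date, ip_address) pairs.
log_entries = [
    ("2022-01-30", "203.0.113.7"),
    ("2022-01-30", "203.0.113.7"),   # same visitor, counted once
    ("2022-01-31", "198.51.100.24"),
]

# A salt generated per reporting period and then discarded means the
# stored hashes cannot later be matched back to individual IPs.
salt = secrets.token_bytes(16)

unique_visitors = defaultdict(set)
for date, ip in log_entries:
    unique_visitors[date].add(hashlib.sha256(salt + ip.encode()).hexdigest())

for date, visitors in sorted(unique_visitors.items()):
    print(f"{date}: {len(visitors)} unique visitor(s)")
```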

No-Cost and Low-Cost Designators in Course Schedules

OER program managers have been considering the implementation of no-cost and low-cost course materials designators in student registration systems since 2013, when Maricopa Community Colleges’ Maricopa Millions program implemented its OER designator and shared this practice with the larger community (Maricopa Community Colleges 2013). Multiple states have now mandated no-cost, low-cost, and/or OER designators, and both individual institutions and university systems have moved these programs forward in recent years (SPARC 2021). There are many factors to keep in mind when using designator data: see the Recommended Resources section in this chapter for a comprehensive resource on designators to assist with this.

Examples of no-cost and low-cost course materials designator data include:

  • Percentage of sections with no-cost course materials designators in a course
  • Number of student course enrollments affected by no-cost and low-cost course materials designators
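
As an illustration of working with designator data, here is a minimal sketch that computes both figures above from a hypothetical schedule export; the file name, column names, and designator codes are placeholders for whatever your registration system actually uses:

```python
import csv
from collections import defaultdict

# Hypothetical schedule export: one row per section, with "course",
# "designator" ("NC" = no-cost, "LC" = low-cost, "" = neither),
# and "enrollment" columns. Codes vary by registration system.
sections = defaultdict(list)
with open("course_schedule.csv", newline="") as f:
    for row in csv.DictReader(f):
        sections[row["course"]].append(row)

affected = 0
for course, rows in sorted(sections.items()):
    marked = [r for r in rows if r["designator"] in ("NC", "LC")]
    affected += sum(int(r["enrollment"]) for r in marked)
    print(f"{course}: {len(marked) / len(rows):.0%} of "
          f"{len(rows)} section(s) carry a designator")

print(f"Student course enrollments affected: {affected}")
```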

Qualitative Methods: Getting Meaningful Perspectives and Experiences

Qualitative data has a reputation for being the “soft” or subjective data that, at first glance, may appear less helpful in determining the impact of your program or informing future decisions. As an OER program manager, you should throw this reputation out entirely: qualitative data collection is extremely helpful in illustrating the meaning behind quantitative data, understanding the overall emotions and opinions surrounding various goals and projects within your program, and identifying emerging trends that your more deterministic quantitative questions could not have anticipated.

Qualitative data can be intimidating, as extra time and skills are required to manage and analyze open-ended responses. Still, the outcomes of this extra effort can heavily influence the quality and sustainability of your program, and getting to know qualitative data is highly recommended.

Qualitative Surveying

Often included in the same form as quantitative questions, qualitative survey questions elicit answers that must be categorized and interpreted at a semantic level. Qualitative responses in surveys are typically open-ended short responses and open-ended paragraph/essay responses, along with the “Other” text box option for quantitative multiple-choice questions.

Examples of qualitative survey data gathered and analyzed for OER programs include:

  • Quotes which are illustrative of corresponding quantitative OER findings (Bell 2018, p.14)
  • Emerging trends and issues in OER which quantitative survey questions did not anticipate (Gallant 2018, p.25)

Interviews and Focus Groups

Interviews and focus groups are in-depth, largely qualitative data collection methods which normally involve a conversation with someone, or a group of people, in order to dive deeper into a particular topic than quantitative research can usually cover (Bhattacherjee 2012). One salient benefit of these methods is that trends and potential issues often emerge naturally from these conversations.

Unlike surveys, interviewing and running focus groups require building a rapport with participants, listening actively, handling emotions during conversations, and managing issues of inclusion, such as the hidden cultural and power dimensions of a conversation (McGrath, Palmgren, and Liljedahl 2018). Having more than one researcher working on the project can help keep interview and focus group analyses from skewing in the direction of one researcher’s line of thinking. Focus groups may also include more methodical activities to start a focused conversation, such as card sorting for usability and user experience topics (Babich 2019).

Because a single interviewee or focus group cannot reliably represent an entire population, more than one interview or focus group is usually planned when gathering data about the efficacy of an OER program. Examples of interview data gathered and analyzed for OER programs include:

  • Emergent ideas from students on the usability of an OER text or platform (Cooney 2017, p.169)
  • Emergent issues regarding registration deadlines and faculty textbook adoptions from groups of students (Freed, Friedman, Lawlis, and Stapleton 2018)

OER Data Examples

The following table includes some examples of key impact indicators; data kept for analysis and calculations; data that disaggregates other data into various groups; methods used by grantees or instructional faculty overall; and instructional faculty’s perceptions of various course materials topics. Feel free to add to, remove from, or revise this data list for your own program’s needs and contexts.

OER Data List Spreadsheet

Disaggregating Your Data to Address Equity Directly

A focus on equal, day-one, no-cost access to resources is inherently a focus on equity; equity should be within an OER program manager’s mode of thinking at all times, and therefore measures focusing on equity should be integral to every OER program’s overall goals and strategies. When addressing student success with OER, measuring only the total aggregate data for all students can be a quick way to gauge overall effectiveness, but it is not a way to find out if equal access to quality resources is leading to more success specifically for students with barriers to that access. In fact, only looking at the data of all students affected by an OER course transformation will likely lead to an analysis that isn’t measuring your intended equity-focused outcomes (Grimaldi, Mallick, and Waters 2019).

To bring equity into your data collection strategy, plan to collect disaggregated data (data categorized by various groups) from your institution, reflecting the groups an equity-focused program primarily addresses: those with barriers to accessing educational materials and those whom traditional materials tend to exclude. Examples of disaggregated-data studies on OER efficacy include a same-instructor, multi-year analysis that broke down efficacy results by Pell eligibility, race/ethnicity, and enrollment status and found disproportionate benefits for marginalized groups (Colvard, Watson, and Park 2018), and a two-semester study of a 2014 calculus course with and without OER that found no effect for students overall but positive effects for marginalized groups (Delgado, Delgado, and Hilton 2019). A minimal sketch of this kind of analysis follows the list below.

Demographic groups that should be considered in OER data disaggregation include, but are not limited to (DOERS3 2021):
  • Socioeconomic status of the student or student’s family (Pell Grant eligibility)
  • Race/ethnicity of the student
  • Gender of the student, including self-reported gender identities
  • Indigenous status, if applicable in your region
  • First-generation status
  • Enrollment status and teaching/learning modalities (part-time, full-time / in-person, online, hybrid)
  • Accessibility needs / students with varied abilities
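
As promised above, here is a minimal disaggregation sketch. It assumes a hypothetical de-identified extract with one row per student per OER section and illustrative column names; a few lines of pandas contrast the aggregate DFW rate with the same rate broken out per group:

```python
import pandas as pd

# Hypothetical de-identified extract: one row per student per OER
# section, with a final_grade column and one column per equity group.
df = pd.read_csv("oer_outcomes_deidentified.csv")
df["dfw"] = df["final_grade"].isin(["D", "F", "W"])

# The aggregate alone can mask the effect you care about...
print(f"Overall DFW rate: {df['dfw'].mean():.1%}")

# ...so break the same measure out by each group you collect.
for group in ["pell_eligible", "first_generation", "race_ethnicity"]:
    print(df.groupby(group)["dfw"].mean().map("{:.1%}".format))
```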

Step 3: Assigning Specific Data Collection to Your Critical Questions

Now that you have your critical questions from stakeholders across your institution and a working knowledge of the fundamentals of OER data collection, it’s time to plan your actions. By assigning to each critical question the data required to answer it, you’ll have a framework for exactly what you need to collect, along with the stakeholders for whom the reports on this data will be intended.

Table 21.1. Stakeholder Example: Executive Administration

Critical Question | Data Required
How much in textbook cost avoidance have you saved students over the past academic year? | Student OER/zero-cost section enrollments affected this academic year; cost savings per student per course
How many students has this program affected? | Section enrollments affected cumulatively at the semester level
What are the savings numbers for last semester? | Section enrollments affected at the semester level; cost savings per student per course
Has the implementation of OER affected student retention at a course level? A degree program level? | Institution’s preferred method of measuring student retention per OER section, e.g. Drop/Fail/Withdraw (DFW) rates
Has the implementation of OER affected student success? Is this effect larger for first-generation students? | Grades and/or learning outcomes/competencies data per student affected; disaggregation of data by demographic groups
How long should we expect savings to continue due to one award? Do faculty turn to commercial resources after a certain period of time? If so, why? | Checks on the sustainability of an OER implementation for each instructor and each course/section; survey responses if OER use is discontinued
How do faculty feel about OER? Does that differ by department, or between introductory and advanced courses? | Qualitative data from surveys, focus groups, and/or interviews with instructional faculty
Do students think that the cost of materials is an important issue for our institution to address? | Qualitative data from surveys, focus groups, and/or interviews with students
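
Several of the rows above reduce to simple arithmetic once the underlying data is tracked per section. A minimal cost-avoidance sketch with made-up courses and prices, assuming your program counts the replaced commercial text’s price as its per-student savings figure (programs define this figure differently):

```python
# Made-up sections; each records its enrollment and the price of the
# commercial text the OER replaced. Your program's per-student savings
# figure may instead be an average or a flat estimate.
affected_sections = [
    {"course": "BIOL 1101", "enrollment": 240, "replaced_price": 99.95},
    {"course": "HIST 2111", "enrollment": 180, "replaced_price": 124.50},
]

total_students = sum(s["enrollment"] for s in affected_sections)
total_savings = sum(s["enrollment"] * s["replaced_price"]
                    for s in affected_sections)

print(f"Students affected this year: {total_students}")
print(f"Textbook cost avoidance this year: ${total_savings:,.2f}")
```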

Table 21.2. Stakeholder Example: Instructional Faculty (Instructors, Instructional Designers)

Critical Question | Data Required
Has the implementation of OER affected student success? Is this a same-instructor comparison, or an aggregate of all instructors? | Grades and/or learning outcomes/competencies data per section; disaggregation of data by instructor of course before and after OER implementation
Has the implementation of OER affected student success in the College of Arts and Sciences? | OER sections per college; grades and/or learning outcomes/competencies data per section
Has the implementation of OER affected student success in our IT degree programs? | OER sections per degree program; grades and/or learning outcomes/competencies data per section
For all the OER used in the Biology department, which textbook is used most for Concepts of Biology? | OER sections per department; open textbook(s) or other OER adopted per section
How do students feel about the OER materials they’ve used? | Qualitative data from surveys, focus groups, and/or interviews with students
Do enough of us know about OER to get started with implementation? How do faculty feel about OER once they get to know it? | Qualitative and quantitative data on the participation in / impact of professional development programming on OER at the institution

Table 21.3. Stakeholder Example: Students and Student Government Associations

Critical Question | Data Required
We are looking to support the implementation of OER campus-wide. Which faculty are already adopting OER? | Instructional faculty in each OER section; colleges/departments/degree programs per section
If all of our World History I sections had no-cost OER instead of commercial textbooks, how much would this save students over the next academic year? | Annual OER projections; projected number of enrollments in the next academic year; average savings per student per course
Is a student who takes an OER course in Electrical Engineering at our technical college more or less likely to be hired directly after graduation? | OER section enrollment per student per degree program; hiring data per student
What’s keeping our faculty from adopting OER? How can we help with any barriers they’re facing? | Qualitative data from surveys, focus groups, and/or interviews with instructional faculty
How do students feel about OER once they’ve used it? | Course evaluations; surveys

Table 21.4. Stakeholder Example: Campus Stores

Critical Question | Data Required
What percentage of students on campus are interested in a print-on-demand program for OER? | Quantitative survey data from students
Do bookstore employees know about OER? What do they think about it? | Qualitative data from surveys, focus groups, and/or interviews with bookstore employees
What do students think about our new low-cost mathematics platform? | Qualitative data from surveys, focus groups, and/or interviews with students
How are students performing with our new low-cost psychology adaptive platform? | Platforms adopted per section; grades and/or learning outcomes/competencies data per section; same-instructor comparisons before and after adoption
If we do a print service for open textbooks, what percentage of students in the course would want a printed textbook? | Quantitative survey data from students in the course

Step 4: Creating a Place for Your Data Collection

Now that you have identified which data you need to collect based on stakeholder needs, categorized and defined each type of data, and determined the methods for data collection for each, it’s time to create one place where all of this data resides. Whenever possible, keep this data together in one file; questions will inevitably arise which will require you to bring data points together that may have seemed entirely disconnected at first.

There is no one correct method or platform for hosting your data. When deciding where your data will reside, consider the following:

  • Which methods are you most familiar with?
  • Which methods allow you to sort by any data point easily?
  • Which methods allow a quick search of your data?
  • Which methods can manage multiple years of data? Does the system get overloaded when you have too many columns or rows?
  • Will you keep any data considered personally identifiable information? In this case, which methods allow for you to comply with all FERPA guidelines and manage personal data ethically? What should you not share with the public due to privacy?
  • If some data needs to be protected (e.g. in the event of gathering PII), how secure are the methods and platforms from cyberattacks?
  • Which methods allow for accessible data visualization? Visualization will make your data more usable and readable for stakeholders.
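
As one illustration of “one place” for your data, here is a minimal sketch of a single flat CSV with one row per OER section per term; every column name here is illustrative, and the instructor field assumes you store an anonymized identifier rather than a name:

```python
import csv

# Illustrative columns for one row per OER section per term; the
# instructor_id field assumes an anonymized identifier, not a name.
fieldnames = [
    "term", "course", "section", "department", "instructor_id",
    "enrollment", "cost_before", "cost_after", "designator",
    "grant_id", "notes",
]

with open("oer_master_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerow({
        "term": "Fall 2021", "course": "ENGL 1101", "section": "03",
        "department": "English", "instructor_id": "anon_7f3a",
        "enrollment": "28", "cost_before": "89.99", "cost_after": "0",
        "designator": "NC", "grant_id": "G-2021-042",
        "notes": "OpenStax adoption",
    })
```

A single tidy table like this keeps every row sortable and searchable, and it scales across years without restructuring; summary views for specific stakeholders can then be derived from it rather than maintained by hand.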

Here are a few examples of places for OER data. All of these examples have internal data storage tools that are linked directly to their external reporting structures:

  • Affordable Learning Georgia uses Microsoft Excel for one large ALG Tracking sheet. This sheet is hosted on a shared drive and editable by anyone in ALG. Microsoft Power BI links with Excel to create data visualizations and export institution-specific reports to PDF.
  • Kwantlen Polytechnic University visualizes their live Zero Textbook Cost program data through Tableau.
  • Open Oregon Educational Resources stores their data in a Google Sheet and visualizes this data in a searchable web table.

By this point in the planning process, you should have a solid data strategy and plan in place for your OER program. This plan should evolve over time as stakeholder needs and data platform capabilities change and/or expand.

Conclusion

Data collection allows an OER program manager to analyze program activities, determine the impact of projects, and report on this impact to governments, executive administrators, faculty and staff, students, and the public. Determining how you will measure the impact of your OER program early in the building process is a crucial part of creating and sustaining a successful program. Be sure to refer to your Environmental Scan (see Building Familiarity on Campus) in determining who, other than you and your team, collects and shares this helpful data.

Recommended Resources

Marking Open and Affordable Courses (Hare, Kirschner, and Reed 2020), an open text published by the University of Texas at Arlington, is a comprehensive guide to no-cost and low-cost designators, containing analyses of the policy and practices behind OER/affordable course markings and nine case studies from diverse higher education institutions and systems.

Getting to know the basics of quantitative and qualitative research is an essential task for new OER program managers. Social Science Research: Principles, Methods, and Practices (Bhattacherjee 2012) is an open textbook that dives into the theories behind both quantitative and qualitative research; be sure to check out the full chapter on qualitative analysis (p.113).

This text only addresses the collection of data as immediately relevant to OER program managers. For a more in-depth look at OER research methods (for example, research meant to be published in a peer-reviewed journal), read the OER Research Toolkit (Open Education Group 2016).

Key Takeaways
  1. Use quantitative data to find the magnitude of the effect of particular OER programs or projects, the needs of your institution and its departments when selecting course materials, or how different introductory courses at your institution have adopted OER at different rates.
  2. Use qualitative data to illustrate the meaning behind quantitative data, gain an understanding of the overall emotions and opinions surrounding various goals and projects within your program, and identify emerging trends which your more deterministic quantitative questions could not have anticipated.
  3. OER programs are inherently focused on equity. Planning to collect data disaggregated by groups with barriers to accessing quality educational resources will help measure the effect your program has on the students who need it most.
  4. Reporting data will be largely based on what your key stakeholders want to know. Use information from your environmental scan to further plan the data you will collect.
  5. Stakeholder needs and the capabilities of platforms to keep and analyze data will change over time at your institution. Be sure that your plan evolves alongside these changes.

References

Armor, David Alain. 1998. “The illusion of objectivity: A bias in the perception of freedom from bias.” Dissertation Abstracts International: Section B: The Sciences and Engineering, 59(9-B), 5163. American Psychological Association. https://psycnet.apa.org/record/1999-95006-117

Atlassian. 2021. “User Stories | Examples and Template.” Atlassian.com. Accessed January 30, 2022. https://www.atlassian.com/agile/project-management/user-stories

Babich, Nick. 2019. “Card Sorting Best Practices for UX.” Adobe. Accessed January 30, 2022. https://xd.adobe.com/ideas/process/information-architecture/card-sorting-best-practices/

Bell, Steven. 2018. “Course Materials Adoption: A Faculty Survey and Outlook for the OER Landscape.” Choice 360. https://www.choice360.org/research/course-materials-adoption-a-faculty-survey-and-outlook-for-the-oer-landscape/

Bhattacherjee, Anol. 2012. Social Science Research: Principles, Methods, and Practices. Florida: University of South Florida Libraries. https://digitalcommons.usf.edu/oa_textbooks/3/

Blackstone, Amy. 2012. Principles of Sociological Inquiry – Qualitative and Quantitative Methods. Saylor Foundation. https://open.umn.edu/opentextbooks/textbooks/principles-of-sociological-inquiry-qualitative-and-quantitative-methods

Colvard, Nicholas B., C. Edward Watson, and Hyojin Park. 2018. “The Impact of Open Educational Resources on Various Student Success Metrics.” International Journal of Teaching and Learning in Higher Education 30(2): 262–76. https://www.isetl.org/ijtlhe/pdf/IJTLHE3386.pdf

Delgado, Huimei, Michael Delgado, and John Hilton III. 2019. “On the Efficacy of Open Educational Resources.” The International Review of Research in Open and Distributed Learning 20(1): 184–203. http://www.irrodl.org/index.php/irrodl/article/view/3892/4959

DOERS3. 2021. “OER Equity Blueprint: Theoretical Framework and Research Foundation.” https://www.doers3.org/theoretical-framework-and-research-foundation.html

Freed, Brooke, Amber Friedman, Sarah Lawlis, and Angie Stapleton. 2018. “Evaluating Oregon’s Open Educational Resources Designation Requirement.” https://www.oregon.gov/highered/research/Documents/Reports/HECC-Final-OER-Report_2018.pdf

Grimaldi, Philip, Debshila Basu Mallick, Andrew Waters, and Richard Baraniuk. 2019. “Do open educational resources improve student learning? Implications of the access hypothesis.” PLOS ONE, 14(3). https://doi.org/10.1371/journal.pone.0212508

Hare, Sarah, Jessica Kirschner, and Michelle Reed (Eds). 2020. Marking Open and Affordable Courses: Best Practices and Case Studies. Arlington, TX: Mavs Open Press. https://uta.pressbooks.pub/markingopenandaffordablecourses/

Kwantlen Polytechnic University. 2020. “KPU classes – with $0 for textbooks!” Accessed January 30, 2022. https://www.kpu.ca/open/ztc

Maricopa Community Colleges. 2013. “Open Educational Resources.” Accessed January 30, 2022. https://www.maricopa.edu/current-students/open-educational-resources

McGrath, Cormac, Per J. Palmgren, and Matilda Liljedahl. 2018. “Twelve tips for conducting qualitative research interviews.” Medical Teacher 41(9): 1002-1006. https://www.tandfonline.com/doi/full/10.1080/0142159X.2018.1497149

OECD. 2016. “Research Ethics and New Forms of Data for Social and Economic Research.” OECD Science, Technology and Industry Policy Papers, 34. Paris: OECD Publishing. https://doi.org/10.1787/5jln7vnpxs32-en

Open Oregon Educational Resources. n.d. “Resources.” Accessed January 20, 2022. https://openoregon.org/resources/

Oxford Lexico. 2020. “Definition of DATA.” Lexico Dictionaries | English. https://www.lexico.com/en/definition/data

Seaman, Julia E., and Jeff Seaman. 2018. “Freeing the Textbook: Educational Resources in U.S. Higher Education, 2018.” Babson Survey Research Group. https://www.onlinelearningsurvey.com/reports/freeingthetextbook2018.pdf

Seaman, Julia E., and Jeff Seaman. 2017. “Opening the Textbook: Educational Resources in Higher Education, 2017.” Bay View Analytics. https://www.bayviewanalytics.com/reports/openingthetextbook2017.pdf

SPARC. 2021. “OER State Policy Tracker.” Accessed January 30, 2022. https://sparcopen.org/our-work/state-policy-tracking/

University System of Georgia. 2018. “2018 USG Survey Report on Open Educational Resources.” Affordable Learning Georgia. https://www.affordablelearninggeorgia.org/documents/2018_USG_OER_Survey.pdf

University System of Georgia. 2021. “ALG Data Center.” Affordable Learning Georgia. Accessed January 30, 2022. https://www.affordablelearninggeorgia.org/about/data

U.S. Department of Health and Human Services. 2018. “45 CFR 46.” https://www.hhs.gov/ohrp/regulations-and-policy/regulations/45-cfr-46/index.html

License

The OER Starter Kit for Program Managers Copyright © 2022 by Abbey K. Elder; Stefanie Buck; Jeff Gallant; Marco Seiferle-Valencia; and Apurva Ashok is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
