Frequently Asked Questions
General
Quantitative reasoning, also called numeracy or quantitative literacy, is the ability to use and make sense of numbers.
Development and validation of QuaRCS quantitative questions took place over eight semesters of administration and refinement (2010-2015).
Content validity was established by selecting for assessment the ten quantitative skills that educators ranked as most important for (a) supporting science literacy and (b) everyday life.
Significant differences in the performance of groups with different levels of numeracy (general education science students, students enrolled in STEM major courses, and experts) demonstrate that the instrument is capable of distinguishing varied levels of numeracy, from novice to expert.
The development and validation of the QuaRCS is described in detail in:
Follette, K. B., McCarthy, D. W., Dokter, E., Buxner, S., & Prather, E. (2015). “The Quantitative Reasoning for College Science (QuaRCS) Assessment, 1: Development and Validation.” Numeracy, Vol. 8, Iss. 2.
The skills assessed on the QuaRCS vary between the versions of the assessment. For the QuaRCS “Full” 25-question assessment, the skills assessed are:
- Graph Reading
- Table Reading
- Arithmetic
- Proportional Reasoning
- Estimation
- Percentages/Fractions
- Statistics and Probability
- Area and Volume
- Unit Conversions and Dimensional Analysis
- Error
The QuaRCS “Light” version of the assessment includes six of these skills: Arithmetic, Table Reading, Graph Reading, Proportional Reasoning, Estimation, and Percentages.
The QuaRCS “STEM” version is a subset of the most challenging QuaRCS questions. This version of the assessment includes the following skills: Arithmetic, Graph Reading, Proportional Reasoning, Estimation, Percentages, Probability/Statistics, and Units/Dimensional Analysis.
Our study aims to inform the role that college-level instruction can and should play in teaching and improving the basic numerical skills that modern adults need in order to be discerning voters, conscientious consumers, and informed citizens.
In Phase 1 of the study (currently in progress), we aim to inform the problem by administering the assessment broadly across a wide range of institution types and course disciplines. This will allow us to establish a national baseline and to probe the general relationships between students’ skills, attitudes, and confidence, as well as how attitudes and skills vary across groups.
In Phase 2 of the study, we aim to study the classroom practices of instructors who are making measurable improvements in their students’ skills and attitudes and to design curricular modules and recommendations based on the practices of these instructors.
For Students
Your instructor will know who completed the assessment but will not receive individual results, and the report they receive will be fully anonymized, per our Institutional Review Board (IRB) protocol. We will not share names, academic majors, class years, or demographic information. The report will include only responses to the quantitative and attitudinal questions on the assessment, as well as scores and composites.
On average, students take 15 to 25 minutes to complete the fifteen-question (light) and fourteen-question (STEM) assessments, and about 30-40 minutes to complete the twenty-five-question (full) assessment.
In addition to our research, we would like to provide resources for students interested in improving their quantitative reasoning skills. These resources are not a part or product of our study, but are offered to give students the opportunity to shape their own educational experiences.
Understanding Quantitative Reasoning/Numeracy: The National Numeracy Network (NNN) provides a brief overview of numeracy as well as a list of recommended books. The NNN website also includes a section of resources for improving numeracy.
Improving Your Quantitative Reasoning Skills/Numeracy:
This website, created by a quantitative reasoning researcher, provides a number of projects to help build your quantitative reasoning skills. We recommend that you start with Project 1 and work through as many projects as you find helpful.
This website, created by the Science Education Resource Center (SERC) at Carleton College, is aimed at geoscience teachers but has a wide range of resources and tools that students can use independently.
For Instructors
Any college class is welcome to take the QuaRCS assessment. However, we are particularly interested in studying general STEM education classes, especially those for nonmajors. These students are less likely to intentionally seek out ways to improve their quantitative reasoning skills, and may even actively avoid classes that directly address mathematical ability, such as physics, statistics, or explicitly labeled quantitative reasoning courses. In addition, STEM majors are more likely to have developed their quantitative reasoning skills beyond what the QuaRCS is designed to measure.
Yes! We have a version of the assessment specifically for STEM major courses, and in some cases instructors of introductory courses for STEM majors may wish to administer the full or light versions of the assessment (if, for example, they would like to gauge their students’ command of more elementary numerical skills). For upper-level STEM courses, the assessment is less useful, as many STEM majors perform above the level that QuaRCS is designed to study.
Yes! We are very interested in understanding more about how students’ numerical skills and attitudes vary among humanities and STEM disciplines.
To sign up to administer the QuaRCS to your students, begin by completing the instructor intake form (*link*).
Completion of the form will prompt us to add your name and institution as options in the online survey. Once we have done so (typically within 2-3 business days), you will receive a link to share with your students, along with more information about how to administer the assessment.
In addition to helping further our understanding of quantitative reasoning skills, you will receive a report on changes in student attitudes and skills from the pre- to post-semester assessments. We will do all of the grading and help you quantify how your course influences students’ quantitative skills and attitudes toward mathematics.
The due date(s) that you enter on the instructor intake form will trigger automatic generation of your report on the day after each due date. Once the report has been inspected by a QuaRCS team member for errors, you will be sent the following files within 2-3 business days:
- A report summarizing your students’ skills and attitudes toward mathematics
- A list of the names of students who completed the assessment
- A CSV file containing anonymized student data
- A Jupyter notebook file containing the code needed to generate the visualizations in the report, should you wish to use or modify them
Per our Institutional Review Board (IRB) protocol, we can provide you only with fully anonymized data for your students. You will receive a file with student responses to the quantitative and attitudinal questions on the assessment, as well as scores and composites; however, this file will not contain names, academic majors, class years, or demographic information.
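If you would like to explore the anonymized CSV outside of the provided Jupyter notebook, a minimal sketch of the kind of analysis it supports is shown below. The file name and column names used here (e.g., pre_score, post_score) are hypothetical placeholders; consult the header of the file you actually receive.

```python
# Minimal sketch: exploring the anonymized QuaRCS data with pandas.
# NOTE: the file name and the column names ("pre_score", "post_score")
# are hypothetical placeholders, not the actual QuaRCS file format.
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv("quarcs_anonymized.csv")

# Mean pre- and post-semester scores for the class
print(data[["pre_score", "post_score"]].mean())

# Distribution of individual score changes from pre to post
(data["post_score"] - data["pre_score"]).plot.hist(bins=10)
plt.xlabel("Change in QuaRCS score (post - pre)")
plt.ylabel("Number of students")
plt.show()
```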
Your report is kept strictly confidential and is sent only to you. All instructor information is replaced with an anonymous identifier before analysis of any QuaRCS data, including by the QuaRCS study team. School and course information are also replaced by generic identifiers such as “Liberal arts college” or “geological sciences” before analysis. Your name and individual course results will never be released.
We ask that you not debrief individual questions or the general results of the survey with your students, due to concerns about the impact this would have on the post-semester assessment.
For most general education classes, we recommend the fifteen-question “QuaRCS Light” assessment. This version is shorter than the original version but has been similarly validated and meets all standard metrics for reliability. This shorter assessment measures the quantitative skills of Arithmetic, Table Reading, Graph Reading, Proportional Reasoning, Estimation, and Percentages.
The 25-question “QuaRCS Full” assessment covers a larger set of skills. Our current recommendation is that general education course instructors administer this version only if they are particularly interested in the additional skills (Probability/Statistics, Area/Volume, Error, and Units/Dimensional Analysis) or feel that their students are particularly motivated.
The “QuaRCS STEM” assessment is targeted at students who are already in a STEM field. This pilot assessment is a subset of the 14 most challenging QuaRCS questions and has been shown to meet the appropriate metrics for validity and reliability in a single large (N=261) STEM major course. We are in the process of validating it for a larger population (you can help!).
On average, students take 15 to 25 minutes to complete the fifteen-question (light) and fourteen-question (STEM) assessments, and about 30-40 minutes to complete the twenty-five-question (full) assessment.
We highly recommend that you assign the QuaRCS to your students once at the beginning of your course and again at the end of your term. This allows us and you to assess the impact of instruction on your students’ skills and attitudes toward mathematics. However, if you are just looking to establish a baseline measure, you are welcome to administer the QuaRCS just once at any point during the term.
We strongly recommend that you assign the instrument for participation credit as part of a required homework*. If you choose to assign the instrument for extra credit, please note that your reports will not necessarily reflect the skills and opinions of your class as a whole, as our past experience shows that the response rate will generally be lower and fewer students will complete both pre- and post-semester assessments.
*An alternate assignment will be provided to instructors upon request if any of their students ask not to participate in the study.
The QuaRCS is best able to establish a baseline measure of quantitative skills and student attitudes toward mathematics when administered at the very beginning of your course. We recommend that you assign it to your students for participation credit as early as possible during your term.
Choice of a “post-semester” assessment date is somewhat more complex. We find that student willingness to devote effort to the assessment drops at the very end of the term. This is mitigated somewhat when administering the QuaRCS “Light” version of the assessment, but in any case, we suggest that you administer the “post” assessment 2-3 weeks before the conclusion of your term.
For Researchers
Our automated data download is currently under revision; please reach out to quarcs@amherst.edu to request data.
Scores, attitudes, and basic demographics are available from our dataset.
Please contact us directly for more information while we revise our automated process.