Managing Free Response Questions

What makes a free response question good?

Free response questions ask students to explain their answers, which means you can often gain a deeper picture of a student's knowledge (Criswell & Criswell, 2004).

The best free response questions ask the student to process information or knowledge rather than reproduce it, for example by explaining their reasoning or applying what they know in new contexts (Schuwirth & VanDerVleuten, 2004).

Management tools

Managing free response questions can be difficult. From collecting responses in a simple, straightforward way to grading them and sharing feedback, free response questions often create extra work for instructors.

Google Classroom has an excellent grading workflow for digital work. Classroom makes student work accessible from any location with internet access, so there are no stacks of papers to lug home. You can create a comment bank of comments you find yourself using frequently, and assigning scores is easy as you toggle through submitted work. You can also add private comments to a student's work to give them direct feedback on their submission, all in one place. Learn more about grading and feedback tools in Google Classroom.
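
If you grade large batches of digital work, the Google Classroom API can automate parts of this workflow. Below is a minimal sketch in Python, assuming you have already completed an OAuth flow and saved credentials to a token.json file; the course and coursework IDs are hypothetical placeholders. Note that the API generally only allows grade changes on coursework created by the same developer project.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/classroom.coursework.students"]

# Assumes an OAuth flow has already produced token.json; see the
# google-api-python-client documentation for the full authorization setup.
creds = Credentials.from_authorized_user_file("token.json", SCOPES)
service = build("classroom", "v1", credentials=creds)

COURSE_ID = "123456"       # hypothetical IDs: replace with your own
COURSEWORK_ID = "654321"

# Fetch every student submission for one assignment.
subs = service.courses().courseWork().studentSubmissions().list(
    courseId=COURSE_ID, courseWorkId=COURSEWORK_ID
).execute()

for sub in subs.get("studentSubmissions", []):
    if sub.get("state") == "TURNED_IN":
        # Record a draft grade; it stays private until you return
        # the submission to the student.
        service.courses().courseWork().studentSubmissions().patch(
            courseId=COURSE_ID,
            courseWorkId=COURSEWORK_ID,
            id=sub["id"],
            updateMask="draftGrade",
            body={"draftGrade": 8},
        ).execute()
```

Using draftGrade rather than assignedGrade mirrors the review-then-return flow described above: scores are not visible to students until you return their work.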

Self-assessment can sometimes be an option for grading students, and Forms is a great tool for building question-based self-assessments that prompt students to reflect on their mastery. Learn more about Google Forms.
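
For teachers comfortable with a little scripting, the Google Forms API can generate a self-assessment like this programmatically. The sketch below is illustrative only, assuming the same kind of stored OAuth credentials as above; the form title and question wording are placeholders.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/forms.body"]
creds = Credentials.from_authorized_user_file("token.json", SCOPES)
forms = build("forms", "v1", credentials=creds)

# The Forms API only accepts a title at creation time; questions
# are added afterwards with batchUpdate.
form = forms.forms().create(
    body={"info": {"title": "Unit self-assessment"}}
).execute()

# Add a 1-5 scale question prompting students to rate their mastery.
forms.forms().batchUpdate(
    formId=form["formId"],
    body={
        "requests": [{
            "createItem": {
                "item": {
                    "title": "How confident are you that you could explain "
                             "this week's concept to a classmate?",
                    "questionItem": {
                        "question": {
                            "required": True,
                            "scaleQuestion": {
                                "low": 1,
                                "high": 5,
                                "lowLabel": "Not yet",
                                "highLabel": "I could teach it",
                            },
                        }
                    },
                },
                "location": {"index": 0},
            }
        }]
    },
).execute()
```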

Using rubrics

Rubrics tell the scorer what features of the student's answer they should focus on and how to award and distribute points for the response (Livingston, 2009), which can be useful for standardization across grade levels and/or school sites.

You can create a rubric with many different tools, but Sheets and Docs are both great for creating tables, which is the usual format for building a rubric.
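
If you standardize rubrics across a grade level or department, the Google Sheets API can generate the table programmatically as well. Here is a minimal sketch, again assuming stored OAuth credentials; the criteria and descriptors are hypothetical placeholders.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets"]
creds = Credentials.from_authorized_user_file("token.json", SCOPES)
sheets = build("sheets", "v4", credentials=creds)

# Create a new spreadsheet to hold the rubric.
ss = sheets.spreadsheets().create(
    body={"properties": {"title": "Essay rubric"}}
).execute()

# One header row plus one row per criterion.
rubric = [
    ["Criterion", "4 - Exceeds", "3 - Meets", "2 - Approaching", "1 - Beginning"],
    ["Thesis", "Clear, arguable, specific", "Clear and arguable",
     "Present but vague", "Missing or unclear"],
    ["Evidence", "Well chosen, fully explained", "Relevant and explained",
     "Relevant but thin", "Little or none"],
    ["Organization", "Logical throughout", "Mostly logical",
     "Some lapses", "Hard to follow"],
]

# Write the table starting at the top-left cell.
sheets.spreadsheets().values().update(
    spreadsheetId=ss["spreadsheetId"],
    range="A1",
    valueInputOption="RAW",
    body={"values": rubric},
).execute()
```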

Google Classroom's rubric integration is currently only in beta, so for most teachers the grading workflow still offers just one place to record a single total score. In the meantime, a practical workaround is to attach a rubric created in Docs or Sheets to the assignment and deliver rubric-based feedback through private comments.

Consider your assessment options

Despite all of their benefits, free response questions are not always the best way to assess students; in fact, no single question type is consistently and intrinsically better than the others (Criswell & Criswell, 2004; Schuwirth & VanDerVleuten, 2004). A student's performance tends to vary more from question to question on free response items, and the scoring process requires judgment, meaning that different instructors might give different scores to the same response (Livingston, 2009).

There are also gender differences in student performance that vary by question type (Breland, Danos, Kahn, Kubota, & Bonner, 1994; Mazzeo, Schmitt, & Bleistein, 1992; Livingston & Rupp, 2004). When males and females perform equally well on a set of multiple choice questions, females tend to outperform males on free response questions; conversely, when males and females perform equally well on a set of free response questions, males tend to outperform females on multiple choice questions. While this difference may be unavoidable in many respects, the practical takeaway is to assess students in several different ways to truly capture their mastery.

Google Forms' Quiz feature is a simple way to offer multiple choice questions when they are appropriate for an assessment. Learn more about Google Forms & Quizzes.
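
For bulk quiz creation, the same Forms API can mark a form as a quiz and add auto-graded multiple choice questions. This is another illustrative sketch under the same credential assumptions; the question, options, and answer key are placeholders.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/forms.body"]
creds = Credentials.from_authorized_user_file("token.json", SCOPES)
forms = build("forms", "v1", credentials=creds)

form = forms.forms().create(body={"info": {"title": "Chapter quiz"}}).execute()

forms.forms().batchUpdate(
    formId=form["formId"],
    body={
        "requests": [
            # Turn the form into a self-grading quiz.
            {
                "updateSettings": {
                    "settings": {"quizSettings": {"isQuiz": True}},
                    "updateMask": "quizSettings.isQuiz",
                }
            },
            # One multiple choice question with a point value and answer key.
            {
                "createItem": {
                    "item": {
                        "title": "Which gas do plants absorb during photosynthesis?",
                        "questionItem": {
                            "question": {
                                "required": True,
                                "grading": {
                                    "pointValue": 2,
                                    "correctAnswers": {
                                        "answers": [{"value": "Carbon dioxide"}]
                                    },
                                },
                                "choiceQuestion": {
                                    "type": "RADIO",
                                    "options": [
                                        {"value": "Oxygen"},
                                        {"value": "Carbon dioxide"},
                                        {"value": "Nitrogen"},
                                    ],
                                },
                            }
                        },
                    },
                    "location": {"index": 0},
                }
            },
        ]
    },
).execute()
```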

References

Breland, H. M., Danos, D. O., Kahn, H. D., Kubota, M. Y., & Bonner, M. W. (1994). Performance versus objective testing and gender: An exploratory study of an Advanced Placement history examination. Journal of Educational Measurement, 31, 275-293.

Criswell, J. R., & Criswell, S. J. (2004). Asking essay questions: Answering contemporary needs. Education, 124(3), 510-516.

Livingston, S. A., & Rupp, S. L. (2004). Performance of men and women on multiple-choice and constructed-response tests for beginning teachers (ETS Research Report No. RR-04-48). Princeton, NJ: Educational Testing Service.

Livingston, S. A. (2009). Constructed-response test questions: Why we use them; How we score them. R&D Connections, (11), 1-8.

MacCann, R., Eastment, B., & Pickering, S. (2002). Responding to free response examination questions: Computer versus pen and paper. British Journal of Educational Technology, 33(2), 173-188.

Mazzeo, J., Schmitt, A., & Bleistein, C. A. (1992). Sex-related performance differences on constructed-response and multiple-choice sections of Advanced Placement examinations (College Board Research Rep. No. 92-7; ETS Research Report No. RR-93-05). New York: College Entrance Examination Board.

Schuwirth, L., & VanDerVleuten, C. (2004). Different written assessment methods: What can be said about their strengths and weaknesses? Medical Education, 38(9), 974-979.
