Explain The Significance Of Essay Type Test Items About Fractions

Forty-two states and the District of Columbia are now using the same math and English standards, but the tests they use to determine how well students have mastered them still vary significantly.

One of the goals of the Common Core State Standards was to be able to compare student performance from state to state on a yearly basis. Five years ago, it looked like that would happen. Nearly all Common Core adopters were in at least one of two national consortia that would be creating new exams to accompany the standards: the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers, known as PARCC.

Those numbers have dwindled. Just 20 states and the District of Columbia plan to give one of the two tests this spring. Others are back where they started: Using tests unique to their state. So even though, in theory, students in Connecticut, Wisconsin and Arizona are all learning the same thing, they’ll be measured differently.

The Common Core writers were very interested in improving how fractions are taught in U.S. classrooms, so we looked at six of these tests to compare how they deal with word problems involving fractions in the fifth grade: those from New York, Wyoming and Florida, as well as PARCC, Smarter Balanced and ACT Aspire, an exam made by the organization that produces the ACT college readiness exam. (So far, Aspire is only given in Alabama.)

As always, when we’re talking about testing, there are caveats. Even though all these questions deal with fractions, they may be testing different standards. They’re also a mix of actually tested items and sample questions. The sample questions never appeared on the tests but were published ahead of the exams to give teachers and students an idea of what to expect, while the actual test items were released after appearing on an exam last spring. In either case, just because a type of question doesn’t show up here doesn’t mean it wasn’t on the test – or won’t be on future tests.

In other words, without an army of undercover fifth-grade reporters spying for us, it’s impossible to do a comprehensive comparison of the exams.

Nevertheless, while we can’t draw conclusions about which test is best from this sample of questions, we can see some important differences in each exam’s approach and in how far each departs – or doesn’t – from the old way of doing things.

Let’s start with the obvious: whether a question is multiple choice. Smarter Balanced and PARCC are computer-based assessments, meaning it’s easy to go beyond multiple choice and require students to type in open-response answers. The contrast is seen most starkly when comparing these questions from ACT Aspire, a paper-based test, and PARCC.

It’s virtually the same question, but it’s much easier to guess the right answer on ACT Aspire than on PARCC. That’s compounded by the fact that the correct answer on the ACT question, 36, is a clear outlier.

“You don’t want the correct answer to stand out,” said Andrew Latham, director of Assessment & Standards Development Services at WestEd, which developed test questions for both PARCC and Smarter Balanced. If students understand enough to know the answer must be greater than 9, they don’t have to do any math to get the right answer, so the question doesn’t necessarily test whether they’re able to divide by a fraction.
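
The underlying arithmetic point is simply that dividing by a fraction between 0 and 1 yields a result larger than the number you started with. The ACT item itself isn’t reproduced here, but in general, for any fraction \( 0 < \tfrac{b}{c} < 1 \):

$$ 9 \div \tfrac{b}{c} = 9 \times \tfrac{c}{b} > 9 $$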

The test makers also made different choices about how many steps each problem would take to solve. Look at these two similar questions from Smarter Balanced and Wyoming’s state exam.

The Smarter Balanced question only requires one step to get the right answer, dividing 2 by 1/5. To get Wyoming’s test question right, students first need to be able to use the number line to figure out the shortest and longest distances before they can do the rest of the math. “If you can’t read a number line, it doesn’t matter if you can subtract fractions or not, you’ll get it wrong,” Latham said.
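
To check the single step on the Smarter Balanced item as described above, dividing by a unit fraction is the same as multiplying by its reciprocal:

$$ 2 \div \tfrac{1}{5} = 2 \times 5 = 10 $$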

And that’s not a good thing, he added. “I prefer when you focus more on a given standard. If they got it wrong, I don’t know why they got it wrong.”

Phil Daro, one of the lead writers of the Common Core math standards, pointed out that students are asked to do multistep problems in the classroom, however. “You have to have them on the test,” he said.

Of course, paper-and-pencil tests don’t rely only on multiple-choice questions. They also include open-response questions, like the New York examples, where students are given a standard word problem and asked to show their work before writing down an answer.

These questions take more time and money to grade, and are more prone to human error in grading, but there are pluses. Like the write-in answer on the PARCC example above, short-answer math questions eliminate students’ ability to guess, with the added benefit of allowing partial credit – if a student sets up the problem correctly but makes an error adding two fractions, for instance.

PARCC sometimes mimics that process with a question like this, which requires students to first write out the expression they’d use to find the answer.

Regardless of whether a question is multiple choice or open response, clarity matters a lot. That’s a particularly important consideration on math tests, where you run the risk of a student getting a question wrong because of weak reading skills rather than weak math skills.

Daro pointed to this ACT Aspire question as an example of “inconsiderate” writing:

He suggested the phrasing could have been made better for a fifth grader by using consistent language and saying “Mario divided his circle into two equal sections.” And the instructions were particularly confusing. “Selecting a word that names the fraction?” Daro said. “That’s like a grammarian talking, not a fifth grader.”

The range of ambiguity in question phrasing is highlighted by these two PARCC items.

The first question’s wording is straightforward, the experts said. The second is more convoluted, in part because the questions are attempting to assess different things. The first just checks whether students can subtract fractions, while the second tries to measure students’ reasoning. (If you’re curious but all the fractions are making you cross-eyed, the correct answers are B and E.)

Daro criticized the test makers for using multiple choice at all to attempt to test how students think. “Multiple choice is the wrong genre for that,” he said. “Either have the kid produce the argument or show them a single argument and have them critique that.”

He also cautioned against blaming Common Core for poorly written test questions – particularly when many, if not all, of these items could have appeared on old exams. Many people think that “whenever you see something that looks odd, it’s because of the Common Core, but that’s just not true,” he said. “Standards can differ in ways that don’t manifest in different items on a test.”

Short Answer & Essay Tests

Strategies, Ideas, and Recommendations from the Faculty Development Literature

General Strategies

  • Do not use essay questions to evaluate understanding that could be tested with multiple-choice questions.

    Save essay questions for testing higher levels of thought (application, synthesis, and evaluation), not the recall of facts. Appropriate tasks for essays include:
    Comparing: Identify the similarities and differences between...
    Relating cause and effect: What are the major causes of...? What would be the most likely effects of...?
    Justifying: Explain why you agree or disagree with the following statement.
    Generalizing: State a set of principles that can explain the following events.
    Inferring: How would character X react to the following?
    Creating: What would happen if...?
    Applying: Describe a situation that illustrates the principle of...
    Analyzing: Find and correct the reasoning errors in the following passage.
    Evaluating: Assess the strengths and weaknesses of...

  • Don't give students a choice of questions to answer.

    There are three drawbacks to giving students a choice. First, some students will waste time trying to decide which questions to answer. Second, you will not know whether all students are equally knowledgeable about all the topics covered on the test. Third, since some questions are likely to be harder than others, the test could be unfair.

  • Ask students to write more than one essay.

    Tests that ask only one question are less valid and reliable than those with a wider sampling of test items. In a fifty-minute class period, you may be able to pose three essay questions or ten short answer questions.

  • Give students advice on how to approach an essay or short-answer test.

    To reduce students' anxiety and help them see that you want them to do their best, give them pointers on how to take an essay exam. For example:

    • Survey the entire test quickly, noting the directions and estimating the importance and difficulty of each question. If ideas or answers come to mind, jot them down quickly.
    • Outline each answer before you begin to write. Jot down notes on important points, arrange them in a pattern, and add specific details under each point.


Writing Effective Test Questions

  • State the question clearly and precisely.

    Avoid vague questions that could lead students to different interpretations. If you use the word "how" or "why" in an essay question, students will be better able to develop a clear thesis. Here are examples of poor and better essay and short-answer questions:
    Poor: What are three types of market organization? In what ways are they different from one another?
    Better: Define oligopoly. How does oligopoly differ from both perfect competition and monopoly in terms of number of firms, control over price, conditions of entry, cost structure, and long-term profitability?

    Poor: Name the principles that determined postwar American foreign policy.
    Better: Describe three principles on which American foreign policy was based between 1945 and 1960; illustrate each of the principles with two actions of the executive branch of government.

  • Consider the layout of the question.

    If you want students to consider certain aspects or issues in developing their answers, set them out in a separate paragraph. Leave the question on a line by itself.

  • Write out the correct answer yourself.

    Use your version to help you revise the question, as needed, and to estimate how much time students will need to complete the question. If you can answer the question in ten minutes, students will probably need twenty to thirty minutes. Use these estimates in determining the number of questions to ask on the exam. Give students advice on how much time to spend on each question.

  • Decide on guidelines for full and partial credit.

    Decide which specific facts or ideas a student must mention to earn full credit and how you will award partial credit. Below is an example of a holistic scoring rubric used to evaluate essays:

    • Full credit (six points): The essay clearly states a position, provides support for the position, and raises a counterargument or objection and refutes it.
    • Five points: The essay states a position, supports it, and raises a counterargument or objection and refutes it. The essay contains one or more of the following ragged edges: evidence is not uniformly persuasive, counterargument is not a serious threat to the position, some ideas seem out of place.
    • Four points: The essay states a position and raises a counterargument, but neither is well developed. The objection or counterargument may lean toward the trivial. The essay also seems disorganized.
    • Three points: The essay states a position, provides evidence supporting the position, and is well organized. However, the essay does not address possible objections or counterarguments. Thus, even though the essay may be better organized than the essay given four points, it should not receive more than three points.
    • Two points: The essay states a position and provides some support but does not do it very well. Evidence is scanty, trivial, or general. The essay achieves its length largely through repetition of ideas and inclusion of irrelevant information.
    • One point: The essay does not state the student's position on the issue. Instead, it restates the position presented in the question and summarizes evidence discussed in class or in the reading.
  • Read the exams without looking at the students' names.

    Try not to bias your grading by carrying over your perceptions about individual students. Some faculty ask students to put a number or pseudonym on the exam and to place that number or pseudonym on an index card that is turned in with the test; others have students write their names on the last page of the blue book or on the back of the test.

  • Skim all exams quickly, without assigning any grades.

    Before you begin grading, you will want an overview of the general level of performance and the range of students' responses.

  • Choose examples of exams to serve as anchors or standards.

    Identify exams that are excellent, good, adequate, and poor. Use these papers to refresh your memory of the standards by which you are grading and to ensure fairness over the period of time you spend grading.

  • Grade each exam question by question rather than grading all questions for a single student.

    Shuffle papers before scoring the next question to distribute your fatigue factor randomly. By randomly shuffling papers you also avoid ordering effects.

  • Avoid judging exams on extraneous factors.

    Don't let handwriting, use of pen or pencil, format (for example, many lists), or other such factors influence your judgment about the intellectual quality of the response.

  • Write comments on students' exams.

    Write brief notes on strengths and weaknesses to indicate what students have done well and where they need to improve. The process of writing comments also keeps your attention focused on the response. And your comments will refresh your memory if a student wants to talk to you about the exam.

  • Strive to balance positive and critical comments.

    Focus on the organization and flow of the response, not on whether you agree or disagree with the students' ideas. Experienced faculty note, however, that students tend not to read their returned final exams, so you probably do not need to comment extensively on those.

  • Read only a modest number of exams at a time.

    Most faculty tire after reading ten or so responses. Take short breaks to keep up your concentration. Also, try to set limits on how long to spend on each paper so that you maintain your energy level and do not get overwhelmed. However, research suggests that you read all responses to a single question in one sitting to avoid extraneous factors influencing your grading (for example, time of day, temperature, and so on).

  • If you can, read some of the papers twice.

    Wait two days or so and review a random set of exams without looking at the grades you assigned. Rereading helps you increase your reliability as a grader. If your two scores differ, take the average.

  • Place the grade on the last page of the exam.

    This protects students' privacy when you return the exams or when students pick them up.

Returning Essay Exams

  • Return exams promptly.

    A quick turnaround reinforces learning and capitalizes on students' interest in the results. Try to return tests within a week or so.

  • Review the exam in class.

    Give students a copy of the scoring guide or grading criteria you used. Let students know what a good answer included and the most common errors the class made. If you wish, read an example of a good answer and contrast it with a poor answer you created. Give students information on the distribution of scores so they know where they stand.

  • Use groups to discuss test questions.

    Some faculty break the class into small groups to discuss answers to the test. Unresolved questions are brought up to the class as a whole.

  • Get feedback from the class about the test.

    Ask students to tell you what was particularly difficult or unexpected. Find out how they prepared for the exam and what they wish they had done differently. Pass along to next year's class tips on the specific skills and strategies this class found effective.

  • Keep a file of essay questions.

    Include a copy of the test with your annotations on ways to improve it, the mistakes students made in responding to various questions, the distribution of students' performance, and comments that students made about the exam. If possible, keep copies of good and poor exams.


Sources

The Strategies, Ideas and Recommendations Here Come Primarily From:

Gross Davis, B. Tools for Teaching. San Francisco: Jossey-Bass, 1993.

McKeachie, W. J. Teaching Tips. (10th ed.) Lexington, Mass.: Heath, 2002.

Walvoord, B. E. and Johnson Anderson, V. Effective Grading. San Francisco: Jossey-Bass, 1998.

And These Additional Sources...

Brooks, P. Working in Subject A Courses. Berkeley: Subject A Program, University of California, 1990.

Cashin, W. E. "Improving Essay Tests." Idea Paper, no. 17. Manhattan: Center for Faculty Evaluation and Development in Higher Education, Kansas State University, 1987.

Erickson, B. L., and Strommer, D. W. Teaching College Freshmen. San Francisco: Jossey-Bass, 1991.

Fuhrmann, B. S. and Grasha, A. F. A Practical Handbook for College Teachers. Boston: Little, Brown, 1983.

Jacobs, L. C. and Chase, C. I. Developing and Using Tests Effectively: A Guide for Faculty. San Francisco: Jossey-Bass, 1992.

Jedrey, C. M. "Grading and Evaluation." In M. M. Gullette (ed.), The Art and Craft of Teaching. Cambridge, Mass.: Harvard University Press, 1984.

Lowman, J. Mastering the Techniques of Teaching. San Francisco: Jossey-Bass, 1984.

Ory, J. C. Improving Your Test Questions. Urbana: Office of Instructional Resources, University of Illinois, 1985.

Tollefson, S. K. Encouraging Student Writing. Berkeley: Office of Educational Development, University of California, 1988.

Unruh, D. Test Scoring Manual: Guide for Developing and Scoring Course Examinations. Los Angeles: Office of Instructional Development, University of California, 1988.

Walvoord, B. E. Helping Students Write Well: A Guide for Teachers in All Disciplines. (2nd ed.) New York: Modern Language Association, 1986.
