This past week, I was helping out with a portion of my department’s
Assessment efforts. The particular task for our group was to go through
selected common questions from past year exams and score them according to a
pre-determined rubric. The rubric assesses (1) conceptual knowledge, (2)
ability to apply that knowledge to solve a problem, and (3) clarity of work. It
is the third category that I would like to discuss in today’s post because
there was much to mull over.
Let me describe the assessment setup. I’m grateful to a very
organized colleague who coordinated the efforts. Since this was part of a
program-level assessment, we were looking at outcomes for our majors. First, General
Chemistry final exams from declared majors had to be extracted from close to
two thousand exam scripts covering a four-year span. The exams were turned to
the page displaying the question to be assessed, and an ID number was written
on that page. The rest of the exam was stapled shut so that the student's name could not easily be seen. Thankfully, I did not have to do any of this, and I profusely
thank all those who did the work. (We’ll need to come up with a less
time-intensive approach in the future.)
At this point, the faculty team of four got to work. The
rubric was discussed beforehand, then we individually scored five exams and
discussed our results. This was to establish norming procedures. Subsequently, we would divide up the rest of the pile, since discussing and scoring every exam as a full group would have taken too long. The five exams
chosen for norming covered the range from “student nailed the question” to
“multiple errors and confusion all over the place”. After norming, we got to
work. Each exam had to be scored independently by two people (instead of all
four). The coordinator would then do all the necessary post-analysis, including checking the level of agreement between scorers and looking for any anomalies.
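(The post-analysis itself was the coordinator's job, so I won't speculate on the exact method. Purely as an illustration, a simple percent-agreement check between two scorers, using made-up rubric scores, might look something like the following sketch.)

    # Hypothetical sketch only: percent agreement between two scorers who
    # independently scored the same exams on a rubric (made-up numbers).
    scores_scorer_a = [3, 2, 3, 1, 2, 3, 2, 1]
    scores_scorer_b = [3, 2, 2, 1, 2, 3, 2, 2]

    matches = sum(a == b for a, b in zip(scores_scorer_a, scores_scorer_b))
    percent_agreement = matches / len(scores_scorer_a)
    print(f"percent agreement: {percent_agreement:.0%}")  # 75% for these made-up scores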
Was it tedious to score a bunch of exams using a rubric?
Yes, but I learned some very interesting things that I would not have
considered when grading my final exams at the end of the semester. First, with
a much larger data set we could see both the commonality (of student errors)
and the diversity of approaches used. Every year we teach multiple sections of
General Chemistry. (Last year alone we had 15 sections of the first-semester
course!) Second, since all four of us were in the same room, we could discuss
what was most important to us in determining whether a student successfully
demonstrated conceptual knowledge or how we interpreted clarity. (It was
usually straightforward to see if a student could apply their knowledge to
solve a problem.) I’m pleased to say that while we did agree for the most part,
we also learned from each other. Now, I happen to be in a department where
hallway chatting with my colleagues about various aspects of teaching is
commonplace. However, having the goal of assessing a particular question opened
up both a wider and deeper conversation. A hallway chat might be 10-15 minutes.
Being in the same room for 3-4 hours with a specific task provides a different
and complementary environment. (Our coordinator also provided excellent food
and snacks to tide us over!)
I would like to focus on clarity in problem-solving, particularly how students present their answers on paper. This jumped out at me during our session assessing a stoichiometry problem, because as faculty we sometimes disagreed about what details we wanted to see in a solution. A capable student who knows how to solve a multi-step numerical problem is going to get the correct values for the final answer; however, some skip steps. For example, students correctly calculated the number
of moles of reactants, and then “knew” which was the limiting reactant based on
the next set of calculations – but did not always clearly point out which was
the limiting reactant. Other students would scrawl “limiting reactant” or LR
somewhere close by, but did not show how they knew. (It requires a quick
algebraic calculation based on the relative stoichiometry of the reactants in
the balanced equation.) Very likely, they punched the numbers into their calculator, figured out the right answer, and then moved on. The other place where clarity was an issue was in failing to indicate whether a given calculation was for the “amount consumed”, the “amount produced”, or, in some cases, the “amount left over” of the non-limiting reactant. Some students wrote in all the numbers correctly, even showing the calculation, but with little to no explanatory text.
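To make concrete the step that was so often skipped: identifying the limiting reactant only takes a comparison of each reactant's moles against its coefficient in the balanced equation. Here is a minimal sketch with a hypothetical reaction and made-up masses (not the actual exam question):

    # Hypothetical example: 2 H2 + O2 -> 2 H2O, with made-up starting masses.
    # The quick check: divide each reactant's moles by its stoichiometric
    # coefficient; the smallest ratio identifies the limiting reactant.
    molar_mass = {"H2": 2.016, "O2": 32.00}   # g/mol
    coefficient = {"H2": 2, "O2": 1}          # from the balanced equation
    grams = {"H2": 4.0, "O2": 40.0}           # hypothetical given masses

    moles = {r: grams[r] / molar_mass[r] for r in grams}
    ratios = {r: moles[r] / coefficient[r] for r in moles}
    limiting = min(ratios, key=ratios.get)
    excess = next(r for r in moles if r != limiting)

    # Label each quantity explicitly: produced, consumed, and left over.
    moles_h2o_produced = ratios[limiting] * 2                      # H2O coefficient is 2
    moles_excess_consumed = ratios[limiting] * coefficient[excess]
    moles_excess_leftover = moles[excess] - moles_excess_consumed

    print(f"limiting reactant: {limiting}")                          # H2
    print(f"moles H2O produced: {moles_h2o_produced:.3f}")           # about 1.98
    print(f"moles {excess} left over: {moles_excess_leftover:.3f}")  # about 0.26

The point is not the code, of course, but the habit it encodes: show the comparison explicitly and label what each number represents.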
One point of discussion that came up was “who” the audience
should be. Clearly, the students are writing for the instructor – but they
often tacitly assume that the instructor can “read between the lines”. Now,
since we are all experienced faculty and have used the common question multiple times (for assessment purposes), we can follow what the student is
doing even if steps are skipped or text is not written out clearly – at least
when the student gets the correct final numerical values. (If the student gets
the final values wrong, then we have to look a little more carefully at where
the student went wrong to see if any partial credit can be given and to
indicate the source of the error.) But perhaps the students should aim their
explanations not at the instructor, but at the level of a fellow student in the
same class. That’s what I tell students in the accompanying lab course as they write
in their lab notebooks and put together their lab reports, but I don’t think
I’ve explicitly said the same for exams in the “lecture” portion of the course.
Now, I do model what should go into a solution. When
something new is introduced, I work out the solution step-by-step on the board
with accompanying explanatory remarks and assumptions. My exam solution sets
also include all this. But I think there are three issues I need to consider further. The first is homework practice. Many moons ago, I made the switch to using online homework (via Pearson's Mastering Chemistry, since we use a Pearson textbook for General Chemistry). Like any other system, it has pros and cons. The major pro is
that students are motivated to constantly keep up and can work anywhere,
anytime as long as they have an internet connection. They also get immediate
feedback. It also reduces my grading substantially since I no longer grade
homework (which was rather tedious). The major con is that students type in
their numerical values or very short (easy-for-computer-to-grade) answers in
the online system. They don't get much practice writing out an answer with full clarity unless they happen to be predisposed to doing so. A few do, as evidenced by their homework notebooks (I ask students to work the problems on paper and bring their notebooks to my office when they have questions), but most others have only a sketchy outline with all sorts of skipped steps. So they only get practice when they are taking notes in class
(when I work an example on the board). When we actively work in small groups on
problems in class, I often don’t collect their worksheets – although I do
provide a solution set (sometimes on the spot on the board). When I circulate
in class, I’m usually not looking at the clarity of the written solution; I’m
focused on trying to help the students understand the main concepts and apply
them. I need to rethink my approach.
One possible route is to go back to making students turn in
solutions to problem sets. I still do this when I have a smaller class, but in
a larger class the grading burden becomes substantial. Even so, perhaps I should still do this several times during the semester in a larger class. Another possibility
is to actually have students look at previous student work (varying from
excellent to downright confusing) and critique it using a rubric, i.e., have
the students do the same thing I did for Assessment. This might be a very
valuable exercise and I should block off some class time to make sure this
happens several times during the semester. While I could come up with examples
demonstrating the range, I think using actual student work might be more
convincing. I was flabbergasted by some of the answers in the assessment when I had to decide what score they should get on the rubric.
The second thing I need to consider is how I tend to give
the stronger students in my classes a pass (i.e., full credit) for a correct final solution even though steps have been skipped. In the assessment, I did not know the students' identities, and they were often not from my class (given the multiplicity of sections), so I really took a good look at clarity (because it was on the rubric). If being able to present your work
clearly and with sufficiently complete information is a key skill, I need to make sure it gets emphasized not just in my teaching, but in my
grading. Yes, I know the students in my own class and I know those who “know”
how to do the problem. I can even tell what they are doing in their heads and
not writing down. But the writing down and presenting is important – very
important in fact – for any future career.
Third, I realized that some of the students skip steps or
don’t present their work as clearly because of time pressure in exams. My exams
are tight – the average student has just enough time to finish, but with very
little leeway. I think that time on task is one among several measures of
whether a student understands the material. A student who really understands
will be able to solve the problems much more quickly than one who is
floundering around trying many things that don't work. But maybe my exams are too tight, and I need to reduce the amount of material asked on an exam and balance it with clarity in solution presentation. This means that I need to emphasize the importance of clarity throughout the semester if I want to see students present their work clearly on an exam.
That’s a lot to mull over, but I’ve still got six weeks to
prepare. I did learn some very useful things from this assessment exercise, and
it’s something our group will be sharing with the department. Sometimes
assessment can be useful, even though there are some tedious bits and related
administrative busywork. I certainly benefited from my participation in the
exercise.