By Kevin Pry, Associate Professor of English, Lebanon Valley College
When Julius Caesar took his legions across the Rubicon River into Italy and marched on Rome to change the old Republic forever, he knew there was no turning back: he was committed wholeheartedly to discarding an old set of assumptions and practices for new ones. My experiences with assessment have put me in a situation that would have felt familiar to one of Caesar's veteran legionaries, for in the struggle to improve our assessment, I have had to push beyond my traditional understanding of how to use rubrics. I have had to develop a methodology that has given new scope and effectiveness to the way I devise assignments, evaluate student work, and assess the results. I jokingly call this change "Crossing the Rubricon."
In the past, I used rubrics to grade major written or oral assignments, treating them like checklists to determine whether students had demonstrated their skills so that I could give them specific feedback for the future. I was an old grading centurion following the old Roman regulations, more for discipline's sake than as an innovative tactician in the war on ignorance. But I noticed that conventional rubrics often seemed to penalize students on assignments where I was trying to promote risk-taking and creativity. For example, in acting classes, some techniques and concepts can only be learned by trying to employ them and failing at one's initial attempts.
This led to Epiphany #1: One can devise a rubric that puts a positive grade value on how useful a student's unsuccessful attempts at employing a technique were in promoting class discussion and student learning.
Of course, I had always reviewed the results of student learning, analyzing how students met or failed to meet the criteria. Before, I responded to their failures by trying new ways of teaching or discussing bewildering or confusing material. I hadn't shifted the structure of my tried-and-true assignments because they worked for most students. When I made the decision to cross the Rubricon and devise detailed rubrics for both large and small assignments, I discovered that thinking in detail about how to use rubrics to generate evidence for course and program assessment led me to zero in on the instructions and prompts for each task, fine-tuning them to align with desired outcomes in a far more coherent and obvious manner. This, naturally, is a major step in improving outcomes.
Thus, Epiphany #2: Rubric writing is an artistically satisfying task, requiring you to analyze what you really want students to accomplish in an assignment. Aligning prompts and instructions, criteria for evaluation, and desired outcomes produces important insights into where you should be focusing your energy and technique as a teacher.
With the push to "close the loop," I feared that the mechanics of having to assess multiple courses for multiple objectives might consume too much time and effort. But the insight that one detailed rubric can be made to assess multiple objectives in one cleverly designed assignment led to Epiphany #3: That's what they meant by "work smarter, not harder."
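To make Epiphany #3 concrete, here is a minimal sketch of the idea in Python. The criteria names, objective names, and 0-4 scale are hypothetical stand-ins for illustration, not anything from an actual rubric of mine: each rubric criterion is tagged with the program objectives it provides evidence for, so scoring one assignment yields assessment data for several objectives at once.

```python
# Hypothetical example: one rubric feeding evidence to several program
# objectives. All criterion and objective names are invented.

from collections import defaultdict

# Map each rubric criterion to the program objectives it supports.
CRITERIA_TO_OBJECTIVES = {
    "thesis_and_argument": ["critical_thinking"],
    "use_of_evidence": ["critical_thinking", "information_literacy"],
    "oral_delivery": ["communication"],
    "productive_risk_taking": ["creativity"],  # credit for instructive failures
}

def roll_up(scores):
    """Aggregate one student's rubric scores (0-4 scale) by program objective."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for criterion, score in scores.items():
        for objective in CRITERIA_TO_OBJECTIVES[criterion]:
            totals[objective] += score
            counts[objective] += 1
    # Average the criterion scores that speak to each objective.
    return {obj: totals[obj] / counts[obj] for obj in totals}

# One assignment's rubric scores now speak to four objectives.
print(roll_up({
    "thesis_and_argument": 3,
    "use_of_evidence": 4,
    "oral_delivery": 2,
    "productive_risk_taking": 4,
}))
```

The same tagging could live just as easily in a spreadsheet; the point is the mapping from criteria to objectives, not the tool.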
Thursday, February 14, 2019
Capturing A Rich Narrative: Experiential Learning Opportunities
If assessment provides a way of telling our story, then tracking experiential learning opportunities is probably one of the most exciting parts of the narrative.
By “experiential learning,” I am not referring to a good or even great experience, like taking students to an art museum or engaging them in a community service activity for one afternoon. I am talking about hands-on experiences that occur over a period of time and foster deeper learning. As many of the departmental assessment reports document, these high-impact experiences are integral to a Utica College education.
In a number of academic departments, these types of experiences result in student presentations at regional or national conferences.
- Last October, 3 students attended the Seaway Section of the Mathematical Association of America (MAA) meeting at the University of Toronto Mississauga. This spring, 1 student will present at the MAA Seaway Section meeting at St. John Fisher.
- From 2017 through 2018, 5 chemistry students presented their research at the American Chemical Society’s national conferences, and one presented at the CSTEP Statewide Conference.
- 15 students have been included as co-authors on presentations made at regional and national psychology conferences from 2017 to 2019. Two students have also been included as co-authors, with a faculty member, on an article in a prestigious professional journal.
- In the geoscience program, students engage in field trips during lab periods and on weekends. They also participate in internships and independent research, and may opt for a 4- to 6-week field camp experience to study the geologic features of a particular region. In 2017, 2 undergraduates presented posters at a professional conference, and 1 student’s research was published in Northeastern Geographer.
Experiential learning isn’t realized solely in conducting research and giving presentations, however. Students are writing for the Tangerine. They are performing on stage in musicals and dramatic productions. They are studying abroad. They are completing internships. And sometimes experiential learning happens right in the classroom or during residencies, as in the case of the Financial Crime Management program, where graduate students get hands-on experience using computing software and financial analysis tools and applying them to real-world economic crime cases.
Experiential learning exposes students to new opportunities and often takes them outside their comfort zones. In MGT/PRL 345, students spend spring break in New York City, where their instructor has arranged for them to visit with UC alumni and other top communications professionals at organizations such as G & S Business Communications, the Wall Street Journal, Glamour, NBC News, the New York Power Authority, and the 9/11 Memorial and Museum. Student reflections indicate that this experience is transformative, especially for students who come from small, rural towns where opportunities are limited and who have never visited a large city. One student wrote, “In college, it’s hard to figure out where you firmly belong or it’s difficult to see yourself in five years. But when you visit an [organization] and you feel like you could belong there, it’s an empowering feeling.”
Now if these aren’t impressive outcomes, I don’t know what is.
Wednesday, November 29, 2017
Involving Students in Assessment
By Ann Damiano
In her keynote address at the Assessment Network of New York conference (April 2017), Natasha Jankowski, Director of the National Institute for Learning Outcomes Assessment, challenged participants to develop assessment processes that are student-centered. She concluded that assessment is something we should do with students, not something that is done to students.
Multiple stakeholders should be involved in our assessment efforts, particularly when it comes to communicating and interpreting results, as well as generating plans based on those results. Students are our most important stakeholders, and so their involvement in the process is imperative.
One way to do this is to include students in the dissemination plan for institutional survey results. Findings from NSSE, the Noel-Levitz Student Satisfaction Inventory, and even the Student Opinion on Teaching (SOOT) might be shared with student leaders. If warranted, students could collaborate with personnel in Academic and Student Affairs to create plans or make recommendations based on the survey results. For example, if NSSE findings indicate that fewer than 60% of seniors perceive that the College contributed to their understanding of people different from them, students might propose ways the institution could improve its curricular and co-curricular offerings so that we are more successful at achieving this tenet of our mission.
When assessing student learning goals, we should not assume students share the same operational definitions as their faculty. That they might not underscores the importance of getting their input into what results mean and, likewise, highlights the importance of using multiple methods to assess a single goal.
Most recently (and at my previous institution), I assembled two student groups to review results related to integrating knowledge, problem-solving, quantitative reasoning, and intercultural competence. For each of these learning goals, the findings from diverse sources either conflicted with one another or indicated that no matter what “improvements” faculty made to the curriculum, we were still not achieving the desired outcomes. The students brought a different perspective to the discussion than that articulated by the three faculty groups that reviewed the data. Important insights from the students included the following:
- Students defined “integrating knowledge” as applying classroom learning to real-life situations, whereas faculty used it to refer to applying what was learned in one course to another;
- Problem-solving is best developed in the co-curricular experience, where students are often forced to derive solutions independently, as opposed to in the curricular experience, which is much more structured and faculty-directed;
- While the college may provide numerous offerings related to inclusion and diversity, a lack of diversity on the faculty, combined with pedagogies that do not promote inclusion and the absence of global perspectives in courses throughout the curriculum, potentially contributed to students not achieving the desired outcome related to intercultural competence.
The students’ interpretations of assessment findings dared the faculty to make improvements that challenged them in ways their own conclusions had not. Rethinking one’s pedagogy, for instance, requires much greater effort and imagination than adjusting course requirements or modifying an assessment instrument. Yet new pedagogical approaches may be necessary if we are going to help students achieve outcomes.
Collaborating with students on assessment results expands our understanding of what the results might mean. As one faculty member noted, including students in our processes “sends a message that we do this for the students, that they’re the major stakeholder, and they literally have a seat at the table.”