Wednesday, November 10, 2021

Overcoming the COVID Shift: The Impact of Resilience and Effort During a Pandemic

By Matthew Marmet

The COVID-19 pandemic has certainly presented challenges and opportunities across all sectors. The fact that it is going to take eight weeks for me to get a replacement window for my basement is indicative of the supply chain struggles we are currently experiencing. The focus of this piece, though, will be on the world of education, where one of the biggest challenges we have faced is what I have come to call the “COVID shift.”

I am not touting myself as the inventor of some groundbreaking, trademarkable term when I say COVID shift.  I use it simply because it perfectly captures what our institution experienced.  The COVID shift, for us, meant a shift from traditional on-ground delivery of educational materials (pre-COVID shift) to either completely virtual or hybrid learning environments (post-COVID shift).

Before moving forward, I’d like to take a walk back in time, to a simpler place where students came to class not worried about whether they remembered their masks.  A time when group activities and other pedagogical techniques could be easily implemented to supplement in-class lecture. For me, this was the time of the flipped classroom, a move away from what my favorite instructional designer likes to call “straight lecturing.”  Instead, class time is spent with students engaging in other activities and problem-solving tasks, which have been shown to have a positive impact on student attitudes and performance.

The COVID shift forced us to un-flip. In-class activities involving group work and close student interaction were not much of a possibility. With my love for the flipped classroom, the question I asked myself was: How can I avoid transitioning to a straight lecture style of teaching in these restricted environments?  My students had been thrust into a less-than-ideal situation, so I put the onus on myself to help them stay engaged.

What I ended up doing varied depending on the environment.  During the time classes were completely virtual, I tried to create fun, interesting (and sometimes embarrassing) demonstrations that I would conduct on camera for the students. Additionally, I would hold virtual brown bag lunches with my students, where an entire class session was dedicated to creating a relaxing environment for them.  Students were able to take the time to separate themselves from the minutiae of being stuck at home with their parents.  Interestingly, as a brief aside, I asked my students during class about the specific issues they were facing because of the pandemic.  Being “stuck at home” was near the top of this list. Once we were able to graduate to the hybrid environment, I shifted in-class activities from the group level to the individual level whenever it made sense.

Although most students seemed engaged at face value during these class sessions, I was curious to know whether the positive academic impacts of student engagement would shine through in the post-COVID shift environment. This helped inform a research question for a study I just presented at the Northeast regional ACBSP conference: Was there a significant difference in academic success between pre-COVID shift students and post-COVID shift students? Naturally, my hope was that no difference would exist between these two groups. I’ll spare you the statistical jargon and present the results of my study in two words: It worked!
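
For readers who want a peek behind that two-word summary, the comparison boils down to a two-sample test on some measure of academic success. Below is a minimal sketch in Python of that kind of analysis, offered only as an illustration: the grade lists, group sizes, and significance level are hypothetical placeholders, not the actual study data.

    # Minimal sketch of a pre- vs. post-COVID shift comparison.
    # The grade lists are hypothetical placeholders, NOT the study's data.
    from scipy import stats

    pre_shift = [88, 92, 75, 81, 95, 78, 84, 90, 72, 86]    # hypothetical final grades
    post_shift = [85, 90, 79, 83, 93, 76, 88, 91, 70, 87]   # hypothetical final grades

    # Welch's t-test compares the group means without assuming equal variances.
    t_stat, p_value = stats.ttest_ind(pre_shift, post_shift, equal_var=False)

    alpha = 0.05
    if p_value >= alpha:
        print(f"No significant difference detected (p = {p_value:.3f})")
    else:
        print(f"Significant difference detected (p = {p_value:.3f})")

One caveat worth noting: failing to find a significant difference is not the same as proving the two groups performed equivalently; a formal equivalence test would be the stricter way to support a “no difference” claim.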

With these findings, one question came to mind: Why did this happen? Our students went through major changes in how they were used to experiencing college but still managed to achieve success. To me, there were factors on both the student side and the faculty side that came into play: student resilience and faculty effort, which ultimately led to student success.

In terms of student resilience, I’d like to share an anecdotal quote from a student during one of our brown bag lunch sessions. They said, “I am afraid to go outside, not because of getting sick, but because I am worried about getting attacked because I am Asian.” During a time when the threat of such attacks was very real, imagine trying to succeed at anything when this is where you are mentally. But I will tell you that this particular student did succeed, along with many others, for several reasons. First, they were willing to put in the work. Rather than throw in the towel, they faced the difficulties of the course and the new learning environments head-on. I think this also speaks to our students’ ability to adapt, which I thanked them for on numerous occasions across these semesters. And finally, “scrappy” is the perfect term to describe our students, many of whom are first-generation college students who had to fight and claw just to go to college in the first place. They were not about to let this hurdle get in the way of their continuing education.

From the faculty effort side, I dove into the qualitative data from students who filled out the SOOT.  I share this data not to brag or boast, but because it is one of my proudest moments thus far in my short tenure in academia.  In both the pre- and post-COVID shift environments, language like “genuinely cared about the welfare of the students” and “cared about us as human beings” was offered.

Please remember I said at the beginning of this entry that the COVID-19 pandemic has presented us with both challenges AND opportunities.  I think these words speak to the opportunities we can find in all the current disruption, maybe the most important of which is the chance to get in touch with the human side of whatever it is we’re doing.  Success matters, but how you get across that finish line, to me, matters more.  Empathy and compassion will always win, and we might even be surprised with how robust the results are, regardless of the environment we find ourselves in.

 


Wednesday, October 27, 2021

Involving Students in Program Assessments

 For too long, assessment has been something we do to students, not something we do for or with students. Nicholas Curtis and Robin Anderson note that “The current systems of program-level assessment in the United States does not incorporate (or mention) students other than as sources of information” (page 7).

 We need to change that.

 If an institution calls itself student-centered and claims it values inclusion, it follows that students should have agency in certain educational processes, such as assessment. Further, when we analyze and interpret assessment findings and survey results, we are limited by our own perspectives and biases. Involving students at some point in our assessment efforts will expand and improve our understanding of what the results mean.

Curtis and Anderson write, “Without the involvement of  . . . students, our thoughts about the intended educational experiences . . .  are not going to relate to the actual experiences of our students” (page 10). In addition, our knowledge and understanding may often be restricted to a single course or major and not embrace the totality—the Gestalt—of the student experience.

 Giving students agency in assessment is not relinquishing authority to them. Rather, it is collaborating with them in an effort to identify ways a program or the institution might improve the educational experience it offers.

How a department involves students in its assessment processes depends on what its members want to learn or understand. One strategy would be to present assessment results to a representative sample of students and ask them to share their understanding of the findings or describe an experience they had that illustrates them. UC did this in March 2020, when students were given the chance to respond to the results of the climate survey. When presented with the finding that students of color do not feel welcomed in classes taught by white faculty, students described instances where white faculty deliberately looked to white students to answer questions in class and times when white faculty remained silent about racial incidents that occurred on campus. Soliciting this kind of information from students better positions faculty to address a finding that initially left them feeling defensive and confused.

A second way to involve students in assessment is to ask them how they understand the learning goals. How can we be certain, for instance, that students define goals such as problem-solving and teamwork the same way we do? I recall, years ago, asking students to explain why the college consistently received low ratings when students were asked how well their coursework developed their problem-solving skills. Students explained that they didn’t consider problem-solving as being addressed in the curriculum. Instead, they saw it as something they developed more in their co-curricular experiences, where they might be given a project to complete and it was entirely up to them to manage all the steps needed to bring the project to fruition. In their classes, they explained, all the “problems” were solved by the instructor who organized the course.

 A third possible way to involve students is to include them in planning. Ask them what kinds of assessments might truly capture the student experience in the program. Collaborate with them on implementing action plans to address areas where students may be underperforming in the program.

 Including students in our assessment processes should be done thoughtfully and sparingly, but it should be done. Curtis and Anderson observe that creating student-faculty partnerships “spurred interesting and deep conversations about the benefits of thinking and assessing at the program-level rather than the classroom-level” (page 10). This is especially important since faculty generally focus on and care primarily about their individual courses, whereas students consider a single course as part of a larger experience.

 A former faculty colleague of mine articulated the benefit he saw in involving students in assessment: “It sends a message that we do this for the students, that they’re the major stakeholders, and that they literally have a seat at the table.” Another stated, “[They] help faculty and departments understand that we engage in assessment processes for the benefit of our students. Including them communicates back to the student body the importance of assessment and what [the college] does to ensure that they receive a quality education.”

 Work Cited

Curtis, N.A., & Anderson, R.D. (2021, May). A Framework for Developing Student-Faculty Partnerships in Program-Level Student Learning Outcomes Assessment. (Occasional Paper No. 53). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.



Wednesday, September 22, 2021

Evidence-Based Storytelling: How Assessment Helps Us Tell a Compelling Narrative

There’s no question that assessment is often regarded as a fill-in-the-blank, paint-by-number bureaucratic enterprise. For too long, assessment specialists and accrediting agencies promoted a linear approach: faculty make specific changes to courses and curriculum based on assessment findings, with the aim of improving student learning, and then re-assess to document the effects of those changes.

Yet Natasha Jankowski, former executive director of the National Institute for Learning Outcomes Assessment, asks, “Can one ever actually know that the changes made enhanced student learning?”

So much influences a person’s growth and development in the years between starting a degree and earning one. Oral communication skills, for example, might be developed a great deal in an undergraduate curriculum, but so, too, might these abilities be improved by the experiences a student has in the co-curricular environment or the world of work.

There are modifications we can make in an effort to maximize student learning and ensure that all students have the opportunity to develop the knowledge, skills, and competencies faculty consider critical in a discipline. However, we simply cannot make emphatic claims to a causal relationship between what we did and the extent to which it improved student learning.

Jankowski advocates for a different approach to how we might demonstrate educational effectiveness. She argues that the meaning we draw from assessment findings, our understanding of the data, constitutes the important narrative. This meaning shapes the story we derive from the evidence we have gathered. 

“In assessment there is so much doing that there is limited time, if any, built into reflecting upon the data and deciding what it all says, what argument might be made, and what story it tells about students and their learning” (page 11).

Stories give evidence meaning. The 2020-2021 assessment report from the Department of Philosophy documents an important story about teaching and learning in a pandemic where COVID fatigue resulted in students’ failing to complete assignments as well as an increase in cheating. The quantitative findings suggest that student learning was on a downward trend. But the numbers alone don’t tell the story. The meaning inferred from the numbers by the faculty quoted in the report’s narrative does.

The report from the English Department provides an illustration that shows how students achieve learning beyond that which is articulated in a program goal. Students who participate in the design and creation of Ampersand, the College’s literary journal, “go beyond” the goal of making authorial choices: “[T]hey learn to collaborate, they learn skills of layout and editing as they produce a publication that appears in both print and online forms.”

Evidence-based stories—stories informed by the quantitative and qualitative evidence we systematically gather—are how we best illustrate the value and impact of our individual programs and of higher education. These stories also tell us what we need to change or improve in our teaching, course content, and curriculum.

“Some of our stories are tragedies,” Jankowski writes, “and some are tales of heroics and adventures” (page 12). They provide us with a richer, deeper, and more meaningful way to discuss assessment findings than the linear, formulaic approach does. Whether our stories have happy endings or sad conclusions, they deserve to be told.


Work Cited

Jankowski, N. (2021, February). Evidence-Based Storytelling in Assessment. (Occasional Paper No. 50). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.


Tuesday, September 7, 2021

Capturing the COVID Narrative

The full impact of COVID-19 on higher education probably won’t be realized for another four or five years. Still, there’s much we already know about the pandemic’s effects: lower-than-expected enrollments, fiscal challenges, and increased disenfranchisement among students are just a few of the immediate consequences.

The 2020-2021 assessments in co-curricular and student support services provide authentic and compelling evidence of COVID’s effect on Utica College students and operations. In particular, the report from the Department of Athletics offers a narrative about a year when “all of the plans in place . . . had been compromised,” a year of “twists, turns, starts, and stops” for which there was no playbook.

One method used to assess the student-athlete experience is the “Athlete Viewpoint,” an instrument that measures student-athlete commitment and well-being, team culture, academic advising, institutional acceptance, and, in 2020, the impact of COVID.

A significant finding from the department’s assessment was that more than half of student-athletes who responded to the survey indicated challenges with mental health. The report notes that COVID-19 created a “rapidly changing landscape” and “student athletes did not know what roadmap to follow . . .  to successfully start and complete a season.”

In response to this finding, the department plans to provide student-athletes access to more internal and external mental health and general health and wellness resources.

While the 2020-2021 academic year witnessed countless challenges and disappointments for UC’s Department of Athletics, the report also shows that despite these hardships, our student-athletes persisted and, in some instances, thrived. Close to 90% of conference games were completed, 43.75% of eligible teams finished among the top 4 teams in their conference, 31.25% qualified for playoffs, and over a quarter of the teams advanced to the conference championship.

Assessment is more than quantitative evidence gathered for the purpose of compliance. It is a way to use evidence to tell a story. The report from the Department of Athletics is a story of student-athlete resilience and, I might venture to add, a story of historical significance.

Wednesday, March 24, 2021

How the Pandemic Influenced Teaching and Assessment in One Program

A group of international educators writing about online teaching during the COVID-19 crisis note that when life returns to normal, “the worst thing that could happen is not learning from the crisis we experienced” (Rapanta et al. 941).

Timothy Abraham, assistant professor of Wellness and Adventure Education (WAE), probably agrees. At a recent meeting of the Academic Assessment Committee, Abraham said that instructors in his program have no intention of returning to their pre-pandemic approach to teaching and learning.

He said, “What we learned during this pandemic improved how we teach our students.”

Like most of us, Abraham misses being able to interact daily with students and colleagues, particularly when eating lunch in the cafeteria. And a great deal of the instruction in his discipline requires close contact with students, so the abrupt change to a virtual environment last spring posed a considerable challenge.

In the fall semester, however, he discovered that the hybrid approach resulted in his using face-to-face instructional time more productively.

While acknowledging that lectures have value, Abraham spends face-to-face time having students engage in hands-on, active learning. Likewise, his WAE colleague, assistant professor Megen Hemstrought, uses face-to-face time to incorporate 21st century skills like critical thinking, problem solving, collaboration, and technology literacy.

Hemstrought says that when students came to class two or three times a week, as in the past, many did not prepare ahead of time. She finds that having class less often (usually once weekly) motivates students to prepare better in advance so they can have more robust conversations, do more meaningful active learning, and delve deeper into the topics at hand.  She uses TED talks, textbook readings, and articles to get students prepared before coming to class. 

Abraham creates “discovery activities” to help students make connections to the material and pique their interest to learn more. He then uses asynchronous learning modalities in Engage, the College’s learning management system, to “fill in the holes.” Students may watch the instructional videos on their own time and at their own pace, giving this strategy the added benefit of supporting an individualized approach to teaching. Knowmia, the tool used to create the videos and make them accessible to all learners, provides video analytics, so instructors can see how much of each video is viewed and how much time each student spends attending to the lecture.
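
To make that analytics step concrete, here is a purely hypothetical sketch of how an instructor might summarize this kind of viewing data in Python, assuming the analytics can be exported as a CSV file with student, video, and percent-watched columns. The file name and column names are invented for illustration, and Knowmia’s actual export format may differ:

    # Hypothetical sketch: summarizing per-student video engagement.
    # Assumes a CSV export with columns: student, video, pct_watched.
    # The file name and column names are illustrative, not Knowmia's real format.
    import csv
    from collections import defaultdict

    pcts_by_student = defaultdict(list)
    with open("video_analytics.csv", newline="") as f:
        for row in csv.DictReader(f):
            pcts_by_student[row["student"]].append(float(row["pct_watched"]))

    for student, pcts in sorted(pcts_by_student.items()):
        average = sum(pcts) / len(pcts)
        print(f"{student}: {len(pcts)} videos viewed, average {average:.0f}% watched")

Even a simple summary like this turns raw viewing logs into something an instructor can scan week to week.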

This kind of analytics, Abraham contends, gives faculty a more objective way to measure student participation in a course than they had in a traditional on-ground class.

Abraham and Hemstrought agree that not only have their pedagogical methods improved, so have their assessment strategies. Abraham reports, “I’m not always using ‘tests’ to assess learning, like I’ve done in the past.” Instead, he is opting for writing assignments, reflections, and practical application projects.

“It creates a little more work grading on my end, but I want to give them an assessment that makes them think. Plus, this prevents them from simply looking up answers at the same time they’re taking a test at home on Engage.”

Rapanta et al. say that how we respond to a crisis “may precipitate enhanced learning and teaching practices in the postdigital era” (924).

This has certainly been true in the Wellness and Adventure Education program and probably in other programs as well at Utica College. It’s an important narrative to document.


 Work Cited

Rapanta, Chrysi, Luca Botturi, Peter Goodyear, Lourdes Guardia, and Marguerite Koole. “Online University Teaching During and After the Covid-19 Crisis: Refocusing Teacher Presence and Learning Activity.” Postdigital Science and Education, vol. 2, 2020, pp. 923-945.

Tuesday, March 9, 2021

Educating for Professional Success

I’m not surprised when I hear that employers are dissatisfied with the skills recent college graduates bring to the workplace. I’ve been hearing that since the 1980s. However, employers’ complaints historically focused on what are called “soft skills,” abilities that are typically associated with the liberal arts or the co-curricular experience.

More recently, employers have criticized college graduates for lacking “hard skills” as well, and large corporations report investing $30,000 to $50,000 a year to fill the gaps between what employers want and what college graduates have learned.

Utica College’s Associate Professor of Computer Science, Ronny L. Bull, has been addressing this gap for years.  

It all starts with building relationships. Bull has ties with several area employers, and he uses these connections to learn how prepared UC computer science graduates are when they enter the profession. Through informal conversations, he learns where graduates’ knowledge and skills fall short, and he uses this information to modify the curriculum and refine pedagogy. For example, when he learned that graduates needed more knowledge of and experience with coding, he revised all of his courses, from the 100 level to the 400 level, to teach coding skills that tied back to the course objectives. Well-crafted coding projects reinforced the understanding of this content by engaging the students, and as students progressed through the degree program, they could expect increasingly challenging coding problems in higher-level courses. A secondary effect was that they were better prepared for the real world.

“It’s not just about knowledge, though,” he said. “Interpersonal skills, teamwork, and problem-solving are other competencies we address through our curricular offerings, since these ‘soft skills’ are critical to a graduate’s success as well.”

By networking with employers, Dr. Bull is able to “create a pipeline of people who know how to do a job.” His success is indicated by the fact that area employers ask for Utica College graduates, mostly because “they don’t have to train them,” Bull stated.

A carefully designed curriculum that has hands-on experiences built into its courses and that provides students with real-world opportunities during their undergraduate years allows Dr. Bull to develop students’ knowledge and skills so that they can be successful in the professional workplace. More importantly, however, it gives him insight into individual students’ interests, abilities, and aptitudes. In turn, this allows him to match student interns and graduates with employers.

Helping students see what kinds of opportunities exist in the computing industry is part of the education Dr. Bull provides his students. Prior to the pandemic, he regularly brought students to tour local facilities and meet employers. On many of these visits, the employers they were introduced to were Utica College computer science alumni! It isn’t just career-readiness that Bull emphasizes, however. The curriculum and its emphasis on real-world problems and projects also prepares students for graduate education, and over the years, Bull has brought dozens of students to conferences sponsored by ACM, the world’s largest educational and scientific computing society.

Professor Bull also relies on the relationships he has established with computer science graduates when it comes to assessing the curriculum. Several years ago, he set up a Discord server where students and faculty could collaborate, share, and chat. This created a community within the department, and when students graduated, they remained part of this group. Alumni were able to share feedback about the curriculum with their former faculty, feedback that was then used to make changes.

The work that Ronny Bull does, soliciting feedback from employers and alumni and using this information to develop curriculum and create experiential learning opportunities, is an example of assessment at its finest: seamlessly woven into the fabric of what he is doing to develop students into successful professionals and lifelong learners.

 



Work Cited

Wilkie, Dana. “Employers Say College Grads Lack Hard Skills, Too.” Society for Human Resource Management, 21 October 2019, www.shrm.org/resourcestools. Accessed 3 March 2021.

 

 

Wednesday, February 24, 2021

The Limits and Usefulness of Anecdotal Evidence in Assessment

 We’ve all been at those meetings where someone steers the conversation with claims such as, “Students don’t like having early morning classes, and when they do, they leave the institution.” There’s no source for this statement; there’s no evidence in any survey, focus group, or analysis of enrollment data to substantiate the claim.

People typically respond to statements such as that one by saying, “That’s anecdotal evidence.” Not so. Unsubstantiated claims don’t rise to the level of anecdotal evidence. They remain exactly what they are: unverified statements, usually intended to promote an agenda or enhance a speaker’s credibility.

Anecdotal evidence refers to stories about people and their experiences.  By itself, an anecdote is not reliable evidence. However, when anecdotes are used judiciously in conjunction with quantitative data, they provide insight into what the data might mean. Further, anecdotes have emotional appeal. They remind us why we should care about the data in the first place.

Student responses to the climate survey data at UC (March 2019) provide excellent examples of how anecdotes may be paired with data to advance understanding.  On the survey, more than half of students of color reported feeling that they did not matter in classes taught by white faculty, and compared to white students, fewer students of color reported feeling affirmed by white faculty.

When students were asked to respond to these data points by narrating their experiences, they described occasions where white faculty directed their questions solely to white students and where white faculty ignored racist comments made by other students in the class. They gave examples of classes where the majority of students performed poorly on an exam and the professors indicated that the reason was that the course material was rigorous and the students were unprepared for the demands of the subject. They mentioned classes where the instructor never learned their names and seldom acknowledged them outside of class.

On the plus side, they spoke of positive experiences that made them feel affirmed, supported, and part of a community.

These stories—anecdotes—give texture and meaning to data that might perplex, dismay, or be easily dismissed.

The limits of anecdotal evidence, which is usually based on individual experience, should be obvious. Its usefulness, though, cannot be overlooked. One benefit is that it invites inquiry and may have implications for research. Its major benefit is described by Michael Shermer, publisher of Skeptic: “Anecdotes . . . help in explaining data points that do not make sense. Hearing stories about data points that do not make intuitive sense can uncover the hidden variables that are really driving the results.”

Work Cited

Shermer, Michael. “How Anecdotal Evidence Can Undermine Scientific Results: Why Subjective Anecdotes Often Trump Objective Data.” Scientific American, 2008.

Reporting and Analyzing Assessment Findings

It’s not unusual to see assessment reports where the findings are summarized like this: “23% met expectations, 52% exceeded expectations, a...