As scholars, we spend years training in our disciplines, developing techniques to make meaning of the world. As a chemist, I start with a hunch about a chemical reaction, then design experiments and measurements to collect data that provide insight into the reaction. From there, I continue with the iterative process of hypothesizing, experimenting, measuring, and analyzing. For me, engagement in the scientific research process began in college, continued during graduate school, carried me through a post-doctoral fellowship, and led into employment in an industrial R&D lab.
In 2009, I returned from industry to join the faculty at City Tech, expanding my daily activities beyond research to teaching. The scope of the question, “How do we know what we know?” also expanded. It came to include how we know what skills, knowledge, and attitudes students learn in college. We know what we write on the syllabus and what we think we are teaching, but what do students actually learn? My arrival at City Tech and my encounter with this question coincided with a major initiative to develop a college-wide assessment program.
We built our assessment program from the ground up. In chemistry, we started with the quite tractable goal of identifying and measuring a few student learning outcomes in two of our critical courses. We analyzed our data, devised and implemented appropriate interventions, and re-measured our outcomes. As in the chemistry lab, our understanding of the pertinent issue was continually refined by the insights we gained from data. The data-driven process was familiar, but the question had changed from one of chemical reaction mechanism to one of student learning. As our assessment efforts expanded, we included program level learning outcomes. The question then became, “What do students learn as chemistry majors?”
The process of assessing course and program level student learning outcomes was practical, hands-on training in chemical education research. For example, beginning in my first year at City Tech, I saw that students struggled to write lab reports. I thought I was teaching lab report writing, but students’ incomplete mastery of the skills necessary to write a substantial lab report suggested otherwise. Simple things like formatting could easily be addressed, but issues of organization and section-appropriate content recurred frequently. To address these issues, I devised a new inquiry-based method for teaching lab report writing, and I worked with a colleague to implement the new approach in our courses. We then set about assessing the impact of the new teaching practices on student learning. Applying the same techniques that we used in course and program level assessments, we gained a more complete understanding of the efficacy of the new teaching method and were able to share the results with the chemical education community through a journal article.
In the spring of 2020, teaching and learning underwent a major change as the pandemic forced a quick switch to remote course delivery. New teaching methods were required. However, the question of how we know what skills, knowledge, and attitudes students learn remained. In fact, the need to develop effective courses through rapid adaptations in teaching methods imbued the question with greater urgency. As chemistry curricula typically include a laboratory component, we had to be creative with our adapted methodologies. In my Instrumental Methods of Analysis class, I developed and sent students kits for hands-on remote learning. The kits comprised the optical and electronic components for students to assemble their own computer-interfaced spectrometers. Using the instruments they assembled, students conducted a series of qualitative and quantitative chemical analyses. It seemed like a positive learning experience, but now I want to better understand the impact of this new practice on student learning. How does a hands-on remote experience affect student learning and student attitudes about learning? How does student learning through a hands-on remote experience compare to that of an in-person chemistry laboratory experience? Our past engagement in assessment—years of assessing student learning outcomes in our courses and programs—provides the curiosity and confidence to attempt to answer our current questions.
Dr. L. Jay Deiner is an Associate Professor of Chemistry at New York City College of Technology, City University of New York (CUNY).
Deiner, L. J., Newsome, D., & Samaroo, D. (2012). Directed self-inquiry: A scaffold for teaching laboratory report writing. Journal of Chemical Education, 89(12), 1511–1514.
This entry is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.