The Assessment Council of the City University of New York gratefully acknowledges Wiley Online Library and New Directions for Community Colleges (Jossey-Bass New Directions Series) for permission to reprint this article from Volume 2019, Issue 186, What Works in Assessment, for this inaugural issue of Assess@CUNY, a publication of the CUNY Assessment Council. 

Abstract

This chapter illustrates the value of formative assessment as the key to better understanding students who struggle with classroom learning. The authors present ways in which professional development, classroom assessment, and student services can come together with the common goal of building a community focused on the student learning experience and student success.

“Assessment is not a spreadsheet—it’s a conversation”
– Joe Bower (2015)

Introduction

A popular New York Times op‐ed laments the waste of time, energy, and resources on the pointless exercise of learning outcomes assessment (Worthen, 2018). The article exemplifies how the purpose, value, and process of assessment are utterly misunderstood by many in higher education. The word assess derives from the Latin verb assidere, meaning to sit beside. If we were to recapture this original meaning of assessment, instructors would no longer need to view assessment as meaningless data to be collected and submitted, but rather as a process during which we can sit beside our students and fully participate in their learning experience (Stefanakis, 2002).

Assessment can be broadly classified either as summative or formative. A summative assessment, which occurs at the end of a course or a program, is a high‐stakes assessment of learning and can appear in the form of a unit exam, midterm or final exam, term paper, and project report. A formative assessment, which occurs throughout the learning period, is a low‐stakes assessment for learning (Stiggins, 2008) and can appear in the form of an active learning exercise, weekly quiz, reflective journal, or similar product. While it is not possible or advisable for an instructor to sit beside a student as they take a summative assessment, it is possible for an instructor to envision working alongside a student before, after, or perhaps even during a formative assessment.

The formative assessment is a powerful tool that can motivate students, build their confidence, help them develop strong study skills, give them room to fail, allow them to be uncomfortable, and help them grow into self‐regulated learners (Stiggins, 2002; Stiggins & DuFour, 2009; Black & Wiliam, 1998; Cauley & McMillan, 2010; Nicol & MacFarlane‐Dick, 2006). The formative assessment provides a unique opportunity for instructors to closely observe student progress on a regular basis and intervene early in the learning period to address cognitive or dispositional issues that may be impeding the learning process (Moss & Brookhart, 2009). Sensitive observation of the student learning experience within the context of a formative assessment compels the instructor to also reflect on their own teaching practice. Reflective journaling is an easy yet powerful way for instructors to examine the effectiveness of their teaching strategies (Schön, 1983). Combined with formative assessment, this journaling habit can feed a reflective cycle (Gibbs, 1988; Kolb, 1984) that is solicitous, systematic, and highly informative. A reflective cycle includes stages such as description of a learning opportunity or formative assessment, observation of the student learning experience, reflective journaling, analysis, conclusions, and an action plan.

During this reflective practice, an instructor may observe certain cognitive or affective issues that impede a student’s progress. Being uncomfortable or becoming stuck is a normal part of the learning process (Land, Cousin, Meyer, & Davies, 2005), so it is important for an instructor to reach out to the student and begin a conversation. Communication helps the teacher and the learner find out whether the problem arises from the cognitive domain (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956), the affective domain (Krathwohl, Bloom, & Masia, 1973), or both. Even on a more general level, the strength of a formative assessment lies in communication (Brookhart, Moss, & Long, 2008). For a formative assessment to be effective, an instructor must clearly communicate expectations and learning goals to the entire class before the assessment begins and provide adequate feedback immediately after it ends (Akkaraju, 2016; Nicol & MacFarlane‐Dick, 2006). In this chapter, we present a case study from a Bronx Community College (BCC) gateway science course in which the formative assessment is used to positively impact student issues in the cognitive and affective domains.

College instructors are often not trained in teaching or assessment when hired (Brownell & Tanner, 2012), so the burden of training falls on the institution. At BCC, training in assessment has a two‐pronged approach: (1) a campus‐wide assessment effort guides department‐level design and training and (2) faculty development efforts integrate assessment into their training. Newly hired faculty members present a unique opportunity for the institution to shape their development as instructors and their perceptions and use of assessment. In this chapter, we present a case study from BCC’s New Faculty Seminar (NFS), which exposes new faculty to meaningful classroom‐level formative assessments (Angelo & Cross, 1993) that aim to develop teacher empathy in support of student motivation and success, while concurrently encouraging faculty members to report teaching excellence through scholarly publication (Kreber, 2002). Part of closing the assessment loop with students is connecting them to services external to the classroom setting that can help them reach their goals.

Outside the realm of the physical and virtual classroom setting, the instructor may need to extend communication to other personnel such as academic advisors, student success coaches, supplemental instructors, and tutors to make effective use of the campus safety net that helps to catch the struggling student before they fall (Greenfield, Keup, & Gardner, 2013). Student success is no longer predicated only on whether students have the necessary academic skills to excel in the classroom. Students’ social and emotional adjustment to college also plays an integral role, and success depends on whether students develop equally important nonacademic skills, attitudes, habits, and behaviors (Karp & Bork, 2014; Tinto, 1993). Institutions, in turn, must provide access to a broad range of interventions and supports that address the root causes of student attrition: early contact and community building, academic involvement and support, monitoring, early warning, and counseling and advising. When such assistance is provided early in their academic careers, students have a far better chance of succeeding in college (Tinto, 1993). In this chapter, we present a case study from BCC’s First Year Program in which First Year Seminar (FYS) instructors provide formative assessment data through Starfish, an early alert and student tracking system, to advisors and peer mentors in order to affect student behaviors that influence student success.

The overall 3‐year graduation rate for community colleges in the United States is around 22%, with 42% of entering freshmen in need of at least one remedial course (U.S. Department of Education, National Center for Education Statistics, 2018). The current 3‐year graduation rate for BCC is 16%, with 90% of entering freshmen in need of at least one remedial course (CUNY Office of Institutional Research, 2017). To say that we have a great deal of work to do in ensuring student success would be an understatement. We began the process of revitalization by developing a new strategic plan that is much more in line with our vision for the college.

BCC institutional goals that directly impact student success are to build a community of excellence, to empower students to succeed, and to deepen student learning. In this chapter, we describe how formative assessment helps us to approach these goals via three principal routes: (1) the classroom practice of using formative assessments via the flipped learning model; (2) a 1‐year professional development seminar for new faculty members that emphasizes formative assessment via the classroom assessment project; and (3) using formative assessment data to inform student support services via the Starfish student success platform (Starfish, 2007).

Embracing Formative Assessment in the Classroom

We found that formative assessments, when designed thoughtfully and deployed with deliberate care, can have a profound influence on a student’s learning, disposition, and overall performance. By employing the flipped learning format, it is possible to harness the power of the formative assessment in promoting student success (Akkaraju, 2016; Bergmann & Sams, 2012). To be successful, however, it is not enough to simply administer formative assessments; it is necessary to embrace them fully (Moss & Brookhart, 2009) by adopting a growth mindset (Boaler, 2013; Dweck, 2008) and by practicing pedagogical love (Maatta & Uusiautti, 2012).

At BCC, Human Anatomy & Physiology I is a gateway course with a 60% pass rate. We present a case study in which the flipped learning approach was used along with regular formative assessments to tackle student disposition traits such as preparedness, punctuality, and attendance. A total of 90 students drawn from four consecutive semesters were targeted in this study.

We used the flipped classroom model in conjunction with reminder notifications. These tools served to overcome common student tendencies that could negatively impact performance. Within the context of the flipped learning format, students were provided with a learning module ahead of each class session and were expected to grasp factual and basic conceptual knowledge before coming to class. Using a private mobile messaging platform called Remind (Remind, 2014), a text message was sent to the students to prompt them to prepare for an upcoming formative assessment, which took the form of a weekly quiz given at the beginning of each class session. Typically, the lecture session lasted for 2 hours and 45 minutes. The text message also reminded the students that the quiz would not be given beyond the first 20 minutes of the class period and that they had to be on time. The quizzes were generally designed to be straightforward and served as confidence boosters; however, they were not so easy that students could score high marks with minimal preparation. These formative assessments were meant to act as desirable difficulties, meaning that they were achievable and yet somewhat challenging (Brown, Roediger, & McDaniel, 2014). The benchmark was set at 80% for each of these quizzes, and this expectation was frequently communicated to the students during the first few weeks of the semester. The students were also made aware that all quiz grades combined (after dropping the two lowest) would be equivalent to a unit exam grade.
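The quiz‐aggregation scheme described above can be sketched in a few lines of code. This is a hypothetical illustration only: the function name, the number of quizzes, and the sample scores are our own assumptions, not data from the course; the only details taken from the text are the 80% benchmark and the rule of dropping the two lowest quiz grades before combining the rest into a unit‐exam‐equivalent grade.

```python
def quiz_component(quiz_scores, drop_lowest=2, benchmark=0.80):
    """Combine weekly quiz scores (each on a 0-1 scale) into a single grade
    equivalent to one unit exam, after dropping the lowest scores.

    Also reports the fraction of quizzes on which the student met the
    benchmark, as a rough proxy for weekly preparedness.
    """
    if len(quiz_scores) <= drop_lowest:
        raise ValueError("need more quizzes than the number dropped")
    kept = sorted(quiz_scores)[drop_lowest:]   # drop the two lowest grades
    average = sum(kept) / len(kept)            # unit-exam-equivalent grade
    # fraction of all quizzes on which the student met the 80% benchmark
    preparedness = sum(s >= benchmark for s in quiz_scores) / len(quiz_scores)
    return average, preparedness

# Hypothetical example: ten weekly quiz scores for one student
scores = [0.9, 0.85, 0.6, 0.95, 0.8, 0.7, 0.88, 0.92, 0.75, 0.83]
avg, prep = quiz_component(scores)
```

Dropping the lowest scores before averaging is what keeps these quizzes low‐stakes: a student can have a bad week (or two) without it damaging the combined grade.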

Student disposition traits such as punctuality, attendance, and preparedness were easily gauged from the weekly quiz grades. This also made it easy for the instructor to keep track of student engagement and pay close attention to those who were struggling and those who were disinclined to engage in the learning process. Typically, about 40% of the students were in real danger of failing the course, reflecting the average failure rate for this gateway course. With the formative assessments, however, it was possible to reach out to these students from the second week onward and to communicate with the college advisors and success coaches starting around the fourth week of the semester using Starfish (Starfish, 2007). During individual consultations between the instructor and the student, the formative assessment results made it easy to identify and correct poor study habits.

Students in all four sections responded positively to these formative learning assessments and achieved the benchmark for preparedness, punctuality, and attendance. About 86% of the students achieved the benchmark for punctuality and attendance, and about 81% met the benchmark for preparedness. Overall, 83% of the students passed the course with at least a C. More importantly, 90% of the students remained in the course until the end of the semester. It must also be noted that student confidence and class morale were generally high in this learning environment. We found that formative assessments given in the context of the flipped classroom method, combined with frequent reminders, constitute a powerful learning model that leads to better student success.

Engaging Faculty in the Practice of Formative Assessment

At BCC, all faculty development programs use assessment to evaluate the effectiveness of the program. In addition, the faculty development programs integrate pedagogy and assessment into the skills that are being taught to participating faculty. Faculty development programs at BCC use a mentorship model, where faculty mentors design and run the programs and guide the participant mentees in their professional development.

As one example, the mandatory NFS is a two‐semester program for newly hired full‐time faculty members that focuses on the following major areas: onboarding for BCC practices and processes, career planning and development, pedagogy, and meaningful and effective assessment. Participants are required to complete work monthly in all areas and to produce one major artifact in each area. For assessment, they are introduced to the concepts of formative and summative assessment through the implementation of a pilot Classroom Assessment Project (CAP) in the Fall semester and the design and implementation of individual CAPs in the Spring semester. Formative assessments are introduced as a mechanism for finding the proper balance between maintaining standards and creating inclusive, student‐centered approaches to teaching and learning.

The CAP is focused on improving the educational effectiveness of one of the courses that the faculty member is teaching at the time. For the pilot CAP, the participants are given the problem to address: lack of student preparedness for class. They are then guided to choose a strategy, grounded in pedagogical theory, to address the problem. Strategies to address lack of student preparedness include online or in‐person quizzes on the reading, using a flipped classroom approach, and requiring student notes on the reading. The participants then set a benchmark, implement their chosen strategy, record the results of the formative assessments, and then share and reflect on those results with the entire group of new faculty participants.

For the Spring semester CAP, the participants use their experiences with the pilot CAP to design and implement a new CAP. This Spring CAP can be an improvement on the pilot, or it may address an entirely different problem area. The participants choose a problem area, determine a strategy to address it, set a benchmark, implement the strategy, and record and reflect on the results. The design of the CAP must be completed prior to the start of the Spring semester. Participants are expected to present the design and preliminary results of the CAP as poster presentations at BCC’s annual Faculty Day Conference in April. Written reports on the CAP are required by the end of May, once final exams are completed, and are displayed on each participant’s individual e‐portfolio. CAPs have addressed topics ranging from student preparedness to encouraging student participation in discussions to improving mathematics skills in business classes. All faculty participants are required to report the results of their projects, so every participant has real data about the actual student learning outcomes in the classes that they teach.

In the past 4 years of the program, approximately 22% (68 out of 300) of the full‐time faculty members at BCC have participated in the NFS program, and 21% (62 out of 300) have satisfactorily completed the CAP portion of the seminar. In Fall 2017, 46% of the instructors on campus were full‐time faculty members. With an average class size of approximately 22 students and instructors teaching an average of three courses per semester, about 4,000 students each semester are taught by faculty members who have completed this training.

As a complement to training new faculty members in formative assessment using the CAP, the NFS program reminds new faculty members in a timely manner of the institutional requirements of their position and the services available to students, including the use of Starfish in relaying formative assessment data, such as early progress surveys, to advisors.

Enveloping Students With Care Through Student Support Services

Formative assessment data resulting from early progress surveys provide important snapshots of student progress and are used in all advising units at BCC. Advisors can leverage this information to deliver high‐quality advising in a data‐rich environment, allowing for a more informed advising conversation between a student and an advisor.

Currently, BCC uses Starfish, an early alert and student tracking system, which simplifies the process of sharing and recording student concerns through a corresponding automated workflow. Ninety‐seven percent of FYS instructors and 79% of faculty teaching “high fail and remedial” courses file early progress surveys that provide information about attendance, missing assignments, and discipline‐specific comprehension. Faculty and others on campus can efficiently share their concerns with advisors and other support services professionals (Disability Services, Personal Counseling, Athletics, Tutoring Services), who are able to reach out and offer help in a personalized, intentional, and timely manner. The use of Starfish on campus has created a connected community where communication is streamlined and the guesswork of where and to whom to make a referral is reduced. The platform connects students with a success network that includes academic advisors or success coaches and the faculty for all classes in which they are currently enrolled. If students are associated with special programs, other support connections are included in the students’ success network.

This integrated approach to communication, support, and data allows for members within students’ success networks to craft interventions that can potentially link students to resources, engage students in reflective practice, and create new habits that build resilience. With confidence, faculty can raise a concern at any given time regarding a student issue and know that there is someone on the other side conducting appropriate outreach and providing updates on outcomes of such interventions, thus “closing the loop.”

Conclusion

We have shown through these case studies that formative assessment can be effective in gateway science classes, faculty can be trained effectively in formative assessment, and instructors can provide meaningful formative assessment data through systems like Starfish to student support services. We theorize that collaboration among instructors, advisors, mentors, and student success personnel can result in improved student success that will be realized as better retention and graduation rates.

Author(s)

Shylaja Akkaraju is a professor of biological sciences at Bronx Community College (CUNY).

Seher Atamturktur is a professor of biological sciences at Bronx Community College (CUNY).

Laura Broughton is an associate professor of biological sciences at Bronx Community College (CUNY).

Tica Frazer is associate director of the First Year Program at Bronx Community College (CUNY).

References

  • Akkaraju, S. (2016). The role of flipped learning on managing cognitive load in a threshold concept in physiology. Journal of Effective Teaching, 16(3), 28–43.
  • Angelo, T. A., & Cross, P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey‐Bass.
  • Bergmann, J., & Sams, A. (2012). Flip your classroom: Reach every student in every class every day (1st ed.). Arlington, VA: International Society for Technology in Education.
  • Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74.
  • Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: David McKay Co.
  • Boaler, J. (2013). Ability and mathematics: The mindset revolution that is reshaping education. Forum, 55(1), 143–152.
  • Bower, J. (2015, December 17). Assessment and measurement are not the same thing [Blog post]. Retrieved from http://joe-bower.blogspot.com/2015/12/assessment-and-measurement-are-not-same.html
  • Brookhart, S. M., Moss, C. M., & Long, B. A. (2008). Formative assessment that empowers. Educational Leadership, 66(3), 52–57.
  • Brown, P. C., Roediger, H. L., & McDaniel, M. A. (2014). Make it stick: The science of successful learning. Cambridge, MA: Belknap Press.
  • Brownell, S. E., & Tanner, K. D. (2012). Barriers to faculty pedagogical change: Lack of training, time, incentives, and…tensions with professional identity? CBE Life Sciences Education, 11(4), 339–346. http://doi.org/10.1187/cbe.12-09-0163
  • Cauley, K. M., & McMillan, J. H. (2010). Formative assessment techniques to support student motivation and achievement. The Clearing House, 83(1), 1–6.
  • CUNY Office of Institutional Research. (2017). Graduation rates 2016–17 for Bronx Community College. Retrieved from https://www2.cuny.edu/wp-content/uploads/sites/4/page-assets/about/administration/offices/oira/institutional/reports/integrated/2016-2017/IPEDS_GRS_2017_BCC.pdf
  • Dweck, C. S. (2008). Mindset: The new psychology of success. New York: Ballantine Books.
  • Gibbs, G. (1988). Learning by doing: A guide to teaching and learning methods. Oxford, UK: Oxford Further Education Unit.
  • Greenfield, G. M., Keup, J. R., & Gardner, J. N. (2013). Developing and sustaining successful first‐year programs: A guide for practitioners. San Francisco, CA: Jossey‐Bass.
  • Karp, M. M., & Bork, R. H. (2014). “They never told me what to expect, so I didn’t know what to do”: Defining and clarifying the role of a community college student. Teachers College Record, 116(5), 1–40.
  • Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development (Vol. 1). Englewood Cliffs, NJ: Prentice‐Hall.
  • Krathwohl, D. R., Bloom, B. S., & Masia, B. B. (1973). Taxonomy of educational objectives: The classification of educational goals. Handbook II: Affective domain. New York: David McKay Co.
  • Kreber, C. (2002). Teaching excellence, teaching expertise, and the scholarship of teaching. Innovative Higher Education, 27(1), 5–23.
  • Land, R., Cousin, G., Meyer, J. H. F., & Davies, P. (2005). Threshold concepts and troublesome knowledge: Linkages to ways of thinking and practicing (3)—Implications for course design and evaluation. In C. Rust (Ed.), Improving student learning: Diversity and inclusivity (pp. 53–64). Oxford, UK: Oxford Centre for Staff and Learning Development.
  • Maatta, K., & Uusiautti, S. (2012). Pedagogical authority and pedagogical love: Connected or incompatible? International Journal of Whole Schooling, 8(1), 21–39.
  • Moss, C. M., & Brookhart, S. M. (2009). Advancing formative assessment in every classroom: A guide for instructional leaders. Alexandria, VA: ASCD.
  • Nicol, D. J., & MacFarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218.
  • Remind. (2014). Remind school communication [Mobile application software]. Retrieved from http://remind.com
  • Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.
  • Starfish. (2007). Starfish enterprise success program [Computer software]. Retrieved from http://starfishsolutions.com
  • Stefanakis, E. H. (2002). Multiple intelligences and portfolios: A window into a learner’s mind (p. 9). Portsmouth, NH: Heinemann.
  • Stiggins, R. (2002). Assessment crisis: The absence of assessment for learning. Phi Delta Kappan, 83(10), 758–765.
  • Stiggins, R. (2008). Assessment manifesto: A call for development of balanced assessment systems. Princeton, NJ: Educational Testing Service.
  • Stiggins, R., & DuFour, R. (2009). Maximizing the power of formative assessments. Phi Delta Kappan, 90(9), 640–644.
  • Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition. Chicago, IL: University of Chicago Press.
  • U.S. Department of Education, National Center for Education Statistics. (2018). Trend generator. Retrieved from https://nces.ed.gov/ipeds/trendgenerator/#/
  • Worthen, M. (2018, February 23). The misguided drive to measure learning outcomes. The New York Times. Retrieved from https://www.nytimes.com/
 
