Introduction
Curiosity may have killed the cat, but it also enticed Alice to enter Wonderland. Curiosity is the drive to seek knowledge to fill a gap in our understanding, without an extrinsic reward (Loewenstein, 1994). Curiosity is central to human evolution and learning since our minds desire to eradicate uncertainty. Therefore, when shown something unfamiliar, our first impulse is to explore, seeking information to fill the knowledge gap (Loewenstein, 1994; Shin & Kim, 2019). As assessment professionals, we can build on our colleagues’ natural curiosity to engage them in the assessment process and overcome barriers to assessment, such as fear and institutional culture. This article summarizes the current state of assessment within student affairs, highlighting its importance in enhancing student experiences and outcomes. It also explores how student affairs practitioners use curiosity to reframe assessment practices within the scope of their roles, providing innovative approaches to improve their effectiveness.
Assessment within Student Affairs
Before sharing how curiosity can be used to motivate student affairs professionals to engage in the assessment process, we must first understand the ways student affairs practitioners typically perceive assessment efforts. In three studies of student affairs professionals at the graduate (Castillo-Montoya, 2020), early career (Hoffman, 2010), and senior levels (Parnell et al., 2018), the authors found that student affairs professionals were consistently interested in learning assessment skills and understood the importance of assessment. However, all three studies revealed that participants were not confident in their ability to use data, nor did they know how to use the data they collected. Two studies, by Cox et al. (2017) and McCaul (2015), found that higher education institutions are better at collecting data than using it effectively, with institutions reporting that they used their data to make improvements to offerings less than 40% of the time. Both studies found that one reason data was not frequently used was a lack of curiosity about the assessment results beyond simply satisfying a stakeholder request or completing an external process (e.g., accreditation, grant writing).
The barriers and fears surrounding assessment often prevent the formation of positive cultures of assessment at higher education institutions. In a survey of 771 mid-level or higher student affairs professionals, Fuller and Lane (2017) found that the following factors explained over half the variance in institutions’ attempts to create a culture of assessment: the division’s commitment to assessment, the communication of assessment results, the connection of assessment data to change processes, and the fear of assessment being manipulated to tell a false story or used to punish others. They found an inverse relationship between the first three factors (the division’s commitment to assessment, the communication of assessment results, and the connection of assessment data to change processes) and fear of assessment, implying that fear of assessment may co-exist even with a culture of assessment (Fuller & Lane, 2017). Similarly, a study of three institutions with strong cultures of assessment found that fear of assessment and the prioritization of assessment were still barriers to the large-scale, successful implementation of assessment results (Green et al., 2008). However, several studies have demonstrated that employees’ fear of assessment can be dispelled through proactive relationship building, engaging in conversations about the analysis and use of assessment data, and showcasing how assessment data is being used (Beshara-Blauth, 2018; Fuller & Lane, 2017; Julian, 2013; Ridgeway, 2014). These results imply that proactively building relationships and broadly sharing assessment results can help motivate student affairs staff members to engage positively in assessment processes.
Current Research on Curiosity as a Motivator
Although there has been limited research linking curiosity to higher education assessment, other industries have identified curiosity as a motivator for overcoming negative expectations and increasing job satisfaction. For example, in a study of how to entice Western consumers to eat insects, Stone et al. (2022) found that curiosity was the strongest predictor of willingness to try insect-based and other unfamiliar foods, even after accounting for initial prejudice. Litman (2005) theorized that curiosity is dynamic and can be increased by changing whether people want and/or like new information; as wanting and liking increase, Litman hypothesized, curiosity and the accompanying dopamine response also increase. A study of over 200 hotel workers found that social curiosity, the drive to understand people, positively mediated the relationship between the guest–employee relationship and employees’ job satisfaction, even for unsatisfied employees (Guo & Ayoun, 2023). Because social curiosity is a dynamic trait, the authors hypothesized that workplaces that frame guest interactions as opportunities for social curiosity could improve employee satisfaction and reduce turnover. Similarly, a study by Siregar and Suma (2024) of approximately 200 finance employees found that allowing for curiosity, operationalized as innovative work behaviors, increased employees’ performance and job satisfaction. The authors hypothesized that employees who are given the freedom to investigate new ideas, strategies, and growth opportunities find greater meaning in their work (Siregar & Suma, 2024). By creating environments where curiosity is encouraged and positively reinforced, organizations make it more likely that employees will find satisfaction in their work.
Method
This qualitative study used an exemplar sampling strategy to identify student affairs directors who were highly skilled at using assessment data to make changes and to understand how they were able to do so. In an exemplar sampling strategy, “researchers deliberately identify and study a sample of individuals or entities that exhibit a particular characteristic in an exceptional manner” (Bronk, 2012, p. 1). The following criteria were used to define being highly skilled at using assessment, based on definitions from NASPA and ACPA’s Professional Competency Areas for Student Affairs Educators, specifically the Assessment, Evaluation, and Research competency (2016); ACPA’s Assessment Skills and Knowledge (ASK) Standards (Mitchell, 2006); and works such as Henning and Roberts (2016), Schuh (2013), Suskie (2018), and Yousey-Elsener et al. (2015):
- A person who “closes the loop,” taking data that has been analyzed and implementing the results into future practice.
- A person who uses data to make changes, including improvements, deletions, and continuations to programs, events, and services. Their use of data goes beyond accountability or reporting to a stakeholder.
- A person who communicates the data they collected with others to better understand the findings.
- A person who considers the ethics, reliability, and accuracy of the data they are using.
- A person who uses their data regularly (more than once a year) to guide their department.
- A person who has been the director of a student affairs department for at least two years to ensure they have sufficient experience using assessment data.
Student affairs directors were nominated by assessment professionals employed at universities within the United States. After a director was nominated for the study, an hour-long interview was scheduled. During the interview, directors shared at least one document showing how they used assessment to create change. Data was collected in the spring and summer of 2022. A total of 16 nominations resulted in 10 interviews and 33 documents. Interviews were transcribed, and the data was analyzed and coded in two rounds.
Results
Five of the interviewed directors named curiosity as a strategy to motivate their colleagues to participate in the assessment process. There were three ways curiosity supported change: lowering the barrier to entry, using visualizations to answer questions in real-time, and demonstrating how curiosity and data can support systemic change.
Lowering the Barrier to Entry
Aviva and Liz spoke about how curiosity lowers the barrier to assessment for new staff. Aviva often used curiosity as an entry point to the assessment process, since it’s “a softer entry point for folks too because it feels safe and non-technical. You don’t even have to know how to use data …hopefully those conversations would be really nurturing and supportive.” Aviva explained how she held this type of conversation with a new staff member who told her, “Assessment isn’t really my thing.” Rather than try to convince the staff member that they were wrong, Aviva decided to try to engage their curiosity by saying:
Let’s start with what you’re curious about and see if we can get you some data that will answer that. Then I think you’re going to be hooked because you’re going to realize that this piece, that’s an unknown or theoretical, because you’re reading about it at other institutions, you can tap into that more locally. I feel like there’s something powerfully addictive about being able to see insights from data.
Aviva’s choice to find a shared interest and encourage the staff member to ask a question that could be answered with data was an innovative way to convince them not to fear assessment.
Although employed by another higher education institution, Liz used similar language to describe how she introduces the topic of assessment to her staff members. For example, Liz first asks her staff to identify a topic of interest: “Find something that’s meaningful to you and start with that, because then when you get excited about it, you’re going to explore other ways to tell the story.” Liz wanted her staff to find ways to use assessment results that would be meaningful to them. She also acknowledges to her staff that there are many right ways to collect and analyze assessment data to answer questions, and she encourages her staff to explore the strengths of different assessment methods and to determine the most appropriate method for answering their questions. By inviting her staff to identify topics they are curious about, Liz motivates her staff to engage in the assessment process and helps make the topic of assessment more approachable within the department.
Using Visualizations to Answer Questions in Real Time
Erica and Gary spoke about how they used visuals to inspire curiosity among their staff members, which led to change based on the assessment results. Erica shared how she uses the learning community student success dashboard to inspire curiosity when meeting with academic colleges and campus partners. The dashboard shows how students in the learning community compare to their peers, accounting for demographic differences and visualizing metrics such as retention and GPA. The dashboard allows staff to start “disaggregating student participation data to see if there’s equity gaps in terms of particular students participating in or not participating in a learning community.” Erica noted that she used the dashboard during a meeting with the international services office to highlight engagement differences between international students and their domestic peers. Subsequently, both offices probed deeper into the issue to identify the barriers to international students’ engagement in the learning communities. They learned that the learning community enrollment process was the main barrier to international students’ enrollment and participation. Erica shared that she and her team were able to “hone in on what needed to be changed as an institutional approach to seat reservation that would allow us to get those students into the learning community,” resulting in a 20% increase in participation by international students.
Gary also found that having a visual data tracker helped people become curious about the data. As the director of the Student Union, Gary created an Excel tracker to show work orders, room reservations, event attendance, and average hourly building entry. While his staff was initially resistant to using the data tracker, Gary used staff meetings, line meetings, and individual appointments to model that it was safe to be curious and ask questions such as, “What types of reservations are we seeing and what spaces by what kinds of orgs? What kinds of groups?… How many times are we putting in requests for these different things [work orders]?” After posing the questions, he demonstrated how to use the data tracker to answer them. He noted that as the culture grew,
They see the dots connected, so I think that you have to get over the hump of like the resistance but then it becomes part of the culture. And now I see like our student activities associate director, we had a meeting last week with a staff and they’re making changes to their structure of the Student Activities Council, based on the data captured.
Both Gary and Erica engaged the curiosity of their staff members by asking good questions and demonstrating how dashboards and trackers can be used to answer them. They trained staff on how data can be used to answer questions that lead to improving the services their offices provide. Further, they showed staff members how to interact with visual data in group settings, which led to substantive changes at their respective institutions.
How Curiosity and Data Support Systemic Change
CJ spoke about the importance of engaging the curiosity of his team as a motivator for creating departmental and institution-wide change. CJ, who serves as the director of student conduct, explained as someone who loves assessment, “We have to be good qualitative analysts, and asking questions, and we have to stay curious in our work regardless of what functional area because… [without it] we cannot create systemic change.” For example, CJ explained that he helped to create systemic change in 2021 through an ad hoc equity analysis of Code of Conduct violations. He was curious about why international students were found in violation of the code more frequently than domestic students. In seeking the answer, he identified two issues: (1) few international students served on the hearing board, and (2) international students did not receive the same introduction to the Code of Conduct as domestic students. Based on these findings, “we ended up doing some strategic recruitment for that [hearing board membership]. And we got in for International Student Orientation and started normalizing some of the expectations around…the code.” At the time of the interview, CJ was working on a similar analysis of why “more male-identified folks go through our process than female-identified, which is counterintuitive to our enrollment.” CJ said he expected to use assessment data to continue making changes at the university, both at the office level and at the system policy level.
Discussion
Engaging the curiosity of student affairs professionals during the assessment process can powerfully encourage the collection and use of data to drive change initiatives aimed at improving services and programs. The student affairs directors in this study framed assessment as more than an administrative process, accreditation requirement, or best practice; instead, they engaged their employees in an information-seeking expedition to answer questions. This framing engaged staff members’ dopaminergic reward systems, which research has linked to positive information-seeking experiences, and it gave directors an opportunity to connect socially with their staff (Litman, 2005; Loewenstein, 1994; Shin & Kim, 2019). For assessment professionals seeking to engage staff in the assessment cycle, our participants noted that a powerful first step is to frame the assessment process as explorative, information seeking, and gap filling. Erica accomplished this by asking her team to look at a specific sub-population and speculate about why there was a difference in the data. Similarly, Aviva started by asking her team about topics they were curious about. In both cases, the director highlighted an information gap and then invited colleagues to collaborate to fill that gap. Each director knew the importance of first building a positive baseline regarding assessment before expanding their office’s assessment efforts.
Previous studies on curiosity in the workplace (Guo & Ayoun, 2023; Litman, 2005; Siregar & Suma, 2024) have identified the importance of creating a work environment that fosters curiosity, defined as innovation. Embracing curiosity in the workplace requires updating assessment processes and moving beyond measuring the same metrics each year to identify incremental and longitudinal changes. There is no doubt that assessment professionals must fulfill their obligations to provide data to internal and external agencies, but that should be only a portion of a comprehensive assessment plan. For example, CJ noted that his office’s assessment plan included tracking metrics of student satisfaction and learning and counts of different types of infractions to comply with the Clery Act, but he also conducted additional analyses of the data on equity issues to create substantive change that better meets the needs of all students. Liz also encouraged curiosity by working with her staff to support different methods of answering the same question, based on the strengths and interests of staff members. Her assessment plan allowed for flexibility, but she still insisted on being able to answer the questions her institution needed answered for continuous improvement. Assessment professionals can partner with student affairs professionals to co-create assessment plans that meet the needs of internal and external stakeholders while still embracing curiosity as part of the process.
When engaging in a curiosity-based approach to assessment, scholars highlight the importance of making people feel safe to engage in curiosity without repercussion (Litman, 2005; Siregar & Suma, 2024). Similarly, it is important to create a safe environment when looking to improve the assessment culture (Fuller & Lane, 2017; Green et al., 2008). As demonstrated in this study by Gary, Erica, and Liz, a curiosity-based approach to assessment encourages assessment professionals to build trust-filled relationships, to collect and analyze data with colleagues, and to co-create change based on the data. Engaging in conversations with colleagues is consistent with the advice provided in studies by Beshara-Blauth (2018), Julian (2013), and Ridgeway (2014), which emphasized the importance of relationships in building a positive culture of assessment. Positive cultures of assessment provide the safety needed for curiosity to flourish.
Conclusion
Curiosity is hard-wired into the human brain, and answering questions through information-seeking behaviors provides a pleasurable experience. Using curiosity within the assessment process can reduce barriers around assessment, increase completion of the assessment cycle, and help make assessment enjoyable and meaningful. The directors in this study exemplified specific ways curiosity leads to a positive experience in collecting, analyzing, and, even more importantly, using data to make changes. Let’s role model flexing our curiosity muscles with our colleagues to engage in meaningful assessment processes that lead to positive change!
Authors
Dr. Rebecca Goldstein serves as the Director of Assessment and Research at Florida Atlantic University. In this role, she leads strategic planning at the divisional level and oversees strategic planning and assessment efforts for 20 departments. She also leads analytics and dashboard use for the Division of Student Affairs and collaborates with Academic Affairs to determine Student Success Metrics. Rebecca earned her Doctor of Philosophy in Higher Education Leadership from Florida Atlantic University; her dissertation, “How Student Affairs Directors Use Assessment to Make Changes,” focused on the actions directors take to put their data into practice to improve their organizations. She also has an M.Ed. in Higher Education and Student Affairs from the University of South Carolina and a B.S. in Environmental Science from Indiana University. Her professional interests include big data analytics, assessment use and practice, organizational leadership and dynamics, and business approaches to student affairs.
Dr. Jennifer Bloom is a tenured Professor in the Department of Educational Leadership and Research Methodology and the Founder of the Office of Appreciative Education (https://www.fau.edu/oae/) at Florida Atlantic University (FAU). She previously served on the faculty at the University of South Carolina and as Associate Dean for Student Affairs & the Medical Scholars Program at the University of Illinois College of Medicine at Urbana-Champaign. Dr. Bloom was the 2007-08 President of NACADA: The Global Community for Academic Advising and, in 2017, received NACADA’s Virginia N. Gordon Award for Excellence in the Field of Advising. She has co-authored six books and numerous articles and has presented at 500+ institutions and conferences.
Works Cited
Beshara-Blauth, A. M. (2018). Talk data to me: Bolstering the communication of data to facilitate data-informed decision making in community colleges (Publication No. 10931840) [Doctoral dissertation, University of Maryland University College]. ProQuest Dissertation & Theses Global.
Bronk, K. C. (2012). The exemplar methodology: An approach to studying the leading edge of development. Psychology of Well-Being: Theory, Research and Practice, 2(5). https://doi.org/10.1186/2211-1522-2-5
Castillo-Montoya, M. (2020). Assessment: The “wild card” in student affairs. Journal of Student Affairs Research and Practice, 58(1), 79–93. https://doi.org/10.1080/19496591.2019.1707090
Cox, B. E., Reason, R. D., Tobolowsky, B. F., Brower, R. L., Patterson, S., Luczyk, S., & Roberts, K. (2017). Lip service or actionable insights? Linking student experiences to institutional assessment and data-driven decision making in higher education. Journal of Higher Education, 88(6), 835–862. https://doi.org/10.1080/00221546.2016.1272320
Fuller, M. B., & Lane, F. C. (2017). An empirical model of culture of assessment in student affairs. Research & Practice in Assessment, 12, 18–27.
Green, A. S., Jones, E., & Aloi, S. (2008). An exploration of high-quality student affairs learning outcomes assessment practices. Journal of Student Affairs Research and Practice, 45(1), 133–137. https://doi.org/10.2202/1949-6605.1910
Guo, Y., & Ayoun, B. (2023). What’s in it for them? The role of social curiosity and social needs in motivating and retaining hospitality employees. International Journal of Hospitality Management, 115. https://doi.org/10.1016/j.ijhm.2023.103596
Henning, G., & Roberts, D. (2016). Student affairs assessment: Theory to practice (1st ed.). Stylus Publishing.
Hoffman, J. L. (2010). Perceptions of assessment competency among new student affairs professionals (Publication No. 3437511) [Doctoral dissertation, University of California, Los Angeles]. ProQuest Dissertation & Theses Global.
Julian, N. D. (2013). Exploring the culture of assessment within a division of student affairs (Publication No. 3570909) [Doctoral dissertation, California State University, Fullerton]. ProQuest Dissertation & Theses Global.
Litman, J. (2005). Curiosity and the pleasures of learning: Wanting and liking new information. Cognition and Emotion, 19(6), 793–814. https://doi.org/10.1080/02699930541000101
Loewenstein, G. (1994). The psychology of curiosity: A review and reinterpretation. Psychological Bulletin, 116(1), 75–98. https://doi.org/10.1037//0033-2909.116.1.75
McCaul, J. L. (2015). Closing the loop: A study of how the National Survey of Student Engagement (NSSE) is used for decision-making and planning in student affairs. Western Michigan University.
Parnell, A. R., Jones, D., Wesaw, A., & Brooks, D. C. (2018). Institutions’ use of data analytics for student success: Results from a national landscape analysis. NASPA, AIR and EDUCAUSE.
Ridgeway, L. (2014). The role of the senior student affairs officer in creating and sustaining a culture of assessment: A case study. Clemson University.
Schuh, J. H. (2013). Developing a culture of assessment in student affairs. New Directions for Student Services, 2013(142), 89–98. https://doi.org/10.1002/ss.20052
Siregar, B. A., & Suma, D. (2024). How innovative work behavior affects employee performance: The mediating role of curiosity. Quality-Access to Success, 25(200). https://doi.org/10.47750/qas/25.200.30
Stone, H., FitzGibbon, L., Millan, E., & Murayama, K. (2022). Curious to eat insects? Curiosity as a key predictor of willingness to try novel food. Appetite, 168. https://doi.org/10.1016/j.appet.2021.105790
Suskie, L. A. (2018). Assessing student learning: A common sense guide (3rd ed.). Jossey-Bass, a Wiley Brand.
Yousey-Elsener, K., Bentrim, E. M., & Henning, G. (Eds.). (2015). Coordinating student affairs divisional assessment: A practical guide. Stylus Publishing, LLC.