Mapping the Response to AI and Its Impact on Assessment Redesign Through Document Analysis

Abstract

This paper investigates the response of higher education institutions to the integration of generative artificial intelligence (GenAI) in assessment design, prompted by the advent of tools like ChatGPT, Claude, Gemini, and Bard. Through a document analysis of 135 English-language websites, including university and educational organization guidelines, the study identifies and discusses emerging trends and approaches to incorporating GenAI into educational assessments. It scrutinizes how universities and organizations initially addressed the impact of GenAI, focusing on the guidance provided to instructors. The analysis reveals seven principal themes: the terminology used to describe AI’s role in assessment, provisional principles guiding its use, advice to instructors on implementing GenAI, the opportunity to refocus on the purpose of assessment, examples and roles of AI in assessment, relevant theories and pedagogies, and the recognition of this period as the initial wave of GenAI integration in education. The paper highlights the diverse approaches and formats of these guidelines and emphasizes the need for continuous adaptation and reevaluation as the understanding of GenAI in education evolves. This study serves as a foundational exploration of the initial institutional responses to GenAI, setting the stage for future research and development in this rapidly changing field.

Introduction
Since the release of ChatGPT (https://chat.openai.com/) in November 2022, followed by other generative AI (GenAI) tools such as Claude (https://claude.ai/), Gemini (https://deepmind.google/technologies/gemini/), and Bard (https://bard.google.com/), universities and organizations have been striving to create guidelines for instructors on how to use GenAI in their assessment design. The guidelines vary in their language and goals, and they will undoubtedly change as the understanding of GenAI and its use develops.

In this paper, assessment is used to describe the systematic process of evaluating and measuring student learning. It includes formative and summative assessment as well as assessment for learning, as learning, and of learning that take place at the course level.

This paper investigates these initial guidelines and outlines how pioneer universities and educational organizations have addressed the presence of GenAI and its impact on assessment. My research question was “What guidance related to GenAI and assessment do higher education institutions provide to their instructors?” My findings create a starting point from which the use of AI in assessment design can be tracked to inform future updates to institutional guidelines.

I conducted this analysis on publicly available online guidance documents from higher education institutions and educational organizations using online search techniques, listservs, and a Padlet on GenAI (CETL et al., 2023). Document analysis is the process of examining and evaluating documents to extract useful information, gain insights, or make informed decisions “in such a way that empirical knowledge is produced and understanding is developed” (Bowen, 2009, p. 33). Thematic analysis can be considered a form of pattern recognition within the data, in which emerging themes become the categories used for further analysis, making it a useful practice for grounded theory. It involves careful, focused reading and rereading of data, as well as coding and category construction.

Exclusion and Inclusion Criteria
I reviewed a total of 135 English-language websites for the analysis, of which 98 (73%) had content related to AI and assessment and, notably, 37 (27%) did not. The content had to pertain to institutional approaches to GenAI and assessment. I excluded course-level syllabi, opinion pieces, and policies that did not explicitly mention assessment. I included guidance or policy documents created by university teaching and learning centers, provost offices, or specific faculties. In addition to guidance from universities around the world, I also reviewed that of educational organizations such as UNESCO, the EdCan Network, Jisc, the Tertiary Education Quality and Standards Agency, and the European Network for Academic Integrity Working Group Centre for Academic Integrity in the UAE (E-CAIU).

Findings
The analysis revealed that universities took diverse approaches to the creation of these guidance documents, their formatting, and the messages portrayed. Broadly speaking, documents related to GenAI could be grouped into policy documents, resources for students, resources for staff, and blogs. Some universities had webpages dedicated to AI and assessment, others had FAQ pages, and still others had links to external resources (e.g., Georgetown referenced Vanderbilt and Yale) or adopted a regional approach (e.g., Australasian Academic Integrity Network, 2023). Macquarie University in Australia had resources that needed a username and password to be accessed. The analysis is summarized in seven themes: (a) terminology; (b) provisional principles; (c) advice to instructors; (d) an opportunity to refocus on the purpose of assessment; (e) examples, uses, and role of AI; (f) theories, pedagogies, concepts, and analysis; and (g) first wave.

Terminology
The terminology used to describe the relationship between AI and assessment varied among institutions. Differences were closely related to the role of GenAI in assessment or the relationship between GenAI and assessment. Some terminology acknowledged the important role of AI in assessment, such as “AI-enabled assessment,” “AI-driven assessments,” “GenAI enhanced,” “AI-based,” and “advancing assessment with AI.” Other sources, such as Princeton University (McGraw Center for Teaching & Learning, n.d.), explained that AI had only a minimal impact, acting merely as a compiler. Some wording demonstrated the relationship people have with AI, such as the phrase “assessments for an AI-enabled world” used at University College London (n.d., para. 1).

Some documents used terminology that did not include GenAI at all, but rather referred to new ways of thinking about assessment given the GenAI disruption, such as forward-thinking assessment (Volante et al., 2023), rethinking assessment practices (Assessment, AI and Academic Integrity, n.d.), assessment reform (Jisc, 2023), and future-focused assessment (Monash University, n.d.). Notably, Anohina (2005) observed similar phrasing when analyzing the terms used to describe virtual learning.

In 2005, Donohue and Howe-Steiger described terms related to e-learning as a cacophony of jargon. The same description could be applied in 2023 to terms related to GenAI and assessment. Nevertheless, it is important to analyze these terms because each carries its own nuance. Instructors and institutions are dealing with a recent phenomenon that does not yet have established definitions. A deeper analysis of this terminology would therefore improve understanding of how the role of AI is perceived, how it is projected to impact assessment design, and what links exist between terminology, identity, and learning context.

Provisional Principles
The second theme present in the guidance documents was a set of “provisional principles,” a term borrowed from McMaster University (n.d., 2023). These principles highlight the complex relationship between academia and GenAI, acknowledging both the risks and opportunities this technology presents and emphasizing the varied reactions and expectations it elicits. The first provisional principle I found in my data refers to GenAI’s multifaceted impact. The University of Guelph (2023) exemplified this impact, speculating that the application of GenAI will be influenced by different disciplinary cultures.

A second principle, concerning usage policies, is a consensus against outright bans on GenAI in assessments. Although some educators may consider implementing a full ban, it is important to weigh whether such a prohibition might inadvertently restrict GenAI as an assistive tool, or whether it could even be enforced. Detection of unauthorized GenAI use is generally deemed unfeasible (University College London, n.d.), yet institutions such as Singapore Management University (2023) have still advocated for monitoring where possible, and the University of Delaware (Guidry, 2023) provided links to two detection tools.

A third principle acknowledges GenAI’s growing role in education and the need to adjust academic spaces accordingly. The projected omnipresence of AI and its tools in everyday life, as the University of Alberta (Centre for Teaching and Learning, n.d.-a) has noted, necessitates adaptation in academic spaces as part of teaching modern learners. The University of Warwick (Fischer et al., 2023) further underscored the linkage between technology and pedagogies, advocating for their joint consideration in developing comprehensive assessment strategies.

Transparency was also considered an important principle. Recommendations included providing clear reasons behind the decisions of the instructors (Centre for Teaching and Learning, University of Alberta, n.d.-a, n.d.-b), demonstrating to the learners that instructors are also learning about AI (Centre for Learning and Teaching, n.d.), and designing for transparency (Teaching and Learning Resource Center, n.d.).

For the most part, the use of GenAI in assessment has been left to the discretion of instructors, so its utilization varies, a variation that is itself indicative of the far-reaching consequences of any policy. The guidance documents I reviewed provided a range of actions that could be taken but did not specify any one action that should be followed by everyone (e.g., Cornell University, n.d.; Institute for Teaching and Learning Innovation, 2023; The CIEL Blog, 2023). The documents did include some directions for instructors; for example, the British Columbia Institute of Technology (n.d.) asked, “If an assignment is easily done by an automated response system, is it worth asking students to do it?” (p. 3).

Among the universities whose guidelines were included in this analysis, some acknowledged that AI will have a role in preparing students for future careers. The University of Technology Sydney (LX Team, 2023), for example, has promoted the concept of “AI-ready” lawyers. This forward-thinking approach includes developing GenAI literacy among students and staff. The Russell Group (n.d.) emphasized equipping staff to support students in using AI tools effectively and appropriately. Such preparation is integral to ensuring that graduates are ready for a technologically advanced workforce.

A final principle I uncovered is a cautionary one against prioritizing assessment security over authentic learning experiences, equity, and student well-being (e.g., Centre for Pedagogical Innovation, n.d.). This principle encourages educators to consider how AI aligns with learning outcomes, course content, degree-level expectations, experiential learning, learning assessment outcomes, and work-integrated learning. The intent is to balance technological advancements with core educational values, ensuring that assessments are fair, relevant, and beneficial to all students.

Advice to Instructors
Most of the guidance documents I reviewed had a section that contained advice for instructors, constituting an institutional response to instructor expectations. Much as during the COVID-19 pandemic, faculty members are looking for guidance and advice, and by providing it, universities have acknowledged instructors’ need for it. As part of the wide array of advice, different approaches were used to describe how instructors might allow the use of GenAI in their assessment. Some documents included the benefits and limitations of AI in assessment (Teaching and Learning Resource Center, n.d.). Advice on wording was a prevalent topic (e.g., Centre for Teaching and Learning, University of Alberta, n.d.-b), and many sources included examples of wording in syllabi, such as Georgetown University (Center for New Designs in Learning and Scholarship, n.d.-a, n.d.-b) and the University of Delaware (Guidry, 2023). Many universities provided different usage scenarios. For example, if instructors at the University of Alberta chose to allow the use of GenAI for brainstorming, they could use the following sentence:

You are asked to use Generative AI tools in this course. AI use will, however, be dependent on assignment and assessment requirements. Please follow all assessment task-specific directions and guidance as provided. If you have any questions or concerns, please do not hesitate to ask during office hours or after class. (Centre for Teaching and Learning, University of Alberta, n.d.-b, Sample Statement A)

The documents also contained suggestions that instructors could share with their students on how to acknowledge the use of AI. Some guidelines gave advice on the choice of the assessment itself (Instructional Technology and Design Services, n.d.; Liu & Bridgeman, 2023a). Many documents provided additional GenAI tool suggestions that instructors could explore and recommend to their learners. Some universities had built their own versions of GenAI tools to serve particular purposes. Liu and Bridgeman (2023a), at the University of Sydney, for instance, had a unique resource that explained how instructors could assess students’ use of AI. The resource contained rubrics that evaluated the use of AI in the assessment, among other elements.

Georgetown University (Center for New Designs in Learning and Scholarship, n.d.-a) included a list of questions that instructors could ask themselves when designing their assessment, such as “How might students use AI tools while working on this assignment? How might AI undercut the goals of this assignment? How could you mitigate this? How might AI enhance the assignment? Where would students need help figuring that out?” (Questions to ask section, para. 1).

Another recurring piece of advice, found for example at Macquarie University (Kozar, 2023), was to ask students to keep an audit trail or logbook of how they had used GenAI throughout the course. Instructors were advised to have learners reflect on their logbooks and to grade that reflection. In fact, some universities, such as Ohio State University (Teaching and Learning Resource Center, n.d.), recommended that instructors learn from their students and be even more active in asking them how they are using AI.
Of interest, Chapman University (n.d.) was the only university in the list that discussed accessibility. In a section on how AI technologies might help people with communication disabilities, the university provided examples of how GenAI could make assessment more equitable. As GenAI becomes more relevant and understanding of its use grows, there should be more focus on how to use GenAI for greater equity.

An Opportunity to Refocus on the Purpose of Assessment
As the University of Lethbridge (Shapiro, 2023) has noted, educators must rethink assessment methods. Refocusing on the purpose of assessment is a theme that represents a new direction for assessment. This notion allows instructors to move away from policing students’ use of GenAI and instead focus on the assessment itself and consequently on students’ learning. Some universities have differentiated between short- and long-term approaches to assessment redesign, highlighting that the long-term approach should be to redesign assessment entirely (King’s College London, n.d.). Most of the universities have focused on giving instructors advice and resources to design alternative ways of assessing students.

This theme also included the concepts of iterative assessment (Centre for Pedagogical Innovation, n.d.), scaffolded assessment (Centre for Teaching, Learning and Technology, 2023), and nested or staged assessment (Assessment, AI and Academic Integrity, n.d.), which allow learners to learn during the assessment and allow instructors to provide adequate feedback. Refocusing on the purpose of assessment entails the creation of meaningful assessments in which students can extend or expand upon any content created by GenAI (e.g., Liu & Bridgeman, 2023a). It also entails designing authentic assessments, a theme found in the majority of the documents.

Rethinking also means focusing on employability. Instructors have been advised to rethink what skills would make their graduates more employable and design their assessments around those skills (Khan, 2023; Kinash et al., 2018; Shapiro, 2023). Assignments that can be easily completed by GenAI systems are of questionable value (British Columbia Institute of Technology, n.d.). Ohio State University (Teaching and Learning Resource Center, n.d.) remarked upon the undeniable reality of the implications of AI in education and the need to address them. These perspectives collectively highlight the flexibility and thoughtfulness required in incorporating GenAI into academic settings, as well as the opportunities for innovation in course and assessment design.

Examples, Uses, and Role of AI
Most of the documents in my analysis included examples of how instructors are using, or could be using, GenAI in their assessment. Some universities provided real examples from the university itself (Centre for Teaching, Learning and Technology, 2023; Liu & Bridgeman, 2023a). This use of context-specific examples is an indication of a learning organization (Kools & Stoll, 2016). It also offers the potential for social learning (Kendal et al., 2005), allowing the instructors to learn what is happening at their own university (Shettleworth, 2009). A second type of example was the provision of links to examples at other universities, like the ones found at Chapman University (n.d.) and the University of Delaware (Guidry, 2023). A third variation was hypothetical examples of what instructors could include and comparisons between what an assignment might look like before and after integrating GenAI (e.g., Center for New Designs in Learning and Scholarship, n.d.-a). Georgetown University (Center for New Designs in Learning and Scholarship, n.d.-b) also included the rationale that instructors might use while choosing one approach or the other for the use of AI in assessment.

Along the same lines, Flinders University (n.d.) described the different affordances of AI in assessment: AI in the planning stages of a task, AI as a core part of the task, AI for self-testing, and AI as a copyediting tool. Other universities described the different roles that AI can take in the assessment: AI as initial support for ideation and brainstorming (Liu & Bridgeman, 2023a), AI as feedback support, AI as content generator for creativity and research (Centre for Teaching, Learning and Technology, 2023), AI as optimizer for rubrics and assessments, AI as a tool for self-directed learning (Chapman University, n.d.), and AI as a copilot (Liu & Bridgeman, 2023b). Similarly, some universities described the tasks that students could do while using AI, such as analyzing GenAI output, assessing the quality of GenAI output, and comparing GenAI output, among other examples (e.g., Instructional Technology and Design Services, n.d.).

Some universities provided links to specific GenAI tools that instructors could direct their students to. The Centre for Teaching, Learning and Technology (2023) suggested that instructors explore Perplexity (http://perplexity.ai/), and Tufts University (Center for the Enhancement of Learning and Teaching, 2023) recommended that instructors explore Elicit (https://elicit.com/), Consensus (https://consensus.app/search/), and AI Essay Writer (https://www.the-good-ai.com/). This theme is a large one, and deeper analysis is needed to differentiate between the different roles that GenAI can play. Such an analysis is beyond the scope of this paper but will be the focus of a future research paper.

Theories, Pedagogies, Concepts, and Analysis
This theme had a minor presence in the data. A few documents suggested capitalizing on the shortcomings of GenAI when designing assessments, such as one from UC Berkeley (n.d.). Tufts University (Center for the Enhancement of Learning and Teaching, 2023) linked the use of GenAI to theories of change (Latour, 1984) and explained that the decision to use GenAI in assessment could comprise the following steps: resisting and deflecting, reflecting and adapting, and embracing and redesigning. Relational pedagogies such as ethics of care, pedagogy of care, connectedness (Adams, 2018; Noddings, 2005), and trauma-informed pedagogy (Venet, 2021) were also mentioned as ways to create more meaningful assessments, such as at Brock University (Centre for Pedagogical Innovation, n.d.) and Dalhousie University (Centre for Learning and Teaching, n.d.).

Some of the documents included a focus on students, student partnerships, and the important role that students play in the assessment process. For example, the Assessment, AI and Academic Integrity resource (n.d.) at the University of Melbourne dedicated multiple pages to highlighting the experiences of students. Jisc (2023) also published a report about students’ perceptions of the use of AI in assessment.

The University of Maryland (Teaching and Learning Transformation Center, 2023) included the constructive alignment concept (Biggs, 1996), in which learning outcomes are clarified before teaching takes place, as a priority when designing assessments. E-CAIU (Khan, 2023) also addressed the importance of revisiting assessments to make sure they align with learning outcomes. As well, the learning outcomes themselves might need to be reviewed (UNESCO, 2023). The enactment of these theories, pedagogies, and concepts as a result of GenAI is also an important field of study, especially compared to the surge in the use of relational pedagogies during the pandemic.

First Wave
The seventh theme revealed in the document analysis captures the idea that this is the first wave of guiding documents. For example, the University of Alberta (Centre for Teaching and Learning, n.d.-b) specified that “the content below presents initial guidance for addressing AI integration in teaching while prioritizing student learning and assessment” (para. 1). Many sources included statements indicating that these guidelines would be regularly reviewed and revised, such as this one from the University of Alberta (Centre for Teaching and Learning, 2023):
Be aware: Due to rapid iterations and advances in Generative AI technologies and tools, teaching and learning advances using AI are in a near-constant state of flux. Consequently, the advice and suggestions provided here reflect best practices in the current moment. (para. 5)

Similarly, Ohio State University (Teaching and Learning Resource Center, n.d.) added this statement: “The insights and guidance provided in this teaching topic will evolve as new information emerges around AI tools and their impact on teaching and learning” (para. 2).

Additionally, documents included invitations for instructors to share how they are using AI in their assessment. Those statements could indicate that institutions are moving toward a learning organization model. However, more research is needed to determine whether these universities will continue to update their guidance through continuous learning cycles (Yang et al., 2004).

Conclusion
This paper has presented a summary of seven themes that emerged from the document analysis of guidance documents on the use of GenAI in assessment in higher education institutions. Given that the sources represent the first wave of documents, a second document analysis will need to be conducted to understand how these themes are evolving as instructors’ and students’ understanding of GenAI evolves. In addition, future research should include a deeper analysis of each of the themes.

Author

Dr. Eliana El Khoury is an assistant professor at Athabasca University. She researches alternative methods of assessment. Her research agenda focuses on AI in assessment, equity in assessment, and open educational resources as assessments.

 

References

Adams, K. (2018). Relational pedagogy in higher education [Doctoral dissertation, University of Oklahoma]. OU Dissertations. https://shareok.org/handle/11244/299945

Anohina, A. (2005). Analysis of the terminology used in the field of virtual learning. Journal of Educational Technology & Society, 8(3), 91–102.

Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347–364. https://doi.org/10.1007/BF00138871

Bowen, G. A. (2009). Document analysis as a qualitative research method. Qualitative Research Journal, 9(2), 27–40. https://doi.org/10.3316/QRJ0902027

Donohue, B. C., & Howe-Steiger, L. (2005). Faculty and administrators collaborating for e-learning courseware. Educause Quarterly, 28(1), 20–32. https://er.educause.edu/articles/2005/1/faculty-and-administrators-collaborating-for-elearning-courseware

Kendal, R. L., Coolen, I., van Bergen, Y., & Laland, K. N. (2005). Trade‐offs in the adaptive use of social and asocial learning. Advances in the Study of Behavior, 35, 333–379. https://doi.org/10.1016/S0065-3454(05)35008-X

Kinash, S., McGillivray, L., & Crane, L. (2018). Do university students, alumni, educators and employers link assessment and graduate employability? Higher Education Research & Development, 37(2), 301–315. https://doi.org/10.1080/07294360.2017.1370439

Kools, M., & Stoll, L. (2016). What makes a school a learning organisation? (OECD Education Working Papers). OECD. https://doi.org/10.1787/19939019

Latour, B. (1984). The powers of association. The Sociological Review, 32(1_suppl), 264–280. https://doi.org/10.1111/j.1467-954X.1984.tb00115.x

Noddings, N. (2005). The challenge to care in schools: An alternative approach to education (2nd ed.). Teachers College Press.

Shettleworth, S. J. (2009). Cognition, evolution, and behavior. Oxford University Press.

Venet, A. S. (2021). Equity-centered trauma-informed education. W. W. Norton & Company.

Yang, B., Watkins, K. E., & Marsick, V. J. (2004). The construct of the learning organization: Dimensions, measurement, and validation. Human Resource Development Quarterly, 15(1), 31–55. https://doi.org/10.1002/hrdq.1086

Anselmo, L., Kendon, T., & Moya, B. (2023, February). A first response to assessment and ChatGPT in your courses. Taylor Institute for Teaching and Learning, University of Calgary. https://taylorinstitute.ucalgary.ca/first-response-assessment-and-chatgpt

Assessment, AI and Academic Integrity. (n.d.). Using AI to enhance assessment. University of Melbourne. https://melbourne-cshe.unimelb.edu.au/ai-aai/home/ai-assessment/using-ai-to-enhance-assessment

Australasian Academic Integrity Network. (2023, May). Summary of institutional responses to the use of generative artificial intelligence (Version 1.1). https://cdn.csu.edu.au/__data/assets/pdf_file/0007/4187851/AAIN-Institutional-Responses-to-the-use-of-Generative-Artificial-Intelligence.pdf

Bommenel, E., & Forsyth, R. (2023). The potential impact of AI tools on assessment. Lunds University. https://www.education.lu.se/artikel/potential-impact-ai-tools-assessment

British Columbia Institute of Technology. (n.d.). An introduction to generative AI tools. https://www.bcit.ca/files/ltc/pdf/intro_to_gen_ai_tools.pdf

Burk, A. (2023). What does AI mean in your classroom? The Ciel Blog. https://wordpress.viu.ca/ciel/2023/09/01/what-does-ai-mean-in-your-classroom/

Carleton University. (2023). Recommendations and guidelines. https://carleton.ca/tls/teachingresources/generative-artificial-intelligence/recommendations-and-guidelines/

Carter, M. (2020, October 6). Professor Paul Fyfe brings a humanistic approach to data. North Carolina State University. https://news.dasa.ncsu.edu/professor-paul-fyfe-brings-a-humanistic-approach-to-data/

Center for New Designs in Learning and Scholarship. (n.d.-a). Assignments. Georgetown University. https://cndls.georgetown.edu/ai/assignments/

Center for New Designs in Learning and Scholarship. (n.d.-b). Policies. Georgetown University. https://cndls.georgetown.edu/ai/policies/#:~:text=Keep%20in%20mind%20that%20the,a%20violation%20of%20academic%20integrity

Center for Teaching Excellence. (n.d.). ChatGPT and generative AI. University of South Carolina. https://sc.edu/about/offices_and_divisions/cte/teaching_resources/chatgpt/index.php

Centre for Teaching, Learning and Technology. (2023). Assignment and assessment design using generative AI. AI in Teaching and Learning, University of British Columbia (ubc.ca).

Center for the Enhancement of Learning and Teaching. (2023, Fall). Artificial intelligence resources for Tufts faculty and staff. Tufts University. https://provost.tufts.edu/celt/online-resources/artificial-intelligence/

Centre for Learning and Teaching. (n.d.). Designing assessments with A.I. in mind. Dalhousie University. https://www.dal.ca/dept/clt/e-learning/AI_Resource/designing-assessments-with-a-i–in-mind.html

Centre for Pedagogical Innovation. (n.d.). Designing assessment to mitigate the use of AI writing tools. Brock University. https://brocku.ca/pedagogical-innovation/resources/guidance-on-chatgpt-and-generative-ai/assessment-design/#1690901224970-dd0ef9f0-0b61

Centre for Teaching and Learning, University of Alberta. (n.d.-a). AI-squared—artificial intelligence and academic integrity. https://www.ualberta.ca/centre-for-teaching-and-learning/teaching-toolkit/teaching-in-the-context-of-ai/artificial-intelligence-academic-integrity.html

Centre for Teaching and Learning, University of Alberta. (n.d.-b). Statements of expectations (syllabus). https://www.ualberta.ca/centre-for-teaching-and-learning/teaching-toolkit/teaching-in-the-context-of-ai/statements-of-expectations.html

Centre for Teaching and Learning, University of Alberta. (2023, July 19). Teaching in the context of AI. https://www.ualberta.ca/centre-for-teaching-and-learning/teaching-toolkit/teaching-in-the-context-of-ai/index.html

Centre for the Integration of Research, Teaching and Learning. (2023). Short guide 9: Assessment in the age of AI. https://www.ucc.ie/en/cirtl/resources/shortguides/shortguide9assessmentintheageofai/

CETL, Stephens, C, Taylor, M., & Yadegari, S. (with 8 anonymous contributors). (2023). University policies on generative AI. Padlet. https://padlet.com/cetl6/university-policies-on-generative-ai-m9n7wf05r7rdc6pe

Chapman University. (n.d.). Artificial intelligence in the classroom: A collection of resources/ideas prepared by CETL. https://www.chapman.edu/ai/atificial-intelligence-in-the-classroom.aspx

Clay, G., & Lee, C. W. (2023). Embracing constructive dialogue and oral assessments in the age of AI. Inside Higher Ed. https://www.insidehighered.com/opinion/views/2023/08/03/how-professors-can-use-dialogue-based-course-assessments-opinion

Compton, M. (2023). Sandpit: Testing the capabilities of ChatGPT—Examples of reading, precising, reformatting text and refs, tabulation, rubric, feedback and marking. https://docs.google.com/document/d/1K_UgkLt6–Bqv_FViREBvRzXbD4yR4LzHep15caln4U/edit#heading=h.4uc5kd2g4low

Cornell University. (n.d.). AI in assignment design. https://teaching.cornell.edu/generative-artificial-intelligence/ai-assignment-design

Crane, K., & Thompson, K. (Hosts). (2023, September 12). Advancing assessment with AI in academia (No.2) [Audio podcast episode]. In Kates Discuss: A Teaching and Learning Podcast. Centre for Teaching and Learning, Dalhousie University. https://focus.clt.dal.ca/blog/kates-discuss-episode-2-advancing-assessment-with-ai

Digital Futures Institute. (n.d.). Thinking about assessment in the time of generative artificial intelligence. Columbia University. https://www.tc.columbia.edu/digitalfuturesinstitute/ai-in-education/thinking-about-assessment-in-the-time-of-generative-artificial-intelligence/

Education University of Hong Kong. (2023). EdUHK releases pedagogical approaches on AI tools to promote self-regulated learning. https://www.eduhk.hk/en/press-releases/eduhk-releases-pedagogical-approaches-on-ai-tools-to-promote-self-regulated-learning

Educator Centre for Academic Teaching and Learning at JU. (n.d.). Preventing and detecting unauthorised use of AI. Jönköping University. https://ju.se/portal/educate/en/guides/artificial-intelligence/preventing-and-detecting-unauthorised-use-of-ai.html#showmore-Threealternatives

Fischer, I., Mirbahai, L., Buxton, D., Ako-Adounvo, M.-D., Beer, L., Bortnowschi, M., Fowler, M., Grierson, S., Griffin, L., Gupta, N., Lucas, M., Lukeš, D., Voice, M., Walker, M., Xiang, L., Xu, Y., & Yang, C. (2023). How can artificial intelligence (AI) be harnessed by educators to support teaching, learning and assessments? Actionable insights. University of Warwick. https://warwick.ac.uk/fac/cross_fac/academy/activities/learningcircles/future-of-learning/ai_report_for_educators_16-7-23.pdf

Guidry, K. R. (2023). Considerations for using and addressing advanced automated tools in coursework and assignments. University of Delaware. https://ctal.udel.edu/advanced-automated-tools/

Hillier, M. (2023, June 20). Advising students about using and citing generative artificial intelligence for assessment. https://teche.mq.edu.au/2023/03/advising-students-about-using-and-citing-generative-artificial-intelligence-for-assessment/

Institute for Teaching and Learning Innovation. (2023). Teaching, learning, and assessment with generative AI. University of Queensland. https://itali.uq.edu.au/teaching-guidance/teaching-learning-and-assessment-generative-ai

Instructional Technology and Design Services. (n.d.). ChatGPT and artificial intelligence. Montclair State University. https://www.montclair.edu/itds/digital-pedagogy/pedagogical-strategies-and-practices/ai/

James Cook University. (n.d.). Assessment and artificial intelligence: Information sheet—part 1 of 2. https://www.jcu.edu.au/__data/assets/pdf_file/0007/2037265/assessment-AI.pdf

Jisc. (2023). National centre for AI in tertiary education: Student perceptions of generative AI. University of Manchester. https://repository.jisc.ac.uk/9218/1/NCAI-Students-Perceptions-of-generative-AI-Report.pdf

Khan, Z. R. (2023). Artificial intelligence content generators in education for schools and universities: A good practice guide. European Network for Academic Integrity Working Group Centre for Academic Integrity in the UAE; University of Wollongong in Dubai. https://www.academicintegrity-uae.com/_files/ugd/b7fc81_37358050587a4486985fdd4b8ead13c0.pdf

King’s College London. (n.d.). King’s guidance on generative AI for teaching, assessment and feedback (kcl.ac.uk).

Kozar, O. (2023). Assessments and AI … A three-stage approach. Macquarie University. https://teche.mq.edu.au/2023/05/assessing-students-in-this-new-a-i-enhanced-world-a-3-stage-approach/

Liu, D., & Bridgeman, A. (2023a). Student-staff forums on generative AI at Sydney. University of Sydney. https://educational-innovation.sydney.edu.au/teaching@sydney/student-staff-forums-on-generative-ai-at-sydney/

Liu, D., & Bridgeman, A. (2023b). What to do about assessments if we can’t out-design or out-run AI? University of Sydney.

Liu, D., Ho, E., Weeks, R., & Bridgeman, A. (2023). How AI can be used meaningfully by teachers and students in 2023. University of Sydney. https://educational-innovation.sydney.edu.au/teaching@sydney/how-ai-can-be-used-meaningfully-by-teachers-and-students-in-2023/

LX Team. (2023, August 10). AI case study: Consistent MME projects with Anna Lidfors Lindqvist. University of Technology Sydney. https://lx.uts.edu.au/collections/artificial-intelligence-in-learning-and-teaching/resources/ai-case-study-anna-lidfors-lindqvist/

McGraw Center for Teaching & Learning. (n.d.). Guidance on AI/ChatGPT. Princeton University. https://mcgraw.princeton.edu/guidance-aichatgpt

McMaster University. (n.d.). Generative artificial intelligence in teaching and learning. https://mi.mcmaster.ca/generative-artificial-intelligence-in-teaching-and-learning/

McMaster University. (2023). Provisional guidelines on the use of generative AI in teaching and learning. https://provost.mcmaster.ca/office-of-the-provost-2/generative-artificial-intelligence/task-force-on-generative-ai-in-teaching-and-learning/provisional-guidelines-on-the-use-of-generative-ai-in-teaching-and-learning/

Meenakumari, J. (2021). Harnessing the power of artificial intelligence for summative and formative assessments in higher education. EdTechReview. https://www.edtechreview.in/trends-insights/trends/power-of-ai-for-assessments-in-higher-ed/

Monash University. (n.d.). Generative AI and assessment. https://www.monash.edu/learning-teaching/teachhq/Teaching-practices/artificial-intelligence/generative-ai-and-assessment

Mulder, R., Baik, C., & Ryan, T. (2023). Rethinking assessment in response to AI. University of Melbourne. https://melbourne-cshe.unimelb.edu.au/__data/assets/pdf_file/0004/4712062/Assessment-Guide_Web_Final.pdf

O’Leary, Z. (2013). The essential guide to doing your research project (2nd ed.). SAGE. https://www.sagepub.com/sites/default/files/upm-binaries/58625_O%27Leary__The_Ess_Guide_to_doing_your_research_project.pdf

Office of Undergraduate Education. (n.d.). AI guidance & FAQs. Harvard College. https://oue.fas.harvard.edu/ai-guidance

Russell Group. (n.d.). Russell Group principles on the use of generative AI tools in education. https://russellgroup.ac.uk/media/6137/rg_ai_principles-final.pdf

Schmidli, L. (with Harris, M., Caffrey, A., Caloro, A., Klein, J., Loya, L., Macasaet, D., Schock, E., & Story, P.). (2023). Considerations for using AI in the classroom. University of Wisconsin-Madison. https://idc.ls.wisc.edu/ls-design-for-learning-series/considerations-ai-classroom/

Shapiro, S. (2023). Exploring the impact of generative AI on education: Opportunities, challenges, and ethical considerations. University of Lethbridge. https://www.ulethbridge.ca/teachingcentre/exploring-impact-generative-ai-education-opportunities-challenges-and-ethical

Singapore Management University. (2023). Use of AI tools in assessment and teaching. https://cte.smu.edu.sg/resources/use-of-AI-tools

Smart, B., & Botha, C. (2023, March 14). A practical guide to ethical use of ChatGPT in essay writing. Mail & Guardian. https://mg.co.za/thoughtleader/opinion/2023-03-14-a-practical-guide-to-ethical-use-of-chatgpt-in-essay-writing/

Smith, G. (2023). Transform assessment with AI and GPT: A positive approach for teachers. ThisIsGraeme. https://thisisgraeme.me/2023/03/20/assessment-with-ai-and-gpt/

St Mary’s AI Steering Group. (2023). Guidance for staff on artificial intelligence: How to maximize the value of AI in teaching and assessment. St Mary’s University. https://www.stmarys.ac.uk/policies/docs/staff-guidance-on-ai-september-2023.pdf

Steele, S. (2023, November 21). Generative artificial intelligence (GenAI) assessment statements for students. Faculty Learning Hub, Conestoga College. https://tlconestoga.ca/artificial-intelligence-ai-assessment-statements-for-students/

Teaching and Learning Resource Center. (n.d.). AI: considerations for teaching and learning. Ohio State University. https://teaching.resources.osu.edu/teaching-topics/ai-considerations-teaching-learning

Teaching and Learning Support Service. (n.d.). Artificial intelligence (AI). University of Ottawa. https://saea-tlss.uottawa.ca/en/course-design/artificial-intelligence-ai#resources

Teaching and Learning Transformation Center. (2023, August). Artificial intelligence (AI). University of Maryland. https://tltc.umd.edu/artificial-intelligence-ai

The CIEL Blog. (2023). Generative AI and Assessment. Vancouver Island University. https://wordpress.viu.ca/ciel/2023/09/13/generative-ai-and-assessment/

Trust, T. (n.d.). ChatGPT and education. Northern Illinois University. https://www.niu.edu/citl/resources/guides/chatgpt-and-education.shtml#additional-resources

UC Berkeley. (n.d.). Understanding AI writing tools and their uses for teaching and learning at UC Berkeley. https://teaching.berkeley.edu/understanding-ai-writing-tools-and-their-uses-teaching-and-learning-uc-berkeley

UCD Teaching & Learning. (2023). Quick guide on generative artificial intelligence in learning and assessment (faculty guidance). University College Dublin. https://www.ucd.ie/teaching/t4media/Generative_Artificial_Intelligence_Quick_Guide.pdf

University College London. (n.d.). Designing assessments for an AI-enabled world. https://www.ucl.ac.uk/teaching-learning/generative-ai-hub/designing-assessments-ai-enabled-world

University of Amsterdam. (n.d.). How to make your assessment more AI-proof. https://tlc.uva.nl/en/article/how-to-make-your-assessment-more-ai-proof/

University of Auckland. (2023). The use of generative AI tools in coursework. https://teachwell.auckland.ac.nz/resources/assessment/ai-tools-in-coursework/

University of Birmingham. (n.d.). Generative artificial intelligence and its role within teaching, learning and assessment. https://www.birmingham.ac.uk/university/hefi/gai/index.aspx

Flinders University. (n.d.). Good practice guide: Designing assessment for artificial intelligence and academic integrity. Flinders University staff resources.

UNESCO. (2023). Guidance for generative AI in education and research. Unesco.org. https://www.unesco.org/en/articles/guidance-generative-ai-education-and-research

University of Glasgow. (n.d.). Learning and teaching: How can I adapt assessment to deal with generative AI? https://www.gla.ac.uk/myglasgow/learningandteaching/aiguidance/howcaniadaptassessmenttodealwithgenerativeai/

University of Guelph. (2023). Provisional recommendations for the use of generative AI.

University of Toronto. (n.d.-a). ChatGPT and generative AI in the classroom. https://www.viceprovostundergrad.utoronto.ca/strategic-priorities/digital-learning/special-initiative-artificial-intelligence/

University of Toronto. (n.d.-b). Generative artificial intelligence in the classroom: What is generative artificial intelligence and what application does it have for classroom instruction & learning. https://teaching.utoronto.ca/resources/generative-artificial-intelligence-in-the-classroom/

University of Waterloo. (n.d.). Artificial intelligence and ChatGPT. https://uwaterloo.ca/academic-integrity/artificial-intelligence-and-chatgpt

Volante, L., DeLuca, C., & Klinger, D. A. (2023). Forward-thinking assessment in the era of artificial intelligence: Strategies to facilitate deep learning. Education Canada Network. https://www.edcan.ca/articles/forward-thinking-assessment-in-the-era-of-artificial-intelligence/
