Generative AI – Instructional Opportunities and Challenges

Learn About Generative AI

Generative AI is at our fingertips and rapidly maturing. Keeping pace with the technical landscape, emerging capabilities, and new integrations will be a challenge but is essential, especially at a Research 1 university.

ChatGPT was the first and most visible player in this landscape when it launched in late 2022. Since then, AI capabilities have increasingly become an integrated feature of commonly used platforms, including Microsoft products, Google Workspace (Docs, Slides, etc.), and web browsers. As these AI capabilities proliferate and grow more sophisticated, the challenges and opportunities become more abundant and nuanced as well.

It is inevitable that students will make use of this technology, either intentionally, by creating AI accounts and purchasing plug-ins, or unintentionally, as chat and composition capabilities are integrated into Google's and Microsoft's suites of programs. In addition, in the next few years students will begin to arrive on campus well versed in AI use and capabilities.

Guiding Principles for Instructors

Advance accessibility and equity

Generative AI offers both benefits and risks related to accessibility and equity. Generative AI tools may present opportunities for students who are multilingual, who struggle with writer’s block or writing anxiety, or who are entering a new discipline. At the same time, AI can generate racist, sexist, and other kinds of biased responses and potentially promote such biases. It may also present accessibility barriers for students with certain disabilities. Financial equity is another concern – some students will be able to afford to buy premium subscriptions while others will not. UW–Madison is considering institutional options to address this last issue.

Protect data and intellectual property

Instructors and administrators are responsible for protecting student privacy and intellectual work, as well as securing FERPA-protected data. They are also responsible for protecting nonpublic research and instructional materials.

With this in mind:

  • Students should not be required to submit drafts of assignments they create to an AI tool unless doing so is an intentional part of the assignment.
  • Students should not be asked to submit personal information to AI tools.
  • Instructors should not submit student work into AI tools for the purposes of automating feedback and comments.
  • Instructors are encouraged to communicate to students that assignment prompts and other instructor-created materials are protected intellectual property, and that submitting them to an AI tool without permission may be a violation of intellectual property rights. (Find further guidance on communicating with students, including sample syllabus statements, below.)

Consider educational uses

Generative AI can support student learning and open new opportunities for teaching. Students can converse with AI chatbots in ways that generate new knowledge and insights and that mimic conversations with peers or even instructors.

Potential uses for students

  • At various stages of the learning process – to brainstorm ideas, to summarize or distill complex thinking, to learn specific genre conventions, to check grammar, to test out ideas or formulas.
  • To clarify complex readings, issues or points of confusion.
  • To support student research – finding, generating and analyzing data quickly; encouraging multimodal approaches to communication (e.g., generating websites or posters).

Potential uses for instructors

  • Help with designing course materials, including syllabi, questions, quizzes and prompts for learning activities.
  • Explaining complex concepts in multiple ways for diverse learners.
  • Generating announcements or newsletters for large classes or cohorts.
  • While instructor feedback on an entire student work should not be automated, AI can assist with generating ideas for praise, considering additional approaches, and offering stylistic or grammatical suggestions to a specific paragraph.

Consider adapting learning experiences and assessments

When considering how to adapt learning experiences and assessments, the following strategies may have the added advantage of helping to foster a sense of belonging for students, which promotes student success.

  • Promoting a growth mindset with learning tasks and assessments that are well scaffolded so students feel capable of success without relying exclusively on AI.
  • Developing authentic assessments and activities that can be completed without AI, such as in-class discussion; personal narratives; assignments focused on local and/or very recent events; applications of learning to real-world scenarios.
  • Considering metacognitive learning strategies, such as writers’ memos, reflections on learning and sharing thinking processes.
  • Building in collaborative learning, such as peer review and scaffolded group projects.
  • Considering flipped classroom opportunities using real-time educational tools such as Top Hat, time in class to brainstorm or work on assignments, and active learning strategies.
  • Using process-over-product approaches, such as sharing Google drafting histories, scaffolded assignments and assessments, check-ins on progress, and conferences with students to gauge learning. The Writing Across the Curriculum Sourcebook includes suggestions for scaffolding writing assignments.

Discuss course expectations and academic integrity with students

The proliferation of generative AI tools necessitates intentional and thoughtful discussions about AI, broadly (what it is, can do, etc.) and, more specifically, how it relates to student learning and academic integrity.

Instructors are encouraged to communicate with students about generative AI (including how it relates to academic integrity), both at the start of a course (in course syllabi, in Canvas and in conversation) and throughout the semester.

Here are some things to consider when thinking through communications with students:

  • Clearly communicate expectations regarding the use of generative AI tools early and often. Some instructors might have a policy for the entire course, or have specific instructions for individual assignments and assessments. Recognize that students may receive different guidelines from other instructors.
  • Provide clear instructions, examples and explanations to help students understand the expectations. Explain when generative AI can be used (e.g., initial queries, topic development, help structuring a written assignment or project) and when AI might hinder their learning or development of essential skills and knowledge (e.g., relying on AI to complete a math assignment that builds skills necessary for more advanced work).
    • If instructors plan to discourage or prohibit the use of AI, it is particularly important that they explain why to students in the context of learning objectives for the course.
    • If allowed, consider when/how/in what way AI-generated content should be cited and whether to require students to turn in, or at least retain, chat transcripts.
  • Establish a clear and transparent dialogue with students early (and often) to help avoid an instructor-student dynamic built on mistrust. Using a “misconduct” lens can create a climate of policing/suspicion.
  • Incorporate the topic of generative AI into a broader discussion on academic integrity and professional ethics within an instructor’s discipline.
  • Participate in conversations about academic integrity and generative AI with colleagues in your department. Departments are being encouraged to facilitate conversations within their specific discipline to clarify shared expectations and identify strategies for promoting professional ethics within the discipline. Academic associate deans, the Center for Teaching, Learning & Mentoring and Writing Across the Curriculum are available to support these conversations.

Address potential misconduct using established policies and processes

Instructors are encouraged to take a proactive approach to prevent misconduct, using the teaching and communication strategies outlined above.

Avoid the detection arms race: generative AI detection tools are imperfect at best, carry the risk of false positives, have been shown to be biased against non-native English speakers, and will not prevent students from using these tools.

If an instructor suspects a student has not followed their established course guidelines on generative AI, they should address it as they would any case of suspected academic misconduct, beginning by meeting with the student to discuss concerns.