Assessment in the Context of AI
Learning Assessment is Complex
Assessing learning is never as simple as measuring what students “know.” There is always a gap between knowledge and the performance of knowledge — between understanding a concept internally and being able to demonstrate that understanding under the specific conditions of an assessment. A student may grasp course material deeply but falter in a timed exam, public presentation, or unfamiliar format. For that reason, effective assessment design doesn’t just test learning; it also teaches students how to succeed in the performance of learning. That means helping them understand expectations, practice relevant skills, and receive feedback before they are graded. In short, authentic assessment isn’t only about fairness — it’s about teaching for transfer and preparing students to show what they know with confidence and competence.
Rethinking Learning in the Age of AI
Generative AI is reshaping how we think about student work, learning, and academic integrity. Faculty across disciplines are re-examining traditional assessments — not simply to “AI-proof” their courses, but to better understand what genuine learning looks like in this new context. Many are asking: How can we help students make their thinking visible? One answer is that we need to teach students not only the content of the course but also how to perform their learning of that content.
Teaching Students to Perform Their Learning
- Students need structured opportunities to practice the kind of thinking, writing, or performance the assessment requires. Treat the assessment not as the end of learning, but as the final stage in a sequence of formative steps that build the necessary skills. Example: Before a portfolio is due, have students annotate an earlier piece to identify growth; before an in-class essay, hold a short “timed writing clinic.”
- Students often don’t understand what kind of thinking an assessment is asking for. Instructors should model and name the mental moves: identifying evidence, framing a claim, connecting theory to text, synthesizing perspectives. Example: Think-alouds, sample annotations, or showing “A-level” reasoning in real time help students visualize how experts think.
- Every assessment has an implicit genre — a research proposal, lab report, policy memo, performance critique, annotated text — and each genre has its own conventions and signals of quality. Students can’t intuit those; they need to be taught. Example: Discuss what counts as good evidence, clear organization, or strong delivery in that specific form. Use exemplars, checklists, and peer review.
- Students learn best when assessments feel like a natural extension of what they’ve been practicing all semester. Scaffold the assessment with smaller assignments that preview its demands and give feedback along the way. Example: A reflective journal builds toward a portfolio; a reading annotation practice prepares for a comparative essay.
- Authentic assessments require iteration: students will fail, revise, and grow. Faculty can frame that process as part of the learning rather than as weakness. Example: Include opportunities for revision or resubmission, and use formative feedback cycles instead of one-shot grading.
- When students understand why an assessment looks different — for example, because it reveals thinking that AI can’t replicate — they are more motivated to engage sincerely. Transparency builds buy-in. Example: Explain, “This annotation exercise helps you practice disciplinary reading — something AI can’t do for you.”
- Authentic assessment should evaluate both the product (e.g., the essay, presentation, or portfolio) and the underlying skills that got students there: critical reading, reasoning, reflection, and synthesis. Example: Include a reflection or process statement that explains how the student approached the work, what choices they made, and why.
- Just as athletes and musicians rehearse before performing, students need space to practice under conditions similar to the real assessment. Practice makes assessment familiar, not frightening. Example: Run a mock presentation day, in-class annotation, or short timed analysis, then debrief what went well and what students would adjust.
- The most powerful assessments generate learning through the act of doing them. When the task itself builds skills (rather than merely testing recall), students internalize both the content and the process. Example: A simulation or case study that requires applying course concepts is the learning, not just a test of it.
- Finally, close the loop: make sure what you teach in class directly supports what you’re asking students to demonstrate. Misalignment (e.g., lectures on theory but exams on application) is one of the biggest barriers to student success.
A Selection of AI-Responsive Learning Assessments
The following examples were chosen either because many instructors at UM have already moved to these modalities or because they could spark ideas for new curricular work. If you would like to strategize about a specific class scenario, please feel free to reach out to Amy Ratto Parks for a conversation.
In-Class Timed Writing or Exams
A traditional, closed-book, handwritten or in-person exam that asks students to generate ideas, analyze, or solve problems under time constraints.
Why an Instructor Might Use This:
Faculty seeking to understand students’ unassisted reasoning may return to in-class writing as a way to see human thinking in real time, free from digital editing or AI assistance.
How to Prepare Students:
Give practice opportunities with timed writing. Offer clarity on expectations for structure and depth. Normalize the experience by discussing test anxiety and strategies for pacing and focus.
Opportunities:
- Reveals authentic recall and synthesis.
- Highlights fluency and conceptual mastery.
- Limits dependence on generative tools.
Challenges:
- Accessibility and anxiety concerns.
- Depth limited by time.
- Handwriting legibility and grading consistency.
Learning Portfolios
A curated collection of work over time, demonstrating learning progress and reflection.
Why an Instructor Might Use This:
Portfolios counter AI-generated one-offs by showcasing the evolution of student thinking. They allow instructors to see process, revision, and genuine engagement.
How to Prepare Students:
Build in checkpoints for feedback. Model reflective writing and artifact selection. Help students connect early drafts to final products.
Opportunities:
- Makes learning over time visible.
- Encourages reflection and metacognition.
- Integrates multimodal evidence of growth.
Challenges:
- Time-intensive to assemble and assess.
- Requires clear rubrics.
- Students may struggle to identify representative work.
Presentations and Performances
Students communicate or demonstrate understanding publicly through oral, visual, or creative performance.
Why an Instructor Might Use This:
AI can produce text, but not presence, voice, or situational awareness. Live presentation allows faculty to witness students’ reasoning and confidence firsthand.
How to Prepare Students:
Offer low-stakes practice sessions. Share clear performance criteria and rubrics. Normalize nervousness and discuss delivery techniques.
Opportunities:
- Builds professional communication skills.
- Creates observable evidence of mastery.
- Engages authentic audiences.
Challenges:
- Performance anxiety and subjectivity.
- Logistical complexities.
Annotation Assignments
Students produce written annotations—either in class or digitally—that demonstrate their reading comprehension, conceptual understanding, and analytical reasoning. Annotations might include defining key terms, identifying how passages illustrate course concepts, making short claims about the text’s function or meaning, or posing theory-based questions.
Why an Instructor Might Use This:
Annotation tasks allow instructors to see students’ thinking on the page. They reveal how students move from comprehension to analysis — how they connect evidence to course frameworks and apply disciplinary vocabulary. In an AI era, this task is especially valuable because it captures spontaneous, contextual interpretation rather than polished, machine-assisted prose.
How to Prepare Students:
Model annotation practices explicitly. Provide examples showing how to identify course concepts in context, frame a claim, and pose theory-based questions. Use low-stakes practice rounds early in the term, and if digital tools (like shared docs) are used, demonstrate both the technical and interpretive moves you expect.
Opportunities:
- Makes reading and interpretive processes visible.
- Reinforces key disciplinary vocabulary and analytical habits.
- Offers insight into comprehension before summative assignments.
- Provides formative feedback opportunities for deeper learning.
Challenges:
- Students may default to surface-level notes without strong prompts.
- Requires careful modeling and feedback to encourage depth over quantity.
Multimodal Digital Projects
Students communicate learning through podcasts, videos, infographics, or websites.
Why an Instructor Might Use This:
Digital projects shift attention from “writing for the teacher” to “communicating for an audience,” allowing students to apply content through design and storytelling that AI can’t easily fake.
How to Prepare Students:
Provide technical tutorials or sample projects. Clarify the grading balance between content and design. Encourage collaborative planning.
Opportunities:
- Builds digital literacy and creativity.
- Encourages diverse forms of expression.
- Fosters engagement and real-world communication.
Challenges:
- Requires clear rubrics and equitable tech access.
- Can privilege form over substance.
Service Learning and Community-Based Projects
Students apply course concepts in real-world or community contexts.
Why an Instructor Might Use This:
These projects connect academic learning to lived experience — something that cannot be automated. They let students see the impact of their knowledge.
How to Prepare Students:
Set clear goals for both service and learning outcomes. Debrief regularly to connect experiences to course theory.
Opportunities:
- Promotes civic engagement and reflection.
- Strengthens transfer of learning.
- Aligns with institutional mission.
Challenges:
- Coordination and consistency across sites.
- Assessment of learning vs. service impact.
Simulations and Role-Play
Students adopt professional or historical roles to navigate realistic scenarios.
Why an Instructor Might Use This:
Simulations develop judgment, empathy, and adaptability — qualities AI lacks but professions demand.
How to Prepare Students:
Explain roles and expectations. Provide context, vocabulary, and structured debriefing to help students process the experience.
Opportunities:
- Builds critical thinking and adaptability.
- Encourages perspective-taking.
- Makes learning memorable.
Challenges:
- Requires scaffolding and support.
- Some students resist performative elements.
Reflective Journals
Students reflect throughout the term on what and how they are learning.
Why an Instructor Might Use This:
Reflection allows instructors to see how students think, not just what they produce — offering transparency into cognitive and emotional engagement.
How to Prepare Students:
Model reflection prompts. Offer examples of depth vs. summary. Make reflection a recurring, low-stakes routine.
Opportunities:
- Makes learning processes visible.
- Fosters metacognitive awareness.
- Strengthens self-directed learning.
Challenges:
- AI can generate reflective writing; drafts should draw on evidence from the student’s own work.
- Requires carefully written prompts to guide students.
- Requires an intentional rubric to guide instructor response.