My university gives “guidance” on the use of generative AI in student work. It’s not really guidance, because it simply doesn’t care — you can allow it or prohibit it. They even give us boilerplate that we can use in our syllabuses! If we want to prohibit it, we can say
In this class, the ability to [skill or competency] is essential for [field of study/professional application]. Because this course emphasizes [skill for development or specific learning outcome], using Generative AI tools [including those available to you through the University of Minnesota,] is not permitted.
If we allow it, we can say
In this course, students will [statement of learning outcomes, competencies, or disciplinary goals]. Given that Generative AI may aid in [developing or exploring course, discipline, professional, or institutional goals/competency], students may use these tools in the following ways:
The example allowing AI goes on much longer than the prohibitive example.
I will be prohibiting it in all my classes. So far, I’ve been pretty gentle in my corrections — when someone turns in a paper with substantial, obvious AI-generated content, I tend to just flag it, explain that it’s a poorly written exploration of the thesis, and ask them to rewrite it. Do I need to get meaner? Maybe. All the evidence says students aren’t learning when they have the crutch of AI. As Rebecca Watson explains, ChatGPT is bad for your brain.
I was doing a lot of online exams, thanks to COVID, but since the threat of disease has abated (it’s not gone yet!), I’ve gone back to doing all exams in class, where students can’t use online sources. My classes tend to be rather quantitative, with questions that demand short or numerical answers, so generative AI is mostly not a concern. If students started answering with AI hallucinations, it would be! I’m thinking of adding another component, though: an extra hour-long in-class session where students have to address an essay question at length, without AI of course. They’ll hate it and dread it, but I think it would be good for them. Even STEM students need to know how to integrate information and synthesize it into a coherent summary.
Another point I like in Rebecca’s video is that she talks about how she had to learn to love learning in her undergrad career. That’s also essential! Taking the time to challenge yourself and explore topics outside your narrow major matters.

Another gripe with my university is that they are promoting this Degree in Three program, in which you undertake an accelerated program to finish your bachelor’s degree in three years — an approach that emphasizes racing through the educational experience to get that precious diploma. I hate it. For one thing, it’s always been possible to finish the undergrad program in three years; we don’t put obstacles in front of students to get an extra year of tuition out of them, and we’ve always had ambitious students who overload themselves with 20 credits (instead of the typical 15) every semester. It makes for a killer schedule and can suck much of the joy out of learning. It’s also unrealistic for the majority of our students — every year we get students enrolled in biology and chemistry programs who lack basic algebra skills, because the grade schools are doing a poor job of preparing them. We maintain solid remedial programs at the same time we tell them they can zoom right through the curriculum? No, those are contradictory.
I think I’m going to be the ol’ stick-in-the-mud who tells students I’ll fail them for using ChatGPT, and who also tells them they should plan on finishing a four-year program in four years.