
Applied Podcast – Ep. 11

AI and Academic Integrity

Featuring Dr. Joseph Brown

What does the future of higher education look like in a world where every student has access to powerful AI tools? How can educators be more prepared to handle this rapidly evolving landscape?


Listen to Applied on:

Spotify | Amazon Music | Apple Podcasts

Dr. Joseph Brown (00:00:00):
These models are advancing really quickly and I want to have an opportunity to talk to our faculty directly about here’s what we’re seeing, here’s what you need to know, and make sure they’re equipped to manage that in their classrooms. Because as of right now, CSU’s primary stance on generative AI is we’re allowing our faculty to chart what that looks like in their courses, which I know might be confusing for students because it might look really different if they go from one side of campus to the other in a given day. But what we found is that the need for disciplines to identify how this might play out in their discipline was really important and to equip our faculty as opposed to prohibiting them from that kind of innovation that we’re going to see over the next few years. (Intro Music)

Beren Goguen (00:00:56):
Welcome to Applied. I’m Beren Goguen. Today I’m joined by Dr. Joseph Brown, a higher education leader and teaching and learning expert with more than 15 years of experience leading academic programs at universities across the country. He currently serves as the director of Academic Integrity at Colorado State University, where he provides strategic leadership, creates educational resources and mentors faculty. Dr. Brown, thanks for being here.

Dr. Joseph Brown (00:01:21):
Thanks for having me, Beren. And I appreciate you asking me to come and talk with the larger CSU community about these very important issues.

Beren Goguen (00:01:29):
Can you tell us a little about your background and how you came to be the Director of Academic Integrity?

Dr. Joseph Brown (00:01:34):
So I started as an English professor at a small state college in Georgia. I taught there for six years and earned tenure and all of those things, right? Really intensive teaching load because that was the mission of the institution. And while I was there, I saw the value of academic integrity. You’re teaching a lot of writing. You’re teaching students who are engaging with higher ed. for the very first time. Usually their families are engaging with higher ed. for the very first time. You have many occasions to talk about academic integrity, why it’s important that a student does their own work. Flash forward a couple of years, towards the end of that stint, I did a fellowship program that the University System of Georgia puts on that’s really kind of for professors in the system to discuss issues around teaching and learning. It also had a very strong technology component. It got me thinking about doing something in teaching and learning. And about that time I saw the job come open here at Colorado State, applied and got the job, moved out here and began work. And that was a really good match, I think, because it connected my interests in teaching and learning to the ability to do something on a broad scale at a large land grant institution like CSU. There’s a lot of opportunity to see the ways that you can encourage learning, right? Lasting learning. That’s really important.

Beren Goguen (00:02:52):
And so you’re working in the institute of… what is the institute called again?

Dr. Joseph Brown (00:02:56):
So it’s the Institute for Learning and Teaching, which gives us the acronym TILT. But colloquially around higher ed, you’ll actually hear these centers called Centers for Teaching and Learning.

Beren Goguen (00:03:05):
And could you tell us a little bit about the center and what they do?

Dr. Joseph Brown (00:03:09):
So TILT is actually a lot of things. And part of that has to do with just the need across campus. So, for example, there’s my unit, which is learning programs. Academic integrity is part of that, but it also includes tutoring. It includes workshops that can be for student audiences. In another part of the building, you’ll find instructional designers who work with faculty to essentially build courses from the ground up and then also to accentuate learning and make sure the faculty have opportunities to engage with … the newest ideas in teaching and learning. There’s also an assessment component. We think teaching without assessment… I mean does that really connect to learning? So that’s an open question in higher ed. and we want to make sure that we’re engaging those kinds of questions and making sure that people are actually learning when we go through all of this effort. So when we talk about TILT, it has become a center for those kinds of teaching and learning questions on our campus and we’re really proud of that. But I think it’s also important to say that it does a lot of different things.

Beren Goguen (00:04:09):
Assessment is certainly an ongoing topic of discussion, I think especially with online learning… it’s just different. And so I imagine that group helps a lot with that… how to work through that for instructors who are maybe new to the online teaching space?

Dr. Joseph Brown (00:04:25):
Yeah, that’s my understanding as well. So I don’t directly engage with a lot of the assessment activities that occur, but I think that it is infused into a lot of the things that we do. Any kind of program that you run, you really do want to ask the question: Is it working? Are people learning from this? Is it having the impact that your investment is hoping for? There’s a great line in a book that I read for the teaching fellowship that I did in Georgia, and it said teaching without assessment is talking. And I think that’s a great point. We can kind of think of these traditional models of the professor in front of the classroom, the sage on stage… I think one of the questions that was never really asked was: okay, but are the students learning the things that you were hoping that they would learn? And I think that that component is specifically helpful in answering that question.

Beren Goguen (00:05:09):
And I imagine academic integrity does come into play there with assessment, especially when you’re assessing people in the form of a paper or asking them to do their own research.

Dr. Joseph Brown (00:05:20):
Exactly. So academic integrity is really where that crossroads meets, right? It’s like you have spent time developing learning activities for your students. Well, when it comes time to assess that learning, we need to make sure that their work is their own. Right? Now, the phrase that people in this field are using more often is “authentic engagement with learning.” So that might really start at the level of consuming the information in an authentic way. I mean actually sitting down and doing the work, in other words. And then that translates then ultimately into the assessment activity that people are doing to prove that they’ve learned the things that you hoped they would learn through the course of those activities.

Beren Goguen (00:06:01):
These are all things I didn’t think about when I was an undergrad, even grad school… I mean I thought about it some, but a lot of it relies on the guidance that you get from your instructors. And some students probably receive better guidance than others in that regard.

Dr. Joseph Brown (00:06:18):
Yeah, that’s one of the things we know about student preparation when they come to the university is that they’re coming from all kinds of backgrounds. The levels of, I guess, their preparation in high school vary depending on what school system they went to or what part of the country they’re from. And so some students will have heard about this for years, and it doesn’t necessarily mean that they’re going to act with integrity once they hit our classrooms. There’s a different dynamic that occurs, but at least they know about it. They know kind of what the rules are, more or less. And then there are students who, this is new for them. Again, we’re a land grant institution and we take that mission really seriously. You hear people talk about it on campus all the time, and we admit students who are maybe first generation college, who are going to try college for a semester and see if this is what’s right for them.

(00:07:04):
And one of the phrases we’re hearing a lot about is the hidden curriculum of college. Academic integrity is kind of part of that hidden curriculum. If your parents didn’t go to the university, you might not have ever heard of academic integrity. So it’s also our job to tell them about it and to make sure that they understand what we are talking about. The other wrinkle to this is that we have a lot of students from international preparations, and academic integrity can look really different depending on what country you come from. The first time I ever interacted with this idea, I was a grad student at a conference, actually, in Tucson. And I was having lunch with a grad student from another… I think he was from New Zealand. And he said essentially: “I really don’t understand Americans’ focus on academic integrity and plagiarism.” He’s like, “If it’s a good idea, wouldn’t you want to share it?” And I thought that was interesting. Yes, we do want to share it, but there are ways that we think are an appropriate way of sharing, whereas in other preparations that isn’t necessarily the norm. And so we want to meet our responsibility by making sure we’re all starting on the same page.

Beren Goguen (00:08:04):
I never thought about that from an international perspective or a cultural perspective, but that’s important to understand. When you grow up in the United States, in the US education system, you might have more familiarity than someone coming from another country.

Dr. Joseph Brown (00:08:21):
I think that message has kind of gotten out to our faculty. I do want to take a step back, though, and say there are parts of this that are quite universal. I was doing a hearing with a student who was, I believe, from China, and we were talking about some of the evidence that had been brought. I can’t remember what the issue was now, but he put his head in his hands and he goes: “My dad is going to kill me.” And I thought, oh, that is so universal, that sense of I know I did something and I know I’m going to be held accountable for it. So I think that there is a tendency sometimes of faculty to say… to kind of overread that and go, well, they don’t know anything about this. Well, that may not exactly be true. And obviously we do a lot at the beginning of their term here, when they begin at CSU, to say: These are the rules. And I know that they may be new, but it’s time to learn them. This is what the expectation is in this classroom, acknowledging that that may be totally different from the classroom you came from, and being respectful of that. But now there’s a new set of rules, in the same way that we would expect when you walk from building to building or you go to a dinner where there’s a different set of norms, right? You have to pick up on those. And so we want to make sure we make that easy and transparent. We don’t want them to have to struggle to learn those.

Beren Goguen (00:09:36):
So do faculty come to you sometimes when they’re having an issue with a student and to get kind of guidance through that?

Dr. Joseph Brown (00:09:43):
Yeah, so that’s, I would say, the primary way faculty interact with me is that a thing has happened in their course. It might involve a student or multiple students, and I think it says something about CSU that that’s not incredibly common. I still have faculty who say: “This is the first time I’ve ever had to deal with this.” And so they’ll call and they’ll say: “A thing happened. This is what I’ve noticed. It seems weird to me.” Or “The way this student answered this question was odd,” or “I found a copy of a thing that the student clearly used. What do I do next?” And so from that point, it’s just kind of coaching of here are the next steps so that we follow the rules that we say that we’re going to follow, the rules we’ve written down. That’s really important for due process reasons at the university.

Beren Goguen (00:10:27):
Being consistent.

Dr. Joseph Brown (00:10:28):
Exactly.

Beren Goguen (00:10:29):
So in your view, what’s one of the biggest challenges for academic integrity right now?

Dr. Joseph Brown (00:10:34):
I’m tempted to say generative AI, but let me pause for just a moment because I know we’re going to talk about that in a minute. Really, I think the forces underneath kind of what’s happening in higher ed. and in the public sphere are presenting some real challenges beyond whatever the mini challenge of the day is. And what I mean is there are tremendous winds that are blowing in the public sphere, and really a lot of this has to do with our economy. Students are coming into the classroom with a sense of, I got to get done with this really quickly. I have plans. I have things I need to do. I have a job I want. And I think that it’s just the reality of living in 2024, where if I can do it as quickly as possible… they’d want to term it “efficiently.”

(00:11:17):
And I think there’s something about that that I think is of its time, and so that creates a lot of pressure on these kind of learning moments. But some of these things just cannot be learned quickly. And it does… Some of the most lasting learning requires a bit of struggle. These are concepts that build upon each other that have been difficult for a very long time. They’re not reducible. And, in other words, there really are no shortcuts. There’s no way for me to make this really complicated concept something that you can digest in the next 20 minutes. There are things that you’re just going to have to do and then that process is going to build on itself and build on itself. And at the end of that process, you have learned that thing. And I think that what I’ve seen throughout my career is that that window, that time that they expect that they should be able to learn a thing is shrinking and they’re frustrated, especially when the economy’s doing well because I think students are like anybody else, they look around and they go, how am I doing compared to my friend?

(00:12:21):
And my friend might be already working in an industry without a college degree, and when an economy’s doing well, they’re thinking about: I’m missing out. I’m missing out on money I could be earning. The bet has always been a good bet for higher education, which is that, while that may be true right now, what we know is: College graduates do better over the long term for the rest of their lives. But it does mean a level of commitment over the next four years that, my concern is, fewer and fewer students feel is worth it. We want to resist those pressures. We want to speak, I think, eloquently about the power of an education and an undergraduate degree, that it’s not just entrance into your profession. It’s not a gate that you have to pass through, right? It’s really a foundation for a life that you’re wanting to achieve.

Beren Goguen (00:13:13):
Some of those courses that they might not understand the value of now could end up serving them a lot in the future down the road.

Dr. Joseph Brown (00:13:21):
Absolutely. The value of the core curriculum is not really a value that, at 18 or 19 or 20, you’re going to see. But when you are older and you realize that those things helped me build onto these other concepts that made me the person I am, that made me valuable to my employer 30 years later, these are the kind of arguments that I think that we have to get better at making, right? The core curriculum exists, for example, because employers told us we need students who know these skills. So in a strange way, it’s odd when you hear from students who say, well, I don’t need that class because I’m not going to do that thing. One of the types of students I used to teach all the time at my last institution were nursing students. And I remember in 101 and 102 English, these are composition classes, and they would be really frustrated with me. And I remember they said: “Why do I need to do this?” And I’m like: “You don’t think you need to know how to be an effective writer as a potential nurse?” That is an incredibly important skill. Literally, your ability to write clearly could be the difference between someone getting the treatment they need and not. So yeah, to me, I thought it was a really easy argument to make, but at that life stage, it’s a little harder to see that bigger picture, and this is the context we need to bring to students.

Beren Goguen (00:14:39):
It’s similar with other, more technical fields. “I’m not trying to be a writer.” It’s like, but you will be writing. You will be communicating in the written form, maybe not really long form content, but being able to speak concisely, to understand who you’re speaking to and how to communicate with them effectively, and to get your message across in a way that they can understand is a universal skill, in my opinion.

Dr. Joseph Brown (00:15:04):
Yeah, absolutely. I agree with that. I mean, obviously, I’m an English person, too. I think that your ability to read critically is a superpower. I mean, our society is still built on that skill, and it’s been surprising to me to watch some of the things that have been said about the rise of generative AI and how that would change education. And yet all of those things are being said in these think pieces that are coming out in the Atlantic and the New York Times. And I feel like in some ways it kind of unwrites that argument. It’s like, well, but we understand it because we have that preparation because we can think critically about this. And I think that’s an incredibly important skill for people. I talk to my son and my daughter about this all the time. Reading is a superpower. It’s a thing we can do that other animals on the planet cannot. We should definitely use that. And I think that when we give that up, when we kind of cheapen that… I think that’s one of the things I worry about with academic integrity and the way it interacts with our assessments is like… you have to read the whole thing. You need to complete the whole assignment. That’s part of the process that translates to that long-lasting learning.

Beren Goguen (00:16:12):
Do you think the internet and social media and just sort of the way people live now in the modern era as opposed to maybe 10, 20 years ago has just shortened people’s patience and attention span and maybe that’s a factor?

Dr. Joseph Brown (00:16:26):
Well, I want to be careful about that because I’m getting to the age now where I think some of those things are just becoming more and more invisible to me. And so I don’t want to be one of those folks who’s like kids today. I think that is a concern that people have had really since the oral tradition. There’s a great Frontline documentary about this very question, about the rise of digital literacy and the impact it’s having. I used to show it to my students, and there’s this great argument, I’m going to paraphrase it here, where the speaker says, essentially: The book was a technology. The book was a very disruptive technology. At that point, the oral tradition ruled the world. You would memorize thousands of lines of text, and when the book came along, people lamented the loss of memory,

(00:17:15):
But at the time they did not really understand what they were getting in return. So they were in this kind of messy middle phase of the evolution of that technology. They didn’t understand what they were getting was access to thousands of ideas that they would never have had before. So I think something like that is occurring on a broader scale in higher ed. In terms of social media, I think that does have something to do with our attention span. And I think that many writers have weighed in on this question. Particularly people who teach undergraduates are pointing out that they can’t really assign longer texts anymore and expect them to be finished because our attention span is just getting shorter and shorter. I do believe that’s one of those other pressures we’re seeing being applied to higher ed. And it’s something that we need to have an answer for.

(00:18:02):
And that answer could look a lot of different ways. It could be insisting on those longer texts and explaining… doing a better job of explaining the value of having students engage in that process. But it also could look really different because there were people who would say, well, what was the goal of the learning behind that thing itself? And if we’re achieving that learning without an 800 page novel, then isn’t that kind of the same thing? Won’t the student eventually get to the novel if it’s valuable to them? And so I feel like those are questions that we’re grappling with right now.

Beren Goguen (00:18:36):
So it’s kind of a middle ground that we need to find.

Dr. Joseph Brown (00:18:38):
I think so. And I think asking the question is really important. We have some really smart people on this campus who are very much engaged in these sorts of questions. And I think that’s the essence of higher ed. It’s not static… There’s not an answer. We’re actively thinking through these issues and understanding that those answers evolve over time. I mean, Nicholas Carr has that great article I used to have my students read in 102: “Is Google Making Us Stupid?” And students read it now and they’re like, Google? What are you talking about? I was like, well, Google stands in for the internet, our digital lives. And I guess one of the reasons I had them read that was that I wanted them to think critically about their behavior when they engage with their own learning, that they need to bring something to that too, right? We’ve left behind that idea that they’re just receiving all this. That’s not the kind of place CSU is, right?

Beren Goguen (00:19:33):
It’s a two-way thing.

Dr. Joseph Brown (00:19:34):
Absolutely. I mean, that’s why we talk about engaged learning, that they’re participating in this process. They have the other half of the circle, and when we pull those things together, that’s when some really magical things happen in a classroom. So I have them read it really to encourage them to think about the ways in which the forces of technology and modernity are affecting how they learn. (Music Break)

Beren Goguen (00:20:04):
What’s the success you or your department has accomplished recently?

Dr. Joseph Brown (00:20:08):
Well, the first thing that I would point to is the artificial intelligence and academic integrity blog that’s connected to the Institute for Learning and Teaching website. Our faculty and our university community have found that to be a valuable resource as this issue involving generative AI has continued to just evolve. I mean, this issue is not the same that we saw in January of ’22. And so as it’s continued to evolve, we’ve tried to provide more resources for our faculty to understand what was going on, including really, I would say, a highly visible post called “Your Fall Survival Toolkit” for our faculty, mainly just to help them understand: This is what you’re likely to see in the classroom. Here are resources that can help you immediately think through some of these questions of how you’ll manage them. And the feedback was that this was very helpful for our faculty, especially faculty who were teaching for the first time or had not really engaged the idea that generative AI might affect their classrooms.

(00:21:07):
And so I really love that space. It’s kind of funny to talk in 2024 about the value of a blog, but what we found was that anything we put up website-wise that was static, we would immediately have to revise. And so this gives us an opportunity to just continually update as new information comes out. I think we’re going to face some challenges and some headwinds in the fall semester this year. These models are advancing really quickly, and I want to have an opportunity to talk to our faculty directly about here’s what we’re seeing, here’s what you need to know, and make sure they’re equipped to manage that in their classrooms. Because, as of right now, CSU’s primary stance on generative AI is we’re allowing our faculty to kind of chart what that looks like in their courses, which I know might be confusing for students because it might look really different if they go from one side of campus to the other in a given day.

(00:21:57):
But what we found is that the need for disciplines to identify how this might play out in their discipline was really important — and to equip our faculty as opposed to prohibiting them from that kind of innovation that we’re going to see over the next few years. So I really like that resource. I think it’s one of the things… we’ve got a lot of feedback that people have liked that. They’ve enjoyed having that kind of communication. And really it just allows TILT, again, to be this conduit of: Here’s what’s going on in teaching and learning circles. Let me bring this back to the university and make it digestible to our faculty who are incredibly busy people and productive people. So they don’t have time necessarily to go through the same kind of journals and published material that I do. But then to make it as accessible and easy for them to digest.

Beren Goguen (00:22:45):
So generative AI is a big topic, obviously, in higher education right now. How is CSU addressing the challenges that AI poses to academic integrity?

Dr. Joseph Brown (00:22:55):
So I think there have been various phases where CSU has, I guess, diversified its approach. So the first stage was really just helping people understand what was going on, what it was, and how it might impact the classroom space, the learning environment. I think at this point we have moved on to a phase of: We need to understand from a policy perspective, from an institutional perspective, how we’re going to manage not only the growth of these technologies in our spaces, but I guess start to solidify some of our approaches to this because I think that the initial impression of, well — teach with it — was really frustrating for our faculty because they were saying, well, I wasn’t trained in generative AI. I’m dazzled by it. I think it’s interesting or I think it’s a threat or I want it to disappear. But being able to make those connections was another thing they were being asked to do in their learning space and their disciplinary environment

(00:23:49):
that was just really challenging. And so now I think that the focus is on giving our faculty some more centralized and solidified direction of: Here’s what we think is going to happen over the near term in the future, right? In the long term, here’s how we’re going to manage the growth of this in our space. And I think the real challenge right now is trying to find those places where the institution can consolidate into some kind of broad directives. So, in other words, giving people tools that they’re asking for and need. How can I detect it when I think it might be there? That is an ongoing question. That’s a really tricky question right now. The other component is, well, if I want to begin training my students to use this exciting new technology that’s going to be expected in their field, how do I do that responsibly?

(00:24:42):
That’s another kind of point that we need to give guidance on. I think right now the impression has been that it’s kind of like how will it translate to your classroom? That’s fine in the near term, but in the long term, that’s going to be really a challenge because we’re going to see students come to our classrooms already prepared to use this technology, already expecting to use it. And that window of being able to tell them how to use it is closing on us very quickly. So we need to have some sense of how to consolidate this all down into something that’s manageable. Otherwise, I think we’re going to feel frustrated as students come to our classrooms with expectations that we can’t meet or we didn’t intend to meet. And then they’re going to run headlong into our expectations for how work should be completed. That might be kind of outdated. So we need to start thinking through that.

Beren Goguen (00:25:33):
So, is there kind of an understanding, then, that there’s a way to use generative AI with academic integrity and there’s a way not to?

Dr. Joseph Brown (00:25:42):
I think that’s evolving. I think the first attempt at this was essentially encouraging students to be transparent about when they were using it, because of one of the speakers that we had on campus at one of the provost colloquiums last year… So this was Abram Anders, and he made the argument: They’re already using it in the classroom. And they’re either going to do it surreptitiously or they’re going to do it openly. In that scenario, you actually get to teach. And I don’t know if that’s necessarily true. We haven’t seen much data yet about student usage. I mean, we’re getting some initial surveys, but we also believe that… the field believes that this, right now, is that really messy period where some students are using it to a high degree and some students are ignoring it completely. And so we don’t really understand the level of adoption.

(00:26:35):
So, I think the initial thought was: Let’s find some way for students to be transparent about their use. Let’s show them the ways in which it can be used ethically, and some of those uses are involving, like, don’t ask it to produce an essay. I’m going to use an essay, just as an example here, because it’s my field. Don’t ask it to produce an essay, but you may begin using it to generate ideas for how you might get started on an essay. Try different versions of an outline. All of these things might initially bother me as an English teacher because I think, well, that’s part of the invention process. That’s part of the learning. And all of that is true, but also, realistically, I got so tired of telling students to do an outline that, at this point, it’s like, well, if it gets you started and we get to a place where you have a draft that we can then work on — we can maybe backwards engineer that draft into something more organized — then, okay. Let’s maybe use that as a potential way we could use that new technology. I also like the idea of seeing what kind of feedback it can give a student. I think that has been one of the more interesting areas where it’s failed to give them valuable feedback. It gives what I would say is very superficial feedback right now because, as we know, it’s not really reading the piece.

Beren Goguen (00:27:53):
It’s sort of emulating how a human might talk when giving feedback, but not necessarily based on what they actually wrote.

Dr. Joseph Brown (00:28:01):
Yeah, exactly. Yeah, I mean, again, most people don’t really understand how these systems work. But I mean, it’s essentially charting the probability of what the next word is, given the previous words.

(00:28:11):
And it might appear to be talking about your paper, but it kind of isn’t, and certainly not in the way that your peer might in peer review or your instructor might in giving you formal feedback. And so I think helping students understand the limitations of the technology right now is really important, and that’s primarily because, when students use this, they’re coming to these technologies as learners, as people who are not experts. So when I look at the output it creates, I see something very different than what my students will see. They’re dazzled by it. They’re like, wow, it made a whole draft on that idea. Well, it did, but it wouldn’t necessarily… I would value that draft very differently than they would. And the difference is expertise. Eventually students will become experts in their field, or at least they’re going to be practitioners, and they’re going to know immediately what’s a good set of therapy objectives for a speech language pathologist and what are terrible objectives for a particular therapy session, but that’s going to take time and learning in your field. The danger right now is they treat all of the output as the same. They’re just like, this looks good. Sure, that sounds like something a professional would write. That’s the real danger right now. I think helping students see some of the limitations of that technology is part of the learning that we need to value.
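The next-word mechanism Dr. Brown alludes to can be sketched with a toy bigram model. This is an illustration only, not how production generative models work (those are large neural networks trained on vast corpora); the corpus and function names here are invented for the example:

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each word, how often each word follows it in the text."""
    words = text.lower().split()
    counts = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def next_word_probs(counts, word):
    """Turn the raw follow-counts for `word` into probabilities."""
    following = counts[word.lower()]
    total = sum(following.values())
    return {w: c / total for w, c in following.items()}

# Tiny made-up corpus: after "the", the model has only seen "cat" and "mat",
# so its "prediction" is whichever continuation was more frequent.
model = train_bigram("the cat sat on the mat and the cat slept")
print(next_word_probs(model, "the"))  # 'cat' is twice as likely as 'mat'
```

The point of the toy is Dr. Brown's: the model charts what word is probable given the preceding words, without any understanding of the paper it appears to be discussing.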

Beren Goguen (00:29:38):
Looking at it more critically as if you would look at your own work more critically. How can I improve this? Is a skill that you grow.

Dr. Joseph Brown (00:29:46):
Yeah, absolutely. I mean, I think we do this anyway when we ask students to evaluate writing, and so it seems strange to me that we wouldn’t ask them to do that with the output of this new technology. Now figuring out how that connects to your class, if you’re a faculty member, is a totally different question right now. That’s one of those open concerns of how do I blend this with some things that I’m currently doing? But that idea is something that we’re consistently asking students to do. Here, read this and evaluate it. What’s valuable about this argument? What are the weaknesses of this argument? We’re already looking at texts in those ways, the same way we’d hope to have them look at this kind of writing. At the same time, I just want to point out — this is one of the things I’ve been trying to talk about on campus — a clock is ticking here.

(00:30:31):
I mean, the situation is this. Our students right now were raised in a completely different writing ecosystem (really, a different ecosystem for learning) than the students we’ll see in the next seven years. I’m going to use seven because, generally speaking, we’re talking about how long it takes for a student to go from the beginning of middle school to entering our classes. So, when I look at students in my current class and I’m asking them about their ChatGPT usage, I’m getting a lot of hesitancy. They’re worried. Well, is it okay for me to even talk about this? Because they were raised in an ecosystem where shortcuts were not… They know what that is. They know that’s probably not what I should be talking about with my professor. But within a year, what we saw was students being encouraged to use these technologies to write admissions letters — and by their counselors!

(00:31:26):
So it seems wrong of us to say, well, that’s not right. You can’t do that. The adults in their lives are saying: Hey, this is a tool and you should use it. I think if you flash forward a number of years, what we’re going to see are students in our classrooms who really were raised in a completely different ecosystem, and their expectations for how writing is done will be different. I think we need to figure out how we’re going to manage that, because we don’t want to be the “university of no.” You can’t do that. I know you’ve been taught that for the last seven years, but that’s not real learning. They’re just not going to believe that, and I think that would seem really old fashioned. At the same time, I don’t want to sound like one of these AI snake oil salesmen. I have profound worries about this technology and its effect on higher ed.

(00:32:12):
I think the fact is we don’t own any of this. They’re owned by the most powerful corporations in the world. There’s been a lot written about how they’ve been trained and the challenges of the bias that’s inherent in those systems. I’m also… I’m worried about the learning that’s occurring alongside these systems. If we can really call that learning. I’m worried about the shortcuts and the impact that those have. But I also want to live in the world. And I have to acknowledge that some of these things are just not under our control, and we have to figure out how to manage them appropriately. Otherwise, we’re going to be increasingly making an argument that just does not resonate with students.

Beren Goguen (00:32:52):
The genie is already out of the bottle. Academia can’t put it back in. It’s the reality of the situation and the way the world is headed. I imagine AI is just going to continue to be more and more integrated into the way we work and even just communicate. It’s built into your email. It’s built into your phone. Younger folks are just going to become more used to interacting with AI the way you interact with a person, asking questions. As a person in their forties, I’ve never really gotten into asking my phone a question or talking to the little box in my living room, but I’m not in that demographic. Whereas my kids probably will be, and it’ll just be second nature. And then, if they come into academia and it’s like: no, no, you can’t do any of that. That’s going to be jarring and it’s not going to work out very well.

Dr. Joseph Brown (00:33:45):
Exactly. But I mean, we’re also talking about academia and we’re really good at holding different ideas in our head at the same time. And so I can be worried about the effect this is going to have on writers and thinkers at the same time, recognizing that it’s going to be a significant part of their professional lives. And what we do is not necessarily just professionalization. I mean they’re on a plan of study that’s eventually going to produce engineers and managers and writers and all of that, all the things… all of the graduates we produce every year. But our most basic charge is to produce thinkers. I mean, this is, I think, the value… As a liberal arts guy, this is the value of what we can do, the full life that we want our students to have. So I can at the same time recognize that this is going to be an incredibly important part of their professional lives in 30 years.

(00:34:38):
I can also be really worried about what we’re losing when we adopt these technologies. And I think that slows us down a little bit. I think that’s really effective because then we start to really evaluate why we’re doing the things we’re doing in the classroom. There will be things that we have been asking students to do for a long time that I think we’re just going to have to let go of. I’m not a huge fan of the annotated bibliography. I stopped teaching it after a while. I’m not necessarily saying that it shouldn’t be taught or that it’s not valuable, but we only have 16 weeks with students in a standard course. And so I think that what’s going to happen is this sort of thing is going to add pressure to what we decide to choose in those class meetings. And when we say 16 weeks, we’re talking, what, an hour three times a week? So really we’re not talking about a lot of time with them, and I think that this will challenge us to look at all the things we’ve asked them to do and say, why that? And I think that process will actually be really helpful and really healthy for us.

Beren Goguen (00:35:44):
How do you think students can be more involved in this conversation?

Dr. Joseph Brown (00:35:47):
I think in the near term, students need to get really active about talking to their professors about what is going to be appropriate and what’s not with generative AI. I think, just generally speaking, making sure they understand the expectations of a course is just how you do college in 2024. It’s not always… we don’t put it on a billboard. We do put it on a syllabus. The research says we don’t do a lot of reading of the syllabus. So I think just being active about… Hey, is there a place for generative AI on this assignment? Or just having your ears open for when that is appropriate. I would be willing to bet that in most classes, there’s not going to be a blanket “sure, you can use it” situation. That’s just not the reality right now. You might have a class where the professor says, “Hey, we’re going to really push the envelope this year on this particular issue, and so I’m expecting you to collaborate with generative AI on every assignment.”

(00:36:40):
That might be one class. For the most part, what I’m seeing in terms of adoption from faculty is they might have one assignment where they allow students to use generative AI in a very limited area. And it involves some sort of reflective piece: Well, what did you learn from using it there? How much might you use this in the future, and on this particular kind of work? So I think being clear about what those expectations are is the most valuable thing a student can do right off the bat. And I think that that’s going to change over time. I think that clarity will become a bigger part of our teaching in the near future as we involve more of this technology in our instruction. If this were my son or daughter in our class, I would say go talk to the professor and ask: Are we allowed to use generative AI at all in this class this semester? And you’ll know by their reaction what that answer is. Right now it’s going to be, no, don’t do that. It’s cheating. Or it’s going to be, yeah, but it’s going to be on assignment three, and I’m going to talk to you way more about that when we get closer. I think that allows you to plan for what’s appropriate and how you do your work.

Beren Goguen (00:37:49):
So it gets back to the transparency and just being upfront, having a conversation, making sure everyone is on the same page.

Dr. Joseph Brown (00:37:56):
And I think… I know that there’s this power dynamic of this is the person teaching your course, and this is a faculty member at the institution you were thrilled to get into and you’re excited about being here, but these are people. I mean, these are people who really are excited to teach you. So go have that conversation with them, and in the middle of that, you’re going to probably talk to them about other parts of the course that you might not have the greatest clarity on, and that’ll be… all of that is really good for you. And also, they’ll know you. They’ll know your name. This is really great. Then they’re like, oh, that student really cares. These are the kinds of things I talk to students about all the time. I was like, just go talk to them. They’re humans and they’re really happy to help you. No matter what movies you’ve seen and that sort of thing about the fussy professor or whatever. They really are glad you’re here and they really want to teach you. (Music Break)

Beren Goguen (00:38:56):
In one of your posts, you said that faculty should consider including some components of AI competency alongside or within more traditional assignments while holding students accountable for the accuracy of the final products. What’s an example of how an instructor might do that?

Dr. Joseph Brown (00:39:12):
So early on, there were a lot of different models people were suggesting for what this might look like, and this is one of them. Another one, for example, is just opening the floodgates and saying, I’m just going to assume everything you produce is from generative AI, but I’m going to grade you as if it’s yours, which I thought was kind of giving up. So I like this as a step back. It’s like, well, wait a minute. There is a way for us to design learning opportunities for students where we’re really still holding the reins on how they can use it, but we’re also making them accountable for what it’s doing in their work. And the example I would give would be doing group work in, let’s say, a STEM class, where you’re doing a lab report. You’ve done a lab with your partners and you each contribute to different parts of that lab. But then you have to write your own report.

(00:40:04):
So I would think of it in terms of: You’ve gotten this data from your partner, from the lab, this collaborator. Well, that’s kind of what generative AI is going to be in this kind of scenario, but you’re responsible for that data. You don’t get to just mentally check out and go: “I’m sure it’s fine.” Instead, you have to go look through it and go, okay. Yes, I remember when we did that. This was the result and this was the correct finding for that particular activity. I think that’s kind of what we’re talking about here. So thinking of generative AI as a collaborator in a learning product, a learning outcome and assessment, and that encourages students to really read carefully through the information that they have received or to speak fluently about where they use it and how and why. I think that will be really helpful.

(00:40:51):
It really rolls back this fear that we have where they’re one prompt away from finishing an assignment. They’re going to get our assignment, they’re going to drop it in and say, write a four-page report on this subject. No one could claim that’s going to be learning. We’re not talking about that. But I think that giving them some sort of activity where they have to engage with it, then evaluate what they’re given, then revise and be transparent about what is left, encourages a kind of editorship of their work that we’re going to see more and more of as these students move into professional fields. I don’t think it’s going to be just drop it in and then ask for the output.

Beren Goguen (00:41:33):
Instead of looking to AI as a shortcut that will do the work for you, you’re looking at it as a collaborator. AI makes mistakes, but it can speed the process up, so you can kind of collaborate with it while having the final oversight.

Dr. Joseph Brown (00:41:47):
Absolutely. Think about it for a moment. As English guys, we know there are different genres of writing. And one I’m completely unfamiliar with is writing a petition. I haven’t written a petition. So when I look at getting started on that, I want to see models. Well, that’s one really great opportunity there: show me some models of an effective petition. And so I get a sense of, okay, there’s a format here. There’s an introduction that explains the rationale, and then there are reasons clearly stated, maybe bulleted or whatever. I think that those are ways in which we might think, really from the outset, of using this technology as a partner, as a teammate. I don’t want to anthropomorphize this situation, like, well, it’s a person, because it’s not. But if we think of it as a situation where we can lean on a well-trained collaborator, but ultimately we’re responsible for what is turned in, then this produces better thinkers, more critical thinkers about the work that they’re submitting.

Beren Goguen (00:42:55):
And they’ll be thinking more critically about using the technology in the future as opposed to just this is a quick fix.

Dr. Joseph Brown (00:43:01):
Yeah, I think this is going to be a bit of a race, though. I mean, right now it’s really easy to teach students about the limitations of these technologies because we’re aware of their failures. They became really public, and anybody can write and ask it to do something and just see how badly it can do. However, we know that these are not going to stay at that level of development for very long. And so, I think there is a sense in which they’re going to quickly catch up to our ability to find ways to show students the limitations. And so I think that we are going to reach a point where they’re incredibly well-trained, they’re incredibly powerful, and I think our challenges then are going to be quite different than what we’re facing right now. But I do think part of it is just making sure students are aware that this isn’t a genie.

(00:43:47):
It is not going to come up with the answer that you were intending. At their best, they can come up with an output that’s okay when you’ve trained it enough, when you’ve asked it the right kinds of questions or you’ve given it the right kind of data to look through. And I think that’s part of a learning process for our students, in the same way that when they sift through information on Google, they very quickly… They’ve gotten really good at knowing that’s not really valuable information. I don’t need to go to that website. This looks like appropriate information. This is an authority. We need to give students more credit, I think, than we currently are. They were pretty good about catching up to Google. I think they’re going to be pretty good about figuring out what’s not great about these technologies.

Beren Goguen (00:44:27):
Ten years from now, technology could be good enough that you ask it to do it, and it just does it, and it’s actually pretty close to what an undergrad student would produce, maybe even better. I guess in a way that’s plagiarism. It’s like you’re paying a professional writer to write something for you.

Dr. Joseph Brown (00:44:46):
I would say that the best versions of these models right now, the ones that you have to pay for, are capable of doing mostly that right now. And that was really scary at the beginning of the summer, when it looked like OpenAI was just going to give away their GPT-4 model. It was a slightly dumbed down version, but capable of producing this kind of work. And they were going to give it away for free. And then they kind of face planted for about a week and they rolled that back. I thought that was going to create a huge challenge for our faculty in just determining what is authentically student work and what’s not. I think that there are a couple of takeaways from that. Number one, these are still companies trying to produce products for the market, and there are going to be fits and starts.

(00:45:31):
There’s also a barrier about cost. $20 doesn’t sound like a lot to most people who are probably spending that on a Disney Plus subscription, but the psychology of that is still, I think, pretty powerful. I look at it and I think, well, they’re paying a lot of money to go to the university. $20 is, if I’m really being cynical, I mean, I think that’s a cost that you could justify. But I’m also aware that people make all kinds of strange decisions when it comes to things they pay for. So I think cost will still be a barrier. The thing that kind of worries me the most though is that, I mean, we’re talking about right now… we’re in that scenario where there are multiple models. They’re competing against one another so that every time they release something new, they want you to try it for a little while. But so much literal energy is required to produce the next version of these models. We’re talking about the kind of energy that nation states have to produce.

Beren Goguen (00:46:23):
It’s massive. Yeah.

Dr. Joseph Brown (00:46:24):
It’s massive. I mean, that’s one of the things that people just don’t talk about and…

Beren Goguen (00:46:26):
The cost is massive.

Dr. Joseph Brown (00:46:28):
I think right now people think of GPT-4 versus a potential GPT-5 like the iPhone 4 versus the iPhone 5, which I know is an old phone now. But they think it’s just the next iPhone. It’s not like that. So, I mean, to your point, it’s massive commitments that are being made by these companies. And that means that fewer and fewer companies are going to be trying, which means, I think, eventually we’re down to something like it is with Microsoft or Apple. We are going to have one or the other, and it’s going to really funnel down into some very limited choices. And then we have the kind of unsatisfying, worrisome situation where our students are being asked to use these technologies that we don’t own. Those companies are not beholden to any kind of state control whatsoever, and that should raise alarms for people.

(00:47:22):
I jokingly refer to it sometimes as what happened to grocery store checkout lines. You and I are old enough to remember when there were people checking out our groceries all over the store, and when those first do-it-yourself checkout lanes came through, it was like: “Hey, this will save you some time. Why not? If you only have a certain number of items, just check out.” And so we thought it was a novelty. I think in some ways we’re excited by the novelty of generative AI right now, and we’re envisioning a future that is probably a little rosier than what it’s going to be, because you and I both know that when you go to the grocery store now, you’re lucky if there’s a checkout person. And we’re all waiting, stuck in line behind the self-checkout thing that’s malfunctioning, because of course it is, right? And we don’t really have a choice anymore. The store isn’t going to invest in human checkout folks anymore. They’re not doing that. That’s just not what the market will bear. So what I worry is that if we go too far in on this, we’re going to be setting up our students for a future where they’re kind of serving a technology that we don’t have any control or say over how it performs. And it becomes a part of their intellectual life in a way that we’re all kind of really uncomfortable with.

Beren Goguen (00:48:34):
Interesting to think about what it’s going to be like in the next five to ten years. Because like you said, we’re in a transition period where it’s not set yet and maybe it never will be. But…

Dr. Joseph Brown (00:48:42):
Let’s take a step back for a second and say maybe there’s a totally different scenario that awaits us, which is: It turns out to be not a huge thing. So, as the director of the Academic Integrity program at CSU, I can tell you that in just the eight years that I’ve worked here, we have seen a steady parade of potentially industry-changing technologies that turned out to actually not be a big deal at all. So for example, Chegg. This is now a floundering, failing company that for a very short time, mostly because of the pandemic, was the talk of higher ed, because they were essentially facilitating cheating under the guise of academic support. And one of the things I told our faculty then was: the reason they’re successful now is because they’re mimicking the way we talk to students about academic support.

(00:49:33):
And that might have a short-term benefit while our students are virtual, but when they come back to campuses… I really love the idea that our academic support is free. They paid for it by virtue of being here. We just need to be a little more accessible. And so what essentially happened is people came back to campus. Students did the things that they normally do. They go to their professor’s office hours. They go to tutoring programs all across the university. And they kind of stopped using the pay-for service. And so that left these companies… Chegg is only one of them. There were a handful. They’re still kind of out there. But we were really worried about them for about six months. And it turned out that the market kind of corrected, and I think in a valuable way, which is it encouraged students to just go to the thing that people were handing out for free because they cared,

(00:50:22):
because you were there, you were a student there. But we learned from that. We learned we needed to be a little more accessible. We needed to have more flexible hours with students. We needed to engage with them online more than we were. And I think that could be an argument for: This won’t be as big of a deal as we worry that it might be. And we do have time on our side as higher education. We’ve been around for a while. We think in terms of generations. And the great thing about working for a university is we get to have a vision that is much longer than just a few years, more than a quarterly earnings statement. I think that’s valuable because, at the same time that we manage the challenge of these new and emerging technologies, we can chart a course where maybe they weren’t going to be that big of a deal after all. We can keep doing the things that we know are valuable for people.

Beren Goguen (00:51:13):
If you think about it, ultimately at the end of the day, there will always be students who are looking for a shortcut, who are looking for a way to not have to do the work. And there are students who are invested in their education and want to do the work and want to learn. And there’s not really too much academia can do about that or the university can do about that. It’s going to play itself out to an extent, would you say?

Dr. Joseph Brown (00:51:36):
Yeah, I think that’s been a story for a really long time. I think that’s accurate. I also want to point out though that these are norms that we’re talking about, and I think norms are going to shift. I think they already are shifting. I mean, I don’t know how many of these letters of recommendation that are coming from high school teachers are actually written by the teachers anymore. And that’s what we’re hearing from the admissions office. They seem… And to be fair, they used to get packets. I was talking with someone in our admissions office, they used to get packets where there maybe were 30 students applying from that high school and that teacher had been asked to write the letter of recommendation and it was the same letter over and over, just different student name. So I don’t want to really throw concern over those processes.

(00:52:16):
They’re not really that different. But I do want to say I’m still writing letters of recommendation for my students; I’m not asked so often that I don’t have time for that. And I think it’s a valuable part of just reflecting on the student, and those are my words. But I do wonder if in five years I’ll look back at that and it will seem really quaint. And I think that’s when we know norms are shifting. So while it may be true that there were always going to be students who want to game the system, to do this as easily, as effortlessly as possible, that’s actually a really small number of students historically, and they’re pretty easy to see, actually. I mean, the conduct folks can talk more about that, but the fact of the matter is our faculty do a great job of identifying when student work is authentic and when it’s not. They actually read their work, is the bottom line.

(00:53:06):
They care a lot about whether or not the student is learning. And so I think identifying those students is a different kind of matter. What I’m concerned about is: As these technologies flow into our daily lives, I think that those norms about what we think is authentically ours will change. And so right now it’s really easy. You have to go to ChatGPT and log in, or you have to go to Claude, and you’re consciously aware that you’re asking it to do a thing. But already we’re seeing these technologies show up in Outlook, in Microsoft Word, in the things we use on a day-to-day basis. I don’t think I’ve done a Google search in the last month that hasn’t come back with an AI-generated answer, which has been an incredibly frustrating experience because half the time it’s not the right thing. But my point of bringing that up is that that level of interaction, I think, changes our impression of what is ours and what was given to us. And over time, if you just keep playing that game out, we might find ourselves in a really different place. My hope is that academic integrity doesn’t erode at that point. I think this is one of the fundamental pillars of higher education: you learn something. I need to know you learned it, right? You have to show me in a way where I can assess it. The ‘how’ to do that is the thing that’s probably changing over time. (Music Break)

(00:54:42):
My mother attended the University of Georgia, a first-generation college student in the early 1960s, and she had the most interesting stories about what the university was like. They were these very institutional types of stories, like, you had to prove you belonged there. And I think it’s very much from the lens of being a first-generation student. But the connecting points were: all of her classes were hard. They were challenging classes. She talked a good bit about having a philosophy class that was really challenging for her. But she engaged in the class. She did the reading, and when it came time to prove that she had achieved the learning that was intended in that class — that she had done the work — she did it. She passed the class. And the story she told me was that there were these high-stakes moments where everybody remembers the blue books.

(00:55:30):
They’ve been around forever… of finishing the assignment and passing the class. And it meant something. It was valuable to her. She completed her degree. She was a teacher for 32 years in a metro Atlanta school district where she passed on what she knew to others, a long career of just helping in her community. And I think that when we create learning moments that matter for students, those are the lives that we engender, that we encourage. They feel like: “I did something that was valuable. And I also feel the need to share it, share that life, that knowledge with other people.” I like her stories about the university because I think that it’s okay for some of this to be hard. We like to talk ourselves out of that, right? That everything should be somehow more accessible. Yes, the university should be more accessible, but I like the idea that you’ve achieved something by proving that you could pass those tests.

(00:56:29):
You could do the things that showed that you had learned, and that learning wasn’t going away anytime soon. So that’s a really valuable story to me, because without her journey at the University of Georgia, I don’t go to college. We know the statistics here. And the fact of the matter is my brother, my sister and I, we all achieved college degrees. And I think that kind of achievement echoes through generations. And that’s the kind of thing I want people to think of: when instructors are making learning opportunities for students, it’s not necessarily like we’re trying to put up a barrier. But it is a process through which they become something else. I have this line that’s from Tennyson. He’s talking about leaving his kingdom to his son Telemachus, and he says: I leave you the sceptre and the isle and this mission, this work, pursuing to fulfill the desire to take a rugged people and through soft degrees subdue them to the useful and the good.

(00:57:25):
What he’s essentially saying is education is this beautiful process by which we refine human beings. We give them something that they could not have achieved on their own, right? And that is a process, and the outcome is this refined, accomplished person. And so I like working in this field because I can help assist that process, and I love seeing our graduates out in the community. They’re amazing people. My son was just coached by one, easily the best coach he had ever had. And because he’s a thinker, he’s a person who has had an opportunity to really think about the things that he is doing. And so this is one of the things that makes me really excited about the work we’re doing here.

Beren Goguen (00:58:04):
You don’t remember the easy classes, really. You remember the classes that challenged you and those are the ones that stick with you.

Dr. Joseph Brown (00:58:11):
Absolutely. I think we all have in our mind the classes that were hard, and you’re proud of those because they required something of you. In fact, I remember texts that were really hard, where I remember thinking: I don’t think I’m going to get through this text. But I’m going to finish it. I’m a big fan of the nose-to-the-grindstone moment of, like, I’m going to finish this. I’m going to read it. If I read to the end of the book and I still don’t understand it, then I’ll figure that out, too. But I’m going to get through it. I think those moments of persistence and resilience are really important. And obviously that’s a big part of what we talk about at TILT as well… how to encourage those in the classroom.

Beren Goguen (00:58:51):
And it’s the same in the STEM field. It’s the challenges that take a lot of effort that are usually the most rewarding.

Dr. Joseph Brown (00:58:56):
Yeah, no, that makes sense. See, I think STEM actually understands this in a way, really easily, this idea of failure and kind of a recursive process of failure and then we’re going to try something new. Failure and then we’re going to try something new.

Beren Goguen (00:59:10):
You learn from the failures a lot.

Dr. Joseph Brown (00:59:11):
Absolutely. Yeah. I think we’ve got to get more comfortable with being okay with that failure. It’s going to happen, and then spending the time and figuring out what it is, what led to that, and what we can pull from it.

Beren Goguen (00:59:22):
Thanks for talking.

Dr. Joseph Brown (00:59:24):
Of course. Thanks for having me. (Music)

Beren Goguen (00:59:32):
Thanks for listening to this episode of the Applied Podcast. If you’d like to learn more about the Institute for Learning and Teaching at CSU, you can find a link to that website in the show notes. If you’d like to learn more about the ethics of artificial intelligence, I included a link to a recorded presentation by Dr. Matthew Decomp during the CSU Provost’s Ethics Colloquium from November of 2023. And of course, I would also highly recommend checking out episode eight of the Applied podcast, which features my interview with Dr. Sudeep Pasricha, a professor with joint appointments in the departments of Electrical and Computer Engineering, Computer Science, and Systems Engineering at CSU. In that episode, we speak about artificial intelligence, data centers, and the environmental impacts of both.


Explore More Episodes

Ep. 10

Entrepreneurship Should be for Everyone, with Mark Schreiber

What is entrepreneurship? Should it be taught in K-12 school? What could that look like and what are the barriers?

View All Episodes