0:02 Amna: This year's senior class
0:03 at universities across the
0:05 country is the first to have
0:08 spent nearly its entire college
0:10 career in the age of generative
0:11 AI, a type of artificial
0:13 intelligence that can create new
0:14 content, like text and images.
0:20 As the technology improves, it's
0:21 harder to distinguish from human
0:22 work.
0:23 And it's shaking academia to its
0:24 core with some very big
0:24 questions.
0:27 Special correspondent Fred de
0:27 Sam Lazaro has the story for our
0:28 series, Rethinking College.
0:32 >> And the principle of humanity
0:33 says, treat all people as ends
0:36 in themselves, never merely as
0:39 means.
0:40 Fred: About two years ago, Megan
0:41 Fritts, a philosophy professor
0:42 at the University of Arkansas at
0:42 Little Rock, began spotting
0:43 something unusual about her
0:44 students' writing.
0:46 >> You suddenly get an essay or
0:48 a test answer, some kind of
0:57 assignment from a student whose
0:57 normal writing you're familiar
0:58 with, and you get something back
1:01 that sort of sounds like an
1:02 official business document or a
1:02 piece of technical writing.
1:04 Writing that sounds very highly
1:04 polished, but very impersonal.
1:08 Fred: Impersonal because it
1:09 likely wasn't written by a
1:10 person.
1:19 This was the beginning of a
1:19 turning point for higher ed, as
1:20 generative AI swept through,
1:21 not only her campus, but college
1:22 campuses across the country.
1:22 A survey last year found that
1:23 86% of college students are now
1:26 using AI tools like ChatGPT,
1:28 Claude, and Google Gemini,
1:29 for school work.
1:33 The reason generative AI has
1:34 spread so quickly on college
1:35 campuses is not hard to
1:36 understand.
1:38 It's transformed tasks that used
1:40 to take hours, even days, of
1:42 writing and revision into
1:46 something that can be done in
1:47 mere minutes.
1:50 For example, I can ask ChatGPT,
1:52 write me a 1000-word essay on
1:57 the topic of, "Is it ok to lie?"
1:59 And using a massive amount of
2:01 data, it predicts and generates
2:02 sentences on this topic
2:03 instantly.
2:05 Fritts says the impact has been
2:06 deeply disruptive.
2:08 >> If I'm reading the writings
2:14 of ChatGPT instead of my
2:15 students, I have lost the very
2:17 best tool that I have to see if
2:20 I am being effective in my
2:21 capacity as an instructor or
2:22 not.
2:24 >> We really need a framework in
2:28 which people can use these
2:29 things and innovate
2:30 while minimizing the risk.
2:31 Fred: University policymakers
2:33 have scrambled to stay ahead.
2:36 >> I think the realization over
2:38 the past year and a half is the
2:40 technology is outpacing our
2:43 ability to detect it.
2:44 Fred: Vice provost of research
2:45 Brian Berry leads one of U.A.
2:47 Little Rock's committees tasked
2:49 with creating clear campus-wide
2:50 policies on AI.
2:51 >> I think it really comes down
2:55 to us helping students
2:56 understand what's at risk.
2:59 Helping them understand that if
2:59 they use AI in the right way,
3:02 it's literally the most powerful
3:03 tool that they've ever been able
3:04 to use and it will make huge
3:05 differences.
3:07 But if they use it in the wrong
3:08 way, it could short circuit
3:09 their learning process.
3:11 Fred: The university is
3:12 finalizing a policy that lets
3:14 professors determine what AI use
3:18 is acceptable in their
3:18 classrooms, as long as they
3:19 clearly outline it in their
3:20 syllabus.
3:21 But for Fritts, who has a strict
3:23 no-AI policy, identifying it has
3:25 been complicated and time
3:26 consuming.
3:32 >> Phrasely is one of the
3:34 softwares that I use.
3:35 If I suspect AI use, then the
3:36 first thing I do is I do use
3:37 detection software.
3:40 I actually use eight different
3:41 detection softwares.
3:44 Fred: If her suspicion is
3:45 confirmed, she does meet with
3:46 the student.
3:47 >> If they can talk about the
3:49 thing that they wrote about,
3:49 then great.
3:50 But a lot of times they can't.
3:51 Fred: Sounds like it's tedious
3:52 and a lot more work for
3:53 professors like yourself.
3:56 >> It certainly cuts into my
3:58 life quite a bit.
4:01 It, at least has sometimes, made
4:02 teaching feel like policing.
4:05 Fred: And these detection
4:06 methods are not foolproof.
4:07 Students online say that they're
4:08 caught in the middle.
4:14 >> We might find out if I'm
4:22 about to get kicked out of
4:23 college.
4:25 Fred: Ashley Dunn was a senior
4:27 at Louisiana State University
4:27 when she was accused of using AI
4:28 to write a short essay for a
4:29 British literature class, after
4:30 a detection tool flagged her
4:30 writing last year.
4:31 >> I was like, am I gonna fail
4:33 this class?
4:34 Am I gonna get a zero?
4:36 Every college takes plagiarism
4:37 and that kind of thing very
4:38 seriously.
4:39 So I was just freaking out.
4:43 Fred: After communicating with
4:46 her professor, Dunn says she was
4:47 eventually given an A for the
4:48 assignment, but the response to
4:48 her on TikTok suggests that this
4:50 is a widespread issue.
4:52 >> A lot of people ended up
4:57 making responses to my video,
4:57 pretty much saying that they had
4:58 gone through the same thing, but
4:59 that they didn't really get as
5:04 lucky and they ended up either
5:05 getting zeros or failing the
5:06 class.
5:08 Some people recently have been
5:09 making videos about, oh, my
5:09 professor said that my essay was
5:14 AI because I used an em dash,
5:20 but that's just a regular way
5:25 of writing, especially at a
5:26 college level.
5:29 Fred: Not all schools are
5:30 anti-AI.
5:31 Some are actually looking for
5:32 ways to embrace it.
5:34 Lori Kendall teaches an
5:34 entrepreneurship class in the
5:36 Fisher College of Business at
5:37 The Ohio State University.
5:39 >> When gen AI came out, I and
5:43 every other instructor went, oh,
5:44 great.
5:45 Now what?
5:52 Do we allow AI?
5:53 Do we not allow AI?
5:54 And the reality is, you know
5:54 what?
5:55 They're going to use it anyway.
5:56 Fred: She now encourages her
5:57 students to use AI to
5:57 critically examine their
5:58 original work and as a learning
5:59 aid.
6:02 >> A lot of people might use AI
6:03 just to get assignments done or
6:06 to plagiarize, but I like to use AI
6:10 for deeper understanding.
6:11 Fred: Rachel Gervais is a first
6:13 year student, majoring in air
6:14 transportation.
6:18 >> I oftentimes use AI to create
6:20 questions regarding this topic.
6:20 So I not only get a better
6:21 understanding of the actual
6:22 material, but I also can test
6:23 and see what I need to maybe
6:23 focus on even more.
6:25 >> If you don't use AI or the
6:26 next technology that comes along
6:27 to be effective, you're not
6:28 going to be competitive in the
6:29 job market.
6:29 The job market's changing right
6:30 underneath your feet.
6:32 >> As the chief academic
6:35 officer, I get to decide on
6:36 academic integrity issues and honor
6:37 code violations.
6:39 Fred: Ravi Bellamkonda is the
6:42 executive vice president and
6:47 provost at Ohio State
6:48 University.
6:51 He says he was struck by one
6:51 alleged violation last year, a
6:52 student accused of using AI.
6:53 It was a case of cheating, he
6:54 says, but it made him think.
6:59 >> What if there existed
7:00 technology that indeed lets our
7:01 students produce work of very
7:02 high quality?
7:04 Shouldn't we investigate this a
7:05 little further?
7:06 Fred: Bellamkonda spearheaded
7:08 Ohio State's new AI fluency
7:10 initiative, which requires that
7:11 all undergraduate students,
7:12 across academic disciplines,
7:14 learn and use ai tools.
7:17 >> The trick is to figure out,
7:20 like any human interaction with
7:23 technology, what can we offload
7:24 to technology, and what do we
7:27 need to add value to?
7:33 Ohio State wants to be at the
7:33 front of that creation of those
7:35 rules.
7:37 Fred: That's prompted
7:38 experimentation across the
7:39 disciplines, like music
7:41 professor Tina Tallon's AI and
7:43 music class, which explores
7:44 innovative uses of the
7:45 technology.
7:47 >> I always start the class by
7:48 asking them to think about a
7:49 challenge in their field.
7:51 At that point, we're not even
7:52 talking about AI.
7:53 I just want them to identify
7:55 something that either they've
7:56 run up against or that their
7:59 students or their colleagues
7:59 have.
8:01 Fred: One member of her class,
8:03 tuba instructor and doctoral
8:03 student Will Roesch, is using AI
8:06 to analyze airflow into his
8:07 instrument over thousands of
8:08 repetitions.
8:11 The data will help guide
8:12 students on how to play the
8:13 perfect note.
8:17 Another, Natalia Moreno
8:18 Buitrago, is a music education
8:20 grad student studying how babies
8:21 acquire musical knowledge.
8:24 She used to spend hours combing
8:25 through home recordings of
8:27 research subjects, listening for
8:29 moments when parents or
8:30 caregivers sing or hum around
8:31 the infant.
8:34 Now, AI does this for her.
8:35 >> If we critically examine the
8:41 tools that we're engaging with
8:42 and are actively involved in the
8:42 development of them, I think we
8:43 can do some pretty incredible
8:44 things.
8:45 Fred: But, inevitably, these
8:48 tools also bring major
8:52 disruption, both to academia,
8:52 and to the jobs students hope to
8:53 someday fill.
8:54 >> How do we go through a
8:56 transformative moment like this
8:57 with the disruptions that it is
8:59 going to cause
9:00 and yet do this in a way that
9:03 ultimately is additive to us as
9:04 a society?
9:07 That it improves our lot as
9:08 human beings?
9:09 Fred: A question without a clear
9:11 answer, he says, but one that
9:12 students should help tackle.
9:15 For the PBS "News Hour," I'm
9:18 Fred de Sam Lazaro, in Columbus,
9:18 Ohio.
9:20 ♪♪