Study: 94% Of AI-Generated College Writing Is Undetected By Teachers

https://www.forbes.com/sites/dereknewton/2024/11/30/study-94-of-ai-generated-college-writing-is-undetected-by-teachers/

created by MetaKnowing on 01/12/2024 at 00:30 UTC

15161 upvotes, 64 top-level comments (showing 25)

Comments


Comment by StatisticianOwn9953 at 01/12/2024 at 00:44 UTC

2267 upvotes, 7 direct replies

Aside from weighting exams more heavily, it's difficult to see how you can get around this. All it takes is some clear instructions and editing out obvious GPTisms, and most people won't have a clue unless there are factual errors (though such assignments would require citations anyway).

Comment by [deleted] at 01/12/2024 at 00:32 UTC

2748 upvotes, 15 direct replies

We are creating generations of dumb shits, that's for sure.

Comment by LibraryBig3287 at 01/12/2024 at 00:57 UTC

348 upvotes, 4 direct replies

Think of how badly these MBA-factory dorks are gonna wreck society.

Comment by Interesting_Ant3592 at 01/12/2024 at 00:56 UTC

1056 upvotes, 9 direct replies

Oh trust me, they are detected. But we can't definitively prove it's AI, which is the problem.

I've graded many papers where it's painfully obvious it's partly or wholly AI-written. The voice changes, GPT has phrases it loves to use, it starts random tangents.

Hilariously enough, we will probably see a rise in handwritten exams as a result.

Comment by oldaliumfarmer at 01/12/2024 at 02:01 UTC

446 upvotes, 8 direct replies

The last time I gave a student a zero for cheating he came in with his lawyer. They don't come to learn.

Comment by [deleted] at 01/12/2024 at 02:35 UTC

127 upvotes, 3 direct replies

[deleted]

Comment by Eradicator_1729 at 01/12/2024 at 00:57 UTC

163 upvotes, 4 direct replies

There are only two ways to fix this, at least as I see things.

The preferred thing would be to convince students (somehow) that using AI isn't in their best interest and that they should do the work themselves because it's better for them in the long run. The problem is that this just seems extremely unlikely to happen.

The second option is to move all writing to an in-class structure. I don't think it should take up regular class time, so I'd envision a writing "lab" component where students would, once a week, have to report to a classroom space and devote their time to writing. Ideally this would be done by hand, all reference materials would have to be hard copies, and no access to computers would be allowed.

The alternative is to just give up on getting real writing.

Comment by Rememberancy at 01/12/2024 at 01:33 UTC

61 upvotes, 3 direct replies

Most of it isn't undetected. We know in most cases. It's just, at this point, what the hell can we do about it?

There is a way to use AI ethically; it's an amazing tool for serious students and scholars.

If I had my way, every single social science class would require a passing grade in a blue book in order to pass the course. That's what I had to do in college to get my degrees.

Comment by prairiepasque at 01/12/2024 at 02:10 UTC

54 upvotes, 1 direct replies

Interesting, but a few things to note.

1. AI responses were submitted in online exams for open-ended/essay questions, **not** as essays. There were 1,134 "real" submissions and 63 AI submissions from the researchers. I point this out because it's likely harder to discern a pattern in one paragraph of text than it is in several pages of text.

2. We do not know if some of the 1,134 submissions deemed as "real" were also AI submitted by the students. This would decrease the reported detection rate. The authors discuss this issue and say that 74% of students surveyed said they would use AI in a future course (meaning a lot of them probably did use AI).

3. The university had no AI detection software (not sure this would have helped, anyway), so detection was by eye only.

4. The university's policy for AI was basically that it's "not allowed" and that professors should keep an eye out for it. The authors do not assess how the university's stated policies and actual practices may differ, i.e. professors may be pressured to turn a blind eye in order to keep enrollment numbers up, thereby giving a false impression of the reported detection rate.

5. Adding on to that, it is a well-known issue that online courses are the most likely to suffer from AI submissions. It's very possible (I'd argue likely) that professors are overwhelmed and burned out by AI submissions and are simply choosing not to pursue the matter. They are also plagued by conflicting academic misconduct policies and, without tenure, may be essentially powerless to confront AI misconduct.

It is likely that the **actual** (or at least suspected) detection rate is much, much higher.

Check out r/professors to see their woes and frustration in action. They're *very* well aware of the rampant AI cheating.

Comment by otherwhitematt at 01/12/2024 at 01:23 UTC

29 upvotes, 1 direct replies

I’m currently working on a doctoral degree in education. While all of our dissertation related work involves heavy amounts of discussion with our chair and committee, the courses I’ve had with graded weekly discussion posts with required replies have been filled with AI posts and responses. I’d say that every person who has had an AI sounding response has been in an administrative position.

Comment by Anxious-Depth-7983 at 01/12/2024 at 01:52 UTC

66 upvotes, 6 direct replies

The people hurt most by it are the students themselves. Unfortunately, a lack of integrity is how you seem to succeed these days.

Comment by StonkSalty at 01/12/2024 at 01:07 UTC

32 upvotes, 2 direct replies

Wait until 94% of college assignment correction is done by AI.

Comment by ilifwdrht78 at 01/12/2024 at 02:20 UTC

104 upvotes, 4 direct replies

Professors need to stop assigning "busy work" in the form of writing assignments. In my education methods class (where I should be learning hands-on teaching), we spent 8 weeks of my semester reading a chapter and regurgitating it in a 500-word summary. This is a master's program, btw.

Comment by SplendidPunkinButter at 01/12/2024 at 01:05 UTC

53 upvotes, 3 direct replies

There's a movie called The Paper Chase. There's a scene in it where the hardass professor is grading the final exams, and you realize he's just counting how many lines everyone wrote and assigning higher grades to the students who wrote more.

This movie was made in the 1970s.

What I'm saying is that there's a tradition of professors not carefully reading essays in college that far predates ChatGPT, or else that joke would never have worked.

(Great movie btw)

Comment by Party_Lawfulness_272 at 01/12/2024 at 01:05 UTC*

16 upvotes, 1 direct replies

Honestly, in-class critical reading assignments are the best way to combat this. AI is good for spitting things out, but anyone relying on it will quickly find themselves unable to *comprehend* what they are reading when test time comes. Hell, I would make a quiz based on the book/topic and have them try to form an opinion.

Comment by ImportantComb9997 at 01/12/2024 at 04:43 UTC

15 upvotes, 0 direct replies

--Frank Herbert, Dune

Comment by Haschlol at 01/12/2024 at 08:24 UTC

7 upvotes, 0 direct replies

I don't have a problem with this as long as the student actually edits the text GPT gives them. That way you learn what you need to but don't have to do the mindless work. This is the whole point of AI anyway, so if you're honest with yourself and only use it to write a rough draft, it's great. The problem arises from students not studying and letting GPT do 99% of the work.

Comment by fgnrtzbdbbt at 01/12/2024 at 08:57 UTC

7 upvotes, 0 direct replies

"...by allowing the use of AI but disallowing the reliable technology that can detect it."

You can stop reading right there, because the only remaining question is whose talented marketing people the author has been listening to.

Comment by Able-Inspector-7984 at 01/12/2024 at 01:52 UTC

59 upvotes, 2 direct replies

We're gonna have a generation that is stupid, weak, and incapable of critical thinking, or any kind of thinking, all over the world if everyone uses AI.

Comment by TheRainbowpill93 at 01/12/2024 at 03:27 UTC

12 upvotes, 1 direct replies

Not gonna lie, I've used AI to write my science lab reports lol.

I use my own data, of course, so they can never really tell. You can also ask AI to write it how an undergrad would write it, and it's pretty indistinguishable from something I would write.

Comment by Hairy-Summer7386 at 01/12/2024 at 02:25 UTC

37 upvotes, 2 direct replies

Eh. I’m old fashioned. I write my essays the night before they’re due and typically get a solid 60-70%.

I never understood the need to use AI or any language generative models to write your essays. I’ve seen people compare it to using calculators and I kinda disagree. With calculators, you still have to understand the math itself to use it correctly. I feel like with AI you only need to cite the sources and what you want and boom. Your work is done. There’s no actual work being done nor is your own individual understanding of the material being expressed. It’s hollow.

Comment by Bebobopbe at 01/12/2024 at 12:52 UTC

6 upvotes, 1 direct replies

What we are finding out is that homework is useless. Exams are going to be worth everything.

Comment by yes-rico-kaboom at 01/12/2024 at 05:49 UTC

20 upvotes, 1 direct replies

Considering college is expensive as shit, I don't really mind that people are giving themselves as much of a chance as possible to pass classes, given all the bullshit busywork that's being forced upon them. Many colleges don't teach subjects even remotely in a way that actually teaches them properly. Students have to spend hours and hours doing homework and research, but when they get into the workforce, how you get your answer matters less and less; your actual output is what matters. It's even worse when you're paying thousands and have to fight to understand the content through someone's accent or inability to teach.

Schools should be teaching using all available methods. AI isn’t going anywhere and it’s an incredibly useful tool no matter how much people hate on it. Being able to use it in a way that augments your learning is a great thing.

The workplaces will suss out who does and doesn't know their shit. Colleges need to adapt.

Comment by cman1098 at 01/12/2024 at 05:02 UTC

4 upvotes, 1 direct replies

Make the assignments much harder and encourage the use of AI.

And then any writing assignments need to be done in person with pen and paper.