Comment by Videoboysayscube on 01/12/2024 at 03:43 UTC

4 upvotes, 1 direct replies (showing 1)

View submission: Study: 94% Of AI-Generated College Writing Is Undetected By Teachers


This is exactly the 'you won't always have a calculator in your pocket' mindset. The genie is out of the bottle. AI is here to stay. Any attempt to restrict it is futile.

Also, I think there's something to be said about the longevity of fields where AI usage alone is enough to ace a class. If the AI can generate the results all on its own, why do we need the student?

Replies

Comment by JivanP at 01/12/2024 at 14:15 UTC*

6 upvotes, 0 direct replies

The difference is that people are grossly misusing the technology. A calculator is only a good tool if you know what to enter into it and how to interpret the output. We teach people that; it's called mathematics class. GPT is the same, but apparently we're not teaching critical thinking and research skills well enough, because large swathes of people are misappropriating its outputs.

I have literally, as recently as this week, seen marketing folk on LinkedIn talking about using a percentage calculator, and people in the comments saying, "just use AI for this, it works." We're seriously at a stage where we need to massively stress the fact that, no, it doesn't always just correctly do what you want it to do, and that's not even something it's designed/intended to do correctly.

In classes where AI does well, we are trying to teach students to apply concepts and methods to new, unseen things by appealing to old, well-studied things. Talking about such well-studied things is GPT's bread and butter, because it learns from the corpus of writing that already exists out there in the world about them. But how well can it extrapolate from all that source material and apply the concepts involved to studying and talking about new things that no one has encountered yet, and how does this compare to a human doing the same?