Comments

blud_13 t1_jabllga wrote

All Your Base Are Belong to Us - ChatGPT

2

A-Delonix-Regia t1_jabn841 wrote

Why? IMO any content created by ChatGPT should not be taken into account when grading students' assignments, since the student didn't create it in its final form with their own brain. For the same reason, I feel that quotes from other people (unless used as an example rather than to argue in favour of your own point), spellcheckers, and Grammarly should not be used by students for assignments that affect their grades, but can be used to learn how to write better. So there should be no benefit to using such content.

−1

A-Delonix-Regia t1_jabvfaa wrote

Reread my top-level comment. I said it is okay to use while learning. In my opinion, if you use such tools in graded assignments, you are hiding your own lack of knowledge and fooling yourself.

I mean, graded assignments are a flawed concept in education, but there is no real benefit in using tools like AI there, since they do not reflect your own knowledge, unless you use the AI to do research, like getting a quick summary of what nihilism is about, and then rephrase the output in your own words.

−3

slantedangle t1_jabwimo wrote

Why would anyone be allowed to quote ChatGPT in their essay?

What value would a teacher see in a student quoting ChatGPT? How does quoting ChatGPT improve education?

I can possibly see using it to get a summary, for one's own reading comprehension on a topic. But not as a source to quote from for your essay. It's built on top of language models. Essentially, it mimics our writing. Depending on what you feed it, "sometimes good, sometimes like shit."

46

Ok-Lobster-919 t1_jabwyb5 wrote

I don't think anybody fully understands the risks of not learning. I read a study about a reduction in neuroplasticity in people who relied on a GPS to traverse their surroundings, like a weakening of the ability to learn. I wouldn't be that surprised if neuroplastic changes continued with the failure to learn spelling. Is spelling not just mapping words to their correct spellings in the brain?

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6020662/

Here's a study, there could be a lot more work done but I think it's interesting.

Have people thought of the implications of replacing more and more spatial logic centers of the brain with tools like ChatGPT? Fun stuff to think about! Maybe in the future people will be classified as "learned knowledge classically" and "has ChatGPT".

−2

slantedangle t1_jabx6yq wrote

>Because you won't have a spell checker in your pocket at all times in the real life...

You also have a calculator in your pocket at all times in real life. What's your opinion on whether or not students should learn math?

1

C0rn3j t1_jabxicf wrote

If you're dealing with grammar, 99.9% of the time you're doing it in the digital world and can easily fix it with assistance, with no drawbacks.

If you're in a casino wondering what the chance of rolling snake eyes is, you're unlikely to be pulling out your phone, so I'd say basic math is much more important than perfect grammar without assistance.

0
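
As a quick aside, the snake-eyes probability mentioned above is easy to check by brute force. The following is a minimal, illustrative Python sketch (not something from this thread); it simply enumerates the outcomes of two dice:

```python
from itertools import product

# Enumerate all 36 equally likely outcomes of rolling two six-sided dice.
outcomes = list(product(range(1, 7), repeat=2))

# Snake eyes means both dice show 1; only one outcome qualifies.
snake_eyes = [roll for roll in outcomes if roll == (1, 1)]

probability = len(snake_eyes) / len(outcomes)
print(f"P(snake eyes) = {len(snake_eyes)}/{len(outcomes)} ≈ {probability:.3f}")  # 1/36 ≈ 0.028
```

Only 1 of the 36 equally likely outcomes is (1, 1), so the probability is 1/36, roughly 2.8%.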

goteamnick t1_jabxpfj wrote

I mean, sure, you're allowed to. But I doubt whoever is marking it is going to be impressed.

13

slantedangle t1_jabxyi0 wrote

>If you're dealing with grammar, 99.9% of the time you're doing it in the digital world and can easily fix it with assistance, with no drawbacks.

>If you're in a casino wondering what the chance of rolling snake eyes is, you're unlikely to be pulling out your phone, so I'd say basic math is much more important than perfect grammar without assistance.

So you think students should learn math in case they want to gamble? That's your best argument?

1

despitegirls t1_jaby6w1 wrote

I'm trying to understand that myself. Perhaps if you used it to summarize work that you created? I can't see trusting it as a source of information, since it doesn't provide sources for where it learned that information, at least by default. This is something that Microsoft's implementation in Bing actually does.

15

bigfatmatt01 t1_jac05jz wrote

Spellcheck and grammar check both do the work that an editor would do in the proofreading step. When we were in school, that step was done by another student, to teach how editing works. I have no problem with those tools. ChatGPT replaces the process of turning ideas and thoughts into written words. That I have an issue with.

1

AbstractEngima t1_jac06su wrote

How is this even possible? Anyone with a brain knows that ChatGPT is nothing more than an unreliable narrator that pulls random bits of information and mashes them together into something inaccurate.

It's already basically following the same process as any other AI: taking little bits of existing information and putting them together based on patterns, rather than any actual understanding of the source material.

8

slantedangle t1_jac0yi1 wrote

>Perhaps if you used it to summarize work that you created?

I would want my students to learn and practice how to summarize work on their own.

The only good reason I can think of would be in the context of mass summaries. ChatGPT would be good at creating many summaries all at once; it's scalable. It could be useful as an experimental tool or to show examples and patterns. But I can't see any justifiable uses for students in a typical classroom, and certainly not for submitting work on behalf of the student instead of the student writing it themselves.

> I can't see trusting it as a source of information, since it doesn't provide sources for where it learned that information, at least by default.

I wouldn't trust it at all. It's not just the source information. Even if it pulled from good sources, it doesn't perform any comprehension, logic, or reasoning about the content. The way it works is through a language model: it arranges words together much like a glorified autocomplete does. It doesn't check whether what it wrote is coherent or correct.

11
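
To make the "glorified autocomplete" framing above concrete, here is a toy Python sketch of a bigram-style next-word predictor. It is only an illustration of pattern-based word chaining; ChatGPT itself is a far larger neural network, and the corpus, function name, and output here are invented for the example:

```python
from collections import Counter, defaultdict

# Toy illustration of the "glorified autocomplete" idea: learn which word
# most often follows each word in some text, then chain those choices.
corpus = (
    "the student wrote the essay . "
    "the student quoted the source . "
    "the teacher graded the essay ."
).split()

# Count how often each word follows each other word (a bigram table).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def autocomplete(word, length=6):
    """Greedily append the statistically most likely next word."""
    words = [word]
    for _ in range(length):
        if word not in following:
            break
        word = following[word].most_common(1)[0][0]
        words.append(word)
    return " ".join(words)

print(autocomplete("the"))  # "the student wrote the student wrote the"
```

Even this toy never checks whether its output is coherent or true; it only follows the statistics of the text it was fed, which is the point being made above.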

ixid t1_jac6dlg wrote

Because it often provides a well-written and clear summary of topics that are widely discussed on the internet, which would include most school-level topics. So ultimately the student is more likely to understand and retain the knowledge if they read it and think about it.

0

QueenOfQuok t1_jac8gbk wrote

Why would you want to? ChatGPT essays read like a 6th-grader's book report when they've barely done the reading.

1

slantedangle t1_jac8uj2 wrote

Even if you used ChatGPT for READING comprehension, you wouldn't want them to quote it when submitting an essay. You would always want them to quote the source in homework or a test or a thesis, something the student WROTE. Hopefully the context wasn't lost on you.

4

ixid t1_jac9l1l wrote

We're talking about pre-university-level education; talking about quoting sources is rather grandiose. Most kids just look at the textbook and Wikipedia. ChatGPT is not even lowering the level. Hopefully the context wasn't lost on you.

−1

slantedangle t1_jacarc1 wrote

>We're talking about pre-university-level education; talking about quoting sources is rather grandiose.

And yet that is precisely what we are talking about here, isn't it?

>Most kids just look at the textbook and Wikipedia. ChatGPT is not even lowering the level. Hopefully the context wasn't lost on you.

Then they already have a source, no? What is the point of quoting ChatGPT rather than the source? I see the point of READING ChatGPT, not QUOTING it. Apparently the context was indeed lost on you.

5

Badtrainwreck t1_jacceq3 wrote

Not all writing is about the actual writing part, but mostly about comprehension, essay format, and proper citations. It's like math: some might question why you'd use a calculator, but it's not about working the math out by hand, it's about learning how to solve the problems.

1

gurenkagurenda t1_jacebvc wrote

I don’t know how you want to define “understanding” when talking about a non-sentient LLM, but in my experiments, ChatGPT consistently gets reading comprehension questions from SAT practice tests right, and it’s well known that it has passed multiple professional exams. It’s nowhere close to infallible, but you’re also underselling what it does.

4

E_Snap t1_jaciiyu wrote

Haha, this is your generation’s “The internet isn’t a real source! You have to find it in a book!”

Don’t worry, teachers and professors eventually realized that they were being curmudgeonly idiots and decided that everything but Wikipedia was a fine and dandy source.

For all of the classes they have to take to get their credentials, you’d think one of them would be something along the lines of “How to stay on the right side of history”.

−2

Mi-mus t1_jackqm9 wrote

A little unrelated to the topic. But anyone else got this uneasy feeling chatGPT might become some kind of oracle for society at large? In the same way google operates now but completely binary. There is obviously a back door to this thing… so what would be stopping third party influence for generation of desired answers? Peoples opinions on a variety of topics could be nudged so easily and on the cheap.

4

Hogo-Nano t1_jackrez wrote

If I were still in school, I'd just generate the essay and then spend like 5-10 minutes editing it into my own words. Kids aren't stupid and will probably do the same thing. Also, now that the 'CHATGPT DETECTOR' is open source, I imagine they can just plug in their essay and see if it gets flagged before submitting it to a teacher.

If you want to test a kid on essay skills, have them read something before class and write an essay on it IN CLASS with pencil and paper. Otherwise you can't complain when kids cheat.

1

slantedangle t1_jacw16h wrote

>Not all writing is about the actual writing part, but mostly about comprehension, essay format, and proper citations.

Writing an ESSAY, which I believe is the context here, is an exercise in sourcing, reading comprehension, critical thinking, grammar, spelling, sentence structure, and the writing process, among other things.

>It's like math: some might question why you'd use a calculator, but it's not about working the math out by hand, it's about learning how to solve the problems.

We work out math problems by hand in order to learn the operations and sequences, exercise the computation, practice writing the symbols, and translate problems into rigorous form, among other things.

Admittedly, it's not about hand-eye coordination and fine motor skills. I will presume we are not talking about that.

Quoting ChatGPT would arguably, to a greater or lesser degree depending on the nature of the essay or curriculum, circumvent many of these skills, which are learned in conjunction by writing essays (and doing math).

2

TeaKingMac t1_jacxmf5 wrote

>How does quoting ChatGPT improve education?

Yeah, ChatGPT is like a tertiary source, or worse. If Wikipedia is an unacceptable source, ChatGPT is at least an order of magnitude worse.

11

gurenkagurenda t1_jacxqtz wrote

Yes. A little while back, I had someone use a Computerphile video showing ChatGPT missing college-level physics questions as proof that ChatGPT is incapable of comprehension. The bar at this point has been set so high that apparently only a small minority of humans are capable of understanding.

1

slantedangle t1_jacy34k wrote

>It's a pity schools can't teach people not to be bellends. You'd have benefited.

It's a pity schools don't teach people to just stop, or say "I don't know", when they can't answer a question, instead of relying on ad hominems to end their conversations. You'd have benefited.

2

ShawnyMcKnight t1_jad4dtd wrote

Also, when you cite the source, that doesn't mean you can just use the source word for word.

0

TeaKingMac t1_jad6gd4 wrote

>it’s well known that it has passed multiple professional exams.

Well yeah. There are very clearly defined correct answers on professional exams.

When a student is writing an essay, the primary objective is creating and defending an argument. Abdicating that responsibility to ChatGPT circumvents the entire point of the assignment.

3

Any_Protection_8 t1_jad900z wrote

ChatGPT makes a lot of mistakes, honestly. I work with it, and I have to cross-check everything it says, because it blatantly tries to serve up an answer that pleases, but that is not necessarily correct.

2

TeaKingMac t1_jadgiv6 wrote

"quoting" ChatGPT as a source is also stupid, because it's neither a primary (best) source, or even a secondary source, like a newspaper article.

It's just a random assortment of (mostly correct) information. That's the same reason why academia doesn't currently allow Wikipedia as a source for information.

1

DavidBrooker t1_jadh8w2 wrote

What does the value of quoting a chatbot, or the improvement afforded by quoting a chatbot, have to do with what should be allowed? What is and is not allowed is an ethical issue, not one of pedagogy. Being lazy and unoriginal is already allowed; this is just a new means of being lazy and unoriginal, so long as it isn't also done unethically.

A teacher is still free to give an F to someone who writes an essay full of block quotes from ChatGPT. They're just not obligated to give an F and recommend disciplinary action.

2

Swamptor t1_jadscvd wrote

What if you were writing an essay about ChatGPT, or the history of chatbots, or something like that? Of course you can quote it. Saying you can't quote it would be categorically insane. You can quote literally anything.

1

SidewaysFancyPrance t1_jadyt2z wrote

ChatGPT is at least one step removed from the actual source material, and ChatGPT isn't trying to be "right." You should just bypass ChatGPT and go to the actual source material instead of asking a language AI to try to summarize it for you, knowing that it will often confidently present you with wrong information.

4

slantedangle t1_jae3fp3 wrote

You can certainly use insults to deliver an ad hominem, but there are other ways, such as attacking someone's character, reputation, or motives. "Ad hominem" describes a strategy in which the person using it focuses on the person making an argument rather than the content of the argument.

1

soovestho t1_jae57fn wrote

OK, but AI is an amalgamation of sources. How does one just cite "ChatGPT"?

3

Odysseyan t1_jaedn2l wrote

The better question is HOW do you cite it? It doesn't save your conversations, and you can't link to them for review. Can I just state whatever I want and say that it was ChatGPT?

"Hitler did nothing wrong" - ChatGPT

2

TeaKingMac t1_jaepy4k wrote

> it is not a primary source

AND NEITHER IS ChatGPT

No original information comes from ChatGPT. It is just a repository.

That's my point.

>it's neither a primary (best) source nor even a secondary source, like a newspaper article.

> It's just a random assortment of (mostly correct) information. That's the same reason why academia doesn't currently allow Wikipedia as a source of information.

0