My first response to the news that ChatGPT was being integrated into iOS 18 was to groan. It felt like another nail in the coffin of student writing skills.
I have been grappling with GPT in classes for the last 18 months, so this fall I intentionally wrote an assignment in which the students had to write an essay about a documentary. The questions were tightly tied to the film's specifics. I ran the questions through GPT myself and confirmed it had no details of the actual documentary. This was for a large class of 147 students. Easy assignment. Watch the film. Write a response.
I failed 23% of the class for using GPT. They wrote the same answers, and instead of describing with specifics how "defense attorney Brandy cross-examined the sheriff's deputy," they wrote how Brandy could have done this, how she might have done this, over and over. They showed no evidence of having watched the film. When asked other questions, they gave very generic responses. I failed them on the merits for not answering the questions and commented that their responses were highly consistent with GPT output.
ONLY one student complained about the grade, and she dug her own hole by admitting she didn't watch the film but "went to the internet to get a little help." Oh, and she used GPT to write her email to me. Actually, three students did that, begging for grade improvements.
So, if I am jaded, it's based on my own actual experiences. MY problem with AI is that students don't care and don't even think it's wrong to use it. The assignment is there to teach critical thinking and writing skills, NOT to be copied and pasted from AI. The problem is that it is HARD to prove; the AI responses are not always the same.
Google already has generative AI in Chrome, which you can use in searches. Apple's integration of ChatGPT into Siri isn't surprising, but it makes detection even harder. Are we literally forced to do in-class writing only?
The irony is that I use ChatGPT as a summary tool myself. It is very powerful if you give it text and ask it to extract and summarize information from that text; I can upload a document and query it. BUT it isn't always correct. It occasionally gets things wrong, even when it is supposedly working ONLY with the text you give it.
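For anyone who wants to script that same "answer only from this text" workflow instead of using the app, here is a rough sketch with the OpenAI Python client. To be clear, this is only an illustration of the idea, not what I actually do; the model name, prompt wording, and helper function are my own assumptions.

# Minimal sketch: summarize/query a document strictly from supplied text.
# Assumes the openai package is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

def query_document(document_text: str, question: str) -> str:
    """Ask the model to answer using only the provided document text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice; any chat model works
        messages=[
            {"role": "system",
             "content": "Answer using ONLY the provided document. "
                        "If the document does not contain the answer, say so."},
            {"role": "user",
             "content": f"Document:\n{document_text}\n\nQuestion: {question}"},
        ],
        temperature=0,  # keep the answer as literal as possible
    )
    return response.choices[0].message.content

# Example use:
# print(query_document(open("transcript.txt").read(), "Summarize the cross-examination."))

Even with a constrained prompt like this, the model can still get details wrong, which is exactly the problem I ran into.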
Education is at a turning point, and while some people would say "this is no different than Wikipedia," I'd argue it is different. Yes, Wikipedia gives you an encyclopedia of information that might or might not be factually correct. But AI will rewrite your text. It will write your text for you. How do we deal with it? Is this the educated workforce we want? One that writes with AI? Do we want mindless drones? Maybe.
And I never thought I'd see the birth of Skynet.