Bad things have already happened because people don't understand that LLMs do not generate true or reliable information. Like you said, they generate "convincing gibberish." This will only get worse given all the unrealistic hype about what they can do.
I'm reminded of the couple who followed an early GPS program up a fire road into the mountains and froze to death.
I feel very lucky that I retired from teaching college English just before this software became available and thus I never had to figure out how to adapt to it.