This is a stupid cop-out. You can read something that an AI engine spits out and judge whether it's true or not. And even on a technical level, modern AI engines do a lot more than what we traditionally think of as an LLM does. They conduct research, gather data, transform it, process it, and return results based on that. I mean, I told one to take a handwritten table, transform it into an Excel sheet, and give it back to me. It did it more or less perfectly. How can that possibly be construed as just guessing the next word?