ChatGPT can teach you how to make drugs or manipulate the 2024 U.S. presidential election if you know how to ask properly
teach you how to make drugs
My guess: This will be like all the other content it spits out.
It will sound convincing but the chance of it being accurate is very low.
LLMs, to use an old Aussie term, are bullshit artists. They are great at creating convincing narratives, but truth or the lack of it is not a concern.
I agree. Worse, those who want to make money selling AI are telling tales of a glorious future, exaggerating AI as if it were a prophet of the Untold Truth.
It should be illegal to call this shit AI; it’s blatantly misleading. These are LLMs, sophisticated chatbots, but there is no actual intelligence at play here.
As the other guy said, all they are really good at is bullshitting the user. Since they are not actually intelligent in any way, they cannot determine whether what they are saying is the truth, or even close to the truth.
I’ve written several Python scripts to extend the capabilities of a text-to-speech AI using an offline Llama2 70B model. It takes enthusiast-level hardware to run. The few errors it makes in code snippets it can also correct if you paste in the terminal error message and prompt it again.
I don’t write Python and haven’t had to go online to look up anything about Python syntax, yet I have a script that can take any text and convert it to speech, convert any WAV file into text, or concatenate any WAV files.
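For what it’s worth, the WAV concatenation part doesn’t even need a third-party library. A minimal sketch of what such a helper might look like, using only Python’s standard-library `wave` module (this is my own illustration, not the poster’s actual script; `concat_wavs` is a hypothetical name):

```python
import wave

def concat_wavs(paths, out_path):
    """Concatenate WAV files that share the same channel count,
    sample width, and frame rate, writing the result to out_path."""
    params = None
    frames = []
    for p in paths:
        with wave.open(p, "rb") as w:
            if params is None:
                params = w.getparams()
            elif w.getparams()[:3] != params[:3]:
                # nchannels / sampwidth / framerate must match to splice raw frames
                raise ValueError(f"format mismatch in {p}")
            frames.append(w.readframes(w.getnframes()))
    with wave.open(out_path, "wb") as out:
        out.setparams(params)  # header is patched with the real frame count on close
        for f in frames:
            out.writeframes(f)
```

The text-to-speech and speech-to-text pieces need actual models or engines, but this shows why glue code like this is well within what an LLM can generate reliably.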
It takes larger models and some testing to find a good combination for a task. It also helps to use a model that isn’t subject to public spotlight and political pressure like GPT or Bard. The tiny models, like the 7Bs and 13Bs, are like talking to children or teenagers.
“Forbidden knowledge” my ass. You can just Google these things.