How is it possible that ChatGPT makes so many incorrect statements and spits out wrong facts? We explain why ChatGPT isn't a truth oracle.
► Sponsor: Arize AI. Sign up for Arize:
Free industry certification in ML observability:
Learn about embedding drift from Arize research:
Check out our daily #MachineLearning Quiz Questions:
ChatGPT blog:
Evaluating the Factual Consistency of Large Language Models Through Summarization:
WebGPT: Improving the Factual Accuracy of Language Models through Web Browsing:
Behavior cloning is miscalibrated:
ChatGPT vs. Sparrow:
Transformers as Algorithms: Generalization and Stability in In-context Learning:
Do Prompt-Based Models Really Understand the Meaning of their Prompts?:
Thanks to our Patrons who support us in Tier 2, 3, 4:
Dres. Trost GbR, Siltax, Edvard Grødem, Vignesh Valliappan, Mutual Information, Mike Ton
Outline:
00:00 ChatGPT spits out wrong facts
02:15 Arize AI (Sponsor)
03:40 How does ChatGPT / a language model work?
05:53 Why ChatGPT generates nonsense
06:38 Confidence and clarifications
07:21 Limits of behavioral cloning
09:04 Phrasing
09:21 Jail breaks
09:45 Is ChatGPT even usable?
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Optionally, pay us a coffee to help with our Coffee Bean production!
Patreon:
Ko-fi:
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Links:
AICoffeeBreakQuiz:
Twitter:
Reddit:
YouTube:
#AICoffeeBreak #MsCoffeeBean #MachineLearning #AI #research
Music: Intentions – Anno Domini Beats
Video editing: Nils Trost
It’s always fun to see these new chatbots say something silly, but it’s even nicer to have an idea of why it happens. Thanks for the video!
PS: I found the secret.
Well, there’s no way to say which is the fourth child, since you don’t specify the order in which they were born.
Maybe that’s the correct answer. ChatGPT was right all along!
Haha, love this!
Well, you really had me at 5:13!
Hey! It’s been a long time since I stopped by the comment section.
Thanks for the video!
Hello! We missed you. Thanks for saying hi.
The mushroom background pic!
Just one possible explanation for why ChatGPT might be hallucinating.