I talk to the new Facebook Blender Chatbot

This is what a 9-billion-parameter transformer can do. I take a look at FAIR's new paper "Recipes for building an open-domain chatbot" and try out their chatbot live!

Jump to 3:00 to see the chatbot in action.

Paper:
Blog:
Code:

Abstract:
Building open-domain chatbots is a challenging area for machine learning research. While prior work has shown that scaling neural models in the number of parameters and the size of the data they are trained on gives improved results, we show that other ingredients are important for a high-performing chatbot. Good conversation requires a number of skills that an expert conversationalist blends in a seamless way: providing engaging talking points and listening to their partners, and displaying knowledge, empathy and personality appropriately, while maintaining a consistent persona. We show that large scale models can learn these skills when given appropriate training data and choice of generation strategy. We build variants of these recipes with 90M, 2.7B and 9.4B parameter models, and make our models and code publicly available under the collective name Blender. Human evaluations show our best models are superior to existing approaches in multi-turn dialogue in terms of engagingness and humanness measurements. We then discuss the limitations of this work by analyzing failure cases of our models.
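Since the models and code are public, you can try a Blender model yourself. The official release ships through Facebook's ParlAI framework; the sketch below instead assumes the Hugging Face transformers port and its distilled facebook/blenderbot-400M-distill checkpoint (an assumption, and much smaller than the 9.4B model shown in the video), which is easier to run on a laptop:

# Minimal sketch: one conversational turn with a small Blender checkpoint.
# Assumes the Hugging Face transformers port, not the original ParlAI release.
from transformers import BlenderbotTokenizer, BlenderbotForConditionalGeneration

checkpoint = "facebook/blenderbot-400M-distill"  # assumed small public checkpoint
tokenizer = BlenderbotTokenizer.from_pretrained(checkpoint)
model = BlenderbotForConditionalGeneration.from_pretrained(checkpoint)

# Encode a user utterance and generate the bot's reply.
inputs = tokenizer("Hello! What did you do today?", return_tensors="pt")
reply_ids = model.generate(**inputs, max_length=60)
print(tokenizer.batch_decode(reply_ids, skip_special_tokens=True)[0])

Expect the small distilled model to be noticeably less consistent than the large models evaluated in the paper, which credit much of their quality to scale plus the blended-skill training data and generation strategy.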

Authors: Stephen Roller, Emily Dinan, Naman Goyal, Da Ju, Mary Williamson, Yinhan Liu, Jing Xu, Myle Ott, Kurt Shuster, Eric M. Smith, Y-Lan Boureau, Jason Weston

Links:
YouTube:
Twitter:
BitChute:
Minds:
