
Facebook is trying to teach chatbots how to chit-chat

Despite the death of its personal AI assistant M, Facebook hasn’t given up on chatbots just yet. Over the past couple of years, it’s slowly improved what its artificial agents can do, but their latest challenge is something that can confound even the smartest human: making small talk.

You’d be forgiven for thinking otherwise because of their name, but chatbots can’t really chat. As researchers from Facebook’s FAIR lab explain in a pre-print paper published this week, they fail at this task on a number of levels. First, they don’t display a “consistent personality,” meaning they fail to stick to the same set of facts about themselves throughout a conversation; second, they don’t remember what they or their conversational partners have said in the past; and third, when faced with a question they don’t understand, they tend to fall back on diversionary or preprogrammed responses, like “I don’t know.”

Even with these constraints, chatbots can be engaging. (See, for example, the famous ELIZA bot from the 1960s, which acted as a rudimentary therapist by relying on stock phrases like “How do you feel right now?”) But the goal now is not just interrogation but conversation, and to recreate that more natural back-and-forth, researchers have turned to deep learning. This means that instead of mapping out preprogrammed questions and answers, chatbots learn by looking for patterns in large datasets.
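To make the contrast concrete, here is a minimal sketch of the older, rule-based approach the article alludes to: a tiny ELIZA-style responder that matches keywords against hand-written patterns and otherwise falls back on stock phrases. The rules, replies, and function names are invented for illustration; they are not taken from ELIZA’s actual script or from any Facebook system.

```python
import random
import re

# Hand-written keyword rules in the spirit of 1960s ELIZA.
# Patterns and canned replies are illustrative examples only.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bI am (.+)", re.IGNORECASE),
     ["Why do you say you are {0}?"]),
]

# Diversionary stock phrases used when nothing matches --
# the same kind of fallback behaviour the researchers criticise.
FALLBACKS = ["How do you feel right now?", "Please tell me more.", "I don't know."]


def respond(message: str) -> str:
    """Return a reply by keyword matching, or a stock phrase if nothing matches."""
    for pattern, templates in RULES:
        match = pattern.search(message)
        if match:
            return random.choice(templates).format(match.group(1).rstrip(".!?"))
    return random.choice(FALLBACKS)


if __name__ == "__main__":
    print(respond("I feel a bit tired today"))  # triggers a hand-written rule
    print(respond("Tell me about steak"))       # no rule matches, falls back
```

A deep-learning chatbot replaces these hand-written rules with a model that learns response patterns from large corpora of example conversations, which is where the choice of training data starts to matter.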

So far this approach has produced some good results, but one challenge now, say Facebook’s researchers, is getting the right data to begin with. Many contemporary chatbots are trained on dialogue taken from movie scripts, for example. And this means that when they’re having meandering conversations (rather than directed dialogue for tasks like customer service) they tend to come up with some odd non sequiturs. See below, where Person 1 is a real human, and Person 2 is a chatbot trained on movie subtitles:

[PERSON 1:] Oh yeah? well I enjoy a nice steak. how about you?
[PERSON 2:] Father always used to say that if we ever had the money you have… we’d have steak and ice cream three times every day!
[PERSON 1:] I eat all the steak and chicken too, even bacon
[PERSON 2:] Father always used to say that if we ever had the money you have… we’d have steak and ice cream three times every day!
[PERSON 1:] yeah you said that already
[PERSON 2:] Dynamic structure can absorb shock.
[PERSON 1:] yes it can. I know about that as a bodybuilder
[PERSON 2:] Ladies and gentlemen… the strongest man in the world is blowing up a hot water bottle.

Not great, even if you are into strongmen.

To try to fix this, Facebook’s engineers have built their own dataset to train chatbots on. It’s called Persona-Chat, and it consists of more than 160,000 lines of dialogue sourced from workers found on Amazon’s Mechanical Turk marketplace (the go-to resource for the human-generated data used to train AI).
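The idea behind the dataset is that each crowd-sourced conversation is paired with short persona descriptions for the speakers, so a model can be trained to keep its replies consistent with a fixed set of facts about itself. As a rough, purely illustrative sketch (the field names and sentences below are invented for illustration, not copied from the actual Persona-Chat release), a single training example might be represented like this:

```python
# Illustrative sketch of a Persona-Chat-style training example.
# Field names and sentences are invented; they are not drawn from
# the real Persona-Chat data release.
example = {
    "persona_a": [
        "i work as a bodybuilder .",
        "i enjoy a nice steak .",
    ],
    "persona_b": [
        "i have two dogs .",
        "i like to ski on weekends .",
    ],
    "dialogue": [
        ("A", "hi ! what do you do for fun ?"),
        ("B", "i mostly ski , usually with my dogs along . you ?"),
        ("A", "i lift weights , and i love a good steak after ."),
    ],
}

# A model trained on data like this is rewarded for replies that stay
# consistent with its assigned persona across the whole conversation,
# rather than drifting into movie-script non sequiturs.
for speaker, utterance in example["dialogue"]:
    print(f"{speaker}: {utterance}")
```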

Source: The Verge
