More than 19,000 dialects are spoken in India, which makes it difficult for an AI bot to explain somewhat taboo sexual health topics. The creators of the Myna Bolo project have found a solution.

Category: 🗞 News
Transcript
00:00 The Myna Bolo bot can answer tricky questions about sex and health in India
00:04 But it has to understand cultural nuances and different dialects
00:08 Not only is India one of the most linguistically diverse areas, with, you know, hundreds of languages,
00:13 but within those languages there are so many sub-languages, sub-dialects, slangs.
00:18 Not only slangs within those languages, but slangs that are different within those families.
00:23 So how do we then produce answers in a chatbot that is meant to target women who are maybe not literate in the standardised
00:29 version of our languages?
00:31 Advanced large language models enable the chatbot to understand and engage in multiple languages
00:37 The LLMs used by the Myna Bolo app are trained to connect with users in their native languages like Marathi and even
00:44 Hinglish, Hindi mixed with English words. While people don't say sex, they do say sambandh karna
00:50 But maybe a mom says sambandh karna, or a daughter says relation rakhna.
00:55 So there are different ways of saying the same thing. So creating that nuance, then using slang, so things that are replaced,
01:03 little words that could be replaced by a metaphor that is actually used instead, so having those metaphors compiled.
01:09 A lot of back-end database work is needed so the bot can understand a prompt correctly and doesn't get lost in translation,
01:17 or, even worse, create confusion with incorrect information or
01:22 hallucinations
01:24 This requires intensive groundwork
01:27 We check the answers that the chatbot provides, and if an answer is wrong, then we create a correct version.
01:35 So then we take all the correct questions and the correct answers and we feed them back into the model, and that is actually
01:41 20,000 to 25,000 questions that we are fact-checking and vetting, not only with our staff
01:48 but also with medical professionals and doctors and health professionals who've been in the field for 10 years.
01:53 Hallucinations are the AI issue that, almost across industries,
01:56 we are trying to figure out how to decrease, along with other things like, you know,
02:01 data bias, gender bias, medical bias that comes up in the data because it is in the world.
02:06 That's the reason why there's gender bias in the data: because there's gender bias in the world.
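
The metaphor database the team describes can be pictured as a lookup that rewrites regional and Hinglish variants into one canonical term before a prompt reaches the model. The sketch below is only an illustration of that idea in Python; the SLANG_TO_CANONICAL table and normalize_prompt function are made up for this example and are not Myna Bolo's actual code.

```python
# Hypothetical lookup: each slang or metaphor variant mentioned in the video
# is mapped to the standard phrasing the model handles reliably.
SLANG_TO_CANONICAL = {
    "sambandh karna": "have sex",   # variant a mother might use
    "relation rakhna": "have sex",  # variant a daughter might use
}

def normalize_prompt(prompt: str) -> str:
    """Rewrite known slang/metaphor variants into the canonical term."""
    normalized = prompt.lower()
    for variant, canonical in SLANG_TO_CANONICAL.items():
        normalized = normalized.replace(variant, canonical)
    return normalized

if __name__ == "__main__":
    # Different regional phrasings end up as the same unambiguous question.
    print(normalize_prompt("Is it safe to sambandh karna during pregnancy?"))
```

In practice such a table would be far larger and dialect-aware, but the design point from the video stands: compiling the metaphors up front keeps the chatbot from getting lost in translation.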
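The fact-checking loop quoted above (vet every answer, write a correct version where the bot is wrong, feed the vetted pairs back into the model) could look roughly like the following. Everything here, from the ReviewedAnswer dataclass to the JSON Lines export, is an assumed illustration rather than the project's real pipeline.

```python
# Sketch of a review-and-retrain loop, assuming a simple JSON Lines export
# that a fine-tuning job could later consume.
import json
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewedAnswer:
    question: str
    bot_answer: str
    approved: bool                          # set by staff/medical reviewers
    corrected_answer: Optional[str] = None  # filled in when the bot was wrong

def export_training_pairs(reviews: list, path: str) -> int:
    """Write vetted question/answer pairs as JSON Lines for fine-tuning."""
    count = 0
    with open(path, "w", encoding="utf-8") as f:
        for r in reviews:
            answer = r.bot_answer if r.approved else r.corrected_answer
            if not answer:
                continue  # skip entries that were never corrected
            f.write(json.dumps({"question": r.question, "answer": answer},
                               ensure_ascii=False) + "\n")
            count += 1
    return count

if __name__ == "__main__":
    reviews = [
        ReviewedAnswer("Is it safe to have sex during pregnancy?",
                       "It is usually safe, but ask a doctor if there are complications.",
                       approved=True),
        ReviewedAnswer("Can kissing cause pregnancy?",
                       "Yes.",  # wrong answer caught by reviewers
                       approved=False,
                       corrected_answer="No, kissing cannot cause pregnancy."),
    ]
    print(export_training_pairs(reviews, "vetted_qa.jsonl"), "pairs exported")
```

At the scale mentioned in the video, 20,000 to 25,000 questions, the vetted file becomes the corrected dataset that is fed back into the model.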
