Newswise — BUFFALO, N.Y. – An innovative chatbot designed for sharing critical information about sexual and reproductive health (SRH) with young people in India is demonstrating how artificial intelligence (AI) applications can engage vulnerable and hard-to-reach population segments.
Working with the Population Foundation of India (PFI), Helen Wang, PhD, an associate professor of communication in the University at Buffalo College of Arts and Sciences, examined the user-centered design and engagement of SnehAI, the first Hinglish (Hindi and English) chatbot purposefully developed for social and behavioral change.
“Many AI technologies today are motivated by profit, but we must also be aware that AI can be leveraged in ways that facilitate social and behavior change,” says Wang, who specializes in entertainment-education and storytelling as instruments for health promotion. “SnehAI is a powerful testimonial of the vital potential that lies in AI for good.”
The findings from Wang’s instrumental case study appear in the Journal of Medical Internet Research.
“Our research team looked exclusively at SnehAI’s effectiveness with regard to SRH, but I’m confident that the chatbot’s design can be easily adapted for different purposes, in different countries. In fact, I know our partners have developed similar applications to address domestic violence in South Africa. The latest version of SnehAI will also include child protection,” says Wang.
The interface of SnehAI (pronounced SNAY-ha eh-eye) is an avatar based on a popular television drama in India. The show presents themes that include gender equality and family planning. After two successful seasons, PFI and a technical partner developed the chatbot to extend the show’s reach from rural and adult populations to urban youth in ways that promoted SRH and advocated for the well-being of women and girls.
About 18% of the world’s population lives in India. That’s 1.4 billion people, about half of whom are under the age of 25. Despite policy commitments and some recent progress, the SRH needs of young people in India remain largely unmet, according to Wang.
“Quality education about SRH is highly limited, contraceptive practices are heavily skewed toward female sterilization, and unsafe abortions are rampant,” she says.
SRH misinformation compounds existing problems, with young people often unaware of contraceptive options and the dangers of sexually transmitted infections. Embarrassed or uncomfortable, young people remain silent, or they take their questions to web-based platforms that are often unreliable.
But Wang says SnehAI provides what’s perceived as a safe space: an intelligent bot that offers a non-judgmental arena where users need not worry about how their questions will be received.
“The avatar is based on the protagonist in the drama, a trusted medical doctor and a champion for health and social issues in society,” says Wang. “It’s not just a random visual representation of a human. That’s important because this avatar is connected to powerful stories seeded with accurate information modeled by positive characters.”
But the chatbot itself always clearly explains to its users, as soon as they launch the app on Facebook Messenger, that there is no human behind the avatar.
“Some people might ask: Who wants to talk about sensitive personal issues with a machine? But we observed the opposite effect,” says Wang. “There was no concern for how someone might react to a question. Chatting with an intelligent being without having to worry about embarrassment, shame and guilt in this case actually helps build confidence and trust, especially when the users are informed about what data may be recorded and their rights to protect personal privacy.”
Over five months, SnehAI interacted with almost 120,000 unique users, most of them young men, exchanging more than 8 million messages. About half of those messages were deeply personal questions and genuine concerns about SRH, shared as text and handled by the chatbot’s natural language processing capacity.
“Overall, SnehAI successfully presented itself as a trusted friend and mentor across 15 areas representing opportunities for engagement and action,” says Wang. “In particular, it fostered curiosity about SRH and a willingness to ask questions. The chatbot’s helplines feature also recorded 71,211 visits, which demonstrates how transmedia storytelling – from a television drama to social media and then an AI chatbot – can inspire users to take action.
“SnehAI is a significant representation of the potential impact of AI technologies for social good.”
Wang’s collaborators for the study include Sneha Gupta, a UB doctoral student in the Department of Communication; Arvind Singhal, PhD, an endowed professor of communication and director of the Social Justice Initiative at the University of Texas at El Paso; Poonam Muttreja and Sanghamitra Singh, of Population Foundation of India; and Poorva Sharma and Alice Piterova, of AI for Good UK.