Recently, Mental Health America hosted a webinar, “Building Equity in Technology,” highlighting Microsoft’s AI for Accessibility program and focusing on the impact technology has on the lives of people living with disabilities or mental health conditions, particularly in Black communities. Speakers included Wendy Chisholm, principal accessibility architect at Microsoft; Dr. Desmond Upton Patton, associate professor of social work and sociology at the Columbia School of Social Work; and Theresa Nguyen, LCSW, chief program officer and vice president of research and innovation at Mental Health America.
AI for Accessibility was created to fund and support the development of AI tools that empower people living with disabilities. The program is one of five under Microsoft’s AI for Good initiative. “All of these programs are working to give resources and support and platforms to people who are doing really good stuff in the world,” Chisholm said.
Inclusive AI for individuals with disabilities
AI for Accessibility announced its five-year plan to help close the “Disability Divide.” “It’s this gap in inclusion for people with disabilities,” Chisholm explained. “Before the pandemic, in the United States, the employment rate for people with disabilities was half that of people without.” The number of people living with disabilities is also growing, adding urgency to the matter.
“The biggest issue is just the lack of data, and the biases we see in the data that we do have,” said Chisholm. “People with disabilities are underemployed, and therefore they are underrepresented in employment data. And then, when you train algorithms on that data, those algorithms are going to be biased.”
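Chisholm’s point about underrepresentation producing biased models can be made concrete with a simple representation check and a standard reweighting step. The sketch below is purely illustrative: the dataset, column names, and weighting scheme are assumptions for the example, not anything from the AI for Accessibility program or its grantees.

```python
# A minimal sketch of checking group representation in training data and
# applying inverse-frequency sample weights as one common mitigation.
# All column names and values here are hypothetical.
import pandas as pd

def representation_report(df: pd.DataFrame, group_col: str) -> pd.Series:
    """Share of each group in the data, to surface underrepresentation."""
    return df[group_col].value_counts(normalize=True)

def inverse_frequency_weights(df: pd.DataFrame, group_col: str) -> pd.Series:
    """Weight each row by the inverse of its group's frequency, so an
    underrepresented group contributes proportionally more during training."""
    freq = df[group_col].value_counts(normalize=True)
    return df[group_col].map(lambda g: 1.0 / freq[g])

# Toy employment dataset (hypothetical): the "disabled" group is only 5% of rows.
df = pd.DataFrame({
    "disability_status": ["none"] * 95 + ["disabled"] * 5,
    "employed": [1] * 80 + [0] * 15 + [1] * 2 + [0] * 3,
})
print(representation_report(df, "disability_status"))
df["sample_weight"] = inverse_frequency_weights(df, "disability_status")
```

Reweighting is only one mitigation; the deeper fix Chisholm describes is building more inclusive datasets in the first place.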
To address this gap, Chisholm and AI for Accessibility look for candidates to fund who will create inclusive datasets and models, especially by and for people with disabilities. Chisholm observed that initiatives and systems built by and for people with disabilities often lead to innovations that benefit everyone, giving the example of OXO Good Grips, kitchen utensils designed for people with arthritis that are easier for everyone to use. “And that’s often what you see in innovations for people with disabilities. Really, it can impact all of us.”
AI can be especially useful as a tool to aid mental health care, keeping patients and their providers connected and informed while countering the stigma people face for seeking care. Related projects supported by AI for Accessibility include an AI designed to recognize and suggest empathy in text messages, an adaptive text-messaging AI intended to reach young adults who might not seek formal mental health care, and a workshop gathering experts from a range of disciplines to discuss and address the impact of AI on Black people seeking mental health support.
Grieving on the Digital Street
Dr. Patton, director of a research group at Columbia University called the SAFElab, spoke next on the intersection of grief, trauma, social media, and artificial intelligence. His team focuses on how Black youth in Chicago use social media to grieve and on how AI can be applied to understand that phenomenon. “We have been using social media as a neighborhood and applying AI to study issues of well-being for the last eight years.”
Because many young people living in areas where they are exposed to violent crime also use social media heavily, online platforms become a unique window into their mental health and well-being, as well as a space that reacts to and spawns more real-life violence. “They are also talking about people that they have issues with on social media. These all become the triggers for online arguments that become offline violence,” Dr. Patton said.
This exchange of online and offline conflict can spiral into lethal tragedy. Dr. Patton shared a case study of Gakirah Barnes, a young woman with thousands of followers who had built an online reputation as a prolific gang hitter, and who was killed by gun violence in front of her home in Chicago. The Twitter feed she left behind offered glimpses of her timeline of trauma and grief, but as Dr. Patton sought to learn from that timeline, he discovered that no existing AI tools could interpret or interact with that dataset.
“We created a methodology called CASM, Contextual Analysis of Social Media, where we focus on centering culture and context and nuance in social media posts,” said Dr. Patton. His team partnered with young people from the community to ensure they were properly interpreting data from posts online. “In any labeling process that we engage in, we wanted to make sure that we were picking up important clues that would perhaps be misunderstood or overlooked in a traditional labeling process.” Dr. Patton and his team then combined CASM with natural language processing to identify patterns of speech and analyze Barnes’ posts for words, acronyms, and emojis representing grief.
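To make the idea of lexicon-assisted, human-in-the-loop labeling concrete, here is a minimal sketch. The grief cues, labels, and sample posts are placeholders invented for illustration; they are not SAFElab’s actual CASM lexicon, data, or code.

```python
# An illustrative sketch of scanning posts for words, acronyms, and emojis
# that community annotators have associated with grief, then flagging matches
# for human review. Every cue and post below is a placeholder.
from dataclasses import dataclass

# Placeholder lexicon; in CASM the cues come from community annotators, not a fixed list.
GRIEF_CUES = {"rip", "miss you", "free my", "💔", "🕊️", "😢"}

@dataclass
class LabeledPost:
    text: str
    cues_found: list[str]
    flagged_for_review: bool  # human annotators make the final call

def label_post(text: str) -> LabeledPost:
    lowered = text.lower()
    found = [cue for cue in GRIEF_CUES if cue in lowered]
    # Flag for human review rather than auto-classifying: context and culture
    # matter, which is the core argument of CASM.
    return LabeledPost(text=text, cues_found=found, flagged_for_review=bool(found))

posts = ["RIP lil bro 💔 miss you every day", "pull up to the gym later"]
for p in posts:
    print(label_post(p))
```

The point of the flag-for-review step is that a keyword match alone cannot capture the culture and context Dr. Patton describes; community annotators remain in the loop.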
“Social media should be considered a place where many people are processing complex trauma and grief over their life,” he said. “We need to think about this as a neighborhood.” Pairing qualitative, clinical, and computational approaches like CASM may enable better support, intervention, and prevention work for young people everywhere.
Co-designed AI and Mental Health
As part of its ongoing effort to make mental health screening ubiquitous and readily accessible to young people, Mental Health America launched MHA Screening in 2014, an online tool people can use to screen themselves. Demand has since proven the need for the tool: 9.8 million screens have been completed, about 2.5 million of them in 2020 alone.
Introducing the results gathered so far, Nguyen noted that the demographics and care status of the tool’s users reflect a population seeking help online. Nearly half self-identified as a racial or ethnic minority, and although 23% scored moderate to severe for all conditions, the majority had never previously received treatment for their mental health.
“They feel high rates of ambivalence about what to do next,” Nguyen said of the tool’s users, “and that makes sense, because when we’re first struggling with a mental health condition, especially ones where you have abnormal experiences — maybe you’re hearing voices, or you have a change of your sense of reality — we know that it just takes a little bit of time for people to make sense of what’s happening to themselves.”
Data from MHA Screening shows that the majority of users are interested in taking self-directed steps to address their mental health. “Fifty percent of the entire population, when asked what they wanted to do, chose DIY tools, chose things that gave them a lot of control,” said Nguyen. Creating tools that let users manage their own care seemed like the best way to reach MHA Screening’s population, but determining which type of tool to build took careful consideration. Apps, for example, struggle to sustain engagement. “App use tends to taper off after two weeks,” said Nguyen.
While the team weighed which AI tool would serve its population best, how it went about deciding and designing that tool was just as important. Nguyen said her team took care to heed the feedback from the screening tool and other data, and noted that several team members live with mental health conditions themselves; their input was crucial to making the AI tool as effective as possible.
“It’s okay if someone asks us to review something after it’s done. The best thing to do is to actually design something from beginning to end,” Nguyen said. “People living with mental health conditions are there when we develop ideas, are there when we design the research itself, and we are designing the product. We are reviewing the findings. We are giving guidance on how to change the product, how to change the research.”
Nguyen emphasized that the continued development process prioritizes co-designing the AI tool with people with mental health conditions. “Something is done with you, and not to you, or at you, which often happens in the mental health community.”
After reviewing feedback and data, Nguyen’s team decided to build the AI tool as a text-messaging bot, a format meant to encourage engagement while leaving control of care management in users’ hands. Among other things, the AI is intended to learn users’ habits and adapt to their preferences so it can engage them as successfully as possible.
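As one way to picture what “learning users’ habits” could look like in a text-messaging tool, here is a minimal sketch that tracks which send times get responses and prefers those going forward. MHA has not described its implementation; everything here is a hypothetical illustration of the general idea, not the team’s actual design.

```python
# Hypothetical sketch: adapt message timing to a user's observed response habits.
from collections import defaultdict
from datetime import datetime

class SendTimePreferences:
    """Learns which hours of the day a user tends to respond in."""

    def __init__(self) -> None:
        self.sent = defaultdict(int)       # hour -> messages sent
        self.responded = defaultdict(int)  # hour -> messages that got a reply

    def record(self, sent_at: datetime, got_reply: bool) -> None:
        self.sent[sent_at.hour] += 1
        if got_reply:
            self.responded[sent_at.hour] += 1

    def best_hour(self, default: int = 18) -> int:
        """Hour with the highest observed response rate; falls back to a default."""
        if not self.sent:
            return default
        return max(self.sent, key=lambda h: self.responded[h] / self.sent[h])

prefs = SendTimePreferences()
prefs.record(datetime(2021, 6, 1, 9), got_reply=False)
prefs.record(datetime(2021, 6, 2, 19), got_reply=True)
print(prefs.best_hour())  # 19, since the evening check-in got a reply
```

A real system would adapt along many more dimensions than timing (tone, frequency, content), but the same principle applies: keep the user in control while the tool quietly adjusts to their preferences.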
Early tests of the tool have shown high rates of engagement, but Nguyen said her team is not finished. “Let’s keep going and test with more and more people. And that’s really the goal, to get this out to as many people as we can.” Recently, the team conducted a test with 5,000 people, with a goal of increasing that number to 10,000 next year.
“I am hopeful that AI and technology in general can really help us build a more equitable society,” said Chisholm. While these projects are still in development, all three speakers expressed optimism about the future of AI tools in reaching and supporting people seeking to manage their mental health, especially racial minorities and others inadequately supported by the health care system.
Learn more about AI for Accessibility at aka.ms/ai4a
Apply for an AI for Accessibility grant: aka.ms/grant