Newswise — While debate continues over the pros and cons of artificial intelligence's proliferation across a broad range of applications, one field where it is making undisputed advances is medical research.
While being extremely conscientious about preserving personal medical information, researchers at Arizona State University's School of Biological and Health Systems Engineering are, among other projects, using AI to sort through decades of medical data to identify treatments for neurological disorders, to learn how cancer cells replicate so they can be conquered, and to find solutions that keep seniors safe from life-threatening falls.
Unbinding speech for patients with locked-in syndrome
One key to untethering communication in locked-in syndrome, a neurological disorder that causes speech loss in those who are still cognitively intact, may be using AI to understand how the brain processes language.
Locked-in syndrome is generally caused by damage to the brainstem which, in addition to speech loss, results in quadriplegia and the loss of motor coordination and bodily sensory perception. Patients usually remain aware of their surroundings and retain the ability to hear and move their eyes. But while able to process what's happening in the surrounding environment, the patient is essentially unable to communicate about it.
The Diving Bell and the Butterfly, a 1997 book (and subsequent 2007 film) about Jean-Dominique Bauby, an Elle magazine editor who had the syndrome, brought it to general awareness. A 2023 Netflix mystery, titled simply "Locked In," recently drew more attention to the condition, depicting a caregiver using an "alphabet board" to elicit blinks from the patient as a laborious method of spelling out words.
Bradley Greger, an ASU associate professor of neural engineering, is working to develop tools that can restore communication to patients with locked-in syndrome.
Collaborating with neurosurgeon Andrew Yang, M.D., and others at the Barrow Neurological Institute, where he also serves as an adjunct associate professor, Greger is gathering decades' worth of anonymized clinical data from patients with neurological disorders and using AI to interpret it.
“Neurosurgeons are reluctant to put devices anywhere near the communication part of the brain out of concern that, in addition to already having lost the ability to speak, patients also could lose the ability to understand language,” explained Greger.
"So instead of using neural implants, we are using AI to analyze the data from microelectrodes that rest on the surface of the brain, which provide insight into brain activity both in resting states and during task execution. This technique reveals how the brain processes language as it happens," Greger said.
Once AI can decode how speech is generated within the neural network, Greger envisions that patients will be able to mentally "speak" and have those thoughts translated, again by AI, into actual language.
"We have achieved an initial communication rate of 93 percent accuracy," Greger said. "Out of 100 words, only seven might be wrong. We are aiming for a model based on human data that can decode with high accuracy, resulting in rapid and clear communication."
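Conceptually, that decoding step can be sketched as a pattern-matching exercise. The toy below is purely illustrative, not Greger's actual model: the vocabulary, the eight-channel setup, the noise level and the nearest-centroid decoder are all invented to show how word-level accuracy would be scored.

```python
import random

random.seed(0)

# Hypothetical setup: each word evokes a characteristic activity pattern
# across 8 surface microelectrode channels; decoding matches a noisy
# recording to the nearest stored template (nearest-centroid).
WORDS = ["yes", "no", "water", "help", "pain"]
centroids = {w: [random.gauss(0, 1) for _ in range(8)] for w in WORDS}

def record_trial(word, noise=0.4):
    """Simulate one noisy neural recording of an attempted word."""
    return [x + random.gauss(0, noise) for x in centroids[word]]

def decode(signal):
    """Pick the word whose stored template is closest to the signal."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(WORDS, key=lambda w: dist(signal, centroids[w]))

# Score word-level accuracy over 100 simulated trials.
trials = [(w, record_trial(w)) for w in WORDS for _ in range(20)]
correct = sum(decode(sig) == truth for truth, sig in trials)
accuracy = correct / len(trials)
print(f"decoded {correct}/100 words correctly ({accuracy:.0%})")
```

A 93 percent rate in this framing simply means that, on average, 7 of every 100 decoded words are mismatched to their templates.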
Greger clarifies that researchers don't want locked-in patients to merely think a sentence; they want them to use the same process they would use when saying it aloud.
"It would be like mouthing the words without moving your lips or having an internal conversation," Greger said. "Attempting to say the sentence vs. just thinking it means we can bypass creating a new thought-to-speech channel when converting to AI-assisted speech."
Greger also notes that the brain is extremely active while listening to speech – even in patients who are locked in. "Ultimately, we'll be able to understand not only what the patient is saying, but also to confirm cognitive understanding when the patient is listening to speech," he said.
New non-invasive devices combined with AI technologies are now available to record neural data 24/7 and will provide a “whole new window to the brain,” said Greger.
"We're just beginning to have access to long-term data," he said. "The possibilities will play out on a global scale, with applications to a wide range of movement disorders, including Parkinson's disease and epilepsy."
Managing cancer cells to conquer mesothelioma and glioblastoma
Using artificial intelligence to track the life cycle of cancer cells may be key to managing and treating some forms of cancer.
The cell cycle – the progression in which cells make new copies of themselves – is a fundamental biological process across systems. Quiescent (dormant) cancer cells (QCCs) are merely “hanging out and not replicating,” according to Christopher Plaisier, an ASU bioengineering assistant professor whose primary focus is on glioblastoma and mesothelioma.
“While quiescent, the cells are unresponsive to most treatment modes, like chemotherapy or radiation,” explained Plaisier. “At some point, QCCs may re-enter the cycle and cancerous growth proliferates.”
Identifying when QCCs are likely to remain dormant versus restarting cancerous cell development is key to developing new, more effective cancer treatments.
Using artificial intelligence to gather and process information from existing single-cell cancer studies, the team in the Plaisier Lab has been building a classification database. The work builds on the glioblastoma SYstems Genetic Network ANaLysis (gbmSYGNAL) pipeline developed while Plaisier was a researcher at the Institute for Systems Biology in Seattle.
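The core classification task can be illustrated with a deliberately simple sketch. This is not the gbmSYGNAL pipeline: it labels single cells as cycling or quiescent by averaging the expression of a few proliferation marker genes (MKI67, TOP2A and CCNB1 are real cell-cycle markers, but the expression values and the threshold here are synthetic assumptions).

```python
# Illustrative sketch only -- not the gbmSYGNAL pipeline. A cell is called
# "cycling" when its mean expression of proliferation marker genes clears
# a (hypothetical) threshold; otherwise it is called "quiescent".
PROLIFERATION_MARKERS = ["MKI67", "TOP2A", "CCNB1"]

def classify_cell(expression, threshold=1.0):
    """Label one cell from its gene-expression dictionary."""
    score = sum(expression.get(g, 0.0) for g in PROLIFERATION_MARKERS)
    score /= len(PROLIFERATION_MARKERS)
    return "cycling" if score >= threshold else "quiescent"

# Synthetic cells: one actively dividing, one dormant (a QCC).
dividing = {"MKI67": 2.1, "TOP2A": 1.8, "CCNB1": 1.5}
dormant  = {"MKI67": 0.1, "TOP2A": 0.0, "CCNB1": 0.2}

labels = [classify_cell(dividing), classify_cell(dormant)]
print(labels)  # -> ['cycling', 'quiescent']
```

Run over thousands of cells, labels like these are what let researchers estimate what fraction of a tumor is dormant at any point.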
According to Plaisier, there are several goals for this research.
“First, we’ll be able to see what percentage of quiescent cells typically re-enter the cancer cell multiplication stage and what percentage will remain dormant,” he said. “The more we know how they progress through the stages, the better we’ll know how to manage the disease.
“We also want to evaluate what other cells are likely to become targets.”
As for what happens once researchers identify the triggers between active cancer cell development, quiescence and the resumption of cancerous growth, Plaisier says that's still a subject of debate.
“Do we find ways to push cells back to a quiescent state and take steps to keep them there,” questions Plaisier, “or do we move them forward to cancerous development so they can be treated with radiation and chemotherapy? At this point we don’t know which will net the best results.”
Another AI component of Plaisier's research is biclustering algorithms – a data-mining technique that can be used to analyze gene expression data. The process provides insights for categorizing cancer subtypes based on similar gene expression patterns, and may ultimately pave the way to gene-specific treatments, including drug therapies that target specific genes and the proteins in cells that support cancerous growth.
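The intuition behind biclustering is finding a block of genes and samples that behave alike at the same time. The sketch below is an assumption-laden toy, not Plaisier's algorithm: it plants a block of co-expressed genes and samples in a noisy synthetic matrix, then greedily alternates between keeping high-mean rows and high-mean columns until the selection stabilizes.

```python
import random

random.seed(1)

# Plant a 5-gene x 4-sample co-expression block (value ~3.0) inside a
# 20 x 12 matrix of background noise (mean 0, sd 0.2).
n_genes, n_samples = 20, 12
matrix = [[random.gauss(0, 0.2) + (3.0 if g < 5 and s < 4 else 0.0)
           for s in range(n_samples)] for g in range(n_genes)]

def mean(xs):
    return sum(xs) / len(xs)

# Alternate: keep rows with high means over the current columns, then
# columns with high means over those rows, until nothing changes.
rows, cols = set(range(n_genes)), set(range(n_samples))
for _ in range(10):
    new_rows = {g for g in range(n_genes)
                if mean([matrix[g][s] for s in cols]) > 0.5}
    new_cols = {s for s in range(n_samples)
                if mean([matrix[g][s] for g in new_rows]) > 0.5}
    if (new_rows, new_cols) == (rows, cols):
        break
    rows, cols = new_rows, new_cols

print("bicluster genes:", sorted(rows), "samples:", sorted(cols))
```

In real expression data the recovered gene/sample blocks are the candidate subtype signatures; production methods (and the thresholds they use) are far more sophisticated than this greedy loop.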
“Coupled with MRI data, which helps us identify specific biological features, AI can help us build a model of the gene regulatory network that combines cell status, genetic proclivity and biological features to develop patient-specific treatments,” said Plaisier.
Plaisier’s team collaborates with researchers and practitioners from the Barrow Neurological Institute and the Mayo Clinic Cancer Center in Phoenix and the Fred Hutch Cancer Center in Seattle.
Keeping vulnerable seniors safe from falls
We don’t often think of artificial intelligence as a daily tool for seniors, but research coming from Thurmon Lockhart, an ASU biomedical engineering professor, is helping keep older adults on their feet.
The annual numbers on trips and falls, especially among seniors, are staggering, according to the Centers for Disease Control and Prevention: 36 million older-adult falls are reported each year, resulting in more than 32,000 deaths. More than 95 percent of hip fractures are caused by falls, and at least 300,000 seniors are hospitalized for hip fractures annually.
Lockhart notes that seniors are vulnerable to falling because of a range of conditions, including sarcopenia, a loss of muscle mass; postural hypotension, a drop in blood pressure when rising from a lying or sitting position; and balance challenges associated with diseases that can affect the circulatory system, like diabetes and heart disease. Diminishing vision and some medications also can be factors.
"Traditional fall risk assessments for seniors don't always target specific types of risk, like muscle weakness or gait stability," explained Lockhart. "In addition, there wasn't a tool that could be used at home to gather information about patient vulnerabilities, which could then be used to create activity profiles valuable in preventing falls during activities of daily living."
Inertial measurement units (IMUs) – small, self-contained devices a patient wears across the sternum – have allowed data acquisition beyond the lab, capturing body posture, upper- and lower-extremity movements and other information relevant to fall prediction.
“These profiles can be used generally as related to various conditions, as well as for creating patient-specific risk assessments that clinicians can use to modify patient behavior and activities. For example, if the patient learns to wait a specific amount of time before trying to walk after sitting up in bed, that can be valuable.
"IMUs can be used outside the lab to analyze vulnerabilities in a patient's running, walking, dressing and even eating habits," said Lockhart, whose team has used them, along with a machine learning system, to predict fall risk with more than 82 percent accuracy. (Machine learning is an AI technique that uses mathematical models to let a computer learn from data without direct oversight.)
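The shape of such a system can be sketched in a few lines. This is a hedged illustration of the general idea, not Lockhart's actual model: the two gait features (step-time variability and trunk sway), their value ranges and the nearest-centroid classifier are all invented for the example.

```python
import random

random.seed(2)

# Hypothetical IMU-derived gait features: high-risk subjects tend to show
# more step-time variability (s) and more trunk sway (m/s^2).
def make_subject(high_risk):
    step_var = random.gauss(0.08 if high_risk else 0.03, 0.01)
    sway = random.gauss(1.2 if high_risk else 0.6, 0.15)
    return (step_var, sway), high_risk

train = [make_subject(i % 2 == 0) for i in range(40)]
test = [make_subject(i % 2 == 0) for i in range(40)]

def centroid(points):
    """Mean feature vector of a group of subjects."""
    return tuple(sum(p[i] for p in points) / len(points) for i in range(2))

hi = centroid([f for f, risk in train if risk])       # high-risk profile
lo = centroid([f for f, risk in train if not risk])   # low-risk profile

def predict(features):
    """Flag high risk if the features sit closer to the high-risk profile."""
    d = lambda c: sum((a - b) ** 2 for a, b in zip(features, c))
    return d(hi) < d(lo)

accuracy = sum(predict(f) == risk for f, risk in test) / len(test)
print(f"fall-risk classification accuracy: {accuracy:.0%}")
```

The published 82-plus percent figure comes from far richer feature sets and models; the point here is only that wearable-sensor features feed a learned classifier.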
Lockhart, whose work has earned him the designation as the first Professor of Life in Motion by the Musculoskeletal Orthopedic Research and Education, or MORE, Foundation, focuses on identifying the sensorimotor deficits and neurological disorders related to aging and their impact on fall accidents.
His lab has a wide range of non-invasive equipment, including a treadmill with a harness to prevent falls and a 3D, virtual reality room. These tools are able to measure characteristics in a subject’s gait, including biomechanical, neural, metabolic and cardiovascular variables, and have been valuable for collecting data about what types of medical conditions can trigger falls.
But while these facilities and technologies remain valuable for general fall risk data collection, their limitations have led Lockhart to pursue IMU applications that provide a more universal understanding of the daily challenges seniors face.
Additionally, Lockhart has developed the Lockhart Monitor, a smartphone app that monitors physical activity and can detect fall risks – alerting the user or, in some cases, a caregiver.
Lockhart Monitor, or MyACTome, assesses patient balance and walking stability patterns to quantify biomechanical performance and fall risk. Lockhart has been working in concert with industry partner The CORE Institute.
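One common way to turn raw sensor data into a balance number is a sway statistic. The snippet below is illustrative only, not the Lockhart Monitor's actual metric: it scores postural sway as the root-mean-square deviation of side-to-side acceleration samples from a chest-worn sensor during quiet standing (the sample values are invented).

```python
import math

def rms_sway(samples):
    """RMS deviation of acceleration samples (m/s^2) from their mean --
    larger values indicate a less stable stance."""
    mean = sum(samples) / len(samples)
    return math.sqrt(sum((x - mean) ** 2 for x in samples) / len(samples))

# Invented accelerometer readings for two brief quiet-standing trials.
steady  = [0.02, -0.01, 0.03, 0.00, -0.02, 0.01]   # stable stance
swaying = [0.25, -0.30, 0.40, -0.35, 0.28, -0.22]  # unstable stance

print(f"steady RMS:  {rms_sway(steady):.3f}")
print(f"swaying RMS: {rms_sway(swaying):.3f}")
```

A single number like this, tracked over time, is the kind of quantity an app can report to a clinical team to watch for deteriorating balance.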
"This integration allows for real-time customization of individual patient care pathways, all tracked through a centralized data system and reported to the clinical team, enhancing our ability to improve patient outcomes and target high-risk patients to reduce avoidable injuries," said Lockhart.
See the full article on ASU Now.