AI beats experts in predicting future quality of ‘mini-organs’
Organoids—tiny lab-grown tissues that mimic organ function and structure—are transforming biomedical research. They promise breakthroughs in personalized transplants, improved models of diseases such as Alzheimer’s and cancer, and a more accurate understanding of how drugs work.
Now, researchers at Kyushu University and Nagoya University in Japan have developed a model that uses artificial intelligence (AI) to predict early organoid development. This model is faster and more accurate than expert researchers, making organoid culture more efficient and cost-effective.
In the study, published in Communications Biology on December 6, 2024, researchers focused on predicting the development of hypothalamic-pituitary organoids. These organoids mimic the functions of the pituitary gland, including the production of adrenocorticotropic hormone (ACTH): an important hormone that regulates stress, metabolism, blood pressure and inflammation. A lack of ACTH can lead to fatigue, anorexia and other potentially life-threatening problems.
“In our laboratory, our studies in mice have shown that transplantation of hypothalamic-pituitary organoids has the potential to treat ACTH deficiency in humans,” said corresponding author Hidetaka Suga, associate professor at Nagoya University Graduate School of Medicine.
However, a key challenge for researchers is determining whether the organoids are developing normally. Organoids are derived from stem cells suspended in fluid and are sensitive to small environmental changes, which can cause variation in their development and final quality.
The researchers found that a sign of good progress is the widespread expression of a protein called RAX in early developmental stages, which often leads to the subsequent generation of organoids with strong ACTH secretion.
“We can track development by genetically modifying the organoids so that the RAX protein fluoresces,” Suga said. “However, organoids for clinical use, such as transplantation, cannot be genetically modified to emit fluorescence. Therefore, our researchers must judge based on what they see with their eyes: a time-consuming and inaccurate process.”
So Suga and his colleagues in Nagoya worked with Hirohiko Niioka, a professor in the Data-Driven Innovation Program at Kyushu University, to train deep learning models for this work.
“Deep learning models are a type of artificial intelligence that mimic the way the human brain processes information, allowing them to analyze and classify large amounts of data by identifying patterns,” Niioka explained.
The Nagoya researchers took fluorescence and brightfield images (which show what the organoids look like under normal white light, without any fluorescence) of organoids carrying the fluorescent RAX protein at 30 days of development. Using the fluorescence images as a guide, they classified 1,500 brightfield images into three quality categories: A (wide RAX expression, high quality), B (medium RAX expression, medium quality) and C (narrow RAX expression, low quality).
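The article does not say how the fluorescence images were converted into labels; the sketch below is only a rough illustration of one plausible approach, in which each fluorescence image is scored by the fraction of pixels above a brightness cutoff and the paired brightfield image is binned into A, B or C. The thresholds, file layout and libraries used here are all assumptions, not the authors' published pipeline.

```python
# Hypothetical labeling sketch: bin paired brightfield images into quality
# categories A/B/C by how widely RAX fluorescence is expressed.
# Thresholds, file layout and library choices are assumptions.
from pathlib import Path
import numpy as np
from PIL import Image

BRIGHT_THRESHOLD = 50      # pixel intensity treated as "fluorescent" (assumed)
WIDE, MEDIUM = 0.5, 0.2    # assumed area-coverage cutoffs for categories A and B

def rax_coverage(fluorescence_path: Path) -> float:
    """Fraction of the image area showing RAX fluorescence."""
    img = np.asarray(Image.open(fluorescence_path).convert("L"), dtype=np.float32)
    return float((img > BRIGHT_THRESHOLD).mean())

def quality_category(coverage: float) -> str:
    if coverage >= WIDE:
        return "A"   # wide RAX expression, high quality
    if coverage >= MEDIUM:
        return "B"   # medium RAX expression, medium quality
    return "C"       # narrow RAX expression, low quality

# Pair each brightfield image with its fluorescence counterpart and label it.
labels = {}
for fluo_path in Path("fluorescence").glob("*.png"):
    bright_path = Path("brightfield") / fluo_path.name
    labels[bright_path.name] = quality_category(rax_coverage(fluo_path))
```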
Niioka then trained two advanced deep learning models, EfficientNetV2-S and Vision Transformer (both developed by Google for image recognition), to predict the organoids' quality categories, using 1,200 brightfield images (400 per category) as the training set.
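The press release names the two architectures but not the training code. A minimal fine-tuning sketch in PyTorch is shown below; the timm model names, the ImageFolder layout with A/B/C subdirectories, and the hyperparameters are all assumptions rather than the authors' settings.

```python
# Minimal fine-tuning sketch (not the authors' code): train EfficientNetV2-S
# and a Vision Transformer to classify brightfield images into A/B/C.
import timm
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
# Expects train/A, train/B, train/C with 400 images each (1,200 total).
train_set = datasets.ImageFolder("train", transform=transform)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

def fine_tune(model_name: str, epochs: int = 10) -> nn.Module:
    """Fine-tune a pretrained backbone with a 3-class output head."""
    model = timm.create_model(model_name, pretrained=True, num_classes=3).to(device)
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, targets in train_loader:
            images, targets = images.to(device), targets.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), targets)
            loss.backward()
            optimizer.step()
    return model

effnet = fine_tune("tf_efficientnetv2_s")
vit = fine_tune("vit_base_patch16_224")
```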
After training, Niioka combined the two deep learning models into an ensemble model to further improve performance. The team used the remaining 300 images (100 per category) to test the optimized ensemble model, which classified the brightfield images of organoids with 70% accuracy.
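How the two networks were combined is not specified in the article. One common ensembling approach, shown here purely as an assumption and continuing the names from the training sketch above (effnet, vit, transform, device), is to average the two models' softmax probabilities and take the highest-scoring class, then measure accuracy on the 300 held-out images.

```python
# Ensemble sketch (assumed approach): average the softmax outputs of the two
# fine-tuned models and score accuracy on the 300 held-out test images.
import torch
from torch.utils.data import DataLoader
from torchvision import datasets

test_set = datasets.ImageFolder("test", transform=transform)  # test/A, test/B, test/C
test_loader = DataLoader(test_set, batch_size=16, shuffle=False)

@torch.no_grad()
def ensemble_accuracy(models, loader) -> float:
    for m in models:
        m.eval()
    correct = total = 0
    for images, targets in loader:
        images, targets = images.to(device), targets.to(device)
        # Average class probabilities across the models, then pick the argmax.
        probs = torch.stack([m(images).softmax(dim=1) for m in models]).mean(dim=0)
        correct += (probs.argmax(dim=1) == targets).sum().item()
        total += targets.numel()
    return correct / total

print(f"Ensemble accuracy: {ensemble_accuracy([effnet, vit], test_loader):.2%}")
```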
In comparison, researchers with years of experience growing organoids were less than 60% accurate when predicting the categories of the same brightfield images.
“Deep learning models outperform experts in every aspect: accuracy, sensitivity and speed,” Niioka said.
The next step was to examine whether the ensemble model could also correctly classify brightfield images of organoids that had not been genetically modified to make RAX fluoresce.
The researchers tested the trained ensemble model on brightfield images of hypothalamic-pituitary organoids without the fluorescent RAX protein at 30 days of development. Using staining techniques, they confirmed that organoids the model classified as A (high quality) indeed showed high RAX expression at 30 days; these organoids went on to exhibit high ACTH secretion as they were cultured further. Meanwhile, RAX and, later, ACTH levels were lower in organoids the model classified as C (low quality).
“As a result, our model can predict the final quality of organoids at an early stage of development based solely on visual appearance,” Niioka said. “To our knowledge, this is the first time in the world that deep learning has been used to predict the future of organoid development.”
Going forward, the researchers plan to improve the accuracy of the deep learning model by training on larger data sets. But even at its current level of accuracy, the model has profound implications for current organoid research.
“We can quickly and easily select high-quality organoids for transplantation and disease modeling, and reduce time and cost by identifying and removing poorly developed organoids,” Suga concluded. “This is a game changer.”