Multimodal Translation

Diseases show up in different ways depending on how we look at them. For example, RNA sequences tell us what genes are active inside cells, while medical images show us what tissues and organs look like from the outside. Each view gives us valuable information, but today they usually live in separate worlds.

Our project aims to connect these worlds by teaching AI to translate between RNA and medical images. We want to see if patterns in gene activity can predict what shows up in a scan, and if features in a scan can hint at what's happening at the molecular level. If successful, this could make it possible to get genetic-level insights from a simple image — without needing invasive tests — and help researchers understand how changes in our genes actually appear in the body. This kind of translation could lead to earlier diagnoses, more precise treatments, and a deeper understanding of how diseases work.
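To make the idea concrete, the sketch below is a simplified illustration, not the project's actual architecture: it pairs an RNA expression vector with an image-derived feature vector, encodes both into a shared latent space, and trains a small decoder to predict the imaging features from the RNA latent. All dimensions, module names (Encoder, decoder_to_img), and losses are hypothetical placeholders.

```python
# Minimal cross-modal translation sketch (assumed setup, not the project's model):
# align paired RNA profiles and image features in a shared latent space, then
# "translate" from the RNA side to the imaging side.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Small MLP projecting one modality into a shared latent space."""
    def __init__(self, in_dim, latent_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

# Hypothetical dimensions: 2000 genes per sample, 128 imaging features per scan.
rna_encoder = Encoder(in_dim=2000)
img_encoder = Encoder(in_dim=128)
decoder_to_img = nn.Linear(64, 128)  # predict imaging features from the RNA latent

opt = torch.optim.Adam(
    list(rna_encoder.parameters())
    + list(img_encoder.parameters())
    + list(decoder_to_img.parameters()),
    lr=1e-3,
)

# Toy paired batch standing in for real matched RNA/imaging data.
rna = torch.randn(32, 2000)
img = torch.randn(32, 128)

z_rna, z_img = rna_encoder(rna), img_encoder(img)
align_loss = nn.functional.mse_loss(z_rna, z_img)                # pull paired latents together
recon_loss = nn.functional.mse_loss(decoder_to_img(z_rna), img)  # RNA -> image-feature translation
loss = align_loss + recon_loss
loss.backward()
opt.step()
```

The same pattern runs in the other direction by adding a decoder from the image latent back to gene expression, which is what would let features in a scan hint at molecular activity.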

Publications
A. Singh, M. Fromandi, D. Pimentel-Alarcón, D. Werling, A. Gasch, and J.P. Yu. "Intrinsic gene expression correlates of the biophysically modeled diffusion MRI signal in autism spectrum disorder". Biological Psychiatry. 2024.