

Retro Biosciences Inc., a startup using artificial intelligence to discover new drugs, is reportedly raising $1 billion in funding to support its research.
The Financial Times today cited sources as saying that OpenAI Chief Executive Sam Altman is participating in the investment. He previously provided the initial $180 million funding round that helped launch Retro.
Retro launched in mid-2022 with the goal of extending the average human lifespan by 10 years. Using the $180 million in initial funding it received from Altman, the company filled a San Francisco warehouse with shipping containers repurposed into labs. It’s using those labs, along with a custom artificial intelligence model developed by OpenAI, to discover new medicines.
The company’s work reportedly focuses on Yamanaka factors, four proteins that have emerged as a major focus for longevity researchers. The proteins are capable of turning a human skin cell into a stem cell. Retro believes that this phenomenon could help reverse aging.
Currently, the process of turning skin cells into stem cells with Yamanaka factors takes several weeks. Additionally, fewer than 1% of the cells to which scientists apply Yamanaka factors complete the process successfully. Those limitations are holding back efforts to turn the proteins into therapies.
Last week, Retro revealed that it’s working with OpenAI to tackle the challenge. The companies have developed a small language model called GPT-4b micro that can research ways of enhancing Yamanaka factors. According to the MIT Technology Review, the model helped Retro make two Yamanaka factors more than 50 times as effective as before.
That the model is called GPT-4b micro suggests it’s a customized version of GPT-4. In 2023, OpenAI launched an offering that enables organizations to commission customized versions of its neural networks. Customers can not only supply the dataset on which an OpenAI model is fine-tuned but also adjust its hyperparameters, the configuration settings that govern how the network is trained.
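As a rough illustration of what that kind of customization looks like in practice, the sketch below submits a fine-tuning job through OpenAI’s public Python SDK. The training file name, base-model identifier and hyperparameter values are placeholders chosen for the example; Retro and OpenAI have not disclosed how GPT-4b micro itself was configured.

```python
# Minimal sketch of OpenAI's public fine-tuning workflow (illustrative only;
# the training file, base model and hyperparameter values are placeholders,
# not details of how GPT-4b micro was actually built).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload a JSONL file of training examples to serve as the custom dataset.
training_file = client.files.create(
    file=open("protein_examples.jsonl", "rb"),
    purpose="fine-tune",
)

# Launch a fine-tuning job, overriding a couple of training hyperparameters.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",  # placeholder: a publicly fine-tunable base model
    hyperparameters={"n_epochs": 3, "learning_rate_multiplier": 0.1},
)

print(job.id, job.status)
```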
Proteins such as Yamanaka factors are made of molecules called amino acids. OpenAI trained GPT-4b micro on descriptions of various proteins’ amino acid sequences, as well as data about how proteins interact. The model reportedly has a considerably smaller hardware footprint than the company’s flagship large language models.
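To make that training setup concrete, here is one way amino acid sequence data could be serialized into chat-style JSONL training examples, such as the protein_examples.jsonl file referenced in the earlier sketch. The sequence string, prompt wording and file layout are assumptions made for illustration, not details of GPT-4b micro’s actual dataset.

```python
# Hypothetical illustration of turning protein records into chat-formatted
# JSONL training examples; the sequence and annotation below are made-up
# placeholders, not Retro's or OpenAI's actual training data.
import json

records = [
    {
        # Proteins are conventionally written as strings of one-letter amino acid codes.
        "name": "placeholder reprogramming factor",
        "sequence": "MKVLSAGHEDTQRPWFYICN",  # placeholder sequence, not a real protein
        "note": "binds DNA through a helix-turn-helix-like region",
    },
]

with open("protein_examples.jsonl", "w") as f:
    for rec in records:
        example = {
            "messages": [
                {"role": "system", "content": "You describe protein properties."},
                {"role": "user", "content": f"Sequence: {rec['sequence']}"},
                {"role": "assistant", "content": rec["note"]},
            ]
        }
        f.write(json.dumps(example) + "\n")
```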
Retro’s researchers interact with GPT-4b micro using a method called few-shot prompting. The usual way of instructing an LLM to perform a task is to provide a description of the action to be performed. With few-shot prompting, users supply not only that task description but also several worked examples of how to carry it out, which typically improves the quality of the model’s output.
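GPT-4b micro itself is not publicly available, but the general few-shot pattern is easy to show. The sketch below builds a few-shot prompt against OpenAI’s standard chat completions API; the model name and the example prompt-answer pairs are placeholders, not Retro’s actual prompts.

```python
# Generic few-shot prompting sketch using OpenAI's chat completions API.
# The model name and worked examples are placeholders; GPT-4b micro is
# not publicly available.
from openai import OpenAI

client = OpenAI()

# A few-shot prompt pairs the task description with worked examples,
# then ends with the new input the model should handle the same way.
messages = [
    {"role": "system", "content": "Suggest edits that could make a protein variant more active."},
    # Example 1: an input followed by the kind of answer we want.
    {"role": "user", "content": "Variant: baseline sequence A"},
    {"role": "assistant", "content": "Consider substitutions at positions 12 and 47."},
    # Example 2: a second worked example reinforces the expected format.
    {"role": "user", "content": "Variant: baseline sequence B"},
    {"role": "assistant", "content": "Consider substitutions at positions 3 and 88."},
    # The actual query the model should now answer in the same style.
    {"role": "user", "content": "Variant: baseline sequence C"},
]

response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(response.choices[0].message.content)
```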
According to Fortune, Retro will use the new capital that it’s raising to support the development of three new therapies. One focuses on rejuvenating blood cells while another is designed to treat Alzheimer’s disease. The latter therapy is reportedly set to begin early-stage clinical trials in Australia later this year.