
GP-MoLFormer: Molecular Generation via Transformers
A 46.8M-parameter transformer for molecular generation trained on 1.1B SMILES strings, introducing pair-tuning for efficient …

Introduces displacement interpolation to prove ground state uniqueness via optimal transport, establishing foundations …
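For context, McCann's displacement interpolation (the standard optimal-transport construction this summary refers to) interpolates between two measures along the transport map rather than mixing densities; a minimal statement, with $T$ the optimal transport map pushing $\mu_0$ onto $\mu_1$:

```latex
\mu_t \;=\; \big((1-t)\,\mathrm{id} \;+\; t\,T\big)_{\#}\,\mu_0, \qquad t \in [0,1],
```

so $\mu_0$ and $\mu_1$ are recovered at the endpoints, and each mass element travels along a straight line from $x$ to $T(x)$.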

A continuous-time normalizing flow using stochastic interpolants and quadratic loss to bypass costly ODE …

A simulation-free framework for training Continuous Normalizing Flows using Conditional Flow Matching and Optimal …
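To make "simulation-free" concrete, here is a minimal sketch of the Conditional Flow Matching objective for the linear (optimal-transport) conditional path, in plain Python with scalar data. The function name `cfm_loss_sample` and the toy callable `v_pred` are illustrative, not from the paper's code:

```python
import random

def cfm_loss_sample(x0, x1, v_pred):
    """One-sample Conditional Flow Matching loss for the linear
    (optimal-transport) conditional path x_t = (1-t)*x0 + t*x1.

    The conditional target velocity is u_t(x_t | x0, x1) = x1 - x0,
    so training reduces to plain regression on the velocity field;
    no ODE needs to be simulated during training.
    """
    t = random.random()                  # t ~ U[0, 1]
    x_t = (1.0 - t) * x0 + t * x1       # point on the conditional path
    u_t = x1 - x0                       # conditional target velocity
    return (v_pred(t, x_t) - u_t) ** 2  # squared regression error

# A predictor that already outputs the straight-line velocity has zero loss:
perfect = lambda t, x: 2.0              # for x0 = 1.0, x1 = 3.0, u_t = 2.0
print(cfm_loss_sample(1.0, 3.0, perfect))  # → 0.0
```

In a real model `v_pred` would be a neural network taking `(t, x_t)` as input, and the loss would be averaged over minibatches of `(x0, x1, t)` triples.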

A unified ODE-based framework for generative modeling and domain transfer that learns straight paths for fast 1-step …

Theoretical paper proving the equivalence between training Denoising Autoencoders and performing Score Matching on a …
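The core identity can be checked numerically in one dimension: when data is a point mass at `x0` corrupted by Gaussian noise of scale `sigma`, the optimal denoising direction `(x0 - x_noisy) / sigma**2` equals the score of the noise-smoothed density. This toy verification (all names are illustrative) assumes a point-mass data distribution to keep the smoothed density in closed form:

```python
import math

def smoothed_log_density(x, x0, sigma):
    """log q_sigma(x): a point mass at x0 convolved with N(0, sigma^2)
    is just the Gaussian N(x; x0, sigma^2)."""
    return -0.5 * ((x - x0) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def numerical_score(x, x0, sigma, h=1e-5):
    """Central finite-difference estimate of d/dx log q_sigma(x)."""
    return (smoothed_log_density(x + h, x0, sigma)
            - smoothed_log_density(x - h, x0, sigma)) / (2 * h)

x0, sigma, x_noisy = 0.0, 0.5, 0.8
# Optimal denoising residual, scaled by 1/sigma^2 ...
denoising_target = (x0 - x_noisy) / sigma ** 2
# ... matches the score of the noise-smoothed density:
score = numerical_score(x_noisy, x0, sigma)
print(abs(denoising_target - score) < 1e-4)  # → True
```

This is why training a denoising autoencoder implicitly learns the score of the Gaussian-smoothed data distribution.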

Unified SDE framework for score-based generative models, introducing Predictor-Corrector samplers and achieving SOTA on …
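A Predictor-Corrector loop can be sketched end to end on a toy problem where the score is known analytically: data is a point mass at 0 under a variance-exploding noise schedule, so the score at noise level `sigma` is simply `-x / sigma**2`. All schedule constants below are illustrative choices, not values from the paper:

```python
import math, random

random.seed(0)

def score(x, sigma):
    # Analytic score of N(0, sigma^2): d/dx log p(x) = -x / sigma^2
    return -x / sigma ** 2

def pc_sample(n_steps=50, sigma_max=5.0, sigma_min=0.05, corrector_steps=2):
    """Draw one sample by annealing the noise level from sigma_max down to
    sigma_min, alternating a reverse-diffusion predictor step with a few
    Langevin corrector steps at each level."""
    sigmas = [sigma_max * (sigma_min / sigma_max) ** (i / (n_steps - 1))
              for i in range(n_steps)]
    x = random.gauss(0.0, sigma_max)                 # start from the prior
    for i in range(1, n_steps):
        # Predictor: Euler-Maruyama step of the reverse-time SDE
        dv = sigmas[i - 1] ** 2 - sigmas[i] ** 2     # variance shrinkage
        x = x + dv * score(x, sigmas[i - 1]) + math.sqrt(dv) * random.gauss(0, 1)
        # Corrector: Langevin MCMC steps at the current noise level
        eps = 0.1 * sigmas[i] ** 2
        for _ in range(corrector_steps):
            x = x + eps * score(x, sigmas[i]) + math.sqrt(2 * eps) * random.gauss(0, 1)
    return x

samples = [pc_sample() for _ in range(2000)]
mean = sum(samples) / len(samples)
print(abs(mean) < 0.1)  # samples concentrate near the data point at 0
```

In the real method the analytic `score` is replaced by a learned score network conditioned on the noise level.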

Flow matching model that co-generates ligands and flexible protein pockets, addressing rigid-receptor limitations in …

A fast, diverse inverse folding method combining deep learning with Potts models to capture full sequence landscapes.

A Riemannian flow matching framework for generating Metal-Organic Framework structures by treating building blocks as …

A robust, type-safe Python library for converting chemical strings (SMILES, SELFIES, InChI) into publication-quality …

Summary of Kingma & Welling's foundational VAE paper introducing the reparameterization trick and variational …
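The reparameterization trick itself fits in a few lines: rather than sampling z ~ N(mu, sigma^2) directly (which blocks gradients through the sampling step), draw the noise separately and express z as a deterministic, differentiable function of the parameters. A minimal scalar sketch (function name illustrative):

```python
import math, random

def reparameterize(mu, log_var, eps=None):
    """Reparameterization trick: sample eps ~ N(0, 1) outside the model
    and set z = mu + sigma * eps, so z is a differentiable function of
    (mu, log_var) and gradients can flow through the sampling step."""
    if eps is None:
        eps = random.gauss(0.0, 1.0)
    sigma = math.exp(0.5 * log_var)
    return mu + sigma * eps

# With eps held fixed, z moves smoothly as the parameters change:
print(reparameterize(0.0, 0.0, eps=1.0))  # → 1.0  (mu=0, sigma=1)
print(reparameterize(2.0, 0.0, eps=1.0))  # → 3.0
```

In the VAE, `mu` and `log_var` are encoder outputs, and this substitution is what makes the variational lower bound trainable end to end by backpropagation.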