Starred repositories
Joint embedding of protein sequence and structure with discrete and continuous compressions of protein folding model latent spaces. https://www.biorxiv.org/content/10.1101/2024.08.06.606920v1
A generative model for programmable protein design
Tool for deep mutational scanning experiments.
Official code repository for the paper "ProteinNPT: Improving Protein Property Prediction and Design with Non-Parametric Transformers"
Official PyTorch implementation of the ICCV 2021 paper "FuseFormer: Fusing Fine-Grained Information in Transformers for Video Inpainting"
[ECCV'2020] STTN: Learning Joint Spatial-Temporal Transformations for Video Inpainting
Sandbox for Deep-Learning based Computational Protein Design
Generation of protein sequences and evolutionary alignments via discrete diffusion models
Repository for code and models for the paper "Extrapolative Controlled Sequence Generation via Iterative Refinement"
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
Traffic Prediction models - UAGCRN and UAGCTransformer
This repository implements Gibbs sampling with Graph-based Smoothing
Reference implementation for DPO (Direct Preference Optimization)
Implementation of Tranception, an attention network, paired with retrieval, that is SOTA for protein fitness prediction
Unofficial PyTorch implementation of "Palette: Image-to-Image Diffusion Models"
A collection of tasks to probe the effectiveness of protein sequence representations in modeling aspects of protein design
Implementation for SE(3) diffusion model with application to protein backbone generation
A lightning-fast search API that fits effortlessly into your apps, websites, and workflow
Source code for Twitter's Recommendation Algorithm
Official repository for the paper "Tranception: Protein Fitness Prediction with Autoregressive Transformers and Inference-time Retrieval"
High-quality single file implementation of Deep Reinforcement Learning algorithms with research-friendly features (PPO, DQN, C51, DDPG, TD3, SAC, PPG)
[ICLR 2023] "Mole-BERT: Rethinking Pre-training Graph Neural Networks for Molecules"
Implementation and experiments of graph embedding algorithms.
PyTorch code for Vision Transformers training with the Self-Supervised learning method DINO
Pretrain, fine-tune, and deploy AI models on multiple GPUs and TPUs with zero code changes.
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities