Stars
An open source implementation of CLIP.
Autoregressive Model Beats Diffusion: 🦙 Llama for Scalable Image Generation
Code and models for NExT-GPT: Any-to-Any Multimodal Large Language Model
Adapting LLaMA Decoder to Vision Transformer
Taming Transformers for High-Resolution Image Synthesis
PyTorch package for the discrete VAE used for DALL·E.
A high-throughput and memory-efficient inference and serving engine for LLMs
[ICML'24 Spotlight] LLM Maybe LongLM: Self-Extend LLM Context Window Without Tuning
MetaMath: Bootstrap Your Own Mathematical Questions for Large Language Models
Tools for merging pretrained large language models.
Example models using DeepSpeed
A framework for few-shot evaluation of language models.
Official style files for papers submitted to venues of the Association for Computational Linguistics
Rigorous evaluation of LLM-synthesized code - NeurIPS 2023
Awesome-LLM: a curated list of Large Language Models
Accelerating Vision-Language Pretraining with Free Language Modeling (CVPR 2023)
Official code for "pi-Tuning: Transferring Multimodal Foundation Models with Optimal Multi-task Interpolation", ICML 2023.
The official GitHub page for the survey paper "A Survey of Large Language Models".
Implementation of Hinton's forward-forward (FF) algorithm - an alternative to back-propagation
Official code for the paper "Task2Vec: Task Embedding for Meta-Learning" (https://arxiv.org/abs/1902.03545, ICCV 2019)
Awesome papers on Language-Model-as-a-Service (LMaaS)
Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
dbash/visda2022-org (forked from lhoyer/DAFormer): [NeurIPS 2022] VisDA 2022 Challenge Toolkit
Guidance for courses in the Department of Computer Science, Harbin Institute of Technology (Shenzhen)