Stars
"An Investigation of the Combination of Rehearsal and Knowledge Distillation in Continual Learning for Spoken Language Understanding", accepted at INTERSPEECH 2023.
NEVIS'22: Benchmarking the next generation of never-ending learners
Robust Speech Recognition via Large-Scale Weak Supervision
Learning to Prompt (L2P) for Continual Learning @ CVPR22 and DualPrompt: Complementary Prompting for Rehearsal-free Continual Learning @ ECCV22
[CVPR2022] Representation Compensation Networks for Continual Semantic Segmentation
Codebase used in the paper "Foundational Models for Continual Learning: An Empirical Study of Latent Replay".
🐢 Open-Source Evaluation & Testing for AI & LLM systems
Official PyTorch implementation of "Multi-Head Distillation for Continual Unsupervised Domain Adaptation in Semantic Segmentation"
This repository contains implementations and illustrative code to accompany DeepMind publications
Over 200 figures and diagrams of the most popular deep learning architectures and layers FREE TO USE in your blog posts, slides, presentations, or papers.
Official implementation of "CoMFormer: Continual Learning in Semantic and Panoptic Segmentation"
Code release for "Masked-attention Mask Transformer for Universal Image Segmentation"
A tool for active reading and personal knowledge management
Acceptance rates for the major AI conferences
FFCV: Fast Forward Computer Vision (and other ML workloads!)
ESL: Entropy-guided Self-supervised Learning for Domain Adaptation in Semantic Segmentation
PyTorch implementation of MAE https://arxiv.org/abs/2111.06377
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
A clean and simple data loading library for Continual Learning
Pure python implementation of byte pair encoding (subword tokenization)
Script to automatically book a vaccine slot on Doctolib in the next seven days.
Prototype-based Incremental Few-Shot Semantic Segmentation
Boost LaTeX typesetting efficiency with preview, compile, autocomplete, colorize, and more.
Self-studying the Sutton & Barto book the hard way