# Annotate Research Papers


## Why annotated papers?

Do you love reading research papers? Do you want to read more of them but find them intimidating? Or are you looking for annotated research papers that are much easier to understand?

If you fall into any of these categories, you have come to the right place. I spend a lot of time reading papers; it is a crucial part of my ML work. If you want to do research or become a better ML engineer, you should read papers, and the habit will help you stay up to date with the field.

Note: I am a pen-and-paper person. Nothing beats the pen-and-paper reading experience, but given the current circumstances (pandemic, lockdown, etc.), I am not able to print the papers. Taking this as an opportunity to share my thought process, I will be sharing my annotated research papers in this repo. The order of the papers won't strictly follow the arXiv timeline; sometimes I put a paper on hold and read it after a while.

PS: I cannot annotate every paper I read, but you can expect all the interesting ones to be uploaded here.

## Table of Contents

| Field | Category | Annotated Paper |
| --- | --- | --- |
| Computer Vision | | Adaptive Risk Minimization (Abstract) |
| | | Axial DeepLab (Code, Abstract) |
| | | ConvNeXt (Code, Abstract) |
| | | EfficientNetsV2 (Code, Abstract) |
| | Supervised | Flow-edge Guided Video Completion (Code, Abstract) |
| | | Is Batch Norm Unique? (Abstract) |
| | | Knowledge Distillation: A good teacher is patient and consistent (Code, Abstract) |
| | | RandConv (Code, Abstract) |
| | | PolyLoss (Code, Abstract) |
| | | Scaling Down Deep Learning (Code, Abstract) |
| | | Segment Anything (Abstract) |
| | | Supervised Contrastive Learning (Code, Abstract) |
| | | Vision Transformer (Code, Abstract) |
| | | Are all negatives created equal in contrastive instance discrimination? (Abstract) |
| | | Towards Domain-Agnostic Contrastive Learning (Abstract) |
| | Self-Supervised | Emerging Properties in Self-Supervised Vision Transformers (Code, Abstract) |
| | | Decoder Denoising Pretraining (Abstract) |
| | | Masked Autoencoders (Code, Abstract) |
| | | SwAV (Code, Abstract) |
| | | What Should Not Be Contrastive in Contrastive Learning (Abstract) |
| | | Vision Transformers need Registers (Abstract) |
| | Semi-Supervised | CoMatch (Code, Abstract) |
| Diffusion Models | | Understanding Diffusion Models (Abstract) |
| | | On the Importance of Noise Scheduling for Diffusion Models (Abstract) |
| | | Emergent Correspondence from Diffusion Models (Abstract) |
| GANs | | CycleGAN (Code, Abstract) |
| Interpretability and Explainability | | What is being transferred in transfer learning? (Code, Abstract) |
| | | Explaining in Style (Code, Abstract) |
| NLP | | Do Language Embeddings Capture Scales? (Abstract) |
| | | mSLAM (Abstract) |
| | | Cramming (Abstract) |
| | | Shortened Llama (Abstract) |
| | | CoPE (Abstract) |
| | | LLMs cannot plan but can help planning (Abstract) |
| | | Mixture of A Million Experts (Abstract) |
| | | Agent Workflow Memory (Code, Abstract) |
| | | What Matters for Model Merging at Scale? (Abstract) |
| Speech | | SpeechStew (Abstract) |
| | | mSLAM (Abstract) |
| | | WhisperX (Code, Abstract) |
| MLLMs | | VCoder: Versatile Vision Encoder for MLLMs (Code, Abstract) |
| | | Sigmoid Loss for Image-Text Pretraining (Abstract) |
| | | MobileCLIP (Abstract) |
| | | MM1 (Abstract) |
| | | Ferretv2 (Abstract) |
| | | VisualFactChecker (Abstract) |
| | | JanusFlow (Code, Abstract) |
| Others | | Multi-Task Self-Training for Learning General Representations (Abstract) |
| | | Decoder Denoising Pretraining for Semantic Segmentation (Abstract) |

## Community Contributions

Note: The annotated papers in this section are contributed by the community. Since I cannot verify each annotation myself, I am laying out annotation guidelines so that every annotated paper follows a similar structure.