Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Pre-training of Deep Bidirectional Transformers for Language Understanding
Universal Joint Feature Extraction for P300 EEG Classification Using Multi-Task Autoencoder (IEEE Access)
Team Kakao&Brain's Grammatical Error Correction System for the ACL 2019 BEA Shared Task
Autoregressive Predictive Coding: An unsupervised autoregressive model for speech representation learning
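The APC objective itself is compact enough to sketch: a unidirectional RNN reads acoustic frames and is trained to regress the frame n steps ahead with an L1 loss. Below is a minimal PyTorch sketch of that idea; the feature dimension, network sizes, and prediction shift are illustrative choices, not the paper's exact configuration.

```python
# Minimal sketch of the autoregressive predictive coding (APC) objective:
# an RNN encodes acoustic frames and predicts the frame n steps ahead.
# All sizes below are illustrative, not the paper's exact setup.
import torch
import torch.nn as nn

n_future, feat_dim = 3, 80          # predict 3 frames ahead; 80-dim log-mels
rnn = nn.GRU(feat_dim, 512, num_layers=3, batch_first=True)
head = nn.Linear(512, feat_dim)
opt = torch.optim.Adam(list(rnn.parameters()) + list(head.parameters()), lr=1e-4)

frames = torch.randn(8, 200, feat_dim)   # fake batch: 8 utterances, 200 frames
hidden, _ = rnn(frames[:, :-n_future])   # encode all but the last n frames
pred = head(hidden)                      # regress the frame n steps ahead
loss = nn.functional.l1_loss(pred, frames[:, n_future:])
loss.backward()
opt.step()
# After training, `hidden` serves as the learned speech representation.
```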
A collection of pre-trained audio and speech models.
Vector Quantized Autoregressive Predictive Coding (VQ-APC)
Implementation of our paper "Exploiting Unsupervised Data for Emotion Recognition in Conversations" in the Findings of EMNLP-2020.
Implementation of Marge, Pre-training via Paraphrasing, in PyTorch
Implementation of COCO-LM, Correcting and Contrasting Text Sequences for Language Model Pretraining, in PyTorch
Code for the ICKG 2020 best student paper: "A Robust and Domain-Adaptive Approach for Low-Resource Named Entity Recognition"
Towards Semantics-Enhanced Pre-Training: Can Lexicon Definitions Help Learning Sentence Meanings? (AAAI 2021)
[ICML 2020] DrRepair: Learning to Repair Programs from Error Messages
Code and Data for EMNLP2020 Paper "KGPT: Knowledge-Grounded Pre-Training for Data-to-Text Generation"
Research code for ECCV 2020 paper "UNITER: UNiversal Image-TExt Representation Learning"
This repository shows how to pre-train language models on a custom corpus.
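As a rough illustration of what such a repository covers, here is a minimal masked-LM pre-training loop over a custom corpus using the HuggingFace transformers and datasets libraries; the file path, model checkpoint, and hyperparameters are placeholders, not this repository's actual setup.

```python
# Minimal masked-LM pre-training on a custom corpus with HuggingFace
# transformers. Paths and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from transformers import (BertTokenizerFast, BertForMaskedLM,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# One plain-text file, one training example per line (hypothetical path).
dataset = load_dataset("text", data_files={"train": "corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Dynamically masks 15% of tokens per batch, the standard BERT objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                           mlm_probability=0.15)

args = TrainingArguments(output_dir="mlm-checkpoints",
                         per_device_train_batch_size=16,
                         num_train_epochs=1)
Trainer(model=model, args=args,
        train_dataset=dataset, data_collator=collator).train()
```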
Baseline for a competition on measuring the knowledge captured by pre-trained models: F1 0.35 with BERTForMaskedLM
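A BERTForMaskedLM baseline of this kind amounts to fill-in-the-blank probing: mask the answer slot in a prompt and take the model's top-scoring token. A minimal sketch with HuggingFace's BertForMaskedLM follows; the prompt is an invented example, not competition data.

```python
# Probing factual knowledge by masked-token fill-in, the kind of
# BERTForMaskedLM baseline described above. Prompt is illustrative.
import torch
from transformers import BertTokenizerFast, BertForMaskedLM

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased").eval()

prompt = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]

with torch.no_grad():
    logits = model(**inputs).logits

# The top prediction for the masked slot is the model's "answer".
print(tokenizer.decode([logits[0, mask_pos].argmax().item()]))
```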
Using SqueezeNet to classify video frames coming from a webcam or a smartphone camera
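For the webcam case, the pipeline is: grab a frame, apply the standard ImageNet preprocessing, and run the pre-trained network. A minimal sketch with OpenCV and torchvision's SqueezeNet follows; the original project may well use a different stack (e.g., a mobile framework for the smartphone case).

```python
# Classifying webcam frames with a pre-trained SqueezeNet.
# The preprocessing constants are the standard ImageNet values.
import cv2
import torch
from torchvision import models, transforms

weights = models.SqueezeNet1_1_Weights.IMAGENET1K_V1
model = models.squeezenet1_1(weights=weights).eval()
labels = weights.meta["categories"]          # ImageNet class names

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

cap = cv2.VideoCapture(0)                    # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        logits = model(preprocess(rgb).unsqueeze(0))
    cv2.imshow("frame", frame)
    print(labels[logits.argmax().item()])
    if cv2.waitKey(1) == 27:                 # Esc to quit
        break
cap.release()
```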
Code for the ICLR 2021 Paper "In-N-Out: Pre-Training and Self-Training using Auxiliary Information for Out-of-Distribution Robustness"
A curated list of papers on pre-trained language models (PLMs).