Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling
An Arabic NER system with strong performance
Jointly learning knowledge graph embeddings, fine-grained entity types, and language modeling.
Implementation of GAP: Graph Neighborhood Attentive Pooling, https://arxiv.org/abs/2001.10394. A context-sensitive graph (network) representation learning algorithm that relies only on the structure of the graph.
Code for the paper "Contextualized Weak Supervision for Text Classification"
A curated list of pretrained sentence and word embedding models
Code for "Let's Stop Incorrect Comparisons in End-to-end Relation Extraction!", EMNLP 2020
Code for "Contextualized Embeddings in Named-Entity Recognition", ECIR 2020
This project explores both transfer learning and feature extraction for obtaining contextual word embeddings with a BERT-family model, applied to a subtask of fake news detection: stance detection.
The official repo for the EACL 2023 paper "Quantifying Context Mixing in Transformers"
An AI chatbot built with Python, TensorFlow, and natural language processing (NLP), alongside TFLearn
😷 The Fill-Mask Association Test (FMAT): Measuring Propositions in Natural Language.