stanfordnlp/stanfordnlp: Official Stanford NLP Python Library for Many Human Languages
stanford  nlp 
21 hours ago by johns
How to build a convincing reddit personality with GPT2 and BERT
Generating replies to Reddit comments using GPT-2 and BERT
2 days ago by theodoreweld
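The post's approach pairs a generator with a ranker: GPT-2 proposes candidate replies and BERT scores them, keeping the most convincing one. The sketch below shows only that generate-then-rank control flow; `generate_candidates` and `realism_score` are hypothetical stand-ins for the fine-tuned GPT-2 sampler and the BERT classifier, replaced by toy deterministic functions so the example is self-contained.

```python
import random

def generate_candidates(comment, n=5):
    # Stand-in for sampling n replies from a GPT-2 model fine-tuned
    # on subreddit comment/reply pairs.
    return [f"reply {i} to: {comment}" for i in range(n)]

def realism_score(reply):
    # Stand-in for a BERT classifier scoring how human-like a reply is.
    # Seeded on the reply text so the toy score is deterministic.
    rng = random.Random(sum(ord(c) for c in reply))
    return rng.random()

def best_reply(comment, n=5):
    # Generate-then-rank: sample candidates, keep the top-scoring one.
    candidates = generate_candidates(comment, n)
    return max(candidates, key=realism_score)
```

In the real pipeline the ranking step is what makes the personality "convincing": cheap sampling produces many mediocre replies, and the discriminative model filters them.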
[2002.08902] Application of Pre-training Models in Named Entity Recognition
Named Entity Recognition (NER) is a fundamental Natural Language Processing (NLP) task for extracting entities from unstructured data. Previous methods for NER were based on machine learning or deep learning. Recently, pre-training models have significantly improved performance on multiple NLP tasks. In this paper, we first introduce the architectures and pre-training tasks of four common pre-training models: BERT, ERNIE, ERNIE2.0-tiny, and RoBERTa. We then apply these pre-training models to an NER task by fine-tuning, and compare the effects of the different model architectures and pre-training tasks on NER performance. The experimental results show that RoBERTa achieved state-of-the-art results on the MSRA-2006 dataset.
deeplearning  NLP 
2 days ago by hustwj
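Fine-tuning a pre-training model for NER, as the abstract describes, amounts to putting a per-token classification head on top of an encoder and training on gold tag sequences. A minimal sketch of that setup, with a tiny randomly initialised embedding standing in for a real pre-trained encoder such as BERT or RoBERTa (all sizes and data here are toy assumptions):

```python
import torch
import torch.nn as nn

VOCAB_SIZE, HIDDEN, NUM_TAGS = 100, 32, 5  # e.g. O, B-PER, I-PER, B-ORG, I-ORG

class TokenTagger(nn.Module):
    def __init__(self):
        super().__init__()
        # Placeholder encoder; a real setup would load pre-trained
        # BERT/RoBERTa weights here and fine-tune them.
        self.encoder = nn.Embedding(VOCAB_SIZE, HIDDEN)
        self.head = nn.Linear(HIDDEN, NUM_TAGS)  # per-token tag classifier

    def forward(self, token_ids):
        return self.head(self.encoder(token_ids))  # (batch, seq, NUM_TAGS)

model = TokenTagger()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, VOCAB_SIZE, (2, 8))  # toy batch: 2 sentences, 8 tokens
tags = torch.randint(0, NUM_TAGS, (2, 8))      # toy gold tag sequence

for _ in range(3):  # a few fine-tuning steps
    logits = model(tokens)
    loss = loss_fn(logits.view(-1, NUM_TAGS), tags.view(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

pred = model(tokens).argmax(dim=-1)  # per-token predicted tag ids
```

The comparison in the paper keeps this head fixed and swaps the encoder, which is what isolates the effect of each model's architecture and pre-training tasks.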
