NER ner_ontonotes_bert_emb config

Hi,

I noticed that a few months ago you added a new config for NER, ner_ontonotes_bert_emb.json. As I see, it uses the BERT transformers preprocessor and embedder, and also the gensim lib. Is this config somehow related to the paper [TENER: Adapting Transformer Encoder for Named Entity Recognition](https://arxiv.org/abs/1911.04474)? Is this model better than ontonotes_bert_mult? Did you estimate its F1 score or other metrics?
Also, in the config's download section there is no model to download, only BERT :slight_smile:

Sorry for so many questions, but I didn't find anything about this config in the NER documentation.
Thanks!

PS
We are using the ontonotes_bert_mult model now, and it's really nice, thank you!

Hi @zharenkov,

It's a sample configuration for using a pre-trained BERT model as a vectorizer, without fine-tuning.
Sorry, I don't remember the exact scores it reaches, but they're worse than those of ner_ontonotes_bert and a little better than those of ner_conll2003.
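For anyone else landing here: the general shape of such a config is a DeepPavlov `chainer` pipeline where frozen BERT components produce embeddings that feed a trainable tagger. The fragment below is only an illustrative sketch of that pattern; the component names and keys are assumptions, not copied from the actual ner_ontonotes_bert_emb.json, so check the real file in the repo.

```json
{
  "chainer": {
    "in": ["x"],
    "pipe": [
      {"class_name": "bert_preprocessor", "in": ["x"], "out": ["bert_features"]},
      {"class_name": "bert_embedder", "in": ["bert_features"], "out": ["word_emb"]},
      {"class_name": "ner_tagger", "in": ["word_emb"], "out": ["tags"]}
    ],
    "out": ["tags"]
  }
}
```

This also explains why the download section only fetches BERT: the pre-trained BERT weights are downloaded as-is and stay frozen, while the tagger on top is trained locally from scratch.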

Ok, thank you for the information!