I noticed that a few months ago you added a new NER config, ner_ontonotes_bert_emb.json. As far as I can see, it uses the BERT transformers preprocessor and embedder, and also the gensim library. Is this config related to the paper [TENER: Adapting Transformer Encoder for Named Entity Recognition](https://arxiv.org/abs/1911.04474)? Is this model better than ner_ontonotes_bert_mult? Did you evaluate its F1 score or other metrics?
Also, in the config's download section there is no trained model checkpoint to download, only BERT.
Sorry for so many questions, but I didn't find anything about this config in the NER documentation.
We are using the ner_ontonotes_bert_mult model now, and it's really nice, thank you!