Hi!
Thank you very much for your answers, they help me a lot in moving my project forward.
1 - To add the new slot value I had to modify ~/.deeppavlov/downloads/dstc2/slot_vals.json
and not ~/.deeppavlov/download/dstc2/dstc_slot_vals.json
as you suggested, but I think it's the same file.
After adding the new word argentinian
to slot_vals.json
and removing the old files in ~/.deeppavlov/models/slotfill_dstc2/,
I tried to train the slot filler with the following command: python3 -m deeppavlov PATH/slotfill_dstc2-1.json,
but I got another error:
As you can see in the first image, specifically in this line:
2019-09-10 21:43:21.599 INFO in 'deeppavlov.core.data.utils'['utils'] at line 63: Downloading from
http://files.deeppavlov.ai/datasets/dstc_slot_vals.json to /home/yasser/Documents/Project/dstc2-test-1/slot_vals.json
a file PATH/dstc2-test-1/slot_vals.json is created with the old entities (without argentinian), because it is downloaded from the URL.
What I want to do is modify the right config file so that my slot filler picks up the slot_vals.json file I modified.
This is how my config file
slotfill_dstc2-1.json
looks:
As you can see, I modified the paths for the dataset_reader
and the dataset_iterator,
as well as the save and load paths
of the dstc_slotfilling component.
I also modified the path for the ner_dstc2
config file, and this is how it looks:
I don't know what I have to modify to make the slot filler pick up the slot_vals.json file
I modified.
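In case it helps to be concrete: my guess, reconstructed from the log line above, is that the re-download is triggered by an entry in the metadata.download section of slotfill_dstc2-1.json that looks roughly like this (the exact subdir value and variable names here are assumptions on my part, not copied from my actual config):

```json
{
  "metadata": {
    "download": [
      {
        "url": "http://files.deeppavlov.ai/datasets/dstc_slot_vals.json",
        "subdir": "{DOWNLOADS_PATH}/dstc2"
      }
    ]
  }
}
```

Is removing (or redirecting) this entry the right way to stop the config from overwriting my edited slot_vals.json on every run?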
2 - I have checked the config file gobot_dstc2-1.json,
and the line "id": "word_vocab"
was not deleted.
However, this method of just removing one line and interacting with the go_bot by loading the old vocabulary will only let me interact with the go_bot on the "old data" and not the "newly added data" (the argentinian restaurant), right?
That is not what I want: I would like to interact with the go_bot on the newly added data, so that it is able to answer questions about the argentinian restaurant.
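To spell out what I am after: once the slot filler picks up my edited slot_vals.json, I assume I would retrain and then talk to the bot roughly like this, using what I believe are the standard train and interact modes of the DeepPavlov CLI (the config path is just my local copy, abbreviated as before):

```shell
# Assumed workflow: retrain the go_bot on the updated data, then interact
# with it. PATH stands for the directory holding my modified config.
python3 -m deeppavlov train PATH/gobot_dstc2-1.json
python3 -m deeppavlov interact PATH/gobot_dstc2-1.json
```

Is that the right sequence, or do I also need to retrain the slot filler first?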
Thank you very much for your help!