Cannot add new data on the Go_Bot

Hi !
I am currently working with DeepPavlov, specifically on the go_bot model.
I added new data by adding new discussions to the DSTC2 dataset, like this:

As you can see, I added a new restaurant with a food origin that has never been mentioned before, so I also added the new restaurant to the database.
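A sketch of what such a database record could look like (the restaurant name and attribute values below are invented placeholders, and the field names assume the usual DSTC2 restaurant schema; check your own database file):

```json
{
  "name": "example restaurant",
  "food": "argentinian",
  "pricerange": "moderate",
  "area": "centre",
  "addr": "12 example street",
  "phone": "01223 000000",
  "postcode": "cb1"
}
```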
But when I run the command python -m deeppavlov train PATH/gobot_dstc2-1.json -d
to retrain the model on the newly added data, I get this error:

I am sure this error is caused by the new word “argentinian”, because it has never appeared before; normally, when I add for example a new “indian” restaurant, it works without any problem.
I have tried modifying the variable “obs_size” in the Python script, but then I get another error about the input size of the neural network.
Could you please tell me what I have to modify to make this work?
Thank you in advance for your help :smile:

Probably this is because the model was trained and saved with a different set of parameters. I’m not an expert in dstc2, but one possible way is to re-train the model with your updated data.

1 Like

Thank you for your answer !
That’s actually what I am trying to do: as you can see in my previous message, I am trying to train my model again from scratch with this newly added data, but it seems the model keeps the old saved parameters, falls back to them, and raises an error.

There are several ways to fix your issue.

  1. If you intend to train the model on the new data from scratch, remove the pretrained model files from disk and rerun the usual train command (without -d):
rm -r ~/.deeppavlov/models/gobot_dstc2/
python -m deeppavlov train PATH/gobot_dstc2-1.json
  2. If you wish to fine-tune the existing model on the new data, just remove the line "fit_on": ["x_tokens"] from the config section with "id": "word_vocab". The problem with fit_on is that the vocabulary is rebuilt every time you add new words to the dataset. Removing fit_on will only load the old word vocabulary from disk, so the input dimensions of the policy network won’t change with new data.
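For orientation, the section in question usually looks something like this (a rough sketch; the exact keys and paths in your gobot_dstc2-1.json may differ), and with option 2 you would delete only the fit_on line:

```json
{
  "id": "word_vocab",
  "class_name": "simple_vocab",
  "fit_on": ["x_tokens"],
  "save_path": "{MODELS_PATH}/gobot_dstc2/word.dict",
  "load_path": "{MODELS_PATH}/gobot_dstc2/word.dict"
}
```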
1 Like

Hi !
Thank you very much for your answers, that’s very helpful !
1- I will try to do that and get back to you ASAP!
2- For the fine-tuning method: I don’t know if the config section you’re talking about is the one in the “gobot_dstc2-4.json” file:

After removing the two lines "fit_on": ["x_tokens"] and "id": "word_vocab", I got this error:

I also have another question about the fine-tuning technique. You said that removing fit_on will only load the old vocabulary, so the input dimensions of the policy network will not change with new data; but if I understood correctly, this is not what I want, right?
I want my model to be able to detect this new added data and be able to answer according to it.
Can you please explain this concept to me? Thank you in advance for your help.

Yes, the second method implies that the vocabulary stays the same and new words will be interpreted as the unknown token UNK.
For the method to work, please remove only the line with fit_on without deleting "id": "word_vocab".
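To make the UNK behaviour concrete, here is a toy Python sketch (not DeepPavlov code; the function names are made up for illustration) of how a frozen vocabulary encodes an unseen word:

```python
# Toy illustration of a frozen vocabulary: unseen words collapse to UNK,
# so the encoded input size never changes.

UNK = "<UNK>"

def build_index(tokens):
    """Assign an integer id to each known token, reserving 0 for UNK."""
    index = {UNK: 0}
    for tok in tokens:
        index.setdefault(tok, len(index))
    return index

def encode(index, tokens):
    """Map tokens to ids; anything outside the vocabulary becomes UNK."""
    return [index.get(tok, index[UNK]) for tok in tokens]

vocab = build_index(["i", "want", "indian", "food"])  # vocabulary is now frozen
print(encode(vocab, ["i", "want", "argentinian", "food"]))  # prints [1, 2, 0, 4]
```

Here “argentinian” is encoded as 0 (the UNK id), which is why the policy network’s input dimensions stay fixed under the second method.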

1 Like

Hi !
Thank you for your answers. I’m getting back to you because I tried both solutions, but neither worked for me:

1- For the first method, I deleted all the files contained in ~/.deeppavlov/models/gobot_dstc2/ and trained the chatbot again.
After the training, when I ran the command python -m deeppavlov interact PATH/gobot_dstc2-1.json, the chatbot still didn’t recognize the word argentinian.
When I checked the file word.dict in the gobot_dstc2 folder, I saw that the word argentinian was indeed recognized, but only twice, because I only added one discussion to each file (train/valid/test), which is normal.
I tried duplicating the discussion in all the files, but the chatbot was still not able to recognize the new word.
After that I checked the word.dict file in ~/.deeppavlov/models/slotfill_dstc2/ and the word argentinian was not there, so I thought this was the origin of the problem.
Do I have to modify something in the gobot_dstc2-1.json file or in slotfill_dstc2.json?
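(As a side note, membership in a vocab file can be checked programmatically; a minimal sketch, assuming word.dict stores one tab-separated token/count pair per line, which is what simple_vocab files usually look like, though the exact format may vary between versions:)

```python
from pathlib import Path

def vocab_count(dict_path, word):
    """Return the stored count for `word` in a vocab file, or None if absent."""
    for line in Path(dict_path).read_text(encoding="utf-8").splitlines():
        if not line.strip():
            continue
        token, _, count = line.partition("\t")
        if token == word:
            return int(count)
    return None

# Toy example with a fake vocab file:
Path("word.dict").write_text("indian\t42\nargentinian\t2\n", encoding="utf-8")
print(vocab_count("word.dict", "argentinian"))  # prints 2
print(vocab_count("word.dict", "french"))       # prints None
```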

2- For the second method, when I only delete the line "fit_on": ["x_tokens"] and keep all the remaining lines the same, I get this error:

After doing method 1 and coming back to the second one, I get the old error about the size of the variable obs_size.

Thank you very much for your help :blush:

Hi, @chicolovic!

  1. Yes, you got it right. To add a new slot value, you first need to add it to the file ~/.deeppavlov/download/dstc2/dstc_slot_vals.json and retrain your slot-filler:
rm -r ~/.deeppavlov/models/slotfill_dstc2/
python3 -m deeppavlov train slotfill_dstc2

Then check the work of your slotfiller by interacting with it:

python3 -m deeppavlov interact slotfill_dstc2

And then you should retrain the whole bot that uses predictions from the new slotfiller:

rm -r ~/.deeppavlov/models/gobot_dstc2
python3 -m deeppavlov train PATH/gobot_dstc2-1.json
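The slot value itself goes into dstc_slot_vals.json; as far as I remember, the file maps each slot to its canonical values and their surface variants, roughly like this (a sketch with placeholder variants; double-check against your local copy):

```json
{
  "food": {
    "argentinian": ["argentinian"],
    "indian": ["indian", "indians"]
  }
}
```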
  2. Please provide the whole config file for the second approach. The error is probably caused by the deleted line "id": "word_vocab"; you shouldn’t delete it.

1 Like

Hi !
Thank you very much for your answers, they help me a lot in moving my project forward.

1 - To add the new slot value, I had to modify ~/.deeppavlov/downloads/dstc2/slot_vals.json and not ~/.deeppavlov/download/dstc2/dstc_slot_vals.json as you suggested, but I think it’s the same file.

After adding the new word argentinian to slot_vals.json and removing the old files in ~/.deeppavlov/models/slotfill_dstc2/, I tried to train the slotfiller using the command python3 -m deeppavlov train PATH/slotfill_dstc2-1.json, but I got another error:

As you can see in the first image, more specifically in this line:
2019-09-10 21:43:21.599 INFO in ‘’[‘utils’] at line 63: Downloading from to /home/yasser/Documents/Project/dstc2-test-1/slot_vals.json
a file PATH/dstc2-test-1/slot_vals.json is created with the old entities (without argentinian) because it’s downloaded from the URL.
What I want is to modify the right config file so that my slotfiller reads the slot_vals.json file I modified.
This is how my config file slotfill_dstc2-1.json looks:

As you can see, I modified the paths for the dataset_reader and the dataset_iterator, and also the save and load paths of the dstc_slotfilling class.
I also modified the path to the ner_dstc2 config file, and this is how it looks:

I don’t know what I have to modify in order to make the slot_filler pick up the slot_vals.json file I modified.

2 - I have verified the config file "gobot_dstc2-1.json", and the line "id": "word_vocab" was not deleted.
However, this method, which consists of just removing one line and loading the old vocabulary, will only let me interact with the go_bot on the “old data” and not the “newly added data” (the argentinian restaurant), right?
That is not what I want; I would like to interact with the go_bot on the newly added data, so that it can answer questions about the argentinian restaurant.

Thank you very much for your help ! :smiley:

First of all, the latest release contains a new gobot tutorial, check it out.

  1. I see that the problem is that the slot_vals.json file gets downloaded each time the dstc_ner_iterator is called. That was a bug, and it is now fixed in the 0.6.0 release. Update your deeppavlov version and rerun the code.
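Assuming deeppavlov was installed with pip (adjust for your setup), the upgrade could look like:

```shell
# Upgrade to a release that contains the iterator fix, then rerun the
# training commands from the previous posts.
pip install --upgrade "deeppavlov>=0.6.0"
```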

  2. You can, for example, first retrain the slot_filler from scratch on the new data, and then fine-tune the old bot on the new data with the new slot_filler. This approach will be able to answer questions about argentinian restaurants. So, your gobot_dstc2-1.json differs from gobot_dstc2.json only by the removed "fit_on": ["x_tokens"] line? That should work, really. Please provide the config just in case.

Thank you for your answers!

1- After modifying the file ~/.deeppavlov/download/dstc2/dstc_slot_vals.json
and deleting the files contained in ~/.deeppavlov/models/slotfill_dstc2/ using the command rm -r ~/.deeppavlov/models/slotfill_dstc2/, I encounter the same error:

This is how the config looks:
However, it’s quite strange: when I try to train the slotfiller by adding -d at the end of the command, it works and recognizes the word argentinian, but not in the same way as the old ones:
It was the same when I tried to interact with the gobot with -d at the end of the command python3 -m deeppavlov interact PATH/gobot_dstc2-1.json -d; it also worked:
But I need to specify the word “food” after “argentinian”.

2- I verified that nothing is deleted, but I still have the slotfiller problem I explained above, so I couldn’t test this solution. Here’s the config:

Thank you for your help !

  1. My fault, slotfill_dstc2.json only uses the ner model and can’t train it. To retrain the slot_filler, please run python3 -m deeppavlov train ner_dstc2 instead of python3 -m deeppavlov train slotfill_dstc2. You can then interact with the model using slotfill_dstc2.json.

  2. Please remove the line "fit_on": ["x_tokens"] from the config section with "id": "word_vocab" (without deleting the "id": "word_vocab" line).