Keras: update model with a bigger training set
I trained a model with Keras for text classification (supervised learning) using a training set. Let's say there are 50,000 sentences in this training set.
Over the course of a week I collect 5,000 new sentences and add them to the old training set.
If next week I want to train a new model with the new, bigger training set (50,000 old sentences + 5,000 new sentences), should I restart the training phase from the beginning, or can I take the old model and "update" it in some way to save time?
Tags: python, tensorflow, keras
asked Nov 14 '18 at 8:49 by erik.b
Check this post; your question is explained there.
– Juan Julián Cea Moran, Nov 14 '18 at 8:58
1 Answer
You can save and load the model/weights. Check out this tutorial by Jason Brownlee.
After you load the weights, you can continue training with the new dataset (the 55,000 samples). Since training is essentially just updating the weights, and you have loaded the already-trained weights, you are now "updating" the already trained model.
answered Nov 14 '18 at 8:57 by Dinari
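For illustration, here is a minimal sketch of that workflow. The file name classifier.h5 and the arrays X_new / y_new (the already-vectorized 55,000 sentences and their labels) are placeholders, not names taken from the question or the tutorial:

```python
# Rough sketch of the save/load-then-continue approach, not the answer's exact code.
# "classifier.h5" is a placeholder file name; X_new and y_new stand for the
# vectorized 55,000 sentences and their labels, assumed to already exist.
from tensorflow.keras.models import load_model

# Load the architecture, weights, and optimizer state saved after last week's run.
model = load_model("classifier.h5")

# fit() updates the existing weights in place, so this continues training
# rather than starting again from a random initialization.
model.fit(X_new, y_new, epochs=5, batch_size=32, validation_split=0.1)

# Save the checkpoint so next week's run can pick up from here.
model.save("classifier.h5")
```

If only the weights were saved (with save_weights), you would rebuild the same architecture first and call load_weights on it instead of load_model.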
Thank you for the quick reply! I tried the solution in the tutorial, and it works if the input layer (one input per distinct word in my sentences) does not change between the first, partial training set and the second, full training set. That is, if the 5,000 sentences I add to my initial training set contain only words that are already present in the first 50,000 sentences, then I can "update" my model. If there are new words, I have to retrain the model from scratch because the input layer has changed. Am I right?
– erik.b, Nov 14 '18 at 9:29
Assuming you use a bag-of-words representation, you can load the whole model and swap out just the input layer. That way you keep all the trained weights apart from the first layer's. You can look here for a way to do it: stackoverflow.com/questions/49546922/…
– Dinari, Nov 14 '18 at 10:17
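As a rough illustration of what swapping the input layer could look like, here is a sketch. It assumes the saved model is a plain two-layer Dense stack with 64 hidden units; classifier.h5, the vocabulary sizes, and the layer sizes are all placeholder assumptions, not details from the thread:

```python
# Sketch only: grow the bag-of-words input layer while reusing trained weights.
# Assumes the saved model is Dense(64, relu) -> Dense(1, sigmoid).
from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import Dense

old_model = load_model("classifier.h5")          # placeholder file name
old_vocab_size = old_model.input_shape[-1]
new_vocab_size = old_vocab_size + 120            # e.g. 120 previously unseen words

# Rebuild the same architecture, only with a wider input layer.
new_model = Sequential([
    Dense(64, activation="relu", input_shape=(new_vocab_size,)),
    Dense(1, activation="sigmoid"),
])

# First layer: copy the trained kernel rows for the old words, keep the fresh
# random rows for the new words, and reuse the old bias.
old_kernel, old_bias = old_model.layers[0].get_weights()
new_kernel, _ = new_model.layers[0].get_weights()
new_kernel[:old_vocab_size, :] = old_kernel
new_model.layers[0].set_weights([new_kernel, old_bias])

# Later layers have unchanged shapes, so their weights copy over directly.
for old_layer, new_layer in zip(old_model.layers[1:], new_model.layers[1:]):
    new_layer.set_weights(old_layer.get_weights())

new_model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
# new_model.fit(X_new, y_new, ...) then continues training as in the answer.
```

Only the rows of the first kernel that correspond to new words start from random values; everything else keeps its trained state, which is the point of the comment above.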