
Improvements to hyperparameter tuning tutorial. #22

Open
stevehadd opened this issue Nov 8, 2022 · 0 comments
@stevehadd (Member) commented:
From review by @HarveySouth:
Could: extend the hyperparameter tuning tutorial with a test-set evaluation of the best model. It would be interesting to discuss. My own approach to reporting a good estimate of model performance is to find the hyperparameters that maximise the desired metric on the validation set, then apply exactly the same training process to a model trained on the training and validation data combined, and evaluate that model against the test set. Since steps/epochs are themselves hyperparameters, the final model's test metric is then reportable. It is of course possible to report just the whole training process (which should be done anyway) and leave interpretation to the user/reader, but I'm not sure what best practice looks like in this case.
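A minimal sketch of the protocol described above, assuming scikit-learn and a synthetic dataset (the tutorial's actual model, data, and metric may differ): select hyperparameters on a validation split, retrain on train + validation combined, then report the metric once on the held-out test set.

```python
# Hypothetical illustration of the tune / retrain / test protocol;
# model and data are stand-ins, not the tutorial's own.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Split data into train / validation / test (60 / 20 / 20).
X, y = make_classification(n_samples=1000, random_state=0)
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=0.25, random_state=0)

# 1. Choose hyperparameters by maximising the metric on the validation set.
best_C, best_score = None, -1.0
for C in [0.01, 0.1, 1.0, 10.0]:
    model = LogisticRegression(C=C, max_iter=1000).fit(X_train, y_train)
    score = accuracy_score(y_val, model.predict(X_val))
    if score > best_score:
        best_C, best_score = C, score

# 2. Apply exactly the same training process to the combined
#    training + validation data with the chosen hyperparameters.
final_model = LogisticRegression(C=best_C, max_iter=1000).fit(
    X_trainval, y_trainval)

# 3. Evaluate once on the test set; this is the reportable metric.
test_accuracy = accuracy_score(y_test, final_model.predict(X_test))
print(f"best C={best_C}, test accuracy={test_accuracy:.3f}")
```

Because the test set is touched only once, after all model selection is done, the reported metric is an unbiased estimate of the final model's performance.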

@stevehadd stevehadd self-assigned this Nov 8, 2022