Hyperparameter Tuning with Richard Liaw
Hyperparameters define the configuration of the training process used to build a machine learning model. Whereas the parameters of a model are the values learned from the training data, hyperparameters are set before training begins and control how that learning happens: the learning rate, the batch size, the depth of a network, and so on.
A different set of hyperparameters yields a different model, so it is important to try multiple configurations and see which one performs best for a given application. Hyperparameter tuning is both an art and a science.
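As a rough illustration of the idea (not taken from the episode), the scikit-learn snippet below searches over a small grid of hyperparameters for a random forest; each combination produces a different model, and cross-validation picks the one that performs best:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Each combination of these hyperparameters produces a different model.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [2, 4, None],
}

search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)

print("Best hyperparameters:", search.best_params_)
print("Best cross-validated accuracy:", search.best_score_)
```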
Richard Liaw is an engineer and researcher, and the creator of Tune, a library for scalable hyperparameter tuning. Richard joins the show to talk through hyperparameters and the software that he has built for tuning them.
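For a sense of what a tuning run can look like with Tune, here is a minimal sketch using Ray Tune's tune.run and grid_search utilities; the training function and objective are hypothetical stand-ins, and exact API details may differ across versions of the library:

```python
from ray import tune

def train_model(config):
    # Stand-in for a real training loop: train a model with the given
    # hyperparameters and report its validation score back to Tune.
    score = 1.0 - (config["lr"] - 0.01) ** 2  # hypothetical objective
    tune.report(mean_accuracy=score)

# Launch one trial per hyperparameter configuration; Ray schedules them
# in parallel across available cores or cluster nodes.
analysis = tune.run(
    train_model,
    config={"lr": tune.grid_search([0.001, 0.01, 0.1])},
)

print(analysis.get_best_config(metric="mean_accuracy", mode="max"))
```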
Sponsorship inquiries: sponsor@softwareengineeringdaily.com
Transcript
Transcript provided by We Edit Podcasts. Software Engineering Daily listeners can go to weeditpodcasts.com/sed to get 20% off the first two months of audio editing and transcription services. Thanks to We Edit Podcasts for partnering with SE Daily. Please click here to view this show’s transcript.