Aurélien Geron

284 comments by Aurélien Geron

Hi @hansglick, That's a great question, thanks! The KL divergence equation computes the divergence between two probability distributions (see [my video on this topic](https://www.youtube.com/watch?v=ErfnhcEV1O8)). For example, if the probability...
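
For illustration, a minimal sketch (not from the original comment) of the KL divergence between two discrete distributions, computed as the sum over events of P(i)·log(P(i)/Q(i)):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) for two discrete distributions given as arrays."""
    p, q = np.asarray(p, dtype=np.float64), np.asarray(q, dtype=np.float64)
    mask = p > 0  # terms with P(i) = 0 contribute nothing to the sum
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.1, 0.4, 0.5]
q = [0.8, 0.15, 0.05]
print(kl_divergence(p, q))  # strictly positive, since p differs from q
print(kl_divergence(p, p))  # 0.0: a distribution diverges from itself by 0
```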

Hi @serchinnho, Thanks for your question, and my sincere apologies for the late response; I somehow missed your message. The `best_model` object is a clone of the best `SGDRegressor`...
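
A minimal, hypothetical sketch of the pattern this comment describes (the exact search setup is not shown in the excerpt): after a randomized search, `best_estimator_` is a clone of the best `SGDRegressor`, refit on the whole training set, while the original estimator is left untouched:

```python
import numpy as np
from scipy.stats import uniform
from sklearn.linear_model import SGDRegressor
from sklearn.model_selection import RandomizedSearchCV

rng = np.random.default_rng(42)
X = rng.random((100, 3))
y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(scale=0.1, size=100)

search = RandomizedSearchCV(
    SGDRegressor(random_state=42),
    param_distributions={"alpha": uniform(1e-5, 1e-2)},
    n_iter=10, cv=3, random_state=42)
search.fit(X, y)

best_model = search.best_estimator_    # a refit *clone* of the best model
print(best_model is search.estimator)  # False: the original is untouched
```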

Thanks a lot for your very kind words, @serchinnho, and I'm really glad you found the solution!

Hi @ethenkaufmann, Thanks for your feedback. There are a few issues in the code:

* When defining a model or a layer using subclassing, you should not create `Input`...
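
A minimal sketch of the Subclassing API point above (hypothetical layer sizes, not the original code): layers are created in the constructor and wired together in `call()`, and no `Input` object is needed:

```python
import tensorflow as tf

class MyModel(tf.keras.Model):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.hidden = tf.keras.layers.Dense(30, activation="relu")
        self.out = tf.keras.layers.Dense(1)

    def call(self, inputs):
        # `inputs` is the actual batch tensor: no tf.keras.Input here
        return self.out(self.hidden(inputs))

model = MyModel()
model.compile(loss="mse", optimizer="sgd")
# model.fit(X_train, y_train, epochs=10)  # trains like any Keras model
```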

For this model, the Functional API is probably much easier. For example, `UNet` is implemented here: https://keras.io/examples/vision/oxford_pets_image_segmentation/. Alternatively, if the goal is to learn how to use the Subclassing API,...
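
For contrast, a minimal Functional API sketch (hypothetical shapes, far simpler than the full U-Net in the linked example): here `Input` *is* used, and layers are chained like functions, which makes multi-branch architectures such as U-Net's skip connections easy to wire:

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(128, 128, 3))
x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
skip = x                                      # saved for the skip connection
x = tf.keras.layers.MaxPooling2D()(x)         # downsample to 64x64
x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(x)
x = tf.keras.layers.UpSampling2D()(x)         # back up to 128x128
x = tf.keras.layers.Concatenate()([x, skip])  # U-Net-style skip connection
outputs = tf.keras.layers.Conv2D(1, 1, activation="sigmoid")(x)
model = tf.keras.Model(inputs, outputs)
```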

Hi @koxt2, Thanks for your question. The `RandomForestRegressor` on line 94 doesn't specify `max_features`, so it uses the default, which is to use the total number of features in...
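
A minimal sketch of that default (hypothetical data, not the notebook's): leaving `max_features` unset makes the forest consider every feature at each split, whereas setting it lower adds randomness across trees. (In the Scikit-Learn versions the book used, the default was `"auto"`, i.e. all features for regression; recent versions use `1.0`, which means the same thing.)

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
X = rng.random((200, 8))
y = X.sum(axis=1) + rng.normal(scale=0.1, size=200)

forest_all = RandomForestRegressor(random_state=42)  # all 8 features per split
forest_sub = RandomForestRegressor(max_features=3,   # only 3 candidate features
                                   random_state=42)  # considered at each split
forest_all.fit(X, y)
forest_sub.fit(X, y)
```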

Hi @OrnellaFanais, Sorry for the late response; I was busy on other projects, but I'm back. The code in the book can definitely be used almost as-is with many...

Thanks for your feedback. The `best_estimator_` attribute ends with an underscore. In Scikit-Learn, this means that it's a learned parameter, so it's only available after the `fit()` function finishes successfully....

Note: just to make sure, I ran the [Colab Notebook for this chapter](https://colab.research.google.com/github/ageron/handson-ml2/blob/master/10_neural_nets_with_keras.ipynb#scrollTo=ff7VqMPhd0yH), and `best_estimator_` worked fine, so I think the hypothesis above is correct (the `fit()` function was...
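
A minimal sketch of the trailing-underscore convention (hypothetical grid search, not the notebook's): fitted attributes such as `best_estimator_` simply do not exist until `fit()` has finished successfully:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

grid_search = GridSearchCV(Ridge(), param_grid={"alpha": [0.1, 1.0, 10.0]}, cv=3)
try:
    grid_search.best_estimator_
except AttributeError:
    print("not fitted yet: best_estimator_ does not exist")

rng = np.random.default_rng(42)
X, y = rng.random((60, 4)), rng.random(60)
grid_search.fit(X, y)
print(grid_search.best_estimator_)  # now available, e.g. Ridge(alpha=...)
```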

Hi @ChrisChiasson, Great catch, thanks a lot! I believe you are referring to the naive baseline in the "Forecasting Several Steps Ahead" section. I'm not sure why I defined...
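
A minimal sketch of such a naive baseline (hypothetical shapes, not necessarily the definition the book settled on): repeat each series' last observed value for every future step, and use its error as the score to beat:

```python
import numpy as np

rng = np.random.default_rng(42)
n_steps, horizon = 50, 10
series = rng.random((1000, n_steps + horizon, 1))   # [batch, time, features]
X, Y = series[:, :n_steps], series[:, n_steps:, 0]  # past vs. future values

# Naive baseline: predict that the next `horizon` values all equal the
# last known value of each series.
Y_naive = np.tile(X[:, -1], (1, horizon))           # shape [1000, horizon]
print(np.mean((Y - Y_naive) ** 2))                  # baseline MSE to beat
```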