Issue with Input[39]
The example provided in Input[39] gives the following output:

I am new to the course. As opposed to what is displayed in the book for LogisticRegression, the slope of the decision boundary that separates class 0 and class 1 seems to be inverted.
Output from the book

Thank you for the issue. Which version of sklearn are you using?
Can you please give the output of sklearn.show_versions()?
sklearn.show_versions()

System:
    python: 3.7.5 (default, Oct 25 2019, 10:52:18) [Clang 4.0.1 (tags/RELEASE_401/final)]
executable: /Users/binod/.conda/envs/machine/bin/python
   machine: Darwin-18.7.0-x86_64-i386-64bit

Python dependencies:
       pip: 19.3.1
setuptools: 42.0.2.post20191203
   sklearn: 0.22
     numpy: 1.17.4
     scipy: 1.4.0
    Cython: None
    pandas: 0.25.3
matplotlib: 2.2.4
    joblib: 0.14.1

Built with OpenMP: True
Python version 3.7.5
OK, so the issue is that the default solver for LogisticRegression in scikit-learn changed from liblinear to lbfgs in version 0.22, which is why the fitted boundary differs from the book. The new solver is actually more accurate. If the data were scaled, the two solvers would give very similar results, so I think the best thing to do for now is to scale the data and make the example less sensitive to the choice of solver.
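For reference, here is a minimal sketch of what that fix could look like. The make_blobs toy data and the variable names are placeholders, not the book's actual Input[39] example:

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder 2-D, two-class dataset standing in for the book's data.
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

# Scaling the features makes the fit much less sensitive to the solver,
# so lbfgs (the default since 0.22) and liblinear give similar boundaries.
clf = make_pipeline(StandardScaler(), LogisticRegression())
clf.fit(X, y)

# Alternatively, the book's figure can be reproduced by requesting the
# previous default solver explicitly.
clf_old = LogisticRegression(solver="liblinear").fit(X, y)
```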