Chris Endemann
I know this is a minor point, and apologies in advance for potentially starting a syntax war, but I'm not sure I agree with the choice of capitalizing "Deep Learning"...
I taught this workshop last week, and learners were very curious about the purpose of the bias nodes. I think it would be nice to add a short text snippet...
Several additions here that are intertwined due to the addition of some helper functions. In case it is helpful, these changes can also be reviewed from my [forked repo's website](https://uw-madison-datascience.github.io/2022-10-26-machine-learning-novice-sklearn/)....
Some links pertaining to incubator lessons used the old hyperlink syntax of [link name](URL). These weren't working, so I replaced them with hrefs.
I taught this lesson last week (third time!) at the University of Wisconsin-Madison. This year, I experimented with using prefilled Jupyter notebooks rather than having everyone type out all of...
In the same spirit as the regression episode, I thought it might be useful to establish a baseline expectation in terms of the categorical cross-entropy loss metric. You can calculate...
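A minimal sketch of what I mean by a baseline: a model that always predicts the observed class frequencies has a categorical cross-entropy equal to the entropy of the label distribution, which any trained network should beat. The labels here are made up for illustration.

```python
import numpy as np

# Illustrative class labels for a training set (values are hypothetical).
labels = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 2])

# A naive baseline predicts each class with its observed frequency.
class_counts = np.bincount(labels)
class_probs = class_counts / len(labels)

# Categorical cross-entropy of that baseline: -mean(log p(true class)),
# i.e. the entropy of the empirical label distribution.
baseline_loss = -np.mean(np.log(class_probs[labels]))
print(f"baseline categorical cross-entropy: {baseline_loss:.4f}")
```

A trained model whose validation loss hovers near this number has learned essentially nothing beyond the class proportions.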
I have some updated slides that I used to teach this lesson last week: https://docs.google.com/presentation/d/1uT4uvfWrpvrrQEFp84PGfAQ2r9Ylqx8tbwiVFuGfEao/edit?usp=sharing Please feel free to use/repurpose anything in there. I felt it was important to comment...
It's very useful to have a diagonal reference line when viewing scatterplots that plot true vs predicted data. Deviations above/below the diagonal tell you where the model is making more...