Helpful Resources
Good Blogs: https://colah.github.io/
Graph Theory: http://olizardo.bol.ucla.edu/classes/soc-111/textbook/_book/1-intro.html#intro
Derivative of a quadratic form (matrix calculus): http://michael.orlitzky.com/articles/the_derivative_of_a_quadratic_form.xhtml
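A quick numerical sanity check of the identity that the article above derives, d/dx (xᵀAx) = (A + Aᵀ)x. The helpers below are ad-hoc for this sketch (pure Python, no NumPy):

```python
# Check d/dx (x^T A x) = (A + A^T) x by central finite differences.

def quad_form(A, x):
    n = len(x)
    return sum(x[i] * A[i][j] * x[j] for i in range(n) for j in range(n))

def analytic_grad(A, x):
    # (A + A^T) x, the closed-form gradient
    n = len(x)
    return [sum((A[i][j] + A[j][i]) * x[j] for j in range(n)) for i in range(n)]

def numeric_grad(A, x, h=1e-6):
    grads = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        grads.append((quad_form(A, xp) - quad_form(A, xm)) / (2 * h))
    return grads

A = [[1.0, 2.0], [3.0, 4.0]]   # deliberately non-symmetric
x = [0.5, -1.5]
print(analytic_grad(A, x))     # [-6.5, -9.5]
print(numeric_grad(A, x))      # matches to ~1e-6
```

Using a non-symmetric A shows why the gradient is (A + Aᵀ)x rather than 2Ax, which only holds when A is symmetric.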
Why an MLP with a linear output layer is still non-linear (discussion): https://www.reddit.com/r/MachineLearning/comments/qex0o7/d_mlps_are_actually_nonlinear_linear/
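A minimal sketch of the point in that thread (toy weights chosen here for illustration): even though the final layer of an MLP is a plain linear map, composing it with a ReLU hidden layer makes the whole function non-linear in the input, so additivity f(a+b) = f(a) + f(b) fails:

```python
def relu(v):
    return [max(0.0, t) for t in v]

def mlp(x):
    # hidden layer: 2 ReLU units with fixed toy weights; output: linear combination
    h = relu([1.0 * x - 1.0, -1.0 * x + 1.0])
    return 2.0 * h[0] + 3.0 * h[1]

a, b = 2.0, -2.0
print(mlp(a) + mlp(b))   # 11.0
print(mlp(a + b))        # 3.0 -- differs, so the network is non-linear
```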
Computer science at UCF: https://www.cs.ucf.edu/~kienhua/classes/
Statistics with proofs and python: https://xavierbourretsicotte.github.io/#
Statistics notes written out on Twitter: https://twitter.com/mervenoyann/status/1386752998131605510
Data science project (including XGBoost) example implementation: https://github.com/alexeygrigorev/mlbookcamp-code/blob/master/course-zoomcamp/06-trees/notebook.ipynb
Data science project template: https://drivendata.github.io/cookiecutter-data-science/#cookiecutter-data-science
Deep learning descriptions: https://arthurdouillard.com/deepcourse/
DL time-series resource: https://github.com/Alro10/deep-learning-time-series
Cool Papers
Uncertainty estimation for NNs: https://arxiv.org/pdf/2003.02037.pdf
Decomposing sensitivity components for calibration: https://arxiv.org/pdf/2110.14577.pdf
HMM -> LDS (Minka tech report): http://www.stat.columbia.edu/~liam/teaching/neurostat-fall20/papers/hmm/minka-lds-techreport.pdf
Data + ODE: https://arxiv.org/pdf/2103.10153.pdf