Resources
Welcome to the resources wiki!
- Basic Introduction Tutorial
- Turing Institute Research Software Engineering Course
- Code Style Guide: PEP8
- Guide to Environments and Conda
- Jupyter Notebooks Guide
- NumFOCUS Courses Catalog
Guides to Libraries:
- Introduction
Notebooks
- Examples
- Your own
- Link to Google Drive
- Link to GitHub
- External data
- Videos
- Andrew Ng’s Machine Learning online course on Coursera
- Coursera Machine Learning in Python
- Resources accompanying Anita's book explaining the mathematical workings (ask Anita for access)
- Harvard Class Material (with videos) - recommended by Risa
- ML Tutorial: Gaussian Processes, Richard Turner (YouTube)
- High-Performance Jupyter: combining Dask and RAPIDS in JupyterLab and notebooks (notebooks on GitHub)
- Crash course on Machine Learning and Distributed Computing Frameworks in Data Centers. Covers basic concepts such as performance, parallelism, and virtualization; machine learning and data-analytics processes, including supervised and unsupervised learning and neural networks; and use cases and exercises on Apache Spark, a platform for distributed data processing, and Intel BigDL, a Spark library optimized for neural networks and deep learning.
Beginners:
- Kaggle
- The hundred-page machine learning book (Burkov, with code on GitHub)
- IBM Machine Learning for Dummies
- Introduction to Statistical Learning (James, Witten, Hastie, Tibshirani)
Advanced:
- Information Theory, Inference, and Learning Algorithms (MacKay, 2003)
- Gaussian Processes for Machine Learning (Rasmussen and Williams, 2006)
- Pattern Recognition and Machine Learning (Bishop, 2006, Springer)
- Deep Learning (Goodfellow, Bengio, Courville, 2016, MIT Press)
- Elements of Statistical Learning (Hastie, Tibshirani, Friedman)
- Machine Learning: A Probabilistic Perspective (Murphy)
- https://madewithml.com/topics/
- Distillation of ML publications
- Arxiv Sanity
- Papers with Code
- r/MachineLearning on Reddit
- Weights and Biases experiment logging - Docs
- Kaggle
- The A-Z of AI and Machine Learning: Comprehensive Glossary
- Machine Learning Map
- Neural Network Zoo
- Towards Data Science
- Jupyter Notebook Review Tool
- Software Carpentry
- Data Carpentry
- AI4ESS
- Reproducibility
- Alan Turing Institute YouTube Channel
- Alan Turing Institute Podcasts
- Cambridge Spark
- Microsoft Research Podcasts
- Deep Learning
- Digital Twins: The Next Phase of the AI Revolution?
- Schrodinger Lecture: Data, Data Everywhere But Let's Stop and Think
- Artificial Intelligence, the History and Future - with Chris Bishop
- The Brad Efron Honorary Symposium on LARGE-SCALE INFERENCE - Keith Baggerly
- How statistics lost their power, and why we should fear what comes next
- IBM Watson | Full Q&A | Oxford Union
- Militarizing Your Backyard with Python: Computer Vision and the Squirrel Hordes
These resources are ordered by usefulness within each subsection.
- 3Blue1Brown Deep Learning Series - a good intro to get intuition
- MIT Deep Learning Lectures (2020) - the first 6 are a good overview of the different areas of neural network research
- Hugo Larochelle's Short Lectures (2016) - quite dated now but the first 17 are good
- How NNs Work
- Coursera Deep Learning Part 1 and Part 2 - covers all the basics in the form of short videos plus notebooks with code, if you prefer that format
- Interactive neural network in your browser - play around with this to get intuition for all the hyperparameters
- Nielsen's online textbook - a fairly mathematical but very readable version of the DL basics
- Part 2 of the Deep Learning textbook - even more detailed version of the above
- CS231n (Stanford) - one of the best explanations of neural networks specialised for images
- Introduction to LSTMs by Chris Olah - very good explanation with intuition.
- PyTorch basic tutorial - PyTorch is a Python library for neural networks that is the most popular among researchers (partly due to the clunkiness of TensorFlow, the other popular library)
- JAX library and tutorial - JAX is a relatively new neural network library that is essentially NumPy + autodiff + GPU support. It is a much more bare-bones framework than PyTorch, but it can be useful for seeing exactly what is going on.
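To make the "NumPy + autodiff" idea above concrete, here is a minimal sketch (assuming the `jax` package is installed; the toy quadratic loss and the variable names are illustrative, not from any of the linked tutorials):

```python
import jax
import jax.numpy as jnp


def loss(w):
    # NumPy-style code, but traceable by JAX: a toy quadratic loss.
    return jnp.sum((w - 2.0) ** 2)


# jax.grad transforms a scalar-valued function into a function
# that returns its gradient with respect to the first argument.
grad_loss = jax.grad(loss)

w = jnp.array([0.0, 1.0, 3.0])
print(grad_loss(w))  # analytically, the gradient is 2 * (w - 2)
```

The same `loss` function runs unchanged on GPU if one is available, which is what makes JAX useful for seeing exactly what a training step computes.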
Podcast with Karen Ottewell, Director of Academic Development and Training for International Students. Academic writing and argumentation is itself a distinct form of English, even for first-language speakers. Especially interesting is the distinction between writer responsibility and reader responsibility: with writer responsibility, the writer must guide the reader and make the effort to be understood; with reader responsibility, the reader must make that effort. Writer responsibility is not just about clarity but also about structure. You may find more resources suited to you at https://www.langcen.cam.ac.uk/adtis/adtis-index.html