Special Topic in Solid State Physics: Machine Learning for Physical Scientists
Course Code 2304641, Academic Year 2563 (2020)
Week | Topics | Readings | Homework |
---|---|---|---|
Week 0-1 | Course Introduction (video, slides), Introduction to the Statistical Learning Theory Framework (video, slides) | Mehta's Sec. 1-3, Cucker and Smale (paper) | HW 0 |
Week 2 | Linear Regression (video, slides), Regularization (video, slides) | Mehta's Sec. 5-6 | HW 1 (out) |
Week 3 | Bias-Variance Decomposition, Introduction to Bayesian Inference (video, slides), MLE, MAP, and Introduction to Multi-class Classification (video, slides), Logistic Regression (video, slides) | Mehta's Sec. 5-7, Bias-variance (note), Perceptron Learning Algorithm (link) | HW 1 (due) |
Week 4 | Kernel Methods (video, slides), Bagging (reducing variance) and Boosting (reducing bias) (a great video lecture from the University of Waterloo) | Mehta's Sec. 8, Shalev-Shwartz's book chapter on kernel methods (link), How to win the Netflix prize with ensemble learning (link) | HW 2 (out) |
Week 5 | Review of Phase Transitions in the Ising Model (video 1, video 2), Intro to the Statistical Physics of Inference and Spin Glasses (video, slides) | Skim this excellent review paper to explore how phase transitions can arise in statistical inference (paper) | HW 2 (due) |
Week 6 | Computational Phase Transitions in Statistical Inference, from Learnable to Impossible (video, slides) | Read this nice lecture note, given at the Courant Institute, on phase transitions in statistical estimation problems (note), and a more rigorous treatment of the spiked Wigner model (link) | Readings for Weeks 5-6 |
Week 7 | Intro to Computational Complexity, its Connection to Spin Glasses, and Whether a Computer Can Solve a Computational Problem - a lecture series by Prof. Christopher Moore (video 1, video 2, video 3, notes) | A nice lecture note on the statistical physics of spin glasses, with discussions of replica symmetry breaking (link), an article on P vs NP (link), The Computer Science and Physics of Community Detection (link) | Readings for Weeks 5-7 |
Week 8 | Introduction to Deep Learning, Feedforward Neural Networks, Backpropagation, Stochastic Gradient Descent (Yann LeCun's NYU course: video 1 - skip the first 15 minutes, video 2) | Mehta's Sec. 9, 11 | HW 3 (out) |
Week 9 | Convolutional Neural Networks (video), Optimization in Deep Learning (video) | Mehta's Sec. 5, 10, 11 | HW 3 (due) |
Week 10 | Intro to the Information Bottleneck by Dr. Wave Ngampruetikorn (video, slides) | Deterministic Information Bottleneck (link) | HW 4 (out) |
Week 11 | Deep Learning for Medical Imaging by Dr. Tiam Jaroensri (Google Health) (video) | | HW 4 (due), HW 5 (out) |
Week 12 | Double Descent Phenomenon in Deep Learning, Project Discussion, Looking Forward (video), Graph Neural Networks - guest lecture by Dr. Teerachote (link to GitHub, video) | | HW 5 (due), HW 6 (out) |
Week 13 | Introduction to Quantum Machine Learning - guest lecture by P' Supanut (CQT, Singapore) (video, lecture note) | | HW 6 (due) |
In the age of Big Data, "Artificial Intelligence (AI) is the new electricity" is perhaps not an overstatement. Well-developed AI can turn a seemingly useless pile of information into useful knowledge, which in turn can inform important decisions or even fuel scientific discoveries. In a style suitable for physicists, this course provides the fundamental concepts and techniques of Machine Learning (ML), a subfield of AI that harnesses the power of computation, whether classical or quantum, to turn data into useful computational models. We will first cover the core principles of statistical learning theory, the backbone of ML, including overfitting, regularization, the bias-variance tradeoff, generalization, and model complexity. We will then cover important classical models of supervised and unsupervised learning such as ensemble models, Deep Learning, clustering and manifold learning, energy-based models such as Restricted Boltzmann Machines, and variational inference. Throughout the course, we will emphasize the natural connections between ML and statistical physics, and some quantum-inspired algorithms for ML will be presented towards the end of the course.

In addition to pencil-and-paper homework, there will be a Python-based or TensorFlow-based programming component, delivered as Jupyter notebooks, in which you will learn how to deploy ML algorithms in practice on physics-inspired datasets such as the Ising model, the XY model and topological phase transitions, and Monte Carlo simulations of scattering experiments at the LHC. We will end the course with project-based presentations, in which students will tackle open problems in ML to which physicists might be able to contribute, or apply ML to solve complex physics problems of interest.
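To give a flavor of those notebook exercises, here is a minimal sketch of one such task: sampling small 2D Ising configurations with a Metropolis loop and training a logistic-regression classifier to separate the ordered and disordered phases. The lattice size, temperatures, sweep counts, and the helpers `sample_ising` and `features` are illustrative assumptions, not the actual homework.

```python
# Minimal sketch: classify ordered vs. disordered 2D Ising configurations.
# All parameters and helpers here (sample_ising, features, temperatures)
# are illustrative assumptions, not the actual course notebook.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
L = 8  # linear lattice size; in units J = k_B = 1, T_c is about 2.269

def sample_ising(T, n_samples=50, sweeps=100):
    """Spin configurations at temperature T via single-site Metropolis updates."""
    configs = []
    for _ in range(n_samples):
        s = rng.choice([-1, 1], size=(L, L))
        for _ in range(sweeps * L * L):  # one sweep = L*L attempted flips
            i, j = rng.integers(L, size=2)
            nn = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                  + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            dE = 2 * s[i, j] * nn  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] *= -1
        configs.append(s)
    return configs

def features(s):
    """Two physical features per configuration: |magnetization| and energy per site."""
    m = abs(s.mean())
    e = -np.mean(s * (np.roll(s, 1, axis=0) + np.roll(s, 1, axis=1)))
    return [m, e]

# Label 1 = ordered (T = 1.5 < T_c), label 0 = disordered (T = 3.5 > T_c)
X = np.array([features(s) for T in (1.5, 3.5) for s in sample_ising(T)])
y = np.array([1] * 50 + [0] * 50)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(C=1.0).fit(X_tr, y_tr)  # C is the inverse L2 regularization strength
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

Using physical features such as |magnetization| and energy per site keeps the classifier linear and interpretable; the actual course notebooks may instead work with raw spin configurations or deep networks.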
Grading
- 50% Homework
- 20% Exam
- 30% Final Project and Presentation
References
- David MacKay's Information Theory, Inference, and Learning Algorithms
- Pankaj Mehta et al., A high-bias, low-variance introduction to Machine Learning for physicists
- Caltech’s Learning from Data
- Peter Wittek’s Quantum Machine Learning: What Quantum Computing Means to Data Mining
- Roman Orus’ A Practical Introduction to Tensor Networks: Matrix Product States and Projected Entangled Pair States
Recommended Background
- Quantum Mechanics 2 (Chula Playlist)
- Statistical Physics (Chula Playlist)
- Python programming (Recommended Course)