A growing trend in computer science is to replace expensive calculations with a neural network that approximates them, often running orders of magnitude faster while staying close to (and sometimes matching) the accuracy of the original equations. For example:
- GraphCast replaces computationally expensive physics-based weather equations that previously only supercomputers could run.
- TeaNet predicts how atoms move thousands of times faster than quantum-mechanical modelling (density functional theory, or DFT).
Does this mean that physics-based algorithms are useless? No! Because:
- If data points are missing, scientists use the formulas to interpolate them.
- We can use the formulas to generate training data for our new ML models.
- Isn't this crazy? "Gathering training data" just becomes "solving the formulas with different values for x".
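To make that last point concrete, here's a minimal sketch of what "gathering training data by solving a formula" can look like. The formula and sampling ranges are just illustrative stand-ins (a toy projectile-range equation, not DFT or weather physics):

```python
import math
import random

def projectile_range(v, angle_deg, g=9.81):
    """Physics formula: horizontal range of a projectile launched at
    speed v (m/s) and angle angle_deg (degrees), ignoring air drag."""
    theta = math.radians(angle_deg)
    return v ** 2 * math.sin(2 * theta) / g

# "Gathering training data" = evaluating the formula at many inputs.
random.seed(0)
dataset = []
for _ in range(10_000):
    v = random.uniform(1, 100)     # launch speed (m/s)
    angle = random.uniform(0, 90)  # launch angle (degrees)
    dataset.append(((v, angle), projectile_range(v, angle)))

# A neural net trained on `dataset` learns to approximate the formula;
# that payoff is trivial here, but huge when one label takes hours of
# supercomputer time to compute.
```

The same loop, with the toy formula swapped for an expensive solver, is essentially how surrogate models get their training sets.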
Anyhow, if you ever need to speed up code, see if you can replace it with a neural net. I hope you’re all staying warm in this wonderful new year.
- Curtis
P.S. Since I quit my startup, I’ve been trying to figure out a fun computation-heavy project to work on. If you’re doing some cool research in science, I’d love to call you!