diff --git a/.nojekyll b/.nojekyll
index ece341cc..af6f4909 100644
--- a/.nojekyll
+++ b/.nojekyll
@@ -1 +1 @@
-eec41acf
\ No newline at end of file
+585ab0ee
\ No newline at end of file
diff --git a/dl_primer.html b/dl_primer.html
index 66cdc282..d72f2031 100644
--- a/dl_primer.html
+++ b/dl_primer.html
@@ -84,6 +84,8 @@
 }
+
+
@@ -438,7 +440,7 @@
The concept of deep learning has its roots in early artificial neural networks. It has seen several waves of popularity, starting with the introduction of the Perceptron in the 1950s (Rosenblatt 1957), followed by the development of the backpropagation algorithm in the 1980s (Rumelhart, Hinton, and Williams 1986).
-The term “deep learning” emerged in the 2000s, marked by breakthroughs in computational power and data availability. Key milestones include the successful training of deep networks by Geoffrey Hinton, one of the god fathers of AI, and the resurgence of neural networks as a potent tool for data analysis and modeling.
+The term “deep learning” emerged in the 2000s, marked by breakthroughs in computational power and data availability. Key milestones include the successful training of deep networks such as AlexNet (Krizhevsky, Sutskever, and Hinton 2012), from the group of Geoffrey Hinton, one of the godfathers of AI, and the resurgence of neural networks as a potent tool for data analysis and modeling.
In recent years, deep learning has grown exponentially, becoming a transformative force across various industries. Figure 3.1 shows that we are currently in the third era of deep learning. From 1952 to 2010, computational growth followed an 18-month doubling pattern. This accelerated dramatically to a 6-month doubling cycle from 2010 to 2022. At the same time, major-scale models emerged between 2015 and 2022; these were trained with 2 to 3 orders of magnitude more compute than the prevailing trend and followed a 10-month doubling cycle.
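To make these doubling cycles concrete, the short sketch below (an illustration added here, not part of the original text) computes the total compute growth each era's doubling period implies; the era spans are approximate and the `growth_factor` helper is a hypothetical name used only for this example.

```python
# Illustrative arithmetic only: total growth implied by a doubling cycle.
# The doubling periods (18, 6, and 10 months) come from the text; the era
# spans are approximate and the helper name is hypothetical.

def growth_factor(months: float, doubling_months: float) -> float:
    """Multiplicative growth over `months` given a doubling period."""
    return 2.0 ** (months / doubling_months)

# First era (1952-2010): 18-month doubling over ~58 years.
print(f"1952-2010: x{growth_factor(58 * 12, 18):.2e}")

# Deep learning era (2010-2022): 6-month doubling over 12 years.
print(f"2010-2022: x{growth_factor(12 * 12, 6):.2e}")

# Large-scale era (2015-2022): 10-month doubling over 7 years.
print(f"2015-2022: x{growth_factor(7 * 12, 10):.2e}")
```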