diff --git a/optimizations.qmd b/optimizations.qmd
index 9b457c3d..21ec4cfb 100644
--- a/optimizations.qmd
+++ b/optimizations.qmd
@@ -1,7 +1,5 @@
 # Model Optimizations
 
-<<<<<<< HEAD
-=======
 ::: {.callout-tip}
 
 ## Learning Objectives
@@ -9,7 +7,6 @@
 
 :::
 
->>>>>>> upstream/main
 ## Introduction
 
 When machine learning models are deployed on systems, especially on resource-constrained embedded systems, the optimization of models is a necessity. While machine learning inherently often demands substantial computational resources, the systems are inherently limited in memory, processing power, and energy. This chapter will dive into the art and science of optimizing machine learning models to ensure they are lightweight, efficient, and effective when deployed in TinyML scenarios.