Commit 1c333af: Typo

edomora97 committed Jun 10, 2020
1 parent 4357924 commit 1c333af
Showing 2 changed files with 7 additions and 7 deletions.
6 changes: 3 additions & 3 deletions lectures/2020-06-03.tex

@@ -33,7 +33,7 @@ \chapter{Recursive Identification}
\section{Least square}

\[
-\hat{\theta_N} = \argmin_\theta \left\{ J_N(\theta) = \frac{1}{N} \sum_{t=1}^N \left( y(t) - \hat{y}(t|t-1, \theta) \right)^2 \right\}
+\hat{\theta}_N = \argmin_\theta \left\{ J_N(\theta) = \frac{1}{N} \sum_{t=1}^N \left( y(t) - \hat{y}(t|t-1, \theta) \right)^2 \right\}
\]
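As a quick baseline, the batch estimate $\hat{\theta}_N$ can be computed directly from all $N$ samples at once. A minimal sketch, assuming NumPy and illustrative names (`phis` stacks the regressor vectors $\phi(t)^\top$ row-wise; these names are not from the notes):

```python
import numpy as np

def batch_ls(phis, ys):
    """Batch least squares: theta_hat = argmin_theta (1/N) sum_t (y(t) - phi(t)^T theta)^2.

    phis stacks the regressor vectors phi(t)^T row-wise (illustrative names)."""
    theta, *_ = np.linalg.lstsq(phis, ys, rcond=None)
    return theta
```

Recomputing this from scratch at every new sample costs a full solve each time, which is what the recursive forms below avoid.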

We need to find the model predictor $\hat{y}(t|t-1, \theta)$.
@@ -100,7 +100,7 @@ \subsection{First form}
\hat{\theta}_{N-1} &= S(N-1)^{-1} \sum_{t=1}^{N-1} \phi(t)y(t) \\
\sum_{t=1}^{N-1} \phi(t)y(t) &= S(N-1)\hat{\theta}_{N-1} \\
\sum_{t=1}^{N} \phi(t)y(t) &= \sum_{t=1}^{N-1} \phi(t)y(t) + \phi(N)y(N) \\
-\sum_{t=1}^{N} \phi(t)y(t) &= S(N-1)\hat{\theta}_{N-1} + \phi(N)y(N) = \text{ equation } (1.2) \\
+\sum_{t=1}^{N} \phi(t)y(t) &= S(N-1)\hat{\theta}_{N-1} + \phi(N)y(N) = \text{ equation } (7.2) \\
S(N) \hat{\theta}_N &= S(N-1) \hat{\theta}_{N-1} + \phi(N)y(N)
\end{align}
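Since $S(N) = S(N-1) + \phi(N)\phi(N)^\top$, the last identity rearranges into the update $\hat{\theta}_N = \hat{\theta}_{N-1} + S(N)^{-1}\phi(N)\left(y(N) - \phi(N)^\top\hat{\theta}_{N-1}\right)$. A numerical sketch of this first form, assuming NumPy; the small regularization `delta` for $S(0)$ and all names are illustrative choices, not part of the lecture:

```python
import numpy as np

def rls_first_form(phis, ys, delta=1e-3):
    """RLS, first form: maintains S(N) = S(N-1) + phi(N) phi(N)^T explicitly.

    delta regularizes S(0) so the early solves are well defined (an assumption,
    not part of the derivation)."""
    n = phis.shape[1]
    S = delta * np.eye(n)   # S(0): small regularization instead of the zero matrix
    theta = np.zeros(n)     # initial estimate theta_hat(0)
    for phi, y in zip(phis, ys):
        S = S + np.outer(phi, phi)                     # S(N) = S(N-1) + phi(N) phi(N)^T
        err = y - phi @ theta                          # prediction error at time N
        theta = theta + np.linalg.solve(S, phi) * err  # theta(N) = theta(N-1) + S(N)^{-1} phi(N) err
    return theta
```

One can verify by induction that this recursion reproduces $S(N)\hat{\theta}_N = \sum_{t=1}^N \phi(t)y(t)$ exactly (up to the `delta` term), so it matches the batch estimate.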

@@ -229,7 +229,7 @@ \section{Recursive Least Square with Forgetting Factor}

$\hat{\alpha}_0$ is the correct estimate at time $N$, but it does not minimize the objective function $J_N(\alpha)= \frac{1}{N} \sum_{t=1}^N \left( y(t) - \hat{y}(t|t-1, \alpha) \right)^2$ because the latter considers the entire time history of the system.

-In order to identify a time varying parameter the RLS must be force to forget old data.
+In order to identify a time varying parameter the RLS must be forced to forget old data.
The solution is provided by the minimization of a modified cost $J_N$:
\[
J_N(\theta) = \frac{1}{N} \sum_{t=1}^N \rho^{N-t}\left( y(t) - \hat{y}(t|t-1,\theta) \right)^2
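Minimizing this weighted cost amounts to replacing $S(N) = S(N-1) + \phi(N)\phi(N)^\top$ with $S(N) = \rho\, S(N-1) + \phi(N)\phi(N)^\top$, which down-weights old samples exponentially. A sketch under stated assumptions (NumPy; the values of `rho` and `delta` are illustrative):

```python
import numpy as np

def rls_forgetting(phis, ys, rho=0.98, delta=1e-3):
    """RLS with forgetting factor rho in (0, 1]: sample t is weighted by rho^(N-t).

    rho = 1 recovers plain RLS; delta regularizes S(0) (illustrative choices)."""
    n = phis.shape[1]
    S = delta * np.eye(n)
    theta = np.zeros(n)
    for phi, y in zip(phis, ys):
        S = rho * S + np.outer(phi, phi)               # old information decays by rho per step
        err = y - phi @ theta
        theta = theta + np.linalg.solve(S, phi) * err  # same correction as plain RLS
    return theta
```

With $\rho < 1$ the estimate tracks a parameter that changes during the data record, at the price of a larger variance when the parameter is in fact constant.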
8 changes: 4 additions & 4 deletions lectures/2020-06-04.tex

@@ -203,8 +203,8 @@ \subsection*{D to A converter}
Another simple and frequently used discretization technique is the discretization of the time derivative $\dot{x}$.

\begin{align*}
-\text{\textbf{eulero backward}} &\qquad \dot{x} \approx \frac{x(t)-x(t-1)}{\Delta T} = \frac{x(t)-z^{-1}x(t)}{\Delta T} = \frac{z-1}{z\Delta T} x(t) \\
-\text{\textbf{eulero forward}} &\qquad \dot{x} \approx \frac{x(t+1)-x(t)}{\Delta T} = \frac{zx(t)-x(t)}{\Delta T} = \frac{z-1}{\Delta T} x(t)
+\text{\textbf{Eulero backward}} &\qquad \dot{x} \approx \frac{x(t)-x(t-1)}{\Delta T} = \frac{x(t)-z^{-1}x(t)}{\Delta T} = \frac{z-1}{z\Delta T} x(t) \\
+\text{\textbf{Eulero forward}} &\qquad \dot{x} \approx \frac{x(t+1)-x(t)}{\Delta T} = \frac{zx(t)-x(t)}{\Delta T} = \frac{z-1}{\Delta T} x(t)
\end{align*}
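Both approximations can be tried on a scalar system $\dot{x} = a\,x + b\,u$: forward Euler gives $x(t+1) = (1 + a\Delta T)\,x(t) + \Delta T\,b\,u(t)$, while backward Euler solves $x(t) = x(t-1) + \Delta T\,\dot{x}(t)$ for $x(t)$. A minimal sketch, assuming NumPy; `a`, `b`, `u`, `dt` are illustrative:

```python
import numpy as np

def euler_forward(a, b, u, dt, x0=0.0):
    """Forward Euler for x' = a x + b u: x(t+1) = x(t) + dt * (a x(t) + b u(t))."""
    x = [x0]
    for uk in u:
        x.append(x[-1] + dt * (a * x[-1] + b * uk))
    return np.array(x)

def euler_backward(a, b, u, dt, x0=0.0):
    """Backward Euler: x(t) = x(t-1) + dt * (a x(t) + b u(t)), solved for x(t)."""
    x = [x0]
    for uk in u:
        x.append((x[-1] + dt * b * uk) / (1.0 - dt * a))
    return np.array(x)
```

For a stable pole ($a < 0$) backward Euler stays stable for any $\Delta T$, whereas forward Euler requires $|1 + a\Delta T| < 1$; both agree with the continuous-time response as $\Delta T \to 0$.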

General formula
@@ -344,9 +344,9 @@ \subsection*{D to A converter}

\draw[dotted] (1.5,1.875) -- (1.5,0) node[below] {};
\node[below] at (0.75,0) {\footnotesize B.W. of interest};
-\draw (4,0.1) -- (4,0) node[below] {$\scriptstyle f_S$};
+\draw (4,0.1) -- (4,0) node[below] {$\scriptstyle \omega_S$};
\draw (1.5,0) edge[bend left=20,->] (4,0);
-\node at (2.75,0.45) {\footnotesize x10};
+\node at (2.75,0.45) {$\scriptstyle \times 10$};
\end{tikzpicture}
\end{figure}
