Change back-propagation to backpropagation

Tobias Eidelpes 2023-11-22 10:59:39 +01:00
parent d79f968c6f
commit 3ed52bf88a


@@ -790,8 +790,7 @@ and updating the weights within the network is it possible to gain
 experience $E$ at carrying out a task $T$. How the weights are updated
 depends on the algorithm which is used during the \emph{backward pass}
 to minimize the error. This type of procedure is referred to as
-\emph{back-propagation} (see
-section~\ref{ssec:theory-back-propagation}).
+\emph{backpropagation} (see section~\ref{ssec:theory-backprop}).
 One common type of loss function is the \gls{mse} which is widely used
 in regression problems. The \gls{mse} is a popular choice because it
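
For reference, a minimal sketch of the standard definitions behind the diffed passage, not taken from the source itself: the \gls{mse} over $n$ samples with targets $y_i$ and predictions $\hat{y}_i$, together with a plain gradient-descent weight update with learning rate $\eta$ (both symbols are assumptions for illustration, not identifiers from the thesis).

\[
  \mathrm{MSE} \;=\; \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - \hat{y}_i\bigr)^{2},
  \qquad
  w \;\leftarrow\; w - \eta\,\frac{\partial\,\mathrm{MSE}}{\partial w}
\]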