From 2553b4ed31e2b73bf3d20f05a353bb03710d338e Mon Sep 17 00:00:00 2001
From: Tobias Eidelpes
Date: Thu, 9 Nov 2023 20:23:09 +0100
Subject: [PATCH] Remove Inception v4 chapter

---
 thesis/thesis.tex | 13 +++++++++++--
 1 file changed, 11 insertions(+), 2 deletions(-)

diff --git a/thesis/thesis.tex b/thesis/thesis.tex
index fed3565..1d494a2 100644
--- a/thesis/thesis.tex
+++ b/thesis/thesis.tex
@@ -1510,8 +1510,17 @@ section~\ref{sssec:theory-googlenet}) by 0.9\%.
 \subsubsection{ResNet}
 \label{sssec:theory-resnet}
 
-\subsubsection{Inception v4}
-\label{sssec:theory-inception-v4}
+The 22-layer structure of GoogLeNet \cite{szegedy2015} and the
+19-layer structure of VGGNet \cite{simonyan2015} showed that
+\emph{going deeper} is beneficial for achieving better classification
+performance. However, the authors of VGGNet already note that stacking
+even more layers does not lead to better performance because the model
+is \emph{saturated}. \textcite{he2016} provide a solution to the
+vanishing gradient as well as the degradation problem by introducing
+\emph{skip connections} to the network. They call their resulting
+network architecture \emph{ResNet} and since it is used in this work,
+we will give a more detailed account of its structure in
+section~\ref{sec:methods-classification}.
 
 \subsubsection{DenseNet}
 \label{sssec:theory-densenet}