Artificial Neural Networks for Engineering Students
Author
Matthew Burns
Last updated
9 years ago
License
Creative Commons CC BY 4.0
Abstract
A primer to get UCSD ECE students into neural networks.
\documentclass{article}
\usepackage{nips15submit_09, times}
\usepackage{hyperref}
\usepackage{url}
\usepackage{amsmath}
\usepackage{algorithm}
\usepackage[noend]{algpseudocode}
\usepackage{graphicx}
\usepackage{amssymb}
\usepackage{eqnarray}
\usepackage{multicol}
\setlength\columnsep{20pt}
\usepackage[margin={.75in,1.25in}]{geometry}
%Bibliography setup
\usepackage[backend=biber]{biblatex}
\addbibresource{references.bib}
\title{Artificial Neural Networks for Engineering Students}
\author{
\large{Matthew Burns}\\
\texttt{mdburns@eng.ucsd.edu}\\
}
% The \author macro works with any number of authors. There are two commands
% used to separate the names and addresses of multiple authors: \And and \AND.
%
% Using \And between authors leaves it to \LaTeX{} to determine where to break
% the lines. Using \AND forces a linebreak at that point. So, if \LaTeX{}
% puts 3 of 4 authors names on the first line, and the last on the second
% line, try using \AND instead of \And before the third author name.
\newcommand{\fix}{\marginpar{FIX}}
\newcommand{\new}{\marginpar{NEW}}
\nipsfinalcopy % Camera-ready version
\begin{document}
\maketitle
At the request of some ECE students, I have started a neural networks reference page. Instead of overwhelming the reader with the many sources and opinions out there, I have chosen the few references that I see as most important for getting started.
\begin{enumerate}
\item \textbf{Neural net application examples}:
\begin{enumerate}
\item \href{http://www.eetimes.com/document.asp?doc_id=1266579}{Cat faces}: Google trains a billion-parameter neural network on GPUs (\$\$\$) and discovers cats on the internet... so cool.
\item \href{https://www.youtube.com/watch?v=xN1d3qHMIEQ}{Deep Mind}: Acquired by Google for \$500M+ because they built a network that learns to play Atari games at superhuman levels directly from screen pixels. Combines deep learning and reinforcement learning.
\item \href{https://www.youtube.com/watch?v=EtMyH_--vnU}{Deep Learning for Decision Making and Control}: Advanced application of probabilistic neural networks. Combines deep learning with optimal control.
\end{enumerate}
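The Atari result above rests on reinforcement learning; stripped of the deep network, its core is the tabular Q-learning update, sketched here on a toy 5-state chain of my own construction (the environment, parameters, and names are all illustrative, not the Atari setup -- DQN replaces the table with a deep network):

```python
import numpy as np

# Tabular Q-learning on a toy 5-state chain: walk right to reach the goal.
n_states, n_actions = 5, 2            # actions: 0 = left, 1 = right
alpha, gamma, eps = 0.5, 0.9, 0.1     # learning rate, discount, exploration rate
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

def step(s, a):
    """Deterministic chain dynamics; reward 1 only on reaching the last state."""
    s2 = min(max(s + (1 if a == 1 else -1), 0), n_states - 1)
    return s2, float(s2 == n_states - 1)

for episode in range(200):
    s = 0
    while s != n_states - 1:
        # epsilon-greedy action selection
        a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r = step(s, a)
        # Q-learning update: bootstrap from the best next-state value
        Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, a])
        s = s2

policy = np.argmax(Q, axis=1)         # greedy policy after training: walk right
```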
\item \textbf{Video lecture}:
\begin{enumerate}
\item \href{https://www.youtube.com/watch?v=qgx57X0fBdA}{Deep Learning for Computer Vision (Rob Fergus)}
\item \href{http://cs.nyu.edu/~fergus/presentations/nips2013_final.pdf}{Slides}
\end{enumerate}
\item \textbf{Reading:}
\begin{enumerate}
\item Bishop \cite{bishop1995neural}: This is the main theoretical reference for neural networks. It even has a chapter on Bayesian interpretations at the end, tying neural networks to probabilistic graphical models.
\item Efficient BackProp \cite{lecun2012efficient}: Also referred to as ``Tricks of the Trade'' because it was included in a book with that title.
\item Convolutional Neural Networks \cite{krizhevsky2012imagenet}: Trained a deep, very high-dimensional ($\mathbb{R}^{60,000,000}$) convolutional network on the ImageNet dataset. This architecture learns the FIR filters that produce features good for separating the different image classes. What is interesting is that the learned features achieve better classification performance than human-designed features.
\end{enumerate}
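The FIR-filter view in the last item can be made concrete: a convolutional layer slides small 2-D filters over the image, and training chooses the filter taps. In the sketch below a hand-written Sobel edge filter stands in for a learned one (the filter, image, and function names are illustrative, not from the paper):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Direct 2-D correlation over the 'valid' region, as conv layers compute it."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.zeros((8, 8))
image[:, 4:] = 1.0                       # toy image: vertical step edge
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], float)  # classic hand-designed edge detector
fmap = conv2d_valid(image, sobel_x)
# The feature map responds only where the window straddles the edge;
# a nonlinearity (e.g. ReLU) applied to fmap would then gate these responses.
```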
\item \textbf{Tutorial projects:}
\begin{enumerate}
\item \href{http://deeplearning.stanford.edu/tutorial/}{UFLDL Tutorial}
\item These are intense projects intended to be completed in teams of three people.
\item If you successfully complete these projects, you will know deterministic neural networks well.
\item This project sequence picks up about where ECE 174 leaves off; if you took that class, you are well prepared to start. You can think of these projects as a template for an independent study, taken together with the other reading I mentioned, especially Bishop.
\end{enumerate}
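As a taste of what these exercises build up to, here is a minimal deterministic network trained by backpropagation on XOR (a sketch; the layer sizes, seed, and learning rate are my illustrative choices, not the tutorial's):

```python
import numpy as np

# Two-layer sigmoid network trained with backpropagation on the XOR problem.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass (cross-entropy loss, so the output delta is simply p - y)
    d2 = p - y
    d1 = (d2 @ W2.T) * h * (1.0 - h)
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(axis=0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(axis=0)

preds = (p > 0.5).astype(float)       # thresholded network output after training
```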
\item \textbf{Open source projects:}
\begin{enumerate}
\item \href{https://github.com/BVLC/caffe}{Caffe}: a cutting-edge image classification system. You could hand-design your own image features for several years to chase the best possible generalization performance, or you can reuse the features from this project and automagically get state-of-the-art performance on ImageNet.
\item \href{https://github.com/numenta/nupic}{NuPIC}: a system for spatiotemporal pattern classification. Some have criticized it because it has not yet produced impressive benchmark results; on the other hand, it has quite a following, so it is worth mentioning (\href{https://www.youtube.com/watch?v=5r1vZ1ymrQE}{video}).
\end{enumerate}
\item \textbf{Datasets:}
\begin{enumerate}
\item \href{http://www.image-net.org/}{ImageNet}
\item \href{http://yann.lecun.com/exdb/mnist/}{MNIST}
\item \href{https://www.kaggle.com/}{Kaggle}
\item \href{http://clickdamage.com/sourcecode/cv_datasets.php}{Extensive list of computer vision datasets}
\end{enumerate}
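The MNIST files on the linked page come in the simple IDX binary format: a big-endian magic word whose low byte encodes the number of dimensions, then the dimensions as 32-bit integers, then the raw bytes. A minimal parser might look like this (a sketch; the filename in the usage comment is the standard download name):

```python
import gzip
import struct
import numpy as np

def parse_idx(data: bytes) -> np.ndarray:
    """Parse an IDX buffer (as downloaded from the MNIST page, gunzipped)."""
    magic, = struct.unpack(">I", data[:4])
    ndim = magic & 0xFF                     # low byte = number of dimensions
    dims = struct.unpack(">" + "I" * ndim, data[4:4 + 4 * ndim])
    return np.frombuffer(data, dtype=np.uint8, offset=4 + 4 * ndim).reshape(dims)

# Usage (path is wherever you saved the download):
# with gzip.open("train-images-idx3-ubyte.gz", "rb") as f:
#     images = parse_idx(f.read())          # uint8 array, shape (60000, 28, 28)
```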
\end{enumerate}
\nocite{*}
\printbibliography
\end{document}