MXNet FullyConnected Example

We use 8 features to predict whether a patient has diabetes, using a network of three fully connected layers; the code is here.
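A minimal sketch of what such a network can look like with MXNet's Symbol API; the hidden-layer sizes (64 and 32) and layer names are assumptions for illustration, not taken from the linked code.

```python
import mxnet as mx

# Symbolic definition of a 3-layer fully connected network.
# Input: 8 features per patient; output: 2 classes (diabetic / not diabetic).
data = mx.sym.Variable('data')
fc1  = mx.sym.FullyConnected(data, num_hidden=64, name='fc1')   # assumed size
act1 = mx.sym.Activation(fc1, act_type='relu', name='relu1')
fc2  = mx.sym.FullyConnected(act1, num_hidden=32, name='fc2')   # assumed size
act2 = mx.sym.Activation(fc2, act_type='relu', name='relu2')
fc3  = mx.sym.FullyConnected(act2, num_hidden=2, name='fc3')
net  = mx.sym.SoftmaxOutput(fc3, name='softmax')

print(net.list_arguments())   # weights/biases of the three layers plus data and label
```

Training this symbol is then handled by the Module API, sketched further below.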

CS231n Backpropagation

Motivation. In this section we will develop an intuitive understanding of backpropagation, which is a way of computing gradients of expressions through recursive application of the chain rule. Understanding this process and its subtleties is critical for effectively developing, designing, and debugging neural networks.
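As a small worked example of recursive chain-rule application, consider f(x, y, z) = (x + y) * z and propagate the gradient backward through the intermediate value q = x + y:

```python
# Forward and backward pass for f(x, y, z) = (x + y) * z,
# computed by hand with the chain rule.
x, y, z = -2.0, 5.0, -4.0

# forward pass
q = x + y          # q = 3
f = q * z          # f = -12

# backward pass (in reverse order of the forward pass)
df_dq = z          # d(q*z)/dq = z = -4
df_dz = q          # d(q*z)/dz = q = 3
df_dx = df_dq * 1  # chain rule: dq/dx = 1, so df/dx = df/dq * dq/dx = -4
df_dy = df_dq * 1  # chain rule: dq/dy = 1, so df/dy = -4

print(df_dx, df_dy, df_dz)   # -4.0 -4.0 3.0
```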

Memory Management In Deep Learning

Memory management in big data processing is a critical issue, especially in deep learning, where a single model can have billions of parameters.
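As a rough illustration of the scale involved, the parameters of a one-billion-parameter model alone occupy several gigabytes in 32-bit floating point, before counting gradients, optimizer state, or activations:

```python
# Rough memory footprint of model parameters alone, assuming float32 storage.
num_params = 1_000_000_000      # a one-billion-parameter model (illustrative)
bytes_per_param = 4             # float32 = 4 bytes
gib = num_params * bytes_per_param / 2**30
print(f"{gib:.2f} GiB just for the weights")   # ~3.73 GiB
```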

MXNet Logistic Regression Example

We use 8 features to predict whether a patient has diabetes. The source code is here.
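A minimal sketch of logistic regression in MXNet: a single fully connected layer with one output unit, followed by a sigmoid output. The random placeholder data, batch size, and learning rate are illustrative assumptions, not taken from the linked source.

```python
import mxnet as mx
import numpy as np

# Logistic regression: 8 input features -> 1 logit -> probability of diabetes.
data  = mx.sym.Variable('data')
label = mx.sym.Variable('label')
fc    = mx.sym.FullyConnected(data, num_hidden=1, name='fc')
lro   = mx.sym.LogisticRegressionOutput(fc, label, name='lro')

# Placeholder data with the assumed shapes: N patients, 8 features, 0/1 labels.
X = np.random.rand(768, 8).astype('float32')
y = np.random.randint(0, 2, size=(768,)).astype('float32')
train_iter = mx.io.NDArrayIter(X, y, batch_size=32, shuffle=True, label_name='label')

mod = mx.mod.Module(symbol=lro, data_names=['data'], label_names=['label'])
mod.fit(train_iter,
        optimizer='sgd',
        optimizer_params={'learning_rate': 0.1},   # placeholder hyperparameters
        eval_metric='mse',
        num_epoch=10)
```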

MXNet Intro to the Module API

Module - MXNet’s high-level interface for neural network training.
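A minimal sketch of the typical Module workflow (construct, fit, predict) on a toy symbol and random placeholder data; the network, data shapes, and hyperparameters are assumptions for illustration.

```python
import mxnet as mx
import numpy as np

# Toy network and data, just so the Module calls below are runnable.
data = mx.sym.Variable('data')
fc   = mx.sym.FullyConnected(data, num_hidden=2, name='fc')
net  = mx.sym.SoftmaxOutput(fc, name='softmax')

X = np.random.rand(100, 8).astype('float32')
y = np.random.randint(0, 2, size=(100,)).astype('float32')
train_iter = mx.io.NDArrayIter(X, y, batch_size=10, shuffle=True)

# Module wraps the symbol together with a device context and input names,
# and takes care of binding memory, initializing parameters, and training.
mod = mx.mod.Module(symbol=net,
                    context=mx.cpu(),
                    data_names=['data'],
                    label_names=['softmax_label'])
mod.fit(train_iter,
        optimizer='sgd',
        optimizer_params={'learning_rate': 0.1},   # placeholder hyperparameters
        eval_metric='acc',
        num_epoch=5)

# Inference on the same iterator (standing in for a real validation set).
probs = mod.predict(train_iter)
print(probs.shape)   # (100, 2): one probability per class per example
```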

CS231n Optimization: SGD

Optimization is the process of finding the set of parameters W that minimize the loss function.
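The core loop is simple: repeatedly evaluate the gradient of the loss at the current parameters and step in the negative gradient direction. A minimal sketch on a toy quadratic loss, where both the loss and the step size are placeholders:

```python
import numpy as np

# Toy loss: L(W) = ||W - W_true||^2, whose gradient is 2 * (W - W_true).
W_true = np.array([1.0, -2.0, 0.5])

def loss(W):
    return np.sum((W - W_true) ** 2)

def gradient(W):
    return 2.0 * (W - W_true)

# Vanilla gradient descent: step along the negative gradient.
W = np.random.randn(3)      # random initial parameters
step_size = 0.1             # placeholder learning rate
for _ in range(100):
    W -= step_size * gradient(W)

print(loss(W))   # close to 0: W has converged toward W_true
```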

CS231n Linear Classification

The approach will have two major components: a score function that maps the raw data to class scores, and a loss function that quantifies the agreement between the predicted scores and the ground truth labels.
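A minimal sketch of these two components for a linear classifier, with the multiclass SVM (hinge) loss as one concrete choice of loss function; the dimensions and the margin of 1 are illustrative:

```python
import numpy as np

np.random.seed(0)
D, K = 4, 3                        # feature dimension, number of classes (assumed)

# Score function: maps raw data x to class scores via f(x, W, b) = W x + b.
W = np.random.randn(K, D) * 0.01
b = np.zeros(K)
x = np.random.randn(D)             # one example
y = 1                              # its ground-truth class index

scores = W.dot(x) + b              # shape (K,): one score per class

# Multiclass SVM loss: penalize classes whose score is not below the
# correct class's score by at least a margin of 1.
margins = np.maximum(0, scores - scores[y] + 1.0)
margins[y] = 0
loss = np.sum(margins)
print(scores, loss)
```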

CS231n Intro to Image Classification

In this section we will introduce the Image Classification problem, which is the task of assigning an input image one label from a fixed set of categories. This is one of the core problems in Computer Vision that, despite its simplicity, has a large variety of practical applications. Moreover, as we will see later in the course, many other seemingly distinct Computer Vision tasks (such as object detection, segmentation) can be reduced to image classification.
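Concretely, a classifier for this task is a function that takes an array of pixel values and returns one label from the fixed set. A toy sketch of that interface, with a made-up label set and a placeholder decision rule:

```python
import numpy as np

LABELS = ['cat', 'dog', 'ship', 'truck']   # an assumed fixed set of categories

def classify(image: np.ndarray) -> str:
    """Assign a single label from LABELS to an image (e.g., 32x32x3 pixels).

    Placeholder 'model': buckets the image's mean intensity into a label index,
    purely to show the input/output contract of an image classifier.
    """
    bucket = int(image.mean() / 256 * len(LABELS))
    return LABELS[min(bucket, len(LABELS) - 1)]

img = np.random.randint(0, 256, size=(32, 32, 3))
print(classify(img))
```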