
Intro to deep learning notes

This is a note for the MIT 6.S191 course (course link), course 1: Intro to Deep Learning.

The Perceptron: Forward Propagation

A single-layer neural network with TensorFlow:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# m: number of input features; d1, d2: hidden and output layer widths.
inputs = Input(shape=(m,))
hidden = Dense(d1)(inputs)
outputs = Dense(d2)(hidden)
model = Model(inputs, outputs)
```

These four lines (after the imports) build the single-layer network.

Deep Neural Network

A deep neural network is the same idea with more hidden layers stacked between the input and the output.

Applying Neural Networks

- Quantifying loss: compare the predicted output against the actual output.
- Training minimizes this loss (see the short training sketch at the end of this note).
- Different tasks use different loss functions.

Binary cross entropy loss:

```python
# model.y: true labels, model.pred: network outputs (logits).
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(
        labels=model.y, logits=model.pred))
```

Mean squared error loss: the original note is cut off after `loss = tf.`
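The MSE snippet is truncated in the original notes. Below is a minimal sketch of how it likely continued, assuming the same model.y (true values) and model.pred (predictions) tensors used in the cross-entropy example:

```python
# Hedged reconstruction of the truncated snippet: mean squared error,
# assuming model.y holds the true values and model.pred the predictions.
loss = tf.reduce_mean(tf.square(tf.subtract(model.y, model.pred)))
```

Keras also exposes the same loss as tf.keras.losses.MeanSquaredError() for use with model.compile.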
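To connect the "minimize the loss" step back to the model-building code, here is a short end-to-end sketch in modern Keras. It is not from the course notes; the layer sizes, activation, optimizer, and toy data are all assumptions for illustration:

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# Assumed toy dimensions, for illustration only.
m, d1, d2 = 10, 32, 1

# Same single-layer architecture as above, with an assumed ReLU activation.
inputs = Input(shape=(m,))
hidden = Dense(d1, activation="relu")(inputs)
outputs = Dense(d2)(hidden)
model = Model(inputs, outputs)

# Attach a loss (mean squared error) and an optimizer, then let
# model.fit minimize the loss on random toy data.
model.compile(optimizer="adam", loss="mse")
x = np.random.rand(100, m).astype("float32")
y = np.random.rand(100, d2).astype("float32")
model.fit(x, y, epochs=2, batch_size=16, verbose=0)
```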