Common steps for pre-processing a new dataset are:

- Figure out the dimensions and shapes of the problem (m_train, m_test, num_px, ...)
- Reshape the datasets such that each example is now a vector of size (num_px * num_px * 3, 1)
- "Standardize" the data

You've implemented several functions that:

- Initialize (w, b)
- Optimize the loss iteratively to […]
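The reshape and standardize steps above can be sketched as follows; the dataset here is a hypothetical placeholder (random pixels with assumed shapes m_train = 209, num_px = 64), not the course's actual data:

```python
import numpy as np

# Hypothetical raw image dataset: m_train examples of num_px x num_px RGB images
m_train, num_px = 209, 64
train_set_x_orig = np.random.randint(0, 256, (m_train, num_px, num_px, 3))

# Reshape so each example becomes a column vector of size (num_px * num_px * 3, 1);
# stacking all m_train columns gives shape (12288, 209)
train_set_x_flatten = train_set_x_orig.reshape(m_train, -1).T
print(train_set_x_flatten.shape)  # (12288, 209)

# "Standardize" by dividing every pixel value by 255 (the maximum channel value)
train_set_x = train_set_x_flatten / 255.0
```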

# Category: Neural Networks and Deep Learning

## Week 2 - Python and Vectorization

Vectorization. The non-vectorized version is:

```python
z = 0
for i in range(nx):
    z += W[i] * X[i]
z += b
```

The vectorized version is:

```python
Z = np.dot(W, X) + b
```

It is much faster:

```python
import numpy as np
import time

a = np.random.rand(1000000)
b = np.random.rand(1000000)

# Check how much time the vectorized version takes
tic = time.time()
c = np.dot(a, b)
toc = time.time()
print("Vectorized version : " + str(1000 * (toc - tic)) + "ms")  # 1.5ms

# Check how much time the explicit for loop takes
c = 0
tic = time.time()
for i in range(1000000):
    c += a[i] * b[i]
toc = time.time()
print("For loop version : " + str(1000 * (toc - tic)) + "ms")  # 474.2ms
```

Vectorizing Logistic Regression: b is a real number, but it is automatically expanded so that it is added to each element of the matrix (Python broadcasting).

```python
Z = np.dot(W.T, X) + b
```
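A minimal sketch of that broadcast, with assumed toy shapes (n_x = 3 features, m = 4 examples) chosen just for illustration:

```python
import numpy as np

# Hypothetical shapes: n_x features, m examples
n_x, m = 3, 4
W = np.random.rand(n_x, 1)
X = np.random.rand(n_x, m)
b = 0.5  # a real number, not a vector

# np.dot(W.T, X) has shape (1, m); the scalar b is broadcast
# and added to each of its m entries
Z = np.dot(W.T, X) + b
print(Z.shape)  # (1, 4)
```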

A note on Python/NumPy vectors: to simplify code and avoid bugs, don't use rank-1 arrays (arrays of shape `(n,)`); use explicit column or row vectors instead.

## Week 2 - Logistic Regression as a Neural Network

Binary Classification: 1 (cat) vs 0 (non-cat). Example: a cat image has Red, Green, and Blue channels, each a 64 x 64 pixel matrix, so the input feature vector has 64 x 64 x 3 dimensions, i.e. a (64 x 64 x 3, 1) matrix.

Notation: a single training example is represented by a pair (x, y), where x is an nx-dimensional feature vector […]
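Flattening a single image into that feature vector can be sketched as follows (the image here is random placeholder data, not a real cat):

```python
import numpy as np

# Hypothetical 64 x 64 RGB image: three 64 x 64 channel matrices
image = np.random.randint(0, 256, (64, 64, 3))

# Unroll into the input feature vector x of dimension 64 * 64 * 3 = 12288
x = image.reshape(-1, 1)
print(x.shape)  # (12288, 1)
```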