activations.py
    import numpy as np
    
    def relu(x):
        ''' Rectified Linear Unit (ReLU) '''
        return np.maximum(0., x)
        
    def drelu(x):
        ''' derivative of ReLU: 1 where x > 0, else 0 '''
        y = np.zeros(x.shape)
        y[x > 0] = 1
        return y
    
    def softmax(x):
        ''' softmax along the last axis '''
        # subtract the row-wise max for numerical stability; this does not change the result
        z = np.exp(x - np.max(x, axis=-1, keepdims=True))
        return z / np.sum(z, axis=-1, keepdims=True)
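A quick sanity check of these functions might look like the sketch below (the sample array and the import path are illustrative assumptions, not part of the file itself):

    import numpy as np
    from activations import relu, drelu, softmax  # assumes the code above is saved as activations.py

    x = np.array([[-1.0, 0.5, 2.0]])

    print(relu(x))    # negative entries clipped to 0
    print(drelu(x))   # 1 where x > 0, else 0

    p = softmax(x)
    print(p.sum(axis=-1))   # each row of probabilities sums to 1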