  • activations.py
    import numpy as np

    def relu(x):
        ''' Rectified Linear Unit (ReLU) '''
        return np.maximum(0., x)

    def drelu(x):
        ''' Derivative of ReLU: 1 where x > 0, else 0 '''
        y = np.zeros(x.shape)
        y[np.where(x > 0)] = 1
        return y

    def softmax(x):
        ''' Softmax along the last axis '''
        # Subtract the row-wise max before exponentiating for numerical stability;
        # this does not change the result, since softmax is shift-invariant.
        z = np.exp(x - np.max(x, axis=-1, keepdims=True))
        return z / np.sum(z, axis=-1, keepdims=True)
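
As a quick sanity check, the sketch below (an illustrative addition, not part of the original snippet) applies the three functions to a small batch, assuming the file is saved as activations.py. It shows that relu clips negatives to zero, drelu is 1 exactly where the input is positive, and each softmax row sums to 1.

    import numpy as np
    from activations import relu, drelu, softmax  # assumes the snippet above is saved as activations.py

    x = np.array([[-1.0, 0.0, 2.0],
                  [ 3.0, -0.5, 0.5]])

    print(relu(x))    # negatives replaced by 0.
    print(drelu(x))   # 1 where x > 0, else 0

    p = softmax(x)
    print(p)                 # probabilities per row
    print(p.sum(axis=-1))    # each row sums to 1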