Activation Functions in Python

Activation function

Activation functions are used in neural networks.

The activation function used in the output layer depends on the task.

For regression, use the identity function.

For classification, use softmax.
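As a rough illustration (the score values below are made up, not from the original post), softmax turns raw output-layer scores into probabilities, while the identity function leaves them unchanged:

import numpy as np

scores = np.array([0.3, 2.9, 4.0])  # hypothetical raw outputs (logits)

# Classification: softmax converts the scores to probabilities that sum to 1.
probabilities = np.exp(scores - np.max(scores)) / np.sum(np.exp(scores - np.max(scores)))
print(probabilities)        # approx. [0.018 0.245 0.737]
print(probabilities.sum())  # 1.0

# Regression: the identity function returns the raw output unchanged.
print(scores)  # [0.3 2.9 4. ]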

Examples of activation functions

  • Step
  • Sigmoid
  • ReLU (rectified linear unit)
  • Softmax
  • Identity

Implementation

We use NumPy (imported as np).

import numpy as np

def step_function(x):
    # Returns 1 where x > 0, otherwise 0.
    # (np.int was removed in newer NumPy versions, so use the built-in int.)
    return np.array(x > 0, dtype=int)

def sigmoid(x):
    # Squashes x into the range (0, 1).
    return 1 / (1 + np.exp(-x))

def relu(x):
    # Passes positive values through, clips negatives to 0.
    return np.maximum(0, x)

def softmax(a):
    # Subtract the max to avoid overflow in np.exp.
    c = np.max(a)
    exp_a = np.exp(a - c)
    sum_exp_a = np.sum(exp_a)
    y = exp_a / sum_exp_a
    return y

def identity_function(x):
    # Returns the input as is (used for regression output layers).
    return x
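A quick check of these functions on a small sample array (the input values are just example numbers):

x = np.array([-1.0, 0.5, 2.0])

print(step_function(x))      # [0 1 1]
print(sigmoid(x))            # [0.26894142 0.62245933 0.88079708]
print(relu(x))               # [0.  0.5 2. ]
print(softmax(x))            # probabilities that sum to 1
print(np.sum(softmax(x)))    # 1.0
print(identity_function(x))  # [-1.   0.5  2. ]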