In neural networks, transfer functions and activation functions each play a distinct role.
The activation function helps the neural network learn complex patterns from the data. It decides which signals are passed on to the next layer, and it also adds non-linearity to the network, which makes complex tasks easier to learn.
There are various activation functions, such as ReLU, softmax, tanh, and sigmoid.
The basic idea is that many activation functions work against a threshold value: once that value is crossed, the signal is triggered and passed forward.
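To make the activation functions named above concrete, here is a minimal sketch in plain Python (the function names and example values are illustrative, not from the original text):

```python
import math

def relu(x):
    # ReLU: passes positive signals through unchanged, blocks negative ones
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Tanh: squashes input into (-1, 1), zero-centred
    return math.tanh(x)

def softmax(xs):
    # Softmax: turns a vector of scores into probabilities that sum to 1
    m = max(xs)                              # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(relu(-2.0))                            # negative input is blocked: 0.0
print(sigmoid(0.0))                          # midpoint of the sigmoid: 0.5
print(softmax([1.0, 1.0]))                   # equal scores give equal probabilities
```

Note how ReLU and sigmoid act on a single value, while softmax acts on a whole vector, which is why it is typically used only in the output layer.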
The transfer function, on the other hand, translates input signals into an output signal. It computes the neuron's net input (the weighted sum of its inputs) and, combined with an activation function, produces the neuron's output.
A few common transfer functions are the unit step (threshold), sigmoid, and Gaussian functions.
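The two steps described above, computing the net input and then applying a unit-step (threshold) transfer function, can be sketched for a single neuron as follows (the weights, bias, and inputs are hypothetical values chosen for illustration):

```python
def net_input(inputs, weights, bias):
    # Net input: the weighted sum of the inputs plus a bias term
    return sum(w * x for w, x in zip(weights, inputs)) + bias

def unit_step(net, threshold=0.0):
    # Unit step (threshold) transfer function: fires (1) once the threshold is crossed
    return 1 if net >= threshold else 0

# A hypothetical two-input neuron
inputs = [1.0, 0.5]
weights = [0.6, -0.2]
bias = 0.1

net = net_input(inputs, weights, bias)   # 0.6*1.0 + (-0.2)*0.5 + 0.1 = 0.6
print(net, unit_step(net))               # net is above the threshold, so the neuron fires
```

This separation mirrors the text: `net_input` does the translation from inputs to a single value, and `unit_step` plays the role of the thresholded activation that decides whether the signal is passed on.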