The typical use case is to place BatchNormalization between the linear and non-linear layers in our network. It normalizes the inputs to our activation function so that the values are centered in the linear region of the activation function (such as Sigmoid). To use it, you need to import the BatchNormalization layer in your code. It is part of the normalization module.
from keras.layers.normalization import BatchNormalization
# Note: in newer Keras releases the layer is imported directly from keras.layers:
# from keras.layers import BatchNormalization
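To make the effect concrete, here is a small NumPy sketch (not the Keras implementation itself; the sample values and the epsilon constant are illustrative) of what BatchNormalization computes over a mini-batch at training time, with the learned scale (gamma) and shift (beta) left at their defaults of 1 and 0:

```python
import numpy as np

# Hypothetical mini-batch of pre-activation outputs from a linear (Dense) layer;
# each column is one feature, each row one sample.
x = np.array([[1.0, 50.0],
              [2.0, 60.0],
              [3.0, 70.0]])

# Per feature, subtract the batch mean and divide by the batch standard
# deviation (epsilon avoids division by zero). Keras then applies a learned
# scale (gamma) and shift (beta); with the defaults gamma=1, beta=0 this is
# the whole computation.
eps = 1e-3
mean = x.mean(axis=0)
var = x.var(axis=0)
x_norm = (x - mean) / np.sqrt(var + eps)
```

After this step each feature is roughly zero-mean with unit variance, so the values fed into a Sigmoid stay near its linear region around zero rather than out in the saturated tails.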