In the language of neural networks, this is what it means:
One epoch equals one forward pass and one backward pass over all of the training examples.
Batch size is the number of training examples processed in a single forward/backward pass. The larger the batch size, the more memory you'll need.
The number of iterations is the number of passes, with each pass using [batch size] training examples. To be clear, one pass means one forward pass plus one backward pass (the forward pass and backward pass are not counted as two separate passes).
For example, if your batch size is 500 and you have 1000 training instances, 1 epoch will take 2 iterations to complete.
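The arithmetic above can be sketched as a tiny helper (the function name is just for illustration): an epoch needs enough iterations to cover every example, so a final partial batch still costs one iteration, hence the ceiling division.

```python
import math

def iterations_per_epoch(num_examples, batch_size):
    # One iteration processes `batch_size` examples, so one epoch
    # (a full pass over all examples) needs this many iterations.
    # A leftover partial batch still takes an iteration, hence ceil.
    return math.ceil(num_examples / batch_size)

print(iterations_per_epoch(1000, 500))  # 2, matching the example above
```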
Some people use the term "batch" to refer to the full training set, while others use it for the number of training instances in one forward/backward pass (as I did in this answer). To avoid that ambiguity, the term "mini-batch" makes it clear that you mean the training instances in a single forward/backward pass.
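To make the mini-batch meaning concrete, here is a minimal sketch (the helper name `minibatches` is my own, not from any library) of how a training set is sliced into the chunks that each forward/backward pass would consume:

```python
def minibatches(examples, batch_size):
    # Yield successive mini-batches of up to `batch_size` examples;
    # the last batch may be smaller if the sizes don't divide evenly.
    for start in range(0, len(examples), batch_size):
        yield examples[start:start + batch_size]

data = list(range(1000))            # stand-in for 1000 training instances
batches = list(minibatches(data, 500))
print(len(batches))                 # 2 mini-batches, i.e. 2 iterations per epoch
```

Each yielded mini-batch is what one iteration (one forward pass plus one backward pass) would operate on.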