English

Etymology

Shortening of backpropagation.

Noun

backprop (uncountable)

  1. backpropagation
    • 2015, Sanjeev Arora, Yingyu Liang, Tengyu Ma, “Why are deep nets reversible: A simple theory, with implications for training”, in arXiv[1]:
      The generative model suggests a simple modification for training---use an input to produce several synthetic inputs with the same label, and include them in the backprop training.
    • 2020, Timothy P. Lillicrap, Adam Santoro, Luke Marris, Colin J. Akerman, Geoffrey Hinton, “Backpropagation and the brain”, in Nature[2]:
      In machine learning, backpropagation of error (‘backprop’) is the algorithm most often used to train deep neural networks and is the most successful learning procedure for these networks.