I have a neural network that is being trained with a changing cost function. Could I use backpropagation at all? If yes, how would I do this?

  • Hi, I think we need more detail to answer this question. What do you mean by "changing cost function"? Are some of your NN parameters influenced by an external stochastic variable? Then you'd have to look into Monte Carlo methods for gradient estimation, but it's really hard to say whether that is what you are looking for. – postnubilaphoebus Apr 15 '23 at 09:29

1 Answer

There is nothing wrong with changing your cost/loss function between steps while training a neural network. Backpropagation simply differentiates whichever loss is in effect at the current step, so gradients remain well defined. For example, this paper studies a weight decay scheduler, which changes the weight regularization coefficient at every step: https://arxiv.org/abs/2006.08643.
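As a minimal sketch (assuming a tiny linear model trained with plain gradient descent, not the paper's exact method), the loop below decays the L2 penalty coefficient at every step, so the loss function literally changes each iteration, yet the gradient of the current step's loss still drives training:

```python
import numpy as np

# Toy regression problem (all data and hyperparameters here are
# illustrative assumptions, not from the question or the cited paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.01 * rng.normal(size=100)

w = np.zeros(5)
lr = 0.1
lam0 = 0.1  # initial weight-decay coefficient

for step in range(200):
    # The loss changes every step: the L2 coefficient follows a schedule.
    lam_t = lam0 * 0.99 ** step
    resid = X @ w - y
    loss = 0.5 * np.mean(resid ** 2) + 0.5 * lam_t * np.sum(w ** 2)
    # Backprop just differentiates whatever loss is in effect *now*.
    grad = X.T @ resid / len(y) + lam_t * w
    w -= lr * grad

print(loss)  # converges to a small value despite the changing objective
```

The key point is that each update only needs the gradient of the loss evaluated at the current step; nothing in the chain rule requires the loss to be the same function across steps.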

shatz