A typical loss metric in regression problems
Discussing the primal computation
Task: Backward propagate cotangent information
Relevant for backpropagation in Neural Networks (as part of Deep Learning)
Often not interested in the reference solution's cotangent
General vector-Jacobian product (pullback) rule
Finding a closed-form expression for the Jacobian
Changing to index notation
Back to symbolic notation
The other Jacobian
Plugging Jacobians into vJp rule
Full Pullback rule
Some remarks
Outro
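The pullback derivation outlined above can be sketched numerically. Assuming the L2 loss is e = ½‖y − y_ref‖², its Jacobian w.r.t. y is the row vector (y − y_ref)ᵀ, so the vJp collapses to a scalar-vector product y_bar = e_bar · (y − y_ref). The function names below are illustrative, not from the source; the result is checked against central finite differences.

```python
import numpy as np

def l2_loss(y, y_ref):
    # Primal computation: e = 1/2 * sum((y - y_ref)^2)
    return 0.5 * np.sum((y - y_ref) ** 2)

def l2_loss_pullback(y, y_ref, e_bar):
    # vJp / pullback: the Jacobian de/dy is the row vector (y - y_ref)^T,
    # so backpropagating the scalar cotangent e_bar is a scalar-vector product.
    delta = y - y_ref
    y_bar = e_bar * delta
    y_ref_bar = -e_bar * delta  # often not needed (no interest in this cotangent)
    return y_bar, y_ref_bar

rng = np.random.default_rng(0)
y = rng.standard_normal(5)
y_ref = rng.standard_normal(5)
y_bar, _ = l2_loss_pullback(y, y_ref, e_bar=1.0)

# Sanity check against a central finite-difference approximation of de/dy
eps = 1e-6
fd = np.array([
    (l2_loss(y + eps * np.eye(5)[i], y_ref)
     - l2_loss(y - eps * np.eye(5)[i], y_ref)) / (2 * eps)
    for i in range(5)
])
print(np.allclose(y_bar, fd, atol=1e-5))
```

Note that only the elementwise delta y − y_ref is needed, so no full Jacobian is ever materialized.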
What is the L2 loss ([nonlinear] least-squares loss)?
Dimensionalities involved
Typical for regression problems (like Neural Networks for Scientific Machine Learning)
Task: Forward propagate tangent information
General Jacobian-vector product (pushforward)
Finding a closed-form expression for the Jacobian
Moving to index notation
Result back in symbolic notation
Full Pushforward rule
Computational considerations: storing the elementwise delta
Often there is no tangent on the reference vector
Summary
Outro
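The pushforward outlined above can likewise be sketched. Again assuming e = ½‖y − y_ref‖², the Jvp is the inner product e_dot = (y − y_ref)ᵀ(y_dot − y_ref_dot), and the reference tangent is often absent (treated as zero). The names below are illustrative; the result is checked as a directional derivative via finite differences.

```python
import numpy as np

def l2_loss(y, y_ref):
    # Primal computation: e = 1/2 * sum((y - y_ref)^2)
    return 0.5 * np.sum((y - y_ref) ** 2)

def l2_loss_pushforward(y, y_ref, y_dot, y_ref_dot=None):
    # Jvp / pushforward: e_dot = (y - y_ref)^T (y_dot - y_ref_dot).
    # The elementwise delta can be stored/reused from the primal pass.
    delta = y - y_ref
    if y_ref_dot is None:
        # Common case: no tangent on the reference vector
        return np.dot(delta, y_dot)
    return np.dot(delta, y_dot - y_ref_dot)

rng = np.random.default_rng(1)
y = rng.standard_normal(4)
y_ref = rng.standard_normal(4)
y_dot = rng.standard_normal(4)

# Directional-derivative check: (e(y + eps*y_dot) - e(y - eps*y_dot)) / (2*eps)
eps = 1e-6
fd = (l2_loss(y + eps * y_dot, y_ref) - l2_loss(y - eps * y_dot, y_ref)) / (2 * eps)
print(np.isclose(l2_loss_pushforward(y, y_ref, y_dot), fd, atol=1e-5))
```

As in the pullback case, the tangent propagates through a single inner product with the stored delta, so the cost is linear in the vector length.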