Gradient of $\|Ax - y\|^2$ with respect to $A$
How do I proceed to find $\nabla_A \|Ax - y\|^2$, where $A \in \mathbb{R}^{n \times n}$, $x, y \in \mathbb{R}^n$, and the norm is the Euclidean norm?
Attempt so far
$$\|Ax - y\|^2 = (Ax-y)^T(Ax-y) = x^T A^T A x - 2 y^T A x + y^T y$$
$$\nabla_A(y^T A x) = y x^T$$
Where I am stuck
I don't know how to tackle the $x^T A^T A x$ term: if I try to apply the chain rule, I have to differentiate a matrix with respect to a matrix.
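As a sanity check (my addition, not part of the original post), the bilinear identity used in the attempt, $\nabla_A(u^T A v) = u v^T$ for fixed vectors $u, v$, can be verified numerically with NumPy by comparing against central finite differences:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
u = rng.standard_normal(n)
v = rng.standard_normal(n)

def f(A):
    return u @ A @ v  # scalar u^T A v

# Central finite-difference gradient with respect to each entry A[i, j]
eps = 1e-6
num_grad = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        E = np.zeros_like(A)
        E[i, j] = eps
        num_grad[i, j] = (f(A + E) - f(A - E)) / (2 * eps)

analytic = np.outer(u, v)  # u v^T
print(np.allclose(num_grad, analytic, atol=1e-6))  # True
```

Since $f$ is linear in each entry of $A$, the central difference is exact up to roundoff, so the agreement is tight.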
vector-analysis matrix-calculus
asked Nov 22 at 6:38 by Anant Joshi, edited Nov 22 at 9:31 by user550103
2 Answers
Before we start deriving the gradient, some facts and notation for brevity:
- Trace and Frobenius product relation: $$\left\langle A, BC \right\rangle = \operatorname{tr}(A^T B C) =: A : BC$$
- Cyclic properties of the trace/Frobenius product:
\begin{align}
A : BC
&= BC : A \\
&= A C^T : B \\
&= \text{etc.}
\end{align}
Let $f := \left\| Ax - y \right\|^2 = (Ax-y) : (Ax-y)$.
Now, we can obtain the differential first, and then the gradient.
\begin{align}
df
&= d\left( (Ax-y) : (Ax-y) \right) \\
&= \left( dA\, x : Ax-y \right) + \left( Ax-y : dA\, x \right) \\
&= 2 \left( Ax - y \right) : dA\, x \\
&= 2 \left( Ax-y \right) x^T : dA
\end{align}
Thus, the gradient is
\begin{align}
\frac{\partial}{\partial A} \left( \left\| Ax-y \right\|^2 \right) = 2 \left( Ax-y \right) x^T.
\end{align}
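The closed form $2(Ax-y)x^T$ can be sanity-checked numerically (my sketch, not part of the original answer; it assumes NumPy) by differencing the objective $\|Ax-y\|^2$ entry by entry:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
y = rng.standard_normal(n)

def f(A):
    r = A @ x - y
    return r @ r  # ||Ax - y||^2

# Central finite-difference gradient with respect to each entry A[i, j]
eps = 1e-6
num_grad = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        E = np.zeros_like(A)
        E[i, j] = eps
        num_grad[i, j] = (f(A + E) - f(A - E)) / (2 * eps)

analytic = 2 * np.outer(A @ x - y, x)  # 2 (Ax - y) x^T
print(np.allclose(num_grad, analytic, atol=1e-5))  # True
```

Because the objective is quadratic in $A$, the central difference is exact up to floating-point roundoff.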
answered Nov 22 at 9:08, edited Nov 22 at 9:17
user550103
What is the difference between differential and gradient?
– Anant Joshi
Nov 22 at 9:19
see this math.stackexchange.com/questions/289923/…
– user550103
Nov 22 at 9:20
For matrices, the easiest way is often to go back to the definition of differentiability, i.e.,
$$\|(A+H)x - y\|^2 - \|Ax - y\|^2 = L_A(H) + o(\|H\|),$$
with $L_A$ a linear map.
We begin with
$$\|(A+H)x - y\|^2 = \langle Ax + Hx - y,\, Ax + Hx - y \rangle.$$
Then we have
$$\|(A+H)x - y\|^2 = \langle Ax-y, Ax-y \rangle + 2\langle Hx, Ax-y \rangle + \langle Hx, Hx \rangle.$$
To conclude:
$$\|(A+H)x - y\|^2 - \|Ax - y\|^2 = 2\langle Hx, Ax-y \rangle + o(\|H\|).$$
Thus:
$$\left( \nabla_A \|Ax - y\|^2 \right)_{i,j} = 2\langle E_{i,j}\, x, Ax-y \rangle,$$
with $E_{i,j}$ the matrix with a $1$ in row $i$, column $j$ and $0$ elsewhere.
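The entrywise formula collapses to an outer product: $E_{i,j}x$ has $x_j$ in position $i$ and zeros elsewhere, so $2\langle E_{i,j}x, Ax-y\rangle = 2(Ax-y)_i x_j$. A small NumPy check of this collapse (my addition, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
y = rng.standard_normal(n)

r = A @ x - y  # residual Ax - y
G = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0              # basis matrix E_{ij}
        G[i, j] = 2 * (E @ x) @ r  # 2 <E_{ij} x, Ax - y>

# The entrywise inner products assemble into 2 (Ax - y) x^T
print(np.allclose(G, 2 * np.outer(r, x)))  # True
```

This confirms that the definition-based answer agrees with the Frobenius-product answer above.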
answered Nov 22 at 9:11, edited Nov 22 at 9:20
nicomezi
Why is the final answer a number and not a matrix?
– Anant Joshi
Nov 22 at 9:13
Because I misunderstood the notation and went a bit fast at the end. Is it better now ?
– nicomezi
Nov 22 at 9:21
add a comment |
Why is the final answer a number and not a matrix?
– Anant Joshi
Nov 22 at 9:13
Because I misunderstood the notation and went a bit fast at the end. Is it better now ?
– nicomezi
Nov 22 at 9:21
Why is the final answer a number and not a matrix?
– Anant Joshi
Nov 22 at 9:13
Why is the final answer a number and not a matrix?
– Anant Joshi
Nov 22 at 9:13
Because I misunderstood the notation and went a bit fast at the end. Is it better now ?
– nicomezi
Nov 22 at 9:21
Because I misunderstood the notation and went a bit fast at the end. Is it better now ?
– nicomezi
Nov 22 at 9:21
add a comment |