SSOR with acceleration by CG
I need to implement the SSOR method with acceleration by the conjugate gradient method, but I don't understand how to combine two iterative methods; both algorithms solve $Ax=b$.
In the book "Applied Iterative Methods" by Hageman and Young, conjugate gradient acceleration of the Richardson method is described (p. 146), but I did not understand how the two methods are combined. I am stuck already at the stage of deciding which method should be embedded into the other.
I would appreciate any help with my question.
numerical-methods
asked Dec 14 '18 at 19:33 by Pennywise
2 Answers
You need to use one method as a preconditioner for the other. My guess would be that you are asked to use SSOR as a preconditioner for CG.
answered Dec 17 '18 at 6:00 by VorKir
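To make this concrete, here is a minimal MATLAB sketch of CG preconditioned by the SSOR matrix $M = \frac{\omega}{2-\omega}\left(\frac{1}{\omega}D - C_L\right)D^{-1}\left(\frac{1}{\omega}D - C_U\right)$. This is only an illustration, assuming $A$ is symmetric positive definite and $0 < \omega < 2$; the function name pcg_ssor and the explicit formation of M are chosen for clarity here and are not taken from the book.

function x = pcg_ssor(A, b, x, w, max_it, tol)
% Sketch: textbook preconditioned CG with the SSOR matrix M as preconditioner.
% Assumes A is symmetric positive definite and 0 < w < 2.
D   = diag(diag(A));
C_L = -tril(A, -1);
C_U = -triu(A, 1);
M   = w / (2 - w) * (D / w - C_L) * (D \ (D / w - C_U));  % SSOR preconditioner
r  = b - A * x;                  % initial residual
z  = M \ r;                      % preconditioned residual
p  = z;
rz = dot(r, z);
for iter = 1 : max_it
    Ap    = A * p;
    alpha = rz / dot(p, Ap);
    x     = x + alpha * p;       % update iterate
    r     = r - alpha * Ap;      % update residual
    if norm(r) <= tol * norm(b)  % relative residual test
        break
    end
    z      = M \ r;
    rz_new = dot(r, z);
    beta   = rz_new / rz;
    p      = z + beta * p;       % new search direction
    rz     = rz_new;
end
end

In practice one would not form $M$ explicitly; the solve M \ r is carried out as one forward and one backward SOR sweep.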
It's not exactly what I wanted. Preconditioning is another way to "merge" iterative methods (e.g., see the preconditioned conjugate gradient method). – Pennywise, Dec 17 '18 at 19:56
So, it turned out to be simple.
Here is the MATLAB code. It should not need much more explanation; just read "Applied Iterative Methods" by Hageman and Young.
function [x, err, iter, flag] = ssor_cg(A, x, b, w, max_it, tol)
% CG acceleration of the SSOR iteration (Hageman & Young, "Applied Iterative Methods").
% Assumes A is symmetric positive definite and 0 < w < 2.
D   = diag(diag(A));
I   = eye(size(A));
C_L = -tril(A, -1);                 % negated strictly lower triangular part of A
C_U = -triu(A, 1);                  % negated strictly upper triangular part of A
W   = sqrtm(A);                     % symmetrization matrix, W'*W = A
Q   = w / (2 - w) * (D / w - C_L) * (D \ (D / w - C_U));   % SSOR splitting matrix
G   = I - Q \ A;                    % iteration matrix of the basic step x <- G*x + k
k   = Q \ b;
flag = 1;                           % 1 = max_it reached, 0 = converged
for iter = 0 : max_it
    delta = G * x + k - x;          % pseudo-residual of the basic SSOR step
    err   = norm(delta);            % pseudo-residual norm as the error measure
    if err <= tol                   % check for convergence
        flag = 0;
        break
    end
    d2 = dot(W * delta, W * delta);
    if iter == 0
        p = delta;
    else
        alpha = d2 / d2_prev;       % alpha_n = <W d_n, W d_n> / <W d_{n-1}, W d_{n-1}>
        p = delta + alpha * p;      % new direction
    end
    lambda = d2 / dot(W * p, W * (I - G) * p);   % step length
    x = x + lambda * p;             % CG-accelerated update
    d2_prev = d2;
end
end
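For a quick sanity check, here is a small usage example on a made-up SPD test problem (the matrix, right-hand side, and parameters below are illustrative only):

n  = 10;
A  = full(gallery('tridiag', n, -1, 2, -1));  % small SPD test matrix
b  = ones(n, 1);
x0 = zeros(n, 1);
w  = 1.2;                                     % illustrative SSOR relaxation parameter
[x, err, iter, flag] = ssor_cg(A, x0, b, w, 100, 1e-10);
fprintf('iter = %d, pseudo-residual = %.2e, flag = %d, true residual = %.2e\n', ...
        iter, err, flag, norm(b - A * x));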
answered Dec 17 '18 at 19:51 by Pennywise
I'll read it later, but I don't see the difference from preconditioned CG. You basically modify the system so that CG solves the system premultiplied by the SSOR operator. – VorKir, Dec 17 '18 at 20:07
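To spell out the connection the comment points to (a sketch, in the notation of the code above): with $Q$ the SSOR splitting matrix, $G = I - Q^{-1}A$ and $k = Q^{-1}b$, so the pseudo-residual is
$$\delta^{(n)} = Gx^{(n)} + k - x^{(n)} = Q^{-1}\bigl(b - Ax^{(n)}\bigr),$$
i.e. exactly the preconditioned residual of PCG with preconditioner $Q$. In exact arithmetic, CG acceleration of the SSOR iteration and SSOR-preconditioned CG generate the same iterates.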