Solving simple congruences by hand
When I am faced with a simple linear congruence such as
$$9x \equiv 7 \pmod{13}$$
and I am working without any calculating aid handy, I tend to do something like the following:
"Notice" that adding $13$ on the right and subtracting $13x$ on the left gives
$$-4x \equiv 20 \pmod{13},$$
so that
$$x \equiv -5 \equiv 8 \pmod{13}.$$
Clearly this process works and is easy to justify (apart from not having an algorithm for the "noticing"), but my question is this: I have a vague recollection of reading somewhere that this sort of process was the preferred method of C. F. Gauss, but I cannot find any evidence for this now. Does anyone know anything about this, or could anyone provide a reference? (Or have I just imagined it all?)
I would also be interested to hear if anyone else does anything similar.
elementary-number-theory reference-request modular-arithmetic arithmetic
I suppose we all have our own "ad hoc" methods. I would have "noticed" that multiplying by $3$ gives $x \equiv 21 \equiv 8 \pmod{13}$. I have always believed that Gauss invented modular arithmetic. It is certainly discussed at length in Disquisitiones Arithmeticae, which I own a copy of.
– Geoff Robinson
Jul 24 '12 at 15:08
It's not clear to me that the process always works, but it is interesting. It seems to me that the trick is that we get to replace our equation and hope for common prime factors between the coefficients.
– Andrew
Jul 24 '12 at 15:09
@GeoffRobinson I am pretty sure Gauss did invent the notation, and I too have a copy of Disquisitiones Arithmeticae, but I cannot see anything in it along exactly the lines of my question, unfortunately.
– Old John
Jul 24 '12 at 15:13
It is not an algorithm in the precise sense of the word. The idea is basically to multiply or divide (on both sides), or to add or subtract a multiple of the modulus (on either side), to get something equivalent but "simpler". Repeating this vague process a number of times will give a solution (if the original has one), since at each step the multiple of $x$ on the left gets smaller, and the new congruence is equivalent to the previous one.
– Old John
Jul 24 '12 at 15:16
@Old John: Thanks. Yes Bill Dubuque's answer made what was going on reasonably clear.
– Geoff Robinson
Jul 24 '12 at 18:46
edited Mar 9 at 1:39
Martin Sleziak
asked Jul 24 '12 at 14:48
Old John
3 Answers
Generally, if $b$ is coprime to the modulus $m$ then (by Bezout) it is invertible $\bmod m$, so scaling $bx\equiv a$ by $b^{-1}$ we obtain the unique solution $x\equiv b^{-1}a$. We can quickly compute $b^{-1}\pmod{m}$ by the extended Euclidean algorithm, but there are often more convenient ways for smaller numbers. We describe a few of these methods below, where we view $x\equiv b^{-1}a \equiv a/b$ as a modular fraction.
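As a minimal sketch of the extended-Euclidean route just mentioned (the function names `ext_gcd` and `solve_congruence` are my own, not from the post):

```python
def ext_gcd(a, b):
    """Return (g, s, t) with g = gcd(a, b) = s*a + t*b."""
    if b == 0:
        return a, 1, 0
    g, s, t = ext_gcd(b, a % b)
    # gcd(b, a % b) = s*b + t*(a % b)  =>  rearrange for a, b
    return g, t, s - (a // b) * t

def solve_congruence(b, a, m):
    """Solve b*x = a (mod m), assuming gcd(b, m) == 1."""
    g, s, _ = ext_gcd(b, m)
    assert g == 1, "b must be invertible mod m"
    return (s * a) % m

print(solve_congruence(9, 7, 13))  # 8
```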
By Gauss's algorithm, scale $\color{#C00}{\frac{A}{B}} \to \frac{AN}{BN}$ by the least $N$ such that $BN \ge 13$, reduce mod $13$, and iterate:
$${\rm mod}\ 13\!:\ \ \color{#C00}{\frac{7}{9}} \equiv \frac{14}{18} \equiv \color{#C00}{\frac{1}{5}} \equiv \frac{3}{15} \equiv \color{#C00}{\frac{3}{2}} \equiv \frac{21}{14} \equiv \color{#C00}{\frac{8}{1}}$$
Denominators of the $\color{#C00}{\text{reduced}}$ fractions decrease, $\color{#C00}{9 > 5 > 2 > \ldots}$, so they reach $\color{#C00}{1}$ (not $0$, else the denominator would be a proper factor of the prime modulus; the method may fail for a composite modulus).
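The iteration just described can be sketched in Python (the name `gauss_solve` is mine; per the caveat above, this assumes a prime modulus):

```python
def gauss_solve(a, b, p):
    """Solve b*x = a (mod p), p prime, by Gauss's method: repeatedly
    scale the fraction a/b by the least n with b*n >= p, then reduce
    both parts mod p; the denominators strictly decrease down to 1."""
    a, b = a % p, b % p
    while b != 1:
        n = -(-p // b)               # ceil(p / b): least n with b*n >= p
        a, b = (a * n) % p, (b * n) % p
    return a

print(gauss_solve(7, 9, 13))  # 8  (the worked example above: 7/9 mod 13)
```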
Or, simpler, allowing negative residues: $\displaystyle \frac{7}{9} \equiv \frac{7}{-4} \equiv \frac{21}{-12} \equiv \frac{8}{1}.$
This optimization of using balanced residues $0, \pm 1, \pm 2, \ldots$ works for modular arithmetic in general. Here we can also optimize by (sometimes) cancelling obvious common factors, or by pulling out obvious factors of denominators, etc. For example:
$$\frac{7}{9} \equiv \frac{-6}{-4} \equiv \frac{-3}{-2} \equiv \frac{10}{-2} \equiv -5$$
$$\frac{7}{9} \equiv \frac{-1\cdot 6}{3\cdot 3} \equiv \frac{12\cdot 6}{3\cdot 3} \equiv 4\cdot 2$$
As you did, we can check whether the quotient $a/b \equiv (a\pm 13i)/(b\pm 13j)$ is exact for small $i,j$, e.g.
$$\frac{1}{7} \equiv \frac{-12}{-6} \equiv 2;\qquad \frac{5}{7} \equiv \frac{18}{-6} \equiv -3$$
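A sketch of that small search (the helper name `small_twiddle` and the search bound are my own choices, not from the post):

```python
def small_twiddle(a, b, m, bound=3):
    """Look for small i, j making (a + m*i) / (b + m*j) an exact
    integer quotient; return it reduced mod m, or None if not found."""
    for i in range(-bound, bound + 1):
        for j in range(-bound, bound + 1):
            num, den = a + m * i, b + m * j
            if den != 0 and num % den == 0:
                return (num // den) % m
    return None

print(small_twiddle(7, 9, 13))  # 8
```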
When working with smaller numbers there is a higher probability of such optimizations being applicable (the law of small numbers), so they are well worth looking for in manual calculations.
More generally we can make the quotient exact by using Inverse Reciprocity.
$\bmod 13\!:\ \dfrac{a}{b}\equiv \dfrac{a-13\left[\color{#0a0}{\dfrac{a}{13}}\bmod b\right]}{b},\ $ e.g. $\ \dfrac{8}{9}\equiv \dfrac{8-13\overbrace{\left[\dfrac{8}{\color{#C00}{13}}\bmod 9\right]}^{\large\color{#C00}{13\,\equiv\,4}}}{9}\equiv\dfrac{8-13[2]}{9}\equiv -2$
Note that the value $\color{#0a0}{x\equiv a/13}$ is exactly what is needed to make the numerator divisible by $b$, i.e.
$\qquad\quad\bmod b\!:\ a-13[\color{#0a0}{x}]\equiv 0\iff 13x\equiv a\iff \color{#0a0}{x\equiv a/13}$
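A sketch of that inverse-reciprocity step (the name `recip_step` is mine; the three-argument `pow(m, -1, b)` for modular inverses needs Python 3.8+):

```python
def recip_step(a, b, m):
    """Compute a/b mod m by first finding x = a / m (mod b), so that
    a - m*x is exactly divisible by b.  Assumes gcd(b, m) = 1."""
    x = (a * pow(m, -1, b)) % b      # x = a/m mod b
    return ((a - m * x) // b) % m    # exact integer division by b

print(recip_step(8, 9, 13))  # 11, i.e. -2 mod 13, matching the example
```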
Note: "Gauss's algorithm" is my name for a special case of the Euclidean algorithm that is implicit in Gauss's Disquisitiones Arithmeticae, Art. 13, 1801. I don't know whether Gauss explicitly used this algorithm elsewhere (apparently he chose to avoid use or mention of the Euclidean algorithm in Disq. Arith.).
The reformulation in terms of fractions does not occur in Gauss' work as far as I know. I devised it in my youth before I had perused Disq. Arith. It is likely very old but I don't recall seeing it in any literature. I'd be very grateful for any historical references.
Beware: modular fraction arithmetic is valid only for fractions with denominator coprime to the modulus. See here for further discussion.
Thanks for the reference - so it was Gauss after all. The Gauss process is basically what I do, although I keep an eye out for any possible shortcuts, as in the example I gave.
– Old John
Jul 24 '12 at 15:21
@mathh As the linked post says, Gauss's algorithm requires a prime modulus. Generally modular fractions make sense only for denominators coprime to the modulus. Thus when scaling fractions we must restrict to scale factors coprime to the modulus, e.g. in your case we can do $\ {\rm mod}\ 10\!:\ \dfrac{1}{3}\equiv \dfrac{3}{9}\equiv \dfrac{3}{-1} \equiv -3\equiv 7$
– Bill Dubuque
Aug 18 '14 at 16:44
I'm sorry, I deleted the comment before you replied since I found it out. I would add to the above explanations that the denominators can be reduced if and only if the modulus is coprime to them.
– user26486
Aug 18 '14 at 16:51
@Michael Yes, it shows $\bmod 13\!:\ 9^{-1}\equiv \color{#0a0}{2}\times\color{#0a0}{3}\times\color{#0a0}{7}\equiv 3.$ The sequence of *decreasing* multiples of $9$ is given by the reduced denominators (in red in the answer), i.e. as below:
– Bill Dubuque
Jan 15 at 15:51
$$\begin{align}&9 \overset{\large \times\color{#0a0}{2}}{\longrightarrow} 5\overset{\large \times\color{#0a0}{3}}{\longrightarrow} 2\overset{\large \times\color{#0a0}{7}}{\longrightarrow} 1\\[.4em] \Rightarrow\ & 9 \times \color{#0a0}{2} \times \color{#0a0}{3} \times \color{#0a0}{7}\equiv 1\\[.4em] \Rightarrow\ & 9^{-1}\equiv \color{#0a0}{2} \times \color{#0a0}{3} \times \color{#0a0}{7}\equiv 3 \end{align}$$
– Bill Dubuque
Jan 15 at 15:51
When the prime is a reasonably small one I'd rather find directly the inverse:
$$9^{-1}\equiv\frac{1}{9}\equiv 3\pmod{13}\ \Longrightarrow\ 9x\equiv 7\ \Longrightarrow\ x\equiv 7\cdot 9^{-1}\equiv 7\cdot 3\equiv 21\equiv 8\pmod{13}$$
But I try Gauss's method when the prime is big and/or evaluating inverses is messy.
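For what it's worth, Python (3.8+) can compute such an inverse directly with the built-in three-argument `pow`:

```python
# pow(b, -1, m) returns the modular inverse of b mod m (Python 3.8+).
x = (7 * pow(9, -1, 13)) % 13
print(x)  # 8
```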
$endgroup$
add a comment |
$begingroup$
$9x \equiv 7 \pmod{13}$ means $9x = 7 + 13n$ for some integer $n$. Try successive values of $n$:

$9x = 20$ for $n = 1$
$9x = 33$ for $n = 2$
$9x = 46$ for $n = 3$
$9x = 59$ for $n = 4$
$9x = 72$ for $n = 5$

Since $72 = 9 \cdot 8$, we get $x \equiv 8 \pmod{13}$. A multiple of $9$ is guaranteed to appear for some $n < 9$, since $7 + 13n$ runs through every residue mod $9$ as $n$ does.
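The search above, written as a short loop:

```python
# Try n = 0, 1, 2, ... until 7 + 13*n is divisible by 9.
n = 0
while (7 + 13 * n) % 9 != 0:
    n += 1
x = ((7 + 13 * n) // 9) % 13
print(n, x)  # 5 8
```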
$endgroup$
add a comment |
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f174676%2fsolving-simple-congruences-by-hand%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
3 Answers
3
active
oldest
votes
3 Answers
3
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
Generally, if $,b,$ is coprime to the modulus $m$ then (by Bezout) it is invertible $!bmod m,,$ so scaling $,bxequiv a,$ by $,b^{-1},$ we obtain the unique solution $,xequiv b^{-1}a.,$ We can quickly compute $,b^{-1}pmod{!m},$ by the extended Euclidean algorithm, but there are often more convenient ways for smaller numbers. We describe a few of these methods below, where we view $, xequiv b^{-1}a equiv a/b,$ as a modular fraction.
By Gauss' algorithm, scale $rm:color{#C00}{frac{A}B} to frac{AN}{BN}: $ by the least $rm,N,$ so that $rm, BN ge 13,, $ reduce mod $13,,$ iterate.
$$rmdisplaystyle mod 13!: color{#C00}{frac{7}9} ,equiv, frac{14}{18}, equiv, color{#C00}{frac{1}5},equiv, frac{3}{15},equiv, color{#C00}{frac{3}2} ,equiv, frac{21}{14} ,equiv, color{#C00}{frac{8}1}$$
Denominators of the $color{#c00}{rm reduced}$ fractions decrease $,color{#C00}{ 9 > 5 > 2> ldots},$ so reach $color{#C00}{1},$ (not $,0,$ else the denominator would be a proper factor of the prime modulus; it may fail for composite modulus)
Or, simpler, allowing negative residues $displaystyle frac{7}9,equiv, frac{7}{!-4! ,},equiv,frac{21}{!!-12 !!},equiv, frac{8}1$
This optimization of using balanced residues $0,pm 1, pm 2.ldots$ works for modular arithmetic in general. Here we can also optimize by (sometimes) cancelling obvious common factors, or by pulling out obvious factors of denominators, etc. For example
$$frac{7}9,equiv, frac{!-6,}{!-4,},equivfrac{!-3,}{!-2,},equivfrac{10}{!-2,},equiv,-5$$
$$frac{7}9,equiv,frac{!-1cdot 6}{ 3cdot 3},equiv,frac{!,12cdot 6!}{ ,3cdot 3},equiv, 4cdot 2$$
As you did, we can check if the quotient $rm,a/bequiv (apm!13,i)/(bpm!13,j),$ is exact for small $rm,i,j,,$ e.g.
$$ frac{1}7,equiv frac{!-12}{-6},equiv, 2; frac{5}7,equiv,frac{18}{!-6!,},equiv -3$$
When working with smaller numbers there is a higher probability of such optimizations being applicable (the law of small numbers), so it's well-worth looking for such in manual calculations.
More generally we can make the quotient exact by using Inverse Reciprocity.
$bmod 13!: dfrac{a}{b}equiv dfrac{a-13left[color{#0a0}{dfrac{a}{13}}bmod bright]}b, $ e.g. $, dfrac{8}9equiv dfrac{8-13overbrace{left[dfrac{8}{color{#c00}{13}}bmod 9right]}^{largecolor{#c00}{ 13 ,equiv, 4 }}}9equivdfrac{8-13[2]}9equiv-2$
Note that the value $,color{#0a0}{xequiv a/13},$ is what is needed to make the numerator divisible by $b,,$ i.e.
$qquadquadbmod b!:, a-13,[color{#0a0}x]equiv 0iff 13xequiv aiff color{#0a0}{xequiv a/13}$
Note $ $ Gauss' algorithm is my name for a special case of the Euclidean algorithm that's implicit in Gauss' Disquisitiones Arithmeticae, Art. 13, 1801. I don't know if Gauss explicitly used this algorithm elsewhere (apparently he chose to avoid use or mention of the Euclidean algorithm in Disq. Arith.).
The reformulation in terms of fractions does not occur in Gauss' work as far as I know. I devised it in my youth before I had perused Disq. Arith. It is likely very old but I don't recall seeing it in any literature. I'd be very grateful for any historical references.
Beware $ $ Modular fraction arithmetic is valid only for fractions with denominator coprime to the modulus. See here for further discussion.
$endgroup$
$begingroup$
Thanks for the reference - so it was Gauss after all. The Gauss process is basically what I do, although I keep an eye out for any possible shortcuts, as in the example I gave.
$endgroup$
– Old John
Jul 24 '12 at 15:21
$begingroup$
@mathh As the linked post says, Gauss's algorithm requires prime modulus. Generally modular fractions make sense only for denominators coprime to the modulus. Thus when scaling fractions we must restrict to scale factors coprime to the modulus, e.g. in your case we can do $tag*{}$ ${rm mod} 10!: dfrac{1}3equiv dfrac{3}9equiv dfrac{3}{-1} equiv -3equiv 7 $
$endgroup$
– Bill Dubuque
Aug 18 '14 at 16:44
$begingroup$
I'm sorry, I deleted the comment before you replied since I found it out. I would add to the above explanations that the denominators can be reduced if and only if the modulus is coprime to them.
$endgroup$
– user26486
Aug 18 '14 at 16:51
1
$begingroup$
@Michael Yes, it shows $bmod 13!: 9^{-1}equiv color{#0a0}2timescolor{#0a0}3timescolor{#0a0}7equiv 3. $ The sequence of *decreasing* multiples of $9$ is comprised by the reduced denominators (in red in the answer), i.e. as below
$endgroup$
– Bill Dubuque
Jan 15 at 15:51
1
$begingroup$
$${begin{align}&9 overset{large timescolor{#0a0} 2}longrightarrow 5overset{large timescolor{#0a0} 3}longrightarrow 2,overset{large timescolor{#0a0} 7}longrightarrow, 1\[.4em] Rightarrow & 9 times color{#0a0}2 times color{#0a0}3 times color{#0a0}7equiv 1\[.4em] Rightarrow &9^{-1}!equiv color{#0a0}2 times color{#0a0}3 times color{#0a0}7equiv 3& end{align}quadqquad}$$
$endgroup$
– Bill Dubuque
Jan 15 at 15:51
|
show 25 more comments
$begingroup$
Generally, if $,b,$ is coprime to the modulus $m$ then (by Bezout) it is invertible $!bmod m,,$ so scaling $,bxequiv a,$ by $,b^{-1},$ we obtain the unique solution $,xequiv b^{-1}a.,$ We can quickly compute $,b^{-1}pmod{!m},$ by the extended Euclidean algorithm, but there are often more convenient ways for smaller numbers. We describe a few of these methods below, where we view $, xequiv b^{-1}a equiv a/b,$ as a modular fraction.
By Gauss' algorithm, scale $rm:color{#C00}{frac{A}B} to frac{AN}{BN}: $ by the least $rm,N,$ so that $rm, BN ge 13,, $ reduce mod $13,,$ iterate.
$$rmdisplaystyle mod 13!: color{#C00}{frac{7}9} ,equiv, frac{14}{18}, equiv, color{#C00}{frac{1}5},equiv, frac{3}{15},equiv, color{#C00}{frac{3}2} ,equiv, frac{21}{14} ,equiv, color{#C00}{frac{8}1}$$
Denominators of the $color{#c00}{rm reduced}$ fractions decrease $,color{#C00}{ 9 > 5 > 2> ldots},$ so reach $color{#C00}{1},$ (not $,0,$ else the denominator would be a proper factor of the prime modulus; it may fail for composite modulus)
Or, simpler, allowing negative residues $displaystyle frac{7}9,equiv, frac{7}{!-4! ,},equiv,frac{21}{!!-12 !!},equiv, frac{8}1$
This optimization of using balanced residues $0,pm 1, pm 2.ldots$ works for modular arithmetic in general. Here we can also optimize by (sometimes) cancelling obvious common factors, or by pulling out obvious factors of denominators, etc. For example
$$frac{7}9,equiv, frac{!-6,}{!-4,},equivfrac{!-3,}{!-2,},equivfrac{10}{!-2,},equiv,-5$$
$$frac{7}9,equiv,frac{!-1cdot 6}{ 3cdot 3},equiv,frac{!,12cdot 6!}{ ,3cdot 3},equiv, 4cdot 2$$
As you did, we can check if the quotient $rm,a/bequiv (apm!13,i)/(bpm!13,j),$ is exact for small $rm,i,j,,$ e.g.
$$ frac{1}7,equiv frac{!-12}{-6},equiv, 2; frac{5}7,equiv,frac{18}{!-6!,},equiv -3$$
When working with smaller numbers there is a higher probability of such optimizations being applicable (the law of small numbers), so it's well-worth looking for such in manual calculations.
More generally we can make the quotient exact by using Inverse Reciprocity.
$bmod 13!: dfrac{a}{b}equiv dfrac{a-13left[color{#0a0}{dfrac{a}{13}}bmod bright]}b, $ e.g. $, dfrac{8}9equiv dfrac{8-13overbrace{left[dfrac{8}{color{#c00}{13}}bmod 9right]}^{largecolor{#c00}{ 13 ,equiv, 4 }}}9equivdfrac{8-13[2]}9equiv-2$
Note that the value $,color{#0a0}{xequiv a/13},$ is what is needed to make the numerator divisible by $b,,$ i.e.
$qquadquadbmod b!:, a-13,[color{#0a0}x]equiv 0iff 13xequiv aiff color{#0a0}{xequiv a/13}$
Note $ $ Gauss' algorithm is my name for a special case of the Euclidean algorithm that's implicit in Gauss' Disquisitiones Arithmeticae, Art. 13, 1801. I don't know if Gauss explicitly used this algorithm elsewhere (apparently he chose to avoid use or mention of the Euclidean algorithm in Disq. Arith.).
The reformulation in terms of fractions does not occur in Gauss' work as far as I know. I devised it in my youth before I had perused Disq. Arith. It is likely very old but I don't recall seeing it in any literature. I'd be very grateful for any historical references.
Beware $ $ Modular fraction arithmetic is valid only for fractions with denominator coprime to the modulus. See here for further discussion.
$endgroup$
$begingroup$
Thanks for the reference - so it was Gauss after all. The Gauss process is basically what I do, although I keep an eye out for any possible shortcuts, as in the example I gave.
$endgroup$
– Old John
Jul 24 '12 at 15:21
$begingroup$
@mathh As the linked post says, Gauss's algorithm requires prime modulus. Generally modular fractions make sense only for denominators coprime to the modulus. Thus when scaling fractions we must restrict to scale factors coprime to the modulus, e.g. in your case we can do $tag*{}$ ${rm mod} 10!: dfrac{1}3equiv dfrac{3}9equiv dfrac{3}{-1} equiv -3equiv 7 $
$endgroup$
– Bill Dubuque
Aug 18 '14 at 16:44
$begingroup$
I'm sorry, I deleted the comment before you replied since I found it out. I would add to the above explanations that the denominators can be reduced if and only if the modulus is coprime to them.
$endgroup$
– user26486
Aug 18 '14 at 16:51
1
$begingroup$
@Michael Yes, it shows $bmod 13!: 9^{-1}equiv color{#0a0}2timescolor{#0a0}3timescolor{#0a0}7equiv 3. $ The sequence of *decreasing* multiples of $9$ is comprised by the reduced denominators (in red in the answer), i.e. as below
$endgroup$
– Bill Dubuque
Jan 15 at 15:51
1
$begingroup$
$${begin{align}&9 overset{large timescolor{#0a0} 2}longrightarrow 5overset{large timescolor{#0a0} 3}longrightarrow 2,overset{large timescolor{#0a0} 7}longrightarrow, 1\[.4em] Rightarrow & 9 times color{#0a0}2 times color{#0a0}3 times color{#0a0}7equiv 1\[.4em] Rightarrow &9^{-1}!equiv color{#0a0}2 times color{#0a0}3 times color{#0a0}7equiv 3& end{align}quadqquad}$$
$endgroup$
– Bill Dubuque
Jan 15 at 15:51
|
show 25 more comments
$begingroup$
Generally, if $,b,$ is coprime to the modulus $m$ then (by Bezout) it is invertible $!bmod m,,$ so scaling $,bxequiv a,$ by $,b^{-1},$ we obtain the unique solution $,xequiv b^{-1}a.,$ We can quickly compute $,b^{-1}pmod{!m},$ by the extended Euclidean algorithm, but there are often more convenient ways for smaller numbers. We describe a few of these methods below, where we view $, xequiv b^{-1}a equiv a/b,$ as a modular fraction.
By Gauss' algorithm, scale $rm:color{#C00}{frac{A}B} to frac{AN}{BN}: $ by the least $rm,N,$ so that $rm, BN ge 13,, $ reduce mod $13,,$ iterate.
$$rmdisplaystyle mod 13!: color{#C00}{frac{7}9} ,equiv, frac{14}{18}, equiv, color{#C00}{frac{1}5},equiv, frac{3}{15},equiv, color{#C00}{frac{3}2} ,equiv, frac{21}{14} ,equiv, color{#C00}{frac{8}1}$$
Denominators of the $color{#c00}{rm reduced}$ fractions decrease $,color{#C00}{ 9 > 5 > 2> ldots},$ so reach $color{#C00}{1},$ (not $,0,$ else the denominator would be a proper factor of the prime modulus; it may fail for composite modulus)
Or, simpler, allowing negative residues $displaystyle frac{7}9,equiv, frac{7}{!-4! ,},equiv,frac{21}{!!-12 !!},equiv, frac{8}1$
This optimization of using balanced residues $0,pm 1, pm 2.ldots$ works for modular arithmetic in general. Here we can also optimize by (sometimes) cancelling obvious common factors, or by pulling out obvious factors of denominators, etc. For example
$$frac{7}9,equiv, frac{!-6,}{!-4,},equivfrac{!-3,}{!-2,},equivfrac{10}{!-2,},equiv,-5$$
$$frac{7}9,equiv,frac{!-1cdot 6}{ 3cdot 3},equiv,frac{!,12cdot 6!}{ ,3cdot 3},equiv, 4cdot 2$$
As you did, we can check if the quotient $rm,a/bequiv (apm!13,i)/(bpm!13,j),$ is exact for small $rm,i,j,,$ e.g.
$$ frac{1}7,equiv frac{!-12}{-6},equiv, 2; frac{5}7,equiv,frac{18}{!-6!,},equiv -3$$
When working with smaller numbers there is a higher probability of such optimizations being applicable (the law of small numbers), so it's well-worth looking for such in manual calculations.
More generally we can make the quotient exact by using Inverse Reciprocity.
$bmod 13!: dfrac{a}{b}equiv dfrac{a-13left[color{#0a0}{dfrac{a}{13}}bmod bright]}b, $ e.g. $, dfrac{8}9equiv dfrac{8-13overbrace{left[dfrac{8}{color{#c00}{13}}bmod 9right]}^{largecolor{#c00}{ 13 ,equiv, 4 }}}9equivdfrac{8-13[2]}9equiv-2$
Note that the value $,color{#0a0}{xequiv a/13},$ is what is needed to make the numerator divisible by $b,,$ i.e.
$qquadquadbmod b!:, a-13,[color{#0a0}x]equiv 0iff 13xequiv aiff color{#0a0}{xequiv a/13}$
Note $ $ Gauss' algorithm is my name for a special case of the Euclidean algorithm that's implicit in Gauss' Disquisitiones Arithmeticae, Art. 13, 1801. I don't know if Gauss explicitly used this algorithm elsewhere (apparently he chose to avoid use or mention of the Euclidean algorithm in Disq. Arith.).
The reformulation in terms of fractions does not occur in Gauss' work as far as I know. I devised it in my youth before I had perused Disq. Arith. It is likely very old but I don't recall seeing it in any literature. I'd be very grateful for any historical references.
Beware $ $ Modular fraction arithmetic is valid only for fractions with denominator coprime to the modulus. See here for further discussion.
$endgroup$
Generally, if $,b,$ is coprime to the modulus $m$ then (by Bezout) it is invertible $!bmod m,,$ so scaling $,bxequiv a,$ by $,b^{-1},$ we obtain the unique solution $,xequiv b^{-1}a.,$ We can quickly compute $,b^{-1}pmod{!m},$ by the extended Euclidean algorithm, but there are often more convenient ways for smaller numbers. We describe a few of these methods below, where we view $, xequiv b^{-1}a equiv a/b,$ as a modular fraction.
By Gauss' algorithm, scale $\color{#C00}{\frac{A}B}\to\frac{AN}{BN}$ by the least $N$ such that $BN\ge 13$, reduce mod $13$, and iterate:
$$\bmod 13\!:\ \ \color{#C00}{\frac{7}9}\,\equiv\,\frac{14}{18}\,\equiv\,\color{#C00}{\frac{1}5}\,\equiv\,\frac{3}{15}\,\equiv\,\color{#C00}{\frac{3}2}\,\equiv\,\frac{21}{14}\,\equiv\,\color{#C00}{\frac{8}1}$$
Denominators of the $\color{#C00}{\text{reduced}}$ fractions decrease: $\color{#C00}{9 > 5 > 2 > \ldots}$, so they reach $\color{#C00}{1}$ (not $0$, else the denominator would be a proper factor of the prime modulus; the method may fail for a composite modulus).
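The descent above is mechanical enough to sketch in a few lines of Python (the function name `gauss_solve` is mine, not from the answer); it reproduces the chain $7/9\to 1/5\to 3/2\to 8/1$ shown above.

```python
def gauss_solve(a, b, p):
    """Solve b*x ≡ a (mod p) for prime p with gcd(b, p) = 1,
    by repeatedly scaling a/b by the least N with b*N >= p."""
    a %= p
    b %= p
    while b != 1:
        n = -(-p // b)       # least N with b*N >= p (ceiling division)
        a = (a * n) % p
        b = (b * n) % p      # reduced denominator strictly decreases
    return a

print(gauss_solve(7, 9, 13))  # 8
```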
Or, simpler, allowing negative residues: $\displaystyle\ \frac{7}9\,\equiv\,\frac{7}{-4}\,\equiv\,\frac{21}{-12}\,\equiv\,\frac{8}1$
This optimization of using balanced residues $0,\pm 1,\pm 2,\ldots$ works for modular arithmetic in general. Here we can also optimize by (sometimes) cancelling obvious common factors, or by pulling out obvious factors of denominators, etc. For example
$$\frac{7}9\,\equiv\,\frac{-6}{-4}\,\equiv\,\frac{-3}{-2}\,\equiv\,\frac{10}{-2}\,\equiv\,-5$$
$$\frac{7}9\,\equiv\,\frac{-1\cdot 6}{3\cdot 3}\,\equiv\,\frac{12\cdot 6}{3\cdot 3}\,\equiv\,4\cdot 2$$
As you did, we can check whether the quotient $a/b\equiv (a\pm 13\,i)/(b\pm 13\,j)$ is exact for small $i,j$, e.g.
$$\frac{1}7\,\equiv\,\frac{-12}{-6}\,\equiv\,2;\qquad \frac{5}7\,\equiv\,\frac{18}{-6}\,\equiv\,-3$$
When working with smaller numbers there is a higher probability of such optimizations being applicable (the law of small numbers), so it is well worth looking for them in manual calculations.
More generally, we can make the quotient exact by using Inverse Reciprocity:
$$\bmod 13\!:\ \ \dfrac{a}{b}\equiv\dfrac{a-13\left[\color{#0a0}{\dfrac{a}{13}}\bmod b\right]}b,\quad\text{e.g.}\quad \dfrac{8}9\equiv\dfrac{8-13\overbrace{\left[\dfrac{8}{\color{#c00}{13}}\bmod 9\right]}^{\large\color{#c00}{13\,\equiv\,4}}}9\equiv\dfrac{8-13[2]}9\equiv-2$$
Note that the value $\color{#0a0}{x\equiv a/13}$ is what is needed to make the numerator divisible by $b$, i.e.
$$\bmod b\!:\ \ a-13\,[\color{#0a0}{x}]\equiv 0\iff 13x\equiv a\iff \color{#0a0}{x\equiv a/13}$$
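A recursive Python sketch of this descent (the name `mod_frac` is mine, not from the answer): the bracketed value $a/13\bmod b$ is obtained by the same method with the smaller modulus $b$, so the recursion is just the Euclidean algorithm in disguise.

```python
def mod_frac(a, b, m):
    """Compute a * b^{-1} (mod m), assuming gcd(b, m) = 1."""
    b %= m
    if b == 1:
        return a % m
    x = mod_frac(a, m, b)          # x ≡ a/m (mod b), recursion on smaller modulus
    return ((a - m * x) // b) % m  # numerator a - m*x is exactly divisible by b

# The answer's example 8/9 ≡ -2 ≡ 11 (mod 13):
print(mod_frac(8, 9, 13))  # 11
```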
Note $ $ Gauss' algorithm is my name for a special case of the Euclidean algorithm that's implicit in Gauss' Disquisitiones Arithmeticae, Art. 13, 1801. I don't know if Gauss explicitly used this algorithm elsewhere (apparently he chose to avoid use or mention of the Euclidean algorithm in Disq. Arith.).
The reformulation in terms of fractions does not occur in Gauss' work as far as I know. I devised it in my youth before I had perused Disq. Arith. It is likely very old but I don't recall seeing it in any literature. I'd be very grateful for any historical references.
Beware $ $ Modular fraction arithmetic is valid only for fractions with denominator coprime to the modulus. See here for further discussion.
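A tiny demonstration of this warning (my example, not from the answer): mod $10$ the "fraction" $2/4$ does not name a unique residue, since $4x\equiv 2\pmod{10}$ has two solutions.

```python
# gcd(4, 10) = 2 != 1, so 4 is not invertible mod 10 and
# 4x ≡ 2 (mod 10) has more than one solution:
solutions = [x for x in range(10) if (4 * x) % 10 == 2]
print(solutions)  # [3, 8]
```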
edited Mar 24 at 20:53
answered Jul 24 '12 at 15:11
Bill Dubuque
$begingroup$
Thanks for the reference - so it was Gauss after all. The Gauss process is basically what I do, although I keep an eye out for any possible shortcuts, as in the example I gave.
$endgroup$
– Old John
Jul 24 '12 at 15:21
$begingroup$
@mathh As the linked post says, Gauss's algorithm requires a prime modulus. Generally modular fractions make sense only for denominators coprime to the modulus. Thus when scaling fractions we must restrict to scale factors coprime to the modulus, e.g. in your case we can do $\bmod 10\!:\ \dfrac{1}3\equiv\dfrac{3}9\equiv\dfrac{3}{-1}\equiv -3\equiv 7$
$endgroup$
– Bill Dubuque
Aug 18 '14 at 16:44
$begingroup$
I'm sorry, I deleted the comment before you replied, since I had figured it out. I would add to the above explanations that a denominator can be cancelled if and only if it is coprime to the modulus.
$endgroup$
– user26486
Aug 18 '14 at 16:51
$begingroup$
@Michael Yes, it shows $\bmod 13\!:\ 9^{-1}\equiv \color{#0a0}2\times\color{#0a0}3\times\color{#0a0}7\equiv 3$. The sequence of *decreasing* multiples of $9$ is comprised of the reduced denominators (in red in the answer), i.e. as below
$endgroup$
– Bill Dubuque
Jan 15 at 15:51
$begingroup$
$$\begin{align}&9 \overset{\large\times\color{#0a0}2}\longrightarrow 5\overset{\large\times\color{#0a0}3}\longrightarrow 2\overset{\large\times\color{#0a0}7}\longrightarrow 1\\[.4em] \Rightarrow\ & 9\times\color{#0a0}2\times\color{#0a0}3\times\color{#0a0}7\equiv 1\\[.4em] \Rightarrow\ & 9^{-1}\equiv\color{#0a0}2\times\color{#0a0}3\times\color{#0a0}7\equiv 3\end{align}$$
$endgroup$
– Bill Dubuque
Jan 15 at 15:51
$begingroup$
When the prime is reasonably small I'd rather find the inverse directly:
$$9^{-1}=\frac{1}{9}=3\pmod{13}\ \Longrightarrow\ 9x=7\ \Longrightarrow\ x=7\cdot 9^{-1}=7\cdot 3=21=8\pmod{13}$$
But... I try Gauss's method when the prime is big and/or evaluating inverses is messy.
$endgroup$
answered Jul 24 '12 at 15:10
DonAntonio
$begingroup$
$9x \equiv 7 \pmod{13}$, so $9x = 7 + 13n$ for some integer $n$:
$9x = 20$ for $n = 1$
$9x = 33$ for $n = 2$
$9x = 46$ for $n = 3$
$9x = 59$ for $n = 4$
$9x = 72$ for $n = 5$
Then $x = 8 \pmod{13}$.
You arrive at the correct answer before $n = 13$.
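This scan is easy to sketch in Python (a direct transcription of the answer, with my variable names):

```python
# Scan n = 1, 2, ... until 7 + 13n is a multiple of 9.
n = 1
while (7 + 13 * n) % 9 != 0:
    n += 1
x = (7 + 13 * n) // 9
print(n, x)  # 5 8
```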
$endgroup$
edited Oct 23 '14 at 15:38
answered Oct 22 '14 at 22:58
John Butnor
$begingroup$
I suppose we all have our own "ad hoc" methods. I would have "noticed" that multiplying by $3$ gives $x \equiv 21 \equiv 8 \pmod{13}$. I have always believed that Gauss invented modular arithmetic. It is certainly discussed at length in Disquisitiones Arithmeticae, which I own a copy of.
$endgroup$
– Geoff Robinson
Jul 24 '12 at 15:08
$begingroup$
It's not clear to me that the process always works, but it is interesting. It seems to me that the trick is that we get to replace our equation and hope for common prime factors between the coefficients.
$endgroup$
– Andrew
Jul 24 '12 at 15:09
$begingroup$
@GeoffRobinson I am pretty sure Gauss did invent the notation, and I too have a copy of Disquisitiones Arithmeticae, but I cannot see anything in it along exactly the lines of my question, unfortunately.
$endgroup$
– Old John
Jul 24 '12 at 15:13
$begingroup$
It is not an algorithm in the precise sense of the word. The idea is basically to multiply or divide (on both sides) or add (or subtract) a multiple of the modulus (on either side) to get something which is equivalent, but "simpler". Repeating this vague process a number of times will give a solution (if the original has a solution) as at each step the multiple of $x$ on the left is going to get smaller, and the new congruence is equivalent to the previous one.
$endgroup$
– Old John
Jul 24 '12 at 15:16
$begingroup$
@Old John: Thanks. Yes Bill Dubuque's answer made what was going on reasonably clear.
$endgroup$
– Geoff Robinson
Jul 24 '12 at 18:46