Matrices Commuting with a Kronecker Sum

Throughout, let $A$ and $B$ be complex $m \times m$ and $n \times n$ matrices respectively. By $A \otimes B$, we mean the matrix formed from the Kronecker product of $A$ and $B$, and by $A \oplus B$, we mean the matrix formed by the Kronecker sum of $A$ and $B$, namely $$A \otimes I_n + I_m \otimes B,$$ where $I_r$ is the $r \times r$ identity matrix.
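For concreteness, the Kronecker sum is easy to form numerically. A minimal NumPy sketch (the helper name `kron_sum` is just for this illustration), with a sanity check of the standard fact that the eigenvalues of $A \oplus B$ are the pairwise sums of the eigenvalues of $A$ and $B$:

```python
import numpy as np

def kron_sum(A, B):
    """Kronecker sum A ⊕ B = A ⊗ I_n + I_m ⊗ B, for A (m×m) and B (n×n)."""
    m, n = A.shape[0], B.shape[0]
    return np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)

# Sanity check: the eigenvalues of A ⊕ B are the sums λ_i + μ_j.
A, B = np.random.rand(3, 3), np.random.rand(2, 2)
S = kron_sum(A, B)
sums = np.add.outer(np.linalg.eigvals(A), np.linalg.eigvals(B)).ravel()
print(np.allclose(np.sort_complex(np.linalg.eigvals(S)), np.sort_complex(sums)))
```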



Let $C$ be an arbitrary $mn \times mn$ matrix that commutes with $A \oplus B$. What (if anything) can be said about $C$?



For example, one could write $C$ as
$$C=\sum_{k=1}^m\sum_{l=1}^n (X_{kl} \otimes e^n_{kl})=\sum_{i=1}^m\sum_{j=1}^n (e^m_{ij} \otimes Y_{ij}),$$
where $e^r_{ij}$ is the standard $r \times r$ matrix with a 1 as the $(i,j)$-th entry and every other entry 0, and $X_{kl}$ and $Y_{ij}$ are $m \times m$ and $n \times n$ matrices respectively. If $C$ commutes with $A \oplus B$, then does every matrix $X_{kl}$ commute with $A$ and every matrix $Y_{ij}$ commute with $B$?
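These blocks can be read off numerically by reshaping $C$ into an $m \times n \times m \times n$ array: $X_{kl}$ is the slice with the two inner indices fixed at $(k,l)$, and $Y_{ij}$ the slice with the two outer indices fixed at $(i,j)$. A minimal NumPy sketch (0-based indices; the helper names are ad hoc):

```python
import numpy as np

def X_block(C, m, n, k, l):
    """X_{kl} in C = Σ_{k,l} X_{kl} ⊗ e^n_{kl}:  X_{kl}[a, b] = C[a*n + k, b*n + l]."""
    return C.reshape(m, n, m, n)[:, k, :, l]

def Y_block(C, m, n, i, j):
    """Y_{ij} in C = Σ_{i,j} e^m_{ij} ⊗ Y_{ij}:  Y_{ij}[p, q] = C[i*n + p, j*n + q]."""
    return C.reshape(m, n, m, n)[i, :, j, :]

# Sanity check: C is recovered from its X-blocks (and likewise from its Y-blocks).
m, n = 3, 2
C = np.random.rand(m * n, m * n)
e = lambda r, i, j: np.outer(np.eye(r)[i], np.eye(r)[j])   # e^r_{ij}
rebuilt = sum(np.kron(X_block(C, m, n, k, l), e(n, k, l))
              for k in range(n) for l in range(n))
print(np.allclose(C, rebuilt))
```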



I have found no counterexamples thus far to the above, but I also fail to see why it might be true in general. If we write
$$C(A \oplus B)=(A \oplus B)C$$
then we can deduce from the mixed-multiplication property of Kronecker products that
$$\sum_{k=1}^m\sum_{l=1}^n ((X_{kl}A-A X_{kl}) \otimes e^n_{kl}) + \sum_{i=1}^m\sum_{j=1}^n (e^m_{ij} \otimes (Y_{ij}B-B Y_{ij}))=0,$$
but it doesn't seem clear to me at all that one might be able to deduce from this that $X_{kl}A-A X_{kl}=0$ and $Y_{ij}B-B Y_{ij}=0$ for any $i,j,k,l$.
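One way to probe the question numerically, as an exploration rather than a proof: compute a basis of the commutant of $A \oplus B$ (the null space of $C \mapsto (A \oplus B)C - C(A \oplus B)$), draw a random commuting $C$ from its span, extract the blocks $X_{kl}$ as above, and inspect their commutators with $A$. A sketch with ad hoc helper names:

```python
import numpy as np

def commutant_basis(S, tol=1e-9):
    """Numerical basis of {C : SC = CS}: null space of the map C ↦ SC - CS."""
    N = S.shape[0]
    L = np.zeros((N * N, N * N))
    for idx in range(N * N):
        E = np.zeros((N, N))
        E.flat[idx] = 1.0
        L[:, idx] = (S @ E - E @ S).ravel()
    _, sing, Vt = np.linalg.svd(L)
    return [Vt[i].reshape(N, N) for i in range(N * N) if sing[i] < tol]

m, n = 3, 2
A, B = np.random.rand(m, m), np.random.rand(n, n)
S = np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)            # A ⊕ B
C = sum(np.random.randn() * M for M in commutant_basis(S))   # a random commuting C
C4 = C.reshape(m, n, m, n)
worst = max(np.abs(C4[:, k, :, l] @ A - A @ C4[:, k, :, l]).max()
            for k in range(n) for l in range(n))
print("largest entry of any commutator [X_kl, A]:", worst)
```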
linear-algebra matrices kronecker-product

asked Dec 7 '18 at 13:38 by Iteraf
          1 Answer

Let $\operatorname{spectrum}(A)=(\lambda_i)_{i\leq m}$ and $\operatorname{spectrum}(B)=(\mu_j)_{j\leq n}$.



We consider the case when $A,B$ are generic (for example, take random $A,B$). Then the $(\lambda_i)$ (resp. the $(\mu_j)$) are distinct.



Moreover, $\operatorname{spectrum}(A\oplus B)=(\lambda_i+\mu_j)_{i,j}$ has $mn$ distinct elements (note that $A\otimes I$ and $I\otimes B$ commute).



Then the commutant $C(A\oplus B)$, that is, the set of all matrices that commute with $A\oplus B$, is a vector space of dimension $mn$, consisting of the polynomials in $A\oplus B$.



Finally, the $mn$ linearly independent matrices of the form $A^i\otimes B^j$, $i<m$, $j<n$, constitute a basis of $C(A\oplus B)$.
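A minimal NumPy sketch of the random-matrix experiment suggested above (take random $A,B$), checking these claims for a single draw:

```python
import numpy as np

m, n = 3, 2
N = m * n
A, B = np.random.rand(m, m), np.random.rand(n, n)
S = np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)            # A ⊕ B

# 1) The mn sums λ_i + μ_j are pairwise distinct.
sums = np.add.outer(np.linalg.eigvals(A), np.linalg.eigvals(B)).ravel()
gap = min(abs(sums[i] - sums[j]) for i in range(N) for j in range(i))
print("distinct eigenvalue sums:", gap > 1e-8)

# 2) The commutant of A ⊕ B has dimension mn (nullity of C ↦ SC - CS).
L = np.zeros((N * N, N * N))
for idx in range(N * N):
    E = np.zeros((N, N))
    E.flat[idx] = 1.0
    L[:, idx] = (S @ E - E @ S).ravel()
print("commutant dimension:", N * N - np.linalg.matrix_rank(L))

# 3) The mn matrices A^i ⊗ B^j (i < m, j < n) are independent and commute with A ⊕ B.
P = np.column_stack([np.kron(np.linalg.matrix_power(A, i),
                             np.linalg.matrix_power(B, j)).ravel()
                     for i in range(m) for j in range(n)])
print("linearly independent:", np.linalg.matrix_rank(P) == N)
print("commute with A ⊕ B:",
      max(np.abs(S @ v.reshape(N, N) - v.reshape(N, N) @ S).max() for v in P.T) < 1e-8)
```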



          EDIT. Answer to the OP. I think you did not understand one word of my post.



For i): a generic matrix $A=[a_{i,j}]$ is such that there are no algebraic relations between the $(a_{i,j})$. More precisely, the $(a_{i,j})$ are said to be parameters (they are mutually transcendental over $\mathbb{C}$). You can simulate such a matrix by choosing it at random. Do this with your PC instead of writing pseudo-counterexamples; you will find in particular that, for such matrices $A,B$, the $\lambda_i+\mu_j$ are distinct.



For ii): my friend, $C(A\oplus B)$ is the commutant of $A\oplus B$ (well-known notation) and, therefore, is a vector space. On the other hand, the commutant of a matrix that has distinct eigenvalues consists of the polynomials in this matrix.



For iii): your counterexamples are only particular well-known cases (all that you write is absolutely standard and is not the object of my post). With probability $1$, the commutant of your matrix admits the $(A^i\otimes B^j)$ as a basis.



For iv): when one does not understand, one asks. I do not intend to waste any more time on this question.
answered Dec 8 '18 at 20:19 by loup blanc; edited Dec 10 '18 at 0:28
• Thank you for your response. It is very much appreciated! I have a few questions/comments: 1) Why do you say that the $(\lambda_i)$ and $(\mu_i)$ are distinct? This need not be true in general. A trivial example is if we take $A=B$, but it need not even be true for two matrices that are not similar. For example, take $A=J_{1,3}$ and $B$ to be a direct sum of $J_{1,2}$ and $J_{1,1}$, where $J_{\lambda,r}$ is a Jordan block of size $r$ corresponding to an eigenvalue $\lambda$. In this example, $A$ is not similar to $B$, and yet they have the same spectra. – Iteraf, Dec 9 '18 at 22:57
• 2) To be clear here, you are considering the matrix $C(A \oplus B)$ itself as a vector space? What exactly do you mean by 'constituted by the polynomials in $A \oplus B$'? What polynomials, and what precisely is meant here by 'constituted'? – Iteraf, Dec 9 '18 at 22:58
• 3) I fail to see how $A^i \otimes B^j$ can in generality be a basis for $C(A \otimes B)$ whilst the latter has dimension $mn$. For example, take $A=B=e_{13}^3$ (so $A^2=B^2=0$). Similarly take $C=e_{19}^9$. Then $C(A \oplus B)=0$, and thus is 0-dimensional if considered as a (matrix) vector space, whereas $A \otimes B=e_{19}^9$ and is thus 1-dimensional. – Iteraf, Dec 9 '18 at 22:58
• 4) I'm not sure how your response as a whole moves towards answering the questions I had about the properties of the commuting matrix $C$ (or the matrices $X_{kl}$ and $Y_{ij}$). – Iteraf, Dec 9 '18 at 22:58
• My apologies if you took offence at my questions and comments. It was genuinely an attempt to understand and to clarify some points of confusion, and it was not my intention to come across in any other way. With (4), this was intended as a prompt for further clarification, but I appreciate I was not clear here. – Iteraf, Dec 10 '18 at 1:43