Relation between independence and correlation of uniform random variables

My question is fairly simple: let $X$ and $Y$ be two uncorrelated uniform random variables on $[-1,1]$. Are they independent?



I was under the impression that two uncorrelated random variables are necessarily independent only if their joint distribution is normal. However, I can't come up with a counterexample to disprove the claim I am asking about. Please provide either a counterexample or a proof.










Tags: correlation, independence, uniform

asked Mar 17 at 22:38 by Peiffap

          1 Answer

          Independent implies uncorrelated, but the implication doesn't go the other way.

          Uncorrelated implies independent only under certain conditions; e.g., for a bivariate normal distribution, uncorrelated does imply independent (as you said).

          It is easy to construct bivariate distributions with uniform margins where the variables are uncorrelated but not independent. Here are a few examples:

          1. Consider an additional random variable $B$, independent of $X$, which takes the values $\pm 1$ each with probability $\frac12$. Then let $Y = BX$.

          2. Take the bivariate distribution of two independent uniforms and slice it into 4 equal-size sections on each margin (yielding $4\times 4 = 16$ pieces, each of size $\frac12\times\frac12$). Now take all the probability from the 4 corner pieces and the 4 center pieces and put it evenly into the other 8 pieces.

          3. Let $Y = 2|X| - 1$.

          In each case, the variables are uncorrelated but not independent (e.g., if $X = 1$, what is $P(-0.1 < Y < 0.1)$?).



          [Plot of the bivariate distribution for each case]



          If you specify some particular family of bivariate distributions with uniform margins, it may be that within that family the only uncorrelated member is the independent one; in that case, being uncorrelated would imply independence.

          For example, if you restrict your attention to, say, the Gaussian copula, then I think the only uncorrelated one has independent margins; you can readily rescale that so that each margin is on $(-1,1)$.





          Some R code for sampling from and plotting these bivariates (not necessarily efficiently):



          n  <- 100000
          x  <- runif(n, -1, 1)                    # X ~ U(-1, 1)
          b  <- rbinom(n, 1, .5) * 2 - 1           # B = +/- 1, each with probability 1/2
          y1 <- b * x                              # example 1: Y = BX
          y2 <- ifelse(0.5 < abs(x) & abs(x) < 1,  # example 2: swap mass between centre and edges
                       runif(n, -.5, .5),
                       runif(n, 0.5, 1) * b)
          y3 <- 2 * abs(x) - 1                     # example 3: Y = 2|X| - 1

          par(mfrow = c(1, 3))
          plot(x, y1, pch = 16, cex = .3, col = rgb(.5, .5, .5, .5))
          plot(x, y2, pch = 16, cex = .5, col = rgb(.5, .5, .5, .5))
          abline(h = c(-1, -.5, 0, .5, 1), col = 4, lty = 3)
          abline(v = c(-1, -.5, 0, .5, 1), col = 4, lty = 3)
          plot(x, y3, pch = 16, cex = .3, col = rgb(.5, .5, .5, .5))


          (In this formulation, $(Y_2, Y_3)$ gives a fourth example.)
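
          As a quick numerical sanity check (a minimal sketch, assuming the simulation block above has just been run): each of y1, y2, y3 should be nearly uncorrelated with x while still being clearly dependent on it, and each margin should look uniform on $(-1,1)$.

          # correlations with X: all should be approximately 0
          round(c(cor(x, y1), cor(x, y2), cor(x, y3)), 3)

          # marginal uniformity, e.g. for y2: compare against the U(-1, 1) cdf
          ks.test(y2, "punif", min = -1, max = 1)

          # dependence is easy to see: |Y1| = |X| exactly, so this correlation is 1
          cor(abs(x), abs(y1))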



          [Incidentally, by transforming all of these to normality (i.e. transforming $X$ to $\Phi^{-1}(\frac12(X+1))$ and so forth), you get examples of uncorrelated normal random variables that are not independent. Naturally they aren't jointly normal.]
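
          In R that transform is just qnorm applied to the rescaled values (again a minimal sketch, assuming x and y1 from the simulation block are still in the workspace):

          zx <- qnorm((x + 1) / 2)         # maps U(-1, 1) to standard normal
          zy <- qnorm((y1 + 1) / 2)
          round(cor(zx, zy), 3)            # approximately 0: uncorrelated
          round(cor(abs(zx), abs(zy)), 3)  # exactly 1 here, since |zy| = |zx|, so not independent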






          share|cite|improve this answer











          $endgroup$













          • Thank you. I'm struggling to see why the examples you provided still guarantee that $Y$ is uniformly distributed on $[-1, 1]$, though. – Peiffap, Mar 17 at 23:34

          • Do the plots of the bivariate densities help? In each case the shaded parts are all of constant density. – Glen_b, Mar 17 at 23:49

          • They make it visually clearer, yes. Thank you, again. – Peiffap, Mar 17 at 23:51










