Random walks on high-dimensional spaces



























I've read in a paper that, in the two-dimensional case, if you start from the origin and take steps of length one in arbitrary directions (chosen uniformly on the unit circle $S^1$, not just left-right-up-down), the expected distance from the starting point after $n$ steps is approximately $\sqrt{n\pi}/2$



(source: https://pdfs.semanticscholar.org/6685/1166d588821456477f2007a37bf0428a2cf2.pdf).



I was wondering whether there is a similar formula for higher-dimensional random walks, that is:



Starting from the origin in $\mathbb{R}^d$, if I take $n$ unit steps in random directions (not necessarily axis-aligned: each direction is chosen uniformly from the sphere $S^{d-1}$), what is the expected distance of the endpoint from the origin?
In other words, how far from the origin is the point after $n$ steps in random directions?



I don't need a precise formula; anything that gives an idea of how large the expected distance is works just fine. It could also be a loose upper bound.
(If you could add a reference to the answer as well, that would be great! :D )
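For what it's worth, the expected distance is easy to estimate by Monte Carlo; here is a minimal sketch (Python with numpy assumed; uniform directions on the sphere are obtained by normalizing iid Gaussian vectors, a standard trick):

```python
import numpy as np

def mean_distance(d, n, trials=2000, seed=0):
    """Monte Carlo estimate of the expected distance from the origin
    after n unit steps in R^d, with directions uniform on S^(d-1)."""
    rng = np.random.default_rng(seed)
    steps = rng.normal(size=(trials, n, d))
    steps /= np.linalg.norm(steps, axis=2, keepdims=True)  # project onto the unit sphere
    endpoints = steps.sum(axis=1)                          # one endpoint per trial
    return np.linalg.norm(endpoints, axis=1).mean()

# d = 2 reproduces the sqrt(n*pi)/2 approximation: for n = 400 that is about 17.7.
print(mean_distance(2, 400))
```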



EDIT: A friend suggested that the answer should come from the heat equation, which would mean I only need to integrate the $d$-dimensional Gaussian. Is that right?



Thank you very much!










      pr.probability st.statistics random-walks






asked 2 days ago by Alfred (edited 2 days ago)






















2 Answers
Let $X_1,X_2,\dots$ be iid random vectors each uniformly distributed on $S^{d-1}$. Let $S_n:=\sum_1^n X_i$. By symmetry, $EX_1=0$. Also, $1=|X_1|^2=\sum_{j=1}^d X_{1j}^2$, where $X_1=(X_{11},\dots,X_{1d})$. Since the $X_{1j}$'s are exchangeable and $1=E|X_1|^2=\sum_{j=1}^d EX_{1j}^2$, we conclude that $EX_{1j}^2=\frac1d$ for all $j$. Also, for any distinct $j,k=1,\dots,d$, the pair $(X_{1j},X_{1k})$ equals $(-X_{1j},X_{1k})$ in distribution, whence $EX_{1j}X_{1k}=0$. So, $EX_1=0$ and the covariance matrix of $X_1$ is $\frac1d\,I_d$, where $I_d$ is the $d\times d$ identity matrix.

So, by the multivariate central limit theorem, $\frac1{\sqrt n}\,S_n$ converges in distribution (as $n\to\infty$) to the random vector $\frac1{\sqrt d}\,Z$, where $Z=(Z_1,\dots,Z_d)$ and $Z_1,\dots,Z_d$ are iid $N(0,1)$. Therefore, and because of the uniform integrability of $\frac1{\sqrt n}\,|S_n|$ (provided by the observation that $E|S_n|^2=n$, made in the answer by Pierre PC), we have
\begin{equation}
E\frac1{\sqrt n}\,|S_n|\to\ell_d:=\frac1{\sqrt d}\,E\sqrt{\sum_1^d Z_j^2},
\end{equation}
and hence
\begin{equation}
E|S_n|\sim \ell_d\sqrt n. \tag{*}
\end{equation}
Since the distribution of $\sum_1^d Z_j^2$ is $\chi^2_d=\text{Gamma}(\frac d2,2)$, we see that
\begin{equation}
\ell_d=\sqrt{\frac2d}\,\frac{\Gamma((d+1)/2)}{\Gamma(d/2)}.
\end{equation}
In particular, for $d=2$, (*) becomes
\begin{equation}
E|S_n|\sim \tfrac12\,\sqrt{\pi n},
\end{equation}
which is what you read in that paper.
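The constant $\ell_d$ is easy to evaluate numerically; a quick check of the formula above (Python, using `math.gamma`; the helper name `ell` is ours):

```python
from math import gamma, sqrt, pi

def ell(d):
    """l_d = sqrt(2/d) * Gamma((d+1)/2) / Gamma(d/2)."""
    return sqrt(2 / d) * gamma((d + 1) / 2) / gamma(d / 2)

print(ell(2), sqrt(pi) / 2)   # both are about 0.8862
print([round(ell(d), 4) for d in (1, 2, 3, 10, 100)])
```

Note that $\ell_d\to1$ as $d\to\infty$, so in high dimension the walk typically ends up at distance close to $\sqrt n$.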






answered 2 days ago by Iosif Pinelis

• Thank you very much! – Alfred, 2 days ago
Let $(X_k)_{k\geq1}$ be a sequence of points chosen independently and uniformly on the $(d-1)$-sphere. Set $S_n=X_1+\cdots+X_n$ and $\rho_n=|S_n|$.

It is clear that
$$ \rho_{n+1}^2 = \rho_n^2 + 2\langle S_n,X_{n+1}\rangle + 1. $$
Now because $\mathbb E[\langle S_n,X_{n+1}\rangle\mid X_1,\dots,X_n] = \langle S_n,\mathbb E[X_{n+1}]\rangle = 0$, we get
$$ \mathbb E[\rho_{n+1}^2] = \mathbb E[\rho_n^2] + 2\,\mathbb E[\langle S_n,X_{n+1}\rangle] + 1 = \mathbb E[\rho_n^2] + 1 = \cdots = n+1. $$
Hence, for instance, $\mathbb E\rho_n\leq\sqrt{\mathbb E\rho_n^2}\leq\sqrt n$.
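The exact identity $\mathbb E[\rho_n^2]=n$ is easy to confirm by simulation; a small sketch (Python with numpy assumed, sampling directions as normalized Gaussian vectors):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, trials = 5, 50, 20000
steps = rng.normal(size=(trials, n, d))
steps /= np.linalg.norm(steps, axis=2, keepdims=True)  # uniform on S^(d-1)
rho_sq = (steps.sum(axis=1) ** 2).sum(axis=1)          # |S_n|^2, one value per trial
print(rho_sq.mean())  # close to n = 50, whatever d is
```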






answered 2 days ago by Pierre PC

• Thank you very much! – Alfred, 2 days ago