On an expected value inequality









Given a random variable $X$ that takes values on all of $\mathbb{R}$, with associated probability density function $f$, is it true that for all $r > 0$



$$E\left[\int_{X-r}^{X+r} f(x)\,dx\right] \ge E\left[\int_{X-r}^{X+r} g(x)\,dx\right]$$



for any other probability density function $g$?



This seems intuitively true to me, and I imagine that if it were true it would already have been proven, but I can't find a similar result in the standard textbooks; even a reference would be welcome.
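One quick way to probe the conjecture numerically (a sketch of my own, not part of the original question; it assumes NumPy and SciPy are available, and the choice of $f$ and $g$ is arbitrary) is to note that $\int_{X-r}^{X+r} g(x)\,dx = G(X+r) - G(X-r)$, where $G$ is the CDF of $g$, and estimate both expectations by Monte Carlo:

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo probe: E[ G(X+r) - G(X-r) ] for X ~ f, where G is the CDF of a
# candidate density. Here f = N(0, 1) and g = N(0, 0.5) are arbitrary choices.
rng = np.random.default_rng(0)
r = 0.5
X = rng.standard_normal(1_000_000)          # samples of X ~ f

F = norm(loc=0.0, scale=1.0)                # distribution with density f
G = norm(loc=0.0, scale=0.5)                # a narrower competing density g

lhs = np.mean(F.cdf(X + r) - F.cdf(X - r))  # E[ integral of f over (X-r, X+r) ]
rhs = np.mean(G.cdf(X + r) - G.cdf(X - r))  # E[ integral of g over (X-r, X+r) ]
print(lhs, rhs, lhs >= rhs)                 # ~0.276, ~0.345, False
```

Already with these two normals the narrower $g$ wins, which hints at the answers below.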










probability · integration · probability-theory · inequality






asked Nov 10 at 17:17 by Monolite, edited Nov 10 at 20:24
2 Answers






Accepted answer (6 votes)
Taking the particular case of small $r$ ($r \to 0$) and continuous $f$, so that $\int_{X-r}^{X+r} h(x)\,dx \approx 2r\,h(X)$ for any continuous density $h$, your inequality becomes equivalent to



$$\int f^2 \ge \int f g$$



with the restrictions $\int f = \int g = 1$ and $f \ge 0$, $g \ge 0$. This is clearly false: for a fixed $f$ we maximize $\int f g$ not by choosing $g = f$, but by choosing $g$ concentrated around the mode (maximum) of $f$.



Incidentally, your assertion has a simple interpretation: suppose I have to guess the value of a random variable $X$ with pdf $f$, and I win if the absolute error $e = |X - \hat{x}|$ is less than $r$. If the inequality were true, the conclusion would be that my best strategy (in terms of expected win rate) is to make a random guess, drawing $\hat{x}$ as an independent random variable with the same density as $X$. But this is not true: the optimal guess is a deterministic value, the one that maximizes the corresponding integral; for small $r$ this is the mode of $f$ (the maximum a posteriori estimate).
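A small numerical illustration of the $r \to 0$ claim (my own sketch, not from the answer; the bimodal $f$ and the spike width are arbitrary choices): concentrate $g$ near the mode of $f$ and compare $\int f^2$ with $\int f g$.

```python
import numpy as np

# Compare  integral of f^2  with  integral of f*g  when g is a narrow spike
# at the mode of f. f: equal mixture of N(-2, 1) and N(2, 1); g: N(mode, 0.1).
# Plain Riemann sums on a fine grid.
x = np.linspace(-12.0, 12.0, 200_001)
dx = x[1] - x[0]

def normal_pdf(t, mu, sigma):
    return np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

f = 0.5 * normal_pdf(x, -2.0, 1.0) + 0.5 * normal_pdf(x, 2.0, 1.0)  # bimodal density
g = normal_pdf(x, x[np.argmax(f)], 0.1)                             # spike at f's mode

print(np.sum(f * f) * dx)  # integral of f^2 ~ 0.144
print(np.sum(f * g) * dx)  # integral of f*g ~ 0.198 (about f at its mode): larger
```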






answered Nov 10 at 18:46 by leonbloy, edited Nov 10 at 19:23
Answer (3 votes)
Unfortunately, your intuitive conjecture is incorrect.



Let $f(x)$ be the PDF of the random variable $X$ and $F(x)$ its cumulative distribution function (CDF), so that $F'(x) = f(x)$, or
$$F(x) = \int_{-\infty}^{x} f(t)\,dt$$
Similarly, let $g(x)$ be another PDF with CDF $G(x)$. Then the expected value of the integral
$$\int_{X-r}^{X+r} g(x)\,dx$$
is equal to
$$\int_{-\infty}^{\infty} \int_{x-r}^{x+r} f(x)\,g(t)\,dt\,dx = \int_{-\infty}^{\infty} \bigl(G(x+r) - G(x-r)\bigr) f(x)\,dx$$
By integration by parts (or directly by Fubini's theorem, since the condition $|t - x| < r$ is symmetric in $t$ and $x$), we have
$$\int_{-\infty}^{\infty} \bigl(G(x+r) - G(x-r)\bigr) f(x)\,dx = \int_{-\infty}^{\infty} \bigl(F(x+r) - F(x-r)\bigr) g(x)\,dx$$
Now consider this simple counterexample. Let $r = 1$, and take the standard Cauchy density
$$f(x) = \frac{1}{\pi}\,\frac{1}{1+x^2}$$
If your conjecture were true, then no density $g$ could make the integral
$$\frac{1}{\pi}\int_{-\infty}^{\infty} \bigl(\arctan(x+1) - \arctan(x-1)\bigr)\, g(x)\,dx$$
exceed the value
$$\frac{1}{\pi^2}\int_{-\infty}^{\infty} \frac{\arctan(x+1) - \arctan(x-1)}{1+x^2}\,dx \approx 0.2952$$
However, take the Cauchy density with scale $1/2$,
$$g(x) = \frac{2}{\pi}\,\frac{1}{1+4x^2}$$
Then the value of our integral is
$$\frac{2}{\pi^2}\int_{-\infty}^{\infty} \frac{\arctan(x+1) - \arctan(x-1)}{1+4x^2}\,dx \approx 0.3743$$
which disproves your conjecture.
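These two numbers can be checked numerically (a verification sketch of my own, assuming SciPy is available):

```python
import numpy as np
from scipy.integrate import quad

# Check the Cauchy counterexample above (r = 1). Each side equals
# E[ F(X+1) - F(X-1) ] with F the standard Cauchy CDF and X drawn from f or g.
w = lambda x: np.arctan(x + 1.0) - np.arctan(x - 1.0)  # pi * (F(x+1) - F(x-1))

lhs, _ = quad(lambda x: w(x) / (np.pi**2 * (1.0 + x**2)), -np.inf, np.inf)
rhs, _ = quad(lambda x: 2.0 * w(x) / (np.pi**2 * (1.0 + 4.0 * x**2)), -np.inf, np.inf)
print(lhs)  # ~ 0.2952
print(rhs)  # ~ 0.3743, so the competing density g beats f
```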






answered Nov 10 at 18:03 by Frpzzd