Uniform Probabilities on a Triangle

Hi. In this problem, we’re going
to get a bunch of practice working with multiple random
variables together. And so we’ll look at joint
PDFs, marginal PDFs, conditional PDFs, and also get
some practice calculating expectations as well. So the problem gives us a pair
of random variables– x and y. And we’re told that the joint
distribution is uniformly distributed on this triangle
here, with the vertices being (0, 0), (1, 0), and (0, 1). So it’s uniform in
this triangle. And the first part of the
problem is just to figure out what exactly the joint PDF of
the two random variables is. So in this case, it’s pretty
easy to calculate, because we have a uniform distribution. And remember, when you have a
uniform distribution, you can just imagine it being
a sort of plateau coming out of the board. And it’s flat. And so the height of the
plateau, in order to calculate it, you just need to figure
out what the area of this triangle is. So remember, when you had a single
random variable, a uniform
distribution had to integrate to 1. So you took the length, and
1 over the length was the correct scaling factor. Here, you take the area. And the height has to make it so
that the entire volume here integrates to 1. So the joint PDF is just
going to be 1 over whatever this area is. And the area is pretty
simple to calculate. It’s 1/2 base times height, and the base and height are both 1. So it’s 1/2. And so what we have is
that the area is 1/2. And so the joint PDF of x and
y is going to equal 2. But remember, you always have
to be careful when writing these things to remember
the ranges where these things are valid. So it’s only 2 within
this triangle. And outside of the
triangle, it’s 0. So what exactly does inside
the triangle mean? Well, we can write it
more mathematically. So this diagonal line, it’s
given by x plus y equals 1. So everything in the triangle
is really x plus y less than or equal to 1. It means everything under
this line. And so we need x plus y to be
less than or equal to 1 and also x to be non-negative and
y to be non-negative. So with these inequalities,
that captures everything within this triangle. And otherwise, the joint
PDF is going to be 0.
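
If you want to double-check that height of 2, here is a minimal sketch in Python using sympy. This check is an addition for verification, not part of the original solution:

```python
# Check that a constant height of 2 over the triangle integrates to 1.
from sympy import symbols, integrate

x, y = symbols('x y')
f_xy = 2  # candidate joint PDF on the region x >= 0, y >= 0, x + y <= 1

# Inner integral over x runs from 0 to 1 - y; outer integral over y from 0 to 1.
total = integrate(integrate(f_xy, (x, 0, 1 - y)), (y, 0, 1))
print(total)  # prints 1, so the density is properly normalized
```
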
The next part asks us to find, using this joint PDF, the marginal PDF of y. And remember, when you have
a joint PDF of two random variables, you essentially have
everything that you need, because from this joint PDF, you
can calculate marginals, and from the joint and the
marginals, you can calculate conditionals. The joint PDF captures
everything that there is to know about this pair of
random variables. Now, to calculate a marginal PDF
of y, remember a marginal really just means collapsing
the other random variable down. And so you can just imagine
taking this thing and collapsing it down
onto the y-axis. And mathematically, that is just
saying that we integrate out the other random variable. So the other random variable
in this case will be x. We take x and we get rid of
it by integrating out from negative infinity to infinity. Of course, this joint PDF
is 0 in a lot of places. And so a lot of these
will be 0. And only for a certain range
of x’s will this integral actually be non-zero. And so again, the other place
where we have to be careful is with these limits of
integration; we need to make sure that we have the
right limits. And so we know that the
joint PDF is 2. It’s nonzero only within
this triangle. And so it’s only 2 within
this triangle, which means what for x? Well, depending on what
x and y are, this will be either 2 or 0. So let’s just fix
some value of y. Pretend that we’ve picked some
value y, let’s say here. We want this value of y. Well, what are the values of x
such that the joint PDF for that value y is actually
nonzero, it’s actually 2? Well, it’s everything from
x equals 0 to whatever x value this is. But this x value, actually, if
you think about it, is just 1 minus y, because this line
is x plus y equals 1. So whatever y is, x is going
to be 1 minus that. And so the correct limits
would actually be from 0 to 1 minus y. And then the rest of that
is pretty simple. You integrate this. This is a pretty simple
integral. And you get that it’s actually
2 times (1 minus y). But of course, again, we need to
make sure that we have the right regions. So this is not always true
for y, of course. This is only true for
y between 0 and 1. And otherwise, it’s actually 0,
because when you take a y down here, well, there’s no
values of x that will give you a nonzero joint PDF. And if you take a value of y
higher than this, the same thing happens.
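
Here is the same marginal integral done symbolically, as a quick sketch. Again, this is an added check, not from the original video:

```python
# Integrate the joint PDF over x to recover the marginal PDF of y.
from sympy import symbols, integrate

x, y = symbols('x y')
marginal_y = integrate(2, (x, 0, 1 - y))  # joint PDF is 2 for 0 <= x <= 1 - y
print(marginal_y)  # prints 2 - 2*y, i.e. 2*(1 - y), valid for 0 <= y <= 1
```
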
So we can actually draw this out and see what it looks like. So let’s actually draw
a small picture here. Here’s y. Here’s the marginal PDF of y. And here’s 2. And it actually looks
like this. It’s a triangle, and it’s 0
outside this range. So does that make sense? Well, first of all, you see
that it does in fact integrate to 1,
which is good. And the other thing we notice
is that there is a higher density for smaller
values of y. So why is that? Why are smaller values
of y more likely than larger values of y? Well, because when you have
smaller values of y, you’re down here. And it’s more likely because
there are more values of x that go along with it that
make that value of y more likely to appear. Say you have a large
value of y. Then you’re up here
at the tip. Well, there aren’t very many
combinations of x and y that give you that large
a value of y. And so that large value of
y becomes less likely. Another way to think about it
is, when you collapse this down, there’s a lot more stuff
to collapse down at the base. There’s a lot of x’s
to collapse down. But up here, there’s only a
very little bit of x to collapse down. And the PDF of y becomes
more skewed towards smaller values of y.
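
To make that collapsing intuition concrete, here is a small Monte Carlo sketch. This is an added check; the rejection sampler is just one simple way to draw uniform points from the triangle:

```python
# Sample uniformly from the triangle by rejection and check that y values
# pile up near 0, matching the marginal PDF f_Y(y) = 2(1 - y) on [0, 1].
import random

def sample_triangle():
    # Rejection sampling: draw from the unit square until x + y <= 1.
    while True:
        x, y = random.random(), random.random()
        if x + y <= 1:
            return x, y

samples = [sample_triangle() for _ in range(100_000)]

# Compare empirical and exact probabilities for a few bands of y.
for lo, hi in [(0.0, 0.25), (0.25, 0.5), (0.5, 0.75), (0.75, 1.0)]:
    empirical = sum(lo <= y < hi for _, y in samples) / len(samples)
    exact = (2*hi - hi**2) - (2*lo - lo**2)  # integral of 2(1 - y) from lo to hi
    print(f"P({lo} <= Y < {hi}): empirical {empirical:.3f}, exact {exact:.3f}")
```
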
So now, the next thing that we want to do is calculate the conditional PDF of x, given y. Well, let’s just recall
what that means. This is what we’re looking for–
the conditional PDF of x, given y. And remember, this is calculated
by taking the joint and dividing by the
marginal of y. So we actually have the
top and the bottom. We have the joint PDF from part
A. And from part B, we calculated the marginal
PDF of y. So we have both pieces. So let’s actually
plug them in. Again, the thing that you have
to be careful here is about the ranges of x and y where
these things are valid, because this is only non-zero
when x and y fall within this triangle. And this is only non-zero when
y is between 0 and 1. So we need to be careful. So the top, when it’s
non-zero, it’s 2. And the bottom, when it’s
non-zero, it’s 2 times 1 minus y. So we can simplify that to
be 1 over 1 minus y. And when is this true? Well, it’s true when x and y are
in the triangle and y is between 0 and 1. So put another way, that means
that this is valid when y is between 0 and 1 and x is between
0 and 1 minus y, because whatever x has to be,
it has to be such that they actually still fall within
this triangle. And outside of this, it’s 0. So let’s see what this
actually looks like. So this is x, and this is the
conditional PDF of x, given y. Let’s say this is
1 right here. Then what it’s saying is, let’s
say we’re given that y is some little y. Let’s say it’s somewhere here. Then it’s saying that the
conditional PDF of x given y is this thing. But notice that this value,
1 over 1 minus y, does not depend on x. So in fact, it actually
is uniform. So it’s uniform between
0 and 1 minus y. And the height is something
like 1 over 1 minus y. And this is so that the scaling
makes it so that it actually is a valid PDF, because
the integral is 1. So why is this the case? Why is it that when you condition
on y being some value, you get that the PDF of x is
actually uniform? Well, when you look over here,
let’s again just pretend that you’re taking this value of y. Well, when you’re conditioning
on y being this value, you’re basically taking a slice of this
joint PDF at this point. But remember, the original
joint PDF was uniform. So when you take a slice of a
uniform distribution, joint uniform distribution,
you still get something that is uniform. Just imagine that you have
a cake that is flat. Now, you take a slice
at this level. Then whatever slice you have
is also going to be flat: imagine a flat rectangle. So it’s still going
to be uniform. And that’s why the conditional
PDF of x given y is also uniform.
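
We can also see this slicing numerically. The sketch below, another addition for checking, conditions on y falling in a thin band and verifies that the surviving x values spread out evenly over [0, 1 minus y]:

```python
# Condition on Y falling in a thin band around y = 0.5 and check that the
# conditional distribution of X looks uniform on [0, 1 - y], i.e. [0, 0.5].
import random

y0, eps = 0.5, 0.01
conditioned_x = []
while len(conditioned_x) < 20_000:
    x, y = random.random(), random.random()
    if x + y <= 1 and abs(y - y0) < eps:  # inside triangle, near y0
        conditioned_x.append(x)

# If X | Y = 0.5 is uniform on [0, 0.5], each quarter of that interval
# should hold about 25% of the samples.
width = 1 - y0
for k in range(4):
    lo, hi = k * width / 4, (k + 1) * width / 4
    frac = sum(lo <= x < hi for x in conditioned_x) / len(conditioned_x)
    print(f"fraction of X in [{lo:.3f}, {hi:.3f}): {frac:.3f}")  # ~0.25 each
```
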
Part D now asks us to find the conditional expectation of x. So we want to find the
expectation of x, given that y is some little y. And for this, we can
use the definition. Remember, expectations are
really just weighted sums. Or in the continuous
case, it’s an integral. So you take the value. And then you weight
it by the density. And in this case, because we’re
taking a conditional expectation, what we weight it
by is the conditional density. So it’s the conditional
density of x given that y is little y. We integrate with
respect to x. And fortunately, we know what
this conditional PDF is, because we calculated it earlier
in part C. And we know that it’s this– 1 over 1 minus y. But again, we have to be
careful, because this formula, 1 over 1 minus y, is only
valid in certain cases. So let’s think about
this first. Let’s think about some
extreme cases. What if y, little
y, is negative? If little y is negative,
we’re conditioning on something over here. And so there is no density for
y being negative or for y, say, in other cases when
y is greater than 1. And so in those cases, this
expectation is just undefined, because conditioning on that
doesn’t really make sense, because there’s no density
for those values of y. Now, let’s consider the case
that actually makes sense, where y is between 0 and 1. Now, we’re in business, because
that is the range where this formula is valid. So this formula is valid,
and we can plug it in. So it’s x times 1 over 1 minus y, dx. And then the final thing that we
again need to check is what the limits of this
integration is. So we’re integrating
with respect to x. So we need to write down for what
values of x this conditional PDF is valid. Well, luckily, we specified
that here. x has to be between
0 and 1 minus y. So let’s actually calculate
this integral. This 1 over 1 minus y is a
constant with respect to x. You can just pull that out. And then now, you’re really
just integrating x from 0 to 1 minus y. So the integral of x is
1/2 x squared. So you get a 1/2 x squared, and
you integrate that from 0 to 1 minus y. And so when you plug in
the limits, you’ll get 1/2 times (1 minus y) squared. The squared term will cancel out the
1 over 1 minus y. And what you’re left with is
just 1 minus y over 2. And again, we have to specify
that this is only true for y between 0 and 1. Now, we can again actually
verify that this makes sense. What we’re really looking for is
the conditional expectation of x given some value of y. And we already said that
conditioned on y being some value, x is uniformly
distributed between 0 and 1 minus y. And so remember for our uniform
distribution, the expectation is simple. It’s just the midpoint. So the midpoint of 0
and 1 minus y is exactly (1 minus y)/2. So that’s a nice way of
verifying that this answer is actually correct.
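
Here is that conditional expectation integral done symbolically as well, one more added sanity check:

```python
# E[X | Y = y] as the integral of x against the conditional density 1/(1 - y).
from sympy import symbols, integrate, simplify

x, y = symbols('x y')
cond_pdf = 1 / (1 - y)  # conditional PDF of x given y, for 0 <= x <= 1 - y
e_x_given_y = integrate(x * cond_pdf, (x, 0, 1 - y))
print(simplify(e_x_given_y))  # prints 1/2 - y/2, i.e. (1 - y)/2
```
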
Now, the second part of part D asks us to do a little bit more. We have to use the total
expectation theorem in order to somehow write the expectation
of x in terms of the expectation of y. So the first thing we’ll
do is use the total expectation theorem. So the total expectation theorem
is just saying, well, we can take these conditional
expectations. And now, we can integrate this
against the marginal density of y, then we’ll get the actual
expectation of x. You can think of it as just kind
of applying the law of iterated expectations as well. So this integral is going
to look like this. You take the conditional
expectation. So this is the expectation of x
if y were equal to little y. And now, how do we weight it?
Well, we just multiply that
by the density of y at that actual value of little y. And we integrate with
respect to y. Now, we’ve already calculated
what this conditional expectation is. It’s (1 minus y)/2. So let’s plug that in. (1 minus y)/2 times the
marginal of y. There are a couple of ways of
attacking this problem now. One way is, we can actually
just plug in that marginal of y. We’ve already calculated that
out in part B. And then we can do this integral and calculate
out the expectation. But maybe we don’t really want
to do so much calculus. So let’s do what the
problem says and try a different approach. So what the problem suggests is
to write this in terms of the expectation of y. And what is the expectation
of y? Well, the expectation of y is
going to look something like the integral of y times
the marginal of y. So let’s see if we can identify
something like that and pull it out. Well, yeah, we actually
do have that. We have y times the marginal
of y, integrated. So let’s isolate that. So besides that, we
also have this. The
first term is the integral of 1/2 times the marginal of y. And then the second term is
minus 1/2 times the integral of y times the marginal of y, dy. This is just me splitting
this integral up into two separate integrals. Now, we know what this is. The 1/2 we can pull out. And then the rest of it is
just the integral of a marginal density from minus
infinity to infinity. And by definition, that
has to be equal to 1. So this just gives us a 1/2. And now, what is this? We get a minus 1/2. And now this, we already said
that is the expectation of y. So what we have is the
expectation of y. So in the second part of this
part D, we’ve expressed the expectation of x in terms
of the expectation of y.
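
If you would rather let the computer do the calculus route we skipped, this added sketch confirms that the direct integral agrees with the identity we just derived:

```python
# Check that integrating E[X | Y = y] against the marginal of y matches
# the identity E[X] = 1/2 - (1/2) E[Y].
from sympy import symbols, integrate, Rational

y = symbols('y')
marginal_y = 2 * (1 - y)  # marginal PDF of y on [0, 1], from part B
e_x = integrate((1 - y) / 2 * marginal_y, (y, 0, 1))  # total expectation theorem
e_y = integrate(y * marginal_y, (y, 0, 1))            # E[Y]
print(e_x == Rational(1, 2) - e_y / 2)  # prints True
```
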
Now, maybe that seems like it’s not too helpful, because we don’t know what
either of those two are. But if we think about this
problem, and as part E suggests, we can see that
there’s symmetry in this problem, because x and y are
essentially symmetric. So imagine drawing the line x equals y. There’s symmetry in this
problem, because if you were to swap the roles of x and y,
you would have exactly the same joint PDF. So what that suggests is that
by symmetry then, it must be that the expectation of x and
the expectation of y are exactly the same. And that is using the
symmetry argument. And that helps us now, because
we can plug that in and solve for expectation of x. So expectation of x is 1/2 minus
1/2 expectation of x. So we have 3/2 expectation
of x equals 1/2. So expectation of
x equals 1/3. And of course, expectation
of y is also 1/3. And so it turns out that the
expectation is right around there, at the point (1/3, 1/3).
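
And as one final added check, a quick simulation estimates both expectations directly:

```python
# Estimate E[X] and E[Y] by sampling uniform points from the triangle.
import random

xs, ys = [], []
while len(xs) < 200_000:
    x, y = random.random(), random.random()
    if x + y <= 1:  # rejection step: keep only points inside the triangle
        xs.append(x)
        ys.append(y)

print(sum(xs) / len(xs))  # approximately 1/3
print(sum(ys) / len(ys))  # approximately 1/3
```
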
So this problem had several parts. And it allowed us to start
out from just a raw joint distribution, calculate
marginals, calculate conditionals, and then from
there, calculate all kinds of conditional expectations
and expectations. And a couple of important points
to remember are, when you work with these joint
distributions, it’s very important to consider where
values are valid. So you have to keep in mind
when you write out these conditional PDFs and joint PDFs
and marginal PDFs, what ranges the formulas you
calculated are valid for. And that also translates to
when you’re calculating expectations and such. When you have integrals, you
need to be very careful about the limits of your integration,
to make sure that they line up with the range
where the values are actually valid. And the last thing, which is
kind of unrelated, but it is actually a common tool that’s
used in a lot of problems is, when you see symmetry in these
problems, that can help a lot, because it will simplify things
and allow you to use facts like these to help
you calculate what the final answer is. Of course, this also comes
along with practice. You may not immediately see that
there could be a symmetry argument that will help
with this problem. But with practice, when you do
more of these problems, you’ll eventually build up
that kind of–
