Lesson 44 – Keep waiting: The memoryless property of exponential distribution

Bob Waits for the Bus

As the building entrance door closes behind him, Bob glances at his post-it note with the directions and address of the car dealer. Bob is finally ready to buy his first (used) car. He walks to the nearby bus stop, jubilant at the thought that he will seldom use the bus again. Bob is tired of waiting. The one thing he has established over the years is that the average wait time for his inbound 105 at Cross St @ Main St is 15 minutes.

Bob may not care, but we know that his wait time follows an exponential distribution that has a probability density function  f(t) = \lambda e^{-\lambda t} .

The random variable T, the wait time between buses, follows an exponential distribution with parameter  \lambda . He waits 15 minutes on average. Some days he boards the bus in less than 15 minutes, and some days he waits much longer.

Looking at the function  f(t) = \lambda e^{-\lambda t} , and the information we typically have for an exponential distribution, i.e., the average wait time, it will be useful to relate the parameter  \lambda  to the average wait time.

The average wait time is the average of the distribution — the expected value E[.].

E[X] for a continuous distribution, as you know from Lesson 24, is  E[X] = \int x f(x) dx .

Applying this using the limits of the exponential distribution, we can derive the following.

 E[T] = \int_{0}^{\infty} t f(t) dt

 E[T] = \int_{0}^{\infty} t \lambda e^{-\lambda t} dt

 E[T] = \lambda \int_{0}^{\infty} t e^{-\lambda t} dt

The definite integral  \int_{0}^{\infty} t e^{-\lambda t} dt  evaluates to  \frac{1}{\lambda^{2}}  (using integration by parts).

So we have

E[T] = \lambda \cdot \frac{1}{\lambda^{2}} = \frac{1}{\lambda}

The parameter  \lambda  is a positive real number ( \lambda > 0) and represents the reciprocal of the expected value of T.

In Bob’s case, since the average wait time (E[T]) is 15 minutes, the parameter \lambda is 1/15 \approx 0.067 per minute.
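
As a quick numerical check of the E[T] = 1/\lambda relationship, here is a minimal sketch in Python (the use of NumPy and the random seed are assumptions for illustration): it draws many wait times with \lambda = 1/15 per minute and confirms that the sample mean is close to 15 minutes.

```python
import numpy as np

# Bob's bus: average wait of 15 minutes, so lambda = 1/15 per minute
lam = 1 / 15

rng = np.random.default_rng(42)
# NumPy's exponential generator is parameterized by the mean (scale = 1/lambda)
waits = rng.exponential(scale=1 / lam, size=100_000)

print(f"sample mean wait: {waits.mean():.2f} minutes")  # close to 15
```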

Bob gets to the bus shelter, greets the person next to him and thinks to himself “Hope the wait will not exceed 10 minutes today.”

Please tell him the probability he waits more than 10 minutes is 0.5134.

 P(T > 10) = e^{-10\lambda} = e^{-10/15} = 0.5134
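
For anyone who wants to reproduce that number, here is a minimal sketch in Python (using only the standard library and the 15-minute average from the story):

```python
import math

lam = 1 / 15                        # rate per minute (average wait = 15 minutes)
p_more_than_10 = math.exp(-lam * 10)
print(round(p_more_than_10, 4))     # 0.5134
```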

Bob is visibly anxious. He turns his wrist and looks at his watch. “Ten minutes. The wait won’t be much longer.”

Please tell him about the memoryless property of the exponential distribution. The probability that he waits more than another ten minutes, given that he has already waited 10 minutes, is also 0.5134.

Let’s see how. Let t represent the first ten minutes and s the additional ten minutes.

 P(T > t + s \mid T > t) = \frac{P(T > t \cap T > t+s)}{P(T > t)}

Since the event T > t+s is contained in the event T > t, their intersection is simply T > t+s.

 P(T > t + s \mid T > t) = \frac{P(T > t+s)}{P(T > t)} = \frac{e^{-\lambda (t+s)}}{e^{-\lambda t}} = \frac{e^{-\lambda t} e^{-\lambda s}}{e^{-\lambda t}} = e^{-\lambda s}

 P(T > 10 + 10 \mid T > 10) = e^{-10\lambda} = 0.5134
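
A simulation makes the memoryless property tangible. This is a minimal sketch, assuming NumPy; it estimates P(T > 20 | T > 10) from simulated wait times and compares it with the unconditional P(T > 10).

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 1 / 15                                    # buses: average wait of 15 minutes
t = rng.exponential(scale=1 / lam, size=1_000_000)

p_uncond = (t > 10).mean()                      # P(T > 10)
p_cond = (t > 20).sum() / (t > 10).sum()        # P(T > 20 | T > 10)

print(f"P(T > 10)          = {p_uncond:.4f}")   # about 0.5134
print(f"P(T > 20 | T > 10) = {p_cond:.4f}")     # also about 0.5134
```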

The probability distribution of the remaining time until the event occurs is always the same regardless of the time that passed.

There is no memory in the process. The history is not relevant. The time to next arrival is not influenced by when the last event or arrival occurred.

This memoryless property is unique to two distributions: the exponential (in continuous time) and the geometric (in discrete trials).

The probability that Bob has to wait another s minutes, given that he has already waited t minutes, is the same as the probability of waiting more than s minutes from the start. It does not depend on how long he has already waited.

Bob Gets His First Chevy

Bob arrives at the dealer’s. He loves the look of the red 1997 Chevy. He looks through the windowpane; “Ah, manual shift!” That made up his mind. He knows what he is getting. The price was reasonable; a good running engine was all he needed to drive it away.

The manager was young, a Harvard alum, as Bob identified from things in the room. “There is no guarantee these days with academic inflation, … the young lad is running a family business, … or his passion is to sell cars,” he thought to himself.

The manager tells him that the engine is in perfect running condition and the average breakdown time is four years. Bob does some estimates ($$$$) in his mind while checking out the car. He is happy with what he is getting and closes the deal.

Please tell Bob that there is a 22% likelihood that his Chevy manual shift will break down in the first year.

The number of years this car will run ~ exponential distribution with a rate (\lambda) of 1/4.

Since the average breakdown time (expected value E[T]) is four years, the parameter \lambda = 1/4.

 P(T \le 1) = 1 - e^{-\lambda t} = 1 - e^{-(1/4)} = 0.22

Bob should also know that there is a 37% chance that his car will still be running fine after four years.

P(T > 4) = e^{-4\lambda} = e^{-4/4} = 0.37
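
Both numbers come straight from the exponential CDF and its complement (the survival function). Here is a minimal sketch in Python, assuming scipy.stats is available:

```python
from scipy.stats import expon

lam = 1 / 4                  # rate per year; average breakdown time is 4 years
car = expon(scale=1 / lam)   # scipy parameterizes the exponential by its mean

print(round(car.cdf(1), 2))  # P(T <= 1): breakdown in the first year, about 0.22
print(round(car.sf(4), 2))   # P(T > 4): still running after four years, about 0.37
```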

Bob in Four Years

Bob has now used the car for four years, with regular servicing, standard oil changes, and tire rotations. The engine is great.

Since the average lifetime has passed, should he think about a new car? How long should we expect his car to continue without a breakdown? Another four years?

Since he has used it for four years, what is the probability that there will be no breakdown for another four years?

You guessed it, 37%.

 P(T > 8 \mid T > 4) = \frac{P(T > 8)}{P(T > 4)} = e^{-4\lambda} = 0.37

Now let’s have a visual interpretation of this memoryless property.

The probability distribution of the wait time (engine breakdown) for  \lambda = 1/4 looks like this.

Let us define another random variable  T_{2} = T - 4  as the breakdown time beyond four years of usage. The lower bound for  T_{2}  is 0 (since we measure from the four-year mark), and the upper bound is \infty.

For any values  T > 4 , the distribution is another exponential function — it is shifted by four years.

 f(t_{2}) = \lambda e^{-\lambda t_{2}} = \lambda e^{-\lambda (t-4)}

Watch this animation; you will understand it better.

The original distribution is represented using the black line. The conditional distribution  P(T > 4+s \mid T > 4) is shown as a red line using links.

The same red line with links (truncated at 4) is shown as the shifted exponential distribution (f(t_{2})=\lambda e^{-\lambda (t-4)}). So, the red line with links from t = 4 is the same as the original function from t = 0. It is just shifted.

The average value of T is four years. The average value of T_{2} = T - 4 (given that the car survived the first four years) is also four years. They have the same distribution.
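
To see this shift numerically, here is a minimal sketch (assuming NumPy) that simulates breakdown times, keeps only the cars still running after four years, and compares the remaining life T - 4 with the original T:

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 1 / 4
t = rng.exponential(scale=1 / lam, size=1_000_000)  # breakdown times in years

survivors = t[t > 4]          # cars that are still running after four years
remaining = survivors - 4     # T2 = T - 4 for those cars

print(f"mean of T          : {t.mean():.2f} years")         # about 4
print(f"mean of T2 = T - 4 : {remaining.mean():.2f} years")  # also about 4
```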

If Bob reads our lessons, he’d understand that his Chevy will serve him, on average, another four years.

Just like the car dealer’s four-year liberal arts degree from Harvard is forgotten, Bob’s four-year car usage history is forgotten — Memoryless.

As the saying goes, some memories are best forgotten, but the lessons from our classroom are never forgotten.

If you find this useful, please like, share and subscribe.
You can also follow me on Twitter @realDevineni for updates on new lessons.

Lesson 43 – Wait time: The language of exponential distribution

Wednesday, no, the Waiting Day

November 29, 2017

6:00 AM

As the cool river breeze kisses my face, I hear the pleasant sound of the waves. “How delightful,” I think, as I drop into the abyss of eternal happiness. The sound of the waves continues to haunt me. I run away from the river; the waves run with me. I close my ears; the waves are still here.

It’s the time when your dream dims into reality. Ah, it’s the sound of the “waves” on my iPhone. Deeply disappointed, I hit the snooze and wait for my dream to come back.

6:54 AM

“Not again,” I screamed. I have 30 minutes to get ready and get going. I-95 is already bustling. I can’t afford to wait long in the toll lane. Doctor’s check-in is at 8 AM.

7:55 AM

“Come on, let’s go.” For the 48th time waiting in the toll lane, I curse myself for not having gotten the EZ pass that week. “Let’s go, let’s go.” I maneuver my way while being rude to the nasty guy who tried to sneak in front of my car. Finally, I pay cash at the toll and drive off swiftly to my doctor’s.

8:15 AM

“Hi, I have an appointment this morning. Hope I am not late.” The pretty lady at the desk stared at me, gave me a folder, and asked me to wait. “Dr. D will be with you shortly.” As I was waiting for my turn, I realized that the lady’s stare was for my stupid question. My appointment was at 8 AM after all.

8:50 AM

The doctor steps in; “Please come in,” he said. A visibly displeased me walked in instantly, shaking my head all the way at the delay. My boss will be waiting for me at the office. We are launching a new product today.

9:15 AM

“You are in perfect health. The HDLs and LDLs are normal. Continue the healthy eating and exercise practices you have. See you next time, but don’t wait too long for the next visit.”

9:25 AM

My wait continues, this time for the train. “The next downtown 1-train will arrive in 10 minutes,” said the man (pre-recorded).

10:00 AM

My boss expressed his displeasure at my delay in his usual sarcastic ways. “But, I told you I was going to be late today,” I said to myself. We get busy with work and the product launch.

1:00 PM

I am waiting in the teller line at the local bank. Essential bank formalities and some checks to deposit. There were already ten people before me; there is only one teller, and for some reason, she is taking her own sweet time to serve each customer.

The only other living being in the bank (bank employees, of course) is the manager; she is busy helping a person with his mortgage. “Poor guy seems to be buying a house at the peak,” I think as I start counting the time it is taking to serve each customer.

1:35 PM

“One extra-hot cappuccino,” said Joe at Starbucks in his usual stern voice. The wait for my coffee was not as annoying. There’s something about coffee and me. I can wait forever! Or maybe it is Starbucks; I can’t say.

7:00 PM

After a long tiring, waiting day, I am still waiting for my train.

“I waited 22 minutes. The train surely has to come in the next minute,” I said to myself.

The clock ticks, my energy drops, and still no train.

.

.

.

“The next uptown 1-train is now arriving. Please stand away from the platform edge.”

I step in and grab the one remaining seat. “Finally; no more waiting for the day,” I said to myself.

The wheels rattle, the brains muffle, and the eyes scuttle. Same beautiful abyss of happy, restful state from the morning.

8:00 PM

As I park my car and check my mail at the door, I realize that my day was filled with wait times. I say to myself, “Aren’t these the examples of exponential distribution that the data analysis guy from college used to talk about? I finally understand it. You live and learn.”

9:00 PM

I start logging my Wednesday, no, the waiting day.

“Let me derive the necessary functions for the exponential distribution before I go to bed,” I said to myself.

The time between arrivals at service facilities, the time to failure of systems, flood occurrences, etc., can be modeled using the exponential distribution.

Since I want to measure the time between events, I should think of the time T as a continuous random variable taking values  t_{1}, t_{2}, t_{3} , etc., like this.

That means this distribution is defined only for non-negative values,  T \ge 0 .

We can have a small wait time or a long wait time. It varies, and we estimate the probability that T is less than a particular time, greater than a particular time, or between two times.

The probability distribution of these wait times is called the exponential distribution.

As I look carefully at the figure of events and wait times, I can sense that there is a relation between the Poisson distribution and the exponential distribution.

The Poisson distribution represents the number of events in an interval of time, and the exponential distribution represents the time between these events.

If N is the number of events during an interval of length t (a span of time), with an average rate of occurrence \lambda, then

P(N = k) = \frac{e^{-\lambda t}(\lambda t)^{k}}{k!}

If T is measured as the time to next occurrence or arrival, then it should follow an exponential distribution.

The time to arrival exceeds some value t only if N = 0 within t, i.e., if there are no events in the interval [0, t].

 P(T > t) = P(N = 0) = \frac{e^{-\lambda t}(\lambda t)^{0}}{0!} = e^{-\lambda t}

If  P(T > t) = e^{-\lambda t} , then  P(T \le t) = 1 - e^{-\lambda t} .
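
This Poisson-exponential link is easy to check numerically. Here is a minimal sketch, assuming scipy.stats; the rate of 2 events per unit time and the interval length of 1.5 are arbitrary values chosen only for illustration.

```python
from scipy.stats import poisson, expon

lam, t = 2.0, 1.5   # assumed: 2 events per unit time, interval of length 1.5

p_no_events = poisson.pmf(0, mu=lam * t)        # P(N = 0) in [0, t]
p_wait_exceeds_t = expon.sf(t, scale=1 / lam)   # P(T > t)

# Both equal exp(-lam * t), about 0.0498
print(p_no_events, p_wait_exceeds_t)
```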

I know that P(T \le t) = F(t) is the cumulative distribution function. It is the integral of the probability density function: F(t) = \int_{0}^{t}f(u)du.

The probability density function f(t) can then be obtained by taking the derivative of F(t).

f(t) = \frac{d}{dt}F(t) = \frac{d}{dt}(1-e^{-\lambda t}) = \lambda e^{-\lambda t}
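
The differentiation step can also be verified symbolically. A minimal sketch, assuming SymPy is installed:

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
F = 1 - sp.exp(-lam * t)   # cumulative distribution function of the exponential
f = sp.diff(F, t)          # the density is the derivative of the CDF

print(sp.simplify(f))      # lambda*exp(-lambda*t)
```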

The random variable T, the wait time between successive events, follows an exponential distribution with parameter \lambda.

Let me map this on to the experiences I had today.

If on average, 25 vehicles pass the toll per hour, \lambda=25 per hour. Then the wait time distribution for the next vehicle at the toll should look like this.

The probability that I will wait more than 5 minutes to pass the toll is  P(T > 5) = e^{-\lambda t} = e^{-25*(5/60)} = 0.125.

So, the probability that my wait time will be less than 5 minutes is 0.875. Not bad. I should have known this before I swore at the guy who got in my way.
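
The toll-lane numbers can be reproduced in a couple of lines; note the unit conversion from minutes to hours, since \lambda is per hour (a minimal sketch using only the standard library):

```python
import math

lam = 25            # vehicles per hour
t = 5 / 60          # 5 minutes, expressed in hours

p_more_than_5 = math.exp(-lam * t)
print(round(p_more_than_5, 3))      # about 0.125
print(round(1 - p_more_than_5, 3))  # about 0.875
```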

It is clear that the distribution will be flatter if \lambda is smaller and steeper if \lambda is larger.
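
A quick plot shows this flatter-versus-steeper behavior. This is a minimal sketch assuming Matplotlib and NumPy; the rates of 5 and 50 per hour are chosen only for contrast with the 25 per hour from the toll example.

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 0.5, 200)        # wait time in hours
for lam in (5, 25, 50):             # arrival rates per hour
    plt.plot(t, lam * np.exp(-lam * t), label=f"lambda = {lam}/hour")

plt.xlabel("wait time t (hours)")
plt.ylabel("f(t)")
plt.legend()
plt.show()
```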

10:00 PM

I lay in my bed with a feeling of accomplishment. My waiting day was eventful; I checked off all boxes on my to-do list. I now have a clear understanding of exponential distribution.

10:05 PM

I am hoping that I get the same beautiful dream. My mind is still on exponential distribution with one question.

“I waited 22 minutes for the train in the evening, why did it not arrive in the next few minutes? Since I waited a long time, shouldn’t the train arrive immediately?”

A tired body always beats the mind.

It was time for the last thought to dissolve into the darkness. The SHIREBOURN river is “waiting” for me on the other side of the darkness.

If you find this useful, please like, share and subscribe.
You can also follow me on Twitter @realDevineni for updates on new lessons.