My numbersense class for Principal Analytics Prep has passed the midway point, and we have covered probabilistic thinking. Yesterday, it occurred to me that the surge protector is a good example to use in teaching probability. I just need to find some statistics - which turn out to be hard to come by... so if you know of some sources, please let me know!

Appliances in the U.S. are rated 110 (or is it 120) volts. Anyone who has lived elsewhere (say, Europe or parts of Asia) may know that in some countries, appliances use 240 volts. If we take an American appliance to Europe, the 240 volts of electricity will immediately kill the device. The 110V or 240V standard is a nominal (strictly, root-mean-square) value, and we should expect fluctuations around it - in particular, in the peaks of the sine curve that the voltage traces. There is natural variability in the voltage.
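As an aside on those nominal figures: they are quoted as RMS values, so the actual sine-wave peaks sit higher by a factor of √2. A quick sketch of the arithmetic:

```python
import math

# Nominal mains voltages are quoted as RMS values; for a sine wave,
# the instantaneous peak is sqrt(2) times the RMS figure.
for rms in (110, 120, 240):
    print(f"{rms} V RMS -> sine-wave peak of roughly {rms * math.sqrt(2):.0f} V")
# 110 V RMS -> sine-wave peak of roughly 156 V
# 120 V RMS -> sine-wave peak of roughly 170 V
# 240 V RMS -> sine-wave peak of roughly 339 V
```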

Then there are surges. Statistically, we can define surges as rare events - maybe voltage that is at least three standard deviations above the normal value. (There may be an official definition but I wasn't able to find it on a quick Google search.) Surges are apparently caused by lightning or switching. (Here is a somewhat useful NIST document, which may have been partly or wholly written by suppliers of surge protection equipment.)

I would like to find some data on the statistical distribution of voltage delivered. Then, students can figure out if the data resemble a normal distribution or some other probability model. We can estimate the frequency of surges. This can lead to a quantitative assessment of expected loss due to power surges.
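To make the exercise concrete before real data turns up, here is a simulation sketch. Every parameter below is invented (the mean, the standard deviation, even the normal model itself are assumptions, since I don't have the data): we generate a million voltage peaks and count how often one clears the "3 SD above normal" bar.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters -- real utility data would replace all of these.
MEAN_PEAK = 120.0  # assumed average of the periodic voltage peaks (volts)
SD_PEAK = 3.0      # assumed standard deviation of those peaks

# Simulate one million observed peaks under a normal model.
peaks = rng.normal(MEAN_PEAK, SD_PEAK, size=1_000_000)

# Define a "surge" as a peak at least 3 SD above the average.
threshold = MEAN_PEAK + 3 * SD_PEAK
surge_rate = (peaks > threshold).mean()

print(f"surge threshold: {threshold:.0f} V")
print(f"fraction of peaks that qualify as surges: {surge_rate:.5f}")
# Under normality this fraction is about 0.00135, i.e. one peak in ~740.
```

With empirical data, students would compare the observed tail frequency against this normal-model prediction; a heavier observed tail would argue for a different probability model.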

Further, we must account for an asymmetrical cost structure. Having too low a voltage is also a problem, but it is a much lesser problem than having too high a voltage. There also does not appear to be such a thing as a "negative power surge".

The above lays the groundwork for making a decision about whether one should buy a surge protector. There are different types of surge protectors providing different levels of protection at different prices. How can we decide whether to invest in the next better surge protector?

The NIST document referenced above makes this non-quantified assertion: "A large stack of dollar bills and some change to replace your unprotected computer, if and when a lightning or some other surge destroyed it ..... or use a small number of bills to purchase a 'surge protector' for peace of mind and effective protection." How can we quantify such a statement?
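One way to quantify the NIST statement is an expected-loss comparison. All of the numbers below are invented for illustration (the surge probability, effectiveness, and prices are assumptions, not data): we compare the expected loss of going unprotected against the protector's price plus the residual expected loss.

```python
# All numbers are hypothetical -- none come from NIST or from real data.
p_damaging_surge_per_year = 0.02  # assumed annual chance of a device-killing surge
replacement_cost = 1500.0         # "large stack of dollar bills": a new computer
protector_price = 30.0            # "small number of bills": the surge protector
protector_effectiveness = 0.95    # assumed fraction of damaging surges blocked
horizon_years = 5                 # how long we expect to keep the computer

expected_loss_unprotected = (horizon_years * p_damaging_surge_per_year
                             * replacement_cost)
expected_loss_protected = (protector_price
                           + horizon_years * p_damaging_surge_per_year
                           * (1 - protector_effectiveness) * replacement_cost)

print(f"expected 5-year loss, no protector:   ${expected_loss_unprotected:.2f}")
print(f"expected 5-year loss, with protector: ${expected_loss_protected:.2f}")
# -> $150.00 vs $37.50 under these assumptions
```

The same framework answers the "next better surge protector" question: the upgrade is worth it when its extra price is less than the extra expected damage it prevents. The hard part, as the post notes, is pinning down the surge probability empirically.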

"Statistically, we can define surges as rare events - maybe voltage that is at least three standard deviations above the normal value."

Uhm... What is the unit of measure for time? A second? A minute?

Using a normal distribution, a value farther than three standard deviations from the mean has probability about 0.3% (or 0.15% taking into account only positive deviations).

Should I deduce that surges occur every 333 (667) seconds or minutes on average?

Posted by: Antonio | 09/16/2017 at 02:58 PM

Antonio: I'm just speculating there. I haven't been able to find a data source to know what the right probability model for it is. If we have empirical data, we just need to plot the periodic peaks of the voltage. The average of these peaks should be around 110V in the U.S., but it's not clear what the standard deviation is, or the shape of the distribution. Surges could be much more than 3 SD away - I just don't have any data to say one way or another.

In terms of sampling frequency, using shorter time units means there are many more observations, so the choice of unit shouldn't matter.

Posted by: Kaiser | 09/18/2017 at 12:52 AM