I often find myself thinking about energy, specifically electricity. Electricity is so ubiquitous – I bet you’d be hard-pressed to find someone who hasn’t used an electron within the last 24 hours. Every single keystroke I hammer out is powered by the electric grid, and few things inconvenience us more than when our power goes out. Moreover, how we power our world has tremendous environmental consequences, from greenhouse gas emissions to coal ash pond accidents to siting considerations to fracked mountain ranges.
And yet, for how common electricity is in our world, I get the impression that many people would struggle to explain electricity’s fundamental units of measure. To be clear, I’m not trying to shame anyone! I even had to look it up to make sure I had it down. That said, I’m a believer that we need to understand the fundamentals of something to change it. If we want to champion more renewable electricity, I think we should understand the difference between a kilowatt and a kilowatt-hour, if only so that we have more credibility on the topic. And so, the rest of this post will be my silly use of an analogy in an effort to demystify the measurement of electricity.
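Before diving into the analogy, it may help to state the bare relationship between the two units: a kilowatt (kW) measures power, the *rate* at which energy is delivered, while a kilowatt-hour (kWh) measures energy, an *amount*. The two are linked by time:

$$
\text{energy (kWh)} = \text{power (kW)} \times \text{time (h)}
$$

So, for example, a 1 kW appliance running for 3 hours consumes 3 kWh of energy.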
Suppose that you own two dogs.* One is large, rivaling some small horses in size, while the other could fit in your backpack in a pinch. Both go bananas when it’s mealtime, but you’ve grown tired of patiently measuring their bowls of food while they bark excitedly. You’ve come up with a brilliant innovation.