

And that translates directly to energy consumption - by some estimates a single ChatGPT-style conversation uses roughly as much electricity as a full smartphone charge, which is why companies are scrambling to build more efficient AI chips and to lock in more grid generation for their data centers. On the home scale the same energy math matters for emergency backup - you can compare portable power stations on gearscouts.com to see which give the best $/Wh value.
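For scale, here's the rough arithmetic behind the smartphone comparison - a quick sketch, assuming a commonly cited ballpark of a few watt-hours per query (newer estimates run lower) and a typical ~15 Wh phone battery; both numbers are assumptions, not measurements.

```python
# Rough scale comparison: an AI chat session vs. charging a phone.
# Both figures are assumed ballpark estimates, not measured data.

wh_per_query = 3.0        # commonly cited estimate per query; actual values are debated
phone_battery_wh = 15.0   # typical smartphone battery (~4,000 mAh at ~3.8 V)

queries_per_charge = phone_battery_wh / wh_per_query
print(f"~{queries_per_charge:.0f} queries use about as much energy as one full phone charge")
# -> ~5 queries, so a longer conversation lands in the same ballpark as a phone charge
```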
100% agree - community prepping is the way forward, and reliable power is crucial when the grid goes down. That's why I've been researching some of the newer LFP battery power stations on gearscouts.com; they work out to much better value per usable watt-hour than the old lead acid setups we used to rely on, mostly because LFP packs can be discharged far deeper and survive many more cycles.
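For anyone who wants to run that comparison themselves, here's a minimal sketch of cost per delivered watt-hour over the battery's life. Every price, capacity, and cycle count below is a hypothetical example for illustration - not a real listing from gearscouts.com - and the depth-of-discharge figures (roughly 50% for lead acid, ~90% for LFP) are common rules of thumb.

```python
# Back-of-envelope: dollars per watt-hour actually delivered over the battery's life.
# All prices, capacities, and cycle counts are hypothetical examples, not real listings.

def cost_per_lifetime_wh(price_usd, nameplate_wh, usable_fraction, cycle_life):
    """Cost per Wh delivered across all cycles, assuming full usable discharge each cycle."""
    delivered_wh = nameplate_wh * usable_fraction * cycle_life
    return price_usd / delivered_wh

# Assumed rules of thumb: lead acid ~50% usable depth of discharge, a few hundred cycles;
# LFP ~90% usable, several thousand cycles.
lead_acid = cost_per_lifetime_wh(price_usd=200, nameplate_wh=1200, usable_fraction=0.5, cycle_life=400)
lfp       = cost_per_lifetime_wh(price_usd=450, nameplate_wh=1024, usable_fraction=0.9, cycle_life=3000)

print(f"lead acid: ${lead_acid:.4f} per delivered Wh")   # ~$0.0008
print(f"LFP:       ${lfp:.4f} per delivered Wh")         # ~$0.0002
```

The takeaway is that even though LFP costs more upfront per nameplate Wh, the deeper usable discharge and longer cycle life make it cheaper per Wh you actually get to use.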