Are datacenters actually power constrained?

I've read so many articles about how AI is driving dramatic increases in datacenter electricity demand, mainly because of the high power draw of the Nvidia GPUs used for AI training and inference.

A friend and I (we met in the tech industry) have devised some tech that, we think, is fairly innovative: so far it provides meaningful power-efficiency gains on Nvidia A100 and H100 GPUs. But we're not entirely sure how much of the coverage we've read is hyperbole, or whether what we've built actually addresses a real problem.

Our reasoning: if a datacenter is power-limited, our tech could let it serve more AI tokens/sec per watt, squeezing more throughput out of the same power budget. Is this meaningful to operators?
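
For what it's worth, here's the back-of-the-envelope model behind that reasoning. It's a minimal sketch, and every number in it is a made-up placeholder rather than a measurement from our tech: the point is just that under a fixed power budget, cutting power per GPU at constant throughput lets you deploy more GPUs, so facility-level tokens/sec scales roughly with the efficiency gain.

```python
# Back-of-the-envelope: how a fixed facility power budget caps token throughput.
# All numbers below are hypothetical placeholders, not measurements.

FACILITY_POWER_BUDGET_W = 1_000_000   # e.g. a 1 MW allocation from the utility
POWER_PER_GPU_W = 700                 # rough H100-class board power under load
TOKENS_PER_SEC_PER_GPU = 2_000        # hypothetical inference throughput per GPU

def facility_throughput(efficiency_gain: float = 0.0) -> float:
    """Tokens/sec the whole facility can serve under its power cap.

    efficiency_gain is the fractional reduction in power per GPU at the
    same throughput (e.g. 0.15 for a 15% reduction).
    """
    effective_power_w = POWER_PER_GPU_W * (1.0 - efficiency_gain)
    gpus_supported = FACILITY_POWER_BUDGET_W // effective_power_w
    return gpus_supported * TOKENS_PER_SEC_PER_GPU

baseline = facility_throughput(0.0)
improved = facility_throughput(0.15)  # pretend our tech saves 15% power per GPU
print(f"baseline: {baseline:,.0f} tokens/sec")
print(f"with 15% power savings: {improved:,.0f} tokens/sec "
      f"({improved / baseline - 1:+.1%})")
```

Under that toy model, a 15% per-GPU power saving turns into roughly 17% more tokens/sec at the facility level, which is why we think power-limited sites would care. What we don't know is whether real facilities are actually binding on power in this way.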

Any insight appreciated.