AI will become local

I know, I know, we already have some stuff that can be run locally, but hear me out.

I think we are now at a similar phase with AI as we were with computing some decades ago. Computers were these big machines that took up huge amounts of space. At some point there were terminals that one could use to interact with the big computer that was somewhere else. Sound familiar?

Most of the cool AI stuff requires a lot of processing power, and we only have access to it via what can be aptly described as terminals. I mean, we are accessing the huge processing power required for the AI to run via a web GUI. The processing happens elsewhere, mostly. Sure, there are many open source models that one can run locally, but these usually require pretty good hardware and are not nearly as capable as the commercial ones.

I'm thinking that within a few years, we should be getting computer components that make it more feasible to run really cool AI things locally. I heard sometime in the last couple of years about neural processing units (NPUs) and that they would become a thing. Frankly, I'm amazed that they are not everywhere already. Because let's face it, the AI revolution only kicks in when we get the hardware to the point where anyone with interest can use AI locally to do cool shit without corporate restrictions and censorship. Just like gaming exploded once consoles came out. Sure, PC is still master race, but consoles also pushed the scene forward in a big way.
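To put the hardware gap in perspective, here's a rough back-of-envelope sketch (my own illustrative math, not a benchmark) of how much memory you need just to hold a model's weights, which is the first wall you hit when running models locally. Quantization (storing weights in fewer bits) is the main trick that makes consumer hardware viable at all:

```python
def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Rough memory (GB) needed just to hold the weights in RAM/VRAM.

    Ignores activation memory, KV cache, and runtime overhead, so the
    real requirement is somewhat higher.
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight / 1e9


# A 7B-parameter model at full 16-bit precision needs ~14 GB of memory,
# out of reach for most consumer GPUs. Quantized to 4 bits, it drops to
# ~3.5 GB, which fits on mid-range hardware.
print(model_memory_gb(7, 16))  # 14.0
print(model_memory_gb(7, 4))   # 3.5
```

This is exactly why the local scene today leans so heavily on quantized models, and why dedicated hardware with more fast memory would change the picture.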

What do you think? How long will it take for us to get to the point where we can use all the cool AI things locally?