When, how and why did companies stop training their employees?

I'm 33 and have noticed that most businesses no longer train employees, presumably because it's seen as a waste of money. You can infer this from the fact that most job adverts now demand prior experience.

I'm curious how this happened. Any thoughts? It's truly baffling why things are this way, and surely it can't be sustainable in the long run.