Hardware for faster matrix/tensor multiplication leads to faster training, which helps. More contributors to your favorite Python frameworks lead to better tools, which also helps. Etc.
I am aware that chatbots don’t cure cancer, but discarding all the contributions of the last two years is disingenuous at best.
Well, AI has been in those places for a while. The hype cycle is around generative AI, which just isn't useful for that type of thing.
I'm sure that if Nvidia, AMD, Apple, and co. create NPUs or TPUs for generative AI, those chips can also be used in those other areas, improving them as well.
Why do you think that?
Nothing I've seen from current generative AI techniques leads me to believe they have any particular utility for system design or architecture. There are AI techniques that can help with such things; they're just not the generative variety.