> pure BEVs are still the significantly more expensive option
New technology often has ‘new’ tradeoffs, and GPUs are still only situationally better than CPUs.
DC fast charging is several times more expensive than home charging, which heavily influences the economics of buying an EV without subsidies. Same deal with plug-in hybrids or long-range batteries on a PEV: if you don't need the range, you're just wasting money. So there are cases where an unsubsidized PEV is the cheapest option, and that line will shift over time even if it's not going away anytime soon.
AI, on the other hand, is such a wide umbrella that it doesn't really make sense to talk about specific tradeoffs beyond the short term. Nobody can say what the downsides will be in 10-20 years, because they aren't extrapolating a specific technology with clear tradeoffs. Self-driving cars could be taking over industries in 15 years, or still be quite limited; we can't say.
GPUs are a good example - they started getting traction in the late 90s/early 2000s.
Once we figured out in the mid-2000s that single-thread performance wouldn't scale, GPUs became the next scaling frontier, and it was thought they'd complement and eventually supplant CPUs. With the Xbox and smartphones shipping integrated GPUs, and games starting to rely on general-purpose compute shaders, a lot of folks (including me) thought future software would constantly ping-pong between CPU and GPU execution. Got an array to sort? Let the GPU handle that. Got a JPEG to decode? GPU. Etc.
I took an in-depth CUDA course back in the early 2010s, thinking that within 5 years or so, all professional signal processing would move to GPUs, GPU algorithm knowledge would be just as widespread and expected as knowing how to program a CPU, and I'd need to Leetcode a bitonic sort to get a regular-ass job.
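For anyone who never met one, here's a minimal sketch of a GPU bitonic sort, the kind of thing such a course drilled. The kernel name, launch configuration, and the power-of-two restriction are my own simplifications, not from any particular library:

```cuda
#include <cuda_runtime.h>

// One compare-and-swap pass of a bitonic sort. The full sort runs this
// kernel O(log^2 n) times with varying (j, k); n must be a power of two.
__global__ void bitonic_step(float *data, int n, int j, int k) {
    unsigned int i = blockIdx.x * blockDim.x + threadIdx.x;
    unsigned int partner = i ^ j;                    // element this thread compares against
    if (i >= (unsigned int)n || partner < i) return; // each pair is handled once
    bool ascending = ((i & k) == 0);                 // direction of this bitonic subsequence
    if ((data[i] > data[partner]) == ascending) {
        float tmp = data[i];                         // swap the out-of-order pair
        data[i] = data[partner];
        data[partner] = tmp;
    }
}

void bitonic_sort(float *d_data, int n) {        // d_data already resides on the device
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    for (int k = 2; k <= n; k <<= 1)             // size of the bitonic subsequences
        for (int j = k >> 1; j > 0; j >>= 1)     // compare distance within a stage
            bitonic_step<<<blocks, threads>>>(d_data, n, j, k);
    cudaDeviceSynchronize();
}
```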
What happened? GPUs never saw that broad adoption: data sharing between CPU and GPU is still cumbersome and slow, and dedicated accelerators like video decoders weren't replaced by general-purpose GPU compute; we still have special function units for these.
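The cumbersomeness is visible even in the happy path. Here's a hedged sketch of what "just offload it" costs per call; sort_on_gpu is my own hypothetical wrapper, but the cudaMalloc/cudaMemcpy round trip is the real CUDA API:

```cuda
#include <cuda_runtime.h>
#include <vector>

// Every CPU<->GPU handoff pays an explicit allocation and two copies
// across the PCIe bus, one in each direction, before any sorting happens.
void sort_on_gpu(std::vector<float> &host_data) {
    size_t bytes = host_data.size() * sizeof(float);
    float *d_data = nullptr;
    cudaMalloc(&d_data, bytes);
    cudaMemcpy(d_data, host_data.data(), bytes, cudaMemcpyHostToDevice); // CPU -> GPU
    // ... launch sort kernels here (e.g. the bitonic steps above) ...
    cudaMemcpy(host_data.data(), d_data, bytes, cudaMemcpyDeviceToHost); // GPU -> CPU
    cudaFree(d_data);
}
```

For small arrays, those two copies routinely take longer than sorting on the CPU would have, which is a big part of why the ping-pong model never materialized.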
There are technical challenges to doing these things, sure, but very solvable ones.
GPUs are still stuck in two niches - video games and AI (which, incidentally, got huge). Everybody still writes single-threaded Python and JS.
There was every reason to be optimistic about GPGPU back then, and there's every reason to be optimistic about AI now.
Not sure where this will go, but probably not where we expect it to.