2 Comments
Akhil Baranwal:

Great post, as always!

This reminds me of the paper "The Hardware Lottery" by Sara Hooker, which argues that most of ML has been disproportionately shaped by the hardware it ran on - basically the generalised SIMT SMs. CUDA became a moat through adoption by industries like finance and oil-and-gas, and of course scientific computing. ML just happened to hitchhike on this bandwagon of massive parallelisation, and fell into the moat as well.

Bharath Suresh:

Seems like an interesting paper, thanks for sharing.

I think this also explains why unconventional architectures like neuromorphic or quantum computing haven't had major success beyond academia, at least so far. Software tends to be optimized for the hardware that's available, and once that's done, there needs to be a huge benefit to justify porting those optimizations to a different architecture.
