@[email protected] to Open [email protected] • 1 year ago
AMD Quietly Funded A Drop-In CUDA Implementation Built On ROCm: It's Now Open-Source
www.phoronix.com • 15 comments
cross-posted to: [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected]
mayoooo • 10 points • 1 year ago
A serious question: when will Nvidia stop selling their products and start charging rent? Like, $50 a month gets you a 4070; your hardware can be a 4090, but that's $100 a month. I give it a year.

poVoq • 11 points • 1 year ago
It's more efficient to rent the same GPU to multiple people at the same time, and Nvidia is already doing that with GeForce Now.

@[email protected] • 3 points • 1 year ago
When AI and data center hardware stop being profitable.