The vast majority of consumer devices, whether mobile or laptop/desktop, are not yet powerful enough to run local AI with a good user experience. Even if they were, many users would still prefer running it in the cloud rather than draining their phone battery.
- 0 Posts
- 3 Comments
Joined 2 years ago
Cake day: June 11th, 2023
etrotta@beehaw.org to Technology@beehaw.org • The music industry is waging war upon an AI server on Discord (English, 2 points, 2 years ago)

Saying that Stable Diffusion was trained by “individuals” is a bit of a stretch: it cost over half a million dollars’ worth of compute to train, and Stability AI is still a company at the end of the day. If that counts as trained by individuals, then so do Midjourney and DALL-E.
To be fair, I wouldn’t include “loading the whole model into VRAM” as part of the cost, since they can keep the model resident between requests, and the per-query compute might be down to hundreds of billions or tens of billions of operations instead of trillions. But even after all those improvements, it should still be orders of magnitude more expensive than a normal search query, which only makes their decision crazier.
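To make the orders-of-magnitude claim concrete, here is a rough back-of-envelope sketch. Every number in it is an illustrative assumption (model size, tokens per answer, search-query budget), not a measured figure; the point is only the ratio, which stays large even if the inputs are off by a lot.

```python
# Back-of-envelope: compute for one LLM answer vs. one classic search query.
# ALL numbers below are assumptions for illustration, not measured values.

PARAMS = 70e9            # assumed model size: 70B parameters
TOKENS = 500             # assumed tokens generated per answer
FLOPS_PER_TOKEN = 2 * PARAMS  # common ~2 FLOPs/parameter/token estimate

llm_flops = FLOPS_PER_TOKEN * TOKENS  # total compute for one answer

SEARCH_FLOPS = 1e11      # assumed compute budget for one search query

ratio = llm_flops / SEARCH_FLOPS
print(f"LLM answer: {llm_flops:.1e} FLOPs, ~{ratio:.0f}x a search query")
```

With these assumed inputs the LLM answer comes out hundreds of times more expensive than the search query; shrinking the model by 10x still leaves a wide gap, which is the point being made above.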