LettucePrime OP t1_j9nv0b8 wrote
Reply to comment by Surur in Question for any AI enthusiasts about an obvious (?) solution to a difficult LLM problem in society by LettucePrime
Ehh, no, that's not quite true. A ChatGPT inference costs several times more than a typical Google search, and it appears to run on the same class of hardware used to train the model, at comparable intensity.
Surur t1_j9nv53o wrote
That's not what I said lol. I said it's manageable on hardware a consumer can buy.
LettucePrime OP t1_j9nw9fu wrote
I understand you now, my apologies.
Surur t1_j9nwi3k wrote
Sure, NP, and you are partially right also lol. It may cost closer to $80,000 to have your own ChatGPT instance.
https://twitter.com/tomgoldsteincs/status/1600196988703690752
But then that sounds like a business opportunity lol.
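A rough back-of-envelope sketch of how a figure like ~$80,000 can arise. The specific numbers here are my assumptions, not taken from the linked tweet: a GPT-3-scale model (~175B parameters) stored in fp16, served from NVIDIA A100 80 GB cards at roughly $10k apiece, with ~8 cards to leave headroom for activations and the KV cache on top of the raw weights.

```python
# Back-of-envelope estimate for self-hosting a ChatGPT-scale model.
# All figures below are assumptions for illustration, not sourced data.

params = 175e9            # assumed model size (GPT-3 scale)
bytes_per_param = 2       # fp16 storage
weights_gb = params * bytes_per_param / 1e9
print(f"weights alone: {weights_gb:.0f} GB")        # 350 GB

gpus = 8                  # assumed card count: weights + runtime headroom
gpu_price_usd = 10_000    # assumed rough per-A100 price
print(f"hardware cost: ${gpus * gpu_price_usd:,}")  # $80,000
```

Under these assumptions the weights alone need ~350 GB of GPU memory, so eight 80 GB cards (640 GB total) cover the model plus working memory, and the hardware bill lands in the $80k range the comment mentions.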