[D] Best way to run LLMs in the cloud? Submitted by QTQRQD in MachineLearning
Quick-Hovercraft-997 wrote: If latency is not a critical requirement, you can try a serverless GPU cloud like banana.dev or pipeline.ai. These platforms provide easy-to-use templates for deploying LLMs.
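To illustrate the general pattern these platforms follow (deploy the model from a template, then call it over HTTP), here is a minimal sketch of querying a deployed endpoint. The endpoint URL, API key, and payload fields are hypothetical placeholders, not the actual banana.dev or pipeline.ai API.

```python
# Minimal sketch: calling an LLM deployed on a serverless GPU platform over HTTP.
# The URL, key, and JSON fields below are assumptions for illustration only.
import requests

API_URL = "https://api.example-serverless-gpu.com/v1/inference"  # hypothetical endpoint
API_KEY = "your-api-key-here"

def generate(prompt: str, max_tokens: int = 128) -> str:
    """Send a prompt to the deployed LLM endpoint and return the completion text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "max_tokens": max_tokens},
        timeout=120,  # serverless GPUs can have long cold starts, so allow a generous timeout
    )
    response.raise_for_status()
    return response.json()["output"]

if __name__ == "__main__":
    print(generate("Explain serverless GPU inference in one sentence."))
```

The long timeout reflects the main trade-off of serverless GPUs mentioned above: you pay nothing while idle, but a cold start can add noticeable latency to the first request.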