LBRY Block Explorer

LBRY Claims • running-4-llms-from-ollama.ai-each-with

Claim ID
67d8bbceeeafe7632a5db1ffb441584aa7f91818

Published By
Anonymous
Created On
19 Dec 2023 20:31:12 UTC
Transaction ID
Cost
Free
Safe for Work
Yes
Running 4 LLMs from Ollama.ai each with GPU and then CPU
It is amazing that some models run at reasonable speed with just a CPU and no GPU. With ollama.ai it is easy to try out models. This video gives some sense of how fast some of these LLMs are today on CPU versus GPU.
...
https://www.youtube.com/watch?v=X6YSkHoH-mo
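Trying this comparison yourself takes little more than a running Ollama server. The sketch below is a minimal, illustrative example using Ollama's local HTTP API: it times one prompt with the default settings (GPU offload when a GPU is available) and once more with the num_gpu option set to 0 to force CPU-only inference. The model name and prompt are assumptions for illustration, not taken from the video.

```python
# Minimal sketch: time one prompt on GPU vs. CPU through Ollama's local HTTP API.
# Assumes `ollama serve` is running on the default port and the model has
# already been pulled (e.g. `ollama pull llama2`). Model name and prompt are
# illustrative assumptions.
import time

import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama2"
PROMPT = "Explain what a large language model is in two sentences."


def timed_generate(options: dict) -> float:
    """Send one non-streaming generate request and return wall-clock seconds."""
    payload = {
        "model": MODEL,
        "prompt": PROMPT,
        "stream": False,   # return a single JSON object instead of a stream
        "options": options,
    }
    start = time.time()
    resp = requests.post(OLLAMA_URL, json=payload, timeout=600)
    resp.raise_for_status()
    elapsed = time.time() - start
    print(resp.json()["response"][:80].replace("\n", " "), "...")
    return elapsed


if __name__ == "__main__":
    # Default options: Ollama offloads layers to the GPU when one is available.
    gpu_time = timed_generate({})
    # num_gpu = 0 asks Ollama to offload zero layers, i.e. run CPU-only.
    cpu_time = timed_generate({"num_gpu": 0})
    print(f"GPU (default): {gpu_time:.1f}s   CPU only: {cpu_time:.1f}s")
```

Swapping the model name and repeating the two calls is enough to reproduce the kind of side-by-side timing shown in the video across several models.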
Author
Unspecified
Content Type
video/mp4
Language
English