I created a highly personalised large language model with Nvidia’s entertaining Chat with RTX app but at 60GB+ I’m now beginning to wonder if it’s worth keeping around


Owners of RTX 40- and 30-series graphics cards can now set up their own personalised large language model (LLM) on their PC. It's one that's eminently capable of sifting through old documents or distilling down the essence of YouTube videos.

Chat with RTX is available to download for free from Nvidia's website as of today, February 13. It works with any current- or last-generation graphics card with at least 8GB of VRAM, which includes every desktop card bar the RTX 3050 6GB and excludes a few mid- to low-end laptop GPUs. It also requires 50–100GB of storage space on your PC, depending on the AI models downloaded.
