As if there weren't enough chatbots already, here's another one. NVIDIA is launching Chat with RTX, an alternative to ChatGPT that works only with text and video and does so entirely locally. No cloud connection. Of course, you need a modern RTX graphics card.
The first generation of generative AI tools, such as ChatGPT, Bard, Copilot and company, runs in the cloud. The "excuse" is that they require a lot of processing power to operate (and they waste energy on the 99% of trivial questions they are asked).
This is a half-truth. They could run locally, but that would require powerful hardware. And, of course, companies want you to use their cloud so that you stay tied to them.
The problem with AI in the cloud is that all requests and personal data pass through the servers of Microsoft, Google, Meta and the rest, and we don't know what they do with them.
There are already versions of GPT, Copilot or Gemini that partially work locally, but they are heavily cut-down versions, "dumber" than the full versions in the cloud.
NVIDIA wants to change that with Chat with RTX, a customizable generative AI that works locally without sending any data off your machine. But, as we will see, it requires powerful hardware.
NVIDIA's new generative AI is not a general-purpose chatbot. It is a custom AI limited to your documents and videos.
You simply “train” it with your own documents in TXT, PDF, DOC/DOCX and XML formats. It also works with YouTube video links.
Once you provide the material, it can answer questions based on those documents or videos, summarize them, draw conclusions and so on. You can see this in the presentation video accompanying the announcement.
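Under the hood, this kind of tool follows a "retrieval-augmented" pattern: the documents are split into chunks, the chunks most relevant to your question are retrieved, and only those are handed to the language model. The sketch below is not NVIDIA's code; it is a minimal Python illustration of the retrieval step, using TF-IDF similarity from scikit-learn instead of the neural embeddings a real product would use, with made-up example documents.

```python
# Minimal sketch of the idea behind "chatting with your documents":
# split local files into chunks, rank the chunks against the question,
# and hand only the best matches to a language model.
# This is NOT NVIDIA's implementation; the documents and the TF-IDF
# retrieval below are illustrative stand-ins.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Pretend these strings were extracted from your TXT/PDF/DOC/XML files.
documents = [
    "Chat with RTX runs locally on an RTX 3000- or 4000-series GPU.",
    "The tool can index TXT, PDF, DOC/DOCX and XML documents.",
    "Requests never leave the machine, so no data goes to the cloud.",
]
question = "Which file formats can I feed to the assistant?"

# Rank document chunks by TF-IDF similarity to the question.
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)
question_vec = vectorizer.transform([question])
scores = cosine_similarity(question_vec, doc_matrix).flatten()
best_chunk = documents[scores.argmax()]

# A real tool would now pass the question plus the retrieved chunks to a
# local LLM (Mistral or Llama) to write the answer.
print("Context handed to the model:", best_chunk)
print("Question:", question)
```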
Chat with RTX works with different language models. For now you can choose between Mistral and Meta's Llama, but it will soon be compatible with others.
The most interesting thing is that it does not upload data to the cloud. It runs locally using NVIDIA TensorRT-LLM and the AI cores of NVIDIA's most powerful cards.
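To give an idea of what "running the model locally on the GPU" means, here is a hedged Python sketch using the open-source transformers library. Chat with RTX itself relies on TensorRT-LLM engines, a different and faster runtime, so nothing below is NVIDIA's actual code; the Hugging Face model identifier is only an illustrative choice.

```python
# Hedged sketch of local inference on an RTX GPU with the transformers
# library. Chat with RTX uses TensorRT-LLM engines instead; this only
# shows the general shape of "prompt in, text out, no cloud involved".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # illustrative model ID only

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to cut VRAM use; the real tool ships quantized engines
)
model = model.to("cuda")  # keep inference on the local RTX GPU

prompt = "Summarize the attached meeting notes in three bullet points."
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```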
That is why the hardware requirements are high: it needs a computer with 16 GB of RAM, Windows 11 and an RTX 3000- or 4000-series graphics card with 8 GB of VRAM.
If you have a computer with these specifications, you can now download Chat with RTX, NVIDIA's generative AI, from its official website and use it for free.
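If you want a rough idea of whether your machine meets those numbers before downloading, a few lines of Python using PyTorch and psutil can report your RAM and VRAM. This is only a convenience sketch, not NVIDIA's installer check.

```python
# Convenience sketch (not NVIDIA's installer check): report system RAM
# and GPU VRAM to compare against the stated requirements.
import torch
import psutil

ram_gb = psutil.virtual_memory().total / 1024**3
print(f"System RAM: {ram_gb:.1f} GB (16 GB required)")

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    vram_gb = props.total_memory / 1024**3
    print(f"GPU: {props.name} with {vram_gb:.1f} GB VRAM (8 GB on an RTX 30/40 card required)")
else:
    print("No CUDA-capable NVIDIA GPU detected.")
```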