Chat With RTX Is Here: Nvidia’s Offline AI Chatbot Is Ready To Talk

Nvidia has launched Chat with RTX, an artificial intelligence (AI)-powered chatbot offering instant, offline conversations directly on your PC.

The California-based technology company has been a major force in the AI industry since the emergence of generative AI, with its chips powering many AI products and services. For instance, Nvidia’s A100 chip powers OpenAI’s ChatGPT chatbot.

Moreover, the company runs an AI platform focused on end-to-end solutions for enterprises. Last year, Nvidia published research on using AI to improve chip design. Despite making waves in AI, Nvidia’s consumer bread and butter remains gaming.

Now, the GPU maker is blending these areas with Chat with RTX, an offline chatbot that brings AI-backed experiences to gaming PCs.

What separates Chat with RTX from other AI bots?

“Chat with RTX uses retrieval-augmented generation (RAG), NVIDIA TensorRT-LLM software and NVIDIA RTX acceleration to bring generative AI capabilities to local, GeForce-powered Windows PCs,” Nvidia’s Jesse Clayton explained in a blog post.

The personalised AI chatbot is available to download now; the installer weighs in at 35GB. Keep in mind that you will need a Windows PC or workstation with an RTX 30- or 40-series GPU and at least 8GB of VRAM.

After downloading, you can install the app with a few clicks and start using it right away. On the downside, Chat with RTX lacks knowledge of the outside world since it is a local chatbot.

Nevertheless, you can feed it your personal data, such as files and documents, and have it answer queries about them. For example, you can point it at a large set of documents and then ask a specific question, saving you the time of searching through all that information yourself.
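
To picture the general idea, here is a minimal retrieval-augmented generation sketch in Python. It is not Chat with RTX’s implementation (the app uses TensorRT-LLM and GPU acceleration); the folder path, chunking and keyword scoring below are simplified placeholders showing how a question can be grounded in local documents.

```python
from pathlib import Path

def load_chunks(folder, chunk_size=500):
    """Read every .txt file in a folder and split it into small chunks."""
    chunks = []
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        chunks += [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    return chunks

def retrieve(question, chunks, top_k=3):
    """Rank chunks by naive keyword overlap (real systems use vector embeddings)."""
    words = set(question.lower().split())
    ranked = sorted(chunks, key=lambda c: len(words & set(c.lower().split())), reverse=True)
    return ranked[:top_k]

def build_prompt(question, folder="./my_documents"):
    """Prepend the most relevant local text so the model answers from your own files."""
    context = "\n---\n".join(retrieve(question, load_chunks(folder)))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# The resulting prompt would then be handed to a locally running model.
print(build_prompt("What does the Q3 report say about revenue?"))
```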

Alternatively, you can use the offline chatbot as a research tool to quickly flip through studies and papers. The bot supports a wide range of file formats including text, PDF, doc/docx and XML.

As if that weren’t enough, the AI bot can understand YouTube videos as well. All you need to do is give it a link, and it will answer questions or summarise the whole video using its transcript. However, it requires internet access for this functionality.
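
Something similar can be sketched for the video feature using the third-party youtube-transcript-api Python package. This is an assumption about how a transcript might be fetched in general, not a description of Chat with RTX’s internals, and the video ID is only an example.

```python
# pip install youtube-transcript-api
from youtube_transcript_api import YouTubeTranscriptApi

def fetch_transcript_text(video_id):
    """Download a video's caption track and join it into one block of text."""
    segments = YouTubeTranscriptApi.get_transcript(video_id)  # requires internet access
    return " ".join(segment["text"] for segment in segments)

# "dQw4w9WgXcQ" is only an example video ID; the resulting text would be passed
# to the local LLM with a prompt such as "Summarise this transcript: ..."
print(fetch_transcript_text("dQw4w9WgXcQ")[:500])
```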

A demo video shared in the blog post suggests Chat with RTX runs as a local web server alongside a Python instance that, when freshly downloaded, does not yet include the data for a large language model (LLM). You can choose between the Mistral and Llama 2 models and then point the bot at your own data to run queries.
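
For a rough sense of what picking between those two models looks like in code, here is a sketch using the Hugging Face transformers library. Chat with RTX itself ships TensorRT-LLM-optimised builds of the models, so this is an analogy only; the model IDs are the public Hugging Face checkpoints, and the prompt is a made-up example.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Public checkpoints for the two model families Chat with RTX offers;
# the app itself downloads Nvidia's TensorRT-LLM builds instead.
model_id = "mistralai/Mistral-7B-Instruct-v0.2"   # or "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarise the attached meeting notes in three bullet points."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```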
