
LLM-X

4.0 (1 rating)
Extension · Tools · 137 users
[Item media: 4 screenshots]

Overview

LLM-X! An app for people to talk to Ollama, LM Studio, Automatic1111, Gemini Nano, and more!

LLM-X (also called llmx) is both a web app and a Chrome extension. The entire codebase (no secrets!) is on GitHub, where the web app is also hosted. This app is designed to let users chat with local AI models, including a text-to-image provider.

Supports:
  • Ollama
  • LM Studio (through the LM Studio SDK)
  • Gemini Nano (built into the browser; currently Chrome Canary only)
  • Automatic1111 (for image generation)
  • OpenAI-compatible endpoints

Features:
  • Run multiple models at the same time
  • Chat bot that sends and receives both images and text, with response regeneration
  • Save multiple chats
  • Quick bar for easily wiping all data
  • Much more!

No RAG support yet, apologies. It should be coming soon!
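"OpenAI-compatible endpoints" means a client like this one can talk to any server that exposes the standard `/v1/chat/completions` route, which Ollama serves locally by default. A minimal sketch of such a call, not taken from the LLM-X codebase; the base URL assumes Ollama's default port, and the model name is an example:

```python
import json
from urllib.request import Request, urlopen

# Assumption: Ollama's default OpenAI-compatible base URL.
BASE_URL = "http://localhost:11434/v1"


def build_chat_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body for an OpenAI-style /chat/completions call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }


def send_chat(model: str, prompt: str) -> str:
    """Send the request; requires a local server (e.g. Ollama) to be running."""
    req = Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_request(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Any backend speaking this request/response shape (LM Studio, Ollama, or a hosted OpenAI-compatible service) can be swapped in by changing `BASE_URL`.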

Details

  • Version
    1.1.7
  • Updated
    August 12, 2025
  • Offered by
    mr.demarcus.johnson
  • Size
    962KiB
  • Languages
    English (United States)
  • Developer
    Email
    mr.demarcus.johnson@gmail.com
  • Non-trader
    This developer has not identified itself as a trader. For consumers in the European Union, please note that consumer rights do not apply to contracts between you and this developer.

Privacy

LLM-X has disclosed the following information regarding the collection and usage of your data. More detailed information can be found in the developer's privacy policy.

LLM-X handles the following:

Website content

This developer declares that your data is

  • Not being sold to third parties, outside of the approved use cases
  • Not being used or transferred for purposes that are unrelated to the item's core functionality
  • Not being used or transferred to determine creditworthiness or for lending purposes

Support

Related

Ollama EasyPHPRompt

5.0

Ollama EasyPHPRompt Chrome Extension

Local AI helper

0.0

LocalAI provides access to web content with your authorisation, without storing it. It can be configured to use files and…

Offload: Fully private AI for any website using local models.

5.0

A fully private in-browser AI assistant. Works even offline. No external API dependencies.

Local LLM Helper

2.0

Interact with your local LLM server directly from your browser.

sidellama

5.0

sidellama

BYO LLM: DevTools

5.0

Bring Your Own AI LLM to Chrome DevTools for enhanced debugging and page analysis. Connect to OpenRouter, OpenAI or Custom.

open-os LLM Browser Extension

4.5

Quick access to your favorite local LLM from your browser (Ollama).

ollama-ui

4.5

This extension hosts an ollama-ui web server on localhost

Orian (Ollama WebUI)

3.1

Quick access to your favorite local LLM from your browser (Ollama).

Page Assist - A Web UI for Local AI Models

4.9

Use your locally running AI models to assist you in your web browsing.

Ollamazing

3.5

Web extension to use local AI models

AIskNet

5.0

Locally-run AI that answers questions from your current webpage
