
ollama-ui

4.5 (32 ratings)

Overview

This extension hosts an ollama-ui web server on localhost.

Just a simple HTML UI for Ollama. Source: https://github.com/ollama-ui/ollama-ui
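For context, the UI it serves talks to the local Ollama REST API. Below is a minimal sketch (Python, standard library only) of the kind of request such a UI issues; it assumes an Ollama server is running on its default port 11434 and that the example model name has already been pulled (e.g. with `ollama pull llama3`).

```python
# Minimal sketch: query a locally running Ollama server, as a local web UI would.
# Assumes Ollama is listening on its default port (11434) and the model below exists.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",           # example model name; substitute one you have installed
    "prompt": "Why is the sky blue?",
    "stream": False,             # request a single JSON response instead of a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])
```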

4.5 out of 5 (32 ratings)


Details

  • Version
    1.6
  • Updated
    October 1, 2023
  • Offered by
    ollama.ai.ui
  • Size
    149KiB
  • Languages
    English
  • Developer
    Email
    ollama.ai.ui@gmail.com
  • Non-trader
    This developer has not identified itself as a trader. For consumers in the European Union, please note that consumer rights do not apply to contracts between you and this developer.

Privacy

The developer has disclosed that it will not collect or use your data.

This developer declares that your data is

  • Not being sold to third parties, outside of the approved use cases
  • Not being used or transferred for purposes that are unrelated to the item's core functionality
  • Not being used or transferred to determine creditworthiness or for lending purposes

Support

For help with questions, suggestions, or problems, visit the developer's support site.

Related

sidellama

5.0

sidellama

Ollama Chrome API

3.0

Allow websites to access your locally running Ollama instance.

Local LLM Helper

2.0

Interact with your local LLM server directly from your browser.

Local LLama LLM AI Chat Query Tool

5.0

Query a local model from your browser.

Obsidian Web Clipper

4.8

Save and highlight web pages in a private and durable format that you can access offline. The official extension for Obsidian.

Ollama KISS UI

5.0

A simple, stupid UI for Ollama. Keep It Simple, Stupid (KISS).

Offline AI Chat (Ollama)

4.0

Chat interface for your local Ollama AI models. Requires Ollama to be installed and running on localhost.

open-os LLM Browser Extension

4.5

Quick access to your favorite local LLM from your browser (Ollama).

Highlight X | AI Chat with Ollama, Local AI, Open AI, Claude, Gemini & more

3.9

HighlightX is a powerful chat app that connects you to AI models like Ollama, LocalAI, OpenAI, Claude, Gemini & more in one place

Ollama Client - Chat with Local LLM Models

5.0

Privacy-first Ollama Chrome extension to chat with local AI models like LLaMA, Mistral, Gemma — fully offline.

Orian (Ollama WebUI)

3.1

Quick access to your favorite local LLM from your browser (Ollama).

Perplexity - AI Companion

4.0

Ask anything while you browse

