LLMX: Easiest third-party local LLM UI for the web!
Updated Jun 27, 2024 - TypeScript
Automate the batching and execution of prompts.
RAG with LM Studio, local LLMs, and scientific-PDF text extraction.
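The retrieval step of a RAG pipeline like the one described above can be sketched in a few lines: embed the document chunks, embed the query, and return the chunks with the highest cosine similarity. The hand-written vectors and the `retrieve` helper below are illustrative assumptions, not the repository's actual code; a real pipeline would get embeddings from a local embedding model.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, chunks, k=2):
    """Return the texts of the k chunks most similar to the query vector."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c["vec"]), reverse=True)
    return [c["text"] for c in ranked[:k]]

# Toy corpus with made-up 2-D "embeddings" for demonstration only.
chunks = [
    {"text": "Transformers use attention.", "vec": [1.0, 0.0]},
    {"text": "PDF parsing extracts text.", "vec": [0.0, 1.0]},
    {"text": "Attention scales quadratically.", "vec": [0.9, 0.1]},
]
print(retrieve([1.0, 0.0], chunks, k=2))
```

The retrieved texts would then be prepended to the prompt sent to the local LLM.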
Quill is a cutting-edge fullstack SaaS platform built from scratch using Next.js 13.5, tRPC, TypeScript, Prisma, and Tailwind. It features a beautiful landing and pricing page, real-time streaming API responses, and robust authentication via Kinde, along with modern UI components, optimistic updates, and seamless data fetching.
visionOS examples ⸺ Spatial Computing Accelerators for Apple Vision Pro
Solve complex problems by intelligently orchestrating subagents using a local LLM, embeddings, and DuckDuckGo search.
Serverless, single-HTML-page access to an OpenAI-API-compatible local LLM.
This repository hosts a web-based chat application that uses models served through LM Studio, such as Mistral, OpenAI-compatible models, and Llama, behind a Gradio interface. It maintains conversation history for a continuous, coherent chat experience akin to ChatGPT or Claude.
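The "conversation history" behavior described above usually means keeping every prior turn in the `messages` list and resending the whole list on each request, so the model sees the full context. A minimal sketch, assuming the OpenAI-style message schema (the helper names here are hypothetical):

```python
def make_history(system_prompt="You are a helpful assistant."):
    """Start a conversation with a system message."""
    return [{"role": "system", "content": system_prompt}]

def add_turn(history, user_msg, assistant_msg):
    """Record one user/assistant exchange; the full list is sent on every request."""
    history.append({"role": "user", "content": user_msg})
    history.append({"role": "assistant", "content": assistant_msg})
    return history

history = make_history()
add_turn(history, "Hi!", "Hello! How can I help?")
# `history` now holds system + user + assistant messages and is passed
# as `messages` in the next chat-completions request.
```

Gradio's chat components hand you the accumulated turns on each callback, so this bookkeeping maps directly onto its chat interface.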