🐳 Aurora is a Chinese-language MoE model. Aurora is a further work based on Mixtral-8x7B that activates the model's chat capability in the Chinese open domain.
A versatile CLI and Python wrapper for Perplexity's suite of large language models, including their flagship Chat and Online 'Sonar Llama-3' models along with 'Llama-3' and 'Mixtral'. Streamline the creation of chatbots and search the web with AI in real time.
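A wrapper like this typically sends chat requests to Perplexity's OpenAI-compatible endpoint. The sketch below is a minimal, hypothetical illustration, not the repository's actual code; the endpoint URL follows Perplexity's public API, while the model name and helper names are assumptions for the example.

```python
import json
import urllib.request

# Perplexity exposes an OpenAI-compatible chat completions endpoint.
API_URL = "https://api.perplexity.ai/chat/completions"

def build_payload(model, user_message, system_prompt="Be precise and concise."):
    """Build the JSON body for a chat completion request."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

def ask(api_key, model, question):
    """Send one question and return the assistant's reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(model, question)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the request body matches the OpenAI chat format, the same payload builder works for any of the models the wrapper exposes.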
This repository contains Node.js examples to get started with the Amazon Bedrock service.
XMPP bot designed for privacy-focused AI language model interactions
Create your own customized Dolphin Mixtral 8x7B & Dolphin Llama 3 model using Ollama
AI Voice-Powered TODO app
Implements named entity recognition (NER) using regex and a fine-tuned LLM, covering a total of 15 categories. The ultimate goal is to apply the model to detect personally identifiable information (PII) in student writing.
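The regex half of such a hybrid pipeline catches rigidly formatted PII, leaving free-form categories to the fine-tuned LLM. The category names and patterns below are a hypothetical subset for illustration, not the repository's actual 15-category scheme.

```python
import re

# Hypothetical subset of PII categories with rigid formats; the remaining
# categories (names, addresses, etc.) would go to the fine-tuned LLM.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "URL": re.compile(r"https?://\S+"),
}

def find_pii(text):
    """Return (category, matched_text) pairs for every regex hit."""
    hits = []
    for category, pattern in PII_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((category, match.group()))
    return hits
```

Regex results can then be merged with the LLM's span predictions, with the regex hits taking precedence for their categories since they rarely produce false positives.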
Welcome to the Mixtral 8x7B offloading demo repository! This project aims to demonstrate the seamless execution of Mixtral-8x7B models on Colab or consumer desktops.
A Python module for running the Mixtral-8x7B language model with customisable precision and attention mechanisms.
An AI based chatbot built for SaaS Entrepreneurs
The workshop covers two topics: how to use GenAI tools for image data with Hugging Face models (Stable Diffusion) in Python, and how to build a simple interface for a self-hosted LLM-based chatbot in a Jupyter notebook.
Build a chatbot with the Mixtral 8x7B Instruct model using the Panel chat interface.
A crew of AI agents that investigates a company to help you prepare for your next interview
This project aims to build a RAG model to chat with your PDFs
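In a RAG pipeline, the PDF text is split into chunks, the chunks most relevant to the question are retrieved, and those chunks are prepended to the prompt sent to the LLM. Below is a minimal stdlib sketch of the retrieval step using bag-of-words cosine similarity; a real implementation like this repository's would likely use learned embeddings and a vector store instead, and all function names here are assumptions.

```python
import math
import re
from collections import Counter

def vectorize(text):
    """Bag-of-words term-frequency vector (a stand-in for a real embedding)."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question, chunks, k=2):
    """Return the k chunks most similar to the question."""
    q = vectorize(question)
    return sorted(chunks, key=lambda c: cosine(q, vectorize(c)), reverse=True)[:k]
```

The retrieved chunks would then be formatted into the Mixtral prompt as context, so the model answers from the PDF's content rather than from its parametric memory.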
VLib is a digital library platform targeting college library systems. It utilises a vector database to discover required resources, making information accessible to all users irrespective of their knowledge level, and it overcomes the inability of present systems to handle descriptive queries, a limitation that restricts information access.
LLMs are commonly used to rewrite or make stylistic changes to text. The goal is to recover the LLM prompt that was used to transform a given text.
Working on LLM research