An interactive web UI for chatting with Ollama, featuring Hana - a lively 2D anime character
• Features • Installation • Usage • Screenshots • Configuration • Contributing
- Interactive Chat Interface: Talk with Ollama's powerful language models in a friendly web UI
- Live 2D Animation: Chat with Hana, an expressive anime character that responds to your interactions
- Markdown Support: View beautifully formatted responses with syntax highlighting
- LaTeX Math Rendering: Display mathematical equations properly with KaTeX
- Customizable Settings: Choose your Ollama model and configure system prompts
- Responsive Design: Enjoy HanaVerse on desktop and mobile devices
- Real-time Response: Stream responses as they're generated
A skeleton demo is available at https://hanaverse.vercel.app/
Demo video: hanaverse.demo.1.1.1.mp4
- Python 3.8+ installed
- Ollama installed and running on your system
- Git
git clone https://github.com/Ashish-Patnaik/HanaVerse.git
cd HanaVerse
pip install -r requirements.txt
python server.py
The server will start running at http://localhost:5000 by default.
After starting the server, open index.html in your web browser to access HanaVerse.
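If the UI loads but chat does not respond, first confirm that Ollama itself is reachable at its default address. A minimal sketch (not part of HanaVerse) using only the Python standard library; adjust the URL if your Ollama server runs elsewhere:

```python
# check_ollama.py - quick sanity check that Ollama is reachable (not part of HanaVerse)
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama address; change if yours differs

try:
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])
    print("Ollama is running. Installed models:")
    for model in models:
        print(" -", model.get("name"))
except OSError as exc:
    print(f"Could not reach Ollama at {OLLAMA_URL}: {exc}")
```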
- Start Chatting: Type your message in the input box and press Send or hit Enter
- Change Settings: Click the hamburger menu (☰) to access settings
- Configure Ollama (see the request sketch after this list):
- Set your Ollama server URL (default: http://localhost:11434)
- Choose your preferred model (e.g., llama3:8b, codellama:7b, mistral:latest)
- Customize the system prompt for specialized responses
- Stop Generation: Click the Stop button anytime to halt response generation
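These settings map directly onto Ollama's chat API. The sketch below is illustrative only and separate from HanaVerse's own chat.js/server.py code; the model name and prompts are placeholders, and stopping generation simply corresponds to closing the streamed response early:

```python
# stream_chat.py - what a streaming chat request to Ollama looks like (illustrative only)
import json
import urllib.request

payload = {
    "model": "llama3:8b",  # the model chosen in the settings panel
    "messages": [
        {"role": "system", "content": "Format math using LaTeX."},    # custom system prompt
        {"role": "user", "content": "Explain the quadratic formula."},
    ],
    "stream": True,  # tokens arrive as they are generated
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",  # Ollama server URL from the settings panel
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    for line in resp:  # the reply arrives as one JSON object per line
        if not line.strip():
            continue
        chunk = json.loads(line)
        if chunk.get("done"):  # final chunk; breaking out early acts like the Stop button
            break
        print(chunk["message"]["content"], end="", flush=True)
    print()
```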
- Only Cubism 4 models are supported
- Models need to support motionsync3 (reference: https://docs.live2d.com/en/cubism-editor-manual/motion-sync/)
- Add your models to the `models/` directory
HanaVerse works with any model available in your Ollama installation. Some recommended models:
- `llama3:8b` - Great general-purpose assistant (default)
- `codellama:7b` - Specialized for coding tasks
- `mistral:latest` - Alternative high-quality model
- `phi3:latest` - Microsoft's compact but powerful model
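Any of these can be fetched with `ollama pull <model>` on the command line before selecting it in the settings. As a rough HTTP equivalent, here is a sketch against Ollama's /api/pull endpoint (not HanaVerse code; the model tag is a placeholder):

```python
# pull_model.py - download a model through Ollama's /api/pull endpoint
# (roughly equivalent to running `ollama pull llama3:8b` on the command line)
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/pull",
    data=json.dumps({"model": "llama3:8b"}).encode("utf-8"),  # any tag from the list above
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    for line in resp:  # download progress is streamed as JSON lines
        if line.strip():
            print(json.loads(line).get("status", ""))
```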
Customize the system prompt to get specialized responses:
- Math Helper: "Format math using LaTeX. Show step-by-step solutions."
- Coding Assistant: "Provide code examples with detailed explanations. Use appropriate syntax highlighting."
- Recipe Generator: "Present ingredients as bullet points and steps as numbered lists."
HanaVerse/
├── server.py          # Flask server for handling API requests
├── index.html         # Main web interface
├── style.css          # CSS styling for the UI
├── script.js          # Core functionality for Live2D character
├── chat.js            # Chat interaction logic
├── sdk/               # Live2D SDK components
├── prism/             # Syntax highlighting library
├── katex/             # Math rendering library
├── models/            # Live2D model files for Hana
└── requirements.txt   # Python dependencies
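server.py is described above as a Flask server for handling API requests. As a rough mental model only (the route name, payload handling, and use of the requests library here are assumptions, not the actual implementation), such a server typically proxies the browser's chat payload to Ollama and streams the reply back:

```python
# Minimal Flask proxy sketch - a stand-in mental model, NOT the real server.py.
import requests  # assumed dependency for this sketch; the real server may differ
from flask import Flask, Response, request

app = Flask(__name__)
OLLAMA_URL = "http://localhost:11434"  # default Ollama address


@app.route("/chat", methods=["POST"])  # hypothetical endpoint name
def chat():
    """Forward the browser's chat payload to Ollama and stream the reply back."""
    upstream = requests.post(f"{OLLAMA_URL}/api/chat", json=request.get_json(), stream=True)

    def relay():
        for line in upstream.iter_lines():
            if line:
                yield line + b"\n"  # preserve the one-JSON-object-per-line framing

    return Response(relay(), mimetype="application/x-ndjson")


if __name__ == "__main__":
    app.run(port=5000)  # matches the default port mentioned in Installation
```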
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under a Custom Non-Commercial Use License.
You may use, copy, and run this software for personal or educational use only.
Commercial use and modification for commercial purposes are not allowed for now.
- Ollama for the local LLM runtime
- Live2D for the Cubism SDK
- pixi-live2d-display for the WebGL rendering
- KaTeX for math rendering
- Prism for syntax highlighting
- Live2D MotionSync library