Model-Inventory: AI Model Showdown

Model-Inventory is a sleek, modern web application built with Next.js that allows you to send a single prompt to multiple open-source and proprietary AI models simultaneously and compare their responses in a clean, side-by-side interface.

Model-Inventory Screenshot


✨ Key Features

  • Side-by-Side Model Comparison: Enter a prompt once and get responses from multiple AI models displayed in a horizontally scrollable view (a fan-out sketch with per-model timing follows this list).
  • Multiple Free & Paid Models: Integrated with a variety of free-to-use models via OpenRouter (like Llama 3, DeepSeek, and Mistral) and Google's Gemini.
  • Performance Metrics: Each response card displays how long the model took to generate the output, so you can compare speed.
  • Persistent Chat History: Your conversations are saved in the collapsible sidebar, allowing you to revisit and continue them at any time.
  • Secure API Key Management: API keys are stored securely in your browser's local storage and are never exposed to a server.
  • Dynamic & Interactive UI:
    • A beautiful dark theme with a subtle animated gradient background.
    • A collapsible sidebar for managing chat history.
    • A clean, modern design inspired by leading AI chat applications.
  • Intelligent Response Formatting: Automatically detects and pretty-prints JSON or table-formatted responses for better readability (a minimal sketch of this and of the key storage also follows the list).
  • Responsive Design: A great experience on both desktop and mobile devices.
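
The comparison and timing bullets above suggest a simple fan-out pattern. The sketch below is illustrative rather than the project's actual code: it sends one prompt to every selected model in parallel and records the elapsed time per model, which is what each response card would show. The callModel adapter is a hypothetical stand-in for whichever provider backs a given model.

    // Hypothetical fan-out helper: one prompt, many models, one timed result per model.
    type ModelResult = {
      model: string;
      text: string | null;
      elapsedMs: number;
      error?: string;
    };

    async function compareModels(
      prompt: string,
      models: string[],
      callModel: (model: string, prompt: string) => Promise<string>,
    ): Promise<ModelResult[]> {
      // Fire all requests at once; a slow or failing model must not block the others.
      const settled = await Promise.allSettled(
        models.map(async (model) => {
          const start = performance.now();
          const text = await callModel(model, prompt);
          return { model, text, elapsedMs: performance.now() - start };
        }),
      );

      // Map rejections to error cards instead of dropping them.
      return settled.map((result, i) =>
        result.status === "fulfilled"
          ? result.value
          : { model: models[i], text: null, elapsedMs: 0, error: String(result.reason) },
      );
    }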
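
The key-management and response-formatting bullets can be sketched just as briefly. Everything below, including the storage key name and the formatResponse helper, is an assumption made for illustration: API keys live only in the browser's localStorage, and replies that parse as JSON are re-serialized with indentation before display.

    // Hypothetical client-side key storage: keys stay in localStorage in the browser
    // and are never written to a server-side store.
    const KEY_STORAGE = "model-inventory-api-keys"; // assumed storage key name

    function saveApiKeys(keys: { gemini?: string; openrouter?: string }): void {
      localStorage.setItem(KEY_STORAGE, JSON.stringify(keys));
    }

    function loadApiKeys(): { gemini?: string; openrouter?: string } {
      try {
        return JSON.parse(localStorage.getItem(KEY_STORAGE) ?? "{}");
      } catch {
        return {};
      }
    }

    // Hypothetical response formatter: pretty-print replies that parse as JSON,
    // otherwise return the raw text unchanged.
    function formatResponse(raw: string): string {
      const trimmed = raw.trim();
      if (trimmed.startsWith("{") || trimmed.startsWith("[")) {
        try {
          return JSON.stringify(JSON.parse(trimmed), null, 2);
        } catch {
          // Not valid JSON; fall through to the raw text.
        }
      }
      return raw;
    }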

🛠️ Tech Stack

Built with Next.js. Model access goes through OpenRouter and Google's Gemini API.

🚀 Getting Started

Follow these steps to get a local copy up and running.

Prerequisites

  • Node.js (v18 or newer)
  • npm or yarn

Installation

  1. Clone the repository:

    git clone https://github.com/NavuluriBalaji/Model-Inventory.git
    cd Model-Inventory
  2. Install dependencies:

    npm install
  3. Set up API Keys:

    • Get your Gemini and OpenRouter API keys from the respective providers.
    • You can either set them in the in-app settings UI or create a .env.local file in the root of your project and add your keys there (a sketch of how these variables can be read server-side appears at the end of this section):
      GEMINI_API_KEY="YOUR_GEMINI_API_KEY"
      OPENROUTER_API_KEY="YOUR_OPENROUTER_API_KEY"
      
  4. Run the development server:

    npm run dev

Open http://localhost:9002 with your browser to see the result.
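
If you use the .env.local route, the keys are read on the server rather than from the browser. The handler below is a sketch under assumptions, not the project's actual code: a Next.js App Router route (the app/api/chat/route.ts path is hypothetical) reads OPENROUTER_API_KEY from the environment, as set in step 3, and forwards the prompt to OpenRouter's OpenAI-compatible chat completions endpoint.

    // app/api/chat/route.ts — hypothetical route handler, not the project's actual file.
    import { NextResponse } from "next/server";

    export async function POST(request: Request) {
      const { model, prompt } = await request.json();

      // Key comes from .env.local (OPENROUTER_API_KEY), as described in step 3.
      const apiKey = process.env.OPENROUTER_API_KEY;
      if (!apiKey) {
        return NextResponse.json({ error: "OPENROUTER_API_KEY is not set" }, { status: 500 });
      }

      // OpenRouter exposes an OpenAI-compatible chat completions endpoint.
      const upstream = await fetch("https://openrouter.ai/api/v1/chat/completions", {
        method: "POST",
        headers: {
          Authorization: `Bearer ${apiKey}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          model,
          messages: [{ role: "user", content: prompt }],
        }),
      });

      const data = await upstream.json();
      return NextResponse.json(data, { status: upstream.status });
    }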

👤 Author

Navuluri Balaji


Feel free to contribute to this project by submitting issues or pull requests. If you find it useful, please consider giving it a ⭐ on GitHub!
