A Python package for extracting structured information about vintage gaming consoles from unstructured user input. Ideal for collectors, enthusiasts, and restorers who need quick access to key details like hardware specs, game libraries, and historical context.
Install the package via pip:

```bash
pip install vintageconsoleinfo
```

## Features

- Extracts structured data from textual descriptions of vintage consoles (e.g., the Interton Video Computer 4000).
- Supports customizable LLM backends (default: LLM7).
- Uses regex pattern matching for reliable data extraction.
- Works with OpenAI, Anthropic, Google, or any LangChain-compatible LLM.
## Usage

```python
from vintageconsoleinfo import vintageconsoleinfo

# Example input about the Interton Video Computer 4000
user_input = """
The Interton Video Computer 4000 is a 1983 console with a Z80 CPU,
4KB RAM, and a built-in keyboard. It supports games like 'Space Invaders'
and 'Breakout'.
"""

response = vintageconsoleinfo(user_input)
print(response)  # Structured output (e.g., specs, games, etc.)
```

### Custom LLM backends

OpenAI:

```python
from langchain_openai import ChatOpenAI
from vintageconsoleinfo import vintageconsoleinfo

llm = ChatOpenAI(model="gpt-3.5-turbo")
response = vintageconsoleinfo(user_input, llm=llm)
```

Anthropic:

```python
from langchain_anthropic import ChatAnthropic
from vintageconsoleinfo import vintageconsoleinfo

llm = ChatAnthropic(model="claude-2")
response = vintageconsoleinfo(user_input, llm=llm)
```

Google:

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from vintageconsoleinfo import vintageconsoleinfo

llm = ChatGoogleGenerativeAI(model="gemini-pro")
response = vintageconsoleinfo(user_input, llm=llm)
```

### LLM7 API key

- Uses `LLM7_API_KEY` from environment variables or falls back to a default.
- Free-tier rate limits are sufficient for most use cases.
- Get a free API key: LLM7 Registration.
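If you prefer the environment-variable route, the key can be exported in your shell before running (placeholder value shown):

```shell
# Make the LLM7 key available to the current shell session
# ("your_llm7_api_key" is a placeholder, not a real key)
export LLM7_API_KEY="your_llm7_api_key"
```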
Alternatively, pass the key directly:

```python
response = vintageconsoleinfo(user_input, api_key="your_llm7_api_key")
```

### Parameters

| Parameter | Type | Description |
|---|---|---|
| `user_input` | `str` | Text describing a vintage console (required). |
| `api_key` | `Optional[str]` | LLM7 API key (optional; defaults to env var). |
| `llm` | `Optional[BaseChatModel]` | Custom LangChain LLM (optional; defaults to `ChatLLM7`). |
## Notes

- The package uses LLM7 by default (via `langchain_llm7`).
- For production use, make sure your chosen LLM backend's rate limits match your workload.
- Extracted data follows a structured format (regex-based extraction).
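To illustrate the idea behind regex-based extraction (a hypothetical sketch, not the package's actual patterns), simple spec fields can be pulled from free text like this:

```python
import re

# Hypothetical sketch of regex-based field extraction; the package's
# real patterns and output schema may differ.
text = ("The Interton Video Computer 4000 is a 1983 console with a Z80 CPU, "
        "4KB RAM, and a built-in keyboard.")

specs = {}
year = re.search(r"\b(19\d{2}|20\d{2})\b", text)   # four-digit year
if year:
    specs["year"] = year.group(1)
cpu = re.search(r"\b(\w+)\s+CPU\b", text)          # word preceding "CPU"
if cpu:
    specs["cpu"] = cpu.group(1)
ram = re.search(r"\b(\d+\s?[KMG]B)\s+RAM\b", text, re.IGNORECASE)
if ram:
    specs["ram"] = ram.group(1)

print(specs)  # {'year': '1983', 'cpu': 'Z80', 'ram': '4KB'}
```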
## License

MIT
## Contributing

Report bugs or request features at: 🔗 GitHub Issues

## Author

Eugene Evstafev
📧 hi@euegne.plus
🔗 GitHub: chigwell