A sophisticated web-enabled Language Model framework built on Ollama, featuring advanced reasoning capabilities and automated web search integration. The system utilizes the Gemma2:9B model as its core engine while incorporating multiple reasoning stages and web search capabilities for enhanced response accuracy.
Standalone Operation
- Functions as a complete autonomous system
- Independently retrieves and processes information from the web
- Self-evaluates knowledge gaps and automatically initiates web searches
- Performs multi-stage analysis and verification of gathered information
LLM Agent Integration
- Can serve as a specialized web search tool for LLM agent systems
- Perfect for integration into multi-agent architectures
- Acts as an information retrieval and processing agent
- Enhances other agents' capabilities with real-time web data
This demo shows the system processing the user query 'bitcoin exchange rate' and automatically transforming it to 'bitcoin price USD' for web search. The system then performs web scraping to gather relevant data and applies reflective reasoning to analyze the information before delivering a comprehensive answer to the user.
Enhanced Reasoning Process
- Multi-stage thought process evaluation
- Confidence scoring for each reasoning step
- Detailed analysis and critique phases
- Comprehensive answer synthesis
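The confidence scoring described above can be sketched as follows. This is a minimal illustration, assuming a per-step score in [0, 1] averaged over the chain; the stage names and the `ReasoningStep`/`chain_confidence` helpers are hypothetical, not the project's actual API.

```python
from dataclasses import dataclass

@dataclass
class ReasoningStep:
    stage: str        # e.g. "analysis", "critique", "refinement" (assumed names)
    content: str      # the model's output for this stage
    confidence: float # 0.0-1.0, assigned after self-evaluation

def chain_confidence(steps: list) -> float:
    """Aggregate per-step confidence into an overall score for the chain."""
    if not steps:
        return 0.0
    return sum(s.confidence for s in steps) / len(steps)

steps = [
    ReasoningStep("initial_thoughts", "The user asks about the BTC price.", 0.6),
    ReasoningStep("analysis", "Search results agree on a current price.", 0.8),
    ReasoningStep("critique", "Sources are recent and consistent.", 0.9),
]
print(round(chain_confidence(steps), 2))  # → 0.77
```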
Integrated Web Search
- Automatic knowledge evaluation
- Multi-iteration search strategy
- Smart query generation
- Source credibility assessment
- Content parsing and analysis
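Source credibility assessment could look something like the heuristic below. This is an illustrative sketch only: the domain lists and score values are assumptions, and the project's actual scoring signals may differ.

```python
from urllib.parse import urlparse

# Assumed examples of higher-trust sources; not taken from the project.
TRUSTED_TLDS = (".gov", ".edu")
KNOWN_DOMAINS = {"reuters.com", "coindesk.com", "wikipedia.org"}

def credibility_score(url: str) -> float:
    """Return a rough 0-1 credibility score based on the URL's domain."""
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host.endswith(TRUSTED_TLDS):
        return 0.9
    if host in KNOWN_DOMAINS:
        return 0.8
    if url.startswith("https://"):  # HTTPS alone is a weak positive signal
        return 0.5
    return 0.3

print(credibility_score("https://www.coindesk.com/price/bitcoin"))  # → 0.8
```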
Conversation Management
- Dynamic context management
- Customizable system prompts
- Conversation history tracking
- Reasoning chain visualization
- Asynchronous processing using asyncio and aiohttp
- Caching system for search results and parsed content
- Multiple website parsing strategies (static/dynamic)
- Structured thought process using enum-based stages
- Comprehensive error handling and recovery
- Modular architecture for easy extension
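The caching of search results mentioned above might be implemented as a simple TTL cache. The sketch below is a minimal assumption of how such a layer could behave; the project's actual cache interface is not shown in this README.

```python
import time

class SearchCache:
    """Time-limited cache mapping search queries to their results."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._store = {}  # query -> (timestamp, results)

    def get(self, query: str):
        entry = self._store.get(query)
        if entry is None:
            return None
        ts, results = entry
        if time.monotonic() - ts > self.ttl:  # entry expired, evict it
            del self._store[query]
            return None
        return results

    def put(self, query: str, results):
        self._store[query] = (time.monotonic(), results)

cache = SearchCache(ttl_seconds=60)
cache.put("bitcoin price USD", ["result A", "result B"])
print(cache.get("bitcoin price USD"))  # → ['result A', 'result B']
```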
- Python 3.7+
- Ollama with Gemma2:9B model
- Required Python packages:
- aiohttp
- requests
- beautifulsoup4
- duckduckgo_search
- scrapy
- Clone the repository:
git clone https://github.com/kazkozdev/net-reflective-reasoning-llm.git
cd net-reflective-reasoning-llm
- Install dependencies:
pip install -r requirements.txt
- Ensure Ollama is installed and the Gemma2:9B model is available:
# Install Ollama from https://ollama.ai
ollama pull gemma2:9b
Run the main script:
python src/main.py
Import and use as a web search agent in your multi-agent system:
from src.net_reflective_llm import advancedgptlike
# Initialize as web search agent
search_agent = advancedgptlike(model_name="gemma2:9b")
# Use in async context
async def example():
    response, reasoning = await search_agent.model.process_query("your query here")
    return response, reasoning
- clear - Reset conversation history
- explain - View detailed reasoning chain for the last response
- system <prompt> - Update the system prompt
- quit or exit - Exit the program
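A dispatcher for these interactive commands could be sketched as below. This is a hypothetical illustration; the real handling lives in src/main.py and may differ in names and return values.

```python
def handle_command(line: str, history: list) -> str:
    """Route one interactive command; returns a status message (assumed behavior)."""
    cmd, _, arg = line.strip().partition(" ")
    if cmd == "clear":
        history.clear()
        return "history cleared"
    if cmd == "explain":
        return "showing reasoning chain"
    if cmd == "system":
        return f"system prompt set to: {arg}"
    if cmd in ("quit", "exit"):
        return "bye"
    return "unknown command"

history = ["previous turn"]
print(handle_command("clear", history))  # → history cleared
print(len(history))                      # → 0
```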
The system consists of several key components:
- EnhancedLLM: Core class managing LLM interactions and reasoning process
- WebSearchManager: Handles web searches and content parsing
- Conversation: Manages conversation history and context
- ReasoningChain: Tracks and structures the reasoning process
- Initial Thoughts
- Search Required
- Analysis
- Critique
- Refinement
- Final Answer
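The README notes a "structured thought process using enum-based stages"; an enum mirroring the six stages above might look like this. The member names and the `next_stage` helper are assumptions for illustration, not the project's exact definitions.

```python
from enum import Enum, auto

class ThoughtStage(Enum):
    INITIAL_THOUGHTS = auto()
    SEARCH_REQUIRED = auto()
    ANALYSIS = auto()
    CRITIQUE = auto()
    REFINEMENT = auto()
    FINAL_ANSWER = auto()

def next_stage(stage: ThoughtStage) -> ThoughtStage:
    """Advance to the next stage; FINAL_ANSWER is terminal."""
    members = list(ThoughtStage)
    idx = members.index(stage)
    return members[min(idx + 1, len(members) - 1)]

print(next_stage(ThoughtStage.CRITIQUE).name)  # → REFINEMENT
```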
Contributions are welcome! Please read our Contributing Guidelines for details on how to submit pull requests, report issues, and contribute to the project.
- Open an issue for bug reports or feature requests
- Join the discussion for questions or ideas
This project is licensed under the MIT License - see the LICENSE file for details.
- Built on the Ollama framework
- Uses the Gemma2:9B model
- Inspired by advanced reasoning techniques in AI systems