optillm-core Introduction¶
optillm-core is the foundational library that provides the shared traits, types, and interfaces used by all OptimLLM implementations.
What is optillm-core?¶
It provides:
- Traits: Abstract interfaces for extending functionality
- Types: Common data structures used across implementations
- Error Handling: Unified error types
- Utilities: Helper functions and macros
Key Components¶
Traits¶
- Optimizer: Interface for optimization techniques
- ModelClient: Interface for LLM provider communication
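The two traits work together: an Optimizer drives one or more calls through a ModelClient. As a rough illustration of that relationship, here is a simplified, synchronous sketch; the type and method names below are assumptions for illustration, not the crate's actual async API.

```rust
// Simplified stand-ins for the crate's types (field names are assumed).
struct Prompt { text: String }
struct Solution { answer: String }

// Sketch of the provider-communication trait (the real one is async).
trait ModelClient {
    fn complete(&self, prompt: &Prompt) -> String;
}

// Sketch of the optimization-technique trait.
trait Optimizer {
    fn optimize(&self, query: &str, client: &dyn ModelClient) -> Solution;
}

// A trivial client that echoes the prompt back.
struct EchoClient;
impl ModelClient for EchoClient {
    fn complete(&self, prompt: &Prompt) -> String {
        format!("echo: {}", prompt.text)
    }
}

// A trivial "technique" that forwards the query unchanged.
struct PassThrough;
impl Optimizer for PassThrough {
    fn optimize(&self, query: &str, client: &dyn ModelClient) -> Solution {
        let prompt = Prompt { text: query.to_string() };
        Solution { answer: client.complete(&prompt) }
    }
}

fn main() {
    let solution = PassThrough.optimize("2+2?", &EchoClient);
    println!("{}", solution.answer);
}
```

The separation lets any optimization technique run against any provider: the Optimizer only sees the ModelClient interface, never a concrete backend.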
Types¶
- Solution: Result from optimization containing reasoning and answer
- Prompt: Request structure for LLM calls
- ResponseEvent: Events streamed from LLM responses
- TokenUsage: Token consumption tracking
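To make those roles concrete, here is a hedged sketch of how such types could be shaped; every field and variant name is an illustrative assumption, not the crate's real definition.

```rust
// Assumed shape of a request: an optional system message plus the user query.
#[derive(Debug, Clone)]
struct Prompt {
    system: Option<String>,
    user: String,
}

// Assumed shape of a streamed response: text chunks, then an end marker.
#[derive(Debug)]
enum ResponseEvent {
    Delta(String),
    Done,
}

// Assumed token accounting, split by prompt and completion.
#[derive(Debug, Default)]
struct TokenUsage {
    prompt_tokens: u32,
    completion_tokens: u32,
}

impl TokenUsage {
    fn total(&self) -> u32 {
        self.prompt_tokens + self.completion_tokens
    }
}

// Assumed optimization result: reasoning trace, final answer, and usage.
#[derive(Debug)]
struct Solution {
    reasoning: String,
    answer: String,
    usage: TokenUsage,
}

fn main() {
    let usage = TokenUsage { prompt_tokens: 120, completion_tokens: 80 };
    println!("total tokens: {}", usage.total());
}
```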
Error Types¶
- OptillmError: Comprehensive error enum
- Result<T>: Type alias for Result<T, OptillmError>
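This is the standard Rust pattern of one crate-wide error enum plus a Result alias. A hand-rolled, std-only sketch follows; the crate itself derives this boilerplate with thiserror, and the variant names here are assumptions for illustration.

```rust
use std::fmt;

// Assumed variants; the real OptillmError likely covers more cases.
#[derive(Debug)]
enum OptillmError {
    Provider(String),       // e.g. a failure from the LLM backend
    InvalidPrompt(String),  // e.g. a malformed request
}

impl fmt::Display for OptillmError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            OptillmError::Provider(msg) => write!(f, "provider error: {msg}"),
            OptillmError::InvalidPrompt(msg) => write!(f, "invalid prompt: {msg}"),
        }
    }
}

impl std::error::Error for OptillmError {}

// The crate-wide alias: fallible functions can return Result<T> everywhere.
type Result<T> = std::result::Result<T, OptillmError>;

fn check_prompt(text: &str) -> Result<&str> {
    if text.is_empty() {
        Err(OptillmError::InvalidPrompt("empty prompt".into()))
    } else {
        Ok(text)
    }
}

fn main() {
    match check_prompt("") {
        Ok(p) => println!("ok: {p}"),
        Err(e) => println!("error: {e}"),
    }
}
```

The alias keeps signatures short (`fn run(&self) -> Result<Solution>`) while `?` converts and propagates errors uniformly across the workspace.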
Dependencies¶
- tokio: Async runtime
- async-trait: Async trait support
- serde: Serialization/deserialization
- thiserror: Error handling
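In a consumer crate, these typically sit in Cargo.toml alongside optillm-core itself. The version numbers and feature flags below are illustrative, not pinned by the project:

```toml
# Illustrative versions only; check the workspace's Cargo.toml for the real ones.
[dependencies]
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
async-trait = "0.1"
serde = { version = "1", features = ["derive"] }
thiserror = "1"
```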
When to Use optillm-core¶
Use optillm-core when:
- Creating a new optimization technique
- Implementing a new LLM provider
- Building on top of OptimLLM functionality
- Integrating OptimLLM into your system
Integration Pattern¶
```toml
# Cargo.toml: add the dependency
[dependencies]
optillm-core = { path = "../core" }
```

```rust
use async_trait::async_trait;

// Import the shared traits and types
use optillm_core::{
    Optimizer, ModelClient, Solution, Prompt,
    ResponseEvent, Result,
};

// Implement the trait for your optimizer
#[async_trait]
impl Optimizer for MyOptimizer {
    // ... trait methods ...
}

// Use it in your code
let result = my_optimizer.optimize(query, client).await?;
```
Next Steps¶
- Read ModelClient for LLM integration
- Read Optimizer Trait for creating optimizers
- Check Types for data structure details
- Review Error Handling for error patterns
See the API Reference for complete API documentation.