Package uno.anahata.ai
Provides the core, high-level classes that orchestrate the AI chat application.
This package contains the central components responsible for managing the application's lifecycle, configuration, and the primary interaction loop with the Gemini API. It forms the foundational layer of the AI assistant's operation.
Key Classes:
- Chat: The main engine of the application. It orchestrates the entire chat session: managing the conversation loop, processing user input, handling tool calls, interacting with the context, and communicating with the generative model.
- ChatMessage: A rich, stateful representation of a single message within the chat history. It serves as the core data model, encapsulating the content, role, usage metadata and, crucially, the dependency relationships between different parts of a message (e.g., a FunctionCall and its corresponding FunctionResponse).
- AnahataConfig: A utility class for managing global application paths, specifically locating and creating the central working directory (~/.anahata/ai-assistant).
- AnahataExecutors: A factory for creating dedicated, session-specific cached thread pools. The threads it creates are daemon threads, ensuring they do not prevent the application from shutting down.
- Executors: Provides a general-purpose, shared, cached thread pool with non-daemon threads for background tasks not tied to a specific chat session's lifecycle.
- MessageRole: A type-safe enum representing the author of a message (USER, MODEL, or TOOL).
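The daemon-thread behaviour described for AnahataExecutors can be sketched with standard java.util.concurrent primitives. This is an illustrative sketch, not the framework's actual code: the class name DaemonPools, the method newSessionPool, and the thread-naming scheme are all assumptions; only the daemon-thread requirement comes from the package description.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.atomic.AtomicInteger;

public class DaemonPools {

    /**
     * Creates a cached thread pool whose threads are daemons, so they
     * never block JVM shutdown (the behaviour the package description
     * attributes to AnahataExecutors). Names are illustrative.
     */
    public static ExecutorService newSessionPool(String sessionId) {
        AtomicInteger counter = new AtomicInteger();
        ThreadFactory factory = runnable -> {
            Thread t = new Thread(runnable, sessionId + "-worker-" + counter.incrementAndGet());
            t.setDaemon(true); // key point: daemon threads do not keep the JVM alive
            return t;
        };
        return Executors.newCachedThreadPool(factory);
    }
}
```

A cached pool fits the session use case because idle worker threads are reclaimed after a timeout, so a quiet chat session holds no threads.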
Class Summary:
- AnahataConfig: Manages the global configuration and working directory for the Anahata AI framework.
- AnahataExecutors: A utility class for creating and managing named ExecutorService instances for various asynchronous tasks within the Anahata AI framework.
- Chat: The central orchestrator for a Gemini AI chat session.
- ChatMessage: A rich, stateful representation of a single message in the chat history.
- Executors: Provides a shared, cached thread pool for general-purpose asynchronous tasks.
- MessageRole: A type-safe enum representing the role of the author of a ChatMessage.
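The role enum and the call/response dependency that ChatMessage is described as modelling can be sketched as below. Everything except the MessageRole constants (USER, MODEL, TOOL, which the package description lists) is an assumption: the nested Message record, its fields, and the of/reply helpers are hypothetical illustrations, not the framework's API.

```java
import java.util.Optional;

public class MessageModel {

    /** Type-safe author roles, as listed in the package description. */
    public enum MessageRole { USER, MODEL, TOOL }

    /**
     * Illustrative message record (not the real ChatMessage): a TOOL
     * message carrying a FunctionResponse can point back at the MODEL
     * message whose FunctionCall it answers, modelling the dependency
     * relationship the package description mentions.
     */
    public record Message(MessageRole role, String content, Optional<Message> inResponseTo) {

        /** A standalone message with no dependency. */
        public static Message of(MessageRole role, String content) {
            return new Message(role, content, Optional.empty());
        }

        /** A message that depends on this one (e.g. a tool response to a tool call). */
        public Message reply(MessageRole role, String content) {
            return new Message(role, content, Optional.of(this));
        }
    }
}
```

Keeping the dependency as an explicit link (rather than relying on list order) lets history pruning drop a FunctionCall and its FunctionResponse together, which is one plausible reason to track the relationship at all.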
