
C++ API Reference

Complete reference for the C++ API.

Namespace

import mcpplibs.llmapi;
using namespace mcpplibs::llmapi;

Client Class

Constructor

Client(std::string_view apiKey, std::string_view baseUrl = URL::OpenAI)
Client(const char* apiKey, std::string_view baseUrl = URL::OpenAI)

Parameters:

  • apiKey - API key (can be from std::getenv())
  • baseUrl - Base URL (see Providers)

Example:

Client client(std::getenv("OPENAI_API_KEY"), URL::Poe);

Configuration Methods

All configuration methods return Client& for chaining.

model()

Client& model(std::string_view model)

Set the model name.

Example:

client.model("gpt-5");

Message Methods

All message methods return Client& for chaining.

user()

Client& user(std::string_view content)

Add a user message.

Example:

client.user("What is C++?");

system()

Client& system(std::string_view content)

Add a system message (usually for initial instructions).

Example:

client.system("You are a helpful assistant.");

assistant()

Client& assistant(std::string_view content)

Add an assistant message (usually for few-shot examples or manual history).

Note: Normally not needed; assistant replies are saved to history automatically.

Example:

client.assistant("I understand.");

add_message()

Client& add_message(std::string_view role, std::string_view content)

Add a message with custom role.

Example:

client.add_message("user", "Hello");

clear()

Client& clear()

Clear all conversation history.

Example:

client.clear();

Request Methods

request() - Non-Streaming

Json request()

Execute a non-streaming request. Returns full JSON response. Automatically saves assistant reply to history.

Returns: nlohmann::json object with full API response

Example:

auto response = client.user("Hello").request();
std::println("{}", response["choices"][0]["message"]["content"].get<std::string>());

request(callback) - Streaming

template<StreamCallback Callback>
void request(Callback&& callback)

Execute a streaming request. Automatically saves complete assistant reply to history.

Parameters:

  • callback - Function accepting std::string_view (each content chunk)

Example:

client.user("Tell me a story").request([](std::string_view chunk) {
    std::print("{}", chunk);
    std::cout.flush();
});

Getter Methods

getAnswer()

std::string getAnswer() const

Get the last assistant reply from conversation history.

Returns: Last assistant message content, or empty string if none

Example:

client.request();
std::string answer = client.getAnswer();
std::println("Last answer: {}", answer);

getMessages()

Json getMessages() const

Get full conversation history as JSON array.

Returns: JSON array of all messages

Example:

auto history = client.getMessages();
for (const auto& msg : history) {
    std::println("{}: {}", msg["role"].get<std::string>(), msg["content"].get<std::string>());
}

getMessageCount()

int getMessageCount() const

Get total number of messages in conversation history.

Returns: Number of messages

Example:

std::println("Messages: {}", client.getMessageCount());

getApiKey()

std::string_view getApiKey() const

Get the API key.

getBaseUrl()

std::string_view getBaseUrl() const

Get the base URL.

getModel()

std::string_view getModel() const

Get the current model name.
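
Example (a usage sketch for the getters, assuming the client was configured as above):

```cpp
std::println("Base URL: {}", client.getBaseUrl());
std::println("Model:    {}", client.getModel());
```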

StreamCallback Concept

template<typename F>
concept StreamCallback = std::invocable<F, std::string_view> &&
                         std::same_as<std::invoke_result_t<F, std::string_view>, void>;

Type constraint for streaming callbacks. Accepts any callable that:

  • Takes std::string_view parameter
  • Returns void

Valid callbacks:

// Lambda
[](std::string_view chunk) { std::print("{}", chunk); }

// Function
void my_callback(std::string_view chunk) { /* ... */ }

// Functor
struct Printer {
    void operator()(std::string_view chunk) { /* ... */ }
};

JSON Type

using Json = nlohmann::json;

The library uses nlohmann/json for JSON handling.

Complete Example

import mcpplibs.llmapi;
import std;

int main() {
    using namespace mcpplibs::llmapi;
    
    Client client(std::getenv("OPENAI_API_KEY"), URL::Poe);
    
    // Configure
    client.model("gpt-5")
          .system("You are a helpful assistant.");
    
    // First question (non-streaming)
    client.user("What is C++?");
    client.request();
    std::println("Answer 1: {}", client.getAnswer());
    
    // Follow-up (streaming) - uses conversation history
    client.user("Tell me more");
    std::print("Answer 2: ");
    client.request([](std::string_view chunk) {
        std::print("{}", chunk);
        std::cout.flush();
    });
    std::println("\n");
    
    // Check history
    std::println("Total messages: {}", client.getMessageCount());
    
    return 0;
}

Error Handling

All methods may throw exceptions:

  • std::runtime_error - API errors, network errors
  • std::invalid_argument - Invalid parameters
  • nlohmann::json::exception - JSON parsing errors

Example:

try {
    client.user("Hello").request();
} catch (const std::runtime_error& e) {
    std::println("Error: {}", e.what());
}
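
Since nlohmann::json::exception derives from std::exception, a single catch of std::exception covers all three listed types; to handle JSON errors separately, catch them first. A sketch:

```cpp
try {
    auto response = client.user("Hello").request();
} catch (const nlohmann::json::exception& e) {
    std::println("JSON error: {}", e.what());
} catch (const std::exception& e) {
    std::println("Request failed: {}", e.what());
}
```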