Stop Paying Twice: Bringing Your AI Subscriptions to Rust

I’ve been working on a couple of new Rust crates lately that solve a problem I kept running into: how do you use your Anthropic or OpenAI subscriptions in your own projects without needing to mess around with API keys?

The Problem

Most tools that interact with Claude or ChatGPT expect you to have an API key. But what if you already have a Claude Pro/Max subscription or a ChatGPT Plus account? You’re already paying for access, but you still need to set up a separate API key (and often pay separately) just to use these services programmatically.

Enter OAuth 2.0 authentication with PKCE support.
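
For context, PKCE (Proof Key for Code Exchange) is what lets a native app run the OAuth authorization-code flow without a client secret: the app generates a random code verifier, sends its SHA-256 hash (the code challenge) with the authorization request, and later proves possession of the verifier during the token exchange. The crates introduced below handle all of this internally; the snippet here is just a minimal sketch of the derivation, using the rand, sha2, and base64 crates rather than anything from the crates themselves.

use base64::{engine::general_purpose::URL_SAFE_NO_PAD, Engine};
use rand::RngCore;
use sha2::{Digest, Sha256};

// Derive a PKCE verifier/challenge pair: the challenge is the
// base64url-encoded (unpadded) SHA-256 digest of the random verifier.
fn pkce_pair() -> (String, String) {
    let mut bytes = [0u8; 32];
    rand::thread_rng().fill_bytes(&mut bytes);
    let verifier = URL_SAFE_NO_PAD.encode(bytes);
    let challenge = URL_SAFE_NO_PAD.encode(Sha256::digest(verifier.as_bytes()));
    (verifier, challenge)
}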

Introducing anthropic-auth and openai-auth

I recently released two crates to crates.io that handle the OAuth dance for you:

  • anthropic-auth: OAuth 2.0 authentication for Anthropic (Claude)
  • openai-auth: OAuth 2.0 authentication for OpenAI (ChatGPT)

Both crates offer sync and async APIs, PKCE support for secure authentication, and let you leverage your existing subscriptions without needing separate API keys.

How to Use Them

anthropic-auth

Add it to your Cargo.toml:

[dependencies]
anthropic-auth = "0.1"

Then authenticate with your Claude subscription:

use anthropic_auth::{OAuthClient, OAuthConfig, OAuthMode};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = OAuthClient::new(OAuthConfig::default())?;
    let flow = client.start_flow(OAuthMode::Max)?;
    
    println!("Visit: {}", flow.authorization_url);
    // User authorizes and receives: "code#state"
    let response = "l0pnTslNFOmTgp28REMrbt4wyLNR25SJePqjk4CAHjoen0TJ#FgE6g_6khGKFFhXAw3tULPM00CPaqgE3Cq6id79Surg";
    
    let tokens = client.exchange_code(response, &flow.state, &flow.verifier)?;
    println!("Got access token!");
    
    // Later, refresh if needed (refresh_token returns a fresh TokenSet)
    if tokens.is_expired() {
        let _new_tokens = client.refresh_token(&tokens.refresh_token)?;
        println!("Refreshed access token!");
    }
    
    Ok(())
}

The crate supports two OAuth modes:

  • Max mode: Use your Claude Pro/Max subscription for API access
  • Console mode: Programmatically create API keys (sketched below)
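
Switching to the Console flow uses the same start_flow entry point with a different mode. Here is a minimal sketch, with the caveat that the variant name shown (OAuthMode::Console) is assumed to mirror the Max variant used above:

use anthropic_auth::{OAuthClient, OAuthConfig, OAuthMode};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = OAuthClient::new(OAuthConfig::default())?;
    // Console mode drives the flow that provisions an API key,
    // instead of piggybacking on a Pro/Max subscription.
    // Note: OAuthMode::Console is assumed here for illustration.
    let flow = client.start_flow(OAuthMode::Console)?;
    println!("Visit: {}", flow.authorization_url);
    // From here, exchange_code and refresh_token work the same as above.
    Ok(())
}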

openai-auth

Add it to your Cargo.toml:

[dependencies]
openai-auth = "0.1"

Authenticate with your ChatGPT subscription:

use openai_auth::{OAuthClient, OAuthConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = OAuthClient::new(OAuthConfig::default())?;
    let flow = client.start_flow()?;
    
    println!("Visit: {}", flow.authorization_url);
    // User authorizes and receives authorization code
    let response = "code123";
    
    let tokens = client.exchange_code(response, &flow.pkce_verifier).await?;
    println!("Got access token!");
    
    Ok(())
}

Both crates' async APIs are runtime-agnostic: they work with tokio, async-std, smol, or whatever async runtime you prefer, as the sketch below shows.
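
For example, here is the same openai-auth flow driven from smol instead of tokio. The client calls are exactly the ones from the example above; only the executor changes (a sketch, assuming smol is added as a dependency):

use openai_auth::{OAuthClient, OAuthConfig};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // smol::block_on drives the same futures without a tokio runtime.
    smol::block_on(async {
        let client = OAuthClient::new(OAuthConfig::default())?;
        let flow = client.start_flow()?;
        println!("Visit: {}", flow.authorization_url);

        // Paste the authorization code received after visiting the URL.
        let response = "code123";
        let _tokens = client.exchange_code(response, &flow.pkce_verifier).await?;
        println!("Got access token!");
        Ok(())
    })
}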

A Word About Token Storage Security

Now, here’s where things get interesting - and a bit concerning.

While working on these crates, I noticed that codex, a popular CLI tool, stores authentication tokens in plain text at ~/.codex/auth.json. This is… not great. A quick GitHub search for path:*codex/auth.json reveals numerous repositories where developers have accidentally committed their auth tokens to public repositories.

This is a real security problem. Anyone with access to these tokens can impersonate you, use your API quota, and potentially access sensitive information.

The Right Way: Use System Keyring

Neither anthropic-auth nor openai-auth handles token persistence; that's by design. We deliberately left this up to you, because there's already a proper solution: the keyring crate.

Instead of storing tokens in plain text files, use your operating system’s secure credential storage:

  • macOS: Keychain
  • Linux: Secret Service API / libsecret
  • Windows: Credential Manager

Here’s how we’re doing it in the upcoming release of qmt CLI:

use std::io;

use anthropic_auth::TokenSet;
use keyring::Entry;

const SERVICE_NAME: &str = "querymt-cli";

/// Thin wrapper around the OS keyring (Keychain, Secret Service, Credential Manager).
pub struct SecretStore {}

impl SecretStore {
    pub fn new() -> io::Result<Self> {
        Ok(SecretStore {})
    }

    pub fn set(&mut self, key: impl Into<String>, value: impl Into<String>) -> io::Result<()> {
        let key = key.into();
        let value = value.into();

        let entry = Entry::new(SERVICE_NAME, &key)
            .map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?;

        entry
            .set_password(&value)
            .map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))
    }

    pub fn get(&self, key: &str) -> Option<String> {
        let entry = Entry::new(SERVICE_NAME, key).ok()?;
        entry.get_password().ok()
    }

    pub fn set_oauth_tokens(&mut self, provider: &str, tokens: &TokenSet) -> io::Result<()> {
        let tokens_json = serde_json::to_string(tokens)
            .map_err(|e| io::Error::new(io::ErrorKind::InvalidData, e.to_string()))?;

        self.set(format!("oauth_{}", provider), tokens_json)
    }

    pub fn get_oauth_tokens(&self, provider: &str) -> Option<TokenSet> {
        let tokens_json = self.get(&format!("oauth_{}", provider))?;
        serde_json::from_str(&tokens_json).ok()
    }
}
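
Wiring it together is then just a couple of calls. This is illustrative glue code, reusing the tokens value from the anthropic-auth example earlier:

// Persist the tokens after the OAuth exchange, and read them back on the
// next run instead of sending the user through the browser flow again.
let mut store = SecretStore::new()?;
store.set_oauth_tokens("anthropic", &tokens)?;

if let Some(saved) = store.get_oauth_tokens("anthropic") {
    println!("Loaded stored tokens (expired: {})", saved.is_expired());
}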

This approach means your tokens are stored using the same secure mechanisms that store your browser passwords, SSH keys, and other sensitive credentials. No more plain text files sitting in your home directory waiting to be accidentally committed to Git.

Wrapping Up

If you’re building Rust applications that need to authenticate with Anthropic or OpenAI, check out these crates. They handle the OAuth complexity for you, support both sync and async workflows, and let you use your existing subscriptions.

Just remember: never store authentication tokens in plain text. Use the keyring crate or another secure credential storage system. Your future self (and your security team) will thank you.

Both crates are available on crates.io, and the source code is on GitHub.

Happy authenticating!
