oauth2-passkey Documentation
Drop-in OAuth2 and Passkey authentication for Rust web applications.
Why OAuth2 + Passkey?
Password authentication is fundamentally flawed - even strong, unique passwords are vulnerable to phishing, brute-force attacks, and server-side breaches. 2FA adds complexity without fixing the root cause.
This library avoids passwords entirely:
- Register with Google OAuth2 - One-click signup, no password to create
- Add a Passkey - Register biometric authentication (fingerprint, face)
- Login with Passkey - Fast, phishing-resistant daily authentication
- OAuth2 as backup - Recovery option if device is lost
After authentication, the library issues a secure session cookie to maintain login state. No password management. No 2FA implementation. Better security.
Getting Started
New to oauth2-passkey? Start here:
- Introduction - Why this approach works
- Quick Start - Prerequisites and running demos
- Architecture - System components and data flow
Introduction
What is oauth2-passkey?
oauth2-passkey is a passwordless authentication library for Rust web applications. Password authentication is fundamentally flawed - even strong, unique passwords are vulnerable to phishing, brute-force attacks, and server-side breaches. This library avoids passwords entirely.
Intended workflow: Users register with Google OAuth2, then add a Passkey for fast, phishing-resistant daily login. OAuth2 remains as a backup if the device is lost. After authentication, the library issues a secure session cookie to maintain login state.
Key Features
- Passkey - Phishing-resistant login with biometrics, inherently multi-factor (no 2FA needed)
- Google OAuth2 - One-click registration and backup authentication
- Account linking - Users can add multiple login methods to one account
- Minimal setup - Works with SQLite out of the box, scales to PostgreSQL + Redis
Supported Authentication Methods
OAuth2/OpenID Connect (Google)
The library provides full OAuth2/OIDC integration with Google, allowing users to authenticate using their existing Google accounts. This is the familiar “Sign in with Google” flow that users expect from modern web applications.
WebAuthn/Passkey
WebAuthn (Web Authentication) enables passwordless authentication using passkeys. Users can register and authenticate using:
- Platform authenticators (Touch ID, Face ID, Windows Hello)
- Security keys (YubiKey, etc.)
- Cross-device authentication via smartphones
Both authentication methods can be used independently or together, giving users flexibility in how they access their accounts.
Use Cases
Web Application Authentication
Add secure authentication to any Rust web application built with the Axum framework. The library handles:
- User registration and login flows
- Session management
- Secure cookie handling
Multiple Authentication Methods
Allow users to choose their preferred authentication method:
- First-time users can register with Google OAuth2 OR create a Passkey
- Existing users can add additional login methods to their account
- Authentication works with any linked method (OAuth2 or Passkey)
Secure Session Management
The library provides built-in session management with:
- Secure session cookies
- CSRF protection
- Configurable session expiration
- Support for both development (in-memory) and production (Redis) session stores
Account Administration
The first registered user is automatically promoted to admin and can manage other users’ accounts.
Target Audience
Rust Web Developers
This library is designed for Rust developers building web applications who need authentication functionality without implementing it from scratch. It provides:
- Clean, idiomatic Rust APIs
- Comprehensive error handling
- Minimal dependencies
Axum Framework Users
The oauth2-passkey-axum crate provides seamless integration with the Axum web framework:
- Ready-to-use route handlers
- Built-in static assets (JS/CSS) for login UI
- HTML templates for authentication pages
- Extractors for accessing authenticated user information
Why OAuth2 + Passkey?
Intended workflow: Users create an account with Google OAuth2, then register a Passkey for daily login. OAuth2 serves as the initial registration method and backup.
Password Authentication is Fundamentally Flawed
Password-based authentication has inherent design flaws that cannot be fixed by better implementation:
- Weak passwords - Users choose predictable passwords (123456, password, etc.). No amount of complexity rules can change human behavior.
- Password reuse - Users reuse passwords across sites, making credential stuffing attacks effective
- Phishing vulnerability - Users can be tricked into entering passwords on fake sites
- 2FA is a band-aid - Two-factor authentication exists because passwords alone are insufficient. It adds complexity without addressing the root cause.
Our Solution: OAuth2 for Registration, Passkey for Login
This library is designed for a specific workflow:
1. Initial Registration with Google OAuth2
   - Users sign up with one click using their Google account
   - No password to create or remember
   - Google handles the authentication security
2. Register a Passkey
   - After registration, prompt users to add a Passkey
   - Uses device biometrics (fingerprint, face) or a security key
   - Stored securely on the user’s device
3. Daily Login with Passkey
   - Fast biometric authentication (1-2 seconds)
   - Phishing-resistant (bound to your domain)
   - Works without contacting Google
4. OAuth2 as Backup
   - If a device is lost, Google OAuth2 still works
   - The user can register a new Passkey on the new device
Benefits
- No password management - You never store or validate passwords
- No 2FA implementation needed - Passkey is inherently multi-factor (device possession + biometrics)
- Phishing resistant - Passkeys are cryptographically bound to origin
- Fast login - Biometric authentication in seconds
- Resilient - Multiple auth methods provide fallback options
- Reduced attack surface - No password database to breach
Technical Highlights
- Beginner-friendly - Works out of the box with SQLite
- Production-ready - Scales to PostgreSQL + Redis
- Security built-in - CSRF protection, secure sessions, __Host- cookie prefix
- Minimal dependencies - Careful dependency selection
Next Steps
Continue to the next chapter to learn about the library architecture and how the components work together.
Quick Start
This guide walks you through running the demo applications to quickly experience OAuth2 and WebAuthn/Passkey authentication.
Prerequisites
- Rust toolchain: Latest stable version
- Web browser: Chrome (recommended, fewest issues), Android Chrome, or iOS Safari
Running Demo Applications
The repository includes several demo applications to showcase different authentication scenarios.
demo-both (OAuth2 + Passkey)
A complete authentication example showcasing both Google OAuth2 and WebAuthn/Passkey authentication in a single integrated application.
Features:
- Dual authentication methods (Google OAuth2 and WebAuthn/Passkey)
- Session management with CSRF protection
- User management and registration
- Admin interface for user administration
Setup

1. Choose Your ORIGIN

   Decide how you’ll access the application:

   - localhost (default): http://localhost:3001 - for local desktop testing (localhost is a secure context)
   - tunnel (mobile testing): Use Cloudflare Tunnel to get a public URL like https://random-name.trycloudflare.com

   Note: If using a tunnel, set it up first to get your URL before proceeding.

2. Get Google OAuth2 Credentials

   - Go to Google Cloud Console
   - Create OAuth2 credentials (Web application)
   - Add {YOUR_ORIGIN}/o2p/oauth2/authorized to “Authorized redirect URIs”
     - For localhost: http://localhost:3001/o2p/oauth2/authorized
     - For tunnel: https://random-name.trycloudflare.com/o2p/oauth2/authorized
   - Copy the Client ID and Client Secret

3. Environment Setup

   cd demo-both
   cp ../dot.env.simple .env

   Edit .env with your configuration:

   # Required: Base URL of your application (must match step 1)
   ORIGIN='http://localhost:3001'

   # Required: Google OAuth2 credentials (from step 2)
   OAUTH2_GOOGLE_CLIENT_ID='your-client-id.apps.googleusercontent.com'
   OAUTH2_GOOGLE_CLIENT_SECRET='your-client-secret'

   # Database (SQLite for easy setup)
   GENERIC_DATA_STORE_TYPE=sqlite
   GENERIC_DATA_STORE_URL='sqlite:/tmp/auth.db'

   # Cache (in-memory for demo)
   GENERIC_CACHE_STORE_TYPE=memory
   GENERIC_CACHE_STORE_URL='memory'

4. Run the Demo

   cargo run

   The application starts on:

   - HTTP: Port 3001 (access as http://localhost:3001)

   Note: localhost is a secure context, so WebAuthn/Passkey works over HTTP. For production, use a reverse proxy (nginx/Caddy) to provide HTTPS.

5. Try the Demo

   - Visit your ORIGIN URL (e.g., http://localhost:3001 or your tunnel URL)
   - Create a user with Google OAuth2 or Passkey
   - Navigate to the user account page: {YOUR_ORIGIN}/o2p/user/account
   - Add a new Passkey or OAuth2 account
   - Log out and sign in with a different method
   - Explore credential linking and protected pages (p1-p6)
   - Admin features: The first user gets admin privileges at {YOUR_ORIGIN}/o2p/admin/index
Other Demo Applications
| Demo | Description | Notes |
|---|---|---|
| demo-oauth2 | OAuth2-only authentication | Simpler setup, no passkey |
| demo-passkey | Passkey-only authentication | No Google credentials needed |
| demo-custom-login | Custom login/summary pages | See Custom Pages |
| demo-profile | User profile extension | Bio, avatar, theme |
| demo-todo | App data linked to users | CRUD with user isolation |
| demo-cross-origin | Cross-Origin Same-Site (Pattern 2) | Auth + API servers |
Setup is similar to demo-both: copy .env from dot.env.simple and adjust it for each demo’s needs.
Basic Configuration
Common Environment Variables
| Variable | Required | Description |
|---|---|---|
| ORIGIN | Yes | Base URL of your application (e.g., http://localhost:3001) |
| OAUTH2_GOOGLE_CLIENT_ID | For OAuth2 | Google OAuth2 client ID |
| OAUTH2_GOOGLE_CLIENT_SECRET | For OAuth2 | Google OAuth2 client secret |
| GENERIC_DATA_STORE_TYPE | Yes | Database type: sqlite or postgresql |
| GENERIC_DATA_STORE_URL | Yes | Database connection URL |
| GENERIC_CACHE_STORE_TYPE | Yes | Cache type: memory or redis |
| GENERIC_CACHE_STORE_URL | Yes | Cache connection URL |
Development vs Production
Development (SQLite + Memory)
GENERIC_DATA_STORE_TYPE=sqlite
GENERIC_DATA_STORE_URL='sqlite:./auth.db'
GENERIC_CACHE_STORE_TYPE=memory
GENERIC_CACHE_STORE_URL='memory://demo'
Production (PostgreSQL + Redis)
GENERIC_DATA_STORE_TYPE=postgresql
GENERIC_DATA_STORE_URL='postgresql://user:pass@localhost/dbname'
GENERIC_CACHE_STORE_TYPE=redis
GENERIC_CACHE_STORE_URL='redis://localhost:6379'
To start PostgreSQL and Redis with Docker:
cd db && docker compose up -d
Troubleshooting
Common Issues
- “Invalid origin” error
  - Ensure ORIGIN in .env matches the URL you’re visiting exactly
  - Use http://localhost:3001 (not 127.0.0.1)
- Google OAuth2 not working
  - Check your Google OAuth2 credentials in .env
  - Verify authorized origins and redirect URIs in Google Cloud Console
- WebAuthn/Passkey not working
  - WebAuthn requires a secure context (localhost or HTTPS)
  - Try a different browser (Chrome has the best WebAuthn support)
  - Clear browser data for localhost if needed
- “Authenticator not found” error
  - Ensure your device has biometric capabilities enabled
  - Try using a security key if available
- Database errors
  - The SQLite database is created automatically
  - Delete the database file to reset: rm auth.db
  - Ensure the path in GENERIC_DATA_STORE_URL is writable
Development Tips
- Logs: Check console output for detailed error messages
- Reset database: Delete auth.db to clear all sessions and credentials
Architecture
Overview
This chapter describes the architecture of the oauth2-passkey library.
Current Components
- demo-both: Example Axum application that uses both OAuth2 and passkey authentication
- demo-oauth2: Example Axum application using OAuth2-only authentication
- demo-passkey: Example Axum application using passkey-only authentication
- demo-custom-login: Example Axum application with custom login and account pages
- demo-profile: Example Axum application demonstrating user profile extension
- demo-todo: Example Axum application demonstrating app data linked to users
- demo-cross-origin: Example Axum application demonstrating cross-origin authentication (Auth + API servers)
- oauth2_passkey_axum: Provides OAuth2 and passkey authentication handlers for Axum applications
- Includes routers for OAuth2, passkey, and user account endpoints
- Handles HTTP-specific concerns like request/response handling
- oauth2_passkey: Core authentication coordination library
- config: Environment variable configuration management
- coordination: Central coordination layer that orchestrates authentication flows
- oauth2: OAuth2 authentication operations, stores OAuth2 accounts
- passkey: Passkey/WebAuthn operations, stores passkey credentials
- session: Session management using cache store, provides session cookies
- storage: Cache and SQL store providers (PostgreSQL and SQLite support)
- userdb: User database operations, provides user_id management
- utils: Common utility functions
- test_utils: Testing utilities for unit and integration tests
Component Responsibilities
oauth2_passkey (Core Library)
-
coordination: Provides a unified API for authentication operations
- Orchestrates the authentication flows between different modules
- Handles error mapping and coordination between components
- Exposes high-level functions for authentication operations
-
oauth2: Handles OAuth2 authentication
- Manages OAuth2 provider integration
- Stores and retrieves OAuth2 accounts
- Handles OAuth2 authentication flow (authorization, token exchange)
-
passkey: Handles WebAuthn/Passkey authentication
- Manages passkey registration and authentication
- Stores and retrieves passkey credentials
- Implements WebAuthn protocol for credential verification
-
session: Manages user sessions
- Creates and validates session tokens
- Handles session cookies and page session tokens
- Provides user information from sessions
-
storage: Provides data persistence
- Implements cache storage for temporary data
- Provides SQL database access for persistent data
- Supports both PostgreSQL and SQLite
-
userdb: Manages user accounts
- Creates and updates user records
- Provides user lookup functionality
- Links authentication methods to user accounts
oauth2_passkey_axum (Axum Integration)
- Provides Axum-specific HTTP handlers and routers
- Translates between HTTP requests/responses and core library functions
- Manages authentication middleware for Axum applications
Security Considerations
- Session tokens are securely managed with proper expiration
- Page session tokens provide protection against session desynchronization
- Passkey credentials follow WebAuthn security standards
- OAuth2 implementation follows best practices for authorization flow
Data Flow
┌─────────────────┐
│ Browser │
└────────┬────────┘
│ HTTP Request
▼
┌─────────────────────────────────────────────────────────────────────┐
│ oauth2_passkey_axum │
│ ┌─────────────┐ ┌─────────────┐ ┌─────────────────────────┐ │
│ │ Router │───▶│ Handlers │───▶│ Static Assets (UI) │ │
│ └─────────────┘ └──────┬──────┘ └─────────────────────────┘ │
└────────────────────────────┼────────────────────────────────────────┘
│ calls
▼
┌─────────────────────────────────────────────────────────────────────┐
│ oauth2_passkey │
│ │
│ ┌───────────────────────────────────────────────────────────────┐ │
│ │ coordination │ │
│ │ (orchestrates authentication flows) │ │
│ └───────┬─────────────┬─────────────┬─────────────┬─────────────┘ │
│ │ │ │ │ │
│ ▼ ▼ ▼ ▼ │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────┐ │
│ │ oauth2 │ │ passkey │ │ session │ │ userdb │ │
│ │ │ │ │ │ │ │ │ │
│ │ Google │ │ WebAuthn │ │ cookies │ │ accounts │ │
│ │ OIDC │ │ FIDO2 │ │ tokens │ │ linking │ │
│ └────┬─────┘ └────┬─────┘ └────┬─────┘ └────┬─────┘ │
│ │ │ │ │ │
│ └─────────────┴──────┬──────┴─────────────┘ │
│ ▼ │
│ ┌───────────────────────────────────────────────────────────────┐ │
│ │ storage │ │
│ │ ┌─────────────────────┐ ┌──────────────────────────────┐ │ │
│ │ │ cache_store │ │ data_store │ │ │
│ │ │ (session, CSRF, │ │ (users, credentials, │ │ │
│ │ │ WebAuthn state) │ │ OAuth2 accounts) │ │ │
│ │ └──────────┬──────────┘ └───────────────┬──────────────┘ │ │
│ └─────────────┼───────────────────────────────┼─────────────────┘ │
└────────────────┼───────────────────────────────┼────────────────────┘
│ │
▼ ▼
┌────────────────┐ ┌────────────────┐
│ Memory / Redis │ │ SQLite / PgSQL │
└────────────────┘ └────────────────┘
Request Flow Example (Passkey Authentication)
1. Browser sends an authentication request to /o2p/passkey/auth/start
2. Router dispatches to the passkey handler in oauth2_passkey_axum
3. Handler calls the coordination layer in oauth2_passkey
4. Coordination orchestrates:
   - session validates existing session state
   - passkey generates a WebAuthn challenge (stored in cache_store)
5. Response returns to the browser with the challenge
6. Browser completes the WebAuthn ceremony and sends the assertion
7. passkey verifies the assertion against the stored credential (data_store)
8. session creates an authenticated session (stored in cache_store)
9. userdb retrieves user information
10. Response returns with the session cookie
Key Design Points
- coordination is the central orchestration layer - all auth flows go through it
- oauth2 and passkey modules are independent (no cross-dependencies)
- cache_store handles temporary data (sessions, CSRF tokens, WebAuthn challenges)
- data_store handles persistent data (users, credentials, OAuth2 accounts)
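The cache/data split above can be sketched in plain Rust. This is a conceptual illustration only, not the library’s actual types or API: entries in the cache store carry an expiry (like a Redis TTL), while entries in the data store do not.

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Conceptual sketch (NOT the library's real storage types):
/// cache_store holds short-lived data, data_store holds durable records.
struct CacheStore {
    entries: HashMap<String, (String, Instant)>, // value + expiry
}

impl CacheStore {
    fn new() -> Self {
        Self { entries: HashMap::new() }
    }

    fn put(&mut self, key: &str, value: &str, ttl: Duration) {
        self.entries.insert(key.into(), (value.into(), Instant::now() + ttl));
    }

    /// Expired entries are treated as absent, like a cache TTL.
    fn get(&self, key: &str) -> Option<&str> {
        self.entries.get(key).and_then(|(v, exp)| {
            (Instant::now() < *exp).then_some(v.as_str())
        })
    }
}

struct DataStore {
    users: HashMap<String, String>, // user_id -> label; no expiry
}

fn main() {
    let mut cache = CacheStore::new();
    let mut db = DataStore { users: HashMap::new() };

    // WebAuthn challenge: temporary, belongs in cache_store
    cache.put("challenge:abc", "random-bytes", Duration::from_secs(60));
    // User record: persistent, belongs in data_store
    db.users.insert("user-1".into(), "Alice".into());

    assert_eq!(cache.get("challenge:abc"), Some("random-bytes"));
    assert_eq!(db.users.get("user-1").map(String::as_str), Some("Alice"));
}
```

The point of the split is lifecycle, not just speed: losing the cache store logs users out and invalidates in-flight challenges, but loses no accounts or credentials.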
Why Singleton Pattern Instead of Axum State
In typical Axum applications, shared resources (database pools, caches) are passed to handlers via the State<T> extractor. This library takes a different approach: it uses global static storage initialized by init().
This design means:
- For library users: You don’t need to manage state - just call init().await? and merge the router
- Internally: Any function can access configuration and database connections without threading state through every function argument
oauth2_passkey_axum::init().await?;
let app = Router::new()
.merge(oauth2_passkey_full_router());
This provides a simpler API at the cost of some flexibility. For a detailed comparison of both approaches and the trade-offs involved, see Storage Pattern.
Future Directions
- Further consolidation of related functionality
- Enhanced error handling and logging with standardization on thiserror
- Additional authentication methods
- Improved documentation and examples
Basic Setup
This chapter explains how to integrate oauth2-passkey into your Axum application.
Axum Integration
Basic Setup
use axum::{Router, routing::get};
use oauth2_passkey_axum::{init, oauth2_passkey_full_router, AuthUser}; // [1]
async fn protected(user: AuthUser) -> String { // [2]
format!("Hello, {}!", user.label)
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
init().await?; // [3]
let app = Router::new()
.route("/", get(|| async { "Public page" }))
.route("/protected", get(protected))
.merge(oauth2_passkey_full_router()); // [4]
let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await?;
axum::serve(listener, app).await?;
Ok(())
}
Key Components
| # | Component | Description |
|---|---|---|
| [1] | use statement | Imports the necessary components from oauth2_passkey_axum. |
| [2] | AuthUser | Axum extractor for authenticated user information. |
| [3] | init() | Initializes database and cache connections. Must be called before serving requests. |
| [4] | oauth2_passkey_full_router() | Adds all auth routes under the /o2p prefix. |
Protected Routes
Use AuthUser as an extractor to require authentication:
use oauth2_passkey_axum::AuthUser;
// Requires authentication - redirects to login if not authenticated
async fn dashboard(user: AuthUser) -> String {
format!("Welcome, {}! (user_id: {})", user.label, user.user_id)
}
// AuthUser fields:
// - user_id: Unique user identifier
// - label: Display name
// - account: Account identifier
// - csrf_token: CSRF token for forms
// - is_admin: Whether user has admin privileges
For more protection methods (optional authentication, middleware-based protection), see Route Protection.
Available Endpoints
The oauth2_passkey_full_router() provides these endpoint groups under O2P_ROUTE_PREFIX (default: /o2p):
| Path | Description |
|---|---|
| /oauth2/... | OAuth2 authentication (login, callback) |
| /passkey/... | WebAuthn/Passkey authentication (register, authenticate) |
| /user/... | User pages (login, account, logout) |
| /admin/... | Admin interface (user management) |
For a complete list of all endpoints, see Axum Integration API - Endpoint Reference.
Route Protection
This chapter covers different methods to protect routes in your Axum application.
Method 1: AuthUser Extractor (Simplest)
Use AuthUser as an extractor. Unauthenticated users are redirected to login.
use oauth2_passkey_axum::AuthUser;
// Requires authentication - redirects to login if not authenticated
async fn dashboard(user: AuthUser) -> String {
format!("Welcome, {}! (user_id: {})", user.label, user.user_id)
}
// AuthUser fields:
// - user_id: Unique user identifier
// - label: Display name
// - account: Account identifier
// - csrf_token: CSRF token for forms
// - is_admin: Whether user has admin privileges
Method 2: Option<AuthUser> (Optional Authentication)
Allow both authenticated and anonymous users:
use oauth2_passkey_axum::AuthUser;
// Works for both authenticated and anonymous users
async fn public_page(user: Option<AuthUser>) -> String {
match user {
Some(u) => format!("Hello, {}!", u.label),
None => "Hello, Anonymous!".to_string(),
}
}
Method 3: Middleware (Route-Layer Protection)
Apply middleware to protect entire routes or groups:
use axum::{Router, Extension, middleware::from_fn, routing::get};
use oauth2_passkey_axum::{
is_authenticated_redirect, // Redirects to login
is_authenticated_user_redirect, // Redirects + provides AuthUser via Extension
is_authenticated_401, // Returns 401 (for APIs)
AuthUser, CsrfToken,
};
let app = Router::new()
// Method 3a: Middleware only (no user info needed)
.route("/protected", get(protected_page)
.route_layer(from_fn(is_authenticated_redirect)))
// Method 3b: Middleware + user info via Extension
.route("/dashboard", get(dashboard_with_user)
.route_layer(from_fn(is_authenticated_user_redirect)))
// Method 3c: Protect entire nested router
.nest("/admin", admin_router()
.route_layer(from_fn(is_authenticated_redirect)));
// With is_authenticated_redirect: no user argument needed
async fn protected_page() -> String {
"Protected content".to_string()
}
// With is_authenticated_user_redirect: get user via Extension
async fn dashboard_with_user(Extension(user): Extension<AuthUser>) -> String {
format!("Welcome, {}!", user.label)
}
Middleware Comparison
| Middleware | Unauthenticated | Available in Handler |
|---|---|---|
| is_authenticated_redirect | Redirect to login | CsrfToken |
| is_authenticated_user_redirect | Redirect to login | AuthUser (includes csrf_token) |
| is_authenticated_401 | Return 401 | CsrfToken |
| is_authenticated_user_401 | Return 401 | AuthUser (includes csrf_token) |
Note: The _user_ variants perform an additional database query to fetch user information. Use the non-user variants when you only need to verify authentication without accessing user details in your handler.
Note: Handlers receive the CSRF token to embed in rendered pages. See CSRF Token Handling for details.
When to Use Each Method
| Method | Use Case |
|---|---|
| AuthUser extractor | Simple protected routes where you need user info |
| Option<AuthUser> | Pages that work for both authenticated and anonymous users |
| is_authenticated[_user]_redirect | Protect routes with redirect to login for unauthenticated users |
| is_authenticated[_user]_401 | API endpoints that should return 401 instead of redirect |
CSRF Token Handling
State-changing requests (POST, PUT, DELETE, PATCH) require a valid CSRF token. This guide explains how to implement CSRF protection for your custom pages.
Overview
Token Acquisition
| Client Type | Embedding | Response Header | Endpoint |
|---|---|---|---|
| JavaScript | Optional | Available | Available |
| HTML Form | Required | Not available | Not available |
JavaScript clients can use response headers, the endpoint, or embedded values. HTML Forms must embed tokens at render time.
Token Usage
| Client Type | Token Location | Verification |
|---|---|---|
| JavaScript | X-CSRF-Token header | Automatic (middleware) |
| HTML Form | Hidden field in body | Manual (your handler) |
Middleware can read headers but not the body, so form tokens require manual verification.
JavaScript (Automatic Verification)
1. Get the Token
Option A: Embed in template from AuthUser.csrf_token:
<script>
const csrfToken = '{{ csrf_token }}';
</script>
Option B: Read from response header of any authenticated request:
const csrfToken = response.headers.get('X-CSRF-Token');
Option C: Fetch from endpoint:
const response = await fetch(`${O2P_ROUTE_PREFIX}/user/csrf_token`, {
credentials: 'include'
});
const { csrf_token: csrfToken } = await response.json();
2. Include in Requests
Add X-CSRF-Token header to all state-changing requests:
fetch(`${O2P_ROUTE_PREFIX}/user/update`, {
method: 'PUT',
headers: {
'X-CSRF-Token': csrfToken,
'Content-Type': 'application/json'
},
credentials: 'include',
body: JSON.stringify({ user_id, account, label })
});
That’s it! The middleware verifies the token automatically. No handler code needed.
HTML Form (Manual Verification)
Step 1: Get Token and Embed in Form
Handler (GET):
use askama::Template;
use axum::{Extension, response::{Html, IntoResponse}};
use oauth2_passkey_axum::CsrfToken;
#[derive(Template)]
#[template(path = "form.j2")]
struct FormTemplate<'a> {
csrf_token: &'a str,
}
pub async fn form_page(Extension(csrf_token): Extension<CsrfToken>) -> impl IntoResponse {
let template = FormTemplate { csrf_token: csrf_token.as_str() };
Html(template.render().unwrap())
}
Template (form.j2):
<form method="POST" action="/submit">
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
<input type="text" name="message">
<button type="submit">Submit</button>
</form>
Step 2: Define Form Data Structure
use serde::Deserialize;
#[derive(Deserialize)]
pub struct FormData {
message: String,
csrf_token: String,
}
Step 3: Verify Token in Handler
use axum::{Extension, extract::Form, response::{Html, IntoResponse}, http::StatusCode};
use oauth2_passkey_axum::{CsrfToken, CsrfHeaderVerified};
use subtle::ConstantTimeEq;
pub async fn form_post(
Extension(csrf_token): Extension<CsrfToken>,
Extension(csrf_header_verified): Extension<CsrfHeaderVerified>,
Form(data): Form<FormData>,
) -> impl IntoResponse {
// Skip if already verified via header (AJAX request)
if !csrf_header_verified.0 {
// Verify form token with constant-time comparison
        if !bool::from(data.csrf_token.as_bytes().ct_eq(csrf_token.as_str().as_bytes())) {
return (StatusCode::FORBIDDEN, "Invalid CSRF token").into_response();
}
}
Html(format!("Success: {}", data.message)).into_response()
}
Step 4: Register Routes
use axum::{Router, routing::get, middleware::from_fn};
use oauth2_passkey_axum::is_authenticated_redirect;
let app = Router::new()
.route("/form", get(form_page).post(form_post))
.route_layer(from_fn(is_authenticated_redirect));
Key Points
- JavaScript: Include the X-CSRF-Token header → automatic verification
- HTML Form: Embed token in a hidden field → verify manually with subtle::ConstantTimeEq
- Always use constant-time comparison (ct_eq) - never ==
- Add subtle to your Cargo.toml: subtle = "2"
For security best practices and troubleshooting, see CSRF Protection Guide.
User Data Integration
This guide explains how to manage application-specific user data alongside the oauth2-passkey library.
Overview
The oauth2-passkey library manages authentication data (users, credentials, OAuth2 accounts) in its own tables. Your application typically needs additional user data such as profiles, preferences, or application-specific records.
Recommended approach: Create separate tables in your database linked by user_id.
Database Patterns
Pattern 1: One-to-One (User Profile)
Each user has exactly one profile record.
oauth2-passkey library Your Application
+------------------+ +------------------+
| users | | user_profiles |
+------------------+ +------------------+
| user_id (PK) |<--------->| user_id (PK/FK) |
| account | | display_name |
| label | | bio |
| ... | | avatar_url |
+------------------+ | theme |
+------------------+
See demo-profile for a complete example.
Pattern 2: One-to-Many (User Data)
Each user has multiple records (todos, posts, orders, etc.).
oauth2-passkey library Your Application
+------------------+ +------------------+
| users | | todos |
+------------------+ +------------------+
| user_id (PK) |<----+ | id (PK) |
| account | +---->| user_id (FK) |
| label | | title |
| ... | | completed |
+------------------+ +------------------+
See demo-todo for a complete example.
Database Configuration
The library and your application can use any combination of databases.
Same Database
Both library and app share a single PostgreSQL database.
GENERIC_DATA_STORE_TYPE=postgresql
GENERIC_DATA_STORE_URL='postgres://demo:demo@localhost:5432/demo'
YOUR_APP_DATABASE_URL='postgres://demo:demo@localhost:5432/demo'
Benefits:
- Foreign key constraints between users and your tables
- Efficient JOINs across authentication and application data
- Single database to manage and backup
Separate Databases
Library and app use independent databases.
GENERIC_DATA_STORE_TYPE=sqlite
GENERIC_DATA_STORE_URL='sqlite:/tmp/auth.db'
YOUR_APP_DATABASE_URL='postgres://demo:demo@localhost:5432/myapp'
Benefits:
- Clear isolation between library and application data
- Independent scaling and management
- Flexibility to use different database systems
Implementation Guide
1. Define Your Schema
-- One-to-one: User profiles
CREATE TABLE user_profiles (
user_id TEXT PRIMARY KEY,
display_name TEXT,
bio TEXT,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- One-to-many: User todos
CREATE TABLE todos (
id SERIAL PRIMARY KEY,
user_id TEXT NOT NULL,
title TEXT NOT NULL,
completed BOOLEAN DEFAULT FALSE,
created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_todos_user_id ON todos(user_id);
2. Set Up Database Connection
Use standard Axum State<T> pattern for your application’s database:
use axum::{Router, routing::get};
use oauth2_passkey_axum::oauth2_passkey_full_router;
use sqlx::{PgPool, postgres::PgPoolOptions};
#[derive(Clone)]
pub struct AppState {
pub pool: PgPool,
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
// Initialize oauth2-passkey library (uses its own storage)
oauth2_passkey_axum::init().await?;
// Initialize your app's database
let pool = PgPoolOptions::new()
.connect(&std::env::var("YOUR_APP_DATABASE_URL")?)
.await?;
let state = AppState { pool };
// Build routes
let app = Router::new()
.route("/", get(index))
.with_state(state)
.merge(oauth2_passkey_full_router());
// ...
}
3. Access User ID in Handlers
Use the AuthUser extractor to get the authenticated user’s ID:
use oauth2_passkey_axum::AuthUser;
async fn get_profile(
State(state): State<AppState>,
user: AuthUser, // Automatically extracts authenticated user
) -> Result<Response, StatusCode> {
let profile = sqlx::query_as!(
    UserProfile,
    "SELECT * FROM user_profiles WHERE user_id = $1",
    user.user_id
)
.fetch_optional(&state.pool)
.await
.map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;
// ...
}
4. Protect Routes
Use middleware to require authentication:
use axum::{Router, routing::get, middleware::from_fn};
use oauth2_passkey_axum::is_authenticated_redirect;
pub fn protected_routes() -> Router<AppState> {
Router::new()
.route("/profile", get(show_profile).post(update_profile))
.route_layer(from_fn(is_authenticated_redirect))
}
Accessing OAuth2 Account Data
The library stores OAuth2 account information including profile pictures. You can access this data:
```rust
use oauth2_passkey_axum::{list_accounts_core, UserId};

async fn get_google_avatar(user_id: &str) -> Option<String> {
    let user_id = UserId::new(user_id.to_string()).ok()?;
    let accounts = list_accounts_core(user_id).await.ok()?;
    accounts
        .into_iter()
        .find(|a| a.provider == "google")
        .and_then(|a| a.picture)
}
```
Example Applications
| Demo | Description |
|---|---|
| demo-profile | User profile extension (settings, preferences) |
| demo-todo | App data linked to users (general pattern) |
Related Documentation
- Storage Pattern - Why the library uses singleton pattern
- Route Protection - Authentication middleware options
- CSRF Token Handling - Protecting form submissions
Configuration
Overview
The oauth2-passkey library uses environment variables for configuration. This approach provides flexibility for different deployment scenarios and keeps sensitive credentials out of source code.
Configuration is loaded at application startup. All variables can be set via:
- Environment variables directly
- A `.env` file in the project root (using the `dotenvy` crate)
Required Variables
These variables must be set for the library to function.
ORIGIN
The base URL of your application. This is used for:
- Constructing OAuth2 callback URLs
- Setting the WebAuthn Relying Party ID
- Validating request origins
ORIGIN='https://your-domain.example.com'
Important: No trailing slash.
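A malformed `ORIGIN` typically surfaces only later as an OAuth2 redirect or WebAuthn failure, so it can be worth asserting the format at startup. A minimal sketch; the `check_origin` helper is illustrative and not part of the library:

```rust
// Hypothetical startup check: verify that an ORIGIN value has an
// explicit scheme and no trailing slash before using it.
fn check_origin(origin: &str) -> Result<(), String> {
    if !(origin.starts_with("https://") || origin.starts_with("http://")) {
        return Err(format!("ORIGIN must include a scheme: {origin}"));
    }
    if origin.ends_with('/') {
        return Err(format!("ORIGIN must not have a trailing slash: {origin}"));
    }
    Ok(())
}

fn main() {
    assert!(check_origin("https://your-domain.example.com").is_ok());
    assert!(check_origin("https://your-domain.example.com/").is_err()); // trailing slash
    assert!(check_origin("your-domain.example.com").is_err()); // missing scheme
}
```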
OAUTH2_GOOGLE_CLIENT_ID
Your Google OAuth2 client ID obtained from the Google Cloud Console.
OAUTH2_GOOGLE_CLIENT_ID='your-client-id.apps.googleusercontent.com'
OAUTH2_GOOGLE_CLIENT_SECRET
Your Google OAuth2 client secret obtained from the Google Cloud Console.
OAUTH2_GOOGLE_CLIENT_SECRET='your-client-secret'
OAUTH2_ISSUER_URL
The OIDC (OpenID Connect) issuer URL. The library uses OIDC Discovery to automatically fetch endpoint configurations from the .well-known/openid-configuration URL.
OAUTH2_ISSUER_URL='https://accounts.google.com'
For Google authentication, use https://accounts.google.com.
Database Configuration
GENERIC_DATA_STORE_TYPE
Specifies the database backend for persistent storage.
| Value | Description |
|---|---|
| `sqlite` | SQLite database (development/testing) |
| `postgres` | PostgreSQL database (production) |
GENERIC_DATA_STORE_TYPE=postgres
GENERIC_DATA_STORE_URL
Connection URL for the database.
PostgreSQL format:
GENERIC_DATA_STORE_URL='postgresql://user:password@host:port/database'
SQLite formats:
```
# File-based SQLite
GENERIC_DATA_STORE_URL='sqlite:/path/to/database.db'
GENERIC_DATA_STORE_URL='sqlite:./db/sqlite/data/data.db'

# In-memory SQLite (useful for testing)
GENERIC_DATA_STORE_URL='sqlite:file:memdb1?mode=memory&cache=shared'
GENERIC_DATA_STORE_URL=':memory:'
```
Cache Configuration
GENERIC_CACHE_STORE_TYPE
Specifies the cache backend for temporary data (sessions, challenges, CSRF tokens).
| Value | Description |
|---|---|
| `memory` | In-memory cache (development/single instance) |
| `redis` | Redis cache (production/multi-instance) |
GENERIC_CACHE_STORE_TYPE=redis
GENERIC_CACHE_STORE_URL
Connection URL for Redis (only required when using Redis cache).
GENERIC_CACHE_STORE_URL='redis://localhost:6379'
Optional Variables
Route Configuration
O2P_ROUTE_PREFIX
Main route prefix for all authentication endpoints.
- Default: `/o2p`
- Endpoints affected: OAuth2, passkey, login, logout, summary pages
O2P_ROUTE_PREFIX='/o2p'
O2P_LOGIN_URL
URL of the login page. Used by middleware and the AuthUser extractor to redirect unauthenticated users.
- Default: `/o2p/user/login`
- Set this to override the default login page URL
- Required when the `login-ui` feature is disabled
O2P_LOGIN_URL='/o2p/user/login'
O2P_DEFAULT_REDIRECT
Default redirect URL for authenticated-user flows. Used when:
- Authenticated users visit the login page (bounce to app root)
- Logout redirect target in templates
- Default: `/`
O2P_DEFAULT_REDIRECT='/'
O2P_ACCOUNT_URL
URL path for the user account management page.
- Default: `/o2p/user/account`
O2P_ACCOUNT_URL='/o2p/user/account'
O2P_RESPOND_WITH_X_CSRF_TOKEN
Controls whether the X-CSRF-Token header is included in responses.
- Default: `true`
- Values: `true`, `false`
O2P_RESPOND_WITH_X_CSRF_TOKEN=false
Demo Mode
O2P_DEMO_MODE
Enable demo mode for public demo deployments.
- Default: `false`
- Values: `true`, `false`
O2P_DEMO_MODE=true
When enabled:
- All new users automatically receive admin privileges
- Admin pages mask other users’ sensitive data (email, name, IDs, IP addresses)
- A placeholder user is created at seq=1 so no real user gets first-user admin treatment
WebAuthn Configuration
WEBAUTHN_ADDITIONAL_ORIGINS
Additional origins allowed to use WebAuthn credentials (for multi-domain support).
WEBAUTHN_ADDITIONAL_ORIGINS='https://example.com'
Important: No trailing slash in URLs.
PASSKEY_RP_NAME
The Relying Party name displayed to users during passkey registration.
- Default: Same as ORIGIN
PASSKEY_RP_NAME='My Application'
PASSKEY_TIMEOUT
Client-side timeout (in seconds) sent to the authenticator.
- Default: `60`
PASSKEY_TIMEOUT=60
PASSKEY_CHALLENGE_TIMEOUT
Server-side timeout (in seconds) for challenge validity.
- Default: `60`
PASSKEY_CHALLENGE_TIMEOUT=60
PASSKEY_AUTHENTICATOR_ATTACHMENT
Specifies which type of authenticator to allow.
| Value | Description |
|---|---|
| `platform` | Built-in authenticators (Touch ID, Face ID, Windows Hello, password managers) |
| `cross-platform` | Removable authenticators (YubiKey, security keys) |
| None | Allow any type |
- Default: `platform`
PASSKEY_AUTHENTICATOR_ATTACHMENT='platform'
PASSKEY_RESIDENT_KEY
Controls resident key (discoverable credential) requirement.
| Value | Description |
|---|---|
| `required` | Credential must be discoverable |
| `preferred` | Prefer discoverable, but allow non-discoverable |
| `discouraged` | Prefer non-discoverable credentials |
- Default: `required`
PASSKEY_RESIDENT_KEY='required'
PASSKEY_REQUIRE_RESIDENT_KEY
Whether to require resident key support.
- Default: `true`
- Values: `true`, `false`
PASSKEY_REQUIRE_RESIDENT_KEY=true
PASSKEY_USER_VERIFICATION
User verification requirement during authentication.
| Value | Description |
|---|---|
| `required` | Always require user verification (PIN, biometric) |
| `preferred` | Request verification if available |
| `discouraged` | Skip verification if possible |
- Default: `discouraged`
PASSKEY_USER_VERIFICATION='discouraged'
PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL
Controls user handle generation strategy.
| Value | Behavior |
|---|---|
| `false` | Use a single `user_handle` for all of a user's credentials (limits to one credential per user per site) |
| `true` | Generate a unique `user_handle` per credential (allows multiple credentials per user per site) |
- Default: `false`
PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL=false
Note: Password managers typically allow only one credential per user identifier. Set to true if users need multiple passkeys.
OAuth2 Advanced Configuration
OAUTH2_AUTH_URL
Override the authorization endpoint (normally discovered from issuer).
OAUTH2_AUTH_URL='https://accounts.google.com/o/oauth2/v2/auth'
OAUTH2_TOKEN_URL
Override the token endpoint (normally discovered from issuer).
OAUTH2_TOKEN_URL='https://oauth2.googleapis.com/token'
OAUTH2_SCOPE
OAuth2 scopes to request.
- Default: `openid+email+profile`
OAUTH2_SCOPE='openid+email+profile'
OAUTH2_RESPONSE_MODE
How the authorization response is returned.
| Value | Description |
|---|---|
| `form_post` | Response via POST to callback (uses SameSite=None for CSRF cookies) |
| `query` | Response via query parameters (uses SameSite=Lax for CSRF cookies) |
- Default: `form_post`
OAUTH2_RESPONSE_MODE='form_post'
OAUTH2_RESPONSE_TYPE
OAuth2 response type.
- Default: `code`
- Note: Only the authorization code flow is supported.
OAUTH2_RESPONSE_TYPE='code'
Cookie Configuration
OAUTH2_CSRF_COOKIE_NAME
Name of the CSRF protection cookie for OAuth2 flows.
- Default: `__Host-CsrfId`
OAUTH2_CSRF_COOKIE_NAME='__Host-CsrfId'
OAUTH2_CSRF_COOKIE_MAX_AGE
Maximum age (in seconds) for the CSRF cookie.
- Default: `60`
OAUTH2_CSRF_COOKIE_MAX_AGE=60
SESSION_COOKIE_NAME
Name of the session cookie.
- Default: `__Host-SessionId`
SESSION_COOKIE_NAME='__Host-SessionId'
SESSION_COOKIE_MAX_AGE
Maximum age (in seconds) for the session cookie.
- Default: `600`
SESSION_COOKIE_MAX_AGE=600
SESSION_CONFLICT_POLICY
Controls behavior when a user logs in while already having active sessions.
| Value | Description |
|---|---|
| `allow` | Permit multiple concurrent sessions (default) |
| `replace` | Invalidate all existing sessions, create a new one |
| `reject` | Deny login if an active session already exists |
- Default: `allow`
SESSION_CONFLICT_POLICY=allow
See Session Conflict Policy for detailed documentation.
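The three policy values can be modeled as a simple enum with `allow` as the fallback. This is an illustrative sketch of the semantics only; `SessionConflictPolicy` and `parse_policy` are not public items of the library:

```rust
// Illustrative model of the SESSION_CONFLICT_POLICY values.
#[derive(Debug, PartialEq)]
enum SessionConflictPolicy {
    Allow,   // permit multiple concurrent sessions
    Replace, // invalidate existing sessions, create a new one
    Reject,  // deny login while an active session exists
}

fn parse_policy(raw: Option<&str>) -> Result<SessionConflictPolicy, String> {
    match raw {
        None | Some("allow") => Ok(SessionConflictPolicy::Allow), // unset -> default
        Some("replace") => Ok(SessionConflictPolicy::Replace),
        Some("reject") => Ok(SessionConflictPolicy::Reject),
        Some(other) => Err(format!("invalid SESSION_CONFLICT_POLICY: {other}")),
    }
}

fn main() {
    assert_eq!(parse_policy(None).unwrap(), SessionConflictPolicy::Allow);
    assert_eq!(parse_policy(Some("replace")).unwrap(), SessionConflictPolicy::Replace);
    assert!(parse_policy(Some("bogus")).is_err());
}
```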
User Field Mapping
These settings control how user fields are mapped between authentication providers and the internal user model.
OAUTH2_USER_ACCOUNT_FIELD
OAuth2 claim to use for User.account.
- Default: `email`
OAUTH2_USER_ACCOUNT_FIELD='email'
OAUTH2_USER_LABEL_FIELD
OAuth2 claim to use for User.label.
- Default: `name`
OAUTH2_USER_LABEL_FIELD='name'
PASSKEY_USER_ACCOUNT_FIELD
Passkey field to use for User.account.
- Default: `name`
PASSKEY_USER_ACCOUNT_FIELD='name'
PASSKEY_USER_LABEL_FIELD
Passkey field to use for User.label.
- Default: `display_name`
PASSKEY_USER_LABEL_FIELD='display_name'
Security Configuration
AUTH_SERVER_SECRET
Secret key used for token signing.
- Default: `default_secret_key_change_in_production`
AUTH_SERVER_SECRET='your-secret-key-here'
Warning: Always change this value in production environments. Use a cryptographically secure random string.
Database Table Configuration
DB_TABLE_PREFIX
Prefix for all database tables created by the library.
- Default: `o2p_`
DB_TABLE_PREFIX='o2p_'
DB_TABLE_USERS
Custom name for the users table.
- Default: `{prefix}users` (e.g., `o2p_users`)
DB_TABLE_USERS='o2p_users'
DB_TABLE_PASSKEY_CREDENTIALS
Custom name for the passkey credentials table.
- Default: `{prefix}passkey_credentials` (e.g., `o2p_passkey_credentials`)
DB_TABLE_PASSKEY_CREDENTIALS='o2p_passkey_credentials'
DB_TABLE_OAUTH2_ACCOUNTS
Custom name for the OAuth2 accounts table.
- Default: `{prefix}oauth2_accounts` (e.g., `o2p_oauth2_accounts`)
DB_TABLE_OAUTH2_ACCOUNTS='o2p_oauth2_accounts'
Experimental Features
O2P_PASSKEY_PROMOTION
Prompt users to register a passkey after OAuth2 login. When a user logs in via OAuth2 without an existing passkey, a registration modal can be shown.
- Default: unset (disabled)
| Value | Description |
|---|---|
| unset | Disabled, no promotion |
| `ask` | Show a confirmation modal asking the user to register a passkey |
| `force` | Directly show the passkey registration dialog (skip confirmation) |
O2P_PASSKEY_PROMOTION=ask
The promotion uses a UA + AAGUID heuristic to detect whether the user’s platform authenticator is likely available, avoiding prompts on unsupported devices.
Configuration Examples
Development (SQLite + Memory)
Minimal configuration for local development:
```
# Required
ORIGIN='http://localhost:3000'
OAUTH2_GOOGLE_CLIENT_ID='your-dev-client-id.apps.googleusercontent.com'
OAUTH2_GOOGLE_CLIENT_SECRET='your-dev-client-secret'
OAUTH2_ISSUER_URL='https://accounts.google.com'

# Storage - lightweight options for development
GENERIC_DATA_STORE_TYPE=sqlite
GENERIC_DATA_STORE_URL='sqlite:./dev.db'
GENERIC_CACHE_STORE_TYPE=memory
```
For testing with in-memory storage:
```
GENERIC_DATA_STORE_TYPE=sqlite
GENERIC_DATA_STORE_URL=':memory:'
GENERIC_CACHE_STORE_TYPE=memory
```
Production (PostgreSQL + Redis)
Recommended configuration for production deployments:
```
# Required
ORIGIN='https://your-domain.example.com'
OAUTH2_GOOGLE_CLIENT_ID='your-prod-client-id.apps.googleusercontent.com'
OAUTH2_GOOGLE_CLIENT_SECRET='your-prod-client-secret'
OAUTH2_ISSUER_URL='https://accounts.google.com'

# Production storage
GENERIC_DATA_STORE_TYPE=postgres
GENERIC_DATA_STORE_URL='postgresql://user:password@db-host:5432/production_db'
GENERIC_CACHE_STORE_TYPE=redis
GENERIC_CACHE_STORE_URL='redis://redis-host:6379'

# Security - CHANGE THIS!
AUTH_SERVER_SECRET='your-cryptographically-secure-random-string'

# Optional: Customize routes
O2P_ROUTE_PREFIX='/auth'

# Optional: Extend session duration (1 hour)
SESSION_COOKIE_MAX_AGE=3600
```
Multi-Domain Setup
For applications serving multiple domains:
ORIGIN='https://primary-domain.example.com'
WEBAUTHN_ADDITIONAL_ORIGINS='https://secondary-domain.example.com'
Custom Table Names
For integrating with existing database schemas:
DB_TABLE_PREFIX='myapp_auth_'
DB_TABLE_USERS='myapp_auth_users'
DB_TABLE_PASSKEY_CREDENTIALS='myapp_auth_passkeys'
DB_TABLE_OAUTH2_ACCOUNTS='myapp_auth_oauth2'
Multi-Origin Passkey Setup
This page explains how to configure passkeys to work across multiple origins (subdomains) sharing the same Relying Party (RP) ID.
When You Need This
Single domain: If your application runs on a single domain (e.g., https://example.com), you do NOT need this configuration.
Multiple origins: You need this when:
- Your app runs on multiple subdomains (e.g., `app.example.com` and `login.example.com`)
- You want passkeys registered on one subdomain to work on another
- You’re sharing authentication across development and staging environments
How It Works
WebAuthn passkeys are bound to a Relying Party ID (RP ID), not to a specific origin. By default, the RP ID matches your domain.
```
RP ID: example.com
Allowed Origins:
├── https://example.com        (main site)
├── https://app.example.com    (application)
└── https://login.example.com  (login portal)
```
All these origins share the same RP ID (example.com), so a passkey registered on any of them works on all of them.
The /.well-known/webauthn endpoint tells browsers which origins are allowed to use this RP ID.
Configuration
Step 1: Set Environment Variables
```
# .env

# Your main origin
ORIGIN=https://example.com

# The RP ID (usually your root domain)
PASSKEY_RP_ID=example.com

# Additional origins that can use the same passkeys
WEBAUTHN_ADDITIONAL_ORIGINS=https://app.example.com,https://login.example.com
```
Step 2: Use the Unified Router
If you use oauth2_passkey_full_router() (recommended), the /.well-known/webauthn endpoint is automatically included when WEBAUTHN_ADDITIONAL_ORIGINS is set:
```rust
use axum::Router;
use oauth2_passkey_axum::{oauth2_passkey_full_router, init};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    init().await?;

    let app = Router::new()
        // All auth routes + /.well-known/webauthn (when multi-origin is configured)
        .merge(oauth2_passkey_full_router());

    let listener = tokio::net::TcpListener::bind("0.0.0.0:3000").await?;
    axum::serve(listener, app).await?;
    Ok(())
}
```
Alternative: If you use `oauth2_passkey_router()` directly, add the well-known router manually:
```rust
use axum::Router;
use oauth2_passkey_axum::{
    oauth2_passkey_router, passkey_well_known_router,
    init, O2P_ROUTE_PREFIX,
};

let app = Router::new()
    .merge(passkey_well_known_router())
    .nest(O2P_ROUTE_PREFIX.as_str(), oauth2_passkey_router());
```
Step 3: Verify Setup
After starting your server, verify the endpoint returns the correct configuration:
curl https://example.com/.well-known/webauthn
Expected response:
```json
{
  "rp_id": "example.com",
  "origins": [
    "https://example.com",
    "https://app.example.com",
    "https://login.example.com"
  ]
}
```
Important Notes
RP ID Requirements
The RP ID must be a parent domain of all origins:
| RP ID | Origin | Valid? |
|---|---|---|
| `example.com` | `https://example.com` | Yes |
| `example.com` | `https://app.example.com` | Yes |
| `example.com` | `https://other-site.com` | No |
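The "parent domain" rule in the table above is a suffix match on domain-label boundaries: the RP ID must equal the origin's host or be a dot-separated suffix of it. A self-contained sketch of that check (illustrative only; the library performs its own validation, and the real WebAuthn rule additionally forbids public suffixes like `com` as RP IDs):

```rust
// Valid when the host equals the RP ID, or the RP ID is a
// parent domain of the host on a label boundary.
fn rp_id_valid_for_host(rp_id: &str, host: &str) -> bool {
    host == rp_id || host.ends_with(&format!(".{rp_id}"))
}

fn main() {
    assert!(rp_id_valid_for_host("example.com", "example.com"));
    assert!(rp_id_valid_for_host("example.com", "app.example.com"));
    assert!(!rp_id_valid_for_host("example.com", "other-site.com"));
    // A plain substring suffix match would wrongly accept this host;
    // the ".example.com" label-boundary check rejects it.
    assert!(!rp_id_valid_for_host("example.com", "badexample.com"));
}
```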
Browser Support
Related Origins is supported in modern browsers. Check caniuse.com for current browser support.
Security Considerations
- The `/.well-known/webauthn` endpoint exposes only public configuration (RP ID and allowed origins)
- No authentication is required for this endpoint (browsers fetch it before authentication)
- Only list origins you actually control
Troubleshooting
Passkey not working on subdomain
- Verify `WEBAUTHN_ADDITIONAL_ORIGINS` includes the subdomain
- Check that `/.well-known/webauthn` is accessible from the subdomain
- Ensure the RP ID is a parent domain of all origins
404 on /.well-known/webauthn
- If using `oauth2_passkey_full_router()`: verify `WEBAUTHN_ADDITIONAL_ORIGINS` is set
- If using `oauth2_passkey_router()` directly: add `.merge(passkey_well_known_router())` to your router
Origins list is empty
Check that the `ORIGIN` environment variable is set correctly.
Server Setup
This guide covers server setup patterns for running OAuth2/Passkey authentication, based on the demo applications.
Overview
Demo applications run HTTP servers on port 3001. For production deployments requiring HTTPS:
- localhost development: WebAuthn works over HTTP (localhost is a secure context)
- Production: Use a reverse proxy (nginx/Caddy) to handle TLS termination
Tracing Initialization
Initialize tracing before other setup to capture all logs:
```rust
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt};

pub(crate) fn init_tracing(app_name: &str) {
    let env_filter = tracing_subscriber::EnvFilter::try_from_default_env()
        .unwrap_or_else(|_| {
            #[cfg(debug_assertions)]
            {
                format!(
                    "oauth2_passkey_axum=trace,oauth2_passkey=trace,{app_name}=trace,info"
                )
                .into()
            }
            #[cfg(not(debug_assertions))]
            {
                "info".into()
            }
        });

    tracing_subscriber::registry()
        .with(env_filter)
        .with(tracing_subscriber::fmt::layer())
        .init();
}
```
Default Log Levels
| Build | Command | Default Level |
|---|---|---|
| Debug | cargo run | oauth2_passkey=trace, app=trace, others=info |
| Release | cargo build --release | info |
Override with RUST_LOG:
RUST_LOG=debug cargo run
HTTP Server
Spawn the HTTP server using `tokio::net::TcpListener`:
```rust
use axum::Router;
use std::net::SocketAddr;
use tokio::task::JoinHandle;

pub(crate) fn spawn_http_server(port: u16, app: Router) -> JoinHandle<()> {
    tokio::spawn(async move {
        let addr = SocketAddr::from(([0, 0, 0, 0], port));
        tracing::info!("HTTP server listening on {}", addr);
        let listener = tokio::net::TcpListener::bind(addr).await.unwrap();
        axum::serve(listener, app).await.unwrap();
    })
}
```
Production HTTPS Setup
For production, use a reverse proxy to handle TLS termination:
Caddy Example
```
example.com {
    reverse_proxy localhost:3001
}
```
nginx Example
```nginx
server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;

    location / {
        proxy_pass http://localhost:3001;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```
When using a reverse proxy:
- Set `ORIGIN` to your HTTPS URL (e.g., `https://example.com`)
- The proxy forwards to your HTTP server on port 3001
Development Tunnels
For remote testing, use tunnel services like ngrok or cloudflared:
- Tunnel URL: `https://myapp.trycloudflare.com`
- Set `ORIGIN='https://myapp.trycloudflare.com'`
- The tunnel forwards to `http://localhost:3001`
Complete main.rs Example
```rust
use axum::{
    Router,
    http::StatusCode,
    response::{IntoResponse, Redirect, Response},
    routing::get,
};
use axum::response::Html;
use askama::Template;
use dotenvy::dotenv;
use std::net::SocketAddr;
use tokio::task::JoinHandle;
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt};
use oauth2_passkey_axum::{AuthUser, O2P_LOGIN_URL, O2P_ROUTE_PREFIX, oauth2_passkey_full_router};

#[derive(Template)]
#[template(path = "index.j2")]
struct IndexTemplate<'a> {
    message: &'a str,
    prefix: &'a str,
}

async fn index(user: Option<AuthUser>) -> Result<Response, (StatusCode, String)> {
    match user {
        Some(_) => {
            let template = IndexTemplate {
                message: "Welcome! You are authenticated.",
                prefix: O2P_ROUTE_PREFIX.as_str(),
            };
            match template.render() {
                Ok(html) => Ok(Html(html).into_response()),
                Err(e) => Err((StatusCode::INTERNAL_SERVER_ERROR, e.to_string())),
            }
        }
        None => Ok(Redirect::to(O2P_LOGIN_URL.as_str()).into_response()),
    }
}

fn spawn_http_server(port: u16, app: Router) -> JoinHandle<()> {
    tokio::spawn(async move {
        let addr = SocketAddr::from(([0, 0, 0, 0], port));
        tracing::info!("HTTP server listening on {}", addr);
        let listener = tokio::net::TcpListener::bind(addr).await.unwrap();
        axum::serve(listener, app).await.unwrap();
    })
}

fn init_tracing(app_name: &str) {
    let env_filter = tracing_subscriber::EnvFilter::try_from_default_env()
        .unwrap_or_else(|_| {
            #[cfg(debug_assertions)]
            {
                format!(
                    "oauth2_passkey_axum=trace,oauth2_passkey=trace,{app_name}=trace,info"
                )
                .into()
            }
            #[cfg(not(debug_assertions))]
            {
                "info".into()
            }
        });

    tracing_subscriber::registry()
        .with(env_filter)
        .with(tracing_subscriber::fmt::layer())
        .init();
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize logging
    init_tracing("my-app");

    // Load environment variables
    dotenv().ok();

    // Initialize oauth2-passkey library
    oauth2_passkey_axum::init().await?;

    // Build application router
    let app = Router::new()
        .route("/", get(index))
        .merge(oauth2_passkey_full_router());

    // Start server
    spawn_http_server(3001, app).await?;
    Ok(())
}
```
TLS Certificates (bundled-tls)
The library makes HTTPS requests to OAuth2/OIDC providers (e.g., Google) for token exchange and JWKS fetching. By default, it uses the system’s CA certificates for TLS verification.
For minimal container deployments (scratch or alpine Docker images) where system certificates are not available, enable the bundled-tls feature to bundle Mozilla root certificates:
```toml
[dependencies]
oauth2-passkey-axum = { version = "0.3", features = ["bundled-tls"] }
```
This bundles certificates from the webpki-roots crate and configures ALPN protocol negotiation for proper TLS handshakes.
When to use bundled-tls:
- Scratch Docker images (no filesystem, no `/etc/ssl/certs/`)
- Alpine Linux containers without the `ca-certificates` package
- Any environment where system CA certificates are missing
When NOT needed:
- Standard Linux distributions with `ca-certificates` installed
- Docker images based on Debian, Ubuntu, or similar
Example minimal Dockerfile:
```dockerfile
# Note: a scratch image has no libc, so the binary must be statically
# linked (e.g., built for a musl target) to run in the final stage.
FROM rust:latest AS builder
WORKDIR /app
COPY . .
RUN cargo build --release --features bundled-tls

FROM scratch
COPY --from=builder /app/target/release/myapp /
ENTRYPOINT ["/myapp"]
```
Required Dependencies
Add these to your Cargo.toml:
```toml
[dependencies]
axum = "0.8"
tokio = { version = "1", features = ["full"] }
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
dotenvy = "0.15"
askama = "0.12"
oauth2-passkey-axum = "0.3"
```
Startup Sequence
- Initialize tracing
- Load environment variables with `dotenv()`
- Call `oauth2_passkey_axum::init().await`
- Build the router with `oauth2_passkey_full_router()`
- Start the HTTP server
Deployment Patterns
This guide covers different deployment patterns for integrating oauth2-passkey with various client types and architectures.
Overview
| # | Client | Origin Relationship | Authentication | Session Maintenance | Status |
|---|---|---|---|---|---|
| 1 | Browser (traditional/SPA) | Same-Origin | Browser | Cookie | Supported |
| 2 | Browser (traditional/SPA) | Cross-Origin, Same-Site | Browser | Cookie + Domain + CORS | Supported |
| 3 | Browser (traditional/SPA) | Cross-Site | - | - | Out of scope |
| 4 | Native App | - | Passkey (Native API) | Bearer | Supported |
| 5 | Native App | - | OAuth2 (In-App Browser) | Bearer | Not yet supported |
Understanding Origin and Site
Before diving into deployment patterns, it’s important to understand the difference between Origin and Site:
- Same-Origin: Scheme, host, and port are identical (e.g., `https://example.com:443`)
- Cross-Origin: Any difference in scheme, host, or port (e.g., `https://app.example.com` vs `https://api.example.com`)
- Same-Site: Same eTLD+1 (e.g., `app.example.com` and `api.example.com` share `example.com`)
- Cross-Site: Different eTLD+1 (e.g., `example.com` vs `another.com`)
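These definitions can be sketched in code. This is a deliberately simplified illustration: real browsers determine a site's eTLD+1 via the Public Suffix List, whereas here we assume the last two labels form the registrable domain (wrong for suffixes like `co.uk`, fine for the examples in this guide):

```rust
// Same-Origin: scheme, host, and port must all match exactly.
fn same_origin(a: (&str, &str, u16), b: (&str, &str, u16)) -> bool {
    a == b
}

// Simplified eTLD+1: take the last two dot-separated labels.
fn etld_plus_one(host: &str) -> String {
    let labels: Vec<&str> = host.split('.').collect();
    labels[labels.len().saturating_sub(2)..].join(".")
}

// Same-Site: hosts share the same (simplified) eTLD+1.
fn same_site(host_a: &str, host_b: &str) -> bool {
    etld_plus_one(host_a) == etld_plus_one(host_b)
}

fn main() {
    // Different hosts: cross-origin...
    assert!(!same_origin(("https", "app.example.com", 443), ("https", "api.example.com", 443)));
    // ...but same site, since both share example.com
    assert!(same_site("app.example.com", "api.example.com"));
    // Different eTLD+1: cross-site
    assert!(!same_site("example.com", "another.com"));
}
```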
Pattern 1: Same-Origin (Recommended)
The simplest and most secure pattern. Your web application and API share the same origin.
Architecture
```
https://example.com
├── /      (Web pages)
├── /api/  (API endpoints)
└── /o2p/  (Authentication endpoints)
```
Configuration
```
# .env
ORIGIN='https://example.com'
SESSION_AUTH_MODE=cookie
```
Characteristics
- HttpOnly cookies automatically sent with every request
- CSRF protection required and handled automatically
- No additional CORS configuration needed
- Works with traditional server-rendered pages and SPAs
Pattern 2: Cross-Origin, Same-Site
Your frontend and API are on different subdomains of the same domain.
Architecture
```
https://app.example.com   (Frontend / SPA)
https://api.example.com   (API server with oauth2-passkey)
```
Requirements
- Cookie Domain attribute: Set to parent domain
- CORS configuration: On API server
- Frontend fetch configuration: Include credentials
Cookie Configuration
```
# .env for api.example.com
ORIGIN='https://api.example.com'
SESSION_AUTH_MODE=cookie
SESSION_COOKIE_DOMAIN='.example.com'  # Note the leading dot
```
CORS Configuration (Axum)
```rust
use axum::Router;
use http::{HeaderValue, Method, header::{AUTHORIZATION, CONTENT_TYPE}};
use oauth2_passkey_axum::oauth2_passkey_full_router;
use tower_http::cors::CorsLayer;

let cors = CorsLayer::new()
    .allow_origin("https://app.example.com".parse::<HeaderValue>().unwrap())
    .allow_credentials(true) // Required for cookies
    .allow_methods([Method::GET, Method::POST, Method::DELETE])
    .allow_headers([CONTENT_TYPE, AUTHORIZATION]);

let app = Router::new()
    .merge(oauth2_passkey_full_router())
    .layer(cors);
```
Frontend Configuration
```javascript
// All API requests must include credentials
fetch('https://api.example.com/o2p/passkey/auth/start', {
  method: 'POST',
  credentials: 'include', // Required for cross-origin cookies
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ account: 'user@example.com' }),
});
```
Important Notes
- `SameSite=Lax` cookies work because both subdomains are Same-Site
- `Access-Control-Allow-Origin` cannot be `*` when using `credentials: 'include'`
- The API server must explicitly list allowed origins
Demo Application
See demo-cross-origin for a working example that demonstrates:
- Auth Server with OAuth2/Passkey authentication
- Separate API Server validating session cookies
- CORS configuration for cross-origin requests
- Multiple testing methods (localhost, HTTPS proxy, direct HTTPS)
Pattern 3: Cross-Site (Not Supported)
Cross-site requests (different eTLD+1) face significant restrictions:
- Third-party cookies are being phased out by browsers
- `SameSite=None; Secure` is required but increasingly blocked
Recommendation: Use Pattern 2 by hosting your frontend and API on the same site, or use a reverse proxy to make them appear same-origin.
Alternative: Reverse Proxy
```
https://example.com
├── /      → Frontend server (proxied)
└── /api/  → API server (proxied)
```
With this setup, the browser sees everything as same-origin.
Pattern 4: Native App with Passkey
Native mobile apps can use platform Passkey APIs directly.
Architecture
```
[Native App (iOS/Android)]
    │
    ├── Platform Passkey API (ASAuthorizationController / CredentialManager)
    │
    └── https://api.example.com (API server)
        └── Bearer token authentication
```
Configuration
```
# .env
ORIGIN='https://api.example.com'
SESSION_AUTH_MODE=bearer
```
Authentication Flow
1. Start authentication

   ```http
   POST /api/passkey/auth/start
   Content-Type: application/json

   {"account": "user@example.com"}
   ```

2. Process with the platform API (iOS example)

   ```swift
   let authController = ASAuthorizationController(authorizationRequests: [request])
   authController.delegate = self
   authController.performRequests()
   ```

3. Complete authentication

   ```http
   POST /api/passkey/auth/finish
   Content-Type: application/json

   {"id": "...", "rawId": "...", "response": {...}, "type": "public-key"}
   ```

4. Receive a Bearer token

   ```json
   { "token": "session_id_here", "token_type": "Bearer", "expires_in": 600 }
   ```

5. Use the token for subsequent requests

   ```http
   GET /api/protected
   Authorization: Bearer session_id_here
   ```
Security Considerations
- Store tokens securely (iOS Keychain, Android EncryptedSharedPreferences)
- Bearer tokens don’t require CSRF protection
- Implement token refresh before expiration
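Refreshing "before expiration" means tracking when the token was obtained and acting a safety margin ahead of the `expires_in` deadline. An illustrative client-side sketch (not a library API; `TokenState` is hypothetical):

```rust
use std::time::{Duration, Instant};

// Record when the Bearer token was obtained and how long it lives,
// then refresh slightly early to avoid 401s on in-flight requests.
struct TokenState {
    obtained_at: Instant,
    expires_in: Duration, // from the "expires_in" field of the auth response
}

impl TokenState {
    fn needs_refresh(&self, now: Instant, margin: Duration) -> bool {
        now + margin >= self.obtained_at + self.expires_in
    }
}

fn main() {
    let t0 = Instant::now();
    let state = TokenState { obtained_at: t0, expires_in: Duration::from_secs(600) };
    // Fresh token with a 60 s safety margin: no refresh needed yet.
    assert!(!state.needs_refresh(t0, Duration::from_secs(60)));
    // 550 s in, within 60 s of the 600 s expiry: time to refresh.
    assert!(state.needs_refresh(t0 + Duration::from_secs(550), Duration::from_secs(60)));
}
```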
Pattern 5: Native App with OAuth2 (Not Yet Supported)
OAuth2 authentication from native apps requires special handling due to In-App Browser limitations.
The Challenge
In-App Browsers (ASWebAuthenticationSession on iOS, Custom Tabs on Android) cannot read HTTP response bodies or headers. They can only detect URL changes.
OAuth2 Authentication Flow
Participants: Native App, In-App Browser, API Server (o2p), OAuth2 Provider.

1. Native App opens the In-App Browser
2. Browser requests `GET /o2p/oauth2/google/start`
3. API Server responds with a 302 redirect to Google
4. User authenticates with Google
5. Google responds with a 302 redirect to `/o2p/oauth2/callback?code=xxx`
6. Browser requests `GET /o2p/oauth2/callback?code=xxx`
7. API Server exchanges the code for tokens with the provider
8. Provider returns the access token and ID token
9. API Server responds with a 302 redirect to `myapp://callback?code=yyy`
10. The URL scheme triggers the app launch
11. Native App calls `POST /o2p/api/token/exchange { code: yyy }`
12. API Server returns `{ token: "xxx", token_type: "Bearer", expires_in: 600 }`
Key Points:
- Steps 1-8: Standard OAuth2 flow (same as browser)
- Step 9: Instead of returning session cookie, redirect to Custom URL Scheme with a short-lived code
- Steps 10-12: Native app exchanges code for Bearer token via API call
Redirect Methods
Option A: Custom URL Scheme
myapp://callback?code=xxx
- Pros: Simple to implement, works on all platforms
- Cons: Security risk - any app can register the same scheme
Security Risk: URL Scheme Hijacking
1. Legitimate app registers: myapp://
2. Malicious app also registers: myapp://
3. When OAuth2 redirects to myapp://callback?code=xxx
4. OS may open malicious app instead
5. Malicious app steals the authorization code
This is why we use a short-lived code (not a long-lived token) in the redirect URL. Even if intercepted, the code:
- Expires quickly (typically 30-60 seconds)
- Can only be exchanged once
- Requires the code exchange endpoint
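The server-side property described above, a code that expires quickly and can be redeemed at most once, can be sketched with a simple store. This is illustrative only; the planned exchange endpoint is not implemented in the library yet, and `CodeStore` is a hypothetical type:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

// Short-lived, one-time-use code store: code -> (user_id, issued_at).
struct CodeStore {
    ttl: Duration,
    codes: HashMap<String, (String, Instant)>,
}

impl CodeStore {
    fn issue(&mut self, code: &str, user_id: &str, now: Instant) {
        self.codes.insert(code.to_string(), (user_id.to_string(), now));
    }

    // Redeeming removes the entry, so a second attempt fails even if the
    // code was intercepted via URL-scheme hijacking; expired codes fail too.
    fn redeem(&mut self, code: &str, now: Instant) -> Option<String> {
        let (user_id, issued_at) = self.codes.remove(code)?;
        (now.duration_since(issued_at) <= self.ttl).then_some(user_id)
    }
}

fn main() {
    let t0 = Instant::now();
    let mut store = CodeStore { ttl: Duration::from_secs(30), codes: HashMap::new() };
    store.issue("yyy", "user-1", t0);
    assert_eq!(store.redeem("yyy", t0 + Duration::from_secs(5)), Some("user-1".into()));
    assert_eq!(store.redeem("yyy", t0 + Duration::from_secs(6)), None); // one-time use
    store.issue("zzz", "user-2", t0);
    assert_eq!(store.redeem("zzz", t0 + Duration::from_secs(31)), None); // expired
}
```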
Option B: Universal Links / App Links (Recommended)
https://api.example.com/app/callback?code=xxx
- Pros: Secure - only verified domain owner’s app can receive
- Cons: Requires server configuration, HTTPS only
How Universal Links Work
- Server Configuration: Host `apple-app-site-association` (iOS) or `assetlinks.json` (Android) at `/.well-known/`
- App Registration: The app declares which domains it handles in its entitlements
- Verification: OS verifies the association file matches the app’s bundle ID / package name
- Secure Routing: Only the verified app can receive links for that domain
Example: apple-app-site-association
```json
{
  "applinks": {
    "apps": [],
    "details": [{
      "appID": "TEAMID.com.example.myapp",
      "paths": ["/app/callback"]
    }]
  }
}
```
Universal Links Flow
Participants: Native App, OS (iOS/Android), API Server (o2p), OAuth2 Provider.

- Step 0: At app install, the OS fetches `GET /.well-known/apple-app-site-association` from the API Server (`{ applinks: { details: [...] } }`) and caches the app-domain association
- Steps 1-8: OAuth2 authentication proceeds exactly as in the Custom URL Scheme flow
- Step 9: API Server responds with a 302 redirect to `https://api.example.com/app/callback?code=yyy`
- Step 10: The OS intercepts the HTTPS URL
- Step 11: The OS verifies the domain is registered for this app
- Step 12: The OS launches the app with the URL
- Step 13: Native App calls `POST /o2p/api/token/exchange { code: yyy }`
- Step 14: API Server returns `{ token: "xxx", token_type: "Bearer", expires_in: 600 }`
Key Differences from Custom URL Scheme:
- Step 0: The OS fetches and caches `apple-app-site-association` at app install time
- Steps 10-11: The OS verifies the domain is registered for this specific app
- No hijacking possible: Only the verified app can receive URLs for this domain
Configuration (Planned)
# .env
ORIGIN='https://api.example.com'
SESSION_AUTH_MODE=bearer
# For Custom URL Scheme (Option A)
NATIVE_APP_CALLBACK_URL='myapp://callback'
# For Universal Links (Option B - Recommended)
NATIVE_APP_CALLBACK_URL='https://api.example.com/app/callback'
Security Comparison
| Method | URL Scheme Hijacking | Requires HTTPS | Setup Complexity |
|---|---|---|---|
| Custom URL Scheme | Vulnerable | No | Low |
| Universal Links | Protected | Yes | Medium |
Recommendation: Use Universal Links / App Links for production applications. Custom URL Scheme may be acceptable for development or low-security applications.
Implementation Status
This pattern requires additional implementation:
- Code Exchange Endpoint: POST /o2p/api/token/exchange
- OAuth2 Callback Modification: Redirect to the configured callback URL with the code
- Code Generation: Short-lived, one-time-use codes
- Configuration: NATIVE_APP_CALLBACK_URL environment variable
Currently not supported. See issue #2026-01-23-01 for tracking.
Infrastructure Considerations
Session Storage
| Deployment | Recommended Storage |
|---|---|
| Single server | Memory (default) |
| Multiple servers (Load Balanced) | Redis |
For multi-server deployments, all servers must share the same session store:
# .env
GENERIC_CACHE_STORE_TYPE=redis
GENERIC_CACHE_STORE_URL='redis://redis-server:6379'
Database
Similarly, all servers must share the same database:
# .env
GENERIC_DATA_STORE_TYPE=postgres
GENERIC_DATA_STORE_URL='postgres://user:pass@db-server/oauth2_passkey'
Choosing the Right Pattern
| Use Case | Recommended Pattern |
|---|---|
| Traditional web app | Pattern 1 (Same-Origin) |
| SPA on same domain | Pattern 1 (Same-Origin) |
| SPA on subdomain | Pattern 2 (Cross-Origin, Same-Site) |
| SPA on different domain | Use reverse proxy → Pattern 1 |
| iOS/Android with Passkey | Pattern 4 (Native Passkey) |
| iOS/Android with OAuth2 | Pattern 5 (not yet supported) |
Summary
- For browsers: Use Cookie-based authentication (Pattern 1 or 2)
- For native apps with Passkey: Use Bearer token authentication (Pattern 4)
- Avoid Cross-Site: Third-party cookies are unreliable
- Multi-server: Always use shared Redis and database
Built-in Themes
The library ships with 9 pre-built CSS themes. To apply a theme, set a single environment variable – no custom CSS files or additional code needed.
Quick Start
# .env
O2P_CUSTOM_CSS_URL=/o2p/themes/theme-zinc.css
Available Themes
Themes are served at {O2P_ROUTE_PREFIX}/themes/ (default prefix: /o2p).
| Theme | URL Path | Style |
|---|---|---|
| Zinc | /o2p/themes/theme-zinc.css | Neutral zinc palette |
| Slate | /o2p/themes/theme-slate.css | Cool slate gray palette |
| Blue | /o2p/themes/theme-blue.css | Primary blue palette |
| Violet | /o2p/themes/theme-violet.css | Elegant violet palette |
| Rose | /o2p/themes/theme-rose.css | Warm rose palette |
| Neumorphism | /o2p/themes/theme-neumorphism.css | Soft shadows creating depth |
| Material | /o2p/themes/theme-material.css | Google Material design principles |
| Eco | /o2p/themes/theme-eco.css | Nature-inspired green tones |
| SaaS | /o2p/themes/theme-saas.css | Stripe-inspired purple accents |
Note: If you use a custom O2P_ROUTE_PREFIX, replace /o2p in the paths accordingly.
Theme Types
Variable-only themes
Zinc, Slate, Blue, Violet, Rose
These themes override :root CSS variables and a few selectors like the .login-page background gradient. They are lightweight and predictable – the page structure and layout remain identical to the default.
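As an illustration, a variable-only theme in this style boils down to a handful of overrides; the variable names below are from the CSS Custom Properties reference, while the color values are arbitrary:

```css
/* Sketch of a variable-only theme: a few :root overrides plus the
   .login-page gradient; page structure and layout are untouched. */
:root {
  --o2p-primary: #2563eb;
  --o2p-primary-hover: #1d4ed8;
  --o2p-background: #f8fafc;
}

.login-page {
  background: linear-gradient(135deg, #2563eb 0%, #1e40af 100%);
}
```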
Extended themes
Neumorphism, Material, Eco, SaaS
These themes override CSS variables and add additional CSS rules for borders, shadows, gradients, and other visual effects. They produce a more distinctive look.
Further Customization
If the built-in themes don’t match your needs, you can create your own CSS file. See Customizing CSS for the full CSS Custom Properties reference and examples.
Customizing CSS
This page explains how to create your own CSS theme for the built-in UI pages. If you just want to pick a ready-made theme, see Built-in Themes first.
The built-in pages use CSS Custom Properties (CSS variables) for theming. You can override these variables to change colors, fonts, spacing, and more without modifying the HTML structure.
Quick Start
- Create a CSS file with your overrides:
/* static/my-theme.css */
:root {
--o2p-primary: #ff6b6b;
--o2p-background: #1a1a2e;
}
- Serve the CSS file from your application (see Serving Your CSS File).
- Set the environment variable:
# .env
O2P_CUSTOM_CSS_URL=/static/my-theme.css
CSS Custom Properties Reference
Colors
| Property | Default | Description |
|---|---|---|
| --o2p-primary | #4f46e5 | Primary action buttons |
| --o2p-primary-hover | #4338ca | Primary button hover state |
| --o2p-oauth2 | #6366f1 | OAuth2 buttons |
| --o2p-oauth2-hover | #4f46e5 | OAuth2 button hover state |
| --o2p-passkey | #818cf8 | Passkey buttons |
| --o2p-passkey-hover | #6366f1 | Passkey button hover state |
| --o2p-danger | #dc2626 | Delete/danger buttons |
| --o2p-danger-hover | #b91c1c | Danger button hover state |
| --o2p-secondary | #6b7280 | Secondary/cancel buttons |
| --o2p-secondary-hover | #4b5563 | Secondary button hover state |
Text
| Property | Default | Description |
|---|---|---|
| --o2p-text | #111827 | Primary text color |
| --o2p-text-secondary | #4b5563 | Secondary text color |
| --o2p-text-light | #9ca3af | Light/muted text |
Backgrounds
| Property | Default | Description |
|---|---|---|
| --o2p-background | #f9fafb | Page background |
| --o2p-surface | #ffffff | Card/container background |
| --o2p-surface-alt | #f3f4f6 | Alternate surface (items) |
Borders & Radius
| Property | Default | Description |
|---|---|---|
| --o2p-border | #e5e7eb | Border color |
| --o2p-border-light | #f3f4f6 | Light border color |
| --o2p-radius-sm | 4px | Small radius (inputs) |
| --o2p-radius-md | 6px | Medium radius (buttons) |
| --o2p-radius-lg | 8px | Large radius (cards) |
Spacing
| Property | Default | Description |
|---|---|---|
| --o2p-space-xs | 4px | Extra small spacing |
| --o2p-space-sm | 8px | Small spacing |
| --o2p-space-md | 16px | Medium spacing |
| --o2p-space-lg | 24px | Large spacing |
| --o2p-space-xl | 32px | Extra large spacing |
Typography
| Property | Default | Description |
|---|---|---|
| --o2p-font | system-ui, -apple-system, ... | Font family |
| --o2p-font-size | 16px | Base font size |
| --o2p-line-height | 1.6 | Line height |
Shadows
| Property | Default | Description |
|---|---|---|
| --o2p-shadow | 0 1px 3px rgba(0,0,0,0.08) | Standard shadow |
| --o2p-shadow-lg | 0 4px 12px rgba(0,0,0,0.1) | Large shadow |
Serving Your CSS File
Add a route in your application to serve the custom CSS:
use axum::{Router, routing::get, response::Response, http::{StatusCode, header::CONTENT_TYPE}};
async fn serve_custom_css() -> Response {
let css = include_str!("../static/my-theme.css");
Response::builder()
.status(StatusCode::OK)
.header(CONTENT_TYPE, "text/css")
.body(css.into())
.unwrap()
}
let app = Router::new()
.route("/static/my-theme.css", get(serve_custom_css))
.merge(oauth2_passkey_full_router());
Credential Type Styling
Passkey and OAuth2 credentials are visually distinguished with colored left borders:
- Passkey credentials: uses the --o2p-passkey color
- OAuth2 accounts: uses the --o2p-secondary color
These use CSS classes .passkey and .oauth2 on credential items:
.item.passkey {
border-left-color: var(--o2p-passkey);
}
.item.oauth2 {
border-left-color: var(--o2p-secondary);
}
Examples
Dark Mode
/* dark-theme.css */
:root {
/* Dark backgrounds */
--o2p-background: #1a1a2e;
--o2p-surface: #16213e;
--o2p-surface-alt: #1f2b47;
/* Light text */
--o2p-text: #e4e4e4;
--o2p-text-secondary: #a0a0a0;
--o2p-text-light: #6c6c6c;
/* Darker borders */
--o2p-border: #0f3460;
--o2p-border-light: #1a3a5c;
/* Adjusted shadows for dark mode */
--o2p-shadow: 0 2px 8px rgba(0, 0, 0, 0.3);
--o2p-shadow-lg: 0 4px 16px rgba(0, 0, 0, 0.4);
}
Brand Colors
/* brand-theme.css */
:root {
/* Use your brand's primary color */
--o2p-primary: #e91e63;
--o2p-primary-hover: #c2185b;
}
/* Override the login page gradient */
.login-page {
background: linear-gradient(135deg, #e91e63 0%, #9c27b0 100%);
}
Rounded Style
/* rounded-theme.css */
:root {
--o2p-radius-sm: 12px;
--o2p-radius-md: 16px;
--o2p-radius-lg: 24px;
}
When to Use Templates
CSS customization is sufficient for most branding needs. Consider template customization when you need:
- Different page structure or layout
- Additional form fields or sections
- Integration with your existing design system
- Completely different user flow
Customizing Built-in Pages - Templates
This library provides built-in UI pages for login, account management, and administration:
- Login (/o2p/user/login) - Sign in and account creation
- Account (/o2p/user/account) - User account management
- Admin List (/o2p/admin/index) - User list for administrators
- Admin User (/o2p/admin/user/{id}) - User detail view for administrators
You can customize these pages in three ways:
| Method | Effort | When to Use |
|---|---|---|
| Built-in Themes | None | Pick a pre-built theme |
| CSS | Low | Change colors, fonts, spacing |
| Templates (this page) | High | Replace page structure entirely |
Overview
This page explains how to create custom pages to replace the built-in UI. The process involves:
- Creating your custom pages (handlers + templates)
- Disabling the built-in UI via feature flags
See Disabling Built-in UI for feature flag configuration.
Custom Login Page
By default, the AuthUser extractor redirects unauthenticated users to the built-in login page at /o2p/user/login.
To use your own custom login page:
- Set the O2P_LOGIN_URL environment variable to your page URL
- Create your login page with the JavaScript APIs
┌─────────────────────────────────────────────────────────────┐
│ User visits /protected │
│ ↓ │
│ AuthUser extractor checks session │
│ ↓ │
│ Not authenticated -> Redirect to O2P_LOGIN_URL │
│ ↓ │
│ Your custom login page (/login) │
│ ↓ │
│ User clicks login button -> JavaScript API handles auth │
│ ↓ │
│ Success -> Redirect back to original page │
└─────────────────────────────────────────────────────────────┘
1. Set Environment Variable
# .env
O2P_LOGIN_URL='/login'
2. Create Login Handler
use askama::Template;
use axum::{response::{Html, IntoResponse, Redirect}, http::StatusCode};
use oauth2_passkey_axum::{AuthUser, O2P_ROUTE_PREFIX};
#[derive(Template)]
#[template(path = "login.j2")]
struct LoginTemplate<'a> {
o2p_route_prefix: &'a str,
}
async fn login(user: Option<AuthUser>) -> impl IntoResponse {
match user {
Some(_) => Redirect::to("/").into_response(),
None => {
let template = LoginTemplate {
o2p_route_prefix: O2P_ROUTE_PREFIX.as_str(),
};
Html(template.render().unwrap()).into_response()
}
}
}
3. Create Login Template
<!-- templates/login.j2 -->
<!DOCTYPE html>
<html>
<head>
<script>
const O2P_ROUTE_PREFIX = '{{o2p_route_prefix}}';
</script>
<script src="{{o2p_route_prefix}}/oauth2/oauth2.js"></script>
<script src="{{o2p_route_prefix}}/passkey/passkey.js"></script>
</head>
<body>
<h1>Login</h1>
<!-- Sign In -->
<button onclick="oauth2.openPopup('login')">Sign in with Google</button>
<button onclick="startAuthentication()">Sign in with Passkey</button>
<!-- Create Account -->
<button onclick="oauth2.openPopup('create_user')">Create account with Google</button>
<button onclick="showRegistrationModal('create_user')">Create account with Passkey</button>
</body>
</html>
4. Register Route
let app = Router::new()
.route("/login", get(login))
.route("/protected", get(protected))
.merge(oauth2_passkey_full_router());
Custom Account Page
The library provides a built-in account management page at /o2p/user/account, but you can create your own.
1. Create Summary Handler
use oauth2_passkey_axum::{
AuthUser, O2P_ROUTE_PREFIX, OAuth2Account, PasskeyCredential,
UserId, list_accounts_core, list_credentials_core,
};
#[derive(Template)]
#[template(path = "summary.j2")]
struct SummaryTemplate {
user_account: String,
user_label: String,
passkeys: Vec<PasskeyInfo>,
oauth2_accounts: Vec<OAuth2Info>,
}
// Plain data structs used by the template (not part of the library)
struct PasskeyInfo { name: String, created_at: String }
struct OAuth2Info { provider: String, email: String }
async fn summary(user: AuthUser) -> impl IntoResponse {
let user_id = UserId::new(user.id.clone()).expect("Invalid user ID");
// Fetch passkey credentials
let passkeys = list_credentials_core(user_id.clone()).await
.unwrap_or_default()
.iter()
.map(|c| PasskeyInfo {
name: c.user.name.clone(),
created_at: c.created_at.format("%Y-%m-%d").to_string(),
})
.collect();
// Fetch OAuth2 accounts
let oauth2_accounts = list_accounts_core(user_id).await
.unwrap_or_default()
.iter()
.map(|a| OAuth2Info {
provider: a.provider.clone(),
email: a.email.clone(),
})
.collect();
let template = SummaryTemplate {
user_account: user.account,
user_label: user.label,
passkeys,
oauth2_accounts,
};
Html(template.render().unwrap())
}
2. Create Summary Template
<!-- templates/summary.j2 -->
<!DOCTYPE html>
<html>
<body>
<h1>User Summary</h1>
<h2>Account</h2>
<p>{{user_account}} ({{user_label}})</p>
<h2>Passkeys</h2>
{% for passkey in passkeys %}
<div>{{passkey.name}} - {{passkey.created_at}}</div>
{% endfor %}
<h2>OAuth2 Accounts</h2>
{% for account in oauth2_accounts %}
<div>{{account.provider}}: {{account.email}}</div>
{% endfor %}
</body>
</html>
3. Register Route
let app = Router::new()
.route("/summary", get(summary))
.merge(oauth2_passkey_full_router());
Custom Admin Page
The library provides a built-in admin interface at /o2p/admin/index for managing users.
Disabling Built-in Admin UI
To disable the built-in admin UI and create your own:
# Cargo.toml
[dependencies]
oauth2-passkey-axum = { version = "0.3", default-features = false, features = ["user-ui"] }
Admin Privilege Check
The first registered user (sequence_number = 1) is automatically an admin. Other users can be granted admin status. Check admin privileges using has_admin_privileges():
async fn admin_guard(user: AuthUser) -> Result<(), StatusCode> {
if !user.has_admin_privileges() {
return Err(StatusCode::FORBIDDEN);
}
Ok(())
}
1. Create Admin List Handler
use oauth2_passkey_axum::{
AuthUser, DbUser, SessionId, get_all_users,
};
#[derive(Template)]
#[template(path = "admin_list.j2")]
struct AdminListTemplate {
users: Vec<UserInfo>,
}
// Row data for the template (not part of the library)
struct UserInfo { id: String, account: String, label: String, is_admin: bool }
async fn admin_list(user: AuthUser) -> Result<impl IntoResponse, StatusCode> {
// Check admin privileges
if !user.has_admin_privileges() {
return Err(StatusCode::FORBIDDEN);
}
// Fetch all users
let session_id = SessionId::new(user.session_id.clone())
.map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;
let users = get_all_users(session_id).await
.map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?
.iter()
.map(|u| UserInfo {
id: u.id.clone(),
account: u.account.clone(),
label: u.label.clone(),
is_admin: u.has_admin_privileges(),
})
.collect();
let template = AdminListTemplate { users };
Ok(Html(template.render().unwrap()))
}
2. Create Admin List Template
<!-- templates/admin_list.j2 -->
<!DOCTYPE html>
<html>
<body>
<h1>User Management</h1>
<table>
<tr>
<th>Account</th>
<th>Label</th>
<th>Admin</th>
<th>Actions</th>
</tr>
{% for user in users %}
<tr>
<td>{{user.account}}</td>
<td>{{user.label}}</td>
<td>{{user.is_admin}}</td>
<td><a href="/admin/user/{{user.id}}">View</a></td>
</tr>
{% endfor %}
</table>
</body>
</html>
3. Register Admin Routes
let app = Router::new()
.route("/admin/users", get(admin_list))
.route("/admin/user/:id", get(admin_user_detail))
.merge(oauth2_passkey_full_router());
Admin API Functions
The library exports functions for admin operations. All require admin privileges.
| Function | Description |
|---|---|
| get_all_users(session_id) | Fetch all users |
| get_user(session_id, user_id) | Fetch a specific user |
| update_user_admin_status(session_id, user_id, is_admin) | Grant/revoke admin status |
| delete_user_account_admin(session_id, user_id) | Delete a user account |
| delete_passkey_credential_admin(session_id, credential_id) | Delete a passkey credential |
| delete_oauth2_account_admin(session_id, provider_user_id) | Unlink an OAuth2 account |
use oauth2_passkey_axum::{
SessionId, UserId, CredentialId, ProviderUserId,
get_all_users, get_user, update_user_admin_status,
delete_user_account_admin, delete_passkey_credential_admin,
delete_oauth2_account_admin,
};
// Example: Toggle admin status
async fn toggle_admin(user: AuthUser, target_user_id: &str) -> Result<(), String> {
let session_id = SessionId::new(user.session_id).map_err(|e| e.to_string())?;
let user_id = UserId::new(target_user_id.to_string()).map_err(|e| e.to_string())?;
// Get current status
let target = get_user(session_id.clone(), user_id.clone()).await
.map_err(|e| e.to_string())?
.ok_or("User not found")?;
// Toggle (first user cannot be changed)
update_user_admin_status(session_id, user_id, !target.is_admin).await
.map_err(|e| e.to_string())?;
Ok(())
}
Note: The first user (sequence_number = 1) cannot have their admin status changed for security reasons.
JavaScript API
Authentication
| Function | Description |
|---|---|
| oauth2.openPopup('login') | Sign in with OAuth2 |
| oauth2.openPopup('create_user') | Create account with OAuth2 |
| startAuthentication() | Sign in with passkey |
| showRegistrationModal('create_user') | Create account with passkey |
Account Linking (from summary page)
| Function | Description |
|---|---|
| oauth2.openPopup('add_to_user') | Link OAuth2 account to current user |
| showRegistrationModal('add_to_user') | Add passkey to current user |
REST API for Account Management
All endpoints require CSRF token in X-CSRF-Token header.
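The csrfToken used below can be obtained the same way the built-in passkey pages do, by reading the X-CSRF-Token response header from a HEAD request:

```javascript
// Obtain the CSRF token once, then reuse it for the calls below.
let csrfToken = null;

// Extracted as a helper so it works with any Headers-like object
function readCsrfToken(headers) {
  return headers.get('X-CSRF-Token') || null;
}

async function initCsrfToken() {
  const response = await fetch(window.location.href, { method: 'HEAD' });
  csrfToken = readCsrfToken(response.headers);
  return csrfToken;
}
```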
User Profile
// Update account/label
fetch(`${O2P_ROUTE_PREFIX}/user/update`, {
method: 'PUT',
headers: { 'X-CSRF-Token': csrfToken, 'Content-Type': 'application/json' },
body: JSON.stringify({ user_id, account, label })
});
// Delete account (removes all linked credentials)
fetch(`${O2P_ROUTE_PREFIX}/user/delete`, {
method: 'DELETE',
headers: { 'X-CSRF-Token': csrfToken, 'Content-Type': 'application/json' },
body: JSON.stringify({ user_id })
});
Passkey Credentials
// Delete passkey
fetch(`${O2P_ROUTE_PREFIX}/passkey/credentials/${credentialId}`, {
method: 'DELETE',
headers: { 'X-CSRF-Token': csrfToken }
});
OAuth2 Accounts
// Unlink OAuth2 account
fetch(`${O2P_ROUTE_PREFIX}/oauth2/accounts/${provider}/${providerUserId}`, {
method: 'DELETE',
headers: { 'X-CSRF-Token': csrfToken }
});
Logout
window.location.href = O2P_ROUTE_PREFIX + "/user/logout?redirect=/";
Working Example
See demo-custom-login for a complete working example with styled templates.
demo-custom-login/
├── src/
│ └── main.rs # Routes and handlers
├── templates/
│ ├── login.j2 # Custom login page
│ ├── summary.j2 # Custom summary page
│ ├── index_anon.j2 # Index for anonymous users
│ ├── index_user.j2 # Index for authenticated users
│ └── protected.j2 # Protected page
└── Cargo.toml
cd demo-custom-login
cp ../dot.env.example .env
# Add: O2P_LOGIN_URL='/login'
cargo run
# Open http://localhost:3001
Environment Variables
| Variable | Default | Description |
|---|---|---|
| O2P_LOGIN_URL | /o2p/user/login | Redirect destination for unauthenticated users |
| O2P_ADMIN_URL | /o2p/admin/index | Admin panel URL (used in summary page) |
| O2P_ROUTE_PREFIX | /o2p | Prefix for all auth endpoints |
Note: O2P_LOGIN_URL is required for custom login pages to work. Although it doesn’t appear in your application code, the library reads it internally to determine where to redirect unauthenticated users.
Disabling Built-in UI
After creating your custom pages, disable the corresponding built-in UI to avoid shipping unused code.
The library provides three feature flags:
| Feature | Default | Controls |
|---|---|---|
| login-ui | ON | Login page (/user/login) |
| user-ui | ON | Account management page (/user/account) |
| admin-ui | ON | Admin pages (/admin/index, /admin/user/{id}) |
Configure these in your Cargo.toml based on which pages you’re replacing:
# Replace ALL pages with custom templates (recommended for full customization)
oauth2-passkey-axum = { version = "0.3", default-features = false }
# Custom login page only, keep built-in account management and admin UI
oauth2-passkey-axum = { version = "0.3", default-features = false, features = ["user-ui", "admin-ui"] }
# Replace only admin pages, keep built-in login and account management
oauth2-passkey-axum = { version = "0.3", default-features = false, features = ["login-ui", "user-ui"] }
# Replace only user pages, keep built-in login and admin
oauth2-passkey-axum = { version = "0.3", default-features = false, features = ["login-ui", "admin-ui"] }
Note: API endpoints (logout, delete account, admin operations, etc.) are always available regardless of feature flags. Only the HTML pages and their static assets are affected.
See demo-custom-login for a complete example with default-features = false.
OAuth2 JavaScript API
This guide explains how to use the OAuth2 JavaScript API for popup-based Google authentication.
Loading the Script
Include the oauth2.js script in your HTML page:
<script src="{{auth_route_prefix}}/oauth2/oauth2.js"></script>
The script path uses your configured authentication route prefix. If using Jinja2/Tera templates, pass the prefix from your server.
Setting the Route Prefix
Define O2P_ROUTE_PREFIX before using the API. This tells the script where to find OAuth2 endpoints:
const O2P_ROUTE_PREFIX = '/auth'; // Adjust to match your configuration
In templates:
<script>
const O2P_ROUTE_PREFIX = '{{auth_route_prefix}}';
</script>
API Functions
oauth2.openPopup(mode)
Opens a popup window for Google OAuth2 authentication. The mode parameter determines the authentication behavior.
Create New User
Creates a new user account. Fails if the Google account is already linked:
oauth2.openPopup('create_user')
Login Existing User
Logs in an existing user. Fails if the Google account is not registered:
oauth2.openPopup('login')
Create User or Login
Automatically creates a new user or logs in if the account already exists:
oauth2.openPopup('create_user_or_login')
Automatic Page Reload
When authentication completes successfully, the popup sends an auth_complete message to the parent window. The oauth2.js script automatically reloads the parent page to reflect the new authentication state.
Complete HTML Example
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>OAuth2 Login Example</title>
</head>
<body>
<h1>Welcome</h1>
<div>
<button onclick="oauth2.openPopup('create_user')">Create User</button>
<button onclick="oauth2.openPopup('login')">Login</button>
<button onclick="oauth2.openPopup('create_user_or_login')">Either way</button>
</div>
<!-- Set the route prefix, then load the OAuth2 JavaScript -->
<script>
const O2P_ROUTE_PREFIX = '/auth';
</script>
<script src="/auth/oauth2/oauth2.js"></script>
</body>
</html>
Template Example (Jinja2/Tera)
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>OAuth2 Login</title>
</head>
<body>
<h1>{{message}}</h1>
<div>
<button onclick="oauth2.openPopup('create_user')">Create User</button>
<button onclick="oauth2.openPopup('login')">Login</button>
<button onclick="oauth2.openPopup('create_user_or_login')">Either way</button>
<script>
const O2P_ROUTE_PREFIX = '{{auth_route_prefix}}';
</script>
<script src="{{auth_route_prefix}}/oauth2/oauth2.js"></script>
</div>
</body>
</html>
Notes
- The popup window opens at 550x640 pixels positioned at the right side of the screen
- Popup blockers may interfere with the authentication flow; inform users if popups are blocked
- The script handles cleanup automatically when the page unloads
Passkey JavaScript API
This guide explains how to use the Passkey JavaScript API for WebAuthn/Passkey authentication.
Loading the Script
Include the passkey.js script in your HTML page:
<script src="{{o2p_route_prefix}}/passkey/passkey.js"></script>
The script path uses your configured authentication route prefix.
Setting the Route Prefix
Define O2P_ROUTE_PREFIX before loading the script. This tells the script where to find Passkey endpoints:
const O2P_ROUTE_PREFIX = '/auth'; // Adjust to match your configuration
In templates:
<script>
const O2P_ROUTE_PREFIX = '{{o2p_route_prefix}}';
</script>
<script src="{{o2p_route_prefix}}/passkey/passkey.js"></script>
WebAuthn Feature Detection
The passkey.js script automatically detects WebAuthn Signal API capabilities on page load using the getClientCapabilities() API (Chrome 131+, Edge 132+).
initPasskeyCapabilities()
Called automatically when passkey.js loads. Queries the browser for supported WebAuthn capabilities and caches the result.
// Called automatically - no need to call manually
// Result cached in _passkeyCapabilities
await initPasskeyCapabilities();
hasSignalCapability(capabilityName)
Check whether a specific WebAuthn Signal API capability is supported by the browser.
if (hasSignalCapability('signalUnknownCredential')) {
// Browser supports telling authenticator about deleted credentials
}
if (hasSignalCapability('signalAllAcceptedCredentials')) {
// Browser supports credential list synchronization
}
Falls back to typeof checks when getClientCapabilities() is not available.
CSRF Token for Registration
Passkey registration requires a CSRF token. Obtain it from the response header and define it before calling registration functions:
let csrfToken = null;
// Fetch CSRF token from response header
fetch(window.location.href, { method: 'HEAD' })
.then(response => {
csrfToken = response.headers.get('X-CSRF-Token') || null;
});
API Functions
startAuthentication()
Initiates passkey authentication. Opens the browser’s passkey selector and verifies the credential with the server.
startAuthentication()
On success, the page automatically reloads to reflect the authenticated state.
showRegistrationModal(mode)
Opens a modal dialog for passkey registration. The user enters a username and display name, then the browser prompts to create a passkey.
Create New User
Creates a new user account with a passkey:
showRegistrationModal('create_user')
The modal pre-fills default values. If the user is already logged in, it attempts to fetch user info and pre-fill based on existing account data.
Complete HTML Example
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Passkey Login Example</title>
<script>
const O2P_ROUTE_PREFIX = '/auth';
</script>
<script src="/auth/passkey/passkey.js"></script>
</head>
<body>
<h1>Welcome</h1>
<div>
<button onclick="showRegistrationModal('create_user')">Register Passkey</button>
<button onclick="startAuthentication()">Sign in</button>
</div>
<script>
// CSRF token required for registration
let csrfToken = null;
fetch(window.location.href, { method: 'HEAD' })
.then(response => {
csrfToken = response.headers.get('X-CSRF-Token') || null;
});
</script>
</body>
</html>
Template Example (Jinja2/Tera)
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Passkey Demo</title>
<script>
const O2P_ROUTE_PREFIX = '{{o2p_route_prefix}}';
</script>
<script src="{{o2p_route_prefix}}/passkey/passkey.js"></script>
</head>
<body>
<h1>{{message}}</h1>
<p>Welcome to our site.</p>
<div style="display: flex; gap: 10px; margin-bottom: 20px;">
<button onclick="showRegistrationModal('create_user')">Register Passkey</button>
<button onclick="startAuthentication()">Sign in</button>
</div>
<script>
let csrfToken = null;
fetch(window.location.href, { method: 'HEAD' })
.then(response => {
csrfToken = response.headers.get('X-CSRF-Token') || null;
});
</script>
</body>
</html>
Notes
- The registration modal dynamically creates a form for username and display name input
- On successful authentication or registration, the page automatically reloads
- Browser support for WebAuthn/Passkey is required; check compatibility before deployment
- The csrfToken variable must be defined in the global scope for registration to work
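For the compatibility check mentioned above, a minimal feature-detection sketch; the data-passkey attribute is a hypothetical marker for your passkey buttons:

```javascript
// Detect WebAuthn support before showing passkey UI.
function webauthnAvailable() {
  return typeof window !== 'undefined'
    && typeof window.PublicKeyCredential === 'function';
}

// Hide elements marked with the (hypothetical) data-passkey attribute
function hidePasskeyUi() {
  document.querySelectorAll('[data-passkey]').forEach((el) => {
    el.hidden = true;
  });
}

if (typeof document !== 'undefined' && !webauthnAvailable()) {
  hidePasskeyUi();
}
```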
Askama Templates with AuthUser
This guide explains how to use Askama templates with the AuthUser extractor in oauth2-passkey applications.
Overview
Askama is a type-safe, compiled template engine for Rust that uses Jinja2-like syntax. When combined with oauth2-passkey, you can create dynamic HTML pages that display authenticated user information.
Defining Template Structs
Template structs define the data available in your templates. Use the #[derive(Template)] macro and specify the template file path.
Basic Template with Message
use askama::Template;
#[derive(Template)]
#[template(path = "index_anon.j2")]
struct IndexTemplateAnon<'a> {
message: &'a str,
auth_route_prefix: &'a str,
}
Template with AuthUser
To display authenticated user information, include AuthUser as a field:
use askama::Template;
use oauth2_passkey_axum::AuthUser;
#[derive(Template)]
#[template(path = "protected.j2")]
struct ProtectedTemplate<'a> {
user: AuthUser,
auth_route_prefix: &'a str,
}
Passing O2P_ROUTE_PREFIX to Templates
The O2P_ROUTE_PREFIX constant contains the authentication route prefix (default: /o2p). Pass it to templates to construct authentication URLs correctly.
use oauth2_passkey_axum::{AuthUser, O2P_ROUTE_PREFIX};
#[derive(Template)]
#[template(path = "index_user.j2")]
struct IndexTemplateUser<'a> {
message: &'a str,
auth_route_prefix: &'a str,
}
// When creating the template:
let template = IndexTemplateUser {
message: &message,
auth_route_prefix: O2P_ROUTE_PREFIX.as_str(),
};
Rendering Templates in Handlers
Handler with Optional Authentication
Use Option<AuthUser> to handle both authenticated and anonymous users:
use askama::Template;
use axum::{http::StatusCode, response::Html};
use oauth2_passkey_axum::{AuthUser, O2P_ROUTE_PREFIX};
#[derive(Template)]
#[template(path = "index_user.j2")]
struct IndexTemplateUser<'a> {
message: &'a str,
auth_route_prefix: &'a str,
}
#[derive(Template)]
#[template(path = "index_anon.j2")]
struct IndexTemplateAnon<'a> {
message: &'a str,
auth_route_prefix: &'a str,
}
pub async fn index(user: Option<AuthUser>) -> Result<Html<String>, (StatusCode, String)> {
match user {
Some(u) => {
let message = format!("Hey {}!", u.account);
let template = IndexTemplateUser {
message: &message,
auth_route_prefix: O2P_ROUTE_PREFIX.as_str(),
};
let html = Html(
template
.render()
.map_err(|e| (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()))?,
);
Ok(html)
}
None => {
let message = "Click the Login button below.".to_string();
let template = IndexTemplateAnon {
message: &message,
auth_route_prefix: O2P_ROUTE_PREFIX.as_str(),
};
let html = Html(
template
.render()
.map_err(|e| (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()))?,
);
Ok(html)
}
}
}
Handler Requiring Authentication
Use AuthUser directly (not Option) to require authentication:
pub async fn protected(user: AuthUser) -> Result<Html<String>, (StatusCode, String)> {
let template = ProtectedTemplate {
user,
auth_route_prefix: O2P_ROUTE_PREFIX.as_str(),
};
let html = Html(
template
.render()
.map_err(|e| (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()))?,
);
Ok(html)
}
AuthUser Fields
The AuthUser struct provides the following fields for use in templates:
| Field | Type | Description |
|---|---|---|
| id | String | Unique user identifier |
| account | String | User’s account name (email or username) |
| label | String | User’s display name |
| is_admin | bool | Whether the user has admin privileges |
| sequence_number | Option<i64> | Database-assigned sequence number |
| created_at | DateTime<Utc> | When the account was created |
| updated_at | DateTime<Utc> | When the account was last updated |
| csrf_token | String | CSRF token for form submissions |
Template Examples (Jinja2 Syntax)
Displaying User Information
<div class="user-info">
<h2>Your Account Information:</h2>
<table>
<tr>
<td>User ID</td>
<td>{{ user.id }}</td>
</tr>
<tr>
<td>Account</td>
<td>{{ user.account }}</td>
</tr>
<tr>
<td>Label</td>
<td>{{ user.label }}</td>
</tr>
<tr>
<td>Is Admin</td>
<td>{{ user.is_admin }}</td>
</tr>
<tr>
<td>Created At</td>
<td>{{ user.created_at }}</td>
</tr>
</table>
</div>
Handling Optional Fields
Use conditional syntax for Option types:
<tr>
<td>Sequence Number</td>
<td>{% if user.sequence_number.is_some() %}{{ user.sequence_number.unwrap() }}{% else %}None{% endif %}</td>
</tr>
Using auth_route_prefix for URLs
Include the route prefix in authentication-related URLs:
<!-- Logout button -->
<button onclick="Logout()">Logout</button>
<script>
function Logout() {
window.location.href = "{{auth_route_prefix}}/user/logout?redirect=/";
}
</script>
<!-- Include OAuth2 JavaScript -->
<script src="{{auth_route_prefix}}/oauth2/oauth2.js"></script>
<!-- Set prefix for JavaScript use -->
<script>
const O2P_ROUTE_PREFIX = '{{auth_route_prefix}}';
</script>
Login Buttons (Anonymous Users)
<button onclick="oauth2.openPopup('create_user')">Create User</button>
<button onclick="oauth2.openPopup('login')">Login</button>
<button onclick="oauth2.openPopup('create_user_or_login')">Either way</button>
<script src="{{auth_route_prefix}}/oauth2/oauth2.js"></script>
<script>
const O2P_ROUTE_PREFIX = '{{auth_route_prefix}}';
</script>
Conditional Content Based on Admin Status
{% if user.is_admin %}
<div class="admin-panel">
<h3>Admin Controls</h3>
<a href="{{auth_route_prefix}}/admin/index">Manage Users</a>
</div>
{% endif %}
Including CSRF Token in Forms
<form method="post" action="/api/update-profile">
<input type="hidden" name="csrf_token" value="{{ user.csrf_token }}">
<!-- form fields -->
<button type="submit">Update</button>
</form>
Template File Location
Place template files in a templates/ directory at your crate root. Askama will look for templates relative to this directory based on the path specified in #[template(path = "...")].
your-app/
├── Cargo.toml
├── src/
│ └── handlers.rs
└── templates/
├── index_anon.j2
├── index_user.j2
└── protected.j2
Dependencies
Add Askama to your Cargo.toml:
[dependencies]
askama = "0.12"
OAuth2 Account Linking Implementation Guide
This guide explains how to implement OAuth2 account linking functionality in your application using the oauth2-passkey library.
Overview
OAuth2 account linking allows users to connect multiple OAuth2/OpenID Connect accounts (like Google, GitHub, etc.) to a single user account in your application. This is useful for:
- Allowing users to sign in with different OAuth2 providers
- Consolidating multiple accounts under one user identity
- Providing flexibility in authentication methods
Prerequisites
- User must have an active session (already authenticated)
- OAuth2 provider must be configured in your application
- Understanding of the oauth2-passkey library session management
Implementation Steps
1. Get User’s CSRF Token
First, you need to retrieve the CSRF token from the user’s active session. The library provides an endpoint for this:
#![allow(unused)]
fn main() {
// In your Axum handler
async fn get_csrf_token(auth_user: AuthUser) -> Result<Json<Value>, (StatusCode, String)> {
Ok(Json(json!({
"csrf_token": auth_user.csrf_token
})))
}
}
2. Generate Page Session Token
Use the CSRF token to generate a page session token that will be used for session boundary protection:
#![allow(unused)]
fn main() {
use oauth2_passkey::generate_page_session_token;
// Generate page session token from CSRF token
let page_session_token = generate_page_session_token(&csrf_token);
}
3. Client-Side Implementation
HTML Template (using Jinja2/Askama)
<!-- Add OAuth2 Account Button -->
<button onclick="linkOAuth2Account()">Add New OAuth2 Account</button>
<script>
// Page session token for session boundary protection (from server)
const PAGE_SESSION_TOKEN = "{{ page_session_token }}";
function linkOAuth2Account() {
// Open OAuth2 popup with add_to_user mode and page session token
const oauth2Url = `/auth/oauth2/google/start?mode=add_to_user&context=${PAGE_SESSION_TOKEN}`;
// Open in popup window
const popup = window.open(
oauth2Url,
'oauth2_popup',
'width=500,height=600,scrollbars=yes,resizable=yes'
);
// Listen for popup completion
const checkClosed = setInterval(() => {
if (popup.closed) {
clearInterval(checkClosed);
// Refresh page or update UI to show new linked account
location.reload();
}
}, 1000);
}
</script>
JavaScript Module Approach
// oauth2-linking.js
class OAuth2AccountLinker {
constructor(pageSessionToken, routePrefix = '') {
this.pageSessionToken = pageSessionToken;
this.routePrefix = routePrefix;
}
/**
* Link a new OAuth2 account to the current user
* @param {string} provider - OAuth2 provider (e.g., 'google', 'github')
*/
linkAccount(provider) {
const oauth2Url = `${this.routePrefix}/auth/oauth2/${provider}/start?mode=add_to_user&context=${this.pageSessionToken}`;
return new Promise((resolve, reject) => {
const popup = window.open(
oauth2Url,
'oauth2_linking_popup',
'width=500,height=600,scrollbars=yes,resizable=yes'
);
if (!popup) {
reject(new Error('Popup blocked'));
return;
}
// Monitor popup for completion
const checkClosed = setInterval(() => {
if (popup.closed) {
clearInterval(checkClosed);
resolve();
}
}, 1000);
// Timeout after 5 minutes
setTimeout(() => {
clearInterval(checkClosed);
if (!popup.closed) {
popup.close();
}
reject(new Error('OAuth2 linking timeout'));
}, 300000);
});
}
}
// Usage
const linker = new OAuth2AccountLinker(PAGE_SESSION_TOKEN, '/auth');
linker.linkAccount('google')
.then(() => {
console.log('Account linked successfully');
// Update UI or refresh page
location.reload();
})
.catch(error => {
console.error('Account linking failed:', error);
});
4. Server-Side Handler Implementation
If you’re not using the oauth2_passkey_axum crate, you’ll need to implement the OAuth2 linking handler:
#![allow(unused)]
fn main() {
use oauth2_passkey::{
prepare_oauth2_auth_request,
verify_page_session_token,
};
// OAuth2 start handler for account linking
async fn oauth2_start_linking(
Query(params): Query<HashMap<String, String>>,
headers: HeaderMap,
) -> Result<impl IntoResponse, (StatusCode, String)> {
// Check if this is an account linking request
if params.get("mode") == Some(&"add_to_user".to_string()) {
// Verify page session token for session boundary protection
let context = params.get("context"); // Option<&String>
if let Err(e) = verify_page_session_token(&headers, context).await {
return Err((StatusCode::BAD_REQUEST, format!("Invalid session context: {}", e)));
}
// Continue with OAuth2 flow in add_to_user mode
// Use prepare_oauth2_auth_request to generate the authorization URL
// Returns (auth_url, response_headers) tuple
match prepare_oauth2_auth_request(headers, Some("add_to_user")).await {
Ok((auth_url, response_headers)) => {
// Build response with redirect and set cookies from response_headers
let mut response = Redirect::to(&auth_url).into_response();
response.headers_mut().extend(response_headers);
Ok(response)
}
Err(e) => {
Err((StatusCode::INTERNAL_SERVER_ERROR, format!("OAuth2 start failed: {}", e)))
}
}
} else {
// Handle regular OAuth2 registration/login
// ... existing logic
}
}
}
Note: The callback is handled by `get_authorized_core()` (for GET requests with `response_mode=query`) or `post_authorized_core()` (for POST requests with `response_mode=form_post`). These are automatically routed by the `oauth2_passkey_axum` crate.
Complete Example: User Settings Page
Here’s a complete example showing how to implement OAuth2 account linking in a user settings page:
Server-Side (Rust + Axum)
#![allow(unused)]
fn main() {
use askama::Template;
use axum::{
extract::Query,
http::{HeaderMap, StatusCode},
response::{Html, Json},
Extension,
};
use oauth2_passkey::{generate_page_session_token, list_accounts_core, UserId};
use serde_json::{json, Value};
use std::collections::HashMap;
#[derive(Template)]
#[template(path = "user_settings.html")]
struct UserSettingsTemplate {
user: AuthUser,
oauth2_accounts: Vec<OAuth2Account>,
page_session_token: String,
}
async fn user_settings(auth_user: AuthUser) -> Result<Html<String>, (StatusCode, String)> {
// Generate page session token for OAuth2 linking
let page_session_token = generate_page_session_token(&auth_user.csrf_token);
// Get user's linked OAuth2 accounts
// UserId::new() returns Result, so we need to handle the error
let user_id = UserId::new(auth_user.id.clone())
.map_err(|e| (StatusCode::INTERNAL_SERVER_ERROR, format!("Invalid user ID: {}", e)))?;
let oauth2_accounts = list_accounts_core(user_id)
.await
.map_err(|e| (StatusCode::INTERNAL_SERVER_ERROR, format!("Failed to get accounts: {}", e)))?;
let template = UserSettingsTemplate {
user: auth_user,
oauth2_accounts,
page_session_token,
};
let html = template.render()
.map_err(|e| (StatusCode::INTERNAL_SERVER_ERROR, format!("Template error: {}", e)))?;
Ok(Html(html))
}
// CSRF token endpoint for client-side use
async fn get_csrf_token(auth_user: AuthUser) -> Json<Value> {
Json(json!({
"csrf_token": auth_user.csrf_token
}))
}
}
Template (user_settings.html)
<!DOCTYPE html>
<html>
<head>
<title>User Settings</title>
</head>
<body>
<h1>User Settings</h1>
<section>
<h2>Linked OAuth2 Accounts</h2>
{% if oauth2_accounts.is_empty() %}
<p>No OAuth2 accounts linked yet.</p>
{% else %}
{% for account in oauth2_accounts %}
<div class="account-item">
<strong>{{ account.provider }}</strong>: {{ account.email }}
<button onclick="unlinkAccount('{{ account.provider }}', '{{ account.provider_user_id }}')">
Unlink
</button>
</div>
{% endfor %}
{% endif %}
<button onclick="linkGoogleAccount()">Link Google Account</button>
</section>
<script>
const PAGE_SESSION_TOKEN = "{{ page_session_token }}";
function linkGoogleAccount() {
const oauth2Url = `/auth/oauth2/google/start?mode=add_to_user&context=${PAGE_SESSION_TOKEN}`;
const popup = window.open(
oauth2Url,
'google_linking',
'width=500,height=600,scrollbars=yes,resizable=yes'
);
const checkClosed = setInterval(() => {
if (popup.closed) {
clearInterval(checkClosed);
location.reload(); // Refresh to show new account
}
}, 1000);
}
function unlinkAccount(provider, providerUserId) {
if (confirm(`Unlink ${provider} account?`)) {
fetch(`/auth/oauth2/${provider}/unlink`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
provider_user_id: providerUserId
})
})
.then(() => location.reload())
.catch(error => console.error('Unlink failed:', error));
}
}
</script>
</body>
</html>
Key Points to Remember
- Page Session Token: Always generate and use a page session token for account linking to prevent session/page desynchronization attacks
- Session Verification: The `verify_page_session_token()` function ensures that the OAuth2 linking request comes from the same user session
- Mode Parameter: Use `mode=add_to_user` to indicate this is an account linking operation, not a new user registration
- Context Parameter: Pass the page session token as the `context` parameter in the OAuth2 start URL
- Popup Window: Use a popup window for OAuth2 linking to maintain the user's context on the main page
- Error Handling: Always handle cases where the popup is blocked, OAuth2 fails, or the session is invalid
Security Considerations
- CSRF Protection: The page session token prevents cross-site request forgery attacks
- Session Validation: Always verify the user has an active session before allowing account linking
- Same-User Verification: The page session token ensures the linking request comes from the authenticated user
- Popup Security: Popup windows prevent redirect-based attacks on the main application window
Testing
When testing OAuth2 account linking, ensure you:
- Test with multiple OAuth2 providers
- Verify session persistence throughout the linking process
- Test popup blocking scenarios
- Validate error handling for invalid tokens
- Confirm proper unlinking functionality
This implementation pattern provides secure, user-friendly OAuth2 account linking while maintaining proper session boundaries and security protections.
Passkey/WebAuthn Implementation
This chapter provides a practical guide to implementing WebAuthn/Passkey authentication using the oauth2-passkey library. It covers both client-side JavaScript implementation and server-side Rust handlers.
Overview
WebAuthn (Web Authentication) is a W3C standard that enables passwordless authentication using cryptographic credentials called passkeys. Passkeys can be stored on:
- Platform authenticators: Built-in device security (Windows Hello, Apple Touch ID/Face ID, Android fingerprint)
- Roaming authenticators: Hardware security keys (YubiKey, Google Titan)
- Cross-device passkeys: Synced across devices via password managers or platform accounts
Key Security Benefits
- Phishing-resistant: Credentials are bound to the origin domain
- No shared secrets: Private keys never leave the authenticator
- Replay attack prevention: Signature counters prevent credential cloning
- User verification: Optional biometric or PIN verification
Architecture Overview
The passkey implementation follows a layered architecture:
+---------------------------+
| Client (Browser/JS) |
+---------------------------+
|
v
+---------------------------+
| oauth2_passkey_axum | <-- HTTP handlers
+---------------------------+
|
v
+---------------------------+
| oauth2_passkey |
| (coordination layer) | <-- Business logic
+---------------------------+
|
v
+---------------------------+
| passkey module | <-- Core WebAuthn logic
| (register/auth flows) |
+---------------------------+
|
v
+---------------------------+
| Storage layer | <-- SQLite/PostgreSQL
| (PasskeyStore) |
+---------------------------+
Registration Flow
The registration flow creates a new passkey credential and associates it with a user account.
Flow Diagram
Client Server Authenticator
| | |
|-- POST /register/start ->| |
| |-- Generate challenge --------|
| |-- Store options in cache ----|
|<- RegistrationOptions ---| |
| | |
|-- navigator.credentials.create() --------------------> |
| | |-- Create keypair
| | |-- Sign challenge
|<------------------------- Credential ------------------|
| | |
|-- POST /register/finish ->| |
| |-- Validate challenge --------|
| |-- Verify attestation --------|
| |-- Store credential ----------|
|<- Success + Session -----| |
Step 1: Start Registration
The client requests registration options from the server.
Endpoint: POST /passkey/register/start
Request Body:
{
"username": "user@example.com",
"displayname": "John Doe",
"mode": "create_user"
}
The `mode` field specifies the registration intent:
- `create_user`: Creating a new user account with a passkey
- `add_to_user`: Adding a passkey to an existing authenticated user
Server Response (RegistrationOptions):
{
"challenge": "base64url-encoded-random-bytes",
"rpId": "example.com",
"rp": {
"name": "Example App",
"id": "example.com"
},
"user": {
"user_handle": "random-user-handle",
"name": "user@example.com",
"displayName": "John Doe"
},
"pubKeyCredParams": [
{ "type": "public-key", "alg": -7 },
{ "type": "public-key", "alg": -257 }
],
"authenticatorSelection": {
"authenticatorAttachment": "platform",
"residentKey": "required",
"requireResidentKey": true,
"userVerification": "discouraged"
},
"timeout": 60000,
"attestation": "direct"
}
Step 2: Browser Creates Credential
The browser’s WebAuthn API creates the credential using the authenticator.
// Convert base64url challenge to Uint8Array
options.challenge = base64URLToUint8Array(options.challenge);
options.user.id = base64URLToUint8Array(options.user.user_handle);
const credential = await navigator.credentials.create({
publicKey: options
});
Step 3: Finish Registration
The client sends the created credential to the server for verification and storage.
Endpoint: POST /passkey/register/finish
Request Body (RegisterCredential):
{
"id": "credential-id",
"raw_id": "base64url-encoded-raw-id",
"type": "public-key",
"response": {
"attestation_object": "base64url-encoded-attestation",
"client_data_json": "base64url-encoded-client-data"
},
"user_handle": "user-handle-from-options"
}
The server performs these validations:
- Decode and verify `clientDataJSON` (type, challenge, origin)
- Parse and verify the attestation object
- Extract and store the public key
- Create a user account (for `create_user` mode)
- Store the credential in the database
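The `clientDataJSON` check can be sketched in a few lines. This is an illustrative JavaScript decoder, not the library's internal Rust code; it assumes the expected challenge and origin are already known server-side:

```javascript
// Sketch: decode clientDataJSON and check type, challenge, and origin.
// Illustrative only -- the library performs these checks internally in Rust.
function verifyClientData(clientDataJsonB64url, expectedChallenge, expectedOrigin) {
  // base64url -> JSON (Node's Buffer accepts the 'base64url' encoding)
  const clientData = JSON.parse(
    Buffer.from(clientDataJsonB64url, 'base64url').toString('utf8')
  );
  if (clientData.type !== 'webauthn.create') {
    throw new Error(`unexpected type: ${clientData.type}`);
  }
  if (clientData.challenge !== expectedChallenge) {
    throw new Error('challenge mismatch');
  }
  if (clientData.origin !== expectedOrigin) {
    throw new Error(`unexpected origin: ${clientData.origin}`);
  }
  return clientData;
}

// Example with a hand-built payload:
const payload = Buffer.from(JSON.stringify({
  type: 'webauthn.create',
  challenge: 'abc123',
  origin: 'https://example.com',
})).toString('base64url');
const data = verifyClientData(payload, 'abc123', 'https://example.com');
console.log(data.type); // webauthn.create
```

For authentication the same checks apply, except the expected type is `webauthn.get`.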
Authentication Flow
The authentication flow verifies a user’s identity using their registered passkey.
Flow Diagram
Client Server Authenticator
| | |
|-- POST /auth/start ----->| |
| |-- Generate challenge --------|
| |-- Store challenge in cache --|
|<- AuthenticationOptions -| |
| | |
|-- navigator.credentials.get() -----------------------> |
| | |-- Find credential
| | |-- Sign challenge
|<------------------------- Assertion -------------------|
| | |
|-- POST /auth/finish ---->| |
| |-- Validate challenge --------|
| |-- Verify signature ----------|
| |-- Update counter ------------|
|<- Success + Session -----| |
Step 1: Start Authentication
The client requests authentication options.
Endpoint: POST /passkey/auth/start
Request Body (optional username for non-discoverable credentials):
{
"username": "user@example.com"
}
Or send an empty body `{}` for discoverable credentials.
Server Response (AuthenticationOptions):
{
"challenge": "base64url-encoded-random-bytes",
"timeout": 60000,
"rpId": "example.com",
"allowCredentials": [],
"userVerification": "discouraged",
"authId": "unique-auth-session-id"
}
Step 2: Browser Gets Assertion
options.challenge = base64URLToUint8Array(options.challenge);
const credential = await navigator.credentials.get({
publicKey: options
});
Step 3: Finish Authentication
Endpoint: POST /passkey/auth/finish
Request Body (AuthenticatorResponse):
{
"id": "credential-id",
"raw_id": "base64url-encoded-raw-id",
"auth_id": "auth-session-id-from-options",
"response": {
"authenticator_data": "base64url-encoded-auth-data",
"client_data_json": "base64url-encoded-client-data",
"signature": "base64url-encoded-signature",
"user_handle": "base64url-encoded-user-handle"
}
}
The server performs these verifications:
- Validate the challenge matches the stored challenge
- Verify client data (type, origin, challenge)
- Verify authenticator data (RP ID hash, flags)
- Verify the signature using the stored public key
- Check and update the signature counter
- Create a session for the authenticated user
Credential Management
Listing Credentials
Authenticated users can list their registered passkey credentials.
Endpoint: GET /passkey/credentials
Response:
[
{
"credential_id": "abc123...",
"user_id": "user-uuid",
"public_key": "base64url-encoded-key",
"aaguid": "authenticator-aaguid",
"counter": 5,
"user": {
"user_handle": "handle",
"name": "user@example.com",
"displayName": "John Doe"
},
"created_at": "2024-01-01T00:00:00Z",
"updated_at": "2024-01-01T00:00:00Z",
"last_used_at": "2024-01-15T12:00:00Z"
}
]
Updating Credentials
Users can update the name and display name of their credentials.
Endpoint: POST /passkey/credential/update
Request Body:
{
"credential_id": "abc123...",
"name": "New Name",
"display_name": "New Display Name"
}
Deleting Credentials
Users can delete their own passkey credentials.
Endpoint: DELETE /passkey/credentials/{credential_id}
Response: 204 No Content on success
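A client-side call to this endpoint might look like the following sketch, based on the documented route; the URL builder is separated out for clarity, and `credentials: 'same-origin'` ensures the session cookie is sent:

```javascript
// Sketch: delete a passkey credential via the documented endpoint.
function buildDeleteUrl(prefix, credentialId) {
  return `${prefix}/passkey/credentials/${encodeURIComponent(credentialId)}`;
}

async function deleteCredential(prefix, credentialId) {
  const response = await fetch(buildDeleteUrl(prefix, credentialId), {
    method: 'DELETE',
    credentials: 'same-origin', // include the session cookie
  });
  if (response.status !== 204) {
    throw new Error(`Delete failed: ${response.status}`);
  }
}
```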
Client-Side Implementation
Base64URL Utilities
WebAuthn uses base64url encoding. These utility functions handle the conversion:
function arrayBufferToBase64URL(buffer) {
if (!buffer) return null;
const bytes = new Uint8Array(buffer);
let str = '';
for (const byte of bytes) {
str += String.fromCharCode(byte);
}
return btoa(str).replace(/\+/g, '-').replace(/\//g, '_').replace(/=/g, '');
}
function base64URLToUint8Array(base64URL) {
if (!base64URL) return null;
const padding = '='.repeat((4 - base64URL.length % 4) % 4);
const base64 = base64URL.replace(/-/g, '+').replace(/_/g, '/') + padding;
const rawData = atob(base64);
const outputArray = new Uint8Array(rawData.length);
for (let i = 0; i < rawData.length; ++i) {
outputArray[i] = rawData.charCodeAt(i);
}
return outputArray;
}
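A quick round-trip check of these helpers (the definitions are repeated here so the snippet is self-contained; they also run under Node.js 16+, where `atob` and `btoa` are globals):

```javascript
// Round-trip check for the base64url helpers above.
function arrayBufferToBase64URL(buffer) {
  if (!buffer) return null;
  const bytes = new Uint8Array(buffer);
  let str = '';
  for (const byte of bytes) {
    str += String.fromCharCode(byte);
  }
  return btoa(str).replace(/\+/g, '-').replace(/\//g, '_').replace(/=/g, '');
}

function base64URLToUint8Array(base64URL) {
  if (!base64URL) return null;
  const padding = '='.repeat((4 - base64URL.length % 4) % 4);
  const base64 = base64URL.replace(/-/g, '+').replace(/_/g, '/') + padding;
  const rawData = atob(base64);
  const outputArray = new Uint8Array(rawData.length);
  for (let i = 0; i < rawData.length; ++i) {
    outputArray[i] = rawData.charCodeAt(i);
  }
  return outputArray;
}

// Encode some bytes, decode them back, and compare.
const original = new Uint8Array([0, 251, 255, 62, 63, 100]);
const encoded = arrayBufferToBase64URL(original.buffer);
const decoded = base64URLToUint8Array(encoded);
console.log(encoded);                                    // APv_Pj9k
console.log(decoded.every((b, i) => b === original[i])); // true
```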
Complete Registration Example
async function startRegistration(mode, username, displayname) {
try {
// Step 1: Get registration options from server
const startResponse = await fetch(O2P_ROUTE_PREFIX + '/passkey/register/start', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
credentials: 'same-origin',
body: JSON.stringify({
username: username,
displayname: displayname,
mode: mode // 'create_user' or 'add_to_user'
})
});
if (!startResponse.ok) {
throw new Error('Failed to start registration');
}
const options = await startResponse.json();
// Step 2: Convert base64url to ArrayBuffer
let userHandle = options.user.user_handle;
options.challenge = base64URLToUint8Array(options.challenge);
options.user.id = base64URLToUint8Array(userHandle);
// Step 3: Create credential using WebAuthn API
const credential = await navigator.credentials.create({
publicKey: options
});
// Step 4: Prepare response for server
const credentialResponse = {
id: credential.id,
raw_id: arrayBufferToBase64URL(credential.rawId),
type: credential.type,
response: {
attestation_object: arrayBufferToBase64URL(
credential.response.attestationObject
),
client_data_json: arrayBufferToBase64URL(
credential.response.clientDataJSON
)
},
user_handle: userHandle,
mode: mode
};
// Step 5: Send credential to server
const finishResponse = await fetch(O2P_ROUTE_PREFIX + '/passkey/register/finish', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
credentials: 'same-origin',
body: JSON.stringify(credentialResponse)
});
if (finishResponse.ok) {
location.reload();
} else {
throw new Error('Registration verification failed');
}
} catch (error) {
console.error('Registration error:', error);
alert('Registration failed: ' + error.message);
}
}
Complete Authentication Example
async function startAuthentication() {
try {
// Step 1: Get authentication options from server
const startResponse = await fetch(O2P_ROUTE_PREFIX + '/passkey/auth/start', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: '{}'
});
if (!startResponse.ok) {
throw new Error('Failed to start authentication');
}
const options = await startResponse.json();
// Step 2: Convert challenge to ArrayBuffer
options.challenge = base64URLToUint8Array(options.challenge);
// Step 3: Get credential using WebAuthn API
const credential = await navigator.credentials.get({
publicKey: options
});
// Step 4: Prepare response for server
const authResponse = {
auth_id: options.authId,
id: credential.id,
raw_id: arrayBufferToBase64URL(credential.rawId),
type: credential.type,
response: {
authenticator_data: arrayBufferToBase64URL(
credential.response.authenticatorData
),
client_data_json: arrayBufferToBase64URL(
credential.response.clientDataJSON
),
signature: arrayBufferToBase64URL(credential.response.signature),
user_handle: arrayBufferToBase64URL(credential.response.userHandle)
}
};
// Step 5: Verify with server
const verifyResponse = await fetch(O2P_ROUTE_PREFIX + '/passkey/auth/finish', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(authResponse)
});
if (verifyResponse.ok) {
location.reload();
} else {
throw new Error('Authentication verification failed');
}
} catch (error) {
console.error('Authentication error:', error);
alert('Authentication failed: ' + error.message);
}
}
Conditional UI (Autofill)
Conditional UI allows passkeys to appear in the browser’s autofill dropdown. This provides a seamless user experience.
(async function() {
// Feature detection
if (!window.PublicKeyCredential) {
console.error('WebAuthn not supported');
return;
}
const available = await PublicKeyCredential.isConditionalMediationAvailable();
if (!available) {
console.error('Conditional UI not available');
return;
}
// Get fresh challenge from server
async function getFreshChallenge() {
const response = await fetch(O2P_ROUTE_PREFIX + '/passkey/auth/start', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(null)
});
if (!response.ok) return null;
return await response.json();
}
// Start credential request with conditional mediation
async function startCredentialRequest(options) {
const publicKeyOptions = {
challenge: base64URLToUint8Array(options.challenge),
rpId: options.rpId,
timeout: options.timeout || 300000,
userVerification: options.userVerification
};
try {
const credential = await navigator.credentials.get({
mediation: 'conditional', // Enable autofill UI
publicKey: publicKeyOptions
});
if (credential) {
// Send to server for verification
const authResponse = await fetch(O2P_ROUTE_PREFIX + '/passkey/auth/finish', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
id: credential.id,
raw_id: arrayBufferToBase64URL(credential.rawId),
response: {
client_data_json: arrayBufferToBase64URL(
credential.response.clientDataJSON
),
authenticator_data: arrayBufferToBase64URL(
credential.response.authenticatorData
),
signature: arrayBufferToBase64URL(credential.response.signature),
user_handle: credential.response.userHandle
? arrayBufferToBase64URL(credential.response.userHandle)
: null
},
type: credential.type,
auth_id: options.authId
})
});
if (authResponse.ok) {
window.location.href = '/';
}
}
} catch (error) {
if (error.name !== 'AbortError') {
console.error('Authentication error:', error);
}
}
}
// Initialize with fresh challenge
const options = await getFreshChallenge();
if (options) {
startCredentialRequest(options);
}
})();
Server-Side Implementation
Router Setup
The passkey routes are provided by the oauth2_passkey_axum crate:
#![allow(unused)]
fn main() {
use axum::Router;
use oauth2_passkey_axum::oauth2_passkey_full_router;
let app = Router::new()
.route("/", get(index))
.merge(oauth2_passkey_full_router());
}
Route Structure
The passkey router provides these endpoints:
/passkey/
/passkey.js - Client-side JavaScript
/conditional_ui - Conditional UI HTML page
/conditional_ui.js - Conditional UI JavaScript
/register/
/start - POST: Start registration
/finish - POST: Finish registration
/auth/
/start - POST: Start authentication
/finish - POST: Finish authentication
/credentials - GET: List credentials
/credentials/{id} - DELETE: Delete credential
/credential/update - POST: Update credential
Handler Implementation
The HTTP handlers delegate to coordination layer functions:
#![allow(unused)]
fn main() {
// Start registration handler
async fn handle_start_registration(
auth_user: Option<AuthUser>,
Json(request): Json<RegistrationStartRequest>,
) -> Result<Json<RegistrationOptions>, (StatusCode, String)> {
let session_user = auth_user.as_ref().map(SessionUser::from);
let registration_options = handle_start_registration_core(session_user.as_ref(), request)
.await
.into_response_error()?;
Ok(Json(registration_options))
}
// Finish authentication handler
async fn handle_finish_authentication(
Json(auth_response): Json<AuthenticatorResponse>,
) -> Result<(HeaderMap, String), (StatusCode, String)> {
let (_, name, headers) = handle_finish_authentication_core(auth_response)
.await
.into_response_error()?;
Ok((headers, name))
}
}
Well-Known Endpoint
WebAuthn supports related origins through a well-known endpoint. When using oauth2_passkey_full_router(), this endpoint is automatically included when WEBAUTHN_ADDITIONAL_ORIGINS is set.
For manual setup with oauth2_passkey_router(), mount it at the root:
#![allow(unused)]
fn main() {
use oauth2_passkey_axum::{oauth2_passkey_router, passkey_well_known_router, O2P_ROUTE_PREFIX};
let app = Router::new()
.merge(passkey_well_known_router()) // Serves /.well-known/webauthn
.nest(O2P_ROUTE_PREFIX.as_str(), oauth2_passkey_router());
}
Configuration
Environment Variables
| Variable | Default | Description |
|---|---|---|
| `ORIGIN` | Required | Full origin URL (e.g., `https://example.com`) |
| `PASSKEY_RP_NAME` | Same as `ORIGIN` | Relying party display name |
| `PASSKEY_TIMEOUT` | `60` | WebAuthn operation timeout (seconds) |
| `PASSKEY_CHALLENGE_TIMEOUT` | `60` | Challenge validity period (seconds) |
| `PASSKEY_ATTESTATION` | `direct` | Attestation conveyance (`none`, `direct`, `indirect`, `enterprise`) |
| `PASSKEY_AUTHENTICATOR_ATTACHMENT` | `platform` | Authenticator type (`platform`, `cross-platform`, `None`) |
| `PASSKEY_RESIDENT_KEY` | `required` | Resident key requirement (`required`, `preferred`, `discouraged`) |
| `PASSKEY_REQUIRE_RESIDENT_KEY` | `true` | Require resident/discoverable credentials |
| `PASSKEY_USER_VERIFICATION` | `discouraged` | User verification (`required`, `preferred`, `discouraged`) |
| `PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL` | `false` | Use a single user handle per user (set to `true` for a unique handle per credential) |
| `PASSKEY_USER_ACCOUNT_FIELD` | `name` | Field to use for the user account (`name` or `display_name`) |
| `PASSKEY_USER_LABEL_FIELD` | `display_name` | Field to use for the user label (`name` or `display_name`) |
Example Configuration
ORIGIN=https://example.com
PASSKEY_RP_NAME="My Application"
PASSKEY_TIMEOUT=120
PASSKEY_ATTESTATION=direct
PASSKEY_AUTHENTICATOR_ATTACHMENT=platform
PASSKEY_RESIDENT_KEY=required
PASSKEY_USER_VERIFICATION=preferred
Data Types
PasskeyCredential
Stored credential information:
#![allow(unused)]
fn main() {
pub struct PasskeyCredential {
/// Raw credential ID (base64url encoded)
pub credential_id: String,
/// User ID associated with this credential (database ID)
pub user_id: String,
/// Public key bytes (base64url encoded)
pub public_key: String,
/// AAGUID of the authenticator
pub aaguid: String,
/// Counter value for replay attack prevention
pub counter: u32,
/// User entity information
pub user: PublicKeyCredentialUserEntity,
/// Timestamp fields
pub created_at: DateTime<Utc>,
pub updated_at: DateTime<Utc>,
pub last_used_at: DateTime<Utc>,
}
}
Type-Safe Identifiers
The library uses type-safe wrappers for identifiers:
#![allow(unused)]
fn main() {
// Credential ID with validation
let credential_id = CredentialId::new("abc123...".to_string())?;
// User ID with validation
let user_id = UserId::new("user-uuid".to_string())?;
// Challenge types for cache operations
let challenge_type = ChallengeType::registration();
let challenge_type = ChallengeType::authentication();
}
Security Considerations
Challenge Validation
- Challenges are cryptographically random (32 bytes)
- Challenges are stored in cache with TTL
- Each challenge can only be used once
- Challenge verification includes origin check
Attestation Verification
The library supports multiple attestation formats:
- `none`: No attestation
- `packed`: Standard attestation
- `tpm`: TPM-based attestation
- `u2f`: FIDO U2F attestation
Counter Verification
Signature counters prevent credential cloning:
- Counter must increase with each authentication
- Counter of 0 indicates authenticator doesn’t support counters
- Decreased counter triggers security warning
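The counter rules can be expressed as a small check (a sketch of the logic, not the library's implementation):

```javascript
// Sketch: signature-counter check per the rules above.
// Returns 'ok' when the counter advances, 'no-counter' when the
// authenticator doesn't maintain counters, 'cloned?' on regression.
function checkCounter(storedCounter, receivedCounter) {
  if (receivedCounter === 0 && storedCounter === 0) {
    return 'no-counter'; // authenticator doesn't support counters
  }
  if (receivedCounter > storedCounter) {
    return 'ok'; // normal case: counter advanced, update stored value
  }
  return 'cloned?'; // regression or repeat -- possible credential cloning
}

console.log(checkCounter(5, 6)); // ok
console.log(checkCounter(0, 0)); // no-counter
console.log(checkCounter(5, 5)); // cloned?
```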
User Handle Privacy
User handles are random identifiers that:
- Don’t reveal user identity to authenticators
- Can be configured to be unique per credential
- Support multiple credentials per user
Demo Application
A complete demo application is available in demo-passkey/:
use oauth2_passkey_axum::{AuthUser, oauth2_passkey_full_router};
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
// Load environment and initialize library
dotenvy::dotenv().ok();
oauth2_passkey_axum::init().await?;
// Create router with passkey authentication
let app = Router::new()
.route("/", get(index))
.merge(oauth2_passkey_full_router());
// Start server
let addr = SocketAddr::from(([0, 0, 0, 0], 3001));
axum_server::bind(addr)
.serve(app.into_make_service())
.await?;
Ok(())
}
HTML Template Example
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Passkey Demo</title>
<script>
const O2P_ROUTE_PREFIX = '{{o2p_route_prefix}}';
</script>
<script src="{{o2p_route_prefix}}/passkey/passkey.js"></script>
</head>
<body>
<h1>{{message}}</h1>
<!-- For anonymous users -->
<div id="passkey-auth">
<button onclick="showRegistrationModal('create_user')">
Register Passkey
</button>
<button onclick="startAuthentication()">
Sign in
</button>
</div>
</body>
</html>
Related Documentation
- OAuth2 Implementation - Google OAuth2 authentication
- Framework Integration - Axum framework integration
- WebAuthn Compatibility - Browser compatibility
- Security Guide - CSRF protection
Development Tunneling
When developing OAuth2 and Passkey authentication, you need HTTPS access from external devices (mobile testing) or to satisfy OAuth2 redirect URI requirements. This guide covers tunneling solutions.
Cloudflare Tunnel (Recommended)
Cloudflare Tunnel provides free, reliable tunneling without interstitial pages.
Installation
Install cloudflared from Cloudflare Downloads
Quick Start
1. Create a quick tunnel (no account required): `cloudflared tunnel --url http://localhost:3001`
2. Note the generated URL (e.g., `https://random-name.trycloudflare.com`)
3. Update your `.env`: `ORIGIN='https://random-name.trycloudflare.com'`
4. Update the Google OAuth2 redirect URI to `https://random-name.trycloudflare.com/o2p/oauth2/authorized`
5. Start your local server: `cd demo-both && cargo run`
Why Cloudflare Tunnel?
- No interstitial page - Direct tunneling without cookie dependencies
- iOS compatible - Works reliably on iOS Safari (unlike ngrok free tier)
- Free - Quick tunnels require no account
- Stable - No session cookie issues
ngrok
ngrok is a popular alternative but has limitations on iOS.
Installation
Download from ngrok.com
Quick Start
1. Create a tunnel: `ngrok http 3001`
2. Note the generated URL (e.g., `https://random-name.ngrok-free.app`)
3. Update your `.env`: `ORIGIN='https://random-name.ngrok-free.app'`
4. Update the Google OAuth2 redirect URI to `https://random-name.ngrok-free.app/o2p/oauth2/authorized`
5. Start your local server: `cd demo-both && cargo run`
iOS Limitation
ngrok’s free tier does not work on iOS Safari. The interstitial page requires a cookie that iOS Safari’s Intelligent Tracking Prevention (ITP) blocks for subresource requests.
For iOS testing, use Cloudflare Tunnel instead. See iOS Safari Compatibility for technical details.
Workaround
Upgrade to ngrok’s paid plan, which removes the interstitial page entirely.
Comparison
| Feature | Cloudflare Tunnel | ngrok (free) |
|---|---|---|
| iOS Safari | Works | Broken |
| Interstitial page | None | Yes |
| Account required | No (quick tunnels) | No |
| Cost | Free | Free |
Recommendation: Use Cloudflare Tunnel for all development, especially when testing on iOS devices.
Security Model
Overview
This chapter provides a comprehensive security analysis of the oauth2-passkey library. It defines the threat model, documents verified security claims, and explains the security architecture that protects authentication flows.
Understanding the security model is essential for properly deploying and configuring the library. While the library implements robust security measures, proper deployment configuration (HTTPS, infrastructure security) remains the responsibility of the integrator.
Threat Model
The library is designed to defend against several categories of attacks that target web authentication systems.
CSRF Attacks
Cross-Site Request Forgery attacks attempt to trick authenticated users into performing unintended actions. The library defends against CSRF through multiple mechanisms:
Protection Implementation:
- CSRF tokens generated using `ring::rand::SystemRandom` (cryptographically secure)
- Token length: 32 bytes of random data
- Constant-time comparison using `subtle::ConstantTimeEq::ct_eq()` prevents timing attacks
- Tokens validated via the `X-CSRF-Token` header
OAuth2 State Parameter: The OAuth2 flow uses a multi-component state parameter containing:
- CSRF ID
- Nonce ID
- PKCE ID
- Session reference (misc_id)
- Mode ID
This comprehensive state parameter ensures flow integrity and prevents cross-site request manipulation.
Session Fixation
Session fixation attacks attempt to hijack a user’s session by setting a known session ID before authentication. The library prevents this through complete session renewal.
Protection Implementation:
- New session ID generated after OAuth2 authentication completes
- Fresh CSRF token created with each new session
- Session creation uses `create_new_session_with_uid()`
- Old session data is not carried over
This ensures that even if an attacker knows a pre-authentication session ID, they cannot use it to access the authenticated session.
Replay Attacks
Replay attacks attempt to reuse captured authentication data. The library implements multiple defenses:
OAuth2 Nonce Validation:
- Nonce generated using `gen_random_string(32)` with cryptographically secure randomness
- Stored with expiration time
- Validated against ID token claims
- Single-use: removed immediately after successful validation
WebAuthn Challenge Handling:
- Challenges generated with 32 bytes of cryptographically secure random data
- Stored with TTL (time-to-live)
- Validated before credential verification
- Removed after successful authentication
These single-use tokens ensure that captured authentication responses cannot be replayed.
Credential Stuffing
While the library does not implement rate limiting directly (this should be handled at the infrastructure level), it provides security measures that complement rate limiting:
WebAuthn/Passkey Benefits:
- Passkeys are phishing-resistant by design
- Credentials are bound to the origin (verified during authentication)
- No passwords to stuff or guess
- Cryptographic authentication eliminates credential reuse attacks
OAuth2 Benefits:
- Authentication delegated to identity provider
- PKCE prevents authorization code interception
- No password handling in the application
Security Architecture
The library implements a layered security architecture across both OAuth2 and WebAuthn authentication flows.
Cryptographic Foundations
Random Number Generation:
- Uses `ring::rand::SystemRandom` for all security-critical random values
- Session IDs, CSRF tokens, challenges, and nonces all use this generator
- Standard length: 32 bytes for most tokens
Timing Attack Resistance:
- CSRF token comparison uses constant-time operations
- Implemented via `subtle::ConstantTimeEq::ct_eq()`
Digital Signature Verification:
- WebAuthn uses ECDSA P-256 with SHA-256 (ASN.1 signatures) via the `ring` cryptography library
- Signatures verified against stored public keys
- Authenticator data and client data hash form the signed message
OAuth2 Security Controls
| Security Control | Implementation | Status |
|---|---|---|
| PKCE | S256 with code_challenge/code_verifier | Verified |
| State Parameter | Multi-component (CSRF, nonce, PKCE, session IDs) | Verified |
| Nonce Validation | Generated, stored, validated against ID token | Verified |
| Token Exchange | Secure authorization code exchange with PKCE | Verified |
WebAuthn Security Controls
| Security Control | Implementation | Status |
|---|---|---|
| Challenge Generation | 32-byte cryptographically secure random | Verified |
| Origin Validation | Client origin verified against configured ORIGIN | Verified |
| Signature Verification | ECDSA P256 SHA256 with public key cryptography | Verified |
Session Security Controls
| Security Control | Implementation | Status |
|---|---|---|
| Cookie Security | Secure, HttpOnly, SameSite=Lax attributes | Verified |
| Host-locked Cookies | __Host-SessionId prefix by default | Verified |
| Session Invalidation | Complete removal from cache on logout | Verified |
| Session Expiration | Automatic cleanup of expired sessions | Verified |
| Session Fixation Protection | Complete session renewal after authentication | Verified |
Out of Scope
The following security concerns are outside the library’s scope and must be addressed at the deployment level:
- Network Security: TLS/HTTPS configuration and enforcement
- Infrastructure Security: Redis, database, and server hardening
- Rate Limiting: Must be implemented at the infrastructure or application level
- Client-side Security: JavaScript security and browser protections
- Framework Vulnerabilities: Security of the web framework (Axum, etc.)
Known Limitations
- No Memory Zeroization: Sensitive data is not explicitly cleared from memory using crates like `zeroize`. Consider this for high-security deployments.
- Concurrent Sessions Allowed: Multiple sessions per user are permitted by design. No built-in session limits exist.
- No Rate Limiting: The library does not implement rate limiting. This must be handled externally.
- HTTPS Not Enforced: The library assumes HTTPS but does not enforce it. Deployment must configure HTTPS.
Security Testing Recommendations
Static Analysis
- Review all authentication flows for security gaps
- Verify constant-time operations are used appropriately
- Check for credential leakage in logs and error messages
- Validate error handling does not reveal sensitive information
Dynamic Testing
- Timing attack testing on CSRF validation
- Session fixation testing with pre-authentication session manipulation
- CSRF protection bypass attempts
- OAuth2 flow manipulation (state tampering, callback manipulation)
- WebAuthn challenge replay testing
Dependency Auditing
- Regular audit of cryptographic dependencies (`ring`, `subtle`)
- Check for known vulnerabilities using `cargo audit`
- Keep dependencies updated
Summary
The oauth2-passkey library implements a robust security model with verified protections against common web authentication attacks. Key security features include:
- Cryptographically secure random generation for all tokens
- Constant-time comparison for CSRF tokens
- Complete session renewal preventing session fixation
- Single-use nonces and challenges preventing replay attacks
- PKCE and comprehensive state parameters for OAuth2 security
- Origin validation and signature verification for WebAuthn
Proper deployment requires attention to infrastructure security, HTTPS configuration, and rate limiting, which fall outside the library’s scope.
CSRF Protection Guide
This guide covers comprehensive CSRF (Cross-Site Request Forgery) protection implementation in oauth2-passkey applications.
Overview
This library provides automatic CSRF protection with two usage patterns:
- Headers (Recommended): Get token → include in `X-CSRF-Token` header → automatic verification ✅
- Forms: Get token → include in form field → manual verification required ⚠️
Your responsibility: Get tokens to your frontend and include them in requests.
Getting CSRF Tokens
Choose the method that best fits your application:
Server-Side Templates (Most Common)
Best for: Traditional web apps, server-side rendering
// Pass token to your template
async fn page_handler(user: AuthUser) -> impl IntoResponse {
    HtmlTemplate::render("page.j2", json!({
        "csrf_token": user.csrf_token,
        // ... other data
    }))
}
In your template:
<!-- For JavaScript/AJAX -->
<script>window.csrfToken = "{{ csrf_token }}";</script>
<!-- For forms -->
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
API Endpoint (For SPAs)
Best for: Single-page applications, dynamic token refresh
// Fetch fresh token when needed
const response = await fetch('/o2p/user/csrf_token', {
    credentials: 'include'
});
const { csrf_token } = await response.json();
Response Headers (Advanced)
Best for: Existing authenticated requests (token included automatically)
// Token available in any authenticated response
const response = await fetch('/api/user-data', { credentials: 'include' });
const csrfToken = response.headers.get('X-CSRF-Token');
// Use token for subsequent requests
Making Requests with CSRF Tokens
Using Headers (Recommended - Automatic Verification)
Best for: AJAX, fetch requests, SPAs
// Get token from any method above, then include it in the header
fetch('/api/update-profile', {
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        'X-CSRF-Token': csrfToken,
    },
    credentials: 'include',
    body: JSON.stringify({ name: 'New Name' })
});
Verification is automatic - no additional code needed in your handlers.
Using Form Fields (Manual Verification Required)
Best for: Traditional HTML form submissions
<form method="POST" action="/update-profile">
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
<input type="text" name="name" placeholder="Your name">
<button type="submit">Update Profile</button>
</form>
Manual verification required - see verification code below.
Verification
Header Tokens: Automatic Verification
When using X-CSRF-Token header:
- Works with both the `AuthUser` extractor and `is_authenticated()` middleware
- Automatic comparison - token verified against session automatically
- Success: Request proceeds (`AuthUser.csrf_via_header_verified = true`)
- Failure: Request rejected with 403 FORBIDDEN
No code needed - verification happens automatically.
Form Tokens: Manual Verification Required
HTML forms cannot include custom headers, so the X-CSRF-Token header won’t be present. You must verify the form token manually:
// In your handler - check whether manual verification is needed
if !auth_user.csrf_via_header_verified {
    // Verify the form token manually
    if !form_data.csrf_token.as_bytes().ct_eq(auth_user.csrf_token.as_bytes()).into() {
        return Err((StatusCode::FORBIDDEN, "Invalid CSRF token"));
    }
}
// Token verified - proceed with handler logic
Security Best Practices
Use Constant-Time Comparison
Always use constant-time comparison (ct_eq) when manually verifying CSRF tokens to prevent timing attacks:
use subtle::ConstantTimeEq;

// ✅ Good - constant-time comparison
if !form_data.csrf_token.as_bytes().ct_eq(auth_user.csrf_token.as_bytes()).into() {
    return Err((StatusCode::FORBIDDEN, "Invalid CSRF token"));
}

// ❌ Bad - vulnerable to timing attacks
if form_data.csrf_token != auth_user.csrf_token {
    return Err((StatusCode::FORBIDDEN, "Invalid CSRF token"));
}
Prefer Header-Based CSRF
Header-based CSRF protection is recommended because:
- Automatic verification - no manual code required
- Better security - headers can’t be set by simple forms from malicious sites
- Cleaner code - no additional verification logic needed
Include Credentials in Requests
Always include credentials: 'include' in fetch requests to ensure cookies are sent:
fetch('/api/protected', {
    method: 'POST',
    headers: { 'X-CSRF-Token': csrfToken },
    credentials: 'include', // ← Required for cookies
    body: JSON.stringify(data)
});
Troubleshooting
403 Forbidden Errors
If you’re getting 403 errors on protected routes:
- Check token inclusion: Ensure CSRF token is included in request
- Verify credentials: Include `credentials: 'include'` in fetch requests
- Check token freshness: CSRF tokens may expire with sessions
- Manual verification: For forms, ensure manual verification code is present
Token Not Available
If CSRF tokens are not available in your templates or responses:
- Check authentication: CSRF tokens are only available for authenticated users
- Verify extractor: Ensure you’re using the `AuthUser` extractor in your handlers
- Check initialization: Ensure `oauth2_passkey_axum::init()` was called
Performance Considerations
- Session-based tokens: CSRF tokens are tied to session lifetime - when you create a session, a CSRF token is generated and cached until session expires
- Header-based automatic verification: Use header-based CSRF (`X-CSRF-Token`) for better performance, as verification happens automatically during request extraction
- Avoid manual verification: Form-based CSRF requires additional verification code in your handlers, while header-based CSRF is verified automatically
Related Documentation
- Axum Integration Guide - Basic CSRF usage examples
- Security Best Practices - Additional security considerations
- Demo Applications - Complete working examples
Session Cookies and the __Host- Prefix
This guide explains how oauth2-passkey uses the __Host-SessionId cookie for secure session management, what to expect in different environments, and how to handle common warnings.
Session cookies are a critical part of web authentication, ensuring that users remain securely logged in. The __Host- prefix is a modern security feature that helps prevent common vulnerabilities such as session fixation and cross-site attacks by enforcing strict cookie attributes.
Overview
- The `__Host-SessionId` cookie provides enhanced security by enforcing HTTPS, domain locking, and secure attributes.
- Some browsers show warnings or block these cookies on localhost; this is normal and does not affect the authentication flow.
Quick Reference
| Environment | Recommendation | Notes |
|---|---|---|
| Production | Use default __Host-SessionId + HTTPS | Required for security |
| Local (Firefox) | Keep defaults | Works perfectly |
| Local (Chrome/Safari) | Accept warnings or use SessionId-Dev | Warnings are harmless |
| Tests | Keep defaults | MockBrowser handles it |
Common Error: “Failed to get session cookie”
This error is expected when users have not logged in yet:
ERROR: Failed to get session cookie: "__Host-SessionId" from cookies
When This Error is Normal:
- Initial user visits - Before any authentication has occurred
- New user registration - During the account creation process
- Login page loads - Before users submit credentials
- After logout - When session cookies have been cleared
- Session expiration - When existing sessions have timed out
Why This Happens:
The library attempts to read session cookies as part of its normal operation. When no session exists yet (new users) or when browsers reject __Host- cookies on localhost, this “error” appears but authentication continues normally.
- No action needed: Authentication will proceed normally.
- Occurs in both HTTP and HTTPS environments
- Appears in both localhost and production deployments
Why Use __Host- Cookies?
- HTTPS required: Cookies with the `__Host-` prefix can only be sent over secure (HTTPS) connections, protecting them from interception.
- Domain locked: These cookies cannot carry a `Domain` attribute and cannot be set by subdomains, reducing the risk of attacks from other parts of your domain.
- Path=/ enforced: The cookie is sent to all paths in your application, ensuring consistent session management.
- HttpOnly: The cookie is inaccessible to JavaScript, further reducing the attack surface.
Benefits: Prevents session fixation, subdomain attacks, and ensures secure transmission.
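A session cookie satisfying these constraints might look like the following `Set-Cookie` header (the session-ID value is a placeholder; `Max-Age=600` matches the library's default `SESSION_COOKIE_MAX_AGE`):

```
Set-Cookie: __Host-SessionId=<random-session-id>; Path=/; Secure; HttpOnly; SameSite=Lax; Max-Age=600
```

Note that the `__Host-` prefix is what forbids the `Domain` attribute, locking the cookie to the exact host that set it.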
Browser Behavior on localhost
| Browser | __Host- Cookies on localhost HTTP | Notes |
|---|---|---|
| Firefox | ✅ Allowed | Best for development |
| Chrome | ❌ Blocked | Warnings, auth works |
| Safari | ❌ Blocked | Most restrictive |
| MockBrowser (Tests) | ✅ Allowed | No browser restrictions |
Why MockBrowser is Different:
The oauth2-passkey test suite uses MockBrowser (based on reqwest), not real browsers. MockBrowser accepts __Host- cookies on http://localhost without validation, so browser-specific restrictions don’t apply in tests. This is why setting SESSION_COOKIE_NAME='SessionId-Test' in the test environment is not necessary.
Browser Differences Explained:
- Firefox: Treats `localhost` as a “potentially trustworthy origin” per the W3C spec
- Chrome: Known inconsistency - allows regular `Secure` cookies but blocks `__Host-`-prefixed cookies on localhost (Chromium issues #1056543, #1245434)
- Safari: Most restrictive - blocks all `Secure` cookies on `http://localhost`
Tip: Use Firefox for smoothest local development.
Configuration
- Production: Use defaults and ensure HTTPS: `ORIGIN=https://your-domain.com` (required)
- Development: Defaults are fine; warnings are harmless. Optionally override the cookie name: `SESSION_COOKIE_NAME='SessionId-Dev'`
- Testing: Defaults work; MockBrowser bypasses browser restrictions.
Note: Overriding the cookie name (e.g., using SessionId-Dev) can reduce distracting warnings during development, but it also removes some of the security guarantees provided by the __Host- prefix. Always revert to the default in production.
Troubleshooting
Sessions Not Persisting
Symptoms: Users logged out after page refresh, authentication state not maintained
Solutions:
- Ensure HTTPS in production - `__Host-` cookies require secure origins
- Verify `ORIGIN` matches your domain exactly - check the environment variable
- Check browser developer tools for cookie rejection errors
Different Behavior Across Browsers
Symptoms: Works in Firefox but not Chrome/Safari locally
Explanation: This is expected behavior due to browser differences in localhost handling
Solutions:
- Test with Firefox for local development (most permissive)
- Use local HTTPS with tools like mkcert for production-like testing
- Accept warnings in Chrome/Safari (functionality still works)
“Failed to get session cookie” Error
When it’s normal: Before login, during registration, after logout, on session expiration
When to investigate: If authentication fails completely or sessions don’t work after successful login
Solutions: Usually no action needed; if persistent issues, check HTTPS configuration
Best Practices
Do:
- Use the default `__Host-SessionId` in production.
- Always use HTTPS in production.
- Accept localhost warnings in development.
- Prefer Firefox for local development.
Don’t:
- Disable the `__Host-` prefix in production.
- Worry about normal “Failed to get session cookie” errors.
Technical Details
The library handles missing cookies gracefully:
match get_session_cookie_from_headers(headers) {
    Ok(Some(session_id)) => { /* Cookie found - proceed */ },
    Ok(None) | Err(_) => { /* Normal for new users - continue auth flow */ }
}
Browser differences stem from varying RFC interpretations of “potentially trustworthy origins” for localhost.
Summary
- `__Host-` cookies provide robust session security.
- Warnings on localhost are normal; authentication is unaffected.
- Library handles session cookies gracefully across environments.
- HTTPS is required for production.
Related Docs
Session Conflict Policy
When a user logs in while already having one or more active sessions, the library needs to decide what to do. The session conflict policy controls this behavior.
This is configured via a single environment variable and requires no code changes.
Policies
| Policy | Env Value | Behavior |
|---|---|---|
| Allow | allow (default) | Permit multiple concurrent sessions. Each login creates a new session without affecting existing ones. |
| Replace | replace | Invalidate all existing sessions for the user before creating a new one. Only the most recent session remains active. |
| Reject | reject | Deny the login attempt if an active session already exists. The user must log out first. |
Configuration
# In your .env or environment
SESSION_CONFLICT_POLICY=allow # default
SESSION_CONFLICT_POLICY=replace
SESSION_CONFLICT_POLICY=reject
When to Use Each Policy
Allow (default)
Suitable for most applications. Users can be logged in from multiple devices or browsers simultaneously.
- Desktop and mobile access at the same time
- Multiple browser tabs or profiles
- Shared accounts where concurrent access is expected
Replace
Useful when you want to ensure only one active session per user. The previous session is silently invalidated when a new login occurs.
- Security-sensitive applications (banking, admin panels)
- Licensing or seat-based restrictions
- Preventing session accumulation
When a session is replaced, the previous device/browser will see an “unauthenticated” state on its next request. No explicit notification is sent.
Reject
Strictest policy. The login attempt itself fails if the user already has an active session. The user must explicitly log out before logging in again (or wait for the existing session to expire).
- High-security environments requiring explicit session lifecycle control
- Preventing account sharing
- Environments where session state must be deterministic
When rejected, the login handler returns a SessionConflictRejected error (HTTP 409 Conflict in the default Axum integration).
How It Works
User-to-Session Mapping
The library maintains a reverse index mapping each user ID to their active session IDs, stored in the cache (Redis or in-memory) under the user_sessions prefix:
cache key: user_sessions:{user_id}
cache value: ["session_id_1", "session_id_2", ...]
This mapping is always maintained regardless of which policy is configured. It enables the library to look up all sessions for a given user without scanning the entire session store.
Login Flow
When a user logs in, the following steps occur:
- Lazy cleanup – The library reads the user’s session mapping and checks whether each listed session still exists in the cache. Expired or deleted sessions are pruned from the mapping.
- Policy evaluation – If active sessions remain after cleanup:
  - `allow`: Proceed to create a new session.
  - `replace`: Delete each existing session, then create a new session.
  - `reject`: Return an error without creating a session.
- Session creation – A new session is created and added to the user’s mapping.
Logout and Session Deletion
When a session is deleted (via logout or the replace policy), the library:
- Reads the session data to obtain the `user_id`
- Removes the session ID from the user’s mapping
- Deletes the session from the cache
This ensures the mapping stays consistent with the actual session state.
Mapping TTL
The user-to-session mapping has a 30-day TTL in the cache. Since individual sessions expire independently (via SESSION_COOKIE_MAX_AGE), the mapping may temporarily contain references to expired sessions. These stale entries are cleaned up lazily on the next login attempt, so no background job is needed.
Related Configuration
| Variable | Default | Description |
|---|---|---|
| `SESSION_CONFLICT_POLICY` | `allow` | Session conflict policy (`allow`, `replace`, `reject`) |
| `SESSION_COOKIE_MAX_AGE` | `600` | Session lifetime in seconds (affects when sessions expire naturally) |
Related Docs
Page Session Protection
Overview
Page session protection addresses a critical security issue in web applications that support authentication and multi-account workflows: session boundary problems. These occur when actions are performed in the wrong user context, leading to serious security and usability risks such as accidental credential linking or unauthorized account access.
This chapter explains the session boundary attack vectors and how the oauth2-passkey library protects against them using page session tokens.
Session Boundary Attacks
Modern web applications face two common session boundary problems:
Page-to-Request Desynchronization
A user loads a page while logged in as Account A. Later, they log in as Account B in another tab or window. If they return to the original page and perform an action (e.g., add credentials), that action may be executed as Account B, not Account A - potentially linking credentials to the wrong user.
Concrete Scenario:
- A user views their account page (User A) with an “Add OAuth2 Account” button
- The user opens another tab and logs in as a different user (User B)
- The user returns to the first tab (still showing User A’s page)
- The user clicks “Add OAuth2 Account”, expecting to add the account to User A
- The OAuth2 account gets added to User B instead, because that’s the active session
Consequences:
- Users could accidentally link their Google/OAuth2 accounts to the wrong user account
- Users might not notice the mistake until much later
- Recovering from this mistake requires manual intervention
Process Start-to-Completion Desynchronization
In multi-step processes (such as passkey or OAuth2 registration), a user might start the process with one account but complete it after switching sessions. This can result in credentials being registered to an unintended user or session.
Page Session Token Mechanism
The library implements Page Session Tokens to solve the page-to-request desynchronization problem, particularly for OAuth2 account linking where standard CSRF protection is insufficient due to redirects to third-party providers.
How It Works
- When rendering the user account page, a token is derived from the user’s CSRF token
- This token is embedded in the page as a JavaScript constant: `PAGE_SESSION_TOKEN`
- When the user clicks “Add OAuth2 Account”, this token is included in the OAuth2 authorization request
- Before redirecting to the OAuth2 provider, the system verifies that this token matches the current session
This creates a binding between the specific page the user is viewing and their current session, preventing session boundary confusion.
Embedding in the User Interface
The token is included as a JavaScript constant when rendering the user’s account page:
<script>
// Page session token for session boundary protection
const PAGE_SESSION_TOKEN = "{{ page_session_token }}";
</script>
And used in the OAuth2 account addition button:
<button onclick="oauth2.openPopup('add_to_user', PAGE_SESSION_TOKEN)" class="action-button">
Add New OAuth2 Account
</button>
Token Generation and Verification
Token Generation
The page session token is generated by applying HMAC-SHA256 to the user’s CSRF token, creating a secure derivative that cannot be reverse-engineered:
pub fn generate_page_session_token(token: &str) -> String {
    let mut mac = HmacSha256::new_from_slice(&AUTH_SERVER_SECRET)
        .expect("HMAC can take key of any size");
    mac.update(token.as_bytes());
    let result = mac.finalize().into_bytes();
    URL_SAFE_NO_PAD.encode(result)
}
Token Verification
The verification happens in the OAuth2 handler before redirecting to the provider:
async fn google_auth(
    auth_user: Option<AuthUser>,
    headers: HeaderMap,
    Query(params): Query<HashMap<String, String>>,
) -> Result<(HeaderMap, Redirect), (StatusCode, String)> {
    let mode = params.get("mode").cloned();
    let context = params.get("context").cloned();

    if mode.as_deref() == Some("add_to_user") {
        if context.is_none() {
            return Err((StatusCode::BAD_REQUEST, "Missing Context".to_string()));
        }
        if auth_user.is_none() {
            return Err((StatusCode::BAD_REQUEST, "Missing Session".to_string()));
        }
        // Verify that the token matches the current session
        verify_page_session_token(&headers, context.as_ref())
            .await
            .map_err(|e| (StatusCode::BAD_REQUEST, e.to_string()))?;
    }
    // If verification passes, proceed with the OAuth2 flow
    // ...
}
The verification function checks that the token matches what is expected for the current session:
pub async fn verify_page_session_token(
    headers: &HeaderMap,
    page_session_token: Option<&String>,
) -> Result<(), SessionError> {
    let session_id: &str = match get_session_id_from_headers(headers) {
        Ok(Some(session_id)) => session_id,
        _ => {
            return Err(SessionError::PageSessionToken(
                "Session ID missing".to_string(),
            ));
        }
    };

    let cached_session = GENERIC_CACHE_STORE
        .lock()
        .await
        .get("session", session_id)
        .await
        .map_err(|e| SessionError::Storage(e.to_string()))?
        .ok_or(SessionError::SessionError)?;
    let stored_session: StoredSession = cached_session.try_into()?;

    match page_session_token {
        Some(context) => {
            if context.as_str() != generate_page_session_token(&stored_session.csrf_token) {
                tracing::error!("Page session token does not match session user");
                return Err(SessionError::PageSessionToken(
                    "Page session token does not match session user".to_string(),
                ));
            }
        }
        None => {
            tracing::error!("Page session token missing");
            return Err(SessionError::PageSessionToken(
                "Page session token missing".to_string(),
            ));
        }
    }
    Ok(())
}
Why This Works
- The page session token is derived from the CSRF token of the user who was logged in when the page was loaded
- If the user’s session changes (by logging out and in as another user), the CSRF token in the new session will be different
- When the page session token is verified, it will not match what is expected for the current session
- The OAuth2 flow is rejected before it even starts, preventing incorrect account linkage
Integration with OAuth2 Flows
Session Context Preservation
For the OAuth2 flow itself, the library maintains session continuity through the entire process:
1. Storing Session Context at Flow Start:
// Store the current session ID in cache when starting the OAuth2 flow
let misc_id = if let Some(session_id) = get_session_id_from_headers(&headers)? {
    Some(store_token_in_cache("misc_session", session_id, ttl, expires_at, None).await?)
} else {
    None
};

// Include the misc_id in the state parameter
let state_params = StateParams {
    csrf_id,
    nonce_id,
    pkce_id,
    misc_id, // Reference to stored session ID
    mode_id,
};
2. Retrieving Session Context at Flow Completion:
pub(crate) async fn get_uid_from_stored_session_by_state_param(
    state_params: &StateParams,
) -> Result<Option<SessionUser>, OAuth2Error> {
    let Some(misc_id) = &state_params.misc_id else {
        return Ok(None);
    };

    // Get the session ID that was stored at the beginning of the flow
    let Ok(token) = get_token_from_store::<StoredToken>("misc_session", misc_id).await else {
        return Ok(None);
    };

    // Retrieve the user from that original session
    match get_user_from_session(&token.token).await {
        Ok(user) => Ok(Some(user)),
        Err(_) => Ok(None),
    }
}
3. Using Preserved Context for Account Linking:
// During the OAuth2 account-linking process
let state_in_response = decode_state(&auth_response.state)?;
let state_user = get_uid_from_stored_session_by_state_param(&state_in_response).await?;

// Link the OAuth2 account to the original user who initiated the flow
if let Some(user) = state_user {
    // Account linking uses the preserved user context
}
This approach ensures that:
- The OAuth2 account is always linked to the user who initiated the flow
- Session changes during the OAuth2 process do not affect the final account linking
- Continuous user context is maintained from flow initiation to completion
Key Security Characteristics
Phase-Specific Protection
Each mechanism addresses a specific phase where session desynchronization can occur:
| Phase | Mechanism | Purpose |
|---|---|---|
| Page load to request | Page session tokens | Detect session changes before OAuth2 redirect |
| OAuth2 flow duration | Session context preservation | Maintain user identity through redirects |
Early Detection
Problems are caught at the earliest possible point:
- Page-level desynchronization is caught before the OAuth2 flow starts
- The user sees a clear error message about session mismatch
Minimal Implementation
- Leverages existing CSRF token mechanism
- No additional database storage required (uses existing cache)
- Single HMAC operation with negligible performance overhead
Testing Page Session Protection
You can verify this protection works by following these steps:
1. Log in as User A and open their account page
2. In another tab, log out and log in as User B
3. Return to User A’s account page and click “Add OAuth2 Account”
4. The system should display an error message about session mismatch
5. The OAuth2 flow should not proceed
This confirms that the page session token mechanism successfully prevents accidental account linking when sessions change.
OAuth2 User and Session Verification in the Authentication Flow
Overview
This document details the user and session verification mechanisms within the OAuth2 authentication flow. It outlines the security measures implemented at each critical stage, from pre-redirect initiation to post-redirect callback processing, to ensure robust identity confirmation and maintain session integrity.
Current Verification Mechanism
Before OAuth2 Redirect (Initiation)
- Page Session Token Verification
  - When adding an OAuth2 account to an existing user, the system verifies:
    - That the user has a valid session
    - That the page session token matches the obfuscated CSRF token from the session
  - This verification occurs before redirecting to the OAuth2 provider’s endpoint
  - Ensures the user who loaded the page is the same user making the request
- State Parameter and Session Preservation
  - A state parameter containing several security components is generated:
    - CSRF ID for flow integrity (references a secret token stored server-side)
    - Nonce for ID token verification
    - PKCE for code exchange
    - Session reference (misc_id) that points to the original session ID
  - The current session ID is stored in cache with a reference ID (misc_id)
  - This enables session continuity throughout the OAuth2 flow
  - The state parameter is included in the redirect URL to the OAuth2 provider
After OAuth2 Redirect (Callback)
- Callback Handling
  - State parameter is extracted and decoded from the callback
  - CSRF protection works as follows for both redirect and form_post modes:
    - CSRF ID is extracted from the state parameter
    - CSRF token is retrieved from the cookie
    - The stored token associated with the CSRF ID is fetched
    - Cookie token is compared with the stored token
    - SameSite cookie attribute is set based on response mode (None for form_post, Lax for query)
  - Original user context is retrieved using the session reference in the state parameter:

        // Decode state to access misc_id (session reference)
        let state_in_response = decode_state(&query.state)?;
        // Retrieve the original user context from when the flow started
        let state_user = get_uid_from_stored_session_by_state_param(&state_in_response).await?;

  - This mechanism works for both redirect-based and form post response modes
  - OAuth2 account is processed based on the original user context
- Account Linking Logic
  - Handles multiple scenarios:
    - User logged in, OAuth2 account exists
    - User logged in, OAuth2 account doesn’t exist
    - User not logged in, OAuth2 account exists
    - User not logged in, OAuth2 account doesn’t exist
- Session Renewal
- Creates a completely new session for the user after authentication
- Generates fresh session cookies with new CSRF tokens
- Mitigates session fixation attacks
- Invalidates any previously captured credentials
Security Measures
The system implements multiple layers of security:
- Page Session Token Verification
- Verifies user identity before initiating the OAuth2 flow
- Prevents unauthorized account linking attempts
- Ensures session continuity between page load and action
- Uses obfuscated CSRF tokens as page session tokens
- State Parameter as Multi-Purpose Security Container
  - Contains multiple security components:
    - CSRF ID for flow integrity verification (references a secret token)
    - Session reference (misc_id) for user context preservation
    - Additional parameters for OAuth2 protocol security (nonce, PKCE)
  - Original session ID stored server-side via the misc_session mechanism
  - Single-use state parameter verified during callback
  - Creates a secure binding between the initial request and the callback
- Multi-Layered Protection
  - Page session token verification before flow initiation
  - State parameter with CSRF ID for redirect-based flow integrity
  - Session context preservation via the misc_session mechanism for all flow types
  - These layers prevent cross-site request forgery and session desynchronization
- Complete Session Rotation
- New session created after successful authentication
- Fresh credentials issued (session ID and page session token)
- Ensures clean state after authentication
Security Analysis
The current implementation follows OAuth 2.0 security best practices and provides robust protection:
- Pre-Authorization Verification
- Page session token verification ensures legitimate user before redirect
- Prevents session desynchronization between page load and action initiation
- Session Continuity Through Flow
  - Original session preserved via the misc_session mechanism
  - Ensures the OAuth2 account links to the user who initiated the flow
  - Works consistently across redirect-based and form post response modes
- Flow Integrity Verification
- State parameter with CSRF ID and cookie with CSRF token secure both redirect and form_post flows
- Prevents tampering during redirects
- Minimizes exposure of the secret token (only ID is passed to the Authorization Server)
- Cookie SameSite attribute is automatically set based on response mode for optimal security
- Post-Authorization Session Renewal
- Complete session rotation after authentication
- Mitigates session fixation and hijacking attempts
Conclusion
The implemented security measures provide strong protection against common OAuth 2.0 vulnerabilities:
- Session fixation attacks
- Cross-site request forgery
- Session hijacking
- Unauthorized account linking

The system uses a combination of CSRF protection, state parameters, response mode validation, and session renewal to create a secure authentication flow without unnecessary complexity or dependencies, aligning with the project’s goals of simplicity and security.
Note on CSRF Tokens in the System
It’s important to understand that there are distinct CSRF protection mechanisms in different parts of the system:
- OAuth2 Flow CSRF Protection
  - Implemented in oauth2/main/core.rs
  - Uses a double-submit pattern with:
    - CSRF ID stored in the state parameter
    - CSRF token stored in a cookie
    - Token verification during callback
  - Applied to both redirect and form_post response modes
  - Cookie SameSite attribute is automatically set based on response mode:
    - SameSite=None for form_post mode (required for cross-site POST requests)
    - SameSite=Lax for query mode (more secure for redirect-based flows)
  - HTTP method is strictly enforced based on response mode:
    - Only POST requests allowed for form_post mode
    - Only GET requests allowed for query mode
- Session CSRF Protection
  - Implemented in session/main/session.rs
  - Used for general API endpoint protection
  - Stored as part of the user session
  - Verified via the X-CSRF-Token header in requests
  - Used throughout the application for non-OAuth2 endpoints
- Page Session Token
  - An obfuscated version of the session CSRF token
  - Used to verify that the user who loaded a page is the same one making a subsequent request
  - Implemented as a query parameter for certain actions

These mechanisms work together but serve different purposes in the security architecture of the system.
Authorization Security Patterns
Problem
Current authentication functions trust session data without validating against the database, creating security vulnerabilities where tampered sessions could bypass authorization checks (documented in authorization_security_tests.rs:321-333).
Security vs Performance Tradeoff
The fix involves adding verification logic: session_id -> session check -> database user attribute check. This eliminates the security flaw but increases database lookups.
Performance Impact: The additional database lookup penalty is generally acceptable because:
- User attribute operations (showing/modifying) are much less frequent than simple page authentication
- The security benefit outweighs the minimal performance cost
- These operations typically involve user interaction (forms, admin panels) where a few milliseconds don’t matter
- Critical security functions should prioritize correctness over micro-optimizations
Solutions
There are three approaches to fix this security issue, from most robust to most convenient:
1. Direct Function Modification (Most Robust)
Modify functions to receive session_id and validate session + fetch fresh user data directly in each function.
2. Helper Functions (Recommended - Best Balance)
Use helper functions at the top of each function that needs security validation. Simple one-liners that do all the validation work.
3. Middleware Pattern (Most Convenient)
Use middleware that wraps function logic, but adds complexity with closures.
Implementation
Helper Functions (Recommended)
// Helper functions for common authorization patterns

pub async fn validate_admin_session(session_id: &str) -> Result<User, CoordinationError> {
    let session = validate_session(session_id).await?;
    let user = UserStore::get_user(&session.user_id).await?.ok_or(NotFound)?;
    if !user.is_admin {
        return Err(CoordinationError::Unauthorized.log());
    }
    Ok(user)
}

pub async fn validate_owner_session(session_id: &str, resource_user_id: &str) -> Result<User, CoordinationError> {
    let session = validate_session(session_id).await?;
    let user = UserStore::get_user(&session.user_id).await?.ok_or(NotFound)?;
    if user.id != resource_user_id {
        return Err(CoordinationError::Unauthorized.log());
    }
    Ok(user)
}

pub async fn validate_admin_or_owner_session(session_id: &str, resource_user_id: &str) -> Result<User, CoordinationError> {
    let session = validate_session(session_id).await?;
    let user = UserStore::get_user(&session.user_id).await?.ok_or(NotFound)?;
    if !user.is_admin && user.id != resource_user_id {
        return Err(CoordinationError::Unauthorized.log());
    }
    Ok(user)
}
Middleware Pattern (Alternative)
use std::future::Future;

// Admin authorization middleware. Note: `operation` returns a future so the
// wrapped logic can itself perform async work (e.g. database calls).
pub async fn with_admin_auth<F, Fut, R>(session_id: &str, operation: F) -> Result<R, CoordinationError>
where
    F: FnOnce(User) -> Fut,
    Fut: Future<Output = Result<R, CoordinationError>>,
{
    let session = validate_session(session_id).await?;
    let user = get_fresh_user(&session.user_id).await?;
    if !user.is_admin {
        return Err(CoordinationError::Unauthorized);
    }
    operation(user).await
}

// Owner authorization middleware
pub async fn with_owner_auth<F, Fut, R>(session_id: &str, resource_user_id: &str, operation: F) -> Result<R, CoordinationError>
where
    F: FnOnce(User) -> Fut,
    Fut: Future<Output = Result<R, CoordinationError>>,
{
    let session = validate_session(session_id).await?;
    let user = get_fresh_user(&session.user_id).await?;
    if user.id != resource_user_id {
        return Err(CoordinationError::Unauthorized);
    }
    operation(user).await
}

// Admin OR owner authorization middleware
pub async fn with_admin_or_owner_auth<F, Fut, R>(session_id: &str, resource_user_id: &str, operation: F) -> Result<R, CoordinationError>
where
    F: FnOnce(User) -> Fut,
    Fut: Future<Output = Result<R, CoordinationError>>,
{
    let session = validate_session(session_id).await?;
    let user = get_fresh_user(&session.user_id).await?;
    if !user.is_admin && user.id != resource_user_id {
        return Err(CoordinationError::Unauthorized);
    }
    operation(user).await
}
Usage Examples
Helper Functions (Simple One-Liners)
// Admin-only function
pub async fn update_user_admin_status(
    session_id: &str,
    user_id: &str,
    is_admin: bool,
) -> Result<User, CoordinationError> {
    // Simple one-liner at the top
    let _admin_user = validate_admin_session(session_id).await?;

    // Original function logic continues...
    let user = UserStore::get_user(user_id).await?.ok_or(NotFound)?;
    if user.sequence_number == Some(1) {
        return Err(CoordinationError::Coordination(
            "Cannot change admin status of first user".to_string(),
        ));
    }
    let updated_user = User { is_admin, ..user };
    UserStore::upsert_user(updated_user).await
}

// Owner-only function
pub async fn update_user_account(
    session_id: &str,
    user_id: &str,
    account: Option<String>,
    label: Option<String>,
) -> Result<User, CoordinationError> {
    // One-liner owner validation
    let _owner_user = validate_owner_session(session_id, user_id).await?;

    // Original function logic...
    let user = UserStore::get_user(user_id).await?.ok_or(NotFound)?;
    let updated_user = User {
        account: account.unwrap_or(user.account),
        label: label.unwrap_or(user.label),
        ..user
    };
    UserStore::upsert_user(updated_user).await
}

// Admin OR owner function
pub async fn delete_user_account(session_id: &str, user_id: &str) -> Result<Vec<String>, CoordinationError> {
    // One-liner validation
    let _user = validate_admin_or_owner_session(session_id, user_id).await?;

    // Original function logic...
    let user = UserStore::get_user(user_id).await?.ok_or(NotFound)?;
    // ... rest of delete logic
    Ok(credential_ids)
}
Middleware Pattern (Alternative)
// Admin-only function
pub async fn delete_user_account_admin(session_id: &str, user_id: &str) -> Result<(), CoordinationError> {
    with_admin_auth(session_id, |_admin_user| async move {
        // Original function logic here
        UserStore::delete_user(user_id).await
    })
    .await
}

// Owner-only function
pub async fn update_user_account(
    session_id: &str,
    user_id: &str,
    account: Option<String>,
    label: Option<String>,
) -> Result<User, CoordinationError> {
    with_owner_auth(session_id, user_id, |_owner_user| async move {
        // Original function logic here
        let user = UserStore::get_user(user_id).await?.ok_or(NotFound)?;
        let updated_user = User {
            account: account.unwrap_or(user.account),
            label: label.unwrap_or(user.label),
            ..user
        };
        UserStore::upsert_user(updated_user).await
    })
    .await
}

// Admin OR owner function
pub async fn delete_user_account(session_id: &str, user_id: &str) -> Result<Vec<String>, CoordinationError> {
    with_admin_or_owner_auth(session_id, user_id, |_user| async move {
        // Original function logic here
        // ... delete logic
    })
    .await
}
Benefits
Helper Functions
- ✅ Simple to use: Just one line at the top of each function
- ✅ Clear and readable: Obvious what security check is happening
- ✅ No complex middleware: Straightforward function calls
- ✅ Easy to modify: Can add logging, metrics, etc. in helpers
- ✅ Consistent security: Same validation logic everywhere
- ✅ Testable: Can unit test helpers independently
Both Approaches
- ✅ Works for admin, owner, and admin-or-owner authorization patterns
- ✅ Always validates session freshness against database
- ✅ Always fetches fresh user data from database
- ✅ Centralizes authorization logic for consistency and maintainability
- ✅ Prevents privilege escalation vulnerabilities from tampered session data
- ✅ DRY principle - write authorization logic once, use everywhere
- ✅ Easily testable authorization logic in isolation
Security Impact
This pattern eliminates the vulnerability where functions trust SessionUser.is_admin without database validation, preventing attacks where tampered session data could bypass authorization checks.
Alternative Approaches Considered
- Direct function modification: Add session_id parameter to each function - requires changing many signatures
- Capability-based security: Use capability tokens - more complex, requires new infrastructure
- Database-first authorization: Always query DB in each function - repetitive, error-prone
- Session validation helper: Centralized validation function - still requires modifying each function
The helper function pattern provides the best balance of security, maintainability, and implementation simplicity.
Security Best Practices Guide for oauth2-passkey
This guide provides best practices for securely implementing authentication in your applications using the oauth2-passkey library. Following these guidelines will help ensure your application meets modern security standards.
General Security Recommendations
Environment Setup
- Use HTTPS in Production

  Always run your authentication services over HTTPS. The library requires the ORIGIN environment variable to use https:// in production.

      // Example check for ensuring HTTPS
      if cfg!(not(debug_assertions)) && !origin.starts_with("https://") {
          panic!("HTTPS required in production mode");
      }

- Secure Environment Variables

  Store sensitive environment variables (OAuth2 client secrets, database credentials) securely:

  - Use a secrets manager in production environments
  - Don’t commit .env files to source control
  - Set appropriate file permissions for production environment files

- Database Security

  - Use parameterized queries (already implemented in the library)
  - Apply principle of least privilege for database users
  - Regularly back up and secure authentication databases
Session Management
- Session Configuration

  Configure appropriate session timeouts based on your application’s security requirements:

      # Short timeouts for sensitive applications
      SESSION_COOKIE_MAX_AGE=300   # 5 minutes

      # Longer timeouts for general use
      SESSION_COOKIE_MAX_AGE=3600  # 1 hour

- Cookie Security

  The library already sets secure cookie attributes:

  - Secure - Ensures cookies are only sent over HTTPS
  - HttpOnly - Prevents JavaScript access to cookies
  - SameSite=Lax - Mitigates CSRF attacks
  - __Host- prefix - Prevents subdomain cookie manipulation

  These are enabled by default and should not be disabled.

  📖 For detailed information about __Host- cookies, browser compatibility, and localhost development considerations, see Session Cookies and __Host- Prefix.

- Session Invalidation

  Implement proper session cleanup:

  - Always call logout functions when users log out
  - Use the library’s automatic session expiration
  - Consider implementing server-side session revocation for sensitive operations
OAuth2 Security
- Provider Configuration

  - Register your exact redirect URIs with OAuth2 providers
  - Store client secrets securely
  - Verify email addresses when using OAuth2 for registration

- State Parameter Validation

  The library handles this automatically, but be aware that it:

  - Generates cryptographically secure state parameters
  - Validates state parameters on return from providers
  - Uses short-lived state tokens to prevent replay attacks

- Scope Management

  - Request only the minimum required scopes
  - Handle scope changes in your provider’s dashboard
  - Clearly inform users which data you’re accessing
Passkey/WebAuthn Security
- Authenticator Selection

  Configure authenticator requirements based on your security needs:

      # High security (requires biometrics or PIN)
      PASSKEY_USER_VERIFICATION=required

      # More flexible (allows presence-only authenticators)
      PASSKEY_USER_VERIFICATION=preferred

- Relying Party ID

  - Match your RP ID to your domain name
  - The library automatically derives this from ORIGIN
  - For multi-domain support, explicitly set PASSKEY_RP_ID

- Credential Management

  - Allow users to manage their passkey credentials
  - Implement recovery flows (e.g., backup passkeys or OAuth2 options)
  - Use the library’s functions to list, update, and delete credentials
CSRF Protection
- Cross-Site Request Forgery Mitigation

  The library implements CSRF protection automatically:

  - Tokens are generated with cryptographically secure randomness
  - Constant-time comparison of tokens prevents timing attacks
  - Per-session unique tokens with secure storage

- Implementation in Forms

  When creating forms, include the CSRF token:

      <form method="POST">
        <input type="hidden" name="csrf_token" value="{{ csrf_token }}">
        <!-- form fields -->
        <button type="submit">Submit</button>
      </form>

- Implementation in AJAX Requests

  For JavaScript requests, set the CSRF header:

      fetch('/api/endpoint', {
        method: 'POST',
        headers: {
          'X-CSRF-Token': csrfToken,
          'Content-Type': 'application/json'
        },
        body: JSON.stringify(data)
      });
Advanced Security Considerations
- Rate Limiting

  Implement rate limiting for authentication endpoints to prevent brute force attacks:

  - Use a proxy server (Nginx, Cloudflare) or middleware for rate limiting
  - Apply stricter limits for authentication attempts than regular API calls
  - Consider progressive delays for repeated failed attempts
- Event Logging

  Log authentication events for security monitoring:

  - Login/logout events
  - Failed authentication attempts
  - Credential changes (registration, deletion)
  - Admin operations
- Multi-factor Authentication

  When higher security is required:

  - Combine OAuth2 and Passkey authentication
  - Use WebAuthn’s userVerification: "required" (enforced by PASSKEY_USER_VERIFICATION=required)
  - Consider additional verification for high-value operations
Security Audit and Compliance
- Regular Testing

  - Conduct periodic security reviews
  - Test authentication flows in your deployment environment
  - Verify session and token behaviors match expectations

- Security Headers

  Implement additional security headers in your application:

      // Example middleware for adding security headers with Axum
      async fn security_headers(req: Request, next: Next) -> Response {
          let mut response = next.run(req).await;
          let headers = response.headers_mut();
          headers.insert(
              header::STRICT_TRANSPORT_SECURITY,
              HeaderValue::from_static("max-age=31536000; includeSubDomains"),
          );
          headers.insert(header::X_CONTENT_TYPE_OPTIONS, HeaderValue::from_static("nosniff"));
          headers.insert(header::X_FRAME_OPTIONS, HeaderValue::from_static("DENY"));
          // Add CSP header based on your application's needs
          response
      }

      // Use in your Router
      let app = Router::new()
          .route("/", get(handler))
          .layer(from_fn(security_headers));

  Key headers to implement:

  - Content-Security-Policy (CSP) - Limits which resources can be loaded
  - Strict-Transport-Security (HSTS) - Forces HTTPS connections
  - X-Content-Type-Options: nosniff - Prevents MIME type sniffing
  - X-Frame-Options: DENY - Prevents clickjacking attacks
- Rate Limiting

  Implement rate limiting for authentication endpoints to prevent brute force attacks:

      use std::{sync::Arc, time::{Duration, Instant}, collections::HashMap};
      use tokio::sync::Mutex;

      // Simple in-memory rate limiter
      struct RateLimiter {
          attempts: HashMap<String, Vec<Instant>>,
          window: Duration,
          max_attempts: usize,
      }

      impl RateLimiter {
          fn new(window_seconds: u64, max_attempts: usize) -> Self {
              Self {
                  attempts: HashMap::new(),
                  window: Duration::from_secs(window_seconds),
                  max_attempts,
              }
          }

          fn is_rate_limited(&mut self, key: &str) -> bool {
              let now = Instant::now();
              let attempts = self.attempts.entry(key.to_string()).or_insert_with(Vec::new);

              // Remove old attempts
              attempts.retain(|time| now.duration_since(*time) < self.window);

              // Check if too many attempts
              if attempts.len() >= self.max_attempts {
                  return true;
              }

              // Record this attempt
              attempts.push(now);
              false
          }
      }

      // Example middleware for Axum
      async fn rate_limit(
          State(limiter): State<Arc<Mutex<RateLimiter>>>,
          ConnectInfo(addr): ConnectInfo<SocketAddr>,
          req: Request,
          next: Next,
      ) -> Response {
          let key = addr.ip().to_string(); // Or use something from the request

          // Check rate limit
          let is_limited = {
              let mut limiter = limiter.lock().await;
              limiter.is_rate_limited(&key)
          };

          if is_limited {
              return StatusCode::TOO_MANY_REQUESTS.into_response();
          }

          next.run(req).await
      }

  For production use, consider:

  - Using Redis or another distributed cache for rate limiting in multi-server setups
  - Implementing progressive delays for repeated failures
  - Different limits for different endpoints (stricter for authentication)
- Keep Dependencies Updated

  - Regularly update the oauth2-passkey library
  - Monitor security advisories for dependencies
  - Use cargo audit to check for vulnerable dependencies
References
- OWASP Authentication Best Practices
- W3C WebAuthn Specification
- OAuth 2.0 Threat Model and Security Considerations
- oauth2-passkey Security Analysis
Core Library API (oauth2-passkey)
Overview
The oauth2-passkey crate provides the core authentication functionality for OAuth2 and WebAuthn/Passkey authentication. It is framework-agnostic and can be used directly or through integration crates like oauth2-passkey-axum.
Full API Documentation: https://docs.rs/oauth2-passkey
Main Modules
coordination
Authentication flow orchestration module. Provides high-level functions that coordinate between different authentication mechanisms (OAuth2, Passkey) and user management.
Submodules:
- admin - Admin-specific operations (user management, credential administration)
- oauth2 - OAuth2 authentication flow coordination
- passkey - WebAuthn/Passkey authentication flow coordination
- user - User account management operations
oauth2
OAuth2 authentication module supporting Google OAuth2/OpenID Connect. Handles the authentication flow, token validation, and user profile retrieval.
passkey
WebAuthn/Passkey authentication implementation. Provides capabilities for creating and using passkeys for authentication, following W3C WebAuthn Level 3 specification and FIDO2 standards.
session
Session management components for authentication and user state persistence. Implements secure session cookies with CSRF protection.
storage
Database and cache abstraction layer. Supports:
- Databases: SQLite, PostgreSQL
- Caches: In-memory, Redis
userdb
User account management module for storing, retrieving, updating, and deleting user accounts.
Initialization
use oauth2_passkey::init;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize authentication (reads configuration from environment variables)
    init().await?;

    Ok(())
}
Key Functions
Coordination Functions
Passkey Authentication
- handle_start_registration_core - Start passkey registration flow
- handle_finish_registration_core - Complete passkey registration
- handle_start_authentication_core - Start passkey authentication flow
- handle_finish_authentication_core - Complete passkey authentication
- list_credentials_core - List user’s passkey credentials
- update_passkey_credential_core - Update credential name/display name
- delete_passkey_credential_core - Delete a passkey credential
OAuth2 Authentication
- prepare_oauth2_auth_request - Prepare OAuth2 authorization request
- get_authorized_core - Handle OAuth2 callback (GET)
- post_authorized_core - Handle OAuth2 callback (POST)
- list_accounts_core - List user’s OAuth2 accounts
- delete_oauth2_account_core - Delete an OAuth2 account link
User Management
- get_user - Get a specific user by ID
- get_all_users - Get all users (admin)
- update_user_account - Update user account details
- delete_user_account - Delete user account
- update_user_admin_status - Update user’s admin status
Admin Functions
- delete_user_account_admin - Admin: delete any user account
- delete_oauth2_account_admin - Admin: delete any OAuth2 account
- delete_passkey_credential_admin - Admin: delete any passkey credential
Session Functions
Authentication Verification
- is_authenticated_basic - Basic session validation
- is_authenticated_basic_then_csrf - Basic validation + CSRF check
- is_authenticated_basic_then_user_and_csrf - Basic validation + user extraction + CSRF
- is_authenticated_strict - Strict session validation
- is_authenticated_strict_then_csrf - Strict validation + CSRF check
Session Data Access
- get_user_from_session - Extract user from session
- get_csrf_token_from_session - Get CSRF token from session
- get_user_and_csrf_token_from_session - Get both user and CSRF token
Session Management
- prepare_logout_response - Create logout response with cleared session
Page Session Tokens
- generate_page_session_token - Generate token for sensitive operations
- verify_page_session_token - Verify page session token
Passkey Functions
- get_authenticator_info - Get info about a single authenticator by AAGUID
- get_authenticator_info_batch - Get info for multiple authenticators
- get_related_origin_json - Get WebAuthn related origins configuration
Key Types
User Identification
| Type | Description |
|---|---|
| UserId | Unique user identifier (newtype wrapper) |
| SessionId | Session identifier |
| SessionCookie | Typed session cookie value |
| SessionUser (alias: User) | User information stored in session |
| DbUser | User as stored in database |
Session Types
| Type | Description |
|---|---|
| CsrfToken | CSRF protection token |
| CsrfHeaderVerified | Marker indicating CSRF was verified via header |
| AuthenticationStatus | Whether user is authenticated |
| SessionError | Session-related errors |
OAuth2 Types
| Type | Description |
|---|---|
| OAuth2Account | OAuth2 account linked to a user |
| AuthResponse | OAuth2 authorization response |
| OAuth2Mode | Authentication mode (Login, Register, Link) |
| OAuth2State | OAuth2 state parameter |
| Provider | OAuth2 provider identifier |
| ProviderUserId | User ID from OAuth2 provider |
Passkey Types
| Type | Description |
|---|---|
| PasskeyCredential | Stored passkey credential |
| CredentialId | Unique credential identifier |
| ChallengeId | WebAuthn challenge identifier |
| ChallengeType | Type of WebAuthn challenge |
| AuthenticationOptions | Options for authentication ceremony |
| RegistrationOptions | Options for registration ceremony |
| AuthenticatorResponse | Response from authenticator |
| RegisterCredential | Credential data for registration |
| AuthenticatorInfo | Information about an authenticator device |
| RegistrationStartRequest | Request to start passkey registration |
Error Types
| Type | Description |
|---|---|
| CoordinationError | Errors from coordination layer |
| SessionError | Session management errors |
Constants
| Constant | Description |
|---|---|
| O2P_ROUTE_PREFIX | Route prefix for auth endpoints (default: /o2p) |
| SESSION_COOKIE_NAME | Name of the session cookie |
Environment Variables
Required
- ORIGIN - Base URL of your application (e.g., https://example.com)
Storage Configuration
- `GENERIC_DATA_STORE_TYPE` - Database type: `sqlite` or `postgres`
- `GENERIC_DATA_STORE_URL` - Database connection string
- `GENERIC_CACHE_STORE_TYPE` - Cache type: `memory` or `redis`
- `GENERIC_CACHE_STORE_URL` - Cache connection string (for Redis)
OAuth2 Configuration
- `OAUTH2_GOOGLE_CLIENT_ID` - Google OAuth2 client ID
- `OAUTH2_GOOGLE_CLIENT_SECRET` - Google OAuth2 client secret
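A minimal `.env` sketch combining the variables above; all values are placeholders, and the SQLite URL format is an assumption:

```
# Required
ORIGIN='https://example.com'

# Storage (SQLite + in-memory cache work out of the box)
GENERIC_DATA_STORE_TYPE='sqlite'
GENERIC_DATA_STORE_URL='sqlite:data/auth.db'
GENERIC_CACHE_STORE_TYPE='memory'

# OAuth2 (from the Google Cloud Console)
OAUTH2_GOOGLE_CLIENT_ID='your-client-id.apps.googleusercontent.com'
OAUTH2_GOOGLE_CLIENT_SECRET='your-client-secret'
```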
See dot.env.example in the repository for complete configuration options.
Axum Integration API (oauth2-passkey-axum)
Overview
The oauth2-passkey-axum crate provides Axum web framework integration for the oauth2-passkey authentication library. It offers ready-to-use routers, middleware, and extractors for OAuth2 and WebAuthn/Passkey authentication.
Full API Documentation: https://docs.rs/oauth2-passkey-axum
Quick Start
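The quick-start example below needs the crate plus Axum and Tokio. A `Cargo.toml` dependency sketch (version numbers are illustrative assumptions, not pinned requirements):

```toml
[dependencies]
axum = "0.8"
tokio = { version = "1", features = ["full"] }
oauth2-passkey-axum = "0.3"
```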
use axum::{Router, response::Html};
use oauth2_passkey_axum::{oauth2_passkey_full_router, init};
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
// Initialize authentication
init().await?;
// Create application router
let app: Router = Router::new()
.route("/", axum::routing::get(|| async { Html("Hello World!") }))
// Add all authentication routes with a single call
.merge(oauth2_passkey_full_router());
// Start server
let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await?;
axum::serve(listener, app).await?;
Ok(())
}
Routers
oauth2_passkey_full_router (Recommended)
The unified router that provides all authentication endpoints. This is the recommended way to add authentication to your application.
#![allow(unused)]
fn main() {
use axum::Router;
use oauth2_passkey_axum::oauth2_passkey_full_router;

let app = Router::new()
    .merge(oauth2_passkey_full_router());
}
This router:
- Nests all auth endpoints under `O2P_ROUTE_PREFIX` (default: `/o2p`)
- Automatically includes `/.well-known/webauthn` when `WEBAUTHN_ADDITIONAL_ORIGINS` is configured
- Handles single-origin and multi-origin setups seamlessly
Endpoints provided:
| Path | Description |
|---|---|
{O2P_ROUTE_PREFIX}/oauth2/... | OAuth2 authentication endpoints |
{O2P_ROUTE_PREFIX}/passkey/... | WebAuthn/Passkey authentication endpoints |
{O2P_ROUTE_PREFIX}/user/... | User account management endpoints |
{O2P_ROUTE_PREFIX}/admin/... | Admin interface endpoints |
/.well-known/webauthn | WebAuthn relying party configuration (only when multi-origin is configured) |
See Endpoint Reference for complete details.
oauth2_passkey_router
The auth-only router without the prefix nesting. Use this for custom setups where you need more control.
#![allow(unused)]
fn main() {
use axum::Router;
use oauth2_passkey_axum::{oauth2_passkey_router, O2P_ROUTE_PREFIX};

let app = Router::new()
    .nest(O2P_ROUTE_PREFIX.as_str(), oauth2_passkey_router());
}
Endpoints provided (relative to mount point):
| Path | Description |
|---|---|
/oauth2/... | OAuth2 authentication endpoints |
/passkey/... | WebAuthn/Passkey authentication endpoints |
/user/... | User account management endpoints |
/admin/... | Admin interface endpoints |
passkey_well_known_router
Router for the WebAuthn well-known endpoint. Only needed if you use oauth2_passkey_router() directly with a multi-origin setup.
#![allow(unused)]
fn main() {
use axum::Router;
use oauth2_passkey_axum::{oauth2_passkey_router, passkey_well_known_router, O2P_ROUTE_PREFIX};

let app = Router::new()
    .nest(O2P_ROUTE_PREFIX.as_str(), oauth2_passkey_router())
    .merge(passkey_well_known_router());
}
This creates a /.well-known/webauthn endpoint for WebAuthn relying party configuration. See Multi-Origin Passkey Setup for details.
Note: If you use `oauth2_passkey_full_router()`, this endpoint is included automatically when needed.
Middleware
Authentication middleware for protecting routes. All middleware functions:
- Verify valid session cookie
- For state-changing methods (POST, PUT, DELETE, PATCH), verify CSRF protection
- Add CSRF token to response headers
is_authenticated_401
Returns HTTP 401 Unauthorized for unauthenticated requests.
#![allow(unused)]
fn main() {
use axum::{Router, middleware::from_fn};
use oauth2_passkey_axum::is_authenticated_401;

// Placeholder handler for illustration
async fn handler() -> &'static str { "data" }

let app: Router = Router::new()
    .route("/api/data", axum::routing::get(handler))
    .layer(from_fn(is_authenticated_401));
}
is_authenticated_redirect
Redirects unauthenticated GET requests to login page; returns 401 for other methods.
#![allow(unused)]
fn main() {
use axum::{Router, middleware::from_fn};
use oauth2_passkey_axum::is_authenticated_redirect;

// Placeholder handler for illustration
async fn handler() -> &'static str { "dashboard" }

let app: Router = Router::new()
    .route("/dashboard", axum::routing::get(handler))
    .layer(from_fn(is_authenticated_redirect));
}
is_authenticated_user_401
Like is_authenticated_401, but also extracts user data into an Extension<AuthUser>.
#![allow(unused)]
fn main() {
use axum::{Router, middleware::from_fn, extract::Extension};
use oauth2_passkey_axum::{is_authenticated_user_401, AuthUser};
async fn handler(Extension(user): Extension<AuthUser>) -> String {
format!("Hello, {}", user.account)
}
let app: Router = Router::new()
.route("/api/profile", axum::routing::get(handler))
.layer(from_fn(is_authenticated_user_401));
}
is_authenticated_user_redirect
Like is_authenticated_redirect, but also extracts user data into an Extension<AuthUser>.
#![allow(unused)]
fn main() {
use axum::{Router, middleware::from_fn, extract::Extension};
use oauth2_passkey_axum::{is_authenticated_user_redirect, AuthUser};
async fn handler(Extension(user): Extension<AuthUser>) -> String {
format!("Hello, {}", user.account)
}
let app: Router = Router::new()
.route("/dashboard", axum::routing::get(handler))
.layer(from_fn(is_authenticated_user_redirect));
}
Extractors
AuthUser
Axum extractor for authenticated user information. Automatically verifies session and CSRF tokens.
#![allow(unused)]
fn main() {
use axum::{Router, routing::get};
use oauth2_passkey_axum::AuthUser;

async fn protected_handler(user: AuthUser) -> String {
    format!("Hello, {}!", user.label)
}

let app: Router = Router::new()
    .route("/protected", get(protected_handler));
}
Fields:
| Field | Type | Description |
|---|---|---|
id | String | Unique user identifier |
account | String | User’s account name (email or username) |
label | String | User’s display name |
is_admin | bool | Whether user has admin privileges |
sequence_number | Option<i64> | Database sequence number |
created_at | DateTime<Utc> | Account creation timestamp |
updated_at | DateTime<Utc> | Last update timestamp |
csrf_token | String | CSRF token for the session |
csrf_via_header_verified | bool | Whether CSRF was verified via header |
session_id | String | Session ID for secure API calls |
Methods:
- `has_admin_privileges()` - Returns `true` if the user has admin rights (either the `is_admin` flag is set or they are the first user)
Optional Extraction:
AuthUser also implements OptionalFromRequestParts, allowing optional user extraction:
#![allow(unused)]
fn main() {
async fn handler(user: Option<AuthUser>) -> String {
match user {
Some(u) => format!("Hello, {}!", u.label),
None => "Hello, guest!".to_string(),
}
}
}
URL Constants
| Constant | Description |
|---|---|
O2P_ROUTE_PREFIX | Route prefix for auth endpoints (default: /o2p) |
O2P_LOGIN_URL | Login page URL |
O2P_ADMIN_URL | Admin interface URL |
O2P_ACCOUNT_URL | User account management page URL |
O2P_DEFAULT_REDIRECT | Default redirect URL for auth flows |
Re-exports from oauth2-passkey
The following are re-exported from the core library for convenience:
| Item | Description |
|---|---|
init | Initialize the authentication system |
O2P_ROUTE_PREFIX | Route prefix constant |
CsrfToken | CSRF token type |
CsrfHeaderVerified | CSRF header verification marker |
Example: Protected API Routes
use axum::{Router, routing::{get, post}, middleware::from_fn, Json};
use oauth2_passkey_axum::{
oauth2_passkey_full_router, is_authenticated_user_401, AuthUser, init
};
use serde::Serialize;
#[derive(Serialize)]
struct UserProfile {
id: String,
name: String,
is_admin: bool,
}
async fn get_profile(user: AuthUser) -> Json<UserProfile> {
Json(UserProfile {
id: user.id,
name: user.label,
is_admin: user.has_admin_privileges(),
})
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
init().await?;
// Protected API routes
let api_routes = Router::new()
.route("/profile", get(get_profile))
.layer(from_fn(is_authenticated_user_401));
let app = Router::new()
// Authentication routes (includes /.well-known/webauthn when needed)
.merge(oauth2_passkey_full_router())
// Protected API
.nest("/api", api_routes);
let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await?;
axum::serve(listener, app).await?;
Ok(())
}
Example: Protected Web Pages with Redirect
#![allow(unused)]
fn main() {
use axum::{Router, routing::get, middleware::from_fn, response::Html};
use oauth2_passkey_axum::{is_authenticated_user_redirect, AuthUser};
async fn dashboard(user: AuthUser) -> Html<String> {
Html(format!(
"<h1>Welcome, {}!</h1><p>Your account: {}</p>",
user.label,
user.account
))
}
let protected_pages = Router::new()
.route("/dashboard", get(dashboard))
.layer(from_fn(is_authenticated_user_redirect));
}
Endpoint Reference
Complete list of all endpoints provided by oauth2_passkey_router().
OAuth2 Endpoints (/oauth2)
| Endpoint | Method | Description |
|---|---|---|
/google | GET | Start Google OAuth2 authentication flow |
/authorized | GET, POST | OAuth2 callback endpoint (redirect URI) |
/popup_close | GET | Close popup window after OAuth2 flow |
/accounts | GET | List user’s linked OAuth2 accounts |
/accounts/{provider}/{provider_user_id} | DELETE | Unlink an OAuth2 account |
/oauth2.js | GET | JavaScript SDK for OAuth2 operations |
Passkey Endpoints (/passkey)
| Endpoint | Method | Description |
|---|---|---|
/register/start | POST | Start passkey registration ceremony |
/register/finish | POST | Complete passkey registration ceremony |
/auth/start | POST | Start passkey authentication ceremony |
/auth/finish | POST | Complete passkey authentication ceremony |
/credentials | GET | List user’s passkey credentials |
/credentials/{credential_id} | DELETE | Delete a passkey credential |
/credential/update | POST | Update passkey credential metadata |
/conditional_ui | GET | Conditional UI page for passkey autofill* |
/passkey.js | GET | JavaScript SDK for passkey operations |
/conditional_ui.js | GET | JavaScript for conditional UI* |
User Endpoints (/user)
| Endpoint | Method | Description |
|---|---|---|
/login | GET | Login page* |
/account | GET | User account management page* |
/logout | GET | End session and redirect |
/info | GET | Get current user info as JSON* |
/csrf_token | GET | Get CSRF token for API calls* |
/delete | DELETE | Delete user account |
/update | PUT | Update user profile (account, label) |
/login_history | GET | User’s own login history (JSON) |
/login_history_page | GET | Login history page* |
/account.js | GET | JavaScript for account page* |
/account.css | GET | CSS for account page* |
/o2p-base.css | GET | Base CSS styles* |
Admin Endpoints (/admin)
| Endpoint | Method | Description |
|---|---|---|
/index | GET | User management list page* |
/user/{user_id} | GET | User detail page* |
/users | GET | List all users (JSON) |
/delete_user | DELETE | Delete a user (admin only) |
/delete_passkey_credential/{credential_id} | DELETE | Delete user’s passkey (admin only) |
/delete_oauth2_account/{provider}/{provider_user_id} | DELETE | Unlink user’s OAuth2 account (admin only) |
/update_admin_status | PUT | Grant/revoke admin privileges |
/sessions | GET | Active session counts for all users (JSON)* |
/user/{user_id}/logout | POST | Force logout a user (terminate all sessions)* |
/user/{user_id}/login_history | GET | Per-user login history (JSON) |
/audit | GET | Cross-user audit data with filters (JSON) |
/audit_page | GET | Audit page HTML template |
/admin_user.js | GET | JavaScript for admin pages* |
/admin_user.css | GET | CSS for admin pages* |
Login history query parameters (for /audit and /user/{user_id}/login_history):
| Parameter | Type | Default | Description |
|---|---|---|---|
limit | integer | 50 | Maximum entries to return |
offset | integer | 0 | Pagination offset |
from | string | - | Filter from date (YYYY-MM-DD) |
to | string | - | Filter to date (YYYY-MM-DD) |
tz_offset | integer | 0 | Timezone offset from UTC (minutes) |
user_id | string | - | Filter by user (audit only) |
success | boolean | - | Filter by success/failure (audit only) |
Admin safeguards: The admin API enforces protections to prevent admin lockout:
- Cannot delete or demote the last remaining admin user
- The first user (seq=1) cannot be demoted
- Admin users cannot delete their own account via the admin interface
Theme Endpoints (/themes)
| Endpoint | Method | Description |
|---|---|---|
/theme-zinc.css | GET | Zinc theme (neutral gray) |
/theme-slate.css | GET | Slate theme (cool gray) |
/theme-blue.css | GET | Blue theme |
/theme-violet.css | GET | Violet theme |
/theme-rose.css | GET | Rose theme |
/theme-neumorphism.css | GET | Neumorphism style theme |
/theme-material.css | GET | Material Design theme |
/theme-eco.css | GET | Eco / nature theme |
/theme-saas.css | GET | SaaS dashboard theme |
These are optional CSS theme files that override o2p-base.css variables. Set O2P_CUSTOM_CSS_URL to apply a theme. See Themes for details.
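For example, to apply the Slate theme (the path shown is an assumption derived from the endpoint table above, with the theme routes nested under the default `/o2p` prefix):

```
# .env
O2P_CUSTOM_CSS_URL='/o2p/themes/theme-slate.css'
```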
Feature Flags
Endpoints marked with * are controlled by feature flags:
| Feature | Default | Endpoints Affected |
|---|---|---|
login-ui | ON | /user/login |
user-ui | ON | /user/account, /user/account.js, /user/o2p-base.css |
admin-ui | ON | /admin/index, /admin/user/{id}, /admin/sessions, /admin/user/{id}/logout, admin static files |
API endpoints (/user/info, /user/csrf_token, /user/logout, /user/update, /user/delete, authentication, CRUD operations) are always available regardless of feature flags.
# Disable all built-in UI
oauth2-passkey-axum = { version = "0.3", default-features = false }
# Custom login page, keep account management and admin UI
oauth2-passkey-axum = { version = "0.3", default-features = false, features = ["user-ui", "admin-ui"] }
# Keep admin UI only
oauth2-passkey-axum = { version = "0.3", default-features = false, features = ["login-ui", "admin-ui"] }
Additional Router: passkey_well_known_router()
| Endpoint | Method | Description |
|---|---|---|
/.well-known/webauthn | GET | WebAuthn relying party configuration |
This router should be merged at the root level (not nested under O2P_ROUTE_PREFIX). Only needed for multi-origin setups. See Multi-Origin Passkey Setup.
iOS Safari Compatibility Guide
This document covers iOS Safari compatibility issues with OAuth2 authentication and development testing considerations.
Overview
iOS Safari has one main characteristic that affects OAuth2 authentication:
- Intelligent Tracking Prevention (ITP) - Blocks/partitions third-party cookies
This affects development testing with certain tunneling services (like ngrok) and OAuth2 response modes. Understanding these issues helps with debugging and configuration.
Note: OAuth2 popups work correctly on iOS Safari when triggered by user interaction and when using a proper tunneling solution like Cloudflare Tunnel.
OAuth2 Popup Behavior on iOS
Summary
iOS Safari popups work correctly when:
- The popup is triggered by a direct user action (click event)
- The application is accessed via Cloudflare Tunnel or direct proxy (not ngrok)
The library uses standard window.open() popup behavior on all platforms, including iOS Safari. No special fallback is needed.
Historical Note
Earlier versions included iOS-specific redirect fallback code based on the assumption that iOS WebKit blocks popups aggressively. Testing with Cloudflare Tunnel confirmed that popups work fine on iOS Safari when triggered properly. The fallback code was removed in December 2025 as unnecessary complexity.
OAuth2 Response Modes and Cookies
Response Modes
OAuth2 supports two response modes for returning the authorization code:
| Mode | How it works |
|---|---|
form_post (default) | Google auto-submits a POST form to your callback URL |
query | Google redirects via GET with code in query string |
SameSite Cookie Requirements
The CSRF cookie’s SameSite attribute must match the response mode:
| Response Mode | Callback Method | SameSite Required | Why |
|---|---|---|---|
form_post | Cross-site POST | None | Browser must send cookie on cross-origin POST |
query | Top-level GET redirect | Lax | Cookies sent on top-level navigation |
iOS Safari ITP Issue
iOS Safari’s Intelligent Tracking Prevention (ITP) blocks SameSite=None cookies on cross-site requests.
Result: form_post mode fails on iOS Safari:
1. CSRF cookie is set with `SameSite=None; Secure`
2. User authenticates on Google
3. Google POSTs back to your site
4. iOS Safari blocks the cookie due to ITP
5. CSRF validation fails
Browser Compatibility
| Browser | Third-party cookies | form_post works? |
|---|---|---|
| Android Chrome | Allowed by default | Yes |
| Desktop Chrome | Allowed by default | Yes |
| Desktop Safari | Some ITP restrictions | Usually |
| iOS Safari | Strict ITP | No |
| iOS Chrome | Uses WebKit (same as Safari) | No |
Note: All browsers on iOS use WebKit (Apple requirement), so iOS Chrome has the same restrictions as iOS Safari.
Configuration
Use query response mode for iOS compatibility:
# .env file
OAUTH2_RESPONSE_MODE='query'
With query mode:
- Google redirects back via GET (not POST)
- Cookie uses `SameSite=Lax` (not `None`)
- iOS Safari’s ITP doesn’t block `Lax` cookies on top-level navigations
Development Testing on iOS
The ngrok Problem
When testing on iOS devices, developers often use ngrok to expose localhost. However, ngrok’s free tier does not work reliably on iOS.
Root Cause: ngrok shows an interstitial page that requires a cookie to bypass. iOS Safari’s ITP blocks this cookie for subresource requests (like JavaScript files).
What happens:
- Main HTML page loads (user clicked through interstitial)
- Browser requests external JavaScript files
- ngrok returns HTML interstitial instead of JavaScript (cookie blocked)
- Scripts fail to load, buttons don’t work
Evidence:
# Via ngrok on iOS (broken):
fetch oauth2.js content-type=text/html
fetch oauth2.js starts=<!DOCTYPE html>...
# Via nginx (works):
fetch oauth2.js content-type=application/javascript
fetch oauth2.js starts=const oauth2 = (function()...
Why Android Works but iOS Fails
| Browser | Third-party cookies | ngrok works? |
|---|---|---|
| Android Chrome | Allowed by default | Yes |
| Desktop Chrome | Allowed by default | Yes |
| iOS Safari | Blocked by ITP | No |
| iOS Chrome | Blocked by ITP (uses WebKit) | No |
Recommended Testing Methods
| Method | Use Case | iOS Compatible |
|---|---|---|
| Cloudflare Tunnel | Remote testing over internet | Yes |
| nginx reverse proxy | Local network testing | Yes |
| localhost | Direct access on same machine | N/A (no mobile) |
Cloudflare Tunnel is recommended for iOS testing:
- No interstitial page - Direct tunneling, no cookie dependency
- Works with iOS Safari - No ITP issues
- Free - Quick tunnels require no account
For setup instructions, see Development Tunneling Guide.
ngrok Workarounds
If you must use ngrok, upgrade to a paid plan which removes the interstitial entirely.
Summary
| Issue | Cause | Solution |
|---|---|---|
form_post fails | iOS ITP blocks SameSite=None cookies | Use OAUTH2_RESPONSE_MODE='query' |
| ngrok doesn’t work on iOS | iOS ITP blocks ngrok’s session cookie | Use Cloudflare Tunnel |
Note: OAuth2 popups work correctly on iOS Safari when using Cloudflare Tunnel or direct proxy.
References
- Full Third-Party Cookie Blocking - WebKit
- SameSite cookies explained - web.dev
- OAuth 2.0 Form Post Response Mode - RFC
- Quick Tunnels - Cloudflare Docs
Attestation Overview
What is Attestation?
Attestation in WebAuthn is a mechanism that allows a relying party (your application) to verify the provenance and characteristics of an authenticator during registration. When a user creates a new credential, the authenticator can provide cryptographic proof about itself, including information about its manufacturer, security properties, and certification status.
The attestation statement is included in the registration response and contains:
- The attestation format identifier (e.g., “none”, “packed”, “tpm”)
- Format-specific attestation data (signatures, certificates, etc.)
- The authenticator data containing the new credential
Why Attestation Matters
Attestation provides several security benefits:
- Authenticator Verification: Confirm that a credential was created by a genuine, certified authenticator rather than a software emulator or compromised device.
- Security Policy Enforcement: Organizations can require specific authenticator types or certification levels based on their security requirements.
- AAGUID Identification: The Authenticator Attestation GUID (AAGUID) identifies the authenticator model, enabling relying parties to make policy decisions based on known device characteristics.
- Supply Chain Trust: For high-security environments, attestation provides cryptographic proof that traces back to the authenticator manufacturer.
However, attestation also has privacy implications. Detailed attestation can potentially be used to track users across services, which is why many platform authenticators use “none” attestation by default.
Supported Formats
The oauth2-passkey library supports three attestation formats, each suitable for different use cases.
None Attestation
The “none” attestation format provides no cryptographic proof of authenticator provenance. It is the simplest format and is typically used by platform authenticators (built-in biometric sensors, operating system passkey implementations).
Key characteristics:
- Empty attestation statement
- Relies on platform security guarantees
- Maximum user privacy
- Suitable for most consumer applications
For detailed verification procedures and implementation notes, see Chapter 16: None Attestation.
Packed Attestation
The “packed” attestation format is commonly used by security keys (like YubiKeys) and provides a compact but comprehensive attestation statement. It supports multiple attestation types:
- Basic/Full Attestation: Includes an attestation certificate chain for full authenticator verification
- Self Attestation: The credential signs its own attestation (no external certificate)
Key characteristics:
- Contains algorithm identifier and signature
- Optional certificate chain (x5c) for full attestation
- Supports ES256 algorithm
- Common format for FIDO2 security keys
For detailed verification procedures and certificate requirements, see Chapter 17: Packed Attestation.
TPM Attestation
The TPM (Trusted Platform Module) attestation format is used by authenticators that leverage hardware TPM chips for cryptographic operations. This format provides strong hardware-backed attestation with detailed device information.
Key characteristics:
- Requires TPM 2.0
- Includes AIK (Attestation Identity Key) certificate
- Contains TPM-specific structures (certInfo, pubArea)
- Provides hardware-rooted trust
- Common on Windows Hello and enterprise devices
For detailed verification procedures, TPM structures, and certificate requirements, see Chapter 18: TPM Attestation.
Choosing an Attestation Format
The choice of attestation format depends on your security requirements and use case:
| Use Case | Recommended Format | Reason |
|---|---|---|
| Consumer applications | None | Maximum privacy, broad compatibility |
| Enterprise with security keys | Packed | Verifies authenticator authenticity |
| High-security environments | TPM or Packed (Full) | Hardware-backed trust, certificate verification |
| Privacy-sensitive applications | None | No tracking potential |
| Authenticator inventory management | Packed or TPM | AAGUID and certificate provide device information |
General guidelines:
- Default to “none”: Unless you have specific requirements to verify authenticator provenance, accepting “none” attestation provides the best user experience and privacy.
- Use “packed” for security keys: When deploying hardware security keys in an organization, packed attestation allows verification of genuine devices.
- Use “tpm” for Windows environments: Windows Hello typically provides TPM attestation, suitable for enterprise Windows deployments.
- Consider privacy implications: Full attestation can potentially identify specific authenticator models, which may have privacy implications for your users.
Library Configuration
The oauth2-passkey library handles attestation verification automatically during credential registration. The library:
1. Detects the attestation format from the registration response’s `fmt` field.
2. Routes to the appropriate verifier based on the format (“none”, “packed”, or “tpm”).
3. Performs format-specific verification following WebAuthn specification requirements.
4. Extracts the credential public key and AAGUID for storage.
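The AAGUID extraction mentioned above reads from a fixed position in the WebAuthn authenticator data. The following is an illustrative sketch of that byte layout, not the library’s internal code:

```rust
// authenticatorData layout (WebAuthn spec):
// rpIdHash (32 bytes) | flags (1) | signCount (4) | AAGUID (16) | credential ID | COSE key
// The AAGUID therefore occupies bytes 37..53 when attested credential data is present.
fn extract_aaguid(auth_data: &[u8]) -> Option<[u8; 16]> {
    auth_data.get(37..53)?.try_into().ok()
}

fn main() {
    // 37 header bytes, then a recognizable 16-byte AAGUID.
    let mut data = vec![0u8; 37];
    data.extend_from_slice(&[0xAB; 16]);
    assert_eq!(extract_aaguid(&data), Some([0xAB; 16]));
    // Too-short input yields None rather than panicking.
    assert_eq!(extract_aaguid(&[0u8; 10]), None);
    println!("aaguid extraction ok");
}
```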
Configuration options:
- User Verification: The `PASSKEY_USER_VERIFICATION` setting controls whether user verification (biometric, PIN) is required during registration.
- Attestation Preference: When initiating registration, you can request a specific attestation conveyance:
  - `none`: Request no attestation (default for privacy)
  - `indirect`: Accept any attestation the authenticator provides
  - `direct`: Request direct attestation from the authenticator
  - `enterprise`: Request enterprise attestation (for managed devices)
The library accepts all supported attestation formats regardless of the preference setting, as authenticators may provide different formats than requested.
Verification behavior:
- All formats verify the RP ID hash and authenticator flags
- Certificate-based formats (packed full, TPM) verify certificate chains
- The library logs AAGUIDs for debugging and potential policy use
- Verification failures result in registration rejection with appropriate error messages
None Attestation in WebAuthn
This document describes the “none” attestation format as implemented in the oauth2-passkey library, following the WebAuthn specification.
Overview
The “none” attestation format is typically used by platform authenticators (like built-in biometric sensors) where the browser or operating system vouches for the authenticator. This format provides no cryptographic proof of the authenticator’s provenance but relies on the platform’s security guarantees.
Attestation Statement Format
The “none” attestation statement is the simplest of all formats:
attStmtType = (
fmt: "none",
attStmt: {}
)
Field Descriptions
- fmt: The attestation statement format identifier, which is “none”.
- attStmt: An empty map ({}), as no attestation-specific data is provided.
Verification Procedure
The verification procedure for “none” attestation statements follows these steps:
1. Empty Statement Verification: Verify that the attestation statement (`attStmt`) is empty.
2. RP ID Hash Verification: Verify that the RP ID hash in `authenticatorData` matches the SHA-256 hash of the RP ID.
3. Flag Verification:
   - Verify that the User Present (UP) flag is set.
   - If User Verification (UV) is required by policy, verify that the UV flag is set.
   - Verify that the Attested Credential Data flag is set.
4. AAGUID Extraction: Extract and log the AAGUID from the authenticator data.
5. Public Key Verification:
   - Verify that the credential public key is in the correct COSE format.
   - Extract and validate the public key coordinates.
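The flag checks described above operate on a single byte of the authenticator data. The following is an illustrative sketch of those checks against the WebAuthn byte layout, not the library’s internal implementation:

```rust
// Flags byte sits at offset 32, immediately after the 32-byte rpIdHash.
// Bit meanings per the WebAuthn spec: UP = 0x01, UV = 0x04, AT = 0x40.
fn check_flags(auth_data: &[u8], require_uv: bool) -> Result<(), &'static str> {
    let flags = *auth_data.get(32).ok_or("authenticator data too short")?;
    if flags & 0x01 == 0 {
        return Err("User Present (UP) flag not set");
    }
    if require_uv && flags & 0x04 == 0 {
        return Err("User Verification (UV) flag not set");
    }
    if flags & 0x40 == 0 {
        return Err("Attested Credential Data (AT) flag not set");
    }
    Ok(())
}

fn main() {
    let mut data = vec![0u8; 37];
    data[32] = 0x45; // UP | UV | AT
    assert!(check_flags(&data, true).is_ok());
    data[32] = 0x41; // UP | AT, no UV
    assert!(check_flags(&data, true).is_err());
    println!("flag checks ok");
}
```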
Implementation Notes
- The library performs basic checks on the authenticator data structure.
- User Verification requirements are configurable through the `PASSKEY_USER_VERIFICATION` setting.
- The implementation extracts and logs the AAGUID for potential future use.
- The public key is verified to ensure it follows the expected COSE key format.
Compliance Assessment
The oauth2-passkey library implementation of “none” attestation has been assessed against the WebAuthn specification requirements. Here’s a summary of the compliance status:
| Requirement | Status | Notes |
|---|---|---|
| Empty Statement Verification | ✅ Compliant | Verifies that attStmt is empty |
| RP ID Hash Verification | ✅ Compliant | Ensures the RP ID hash matches the expected value |
| User Present Flag | ✅ Compliant | Verifies the UP flag is set |
| User Verification Flag | ✅ Compliant | Checks UV flag when required by policy |
| Attested Credential Data | ✅ Compliant | Verifies the flag is set and data is present |
| AAGUID Extraction | ✅ Compliant | Successfully extracts and logs the AAGUID |
| Public Key Format | ✅ Compliant | Validates the COSE key format and coordinates |
Areas for Improvement
While the implementation is fully compliant with the WebAuthn specification, there are some areas that could be enhanced:
- Logging Enhancement: More detailed logging of the verification steps could aid debugging and auditing.
- Error Messages: More specific error messages could be provided for each verification step.
- Configuration Options: Additional options could be exposed for customizing verification behavior.
Packed Attestation in WebAuthn
This document describes the “packed” attestation format as implemented in the oauth2-passkey library, following the WebAuthn specification.
Overview
The “packed” attestation format is commonly used by security keys and provides a compact but comprehensive attestation statement. It supports multiple attestation types: Basic, AttCA (with an attestation certificate), and Self Attestation.
Attestation Statement Format
The “packed” attestation statement follows this structure:
attStmtType = (
fmt: "packed",
attStmt: packedStmtFormat
)
packedStmtFormat = {
alg: COSEAlgorithmIdentifier,
sig: bytes,
[x5c: [ attestnCert: bytes, * (caCert: bytes) ]],
[ecdaaKeyId: bytes]
}
Field Descriptions
- fmt: The attestation statement format identifier, which is “packed”.
- alg: A COSEAlgorithmIdentifier containing the identifier of the algorithm used to generate the attestation signature.
- sig: The attestation signature.
- x5c (optional): The attestation certificate and its certificate chain, in X.509 encoding.
- ecdaaKeyId (optional): The identifier of the ECDAA key used for the attestation (not supported in current implementation).
Verification Procedure
The verification procedure for “packed” attestation statements follows these steps:
1. Algorithm and Signature Extraction: Extract the algorithm identifier (`alg`) and signature (`sig`) from the attestation statement.
2. Signed Data Construction: Concatenate `authenticatorData` and `clientDataHash` to form the signed data.
3. Algorithm Verification: Verify that the algorithm is supported (currently only ES256).
4. Attestation Type Determination: Check for the presence of `x5c` and `ecdaaKeyId` to determine the attestation type.
5. Attestation Verification:
   - For Full Attestation (`x5c` present):
     - Parse and verify the attestation certificate.
     - Verify certificate attributes according to FIDO standards.
     - Verify the signature using the attestation certificate’s public key.
     - Verify the certificate chain if intermediates are present.
   - For Self Attestation (neither `x5c` nor `ecdaaKeyId` present):
     - Extract the credential public key from `authenticatorData`.
     - Verify the signature using this public key.
   - For ECDAA Attestation (`ecdaaKeyId` present): currently not supported.
Certificate Verification
For Full Attestation, the attestation certificate is verified to ensure it meets these requirements:
- Basic Constraints: Verify the certificate is not a CA certificate.
- AAGUID Verification: If the certificate contains the FIDO AAGUID extension (OID 1.3.6.1.4.1.45724.1.1.4), verify it matches the AAGUID in `authenticatorData`.
Certificate Chain Verification
If the attestation statement includes intermediate certificates, the library verifies:
- Certificate Parsing: Each certificate in the chain can be parsed correctly.
- Certificate Validity: Each certificate is currently valid (not expired or not yet valid).
Self Attestation Verification
For Self Attestation, the library:
1. Extracts the Credential Public Key: From the authenticatorData.
2. Constructs the Full Public Key: Formats the extracted coordinates as an uncompressed EC point.
3. Verifies the Signature: Using the credential’s own public key.
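Step 2 amounts to prefixing the two 32-byte coordinates with the SEC1 uncompressed-point marker. A minimal sketch (the actual extraction of x and y from the COSE key in authenticatorData is omitted):

```rust
// Build a SEC1 uncompressed P-256 public key: 0x04 || x || y.
// The 32-byte x and y coordinates come from the COSE key embedded in
// authenticatorData (extraction omitted here).
fn uncompressed_ec_point(x: &[u8; 32], y: &[u8; 32]) -> Vec<u8> {
    let mut point = Vec::with_capacity(65);
    point.push(0x04); // uncompressed-point marker
    point.extend_from_slice(x);
    point.extend_from_slice(y);
    point
}
```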
Compliance Assessment
The oauth2-passkey library implementation of “packed” attestation has been assessed against the WebAuthn specification requirements. Here’s a summary of the compliance status:
| Requirement | Status | Notes |
|---|---|---|
| Algorithm Extraction | ✅ Compliant | Correctly extracts and verifies the algorithm |
| Signature Extraction | ✅ Compliant | Correctly extracts the signature |
| Signed Data Construction | ✅ Compliant | Properly concatenates authenticatorData and clientDataHash |
| Algorithm Verification | ✅ Compliant | Verifies ES256 algorithm support |
| Attestation Type Determination | ✅ Compliant | Correctly identifies attestation type |
| Full Attestation Verification | ✅ Compliant | Properly verifies certificates and signatures |
| Self Attestation Verification | ✅ Compliant | Correctly extracts and verifies using credential’s own key |
| Certificate Basic Constraints | ✅ Compliant | Verifies certificate is not a CA |
| AAGUID Verification | ✅ Compliant | Matches certificate AAGUID with authenticator AAGUID |
| Certificate Chain Verification | ✅ Compliant | Verifies intermediate certificates when present |
| ECDAA Attestation | ❌ Not Implemented | ECDAA attestation is not currently supported |
Areas for Improvement
While the implementation is largely compliant with the WebAuthn specification, there are some areas that could be enhanced:
- ECDAA Support: The current implementation does not support ECDAA attestation, which is optional in the WebAuthn specification.
- Certificate Verification: More comprehensive certificate verification could be implemented, including checking for revocation status.
- Error Handling: More detailed error messages could be provided for specific verification failures.
- Performance Optimization: Certificate parsing and verification could potentially be optimized for better performance.
References
- WebAuthn Specification - Packed Attestation
- FIDO Metadata Service
- WebAuthn Specification - Attestation Types
TPM Attestation in WebAuthn
This document describes the TPM (Trusted Platform Module) attestation format as implemented in the oauth2-passkey library, following the WebAuthn specification.
Overview
TPM attestation is used by authenticators that use a Trusted Platform Module as their cryptographic engine. The TPM attestation statement format is identified by the string “tpm” and supports the AttCA attestation type.
Attestation Statement Format
The TPM attestation statement follows this structure:
attStmtType = (
fmt: "tpm",
attStmt: tpmStmtFormat
)
tpmStmtFormat = {
ver: "2.0",
(
alg: COSEAlgorithmIdentifier,
x5c: [ aikCert: bytes, * (caCert: bytes) ]
)
sig: bytes,
certInfo: bytes,
pubArea: bytes
}
Field Descriptions
- ver: The version of the TPM specification to which the signature conforms.
- alg: A COSEAlgorithmIdentifier containing the identifier of the algorithm used to generate the attestation signature.
- x5c: aikCert followed by its certificate chain, in X.509 encoding.
- aikCert: The AIK certificate used for the attestation, in X.509 encoding.
- sig: The attestation signature, in the form of a TPMT_SIGNATURE structure.
- certInfo: The TPMS_ATTEST structure over which the signature was computed.
- pubArea: The TPMT_PUBLIC structure used by the TPM to represent the credential public key.
Verification Procedure
The verification procedure for TPM attestation statements follows these steps:
1. Basic Structure Verification:
   - Verify that the attestation statement is valid CBOR with the required fields (ver, alg, x5c, sig, certInfo, pubArea)
   - Check that the version is “2.0”
2. Public Key Verification:
   - Verify that the public key in pubArea matches the credential public key in authenticatorData
3. certInfo Validation:
   - Verify that magic is set to TPM_GENERATED_VALUE
   - Verify that type is set to TPM_ST_ATTEST_CERTIFY
   - Verify that extraData is set to the hash of attToBeSigned (authenticatorData + clientDataHash)
   - Verify that attested contains a valid TPMS_CERTIFY_INFO structure with the correct name field
4. x5c Verification:
   - Verify that x5c is present
   - The qualifiedSigner, clockInfo, and firmwareVersion fields are ignored
5. Signature Verification:
   - Verify that the signature is valid over certInfo using the attestation public key in aikCert
6. AIK Certificate Requirements:
   - Verify that the AIK certificate version is 3
   - Verify that the Subject field is empty
   - Verify the Subject Alternative Name extension
   - Verify the Extended Key Usage extension contains the required OID (2.23.133.8.3)
   - Verify the Basic Constraints extension has CA set to false
   - If present, verify the AAGUID extension (OID 1.3.6.1.4.1.45724.1.1.4) matches the AAGUID in authenticatorData
TPM Structures
TPMS_ATTEST Structure
The TPMS_ATTEST structure contains the following fields:
- magic: Must be set to TPM_GENERATED_VALUE (0xff544347)
- type: Must be set to TPM_ST_ATTEST_CERTIFY (0x8017)
- qualifiedSigner: TPM name of the key signing the attestation
- extraData: The hash of attToBeSigned (authenticatorData + clientDataHash)
- clockInfo: Information about the TPM’s clock
- firmwareVersion: The TPM’s firmware version
- attested: Contains a TPMS_CERTIFY_INFO structure
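The magic and type checks on this structure reduce to reading two big-endian integers at the start of certInfo. A std-only sketch (function and error types are illustrative, not the library’s actual code):

```rust
// TPM constants from the TPM 2.0 structures specification.
const TPM_GENERATED_VALUE: u32 = 0xff54_4347;
const TPM_ST_ATTEST_CERTIFY: u16 = 0x8017;

// Check the leading fields of a TPMS_ATTEST structure: a 4-byte
// big-endian magic followed by a 2-byte big-endian type.
fn check_cert_info_header(cert_info: &[u8]) -> Result<(), String> {
    if cert_info.len() < 6 {
        return Err("certInfo too short".into());
    }
    let magic = u32::from_be_bytes([cert_info[0], cert_info[1], cert_info[2], cert_info[3]]);
    if magic != TPM_GENERATED_VALUE {
        return Err(format!("bad magic: {magic:#010x}"));
    }
    let attest_type = u16::from_be_bytes([cert_info[4], cert_info[5]]);
    if attest_type != TPM_ST_ATTEST_CERTIFY {
        return Err(format!("bad type: {attest_type:#06x}"));
    }
    Ok(())
}
```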
TPMS_CERTIFY_INFO Structure
The TPMS_CERTIFY_INFO structure contains:
- name: The TPM name of the certified key (hash of pubArea)
- qualifiedName: The qualified name of the certified key
Name Verification
The name field in the TPMS_CERTIFY_INFO structure is a hash of the pubArea using the nameAlg algorithm. The verification process includes:
#![allow(unused)]
fn main() {
// Extract the name algorithm from pubArea
let _name_alg = u16::from_be_bytes([pub_area[2], pub_area[3]]);
// Calculate the hash of pubArea using the nameAlg
let pub_area_hash = match _name_alg {
0x000B => {
// TPM_ALG_SHA256
use sha2::{Digest, Sha256};
let mut hasher = Sha256::new();
hasher.update(pub_area);
hasher.finalize().to_vec()
}
0x000C => {
// TPM_ALG_SHA384
use sha2::{Digest, Sha384};
let mut hasher = Sha384::new();
hasher.update(pub_area);
hasher.finalize().to_vec()
}
0x000D => {
// TPM_ALG_SHA512
use sha2::{Digest, Sha512};
let mut hasher = Sha512::new();
hasher.update(pub_area);
hasher.finalize().to_vec()
}
_ => {
    // Unsupported nameAlg: abort verification
    // (error variant shown for illustration)
    return Err(PasskeyError::Verification(
        "unsupported TPM name algorithm".into(),
    ));
}
};
// The name field includes a 2-byte algorithm ID followed by the hash
// Verify that the hash part matches our calculated hash
}
AIK Certificate Verification
The AIK certificate must meet specific requirements:
- Version: Must be set to 3
- Subject: Must be empty
- Subject Alternative Name: Must be present as defined in TPMv2-EK-Profile
- Extended Key Usage: Must contain the OID 2.23.133.8.3
- Basic Constraints: Must have CA set to false
- AAGUID Extension: If present (OID 1.3.6.1.4.1.45724.1.1.4), must match the AAGUID in authenticatorData
Compliance Assessment
The oauth2-passkey library implementation of TPM attestation has been assessed against the WebAuthn specification requirements. Here’s a summary of the compliance status:
| Requirement | Status | Notes |
|---|---|---|
| Basic Structure Verification | ✅ Compliant | Verifies all required fields and format |
| Public Key Verification | ✅ Compliant | Ensures pubArea matches credentialPublicKey |
| certInfo Validation | ✅ Compliant | Verifies magic, type, extraData, and attested fields |
| x5c Verification | ✅ Compliant | Checks presence and properly ignores specified fields |
| Signature Verification | ✅ Compliant | Validates signature over certInfo using AIK certificate |
| AIK Certificate Version | ✅ Compliant | Verifies version is 3 |
| AIK Certificate Subject | ✅ Compliant | Verifies subject is empty |
| Subject Alternative Name | ✅ Compliant | Verifies extension is present |
| Extended Key Usage | ✅ Compliant | Verifies OID 2.23.133.8.3 is present |
| Basic Constraints | ✅ Compliant | Verifies CA is false |
| AAGUID Extension | ✅ Compliant | Verifies match with authenticatorData when present |
Areas for Improvement
While the implementation is fully compliant with the WebAuthn specification, there are some areas that could be enhanced:
- Fallback Verification Robustness: The fallback verification using x509-parser could benefit from more detailed error messages to help diagnose specific validation failures.
- Error Handling: Current error handling could be enhanced with more specific error types for each verification step.
- Testing Coverage: Comprehensive tests for various edge cases and failure modes would strengthen the implementation.
- Performance Optimization: The current implementation prioritizes correctness and compliance over performance. There may be opportunities to optimize the verification process for high-volume deployments.
Supported Signature Algorithms
The TPM attestation verifier supports the following COSE algorithms for signature verification over certInfo:
| COSE Alg ID | Name | Description | Verification Method |
|---|---|---|---|
| -7 | ES256 | ECDSA with P-256 and SHA-256 | webpki |
| -257 | RS256 | RSASSA-PKCS1-v1_5 with SHA-256 | webpki |
| -65535 | RS1 | RSASSA-PKCS1-v1_5 with SHA-1 | ring (legacy) |
RS1 (SHA-1 RSA) Support
Windows Hello with TPM attestation commonly uses RS1 (-65535) to sign the certInfo structure, even when the credential key itself uses ES256. This is because the TPM’s Attestation Identity Key (AIK) may be an RSA key that signs using SHA-1.
Since the webpki crate does not support SHA-1 RSA signature verification, RS1 signatures are verified directly using the ring crate’s RSA_PKCS1_2048_8192_SHA1_FOR_LEGACY_USE_ONLY algorithm. The public key is extracted from the AIK certificate using x509-parser.
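The resulting dispatch can be sketched as a match on the COSE algorithm identifier. This sketch only labels which backend would be used; the actual webpki and ring verification calls are omitted, and the function name is illustrative:

```rust
// Which backend verifies the certInfo signature for each supported
// COSE algorithm. Sketch only: real webpki/ring calls are omitted.
fn verification_backend(cose_alg: i64) -> Result<&'static str, String> {
    match cose_alg {
        -7 => Ok("webpki (ECDSA P-256 / SHA-256)"),             // ES256
        -257 => Ok("webpki (RSA PKCS#1 v1.5 / SHA-256)"),       // RS256
        -65535 => Ok("ring (RSA_PKCS1_2048_8192_SHA1_FOR_LEGACY_USE_ONLY)"), // RS1
        other => Err(format!("unsupported COSE algorithm: {other}")),
    }
}
```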
Why Windows Hello TPM uses RS1
The attestation signature algorithm is determined by the TPM hardware, not the OS. The TPM’s AIK (Attestation Identity Key) is derived from the Endorsement Key (EK), which is burned into the TPM chip at manufacturing time. On many existing TPM chips, the EK is an RSA key that signs using SHA-1, and the AIK inherits this characteristic.
It is important to distinguish between two different keys and algorithms in TPM attestation:
- Attestation signature (AIK): Signs
certInfoto prove the TPM generated the credential. This is where RS1 (-65535) appears. The algorithm is dictated by the TPM hardware. - Credential key: Used for ongoing authentication (login). Typically uses ES256 (
-7), a modern and secure algorithm. This is unaffected by the AIK’s algorithm.
Why RS1 persists and the outlook
SHA-1 has been considered cryptographically broken since 2017 (SHAttered collision attack), and all major browsers and CAs have stopped accepting SHA-1 certificates. However, RS1 remains in TPM attestation for the following reasons:
- Hardware constraint: Changing the AIK algorithm requires a new TPM chip, not just a software or firmware update. The EK is immutable, and re-provisioning risks breaking dependent services (e.g., BitLocker relies on the TPM’s key hierarchy).
- webpki’s design decision: The
webpki(rustls-webpki) crate intentionally excludes SHA-1 as a security policy. This is a deliberate choice, not an omission, and future support is unlikely. - ring’s pragmatic approach: The
ringcrate providesRSA_PKCS1_2048_8192_SHA1_FOR_LEGACY_USE_ONLYexplicitly for legacy compatibility scenarios like this one.
Newer TPM chips (especially those shipping in recent PCs) tend to support SHA-256 based attestation. However, the transition is tied to hardware replacement cycles (5-10 years), so RS1 support will remain necessary for the foreseeable future.
Security impact
The use of RS1 for attestation has limited security impact:
- The SHA-1 signature only proves that the TPM generated the credential at registration time. It does not affect the security of ongoing authentication.
- An attacker exploiting SHA-1 collisions would need to forge a TPM attestation statement, which would also require compromising the TPM’s private AIK – a much harder attack than finding a hash collision.
- The credential’s authentication signatures use ES256 (SHA-256), which is not affected.
Implementation Notes
- The library uses webpki, x509-parser, and ring for certificate and signature verification
- RS1 (SHA-1 RSA) signatures bypass webpki and are verified directly with ring’s legacy API
- A fallback verification mechanism is implemented when webpki cannot parse the certificate
- The implementation follows a modular approach to separate core attestation logic from TPM-specific logic
- Comprehensive error handling is provided throughout the attestation verification process
AAGUID and Authenticator Metadata
This document explains how passkey authenticator icons and names are determined using AAGUID.
What is AAGUID?
AAGUID (Authenticator Attestation Globally Unique Identifier) is a 128-bit identifier that indicates the type (make and model) of an authenticator. It allows Relying Parties (RPs) to identify which device or password manager created a passkey.
Examples:
- ea9b8d66-4d01-1d21-3ce4-b6b48cb575d4 - Google Password Manager
- adce0002-35bc-c60a-648b-0b25f1f05503 - 1Password
- 00000000-0000-0000-0000-000000000000 - Unknown (often Apple devices)
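The AAGUID arrives from the authenticator as 16 raw bytes and is typically rendered as a lowercase hyphenated 8-4-4-4-12 string for metadata lookup. A std-only sketch of that conversion (the function name is illustrative):

```rust
// Format a raw 16-byte AAGUID as the canonical lowercase
// 8-4-4-4-12 UUID string used for metadata lookup.
fn format_aaguid(bytes: &[u8; 16]) -> String {
    let hex: Vec<String> = bytes.iter().map(|b| format!("{b:02x}")).collect();
    let h = hex.concat();
    format!(
        "{}-{}-{}-{}-{}",
        &h[0..8], &h[8..12], &h[12..16], &h[16..20], &h[20..32]
    )
}
```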
Metadata Sources
FIDO Metadata Service (MDS)
The official FIDO Metadata Service provides metadata for FIDO-certified hardware authenticators.
| Aspect | Details |
|---|---|
| Endpoint | https://mds3.fidoalliance.org/ |
| Format | JWT BLOB (requires signature verification) |
| Auth | Not required (public) |
| Coverage | Hardware security keys (YubiKey, Titan Key, etc.) |
| Update frequency | Monthly recommended |
Limitation: FIDO MDS does not include password managers (Google Password Manager, iCloud Keychain, 1Password, etc.) because they don’t go through FIDO certification.
Community AAGUID Repository
The passkey-authenticator-aaguids project provides a community-sourced list that includes both hardware authenticators and password managers.
| Aspect | Details |
|---|---|
| Endpoint | https://raw.githubusercontent.com/passkeydeveloper/passkey-authenticator-aaguids/main/combined_aaguid.json |
| Format | Simple JSON |
| Auth | Not required |
| Coverage | Hardware keys + Password managers |
| Recommended by | web.dev |
This library uses the community repository to support displaying icons for all authenticator types.
Comparison of Data Sources
| Source | Hardware Keys | Password Managers | Verification | Complexity |
|---|---|---|---|---|
| FIDO MDS | Yes | No | JWT signature | High |
| Community repo | Yes | Yes | None | Low |
How This Library Uses AAGUID
Data Flow
Registration
|
v
Authenticator returns AAGUID
|
v
Store in passkey_credentials table
|
v
get_authenticator_info(aaguid)
|
v
Lookup from cache (loaded from community JSON)
|
v
Display name + icon in templates
Implementation
The AAGUID lookup is implemented in oauth2_passkey/src/passkey/main/aaguid.rs:
#![allow(unused)]
fn main() {
// Embedded fallback data
const AAGUID_JSON: &str = include_str!("../../../assets/aaguid.json");
// Remote source (updated regularly)
const AAGUID_URL: &str = "https://raw.githubusercontent.com/passkeydeveloper/passkey-authenticator-aaguids/main/combined_aaguid.json";
// Lookup function
pub async fn get_authenticator_info(aaguid: &str) -> Result<Option<AuthenticatorInfo>, PasskeyError>
}
Data Structure
#![allow(unused)]
fn main() {
pub struct AuthenticatorInfo {
pub name: String, // e.g., "Google Password Manager"
pub icon_dark: Option<String>, // Base64-encoded SVG for dark theme
pub icon_light: Option<String>, // Base64-encoded SVG for light theme
}
}
Attestation vs AAGUID
| Aspect | Attestation | AAGUID |
|---|---|---|
| Purpose | Cryptographic proof of authenticator | Identifier for display |
| Verification | Certificate chain | None (self-reported) |
| Security use | Yes (device policy enforcement) | No (can be spoofed) |
| Display use | No | Yes (icons, names) |
Important: AAGUID should only be used for UI display purposes (showing icons and names). It should not be used for security decisions because it can be spoofed without attestation verification.
Comparison with Other Libraries
| Library | AAGUID Extraction | FIDO MDS | Password Manager Icons |
|---|---|---|---|
| SimpleWebAuthn (JS) | Yes | No | Requires separate JSON |
| webauthn4j (Java) | Yes | Yes (for attestation) | No |
| Yubico java-webauthn-server | Yes | Yes (full) | No |
| This library | Yes | No | Yes (community repo) |
All libraries require the community AAGUID repository for password manager icon display.
Apple’s Zero AAGUID
Apple devices historically return 00000000-0000-0000-0000-000000000000 (all zeros) as the AAGUID. This is because:
- Apple prioritizes user privacy
- Apple devices don’t support attestation
- Revealing the exact device model could be a fingerprinting vector
When this AAGUID is encountered, the library displays “Unknown Authenticator” or a generic icon.
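The display-name fallback amounts to a cache lookup with a special case for the all-zero AAGUID. A simplified sketch, using a plain HashMap in place of the library’s cache and an illustrative function name:

```rust
use std::collections::HashMap;

const ZERO_AAGUID: &str = "00000000-0000-0000-0000-000000000000";

// Resolve a display name from the metadata cache; the all-zero AAGUID
// (common on Apple devices) and unknown AAGUIDs fall back to a generic label.
fn display_name(cache: &HashMap<String, String>, aaguid: &str) -> String {
    if aaguid == ZERO_AAGUID {
        return "Unknown Authenticator".to_string();
    }
    cache
        .get(aaguid)
        .cloned()
        .unwrap_or_else(|| "Unknown Authenticator".to_string())
}
```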
References
- FIDO Metadata Service
- web.dev: Determine the passkey provider with AAGUID
- passkey-authenticator-aaguids
- FIDO MDS Specification v3
User Handle Strategy and WebAuthn Signal API
This document explains the PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL configuration setting in detail, and how it interacts with the WebAuthn Signal API for credential synchronization.
Overview
Two separate but closely related mechanisms affect how passkey credentials are managed:
1. User Handle Strategy (PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL) – Controls whether each credential gets its own unique user_handle or all credentials for a user share the same user_handle.
2. WebAuthn Signal API – A set of browser APIs that allow the relying party (server) to communicate credential state changes to the authenticator (password manager, platform authenticator, security key).
The user handle strategy directly determines how effectively the Signal API can synchronize credentials between the server and the authenticator.
User Handle (user.id / user_handle)
What is a User Handle?
In the WebAuthn specification, the user handle (user.id) is an opaque byte sequence (0-64 bytes) that identifies a user account. It is:
- Set by the relying party during credential registration
- Stored by the authenticator alongside the credential
- Returned to the relying party during authentication (for discoverable credentials)
- Used by the authenticator to group credentials belonging to the same user
How the Authenticator Uses User Handles
The authenticator (e.g., Google Password Manager, iCloud Keychain, YubiKey) uses the user handle to determine which credentials belong to the same user. This affects:
- Credential display: Credentials with the same user handle may be grouped together in the authenticator’s UI
- Credential replacement: Some authenticators (especially password managers) overwrite existing credentials when a new one is registered with the same user handle and RP ID
- Signal API scope: The signalAllAcceptedCredentials API operates on a per-user-handle basis
WebAuthn Specification Guidance
The WebAuthn Level 3 specification states:
The user handle is an identifier for the user account, chosen by the Relying Party. It is not meant to be displayed to the user. Its primary purpose is to allow the Relying Party to associate a credential with a user account.
The spec does not mandate whether user handles should be unique per user or per credential. However, the design of discoverable credentials and the Signal API strongly assumes a one-user-handle-per-user model.
This library provides an option to generate a unique user handle for each credential. This allows a single user to register multiple credentials from the same authenticator type (e.g., multiple passkeys in Google Password Manager), which would otherwise be prevented by password managers that enforce “one credential per user per RP”.
PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL
Configuration
# Default: false
PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL=false
| Value | user_handle | Credentials per authenticator |
|---|---|---|
| false (default) | Shared per user | One |
| true | Unique per credential | Unlimited |
When true (Unique Per Credential)
Every time a user registers a new passkey, a fresh random user_handle is generated:
User "alice" registers 3 passkeys:
Credential A: user_handle = "rNd0mStr1ng_AAAA..." credential_id = "cred_111"
Credential B: user_handle = "rNd0mStr2ng_BBBB..." credential_id = "cred_222"
Credential C: user_handle = "rNd0mStr3ng_CCCC..." credential_id = "cred_333"
From the authenticator’s perspective, these appear as three different users because each has a different user_handle.
Behavior details:
- No credential cleanup during registration (each credential is independent)
- The user can register unlimited credentials from the same authenticator type
- Password managers that enforce “one credential per user per RP” will store all credentials independently
- Discoverable credential selection shows each credential as a separate entry
Source: oauth2_passkey/src/passkey/main/register.rs lines 64-71
#![allow(unused)]
fn main() {
if *PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL {
let new_handle = gen_random_string(32)?;
tracing::debug!(
"Using unique user handle for every credential: {}",
new_handle
);
return Ok(new_handle);
}
}
When false (Shared Per User)
All credentials for the same user reuse the same user_handle:
User "alice" registers 3 passkeys:
Credential A: user_handle = "aliceHandle123..." credential_id = "cred_111"
Credential B: user_handle = "aliceHandle123..." credential_id = "cred_222"
Credential C: user_handle = "aliceHandle123..." credential_id = "cred_333"
From the authenticator’s perspective, these all belong to the same user.
Behavior details:
- During registration, existing credentials with the same user_handle, user_id, and aaguid are deleted (one credential per authenticator type per user)
- Password managers may overwrite an existing credential with the same user handle on re-registration
- The authenticator can group all credentials under one user identity
- For already-logged-in users, the user_handle is retrieved from the first existing credential
Note: In the deletion rule above, user_id refers to the application’s internal user identifier (database primary key), not the WebAuthn user_handle.
Source: oauth2_passkey/src/passkey/main/register.rs lines 73-110
#![allow(unused)]
fn main() {
// Otherwise, follow the normal logic of reusing handles for logged-in users
if let Some(user) = session_user {
let existing_credentials =
PasskeyStore::get_credentials_by(CredentialSearchField::UserId(user_id)).await?;
if !existing_credentials.is_empty() {
// Reuse the existing user_handle from the first credential
let existing_handle = existing_credentials[0].user.user_handle.clone();
Ok(existing_handle)
} else {
let new_handle = gen_random_string(32)?;
Ok(new_handle)
}
}
}
Credential Cleanup During Registration
When false, the library performs cleanup during registration to enforce the one-credential-per-authenticator-type policy:
Source: oauth2_passkey/src/passkey/main/register.rs lines 345-422
#![allow(unused)]
fn main() {
if !*PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL {
// Find credentials with matching user_handle
let credentials_with_matching_handle =
PasskeyStore::get_credentials_by(
CredentialSearchField::UserHandle(user_handle)
).await?;
// Delete credentials that match user_handle + user_id + aaguid
for cred in credentials_with_matching_handle {
if cred.aaguid == aaguid && cred.user_id == user_id.as_str() {
PasskeyStore::delete_credential_by(
CredentialSearchField::CredentialId(credential_id)
).await?;
}
}
}
}
When true, this cleanup is skipped entirely because each credential has a unique user_handle, so there are no pre-existing credentials with the same handle.
Database State Comparison
| Scenario | true | false |
|---|---|---|
| Alice registers 1st passkey (Google Password Manager) | 1 credential, unique handle | 1 credential, handle H1 |
| Alice registers 2nd passkey (Google Password Manager) | 2 credentials, different handles | Old credential deleted, new credential with handle H1 |
| Alice registers 3rd passkey (YubiKey) | 3 credentials, all different handles | 2 credentials (1 GPM + 1 YubiKey), both with handle H1 |
WebAuthn Signal API
What is the Signal API?
The WebAuthn Signal API (part of CTAP 2.1 and WebAuthn Level 3) provides functions for the relying party to communicate credential state to the authenticator (password manager, platform authenticator).
Current Reality (2026-01)
Important: Testing with Chrome + Google Password Manager shows that only signalUnknownCredential actually works for credential removal. signalAllAcceptedCredentials has no visible effect.
| API | Purpose | Status |
|---|---|---|
| signalUnknownCredential | Remove a specific credential from authenticator | ✅ Works |
| signalCurrentUserDetails | Update user metadata (name, display name) | ✅ Works |
| signalAllAcceptedCredentials | Sync valid credential list | ❌ No effect |
Browser Support
| Browser | Support |
|---|---|
| Chrome 132+ | Yes |
| Edge 132+ | Yes |
| Safari 26+ (macOS/iOS) | Yes |
| Firefox | Not supported |
All Signal API calls are non-critical and use feature detection:
if (
window.PublicKeyCredential &&
typeof window.PublicKeyCredential.signalUnknownCredential === "function"
) {
// API available
}
signalUnknownCredential (Primary API)
The only working API for credential removal with Google Password Manager.
await PublicKeyCredential.signalUnknownCredential({
rpId: "example.com",
credentialId: "cred_111",
});
How the authenticator processes this:
- Find the stored credential matching rpId AND credentialId
- Remove that credential from the authenticator
Key advantages:
- Scoped by credentialId only – works regardless of user_handle strategy
- Simple and direct – targets exactly one credential
- Actually works with current browsers and password managers
signalCurrentUserDetails
await PublicKeyCredential.signalCurrentUserDetails({
rpId: "example.com",
userId: base64urlEncodedUserHandle,
name: "alice@example.com",
displayName: "Alice",
});
Updates the display name and username for credentials matching rpId AND userId. This API works correctly.
signalAllAcceptedCredentials (Currently Ineffective)
Note: This API currently has no visible effect on Google Password Manager. It is kept in the codebase for future compatibility.
await PublicKeyCredential.signalAllAcceptedCredentials({
rpId: "example.com",
userId: base64urlEncodedUserHandle,
allAcceptedCredentialIds: ["cred_111", "cred_222", "cred_333"],
});
Theoretical behavior (per WebAuthn spec):
- Find all stored credentials matching rpId AND userId
- For each stored credential: if its credentialId is NOT in the list, mark it as removed
- Credentials with a different userId are NOT affected
Actual behavior (Chrome + GPM, 2026-01):
- API call succeeds without error
- No credentials are removed or hidden
- No visible change in passkey selection dialog
This API may work in future browser updates or with different authenticators.
Terminology note: The Signal API uses userId as the parameter name. This is the same value as user.id (registration), userHandle (authentication response), and user_handle (this library’s database field).
Signal API Behavior by User Handle Strategy
signalUnknownCredential – Works in Both Modes
The key advantage of signalUnknownCredential is that it works regardless of user handle strategy. It targets credentials by credentialId, not by user_handle.
Credential Deletion (Both Modes)
// Works identically in both true and false modes
signalUnknownCredential({
rpId: "example.com",
credentialId: "cred_111", // Directly targets the deleted credential
});
Result: The deleted credential is removed from the authenticator. Simple and direct.
Login Failure (Both Modes)
When authentication fails because the server doesn’t recognize a credential:
signalUnknownCredential({
rpId: "example.com",
credentialId: credential.id, // The unrecognized credential
});
Result: The orphaned credential is removed from the authenticator.
signalAllAcceptedCredentials – Theoretical Differences by Mode
Note: This API currently has no effect on Google Password Manager. The following describes theoretical behavior per the WebAuthn spec.
When false (Shared User Handle)
All credentials share the same user_handle, so the API can theoretically affect all credentials:
signalAllAcceptedCredentials({
rpId: "example.com",
userId: encode("aliceHandle123"), // Shared handle
allAcceptedCredentialIds: ["cred_222", "cred_333"], // Remaining credentials
});
Theoretical result: Credential cred_111 (not in list) would be removed.
When true (Unique User Handle)
Each credential has a different user_handle, so the API only affects one credential at a time:
signalAllAcceptedCredentials({
rpId: "example.com",
userId: encode("handle_aaa"), // Only matches credential A
allAcceptedCredentialIds: [], // Empty list
});
Theoretical result: Only the credential with matching user_handle would be affected.
Summary Table
| Signal API | true (unique) | false (shared) | Actual Status |
|---|---|---|---|
| signalUnknownCredential | ✅ Works | ✅ Works | Use this |
| signalCurrentUserDetails | Updates one credential | Updates all credentials | Works |
| signalAllAcceptedCredentials | Limited scope | Full scope | ❌ No effect |
Implementation Strategy
Primary Approach: signalUnknownCredential
The library uses signalUnknownCredential as the primary and default method for credential synchronization because:
- It actually works - the only API that removes credentials from Google Password Manager
- No user_handle dependency - works identically regardless of the PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL setting
- Simple and direct - targets exactly the credential that was deleted
Signal API Mode Configuration
The PASSKEY_SIGNAL_API_MODE environment variable controls which Signal APIs are called:
# Default: 'direct'
PASSKEY_SIGNAL_API_MODE=direct
| Value | APIs Called | Use Case |
|---|---|---|
| direct (default) | signalUnknownCredential only | Production - use this |
| sync | signalAllAcceptedCredentials only | Testing signalAllAcceptedCredentials in isolation |
| direct+sync | Both APIs | Future compatibility testing |
Important: This is a server-side only configuration. The client-side behavior is controlled entirely by the server response content:
- Server always includes signal_api_mode in responses
- When mode includes direct: Client calls signalUnknownCredential
- When mode includes sync: Server also includes credential_ids and user_handle, and the client calls signalAllAcceptedCredentials
This design means custom page developers do not need to configure Signal API behavior – it’s handled automatically by the library.
Pure sync mode: When set to sync (without direct), only signalAllAcceptedCredentials is called. This is useful for testing whether signalAllAcceptedCredentials alone can properly synchronize credentials with the authenticator, without signalUnknownCredential interference.
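The server-side decision can be sketched as follows. The struct and field names here are illustrative, not the library’s actual response types; the point is that the optional sync fields are populated only when the mode string includes sync:

```rust
// Sketch: which Signal API fields a server response carries per mode.
// Names are illustrative, not the library's actual types.
struct SignalResponse {
    signal_api_mode: String,
    credential_ids: Option<Vec<String>>, // present only when mode includes "sync"
    user_handle: Option<String>,         // present only when mode includes "sync"
}

fn build_signal_response(
    mode: &str,
    credential_ids: Vec<String>,
    user_handle: String,
) -> SignalResponse {
    // Modes are "direct", "sync", or "direct+sync".
    let include_sync = mode.split('+').any(|m| m == "sync");
    SignalResponse {
        signal_api_mode: mode.to_string(),
        credential_ids: include_sync.then_some(credential_ids),
        user_handle: include_sync.then_some(user_handle),
    }
}
```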
Credential Deletion Flow
When a credential is deleted from the server:
// Fire-and-forget (no await) to avoid blocking page reload
signalUnknownCredential({
rpId: "example.com",
credentialId: deletedCredentialId,
});
Result: The deleted credential is immediately removed from the authenticator.
Login Failure Flow
When authentication fails because the server doesn’t recognize a credential:
signalUnknownCredential({
rpId: "example.com",
credentialId: credential.id,
});
Result: The orphaned credential is removed, preventing future failed attempts.
Optional: signalAllAcceptedCredentials (Future Compatibility)
When PASSKEY_SIGNAL_API_MODE includes sync, the server includes credential_ids and user_handle in responses. The client then calls signalAllAcceptedCredentials:
```javascript
// Only called if server returns credential_ids in response
// Currently has no effect on Chrome + GPM, kept for future compatibility
if (data.credential_ids && data.user_handle) {
  signalAllAcceptedCredentials({
    rpId: "example.com",
    userId: encode(data.user_handle),
    allAcceptedCredentialIds: data.credential_ids,
  });
}
```
This approach eliminates the need for client-side configuration – the server controls the behavior via response content. This may work in future browser updates or with different authenticators (iCloud Keychain, etc.).
Choosing the Right Strategy
Note: Since `signalUnknownCredential` works regardless of user handle strategy, Signal API behavior is no longer a primary consideration when choosing between modes.
Use `true` (Unique Per Credential) When:
- Users need multiple passkeys from the same authenticator type (e.g., multiple Google Password Manager credentials)
- The application prioritizes maximum credential accumulation
- The deployment primarily uses hardware security keys
- You want each credential to appear as a separate entry in the passkey selection dialog
Use `false` (Shared Per User) When:
- Users typically have one credential per authenticator type
- Password manager compatibility is desired (many password managers enforce one credential per user handle per RP)
- The application wants the authenticator to display credentials grouped by user
- You prefer automatic cleanup of old credentials during re-registration
Migration Considerations
Switching from `true` to `false` requires consideration:
- Existing credentials: Credentials already registered with unique user handles will retain their individual handles. Only newly registered credentials will use the shared handle.
- Mixed state: During the transition period, the user may have some credentials with unique handles and some with the shared handle. Signal API will only synchronize credentials sharing the same handle.
- Database migration: No schema changes are required. The `user_handle` column remains the same; only the values stored change.
- No backward compatibility issues: Authentication works regardless of user handle strategy, since credential lookup is by `credential_id`, not by `user_handle`.
Technical Reference
Server-Side Data Flow
Authentication Response
After successful authentication, the server returns:
```jsonc
// When PASSKEY_SIGNAL_API_MODE includes 'sync':
{
  "name": "alice",
  "signal_api_mode": "direct+sync",
  "user_handle": "handle_of_authenticated_credential",
  "credential_ids": ["cred_111", "cred_222", "cred_333"]
}
```

```jsonc
// When PASSKEY_SIGNAL_API_MODE is 'direct' (default):
{
  "name": "alice",
  "signal_api_mode": "direct"
}
```
- `name`: Always included
- `signal_api_mode`: Always included (controls whether client calls `signalUnknownCredential`)
- `user_handle`: Only included when mode includes `sync`
- `credential_ids`: Only included when mode includes `sync` (all credential IDs for this user, queried by `user_id`, not by `user_handle`)
Source: oauth2_passkey/src/coordination/passkey.rs
```rust
pub struct AuthenticationResponse {
    pub name: String,
    pub user_handle: String,
    pub credential_ids: Vec<String>,
}
```
Credential Deletion Response
After deleting a credential, the server returns:
```jsonc
// When PASSKEY_SIGNAL_API_MODE includes 'sync':
{
  "signal_api_mode": "direct+sync",
  "remaining_credential_ids": ["cred_222", "cred_333"],
  "user_handle": "handle_of_deleted_credential"
}
```

```jsonc
// When PASSKEY_SIGNAL_API_MODE is 'direct' (default):
{
  "signal_api_mode": "direct"
}
```
- `signal_api_mode`: Always included (controls whether client calls `signalUnknownCredential`)
- `user_handle`: Only included when mode includes `sync` (from the deleted credential)
- `remaining_credential_ids`: Only included when mode includes `sync` (credential IDs with the same `user_handle` as the deleted credential, filtered for `signalAllAcceptedCredentials`, which is scoped by `userId`)
Source: oauth2_passkey/src/coordination/passkey.rs
```rust
pub struct DeleteCredentialResponse {
    pub remaining_credential_ids: Vec<String>,
    pub user_handle: String,
}
```
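The `remaining_credential_ids` filtering described above can be sketched as follows. The struct and function names are hypothetical (the real query lives in the coordination layer), but the logic mirrors the description: after a deletion, only credentials sharing the deleted credential's `user_handle` are reported, because `signalAllAcceptedCredentials` is scoped to a single `userId`:

```rust
// Hypothetical in-memory model of the passkey_credentials table.
struct Credential {
    credential_id: String,
    user_handle: String,
}

// Keep only credentials whose user_handle matches the deleted credential's,
// since the Signal API call is scoped to one user handle (userId).
fn remaining_for_signal(all: &[Credential], deleted_handle: &str) -> Vec<String> {
    all.iter()
        .filter(|c| c.user_handle == deleted_handle)
        .map(|c| c.credential_id.clone())
        .collect()
}

fn main() {
    let creds = vec![
        Credential { credential_id: "cred_222".into(), user_handle: "h1".into() },
        Credential { credential_id: "cred_333".into(), user_handle: "h1".into() },
        Credential { credential_id: "cred_999".into(), user_handle: "h2".into() },
    ];
    // Only the credentials sharing handle "h1" are signaled
    assert_eq!(remaining_for_signal(&creds, "h1"), vec!["cred_222", "cred_333"]);
    println!("ok");
}
```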
Client-Side Signal API Calls
All Signal API calls are fire-and-forget (no await) to avoid blocking page navigation. The authentication/deletion has already succeeded on the server; Signal API is non-critical.
After Login Failure (passkey.js, conditional_ui.js)
Remove the unrecognized credential from the authenticator:
```javascript
// Fire-and-forget - don't block error handling
PublicKeyCredential.signalUnknownCredential({
  rpId: window.location.hostname,
  credentialId: credential.id,
});
```
After Credential Deletion (account.js)
Remove the deleted credential from the authenticator. The server controls which APIs are called via signal_api_mode:
```javascript
const data = await response.json();
const mode = data.signal_api_mode || "direct";

// signalUnknownCredential - only if mode includes 'direct'
// Works with any user_handle strategy
if (mode.includes("direct")) {
  signalUnknownCredential({
    rpId: window.location.hostname,
    credentialId: deletedCredentialId,
  });
}

// signalAllAcceptedCredentials - only if server returns remaining_credential_ids
// Server includes these fields when mode includes 'sync'
if (data.remaining_credential_ids && data.user_handle) {
  signalAllAcceptedCredentials({
    rpId: window.location.hostname,
    userId: encode(data.user_handle),
    allAcceptedCredentialIds: data.remaining_credential_ids,
  });
}
```
After Successful Login (Optional)
When PASSKEY_SIGNAL_API_MODE includes sync, the server includes credential_ids and user_handle in the authentication response. The client detects this and calls signalAllAcceptedCredentials:
```javascript
// Server controls whether this is called by including credential_ids in response
// No client-side configuration needed
// Currently has no effect on Chrome + GPM
if (data.credential_ids && data.user_handle) {
  const userIdBytes = new TextEncoder().encode(data.user_handle);
  const userIdBase64Url = arrayBufferToBase64URL(userIdBytes.buffer);
  PublicKeyCredential.signalAllAcceptedCredentials({
    rpId: window.location.hostname,
    userId: userIdBase64Url,
    allAcceptedCredentialIds: data.credential_ids,
  });
}
```
Encoding Note
The user handle is stored as a UTF-8 string in the database. When passed to the Signal API, it must be encoded to base64url:
```javascript
const userIdBytes = new TextEncoder().encode(userHandle);           // String -> Uint8Array
const userIdBase64Url = arrayBufferToBase64URL(userIdBytes.buffer); // Uint8Array -> base64url
```
This matches how the user handle is encoded during credential registration (`user.id` is set to `base64URLToUint8Array(userHandle)`).
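For comparison, the same string-to-base64url transformation can be sketched in Rust with a minimal, dependency-free encoder. This is illustrative only (the library ships its own base64url utilities); it shows the UTF-8 bytes -> base64url (no padding) step the Signal API expects:

```rust
// Base64url alphabet: A-Z, a-z, 0-9, '-', '_' (RFC 4648, URL-safe variant).
const ALPHABET: &[u8; 64] =
    b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_";

// Encode bytes as base64url without padding, mirroring the JavaScript
// `new TextEncoder().encode(...)` + `arrayBufferToBase64URL(...)` pair.
fn base64url_encode(input: &[u8]) -> String {
    let mut out = String::new();
    for chunk in input.chunks(3) {
        let mut buf = [0u8; 3];
        buf[..chunk.len()].copy_from_slice(chunk);
        // Pack up to three bytes into a 24-bit group.
        let n = (u32::from(buf[0]) << 16) | (u32::from(buf[1]) << 8) | u32::from(buf[2]);
        out.push(ALPHABET[(n >> 18) as usize & 63] as char);
        out.push(ALPHABET[(n >> 12) as usize & 63] as char);
        if chunk.len() > 1 {
            out.push(ALPHABET[(n >> 6) as usize & 63] as char);
        }
        if chunk.len() > 2 {
            out.push(ALPHABET[n as usize & 63] as char);
        }
    }
    out // no '=' padding, as base64url for WebAuthn omits it
}

fn main() {
    let user_handle = "hello"; // hypothetical UTF-8 user handle
    println!("{}", base64url_encode(user_handle.as_bytes()));
}
```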
References
- WebAuthn Level 3 Specification
- WebAuthn Signal API Explainer
- Chrome Developers: Signal API
- CTAP 2.1 Specification
Development
This chapter covers the development practices, project organization, and testing strategies for the OAuth2-Passkey library.
Project Structure
The OAuth2-Passkey workspace is organized as a multi-crate Rust project with clear separation of concerns.
Crate Organization
```text
oauth2-passkey/
├── oauth2_passkey/              # Core library
│   ├── src/
│   │   ├── config/              # Configuration handling
│   │   ├── coordination/        # Central orchestration of auth flows
│   │   ├── oauth2/              # OAuth2 implementation
│   │   │   ├── main/            # Core OAuth2 logic
│   │   │   └── storage/         # OAuth2 data persistence
│   │   ├── passkey/             # WebAuthn/Passkey implementation
│   │   │   ├── main/            # Core passkey logic
│   │   │   └── storage/         # Passkey data persistence
│   │   ├── session/             # Session management
│   │   ├── storage/             # Database and cache abstraction
│   │   │   ├── data_store/      # PostgreSQL/SQLite backends
│   │   │   └── cache_store/     # Redis/Memory backends
│   │   ├── userdb/              # User account management
│   │   ├── test_utils/          # Test utilities module
│   │   └── utils/               # Common utilities
│   └── tests/                   # Integration tests
│       ├── common/              # Shared test utilities
│       └── integration/         # Integration test modules
│
├── oauth2_passkey_axum/         # Axum web framework integration
│   └── src/
│       ├── assets/              # Static assets (JS/CSS)
│       └── templates/           # HTML templates
│
└── demo-*/                      # Demo applications
```
Key Design Principles
- Layered Architecture: Clear separation between core logic and web framework
- Coordination Layer: All authentication flows route through the coordination module
- Flexible Storage: Supports both development (SQLite, in-memory) and production (PostgreSQL, Redis) setups
- Security First: Built-in CSRF protection, secure sessions, and page session tokens
Testing Strategy
The project follows a bottom-up testing approach, starting with fundamental modules and building toward integration testing.
Testing Principles
1. **Simplicity First**
   - Prefer simple, focused tests
   - One assertion per test when practical
   - Clear, descriptive test names
2. **Minimal Dependencies**
   - Avoid test-only dependencies when possible
   - Prefer standard library solutions
   - Document required test setup clearly
3. **Test Organization**
   - Unit tests in the same file as the code under test
   - Integration tests in the `/tests/` directory
   - Documentation tests in module docs when helpful
Module Testing Order
The testing strategy follows a bottom-up approach based on module dependencies:
1. Core Utilities (`src/utils.rs`) - Foundation functions
2. Configuration (`src/config.rs`) - Configuration validation
3. Storage Layer (`src/storage/`) - Data and cache operations
4. OAuth2 Module (`src/oauth2/`) - OAuth2 flows and token handling
5. Passkey Module (`src/passkey/`) - WebAuthn operations
6. Session Management (`src/session/`) - Session handling
7. User Database (`src/userdb/`) - User management
8. Coordination Layer (`src/coordination/`) - Business workflows
Unit Test Patterns
Unit tests are placed inline with the code they test, within a tests submodule.
Basic Test Structure
```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_function_returns_expected_value() {
        // Arrange
        let input = "test_input";

        // Act
        let result = function_under_test(input);

        // Assert
        assert!(result.is_ok());
        assert_eq!(result.unwrap(), expected_value);
    }
}
```
Pure Function Testing
For pure functions without side effects, tests are straightforward:
```rust
#[test]
fn test_base64url_encode_decode() {
    // Test with simple string
    let original = b"hello world";
    let encoded = base64url_encode(original.to_vec()).expect("Failed to encode");
    let decoded = base64url_decode(&encoded).expect("Failed to decode");
    assert_eq!(decoded, original);

    // Test with empty input
    let empty_encoded = base64url_encode(vec![]).expect("Failed to encode empty");
    let empty_decoded = base64url_decode(&empty_encoded).expect("Failed to decode empty");
    assert!(empty_decoded.is_empty());
}

#[test]
fn test_base64url_decode_invalid() {
    // Test with invalid base64url string
    let invalid_base64 = "This is not base64url!";
    let result = base64url_decode(invalid_base64);
    assert!(matches!(result, Err(UtilError::Format(_))));
}
```
Testing with Dependencies
For tests requiring HTTP headers or other framework types:
```rust
// Helper function to create test fixtures
fn create_header_map_with_cookie(cookie_name: &str, cookie_value: &str) -> HeaderMap {
    let mut headers = HeaderMap::new();
    let cookie_str = format!("{cookie_name}={cookie_value}");
    headers.insert(COOKIE, HeaderValue::from_str(&cookie_str).unwrap());
    headers
}

/// Test session ID extraction from HTTP headers
#[test]
fn test_get_session_id_from_headers() {
    // Given a header map with a session cookie
    let cookie_name = SESSION_COOKIE_NAME.to_string();
    let session_id = "test_session_id";
    let headers = create_header_map_with_cookie(&cookie_name, session_id);

    // When getting the session ID
    let result = get_session_id_from_headers(&headers);

    // Then it should return the session ID
    assert!(result.is_ok());
    let session_id_opt = result.unwrap();
    assert!(session_id_opt.is_some());
    assert_eq!(session_id_opt.unwrap(), session_id);
}
```
Async Test Patterns
For async functions requiring database access:
```rust
use crate::test_utils::init_test_environment;

#[tokio::test]
async fn test_user_creation() {
    // Initialize test environment (loads .env_test, sets up databases)
    init_test_environment().await;

    // Arrange
    let user = User::new(
        "test-user".to_string(),
        "test@example.com".to_string(),
        "Test User".to_string(),
    );

    // Act
    let result = UserStore::upsert_user(user).await;

    // Assert
    assert!(result.is_ok());
    let created_user = result.unwrap();
    assert_eq!(created_user.email, "test@example.com");
}
```
Integration Test Patterns
Integration tests verify complete authentication flows and are organized in the /tests/ directory.
Test Organization
```text
oauth2_passkey/tests/
├── integration.rs          # Test module entry point
├── common/                 # Shared test utilities
│   ├── mod.rs
│   ├── fixtures.rs         # Test data and constants
│   └── mock_browser.rs     # HTTP client simulation
└── integration/
    ├── oauth2_flows.rs     # OAuth2 flow tests
    ├── passkey_flows.rs    # Passkey flow tests
    ├── combined_flows.rs   # Multi-method auth tests
    └── api_client_flows.rs # API client tests
```
Integration Test Structure
```rust
/// Integration tests for oauth2-passkey library
///
/// These tests verify complete authentication flows in an isolated test environment
/// with mocked external services and in-memory databases.
mod common;

mod integration {
    pub mod api_client_flows;
    pub mod combined_flows;
    pub mod oauth2_flows;
    pub mod passkey_flows;
}
```
Flow Testing Example
```rust
use crate::common::{MockBrowser, TestSetup, TestUsers};

#[tokio::test]
async fn test_oauth2_login_flow() {
    // Setup test environment
    let setup = TestSetup::new().await;
    let browser = MockBrowser::new(&setup);

    // Start OAuth2 authorization
    let auth_url = browser.start_oauth2_login().await
        .expect("Should initiate OAuth2 flow");

    // Complete authorization (mock provider response)
    let (auth_code, state) = complete_oauth2_authorization(&auth_url).await
        .expect("Should complete authorization");

    // Handle callback
    let response = browser.oauth2_callback(&auth_code, &state).await
        .expect("Callback should succeed");

    // Verify session created
    assert!(response.headers().contains_key("set-cookie"));
}
```
Test Utilities
The test_utils module provides centralized test setup functionality for consistent test environments.
Initialization
```rust
use crate::test_utils::init_test_environment;

#[tokio::test]
async fn my_test() {
    // Initialize test environment once per test run
    init_test_environment().await;

    // Test code that requires database access
}
```
What init_test_environment() Does
1. **Environment Setup** (runs once):
   - Loads `.env_test` file with test configuration
   - Falls back to `.env` if test file not found
   - Cleans up any existing test database file
2. **Database Initialization**:
   - Initializes UserStore, OAuth2Store, and PasskeyStore
   - Creates a first test user if none exists
   - Sets up test OAuth2 accounts and Passkey credentials
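The "runs once" behavior can be illustrated with `std::sync::Once`. This is a hypothetical, synchronous sketch (the library's actual `init_test_environment` is async and performs the real setup listed above):

```rust
use std::sync::Once;
use std::sync::atomic::{AtomicUsize, Ordering};

static INIT: Once = Once::new();
static RUNS: AtomicUsize = AtomicUsize::new(0);

// Sketch: no matter how many tests call this, the setup closure runs once.
fn init_test_environment_sync() {
    INIT.call_once(|| {
        // Load .env_test (falling back to .env), clean up any stale test
        // database file, then initialize the stores -- details elided here.
        RUNS.fetch_add(1, Ordering::SeqCst);
    });
}

fn main() {
    init_test_environment_sync();
    init_test_environment_sync(); // second call is a no-op
    assert_eq!(RUNS.load(Ordering::SeqCst), 1); // setup ran exactly once
    println!("ok");
}
```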
Test Origin Helper
```rust
use crate::test_utils::get_test_origin;

#[test]
fn test_with_origin() {
    let origin = get_test_origin();
    // Returns ORIGIN from environment or defaults to "http://127.0.0.1:3000"
}
```
First User Test Data
The test utilities automatically create a first user with:
- User ID: `first-user`
- Email: `first-user@example.com`
- OAuth2 Account: Google provider with test ID
- Passkey Credential: Valid ECDSA P-256 credential for signature verification
This enables integration tests to perform authentic authentication flows.
Running Tests
```bash
# Run all tests
cargo test

# Run tests for specific crate
cargo test --manifest-path oauth2_passkey/Cargo.toml
cargo test --manifest-path oauth2_passkey_axum/Cargo.toml --all-features

# Run specific test module
cargo test module_name::tests::

# Run with logging output
RUST_LOG=debug cargo test -- --nocapture

# Run ignored (slow) tests
cargo test -- --ignored
```
Best Practices
Error Handling in Tests
- Prefer `expect()` with descriptive messages over `unwrap()`
- Test error cases explicitly
- Use pattern matching to verify error types
```rust
#[test]
fn test_invalid_input_returns_error() {
    let result = process_input("invalid");
    assert!(matches!(result, Err(MyError::InvalidInput(_))));
}
```
Test Isolation
- Each test should be independent
- Use unique identifiers for test data
- Clean up test data when necessary
- Use test fixtures for complex setup
Performance
- Keep individual tests fast (under 100ms for unit tests)
- Use in-memory databases for testing when possible
- Mark slow tests with `#[ignore]`
- Run slow tests separately in CI
```rust
#[test]
#[ignore] // Run with: cargo test -- --ignored
fn slow_integration_test() {
    // Test that requires external services or significant time
}
```
Documentation
- Document test requirements in comments
- Explain complex test scenarios
- Note any external dependencies
- Use doc comments for test helper functions
Code Quality Commands
After making code changes, always verify quality:
```bash
# Format code
cargo fmt --all

# Check for issues
cargo clippy --all-targets --all-features

# Run tests
cargo test
```
All clippy warnings should be addressed before committing code.
CI/CD
This chapter covers the CI/CD pipelines configured for the OAuth2-Passkey project using GitHub Actions.
Overview
The project has three GitHub Actions workflows:
| Workflow | File | Purpose |
|---|---|---|
| CI | ci.yml | Testing, linting, security audit |
| Coverage | coverage.yml | Code coverage reporting |
| Documentation | docs.yml | GitHub Pages deployment |
CI Workflow
The main CI workflow (.github/workflows/ci.yml) runs on every push and pull request to master and develop branches.
Jobs
Test Suite
Runs tests across multiple Rust versions:
| Version | Required | Purpose |
|---|---|---|
| stable | Yes | Primary testing target |
| beta | No (can fail) | Early warning for upcoming changes |
| nightly | No (can fail) | Bleeding edge compatibility |
Steps performed (stable only):
- Check formatting (`cargo fmt --all -- --check`)
- Run clippy (`cargo clippy --all-targets --all-features`)
Steps performed (all versions):
- Build core library (`oauth2_passkey`)
- Build Axum integration (`oauth2_passkey_axum`)
- Test core library
- Test Axum integration (with all features)
- Test Axum integration (with no default features)
Security Audit
Runs cargo audit to check for known vulnerabilities in dependencies.
```yaml
- name: Run security audit
  run: cargo audit --ignore RUSTSEC-2023-0071
```
The --ignore flag excludes known advisories that have been reviewed and accepted.
Documentation Build
Verifies that rustdoc builds without warnings:
```yaml
- name: Build documentation
  run: |
    cargo doc --no-deps --manifest-path oauth2_passkey/Cargo.toml
    cargo doc --no-deps --manifest-path oauth2_passkey_axum/Cargo.toml --all-features
  env:
    RUSTDOCFLAGS: "-D warnings"
```
MSRV Check
Verifies compatibility with the Minimum Supported Rust Version (currently 1.88):
```yaml
- name: Install Rust 1.88
  uses: dtolnay/rust-toolchain@stable
  with:
    toolchain: "1.88"

- name: Check MSRV compatibility
  run: |
    cargo check --manifest-path oauth2_passkey/Cargo.toml
    cargo check --manifest-path oauth2_passkey_axum/Cargo.toml --all-features
```
Coverage Workflow
The coverage workflow (.github/workflows/coverage.yml) generates code coverage reports on pushes and pull requests to master.
How It Works
1. **Generate Coverage**: Uses `cargo-llvm-cov` to run tests with coverage instrumentation

   ```yaml
   - name: Generate coverage report
     run: cargo llvm-cov --all-features --workspace --lcov --output-path lcov.info
   ```

2. **Upload to Codecov**: Sends coverage data to Codecov for tracking and visualization

   ```yaml
   - name: Upload coverage to Codecov
     uses: codecov/codecov-action@v4
   ```

3. **Archive Report**: Saves the coverage report as a GitHub artifact (retained for 30 days)
Viewing Coverage
- Codecov Dashboard: View coverage trends and file-level details at codecov.io
- GitHub Artifacts: Download `lcov.info` from the workflow run’s artifact section
Documentation Workflow
The documentation workflow (.github/workflows/docs.yml) deploys the mdBook documentation to GitHub Pages.
Deployment URL
The documentation is published at:
https://ktaka-ccmp.github.io/oauth2-passkey/
This URL follows GitHub’s standard naming convention:
https://{username}.github.io/{repository-name}/
This is a fixed GitHub Pages specification and cannot be changed (unless you configure a custom domain).
Triggers
```yaml
on:
  push:
    branches:
      - master
    paths:
      - 'docs/**'
      - '.github/workflows/docs.yml'
  workflow_dispatch:
```
- Automatic: Push to `master` branch with changes in the `docs/` directory
- Manual: Trigger via `workflow_dispatch` from the GitHub Actions UI
How It Works
1. **Build Step**: mdBook compiles the documentation from `docs/src/` into static HTML in `docs/book/`

   ```yaml
   - name: Build documentation
     run: mdbook build docs
   ```

2. **Upload Step**: The generated `docs/book/` directory is uploaded as a GitHub Pages artifact

   ```yaml
   - name: Upload artifact
     uses: actions/upload-pages-artifact@v3
     with:
       path: 'docs/book'
   ```

3. **Deploy Step**: The artifact is deployed to GitHub Pages

   ```yaml
   - name: Deploy to GitHub Pages
     uses: actions/deploy-pages@v4
   ```
Required GitHub Settings
For this workflow to function, the repository must have GitHub Pages configured:
- Go to Settings → Pages
- Under Source, select GitHub Actions
This enables the actions/deploy-pages action to publish content to GitHub Pages.
Summary
| Workflow | Trigger | Key Outputs |
|---|---|---|
| CI | Push/PR to master, develop | Test results, lint status |
| Coverage | Push/PR to master | Coverage report on Codecov |
| Documentation | Push to master (docs/) | Live site at GitHub Pages |
Release Process for OAuth2-Passkey Workspace
This document explains how to release the oauth2-passkey and oauth2-passkey-axum crates in the correct sequential order.
Overview
The workspace contains two publishable crates with a dependency relationship:
- `oauth2-passkey` (core library)
- `oauth2-passkey-axum` (Axum integration, depends on `oauth2-passkey`)
During development, we use local path dependencies for immediate feedback. During release, we need to publish oauth2-passkey first, then update oauth2-passkey-axum to use the published version before publishing it.
Development vs. Publishing Dependencies
Development Setup (Current)
```toml
# In workspace Cargo.toml
[workspace.dependencies]
oauth2-passkey = { path = "./oauth2_passkey" }
oauth2-passkey-axum = { path = "./oauth2_passkey_axum" }
```
Publishing Setup
```toml
# oauth2-passkey-axum temporarily uses published version
oauth2-passkey = "0.2.x"  # published version
```
Release Methods
We provide two methods for releasing: automated and manual.
Automated Release (Recommended)
Use the automated release script for a streamlined process:
```bash
# Dry-run to verify (recommended first)
./utils/release.sh -d -v <version>

# Execute the release
./utils/release.sh -e -v <version>
```
Example:
```bash
# Verify first
./utils/release.sh -d -v 0.2.1

# Then execute
./utils/release.sh -e -v 0.2.1
```
What it does:
- ✅ Checks that git working directory is clean
- 🎯 Publishes `oauth2-passkey` with the specified version
- ⏳ Waits for the package to be available on crates.io
- 🔄 Updates `oauth2-passkey-axum/Cargo.toml` to use the published version
- 🎯 Publishes `oauth2-passkey-axum` with the same version
- 🔄 Reverts `oauth2-passkey-axum` back to workspace dependencies
- 📝 Commits the version bump changes
- 🏷️ Creates git tags for both releases
Prerequisites:
- Clean git working directory
- Valid `cargo login` credentials for crates.io
- Internet connection for crates.io verification
Manual Release Process (Alternative)
For more control or troubleshooting, you can follow the manual process:
Manual Steps:
1. **Prepare Release**

   ```bash
   # Ensure clean git state
   git status
   ```

2. **Release oauth2-passkey First**

   ```bash
   cd oauth2_passkey
   cargo publish --dry-run  # Test first
   cargo publish            # Actually publish
   cd ..
   ```

3. **Wait for Availability**

   ```bash
   # Check until your version appears
   cargo search oauth2-passkey
   ```

4. **Update oauth2-passkey-axum Dependency**

   ```toml
   # Edit oauth2_passkey_axum/Cargo.toml
   # Change: oauth2-passkey = { workspace = true }
   # To:     oauth2-passkey = "0.2.x"  # Use the version you're releasing
   ```

5. **Release oauth2-passkey-axum**

   ```bash
   cd oauth2_passkey_axum
   cargo publish --dry-run  # Test first
   cargo publish            # Actually publish
   cd ..
   ```

6. **Revert for Development**

   ```toml
   # Edit oauth2_passkey_axum/Cargo.toml
   # Change:  oauth2-passkey = "0.2.x"
   # Back to: oauth2-passkey = { workspace = true }
   ```

7. **Tag and Commit**

   ```bash
   git add .
   git commit -m "chore: release vX.Y.Z"
   git tag vX.Y.Z
   git push origin main --tags
   ```
Configuration Details
The release process is configured using cargo-release metadata in the Cargo.toml files:
Workspace Configuration
```toml
# In main Cargo.toml
[workspace.metadata.release]
sign-commit = false
sign-tag = false
push = false
publish = false
tag = false
```
Per-Crate Configuration
```toml
# In oauth2_passkey/Cargo.toml and oauth2_passkey_axum/Cargo.toml
[package.metadata.release]
publish = true
tag = true
sign-tag = false
sign-commit = false
push = false
```
Troubleshooting
Common Issues
1. “Package not found on crates.io”
- Wait longer for crates.io to update (can take up to 5 minutes)
- Check your internet connection
- Verify the package was actually published
2. “Working directory not clean”
- Commit or stash any pending changes before releasing
- Check `git status` and resolve any conflicts
3. “Permission denied” on crates.io
- Ensure you’re logged in: `cargo login`
- Verify you have publish permissions for both crates
4. “Version already exists”
- Bump the version number in `workspace.package.version`
- Ensure you’re not trying to republish an existing version
Recovery from Failed Release
If the automated release fails partway through:
1. **Check what was published:**

   ```bash
   cargo search oauth2-passkey
   cargo search oauth2-passkey-axum
   ```

2. **If only oauth2-passkey was published:**
   - Continue from step 4 of the manual process
   - Or fix the issue and re-run the automated script

3. **If both were published but git wasn’t updated:**
   - Manually create tags and commit the version bump
Version Management
The workspace uses a shared version number in Cargo.toml:
```toml
[workspace.package]
version = "0.1.1"  # Update this for releases
```
All crates inherit this version with:
```toml
[package]
version = { workspace = true }
```
Security Considerations
Obtaining and Setting Crates.io Token
Before you can publish crates, you need to authenticate with crates.io:
1. **Create a crates.io account:**
   - Visit crates.io and sign up/log in
   - You can use GitHub authentication for convenience

2. **Generate an API token:**
   - Go to crates.io/me (Account Settings)
   - Click on “API Tokens” in the left sidebar
   - Click “New Token”
   - Give it a descriptive name (e.g., “oauth2-passkey-release”)
   - Select appropriate scopes:
     - `publish-new` - allows publishing new crates
     - `publish-update` - allows updating existing crates
   - Copy the generated token immediately (you won’t see it again)

3. **Set the token locally:**

   ```bash
   cargo login <your-token-here>
   ```

   Or alternatively, set it as an environment variable:

   ```bash
   export CARGO_REGISTRY_TOKEN=<your-token-here>
   ```

4. **Verify authentication:**

   ```bash
   cargo owner --list oauth2-passkey
   ```

   This should show you as an owner if the crate exists, or give an appropriate error if it doesn’t.
Security Best Practices
- Never commit crates.io tokens to git
- Use `cargo login` to authenticate securely
- Store tokens in secure password managers
- Regularly rotate API tokens (every 6-12 months)
- Use minimal required scopes for tokens
- Review all changes with `--dry-run` before publishing
- Both scripts avoid automatic git pushing for safety review
Managing Crate Ownership
For collaborative projects, you may need to add co-owners:
```bash
# Add a co-owner to both crates
cargo owner --add username oauth2-passkey
cargo owner --add username oauth2-passkey-axum

# List current owners
cargo owner --list oauth2-passkey
cargo owner --list oauth2-passkey-axum
```
Related Files
- `utils/release.sh` - Automated release script (use `-d` for dry-run, `-e` for execute)
- `oauth2_passkey/Cargo.toml` - Core library configuration
- `oauth2_passkey_axum/Cargo.toml` - Axum integration configuration
- `Cargo.toml` - Workspace configuration
Next Steps After Release
1. **Verify Publications:**

2. **Update Documentation:**
   - Update README.md files with new version numbers
   - Update any version references in documentation

3. **Test Integration:**
   - Create a new project and test importing the published crates
   - Verify all examples still work with the new versions

4. **Announcement:**
   - Update CHANGELOG.md
   - Consider announcing on relevant platforms
Terminology and Glossary
This document clarifies the terminology used in WebAuthn, OAuth2, and this library. Many identifier terms can be confusing because the same concept has different names in different contexts.
User Identifiers
Quick Reference
| Term | Context | Description |
|---|---|---|
| `user_id` | This library (database) | Application’s internal user identifier (database primary key) |
| `user_handle` | This library (database) | WebAuthn user identifier stored with credentials |
| `user.id` | WebAuthn registration | User identifier in `PublicKeyCredentialUserEntity` |
| `userHandle` | WebAuthn authentication | Returned in `AuthenticatorAssertionResponse` |
| `userId` | Signal API | Parameter name for user identifier |
Key Distinction
`user_id` is different from `user_handle`/`user.id`/`userHandle`/`userId`.

- `user_id`: The application’s internal database identifier for the user account
- `user_handle`: The WebAuthn-specific identifier that the authenticator stores and returns
The terms user_handle, user.id, userHandle, and userId all refer to the same value - just in different contexts:
- Registration: `user.id` in `PublicKeyCredentialUserEntity`
- Authentication: `userHandle` in `AuthenticatorAssertionResponse`
- Signal API: `userId` parameter
- This library’s database: `user_handle` column
Database Relationship
```text
Application Database:

+---------------------------------------------+
| users table                                 |
|   user_id (PK) ------------------+          |
|   account, label                 |          |
+----------------------------------|----------+
                                   v
+---------------------------------------------+
| passkey_credentials table                   |
|   credential_id (PK)                        |
|   user_id (FK) <-----------------+          |
|   user_handle --------> WebAuthn user.id    |
|   public_key, aaguid                        |
+---------------------------------------------+

WebAuthn Term Aliases:

Registration:   user.id --------+
Authentication: userHandle -----+---> Same value
Signal API:     userId ---------+
This library:   user_handle ----+
```
Credential Identifiers
| Term | Context | Description |
|---|---|---|
| `credential_id` | This library (database) | Base64URL-encoded credential identifier |
| `credentialId` | WebAuthn/Signal API | Raw credential identifier (Uint8Array or Base64URL) |
| `id` | PublicKeyCredential | Same as credentialId, Base64URL-encoded |
| `rawId` | PublicKeyCredential | Same as credentialId, ArrayBuffer format |
Encoding Note
In JavaScript, credential IDs come in two formats from PublicKeyCredential:
- `id`: Base64URL-encoded string
- `rawId`: Raw `ArrayBuffer`
This library stores credential IDs as Base64URL-encoded strings in the database.
Session Identifiers
| Term | Context | Description |
|---|---|---|
| `session_id` | This library | Internal session identifier stored in cache |
| `session_cookie` | HTTP | Cookie value sent to the client |
| `SessionId` | Type wrapper | Type-safe wrapper for session identifiers |
| `SessionCookie` | Type wrapper | Type-safe wrapper for session cookie values |
OAuth2 Identifiers
| Term | Context | Description |
|---|---|---|
| `provider` | This library | OAuth2 provider name (e.g., “google”) |
| `provider_user_id` | This library | User ID from the OAuth2 provider |
| `sub` | OIDC | Subject identifier in ID token (same as `provider_user_id`) |
Type-Safe Wrappers
This library uses type-safe wrappers to prevent identifier confusion at compile time. See Type-Safe Validation for details.
| Type | Wraps | Description |
|---|---|---|
| `UserId` | String | Database user identifier |
| `CredentialId` | String | Passkey credential identifier |
| `SessionId` | String | Session identifier |
| `SessionCookie` | String | Session cookie value |
| `UserHandle` | String | WebAuthn user handle |
| `Provider` | String | OAuth2 provider name |
| `ProviderUserId` | String | OAuth2 provider’s user ID |
Common Confusion Points
1. user_id vs user_handle
```rust
// WRONG: These are different!
let user_id = "db_user_123";      // Database primary key
let user_handle = "webauthn_abc"; // WebAuthn identifier

// They relate to different concepts:
// - user_id: Identifies the user in YOUR application
// - user_handle: Identifies the user to the AUTHENTICATOR
```
2. Multiple credentials, one user
A single user (user_id) can have multiple passkey credentials, each with its own credential_id. Depending on configuration, they may share the same user_handle or each have a unique one.
| Configuration | user_handle per credential |
|---|---|
| PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL=false (default) | Shared |
| PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL=true | Unique |
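The effect of the flag can be sketched as follows. This is a hypothetical helper for illustration only, not the library's API — the real implementation generates opaque random handles rather than deterministic names:

```rust
// Hypothetical sketch of the two user_handle strategies.
// Deterministic names are used here purely for illustration.
fn user_handle_for(user_id: &str, credential_seq: u32, unique_per_credential: bool) -> String {
    if unique_per_credential {
        // PASSKEY_USER_HANDLE_UNIQUE_FOR_EVERY_CREDENTIAL=true:
        // every credential gets its own handle
        format!("{user_id}-cred{credential_seq}")
    } else {
        // false (default): all of the user's credentials share one handle
        user_id.to_string()
    }
}

fn main() {
    // Shared: both credentials present the same handle to the authenticator
    assert_eq!(user_handle_for("u1", 1, false), user_handle_for("u1", 2, false));
    // Unique: each credential presents a distinct handle
    assert_ne!(user_handle_for("u1", 1, true), user_handle_for("u1", 2, true));
    println!("ok");
}
```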
3. Signal API userId parameter
The Signal API uses userId as the parameter name, but this is the WebAuthn user handle, not the application’s user_id:
// CORRECT: userId here is the user_handle value
PublicKeyCredential.signalAllAcceptedCredentials({
rpId: "example.com",
userId: base64urlEncode(user_handle), // NOT user_id!
allAcceptedCredentialIds: [...]
});
See Also
- User Handle and Signal API - Detailed explanation of user handle strategies
- Type-Safe Validation - Compile-time type safety for identifiers
Security Advisory Management
Active Security Considerations
RUSTSEC-2023-0071 - RSA Marvin Attack
Status: Eliminated from Direct Dependencies
Advisory: RUSTSEC-2023-0071
Severity: Medium (5.9)
Date: 2023-11-22
Vulnerability Description
The Marvin Attack is a potential key recovery attack through timing side-channels that affects RSA decryption operations using private keys.
Current Status
✅ Direct Usage Eliminated (June 2025):
- Removed direct dependency on the rsa crate from oauth2-passkey
- Replaced with jsonwebtoken::DecodingKey::from_rsa_components() for JWT verification
- No longer performing any RSA operations in our codebase
- Removed the pkcs1 crate dependency used for PEM conversion
Remaining Exposure:
- Transitive dependency through sqlx-mysql → rsa crate (via SQLx macros)
- Impact: None - we only use SQLite and PostgreSQL features, never MySQL
- Risk: Minimal - vulnerability not in our execution path
- CI Status: Advisory ignored (RUSTSEC-2023-0071) due to unused dependency path
Technical Details:
- SQLx’s macro system (sqlx-macros-core) includes all database drivers at compile time; this is a known SQLx architectural limitation
- MySQL driver dependencies are never loaded or executed in our applications
- All actual database operations use only SQLite or PostgreSQL drivers
Migration Details
Before (Vulnerable Pattern):
#![allow(unused)]
fn main() {
// Used rsa crate directly
let rsa_public_key = RsaPublicKey::new(
rsa::BigUint::from_bytes_be(&n),
rsa::BigUint::from_bytes_be(&e),
)?;
let pem = rsa_public_key.to_pkcs1_pem(LineEnding::default())?;
Ok(DecodingKey::from_rsa_pem(pem.as_bytes())?)
}
After (Secure Pattern):
#![allow(unused)]
fn main() {
// Uses jsonwebtoken's built-in RSA support
Ok(DecodingKey::from_rsa_components(n, e)?)
}
Benefits of Migration
- Security: Eliminated direct RSA crate usage and vulnerability exposure
- Simplicity: Reduced code complexity and dependency count
- Maintenance: Relies on the well-maintained jsonwebtoken crate for RSA handling
- Performance: Eliminated unnecessary base64 decode/encode cycles
Mitigation
- Regular monitoring of RustSec advisories for RSA crate updates
- Consider migration when rsa 0.10+ becomes stable with security fixes
- Current usage pattern remains secure for intended public key operations
Review Schedule
- Next Review: When rsa 0.10.0 stable is released
- Trigger for Action: If vulnerability scope expands to affect public key operations
- Alternative: Monitor for JWT libraries that don’t depend on RSA crate
Last Updated: June 22, 2025 Review Frequency: Quarterly or upon new RSA crate releases
Type-Safe Validation
What Is Type-Safe Validation?
In authentication code, many values are plain strings: user IDs, session IDs, credential IDs, email addresses, and so on. When functions accept raw String parameters, the compiler cannot tell them apart. This leads to a class of bugs where values are accidentally swapped:
#![allow(unused)]
fn main() {
// Both parameters are String — the compiler accepts this without complaint
fn delete_credential(session_id: String, credential_id: String) { /* ... */ }
// Bug: arguments are swapped, but the code compiles and runs
delete_credential(credential_id, session_id);
}
This is especially dangerous in authentication systems where such a mix-up can cause privilege escalation or silent data corruption.
Type-safe validation solves this by wrapping each string in a dedicated type (the “newtype pattern” in Rust):
#![allow(unused)]
fn main() {
// Each type is a thin wrapper around String
pub struct SessionId(String);
pub struct CredentialId(String);
// Now the function signature enforces correct usage
fn delete_credential(session_id: SessionId, credential_id: CredentialId) { /* ... */ }
// Bug caught at compile time — this will not compile
delete_credential(credential_id, session_id);
// ^^^^^^^^^^^^^ expected `SessionId`, found `CredentialId`
}
The wrapper types also validate their contents on construction (e.g., checking length, allowed characters), so invalid values are rejected immediately rather than causing errors deep in the system.
Why This Library Uses It
Authentication code handles many different string identifiers (session IDs, user IDs, credential IDs, cache keys, etc.) that pass through multiple layers. Without type-safe wrappers, two categories of bugs can occur:
- Parameter Confusion: Raw string parameters can be silently swapped. For example, passing a credential_id where a session_id is expected compiles and runs, but produces incorrect behavior. In authentication code, this can lead to privilege escalation or data corruption.
- Unvalidated Input: Raw strings carry no guarantee about their contents. Malformed, empty, or overly long values can propagate deep into the system before causing failures. Cache keys could contain characters that trigger Redis command injection.
By wrapping each identifier in its own type, these issues are caught at the point of entry: the compiler rejects type mix-ups, and the constructor rejects invalid input.
Core Benefits
- Compile-time safety: Impossible to mix up parameter types (compiler rejects it)
- Input validation at the boundary: Invalid values are rejected at construction, never propagated
- Single validation point: Validate once at construction, never again
- Consistent behavior: Same validation rules regardless of backend/deployment
- Performance: Zero runtime overhead after construction (just a String wrapper)
- Maintainability: Centralized validation logic per type
Available Types
All types follow the same newtype pattern. Here is the full implementation of SessionId as a representative example:
#![allow(unused)]
fn main() {
pub struct SessionId(String); // Private inner field -- cannot be constructed directly
impl SessionId {
pub fn new(id: String) -> Result<Self, SessionError> {
// Must not be empty
if id.is_empty() { return Err(SessionError::Validation("...".into())); }
// Session IDs need sufficient entropy
if id.len() < 10 { return Err(SessionError::Validation("...".into())); }
if id.len() > 256 { return Err(SessionError::Validation("...".into())); }
// URL-safe characters only (no whitespace)
if !id.chars().all(|c| c.is_ascii_alphanumeric() || matches!(c, '-' | '_' | '.' | '~')) {
return Err(SessionError::Validation("...".into()));
}
Ok(SessionId(id))
}
pub fn as_str(&self) -> &str { &self.0 }
}
}
Every type below works the same way: new() validates and returns Result, as_str() returns the inner string. The differences are in what each type accepts.
Session & User Management
#![allow(unused)]
fn main() {
// SessionId: session identifiers for coordination layer functions
// Length: 10-256 | Chars: a-zA-Z0-9 - _ . ~ | Error: SessionError
pub struct SessionId(String);
// UserId: user identifiers (database IDs)
// Length: 1-255 | Chars: a-zA-Z0-9 - _ . @ + ( ) | Rejects: .. -- __ | Error: SessionError
pub struct UserId(String);
// SessionCookie: HTTP session cookie values
// Length: 10-1024 | Chars: a-zA-Z0-9 - _ = . + / | Error: SessionError
pub struct SessionCookie(String);
}
Usage:
#![allow(unused)]
fn main() {
let session_id = SessionId::new("session_abc123".to_string())?;
let user_id = UserId::new("user_123".to_string())?;
let user = get_user(session_id, user_id).await?;
let cookie = SessionCookie::new(cookie_value.to_string())?;
let user = get_user_from_session(&cookie).await?;
}
WebAuthn/Passkey Types
#![allow(unused)]
fn main() {
// CredentialId: passkey credential identifiers (base64url-encoded)
// Length: 10-1024 | Chars: a-zA-Z0-9 - _ . ~ = + / | Error: PasskeyError
pub struct CredentialId(String);
// UserHandle: WebAuthn user handles
// NO VALIDATION -- accepts any string, returns Self (not Result)
// WebAuthn user handles can be arbitrary binary data (base64-encoded)
pub struct UserHandle(String);
// UserName: usernames for WebAuthn registration
// Length: 1-64 | Rejects: .. -- __ and whitespace-only | Error: PasskeyError
pub struct UserName(String);
// ChallengeType: identifies the WebAuthn operation kind
// Length: 1-64 | Chars: a-zA-Z0-9 _ | Error: PasskeyError
// Convenience constructors: ChallengeType::registration(), ChallengeType::authentication()
pub struct ChallengeType(String);
// ChallengeId: unique identifier for a specific challenge instance
// Length: 8-256 | Chars: a-zA-Z0-9 - _ . + | Error: PasskeyError
pub struct ChallengeId(String);
}
Usage:
#![allow(unused)]
fn main() {
let cred_id = CredentialId::new("credential_abc123".to_string())?;
delete_credential(session_id, cred_id).await?;
let handle = UserHandle::new("user_handle_123".to_string()); // no ? -- always succeeds
let challenge_type = ChallengeType::registration(); // convenience constructor, no validation needed
}
OAuth2 Types
#![allow(unused)]
fn main() {
// OAuth2State: OAuth2 state parameter carrying CSRF protection data
// Length: 10-8192 | Error: OAuth2Error
// Multi-layer validation: base64url decode -> UTF-8 check -> JSON parse as StateParams
// This is the most heavily validated type in the library.
pub struct OAuth2State(String);
// AccountId: database identifiers for OAuth2 accounts
// Length: 1-255 | Chars: a-zA-Z0-9 - _ . @ + | Rejects: .. -- __ | Error: OAuth2Error
pub struct AccountId(String);
// Provider: OAuth2 provider names (e.g., "google", "github")
// Length: 1-50 | Chars: a-zA-Z0-9 - _ . | Cannot start with - _ . | Error: OAuth2Error
pub struct Provider(String);
// ProviderUserId: external user IDs from OAuth2 providers
// Length: 1-512 | Chars: a-zA-Z0-9 - _ . @ + = ( ) | Rejects: .. -- __ | Error: OAuth2Error
pub struct ProviderUserId(String);
// DisplayName: user display names from OAuth2 providers
// Length: 1-100 | Rejects: .. -- __ and whitespace-only | Error: OAuth2Error
pub struct DisplayName(String);
// Email: email addresses from OAuth2 providers
// Length: 3-254 (RFC 5321) | Must have exactly one @ with non-empty local/domain | Error: OAuth2Error
pub struct Email(String);
}
Usage:
#![allow(unused)]
fn main() {
let state = OAuth2State::new(state_param.to_string())?; // validates base64url -> UTF-8 -> JSON
let decoded = decode_state(&state)?;
let provider = Provider::new("google".to_string())?;
let email = Email::new("alice@example.com".to_string())?;
}
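The Email rules above (3-254 bytes per RFC 5321, exactly one @ with non-empty local and domain parts) can be illustrated with a self-contained sketch. This is not the library's actual implementation; the error enum and its variants are placeholders:

```rust
// Illustrative sketch of Email validation per the documented rules.
// Placeholder error type -- the real library returns OAuth2Error.
#[derive(Debug, PartialEq)]
enum ValidationError {
    Length,
    Format,
}

#[derive(Debug)]
struct Email(String);

impl Email {
    fn new(s: String) -> Result<Self, ValidationError> {
        // RFC 5321 length bounds
        if s.len() < 3 || s.len() > 254 {
            return Err(ValidationError::Length);
        }
        // Exactly one '@' with non-empty local and domain parts
        let mut parts = s.split('@');
        match (parts.next(), parts.next(), parts.next()) {
            (Some(local), Some(domain), None) if !local.is_empty() && !domain.is_empty() => {
                Ok(Email(s))
            }
            _ => Err(ValidationError::Format),
        }
    }

    fn as_str(&self) -> &str {
        &self.0
    }
}

fn main() {
    assert_eq!(
        Email::new("alice@example.com".to_string()).unwrap().as_str(),
        "alice@example.com"
    );
    assert!(Email::new("no-at-sign.example".to_string()).is_err()); // no '@'
    assert!(Email::new("a@b@c".to_string()).is_err());              // two '@'
    assert!(Email::new("@example.com".to_string()).is_err());       // empty local part
    println!("ok");
}
```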
Cache & Storage Types
#![allow(unused)]
fn main() {
// CachePrefix and CacheKey share identical validation logic.
// Both are designed to prevent Redis command injection.
//
// Length: max 250 bytes
// Rejected chars: \n \r space \t
// Rejected keywords: SET, GET, DEL, FLUSHDB, FLUSHALL, EVAL, SCRIPT,
// SHUTDOWN, CONFIG, CLIENT, DEBUG, MONITOR, SYNC
// Error: StorageError
pub struct CachePrefix(String);
pub struct CacheKey(String);
// CachePrefix provides convenience constructors for common prefixes.
// These bypass validation since they are known-good compile-time constants:
// CachePrefix::session(), CachePrefix::csrf(), CachePrefix::jwks(),
// CachePrefix::pkce(), CachePrefix::nonce(), CachePrefix::aaguid(), ...
}
Usage:
#![allow(unused)]
fn main() {
// From string (with validation)
let prefix = CachePrefix::new("custom_prefix".to_string())?;
// Convenience constructors (known-good values, no validation overhead)
let session_prefix = CachePrefix::session();
let csrf_prefix = CachePrefix::csrf();
}
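The injection defense for cache keys can be sketched as follows. This is an illustration under the assumption that the keyword check is an exact, case-insensitive match; the real implementation and its StorageError type may differ:

```rust
// Sketch of CacheKey validation: length cap, no whitespace/control
// characters, rejection of Redis command names. Placeholder error type;
// assumes exact case-insensitive keyword matching.
const FORBIDDEN: &[&str] = &[
    "SET", "GET", "DEL", "FLUSHDB", "FLUSHALL", "EVAL", "SCRIPT",
    "SHUTDOWN", "CONFIG", "CLIENT", "DEBUG", "MONITOR", "SYNC",
];

#[derive(Debug)]
struct CacheKey(String);

impl CacheKey {
    fn new(key: String) -> Result<Self, String> {
        if key.len() > 250 {
            return Err("key exceeds 250 bytes".to_string());
        }
        // Newlines and spaces could smuggle extra commands into the
        // inline Redis protocol
        if key.chars().any(|c| matches!(c, '\n' | '\r' | ' ' | '\t')) {
            return Err("whitespace/control character in key".to_string());
        }
        if FORBIDDEN.iter().any(|k| key.eq_ignore_ascii_case(k)) {
            return Err("reserved Redis command name".to_string());
        }
        Ok(CacheKey(key))
    }
}

fn main() {
    assert!(CacheKey::new("session:abc123".to_string()).is_ok());
    assert!(CacheKey::new("flushall".to_string()).is_err()); // Redis command
    assert!(CacheKey::new("a b".to_string()).is_err());      // embedded space
    println!("ok");
}
```

Because the same constructor runs regardless of backend, the in-memory cache enforces the identical rules, so switching to Redis in production cannot expose a new class of invalid keys.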
Search Field Enums
Database queries often need to search by different fields. Without type safety, you might write:
#![allow(unused)]
fn main() {
// Dangerous: which field does this string refer to? A user ID? An email? A credential ID?
fn get_credentials(field_name: &str, value: &str) -> Vec<Credential> { /* ... */ }
}
Search field enums combine the field selection and the typed value into a single type, so the compiler ensures you cannot pass an Email when searching by CredentialId:
#![allow(unused)]
fn main() {
// Passkey credential searches
pub enum CredentialSearchField {
CredentialId(CredentialId), // find by credential ID
UserId(UserId), // find all credentials for a user
UserHandle(UserHandle), // find by WebAuthn user handle
UserName(UserName), // find by username
}
// OAuth2 account searches
pub enum AccountSearchField {
Id(AccountId), // find by account ID
UserId(UserId), // find all accounts for a user
Provider(Provider), // find by provider name
ProviderUserId(ProviderUserId), // find by provider-specific user ID
Name(DisplayName), // find by display name
Email(Email), // find by email address
}
}
Usage:
#![allow(unused)]
fn main() {
// Each variant carries a validated typed value -- no raw strings anywhere
let user_id = UserId::new("user_123".to_string())?;
let credentials = PasskeyStore::get_credentials_by(
CredentialSearchField::UserId(user_id) // compiler ensures UserId goes into UserId variant
).await?;
let email = Email::new("alice@example.com".to_string())?;
let accounts = OAuth2Store::get_accounts_by(
AccountSearchField::Email(email) // cannot accidentally pass a Provider here
).await?;
}
Security Guarantees
Compile-Time Safety
- Parameter Confusion Prevention: Cannot pass a UserId where a CredentialId is expected
- Type Mixing Protection: Compiler enforces correct parameter types
- API Consistency: All functions use consistent typed interfaces
Runtime Validation
- Input Validation: All types validate their input during construction
- Cache Security: Prevents Redis command injection across all backends
- Length Limits: Enforces reasonable bounds on all identifiers
Storage Backend Consistency
- Unified Validation: Same security guarantees regardless of storage backend
- Memory vs Redis: No deployment-specific vulnerabilities
- Centralized Logic: Single validation point per type for easier maintenance
Security vs Performance Tradeoff
The type-safe validation system is designed for zero runtime overhead:
- Validation occurs once at type construction
- No repeated validation during function calls
- Compile-time guarantees eliminate runtime checks
- Memory overhead is minimal (single String wrapper per type)
This approach provides maximum security with optimal performance for authentication-critical code paths.
Usage Patterns
Coordination Layer Functions
All coordination functions require typed parameters:
#![allow(unused)]
fn main() {
// Admin functions
get_all_users(session_id: SessionId) -> Result<Vec<User>, CoordinationError>
get_user(session_id: SessionId, user_id: UserId) -> Result<Option<User>, CoordinationError>
delete_credential(session_id: SessionId, credential_id: CredentialId) -> Result<(), CoordinationError>
// User functions
get_user_credentials(session_id: SessionId, user_id: UserId) -> Result<Vec<PasskeyCredential>, CoordinationError>
}
Session Management
#![allow(unused)]
fn main() {
// Session validation
get_user_from_session(session_cookie: &SessionCookie) -> Result<SessionUser, SessionError>
// CSRF token handling uses typed SessionId internally
get_csrf_token_from_session(session_cookie: &str) -> Result<CsrfToken, SessionError>
}
OAuth2 Operations
#![allow(unused)]
fn main() {
// State parameter handling
encode_state(params: StateParams) -> Result<OAuth2State, OAuth2Error>
decode_state(state: &OAuth2State) -> Result<StateParams, OAuth2Error>
// Account search with typed enums
OAuth2Store::get_accounts_by(search_field: AccountSearchField) -> Result<Vec<OAuth2Account>, OAuth2Error>
}
Cache Operations
#![allow(unused)]
fn main() {
// Unified cache operations with type safety
store_cache_auto(prefix: CachePrefix, data: T, ttl: u64) -> Result<String, E>
store_cache_keyed(prefix: CachePrefix, key: CacheKey, data: T, ttl: u64) -> Result<(), E>
get_data(prefix: CachePrefix, key: CacheKey) -> Result<Option<T>, E>
}
Error Handling
All typed constructors can fail with validation errors:
#![allow(unused)]
fn main() {
// Handle validation errors
match SessionCookie::new(cookie_value.to_string()) {
Ok(cookie) => {
let user = get_user_from_session(&cookie).await?;
// Use validated cookie
}
Err(SessionError::Cookie(msg)) => {
// Handle invalid cookie format
}
}
match OAuth2State::new(state_param.to_string()) {
Ok(state) => {
let params = decode_state(&state)?;
// Use validated state
}
Err(OAuth2Error::DecodeState(msg)) => {
// Handle invalid state format
}
}
}
Benefits for Developers
IDE Support
- Auto-completion: IDEs show exactly what types are expected
- Type Checking: Immediate feedback on parameter mistakes
- Refactoring Safety: Compiler catches all places needing updates
Code Clarity
- Self-Documenting: Function signatures show validation requirements
- Intent Clear: Type names indicate the purpose of each parameter
- Consistent APIs: Same patterns across all modules
Security by Default
- No Bypass: Impossible to accidentally use raw strings
- Validation Required: Must construct types with proper validation
- Defense in Depth: Multiple layers of protection
Migration from Raw Strings
When migrating existing code from raw strings to typed wrappers, the change is straightforward. The key difference is that the typed version catches invalid input immediately and prevents parameter mix-ups at compile time:
#![allow(unused)]
fn main() {
// Before: raw String — no validation, no type safety
// A malicious or malformed user_id passes through silently.
// Swapping user_id with another String parameter compiles without error.
let credentials = PasskeyStore::get_credentials_by(
CredentialSearchField::UserId(user_id.to_string())
);
// After: typed wrapper — validated on construction, type-checked by compiler
// UserId::new() rejects empty strings, overly long values, and dangerous characters.
// Passing a CredentialId where UserId is expected is a compile error.
let user_id = UserId::new(user_id_string.to_string())?;
let credentials = PasskeyStore::get_credentials_by(
CredentialSearchField::UserId(user_id)
);
}
Storage Pattern: Why Singleton Instead of Axum State
The Problem with State in a Library
Axum applications typically manage shared resources (database pools, caches) through the State pattern, where a struct is attached to the router and extracted in each handler. This works well for application code, but creates friction when used inside a library.
Users must initialize and manage library state
With State, the library would expose its internal state struct (containing database pools, caches, etc.) and require the user to construct and attach it to the router. Since Axum allows only one state type per router, a user who already has their own application state must merge the two into a single combined type using Axum’s state composition mechanisms (FromRef, wrapper structs). Adding login should not require restructuring the application’s state types.
Internal state threading is burdensome
Authentication flows pass through multiple layers. For example, an OAuth2 login traverses:
google_auth() [HTTP handler]
-> authorized_core() [coordination layer]
-> OAuth2Store::get_account() [storage abstraction]
-> database query [SQLite or PostgreSQL]
With State, every function in this chain needs a state parameter – even the intermediate layers that do not access the database themselves. The coordination layer must accept and forward state simply because a storage function three levels down needs it. When writing a handler’s signature, you must already know that a function deep in the call chain requires database access. Adding a new storage call at the bottom of the chain forces signature changes through every layer above it. In this library, that would affect roughly 80-100 function signatures across the coordination, session, storage, and audit layers – a substantial maintenance burden for a change that adds no functionality.
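The contrast can be sketched in plain Rust with toy types (the real store is a trait object behind tokio's async Mutex; std's LazyLock, stable since Rust 1.80, stands in here):

```rust
use std::sync::{LazyLock, Mutex};

// Toy stand-in for the library's data store.
struct Store {
    users: Vec<String>,
}

// --- State-threading style: every layer carries the parameter ---
fn handler(store: &Mutex<Store>) -> usize {
    coordination(store) // forwards state it never uses itself
}
fn coordination(store: &Mutex<Store>) -> usize {
    storage(store) // forwards again
}
fn storage(store: &Mutex<Store>) -> usize {
    store.lock().unwrap().users.len() // the only layer that needs it
}

// --- Global-static style: only the bottom layer mentions storage ---
static DATA_STORE: LazyLock<Mutex<Store>> = LazyLock::new(|| {
    Mutex::new(Store { users: vec!["alice".to_string()] })
});

fn handler_global() -> usize {
    coordination_global()
}
fn coordination_global() -> usize {
    storage_global()
}
fn storage_global() -> usize {
    DATA_STORE.lock().unwrap().users.len()
}

fn main() {
    let app_state = Mutex::new(Store { users: vec![] });
    assert_eq!(handler(&app_state), 0);
    assert_eq!(handler_global(), 1);
    println!("ok");
}
```

Adding a storage call inside `storage_global()` changes no other signature, whereas adding one inside `storage()` would ripple a new parameter through every caller above it.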
State prevents environment-variable-only configuration
This library supports multiple storage backends (SQLite/PostgreSQL, Memory/Redis). The preferred user experience is to set DB_TYPE=postgresql in .env and let the library handle backend construction internally. With State, this is not possible – the user must construct the correct backend objects and pass them into the state struct, because State requires the application to provide its contents explicitly.
The Solution: Global Static Storage
Instead of State, this library uses LazyLock globals initialized once at startup:
// Simplified internal structure (uses tokio::sync::Mutex for async access)
static DATA_STORE: LazyLock<Mutex<Box<dyn DataStore>>> = LazyLock::new(|| {
// Reads DB_TYPE from environment, creates appropriate backend
});
static CACHE_STORE: LazyLock<Mutex<Box<dyn CacheStore>>> = LazyLock::new(|| {
// Reads CACHE_TYPE from environment, creates appropriate backend
});
For the user, integration is two lines:
oauth2_passkey_axum::init().await?; // Force-initialize global stores
let app = Router::new().route("/", get(home)).merge(oauth2_passkey_full_router());
No state structs, no type composition, no Axum-specific boilerplate. Internally, any function in the library can access storage directly through the globals, regardless of where it sits in the call chain and without requiring callers to pass state down:
// Any function can read/write storage directly -- no state parameter needed
let store = GENERIC_DATA_STORE.lock().await;
let user = store.get_user(&user_id).await?;
GENERIC_CACHE_STORE.lock().await.put(prefix, key, data, ttl).await?;
Limitations and How They Are Handled
Parallel Test Isolation
This is the most significant practical cost. All tests in a process share the same LazyLock-initialized database, so parallel tests can interfere with each other’s data.
The library addresses this through multiple mechanisms:
- Selective serialization: Tests that modify database state use #[serial] (from the serial_test crate). Read-only tests run in parallel without restriction.
- Unique ID generation: Tests generate per-test identifiers using timestamps, thread IDs, and atomic counters to avoid collisions even when running in parallel.
- Lock-holding deletion: delete_user_atomically() holds the GENERIC_DATA_STORE mutex lock during the entire delete sequence (OAuth2 accounts -> passkey credentials -> user) to prevent foreign key constraint violations from interleaved operations.
This approach was developed iteratively – foreign key constraint errors in parallel tests led to the lock-holding deletion pattern. It works, but requires discipline: any new test that mutates shared state must use #[serial] or unique IDs.
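A simplified synchronous sketch of the lock-holding pattern (the real code uses tokio's async Mutex and SQL statements; the in-memory tables here are illustrative):

```rust
use std::sync::Mutex;

// Toy in-memory tables keyed by user id.
struct Db {
    oauth2_accounts: Vec<(String, String)>,     // (user_id, provider)
    passkey_credentials: Vec<(String, String)>, // (user_id, credential_id)
    users: Vec<String>,
}

// Mutex::new and Vec::new are const, so no LazyLock is needed here.
static DATA_STORE: Mutex<Db> = Mutex::new(Db {
    oauth2_accounts: Vec::new(),
    passkey_credentials: Vec::new(),
    users: Vec::new(),
});

fn delete_user_atomically(user_id: &str) {
    // One lock acquisition guards the whole sequence, so a parallel
    // test cannot interleave between steps and trigger a foreign key
    // constraint violation.
    let mut db = DATA_STORE.lock().unwrap();
    db.oauth2_accounts.retain(|(uid, _)| uid.as_str() != user_id);     // 1. child rows
    db.passkey_credentials.retain(|(uid, _)| uid.as_str() != user_id); // 2. child rows
    db.users.retain(|uid| uid.as_str() != user_id);                    // 3. parent row
} // lock released here

fn main() {
    {
        let mut db = DATA_STORE.lock().unwrap();
        db.users.push("u1".to_string());
        db.oauth2_accounts.push(("u1".to_string(), "google".to_string()));
        db.passkey_credentials.push(("u1".to_string(), "cred1".to_string()));
    }
    delete_user_atomically("u1");
    let db = DATA_STORE.lock().unwrap();
    assert!(db.users.is_empty());
    assert!(db.oauth2_accounts.is_empty());
    assert!(db.passkey_credentials.is_empty());
    println!("ok");
}
```

Releasing the lock between steps would reintroduce the original bug: another task could insert or read a child row after step 1 but before step 3.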
Implicit Global Dependencies
Function signatures do not reveal their dependency on global storage. A function like get_user_from_session() internally accesses GENERIC_CACHE_STORE and GENERIC_DATA_STORE, but this is invisible to the caller and not enforced by the compiler.
The practical risk is that forgetting to call init() causes a runtime panic on first access rather than a compile-time error. This has not been an issue in practice because init() is always called in main() and init_test_environment() in tests, but the compiler cannot help catch initialization ordering mistakes during refactoring.
Single Instance Per Process
Running independent authentication instances in the same process is not possible. Global statics enforce a single configuration. This has not been a limitation in practice – authentication systems typically need only one instance.
Summary
This library uses LazyLock globals to manage storage, making database and cache access available from any function without state parameters. This approach has known costs – parallel tests share a single database, global dependencies are invisible to the compiler, and only one instance per process is possible. However, compared to Axum’s State pattern, it eliminates user-facing state composition boilerplate, avoids threading state through 80-100 internal function signatures, and enables configuration through environment variables alone – resulting in a simpler integration experience for users.
Appendix C: Troubleshooting
This guide covers common issues you may encounter when using oauth2-passkey and how to resolve them.
Common Errors
Database Connection Issues
SQLite Problems
- Database errors during startup
  - The SQLite database will be created automatically on first run
  - Ensure the directory for your SQLite file is writable
  - Check that the path specified by GENERIC_DATA_STORE_URL in your .env is accessible
- How to reset the database
  - Delete the database file (e.g., auth.db) to clear all data
  - Use touch to recreate the database file if needed
  - Restart the application to reinitialize
PostgreSQL Problems
- Verify the PostgreSQL server is running
- Check connection credentials in GENERIC_DATA_STORE_URL
- Ensure the database exists and the user has proper permissions
OAuth2 Errors
Redirect URI Mismatch
- Verify the redirect URI in Google Cloud Console matches exactly
- The default redirect URI is {ORIGIN}/o2p/oauth2/authorized
- For example: http://localhost:3001/o2p/oauth2/authorized
Invalid Credentials
- Check your Google OAuth2 credentials in .env:
  - OAUTH2_GOOGLE_CLIENT_ID
  - OAUTH2_GOOGLE_CLIENT_SECRET
- Verify the credentials are from the correct Google Cloud project
“Invalid origin” Error
- Ensure ORIGIN in .env matches the URL you’re visiting exactly
- Use http://localhost:3001 (not 127.0.0.1)
- The scheme (http/https), hostname, and port must all match
Google OAuth2 Not Working
- Check your Google OAuth2 credentials in .env
- Verify authorized origins and redirect URIs in Google Cloud Console
- Ensure the OAuth consent screen is properly configured
Passkey/WebAuthn Errors
Origin Mismatch
- Ensure ORIGIN in .env matches the URL exactly
- Use http://localhost:3001 (not 127.0.0.1)
- WebAuthn is strict about origin validation
“Authenticator not found” Error
- Ensure your device has biometric capabilities enabled
- Check that platform authenticator (Touch ID, Face ID, Windows Hello) is set up
- Try using a security key if available
“WebAuthn not supported” Error
- Ensure you’re using a modern browser (Chrome, Firefox, Safari, Edge)
- Update your browser to the latest version
- WebAuthn support varies by browser version
“Secure context required” Error
- WebAuthn requires a secure context (localhost or HTTPS)
- localhost works over HTTP (it’s a secure context)
- For production, use HTTPS via a reverse proxy (nginx/Caddy)
WebAuthn/Passkey Not Working
- WebAuthn requires a secure context (localhost or HTTPS)
- Try a different browser if having issues (Chrome has the best support)
- Clear browser data for localhost if needed
Session Issues
Cookie Problems
- Check that cookies are enabled in your browser
- For cross-site scenarios, ensure SameSite cookie settings are correct
- Clear browser cookies and try again
CSRF Token Issues
- Do not disable JavaScript (CSRF tokens require JavaScript)
- Ensure the session hasn’t expired
- Try logging out and back in
Debug Tips
Logs
- Check console output for detailed error messages
- Set the RUST_LOG environment variable to control log verbosity
  - Log levels from least to most verbose: error < warn < info < debug < trace
  - Example: RUST_LOG=debug cargo run
Database
- SQLite: The file auth.db (or the configured path) stores user data and sessions
- Reset: Delete the database file and restart to clear all data
- Location: Check GENERIC_DATA_STORE_URL in your .env for the actual path
Browser-Specific Issues
- Chrome: Best WebAuthn support, recommended for development
- Firefox: Good support, may need to enable some WebAuthn features
- Safari: Works well on macOS/iOS with Touch ID/Face ID
- Edge: Similar to Chrome (Chromium-based)
Development Tips
Using Cloudflared Tunnel
For public HTTPS access for mobile testing or production:
- Set up a cloudflared tunnel pointing to http://localhost:3001
- Update .env: ORIGIN='https://your-tunnel-domain.example.com'
- For OAuth2: Update the Google OAuth2 redirect URI to https://your-tunnel-domain.example.com/o2p/oauth2/authorized
Environment Configuration
- Use HTTPS for production (via reverse proxy or tunnel)
- The ORIGIN environment variable must match the URL users access exactly
- Use SQLite and in-memory cache for quick local development
- Use PostgreSQL and Redis for production deployments
Quick Reset Procedure
- Stop the application
- Delete the database file (e.g., rm auth.db or rm /tmp/auth.db)
- Restart the application
- Re-register users and credentials
Testing Authentication Flows
- Create test users with different authentication methods
- Test both registration and sign-in flows
- Verify session persistence across page reloads
- Test logout functionality clears sessions properly