Track wallets, markets, and tokens locally. The SDK streams live events into a SQLite database and backfills recent history on startup. All queries run instantly against the local DB with zero API calls.
Why Use the Cache
Without the cache, every page view that shows trader positions requires upstream API calls. For apps tracking dozens or hundreds of wallets, this hits rate limits fast.
With the cache:
- Three API calls per wallet to backfill: open positions (metadata), onchain positions (complete P&L), and recent trades
- Live WebSocket stream keeps everything up to date after that
- All queries are local — positions, trades, P&L, stats are instant
- Persists across restarts — SQLite file stays on disk
Quick Start
Install
npm install polynode-sdk better-sqlite3
better-sqlite3 is an optional peer dependency; it is only needed if you use the cache.
# Cargo.toml
polynode = { version = "0.12", features = ["cache"] }
tokio = { version = "1", features = ["full"] }
The cache feature includes rusqlite with bundled SQLite, so no system dependency is needed.
Create a watchlist
Create polynode.watch.json in your project root:
{
"version": 1,
"wallets": [
{ "address": "0xabc...", "label": "trader-1", "backfill": true },
{ "address": "0xdef...", "label": "trader-2", "backfill": true }
],
"settings": {
"ttl_days": 30
}
}
Start the cache
import { PolyNode, PolyNodeCache } from 'polynode-sdk';
const pn = new PolyNode({ apiKey: 'pn_live_...' });
const cache = new PolyNodeCache(pn, {
dbPath: './my-cache.db',
watchlistPath: './polynode.watch.json',
});
await cache.start();
use polynode::{PolyNodeClient, cache::PolyNodeCache};
use std::sync::Arc;
let client = Arc::new(PolyNodeClient::new("pn_live_...")?);
let mut cache = PolyNodeCache::builder(client)
.db_path("./my-cache.db")
.watchlist_path("./polynode.watch.json")
.build()?;
cache.start().await?;
Query locally
const trades = cache.walletTrades('0xabc...', { limit: 50 });
const positions = cache.walletPositions('0xabc...');
const stats = cache.stats();
let trades = cache.wallet_trades("0xabc...", &QueryOptions { limit: Some(50), ..Default::default() })?;
let positions = cache.wallet_positions("0xabc...")?;
let stats = cache.stats()?;
Backfill Timing
Backfill makes three requests per wallet, each spaced by 1 / backfillRatePerSecond seconds:
- Open positions — metadata (title, slug, outcome) from the standard positions API
- Onchain positions — complete position history with accurate realized P&L (single call, no client-side pagination)
- Recent trades — trade history for cost basis and trade analytics
| Wallets | Requests | Time at 1 req/s | Time at 2 req/s |
|---|---|---|---|
| 1 | 3 | ~3 seconds | ~1.5 seconds |
| 10 | 30 | ~30 seconds | ~15 seconds |
| 50 | 150 | ~2.5 minutes | ~1.3 minutes |
| 100 | 300 | ~5 minutes | ~2.5 minutes |
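The table values are simple arithmetic over the request count. A quick sketch of that math (the helper name and signature are ours, not an SDK export):

```typescript
// Estimate backfill duration: 3 requests per wallet (open positions,
// onchain positions, recent trades), spaced at 1 / ratePerSecond.
// Trade pages beyond the first add one request each.
function backfillEtaSeconds(
  wallets: number,
  ratePerSecond = 1,
  extraTradePages = 0,
): number {
  const requestsPerWallet = 3 + extraTradePages;
  return (wallets * requestsPerWallet) / ratePerSecond;
}

console.log(backfillEtaSeconds(50));     // 150 requests at 1 req/s → 150 s (~2.5 minutes)
console.log(backfillEtaSeconds(100, 2)); // 300 requests at 2 req/s → 150 s (~2.5 minutes)
```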
P&L data comes from the onchain positions call (step 2). This returns every position the wallet has ever held in a single request, no client-side pagination needed. P&L is accurate and complete regardless of how many trade pages you fetch.
Trade history (step 3) is separate. It’s useful if you want individual buy/sell records with prices, timestamps, and maker/taker details. Set backfillPages higher for more trade history:
| Pages | Trades per wallet | Extra time per wallet |
|---|---|---|
| 1 (default) | up to 500 | +1 second |
| 2 | up to 1,000 | +2 seconds |
| 6 | up to 3,000 | +6 seconds |
Trade history has a 3,000 trade cap from the upstream data source. This does NOT affect P&L accuracy. P&L comes from onchain position data, which is complete and has no cap. The live WebSocket stream captures all new trades going forward with no limit.
Configuration
const cache = new PolyNodeCache(pn, {
// File paths
dbPath: './polynode-cache.db', // SQLite database location
watchlistPath: './polynode.watch.json', // Watchlist file
// Backfill
backfillRatePerSecond: 1, // Requests per second (default: 1)
backfillPages: 1, // Pages per wallet (default: 1, max: 6)
backfillPageSize: 500, // Trades per page (default: 500, max: 1000)
// Storage
ttlSeconds: 30 * 86400, // Auto-prune after 30 days
purgeOnRemove: false, // Delete data when wallet removed from watchlist
// Progress callback
onBackfillProgress: (p) => {
console.log(`${p.label}: ${p.status} (${p.fetched} trades)`);
},
});
Query Methods
All queries run against the local SQLite database. No API calls. Every example below shows real output from a live backfill.
Wallet Trades
const trades = cache.walletTrades('0xad53...', { limit: 3 });
[
{
"side": "BUY",
"price": 0.821,
"size": 5.92,
"market_title": "Will Iran conduct a military action against Israel on March 20, 2026?",
"outcome": "Yes",
"timestamp": "2026-03-21T18:00:28.223Z"
},
{
"side": "SELL",
"price": 0.18,
"size": 40.51,
"market_title": "Will Iran conduct a military action against Israel on March 20, 2026?",
"outcome": "No"
},
{
"side": "BUY",
"price": 0.181,
"size": 19.1,
"market_title": "Will Iran conduct a military action against Israel on March 20, 2026?",
"outcome": "No"
}
]
Filters: side, since, until, orderBy, limit, offset:
// Only BUY trades
const buys = cache.walletTrades('0xad53...', { limit: 3, side: 'BUY' });
// Pagination
const page1 = cache.walletTrades('0xad53...', { limit: 5, offset: 0 });
const page2 = cache.walletTrades('0xad53...', { limit: 5, offset: 5 });
// Time range
const recent = cache.walletTrades('0xad53...', { since: 1774000000 });
// Ascending order
const oldest = cache.walletTrades('0xad53...', { orderBy: 'timestamp_asc', limit: 3 });
Wallet Positions
Positions are backfilled from two sources: the standard positions API (metadata like title, outcome, current price) and the onchain positions endpoint (accurate realized P&L, average entry price, total bought). Trade timestamps are enriched from the local trades table.
const positions = cache.walletPositions('0xad53...');
// 200 positions from 500 cached trades
Example output (first 2 of 200 positions):
[
{
"wallet": "0xad53...",
"token_id": "75929940...",
"market_title": "Will \"How to Make a Killing\" score at least 59 on the Rotten Tomatoes Tomatometer?",
"outcome": "Yes",
"size": 10000,
"avg_price": 0.001,
"cur_price": 0.0005,
"current_value": 5.0,
"cash_pnl": -5.0,
"percent_pnl": -50.0,
"redeemable": false,
"trade_count": 3,
"first_trade_at": 1710000000,
"last_trade_at": 1774100000
},
{
"wallet": "0xad53...",
"market_title": "Will Resni.ca (Res) be part of the next Government of Slovenia?",
"outcome": "Yes",
"size": 7.50,
"avg_price": 0.41,
"cash_pnl": 2.5,
"trade_count": 1
}
]
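The two-source merge described above can be sketched as follows. Field names mirror the documented position rows, but the merge function itself is an illustration under those assumptions, not the SDK's actual implementation:

```typescript
// Merge metadata from the standard positions API with size/entry data
// from the onchain positions endpoint, keyed by token_id.
interface MetaPosition {
  token_id: string;
  market_title: string;
  outcome: string;
  cur_price: number; // from the standard positions API
}

interface OnchainPosition {
  token_id: string;
  size: number;
  avg_price: number; // from the onchain positions endpoint
}

function mergePositions(meta: MetaPosition[], onchain: OnchainPosition[]) {
  const byToken: Record<string, MetaPosition> = {};
  for (const m of meta) byToken[m.token_id] = m;
  return onchain.map((o) => {
    const m = byToken[o.token_id];
    const current_value = m ? o.size * m.cur_price : 0;
    return {
      ...o,
      market_title: m?.market_title ?? null,
      outcome: m?.outcome ?? null,
      current_value,
      cash_pnl: current_value - o.size * o.avg_price, // unrealized P&L
    };
  });
}
```

For example, a 10,000-share position bought at 0.001 with a current price of 0.0005 yields current_value 5.0 and cash_pnl -5.0, matching the sample row above.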
Multi-Wallet Positions
Query positions for multiple wallets in one call:
const all = cache.multiWalletPositions(['0xad53...', '0x2afd...', '0xe4ca...']);
Returns an object keyed by wallet address, where each value is an array of positions:
{
"0xad53...": [
{ "wallet": "0xad53...", "market_title": "...", "outcome": "Yes", "size": 10000, "avg_price": 0.04, "cash_pnl": -50.0, "trade_count": 3 },
{ "wallet": "0xad53...", "market_title": "...", "outcome": "No", "size": 500, "avg_price": 0.41, "cash_pnl": 12.5, "trade_count": 1 }
],
"0x2afd...": [
{ "wallet": "0x2afd...", "market_title": "...", "outcome": "Yes", "size": 250, "avg_price": 0.55, "cash_pnl": 30.0, "trade_count": 2 }
],
"0xe4ca...": [...]
}
Market Trades
const trades = cache.marketTrades('0xe1cc...', { limit: 3 });
[
{ "taker": "0xad53...", "side": "BUY", "price": 0.821, "size": 5.92 },
{ "taker": "0xad53...", "side": "SELL", "price": 0.18, "size": 40.51 },
{ "taker": "0xad53...", "side": "BUY", "price": 0.181, "size": 19.1 }
]
Market Positions
All positions across all cached wallets for a market:
const positions = cache.marketPositions('0xe1cc...');
// 9 positions across multiple wallets
[
{ "outcome": "Yes", "size": -172.12, "avg_price": 0.8399 },
{ "outcome": "No", "size": -81.39, "avg_price": 0.1808 },
{ "outcome": "No", "size": 28.66, "avg_price": 0.18 }
]
Token Trades
const trades = cache.tokenTrades('11382339...', { limit: 3 });
// All returned trades match the requested token_id
Trade by Transaction Hash
Look up all trades within a single transaction:
const trades = cache.tradeByTxHash('0x6815497d...');
[
{ "side": "BUY", "price": 0.821, "size": 5.92 },
{ "side": "SELL", "price": 0.18, "size": 40.51 },
{ "side": "BUY", "price": 0.181, "size": 19.1 }
]
Wallet Settlements
const settlements = cache.walletSettlements('0xad53...', { limit: 20 });
Cache Stats
const stats = cache.stats();
{
"trade_count": 1509,
"settlement_count": 3,
"db_size_kb": 10567.3,
"oldest_trade": "2026-03-14T19:19:25.000Z",
"newest_trade": "2026-03-21T18:00:28.223Z",
"backfill_complete": 3,
"backfill_total": 3,
"backfill_failed": 0
}
Watchlist
{
"version": 1,
"wallets": [
{ "address": "0xabc...", "label": "whale", "backfill": true }
],
"markets": [
{ "condition_id": "0x789...", "label": "BTC 100k", "backfill": true }
],
"tokens": [
{ "token_id": "12345...", "label": "BTC Yes", "backfill": true }
],
"settings": {
"ttl_days": 30,
"backfill_rate": 1,
"purge_on_remove": false
}
}
Hot Reload
Edit the watchlist file while the cache is running. Changes are detected automatically within 500ms:
- New entries trigger backfill and update the WebSocket subscription
- Removed entries optionally purge data (if purgeOnRemove is enabled)
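The reload step reduces to a diff of the old and new entries. A minimal sketch, assuming entries are compared by address (the function is illustrative, not an SDK export):

```typescript
// Diff two watchlist snapshots: new addresses need backfill and a
// WebSocket subscription update; removed addresses may be purged.
interface WatchEntry {
  address: string;
  label?: string;
  backfill?: boolean;
}

function diffWatchlist(prev: WatchEntry[], next: WatchEntry[]) {
  const prevSet = new Set(prev.map((w) => w.address));
  const nextSet = new Set(next.map((w) => w.address));
  return {
    added: next.filter((w) => !prevSet.has(w.address)),   // trigger backfill
    removed: prev.filter((w) => !nextSet.has(w.address)), // optionally purge
  };
}
```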
Runtime API
Add or remove wallets programmatically. Backfill starts immediately for new entries.
// Add a wallet — backfill starts within 1 second
cache.addToWatchlist([
{ type: 'wallet', id: '0x99ba...', label: 'UnholyScissors' }
]);
// After ~2 seconds:
cache.stats();
// { trade_count: 1857, backfill_complete: 4 }
// (was 1509 trades / 3 complete before adding)
// Remove a wallet
cache.removeFromWatchlist([
{ type: 'wallet', id: '0x99ba...' }
]);
View Methods
Pre-built queries that return data shaped for dashboards. No SQL, no aggregation — just call the method.
Watchlist Summary
All watched wallets with summary stats in one call:
const summary = cache.watchlistSummary();
[
{ "wallet": "0xad53...", "label": "whale-1", "position_count": 42, "total_pnl": 1250.50, "total_value": 8400.00, "last_active": 1774200000 },
{ "wallet": "0x2afd...", "label": "degen", "position_count": 15, "total_pnl": -320.00, "total_value": 1200.00, "last_active": 1774180000 }
]
Wallet Dashboard
Single wallet view with positions grouped, P&L totals, win/loss counts, and recent trades:
const dash = cache.walletDashboard('0xad53...');
The dashboard includes realized P&L computed from onchain position data:
| Field | Type | Description |
|---|---|---|
| total_positions | number | All positions in cache |
| total_pnl | number | Sum of unrealized P&L on open positions |
| realized_pnl | number | Total realized P&L from onchain data |
| pnl_confidence | string | "full" when onchain data is present, "partial" otherwise |
| win_count | number | Positions with positive P&L |
| loss_count | number | Positions with negative P&L |
| recent_trades | array | Last 20 trades |
| token_pnl | array | Per-token P&L breakdown |
Realized P&L
Compute accurate realized P&L for any wallet in the cache. During backfill, the SDK automatically fetches onchain position data from the onchain positions endpoint, which provides precomputed realized_pnl per token that matches Polymarket’s numbers exactly.
const pnl = cache.computeRealizedPnl('0xbddf61af533ff524d27154e589d2d7a81510c684');
let pnl = cache.compute_realized_pnl("0xbddf61af533ff524d27154e589d2d7a81510c684")?;
{
"wallet": "0xbddf61af533ff524d27154e589d2d7a81510c684",
"total_realized_pnl": 17183579.48,
"total_unrealized_pnl": 3086.69,
"total_pnl": 17186666.17,
"confidence": "full",
"trades_analyzed": 403,
"tokens": [
{
"token_id": "34158857...",
"condition_id": "0x5346...",
"market_title": "Nuggets vs. Warriors",
"outcome": "Nuggets",
"realized_pnl": 447182.95,
"unrealized_pnl": 0,
"remaining_size": 0,
"avg_cost": 0.3165,
"cur_price": null,
"trades_analyzed": 0,
"buys": 0,
"sells": 0
}
]
}
| Field | Type | Description |
|---|---|---|
| total_realized_pnl | number | Sum of realized P&L across all positions |
| total_unrealized_pnl | number | Unrealized P&L on open positions |
| confidence | string | "full" when onchain data is available; "partial" when only trade-based computation was possible |
| tokens | array | Per-token breakdown with individual P&L |
The onchain realized_pnl is the source of truth. When available, it takes priority over any trade-based P&L computation. This ensures accuracy even when the trade history is incomplete (Polymarket’s trades API silently drops trades for high-volume wallets).
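How those totals relate can be sketched directly from the per-token rows: realized P&L is summed verbatim from the onchain data, while unrealized P&L is recomputed from what is still held. Field names mirror the rows above; the function itself is illustrative:

```typescript
// Aggregate per-token P&L rows into the documented wallet-level totals.
interface TokenPnl {
  realized_pnl: number;
  remaining_size: number;
  avg_cost: number;
  cur_price: number | null; // null once the market has resolved
}

function totals(tokens: TokenPnl[]) {
  let realized = 0;
  let unrealized = 0;
  for (const t of tokens) {
    realized += t.realized_pnl; // onchain value, taken as-is
    if (t.cur_price !== null && t.remaining_size > 0) {
      unrealized += t.remaining_size * (t.cur_price - t.avg_cost);
    }
  }
  return {
    total_realized_pnl: realized,
    total_unrealized_pnl: unrealized,
    total_pnl: realized + unrealized,
  };
}
```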
Leaderboard
Rank watched wallets by any metric:
const leaders = cache.leaderboard('total_pnl');
// Also: 'total_value', 'trade_count', 'win_rate'
[
{ "wallet": "0xad53...", "label": "whale-1", "value": 1250.50, "rank": 1 },
{ "wallet": "0x2afd...", "label": "degen", "value": -320.00, "rank": 2 }
]
Leaderboard Builder
New in SDK v0.4.8. The builder extends the basic leaderboard() method with multi-metric support, market/slug filtering, wallet scoping, time windows, and 11 available metrics.
Call cache.leaderboard() with no arguments to get a LeaderboardBuilder. Chain filters and call .build() to execute.
Single metric
const rows = cache.leaderboard()
.metric('total_pnl')
.build();
[
{ "wallet": "0xcarol", "label": "Carol", "rank": 1, "metrics": { "total_pnl": 53.5 } },
{ "wallet": "0xalice", "label": "Alice", "rank": 2, "metrics": { "total_pnl": 15 } },
{ "wallet": "0xbob", "label": "Bob", "rank": 3, "metrics": { "total_pnl": -30 } }
]
Multi-metric
Request multiple metrics per row. Each row includes all metrics, sorted by the first one (or use .sortBy() to override):
const rows = cache.leaderboard()
.metrics(['total_pnl', 'volume', 'win_rate'])
.build();
[
{
"wallet": "0xcarol", "label": "Carol", "rank": 1,
"metrics": { "total_pnl": 53.5, "volume": 138.1, "win_rate": 1.0 }
},
{
"wallet": "0xalice", "label": "Alice", "rank": 2,
"metrics": { "total_pnl": 15, "volume": 102.5, "win_rate": 0.5 }
},
{
"wallet": "0xbob", "label": "Bob", "rank": 3,
"metrics": { "total_pnl": -30, "volume": 70, "win_rate": 0 }
}
]
Available metrics
| Metric | Description | Source |
|---|---|---|
| total_pnl | Sum of unrealized P&L across positions | Positions |
| total_value | Sum of current position values | Positions |
| realized_pnl | Sum of realized P&L | Positions |
| roi | Return on investment (total_pnl / initial_value) | Positions |
| win_rate | Fraction of positions with positive P&L | Positions |
| largest_win | Highest single-position P&L | Positions |
| largest_loss | Lowest single-position P&L | Positions |
| market_count | Number of distinct markets traded | Positions |
| trade_count | Total trades | Trades |
| volume | Total volume (price * size) | Trades |
| avg_trade_size | Average trade volume | Trades |
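Two of these metrics are simple enough to sketch from cached rows. The row shapes mirror the documented trade and position objects; the functions are illustrative, not SDK internals:

```typescript
// volume: sum of price * size over trade rows.
function volume(trades: { price: number; size: number }[]): number {
  return trades.reduce((sum, t) => sum + t.price * t.size, 0);
}

// win_rate: fraction of positions with positive P&L.
function winRate(positions: { cash_pnl: number }[]): number {
  return positions.length === 0
    ? 0
    : positions.filter((p) => p.cash_pnl > 0).length / positions.length;
}
```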
Sort by a different metric
By default, rows are sorted by the first metric in the array. Override with .sortBy():
const rows = cache.leaderboard()
.metrics(['total_pnl', 'volume'])
.sortBy('volume')
.build();
// Ranked by volume: Carol (138.1), Alice (102.5), Bob (70)
Sort direction
Default is DESC (highest first). Flip with .sort('ASC'):
const rows = cache.leaderboard()
.metric('total_pnl')
.sort('ASC')
.build();
// Bob (-30), Alice (15), Carol (53.5)
Filter by market
Scope to specific markets by condition ID:
const rows = cache.leaderboard()
.metrics(['total_pnl', 'volume'])
.markets(['0xcondition_btc'])
.build();
// Only counts positions and trades in the BTC market
[
{ "wallet": "0xcarol", "rank": 1, "metrics": { "total_pnl": 16, "volume": 40 } },
{ "wallet": "0xbob", "rank": 2, "metrics": { "total_pnl": 0, "volume": 0 } },
{ "wallet": "0xalice", "rank": 3, "metrics": { "total_pnl": -5, "volume": 20 } }
]
Filter by slug pattern
Use glob patterns on market slugs. * matches any characters:
const rows = cache.leaderboard()
.metrics(['total_pnl', 'trade_count'])
.slugs(['*trump*'])
.build();
// Only counts positions/trades in markets with "trump" in the slug
[
{ "wallet": "0xcarol", "rank": 1, "metrics": { "total_pnl": 37.5, "trade_count": 2 } },
{ "wallet": "0xalice", "rank": 2, "metrics": { "total_pnl": 20, "trade_count": 2 } },
{ "wallet": "0xbob", "rank": 3, "metrics": { "total_pnl": -30, "trade_count": 1 } }
]
Multiple patterns are OR’d together:
cache.leaderboard()
.slugs(['*trump*', 'btc-*', '*election*'])
.metric('total_pnl')
.build();
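One plausible way glob patterns like these map onto a SQL query is via LIKE clauses, with * translated to %. A sketch, assuming slugs live in a slug column (the column name and function are ours):

```typescript
// Convert glob-style slug patterns to an OR'd set of SQL LIKE clauses
// with bound parameters ('*' becomes the SQL wildcard '%').
function slugPatternsToSql(patterns: string[]): { sql: string; params: string[] } {
  const clauses = patterns.map(() => "slug LIKE ?");
  return {
    sql: `(${clauses.join(" OR ")})`,
    params: patterns.map((p) => p.replace(/\*/g, "%")),
  };
}
```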
Categories
Define reusable named groups of slug patterns:
const ELECTIONS = { name: 'elections', slugs: ['*election*', '*trump*', '*biden*'] };
const CRYPTO = { name: 'crypto', slugs: ['btc-*', 'eth-*', '*bitcoin*'] };
const SPORTS = { name: 'sports', slugs: ['*nba*', '*nfl*', '*ncaa*'] };
const rows = cache.leaderboard()
.category(ELECTIONS)
.metrics(['total_pnl', 'roi', 'win_rate'])
.limit(10)
.build();
Wallet scoping
Rank a subset of wallets instead of the full watchlist:
const rows = cache.leaderboard()
.metric('total_pnl')
.wallets(['0xalice', '0xcarol'])
.build();
// Only Alice and Carol, Bob excluded
[
{ "wallet": "0xcarol", "label": "Carol", "rank": 1, "metrics": { "total_pnl": 53.5 } },
{ "wallet": "0xalice", "label": "Alice", "rank": 2, "metrics": { "total_pnl": 15 } }
]
Time windows
Filter trade-derived metrics to a time range. Pass UNIX timestamps in seconds:
const weekAgo = Math.floor(Date.now() / 1000) - 7 * 86400;
const rows = cache.leaderboard()
.metrics(['trade_count', 'volume'])
.since(weekAgo)
.sortBy('volume')
.build();
Time windows apply to trade-derived metrics only (trade_count, volume, avg_trade_size). Position-derived metrics (total_pnl, roi, win_rate, etc.) always reflect current state, since positions are a snapshot rather than a time series.
Limit
Return only the top N:
const top5 = cache.leaderboard()
.metrics(['total_pnl', 'volume', 'win_rate'])
.limit(5)
.build();
Full combination
All filters compose together:
const rows = cache.leaderboard()
.metrics(['total_pnl', 'volume', 'trade_count'])
.slugs(['*trump*'])
.since(weekAgo)
.wallets(['0xalice', '0xcarol'])
.sortBy('volume')
.limit(10)
.build();
Type reference
interface LeaderboardRow {
wallet: string;
label: string;
rank: number;
metrics: Record<string, number>;
}
interface LeaderboardCategory {
name: string;
slugs: string[];
}
type LeaderboardMetric =
| 'total_pnl' | 'total_value' | 'trade_count' | 'win_rate'
| 'roi' | 'realized_pnl' | 'volume' | 'avg_trade_size'
| 'largest_win' | 'largest_loss' | 'market_count';
Market Overview
All cached positions for a market across watched wallets:
const overview = cache.marketOverview('0xcondition...');
// overview.positions, overview.total_volume, overview.unique_wallets
Reactive Subscriptions
Fire callbacks when new data lands in the cache from the live WebSocket stream.
// Subscribe to all changes
const unsub = cache.onChange((event) => {
// event.type: 'trade' | 'settlement'
// event.wallet: string
// event.data: TradeRow | SettlementRow
console.log(`New ${event.type} for ${event.wallet}`);
});
// Wallet-specific — only fires for this wallet
const unsub2 = cache.onWalletChange('0xad53...', (event) => {
updateUI(event.data);
});
// Cleanup
unsub();
unsub2();
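The shape of this API (global and per-wallet listeners, each subscription returning its own unsubscribe function) can be sketched in a few lines. This is a minimal illustration of the pattern, not the SDK's internal dispatcher:

```typescript
// Listener registry: onChange sees every event; onWalletChange wraps it
// with a wallet filter. Both return an unsubscribe closure.
type CacheEvent = { type: "trade" | "settlement"; wallet: string; data: unknown };

class Emitter {
  private listeners = new Set<(e: CacheEvent) => void>();

  onChange(fn: (e: CacheEvent) => void): () => void {
    this.listeners.add(fn);
    return () => {
      this.listeners.delete(fn);
    };
  }

  onWalletChange(wallet: string, fn: (e: CacheEvent) => void): () => void {
    return this.onChange((e) => {
      if (e.wallet === wallet) fn(e);
    });
  }

  emit(e: CacheEvent) {
    for (const fn of this.listeners) fn(e);
  }
}
```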
Export Helpers
Dump filtered data for charting libraries, spreadsheets, or custom analysis.
import * as fs from 'fs';
// CSV export
const csv = cache.exportCSV('trades', { wallet: '0xabc...', limit: 1000 });
fs.writeFileSync('trades.csv', csv);
// JSON array export
const json = cache.exportJSON('positions', { wallet: '0xabc...' });
// Raw rows for data libraries
const rows = cache.exportRows('trades', { wallet: '0xabc...', since: 1774000000 });
Filter options: wallet, conditionId, tokenId, side, since, until, limit, orderBy.
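CSV export requires quoting fields that contain commas, quotes, or newlines. A sketch of roughly what such a helper must do (illustrative, not the SDK's exportCSV implementation):

```typescript
// Serialize row objects to CSV: header from the first row's keys,
// RFC 4180-style quoting for fields containing , " or newlines.
function toCsv(rows: Record<string, unknown>[]): string {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);
  const escape = (v: unknown) => {
    const s = String(v ?? "");
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  return [
    headers.join(","),
    ...rows.map((r) => headers.map((h) => escape(r[h])).join(",")),
  ].join("\n");
}
```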
Query Builder
Chainable fluent API for complex queries without writing SQL.
// Filter trades by wallet, side, time, and market
const results = cache.query('trades')
.wallet('0xabc...')
.side('BUY')
.since(1774000000)
.market('0xcondition...')
.limit(50)
.orderBy('timestamp_desc')
.run();
// Filter positions by size and profitability
const winners = cache.query('positions')
.wallet('0xabc...')
.minSize(100)
.minPnl(0) // only profitable
.run();
Available filters: .wallet(), .market(), .token(), .side(), .since(), .until(), .limit(), .skip(), .orderBy(), .minSize(), .minPnl() (positions only).
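Under the hood, a builder like this typically accumulates WHERE clauses and bound parameters, then emits one SQL statement. A minimal sketch with three of the filters (illustrative, not the SDK's query builder):

```typescript
// Chainable query sketch: each filter appends a clause and a parameter;
// toSql() assembles the final statement for a prepared query.
class TradeQuery {
  private clauses: string[] = [];
  private params: unknown[] = [];

  wallet(w: string) {
    this.clauses.push("wallet = ?");
    this.params.push(w);
    return this;
  }

  side(s: "BUY" | "SELL") {
    this.clauses.push("side = ?");
    this.params.push(s);
    return this;
  }

  since(ts: number) {
    this.clauses.push("timestamp >= ?");
    this.params.push(ts);
    return this;
  }

  toSql() {
    const where = this.clauses.length ? ` WHERE ${this.clauses.join(" AND ")}` : "";
    return { sql: `SELECT * FROM trades${where}`, params: this.params };
  }
}
```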
How It Works
┌──────────────────┐
│ PolyNodeCache │
│ │
┌──────────────┤ backfill (1x) │
│ │ live stream │
│ │ prune timer │
▼ │ file watcher │
┌───────────┐ └────────┬─────────┘
│ REST API │ (backfill) │ (live events)
│ 3 calls │ ┌───────▼──────────┐
│ per wallet│ │ WebSocket stream │
└───────────┘ │ trades + settle. │
│ └───────┬──────────┘
│ │
▼ ▼
┌──────────────────────────────────────┐
│ SQLite (WAL mode) │
│ positions — open + closed with P&L │
│ trades — full inverted index │
│ settlements — pending + confirmed │
│ backfill_state — crash recovery │
└──────────────────────────────────────┘
- On start: opens SQLite, resets any interrupted backfills from a previous crash, loads watchlist, connects WebSocket, begins backfill
- Backfill (per wallet, 3 requests — skipped entirely if already complete):
- Fetches open positions from the standard API (metadata: title, outcome, slug)
- Fetches onchain positions (complete P&L for all positions including closed, single call, no pagination needed)
- Fetches recent trades (individual buy/sell records)
- Live stream: WebSocket delivers new trades and settlements in real-time
- Dedup: INSERT OR IGNORE with a unique constraint prevents duplicates between backfill and live data
- Prune: hourly timer removes data older than the configured TTL
- On stop: waits for any in-flight backfill to finish, then closes WebSocket and DB cleanly
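The dedup step is worth spelling out: because backfill and the live stream can deliver the same trade, inserts must be idempotent on a unique key. A sketch of that idea with an in-memory store standing in for the UNIQUE constraint (the key fields are illustrative):

```typescript
// Simulate INSERT OR IGNORE: a trade seen by both backfill and the live
// stream is keyed on a unique tuple and stored exactly once.
type Trade = { tx_hash: string; token_id: string; wallet: string; price: number };

function insertOrIgnore(store: Map<string, Trade>, t: Trade): boolean {
  const key = `${t.tx_hash}:${t.token_id}:${t.wallet}`;
  if (store.has(key)) return false; // duplicate: ignored, like the SQL path
  store.set(key, t);
  return true;
}
```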
Persistence & Crash Recovery
The SQLite database persists across restarts, deploys, and crashes. When you call cache.start() again:
- All trades, positions, and settlements are preserved in the SQLite file
- Completed backfills are skipped entirely (no network calls)
- Interrupted backfills resume automatically. Any entity that was mid-backfill when the process was killed gets retried on the next start
- WebSocket stream reconnects and picks up live events
- The console logs exactly what’s happening: how many entities need backfilling vs how many are already done
# First run — backfills everything
[PolyNodeCache] Backfilling 4 entities (1 page of 500 each) — ETA: ~4s
# Process killed mid-backfill, then restarted — only resumes incomplete ones
[PolyNodeCache] Reset 1 interrupted backfill(s) from previous session.
[PolyNodeCache] Backfilling 2 entities (1 page of 500 each) — ETA: ~2s
# Clean restart after everything is done — no network calls
[PolyNodeCache] All 4 entities already backfilled, skipping.
Backfill state is tracked in the backfill_state table with per-entity status (pending, in_progress, complete, failed). On startup, any in_progress entries left over from a crash are automatically reset to pending so they get retried.
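The startup reset described above amounts to one pass over the state table. A sketch using the documented statuses (the function and in-memory map are illustrative; the SDK persists this in SQLite):

```typescript
// Crash recovery: any entity left in_progress by a killed process is
// reset to pending so the next start retries it. Returns the reset count.
type BackfillStatus = "pending" | "in_progress" | "complete" | "failed";

function resetInterrupted(state: Map<string, BackfillStatus>): number {
  let reset = 0;
  for (const [id, status] of state) {
    if (status === "in_progress") {
      state.set(id, "pending");
      reset++;
    }
  }
  return reset;
}
```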
Stop and Cleanup
await cache.stop(); // closes WebSocket, waits for in-flight backfill to finish, closes DB
// Manual prune
const deleted = cache.prune(); // removes data older than TTL
stop() is safe to call at any time. It waits for any in-flight backfill operation to complete before closing the database, so you won’t get partial writes or corrupted state.
Testing Utilities
The SDK includes helpers that return known-active Polymarket wallets. Useful for examples, integration tests, and getting started without needing to find wallet addresses yourself.
import { getActiveTestWallet, getActiveTestWallets } from 'polynode-sdk';
// Get a single active wallet (instant, uses cached fallback)
const wallet = await getActiveTestWallet();
// Get multiple active wallets
const wallets = await getActiveTestWallets(5);
// Fetch a fresh wallet from live leaderboard data
const fresh = await getActiveTestWallet({ fresh: true });
Combine with the cache for a zero-config quickstart:
import { PolyNode, PolyNodeCache, getActiveTestWallet } from 'polynode-sdk';
const pn = new PolyNode({ apiKey: 'pn_live_...' });
const wallet = await getActiveTestWallet();
const cache = new PolyNodeCache(pn, {
dbPath: './cache.db',
watchlistPath: './polynode.watch.json',
});
await cache.start();
cache.addToWatchlist([{ type: 'wallet', id: wallet, label: 'test-trader' }]);
// Wait for backfill, then query
setTimeout(() => {
const trades = cache.walletTrades(wallet, { limit: 10 });
console.log(`${trades.length} trades for ${wallet}`);
}, 3000);
getActiveTestWallet() returns instantly by default using a cached list of known-active wallets. Pass { fresh: true } to fetch the current top trader from live data (adds ~1-2s network latency).
Full Example
import { PolyNode, PolyNodeCache } from 'polynode-sdk';
const pn = new PolyNode({ apiKey: 'pn_live_...' });
const cache = new PolyNodeCache(pn, {
dbPath: './cache.db',
watchlistPath: './polynode.watch.json',
backfillPages: 1,
onBackfillProgress: (p) => {
const icon = p.status === 'complete' ? '✓' : '⟳';
console.log(`${icon} ${p.label}: ${p.fetched} trades`);
},
});
await cache.start();
// First run:
// [PolyNodeCache] Backfilling 10 entities (1 page of 500 each) — ETA: ~10s
// ⟳ trader-1: 500 trades
// ✓ trader-1: 500 trades
// ...
//
// Subsequent runs (data persisted):
// [PolyNodeCache] All 10 entities already backfilled, skipping.
// Query locally — instant
const positions = cache.walletPositions('0xabc...');
for (const p of positions) {
console.log(`${p.outcome}: ${p.size} shares @ ${p.avg_price.toFixed(4)}`);
}
// Add a wallet at runtime
cache.addToWatchlist([
{ type: 'wallet', id: '0xnew...', label: 'new-whale' }
]);
// Stats
const stats = cache.stats();
console.log(`${stats.trade_count} trades, ${(stats.db_size_kb / 1024).toFixed(1)} MB`);
// Cleanup
await cache.stop();