An autonomous, LLM-powered trading agent that runs 24/7 on Cloudflare Workers. (This is a heavily modified fork of MAHORAGA.)
LLM Stock Trader monitors social sentiment from StockTwits and Reddit, uses AI (OpenAI, Anthropic, Google, xAI, DeepSeek via AI SDK, or 300+ models via OpenRouter) to analyze signals, and executes trades through Alpaca. It runs as a Cloudflare Durable Object with persistent state, automatic restarts, and 24/7 crypto trading support.
- 24/7 Operation — Runs on Cloudflare Workers, no local machine required
- Multi-Source Signals — StockTwits, Reddit (9 subreddits), Finnhub (news, insider, upgrades), FMP screener, SEC (8-K + Form 4), QuiverQuant, Alpaca screener & news, Twitter confirmation
- Multi-Provider LLM — OpenAI, Anthropic, Google, xAI, DeepSeek via AI SDK, OpenRouter (300+ models), or Cloudflare AI Gateway
- Crypto Trading — Trade BTC, ETH, SOL around the clock
- Options Support — High-conviction options plays
- Staleness Detection — Auto-exit positions that lose momentum
- Pre-Market Analysis — Prepare trading plans before market open
- Discord Notifications — Get alerts on BUY signals
- Fully Customizable — Well-documented with `[TUNE]` and `[CUSTOMIZABLE]` markers
- Node.js 18+
- Cloudflare account (free tier works)
- Alpaca account (free, paper trading supported)
- LLM API key (OpenAI, Anthropic, Google, xAI, DeepSeek), OpenRouter API key, or Cloudflare AI Gateway credentials
```bash
# Clone and install
git clone https://github.com/ygwyg/llm-stock-trader.git
cd llm-stock-trader
npm install
cd dashboard && npm install && cd ..

# Create D1 database
npx wrangler d1 create llm-stock-trader-db
# Copy the database_id to wrangler.jsonc

# Create KV namespace
npx wrangler kv namespace create CACHE
# Copy the id to wrangler.jsonc

# Run migrations
npx wrangler d1 migrations apply llm-stock-trader-db
```

Set secrets:

```bash
# Required
npx wrangler secret put ALPACA_API_KEY
npx wrangler secret put ALPACA_API_SECRET

# API Authentication - generate a secure random token (64+ chars recommended)
# Example: openssl rand -base64 48
npx wrangler secret put LLM_STOCK_TRADER_API_TOKEN

# LLM Provider (choose one mode)
npx wrangler secret put LLM_PROVIDER      # "openai-raw" (default), "openrouter", "ai-sdk", or "cloudflare-gateway"
npx wrangler secret put LLM_MODEL         # e.g. "gpt-4o-mini" or "openai/gpt-5-mini" (OpenRouter)

# LLM API Keys (based on provider mode)
npx wrangler secret put OPENAI_API_KEY    # For openai-raw, openrouter, or ai-sdk with OpenAI
npx wrangler secret put OPENAI_BASE_URL   # Optional: override base URL (auto-set for openrouter)
# npx wrangler secret put ANTHROPIC_API_KEY                  # For ai-sdk with Anthropic
# npx wrangler secret put GOOGLE_GENERATIVE_AI_API_KEY       # For ai-sdk with Google
# npx wrangler secret put XAI_API_KEY                        # For ai-sdk with xAI/Grok
# npx wrangler secret put DEEPSEEK_API_KEY                   # For ai-sdk with DeepSeek
# npx wrangler secret put CLOUDFLARE_AI_GATEWAY_ACCOUNT_ID   # For cloudflare-gateway
# npx wrangler secret put CLOUDFLARE_AI_GATEWAY_ID           # For cloudflare-gateway
# npx wrangler secret put CLOUDFLARE_AI_GATEWAY_TOKEN        # For cloudflare-gateway

# Optional
npx wrangler secret put ALPACA_PAPER             # "true" for paper trading (recommended)
npx wrangler secret put TWITTER_BEARER_TOKEN
npx wrangler secret put DISCORD_WEBHOOK_URL
npx wrangler secret put KILL_SWITCH_SECRET       # Emergency kill switch (separate from API token)

# Optional: Market data and signal providers
npx wrangler secret put FINNHUB_API_KEY   # Equity fundamentals + signal gatherer (free: https://finnhub.io/register)
npx wrangler secret put FMP_API_KEY       # Crypto market data + screener signals (free: https://financialmodelingprep.com/register)
npx wrangler secret put QUIVER_API_KEY    # QuiverQuant sentiment/fundamentals (optional, for gatherQuiverQuant)
```

Deploy:

```bash
npx wrangler deploy
```

All API endpoints require authentication via Bearer token:
```bash
# Set your API token as an env var for convenience
export LLM_STOCK_TRADER_TOKEN="your-api-token"

# Enable the agent
curl -H "Authorization: Bearer $LLM_STOCK_TRADER_TOKEN" \
  https://llm-stock-trader.bernardoalmeida2004.workers.dev/agent/enable

# Check status
curl -H "Authorization: Bearer $LLM_STOCK_TRADER_TOKEN" \
  https://llm-stock-trader.bernardoalmeida2004.workers.dev/agent/status

# View logs
curl -H "Authorization: Bearer $LLM_STOCK_TRADER_TOKEN" \
  https://llm-stock-trader.bernardoalmeida2004.workers.dev/agent/logs

# Emergency kill switch (uses separate KILL_SWITCH_SECRET)
curl -H "Authorization: Bearer $KILL_SWITCH_SECRET" \
  https://llm-stock-trader.bernardoalmeida2004.workers.dev/agent/kill
```
```bash
# Run dashboard locally (or use ./start for both backend + dashboard)
cd dashboard && npm install && npm run dev
```

```bash
# Copy config files (first time only)
cp .env.example .dev.vars                # Edit with your API keys
cp wrangler.example.jsonc wrangler.jsonc

# Auto-authenticate the dashboard (optional, avoids manual token entry)
echo "VITE_LLM_STOCK_TRADER_API_TOKEN=$(grep LLM_STOCK_TRADER_API_TOKEN .dev.vars | cut -d= -f2-)" > dashboard/.env.development

# Run local D1 migrations
npm run db:migrate

# Start both backend and dashboard
./start
```

This starts the Wrangler backend on http://localhost:8787 and the React dashboard on http://localhost:3000. Press Ctrl+C to stop both.

The dashboard/.env.development file auto-injects your API token so the dashboard authenticates without manual entry. Without it, you'll need to paste your `LLM_STOCK_TRADER_API_TOKEN` into the dashboard login screen on first visit.
```bash
./start             # Start both backend + dashboard
./start backend     # Backend only (port 8787)
./start dashboard   # Dashboard only (port 3000)
```

Or run each side manually:

```bash
# Terminal 1 - Start wrangler
npm run dev

# Terminal 2 - Start dashboard
cd dashboard && npm run dev
```

Test your Alpaca connection:

```bash
npm run test:alpaca
```

Verifies your Alpaca API keys by testing account authentication, market clock, and a live AAPL snapshot.

List OpenRouter models:

```bash
./scripts/list-models.sh
```

Fetches all 300+ OpenRouter models with pricing and writes them to scripts/openrouter-models.json, sorted by cost (cheapest first).

Enable the agent locally:

```bash
curl -H "Authorization: Bearer $LLM_STOCK_TRADER_TOKEN" \
  http://localhost:8787/agent/enable
```

The main trading logic is in src/durable-objects/llm-stock-trader-harness.ts. It's documented with markers to help you find what to modify:
| Marker | Meaning |
|---|---|
| `[TUNE]` | Numeric values you can adjust |
| `[TOGGLE]` | Features you can enable/disable |
| `[CUSTOMIZABLE]` | Sections with code you might want to modify |
- Create a new `gather*()` method that returns `Signal[]`
- Add it to the `runDataGatherers()` `Promise.all`
- Add a source weight to `SOURCE_CONFIG.weights`
See docs/harness.html for detailed customization guide.
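The steps above can be sketched as follows. This is an illustrative assumption, not the harness's actual types: the `Signal` shape, field names, and the endpoint URL are placeholders — check src/durable-objects for the real definitions.

```typescript
// Hypothetical Signal shape — the real interface lives in the harness.
interface Signal {
  symbol: string;
  source: string;
  raw_sentiment: number; // -1..1
  mentions: number;
  timestamp: number;
}

// Example gatherer: fetch trending tickers from a hypothetical JSON endpoint
// and map them into Signal[] the way the built-in gatherers do.
async function gatherMySource(
  fetchJson: (url: string) => Promise<any>
): Promise<Signal[]> {
  try {
    const data = await fetchJson("https://example.com/api/trending"); // placeholder URL
    return (data.items ?? []).map((item: any) => ({
      symbol: String(item.ticker).toUpperCase(),
      source: "my_source",
      // Clamp sentiment into the -1..1 range the pipeline expects
      raw_sentiment: Math.max(-1, Math.min(1, item.score ?? 0)),
      mentions: item.mentions ?? 1,
      timestamp: Date.now(),
    }));
  } catch {
    // A failing gatherer must return [] so it doesn't break the
    // Promise.all inside runDataGatherers().
    return [];
  }
}
```

Returning `[]` on failure matters: `runDataGatherers()` runs all gatherers in parallel, so one provider outage shouldn't sink the whole cycle.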
| Setting | Default | Description |
|---|---|---|
| `max_positions` | 5 | Maximum concurrent positions |
| `max_position_value` | 5000 | Maximum $ per position |
| `take_profit_pct` | 10 | Take profit percentage |
| `stop_loss_pct` | 5 | Stop loss percentage |
| `min_sentiment_score` | 0.3 | Minimum sentiment to consider |
| `min_analyst_confidence` | 0.6 | Minimum LLM confidence to trade |
| `options_enabled` | false | Enable options trading |
| `crypto_enabled` | false | Enable 24/7 crypto trading |
| `llm_model` | gpt-4o-mini | Research model. Use full ID for OpenRouter, e.g. openai/gpt-5-mini |
| `llm_analyst_model` | gpt-4o | Analyst model. Use full ID for OpenRouter, e.g. openai/gpt-5.2 |
LLM Stock Trader supports multiple LLM providers via four modes:

| Mode | Description | Required Env Vars |
|---|---|---|
| `openai-raw` | Direct OpenAI API (default) | OPENAI_API_KEY |
| `openrouter` | OpenRouter proxy (300+ models) | OPENAI_API_KEY (your OpenRouter key) |
| `ai-sdk` | Vercel AI SDK with 5 providers | One or more provider keys |
| `cloudflare-gateway` | Cloudflare AI Gateway (/compat) | CLOUDFLARE_AI_GATEWAY_ACCOUNT_ID, CLOUDFLARE_AI_GATEWAY_ID, CLOUDFLARE_AI_GATEWAY_TOKEN |
OpenRouter Setup:
OpenRouter gives you access to 300+ models (OpenAI, Anthropic, Google, Meta, DeepSeek, xAI, and more) through a single API key. Get your key at openrouter.ai/keys.
Free model support: LLM Stock Trader automatically detects free-tier models (model IDs containing `:free` or `/free`) and applies 8-second rate limiting between LLM calls to respect free-tier rate limits. If a model does not support system prompts (e.g., some Gemma variants via the `openrouter/free` endpoint), LLM Stock Trader automatically retries by converting system instructions into user message prefixes. Cost tracking correctly reports `$0.00 (free)` for these models.
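The free-model detection and 8-second spacing can be sketched like this. The function names are assumptions for illustration; the actual implementation lives in the provider code.

```typescript
// Detect free-tier OpenRouter model IDs, e.g. "google/gemma-3-27b-it:free".
function isFreeModel(modelId: string): boolean {
  return modelId.includes(":free") || modelId.includes("/free");
}

// Simple throttle: returns how many ms the caller should wait before the
// next LLM call so calls are spaced at least minIntervalMs apart.
// `now` is injectable for testing.
function makeThrottle(minIntervalMs: number, now: () => number = Date.now) {
  let lastCall = -Infinity;
  return function delayBeforeNextCall(): number {
    const wait = Math.max(0, lastCall + minIntervalMs - now());
    lastCall = now() + wait; // reserve the slot this call will occupy
    return wait;
  };
}

// Usage: const delay = makeThrottle(isFreeModel(model) ? 8000 : 0);
```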
```bash
# .dev.vars (local) or wrangler secrets (production)
LLM_PROVIDER=openrouter
LLM_MODEL=openai/gpt-5-mini
OPENAI_API_KEY=sk-or-v1-your-openrouter-key
```

Models use the provider/model format (e.g. openai/gpt-5-mini, anthropic/claude-sonnet-4.5, google/gemini-2.5-pro). The base URL is auto-configured to https://openrouter.ai/api/v1.
The dashboard settings panel includes a dynamic model picker when OpenRouter is selected -- it fetches all 300+ models with live pricing from the OpenRouter API, with search filtering and sort by price. LLM cost tracking uses actual costs from OpenRouter's API response, so free models correctly show $0.
Run ./scripts/list-models.sh to fetch all available models with pricing to a local JSON file, sorted by cost.
Optional OpenAI Base URL Override:
- `OPENAI_BASE_URL` — Override the base URL for OpenAI requests. Applies to `openai-raw` and `ai-sdk` (OpenAI models). Auto-set for `openrouter`. Default: `https://api.openai.com/v1`.
Cloudflare AI Gateway Notes:
- This integration calls Cloudflare's OpenAI-compatible `/compat/chat/completions` endpoint and always sends `cf-aig-authorization`.
- It is intended for BYOK/Unified Billing setups where upstream provider keys are configured in Cloudflare (so your worker does not send provider API keys).
- Models use the `{provider}/{model}` format (e.g. `openai/gpt-5-mini`, `google-ai-studio/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`).
AI SDK Supported Providers:
| Provider | Env Var | Example Models |
|---|---|---|
| OpenAI | OPENAI_API_KEY | openai/gpt-4o, openai/o1 |
| Anthropic | ANTHROPIC_API_KEY | anthropic/claude-sonnet-4, anthropic/claude-opus-4 |
| Google | GOOGLE_GENERATIVE_AI_API_KEY | google/gemini-2.5-pro, google/gemini-2.5-flash |
| xAI (Grok) | XAI_API_KEY | xai/grok-4, xai/grok-3 |
| DeepSeek | DEEPSEEK_API_KEY | deepseek/deepseek-chat, deepseek/deepseek-reasoner |
Example: Using Claude via OpenRouter:

```bash
npx wrangler secret put LLM_PROVIDER     # Set to "openrouter"
npx wrangler secret put LLM_MODEL        # Set to "anthropic/claude-sonnet-4.5"
npx wrangler secret put OPENAI_API_KEY   # Your OpenRouter API key (sk-or-...)
```

Example: Using Claude with AI SDK (direct):

```bash
npx wrangler secret put LLM_PROVIDER       # Set to "ai-sdk"
npx wrangler secret put LLM_MODEL          # Set to "anthropic/claude-sonnet-4"
npx wrangler secret put ANTHROPIC_API_KEY  # Your Anthropic API key
```

LLM Stock Trader gathers social sentiment from multiple sources in parallel. The core pipeline requires no API keys — StockTwits and Reddit are public APIs.
| Source | API Key Required | Role | Weight |
|---|---|---|---|
| StockTwits | No (public API) | Primary signal — 30 symbols, equities endpoint, Bullish/Bearish labels | 0.85 |
| Reddit | No (public API) | Primary signal — 9 subreddits, 50 posts each, keyword-based sentiment | 0.6–0.9 per sub |
| Finnhub | Yes (`FINNHUB_API_KEY`) | News, insider transactions, analyst upgrades/downgrades | — |
| FMP | Yes (`FMP_API_KEY`) | Screeners — gainers, losers, most active | — |
| SEC EDGAR | No (public API) | 8-K and Form 4 feeds (40 entries each), filing-based sentiment | — |
| QuiverQuant | Yes (`QUIVER_API_KEY`) | Alternative sentiment/fundamentals (when configured) | — |
| Alpaca Screener | Uses Alpaca key | Most actives, movers (when Alpaca configured) | — |
| Alpaca News | Uses Alpaca key | News feed for screened symbols (when Alpaca configured) | — |
| Twitter/X | Yes (`TWITTER_BEARER_TOKEN`) | Optional confirmation only — boosts existing signals | 0.90–0.95 |
| Crypto Momentum | No (uses Alpaca key) | Price-momentum-based sentiment for crypto assets | — |
- Gather — `runDataGatherers()` fetches StockTwits trending symbols, Reddit hot posts (r/wallstreetbets, r/stocks, r/investing, r/options), SEC filings, and crypto momentum in parallel
- Score — Each source applies weighting: time decay (exponential with 120-min half-life for Reddit), engagement multipliers (upvotes/comments), flair bonuses (DD posts get 1.5x, memes get 0.4x), and source-specific weights
- Filter — Signals below `min_sentiment_score` (default 0.3) are dropped. StockTwits requires minimum 5 messages per symbol; Reddit requires minimum 2 mentions
- Confirm (optional) — If `TWITTER_BEARER_TOKEN` is set, signals with sentiment >= 0.3 are checked against recent tweets with actionable keywords (unusual flow, sweep, whale, breaking, upgrade/downgrade). Matching Twitter sentiment boosts signal confidence
- Research — Surviving signals are sent to the LLM for deep analysis and final BUY/SKIP/HOLD verdict
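The Score step for Reddit can be illustrated with a minimal sketch. The constants come from the description above (120-min half-life, 1.5x DD flair, 0.4x memes); the function and field names are assumptions, not the harness API.

```typescript
const HALF_LIFE_MIN = 120; // Reddit time-decay half-life, per the pipeline description

function scoreRedditSignal(opts: {
  rawSentiment: number; // -1..1 keyword-based sentiment
  ageMinutes: number;   // post age in minutes
  engagement: number;   // upvote/comment multiplier, >= 1
  flairBonus: number;   // 1.5 for DD, 0.4 for memes, 1.0 otherwise
  sourceWeight: number; // 0.6–0.9 per subreddit
}): number {
  // Exponential decay: a post loses half its weight every 120 minutes.
  const timeDecay = Math.pow(0.5, opts.ageMinutes / HALF_LIFE_MIN);
  return (
    opts.rawSentiment *
    timeDecay *
    opts.engagement *
    opts.flairBonus *
    opts.sourceWeight
  );
}
```

For example, a 2-hour-old DD post in a 0.9-weight subreddit with raw sentiment 0.8 scores 0.8 × 0.5 × 1 × 1.5 × 0.9 = 0.54.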
The signal pipeline was expanded to include more data sources, higher limits, and better UX:
Data sources and gatherers
- Finnhub — `getUpgradeDowngrade()`, `getMarketNews()`, `getInsiderTransactions()`; harness method `gatherFinnhub()` (requires `FINNHUB_API_KEY`).
- FMP (Financial Modeling Prep) — `getMarketGainers()`, `getMarketLosers()`, `getMostActive()`; harness method `gatherFMPScreener()` (requires `FMP_API_KEY`).
- SEC EDGAR — 8-K and Form 4 feeds, 40 entries each.
- StockTwits — Up to 30 symbols, equities-specific endpoint.
- Reddit — 5 additional subreddits (9 total), 50 posts per subreddit.
- QuiverQuant — New provider and harness method `gatherQuiverQuant()` (optional; requires `QUIVER_API_KEY` when used).
- Alpaca — New screener (most actives, movers) via `gatherAlpacaScreener()`, and `gatherAlpacaNews()` for news on screened symbols (uses existing Alpaca credentials).
Backend
- MAX_SIGNALS increased to 500 (merged signals before research).
- Deduplication of signals by symbol/source to avoid duplicate research.
- Composite scoring improved for ranking which signals get researched.
- Research limit (e.g. top 10) — only the top N signals by composite score are sent to the analyst LLM.
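The dedup and top-N research cut above can be sketched like this. As a simplifying assumption, composite score here is just absolute sentiment; the real composite formula is richer.

```typescript
interface Sig {
  symbol: string;
  source: string;
  sentiment: number;
}

// Deduplicate by symbol/source (keep the strongest signal per pair),
// then rank and keep only the top N for the analyst LLM.
function dedupeAndRank(signals: Sig[], topN: number): Sig[] {
  const seen = new Map<string, Sig>();
  for (const s of signals) {
    const key = `${s.symbol}:${s.source}`;
    const prev = seen.get(key);
    if (!prev || Math.abs(s.sentiment) > Math.abs(prev.sentiment)) {
      seen.set(key, s);
    }
  }
  return [...seen.values()]
    .sort((a, b) => Math.abs(b.sentiment) - Math.abs(a.sentiment))
    .slice(0, topN); // only the top N reach the research step
}
```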
Frontend
- No 20-signal cap — dashboard shows the full signal set (up to backend limit).
- Source filter tabs — filter signals by source (e.g. StockTwits, Reddit, Finnhub).
- Signal strength — indicators (e.g. STRONG) for high composite score.
Rate limiting
- Centralized RateLimiter utility with per-provider budgets so all data providers respect API limits without blocking each other.
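A per-provider budget limiter of this kind might look like the following sketch; the repo's actual RateLimiter API may differ.

```typescript
// Per-provider call budgets: exhausting one provider's budget never
// blocks the others, matching the behavior described above.
class BudgetLimiter {
  private used = new Map<string, number>();

  constructor(private budgets: Record<string, number>) {} // calls per window

  // Returns true and consumes one call if the provider has budget left.
  tryAcquire(provider: string): boolean {
    const limit = this.budgets[provider] ?? 0;
    const used = this.used.get(provider) ?? 0;
    if (used >= limit) return false; // this provider is exhausted
    this.used.set(provider, used + 1);
    return true;
  }

  // Call on a timer when the rate-limit window rolls over.
  resetWindow(): void {
    this.used.clear();
  }
}
```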
Logging
- Application logs from the harness (`this.log(agent, action, details)`) are written to in-memory `state.logs` and also to `console.log`, so they appear in the session log files (e.g. `logs/<timestamp>.log`) when running `./start`. Targeted logs for the analyst run, the BUY+confidence filter, and the Alpaca order submit can be added so the "BUY → attempt purchase" path is visible in those same log files.
Twitter is not required for the sentiment pipeline — it only confirms signals that already passed the threshold. To enable it:
- Create a developer account at developer.x.com
- Generate a Bearer Token
- Set the secret: `npx wrangler secret put TWITTER_BEARER_TOKEN`
Rate-limited to 200 reads/day ($1–2/day). Results are cached for 5 minutes.
```
StockTwits ──→ score × sourceWeight × freshness ──────────────────────────────┐
Reddit ────→ rawSentiment × (timeDecay × engagement × flair × sourceWeight) ──┤
SEC ───────→ staticSentiment × sourceWeight × freshness ──────────────────────┤
Crypto ────→ momentum-based rawSentiment ─────────────────────────────────────┘
                                │
                signalCache (top 200 by |sentiment|)
                                │
           ┌────────────────────┴─────────────────────┐
           │                                          │
  researchTopSignals()                     analyzeSignalsWithLLM()
  raw_sentiment >= 0.3                     avgSentiment >= 0.15
  top 5 → individual LLM                   top 10 → batch LLM
           │                                          │
           └────────────────────┬─────────────────────┘
                                │
                     ┌── Twitter Confirm ──┐
                     │ +15% if confirms    │
                     │ -15% if contradicts │
                     └────────┬────────────┘
                              │
                     confidence >= 0.6?
                              │
                        executeBuy()
```
Set DISCORD_WEBHOOK_URL to receive real-time trade alerts as rich embeds in a Discord channel. This is optional — if not set, notifications are silently skipped.
| Type | Trigger | Content |
|---|---|---|
| Signal alert | High sentiment detected for a symbol | Ticker, sentiment %, sources, "researching..." status |
| Research verdict | LLM returns a BUY verdict | Ticker, verdict (BUY/SKIP/HOLD), confidence %, entry quality, reasoning (truncated to 300 chars), catalysts, red flags |
Notifications are rate-limited to one per symbol every 30 minutes to prevent spam.
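The per-symbol cooldown and webhook call can be sketched as follows. The embed payload shape follows Discord's execute-webhook API; the function names are assumptions for illustration.

```typescript
const COOLDOWN_MS = 30 * 60 * 1000; // one notification per symbol per 30 min
const lastNotified = new Map<string, number>();

// Returns true (and records the send) only if the symbol is off cooldown.
function shouldNotify(symbol: string, now: number = Date.now()): boolean {
  const last = lastNotified.get(symbol) ?? -Infinity;
  if (now - last < COOLDOWN_MS) return false;
  lastNotified.set(symbol, now);
  return true;
}

async function notifyDiscord(
  webhookUrl: string | undefined,
  symbol: string,
  text: string
): Promise<void> {
  // Unset webhook → notifications are silently skipped, as documented.
  if (!webhookUrl || !shouldNotify(symbol)) return;
  await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ embeds: [{ title: symbol, description: text }] }),
  });
}
```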
- In Discord: Channel Settings → Integrations → Webhooks → New Webhook
- Copy the webhook URL
- Set the secret: `npx wrangler secret put DISCORD_WEBHOOK_URL`
All embeds include a footer: "LLM Stock Trader • Not financial advice • DYOR"
| Endpoint | Description |
|---|---|
| `/agent/status` | Full status (account, positions, signals) |
| `/agent/enable` | Enable the agent |
| `/agent/disable` | Disable the agent |
| `/agent/config` | GET or POST configuration |
| `/agent/costs` | GET costs, or DELETE to reset cost tracker |
| `/agent/logs` | Get recent logs |
| `/agent/trigger` | Manually trigger (for testing) |
| `/agent/kill` | Emergency kill switch (uses KILL_SWITCH_SECRET) |
| `/agent/symbol-detail/:symbol` | On-demand market data for a symbol (tooltip data) |
| `/mcp` | MCP server for tool access |
All /agent/* endpoints require Bearer token authentication using `LLM_STOCK_TRADER_API_TOKEN`:

```bash
curl -H "Authorization: Bearer $LLM_STOCK_TRADER_TOKEN" https://llm-stock-trader.bernardoalmeida2004.workers.dev/agent/status
```

Generate a secure token: `openssl rand -base64 48`
The /agent/kill endpoint uses a separate `KILL_SWITCH_SECRET` for emergency shutdown:

```bash
curl -H "Authorization: Bearer $KILL_SWITCH_SECRET" https://llm-stock-trader.bernardoalmeida2004.workers.dev/agent/kill
```

This immediately disables the agent, cancels all alarms, and clears the signal cache.
For additional security with SSO/email verification, set up Cloudflare Access:
```bash
# 1. Create a Cloudflare API token with Access:Edit permissions
#    https://dash.cloudflare.com/profile/api-tokens

# 2. Run the setup script
CLOUDFLARE_API_TOKEN=your-token \
CLOUDFLARE_ACCOUNT_ID=your-account-id \
LLM_STOCK_TRADER_WORKER_URL=https://llm-stock-trader.your-subdomain.workers.dev \
LLM_STOCK_TRADER_ALLOWED_EMAILS=you@example.com \
npm run setup:access
```

This creates a Cloudflare Access Application with email verification or One-Time PIN.
```
llm-stock-trader/
├── start                      # Dev launcher (backend + dashboard)
├── wrangler.jsonc             # Cloudflare Workers config
├── .dev.vars                  # Local secrets (gitignored)
├── src/
│   ├── index.ts               # Entry point
│   ├── durable-objects/
│   │   ├── llm-stock-trader-harness.ts  # THE HARNESS - customize this!
│   │   └── session.ts
│   ├── mcp/                   # MCP server & tools
│   ├── policy/                # Trade validation
│   └── providers/             # Alpaca, LLM, Finnhub, FMP, news clients
├── scripts/
│   ├── test-alpaca.ts         # Alpaca API connection test
│   ├── list-models.sh         # Fetch OpenRouter models + pricing
│   └── setup-access.ts        # Cloudflare Access setup
├── dashboard/                 # React dashboard (Vite + React + Tailwind)
│   └── src/
│       ├── index.css          # Synthwave '84 theme, glow utilities, CRT overlay CSS
│       ├── components/        # Panel, LineChart, ModelPicker, CrtEffect, etc.
│       └── hooks/             # useOpenRouterModels (dynamic model fetching)
├── docs/                      # Documentation
└── migrations/                # D1 database migrations
```
The Account panel displays several financial metrics. Some are passed through directly from the Alpaca API, while others are derived in the dashboard.
| Metric | Source | Calculated By |
|---|---|---|
| Equity | Alpaca `/v2/account` → `equity` | Alpaca (direct) |
| Cash | Alpaca `/v2/account` → `cash` | Alpaca (direct) |
| Buying Power | Alpaca `/v2/account` → `buying_power` | Alpaca (direct) |
| Last Equity | Alpaca `/v2/account` → `last_equity` | Alpaca (direct) |
| Starting Equity | Alpaca `/v2/account/portfolio/history` → `base_value` | Auto-captured on first run |
| Unrealized P&L | Alpaca `/v2/positions` → sum of `unrealized_pl` | Dashboard |
| Daily P&L | Derived from equity and last_equity | Dashboard |
| Total P&L | Derived from equity and starting_equity | Dashboard |
| Realized P&L | Derived from total and unrealized P&L | Dashboard |
The starting_equity value is automatically derived from Alpaca on first run — it is not hardcoded. The harness calls Alpaca's Portfolio History API and reads the base_value field, which represents the account's initial funded value (e.g., $100,000 for a standard paper account).
Once captured, the value is persisted in the Durable Object config. To re-derive it from Alpaca (e.g., after a reset), use the Reset button next to "Starting Equity" in the Settings modal — this clears the persisted value so the next status poll re-fetches it.
Daily P&L — change since previous trading day's close: `daily_pnl = equity - last_equity`

Total P&L — all-time change since account inception: `total_pnl = equity - starting_equity`

Unrealized P&L — sum of gains/losses on open positions: `unrealized_pnl = sum(unrealized_pl)`

Realized P&L — derived from the accounting identity: `realized_pnl = total_pnl - unrealized_pnl`
For an account that started at $100,000 with current equity of $106,035:
| Metric | Formula | Value |
|---|---|---|
| Starting Equity | Alpaca `base_value` | $100,000.00 |
| Last Equity | Alpaca `last_equity` (prev close) | $100,000.00 |
| Equity | Alpaca `equity` | $106,035.52 |
| Daily P&L | equity - last_equity | $6,035.52 (+6.04%) |
| Total P&L | equity - starting_equity | $6,035.52 (+6.04%) |
| Unrealized P&L | sum of `unrealized_pl` | $6,278.21 |
| Realized P&L | total - unrealized | -$242.69 |
> Note: Daily and Total P&L are identical on day 1 (when `last_equity` equals `starting_equity`). They diverge from day 2 onward as `last_equity` updates to each day's closing equity while `starting_equity` remains fixed.
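The dashboard's derivations can be expressed as a small function. Field names follow Alpaca's account and positions responses (Alpaca returns them as strings; assume they're already parsed to numbers here).

```typescript
interface AccountNumbers {
  equity: number;          // current account value
  last_equity: number;     // previous trading day's close
  starting_equity: number; // captured base_value on first run
}

interface PositionNumbers {
  unrealized_pl: number; // per-position unrealized P&L
}

function derivePnl(acct: AccountNumbers, positions: PositionNumbers[]) {
  const daily = acct.equity - acct.last_equity;
  const total = acct.equity - acct.starting_equity;
  const unrealized = positions.reduce((sum, p) => sum + p.unrealized_pl, 0);
  const realized = total - unrealized; // accounting identity from the text
  return { daily, total, unrealized, realized };
}
```

Plugging in the worked example ($100,000 start, $106,035.52 equity, $6,278.21 unrealized) reproduces the table: daily/total $6,035.52 and realized -$242.69.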
Cash and Buying Power come directly from Alpaca's account API — the dashboard does not calculate these.
- Cash — uninvested dollars in the account (starting cash minus cost of purchases plus proceeds from sales)
- Buying Power — total amount available to open new positions, factoring in RegT margin. For a margin-eligible account this is typically ~2× settled cash, reduced by the margin requirements of existing positions
The positions table shows all open positions with the following columns:
| Column | Description |
|---|---|
| Symbol | Ticker symbol (crypto symbols show ₿ prefix). Hover for detailed tooltip. |
| Shares | Number of shares/units held |
| Value | Current market value of the position |
| Today P&L | Intraday unrealized profit/loss (uses unrealized_intraday_pl) |
| Total P&L | All-time unrealized profit/loss since entry (uses unrealized_pl) |
| Diversity | Position weight as a percentage of total portfolio value |
| Trend | Sparkline showing recent price movement |
Hovering over a symbol in the positions table loads detailed market data on-demand from the /agent/symbol-detail/:symbol endpoint. Data is organized in sections:
| Section | Fields | Source |
|---|---|---|
| Quote | Bid (price × size), Ask (price × size), Bid-Ask Spread | Alpaca snapshot |
| Trading | Volume, Overnight Volume (--), Average Volume | Alpaca snapshot + Finnhub |
| Price | Open, Today's High, Today's Low | Alpaca snapshot |
| Fundamentals | Market Cap, 52 Week High/Low, P/E Ratio, Dividend Yield | Finnhub (equities) or FMP (crypto) |
| Short Info | Short Inventory (Available/Unavailable), Borrow Rate (--) | Alpaca asset (equities only) |
| Position | Entry Price, Current Price, Hold Time, Entry Sentiment, Staleness | Existing position data |
Finnhub provides equity fundamentals (market cap, P/E ratio, dividend yield, 52-week high/low, average volume) for all US stocks with no symbol restrictions.
- Free tier: 60 calls/minute, no daily cap
- Endpoint: `/api/v1/stock/metric?symbol=X&metric=all`
- Data is cached in KV for 15 minutes
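The KV caching pattern used for both providers can be sketched as follows. The `KvLike` interface mirrors Cloudflare Workers KV (`get`/`put` with `expirationTtl`); the helper name is an assumption.

```typescript
// Minimal subset of the Cloudflare Workers KV binding interface.
interface KvLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl: number }): Promise<void>;
}

// Fetch-through cache: serve from KV if present, otherwise call the
// upstream API and store the result with a TTL (900s for Finnhub,
// 300s for FMP per the text above).
async function cachedFetchJson(
  kv: KvLike,
  key: string,
  ttlSeconds: number,
  fetcher: () => Promise<unknown>
): Promise<unknown> {
  const hit = await kv.get(key);
  if (hit !== null) return JSON.parse(hit); // cache hit: no upstream call
  const fresh = await fetcher();
  await kv.put(key, JSON.stringify(fresh), { expirationTtl: ttlSeconds });
  return fresh;
}
```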
FMP (Financial Modeling Prep) provides crypto market data for major pairs (BTCUSD, ETHUSD, SOLUSD, etc.).
- Free tier: 250 calls/day
- Endpoint: `/stable/quote?symbol=BTCUSD`
- Data is cached in KV for 5 minutes
Both providers are optional. If API keys are not configured, the respective fields show "--" in the tooltip. Alpaca data (bid/ask, volume, OHLC, shortable status) is always available.
- Create a free account at https://finnhub.io/register
- Copy your API key from the Finnhub dashboard
- For local development: add `FINNHUB_API_KEY=your_key` to `.dev.vars`
- For production: `npx wrangler secret put FINNHUB_API_KEY`
- Create a free account at https://site.financialmodelingprep.com/register
- Copy your API key from the FMP dashboard
- For local development: add `FMP_API_KEY=your_key` to `.dev.vars`
- For production: `npx wrangler secret put FMP_API_KEY`
The dashboard uses a Synthwave '84 color palette with neon glow effects, defined as CSS custom properties in dashboard/src/index.css.
All colors are pulled from the Synthwave '84 VS Code theme:
```css
@theme {
  /* Backgrounds — deep purple-blue */
  --color-hud-bg: #262335;
  --color-hud-bg-panel: #2a2139;

  /* Core accents */
  --color-hud-primary: #36f9f6;   /* Neon cyan */
  --color-hud-success: #72f1b8;   /* Neon green */
  --color-hud-warning: #fede5d;   /* Neon yellow */
  --color-hud-error: #fe4450;     /* Neon red */
  --color-hud-purple: #ff7edb;    /* Neon pink */
  --color-hud-cyan: #03edf9;      /* Bright cyan */

  /* Text */
  --color-hud-text: #b6b1b1;        /* Muted warm grey */
  --color-hud-text-dim: #848bbd;    /* Lavender */
  --color-hud-text-bright: #f4eee4; /* Off-white */
}
```

Text glow utilities (`.glow-cyan`, `.glow-green`, `.glow-pink`, `.glow-red`, `.glow-yellow`, `.glow-orange`) apply text-shadow halos inspired by the Synthwave '84 glow CSS. They're used on panel titles, status indicators, large metric values, and the header title. A pink-to-cyan gradient stripe (`.neon-stripe`) accents panel headers and dividers.
A toggleable CRT screen effect is available via the [CRT] button in the top-left of the header. It adds:
- Scanlines — Thin horizontal bars with a slow retrace scroll
- Vignette — Radial gradient darkening screen edges
- Static noise — Tiled noise texture animated via CSS
- Flicker — Subtle brightness/contrast oscillation on the page
- Chromatic aberration — Faint red/cyan fringe at screen edges
The effect is CSS-only with zero per-frame JavaScript (noise tile is generated once on mount). Inspired by CRTFilter.js. The preference is persisted to localStorage.
All overlay layers use pointer-events: none and high z-index so they never block interaction.
| Feature | Description |
|---|---|
| Paper Trading | Start with ALPACA_PAPER=true |
| Kill Switch | Emergency halt via secret |
| Position Limits | Max positions and $ per position |
| Daily Loss Limit | Stops trading after 2% daily loss |
| Staleness Detection | Auto-exit stale positions |
| Duplicate Buy Prevention | Tracks recently ordered symbols for 5 minutes to prevent duplicate buys while orders are pending_new |
| Crypto/Stock Path Isolation | Analyst excludes crypto symbols from stock buy candidates, preventing double-buying through both the crypto and analyst execution paths |
| No Margin | Cash-only trading |
| No Shorting | Long positions only |
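The duplicate-buy guard in the table above can be sketched like this; it is a simplified illustration, and the harness's actual tracking may differ.

```typescript
const RECENT_ORDER_TTL_MS = 5 * 60 * 1000; // symbols stay "hot" for 5 minutes
const recentlyOrdered = new Map<string, number>(); // symbol → order timestamp

// Returns false while a recent order for the symbol may still be pending_new,
// preventing a second buy from being submitted for the same ticker.
function canBuy(symbol: string, now: number = Date.now()): boolean {
  const ts = recentlyOrdered.get(symbol);
  if (ts !== undefined && now - ts < RECENT_ORDER_TTL_MS) return false;
  recentlyOrdered.set(symbol, now); // record the attempt before submitting
  return true;
}
```

Recording the timestamp before submitting the order is deliberate: if the guard only recorded fills, two analyst cycles could both pass the check while the first order is still pending.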
Join our Discord for help and discussion:
This software is provided for educational and informational purposes only. Nothing in this repository constitutes financial, investment, legal, or tax advice.
By using this software, you acknowledge and agree that:
- All trading and investment decisions are made at your own risk
- Markets are volatile and you can lose some or all of your capital
- No guarantees of performance, profits, or outcomes are made
- The authors and contributors are not responsible for any financial losses
- This software may contain bugs or behave unexpectedly
- Past performance does not guarantee future results
Always start with paper trading and never risk money you cannot afford to lose.
MIT License - Free for personal and commercial use. See LICENSE for full terms.