A modern web application for analyzing cryptocurrency and financial assets using advanced risk-adjusted performance metrics like Omega and Sharpe ratios across multiple timeframes.
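As a minimal illustration of the two headline metrics (not necessarily the app's exact implementation — the threshold, annualization factor, and risk-free rate below are assumptions):

```python
import numpy as np

def sharpe_ratio(returns, risk_free_rate=0.0, periods_per_year=365):
    """Annualized Sharpe ratio from daily returns (crypto trades 365 days/year)."""
    r = np.asarray(returns, dtype=float)
    excess = r - risk_free_rate / periods_per_year
    return excess.mean() / excess.std(ddof=1) * np.sqrt(periods_per_year)

def omega_ratio(returns, threshold=0.0):
    """Omega ratio: sum of gains above the threshold over sum of losses below it."""
    excess = np.asarray(returns, dtype=float) - threshold
    gains = excess[excess > 0].sum()
    losses = -excess[excess < 0].sum()
    return gains / losses if losses > 0 else float("inf")

daily_returns = [0.02, -0.01, 0.015, -0.005, 0.03]
print(f"Omega:  {omega_ratio(daily_returns):.2f}")
print(f"Sharpe: {sharpe_ratio(daily_returns):.2f}")
```

Both metrics reward upside, but Omega penalizes only returns below the threshold, which is why the app reports them side by side.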
- 26 Major Cryptocurrencies - Curated selection of established tokens
- Yahoo Finance Integration - Reliable historical data via yfinance
- Smart Delta Updates - Only fetches new data since last update
- GitHub Actions Automation - Daily data updates and deployment
- Static Hosting Ready - Deploy to Vercel, Netlify, or GitHub Pages
- Efficient Data Management - Minimal API calls with intelligent caching
- Daily Updates: GitHub Actions runs Python script to fetch latest data
- Delta Logic: Only downloads new days since last update
- Static Files: Data stored as JSON files in `public/data/`
- Frontend: React app loads the static JSON files directly
- Yahoo Finance via the `yfinance` Python library
- ✅ Free and reliable
- ✅ Comprehensive OHLCV data
- ✅ No API key required
- ✅ Historical data going back years
Currently tracking 26 major tokens:
- Major: BTC, ETH, BNB, XRP, SOL, ADA, DOGE
- DeFi: LINK, AAVE, DOT, AVAX, ATOM, NEAR
- Layer 1: TRX, ALGO, VET, ICP, HBAR
- Others: LTC, BCH, XLM, ETC, XMR, SHIB, QNT, FIL
```bash
# Node.js for frontend
npm install

# Python for data scripts
pip install -r scripts/requirements.txt
```

```bash
# Frontend development
npm run dev

# Build for production
npm run build

# Update crypto data manually
cd scripts && python download-historical-data.py
```

The Python script (`scripts/download-historical-data.py`):
- Checks existing data files in `public/data/`
- Determines the last update date for each token
- Fetches only new data since the last update (delta)
- Updates JSON files with new price/volume data
- Creates/updates `index.json` with metadata
Each token has its own JSON file (`public/data/{token-id}.json`):

```json
{
  "symbol": "BTC-USD",
  "crypto_id": "bitcoin",
  "name": "BTC",
  "last_updated": "2025-10-21T10:30:00",
  "data_points": 2847,
  "earliest_date": "2017-01-01T00:00:00",
  "latest_date": "2025-10-21T00:00:00",
  "prices": [[timestamp_ms, price], ...],
  "total_volumes": [[timestamp_ms, volume], ...]
}
```

Two workflows handle automation:
- Combined Update & Deploy (6 AM UTC daily)
  - Updates crypto data using the Python script
  - Commits changes if data was updated
  - Deploys to GitHub Pages automatically
- Manual Deploy (on push to main)
  - Builds and deploys to GitHub Pages
  - Useful for code changes
No setup required! Workflows use the built-in `GITHUB_TOKEN` permissions.
```
src/
├── components/                  # React components
│   ├── CryptoAnalyzer.tsx       # Main analyzer component
│   ├── TokenDataTable.tsx       # Overall rankings table
│   ├── OmegaRatioTable.tsx      # Omega ratio analysis
│   └── SharpeRatioTable.tsx     # Sharpe ratio analysis
├── types/                       # TypeScript definitions
├── utils/                       # Utility functions
└── App.tsx                      # Main application
scripts/
├── download-historical-data.py  # Data fetching script
└── requirements.txt             # Python dependencies
public/
└── data/                        # Generated crypto data (JSON files)
    ├── bitcoin.json
    ├── ethereum.json
    └── index.json               # Metadata and token list
.github/workflows/               # GitHub Actions automation
├── update-crypto-data.yml
├── deploy-vercel.yml
└── update-and-deploy.yml
```
- Enable GitHub Pages in repository settings
- Set source to "GitHub Actions"
- Workflows handle everything automatically:
- Daily data updates at 6 AM UTC
- Automatic deployment when data changes
- No manual intervention needed!
For other hosting providers:
- Update data: `cd scripts && python download-historical-data.py`
- Build: `npm run build`
- Deploy the `dist/` folder to any static host
No environment variables, API keys, or secrets required!
- Add the Yahoo Finance symbol to `CRYPTO_SYMBOLS` in `download-historical-data.py`
- Add a mapping to the `SYMBOL_TO_ID` dictionary
- Run the data update script
- The token will appear in the frontend automatically
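For illustration, the two lookups might look like this (the entries below are hypothetical examples; check the actual script for its real contents):

```python
# Illustrative excerpt in the style of scripts/download-historical-data.py
# (not verbatim). UNI-USD / "uniswap" is an example of a newly added token.

# Yahoo Finance ticker symbols to track.
CRYPTO_SYMBOLS = [
    "BTC-USD",
    "ETH-USD",
    "UNI-USD",   # <- new token added here
]

# Map each Yahoo symbol to the id used for its JSON file in public/data/.
SYMBOL_TO_ID = {
    "BTC-USD": "bitcoin",
    "ETH-USD": "ethereum",
    "UNI-USD": "uniswap",  # <- matching mapping for the new token
}

# Every tracked symbol needs a file-id mapping, or the script can't name its JSON file.
assert all(sym in SYMBOL_TO_ID for sym in CRYPTO_SYMBOLS)
```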
Modify the cron schedule in `.github/workflows/update-and-deploy.yml`:

```yaml
schedule:
  - cron: '0 6 * * *'  # Daily at 6 AM UTC
```

- Smart Updates: Only fetches new days, not full history
- Static Files: No runtime API calls needed
- Cached Data: Reuses existing historical data
- Minimal Bandwidth: JSON files are compact and cacheable
- 1000 users: Still only 1 daily API update
- No Rate Limits: Users load static files
- Fast Loading: Pre-calculated data, no computation needed
- Fork the repository
- Create a feature branch
- Test data updates: `cd scripts && python download-historical-data.py`
- Test frontend: `npm run dev`
- Submit a pull request
MIT License - see LICENSE file for details.