# ANNOTATE - Manual Trade Annotation UI

## 🎯 Overview

A professional web-based interface for manually marking profitable buy/sell signals on historical market data, used to generate high-quality training test cases for machine learning models.

**Status:** ✅ Production Ready - core features complete and tested
## ✨ Key Features
### 📊 Multi-Timeframe Visualization
- 4 synchronized charts: 1s, 1m, 1h, 1d timeframes
- Candlestick + Volume: Professional trading view
- Interactive navigation: Zoom, pan, scroll
- Hover details: OHLCV information on hover
### 🎯 Trade Annotation
- Click to mark: Entry point (▲) and exit point (▼)
- Visual feedback: Color-coded markers (green=LONG, red=SHORT)
- P&L calculation: Automatic profit/loss percentage
- Connecting lines: Dashed lines between entry/exit
- Edit/Delete: Modify or remove annotations
### 📦 Test Case Generation
- Realtime format: Identical to training test cases
- Market context: Full OHLCV data for all timeframes
- Data consistency: Uses same DataProvider as training/inference
- Auto-save: Test cases saved to JSON files
### 🔄 Data Integration
- Existing DataProvider: No duplicate data fetching
- Cached data: Leverages existing cache
- Same quality: Identical data structure as models see
- Multi-symbol: Supports ETH/USDT, BTC/USDT
### 🎨 Professional UI
- Dark theme: Matches main dashboard
- Template-based: All HTML in separate files
- Responsive: Works on different screen sizes
- Keyboard shortcuts: Arrow keys for navigation
## 🚀 Quick Start

### Installation

No additional dependencies are needed; the tool uses the existing project dependencies.

### Running the Application

```bash
# Start the annotation UI
python ANNOTATE/web/app.py
```

The UI is then available at http://127.0.0.1:8051.
## 📖 Usage Guide
### 1. Navigate to Time Period
- Date picker: Jump to specific date/time
- Quick ranges: 1h, 4h, 1d, 1w buttons
- Arrow keys: ← → to scroll through time
- Mouse: Zoom with scroll wheel, pan by dragging
### 2. Mark a Trade

1. Click on the chart at the entry point - the entry marker (▲) appears and the status shows "Entry marked"
2. Click again at the exit point - the exit marker (▼) appears
3. P&L is calculated and displayed
4. The annotation is saved automatically
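The P&L figure shown in this step can be sketched as below. This is a minimal illustration of the percentage calculation, assuming the prices from the examples in this README; `compute_pnl_pct` is a hypothetical helper, not the actual function in `annotation_manager.py`:

```python
def compute_pnl_pct(entry_price: float, exit_price: float, direction: str) -> float:
    """Return profit/loss as a percentage of the entry price."""
    change = (exit_price - entry_price) / entry_price * 100
    # A SHORT profits when price falls, so the sign flips
    return change if direction == "LONG" else -change

# Example: the LONG trade used in this README's sample payloads
print(round(compute_pnl_pct(2400.50, 2460.75, "LONG"), 1))  # 2.5
```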
### 3. Manage Annotations
- View: Click eye icon (👁️) to navigate to annotation
- Generate test case: Click file icon (📄)
- Delete: Click trash icon (🗑️)
- Export: Click download button to export all
### 4. Generate Test Cases

- Click the file icon next to any annotation
- The test case is generated with full market context
- Saved to `ANNOTATE/data/test_cases/`, ready for model training
## 📁 Project Structure

```text
ANNOTATE/
├── web/                      # Web application
│   ├── app.py                # Main Flask/Dash application
│   ├── templates/            # Jinja2 HTML templates
│   │   ├── base_layout.html
│   │   ├── annotation_dashboard.html
│   │   └── components/
│   └── static/               # Static assets
│       ├── css/
│       ├── js/
│       └── images/
├── core/                     # Core business logic
│   ├── annotation_manager.py
│   ├── training_simulator.py
│   └── data_loader.py
├── data/                     # Data storage
│   ├── annotations/
│   ├── test_cases/
│   └── training_results/
└── tests/                    # Test files
```
## 🔧 API Endpoints

### Chart Data

```http
POST /api/chart-data
Content-Type: application/json

{
  "symbol": "ETH/USDT",
  "timeframes": ["1s", "1m", "1h", "1d"],
  "start_time": "2024-01-15T10:00:00Z",
  "end_time": "2024-01-15T11:00:00Z"
}
```

### Save Annotation

```http
POST /api/save-annotation
Content-Type: application/json

{
  "symbol": "ETH/USDT",
  "timeframe": "1m",
  "entry": {"timestamp": "...", "price": 2400.50},
  "exit": {"timestamp": "...", "price": 2460.75}
}
```

### Generate Test Case

```http
POST /api/generate-test-case
Content-Type: application/json

{
  "annotation_id": "uuid-string"
}
```

### Available Models

```http
GET /api/available-models
```
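As a minimal sketch of calling the chart-data endpoint from a script, using only the standard library and assuming the server from Quick Start is running on port 8051:

```python
import json
import urllib.request

# Payload matching the /api/chart-data schema above
payload = {
    "symbol": "ETH/USDT",
    "timeframes": ["1s", "1m", "1h", "1d"],
    "start_time": "2024-01-15T10:00:00Z",
    "end_time": "2024-01-15T11:00:00Z",
}

req = urllib.request.Request(
    "http://127.0.0.1:8051/api/chart-data",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once the annotation UI is running:
# with urllib.request.urlopen(req) as resp:
#     chart_data = json.load(resp)
```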
## 🔗 Integration with Main System

### Import in Main Dashboard

```python
from ANNOTATE.core.annotation_manager import AnnotationManager
from ANNOTATE.core.training_simulator import TrainingSimulator
from ANNOTATE.core.data_loader import HistoricalDataLoader

# Initialize with existing components
annotation_mgr = AnnotationManager()
training_sim = TrainingSimulator(orchestrator)
data_loader = HistoricalDataLoader(data_provider)

# Use generated test cases
test_cases = annotation_mgr.get_test_cases()
```
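Since balancing LONG and SHORT trades is recommended under Best Practices, a hedged usage sketch follows: `split_by_direction` is a hypothetical helper (not part of `AnnotationManager`) that assumes each test case is a dict in the Test Case Format shown later in this README:

```python
def split_by_direction(test_cases):
    """Group test cases into LONG and SHORT buckets, e.g. to balance a training batch."""
    buckets = {"LONG": [], "SHORT": []}
    for tc in test_cases:
        direction = tc.get("expected_outcome", {}).get("direction")
        if direction in buckets:
            buckets[direction].append(tc)
    return buckets

# Example with minimal stand-in test cases
cases = [
    {"expected_outcome": {"direction": "LONG"}},
    {"expected_outcome": {"direction": "SHORT"}},
    {"expected_outcome": {"direction": "LONG"}},
]
buckets = split_by_direction(cases)
print(len(buckets["LONG"]), len(buckets["SHORT"]))  # 2 1
```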
### Data Flow

```text
ANNOTATE UI → HistoricalDataLoader → DataProvider (existing)
                                            ↓
                                   Training/Inference
```
## 📊 Test Case Format

Generated test cases match the realtime format:

```json
{
  "test_case_id": "annotation_uuid",
  "symbol": "ETH/USDT",
  "timestamp": "2024-01-15T10:30:00Z",
  "action": "BUY",
  "market_state": {
    "ohlcv_1s": {
      "timestamps": [...],
      "open": [...],
      "high": [...],
      "low": [...],
      "close": [...],
      "volume": [...]
    },
    "ohlcv_1m": {...},
    "ohlcv_1h": {...},
    "ohlcv_1d": {...}
  },
  "expected_outcome": {
    "direction": "LONG",
    "profit_loss_pct": 2.5,
    "entry_price": 2400.50,
    "exit_price": 2460.75,
    "holding_period_seconds": 300
  },
  "annotation_metadata": {
    "annotator": "manual",
    "confidence": 1.0,
    "notes": "",
    "created_at": "2024-01-15T11:00:00Z"
  }
}
```
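A test case with this structure can be sanity-checked before training. The sketch below is an illustrative validator, not the project's actual loader; it only checks the top-level keys and the four timeframes named in the format above:

```python
import json
from pathlib import Path

REQUIRED_KEYS = {"test_case_id", "symbol", "timestamp", "action",
                 "market_state", "expected_outcome"}
TIMEFRAME_KEYS = {"ohlcv_1s", "ohlcv_1m", "ohlcv_1h", "ohlcv_1d"}

def validate_test_case(tc: dict) -> list:
    """Return a list of problems; an empty list means the test case looks complete."""
    problems = [f"missing key: {k}" for k in REQUIRED_KEYS - tc.keys()]
    market_state = tc.get("market_state", {})
    problems += [f"missing timeframe: {k}" for k in TIMEFRAME_KEYS - market_state.keys()]
    return problems

def load_test_cases(directory="ANNOTATE/data/test_cases"):
    """Load every JSON test case in the directory, skipping incomplete ones."""
    cases = []
    for path in Path(directory).glob("*.json"):
        tc = json.loads(path.read_text())
        if not validate_test_case(tc):
            cases.append(tc)
    return cases
```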
## 🎓 Best Practices
### Marking Trades
- Be selective: Only mark clear, high-confidence trades
- Use multiple timeframes: Confirm patterns across timeframes
- Add notes: Document why you marked the trade
- Review before generating: Verify entry/exit points are correct
### Test Case Generation
- Generate after marking: Create test cases immediately
- Verify market context: Check that OHLCV data is complete
- Organize by strategy: Use notes to categorize trade types
- Export regularly: Backup annotations periodically
### Model Training
- Start with quality: Better to have fewer high-quality annotations
- Diverse scenarios: Mark different market conditions
- Balance directions: Include both LONG and SHORT trades
- Test incrementally: Train with small batches first
## 🐛 Troubleshooting
### Charts not loading
- Check DataProvider is initialized
- Verify data is available for selected timeframes
- Check browser console for errors
### Annotations not saving

- Ensure the `ANNOTATE/data/annotations/` directory exists
- Check file permissions
- Verify the JSON format is valid
### Test cases missing market context
- Confirm DataProvider has cached data
- Check timestamp is within available data range
- Verify all timeframes have data
## 📚 Documentation

- Implementation Summary: `ANNOTATE/IMPLEMENTATION_SUMMARY.md`
- Progress Tracking: `ANNOTATE/PROGRESS.md`
- Spec Files: `.kiro/specs/manual-trade-annotation-ui/`
## 🎯 Future Enhancements
- Real-time model training integration
- Inference simulation with playback
- Performance metrics dashboard
- Annotation templates
- Collaborative annotation
- Advanced filtering and search
- Annotation quality scoring
## 📄 License
Part of the AI Trading System project.
## 🙏 Acknowledgments

Built with:
- Flask & Dash for the web framework
- Plotly for interactive charts
- Bootstrap for UI components
- The existing DataProvider for data consistency