Alexo19 committed · Commit 06d59d5 · verified · 1 parent: 1cbf1a4

You are an elite full-stack developer and quant engineer.


🚀 GOAL
Build a production-ready web app called **CryptoSignal-Sleuth** hosted as a Hugging Face Space.
This app should be a powerful, transparent, multi-model crypto signal dashboard that is clearly **more advanced and informative than tradvio.com** (AI chart screenshot analyzer that outputs buy/sell/hold with entry, SL and TP). The app does NOT make trade guarantees; it only provides analysis and ideas.

The app will:
- Let a user **upload a trading screenshot** OR **select a crypto pair and timeframe** for live market analysis.
- Run multiple **Hugging Face models** (vision, time-series, sentiment, LLM reasoning) to generate trade ideas.
- Output structured **signals**: direction, entry zone, stop loss, take profit, confidence %, time horizon, and a human-readable explanation.
- Expose clean **webhook endpoints** so I can connect it to **n8n** and then to Discord / Telegram bots.

====================================
1. TECH STACK & PROJECT STRUCTURE
====================================

Use a stack that runs well on Hugging Face Spaces:

- **Backend**: Python 3, FastAPI (or a lightweight API framework) for REST endpoints.
- **Frontend**: React + TypeScript + Vite (or Next.js without server-side rendering), styled with TailwindCSS.
- **App wrapper for HF Space**: A simple `app.py` that mounts the FastAPI and serves the built frontend (or use `Gradio` just as a wrapper hosting the frontend in an iframe).

Project structure (example):

- `app.py` – HF entry point.
- `backend/`
  - `main.py` – FastAPI app, routing.
  - `models_registry.py` – central registry of Hugging Face models used.
  - `signal_engine.py` – core logic that combines all AI outputs into one signal.
  - `image_analysis.py` – screenshot parsing and pattern detection.
  - `timeseries_analysis.py` – price data handling and forecasting.
  - `sentiment_analysis.py` – news/social sentiment functions.
  - `config.py` – env var loading (API keys, base URLs).
- `frontend/`
  - `index.html`
  - `src/`
    - `main.tsx`
    - `App.tsx`
    - `components/` (Navbar, Sidebar, ChartUploadCard, PairSelector, SignalPanel, HistoryTable, ModelSettingsPanel, etc.)
    - `pages/` (Dashboard, Backtest, Settings, Docs)
    - `lib/api.ts` – API client for FastAPI endpoints.

Include a `requirements.txt` and a simple `README.md` with how to run locally and in Spaces.

====================================
2. CORE USER FLOWS
====================================

The app should support two main workflows:

**(A) Screenshot Signal (Tradvio-style but stronger)**
1. User uploads a PNG/JPEG of their trading chart (e.g. from TradingView, Bybit, Binance).
2. Backend:
   - Uses a vision model from Hugging Face (e.g. `microsoft/resnet-50` or another suitable vision transformer) to extract features from the image.
   - Optionally uses OCR (e.g. `microsoft/trocr-base-stage1` or any HF OCR model) to read prices, labels, and text on the chart.
   - Passes the extracted info into an LLM (e.g. `Qwen/QwQ-32B` or another instruction-following open-source LLM on HF) with a system prompt that instructs it to:
     - Identify the trend (bullish, bearish, range).
     - Detect major support/resistance and patterns (S/R flips, trendlines, wedges, channels, double tops/bottoms, etc.).
     - Decide whether this looks like a BUY, SELL, or NO-TRADE zone.
     - Propose ENTRY, STOP LOSS, and 1–3 TAKE PROFIT targets.
     - Provide a confidence score of 0–100%.
   - Returns structured JSON with:
     - `direction`
     - `entry_zone`
     - `stop_loss`
     - `take_profit_levels` (array)
     - `timeframe_inferred`
     - `confidence`
     - `explanation`
3. Frontend shows:
   - Uploaded chart preview.
   - A card with the **final combined signal**, color-coded: green for long/buy, red for short/sell, gray for neutral.
   - Text explanation with bullet points and a “Copy as text” button.
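The LLM step above amounts to building one structured prompt from the OCR and vision outputs. A minimal sketch of that prompt builder (the wording, helper name, and field list are illustrative, not the final prompt):

```python
SYSTEM_PROMPT = (
    "You are a crypto chart analyst. Given the extracted chart features and OCR text, "
    "identify the trend (bullish/bearish/range), major support/resistance and patterns, "
    "then decide BUY, SELL, or NO-TRADE. Respond ONLY with JSON containing: "
    "direction, entry_zone, stop_loss, take_profit_levels, timeframe_inferred, "
    "confidence (0-100), explanation."
)

def build_prompt(ocr_text: str, features_summary: str) -> str:
    """Combine the system instructions with the per-chart extraction results."""
    return (
        f"{SYSTEM_PROMPT}\n\n"
        f"OCR text:\n{ocr_text}\n\n"
        f"Vision features:\n{features_summary}"
    )
```

Keeping the schema spelled out inside the prompt makes it easier to parse the LLM reply back into the signal JSON.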

**(B) Live Market Signal (pair + timeframe)**
1. User selects:
   - Exchange (a simple dropdown: Binance, Bybit, KuCoin – no real auth needed yet).
   - Crypto pair (e.g. BTCUSDT, ETHUSDT).
   - Timeframe (e.g. 5m, 15m, 1h, 4h, 1D).
2. Backend:
   - Calls a placeholder market data fetch function (e.g. the Binance public API) OR, if external internet is not allowed, simulates candles with random data – but structure the code so it is easy to connect to a real API later.
   - Uses a Hugging Face **time-series model** or generic transformer to analyze candles and predict short-term bias (e.g. `huggingface/time-series-transformer-*` or any suitable open-source model).
   - Uses a **technical analysis module** (TA-Lib or custom functions) to compute indicators: EMA, RSI, MACD, ATR, etc.
   - Feeds both the model predictions and the technical data into the **signal engine** (see section 3) to produce a structured signal with the same schema as above.
3. Frontend shows:
   - Pair + timeframe at the top.
   - A small sparkline / mini candle chart (a simple JS chart library rendering the returned OHLC).
   - Signal cards for:
     - HTF Bias (higher-timeframe trend)
     - LTF Bias (current timeframe)
     - Suggested position type (Scalp / Swing / No Trade)
   - A short “What this means” text for beginners.
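The indicator step does not strictly require TA-Lib; two of the listed indicators can be sketched in pure Python (illustrative helpers under the usual EMA/Wilder-RSI definitions, not the final module):

```python
def ema(prices: list[float], period: int) -> list[float]:
    """Exponential moving average, seeded with the first price."""
    k = 2 / (period + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(p * k + out[-1] * (1 - k))
    return out

def rsi(prices: list[float], period: int = 14) -> float:
    """Relative Strength Index with Wilder smoothing."""
    gains, losses = [], []
    for prev, cur in zip(prices, prices[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    for g, l in zip(gains[period:], losses[period:]):
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period
    if avg_loss == 0:
        return 100.0  # no losing candles in the window
    return 100 - 100 / (1 + avg_gain / avg_loss)
```

Swapping these for TA-Lib later only changes the implementation of `timeseries_analysis.py`, not the signal engine's inputs.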

====================================
3. MULTI-MODEL SIGNAL ENGINE
====================================

Implement a `SignalEngine` class in `signal_engine.py` that:

- Accepts inputs from:
  - Vision + OCR (from the screenshot).
  - Time-series model outputs (price prediction, volatility).
  - Sentiment model outputs (optional).
  - Technical indicators (EMA, RSI, etc.).
- Normalizes everything into a common score from -1 (strong short) to +1 (strong long).
- Uses simple, transparent rules to combine scores – for example, a weighted average of:
  - Trend score (0.4)
  - Momentum score (0.2)
  - Pattern score (0.2)
  - Sentiment score (0.2)
- Converts the final score into:
  - `direction`: LONG / SHORT / NEUTRAL
  - `confidence`: 0–100%
- Calculates:
  - `entry_zone` using the recent swing high/low plus a small buffer.
  - `stop_loss` using ATR or the last structure high/low.
  - `take_profit_levels` as 1:2 and 1:3 risk-reward multiples.
- Returns consistent JSON so the front-end can always render it.
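The 1:2 / 1:3 take-profit rule is plain arithmetic on the risk distance between entry and stop. A minimal sketch (the `take_profits` helper name is illustrative):

```python
def take_profits(entry: float, stop: float, ratios=(2, 3)) -> list[float]:
    """Project take-profit levels at the given risk-reward multiples."""
    risk = abs(entry - stop)          # distance risked per unit
    sign = 1 if entry > stop else -1  # stop below entry means long, so targets sit above
    return [entry + sign * r * risk for r in ratios]
```

For example, a long entered at 100 with a stop at 95 risks 5 points, so the 1:2 and 1:3 targets land at 110 and 115.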

In `models_registry.py`:
- Create a simple registry mapping model names to tasks and HF model IDs, for example:
  - `"vision_feature_extractor"` → `google/vit-base-patch16-224`
  - `"ocr"` → a TrOCR model
  - `"sentiment"` → `yiyanghkust/finbert-tone` or a similar financial sentiment model
  - `"general_llm"` → a chat/instruction model on HF
- This makes it easy to plug in **any future Hugging Face model** without rewriting the whole app.

====================================
4. INTEGRATIONS: N8N & DISCORD
====================================

The app DOES NOT directly auto-trade. It only sends signals out via webhooks so that users can automate in n8n or elsewhere.

Backend:
- Expose an endpoint: `POST /api/signals/webhook-test`
  - Accepts a JSON payload `{ pair, timeframe, direction, entry_zone, stop_loss, take_profit_levels, confidence, explanation }`.
  - Also accepts an optional `webhook_url`.
  - If `webhook_url` is present, sends the signal JSON to that URL (for an n8n or Discord webhook).
- Include a simple helper in the docs:
  - Example n8n workflow: HTTP Trigger → send to Discord / Telegram / broker API.
- Store `WEBHOOK_DEFAULT_URL` and `DISCORD_WEBHOOK_URL` as environment variables via `config.py`, but don’t hard-code any real secrets.

Frontend:
- On the **Settings** page, allow the user to:
  - Paste a `Webhook URL`.
  - Toggle “Auto-send signal to webhook when generated”.
- Add a “Send this signal to webhook” button on each signal result card.

====================================
5. PAGES & UI DESIGN
====================================

Overall layout inspired by modern trading dashboards and Tradvio, but unique:

- **Top Navbar**:
  - Logo text: `CryptoSignal-Sleuth`
  - Right side: “Dashboard”, “Backtest (coming soon)”, “Settings”, “Docs”
- **Left Sidebar**:
  - Section “Analysis Modes”:
    - Screenshot Analyzer
    - Live Market Analyzer
  - Section “History”: shows a list of the last 20 signals with pair, direction, timestamp, and confidence.

**Dashboard Page layout:**
- Two-column responsive layout.
- Left column:
  - Card: “Upload Chart Screenshot”
    - Drop-zone, preview thumbnail, “Analyze Screenshot” button.
  - Card: “Live Market Selector”
    - Exchange dropdown, pair input, timeframe dropdown, “Analyze Market” button.
- Right column:
  - Card: “Latest Signal”
    - Large status badge (LONG / SHORT / NEUTRAL).
    - Entry, SL, TP levels in a small table.
    - Confidence % with a simple progress bar.
    - “Copy signal as text” button.
    - “Send to webhook” button.
  - Card: “Explanation”
    - Bullet-point reasoning from the LLM.
    - Small disclaimer text.

**History / Table:**
- Simple table listing:
  - Timestamp
  - Mode (Screenshot / Live)
  - Pair / Instrument
  - Direction
  - Confidence
- Click a row → open a modal with full details and explanation.

**Settings Page:**
- Text inputs for:
  - Default Pair
  - Default Timeframe
  - Webhook URL
- Risk settings (default RR, max risk per trade – just for display in the explanation, not enforcement).
- Save settings to browser localStorage.

**Docs Page:**
- Plain text / rendered Markdown section that explains:
  - What the app does.
  - Which models are used (with links to Hugging Face model cards; use placeholders).
  - Clear disclaimer: not financial advice.

Use a clean dark theme suitable for trading:
- Dark background, subtle cards, readable fonts.
- Make sure everything is mobile-responsive.

====================================
6. API DESIGN (BACKEND)
====================================

Implement the following FastAPI routes:

- `POST /api/analyze/screenshot`
  - `multipart/form-data` with `file`.
  - Returns structured signal JSON.
- `POST /api/analyze/market`
  - JSON body: `{ pair, timeframe, exchange? }`.
  - Returns structured signal JSON.
- `GET /api/history`
  - Returns the last N saved signals from storage (in-memory for now).

app.py ADDED
```python
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles
from fastapi.middleware.cors import CORSMiddleware

from backend.main import router as api_router

app = FastAPI(title="CryptoSignal-Sleuth")

# CORS configuration
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Mount backend API and frontend static files
app.include_router(api_router, prefix="/api")
app.mount("/", StaticFiles(directory="frontend/dist", html=True), name="frontend")

# For Hugging Face Spaces
def get_app():
    return app
```
backend/main.py ADDED
```python
from fastapi import APIRouter, UploadFile, File, HTTPException
from pydantic import BaseModel
from typing import Optional, List
from datetime import datetime
import uuid

from .models_registry import ModelsRegistry
from .signal_engine import SignalEngine

router = APIRouter()

# Initialize models and engine
models = ModelsRegistry()
engine = SignalEngine(models)

class MarketRequest(BaseModel):
    pair: str
    timeframe: str
    exchange: Optional[str] = "binance"

class SignalResponse(BaseModel):
    direction: str  # "LONG", "SHORT", or "NEUTRAL"
    entry_zone: List[float]
    stop_loss: float
    take_profit_levels: List[float]
    timeframe_inferred: str
    confidence: float  # 0-100
    explanation: str
    signal_id: str
    timestamp: datetime

class WebhookTestRequest(BaseModel):
    webhook_url: Optional[str] = None
    signal: SignalResponse

@router.post("/analyze/screenshot", response_model=SignalResponse)
async def analyze_screenshot(file: UploadFile = File(...)):
    try:
        # Generate a unique ID for this signal
        signal_id = str(uuid.uuid4())

        # In a real implementation, we would:
        # 1. Process the image with vision models
        # 2. Extract text with OCR if needed
        # 3. Analyze with the LLM
        # For now, return a mock response.
        return SignalResponse(
            direction="LONG",
            entry_zone=[42000, 42500],
            stop_loss=41500,
            take_profit_levels=[43500, 44500],
            timeframe_inferred="1h",
            confidence=75.5,
            explanation=(
                "The chart shows a bullish breakout from a descending wedge pattern "
                "with increasing volume. Key support at 41,500 and first target at 43,500."
            ),
            signal_id=signal_id,
            timestamp=datetime.now(),
        )
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

@router.post("/analyze/market", response_model=SignalResponse)
async def analyze_market(request: MarketRequest):
    try:
        signal_id = str(uuid.uuid4())

        # In a real implementation, we would:
        # 1. Fetch market data
        # 2. Analyze with time-series models
        # 3. Combine with technical indicators
        # Mock response: keep direction and levels consistent per pair.
        is_btc = "BTC" in request.pair.upper()
        return SignalResponse(
            direction="SHORT" if is_btc else "NEUTRAL",
            entry_zone=[42000, 42500] if is_btc else [2850, 2900],
            stop_loss=43500 if is_btc else 2950,
            take_profit_levels=[40000, 39000] if is_btc else [],
            timeframe_inferred=request.timeframe,
            confidence=68.2,
            explanation=(
                f"Market shows overbought conditions on the {request.timeframe} timeframe "
                "with RSI above 70. Expecting a pullback to support levels."
            ),
            signal_id=signal_id,
            timestamp=datetime.now(),
        )
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

@router.get("/history", response_model=List[SignalResponse])
async def get_history(limit: int = 20):
    # In a real implementation, this would fetch from a database
    return []

@router.post("/signals/webhook-test")
async def test_webhook(request: WebhookTestRequest):
    # In a real implementation, this would POST the signal to request.webhook_url
    return {"status": "success", "message": "Webhook test successful (mock response)"}
```
backend/models_registry.py ADDED
```python
from typing import Optional, Dict, Any
from huggingface_hub import AsyncInferenceClient

class ModelsRegistry:
    def __init__(self):
        # Central mapping of tasks to Hugging Face model IDs
        self.models = {
            "vision_feature_extractor": "google/vit-base-patch16-224",
            "ocr": "microsoft/trocr-base-stage1",
            "sentiment": "yiyanghkust/finbert-tone",
            "timeseries": "huggingface/time-series-transformer-tiny",
            "general_llm": "mistralai/Mistral-7B-v0.1",
        }
        self.client = AsyncInferenceClient()

    def get_model(self, model_name: str) -> Optional[str]:
        return self.models.get(model_name)

    async def analyze_sentiment(self, text: str) -> Dict[str, float]:
        """Get sentiment scores from the HF Inference API."""
        try:
            result = await self.client.text_classification(
                text, model=self.models["sentiment"]
            )
            return {item.label: item.score for item in result}
        except Exception:
            # Neutral fallback when the model is unavailable
            return {"neutral": 0.5, "positive": 0.25, "negative": 0.25}

    async def analyze_image(self, image_data: bytes) -> Dict[str, Any]:
        """Get image features/classes from the vision model."""
        try:
            result = await self.client.image_classification(
                image_data, model=self.models["vision_feature_extractor"]
            )
            return {"features": result}
        except Exception:
            return {"features": []}

    async def generate_text(self, prompt: str) -> str:
        """Generate analysis text with the general LLM."""
        try:
            return await self.client.text_generation(
                prompt, model=self.models["general_llm"]
            )
        except Exception:
            return "Model unavailable - using fallback response"
```
backend/signal_engine.py ADDED
```python
from typing import Dict, Any
from dataclasses import dataclass
import random

from .models_registry import ModelsRegistry

@dataclass
class SignalComponents:
    trend_score: float      # -1 to +1
    momentum_score: float   # -1 to +1
    pattern_score: float    # -1 to +1
    sentiment_score: float  # -1 to +1
    volatility: float       # 0 to 1

class SignalEngine:
    def __init__(self, models: ModelsRegistry):
        self.models = models

    def generate_signal(self, components: SignalComponents) -> Dict[str, Any]:
        """Combine all analysis components into a final signal."""
        # Weighted average of scores (weights from section 3 of the spec)
        final_score = (
            0.4 * components.trend_score
            + 0.2 * components.momentum_score
            + 0.2 * components.pattern_score
            + 0.2 * components.sentiment_score
        )

        # Determine direction and confidence
        if abs(final_score) < 0.2:
            direction = "NEUTRAL"
            # Confidence in NEUTRAL is highest when the score is closest to zero
            confidence = int(100 - (abs(final_score) / 0.2) * 50)
        else:
            direction = "LONG" if final_score > 0 else "SHORT"
            confidence = int((abs(final_score) - 0.2) / 0.8 * 90) + 10

        # Calculate levels based on volatility
        base_price = 42000  # In the real app, this comes from market data
        atr = components.volatility * base_price * 0.05

        if direction == "LONG":
            entry_min = base_price - atr * 0.5
            entry_max = base_price
            stop_loss = base_price - atr * 1.5
            take_profit = [base_price + atr * 2, base_price + atr * 3]
        elif direction == "SHORT":
            entry_min = base_price
            entry_max = base_price + atr * 0.5
            stop_loss = base_price + atr * 1.5
            take_profit = [base_price - atr * 2, base_price - atr * 3]
        else:  # NEUTRAL: no trade, so levels are placeholders only
            entry_min = base_price - atr * 0.2
            entry_max = base_price + atr * 0.2
            stop_loss = base_price + atr * 2 * (-1 if random.random() > 0.5 else 1)
            take_profit = []

        # Generate explanation from the component scores
        explanation_parts = []
        if components.trend_score > 0.3:
            explanation_parts.append("Strong bullish trend detected.")
        elif components.trend_score < -0.3:
            explanation_parts.append("Strong bearish trend detected.")

        if components.momentum_score > 0.4:
            explanation_parts.append("Momentum favors upside.")
        elif components.momentum_score < -0.4:
            explanation_parts.append("Momentum favors downside.")

        if components.pattern_score > 0.5:
            explanation_parts.append("Bullish chart pattern identified.")
        elif components.pattern_score < -0.5:
            explanation_parts.append("Bearish chart pattern identified.")

        if not explanation_parts:
            explanation_parts.append("No strong signals detected - market may be ranging.")

        return {
            "direction": direction,
            "entry_zone": [entry_min, entry_max],
            "stop_loss": stop_loss,
            "take_profit_levels": take_profit,
            "confidence": confidence,
            "explanation": " ".join(explanation_parts),
        }

    async def analyze_market(self, pair: str, timeframe: str) -> Dict[str, Any]:
        """Analyze market conditions and generate a signal."""
        # In a real implementation, we would:
        # 1. Fetch market data
        # 2. Call time-series models
        # 3. Calculate technical indicators
        # 4. Get sentiment if available
        # Mock analysis for demo purposes
        components = SignalComponents(
            trend_score=random.uniform(-1, 1),
            momentum_score=random.uniform(-0.5, 0.8),
            pattern_score=random.uniform(-1, 1),
            sentiment_score=random.uniform(-0.3, 0.3),
            volatility=random.uniform(0.2, 0.8),
        )
        return self.generate_signal(components)

    async def analyze_screenshot(self, image_data: bytes) -> Dict[str, Any]:
        """Analyze an uploaded chart screenshot."""
        # In a real implementation, we would:
        # 1. Process the image with the vision model
        # 2. Extract text with OCR
        # 3. Analyze patterns and generate a signal
        # Mock analysis for demo purposes
        components = SignalComponents(
            trend_score=random.uniform(0, 1),  # mock skews screenshots toward long ideas
            momentum_score=random.uniform(0, 0.9),
            pattern_score=random.uniform(0.1, 1),
            sentiment_score=random.uniform(0, 0.5),
            volatility=random.uniform(0.3, 0.9),
        )
        return self.generate_signal(components)
```
frontend/src/App.tsx ADDED
```tsx
import { useState } from 'react'
import { BrowserRouter as Router, Routes, Route } from 'react-router-dom'
import Navbar from './components/Navbar'
import Sidebar from './components/Sidebar'
import Dashboard from './pages/Dashboard'
import History from './pages/History'
import Settings from './pages/Settings'
import Docs from './pages/Docs'
import './App.css'

function App() {
  const [darkMode, setDarkMode] = useState(true)

  return (
    <div className={`flex flex-col min-h-screen ${darkMode ? 'dark bg-gray-900 text-gray-100' : 'bg-white text-gray-900'}`}>
      <Router>
        <Navbar darkMode={darkMode} toggleDarkMode={() => setDarkMode(!darkMode)} />
        <div className="flex flex-1 overflow-hidden">
          <Sidebar />
          <main className="flex-1 overflow-auto p-4">
            <Routes>
              <Route path="/" element={<Dashboard />} />
              <Route path="/history" element={<History />} />
              <Route path="/settings" element={<Settings />} />
              <Route path="/docs" element={<Docs />} />
            </Routes>
          </main>
        </div>
      </Router>
    </div>
  )
}

export default App
```
frontend/src/components/Navbar.tsx ADDED
```tsx
import { NavLink } from 'react-router-dom'
import { FiActivity, FiMoon, FiSun } from 'react-icons/fi'

interface NavbarProps {
  darkMode: boolean
  toggleDarkMode: () => void
}

// Shared styling for nav links; highlights the active route
const linkClass = ({ isActive }: { isActive: boolean }) =>
  `hover:text-purple-500 transition ${isActive ? 'text-purple-500 font-medium' : ''}`

export default function Navbar({ darkMode, toggleDarkMode }: NavbarProps) {
  return (
    <header className="border-b border-gray-800 py-4">
      <div className="container mx-auto px-4 flex justify-between items-center">
        <div className="flex items-center space-x-2">
          <FiActivity className="text-purple-500 text-xl" />
          <span className="text-xl font-bold bg-gradient-to-r from-purple-500 to-orange-500 bg-clip-text text-transparent">
            CryptoSignal-Sleuth
          </span>
        </div>
        <div className="hidden md:flex space-x-6">
          <NavLink to="/" className={linkClass}>Dashboard</NavLink>
          <NavLink to="/history" className={linkClass}>History</NavLink>
          <NavLink to="/settings" className={linkClass}>Settings</NavLink>
          <NavLink to="/docs" className={linkClass}>Docs</NavLink>
        </div>
        <button
          onClick={toggleDarkMode}
          className="p-2 rounded-lg hover:bg-gray-800 transition"
        >
          {darkMode ? <FiSun className="w-5 h-5" /> : <FiMoon className="w-5 h-5" />}
        </button>
      </div>
    </header>
  )
}
```
frontend/src/main.tsx ADDED
```tsx
import React from 'react'
import ReactDOM from 'react-dom/client'
import App from './App'
import './index.css'

ReactDOM.createRoot(document.getElementById('root')!).render(
  <React.StrictMode>
    <App />
  </React.StrictMode>,
)
```