Compare commits

68 Commits

| Author | SHA1 | Date |
|---|---|---|
| | `7d825a07f8` | |
| | `c5fec61123` | |
| | `9437959568` | |
| | `d1846b102d` | |
| | `5cd28bd1d9` | |
| | `d48595e082` | |
| | `1f3a1e5b3f` | |
| | `c2bcfbf646` | |
| | `28d57924ee` | |
| | `f83681188c` | |
| | `3acb0efc62` | |
| | `1961b4c9e2` | |
| | `de47c7c228` | |
| | `4f610a329a` | |
| | `019092ed7d` | |
| | `31bccfb957` | |
| | `fcbab3bf72` | |
| | `878626b440` | |
| | `01a97b57c0` | |
| | `e48e122bbd` | |
| | `c6ee92fb66` | |
| | `f338ddbc87` | |
| | `20255b3edf` | |
| | `db550faab0` | |
| | `7ec4151d6c` | |
| | `989f47f80d` | |
| | `f8b05bb34d` | |
| | `ea5283dd43` | |
| | `f004d80cc8` | |
| | `08f7407837` | |
| | `6ce59eee33` | |
| | `63bac7af1b` | |
| | `b7a06c732e` | |
| | `86ead4f29f` | |
| | `67a6a0727f` | |
| | `cc12997672` | |
| | `225c892847` | |
| | `bc6cff5d87` | |
| | `6bdc1ecefb` | |
| | `31e31ee7da` | |
| | `63604aee14` | |
| | `f2b685bbf5` | |
| | `0b0d14bb20` | |
| | `cf1107f736` | |
| | `e2b4d7b8fe` | |
| | `401c7d3c0e` | |
| | `340143e3e9` | |
| | `63e9fbda70` | |
| | `be711af607` | |
| | `9e47283976` | |
| | `b5c1f30b28` | |
| | `7fcc510e64` | |
| | `456fb7afb4` | |
| | `ec9e2c29bb` | |
| | `9a79bdd27a` | |
| | `5f9df98a73` | |
| | `2dca8519ce` | |
| | `81973320a3` | |
| | `a0abcceec9` | |
| | `2f07ee3bd7` | |
| | `2f509e7dd5` | |
| | `8259202f96` | |
| | `2c668c1852` | |
| | `e04664d037` | |
| | `5da4eac866` | |
| | `bffb5e5e6a` | |
| | `b22a7c0b58` | |
| | `d3961f7ef4` | |
5  .gitignore (vendored)

@@ -8,6 +8,11 @@ wheels/

```
# Virtual environments
.venv
frontend/node_modules/
frontend/test-results/

# Frontend build output (built from source by end users)
frontend/dist/
frontend/package-lock.json

# reference libraries
references/
```
312  AGENTS.md (new file)

@@ -0,0 +1,312 @@

# RemoteTerm for MeshCore

## Important Rules

**NEVER make git commits.** A human must make all commits. You may stage files and prepare commit messages, but do not run `git commit`.

If instructed to "run all tests" or "get ready for a commit" or other summative, work-ending directives, run the following and make sure they all pass green:

```bash
uv run ruff check app/ tests/ --fix   # check for Python violations
uv run ruff format app/ tests/        # format Python
uv run pyright app/                   # type-check Python
PYTHONPATH=. uv run pytest tests/ -v  # test Python

cd frontend/      # move to frontend directory
npm run lint:fix  # fix lint violations
npm run format    # format the code
npm run build     # run a frontend build
```
## Overview

A web interface for MeshCore mesh radio networks. The backend connects to a MeshCore-compatible radio over serial and exposes REST/WebSocket APIs. The React frontend provides real-time messaging and radio configuration.

**For detailed component documentation, see:**

- `app/AGENTS.md` - Backend (FastAPI, database, radio connection, packet decryption)
- `frontend/AGENTS.md` - Frontend (React, state management, WebSocket, components)
- `frontend/src/components/AGENTS.md` - Frontend visualizer feature (a particularly complex and long force-directed graph visualizer component; can skip this file unless you're working on that feature)
## Architecture Overview

```
┌─────────────────────────────────────────────────────────────────┐
│                        Frontend (React)                         │
│ ┌──────────┐ ┌──────────┐ ┌───────────┐ ┌──────────────────┐    │
│ │ StatusBar│ │ Sidebar  │ │MessageList│ │   MessageInput   │    │
│ └──────────┘ └──────────┘ └───────────┘ └──────────────────┘    │
│ ┌────────────────────────────────────────────────────────────┐  │
│ │     CrackerPanel (global collapsible, WebGPU cracking)     │  │
│ └────────────────────────────────────────────────────────────┘  │
│                           │                                     │
│       useWebSocket  ←──── Real-time updates                     │
│                           │                                     │
│       api.ts        ←──── REST API calls                        │
└───────────────────────────┼─────────────────────────────────────┘
                            │ HTTP + WebSocket (/api/*)
┌───────────────────────────┼─────────────────────────────────────┐
│                        Backend (FastAPI)                        │
│ ┌──────────┐  ┌──────────────┐  ┌────────────┐  ┌───────────┐   │
│ │ Routers  │→ │ Repositories │→ │ SQLite DB  │  │ WebSocket │   │
│ └──────────┘  └──────────────┘  └────────────┘  │  Manager  │   │
│      ↓                                          └───────────┘   │
│ ┌──────────────────────────────────────────────────────────┐    │
│ │             RadioManager + Event Handlers                │    │
│ └──────────────────────────────────────────────────────────┘    │
└───────────────────────────┼─────────────────────────────────────┘
                            │ Serial
                     ┌──────┴──────┐
                     │  MeshCore   │
                     │   Radio     │
                     └─────────────┘
```
## Key Design Principles

1. **Store-and-serve**: Backend stores all packets even when no client is connected
2. **Parallel storage**: Messages stored both decrypted (when possible) and as raw packets
3. **Extended capacity**: Server stores contacts/channels beyond radio limits (~350 contacts, ~40 channels)
4. **Real-time updates**: WebSocket pushes events; REST for actions
5. **Offline-capable**: Radio operates independently; server syncs when connected
6. **Auto-reconnect**: Background monitor detects disconnection and attempts reconnection
## Data Flow

### Incoming Messages

1. Radio receives message → MeshCore library emits event
2. `event_handlers.py` catches event → stores in database
3. `ws_manager` broadcasts to connected clients
4. Frontend `useWebSocket` receives → updates React state
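The store-first, then-broadcast ordering of that flow can be sketched as below; names like `store_message`, `ws_clients`, and the event shape are illustrative assumptions, not the app's actual API.

```python
import asyncio
import json

db: list[dict] = []                    # stands in for the SQLite store
ws_clients: list[asyncio.Queue] = []   # one queue per connected client

def store_message(msg: dict) -> None:
    # Step 2: persist the event even if no client is connected
    db.append(msg)

async def on_contact_msg(event: dict) -> None:
    store_message(event)
    # Step 3: broadcast the stored message to every connected client
    payload = json.dumps({"type": "message", "data": event})
    for q in ws_clients:
        q.put_nowait(payload)

async def main() -> None:
    inbox: asyncio.Queue = asyncio.Queue()
    ws_clients.append(inbox)
    await on_contact_msg({"text": "hi", "from": "ab12cd34ef56"})
    print(await inbox.get())

asyncio.run(main())
```

Because storage happens before broadcast, a client that connects later can still fetch the message over REST — the store-and-serve principle above.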
### Outgoing Messages

1. User types message → clicks send
2. `api.sendChannelMessage()` → POST to backend
3. Backend calls `radio_manager.meshcore.commands.send_chan_msg()`
4. Message stored in database with `outgoing=true`
5. For direct messages: ACK tracked; for channel: repeat detection
### ACK and Repeat Detection

**Direct messages**: The expected ACK code is tracked; when the ACK event arrives, the message is marked as acked.

**Channel messages**: Flood messages echo back. The decoder identifies repeats by matching (channel_idx, text_hash, timestamp ±5s) and marks the original as "acked".
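The repeat-matching rule can be sketched like this; the real logic lives in `decoder.py`, and the dict shape and hash choice here are assumptions for illustration.

```python
import hashlib

def text_hash(text: str) -> str:
    # Hash of the message body; the decoder's exact hash function may differ.
    return hashlib.sha256(text.encode()).hexdigest()

def is_repeat(original: dict, incoming: dict, window_s: int = 5) -> bool:
    """Return True if `incoming` looks like a flood echo of `original`.

    Matches on (channel_idx, text_hash, timestamp within ±window_s),
    mirroring the repeat-detection rule described above.
    """
    return (
        original["channel_idx"] == incoming["channel_idx"]
        and original["text_hash"] == incoming["text_hash"]
        and abs(original["timestamp"] - incoming["timestamp"]) <= window_s
    )
```

When `is_repeat` fires, the stored original is marked "acked" rather than inserting the echo as a new message.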
## Directory Structure

```
.
├── app/                     # FastAPI backend
│   ├── AGENTS.md            # Backend documentation
│   ├── main.py              # App entry, lifespan
│   ├── routers/             # API endpoints
│   ├── repository.py        # Database CRUD
│   ├── event_handlers.py    # Radio events
│   ├── decoder.py           # Packet decryption
│   └── websocket.py         # Real-time broadcasts
├── frontend/                # React frontend
│   ├── AGENTS.md            # Frontend documentation
│   ├── src/
│   │   ├── App.tsx          # Main component
│   │   ├── api.ts           # REST client
│   │   ├── useWebSocket.ts  # WebSocket hook
│   │   └── components/
│   │       ├── CrackerPanel.tsx  # WebGPU key cracking
│   │       ├── MapView.tsx       # Leaflet map showing node locations
│   │       └── ...
│   └── vite.config.ts
├── references/meshcore_py/  # MeshCore Python library
├── tests/                   # Backend tests (pytest)
├── data/                    # SQLite database (runtime)
└── pyproject.toml           # Python dependencies
```
## Development Setup

### Backend

```bash
# Install dependencies
uv sync

# Run server (auto-detects radio)
uv run uvicorn app.main:app --reload

# Or specify the serial port explicitly
MESHCORE_SERIAL_PORT=/dev/cu.usbserial-0001 uv run uvicorn app.main:app --reload
```
### Frontend

```bash
cd frontend
npm install
npm run dev  # http://localhost:5173, proxies /api to :8000
```
### Both Together (Development)

Terminal 1: `uv run uvicorn app.main:app --reload`

Terminal 2: `cd frontend && npm run dev`
### Production

In production, the FastAPI backend serves the compiled frontend. You must build the frontend first:

```bash
cd frontend && npm install && npm run build && cd ..
uv run uvicorn app.main:app --host 0.0.0.0 --port 8000
```

Access at `http://localhost:8000`. All API routes are prefixed with `/api`.
## Testing

### Backend (pytest)

```bash
PYTHONPATH=. uv run pytest tests/ -v
```

Key test files:

- `tests/test_decoder.py` - Channel + direct message decryption, key exchange
- `tests/test_keystore.py` - Ephemeral key store
- `tests/test_event_handlers.py` - ACK tracking, repeat detection
- `tests/test_api.py` - API endpoints, read state tracking
- `tests/test_migrations.py` - Database migration system
### Frontend (Vitest)

```bash
cd frontend
npm run test:run
```
### Before Completing Changes

**Always run both backend and frontend validation before finishing any changes:**

```bash
# From project root - run backend tests
PYTHONPATH=. uv run pytest tests/ -v

# From project root - run frontend tests and build
cd frontend && npm run test:run && npm run build
```

This catches:

- Type mismatches between frontend and backend (e.g., missing fields in TypeScript interfaces)
- Breaking changes to shared types or API contracts
- Errors that only surface during compilation
## API Summary

All endpoints are prefixed with `/api` (e.g., `/api/health`).

| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/health` | Connection status |
| GET | `/api/radio/config` | Radio configuration |
| PATCH | `/api/radio/config` | Update name, location, radio params |
| POST | `/api/radio/advertise` | Send advertisement |
| POST | `/api/radio/reconnect` | Manual radio reconnection |
| POST | `/api/radio/reboot` | Reboot radio, or reconnect if disconnected |
| PUT | `/api/radio/private-key` | Import private key to radio |
| GET | `/api/contacts` | List contacts |
| POST | `/api/contacts` | Create contact (optionally trigger historical DM decrypt) |
| POST | `/api/contacts/sync` | Pull contacts from radio |
| POST | `/api/contacts/{key}/telemetry` | Request telemetry from repeater |
| POST | `/api/contacts/{key}/command` | Send CLI command to repeater |
| GET | `/api/channels` | List channels |
| POST | `/api/channels` | Create channel |
| GET | `/api/messages` | List messages with filters |
| POST | `/api/messages/direct` | Send direct message |
| POST | `/api/messages/channel` | Send channel message |
| POST | `/api/packets/decrypt/historical` | Decrypt stored packets |
| GET | `/api/packets/decrypt/progress` | Get historical decryption progress |
| POST | `/api/packets/maintenance` | Delete old packets (cleanup) |
| POST | `/api/contacts/{key}/mark-read` | Mark contact conversation as read |
| POST | `/api/channels/{key}/mark-read` | Mark channel as read |
| GET | `/api/read-state/unreads` | Server-computed unread counts, mentions, last message times |
| POST | `/api/read-state/mark-all-read` | Mark all conversations as read |
| GET | `/api/settings` | Get app settings |
| PATCH | `/api/settings` | Update app settings |
| WS | `/api/ws` | Real-time updates |
## Key Concepts

### Contact Public Keys

- Full key: 64-character hex string
- Prefix: 12-character hex (used for matching)
- Lookups use `LIKE 'prefix%'` for matching
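The prefix lookup can be illustrated with an in-memory SQLite table; the table and column names here are assumptions for the sketch, not the app's actual schema.

```python
import sqlite3

# Stand-in for the contacts table: full 64-char hex pubkey as primary key.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE contacts (public_key TEXT PRIMARY KEY, name TEXT)")
full_key = "ab12cd34ef56" + "0" * 52  # 64-character hex string
db.execute("INSERT INTO contacts VALUES (?, ?)", (full_key, "alice"))

# Packets carry only a 12-character prefix; match it against stored full keys.
prefix = full_key[:12]
row = db.execute(
    "SELECT name FROM contacts WHERE public_key LIKE ?", (prefix + "%",)
).fetchone()
print(row[0])  # alice
```

Passing `prefix + "%"` as a bound parameter (rather than interpolating it into the SQL) keeps the query safe against injection.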
### Contact Types

- `0` - Unknown
- `1` - Client (regular node)
- `2` - Repeater
- `3` - Room
### Channel Keys

- Stored as 32-character hex string (TEXT PRIMARY KEY)
- Hashtag channels: `SHA256("#name")[:16]` converted to hex
- Custom channels: User-provided or generated
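A sketch of the hashtag derivation, assuming UTF-8 encoding of the `#name` string:

```python
import hashlib

def hashtag_channel_key(name: str) -> str:
    """Derive a hashtag channel key: first 16 bytes of SHA256("#" + name), as hex."""
    digest = hashlib.sha256(f"#{name}".encode()).digest()
    return digest[:16].hex()  # 16 bytes -> 32 hex characters

key = hashtag_channel_key("general")
print(len(key))  # 32
```

Since the derivation is deterministic, any node that knows the room name can compute the same key — which is also what makes hashtag room names brute-forceable.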
### Message Types

- `PRIV` - Direct messages
- `CHAN` - Channel messages
- Both use `conversation_key` (user pubkey for PRIV, channel key for CHAN)
### Read State Tracking

Read state (`last_read_at`) is tracked **server-side** for consistency across devices:

- Stored as Unix timestamp in `contacts.last_read_at` and `channels.last_read_at`
- Updated via `POST /api/contacts/{key}/mark-read` and `POST /api/channels/{key}/mark-read`
- Bulk update via `POST /api/read-state/mark-all-read`
- Aggregated counts via `GET /api/read-state/unreads` (server-side computation)

**State Tracking Keys (Frontend)**: Generated by `getStateKey()` for message times (sidebar sorting):

- Channels: `channel-{channel_key}`
- Contacts: `contact-{12-char-pubkey-prefix}`

**Note:** These are NOT the same as `Message.conversation_key` (the database field).
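The server-side unread computation reduces to comparing each message's received time against the conversation's `last_read_at`; a minimal sketch (the field names come from the doc, the function shape is an assumption):

```python
def unread_count(messages: list[dict], last_read_at: float) -> int:
    """Count messages received strictly after the conversation's last_read_at."""
    return sum(1 for m in messages if m["received_at"] > last_read_at)

msgs = [{"received_at": t} for t in (100, 200, 300)]
print(unread_count(msgs, last_read_at=150))  # 2
```

Marking a conversation read is then just setting `last_read_at` to "now", which the next `GET /api/read-state/unreads` call reflects for every device.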
### Server-Side Decryption

The server can decrypt packets using stored keys, both in real time and for historical packets.

**Channel messages**: Decrypted automatically when a matching channel key is available.

**Direct messages**: Decrypted server-side using the private key exported from the radio on startup. This enables DM decryption even when the contact isn't loaded on the radio. The private key is stored in memory only (see `keystore.py`).
## MeshCore Library

The `meshcore_py` library provides radio communication. Key patterns:

```python
# Connection
mc = await MeshCore.create_serial(port="/dev/ttyUSB0")

# Commands
await mc.commands.send_msg(dst, msg)
await mc.commands.send_chan_msg(channel_idx, msg)
await mc.commands.get_contacts()
await mc.commands.set_channel(idx, name, key)

# Events
mc.subscribe(EventType.CONTACT_MSG_RECV, handler)
mc.subscribe(EventType.CHANNEL_MSG_RECV, handler)
mc.subscribe(EventType.ACK, handler)
```
## Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `MESHCORE_SERIAL_PORT` | auto-detect | Serial port for radio |
| `MESHCORE_DATABASE_PATH` | `data/meshcore.db` | SQLite database location |
| `MESHCORE_MAX_RADIO_CONTACTS` | `200` | Max recent contacts to keep on radio for DM ACKs |
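Reading these variables with their documented defaults might look like the following; the backend's actual settings loader may differ.

```python
import os

# Variable names and defaults are taken from the table above.
serial_port = os.environ.get("MESHCORE_SERIAL_PORT")  # None -> auto-detect
database_path = os.environ.get("MESHCORE_DATABASE_PATH", "data/meshcore.db")
max_radio_contacts = int(os.environ.get("MESHCORE_MAX_RADIO_CONTACTS", "200"))
```

Keeping `MESHCORE_SERIAL_PORT` unset triggers auto-detection, so only `MESHCORE_DATABASE_PATH` and `MESHCORE_MAX_RADIO_CONTACTS` need explicit defaults.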
68  CHANGELOG.md

@@ -1,3 +1,71 @@

## [1.8.0] - 2026-02-07

Feature: Single hop ping
Feature: PWA viewport fixes (thanks @rgregg)
Feature (?): No frontend distribution; build it yourself ;P
Bugfix: Fix channel message send race condition (concurrent sends could corrupt shared radio slot)
Bugfix: Fix TOCTOU race in radio reconnect (duplicate connections under contention)
Bugfix: Better guarding around reconnection
Bugfix: Duplicate websocket connection fixes
Bugfix: Settings tab error cleanliness on tab swap
Bugfix: Fix path traversal vuln
UI: Swap visualizer legend ordering (yay prettier)
Misc: Perf and locking improvements
Misc: Always flood advertisements
Misc: Better packet dupe handling
Misc: Dead code cleanup, test improvements

## [1.7.1] - 2026-02-03

Feature: Clickable hyperlinks
Bugfix: More consistent public key normalization
Bugfix: Use more reliable cursor paging
Bugfix: Fix null timestamp dedupe failure
Bugfix: More consistent prefix-based message claiming on key receipt
Misc: Bot can respond to its own messages
Misc: Additional tests
Misc: Remove unneeded message dedupe logic
Misc: Resync settings after radio settings mutation

## [1.7.0] - 2026-01-27

Feature: Multi-bot functionality
Bugfix: Adjust bot code editor display and add line numbers
Bugfix: Fix clock filtering and contact lookup behavior bugs
Bugfix: Fix repeater message duplication issue
Bugfix: Correct outbound message timestamp assignment (affecting outgoing messages seen as incoming)
UI: Move advertise button to identity tab
Misc: Clarify fallback functionality for missing private key export in logs

## [1.6.0] - 2026-01-26

Feature: Visualizer: extract public key from AnonReq, add heuristic repeater disambiguation, add reset button, draggable nodes
Feature: Customizable advertising interval
Feature: In-app bot setup
Bugfix: Force contact onto radio before DM send
Misc: Remove unused code

## [1.5.0] - 2026-01-19

Feature: Network visualizer

## [1.4.1] - 2026-01-19

Feature: Add option to attempt historical DM decrypt on new-contact advertisement (disabled by default)
316  CLAUDE.md

@@ -1,315 +1,5 @@

The removed lines were a near-verbatim copy of the project documentation now kept in `AGENTS.md` (rendered above), with `CLAUDE.md` in place of `AGENTS.md` throughout, plus the since-removed `integration_test.html` references. `CLAUDE.md` is reduced to a pointer file:

# RemoteTerm for MeshCore

- `./AGENTS.md` (general project information)
- `app/AGENTS.md` - Backend (FastAPI, database, radio connection, packet decryption)
- `frontend/AGENTS.md` - Frontend (React, state management, WebSocket, components)
14
README.md
@@ -8,7 +8,7 @@ Backend server + browser interface for MeshCore mesh radio networks. Attach your
* Access your radio remotely over your network or VPN
* Brute force hashtag room names for GroupTexts you don't have keys for yet

**Warning:** This app has no authentication. Run it on a private network only -- do not expose to the internet unless you want strangers sending traffic as you.
**Warning:** This app has no auth, and is for trusted environments only. _Do not put this on an untrusted network, or open it to the public._ The bots can execute arbitrary Python code which means anyone on your network can, too. If you need access control, consider using a reverse proxy like Nginx, or extending FastAPI.



@@ -16,7 +16,7 @@ Backend server + browser interface for MeshCore mesh radio networks. Attach your

This is entirely vibecoded slop -- no warranty of fitness for any purpose. It's been lovingly guided by an engineer with a passion for clean code and good tests, but it's still mostly LLM output, so you may find some bugs.

If extending, have your LLM read the three `CLAUDE.md` files: `./CLAUDE.md`, `./frontend/CLAUDE.md`, and `./app/CLAUDE.md`.
If extending, have your LLM read the three `AGENTS.md` files: `./AGENTS.md`, `./frontend/AGENTS.md`, and `./app/AGENTS.md`.

## Requirements

@@ -57,13 +57,17 @@ usbipd bind --busid 3-8

**This approach is recommended over Docker due to intermittent serial communications issues I've seen on \*nix systems.**

The frontend is pre-built -- just run the backend:

```bash
git clone https://github.com/jkingsman/Remote-Terminal-for-MeshCore.git
cd Remote-Terminal-for-MeshCore

# Install backend dependencies
uv sync

# Build frontend
cd frontend && npm install && npm run build && cd ..

# Run server
uv run uvicorn app.main:app --host 0.0.0.0 --port 8000
```

@@ -193,7 +197,7 @@ cd /opt/remoteterm
sudo -u remoteterm uv venv
sudo -u remoteterm uv sync

# Build frontend (optional -- already built in repo and served by backend)
# Build frontend (required for the backend to serve the web UI)
cd /opt/remoteterm/frontend
sudo -u remoteterm npm install
sudo -u remoteterm npm run build
@@ -1,4 +1,4 @@
# Backend CLAUDE.md
# Backend AGENTS.md

This document provides context for AI assistants and developers working on the FastAPI backend.

@@ -35,7 +35,7 @@ app/
```
├── channels.py      # Channel CRUD, radio sync, mark-read
├── messages.py      # Message list and send (direct/channel)
├── packets.py       # Raw packet endpoints, historical decryption
├── read_state.py    # Bulk read state operations (mark-all-read)
├── read_state.py    # Read state: unread counts, mark-all-read
├── settings.py      # App settings (max_radio_contacts)
└── ws.py            # WebSocket endpoint at /api/ws
```

@@ -58,7 +58,6 @@
```python
await RawPacketRepository.mark_decrypted(packet_id, message_id)
settings = await AppSettingsRepository.get()
await AppSettingsRepository.update(auto_decrypt_dm_on_advert=True)
await AppSettingsRepository.add_favorite("contact", public_key)
await AppSettingsRepository.update_last_message_time("channel-KEY", timestamp)
```

### Radio Connection

@@ -81,12 +80,16 @@ Radio events flow through `event_handlers.py`:

| Event | Handler | Actions |
|-------|---------|---------|
| `CONTACT_MSG_RECV` | `on_contact_message` | Store message, update contact last_seen, broadcast via WS |
| `CHANNEL_MSG_RECV` | `on_channel_message` | Store message, broadcast via WS |
| `RAW_DATA` | `on_raw_data` | Store packet, try decrypt with all channel keys, detect repeats |
| `ADVERTISEMENT` | `on_advertisement` | Upsert contact with location |
| `CONTACT_MSG_RECV` | `on_contact_message` | **Fallback only** - stores DM if packet processor didn't handle it |
| `RX_LOG_DATA` | `on_rx_log_data` | Store packet, decrypt channels/DMs, broadcast via WS |
| `PATH_UPDATE` | `on_path_update` | Update contact path info |
| `NEW_CONTACT` | `on_new_contact` | Sync contact from radio's internal database |
| `ACK` | `on_ack` | Match pending ACKs, mark message acked, broadcast |

**Note on DM handling**: Direct messages are primarily handled by the packet processor via
`RX_LOG_DATA`, which decrypts using the exported private key. The `CONTACT_MSG_RECV` handler
exists as a fallback for radios without `ENABLE_PRIVATE_KEY_EXPORT=1` in firmware.

### WebSocket Broadcasting

Real-time updates use `ws_manager` singleton:
@@ -100,6 +103,10 @@
```python
await ws_manager.broadcast("message", {"id": 1, "text": "Hello"})
```

Event types: `health`, `contacts`, `channels`, `message`, `contact`, `raw_packet`, `message_acked`, `error`

**Note:** The WebSocket initial connect only sends `health`. Contacts and channels are fetched
via REST (`GET /api/contacts`, `GET /api/channels`) for faster parallel loading. The WS still
broadcasts real-time `contacts`/`channels` updates when data changes.
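The broadcast fan-out described above can be sketched as follows. This is an illustrative stand-in for `ws_manager`, not the real implementation: the envelope shape `{"type": ..., "data": ...}` and the `register` method are assumptions for the example; the real manager sends frames to connected WebSocket clients.

```python
import json

# Illustrative broadcaster: wraps (event_type, data) in an assumed
# {"type": ..., "data": ...} envelope and fans it out to every
# registered client send-callback.
class BroadcastManager:
    def __init__(self) -> None:
        self._clients: list = []

    def register(self, send) -> None:
        self._clients.append(send)

    def broadcast(self, event_type: str, data: dict) -> None:
        frame = json.dumps({"type": event_type, "data": data})
        for send in self._clients:
            send(frame)

frames: list[str] = []
manager = BroadcastManager()
manager.register(frames.append)
manager.broadcast("message", {"id": 1, "text": "Hello"})
print(frames[0])
```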
Helper functions for common broadcasts:

```python
broadcast_health(radio_connected=True, serial_port="/dev/ttyUSB0")
```

@@ -119,7 +126,7 @@
- Checks connection every 5 seconds
- Broadcasts `health` event on status change
- Attempts automatic reconnection when connection lost
- **Re-registers event handlers after successful auto-reconnect** (critical for message delivery)
- **Runs full `post_connect_setup()` after successful reconnect** (event handlers, key export, time sync, contact/channel sync, advertisements, message polling)
- Resilient to transient errors (logs and continues rather than crashing)
- Supports manual reconnection via `POST /api/radio/reconnect`

@@ -161,6 +168,26 @@
```python
if is_polling_paused():
    print("Polling is currently paused")
```

### Periodic Advertisement

The server automatically sends an advertisement every hour to announce presence on the mesh.
This helps maintain visibility to other nodes and refreshes routing information.

- Started automatically on radio connection
- Interval: 1 hour (3600 seconds)
- Uses flood mode for maximum reach

```python
from app.radio_sync import start_periodic_advert, stop_periodic_advert, send_advertisement

# Start/stop periodic advertising
start_periodic_advert()  # Started automatically in lifespan
await stop_periodic_advert()

# Manual advertisement
await send_advertisement()  # Returns True on success
```

## Database Schema

```sql
```

@@ -313,9 +340,18 @@
Direct messages use ECDH key exchange (Ed25519 → X25519) for shared secret derivation
via `keystore.py`. This enables server-side DM decryption even when contacts aren't loaded
on the radio.

**Real-time decryption**: When a `RAW_DATA` event contains a `TEXT_MESSAGE` packet, the
`packet_processor.py` attempts to decrypt it using known contact public keys and the
stored private key.
**Primary path (packet processor)**: When an `RX_LOG_DATA` event contains a `TEXT_MESSAGE`
packet, `packet_processor.py` handles the complete flow:
1. Decrypts using known contact public keys and stored private key
2. Filters CLI responses (txt_type=1 in flags)
3. Stores message in database
4. Broadcasts via WebSocket
5. Updates contact's last_contacted timestamp
6. Triggers bot if enabled

**Fallback path (event handler)**: If the packet processor can't decrypt (no private key
export, unknown contact), `on_contact_message` handles DMs from the MeshCore library's
`CONTACT_MSG_RECV` event. DB deduplication prevents double-storage when both paths fire.

**Historical decryption**: When creating a contact with `try_historical=True`, the server
attempts to decrypt all stored `TEXT_MESSAGE` packets for that contact.

@@ -457,6 +493,7 @@ All endpoints are prefixed with `/api`.
- `DELETE /api/channels/{key}` - Delete channel

### Read State
- `GET /api/read-state/unreads?name=X` - Server-computed unread counts, mention flags, and last message times
- `POST /api/read-state/mark-all-read` - Mark all contacts and channels as read

### Messages

@@ -474,14 +511,14 @@ All endpoints are prefixed with `/api`.
- `POST /api/settings/favorites` - Add a favorite
- `DELETE /api/settings/favorites` - Remove a favorite
- `POST /api/settings/favorites/toggle` - Toggle favorite status
- `POST /api/settings/last-message-time` - Update last message time for a conversation
- `POST /api/settings/migrate` - One-time migration from frontend localStorage

### WebSocket
- `WS /api/ws` - Real-time updates (health, contacts, channels, messages, raw packets)

### Static Files (Production)
In production, the backend also serves the frontend:
In production, the backend serves the frontend if `frontend/dist` exists. Users must build the
frontend first (`cd frontend && npm install && npm run build`):
- `/` - Serves `frontend/dist/index.html`
- `/assets/*` - Serves compiled JS/CSS from `frontend/dist/assets/`
- `/*` - Falls back to `index.html` for SPA routing

@@ -615,10 +652,18 @@ reboot # Restart repeater

### CLI Response Filtering

CLI responses have `txt_type=1` (vs `txt_type=0` for normal messages). The event handler
in `event_handlers.py` skips these to prevent duplicates—the command endpoint returns
the response directly, so we don't also store/broadcast via WebSocket.
CLI responses have `txt_type=1` (vs `txt_type=0` for normal messages). Both DM handling
paths filter these to prevent storage—the command endpoint returns the response directly.

**Packet processor path** (primary):
```python
# In create_dm_message_from_decrypted()
txt_type = decrypted.flags & 0x0F
if txt_type == 1:
    return None  # Skip CLI responses
```

**Event handler path** (fallback):
```python
# In on_contact_message()
txt_type = payload.get("txt_type", 0)
if txt_type == 1:
    return  # CLI response; handled by the command endpoint
```
303
app/bot.py
Normal file
@@ -0,0 +1,303 @@
"""
Bot execution module for automatic message responses.

This module provides functionality for executing user-defined Python code
in response to incoming messages. The user's code can process message data
and optionally return a response string or a list of strings.

SECURITY WARNING: This executes arbitrary Python code provided by the user.
It should only be enabled on trusted systems where the user understands
the security implications.
"""

import asyncio
import logging
import time
from concurrent.futures import ThreadPoolExecutor
from typing import Any

from fastapi import HTTPException

logger = logging.getLogger(__name__)

# Limit concurrent bot executions to prevent resource exhaustion
_bot_semaphore = asyncio.Semaphore(100)

# Dedicated thread pool for bot execution (separate from default executor)
_bot_executor = ThreadPoolExecutor(max_workers=100, thread_name_prefix="bot_")

# Timeout for bot code execution (seconds)
BOT_EXECUTION_TIMEOUT = 10

# Minimum spacing between bot message sends (seconds)
# This ensures repeaters have time to return to listening mode
BOT_MESSAGE_SPACING = 2.0

# Global state for rate limiting bot sends
_bot_send_lock = asyncio.Lock()
_last_bot_send_time: float = 0.0


def execute_bot_code(
    code: str,
    sender_name: str | None,
    sender_key: str | None,
    message_text: str,
    is_dm: bool,
    channel_key: str | None,
    channel_name: str | None,
    sender_timestamp: int | None,
    path: str | None,
) -> str | list[str] | None:
    """
    Execute user-provided bot code with message context.

    The code should define a function:
    `bot(sender_name, sender_key, message_text, is_dm, channel_key, channel_name, sender_timestamp, path)`
    that returns either None (no response), a string (single response message),
    or a list of strings (multiple messages sent in order).

    Args:
        code: Python code defining the bot function
        sender_name: Display name of the sender (may be None)
        sender_key: 64-char hex public key of sender for DMs, None for channel messages
        message_text: The message content
        is_dm: True for direct messages, False for channel messages
        channel_key: 32-char hex channel key for channel messages, None for DMs
        channel_name: Channel name (e.g. "#general" with hash), None for DMs
        sender_timestamp: Sender's timestamp from the message (may be None)
        path: Hex-encoded routing path (may be None)

    Returns:
        Response string, list of strings, or None.

    Note: This executes arbitrary code. Only use with trusted input.
    """
    if not code or not code.strip():
        return None

    # Build execution namespace with allowed imports
    namespace: dict[str, Any] = {
        "__builtins__": __builtins__,
    }

    try:
        # Execute the user's code to define the bot function
        exec(code, namespace)
    except Exception as e:
        logger.warning("Bot code compilation failed: %s", e)
        return None

    # Check if bot function was defined
    if "bot" not in namespace or not callable(namespace["bot"]):
        logger.debug("Bot code does not define a callable 'bot' function")
        return None

    bot_func = namespace["bot"]

    try:
        # Call the bot function with message context
        result = bot_func(
            sender_name,
            sender_key,
            message_text,
            is_dm,
            channel_key,
            channel_name,
            sender_timestamp,
            path,
        )

        # Validate result
        if result is None:
            return None
        if isinstance(result, str):
            return result if result.strip() else None
        if isinstance(result, list):
            # Filter to non-empty strings only
            valid_messages = [msg for msg in result if isinstance(msg, str) and msg.strip()]
            return valid_messages if valid_messages else None

        logger.debug("Bot function returned unsupported type: %s", type(result))
        return None

    except Exception as e:
        logger.warning("Bot function execution failed: %s", e)
        return None


async def process_bot_response(
    response: str | list[str],
    is_dm: bool,
    sender_key: str,
    channel_key: str | None,
) -> None:
    """
    Send the bot's response message(s) using the existing message sending endpoints.

    For DMs, sends a direct message back to the sender.
    For channel messages, sends to the same channel.

    Bot messages are rate-limited to ensure at least BOT_MESSAGE_SPACING seconds
    between sends, giving repeaters time to return to listening mode.

    Args:
        response: The response text to send, or a list of messages to send in order
        is_dm: Whether the original message was a DM
        sender_key: Public key of the original sender (for DM replies)
        channel_key: Channel key for channel message replies
    """
    # Normalize to list for uniform processing
    messages = [response] if isinstance(response, str) else response

    for message_text in messages:
        await _send_single_bot_message(message_text, is_dm, sender_key, channel_key)


async def _send_single_bot_message(
    message_text: str,
    is_dm: bool,
    sender_key: str,
    channel_key: str | None,
) -> None:
    """
    Send a single bot message with rate limiting.

    Args:
        message_text: The message text to send
        is_dm: Whether the original message was a DM
        sender_key: Public key of the original sender (for DM replies)
        channel_key: Channel key for channel message replies
    """
    global _last_bot_send_time

    from app.models import SendChannelMessageRequest, SendDirectMessageRequest
    from app.routers.messages import send_channel_message, send_direct_message
    from app.websocket import broadcast_event

    # Serialize bot sends and enforce minimum spacing
    async with _bot_send_lock:
        # Calculate how long since last bot send
        now = time.monotonic()
        time_since_last = now - _last_bot_send_time

        if _last_bot_send_time > 0 and time_since_last < BOT_MESSAGE_SPACING:
            wait_time = BOT_MESSAGE_SPACING - time_since_last
            logger.debug("Rate limiting bot send, waiting %.2fs", wait_time)
            await asyncio.sleep(wait_time)

        try:
            if is_dm:
                logger.info("Bot sending DM reply to %s", sender_key[:12])
                request = SendDirectMessageRequest(destination=sender_key, text=message_text)
                message = await send_direct_message(request)
                # Broadcast to WebSocket (endpoint returns to HTTP caller, bot needs explicit broadcast)
                broadcast_event("message", message.model_dump())
            elif channel_key:
                logger.info("Bot sending channel reply to %s", channel_key[:8])
                request = SendChannelMessageRequest(channel_key=channel_key, text=message_text)
                message = await send_channel_message(request)
                # Broadcast to WebSocket
                broadcast_event("message", message.model_dump())
            else:
                logger.warning("Cannot send bot response: no destination")
                return  # Don't update timestamp if we didn't send
        except HTTPException as e:
            logger.error("Bot failed to send response: %s", e.detail)
            return  # Don't update timestamp on failure
        except Exception as e:
            logger.error("Bot failed to send response: %s", e)
            return  # Don't update timestamp on failure

        # Update last send time after successful send
        _last_bot_send_time = time.monotonic()


async def run_bot_for_message(
    sender_name: str | None,
    sender_key: str | None,
    message_text: str,
    is_dm: bool,
    channel_key: str | None,
    channel_name: str | None = None,
    sender_timestamp: int | None = None,
    path: str | None = None,
    is_outgoing: bool = False,
) -> None:
    """
    Run all enabled bots for a message (incoming or outgoing).

    This is the main entry point called by message handlers after
    a message is successfully decrypted and stored. Bots run serially,
    and errors in one bot don't prevent others from running.

    Args:
        sender_name: Display name of the sender
        sender_key: 64-char hex public key of sender (DMs only, None for channels)
        message_text: The message content
        is_dm: True for direct messages, False for channel messages
        channel_key: Channel key for channel messages
        channel_name: Channel name (e.g. "#general"), None for DMs
        sender_timestamp: Sender's timestamp from the message
        path: Hex-encoded routing path
        is_outgoing: Whether this is our own outgoing message
    """
    # Early check if any bots are enabled (will re-check after sleep)
    from app.repository import AppSettingsRepository

    settings = await AppSettingsRepository.get()
    enabled_bots = [b for b in settings.bots if b.enabled and b.code.strip()]
    if not enabled_bots:
        return

    async with _bot_semaphore:
        logger.debug(
            "Running %d bot(s) for message from %s (is_dm=%s)",
            len(enabled_bots),
            sender_name or (sender_key[:12] if sender_key else "unknown"),
            is_dm,
        )

        # Wait for the initiating message's retransmissions to propagate through the mesh
        await asyncio.sleep(2)

        # Re-check settings after sleep (user may have changed bot config)
        settings = await AppSettingsRepository.get()
        enabled_bots = [b for b in settings.bots if b.enabled and b.code.strip()]
        if not enabled_bots:
            logger.debug("All bots disabled during wait, skipping")
            return

        # Run each enabled bot serially
        loop = asyncio.get_event_loop()
        for bot in enabled_bots:
            logger.debug("Executing bot '%s'", bot.name)
            try:
                response = await asyncio.wait_for(
                    loop.run_in_executor(
                        _bot_executor,
                        execute_bot_code,
                        bot.code,
                        sender_name,
                        sender_key,
                        message_text,
                        is_dm,
                        channel_key,
                        channel_name,
                        sender_timestamp,
                        path,
                    ),
                    timeout=BOT_EXECUTION_TIMEOUT,
                )
            except asyncio.TimeoutError:
                logger.warning(
                    "Bot '%s' execution timed out after %ds", bot.name, BOT_EXECUTION_TIMEOUT
                )
                continue  # Continue to next bot
            except Exception as e:
                logger.warning("Bot '%s' execution error: %s", bot.name, e)
                continue  # Continue to next bot

            # Send response if any
            if response:
                await process_bot_response(response, is_dm, sender_key or "", channel_key)
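The bot contract documented in `execute_bot_code`'s docstring looks like this in practice. This is an illustrative user bot (not part of the repo): it returns `None` for no reply, a string for a single reply, or a list of strings for several replies sent in order.

```python
# Illustrative user bot following the contract above: the server calls
# bot(...) with these positional arguments after each decrypted message.
def bot(sender_name, sender_key, message_text, is_dm,
        channel_key, channel_name, sender_timestamp, path):
    text = message_text.strip().lower()
    if text == "ping":
        return "pong"                                 # single reply
    if text == "whoami":
        return [f"name: {sender_name}", f"dm: {is_dm}"]  # multiple replies
    return None                                       # no reply

print(bot("alice", "ab" * 32, "ping", True, None, None, 1700000000, None))  # pong
```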
@@ -11,7 +11,6 @@ class Settings(BaseSettings):
    serial_baudrate: int = 115200
    log_level: Literal["DEBUG", "INFO", "WARNING", "ERROR"] = "INFO"
    database_path: str = "data/meshcore.db"
    max_radio_contacts: int = 200  # Max non-repeater contacts to keep on radio for DM ACKs


settings = Settings()
@@ -393,21 +393,6 @@ def parse_advertisement(payload: bytes) -> ParsedAdvertisement | None:
    )


def try_parse_advertisement(raw_packet: bytes) -> ParsedAdvertisement | None:
    """
    Try to parse a raw packet as an advertisement.
    Returns parsed advertisement if successful, None otherwise.
    """
    packet_info = parse_packet(raw_packet)
    if packet_info is None:
        return None

    if packet_info.payload_type != PayloadType.ADVERT:
        return None

    return parse_advertisement(packet_info.payload)


# =============================================================================
# Direct Message (TEXT_MESSAGE) Decryption
# =============================================================================
@@ -4,7 +4,7 @@ from typing import TYPE_CHECKING

from meshcore import EventType

from app.models import Contact
from app.models import CONTACT_TYPE_REPEATER, Contact
from app.packet_processor import process_raw_packet
from app.repository import ContactRepository, MessageRepository
from app.websocket import broadcast_event
@@ -47,55 +47,75 @@ def _cleanup_expired_acks() -> None:


async def on_contact_message(event: "Event") -> None:
    """Handle incoming direct messages.
    """Handle incoming direct messages from MeshCore library.

    Direct messages are decrypted by MeshCore library using ECDH key exchange.
    The packet processor cannot decrypt these without the node's private key.
    NOTE: DMs are primarily handled by the packet processor via RX_LOG_DATA,
    which decrypts using our exported private key. This handler exists as a
    fallback for cases where:
    1. The private key couldn't be exported (firmware without ENABLE_PRIVATE_KEY_EXPORT)
    2. The packet processor couldn't match the sender to a known contact

    The packet processor handles: decryption, storage, broadcast, bot trigger.
    This handler only stores if the packet processor didn't already handle it
    (detected via INSERT OR IGNORE returning None for duplicates).
    """
    payload = event.payload

    # Skip CLI command responses (txt_type=1) - these are handled by the command endpoint
    # and should not be stored in the database or broadcast via WebSocket
    txt_type = payload.get("txt_type", 0)
    if txt_type == 1:
        logger.debug("Skipping CLI response from %s (txt_type=1)", payload.get("pubkey_prefix"))
        return

    logger.debug("Received direct message from %s", payload.get("pubkey_prefix"))

    # Get full public key if available, otherwise use prefix
    sender_pubkey = payload.get("public_key") or payload.get("pubkey_prefix", "")
    received_at = int(time.time())

    # Look up full public key from contact database if we only have prefix
    if len(sender_pubkey) < 64:
        contact = await ContactRepository.get_by_key_prefix(sender_pubkey)
        if contact:
            sender_pubkey = contact.public_key
    # Look up contact from database - use prefix lookup only if needed
    # (get_by_key_or_prefix does exact match first, then prefix fallback)
    contact = await ContactRepository.get_by_key_or_prefix(sender_pubkey)
    if contact:
        sender_pubkey = contact.public_key.lower()

        # Promote any prefix-stored messages to this full key
        await MessageRepository.claim_prefix_messages(sender_pubkey)

        # Skip messages from repeaters - they only send CLI responses, not chat messages.
        # CLI responses are handled by the command endpoint and txt_type filter above.
        if contact.type == CONTACT_TYPE_REPEATER:
            logger.debug(
                "Skipping message from repeater %s (not stored in chat history)",
                sender_pubkey[:12],
            )
            return

    # Try to create message - INSERT OR IGNORE handles duplicates atomically
    # If the packet processor already stored this message, this returns None
    msg_id = await MessageRepository.create(
        msg_type="PRIV",
        text=payload.get("text", ""),
        conversation_key=sender_pubkey,
        sender_timestamp=payload.get("sender_timestamp"),
        sender_timestamp=payload.get("sender_timestamp") or received_at,
        received_at=received_at,
        path=payload.get("path"),
        txt_type=payload.get("txt_type", 0),
        txt_type=txt_type,
        signature=payload.get("signature"),
    )

    if msg_id is None:
        # Duplicate message (same content from same sender) - skip broadcast
        logger.debug("Duplicate direct message from %s ignored", sender_pubkey[:12])
        # Already handled by packet processor (or exact duplicate) - nothing more to do
        logger.debug("DM from %s already processed by packet processor", sender_pubkey[:12])
        return

    # If we get here, the packet processor didn't handle this message
    # (likely because private key export is not available)
    logger.debug("DM from %s handled by event handler (fallback path)", sender_pubkey[:12])

    # Build paths array for broadcast
    # Use "is not None" to include empty string (direct/0-hop messages)
    path = payload.get("path")
    paths = [{"path": path or "", "received_at": received_at}] if path is not None else None

    # Broadcast only genuinely new messages
    # Broadcast the new message
    broadcast_event(
        "message",
        {
@@ -106,17 +126,31 @@ async def on_contact_message(event: "Event") -> None:
            "sender_timestamp": payload.get("sender_timestamp"),
            "received_at": received_at,
            "paths": paths,
            "txt_type": payload.get("txt_type", 0),
            "txt_type": txt_type,
            "signature": payload.get("signature"),
            "outgoing": False,
            "acked": 0,
        },
    )

    # Update contact last_seen and last_contacted
    contact = await ContactRepository.get_by_key_prefix(sender_pubkey)
    # Update contact last_contacted (contact was already fetched above)
    if contact:
        await ContactRepository.update_last_contacted(contact.public_key, received_at)
        await ContactRepository.update_last_contacted(sender_pubkey, received_at)

    # Run bot if enabled
    from app.bot import run_bot_for_message

    await run_bot_for_message(
        sender_name=contact.name if contact else None,
        sender_key=sender_pubkey,
        message_text=payload.get("text", ""),
        is_dm=True,
        channel_key=None,
        channel_name=None,
        sender_timestamp=payload.get("sender_timestamp"),
        path=payload.get("path"),
        is_outgoing=False,
    )


async def on_rx_log_data(event: "Event") -> None:

@@ -64,14 +64,6 @@ def has_private_key() -> bool:
    return _private_key is not None


def clear_private_key() -> None:
    """Clear the stored private key from memory."""
    global _private_key, _public_key
    _private_key = None
    _public_key = None
    logger.info("Private key cleared from keystore")


async def export_and_store_private_key(mc: "MeshCore") -> bool:
    """Export private key from the radio and store it in the keystore.
43
app/main.py
@@ -9,16 +9,11 @@ from fastapi.staticfiles import StaticFiles

from app.config import setup_logging
from app.database import db
from app.event_handlers import register_event_handlers
from app.radio import radio_manager
from app.radio_sync import (
    drain_pending_messages,
    start_message_polling,
    start_periodic_sync,
    stop_message_polling,
    stop_periodic_advert,
    stop_periodic_sync,
    sync_and_offload_all,
    sync_radio_time,
)
from app.routers import (
    channels,
@@ -45,40 +40,7 @@ async def lifespan(app: FastAPI):
    try:
        await radio_manager.connect()
        logger.info("Connected to radio")
        if radio_manager.meshcore:
            register_event_handlers(radio_manager.meshcore)

            # Export and store private key for server-side DM decryption
            from app.keystore import export_and_store_private_key

            await export_and_store_private_key(radio_manager.meshcore)

            # Sync radio clock with system time
            await sync_radio_time()

            # Sync contacts/channels from radio to DB and clear radio
            logger.info("Syncing and offloading radio data...")
            result = await sync_and_offload_all()
            logger.info("Sync complete: %s", result)

            # Start periodic sync
            start_periodic_sync()

            # Send advertisement to announce our presence
            logger.info("Sending startup advertisement...")
            advert_result = await radio_manager.meshcore.commands.send_advert(flood=True)
            logger.info("Advertisement sent: %s", advert_result.type)

            await radio_manager.meshcore.start_auto_message_fetching()
            logger.info("Auto message fetching started")

            # Drain any messages that were queued before we connected
            drained = await drain_pending_messages()
            if drained > 0:
                logger.info("Drained %d pending message(s)", drained)

            # Start periodic message polling as fallback for unreliable push events
            start_message_polling()
        await radio_manager.post_connect_setup()
    except Exception as e:
        logger.warning("Failed to connect to radio on startup: %s", e)

@@ -90,6 +52,7 @@ async def lifespan(app: FastAPI):
    logger.info("Shutting down")
    await radio_manager.stop_connection_monitor()
    await stop_message_polling()
    await stop_periodic_advert()
    await stop_periodic_sync()
    if radio_manager.meshcore:
        await radio_manager.meshcore.stop_auto_message_fetching()
@@ -100,6 +100,48 @@ async def run_migrations(conn: aiosqlite.Connection) -> int:
        await set_version(conn, 9)
        applied += 1

    # Migration 10: Add advert_interval column to app_settings
    if version < 10:
        logger.info("Applying migration 10: add advert_interval column")
        await _migrate_010_add_advert_interval(conn)
        await set_version(conn, 10)
        applied += 1

    # Migration 11: Add last_advert_time column to app_settings
    if version < 11:
        logger.info("Applying migration 11: add last_advert_time column")
        await _migrate_011_add_last_advert_time(conn)
        await set_version(conn, 11)
        applied += 1

    # Migration 12: Add bot_enabled and bot_code columns to app_settings
    if version < 12:
        logger.info("Applying migration 12: add bot settings columns")
        await _migrate_012_add_bot_settings(conn)
        await set_version(conn, 12)
        applied += 1

    # Migration 13: Convert bot_enabled/bot_code to bots JSON array
    if version < 13:
        logger.info("Applying migration 13: convert to multi-bot format")
        await _migrate_013_convert_to_multi_bot(conn)
        await set_version(conn, 13)
        applied += 1

    # Migration 14: Lowercase all contact public keys and related data
    if version < 14:
        logger.info("Applying migration 14: lowercase all contact public keys")
        await _migrate_014_lowercase_public_keys(conn)
        await set_version(conn, 14)
        applied += 1

    # Migration 15: Fix NULL sender_timestamp and add null-safe dedup index
    if version < 15:
        logger.info("Applying migration 15: fix NULL sender_timestamp values")
        await _migrate_015_fix_null_sender_timestamp(conn)
        await set_version(conn, 15)
        applied += 1

    if applied > 0:
        logger.info(
            "Applied %d migration(s), schema now at version %d", applied, await get_version(conn)
@@ -629,3 +671,325 @@ async def _migrate_009_create_app_settings_table(conn: aiosqlite.Connection) ->

    await conn.commit()
    logger.debug("Created app_settings table with default values")


async def _migrate_010_add_advert_interval(conn: aiosqlite.Connection) -> None:
    """
    Add advert_interval column to app_settings table.

    This enables configurable periodic advertisement interval (default 0 = disabled).
    """
    try:
        await conn.execute("ALTER TABLE app_settings ADD COLUMN advert_interval INTEGER DEFAULT 0")
        logger.debug("Added advert_interval column to app_settings")
    except aiosqlite.OperationalError as e:
        if "duplicate column" in str(e).lower():
            logger.debug("advert_interval column already exists, skipping")
        else:
            raise

    await conn.commit()


async def _migrate_011_add_last_advert_time(conn: aiosqlite.Connection) -> None:
    """
    Add last_advert_time column to app_settings table.

    This tracks when the last advertisement was sent, ensuring we never
    advertise faster than the configured advert_interval.
    """
    try:
        await conn.execute("ALTER TABLE app_settings ADD COLUMN last_advert_time INTEGER DEFAULT 0")
        logger.debug("Added last_advert_time column to app_settings")
    except aiosqlite.OperationalError as e:
        if "duplicate column" in str(e).lower():
            logger.debug("last_advert_time column already exists, skipping")
        else:
            raise

    await conn.commit()


async def _migrate_012_add_bot_settings(conn: aiosqlite.Connection) -> None:
    """
    Add bot_enabled and bot_code columns to app_settings table.

    This enables user-defined Python code to be executed when messages are received,
    allowing for custom bot responses.
    """
    try:
        await conn.execute("ALTER TABLE app_settings ADD COLUMN bot_enabled INTEGER DEFAULT 0")
        logger.debug("Added bot_enabled column to app_settings")
    except aiosqlite.OperationalError as e:
        if "duplicate column" in str(e).lower():
            logger.debug("bot_enabled column already exists, skipping")
        else:
            raise

    try:
        await conn.execute("ALTER TABLE app_settings ADD COLUMN bot_code TEXT DEFAULT ''")
        logger.debug("Added bot_code column to app_settings")
    except aiosqlite.OperationalError as e:
        if "duplicate column" in str(e).lower():
            logger.debug("bot_code column already exists, skipping")
        else:
            raise

    await conn.commit()


async def _migrate_013_convert_to_multi_bot(conn: aiosqlite.Connection) -> None:
    """
    Convert single bot_enabled/bot_code to multi-bot format.

    Adds a 'bots' TEXT column storing a JSON array of bot configs:
    [{"id": "uuid", "name": "Bot 1", "enabled": true, "code": "..."}]

    If existing bot_code is non-empty OR bot_enabled is true, migrates
    to a single bot named "Bot 1". Otherwise, creates empty array.

    Attempts to drop the old bot_enabled and bot_code columns.
    """
    import json
    import uuid

    # Add new bots column
    try:
        await conn.execute("ALTER TABLE app_settings ADD COLUMN bots TEXT DEFAULT '[]'")
        logger.debug("Added bots column to app_settings")
    except aiosqlite.OperationalError as e:
        if "duplicate column" in str(e).lower():
            logger.debug("bots column already exists, skipping")
        else:
            raise

    # Migrate existing bot data
    cursor = await conn.execute("SELECT bot_enabled, bot_code FROM app_settings WHERE id = 1")
    row = await cursor.fetchone()

    if row:
        bot_enabled = bool(row[0]) if row[0] is not None else False
        bot_code = row[1] or ""

        # If there's existing bot data, migrate it
        if bot_code.strip() or bot_enabled:
            bots = [
                {
                    "id": str(uuid.uuid4()),
                    "name": "Bot 1",
                    "enabled": bot_enabled,
                    "code": bot_code,
                }
            ]
            bots_json = json.dumps(bots)
            logger.info("Migrating existing bot to multi-bot format: enabled=%s", bot_enabled)
        else:
            bots_json = "[]"

        await conn.execute(
            "UPDATE app_settings SET bots = ? WHERE id = 1",
            (bots_json,),
        )

    # Try to drop old columns (SQLite 3.35.0+ only)
    for column in ["bot_enabled", "bot_code"]:
        try:
            await conn.execute(f"ALTER TABLE app_settings DROP COLUMN {column}")
            logger.debug("Dropped %s column from app_settings", column)
        except aiosqlite.OperationalError as e:
            error_msg = str(e).lower()
            if "no such column" in error_msg:
                logger.debug("app_settings.%s already dropped, skipping", column)
            elif "syntax error" in error_msg or "drop column" in error_msg:
                # SQLite version doesn't support DROP COLUMN - harmless, column stays
                logger.debug("SQLite doesn't support DROP COLUMN, %s column will remain", column)
            else:
                raise

    await conn.commit()

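The duplicate-column guard repeated by migrations 10 through 13 can be exercised in isolation. A minimal sketch using the stdlib `sqlite3` module (aiosqlite wraps the same engine); the helper name `add_column_idempotent` and the in-memory table are illustrative, not part of the codebase:

```python
import sqlite3


def add_column_idempotent(conn: sqlite3.Connection, table: str, column_def: str) -> None:
    """ALTER TABLE ... ADD COLUMN, treating 'duplicate column' as success (re-run safe)."""
    try:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column_def}")
    except sqlite3.OperationalError as e:
        # SQLite reports "duplicate column name: ..." when the column already exists
        if "duplicate column" not in str(e).lower():
            raise


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_settings (id INTEGER PRIMARY KEY)")
add_column_idempotent(conn, "app_settings", "advert_interval INTEGER DEFAULT 0")
add_column_idempotent(conn, "app_settings", "advert_interval INTEGER DEFAULT 0")  # no-op
cols = [row[1] for row in conn.execute("PRAGMA table_info(app_settings)")]
print(cols)  # → ['id', 'advert_interval']
```

Catching the error rather than checking `PRAGMA table_info` first keeps the add-column step atomic and avoids a read-then-write race.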
async def _migrate_014_lowercase_public_keys(conn: aiosqlite.Connection) -> None:
    """
    Lowercase all contact public keys and related data for case-insensitive matching.

    Updates:
    - contacts.public_key (PRIMARY KEY) via temp table swap
    - messages.conversation_key for PRIV messages
    - app_settings.favorites (contact IDs)
    - app_settings.last_message_times (contact- prefixed keys)

    Handles case collisions by keeping the most-recently-seen contact.
    """
    import json

    # 1. Lowercase message conversation keys for private messages
    try:
        await conn.execute(
            "UPDATE messages SET conversation_key = lower(conversation_key) WHERE type = 'PRIV'"
        )
        logger.debug("Lowercased PRIV message conversation_keys")
    except aiosqlite.OperationalError as e:
        if "no such table" in str(e).lower():
            logger.debug("messages table does not exist yet, skipping conversation_key lowercase")
        else:
            raise

    # 2. Check if contacts table exists before proceeding
    cursor = await conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' AND name='contacts'"
    )
    if not await cursor.fetchone():
        logger.debug("contacts table does not exist yet, skipping key lowercase")
        await conn.commit()
        return

    # 3. Handle contacts table - check for case collisions first
    cursor = await conn.execute(
        "SELECT lower(public_key) as lk, COUNT(*) as cnt "
        "FROM contacts GROUP BY lower(public_key) HAVING COUNT(*) > 1"
    )
    collisions = list(await cursor.fetchall())

    if collisions:
        logger.warning(
            "Found %d case-colliding contact groups, keeping most-recently-seen",
            len(collisions),
        )
        for row in collisions:
            lower_key = row[0]
            # Delete all but the most recently seen
            await conn.execute(
                """DELETE FROM contacts WHERE public_key IN (
                    SELECT public_key FROM contacts
                    WHERE lower(public_key) = ?
                    ORDER BY COALESCE(last_seen, 0) DESC
                    LIMIT -1 OFFSET 1
                )""",
                (lower_key,),
            )

    # 4. Rebuild contacts with lowercased keys
    # Get the actual column names from the table (handles different schema versions)
    cursor = await conn.execute("PRAGMA table_info(contacts)")
    columns_info = await cursor.fetchall()
    all_columns = [col[1] for col in columns_info]  # col[1] is column name

    # Build column lists, lowering public_key
    select_cols = ", ".join(f"lower({c})" if c == "public_key" else c for c in all_columns)
    col_defs = []
    for col in columns_info:
        name, col_type, _notnull, default, pk = col[1], col[2], col[3], col[4], col[5]
        parts = [name, col_type or "TEXT"]
        if pk:
            parts.append("PRIMARY KEY")
        if default is not None:
            parts.append(f"DEFAULT {default}")
        col_defs.append(" ".join(parts))

    create_sql = f"CREATE TABLE contacts_new ({', '.join(col_defs)})"
    await conn.execute(create_sql)
    await conn.execute(f"INSERT INTO contacts_new SELECT {select_cols} FROM contacts")
    await conn.execute("DROP TABLE contacts")
    await conn.execute("ALTER TABLE contacts_new RENAME TO contacts")

    # Recreate the on_radio index (if column exists)
    if "on_radio" in all_columns:
        await conn.execute("CREATE INDEX IF NOT EXISTS idx_contacts_on_radio ON contacts(on_radio)")

    # 5. Lowercase contact IDs in favorites JSON (if app_settings exists)
    cursor = await conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' AND name='app_settings'"
    )
    if not await cursor.fetchone():
        await conn.commit()
        logger.info("Lowercased all contact public keys (no app_settings table)")
        return

    cursor = await conn.execute("SELECT favorites FROM app_settings WHERE id = 1")
    row = await cursor.fetchone()
    if row and row[0]:
        try:
            favorites = json.loads(row[0])
            updated = False
            for fav in favorites:
                if fav.get("type") == "contact" and fav.get("id"):
                    new_id = fav["id"].lower()
                    if new_id != fav["id"]:
                        fav["id"] = new_id
                        updated = True
            if updated:
                await conn.execute(
                    "UPDATE app_settings SET favorites = ? WHERE id = 1",
                    (json.dumps(favorites),),
                )
                logger.debug("Lowercased contact IDs in favorites")
        except (json.JSONDecodeError, TypeError):
            pass

    # 6. Lowercase contact keys in last_message_times JSON
    cursor = await conn.execute("SELECT last_message_times FROM app_settings WHERE id = 1")
    row = await cursor.fetchone()
    if row and row[0]:
        try:
            times = json.loads(row[0])
            new_times = {}
            updated = False
            for key, val in times.items():
                if key.startswith("contact-"):
                    new_key = "contact-" + key[8:].lower()
                    if new_key != key:
                        updated = True
                    new_times[new_key] = val
                else:
                    new_times[key] = val
            if updated:
                await conn.execute(
                    "UPDATE app_settings SET last_message_times = ? WHERE id = 1",
                    (json.dumps(new_times),),
                )
                logger.debug("Lowercased contact keys in last_message_times")
        except (json.JSONDecodeError, TypeError):
            pass

    await conn.commit()
    logger.info("Lowercased all contact public keys")

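Migration 14's temp-table swap exists because SQLite cannot rewrite a PRIMARY KEY column in place. The pattern can be demonstrated stand-alone with the stdlib `sqlite3` module; the three-column `contacts` schema below is a simplified stand-in for the real one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (public_key TEXT PRIMARY KEY, name TEXT, last_seen INTEGER)")
conn.executemany(
    "INSERT INTO contacts VALUES (?, ?, ?)",
    [("ABCD", "Alice", 100), ("EEFF", "Bob", 200)],
)

# Copy into a new table, lowercasing the key on the way, then swap the tables.
conn.execute("CREATE TABLE contacts_new (public_key TEXT PRIMARY KEY, name TEXT, last_seen INTEGER)")
conn.execute("INSERT INTO contacts_new SELECT lower(public_key), name, last_seen FROM contacts")
conn.execute("DROP TABLE contacts")
conn.execute("ALTER TABLE contacts_new RENAME TO contacts")

keys = [r[0] for r in conn.execute("SELECT public_key FROM contacts ORDER BY public_key")]
print(keys)  # → ['abcd', 'eeff']
```

The collision pre-pass in the real migration matters here: if two rows differ only in case, the lowercased INSERT would violate the new PRIMARY KEY, so duplicates must be resolved before the copy.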
async def _migrate_015_fix_null_sender_timestamp(conn: aiosqlite.Connection) -> None:
    """
    Fix NULL sender_timestamp values and add null-safe dedup index.

    1. Set sender_timestamp = received_at for any messages with NULL sender_timestamp
    2. Create a null-safe unique index as belt-and-suspenders protection
    """
    # Check if messages table exists
    cursor = await conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' AND name='messages'"
    )
    if not await cursor.fetchone():
        logger.debug("messages table does not exist yet, skipping NULL sender_timestamp fix")
        await conn.commit()
        return

    # Backfill NULL sender_timestamps with received_at
    cursor = await conn.execute(
        "UPDATE messages SET sender_timestamp = received_at WHERE sender_timestamp IS NULL"
    )
    if cursor.rowcount > 0:
        logger.info("Backfilled %d messages with NULL sender_timestamp", cursor.rowcount)

    # Try to create null-safe dedup index (may fail if existing duplicates exist)
    try:
        await conn.execute(
            """CREATE UNIQUE INDEX IF NOT EXISTS idx_messages_dedup_null_safe
               ON messages(type, conversation_key, text, COALESCE(sender_timestamp, 0))"""
        )
        logger.debug("Created null-safe dedup index")
    except aiosqlite.IntegrityError:
        logger.warning(
            "Could not create null-safe dedup index due to existing duplicates - "
            "the application-level dedup will handle these"
        )

    await conn.commit()

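Migration 15's index is "null-safe" because a plain unique index treats NULL as distinct from NULL, so two copies of a message with a NULL `sender_timestamp` would both be accepted; indexing `COALESCE(sender_timestamp, 0)` maps them to the same key. A self-contained sketch with the stdlib `sqlite3` module and a pared-down `messages` schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages (id INTEGER PRIMARY KEY, type TEXT, "
    "conversation_key TEXT, text TEXT, sender_timestamp INTEGER)"
)
# Expression index: NULL timestamps collapse to 0 instead of each being unique
conn.execute(
    """CREATE UNIQUE INDEX idx_messages_dedup_null_safe
       ON messages(type, conversation_key, text, COALESCE(sender_timestamp, 0))"""
)

conn.execute(
    "INSERT INTO messages(type, conversation_key, text, sender_timestamp) "
    "VALUES ('CHAN', 'abc123', 'hello', NULL)"
)
# Second NULL-timestamp copy hits the index and is silently dropped
conn.execute(
    "INSERT OR IGNORE INTO messages(type, conversation_key, text, sender_timestamp) "
    "VALUES ('CHAN', 'abc123', 'hello', NULL)"
)
count = conn.execute("SELECT COUNT(*) FROM messages").fetchone()[0]
print(count)  # → 1
```

This mirrors the `INSERT OR IGNORE` path in `MessageRepository.create`: a duplicate returns no new row, which is what routes the caller into the duplicate-handling branch.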
@@ -103,20 +103,6 @@ class Message(BaseModel):
    acked: int = 0


class RawPacket(BaseModel):
    """Raw packet as stored in the database."""

    id: int
    timestamp: int
    data: str = Field(description="Hex-encoded packet data")
    message_id: int | None = None

    @property
    def decrypted(self) -> bool:
        """A packet is decrypted iff it has a linked message_id."""
        return self.message_id is not None


class RawPacketDecryptedInfo(BaseModel):
    """Decryption info for a raw packet (when successfully decrypted)."""

@@ -204,6 +190,21 @@ class TelemetryResponse(BaseModel):
        default_factory=list, description="List of neighbors seen by repeater"
    )
    acl: list[AclEntry] = Field(default_factory=list, description="Access control list")
    clock_output: str | None = Field(
        default=None, description="Output from 'clock' command (or error message)"
    )


class TraceResponse(BaseModel):
    """Result of a direct (zero-hop) trace to a contact."""

    remote_snr: float | None = Field(
        default=None, description="SNR at which the target heard us (dB)"
    )
    local_snr: float | None = Field(
        default=None, description="SNR at which we heard the target on the bounce-back (dB)"
    )
    path_len: int = Field(description="Number of hops in the trace path")


class CommandRequest(BaseModel):

@@ -229,6 +230,29 @@ class Favorite(BaseModel):
    id: str = Field(description="Channel key or contact public key")


class BotConfig(BaseModel):
    """Configuration for a single bot."""

    id: str = Field(description="UUID for stable identity across renames/reorders")
    name: str = Field(description="User-editable name")
    enabled: bool = Field(default=False, description="Whether this bot is enabled")
    code: str = Field(default="", description="Python code for this bot")


class UnreadCounts(BaseModel):
    """Aggregated unread counts, mention flags, and last message times for all conversations."""

    counts: dict[str, int] = Field(
        default_factory=dict, description="Map of stateKey -> unread count"
    )
    mentions: dict[str, bool] = Field(
        default_factory=dict, description="Map of stateKey -> has mention"
    )
    last_message_times: dict[str, int] = Field(
        default_factory=dict, description="Map of stateKey -> last message timestamp"
    )


class AppSettings(BaseModel):
    """Application settings stored in the database."""

@@ -255,3 +279,15 @@ class AppSettings(BaseModel):
        default=False,
        description="Whether preferences have been migrated from localStorage",
    )
    advert_interval: int = Field(
        default=0,
        description="Periodic advertisement interval in seconds (0 = disabled)",
    )
    last_advert_time: int = Field(
        default=0,
        description="Unix timestamp of last advertisement sent (0 = never)",
    )
    bots: list[BotConfig] = Field(
        default_factory=list,
        description="List of bot configurations",
    )
@@ -39,6 +39,71 @@ from app.websocket import broadcast_error, broadcast_event
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
async def _handle_duplicate_message(
|
||||
packet_id: int,
|
||||
msg_type: str,
|
||||
conversation_key: str,
|
||||
text: str,
|
||||
sender_timestamp: int,
|
||||
path: str | None,
|
||||
received: int,
|
||||
) -> None:
|
||||
"""Handle a duplicate message by updating paths/acks on the existing record.
|
||||
|
||||
Called when MessageRepository.create returns None (INSERT OR IGNORE hit a duplicate).
|
||||
Looks up the existing message, adds the new path, increments ack count for outgoing
|
||||
messages, and broadcasts the update to clients.
|
||||
"""
|
||||
existing_msg = await MessageRepository.get_by_content(
|
||||
msg_type=msg_type,
|
||||
conversation_key=conversation_key,
|
||||
text=text,
|
||||
sender_timestamp=sender_timestamp,
|
||||
)
|
||||
if not existing_msg:
|
||||
label = "message" if msg_type == "CHAN" else "DM"
|
||||
logger.warning(
|
||||
"Duplicate %s for %s but couldn't find existing",
|
||||
label,
|
||||
conversation_key[:12],
|
||||
)
|
||||
return
|
||||
|
||||
logger.debug(
|
||||
"Duplicate %s for %s (msg_id=%d, outgoing=%s) - adding path",
|
||||
msg_type,
|
||||
conversation_key[:12],
|
||||
existing_msg.id,
|
||||
existing_msg.outgoing,
|
||||
)
|
||||
|
||||
# Add path if provided
|
||||
if path is not None:
|
||||
paths = await MessageRepository.add_path(existing_msg.id, path, received)
|
||||
else:
|
||||
# Get current paths for broadcast
|
||||
paths = existing_msg.paths or []
|
||||
|
||||
# Increment ack count for outgoing messages (echo confirmation)
|
||||
if existing_msg.outgoing:
|
||||
ack_count = await MessageRepository.increment_ack_count(existing_msg.id)
|
||||
else:
|
||||
ack_count = await MessageRepository.get_ack_count(existing_msg.id)
|
||||
|
||||
# Broadcast updated paths
|
||||
broadcast_event(
|
||||
"message_acked",
|
||||
{
|
||||
"message_id": existing_msg.id,
|
||||
"ack_count": ack_count,
|
||||
"paths": [p.model_dump() for p in paths] if paths else [],
|
||||
},
|
||||
)
|
||||
|
||||
# Mark this packet as decrypted
|
||||
await RawPacketRepository.mark_decrypted(packet_id, existing_msg.id)
|
||||
|
||||
|
||||
async def create_message_from_decrypted(
|
||||
packet_id: int,
|
||||
channel_key: str,
|
||||
@@ -47,6 +112,8 @@ async def create_message_from_decrypted(
|
||||
timestamp: int,
|
||||
received_at: int | None = None,
|
||||
path: str | None = None,
|
||||
channel_name: str | None = None,
|
||||
trigger_bot: bool = True,
|
||||
) -> int | None:
|
||||
"""Create a message record from decrypted channel packet content.
|
||||
|
||||
@@ -56,11 +123,13 @@ async def create_message_from_decrypted(
|
||||
Args:
|
||||
packet_id: ID of the raw packet being processed
|
||||
channel_key: Hex string channel key
|
||||
channel_name: Channel name (e.g. "#general"), for bot context
|
||||
sender: Sender name (will be prefixed to message) or None
|
||||
message_text: The decrypted message content
|
||||
timestamp: Sender timestamp from the packet
|
||||
received_at: When the packet was received (defaults to now)
|
||||
path: Hex-encoded routing path (None for historical decryption)
|
||||
path: Hex-encoded routing path
|
||||
trigger_bot: Whether to trigger bot response (False for historical decryption)
|
||||
|
||||
Returns the message ID if created, None if duplicate.
|
||||
"""
|
||||
@@ -87,52 +156,9 @@ async def create_message_from_decrypted(
|
||||
# 1. Our own outgoing message echoes back (flood routing)
|
||||
# 2. Same message arrives via multiple paths before first is committed
|
||||
# In either case, add the path to the existing message.
|
||||
existing_msg = await MessageRepository.get_by_content(
|
||||
msg_type="CHAN",
|
||||
conversation_key=channel_key_normalized,
|
||||
text=text,
|
||||
sender_timestamp=timestamp,
|
||||
await _handle_duplicate_message(
|
||||
packet_id, "CHAN", channel_key_normalized, text, timestamp, path, received
|
||||
)
|
||||
if not existing_msg:
|
||||
logger.warning(
|
||||
"Duplicate message for channel %s but couldn't find existing",
|
||||
channel_key_normalized[:8],
|
||||
)
|
||||
return None
|
||||
|
||||
logger.debug(
|
||||
"Duplicate message for channel %s (msg_id=%d, outgoing=%s) - adding path",
|
||||
channel_key_normalized[:8],
|
||||
existing_msg.id,
|
||||
existing_msg.outgoing,
|
||||
)
|
||||
|
||||
# Add path if provided
|
||||
if path is not None:
|
||||
paths = await MessageRepository.add_path(existing_msg.id, path, received)
|
||||
else:
|
||||
# Get current paths for broadcast
|
||||
paths = existing_msg.paths or []
|
||||
|
||||
# Increment ack count for outgoing messages (echo confirmation)
|
||||
if existing_msg.outgoing:
|
||||
ack_count = await MessageRepository.increment_ack_count(existing_msg.id)
|
||||
else:
|
||||
ack_count = await MessageRepository.get_ack_count(existing_msg.id)
|
||||
|
||||
# Broadcast updated paths
|
||||
broadcast_event(
|
||||
"message_acked",
|
||||
{
|
||||
"message_id": existing_msg.id,
|
||||
"ack_count": ack_count,
|
||||
"paths": [p.model_dump() for p in paths] if paths else [],
|
||||
},
|
||||
)
|
||||
|
||||
# Mark this packet as decrypted
|
||||
await RawPacketRepository.mark_decrypted(packet_id, existing_msg.id)
|
||||
|
||||
return None
|
||||
|
||||
logger.info("Stored channel message %d for channel %s", msg_id, channel_key_normalized[:8])
|
||||
@@ -162,6 +188,22 @@ async def create_message_from_decrypted(
|
||||
},
|
||||
)
|
||||
|
||||
# Run bot if enabled (for incoming channel messages, not historical decryption)
|
||||
if trigger_bot:
|
||||
from app.bot import run_bot_for_message
|
||||
|
||||
await run_bot_for_message(
|
||||
sender_name=sender,
|
||||
sender_key=None, # Channel messages don't have a sender public key
|
||||
message_text=message_text,
|
||||
is_dm=False,
|
||||
channel_key=channel_key_normalized,
|
||||
channel_name=channel_name,
|
||||
sender_timestamp=timestamp,
|
||||
path=path,
|
||||
is_outgoing=False,
|
||||
)
|
||||
|
||||
return msg_id
|
||||
|
||||
|
||||
@@ -173,6 +215,7 @@ async def create_dm_message_from_decrypted(
|
||||
received_at: int | None = None,
|
||||
path: str | None = None,
|
||||
outgoing: bool = False,
|
||||
trigger_bot: bool = True,
|
||||
) -> int | None:
|
||||
"""Create a message record from decrypted direct message packet content.
|
||||
|
||||
@@ -185,11 +228,23 @@ async def create_dm_message_from_decrypted(
|
||||
their_public_key: The contact's full 64-char public key (conversation_key)
|
||||
our_public_key: Our public key (to determine direction), or None
|
||||
received_at: When the packet was received (defaults to now)
|
||||
path: Hex-encoded routing path (None for historical decryption)
|
||||
path: Hex-encoded routing path
|
||||
outgoing: Whether this is an outgoing message (we sent it)
|
||||
trigger_bot: Whether to trigger bot response (False for historical decryption)
|
||||
|
||||
Returns the message ID if created, None if duplicate.
|
||||
"""
|
||||
# Check if sender is a repeater - repeaters only send CLI responses, not chat messages.
|
||||
# CLI responses are handled by the command endpoint, not stored in chat history.
|
||||
contact = await ContactRepository.get_by_key_or_prefix(their_public_key)
|
||||
if contact and contact.type == CONTACT_TYPE_REPEATER:
|
||||
logger.debug(
|
||||
"Skipping message from repeater %s (CLI responses not stored): %s",
|
||||
their_public_key[:12],
|
||||
(decrypted.message or "")[:50],
|
||||
)
|
||||
return None
|
||||
|
||||
received = received_at or int(time.time())
|
||||
|
||||
# conversation_key is always the other party's public key
|
||||
@@ -208,51 +263,15 @@ async def create_dm_message_from_decrypted(
|
||||
|
||||
if msg_id is None:
|
||||
# Duplicate message detected
|
||||
existing_msg = await MessageRepository.get_by_content(
|
||||
msg_type="PRIV",
|
||||
conversation_key=conversation_key,
|
||||
text=decrypted.message,
|
||||
sender_timestamp=decrypted.timestamp,
|
||||
await _handle_duplicate_message(
|
||||
packet_id,
|
||||
"PRIV",
|
||||
conversation_key,
|
||||
decrypted.message,
|
||||
decrypted.timestamp,
|
||||
path,
|
||||
received,
|
||||
)
|
||||
if not existing_msg:
|
||||
logger.warning(
|
||||
"Duplicate DM for contact %s but couldn't find existing",
|
||||
conversation_key[:12],
|
||||
)
|
||||
return None
|
||||
|
||||
logger.debug(
|
||||
"Duplicate DM for contact %s (msg_id=%d, outgoing=%s) - adding path",
|
||||
conversation_key[:12],
|
||||
existing_msg.id,
|
||||
existing_msg.outgoing,
|
||||
)
|
||||
|
||||
# Add path if provided
|
||||
if path is not None:
|
||||
paths = await MessageRepository.add_path(existing_msg.id, path, received)
|
||||
else:
|
||||
paths = existing_msg.paths or []
|
||||
|
||||
# Increment ack count for outgoing messages (echo confirmation)
|
||||
if existing_msg.outgoing:
|
||||
ack_count = await MessageRepository.increment_ack_count(existing_msg.id)
|
||||
else:
|
||||
ack_count = await MessageRepository.get_ack_count(existing_msg.id)
|
||||
|
||||
# Broadcast updated paths
|
||||
broadcast_event(
|
||||
"message_acked",
|
||||
{
|
||||
"message_id": existing_msg.id,
|
||||
"ack_count": ack_count,
|
||||
"paths": [p.model_dump() for p in paths] if paths else [],
|
||||
},
|
||||
)
|
||||
|
||||
# Mark this packet as decrypted
|
||||
await RawPacketRepository.mark_decrypted(packet_id, existing_msg.id)
|
||||
|
||||
return None
|
||||
|
||||
logger.info(
|
||||
@@ -289,6 +308,26 @@ async def create_dm_message_from_decrypted(
|
||||
# Update contact's last_contacted timestamp (for sorting)
|
||||
await ContactRepository.update_last_contacted(conversation_key, received)
|
||||
|
||||
# Run bot if enabled (for all real-time DMs, including our own outgoing messages)
|
||||
if trigger_bot:
|
||||
from app.bot import run_bot_for_message
|
||||
|
||||
# Get contact name for the bot
|
||||
contact = await ContactRepository.get_by_key(their_public_key)
|
||||
sender_name = contact.name if contact else None
|
||||
|
||||
await run_bot_for_message(
|
||||
sender_name=sender_name,
|
||||
sender_key=their_public_key,
|
||||
message_text=decrypted.message,
|
||||
is_dm=True,
|
||||
channel_key=None,
|
||||
channel_name=None,
|
||||
sender_timestamp=decrypted.timestamp,
|
||||
path=path,
|
||||
is_outgoing=outgoing,
|
||||
)
|
||||
|
||||
return msg_id
|
||||
|
||||
|
||||
@@ -315,7 +354,11 @@ async def run_historical_dm_decryption(
|
||||
our_public_key_bytes = derive_public_key(private_key_bytes)
|
||||
|
||||
for packet_id, packet_data, packet_timestamp in packets:
|
||||
# Don't pass our_public_key - we want to decrypt both incoming AND outgoing messages.
|
||||
# Note: passing our_public_key=None means outgoing DMs won't be matched
|
||||
# by try_decrypt_dm (the inbound check requires src_hash == their_first_byte,
|
||||
# which fails for our outgoing packets). This is acceptable because outgoing
|
||||
# DMs are stored directly by the send endpoint. Historical decryption only
|
||||
# recovers incoming messages.
|
||||
result = try_decrypt_dm(
|
||||
packet_data,
|
||||
private_key_bytes,
|
||||
@@ -341,6 +384,7 @@ async def run_historical_dm_decryption(
|
||||
received_at=packet_timestamp,
|
||||
path=path_hex,
|
||||
outgoing=outgoing,
|
||||
trigger_bot=False, # Historical decryption should not trigger bot
|
||||
)
|
||||
|
||||
if msg_id is not None:
|
||||
@@ -535,6 +579,7 @@ async def _process_group_text(
|
||||
msg_id = await create_message_from_decrypted(
|
||||
packet_id=packet_id,
|
||||
channel_key=channel.key,
|
||||
channel_name=channel.name,
|
||||
sender=decrypted.sender,
|
||||
message_text=decrypted.message,
|
||||
timestamp=decrypted.timestamp,
|
||||
@@ -581,7 +626,7 @@ async def _process_advertisement(
|
||||
new_path_hex = packet_info.path.hex() if packet_info.path else ""
|
||||
|
||||
# Try to find existing contact
|
||||
-        existing = await ContactRepository.get_by_key(advert.public_key)
+        existing = await ContactRepository.get_by_key(advert.public_key.lower())

         # Determine which path to use: keep shorter path if heard recently (within 60s)
         # This handles advertisement echoes through different routes
@@ -628,7 +673,7 @@ async def _process_advertisement(
         )

         contact_data = {
-            "public_key": advert.public_key,
+            "public_key": advert.public_key.lower(),
             "name": advert.name,
             "type": contact_type,
             "lat": advert.lat,
@@ -645,7 +690,7 @@ async def _process_advertisement(
         broadcast_event(
             "contact",
             {
-                "public_key": advert.public_key,
+                "public_key": advert.public_key.lower(),
                 "name": advert.name,
                 "type": contact_type,
                 "flags": existing.flags if existing else 0,
@@ -666,7 +711,7 @@ async def _process_advertisement(

     settings = await AppSettingsRepository.get()
     if settings.auto_decrypt_dm_on_advert:
-        await start_historical_dm_decryption(None, advert.public_key, advert.name)
+        await start_historical_dm_decryption(None, advert.public_key.lower(), advert.name)

     # If this is not a repeater, trigger recent contacts sync to radio
     # This ensures we can auto-ACK DMs from recent contacts
@@ -738,9 +783,8 @@ async def _process_direct_message(
     # For outgoing: match dest_hash (recipient's first byte)
     match_hash = dest_hash if is_outgoing else src_hash

-    # Get all contacts and filter by first byte of public key
-    contacts = await ContactRepository.get_all(limit=1000)
-    candidate_contacts = [c for c in contacts if c.public_key.lower().startswith(match_hash)]
+    # Get contacts matching the first byte of public key via targeted SQL query
+    candidate_contacts = await ContactRepository.get_by_pubkey_first_byte(match_hash)

     if not candidate_contacts:
         logger.debug(
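The hunk above replaces a full-table scan with a targeted lookup on the first byte of the public key. Since MeshCore DMs carry only a 1-byte source/destination hash (2 hex chars), several contacts can collide on it and every candidate must be tried. A minimal sketch of the matching idea (the dict shape and names here are illustrative, not the project's models):

```python
def match_by_first_byte(contacts: list[dict], match_hash: str) -> list[dict]:
    """Return contacts whose public key begins with the 1-byte hex hash.

    Multiple contacts can share a first byte, so all candidates are returned
    and the caller must try each one (e.g., for signature/decryption checks).
    """
    prefix = match_hash.lower()
    return [c for c in contacts if c["public_key"].lower().startswith(prefix)]

contacts = [
    {"public_key": "A1B2C3", "name": "alice"},
    {"public_key": "a1ffee", "name": "bob"},
    {"public_key": "77aa00", "name": "carol"},
]
# "a1" collides: both alice and bob are candidates
candidates = match_by_first_byte(contacts, "a1")
```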
app/radio.py (67 lines changed)
@@ -107,6 +107,63 @@ class RadioManager:
         self._last_connected: bool = False
         self._reconnect_lock: asyncio.Lock | None = None

+    async def post_connect_setup(self) -> None:
+        """Full post-connection setup: handlers, key export, sync, advertisements, polling.
+
+        Called after every successful connection or reconnection.
+        Idempotent — safe to call repeatedly (periodic tasks have start guards).
+        """
+        from app.event_handlers import register_event_handlers
+        from app.keystore import export_and_store_private_key
+        from app.radio_sync import (
+            drain_pending_messages,
+            send_advertisement,
+            start_message_polling,
+            start_periodic_advert,
+            start_periodic_sync,
+            sync_and_offload_all,
+            sync_radio_time,
+        )
+
+        if not self._meshcore:
+            return
+
+        register_event_handlers(self._meshcore)
+        await export_and_store_private_key(self._meshcore)
+
+        # Sync radio clock with system time
+        await sync_radio_time()
+
+        # Sync contacts/channels from radio to DB and clear radio
+        logger.info("Syncing and offloading radio data...")
+        result = await sync_and_offload_all()
+        logger.info("Sync complete: %s", result)
+
+        # Start periodic sync (idempotent)
+        start_periodic_sync()
+
+        # Send advertisement to announce our presence (if enabled and not throttled)
+        if await send_advertisement():
+            logger.info("Advertisement sent")
+        else:
+            logger.debug("Advertisement skipped (disabled or throttled)")
+
+        # Start periodic advertisement (idempotent)
+        start_periodic_advert()
+
+        await self._meshcore.start_auto_message_fetching()
+        logger.info("Auto message fetching started")
+
+        # Drain any messages that were queued before we connected
+        drained = await drain_pending_messages()
+        if drained > 0:
+            logger.info("Drained %d pending message(s)", drained)
+
+        # Start periodic message polling as fallback (idempotent)
+        start_message_polling()
+
+        logger.info("Post-connect setup complete")
+
     @property
     def meshcore(self) -> MeshCore | None:
         return self._meshcore
@@ -229,15 +286,7 @@ class RadioManager:
                 # Attempt reconnection
                 await asyncio.sleep(3)  # Wait a bit before trying
                 if await self.reconnect():
-                    # Re-register event handlers after successful reconnect
-                    from app.event_handlers import register_event_handlers
-                    from app.keystore import export_and_store_private_key
-
-                    if self._meshcore:
-                        register_event_handlers(self._meshcore)
-                        await export_and_store_private_key(self._meshcore)
-                        await self._meshcore.start_auto_message_fetching()
-                        logger.info("Event handlers re-registered after auto-reconnect")
+                    await self.post_connect_setup()

             elif not self._last_connected and current_connected:
                 # Connection restored (might have reconnected automatically)
@@ -16,10 +16,9 @@ from contextlib import asynccontextmanager

 from meshcore import EventType

 from app.config import settings
-from app.models import Contact
 from app.radio import radio_manager
-from app.repository import ChannelRepository, ContactRepository
+from app.repository import AppSettingsRepository, ChannelRepository, ContactRepository

 logger = logging.getLogger(__name__)
@@ -29,6 +28,13 @@ _message_poll_task: asyncio.Task | None = None

 # Message poll interval in seconds
 MESSAGE_POLL_INTERVAL = 5

+# Periodic advertisement task handle
+_advert_task: asyncio.Task | None = None
+
+# Default check interval when periodic advertising is disabled (seconds)
+# We still need to periodically check if it's been enabled
+ADVERT_CHECK_INTERVAL = 60
+
 # Counter to pause polling during repeater operations (supports nested pauses)
 _polling_pause_count: int = 0
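The pause counter above supports nesting: each pause increments, each resume decrements, and polling only runs when the counter is back to zero. A standalone sketch of that pattern as an async context manager (simplified; not the project's exact implementation):

```python
import asyncio
from contextlib import asynccontextmanager

# Module-level counter; polling loops skip work while it is non-zero.
_polling_pause_count = 0

@asynccontextmanager
async def pause_polling():
    """Suspend polling for the duration of a block.

    Nests safely because each entry increments and each exit decrements the
    shared counter, so an inner block exiting does not resume polling early.
    """
    global _polling_pause_count
    _polling_pause_count += 1
    try:
        yield
    finally:
        _polling_pause_count -= 1

def polling_paused() -> bool:
    return _polling_pause_count > 0
```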
@@ -329,6 +335,111 @@ async def stop_message_polling():
         logger.info("Stopped periodic message polling")


+async def send_advertisement(force: bool = False) -> bool:
+    """Send an advertisement to announce presence on the mesh.
+
+    Respects the configured advert_interval - won't send if not enough time
+    has elapsed since the last advertisement, unless force=True.
+
+    Args:
+        force: If True, send immediately regardless of interval.
+
+    Returns True if successful, False otherwise (including if throttled).
+    """
+    if not radio_manager.is_connected or radio_manager.meshcore is None:
+        logger.debug("Cannot send advertisement: radio not connected")
+        return False
+
+    # Check if enough time has elapsed (unless forced)
+    if not force:
+        settings = await AppSettingsRepository.get()
+        interval = settings.advert_interval
+        last_time = settings.last_advert_time
+        now = int(time.time())
+
+        # If interval is 0, advertising is disabled
+        if interval <= 0:
+            logger.debug("Advertisement skipped: periodic advertising is disabled")
+            return False
+
+        # Check if enough time has passed
+        elapsed = now - last_time
+        if elapsed < interval:
+            remaining = interval - elapsed
+            logger.debug(
+                "Advertisement throttled: %d seconds remaining (interval=%d, elapsed=%d)",
+                remaining,
+                interval,
+                elapsed,
+            )
+            return False
+
+    try:
+        result = await radio_manager.meshcore.commands.send_advert(flood=True)
+        if result.type == EventType.OK:
+            # Update last_advert_time in database
+            now = int(time.time())
+            await AppSettingsRepository.update(last_advert_time=now)
+            logger.info("Advertisement sent successfully")
+            return True
+        else:
+            logger.warning("Failed to send advertisement: %s", result.payload)
+            return False
+    except Exception as e:
+        logger.warning("Error sending advertisement: %s", e)
+        return False
+
+
+async def _periodic_advert_loop():
+    """Background task that periodically checks if an advertisement should be sent.
+
+    The actual throttling logic is in send_advertisement(), which checks
+    last_advert_time from the database. This loop just triggers the check
+    periodically and sleeps between attempts.
+    """
+    while True:
+        try:
+            # Try to send - send_advertisement() handles all checks
+            # (disabled, throttled, not connected)
+            if radio_manager.is_connected:
+                await send_advertisement()
+
+            # Sleep before next check
+            await asyncio.sleep(ADVERT_CHECK_INTERVAL)
+
+        except asyncio.CancelledError:
+            logger.info("Periodic advertisement task cancelled")
+            break
+        except Exception as e:
+            logger.error("Error in periodic advertisement loop: %s", e)
+            await asyncio.sleep(ADVERT_CHECK_INTERVAL)
+
+
+def start_periodic_advert():
+    """Start the periodic advertisement background task.
+
+    The task reads interval from app_settings dynamically, so it will
+    adapt to configuration changes without restart.
+    """
+    global _advert_task
+    if _advert_task is None or _advert_task.done():
+        _advert_task = asyncio.create_task(_periodic_advert_loop())
+        logger.info("Started periodic advertisement task (interval configured in settings)")
+
+
+async def stop_periodic_advert():
+    """Stop the periodic advertisement background task."""
+    global _advert_task
+    if _advert_task and not _advert_task.done():
+        _advert_task.cancel()
+        try:
+            await _advert_task
+        except asyncio.CancelledError:
+            pass
+        _advert_task = None
+        logger.info("Stopped periodic advertisement")
+
+
 async def sync_radio_time() -> bool:
     """Sync the radio's clock with the system time.

@@ -416,7 +527,8 @@ async def sync_recent_contacts_to_radio(force: bool = False) -> dict:

     try:
         # Get recent non-repeater contacts from database
-        max_contacts = settings.max_radio_contacts
+        app_settings = await AppSettingsRepository.get()
+        max_contacts = app_settings.max_radio_contacts
         contacts = await ContactRepository.get_recent_non_repeaters(limit=max_contacts)
         logger.debug("Found %d recent non-repeater contacts to sync", len(contacts))
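The throttle in `send_advertisement` reduces to a simple elapsed-time check. Distilled to a pure function for clarity (same semantics as the code above: an interval of 0 or less means periodic advertising is disabled, and `last_advert_time` only advances on a successful send, so a failed send retries on the next loop pass):

```python
def should_send_advert(now: int, last_advert_time: int, interval: int) -> bool:
    """Return True if a periodic advertisement is due.

    interval <= 0 disables periodic advertising entirely; otherwise a full
    interval must have elapsed since the last successful advertisement.
    """
    if interval <= 0:
        return False
    return now - last_advert_time >= interval
```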
@@ -7,7 +7,15 @@ from typing import Any, Literal

 from app.database import db
 from app.decoder import PayloadType, extract_payload, get_packet_payload_type
-from app.models import AppSettings, Channel, Contact, Favorite, Message, MessagePath, RawPacket
+from app.models import (
+    AppSettings,
+    BotConfig,
+    Channel,
+    Contact,
+    Favorite,
+    Message,
+    MessagePath,
+)

 logger = logging.getLogger(__name__)
@@ -34,7 +42,7 @@ class ContactRepository:
                     last_contacted = COALESCE(excluded.last_contacted, contacts.last_contacted)
                 """,
                 (
-                    contact.get("public_key"),
+                    contact.get("public_key", "").lower(),
                     contact.get("name") or contact.get("adv_name"),
                     contact.get("type", 0),
                     contact.get("flags", 0),
@@ -73,7 +81,9 @@ class ContactRepository:

     @staticmethod
     async def get_by_key(public_key: str) -> Contact | None:
-        cursor = await db.conn.execute("SELECT * FROM contacts WHERE public_key = ?", (public_key,))
+        cursor = await db.conn.execute(
+            "SELECT * FROM contacts WHERE public_key = ?", (public_key.lower(),)
+        )
         row = await cursor.fetchone()
         return ContactRepository._row_to_contact(row) if row else None

@@ -81,7 +91,7 @@ class ContactRepository:
     async def get_by_key_prefix(prefix: str) -> Contact | None:
         cursor = await db.conn.execute(
             "SELECT * FROM contacts WHERE public_key LIKE ? LIMIT 1",
-            (f"{prefix}%",),
+            (f"{prefix.lower()}%",),
         )
         row = await cursor.fetchone()
         return ContactRepository._row_to_contact(row) if row else None
@@ -129,7 +139,7 @@ class ContactRepository:
     async def update_path(public_key: str, path: str, path_len: int) -> None:
         await db.conn.execute(
             "UPDATE contacts SET last_path = ?, last_path_len = ?, last_seen = ? WHERE public_key = ?",
-            (path, path_len, int(time.time()), public_key),
+            (path, path_len, int(time.time()), public_key.lower()),
         )
         await db.conn.commit()

@@ -137,7 +147,7 @@ class ContactRepository:
     async def set_on_radio(public_key: str, on_radio: bool) -> None:
         await db.conn.execute(
             "UPDATE contacts SET on_radio = ? WHERE public_key = ?",
-            (on_radio, public_key),
+            (on_radio, public_key.lower()),
         )
         await db.conn.commit()

@@ -145,7 +155,7 @@ class ContactRepository:
     async def delete(public_key: str) -> None:
         await db.conn.execute(
             "DELETE FROM contacts WHERE public_key = ?",
-            (public_key,),
+            (public_key.lower(),),
         )
         await db.conn.commit()

@@ -155,25 +165,7 @@ class ContactRepository:
         ts = timestamp or int(time.time())
         await db.conn.execute(
             "UPDATE contacts SET last_contacted = ?, last_seen = ? WHERE public_key = ?",
-            (ts, ts, public_key),
-        )
-        await db.conn.commit()
-
-    @staticmethod
-    async def clear_all_on_radio() -> None:
-        """Clear the on_radio flag for all contacts."""
-        await db.conn.execute("UPDATE contacts SET on_radio = 0")
-        await db.conn.commit()
-
-    @staticmethod
-    async def set_multiple_on_radio(public_keys: list[str], on_radio: bool = True) -> None:
-        """Set on_radio flag for multiple contacts."""
-        if not public_keys:
-            return
-        placeholders = ",".join("?" * len(public_keys))
-        await db.conn.execute(
-            f"UPDATE contacts SET on_radio = ? WHERE public_key IN ({placeholders})",
-            [on_radio] + public_keys,
+            (ts, ts, public_key.lower()),
         )
         await db.conn.commit()

@@ -186,11 +178,26 @@ class ContactRepository:
         ts = timestamp or int(time.time())
         cursor = await db.conn.execute(
             "UPDATE contacts SET last_read_at = ? WHERE public_key = ?",
-            (ts, public_key),
+            (ts, public_key.lower()),
         )
         await db.conn.commit()
         return cursor.rowcount > 0

+    @staticmethod
+    async def mark_all_read(timestamp: int) -> None:
+        """Mark all contacts as read at the given timestamp."""
+        await db.conn.execute("UPDATE contacts SET last_read_at = ?", (timestamp,))
+
+    @staticmethod
+    async def get_by_pubkey_first_byte(hex_byte: str) -> list[Contact]:
+        """Get contacts whose public key starts with the given hex byte (2 chars)."""
+        cursor = await db.conn.execute(
+            "SELECT * FROM contacts WHERE substr(public_key, 1, 2) = ?",
+            (hex_byte.lower(),),
+        )
+        rows = await cursor.fetchall()
+        return [ContactRepository._row_to_contact(row) for row in rows]
+

 class ChannelRepository:
     @staticmethod
@@ -244,24 +251,6 @@ class ChannelRepository:
             for row in rows
         ]

-    @staticmethod
-    async def get_by_name(name: str) -> Channel | None:
-        """Get a channel by name."""
-        cursor = await db.conn.execute(
-            "SELECT key, name, is_hashtag, on_radio, last_read_at FROM channels WHERE name = ?",
-            (name,),
-        )
-        row = await cursor.fetchone()
-        if row:
-            return Channel(
-                key=row["key"],
-                name=row["name"],
-                is_hashtag=bool(row["is_hashtag"]),
-                on_radio=bool(row["on_radio"]),
-                last_read_at=row["last_read_at"],
-            )
-        return None
-
     @staticmethod
     async def delete(key: str) -> None:
         """Delete a channel by key."""
@@ -285,6 +274,11 @@ class ChannelRepository:
         await db.conn.commit()
         return cursor.rowcount > 0

+    @staticmethod
+    async def mark_all_read(timestamp: int) -> None:
+        """Mark all channels as read at the given timestamp."""
+        await db.conn.execute("UPDATE channels SET last_read_at = ?", (timestamp,))
+

 class MessageRepository:
     @staticmethod
@@ -298,13 +292,6 @@ class MessageRepository:
         except (json.JSONDecodeError, TypeError, KeyError):
             return None

-    @staticmethod
-    def _serialize_paths(paths: list[dict] | None) -> str | None:
-        """Serialize paths list to JSON string."""
-        if not paths:
-            return None
-        return json.dumps(paths)
-
     @staticmethod
     async def create(
         msg_type: str,
@@ -392,12 +379,31 @@ class MessageRepository:

         return [MessagePath(**p) for p in existing_paths]

+    @staticmethod
+    async def claim_prefix_messages(full_key: str) -> int:
+        """Promote prefix-stored messages to the full conversation key.
+
+        When a full key becomes known for a contact, any messages stored with
+        only a prefix as conversation_key are updated to use the full key.
+        """
+        lower_key = full_key.lower()
+        cursor = await db.conn.execute(
+            """UPDATE messages SET conversation_key = ?
+               WHERE type = 'PRIV' AND length(conversation_key) < 64
+               AND ? LIKE conversation_key || '%'""",
+            (lower_key, lower_key),
+        )
+        await db.conn.commit()
+        return cursor.rowcount
+
     @staticmethod
     async def get_all(
         limit: int = 100,
         offset: int = 0,
         msg_type: str | None = None,
         conversation_key: str | None = None,
+        before: int | None = None,
+        before_id: int | None = None,
     ) -> list[Message]:
         query = "SELECT * FROM messages WHERE 1=1"
         params: list[Any] = []
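The `? LIKE conversation_key || '%'` trick in `claim_prefix_messages` inverts the usual LIKE direction: the bound full key is the tested string and each stored prefix supplies the pattern, so one UPDATE claims every prefix row the full key extends. A self-contained demonstration against an in-memory SQLite table (schema reduced to the two relevant columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (type TEXT, conversation_key TEXT)")
conn.executemany(
    "INSERT INTO messages VALUES (?, ?)",
    [("PRIV", "a1b2"), ("PRIV", "ffff"), ("CHAN", "a1")],
)

# A 64-char full key whose prefix "a1b2" was used before the key was known
full_key = "A1B2" + "c3" * 30
lower_key = full_key.lower()

# Promote prefix-stored PRIV rows to the full key: a stored prefix matches
# when the full key LIKEs '<prefix>%'.
cur = conn.execute(
    """UPDATE messages SET conversation_key = ?
       WHERE type = 'PRIV' AND length(conversation_key) < 64
       AND ? LIKE conversation_key || '%'""",
    (lower_key, lower_key),
)
# Only the "a1b2" PRIV row is claimed; "ffff" and the CHAN row are untouched.
```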
@@ -410,8 +416,15 @@ class MessageRepository:
             query += " AND conversation_key LIKE ?"
             params.append(f"{conversation_key}%")

-        query += " ORDER BY received_at DESC LIMIT ? OFFSET ?"
-        params.extend([limit, offset])
+        if before is not None and before_id is not None:
+            query += " AND (received_at < ? OR (received_at = ? AND id < ?))"
+            params.extend([before, before, before_id])
+
+        query += " ORDER BY received_at DESC, id DESC LIMIT ?"
+        params.append(limit)
+        if before is None or before_id is None:
+            query += " OFFSET ?"
+            params.append(offset)

        cursor = await db.conn.execute(query, params)
        rows = await cursor.fetchall()
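The `(received_at, id)` predicate above is keyset pagination: instead of OFFSET, the cursor is the last row of the previous page, and the tuple comparison `received_at < ? OR (received_at = ? AND id < ?)` breaks timestamp ties deterministically. A self-contained illustration of the same query shape:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY, received_at INTEGER)")
conn.executemany(
    "INSERT INTO messages (id, received_at) VALUES (?, ?)",
    [(1, 100), (2, 100), (3, 200), (4, 200), (5, 300)],
)

def page(before=None, before_id=None, limit=2):
    """Fetch one page, newest first; (before, before_id) is the keyset cursor."""
    query = "SELECT id FROM messages WHERE 1=1"
    params = []
    if before is not None and before_id is not None:
        # Strictly older, or same timestamp with a smaller id (tie-break)
        query += " AND (received_at < ? OR (received_at = ? AND id < ?))"
        params += [before, before, before_id]
    query += " ORDER BY received_at DESC, id DESC LIMIT ?"
    params.append(limit)
    return [r[0] for r in conn.execute(query, params)]

first = page()  # newest two rows
# Continue from the last row of the first page: (received_at=200, id=4)
second = page(before=200, before_id=4)
```

Unlike OFFSET paging, the cursor stays stable even when new rows are inserted between page fetches.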
@@ -495,57 +508,84 @@
         )

     @staticmethod
-    async def get_bulk(
-        conversations: list[dict],
-        limit_per_conversation: int = 100,
-    ) -> dict[str, list["Message"]]:
-        """Fetch messages for multiple conversations in one query per conversation.
+    async def get_unread_counts(name: str | None = None) -> dict:
+        """Get unread message counts, mention flags, and last message times for all conversations.

         Args:
-            conversations: List of {type: 'PRIV'|'CHAN', conversation_key: string}
-            limit_per_conversation: Max messages to return per conversation
+            name: User's display name for @[name] mention detection. If None, mentions are skipped.

         Returns:
-            Dict mapping 'type:conversation_key' to list of messages
+            Dict with 'counts', 'mentions', and 'last_message_times' keys.
         """
-        result: dict[str, list[Message]] = {}
+        counts: dict[str, int] = {}
+        mention_flags: dict[str, bool] = {}
+        last_message_times: dict[str, int] = {}

-        for conv in conversations:
-            msg_type = conv.get("type")
-            conv_key = conv.get("conversation_key")
-            if not msg_type or not conv_key:
-                continue
+        mention_pattern = f"%@[{name}]%" if name else None

-            key = f"{msg_type}:{conv_key}"
-            cursor = await db.conn.execute(
-                """
-                SELECT * FROM messages
-                WHERE type = ? AND conversation_key LIKE ?
-                ORDER BY received_at DESC
-                LIMIT ?
-                """,
-                (msg_type, f"{conv_key}%", limit_per_conversation),
-            )
-            rows = await cursor.fetchall()
-            result[key] = [
-                Message(
-                    id=row["id"],
-                    type=row["type"],
-                    conversation_key=row["conversation_key"],
-                    text=row["text"],
-                    sender_timestamp=row["sender_timestamp"],
-                    received_at=row["received_at"],
-                    paths=MessageRepository._parse_paths(row["paths"]),
-                    txt_type=row["txt_type"],
-                    signature=row["signature"],
-                    outgoing=bool(row["outgoing"]),
-                    acked=row["acked"],
-                )
-                for row in rows
-            ]
+        # Channel unreads
+        cursor = await db.conn.execute(
+            """
+            SELECT m.conversation_key,
+                   COUNT(*) as unread_count,
+                   MAX(m.received_at) as last_message_time,
+                   SUM(CASE WHEN m.text LIKE ? THEN 1 ELSE 0 END) > 0 as has_mention
+            FROM messages m
+            JOIN channels c ON m.conversation_key = c.key
+            WHERE m.type = 'CHAN' AND m.outgoing = 0
+              AND m.received_at > COALESCE(c.last_read_at, 0)
+            GROUP BY m.conversation_key
+            """,
+            (mention_pattern or "",),
+        )
+        rows = await cursor.fetchall()
+        for row in rows:
+            state_key = f"channel-{row['conversation_key']}"
+            counts[state_key] = row["unread_count"]
+            if mention_pattern and row["has_mention"]:
+                mention_flags[state_key] = True

-        return result
+        # Contact unreads
+        cursor = await db.conn.execute(
+            """
+            SELECT m.conversation_key,
+                   COUNT(*) as unread_count,
+                   MAX(m.received_at) as last_message_time,
+                   SUM(CASE WHEN m.text LIKE ? THEN 1 ELSE 0 END) > 0 as has_mention
+            FROM messages m
+            JOIN contacts ct ON m.conversation_key = ct.public_key
+            WHERE m.type = 'PRIV' AND m.outgoing = 0
+              AND m.received_at > COALESCE(ct.last_read_at, 0)
+            GROUP BY m.conversation_key
+            """,
+            (mention_pattern or "",),
+        )
+        rows = await cursor.fetchall()
+        for row in rows:
+            state_key = f"contact-{row['conversation_key']}"
+            counts[state_key] = row["unread_count"]
+            if mention_pattern and row["has_mention"]:
+                mention_flags[state_key] = True
+
+        # Last message times for all conversations (including read ones)
+        cursor = await db.conn.execute(
+            """
+            SELECT type, conversation_key, MAX(received_at) as last_message_time
+            FROM messages
+            GROUP BY type, conversation_key
+            """
+        )
+        rows = await cursor.fetchall()
+        for row in rows:
+            prefix = "channel" if row["type"] == "CHAN" else "contact"
+            state_key = f"{prefix}-{row['conversation_key']}"
+            last_message_times[state_key] = row["last_message_time"]
+
+        return {
+            "counts": counts,
+            "mentions": mention_flags,
+            "last_message_times": last_message_times,
+        }
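In the unread queries above, `COALESCE(last_read_at, 0)` treats a never-read conversation as read at the epoch, so every inbound message counts as unread, and `outgoing = 0` keeps the user's own messages out of the count. A reduced in-memory demonstration of the channel variant:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE channels (key TEXT PRIMARY KEY, last_read_at INTEGER)")
conn.execute(
    "CREATE TABLE messages (conversation_key TEXT, outgoing INTEGER, received_at INTEGER)"
)
conn.execute("INSERT INTO channels VALUES ('chan1', 150), ('chan2', NULL)")
conn.executemany(
    "INSERT INTO messages VALUES (?, ?, ?)",
    [
        ("chan1", 0, 100),  # inbound but already read (100 <= 150)
        ("chan1", 0, 200),  # inbound and unread
        ("chan1", 1, 300),  # outgoing: never counted
        ("chan2", 0, 50),   # last_read_at is NULL -> everything is unread
    ],
)

rows = conn.execute(
    """
    SELECT m.conversation_key, COUNT(*) AS unread
    FROM messages m
    JOIN channels c ON m.conversation_key = c.key
    WHERE m.outgoing = 0 AND m.received_at > COALESCE(c.last_read_at, 0)
    GROUP BY m.conversation_key
    """
).fetchall()
counts = dict(rows)
```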
@@ -648,29 +688,6 @@ class RawPacketRepository:
         )
         await db.conn.commit()

-    @staticmethod
-    async def get_undecrypted(limit: int = 100) -> list[RawPacket]:
-        """Get undecrypted packets (those without a linked message)."""
-        cursor = await db.conn.execute(
-            """
-            SELECT id, timestamp, data, message_id FROM raw_packets
-            WHERE message_id IS NULL
-            ORDER BY timestamp DESC
-            LIMIT ?
-            """,
-            (limit,),
-        )
-        rows = await cursor.fetchall()
-        return [
-            RawPacket(
-                id=row["id"],
-                timestamp=row["timestamp"],
-                data=row["data"].hex(),
-                message_id=row["message_id"],
-            )
-            for row in rows
-        ]
-
     @staticmethod
     async def prune_old_undecrypted(max_age_days: int) -> int:
         """Delete undecrypted packets older than max_age_days. Returns count deleted."""
@@ -717,7 +734,8 @@ class AppSettingsRepository:
         cursor = await db.conn.execute(
             """
             SELECT max_radio_contacts, favorites, auto_decrypt_dm_on_advert,
-                   sidebar_sort_order, last_message_times, preferences_migrated
+                   sidebar_sort_order, last_message_times, preferences_migrated,
+                   advert_interval, last_advert_time, bots
             FROM app_settings WHERE id = 1
             """
         )
@@ -753,6 +771,20 @@ class AppSettingsRepository:
             )
             last_message_times = {}

+        # Parse bots JSON
+        bots: list[BotConfig] = []
+        if row["bots"]:
+            try:
+                bots_data = json.loads(row["bots"])
+                bots = [BotConfig(**b) for b in bots_data]
+            except (json.JSONDecodeError, TypeError, KeyError) as e:
+                logger.warning(
+                    "Failed to parse bots JSON, using empty list: %s (data=%r)",
+                    e,
+                    row["bots"][:100] if row["bots"] else None,
+                )
+                bots = []
+
         # Validate sidebar_sort_order (fallback to "recent" if invalid)
         sort_order = row["sidebar_sort_order"]
         if sort_order not in ("recent", "alpha"):
@@ -765,6 +797,9 @@ class AppSettingsRepository:
             sidebar_sort_order=sort_order,
             last_message_times=last_message_times,
             preferences_migrated=bool(row["preferences_migrated"]),
+            advert_interval=row["advert_interval"] or 0,
+            last_advert_time=row["last_advert_time"] or 0,
+            bots=bots,
         )

     @staticmethod
@@ -775,6 +810,9 @@ class AppSettingsRepository:
         sidebar_sort_order: str | None = None,
         last_message_times: dict[str, int] | None = None,
         preferences_migrated: bool | None = None,
+        advert_interval: int | None = None,
+        last_advert_time: int | None = None,
+        bots: list[BotConfig] | None = None,
     ) -> AppSettings:
         """Update app settings. Only provided fields are updated."""
         updates = []
@@ -805,6 +843,19 @@ class AppSettingsRepository:
             updates.append("preferences_migrated = ?")
             params.append(1 if preferences_migrated else 0)

+        if advert_interval is not None:
+            updates.append("advert_interval = ?")
+            params.append(advert_interval)
+
+        if last_advert_time is not None:
+            updates.append("last_advert_time = ?")
+            params.append(last_advert_time)
+
+        if bots is not None:
+            updates.append("bots = ?")
+            bots_json = json.dumps([b.model_dump() for b in bots])
+            params.append(bots_json)
+
         if updates:
             query = f"UPDATE app_settings SET {', '.join(updates)} WHERE id = 1"
             await db.conn.execute(query, params)
@@ -833,33 +884,6 @@ class AppSettingsRepository:
         ]
         return await AppSettingsRepository.update(favorites=new_favorites)

-    @staticmethod
-    async def update_last_message_time(state_key: str, timestamp: int) -> None:
-        """Update the last message time for a conversation atomically.
-
-        Only updates if the new timestamp is greater than the existing one.
-        Uses SQLite's json_set for atomic update to avoid race conditions.
-        """
-        # Use COALESCE to handle NULL or missing keys, json_set for atomic update
-        # Only update if new timestamp > existing (or key doesn't exist)
-        await db.conn.execute(
-            """
-            UPDATE app_settings
-            SET last_message_times = json_set(
-                COALESCE(last_message_times, '{}'),
-                '$.' || ?,
-                ?
-            )
-            WHERE id = 1
-              AND (
-                json_extract(last_message_times, '$.' || ?) IS NULL
-                OR json_extract(last_message_times, '$.' || ?) < ?
-              )
-            """,
-            (state_key, timestamp, state_key, state_key, timestamp),
-        )
-        await db.conn.commit()
-
     @staticmethod
     async def migrate_preferences_from_frontend(
         favorites: list[dict],
@@ -1,5 +1,6 @@
 import asyncio
 import logging
+import random

 from fastapi import APIRouter, BackgroundTasks, HTTPException, Query
 from meshcore import EventType
@@ -15,11 +16,12 @@ from app.models import (
     NeighborInfo,
     TelemetryRequest,
     TelemetryResponse,
+    TraceResponse,
 )
 from app.packet_processor import start_historical_dm_decryption
 from app.radio import radio_manager
-from app.repository import ContactRepository
+from app.radio_sync import pause_polling
+from app.repository import ContactRepository, MessageRepository

 logger = logging.getLogger(__name__)
@@ -119,8 +121,9 @@ async def create_contact(
         return existing

     # Create new contact
+    lower_key = request.public_key.lower()
     contact_data = {
-        "public_key": request.public_key,
+        "public_key": lower_key,
         "name": request.name,
         "type": 0,  # Unknown
         "flags": 0,
@@ -134,11 +137,16 @@ async def create_contact(
         "last_contacted": None,
     }
     await ContactRepository.upsert(contact_data)
-    logger.info("Created contact %s", request.public_key[:12])
+    logger.info("Created contact %s", lower_key[:12])
+
+    # Promote any prefix-stored messages to this full key
+    claimed = await MessageRepository.claim_prefix_messages(lower_key)
+    if claimed > 0:
+        logger.info("Claimed %d prefix messages for contact %s", claimed, lower_key[:12])

     # Trigger historical decryption if requested
     if request.try_historical:
-        await start_historical_dm_decryption(background_tasks, request.public_key, request.name)
+        await start_historical_dm_decryption(background_tasks, lower_key, request.name)

     return Contact(**contact_data)
@@ -361,8 +369,49 @@ async def request_telemetry(public_key: str, request: TelemetryRequest) -> TelemetryResponse:
             )
         )

+    # Fetch clock output (up to 2 attempts)
+    # Must pause polling and stop auto-fetch to prevent race condition where
+    # the CLI response is consumed before we can call get_msg()
+    logger.info("Fetching clock from repeater %s", contact.public_key[:12])
+    clock_output: str | None = None
+
+    async with pause_polling():
+        await mc.stop_auto_message_fetching()
+        try:
+            for attempt in range(1, 3):
+                logger.debug("Clock request attempt %d/2", attempt)
+                try:
+                    send_result = await mc.commands.send_cmd(contact.public_key, "clock")
+                    if send_result.type == EventType.ERROR:
+                        logger.debug("Clock command send error: %s", send_result.payload)
+                        continue
+
+                    # Wait for response
+                    wait_result = await mc.wait_for_event(EventType.MESSAGES_WAITING, timeout=5.0)
+                    if wait_result is None:
+                        logger.debug("Clock request timeout, retrying...")
+                        continue
+
+                    response_event = await mc.commands.get_msg()
+                    if response_event.type == EventType.ERROR:
+                        logger.debug("Clock get_msg error: %s", response_event.payload)
+                        continue
+
+                    clock_output = response_event.payload.get("text", "")
+                    logger.info("Received clock output: %s", clock_output)
+                    break
+                except Exception as e:
+                    logger.debug("Clock request exception: %s", e)
+                    continue
+        finally:
+            await mc.start_auto_message_fetching()
+
+    if clock_output is None:
+        clock_output = "Unable to fetch `clock` output (repeater did not respond)"
+
     # Convert raw telemetry to response format
     # bat is in mV, convert to V (e.g., 3775 -> 3.775)

     return TelemetryResponse(
         pubkey_prefix=status.get("pubkey_pre", contact.public_key[:12]),
         battery_volts=status.get("bat", 0) / 1000.0,
@@ -384,6 +433,7 @@ async def request_telemetry(public_key: str, request: TelemetryRequest) -> TelemetryResponse:
         full_events=status.get("full_evts", 0),
         neighbors=neighbors,
         acl=acl_entries,
+        clock_output=clock_output,
     )


@@ -484,3 +534,66 @@ async def send_repeater_command(public_key: str, request: CommandRequest) -> CommandResponse:
     finally:
         # Always restart auto-fetch, even if an error occurred
         await mc.start_auto_message_fetching()


+@router.post("/{public_key}/trace", response_model=TraceResponse)
+async def request_trace(public_key: str) -> TraceResponse:
+    """Send a single-hop trace to a contact and wait for the result.
+
+    The trace path contains the contact's 1-byte pubkey hash as the sole hop
+    (no intermediate repeaters). The radio firmware requires at least one
+    node in the path.
+    """
+    mc = require_connected()
+
+    contact = await ContactRepository.get_by_key_or_prefix(public_key)
+    if not contact:
+        raise HTTPException(status_code=404, detail="Contact not found")
+
+    tag = random.randint(1, 0xFFFFFFFF)
+    # First 2 hex chars of pubkey = 1-byte hash used by the trace protocol
+    contact_hash = contact.public_key[:2]
+
+    # Note: unlike command/telemetry endpoints, trace does NOT need
+    # stop/start_auto_message_fetching because the response arrives as a
+    # TRACE_DATA event through the reader loop, not via get_msg().
+    async with pause_polling():
+        # Ensure contact is on radio so the trace can reach them
+        await mc.commands.add_contact(contact.to_radio_dict())
+
+        logger.info(
+            "Sending trace to %s (tag=%d, hash=%s)", contact.public_key[:12], tag, contact_hash
+        )
+        result = await mc.commands.send_trace(path=contact_hash, tag=tag)
+
+        if result.type == EventType.ERROR:
+            raise HTTPException(status_code=500, detail=f"Failed to send trace: {result.payload}")
+
+        # Wait for the matching TRACE_DATA event
+        event = await mc.wait_for_event(
+            EventType.TRACE_DATA,
+            attribute_filters={"tag": tag},
+            timeout=15,
+        )
+
+    if event is None:
+        raise HTTPException(status_code=504, detail="No trace response heard")
+
+    trace = event.payload
+    path = trace.get("path", [])
+    path_len = trace.get("path_len", 0)
+
+    # remote_snr: first entry in path (what the target heard us at)
+    remote_snr = path[0]["snr"] if path else None
+    # local_snr: last entry in path (what we heard them at on the bounce-back)
+    local_snr = path[-1]["snr"] if path else None
+
+    logger.info(
+        "Trace result for %s: path_len=%d, remote_snr=%s, local_snr=%s",
+        contact.public_key[:12],
+        path_len,
+        remote_snr,
+        local_snr,
+    )
+
+    return TraceResponse(remote_snr=remote_snr, local_snr=local_snr, path_len=path_len)
@@ -18,10 +18,8 @@ class HealthResponse(BaseModel):
    oldest_undecrypted_timestamp: int | None


@router.get("/health", response_model=HealthResponse)
async def healthcheck() -> HealthResponse:
    """Check if the API is running and if the radio is connected."""
    # Get database file size in MB
async def build_health_data(radio_connected: bool, serial_port: str | None) -> dict:
    """Build the health status payload used by REST endpoint and WebSocket broadcasts."""
    db_size_mb = 0.0
    try:
        db_size_bytes = os.path.getsize(settings.database_path)
@@ -29,17 +27,23 @@ async def healthcheck() -> HealthResponse:
    except OSError:
        pass

    # Get oldest undecrypted packet info (gracefully handle if DB not connected)
    oldest_ts = None
    try:
        oldest_ts = await RawPacketRepository.get_oldest_undecrypted()
    except RuntimeError:
        pass  # Database not connected

    return HealthResponse(
        status="ok" if radio_manager.is_connected else "degraded",
        radio_connected=radio_manager.is_connected,
        serial_port=radio_manager.port,
        database_size_mb=db_size_mb,
        oldest_undecrypted_timestamp=oldest_ts,
    )
    return {
        "status": "ok" if radio_connected else "degraded",
        "radio_connected": radio_connected,
        "serial_port": serial_port,
        "database_size_mb": db_size_mb,
        "oldest_undecrypted_timestamp": oldest_ts,
    }


@router.get("/health", response_model=HealthResponse)
async def healthcheck() -> HealthResponse:
    """Check if the API is running and if the radio is connected."""
    data = await build_health_data(radio_manager.is_connected, radio_manager.port)
    return HealthResponse(**data)
@@ -1,3 +1,4 @@
import asyncio
import logging
import time

@@ -21,6 +22,10 @@ async def list_messages(
    conversation_key: str | None = Query(
        default=None, description="Filter by conversation key (channel key or contact pubkey)"
    ),
    before: int | None = Query(
        default=None, description="Cursor: received_at of last seen message"
    ),
    before_id: int | None = Query(default=None, description="Cursor: id of last seen message"),
) -> list[Message]:
    """List messages from the database."""
    return await MessageRepository.get_all(
@@ -28,22 +33,11 @@ async def list_messages(
        offset=offset,
        msg_type=type,
        conversation_key=conversation_key,
        before=before,
        before_id=before_id,
    )
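The `(before, before_id)` pair implements keyset pagination: `received_at` alone is not unique, so the message id breaks ties. A minimal in-memory sketch of the same comparison (the SQL inside `MessageRepository.get_all` is assumed to be equivalent; this helper is ours):

```python
def page_after_cursor(messages, before=None, before_id=None, limit=100):
    """Return the next page of messages older than the (before, before_id) cursor.

    Messages are ordered newest-first by (received_at, id). A row is past the
    cursor when received_at < before, or received_at == before and id < before_id.
    """
    def older(m):
        if before is None:
            return True  # no cursor: start from the newest message
        if m["received_at"] != before:
            return m["received_at"] < before
        return before_id is None or m["id"] < before_id

    ordered = sorted(messages, key=lambda m: (m["received_at"], m["id"]), reverse=True)
    return [m for m in ordered if older(m)][:limit]
```

Compared with offset pagination, the cursor stays stable when new messages arrive between page fetches, which matters for a live chat view.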
@router.post("/bulk", response_model=dict[str, list[Message]])
async def get_messages_bulk(
    conversations: list[dict],
    limit_per_conversation: int = Query(default=100, ge=1, le=1000),
) -> dict[str, list[Message]]:
    """Fetch messages for multiple conversations in one request.

    Body should be a list of {type: 'PRIV'|'CHAN', conversation_key: string}.
    Returns a dict mapping 'type:conversation_key' to list of messages.
    """
    return await MessageRepository.get_bulk(conversations, limit_per_conversation)


@router.post("/direct", response_model=Message)
async def send_direct_message(request: SendDirectMessageRequest) -> Message:
    """Send a direct message to a contact."""
@@ -58,38 +52,42 @@ async def send_direct_message(request: SendDirectMessageRequest) -> Message:
            status_code=404, detail=f"Contact not found in database: {request.destination}"
        )

    # Check if contact is on radio, if not add it
    # Always add/update the contact on radio before sending.
    # The library cache (get_contact_by_key_prefix) can be stale after radio reboot,
    # so we can't rely on it to know if the firmware has the contact.
    # add_contact is idempotent - updates if exists, adds if not.
    contact_data = db_contact.to_radio_dict()
    logger.debug("Ensuring contact %s is on radio before sending", db_contact.public_key[:12])
    add_result = await mc.commands.add_contact(contact_data)
    if add_result.type == EventType.ERROR:
        logger.warning("Failed to add contact to radio: %s", add_result.payload)
        # Continue anyway - might still work if contact exists

    # Get the contact from the library cache (may have updated info like path)
    contact = mc.get_contact_by_key_prefix(db_contact.public_key[:12])
    if not contact:
        logger.info("Adding contact %s to radio before sending", db_contact.public_key[:12])
        contact_data = db_contact.to_radio_dict()
        add_result = await mc.commands.add_contact(contact_data)
        if add_result.type == EventType.ERROR:
            logger.warning("Failed to add contact to radio: %s", add_result.payload)
            # Continue anyway - might still work

        # Get the contact from radio again
        contact = mc.get_contact_by_key_prefix(db_contact.public_key[:12])
        if not contact:
            # Use the contact_data we built as fallback
            contact = contact_data
        contact = contact_data

    logger.info("Sending direct message to %s", db_contact.public_key[:12])

    # Capture timestamp BEFORE sending so we can pass the same value to both the radio
    # and the database. This ensures consistency for deduplication.
    now = int(time.time())

    result = await mc.commands.send_msg(
        dst=contact,
        msg=request.text,
        timestamp=now,
    )

    if result.type == EventType.ERROR:
        raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")

    # Store outgoing message
    now = int(time.time())
    message_id = await MessageRepository.create(
        msg_type="PRIV",
        text=request.text,
        conversation_key=db_contact.public_key,
        conversation_key=db_contact.public_key.lower(),
        sender_timestamp=now,
        received_at=now,
        outgoing=True,
@@ -101,7 +99,7 @@ async def send_direct_message(request: SendDirectMessageRequest) -> Message:
    )

    # Update last_contacted for the contact
    await ContactRepository.update_last_contacted(db_contact.public_key, now)
    await ContactRepository.update_last_contacted(db_contact.public_key.lower(), now)

    # Track the expected ACK for this message
    expected_ack = result.payload.get("expected_ack")
@@ -111,10 +109,10 @@ async def send_direct_message(request: SendDirectMessageRequest) -> Message:
        track_pending_ack(ack_code, message_id, suggested_timeout)
        logger.debug("Tracking ACK %s for message %d", ack_code, message_id)

    return Message(
    message = Message(
        id=message_id,
        type="PRIV",
        conversation_key=db_contact.public_key,
        conversation_key=db_contact.public_key.lower(),
        text=request.text,
        sender_timestamp=now,
        received_at=now,
@@ -122,6 +120,25 @@ async def send_direct_message(request: SendDirectMessageRequest) -> Message:
        acked=0,
    )

    # Trigger bots for outgoing DMs (runs in background, doesn't block response)
    from app.bot import run_bot_for_message

    asyncio.create_task(
        run_bot_for_message(
            sender_name=None,
            sender_key=db_contact.public_key.lower(),
            message_text=request.text,
            is_dm=True,
            channel_key=None,
            channel_name=None,
            sender_timestamp=now,
            path=None,
            is_outgoing=True,
        )
    )

    return message
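The `asyncio.create_task(run_bot_for_message(...))` pattern above runs bots without blocking the HTTP response. One caveat with bare `create_task` is that an exception in the task disappears unless something retrieves it; a hedged sketch of a fire-and-forget helper that logs failures (the helper name is ours, not the project's):

```python
import asyncio
import logging

logger = logging.getLogger(__name__)


def fire_and_forget(coro) -> asyncio.Task:
    """Schedule a coroutine in the background and log any exception it raises."""
    task = asyncio.create_task(coro)

    def _log_result(t: asyncio.Task) -> None:
        # Retrieving the exception here also silences the
        # "exception was never retrieved" warning.
        if not t.cancelled() and t.exception() is not None:
            logger.error("Background task failed: %s", t.exception())

    task.add_done_callback(_log_result)
    return task
```

Keeping the returned `Task` reference (or registering the callback, as here) also protects the task from being garbage-collected mid-flight.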
# Temporary radio slot used for sending channel messages
TEMP_RADIO_SLOT = 0
@@ -175,9 +192,15 @@ async def send_channel_message(request: SendChannelMessageRequest) -> Message:

    logger.info("Sending channel message to %s: %s", db_channel.name, request.text[:50])

    # Capture timestamp BEFORE sending so we can pass the same value to both the radio
    # and the database. This ensures the echo's timestamp matches our stored message
    # for proper deduplication.
    now = int(time.time())

    result = await mc.commands.send_chan_msg(
        chan=TEMP_RADIO_SLOT,
        msg=request.text,
        timestamp=now.to_bytes(4, "little"),  # Pass as bytes for compatibility
    )

    if result.type == EventType.ERROR:
@@ -186,7 +209,6 @@ async def send_channel_message(request: SendChannelMessageRequest) -> Message:
    # Store outgoing message with sender prefix (to match echo format)
    # The radio includes "SenderName: " prefix when broadcasting, so we store it the same way
    # to enable proper deduplication when the echo comes back
    now = int(time.time())
    channel_key_upper = request.channel_key.upper()
    radio_name = mc.self_info.get("name", "") if mc.self_info else ""
    text_with_sender = f"{radio_name}: {request.text}" if radio_name else request.text
@@ -204,7 +226,7 @@ async def send_channel_message(request: SendChannelMessageRequest) -> Message:
            detail="Failed to store outgoing message - unexpected duplicate",
        )

    return Message(
    message = Message(
        id=message_id,
        type="CHAN",
        conversation_key=channel_key_upper,
@@ -214,3 +236,22 @@ async def send_channel_message(request: SendChannelMessageRequest) -> Message:
        outgoing=True,
        acked=0,
    )

    # Trigger bots for outgoing channel messages (runs in background, doesn't block response)
    from app.bot import run_bot_for_message

    asyncio.create_task(
        run_bot_for_message(
            sender_name=radio_name or None,
            sender_key=None,
            message_text=request.text,
            is_dm=False,
            channel_key=channel_key_upper,
            channel_name=db_channel.name,
            sender_timestamp=now,
            path=None,
            is_outgoing=True,
        )
    )

    return message
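Both send handlers above store the outgoing text exactly as the radio will echo it (sender prefix included) and reuse one timestamp, so the echo can be recognized as a duplicate. A sketch of that deduplication key (field names follow the handlers; the actual check is assumed to live in `MessageRepository.create`):

```python
def with_sender_prefix(radio_name: str, text: str) -> str:
    """Mirror the radio's broadcast format so stored text matches the echo."""
    return f"{radio_name}: {text}" if radio_name else text


def is_duplicate_echo(stored: dict, echo: dict) -> bool:
    """An incoming echo duplicates a stored outgoing message when the
    conversation, the prefixed text, and the sender timestamp all match."""
    return (
        stored["conversation_key"] == echo["conversation_key"]
        and stored["text"] == echo["text"]
        and stored["sender_timestamp"] == echo["sender_timestamp"]
    )
```

This is why the timestamp must be captured once before `send_chan_msg`: if the stored row and the radio packet carried different timestamps, the echo would be indistinguishable from a fresh message.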
@@ -63,11 +63,13 @@ async def _run_historical_channel_decryption(
        msg_id = await create_message_from_decrypted(
            packet_id=packet_id,
            channel_key=channel_key_hex,
            channel_name=display_name,
            sender=result.sender,
            message_text=result.message,
            timestamp=result.timestamp,
            received_at=packet_timestamp,
            path=path_hex,
            trigger_bot=False,  # Historical decryption should not trigger bot
        )

        if msg_id is not None:
@@ -5,6 +5,7 @@ from meshcore import EventType
from pydantic import BaseModel, Field

from app.dependencies import require_connected
from app.radio_sync import send_advertisement as do_send_advertisement
from app.radio_sync import sync_radio_time

logger = logging.getLogger(__name__)
@@ -103,6 +104,11 @@ async def update_radio_config(update: RadioConfigUpdate) -> RadioConfigResponse:
    # Sync time with system clock
    await sync_radio_time()

    # Re-fetch self_info so the response reflects the changes we just made.
    # Commands like set_name() write to flash but don't update the cached
    # self_info - send_appstart() triggers a fresh SELF_INFO from the radio.
    await mc.commands.send_appstart()

    return await get_radio_config()


@@ -128,19 +134,25 @@ async def set_private_key(update: PrivateKeyUpdate) -> dict:


@router.post("/advertise")
async def send_advertisement(flood: bool = True) -> dict:
    """Send a radio advertisement to announce presence on the mesh."""
    mc = require_connected()
async def send_advertisement() -> dict:
    """Send a flood advertisement to announce presence on the mesh.

    logger.info("Sending advertisement (flood=%s)", flood)
    result = await mc.commands.send_advert(flood=flood)
    Manual advertisement requests always send immediately, updating the
    last_advert_time which affects when the next periodic/startup advert
    can occur.

    if result.type == EventType.ERROR:
        raise HTTPException(
            status_code=500, detail=f"Failed to send advertisement: {result.payload}"
        )
    Returns:
        status: "ok" if sent successfully
    """
    require_connected()

    return {"status": "ok", "flood": flood}
    logger.info("Sending flood advertisement")
    success = await do_send_advertisement(force=True)

    if not success:
        raise HTTPException(status_code=500, detail="Failed to send advertisement")

    return {"status": "ok"}
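The advertise endpoint forces an immediate send while periodic adverts respect a minimum interval via `last_advert_time`. A sketch of that gating logic (the class and field names here are ours; `radio_sync.send_advertisement(force=True)` is assumed to behave along these lines):

```python
import time


class AdvertGate:
    """Allow an advert when the interval has elapsed, or always when forced.

    Every successful send (forced or not) updates last_advert_time, pushing
    back the next periodic advert.
    """

    def __init__(self, interval: int, clock=time.time) -> None:
        self.interval = interval  # seconds; 0 disables periodic adverts
        self.clock = clock
        self.last_advert_time = 0.0

    def should_send(self, force: bool = False) -> bool:
        now = self.clock()
        if not force:
            if self.interval <= 0 or now - self.last_advert_time < self.interval:
                return False
        self.last_advert_time = now
        return True
```

Injecting the clock makes the schedule testable without sleeping, which is the usual pattern for time-based gates like this.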
@router.post("/reboot")
@@ -173,13 +185,7 @@ async def reboot_radio() -> dict:
    success = await radio_manager.reconnect()

    if success:
        # Re-register event handlers after successful reconnect
        from app.event_handlers import register_event_handlers

        if radio_manager.meshcore:
            register_event_handlers(radio_manager.meshcore)
            await radio_manager.meshcore.start_auto_message_fetching()
            logger.info("Event handlers re-registered and auto message fetching started")
        await radio_manager.post_connect_setup()

        return {"status": "ok", "message": "Reconnected successfully", "connected": True}
    else:
@@ -212,14 +218,7 @@ async def reconnect_radio() -> dict:
    success = await radio_manager.reconnect()

    if success:
        # Re-register event handlers after successful reconnect
        from app.event_handlers import register_event_handlers

        if radio_manager.meshcore:
            register_event_handlers(radio_manager.meshcore)
            # Restart auto message fetching
            await radio_manager.meshcore.start_auto_message_fetching()
            logger.info("Event handlers re-registered and auto message fetching started")
        await radio_manager.post_connect_setup()

        return {"status": "ok", "message": "Reconnected successfully", "connected": True}
    else:
@@ -3,14 +3,29 @@
import logging
import time

from fastapi import APIRouter
from fastapi import APIRouter, Query

from app.database import db
from app.models import UnreadCounts
from app.repository import ChannelRepository, ContactRepository, MessageRepository

logger = logging.getLogger(__name__)
router = APIRouter(prefix="/read-state", tags=["read-state"])


@router.get("/unreads", response_model=UnreadCounts)
async def get_unreads(
    name: str | None = Query(default=None, description="User's name for @mention detection"),
) -> UnreadCounts:
    """Get unread counts, mention flags, and last message times for all conversations.

    Computes unread counts server-side using last_read_at timestamps on
    channels and contacts, avoiding the need to fetch bulk messages.
    """
    data = await MessageRepository.get_unread_counts(name)
    return UnreadCounts(**data)


@router.post("/mark-all-read")
async def mark_all_read() -> dict:
    """Mark all contacts and channels as read.
@@ -20,9 +35,8 @@ async def mark_all_read() -> dict:
    """
    now = int(time.time())

    # Update all contacts and channels in one transaction
    await db.conn.execute("UPDATE contacts SET last_read_at = ?", (now,))
    await db.conn.execute("UPDATE channels SET last_read_at = ?", (now,))
    await ContactRepository.mark_all_read(now)
    await ChannelRepository.mark_all_read(now)
    await db.conn.commit()

    logger.info("Marked all contacts and channels as read at %d", now)
@@ -1,16 +1,37 @@
import logging
from typing import Literal

from fastapi import APIRouter
from fastapi import APIRouter, HTTPException
from pydantic import BaseModel, Field

from app.models import AppSettings
from app.models import AppSettings, BotConfig
from app.repository import AppSettingsRepository

logger = logging.getLogger(__name__)
router = APIRouter(prefix="/settings", tags=["settings"])


def validate_bot_code(code: str, bot_name: str | None = None) -> None:
    """Validate bot code syntax. Raises HTTPException on error."""
    if not code or not code.strip():
        return  # Empty code is valid (disables bot)

    try:
        compile(code, "<bot_code>", "exec")
    except SyntaxError as e:
        name_part = f"'{bot_name}' " if bot_name else ""
        raise HTTPException(
            status_code=400,
            detail=f"Bot {name_part}has syntax error at line {e.lineno}: {e.msg}",
        ) from None


def validate_all_bots(bots: list[BotConfig]) -> None:
    """Validate all bots' code syntax. Raises HTTPException on first error."""
    for bot in bots:
        validate_bot_code(bot.code, bot.name)
class AppSettingsUpdate(BaseModel):
    max_radio_contacts: int | None = Field(
        default=None,
@@ -26,6 +47,15 @@ class AppSettingsUpdate(BaseModel):
        default=None,
        description="Sidebar sort order: 'recent' or 'alpha'",
    )
    advert_interval: int | None = Field(
        default=None,
        ge=0,
        description="Periodic advertisement interval in seconds (0 = disabled)",
    )
    bots: list[BotConfig] | None = Field(
        default=None,
        description="List of bot configurations",
    )


class FavoriteRequest(BaseModel):
@@ -33,13 +63,6 @@ class FavoriteRequest(BaseModel):
    id: str = Field(description="Channel key or contact public key")


class LastMessageTimeUpdate(BaseModel):
    state_key: str = Field(
        description="Conversation state key (e.g., 'channel-KEY' or 'contact-PREFIX')"
    )
    timestamp: int = Field(description="Unix timestamp of the last message")


class MigratePreferencesRequest(BaseModel):
    favorites: list[FavoriteRequest] = Field(
        default_factory=list,
@@ -85,26 +108,21 @@ async def update_settings(update: AppSettingsUpdate) -> AppSettings:
        logger.info("Updating sidebar_sort_order to %s", update.sidebar_sort_order)
        kwargs["sidebar_sort_order"] = update.sidebar_sort_order

    if update.advert_interval is not None:
        logger.info("Updating advert_interval to %d", update.advert_interval)
        kwargs["advert_interval"] = update.advert_interval

    if update.bots is not None:
        validate_all_bots(update.bots)
        logger.info("Updating bots (count=%d)", len(update.bots))
        kwargs["bots"] = update.bots

    if kwargs:
        return await AppSettingsRepository.update(**kwargs)

    return await AppSettingsRepository.get()


@router.post("/favorites", response_model=AppSettings)
async def add_favorite(request: FavoriteRequest) -> AppSettings:
    """Add a conversation to favorites."""
    logger.info("Adding favorite: %s %s", request.type, request.id[:12])
    return await AppSettingsRepository.add_favorite(request.type, request.id)


@router.delete("/favorites", response_model=AppSettings)
async def remove_favorite(request: FavoriteRequest) -> AppSettings:
    """Remove a conversation from favorites."""
    logger.info("Removing favorite: %s %s", request.type, request.id[:12])
    return await AppSettingsRepository.remove_favorite(request.type, request.id)


@router.post("/favorites/toggle", response_model=AppSettings)
async def toggle_favorite(request: FavoriteRequest) -> AppSettings:
    """Toggle a conversation's favorite status."""
@@ -119,17 +137,6 @@ async def toggle_favorite(request: FavoriteRequest) -> AppSettings:
    return await AppSettingsRepository.add_favorite(request.type, request.id)


@router.post("/last-message-time")
async def update_last_message_time(request: LastMessageTimeUpdate) -> dict:
    """Update the last message time for a conversation.

    Used to track when conversations last received messages for sidebar sorting.
    Only updates if the new timestamp is greater than the existing one.
    """
    await AppSettingsRepository.update_last_message_time(request.state_key, request.timestamp)
    return {"status": "ok"}


@router.post("/migrate", response_model=MigratePreferencesResponse)
async def migrate_preferences(request: MigratePreferencesRequest) -> MigratePreferencesResponse:
    """Migrate all preferences from frontend localStorage to database.
@@ -1,13 +1,11 @@
"""WebSocket router for real-time updates."""

import logging
import os

from fastapi import APIRouter, WebSocket, WebSocketDisconnect

from app.config import settings
from app.radio import radio_manager
from app.repository import ChannelRepository, ContactRepository, RawPacketRepository
from app.routers.health import build_health_data
from app.websocket import ws_manager

logger = logging.getLogger(__name__)
@@ -16,60 +14,18 @@ router = APIRouter()

@router.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket) -> None:
    """WebSocket endpoint for real-time updates."""
    """WebSocket endpoint for real-time updates.

    Only sends health status on initial connect. Contacts and channels
    are fetched via REST endpoints for faster parallel loading.
    """
    await ws_manager.connect(websocket)

    # Send initial state
    # Send initial health status
    try:
        # Health status
        db_size_mb = 0.0
        try:
            db_size_bytes = os.path.getsize(settings.database_path)
            db_size_mb = round(db_size_bytes / (1024 * 1024), 2)
        except OSError:
            pass

        # Get oldest undecrypted packet info
        oldest_ts = None
        try:
            oldest_ts = await RawPacketRepository.get_oldest_undecrypted()
        except RuntimeError:
            pass  # Database not connected

        health_data = {
            "status": "ok" if radio_manager.is_connected else "degraded",
            "radio_connected": radio_manager.is_connected,
            "serial_port": radio_manager.port,
            "database_size_mb": db_size_mb,
            "oldest_undecrypted_timestamp": oldest_ts,
        }
        health_data = await build_health_data(radio_manager.is_connected, radio_manager.port)
        await ws_manager.send_personal(websocket, "health", health_data)

        # Contacts - fetch all by paginating until exhausted
        all_contacts = []
        chunk_size = 500
        offset = 0
        while True:
            chunk = await ContactRepository.get_all(limit=chunk_size, offset=offset)
            all_contacts.extend(chunk)
            if len(chunk) < chunk_size:
                break
            offset += chunk_size

        await ws_manager.send_personal(
            websocket,
            "contacts",
            [c.model_dump() for c in all_contacts],
        )

        # Channels
        channels = await ChannelRepository.get_all()
        await ws_manager.send_personal(
            websocket,
            "channels",
            [c.model_dump() for c in channels],
        )

    except Exception as e:
        logger.error("Error sending initial state: %s", e)
@@ -3,13 +3,10 @@
import asyncio
import json
import logging
import os
from typing import Any

from fastapi import WebSocket

from app.config import settings

logger = logging.getLogger(__name__)

# Timeout for individual WebSocket send operations (seconds)
@@ -128,33 +125,11 @@ def broadcast_success(message: str, details: str | None = None) -> None:

def broadcast_health(radio_connected: bool, serial_port: str | None = None) -> None:
    """Broadcast health status change to all connected clients."""
    from app.repository import RawPacketRepository

    async def _broadcast():
        # Get database file size in MB
        db_size_mb = 0.0
        try:
            db_size_bytes = os.path.getsize(settings.database_path)
            db_size_mb = round(db_size_bytes / (1024 * 1024), 2)
        except OSError:
            pass
        from app.routers.health import build_health_data

        # Get oldest undecrypted packet info
        oldest_ts = None
        try:
            oldest_ts = await RawPacketRepository.get_oldest_undecrypted()
        except RuntimeError:
            pass  # Database not connected

        await ws_manager.broadcast(
            "health",
            {
                "status": "ok" if radio_connected else "degraded",
                "radio_connected": radio_connected,
                "serial_port": serial_port,
                "database_size_mb": db_size_mb,
                "oldest_undecrypted_timestamp": oldest_ts,
            },
        )
        data = await build_health_data(radio_connected, serial_port)
        await ws_manager.broadcast("health", data)

    asyncio.create_task(_broadcast())
@@ -1,4 +1,4 @@
# Frontend CLAUDE.md
# Frontend AGENTS.md

This document provides context for AI assistants and developers working on the React frontend.

@@ -90,10 +90,13 @@ The `preferences_migrated` flag prevents duplicate migrations.

### State Flow

1. **WebSocket** pushes real-time updates (health, contacts, channels, messages)
2. **REST API** fetches initial data and handles user actions
1. **REST API** fetches initial data on mount in parallel (config, settings, channels, contacts, unreads)
2. **WebSocket** pushes real-time updates (health, messages, contact changes, raw packets)
3. **Components** receive state as props, call handlers to trigger changes

**Note:** Contacts and channels are loaded via REST on mount (not from WebSocket initial push).
The WebSocket only sends health on initial connect, then broadcasts real-time updates.

### Conversation Header

For contacts, the header shows path information alongside "Last heard":
@@ -145,7 +148,7 @@ await api.getHealth();
// Radio
await api.getRadioConfig();
await api.updateRadioConfig({ name: 'MyRadio' });
await api.sendAdvertisement(true);
await api.sendAdvertisement();

// Contacts/Channels
await api.getContacts();
@@ -189,19 +192,11 @@ server: {

## Type Definitions (`types.ts`)

### Key Type Aliases

```typescript
type PublicKey = string;     // 64-char hex identifying a contact/node
type PubkeyPrefix = string;  // 12-char hex prefix (used in message routing)
type ChannelKey = string;    // 32-char hex identifying a channel
```

### Key Interfaces

```typescript
interface Contact {
  public_key: PublicKey;
  public_key: string;  // 64-char hex public key
  name: string | null;
  type: number;  // 0=unknown, 1=client, 2=repeater, 3=room
  on_radio: boolean;
@@ -212,7 +207,7 @@ interface Contact {
}

interface Channel {
  key: ChannelKey;
  key: string;  // 32-char hex channel key
  name: string;
  is_hashtag: boolean;
  on_radio: boolean;
@@ -221,7 +216,7 @@ interface Channel {
interface Message {
  id: number;
  type: 'PRIV' | 'CHAN';
  conversation_key: string;  // PublicKey for PRIV, ChannelKey for CHAN
  conversation_key: string;  // public key for PRIV, channel key for CHAN
  text: string;
  outgoing: boolean;
  acked: number;  // 0=not acked, 1+=ack count (flood echoes)
@@ -229,8 +224,8 @@ interface Message {
}

interface Conversation {
  type: 'contact' | 'channel' | 'raw' | 'map';
  id: string;  // PublicKey for contacts, ChannelKey for channels, 'raw'/'map' for special views
  type: 'contact' | 'channel' | 'raw' | 'map' | 'visualizer';
  id: string;  // public key for contacts, channel key for channels, 'raw'/'map'/'visualizer' for special views
  name: string;
}

@@ -397,11 +392,8 @@ for local state tracking, while `conversation_key` is the raw database field.

Unread tracking uses server-side `last_read_at` timestamps for cross-device consistency:

```typescript
// Contacts and channels include last_read_at from server
interface Contact {
  // ...
  last_read_at: number | null;  // Unix timestamp when conversation was last read
}
// Fetch aggregated unread counts from server (replaces bulk message fetch + client-side counting)
await api.getUnreads(myName);  // Returns { counts, mentions, last_message_times }

// Mark as read via API (called automatically when viewing conversation)
await api.markContactRead(publicKey);
@@ -409,7 +401,9 @@ await api.markChannelRead(channelKey);
await api.markAllRead();  // Bulk mark all as read
```

Unread count = messages where `received_at > last_read_at`.
The `useUnreadCounts` hook fetches counts from `GET /api/read-state/unreads` on mount and
when channels/contacts change. Real-time increments are still tracked client-side via WebSocket
message events. The server computes unread counts using `last_read_at` vs `received_at`.
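The unread rule described above (`received_at > last_read_at`, computed server-side) can be sketched as a single pass over messages; this Python mockup mirrors what the backend is described as doing, with illustrative field names:

```python
def unread_counts(messages, last_read_at):
    """Count messages newer than each conversation's last_read_at.

    messages: iterable of {"conversation_key": str, "received_at": int}
    last_read_at: dict mapping conversation_key -> unix timestamp or None
    (None means the conversation has never been read, so everything counts.)
    """
    counts: dict[str, int] = {}
    for m in messages:
        key = m["conversation_key"]
        read_at = last_read_at.get(key)
        if read_at is None or m["received_at"] > read_at:
            counts[key] = counts.get(key, 0) + 1
    return counts
```

Since `last_read_at` lives in the database rather than localStorage, reading a conversation on one device clears the badge everywhere, which is the cross-device consistency the section refers to.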
## Utility Functions
BIN
frontend/dist/apple-touch-icon.png
vendored
|
Before Width: | Height: | Size: 16 KiB |
1
frontend/dist/assets/index-Bg6UtK9Z.css
vendored
542
frontend/dist/assets/index-CQVFPi-8.js
vendored
1
frontend/dist/assets/index-CQVFPi-8.js.map
vendored
2
frontend/dist/assets/wordlist-BtmChKSf.js
vendored
BIN
frontend/dist/favicon-96x96.png
vendored
|
Before Width: | Height: | Size: 9.5 KiB |
BIN
frontend/dist/favicon.ico
vendored
|
Before Width: | Height: | Size: 15 KiB |
3
frontend/dist/favicon.svg
vendored
|
Before Width: | Height: | Size: 128 KiB |
BIN
frontend/dist/glyph.png
vendored
|
Before Width: | Height: | Size: 96 KiB |
8
frontend/dist/glyph.svg
vendored
@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg width="700pt" height="700pt" version="1.1" viewBox="0 0 700 700" xmlns="http://www.w3.org/2000/svg">
<path d="m405.89 352.77c0 30.797-25.02 55.762-55.887 55.762s-55.887-24.965-55.887-55.762c0-30.797 25.02-55.762 55.887-55.762s55.887 24.965 55.887 55.762z"/>
<path d="m333.07 352.77h33.871v297.37h-33.871z"/>
<path d="m412.71 495.07-13.648-30.926c44.27-19.438 72.879-63.152 72.879-111.37 0-67.082-54.699-121.65-121.94-121.65-67.242 0-121.94 54.57-121.94 121.65 0 48.215 28.609 91.93 72.879 111.37l-13.648 30.926c-56.547-24.844-93.094-80.695-93.094-142.3 0-85.715 69.887-155.44 155.8-155.44 85.918 0 155.8 69.727 155.8 155.44-0.003906 61.594-36.551 117.46-93.094 142.3z"/>
<path d="m410.17 581.6-8.5742-32.691c89.277-23.309 151.63-103.96 151.63-196.15 0-111.8-91.168-202.75-203.22-202.75-112.06 0.003907-203.23 90.961-203.23 202.77 0 92.184 62.348 172.83 151.63 196.15l-8.5742 32.691c-104.18-27.195-176.93-121.3-176.93-228.83 0-130.43 106.36-236.54 237.1-236.54 130.73 0 237.1 106.12 237.1 236.54-0.003906 107.52-72.754 201.62-176.93 228.82z"/>
<path d="m409.05 661.5-6.3125-33.199c132.34-25.047 228.39-140.93 228.39-275.53 0-154.66-126.12-280.48-281.13-280.48-155 0-281.13 125.82-281.13 280.48 0 134.6 96.055 250.48 228.39 275.53l-6.3164 33.199c-148.3-28.07-255.95-157.91-255.95-308.73 0-173.29 141.32-314.27 315-314.27s315 140.98 315 314.27c0 150.81-107.65 280.66-255.95 308.73z"/>
</svg>

(before: 1.4 KiB)

BIN  frontend/dist/glyph.xcf (vendored)
22   frontend/dist/index.html (vendored)

@@ -1,22 +0,0 @@
<!DOCTYPE html>
<html lang="en" class="dark">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0, viewport-fit=cover" />
    <meta name="mobile-web-app-capable" content="yes" />
    <meta name="apple-mobile-web-app-status-bar-style" content="black-translucent" />
    <meta name="apple-mobile-web-app-title" content="MCTerm" />
    <meta name="theme-color" content="#0a0a0a" />
    <title>RemoteTerm for MeshCore</title>
    <link rel="icon" type="image/png" href="/favicon-96x96.png" sizes="96x96" />
    <link rel="icon" type="image/svg+xml" href="/favicon.svg" />
    <link rel="shortcut icon" href="/favicon.ico" />
    <link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png" />
    <link rel="manifest" href="/site.webmanifest" />
    <script type="module" crossorigin src="/assets/index-CQVFPi-8.js"></script>
    <link rel="stylesheet" crossorigin href="/assets/index-Bg6UtK9Z.css">
  </head>
  <body>
    <div id="root"></div>
  </body>
</html>
21   frontend/dist/site.webmanifest (vendored)

@@ -1,21 +0,0 @@
{
  "name": "RemoteTerm for MeshCore",
  "short_name": "RemoteTerm",
  "icons": [
    {
      "src": "/web-app-manifest-192x192.png",
      "sizes": "192x192",
      "type": "image/png",
      "purpose": "maskable"
    },
    {
      "src": "/web-app-manifest-512x512.png",
      "sizes": "512x512",
      "type": "image/png",
      "purpose": "maskable"
    }
  ],
  "theme_color": "#ffffff",
  "background_color": "#ffffff",
  "display": "standalone"
}

BIN  frontend/dist/web-app-manifest-192x192.png (vendored, before: 15 KiB)
BIN  frontend/dist/web-app-manifest-512x512.png (vendored, before: 46 KiB)
5299 frontend/package-lock.json (generated)

@@ -1,7 +1,7 @@
{
  "name": "remoteterm-meshcore-frontend",
  "private": true,
  "version": "1.4.1",
  "version": "1.8.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
@@ -15,6 +15,8 @@
    "format:check": "prettier --check src/"
  },
  "dependencies": {
    "@codemirror/lang-python": "^6.2.1",
    "@codemirror/theme-one-dark": "^6.1.3",
    "@michaelhart/meshcore-decoder": "^0.2.7",
    "@radix-ui/react-checkbox": "^1.3.3",
    "@radix-ui/react-dialog": "^1.1.15",
@@ -22,6 +24,7 @@
    "@radix-ui/react-separator": "^1.1.8",
    "@radix-ui/react-slot": "^1.2.4",
    "@radix-ui/react-tabs": "^1.1.13",
    "@uiw/react-codemirror": "^4.25.4",
    "class-variance-authority": "^0.7.1",
    "clsx": "^2.1.1",
    "d3-force": "^3.0.0",
@@ -282,12 +282,32 @@ export function App() {
    }
  }, []);

  // Initial fetch for config and settings
  // Fetch all contacts, paginating if >1000
  const fetchAllContacts = useCallback(async (): Promise<Contact[]> => {
    const pageSize = 1000;
    const first = await api.getContacts(pageSize, 0);
    if (first.length < pageSize) return first;
    let all = [...first];
    let offset = pageSize;
    while (true) {
      const page = await api.getContacts(pageSize, offset);
      all = all.concat(page);
      if (page.length < pageSize) break;
      offset += pageSize;
    }
    return all;
  }, []);

  // Initial fetch for config, settings, and data
  useEffect(() => {
    fetchConfig();
    fetchAppSettings();
    fetchUndecryptedCount();
  }, [fetchConfig, fetchAppSettings, fetchUndecryptedCount]);

    // Fetch contacts and channels via REST (parallel, faster than WS serial push)
    api.getChannels().then(setChannels).catch(console.error);
    fetchAllContacts().then(setContacts).catch(console.error);
  }, [fetchConfig, fetchAppSettings, fetchUndecryptedCount, fetchAllContacts]);

  // One-time migration of localStorage preferences to server
  const hasMigratedRef = useRef(false);
@@ -355,61 +375,53 @@ export function App() {
    migratePreferences();
  }, [appSettings]);

  // Resolve URL hash to a conversation
  const resolveHashToConversation = useCallback((): Conversation | null => {
    const hashConv = parseHashConversation();
    if (!hashConv) return null;
  // Phase 1: Set initial conversation from URL hash or default to Public channel
  // Only needs channels (fast path) - doesn't wait for contacts
  const hasSetDefaultConversation = useRef(false);
  useEffect(() => {
    if (hasSetDefaultConversation.current || activeConversation) return;
    if (channels.length === 0) return;

    if (hashConv.type === 'raw') {
      return { type: 'raw', id: 'raw', name: 'Raw Packet Feed' };
    const hashConv = parseHashConversation();

    // Handle non-data views immediately
    if (hashConv?.type === 'raw') {
      setActiveConversation({ type: 'raw', id: 'raw', name: 'Raw Packet Feed' });
      hasSetDefaultConversation.current = true;
      return;
    }
    if (hashConv.type === 'map') {
      return {
    if (hashConv?.type === 'map') {
      setActiveConversation({
        type: 'map',
        id: 'map',
        name: 'Node Map',
        mapFocusKey: hashConv.mapFocusKey,
      };
      });
      hasSetDefaultConversation.current = true;
      return;
    }
    if (hashConv.type === 'visualizer') {
      return { type: 'visualizer', id: 'visualizer', name: 'Mesh Visualizer' };
    }
    if (hashConv.type === 'channel') {
      const channel = channels.find(
        (c) => c.name === hashConv.name || c.name === `#${hashConv.name}`
      );
      if (channel) {
        return { type: 'channel', id: channel.key, name: channel.name };
      }
    }
    if (hashConv.type === 'contact') {
      const contact = contacts.find(
        (c) => getContactDisplayName(c.name, c.public_key) === hashConv.name
      );
      if (contact) {
        return {
          type: 'contact',
          id: contact.public_key,
          name: getContactDisplayName(contact.name, contact.public_key),
        };
      }
    }
    return null;
  }, [channels, contacts]);

  // Set initial conversation from URL hash or default to Public channel
  const hasSetDefaultConversation = useRef(false);
  useEffect(() => {
    if (hasSetDefaultConversation.current || activeConversation) return;
    if (channels.length === 0 && contacts.length === 0) return;

    const conv = resolveHashToConversation();
    if (conv) {
      setActiveConversation(conv);
    if (hashConv?.type === 'visualizer') {
      setActiveConversation({ type: 'visualizer', id: 'visualizer', name: 'Mesh Visualizer' });
      hasSetDefaultConversation.current = true;
      return;
    }

    // Handle channel hash
    if (hashConv?.type === 'channel') {
      const channel = channels.find(
        (c) => c.name === hashConv.name || c.name === `#${hashConv.name}`
      );
      if (channel) {
        setActiveConversation({ type: 'channel', id: channel.key, name: channel.name });
        hasSetDefaultConversation.current = true;
        return;
      }
    }

    // Contact hash — wait for phase 2
    if (hashConv?.type === 'contact') return;

    // No hash or unresolvable — default to Public
    const publicChannel = channels.find((c) => c.name === 'Public');
    if (publicChannel) {
      setActiveConversation({
@@ -419,7 +431,42 @@ export function App() {
      });
      hasSetDefaultConversation.current = true;
    }
  }, [channels, contacts, activeConversation, resolveHashToConversation]);
  }, [channels, activeConversation]);

  // Phase 2: Resolve contact hash (only if phase 1 didn't set a conversation)
  useEffect(() => {
    if (hasSetDefaultConversation.current || activeConversation) return;
    if (contacts.length === 0) return;

    const hashConv = parseHashConversation();
    if (hashConv?.type === 'contact') {
      const contact = contacts.find(
        (c) => getContactDisplayName(c.name, c.public_key) === hashConv.name
      );
      if (contact) {
        setActiveConversation({
          type: 'contact',
          id: contact.public_key,
          name: getContactDisplayName(contact.name, contact.public_key),
        });
        hasSetDefaultConversation.current = true;
        return;
      }
    }

    // Contact hash didn't match — fall back to Public if channels loaded
    if (channels.length > 0) {
      const publicChannel = channels.find((c) => c.name === 'Public');
      if (publicChannel) {
        setActiveConversation({
          type: 'channel',
          id: publicChannel.key,
          name: publicChannel.name,
        });
        hasSetDefaultConversation.current = true;
      }
    }
  }, [contacts, channels, activeConversation]);

  // Keep ref in sync and update URL hash
  useEffect(() => {
@@ -496,7 +543,7 @@ export function App() {
  // Send flood advertisement handler
  const handleAdvertise = useCallback(async () => {
    try {
      await api.sendAdvertisement(true);
      await api.sendAdvertisement();
      toast.success('Advertisement sent');
    } catch (err) {
      console.error('Failed to send advertisement:', err);
@@ -518,29 +565,35 @@ export function App() {
  }, []);

  // Toggle favorite status for a conversation (via API) with optimistic update
  const handleToggleFavorite = useCallback(
    async (type: 'channel' | 'contact', id: string) => {
      // Compute optimistic new state
      const wasFavorited = isFavorite(favorites, type, id);
  const handleToggleFavorite = useCallback(async (type: 'channel' | 'contact', id: string) => {
    // Read current favorites inside the callback to avoid a dependency on the
    // derived `favorites` array (which creates a new reference every render).
    setAppSettings((prev) => {
      if (!prev) return prev;
      const currentFavorites = prev.favorites ?? [];
      const wasFavorited = isFavorite(currentFavorites, type, id);
      const optimisticFavorites = wasFavorited
        ? favorites.filter((f) => !(f.type === type && f.id === id))
        : [...favorites, { type, id }];

      // Optimistic update
      setAppSettings((prev) => (prev ? { ...prev, favorites: optimisticFavorites } : prev));
        ? currentFavorites.filter((f) => !(f.type === type && f.id === id))
        : [...currentFavorites, { type, id }];
      return { ...prev, favorites: optimisticFavorites };
    });

    try {
      const updatedSettings = await api.toggleFavorite(type, id);
      setAppSettings(updatedSettings);
    } catch (err) {
      console.error('Failed to toggle favorite:', err);
      // Revert: re-fetch would be safest, but restoring from server state on next sync
      // is acceptable. For now, just refetch settings.
      try {
        const updatedSettings = await api.toggleFavorite(type, id);
        setAppSettings(updatedSettings);
      } catch (err) {
        console.error('Failed to toggle favorite:', err);
        // Revert on error
        setAppSettings((prev) => (prev ? { ...prev, favorites } : prev));
        toast.error('Failed to update favorite');
        const settings = await api.getSettings();
        setAppSettings(settings);
      } catch {
        // If refetch also fails, leave optimistic state
      }
    },
    [favorites]
  );
      toast.error('Failed to update favorite');
    }
  }, []);

  // Delete channel handler
  const handleDeleteChannel = useCallback(async (key: string) => {
@@ -640,6 +693,24 @@ export function App() {
    [fetchUndecryptedCount]
  );

  // Handle direct trace request
  const handleTrace = useCallback(async () => {
    if (!activeConversation || activeConversation.type !== 'contact') return;
    toast('Trace started...');
    try {
      const result = await api.requestTrace(activeConversation.id);
      const parts: string[] = [];
      if (result.remote_snr !== null) parts.push(`Remote SNR: ${result.remote_snr.toFixed(1)} dB`);
      if (result.local_snr !== null) parts.push(`Local SNR: ${result.local_snr.toFixed(1)} dB`);
      const detail = parts.join(', ');
      toast.success(detail ? `Trace complete! ${detail}` : 'Trace complete!');
    } catch (err) {
      toast.error('Trace failed', {
        description: err instanceof Error ? err.message : 'Unknown error',
      });
    }
  }, [activeConversation]);

  // Handle sort order change via API with optimistic update
  const handleSortOrderChange = useCallback(
    async (order: 'recent' | 'alpha') => {
@@ -687,7 +758,7 @@ export function App() {
  );

  return (
    <div className="flex flex-col h-dvh">
    <div className="flex flex-col h-full">
      <StatusBar
        health={health}
        config={config}
@@ -721,7 +792,12 @@ export function App() {
        </div>
      </>
    ) : activeConversation.type === 'visualizer' ? (
      <VisualizerView packets={rawPackets} contacts={contacts} config={config} />
      <VisualizerView
        packets={rawPackets}
        contacts={contacts}
        config={config}
        onClearPackets={() => setRawPackets([])}
      />
    ) : activeConversation.type === 'raw' ? (
      <>
        <div className="flex justify-between items-center px-4 py-3 border-b border-border font-medium text-lg">
@@ -821,6 +897,16 @@ export function App() {
          })()}
        </span>
        <div className="flex items-center gap-1 flex-shrink-0">
          {/* Direct trace button (contacts only) */}
          {activeConversation.type === 'contact' && (
            <button
              className="p-1.5 rounded hover:bg-accent text-xl leading-none"
              onClick={handleTrace}
              title="Direct Trace"
            >
              🛎
            </button>
          )}
          {/* Favorite button */}
          {(activeConversation.type === 'channel' ||
            activeConversation.type === 'contact') && (
@@ -13,13 +13,12 @@ import type {
  RadioConfig,
  RadioConfigUpdate,
  TelemetryResponse,
  TraceResponse,
  UnreadCounts,
} from './types';

const API_BASE = '/api';

/** Max messages fetched per conversation for unread counting. If count equals this, there may be more. */
export const UNREAD_FETCH_LIMIT = 100;

async function fetchJson<T>(url: string, options?: RequestInit): Promise<T> {
  const res = await fetch(`${API_BASE}${url}`, {
    ...options,
@@ -77,8 +76,8 @@ export const api = {
      method: 'PUT',
      body: JSON.stringify({ private_key: privateKey }),
    }),
  sendAdvertisement: (flood = true) =>
    fetchJson<{ status: string; flood: boolean }>(`/radio/advertise?flood=${flood}`, {
  sendAdvertisement: () =>
    fetchJson<{ status: string }>('/radio/advertise', {
      method: 'POST',
    }),
  rebootRadio: () =>
@@ -93,16 +92,6 @@ export const api = {
  // Contacts
  getContacts: (limit = 100, offset = 0) =>
    fetchJson<Contact[]>(`/contacts?limit=${limit}&offset=${offset}`),
  getContact: (publicKey: string) => fetchJson<Contact>(`/contacts/${publicKey}`),
  syncContacts: () => fetchJson<{ synced: number }>('/contacts/sync', { method: 'POST' }),
  addContactToRadio: (publicKey: string) =>
    fetchJson<{ status: string }>(`/contacts/${publicKey}/add-to-radio`, {
      method: 'POST',
    }),
  removeContactFromRadio: (publicKey: string) =>
    fetchJson<{ status: string }>(`/contacts/${publicKey}/remove-from-radio`, {
      method: 'POST',
    }),
  deleteContact: (publicKey: string) =>
    fetchJson<{ status: string }>(`/contacts/${publicKey}`, {
      method: 'DELETE',
@@ -126,16 +115,18 @@ export const api = {
      method: 'POST',
      body: JSON.stringify({ command }),
    }),
  requestTrace: (publicKey: string) =>
    fetchJson<TraceResponse>(`/contacts/${publicKey}/trace`, {
      method: 'POST',
    }),

  // Channels
  getChannels: () => fetchJson<Channel[]>('/channels'),
  getChannel: (key: string) => fetchJson<Channel>(`/channels/${key}`),
  createChannel: (name: string, key?: string) =>
    fetchJson<Channel>('/channels', {
      method: 'POST',
      body: JSON.stringify({ name, key }),
    }),
  syncChannels: () => fetchJson<{ synced: number }>('/channels/sync', { method: 'POST' }),
  deleteChannel: (key: string) =>
    fetchJson<{ status: string }>(`/channels/${key}`, { method: 'DELETE' }),
  markChannelRead: (key: string) =>
@@ -150,6 +141,8 @@ export const api = {
      offset?: number;
      type?: 'PRIV' | 'CHAN';
      conversation_key?: string;
      before?: number;
      before_id?: number;
    },
    signal?: AbortSignal
  ) => {
@@ -158,20 +151,11 @@ export const api = {
    if (params?.offset) searchParams.set('offset', params.offset.toString());
    if (params?.type) searchParams.set('type', params.type);
    if (params?.conversation_key) searchParams.set('conversation_key', params.conversation_key);
    if (params?.before !== undefined) searchParams.set('before', params.before.toString());
    if (params?.before_id !== undefined) searchParams.set('before_id', params.before_id.toString());
    const query = searchParams.toString();
    return fetchJson<Message[]>(`/messages${query ? `?${query}` : ''}`, { signal });
  },
  getMessagesBulk: (
    conversations: Array<{ type: 'PRIV' | 'CHAN'; conversation_key: string }>,
    limitPerConversation: number = UNREAD_FETCH_LIMIT
  ) =>
    fetchJson<Record<string, Message[]>>(
      `/messages/bulk?limit_per_conversation=${limitPerConversation}`,
      {
        method: 'POST',
        body: JSON.stringify(conversations),
      }
    ),
  sendDirectMessage: (destination: string, text: string) =>
    fetchJson<Message>('/messages/direct', {
      method: 'POST',
@@ -201,6 +185,10 @@ export const api = {
    }),

  // Read State
  getUnreads: (name?: string) => {
    const params = name ? `?name=${encodeURIComponent(name)}` : '';
    return fetchJson<UnreadCounts>(`/read-state/unreads${params}`);
  },
  markAllRead: () =>
    fetchJson<{ status: string; timestamp: number }>('/read-state/mark-all-read', {
      method: 'POST',
@@ -215,29 +203,12 @@ export const api = {
    }),

  // Favorites
  addFavorite: (type: Favorite['type'], id: string) =>
    fetchJson<AppSettings>('/settings/favorites', {
      method: 'POST',
      body: JSON.stringify({ type, id }),
    }),
  removeFavorite: (type: Favorite['type'], id: string) =>
    fetchJson<AppSettings>('/settings/favorites', {
      method: 'DELETE',
      body: JSON.stringify({ type, id }),
    }),
  toggleFavorite: (type: Favorite['type'], id: string) =>
    fetchJson<AppSettings>('/settings/favorites/toggle', {
      method: 'POST',
      body: JSON.stringify({ type, id }),
    }),

  // Last message time tracking
  updateLastMessageTime: (stateKey: string, timestamp: number) =>
    fetchJson<{ status: string }>('/settings/last-message-time', {
      method: 'POST',
      body: JSON.stringify({ state_key: stateKey, timestamp }),
    }),

  // Preferences migration (one-time, from localStorage to database)
  migratePreferences: (request: MigratePreferencesRequest) =>
    fetchJson<MigratePreferencesResponse>('/settings/migrate', {
359  frontend/src/components/AGENTS.md (new file)

@@ -0,0 +1,359 @@
# PacketVisualizer Architecture

This document explains the architecture and design of the PacketVisualizer component, which renders a real-time force-directed graph visualization of mesh network packet traffic.

## Overview

The PacketVisualizer displays:

- **Nodes**: Network participants (self, repeaters, clients)
- **Links**: Connections between nodes based on observed packet paths
- **Particles**: Animated dots traveling along links representing packets in transit

## Architecture: Data Layer vs Rendering Layer

The component is split into two distinct layers to enable future rendering engine swaps (e.g., WebGL, Three.js):

### Data Layer (`useVisualizerData` hook)

The custom hook manages all graph state and simulation logic:

```
Packets → Parse → Aggregate by key → Observation window → Publish → Animate
```

**Key responsibilities:**

- Maintains node and link maps (`nodesRef`, `linksRef`)
- Runs D3 force simulation for layout
- Processes incoming packets with deduplication
- Aggregates packet repeats across multiple paths
- Manages particle queue and animation timing

**State:**

- `nodesRef`: Map of node ID → GraphNode
- `linksRef`: Map of link key → GraphLink
- `particlesRef`: Array of active Particle objects
- `simulationRef`: D3 force simulation instance
- `pendingRef`: Packets in observation window awaiting animation
- `timersRef`: Per-packet publish timers

### Rendering Layer (canvas drawing functions)

Separate pure functions handle all canvas rendering:

- `renderLinks()`: Draws connections between nodes
- `renderParticles()`: Draws animated packets with labels
- `renderNodes()`: Draws node circles with emojis/text

The main component orchestrates rendering via `requestAnimationFrame`.

## Packet Processing Pipeline

### 1. Packet Arrival

When a new packet arrives from the WebSocket:

```typescript
packets.forEach((packet) => {
  if (processedRef.current.has(packet.id)) return; // Skip duplicates
  processedRef.current.add(packet.id);

  const parsed = parsePacket(packet.data);
  const key = generatePacketKey(parsed, packet);
  // ...
});
```

### 2. Key Generation

Packets are grouped by a unique key to aggregate repeats:

| Packet Type    | Key Format                                |
| -------------- | ----------------------------------------- |
| Advertisement  | `ad:{pubkey_prefix_12}`                   |
| Group Text     | `gt:{channel}:{sender}:{content_hash}`    |
| Direct Message | `dm:{src_hash}:{dst_hash}:{content_hash}` |
| Other          | `other:{data_hash}`                       |
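The key scheme in the table can be sketched as follows (illustrative only; the hash function, type names, and field names here are assumptions, not the component's actual code):

```typescript
// Tiny non-cryptographic hash, a stand-in for whatever content hash is used.
function tinyHash(s: string): string {
  let h = 0;
  for (let i = 0; i < s.length; i++) h = (h * 31 + s.charCodeAt(i)) >>> 0;
  return h.toString(16);
}

type ParsedPacket =
  | { kind: 'advert'; pubkey: string }
  | { kind: 'groupText'; channel: string; sender: string; content: string }
  | { kind: 'directMessage'; srcHash: string; dstHash: string; content: string }
  | { kind: 'other'; data: string };

// Repeats of the same logical packet (heard via different paths) map to one key.
function generatePacketKey(p: ParsedPacket): string {
  switch (p.kind) {
    case 'advert':
      return `ad:${p.pubkey.slice(0, 12)}`;
    case 'groupText':
      return `gt:${p.channel}:${p.sender}:${tinyHash(p.content)}`;
    case 'directMessage':
      return `dm:${p.srcHash}:${p.dstHash}:${tinyHash(p.content)}`;
    case 'other':
      return `other:${tinyHash(p.data)}`;
  }
}
```

The important property is determinism: two copies of the same packet must produce the same key so their paths land in the same pending entry.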
### 3. Observation Window

Same packets arriving via different paths are aggregated:

```typescript
if (existing && now < existing.expiresAt) {
  // Append path to existing entry
  existing.paths.push({ nodes: path, snr: packet.snr, timestamp: now });
} else {
  // Create new pending entry with 2-second observation window
  pendingPacketsRef.current.set(key, {
    key,
    label,
    paths: [{ nodes: path, ... }],
    expiresAt: now + OBSERVATION_WINDOW_MS,
  });
}
```

### 4. Publishing & Animation

When the observation window expires, all paths animate simultaneously:

```typescript
function publishPacket(pending: PendingPacket) {
  // Ensure all nodes exist in graph
  // Create links between consecutive nodes
  // Queue particles for ALL paths at once

  for (const observedPath of pending.paths) {
    for (let i = 0; i < path.length - 1; i++) {
      // Spawn particle with negative initial progress for smooth flow
      particlesRef.current.push({
        progress: -(i * HOP_DELAY), // Stagger by hop index
        // ...
      });
    }
  }
}
```

**Key insight:** Particles start with negative progress. This creates smooth flow through multi-hop paths without pausing at intermediate nodes.
## D3 Force Simulation

The layout uses D3's force simulation with these forces:

| Force         | Purpose                                              |
| ------------- | ---------------------------------------------------- |
| `link`        | Pulls connected nodes together                       |
| `charge`      | Repels nodes from each other (self node 6x stronger) |
| `center`      | Gently pulls graph toward center                     |
| `collide`     | Prevents node overlap                                |
| `selfX/selfY` | Anchors self node near center                        |

### Shuffle Layout

The "Shuffle layout" button randomizes all node positions (except self, which stays centered) and reheats the simulation to alpha=1. This lets users try different random starting configurations to find a cleaner layout.

### Continuous Drift

When "Let 'em drift" is enabled, `alphaTarget(0.05)` keeps the simulation running indefinitely, allowing the graph to continuously reorganize into better layouts.

## Node Resolution

Nodes are resolved from various sources:

```typescript
function resolveNode(source, isRepeater, showAmbiguous): string | null {
  // source.type can be: 'pubkey', 'prefix', or 'name'
  // Try to find matching contact
  // If found: use full 12-char prefix as node ID
  // If not found and showAmbiguous: create "?prefix" node
  // Otherwise: return null (path terminates)
}
```

### Ambiguous Nodes

When only a 1-byte prefix is known (from packet path bytes), the node is marked ambiguous and shown with a `?` prefix and gray styling.

### Traffic Pattern Splitting (Experimental)

**Problem:** Multiple physical repeaters can share the same 1-byte prefix (collision). Since packet paths only contain 1-byte hashes, we can't directly distinguish them. However, traffic patterns provide a heuristic.

**Key Insight:** A single physical repeater (even acting as a hub) will have the same sources routing through it regardless of next-hop. But if prefix `32` has completely disjoint sets of sources for different next-hops, those are likely different physical nodes sharing the same prefix.

**Example:**

```
ae -> 32 -> ba -> self (source: ae)
c1 -> 32 -> ba -> self (source: c1)
d1 -> 32 -> 60 -> self (source: d1)
e2 -> 32 -> 60 -> self (source: e2)
```

Analysis:

- Sources {ae, c1} always route through `32` to `ba`
- Sources {d1, e2} always route through `32` to `60`
- These source sets are **disjoint** (no overlap)
- Conclusion: Likely two different physical repeaters sharing prefix `32`

Counter-example (same physical hub):

```
ae -> 32 -> ba -> self
ae -> 32 -> 60 -> self (same source 'ae' routes to different next-hops!)
```

Here source `ae` routes through `32` to BOTH `ba` and `60`. This proves `32` is a single physical hub node with multiple downstream paths. No splitting should occur.

**Algorithm:** When "Heuristically group repeaters by traffic pattern" is enabled:

1. **Record observations** for each ambiguous repeater: `(packetSource, nextHop)` tuples
2. **Analyze disjointness**: Group sources by their next-hop, check for overlap
3. **Split conservatively**: Only split when:
   - Multiple distinct next-hop groups exist
   - Source sets are completely disjoint (no source appears in multiple groups)
   - Each group has at least 20 unique sources (conservative threshold)
4. **Final repeaters** (no next hop, connects directly to self): Never split
   - Rationale: The last repeater before you is clearly a single physical node

**Node ID format:**

- Without splitting (default): `?XX` (e.g., `?32`)
- With splitting (after evidence threshold met): `?XX:>YY` (e.g., `?32:>ba`)
- Final repeater: `?XX` (unchanged, no suffix)

**Implementation Notes:**

- Observations are stored with timestamps and pruned after 30 minutes
- Maximum 200 observations per prefix to limit memory
- Once split, nodes cannot be un-split (be conservative before splitting)
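The disjointness analysis at the heart of the heuristic can be sketched like this (shapes and the 20-source threshold are taken from the description above; this is not the component's actual code):

```typescript
const MIN_SOURCES_PER_GROUP = 20; // conservative evidence threshold from the notes above

// observations: (source, nextHop) pairs recorded for one ambiguous prefix.
// Returns the next-hops to split on, or null when no split is justified.
function shouldSplit(observations: Array<{ source: string; nextHop: string }>): string[] | null {
  // Group the sources seen for each next-hop.
  const byNextHop = new Map<string, Set<string>>();
  for (const { source, nextHop } of observations) {
    if (!byNextHop.has(nextHop)) byNextHop.set(nextHop, new Set());
    byNextHop.get(nextHop)!.add(source);
  }
  if (byNextHop.size < 2) return null; // need multiple distinct next-hop groups

  const seen = new Set<string>();
  for (const sources of byNextHop.values()) {
    if (sources.size < MIN_SOURCES_PER_GROUP) return null; // not enough evidence
    for (const s of sources) {
      if (seen.has(s)) return null; // overlap: one source, two next-hops → single hub
      seen.add(s);
    }
  }
  return [...byNextHop.keys()];
}
```

A single shared source anywhere vetoes the split, which matches the counter-example: a hub routes the same sources to multiple next-hops.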
## Path Building

Paths are constructed from packet data:

```typescript
function buildPath(parsed, packet, myPrefix): string[] {
  const path = [];

  // 1. Add source node (from advert pubkey, DM src hash, or group text sender)
  // 2. Add repeater path (from path bytes in packet header)
  // 3. Add destination (self for incoming, or DM dst hash for outgoing)

  return dedupeConsecutive(path); // Remove consecutive duplicates
}
```
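`dedupeConsecutive` can be as simple as the following (a sketch; the actual helper may differ). It handles cases like the source node also appearing as the first path byte:

```typescript
// Collapse runs of identical consecutive node IDs while preserving order.
function dedupeConsecutive(path: string[]): string[] {
  return path.filter((id, i) => i === 0 || id !== path[i - 1]);
}
```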
## Packet Types & Colors
|
||||
|
||||
| Label | Type | Color |
|
||||
| ----- | -------------- | ---------------- |
|
||||
| AD | Advertisement | Amber (#f59e0b) |
|
||||
| GT | Group Text | Cyan (#06b6d4) |
|
||||
| DM | Direct Message | Purple (#8b5cf6) |
|
||||
| ACK | Acknowledgment | Green (#22c55e) |
|
||||
| TR | Trace | Orange (#f97316) |
|
||||
| RQ | Request | Pink (#ec4899) |
|
||||
| RS | Response | Teal (#14b8a6) |
|
||||
| ? | Unknown | Gray (#6b7280) |
|
||||
|
||||
### Sender Extraction by Packet Type

Different packet types provide different levels of sender identification:

| Packet Type    | Sender Info Available          | Resolution                     |
| -------------- | ------------------------------ | ------------------------------ |
| Advertisement  | Full 32-byte public key        | Exact contact match            |
| AnonRequest    | Full 32-byte public key        | Exact contact match            |
| Group Text     | Sender name (after decryption) | Name lookup                    |
| Direct Message | 1-byte source hash             | Ambiguous (may match multiple) |
| Request        | 1-byte source hash             | Ambiguous                      |
| Other          | None                           | Path bytes only                |

**AnonRequest packets** are particularly useful because they include the sender's full public key (unlike regular Request packets, which only have a 1-byte hash). This allows exact identification of who is making the request.
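The ambiguity in the table above comes from prefix matching: a 1-byte hash can match several contacts, while a full key matches at most one. A minimal sketch (the `Contact` shape and the sample keys are illustrative, not real data):

```typescript
// Contacts are matched by comparing the hash/prefix against the start of
// each contact's public key (hex, case-insensitive).
interface Contact {
  name: string;
  public_key: string;
}

function matchByPrefix(prefix: string, contacts: Contact[]): Contact[] {
  const p = prefix.toLowerCase();
  return contacts.filter((c) => c.public_key.toLowerCase().startsWith(p));
}

// Hypothetical contacts: two share the leading byte 0x32
const contacts: Contact[] = [
  { name: 'Alice', public_key: '32ab11' },
  { name: 'Bob', public_key: '32cd22' },
  { name: 'Carol', public_key: '9fee33' },
];

console.log(matchByPrefix('32', contacts).map((c) => c.name)); // ["Alice", "Bob"] - ambiguous
console.log(matchByPrefix('9f', contacts).map((c) => c.name)); // ["Carol"] - unique
```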
## Canvas Rendering

### Coordinate Transformation

Pan and zoom are applied via a transform matrix:

```typescript
ctx.setTransform(dpr * scale, 0, 0, dpr * scale, dpr * (x + panX), dpr * (y + panY));
```
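Hit-testing (hover, node dragging) has to invert this transform to map a mouse position back into graph coordinates. A minimal sketch: since mouse events arrive in CSS pixels, the `dpr` factor cancels, and `t.x`/`t.y` below stand for the combined translation (`x + panX`, `y + panY`):

```typescript
// Invert the pan/zoom transform: screen (CSS px) -> graph coordinates.
function screenToGraph(
  sx: number,
  sy: number,
  t: { x: number; y: number; scale: number }
): { x: number; y: number } {
  return { x: (sx - t.x) / t.scale, y: (sy - t.y) / t.scale };
}

// Round trip: a graph point drawn at (110, 220) with translation (10, 20)
// and scale 2 maps back to (50, 100).
console.log(screenToGraph(110, 220, { x: 10, y: 20, scale: 2 })); // { x: 50, y: 100 }
```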
### Render Order

1. Clear canvas with background
2. Draw links (gray lines)
3. Draw particles (colored dots with labels)
4. Draw nodes (circles with emojis)
5. Draw hover tooltip if applicable
## Mouse Interactions

| Action                     | Behavior                                         |
| -------------------------- | ------------------------------------------------ |
| Click + drag on node       | Move node to new position (temporarily fixes it) |
| Release dragged node       | Node returns to force-directed layout            |
| Click + drag on empty area | Pan the canvas                                   |
| Scroll wheel               | Zoom in/out                                      |
| Hover over node            | Shows node details, cursor changes to pointer    |

**Node Dragging Implementation:**

- On mouse down over a node, sets `fx`/`fy` (D3 fixed position) to lock it
- On mouse move, updates the fixed position to follow cursor
- On mouse up, clears `fx`/`fy` so node rejoins the simulation
- Simulation is slightly reheated during drag for responsive feedback
## Configuration Options

| Option                     | Default | Description                                               |
| -------------------------- | ------- | --------------------------------------------------------- |
| Ambiguous path repeaters   | On      | Show nodes when only partial prefix known                 |
| Ambiguous sender/recipient | Off     | Show placeholder nodes for unknown senders                |
| Split by traffic pattern   | Off     | Split ambiguous repeaters by next-hop routing (see above) |
| Hide repeaters >48hrs      | Off     | Filter out old repeaters                                  |
| Observation window         | 15 sec  | Wait time for duplicate packets before animating (1-60s)  |
| Let 'em drift              | On      | Continuous layout optimization                            |
| Repulsion                  | 200     | Force strength (50-2500)                                  |
| Packet speed               | 2x      | Particle animation speed multiplier (1x-5x)               |
| Shuffle layout             | -       | Button to randomize node positions and reheat sim         |
| Oooh Big Stretch!          | -       | Button to temporarily increase repulsion then relax       |
| Hide UI                    | Off     | Hide legends and most controls for cleaner view           |
| Full screen                | Off     | Hide the packet feed panel (desktop only)                 |
## File Structure

```
PacketVisualizer.tsx
├── TYPES (GraphNode, GraphLink, Particle, etc.)
├── CONSTANTS (colors, timing, legend items)
├── UTILITY FUNCTIONS
│   ├── simpleHash()
│   ├── parsePacket()
│   ├── getPacketLabel()
│   ├── generatePacketKey()
│   ├── findContactBy*()
│   ├── dedupeConsecutive()
│   ├── analyzeRepeaterTraffic()
│   └── recordTrafficObservation()
├── DATA LAYER HOOK (useVisualizerData)
│   ├── Refs (nodes, links, particles, simulation, pending, timers, trafficPatterns)
│   ├── Simulation initialization
│   ├── Node/link management (addNode, addLink, syncSimulation)
│   ├── Path building (resolveNode, buildPath)
│   ├── Traffic pattern analysis (for repeater disambiguation)
│   └── Packet processing & publishing
├── RENDERING FUNCTIONS
│   ├── renderLinks()
│   ├── renderParticles()
│   └── renderNodes()
└── MAIN COMPONENT (PacketVisualizer)
    ├── State (dimensions, options, transform, hover)
    ├── Event handlers (mouse, wheel)
    ├── Animation loop
    └── JSX (canvas, legend, settings panel)
```
## Performance Considerations

- **Observation window**: the default 15 seconds balances latency vs. path aggregation
- **Max links**: Capped at 100 to prevent graph explosion
- **Particle culling**: Particles removed when progress > 1
- **Node filtering**: Old repeaters can be hidden to reduce clutter
- **requestAnimationFrame**: Render loop tied to display refresh rate
## Future Improvements

The data/rendering split enables:

- WebGL rendering for larger graphs
- 3D visualization
- Different layout algorithms
- Export to other formats
30
frontend/src/components/BotCodeEditor.tsx
Normal file
@@ -0,0 +1,30 @@
import CodeMirror from '@uiw/react-codemirror';
import { python } from '@codemirror/lang-python';
import { oneDark } from '@codemirror/theme-one-dark';

interface BotCodeEditorProps {
  value: string;
  onChange: (value: string) => void;
  id?: string;
}

export function BotCodeEditor({ value, onChange, id }: BotCodeEditorProps) {
  return (
    <div className="w-full overflow-hidden rounded-md border border-input">
      <CodeMirror
        value={value}
        onChange={onChange}
        extensions={[python()]}
        theme={oneDark}
        height="256px"
        basicSetup={{
          lineNumbers: true,
          foldGutter: false,
          highlightActiveLine: true,
        }}
        className="text-sm"
        id={id}
      />
    </div>
  );
}
@@ -1,4 +1,12 @@
import { useEffect, useLayoutEffect, useRef, useCallback, useState, type ReactNode } from 'react';
import {
  useEffect,
  useLayoutEffect,
  useRef,
  useCallback,
  useMemo,
  useState,
  type ReactNode,
} from 'react';
import type { Contact, Message, MessagePath, RadioConfig } from '../types';
import { CONTACT_TYPE_REPEATER } from '../types';
import { formatTime, parseSenderFromText } from '../utils/messageParser';
@@ -19,10 +27,45 @@ interface MessageListProps {
  config?: RadioConfig | null;
}

// Helper to render text with highlighted @[Name] mentions
function renderTextWithMentions(text: string, radioName?: string): ReactNode {
  if (!radioName) return text;
// URL regex for linkifying plain text
const URL_PATTERN =
  /https?:\/\/(www\.)?[-a-zA-Z0-9@:%._+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b([-a-zA-Z0-9()@:%_+.~#?&//=]*)/g;

// Helper to convert URLs in a plain text string into clickable links
function linkifyText(text: string, keyPrefix: string): ReactNode[] {
  const parts: ReactNode[] = [];
  let lastIndex = 0;
  let match: RegExpExecArray | null;
  let keyIndex = 0;

  URL_PATTERN.lastIndex = 0;
  while ((match = URL_PATTERN.exec(text)) !== null) {
    if (match.index > lastIndex) {
      parts.push(text.slice(lastIndex, match.index));
    }
    parts.push(
      <a
        key={`${keyPrefix}-link-${keyIndex++}`}
        href={match[0]}
        target="_blank"
        rel="noopener noreferrer"
        className="text-primary underline hover:text-primary/80"
      >
        {match[0]}
      </a>
    );
    lastIndex = match.index + match[0].length;
  }

  if (lastIndex === 0) return [text];
  if (lastIndex < text.length) {
    parts.push(text.slice(lastIndex));
  }
  return parts;
}

// Helper to render text with highlighted @[Name] mentions and clickable URLs
function renderTextWithMentions(text: string, radioName?: string): ReactNode {
  const mentionPattern = /@\[([^\]]+)\]/g;
  const parts: ReactNode[] = [];
  let lastIndex = 0;
@@ -30,17 +73,17 @@ function renderTextWithMentions(text: string, radioName?: string): ReactNode {
  let keyIndex = 0;

  while ((match = mentionPattern.exec(text)) !== null) {
    // Add text before the match
    // Add text before the match (with linkification)
    if (match.index > lastIndex) {
      parts.push(text.slice(lastIndex, match.index));
      parts.push(...linkifyText(text.slice(lastIndex, match.index), `pre-${keyIndex}`));
    }

    const mentionedName = match[1];
    const isOwnMention = mentionedName === radioName;
    const isOwnMention = radioName ? mentionedName === radioName : false;

    parts.push(
      <span
        key={keyIndex++}
        key={`mention-${keyIndex++}`}
        className={cn(
          'rounded px-0.5',
          isOwnMention ? 'bg-primary/30 text-primary font-medium' : 'bg-muted-foreground/20'
@@ -53,9 +96,9 @@ function renderTextWithMentions(text: string, radioName?: string): ReactNode {
    lastIndex = match.index + match[0].length;
  }

  // Add remaining text after last match
  // Add remaining text after last match (with linkification)
  if (lastIndex < text.length) {
    parts.push(text.slice(lastIndex));
    parts.push(...linkifyText(text.slice(lastIndex), `post-${keyIndex}`));
  }

  return parts.length > 0 ? parts : text;
@@ -196,6 +239,14 @@ export function MessageList({
    }
  }, []);

  // Sort messages by received_at ascending (oldest first)
  // Note: Deduplication is handled by useConversationMessages.addMessageIfNew()
  // and the database UNIQUE constraint on (type, conversation_key, text, sender_timestamp)
  const sortedMessages = useMemo(
    () => [...messages].sort((a, b) => a.received_at - b.received_at),
    [messages]
  );

  // Look up contact by public key
  const getContact = (conversationKey: string | null): Contact | null => {
    if (!conversationKey) return null;
@@ -258,11 +309,6 @@ export function MessageList({
    );
  }

  // Sort messages by received_at ascending (oldest first)
  // Note: Deduplication is handled by useConversationMessages.addMessageIfNew()
  // and the database UNIQUE constraint on (type, conversation_key, text, sender_timestamp)
  const sortedMessages = [...messages].sort((a, b) => a.received_at - b.received_at);

  // Helper to get a unique sender key for grouping messages
  const getSenderKey = (msg: Message, sender: string | null): string => {
    if (msg.outgoing) return '__outgoing__';
@@ -12,8 +12,7 @@ import {
  type SimulationLinkDatum,
} from 'd3-force';
import { MeshCoreDecoder, PayloadType } from '@michaelhart/meshcore-decoder';
import type { Contact, RawPacket, RadioConfig } from '../types';
import { CONTACT_TYPE_REPEATER } from '../utils/contactAvatar';
import { CONTACT_TYPE_REPEATER, type Contact, type RawPacket, type RadioConfig } from '../types';
import { Checkbox } from './ui/checkbox';

// =============================================================================
@@ -76,6 +75,26 @@ interface ParsedPacket {
  dstHash: string | null;
  advertPubkey: string | null;
  groupTextSender: string | null;
  anonRequestPubkey: string | null;
}

// Traffic pattern tracking for smarter repeater disambiguation
interface TrafficObservation {
  source: string; // Node that originated traffic (could be resolved node ID or ambiguous)
  nextHop: string | null; // Next hop after this repeater (null if final hop before self)
  timestamp: number;
}

interface RepeaterTrafficData {
  prefix: string; // The 1-byte hex prefix (e.g., "32")
  observations: TrafficObservation[];
}

// Analysis result for whether to split an ambiguous repeater
interface RepeaterSplitAnalysis {
  shouldSplit: boolean;
  // If shouldSplit, maps nextHop -> the sources that exclusively route through it
  disjointGroups: Map<string, Set<string>> | null;
}

// =============================================================================
@@ -111,6 +130,12 @@ const PARTICLE_SPEED = 0.008;
const DEFAULT_OBSERVATION_WINDOW_SEC = 15;
const FORTY_EIGHT_HOURS_MS = 48 * 60 * 60 * 1000;

// Traffic pattern analysis thresholds
// Be conservative - once split, we can't unsplit, so require strong evidence
const MIN_OBSERVATIONS_TO_SPLIT = 20; // Need at least this many unique sources per next-hop group
const MAX_TRAFFIC_OBSERVATIONS = 200; // Per ambiguous prefix, to limit memory
const TRAFFIC_OBSERVATION_MAX_AGE_MS = 30 * 60 * 1000; // 30 minutes - old observations are pruned

const LEGEND_ITEMS = [
  { emoji: '🟢', label: 'You', size: 'text-xl' },
  { emoji: '📡', label: 'Repeater', size: 'text-base' },
@@ -154,6 +179,7 @@ function parsePacket(hexData: string): ParsedPacket | null {
    dstHash: null,
    advertPubkey: null,
    groupTextSender: null,
    anonRequestPubkey: null,
  };

  if (decoded.payloadType === PayloadType.TextMessage && decoded.payload.decoded) {
@@ -165,6 +191,9 @@ function parsePacket(hexData: string): ParsedPacket | null {
  } else if (decoded.payloadType === PayloadType.GroupText && decoded.payload.decoded) {
    const payload = decoded.payload.decoded as { decrypted?: { sender?: string } };
    result.groupTextSender = payload.decrypted?.sender || null;
  } else if (decoded.payloadType === PayloadType.AnonRequest && decoded.payload.decoded) {
    const payload = decoded.payload.decoded as { senderPublicKey?: string };
    result.anonRequestPubkey = payload.senderPublicKey || null;
  }

  return result;
@@ -186,6 +215,7 @@ function getPacketLabel(payloadType: number): PacketLabel {
    case PayloadType.Trace:
      return 'TR';
    case PayloadType.Request:
    case PayloadType.AnonRequest:
      return 'RQ';
    case PayloadType.Response:
      return 'RS';
@@ -208,6 +238,9 @@ function generatePacketKey(parsed: ParsedPacket, rawPacket: RawPacket): string {
  if (parsed.payloadType === PayloadType.TextMessage) {
    return `dm:${parsed.srcHash || '?'}:${parsed.dstHash || '?'}:${contentHash}`;
  }
  if (parsed.payloadType === PayloadType.AnonRequest && parsed.anonRequestPubkey) {
    return `rq:${parsed.anonRequestPubkey.slice(0, 12)}:${contentHash}`;
  }
  return `other:${contentHash}`;
}
@@ -241,6 +274,104 @@ function dedupeConsecutive<T>(arr: T[]): T[] {
  return arr.filter((item, i) => i === 0 || item !== arr[i - 1]);
}

/**
 * Analyze traffic patterns for an ambiguous repeater prefix to determine if it
 * should be split into multiple nodes.
 *
 * Logic:
 * - Group observations by nextHop
 * - For each nextHop group, collect the set of sources
 * - If any source appears in multiple nextHop groups → same physical node (hub), don't split
 * - If source sets are completely disjoint → likely different physical nodes, split
 *
 * Returns shouldSplit=true only when we have enough evidence of disjoint routing.
 */
function analyzeRepeaterTraffic(data: RepeaterTrafficData): RepeaterSplitAnalysis {
  const now = Date.now();

  // Filter out old observations
  const recentObservations = data.observations.filter(
    (obs) => now - obs.timestamp < TRAFFIC_OBSERVATION_MAX_AGE_MS
  );

  // Group by nextHop (use "self" for null nextHop - final repeater)
  const byNextHop = new Map<string, Set<string>>();
  for (const obs of recentObservations) {
    const hopKey = obs.nextHop ?? 'self';
    if (!byNextHop.has(hopKey)) {
      byNextHop.set(hopKey, new Set());
    }
    byNextHop.get(hopKey)!.add(obs.source);
  }

  // If only one nextHop group, no need to split
  if (byNextHop.size <= 1) {
    return { shouldSplit: false, disjointGroups: null };
  }

  // Check if any source appears in multiple groups (evidence of hub behavior)
  const allSources = new Map<string, string[]>(); // source -> list of nextHops it uses
  for (const [nextHop, sources] of byNextHop) {
    for (const source of sources) {
      if (!allSources.has(source)) {
        allSources.set(source, []);
      }
      allSources.get(source)!.push(nextHop);
    }
  }

  // If any source routes to multiple nextHops, this is a hub - don't split
  for (const [, nextHops] of allSources) {
    if (nextHops.length > 1) {
      return { shouldSplit: false, disjointGroups: null };
    }
  }

  // Check if we have enough observations in each group to be confident
  for (const [, sources] of byNextHop) {
    if (sources.size < MIN_OBSERVATIONS_TO_SPLIT) {
      // Not enough evidence yet - be conservative, don't split
      return { shouldSplit: false, disjointGroups: null };
    }
  }

  // Source sets are disjoint and we have enough data - split!
  return { shouldSplit: true, disjointGroups: byNextHop };
}

/**
 * Record a traffic observation for an ambiguous repeater prefix.
 * Prunes old observations and limits total count.
 */
function recordTrafficObservation(
  trafficData: Map<string, RepeaterTrafficData>,
  prefix: string,
  source: string,
  nextHop: string | null
): void {
  const normalizedPrefix = prefix.toLowerCase();
  const now = Date.now();

  if (!trafficData.has(normalizedPrefix)) {
    trafficData.set(normalizedPrefix, { prefix: normalizedPrefix, observations: [] });
  }

  const data = trafficData.get(normalizedPrefix)!;

  // Add new observation
  data.observations.push({ source, nextHop, timestamp: now });

  // Prune old observations
  data.observations = data.observations.filter(
    (obs) => now - obs.timestamp < TRAFFIC_OBSERVATION_MAX_AGE_MS
  );

  // Limit total count
  if (data.observations.length > MAX_TRAFFIC_OBSERVATIONS) {
    data.observations = data.observations.slice(-MAX_TRAFFIC_OBSERVATIONS);
  }
}

// =============================================================================
// DATA LAYER HOOK
// =============================================================================
@@ -251,6 +382,7 @@ interface UseVisualizerDataOptions {
  config: RadioConfig | null;
  showAmbiguousPaths: boolean;
  showAmbiguousNodes: boolean;
  splitAmbiguousByTraffic: boolean;
  chargeStrength: number;
  letEmDrift: boolean;
  particleSpeedMultiplier: number;
@@ -266,6 +398,7 @@ interface VisualizerData {
  stats: { processed: number; animated: number; nodes: number; links: number };
  randomizePositions: () => void;
  expandContract: () => void;
  clearAndReset: () => void;
}

function useVisualizerData({
@@ -274,6 +407,7 @@ function useVisualizerData({
  config,
  showAmbiguousPaths,
  showAmbiguousNodes,
  splitAmbiguousByTraffic,
  chargeStrength,
  letEmDrift,
  particleSpeedMultiplier,
@@ -287,6 +421,7 @@ function useVisualizerData({
  const processedRef = useRef<Set<number>>(new Set());
  const pendingRef = useRef<Map<string, PendingPacket>>(new Map());
  const timersRef = useRef<Map<string, ReturnType<typeof setTimeout>>>(new Map());
  const trafficPatternsRef = useRef<Map<string, RepeaterTrafficData>>(new Map());
  const speedMultiplierRef = useRef(particleSpeedMultiplier);
  const observationWindowRef = useRef(observationWindowSec * 1000);
  const [stats, setStats] = useState({ processed: 0, animated: 0, nodes: 0, links: 0 });
@@ -334,6 +469,7 @@ function useVisualizerData({
    return () => {
      sim.stop();
    };
    // eslint-disable-next-line react-hooks/exhaustive-deps -- one-time init; dimensions/charge handled by the effect below
  }, []);

  // Update simulation forces when dimensions/charge change
@@ -380,6 +516,7 @@ function useVisualizerData({
      });
      syncSimulation();
    }
    // eslint-disable-next-line react-hooks/exhaustive-deps -- syncSimulation is stable (no deps), defined below
  }, [config, dimensions]);

  // Reset on option changes
@@ -393,8 +530,9 @@ function useVisualizerData({
    pendingRef.current.clear();
    timersRef.current.forEach((t) => clearTimeout(t));
    timersRef.current.clear();
    trafficPatternsRef.current.clear();
    setStats({ processed: 0, animated: 0, nodes: selfNode ? 1 : 0, links: 0 });
  }, [showAmbiguousPaths, showAmbiguousNodes]);
  }, [showAmbiguousPaths, showAmbiguousNodes, splitAmbiguousByTraffic]);

  const syncSimulation = useCallback(() => {
    const sim = simulationRef.current;
@@ -481,15 +619,26 @@ function useVisualizerData({
  }, []);

  // Resolve a node from various sources and add to graph
  // trafficContext is used when splitAmbiguousByTraffic is enabled to create
  // separate nodes for ambiguous repeaters based on their position in traffic flow
  // myPrefix is the user's own 12-char pubkey prefix - if a node matches, return 'self'
  // trafficContext.packetSource is the original source of the packet (for traffic analysis)
  // trafficContext.nextPrefix is the next hop after this repeater
  const resolveNode = useCallback(
    (
      source: { type: 'prefix' | 'pubkey' | 'name'; value: string },
      isRepeater: boolean,
      showAmbiguous: boolean
      showAmbiguous: boolean,
      myPrefix: string | null,
      trafficContext?: { packetSource: string | null; nextPrefix: string | null }
    ): string | null => {
      if (source.type === 'pubkey') {
        if (source.value.length < 12) return null;
        const nodeId = source.value.slice(0, 12).toLowerCase();
        // Check if this is our own identity - return 'self' instead of creating duplicate node
        if (myPrefix && nodeId === myPrefix) {
          return 'self';
        }
        const contact = contacts.find((c) => c.public_key.toLowerCase().startsWith(nodeId));
        addNode(
          nodeId,
@@ -506,6 +655,10 @@ function useVisualizerData({
      const contact = findContactByName(source.value, contacts);
      if (contact) {
        const nodeId = contact.public_key.slice(0, 12).toLowerCase();
        // Check if this is our own identity
        if (myPrefix && nodeId === myPrefix) {
          return 'self';
        }
        addNode(nodeId, contact.name, getNodeType(contact), false, undefined, contact.last_seen);
        return nodeId;
      }
@@ -518,6 +671,10 @@ function useVisualizerData({
      const contact = findContactByPrefix(source.value, contacts);
      if (contact) {
        const nodeId = contact.public_key.slice(0, 12).toLowerCase();
        // Check if this is our own identity
        if (myPrefix && nodeId === myPrefix) {
          return 'self';
        }
        addNode(nodeId, contact.name, getNodeType(contact), false, undefined, contact.last_seen);
        return nodeId;
      }
@@ -536,20 +693,55 @@ function useVisualizerData({
        return nodeId;
      }

      // Multiple matches - create ambiguous node
      if (filtered.length > 1) {
      // Multiple matches or no matches - create ambiguous node
      // When splitAmbiguousByTraffic is enabled for repeaters, use traffic pattern analysis
      if (filtered.length > 1 || (filtered.length === 0 && isRepeater)) {
        const names = filtered.map((c) => c.name || c.public_key.slice(0, 8));
        const lastSeen = filtered.reduce(
          (max, c) => (c.last_seen && (!max || c.last_seen > max) ? c.last_seen : max),
          null as number | null
        );
        const nodeId = `?${source.value.toLowerCase()}`;

        // Default: simple ambiguous node ID
        let nodeId = `?${source.value.toLowerCase()}`;
        let displayName = source.value.toUpperCase();

        // When splitAmbiguousByTraffic is enabled, use traffic pattern analysis
        if (splitAmbiguousByTraffic && isRepeater && trafficContext) {
          const prefix = source.value.toLowerCase();

          // Record observation for traffic analysis (only if we have a packet source)
          if (trafficContext.packetSource) {
            recordTrafficObservation(
              trafficPatternsRef.current,
              prefix,
              trafficContext.packetSource,
              trafficContext.nextPrefix
            );
          }

          // Analyze traffic patterns to decide if we should split
          const trafficData = trafficPatternsRef.current.get(prefix);
          if (trafficData) {
            const analysis = analyzeRepeaterTraffic(trafficData);

            if (analysis.shouldSplit && trafficContext.nextPrefix) {
              // Strong evidence of disjoint routing - split by next hop
              const nextShort = trafficContext.nextPrefix.slice(0, 2).toLowerCase();
              nodeId = `?${prefix}:>${nextShort}`;
              displayName = `${source.value.toUpperCase()}:>${nextShort}`;
            }
            // If analysis says don't split, or this is the final repeater (nextPrefix=null),
            // keep the simple ?XX ID
          }
        }

        addNode(
          nodeId,
          source.value.toUpperCase(),
          displayName,
          isRepeater ? 'repeater' : 'client',
          true,
          names,
          names.length > 0 ? names : undefined,
          lastSeen
        );
        return nodeId;
@@ -558,40 +750,82 @@ function useVisualizerData({
      return null;
    },
    [contacts, addNode]
    [contacts, addNode, splitAmbiguousByTraffic]
  );

  // Build path from parsed packet
  const buildPath = useCallback(
    (parsed: ParsedPacket, packet: RawPacket, myPrefix: string | null): string[] => {
      const path: string[] = [];
      let packetSource: string | null = null;

      // Add source
      // Add source - and track it for traffic pattern analysis
      if (parsed.payloadType === PayloadType.Advert && parsed.advertPubkey) {
        const nodeId = resolveNode({ type: 'pubkey', value: parsed.advertPubkey }, false, false);
        if (nodeId) path.push(nodeId);
        const nodeId = resolveNode(
          { type: 'pubkey', value: parsed.advertPubkey },
          false,
          false,
          myPrefix
        );
        if (nodeId) {
          path.push(nodeId);
          packetSource = nodeId;
        }
      } else if (parsed.payloadType === PayloadType.AnonRequest && parsed.anonRequestPubkey) {
        // AnonRequest packets contain the full sender public key
        const nodeId = resolveNode(
          { type: 'pubkey', value: parsed.anonRequestPubkey },
          false,
          false,
          myPrefix
        );
        if (nodeId) {
          path.push(nodeId);
          packetSource = nodeId;
        }
      } else if (parsed.payloadType === PayloadType.TextMessage && parsed.srcHash) {
        if (myPrefix && parsed.srcHash.toLowerCase() === myPrefix) {
          path.push('self');
          packetSource = 'self';
        } else {
          const nodeId = resolveNode(
            { type: 'prefix', value: parsed.srcHash },
            false,
            showAmbiguousNodes
            showAmbiguousNodes,
            myPrefix
          );
          if (nodeId) path.push(nodeId);
          if (nodeId) {
            path.push(nodeId);
            packetSource = nodeId;
          }
        }
      } else if (parsed.payloadType === PayloadType.GroupText) {
        const senderName = parsed.groupTextSender || packet.decrypted_info?.sender;
        if (senderName) {
          const nodeId = resolveNode({ type: 'name', value: senderName }, false, false);
          if (nodeId) path.push(nodeId);
          const nodeId = resolveNode({ type: 'name', value: senderName }, false, false, myPrefix);
          if (nodeId) {
            path.push(nodeId);
            packetSource = nodeId;
          }
        }
      }

      // Add path bytes (repeaters)
      for (const hexPrefix of parsed.pathBytes) {
        const nodeId = resolveNode({ type: 'prefix', value: hexPrefix }, true, showAmbiguousPaths);
      // Pass packetSource for traffic pattern analysis (used to track which sources route through which repeaters)
      for (let i = 0; i < parsed.pathBytes.length; i++) {
        const hexPrefix = parsed.pathBytes[i];
        const nextPrefix = parsed.pathBytes[i + 1] || null;

        const nodeId = resolveNode(
          { type: 'prefix', value: hexPrefix },
          true,
          showAmbiguousPaths,
          myPrefix,
          {
            packetSource,
            nextPrefix,
          }
        );
        if (nodeId) path.push(nodeId);
      }
@@ -603,7 +837,8 @@ function useVisualizerData({
        const nodeId = resolveNode(
          { type: 'prefix', value: parsed.dstHash },
          false,
          showAmbiguousNodes
          showAmbiguousNodes,
          myPrefix
        );
        if (nodeId) path.push(nodeId);
        else path.push('self');
@@ -809,6 +1044,59 @@ function useVisualizerData({
    requestAnimationFrame(animate);
  }, [chargeStrength]);

  // Clear all state and reset to initial (keeps self node only)
  const clearAndReset = useCallback(() => {
    // Clear all pending timers
    for (const timer of timersRef.current.values()) {
      clearTimeout(timer);
    }
    timersRef.current.clear();

    // Clear pending packets
    pendingRef.current.clear();

    // Clear processed packet IDs so they can be re-processed if needed
    processedRef.current.clear();

    // Clear traffic patterns
    trafficPatternsRef.current.clear();

    // Clear particles
    particlesRef.current.length = 0;

    // Clear links
    linksRef.current.clear();

    // Clear nodes except self, then reset self position
    const selfNode = nodesRef.current.get('self');
    nodesRef.current.clear();
    if (selfNode) {
      selfNode.x = dimensions.width / 2;
      selfNode.y = dimensions.height / 2;
      selfNode.vx = 0;
      selfNode.vy = 0;
      selfNode.lastActivity = Date.now();
      nodesRef.current.set('self', selfNode);
    }

    // Reset simulation with just self node
    const sim = simulationRef.current;
    if (sim) {
      sim.nodes(Array.from(nodesRef.current.values()));
      sim.force(
        'link',
        forceLink<GraphNode, GraphLink>([])
          .id((d) => d.id)
          .distance(80)
          .strength(0.3)
      );
      sim.alpha(0.3).restart();
    }

    // Reset stats
    setStats({ processed: 0, animated: 0, nodes: 1, links: 0 });
  }, [dimensions]);

  return {
    nodes: nodesRef.current,
    links: linksRef.current,
@@ -817,6 +1105,7 @@ function useVisualizerData({
    stats,
    randomizePositions,
    expandContract,
    clearAndReset,
  };
}
@@ -963,6 +1252,7 @@ interface PacketVisualizerProps {
  config: RadioConfig | null;
  fullScreen?: boolean;
  onFullScreenChange?: (fullScreen: boolean) => void;
  onClearPackets?: () => void;
}

export function PacketVisualizer({
@@ -971,6 +1261,7 @@ export function PacketVisualizer({
  config,
  fullScreen,
  onFullScreenChange,
  onClearPackets,
}: PacketVisualizerProps) {
  const canvasRef = useRef<HTMLCanvasElement>(null);
  const containerRef = useRef<HTMLDivElement>(null);
@@ -979,6 +1270,7 @@ export function PacketVisualizer({
|
||||
// Options
|
||||
const [showAmbiguousPaths, setShowAmbiguousPaths] = useState(true);
|
||||
const [showAmbiguousNodes, setShowAmbiguousNodes] = useState(false);
|
||||
const [splitAmbiguousByTraffic, setSplitAmbiguousByTraffic] = useState(false);
|
||||
const [chargeStrength, setChargeStrength] = useState(-200);
|
||||
const [filterOldRepeaters, setFilterOldRepeaters] = useState(false);
|
||||
const [observationWindowSec, setObservationWindowSec] = useState(DEFAULT_OBSERVATION_WINDOW_SEC);
|
||||
@@ -990,6 +1282,7 @@ export function PacketVisualizer({
|
||||
const [transform, setTransform] = useState({ x: 0, y: 0, scale: 1 });
|
||||
const isDraggingRef = useRef(false);
|
||||
const lastMouseRef = useRef({ x: 0, y: 0 });
|
||||
const draggedNodeRef = useRef<GraphNode | null>(null);
|
||||
|
||||
// Hover
|
||||
const [hoveredNodeId, setHoveredNodeId] = useState<string | null>(null);
|
||||
@@ -1001,6 +1294,7 @@ export function PacketVisualizer({
|
||||
config,
|
||||
showAmbiguousPaths,
|
||||
showAmbiguousNodes,
|
||||
splitAmbiguousByTraffic,
|
||||
chargeStrength,
|
||||
letEmDrift,
|
||||
particleSpeedMultiplier,
|
||||
@@ -1117,10 +1411,31 @@ export function PacketVisualizer({
    [data.nodes]
  );

  const handleMouseDown = useCallback((e: React.MouseEvent) => {
    isDraggingRef.current = true;
    lastMouseRef.current = { x: e.clientX, y: e.clientY };
  }, []);
  const handleMouseDown = useCallback(
    (e: React.MouseEvent) => {
      const canvas = canvasRef.current;
      if (!canvas) return;

      const rect = canvas.getBoundingClientRect();
      const pos = screenToGraph(e.clientX - rect.left, e.clientY - rect.top);
      const node = findNodeAt(pos.x, pos.y);

      if (node) {
        // Start dragging this node
        draggedNodeRef.current = node;
        // Fix the node's position while dragging
        node.fx = node.x;
        node.fy = node.y;
        // Reheat simulation slightly for responsive feedback
        data.simulation?.alpha(0.3).restart();
      } else {
        // Start panning
        isDraggingRef.current = true;
      }
      lastMouseRef.current = { x: e.clientX, y: e.clientY };
    },
    [screenToGraph, findNodeAt, data.simulation]
  );

  const handleMouseMove = useCallback(
    (e: React.MouseEvent) => {
@@ -1129,8 +1444,18 @@ export function PacketVisualizer({

      const rect = canvas.getBoundingClientRect();
      const pos = screenToGraph(e.clientX - rect.left, e.clientY - rect.top);

      // Update hover state
      setHoveredNodeId(findNodeAt(pos.x, pos.y)?.id || null);

      // Handle node dragging
      if (draggedNodeRef.current) {
        draggedNodeRef.current.fx = pos.x;
        draggedNodeRef.current.fy = pos.y;
        return;
      }

      // Handle canvas panning
      if (!isDraggingRef.current) return;
      const dx = e.clientX - lastMouseRef.current.x;
      const dy = e.clientY - lastMouseRef.current.y;
@@ -1141,9 +1466,21 @@ export function PacketVisualizer({
  );

  const handleMouseUp = useCallback(() => {
    if (draggedNodeRef.current) {
      // Release the node - clear fixed position so it can move freely again
      draggedNodeRef.current.fx = null;
      draggedNodeRef.current.fy = null;
      draggedNodeRef.current = null;
    }
    isDraggingRef.current = false;
  }, []);

  const handleMouseLeave = useCallback(() => {
    if (draggedNodeRef.current) {
      draggedNodeRef.current.fx = null;
      draggedNodeRef.current.fy = null;
      draggedNodeRef.current = null;
    }
    isDraggingRef.current = false;
    setHoveredNodeId(null);
  }, []);
@@ -1161,12 +1498,19 @@ export function PacketVisualizer({
    return () => canvas.removeEventListener('wheel', handleWheel);
  }, [handleWheel]);

  // Determine cursor based on state
  const getCursor = () => {
    if (draggedNodeRef.current) return 'grabbing';
    if (hoveredNodeId) return 'pointer';
    return 'grab';
  };

  return (
    <div ref={containerRef} className="w-full h-full bg-background relative overflow-hidden">
      <canvas
        ref={canvasRef}
        className="w-full h-full cursor-grab active:cursor-grabbing"
        style={{ display: 'block' }}
        className="w-full h-full"
        style={{ display: 'block', cursor: getCursor() }}
        onMouseDown={handleMouseDown}
        onMouseMove={handleMouseMove}
        onMouseUp={handleMouseUp}
@@ -1177,15 +1521,6 @@ export function PacketVisualizer({
      {!hideUI && (
        <div className="absolute bottom-4 left-4 bg-background/80 backdrop-blur-sm rounded-lg p-3 text-xs border border-border">
          <div className="flex gap-6">
            <div className="flex flex-col gap-1.5">
              <div className="text-muted-foreground font-medium mb-1">Nodes</div>
              {LEGEND_ITEMS.map((item) => (
                <div key={item.label} className="flex items-center gap-2">
                  <span className={item.size}>{item.emoji}</span>
                  <span>{item.label}</span>
                </div>
              ))}
            </div>
            <div className="flex flex-col gap-1.5">
              <div className="text-muted-foreground font-medium mb-1">Packets</div>
              {PACKET_LEGEND_ITEMS.map((item) => (
@@ -1200,6 +1535,15 @@ export function PacketVisualizer({
                </div>
              ))}
            </div>
            <div className="flex flex-col gap-1.5">
              <div className="text-muted-foreground font-medium mb-1">Nodes</div>
              {LEGEND_ITEMS.map((item) => (
                <div key={item.label} className="flex items-center gap-2">
                  <span className={item.size}>{item.emoji}</span>
                  <span>{item.label}</span>
                </div>
              ))}
            </div>
          </div>
        </div>
      )}
@@ -1230,6 +1574,19 @@ export function PacketVisualizer({
              Ambiguous sender/recipient
            </span>
          </label>
          <label className="flex items-center gap-2 cursor-pointer">
            <Checkbox
              checked={splitAmbiguousByTraffic}
              onCheckedChange={(c) => setSplitAmbiguousByTraffic(c === true)}
              disabled={!showAmbiguousPaths}
            />
            <span
              title="Split ambiguous repeaters into separate nodes based on traffic patterns (prev→next). Helps identify colliding prefixes representing different physical nodes."
              className={!showAmbiguousPaths ? 'text-muted-foreground' : ''}
            >
              Heuristically group repeaters by traffic pattern
            </span>
          </label>
          <label className="flex items-center gap-2 cursor-pointer">
            <Checkbox
              checked={filterOldRepeaters}
@@ -1318,6 +1675,16 @@ export function PacketVisualizer({
          >
            Oooh Big Stretch!
          </button>
          <button
            onClick={() => {
              data.clearAndReset();
              onClearPackets?.();
            }}
            className="mt-1 px-3 py-1.5 bg-yellow-500/20 hover:bg-yellow-500/30 text-yellow-500 rounded text-xs transition-colors"
            title="Clear all nodes, links, and packets - reset to initial state"
          >
            Clear & Reset
          </button>
        </div>
      </>
    )}
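The mouse handlers in the hunks above rely on a `screenToGraph` helper and a `transform` of the shape `{ x, y, scale }` that this diff references but does not define. A minimal sketch of the inverse mapping those handlers imply, written in Python with hypothetical names (the component's real helper is TypeScript and is not shown in this diff), assuming the usual convention `screen = graph * scale + offset`:

```python
def screen_to_graph(sx: float, sy: float, tx: float, ty: float, scale: float) -> tuple[float, float]:
    """Invert the pan/zoom transform: graph = (screen - offset) / scale."""
    return ((sx - tx) / scale, (sy - ty) / scale)


def graph_to_screen(gx: float, gy: float, tx: float, ty: float, scale: float) -> tuple[float, float]:
    """Forward transform, useful for checking the round trip."""
    return (gx * scale + tx, gy * scale + ty)
```

Under this convention hit-testing (`findNodeAt`) and node dragging both operate in graph coordinates, so the drag handler can write `pos.x`/`pos.y` straight into `fx`/`fy`.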
@@ -1,7 +1,12 @@
import { useState, useEffect, useMemo } from 'react';
import { useState, useEffect, useMemo, lazy, Suspense } from 'react';

const BotCodeEditor = lazy(() =>
  import('./BotCodeEditor').then((m) => ({ default: m.BotCodeEditor }))
);
import type {
  AppSettings,
  AppSettingsUpdate,
  BotConfig,
  HealthStatus,
  RadioConfig,
  RadioConfigUpdate,
@@ -73,7 +78,7 @@ export function SettingsModal({
  onRefreshAppSettings,
}: SettingsModalProps) {
  // Tab state
  type SettingsTab = 'radio' | 'identity' | 'serial' | 'database' | 'advertise';
  type SettingsTab = 'radio' | 'identity' | 'serial' | 'database' | 'bot';
  const [activeTab, setActiveTab] = useState<SettingsTab>('radio');

  // Radio config state
@@ -100,6 +105,46 @@ export function SettingsModal({
  const [cleaning, setCleaning] = useState(false);
  const [autoDecryptOnAdvert, setAutoDecryptOnAdvert] = useState(false);

  // Advertisement interval state
  const [advertInterval, setAdvertInterval] = useState('0');

  // Bot state
  const DEFAULT_BOT_CODE = `def bot(
    sender_name: str | None,
    sender_key: str | None,
    message_text: str,
    is_dm: bool,
    channel_key: str | None,
    channel_name: str | None,
    sender_timestamp: int | None,
    path: str | None,
) -> str | list[str] | None:
    """
    Process incoming messages and optionally return a reply.

    Args:
        sender_name: Display name of sender (may be None)
        sender_key: 64-char hex public key (None for channel msgs)
        message_text: The message content
        is_dm: True for direct messages, False for channel
        channel_key: 32-char hex key for channels, None for DMs
        channel_name: Channel name with hash (e.g. "#bot"), None for DMs
        sender_timestamp: Sender's timestamp (unix seconds, may be None)
        path: Hex-encoded routing path (may be None)

    Returns:
        None for no reply, a string for a single reply,
        or a list of strings to send multiple messages in order
    """
    # Example: Only respond in #bot channel to "!pling" command
    if channel_name == "#bot" and "!pling" in message_text.lower():
        return "[BOT] Plong!"
    return None`;
  const [bots, setBots] = useState<BotConfig[]>([]);
  const [expandedBotId, setExpandedBotId] = useState<string | null>(null);
  const [editingNameId, setEditingNameId] = useState<string | null>(null);
  const [editingNameValue, setEditingNameValue] = useState('');

  useEffect(() => {
    if (config) {
      setName(config.name);
@@ -117,6 +162,8 @@ export function SettingsModal({
    if (appSettings) {
      setMaxRadioContacts(String(appSettings.max_radio_contacts));
      setAutoDecryptOnAdvert(appSettings.auto_decrypt_dm_on_advert);
      setAdvertInterval(String(appSettings.advert_interval));
      setBots(appSettings.bots || []);
    }
  }, [appSettings]);

@@ -220,9 +267,17 @@ export function SettingsModal({
    setLoading(true);

    try {
      // Save radio name
      const update: RadioConfigUpdate = { name };
      await onSave(update);
      toast.success('Identity saved');

      // Save advert interval to app settings
      const newAdvertInterval = parseInt(advertInterval, 10);
      if (!isNaN(newAdvertInterval) && newAdvertInterval !== appSettings?.advert_interval) {
        await onSaveAppSettings({ advert_interval: newAdvertInterval });
      }

      toast.success('Identity settings saved');
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Failed to save');
    } finally {
@@ -342,17 +397,86 @@ export function SettingsModal({
    }
  };

  const handleSaveBotSettings = async () => {
    setLoading(true);
    setError('');

    try {
      await onSaveAppSettings({ bots });
      toast.success('Bot settings saved');
    } catch (err) {
      console.error('Failed to save bot settings:', err);
      const errorMsg = err instanceof Error ? err.message : 'Failed to save';
      setError(errorMsg);
      toast.error(errorMsg);
    } finally {
      setLoading(false);
    }
  };

  const handleAddBot = () => {
    const newBot: BotConfig = {
      id: crypto.randomUUID(),
      name: `Bot ${bots.length + 1}`,
      enabled: false,
      code: DEFAULT_BOT_CODE,
    };
    setBots([...bots, newBot]);
    setExpandedBotId(newBot.id);
  };

  const handleDeleteBot = (botId: string) => {
    const bot = bots.find((b) => b.id === botId);
    if (bot && bot.code.trim() && bot.code !== DEFAULT_BOT_CODE) {
      if (!confirm(`Delete "${bot.name}"? This will remove all its code.`)) {
        return;
      }
    }
    setBots(bots.filter((b) => b.id !== botId));
    if (expandedBotId === botId) {
      setExpandedBotId(null);
    }
  };

  const handleToggleBotEnabled = (botId: string) => {
    setBots(bots.map((b) => (b.id === botId ? { ...b, enabled: !b.enabled } : b)));
  };

  const handleBotCodeChange = (botId: string, code: string) => {
    setBots(bots.map((b) => (b.id === botId ? { ...b, code } : b)));
  };

  const handleStartEditingName = (bot: BotConfig) => {
    setEditingNameId(bot.id);
    setEditingNameValue(bot.name);
  };

  const handleFinishEditingName = () => {
    if (editingNameId && editingNameValue.trim()) {
      setBots(
        bots.map((b) => (b.id === editingNameId ? { ...b, name: editingNameValue.trim() } : b))
      );
    }
    setEditingNameId(null);
    setEditingNameValue('');
  };

  const handleResetBotCode = (botId: string) => {
    setBots(bots.map((b) => (b.id === botId ? { ...b, code: DEFAULT_BOT_CODE } : b)));
  };

  return (
    <Dialog open={open} onOpenChange={(isOpen) => !isOpen && onClose()}>
      <DialogContent className="sm:max-w-[500px] max-h-[90vh] overflow-y-auto">
      <DialogContent className="sm:max-w-[50vw] sm:min-w-[500px] max-h-[90vh] overflow-y-auto">
        <DialogHeader>
          <DialogTitle>Radio & Settings</DialogTitle>
          <DialogDescription className="sr-only">
            {activeTab === 'radio' && 'Configure radio frequency, power, and location settings'}
            {activeTab === 'identity' && 'Manage radio name, public key, and private key'}
            {activeTab === 'identity' &&
              'Manage radio name, public key, private key, and advertising settings'}
            {activeTab === 'serial' && 'View serial port connection and configure contact sync'}
            {activeTab === 'database' && 'View database statistics and clean up old packets'}
            {activeTab === 'advertise' && 'Send a flood advertisement to announce your presence'}
            {activeTab === 'bot' && 'Configure automatic message bot with Python code'}
          </DialogDescription>
        </DialogHeader>

@@ -361,7 +485,10 @@ export function SettingsModal({
        ) : (
          <Tabs
            value={activeTab}
            onValueChange={(v) => setActiveTab(v as SettingsTab)}
            onValueChange={(v) => {
              setActiveTab(v as SettingsTab);
              setError('');
            }}
            className="w-full"
          >
            <TabsList className="grid w-full grid-cols-5">
@@ -369,7 +496,7 @@ export function SettingsModal({
              <TabsTrigger value="identity">Identity</TabsTrigger>
              <TabsTrigger value="serial">Serial</TabsTrigger>
              <TabsTrigger value="database">Database</TabsTrigger>
              <TabsTrigger value="advertise">Advertise</TabsTrigger>
              <TabsTrigger value="bot">Bot</TabsTrigger>
            </TabsList>

            {/* Radio Config Tab */}
@@ -526,8 +653,27 @@ export function SettingsModal({
              <Input id="name" value={name} onChange={(e) => setName(e.target.value)} />
            </div>

            <div className="space-y-2">
              <Label htmlFor="advert-interval">Periodic Advertising Interval</Label>
              <div className="flex items-center gap-2">
                <Input
                  id="advert-interval"
                  type="number"
                  min="0"
                  value={advertInterval}
                  onChange={(e) => setAdvertInterval(e.target.value)}
                  className="w-28"
                />
                <span className="text-sm text-muted-foreground">seconds (0 = off)</span>
              </div>
              <p className="text-xs text-muted-foreground">
                How often to automatically advertise presence. Set to 0 to disable. Recommended:
                86400 (24 hours) or higher.
              </p>
            </div>

            <Button onClick={handleSaveIdentity} disabled={loading} className="w-full">
              {loading ? 'Saving...' : 'Set Name'}
              {loading ? 'Saving...' : 'Save Identity Settings'}
            </Button>

            <Separator />
@@ -551,6 +697,25 @@ export function SettingsModal({
            </Button>
          </div>

          <Separator />

          <div className="space-y-2">
            <Label>Send Advertisement</Label>
            <p className="text-xs text-muted-foreground">
              Send a flood advertisement to announce your presence on the mesh network.
            </p>
            <Button
              onClick={handleAdvertise}
              disabled={advertising || !health?.radio_connected}
              className="w-full bg-yellow-600 hover:bg-yellow-700 text-white"
            >
              {advertising ? 'Sending...' : 'Send Advertisement'}
            </Button>
            {!health?.radio_connected && (
              <p className="text-sm text-destructive">Radio not connected</p>
            )}
          </div>

          {error && <div className="text-sm text-destructive">{error}</div>}
        </TabsContent>
@@ -696,24 +861,183 @@ export function SettingsModal({
          </Button>
        </TabsContent>

        {/* Advertise Tab */}
        <TabsContent value="advertise" className="space-y-4 mt-4">
          <div className="text-center py-8">
            <p className="text-muted-foreground mb-6">
              Send a flood advertisement to announce your presence on the mesh network.
        {/* Bot Tab */}
        <TabsContent value="bot" className="space-y-4 mt-4">
          <div className="p-3 bg-red-500/10 border border-red-500/30 rounded-md">
            <p className="text-sm text-red-500">
              <strong>Experimental:</strong> This is an alpha feature and introduces automated
              message sending to your radio; unexpected behavior may occur. Use with caution,
              and please report any bugs!
            </p>
            <Button
              size="lg"
              onClick={handleAdvertise}
              disabled={advertising || !health?.radio_connected}
              className="bg-green-600 hover:bg-green-700 text-white px-12 py-6 text-lg"
            >
              {advertising ? 'Sending...' : 'Send Advertisement'}
            </Button>
            {!health?.radio_connected && (
              <p className="text-sm text-destructive mt-4">Radio not connected</p>
            )}
          </div>

          <div className="p-3 bg-yellow-500/10 border border-yellow-500/30 rounded-md">
            <p className="text-sm text-yellow-500">
              <strong>Security Warning:</strong> This feature executes arbitrary Python code on
              the server. Only run trusted code, and be cautious of arbitrary usage of message
              parameters.
            </p>
          </div>

          <div className="p-3 bg-yellow-500/10 border border-yellow-500/30 rounded-md">
            <p className="text-sm text-yellow-500">
              <strong>Don't wreck the mesh!</strong> Bots process ALL messages, including
              their own. Be careful of creating infinite loops!
            </p>
          </div>

          <div className="flex justify-between items-center">
            <Label>Bots</Label>
            <Button type="button" variant="outline" size="sm" onClick={handleAddBot}>
              + New Bot
            </Button>
          </div>

          {bots.length === 0 ? (
            <div className="text-center py-8 border border-dashed border-input rounded-md">
              <p className="text-muted-foreground mb-4">No bots configured</p>
              <Button type="button" variant="outline" onClick={handleAddBot}>
                Create your first bot
              </Button>
            </div>
          ) : (
            <div className="space-y-2">
              {bots.map((bot) => (
                <div key={bot.id} className="border border-input rounded-md overflow-hidden">
                  {/* Bot header row */}
                  <div
                    className="flex items-center gap-2 px-3 py-2 bg-muted/50 cursor-pointer hover:bg-muted/80"
                    onClick={(e) => {
                      // Don't toggle if clicking on interactive elements
                      if ((e.target as HTMLElement).closest('input, button')) return;
                      setExpandedBotId(expandedBotId === bot.id ? null : bot.id);
                    }}
                  >
                    <span className="text-muted-foreground">
                      {expandedBotId === bot.id ? '▼' : '▶'}
                    </span>

                    {/* Bot name (click to edit) */}
                    {editingNameId === bot.id ? (
                      <input
                        type="text"
                        value={editingNameValue}
                        onChange={(e) => setEditingNameValue(e.target.value)}
                        onBlur={handleFinishEditingName}
                        onKeyDown={(e) => {
                          if (e.key === 'Enter') handleFinishEditingName();
                          if (e.key === 'Escape') {
                            setEditingNameId(null);
                            setEditingNameValue('');
                          }
                        }}
                        autoFocus
                        className="px-2 py-0.5 text-sm bg-background border border-input rounded flex-1 max-w-[200px]"
                        onClick={(e) => e.stopPropagation()}
                      />
                    ) : (
                      <span
                        className="text-sm font-medium flex-1 hover:text-primary cursor-text"
                        onClick={(e) => {
                          e.stopPropagation();
                          handleStartEditingName(bot);
                        }}
                        title="Click to rename"
                      >
                        {bot.name}
                      </span>
                    )}

                    {/* Enabled checkbox */}
                    <label
                      className="flex items-center gap-1.5 cursor-pointer"
                      onClick={(e) => e.stopPropagation()}
                    >
                      <input
                        type="checkbox"
                        checked={bot.enabled}
                        onChange={() => handleToggleBotEnabled(bot.id)}
                        className="w-4 h-4 rounded border-input accent-primary"
                      />
                      <span className="text-xs text-muted-foreground">Enabled</span>
                    </label>

                    {/* Delete button */}
                    <Button
                      type="button"
                      variant="ghost"
                      size="sm"
                      className="h-6 w-6 p-0 text-muted-foreground hover:text-destructive"
                      onClick={(e) => {
                        e.stopPropagation();
                        handleDeleteBot(bot.id);
                      }}
                      title="Delete bot"
                    >
                      🗑
                    </Button>
                  </div>

                  {/* Bot expanded content */}
                  {expandedBotId === bot.id && (
                    <div className="p-3 space-y-3 border-t border-input">
                      <div className="flex items-center justify-between">
                        <p className="text-xs text-muted-foreground">
                          Define a <code className="bg-muted px-1 rounded">bot()</code> function
                          that receives message data and optionally returns a reply.
                        </p>
                        <Button
                          type="button"
                          variant="outline"
                          size="sm"
                          onClick={() => handleResetBotCode(bot.id)}
                        >
                          Reset to Example
                        </Button>
                      </div>
                      <Suspense
                        fallback={
                          <div className="h-64 rounded-md border border-input bg-[#282c34] flex items-center justify-center text-muted-foreground">
                            Loading editor...
                          </div>
                        }
                      >
                        <BotCodeEditor
                          value={bot.code}
                          onChange={(code) => handleBotCodeChange(bot.id, code)}
                          id={`bot-code-${bot.id}`}
                        />
                      </Suspense>
                    </div>
                  )}
                </div>
              ))}
            </div>
          )}

          <Separator />

          <div className="text-xs text-muted-foreground space-y-1">
            <p>
              <strong>Available:</strong> Standard Python libraries and any modules installed in
              the server environment.
            </p>
            <p>
              <strong>Limits:</strong> 10 second timeout per bot.
            </p>
            <p>
              <strong>Note:</strong> Bots respond to all messages, including your own. For
              channel messages, <code>sender_key</code> is <code>None</code>. Multiple enabled
              bots run serially, with a two-second delay between messages to prevent repeater
              collision.
            </p>
          </div>

          {error && <div className="text-sm text-destructive">{error}</div>}

          <Button onClick={handleSaveBotSettings} disabled={loading} className="w-full">
            {loading ? 'Saving...' : 'Save Bot Settings'}
          </Button>
        </TabsContent>
      </Tabs>
    )}
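The bot contract documented in `DEFAULT_BOT_CODE` (return `None`, a single string, or a list of strings) can be exercised with a slightly richer sketch: a bot that guards against reacting to its own output, the infinite-loop hazard the warning box calls out. The `[BOT]` reply prefix used as the guard here is an assumption borrowed from the example code, not part of the documented API:

```python
def bot(
    sender_name, sender_key, message_text, is_dm,
    channel_key, channel_name, sender_timestamp, path,
):
    """Example bot: reply to "!ping" in #bot, guarding against self-echo."""
    # Guard: never react to our own replies (assumed "[BOT]" prefix convention)
    if message_text.startswith("[BOT]"):
        return None
    if channel_name == "#bot" and "!ping" in message_text.lower():
        # Returning a list sends multiple messages, in order
        return ["[BOT] pong!", f"[BOT] path={path or 'direct'}"]
    return None
```

Since all enabled bots see every message serially, a guard like this (or checking `sender_key` against your own key for DMs) is worth including in any bot that posts to a channel it also reads.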
@@ -1,11 +1,15 @@
import { useState } from 'react';
import type { Contact, Channel, Conversation, Favorite } from '../types';
import {
  CONTACT_TYPE_REPEATER,
  type Contact,
  type Channel,
  type Conversation,
  type Favorite,
} from '../types';
import { getStateKey, type ConversationTimes } from '../utils/conversationState';
import { getContactDisplayName } from '../utils/pubkey';
import { ContactAvatar } from './ContactAvatar';
import { CONTACT_TYPE_REPEATER } from '../utils/contactAvatar';
import { isFavorite } from '../utils/favorites';
import { UNREAD_FETCH_LIMIT } from '../api';
import { Input } from './ui/input';
import { Button } from './ui/button';
import { cn } from '@/lib/utils';
@@ -33,11 +37,6 @@ interface SidebarProps {
  onSortOrderChange?: (order: SortOrder) => void;
}

/** Format unread count, showing "X+" if at the fetch limit (indicating there may be more) */
function formatUnreadCount(count: number): string {
  return count >= UNREAD_FETCH_LIMIT ? `${count}+` : `${count}`;
}

export function Sidebar({
  contacts,
  channels,
@@ -380,7 +379,7 @@ export function Sidebar({
                    : 'bg-primary text-primary-foreground'
                )}
              >
                {formatUnreadCount(unreadCount)}
                {unreadCount}
              </span>
            )}
          </div>
@@ -423,7 +422,7 @@ export function Sidebar({
                    : 'bg-primary text-primary-foreground'
                )}
              >
                {formatUnreadCount(unreadCount)}
                {unreadCount}
              </span>
            )}
          </div>
@@ -475,7 +474,7 @@ export function Sidebar({
                    : 'bg-primary text-primary-foreground'
                )}
              >
                {formatUnreadCount(unreadCount)}
                {unreadCount}
              </span>
            )}
          </div>
@@ -536,7 +535,7 @@ export function Sidebar({
                    : 'bg-primary text-primary-foreground'
                )}
              >
                {formatUnreadCount(unreadCount)}
                {unreadCount}
              </span>
            )}
          </div>
@@ -9,9 +9,10 @@ interface VisualizerViewProps {
  packets: RawPacket[];
  contacts: Contact[];
  config: RadioConfig | null;
  onClearPackets?: () => void;
}

export function VisualizerView({ packets, contacts, config }: VisualizerViewProps) {
export function VisualizerView({ packets, contacts, config, onClearPackets }: VisualizerViewProps) {
  const [fullScreen, setFullScreen] = useState(false);

  return (
@@ -29,7 +30,12 @@ export function VisualizerView({ packets, contacts, config }: VisualizerViewProp
          <TabsTrigger value="packets">Packet Feed</TabsTrigger>
        </TabsList>
        <TabsContent value="visualizer" className="flex-1 m-0 overflow-hidden">
          <PacketVisualizer packets={packets} contacts={contacts} config={config} />
          <PacketVisualizer
            packets={packets}
            contacts={contacts}
            config={config}
            onClearPackets={onClearPackets}
          />
        </TabsContent>
        <TabsContent value="packets" className="flex-1 m-0 overflow-hidden">
          <RawPacketList packets={packets} />
@@ -52,6 +58,7 @@ export function VisualizerView({ packets, contacts, config }: VisualizerViewProp
          config={config}
          fullScreen={fullScreen}
          onFullScreenChange={setFullScreen}
          onClearPackets={onClearPackets}
        />
      </div>
@@ -1,50 +0,0 @@
import * as React from 'react';
import { cva, type VariantProps } from 'class-variance-authority';

import { cn } from '@/lib/utils';

const alertVariants = cva(
  'relative w-full rounded-lg border p-4 [&>svg~*]:pl-7 [&>svg+div]:translate-y-[-3px] [&>svg]:absolute [&>svg]:left-4 [&>svg]:top-4 [&>svg]:text-foreground',
  {
    variants: {
      variant: {
        default: 'bg-background text-foreground',
        destructive:
          'border-destructive/50 text-destructive dark:border-destructive [&>svg]:text-destructive',
        warning: 'border-yellow-500/50 bg-yellow-500/10 text-yellow-200 [&>svg]:text-yellow-500',
      },
    },
    defaultVariants: {
      variant: 'default',
    },
  }
);

const Alert = React.forwardRef<
  HTMLDivElement,
  React.HTMLAttributes<HTMLDivElement> & VariantProps<typeof alertVariants>
>(({ className, variant, ...props }, ref) => (
  <div ref={ref} role="alert" className={cn(alertVariants({ variant }), className)} {...props} />
));
Alert.displayName = 'Alert';

const AlertTitle = React.forwardRef<HTMLParagraphElement, React.HTMLAttributes<HTMLHeadingElement>>(
  ({ className, ...props }, ref) => (
    <h5
      ref={ref}
      className={cn('mb-1 font-medium leading-none tracking-tight', className)}
      {...props}
    />
  )
);
AlertTitle.displayName = 'AlertTitle';

const AlertDescription = React.forwardRef<
  HTMLParagraphElement,
  React.HTMLAttributes<HTMLParagraphElement>
>(({ className, ...props }, ref) => (
  <div ref={ref} className={cn('text-sm [&_p]:leading-relaxed', className)} {...props} />
));
AlertDescription.displayName = 'AlertDescription';

export { Alert, AlertTitle, AlertDescription };
@@ -1,14 +1,3 @@
|
||||
export {
|
||||
useRepeaterMode,
|
||||
type UseRepeaterModeResult,
|
||||
formatDuration,
|
||||
formatTelemetry,
|
||||
formatNeighbors,
|
||||
formatAcl,
|
||||
} from './useRepeaterMode';
|
||||
export { useUnreadCounts, type UseUnreadCountsResult } from './useUnreadCounts';
|
||||
export {
|
||||
useConversationMessages,
|
||||
type UseConversationMessagesResult,
|
||||
getMessageContentKey,
|
||||
} from './useConversationMessages';
|
||||
export { useRepeaterMode } from './useRepeaterMode';
|
||||
export { useUnreadCounts } from './useUnreadCounts';
|
||||
export { useConversationMessages, getMessageContentKey } from './useConversationMessages';
|
||||
|
||||
@@ -102,7 +102,7 @@ export function useConversationMessages(
|
||||
[activeConversation]
|
||||
);
|
||||
|
||||
// Fetch older messages (pagination)
|
||||
// Fetch older messages (cursor-based pagination)
|
||||
const fetchOlderMessages = useCallback(async () => {
|
||||
if (
|
||||
!activeConversation ||
|
||||
@@ -112,13 +112,18 @@ export function useConversationMessages(
|
||||
)
|
||||
return;
|
||||
|
||||
// Get the oldest message as cursor for the next page
|
||||
const oldestMessage = messages[messages.length - 1];
|
||||
if (!oldestMessage) return;
|
||||
|
||||
setLoadingOlder(true);
|
||||
try {
|
||||
const data = await api.getMessages({
|
||||
type: activeConversation.type === 'channel' ? 'CHAN' : 'PRIV',
|
||||
conversation_key: activeConversation.id,
|
||||
limit: MESSAGE_PAGE_SIZE,
|
||||
offset: messages.length,
|
||||
before: oldestMessage.received_at,
|
||||
before_id: oldestMessage.id,
|
||||
});
|
||||
|
||||
if (data.length > 0) {
|
||||
@@ -139,7 +144,7 @@ export function useConversationMessages(
|
||||
} finally {
|
||||
setLoadingOlder(false);
|
||||
}
|
||||
}, [activeConversation, loadingOlder, hasOlderMessages, messages.length]);
|
||||
}, [activeConversation, loadingOlder, hasOlderMessages, messages]);
|
||||
|
||||
// Fetch messages when conversation changes, with proper cancellation
|
||||
useEffect(() => {
|
||||
|
@@ -37,6 +37,7 @@ export function formatTelemetry(telemetry: TelemetryResponse): string {
    `Telemetry`,
    `Battery Voltage: ${telemetry.battery_volts.toFixed(3)}V`,
    `Uptime: ${formatDuration(telemetry.uptime_seconds)}`,
    ...(telemetry.clock_output ? [`Clock: ${telemetry.clock_output}`] : []),
    `TX Airtime: ${formatDuration(telemetry.airtime_seconds)}`,
    `RX Airtime: ${formatDuration(telemetry.rx_airtime_seconds)}`,
    '',
@@ -1,5 +1,5 @@
import { useState, useCallback, useEffect, useRef } from 'react';
import { api, UNREAD_FETCH_LIMIT } from '../api';
import { api } from '../api';
import {
  getLastMessageTimes,
  setLastMessageTime,
@@ -15,19 +15,9 @@ export interface UseUnreadCountsResult {
  lastMessageTimes: ConversationTimes;
  incrementUnread: (stateKey: string, hasMention?: boolean) => void;
  markAllRead: () => void;
  markConversationRead: (conv: Conversation) => void;
  trackNewMessage: (msg: Message) => void;
}

/** Check if a message text contains a mention of the given name in @[name] format */
function messageContainsMention(text: string, name: string | null): boolean {
  if (!name) return false;
  // Escape special regex characters in the name
  const escaped = name.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  const mentionPattern = new RegExp(`@\\[${escaped}\\]`, 'i');
  return mentionPattern.test(text);
}

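The mention helper shown in the diff escapes regex metacharacters in the user's name before building an `@[name]` pattern, so names containing characters like `.` or `[` match literally. A standalone sketch (mirroring the function from the diff, not importing the app code) can be exercised directly:

```typescript
// Standalone copy of the mention check from the diff (for illustration only).
function messageContainsMention(text: string, name: string | null): boolean {
  if (!name) return false;
  // Escape regex metacharacters so a name like "J.D. [ops]" matches literally
  const escaped = name.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  // Mentions use the @[name] form, matched case-insensitively
  return new RegExp(`@\\[${escaped}\\]`, 'i').test(text);
}

console.log(messageContainsMention('ping @[Alice] are you there?', 'Alice')); // true
console.log(messageContainsMention('ping @[alice]', 'Alice')); // true (case-insensitive)
console.log(messageContainsMention('plain Alice mention', 'Alice')); // false (needs @[...] form)
```

Without the escaping step, a name containing `[` or `\` would produce an invalid or wrong pattern when interpolated into `new RegExp`.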
export function useUnreadCounts(
  channels: Channel[],
  contacts: Contact[],
@@ -44,106 +34,32 @@ export function useUnreadCounts(
    myNameRef.current = myName;
  }, [myName]);

  // Track which channels/contacts we've already fetched unreads for
  const fetchedChannels = useRef<Set<string>>(new Set());
  const fetchedContacts = useRef<Set<string>>(new Set());
  // Fetch unreads from the server-side endpoint
  const fetchUnreads = useCallback(async () => {
    try {
      const data = await api.getUnreads(myNameRef.current ?? undefined);

  // Fetch messages and count unreads for new channels/contacts
  // Uses server-side last_read_at for consistent read state across devices
  useEffect(() => {
    const newChannels = channels.filter((c) => !fetchedChannels.current.has(c.key));
    const newContacts = contacts.filter(
      (c) => c.public_key && !fetchedContacts.current.has(c.public_key)
    );
      // Replace (not merge) — server counts are authoritative
      setUnreadCounts(data.counts);
      setMentions(data.mentions);

    if (newChannels.length === 0 && newContacts.length === 0) return;

    // Mark as fetched before starting (to avoid duplicate fetches if effect re-runs)
    newChannels.forEach((c) => fetchedChannels.current.add(c.key));
    newContacts.forEach((c) => fetchedContacts.current.add(c.public_key));

    const fetchAndCountUnreads = async () => {
      const conversations: Array<{ type: 'PRIV' | 'CHAN'; conversation_key: string }> = [
        ...newChannels.map((c) => ({ type: 'CHAN' as const, conversation_key: c.key })),
        ...newContacts.map((c) => ({ type: 'PRIV' as const, conversation_key: c.public_key })),
      ];

      if (conversations.length === 0) return;

      try {
        // Fetch messages in chunks to avoid huge single requests
        const chunkSize = 200;
        const bulkMessages: Record<string, Message[]> = {};

        for (let i = 0; i < conversations.length; i += chunkSize) {
          const chunk = conversations.slice(i, i + chunkSize);
          const chunkResult = await api.getMessagesBulk(chunk, UNREAD_FETCH_LIMIT);
          Object.assign(bulkMessages, chunkResult);
        }
        const newUnreadCounts: Record<string, number> = {};
        const newMentions: Record<string, boolean> = {};
        const newLastMessageTimes: Record<string, number> = {};

        // Process channel messages - use server-side last_read_at
        for (const channel of newChannels) {
          const msgs = bulkMessages[`CHAN:${channel.key}`] || [];
          if (msgs.length > 0) {
            const key = getStateKey('channel', channel.key);
            // Use server-side last_read_at, fallback to 0 if never read
            const lastRead = channel.last_read_at || 0;

            const unreadMsgs = msgs.filter((m) => !m.outgoing && m.received_at > lastRead);
            if (unreadMsgs.length > 0) {
              newUnreadCounts[key] = unreadMsgs.length;
              // Check if any unread message mentions the user
              if (unreadMsgs.some((m) => messageContainsMention(m.text, myNameRef.current))) {
                newMentions[key] = true;
              }
            }

            const latestTime = Math.max(...msgs.map((m) => m.received_at));
            newLastMessageTimes[key] = latestTime;
            setLastMessageTime(key, latestTime);
          }
        }

        // Process contact messages - use server-side last_read_at
        for (const contact of newContacts) {
          const msgs = bulkMessages[`PRIV:${contact.public_key}`] || [];
          if (msgs.length > 0) {
            const key = getStateKey('contact', contact.public_key);
            // Use server-side last_read_at, fallback to 0 if never read
            const lastRead = contact.last_read_at || 0;

            const unreadMsgs = msgs.filter((m) => !m.outgoing && m.received_at > lastRead);
            if (unreadMsgs.length > 0) {
              newUnreadCounts[key] = unreadMsgs.length;
              // Check if any unread message mentions the user
              if (unreadMsgs.some((m) => messageContainsMention(m.text, myNameRef.current))) {
                newMentions[key] = true;
              }
            }

            const latestTime = Math.max(...msgs.map((m) => m.received_at));
            newLastMessageTimes[key] = latestTime;
            setLastMessageTime(key, latestTime);
          }
        }

        if (Object.keys(newUnreadCounts).length > 0) {
          setUnreadCounts((prev) => ({ ...prev, ...newUnreadCounts }));
        }
        if (Object.keys(newMentions).length > 0) {
          setMentions((prev) => ({ ...prev, ...newMentions }));
      if (Object.keys(data.last_message_times).length > 0) {
        // Update in-memory cache and state
        for (const [key, ts] of Object.entries(data.last_message_times)) {
          setLastMessageTime(key, ts);
        }
        setLastMessageTimes(getLastMessageTimes());
      } catch (err) {
        console.error('Failed to fetch messages bulk:', err);
      }
    };
    } catch (err) {
      console.error('Failed to fetch unreads:', err);
    }
  }, []);

    fetchAndCountUnreads();
  }, [channels, contacts]);
  // Fetch when channels or contacts arrive/change
  useEffect(() => {
    if (channels.length === 0 && contacts.length === 0) return;
    fetchUnreads();
  }, [channels, contacts, fetchUnreads]);

  // Mark conversation as read when user views it
  // Calls server API to persist read state across devices
@@ -151,7 +67,8 @@ export function useUnreadCounts(
    if (
      activeConversation &&
      activeConversation.type !== 'raw' &&
      activeConversation.type !== 'map'
      activeConversation.type !== 'map' &&
      activeConversation.type !== 'visualizer'
    ) {
      const key = getStateKey(
        activeConversation.type as 'channel' | 'contact',
@@ -218,45 +135,6 @@ export function useUnreadCounts(
    });
  }, []);

  // Mark a specific conversation as read
  // Calls server API to persist read state across devices
  const markConversationRead = useCallback((conv: Conversation) => {
    if (conv.type === 'raw' || conv.type === 'map') return;

    const key = getStateKey(conv.type as 'channel' | 'contact', conv.id);

    // Update local state immediately
    setUnreadCounts((prev) => {
      if (prev[key]) {
        const next = { ...prev };
        delete next[key];
        return next;
      }
      return prev;
    });

    // Also clear mentions for this conversation
    setMentions((prev) => {
      if (prev[key]) {
        const next = { ...prev };
        delete next[key];
        return next;
      }
      return prev;
    });

    // Persist to server (fire-and-forget)
    if (conv.type === 'channel') {
      api.markChannelRead(conv.id).catch((err) => {
        console.error('Failed to mark channel as read on server:', err);
      });
    } else if (conv.type === 'contact') {
      api.markContactRead(conv.id).catch((err) => {
        console.error('Failed to mark contact as read on server:', err);
      });
    }
  }, []);

  // Track a new incoming message for unread counts
  const trackNewMessage = useCallback((msg: Message) => {
    let conversationKey: string | null = null;
@@ -279,7 +157,6 @@ export function useUnreadCounts(
    lastMessageTimes,
    incrementUnread,
    markAllRead,
    markConversationRead,
    trackNewMessage,
  };
}

@@ -37,3 +37,12 @@
    font-size: 14px;
  }
}

/* Constrain CodeMirror editor width */
.cm-editor {
  max-width: 100% !important;
  contain: inline-size;
}
.cm-editor .cm-scroller {
  overflow-x: auto !important;
}

@@ -12,10 +12,37 @@ body,
  height: 100%;
}

/* iOS PWA safe-area support */
:root {
  --safe-area-top: env(safe-area-inset-top, 0px);
  --safe-area-right: env(safe-area-inset-right, 0px);
  --safe-area-bottom: env(safe-area-inset-bottom, 0px);
  --safe-area-left: env(safe-area-inset-left, 0px);
  --safe-area-bottom-capped: min(var(--safe-area-bottom), 12px);
}

@supports (padding: constant(safe-area-inset-top)) {
  :root {
    --safe-area-top: constant(safe-area-inset-top);
    --safe-area-right: constant(safe-area-inset-right);
    --safe-area-bottom: constant(safe-area-inset-bottom);
    --safe-area-left: constant(safe-area-inset-left);
    --safe-area-bottom-capped: min(var(--safe-area-bottom), 12px);
  }
}

body {
  font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
  /* Prevent overscroll/bounce on mobile */
  overscroll-behavior: none;
  padding: var(--safe-area-top) var(--safe-area-right) var(--safe-area-bottom-capped)
    var(--safe-area-left);
  box-sizing: border-box;
}

#root {
  height: 100%;
  box-sizing: border-box;
}

/* Fallback for browsers without dvh support */

@@ -1,10 +1,6 @@
import { describe, it, expect } from 'vitest';
import {
  getAvatarText,
  getAvatarColor,
  getContactAvatar,
  CONTACT_TYPE_REPEATER,
} from '../utils/contactAvatar';
import { getAvatarText, getAvatarColor, getContactAvatar } from '../utils/contactAvatar';
import { CONTACT_TYPE_REPEATER } from '../types';

describe('getAvatarText', () => {
  it('returns first emoji when name contains emoji', () => {
@@ -13,6 +9,12 @@ describe('getAvatarText', () => {
    expect(getAvatarText('Test 😀 More 🎯', 'abc123')).toBe('😀');
  });

  it('returns full flag emoji (not just first regional indicator)', () => {
    expect(getAvatarText('Jason 🇺🇸', 'abc123')).toBe('🇺🇸');
    expect(getAvatarText('🇬🇧 London', 'abc123')).toBe('🇬🇧');
    expect(getAvatarText('Test 🇯🇵 Japan', 'abc123')).toBe('🇯🇵');
  });

  it('returns initials when name has space', () => {
    expect(getAvatarText('John Doe', 'abc123')).toBe('JD');
    expect(getAvatarText('Alice Bob Charlie', 'abc123')).toBe('AB');

@@ -4,13 +4,12 @@ import {
  findContactsByPrefix,
  calculateDistance,
  sortContactsByDistance,
  getHopCount,
  resolvePath,
  formatDistance,
  formatHopCounts,
} from '../utils/pathUtils';
import type { Contact, RadioConfig } from '../types';
import { CONTACT_TYPE_REPEATER, CONTACT_TYPE_CLIENT } from '../types';
import { CONTACT_TYPE_REPEATER } from '../types';

// Helper to create mock contacts
function createContact(overrides: Partial<Contact> = {}): Contact {
@@ -90,7 +89,7 @@ describe('findContactsByPrefix', () => {
    createContact({
      public_key: '1ACCCC' + 'C'.repeat(52),
      name: 'Client1',
      type: CONTACT_TYPE_CLIENT,
      type: 1, // client
    }),
  ];

@@ -195,20 +194,6 @@ describe('sortContactsByDistance', () => {
  });
});

describe('getHopCount', () => {
  it('returns 0 for null/empty', () => {
    expect(getHopCount(null)).toBe(0);
    expect(getHopCount(undefined)).toBe(0);
    expect(getHopCount('')).toBe(0);
  });

  it('counts hops correctly', () => {
    expect(getHopCount('1A')).toBe(1);
    expect(getHopCount('1A2B')).toBe(2);
    expect(getHopCount('1A2B3C')).toBe(3);
  });
});

describe('resolvePath', () => {
  const repeater1 = createContact({
    public_key: '1A' + 'A'.repeat(62),

@@ -10,7 +10,7 @@

import { describe, it, expect } from 'vitest';
import { parseSenderFromText } from '../utils/messageParser';
import { CONTACT_TYPE_REPEATER, CONTACT_TYPE_CLIENT } from '../types';
import { CONTACT_TYPE_REPEATER } from '../types';

describe('Repeater message sender parsing', () => {
  /**
@@ -52,7 +52,7 @@ describe('Repeater message sender parsing', () => {

  it('non-repeater messages still get sender parsed', () => {
    const channelMessage = 'Alice: Hello everyone!';
    const contactType: number = CONTACT_TYPE_CLIENT;
    const contactType: number = 1; // client

    const isRepeater = contactType === CONTACT_TYPE_REPEATER;
    const { sender, content } = isRepeater

@@ -78,6 +78,7 @@ describe('formatTelemetry', () => {
    full_events: 0,
    neighbors: [],
    acl: [],
    clock_output: null,
  };

  const result = formatTelemetry(telemetry);
@@ -119,6 +120,7 @@ describe('formatTelemetry', () => {
    full_events: 0,
    neighbors: [],
    acl: [],
    clock_output: null,
  };

  const result = formatTelemetry(telemetry);

@@ -1,17 +1,3 @@
/**
 * Type aliases for key types used throughout the application.
 * These are all hex strings but serve different purposes.
 */

/** 64-character hex string identifying a contact/node */
export type PublicKey = string;

/** 12-character hex prefix of a public key (used in message routing) */
export type PubkeyPrefix = string;

/** 32-character hex string identifying a channel */
export type ChannelKey = string;

export interface RadioSettings {
  freq: number;
  bw: number;
@@ -51,7 +37,7 @@ export interface MaintenanceResult {
}

export interface Contact {
  public_key: PublicKey;
  public_key: string;
  name: string | null;
  type: number;
  flags: number;
@@ -67,7 +53,7 @@ export interface Contact {
}

export interface Channel {
  key: ChannelKey;
  key: string;
  name: string;
  is_hashtag: boolean;
  on_radio: boolean;
@@ -129,6 +115,13 @@ export interface Favorite {
  id: string; // channel key or contact public key
}

export interface BotConfig {
  id: string; // UUID for stable identity across renames/reorders
  name: string; // User-editable name
  enabled: boolean; // Whether this bot is enabled
  code: string; // Python code for this bot
}

export interface AppSettings {
  max_radio_contacts: number;
  favorites: Favorite[];
@@ -136,12 +129,16 @@ export interface AppSettings {
  sidebar_sort_order: 'recent' | 'alpha';
  last_message_times: Record<string, number>;
  preferences_migrated: boolean;
  advert_interval: number;
  bots: BotConfig[];
}

export interface AppSettingsUpdate {
  max_radio_contacts?: number;
  auto_decrypt_dm_on_advert?: boolean;
  sidebar_sort_order?: 'recent' | 'alpha';
  advert_interval?: number;
  bots?: BotConfig[];
}

export interface MigratePreferencesRequest {
@@ -156,7 +153,6 @@ export interface MigratePreferencesResponse {
}

/** Contact type constants */
export const CONTACT_TYPE_CLIENT = 1;
export const CONTACT_TYPE_REPEATER = 2;

export interface NeighborInfo {
@@ -194,6 +190,7 @@ export interface TelemetryResponse {
  full_events: number;
  neighbors: NeighborInfo[];
  acl: AclEntry[];
  clock_output: string | null;
}

export interface CommandResponse {
@@ -201,3 +198,15 @@ export interface CommandResponse {
  response: string;
  sender_timestamp: number | null;
}

export interface TraceResponse {
  remote_snr: number | null;
  local_snr: number | null;
  path_len: number;
}

export interface UnreadCounts {
  counts: Record<string, number>;
  mentions: Record<string, boolean>;
  last_message_times: Record<string, number>;
}

@@ -6,8 +6,7 @@
 * Repeaters (type=2) always show 🛜 with a gray background.
 */

// Contact type constants (matches backend)
export const CONTACT_TYPE_REPEATER = 2;
import { CONTACT_TYPE_REPEATER } from '../types';

// Repeater avatar styling
const REPEATER_AVATAR = {
@@ -28,8 +27,9 @@ function hashString(str: string): number {
}

// Regex to match emoji (covers most common emoji ranges)
// Flag emojis (e.g., 🇺🇸) are TWO consecutive regional indicator symbols, so we match those first
const emojiRegex =
  /[\u{1F300}-\u{1F9FF}]|[\u{2600}-\u{26FF}]|[\u{2700}-\u{27BF}]|[\u{1F600}-\u{1F64F}]|[\u{1F680}-\u{1F6FF}]|[\u{1F1E0}-\u{1F1FF}]/u;
  /[\u{1F1E0}-\u{1F1FF}]{2}|[\u{1F300}-\u{1F9FF}]|[\u{2600}-\u{26FF}]|[\u{2700}-\u{27BF}]|[\u{1F600}-\u{1F64F}]|[\u{1F680}-\u{1F6FF}]/u;

/**
 * Extract display characters from a contact name.

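The regex reordering in this commit matters because JavaScript alternation is tried left to right: a flag like 🇺🇸 is two regional indicator code points, so the `{2}` alternative must come first or the match stops after a single half-flag. A standalone snippet (duplicating the two patterns from the diff, runnable outside the app) shows the difference:

```typescript
// Old pattern: regional indicators only matched one at a time, and last.
const oldRegex =
  /[\u{1F300}-\u{1F9FF}]|[\u{2600}-\u{26FF}]|[\u{2700}-\u{27BF}]|[\u{1F600}-\u{1F64F}]|[\u{1F680}-\u{1F6FF}]|[\u{1F1E0}-\u{1F1FF}]/u;
// New pattern: a PAIR of regional indicators (a full flag) is tried first.
const newRegex =
  /[\u{1F1E0}-\u{1F1FF}]{2}|[\u{1F300}-\u{1F9FF}]|[\u{2600}-\u{26FF}]|[\u{2700}-\u{27BF}]|[\u{1F600}-\u{1F64F}]|[\u{1F680}-\u{1F6FF}]/u;

console.log('Jason 🇺🇸'.match(oldRegex)?.[0]); // a lone regional indicator (half a flag)
console.log('Jason 🇺🇸'.match(newRegex)?.[0]); // '🇺🇸' — the full flag
console.log('Test 😀'.match(newRegex)?.[0]); // '😀' — ordinary emoji still match
```

The `u` flag is required so each `\u{...}` range matches whole code points rather than UTF-16 surrogate halves.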
@@ -13,7 +13,7 @@ const LAST_MESSAGE_KEY = 'remoteterm-lastMessageTime';
const SORT_ORDER_KEY = 'remoteterm-sortOrder';

export type ConversationTimes = Record<string, number>;
export type SortOrder = 'recent' | 'alpha';
type SortOrder = 'recent' | 'alpha';

// In-memory cache of last message times (loaded from server on init)
let lastMessageTimesCache: ConversationTimes = {};

@@ -43,6 +43,3 @@ export function clearLocalStorageFavorites(): void {
    // localStorage might be disabled
  }
}

// Re-export the Favorite type for convenience
export type { Favorite };

@@ -148,7 +148,7 @@ export function sortContactsByDistance(
/**
 * Get simple hop count from path string
 */
export function getHopCount(path: string | null | undefined): number {
function getHopCount(path: string | null | undefined): number {
  if (!path || path.length === 0) {
    return 0;
  }

@@ -7,56 +7,20 @@
 * module provides utilities for working with both formats consistently.
 */

/** Length of a full public key in hex characters */
export const PUBKEY_FULL_LENGTH = 64;

/** Length of a public key prefix in hex characters */
export const PUBKEY_PREFIX_LENGTH = 12;
const PUBKEY_PREFIX_LENGTH = 12;

/**
 * Extract the 12-character prefix from a public key.
 * Works with both full keys and existing prefixes.
 */
export function getPubkeyPrefix(key: string): string {
function getPubkeyPrefix(key: string): string {
  return key.slice(0, PUBKEY_PREFIX_LENGTH);
}

/**
 * Check if two public keys match by comparing their prefixes.
 * This handles the case where one key is full (64 chars) and
 * the other is a prefix (12 chars).
 */
export function pubkeysMatch(a: string, b: string): boolean {
  if (!a || !b) return false;
  return getPubkeyPrefix(a) === getPubkeyPrefix(b);
}

/**
 * Check if a public key starts with the given prefix.
 * More explicit than using .startsWith() directly.
 */
export function pubkeyMatchesPrefix(fullKey: string, prefix: string): boolean {
  if (!fullKey || !prefix) return false;
  return fullKey.startsWith(prefix);
}

/**
 * Get a display name for a contact, falling back to pubkey prefix.
 */
export function getContactDisplayName(name: string | null | undefined, pubkey: string): string {
  return name || getPubkeyPrefix(pubkey);
}

/**
 * Check if a key is a full 64-character public key.
 */
export function isFullPubkey(key: string): boolean {
  return key.length === PUBKEY_FULL_LENGTH;
}

/**
 * Check if a key is a 12-character prefix.
 */
export function isPubkeyPrefix(key: string): boolean {
  return key.length === PUBKEY_PREFIX_LENGTH;
}

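The convention behind these utilities is that a full 64-hex-character public key and the 12-character prefix carried in routed messages are considered the same identity when their first 12 characters agree. A minimal standalone sketch (the key below is hypothetical, not app code):

```typescript
// Minimal sketch of the prefix-matching convention (not the shipped module).
const PUBKEY_PREFIX_LENGTH = 12;

function getPubkeyPrefix(key: string): string {
  return key.slice(0, PUBKEY_PREFIX_LENGTH);
}

// Two keys "match" when their first 12 hex chars agree, so a full 64-char
// key can be compared against the 12-char prefix seen in message routing.
function pubkeysMatch(a: string, b: string): boolean {
  if (!a || !b) return false;
  return getPubkeyPrefix(a) === getPubkeyPrefix(b);
}

const full = 'a1b2c3d4e5f6' + '0'.repeat(52); // hypothetical 64-char key
console.log(pubkeysMatch(full, 'a1b2c3d4e5f6')); // true
console.log(pubkeysMatch(full, 'ffffffffffff')); // false
```

Slicing both sides makes the comparison symmetric: either argument may be a full key or an existing prefix.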
@@ -1,6 +1,6 @@
import type { Conversation } from '../types';

export interface ParsedHashConversation {
interface ParsedHashConversation {
  type: 'channel' | 'contact' | 'raw' | 'map' | 'visualizer';
  name: string;
  /** For map view: public key prefix to focus on */

@@ -1,6 +1,6 @@
[project]
name = "remoteterm-meshcore"
version = "1.4.1"
version = "1.8.0"
description = "RemoteTerm - Web interface for MeshCore radio mesh networks"
readme = "README.md"
requires-python = ">=3.10"

tests/e2e/.gitignore (vendored, new file)
@@ -0,0 +1,4 @@
node_modules/
.tmp/
test-results/
playwright-report/

tests/e2e/global-setup.ts (new file)
@@ -0,0 +1,38 @@
import type { FullConfig } from '@playwright/test';

const BASE_URL = 'http://localhost:8000';
const MAX_RETRIES = 10;
const RETRY_DELAY_MS = 2000;

export default async function globalSetup(_config: FullConfig) {
  // Wait for the backend to be fully ready and radio connected
  let lastError: Error | null = null;

  for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
    try {
      const res = await fetch(`${BASE_URL}/api/health`);
      if (!res.ok) {
        throw new Error(`Health check returned ${res.status}`);
      }
      const health = (await res.json()) as { radio_connected: boolean; serial_port: string | null };

      if (!health.radio_connected) {
        throw new Error(
          'Radio not connected — E2E tests require hardware. ' +
            'Set MESHCORE_SERIAL_PORT if auto-detection fails.'
        );
      }

      console.log(`Radio connected on ${health.serial_port}`);
      return;
    } catch (err) {
      lastError = err instanceof Error ? err : new Error(String(err));
      if (attempt < MAX_RETRIES) {
        console.log(`Waiting for backend (attempt ${attempt}/${MAX_RETRIES})...`);
        await new Promise((r) => setTimeout(r, RETRY_DELAY_MS));
      }
    }
  }

  throw new Error(`Backend not ready after ${MAX_RETRIES} attempts: ${lastError?.message}`);
}

tests/e2e/helpers/api.ts (new file)
@@ -0,0 +1,187 @@
/**
 * Direct REST API helpers for E2E test setup and teardown.
 * These bypass the UI to set up preconditions and verify backend state.
 */

const BASE_URL = 'http://localhost:8000/api';

async function fetchJson<T>(path: string, init?: RequestInit): Promise<T> {
  const res = await fetch(`${BASE_URL}${path}`, {
    headers: { 'Content-Type': 'application/json', ...init?.headers },
    ...init,
  });
  if (!res.ok) {
    const body = await res.text();
    throw new Error(`API ${init?.method || 'GET'} ${path} returned ${res.status}: ${body}`);
  }
  return res.json() as Promise<T>;
}

// --- Health ---

export interface HealthStatus {
  radio_connected: boolean;
  serial_port: string | null;
}

export function getHealth(): Promise<HealthStatus> {
  return fetchJson('/health');
}

// --- Radio Config ---

export interface RadioConfig {
  name: string;
  public_key: string;
  lat: number;
  lon: number;
  tx_power: number;
  freq: number;
  bw: number;
  sf: number;
  cr: number;
}

export function getRadioConfig(): Promise<RadioConfig> {
  return fetchJson('/radio/config');
}

export function updateRadioConfig(patch: Partial<RadioConfig>): Promise<RadioConfig> {
  return fetchJson('/radio/config', {
    method: 'PATCH',
    body: JSON.stringify(patch),
  });
}

export function rebootRadio(): Promise<{ status: string; message: string }> {
  return fetchJson('/radio/reboot', { method: 'POST' });
}

// --- Channels ---

export interface Channel {
  key: string;
  name: string;
  is_hashtag: boolean;
  on_radio: boolean;
}

export function getChannels(): Promise<Channel[]> {
  return fetchJson('/channels');
}

export function createChannel(name: string): Promise<Channel> {
  return fetchJson('/channels', {
    method: 'POST',
    body: JSON.stringify({ name }),
  });
}

export function deleteChannel(key: string): Promise<void> {
  return fetchJson(`/channels/${key}`, { method: 'DELETE' });
}

// --- Messages ---

export interface MessagePath {
  path: string;
  received_at: number;
}

export interface Message {
  id: number;
  type: 'PRIV' | 'CHAN';
  conversation_key: string;
  text: string;
  outgoing: boolean;
  acked: number;
  received_at: number;
  sender_timestamp: number | null;
  paths: MessagePath[] | null;
}

export function getMessages(params: {
  type?: string;
  conversation_key?: string;
  limit?: number;
}): Promise<Message[]> {
  const qs = new URLSearchParams();
  if (params.type) qs.set('type', params.type);
  if (params.conversation_key) qs.set('conversation_key', params.conversation_key);
  if (params.limit) qs.set('limit', String(params.limit));
  return fetchJson(`/messages?${qs}`);
}

export function sendChannelMessage(
  channelKey: string,
  text: string
): Promise<{ status: string; message_id: number }> {
  return fetchJson('/messages/channel', {
    method: 'POST',
    body: JSON.stringify({ channel_key: channelKey, text }),
  });
}

// --- Settings ---

export interface BotConfig {
  id: string;
  name: string;
  enabled: boolean;
  code: string;
}

export interface AppSettings {
  max_radio_contacts: number;
  favorites: { type: string; id: string }[];
  auto_decrypt_dm_on_advert: boolean;
  sidebar_sort_order: string;
  last_message_times: Record<string, number>;
  preferences_migrated: boolean;
  bots: BotConfig[];
  advert_interval: number;
}

export function getSettings(): Promise<AppSettings> {
  return fetchJson('/settings');
}

export function updateSettings(patch: Partial<AppSettings>): Promise<AppSettings> {
  return fetchJson('/settings', {
    method: 'PATCH',
    body: JSON.stringify(patch),
  });
}

// --- Helpers ---

/**
 * Ensure #flightless channel exists, creating it if needed.
 * Returns the channel object.
 */
export async function ensureFlightlessChannel(): Promise<Channel> {
  const channels = await getChannels();
  const existing = channels.find((c) => c.name === '#flightless');
  if (existing) return existing;
  return createChannel('#flightless');
}

/**
 * Wait for health to show radio_connected, polling with retries.
 */
export async function waitForRadioConnected(
  timeoutMs: number = 30_000,
  intervalMs: number = 2000
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const health = await getHealth();
      if (health.radio_connected) return;
    } catch {
      // Backend might be restarting
    }
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  throw new Error(`Radio did not reconnect within ${timeoutMs}ms`);
}

tests/e2e/package-lock.json (generated, new file)
@@ -0,0 +1,76 @@
{
  "name": "remoteterm-e2e",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "remoteterm-e2e",
      "devDependencies": {
        "@playwright/test": "^1.52.0"
      }
    },
    "node_modules/@playwright/test": {
      "version": "1.58.0",
      "resolved": "https://registry.npmjs.org/@playwright/test/-/test-1.58.0.tgz",
      "integrity": "sha512-fWza+Lpbj6SkQKCrU6si4iu+fD2dD3gxNHFhUPxsfXBPhnv3rRSQVd0NtBUT9Z/RhF/boCBcuUaMUSTRTopjZg==",
      "dev": true,
      "license": "Apache-2.0",
      "dependencies": {
        "playwright": "1.58.0"
      },
      "bin": {
        "playwright": "cli.js"
      },
      "engines": {
        "node": ">=18"
      }
    },
    "node_modules/fsevents": {
      "version": "2.3.2",
      "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.2.tgz",
      "integrity": "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA==",
      "dev": true,
      "hasInstallScript": true,
      "license": "MIT",
      "optional": true,
      "os": [
        "darwin"
      ],
      "engines": {
        "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
      }
    },
    "node_modules/playwright": {
      "version": "1.58.0",
      "resolved": "https://registry.npmjs.org/playwright/-/playwright-1.58.0.tgz",
      "integrity": "sha512-2SVA0sbPktiIY/MCOPX8e86ehA/e+tDNq+e5Y8qjKYti2Z/JG7xnronT/TXTIkKbYGWlCbuucZ6dziEgkoEjQQ==",
      "dev": true,
      "license": "Apache-2.0",
      "dependencies": {
        "playwright-core": "1.58.0"
      },
      "bin": {
        "playwright": "cli.js"
      },
      "engines": {
        "node": ">=18"
      },
      "optionalDependencies": {
        "fsevents": "2.3.2"
      }
    },
    "node_modules/playwright-core": {
      "version": "1.58.0",
      "resolved": "https://registry.npmjs.org/playwright-core/-/playwright-core-1.58.0.tgz",
      "integrity": "sha512-aaoB1RWrdNi3//rOeKuMiS65UCcgOVljU46At6eFcOFPFHWtd2weHRRow6z/n+Lec0Lvu0k9ZPKJSjPugikirw==",
      "dev": true,
      "license": "Apache-2.0",
      "bin": {
        "playwright-core": "cli.js"
      },
      "engines": {
        "node": ">=18"
      }
    }
  }
}

tests/e2e/package.json (new file)
@@ -0,0 +1,11 @@
{
  "name": "remoteterm-e2e",
  "private": true,
  "scripts": {
    "test": "playwright test",
    "test:headed": "playwright test --headed"
  },
  "devDependencies": {
    "@playwright/test": "^1.52.0"
  }
}

tests/e2e/playwright.config.ts (new file, +51)
@@ -0,0 +1,51 @@
import { defineConfig } from '@playwright/test';
import path from 'path';

const projectRoot = path.resolve(__dirname, '..', '..');
const tmpDir = path.resolve(__dirname, '.tmp');

export default defineConfig({
  testDir: './specs',
  globalSetup: './global-setup.ts',

  // Radio operations are slow — generous timeouts
  timeout: 60_000,
  expect: { timeout: 15_000 },

  // Don't retry — failures likely indicate real hardware/app issues
  retries: 0,

  // Run tests serially — single radio means no parallelism
  fullyParallel: false,
  workers: 1,

  reporter: [['list'], ['html', { open: 'never' }]],

  use: {
    baseURL: 'http://localhost:8000',
    trace: 'on-first-retry',
    screenshot: 'only-on-failure',
  },

  projects: [
    {
      name: 'chromium',
      use: { browserName: 'chromium' },
    },
  ],

  webServer: {
    command: 'uv run uvicorn app.main:app --host 127.0.0.1 --port 8000',
    cwd: projectRoot,
    port: 8000,
    reuseExistingServer: false,
    timeout: 30_000,
    env: {
      MESHCORE_DATABASE_PATH: path.join(tmpDir, 'e2e-test.db'),
      // Pass through the serial port from the environment
      ...(process.env.MESHCORE_SERIAL_PORT
        ? { MESHCORE_SERIAL_PORT: process.env.MESHCORE_SERIAL_PORT }
        : {}),
    },
  },
});
tests/e2e/specs/bot.spec.ts (new file, +67)
@@ -0,0 +1,67 @@
import { test, expect } from '@playwright/test';
import { ensureFlightlessChannel, getSettings, updateSettings } from '../helpers/api';
import type { BotConfig } from '../helpers/api';

const BOT_CODE = `def bot(sender_name, sender_key, message_text, is_dm, channel_key, channel_name, sender_timestamp, path):
    if channel_name == "#flightless" and "!e2etest" in message_text.lower():
        return "[BOT] e2e-ok"
    return None`;

test.describe('Bot functionality', () => {
  let originalBots: BotConfig[];

  test.beforeAll(async () => {
    await ensureFlightlessChannel();
    const settings = await getSettings();
    originalBots = settings.bots ?? [];
  });

  test.afterAll(async () => {
    // Restore original bot config
    try {
      await updateSettings({ bots: originalBots });
    } catch {
      console.warn('Failed to restore bot config');
    }
  });

  test('create a bot via API, verify it in UI, trigger it, and verify response', async ({
    page,
  }) => {
    // --- Step 1: Create and enable bot via API ---
    // CodeMirror is difficult to drive via Playwright (contenteditable, lazy-loaded),
    // so we set the bot code via the REST API and verify it through the UI.
    const testBot: BotConfig = {
      id: crypto.randomUUID(),
      name: 'E2E Test Bot',
      enabled: true,
      code: BOT_CODE,
    };
    await updateSettings({ bots: [...originalBots, testBot] });

    // --- Step 2: Verify bot appears in settings UI ---
    await page.goto('/');
    await expect(page.getByText('Connected')).toBeVisible();

    await page.getByText('Radio & Config').click();
    await page.getByRole('tab', { name: 'Bot' }).click();

    // The bot name should be visible in the bot list
    await expect(page.getByText('E2E Test Bot')).toBeVisible();

    // Close settings
    await page.keyboard.press('Escape');

    // --- Step 3: Trigger the bot ---
    await page.getByText('#flightless', { exact: true }).first().click();

    const triggerMessage = `!e2etest ${Date.now()}`;
    const input = page.getByPlaceholder(/type a message|message #flightless/i);
    await input.fill(triggerMessage);
    await page.getByRole('button', { name: 'Send' }).click();

    // --- Step 4: Verify bot response appears ---
    // Bot has ~2s delay before responding, plus radio send time
    await expect(page.getByText('[BOT] e2e-ok')).toBeVisible({ timeout: 30_000 });
  });
});
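The bot contract this spec exercises is visible in `BOT_CODE` above: a Python `bot()` function that returns a reply string or `None`. That trigger logic can be sanity-checked without a radio. How RemoteTerm actually loads and sandboxes bot code is not shown here, so treat the small harness below (the `exec` into a fresh namespace) as an illustration only:

```python
# The bot source embedded in the spec above, exercised stand-alone.
BOT_CODE = '''def bot(sender_name, sender_key, message_text, is_dm, channel_key, channel_name, sender_timestamp, path):
    if channel_name == "#flightless" and "!e2etest" in message_text.lower():
        return "[BOT] e2e-ok"
    return None'''

namespace: dict = {}
exec(BOT_CODE, namespace)  # compile the handler into an isolated namespace
bot = namespace["bot"]

# The trigger fires only in #flightless and only when "!e2etest" appears (case-insensitive).
print(bot("Bob", "ab" * 32, "!E2ETEST 123", False, "key", "#flightless", 0, []))  # [BOT] e2e-ok
print(bot("Bob", "ab" * 32, "!e2etest 123", False, "key", "#general", 0, []))     # None
```

This is why the spec can assert on the literal text `[BOT] e2e-ok`: the reply is a constant, independent of the unique `!e2etest ${Date.now()}` trigger message.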
tests/e2e/specs/health.spec.ts (new file, +22)
@@ -0,0 +1,22 @@
import { test, expect } from '@playwright/test';

test.describe('Health & UI basics', () => {
  test('page loads and shows connected status', async ({ page }) => {
    await page.goto('/');

    // Status bar shows "Connected"
    await expect(page.getByText('Connected')).toBeVisible();

    // Sidebar is visible with key sections
    await expect(page.getByRole('heading', { name: 'Conversations' })).toBeVisible();
    await expect(page.getByText('Packet Feed')).toBeVisible();
    await expect(page.getByText('Node Map')).toBeVisible();
  });

  test('sidebar shows Channels and Contacts sections', async ({ page }) => {
    await page.goto('/');

    await expect(page.getByText('Channels')).toBeVisible();
    await expect(page.getByText('Contacts')).toBeVisible();
  });
});
tests/e2e/specs/incoming-message.spec.ts (new file, +174)
@@ -0,0 +1,174 @@
import { test, expect } from '@playwright/test';
import { createChannel, getChannels, getMessages } from '../helpers/api';

/**
 * These tests wait for real incoming messages from the mesh network.
 * They require a radio attached and other nodes actively transmitting.
 * Timeout is 10 minutes to allow for intermittent traffic.
 */

const ROOMS = [
  '#flightless', '#bot', '#snoco', '#skagit', '#edmonds', '#bachelorette',
  '#emergency', '#furry', '#public', '#puppy', '#foobar', '#capitolhill',
  '#hamradio', '#icewatch', '#saucefamily', '#scvsar', '#startrek', '#metalmusic',
  '#seattle', '#vanbot', '#bot-van', '#lynden', '#bham', '#sipesbot', '#psrg',
  '#testing', '#olybot', '#test', '#ve7rva', '#wardrive', '#kitsap', '#tacoma',
  '#rats', '#pdx', '#olympia', '#bot2', '#transit', '#salishmesh', '#meshwar',
  '#cats', '#jokes', '#decode', '#whatcom', '#bot-oly', '#sports', '#weather',
  '#wasma', '#ravenna', '#northbend', '#dsa', '#oly-bot', '#grove', '#cars',
  '#bellingham', '#baseball', '#mariners', '#eugene', '#victoria', '#vimesh',
  '#bot-pdx', '#chinese', '#miro', '#poop', '#papa', '#uw', '#renton',
  '#general', '#bellevue', '#eastside', '#bit', '#dev', '#farts', '#protest',
  '#gmrs', '#pri', '#boob', '#baga', '#fun', '#w7dk', '#wedgwood', '#bots',
  '#sounders', '#steelhead', '#uetfwf', '#ballard', '#at', '#1234567', '#funny',
  '#abbytest', '#abird', '#afterparty', '#arborheights', '#atheist', '#auburn',
  '#bbs', '#blog', '#bottest', '#cascadiamesh', '#chat', '#checkcheck',
  '#civicmesh', '#columbiacity', '#dad', '#dmaspace', '#droptable', '#duvall',
  '#dx', '#emcomm', '#finnhill', '#foxden', '#freebsd', '#greenwood', '#howlbot',
  '#idahomesh', '#junk', '#kraken', '#kremwerk', '#maplemesh', '#meshcore',
  '#meshmonday', '#methow', '#minecraft', '#newwestminster', '#northvan',
  '#ominous', '#pagan', '#party', '#place', '#pokemon', '#portland', '#rave',
  '#raving', '#rftest', '#richmond', '#rolston', '#salishtest', '#saved',
  '#seahawks', '#sipebot', '#slumbermesh', '#snoqualmie', '#southisland',
  '#sydney', '#tacobot', '#tdeck', '#trans', '#ubc', '#underground', '#van-bot',
  '#vancouver', '#vashon', '#wardriving', '#wormhole', '#yelling', '#zork',
];

// 10 minute timeout for waiting on mesh traffic
test.describe('Incoming mesh messages', () => {
  test.setTimeout(600_000);

  test.beforeAll(async () => {
    // Ensure all rooms exist — create any that are missing
    const existing = await getChannels();
    const existingNames = new Set(existing.map((c) => c.name));

    for (const room of ROOMS) {
      if (!existingNames.has(room)) {
        try {
          await createChannel(room);
        } catch {
          // May already exist from a concurrent creation, ignore
        }
      }
    }
  });

  test('receive an incoming message in any room', async ({ page }) => {
    await page.goto('/');
    await expect(page.getByText('Connected')).toBeVisible();

    // Record each channel's latest message id as a baseline so we can detect new ones
    const channels = await getChannels();
    const baselineCounts = new Map<string, number>();
    for (const ch of channels) {
      const msgs = await getMessages({ type: 'CHAN', conversation_key: ch.key, limit: 1 });
      baselineCounts.set(ch.key, msgs.length > 0 ? msgs[0].id : 0);
    }

    // Poll for a new incoming message across all channels
    let foundChannel: string | null = null;
    let foundMessageText: string | null = null;

    await expect(async () => {
      for (const ch of channels) {
        const msgs = await getMessages({
          type: 'CHAN',
          conversation_key: ch.key,
          limit: 5,
        });
        const baseline = baselineCounts.get(ch.key) ?? 0;
        const newIncoming = msgs.find((m) => m.id > baseline && !m.outgoing);
        if (newIncoming) {
          foundChannel = ch.name;
          foundMessageText = newIncoming.text;
          return;
        }
      }
      throw new Error('No new incoming messages yet');
    }).toPass({ intervals: [5_000], timeout: 570_000 });

    // Navigate to the channel that received a message
    console.log(`Received message in ${foundChannel}: "${foundMessageText}"`);
    await page.getByText(foundChannel!, { exact: true }).first().click();

    // Verify the message text is visible in the message list area (not sidebar)
    const messageArea = page.locator('.break-words');
    const messageContent = foundMessageText!.includes(': ')
      ? foundMessageText!.split(': ').slice(1).join(': ')
      : foundMessageText!;
    await expect(messageArea.getByText(messageContent, { exact: false }).first()).toBeVisible({
      timeout: 15_000,
    });
  });

  test('incoming message with path shows hop badge and path modal', async ({ page }) => {
    await page.goto('/');
    await expect(page.getByText('Connected')).toBeVisible();

    // Record baselines
    const channels = await getChannels();
    const baselineCounts = new Map<string, number>();
    for (const ch of channels) {
      const msgs = await getMessages({ type: 'CHAN', conversation_key: ch.key, limit: 1 });
      baselineCounts.set(ch.key, msgs.length > 0 ? msgs[0].id : 0);
    }

    // Wait for any incoming message that has path data
    let foundChannel: string | null = null;

    await expect(async () => {
      for (const ch of channels) {
        const msgs = await getMessages({
          type: 'CHAN',
          conversation_key: ch.key,
          limit: 10,
        });
        const baseline = baselineCounts.get(ch.key) ?? 0;
        const withPath = msgs.find(
          (m) => m.id > baseline && !m.outgoing && m.paths && m.paths.length > 0
        );
        if (withPath) {
          foundChannel = ch.name;
          return;
        }
      }
      throw new Error('No new incoming messages with path data yet');
    }).toPass({ intervals: [5_000], timeout: 570_000 });

    console.log(`Found message with path in ${foundChannel}`);

    // Navigate to the channel that received a message with path data
    await page.getByText(foundChannel!, { exact: true }).first().click();

    // Find any hop badge on the page — they all have title="View message path"
    // We don't care which specific message; just that a path badge exists and works.
    const badge = page.getByTitle('View message path').first();
    await expect(badge).toBeVisible({ timeout: 15_000 });

    // The badge text should match the pattern: (d), (1), (d/1/3), etc.
    const badgeText = await badge.textContent();
    console.log(`Badge text: ${badgeText}`);
    expect(badgeText).toMatch(/^\([d\d]+(\/[d\d]+)*\)$/);

    // Click the badge to open the path modal
    await badge.click();

    const modal = page.getByRole('dialog');
    await expect(modal).toBeVisible();

    // Verify the modal has the basic structural elements every path modal should have
    await expect(modal.getByText('Sender')).toBeVisible();
    await expect(modal.getByText('Receiver (me)')).toBeVisible();

    // Title should be either "Message Path" (single) or "Message Paths (N)" (multiple)
    const titleEl = modal.locator('h2, [class*="DialogTitle"]').first();
    const titleText = await titleEl.textContent();
    console.log(`Modal title: ${titleText}`);
    expect(titleText).toMatch(/^Message Paths?(\s+\(\d+\))?$/);

    // Close the modal
    await modal.getByRole('button', { name: 'Close', exact: true }).first().click();
    await expect(modal).not.toBeVisible();
  });
});
tests/e2e/specs/messaging.spec.ts (new file, +49)
@@ -0,0 +1,49 @@
import { test, expect } from '@playwright/test';
import { ensureFlightlessChannel } from '../helpers/api';

test.describe('Channel messaging in #flightless', () => {
  test.beforeEach(async () => {
    await ensureFlightlessChannel();
  });

  test('send a message and see it appear', async ({ page }) => {
    await page.goto('/');

    // Click #flightless in the sidebar (use exact match to avoid "Flightless🥝" etc.)
    await page.getByText('#flightless', { exact: true }).first().click();

    // Verify conversation is open — the input placeholder includes the channel name
    await expect(page.getByPlaceholder(/message #flightless/i)).toBeVisible();

    // Compose a unique message
    const testMessage = `e2e-test-${Date.now()}`;
    const input = page.getByPlaceholder(/type a message|message #flightless/i);
    await input.fill(testMessage);

    // Send it
    await page.getByRole('button', { name: 'Send' }).click();

    // Verify message appears in the message list
    await expect(page.getByText(testMessage)).toBeVisible({ timeout: 15_000 });
  });

  test('outgoing message shows ack indicator', async ({ page }) => {
    await page.goto('/');

    await page.getByText('#flightless', { exact: true }).first().click();

    const testMessage = `ack-test-${Date.now()}`;
    const input = page.getByPlaceholder(/type a message|message #flightless/i);
    await input.fill(testMessage);
    await page.getByRole('button', { name: 'Send' }).click();

    // Wait for the message to appear
    const messageEl = page.getByText(testMessage);
    await expect(messageEl).toBeVisible({ timeout: 15_000 });

    // Outgoing messages show either "?" (pending) or "✓" (acked)
    // The ack indicator is in the same container as the message text
    const messageContainer = messageEl.locator('..');
    await expect(messageContainer.getByText(/[?✓]/)).toBeVisible();
  });
});
tests/e2e/specs/radio-settings.spec.ts (new file, +54)
@@ -0,0 +1,54 @@
import { test, expect } from '@playwright/test';
import { getRadioConfig, updateRadioConfig } from '../helpers/api';

test.describe('Radio settings', () => {
  let originalName: string;

  test.beforeAll(async () => {
    const config = await getRadioConfig();
    originalName = config.name;
  });

  test.afterAll(async () => {
    // Restore original name via API
    try {
      await updateRadioConfig({ name: originalName });
    } catch {
      console.warn('Failed to restore radio name — manual intervention may be needed');
    }
  });

  test('change radio name via settings UI and verify persistence', async ({ page }) => {
    // Radio names are limited to 8 characters
    const testName = 'E2Etest1';

    await page.goto('/');
    await expect(page.getByText('Connected')).toBeVisible();

    // --- Step 1: Change the name via settings UI ---
    await page.getByText('Radio & Config').click();
    await page.getByRole('tab', { name: 'Identity' }).click();

    const nameInput = page.locator('#name');
    await nameInput.clear();
    await nameInput.fill(testName);

    await page.getByRole('button', { name: 'Save Identity Settings' }).click();
    await expect(page.getByText('Identity settings saved')).toBeVisible({ timeout: 10_000 });

    // Close settings
    await page.keyboard.press('Escape');

    // --- Step 2: Verify via API (now returns fresh data after send_appstart fix) ---
    const config = await getRadioConfig();
    expect(config.name).toBe(testName);

    // --- Step 3: Verify persistence across page reload ---
    await page.reload();
    await expect(page.getByText('Connected')).toBeVisible({ timeout: 15_000 });

    await page.getByText('Radio & Config').click();
    await page.getByRole('tab', { name: 'Identity' }).click();
    await expect(page.locator('#name')).toHaveValue(testName, { timeout: 10_000 });
  });
});
tests/e2e/tsconfig.json (new file, +12)
@@ -0,0 +1,12 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "outDir": "dist"
  },
  "include": ["**/*.ts"]
}
@@ -129,6 +129,11 @@ class TestMessagesEndpoint:
        mock_mc = MagicMock()
        mock_mc.get_contact_by_key_prefix.return_value = {"public_key": "a" * 64}

        mock_add_result = MagicMock()
        mock_add_result.type = MagicMock()
        mock_add_result.type.name = "OK"
        mock_mc.commands.add_contact = AsyncMock(return_value=mock_add_result)

        mock_send_result = MagicMock()
        mock_send_result.type = MagicMock()
        mock_send_result.type.name = "OK"
@@ -475,6 +480,353 @@ class TestReadStateEndpoints:
        assert response.status_code == 404
        assert "not found" in response.json()["detail"].lower()

    @pytest.mark.asyncio
    async def test_get_unreads_returns_counts_and_mentions(self):
        """GET /unreads returns unread counts, mentions, and last message times."""
        import aiosqlite

        from app.database import db
        from app.repository import MessageRepository

        conn = await aiosqlite.connect(":memory:")
        conn.row_factory = aiosqlite.Row

        # Create tables
        await conn.execute("""
            CREATE TABLE contacts (
                public_key TEXT PRIMARY KEY,
                name TEXT,
                type INTEGER DEFAULT 0,
                flags INTEGER DEFAULT 0,
                last_path TEXT,
                last_path_len INTEGER DEFAULT -1,
                last_advert INTEGER,
                lat REAL,
                lon REAL,
                last_seen INTEGER,
                on_radio INTEGER DEFAULT 0,
                last_contacted INTEGER,
                last_read_at INTEGER
            )
        """)
        await conn.execute("""
            CREATE TABLE channels (
                key TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                is_hashtag INTEGER DEFAULT 0,
                on_radio INTEGER DEFAULT 0,
                last_read_at INTEGER
            )
        """)
        await conn.execute("""
            CREATE TABLE messages (
                id INTEGER PRIMARY KEY,
                type TEXT NOT NULL,
                conversation_key TEXT NOT NULL,
                text TEXT NOT NULL,
                sender_timestamp INTEGER,
                received_at INTEGER NOT NULL,
                paths TEXT,
                txt_type INTEGER DEFAULT 0,
                signature TEXT,
                outgoing INTEGER DEFAULT 0,
                acked INTEGER DEFAULT 0,
                UNIQUE(type, conversation_key, text, sender_timestamp)
            )
        """)

        # Insert channel and contact
        await conn.execute(
            "INSERT INTO channels (key, name, last_read_at) VALUES (?, ?, ?)",
            ("AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1", "Public", 1000),
        )
        await conn.execute(
            "INSERT INTO contacts (public_key, name, last_read_at) VALUES (?, ?, ?)",
            ("abcd" * 16, "Alice", 1000),
        )

        # Insert messages: 2 unread channel msgs (after last_read_at=1000),
        # 1 read (before), 1 outgoing (should not count)
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("CHAN", "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1", "Bob: hello", 1001, 0),
        )
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("CHAN", "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1", "Bob: @[TestUser] hey", 1002, 0),
        )
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("CHAN", "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1", "Bob: old msg", 999, 0),
        )
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("CHAN", "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1", "Me: outgoing", 1003, 1),
        )

        # Insert 1 unread DM
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("PRIV", "abcd" * 16, "hi there", 1005, 0),
        )
        await conn.commit()

        original_conn = db._connection
        db._connection = conn

        try:
            result = await MessageRepository.get_unread_counts("TestUser")

            # Channel: 2 unread (1001 and 1002), one has mention
            assert result["counts"]["channel-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1"] == 2
            assert result["mentions"]["channel-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1"] is True

            # Contact: 1 unread
            assert result["counts"][f"contact-{'abcd' * 16}"] == 1

            # Last message times should include all conversations
            assert "channel-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1" in result["last_message_times"]
            assert result["last_message_times"]["channel-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1"] == 1003
            assert f"contact-{'abcd' * 16}" in result["last_message_times"]
            assert result["last_message_times"][f"contact-{'abcd' * 16}"] == 1005
        finally:
            db._connection = original_conn
            await conn.close()

    @pytest.mark.asyncio
    async def test_get_unreads_no_name_skips_mentions(self):
        """GET /unreads without name param returns counts but no mention flags."""
        import aiosqlite

        from app.database import db
        from app.repository import MessageRepository

        conn = await aiosqlite.connect(":memory:")
        conn.row_factory = aiosqlite.Row

        await conn.execute("""
            CREATE TABLE channels (
                key TEXT PRIMARY KEY,
                name TEXT NOT NULL,
                is_hashtag INTEGER DEFAULT 0,
                on_radio INTEGER DEFAULT 0,
                last_read_at INTEGER
            )
        """)
        await conn.execute("""
            CREATE TABLE contacts (
                public_key TEXT PRIMARY KEY,
                name TEXT,
                type INTEGER DEFAULT 0,
                flags INTEGER DEFAULT 0,
                last_path TEXT,
                last_path_len INTEGER DEFAULT -1,
                last_advert INTEGER,
                lat REAL,
                lon REAL,
                last_seen INTEGER,
                on_radio INTEGER DEFAULT 0,
                last_contacted INTEGER,
                last_read_at INTEGER
            )
        """)
        await conn.execute("""
            CREATE TABLE messages (
                id INTEGER PRIMARY KEY,
                type TEXT NOT NULL,
                conversation_key TEXT NOT NULL,
                text TEXT NOT NULL,
                sender_timestamp INTEGER,
                received_at INTEGER NOT NULL,
                paths TEXT,
                txt_type INTEGER DEFAULT 0,
                signature TEXT,
                outgoing INTEGER DEFAULT 0,
                acked INTEGER DEFAULT 0,
                UNIQUE(type, conversation_key, text, sender_timestamp)
            )
        """)

        await conn.execute(
            "INSERT INTO channels (key, name, last_read_at) VALUES (?, ?, ?)",
            ("CHAN1KEY1CHAN1KEY1CHAN1KEY1CHAN1KEY1", "Public", 0),
        )
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("CHAN", "CHAN1KEY1CHAN1KEY1CHAN1KEY1CHAN1KEY1", "Bob: @[Alice] hey", 1001, 0),
        )
        await conn.commit()

        original_conn = db._connection
        db._connection = conn

        try:
            result = await MessageRepository.get_unread_counts(None)

            assert result["counts"]["channel-CHAN1KEY1CHAN1KEY1CHAN1KEY1CHAN1KEY1"] == 1
            # No mentions since name was None
            assert len(result["mentions"]) == 0
        finally:
            db._connection = original_conn
            await conn.close()

    @pytest.mark.asyncio
    async def test_unreads_reset_after_mark_read(self):
        """Marking a conversation as read zeroes its unread count; new messages after count again."""
        import aiosqlite

        from app.database import db
        from app.repository import MessageRepository

        conn = await aiosqlite.connect(":memory:")
        conn.row_factory = aiosqlite.Row

        await conn.execute("""
            CREATE TABLE channels (
                key TEXT PRIMARY KEY, name TEXT NOT NULL,
                is_hashtag INTEGER DEFAULT 0, on_radio INTEGER DEFAULT 0, last_read_at INTEGER
            )
        """)
        await conn.execute("""
            CREATE TABLE contacts (
                public_key TEXT PRIMARY KEY, name TEXT,
                type INTEGER DEFAULT 0, flags INTEGER DEFAULT 0,
                last_path TEXT, last_path_len INTEGER DEFAULT -1,
                last_advert INTEGER, lat REAL, lon REAL, last_seen INTEGER,
                on_radio INTEGER DEFAULT 0, last_contacted INTEGER, last_read_at INTEGER
            )
        """)
        await conn.execute("""
            CREATE TABLE messages (
                id INTEGER PRIMARY KEY, type TEXT NOT NULL,
                conversation_key TEXT NOT NULL, text TEXT NOT NULL,
                sender_timestamp INTEGER, received_at INTEGER NOT NULL,
                paths TEXT, txt_type INTEGER DEFAULT 0, signature TEXT,
                outgoing INTEGER DEFAULT 0, acked INTEGER DEFAULT 0,
                UNIQUE(type, conversation_key, text, sender_timestamp)
            )
        """)

        chan_key = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA1"
        await conn.execute(
            "INSERT INTO channels (key, name, last_read_at) VALUES (?, ?, ?)",
            (chan_key, "Public", 1000),
        )
        # 2 unread messages (received_at > last_read_at=1000)
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("CHAN", chan_key, "msg1", 1001, 0),
        )
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("CHAN", chan_key, "msg2", 1002, 0),
        )
        await conn.commit()

        original_conn = db._connection
        db._connection = conn

        try:
            # Verify 2 unread
            result = await MessageRepository.get_unread_counts(None)
            assert result["counts"][f"channel-{chan_key}"] == 2

            # Simulate mark-read by updating last_read_at to after all messages
            await conn.execute(
                "UPDATE channels SET last_read_at = ? WHERE key = ?", (1002, chan_key)
            )
            await conn.commit()

            # Verify 0 unread
            result = await MessageRepository.get_unread_counts(None)
            assert result["counts"].get(f"channel-{chan_key}", 0) == 0

            # New message arrives after the read point
            await conn.execute(
                "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
                ("CHAN", chan_key, "msg3", 1003, 0),
            )
            await conn.commit()

            # Verify exactly 1 unread
            result = await MessageRepository.get_unread_counts(None)
            assert result["counts"][f"channel-{chan_key}"] == 1
        finally:
            db._connection = original_conn
            await conn.close()

    @pytest.mark.asyncio
    async def test_unreads_exclude_outgoing_messages(self):
        """Outgoing messages should never count as unread, even when received_at > last_read_at.

        This is critical: without the outgoing filter, every message we send would
        show as an unread badge in the sidebar.
        """
        import aiosqlite

        from app.database import db
        from app.repository import MessageRepository

        conn = await aiosqlite.connect(":memory:")
        conn.row_factory = aiosqlite.Row

        await conn.execute("""
            CREATE TABLE channels (
                key TEXT PRIMARY KEY, name TEXT NOT NULL,
                is_hashtag INTEGER DEFAULT 0, on_radio INTEGER DEFAULT 0, last_read_at INTEGER
            )
        """)
        await conn.execute("""
            CREATE TABLE contacts (
                public_key TEXT PRIMARY KEY, name TEXT,
                type INTEGER DEFAULT 0, flags INTEGER DEFAULT 0,
                last_path TEXT, last_path_len INTEGER DEFAULT -1,
                last_advert INTEGER, lat REAL, lon REAL, last_seen INTEGER,
                on_radio INTEGER DEFAULT 0, last_contacted INTEGER, last_read_at INTEGER
            )
        """)
        await conn.execute("""
            CREATE TABLE messages (
                id INTEGER PRIMARY KEY, type TEXT NOT NULL,
                conversation_key TEXT NOT NULL, text TEXT NOT NULL,
                sender_timestamp INTEGER, received_at INTEGER NOT NULL,
                paths TEXT, txt_type INTEGER DEFAULT 0, signature TEXT,
                outgoing INTEGER DEFAULT 0, acked INTEGER DEFAULT 0,
                UNIQUE(type, conversation_key, text, sender_timestamp)
            )
        """)

        contact_key = "abcd" * 16
        await conn.execute(
            "INSERT INTO contacts (public_key, name, last_read_at) VALUES (?, ?, ?)",
            (contact_key, "Bob", 1000),
        )
        # 1 incoming (should count) + 2 outgoing (should NOT count)
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("PRIV", contact_key, "incoming msg", 1001, 0),
        )
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("PRIV", contact_key, "my reply", 1002, 1),
        )
        await conn.execute(
            "INSERT INTO messages (type, conversation_key, text, received_at, outgoing) VALUES (?, ?, ?, ?, ?)",
            ("PRIV", contact_key, "another reply", 1003, 1),
        )
        await conn.commit()

        original_conn = db._connection
        db._connection = conn

        try:
            result = await MessageRepository.get_unread_counts(None)
            # Only the 1 incoming message should count as unread
            assert result["counts"][f"contact-{contact_key}"] == 1
        finally:
            db._connection = original_conn
            await conn.close()

    @pytest.mark.asyncio
    async def test_mark_all_read_updates_all_conversations(self):
        """Bulk mark-all-read updates all contacts and channels."""
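Every unread-count assertion in the tests above reduces to one query shape: count messages whose `received_at` is later than the conversation's `last_read_at`, skipping `outgoing` rows. A minimal synchronous sketch of that logic with the stdlib `sqlite3` module (the app itself uses `aiosqlite`, and `unread_count` is an illustrative name here, not the repository's actual API):

```python
import sqlite3

def unread_count(conn: sqlite3.Connection, conversation_key: str, last_read_at: int) -> int:
    # Only incoming (outgoing = 0) messages newer than the read marker count as unread.
    (n,) = conn.execute(
        "SELECT COUNT(*) FROM messages"
        " WHERE conversation_key = ? AND received_at > ? AND outgoing = 0",
        (conversation_key, last_read_at),
    ).fetchone()
    return n

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages (id INTEGER PRIMARY KEY, conversation_key TEXT,"
    " received_at INTEGER, outgoing INTEGER DEFAULT 0)"
)
# Mirrors the first fixture: two unread, one already read, one outgoing.
conn.executemany(
    "INSERT INTO messages (conversation_key, received_at, outgoing) VALUES (?, ?, ?)",
    [("chan1", 1001, 0), ("chan1", 1002, 0), ("chan1", 999, 0), ("chan1", 1003, 1)],
)
print(unread_count(conn, "chan1", 1000))  # 2
```

Mark-read then falls out naturally: bumping `last_read_at` past the newest message drives the count to zero without touching the message rows themselves.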
tests/test_bot.py (new file, +1044)

tests/test_contacts_router.py (new file, +531)
@@ -0,0 +1,531 @@
"""Tests for the contacts router.

Verifies the contact CRUD endpoints, sync, mark-read, delete,
and add/remove from radio operations.

Uses FastAPI TestClient with mocked dependencies, consistent
with the test_api.py pattern.
"""

from unittest.mock import AsyncMock, MagicMock, patch

from meshcore import EventType

# Sample 64-char hex public keys for testing
KEY_A = "aa" * 32  # aaaa...aa
KEY_B = "bb" * 32  # bbbb...bb
KEY_C = "cc" * 32  # cccc...cc


def _make_contact(public_key=KEY_A, name="Alice", **overrides):
    """Create a mock Contact model instance."""
    from app.models import Contact

    defaults = {
        "public_key": public_key,
        "name": name,
        "type": 0,
        "flags": 0,
        "last_path": None,
        "last_path_len": -1,
        "last_advert": None,
        "lat": None,
        "lon": None,
        "last_seen": None,
        "on_radio": False,
        "last_contacted": None,
        "last_read_at": None,
    }
    defaults.update(overrides)
    return Contact(**defaults)


class TestListContacts:
    """Test GET /api/contacts."""

    def test_list_returns_contacts(self):
        from fastapi.testclient import TestClient

        contacts = [_make_contact(KEY_A, "Alice"), _make_contact(KEY_B, "Bob")]

        with patch(
            "app.routers.contacts.ContactRepository.get_all",
            new_callable=AsyncMock,
            return_value=contacts,
        ):
            from app.main import app

            client = TestClient(app)
            response = client.get("/api/contacts")

            assert response.status_code == 200
            data = response.json()
            assert len(data) == 2
            assert data[0]["public_key"] == KEY_A
            assert data[1]["public_key"] == KEY_B

    def test_list_pagination_params(self):
        """Pagination parameters are forwarded to repository."""
        from fastapi.testclient import TestClient

        with patch(
            "app.routers.contacts.ContactRepository.get_all",
            new_callable=AsyncMock,
            return_value=[],
        ) as mock_get_all:
            from app.main import app

            client = TestClient(app)
            response = client.get("/api/contacts?limit=5&offset=10")

            assert response.status_code == 200
            mock_get_all.assert_called_once_with(limit=5, offset=10)


class TestCreateContact:
    """Test POST /api/contacts."""

    def test_create_new_contact(self):
        from fastapi.testclient import TestClient

        with (
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=None,
            ),
            patch(
                "app.routers.contacts.ContactRepository.upsert",
                new_callable=AsyncMock,
            ) as mock_upsert,
            patch(
                "app.routers.contacts.MessageRepository.claim_prefix_messages",
                new_callable=AsyncMock,
                return_value=0,
            ),
        ):
            from app.main import app

            client = TestClient(app)
            response = client.post(
                "/api/contacts",
                json={"public_key": KEY_A, "name": "NewContact"},
            )

            assert response.status_code == 200
            data = response.json()
            assert data["public_key"] == KEY_A
            assert data["name"] == "NewContact"
            mock_upsert.assert_called_once()

    def test_create_invalid_hex(self):
        """Non-hex public key returns 400."""
        from fastapi.testclient import TestClient

        with patch(
            "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
            new_callable=AsyncMock,
            return_value=None,
        ):
            from app.main import app

            client = TestClient(app)
            response = client.post(
                "/api/contacts",
                json={"public_key": "zz" * 32, "name": "Bad"},
            )

            assert response.status_code == 400
            assert "hex" in response.json()["detail"].lower()

    def test_create_short_key_rejected(self):
        """Key shorter than 64 chars is rejected by pydantic validation."""
        from fastapi.testclient import TestClient

        from app.main import app

        client = TestClient(app)
        response = client.post(
            "/api/contacts",
            json={"public_key": "aa" * 16, "name": "Short"},
        )

        assert response.status_code == 422

    def test_create_existing_updates_name(self):
        """Creating a contact that exists updates the name."""
        from fastapi.testclient import TestClient

        existing = _make_contact(KEY_A, "OldName")

        with (
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=existing,
            ),
            patch(
                "app.routers.contacts.ContactRepository.upsert",
                new_callable=AsyncMock,
            ) as mock_upsert,
        ):
            from app.main import app

            client = TestClient(app)
            response = client.post(
                "/api/contacts",
                json={"public_key": KEY_A, "name": "NewName"},
            )

            assert response.status_code == 200
            # Upsert called with new name
            mock_upsert.assert_called_once()
            upsert_data = mock_upsert.call_args[0][0]
            assert upsert_data["name"] == "NewName"


class TestGetContact:
    """Test GET /api/contacts/{public_key}."""

    def test_get_existing(self):
        from fastapi.testclient import TestClient

        contact = _make_contact(KEY_A, "Alice")

        with patch(
            "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
            new_callable=AsyncMock,
            return_value=contact,
        ):
            from app.main import app

            client = TestClient(app)
            response = client.get(f"/api/contacts/{KEY_A}")

            assert response.status_code == 200
            assert response.json()["name"] == "Alice"

    def test_get_not_found(self):
        from fastapi.testclient import TestClient

        with patch(
            "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
            new_callable=AsyncMock,
            return_value=None,
        ):
            from app.main import app

            client = TestClient(app)
            response = client.get(f"/api/contacts/{KEY_A}")

            assert response.status_code == 404


class TestMarkRead:
    """Test POST /api/contacts/{public_key}/mark-read."""

    def test_mark_read_updates_timestamp(self):
        from fastapi.testclient import TestClient

        contact = _make_contact(KEY_A)

        with (
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=contact,
            ),
            patch(
                "app.routers.contacts.ContactRepository.update_last_read_at",
                new_callable=AsyncMock,
                return_value=True,
            ),
        ):
            from app.main import app

            client = TestClient(app)
            response = client.post(f"/api/contacts/{KEY_A}/mark-read")

            assert response.status_code == 200
            assert response.json()["status"] == "ok"

    def test_mark_read_not_found(self):
        from fastapi.testclient import TestClient

        with patch(
            "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
            new_callable=AsyncMock,
            return_value=None,
        ):
            from app.main import app

            client = TestClient(app)
            response = client.post(f"/api/contacts/{KEY_A}/mark-read")

            assert response.status_code == 404


class TestDeleteContact:
    """Test DELETE /api/contacts/{public_key}."""

    def test_delete_existing(self):
        from fastapi.testclient import TestClient

        contact = _make_contact(KEY_A)

        with (
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=contact,
            ),
            patch(
                "app.routers.contacts.ContactRepository.delete",
                new_callable=AsyncMock,
            ),
            patch("app.routers.contacts.radio_manager") as mock_rm,
        ):
            mock_rm.is_connected = False
            mock_rm.meshcore = None

            from app.main import app

            client = TestClient(app)
            response = client.delete(f"/api/contacts/{KEY_A}")

            assert response.status_code == 200
            assert response.json()["status"] == "ok"

    def test_delete_not_found(self):
        from fastapi.testclient import TestClient

        with patch(
            "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
            new_callable=AsyncMock,
            return_value=None,
        ):
            from app.main import app

            client = TestClient(app)
            response = client.delete(f"/api/contacts/{KEY_A}")

            assert response.status_code == 404

    def test_delete_removes_from_radio_if_connected(self):
        """When radio is connected and contact is on radio, remove it first."""
        from fastapi.testclient import TestClient

        contact = _make_contact(KEY_A, on_radio=True)
        mock_radio_contact = MagicMock()

        mock_mc = MagicMock()
        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=mock_radio_contact)
        mock_mc.commands.remove_contact = AsyncMock()

        with (
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=contact,
            ),
            patch(
                "app.routers.contacts.ContactRepository.delete",
                new_callable=AsyncMock,
            ),
            patch("app.routers.contacts.radio_manager") as mock_rm,
        ):
            mock_rm.is_connected = True
            mock_rm.meshcore = mock_mc

            from app.main import app

            client = TestClient(app)
            response = client.delete(f"/api/contacts/{KEY_A}")

            assert response.status_code == 200
            mock_mc.commands.remove_contact.assert_called_once_with(mock_radio_contact)


class TestSyncContacts:
    """Test POST /api/contacts/sync."""

    def test_sync_from_radio(self):
        from fastapi.testclient import TestClient

        mock_mc = MagicMock()
        mock_result = MagicMock()
        mock_result.type = EventType.OK
        mock_result.payload = {
            KEY_A: {"adv_name": "Alice", "type": 1, "flags": 0},
            KEY_B: {"adv_name": "Bob", "type": 1, "flags": 0},
        }
        mock_mc.commands.get_contacts = AsyncMock(return_value=mock_result)

        with (
            patch("app.dependencies.radio_manager") as mock_dep_rm,
            patch(
                "app.routers.contacts.ContactRepository.upsert", new_callable=AsyncMock
            ) as mock_upsert,
        ):
            mock_dep_rm.is_connected = True
            mock_dep_rm.meshcore = mock_mc

            from app.main import app

            client = TestClient(app)
            response = client.post("/api/contacts/sync")

            assert response.status_code == 200
            assert response.json()["synced"] == 2
            assert mock_upsert.call_count == 2

    def test_sync_requires_connection(self):
        from fastapi.testclient import TestClient

        with patch("app.dependencies.radio_manager") as mock_rm:
            mock_rm.is_connected = False
            mock_rm.meshcore = None

            from app.main import app

            client = TestClient(app)
            response = client.post("/api/contacts/sync")

            assert response.status_code == 503


class TestAddRemoveRadio:
    """Test add-to-radio and remove-from-radio endpoints."""

    def test_add_to_radio(self):
        from fastapi.testclient import TestClient

        contact = _make_contact(KEY_A)
        mock_mc = MagicMock()
        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=None)  # Not on radio
        mock_result = MagicMock()
        mock_result.type = EventType.OK
        mock_mc.commands.add_contact = AsyncMock(return_value=mock_result)

        with (
            patch("app.dependencies.radio_manager") as mock_dep_rm,
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=contact,
            ),
            patch(
                "app.routers.contacts.ContactRepository.set_on_radio",
                new_callable=AsyncMock,
            ) as mock_set_on_radio,
        ):
            mock_dep_rm.is_connected = True
            mock_dep_rm.meshcore = mock_mc

            from app.main import app

            client = TestClient(app)
            response = client.post(f"/api/contacts/{KEY_A}/add-to-radio")

            assert response.status_code == 200
            mock_mc.commands.add_contact.assert_called_once()
            mock_set_on_radio.assert_called_once_with(KEY_A, True)

    def test_add_already_on_radio(self):
        """Adding a contact already on radio returns ok without calling add_contact."""
        from fastapi.testclient import TestClient

        contact = _make_contact(KEY_A, on_radio=True)
        mock_mc = MagicMock()
        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=MagicMock())  # On radio

        with (
            patch("app.dependencies.radio_manager") as mock_dep_rm,
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=contact,
            ),
        ):
            mock_dep_rm.is_connected = True
            mock_dep_rm.meshcore = mock_mc

            from app.main import app

            client = TestClient(app)
            response = client.post(f"/api/contacts/{KEY_A}/add-to-radio")

            assert response.status_code == 200
            assert "already" in response.json()["message"].lower()

    def test_remove_from_radio(self):
        from fastapi.testclient import TestClient

        contact = _make_contact(KEY_A, on_radio=True)
        mock_radio_contact = MagicMock()
        mock_mc = MagicMock()
        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=mock_radio_contact)
        mock_result = MagicMock()
        mock_result.type = EventType.OK
        mock_mc.commands.remove_contact = AsyncMock(return_value=mock_result)

        with (
            patch("app.dependencies.radio_manager") as mock_dep_rm,
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=contact,
            ),
            patch(
                "app.routers.contacts.ContactRepository.set_on_radio",
                new_callable=AsyncMock,
            ) as mock_set_on_radio,
        ):
            mock_dep_rm.is_connected = True
            mock_dep_rm.meshcore = mock_mc

            from app.main import app

            client = TestClient(app)
            response = client.post(f"/api/contacts/{KEY_A}/remove-from-radio")

            assert response.status_code == 200
            mock_mc.commands.remove_contact.assert_called_once_with(mock_radio_contact)
            mock_set_on_radio.assert_called_once_with(KEY_A, False)

    def test_add_requires_connection(self):
        from fastapi.testclient import TestClient

        with patch("app.dependencies.radio_manager") as mock_rm:
            mock_rm.is_connected = False
            mock_rm.meshcore = None

            from app.main import app

            client = TestClient(app)
            response = client.post(f"/api/contacts/{KEY_A}/add-to-radio")

            assert response.status_code == 503

    def test_remove_not_found(self):
        from fastapi.testclient import TestClient

        mock_mc = MagicMock()

        with (
            patch("app.dependencies.radio_manager") as mock_dep_rm,
            patch(
                "app.routers.contacts.ContactRepository.get_by_key_or_prefix",
                new_callable=AsyncMock,
                return_value=None,
            ),
        ):
            mock_dep_rm.is_connected = True
            mock_dep_rm.meshcore = mock_mc

            from app.main import app

            client = TestClient(app)
            response = client.post(f"/api/contacts/{KEY_A}/remove-from-radio")

            assert response.status_code == 404
@@ -254,7 +254,7 @@ class TestAdvertisementParsing:

     def test_parse_repeater_advertisement_with_gps(self):
         """Parse a repeater advertisement with GPS coordinates."""
-        from app.decoder import try_parse_advertisement
+        from app.decoder import parse_advertisement, parse_packet

         # Repeater packet with lat/lon of 49.02056 / -123.82935
         # Flags 0x92: Role=Repeater (2), Location=Yes, Name=Yes
@@ -266,7 +266,9 @@ class TestAdvertisementParsing:
         )
         packet = bytes.fromhex(packet_hex)

-        result = try_parse_advertisement(packet)
+        info = parse_packet(packet)
+        assert info is not None
+        result = parse_advertisement(info.payload)

         assert result is not None
         assert (
@@ -282,7 +284,7 @@ class TestAdvertisementParsing:

     def test_parse_chat_node_advertisement_with_gps(self):
         """Parse a chat node advertisement with GPS coordinates."""
-        from app.decoder import try_parse_advertisement
+        from app.decoder import parse_advertisement, parse_packet

         # Chat node packet with lat/lon of 47.786038 / -122.344096
         # Flags 0x91: Role=Chat (1), Location=Yes, Name=Yes
@@ -294,7 +296,9 @@ class TestAdvertisementParsing:
         )
         packet = bytes.fromhex(packet_hex)

-        result = try_parse_advertisement(packet)
+        info = parse_packet(packet)
+        assert info is not None
+        result = parse_advertisement(info.payload)

         assert result is not None
         assert (
@@ -310,7 +314,7 @@ class TestAdvertisementParsing:

     def test_parse_advertisement_without_gps(self):
         """Parse an advertisement without GPS coordinates."""
-        from app.decoder import try_parse_advertisement
+        from app.decoder import parse_advertisement, parse_packet

         # Chat node packet without location
         # Flags 0x81: Role=Chat (1), Location=No, Name=Yes
@@ -322,7 +326,9 @@ class TestAdvertisementParsing:
         )
         packet = bytes.fromhex(packet_hex)

-        result = try_parse_advertisement(packet)
+        info = parse_packet(packet)
+        assert info is not None
+        result = parse_advertisement(info.payload)

         assert result is not None
         assert (
@@ -352,15 +358,15 @@ class TestAdvertisementParsing:
         assert info.payload_type == PayloadType.ADVERT

     def test_non_advertisement_returns_none(self):
-        """Non-advertisement packets return None from try_parse_advertisement."""
-        from app.decoder import try_parse_advertisement
+        """Non-advertisement packets return None when parsed as advertisement."""
+        from app.decoder import PayloadType, parse_packet

         # GROUP_TEXT packet, not an advertisement
         packet = bytes([0x15, 0x00]) + bytes(50)

-        result = try_parse_advertisement(packet)
-
-        assert result is None
+        info = parse_packet(packet)
+        assert info is not None
+        assert info.payload_type != PayloadType.ADVERT


 class TestScalarClamping:
tests/test_echo_dedup.py (new file, 691 lines)
@@ -0,0 +1,691 @@
"""Tests for echo detection, ack counting, path accumulation, and dual-path deduplication.

These tests exercise the critical duplicate-handling branches in packet_processor.py
and event_handlers.py that detect mesh echoes, increment ack counts for outgoing
messages, accumulate multi-path routing info, and ensure the dual DM processing
paths (packet_processor + event_handler fallback) don't double-store messages.
"""

from unittest.mock import AsyncMock, MagicMock, patch

import pytest

from app.database import Database
from app.decoder import DecryptedDirectMessage
from app.repository import (
    ContactRepository,
    MessageRepository,
    RawPacketRepository,
)


@pytest.fixture
async def test_db():
    """Create an in-memory test database."""
    import app.repository as repo_module

    db = Database(":memory:")
    await db.connect()

    original_db = repo_module.db
    repo_module.db = db

    try:
        yield db
    finally:
        repo_module.db = original_db
        await db.disconnect()


@pytest.fixture
def captured_broadcasts():
    """Capture WebSocket broadcasts for verification."""
    broadcasts = []

    def mock_broadcast(event_type: str, data: dict):
        broadcasts.append({"type": event_type, "data": data})

    return broadcasts, mock_broadcast


# Shared test constants
CHANNEL_KEY = "ABC123DEF456ABC123DEF456ABC12345"
CONTACT_PUB = "a1b2c3d3ba9f5fa8705b9845fe11cc6f01d1d49caaf4d122ac7121663c5beec7"
OUR_PUB = "FACE123334789E2B81519AFDBC39A3C9EB7EA3457AD367D3243597A484847E46"
SENDER_TIMESTAMP = 1700000000


class TestChannelEchoDetection:
    """Test echo detection for outgoing channel messages.

    When we send a channel message via flood routing, it echoes back through
    repeaters. The duplicate-detection branch in create_message_from_decrypted
    should detect the echo, increment ack_count, and add the echo's path.
    """

    @pytest.mark.asyncio
    async def test_outgoing_echo_increments_ack_and_adds_path(self, test_db, captured_broadcasts):
        """Outgoing channel message echo increments ack count and adds path."""
        from app.packet_processor import create_message_from_decrypted

        # Store the outgoing message (as the send endpoint would)
        msg_id = await MessageRepository.create(
            msg_type="CHAN",
            text="Sender: Hello mesh",
            conversation_key=CHANNEL_KEY,
            sender_timestamp=SENDER_TIMESTAMP,
            received_at=SENDER_TIMESTAMP,
            outgoing=True,
        )
        assert msg_id is not None

        # Create a raw packet for the echo
        packet_id, _ = await RawPacketRepository.create(b"echo_packet_1", SENDER_TIMESTAMP + 1)

        broadcasts, mock_broadcast = captured_broadcasts

        with patch("app.packet_processor.broadcast_event", mock_broadcast):
            # Process the echo (same content, different path)
            result = await create_message_from_decrypted(
                packet_id=packet_id,
                channel_key=CHANNEL_KEY,
                sender="Sender",
                message_text="Hello mesh",
                timestamp=SENDER_TIMESTAMP,
                received_at=SENDER_TIMESTAMP + 1,
                path="aabb",
            )

        # Should return None (duplicate)
        assert result is None

        # Should broadcast message_acked
        ack_broadcasts = [b for b in broadcasts if b["type"] == "message_acked"]
        assert len(ack_broadcasts) == 1
        assert ack_broadcasts[0]["data"]["message_id"] == msg_id
        assert ack_broadcasts[0]["data"]["ack_count"] == 1
        # Path should be in the broadcast
        assert len(ack_broadcasts[0]["data"]["paths"]) >= 1
        assert any(p["path"] == "aabb" for p in ack_broadcasts[0]["data"]["paths"])

        # Should NOT broadcast a new message
        message_broadcasts = [b for b in broadcasts if b["type"] == "message"]
        assert len(message_broadcasts) == 0

        # Verify DB state
        msg = await MessageRepository.get_by_content(
            msg_type="CHAN",
            conversation_key=CHANNEL_KEY,
            text="Sender: Hello mesh",
            sender_timestamp=SENDER_TIMESTAMP,
        )
        assert msg is not None
        assert msg.acked == 1
        assert msg.paths is not None
        assert any(p.path == "aabb" for p in msg.paths)

    @pytest.mark.asyncio
    async def test_multiple_echoes_increment_progressively(self, test_db, captured_broadcasts):
        """Multiple echoes of the same outgoing message increment ack count progressively."""
        from app.packet_processor import create_message_from_decrypted

        # Store outgoing message
        await MessageRepository.create(
            msg_type="CHAN",
            text="Sender: Flood test",
            conversation_key=CHANNEL_KEY,
            sender_timestamp=SENDER_TIMESTAMP,
            received_at=SENDER_TIMESTAMP,
            outgoing=True,
        )

        broadcasts, mock_broadcast = captured_broadcasts
        echo_paths = ["aa", "bbcc", "ddeeff"]

        with patch("app.packet_processor.broadcast_event", mock_broadcast):
            for i, path in enumerate(echo_paths):
                pkt_id, _ = await RawPacketRepository.create(
                    f"echo_{i}".encode(), SENDER_TIMESTAMP + i + 1
                )
                await create_message_from_decrypted(
                    packet_id=pkt_id,
                    channel_key=CHANNEL_KEY,
                    sender="Sender",
                    message_text="Flood test",
                    timestamp=SENDER_TIMESTAMP,
                    received_at=SENDER_TIMESTAMP + i + 1,
                    path=path,
                )

        # Should have 3 message_acked broadcasts with increasing counts
        ack_broadcasts = [b for b in broadcasts if b["type"] == "message_acked"]
        assert len(ack_broadcasts) == 3
        assert ack_broadcasts[0]["data"]["ack_count"] == 1
        assert ack_broadcasts[1]["data"]["ack_count"] == 2
        assert ack_broadcasts[2]["data"]["ack_count"] == 3

        # Final paths should have all 3 echo paths
        final_paths = ack_broadcasts[2]["data"]["paths"]
        path_values = [p["path"] for p in final_paths]
        for p in echo_paths:
            assert p in path_values

    @pytest.mark.asyncio
    async def test_incoming_duplicate_does_not_increment_ack(self, test_db, captured_broadcasts):
        """Duplicate of incoming (non-outgoing) channel message does NOT increment ack."""
        from app.packet_processor import create_message_from_decrypted

        # First packet creates the incoming message
        pkt1, _ = await RawPacketRepository.create(b"incoming_1", SENDER_TIMESTAMP)

        broadcasts, mock_broadcast = captured_broadcasts

        with patch("app.packet_processor.broadcast_event", mock_broadcast):
            msg_id = await create_message_from_decrypted(
                packet_id=pkt1,
                channel_key=CHANNEL_KEY,
                sender="OtherUser",
                message_text="Incoming msg",
                timestamp=SENDER_TIMESTAMP,
                received_at=SENDER_TIMESTAMP,
                path="aa",
            )

        assert msg_id is not None

        # Clear broadcasts for the echo
        broadcasts.clear()

        # Second packet is the echo (same content, different path)
        pkt2, _ = await RawPacketRepository.create(b"incoming_2", SENDER_TIMESTAMP + 1)

        with patch("app.packet_processor.broadcast_event", mock_broadcast):
            result = await create_message_from_decrypted(
                packet_id=pkt2,
                channel_key=CHANNEL_KEY,
                sender="OtherUser",
                message_text="Incoming msg",
                timestamp=SENDER_TIMESTAMP,
                received_at=SENDER_TIMESTAMP + 1,
                path="bbcc",
            )

        assert result is None

        # Should broadcast message_acked but ack_count should be 0 (not incremented)
        ack_broadcasts = [b for b in broadcasts if b["type"] == "message_acked"]
        assert len(ack_broadcasts) == 1
        assert ack_broadcasts[0]["data"]["ack_count"] == 0

        # Path should still be added
        paths = ack_broadcasts[0]["data"]["paths"]
        path_values = [p["path"] for p in paths]
        assert "aa" in path_values
        assert "bbcc" in path_values


class TestDMEchoDetection:
    """Test echo detection for direct messages."""

    @pytest.mark.asyncio
    async def test_outgoing_dm_echo_increments_ack(self, test_db, captured_broadcasts):
        """Outgoing DM echo increments ack count."""
        from app.packet_processor import create_dm_message_from_decrypted

        # Store outgoing DM
        msg_id = await MessageRepository.create(
            msg_type="PRIV",
            text="Hello friend",
            conversation_key=CONTACT_PUB.lower(),
            sender_timestamp=SENDER_TIMESTAMP,
            received_at=SENDER_TIMESTAMP,
            outgoing=True,
        )
        assert msg_id is not None

        # Echo arrives
        pkt_id, _ = await RawPacketRepository.create(b"dm_echo", SENDER_TIMESTAMP + 1)
        decrypted = DecryptedDirectMessage(
            timestamp=SENDER_TIMESTAMP,
            flags=0,
            message="Hello friend",
            dest_hash="a1",
            src_hash="fa",
        )

        broadcasts, mock_broadcast = captured_broadcasts

        with patch("app.packet_processor.broadcast_event", mock_broadcast):
            result = await create_dm_message_from_decrypted(
                packet_id=pkt_id,
                decrypted=decrypted,
                their_public_key=CONTACT_PUB,
                our_public_key=OUR_PUB,
                received_at=SENDER_TIMESTAMP + 1,
                path="aabb",
                outgoing=True,
            )

        assert result is None

        ack_broadcasts = [b for b in broadcasts if b["type"] == "message_acked"]
        assert len(ack_broadcasts) == 1
        assert ack_broadcasts[0]["data"]["ack_count"] == 1
        assert any(p["path"] == "aabb" for p in ack_broadcasts[0]["data"]["paths"])

    @pytest.mark.asyncio
    async def test_incoming_dm_duplicate_does_not_increment_ack(self, test_db, captured_broadcasts):
        """Duplicate of incoming DM does NOT increment ack."""
        from app.packet_processor import create_dm_message_from_decrypted

        # First: create the incoming message
        pkt1, _ = await RawPacketRepository.create(b"dm_in_1", SENDER_TIMESTAMP)
        decrypted = DecryptedDirectMessage(
            timestamp=SENDER_TIMESTAMP,
            flags=0,
            message="Hi from mesh",
            dest_hash="fa",
            src_hash="a1",
        )

        broadcasts, mock_broadcast = captured_broadcasts

        with patch("app.packet_processor.broadcast_event", mock_broadcast):
            msg_id = await create_dm_message_from_decrypted(
                packet_id=pkt1,
                decrypted=decrypted,
                their_public_key=CONTACT_PUB,
                our_public_key=OUR_PUB,
                received_at=SENDER_TIMESTAMP,
                path="aa",
                outgoing=False,
            )

        assert msg_id is not None
        broadcasts.clear()

        # Duplicate arrives via different path
        pkt2, _ = await RawPacketRepository.create(b"dm_in_2", SENDER_TIMESTAMP + 1)

        with patch("app.packet_processor.broadcast_event", mock_broadcast):
            result = await create_dm_message_from_decrypted(
                packet_id=pkt2,
                decrypted=decrypted,
                their_public_key=CONTACT_PUB,
                our_public_key=OUR_PUB,
                received_at=SENDER_TIMESTAMP + 1,
                path="bbcc",
                outgoing=False,
            )

        assert result is None

        ack_broadcasts = [b for b in broadcasts if b["type"] == "message_acked"]
        assert len(ack_broadcasts) == 1
        assert ack_broadcasts[0]["data"]["ack_count"] == 0  # NOT incremented

        # Path still added
        paths = ack_broadcasts[0]["data"]["paths"]
        path_values = [p["path"] for p in paths]
        assert "bbcc" in path_values


class TestDualPathDedup:
    """Test deduplication between the packet_processor and event_handler fallback paths.

    DMs can be processed by two paths:
    1. Primary: RX_LOG_DATA → packet_processor (decrypts with private key)
    2. Fallback: CONTACT_MSG_RECV → on_contact_message (MeshCore library decoded)

    The fallback uses INSERT OR IGNORE to avoid double-storage when both fire.
    """

    @pytest.mark.asyncio
    async def test_event_handler_deduplicates_against_packet_processor(
        self, test_db, captured_broadcasts
    ):
        """on_contact_message does not double-store when packet_processor already handled it."""
        from app.event_handlers import on_contact_message
        from app.packet_processor import create_dm_message_from_decrypted

        # 1) Packet processor stores the message first
        pkt_id, _ = await RawPacketRepository.create(b"primary_path", SENDER_TIMESTAMP)
        decrypted = DecryptedDirectMessage(
            timestamp=SENDER_TIMESTAMP,
            flags=0,
            message="Dedup test message",
            dest_hash="fa",
            src_hash="a1",
        )

        broadcasts, mock_broadcast = captured_broadcasts

        with patch("app.packet_processor.broadcast_event", mock_broadcast):
            msg_id = await create_dm_message_from_decrypted(
                packet_id=pkt_id,
                decrypted=decrypted,
                their_public_key=CONTACT_PUB,
                our_public_key=OUR_PUB,
                received_at=SENDER_TIMESTAMP,
                outgoing=False,
            )

        assert msg_id is not None

        # Record broadcast count after packet_processor
        broadcasts_after_primary = len(broadcasts)

        # 2) Event handler fires with the same message content
        mock_event = MagicMock()
        mock_event.payload = {
            "public_key": CONTACT_PUB,
            "text": "Dedup test message",
            "txt_type": 0,
            "sender_timestamp": SENDER_TIMESTAMP,
        }

        # Mock contact lookup to return a contact with the right key
        mock_contact = MagicMock()
        mock_contact.public_key = CONTACT_PUB
        mock_contact.type = 1  # Client, not repeater
        mock_contact.name = "TestContact"

        with (
            patch("app.event_handlers.ContactRepository") as mock_contact_repo,
            patch("app.event_handlers.broadcast_event", mock_broadcast),
        ):
            mock_contact_repo.get_by_key_or_prefix = AsyncMock(return_value=mock_contact)
            mock_contact_repo.update_last_contacted = AsyncMock()

            await on_contact_message(mock_event)

        # No additional message broadcast should have been sent
        new_message_broadcasts = [
|
||||
b for b in broadcasts[broadcasts_after_primary:] if b["type"] == "message"
|
||||
]
|
||||
assert len(new_message_broadcasts) == 0
|
||||
|
||||
# Only one message in DB
|
||||
messages = await MessageRepository.get_all(
|
||||
msg_type="PRIV", conversation_key=CONTACT_PUB.lower(), limit=10
|
||||
)
|
||||
assert len(messages) == 1
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_case_consistency_between_paths(self, test_db, captured_broadcasts):
|
||||
"""Event handler lowercases conversation_key to match packet_processor.
|
||||
|
||||
This tests the fix applied to event_handlers.py where contact.public_key
|
||||
is now lowercased before being used as conversation_key.
|
||||
"""
|
||||
from app.event_handlers import on_contact_message
|
||||
from app.packet_processor import create_dm_message_from_decrypted
|
||||
|
||||
# Use an uppercase key to exercise the case sensitivity path
|
||||
upper_key = CONTACT_PUB.upper()
|
||||
|
||||
# 1) Packet processor stores with lowercased key (always)
|
||||
pkt_id, _ = await RawPacketRepository.create(b"case_test", SENDER_TIMESTAMP)
|
||||
decrypted = DecryptedDirectMessage(
|
||||
timestamp=SENDER_TIMESTAMP,
|
||||
flags=0,
|
||||
message="Case sensitivity test",
|
||||
dest_hash="fa",
|
||||
src_hash="a1",
|
||||
)
|
||||
|
||||
broadcasts, mock_broadcast = captured_broadcasts
|
||||
|
||||
with patch("app.packet_processor.broadcast_event", mock_broadcast):
|
||||
msg_id = await create_dm_message_from_decrypted(
|
||||
packet_id=pkt_id,
|
||||
decrypted=decrypted,
|
||||
their_public_key=upper_key, # Uppercase input
|
||||
our_public_key=OUR_PUB,
|
||||
received_at=SENDER_TIMESTAMP,
|
||||
outgoing=False,
|
||||
)
|
||||
|
||||
assert msg_id is not None
|
||||
|
||||
# Verify it was stored with lowercase key
|
||||
messages = await MessageRepository.get_all(
|
||||
msg_type="PRIV", conversation_key=upper_key.lower(), limit=10
|
||||
)
|
||||
assert len(messages) == 1
|
||||
|
||||
broadcasts_after_primary = len(broadcasts)
|
||||
|
||||
# 2) Event handler fires - contact DB returns uppercase key
|
||||
mock_event = MagicMock()
|
||||
mock_event.payload = {
|
||||
"public_key": upper_key,
|
||||
"text": "Case sensitivity test",
|
||||
"txt_type": 0,
|
||||
"sender_timestamp": SENDER_TIMESTAMP,
|
||||
}
|
||||
|
||||
mock_contact = MagicMock()
|
||||
mock_contact.public_key = upper_key # Uppercase from DB
|
||||
mock_contact.type = 1
|
||||
mock_contact.name = "TestContact"
|
||||
|
||||
with (
|
||||
patch("app.event_handlers.ContactRepository") as mock_contact_repo,
|
||||
patch("app.event_handlers.broadcast_event", mock_broadcast),
|
||||
):
|
||||
mock_contact_repo.get_by_key_or_prefix = AsyncMock(return_value=mock_contact)
|
||||
mock_contact_repo.update_last_contacted = AsyncMock()
|
||||
|
||||
await on_contact_message(mock_event)
|
||||
|
||||
# Should NOT create a second message (dedup catches it thanks to .lower())
|
||||
new_message_broadcasts = [
|
||||
b for b in broadcasts[broadcasts_after_primary:] if b["type"] == "message"
|
||||
]
|
||||
assert len(new_message_broadcasts) == 0
|
||||
|
||||
# Still only one message in DB
|
||||
messages = await MessageRepository.get_all(
|
||||
msg_type="PRIV", conversation_key=upper_key.lower(), limit=10
|
||||
)
|
||||
assert len(messages) == 1
|
||||
|
||||
|
||||
class TestDirectMessageDirectionDetection:
|
||||
"""Test src_hash/dest_hash direction detection in _process_direct_message.
|
||||
|
||||
The packet processor uses the first byte of public keys to determine
|
||||
message direction. This is a subtle 1-byte hash comparison with an
|
||||
ambiguous case when both bytes match (1/256 chance).
|
||||
"""
|
||||
|
||||
OUR_PUB_BYTES = bytes.fromhex(OUR_PUB)
|
||||
OUR_FIRST_BYTE = format(OUR_PUB_BYTES[0], "02x") # "fa"
|
||||
|
||||
# Contact whose first byte differs from ours
|
||||
DIFFERENT_CONTACT_PUB = CONTACT_PUB # starts with "a1"
|
||||
DIFFERENT_FIRST_BYTE = "a1"
|
||||
|
||||
# Contact whose first byte matches ours ("fa...")
|
||||
SAME_BYTE_CONTACT_PUB = (
|
||||
"fa" + "b2c3d3ba9f5fa8705b9845fe11cc6f01d1d49caaf4d122ac7121663c5beec7"[2:]
|
||||
)
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_incoming_message_detected(self, test_db, captured_broadcasts):
|
||||
"""dest_hash matches us, src_hash doesn't → incoming."""
|
||||
from app.packet_processor import _process_direct_message
|
||||
|
||||
# Build a minimal packet_info where payload has [dest_hash=fa, src_hash=a1, ...]
|
||||
packet_info = MagicMock()
|
||||
packet_info.payload = bytes([0xFA, 0xA1, 0x00, 0x00]) + b"\x00" * 20
|
||||
packet_info.path = b""
|
||||
|
||||
# Create the contact so decryption can find a candidate
|
||||
await ContactRepository.upsert(
|
||||
{
|
||||
"public_key": self.DIFFERENT_CONTACT_PUB,
|
||||
"name": "TestContact",
|
||||
"type": 1,
|
||||
}
|
||||
)
|
||||
|
||||
decrypted = DecryptedDirectMessage(
|
||||
timestamp=SENDER_TIMESTAMP,
|
||||
flags=0,
|
||||
message="Incoming test",
|
||||
dest_hash=self.OUR_FIRST_BYTE,
|
||||
src_hash=self.DIFFERENT_FIRST_BYTE,
|
||||
)
|
||||
|
||||
pkt_id, _ = await RawPacketRepository.create(b"dir_test_in", SENDER_TIMESTAMP)
|
||||
|
||||
broadcasts, mock_broadcast = captured_broadcasts
|
||||
|
||||
with (
|
||||
patch("app.packet_processor.has_private_key", return_value=True),
|
||||
patch("app.packet_processor.get_private_key", return_value=b"\x00" * 32),
|
||||
patch("app.packet_processor.get_public_key", return_value=self.OUR_PUB_BYTES),
|
||||
patch("app.packet_processor.try_decrypt_dm", return_value=decrypted),
|
||||
patch("app.packet_processor.broadcast_event", mock_broadcast),
|
||||
):
|
||||
result = await _process_direct_message(
|
||||
b"\x00" * 40, pkt_id, SENDER_TIMESTAMP, packet_info
|
||||
)
|
||||
|
||||
assert result is not None
|
||||
assert result["decrypted"] is True
|
||||
|
||||
# Message should be stored as incoming (outgoing=False)
|
||||
messages = await MessageRepository.get_all(
|
||||
msg_type="PRIV", conversation_key=self.DIFFERENT_CONTACT_PUB.lower(), limit=10
|
||||
)
|
||||
assert len(messages) == 1
|
||||
assert messages[0].outgoing is False
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_outgoing_message_detected(self, test_db, captured_broadcasts):
|
||||
"""src_hash matches us, dest_hash doesn't → outgoing."""
|
||||
from app.packet_processor import _process_direct_message
|
||||
|
||||
packet_info = MagicMock()
|
||||
# dest_hash=a1 (contact), src_hash=fa (us)
|
||||
packet_info.payload = bytes([0xA1, 0xFA, 0x00, 0x00]) + b"\x00" * 20
|
||||
packet_info.path = b""
|
||||
|
||||
await ContactRepository.upsert(
|
||||
{
|
||||
"public_key": self.DIFFERENT_CONTACT_PUB,
|
||||
"name": "TestContact",
|
||||
"type": 1,
|
||||
}
|
||||
)
|
||||
|
||||
decrypted = DecryptedDirectMessage(
|
||||
timestamp=SENDER_TIMESTAMP,
|
||||
flags=0,
|
||||
message="Outgoing test",
|
||||
dest_hash=self.DIFFERENT_FIRST_BYTE,
|
||||
src_hash=self.OUR_FIRST_BYTE,
|
||||
)
|
||||
|
||||
pkt_id, _ = await RawPacketRepository.create(b"dir_test_out", SENDER_TIMESTAMP)
|
||||
|
||||
broadcasts, mock_broadcast = captured_broadcasts
|
||||
|
||||
with (
|
||||
patch("app.packet_processor.has_private_key", return_value=True),
|
||||
patch("app.packet_processor.get_private_key", return_value=b"\x00" * 32),
|
||||
patch("app.packet_processor.get_public_key", return_value=self.OUR_PUB_BYTES),
|
||||
patch("app.packet_processor.try_decrypt_dm", return_value=decrypted),
|
||||
patch("app.packet_processor.broadcast_event", mock_broadcast),
|
||||
):
|
||||
result = await _process_direct_message(
|
||||
b"\x00" * 40, pkt_id, SENDER_TIMESTAMP, packet_info
|
||||
)
|
||||
|
||||
assert result is not None
|
||||
|
||||
messages = await MessageRepository.get_all(
|
||||
msg_type="PRIV", conversation_key=self.DIFFERENT_CONTACT_PUB.lower(), limit=10
|
||||
)
|
||||
assert len(messages) == 1
|
||||
assert messages[0].outgoing is True
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_ambiguous_direction_defaults_to_incoming(self, test_db, captured_broadcasts):
|
||||
"""Both hash bytes match us → ambiguous → defaults to incoming."""
|
||||
from app.packet_processor import _process_direct_message
|
||||
|
||||
packet_info = MagicMock()
|
||||
# Both dest_hash and src_hash are 0xFA (our first byte)
|
||||
packet_info.payload = bytes([0xFA, 0xFA, 0x00, 0x00]) + b"\x00" * 20
|
||||
packet_info.path = b""
|
||||
|
||||
# Contact whose first byte also starts with "fa"
|
||||
await ContactRepository.upsert(
|
||||
{
|
||||
"public_key": self.SAME_BYTE_CONTACT_PUB,
|
||||
"name": "SameByteContact",
|
||||
"type": 1,
|
||||
}
|
||||
)
|
||||
|
||||
decrypted = DecryptedDirectMessage(
|
||||
timestamp=SENDER_TIMESTAMP,
|
||||
flags=0,
|
||||
message="Ambiguous direction",
|
||||
dest_hash=self.OUR_FIRST_BYTE,
|
||||
src_hash=self.OUR_FIRST_BYTE,
|
||||
)
|
||||
|
||||
pkt_id, _ = await RawPacketRepository.create(b"dir_test_ambig", SENDER_TIMESTAMP)
|
||||
|
||||
broadcasts, mock_broadcast = captured_broadcasts
|
||||
|
||||
with (
|
||||
patch("app.packet_processor.has_private_key", return_value=True),
|
||||
patch("app.packet_processor.get_private_key", return_value=b"\x00" * 32),
|
||||
patch("app.packet_processor.get_public_key", return_value=self.OUR_PUB_BYTES),
|
||||
patch("app.packet_processor.try_decrypt_dm", return_value=decrypted),
|
||||
patch("app.packet_processor.broadcast_event", mock_broadcast),
|
||||
):
|
||||
result = await _process_direct_message(
|
||||
b"\x00" * 40, pkt_id, SENDER_TIMESTAMP, packet_info
|
||||
)
|
||||
|
||||
assert result is not None
|
||||
|
||||
messages = await MessageRepository.get_all(
|
||||
msg_type="PRIV", conversation_key=self.SAME_BYTE_CONTACT_PUB.lower(), limit=10
|
||||
)
|
||||
assert len(messages) == 1
|
||||
assert messages[0].outgoing is False # Defaults to incoming
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_neither_hash_matches_returns_none(self, test_db, captured_broadcasts):
|
||||
"""Neither hash byte matches us → not our message → returns None."""
|
||||
from app.packet_processor import _process_direct_message
|
||||
|
||||
packet_info = MagicMock()
|
||||
# Neither byte matches our first byte (0xFA)
|
||||
packet_info.payload = bytes([0x11, 0x22, 0x00, 0x00]) + b"\x00" * 20
|
||||
packet_info.path = b""
|
||||
|
||||
pkt_id, _ = await RawPacketRepository.create(b"dir_test_none", SENDER_TIMESTAMP)
|
||||
|
||||
broadcasts, mock_broadcast = captured_broadcasts
|
||||
|
||||
with (
|
||||
patch("app.packet_processor.has_private_key", return_value=True),
|
||||
patch("app.packet_processor.get_private_key", return_value=b"\x00" * 32),
|
||||
patch("app.packet_processor.get_public_key", return_value=self.OUR_PUB_BYTES),
|
||||
patch("app.packet_processor.broadcast_event", mock_broadcast),
|
||||
):
|
||||
result = await _process_direct_message(
|
||||
b"\x00" * 40, pkt_id, SENDER_TIMESTAMP, packet_info
|
||||
)
|
||||
|
||||
# Not our message - should return None without attempting decryption
|
||||
assert result is None
|
||||
@@ -192,9 +192,10 @@ class TestContactMessageCLIFiltering:
            patch("app.event_handlers.MessageRepository") as mock_repo,
            patch("app.event_handlers.ContactRepository") as mock_contact_repo,
            patch("app.event_handlers.broadcast_event") as mock_broadcast,
+           patch("app.bot.run_bot_for_message", new_callable=AsyncMock),
        ):
            mock_repo.create = AsyncMock(return_value=42)
-           mock_contact_repo.get_by_key_prefix = AsyncMock(return_value=None)
+           mock_contact_repo.get_by_key_or_prefix = AsyncMock(return_value=None)

            class MockEvent:
                payload = {
@@ -220,9 +221,10 @@ class TestContactMessageCLIFiltering:
            patch("app.event_handlers.MessageRepository") as mock_repo,
            patch("app.event_handlers.ContactRepository") as mock_contact_repo,
            patch("app.event_handlers.broadcast_event") as mock_broadcast,
+           patch("app.bot.run_bot_for_message", new_callable=AsyncMock),
        ):
            mock_repo.create = AsyncMock(return_value=42)
-           mock_contact_repo.get_by_key_prefix = AsyncMock(return_value=None)
+           mock_contact_repo.get_by_key_or_prefix = AsyncMock(return_value=None)

            class MockEvent:
                payload = {
@@ -254,9 +256,10 @@ class TestContactMessageCLIFiltering:
            patch("app.event_handlers.MessageRepository") as mock_repo,
            patch("app.event_handlers.ContactRepository") as mock_contact_repo,
            patch("app.event_handlers.broadcast_event"),
+           patch("app.bot.run_bot_for_message", new_callable=AsyncMock),
        ):
            mock_repo.create = AsyncMock(return_value=42)
-           mock_contact_repo.get_by_key_prefix = AsyncMock(return_value=None)
+           mock_contact_repo.get_by_key_or_prefix = AsyncMock(return_value=None)

            class MockEvent:
                payload = {
tests/test_key_normalization.py (new file, 119 lines)
@@ -0,0 +1,119 @@
"""Tests for public key case normalization."""

import pytest

from app.database import Database
from app.repository import ContactRepository, MessageRepository


@pytest.fixture
async def test_db():
    """Create an in-memory test database."""
    import app.repository as repo_module

    db = Database(":memory:")
    await db.connect()

    original_db = repo_module.db
    repo_module.db = db

    try:
        yield db
    finally:
        repo_module.db = original_db
        await db.disconnect()


@pytest.mark.asyncio
async def test_upsert_stores_lowercase_key(test_db):
    await ContactRepository.upsert(
        {"public_key": "A1B2C3D4E5F6A1B2C3D4E5F6A1B2C3D4E5F6A1B2C3D4E5F6A1B2C3D4E5F6A1B2"}
    )
    contact = await ContactRepository.get_by_key(
        "a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2"
    )
    assert contact is not None
    assert contact.public_key == "a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2"


@pytest.mark.asyncio
async def test_get_by_key_case_insensitive(test_db):
    await ContactRepository.upsert(
        {"public_key": "a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2"}
    )
    contact = await ContactRepository.get_by_key(
        "A1B2C3D4E5F6A1B2C3D4E5F6A1B2C3D4E5F6A1B2C3D4E5F6A1B2C3D4E5F6A1B2"
    )
    assert contact is not None


@pytest.mark.asyncio
async def test_update_last_contacted_case_insensitive(test_db):
    key = "a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2"
    await ContactRepository.upsert({"public_key": key})

    await ContactRepository.update_last_contacted(key.upper(), 12345)
    contact = await ContactRepository.get_by_key(key)
    assert contact is not None
    assert contact.last_contacted == 12345


@pytest.mark.asyncio
async def test_get_by_pubkey_first_byte(test_db):
    key1 = "a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2"
    key2 = "a1ffddeeaabb1122334455667788990011223344556677889900aabbccddeeff00"
    key3 = "b2b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2"

    for key in [key1, key2, key3]:
        await ContactRepository.upsert({"public_key": key})

    results = await ContactRepository.get_by_pubkey_first_byte("a1")
    assert len(results) == 2
    result_keys = {c.public_key for c in results}
    assert key1 in result_keys
    assert key2 in result_keys

    results = await ContactRepository.get_by_pubkey_first_byte("A1")
    assert len(results) == 2  # case insensitive


@pytest.mark.asyncio
async def test_null_sender_timestamp_defaults_to_received_at(test_db):
    """Verify that a None/0 sender_timestamp is replaced by received_at."""
    msg_id = await MessageRepository.create(
        msg_type="PRIV",
        text="hello",
        conversation_key="abcd1234" * 8,
        sender_timestamp=500,  # simulates fallback: `payload.get("sender_timestamp") or received_at`
        received_at=500,
    )
    assert msg_id is not None

    messages = await MessageRepository.get_all(
        msg_type="PRIV", conversation_key="abcd1234" * 8, limit=10
    )
    assert len(messages) == 1
    assert messages[0].sender_timestamp == 500


@pytest.mark.asyncio
async def test_duplicate_with_same_text_and_null_timestamp_rejected(test_db):
    """Two messages with same content and sender_timestamp should be deduped."""
    received_at = 600
    msg_id1 = await MessageRepository.create(
        msg_type="PRIV",
        text="hello",
        conversation_key="abcd1234" * 8,
        sender_timestamp=received_at,
        received_at=received_at,
    )
    assert msg_id1 is not None

    msg_id2 = await MessageRepository.create(
        msg_type="PRIV",
        text="hello",
        conversation_key="abcd1234" * 8,
        sender_timestamp=received_at,
        received_at=received_at,
    )
    assert msg_id2 is None  # duplicate rejected
tests/test_keystore.py (new file, 159 lines)
@@ -0,0 +1,159 @@
"""Tests for the ephemeral keystore module.

Verifies private key storage, validation, public key derivation,
and the export_and_store_private_key flow with various radio responses.
"""

from unittest.mock import AsyncMock, MagicMock

import pytest
from meshcore import EventType

from app.keystore import (
    export_and_store_private_key,
    get_private_key,
    get_public_key,
    has_private_key,
    set_private_key,
)


@pytest.fixture(autouse=True)
def reset_keystore():
    """Reset keystore state before each test."""
    import app.keystore as ks

    ks._private_key = None
    ks._public_key = None
    yield
    ks._private_key = None
    ks._public_key = None


def _make_valid_private_key() -> bytes:
    """Create a valid 64-byte MeshCore private key for testing.

    The first 32 bytes are a clamped Ed25519 scalar,
    the last 32 bytes are the signing prefix.
    """
    # A clamped scalar: clear bottom 3 bits, set bit 254, clear bit 255
    scalar = bytearray(b"\x01" * 32)
    scalar[0] &= 0xF8  # Clear bottom 3 bits
    scalar[31] &= 0x7F  # Clear top bit
    scalar[31] |= 0x40  # Set bit 254
    prefix = b"\x02" * 32
    return bytes(scalar) + prefix


VALID_KEY = _make_valid_private_key()


class TestSetPrivateKey:
    """Test set_private_key validation and storage."""

    def test_stores_key_and_derives_public_key(self):
        """Valid 64-byte key is stored and public key is derived."""
        set_private_key(VALID_KEY)

        assert get_private_key() == VALID_KEY
        pub = get_public_key()
        assert pub is not None
        assert len(pub) == 32
        assert has_private_key() is True

    def test_rejects_wrong_length(self):
        """Keys that aren't 64 bytes are rejected."""
        with pytest.raises(ValueError, match="64 bytes"):
            set_private_key(b"\x00" * 32)

    def test_rejects_empty_key(self):
        """Empty key is rejected."""
        with pytest.raises(ValueError, match="64 bytes"):
            set_private_key(b"")

    def test_overwrites_previous_key(self):
        """Setting a new key replaces the old one."""
        set_private_key(VALID_KEY)
        pub1 = get_public_key()

        # Create a different valid key
        other_key = bytearray(VALID_KEY)
        other_key[1] = 0x42  # Change a byte in the scalar
        other_key = bytes(other_key)

        set_private_key(other_key)
        pub2 = get_public_key()

        assert get_private_key() == other_key
        assert pub1 != pub2


class TestGettersWhenEmpty:
    """Test getter behavior when no key is stored."""

    def test_get_private_key_returns_none(self):
        assert get_private_key() is None

    def test_get_public_key_returns_none(self):
        assert get_public_key() is None

    def test_has_private_key_false(self):
        assert has_private_key() is False


class TestExportAndStorePrivateKey:
    """Test the export_and_store_private_key flow with various radio responses."""

    @pytest.mark.asyncio
    async def test_success_stores_key(self):
        """Successful export stores the key in the keystore."""
        mock_mc = MagicMock()
        mock_result = MagicMock()
        mock_result.type = EventType.PRIVATE_KEY
        mock_result.payload = {"private_key": VALID_KEY}
        mock_mc.commands.export_private_key = AsyncMock(return_value=mock_result)

        result = await export_and_store_private_key(mock_mc)

        assert result is True
        assert has_private_key()
        assert get_private_key() == VALID_KEY

    @pytest.mark.asyncio
    async def test_disabled_returns_false(self):
        """DISABLED response returns False without storing."""
        mock_mc = MagicMock()
        mock_result = MagicMock()
        mock_result.type = EventType.DISABLED
        mock_result.payload = {}
        mock_mc.commands.export_private_key = AsyncMock(return_value=mock_result)

        result = await export_and_store_private_key(mock_mc)

        assert result is False
        assert not has_private_key()

    @pytest.mark.asyncio
    async def test_error_returns_false(self):
        """ERROR response returns False without storing."""
        mock_mc = MagicMock()
        mock_result = MagicMock()
        mock_result.type = EventType.ERROR
        mock_result.payload = {"error": "something went wrong"}
        mock_mc.commands.export_private_key = AsyncMock(return_value=mock_result)

        result = await export_and_store_private_key(mock_mc)

        assert result is False
        assert not has_private_key()

    @pytest.mark.asyncio
    async def test_exception_returns_false(self):
        """Exception during export returns False without storing."""
        mock_mc = MagicMock()
        mock_mc.commands.export_private_key = AsyncMock(side_effect=Exception("Connection lost"))

        result = await export_and_store_private_key(mock_mc)

        assert result is False
        assert not has_private_key()
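The bit manipulation in `_make_valid_private_key` above follows the standard Ed25519/curve25519 scalar-clamping rules. A standalone sketch of the same clamping applied to an arbitrary 32-byte seed (the helper name is illustrative, not part of the app):

```python
def clamp_scalar(raw: bytes) -> bytes:
    """Clamp 32 bytes into a valid curve25519-style scalar.

    Standard rules: clear the low 3 bits (so the scalar is a multiple of the
    cofactor 8), clear bit 255, and set bit 254 (fixed high bit).
    """
    if len(raw) != 32:
        raise ValueError("scalar must be 32 bytes")
    s = bytearray(raw)
    s[0] &= 0xF8   # clear bottom 3 bits
    s[31] &= 0x7F  # clear top bit
    s[31] |= 0x40  # set bit 254
    return bytes(s)


clamped = clamp_scalar(b"\x01" * 32)
assert clamped[0] == 0x00   # 0x01 & 0xF8
assert clamped[31] == 0x41  # (0x01 & 0x7F) | 0x40
```

Any byte the tests flip in positions 1..30 (as `test_overwrites_previous_key` does) keeps the key valid, since clamping only constrains the first and last bytes of the scalar half.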
tests/test_message_pagination.py (new file, 64 lines)
@@ -0,0 +1,64 @@
"""Tests for message pagination using cursor parameters."""

import pytest

from app.database import Database
from app.repository import MessageRepository


@pytest.fixture
async def test_db():
    """Create an in-memory test database."""
    import app.repository as repo_module

    db = Database(":memory:")
    await db.connect()

    original_db = repo_module.db
    repo_module.db = db

    try:
        yield db
    finally:
        repo_module.db = original_db
        await db.disconnect()


@pytest.mark.asyncio
async def test_cursor_pagination_avoids_overlap(test_db):
    key = "ABC123DEF456ABC123DEF456ABC12345"

    ids = []
    for received_at, text in [(200, "m1"), (200, "m2"), (150, "m3"), (100, "m4")]:
        msg_id = await MessageRepository.create(
            msg_type="CHAN",
            text=text,
            conversation_key=key,
            sender_timestamp=received_at,
            received_at=received_at,
        )
        assert msg_id is not None
        ids.append(msg_id)

    page1 = await MessageRepository.get_all(
        msg_type="CHAN",
        conversation_key=key,
        limit=2,
        offset=0,
    )
    assert len(page1) == 2

    oldest = page1[-1]
    page2 = await MessageRepository.get_all(
        msg_type="CHAN",
        conversation_key=key,
        limit=2,
        offset=0,
        before=oldest.received_at,
        before_id=oldest.id,
    )
    assert len(page2) == 2

    ids_page1 = {m.id for m in page1}
    ids_page2 = {m.id for m in page2}
    assert ids_page1.isdisjoint(ids_page2)
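The pagination test relies on a compound (received_at, id) cursor rather than a bare timestamp, so the two messages sharing received_at=200 cannot straddle a page boundary and reappear. A pure-Python sketch of the comparison such a query would need (a stand-in for the repository's SQL, with illustrative names):

```python
def before_cursor(row: dict, before: int, before_id: int) -> bool:
    """True if row sorts strictly after the cursor in (received_at DESC, id DESC) order.

    Equivalent SQL: (received_at < ?) OR (received_at = ? AND id < ?)
    """
    return row["received_at"] < before or (
        row["received_at"] == before and row["id"] < before_id
    )


# Four rows, newest first; two share the same received_at (the tie case)
rows = [
    {"id": 4, "received_at": 200},
    {"id": 3, "received_at": 200},
    {"id": 2, "received_at": 150},
    {"id": 1, "received_at": 100},
]
page1 = rows[:2]
cursor = page1[-1]  # oldest row on page 1: id=3, received_at=200
page2 = [r for r in rows if before_cursor(r, cursor["received_at"], cursor["id"])][:2]

assert [r["id"] for r in page2] == [2, 1]
assert {r["id"] for r in page1}.isdisjoint({r["id"] for r in page2})
```

With a timestamp-only cursor (`received_at < 200`), row id=3 would be dropped entirely and a `<=` variant would repeat id=4; the id tiebreaker avoids both failure modes.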
tests/test_message_prefix_claim.py (new file, 50 lines)
@@ -0,0 +1,50 @@
"""Tests for prefix-claiming DM messages."""

import pytest

from app.database import Database
from app.repository import MessageRepository


@pytest.fixture
async def test_db():
    """Create an in-memory test database."""
    import app.repository as repo_module

    db = Database(":memory:")
    await db.connect()

    original_db = repo_module.db
    repo_module.db = db

    try:
        yield db
    finally:
        repo_module.db = original_db
        await db.disconnect()


@pytest.mark.asyncio
async def test_claim_prefix_promotes_dm_to_full_key(test_db):
    full_key = "a1b2c3d3ba9f5fa8705b9845fe11cc6f01d1d49caaf4d122ac7121663c5beec7"
    prefix = full_key[:6].upper()

    msg_id = await MessageRepository.create(
        msg_type="PRIV",
        text="hello",
        conversation_key=prefix,
        sender_timestamp=123,
        received_at=123,
    )
    assert msg_id is not None

    updated = await MessageRepository.claim_prefix_messages(full_key)
    assert updated == 1

    messages = await MessageRepository.get_all(
        msg_type="PRIV",
        conversation_key=full_key,
        limit=10,
    )
    assert len(messages) == 1
    assert messages[0].conversation_key == full_key.lower()
@@ -100,8 +100,8 @@ class TestMigration001:
|
||||
# Run migrations
|
||||
applied = await run_migrations(conn)
|
||||
|
||||
assert applied == 9 # All 9 migrations run
|
||||
assert await get_version(conn) == 9
|
||||
assert applied == 15 # All 15 migrations run
|
||||
assert await get_version(conn) == 15
|
||||
|
||||
# Verify columns exist by inserting and selecting
|
||||
await conn.execute(
|
||||
@@ -183,9 +183,9 @@ class TestMigration001:
|
||||
applied1 = await run_migrations(conn)
|
||||
applied2 = await run_migrations(conn)
|
||||
|
||||
assert applied1 == 9 # All 9 migrations run
|
||||
assert applied1 == 15 # All 15 migrations run
|
||||
assert applied2 == 0 # No migrations on second run
|
||||
assert await get_version(conn) == 9
|
||||
assert await get_version(conn) == 15
|
||||
finally:
|
||||
await conn.close()
|
||||
|
||||
@@ -245,9 +245,9 @@ class TestMigration001:
|
||||
# Run migrations - should not fail
|
||||
applied = await run_migrations(conn)
|
||||
|
||||
# All 9 migrations applied (version incremented) but no error
|
||||
assert applied == 9
|
||||
assert await get_version(conn) == 9
|
||||
# All 15 migrations applied (version incremented) but no error
|
||||
assert applied == 15
|
||||
assert await get_version(conn) == 15
|
||||
finally:
|
||||
await conn.close()
|
||||
|
||||
@@ -337,3 +337,97 @@ class TestMigration001:
|
||||
assert row["last_read_at"] is None
|
||||
finally:
|
||||
await conn.close()
|
||||
|
||||
|
||||
class TestMigration013:
|
||||
"""Test migration 013: convert bot_enabled/bot_code to multi-bot format."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_migration_converts_existing_bot_to_array(self):
|
||||
"""Migration converts existing bot_enabled/bot_code to bots array."""
|
||||
import json
|
||||
|
||||
conn = await aiosqlite.connect(":memory:")
|
||||
conn.row_factory = aiosqlite.Row
|
||||
try:
|
||||
# Set version to 12 (just before migration 13)
|
||||
await set_version(conn, 12)
|
||||
|
||||
# Create app_settings with old bot columns
|
||||
await conn.execute("""
|
||||
                CREATE TABLE app_settings (
                    id INTEGER PRIMARY KEY,
                    max_radio_contacts INTEGER DEFAULT 50,
                    favorites TEXT DEFAULT '[]',
                    auto_decrypt_dm_on_advert INTEGER DEFAULT 0,
                    sidebar_sort_order TEXT DEFAULT 'recent',
                    last_message_times TEXT DEFAULT '{}',
                    preferences_migrated INTEGER DEFAULT 0,
                    advert_interval INTEGER DEFAULT 0,
                    last_advert_time INTEGER DEFAULT 0,
                    bot_enabled INTEGER DEFAULT 0,
                    bot_code TEXT DEFAULT ''
                )
            """)
            await conn.execute(
                "INSERT INTO app_settings (id, bot_enabled, bot_code) VALUES (1, 1, 'def bot(): return \"hello\"')"
            )
            await conn.commit()

            # Run migration 13 (plus 14+15 which also run)
            applied = await run_migrations(conn)
            assert applied == 3
            assert await get_version(conn) == 15

            # Verify bots array was created with migrated data
            cursor = await conn.execute("SELECT bots FROM app_settings WHERE id = 1")
            row = await cursor.fetchone()
            bots = json.loads(row["bots"])

            assert len(bots) == 1
            assert bots[0]["name"] == "Bot 1"
            assert bots[0]["enabled"] is True
            assert bots[0]["code"] == 'def bot(): return "hello"'
            assert "id" in bots[0]  # Should have a UUID
        finally:
            await conn.close()

    @pytest.mark.asyncio
    async def test_migration_creates_empty_array_when_no_bot(self):
        """Migration creates an empty bots array when there is no existing bot data."""
        import json

        conn = await aiosqlite.connect(":memory:")
        conn.row_factory = aiosqlite.Row
        try:
            await set_version(conn, 12)

            await conn.execute("""
                CREATE TABLE app_settings (
                    id INTEGER PRIMARY KEY,
                    max_radio_contacts INTEGER DEFAULT 50,
                    favorites TEXT DEFAULT '[]',
                    auto_decrypt_dm_on_advert INTEGER DEFAULT 0,
                    sidebar_sort_order TEXT DEFAULT 'recent',
                    last_message_times TEXT DEFAULT '{}',
                    preferences_migrated INTEGER DEFAULT 0,
                    advert_interval INTEGER DEFAULT 0,
                    last_advert_time INTEGER DEFAULT 0,
                    bot_enabled INTEGER DEFAULT 0,
                    bot_code TEXT DEFAULT ''
                )
            """)
            await conn.execute(
                "INSERT INTO app_settings (id, bot_enabled, bot_code) VALUES (1, 0, '')"
            )
            await conn.commit()

            await run_migrations(conn)

            cursor = await conn.execute("SELECT bots FROM app_settings WHERE id = 1")
            row = await cursor.fetchone()
            bots = json.loads(row["bots"])

            assert bots == []
        finally:
            await conn.close()
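The conversion these two migration tests pin down can be sketched in isolation. A minimal sketch of the legacy-column-to-JSON step, assuming the real migration follows the shape the assertions describe; the function name `migrate_bot_settings` is hypothetical, not the app's API:

```python
import json
import uuid


def migrate_bot_settings(bot_enabled: int, bot_code: str) -> str:
    """Convert the legacy single-bot columns into a JSON bots array.

    Hypothetical helper mirroring what migration 13 is asserted to do: a
    non-empty bot_code becomes a one-element array named "Bot 1" with a
    fresh UUID; otherwise the array is empty.
    """
    if not bot_code:
        return json.dumps([])
    return json.dumps(
        [
            {
                "id": str(uuid.uuid4()),
                "name": "Bot 1",
                "enabled": bool(bot_enabled),
                "code": bot_code,
            }
        ]
    )
```

The returned string is what would land in the new `bots` column, so both tests' `json.loads(row["bots"])` checks hold against it.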
@@ -168,7 +168,7 @@ class TestChannelMessagePipeline:
         assert result is not None

         # Raw packet should be stored
-        raw_packets = await RawPacketRepository.get_undecrypted(limit=10)
+        raw_packets = await RawPacketRepository.get_all_undecrypted()
         assert len(raw_packets) >= 1

         # No message broadcast (only raw_packet broadcast)
@@ -386,6 +386,58 @@ class TestAdvertisementPipeline:
         contact = await ContactRepository.get_by_key(test_pubkey)
         assert contact.last_path_len == 1  # Still the shorter path

+    @pytest.mark.asyncio
+    async def test_advertisement_replaces_stale_path_outside_window(
+        self, test_db, captured_broadcasts
+    ):
+        """When existing path is stale (>60s), a new longer path should replace it.
+
+        In a mesh network, a stale short path may no longer be valid (node moved, repeater
+        went offline). Accepting the new longer path ensures we have a working route.
+        """
+        from app.packet_processor import _process_advertisement
+
+        test_pubkey = "1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef"
+        await ContactRepository.upsert(
+            {
+                "public_key": test_pubkey,
+                "name": "TestNode",
+                "type": 1,
+                "last_seen": 1000,
+                "last_path_len": 1,  # Short path
+                "last_path": "aa",
+            }
+        )
+
+        from unittest.mock import MagicMock
+
+        from app.decoder import ParsedAdvertisement
+
+        broadcasts, mock_broadcast = captured_broadcasts
+
+        # New longer path arriving AFTER 60s window (timestamp 1000 + 61 = 1061)
+        long_packet_info = MagicMock()
+        long_packet_info.path_length = 4
+        long_packet_info.path = bytes.fromhex("aabbccdd")
+        long_packet_info.payload = b""
+
+        with patch("app.packet_processor.broadcast_event", mock_broadcast):
+            with patch("app.packet_processor.parse_advertisement") as mock_parse:
+                mock_parse.return_value = ParsedAdvertisement(
+                    public_key=test_pubkey,
+                    name="TestNode",
+                    timestamp=1061,
+                    lat=None,
+                    lon=None,
+                    device_role=1,
+                )
+                await _process_advertisement(b"", timestamp=1061, packet_info=long_packet_info)
+
+        # Verify the longer path replaced the stale shorter one
+        contact = await ContactRepository.get_by_key(test_pubkey)
+        assert contact.last_path_len == 4
+        assert contact.last_path == "aabbccdd"
+

 class TestAckPipeline:
     """Test ACK flow: outgoing message → ACK received → broadcast update."""
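The two advertisement tests above pin down a path-selection rule: a shorter path always wins, and a longer path is accepted only once the stored path has gone stale. A minimal sketch of that decision under the same assumptions the tests make (a 60-second freshness window; the function name is illustrative, not the app's API):

```python
STALE_WINDOW_SECONDS = 60  # assumed window, matching the ">60s" in the test docstring


def should_replace_path(
    stored_len: int, stored_seen: int, new_len: int, now: int
) -> bool:
    """Decide whether a newly heard path should replace the stored one."""
    if stored_len < 0:
        return True  # nothing stored yet
    if new_len <= stored_len:
        return True  # a shorter (or equal) path always wins
    # A longer path only replaces a path that has gone stale
    return (now - stored_seen) > STALE_WINDOW_SECONDS
```

With the test's numbers: a 4-hop path at t=1061 replaces a 1-hop path last seen at t=1000 (61s old), but the same path at t=1030 would be rejected.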
@@ -576,8 +628,8 @@ class TestCreateMessageFromDecrypted:
         )

         # Verify packet is marked decrypted (has message_id set)
-        undecrypted = await RawPacketRepository.get_undecrypted(limit=100)
-        packet_ids = [p.id for p in undecrypted]
+        undecrypted = await RawPacketRepository.get_all_undecrypted()
+        packet_ids = [p[0] for p in undecrypted]
         assert packet_id not in packet_ids  # Should be marked as decrypted

@@ -831,8 +883,8 @@ class TestCreateDMMessageFromDecrypted:
         )

         # Verify packet is marked decrypted
-        undecrypted = await RawPacketRepository.get_undecrypted(limit=100)
-        packet_ids = [p.id for p in undecrypted]
+        undecrypted = await RawPacketRepository.get_all_undecrypted()
+        packet_ids = [p[0] for p in undecrypted]
         assert packet_id not in packet_ids

     @pytest.mark.asyncio
@@ -939,5 +991,129 @@ class TestDMDecryptionFunction:
         assert messages[0].outgoing is False

         # Verify raw packet is linked
-        undecrypted = await RawPacketRepository.get_undecrypted(limit=100)
-        assert packet_id not in [p.id for p in undecrypted]
+        undecrypted = await RawPacketRepository.get_all_undecrypted()
+        assert packet_id not in [p[0] for p in undecrypted]
+
+
+class TestRepeaterMessageFiltering:
+    """Test that messages from repeaters are not stored in chat history.
+
+    Repeaters only send CLI responses (not chat messages), and these are handled
+    by the command endpoint. The packet processor filters them out based on
+    contact type to prevent duplicate storage.
+    """
+
+    # A repeater contact
+    REPEATER_PUB = "a1b2c3d3ba9f5fa8705b9845fe11cc6f01d1d49caaf4d122ac7121663c5beec7"
+    # A normal client contact
+    CLIENT_PUB = "b2c3d4e4cb0a6fb9816ca956ff22dd7f12e2e5adbbf5e233bd8232774d6cffe8"
+    # Our public key
+    OUR_PUB = "FACE123334789E2B81519AFDBC39A3C9EB7EA3457AD367D3243597A484847E46"
+
+    @pytest.mark.asyncio
+    async def test_repeater_message_not_stored(self, test_db, captured_broadcasts):
+        """Messages from repeaters should not be stored in database."""
+        from app.decoder import DecryptedDirectMessage
+        from app.models import CONTACT_TYPE_REPEATER
+        from app.packet_processor import create_dm_message_from_decrypted
+        from app.repository import ContactRepository, MessageRepository, RawPacketRepository
+
+        # Create a repeater contact first
+        await ContactRepository.upsert(
+            {
+                "public_key": self.REPEATER_PUB,
+                "name": "Test Repeater",
+                "type": CONTACT_TYPE_REPEATER,  # type=2 is repeater
+                "flags": 0,
+                "on_radio": False,
+            }
+        )
+
+        # Store a raw packet
+        packet_id, _ = await RawPacketRepository.create(b"\x09\x00test", 1700000000)
+
+        # Create a DecryptedDirectMessage (simulating a CLI response from repeater)
+        decrypted = DecryptedDirectMessage(
+            timestamp=1700000000,
+            flags=0,  # flags don't matter - we filter by contact type
+            message="cli response: version 1.0",
+            dest_hash="fa",
+            src_hash="a1",
+        )
+
+        broadcasts, mock_broadcast = captured_broadcasts
+
+        with patch("app.packet_processor.broadcast_event", mock_broadcast):
+            msg_id = await create_dm_message_from_decrypted(
+                packet_id=packet_id,
+                decrypted=decrypted,
+                their_public_key=self.REPEATER_PUB,
+                our_public_key=self.OUR_PUB,
+                received_at=1700000001,
+                outgoing=False,
+            )
+
+        # Should return None (not stored because sender is a repeater)
+        assert msg_id is None
+
+        # Should not broadcast
+        assert len(broadcasts) == 0
+
+        # Should not be in database
+        messages = await MessageRepository.get_all(
+            msg_type="PRIV", conversation_key=self.REPEATER_PUB.lower(), limit=10
+        )
+        assert len(messages) == 0
+
+    @pytest.mark.asyncio
+    async def test_client_message_still_stored(self, test_db, captured_broadcasts):
+        """Messages from normal clients should still be stored."""
+        from app.decoder import DecryptedDirectMessage
+        from app.packet_processor import create_dm_message_from_decrypted
+        from app.repository import ContactRepository, MessageRepository, RawPacketRepository
+
+        # Create a normal client contact (type=1)
+        await ContactRepository.upsert(
+            {
+                "public_key": self.CLIENT_PUB,
+                "name": "Test Client",
+                "type": 1,  # type=1 is client
+                "flags": 0,
+                "on_radio": False,
+            }
+        )
+
+        packet_id, _ = await RawPacketRepository.create(b"\x09\x00test2", 1700000000)
+
+        decrypted = DecryptedDirectMessage(
+            timestamp=1700000000,
+            flags=0,
+            message="Hello, world!",
+            dest_hash="fa",
+            src_hash="b2",
+        )
+
+        broadcasts, mock_broadcast = captured_broadcasts
+
+        with patch("app.packet_processor.broadcast_event", mock_broadcast):
+            msg_id = await create_dm_message_from_decrypted(
+                packet_id=packet_id,
+                decrypted=decrypted,
+                their_public_key=self.CLIENT_PUB,
+                our_public_key=self.OUR_PUB,
+                received_at=1700000001,
+                outgoing=False,
+            )
+
+        # Should return message ID (stored because sender is a client)
+        assert msg_id is not None
+
+        # Should broadcast
+        message_broadcasts = [b for b in broadcasts if b["type"] == "message"]
+        assert len(message_broadcasts) == 1
+
+        # Should be in database
+        messages = await MessageRepository.get_all(
+            msg_type="PRIV", conversation_key=self.CLIENT_PUB.lower(), limit=10
+        )
+        assert len(messages) == 1
@@ -7,22 +7,27 @@ message polling from interfering with repeater CLI operations.
 from unittest.mock import AsyncMock, MagicMock, patch

 import pytest
+from meshcore import EventType

+from app.models import Contact
 from app.radio_sync import (
     is_polling_paused,
     pause_polling,
     sync_radio_time,
+    sync_recent_contacts_to_radio,
 )


 @pytest.fixture(autouse=True)
-def reset_polling_state():
-    """Reset polling pause state before and after each test."""
+def reset_sync_state():
+    """Reset polling pause state and sync timestamp before and after each test."""
     import app.radio_sync as radio_sync

     radio_sync._polling_pause_count = 0
+    radio_sync._last_contact_sync = 0.0
     yield
     radio_sync._polling_pause_count = 0
+    radio_sync._last_contact_sync = 0.0


 class TestPollingPause:
@@ -158,3 +163,263 @@ class TestSyncRadioTime:
             result = await sync_radio_time()

             assert result is False
+
+
+KEY_A = "aa" * 32
+KEY_B = "bb" * 32
+
+
+def _make_contact(public_key=KEY_A, name="Alice", on_radio=False, **overrides):
+    """Create a Contact model instance for testing."""
+    defaults = {
+        "public_key": public_key,
+        "name": name,
+        "type": 0,
+        "flags": 0,
+        "last_path": None,
+        "last_path_len": -1,
+        "last_advert": None,
+        "lat": None,
+        "lon": None,
+        "last_seen": None,
+        "on_radio": on_radio,
+        "last_contacted": None,
+        "last_read_at": None,
+    }
+    defaults.update(overrides)
+    return Contact(**defaults)
+
+
+class TestSyncRecentContactsToRadio:
+    """Test the sync_recent_contacts_to_radio function."""
+
+    @pytest.mark.asyncio
+    async def test_loads_contacts_not_on_radio(self):
+        """Contacts not on radio are added via add_contact."""
+        contacts = [_make_contact(KEY_A, "Alice"), _make_contact(KEY_B, "Bob")]
+
+        mock_mc = MagicMock()
+        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=None)
+        mock_result = MagicMock()
+        mock_result.type = EventType.OK
+        mock_mc.commands.add_contact = AsyncMock(return_value=mock_result)
+
+        mock_settings = MagicMock()
+        mock_settings.max_radio_contacts = 200
+
+        with (
+            patch("app.radio_sync.radio_manager") as mock_rm,
+            patch(
+                "app.radio_sync.ContactRepository.get_recent_non_repeaters",
+                new_callable=AsyncMock,
+                return_value=contacts,
+            ),
+            patch(
+                "app.radio_sync.ContactRepository.set_on_radio",
+                new_callable=AsyncMock,
+            ) as mock_set_on_radio,
+            patch(
+                "app.radio_sync.AppSettingsRepository.get",
+                new_callable=AsyncMock,
+                return_value=mock_settings,
+            ),
+        ):
+            mock_rm.is_connected = True
+            mock_rm.meshcore = mock_mc
+
+            result = await sync_recent_contacts_to_radio()
+
+            assert result["loaded"] == 2
+            assert mock_set_on_radio.call_count == 2
+
+    @pytest.mark.asyncio
+    async def test_skips_contacts_already_on_radio(self):
+        """Contacts already on radio are counted but not re-added."""
+        contacts = [_make_contact(KEY_A, "Alice", on_radio=True)]
+
+        mock_mc = MagicMock()
+        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=MagicMock())  # Found
+        mock_mc.commands.add_contact = AsyncMock()
+
+        mock_settings = MagicMock()
+        mock_settings.max_radio_contacts = 200
+
+        with (
+            patch("app.radio_sync.radio_manager") as mock_rm,
+            patch(
+                "app.radio_sync.ContactRepository.get_recent_non_repeaters",
+                new_callable=AsyncMock,
+                return_value=contacts,
+            ),
+            patch(
+                "app.radio_sync.ContactRepository.set_on_radio",
+                new_callable=AsyncMock,
+            ),
+            patch(
+                "app.radio_sync.AppSettingsRepository.get",
+                new_callable=AsyncMock,
+                return_value=mock_settings,
+            ),
+        ):
+            mock_rm.is_connected = True
+            mock_rm.meshcore = mock_mc
+
+            result = await sync_recent_contacts_to_radio()
+
+            assert result["loaded"] == 0
+            assert result["already_on_radio"] == 1
+            mock_mc.commands.add_contact.assert_not_called()
+
+    @pytest.mark.asyncio
+    async def test_throttled_when_called_quickly(self):
+        """Second call within throttle window returns throttled result."""
+        mock_mc = MagicMock()
+        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=None)
+
+        mock_settings = MagicMock()
+        mock_settings.max_radio_contacts = 200
+
+        with (
+            patch("app.radio_sync.radio_manager") as mock_rm,
+            patch(
+                "app.radio_sync.ContactRepository.get_recent_non_repeaters",
+                new_callable=AsyncMock,
+                return_value=[],
+            ),
+            patch(
+                "app.radio_sync.AppSettingsRepository.get",
+                new_callable=AsyncMock,
+                return_value=mock_settings,
+            ),
+        ):
+            mock_rm.is_connected = True
+            mock_rm.meshcore = mock_mc
+
+            # First call succeeds
+            result1 = await sync_recent_contacts_to_radio()
+            assert "throttled" not in result1
+
+            # Second call is throttled
+            result2 = await sync_recent_contacts_to_radio()
+            assert result2["throttled"] is True
+            assert result2["loaded"] == 0
+
+    @pytest.mark.asyncio
+    async def test_force_bypasses_throttle(self):
+        """force=True bypasses the throttle window."""
+        mock_mc = MagicMock()
+
+        mock_settings = MagicMock()
+        mock_settings.max_radio_contacts = 200
+
+        with (
+            patch("app.radio_sync.radio_manager") as mock_rm,
+            patch(
+                "app.radio_sync.ContactRepository.get_recent_non_repeaters",
+                new_callable=AsyncMock,
+                return_value=[],
+            ),
+            patch(
+                "app.radio_sync.AppSettingsRepository.get",
+                new_callable=AsyncMock,
+                return_value=mock_settings,
+            ),
+        ):
+            mock_rm.is_connected = True
+            mock_rm.meshcore = mock_mc
+
+            # First call
+            await sync_recent_contacts_to_radio()
+
+            # Forced second call is not throttled
+            result = await sync_recent_contacts_to_radio(force=True)
+            assert "throttled" not in result
+
+    @pytest.mark.asyncio
+    async def test_not_connected_returns_error(self):
+        """Returns error when radio is not connected."""
+        with patch("app.radio_sync.radio_manager") as mock_rm:
+            mock_rm.is_connected = False
+            mock_rm.meshcore = None
+
+            result = await sync_recent_contacts_to_radio()
+
+            assert result["loaded"] == 0
+            assert "error" in result
+
+    @pytest.mark.asyncio
+    async def test_marks_on_radio_when_found_but_not_flagged(self):
+        """Contact found on radio but not flagged gets set_on_radio(True)."""
+        contact = _make_contact(KEY_A, "Alice", on_radio=False)
+
+        mock_mc = MagicMock()
+        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=MagicMock())  # Found
+
+        mock_settings = MagicMock()
+        mock_settings.max_radio_contacts = 200
+
+        with (
+            patch("app.radio_sync.radio_manager") as mock_rm,
+            patch(
+                "app.radio_sync.ContactRepository.get_recent_non_repeaters",
+                new_callable=AsyncMock,
+                return_value=[contact],
+            ),
+            patch(
+                "app.radio_sync.ContactRepository.set_on_radio",
+                new_callable=AsyncMock,
+            ) as mock_set_on_radio,
+            patch(
+                "app.radio_sync.AppSettingsRepository.get",
+                new_callable=AsyncMock,
+                return_value=mock_settings,
+            ),
+        ):
+            mock_rm.is_connected = True
+            mock_rm.meshcore = mock_mc
+
+            result = await sync_recent_contacts_to_radio()
+
+            assert result["already_on_radio"] == 1
+            # Should update the flag since contact.on_radio was False
+            mock_set_on_radio.assert_called_once_with(KEY_A, True)
+
+    @pytest.mark.asyncio
+    async def test_handles_add_failure(self):
+        """Failed add_contact increments the failed counter."""
+        contacts = [_make_contact(KEY_A, "Alice")]
+
+        mock_mc = MagicMock()
+        mock_mc.get_contact_by_key_prefix = MagicMock(return_value=None)
+        mock_result = MagicMock()
+        mock_result.type = EventType.ERROR
+        mock_result.payload = {"error": "Radio full"}
+        mock_mc.commands.add_contact = AsyncMock(return_value=mock_result)
+
+        mock_settings = MagicMock()
+        mock_settings.max_radio_contacts = 200
+
+        with (
+            patch("app.radio_sync.radio_manager") as mock_rm,
+            patch(
+                "app.radio_sync.ContactRepository.get_recent_non_repeaters",
+                new_callable=AsyncMock,
+                return_value=contacts,
+            ),
+            patch(
+                "app.radio_sync.ContactRepository.set_on_radio",
+                new_callable=AsyncMock,
+            ),
+            patch(
+                "app.radio_sync.AppSettingsRepository.get",
+                new_callable=AsyncMock,
+                return_value=mock_settings,
+            ),
+        ):
+            mock_rm.is_connected = True
+            mock_rm.meshcore = mock_mc
+
+            result = await sync_recent_contacts_to_radio()
+
+            assert result["loaded"] == 0
+            assert result["failed"] == 1
tests/test_send_messages.py (new file, 202 additions)
@@ -0,0 +1,202 @@
+"""Tests for bot triggering on outgoing messages sent via the messages router."""
+
+import asyncio
+from unittest.mock import AsyncMock, MagicMock, patch
+
+import pytest
+from meshcore import EventType
+
+from app.models import Channel, Contact, SendChannelMessageRequest, SendDirectMessageRequest
+from app.routers.messages import send_channel_message, send_direct_message
+
+
+def _make_radio_result(payload=None):
+    """Create a mock radio command result."""
+    result = MagicMock()
+    result.type = EventType.MSG_SENT
+    result.payload = payload or {}
+    return result
+
+
+def _make_mc(name="TestNode"):
+    """Create a mock MeshCore connection."""
+    mc = MagicMock()
+    mc.self_info = {"name": name}
+    mc.commands = MagicMock()
+    mc.commands.send_msg = AsyncMock(return_value=_make_radio_result())
+    mc.commands.send_chan_msg = AsyncMock(return_value=_make_radio_result())
+    mc.commands.add_contact = AsyncMock(return_value=_make_radio_result())
+    mc.commands.set_channel = AsyncMock(return_value=_make_radio_result())
+    mc.get_contact_by_key_prefix = MagicMock(return_value=None)
+    return mc
+
+
+class TestOutgoingDMBotTrigger:
+    """Test that sending a DM triggers bots with is_outgoing=True."""
+
+    @pytest.mark.asyncio
+    async def test_send_dm_triggers_bot(self):
+        """Sending a DM creates a background task to run bots."""
+        mc = _make_mc()
+        db_contact = Contact(public_key="ab" * 32, name="Alice")
+
+        with (
+            patch("app.routers.messages.require_connected", return_value=mc),
+            patch(
+                "app.repository.ContactRepository.get_by_key_or_prefix",
+                new=AsyncMock(return_value=db_contact),
+            ),
+            patch("app.repository.ContactRepository.update_last_contacted", new=AsyncMock()),
+            patch("app.repository.MessageRepository.create", new=AsyncMock(return_value=1)),
+            patch("app.bot.run_bot_for_message", new=AsyncMock()) as mock_bot,
+        ):
+            request = SendDirectMessageRequest(
+                destination=db_contact.public_key, text="!lasttime Alice"
+            )
+            await send_direct_message(request)
+
+            # Let the background task run
+            await asyncio.sleep(0)
+
+            mock_bot.assert_called_once()
+            call_kwargs = mock_bot.call_args[1]
+            assert call_kwargs["message_text"] == "!lasttime Alice"
+            assert call_kwargs["is_dm"] is True
+            assert call_kwargs["is_outgoing"] is True
+            assert call_kwargs["sender_key"] == db_contact.public_key
+            assert call_kwargs["channel_key"] is None
+
+    @pytest.mark.asyncio
+    async def test_send_dm_bot_does_not_block_response(self):
+        """Bot trigger runs in background and doesn't delay the message response."""
+        mc = _make_mc()
+        db_contact = Contact(public_key="ab" * 32, name="Alice")
+
+        # Bot that would take a long time
+        async def _slow(**kw):
+            await asyncio.sleep(10)
+
+        slow_bot = AsyncMock(side_effect=_slow)
+
+        with (
+            patch("app.routers.messages.require_connected", return_value=mc),
+            patch(
+                "app.repository.ContactRepository.get_by_key_or_prefix",
+                new=AsyncMock(return_value=db_contact),
+            ),
+            patch("app.repository.ContactRepository.update_last_contacted", new=AsyncMock()),
+            patch("app.repository.MessageRepository.create", new=AsyncMock(return_value=1)),
+            patch("app.bot.run_bot_for_message", new=slow_bot),
+        ):
+            request = SendDirectMessageRequest(destination=db_contact.public_key, text="Hello")
+            # This should return immediately, not wait 10 seconds
+            message = await send_direct_message(request)
+            assert message.text == "Hello"
+            assert message.outgoing is True
+
+    @pytest.mark.asyncio
+    async def test_send_dm_passes_no_sender_name(self):
+        """Outgoing DMs pass sender_name=None (we are the sender)."""
+        mc = _make_mc()
+        db_contact = Contact(public_key="cd" * 32, name="Bob")
+
+        with (
+            patch("app.routers.messages.require_connected", return_value=mc),
+            patch(
+                "app.repository.ContactRepository.get_by_key_or_prefix",
+                new=AsyncMock(return_value=db_contact),
+            ),
+            patch("app.repository.ContactRepository.update_last_contacted", new=AsyncMock()),
+            patch("app.repository.MessageRepository.create", new=AsyncMock(return_value=1)),
+            patch("app.bot.run_bot_for_message", new=AsyncMock()) as mock_bot,
+        ):
+            request = SendDirectMessageRequest(destination=db_contact.public_key, text="test")
+            await send_direct_message(request)
+            await asyncio.sleep(0)
+
+            call_kwargs = mock_bot.call_args[1]
+            assert call_kwargs["sender_name"] is None
+
+
+class TestOutgoingChannelBotTrigger:
+    """Test that sending a channel message triggers bots with is_outgoing=True."""
+
+    @pytest.mark.asyncio
+    async def test_send_channel_msg_triggers_bot(self):
+        """Sending a channel message creates a background task to run bots."""
+        mc = _make_mc(name="MyNode")
+        db_channel = Channel(key="aa" * 16, name="#general")
+
+        with (
+            patch("app.routers.messages.require_connected", return_value=mc),
+            patch(
+                "app.repository.ChannelRepository.get_by_key",
+                new=AsyncMock(return_value=db_channel),
+            ),
+            patch("app.repository.MessageRepository.create", new=AsyncMock(return_value=1)),
+            patch("app.decoder.calculate_channel_hash", return_value="abcd"),
+            patch("app.bot.run_bot_for_message", new=AsyncMock()) as mock_bot,
+        ):
+            request = SendChannelMessageRequest(
+                channel_key=db_channel.key, text="!lasttime5 someone"
+            )
+            await send_channel_message(request)
+            await asyncio.sleep(0)
+
+            mock_bot.assert_called_once()
+            call_kwargs = mock_bot.call_args[1]
+            assert call_kwargs["message_text"] == "!lasttime5 someone"
+            assert call_kwargs["is_dm"] is False
+            assert call_kwargs["is_outgoing"] is True
+            assert call_kwargs["channel_key"] == db_channel.key.upper()
+            assert call_kwargs["channel_name"] == "#general"
+            assert call_kwargs["sender_name"] == "MyNode"
+            assert call_kwargs["sender_key"] is None
+
+    @pytest.mark.asyncio
+    async def test_send_channel_msg_no_radio_name(self):
+        """When radio has no name, sender_name is None."""
+        mc = _make_mc(name="")
+        db_channel = Channel(key="bb" * 16, name="#test")
+
+        with (
+            patch("app.routers.messages.require_connected", return_value=mc),
+            patch(
+                "app.repository.ChannelRepository.get_by_key",
+                new=AsyncMock(return_value=db_channel),
+            ),
+            patch("app.repository.MessageRepository.create", new=AsyncMock(return_value=1)),
+            patch("app.decoder.calculate_channel_hash", return_value="abcd"),
+            patch("app.bot.run_bot_for_message", new=AsyncMock()) as mock_bot,
+        ):
+            request = SendChannelMessageRequest(channel_key=db_channel.key, text="hello")
+            await send_channel_message(request)
+            await asyncio.sleep(0)
+
+            call_kwargs = mock_bot.call_args[1]
+            assert call_kwargs["sender_name"] is None
+
+    @pytest.mark.asyncio
+    async def test_send_channel_msg_bot_does_not_block_response(self):
+        """Bot trigger runs in background and doesn't delay the message response."""
+        mc = _make_mc(name="MyNode")
+        db_channel = Channel(key="cc" * 16, name="#slow")
+
+        async def _slow(**kw):
+            await asyncio.sleep(10)
+
+        slow_bot = AsyncMock(side_effect=_slow)
+
+        with (
+            patch("app.routers.messages.require_connected", return_value=mc),
+            patch(
+                "app.repository.ChannelRepository.get_by_key",
+                new=AsyncMock(return_value=db_channel),
+            ),
+            patch("app.repository.MessageRepository.create", new=AsyncMock(return_value=1)),
+            patch("app.decoder.calculate_channel_hash", return_value="abcd"),
+            patch("app.bot.run_bot_for_message", new=slow_bot),
+        ):
+            request = SendChannelMessageRequest(channel_key=db_channel.key, text="test")
+            message = await send_channel_message(request)
+            assert message.outgoing is True
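The non-blocking behavior these tests rely on comes from scheduling the bot with `asyncio.create_task` rather than awaiting it in the request handler; a single `await asyncio.sleep(0)` then gives the task one event-loop tick to run, which is why the tests can assert on the mock immediately afterward. A self-contained sketch of that pattern (the bot body and `trigger_bots` helper here are stand-ins, not the app's implementation):

```python
import asyncio

captured = {}


async def run_bot_for_message(**kwargs):
    """Stand-in bot runner; the real one lives in app.bot."""
    captured.update(kwargs)


def trigger_bots(message_text: str, is_dm: bool, is_outgoing: bool):
    """Fire-and-forget: schedule the bot so sending never blocks on it."""
    return asyncio.create_task(
        run_bot_for_message(
            message_text=message_text, is_dm=is_dm, is_outgoing=is_outgoing
        )
    )


async def main():
    trigger_bots("!lasttime Alice", is_dm=True, is_outgoing=True)
    await asyncio.sleep(0)  # one event-loop tick, as the tests do


asyncio.run(main())
```

Even a bot that sleeps for ten seconds would not delay the send path, since the handler returns without awaiting the task.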