56 Commits
3.2.0 ... 3.4.0

Author SHA1 Message Date
Jack Kingsman
dbb8dd4c43 Updating changelog + build for 3.4.0 2026-03-16 15:41:43 -07:00
Jack Kingsman
6c003069d4 Move to pre-built frontend on release only. Closes #62 2026-03-16 15:40:01 -07:00
Jack Kingsman
ea5ba3b2a3 Add radio model and stats display. Closes #64 2026-03-16 15:29:21 -07:00
Jack Kingsman
58b34a6a2f Make pagination requests abortable 2026-03-16 15:18:38 -07:00
Jack Kingsman
4277e0c924 Clear keys on radio disconnect and add better error for channel send non-radio response 2026-03-16 15:11:02 -07:00
Jack Kingsman
2f562ce682 Don't reconcile mid history view 2026-03-16 15:03:55 -07:00
Jack Kingsman
370ff115b4 Fix DM collapse on same second send 2026-03-16 14:53:48 -07:00
Jack Kingsman
04733b6a02 Use advert position if we don't have a from-repeater-stats lat/lon 2026-03-16 14:33:04 -07:00
Jack Kingsman
749fb43fd0 Ditch garbage data ingest for lat/lon and extend map. Closes #63 2026-03-16 14:24:58 -07:00
Jack Kingsman
8d7d926762 Fix repeater clock drift-drift on nav-away-come-back 2026-03-16 11:32:40 -07:00
Jack Kingsman
c809dad05d Polish off all our gross edges around frontend/backend Public name management 2026-03-15 18:20:43 -07:00
Jack Kingsman
c76f230c9f Reduce memo thrash on map update 2026-03-15 18:07:30 -07:00
Jack Kingsman
226dc4f59e use server time for advert freshness 2026-03-15 17:55:47 -07:00
Jack Kingsman
3f50a2ef07 fix e2e sort order test 2026-03-15 17:15:14 -07:00
Jack Kingsman
4a7ea9eb29 Always validate fanout configs on save 2026-03-15 16:27:43 -07:00
Jack Kingsman
29368961fc Tweak send no-response handling 2026-03-15 16:12:17 -07:00
Jack Kingsman
7cb84ea6c7 Fix sidebar sort order. Closes #61 2026-03-15 16:02:09 -07:00
Jack Kingsman
0b1a19164a Fix typo and add better ops for decrypt API calls 2026-03-15 15:47:09 -07:00
Jack Kingsman
cf1a55e258 Add prebuilt frontend 2026-03-14 23:05:57 -07:00
Jack Kingsman
0881998e5b Overhaul repeater interaction to better deal with login failure clearly 2026-03-14 22:58:14 -07:00
Jack Kingsman
ac65943263 Updating changelog + build for 3.3.0 2026-03-13 22:25:47 -07:00
Jack Kingsman
04b324b711 Don't treat matched-prefix DMs as an ack (as it is an echo-ack for channel messages, not DMs) 2026-03-13 22:17:27 -07:00
Jack Kingsman
5512f9e677 Fix last-advert selection logic for path recency 2026-03-13 22:11:02 -07:00
Jack Kingsman
b4962d39f0 Fix up resend logic to be cleaner 2026-03-13 22:07:16 -07:00
Jack Kingsman
39a687da58 Forward whole message to FE on resend so the browser updates 2026-03-13 21:57:19 -07:00
Jack Kingsman
f41c7756d3 Prevent same-second outgoing collision now that we can send faster.
Also add pending ack tracking
2026-03-13 21:43:50 -07:00
Jack Kingsman
bafea6a172 Fix blocking on DMs (again, but right this time) 2026-03-13 21:10:50 -07:00
Jack Kingsman
68f05075ca Modal-ify the room region override 2026-03-13 18:11:18 -07:00
Jack Kingsman
adfb8c930c Move routing override to modal 2026-03-13 18:04:49 -07:00
Jack Kingsman
1299a301c1 Add route discovery 2026-03-13 17:55:17 -07:00
Jack Kingsman
3a4ea8022b Add local node discovery 2026-03-13 17:25:28 -07:00
Jack Kingsman
bd19015693 Don't suggest npm ci 2026-03-13 14:46:55 -07:00
Jack Kingsman
cb9c9ae289 Docs updates 2026-03-13 11:18:11 -07:00
Jack Kingsman
2369e69e0a Catch channel cache issues on set_channel failure during eviction 2026-03-13 11:09:23 -07:00
Jack Kingsman
9c2b6f0744 Add fallback polling message persistence for channel messages 2026-03-13 11:05:49 -07:00
Jack Kingsman
70d28e53a9 Fix self-node snapping on node addition 2026-03-13 10:42:36 -07:00
Jack Kingsman
96d8d1dc64 Be more strict with SQS region 2026-03-12 23:57:13 -07:00
Jack Kingsman
a7ff041a48 Drop out channel hash helper 2026-03-12 23:57:13 -07:00
Jack Kingsman
5a580b9c01 Tighten up error phasing 2026-03-12 23:57:13 -07:00
Jack Kingsman
0834414ba4 Remove redundant channel listing 2026-03-12 23:57:13 -07:00
Jack Kingsman
df538b3aaf Parallelize docker_ci 2026-03-12 23:57:13 -07:00
Jack Kingsman
2710cafb21 Add health endpoint 2026-03-12 23:57:13 -07:00
Jack Kingsman
338f632514 Remove unused endpoint and fix stale slot retry problems 2026-03-12 23:57:13 -07:00
Jack Kingsman
7e1f941760 Add documentation and force-lock-acquisition mode for channel management 2026-03-12 23:57:13 -07:00
Jack Kingsman
87ea2b4675 LRU-based parallel channel storage 2026-03-12 23:57:13 -07:00
Jack Kingsman
5c85a432c8 Phase 1 of manual channel management 2026-03-12 23:57:13 -07:00
Jack Kingsman
22ca5410ee Fix up unread bugs 2026-03-12 22:00:18 -07:00
Jack Kingsman
276e0e09b3 Show dismiss 'X' on the Jump to Unread 2026-03-12 20:23:38 -07:00
Jack Kingsman
1c57e35ba5 Don't collapse ambiguous senders to imply an indirect link between repeaters 2026-03-12 20:13:45 -07:00
Jack Kingsman
358589bd66 Cull a bunch of unused functions 2026-03-12 18:12:27 -07:00
Jack Kingsman
74c13d194c Fix message dual render and get the jump to unread link out of the way on visible unread boundaries. Closes #57. 2026-03-12 16:31:47 -07:00
Jack Kingsman
07fd88a4d6 Make repeater neighbor display need a GPS fix to show map + distance, and fetch before display. Closes #58. 2026-03-12 16:18:52 -07:00
Jack Kingsman
07934093e6 Don't force-insert a node with unknown relationships just because they are the marked recipient of a DM. Closes #44. 2026-03-12 14:30:26 -07:00
Jack Kingsman
3ee4f9d7a2 Do some same name ambiguous + known sibling collapse 2026-03-12 13:10:57 -07:00
Jack Kingsman
b81f6ef89e Visualizer overhaul 2026-03-12 12:56:59 -07:00
Jack Kingsman
489950a2f7 Use dashed lines for collapsed ambiguous repeater paths. Closes #44. 2026-03-12 12:11:17 -07:00
147 changed files with 9931 additions and 2816 deletions

.gitattributes vendored Normal file

@@ -0,0 +1 @@
frontend/prebuilt/** -diff

.gitignore vendored

@@ -12,8 +12,12 @@ frontend/test-results/
# Frontend build output (built from source by end users)
frontend/dist/
frontend/prebuilt/
frontend/.eslintcache
# Release artifacts
remoteterm-prebuilt-frontend-v*.zip
# reference libraries
references/


@@ -129,7 +129,7 @@ To improve repeater disambiguation in the network visualizer, the backend stores
- This is independent of raw-packet payload deduplication.
- Paths are keyed per contact + path + hop count, with `heard_count`, `first_seen`, and `last_seen`.
- Only the N most recent unique paths are retained per contact (currently 10).
- See `frontend/src/components/AGENTS_packet_visualizer.md` § "Advert-Path Identity Hints" for how the visualizer consumes this data.
- See `frontend/src/components/visualizer/AGENTS_packet_visualizer.md` § "Advert-Path Identity Hints" for how the visualizer consumes this data.
## Path Hash Modes
@@ -194,7 +194,7 @@ This message-layer echo/path handling is independent of raw-packet storage dedup
│ │ └── ...
│ └── vite.config.ts
├── scripts/
│ ├── all_quality.sh # Run all lint, format, typecheck, tests, build (sequential)
│ ├── all_quality.sh # Run all lint, format, typecheck, tests, and the standard frontend build
│ ├── collect_licenses.sh # Gather third-party license attributions
│ ├── e2e.sh # End-to-end test runner
│ └── publish.sh # Version bump, changelog, docker build & push
@@ -223,7 +223,7 @@ MESHCORE_SERIAL_PORT=/dev/cu.usbserial-0001 uv run uvicorn app.main:app --reload
```bash
cd frontend
npm ci
npm install
npm run dev # http://localhost:5173, proxies /api to :8000
```
@@ -237,13 +237,13 @@ Terminal 2: `cd frontend && npm run dev`
In production, the FastAPI backend serves the compiled frontend. Build the frontend first:
```bash
cd frontend && npm ci && npm run build && cd ..
cd frontend && npm install && npm run build && cd ..
uv run uvicorn app.main:app --host 0.0.0.0 --port 8000
```
Access at `http://localhost:8000`. All API routes are prefixed with `/api`.
If `frontend/dist` (or `frontend/dist/index.html`) is missing, backend startup now logs an explicit error and continues serving API routes. In that case, frontend static routes are not mounted until a frontend build is present.
If `frontend/dist` is missing, the backend falls back to `frontend/prebuilt` when present (for example from the release zip artifact). If neither build directory is available, startup logs an explicit error and continues serving API routes without frontend static routes mounted.
## Testing
@@ -281,7 +281,7 @@ npm run test:run
### Before Completing Changes
**Always run `./scripts/all_quality.sh` before finishing any changes that have modified code or tests.** This runs all linting, formatting, type checking, tests, and builds sequentially, catching type mismatches, breaking changes, and compilation errors. This is not necessary for docs-only changes.
**Always run `./scripts/all_quality.sh` before finishing any changes that have modified code or tests.** This runs all linting, formatting, type checking, tests, and the standard frontend build sequentially, catching type mismatches, breaking changes, and compilation errors. This is not necessary for docs-only changes.
## API Summary
@@ -290,25 +290,20 @@ All endpoints are prefixed with `/api` (e.g., `/api/health`).
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/health` | Connection status, fanout statuses, bots_disabled flag |
| GET | `/api/debug` | Support snapshot: recent logs, live radio probe, contact/channel drift audit, and running version/git info |
| GET | `/api/radio/config` | Radio configuration, including `path_hash_mode`, `path_hash_mode_supported`, and whether adverts include current node location |
| PATCH | `/api/radio/config` | Update name, location, advert-location on/off, radio params, and `path_hash_mode` when supported |
| PUT | `/api/radio/private-key` | Import private key to radio |
| POST | `/api/radio/advertise` | Send advertisement |
| POST | `/api/radio/discover` | Run a short mesh discovery sweep for nearby repeaters/sensors |
| POST | `/api/radio/reboot` | Reboot radio or reconnect if disconnected |
| POST | `/api/radio/disconnect` | Disconnect from radio and pause automatic reconnect attempts |
| POST | `/api/radio/reconnect` | Manual radio reconnection |
| GET | `/api/contacts` | List contacts |
| GET | `/api/contacts/analytics` | Unified keyed-or-name contact analytics payload |
| GET | `/api/contacts/repeaters/advert-paths` | List recent unique advert paths for all contacts |
| GET | `/api/contacts/name-detail` | Channel activity summary for a sender name without a resolved key |
| GET | `/api/contacts/{public_key}` | Get contact by public key or prefix |
| GET | `/api/contacts/{public_key}/detail` | Comprehensive contact profile (stats, name history, paths) |
| GET | `/api/contacts/{public_key}/advert-paths` | List recent unique advert paths for a contact |
| POST | `/api/contacts` | Create contact (optionally trigger historical DM decrypt) |
| DELETE | `/api/contacts/{public_key}` | Delete contact |
| POST | `/api/contacts/sync` | Pull from radio |
| POST | `/api/contacts/{public_key}/add-to-radio` | Push contact to radio |
| POST | `/api/contacts/{public_key}/remove-from-radio` | Remove contact from radio |
| POST | `/api/contacts/{public_key}/mark-read` | Mark contact conversation as read |
| POST | `/api/contacts/{public_key}/command` | Send CLI command to repeater |
| POST | `/api/contacts/{public_key}/routing-override` | Set or clear a forced routing override |
@@ -318,16 +313,15 @@ All endpoints are prefixed with `/api` (e.g., `/api/health`).
| POST | `/api/contacts/{public_key}/repeater/lpp-telemetry` | Fetch CayenneLPP sensor data |
| POST | `/api/contacts/{public_key}/repeater/neighbors` | Fetch repeater neighbors |
| POST | `/api/contacts/{public_key}/repeater/acl` | Fetch repeater ACL |
| POST | `/api/contacts/{public_key}/repeater/radio-settings` | Fetch radio settings via CLI |
| POST | `/api/contacts/{public_key}/repeater/node-info` | Fetch repeater name, location, and clock via CLI |
| POST | `/api/contacts/{public_key}/repeater/radio-settings` | Fetch repeater radio config via CLI |
| POST | `/api/contacts/{public_key}/repeater/advert-intervals` | Fetch advert intervals |
| POST | `/api/contacts/{public_key}/repeater/owner-info` | Fetch owner info |
| GET | `/api/channels` | List channels |
| GET | `/api/channels/{key}/detail` | Comprehensive channel profile (message stats, top senders) |
| GET | `/api/channels/{key}` | Get channel by key |
| POST | `/api/channels` | Create channel |
| DELETE | `/api/channels/{key}` | Delete channel |
| POST | `/api/channels/sync` | Pull from radio |
| POST | `/api/channels/{key}/flood-scope-override` | Set or clear a per-channel regional flood-scope override |
| POST | `/api/channels/{key}/mark-read` | Mark channel as read |
| GET | `/api/messages` | List with filters (`q`, `after`/`after_id` for forward pagination) |
@@ -445,11 +439,12 @@ mc.subscribe(EventType.ACK, handler)
| `MESHCORE_LOG_LEVEL` | `INFO` | Logging level (`DEBUG`/`INFO`/`WARNING`/`ERROR`) |
| `MESHCORE_DATABASE_PATH` | `data/meshcore.db` | SQLite database location |
| `MESHCORE_DISABLE_BOTS` | `false` | Disable bot system entirely (blocks execution and config) |
| `MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK` | `false` | Switch the always-on message audit task from hourly checks to aggressive 10-second `get_msg()` fallback polling |
| `MESHCORE_BASIC_AUTH_USERNAME` | *(none)* | Optional app-wide HTTP Basic auth username; must be set together with `MESHCORE_BASIC_AUTH_PASSWORD` |
| `MESHCORE_BASIC_AUTH_PASSWORD` | *(none)* | Optional app-wide HTTP Basic auth password; must be set together with `MESHCORE_BASIC_AUTH_USERNAME` |
| `MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK` | `false` | Switch the always-on radio audit task from hourly checks to aggressive 10-second polling; the audit checks both missed message drift and channel-slot cache drift |
| `MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE` | `false` | Disable channel-slot reuse and force `set_channel(...)` before every channel send, even on serial/BLE |
**Note:** Runtime app settings are stored in the database (`app_settings` table), not environment variables. These include `max_radio_contacts`, `auto_decrypt_dm_on_advert`, `sidebar_sort_order`, `advert_interval`, `last_advert_time`, `favorites`, `last_message_times`, `flood_scope`, `blocked_keys`, and `blocked_names`. `max_radio_contacts` is the configured radio contact capacity baseline used by background maintenance: favorites reload first, non-favorite fill targets about 80% of that value, and full offload/reload triggers around 95% occupancy. They are configured via `GET/PATCH /api/settings`. MQTT, bot, webhook, Apprise, and SQS configs are stored in the `fanout_configs` table, managed via `/api/fanout`.
**Note:** Runtime app settings are stored in the database (`app_settings` table), not environment variables. These include `max_radio_contacts`, `auto_decrypt_dm_on_advert`, `sidebar_sort_order`, `advert_interval`, `last_advert_time`, `favorites`, `last_message_times`, `flood_scope`, `blocked_keys`, and `blocked_names`. `max_radio_contacts` is the configured radio contact capacity baseline used by background maintenance: favorites reload first, non-favorite fill targets about 80% of that value, and full offload/reload triggers around 95% occupancy. They are configured via `GET/PATCH /api/settings`. The backend still carries `sidebar_sort_order` for compatibility and migration, but the current frontend sidebar stores sort order per section (`Channels`, `Contacts`, `Repeaters`) in localStorage rather than treating it as one shared server-backed preference. MQTT, bot, webhook, Apprise, and SQS configs are stored in the `fanout_configs` table, managed via `/api/fanout`. If the radio's channel slots appear unstable or another client is mutating them underneath this app, operators can force the old always-reconfigure send path with `MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE=true`.
Byte-perfect channel retries are user-triggered via `POST /api/messages/channel/{message_id}/resend` and are allowed for 30 seconds after the original send.


@@ -1,3 +1,33 @@
## [3.4.0] - 2026-03-16
Feature: Add radio model and stats display
Feature: Ship a prebuilt frontend as a release zip artifact (initially committed in-repo, then moved to release artifacts only)
Bugfix: Misc. frontend performance and correctness fixes
Bugfix: Fix same-second same-content DM send collision
Bugfix: Discard clearly-wrong GPS data
Bugfix: Prevent repeater clock skew drift on page nav
Misc: Use repeater's advertised location if we haven't loaded one from repeater admin
Misc: Never permit invalid fanout configs to be saved
## [3.3.0] - 2026-03-13
Feature: Use dashed lines to show collapsed ambiguous router results
Feature: Jump to unread
Feature: Local channel management to prevent need to reload channel every time
Feature: Debug endpoint
Feature: Force-singleton channel management
Feature: Local node discovery
Feature: Node routing discovery
Bugfix: Don't tell users to use npm ci
Bugfix: Add fallback polling DM message persistence
Bugfix: All native-JS inputs are now modals
Bugfix: Same-second send collision resolution
Bugfix: Proper browser updates on resend
Bugfix: Use last-advert rather than last-heard for node path discovery
Bugfix: Don't treat prefix-matching DM echoes as acks like we do for channel messages
Misc: Visualizer data layer overhaul for future map work
Misc: Parallelize docker tests
## [3.2.0] - 2026-03-12
Feature: Improve ambiguous-sender DM handling and visibility


@@ -8,8 +8,8 @@ Backend server + browser interface for MeshCore mesh radio networks. Connect you
* Monitor unlimited contacts and channels (radio limits don't apply -- packets are decrypted server-side)
* Access your radio remotely over your network or VPN
* Search for hashtag room names for channels you don't have keys for yet
* Forward packets to MQTT brokers (private: decrypted messages and/or raw packets; community aggregators like LetsMesh.net: raw packets only)
* Use the more recent 1.14 firmwares which support multibyte pathing in all traffic and display systems within the app
* Forward packets to MQTT, LetsMesh, MeshRank, SQS, Apprise, etc.
* Use the more recent 1.14 firmwares which support multibyte pathing
* Visualize the mesh as a map or node set, view repeater stats, and more!
**Warning:** This app is for trusted environments only. _Do not put this on an untrusted network, or open it to the public._ You can optionally set `MESHCORE_BASIC_AUTH_USERNAME` and `MESHCORE_BASIC_AUTH_PASSWORD` for app-wide HTTP Basic auth, but that is only a coarse gate and must be paired with HTTPS. The bots can execute arbitrary Python code which means anyone who gets access to the app can, too. To completely disable the bot system, start the server with `MESHCORE_DISABLE_BOTS=true` — this prevents all bot execution and blocks bot configuration changes via the API. If you need stronger access control, consider using a reverse proxy like Nginx, or extending FastAPI; full access control and user management are outside the scope of this app.
@@ -29,6 +29,8 @@ If extending, have your LLM read the three `AGENTS.md` files: `./AGENTS.md`, `./
- [UV](https://astral.sh/uv) package manager: `curl -LsSf https://astral.sh/uv/install.sh | sh`
- MeshCore radio connected via USB serial, TCP, or BLE
If you are on a low-resource system and do not want to build the frontend locally, download the release zip named `remoteterm-prebuilt-frontend-vX.X.X-<short hash>.zip`. That bundle includes `frontend/prebuilt`, so you can run the app without doing a frontend build from source.
<details>
<summary>Finding your serial port</summary>
@@ -77,7 +79,7 @@ cd Remote-Terminal-for-MeshCore
uv sync
# Build frontend
cd frontend && npm ci && npm run build && cd ..
cd frontend && npm install && npm run build && cd ..
# Run server
uv run uvicorn app.main:app --host 0.0.0.0 --port 8000
@@ -101,16 +103,16 @@ $env:MESHCORE_SERIAL_PORT="COM8" # or your COM port
uv run uvicorn app.main:app --host 0.0.0.0 --port 8000
```
Access at http://localhost:8000
Access at http://localhost:8000.
> **Note:** WebGPU cracking requires HTTPS when not on localhost. See the HTTPS section under Additional Setup.
>
> Source checkouts expect a normal frontend build in `frontend/dist`. The backend also supports `frontend/prebuilt` when you are running from the release zip artifact.
## Docker Compose
> **Warning:** Docker has intermittent issues with serial event subscriptions. The native method above is more reliable.
> **Note:** BLE-in-docker is outside the scope of this README, but the env vars should all still work.
Edit `docker-compose.yaml` to set a serial device for passthrough, or uncomment your transport (serial or TCP). Then:
```bash
@@ -175,7 +177,7 @@ uv run uvicorn app.main:app --reload
```bash
cd frontend
npm ci
npm install
npm run dev # Dev server at http://localhost:5173 (proxies API to :8000)
npm run build # Production build to dist/
```
@@ -207,7 +209,7 @@ cd frontend
npm run lint:fix # ESLint + auto-fix
npm run test:run # run tests
npm run format # prettier (always writes)
npm run build # build the frontend
npm run build # build frontend/dist
```
</details>
@@ -224,7 +226,6 @@ npm run build # build the frontend
| `MESHCORE_LOG_LEVEL` | INFO | DEBUG, INFO, WARNING, ERROR |
| `MESHCORE_DATABASE_PATH` | data/meshcore.db | SQLite database path |
| `MESHCORE_DISABLE_BOTS` | false | Disable bot system entirely (blocks execution and config) |
| `MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK` | false | Run aggressive 10-second `get_msg()` fallback polling instead of the default hourly audit task |
| `MESHCORE_BASIC_AUTH_USERNAME` | | Optional app-wide HTTP Basic auth username; must be set together with `MESHCORE_BASIC_AUTH_PASSWORD` |
| `MESHCORE_BASIC_AUTH_PASSWORD` | | Optional app-wide HTTP Basic auth password; must be set together with `MESHCORE_BASIC_AUTH_USERNAME` |
@@ -232,7 +233,21 @@ Only one transport may be active at a time. If multiple are set, the server will
If you enable Basic Auth, protect the app with HTTPS. HTTP Basic credentials are not safe on plain HTTP.
By default the app relies on radio events plus MeshCore auto-fetch for incoming messages, and also runs a low-frequency hourly audit poll. If that audit ever finds radio data that was not surfaced through event subscription, the backend logs an error and the UI shows a toast telling the operator to check the logs. If you see that warning, or if messages on the radio never show up in the app, try `MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK=true` to switch that task into a more aggressive 10-second `get_msg()` safety net.
### Remediation Environment Variables
These are intended for diagnosing or working around radios that behave oddly.
| Variable | Default | Description |
|----------|---------|-------------|
| `MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK` | false | Run aggressive 10-second `get_msg()` fallback polling instead of the default hourly sanity check |
| `MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE` | false | Disable channel-slot reuse and force `set_channel(...)` before every channel send |
By default the app relies on radio events plus MeshCore auto-fetch for incoming messages, and also runs a low-frequency hourly audit poll. That audit checks both:
- whether messages were left on the radio without reaching the app through event subscription
- whether the app's channel-slot expectations still match the radio's actual channel listing
If the audit finds a mismatch, you'll see an error in the application UI and your logs. If you see that warning, or if messages on the radio never show up in the app, try `MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK=true` to switch that task into a more aggressive 10-second safety net. If room sends appear to be using the wrong channel slot, or another client is changing slots underneath this app, try `MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE=true` to force the channel slot to be reconfigured before every send (this delays each send by roughly 500ms).
## Additional Setup
@@ -285,11 +300,6 @@ cd /opt/remoteterm
sudo -u remoteterm uv venv
sudo -u remoteterm uv sync
# Build frontend (required for the backend to serve the web UI)
cd /opt/remoteterm/frontend
sudo -u remoteterm npm ci
sudo -u remoteterm npm run build
# Install and start service
sudo cp /opt/remoteterm/remoteterm.service /etc/systemd/system/
sudo systemctl daemon-reload
@@ -301,6 +311,8 @@ sudo journalctl -u remoteterm -f
```
Edit `/etc/systemd/system/remoteterm.service` to set `MESHCORE_SERIAL_PORT` if needed.
If you are deploying from a source checkout, install Node.js and run `cd /opt/remoteterm/frontend && sudo -u remoteterm npm install && sudo -u remoteterm npm run build` before starting the service. If you are deploying from the release zip artifact, the bundled `frontend/prebuilt` is already present.
</details>
<details>


@@ -53,6 +53,7 @@ app/
├── frontend_static.py # Mount/serve built frontend (production)
└── routers/
├── health.py
├── debug.py
├── radio.py
├── contacts.py
├── channels.py
@@ -89,7 +90,7 @@ app/
- `RadioManager.post_connect_setup()` delegates to `services/radio_lifecycle.py`.
- Routers, startup/lifespan code, fanout helpers, and `radio_sync.py` should reach radio state through `services/radio_runtime.py`, not by importing `app.radio.radio_manager` directly.
- Shared reconnect/setup helpers in `services/radio_lifecycle.py` are used by startup, the monitor, and manual reconnect/reboot flows before broadcasting healthy state.
- Setup still includes handler registration, key export, time sync, contact/channel sync, and advertisement tasks. The message-poll task always starts: by default it runs as a low-frequency hourly audit, and `MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK=true` switches it to aggressive 10-second polling.
- Setup still includes handler registration, key export, time sync, contact/channel sync, and advertisement tasks. The message-poll task always starts: by default it runs as a low-frequency hourly audit, and `MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK=true` switches it to aggressive 10-second polling. That audit checks both missed-radio-message drift and channel-slot cache drift; cache mismatches are logged, toasted, and the send-slot cache is reset.
- Post-connect setup is timeout-bounded. If initial radio offload/setup hangs too long, the backend logs the failure and broadcasts an `error` toast telling the operator to reboot the radio and restart the server.
## Important Behaviors
@@ -98,6 +99,10 @@ app/
- Packet `path_len` values are hop counts, not byte counts.
- Hop width comes from the packet or radio `path_hash_mode`: `0` = 1-byte, `1` = 2-byte, `2` = 3-byte.
- Channel slot count comes from firmware-reported `DEVICE_INFO.max_channels`; do not hardcode `40` when scanning/offloading channel slots.
- Channel sends use a session-local LRU slot cache after startup channel offload clears the radio. Repeated sends to the same room reuse the loaded slot; new rooms fill free slots up to the discovered channel capacity, then evict the least recently used cached room.
- TCP radios do not reuse cached slot contents. For TCP, channel sends still force `set_channel(...)` before every send because this backend does not have exclusive device access.
- `MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE=true` disables slot reuse on all transports and forces the old always-`set_channel(...)` behavior before every channel send.
- Contacts persist `out_path_hash_mode` in the database so contact sync and outbound DM routing reuse the exact stored mode instead of inferring from path bytes.
- Contacts may also persist `route_override_path`, `route_override_len`, and `route_override_hash_mode`. `Contact.to_radio_dict()` gives these override fields precedence over learned `last_path*`, while advert processing still updates the learned route for telemetry/fallback.
- `contact_advert_paths` identity is `(public_key, path_hex, path_len)` because the same hex bytes can represent different routes at different hop widths.
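The slot reuse/eviction policy described above can be sketched as a small LRU cache. `ChannelSlotCache` and its slot-numbering scheme are illustrative stand-ins, not the backend's real implementation; `capacity` stands in for the firmware-reported `DEVICE_INFO.max_channels`.

```python
from collections import OrderedDict


class ChannelSlotCache:
    """Illustrative LRU mapping of channel key -> radio slot index (a sketch,
    not the app's actual code)."""

    def __init__(self, capacity: int) -> None:
        self.capacity = capacity
        self._slots: OrderedDict[str, int] = OrderedDict()  # key -> slot index

    def slot_for(self, channel_key: str) -> tuple[int, bool]:
        """Return (slot, needs_set_channel) for a send to `channel_key`."""
        if channel_key in self._slots:
            # Cache hit: reuse the loaded slot, no set_channel(...) required.
            self._slots.move_to_end(channel_key)
            return self._slots[channel_key], False
        if len(self._slots) < self.capacity:
            # Free slot available: claim the next unused index.
            slot = len(self._slots)
        else:
            # Full: evict the least recently used room and reuse its slot.
            _, slot = self._slots.popitem(last=False)
        self._slots[channel_key] = slot
        return slot, True


cache = ChannelSlotCache(capacity=2)
print(cache.slot_for("roomA"))  # (0, True)  -> new room, set_channel needed
print(cache.slot_for("roomA"))  # (0, False) -> repeat send reuses the slot
print(cache.slot_for("roomB"))  # (1, True)
print(cache.slot_for("roomC"))  # (0, True)  -> evicts roomA's slot
```

Setting `MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE=true` is equivalent to the cache always reporting `needs_set_channel=True`.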
@@ -110,7 +115,7 @@ app/
### Echo/repeat dedup
- Message uniqueness: `(type, conversation_key, text, sender_timestamp)`.
- Duplicate insert is treated as an echo/repeat: the new path (if any) is appended, and the ACK count is incremented **only for outgoing messages**. Incoming repeats add path data but do not change the ACK count.
- Duplicate insert is treated as an echo/repeat: the new path (if any) is appended, and the ACK count is incremented only for outgoing channel messages. Incoming repeats and direct-message duplicates may still add path data, but DM delivery state advances only on real ACK events.
### Raw packet dedup policy
@@ -145,11 +150,15 @@ app/
### Health
- `GET /health`
### Debug
- `GET /debug` — support snapshot with recent logs, live radio probe, slot/contact audits, and version/git info
### Radio
- `GET /radio/config` — includes `path_hash_mode`, `path_hash_mode_supported`, and advert-location on/off
- `PATCH /radio/config` — may update `path_hash_mode` (`0..2`) when firmware supports it
- `PUT /radio/private-key`
- `POST /radio/advertise`
- `POST /radio/discover` — short mesh discovery sweep for nearby repeaters/sensors
- `POST /radio/disconnect`
- `POST /radio/reboot`
- `POST /radio/reconnect`
@@ -158,15 +167,8 @@ app/
- `GET /contacts`
- `GET /contacts/analytics` — unified keyed-or-name analytics payload
- `GET /contacts/repeaters/advert-paths` — recent advert paths for all contacts
- `GET /contacts/name-detail` — name-only activity summary for unresolved channel senders
- `GET /contacts/{public_key}`
- `GET /contacts/{public_key}/detail` — comprehensive contact profile (stats, name history, paths, nearest repeaters)
- `GET /contacts/{public_key}/advert-paths` — recent advert paths for one contact
- `POST /contacts`
- `DELETE /contacts/{public_key}`
- `POST /contacts/sync`
- `POST /contacts/{public_key}/add-to-radio`
- `POST /contacts/{public_key}/remove-from-radio`
- `POST /contacts/{public_key}/mark-read`
- `POST /contacts/{public_key}/command`
- `POST /contacts/{public_key}/routing-override`
@@ -176,6 +178,7 @@ app/
- `POST /contacts/{public_key}/repeater/lpp-telemetry`
- `POST /contacts/{public_key}/repeater/neighbors`
- `POST /contacts/{public_key}/repeater/acl`
- `POST /contacts/{public_key}/repeater/node-info`
- `POST /contacts/{public_key}/repeater/radio-settings`
- `POST /contacts/{public_key}/repeater/advert-intervals`
- `POST /contacts/{public_key}/repeater/owner-info`
@@ -183,10 +186,8 @@ app/
### Channels
- `GET /channels`
- `GET /channels/{key}/detail`
- `GET /channels/{key}`
- `POST /channels`
- `DELETE /channels/{key}`
- `POST /channels/sync`
- `POST /channels/{key}/flood-scope-override`
- `POST /channels/{key}/mark-read`
@@ -230,6 +231,7 @@ app/
- `health` — radio connection status (broadcast on change, personal on connect)
- `contact` — single contact upsert (from advertisements and radio sync)
- `contact_resolved` — prefix contact reconciled to a full contact row (payload: `{ previous_public_key, contact }`)
- `message` — new message (channel or DM, from packet processor or send endpoints)
- `message_acked` — ACK/echo update for existing message (ack count + paths)
- `raw_packet` — every incoming RF packet (for real-time packet feed UI)
@@ -270,6 +272,8 @@ Repository writes should prefer typed models such as `ContactUpsert` over ad hoc
- `flood_scope`
- `blocked_keys`, `blocked_names`
Note: `sidebar_sort_order` remains in the backend model for compatibility and migration, but the current frontend sidebar uses per-section localStorage sort preferences instead of a single shared server-backed sort mode.
Note: MQTT, community MQTT, and bot configs were migrated to the `fanout_configs` table (migrations 36-38).
## Security Posture (intentional)
@@ -349,7 +353,7 @@ tests/
The MeshCore radio protocol encodes `sender_timestamp` as a 4-byte little-endian integer (Unix seconds). This is a firmware-level wire format — the radio, the Python library (`commands/messaging.py`), and the decoder (`decoder.py`) all read/write exactly 4 bytes. Millisecond Unix timestamps would overflow 4 bytes, so higher resolution is not possible without a firmware change.
**Consequence:** The dedup index `(type, conversation_key, text, COALESCE(sender_timestamp, 0))` operates at 1-second granularity. Sending identical text to the same conversation twice within one second will hit the UNIQUE constraint on the second insert, returning HTTP 500 *after* the radio has already transmitted. The message is sent over the air but not stored in the database. Do not attempt to fix this by switching to millisecond timestamps — it will break echo dedup (the echo's 4-byte timestamp won't match the stored value) and overflow `to_bytes(4, "little")`.
**Consequence:** Channel-message dedup still operates at 1-second granularity because the radio protocol only provides second-resolution `sender_timestamp`. Do not attempt to fix this by switching to millisecond timestamps — it will break echo dedup (the echo's 4-byte timestamp won't match the stored value) and overflow `to_bytes(4, "little")`. Direct messages no longer share that channel dedup index; they are deduplicated by raw-packet identity instead so legitimate same-text same-second DMs can coexist.
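The 4-byte constraint described above can be sketched directly: second-resolution Unix timestamps round-trip through 4 little-endian bytes, while millisecond timestamps overflow. The helper names here are illustrative, not the library's actual API:

```python
def encode_sender_timestamp(unix_seconds: int) -> bytes:
    # Wire format per the note above: 4-byte little-endian Unix seconds.
    # A millisecond value exceeds 2**32 - 1 and raises OverflowError.
    return unix_seconds.to_bytes(4, "little")


def decode_sender_timestamp(raw: bytes) -> int:
    # Read back the 4-byte little-endian seconds value.
    return int.from_bytes(raw[:4], "little")
```

This is why the echo-dedup comparison only works at second granularity: the stored value and the echoed wire value are both truncated to the same 4-byte field.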
### Outgoing DM echoes remain undecrypted

app/channel_constants.py

@@ -0,0 +1,10 @@
PUBLIC_CHANNEL_KEY = "8B3387E9C5CDEA6AC9E5EDBAA115CD72"
PUBLIC_CHANNEL_NAME = "Public"
def is_public_channel_key(key: str) -> bool:
return key.upper() == PUBLIC_CHANNEL_KEY
def is_public_channel_name(name: str) -> bool:
return name.casefold() == PUBLIC_CHANNEL_NAME.casefold()


@@ -1,5 +1,7 @@
import logging
import logging.config
from collections import deque
from threading import Lock
from typing import Literal
from pydantic import model_validator
@@ -19,6 +21,7 @@ class Settings(BaseSettings):
database_path: str = "data/meshcore.db"
disable_bots: bool = False
enable_message_poll_fallback: bool = False
force_channel_slot_reconfigure: bool = False
basic_auth_username: str = ""
basic_auth_password: str = ""
@@ -66,6 +69,47 @@ class Settings(BaseSettings):
settings = Settings()
class _RingBufferLogHandler(logging.Handler):
"""Keep a bounded in-memory tail of formatted log lines."""
def __init__(self, max_lines: int = 1000) -> None:
super().__init__()
self._buffer: deque[str] = deque(maxlen=max_lines)
self._lock = Lock()
def emit(self, record: logging.LogRecord) -> None:
try:
line = self.format(record)
except Exception:
self.handleError(record)
return
with self._lock:
self._buffer.append(line)
def get_lines(self, limit: int = 1000) -> list[str]:
with self._lock:
if limit <= 0:
return []
return list(self._buffer)[-limit:]
def clear(self) -> None:
with self._lock:
self._buffer.clear()
_recent_log_handler = _RingBufferLogHandler(max_lines=1000)
def get_recent_log_lines(limit: int = 1000) -> list[str]:
"""Return recent formatted log lines from the in-memory ring buffer."""
return _recent_log_handler.get_lines(limit)
def clear_recent_log_lines() -> None:
"""Clear the in-memory log ring buffer."""
_recent_log_handler.clear()
class _RepeatSquelch(logging.Filter):
"""Suppress rapid-fire identical messages and emit a summary instead.
@@ -151,6 +195,19 @@ def setup_logging() -> None:
},
}
)
_recent_log_handler.setLevel(logging.DEBUG)
_recent_log_handler.setFormatter(
logging.Formatter(
fmt="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
datefmt="%Y-%m-%d %H:%M:%S",
)
)
for logger_name in ("", "uvicorn", "uvicorn.error", "uvicorn.access"):
target = logging.getLogger(logger_name)
if _recent_log_handler not in target.handlers:
target.addHandler(_recent_log_handler)
# Squelch repeated messages from the meshcore library (e.g. rapid-fire
# "Serial Connection started" when the port is contended).
logging.getLogger("meshcore").addFilter(_RepeatSquelch())


@@ -50,10 +50,10 @@ CREATE TABLE IF NOT EXISTS messages (
acked INTEGER DEFAULT 0,
sender_name TEXT,
sender_key TEXT
-- Deduplication: identical text + timestamp in the same conversation is treated as a
-- mesh echo/repeat. Second-precision timestamps mean two intentional identical messages
-- within the same second would collide, but this is not feasible in practice — LoRa
-- transmission takes several seconds per message, and the UI clears the input on send.
-- Deduplication: channel echoes/repeats use a channel-only unique index on
-- identical conversation/text/timestamp. Direct messages are deduplicated
-- separately via raw-packet linkage so legitimate same-text same-second DMs
-- can coexist.
-- Enforced via idx_messages_dedup_null_safe (unique index) rather than a table constraint
-- to avoid the storage overhead of SQLite's autoindex duplicating every message text.
);
@@ -91,7 +91,8 @@ CREATE TABLE IF NOT EXISTS contact_name_history (
CREATE INDEX IF NOT EXISTS idx_messages_received ON messages(received_at);
CREATE UNIQUE INDEX IF NOT EXISTS idx_messages_dedup_null_safe
ON messages(type, conversation_key, text, COALESCE(sender_timestamp, 0));
ON messages(type, conversation_key, text, COALESCE(sender_timestamp, 0))
WHERE type = 'CHAN';
CREATE INDEX IF NOT EXISTS idx_raw_packets_message_id ON raw_packets(message_id);
CREATE UNIQUE INDEX IF NOT EXISTS idx_raw_packets_payload_hash ON raw_packets(payload_hash);
CREATE INDEX IF NOT EXISTS idx_contacts_on_radio ON contacts(on_radio);


@@ -85,13 +85,8 @@ class PacketInfo:
path_hash_size: int = 1 # Bytes per hop: 1, 2, or 3
def calculate_channel_hash(channel_key: bytes) -> str:
"""
Calculate the channel hash from a 16-byte channel key.
Returns the first byte of SHA256(key) as hex.
"""
hash_bytes = hashlib.sha256(channel_key).digest()
return format(hash_bytes[0], "02x")
def _is_valid_advert_location(lat: float, lon: float) -> bool:
return -90 <= lat <= 90 and -180 <= lon <= 180
def extract_payload(raw_packet: bytes) -> bytes | None:
@@ -233,7 +228,7 @@ def try_decrypt_packet_with_channel_key(
return None
packet_channel_hash = format(packet_info.payload[0], "02x")
expected_hash = calculate_channel_hash(channel_key)
expected_hash = format(hashlib.sha256(channel_key).digest()[0], "02x")
if packet_channel_hash != expected_hash:
return None
@@ -252,7 +247,9 @@ def get_packet_payload_type(raw_packet: bytes) -> PayloadType | None:
return None
def parse_advertisement(payload: bytes) -> ParsedAdvertisement | None:
def parse_advertisement(
payload: bytes, raw_packet: bytes | None = None
) -> ParsedAdvertisement | None:
"""
Parse an advertisement payload.
@@ -308,6 +305,16 @@ def parse_advertisement(payload: bytes) -> ParsedAdvertisement | None:
lon_raw = int.from_bytes(payload[offset + 4 : offset + 8], byteorder="little", signed=True)
lat = lat_raw / 1_000_000
lon = lon_raw / 1_000_000
if not _is_valid_advert_location(lat, lon):
packet_hex = (raw_packet if raw_packet is not None else payload).hex().upper()
logger.warning(
"Dropping location data for nonsensical packet -- packet %s implies lat/lon %s/%s. Outta this world!",
packet_hex,
lat,
lon,
)
lat = None
lon = None
offset += 8
# Skip feature fields if present


@@ -28,11 +28,12 @@ logger = logging.getLogger(__name__)
# This prevents handler duplication after reconnects
_active_subscriptions: list["Subscription"] = []
_pending_acks = dm_ack_tracker._pending_acks
_buffered_acks = dm_ack_tracker._buffered_acks
def track_pending_ack(expected_ack: str, message_id: int, timeout_ms: int) -> None:
def track_pending_ack(expected_ack: str, message_id: int, timeout_ms: int) -> bool:
"""Compatibility wrapper for pending DM ACK tracking."""
dm_ack_tracker.track_pending_ack(expected_ack, message_id, timeout_ms)
return dm_ack_tracker.track_pending_ack(expected_ack, message_id, timeout_ms)
def cleanup_expired_acks() -> None:
@@ -243,7 +244,7 @@ async def on_new_contact(event: "Event") -> None:
logger.debug("New contact: %s", public_key[:12])
contact_upsert = ContactUpsert.from_radio_dict(public_key.lower(), payload, on_radio=True)
contact_upsert = ContactUpsert.from_radio_dict(public_key.lower(), payload, on_radio=False)
contact_upsert.last_seen = int(time.time())
await ContactRepository.upsert(contact_upsert)
promoted_keys = await promote_prefix_contacts_for_contact(
@@ -302,6 +303,7 @@ async def on_ack(event: "Event") -> None:
# preserving any previously known paths.
await increment_ack_and_broadcast(message_id=message_id, broadcast_fn=broadcast_event)
else:
dm_ack_tracker.buffer_unmatched_ack(ack_code)
logger.debug("ACK code %s does not match any pending messages", ack_code)


@@ -82,7 +82,7 @@ Push notifications via Apprise library. Config blob:
### sqs (sqs.py)
Amazon SQS delivery. Config blob:
- `queue_url` — target queue URL
- `region_name` (optional), `endpoint_url` (optional)
- `region_name` (optional; inferred from standard AWS SQS queue URLs when omitted), `endpoint_url` (optional)
- `access_key_id`, `secret_access_key`, `session_token` (all optional; blank uses the normal AWS credential chain)
- Publishes a JSON envelope of the form `{"event_type":"message"|"raw_packet","data":...}`
- Supports both decoded messages and raw packets via normal scope selection
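The documented envelope shape can be sketched as follows; this is a hedged approximation of what the module publishes (the exact serializer options in `sqs.py` may differ):

```python
import json


def build_envelope(data: dict, event_type: str) -> str:
    # Produces the documented {"event_type": ..., "data": ...} JSON shape.
    # Compact separators are an assumption, not confirmed from the source.
    return json.dumps(
        {"event_type": event_type, "data": data},
        separators=(",", ":"),
    )
```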


@@ -7,6 +7,7 @@ import hashlib
import json
import logging
from functools import partial
from urllib.parse import urlparse
import boto3
from botocore.exceptions import BotoCoreError, ClientError
@@ -28,6 +29,24 @@ def _build_payload(data: dict, *, event_type: str) -> str:
)
def _infer_region_from_queue_url(queue_url: str) -> str | None:
"""Infer AWS region from a standard SQS queue URL host when possible."""
host = urlparse(queue_url).hostname or ""
if not host:
return None
parts = host.split(".")
if len(parts) < 4 or parts[0] != "sqs":
return None
if parts[2] != "amazonaws":
return None
if parts[3] not in {"com", "com.cn"}:
return None
region = parts[1].strip()
return region or None
def _is_fifo_queue(queue_url: str) -> bool:
"""Return True when the configured queue URL points at an SQS FIFO queue."""
return queue_url.rstrip("/").endswith(".fifo")
@@ -69,12 +88,15 @@ class SqsModule(FanoutModule):
async def start(self) -> None:
kwargs: dict[str, str] = {}
queue_url = str(self.config.get("queue_url", "")).strip()
region_name = str(self.config.get("region_name", "")).strip()
endpoint_url = str(self.config.get("endpoint_url", "")).strip()
access_key_id = str(self.config.get("access_key_id", "")).strip()
secret_access_key = str(self.config.get("secret_access_key", "")).strip()
session_token = str(self.config.get("session_token", "")).strip()
if not region_name:
region_name = _infer_region_from_queue_url(queue_url) or ""
if region_name:
kwargs["region_name"] = region_name
if endpoint_url:


@@ -10,6 +10,10 @@ logger = logging.getLogger(__name__)
INDEX_CACHE_CONTROL = "no-store"
ASSET_CACHE_CONTROL = "public, max-age=31536000, immutable"
STATIC_FILE_CACHE_CONTROL = "public, max-age=3600"
FRONTEND_BUILD_INSTRUCTIONS = (
"Run 'cd frontend && npm install && npm run build', "
"or use a release zip that includes frontend/prebuilt."
)
class CacheControlStaticFiles(StaticFiles):
@@ -48,40 +52,38 @@ def _resolve_request_origin(request: Request) -> str:
return str(request.base_url).rstrip("/")
def register_frontend_static_routes(app: FastAPI, frontend_dir: Path) -> bool:
"""Register frontend static file routes if a built frontend is available.
Returns True when routes are registered, False when frontend files are
missing/incomplete. Missing frontend files are logged but are not fatal.
"""
def _validate_frontend_dir(frontend_dir: Path, *, log_failures: bool = True) -> tuple[bool, Path]:
"""Resolve and validate a built frontend directory."""
frontend_dir = frontend_dir.resolve()
index_file = frontend_dir / "index.html"
assets_dir = frontend_dir / "assets"
if not frontend_dir.exists():
logger.error(
"Frontend build directory not found at %s. "
"Run 'cd frontend && npm run build'. API will continue without frontend routes.",
frontend_dir,
)
return False
if log_failures:
logger.error("Frontend build directory not found at %s.", frontend_dir)
return False, frontend_dir
if not frontend_dir.is_dir():
logger.error(
"Frontend build path is not a directory: %s. "
"API will continue without frontend routes.",
frontend_dir,
)
return False
if log_failures:
logger.error("Frontend build path is not a directory: %s.", frontend_dir)
return False, frontend_dir
if not index_file.exists():
logger.error(
"Frontend index file not found at %s. "
"Run 'cd frontend && npm run build'. API will continue without frontend routes.",
index_file,
)
if log_failures:
logger.error("Frontend index file not found at %s.", index_file)
return False, frontend_dir
return True, frontend_dir
def register_frontend_static_routes(app: FastAPI, frontend_dir: Path) -> bool:
"""Register frontend static file routes if a built frontend is available."""
valid, frontend_dir = _validate_frontend_dir(frontend_dir)
if not valid:
return False
index_file = frontend_dir / "index.html"
assets_dir = frontend_dir / "assets"
if assets_dir.exists() and assets_dir.is_dir():
app.mount(
"/assets",
@@ -157,6 +159,30 @@ def register_frontend_static_routes(app: FastAPI, frontend_dir: Path) -> bool:
return True
def register_first_available_frontend_static_routes(
app: FastAPI, frontend_dirs: list[Path]
) -> Path | None:
"""Register frontend routes from the first valid build directory."""
for i, candidate in enumerate(frontend_dirs):
valid, resolved_candidate = _validate_frontend_dir(candidate, log_failures=False)
if not valid:
continue
if register_frontend_static_routes(app, resolved_candidate):
logger.info("Selected frontend build directory %s", resolved_candidate)
return resolved_candidate
if i < len(frontend_dirs) - 1:
logger.warning("Frontend build at %s was unusable; trying fallback", resolved_candidate)
logger.error(
"No usable frontend build found. Searched: %s. %s API will continue without frontend routes.",
", ".join(str(path.resolve()) for path in frontend_dirs),
FRONTEND_BUILD_INSTRUCTIONS,
)
return None
def register_frontend_missing_fallback(app: FastAPI) -> None:
"""Register a fallback route that tells the user to build the frontend."""
@@ -164,5 +190,5 @@ def register_frontend_missing_fallback(app: FastAPI) -> None:
async def frontend_not_built():
return JSONResponse(
status_code=404,
content={"detail": "Frontend not built. Run: cd frontend && npm ci && npm run build"},
content={"detail": f"Frontend not built. {FRONTEND_BUILD_INSTRUCTIONS}"},
)


@@ -30,6 +30,16 @@ _private_key: bytes | None = None
_public_key: bytes | None = None
def clear_keys() -> None:
"""Clear any stored private/public key material from memory."""
global _private_key, _public_key
had_key = _private_key is not None or _public_key is not None
_private_key = None
_public_key = None
if had_key:
logger.info("Cleared in-memory keystore")
def set_private_key(key: bytes) -> None:
"""Store the private key in memory and derive the public key.


@@ -11,7 +11,10 @@ from fastapi.responses import JSONResponse
from app.config import settings as server_settings
from app.config import setup_logging
from app.database import db
from app.frontend_static import register_frontend_missing_fallback, register_frontend_static_routes
from app.frontend_static import (
register_first_available_frontend_static_routes,
register_frontend_missing_fallback,
)
from app.radio import RadioDisconnectedError
from app.radio_sync import (
stop_message_polling,
@@ -21,6 +24,7 @@ from app.radio_sync import (
from app.routers import (
channels,
contacts,
debug,
fanout,
health,
messages,
@@ -136,6 +140,7 @@ async def radio_disconnected_handler(request: Request, exc: RadioDisconnectedErr
# API routes - all prefixed with /api for production compatibility
app.include_router(health.router, prefix="/api")
app.include_router(debug.router, prefix="/api")
app.include_router(fanout.router, prefix="/api")
app.include_router(radio.router, prefix="/api")
app.include_router(contacts.router, prefix="/api")
@@ -149,6 +154,9 @@ app.include_router(statistics.router, prefix="/api")
app.include_router(ws.router, prefix="/api")
# Serve frontend static files in production
FRONTEND_DIR = Path(__file__).parent.parent / "frontend" / "dist"
if not register_frontend_static_routes(app, FRONTEND_DIR):
FRONTEND_DIST_DIR = Path(__file__).parent.parent / "frontend" / "dist"
FRONTEND_PREBUILT_DIR = Path(__file__).parent.parent / "frontend" / "prebuilt"
if not register_first_available_frontend_static_routes(
app, [FRONTEND_DIST_DIR, FRONTEND_PREBUILT_DIR]
):
register_frontend_missing_fallback(app)


@@ -331,6 +331,13 @@ async def run_migrations(conn: aiosqlite.Connection) -> int:
await set_version(conn, 42)
applied += 1
# Migration 43: Limit message dedup index to channel messages only
if version < 43:
logger.info("Applying migration 43: narrow message dedup index to channels")
await _migrate_043_split_message_dedup_by_type(conn)
await set_version(conn, 43)
applied += 1
if applied > 0:
logger.info(
"Applied %d migration(s), schema now at version %d", applied, await get_version(conn)
@@ -2443,3 +2450,29 @@ async def _migrate_042_add_channel_flood_scope_override(conn: aiosqlite.Connecti
raise
await conn.commit()
async def _migrate_043_split_message_dedup_by_type(conn: aiosqlite.Connection) -> None:
"""Restrict the message dedup index to channel messages."""
cursor = await conn.execute(
"SELECT name FROM sqlite_master WHERE type='table' AND name='messages'"
)
if await cursor.fetchone() is None:
await conn.commit()
return
cursor = await conn.execute("PRAGMA table_info(messages)")
columns = {row[1] for row in await cursor.fetchall()}
required_columns = {"type", "conversation_key", "text", "sender_timestamp"}
if not required_columns.issubset(columns):
logger.debug("messages table missing dedup-index columns, skipping migration 43")
await conn.commit()
return
await conn.execute("DROP INDEX IF EXISTS idx_messages_dedup_null_safe")
await conn.execute(
"""CREATE UNIQUE INDEX IF NOT EXISTS idx_messages_dedup_null_safe
ON messages(type, conversation_key, text, COALESCE(sender_timestamp, 0))
WHERE type = 'CHAN'"""
)
await conn.commit()


@@ -347,6 +347,12 @@ class MessagesAroundResponse(BaseModel):
has_newer: bool
class ResendChannelMessageResponse(BaseModel):
status: str
message_id: int
message: Message | None = None
class RawPacketDecryptedInfo(BaseModel):
"""Decryption info for a raw packet (when successfully decrypted)."""
@@ -404,6 +410,11 @@ class RepeaterLoginResponse(BaseModel):
"""Response from repeater login."""
status: str = Field(description="Login result status")
authenticated: bool = Field(description="Whether repeater authentication was confirmed")
message: str | None = Field(
default=None,
description="Optional warning or error message when authentication was not confirmed",
)
class RepeaterStatusResponse(BaseModel):
@@ -428,8 +439,17 @@ class RepeaterStatusResponse(BaseModel):
full_events: int = Field(description="Full event queue count")
class RepeaterNodeInfoResponse(BaseModel):
"""Identity/location info from a repeater (small CLI batch)."""
name: str | None = Field(default=None, description="Repeater name")
lat: str | None = Field(default=None, description="Latitude")
lon: str | None = Field(default=None, description="Longitude")
clock_utc: str | None = Field(default=None, description="Repeater clock in UTC")
class RepeaterRadioSettingsResponse(BaseModel):
"""Radio settings from a repeater (batch CLI get commands)."""
"""Radio settings from a repeater (radio/config CLI batch)."""
firmware_version: str | None = Field(default=None, description="Firmware version string")
radio: str | None = Field(default=None, description="Radio settings (freq,bw,sf,cr)")
@@ -437,10 +457,6 @@ class RepeaterRadioSettingsResponse(BaseModel):
airtime_factor: str | None = Field(default=None, description="Airtime factor")
repeat_enabled: str | None = Field(default=None, description="Repeat mode enabled")
flood_max: str | None = Field(default=None, description="Max flood hops")
name: str | None = Field(default=None, description="Repeater name")
lat: str | None = Field(default=None, description="Latitude")
lon: str | None = Field(default=None, description="Longitude")
clock_utc: str | None = Field(default=None, description="Repeater clock in UTC")
class RepeaterAdvertIntervalsResponse(BaseModel):
@@ -519,6 +535,30 @@ class TraceResponse(BaseModel):
path_len: int = Field(description="Number of hops in the trace path")
class PathDiscoveryRoute(BaseModel):
"""One resolved route returned by contact path discovery."""
path: str = Field(description="Hex-encoded path bytes")
path_len: int = Field(description="Hop count for this route")
path_hash_mode: int = Field(
description="Path hash mode (0=1-byte, 1=2-byte, 2=3-byte hop identifiers)"
)
class PathDiscoveryResponse(BaseModel):
"""Round-trip routing data for a contact path discovery request."""
contact: Contact = Field(
description="Updated contact row after saving the learned forward path"
)
forward_path: PathDiscoveryRoute = Field(
description="Route used from the local radio to the target contact"
)
return_path: PathDiscoveryRoute = Field(
description="Route used from the target contact back to the local radio"
)
class CommandRequest(BaseModel):
"""Request to send a CLI command to a repeater."""
@@ -535,6 +575,48 @@ class CommandResponse(BaseModel):
)
class RadioDiscoveryRequest(BaseModel):
"""Request to discover nearby mesh nodes from the local radio."""
target: Literal["repeaters", "sensors", "all"] = Field(
default="all",
description="Which node classes to discover over the mesh",
)
class RadioDiscoveryResult(BaseModel):
"""One mesh node heard during a discovery sweep."""
public_key: str = Field(description="Discovered node public key as hex")
node_type: Literal["repeater", "sensor"] = Field(description="Discovered node class")
heard_count: int = Field(default=1, description="How many responses were heard from this node")
local_snr: float | None = Field(
default=None,
description="SNR at which the local radio heard the response (dB)",
)
local_rssi: int | None = Field(
default=None,
description="RSSI at which the local radio heard the response (dBm)",
)
remote_snr: float | None = Field(
default=None,
description="SNR reported by the remote node while hearing our discovery request (dB)",
)
class RadioDiscoveryResponse(BaseModel):
"""Response payload for a mesh discovery sweep."""
target: Literal["repeaters", "sensors", "all"] = Field(
description="Which node classes were requested"
)
duration_seconds: float = Field(description="How long the sweep listened for responses")
results: list[RadioDiscoveryResult] = Field(
default_factory=list,
description="Deduplicated discovery responses heard during the sweep",
)
class Favorite(BaseModel):
"""A favorite conversation."""
@@ -554,6 +636,9 @@ class UnreadCounts(BaseModel):
last_message_times: dict[str, int] = Field(
default_factory=dict, description="Map of stateKey -> last message timestamp"
)
last_read_ats: dict[str, int | None] = Field(
default_factory=dict, description="Map of stateKey -> server-side last_read_at boundary"
)
class AppSettings(BaseModel):


@@ -425,7 +425,7 @@ async def _process_advertisement(
logger.debug("Failed to parse advertisement packet")
return
advert = parse_advertisement(packet_info.payload)
advert = parse_advertisement(packet_info.payload, raw_packet=raw_bytes)
if not advert:
logger.debug("Failed to parse advertisement payload")
return
@@ -442,8 +442,8 @@ async def _process_advertisement(
PATH_FRESHNESS_SECONDS = 60
use_existing_path = False
if existing and existing.last_seen:
path_age = timestamp - existing.last_seen
if existing and existing.last_advert:
path_age = timestamp - existing.last_advert
existing_path_len = existing.last_path_len if existing.last_path_len >= 0 else float("inf")
# Keep existing path if it's fresh and shorter (or equal)
@@ -477,8 +477,9 @@ async def _process_advertisement(
path_len,
)
# Use device_role from advertisement for contact type (1=Chat, 2=Repeater, 3=Room, 4=Sensor)
# Use advert.timestamp for last_advert (sender's timestamp), receive timestamp for last_seen
# Use device_role from advertisement for contact type (1=Chat, 2=Repeater, 3=Room, 4=Sensor).
# Persist advert freshness fields using the server receive wall clock so
# route selection is not affected by sender clock skew.
contact_type = (
advert.device_role if advert.device_role > 0 else (existing.type if existing else 0)
)
@@ -498,7 +499,7 @@ async def _process_advertisement(
type=contact_type,
lat=advert.lat,
lon=advert.lon,
last_advert=advert.timestamp if advert.timestamp > 0 else timestamp,
last_advert=timestamp,
last_seen=timestamp,
last_path=path_hex,
last_path_len=path_len,


@@ -2,12 +2,14 @@ import asyncio
import glob
import logging
import platform
from collections import OrderedDict
from contextlib import asynccontextmanager, nullcontext
from pathlib import Path
from meshcore import MeshCore
from app.config import settings
from app.keystore import clear_keys
logger = logging.getLogger(__name__)
@@ -129,8 +131,17 @@ class RadioManager:
self._setup_lock: asyncio.Lock | None = None
self._setup_in_progress: bool = False
self._setup_complete: bool = False
self.device_info_loaded: bool = False
self.max_contacts: int | None = None
self.device_model: str | None = None
self.firmware_build: str | None = None
self.firmware_version: str | None = None
self.max_channels: int = 40
self.path_hash_mode: int = 0
self.path_hash_mode_supported: bool = False
self._channel_slot_by_key: OrderedDict[str, int] = OrderedDict()
self._channel_key_by_slot: dict[int, str] = {}
self._pending_message_channel_key_by_slot: dict[int, str] = {}
async def _acquire_operation_lock(
self,
@@ -223,6 +234,121 @@ class RadioManager:
await run_post_connect_setup(self)
def reset_channel_send_cache(self) -> None:
"""Forget any session-local channel-slot reuse state."""
self._channel_slot_by_key.clear()
self._channel_key_by_slot.clear()
def remember_pending_message_channel_slot(self, channel_key: str, slot: int) -> None:
"""Remember a channel key for later queued-message recovery."""
self._pending_message_channel_key_by_slot[slot] = channel_key.upper()
def get_pending_message_channel_key(self, slot: int) -> str | None:
"""Return the last remembered channel key for a radio slot."""
return self._pending_message_channel_key_by_slot.get(slot)
def clear_pending_message_channel_slots(self) -> None:
"""Drop any queued-message recovery slot metadata."""
self._pending_message_channel_key_by_slot.clear()
def channel_slot_reuse_enabled(self) -> bool:
"""Return whether this transport can safely reuse cached channel slots."""
if settings.force_channel_slot_reconfigure:
return False
if self._connection_info:
return not self._connection_info.startswith("TCP:")
return settings.connection_type != "tcp"
def get_channel_send_cache_capacity(self) -> int:
"""Return the app-managed channel cache capacity for the current session."""
try:
return max(1, int(self.max_channels))
except (TypeError, ValueError):
return 1
def get_cached_channel_slot(self, channel_key: str) -> int | None:
"""Return the cached radio slot for a channel key, if present."""
return self._channel_slot_by_key.get(channel_key.upper())
def plan_channel_send_slot(
self,
channel_key: str,
*,
preferred_slot: int = 0,
) -> tuple[int, bool, str | None]:
"""Choose a radio slot for a channel send.
Returns `(slot, needs_configure, evicted_channel_key)`.
"""
if not self.channel_slot_reuse_enabled():
return preferred_slot, True, None
normalized_key = channel_key.upper()
cached_slot = self._channel_slot_by_key.get(normalized_key)
if cached_slot is not None:
return cached_slot, False, None
capacity = self.get_channel_send_cache_capacity()
if len(self._channel_slot_by_key) < capacity:
slot = self._find_first_free_channel_slot(capacity, preferred_slot)
return slot, True, None
evicted_key, slot = next(iter(self._channel_slot_by_key.items()))
return slot, True, evicted_key
def note_channel_slot_loaded(self, channel_key: str, slot: int) -> None:
"""Record that a channel is now resident in the given radio slot."""
if not self.channel_slot_reuse_enabled():
return
normalized_key = channel_key.upper()
previous_slot = self._channel_slot_by_key.pop(normalized_key, None)
if previous_slot is not None and previous_slot != slot:
self._channel_key_by_slot.pop(previous_slot, None)
displaced_key = self._channel_key_by_slot.get(slot)
if displaced_key is not None and displaced_key != normalized_key:
self._channel_slot_by_key.pop(displaced_key, None)
self._channel_key_by_slot[slot] = normalized_key
self._channel_slot_by_key[normalized_key] = slot
def note_channel_slot_used(self, channel_key: str) -> None:
"""Refresh LRU order for a previously loaded channel slot."""
if not self.channel_slot_reuse_enabled():
return
normalized_key = channel_key.upper()
slot = self._channel_slot_by_key.get(normalized_key)
if slot is None:
return
self._channel_slot_by_key.move_to_end(normalized_key)
self._channel_key_by_slot[slot] = normalized_key
def invalidate_cached_channel_slot(self, channel_key: str) -> None:
"""Drop any cached slot assignment for a channel key."""
normalized_key = channel_key.upper()
slot = self._channel_slot_by_key.pop(normalized_key, None)
if slot is None:
return
if self._channel_key_by_slot.get(slot) == normalized_key:
self._channel_key_by_slot.pop(slot, None)
def get_channel_send_cache_snapshot(self) -> list[tuple[str, int]]:
"""Return the current channel send cache contents in LRU order."""
return list(self._channel_slot_by_key.items())
def _find_first_free_channel_slot(self, capacity: int, preferred_slot: int) -> int:
"""Pick the first unclaimed app-managed slot, preferring the requested slot."""
if preferred_slot < capacity and preferred_slot not in self._channel_key_by_slot:
return preferred_slot
for slot in range(capacity):
if slot not in self._channel_key_by_slot:
return slot
return preferred_slot
@property
def meshcore(self) -> MeshCore | None:
return self._meshcore
@@ -358,6 +484,7 @@ class RadioManager:
async def disconnect(self) -> None:
"""Disconnect from the radio."""
clear_keys()
if self._meshcore is not None:
logger.debug("Disconnecting from radio")
mc = self._meshcore
@@ -366,8 +493,16 @@ class RadioManager:
await self._disable_meshcore_auto_reconnect(mc)
self._meshcore = None
self._setup_complete = False
self.device_info_loaded = False
self.max_contacts = None
self.device_model = None
self.firmware_build = None
self.firmware_version = None
self.max_channels = 40
self.path_hash_mode = 0
self.path_hash_mode_supported = False
self.reset_channel_send_cache()
self.clear_pending_message_channel_slots()
logger.debug("Radio disconnected")
async def reconnect(self, *, broadcast_on_success: bool = True) -> bool:


@@ -17,6 +17,7 @@ from contextlib import asynccontextmanager
from meshcore import EventType, MeshCore
from app.channel_constants import PUBLIC_CHANNEL_KEY, PUBLIC_CHANNEL_NAME
from app.config import settings
from app.event_handlers import cleanup_expired_acks
from app.models import Contact, ContactUpsert
@@ -28,11 +29,14 @@ from app.repository import (
ContactRepository,
)
from app.services.contact_reconciliation import reconcile_contact_messages
from app.services.messages import create_fallback_channel_message
from app.services.radio_runtime import radio_runtime as radio_manager
from app.websocket import broadcast_error
from app.websocket import broadcast_error, broadcast_event
logger = logging.getLogger(__name__)
DEFAULT_MAX_CHANNELS = 40
def _contact_sync_debug_fields(contact: Contact) -> dict[str, object]:
"""Return key contact fields for sync failure diagnostics."""
@@ -48,7 +52,6 @@ def _contact_sync_debug_fields(contact: Contact) -> dict[str, object]:
"last_advert": contact.last_advert,
"lat": contact.lat,
"lon": contact.lon,
"on_radio": contact.on_radio,
}
@@ -98,6 +101,20 @@ async def upsert_channel_from_radio_slot(payload: dict, *, on_radio: bool) -> st
return key_hex
def get_radio_channel_limit(max_channels: int | None = None) -> int:
"""Return the effective channel-slot limit for the connected firmware."""
discovered = getattr(radio_manager, "max_channels", DEFAULT_MAX_CHANNELS)
try:
limit = max(1, int(discovered))
except (TypeError, ValueError):
limit = DEFAULT_MAX_CHANNELS
if max_channels is not None:
return min(limit, max(1, int(max_channels)))
return limit
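The limit resolution in `get_radio_channel_limit` can be exercised standalone; a minimal sketch of the same clamping logic, assuming `DEFAULT_MAX_CHANNELS = 40` (the constant defined above) and treating the firmware-discovered value as arbitrary input:

```python
DEFAULT_MAX_CHANNELS = 40

def effective_channel_limit(discovered, max_channels=None):
    # Clamp the firmware-discovered slot count to a positive integer,
    # falling back to the default when the value is missing or malformed.
    try:
        limit = max(1, int(discovered))
    except (TypeError, ValueError):
        limit = DEFAULT_MAX_CHANNELS
    # An explicit caller-supplied cap can only shrink the limit further.
    if max_channels is not None:
        return min(limit, max(1, int(max_channels)))
    return limit

print(effective_channel_limit(None))    # malformed input falls back to 40
print(effective_channel_limit(8, 40))   # smaller firmware limit wins: 8
print(effective_channel_limit(60, 40))  # caller cap wins: 40
```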
# Message poll task handle
_message_poll_task: asyncio.Task | None = None
@@ -285,7 +302,7 @@ async def sync_and_offload_contacts(mc: MeshCore) -> dict:
return {"synced": synced, "removed": removed}
async def sync_and_offload_channels(mc: MeshCore) -> dict:
async def sync_and_offload_channels(mc: MeshCore, max_channels: int | None = None) -> dict:
"""
Sync channels from radio to database, then clear them from radio.
Returns counts of synced and cleared channels.
@@ -294,8 +311,11 @@ async def sync_and_offload_channels(mc: MeshCore) -> dict:
cleared = 0
try:
# Check all 40 channel slots
for idx in range(40):
radio_manager.reset_channel_send_cache()
channel_limit = get_radio_channel_limit(max_channels)
# Check all available channel slots for this firmware variant
for idx in range(channel_limit):
result = await mc.commands.get_channel(idx)
if result.type != EventType.CHANNEL_INFO:
@@ -308,6 +328,7 @@ async def sync_and_offload_channels(mc: MeshCore) -> dict:
if key_hex is None:
continue
radio_manager.remember_pending_message_channel_slot(key_hex, idx)
synced += 1
logger.debug("Synced channel %s: %s", key_hex[:8], result.payload.get("channel_name"))
@@ -334,6 +355,87 @@ async def sync_and_offload_channels(mc: MeshCore) -> dict:
return {"synced": synced, "cleared": cleared}
def _split_channel_sender_and_text(text: str) -> tuple[str | None, str]:
"""Parse the canonical MeshCore "<sender>: <message>" channel text format."""
sender = None
message_text = text
colon_idx = text.find(": ")
if 0 < colon_idx < 50:
potential_sender = text[:colon_idx]
if not any(char in potential_sender for char in ":[]\x00"):
sender = potential_sender
message_text = text[colon_idx + 2 :]
return sender, message_text
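Since `_split_channel_sender_and_text` is pure, its behavior is easy to check in isolation; a standalone copy of the same logic:

```python
def split_channel_sender_and_text(text):
    # Parse the canonical MeshCore "<sender>: <message>" channel text format.
    sender = None
    message_text = text
    colon_idx = text.find(": ")
    # Only treat a short prefix free of reserved characters as a sender name.
    if 0 < colon_idx < 50:
        potential_sender = text[:colon_idx]
        if not any(char in potential_sender for char in ":[]\x00"):
            sender = potential_sender
            message_text = text[colon_idx + 2:]
    return sender, message_text

print(split_channel_sender_and_text("alice: hi there"))  # ('alice', 'hi there')
print(split_channel_sender_and_text("no sender here"))   # (None, 'no sender here')
print(split_channel_sender_and_text("a:b: c"))           # embedded colon rejects the prefix
```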
async def _resolve_channel_for_pending_message(
mc: MeshCore,
channel_idx: int,
) -> tuple[str | None, str | None]:
"""Resolve a pending channel message's slot to a channel key and name."""
try:
result = await mc.commands.get_channel(channel_idx)
except Exception as exc:
logger.debug("Failed to fetch channel slot %s for pending message: %s", channel_idx, exc)
else:
if result.type == EventType.CHANNEL_INFO:
key_hex = await upsert_channel_from_radio_slot(result.payload, on_radio=False)
if key_hex is not None:
radio_manager.remember_pending_message_channel_slot(key_hex, channel_idx)
return key_hex, result.payload.get("channel_name") or None
current_slot_map = getattr(radio_manager, "_channel_key_by_slot", {})
cached_key = current_slot_map.get(channel_idx)
if cached_key is None:
cached_key = radio_manager.get_pending_message_channel_key(channel_idx)
if cached_key is None:
return None, None
channel = await ChannelRepository.get_by_key(cached_key)
return cached_key, channel.name if channel else None
async def _store_pending_channel_message(mc: MeshCore, payload: dict) -> None:
"""Persist a CHANNEL_MSG_RECV event pulled via get_msg()."""
channel_idx = payload.get("channel_idx")
if channel_idx is None:
logger.warning("Pending channel message missing channel_idx; dropping payload")
return
try:
normalized_channel_idx = int(channel_idx)
except (TypeError, ValueError):
logger.warning("Pending channel message had invalid channel_idx=%r", channel_idx)
return
channel_key, channel_name = await _resolve_channel_for_pending_message(
mc, normalized_channel_idx
)
if channel_key is None:
logger.warning(
"Could not resolve channel slot %d for pending message; message cannot be stored",
normalized_channel_idx,
)
return
received_at = int(time.time())
sender_timestamp = payload.get("sender_timestamp") or received_at
sender_name, message_text = _split_channel_sender_and_text(payload.get("text", ""))
await create_fallback_channel_message(
conversation_key=channel_key,
message_text=message_text,
sender_timestamp=sender_timestamp,
received_at=received_at,
path=payload.get("path"),
path_len=payload.get("path_len"),
txt_type=payload.get("txt_type", 0),
sender_name=sender_name,
channel_name=channel_name,
broadcast_fn=broadcast_event,
)
async def ensure_default_channels() -> None:
"""
Ensure default channels exist in the database.
@@ -342,16 +444,13 @@ async def ensure_default_channels() -> None:
This seeds the canonical Public channel row in the database if it is missing
or misnamed. It does not make the channel undeletable through the router.
"""
# Public channel - no hashtag, specific well-known key
PUBLIC_CHANNEL_KEY_HEX = "8B3387E9C5CDEA6AC9E5EDBAA115CD72"
# Check by KEY (not name) since that's what's fixed
existing = await ChannelRepository.get_by_key(PUBLIC_CHANNEL_KEY_HEX)
if not existing or existing.name != "Public":
existing = await ChannelRepository.get_by_key(PUBLIC_CHANNEL_KEY)
if not existing or existing.name != PUBLIC_CHANNEL_NAME:
logger.info("Ensuring default Public channel exists with correct name")
await ChannelRepository.upsert(
key=PUBLIC_CHANNEL_KEY_HEX,
name="Public",
key=PUBLIC_CHANNEL_KEY,
name=PUBLIC_CHANNEL_NAME,
is_hashtag=False,
on_radio=existing.on_radio if existing else False,
)
@@ -361,16 +460,19 @@ async def sync_and_offload_all(mc: MeshCore) -> dict:
"""Sync and offload both contacts and channels, then ensure defaults exist."""
logger.info("Starting full radio sync and offload")
# Contact on_radio is legacy/stale metadata. Clear it during the offload/reload
# cycle so old rows stop claiming radio residency we do not actively track.
await ContactRepository.clear_on_radio_except([])
contacts_result = await sync_and_offload_contacts(mc)
channels_result = await sync_and_offload_channels(mc)
# Ensure default channels exist
await ensure_default_channels()
# Reload favorites plus a working-set fill back onto the radio immediately
# so they do not stay in on_radio=False limbo after offload. Pass mc directly
# since the caller already holds the radio operation lock (asyncio.Lock is not
# reentrant).
# Reload favorites plus a working-set fill back onto the radio immediately.
# Pass mc directly since the caller already holds the radio operation lock
# (asyncio.Lock is not reentrant).
reload_result = await sync_recent_contacts_to_radio(force=True, mc=mc)
return {
@@ -400,6 +502,8 @@ async def drain_pending_messages(mc: MeshCore) -> int:
logger.debug("Error during message drain: %s", result.payload)
break
elif result.type in (EventType.CONTACT_MSG_RECV, EventType.CHANNEL_MSG_RECV):
if result.type == EventType.CHANNEL_MSG_RECV:
await _store_pending_channel_message(mc, result.payload)
count += 1
# Small delay between fetches
@@ -435,6 +539,8 @@ async def poll_for_messages(mc: MeshCore) -> int:
elif result.type == EventType.ERROR:
return 0
elif result.type in (EventType.CONTACT_MSG_RECV, EventType.CHANNEL_MSG_RECV):
if result.type == EventType.CHANNEL_MSG_RECV:
await _store_pending_channel_message(mc, result.payload)
count += 1
# If we got a message, there might be more - drain them
count += await drain_pending_messages(mc)
@@ -447,6 +553,63 @@ async def poll_for_messages(mc: MeshCore) -> int:
return count
def _normalize_channel_secret(payload: dict) -> bytes:
"""Return a normalized bytes representation of a radio channel secret."""
secret = payload.get("channel_secret", b"")
if isinstance(secret, bytes):
return secret
return bytes(secret)
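A standalone sketch of the normalization above, under the assumption (implied by the fallback branch) that the radio payload may carry the secret as a sequence of ints rather than bytes:

```python
def normalize_channel_secret(payload):
    # Normalize bytes-or-int-sequence secrets to bytes so hex
    # comparison against stored channel keys is uniform.
    secret = payload.get("channel_secret", b"")
    if isinstance(secret, bytes):
        return secret
    return bytes(secret)

print(normalize_channel_secret({"channel_secret": [0x8B, 0x33]}).hex().upper())  # 8B33
print(normalize_channel_secret({}))  # missing key yields empty bytes
```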
async def audit_channel_send_cache(mc: MeshCore) -> bool:
"""Verify cached send-slot expectations still match radio channel contents.
If a mismatch is detected, the app's send-slot cache is reset so future sends
fall back to reloading channels before reuse resumes.
"""
if not radio_manager.channel_slot_reuse_enabled():
return True
cached_slots = radio_manager.get_channel_send_cache_snapshot()
if not cached_slots:
return True
mismatches: list[str] = []
for channel_key, slot in cached_slots:
result = await mc.commands.get_channel(slot)
if result.type != EventType.CHANNEL_INFO:
mismatches.append(
f"slot {slot}: expected {channel_key[:8]} but radio returned {result.type}"
)
continue
observed_name = result.payload.get("channel_name") or ""
observed_key = _normalize_channel_secret(result.payload).hex().upper()
expected_channel = await ChannelRepository.get_by_key(channel_key)
expected_name = expected_channel.name if expected_channel is not None else None
if observed_key != channel_key or expected_name is None or observed_name != expected_name:
mismatches.append(
f"slot {slot}: expected {expected_name or '(missing db row)'} "
f"{channel_key[:8]}, got {observed_name or '(empty)'} {observed_key[:8]}"
)
if not mismatches:
return True
logger.error(
        "[RADIO SYNC ERROR] A periodic radio audit found the channel send-slot cache out of sync with radio state. Some other system, internal or external to the radio, has updated the radio's channel slots, which the app assumes it has exclusive control over (except on TCP-linked devices). The cache is resetting now, but review README.md and consider setting the environment variable MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE=true to switch to non-optimistic channel management, which force-writes the channel to the radio before each send. This is a minor performance hit but guarantees consistency. Mismatches found: %s",
"; ".join(mismatches),
)
radio_manager.reset_channel_send_cache()
broadcast_error(
"A periodic poll task has discovered radio inconsistencies.",
"Please check the logs for recommendations (search "
"'MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE').",
)
return False
async def _message_poll_loop():
"""Background task that periodically polls for messages."""
while True:
@@ -464,6 +627,7 @@ async def _message_poll_loop():
suspend_auto_fetch=True,
) as mc:
count = await poll_for_messages(mc)
await audit_channel_send_cache(mc)
if count > 0:
if aggressive_fallback:
logger.warning(
@@ -472,10 +636,10 @@ async def _message_poll_loop():
)
else:
logger.error(
"Periodic radio audit caught %d message(s) that were not "
"surfaced via event subscription. See README and consider "
"[RADIO SYNC ERROR] Periodic radio audit caught %d message(s) that were not "
"surfaced via event subscription. This means the event stream we rely on for awareness of new contacts, messages, etc. is not delivering everything. A fallback method is available; see README.md and consider "
"setting MESHCORE_ENABLE_MESSAGE_POLL_FALLBACK=true to "
"enable more frequent polling.",
"enable active radio polling every few seconds.",
count,
)
broadcast_error(
@@ -699,20 +863,8 @@ _last_contact_sync: float = 0.0
CONTACT_SYNC_THROTTLE_SECONDS = 30 # Don't sync more than once per 30 seconds
async def _sync_contacts_to_radio_inner(mc: MeshCore) -> dict:
"""
Core logic for loading contacts onto the radio.
Fill order is:
1. Favorite contacts
2. Most recently interacted-with non-repeaters
3. Most recently advert-heard non-repeaters without interaction history
Favorite contacts are always reloaded first, up to the configured capacity.
Additional non-favorite fill stops at the refill target (80% of capacity).
Caller must hold the radio operation lock and pass a valid MeshCore instance.
"""
async def get_contacts_selected_for_radio_sync() -> list[Contact]:
"""Return the contacts that would be loaded onto the radio right now."""
app_settings = await AppSettingsRepository.get()
max_contacts = app_settings.max_radio_contacts
refill_target, _full_sync_trigger = _compute_radio_contact_limits(max_contacts)
@@ -779,6 +931,24 @@ async def _sync_contacts_to_radio_inner(mc: MeshCore) -> dict:
refill_target,
max_contacts,
)
return selected_contacts
async def _sync_contacts_to_radio_inner(mc: MeshCore) -> dict:
"""
Core logic for loading contacts onto the radio.
Fill order is:
1. Favorite contacts
2. Most recently interacted-with non-repeaters
3. Most recently advert-heard non-repeaters without interaction history
Favorite contacts are always reloaded first, up to the configured capacity.
Additional non-favorite fill stops at the refill target (80% of capacity).
Caller must hold the radio operation lock and pass a valid MeshCore instance.
"""
selected_contacts = await get_contacts_selected_for_radio_sync()
return await _load_contacts_to_radio(mc, selected_contacts)
@@ -851,8 +1021,6 @@ async def _load_contacts_to_radio(mc: MeshCore, contacts: list[Contact]) -> dict
radio_contact = mc.get_contact_by_key_prefix(contact.public_key[:12])
if radio_contact:
already_on_radio += 1
if not contact.on_radio:
await ContactRepository.set_on_radio(contact.public_key, True)
continue
try:
@@ -860,7 +1028,6 @@ async def _load_contacts_to_radio(mc: MeshCore, contacts: list[Contact]) -> dict
result = await mc.commands.add_contact(radio_contact_payload)
if result.type == EventType.OK:
loaded += 1
await ContactRepository.set_on_radio(contact.public_key, True)
logger.debug("Loaded contact %s to radio", contact.public_key[:12])
else:
failed += 1


@@ -66,6 +66,30 @@ class ChannelRepository:
for row in rows
]
@staticmethod
async def get_on_radio() -> list[Channel]:
"""Return channels currently marked as resident on the radio in the database."""
cursor = await db.conn.execute(
"""
SELECT key, name, is_hashtag, on_radio, flood_scope_override, last_read_at
FROM channels
WHERE on_radio = 1
ORDER BY name
"""
)
rows = await cursor.fetchall()
return [
Channel(
key=row["key"],
name=row["name"],
is_hashtag=bool(row["is_hashtag"]),
on_radio=bool(row["on_radio"]),
flood_scope_override=row["flood_scope_override"],
last_read_at=row["last_read_at"],
)
for row in rows
]
@staticmethod
async def delete(key: str) -> None:
"""Delete a channel by key."""


@@ -352,14 +352,6 @@ class ContactRepository:
)
await db.conn.commit()
@staticmethod
async def set_on_radio(public_key: str, on_radio: bool) -> None:
await db.conn.execute(
"UPDATE contacts SET on_radio = ? WHERE public_key = ?",
(on_radio, public_key.lower()),
)
await db.conn.commit()
@staticmethod
async def clear_on_radio_except(keep_keys: list[str]) -> None:
"""Set on_radio=False for all contacts NOT in keep_keys."""


@@ -293,6 +293,40 @@ class MessageRepository:
clause += ")"
return clause, params
@staticmethod
def _build_blocked_incoming_clause(
message_alias: str = "",
blocked_keys: list[str] | None = None,
blocked_names: list[str] | None = None,
) -> tuple[str, list[Any]]:
prefix = f"{message_alias}." if message_alias else ""
blocked_matchers: list[str] = []
params: list[Any] = []
if blocked_keys:
placeholders = ",".join("?" for _ in blocked_keys)
blocked_matchers.append(
f"({prefix}type = 'PRIV' AND LOWER({prefix}conversation_key) IN ({placeholders}))"
)
params.extend(blocked_keys)
blocked_matchers.append(
f"({prefix}type = 'CHAN' AND {prefix}sender_key IS NOT NULL"
f" AND LOWER({prefix}sender_key) IN ({placeholders}))"
)
params.extend(blocked_keys)
if blocked_names:
placeholders = ",".join("?" for _ in blocked_names)
blocked_matchers.append(
f"({prefix}sender_name IS NOT NULL AND {prefix}sender_name IN ({placeholders}))"
)
params.extend(blocked_names)
if not blocked_matchers:
return "", []
return f"NOT ({prefix}outgoing = 0 AND ({' OR '.join(blocked_matchers)}))", params
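A trimmed standalone copy of `_build_blocked_incoming_clause` shows the SQL fragment and parameter ordering it produces; note that blocked keys are bound twice, once for the PRIV matcher and once for the CHAN matcher:

```python
def build_blocked_incoming_clause(message_alias="", blocked_keys=None, blocked_names=None):
    prefix = f"{message_alias}." if message_alias else ""
    matchers, params = [], []
    if blocked_keys:
        ph = ",".join("?" for _ in blocked_keys)
        matchers.append(f"({prefix}type = 'PRIV' AND LOWER({prefix}conversation_key) IN ({ph}))")
        params.extend(blocked_keys)
        matchers.append(
            f"({prefix}type = 'CHAN' AND {prefix}sender_key IS NOT NULL"
            f" AND LOWER({prefix}sender_key) IN ({ph}))"
        )
        params.extend(blocked_keys)
    if blocked_names:
        ph = ",".join("?" for _ in blocked_names)
        matchers.append(f"({prefix}sender_name IS NOT NULL AND {prefix}sender_name IN ({ph}))")
        params.extend(blocked_names)
    if not matchers:
        return "", []
    # Exclude only *incoming* rows (outgoing = 0) that match any blocked matcher.
    return f"NOT ({prefix}outgoing = 0 AND ({' OR '.join(matchers)}))", params

clause, params = build_blocked_incoming_clause("m", ["abc123"], ["Spammer"])
print(clause)
print(params)  # ['abc123', 'abc123', 'Spammer']
```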
@staticmethod
def _row_to_message(row: Any) -> Message:
"""Convert a database row to a Message model."""
@@ -337,25 +371,12 @@ class MessageRepository:
)
params: list[Any] = []
if blocked_keys:
placeholders = ",".join("?" for _ in blocked_keys)
query += (
f" AND NOT (messages.outgoing=0 AND ("
f"(messages.type='PRIV' AND LOWER(messages.conversation_key) IN ({placeholders}))"
f" OR (messages.type='CHAN' AND messages.sender_key IS NOT NULL"
f" AND LOWER(messages.sender_key) IN ({placeholders}))"
f"))"
)
params.extend(blocked_keys)
params.extend(blocked_keys)
if blocked_names:
placeholders = ",".join("?" for _ in blocked_names)
query += (
f" AND NOT (messages.outgoing=0 AND messages.sender_name IS NOT NULL"
f" AND messages.sender_name IN ({placeholders}))"
)
params.extend(blocked_names)
blocked_clause, blocked_params = MessageRepository._build_blocked_incoming_clause(
"messages", blocked_keys, blocked_names
)
if blocked_clause:
query += f" AND {blocked_clause}"
params.extend(blocked_params)
if msg_type:
query += " AND messages.type = ?"
@@ -437,23 +458,12 @@ class MessageRepository:
where_parts.append(clause.removeprefix("AND "))
base_params.append(norm_key)
if blocked_keys:
placeholders = ",".join("?" for _ in blocked_keys)
where_parts.append(
f"NOT (outgoing=0 AND ("
f"(type='PRIV' AND LOWER(conversation_key) IN ({placeholders}))"
f" OR (type='CHAN' AND sender_key IS NOT NULL AND LOWER(sender_key) IN ({placeholders}))"
f"))"
)
base_params.extend(blocked_keys)
base_params.extend(blocked_keys)
if blocked_names:
placeholders = ",".join("?" for _ in blocked_names)
where_parts.append(
f"NOT (outgoing=0 AND sender_name IS NOT NULL AND sender_name IN ({placeholders}))"
)
base_params.extend(blocked_names)
blocked_clause, blocked_params = MessageRepository._build_blocked_incoming_clause(
blocked_keys=blocked_keys, blocked_names=blocked_names
)
if blocked_clause:
where_parts.append(blocked_clause)
base_params.extend(blocked_params)
where_sql = " AND ".join(["1=1", *where_parts])
@@ -579,29 +589,19 @@ class MessageRepository:
blocked_names: Display names whose messages should be excluded from counts.
Returns:
Dict with 'counts', 'mentions', and 'last_message_times' keys.
Dict with 'counts', 'mentions', 'last_message_times', and 'last_read_ats' keys.
"""
counts: dict[str, int] = {}
mention_flags: dict[str, bool] = {}
last_message_times: dict[str, int] = {}
last_read_ats: dict[str, int | None] = {}
mention_token = f"@[{name}]" if name else None
# Build optional block-list WHERE fragments for channel messages
chan_block_sql = ""
chan_block_params: list[Any] = []
if blocked_keys:
placeholders = ",".join("?" for _ in blocked_keys)
chan_block_sql += (
f" AND NOT (m.sender_key IS NOT NULL AND LOWER(m.sender_key) IN ({placeholders}))"
)
chan_block_params.extend(blocked_keys)
if blocked_names:
placeholders = ",".join("?" for _ in blocked_names)
chan_block_sql += (
f" AND NOT (m.sender_name IS NOT NULL AND m.sender_name IN ({placeholders}))"
)
chan_block_params.extend(blocked_names)
blocked_clause, blocked_params = MessageRepository._build_blocked_incoming_clause(
"m", blocked_keys, blocked_names
)
blocked_sql = f" AND {blocked_clause}" if blocked_clause else ""
# Channel unreads
cursor = await db.conn.execute(
@@ -616,10 +616,10 @@ class MessageRepository:
JOIN channels c ON m.conversation_key = c.key
WHERE m.type = 'CHAN' AND m.outgoing = 0
AND m.received_at > COALESCE(c.last_read_at, 0)
{chan_block_sql}
{blocked_sql}
GROUP BY m.conversation_key
""",
(mention_token or "", mention_token or "", *chan_block_params),
(mention_token or "", mention_token or "", *blocked_params),
)
rows = await cursor.fetchall()
for row in rows:
@@ -628,14 +628,6 @@ class MessageRepository:
if mention_token and row["has_mention"]:
mention_flags[state_key] = True
# Build block-list exclusion for contact (DM) unreads
contact_block_sql = ""
contact_block_params: list[Any] = []
if blocked_keys:
placeholders = ",".join("?" for _ in blocked_keys)
contact_block_sql += f" AND LOWER(m.conversation_key) NOT IN ({placeholders})"
contact_block_params.extend(blocked_keys)
# Contact unreads
cursor = await db.conn.execute(
f"""
@@ -649,10 +641,10 @@ class MessageRepository:
JOIN contacts ct ON m.conversation_key = ct.public_key
WHERE m.type = 'PRIV' AND m.outgoing = 0
AND m.received_at > COALESCE(ct.last_read_at, 0)
{contact_block_sql}
{blocked_sql}
GROUP BY m.conversation_key
""",
(mention_token or "", mention_token or "", *contact_block_params),
(mention_token or "", mention_token or "", *blocked_params),
)
rows = await cursor.fetchall()
for row in rows:
@@ -661,52 +653,32 @@ class MessageRepository:
if mention_token and row["has_mention"]:
mention_flags[state_key] = True
cursor = await db.conn.execute(
"""
SELECT key, last_read_at
FROM channels
"""
)
rows = await cursor.fetchall()
for row in rows:
last_read_ats[f"channel-{row['key']}"] = row["last_read_at"]
cursor = await db.conn.execute(
"""
SELECT public_key, last_read_at
FROM contacts
"""
)
rows = await cursor.fetchall()
for row in rows:
last_read_ats[f"contact-{row['public_key']}"] = row["last_read_at"]
# Last message times for all conversations (including read ones),
# excluding blocked incoming traffic so refresh matches live WS behavior.
last_time_filters: list[str] = []
last_time_params: list[Any] = []
if blocked_keys:
placeholders = ",".join("?" for _ in blocked_keys)
last_time_filters.append(
f"""
NOT (
type = 'PRIV'
AND outgoing = 0
AND LOWER(conversation_key) IN ({placeholders})
)
"""
)
last_time_params.extend(blocked_keys)
last_time_filters.append(
f"""
NOT (
type = 'CHAN'
AND outgoing = 0
AND sender_key IS NOT NULL
AND LOWER(sender_key) IN ({placeholders})
)
"""
)
last_time_params.extend(blocked_keys)
if blocked_names:
placeholders = ",".join("?" for _ in blocked_names)
last_time_filters.append(
f"""
NOT (
type = 'CHAN'
AND outgoing = 0
AND sender_name IS NOT NULL
AND sender_name IN ({placeholders})
)
"""
)
last_time_params.extend(blocked_names)
last_time_where_sql = (
f"WHERE {' AND '.join(last_time_filters)}" if last_time_filters else ""
last_time_clause, last_time_params = MessageRepository._build_blocked_incoming_clause(
blocked_keys=blocked_keys, blocked_names=blocked_names
)
last_time_where_sql = f"WHERE {last_time_clause}" if last_time_clause else ""
cursor = await db.conn.execute(
f"""
@@ -727,6 +699,7 @@ class MessageRepository:
"counts": counts,
"mentions": mention_flags,
"last_message_times": last_message_times,
"last_read_ats": last_read_ats,
}
@staticmethod
@@ -835,6 +808,19 @@ class MessageRepository:
"top_senders_24h": top_senders,
}
@staticmethod
async def count_channels_with_incoming_messages() -> int:
"""Count distinct channel conversations with at least one incoming message."""
cursor = await db.conn.execute(
"""
SELECT COUNT(DISTINCT conversation_key) AS cnt
FROM messages
WHERE type = 'CHAN' AND outgoing = 0
"""
)
row = await cursor.fetchone()
return int(row["cnt"]) if row and row["cnt"] is not None else 0
@staticmethod
async def get_most_active_rooms(sender_key: str, limit: int = 5) -> list[tuple[str, str, int]]:
"""Get channels where a contact has sent the most messages.


@@ -109,6 +109,18 @@ class RawPacketRepository:
)
await db.conn.commit()
@staticmethod
async def get_linked_message_id(packet_id: int) -> int | None:
"""Return the linked message ID for a raw packet, if any."""
cursor = await db.conn.execute(
"SELECT message_id FROM raw_packets WHERE id = ?",
(packet_id,),
)
row = await cursor.fetchone()
if not row:
return None
return row["message_id"]
@staticmethod
async def prune_old_undecrypted(max_age_days: int) -> int:
"""Delete undecrypted packets older than max_age_days. Returns count deleted."""


@@ -1,16 +1,18 @@
import logging
from hashlib import sha256
from fastapi import APIRouter, HTTPException, Query
from meshcore import EventType
from fastapi import APIRouter, HTTPException
from pydantic import BaseModel, Field
from app.dependencies import require_connected
from app.channel_constants import (
PUBLIC_CHANNEL_KEY,
PUBLIC_CHANNEL_NAME,
is_public_channel_key,
is_public_channel_name,
)
from app.models import Channel, ChannelDetail, ChannelMessageCounts, ChannelTopSender
from app.radio_sync import upsert_channel_from_radio_slot
from app.region_scope import normalize_region_scope
from app.repository import ChannelRepository, MessageRepository
from app.services.radio_runtime import radio_runtime as radio_manager
from app.websocket import broadcast_event
logger = logging.getLogger(__name__)
@@ -59,15 +61,6 @@ async def get_channel_detail(key: str) -> ChannelDetail:
)
@router.get("/{key}", response_model=Channel)
async def get_channel(key: str) -> Channel:
"""Get a specific channel by key (32-char hex string)."""
channel = await ChannelRepository.get_by_key(key)
if not channel:
raise HTTPException(status_code=404, detail="Channel not found")
return channel
@router.post("", response_model=Channel)
async def create_channel(request: CreateChannelRequest) -> Channel:
"""Create a channel in the database.
@@ -75,10 +68,31 @@ async def create_channel(request: CreateChannelRequest) -> Channel:
Channels are NOT pushed to radio on creation. They are loaded to the radio
automatically when sending a message (see messages.py send_channel_message).
"""
is_hashtag = request.name.startswith("#")
requested_name = request.name
is_hashtag = requested_name.startswith("#")
# Determine the channel secret
if request.key and not is_hashtag:
# Reserve the canonical Public room so it cannot drift to another key,
# and the well-known Public key cannot be renamed to something else.
if is_public_channel_name(requested_name):
if request.key:
try:
key_bytes = bytes.fromhex(request.key)
if len(key_bytes) != 16:
raise HTTPException(
status_code=400,
detail="Channel key must be exactly 16 bytes (32 hex chars)",
)
except ValueError:
raise HTTPException(status_code=400, detail="Invalid hex string for key") from None
if key_bytes.hex().upper() != PUBLIC_CHANNEL_KEY:
raise HTTPException(
status_code=400,
detail=f'"{PUBLIC_CHANNEL_NAME}" must use the canonical Public key',
)
key_hex = PUBLIC_CHANNEL_KEY
channel_name = PUBLIC_CHANNEL_NAME
is_hashtag = False
elif request.key and not is_hashtag:
try:
key_bytes = bytes.fromhex(request.key)
if len(key_bytes) != 16:
@@ -87,17 +101,25 @@ async def create_channel(request: CreateChannelRequest) -> Channel:
)
except ValueError:
raise HTTPException(status_code=400, detail="Invalid hex string for key") from None
key_hex = key_bytes.hex().upper()
if is_public_channel_key(key_hex):
raise HTTPException(
status_code=400,
detail=f'The canonical Public key may only be used for "{PUBLIC_CHANNEL_NAME}"',
)
channel_name = requested_name
else:
# Derive key from name hash (same as meshcore library does)
key_bytes = sha256(request.name.encode("utf-8")).digest()[:16]
key_bytes = sha256(requested_name.encode("utf-8")).digest()[:16]
key_hex = key_bytes.hex().upper()
channel_name = requested_name
key_hex = key_bytes.hex().upper()
logger.info("Creating channel %s: %s (hashtag=%s)", key_hex, request.name, is_hashtag)
logger.info("Creating channel %s: %s (hashtag=%s)", key_hex, channel_name, is_hashtag)
# Store in database only - radio sync happens at send time
await ChannelRepository.upsert(
key=key_hex,
name=request.name,
name=channel_name,
is_hashtag=is_hashtag,
on_radio=False,
)
@@ -110,33 +132,6 @@ async def create_channel(request: CreateChannelRequest) -> Channel:
return stored
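The name-hash fallback in `create_channel` (sha256 of the UTF-8 name, truncated to 16 bytes, per the meshcore convention noted in the comment) can be reproduced standalone; the channel name below is a hypothetical example:

```python
from hashlib import sha256

def derive_channel_key(name: str) -> str:
    # Same derivation as the create_channel fallback:
    # sha256 of the UTF-8 name, first 16 bytes, upper-case hex.
    return sha256(name.encode("utf-8")).digest()[:16].hex().upper()

key = derive_channel_key("#bayarea")  # hypothetical hashtag channel
print(len(key))  # 32 hex chars == 16 bytes
```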
@router.post("/sync")
async def sync_channels_from_radio(max_channels: int = Query(default=40, ge=1, le=40)) -> dict:
"""Sync channels from the radio to the database."""
require_connected()
logger.info("Syncing channels from radio (checking %d slots)", max_channels)
count = 0
async with radio_manager.radio_operation("sync_channels_from_radio") as mc:
for idx in range(max_channels):
result = await mc.commands.get_channel(idx)
if result.type == EventType.CHANNEL_INFO:
key_hex = await upsert_channel_from_radio_slot(result.payload, on_radio=True)
if key_hex is not None:
count += 1
stored = await ChannelRepository.get_by_key(key_hex)
if stored is not None:
_broadcast_channel_update(stored)
logger.debug(
"Synced channel %s: %s", key_hex, result.payload.get("channel_name")
)
logger.info("Synced %d channels from radio", count)
return {"synced": count}
@router.post("/{key}/mark-read")
async def mark_channel_read(key: str) -> dict:
"""Mark a channel as read (update last_read_at timestamp)."""
@@ -180,6 +175,11 @@ async def delete_channel(key: str) -> dict:
Note: This does not clear the channel from the radio. The radio's channel
slots are managed separately (channels are loaded temporarily when sending).
"""
if is_public_channel_key(key):
raise HTTPException(
status_code=400, detail="The canonical Public channel cannot be deleted"
)
logger.info("Deleting channel %s from database", key)
await ChannelRepository.delete(key)


@@ -1,5 +1,7 @@
import asyncio
import logging
import random
from contextlib import suppress
from fastapi import APIRouter, BackgroundTasks, HTTPException, Query
from meshcore import EventType
@@ -8,15 +10,14 @@ from app.dependencies import require_connected
from app.models import (
Contact,
ContactActiveRoom,
ContactAdvertPath,
ContactAdvertPathSummary,
ContactAnalytics,
ContactDetail,
ContactRoutingOverrideRequest,
ContactUpsert,
CreateContactRequest,
NameOnlyContactDetail,
NearestRepeater,
PathDiscoveryResponse,
PathDiscoveryRoute,
TraceResponse,
)
from app.packet_processor import start_historical_dm_decryption
@@ -69,8 +70,8 @@ async def _ensure_on_radio(mc, contact: Contact) -> None:
async def _best_effort_push_contact_to_radio(contact: Contact, operation_name: str) -> None:
"""Push the current effective route to the radio when the contact is already loaded."""
if not radio_manager.is_connected or not contact.on_radio:
"""Best-effort push the current effective route to the radio when connected."""
if not radio_manager.is_connected:
return
try:
@@ -109,6 +110,12 @@ async def _broadcast_contact_resolution(previous_public_keys: list[str], contact
)
def _path_hash_mode_from_hop_width(hop_width: object) -> int:
if not isinstance(hop_width, int):
return 0
return max(0, min(hop_width - 1, 2))
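The clamp in `_path_hash_mode_from_hop_width` maps a hop width N to mode N-1, bounded to the supported 0..2 range, with anything non-int falling back to 0; a quick check of the same logic:

```python
def path_hash_mode_from_hop_width(hop_width):
    # Anything that isn't an int falls back to mode 0.
    if not isinstance(hop_width, int):
        return 0
    # Hop width N maps to mode N-1, clamped into the supported 0..2 range.
    return max(0, min(hop_width - 1, 2))

print([path_hash_mode_from_hop_width(w) for w in (None, 0, 1, 2, 3, 9)])
# [0, 0, 0, 1, 2, 2]
```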
async def _build_keyed_contact_analytics(contact: Contact) -> ContactAnalytics:
name_history = await ContactNameHistoryRepository.get_history(contact.public_key)
dm_count = await MessageRepository.count_dm_messages(contact.public_key)
@@ -325,158 +332,6 @@ async def create_contact(
return stored
@router.get("/{public_key}/detail", response_model=ContactDetail)
async def get_contact_detail(public_key: str) -> ContactDetail:
"""Get comprehensive contact profile data.
Returns contact info, name history, message counts, most active rooms,
advertisement paths, advert frequency, and nearest repeaters.
"""
contact = await _resolve_contact_or_404(public_key)
analytics = await _build_keyed_contact_analytics(contact)
assert analytics.contact is not None
return ContactDetail(
contact=analytics.contact,
name_history=analytics.name_history,
dm_message_count=analytics.dm_message_count,
channel_message_count=analytics.channel_message_count,
most_active_rooms=analytics.most_active_rooms,
advert_paths=analytics.advert_paths,
advert_frequency=analytics.advert_frequency,
nearest_repeaters=analytics.nearest_repeaters,
)
@router.get("/name-detail", response_model=NameOnlyContactDetail)
async def get_name_only_contact_detail(
name: str = Query(min_length=1, max_length=200),
) -> NameOnlyContactDetail:
"""Get channel activity summary for a sender name without a resolved key."""
normalized_name = name.strip()
if not normalized_name:
raise HTTPException(status_code=400, detail="name is required")
analytics = await _build_name_only_contact_analytics(normalized_name)
return NameOnlyContactDetail(
name=analytics.name,
channel_message_count=analytics.channel_message_count,
most_active_rooms=analytics.most_active_rooms,
)
@router.get("/{public_key}", response_model=Contact)
async def get_contact(public_key: str) -> Contact:
"""Get a specific contact by public key or prefix."""
return await _resolve_contact_or_404(public_key)
@router.get("/{public_key}/advert-paths", response_model=list[ContactAdvertPath])
async def get_contact_advert_paths(
public_key: str,
limit: int = Query(default=10, ge=1, le=50),
) -> list[ContactAdvertPath]:
"""List recent unique advert paths for a contact."""
contact = await _resolve_contact_or_404(public_key)
return await ContactAdvertPathRepository.get_recent_for_contact(contact.public_key, limit)
@router.post("/sync")
async def sync_contacts_from_radio() -> dict:
"""Sync contacts from the radio to the database."""
require_connected()
logger.info("Syncing contacts from radio")
async with radio_manager.radio_operation("sync_contacts_from_radio") as mc:
result = await mc.commands.get_contacts()
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to get contacts: {result.payload}")
contacts = result.payload
count = 0
synced_keys: list[str] = []
for public_key, contact_data in contacts.items():
lower_key = public_key.lower()
await ContactRepository.upsert(
ContactUpsert.from_radio_dict(lower_key, contact_data, on_radio=True)
)
promoted_keys = await promote_prefix_contacts_for_contact(
public_key=lower_key,
log=logger,
)
synced_keys.append(lower_key)
await reconcile_contact_messages(
public_key=lower_key,
contact_name=contact_data.get("adv_name"),
log=logger,
)
stored = await ContactRepository.get_by_key(lower_key)
if stored is not None:
await _broadcast_contact_update(stored)
await _broadcast_contact_resolution(promoted_keys, stored)
count += 1
# Clear on_radio for contacts not found on the radio
await ContactRepository.clear_on_radio_except(synced_keys)
logger.info("Synced %d contacts from radio", count)
return {"synced": count}
@router.post("/{public_key}/remove-from-radio")
async def remove_contact_from_radio(public_key: str) -> dict:
"""Remove a contact from the radio (keeps it in database)."""
require_connected()
contact = await _resolve_contact_or_404(public_key)
async with radio_manager.radio_operation("remove_contact_from_radio") as mc:
# Get the contact from radio
radio_contact = mc.get_contact_by_key_prefix(contact.public_key[:12])
if not radio_contact:
# Already not on radio
await ContactRepository.set_on_radio(contact.public_key, False)
return {"status": "ok", "message": "Contact was not on radio"}
logger.info("Removing contact %s from radio", contact.public_key[:12])
result = await mc.commands.remove_contact(radio_contact)
if result.type == EventType.ERROR:
raise HTTPException(
status_code=500, detail=f"Failed to remove contact: {result.payload}"
)
await ContactRepository.set_on_radio(contact.public_key, False)
return {"status": "ok"}
@router.post("/{public_key}/add-to-radio")
async def add_contact_to_radio(public_key: str) -> dict:
"""Add a contact from the database to the radio."""
require_connected()
contact = await _resolve_contact_or_404(public_key, "Contact not found in database")
async with radio_manager.radio_operation("add_contact_to_radio") as mc:
# Check if already on radio
radio_contact = mc.get_contact_by_key_prefix(contact.public_key[:12])
if radio_contact:
await ContactRepository.set_on_radio(contact.public_key, True)
return {"status": "ok", "message": "Contact already on radio"}
logger.info("Adding contact %s to radio", contact.public_key[:12])
result = await mc.commands.add_contact(contact.to_radio_dict())
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to add contact: {result.payload}")
await ContactRepository.set_on_radio(contact.public_key, True)
return {"status": "ok"}
@router.post("/{public_key}/mark-read")
async def mark_contact_read(public_key: str) -> dict:
"""Mark a contact conversation as read (update last_read_at timestamp)."""
@@ -575,6 +430,90 @@ async def request_trace(public_key: str) -> TraceResponse:
return TraceResponse(remote_snr=remote_snr, local_snr=local_snr, path_len=path_len)
@router.post("/{public_key}/path-discovery", response_model=PathDiscoveryResponse)
async def request_path_discovery(public_key: str) -> PathDiscoveryResponse:
"""Discover the current forward and return paths to a known contact."""
require_connected()
contact = await _resolve_contact_or_404(public_key)
pubkey_prefix = contact.public_key[:12]
async with radio_manager.radio_operation("request_path_discovery", pause_polling=True) as mc:
await _ensure_on_radio(mc, contact)
response_task = asyncio.create_task(
mc.wait_for_event(
EventType.PATH_RESPONSE,
attribute_filters={"pubkey_pre": pubkey_prefix},
timeout=15,
)
)
try:
result = await mc.commands.send_path_discovery(contact.public_key)
if result.type == EventType.ERROR:
raise HTTPException(
status_code=500,
detail=f"Failed to send path discovery: {result.payload}",
)
event = await response_task
finally:
if not response_task.done():
response_task.cancel()
with suppress(asyncio.CancelledError):
await response_task
if event is None:
raise HTTPException(status_code=504, detail="No path discovery response heard")
payload = event.payload
forward_path = str(payload.get("out_path") or "")
forward_len = int(payload.get("out_path_len") or 0)
forward_mode = _path_hash_mode_from_hop_width(payload.get("out_path_hash_len"))
return_path = str(payload.get("in_path") or "")
return_len = int(payload.get("in_path_len") or 0)
return_mode = _path_hash_mode_from_hop_width(payload.get("in_path_hash_len"))
await ContactRepository.update_path(
contact.public_key,
forward_path,
forward_len,
forward_mode,
)
refreshed_contact = await _resolve_contact_or_404(contact.public_key)
try:
sync_result = await mc.commands.add_contact(refreshed_contact.to_radio_dict())
if sync_result is not None and sync_result.type == EventType.ERROR:
logger.warning(
"Failed to sync discovered path back to radio for %s: %s",
refreshed_contact.public_key[:12],
sync_result.payload,
)
except Exception:
logger.warning(
"Failed to sync discovered path back to radio for %s",
refreshed_contact.public_key[:12],
exc_info=True,
)
await _broadcast_contact_update(refreshed_contact)
return PathDiscoveryResponse(
contact=refreshed_contact,
forward_path=PathDiscoveryRoute(
path=forward_path,
path_len=forward_len,
path_hash_mode=forward_mode,
),
return_path=PathDiscoveryRoute(
path=return_path,
path_len=return_len,
path_hash_mode=return_mode,
),
)
@router.post("/{public_key}/routing-override")
async def set_contact_routing_override(
public_key: str, request: ContactRoutingOverrideRequest

app/routers/debug.py (new file, 299 lines)

@@ -0,0 +1,299 @@
import hashlib
import importlib.metadata
import logging
import subprocess
import sys
from datetime import datetime, timezone
from pathlib import Path
from typing import Any
from fastapi import APIRouter
from meshcore import EventType
from pydantic import BaseModel, Field
from app.config import get_recent_log_lines, settings
from app.radio_sync import get_contacts_selected_for_radio_sync, get_radio_channel_limit
from app.repository import MessageRepository
from app.routers.health import HealthResponse, build_health_data
from app.services.radio_runtime import radio_runtime
logger = logging.getLogger(__name__)
router = APIRouter(tags=["debug"])
class DebugApplicationInfo(BaseModel):
version: str
commit_hash: str | None = None
git_branch: str | None = None
git_dirty: bool | None = None
python_version: str
class DebugRuntimeInfo(BaseModel):
connection_info: str | None = None
connection_desired: bool
setup_in_progress: bool
setup_complete: bool
channels_with_incoming_messages: int
max_channels: int
path_hash_mode: int
path_hash_mode_supported: bool
channel_slot_reuse_enabled: bool
channel_send_cache_capacity: int
remediation_flags: dict[str, bool]
class DebugContactAudit(BaseModel):
expected_and_found: int
expected_but_not_found: list[str]
found_but_not_expected: list[str]
class DebugChannelSlotMismatch(BaseModel):
slot_number: int
expected_sha256_of_room_key: str | None = None
actual_sha256_of_room_key: str | None = None
class DebugChannelAudit(BaseModel):
matched_slots: int
wrong_slots: list[DebugChannelSlotMismatch]
class DebugRadioProbe(BaseModel):
performed: bool
errors: list[str] = Field(default_factory=list)
self_info: dict[str, Any] | None = None
device_info: dict[str, Any] | None = None
stats_core: dict[str, Any] | None = None
stats_radio: dict[str, Any] | None = None
contacts: DebugContactAudit | None = None
channels: DebugChannelAudit | None = None
class DebugSnapshotResponse(BaseModel):
captured_at: str
application: DebugApplicationInfo
health: HealthResponse
runtime: DebugRuntimeInfo
radio_probe: DebugRadioProbe
logs: list[str]
def _repo_root() -> Path:
return Path(__file__).resolve().parents[2]
def _get_app_version() -> str:
try:
return importlib.metadata.version("remoteterm-meshcore")
except Exception:
pyproject = _repo_root() / "pyproject.toml"
try:
for line in pyproject.read_text().splitlines():
if line.startswith("version = "):
return line.split('"')[1]
except Exception:
pass
return "0.0.0"
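The pyproject fallback above assumes a line shaped exactly like `version = "3.4.0"`. A minimal standalone illustration of that parse (the sample text is made up):

```python
# Illustrative stand-in for the pyproject.toml version fallback: find the
# first line starting with 'version = ' and take the quoted value.
sample = 'name = "remoteterm-meshcore"\nversion = "3.4.0"\n'
version = next(
    line.split('"')[1]
    for line in sample.splitlines()
    if line.startswith("version = ")
)
assert version == "3.4.0"
```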
def _git_output(*args: str) -> str | None:
try:
result = subprocess.run(
["git", *args],
cwd=_repo_root(),
check=True,
capture_output=True,
text=True,
)
except Exception:
return None
output = result.stdout.strip()
return output or None
def _build_application_info() -> DebugApplicationInfo:
dirty_output = _git_output("status", "--porcelain")
return DebugApplicationInfo(
version=_get_app_version(),
commit_hash=_git_output("rev-parse", "HEAD"),
git_branch=_git_output("rev-parse", "--abbrev-ref", "HEAD"),
git_dirty=(dirty_output is not None and dirty_output != ""),
python_version=sys.version.split()[0],
)
def _event_type_name(event: Any) -> str:
event_type = getattr(event, "type", None)
return getattr(event_type, "name", str(event_type))
def _sha256_hex(value: str) -> str:
return hashlib.sha256(value.encode("utf-8")).hexdigest()
def _normalize_channel_secret(payload: dict[str, Any]) -> bytes:
secret = payload.get("channel_secret", b"")
if isinstance(secret, bytes):
return secret
return bytes(secret)
def _is_empty_channel_payload(payload: dict[str, Any]) -> bool:
name = payload.get("channel_name", "")
return not name or name == "\x00" * len(name)
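The emptiness check treats an all-NUL name the same as a missing one, presumably because unused channel slots come back NUL-padded from firmware. A self-contained mirror of the check (the function name here is a hypothetical stand-in):

```python
# Hypothetical mirror of _is_empty_channel_payload's name test: a slot is
# "empty" when its name is missing or consists entirely of NUL bytes.
def is_blank_channel_name(name: str) -> bool:
    return not name or name == "\x00" * len(name)

assert is_blank_channel_name("")
assert is_blank_channel_name("\x00\x00\x00")
assert not is_blank_channel_name("Public")
```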
def _observed_channel_key(event: Any) -> str | None:
if getattr(event, "type", None) != EventType.CHANNEL_INFO:
return None
payload = event.payload or {}
if _is_empty_channel_payload(payload):
return None
return _normalize_channel_secret(payload).hex().upper()
def _coerce_live_max_channels(device_info: dict[str, Any] | None) -> int | None:
if not device_info or "max_channels" not in device_info:
return None
try:
return int(device_info["max_channels"])
except (TypeError, ValueError):
return None
async def _build_contact_audit(
observed_contacts_payload: dict[str, dict[str, Any]],
) -> DebugContactAudit:
expected_contacts = await get_contacts_selected_for_radio_sync()
expected_keys = {contact.public_key.lower() for contact in expected_contacts}
observed_keys = {public_key.lower() for public_key in observed_contacts_payload}
return DebugContactAudit(
expected_and_found=len(expected_keys & observed_keys),
expected_but_not_found=sorted(_sha256_hex(key) for key in (expected_keys - observed_keys)),
found_but_not_expected=sorted(_sha256_hex(key) for key in (observed_keys - expected_keys)),
)
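The contact audit is plain set algebra over lowercased keys, with mismatches reported as SHA-256 digests rather than raw public keys. A sketch with toy keys:

```python
# Toy illustration of the audit's set algebra (keys shortened for clarity).
expected = {"aa", "bb", "cc"}   # contacts selected for radio sync
observed = {"bb", "cc", "dd"}   # contacts actually reported by the radio

assert len(expected & observed) == 2          # expected_and_found
assert sorted(expected - observed) == ["aa"]  # expected_but_not_found
assert sorted(observed - expected) == ["dd"]  # found_but_not_expected
```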
async def _build_channel_audit(mc: Any, max_channels: int | None = None) -> DebugChannelAudit:
cache_key_by_slot = {
slot: channel_key for channel_key, slot in radio_runtime.get_channel_send_cache_snapshot()
}
matched_slots = 0
wrong_slots: list[DebugChannelSlotMismatch] = []
for slot in range(get_radio_channel_limit(max_channels)):
event = await mc.commands.get_channel(slot)
expected_key = cache_key_by_slot.get(slot)
observed_key = _observed_channel_key(event)
if expected_key == observed_key:
matched_slots += 1
continue
wrong_slots.append(
DebugChannelSlotMismatch(
slot_number=slot,
expected_sha256_of_room_key=_sha256_hex(expected_key) if expected_key else None,
actual_sha256_of_room_key=_sha256_hex(observed_key) if observed_key else None,
)
)
return DebugChannelAudit(
matched_slots=matched_slots,
wrong_slots=wrong_slots,
)
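The channel audit first inverts the `(channel_key, slot)` cache snapshot into a slot-to-key map so each radio slot can be compared against what the app believes lives there. The inversion in isolation (sample keys are illustrative):

```python
# Invert (channel_key, slot) pairs into slot -> key, as the audit does.
snapshot = [("AABB", 0), ("CCDD", 3)]
cache_key_by_slot = {slot: channel_key for channel_key, slot in snapshot}

assert cache_key_by_slot == {0: "AABB", 3: "CCDD"}
assert cache_key_by_slot.get(1) is None  # slot with no cached key
```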
async def _probe_radio() -> DebugRadioProbe:
if not radio_runtime.is_connected:
return DebugRadioProbe(performed=False, errors=["Radio not connected"])
errors: list[str] = []
try:
async with radio_runtime.radio_operation(
"debug_support_snapshot",
suspend_auto_fetch=True,
) as mc:
device_info = None
stats_core = None
stats_radio = None
device_event = await mc.commands.send_device_query()
if getattr(device_event, "type", None) == EventType.DEVICE_INFO:
device_info = device_event.payload
else:
errors.append(f"send_device_query returned {_event_type_name(device_event)}")
core_event = await mc.commands.get_stats_core()
if getattr(core_event, "type", None) == EventType.STATS_CORE:
stats_core = core_event.payload
else:
errors.append(f"get_stats_core returned {_event_type_name(core_event)}")
radio_event = await mc.commands.get_stats_radio()
if getattr(radio_event, "type", None) == EventType.STATS_RADIO:
stats_radio = radio_event.payload
else:
errors.append(f"get_stats_radio returned {_event_type_name(radio_event)}")
contacts_event = await mc.commands.get_contacts()
observed_contacts_payload: dict[str, dict[str, Any]] = {}
if getattr(contacts_event, "type", None) != EventType.ERROR:
observed_contacts_payload = contacts_event.payload or {}
else:
errors.append(f"get_contacts returned {_event_type_name(contacts_event)}")
return DebugRadioProbe(
performed=True,
errors=errors,
self_info=dict(mc.self_info or {}),
device_info=device_info,
stats_core=stats_core,
stats_radio=stats_radio,
contacts=await _build_contact_audit(observed_contacts_payload),
channels=await _build_channel_audit(
mc,
max_channels=_coerce_live_max_channels(device_info),
),
)
except Exception as exc:
logger.warning("Debug support snapshot radio probe failed: %s", exc, exc_info=True)
errors.append(str(exc))
return DebugRadioProbe(performed=False, errors=errors)
@router.get("/debug", response_model=DebugSnapshotResponse)
async def debug_support_snapshot() -> DebugSnapshotResponse:
"""Return a support/debug snapshot with recent logs and live radio state."""
health_data = await build_health_data(radio_runtime.is_connected, radio_runtime.connection_info)
radio_probe = await _probe_radio()
channels_with_incoming_messages = (
await MessageRepository.count_channels_with_incoming_messages()
)
return DebugSnapshotResponse(
captured_at=datetime.now(timezone.utc).isoformat(),
application=_build_application_info(),
health=HealthResponse(**health_data),
runtime=DebugRuntimeInfo(
connection_info=radio_runtime.connection_info,
connection_desired=radio_runtime.connection_desired,
setup_in_progress=radio_runtime.is_setup_in_progress,
setup_complete=radio_runtime.is_setup_complete,
channels_with_incoming_messages=channels_with_incoming_messages,
max_channels=radio_runtime.max_channels,
path_hash_mode=radio_runtime.path_hash_mode,
path_hash_mode_supported=radio_runtime.path_hash_mode_supported,
channel_slot_reuse_enabled=radio_runtime.channel_slot_reuse_enabled(),
channel_send_cache_capacity=radio_runtime.get_channel_send_cache_capacity(),
remediation_flags={
"enable_message_poll_fallback": settings.enable_message_poll_fallback,
"force_channel_slot_reconfigure": settings.force_channel_slot_reconfigure,
},
),
radio_probe=radio_probe,
logs=get_recent_log_lines(limit=1000),
)


@@ -78,6 +78,26 @@ class FanoutConfigUpdate(BaseModel):
enabled: bool | None = Field(default=None, description="Enable/disable toggle")
def _validate_and_normalize_config(config_type: str, config: dict) -> dict:
"""Validate a config blob and return the canonical persisted form."""
normalized = dict(config)
if config_type == "mqtt_private":
_validate_mqtt_private_config(normalized)
elif config_type == "mqtt_community":
_validate_mqtt_community_config(normalized)
elif config_type == "bot":
_validate_bot_config(normalized)
elif config_type == "webhook":
_validate_webhook_config(normalized)
elif config_type == "apprise":
_validate_apprise_config(normalized)
elif config_type == "sqs":
_validate_sqs_config(normalized)
return normalized
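The same if/elif dispatch could also be expressed as a lookup table; a sketch under the assumption that each validator raises on a bad config (only the `broker_host` requirement for `mqtt_private` is confirmed by the code below — the registry shape is illustrative):

```python
# Sketch: table-driven variant of _validate_and_normalize_config.
# The validators here are illustrative stand-ins for the module's
# _validate_* helpers, which raise on invalid configs.
def require(field: str):
    def check(config: dict) -> None:
        if not config.get(field):
            raise ValueError(f"{field} is required")
    return check

VALIDATORS = {"mqtt_private": require("broker_host")}

def validate_and_normalize(config_type: str, config: dict) -> dict:
    normalized = dict(config)  # copy so the caller's dict is never mutated
    validator = VALIDATORS.get(config_type)
    if validator is not None:
        validator(normalized)
    return normalized

out = validate_and_normalize("mqtt_private", {"broker_host": "mqtt.example"})
assert out == {"broker_host": "mqtt.example"}
```

Copying the dict before validation keeps the "canonical persisted form" decoupled from the request body, which matters once create and update both route through this one helper.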
def _validate_mqtt_private_config(config: dict) -> None:
"""Validate mqtt_private config blob."""
if not config.get("broker_host"):
@@ -323,28 +343,13 @@ async def create_fanout_config(body: FanoutConfigCreate) -> dict:
if body.type == "bot" and server_settings.disable_bots:
raise HTTPException(status_code=403, detail="Bot system disabled by server configuration")
# Only validate config when creating as enabled — disabled configs
# are drafts the user hasn't finished configuring yet.
if body.enabled:
if body.type == "mqtt_private":
_validate_mqtt_private_config(body.config)
elif body.type == "mqtt_community":
_validate_mqtt_community_config(body.config)
elif body.type == "bot":
_validate_bot_config(body.config)
elif body.type == "webhook":
_validate_webhook_config(body.config)
elif body.type == "apprise":
_validate_apprise_config(body.config)
elif body.type == "sqs":
_validate_sqs_config(body.config)
normalized_config = _validate_and_normalize_config(body.type, body.config)
scope = _enforce_scope(body.type, body.scope)
cfg = await FanoutConfigRepository.create(
config_type=body.type,
name=body.name,
config=body.config,
config=normalized_config,
scope=scope,
enabled=body.enabled,
)
@@ -374,27 +379,11 @@ async def update_fanout_config(config_id: str, body: FanoutConfigUpdate) -> dict
kwargs["name"] = body.name
if body.enabled is not None:
kwargs["enabled"] = body.enabled
if body.config is not None:
kwargs["config"] = body.config
if body.scope is not None:
kwargs["scope"] = _enforce_scope(existing["type"], body.scope)
# Validate config when the result will be enabled
will_be_enabled = body.enabled if body.enabled is not None else existing["enabled"]
if will_be_enabled:
config_to_validate = body.config if body.config is not None else existing["config"]
if existing["type"] == "mqtt_private":
_validate_mqtt_private_config(config_to_validate)
elif existing["type"] == "mqtt_community":
_validate_mqtt_community_config(config_to_validate)
elif existing["type"] == "bot":
_validate_bot_config(config_to_validate)
elif existing["type"] == "webhook":
_validate_webhook_config(config_to_validate)
elif existing["type"] == "apprise":
_validate_apprise_config(config_to_validate)
elif existing["type"] == "sqs":
_validate_sqs_config(config_to_validate)
config_to_validate = body.config if body.config is not None else existing["config"]
kwargs["config"] = _validate_and_normalize_config(existing["type"], config_to_validate)
updated = await FanoutConfigRepository.update(config_id, **kwargs)
if updated is None:


@@ -11,18 +11,34 @@ from app.services.radio_runtime import radio_runtime as radio_manager
router = APIRouter(tags=["health"])
class RadioDeviceInfoResponse(BaseModel):
model: str | None = None
firmware_build: str | None = None
firmware_version: str | None = None
max_contacts: int | None = None
max_channels: int | None = None
class HealthResponse(BaseModel):
status: str
radio_connected: bool
radio_initializing: bool = False
radio_state: str = "disconnected"
connection_info: str | None
radio_device_info: RadioDeviceInfoResponse | None = None
database_size_mb: float
oldest_undecrypted_timestamp: int | None
fanout_statuses: dict[str, dict[str, str]] = {}
bots_disabled: bool = False
def _clean_optional_str(value: object) -> str | None:
if not isinstance(value, str):
return None
cleaned = value.strip()
return cleaned or None
async def build_health_data(radio_connected: bool, connection_info: str | None) -> dict:
"""Build the health status payload used by REST endpoint and WebSocket broadcasts."""
db_size_mb = 0.0
@@ -48,22 +64,12 @@ async def build_health_data(radio_connected: bool, connection_info: str | None)
pass
setup_in_progress = getattr(radio_manager, "is_setup_in_progress", False)
if not isinstance(setup_in_progress, bool):
setup_in_progress = False
setup_complete = getattr(radio_manager, "is_setup_complete", radio_connected)
if not isinstance(setup_complete, bool):
setup_complete = radio_connected
if not radio_connected:
setup_complete = False
connection_desired = getattr(radio_manager, "connection_desired", True)
if not isinstance(connection_desired, bool):
connection_desired = True
is_reconnecting = getattr(radio_manager, "is_reconnecting", False)
if not isinstance(is_reconnecting, bool):
is_reconnecting = False
radio_initializing = bool(radio_connected and (setup_in_progress or not setup_complete))
if not connection_desired:
@@ -77,12 +83,26 @@ async def build_health_data(radio_connected: bool, connection_info: str | None)
else:
radio_state = "disconnected"
radio_device_info = None
device_info_loaded = getattr(radio_manager, "device_info_loaded", False)
if radio_connected and device_info_loaded:
radio_device_info = {
"model": _clean_optional_str(getattr(radio_manager, "device_model", None)),
"firmware_build": _clean_optional_str(getattr(radio_manager, "firmware_build", None)),
"firmware_version": _clean_optional_str(
getattr(radio_manager, "firmware_version", None)
),
"max_contacts": getattr(radio_manager, "max_contacts", None),
"max_channels": getattr(radio_manager, "max_channels", None),
}
return {
"status": "ok" if radio_connected and not radio_initializing else "degraded",
"radio_connected": radio_connected,
"radio_initializing": radio_initializing,
"radio_state": radio_state,
"connection_info": connection_info,
"radio_device_info": radio_device_info,
"database_size_mb": db_size_mb,
"oldest_undecrypted_timestamp": oldest_ts,
"fanout_statuses": fanout_statuses,


@@ -8,6 +8,7 @@ from app.event_handlers import track_pending_ack
from app.models import (
Message,
MessagesAroundResponse,
ResendChannelMessageResponse,
SendChannelMessageRequest,
SendDirectMessageRequest,
)
@@ -126,7 +127,9 @@ async def send_direct_message(request: SendDirectMessageRequest) -> Message:
)
# Temporary radio slot used for sending channel messages
# Preferred first radio slot used for sending channel messages.
# The send service may reuse/load other app-managed slots depending on transport
# and session cache state.
TEMP_RADIO_SLOT = 0
@@ -136,7 +139,6 @@ async def send_channel_message(request: SendChannelMessageRequest) -> Message:
require_connected()
# Get channel info from our database
from app.decoder import calculate_channel_hash
from app.repository import ChannelRepository
db_channel = await ChannelRepository.get_by_key(request.channel_key)
@@ -153,14 +155,6 @@ async def send_channel_message(request: SendChannelMessageRequest) -> Message:
status_code=400, detail=f"Invalid channel key format: {request.channel_key}"
) from None
expected_hash = calculate_channel_hash(key_bytes)
logger.info(
"Sending to channel %s (%s) via radio slot %d, key hash: %s",
request.channel_key,
db_channel.name,
TEMP_RADIO_SLOT,
expected_hash,
)
return await send_channel_message_to_channel(
channel=db_channel,
channel_key_upper=request.channel_key.upper(),
@@ -178,11 +172,15 @@ async def send_channel_message(request: SendChannelMessageRequest) -> Message:
RESEND_WINDOW_SECONDS = 30
@router.post("/channel/{message_id}/resend")
@router.post(
"/channel/{message_id}/resend",
response_model=ResendChannelMessageResponse,
response_model_exclude_none=True,
)
async def resend_channel_message(
message_id: int,
new_timestamp: bool = Query(default=False),
) -> dict:
) -> ResendChannelMessageResponse:
"""Resend a channel message.
When new_timestamp=False (default): byte-perfect resend using the original timestamp.


@@ -3,7 +3,7 @@ from hashlib import sha256
from sqlite3 import OperationalError
import aiosqlite
from fastapi import APIRouter, BackgroundTasks
from fastapi import APIRouter, BackgroundTasks, HTTPException, Response, status
from pydantic import BaseModel, Field
from app.database import db
@@ -40,6 +40,10 @@ class DecryptResult(BaseModel):
message: str
def _bad_request(detail: str) -> HTTPException:
return HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=detail)
async def _run_historical_channel_decryption(
channel_key_bytes: bytes, channel_key_hex: str, display_name: str | None = None
) -> None:
@@ -100,7 +104,7 @@ async def get_undecrypted_count() -> dict:
@router.post("/decrypt/historical", response_model=DecryptResult)
async def decrypt_historical_packets(
request: DecryptRequest, background_tasks: BackgroundTasks
request: DecryptRequest, background_tasks: BackgroundTasks, response: Response
) -> DecryptResult:
"""
Attempt to decrypt historical packets with the provided key.
@@ -112,27 +116,15 @@ async def decrypt_historical_packets(
try:
channel_key_bytes = bytes.fromhex(request.channel_key)
if len(channel_key_bytes) != 16:
return DecryptResult(
started=False,
total_packets=0,
message="Channel key must be 16 bytes (32 hex chars)",
)
raise _bad_request("Channel key must be 16 bytes (32 hex chars)")
channel_key_hex = request.channel_key.upper()
except ValueError:
return DecryptResult(
started=False,
total_packets=0,
message="Invalid hex string for channel key",
)
raise _bad_request("Invalid hex string for channel key") from None
elif request.channel_name:
channel_key_bytes = sha256(request.channel_name.encode("utf-8")).digest()[:16]
channel_key_hex = channel_key_bytes.hex().upper()
else:
return DecryptResult(
started=False,
total_packets=0,
message="Must provide channel_key or channel_name",
)
raise _bad_request("Must provide channel_key or channel_name")
# Get count and lookup channel name for display
count = await RawPacketRepository.get_undecrypted_count()
@@ -148,6 +140,7 @@ async def decrypt_historical_packets(
background_tasks.add_task(
_run_historical_channel_decryption, channel_key_bytes, channel_key_hex, display_name
)
response.status_code = status.HTTP_202_ACCEPTED
return DecryptResult(
started=True,
@@ -158,48 +151,24 @@ async def decrypt_historical_packets(
elif request.key_type == "contact":
# DM decryption
if not request.private_key:
return DecryptResult(
started=False,
total_packets=0,
message="Must provide private_key for contact decryption",
)
raise _bad_request("Must provide private_key for contact decryption")
if not request.contact_public_key:
return DecryptResult(
started=False,
total_packets=0,
message="Must provide contact_public_key for contact decryption",
)
raise _bad_request("Must provide contact_public_key for contact decryption")
try:
private_key_bytes = bytes.fromhex(request.private_key)
if len(private_key_bytes) != 64:
return DecryptResult(
started=False,
total_packets=0,
message="Private key must be 64 bytes (128 hex chars)",
)
raise _bad_request("Private key must be 64 bytes (128 hex chars)")
except ValueError:
return DecryptResult(
started=False,
total_packets=0,
message="Invalid hex string for private key",
)
raise _bad_request("Invalid hex string for private key") from None
try:
contact_public_key_bytes = bytes.fromhex(request.contact_public_key)
if len(contact_public_key_bytes) != 32:
return DecryptResult(
started=False,
total_packets=0,
message="Contact public key must be 32 bytes (64 hex chars)",
)
raise _bad_request("Contact public key must be 32 bytes (64 hex chars)")
contact_public_key_hex = request.contact_public_key.lower()
except ValueError:
return DecryptResult(
started=False,
total_packets=0,
message="Invalid hex string for contact public key",
)
raise _bad_request("Invalid hex string for contact public key") from None
packets = await RawPacketRepository.get_undecrypted_text_messages()
count = len(packets)
@@ -223,6 +192,7 @@ async def decrypt_historical_packets(
contact_public_key_hex,
display_name,
)
response.status_code = status.HTTP_202_ACCEPTED
return DecryptResult(
started=True,
@@ -230,11 +200,7 @@ async def decrypt_historical_packets(
message=f"Started DM decryption of {count} TEXT_MESSAGE packets in background",
)
return DecryptResult(
started=False,
total_packets=0,
message="key_type must be 'channel' or 'contact'",
)
raise _bad_request("key_type must be 'channel' or 'contact'")
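The `channel_name` branch above derives the channel key as the first 16 bytes of SHA-256 over the UTF-8 name, uppercased as hex. In isolation (the helper name is illustrative; no claim is made about the digest for any particular name):

```python
from hashlib import sha256

# Same derivation as the channel_name branch: key = first 16 bytes of
# SHA-256(name), rendered as 32 uppercase hex characters.
def channel_key_from_name(name: str) -> str:
    return sha256(name.encode("utf-8")).digest()[:16].hex().upper()

key = channel_key_from_name("Public")
assert len(key) == 32
assert key == key.upper()
```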
class MaintenanceRequest(BaseModel):


@@ -1,12 +1,23 @@
import asyncio
import logging
from typing import Literal
import random
import time
from typing import Literal, TypeAlias
from fastapi import APIRouter, HTTPException
from meshcore import EventType
from pydantic import BaseModel, Field
from app.dependencies import require_connected
from app.models import (
ContactUpsert,
RadioDiscoveryRequest,
RadioDiscoveryResponse,
RadioDiscoveryResult,
)
from app.radio_sync import send_advertisement as do_send_advertisement
from app.radio_sync import sync_radio_time
from app.repository import ContactRepository
from app.services.radio_commands import (
KeystoreRefreshError,
PathHashModeUnsupportedError,
@@ -15,12 +26,23 @@ from app.services.radio_commands import (
import_private_key_and_refresh_keystore,
)
from app.services.radio_runtime import radio_runtime as radio_manager
from app.websocket import broadcast_health
from app.websocket import broadcast_event, broadcast_health
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/radio", tags=["radio"])
AdvertLocationSource = Literal["off", "current"]
DiscoveryNodeType: TypeAlias = Literal["repeater", "sensor"]
DISCOVERY_WINDOW_SECONDS = 8.0
_DISCOVERY_TARGET_BITS = {
"repeaters": 1 << 2,
"sensors": 1 << 4,
"all": (1 << 2) | (1 << 4),
}
_DISCOVERY_NODE_TYPES: dict[int, DiscoveryNodeType] = {
2: "repeater",
4: "sensor",
}
async def _prepare_connected(*, broadcast_on_success: bool) -> bool:
@@ -82,6 +104,88 @@ class PrivateKeyUpdate(BaseModel):
private_key: str = Field(description="Private key as hex string")
def _monotonic() -> float:
return time.monotonic()
def _better_signal(first: float | None, second: float | None) -> float | None:
if first is None:
return second
if second is None:
return first
return second if second > first else first
def _coerce_float(value: object) -> float | None:
if isinstance(value, (int, float)):
return float(value)
return None
def _coerce_int(value: object) -> int | None:
if isinstance(value, int):
return value
return None
def _merge_discovery_result(
existing: RadioDiscoveryResult | None, event_payload: dict[str, object]
) -> RadioDiscoveryResult | None:
public_key = event_payload.get("pubkey")
node_type_code = event_payload.get("node_type")
if not isinstance(public_key, str) or not public_key:
return existing
if not isinstance(node_type_code, int):
return existing
node_type = _DISCOVERY_NODE_TYPES.get(node_type_code)
if node_type is None:
return existing
if existing is None:
return RadioDiscoveryResult(
public_key=public_key,
node_type=node_type,
heard_count=1,
local_snr=_coerce_float(event_payload.get("SNR")),
local_rssi=_coerce_int(event_payload.get("RSSI")),
remote_snr=_coerce_float(event_payload.get("SNR_in")),
)
existing.heard_count += 1
existing.local_snr = _better_signal(existing.local_snr, _coerce_float(event_payload.get("SNR")))
current_rssi = _coerce_int(event_payload.get("RSSI"))
if existing.local_rssi is None or (
current_rssi is not None and current_rssi > existing.local_rssi
):
existing.local_rssi = current_rssi
existing.remote_snr = _better_signal(
existing.remote_snr,
_coerce_float(event_payload.get("SNR_in")),
)
return existing
async def _persist_new_discovery_contacts(results: list[RadioDiscoveryResult]) -> None:
now = int(time.time())
for result in results:
existing = await ContactRepository.get_by_key(result.public_key)
if existing is not None:
continue
contact = ContactUpsert(
public_key=result.public_key,
type=2 if result.node_type == "repeater" else 4,
last_seen=now,
first_seen=now,
on_radio=False,
)
await ContactRepository.upsert(contact)
created = await ContactRepository.get_by_key(result.public_key)
if created is not None:
broadcast_event("contact", created.model_dump())
@router.get("/config", response_model=RadioConfigResponse)
async def get_radio_config() -> RadioConfigResponse:
"""Get the current radio configuration."""
@@ -184,6 +288,72 @@ async def send_advertisement() -> dict:
return {"status": "ok"}
@router.post("/discover", response_model=RadioDiscoveryResponse)
async def discover_mesh(request: RadioDiscoveryRequest) -> RadioDiscoveryResponse:
"""Run a short node-discovery sweep from the local radio."""
require_connected()
target_bits = _DISCOVERY_TARGET_BITS[request.target]
tag = random.randint(1, 0xFFFFFFFF)
tag_hex = tag.to_bytes(4, "little", signed=False).hex()
events: asyncio.Queue = asyncio.Queue()
async with radio_manager.radio_operation(
"discover_mesh",
pause_polling=True,
suspend_auto_fetch=True,
) as mc:
subscription = mc.subscribe(
EventType.DISCOVER_RESPONSE,
lambda event: events.put_nowait(event),
{"tag": tag_hex},
)
try:
send_result = await mc.commands.send_node_discover_req(
target_bits,
prefix_only=False,
tag=tag,
)
if send_result is None or send_result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail="Failed to start mesh discovery")
deadline = _monotonic() + DISCOVERY_WINDOW_SECONDS
results_by_key: dict[str, RadioDiscoveryResult] = {}
while True:
remaining = deadline - _monotonic()
if remaining <= 0:
break
try:
event = await asyncio.wait_for(events.get(), timeout=remaining)
except asyncio.TimeoutError:
break
merged = _merge_discovery_result(
results_by_key.get(event.payload.get("pubkey")),
event.payload,
)
if merged is not None:
results_by_key[merged.public_key] = merged
finally:
subscription.unsubscribe()
results = sorted(
results_by_key.values(),
key=lambda item: (
item.node_type,
-(item.local_snr if item.local_snr is not None else -999.0),
item.public_key,
),
)
await _persist_new_discovery_contacts(results)
return RadioDiscoveryResponse(
target=request.target,
duration_seconds=DISCOVERY_WINDOW_SECONDS,
results=results,
)
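The discovery endpoint above drains its event queue against a single monotonic deadline, so each `wait_for` only gets the remaining budget and a burst of late responses can never stretch the sweep past `DISCOVERY_WINDOW_SECONDS`. A minimal, self-contained sketch of that bounded-window collection pattern (names are illustrative, not this project's API):

```python
import asyncio
import time


async def collect_events(queue: asyncio.Queue, window_seconds: float) -> list:
    """Drain events from `queue` until `window_seconds` have elapsed.

    The deadline is computed once from the monotonic clock; every
    wait_for() call is given only the *remaining* budget, so arriving
    events never extend the window.
    """
    deadline = time.monotonic() + window_seconds
    results = []
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        try:
            event = await asyncio.wait_for(queue.get(), timeout=remaining)
        except asyncio.TimeoutError:
            break  # window expired while waiting; stop collecting
        results.append(event)
    return results


async def demo() -> list:
    q: asyncio.Queue = asyncio.Queue()
    for i in range(3):
        q.put_nowait(i)
    # three queued events are drained, then the remaining window times out
    return await collect_events(q, window_seconds=0.05)
```

The same shape works for any subscribe-then-collect sweep where the reply count is unknown but the time budget is fixed.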
async def _attempt_reconnect() -> dict:
"""Shared reconnection logic for reboot and reconnect endpoints."""
radio_manager.resume_connection()

View File

@@ -21,6 +21,7 @@ from app.models import (
RepeaterLoginResponse,
RepeaterLppTelemetryResponse,
RepeaterNeighborsResponse,
RepeaterNodeInfoResponse,
RepeaterOwnerInfoResponse,
RepeaterRadioSettingsResponse,
RepeaterStatusResponse,
@@ -43,8 +44,21 @@ ACL_PERMISSION_NAMES = {
}
router = APIRouter(prefix="/contacts", tags=["repeaters"])
# Delay between repeater radio operations to allow key exchange and path establishment
REPEATER_OP_DELAY_SECONDS = 2.0
REPEATER_LOGIN_RESPONSE_TIMEOUT_SECONDS = 5.0
REPEATER_LOGIN_REJECTED_MESSAGE = (
"The repeater replied but did not confirm this login. "
"Existing access may still allow some repeater operations, but admin actions may fail."
)
REPEATER_LOGIN_SEND_FAILED_MESSAGE = (
"The login request could not be sent to the repeater. "
"The dashboard is still available, but repeater operations may fail until a login succeeds."
)
REPEATER_LOGIN_TIMEOUT_MESSAGE = (
"No login confirmation was heard from the repeater. "
"On current repeater firmware, that can mean the password was wrong, "
"blank-password login was not allowed by the ACL, or the reply was missed in transit. "
"The dashboard is still available; try logging in again if admin actions fail."
)
def _monotonic() -> float:
@@ -135,31 +149,88 @@ async def _fetch_repeater_response(
return None
async def prepare_repeater_connection(mc, contact: Contact, password: str) -> None:
"""Prepare connection to a repeater by adding to radio and logging in.
async def prepare_repeater_connection(mc, contact: Contact, password: str) -> RepeaterLoginResponse:
"""Prepare connection to a repeater by adding to radio and attempting login.
Args:
mc: MeshCore instance
contact: The repeater contact
password: Password for login (empty string for no password)
Returns:
RepeaterLoginResponse describing whether the login was confirmed
"""
pubkey_prefix = contact.public_key[:12].lower()
loop = asyncio.get_running_loop()
login_future = loop.create_future()
def _resolve_login(event_type: EventType, message: str | None = None) -> None:
if login_future.done():
return
login_future.set_result(
RepeaterLoginResponse(
status="ok" if event_type == EventType.LOGIN_SUCCESS else "error",
authenticated=event_type == EventType.LOGIN_SUCCESS,
message=message,
)
)
success_subscription = mc.subscribe(
EventType.LOGIN_SUCCESS,
lambda _event: _resolve_login(EventType.LOGIN_SUCCESS),
attribute_filters={"pubkey_prefix": pubkey_prefix},
)
failed_subscription = mc.subscribe(
EventType.LOGIN_FAILED,
lambda _event: _resolve_login(
EventType.LOGIN_FAILED,
REPEATER_LOGIN_REJECTED_MESSAGE,
),
attribute_filters={"pubkey_prefix": pubkey_prefix},
)
# Add contact to radio with path from DB (non-fatal — contact may already be loaded)
logger.info("Adding repeater %s to radio", contact.public_key[:12])
await _ensure_on_radio(mc, contact)
try:
logger.info("Adding repeater %s to radio", contact.public_key[:12])
await _ensure_on_radio(mc, contact)
# Send login with password
logger.info("Sending login to repeater %s", contact.public_key[:12])
login_result = await mc.commands.send_login(contact.public_key, password)
logger.info("Sending login to repeater %s", contact.public_key[:12])
login_result = await mc.commands.send_login(contact.public_key, password)
if login_result.type == EventType.ERROR:
raise HTTPException(status_code=401, detail=f"Login failed: {login_result.payload}")
if login_result.type == EventType.ERROR:
return RepeaterLoginResponse(
status="error",
authenticated=False,
message=f"{REPEATER_LOGIN_SEND_FAILED_MESSAGE} ({login_result.payload})",
)
# Wait for key exchange to complete before sending requests
logger.debug("Waiting %.1fs for key exchange to complete", REPEATER_OP_DELAY_SECONDS)
await asyncio.sleep(REPEATER_OP_DELAY_SECONDS)
try:
return await asyncio.wait_for(
login_future,
timeout=REPEATER_LOGIN_RESPONSE_TIMEOUT_SECONDS,
)
except asyncio.TimeoutError:
logger.warning(
"No login response from repeater %s within %.1fs",
contact.public_key[:12],
REPEATER_LOGIN_RESPONSE_TIMEOUT_SECONDS,
)
return RepeaterLoginResponse(
status="timeout",
authenticated=False,
message=REPEATER_LOGIN_TIMEOUT_MESSAGE,
)
except HTTPException as exc:
logger.warning(
"Repeater login setup failed for %s: %s",
contact.public_key[:12],
exc.detail,
)
return RepeaterLoginResponse(
status="error",
authenticated=False,
message=f"{REPEATER_LOGIN_SEND_FAILED_MESSAGE} ({exc.detail})",
)
finally:
success_subscription.unsubscribe()
failed_subscription.unsubscribe()
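`prepare_repeater_connection` funnels both the success and failure subscriptions into one future and treats a timeout as its own outcome rather than an exception. A simplified sketch of that pattern under assumed callback-registration semantics (`register_callbacks` is a stand-in, not a real MeshCore call):

```python
import asyncio


async def await_login_outcome(register_callbacks, timeout: float) -> str:
    """Resolve one future from whichever login event fires first.

    set_result() is guarded with done(), so a success and a failure
    arriving back-to-back (or an event landing after a timeout has
    cancelled the future) cannot raise InvalidStateError.
    """
    loop = asyncio.get_running_loop()
    outcome: asyncio.Future[str] = loop.create_future()

    def resolve(status: str) -> None:
        if not outcome.done():  # first event wins; later events are ignored
            outcome.set_result(status)

    register_callbacks(lambda: resolve("ok"), lambda: resolve("rejected"))
    try:
        return await asyncio.wait_for(outcome, timeout=timeout)
    except asyncio.TimeoutError:
        return "timeout"  # report as a status, mirroring the endpoint


async def demo() -> str:
    def register(on_success, on_failure):
        # simulate the radio confirming the login shortly after the request
        asyncio.get_running_loop().call_later(0.01, on_success)

    return await await_login_outcome(register, timeout=1.0)
```

Unsubscribing in a `finally` block, as the real code does, is what keeps stale callbacks from resolving a future belonging to a later login attempt.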
def _require_repeater(contact: Contact) -> None:
@@ -179,7 +250,7 @@ def _require_repeater(contact: Contact) -> None:
@router.post("/{public_key}/repeater/login", response_model=RepeaterLoginResponse)
async def repeater_login(public_key: str, request: RepeaterLoginRequest) -> RepeaterLoginResponse:
"""Log in to a repeater. Adds contact to radio, sends login, waits for key exchange."""
"""Attempt repeater login and report whether auth was confirmed."""
require_connected()
contact = await _resolve_contact_or_404(public_key)
_require_repeater(contact)
@@ -189,9 +260,7 @@ async def repeater_login(public_key: str, request: RepeaterLoginRequest) -> Repe
pause_polling=True,
suspend_auto_fetch=True,
) as mc:
await prepare_repeater_connection(mc, contact, request.password)
return RepeaterLoginResponse(status="ok")
return await prepare_repeater_connection(mc, contact, request.password)
@router.post("/{public_key}/repeater/status", response_model=RepeaterStatusResponse)
@@ -373,9 +442,29 @@ async def _batch_cli_fetch(
return results
@router.post("/{public_key}/repeater/node-info", response_model=RepeaterNodeInfoResponse)
async def repeater_node_info(public_key: str) -> RepeaterNodeInfoResponse:
"""Fetch repeater identity/location info via a small CLI batch."""
require_connected()
contact = await _resolve_contact_or_404(public_key)
_require_repeater(contact)
results = await _batch_cli_fetch(
contact,
"repeater_node_info",
[
("get name", "name"),
("get lat", "lat"),
("get lon", "lon"),
("clock", "clock_utc"),
],
)
return RepeaterNodeInfoResponse(**results)
@router.post("/{public_key}/repeater/radio-settings", response_model=RepeaterRadioSettingsResponse)
async def repeater_radio_settings(public_key: str) -> RepeaterRadioSettingsResponse:
"""Fetch radio settings from a repeater via batch CLI commands."""
"""Fetch radio settings from a repeater via radio/config CLI commands."""
require_connected()
contact = await _resolve_contact_or_404(public_key)
_require_repeater(contact)
@@ -390,10 +479,6 @@ async def repeater_radio_settings(public_key: str) -> RepeaterRadioSettingsRespo
("get af", "airtime_factor"),
("get repeat", "repeat_enabled"),
("get flood.max", "flood_max"),
("get name", "name"),
("get lat", "lat"),
("get lon", "lon"),
("clock", "clock_utc"),
],
)
return RepeaterRadioSettingsResponse(**results)

View File

@@ -6,12 +6,26 @@ import time
logger = logging.getLogger(__name__)
PendingAck = tuple[int, float, int]
BUFFERED_ACK_TTL_SECONDS = 30.0
_pending_acks: dict[str, PendingAck] = {}
_buffered_acks: dict[str, float] = {}
def track_pending_ack(expected_ack: str, message_id: int, timeout_ms: int) -> None:
"""Track an expected ACK code for an outgoing direct message."""
def track_pending_ack(expected_ack: str, message_id: int, timeout_ms: int) -> bool:
"""Track an expected ACK code for an outgoing direct message.
Returns True when the ACK was already observed and buffered before registration.
"""
buffered_at = _buffered_acks.pop(expected_ack, None)
if buffered_at is not None:
logger.debug(
"Matched buffered ACK %s immediately for message %d",
expected_ack,
message_id,
)
return True
_pending_acks[expected_ack] = (message_id, time.time(), timeout_ms)
logger.debug(
"Tracking pending ACK %s for message %d (timeout %dms)",
@@ -19,6 +33,13 @@ def track_pending_ack(expected_ack: str, message_id: int, timeout_ms: int) -> No
message_id,
timeout_ms,
)
return False
def buffer_unmatched_ack(ack_code: str) -> None:
"""Remember an ACK that arrived before its message registration."""
_buffered_acks[ack_code] = time.time()
logger.debug("Buffered unmatched ACK %s for late registration", ack_code)
def cleanup_expired_acks() -> None:
@@ -33,6 +54,15 @@ def cleanup_expired_acks() -> None:
del _pending_acks[code]
logger.debug("Expired pending ACK %s", code)
expired_buffered_codes = [
code
for code, buffered_at in _buffered_acks.items()
if now - buffered_at > BUFFERED_ACK_TTL_SECONDS
]
for code in expired_buffered_codes:
del _buffered_acks[code]
logger.debug("Expired buffered ACK %s", code)
def pop_pending_ack(ack_code: str) -> int | None:
"""Claim the tracked message ID for an ACK code if present."""

View File

@@ -1,5 +1,6 @@
"""Shared send/resend orchestration for outgoing messages."""
import asyncio
import logging
from collections.abc import Callable
from typing import Any
@@ -7,29 +8,109 @@ from typing import Any
from fastapi import HTTPException
from meshcore import EventType
from app.models import ResendChannelMessageResponse
from app.region_scope import normalize_region_scope
from app.repository import AppSettingsRepository, ContactRepository, MessageRepository
from app.services.messages import (
build_message_model,
create_outgoing_channel_message,
create_outgoing_direct_message,
increment_ack_and_broadcast,
)
logger = logging.getLogger(__name__)
NO_RADIO_RESPONSE_AFTER_SEND_DETAIL = (
"Send command was issued to the radio, but no response was heard back. "
"The message may or may not have sent successfully."
)
BroadcastFn = Callable[..., Any]
TrackAckFn = Callable[[str, int, int], None]
TrackAckFn = Callable[[str, int, int], bool]
NowFn = Callable[[], float]
OutgoingReservationKey = tuple[str, str, str]
_pending_outgoing_timestamp_reservations: dict[OutgoingReservationKey, set[int]] = {}
_outgoing_timestamp_reservations_lock = asyncio.Lock()
async def allocate_outgoing_sender_timestamp(
*,
message_repository,
msg_type: str,
conversation_key: str,
text: str,
requested_timestamp: int,
) -> int:
"""Pick a sender timestamp that will not collide with an existing stored message."""
reservation_key = (msg_type, conversation_key, text)
candidate = requested_timestamp
while True:
async with _outgoing_timestamp_reservations_lock:
reserved = _pending_outgoing_timestamp_reservations.get(reservation_key, set())
is_reserved = candidate in reserved
if is_reserved:
candidate += 1
continue
existing = await message_repository.get_by_content(
msg_type=msg_type,
conversation_key=conversation_key,
text=text,
sender_timestamp=candidate,
)
if existing is not None:
candidate += 1
continue
async with _outgoing_timestamp_reservations_lock:
reserved = _pending_outgoing_timestamp_reservations.setdefault(reservation_key, set())
if candidate in reserved:
candidate += 1
continue
reserved.add(candidate)
break
if candidate != requested_timestamp:
logger.info(
"Bumped outgoing %s timestamp for %s from %d to %d to avoid same-content collision",
msg_type,
conversation_key[:12],
requested_timestamp,
candidate,
)
return candidate
async def release_outgoing_sender_timestamp(
*,
msg_type: str,
conversation_key: str,
text: str,
sender_timestamp: int,
) -> None:
reservation_key = (msg_type, conversation_key, text)
async with _outgoing_timestamp_reservations_lock:
reserved = _pending_outgoing_timestamp_reservations.get(reservation_key)
if not reserved:
return
reserved.discard(sender_timestamp)
if not reserved:
_pending_outgoing_timestamp_reservations.pop(reservation_key, None)
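The allocator above has to drop the lock around the (async) database lookup, which is why it re-checks the reservation set before claiming a candidate: another sender may have grabbed the same second in between. A simplified in-memory sketch of that check/release cycle, with a plain `set` standing in for the DB query:

```python
import asyncio

_reservations: dict[tuple[str, str], set[int]] = {}
_lock = asyncio.Lock()


async def allocate(key: tuple[str, str], stored: set[int], requested: int) -> int:
    """First timestamp >= requested that is neither reserved nor stored."""
    candidate = requested
    while True:
        async with _lock:
            if candidate in _reservations.get(key, set()):
                candidate += 1
                continue
        if candidate in stored:  # stands in for an async DB lookup, outside the lock
            candidate += 1
            continue
        async with _lock:
            reserved = _reservations.setdefault(key, set())
            if candidate in reserved:  # lost the race while the lock was dropped
                candidate += 1
                continue
            reserved.add(candidate)
            return candidate


async def release(key: tuple[str, str], timestamp: int) -> None:
    async with _lock:
        reserved = _reservations.get(key)
        if reserved:
            reserved.discard(timestamp)
            if not reserved:
                _reservations.pop(key, None)


async def demo() -> list[int]:
    key = ("PRIV", "deadbeef")
    first = await allocate(key, stored={100}, requested=100)   # 100 stored -> 101
    second = await allocate(key, stored={100}, requested=100)  # 101 reserved -> 102
    await release(key, first)
    await release(key, second)
    return [first, second]
```

Releasing in a `finally` block, as the real send paths do, guarantees the reservation never outlives the request even when the radio send fails.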
async def send_channel_message_with_effective_scope(
*,
mc,
channel,
channel_key: str,
key_bytes: bytes,
text: str,
timestamp_bytes: bytes,
action_label: str,
radio_manager,
temp_radio_slot: int,
error_broadcast_fn: BroadcastFn,
app_settings_repository=AppSettingsRepository,
@@ -64,28 +145,71 @@ async def send_channel_message_with_effective_scope(
)
try:
set_result = await mc.commands.set_channel(
channel_idx=temp_radio_slot,
channel_name=channel.name,
channel_secret=key_bytes,
channel_slot, needs_configure, evicted_channel_key = radio_manager.plan_channel_send_slot(
channel_key,
preferred_slot=temp_radio_slot,
)
if set_result.type == EventType.ERROR:
logger.warning(
"Failed to set channel on radio slot %d before %s: %s",
temp_radio_slot,
if needs_configure:
logger.debug(
"Loading channel %s into radio slot %d before %s%s",
channel.name,
channel_slot,
action_label,
set_result.payload,
(
f" (evicting cached {evicted_channel_key[:8]})"
if evicted_channel_key is not None
else ""
),
)
raise HTTPException(
status_code=500,
detail=f"Failed to configure channel on radio before {action_label}",
try:
set_result = await mc.commands.set_channel(
channel_idx=channel_slot,
channel_name=channel.name,
channel_secret=key_bytes,
)
except Exception:
if evicted_channel_key is not None:
radio_manager.invalidate_cached_channel_slot(evicted_channel_key)
raise
if set_result.type == EventType.ERROR:
if evicted_channel_key is not None:
radio_manager.invalidate_cached_channel_slot(evicted_channel_key)
logger.warning(
"Failed to set channel on radio slot %d before %s: %s",
channel_slot,
action_label,
set_result.payload,
)
raise HTTPException(
status_code=500,
detail=f"Failed to configure channel on radio before {action_label}",
)
radio_manager.note_channel_slot_loaded(channel_key, channel_slot)
else:
logger.debug(
"Reusing cached radio slot %d for channel %s before %s",
channel_slot,
channel.name,
action_label,
)
return await mc.commands.send_chan_msg(
chan=temp_radio_slot,
send_result = await mc.commands.send_chan_msg(
chan=channel_slot,
msg=text,
timestamp=timestamp_bytes,
)
if send_result is None:
logger.warning(
"No response from radio after %s for channel %s; send outcome is unknown",
action_label,
channel.name,
)
raise HTTPException(status_code=504, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
if send_result.type == EventType.ERROR:
radio_manager.invalidate_cached_channel_slot(channel_key)
else:
radio_manager.note_channel_slot_used(channel_key)
return send_result
finally:
if override_scope and override_scope != baseline_scope:
try:
@@ -137,57 +261,85 @@ async def send_direct_message_to_contact(
) -> Any:
"""Send a direct message and persist/broadcast the outgoing row."""
contact_data = contact.to_radio_dict()
contact_ensured_on_radio = False
async with radio_manager.radio_operation("send_direct_message") as mc:
logger.debug("Ensuring contact %s is on radio before sending", contact.public_key[:12])
add_result = await mc.commands.add_contact(contact_data)
if add_result.type == EventType.ERROR:
logger.warning("Failed to add contact to radio: %s", add_result.payload)
else:
contact_ensured_on_radio = True
sent_at: int | None = None
sender_timestamp: int | None = None
message = None
result = None
try:
async with radio_manager.radio_operation("send_direct_message") as mc:
logger.debug("Ensuring contact %s is on radio before sending", contact.public_key[:12])
add_result = await mc.commands.add_contact(contact_data)
if add_result.type == EventType.ERROR:
logger.warning("Failed to add contact to radio: %s", add_result.payload)
cached_contact = mc.get_contact_by_key_prefix(contact.public_key[:12])
if not cached_contact:
cached_contact = contact_data
else:
contact_ensured_on_radio = True
cached_contact = mc.get_contact_by_key_prefix(contact.public_key[:12])
if not cached_contact:
cached_contact = contact_data
logger.info("Sending direct message to %s", contact.public_key[:12])
now = int(now_fn())
result = await mc.commands.send_msg(
dst=cached_contact,
msg=text,
timestamp=now,
logger.info("Sending direct message to %s", contact.public_key[:12])
sent_at = int(now_fn())
sender_timestamp = await allocate_outgoing_sender_timestamp(
message_repository=message_repository,
msg_type="PRIV",
conversation_key=contact.public_key.lower(),
text=text,
requested_timestamp=sent_at,
)
result = await mc.commands.send_msg(
dst=cached_contact,
msg=text,
timestamp=sender_timestamp,
)
if result is None:
logger.warning(
"No response from radio after direct send to %s; send outcome is unknown",
contact.public_key[:12],
)
raise HTTPException(status_code=504, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")
message = await create_outgoing_direct_message(
conversation_key=contact.public_key.lower(),
text=text,
sender_timestamp=sender_timestamp,
received_at=sent_at,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if message is None:
raise HTTPException(
status_code=500,
detail="Failed to store outgoing message - unexpected duplicate",
)
finally:
if sender_timestamp is not None:
await release_outgoing_sender_timestamp(
msg_type="PRIV",
conversation_key=contact.public_key.lower(),
text=text,
sender_timestamp=sender_timestamp,
)
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")
if sent_at is None or sender_timestamp is None or message is None or result is None:
raise HTTPException(status_code=500, detail="Failed to store outgoing message")
if contact_ensured_on_radio and not contact.on_radio:
await contact_repository.set_on_radio(contact.public_key.lower(), True)
message = await create_outgoing_direct_message(
conversation_key=contact.public_key.lower(),
text=text,
sender_timestamp=now,
received_at=now,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if message is None:
raise HTTPException(
status_code=500,
detail="Failed to store outgoing message - unexpected duplicate",
)
await contact_repository.update_last_contacted(contact.public_key.lower(), now)
await contact_repository.update_last_contacted(contact.public_key.lower(), sent_at)
expected_ack = result.payload.get("expected_ack")
suggested_timeout: int = result.payload.get("suggested_timeout", 10000)
if expected_ack:
ack_code = expected_ack.hex() if isinstance(expected_ack, bytes) else expected_ack
track_pending_ack_fn(ack_code, message.id, suggested_timeout)
matched_immediately = track_pending_ack_fn(ack_code, message.id, suggested_timeout) is True
logger.debug("Tracking ACK %s for message %d", ack_code, message.id)
if matched_immediately:
ack_count = await increment_ack_and_broadcast(
message_id=message.id,
broadcast_fn=broadcast_fn,
)
message.acked = ack_count
return message
@@ -206,54 +358,83 @@ async def send_channel_message_to_channel(
message_repository=MessageRepository,
) -> Any:
"""Send a channel message and persist/broadcast the outgoing row."""
now: int | None = None
sent_at: int | None = None
sender_timestamp: int | None = None
radio_name = ""
our_public_key: str | None = None
text_with_sender = text
outgoing_message = None
async with radio_manager.radio_operation("send_channel_message") as mc:
radio_name = mc.self_info.get("name", "") if mc.self_info else ""
our_public_key = (mc.self_info.get("public_key") or None) if mc.self_info else None
text_with_sender = f"{radio_name}: {text}" if radio_name else text
logger.info("Sending channel message to %s: %s", channel.name, text[:50])
try:
async with radio_manager.radio_operation("send_channel_message") as mc:
radio_name = mc.self_info.get("name", "") if mc.self_info else ""
our_public_key = (mc.self_info.get("public_key") or None) if mc.self_info else None
text_with_sender = f"{radio_name}: {text}" if radio_name else text
logger.info("Sending channel message to %s: %s", channel.name, text[:50])
now = int(now_fn())
timestamp_bytes = now.to_bytes(4, "little")
sent_at = int(now_fn())
sender_timestamp = await allocate_outgoing_sender_timestamp(
message_repository=message_repository,
msg_type="CHAN",
conversation_key=channel_key_upper,
text=text_with_sender,
requested_timestamp=sent_at,
)
timestamp_bytes = sender_timestamp.to_bytes(4, "little")
result = await send_channel_message_with_effective_scope(
mc=mc,
channel=channel,
key_bytes=key_bytes,
text=text,
timestamp_bytes=timestamp_bytes,
action_label="sending message",
temp_radio_slot=temp_radio_slot,
error_broadcast_fn=error_broadcast_fn,
result = await send_channel_message_with_effective_scope(
mc=mc,
channel=channel,
channel_key=channel_key_upper,
key_bytes=key_bytes,
text=text,
timestamp_bytes=timestamp_bytes,
action_label="sending message",
radio_manager=radio_manager,
temp_radio_slot=temp_radio_slot,
error_broadcast_fn=error_broadcast_fn,
)
if result is None:
logger.warning(
"No response from radio after channel send to %s; send outcome is unknown",
channel.name,
)
raise HTTPException(status_code=504, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
if result.type == EventType.ERROR:
raise HTTPException(
status_code=500, detail=f"Failed to send message: {result.payload}"
)
outgoing_message = await create_outgoing_channel_message(
conversation_key=channel_key_upper,
text=text_with_sender,
sender_timestamp=sender_timestamp,
received_at=sent_at,
sender_name=radio_name or None,
sender_key=our_public_key,
channel_name=channel.name,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if outgoing_message is None:
raise HTTPException(
status_code=500,
detail="Failed to store outgoing message - unexpected duplicate",
)
finally:
if sender_timestamp is not None:
await release_outgoing_sender_timestamp(
msg_type="CHAN",
conversation_key=channel_key_upper,
text=text_with_sender,
sender_timestamp=sender_timestamp,
)
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")
if now is None:
if sent_at is None or sender_timestamp is None or outgoing_message is None:
raise HTTPException(status_code=500, detail="Failed to store outgoing message")
outgoing_message = await create_outgoing_channel_message(
conversation_key=channel_key_upper,
text=text_with_sender,
sender_timestamp=now,
received_at=now,
sender_name=radio_name or None,
sender_key=our_public_key,
channel_name=channel.name,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if outgoing_message is None:
raise HTTPException(
status_code=500,
detail="Failed to store outgoing message - unexpected duplicate",
)
message_id = outgoing_message.id
acked_count, paths = await message_repository.get_ack_and_paths(message_id)
return build_message_model(
@@ -261,8 +442,8 @@ async def send_channel_message_to_channel(
msg_type="CHAN",
conversation_key=channel_key_upper,
text=text_with_sender,
sender_timestamp=now,
received_at=now,
sender_timestamp=sender_timestamp,
received_at=sent_at,
paths=paths,
outgoing=True,
acked=acked_count,
@@ -283,7 +464,7 @@ async def resend_channel_message_record(
now_fn: NowFn,
temp_radio_slot: int,
message_repository=MessageRepository,
) -> dict[str, Any]:
) -> ResendChannelMessageResponse:
"""Resend a stored outgoing channel message."""
try:
key_bytes = bytes.fromhex(message.conversation_key)
@@ -293,59 +474,88 @@ async def resend_channel_message_record(
detail=f"Invalid channel key format: {message.conversation_key}",
) from None
now: int | None = None
if new_timestamp:
now = int(now_fn())
timestamp_bytes = now.to_bytes(4, "little")
else:
timestamp_bytes = message.sender_timestamp.to_bytes(4, "little")
sent_at: int | None = None
sender_timestamp = message.sender_timestamp
timestamp_bytes = message.sender_timestamp.to_bytes(4, "little")
resend_public_key: str | None = None
radio_name = ""
new_message = None
stored_text = message.text
async with radio_manager.radio_operation("resend_channel_message") as mc:
radio_name = mc.self_info.get("name", "") if mc.self_info else ""
resend_public_key = (mc.self_info.get("public_key") or None) if mc.self_info else None
text_to_send = message.text
if radio_name and text_to_send.startswith(f"{radio_name}: "):
text_to_send = text_to_send[len(f"{radio_name}: ") :]
try:
async with radio_manager.radio_operation("resend_channel_message") as mc:
radio_name = mc.self_info.get("name", "") if mc.self_info else ""
resend_public_key = (mc.self_info.get("public_key") or None) if mc.self_info else None
text_to_send = message.text
if radio_name and text_to_send.startswith(f"{radio_name}: "):
text_to_send = text_to_send[len(f"{radio_name}: ") :]
if new_timestamp:
sent_at = int(now_fn())
sender_timestamp = await allocate_outgoing_sender_timestamp(
message_repository=message_repository,
msg_type="CHAN",
conversation_key=message.conversation_key,
text=stored_text,
requested_timestamp=sent_at,
)
timestamp_bytes = sender_timestamp.to_bytes(4, "little")
result = await send_channel_message_with_effective_scope(
mc=mc,
channel=channel,
key_bytes=key_bytes,
text=text_to_send,
timestamp_bytes=timestamp_bytes,
action_label="resending message",
temp_radio_slot=temp_radio_slot,
error_broadcast_fn=error_broadcast_fn,
)
if result.type == EventType.ERROR:
raise HTTPException(
status_code=500,
detail=f"Failed to resend message: {result.payload}",
result = await send_channel_message_with_effective_scope(
mc=mc,
channel=channel,
channel_key=message.conversation_key,
key_bytes=key_bytes,
text=text_to_send,
timestamp_bytes=timestamp_bytes,
action_label="resending message",
radio_manager=radio_manager,
temp_radio_slot=temp_radio_slot,
error_broadcast_fn=error_broadcast_fn,
)
if result is None:
logger.warning(
"No response from radio after channel resend to %s; send outcome is unknown",
channel.name,
)
raise HTTPException(status_code=504, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
if result.type == EventType.ERROR:
raise HTTPException(
status_code=500,
detail=f"Failed to resend message: {result.payload}",
)
if new_timestamp:
if sent_at is None:
raise HTTPException(status_code=500, detail="Failed to assign resend timestamp")
new_message = await create_outgoing_channel_message(
conversation_key=message.conversation_key,
text=message.text,
sender_timestamp=sender_timestamp,
received_at=sent_at,
sender_name=radio_name or None,
sender_key=resend_public_key,
channel_name=channel.name,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if new_message is None:
raise HTTPException(
status_code=500,
detail="Failed to store resent message - unexpected duplicate",
)
finally:
if new_timestamp and sent_at is not None:
await release_outgoing_sender_timestamp(
msg_type="CHAN",
conversation_key=message.conversation_key,
text=stored_text,
sender_timestamp=sender_timestamp,
)
if new_timestamp:
if now is None:
if sent_at is None or new_message is None:
raise HTTPException(status_code=500, detail="Failed to assign resend timestamp")
new_message = await create_outgoing_channel_message(
conversation_key=message.conversation_key,
text=message.text,
sender_timestamp=now,
received_at=now,
sender_name=radio_name or None,
sender_key=resend_public_key,
channel_name=channel.name,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if new_message is None:
logger.warning(
"Duplicate timestamp collision resending message %d — radio sent but DB row not created",
message.id,
)
return {"status": "ok", "message_id": message.id}
logger.info(
"Resent channel message %d as new message %d to %s",
@@ -353,7 +563,11 @@ async def resend_channel_message_record(
new_message.id,
channel.name,
)
return {"status": "ok", "message_id": new_message.id}
return ResendChannelMessageResponse(
status="ok",
message_id=new_message.id,
message=new_message,
)
logger.info("Resent channel message %d to %s", message.id, channel.name)
return {"status": "ok", "message_id": message.id}
return ResendChannelMessageResponse(status="ok", message_id=message.id)
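Channel rows are stored with a `"RadioName: body"` prefix, so the resend path strips its own name before re-sending to avoid double-prefixing on the wire. A minimal sketch of that round trip (helper names are illustrative):

```python
def add_sender_prefix(body: str, radio_name: str) -> str:
    """Format a channel message the way it is stored locally."""
    return f"{radio_name}: {body}" if radio_name else body


def strip_sender_prefix(stored_text: str, radio_name: str) -> str:
    """Recover the raw body so a resend does not double-prefix it."""
    prefix = f"{radio_name}: "
    if radio_name and stored_text.startswith(prefix):
        return stored_text[len(prefix):]
    return stored_text
```

Only the node's own name is stripped; a message that legitimately begins with someone else's `"Name: "` passes through unchanged.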

View File

@@ -1,3 +1,4 @@
import asyncio
import logging
import time
from collections.abc import Callable
@@ -13,6 +14,7 @@ logger = logging.getLogger(__name__)
BroadcastFn = Callable[..., Any]
LOG_MESSAGE_PREVIEW_LEN = 32
_decrypted_dm_store_lock = asyncio.Lock()
def _truncate_for_log(text: str, max_chars: int = LOG_MESSAGE_PREVIEW_LEN) -> str:
@@ -125,9 +127,48 @@ async def increment_ack_and_broadcast(
return ack_count
async def _reconcile_duplicate_message(
*,
existing_msg: Message,
packet_id: int | None,
path: str | None,
received_at: int,
path_len: int | None,
broadcast_fn: BroadcastFn,
) -> None:
logger.debug(
"Duplicate %s for %s (msg_id=%d, outgoing=%s) - adding path",
existing_msg.type,
existing_msg.conversation_key[:12],
existing_msg.id,
existing_msg.outgoing,
)
if path is not None:
paths = await MessageRepository.add_path(existing_msg.id, path, received_at, path_len)
else:
paths = existing_msg.paths or []
if existing_msg.outgoing and existing_msg.type == "CHAN":
ack_count = await MessageRepository.increment_ack_count(existing_msg.id)
else:
ack_count = existing_msg.acked
if existing_msg.outgoing or path is not None:
broadcast_message_acked(
message_id=existing_msg.id,
ack_count=ack_count,
paths=paths,
broadcast_fn=broadcast_fn,
)
if packet_id is not None:
await RawPacketRepository.mark_decrypted(packet_id, existing_msg.id)
async def handle_duplicate_message(
*,
packet_id: int,
packet_id: int | None,
msg_type: str,
conversation_key: str,
text: str,
@@ -153,34 +194,15 @@ async def handle_duplicate_message(
)
return
logger.debug(
"Duplicate %s for %s (msg_id=%d, outgoing=%s) - adding path",
msg_type,
conversation_key[:12],
existing_msg.id,
existing_msg.outgoing,
await _reconcile_duplicate_message(
existing_msg=existing_msg,
packet_id=packet_id,
path=path,
received_at=received_at,
path_len=path_len,
broadcast_fn=broadcast_fn,
)
if path is not None:
paths = await MessageRepository.add_path(existing_msg.id, path, received_at, path_len)
else:
paths = existing_msg.paths or []
if existing_msg.outgoing:
ack_count = await MessageRepository.increment_ack_count(existing_msg.id)
else:
ack_count = existing_msg.acked
if existing_msg.outgoing or path is not None:
broadcast_message_acked(
message_id=existing_msg.id,
ack_count=ack_count,
paths=paths,
broadcast_fn=broadcast_fn,
)
await RawPacketRepository.mark_decrypted(packet_id, existing_msg.id)
async def create_message_from_decrypted(
*,
@@ -289,32 +311,64 @@ async def create_dm_message_from_decrypted(
conversation_key = their_public_key.lower()
sender_name = contact.name if contact and not outgoing else None
msg_id = await MessageRepository.create(
msg_type="PRIV",
text=decrypted.message,
conversation_key=conversation_key,
sender_timestamp=decrypted.timestamp,
received_at=received,
path=path,
path_len=path_len,
outgoing=outgoing,
sender_key=conversation_key if not outgoing else None,
sender_name=sender_name,
)
async with _decrypted_dm_store_lock:
linked_message_id = await RawPacketRepository.get_linked_message_id(packet_id)
if linked_message_id is not None:
existing_msg = await MessageRepository.get_by_id(linked_message_id)
if existing_msg is not None:
await _reconcile_duplicate_message(
existing_msg=existing_msg,
packet_id=packet_id,
path=path,
received_at=received,
path_len=path_len,
broadcast_fn=broadcast_fn,
)
return None
if msg_id is None:
await handle_duplicate_message(
packet_id=packet_id,
if outgoing:
existing_msg = await MessageRepository.get_by_content(
msg_type="PRIV",
conversation_key=conversation_key,
text=decrypted.message,
sender_timestamp=decrypted.timestamp,
)
if existing_msg is not None:
await _reconcile_duplicate_message(
existing_msg=existing_msg,
packet_id=packet_id,
path=path,
received_at=received,
path_len=path_len,
broadcast_fn=broadcast_fn,
)
return None
msg_id = await MessageRepository.create(
msg_type="PRIV",
text=decrypted.message,
conversation_key=conversation_key,
sender_timestamp=decrypted.timestamp,
received_at=received,
path=path,
path_len=path_len,
outgoing=outgoing,
sender_key=conversation_key if not outgoing else None,
sender_name=sender_name,
)
if msg_id is None:
await handle_duplicate_message(
packet_id=packet_id,
msg_type="PRIV",
conversation_key=conversation_key,
text=decrypted.message,
sender_timestamp=decrypted.timestamp,
path=path,
received_at=received,
path_len=path_len,
broadcast_fn=broadcast_fn,
)
return None
logger.info(
'Stored direct message "%s" for %r (msg ID %d in contact ID %s, outgoing=%s)',
@@ -363,6 +417,15 @@ async def create_fallback_direct_message(
message_repository=MessageRepository,
) -> Message | None:
"""Store and broadcast a CONTACT_MSG_RECV fallback direct message."""
existing = await message_repository.get_by_content(
msg_type="PRIV",
conversation_key=conversation_key,
text=text,
sender_timestamp=sender_timestamp,
)
if existing is not None:
return None
msg_id = await message_repository.create(
msg_type="PRIV",
text=text,
@@ -396,6 +459,73 @@ async def create_fallback_direct_message(
return message
async def create_fallback_channel_message(
*,
conversation_key: str,
message_text: str,
sender_timestamp: int,
received_at: int,
path: str | None,
path_len: int | None,
txt_type: int,
sender_name: str | None,
channel_name: str | None,
broadcast_fn: BroadcastFn,
message_repository=MessageRepository,
) -> Message | None:
"""Store and broadcast a CHANNEL_MSG_RECV fallback channel message."""
conversation_key_normalized = conversation_key.upper()
text = f"{sender_name}: {message_text}" if sender_name else message_text
resolved_sender_key: str | None = None
if sender_name:
candidates = await ContactRepository.get_by_name(sender_name)
if len(candidates) == 1:
resolved_sender_key = candidates[0].public_key
msg_id = await message_repository.create(
msg_type="CHAN",
text=text,
conversation_key=conversation_key_normalized,
sender_timestamp=sender_timestamp,
received_at=received_at,
path=path,
path_len=path_len,
txt_type=txt_type,
sender_name=sender_name,
sender_key=resolved_sender_key,
)
if msg_id is None:
await handle_duplicate_message(
packet_id=None,
msg_type="CHAN",
conversation_key=conversation_key_normalized,
text=text,
sender_timestamp=sender_timestamp,
path=path,
received_at=received_at,
path_len=path_len,
broadcast_fn=broadcast_fn,
)
return None
message = build_message_model(
message_id=msg_id,
msg_type="CHAN",
conversation_key=conversation_key_normalized,
text=text,
sender_timestamp=sender_timestamp,
received_at=received_at,
paths=build_message_paths(path, received_at, path_len),
txt_type=txt_type,
sender_name=sender_name,
sender_key=resolved_sender_key,
channel_name=channel_name,
)
broadcast_message(message=message, broadcast_fn=broadcast_fn)
return message
async def create_outgoing_direct_message(
*,
conversation_key: str,

View File

@@ -7,6 +7,21 @@ POST_CONNECT_SETUP_TIMEOUT_SECONDS = 300
POST_CONNECT_SETUP_MAX_ATTEMPTS = 2
def _clean_device_string(value: object) -> str | None:
if not isinstance(value, str):
return None
cleaned = value.strip()
return cleaned or None
def _decode_fixed_string(raw: bytes, start: int, length: int) -> str | None:
if len(raw) < start:
return None
return _clean_device_string(
raw[start : start + length].decode("utf-8", "ignore").replace("\0", "")
)
async def run_post_connect_setup(radio_manager) -> None:
"""Run shared radio initialization after a transport connection succeeds."""
from app.event_handlers import register_event_handlers
@@ -78,18 +93,66 @@ async def run_post_connect_setup(radio_manager) -> None:
return await _original_handle_rx(data)
reader.handle_rx = _capture_handle_rx
radio_manager.device_info_loaded = False
radio_manager.max_contacts = None
radio_manager.device_model = None
radio_manager.firmware_build = None
radio_manager.firmware_version = None
radio_manager.max_channels = 40
radio_manager.path_hash_mode = 0
radio_manager.path_hash_mode_supported = False
try:
device_query = await mc.commands.send_device_query()
payload = (
device_query.payload
if device_query is not None and isinstance(device_query.payload, dict)
else {}
)
payload_max_contacts = payload.get("max_contacts")
if isinstance(payload_max_contacts, int):
radio_manager.max_contacts = max(1, payload_max_contacts)
payload_max_channels = payload.get("max_channels")
if isinstance(payload_max_channels, int):
radio_manager.max_channels = max(1, payload_max_channels)
radio_manager.device_model = _clean_device_string(payload.get("model"))
radio_manager.firmware_build = _clean_device_string(payload.get("fw_build"))
radio_manager.firmware_version = _clean_device_string(payload.get("ver"))
fw_ver = payload.get("fw ver")
payload_reports_device_info = isinstance(fw_ver, int) and fw_ver >= 3
if payload_reports_device_info:
radio_manager.device_info_loaded = True
if "path_hash_mode" in payload and isinstance(payload["path_hash_mode"], int):
radio_manager.path_hash_mode = payload["path_hash_mode"]
radio_manager.path_hash_mode_supported = True
if _captured_frame:
# Raw-frame fallback / completion:
# byte 1 = fw_ver, byte 2 = max_contacts/2, byte 3 = max_channels,
# bytes 8:20 = fw_build, 20:60 = model, 60:80 = ver, byte 81 = path_hash_mode
raw = _captured_frame[-1]
fw_ver = raw[1] if len(raw) > 1 else 0
if fw_ver >= 3:
radio_manager.device_info_loaded = True
if radio_manager.max_contacts is None and len(raw) >= 3:
radio_manager.max_contacts = max(1, raw[2] * 2)
if len(raw) >= 4 and not isinstance(payload_max_channels, int):
radio_manager.max_channels = max(1, raw[3])
if radio_manager.firmware_build is None:
radio_manager.firmware_build = _decode_fixed_string(raw, 8, 12)
if radio_manager.device_model is None:
radio_manager.device_model = _decode_fixed_string(raw, 20, 40)
if radio_manager.firmware_version is None:
radio_manager.firmware_version = _decode_fixed_string(raw, 60, 20)
if (
not radio_manager.path_hash_mode_supported
and fw_ver >= 10
and len(raw) >= 82
):
radio_manager.path_hash_mode = raw[81]
radio_manager.path_hash_mode_supported = True
logger.warning(
@@ -106,8 +169,20 @@ async def run_post_connect_setup(radio_manager) -> None:
logger.info("Path hash mode: %d (supported)", radio_manager.path_hash_mode)
else:
logger.debug("Firmware does not report path_hash_mode")
if radio_manager.device_info_loaded:
logger.info(
"Radio device info: model=%s build=%s version=%s max_contacts=%s max_channels=%d",
radio_manager.device_model or "unknown",
radio_manager.firmware_build or "unknown",
radio_manager.firmware_version or "unknown",
radio_manager.max_contacts
if radio_manager.max_contacts is not None
else "unknown",
radio_manager.max_channels,
)
logger.info("Max channel slots: %d", radio_manager.max_channels)
except Exception as exc:
logger.debug("Failed to query device info capabilities: %s", exc)
finally:
reader.handle_rx = _original_handle_rx
@@ -128,6 +203,7 @@ async def run_post_connect_setup(radio_manager) -> None:
drained = await drain_pending_messages(mc)
if drained > 0:
logger.info("Drained %d pending message(s)", drained)
radio_manager.clear_pending_message_channel_slots()
await mc.start_auto_message_fetching()
logger.info("Auto message fetching started")

View File

@@ -51,7 +51,7 @@ frontend/src/
│ ├── useRealtimeAppState.ts # WebSocket event application and reconnect recovery
│ ├── useAppShell.ts # App-shell view state (settings/sidebar/modals/cracker)
│ ├── useRepeaterDashboard.ts # Repeater dashboard state (login, panes, console, retries)
│ ├── useRadioControl.ts # Radio health/config state, reconnection, mesh discovery sweeps
│ ├── useAppSettings.ts # Settings, favorites, preferences migration
│ ├── useConversationRouter.ts # URL hash → active conversation routing
│ └── useContactsAndChannels.ts # Contact/channel loading, creation, deletion
@@ -110,7 +110,7 @@ frontend/src/
│ ├── NeighborsMiniMap.tsx # Leaflet mini-map for repeater neighbor locations
│ ├── settings/
│ │ ├── settingsConstants.ts # Settings section type, ordering, labels
│ │ ├── SettingsRadioSection.tsx # Name, keys, advert interval, max contacts, radio preset, freq/bw/sf/cr, txPower, lat/lon, reboot, mesh discovery
│ │ ├── SettingsLocalSection.tsx # Browser-local settings: theme, local label, reopen last conversation
│ │ ├── SettingsFanoutSection.tsx # Fanout integrations: MQTT, bots, config CRUD
│ │ ├── SettingsDatabaseSection.tsx # DB size, cleanup, auto-decrypt, local label
@@ -122,7 +122,8 @@ frontend/src/
│ │ ├── RepeaterTelemetryPane.tsx # Battery, airtime, packet counts
│ │ ├── RepeaterNeighborsPane.tsx # Neighbor table + lazy mini-map
│ │ ├── RepeaterAclPane.tsx # Permission table
│ │ ├── RepeaterNodeInfoPane.tsx # Repeater name, coords, clock drift
│ │ ├── RepeaterRadioSettingsPane.tsx # Radio config + advert intervals
│ │ ├── RepeaterLppTelemetryPane.tsx # CayenneLPP sensor data
│ │ ├── RepeaterOwnerInfoPane.tsx # Owner info + guest password
│ │ ├── RepeaterActionsPane.tsx # Send Advert, Sync Clock, Reboot
@@ -249,6 +250,7 @@ High-level state is delegated to hooks:
- `SettingsRadioSection.tsx` surfaces `path_hash_mode` only when `config.path_hash_mode_supported` is true.
- Advert-location control is intentionally only `off` vs `include node location`. Companion-radio firmware does not reliably distinguish saved coordinates from live GPS in this path.
- Mesh discovery in the radio section is limited to node classes that currently answer discovery control-data requests in firmware: repeaters and sensors.
- Frontend `path_len` fields are hop counts, not raw byte lengths; multibyte path rendering must use the accompanying metadata before splitting hop identifiers.
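As a hedged illustration of that rule (the function name and explicit hop-width parameter are illustrative, not the real `pathUtils` API), hop splitting must honor the hash width from the accompanying metadata rather than assuming one byte per hop:

```typescript
// Illustrative sketch: split a hex-encoded path into hop identifiers.
// `pathLen` is a hop count; each hop occupies `hashBytes` bytes,
// i.e. hashBytes * 2 hex characters.
function splitHops(path: string, pathLen: number, hashBytes: number): string[] {
  const charsPerHop = hashBytes * 2;
  const hops: string[] = [];
  for (let i = 0; i < pathLen; i++) {
    hops.push(path.slice(i * charsPerHop, (i + 1) * charsPerHop));
  }
  return hops;
}
```

With 1-byte hashes `splitHops('a1b2c3', 3, 1)` yields three hops; naively splitting the same string as if every hop were one byte under a multibyte mode would misread the hop boundaries.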
## WebSocket (`useWebSocket.ts`)
@@ -256,7 +258,7 @@ High-level state is delegated to hooks:
- Auto reconnect (3s) with cleanup guard on unmount.
- Heartbeat ping every 30s.
- Incoming JSON is parsed through `wsEvents.ts`, which validates the top-level envelope and known event type strings, then casts payloads at the handler boundary. It does not schema-validate per-event payload shapes.
- Event handlers: `health`, `message`, `contact`, `contact_resolved`, `channel`, `raw_packet`, `message_acked`, `contact_deleted`, `channel_deleted`, `error`, `success`, `pong` (ignored).
- For `raw_packet` events, use `observation_id` as event identity; `id` is a storage reference and may repeat.
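A minimal sketch of that identity rule (the event shape here is trimmed to just the two fields under discussion; it is an illustration, not the app's full `RawPacket` type):

```typescript
// Deduplicate raw_packet events by observation_id; the storage `id`
// may legitimately repeat across observations of the same stored packet.
interface RawPacketEvent {
  id: number; // storage reference, may repeat
  observation_id: string; // unique per observation event
}

function appendPacket(list: RawPacketEvent[], incoming: RawPacketEvent): RawPacketEvent[] {
  if (list.some((p) => p.observation_id === incoming.observation_id)) {
    return list; // already displayed this observation
  }
  return [...list, incoming];
}
```

Keying on `id` instead would silently drop distinct observations that share a storage row.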
## URL Hash Navigation (`utils/urlHash.ts`)
@@ -316,6 +318,8 @@ LocalStorage migration helpers for favorites; canonical favorites are server-sid
- `flood_scope`
- `blocked_keys`, `blocked_names`
The backend still carries `sidebar_sort_order` for compatibility and old preference migration, but the current sidebar UI stores sort order per section (`Channels`, `Contacts`, `Repeaters`) in frontend localStorage rather than treating it as one global server-backed setting.
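The per-section split can be sketched like this (the key format and the `KeyValueStore` shape are assumptions for illustration, not the app's actual localStorage schema):

```typescript
type SortOrder = 'recent' | 'alpha';
type SidebarSection = 'Channels' | 'Contacts' | 'Repeaters';

type KeyValueStore = {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

const sortKey = (section: SidebarSection) => `sidebarSort:${section}`;

// Each section reads its own key; missing/unknown values fall back to
// 'recent', mirroring the old single global default.
function loadSortOrder(section: SidebarSection, storage: KeyValueStore): SortOrder {
  return storage.getItem(sortKey(section)) === 'alpha' ? 'alpha' : 'recent';
}

function saveSortOrder(section: SidebarSection, order: SortOrder, storage: KeyValueStore): void {
  storage.setItem(sortKey(section), order);
}
```

In the browser, `window.localStorage` already satisfies this `KeyValueStore` shape.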
Note: MQTT, bot, and community MQTT settings were migrated to the `fanout_configs` table (managed via `/api/fanout`). They are no longer part of `AppSettings`.
`HealthStatus` includes `fanout_statuses: Record<string, FanoutStatusEntry>` mapping config IDs to `{name, type, status}`. Also includes `bots_disabled: boolean`.
@@ -326,7 +330,7 @@ Note: MQTT, bot, and community MQTT settings were migrated to the `fanout_config
## Contact Info Pane
Clicking a contact's avatar in `ChatHeader` or `MessageList` opens a `ContactInfoPane` sheet (right drawer) showing comprehensive contact details fetched from `GET /api/contacts/analytics` using either `?public_key=...` or `?name=...`:
- Header: avatar, name, public key, type badge, on-radio badge
- Info grid: last seen, first heard, last contacted, distance, hops
@@ -359,7 +363,7 @@ For repeater contacts (`type=2`), `ConversationPane.tsx` renders `RepeaterDashbo
**Login**: `RepeaterLogin` component — password or guest login via `POST /api/contacts/{key}/repeater/login`.
**Dashboard panes** (after login): Telemetry, Node Info, Neighbors, ACL, Radio Settings, Advert Intervals, Owner Info — each fetched via granular `POST /api/contacts/{key}/repeater/{pane}` endpoints. Panes retry up to 3 times client-side. `Neighbors` depends on the smaller `node-info` fetch for repeater GPS, not the heavier radio-settings batch. "Load All" fetches all panes serially (parallel would queue behind the radio lock).
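The retry-and-serialize pattern described above can be sketched as follows (`withRetry` and `loadAllPanes` are illustrative names, not the dashboard's actual helpers):

```typescript
// Retry a single pane fetch up to `attempts` times before surfacing the error.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}

// "Load All": run pane fetches one at a time — firing them in parallel
// would just queue the requests behind the backend's radio lock anyway.
async function loadAllPanes<T>(panes: Array<() => Promise<T>>): Promise<T[]> {
  const results: T[] = [];
  for (const pane of panes) {
    results.push(await withRetry(pane));
  }
  return results;
}
```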
**Actions pane**: Send Advert, Sync Clock, Reboot — all send CLI commands via `POST /api/contacts/{key}/command`.
@@ -405,6 +409,10 @@ npm run test:run
npm run build
```
`npm run packaged-build` is release-only. It writes the fallback `frontend/prebuilt`
directory used by the downloadable prebuilt release zip; normal development and
validation should stick to `npm run build`.
When touching cross-layer contracts, also run backend tests from repo root:
```bash
@@ -413,6 +421,10 @@ PYTHONPATH=. uv run pytest tests/ -v
## Errata & Known Non-Issues
### Contacts rollup uses mention styling for unread DMs
This is intentional. In the sidebar section headers, unread direct messages are treated as mention-equivalent, so the Contacts rollup uses the highlighted mention-style badge for any unread DM. Row-level mention detection remains separate; this note is only about the section summary styling.
### RawPacketList always scrolls to bottom
`RawPacketList` unconditionally scrolls to the latest packet on every update. This is intentional — the packet feed is a live status display, not an interactive log meant for lingering or long-term analysis. Users watching it want to see the newest packet, not hold a scroll position.

View File

@@ -1,11 +1,12 @@
{
"name": "remoteterm-meshcore-frontend",
"private": true,
"version": "3.4.0",
"type": "module",
"scripts": {
"dev": "vite",
"build": "tsc && vite build",
"packaged-build": "vite build --outDir prebuilt",
"preview": "vite preview",
"test": "vitest",
"test:run": "vitest run",

View File

@@ -1,5 +1,6 @@
import { useEffect, useCallback, useRef, useState } from 'react';
import { api } from './api';
import * as messageCache from './messageCache';
import { takePrefetchOrFetch } from './prefetch';
import { useWebSocket } from './useWebSocket';
import {
@@ -72,6 +73,7 @@ export function App() {
const messageInputRef = useRef<MessageInputHandle>(null);
const [rawPackets, setRawPackets] = useState<RawPacket[]>([]);
const [channelUnreadMarker, setChannelUnreadMarker] = useState<ChannelUnreadMarker | null>(null);
const [visibilityVersion, setVisibilityVersion] = useState(0);
const lastUnreadBackfillAttemptRef = useRef<string | null>(null);
const {
notificationsSupported,
@@ -122,6 +124,9 @@ export function App() {
handleDisconnect,
handleReconnect,
handleAdvertise,
meshDiscovery,
meshDiscoveryLoadingTarget,
handleDiscoverMesh,
handleHealthRefresh,
} = useRadioControl();
@@ -130,7 +135,6 @@ export function App() {
favorites,
fetchAppSettings,
handleSaveAppSettings,
handleToggleFavorite,
handleToggleBlockedKey,
handleToggleBlockedName,
@@ -228,6 +232,7 @@ export function App() {
fetchOlderMessages,
fetchNewerMessages,
jumpToBottom,
reloadCurrentConversation,
addMessageIfNew,
updateMessageAck,
triggerReconcile,
@@ -237,6 +242,7 @@ export function App() {
unreadCounts,
mentions,
lastMessageTimes,
unreadLastReadAts,
incrementUnread,
renameConversationState,
markAllRead,
@@ -260,14 +266,12 @@ export function App() {
if (activeChannelUnreadCount <= 0) {
return null;
}
return {
channelId: activeChannelId,
lastReadAt: unreadLastReadAts[getStateKey('channel', activeChannelId)] ?? null,
};
});
}, [activeConversation, unreadCounts, unreadLastReadAts]);
useEffect(() => {
lastUnreadBackfillAttemptRef.current = null;
@@ -323,22 +327,41 @@ export function App() {
updateMessageAck,
notifyIncomingMessage,
});
const handleVisibilityPolicyChanged = useCallback(() => {
messageCache.clear();
reloadCurrentConversation();
void refreshUnreads();
setVisibilityVersion((current) => current + 1);
}, [refreshUnreads, reloadCurrentConversation]);
const handleBlockKey = useCallback(
async (key: string) => {
await handleToggleBlockedKey(key);
handleVisibilityPolicyChanged();
},
[handleToggleBlockedKey, handleVisibilityPolicyChanged]
);
const handleBlockName = useCallback(
async (name: string) => {
await handleToggleBlockedName(name);
handleVisibilityPolicyChanged();
},
[handleToggleBlockedName, handleVisibilityPolicyChanged]
);
const {
handleSendMessage,
handleResendChannelMessage,
handleSetChannelFloodScopeOverride,
handleSenderClick,
handleTrace,
handlePathDiscovery,
} = useConversationActions({
activeConversation,
activeConversationRef,
setContacts,
setChannels,
addMessageIfNew,
jumpToBottom,
messageInputRef,
});
const handleCreateCrackedChannel = useCallback(
@@ -377,10 +400,7 @@ export function App() {
void markAllRead();
},
favorites,
legacySortOrder: appSettings?.sidebar_sort_order,
isConversationNotificationsEnabled,
};
const conversationPaneProps = {
@@ -405,6 +425,7 @@ export function App() {
loadingNewer,
messageInputRef,
onTrace: handleTrace,
onPathDiscovery: handlePathDiscovery,
onToggleFavorite: handleToggleFavorite,
onDeleteContact: handleDeleteContact,
onDeleteChannel: handleDeleteChannel,
@@ -438,6 +459,7 @@ export function App() {
const searchProps = {
contacts,
channels,
visibilityVersion,
onNavigateToMessage: handleNavigateToMessage,
prefillRequest: searchPrefillRequest,
};
@@ -452,6 +474,9 @@ export function App() {
onDisconnect: handleDisconnect,
onReconnect: handleReconnect,
onAdvertise: handleAdvertise,
meshDiscovery,
meshDiscoveryLoadingTarget,
onDiscoverMesh: handleDiscoverMesh,
onHealthRefresh: handleHealthRefresh,
onRefreshAppSettings: fetchAppSettings,
blockedKeys: appSettings?.blocked_keys,

View File

@@ -6,9 +6,7 @@ import type {
CommandResponse,
Contact,
ContactAnalytics,
ContactAdvertPath,
ContactAdvertPathSummary,
ContactDetail,
FanoutConfig,
Favorite,
HealthStatus,
@@ -17,14 +15,18 @@ import type {
MessagesAroundResponse,
MigratePreferencesRequest,
MigratePreferencesResponse,
NameOnlyContactDetail,
RadioConfig,
RadioConfigUpdate,
RadioDiscoveryResponse,
RadioDiscoveryTarget,
PathDiscoveryResponse,
ResendChannelMessageResponse,
RepeaterAclResponse,
RepeaterAdvertIntervalsResponse,
RepeaterLoginResponse,
RepeaterLppTelemetryResponse,
RepeaterNeighborsResponse,
RepeaterNodeInfoResponse,
RepeaterOwnerInfoResponse,
RepeaterRadioSettingsResponse,
RepeaterStatusResponse,
@@ -97,6 +99,11 @@ export const api = {
fetchJson<{ status: string }>('/radio/advertise', {
method: 'POST',
}),
discoverMesh: (target: RadioDiscoveryTarget) =>
fetchJson<RadioDiscoveryResponse>('/radio/discover', {
method: 'POST',
body: JSON.stringify({ target }),
}),
rebootRadio: () =>
fetchJson<{ status: string; message: string }>('/radio/reboot', {
method: 'POST',
@@ -120,18 +127,12 @@ export const api = {
fetchJson<ContactAdvertPathSummary[]>(
`/contacts/repeaters/advert-paths?limit_per_repeater=${limitPerRepeater}`
),
getContactAnalytics: (params: { publicKey?: string; name?: string }) => {
const searchParams = new URLSearchParams();
if (params.publicKey) searchParams.set('public_key', params.publicKey);
if (params.name) searchParams.set('name', params.name);
return fetchJson<ContactAnalytics>(`/contacts/analytics?${searchParams.toString()}`);
},
deleteContact: (publicKey: string) =>
fetchJson<{ status: string }>(`/contacts/${publicKey}`, {
method: 'DELETE',
@@ -154,6 +155,10 @@ export const api = {
fetchJson<TraceResponse>(`/contacts/${publicKey}/trace`, {
method: 'POST',
}),
requestPathDiscovery: (publicKey: string) =>
fetchJson<PathDiscoveryResponse>(`/contacts/${publicKey}/path-discovery`, {
method: 'POST',
}),
setContactRoutingOverride: (publicKey: string, route: string) =>
fetchJson<{ status: string; public_key: string }>(`/contacts/${publicKey}/routing-override`, {
method: 'POST',
@@ -234,7 +239,7 @@ export const api = {
body: JSON.stringify({ channel_key: channelKey, text }),
}),
resendChannelMessage: (messageId: number, newTimestamp?: boolean) =>
fetchJson<ResendChannelMessageResponse>(
`/messages/channel/${messageId}/resend${newTimestamp ? '?new_timestamp=true' : ''}`,
{ method: 'POST' }
),
@@ -352,6 +357,10 @@ export const api = {
fetchJson<RepeaterNeighborsResponse>(`/contacts/${publicKey}/repeater/neighbors`, {
method: 'POST',
}),
repeaterNodeInfo: (publicKey: string) =>
fetchJson<RepeaterNodeInfoResponse>(`/contacts/${publicKey}/repeater/node-info`, {
method: 'POST',
}),
repeaterAcl: (publicKey: string) =>
fetchJson<RepeaterAclResponse>(`/contacts/${publicKey}/repeater/acl`, {
method: 'POST',

View File

@@ -0,0 +1,105 @@
import { useEffect, useState } from 'react';
import { stripRegionScopePrefix } from '../utils/regionScope';
import { Button } from './ui/button';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from './ui/dialog';
import { Input } from './ui/input';
import { Label } from './ui/label';
interface ChannelFloodScopeOverrideModalProps {
open: boolean;
onClose: () => void;
roomName: string;
currentOverride: string | null;
onSetOverride: (value: string) => void;
}
export function ChannelFloodScopeOverrideModal({
open,
onClose,
roomName,
currentOverride,
onSetOverride,
}: ChannelFloodScopeOverrideModalProps) {
const [region, setRegion] = useState('');
useEffect(() => {
if (!open) {
return;
}
setRegion(stripRegionScopePrefix(currentOverride));
}, [currentOverride, open]);
const trimmedRegion = region.trim();
return (
<Dialog open={open} onOpenChange={(isOpen) => !isOpen && onClose()}>
<DialogContent className="sm:max-w-[520px]">
<DialogHeader>
<DialogTitle>Regional Override</DialogTitle>
<DialogDescription>
Room-level regional routing temporarily changes the radio flood scope before send and
restores it after. This can noticeably slow room sends.
</DialogDescription>
</DialogHeader>
<div className="space-y-4">
<div className="rounded-md border border-border bg-muted/20 p-3 text-sm">
<div className="font-medium">{roomName}</div>
<div className="mt-1 text-muted-foreground">
Current regional override:{' '}
{currentOverride ? stripRegionScopePrefix(currentOverride) : 'none'}
</div>
</div>
<div className="space-y-2">
<Label htmlFor="channel-region-input">Region</Label>
<Input
id="channel-region-input"
value={region}
onChange={(event) => setRegion(event.target.value)}
placeholder="Esperance"
autoFocus
/>
</div>
</div>
<DialogFooter className="gap-2 sm:block sm:space-x-0">
<div className="space-y-2">
<Button
type="button"
className="w-full"
disabled={trimmedRegion.length === 0}
onClick={() => {
onSetOverride(trimmedRegion);
onClose();
}}
>
{trimmedRegion.length > 0
? `Use ${trimmedRegion} region for ${roomName}`
: `Use region for ${roomName}`}
</Button>
<Button
type="button"
variant="outline"
className="w-full"
onClick={() => {
onSetOverride('');
onClose();
}}
>
Do not use region routing for {roomName}
</Button>
</div>
</DialogFooter>
</DialogContent>
</Dialog>
);
}

View File

@@ -1,14 +1,24 @@
import { useEffect, useRef, useState } from 'react';
import { Bell, Globe2, Info, Star, Trash2 } from 'lucide-react';
import { Bell, Globe2, Info, Route, Star, Trash2 } from 'lucide-react';
import { toast } from './ui/sonner';
import { DirectTraceIcon } from './DirectTraceIcon';
import { ContactPathDiscoveryModal } from './ContactPathDiscoveryModal';
import { ChannelFloodScopeOverrideModal } from './ChannelFloodScopeOverrideModal';
import { isFavorite } from '../utils/favorites';
import { handleKeyboardActivate } from '../utils/a11y';
import { isPublicChannelKey } from '../utils/publicChannel';
import { stripRegionScopePrefix } from '../utils/regionScope';
import { isPrefixOnlyContact } from '../utils/pubkey';
import { ContactAvatar } from './ContactAvatar';
import { ContactStatusInfo } from './ContactStatusInfo';
import type { Channel, Contact, Conversation, Favorite, RadioConfig } from '../types';
import type {
Channel,
Contact,
Conversation,
Favorite,
PathDiscoveryResponse,
RadioConfig,
} from '../types';
interface ChatHeaderProps {
conversation: Conversation;
@@ -20,6 +30,7 @@ interface ChatHeaderProps {
notificationsEnabled: boolean;
notificationsPermission: NotificationPermission | 'unsupported';
onTrace: () => void;
onPathDiscovery: (publicKey: string) => Promise<PathDiscoveryResponse>;
onToggleNotifications: () => void;
onToggleFavorite: (type: 'channel' | 'contact', id: string) => void;
onSetChannelFloodScopeOverride?: (key: string, floodScopeOverride: string) => void;
@@ -39,6 +50,7 @@ export function ChatHeader({
notificationsEnabled,
notificationsPermission,
onTrace,
onPathDiscovery,
onToggleNotifications,
onToggleFavorite,
onSetChannelFloodScopeOverride,
@@ -49,10 +61,14 @@ export function ChatHeader({
}: ChatHeaderProps) {
const [showKey, setShowKey] = useState(false);
const [contactStatusInline, setContactStatusInline] = useState(true);
const [pathDiscoveryOpen, setPathDiscoveryOpen] = useState(false);
const [channelOverrideOpen, setChannelOverrideOpen] = useState(false);
const keyTextRef = useRef<HTMLSpanElement | null>(null);
useEffect(() => {
setShowKey(false);
setPathDiscoveryOpen(false);
setChannelOverrideOpen(false);
}, [conversation.id]);
const activeChannel =
@@ -88,12 +104,7 @@ export function ChatHeader({
const handleEditFloodScopeOverride = () => {
if (conversation.type !== 'channel' || !onSetChannelFloodScopeOverride) return;
setChannelOverrideOpen(true);
};
const handleOpenConversationInfo = () => {
@@ -272,6 +283,21 @@ export function ChatHeader({
</span>
</span>
<div className="flex items-center justify-end gap-0.5 flex-shrink-0">
{conversation.type === 'contact' && (
<button
className="p-1 rounded hover:bg-accent text-lg leading-none transition-colors disabled:cursor-not-allowed disabled:opacity-50 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
onClick={() => setPathDiscoveryOpen(true)}
title={
activeContactIsPrefixOnly
? 'Path Discovery unavailable until the full contact key is known'
: 'Path Discovery. Send a routed probe and inspect the forward and return paths'
}
aria-label="Path Discovery"
disabled={activeContactIsPrefixOnly}
>
<Route className="h-4 w-4 text-muted-foreground" aria-hidden="true" />
</button>
)}
{conversation.type === 'contact' && (
<button
className="p-1 rounded hover:bg-accent text-lg leading-none transition-colors disabled:cursor-not-allowed disabled:opacity-50 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
@@ -279,7 +305,7 @@ export function ChatHeader({
title={
activeContactIsPrefixOnly
? 'Direct Trace unavailable until the full contact key is known'
: 'Direct Trace. Send a zero-hop packet to this contact and display out and back SNR'
}
aria-label="Direct Trace"
disabled={activeContactIsPrefixOnly}
@@ -354,7 +380,7 @@ export function ChatHeader({
)}
</button>
)}
{!(conversation.type === 'channel' && isPublicChannelKey(conversation.id)) && (
<button
className="p-1 rounded hover:bg-destructive/10 text-muted-foreground hover:text-destructive text-lg leading-none transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
onClick={() => {
@@ -371,6 +397,25 @@ export function ChatHeader({
</button>
)}
</div>
{conversation.type === 'contact' && activeContact && (
<ContactPathDiscoveryModal
open={pathDiscoveryOpen}
onClose={() => setPathDiscoveryOpen(false)}
contact={activeContact}
contacts={contacts}
radioName={config?.name ?? null}
onDiscover={onPathDiscovery}
/>
)}
{conversation.type === 'channel' && onSetChannelFloodScopeOverride && (
<ChannelFloodScopeOverrideModal
open={channelOverrideOpen}
onClose={() => setChannelOverrideOpen(false)}
roomName={conversation.name}
currentOverride={activeFloodScopeDisplay}
onSetOverride={(value) => onSetChannelFloodScopeOverride(conversation.id, value)}
/>
)}
</header>
);
}

View File

@@ -16,6 +16,7 @@ import {
hasRoutingOverride,
parsePathHops,
} from '../utils/pathUtils';
import { isPublicChannelKey } from '../utils/publicChannel';
import { getMapFocusHash } from '../utils/urlHash';
import { isFavorite } from '../utils/favorites';
import { handleKeyboardActivate } from '../utils/a11y';
@@ -278,11 +279,6 @@ export function ContactInfoPane({
<span className="text-[10px] uppercase tracking-wider px-1.5 py-0.5 rounded bg-muted text-muted-foreground font-medium">
{CONTACT_TYPE_LABELS[contact.type] ?? 'Unknown'}
</span>
</div>
</div>
</div>
@@ -616,7 +612,7 @@ function MostActiveRoomsSection({
onKeyDown={onNavigateToChannel ? handleKeyboardActivate : undefined}
onClick={() => onNavigateToChannel?.(room.channel_key)}
>
{room.channel_name.startsWith('#') || isPublicChannelKey(room.channel_key)
? room.channel_name
: `#${room.channel_name}`}
</span>

View File

@@ -0,0 +1,213 @@
import { useMemo, useState } from 'react';
import type { Contact, PathDiscoveryResponse, PathDiscoveryRoute } from '../types';
import {
findContactsByPrefix,
formatRouteLabel,
getEffectiveContactRoute,
hasRoutingOverride,
parsePathHops,
} from '../utils/pathUtils';
import { Button } from './ui/button';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from './ui/dialog';
interface ContactPathDiscoveryModalProps {
open: boolean;
onClose: () => void;
contact: Contact;
contacts: Contact[];
radioName: string | null;
onDiscover: (publicKey: string) => Promise<PathDiscoveryResponse>;
}
function formatPathHashMode(mode: number): string {
if (mode === 0) return '1-byte hops';
if (mode === 1) return '2-byte hops';
if (mode === 2) return '3-byte hops';
return 'Unknown hop width';
}
function renderRouteNodes(
route: PathDiscoveryRoute,
startLabel: string,
endLabel: string,
contacts: Contact[]
): string {
if (route.path_len <= 0 || !route.path) {
return `${startLabel} -> ${endLabel}`;
}
const hops = parsePathHops(route.path, route.path_len).map((prefix) => {
const matches = findContactsByPrefix(prefix, contacts, true);
if (matches.length === 1) {
return matches[0].name || `${matches[0].public_key.slice(0, prefix.length)}`;
}
if (matches.length > 1) {
return `${prefix}…?`;
}
return `${prefix}`;
});
return [startLabel, ...hops, endLabel].join(' -> ');
}
function RouteCard({
label,
route,
chain,
}: {
label: string;
route: PathDiscoveryRoute;
chain: string;
}) {
const rawPath = parsePathHops(route.path, route.path_len).join(' -> ') || 'direct';
return (
<div className="rounded-md border border-border bg-muted/20 p-3">
<div className="flex items-center justify-between gap-3">
<h4 className="text-sm font-semibold">{label}</h4>
<span className="text-[11px] text-muted-foreground">
{formatRouteLabel(route.path_len, true)}
</span>
</div>
<p className="mt-2 text-sm">{chain}</p>
<div className="mt-2 flex flex-wrap gap-x-3 gap-y-1 text-[11px] text-muted-foreground">
<span>Raw: {rawPath}</span>
<span>{formatPathHashMode(route.path_hash_mode)}</span>
</div>
</div>
);
}
export function ContactPathDiscoveryModal({
open,
onClose,
contact,
contacts,
radioName,
onDiscover,
}: ContactPathDiscoveryModalProps) {
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
const [result, setResult] = useState<PathDiscoveryResponse | null>(null);
const effectiveRoute = useMemo(() => getEffectiveContactRoute(contact), [contact]);
const hasForcedRoute = hasRoutingOverride(contact);
const learnedRouteSummary = useMemo(() => {
if (contact.last_path_len === -1) {
return 'Flood';
}
const hops = parsePathHops(contact.last_path, contact.last_path_len);
return hops.length > 0
? `${formatRouteLabel(contact.last_path_len, true)} (${hops.join(' -> ')})`
: formatRouteLabel(contact.last_path_len, true);
}, [contact.last_path, contact.last_path_len]);
const forcedRouteSummary = useMemo(() => {
if (!hasForcedRoute) {
return null;
}
if (effectiveRoute.pathLen === -1) {
return 'Flood';
}
const hops = parsePathHops(effectiveRoute.path, effectiveRoute.pathLen);
return hops.length > 0
? `${formatRouteLabel(effectiveRoute.pathLen, true)} (${hops.join(' -> ')})`
: formatRouteLabel(effectiveRoute.pathLen, true);
}, [effectiveRoute, hasForcedRoute]);
const forwardChain = result
? renderRouteNodes(
result.forward_path,
radioName || 'Local radio',
contact.name || contact.public_key.slice(0, 12),
contacts
)
: null;
const returnChain = result
? renderRouteNodes(
result.return_path,
contact.name || contact.public_key.slice(0, 12),
radioName || 'Local radio',
contacts
)
: null;
const handleDiscover = async () => {
setLoading(true);
setError(null);
try {
const discovered = await onDiscover(contact.public_key);
setResult(discovered);
} catch (err) {
setError(err instanceof Error ? err.message : 'Unknown error');
} finally {
setLoading(false);
}
};
return (
<Dialog open={open} onOpenChange={(isOpen) => !isOpen && onClose()}>
<DialogContent className="sm:max-w-[560px]">
<DialogHeader>
<DialogTitle>Path Discovery</DialogTitle>
<DialogDescription>
Send a routed probe to this contact and wait for the round-trip path response. If a
response arrives, the learned forward route is saved back onto the contact.
</DialogDescription>
</DialogHeader>
<div className="space-y-4">
<div className="rounded-md border border-border bg-muted/20 p-3 text-sm">
<div className="font-medium">{contact.name || contact.public_key.slice(0, 12)}</div>
<div className="mt-1 text-muted-foreground">
Current learned route: {learnedRouteSummary}
</div>
{forcedRouteSummary && (
<div className="mt-1 text-destructive">
Current forced route: {forcedRouteSummary}
</div>
)}
</div>
{hasForcedRoute && (
<div className="rounded-md border border-warning/30 bg-warning/10 px-3 py-2 text-sm text-warning">
A forced route override is currently set for this contact. Path discovery will update
the learned route data but will not replace the forced path. To make the newly
discovered path take effect, clear the forced route; rerun discovery only if you want a
fresher route sample.
</div>
)}
{error && (
<div className="rounded-md border border-destructive/30 bg-destructive/10 px-3 py-2 text-sm text-destructive">
{error}
</div>
)}
{result && forwardChain && returnChain && (
<div className="space-y-3">
<RouteCard label="Forward Path" route={result.forward_path} chain={forwardChain} />
<RouteCard label="Return Path" route={result.return_path} chain={returnChain} />
</div>
)}
</div>
<DialogFooter className="gap-2 sm:justify-between">
<Button variant="secondary" onClick={onClose}>
Close
</Button>
<Button onClick={handleDiscover} disabled={loading}>
{loading ? 'Running...' : 'Run path discovery'}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
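The hop-resolution rules in renderRouteNodes above can be restated as a standalone sketch. The contact names, keys, and the resolveHop/renderChain helpers below are illustrative only, not part of the codebase; the rules (unique prefix match renders the contact name, ambiguous match renders `prefix…?`, no match falls back to the raw prefix) are taken from the diff:

```typescript
// Simplified, self-contained sketch of hop-prefix resolution.
interface SketchContact {
  name: string;
  public_key: string;
}

// Resolve one hex hop prefix against the contact list.
function resolveHop(prefix: string, contacts: SketchContact[]): string {
  const matches = contacts.filter((c) => c.public_key.startsWith(prefix));
  if (matches.length === 1) {
    // Unique match: show the contact's name (or a key prefix fallback).
    return matches[0].name || matches[0].public_key.slice(0, prefix.length);
  }
  if (matches.length > 1) {
    // Ambiguous: more than one contact shares this prefix.
    return `${prefix}…?`;
  }
  return prefix;
}

// Join start label, resolved hops, and end label into a display chain.
function renderChain(
  hops: string[],
  start: string,
  end: string,
  contacts: SketchContact[]
): string {
  return [start, ...hops.map((h) => resolveHop(h, contacts)), end].join(' -> ');
}

// Hypothetical contacts for illustration.
const demoContacts: SketchContact[] = [
  { name: 'Ridge Repeater', public_key: 'ae92f0' },
  { name: 'Valley Node', public_key: 'f13e77' },
  { name: 'Valley Backup', public_key: 'f13e99' },
];

console.log(renderChain(['ae', 'f13e'], 'Local radio', 'Summit', demoContacts));
// -> Local radio -> Ridge Repeater -> f13e…? -> Summit
```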

View File

@@ -0,0 +1,175 @@
import { useEffect, useMemo, useState } from 'react';
import { api } from '../api';
import type { Contact } from '../types';
import {
formatRouteLabel,
formatRoutingOverrideInput,
hasRoutingOverride,
} from '../utils/pathUtils';
import { Button } from './ui/button';
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from './ui/dialog';
import { Input } from './ui/input';
import { Label } from './ui/label';
interface ContactRoutingOverrideModalProps {
open: boolean;
onClose: () => void;
contact: Contact;
onSaved: (message: string) => void;
onError: (message: string) => void;
}
function summarizeLearnedRoute(contact: Contact): string {
return formatRouteLabel(contact.last_path_len, true);
}
function summarizeForcedRoute(contact: Contact): string | null {
if (!hasRoutingOverride(contact)) {
return null;
}
const routeOverrideLen = contact.route_override_len;
return routeOverrideLen == null ? null : formatRouteLabel(routeOverrideLen, true);
}
export function ContactRoutingOverrideModal({
open,
onClose,
contact,
onSaved,
onError,
}: ContactRoutingOverrideModalProps) {
const [route, setRoute] = useState('');
const [saving, setSaving] = useState(false);
const [error, setError] = useState<string | null>(null);
useEffect(() => {
if (!open) {
return;
}
setRoute(formatRoutingOverrideInput(contact));
setError(null);
}, [contact, open]);
const forcedRouteSummary = useMemo(() => summarizeForcedRoute(contact), [contact]);
const saveRoute = async (value: string) => {
setSaving(true);
setError(null);
try {
await api.setContactRoutingOverride(contact.public_key, value);
onSaved(value.trim() === '' ? 'Routing override cleared' : 'Routing override updated');
onClose();
} catch (err) {
const message = err instanceof Error ? err.message : 'Failed to update routing override';
setError(message);
onError(message);
} finally {
setSaving(false);
}
};
return (
<Dialog open={open} onOpenChange={(isOpen) => !isOpen && onClose()}>
<DialogContent className="sm:max-w-[560px]">
<DialogHeader>
<DialogTitle>Routing Override</DialogTitle>
<DialogDescription>
Set a forced route for this contact. Leave the field blank to clear the override and
fall back to the learned route or flood until a new path is heard.
</DialogDescription>
</DialogHeader>
<form
className="space-y-4"
onSubmit={(event) => {
event.preventDefault();
void saveRoute(route);
}}
>
<div className="rounded-md border border-border bg-muted/20 p-3 text-sm">
<div className="font-medium">{contact.name || contact.public_key.slice(0, 12)}</div>
<div className="mt-1 text-muted-foreground">
Current learned route: {summarizeLearnedRoute(contact)}
</div>
{forcedRouteSummary && (
<div className="mt-1 text-destructive">
Current forced route: {forcedRouteSummary}
</div>
)}
</div>
<div className="space-y-2">
<Label htmlFor="routing-override-input">Forced route</Label>
<Input
id="routing-override-input"
value={route}
onChange={(event) => setRoute(event.target.value)}
placeholder='Examples: "ae,f1" or "ae92,f13e"'
autoFocus
disabled={saving}
/>
<div className="space-y-1 text-xs text-muted-foreground">
<p>Use comma-separated 1, 2, or 3 byte hop IDs for an explicit path.</p>
</div>
</div>
<div className="space-y-2">
<div className="grid grid-cols-2 gap-2">
<Button
type="button"
variant="outline"
className="w-full"
onClick={() => void saveRoute('-1')}
disabled={saving}
>
Force Flood
</Button>
<Button
type="button"
variant="outline"
className="w-full"
onClick={() => void saveRoute('0')}
disabled={saving}
>
Force Direct
</Button>
</div>
<Button type="submit" className="w-full" disabled={saving || route.trim().length === 0}>
{saving
? 'Saving...'
: `Force ${route.trim() === '' ? 'custom' : route.trim()} routing`}
</Button>
</div>
{error && (
<div className="rounded-md border border-destructive/30 bg-destructive/10 px-3 py-2 text-sm text-destructive">
{error}
</div>
)}
<DialogFooter className="gap-2 sm:justify-between">
<Button type="button" variant="secondary" onClick={onClose} disabled={saving}>
Cancel
</Button>
<Button
type="button"
variant="outline"
onClick={() => void saveRoute('')}
disabled={saving}
>
Clear override
</Button>
</DialogFooter>
</form>
</DialogContent>
</Dialog>
);
}
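The modal's help text and button wiring imply a small grammar for the override string: blank clears, `0` forces direct, `-1` forces flood, and anything else is comma-separated 1, 2, or 3 byte hex hop IDs. A hedged sketch of that interpretation follows; the real validation happens server-side in api.setContactRoutingOverride, and parseOverride here is an assumption drawn from the UI copy, not a confirmed implementation:

```typescript
// Hypothetical parse of the routing-override input string.
type OverrideParse =
  | { kind: 'clear' }
  | { kind: 'direct' }
  | { kind: 'flood' }
  | { kind: 'path'; hops: string[] };

function parseOverride(input: string): OverrideParse | null {
  const trimmed = input.trim();
  if (trimmed === '') return { kind: 'clear' };
  if (trimmed === '0') return { kind: 'direct' };
  if (trimmed === '-1') return { kind: 'flood' };
  const hops = trimmed.split(',').map((h) => h.trim().toLowerCase());
  // Each hop must be 1, 2, or 3 bytes of hex (2, 4, or 6 hex digits).
  const valid = hops.every((h) => /^[0-9a-f]{2}([0-9a-f]{2}){0,2}$/.test(h));
  return valid ? { kind: 'path', hops } : null; // null = reject the input
}
```

Under this sketch, `"ae,f1"` parses to a two-hop path while `"xyz"` is rejected.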

View File

@@ -1,18 +1,17 @@
import type { ReactNode } from 'react';
import { useState, type ReactNode } from 'react';
import { toast } from './ui/sonner';
import { api } from '../api';
import { formatTime } from '../utils/messageParser';
import {
isValidLocation,
calculateDistance,
formatDistance,
formatRouteLabel,
formatRoutingOverrideInput,
getEffectiveContactRoute,
} from '../utils/pathUtils';
import { getMapFocusHash } from '../utils/urlHash';
import { handleKeyboardActivate } from '../utils/a11y';
import type { Contact } from '../types';
import { ContactRoutingOverrideModal } from './ContactRoutingOverrideModal';
interface ContactStatusInfoProps {
contact: Contact;
@@ -25,28 +24,10 @@ interface ContactStatusInfoProps {
* shared between ChatHeader and RepeaterDashboard.
*/
export function ContactStatusInfo({ contact, ourLat, ourLon }: ContactStatusInfoProps) {
const [routingModalOpen, setRoutingModalOpen] = useState(false);
const parts: ReactNode[] = [];
const effectiveRoute = getEffectiveContactRoute(contact);
const editRoutingOverride = () => {
const route = window.prompt(
'Enter explicit path as comma-separated 1, 2, or 3 byte hops (for example "ae,f1" or "ae92,f13e").\nEnter 0 to force direct always.\nEnter -1 to force flooding always.\nLeave blank to clear the override and reset to flood until a new path is heard.',
formatRoutingOverrideInput(contact)
);
if (route === null) {
return;
}
api.setContactRoutingOverride(contact.public_key, route).then(
() =>
toast.success(
route.trim() === '' ? 'Routing override cleared' : 'Routing override updated'
),
(err: unknown) =>
toast.error(err instanceof Error ? err.message : 'Failed to update routing override')
);
};
if (contact.last_seen) {
parts.push(`Last heard: ${formatTime(contact.last_seen)}`);
}
@@ -54,13 +35,13 @@ export function ContactStatusInfo({ contact, ourLat, ourLon }: ContactStatusInfo
parts.push(
<span
key="path"
className="cursor-pointer hover:text-primary hover:underline"
className="cursor-pointer underline underline-offset-2 decoration-muted-foreground/50 hover:text-primary"
role="button"
tabIndex={0}
onKeyDown={handleKeyboardActivate}
onClick={(e) => {
e.stopPropagation();
editRoutingOverride();
setRoutingModalOpen(true);
}}
title="Click to edit routing override"
>
@@ -101,15 +82,24 @@ export function ContactStatusInfo({ contact, ourLat, ourLon }: ContactStatusInfo
if (parts.length === 0) return null;
return (
<span className="font-normal text-sm text-muted-foreground flex-shrink-0">
(
{parts.map((part, i) => (
<span key={i}>
{i > 0 && ', '}
{part}
</span>
))}
)
</span>
<>
<span className="font-normal text-sm text-muted-foreground flex-shrink-0">
(
{parts.map((part, i) => (
<span key={i}>
{i > 0 && ', '}
{part}
</span>
))}
)
</span>
<ContactRoutingOverrideModal
open={routingModalOpen}
onClose={() => setRoutingModalOpen(false)}
contact={contact}
onSaved={(message) => toast.success(message)}
onError={(message) => toast.error(message)}
/>
</>
);
}

View File

@@ -11,6 +11,7 @@ import type {
Favorite,
HealthStatus,
Message,
PathDiscoveryResponse,
RawPacket,
RadioConfig,
} from '../types';
@@ -46,6 +47,7 @@ interface ConversationPaneProps {
loadingNewer: boolean;
messageInputRef: Ref<MessageInputHandle>;
onTrace: () => Promise<void>;
onPathDiscovery: (publicKey: string) => Promise<PathDiscoveryResponse>;
onToggleFavorite: (type: 'channel' | 'contact', id: string) => Promise<void>;
onDeleteContact: (publicKey: string) => Promise<void>;
onDeleteChannel: (key: string) => Promise<void>;
@@ -109,6 +111,7 @@ export function ConversationPane({
loadingNewer,
messageInputRef,
onTrace,
onPathDiscovery,
onToggleFavorite,
onDeleteContact,
onDeleteChannel,
@@ -205,6 +208,7 @@ export function ConversationPane({
radioLon={config?.lon ?? null}
radioName={config?.name ?? null}
onTrace={onTrace}
onPathDiscovery={onPathDiscovery}
onToggleNotifications={onToggleNotifications}
onToggleFavorite={onToggleFavorite}
onDeleteContact={onDeleteContact}
@@ -225,6 +229,7 @@ export function ConversationPane({
notificationsEnabled={notificationsEnabled}
notificationsPermission={notificationsPermission}
onTrace={onTrace}
onPathDiscovery={onPathDiscovery}
onToggleNotifications={onToggleNotifications}
onToggleFavorite={onToggleFavorite}
onSetChannelFloodScopeOverride={onSetChannelFloodScopeOverride}

View File

@@ -104,7 +104,7 @@ function MapBoundsHandler({
}
export function MapView({ contacts, focusedKey }: MapViewProps) {
const sevenDaysAgo = Date.now() / 1000 - 7 * 24 * 60 * 60;
const [sevenDaysAgo] = useState(() => Date.now() / 1000 - 7 * 24 * 60 * 60);
// Filter to contacts with GPS coordinates, heard within the last 7 days.
// Always include the focused contact so "view on map" links work for older nodes.

View File

@@ -24,6 +24,7 @@ const CHANNEL_WARNING_THRESHOLD = 120; // Conservative for multi-hop
const CHANNEL_DANGER_BUFFER = 8; // Red zone starts this many bytes before hard limit
const textEncoder = new TextEncoder();
const RADIO_NO_RESPONSE_SNIPPET = 'no response was heard back';
/** Get UTF-8 byte length of a string (LoRa packets are byte-constrained, not character-constrained). */
function byteLen(s: string): number {
return textEncoder.encode(s).length;
@@ -118,8 +119,11 @@ export const MessageInput = forwardRef<MessageInputHandle, MessageInputProps>(fu
setText('');
} catch (err) {
console.error('Failed to send message:', err);
toast.error('Failed to send message', {
description: err instanceof Error ? err.message : 'Check radio connection',
const description = err instanceof Error ? err.message : 'Check radio connection';
const isRadioNoResponse =
err instanceof Error && err.message.toLowerCase().includes(RADIO_NO_RESPONSE_SNIPPET);
toast.error(isRadioNoResponse ? 'Radio did not confirm send' : 'Failed to send message', {
description,
});
return;
} finally {
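The error branching added above can be isolated as a small helper: a radio "no response" error gets a softer title, since the message may still have gone out and simply not been acknowledged. toastTitleFor is a hypothetical name for illustration; the snippet constant mirrors RADIO_NO_RESPONSE_SNIPPET from the diff:

```typescript
// Substring-based classification of send failures (sketch).
const RADIO_NO_RESPONSE_SNIPPET = 'no response was heard back';

function toastTitleFor(err: unknown): string {
  const isRadioNoResponse =
    err instanceof Error &&
    err.message.toLowerCase().includes(RADIO_NO_RESPONSE_SNIPPET);
  // Unconfirmed send gets a distinct, less alarming title.
  return isRadioNoResponse ? 'Radio did not confirm send' : 'Failed to send message';
}

console.log(toastTitleFor(new Error('Sent, but no response was heard back')));
// -> Radio did not confirm send
```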

View File

@@ -204,11 +204,9 @@ export function MessageList({
const resendTimersRef = useRef<Map<number, ReturnType<typeof setTimeout>>>(new Map());
const [highlightedMessageId, setHighlightedMessageId] = useState<number | null>(null);
const [showJumpToUnread, setShowJumpToUnread] = useState(false);
const [jumpToUnreadDismissed, setJumpToUnreadDismissed] = useState(false);
const targetScrolledRef = useRef(false);
const unreadMarkerRef = useRef<HTMLButtonElement | HTMLDivElement | null>(null);
const setUnreadMarkerElement = useCallback((node: HTMLButtonElement | HTMLDivElement | null) => {
unreadMarkerRef.current = node;
}, []);
// Capture scroll state in the scroll handler BEFORE any state updates
const scrollStateRef = useRef({
@@ -330,68 +328,6 @@ export function MessageList({
};
}, [messages, onResendChannelMessage]);
// Refs for scroll handler to read without causing callback recreation
const onLoadOlderRef = useRef(onLoadOlder);
const loadingOlderRef = useRef(loadingOlder);
const hasOlderMessagesRef = useRef(hasOlderMessages);
const onLoadNewerRef = useRef(onLoadNewer);
const loadingNewerRef = useRef(loadingNewer);
const hasNewerMessagesRef = useRef(hasNewerMessages);
onLoadOlderRef.current = onLoadOlder;
loadingOlderRef.current = loadingOlder;
hasOlderMessagesRef.current = hasOlderMessages;
onLoadNewerRef.current = onLoadNewer;
loadingNewerRef.current = loadingNewer;
hasNewerMessagesRef.current = hasNewerMessages;
// Handle scroll - capture state and detect when user is near top/bottom
// Stable callback: reads changing values from refs, never recreated.
const handleScroll = useCallback(() => {
if (!listRef.current) return;
const { scrollTop, scrollHeight, clientHeight } = listRef.current;
const distanceFromBottom = scrollHeight - scrollTop - clientHeight;
// Always capture current scroll state (needed for scroll preservation)
scrollStateRef.current = {
scrollTop,
scrollHeight,
clientHeight,
wasNearTop: scrollTop < 150,
wasNearBottom: distanceFromBottom < 100,
};
// Show scroll-to-bottom button when not near the bottom (more than 100px away)
setShowScrollToBottom(distanceFromBottom > 100);
if (!onLoadOlderRef.current || loadingOlderRef.current || !hasOlderMessagesRef.current) {
// skip older load
} else if (scrollTop < 100) {
onLoadOlderRef.current();
}
// Trigger load newer when within 100px of bottom
if (
onLoadNewerRef.current &&
!loadingNewerRef.current &&
hasNewerMessagesRef.current &&
distanceFromBottom < 100
) {
onLoadNewerRef.current();
}
}, []);
// Scroll to bottom handler (or jump to bottom if viewing historical messages)
const scrollToBottom = useCallback(() => {
if (hasNewerMessages && onJumpToBottom) {
onJumpToBottom();
return;
}
if (listRef.current) {
listRef.current.scrollTop = listRef.current.scrollHeight;
}
}, [hasNewerMessages, onJumpToBottom]);
// Sort messages by received_at ascending (oldest first)
// Note: Deduplication is handled by useConversationMessages.addMessageIfNew()
// and the database UNIQUE constraint on (type, conversation_key, text, sender_timestamp)
@@ -408,10 +344,117 @@ export function MessageList({
return sortedMessages.findIndex((msg) => !msg.outgoing && msg.received_at > boundary);
}, [sortedMessages, unreadMarkerLastReadAt]);
const syncJumpToUnreadVisibility = useCallback(() => {
if (unreadMarkerIndex === -1 || jumpToUnreadDismissed) {
setShowJumpToUnread(false);
return;
}
const marker = unreadMarkerRef.current;
const list = listRef.current;
if (!marker || !list) {
setShowJumpToUnread(true);
return;
}
const markerRect = marker.getBoundingClientRect();
const listRect = list.getBoundingClientRect();
if (
markerRect.width === 0 ||
markerRect.height === 0 ||
listRect.width === 0 ||
listRect.height === 0
) {
setShowJumpToUnread(true);
return;
}
const markerVisible =
markerRect.top >= listRect.top &&
markerRect.bottom <= listRect.bottom &&
markerRect.left >= listRect.left &&
markerRect.right <= listRect.right;
setShowJumpToUnread(!markerVisible);
}, [jumpToUnreadDismissed, unreadMarkerIndex]);
// Refs for scroll handler to read without causing callback recreation
const onLoadOlderRef = useRef(onLoadOlder);
const loadingOlderRef = useRef(loadingOlder);
const hasOlderMessagesRef = useRef(hasOlderMessages);
const onLoadNewerRef = useRef(onLoadNewer);
const loadingNewerRef = useRef(loadingNewer);
const hasNewerMessagesRef = useRef(hasNewerMessages);
onLoadOlderRef.current = onLoadOlder;
loadingOlderRef.current = loadingOlder;
hasOlderMessagesRef.current = hasOlderMessages;
onLoadNewerRef.current = onLoadNewer;
loadingNewerRef.current = loadingNewer;
hasNewerMessagesRef.current = hasNewerMessages;
const setUnreadMarkerElement = useCallback(
(node: HTMLButtonElement | HTMLDivElement | null) => {
unreadMarkerRef.current = node;
syncJumpToUnreadVisibility();
},
[syncJumpToUnreadVisibility]
);
useEffect(() => {
setShowJumpToUnread(unreadMarkerIndex !== -1);
setJumpToUnreadDismissed(false);
}, [unreadMarkerIndex]);
useLayoutEffect(() => {
syncJumpToUnreadVisibility();
}, [messages, syncJumpToUnreadVisibility]);
// Handle scroll - capture state and detect when user is near top/bottom
// Stable callback: reads changing values from refs, never recreated.
const handleScroll = useCallback(() => {
if (!listRef.current) return;
const { scrollTop, scrollHeight, clientHeight } = listRef.current;
const distanceFromBottom = scrollHeight - scrollTop - clientHeight;
scrollStateRef.current = {
scrollTop,
scrollHeight,
clientHeight,
wasNearTop: scrollTop < 150,
wasNearBottom: distanceFromBottom < 100,
};
setShowScrollToBottom(distanceFromBottom > 100);
if (!onLoadOlderRef.current || loadingOlderRef.current || !hasOlderMessagesRef.current) {
// skip older load
} else if (scrollTop < 100) {
onLoadOlderRef.current();
}
if (
onLoadNewerRef.current &&
!loadingNewerRef.current &&
hasNewerMessagesRef.current &&
distanceFromBottom < 100
) {
onLoadNewerRef.current();
}
syncJumpToUnreadVisibility();
}, [syncJumpToUnreadVisibility]);
// Scroll to bottom handler (or jump to bottom if viewing historical messages)
const scrollToBottom = useCallback(() => {
if (hasNewerMessages && onJumpToBottom) {
onJumpToBottom();
return;
}
if (listRef.current) {
listRef.current.scrollTop = listRef.current.scrollHeight;
}
}, [hasNewerMessages, onJumpToBottom]);
// Sender info for outgoing messages (used by path modal on own messages)
const selfSenderInfo = useMemo<SenderInfo>(
() => ({
@@ -837,16 +880,31 @@ export function MessageList({
{/* Scroll to bottom button */}
{showJumpToUnread && (
<div className="pointer-events-none absolute bottom-4 left-1/2 -translate-x-1/2">
<button
type="button"
onClick={() => {
unreadMarkerRef.current?.scrollIntoView?.({ block: 'center' });
setShowJumpToUnread(false);
}}
className="pointer-events-auto h-9 rounded-full bg-card hover:bg-accent border border-border px-3 text-sm font-medium shadow-lg transition-all hover:scale-105 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
>
Jump to unread
</button>
<div className="pointer-events-auto flex h-9 items-center overflow-hidden rounded-full border border-border bg-card shadow-lg transition-all hover:scale-105">
<button
type="button"
onClick={() => {
unreadMarkerRef.current?.scrollIntoView?.({ block: 'center' });
setJumpToUnreadDismissed(true);
setShowJumpToUnread(false);
}}
className="h-full px-3 text-sm font-medium hover:bg-accent focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
>
Jump to unread
</button>
<button
type="button"
onClick={() => {
setJumpToUnreadDismissed(true);
setShowJumpToUnread(false);
}}
className="flex h-full w-9 items-center justify-center border-l border-border text-muted-foreground hover:bg-accent hover:text-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
aria-label="Dismiss jump to unread"
title="Dismiss jump to unread"
>
×
</button>
</div>
</div>
)}
{showScrollToBottom && (

View File

@@ -15,6 +15,7 @@ import { Input } from './ui/input';
import { Label } from './ui/label';
import { Checkbox } from './ui/checkbox';
import { Button } from './ui/button';
import { toast } from './ui/sonner';
type Tab = 'existing' | 'new-contact' | 'new-room' | 'hashtag';
@@ -90,6 +91,9 @@ export function NewMessageModal({
resetForm();
onClose();
} catch (err) {
toast.error('Failed to create conversation', {
description: err instanceof Error ? err.message : undefined,
});
setError(err instanceof Error ? err.message : 'Failed to create');
} finally {
setLoading(false);
@@ -123,6 +127,9 @@ export function NewMessageModal({
setName('');
hashtagInputRef.current?.focus();
} catch (err) {
toast.error('Failed to create conversation', {
description: err instanceof Error ? err.message : undefined,
});
setError(err instanceof Error ? err.message : 'Failed to create');
} finally {
setLoading(false);

View File

@@ -29,6 +29,9 @@ export function PacketVisualizer3D({
const [showAmbiguousPaths, setShowAmbiguousPaths] = useState(savedSettings.showAmbiguousPaths);
const [showAmbiguousNodes, setShowAmbiguousNodes] = useState(savedSettings.showAmbiguousNodes);
const [useAdvertPathHints, setUseAdvertPathHints] = useState(savedSettings.useAdvertPathHints);
const [collapseLikelyKnownSiblingRepeaters, setCollapseLikelyKnownSiblingRepeaters] = useState(
savedSettings.collapseLikelyKnownSiblingRepeaters
);
const [splitAmbiguousByTraffic, setSplitAmbiguousByTraffic] = useState(
savedSettings.splitAmbiguousByTraffic
);
@@ -52,6 +55,7 @@ export function PacketVisualizer3D({
showAmbiguousPaths,
showAmbiguousNodes,
useAdvertPathHints,
collapseLikelyKnownSiblingRepeaters,
splitAmbiguousByTraffic,
chargeStrength,
observationWindowSec,
@@ -66,6 +70,7 @@ export function PacketVisualizer3D({
showAmbiguousPaths,
showAmbiguousNodes,
useAdvertPathHints,
collapseLikelyKnownSiblingRepeaters,
splitAmbiguousByTraffic,
chargeStrength,
observationWindowSec,
@@ -108,6 +113,7 @@ export function PacketVisualizer3D({
showAmbiguousPaths,
showAmbiguousNodes,
useAdvertPathHints,
collapseLikelyKnownSiblingRepeaters,
splitAmbiguousByTraffic,
chargeStrength,
letEmDrift,
@@ -117,7 +123,7 @@ export function PacketVisualizer3D({
pruneStaleMinutes,
});
const { hoveredNodeId, hoveredNeighborIds, pinnedNodeId } = useVisualizer3DScene({
const { hoveredNodeId, pinnedNodeId } = useVisualizer3DScene({
containerRef,
data,
autoOrbit,
@@ -143,6 +149,8 @@ export function PacketVisualizer3D({
setShowAmbiguousNodes={setShowAmbiguousNodes}
useAdvertPathHints={useAdvertPathHints}
setUseAdvertPathHints={setUseAdvertPathHints}
collapseLikelyKnownSiblingRepeaters={collapseLikelyKnownSiblingRepeaters}
setCollapseLikelyKnownSiblingRepeaters={setCollapseLikelyKnownSiblingRepeaters}
splitAmbiguousByTraffic={splitAmbiguousByTraffic}
setSplitAmbiguousByTraffic={setSplitAmbiguousByTraffic}
observationWindowSec={observationWindowSec}
@@ -167,8 +175,9 @@ export function PacketVisualizer3D({
<VisualizerTooltip
activeNodeId={tooltipNodeId}
nodes={data.nodes}
neighborIds={hoveredNeighborIds}
canonicalNodes={data.canonicalNodes}
canonicalNeighborIds={data.canonicalNeighborIds}
renderedNodeIds={data.renderedNodeIds}
/>
</div>
);

View File

@@ -1,21 +1,26 @@
import { useState } from 'react';
import { toast } from './ui/sonner';
import { Button } from './ui/button';
import { Bell, Star, Trash2 } from 'lucide-react';
import { Bell, Route, Star, Trash2 } from 'lucide-react';
import { DirectTraceIcon } from './DirectTraceIcon';
import { RepeaterLogin } from './RepeaterLogin';
import { useRepeaterDashboard } from '../hooks/useRepeaterDashboard';
import { isFavorite } from '../utils/favorites';
import { handleKeyboardActivate } from '../utils/a11y';
import { isValidLocation } from '../utils/pathUtils';
import { ContactStatusInfo } from './ContactStatusInfo';
import type { Contact, Conversation, Favorite } from '../types';
import type { Contact, Conversation, Favorite, PathDiscoveryResponse } from '../types';
import { TelemetryPane } from './repeater/RepeaterTelemetryPane';
import { NeighborsPane } from './repeater/RepeaterNeighborsPane';
import { AclPane } from './repeater/RepeaterAclPane';
import { NodeInfoPane } from './repeater/RepeaterNodeInfoPane';
import { RadioSettingsPane } from './repeater/RepeaterRadioSettingsPane';
import { LppTelemetryPane } from './repeater/RepeaterLppTelemetryPane';
import { OwnerInfoPane } from './repeater/RepeaterOwnerInfoPane';
import { ActionsPane } from './repeater/RepeaterActionsPane';
import { ConsolePane } from './repeater/RepeaterConsolePane';
import { ContactPathDiscoveryModal } from './ContactPathDiscoveryModal';
// Re-export for backwards compatibility (used by repeaterFormatters.test.ts)
export { formatDuration, formatClockDrift } from './repeater/repeaterPaneShared';
@@ -33,6 +38,7 @@ interface RepeaterDashboardProps {
radioLon: number | null;
radioName: string | null;
onTrace: () => void;
onPathDiscovery: (publicKey: string) => Promise<PathDiscoveryResponse>;
onToggleNotifications: () => void;
onToggleFavorite: (type: 'channel' | 'contact', id: string) => void;
onDeleteContact: (publicKey: string) => void;
@@ -49,10 +55,14 @@ export function RepeaterDashboard({
radioLon,
radioName,
onTrace,
onPathDiscovery,
onToggleNotifications,
onToggleFavorite,
onDeleteContact,
}: RepeaterDashboardProps) {
const [pathDiscoveryOpen, setPathDiscoveryOpen] = useState(false);
const contact = contacts.find((c) => c.public_key === conversation.id) ?? null;
const hasAdvertLocation = isValidLocation(contact?.lat ?? null, contact?.lon ?? null);
const {
loggedIn,
loginLoading,
@@ -70,9 +80,8 @@ export function RepeaterDashboard({
sendFloodAdvert,
rebootRepeater,
syncClock,
} = useRepeaterDashboard(conversation);
} = useRepeaterDashboard(conversation, { hasAdvertLocation });
const contact = contacts.find((c) => c.public_key === conversation.id);
const isFav = isFavorite(favorites, 'contact', conversation.id);
// Loading all panes indicator
@@ -121,6 +130,16 @@ export function RepeaterDashboard({
{anyLoading ? 'Loading...' : 'Load All'}
</Button>
)}
{contact && (
<button
className="p-1 rounded hover:bg-accent text-lg leading-none transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
onClick={() => setPathDiscoveryOpen(true)}
title="Path Discovery. Send a routed probe and inspect the forward and return paths"
aria-label="Path Discovery"
>
<Route className="h-4 w-4 text-muted-foreground" aria-hidden="true" />
</button>
)}
<button
className="p-1 rounded hover:bg-accent text-lg leading-none transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
onClick={onTrace}
@@ -183,6 +202,16 @@ export function RepeaterDashboard({
<Trash2 className="h-4 w-4" aria-hidden="true" />
</button>
</div>
{contact && (
<ContactPathDiscoveryModal
open={pathDiscoveryOpen}
onClose={() => setPathDiscoveryOpen(false)}
contact={contact}
contacts={contacts}
radioName={radioName}
onDiscover={onPathDiscovery}
/>
)}
</header>
{/* Body */}
@@ -197,9 +226,15 @@ export function RepeaterDashboard({
/>
) : (
<div className="space-y-4">
{/* Top row: Telemetry + Radio Settings | Neighbors (with expanding map) */}
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
{/* Top row: Telemetry + Radio Settings | Node Info + Neighbors */}
<div className="grid grid-cols-1 gap-4 md:grid-cols-2 md:items-stretch">
<div className="flex flex-col gap-4">
<NodeInfoPane
data={paneData.nodeInfo}
state={paneStates.nodeInfo}
onRefresh={() => refreshPane('nodeInfo')}
disabled={anyLoading}
/>
<TelemetryPane
data={paneData.status}
state={paneStates.status}
@@ -222,16 +257,19 @@ export function RepeaterDashboard({
disabled={anyLoading}
/>
</div>
<NeighborsPane
data={paneData.neighbors}
state={paneStates.neighbors}
onRefresh={() => refreshPane('neighbors')}
disabled={anyLoading}
contacts={contacts}
radioLat={radioLat}
radioLon={radioLon}
radioName={radioName}
/>
<div className="flex min-h-0 flex-col gap-4">
<NeighborsPane
data={paneData.neighbors}
state={paneStates.neighbors}
onRefresh={() => refreshPane('neighbors')}
disabled={anyLoading}
repeaterContact={contact}
contacts={contacts}
nodeInfo={paneData.nodeInfo}
nodeInfoState={paneStates.nodeInfo}
repeaterName={conversation.name}
/>
</div>
</div>
{/* Remaining panes: ACL | Owner Info + Actions */}

View File

@@ -31,6 +31,7 @@ export interface SearchNavigateTarget {
export interface SearchViewProps {
contacts: Contact[];
channels: Channel[];
visibilityVersion?: number;
onNavigateToMessage: (target: SearchNavigateTarget) => void;
prefillRequest?: {
query: string;
@@ -84,6 +85,7 @@ function getHighlightQuery(query: string): string {
export function SearchView({
contacts,
channels,
visibilityVersion = 0,
onNavigateToMessage,
prefillRequest = null,
}: SearchViewProps) {
@@ -110,7 +112,7 @@ export function SearchView({
setResults([]);
setOffset(0);
setHasMore(false);
}, [debouncedQuery]);
}, [debouncedQuery, visibilityVersion]);
useEffect(() => {
if (!prefillRequest) {
@@ -159,7 +161,7 @@ export function SearchView({
});
return () => controller.abort();
}, [debouncedQuery]);
}, [debouncedQuery, visibilityVersion]);
const loadMore = useCallback(() => {
if (!debouncedQuery || loading) return;

View File

@@ -5,6 +5,8 @@ import type {
HealthStatus,
RadioConfig,
RadioConfigUpdate,
RadioDiscoveryResponse,
RadioDiscoveryTarget,
} from '../types';
import type { LocalLabel } from '../utils/localLabel';
import {
@@ -34,6 +36,9 @@ interface SettingsModalBaseProps {
onDisconnect: () => Promise<void>;
onReconnect: () => Promise<void>;
onAdvertise: () => Promise<void>;
meshDiscovery: RadioDiscoveryResponse | null;
meshDiscoveryLoadingTarget: RadioDiscoveryTarget | null;
onDiscoverMesh: (target: RadioDiscoveryTarget) => Promise<void>;
onHealthRefresh: () => Promise<void>;
onRefreshAppSettings: () => Promise<void>;
onLocalLabelChange?: (label: LocalLabel) => void;
@@ -64,6 +69,9 @@ export function SettingsModal(props: SettingsModalProps) {
onDisconnect,
onReconnect,
onAdvertise,
meshDiscovery,
meshDiscoveryLoadingTarget,
onDiscoverMesh,
onHealthRefresh,
onRefreshAppSettings,
onLocalLabelChange,
@@ -189,6 +197,9 @@ export function SettingsModal(props: SettingsModalProps) {
onDisconnect={onDisconnect}
onReconnect={onReconnect}
onAdvertise={onAdvertise}
meshDiscovery={meshDiscovery}
meshDiscoveryLoadingTarget={meshDiscoveryLoadingTarget}
onDiscoverMesh={onDiscoverMesh}
onClose={onClose}
className={sectionContentClass}
/>

View File

@@ -19,7 +19,18 @@ import {
type Conversation,
type Favorite,
} from '../types';
import { getStateKey, type ConversationTimes, type SortOrder } from '../utils/conversationState';
import {
buildSidebarSectionSortOrders,
getStateKey,
loadLegacyLocalStorageSortOrder,
loadLocalStorageSidebarSectionSortOrders,
saveLocalStorageSidebarSectionSortOrders,
type ConversationTimes,
type SidebarSectionSortOrders,
type SidebarSortableSection,
type SortOrder,
} from '../utils/conversationState';
import { isPublicChannelKey } from '../utils/publicChannel';
import { getContactDisplayName } from '../utils/pubkey';
import { handleKeyboardActivate } from '../utils/a11y';
import { ContactAvatar } from './ContactAvatar';
@@ -91,13 +102,36 @@ interface SidebarProps {
onToggleCracker: () => void;
onMarkAllRead: () => void;
favorites: Favorite[];
/** Sort order from server settings */
sortOrder?: SortOrder;
/** Callback when sort order changes */
onSortOrderChange?: (order: SortOrder) => void;
/** Legacy global sort order, used only to seed per-section local preferences. */
legacySortOrder?: SortOrder;
isConversationNotificationsEnabled?: (type: 'channel' | 'contact', id: string) => boolean;
}
type InitialSectionSortState = {
orders: SidebarSectionSortOrders;
source: 'section' | 'legacy' | 'none';
};
function loadInitialSectionSortOrders(): InitialSectionSortState {
const storedOrders = loadLocalStorageSidebarSectionSortOrders();
if (storedOrders) {
return { orders: storedOrders, source: 'section' };
}
const legacyOrder = loadLegacyLocalStorageSortOrder();
if (legacyOrder) {
return {
orders: buildSidebarSectionSortOrders(legacyOrder),
source: 'legacy',
};
}
return {
orders: buildSidebarSectionSortOrders(),
source: 'none',
};
}
export function Sidebar({
contacts,
channels,
@@ -112,12 +146,12 @@ export function Sidebar({
onToggleCracker,
onMarkAllRead,
favorites,
sortOrder: sortOrderProp = 'recent',
onSortOrderChange,
legacySortOrder,
isConversationNotificationsEnabled,
}: SidebarProps) {
const sortOrder = sortOrderProp;
const [searchQuery, setSearchQuery] = useState('');
const initialSectionSortState = useMemo(loadInitialSectionSortOrders, []);
const [sectionSortOrders, setSectionSortOrders] = useState(initialSectionSortState.orders);
const initialCollapsedState = useMemo(loadCollapsedState, []);
const [toolsCollapsed, setToolsCollapsed] = useState(initialCollapsedState.tools);
const [favoritesCollapsed, setFavoritesCollapsed] = useState(initialCollapsedState.favorites);
@@ -125,10 +159,31 @@ export function Sidebar({
const [contactsCollapsed, setContactsCollapsed] = useState(initialCollapsedState.contacts);
const [repeatersCollapsed, setRepeatersCollapsed] = useState(initialCollapsedState.repeaters);
const collapseSnapshotRef = useRef<CollapseState | null>(null);
const sectionSortSourceRef = useRef(initialSectionSortState.source);
const handleSortToggle = () => {
const newOrder = sortOrder === 'alpha' ? 'recent' : 'alpha';
onSortOrderChange?.(newOrder);
useEffect(() => {
if (sectionSortSourceRef.current === 'legacy') {
saveLocalStorageSidebarSectionSortOrders(sectionSortOrders);
sectionSortSourceRef.current = 'section';
return;
}
if (sectionSortSourceRef.current !== 'none' || legacySortOrder === undefined) return;
const seededOrders = buildSidebarSectionSortOrders(legacySortOrder);
setSectionSortOrders(seededOrders);
saveLocalStorageSidebarSectionSortOrders(seededOrders);
sectionSortSourceRef.current = 'section';
}, [legacySortOrder, sectionSortOrders]);
const handleSortToggle = (section: SidebarSortableSection) => {
setSectionSortOrders((prev) => {
const nextOrder = prev[section] === 'alpha' ? 'recent' : 'alpha';
const updated = { ...prev, [section]: nextOrder };
saveLocalStorageSidebarSectionSortOrders(updated);
sectionSortSourceRef.current = 'section';
return updated;
});
};
const handleSelectConversation = (conversation: Conversation) => {
@@ -200,10 +255,10 @@ export function Sidebar({
() =>
[...uniqueChannels].sort((a, b) => {
// Public channel always sorts to the top
if (a.name === 'Public') return -1;
if (b.name === 'Public') return 1;
if (isPublicChannelKey(a.key)) return -1;
if (isPublicChannelKey(b.key)) return 1;
if (sortOrder === 'recent') {
if (sectionSortOrders.channels === 'recent') {
const timeA = getLastMessageTime('channel', a.key);
const timeB = getLastMessageTime('channel', b.key);
if (timeA && timeB) return timeB - timeA;
@@ -212,13 +267,13 @@ export function Sidebar({
}
return a.name.localeCompare(b.name);
}),
[uniqueChannels, sortOrder, getLastMessageTime]
[uniqueChannels, sectionSortOrders.channels, getLastMessageTime]
);
const sortContactsByOrder = useCallback(
(items: Contact[]) =>
(items: Contact[], order: SortOrder) =>
[...items].sort((a, b) => {
if (sortOrder === 'recent') {
if (order === 'recent') {
const timeA = getLastMessageTime('contact', a.public_key);
const timeB = getLastMessageTime('contact', b.public_key);
if (timeA && timeB) return timeB - timeA;
@@ -227,18 +282,26 @@ export function Sidebar({
}
return (a.name || a.public_key).localeCompare(b.name || b.public_key);
}),
[sortOrder, getLastMessageTime]
[getLastMessageTime]
);
// Split non-repeater contacts and repeater contacts into separate sorted lists
const sortedNonRepeaterContacts = useMemo(
() => sortContactsByOrder(uniqueContacts.filter((c) => c.type !== CONTACT_TYPE_REPEATER)),
[uniqueContacts, sortContactsByOrder]
() =>
sortContactsByOrder(
uniqueContacts.filter((c) => c.type !== CONTACT_TYPE_REPEATER),
sectionSortOrders.contacts
),
[uniqueContacts, sectionSortOrders.contacts, sortContactsByOrder]
);
const sortedRepeaters = useMemo(
() => sortContactsByOrder(uniqueContacts.filter((c) => c.type === CONTACT_TYPE_REPEATER)),
[uniqueContacts, sortContactsByOrder]
() =>
sortContactsByOrder(
uniqueContacts.filter((c) => c.type === CONTACT_TYPE_REPEATER),
sectionSortOrders.repeaters
),
[uniqueContacts, sectionSortOrders.repeaters, sortContactsByOrder]
);
// Filter by search query
@@ -604,11 +667,12 @@ export function Sidebar({
title: string,
collapsed: boolean,
onToggle: () => void,
showSortToggle = false,
sortSection: SidebarSortableSection | null = null,
unreadCount = 0,
highlightUnread = false
) => {
const effectiveCollapsed = isSearching ? false : collapsed;
const sectionSortOrder = sortSection ? sectionSortOrders[sortSection] : null;
return (
<div className="flex justify-between items-center px-3 py-2 pt-3.5">
@@ -630,16 +694,24 @@ export function Sidebar({
)}
<span>{title}</span>
</button>
{(showSortToggle || unreadCount > 0) && (
{(sortSection || unreadCount > 0) && (
<div className="ml-auto flex items-center gap-1.5">
{showSortToggle && (
{sortSection && sectionSortOrder && (
<button
className="bg-transparent text-muted-foreground/60 px-1 py-0.5 text-[10px] rounded hover:text-foreground transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
onClick={handleSortToggle}
aria-label={sortOrder === 'alpha' ? 'Sort by recent' : 'Sort alphabetically'}
title={sortOrder === 'alpha' ? 'Sort by recent' : 'Sort alphabetically'}
onClick={() => handleSortToggle(sortSection)}
aria-label={
sectionSortOrder === 'alpha'
? `Sort ${title} by recent`
: `Sort ${title} alphabetically`
}
title={
sectionSortOrder === 'alpha'
? `Sort ${title} by recent`
: `Sort ${title} alphabetically`
}
>
{sortOrder === 'alpha' ? 'A-Z' : '⏱'}
{sectionSortOrder === 'alpha' ? 'A-Z' : '⏱'}
</button>
)}
{unreadCount > 0 && (
@@ -731,7 +803,7 @@ export function Sidebar({
'Favorites',
favoritesCollapsed,
() => setFavoritesCollapsed((prev) => !prev),
false,
null,
favoritesUnreadCount,
favoritesHasMention
)}
@@ -747,7 +819,7 @@ export function Sidebar({
'Channels',
channelsCollapsed,
() => setChannelsCollapsed((prev) => !prev),
true,
'channels',
channelsUnreadCount,
channelsHasMention
)}
@@ -763,7 +835,7 @@ export function Sidebar({
'Contacts',
contactsCollapsed,
() => setContactsCollapsed((prev) => !prev),
true,
'contacts',
contactsUnreadCount,
contactsUnreadCount > 0
)}
@@ -779,7 +851,7 @@ export function Sidebar({
'Repeaters',
repeatersCollapsed,
() => setRepeatersCollapsed((prev) => !prev),
true,
'repeaters',
repeatersUnreadCount
)}
{(isSearching || !repeatersCollapsed) &&

View File

@@ -2,7 +2,13 @@ import { useMemo, lazy, Suspense } from 'react';
import { cn } from '@/lib/utils';
import { RepeaterPane, NotFetched, formatDuration } from './repeaterPaneShared';
import { isValidLocation, calculateDistance, formatDistance } from '../../utils/pathUtils';
import type { Contact, RepeaterNeighborsResponse, PaneState, NeighborInfo } from '../../types';
import type {
Contact,
RepeaterNeighborsResponse,
PaneState,
NeighborInfo,
RepeaterNodeInfoResponse,
} from '../../types';
const NeighborsMiniMap = lazy(() =>
import('../NeighborsMiniMap').then((m) => ({ default: m.NeighborsMiniMap }))
@@ -13,20 +19,56 @@ export function NeighborsPane({
state,
onRefresh,
disabled,
repeaterContact,
contacts,
radioLat,
radioLon,
radioName,
nodeInfo,
nodeInfoState,
repeaterName,
}: {
data: RepeaterNeighborsResponse | null;
state: PaneState;
onRefresh: () => void;
disabled?: boolean;
repeaterContact: Contact | null;
contacts: Contact[];
radioLat: number | null;
radioLon: number | null;
radioName: string | null;
nodeInfo: RepeaterNodeInfoResponse | null;
nodeInfoState: PaneState;
repeaterName: string | null;
}) {
const advertLat = repeaterContact?.lat ?? null;
const advertLon = repeaterContact?.lon ?? null;
const radioLat = useMemo(() => {
const parsed = nodeInfo?.lat != null ? parseFloat(nodeInfo.lat) : null;
return Number.isFinite(parsed) ? parsed : null;
}, [nodeInfo?.lat]);
const radioLon = useMemo(() => {
const parsed = nodeInfo?.lon != null ? parseFloat(nodeInfo.lon) : null;
return Number.isFinite(parsed) ? parsed : null;
}, [nodeInfo?.lon]);
const positionSource = useMemo(() => {
if (isValidLocation(radioLat, radioLon)) {
return { lat: radioLat, lon: radioLon, source: 'reported' as const };
}
if (isValidLocation(advertLat, advertLon)) {
return { lat: advertLat, lon: advertLon, source: 'advert' as const };
}
return { lat: null, lon: null, source: null };
}, [advertLat, advertLon, radioLat, radioLon]);
const radioName = nodeInfo?.name || repeaterContact?.name || repeaterName;
const hasValidRepeaterGps = positionSource.source !== null;
const headerNote =
positionSource.source === 'reported'
? 'Using repeater-reported position'
: positionSource.source === 'advert'
? 'Using advert position'
: nodeInfoState.loading
? 'Waiting for repeater position'
: 'No repeater position available';
// Resolve contact data for each neighbor in a single pass — used for
// coords (mini-map), distances (table column), and sorted display order.
const { neighborsWithCoords, sorted, hasDistances } = useMemo(() => {
@@ -48,8 +90,8 @@ export function NeighborsPane({
const nLon = contact?.lon ?? null;
let dist: string | null = null;
if (isValidLocation(radioLat, radioLon) && isValidLocation(nLat, nLon)) {
const distKm = calculateDistance(radioLat, radioLon, nLat, nLon);
if (hasValidRepeaterGps && isValidLocation(nLat, nLon)) {
const distKm = calculateDistance(positionSource.lat, positionSource.lon, nLat, nLon);
if (distKm != null) {
dist = formatDistance(distKm);
anyDist = true;
@@ -69,24 +111,25 @@ export function NeighborsPane({
sorted: enriched,
hasDistances: anyDist,
};
}, [data, contacts, radioLat, radioLon]);
}, [contacts, data, hasValidRepeaterGps, positionSource.lat, positionSource.lon]);
return (
<RepeaterPane
title="Neighbors"
headerNote={headerNote}
state={state}
onRefresh={onRefresh}
disabled={disabled}
className="flex flex-col"
contentClassName="flex-1 flex flex-col"
className="flex min-h-0 flex-1 flex-col"
contentClassName="flex min-h-0 flex-1 flex-col"
>
{!data ? (
<NotFetched />
) : sorted.length === 0 ? (
<p className="text-sm text-muted-foreground">No neighbors reported</p>
) : (
<div className="flex-1 flex flex-col gap-2">
<div className="overflow-x-auto">
<div className="flex min-h-0 flex-1 flex-col gap-2">
<div className="shrink-0 overflow-x-auto">
<table className="w-full text-sm">
<thead>
<tr className="text-left text-muted-foreground text-xs">
@@ -120,10 +163,10 @@ export function NeighborsPane({
</tbody>
</table>
</div>
{(neighborsWithCoords.length > 0 || isValidLocation(radioLat, radioLon)) && (
{hasValidRepeaterGps ? (
<Suspense
fallback={
<div className="h-48 flex items-center justify-center text-xs text-muted-foreground">
<div className="flex min-h-48 flex-1 items-center justify-center text-xs text-muted-foreground">
Loading map...
</div>
}
@@ -131,11 +174,16 @@ export function NeighborsPane({
<NeighborsMiniMap
key={neighborsWithCoords.map((n) => n.pubkey_prefix).join(',')}
neighbors={neighborsWithCoords}
radioLat={radioLat}
radioLon={radioLon}
radioLat={positionSource.lat}
radioLon={positionSource.lon}
radioName={radioName}
/>
</Suspense>
) : (
<div className="rounded border border-border/70 bg-muted/20 px-3 py-2 text-xs text-muted-foreground">
Map and distance data are unavailable until this repeater has a valid position from
either its advert or a Node Info fetch.
</div>
)}
</div>
)}

View File

@@ -0,0 +1,55 @@
import { useMemo } from 'react';
import { cn } from '@/lib/utils';
import { RepeaterPane, NotFetched, KvRow, formatClockDrift } from './repeaterPaneShared';
import type { RepeaterNodeInfoResponse, PaneState } from '../../types';
export function NodeInfoPane({
data,
state,
onRefresh,
disabled,
}: {
data: RepeaterNodeInfoResponse | null;
state: PaneState;
onRefresh: () => void;
disabled?: boolean;
}) {
const clockDrift = useMemo(() => {
if (!data?.clock_utc) return null;
return formatClockDrift(data.clock_utc, state.fetched_at ?? undefined);
}, [data?.clock_utc, state.fetched_at]);
return (
<RepeaterPane title="Node Info" state={state} onRefresh={onRefresh} disabled={disabled}>
{!data ? (
<NotFetched />
) : (
<div>
<KvRow label="Name" value={data.name ?? '—'} />
<KvRow
label="Lat / Lon"
value={
data.lat != null || data.lon != null ? `${data.lat ?? '—'}, ${data.lon ?? '—'}` : '—'
}
/>
<div className="flex justify-between text-sm py-0.5">
<span className="text-muted-foreground">Clock (UTC)</span>
<span>
{data.clock_utc ?? '—'}
{clockDrift && (
<span
className={cn(
'ml-2 text-xs',
clockDrift.isLarge ? 'text-destructive' : 'text-muted-foreground'
)}
>
(drift: {clockDrift.text})
</span>
)}
</span>
</div>
</div>
)}
</RepeaterPane>
);
}

View File

@@ -1,4 +1,3 @@
import { useMemo } from 'react';
import { cn } from '@/lib/utils';
import { Separator } from '../ui/separator';
import {
@@ -6,7 +5,6 @@ import {
RefreshIcon,
NotFetched,
KvRow,
formatClockDrift,
formatAdvertInterval,
} from './repeaterPaneShared';
import type {
@@ -15,6 +13,35 @@ import type {
PaneState,
} from '../../types';
function formatRadioTuple(radio: string | null): { display: string; raw: string | null } {
if (radio == null) {
return { display: '—', raw: null };
}
const trimmed = radio.trim();
const parts = trimmed.split(',').map((part) => part.trim());
if (parts.length !== 4) {
return { display: trimmed || '—', raw: trimmed || null };
}
const [freqRaw, bwRaw, sfRaw, crRaw] = parts;
const freq = Number.parseFloat(freqRaw);
const bw = Number.parseFloat(bwRaw);
const sf = Number.parseInt(sfRaw, 10);
const cr = Number.parseInt(crRaw, 10);
if (![freq, bw, sf, cr].every(Number.isFinite)) {
return { display: trimmed || '—', raw: trimmed || null };
}
const formattedFreq = Number(freq.toFixed(3)).toString();
const formattedBw = Number(bw.toFixed(3)).toString();
return {
display: `${formattedFreq} MHz, BW ${formattedBw} kHz, SF${sf}, CR${cr}`,
raw: trimmed,
};
}
export function RadioSettingsPane({
data,
state,
@@ -32,10 +59,7 @@ export function RadioSettingsPane({
advertState: PaneState;
onRefreshAdvert: () => void;
}) {
const clockDrift = useMemo(() => {
if (!data?.clock_utc) return null;
return formatClockDrift(data.clock_utc);
}, [data?.clock_utc]);
const formattedRadio = formatRadioTuple(data?.radio ?? null);
return (
<RepeaterPane title="Radio Settings" state={state} onRefresh={onRefresh} disabled={disabled}>
@@ -44,36 +68,14 @@ export function RadioSettingsPane({
) : (
<div>
<KvRow label="Firmware" value={data.firmware_version ?? '—'} />
<KvRow label="Radio" value={data.radio ?? '—'} />
<KvRow
label="Radio"
value={<span title={formattedRadio.raw ?? undefined}>{formattedRadio.display}</span>}
/>
<KvRow label="TX Power" value={data.tx_power != null ? `${data.tx_power} dBm` : '—'} />
<KvRow label="Airtime Factor" value={data.airtime_factor ?? '—'} />
<KvRow label="Repeat Mode" value={data.repeat_enabled ?? '—'} />
<KvRow label="Max Flood Hops" value={data.flood_max ?? '—'} />
<Separator className="my-1" />
<KvRow label="Name" value={data.name ?? '—'} />
<KvRow
label="Lat / Lon"
value={
data.lat != null || data.lon != null ? `${data.lat ?? '—'}, ${data.lon ?? '—'}` : '—'
}
/>
<Separator className="my-1" />
<div className="flex justify-between text-sm py-0.5">
<span className="text-muted-foreground">Clock (UTC)</span>
<span>
{data.clock_utc ?? '—'}
{clockDrift && (
<span
className={cn(
'ml-2 text-xs',
clockDrift.isLarge ? 'text-destructive' : 'text-muted-foreground'
)}
>
(drift: {clockDrift.text})
</span>
)}
</span>
</div>
</div>
)}
{/* Advert Intervals sub-section */}

View File

@@ -39,7 +39,10 @@ export function formatDuration(seconds: number): string {
return `${mins}m`;
}
export function formatClockDrift(clockUtc: string): { text: string; isLarge: boolean } {
export function formatClockDrift(
clockUtc: string,
referenceTimeMs: number = Date.now()
): { text: string; isLarge: boolean } {
// Firmware format: "HH:MM - D/M/YYYY UTC" or "HH:MM:SS - D/M/YYYY UTC"
// Also handle ISO-like: "YYYY-MM-DD HH:MM:SS"
let parsed: Date;
@@ -56,7 +59,7 @@ export function formatClockDrift(clockUtc: string): { text: string; isLarge: boo
}
if (isNaN(parsed.getTime())) return { text: '(invalid)', isLarge: false };
const driftMs = Math.abs(Date.now() - parsed.getTime());
const driftMs = Math.abs(referenceTimeMs - parsed.getTime());
const driftSec = Math.floor(driftMs / 1000);
if (driftSec >= 86400) return { text: '>24 hours!', isLarge: true };
@@ -106,6 +109,7 @@ function formatFetchedTime(fetchedAt: number): string {
export function RepeaterPane({
title,
headerNote,
state,
onRefresh,
disabled,
@@ -114,6 +118,7 @@ export function RepeaterPane({
contentClassName,
}: {
title: string;
headerNote?: ReactNode;
state: PaneState;
onRefresh?: () => void;
disabled?: boolean;
@@ -128,6 +133,7 @@ export function RepeaterPane({
<div className="flex items-center justify-between px-3 py-2 bg-muted/50 border-b border-border">
<div className="min-w-0">
<h3 className="text-sm font-medium">{title}</h3>
{headerNote && <p className="text-[11px] text-muted-foreground">{headerNote}</p>}
{fetchedAt && (
<p
className="text-[11px] text-muted-foreground"

View File

@@ -113,6 +113,19 @@ export function SettingsAboutSection({ className }: { className?: string }) {
</a>
</p>
</div>
<Separator />
<div className="text-center">
<a
href="/api/debug"
target="_blank"
rel="noopener noreferrer"
className="text-xs text-muted-foreground hover:text-primary hover:underline"
>
Open debug support snapshot
</a>
</div>
</div>
</div>
);

View File

@@ -13,6 +13,8 @@ import type {
HealthStatus,
RadioConfig,
RadioConfigUpdate,
RadioDiscoveryResponse,
RadioDiscoveryTarget,
} from '../../types';
export function SettingsRadioSection({
@@ -27,6 +29,9 @@ export function SettingsRadioSection({
onDisconnect,
onReconnect,
onAdvertise,
meshDiscovery,
meshDiscoveryLoadingTarget,
onDiscoverMesh,
onClose,
className,
}: {
@@ -41,6 +46,9 @@ export function SettingsRadioSection({
onDisconnect: () => Promise<void>;
onReconnect: () => Promise<void>;
onAdvertise: () => Promise<void>;
meshDiscovery: RadioDiscoveryResponse | null;
meshDiscoveryLoadingTarget: RadioDiscoveryTarget | null;
onDiscoverMesh: (target: RadioDiscoveryTarget) => Promise<void>;
onClose: () => void;
className?: string;
}) {
@@ -75,6 +83,7 @@ export function SettingsRadioSection({
// Advertise state
const [advertising, setAdvertising] = useState(false);
const [discoverError, setDiscoverError] = useState<string | null>(null);
const [connectionBusy, setConnectionBusy] = useState(false);
useEffect(() => {
@@ -295,6 +304,15 @@ export function SettingsRadioSection({
}
};
const handleDiscover = async (target: RadioDiscoveryTarget) => {
setDiscoverError(null);
try {
await onDiscoverMesh(target);
} catch (err) {
setDiscoverError(err instanceof Error ? err.message : 'Failed to run mesh discovery');
}
};
const radioState =
health?.radio_state ?? (health?.radio_initializing ? 'initializing' : 'disconnected');
const connectionActionLabel =
@@ -315,6 +333,35 @@ export function SettingsRadioSection({
? `Connection paused${health?.connection_info ? ` (${health.connection_info})` : ''}`
: 'Not connected';
const deviceInfoLabel = useMemo(() => {
const info = health?.radio_device_info;
if (!info) {
return null;
}
const model = info.model?.trim() || null;
const firmwareParts = [info.firmware_build?.trim(), info.firmware_version?.trim()].filter(
(value): value is string => Boolean(value)
);
const capacityParts = [
typeof info.max_contacts === 'number' ? `${info.max_contacts} contacts` : null,
typeof info.max_channels === 'number' ? `${info.max_channels} channels` : null,
].filter((value): value is string => value !== null);
if (!model && firmwareParts.length === 0 && capacityParts.length === 0) {
return null;
}
let label = model ?? 'Radio';
if (firmwareParts.length > 0) {
label += ` running ${firmwareParts.join('/')}`;
}
if (capacityParts.length > 0) {
label += ` (max: ${capacityParts.join(', ')})`;
}
return label;
}, [health?.radio_device_info]);
const handleConnectionAction = async () => {
setConnectionBusy(true);
try {
@@ -359,6 +406,7 @@ export function SettingsRadioSection({
{connectionStatusLabel}
</span>
</div>
{deviceInfoLabel && <p className="text-sm text-muted-foreground">{deviceInfoLabel}</p>}
<Button
type="button"
variant="outline"
@@ -687,7 +735,10 @@ export function SettingsRadioSection({
<Separator />
{/* Send Advertisement */}
<div className="space-y-2">
<Label className="text-base">Hear &amp; Be Heard</Label>
</div>
<div className="space-y-2">
<Label>Send Advertisement</Label>
<p className="text-xs text-muted-foreground">
@@ -704,6 +755,81 @@ export function SettingsRadioSection({
<p className="text-sm text-destructive">Radio not connected</p>
)}
</div>
<div className="space-y-3">
<Label>Mesh Discovery</Label>
<p className="text-xs text-muted-foreground">
Discover nearby nodes of the types that currently respond to mesh discovery requests:
repeaters and sensors.
</p>
<div className="grid grid-cols-1 gap-2 sm:grid-cols-3">
{[
{ target: 'repeaters', label: 'Discover Repeaters' },
{ target: 'sensors', label: 'Discover Sensors' },
{ target: 'all', label: 'Discover Both' },
].map(({ target, label }) => (
<Button
key={target}
type="button"
variant="outline"
onClick={() => handleDiscover(target as RadioDiscoveryTarget)}
disabled={meshDiscoveryLoadingTarget !== null || !health?.radio_connected}
className="w-full"
>
{meshDiscoveryLoadingTarget === target ? 'Listening...' : label}
</Button>
))}
</div>
{!health?.radio_connected && (
<p className="text-sm text-destructive">Radio not connected</p>
)}
{discoverError && (
<p className="text-sm text-destructive" role="alert">
{discoverError}
</p>
)}
{meshDiscovery && (
<div className="space-y-2 rounded-md border border-input bg-muted/20 p-3">
<div className="flex items-center justify-between gap-4">
<p className="text-sm font-medium">
Last sweep: {meshDiscovery.results.length} node
{meshDiscovery.results.length === 1 ? '' : 's'}
</p>
<p className="text-xs text-muted-foreground">
{meshDiscovery.duration_seconds.toFixed(0)}s listen window
</p>
</div>
{meshDiscovery.results.length === 0 ? (
<p className="text-sm text-muted-foreground">
No supported nodes responded during the last discovery sweep.
</p>
) : (
<div className="space-y-2">
{meshDiscovery.results.map((result) => (
<div
key={result.public_key}
className="rounded-md border border-input bg-background px-3 py-2"
>
<div className="flex items-center justify-between gap-3">
<span className="text-sm font-medium capitalize">{result.node_type}</span>
<span className="text-xs text-muted-foreground">
heard {result.heard_count} time{result.heard_count === 1 ? '' : 's'}
</span>
</div>
<p className="mt-1 break-all font-mono text-xs text-muted-foreground">
{result.public_key}
</p>
<p className="mt-1 text-xs text-muted-foreground">
Heard here: {result.local_snr ?? 'n/a'} dB SNR / {result.local_rssi ?? 'n/a'}{' '}
dBm RSSI. Remote heard us: {result.remote_snr ?? 'n/a'} dB SNR.
</p>
</div>
))}
</div>
)}
</div>
)}
</div>
</div>
);
}

View File

@@ -12,24 +12,41 @@ The visualizer displays:
## Architecture
### Data Layer (`components/visualizer/useVisualizerData3D.ts`)
### Semantic Data Layer (`networkGraph/packetNetworkGraph.ts`)
The custom hook manages all graph state and simulation logic:
The packet-network module owns the canonical mesh representation and the visibility-aware projection logic:
```
Packets → Parse → Aggregate by key → Observation window → Publish → Animate
Packets → Parse → Canonical observations/adjacency → Projection by settings
```
**Key responsibilities:**
- Maintains node and link maps (`nodesRef`, `linksRef`)
- Resolves packet source / repeater / destination nodes into a canonical path
- Maintains canonical node, link, observation, and neighbor state independent of UI toggles
- Applies ambiguous repeater heuristics and advert-path hints while building canonical data
- Projects canonical paths into rendered links, including dashed bridges over hidden ambiguous runs
- Exposes a reusable semantic surface for other consumers besides the 3D visualizer
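The canonical-vs-projection split can be sketched roughly as below. This is a simplified illustration with hypothetical names (`CanonicalNode`, `projectPath`), not the real `projectCanonicalPath` signature; the point is that canonical data is never mutated, and hidden ambiguous runs are bridged with a single dashed link:

```typescript
// Hypothetical sketch of the projection step; real types in
// packetNetworkGraph.ts differ.
type CanonicalNode = { id: string; ambiguous: boolean };
type ProjectedLink = { from: string; to: string; dashed: boolean };

// Canonical path stays untouched; projection drops hidden ambiguous nodes
// and bridges the resulting gap with one dashed link between the surviving
// endpoints on either side of the hidden run.
function projectPath(path: CanonicalNode[], showAmbiguous: boolean): ProjectedLink[] {
  const links: ProjectedLink[] = [];
  let prev: CanonicalNode | null = null;
  let bridging = false; // true while we are skipping a hidden ambiguous run
  for (const node of path) {
    if (!showAmbiguous && node.ambiguous) {
      bridging = prev !== null;
      continue;
    }
    if (prev) links.push({ from: prev.id, to: node.id, dashed: bridging });
    prev = node;
    bridging = false;
  }
  return links;
}
```

With ambiguous repeaters hidden, a path `A → ?x → B` projects to a single dashed `A → B` bridge; with them shown, it projects to two solid links.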
### Visualizer Data Hook (`components/visualizer/useVisualizerData3D.ts`)
The hook manages render-specific state and animation timing on top of the shared packet-network data layer:
```
Canonical projection → Aggregate by key → Observation window → Publish → Animate
```
**Key responsibilities:**
- Adapts semantic packet-network nodes/links into `GraphNode` / `GraphLink` render objects
- Runs `d3-force-3d` simulation for 3D layout (`.numDimensions(3)`)
- Processes incoming packets with deduplication
- Aggregates packet repeats across multiple paths
- Processes incoming packets with deduplication and feeds them into the semantic layer
- Aggregates packet repeats across multiple projected paths
- Manages particle queue and animation timing
**State:**
- `networkStateRef`: Canonical packet-network state (nodes, links, observations, neighbors)
- `nodesRef`: Map of node ID → GraphNode
- `linksRef`: Map of link key → GraphLink
- `particlesRef`: Array of active Particle objects
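The observation-window step above can be sketched as follows. Names here (`Pending`, `aggregate`, `flush`) are illustrative, not the hook's actual internals; they show the idea of batching repeat hearings of one packet key until the window closes, then publishing once:

```typescript
// Hypothetical sketch: repeats of the same packet key arriving within the
// observation window are aggregated, then published as a single animation.
type Pending = { key: string; paths: string[][]; firstSeen: number };

function aggregate(pending: Map<string, Pending>, key: string, path: string[], now: number): void {
  const entry = pending.get(key);
  if (entry) {
    entry.paths.push(path); // same packet heard again via another path
  } else {
    pending.set(key, { key, paths: [path], firstSeen: now });
  }
}

// Entries older than the window are returned for publishing and removed.
function flush(pending: Map<string, Pending>, now: number, windowMs: number): Pending[] {
  const ready: Pending[] = [];
  for (const [k, entry] of pending) {
    if (now - entry.firstSeen >= windowMs) {
      ready.push(entry);
      pending.delete(k);
    }
  }
  return ready;
}
```

A packet heard twice via different repeaters within the 15 s default window thus animates once, with both paths attached.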
@@ -50,6 +67,8 @@ Scene creation, render-loop updates, raycasting hover, and click-to-pin interact
### Shared Utilities
- `networkGraph/packetNetworkGraph.ts`
- Canonical packet-network types and replay/projection logic
- `components/visualizer/shared.ts`
- Graph-specific types: `GraphNode`, `GraphLink`, `NodeMeshData`
- Shared rendering helpers: node colors, relative-time formatting, typed-array growth helpers
@@ -75,8 +94,9 @@ When a new packet arrives from the WebSocket:
```typescript
packets.forEach((packet) => {
if (processedRef.current.has(packet.id)) return; // Skip duplicates
processedRef.current.add(packet.id);
const observationKey = getRawPacketObservationKey(packet);
if (processedRef.current.has(observationKey)) return; // Skip duplicates
processedRef.current.add(observationKey);
const parsed = parsePacket(packet.data);
const key = generatePacketKey(parsed, packet);
@@ -196,6 +216,8 @@ When a winner is found, the ambiguous node gets a `probableIdentity` label (the
**Interaction with traffic splitting:** Advert-path hints run first. If a probable identity is found, the display name is set. Traffic splitting can still produce separate node IDs (`?XX:>YY`), but won't overwrite the advert-path display name.
**Sibling collapse projection:** When an ambiguous repeater has a high-confidence likely identity and that likely repeater also appears as a definitely-known sibling connecting to the same next hop, the projection layer can collapse the ambiguous node into the known repeater. This is projection-only: canonical observations and canonical neighbor truth remain unchanged.
**Toggle:** "Use repeater advert-path identity hints" checkbox (enabled by default, disabled when ambiguous repeaters are hidden).
### Traffic Pattern Splitting (Experimental)
@@ -308,40 +330,44 @@ function buildPath(parsed, packet, myPrefix): string[] {
| Pan (right-drag) | Pan the camera |
| Scroll wheel | Zoom in/out |
**Click-to-pin:** When a node is pinned, hovering other nodes does not change the highlight. The tooltip shows "Traffic exchanged with:" listing all connected neighbors with their possible names.
**Click-to-pin:** When a node is pinned, hovering other nodes does not change the highlight. The tooltip shows "Traffic exchanged with:" using canonical packet-network adjacency, not rendered-link adjacency, so hidden repeaters still appear truthfully as hidden neighbors.
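A minimal sketch of that lookup, assuming a plain adjacency map (the names `Adjacency` and `tooltipNeighbors` are hypothetical): neighbors come from canonical adjacency, and visibility is only consulted to flag, not filter:

```typescript
// Hypothetical sketch: tooltip neighbors use canonical adjacency, so a
// repeater hidden by a display toggle is still listed, just flagged hidden.
type Adjacency = Map<string, Set<string>>;

function tooltipNeighbors(
  canonical: Adjacency,
  visibleNodes: Set<string>,
  pinnedId: string
): { id: string; hidden: boolean }[] {
  const neighbors = canonical.get(pinnedId) ?? new Set<string>();
  return [...neighbors].map((id) => ({ id, hidden: !visibleNodes.has(id) }));
}
```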
## Configuration Options
| Option | Default | Description |
| -------------------------- | ------- | --------------------------------------------------------- |
| Ambiguous repeaters | On | Show nodes when only partial prefix known |
| Ambiguous sender/recipient | Off | Show placeholder nodes for unknown senders |
| Advert-path identity hints | On | Use stored advert paths to label ambiguous repeaters |
| Split by traffic pattern | Off | Split ambiguous repeaters by next-hop routing (see above) |
| Observation window | 15 sec | Wait time for duplicate packets before animating (1-60s) |
| Let 'em drift | On | Continuous layout optimization |
| Repulsion | 200 | Force strength (50-2500) |
| Packet speed | 2x | Particle animation speed multiplier (1x-5x) |
| Shuffle layout | - | Button to randomize node positions and reheat sim |
| Oooh Big Stretch! | - | Button to temporarily increase repulsion then relax |
| Clear & Reset | - | Button to clear all nodes, links, and packets |
| Hide UI | Off | Hide legends and most controls for cleaner view |
| Full screen | Off | Hide the packet feed panel (desktop only) |
| Option | Default | Description |
| -------------------------- | ------- | ----------------------------------------------------------- |
| Ambiguous repeaters | On | Show nodes when only partial prefix known |
| Ambiguous sender/recipient | Off | Show placeholder nodes for unknown senders |
| Advert-path identity hints | On | Use stored advert paths to label ambiguous repeaters |
| Collapse sibling repeaters | On | Merge likely ambiguous repeater with known sibling repeater |
| Split by traffic pattern | Off | Split ambiguous repeaters by next-hop routing (see above) |
| Observation window | 15 sec | Wait time for duplicate packets before animating (1-60s) |
| Let 'em drift | On | Continuous layout optimization |
| Repulsion | 200 | Force strength (50-2500) |
| Packet speed | 2x | Particle animation speed multiplier (1x-5x) |
| Shuffle layout | - | Button to randomize node positions and reheat sim |
| Oooh Big Stretch! | - | Button to temporarily increase repulsion then relax |
| Clear & Reset | - | Button to clear all nodes, links, and packets |
| Hide UI | Off | Hide legends and most controls for cleaner view |
| Full screen | Off | Hide the packet feed panel (desktop only) |
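The defaults in the table above can be restated as a config object. The field names here are illustrative (the real props are threaded individually through `VisualizerControls`), but `clampObservationWindow` matches the clamping the controls actually apply to the observation-window input.

```typescript
// Defaults from the configuration table, as a sketch — field names assumed.
const defaultVisualizerOptions = {
  showAmbiguousNodes: true,                   // Ambiguous repeaters: On
  showAmbiguousSendersRecipients: false,      // Ambiguous sender/recipient: Off
  useAdvertPathHints: true,                   // Advert-path identity hints: On
  collapseLikelyKnownSiblingRepeaters: true,  // Collapse sibling repeaters: On
  splitAmbiguousByTraffic: false,             // Split by traffic pattern: Off
  observationWindowSec: 15,                   // clamped to 1–60 by the controls
  letEmDrift: true,                           // continuous layout optimization
  repulsion: 200,                             // force strength, range 50–2500
  particleSpeedMultiplier: 2,                 // 1x–5x
};

// Mirrors the onChange clamp in the controls: non-numeric input falls back
// to 1, and the result is pinned to the documented 1–60 second range.
function clampObservationWindow(raw: string): number {
  return Math.max(1, Math.min(60, parseInt(raw, 10) || 1));
}
```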
## File Structure
```
PacketVisualizer3D.tsx
├── TYPES (GraphNode extends SimulationNodeDatum3D, GraphLink)
├── CONSTANTS (NODE_COLORS, NODE_LEGEND_ITEMS)
├── DATA LAYER HOOK (useVisualizerData3D)
│ ├── Refs (nodes, links, particles, simulation, pending, timers, trafficPatterns, stretchRaf)
│ ├── d3-force-3d simulation initialization (.numDimensions(3))
│ ├── Contact indexing (byPrefix12 / byName / byPrefix)
│ ├── Node/link management (addNode, addLink, syncSimulation)
│ ├── Path building (resolveNode, buildPath)
├── SEMANTIC DATA LAYER (networkGraph/packetNetworkGraph.ts)
│ ├── Contact/advert indexes
│ ├── Canonical node/link/neighbor/observation state
│ ├── Path building (resolveNode, buildCanonicalPathForPacket)
│ ├── Traffic pattern analysis (for repeater disambiguation)
│ └── Packet processing & publishing
│ └── Projection (projectCanonicalPath, projectPacketNetwork)
├── DATA HOOK (useVisualizerData3D)
│ ├── Refs (network state, render nodes, links, particles, simulation, pending, timers, stretchRaf)
│ ├── d3-force-3d simulation initialization (.numDimensions(3))
│ ├── Semantic→render adaptation
│ ├── Observation-window packet aggregation
│ └── Particle publishing
└── MAIN COMPONENT (PacketVisualizer3D)
├── Three.js scene setup (WebGLRenderer, CSS2DRenderer, OrbitControls)
├── Node mesh management (SphereGeometry + CSS2DObject labels)
@@ -356,6 +382,13 @@ utils/visualizerUtils.ts
├── Constants (COLORS, PARTICLE_COLOR_MAP, PARTICLE_SPEED, PACKET_LEGEND_ITEMS)
└── Functions (parsePacket, generatePacketKey, analyzeRepeaterTraffic, etc.)
networkGraph/packetNetworkGraph.ts
├── Types (PacketNetworkNode, PacketNetworkLink, PacketNetworkObservation, projection types)
├── Context builders (contact and advert-path indexes)
├── Canonical replay (ingestPacketIntoPacketNetwork)
├── Projection helpers (projectCanonicalPath, projectPacketNetwork)
└── State maintenance (clear, prune, neighbor snapshots)
types/d3-force-3d.d.ts
└── Type declarations for d3-force-3d (SimulationNodeDatum3D, Simulation3D, forces)
```

View File

@@ -13,6 +13,8 @@ interface VisualizerControlsProps {
setShowAmbiguousNodes: (value: boolean) => void;
useAdvertPathHints: boolean;
setUseAdvertPathHints: (value: boolean) => void;
collapseLikelyKnownSiblingRepeaters: boolean;
setCollapseLikelyKnownSiblingRepeaters: (value: boolean) => void;
splitAmbiguousByTraffic: boolean;
setSplitAmbiguousByTraffic: (value: boolean) => void;
observationWindowSec: number;
@@ -46,6 +48,8 @@ export function VisualizerControls({
setShowAmbiguousNodes,
useAdvertPathHints,
setUseAdvertPathHints,
collapseLikelyKnownSiblingRepeaters,
setCollapseLikelyKnownSiblingRepeaters,
splitAmbiguousByTraffic,
setSplitAmbiguousByTraffic,
observationWindowSec,
@@ -149,55 +153,77 @@ export function VisualizerControls({
Show ambiguous sender/recipient
</span>
</label>
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={useAdvertPathHints}
onCheckedChange={(c) => setUseAdvertPathHints(c === true)}
disabled={!showAmbiguousPaths}
/>
<span
title="Use stored repeater advert paths to assign likely identity labels for ambiguous repeater nodes"
className={!showAmbiguousPaths ? 'text-muted-foreground' : ''}
>
Use repeater advert-path identity hints
</span>
</label>
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={splitAmbiguousByTraffic}
onCheckedChange={(c) => setSplitAmbiguousByTraffic(c === true)}
disabled={!showAmbiguousPaths}
/>
<span
title="Split ambiguous repeaters into separate nodes based on traffic patterns (prev→next). Helps identify colliding prefixes representing different physical nodes, but requires enough traffic to disambiguate."
className={!showAmbiguousPaths ? 'text-muted-foreground' : ''}
>
Heuristically group repeaters by traffic pattern
</span>
</label>
<div className="flex items-center gap-2">
<label
htmlFor="observation-window-3d"
className="text-muted-foreground"
title="How long to wait for duplicate packets via different paths before animating"
>
Ack/echo listen window:
</label>
<input
id="observation-window-3d"
type="number"
min="1"
max="60"
value={observationWindowSec}
onChange={(e) =>
setObservationWindowSec(
Math.max(1, Math.min(60, parseInt(e.target.value, 10) || 1))
)
}
className="w-12 px-1 py-0.5 bg-background border border-border rounded text-xs text-center"
/>
<span className="text-muted-foreground">sec</span>
</div>
<details className="rounded border border-border/60 px-2 py-1">
<summary className="cursor-pointer select-none text-muted-foreground">
Advanced
</summary>
<div className="mt-2 flex flex-col gap-2">
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={useAdvertPathHints}
onCheckedChange={(c) => setUseAdvertPathHints(c === true)}
disabled={!showAmbiguousPaths}
/>
<span
title="Use stored repeater advert paths to assign likely identity labels for ambiguous repeater nodes."
className={!showAmbiguousPaths ? 'text-muted-foreground' : ''}
>
Use repeater advert-path identity hints
</span>
</label>
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={collapseLikelyKnownSiblingRepeaters}
onCheckedChange={(c) => setCollapseLikelyKnownSiblingRepeaters(c === true)}
disabled={!showAmbiguousPaths || !useAdvertPathHints}
/>
<span
title="When an ambiguous repeater has a high-confidence likely-identity that matches a sibling definitely-known repeater, and they both connect to the same next hop, collapse them into the known repeater. This should resolve more ambiguity as the mesh navigates the 1.14 upgrade."
className={
!showAmbiguousPaths || !useAdvertPathHints ? 'text-muted-foreground' : ''
}
>
Collapse likely sibling repeaters
</span>
</label>
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={splitAmbiguousByTraffic}
onCheckedChange={(c) => setSplitAmbiguousByTraffic(c === true)}
disabled={!showAmbiguousPaths}
/>
<span
title="Split ambiguous repeaters into separate nodes based on traffic patterns (prev→next). Helps identify colliding prefixes representing different physical nodes, but requires enough traffic to disambiguate."
className={!showAmbiguousPaths ? 'text-muted-foreground' : ''}
>
Heuristically group repeaters by traffic pattern
</span>
</label>
<div className="flex items-center gap-2">
<label
htmlFor="observation-window-3d"
className="text-muted-foreground"
title="How long to wait for duplicate packets via different paths before animating"
>
Ack/echo listen window:
</label>
<input
id="observation-window-3d"
type="number"
min="1"
max="60"
value={observationWindowSec}
onChange={(e) =>
setObservationWindowSec(
Math.max(1, Math.min(60, parseInt(e.target.value, 10) || 1))
)
}
className="w-12 px-1 py-0.5 bg-background border border-border rounded text-xs text-center"
/>
<span className="text-muted-foreground">sec</span>
</div>
</div>
</details>
<div className="border-t border-border pt-2 mt-1 flex flex-col gap-2">
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox

View File

@@ -1,25 +1,37 @@
import type { GraphNode } from './shared';
import type { PacketNetworkNode } from '../../networkGraph/packetNetworkGraph';
import { formatRelativeTime } from './shared';
interface VisualizerTooltipProps {
activeNodeId: string | null;
nodes: Map<string, GraphNode>;
neighborIds: string[];
canonicalNodes: Map<string, PacketNetworkNode>;
canonicalNeighborIds: Map<string, string[]>;
renderedNodeIds: Set<string>;
}
export function VisualizerTooltip({ activeNodeId, nodes, neighborIds }: VisualizerTooltipProps) {
export function VisualizerTooltip({
activeNodeId,
canonicalNodes,
canonicalNeighborIds,
renderedNodeIds,
}: VisualizerTooltipProps) {
if (!activeNodeId) return null;
const node = nodes.get(activeNodeId);
const node = canonicalNodes.get(activeNodeId);
if (!node) return null;
const neighborIds = canonicalNeighborIds.get(activeNodeId) ?? [];
const neighbors = neighborIds
.map((nid) => {
const neighbor = nodes.get(nid);
const neighbor = canonicalNodes.get(nid);
if (!neighbor) return null;
const displayName =
neighbor.name || (neighbor.type === 'self' ? 'Me' : neighbor.id.slice(0, 8));
return { id: nid, name: displayName, ambiguousNames: neighbor.ambiguousNames };
return {
id: nid,
name: displayName,
ambiguousNames: neighbor.ambiguousNames,
hidden: !renderedNodeIds.has(nid),
};
})
.filter((neighbor): neighbor is NonNullable<typeof neighbor> => neighbor !== null);
@@ -56,6 +68,7 @@ export function VisualizerTooltip({ activeNodeId, nodes, neighborIds }: Visualiz
{neighbors.map((neighbor) => (
<li key={neighbor.id}>
{neighbor.name}
{neighbor.hidden && <span className="text-muted-foreground/60"> (hidden)</span>}
{neighbor.ambiguousNames && neighbor.ambiguousNames.length > 0 && (
<span className="text-muted-foreground/60">
{' '}

View File

@@ -21,6 +21,9 @@ export interface GraphLink extends SimulationLinkDatum<GraphNode> {
source: string | GraphNode;
target: string | GraphNode;
lastActivity: number;
hasDirectObservation: boolean;
hasHiddenIntermediate: boolean;
hiddenHopLabels: string[];
}
export interface NodeMeshData {
@@ -74,6 +77,11 @@ export function formatRelativeTime(timestamp: number): string {
return secs > 0 ? `${minutes}m ${secs}s ago` : `${minutes}m ago`;
}
export function getSceneNodeLabel(node: Pick<GraphNode, 'id' | 'name' | 'type' | 'isAmbiguous'>) {
const baseLabel = node.name || (node.type === 'self' ? 'Me' : node.id.slice(0, 8));
return node.isAmbiguous ? `${baseLabel} (?)` : baseLabel;
}
export function normalizePacketTimestampMs(timestamp: number | null | undefined): number {
if (!Number.isFinite(timestamp) || !timestamp || timestamp <= 0) {
return Date.now();

View File

@@ -5,7 +5,13 @@ import { CSS2DObject, CSS2DRenderer } from 'three/examples/jsm/renderers/CSS2DRe
import { COLORS, getLinkId } from '../../utils/visualizerUtils';
import type { VisualizerData3D } from './useVisualizerData3D';
import { arraysEqual, getBaseNodeColor, growFloat32Buffer, type NodeMeshData } from './shared';
import {
arraysEqual,
getBaseNodeColor,
getSceneNodeLabel,
growFloat32Buffer,
type NodeMeshData,
} from './shared';
interface UseVisualizer3DSceneArgs {
containerRef: RefObject<HTMLDivElement | null>;
@@ -32,10 +38,12 @@ export function useVisualizer3DScene({
const nodeMeshesRef = useRef<Map<string, NodeMeshData>>(new Map());
const raycastTargetsRef = useRef<THREE.Mesh[]>([]);
const linkLineRef = useRef<THREE.LineSegments | null>(null);
const dashedLinkLineRef = useRef<THREE.LineSegments | null>(null);
const highlightLineRef = useRef<THREE.LineSegments | null>(null);
const particlePointsRef = useRef<THREE.Points | null>(null);
const particleTextureRef = useRef<THREE.Texture | null>(null);
const linkPositionBufferRef = useRef<Float32Array>(new Float32Array(0));
const dashedLinkPositionBufferRef = useRef<Float32Array>(new Float32Array(0));
const highlightPositionBufferRef = useRef<Float32Array>(new Float32Array(0));
const particlePositionBufferRef = useRef<Float32Array>(new Float32Array(0));
const particleColorBufferRef = useRef<Float32Array>(new Float32Array(0));
@@ -126,6 +134,19 @@ export function useVisualizer3DScene({
scene.add(linkSegments);
linkLineRef.current = linkSegments;
const dashedLinkGeometry = new THREE.BufferGeometry();
const dashedLinkMaterial = new THREE.LineDashedMaterial({
color: 0x94a3b8,
transparent: true,
opacity: 0.85,
dashSize: 16,
gapSize: 10,
});
const dashedLinkSegments = new THREE.LineSegments(dashedLinkGeometry, dashedLinkMaterial);
dashedLinkSegments.visible = false;
scene.add(dashedLinkSegments);
dashedLinkLineRef.current = dashedLinkSegments;
const highlightGeometry = new THREE.BufferGeometry();
const highlightMaterial = new THREE.LineBasicMaterial({
color: 0xffd700,
@@ -198,6 +219,12 @@ export function useVisualizer3DScene({
(linkLineRef.current.material as THREE.Material).dispose();
linkLineRef.current = null;
}
if (dashedLinkLineRef.current) {
scene.remove(dashedLinkLineRef.current);
dashedLinkLineRef.current.geometry.dispose();
(dashedLinkLineRef.current.material as THREE.Material).dispose();
dashedLinkLineRef.current = null;
}
if (highlightLineRef.current) {
scene.remove(highlightLineRef.current);
highlightLineRef.current.geometry.dispose();
@@ -213,6 +240,7 @@ export function useVisualizer3DScene({
particleTexture.dispose();
particleTextureRef.current = null;
linkPositionBufferRef.current = new Float32Array(0);
dashedLinkPositionBufferRef.current = new Float32Array(0);
highlightPositionBufferRef.current = new Float32Array(0);
particlePositionBufferRef.current = new Float32Array(0);
particleColorBufferRef.current = new Float32Array(0);
@@ -340,7 +368,7 @@ export function useVisualizer3DScene({
if (nd.labelDiv.style.color !== labelColor) {
nd.labelDiv.style.color = labelColor;
}
const labelText = node.name || (node.type === 'self' ? 'Me' : node.id.slice(0, 8));
const labelText = getSceneNodeLabel(node);
if (nd.labelDiv.textContent !== labelText) {
nd.labelDiv.textContent = labelText;
}
@@ -369,11 +397,16 @@ export function useVisualizer3DScene({
}
const activeId = pinnedNodeIdRef.current ?? hoveredNodeIdRef.current;
const visibleLinks = [];
const solidLinks = [];
const dashedLinks = [];
for (const link of links.values()) {
const { sourceId, targetId } = getLinkId(link);
if (currentNodeIds.has(sourceId) && currentNodeIds.has(targetId)) {
visibleLinks.push(link);
if (link.hasDirectObservation || !link.hasHiddenIntermediate) {
solidLinks.push(link);
} else {
dashedLinks.push(link);
}
}
}
@@ -382,7 +415,8 @@ export function useVisualizer3DScene({
const linkLine = linkLineRef.current;
if (linkLine) {
const geometry = linkLine.geometry as THREE.BufferGeometry;
const requiredLength = visibleLinks.length * 6;
const requiredLength = solidLinks.length * 6;
const highlightRequiredLength = (solidLinks.length + dashedLinks.length) * 6;
if (linkPositionBufferRef.current.length < requiredLength) {
linkPositionBufferRef.current = growFloat32Buffer(
linkPositionBufferRef.current,
@@ -397,10 +431,10 @@ export function useVisualizer3DScene({
}
const highlightLine = highlightLineRef.current;
if (highlightLine && highlightPositionBufferRef.current.length < requiredLength) {
if (highlightLine && highlightPositionBufferRef.current.length < highlightRequiredLength) {
highlightPositionBufferRef.current = growFloat32Buffer(
highlightPositionBufferRef.current,
requiredLength
highlightRequiredLength
);
(highlightLine.geometry as THREE.BufferGeometry).setAttribute(
'position',
@@ -415,7 +449,7 @@ export function useVisualizer3DScene({
let idx = 0;
let hlIdx = 0;
for (const link of visibleLinks) {
for (const link of solidLinks) {
const { sourceId, targetId } = getLinkId(link);
const sNode = nodes.get(sourceId);
const tNode = nodes.get(targetId);
@@ -446,6 +480,23 @@ export function useVisualizer3DScene({
}
}
for (const link of dashedLinks) {
const { sourceId, targetId } = getLinkId(link);
if (activeId && (sourceId === activeId || targetId === activeId)) {
const sNode = nodes.get(sourceId);
const tNode = nodes.get(targetId);
if (!sNode || !tNode) continue;
connectedIds?.add(sourceId === activeId ? targetId : sourceId);
hlPositions[hlIdx++] = sNode.x ?? 0;
hlPositions[hlIdx++] = sNode.y ?? 0;
hlPositions[hlIdx++] = sNode.z ?? 0;
hlPositions[hlIdx++] = tNode.x ?? 0;
hlPositions[hlIdx++] = tNode.y ?? 0;
hlPositions[hlIdx++] = tNode.z ?? 0;
}
}
const positionAttr = geometry.getAttribute('position') as THREE.BufferAttribute | undefined;
if (positionAttr) {
positionAttr.needsUpdate = true;
@@ -464,6 +515,51 @@ export function useVisualizer3DScene({
}
}
const dashedLinkLine = dashedLinkLineRef.current;
if (dashedLinkLine) {
const geometry = dashedLinkLine.geometry as THREE.BufferGeometry;
const requiredLength = dashedLinks.length * 6;
if (dashedLinkPositionBufferRef.current.length < requiredLength) {
dashedLinkPositionBufferRef.current = growFloat32Buffer(
dashedLinkPositionBufferRef.current,
requiredLength
);
geometry.setAttribute(
'position',
new THREE.BufferAttribute(dashedLinkPositionBufferRef.current, 3).setUsage(
THREE.DynamicDrawUsage
)
);
}
const positions = dashedLinkPositionBufferRef.current;
let idx = 0;
for (const link of dashedLinks) {
const { sourceId, targetId } = getLinkId(link);
const sNode = nodes.get(sourceId);
const tNode = nodes.get(targetId);
if (!sNode || !tNode) continue;
positions[idx++] = sNode.x ?? 0;
positions[idx++] = sNode.y ?? 0;
positions[idx++] = sNode.z ?? 0;
positions[idx++] = tNode.x ?? 0;
positions[idx++] = tNode.y ?? 0;
positions[idx++] = tNode.z ?? 0;
}
const positionAttr = geometry.getAttribute('position') as THREE.BufferAttribute | undefined;
if (positionAttr) {
positionAttr.needsUpdate = true;
}
geometry.setDrawRange(0, idx / 3);
dashedLinkLine.visible = idx > 0;
if (idx > 0 && positionAttr) {
dashedLinkLine.computeLineDistances();
}
}
let writeIdx = 0;
for (let readIdx = 0; readIdx < particles.length; readIdx++) {
const particle = particles[readIdx];

View File

@@ -10,10 +10,20 @@ import {
type ForceLink3D,
type Simulation3D,
} from 'd3-force-3d';
import { PayloadType } from '@michaelhart/meshcore-decoder';
import type { PacketNetworkNode } from '../../networkGraph/packetNetworkGraph';
import {
buildPacketNetworkContext,
clearPacketNetworkState,
createPacketNetworkState,
ensureSelfNode,
ingestPacketIntoPacketNetwork,
projectCanonicalPath,
projectPacketNetwork,
prunePacketNetworkState,
snapshotNeighborIds,
} from '../../networkGraph/packetNetworkGraph';
import {
CONTACT_TYPE_REPEATER,
type Contact,
type ContactAdvertPathSummary,
type RadioConfig,
@@ -21,22 +31,15 @@ import {
} from '../../types';
import { getRawPacketObservationKey } from '../../utils/rawPacketIdentity';
import {
type Particle,
type PendingPacket,
type RepeaterTrafficData,
PARTICLE_COLOR_MAP,
PARTICLE_SPEED,
analyzeRepeaterTraffic,
buildAmbiguousRepeaterLabel,
buildAmbiguousRepeaterNodeId,
buildLinkKey,
dedupeConsecutive,
generatePacketKey,
getNodeType,
getPacketLabel,
parsePacket,
recordTrafficObservation,
type Particle,
PARTICLE_COLOR_MAP,
PARTICLE_SPEED,
type PendingPacket,
} from '../../utils/visualizerUtils';
import { type GraphLink, type GraphNode, normalizePacketTimestampMs } from './shared';
import { type GraphLink, type GraphNode } from './shared';
export interface UseVisualizerData3DOptions {
packets: RawPacket[];
@@ -46,6 +49,7 @@ export interface UseVisualizerData3DOptions {
showAmbiguousPaths: boolean;
showAmbiguousNodes: boolean;
useAdvertPathHints: boolean;
collapseLikelyKnownSiblingRepeaters: boolean;
splitAmbiguousByTraffic: boolean;
chargeStrength: number;
letEmDrift: boolean;
@@ -58,12 +62,42 @@ export interface UseVisualizerData3DOptions {
export interface VisualizerData3D {
nodes: Map<string, GraphNode>;
links: Map<string, GraphLink>;
canonicalNodes: Map<string, PacketNetworkNode>;
canonicalNeighborIds: Map<string, string[]>;
renderedNodeIds: Set<string>;
particles: Particle[];
stats: { processed: number; animated: number; nodes: number; links: number };
expandContract: () => void;
clearAndReset: () => void;
}
function buildInitialRenderNode(node: PacketNetworkNode): GraphNode {
if (node.id === 'self') {
return {
...node,
x: 0,
y: 0,
z: 0,
fx: 0,
fy: 0,
fz: 0,
vx: 0,
vy: 0,
vz: 0,
};
}
const theta = Math.random() * Math.PI * 2;
const phi = Math.acos(2 * Math.random() - 1);
const r = 80 + Math.random() * 100;
return {
...node,
x: r * Math.sin(phi) * Math.cos(theta),
y: r * Math.sin(phi) * Math.sin(theta),
z: r * Math.cos(phi),
};
}
export function useVisualizerData3D({
packets,
contacts,
@@ -72,6 +106,7 @@ export function useVisualizerData3D({
showAmbiguousPaths,
showAmbiguousNodes,
useAdvertPathHints,
collapseLikelyKnownSiblingRepeaters,
splitAmbiguousByTraffic,
chargeStrength,
letEmDrift,
@@ -80,6 +115,7 @@ export function useVisualizerData3D({
pruneStaleNodes,
pruneStaleMinutes,
}: UseVisualizerData3DOptions): VisualizerData3D {
const networkStateRef = useRef(createPacketNetworkState(config?.name || 'Me'));
const nodesRef = useRef<Map<string, GraphNode>>(new Map());
const linksRef = useRef<Map<string, GraphLink>>(new Map());
const particlesRef = useRef<Particle[]>([]);
@@ -87,47 +123,23 @@ export function useVisualizerData3D({
const processedRef = useRef<Set<string>>(new Set());
const pendingRef = useRef<Map<string, PendingPacket>>(new Map());
const timersRef = useRef<Map<string, ReturnType<typeof setTimeout>>>(new Map());
const trafficPatternsRef = useRef<Map<string, RepeaterTrafficData>>(new Map());
const speedMultiplierRef = useRef(particleSpeedMultiplier);
const observationWindowRef = useRef(observationWindowSec * 1000);
const stretchRafRef = useRef<number | null>(null);
const [stats, setStats] = useState({ processed: 0, animated: 0, nodes: 0, links: 0 });
const [, setProjectionVersion] = useState(0);
const contactIndex = useMemo(() => {
const byPrefix12 = new Map<string, Contact>();
const byName = new Map<string, Contact>();
const byPrefix = new Map<string, Contact[]>();
for (const contact of contacts) {
const prefix12 = contact.public_key.slice(0, 12).toLowerCase();
byPrefix12.set(prefix12, contact);
if (contact.name && !byName.has(contact.name)) {
byName.set(contact.name, contact);
}
for (let len = 1; len <= 12; len++) {
const prefix = prefix12.slice(0, len);
const matches = byPrefix.get(prefix);
if (matches) {
matches.push(contact);
} else {
byPrefix.set(prefix, [contact]);
}
}
}
return { byPrefix12, byName, byPrefix };
}, [contacts]);
const advertPathIndex = useMemo(() => {
const byRepeater = new Map<string, ContactAdvertPathSummary['paths']>();
for (const summary of repeaterAdvertPaths) {
const key = summary.public_key.slice(0, 12).toLowerCase();
byRepeater.set(key, summary.paths);
}
return { byRepeater };
}, [repeaterAdvertPaths]);
const packetNetworkContext = useMemo(
() =>
buildPacketNetworkContext({
contacts,
config,
repeaterAdvertPaths,
splitAmbiguousByTraffic,
useAdvertPathHints,
}),
[contacts, config, repeaterAdvertPaths, splitAmbiguousByTraffic, useAdvertPathHints]
);
useEffect(() => {
speedMultiplierRef.current = particleSpeedMultiplier;
@@ -213,96 +225,108 @@ export function useVisualizerData3D({
? prev
: { ...prev, nodes: nodes.length, links: links.length }
);
setProjectionVersion((prev) => prev + 1);
}, []);
useEffect(() => {
if (!nodesRef.current.has('self')) {
nodesRef.current.set('self', {
id: 'self',
name: config?.name || 'Me',
type: 'self',
isAmbiguous: false,
lastActivity: Date.now(),
x: 0,
y: 0,
z: 0,
});
syncSimulation();
}
}, [config, syncSimulation]);
useEffect(() => {
processedRef.current.clear();
const selfNode = nodesRef.current.get('self');
nodesRef.current.clear();
if (selfNode) nodesRef.current.set('self', selfNode);
linksRef.current.clear();
particlesRef.current = [];
pendingRef.current.clear();
timersRef.current.forEach((t) => clearTimeout(t));
timersRef.current.clear();
trafficPatternsRef.current.clear();
setStats({ processed: 0, animated: 0, nodes: selfNode ? 1 : 0, links: 0 });
syncSimulation();
}, [
showAmbiguousPaths,
showAmbiguousNodes,
useAdvertPathHints,
splitAmbiguousByTraffic,
syncSimulation,
]);
const addNode = useCallback(
(
id: string,
name: string | null,
type: GraphNode['type'],
isAmbiguous: boolean,
probableIdentity?: string | null,
ambiguousNames?: string[],
lastSeen?: number | null,
activityAtMs?: number
) => {
const activityAt = activityAtMs ?? Date.now();
const existing = nodesRef.current.get(id);
if (existing) {
existing.lastActivity = Math.max(existing.lastActivity, activityAt);
if (name) existing.name = name;
if (probableIdentity !== undefined) existing.probableIdentity = probableIdentity;
if (ambiguousNames) existing.ambiguousNames = ambiguousNames;
if (lastSeen !== undefined) existing.lastSeen = lastSeen;
} else {
const theta = Math.random() * Math.PI * 2;
const phi = Math.acos(2 * Math.random() - 1);
const r = 80 + Math.random() * 100;
nodesRef.current.set(id, {
id,
name,
type,
isAmbiguous,
lastActivity: activityAt,
probableIdentity,
lastSeen,
ambiguousNames,
x: r * Math.sin(phi) * Math.cos(theta),
y: r * Math.sin(phi) * Math.sin(theta),
z: r * Math.cos(phi),
});
const upsertRenderNode = useCallback(
(node: PacketNetworkNode, existing?: GraphNode): GraphNode => {
if (!existing) {
return buildInitialRenderNode(node);
}
existing.name = node.name;
existing.type = node.type;
existing.isAmbiguous = node.isAmbiguous;
existing.lastActivity = node.lastActivity;
existing.lastActivityReason = node.lastActivityReason;
existing.lastSeen = node.lastSeen;
existing.probableIdentity = node.probableIdentity;
existing.ambiguousNames = node.ambiguousNames;
if (node.id === 'self') {
existing.x = 0;
existing.y = 0;
existing.z = 0;
existing.fx = 0;
existing.fy = 0;
existing.fz = 0;
existing.vx = 0;
existing.vy = 0;
existing.vz = 0;
}
return existing;
},
[]
);
const addLink = useCallback((sourceId: string, targetId: string, activityAtMs?: number) => {
const activityAt = activityAtMs ?? Date.now();
const key = [sourceId, targetId].sort().join('->');
const existing = linksRef.current.get(key);
if (existing) {
existing.lastActivity = Math.max(existing.lastActivity, activityAt);
} else {
linksRef.current.set(key, { source: sourceId, target: targetId, lastActivity: activityAt });
const rebuildRenderProjection = useCallback(() => {
const projection = projectPacketNetwork(networkStateRef.current, {
showAmbiguousNodes,
showAmbiguousPaths,
collapseLikelyKnownSiblingRepeaters,
});
const previousNodes = nodesRef.current;
const nextNodes = new Map<string, GraphNode>();
for (const [nodeId, node] of projection.nodes) {
nextNodes.set(nodeId, upsertRenderNode(node, previousNodes.get(nodeId)));
}
}, []);
const nextLinks = new Map<string, GraphLink>();
for (const [key, link] of projection.links) {
nextLinks.set(key, {
source: link.sourceId,
target: link.targetId,
lastActivity: link.lastActivity,
hasDirectObservation: link.hasDirectObservation,
hasHiddenIntermediate: link.hasHiddenIntermediate,
hiddenHopLabels: [...link.hiddenHopLabels],
});
}
nodesRef.current = nextNodes;
linksRef.current = nextLinks;
syncSimulation();
}, [
collapseLikelyKnownSiblingRepeaters,
showAmbiguousNodes,
showAmbiguousPaths,
syncSimulation,
upsertRenderNode,
]);
useEffect(() => {
ensureSelfNode(networkStateRef.current, config?.name || 'Me');
const selfNode = networkStateRef.current.nodes.get('self');
if (selfNode) {
nodesRef.current.set('self', upsertRenderNode(selfNode, nodesRef.current.get('self')));
}
syncSimulation();
}, [config?.name, syncSimulation, upsertRenderNode]);
useEffect(() => {
processedRef.current.clear();
clearPacketNetworkState(networkStateRef.current, { selfName: config?.name || 'Me' });
nodesRef.current.clear();
linksRef.current.clear();
particlesRef.current = [];
pendingRef.current.clear();
timersRef.current.forEach((timer) => clearTimeout(timer));
timersRef.current.clear();
const selfNode = networkStateRef.current.nodes.get('self');
if (selfNode) {
nodesRef.current.set('self', upsertRenderNode(selfNode));
}
setStats({ processed: 0, animated: 0, nodes: selfNode ? 1 : 0, links: 0 });
syncSimulation();
}, [config?.name, splitAmbiguousByTraffic, syncSimulation, upsertRenderNode, useAdvertPathHints]);
useEffect(() => {
rebuildRenderProjection();
}, [rebuildRenderProjection]);
const publishPacket = useCallback((packetKey: string) => {
const pending = pendingRef.current.get(packetKey);
@@ -319,7 +343,7 @@ export function useVisualizerData3D({
for (let i = 0; i < dedupedPath.length - 1; i++) {
particlesRef.current.push({
linkKey: [dedupedPath[i], dedupedPath[i + 1]].sort().join('->'),
linkKey: buildLinkKey(dedupedPath[i], dedupedPath[i + 1]),
progress: -i,
speed: PARTICLE_SPEED * speedMultiplierRef.current,
color: PARTICLE_COLOR_MAP[pending.label],
@@ -331,334 +355,10 @@ export function useVisualizerData3D({
}
}, []);
const pickLikelyRepeaterByAdvertPath = useCallback(
(candidates: Contact[], nextPrefix: string | null) => {
const nextHop = nextPrefix?.toLowerCase() ?? null;
const scored = candidates
.map((candidate) => {
const prefix12 = candidate.public_key.slice(0, 12).toLowerCase();
const paths = advertPathIndex.byRepeater.get(prefix12) ?? [];
let matchScore = 0;
let totalScore = 0;
for (const path of paths) {
totalScore += path.heard_count;
const pathNextHop = path.next_hop?.toLowerCase() ?? null;
if (pathNextHop === nextHop) {
matchScore += path.heard_count;
}
}
return { candidate, matchScore, totalScore };
})
.filter((entry) => entry.totalScore > 0)
.sort(
(a, b) =>
b.matchScore - a.matchScore ||
b.totalScore - a.totalScore ||
a.candidate.public_key.localeCompare(b.candidate.public_key)
);
if (scored.length === 0) return null;
const top = scored[0];
const second = scored[1] ?? null;
if (top.matchScore < 2) return null;
if (second && top.matchScore < second.matchScore * 2) return null;
return top.candidate;
},
[advertPathIndex]
);
const resolveNode = useCallback(
(
source: { type: 'prefix' | 'pubkey' | 'name'; value: string },
isRepeater: boolean,
showAmbiguous: boolean,
myPrefix: string | null,
activityAtMs: number,
trafficContext?: { packetSource: string | null; nextPrefix: string | null }
): string | null => {
if (source.type === 'pubkey') {
if (source.value.length < 12) return null;
const nodeId = source.value.slice(0, 12).toLowerCase();
if (myPrefix && nodeId === myPrefix) return 'self';
const contact = contactIndex.byPrefix12.get(nodeId);
addNode(
nodeId,
contact?.name || null,
getNodeType(contact),
false,
undefined,
undefined,
contact?.last_seen,
activityAtMs
);
return nodeId;
}
if (source.type === 'name') {
const contact = contactIndex.byName.get(source.value) ?? null;
if (contact) {
const nodeId = contact.public_key.slice(0, 12).toLowerCase();
if (myPrefix && nodeId === myPrefix) return 'self';
addNode(
nodeId,
contact.name,
getNodeType(contact),
false,
undefined,
undefined,
contact.last_seen,
activityAtMs
);
return nodeId;
}
const nodeId = `name:${source.value}`;
addNode(
nodeId,
source.value,
'client',
false,
undefined,
undefined,
undefined,
activityAtMs
);
return nodeId;
}
const lookupValue = source.value.toLowerCase();
const matches = contactIndex.byPrefix.get(lookupValue) ?? [];
const contact = matches.length === 1 ? matches[0] : null;
if (contact) {
const nodeId = contact.public_key.slice(0, 12).toLowerCase();
if (myPrefix && nodeId === myPrefix) return 'self';
addNode(
nodeId,
contact.name,
getNodeType(contact),
false,
undefined,
undefined,
contact.last_seen,
activityAtMs
);
return nodeId;
}
if (showAmbiguous) {
const filtered = isRepeater
? matches.filter((c) => c.type === CONTACT_TYPE_REPEATER)
: matches.filter((c) => c.type !== CONTACT_TYPE_REPEATER);
if (filtered.length === 1) {
const c = filtered[0];
const nodeId = c.public_key.slice(0, 12).toLowerCase();
addNode(
nodeId,
c.name,
getNodeType(c),
false,
undefined,
undefined,
c.last_seen,
activityAtMs
);
return nodeId;
}
if (filtered.length > 1 || (filtered.length === 0 && isRepeater)) {
const names = filtered.map((c) => c.name || c.public_key.slice(0, 8));
const lastSeen = filtered.reduce(
(max, c) => (c.last_seen && (!max || c.last_seen > max) ? c.last_seen : max),
null as number | null
);
let nodeId = buildAmbiguousRepeaterNodeId(lookupValue);
let displayName = buildAmbiguousRepeaterLabel(lookupValue);
let probableIdentity: string | null = null;
let ambiguousNames = names.length > 0 ? names : undefined;
if (useAdvertPathHints && isRepeater && trafficContext) {
const normalizedNext = trafficContext.nextPrefix?.toLowerCase() ?? null;
const likely = pickLikelyRepeaterByAdvertPath(filtered, normalizedNext);
if (likely) {
const likelyName = likely.name || likely.public_key.slice(0, 12).toUpperCase();
probableIdentity = likelyName;
displayName = likelyName;
ambiguousNames = filtered
.filter((c) => c.public_key !== likely.public_key)
.map((c) => c.name || c.public_key.slice(0, 8));
}
}
if (splitAmbiguousByTraffic && isRepeater && trafficContext) {
const normalizedNext = trafficContext.nextPrefix?.toLowerCase() ?? null;
if (trafficContext.packetSource) {
recordTrafficObservation(
trafficPatternsRef.current,
lookupValue,
trafficContext.packetSource,
normalizedNext
);
}
const trafficData = trafficPatternsRef.current.get(lookupValue);
if (trafficData) {
const analysis = analyzeRepeaterTraffic(trafficData);
if (analysis.shouldSplit && normalizedNext) {
nodeId = buildAmbiguousRepeaterNodeId(lookupValue, normalizedNext);
if (!probableIdentity) {
displayName = buildAmbiguousRepeaterLabel(lookupValue, normalizedNext);
}
}
}
}
addNode(
nodeId,
displayName,
isRepeater ? 'repeater' : 'client',
true,
probableIdentity,
ambiguousNames,
lastSeen,
activityAtMs
);
return nodeId;
}
}
return null;
},
[
contactIndex,
addNode,
useAdvertPathHints,
pickLikelyRepeaterByAdvertPath,
splitAmbiguousByTraffic,
]
);
const buildPath = useCallback(
(
parsed: ReturnType<typeof parsePacket>,
packet: RawPacket,
myPrefix: string | null,
activityAtMs: number
): string[] => {
if (!parsed) return [];
const path: string[] = [];
let packetSource: string | null = null;
if (parsed.payloadType === PayloadType.Advert && parsed.advertPubkey) {
const nodeId = resolveNode(
{ type: 'pubkey', value: parsed.advertPubkey },
false,
false,
myPrefix,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
} else if (parsed.payloadType === PayloadType.AnonRequest && parsed.anonRequestPubkey) {
const nodeId = resolveNode(
{ type: 'pubkey', value: parsed.anonRequestPubkey },
false,
false,
myPrefix,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
} else if (parsed.payloadType === PayloadType.TextMessage && parsed.srcHash) {
if (myPrefix && parsed.srcHash.toLowerCase() === myPrefix) {
path.push('self');
packetSource = 'self';
} else {
const nodeId = resolveNode(
{ type: 'prefix', value: parsed.srcHash },
false,
showAmbiguousNodes,
myPrefix,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
}
} else if (parsed.payloadType === PayloadType.GroupText) {
const senderName = parsed.groupTextSender || packet.decrypted_info?.sender;
if (senderName) {
const resolved = resolveNode(
{ type: 'name', value: senderName },
false,
false,
myPrefix,
activityAtMs
);
if (resolved) {
path.push(resolved);
packetSource = resolved;
}
}
}
for (let i = 0; i < parsed.pathBytes.length; i++) {
const hexPrefix = parsed.pathBytes[i];
const nextPrefix = parsed.pathBytes[i + 1] || null;
const nodeId = resolveNode(
{ type: 'prefix', value: hexPrefix },
true,
showAmbiguousPaths,
myPrefix,
activityAtMs,
{ packetSource, nextPrefix }
);
if (nodeId) path.push(nodeId);
}
if (parsed.payloadType === PayloadType.TextMessage && parsed.dstHash) {
if (myPrefix && parsed.dstHash.toLowerCase() === myPrefix) {
path.push('self');
} else {
const nodeId = resolveNode(
{ type: 'prefix', value: parsed.dstHash },
false,
showAmbiguousNodes,
myPrefix,
activityAtMs
);
if (nodeId) path.push(nodeId);
else path.push('self');
}
} else if (path.length > 0) {
path.push('self');
}
if (path.length > 0 && path[path.length - 1] !== 'self') {
path.push('self');
}
return dedupeConsecutive(path);
},
[resolveNode, showAmbiguousPaths, showAmbiguousNodes]
);
useEffect(() => {
let newProcessed = 0;
let newAnimated = 0;
let needsUpdate = false;
const myPrefix = config?.public_key?.slice(0, 12).toLowerCase() || null;
let needsProjectionRebuild = false;
for (const packet of packets) {
const observationKey = getRawPacketObservationKey(packet);
@@ -670,34 +370,31 @@ export function useVisualizerData3D({
processedRef.current = new Set(Array.from(processedRef.current).slice(-500));
}
const parsed = parsePacket(packet.data);
if (!parsed) continue;
const ingested = ingestPacketIntoPacketNetwork(
networkStateRef.current,
packetNetworkContext,
packet
);
if (!ingested) continue;
needsProjectionRebuild = true;
const packetActivityAt = normalizePacketTimestampMs(packet.timestamp);
const path = buildPath(parsed, packet, myPrefix, packetActivityAt);
if (path.length < 2) continue;
const projectedPath = projectCanonicalPath(networkStateRef.current, ingested.canonicalPath, {
showAmbiguousNodes,
showAmbiguousPaths,
collapseLikelyKnownSiblingRepeaters,
});
if (projectedPath.nodes.length < 2) continue;
const label = getPacketLabel(parsed.payloadType);
for (let i = 0; i < path.length; i++) {
const n = nodesRef.current.get(path[i]);
if (n && n.id !== 'self') {
n.lastActivityReason = i === 0 ? `${label} source` : `Relayed ${label}`;
}
}
for (let i = 0; i < path.length - 1; i++) {
if (path[i] !== path[i + 1]) {
addLink(path[i], path[i + 1], packetActivityAt);
needsUpdate = true;
}
}
const packetKey = generatePacketKey(parsed, packet);
const packetKey = generatePacketKey(ingested.parsed, packet);
const now = Date.now();
const existing = pendingRef.current.get(packetKey);
if (existing && now < existing.expiresAt) {
existing.paths.push({ nodes: path, snr: packet.snr ?? null, timestamp: now });
existing.paths.push({
nodes: projectedPath.nodes,
snr: packet.snr ?? null,
timestamp: now,
});
} else {
const existingTimer = timersRef.current.get(packetKey);
if (existingTimer) {
@@ -706,8 +403,8 @@ export function useVisualizerData3D({
const windowMs = observationWindowRef.current;
pendingRef.current.set(packetKey, {
key: packetKey,
label: getPacketLabel(parsed.payloadType),
paths: [{ nodes: path, snr: packet.snr ?? null, timestamp: now }],
label: ingested.label,
paths: [{ nodes: projectedPath.nodes, snr: packet.snr ?? null, timestamp: now }],
firstSeen: now,
expiresAt: now + windowMs,
});
@@ -734,7 +431,9 @@ export function useVisualizerData3D({
newAnimated++;
}
if (needsUpdate) syncSimulation();
if (needsProjectionRebuild) {
rebuildRenderProjection();
}
if (newProcessed > 0) {
setStats((prev) => ({
...prev,
@@ -742,7 +441,15 @@ export function useVisualizerData3D({
animated: prev.animated + newAnimated,
}));
}
}, [packets, config, buildPath, addLink, syncSimulation, publishPacket]);
}, [
packets,
packetNetworkContext,
publishPacket,
collapseLikelyKnownSiblingRepeaters,
rebuildRenderProjection,
showAmbiguousNodes,
showAmbiguousPaths,
]);
const expandContract = useCallback(() => {
const sim = simulationRef.current;
@@ -831,21 +538,14 @@ export function useVisualizerData3D({
timersRef.current.clear();
pendingRef.current.clear();
processedRef.current.clear();
trafficPatternsRef.current.clear();
particlesRef.current.length = 0;
linksRef.current.clear();
clearPacketNetworkState(networkStateRef.current, { selfName: config?.name || 'Me' });
const selfNode = nodesRef.current.get('self');
linksRef.current.clear();
nodesRef.current.clear();
const selfNode = networkStateRef.current.nodes.get('self');
if (selfNode) {
selfNode.x = 0;
selfNode.y = 0;
selfNode.z = 0;
selfNode.vx = 0;
selfNode.vy = 0;
selfNode.vz = 0;
selfNode.lastActivity = Date.now();
nodesRef.current.set('self', selfNode);
nodesRef.current.set('self', upsertRenderNode(selfNode));
}
const sim = simulationRef.current;
@@ -856,8 +556,8 @@ export function useVisualizerData3D({
sim.alpha(0.3).restart();
}
setStats({ processed: 0, animated: 0, nodes: 1, links: 0 });
}, []);
setStats({ processed: 0, animated: 0, nodes: selfNode ? 1 : 0, links: 0 });
}, [config?.name, upsertRenderNode]);
useEffect(() => {
const stretchRaf = stretchRafRef;
@@ -883,40 +583,23 @@ export function useVisualizerData3D({
const interval = setInterval(() => {
const cutoff = Date.now() - staleMs;
let pruned = false;
for (const [id, node] of nodesRef.current) {
if (id === 'self') continue;
if (node.lastActivity < cutoff) {
nodesRef.current.delete(id);
pruned = true;
}
}
if (pruned) {
for (const [key, link] of linksRef.current) {
const sourceId = typeof link.source === 'string' ? link.source : link.source.id;
const targetId = typeof link.target === 'string' ? link.target : link.target.id;
if (!nodesRef.current.has(sourceId) || !nodesRef.current.has(targetId)) {
linksRef.current.delete(key);
}
}
syncSimulation();
if (prunePacketNetworkState(networkStateRef.current, cutoff)) {
rebuildRenderProjection();
}
}, pruneIntervalMs);
return () => clearInterval(interval);
}, [pruneStaleNodes, pruneStaleMinutes, syncSimulation]);
}, [pruneStaleMinutes, pruneStaleNodes, rebuildRenderProjection]);
return useMemo(
() => ({
nodes: nodesRef.current,
links: linksRef.current,
particles: particlesRef.current,
stats,
expandContract,
clearAndReset,
}),
[stats, expandContract, clearAndReset]
);
return {
nodes: nodesRef.current,
links: linksRef.current,
canonicalNodes: networkStateRef.current.nodes,
canonicalNeighborIds: snapshotNeighborIds(networkStateRef.current),
renderedNodeIds: new Set(nodesRef.current.keys()),
particles: particlesRef.current,
stats,
expandContract,
clearAndReset,
};
}

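The path builder above finishes with `dedupeConsecutive(path)`, mirroring the inline `path[i] !== path[i + 1]` guard used when adding links. A minimal sketch of that helper (the name is from the source; the implementation is assumed):

```typescript
// Collapse runs of identical adjacent node ids so a hop never links a node
// to itself, e.g. ['a', 'a', 'b', 'b', 'a'] -> ['a', 'b', 'a'].
function dedupeConsecutive(path: string[]): string[] {
  const result: string[] = [];
  for (const id of path) {
    // result[result.length - 1] is undefined for an empty result, so the
    // first id is always pushed.
    if (result[result.length - 1] !== id) result.push(id);
  }
  return result;
}
```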
View File

@@ -43,25 +43,6 @@ export function useAppSettings() {
[fetchAppSettings]
);
const handleSortOrderChange = useCallback(
async (order: 'recent' | 'alpha') => {
const previousOrder = appSettings?.sidebar_sort_order ?? 'recent';
// Optimistic update for responsive UI
setAppSettings((prev) => (prev ? { ...prev, sidebar_sort_order: order } : prev));
try {
const updatedSettings = await api.updateSettings({ sidebar_sort_order: order });
setAppSettings(updatedSettings);
} catch (err) {
console.error('Failed to update sort order:', err);
setAppSettings((prev) => (prev ? { ...prev, sidebar_sort_order: previousOrder } : prev));
toast.error('Failed to save sort preference');
}
},
[appSettings?.sidebar_sort_order]
);
const handleToggleBlockedKey = useCallback(async (key: string) => {
const normalizedKey = key.toLowerCase();
setAppSettings((prev) => {
@@ -198,7 +179,6 @@ export function useAppSettings() {
favorites,
fetchAppSettings,
handleSaveAppSettings,
handleSortOrderChange,
handleToggleFavorite,
handleToggleBlockedKey,
handleToggleBlockedName,

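The removed `handleSortOrderChange` followed a common optimistic-update-with-rollback shape: apply the new value locally for a responsive UI, persist it, and restore the previous value if persistence fails. A generic sketch of that pattern (the helper name and signature are illustrative, not from the source):

```typescript
// Optimistic update with rollback: `get`/`set` read and write local state,
// `persist` performs the round-trip and returns the server's canonical value.
async function optimisticUpdate<T>(
  get: () => T,
  set: (value: T) => void,
  next: T,
  persist: (value: T) => Promise<T>
): Promise<void> {
  const previous = get();
  set(next); // update the UI before the round-trip completes
  try {
    set(await persist(next)); // adopt the server's canonical value
  } catch {
    set(previous); // roll back on failure
  }
}
```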
View File

@@ -4,10 +4,9 @@ import { takePrefetchOrFetch } from '../prefetch';
import { toast } from '../components/ui/sonner';
import * as messageCache from '../messageCache';
import { getContactDisplayName } from '../utils/pubkey';
import { findPublicChannel, PUBLIC_CHANNEL_KEY, PUBLIC_CHANNEL_NAME } from '../utils/publicChannel';
import type { Channel, Contact, Conversation } from '../types';
const PUBLIC_CHANNEL_KEY = '8B3387E9C5CDEA6AC9E5EDBAA115CD72';
interface UseContactsAndChannelsArgs {
setActiveConversation: (conv: Conversation | null) => void;
pendingDeleteFallbackRef: MutableRefObject<boolean>;
@@ -121,14 +120,12 @@ export function useContactsAndChannels({
messageCache.remove(key);
const refreshedChannels = await api.getChannels();
setChannels(refreshedChannels);
const publicChannel =
refreshedChannels.find((c) => c.key === PUBLIC_CHANNEL_KEY) ||
refreshedChannels.find((c) => c.name === 'Public');
const publicChannel = findPublicChannel(refreshedChannels);
hasSetDefaultConversation.current = true;
setActiveConversation({
type: 'channel',
id: publicChannel?.key || PUBLIC_CHANNEL_KEY,
name: publicChannel?.name || 'Public',
name: publicChannel?.name || PUBLIC_CHANNEL_NAME,
});
toast.success('Channel deleted');
} catch (err) {
@@ -151,14 +148,12 @@ export function useContactsAndChannels({
setContacts((prev) => prev.filter((c) => c.public_key !== publicKey));
const refreshedChannels = await api.getChannels();
setChannels(refreshedChannels);
const publicChannel =
refreshedChannels.find((c) => c.key === PUBLIC_CHANNEL_KEY) ||
refreshedChannels.find((c) => c.name === 'Public');
const publicChannel = findPublicChannel(refreshedChannels);
hasSetDefaultConversation.current = true;
setActiveConversation({
type: 'channel',
id: publicChannel?.key || PUBLIC_CHANNEL_KEY,
name: publicChannel?.name || 'Public',
name: publicChannel?.name || PUBLIC_CHANNEL_NAME,
});
toast.success('Contact deleted');
} catch (err) {

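The `findPublicChannel` helper imported above replaces the inline two-step lookup this diff removes: match by the well-known Public channel key first, then fall back to matching by name. A sketch assuming exactly that behavior (the key and name constants appear in the source; the function body is assumed):

```typescript
const PUBLIC_CHANNEL_KEY = '8B3387E9C5CDEA6AC9E5EDBAA115CD72';
const PUBLIC_CHANNEL_NAME = 'Public';

// Minimal Channel shape for this sketch; the real type has more fields.
interface Channel {
  key: string;
  name: string;
}

// Prefer the canonical key; fall back to a name match for renamed keys.
function findPublicChannel(channels: Channel[]): Channel | undefined {
  return (
    channels.find((c) => c.key === PUBLIC_CHANNEL_KEY) ??
    channels.find((c) => c.name === PUBLIC_CHANNEL_NAME)
  );
}
```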
View File

@@ -1,18 +1,16 @@
import { useCallback, type MutableRefObject, type RefObject } from 'react';
import { api } from '../api';
import * as messageCache from '../messageCache';
import { toast } from '../components/ui/sonner';
import type { MessageInputHandle } from '../components/MessageInput';
import type { Channel, Conversation, Message } from '../types';
import type { Channel, Contact, Conversation, Message, PathDiscoveryResponse } from '../types';
import { mergeContactIntoList } from '../utils/contactMerge';
interface UseConversationActionsArgs {
activeConversation: Conversation | null;
activeConversationRef: MutableRefObject<Conversation | null>;
setContacts: React.Dispatch<React.SetStateAction<Contact[]>>;
setChannels: React.Dispatch<React.SetStateAction<Channel[]>>;
addMessageIfNew: (msg: Message) => boolean;
jumpToBottom: () => void;
handleToggleBlockedKey: (key: string) => Promise<void>;
handleToggleBlockedName: (name: string) => Promise<void>;
messageInputRef: RefObject<MessageInputHandle | null>;
}
@@ -25,18 +23,15 @@ interface UseConversationActionsResult {
) => Promise<void>;
handleSenderClick: (sender: string) => void;
handleTrace: () => Promise<void>;
handleBlockKey: (key: string) => Promise<void>;
handleBlockName: (name: string) => Promise<void>;
handlePathDiscovery: (publicKey: string) => Promise<PathDiscoveryResponse>;
}
export function useConversationActions({
activeConversation,
activeConversationRef,
setContacts,
setChannels,
addMessageIfNew,
jumpToBottom,
handleToggleBlockedKey,
handleToggleBlockedName,
messageInputRef,
}: UseConversationActionsArgs): UseConversationActionsResult {
const mergeChannelIntoList = useCallback(
@@ -74,7 +69,16 @@ export function useConversationActions({
const handleResendChannelMessage = useCallback(
async (messageId: number, newTimestamp?: boolean) => {
try {
await api.resendChannelMessage(messageId, newTimestamp);
const resent = await api.resendChannelMessage(messageId, newTimestamp);
const resentMessage = resent.message;
if (
newTimestamp &&
resentMessage &&
activeConversationRef.current?.type === 'channel' &&
activeConversationRef.current.id === resentMessage.conversation_key
) {
addMessageIfNew(resentMessage);
}
toast.success(newTimestamp ? 'Message resent with new timestamp' : 'Message resent');
} catch (err) {
toast.error('Failed to resend', {
@@ -82,7 +86,7 @@ export function useConversationActions({
});
}
},
[]
[activeConversationRef, addMessageIfNew]
);
const handleSetChannelFloodScopeOverride = useCallback(
@@ -126,22 +130,13 @@ export function useConversationActions({
}
}, [activeConversation]);
const handleBlockKey = useCallback(
async (key: string) => {
await handleToggleBlockedKey(key);
messageCache.clear();
jumpToBottom();
const handlePathDiscovery = useCallback(
async (publicKey: string) => {
const result = await api.requestPathDiscovery(publicKey);
setContacts((prev) => mergeContactIntoList(prev, result.contact));
return result;
},
[handleToggleBlockedKey, jumpToBottom]
);
const handleBlockName = useCallback(
async (name: string) => {
await handleToggleBlockedName(name);
messageCache.clear();
jumpToBottom();
},
[handleToggleBlockedName, jumpToBottom]
[setContacts]
);
return {
@@ -150,7 +145,6 @@ export function useConversationActions({
handleSetChannelFloodScopeOverride,
handleSenderClick,
handleTrace,
handleBlockKey,
handleBlockName,
handlePathDiscovery,
};
}

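`handlePathDiscovery` above folds the discovered contact back into state via `mergeContactIntoList`. A sketch of what that utility likely does (the name and import are from the source; the implementation and the trimmed `Contact` shape are assumptions): replace an existing contact by `public_key` or append a new one, without mutating the input list.

```typescript
// Minimal Contact shape for this sketch; the real type has more fields.
interface Contact {
  public_key: string;
  name?: string;
}

// Upsert by public_key, returning a new array so React state updates fire.
function mergeContactIntoList(list: Contact[], contact: Contact): Contact[] {
  const idx = list.findIndex((c) => c.public_key === contact.public_key);
  if (idx === -1) return [...list, contact];
  const next = list.slice();
  next[idx] = { ...next[idx], ...contact }; // keep fields the update omits
  return next;
}
```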
View File

@@ -77,6 +77,7 @@ interface UseConversationMessagesResult {
fetchOlderMessages: () => Promise<void>;
fetchNewerMessages: () => Promise<void>;
jumpToBottom: () => void;
reloadCurrentConversation: () => void;
addMessageIfNew: (msg: Message) => boolean;
updateMessageAck: (messageId: number, ackCount: number, paths?: MessagePath[]) => void;
triggerReconcile: () => void;
@@ -86,6 +87,30 @@ function isMessageConversation(conversation: Conversation | null): conversation
return !!conversation && !['raw', 'map', 'visualizer', 'search'].includes(conversation.type);
}
function appendUniqueMessages(current: Message[], incoming: Message[]): Message[] {
if (incoming.length === 0) return current;
const seenIds = new Set(current.map((msg) => msg.id));
const seenContent = new Set(current.map((msg) => getMessageContentKey(msg)));
const additions: Message[] = [];
for (const msg of incoming) {
const contentKey = getMessageContentKey(msg);
if (seenIds.has(msg.id) || seenContent.has(contentKey)) {
continue;
}
seenIds.add(msg.id);
seenContent.add(contentKey);
additions.push(msg);
}
if (additions.length === 0) {
return current;
}
return [...current, ...additions];
}
export function useConversationMessages(
activeConversation: Conversation | null,
targetMessageId?: number | null
@@ -136,17 +161,26 @@ export function useConversationMessages(
const [loadingNewer, setLoadingNewer] = useState(false);
const abortControllerRef = useRef<AbortController | null>(null);
const olderAbortControllerRef = useRef<AbortController | null>(null);
const newerAbortControllerRef = useRef<AbortController | null>(null);
const fetchingConversationIdRef = useRef<string | null>(null);
const latestReconcileRequestIdRef = useRef(0);
const messagesRef = useRef<Message[]>([]);
const loadingOlderRef = useRef(false);
const hasOlderMessagesRef = useRef(false);
const hasNewerMessagesRef = useRef(false);
const prevConversationIdRef = useRef<string | null>(null);
const prevReloadVersionRef = useRef(0);
const [reloadVersion, setReloadVersion] = useState(0);
useEffect(() => {
messagesRef.current = messages;
}, [messages]);
useEffect(() => {
loadingOlderRef.current = loadingOlder;
}, [loadingOlder]);
useEffect(() => {
hasOlderMessagesRef.current = hasOlderMessages;
}, [hasOlderMessages]);
@@ -252,7 +286,13 @@ export function useConversationMessages(
);
const fetchOlderMessages = useCallback(async () => {
if (!isMessageConversation(activeConversation) || loadingOlder || !hasOlderMessages) return;
if (
!isMessageConversation(activeConversation) ||
loadingOlderRef.current ||
!hasOlderMessagesRef.current
) {
return;
}
const conversationId = activeConversation.id;
const oldestMessage = messages.reduce(
@@ -266,36 +306,57 @@ export function useConversationMessages(
);
if (!oldestMessage) return;
loadingOlderRef.current = true;
setLoadingOlder(true);
const controller = new AbortController();
olderAbortControllerRef.current = controller;
try {
const data = await api.getMessages({
type: activeConversation.type === 'channel' ? 'CHAN' : 'PRIV',
conversation_key: conversationId,
limit: MESSAGE_PAGE_SIZE,
before: oldestMessage.received_at,
before_id: oldestMessage.id,
});
const data = await api.getMessages(
{
type: activeConversation.type === 'channel' ? 'CHAN' : 'PRIV',
conversation_key: conversationId,
limit: MESSAGE_PAGE_SIZE,
before: oldestMessage.received_at,
before_id: oldestMessage.id,
},
controller.signal
);
if (fetchingConversationIdRef.current !== conversationId) return;
const dataWithPendingAck = data.map((msg) => applyPendingAck(msg));
if (dataWithPendingAck.length > 0) {
setMessages((prev) => [...prev, ...dataWithPendingAck]);
for (const msg of dataWithPendingAck) {
seenMessageContent.current.add(getMessageContentKey(msg));
let nextMessages: Message[] | null = null;
setMessages((prev) => {
const merged = appendUniqueMessages(prev, dataWithPendingAck);
if (merged !== prev) {
nextMessages = merged;
}
return merged;
});
if (nextMessages) {
messagesRef.current = nextMessages;
syncSeenContent(nextMessages);
}
}
setHasOlderMessages(dataWithPendingAck.length >= MESSAGE_PAGE_SIZE);
} catch (err) {
if (isAbortError(err)) {
return;
}
console.error('Failed to fetch older messages:', err);
toast.error('Failed to load older messages', {
description: err instanceof Error ? err.message : 'Check your connection',
});
} finally {
if (olderAbortControllerRef.current === controller) {
olderAbortControllerRef.current = null;
}
loadingOlderRef.current = false;
setLoadingOlder(false);
}
}, [activeConversation, applyPendingAck, hasOlderMessages, loadingOlder, messages]);
}, [activeConversation, applyPendingAck, messages, syncSeenContent]);
const fetchNewerMessages = useCallback(async () => {
if (!isMessageConversation(activeConversation) || loadingNewer || !hasNewerMessages) return;
@@ -313,14 +374,19 @@ export function useConversationMessages(
if (!newestMessage) return;
setLoadingNewer(true);
const controller = new AbortController();
newerAbortControllerRef.current = controller;
try {
const data = await api.getMessages({
type: activeConversation.type === 'channel' ? 'CHAN' : 'PRIV',
conversation_key: conversationId,
limit: MESSAGE_PAGE_SIZE,
after: newestMessage.received_at,
after_id: newestMessage.id,
});
const data = await api.getMessages(
{
type: activeConversation.type === 'channel' ? 'CHAN' : 'PRIV',
conversation_key: conversationId,
limit: MESSAGE_PAGE_SIZE,
after: newestMessage.received_at,
after_id: newestMessage.id,
},
controller.signal
);
if (fetchingConversationIdRef.current !== conversationId) return;
@@ -337,11 +403,17 @@ export function useConversationMessages(
}
setHasNewerMessages(dataWithPendingAck.length >= MESSAGE_PAGE_SIZE);
} catch (err) {
if (isAbortError(err)) {
return;
}
console.error('Failed to fetch newer messages:', err);
toast.error('Failed to load newer messages', {
description: err instanceof Error ? err.message : 'Check your connection',
});
} finally {
if (newerAbortControllerRef.current === controller) {
newerAbortControllerRef.current = null;
}
setLoadingNewer(false);
}
}, [activeConversation, applyPendingAck, hasNewerMessages, loadingNewer, messages]);
@@ -353,6 +425,13 @@ export function useConversationMessages(
void fetchLatestMessages(true);
}, [activeConversation, fetchLatestMessages]);
const reloadCurrentConversation = useCallback(() => {
if (!isMessageConversation(activeConversation)) return;
setHasNewerMessages(false);
messageCache.remove(activeConversation.id);
setReloadVersion((current) => current + 1);
}, [activeConversation]);
const triggerReconcile = useCallback(() => {
if (!isMessageConversation(activeConversation)) return;
const controller = new AbortController();
@@ -365,20 +444,31 @@ export function useConversationMessages(
if (abortControllerRef.current) {
abortControllerRef.current.abort();
}
if (olderAbortControllerRef.current) {
olderAbortControllerRef.current.abort();
olderAbortControllerRef.current = null;
}
if (newerAbortControllerRef.current) {
newerAbortControllerRef.current.abort();
newerAbortControllerRef.current = null;
}
const prevId = prevConversationIdRef.current;
const newId = activeConversation?.id ?? null;
const conversationChanged = prevId !== newId;
const reloadRequested = prevReloadVersionRef.current !== reloadVersion;
fetchingConversationIdRef.current = newId;
prevConversationIdRef.current = newId;
prevReloadVersionRef.current = reloadVersion;
latestReconcileRequestIdRef.current = 0;
// Preserve around-loaded context on the same conversation when search clears targetMessageId.
if (!conversationChanged && !targetMessageId) {
if (!conversationChanged && !targetMessageId && !reloadRequested) {
return;
}
setLoadingOlder(false);
loadingOlderRef.current = false;
setLoadingNewer(false);
if (conversationChanged) {
setHasNewerMessages(false);
@@ -452,7 +542,7 @@ export function useConversationMessages(
controller.abort();
};
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [activeConversation?.id, activeConversation?.type, targetMessageId]);
}, [activeConversation?.id, activeConversation?.type, targetMessageId, reloadVersion]);
// Add a message if it's new (deduplication)
// Returns true if the message was added, false if it was a duplicate
@@ -538,6 +628,7 @@ export function useConversationMessages(
fetchOlderMessages,
fetchNewerMessages,
jumpToBottom,
reloadCurrentConversation,
addMessageIfNew,
updateMessageAck,
triggerReconcile,

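The abortable-pagination changes above all follow one pattern: stash the `AbortController` in a ref, abort it when the conversation changes, swallow `AbortError` so a superseded request never surfaces as an error toast, and only clear the ref if it still points at our own controller. A hedged sketch of that pattern (`isAbortError` is named in the source but its body is assumed; `loadPage` is illustrative):

```typescript
// Aborted fetches throw a DOMException (or Error) named 'AbortError'.
function isAbortError(err: unknown): boolean {
  return (
    (err instanceof DOMException && err.name === 'AbortError') ||
    (err instanceof Error && err.name === 'AbortError')
  );
}

// Run one abortable request, tracking its controller in a shared ref.
async function loadPage(
  ref: { current: AbortController | null },
  run: (signal: AbortSignal) => Promise<void>
): Promise<void> {
  const controller = new AbortController();
  ref.current = controller;
  try {
    await run(controller.signal);
  } catch (err) {
    if (isAbortError(err)) return; // superseded request: stay quiet
    throw err; // real failures still propagate to the caller's toast
  } finally {
    // Only clear the ref if a newer request hasn't replaced it.
    if (ref.current === controller) ref.current = null;
  }
}
```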
View File

@@ -10,11 +10,10 @@ import {
getReopenLastConversationEnabled,
saveLastViewedConversation,
} from '../utils/lastViewedConversation';
import { findPublicChannel } from '../utils/publicChannel';
import { getContactDisplayName } from '../utils/pubkey';
import type { Channel, Contact, Conversation } from '../types';
const PUBLIC_CHANNEL_KEY = '8B3387E9C5CDEA6AC9E5EDBAA115CD72';
interface UseConversationRouterArgs {
channels: Channel[];
contacts: Contact[];
@@ -44,7 +43,7 @@ export function useConversationRouter({
}, []);
const getPublicChannelConversation = useCallback((): Conversation | null => {
const publicChannel = channels.find((c) => c.name === 'Public');
const publicChannel = findPublicChannel(channels);
if (!publicChannel) return null;
return {
type: 'channel',
@@ -221,9 +220,7 @@ export function useConversationRouter({
return;
}
const publicChannel =
channels.find((c) => c.key === PUBLIC_CHANNEL_KEY) ||
channels.find((c) => c.name === 'Public');
const publicChannel = findPublicChannel(channels);
if (!publicChannel) return;
hasSetDefaultConversation.current = true;

View File

@@ -2,11 +2,20 @@ import { useState, useCallback, useEffect, useRef } from 'react';
import { api } from '../api';
import { takePrefetchOrFetch } from '../prefetch';
import { toast } from '../components/ui/sonner';
import type { HealthStatus, RadioConfig, RadioConfigUpdate } from '../types';
import type {
HealthStatus,
RadioConfig,
RadioConfigUpdate,
RadioDiscoveryResponse,
RadioDiscoveryTarget,
} from '../types';
export function useRadioControl() {
const [health, setHealth] = useState<HealthStatus | null>(null);
const [config, setConfig] = useState<RadioConfig | null>(null);
const [meshDiscovery, setMeshDiscovery] = useState<RadioDiscoveryResponse | null>(null);
const [meshDiscoveryLoadingTarget, setMeshDiscoveryLoadingTarget] =
useState<RadioDiscoveryTarget | null>(null);
const prevHealthRef = useRef<HealthStatus | null>(null);
const rebootPollTokenRef = useRef(0);
@@ -96,6 +105,26 @@ export function useRadioControl() {
}
}, []);
const handleDiscoverMesh = useCallback(async (target: RadioDiscoveryTarget) => {
setMeshDiscoveryLoadingTarget(target);
try {
const data = await api.discoverMesh(target);
setMeshDiscovery(data);
toast.success(
data.results.length === 0
? 'No nearby nodes responded'
: `Found ${data.results.length} nearby node${data.results.length === 1 ? '' : 's'}`
);
} catch (err) {
console.error('Failed to discover nearby nodes:', err);
toast.error('Failed to run mesh discovery', {
description: err instanceof Error ? err.message : 'Check radio connection',
});
} finally {
setMeshDiscoveryLoadingTarget(null);
}
}, []);
const handleHealthRefresh = useCallback(async () => {
try {
const data = await api.getHealth();
@@ -118,6 +147,9 @@ export function useRadioControl() {
handleDisconnect,
handleReconnect,
handleAdvertise,
meshDiscovery,
meshDiscoveryLoadingTarget,
handleDiscoverMesh,
handleHealthRefresh,
};
}

View File

@@ -85,6 +85,10 @@ function isActiveConversationMessage(
return false;
}
function isMessageConversation(conversation: Conversation | null): boolean {
return conversation?.type === 'channel' || conversation?.type === 'contact';
}
export function useRealtimeAppState({
prevHealthRef,
setHealth,
@@ -180,7 +184,11 @@ export function useRealtimeAppState({
},
onReconnect: () => {
setRawPackets([]);
triggerReconcile();
if (
!(hasNewerMessagesRef.current && isMessageConversation(activeConversationRef.current))
) {
triggerReconcile();
}
refreshUnreads();
api.getChannels().then(setChannels).catch(console.error);
fetchAllContacts()

View File

@@ -8,6 +8,7 @@ import type {
RepeaterStatusResponse,
RepeaterNeighborsResponse,
RepeaterAclResponse,
RepeaterNodeInfoResponse,
RepeaterRadioSettingsResponse,
RepeaterAdvertIntervalsResponse,
RepeaterOwnerInfoResponse,
@@ -28,6 +29,7 @@ interface ConsoleEntry {
interface PaneData {
status: RepeaterStatusResponse | null;
nodeInfo: RepeaterNodeInfoResponse | null;
neighbors: RepeaterNeighborsResponse | null;
acl: RepeaterAclResponse | null;
radioSettings: RepeaterRadioSettingsResponse | null;
@@ -49,6 +51,7 @@ const INITIAL_PANE_STATE: PaneState = { loading: false, attempt: 0, error: null,
function createInitialPaneStates(): Record<PaneName, PaneState> {
return {
status: { ...INITIAL_PANE_STATE },
nodeInfo: { ...INITIAL_PANE_STATE },
neighbors: { ...INITIAL_PANE_STATE },
acl: { ...INITIAL_PANE_STATE },
radioSettings: { ...INITIAL_PANE_STATE },
@@ -61,6 +64,7 @@ function createInitialPaneStates(): Record<PaneName, PaneState> {
function createInitialPaneData(): PaneData {
return {
status: null,
nodeInfo: null,
neighbors: null,
acl: null,
radioSettings: null,
@@ -72,6 +76,17 @@ function createInitialPaneData(): PaneData {
const repeaterDashboardCache = new Map<string, RepeaterDashboardCacheEntry>();
function getLoginToastTitle(status: string): string {
switch (status) {
case 'timeout':
return 'Login confirmation not heard';
case 'error':
return 'Login not confirmed';
default:
return 'Repeater login not confirmed';
}
}
function clonePaneData(data: PaneData): PaneData {
return { ...data };
}
@@ -79,6 +94,7 @@ function clonePaneData(data: PaneData): PaneData {
function normalizePaneStates(paneStates: Record<PaneName, PaneState>): Record<PaneName, PaneState> {
return {
status: { ...paneStates.status, loading: false },
nodeInfo: { ...paneStates.nodeInfo, loading: false },
neighbors: { ...paneStates.neighbors, loading: false },
acl: { ...paneStates.acl, loading: false },
radioSettings: { ...paneStates.radioSettings, loading: false },
@@ -136,6 +152,8 @@ function fetchPaneData(publicKey: string, pane: PaneName) {
switch (pane) {
case 'status':
return api.repeaterStatus(publicKey);
case 'nodeInfo':
return api.repeaterNodeInfo(publicKey);
case 'neighbors':
return api.repeaterNeighbors(publicKey);
case 'acl':
@@ -170,8 +188,13 @@ export interface UseRepeaterDashboardResult {
syncClock: () => Promise<void>;
}
interface UseRepeaterDashboardOptions {
hasAdvertLocation?: boolean;
}
export function useRepeaterDashboard(
activeConversation: Conversation | null
activeConversation: Conversation | null,
options: UseRepeaterDashboardOptions = {}
): UseRepeaterDashboardResult {
const conversationId =
activeConversation && activeConversation.type === 'contact' ? activeConversation.id : null;
@@ -187,6 +210,10 @@ export function useRepeaterDashboard(
const [paneStates, setPaneStates] = useState<Record<PaneName, PaneState>>(
cachedState?.paneStates ?? createInitialPaneStates
);
const paneDataRef = useRef<PaneData>(cachedState?.paneData ?? createInitialPaneData());
const paneStatesRef = useRef<Record<PaneName, PaneState>>(
cachedState?.paneStates ?? createInitialPaneStates()
);
const [consoleHistory, setConsoleHistory] = useState<ConsoleEntry[]>(
cachedState?.consoleHistory ?? []
@@ -222,6 +249,14 @@ export function useRepeaterDashboard(
});
}, [consoleHistory, conversationId, loggedIn, loginError, paneData, paneStates]);
useEffect(() => {
paneDataRef.current = paneData;
}, [paneData]);
useEffect(() => {
paneStatesRef.current = paneStates;
}, [paneStates]);
const getPublicKey = useCallback((): string | null => {
if (!activeConversation || activeConversation.type !== 'contact') return null;
return activeConversation.id;
@@ -236,13 +271,22 @@ export function useRepeaterDashboard(
setLoginLoading(true);
setLoginError(null);
try {
await api.repeaterLogin(publicKey, password);
const result = await api.repeaterLogin(publicKey, password);
if (activeIdRef.current !== conversationId) return;
setLoggedIn(true);
if (!result.authenticated) {
const msg = result.message ?? 'Repeater login was not confirmed';
setLoginError(msg);
toast.error(getLoginToastTitle(result.status), { description: msg });
}
} catch (err) {
if (activeIdRef.current !== conversationId) return;
const msg = err instanceof Error ? err.message : 'Login failed';
setLoggedIn(true);
setLoginError(msg);
toast.error('Login request failed', {
description: `${msg}. The dashboard is still available, but repeater operations may fail until a login succeeds.`,
});
} finally {
if (activeIdRef.current === conversationId) {
setLoginLoading(false);
@@ -262,27 +306,60 @@ export function useRepeaterDashboard(
if (!publicKey) return;
const conversationId = publicKey;
if (pane === 'neighbors' && !options.hasAdvertLocation) {
const nodeInfoState = paneStatesRef.current.nodeInfo;
const nodeInfoData = paneDataRef.current.nodeInfo;
const needsNodeInfoPrefetch =
nodeInfoState.error !== null ||
(nodeInfoState.fetched_at == null && nodeInfoData == null);
if (needsNodeInfoPrefetch) {
await refreshPane('nodeInfo');
if (!mountedRef.current || activeIdRef.current !== conversationId) return;
}
}
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
if (!mountedRef.current || activeIdRef.current !== conversationId) return;
const loadingState = {
loading: true,
attempt,
error: null,
fetched_at: paneStatesRef.current[pane].fetched_at ?? null,
};
paneStatesRef.current = {
...paneStatesRef.current,
[pane]: loadingState,
};
setPaneStates((prev) => ({
...prev,
[pane]: loadingState,
}));
try {
const data = await fetchPaneData(publicKey, pane);
if (!mountedRef.current || activeIdRef.current !== conversationId) return;
paneDataRef.current = {
...paneDataRef.current,
[pane]: data,
};
const successState = {
loading: false,
attempt,
error: null,
fetched_at: Date.now(),
};
paneStatesRef.current = {
...paneStatesRef.current,
[pane]: successState,
};
setPaneData((prev) => ({ ...prev, [pane]: data }));
setPaneStates((prev) => ({
...prev,
[pane]: successState,
}));
return; // Success
} catch (err) {
@@ -291,14 +368,19 @@ export function useRepeaterDashboard(
const msg = err instanceof Error ? err.message : 'Request failed';
if (attempt === MAX_RETRIES) {
const errorState = {
loading: false,
attempt,
error: msg,
fetched_at: paneStatesRef.current[pane].fetched_at ?? null,
};
paneStatesRef.current = {
...paneStatesRef.current,
[pane]: errorState,
};
setPaneStates((prev) => ({
...prev,
[pane]: errorState,
}));
toast.error(`Failed to fetch ${pane}`, { description: msg });
} else {
@@ -308,15 +390,16 @@ export function useRepeaterDashboard(
}
}
},
[getPublicKey, options.hasAdvertLocation]
);
const loadAll = useCallback(async () => {
const panes: PaneName[] = [
'status',
'nodeInfo',
'neighbors',
'radioSettings',
'acl',
'advertIntervals',
'ownerInfo',
'lppTelemetry',

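The retry loop in refreshPane above attempts each pane fetch up to MAX_RETRIES before surfacing an error toast. A minimal standalone sketch of that retry shape (fetchWithRetries, the MAX_RETRIES value, and the backoff delay are illustrative assumptions, not the actual hook code):

```typescript
// Illustrative sketch only: retry an async fetch, reporting the error
// message solely on the final failed attempt (mirrors refreshPane's shape).
const MAX_RETRIES = 3;

async function fetchWithRetries<T>(
  fetcher: () => Promise<T>,
  onFinalError: (msg: string) => void
): Promise<T | null> {
  for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
    try {
      return await fetcher();
    } catch (err) {
      const msg = err instanceof Error ? err.message : 'Request failed';
      if (attempt === MAX_RETRIES) {
        onFinalError(msg);
        return null;
      }
      // Small linear backoff between attempts (assumed policy).
      await new Promise((resolve) => setTimeout(resolve, 50 * attempt));
    }
  }
  return null;
}
```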
View File

@@ -15,6 +15,7 @@ interface UseUnreadCountsResult {
/** Tracks which conversations have unread messages that mention the user */
mentions: Record<string, boolean>;
lastMessageTimes: ConversationTimes;
unreadLastReadAts: Record<string, number | null>;
incrementUnread: (stateKey: string, hasMention?: boolean) => void;
renameConversationState: (oldStateKey: string, newStateKey: string) => void;
markAllRead: () => void;
@@ -30,6 +31,7 @@ export function useUnreadCounts(
const [unreadCounts, setUnreadCounts] = useState<Record<string, number>>({});
const [mentions, setMentions] = useState<Record<string, boolean>>({});
const [lastMessageTimes, setLastMessageTimes] = useState<ConversationTimes>(getLastMessageTimes);
const [unreadLastReadAts, setUnreadLastReadAts] = useState<Record<string, number | null>>({});
// Track active conversation via ref so applyUnreads can filter without
// destabilizing the callback chain (avoids re-creating fetchUnreads on
@@ -62,6 +64,8 @@ export function useUnreadCounts(
setMentions(data.mentions);
}
setUnreadLastReadAts(data.last_read_ats);
if (Object.keys(data.last_message_times).length > 0) {
for (const [key, ts] of Object.entries(data.last_message_times)) {
setLastMessageTime(key, ts);
@@ -200,6 +204,7 @@ export function useUnreadCounts(
// Update local state immediately
setUnreadCounts({});
setMentions({});
setUnreadLastReadAts({});
// Persist to server with single bulk request
api.markAllRead().catch((err) => {
@@ -227,6 +232,7 @@ export function useUnreadCounts(
unreadCounts,
mentions,
lastMessageTimes,
unreadLastReadAts,
incrementUnread,
renameConversationState,
markAllRead,

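The hook above tracks unread counts, mention flags, and now per-conversation last-read timestamps. A minimal sketch of the optimistic local updates it implies (assumed shapes for illustration, not the actual hook internals):

```typescript
// Illustrative sketch of the optimistic local bookkeeping in useUnreadCounts.
type Counts = Record<string, number>;

// incrementUnread: bump the count for one conversation state key.
function bumpUnread(counts: Counts, stateKey: string): Counts {
  return { ...counts, [stateKey]: (counts[stateKey] ?? 0) + 1 };
}

// markAllRead: clear everything locally before the single bulk server call.
function clearAllUnread(): { counts: Counts; mentions: Record<string, boolean> } {
  return { counts: {}, mentions: {} };
}
```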
View File

@@ -0,0 +1,861 @@
import { PayloadType } from '@michaelhart/meshcore-decoder';
import {
CONTACT_TYPE_REPEATER,
type Contact,
type ContactAdvertPathSummary,
type RadioConfig,
type RawPacket,
} from '../types';
import {
analyzeRepeaterTraffic,
buildAmbiguousRepeaterLabel,
buildAmbiguousRepeaterNodeId,
buildLinkKey,
compactPathSteps,
dedupeConsecutive,
getNodeType,
getPacketLabel,
parsePacket,
recordTrafficObservation,
type NodeType,
type ParsedPacket,
type RepeaterTrafficData,
} from '../utils/visualizerUtils';
import { normalizePacketTimestampMs } from '../components/visualizer/shared';
interface ContactIndex {
byPrefix12: Map<string, Contact>;
byName: Map<string, Contact>;
byPrefix: Map<string, Contact[]>;
}
interface AdvertPathIndex {
byRepeater: Map<string, ContactAdvertPathSummary['paths']>;
}
export interface PacketNetworkContext {
advertPathIndex: AdvertPathIndex;
contactIndex: ContactIndex;
myPrefix: string | null;
splitAmbiguousByTraffic: boolean;
useAdvertPathHints: boolean;
}
export interface PacketNetworkVisibilityOptions {
showAmbiguousNodes: boolean;
showAmbiguousPaths: boolean;
collapseLikelyKnownSiblingRepeaters: boolean;
}
export interface PacketNetworkNode {
id: string;
name: string | null;
type: NodeType;
isAmbiguous: boolean;
lastActivity: number;
lastActivityReason?: string;
lastSeen?: number | null;
probableIdentity?: string | null;
probableIdentityNodeId?: string | null;
ambiguousNames?: string[];
}
export interface PacketNetworkLink {
lastActivity: number;
sourceId: string;
targetId: string;
}
export interface ProjectedPacketNetworkLink extends PacketNetworkLink {
hasDirectObservation: boolean;
hasHiddenIntermediate: boolean;
hiddenHopLabels: string[];
}
export interface PacketNetworkObservation {
activityAtMs: number;
nodes: string[];
}
export interface PacketNetworkState {
links: Map<string, PacketNetworkLink>;
neighborIds: Map<string, Set<string>>;
nodes: Map<string, PacketNetworkNode>;
observations: PacketNetworkObservation[];
trafficPatterns: Map<string, RepeaterTrafficData>;
}
export interface PacketNetworkIngestResult {
activityAtMs: number;
canonicalPath: string[];
label: ReturnType<typeof getPacketLabel>;
parsed: ParsedPacket;
}
export interface ProjectedPacketNetworkPath {
dashedLinkDetails: Map<string, string[]>;
nodes: string[];
}
export interface PacketNetworkProjection {
links: Map<string, ProjectedPacketNetworkLink>;
nodes: Map<string, PacketNetworkNode>;
renderedNodeIds: Set<string>;
}
export function buildPacketNetworkContext({
config,
contacts,
repeaterAdvertPaths,
splitAmbiguousByTraffic,
useAdvertPathHints,
}: {
config: RadioConfig | null;
contacts: Contact[];
repeaterAdvertPaths: ContactAdvertPathSummary[];
splitAmbiguousByTraffic: boolean;
useAdvertPathHints: boolean;
}): PacketNetworkContext {
const byPrefix12 = new Map<string, Contact>();
const byName = new Map<string, Contact>();
const byPrefix = new Map<string, Contact[]>();
for (const contact of contacts) {
const prefix12 = contact.public_key.slice(0, 12).toLowerCase();
byPrefix12.set(prefix12, contact);
if (contact.name && !byName.has(contact.name)) {
byName.set(contact.name, contact);
}
for (let len = 1; len <= 12; len++) {
const prefix = prefix12.slice(0, len);
const matches = byPrefix.get(prefix);
if (matches) {
matches.push(contact);
} else {
byPrefix.set(prefix, [contact]);
}
}
}
const byRepeater = new Map<string, ContactAdvertPathSummary['paths']>();
for (const summary of repeaterAdvertPaths) {
const key = summary.public_key.slice(0, 12).toLowerCase();
byRepeater.set(key, summary.paths);
}
return {
contactIndex: { byPrefix12, byName, byPrefix },
advertPathIndex: { byRepeater },
myPrefix: config?.public_key?.slice(0, 12).toLowerCase() || null,
splitAmbiguousByTraffic,
useAdvertPathHints,
};
}
export function createPacketNetworkState(selfName: string = 'Me'): PacketNetworkState {
const now = Date.now();
return {
nodes: new Map([
[
'self',
{
id: 'self',
name: selfName,
type: 'self',
isAmbiguous: false,
lastActivity: now,
},
],
]),
links: new Map(),
neighborIds: new Map(),
observations: [],
trafficPatterns: new Map(),
};
}
export function ensureSelfNode(state: PacketNetworkState, selfName: string = 'Me'): void {
const existing = state.nodes.get('self');
if (existing) {
existing.name = selfName;
return;
}
state.nodes.set('self', {
id: 'self',
name: selfName,
type: 'self',
isAmbiguous: false,
lastActivity: Date.now(),
});
}
export function clearPacketNetworkState(
state: PacketNetworkState,
{ selfName = 'Me' }: { selfName?: string } = {}
): void {
state.links.clear();
state.neighborIds.clear();
state.observations = [];
state.trafficPatterns.clear();
const selfNode = state.nodes.get('self');
state.nodes.clear();
state.nodes.set('self', {
id: 'self',
name: selfName,
type: 'self',
isAmbiguous: false,
lastActivity: Date.now(),
lastActivityReason: undefined,
lastSeen: null,
probableIdentity: undefined,
probableIdentityNodeId: undefined,
ambiguousNames: undefined,
});
if (selfNode?.name && selfNode.name !== selfName) {
state.nodes.get('self')!.name = selfName;
}
}
function addOrUpdateNode(
state: PacketNetworkState,
{
activityAtMs,
ambiguousNames,
id,
isAmbiguous,
lastSeen,
name,
probableIdentity,
probableIdentityNodeId,
type,
}: {
activityAtMs: number;
ambiguousNames?: string[];
id: string;
isAmbiguous: boolean;
lastSeen?: number | null;
name: string | null;
probableIdentity?: string | null;
probableIdentityNodeId?: string | null;
type: NodeType;
}
): void {
const existing = state.nodes.get(id);
if (existing) {
existing.lastActivity = Math.max(existing.lastActivity, activityAtMs);
if (name) existing.name = name;
if (probableIdentity !== undefined) existing.probableIdentity = probableIdentity;
if (probableIdentityNodeId !== undefined) {
existing.probableIdentityNodeId = probableIdentityNodeId;
}
if (ambiguousNames) existing.ambiguousNames = ambiguousNames;
if (lastSeen !== undefined) existing.lastSeen = lastSeen;
return;
}
state.nodes.set(id, {
id,
name,
type,
isAmbiguous,
lastActivity: activityAtMs,
probableIdentity,
probableIdentityNodeId,
ambiguousNames,
lastSeen,
});
}
function addCanonicalLink(
state: PacketNetworkState,
sourceId: string,
targetId: string,
activityAtMs: number
): void {
const key = buildLinkKey(sourceId, targetId);
const existing = state.links.get(key);
if (existing) {
existing.lastActivity = Math.max(existing.lastActivity, activityAtMs);
} else {
state.links.set(key, { sourceId, targetId, lastActivity: activityAtMs });
}
}
function upsertNeighbor(state: PacketNetworkState, sourceId: string, targetId: string): void {
const ensureSet = (id: string) => {
const existing = state.neighborIds.get(id);
if (existing) return existing;
const created = new Set<string>();
state.neighborIds.set(id, created);
return created;
};
ensureSet(sourceId).add(targetId);
ensureSet(targetId).add(sourceId);
}
function pickLikelyRepeaterByAdvertPath(
context: PacketNetworkContext,
candidates: Contact[],
nextPrefix: string | null
): Contact | null {
const nextHop = nextPrefix?.toLowerCase() ?? null;
const scored = candidates
.map((candidate) => {
const prefix12 = candidate.public_key.slice(0, 12).toLowerCase();
const paths = context.advertPathIndex.byRepeater.get(prefix12) ?? [];
let matchScore = 0;
let totalScore = 0;
for (const path of paths) {
totalScore += path.heard_count;
const pathNextHop = path.next_hop?.toLowerCase() ?? null;
if (pathNextHop === nextHop) {
matchScore += path.heard_count;
}
}
return { candidate, matchScore, totalScore };
})
.filter((entry) => entry.totalScore > 0)
.sort(
(a, b) =>
b.matchScore - a.matchScore ||
b.totalScore - a.totalScore ||
a.candidate.public_key.localeCompare(b.candidate.public_key)
);
if (scored.length === 0) return null;
const top = scored[0];
const second = scored[1] ?? null;
if (top.matchScore < 2) return null;
if (second && top.matchScore < second.matchScore * 2) return null;
return top.candidate;
}
function resolveNode(
state: PacketNetworkState,
context: PacketNetworkContext,
source: { type: 'prefix' | 'pubkey' | 'name'; value: string },
isRepeater: boolean,
showAmbiguous: boolean,
activityAtMs: number,
trafficContext?: { packetSource: string | null; nextPrefix: string | null }
): string | null {
if (source.type === 'pubkey') {
if (source.value.length < 12) return null;
const nodeId = source.value.slice(0, 12).toLowerCase();
if (context.myPrefix && nodeId === context.myPrefix) return 'self';
const contact = context.contactIndex.byPrefix12.get(nodeId);
addOrUpdateNode(state, {
id: nodeId,
name: contact?.name || null,
type: getNodeType(contact),
isAmbiguous: false,
lastSeen: contact?.last_seen,
activityAtMs,
});
return nodeId;
}
if (source.type === 'name') {
const contact = context.contactIndex.byName.get(source.value) ?? null;
if (contact) {
const nodeId = contact.public_key.slice(0, 12).toLowerCase();
if (context.myPrefix && nodeId === context.myPrefix) return 'self';
addOrUpdateNode(state, {
id: nodeId,
name: contact.name,
type: getNodeType(contact),
isAmbiguous: false,
lastSeen: contact.last_seen,
activityAtMs,
});
return nodeId;
}
const nodeId = `name:${source.value}`;
addOrUpdateNode(state, {
id: nodeId,
name: source.value,
type: 'client',
isAmbiguous: false,
activityAtMs,
});
return nodeId;
}
const lookupValue = source.value.toLowerCase();
const matches = context.contactIndex.byPrefix.get(lookupValue) ?? [];
const contact = matches.length === 1 ? matches[0] : null;
if (contact) {
const nodeId = contact.public_key.slice(0, 12).toLowerCase();
if (context.myPrefix && nodeId === context.myPrefix) return 'self';
addOrUpdateNode(state, {
id: nodeId,
name: contact.name,
type: getNodeType(contact),
isAmbiguous: false,
lastSeen: contact.last_seen,
activityAtMs,
});
return nodeId;
}
if (!showAmbiguous) {
return null;
}
const filtered = isRepeater
? matches.filter((candidate) => candidate.type === CONTACT_TYPE_REPEATER)
: matches.filter((candidate) => candidate.type !== CONTACT_TYPE_REPEATER);
if (filtered.length === 1) {
const only = filtered[0];
const nodeId = only.public_key.slice(0, 12).toLowerCase();
addOrUpdateNode(state, {
id: nodeId,
name: only.name,
type: getNodeType(only),
isAmbiguous: false,
lastSeen: only.last_seen,
activityAtMs,
});
return nodeId;
}
if (filtered.length === 0 && !isRepeater) {
return null;
}
const names = filtered.map((candidate) => candidate.name || candidate.public_key.slice(0, 8));
const lastSeen = filtered.reduce(
(max, candidate) =>
candidate.last_seen && (!max || candidate.last_seen > max) ? candidate.last_seen : max,
null as number | null
);
let nodeId = buildAmbiguousRepeaterNodeId(lookupValue);
let displayName = buildAmbiguousRepeaterLabel(lookupValue);
let probableIdentity: string | null = null;
let probableIdentityNodeId: string | null = null;
let ambiguousNames = names.length > 0 ? names : undefined;
if (context.useAdvertPathHints && isRepeater && trafficContext) {
const likely = pickLikelyRepeaterByAdvertPath(context, filtered, trafficContext.nextPrefix);
if (likely) {
const likelyName = likely.name || likely.public_key.slice(0, 12).toUpperCase();
probableIdentity = likelyName;
probableIdentityNodeId = likely.public_key.slice(0, 12).toLowerCase();
displayName = likelyName;
ambiguousNames = filtered
.filter((candidate) => candidate.public_key !== likely.public_key)
.map((candidate) => candidate.name || candidate.public_key.slice(0, 8));
}
}
if (context.splitAmbiguousByTraffic && isRepeater && trafficContext) {
const normalizedNext = trafficContext.nextPrefix?.toLowerCase() ?? null;
if (trafficContext.packetSource) {
recordTrafficObservation(
state.trafficPatterns,
lookupValue,
trafficContext.packetSource,
normalizedNext
);
}
const trafficData = state.trafficPatterns.get(lookupValue);
if (trafficData) {
const analysis = analyzeRepeaterTraffic(trafficData);
if (analysis.shouldSplit && normalizedNext) {
nodeId = buildAmbiguousRepeaterNodeId(lookupValue, normalizedNext);
if (!probableIdentity) {
displayName = buildAmbiguousRepeaterLabel(lookupValue, normalizedNext);
}
}
}
}
addOrUpdateNode(state, {
id: nodeId,
name: displayName,
type: isRepeater ? 'repeater' : 'client',
isAmbiguous: true,
probableIdentity,
probableIdentityNodeId,
ambiguousNames,
lastSeen,
activityAtMs,
});
return nodeId;
}
export function buildCanonicalPathForPacket(
state: PacketNetworkState,
context: PacketNetworkContext,
parsed: ParsedPacket,
packet: RawPacket,
activityAtMs: number
): string[] {
const path: string[] = [];
let packetSource: string | null = null;
const isDm = parsed.payloadType === PayloadType.TextMessage;
const isOutgoingDm =
isDm && !!context.myPrefix && parsed.srcHash?.toLowerCase() === context.myPrefix;
if (parsed.payloadType === PayloadType.Advert && parsed.advertPubkey) {
const nodeId = resolveNode(
state,
context,
{ type: 'pubkey', value: parsed.advertPubkey },
false,
false,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
} else if (parsed.payloadType === PayloadType.AnonRequest && parsed.anonRequestPubkey) {
const nodeId = resolveNode(
state,
context,
{ type: 'pubkey', value: parsed.anonRequestPubkey },
false,
false,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
} else if (parsed.payloadType === PayloadType.TextMessage && parsed.srcHash) {
if (context.myPrefix && parsed.srcHash.toLowerCase() === context.myPrefix) {
path.push('self');
packetSource = 'self';
} else {
const nodeId = resolveNode(
state,
context,
{ type: 'prefix', value: parsed.srcHash },
false,
true,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
}
} else if (parsed.payloadType === PayloadType.GroupText) {
const senderName = parsed.groupTextSender || packet.decrypted_info?.sender;
if (senderName) {
const nodeId = resolveNode(
state,
context,
{ type: 'name', value: senderName },
false,
false,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
}
}
for (let i = 0; i < parsed.pathBytes.length; i++) {
const nodeId = resolveNode(
state,
context,
{ type: 'prefix', value: parsed.pathBytes[i] },
true,
true,
activityAtMs,
{ packetSource, nextPrefix: parsed.pathBytes[i + 1] || null }
);
if (nodeId) {
path.push(nodeId);
}
}
if (parsed.payloadType === PayloadType.TextMessage && parsed.dstHash) {
if (context.myPrefix && parsed.dstHash.toLowerCase() === context.myPrefix) {
path.push('self');
} else if (!isOutgoingDm && path.length > 0) {
path.push('self');
}
} else if (path.length > 0) {
path.push('self');
}
return dedupeConsecutive(path);
}
export function ingestPacketIntoPacketNetwork(
state: PacketNetworkState,
context: PacketNetworkContext,
packet: RawPacket
): PacketNetworkIngestResult | null {
const parsed = parsePacket(packet.data);
if (!parsed) return null;
const activityAtMs = normalizePacketTimestampMs(packet.timestamp);
const canonicalPath = buildCanonicalPathForPacket(state, context, parsed, packet, activityAtMs);
if (canonicalPath.length < 2) {
return null;
}
const label = getPacketLabel(parsed.payloadType);
for (let i = 0; i < canonicalPath.length; i++) {
const node = state.nodes.get(canonicalPath[i]);
if (node && node.id !== 'self') {
node.lastActivityReason = i === 0 ? `${label} source` : `Relayed ${label}`;
}
}
state.observations.push({ nodes: canonicalPath, activityAtMs });
for (let i = 0; i < canonicalPath.length - 1; i++) {
if (canonicalPath[i] !== canonicalPath[i + 1]) {
addCanonicalLink(state, canonicalPath[i], canonicalPath[i + 1], activityAtMs);
upsertNeighbor(state, canonicalPath[i], canonicalPath[i + 1]);
}
}
return { parsed, label, canonicalPath, activityAtMs };
}
export function isPacketNetworkNodeVisible(
node: PacketNetworkNode | undefined,
visibility: PacketNetworkVisibilityOptions
): boolean {
if (!node) return false;
if (node.id === 'self') return true;
if (!node.isAmbiguous) return true;
return node.type === 'repeater' ? visibility.showAmbiguousPaths : visibility.showAmbiguousNodes;
}
function buildKnownSiblingRepeaterAliasMap(
state: PacketNetworkState,
visibility: PacketNetworkVisibilityOptions
): Map<string, string> {
if (!visibility.collapseLikelyKnownSiblingRepeaters || !visibility.showAmbiguousPaths) {
return new Map();
}
const knownRepeaterNextHops = new Map<string, Set<string>>();
for (const observation of state.observations) {
for (let i = 0; i < observation.nodes.length - 1; i++) {
const currentNode = state.nodes.get(observation.nodes[i]);
if (!currentNode || currentNode.type !== 'repeater' || currentNode.isAmbiguous) {
continue;
}
const nextNodeId = observation.nodes[i + 1];
const existing = knownRepeaterNextHops.get(currentNode.id);
if (existing) {
existing.add(nextNodeId);
} else {
knownRepeaterNextHops.set(currentNode.id, new Set([nextNodeId]));
}
}
}
const aliases = new Map<string, string>();
for (const observation of state.observations) {
for (let i = 0; i < observation.nodes.length - 1; i++) {
const currentNodeId = observation.nodes[i];
const currentNode = state.nodes.get(currentNodeId);
if (
!currentNode ||
currentNode.type !== 'repeater' ||
!currentNode.isAmbiguous ||
!currentNode.probableIdentityNodeId
) {
continue;
}
const probableNode = state.nodes.get(currentNode.probableIdentityNodeId);
if (!probableNode || probableNode.type !== 'repeater' || probableNode.isAmbiguous) {
continue;
}
const nextNodeId = observation.nodes[i + 1];
const probableNextHops = knownRepeaterNextHops.get(probableNode.id);
if (probableNextHops?.has(nextNodeId)) {
aliases.set(currentNodeId, probableNode.id);
}
}
}
return aliases;
}
function projectCanonicalPathWithAliases(
state: PacketNetworkState,
canonicalPath: string[],
visibility: PacketNetworkVisibilityOptions,
repeaterAliases: Map<string, string>
): ProjectedPacketNetworkPath {
const projected = compactPathSteps(
canonicalPath.map((nodeId, index) => {
const node = state.nodes.get(nodeId);
const visible = isPacketNetworkNodeVisible(node, visibility);
return {
nodeId: visible ? (repeaterAliases.get(nodeId) ?? nodeId) : null,
// Only hidden repeater hops should imply a bridged dashed segment.
// Hidden sender/recipient endpoints should disappear with their own edge.
markHiddenLinkWhenOmitted:
!visible &&
!!node &&
node.type === 'repeater' &&
index > 0 &&
index < canonicalPath.length - 1,
hiddenLabel: null,
};
})
);
return {
nodes: dedupeConsecutive(projected.nodes),
dashedLinkDetails: projected.dashedLinkDetails,
};
}
export function projectCanonicalPath(
state: PacketNetworkState,
canonicalPath: string[],
visibility: PacketNetworkVisibilityOptions
): ProjectedPacketNetworkPath {
return projectCanonicalPathWithAliases(
state,
canonicalPath,
visibility,
buildKnownSiblingRepeaterAliasMap(state, visibility)
);
}
export function projectPacketNetwork(
state: PacketNetworkState,
visibility: PacketNetworkVisibilityOptions
): PacketNetworkProjection {
const repeaterAliases = buildKnownSiblingRepeaterAliasMap(state, visibility);
const nodes = new Map<string, PacketNetworkNode>();
const selfNode = state.nodes.get('self');
if (selfNode) {
nodes.set('self', selfNode);
}
const links = new Map<string, ProjectedPacketNetworkLink>();
for (const observation of state.observations) {
const projected = projectCanonicalPathWithAliases(
state,
observation.nodes,
visibility,
repeaterAliases
);
if (projected.nodes.length < 2) continue;
for (const nodeId of projected.nodes) {
const node = state.nodes.get(nodeId);
if (node) {
nodes.set(nodeId, node);
}
}
for (let i = 0; i < projected.nodes.length - 1; i++) {
const sourceId = projected.nodes[i];
const targetId = projected.nodes[i + 1];
if (sourceId === targetId) continue;
const key = buildLinkKey(sourceId, targetId);
const hiddenIntermediate = projected.dashedLinkDetails.has(key);
const existing = links.get(key);
if (existing) {
existing.lastActivity = Math.max(existing.lastActivity, observation.activityAtMs);
if (hiddenIntermediate) {
existing.hasHiddenIntermediate = true;
for (const label of projected.dashedLinkDetails.get(key) ?? []) {
if (!existing.hiddenHopLabels.includes(label)) {
existing.hiddenHopLabels.push(label);
}
}
} else {
existing.hasDirectObservation = true;
}
continue;
}
links.set(key, {
sourceId,
targetId,
lastActivity: observation.activityAtMs,
hasDirectObservation: !hiddenIntermediate,
hasHiddenIntermediate: hiddenIntermediate,
hiddenHopLabels: [...(projected.dashedLinkDetails.get(key) ?? [])],
});
}
}
return {
nodes,
links,
renderedNodeIds: new Set(nodes.keys()),
};
}
export function prunePacketNetworkState(state: PacketNetworkState, cutoff: number): boolean {
let pruned = false;
for (const [id, node] of state.nodes) {
if (id === 'self') continue;
if (node.lastActivity < cutoff) {
state.nodes.delete(id);
pruned = true;
}
}
if (!pruned) {
return false;
}
for (const [key, link] of state.links) {
if (!state.nodes.has(link.sourceId) || !state.nodes.has(link.targetId)) {
state.links.delete(key);
}
}
state.observations = state.observations.filter((observation) =>
observation.nodes.every((nodeId) => state.nodes.has(nodeId))
);
state.neighborIds.clear();
for (const link of state.links.values()) {
upsertNeighbor(state, link.sourceId, link.targetId);
}
return true;
}
export function snapshotNeighborIds(state: PacketNetworkState): Map<string, string[]> {
return new Map(
Array.from(state.neighborIds.entries()).map(([nodeId, neighborIds]) => [
nodeId,
Array.from(neighborIds).sort(),
])
);
}
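This module leans on small helpers imported from visualizerUtils, such as dedupeConsecutive and buildLinkKey. Plausible minimal implementations, shown purely for illustration (the real utilities may differ; in particular, buildLinkKey may or may not be order-independent):

```typescript
// Assumed helper sketches; not the actual visualizerUtils source.

// Collapse runs of identical consecutive node ids: ['a','a','b'] -> ['a','b'].
function dedupeConsecutive(ids: string[]): string[] {
  return ids.filter((id, index) => index === 0 || id !== ids[index - 1]);
}

// Build a stable key for an undirected link so A->B and B->A collide.
function buildLinkKey(sourceId: string, targetId: string): string {
  return sourceId < targetId
    ? `${sourceId}|${targetId}`
    : `${targetId}|${sourceId}`;
}
```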

View File

@@ -200,6 +200,25 @@ describe('fetchJson (via api methods)', () => {
expect.objectContaining({ 'Content-Type': 'application/json' })
);
});
it('omits Content-Type on POST requests without a body', async () => {
installMockFetch();
mockFetch.mockResolvedValueOnce({
ok: true,
json: () =>
Promise.resolve({
contact: null,
forward_path: { path: '', path_len: 0, path_hash_mode: 0 },
return_path: { path: '', path_len: 0, path_hash_mode: 0 },
}),
});
await api.requestPathDiscovery('aa'.repeat(32));
const [, options] = mockFetch.mock.calls[0];
expect(options.method).toBe('POST');
expect(options.headers).not.toHaveProperty('Content-Type');
});
});
describe('HTTP methods and body', () => {
@@ -257,6 +276,21 @@ describe('fetchJson (via api methods)', () => {
expect(JSON.parse(options.body)).toEqual({ private_key: 'my-secret-key' });
});
it('sends POST with JSON body for mesh discovery', async () => {
installMockFetch();
mockFetch.mockResolvedValueOnce({
ok: true,
json: () => Promise.resolve({ target: 'repeaters', duration_seconds: 8, results: [] }),
});
await api.discoverMesh('repeaters');
const [url, options] = mockFetch.mock.calls[0];
expect(url).toBe('/api/radio/discover');
expect(options.method).toBe('POST');
expect(JSON.parse(options.body)).toEqual({ target: 'repeaters' });
});
it('sends DELETE for deleteContact', async () => {
installMockFetch();
mockFetch.mockResolvedValueOnce({

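The first test above ('omits Content-Type on POST requests without a body') pins down a header rule: only body-carrying requests get a JSON Content-Type. A sketch of the logic that behavior implies (buildRequestInit is a hypothetical name; the actual fetchJson implementation may differ):

```typescript
// Hypothetical sketch of the header rule the test asserts: attach a JSON
// Content-Type only when there is actually a JSON body to send.
interface SketchRequestInit {
  method: string;
  headers?: Record<string, string>;
  body?: string;
}

function buildRequestInit(method: string, body?: unknown): SketchRequestInit {
  const init: SketchRequestInit = { method };
  if (body !== undefined) {
    init.headers = { 'Content-Type': 'application/json' };
    init.body = JSON.stringify(body);
  }
  return init;
}
```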
View File

@@ -69,6 +69,7 @@ vi.mock('../hooks', async (importOriginal) => {
fetchOlderMessages: mocks.hookFns.fetchOlderMessages,
fetchNewerMessages: vi.fn(async () => {}),
jumpToBottom: vi.fn(),
reloadCurrentConversation: vi.fn(),
addMessageIfNew: mocks.hookFns.addMessageIfNew,
updateMessageAck: mocks.hookFns.updateMessageAck,
triggerReconcile: mocks.hookFns.triggerReconcile,
@@ -77,7 +78,9 @@ vi.mock('../hooks', async (importOriginal) => {
unreadCounts: {},
mentions: {},
lastMessageTimes: {},
unreadLastReadAts: {},
incrementUnread: mocks.hookFns.incrementUnread,
renameConversationState: vi.fn(),
markAllRead: mocks.hookFns.markAllRead,
trackNewMessage: mocks.hookFns.trackNewMessage,
refreshUnreads: mocks.hookFns.refreshUnreads,

View File

@@ -42,6 +42,7 @@ vi.mock('../hooks', async (importOriginal) => {
fetchOlderMessages: vi.fn(async () => {}),
fetchNewerMessages: vi.fn(async () => {}),
jumpToBottom: vi.fn(),
reloadCurrentConversation: vi.fn(),
addMessageIfNew: vi.fn(),
updateMessageAck: vi.fn(),
triggerReconcile: vi.fn(),
@@ -51,7 +52,9 @@ vi.mock('../hooks', async (importOriginal) => {
unreadCounts: {},
mentions: {},
lastMessageTimes: {},
unreadLastReadAts: {},
incrementUnread: vi.fn(),
renameConversationState: vi.fn(),
markAllRead: vi.fn(),
trackNewMessage: vi.fn(),
refreshUnreads: vi.fn(),

View File

@@ -30,19 +30,29 @@ vi.mock('../hooks', async (importOriginal) => {
messagesLoading: false,
loadingOlder: false,
hasOlderMessages: false,
hasNewerMessages: false,
loadingNewer: false,
hasNewerMessagesRef: { current: false },
setMessages: vi.fn(),
fetchMessages: vi.fn(async () => {}),
fetchOlderMessages: vi.fn(async () => {}),
fetchNewerMessages: vi.fn(async () => {}),
jumpToBottom: vi.fn(),
reloadCurrentConversation: vi.fn(),
addMessageIfNew: vi.fn(),
updateMessageAck: vi.fn(),
triggerReconcile: vi.fn(),
}),
useUnreadCounts: () => ({
unreadCounts: {},
mentions: {},
lastMessageTimes: {},
unreadLastReadAts: {},
incrementUnread: vi.fn(),
renameConversationState: vi.fn(),
markAllRead: vi.fn(),
trackNewMessage: vi.fn(),
refreshUnreads: vi.fn(async () => {}),
}),
getMessageContentKey: () => 'content-key',
};

View File

@@ -1,8 +1,9 @@
import { fireEvent, render, screen, waitFor, within } from '@testing-library/react';
import { describe, expect, it, vi } from 'vitest';
import { ChatHeader } from '../components/ChatHeader';
import type { Channel, Contact, Conversation, Favorite, PathDiscoveryResponse } from '../types';
import { PUBLIC_CHANNEL_KEY } from '../utils/publicChannel';
function makeChannel(key: string, name: string, isHashtag: boolean): Channel {
return { key, name, is_hashtag: isHashtag, on_radio: false, last_read_at: null };
@@ -18,6 +19,9 @@ const baseProps = {
notificationsEnabled: false,
notificationsPermission: 'granted' as const,
onTrace: noop,
onPathDiscovery: vi.fn(async () => {
throw new Error('unused');
}) as (_: string) => Promise<PathDiscoveryResponse>,
onToggleNotifications: noop,
onToggleFavorite: noop,
onSetChannelFloodScopeOverride: noop,
@@ -166,12 +170,113 @@ describe('ChatHeader key visibility', () => {
expect(onToggleNotifications).toHaveBeenCalledTimes(1);
});
it('hides the delete button for the canonical Public channel', () => {
const channel = makeChannel(PUBLIC_CHANNEL_KEY, 'Public', false);
const conversation: Conversation = { type: 'channel', id: PUBLIC_CHANNEL_KEY, name: 'Public' };
render(<ChatHeader {...baseProps} conversation={conversation} channels={[channel]} />);
expect(screen.queryByRole('button', { name: 'Delete' })).not.toBeInTheDocument();
});
it('still shows the delete button for non-canonical channels named Public', () => {
const key = 'AB'.repeat(16);
const channel = makeChannel(key, 'Public', false);
const conversation: Conversation = { type: 'channel', id: key, name: 'Public' };
render(<ChatHeader {...baseProps} conversation={conversation} channels={[channel]} />);
expect(screen.getByRole('button', { name: 'Delete' })).toBeInTheDocument();
});
it('opens path discovery modal for contacts and runs the request on demand', async () => {
const pubKey = '21'.repeat(32);
const contact: Contact = {
public_key: pubKey,
name: 'Alice',
type: 1,
flags: 0,
last_path: 'AA',
last_path_len: 1,
out_path_hash_mode: 0,
last_advert: null,
lat: null,
lon: null,
last_seen: null,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
};
const conversation: Conversation = { type: 'contact', id: pubKey, name: 'Alice' };
const onPathDiscovery = vi.fn().mockResolvedValue({
contact,
forward_path: { path: 'AA', path_len: 1, path_hash_mode: 0 },
return_path: { path: '', path_len: 0, path_hash_mode: 0 },
} satisfies PathDiscoveryResponse);
render(
<ChatHeader
{...baseProps}
conversation={conversation}
channels={[]}
contacts={[contact]}
onPathDiscovery={onPathDiscovery}
/>
);
fireEvent.click(screen.getByRole('button', { name: 'Path Discovery' }));
expect(await screen.findByRole('dialog')).toBeInTheDocument();
fireEvent.click(screen.getByRole('button', { name: 'Run path discovery' }));
await waitFor(() => {
expect(onPathDiscovery).toHaveBeenCalledWith(pubKey);
});
});
it('shows an override warning in the path discovery modal when forced routing is set', async () => {
const pubKey = '31'.repeat(32);
const contact: Contact = {
public_key: pubKey,
name: 'Alice',
type: 1,
flags: 0,
last_path: 'AA',
last_path_len: 1,
out_path_hash_mode: 0,
route_override_path: 'BBDD',
route_override_len: 2,
route_override_hash_mode: 0,
last_advert: null,
lat: null,
lon: null,
last_seen: null,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
};
const conversation: Conversation = { type: 'contact', id: pubKey, name: 'Alice' };
render(
<ChatHeader {...baseProps} conversation={conversation} channels={[]} contacts={[contact]} />
);
fireEvent.click(screen.getByRole('button', { name: 'Path Discovery' }));
expect(await screen.findByRole('dialog')).toBeInTheDocument();
expect(screen.getByText(/current learned route: 1 hop \(AA\)/i)).toBeInTheDocument();
expect(screen.getByText(/current forced route: 2 hops \(BB -> DD\)/i)).toBeInTheDocument();
expect(screen.getByText(/forced route override is currently set/i)).toBeInTheDocument();
expect(screen.getByText(/clearing the forced route afterward is enough/i)).toBeInTheDocument();
});
it('opens the regional override modal and applies the entered region', async () => {
const key = 'CD'.repeat(16);
const channel = makeChannel(key, '#flightless', true);
const conversation: Conversation = { type: 'channel', id: key, name: '#flightless' };
const onSetChannelFloodScopeOverride = vi.fn();
render(
<ChatHeader
@@ -184,8 +289,10 @@ describe('ChatHeader key visibility', () => {
fireEvent.click(screen.getByTitle('Set regional override'));
expect(await screen.findByRole('dialog')).toBeInTheDocument();
fireEvent.change(screen.getByLabelText('Region'), { target: { value: 'Esperance' } });
fireEvent.click(screen.getByRole('button', { name: 'Use Esperance region for #flightless' }));
expect(onSetChannelFloodScopeOverride).toHaveBeenCalledWith(key, 'Esperance');
});
});

View File

@@ -117,6 +117,9 @@ function createProps(overrides: Partial<React.ComponentProps<typeof Conversation
loadingNewer: false,
messageInputRef: { current: null },
onTrace: vi.fn(async () => {}),
onPathDiscovery: vi.fn(async () => {
throw new Error('unused');
}),
onToggleFavorite: vi.fn(async () => {}),
onDeleteContact: vi.fn(async () => {}),
onDeleteChannel: vi.fn(async () => {}),

View File

@@ -52,4 +52,43 @@ describe('MapView', () => {
).toBeInTheDocument();
expect(screen.getByText('Last heard: Never heard by this server')).toBeInTheDocument();
});
it('keeps the 7-day cutoff stable for the lifetime of the mounted map', () => {
vi.useFakeTimers();
try {
vi.setSystemTime(new Date('2026-03-15T12:00:00Z'));
const contact: Contact = {
public_key: 'bb'.repeat(32),
name: 'Almost Stale',
type: 1,
flags: 0,
last_path: null,
last_path_len: -1,
out_path_hash_mode: -1,
route_override_path: null,
route_override_len: null,
route_override_hash_mode: null,
last_advert: null,
lat: 41,
lon: -73,
last_seen: Math.floor(Date.now() / 1000) - 7 * 24 * 60 * 60 + 60,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
};
const { rerender } = render(<MapView contacts={[contact]} focusedKey={null} />);
expect(screen.getByText(/showing 1 contact heard in the last 7 days/i)).toBeInTheDocument();
vi.advanceTimersByTime(2 * 60 * 1000);
rerender(<MapView contacts={[contact]} focusedKey={null} />);
expect(screen.getByText(/showing 1 contact heard in the last 7 days/i)).toBeInTheDocument();
} finally {
vi.useRealTimers();
}
});
});
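The behavior this fake-timer test pins down — a staleness cutoff that does not drift as time passes — can be sketched as a closure that captures "now" once. This is a hypothetical illustration, not MapView's actual implementation; the `createStaleCutoff` name and the injectable clock are assumptions:

```typescript
// Hypothetical sketch: the cutoff is computed once when the view is created
// and reused on every re-render, so a contact heard just inside the window
// does not silently drop off the map while it stays mounted.
function createStaleCutoff(days: number, nowMs: () => number = Date.now): () => number {
  const cutoffSeconds = Math.floor(nowMs() / 1000) - days * 24 * 60 * 60;
  return () => cutoffSeconds; // stable for the lifetime of the closure
}
```

Advancing the real clock after creation has no effect on the returned value, which is exactly what the `rerender` assertion above verifies at the component level.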

View File

@@ -9,12 +9,18 @@ import { render, screen, fireEvent } from '@testing-library/react';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { MessageInput } from '../components/MessageInput';
import { toast } from '../components/ui/sonner';
// Mock sonner (toast)
vi.mock('../components/ui/sonner', () => ({
toast: { success: vi.fn(), error: vi.fn() },
}));
const mockToast = toast as unknown as {
success: ReturnType<typeof vi.fn>;
error: ReturnType<typeof vi.fn>;
};
const textEncoder = new TextEncoder();
function byteLen(s: string): number {
@@ -182,4 +188,24 @@ describe('MessageInput', () => {
expect(getSendButton()).toBeEnabled();
});
});
describe('send failure toasts', () => {
it('shows the radio no-response toast when the send outcome is unknown', async () => {
onSend.mockRejectedValueOnce(
new Error(
'Send command was issued to the radio, but no response was heard back. The message may or may not have sent successfully.'
)
);
renderInput({ conversationType: 'contact' });
fireEvent.change(getInput(), { target: { value: 'Hello' } });
fireEvent.click(getSendButton());
expect(await screen.findByDisplayValue('Hello')).toBeTruthy();
expect(mockToast.error).toHaveBeenCalledWith('Radio did not confirm send', {
description:
'Send command was issued to the radio, but no response was heard back. The message may or may not have sent successfully.',
});
});
});
});
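The `byteLen` helper declared above has its body elided by the diff; a plausible sketch, assuming message limits are enforced in UTF-8 bytes rather than UTF-16 code units, is:

```typescript
// Sketch of a UTF-8 byte-length helper (the real byteLen body is not shown
// in this diff). Multi-byte characters count as more than one byte.
const encoder = new TextEncoder();
function byteLen(s: string): number {
  return encoder.encode(s).length;
}
```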

View File

@@ -7,6 +7,7 @@ import { MessageList } from '../components/MessageList';
import type { Message } from '../types';
const scrollIntoViewMock = vi.fn();
const originalGetBoundingClientRect = HTMLElement.prototype.getBoundingClientRect;
function createMessage(overrides: Partial<Message> = {}): Message {
return {
@@ -35,6 +36,11 @@ describe('MessageList channel sender rendering', () => {
value: scrollIntoViewMock,
writable: true,
});
Object.defineProperty(HTMLElement.prototype, 'getBoundingClientRect', {
configurable: true,
value: originalGetBoundingClientRect,
writable: true,
});
});
it('renders explicit corrupt placeholder and warning avatar for unnamed corrupt channel packets', () => {
@@ -155,4 +161,91 @@ describe('MessageList channel sender rendering', () => {
expect(screen.getByText('Unread messages')).toBeInTheDocument();
expect(scrollIntoViewMock).toHaveBeenCalled();
});
it('lets the user dismiss the jump-to-unread button without scrolling or hiding the marker', async () => {
const user = userEvent.setup();
const messages = [
createMessage({ id: 1, received_at: 1700000001, text: 'Alice: older' }),
createMessage({ id: 2, received_at: 1700000010, text: 'Alice: newer' }),
];
render(
<MessageList
messages={messages}
contacts={[]}
loading={false}
unreadMarkerLastReadAt={1700000005}
/>
);
await user.click(screen.getByRole('button', { name: 'Dismiss jump to unread' }));
expect(screen.queryByRole('button', { name: 'Jump to unread' })).not.toBeInTheDocument();
expect(screen.getByText('Unread messages')).toBeInTheDocument();
expect(scrollIntoViewMock).not.toHaveBeenCalled();
});
it('hides the jump-to-unread button when the unread marker is already visible', () => {
Object.defineProperty(HTMLElement.prototype, 'getBoundingClientRect', {
configurable: true,
writable: true,
value: function () {
const element = this as HTMLElement;
if (element.textContent?.includes('Unread messages')) {
return {
top: 200,
bottom: 240,
left: 0,
right: 300,
width: 300,
height: 40,
x: 0,
y: 200,
toJSON: () => '',
};
}
if (element.className.includes('overflow-y-auto')) {
return {
top: 100,
bottom: 500,
left: 0,
right: 400,
width: 400,
height: 400,
x: 0,
y: 100,
toJSON: () => '',
};
}
return {
top: 0,
bottom: 0,
left: 0,
right: 0,
width: 0,
height: 0,
x: 0,
y: 0,
toJSON: () => '',
};
},
});
const messages = [
createMessage({ id: 1, received_at: 1700000001, text: 'Alice: older' }),
createMessage({ id: 2, received_at: 1700000010, text: 'Alice: newer' }),
];
render(
<MessageList
messages={messages}
contacts={[]}
loading={false}
unreadMarkerLastReadAt={1700000005}
/>
);
expect(screen.getByText('Unread messages')).toBeInTheDocument();
expect(screen.queryByRole('button', { name: 'Jump to unread' })).not.toBeInTheDocument();
});
});
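The mocked `getBoundingClientRect` above exercises a marker-in-viewport check. A minimal sketch of that geometry test — assuming the component hides the jump button when the unread marker's rect lies entirely inside the scroll container's rect — looks like:

```typescript
// Hypothetical sketch of the visibility predicate: the jump-to-unread button
// is redundant once the marker is already within the visible scroll area.
interface RectLike {
  top: number;
  bottom: number;
}
function isMarkerVisible(marker: RectLike, container: RectLike): boolean {
  return marker.top >= container.top && marker.bottom <= container.bottom;
}
```

With the mocked rects above (marker 200–240, container 100–500) the marker is visible, so no button renders.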

View File

@@ -11,6 +11,7 @@ import { describe, it, expect, vi, beforeEach } from 'vitest';
import { NewMessageModal } from '../components/NewMessageModal';
import type { Contact } from '../types';
import { toast } from '../components/ui/sonner';
// Mock sonner (toast)
vi.mock('../components/ui/sonner', () => ({
@@ -35,6 +36,11 @@ const mockContact: Contact = {
first_seen: null,
};
const mockToast = toast as unknown as {
success: ReturnType<typeof vi.fn>;
error: ReturnType<typeof vi.fn>;
};
describe('NewMessageModal form reset', () => {
const onClose = vi.fn();
const onSelectConversation = vi.fn();
@@ -137,6 +143,24 @@ describe('NewMessageModal form reset', () => {
});
expect(onClose).toHaveBeenCalled();
});
it('toasts when creation fails', async () => {
const user = userEvent.setup();
onCreateChannel.mockRejectedValueOnce(new Error('Bad key'));
renderModal();
await switchToTab(user, 'Room');
await user.type(screen.getByPlaceholderText('Room name'), 'MyRoom');
await user.type(screen.getByPlaceholderText('Pre-shared key (hex)'), 'cc'.repeat(16));
await user.click(screen.getByRole('button', { name: 'Create' }));
await waitFor(() => {
expect(mockToast.error).toHaveBeenCalledWith('Failed to create conversation', {
description: 'Bad key',
});
});
expect(screen.getByText('Bad key')).toBeTruthy();
});
});
describe('tab switching resets form', () => {

View File

@@ -0,0 +1,403 @@
import { describe, expect, it, vi } from 'vitest';
import { PayloadType } from '@michaelhart/meshcore-decoder';
import {
buildPacketNetworkContext,
createPacketNetworkState,
ingestPacketIntoPacketNetwork,
projectCanonicalPath,
projectPacketNetwork,
snapshotNeighborIds,
} from '../networkGraph/packetNetworkGraph';
import { buildLinkKey } from '../utils/visualizerUtils';
import type { Contact, RadioConfig, RawPacket } from '../types';
import { CONTACT_TYPE_REPEATER } from '../types';
const { packetFixtures } = vi.hoisted(() => ({
packetFixtures: new Map<string, unknown>(),
}));
vi.mock('../utils/visualizerUtils', async () => {
const actual = await vi.importActual<typeof import('../utils/visualizerUtils')>(
'../utils/visualizerUtils'
);
return {
...actual,
parsePacket: vi.fn(
(hexData: string) => packetFixtures.get(hexData) ?? actual.parsePacket(hexData)
),
};
});
function createConfig(publicKey: string): RadioConfig {
return {
public_key: publicKey,
name: 'Me',
lat: 0,
lon: 0,
tx_power: 0,
max_tx_power: 0,
radio: { freq: 0, bw: 0, sf: 0, cr: 0 },
path_hash_mode: 0,
path_hash_mode_supported: true,
advert_location_source: 'off',
};
}
function createContact(publicKey: string, name: string, type = 1): Contact {
return {
public_key: publicKey,
name,
type,
flags: 0,
last_path: null,
last_path_len: 0,
out_path_hash_mode: 0,
route_override_path: null,
route_override_len: null,
route_override_hash_mode: null,
last_advert: null,
lat: null,
lon: null,
last_seen: null,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
};
}
function createPacket(data: string): RawPacket {
return {
id: 1,
observation_id: 1,
timestamp: 1_700_000_000,
data,
payload_type: 'TEXT',
snr: null,
rssi: null,
decrypted: false,
decrypted_info: null,
};
}
describe('packetNetworkGraph', () => {
it('preserves canonical adjacency while projection hides ambiguous repeaters', () => {
const selfKey = 'ffffffffffff0000000000000000000000000000000000000000000000000000';
const aliceKey = 'aaaaaaaaaaaa0000000000000000000000000000000000000000000000000000';
packetFixtures.set('dm-semantic-hide', {
payloadType: PayloadType.TextMessage,
messageHash: 'dm-semantic-hide',
pathBytes: ['32'],
srcHash: 'aaaaaaaaaaaa',
dstHash: 'ffffffffffff',
advertPubkey: null,
groupTextSender: null,
anonRequestPubkey: null,
});
const state = createPacketNetworkState('Me');
const context = buildPacketNetworkContext({
contacts: [createContact(aliceKey, 'Alice')],
config: createConfig(selfKey),
repeaterAdvertPaths: [],
splitAmbiguousByTraffic: false,
useAdvertPathHints: false,
});
ingestPacketIntoPacketNetwork(state, context, createPacket('dm-semantic-hide'));
const hiddenProjection = projectPacketNetwork(state, {
showAmbiguousNodes: false,
showAmbiguousPaths: false,
collapseLikelyKnownSiblingRepeaters: true,
});
const shownProjection = projectPacketNetwork(state, {
showAmbiguousNodes: false,
showAmbiguousPaths: true,
collapseLikelyKnownSiblingRepeaters: true,
});
expect(snapshotNeighborIds(state)).toEqual(
new Map([
['?32', ['aaaaaaaaaaaa', 'self']],
['aaaaaaaaaaaa', ['?32']],
['self', ['?32']],
])
);
expect(hiddenProjection.links.has('aaaaaaaaaaaa->self')).toBe(true);
expect(shownProjection.links.has('?32->aaaaaaaaaaaa')).toBe(true);
expect(shownProjection.links.has('?32->self')).toBe(true);
});
it('projects hidden ambiguous runs as dashed bridges but keeps later known repeaters visible', () => {
const selfKey = 'ffffffffffff0000000000000000000000000000000000000000000000000000';
const aliceKey = 'aaaaaaaaaaaa0000000000000000000000000000000000000000000000000000';
const repeaterKey = '5656565656560000000000000000000000000000000000000000000000000000';
packetFixtures.set('dm-hidden-chain', {
payloadType: PayloadType.TextMessage,
messageHash: 'dm-hidden-chain',
pathBytes: ['32', '565656565656'],
srcHash: 'aaaaaaaaaaaa',
dstHash: 'ffffffffffff',
advertPubkey: null,
groupTextSender: null,
anonRequestPubkey: null,
});
const state = createPacketNetworkState('Me');
const context = buildPacketNetworkContext({
contacts: [
createContact(aliceKey, 'Alice'),
createContact(repeaterKey, 'Relay B', CONTACT_TYPE_REPEATER),
],
config: createConfig(selfKey),
repeaterAdvertPaths: [],
splitAmbiguousByTraffic: false,
useAdvertPathHints: false,
});
const ingested = ingestPacketIntoPacketNetwork(state, context, createPacket('dm-hidden-chain'));
expect(ingested?.canonicalPath).toEqual(['aaaaaaaaaaaa', '?32', '565656565656', 'self']);
const projectedPath = projectCanonicalPath(state, ingested!.canonicalPath, {
showAmbiguousNodes: false,
showAmbiguousPaths: false,
collapseLikelyKnownSiblingRepeaters: true,
});
const projection = projectPacketNetwork(state, {
showAmbiguousNodes: false,
showAmbiguousPaths: false,
collapseLikelyKnownSiblingRepeaters: true,
});
expect(projectedPath.nodes).toEqual(['aaaaaaaaaaaa', '565656565656', 'self']);
expect(Array.from(projectedPath.dashedLinkDetails.keys())).toEqual([
'565656565656->aaaaaaaaaaaa',
]);
expect(projection.links.get('565656565656->aaaaaaaaaaaa')?.hasHiddenIntermediate).toBe(true);
expect(projection.links.get('565656565656->self')?.hasDirectObservation).toBe(true);
});
it('does not bridge across hidden ambiguous sender endpoints', () => {
const selfKey = 'ffffffffffff0000000000000000000000000000000000000000000000000000';
const repeaterOneKey = '1111111111110000000000000000000000000000000000000000000000000000';
const repeaterTwoKey = '2222222222220000000000000000000000000000000000000000000000000000';
const repeaterThreeKey = '3333333333330000000000000000000000000000000000000000000000000000';
const repeaterFourKey = '4444444444440000000000000000000000000000000000000000000000000000';
packetFixtures.set('dm-hidden-ambiguous-sender-a', {
payloadType: PayloadType.TextMessage,
messageHash: 'dm-hidden-ambiguous-sender-a',
pathBytes: ['111111111111', '222222222222'],
srcHash: '32',
dstHash: 'ffffffffffff',
advertPubkey: null,
groupTextSender: null,
anonRequestPubkey: null,
});
packetFixtures.set('dm-hidden-ambiguous-sender-b', {
payloadType: PayloadType.TextMessage,
messageHash: 'dm-hidden-ambiguous-sender-b',
pathBytes: ['333333333333', '444444444444'],
srcHash: '32',
dstHash: 'ffffffffffff',
advertPubkey: null,
groupTextSender: null,
anonRequestPubkey: null,
});
const state = createPacketNetworkState('Me');
const context = buildPacketNetworkContext({
contacts: [
createContact(repeaterOneKey, 'Relay 1', CONTACT_TYPE_REPEATER),
createContact(repeaterTwoKey, 'Relay 2', CONTACT_TYPE_REPEATER),
createContact(repeaterThreeKey, 'Relay 3', CONTACT_TYPE_REPEATER),
createContact(repeaterFourKey, 'Relay 4', CONTACT_TYPE_REPEATER),
],
config: createConfig(selfKey),
repeaterAdvertPaths: [],
splitAmbiguousByTraffic: false,
useAdvertPathHints: false,
});
ingestPacketIntoPacketNetwork(state, context, createPacket('dm-hidden-ambiguous-sender-a'));
ingestPacketIntoPacketNetwork(state, context, createPacket('dm-hidden-ambiguous-sender-b'));
const projection = projectPacketNetwork(state, {
showAmbiguousNodes: false,
showAmbiguousPaths: true,
collapseLikelyKnownSiblingRepeaters: true,
});
expect(projection.links.has(buildLinkKey('111111111111', '222222222222'))).toBe(true);
expect(projection.links.has(buildLinkKey('333333333333', '444444444444'))).toBe(true);
expect(projection.links.has(buildLinkKey('111111111111', '333333333333'))).toBe(false);
expect(projection.links.has(buildLinkKey('222222222222', '444444444444'))).toBe(false);
});
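The link keys asserted throughout these tests (`aaaaaaaaaaaa->self`, `?32->self`, `565656565656->aaaaaaaaaaaa`) all list the lexicographically smaller id first, suggesting `buildLinkKey` canonicalizes endpoint order so a link is direction-independent. A sketch consistent with those assertions — the real `visualizerUtils` implementation may differ — is:

```typescript
// Hypothetical sketch of an undirected link key: the same pair of node ids
// yields the same key regardless of argument order.
function buildLinkKey(a: string, b: string): string {
  return a < b ? `${a}->${b}` : `${b}->${a}`;
}
```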
it('does not add a DM recipient node from destination metadata alone', () => {
const selfKey = 'ffffffffffff0000000000000000000000000000000000000000000000000000';
const aliceKey = 'aaaaaaaaaaaa0000000000000000000000000000000000000000000000000000';
const bobKey = 'bbbbbbbbbbbb0000000000000000000000000000000000000000000000000000';
const repeaterKey = '5656565656560000000000000000000000000000000000000000000000000000';
packetFixtures.set('dm-third-party-no-dst-node', {
payloadType: PayloadType.TextMessage,
messageHash: 'dm-third-party-no-dst-node',
pathBytes: ['565656565656'],
srcHash: 'aaaaaaaaaaaa',
dstHash: 'bbbbbbbbbbbb',
advertPubkey: null,
groupTextSender: null,
anonRequestPubkey: null,
});
const state = createPacketNetworkState('Me');
const context = buildPacketNetworkContext({
contacts: [
createContact(aliceKey, 'Alice'),
createContact(bobKey, 'Bob'),
createContact(repeaterKey, 'Relay', CONTACT_TYPE_REPEATER),
],
config: createConfig(selfKey),
repeaterAdvertPaths: [],
splitAmbiguousByTraffic: false,
useAdvertPathHints: false,
});
const ingested = ingestPacketIntoPacketNetwork(
state,
context,
createPacket('dm-third-party-no-dst-node')
);
expect(ingested?.canonicalPath).toEqual(['aaaaaaaaaaaa', '565656565656', 'self']);
expect(state.nodes.has('bbbbbbbbbbbb')).toBe(false);
expect(snapshotNeighborIds(state)).toEqual(
new Map([
['565656565656', ['aaaaaaaaaaaa', 'self']],
['aaaaaaaaaaaa', ['565656565656']],
['self', ['565656565656']],
])
);
});
it('replays real advert packets through the semantic layer', () => {
const state = createPacketNetworkState('Me');
const context = buildPacketNetworkContext({
contacts: [],
config: createConfig('ffffffffffff0000000000000000000000000000000000000000000000000000'),
repeaterAdvertPaths: [],
splitAmbiguousByTraffic: false,
useAdvertPathHints: false,
});
const packet = createPacket(
'1106538B1CD273868576DC7F679B493F9AB5AC316173E1A56D3388BC3BA75F583F63AB0D1BA2A8ABD0BC6669DBF719E67E4C8517BA4E0D6F8C96A323E9D13A77F2630DED965A5C17C3EC6ED1601EEFE857749DA24E9F39CBEACD722C3708F433DB5FA9BAF0BAF9BC5B1241069290FEEB029A839EF843616E204F204D657368203220F09FA5AB'
);
packet.payload_type = 'ADVERT';
const ingested = ingestPacketIntoPacketNetwork(state, context, packet);
expect(ingested?.canonicalPath).toEqual([
'8576dc7f679b',
'?53',
'?8b',
'?1c',
'?d2',
'?73',
'?86',
'self',
]);
expect(snapshotNeighborIds(state).get('?73')).toEqual(['?86', '?d2']);
});
it('collapses a likely ambiguous repeater into its known sibling when both share the same next hop', () => {
const selfKey = 'ffffffffffff0000000000000000000000000000000000000000000000000000';
const state = createPacketNetworkState('Me');
const context = buildPacketNetworkContext({
contacts: [
createContact('aaaaaaaaaaaa0000000000000000000000000000000000000000000000000000', 'Alice'),
createContact('cccccccccccc0000000000000000000000000000000000000000000000000000', 'Carol'),
createContact(
'3232323232320000000000000000000000000000000000000000000000000000',
'Relay A',
CONTACT_TYPE_REPEATER
),
createContact(
'32ababababab0000000000000000000000000000000000000000000000000000',
'Relay B',
CONTACT_TYPE_REPEATER
),
createContact(
'5656565656560000000000000000000000000000000000000000000000000000',
'Relay Next',
CONTACT_TYPE_REPEATER
),
],
config: createConfig(selfKey),
repeaterAdvertPaths: [
{
public_key: '3232323232320000000000000000000000000000000000000000000000000000',
paths: [
{
path: '',
path_len: 1,
next_hop: '565656565656',
first_seen: 1,
last_seen: 2,
heard_count: 4,
},
],
},
],
splitAmbiguousByTraffic: false,
useAdvertPathHints: true,
});
packetFixtures.set('graph-ambiguous-sibling', {
payloadType: PayloadType.TextMessage,
messageHash: 'graph-ambiguous-sibling',
pathBytes: ['32', '565656565656'],
srcHash: 'aaaaaaaaaaaa',
dstHash: 'ffffffffffff',
advertPubkey: null,
groupTextSender: null,
anonRequestPubkey: null,
});
packetFixtures.set('graph-known-sibling', {
payloadType: PayloadType.TextMessage,
messageHash: 'graph-known-sibling',
pathBytes: ['323232323232', '565656565656'],
srcHash: 'cccccccccccc',
dstHash: 'ffffffffffff',
advertPubkey: null,
groupTextSender: null,
anonRequestPubkey: null,
});
ingestPacketIntoPacketNetwork(state, context, createPacket('graph-ambiguous-sibling'));
ingestPacketIntoPacketNetwork(state, context, createPacket('graph-known-sibling'));
const collapsed = projectPacketNetwork(state, {
showAmbiguousNodes: false,
showAmbiguousPaths: true,
collapseLikelyKnownSiblingRepeaters: true,
});
const separated = projectPacketNetwork(state, {
showAmbiguousNodes: false,
showAmbiguousPaths: true,
collapseLikelyKnownSiblingRepeaters: false,
});
expect(collapsed.renderedNodeIds.has('?32')).toBe(false);
expect(collapsed.renderedNodeIds.has('323232323232')).toBe(true);
expect(collapsed.links.has('323232323232->aaaaaaaaaaaa')).toBe(true);
expect(separated.renderedNodeIds.has('?32')).toBe(true);
expect(separated.links.has('?32->aaaaaaaaaaaa')).toBe(true);
});
});

View File

@@ -7,6 +7,7 @@ import {
formatRouteLabel,
formatRoutingOverrideInput,
getEffectiveContactRoute,
isValidLocation,
resolvePath,
formatDistance,
formatHopCounts,
@@ -665,6 +666,24 @@ describe('resolvePath', () => {
});
});
describe('isValidLocation', () => {
it('rejects null and unset coordinates', () => {
expect(isValidLocation(null, -122.3)).toBe(false);
expect(isValidLocation(47.6, null)).toBe(false);
expect(isValidLocation(0, 0)).toBe(false);
});
it('rejects out-of-range coordinates', () => {
expect(isValidLocation(-593.497573, -1659.939204)).toBe(false);
expect(isValidLocation(91, 0)).toBe(false);
expect(isValidLocation(0, 181)).toBe(false);
});
it('accepts sane coordinates', () => {
expect(isValidLocation(47.6062, -122.3321)).toBe(true);
});
});
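The three cases above fully characterize the validator: unset coordinates, the (0, 0) "null island" sentinel that garbage ingests tend to produce, and out-of-range values are all rejected. A sketch matching that behavior (a hypothetical reimplementation, not the shipped one) is:

```typescript
// Hypothetical sketch matching the assertions above: both coordinates must
// be set, within WGS84 range, and not the (0, 0) unset sentinel.
function isValidLocation(lat: number | null, lon: number | null): boolean {
  if (lat === null || lon === null) return false;
  if (lat === 0 && lon === 0) return false; // treat (0, 0) as "unset"
  return lat >= -90 && lat <= 90 && lon >= -180 && lon <= 180;
}
```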
describe('formatDistance', () => {
it('formats distances under 1km in meters', () => {
expect(formatDistance(0.5)).toBe('500m');

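Only the sub-kilometer case of `formatDistance` is visible in this diff. A sketch consistent with `formatDistance(0.5)` returning `'500m'` — the kilometer branch here is an assumption for illustration, as the diff truncates before showing it:

```typescript
// Hypothetical sketch: distances under 1 km render in whole meters;
// the km branch below is assumed, not pinned down by the visible test.
function formatDistance(km: number): string {
  if (km < 1) return `${Math.round(km * 1000)}m`;
  return `${km.toFixed(1)}km`;
}
```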
View File

@@ -1,5 +1,5 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent } from '@testing-library/react';
import { render, screen, fireEvent, waitFor } from '@testing-library/react';
import { RepeaterDashboard } from '../components/RepeaterDashboard';
import type { UseRepeaterDashboardResult } from '../hooks/useRepeaterDashboard';
import type { Contact, Conversation, Favorite } from '../types';
@@ -13,6 +13,7 @@ const mockHook: {
loginError: null,
paneData: {
status: null,
nodeInfo: null,
neighbors: null,
acl: null,
radioSettings: null,
@@ -23,6 +24,7 @@ const mockHook: {
},
paneStates: {
status: { loading: false, attempt: 0, error: null },
nodeInfo: { loading: false, attempt: 0, error: null },
neighbors: { loading: false, attempt: 0, error: null },
acl: { loading: false, attempt: 0, error: null },
radioSettings: { loading: false, attempt: 0, error: null },
@@ -63,6 +65,7 @@ vi.mock('react-leaflet', () => ({
TileLayer: () => null,
CircleMarker: () => null,
Popup: () => null,
Polyline: () => null,
}));
const REPEATER_KEY = 'aa'.repeat(32);
@@ -106,6 +109,9 @@ const defaultProps = {
radioLon: null,
radioName: null,
onTrace: vi.fn(),
onPathDiscovery: vi.fn(async () => {
throw new Error('unused');
}),
onToggleNotifications: vi.fn(),
onToggleFavorite: vi.fn(),
onDeleteContact: vi.fn(),
@@ -120,6 +126,7 @@ describe('RepeaterDashboard', () => {
mockHook.loginError = null;
mockHook.paneData = {
status: null,
nodeInfo: null,
neighbors: null,
acl: null,
radioSettings: null,
@@ -130,6 +137,7 @@ describe('RepeaterDashboard', () => {
};
mockHook.paneStates = {
status: { loading: false, attempt: 0, error: null },
nodeInfo: { loading: false, attempt: 0, error: null },
neighbors: { loading: false, attempt: 0, error: null },
acl: { loading: false, attempt: 0, error: null },
radioSettings: { loading: false, attempt: 0, error: null },
@@ -157,6 +165,7 @@ describe('RepeaterDashboard', () => {
render(<RepeaterDashboard {...defaultProps} />);
expect(screen.getByText('Telemetry')).toBeInTheDocument();
expect(screen.getByText('Node Info')).toBeInTheDocument();
expect(screen.getByText('Neighbors')).toBeInTheDocument();
expect(screen.getByText('ACL')).toBeInTheDocument();
expect(screen.getByText('Radio Settings')).toBeInTheDocument();
@@ -226,6 +235,159 @@ describe('RepeaterDashboard', () => {
expect(screen.getByText('Timeout')).toBeInTheDocument();
});
it('shows GPS unavailable message for neighbors when repeater coords are missing', () => {
mockHook.loggedIn = true;
mockHook.paneData.neighbors = {
neighbors: [
{ pubkey_prefix: 'bbbbbbbbbbbb', name: 'Neighbor', snr: 7.2, last_heard_seconds: 9 },
],
};
mockHook.paneData.nodeInfo = {
name: 'TestRepeater',
lat: '0',
lon: '0',
clock_utc: null,
};
mockHook.paneStates.neighbors = {
loading: false,
attempt: 1,
error: null,
fetched_at: Date.now(),
};
mockHook.paneStates.nodeInfo = {
loading: false,
attempt: 1,
error: null,
fetched_at: Date.now(),
};
render(<RepeaterDashboard {...defaultProps} />);
expect(
screen.getByText(
'Map and distance data are unavailable until this repeater has a valid position from either its advert or a Node Info fetch.'
)
).toBeInTheDocument();
expect(screen.getByText('No repeater position available')).toBeInTheDocument();
expect(screen.queryByText('Dist')).not.toBeInTheDocument();
});
it('shows neighbor distance when repeater node info includes valid coords', () => {
mockHook.loggedIn = true;
mockHook.paneData.neighbors = {
neighbors: [
{ pubkey_prefix: 'bbbbbbbbbbbb', name: 'Neighbor', snr: 7.2, last_heard_seconds: 9 },
],
};
mockHook.paneData.nodeInfo = {
name: 'TestRepeater',
lat: '-31.9500',
lon: '115.8600',
clock_utc: null,
};
mockHook.paneStates.neighbors = {
loading: false,
attempt: 1,
error: null,
fetched_at: Date.now(),
};
mockHook.paneStates.nodeInfo = {
loading: false,
attempt: 1,
error: null,
fetched_at: Date.now(),
};
const contactsWithNeighbor = [
...contacts,
{
public_key: 'bbbbbbbbbbbb0000000000000000000000000000000000000000000000000000',
name: 'Neighbor',
type: 1,
flags: 0,
last_path: null,
last_path_len: 0,
out_path_hash_mode: 0,
route_override_path: null,
route_override_len: null,
route_override_hash_mode: null,
last_advert: null,
lat: -31.94,
lon: 115.87,
last_seen: null,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
},
];
render(<RepeaterDashboard {...defaultProps} contacts={contactsWithNeighbor} />);
expect(screen.getByText('Dist')).toBeInTheDocument();
expect(screen.getByText('Using repeater-reported position')).toBeInTheDocument();
expect(
screen.queryByText(
'Map and distance data are unavailable until this repeater has a valid position from either its advert or a Node Info fetch.'
)
).not.toBeInTheDocument();
});
it('uses advert coords for neighbor distance when node info is unavailable', () => {
mockHook.loggedIn = true;
mockHook.paneData.neighbors = {
neighbors: [
{ pubkey_prefix: 'bbbbbbbbbbbb', name: 'Neighbor', snr: 7.2, last_heard_seconds: 9 },
],
};
mockHook.paneData.nodeInfo = null;
mockHook.paneStates.neighbors = {
loading: false,
attempt: 1,
error: null,
fetched_at: Date.now(),
};
mockHook.paneStates.nodeInfo = {
loading: false,
attempt: 0,
error: null,
fetched_at: null,
};
const contactsWithAdvertAndNeighbor = [
{
...contacts[0],
lat: -31.95,
lon: 115.86,
},
{
public_key: 'bbbbbbbbbbbb0000000000000000000000000000000000000000000000000000',
name: 'Neighbor',
type: 1,
flags: 0,
last_path: null,
last_path_len: 0,
out_path_hash_mode: 0,
route_override_path: null,
route_override_len: null,
route_override_hash_mode: null,
last_advert: null,
lat: -31.94,
lon: 115.87,
last_seen: null,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
},
];
render(<RepeaterDashboard {...defaultProps} contacts={contactsWithAdvertAndNeighbor} />);
expect(screen.getByText('Dist')).toBeInTheDocument();
expect(screen.getByText('Using advert position')).toBeInTheDocument();
});
it('shows fetching state with attempt counter', () => {
mockHook.loggedIn = true;
mockHook.paneStates.status = { loading: true, attempt: 2, error: null };
@@ -264,6 +426,24 @@ describe('RepeaterDashboard', () => {
expect(screen.getByText('7.5 dB')).toBeInTheDocument();
});
it('formats the radio tuple and preserves the raw tuple in a tooltip', () => {
mockHook.loggedIn = true;
mockHook.paneData.radioSettings = {
firmware_version: 'v1.0',
radio: '910.5250244,62.5,7,5',
tx_power: '20',
airtime_factor: '0',
repeat_enabled: '1',
flood_max: '3',
};
render(<RepeaterDashboard {...defaultProps} />);
const formatted = screen.getByText('910.525 MHz, BW 62.5 kHz, SF7, CR5');
expect(formatted).toBeInTheDocument();
expect(formatted).toHaveAttribute('title', '910.5250244,62.5,7,5');
});
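The expected string above implies the raw tuple is `freq_mhz,bw_khz,sf,cr` and that the frequency is rounded to three decimals while the raw tuple survives in the `title` attribute. A sketch of such a formatter — the component's real helper may differ — is:

```typescript
// Hypothetical sketch of the radio tuple formatter exercised above.
// Assumes the raw tuple is "freq_mhz,bw_khz,sf,cr".
function formatRadioTuple(raw: string): string {
  const [freq, bw, sf, cr] = raw.split(',');
  const freqMhz = Number.parseFloat(freq).toFixed(3);
  return `${freqMhz} MHz, BW ${bw} kHz, SF${sf}, CR${cr}`;
}
```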
it('shows fetched time and relative age when pane data has been loaded', () => {
mockHook.loggedIn = true;
mockHook.paneStates.status = {
@@ -278,6 +458,40 @@ describe('RepeaterDashboard', () => {
expect(screen.getByText(/Fetched .*Just now/)).toBeInTheDocument();
});
it('keeps repeater clock drift anchored to fetch time across remounts', () => {
vi.useFakeTimers();
try {
const fetchedAt = Date.UTC(2024, 0, 1, 12, 0, 0);
vi.setSystemTime(fetchedAt);
mockHook.loggedIn = true;
mockHook.paneData.nodeInfo = {
name: 'TestRepeater',
lat: null,
lon: null,
clock_utc: '11:59:30 - 1/1/2024 UTC',
};
mockHook.paneStates.nodeInfo = {
loading: false,
attempt: 1,
error: null,
fetched_at: fetchedAt,
};
const firstRender = render(<RepeaterDashboard {...defaultProps} />);
expect(screen.getByText(/\(drift: 30s\)/)).toBeInTheDocument();
vi.setSystemTime(fetchedAt + 10 * 60 * 1000);
firstRender.unmount();
render(<RepeaterDashboard {...defaultProps} />);
expect(screen.getByText(/\(drift: 30s\)/)).toBeInTheDocument();
expect(screen.queryByText(/\(drift: 10m30s\)/)).not.toBeInTheDocument();
} finally {
vi.useRealTimers();
}
});
it('renders action buttons', () => {
mockHook.loggedIn = true;
@@ -342,7 +556,7 @@ describe('RepeaterDashboard', () => {
expect(screen.getByText('1 hop')).toBeInTheDocument();
});
it('direct path is clickable, underlined, and marked as editable', () => {
const directContacts: Contact[] = [
{ ...contacts[0], last_path_len: 0, last_seen: 1700000000 },
];
@@ -352,6 +566,7 @@ describe('RepeaterDashboard', () => {
const directEl = screen.getByTitle('Click to edit routing override');
expect(directEl).toBeInTheDocument();
expect(directEl.textContent).toBe('direct');
expect(directEl.className).toContain('underline');
});
it('shows forced decorator when a routing override is active', () => {
@@ -372,13 +587,11 @@ describe('RepeaterDashboard', () => {
expect(screen.getByText('(forced)')).toBeInTheDocument();
});
it('clicking direct path opens modal and can force direct routing', async () => {
const directContacts: Contact[] = [
{ ...contacts[0], last_path_len: 0, last_seen: 1700000000 },
];
const { api } = await import('../api');
const overrideSpy = vi.spyOn(api, 'setContactRoutingOverride').mockResolvedValue({
status: 'ok',
@@ -388,21 +601,21 @@ describe('RepeaterDashboard', () => {
render(<RepeaterDashboard {...defaultProps} contacts={directContacts} />);
fireEvent.click(screen.getByTitle('Click to edit routing override'));
expect(await screen.findByRole('dialog')).toBeInTheDocument();
fireEvent.click(screen.getByRole('button', { name: 'Force Direct' }));
await waitFor(() => {
expect(overrideSpy).toHaveBeenCalledWith(REPEATER_KEY, '0');
});
overrideSpy.mockRestore();
});
it('closing the routing override modal does not call the API', async () => {
const directContacts: Contact[] = [
{ ...contacts[0], last_path_len: 0, last_seen: 1700000000 },
];
const { api } = await import('../api');
const overrideSpy = vi.spyOn(api, 'setContactRoutingOverride').mockResolvedValue({
status: 'ok',
@@ -412,11 +625,11 @@ describe('RepeaterDashboard', () => {
render(<RepeaterDashboard {...defaultProps} contacts={directContacts} />);
fireEvent.click(screen.getByTitle('Click to edit routing override'));
expect(await screen.findByRole('dialog')).toBeInTheDocument();
fireEvent.click(screen.getByRole('button', { name: 'Cancel' }));
expect(overrideSpy).not.toHaveBeenCalled();
overrideSpy.mockRestore();
});
});


@@ -60,6 +60,7 @@ async function typeAndWaitForResults(query: string) {
describe('SearchView', () => {
beforeEach(() => {
vi.clearAllMocks();
mockGetMessages.mockReset();
});
afterEach(() => {
@@ -284,6 +285,33 @@ describe('SearchView', () => {
);
});
it('refetches current results when visibility policy changes', async () => {
mockGetMessages
.mockResolvedValueOnce([createSearchResult({ id: 1, text: 'visible result' })])
.mockResolvedValueOnce([]);
const { rerender } = render(<SearchView {...defaultProps} visibilityVersion={0} />);
await typeAndWaitForResults('visible');
expect(mockGetMessages).toHaveBeenCalledTimes(1);
expect(
screen.getAllByRole('button').some((button) => button.textContent?.includes('visible result'))
).toBe(true);
rerender(<SearchView {...defaultProps} visibilityVersion={1} />);
await act(async () => {
await new Promise((resolve) => setTimeout(resolve, 0));
});
expect(mockGetMessages).toHaveBeenCalledTimes(2);
expect(mockGetMessages).toHaveBeenLastCalledWith(
expect.objectContaining({ q: 'visible' }),
expect.any(AbortSignal)
);
expect(screen.getByText(/No messages found/)).toBeInTheDocument();
});
it('aborts the load-more request on unmount', async () => {
const pageResults = Array.from({ length: 50 }, (_, i) =>
createSearchResult({ id: i + 1, text: `result ${i}` })


@@ -0,0 +1,23 @@
import { render, screen } from '@testing-library/react';
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
import { SettingsAboutSection } from '../components/settings/SettingsAboutSection';
describe('SettingsAboutSection', () => {
beforeEach(() => {
vi.stubGlobal('__APP_VERSION__', '3.2.0-test');
vi.stubGlobal('__COMMIT_HASH__', 'deadbeef');
});
afterEach(() => {
vi.unstubAllGlobals();
});
it('renders the debug support snapshot link', () => {
render(<SettingsAboutSection />);
const link = screen.getByRole('link', { name: /Open debug support snapshot/i });
expect(link).toHaveAttribute('href', '/api/debug');
expect(link).toHaveAttribute('target', '_blank');
});
});


@@ -8,6 +8,8 @@ import type {
HealthStatus,
RadioConfig,
RadioConfigUpdate,
RadioDiscoveryResponse,
RadioDiscoveryTarget,
StatisticsResponse,
} from '../types';
import type { SettingsSection } from '../components/settings/settingsConstants';
@@ -71,6 +73,9 @@ function renderModal(overrides?: {
onReboot?: () => Promise<void>;
onDisconnect?: () => Promise<void>;
onReconnect?: () => Promise<void>;
meshDiscovery?: RadioDiscoveryResponse | null;
meshDiscoveryLoadingTarget?: RadioDiscoveryTarget | null;
onDiscoverMesh?: (target: RadioDiscoveryTarget) => Promise<void>;
open?: boolean;
pageMode?: boolean;
externalSidebarNav?: boolean;
@@ -87,6 +92,7 @@ function renderModal(overrides?: {
const onReboot = overrides?.onReboot ?? vi.fn(async () => {});
const onDisconnect = overrides?.onDisconnect ?? vi.fn(async () => {});
const onReconnect = overrides?.onReconnect ?? vi.fn(async () => {});
const onDiscoverMesh = overrides?.onDiscoverMesh ?? vi.fn(async () => {});
const commonProps = {
open: overrides?.open ?? true,
@@ -102,6 +108,9 @@ function renderModal(overrides?: {
onDisconnect,
onReconnect,
onAdvertise: vi.fn(async () => {}),
meshDiscovery: overrides?.meshDiscovery ?? null,
meshDiscoveryLoadingTarget: overrides?.meshDiscoveryLoadingTarget ?? null,
onDiscoverMesh,
onHealthRefresh: vi.fn(async () => {}),
onRefreshAppSettings,
};
@@ -125,6 +134,7 @@ function renderModal(overrides?: {
onReboot,
onDisconnect,
onReconnect,
onDiscoverMesh,
view,
};
}
@@ -195,6 +205,26 @@ describe('SettingsModal', () => {
expect(screen.getByText(/Configured radio contact capacity/i)).toBeInTheDocument();
});
it('shows cached radio firmware and capacity info under the connection status', () => {
renderModal({
health: {
...baseHealth,
radio_device_info: {
model: 'T-Echo',
firmware_build: '2025-02-01',
firmware_version: '1.2.3',
max_contacts: 350,
max_channels: 64,
},
},
});
openRadioSection();
expect(
screen.getByText('T-Echo running 2025-02-01/1.2.3 (max: 350 contacts, 64 channels)')
).toBeInTheDocument();
});
it('shows reconnect action when radio connection is paused', () => {
renderModal({
health: { ...baseHealth, radio_state: 'paused' },
@@ -204,6 +234,42 @@ describe('SettingsModal', () => {
expect(screen.getByRole('button', { name: 'Reconnect' })).toBeInTheDocument();
});
it('runs repeater mesh discovery from the radio tab', async () => {
const { onDiscoverMesh } = renderModal();
openRadioSection();
fireEvent.click(screen.getByRole('button', { name: 'Discover Repeaters' }));
await waitFor(() => {
expect(onDiscoverMesh).toHaveBeenCalledWith('repeaters');
});
});
it('renders mesh discovery results in the radio tab', () => {
renderModal({
meshDiscovery: {
target: 'all',
duration_seconds: 8,
results: [
{
public_key: '11'.repeat(32),
node_type: 'repeater',
heard_count: 2,
local_snr: 7.5,
local_rssi: -101,
remote_snr: 4,
},
],
},
});
openRadioSection();
expect(screen.getByText('Last sweep: 1 node')).toBeInTheDocument();
expect(screen.getByText('repeater')).toBeInTheDocument();
expect(screen.getByText('heard 2 times')).toBeInTheDocument();
expect(screen.getByText('8s listen window')).toBeInTheDocument();
});
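The labels asserted here imply a small formatter behind the sweep summary; a minimal sketch with assumed names (`sweepSummary` is hypothetical, not the component's real code):

```typescript
// Hypothetical formatter producing the sweep labels the test asserts on,
// with singular/plural handling for the node count.
function sweepSummary(nodeCount: number, durationSeconds: number): string[] {
  return [
    `Last sweep: ${nodeCount} node${nodeCount === 1 ? '' : 's'}`,
    `${durationSeconds}s listen window`,
  ];
}

console.log(sweepSummary(1, 8)); // [ 'Last sweep: 1 node', '8s listen window' ]
```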
it('saves advert location source through radio config save', async () => {
const { onSave } = renderModal();
openRadioSection();
@@ -336,6 +402,9 @@ describe('SettingsModal', () => {
onDisconnect={vi.fn(async () => {})}
onReconnect={vi.fn(async () => {})}
onAdvertise={vi.fn(async () => {})}
meshDiscovery={null}
meshDiscoveryLoadingTarget={null}
onDiscoverMesh={vi.fn(async () => {})}
onHealthRefresh={vi.fn(async () => {})}
onRefreshAppSettings={vi.fn(async () => {})}
/>


@@ -4,6 +4,7 @@ import { beforeEach, describe, expect, it, vi } from 'vitest';
import { Sidebar } from '../components/Sidebar';
import { CONTACT_TYPE_REPEATER, type Channel, type Contact, type Favorite } from '../types';
import { getStateKey, type ConversationTimes } from '../utils/conversationState';
import { PUBLIC_CHANNEL_KEY } from '../utils/publicChannel';
function makeChannel(key: string, name: string): Channel {
return {
@@ -75,8 +76,7 @@ function renderSidebar(overrides?: {
onToggleCracker={vi.fn()}
onMarkAllRead={vi.fn()}
favorites={favorites}
legacySortOrder="recent"
isConversationNotificationsEnabled={overrides?.isConversationNotificationsEnabled}
/>
);
@@ -85,7 +85,7 @@ function renderSidebar(overrides?: {
}
function getSectionHeaderContainer(title: string): HTMLElement {
const btn = screen.getByRole('button', { name: title });
const container = btn.closest('div');
if (!container) throw new Error(`Missing header container for section ${title}`);
return container;
@@ -142,9 +142,9 @@ describe('Sidebar section summaries', () => {
it('expands collapsed sections during search and restores collapse state after clearing search', async () => {
const { opsChannel, aliceName } = renderSidebar();
fireEvent.click(screen.getByRole('button', { name: 'Tools' }));
fireEvent.click(screen.getByRole('button', { name: 'Channels' }));
fireEvent.click(screen.getByRole('button', { name: 'Contacts' }));
expect(screen.queryByText('Packet Feed')).not.toBeInTheDocument();
expect(screen.queryByText(opsChannel.name)).not.toBeInTheDocument();
@@ -169,9 +169,9 @@ describe('Sidebar section summaries', () => {
it('persists collapsed section state across unmount and remount', () => {
const { opsChannel, aliceName, unmount } = renderSidebar();
fireEvent.click(screen.getByRole('button', { name: 'Tools' }));
fireEvent.click(screen.getByRole('button', { name: 'Channels' }));
fireEvent.click(screen.getByRole('button', { name: 'Contacts' }));
expect(screen.queryByText('Packet Feed')).not.toBeInTheDocument();
expect(screen.queryByText(opsChannel.name)).not.toBeInTheDocument();
@@ -206,8 +206,7 @@ describe('Sidebar section summaries', () => {
onToggleCracker={vi.fn()}
onMarkAllRead={vi.fn()}
favorites={[]}
legacySortOrder="recent"
/>
);
@@ -253,4 +252,103 @@ describe('Sidebar section summaries', () => {
const unread = within(aliceRow).getByText('3');
expect(bell.compareDocumentPosition(unread) & Node.DOCUMENT_POSITION_FOLLOWING).toBeTruthy();
});
it('sorts each section independently and persists per-section sort preferences', () => {
const publicChannel = makeChannel('AA'.repeat(16), 'Public');
const zebraChannel = makeChannel('BB'.repeat(16), '#zebra');
const alphaChannel = makeChannel('CC'.repeat(16), '#alpha');
const zed = makeContact('11'.repeat(32), 'Zed');
const amy = makeContact('22'.repeat(32), 'Amy');
const relayZulu = makeContact('33'.repeat(32), 'Zulu Relay', CONTACT_TYPE_REPEATER);
const relayAlpha = makeContact('44'.repeat(32), 'Alpha Relay', CONTACT_TYPE_REPEATER);
const props = {
contacts: [zed, amy, relayZulu, relayAlpha],
channels: [publicChannel, zebraChannel, alphaChannel],
activeConversation: null,
onSelectConversation: vi.fn(),
onNewMessage: vi.fn(),
lastMessageTimes: {
[getStateKey('channel', zebraChannel.key)]: 300,
[getStateKey('channel', alphaChannel.key)]: 100,
[getStateKey('contact', zed.public_key)]: 200,
[getStateKey('contact', amy.public_key)]: 100,
[getStateKey('contact', relayZulu.public_key)]: 300,
[getStateKey('contact', relayAlpha.public_key)]: 100,
},
unreadCounts: {},
mentions: {},
showCracker: false,
crackerRunning: false,
onToggleCracker: vi.fn(),
onMarkAllRead: vi.fn(),
favorites: [],
legacySortOrder: 'recent' as const,
};
const getChannelsOrder = () => screen.getAllByText(/^#/).map((node) => node.textContent);
const getContactsOrder = () =>
screen
.getAllByText(/^(Amy|Zed)$/)
.map((node) => node.textContent)
.filter((text): text is string => Boolean(text));
const getRepeatersOrder = () =>
screen
.getAllByText(/Relay$/)
.map((node) => node.textContent)
.filter((text): text is string => Boolean(text));
const { unmount } = render(<Sidebar {...props} />);
expect(getChannelsOrder()).toEqual(['#zebra', '#alpha']);
expect(getContactsOrder()).toEqual(['Zed', 'Amy']);
expect(getRepeatersOrder()).toEqual(['Zulu Relay', 'Alpha Relay']);
fireEvent.click(screen.getByRole('button', { name: 'Sort Channels alphabetically' }));
expect(getChannelsOrder()).toEqual(['#alpha', '#zebra']);
expect(getContactsOrder()).toEqual(['Zed', 'Amy']);
expect(getRepeatersOrder()).toEqual(['Zulu Relay', 'Alpha Relay']);
unmount();
render(<Sidebar {...props} />);
expect(getChannelsOrder()).toEqual(['#alpha', '#zebra']);
expect(getContactsOrder()).toEqual(['Zed', 'Amy']);
expect(getRepeatersOrder()).toEqual(['Zulu Relay', 'Alpha Relay']);
});
it('pins only the canonical Public channel to the top of channel sorting', () => {
const publicChannel = makeChannel(PUBLIC_CHANNEL_KEY, 'Public');
const fakePublic = makeChannel('DD'.repeat(16), 'Public');
const alphaChannel = makeChannel('CC'.repeat(16), '#alpha');
const onSelectConversation = vi.fn();
render(
<Sidebar
contacts={[]}
channels={[fakePublic, alphaChannel, publicChannel]}
activeConversation={null}
onSelectConversation={onSelectConversation}
onNewMessage={vi.fn()}
lastMessageTimes={{}}
unreadCounts={{}}
mentions={{}}
showCracker={false}
crackerRunning={false}
onToggleCracker={vi.fn()}
onMarkAllRead={vi.fn()}
favorites={[]}
legacySortOrder="alpha"
/>
);
fireEvent.click(screen.getAllByText('Public')[0]);
expect(onSelectConversation).toHaveBeenCalledWith({
type: 'channel',
id: PUBLIC_CHANNEL_KEY,
name: 'Public',
});
});
});
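The pinned-Public behavior these tests cover comes down to a comparator that matches the canonical channel by key rather than name. A standalone sketch under stated assumptions (the `PUBLIC_CHANNEL_KEY` value and `sortChannels` helper are hypothetical stand-ins, not the Sidebar's real code):

```typescript
// Stand-in for the real constant imported from utils/publicChannel.
const PUBLIC_CHANNEL_KEY = '00'.repeat(16); // hypothetical value

interface SidebarChannel {
  key: string;
  name: string;
  lastMessageTime?: number;
}

// Sketch of a per-section comparator: the canonical Public channel is pinned
// first by KEY (not name), so a channel merely named "Public" sorts normally.
function sortChannels(
  channels: SidebarChannel[],
  order: 'alpha' | 'recent'
): SidebarChannel[] {
  return [...channels].sort((a, b) => {
    const aPin = a.key === PUBLIC_CHANNEL_KEY ? 0 : 1;
    const bPin = b.key === PUBLIC_CHANNEL_KEY ? 0 : 1;
    if (aPin !== bPin) return aPin - bPin;
    return order === 'alpha'
      ? a.name.localeCompare(b.name)
      : (b.lastMessageTime ?? 0) - (a.lastMessageTime ?? 0);
  });
}
```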


@@ -13,6 +13,7 @@ import {
resolveContactFromHashToken,
} from '../utils/urlHash';
import type { Channel, Contact } from '../types';
import { PUBLIC_CHANNEL_KEY } from '../utils/publicChannel';
describe('parseHashConversation', () => {
let originalHash: string;
@@ -149,7 +150,7 @@ describe('parseHashConversation', () => {
describe('resolveChannelFromHashToken', () => {
const channels: Channel[] = [
{
key: PUBLIC_CHANNEL_KEY,
name: 'Public',
is_hashtag: false,
on_radio: true,
@@ -172,13 +173,13 @@ describe('resolveChannelFromHashToken', () => {
];
it('prefers stable key lookup (case-insensitive)', () => {
const result = resolveChannelFromHashToken(PUBLIC_CHANNEL_KEY.toLowerCase(), channels);
expect(result?.key).toBe(PUBLIC_CHANNEL_KEY);
});
it('resolves legacy Public hashes to the canonical Public key', () => {
const result = resolveChannelFromHashToken('Public', channels);
expect(result?.key).toBe(PUBLIC_CHANNEL_KEY);
});
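The resolution order these tests describe — stable key lookup first, legacy name fallback second — can be sketched as follows. `resolveChannelToken` is a hypothetical illustration, not the module's real export:

```typescript
interface HashChannel {
  key: string;
  name: string;
}

// Sketch of the two-step resolution: case-insensitive key match first, then
// a legacy name-based fallback that also tolerates a missing leading '#'.
function resolveChannelToken(
  token: string,
  channels: HashChannel[]
): HashChannel | undefined {
  const lower = token.toLowerCase();
  const byKey = channels.find((c) => c.key.toLowerCase() === lower);
  if (byKey) return byKey;
  const bare = lower.replace(/^#/, '');
  return channels.find((c) => c.name.replace(/^#/, '').toLowerCase() === bare);
}
```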
it('supports legacy hashtag hash without leading #', () => {


@@ -2,19 +2,17 @@ import { act, renderHook } from '@testing-library/react';
import { beforeEach, describe, expect, it, vi } from 'vitest';
import { useConversationActions } from '../hooks/useConversationActions';
import type { Channel, Contact, Conversation, Message, PathDiscoveryResponse } from '../types';
const mocks = vi.hoisted(() => ({
api: {
requestPathDiscovery: vi.fn(),
requestTrace: vi.fn(),
resendChannelMessage: vi.fn(),
sendChannelMessage: vi.fn(),
sendDirectMessage: vi.fn(),
setChannelFloodScopeOverride: vi.fn(),
},
toast: {
success: vi.fn(),
error: vi.fn(),
@@ -25,8 +23,6 @@ vi.mock('../api', () => ({
api: mocks.api,
}));
vi.mock('../components/ui/sonner', () => ({
toast: mocks.toast,
}));
@@ -65,11 +61,9 @@ function createArgs(overrides: Partial<Parameters<typeof useConversationActions>
return {
activeConversation,
activeConversationRef: { current: activeConversation },
setContacts: vi.fn(),
setChannels: vi.fn(),
addMessageIfNew: vi.fn(() => true),
jumpToBottom: vi.fn(),
messageInputRef: { current: { appendText: vi.fn() } },
...overrides,
};
@@ -120,19 +114,6 @@ describe('useConversationActions', () => {
expect(args.addMessageIfNew).not.toHaveBeenCalled();
});
it('appends sender mentions into the message input', () => {
const args = createArgs();
const { result } = renderHook(() => useConversationActions(args));
@@ -143,4 +124,116 @@ describe('useConversationActions', () => {
expect(args.messageInputRef.current?.appendText).toHaveBeenCalledWith('@[Alice] ');
});
it('appends a new-timestamp resend immediately for the active channel', async () => {
const resentMessage: Message = {
...sentMessage,
id: 99,
sender_timestamp: 1700000100,
received_at: 1700000100,
};
mocks.api.resendChannelMessage.mockResolvedValue({
status: 'ok',
message_id: resentMessage.id,
message: resentMessage,
});
const args = createArgs();
const { result } = renderHook(() => useConversationActions(args));
await act(async () => {
await result.current.handleResendChannelMessage(sentMessage.id, true);
});
expect(mocks.api.resendChannelMessage).toHaveBeenCalledWith(sentMessage.id, true);
expect(args.addMessageIfNew).toHaveBeenCalledWith(resentMessage);
});
it('does not append a byte-perfect resend locally', async () => {
mocks.api.resendChannelMessage.mockResolvedValue({
status: 'ok',
message_id: sentMessage.id,
});
const args = createArgs();
const { result } = renderHook(() => useConversationActions(args));
await act(async () => {
await result.current.handleResendChannelMessage(sentMessage.id, false);
});
expect(args.addMessageIfNew).not.toHaveBeenCalled();
});
it('does not append a resend if the user has switched conversations', async () => {
const resentMessage: Message = {
...sentMessage,
id: 100,
sender_timestamp: 1700000200,
received_at: 1700000200,
};
mocks.api.resendChannelMessage.mockResolvedValue({
status: 'ok',
message_id: resentMessage.id,
message: resentMessage,
});
const args = createArgs();
const { result } = renderHook(() => useConversationActions(args));
await act(async () => {
const resendPromise = result.current.handleResendChannelMessage(sentMessage.id, true);
args.activeConversationRef.current = {
type: 'channel',
id: 'AA'.repeat(16),
name: 'Other',
};
await resendPromise;
});
expect(args.addMessageIfNew).not.toHaveBeenCalled();
});
it('merges returned contact data after path discovery', async () => {
const contactKey = 'aa'.repeat(32);
const discoveredContact: Contact = {
public_key: contactKey,
name: 'Alice',
type: 1,
flags: 0,
last_path: 'AABB',
last_path_len: 2,
out_path_hash_mode: 0,
last_advert: null,
lat: null,
lon: null,
last_seen: null,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
};
const response: PathDiscoveryResponse = {
contact: discoveredContact,
forward_path: { path: 'AABB', path_len: 2, path_hash_mode: 0 },
return_path: { path: 'CC', path_len: 1, path_hash_mode: 0 },
};
mocks.api.requestPathDiscovery.mockResolvedValue(response);
const setContacts = vi.fn();
const args = createArgs({
activeConversation: { type: 'contact', id: contactKey, name: 'Alice' },
activeConversationRef: { current: { type: 'contact', id: contactKey, name: 'Alice' } },
setContacts,
});
const { result } = renderHook(() => useConversationActions(args));
await act(async () => {
await result.current.handlePathDiscovery(contactKey);
});
expect(mocks.api.requestPathDiscovery).toHaveBeenCalledWith(contactKey);
expect(setContacts).toHaveBeenCalledTimes(1);
const updater = setContacts.mock.calls[0][0] as (contacts: Contact[]) => Contact[];
expect(updater([])).toEqual([discoveredContact]);
});
});
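The final test asserts on the shape of the `setContacts` updater: a discovered contact replaces any existing entry with the same `public_key`, otherwise it is appended. A minimal sketch (the `mergeDiscoveredContact` name and the trimmed `MeshContact` shape are assumptions for illustration):

```typescript
interface MeshContact {
  public_key: string;
  name: string;
  last_path?: string | null;
}

// Sketch of the merge-or-append updater: match by public_key, replace in
// place when found, append when not.
function mergeDiscoveredContact(
  contacts: MeshContact[],
  incoming: MeshContact
): MeshContact[] {
  const idx = contacts.findIndex((c) => c.public_key === incoming.public_key);
  if (idx === -1) return [...contacts, incoming];
  const next = [...contacts];
  next[idx] = { ...next[idx], ...incoming };
  return next;
}
```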


@@ -2,20 +2,28 @@ import { act, renderHook, waitFor } from '@testing-library/react';
import { beforeEach, describe, expect, it, vi, type Mock } from 'vitest';
import * as messageCache from '../messageCache';
import { api } from '../api';
import { useConversationMessages } from '../hooks/useConversationMessages';
import type { Conversation, Message } from '../types';
const mockGetMessages = vi.fn<typeof api.getMessages>();
const mockGetMessagesAround = vi.fn();
vi.mock('../api', () => ({
api: {
getMessages: (...args: Parameters<typeof api.getMessages>) => mockGetMessages(...args),
getMessagesAround: (...args: unknown[]) => mockGetMessagesAround(...args),
},
isAbortError: (err: unknown) => err instanceof DOMException && err.name === 'AbortError',
}));
const mockToastError = vi.fn();
vi.mock('../components/ui/sonner', () => ({
toast: {
error: (...args: unknown[]) => mockToastError(...args),
},
}));
function createConversation(): Conversation {
return {
type: 'contact',
@@ -55,6 +63,7 @@ describe('useConversationMessages ACK ordering', () => {
beforeEach(() => {
mockGetMessages.mockReset();
messageCache.clear();
mockToastError.mockReset();
});
it('applies buffered ACK when message is added after ACK event', async () => {
@@ -225,6 +234,36 @@ describe('useConversationMessages conversation switch', () => {
expect(result.current.messages[0].conversation_key).toBe('conv_b');
});
it('reloads the active conversation from source when requested', async () => {
const conv = createConversation();
mockGetMessages
.mockResolvedValueOnce([
createMessage({ id: 1, text: 'keep me', sender_timestamp: 1700000000, received_at: 1 }),
createMessage({
id: 2,
text: 'blocked later',
sender_timestamp: 1700000001,
received_at: 2,
}),
])
.mockResolvedValueOnce([
createMessage({ id: 1, text: 'keep me', sender_timestamp: 1700000000, received_at: 1 }),
]);
const { result } = renderHook(() => useConversationMessages(conv));
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
expect(result.current.messages.map((msg) => msg.text)).toEqual(['keep me', 'blocked later']);
act(() => {
result.current.reloadCurrentConversation();
});
await waitFor(() => expect(mockGetMessages).toHaveBeenCalledTimes(2));
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
expect(result.current.messages.map((msg) => msg.text)).toEqual(['keep me']);
});
it('aborts in-flight fetch when switching conversations', async () => {
const convA: Conversation = { type: 'contact', id: 'conv_a', name: 'Contact A' };
const convB: Conversation = { type: 'contact', id: 'conv_b', name: 'Contact B' };
@@ -323,11 +362,157 @@ describe('useConversationMessages background reconcile ordering', () => {
});
});
describe('useConversationMessages older-page dedup and reentry', () => {
beforeEach(() => {
mockGetMessages.mockReset();
messageCache.clear();
});
it('prevents duplicate overlapping older-page fetches in the same tick', async () => {
const conv: Conversation = { type: 'contact', id: 'conv_a', name: 'Contact A' };
const fullPage = Array.from({ length: 200 }, (_, i) =>
createMessage({
id: i + 1,
conversation_key: 'conv_a',
text: `msg-${i + 1}`,
sender_timestamp: 1700000000 + i,
received_at: 1700000000 + i,
})
);
mockGetMessages.mockResolvedValueOnce(fullPage);
const olderDeferred = createDeferred<Message[]>();
mockGetMessages.mockReturnValueOnce(olderDeferred.promise);
const { result } = renderHook(() => useConversationMessages(conv));
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
expect(result.current.messages).toHaveLength(200);
expect(result.current.hasOlderMessages).toBe(true);
act(() => {
void result.current.fetchOlderMessages();
void result.current.fetchOlderMessages();
});
expect(mockGetMessages).toHaveBeenCalledTimes(2); // initial page + one older fetch
olderDeferred.resolve([
createMessage({
id: 0,
conversation_key: 'conv_a',
text: 'older-msg',
sender_timestamp: 1699999999,
received_at: 1699999999,
}),
]);
await waitFor(() => expect(result.current.loadingOlder).toBe(false));
expect(result.current.messages).toHaveLength(201);
expect(result.current.messages.filter((msg) => msg.id === 0)).toHaveLength(1);
});
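These pagination tests settle promises manually via a `createDeferred` helper defined elsewhere in the file (not shown in this diff); a minimal version — an assumption, the real helper may differ — looks like:

```typescript
// Minimal deferred: exposes resolve/reject so a test can settle the promise
// at a chosen moment, e.g. after firing two overlapping fetchOlderMessages
// calls in the same tick.
function createDeferred<T>() {
  let resolve!: (value: T) => void;
  let reject!: (reason?: unknown) => void;
  const promise = new Promise<T>((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}

const d = createDeferred<number>();
d.promise.then((v) => console.log('settled with', v)); // → settled with 42
d.resolve(42);
```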
it('does not append duplicate messages from an overlapping older page', async () => {
const conv: Conversation = { type: 'contact', id: 'conv_a', name: 'Contact A' };
const fullPage = Array.from({ length: 200 }, (_, i) =>
createMessage({
id: i + 1,
conversation_key: 'conv_a',
text: `msg-${i + 1}`,
sender_timestamp: 1700000000 + i,
received_at: 1700000000 + i,
})
);
mockGetMessages.mockResolvedValueOnce(fullPage);
mockGetMessages.mockResolvedValueOnce([
createMessage({
id: 1,
conversation_key: 'conv_a',
text: 'msg-1',
sender_timestamp: 1700000000,
received_at: 1700000000,
}),
createMessage({
id: 0,
conversation_key: 'conv_a',
text: 'older-msg',
sender_timestamp: 1699999999,
received_at: 1699999999,
}),
]);
const { result } = renderHook(() => useConversationMessages(conv));
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
expect(result.current.messages).toHaveLength(200);
await act(async () => {
await result.current.fetchOlderMessages();
});
expect(result.current.messages.filter((msg) => msg.id === 1)).toHaveLength(1);
expect(result.current.messages.filter((msg) => msg.id === 0)).toHaveLength(1);
expect(result.current.messages).toHaveLength(201);
});
it('aborts stale older-page requests on conversation switch without toasting', async () => {
const convA: Conversation = { type: 'contact', id: 'conv_a', name: 'Contact A' };
const convB: Conversation = { type: 'contact', id: 'conv_b', name: 'Contact B' };
const fullPage = Array.from({ length: 200 }, (_, i) =>
createMessage({
id: i + 1,
conversation_key: 'conv_a',
text: `msg-${i + 1}`,
sender_timestamp: 1700000000 + i,
received_at: 1700000000 + i,
})
);
mockGetMessages.mockResolvedValueOnce(fullPage);
const olderDeferred = createDeferred<Message[]>();
let olderSignal: AbortSignal | undefined;
mockGetMessages.mockImplementationOnce((_, signal?: AbortSignal) => {
olderSignal = signal;
signal?.addEventListener('abort', () => {
olderDeferred.resolve([]);
});
return new Promise<Message[]>((_, reject) => {
signal?.addEventListener('abort', () => {
reject(new DOMException('The operation was aborted', 'AbortError'));
});
});
});
const { result, rerender } = renderHook(
({ conv }: { conv: Conversation }) => useConversationMessages(conv),
{ initialProps: { conv: convA } }
);
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
act(() => {
void result.current.fetchOlderMessages();
});
await waitFor(() => expect(result.current.loadingOlder).toBe(true));
mockGetMessages.mockResolvedValueOnce([createMessage({ id: 999, conversation_key: 'conv_b' })]);
rerender({ conv: convB });
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
expect(olderSignal?.aborted).toBe(true);
expect(mockToastError).not.toHaveBeenCalled();
});
});
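The abort-without-toast behavior exercised above follows the standard AbortController pattern: each page fetch owns a controller, switching conversations aborts the stale one, and abort rejections are swallowed instead of surfaced. A standalone sketch under assumed names (`loadPage` is hypothetical, not the hook's real code; `isAbortError` mirrors the mocked predicate in this file):

```typescript
// Sketch: one controller per in-flight page fetch; a new request aborts the
// stale one, and AbortError is treated as expected noise (no error toast).
function isAbortError(err: unknown): boolean {
  return err instanceof DOMException && err.name === 'AbortError';
}

let controller: AbortController | null = null;

async function loadPage(
  fetchPage: (signal: AbortSignal) => Promise<string[]>
): Promise<string[] | null> {
  controller?.abort(); // cancel any stale request
  controller = new AbortController();
  try {
    return await fetchPage(controller.signal);
  } catch (err) {
    if (isAbortError(err)) return null; // silent: aborted, not failed
    throw err;
  }
}
```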
describe('useConversationMessages forward pagination', () => {
beforeEach(() => {
mockGetMessages.mockReset();
mockGetMessagesAround.mockReset();
messageCache.clear();
mockToastError.mockReset();
});
it('fetchNewerMessages loads newer messages and appends them', async () => {
@@ -492,6 +677,69 @@ describe('useConversationMessages forward pagination', () => {
expect(result.current.messages[0].text).toBe('latest-msg');
});
it('aborts stale newer-page requests on conversation switch without toasting', async () => {
const convA: Conversation = { type: 'channel', id: 'ch1', name: 'Channel A' };
const convB: Conversation = { type: 'channel', id: 'ch2', name: 'Channel B' };
mockGetMessagesAround.mockResolvedValueOnce({
messages: [
createMessage({
id: 1,
type: 'CHAN',
conversation_key: 'ch1',
text: 'msg-0',
sender_timestamp: 1700000000,
received_at: 1700000000,
}),
],
has_older: false,
has_newer: true,
});
let newerSignal: AbortSignal | undefined;
mockGetMessages.mockImplementationOnce((_, signal?: AbortSignal) => {
newerSignal = signal;
return new Promise<Message[]>((_, reject) => {
signal?.addEventListener('abort', () => {
reject(new DOMException('The operation was aborted', 'AbortError'));
});
});
});
const initialProps: { conv: Conversation; target: number | null } = {
conv: convA,
target: 1,
};
const { result, rerender } = renderHook(
({ conv, target }: { conv: Conversation; target: number | null }) =>
useConversationMessages(conv, target),
{ initialProps }
);
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
act(() => {
void result.current.fetchNewerMessages();
});
await waitFor(() => expect(result.current.loadingNewer).toBe(true));
mockGetMessages.mockResolvedValueOnce([
createMessage({
id: 999,
type: 'CHAN',
conversation_key: 'ch2',
text: 'conv-b',
}),
]);
rerender({ conv: convB, target: null });
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
expect(newerSignal?.aborted).toBe(true);
expect(mockToastError).not.toHaveBeenCalled();
});
it('preserves around-loaded messages when the jump target is cleared in the same conversation', async () => {
const conv: Conversation = { type: 'channel', id: 'ch1', name: 'Channel' };

Some files were not shown because too many files have changed in this diff.