Compare commits


23 Commits

Author SHA1 Message Date
Jack Kingsman 462ba8945f Thin out the docker image 2026-04-30 20:57:25 -07:00
Jack Kingsman f95745cb05 Updating changelog + build for 3.13.0 2026-04-30 20:31:32 -07:00
Jack Kingsman 39ba88bc4b Fix up e2e tests 2026-04-30 20:22:59 -07:00
Jack Kingsman e814653300 Add non-markdown option. Closes #232. 2026-04-30 19:54:43 -07:00
Jack Kingsman e76d922752 Add received time to packet display. Closes #238. 2026-04-30 19:07:50 -07:00
Jack Kingsman d0e02a42f8 Merge pull request #237 from Bjorkan/TraceFix
Return HTTP 422 for missing trace responses to avoid confusing proxies in front of RemoteTerm. Closes #236.
2026-04-30 18:51:24 -07:00
Jack Kingsman dbf14259dc Do full rewrite of 5xx => 4xx 2026-04-30 18:47:35 -07:00
Jack Kingsman a9ac87e668 Allow newlines in text input. Closes #234. 2026-04-30 18:36:36 -07:00
Björkan f710a1f2d9 Change failed trace from using 504 to instead use 422 2026-04-30 23:03:08 +02:00
Björkan 9f6c0f12c5 Don't include .codex file 2026-04-30 22:58:59 +02:00
Jack Kingsman 466f693c21 Fix page to dvh. Closes #233. 2026-04-28 14:41:56 -07:00
Jack Kingsman 16f87e640f Attempt up to three Apprise retries. Closes #232. 2026-04-28 14:40:14 -07:00
Jack Kingsman 761fd82da6 Backoff MQTT failures all the way up to 1hr on connection failure, and also don't multi-toast on connection error. Closes #231. 2026-04-28 12:00:03 -07:00
Jack Kingsman 2c1279eb9e Add error rate percentage to metrics graph 2026-04-27 11:21:02 -07:00
Jack Kingsman 047d713003 Permit hourly checks for direct/routed repeaters. Closes #226. 2026-04-27 09:51:57 -07:00
Jack Kingsman 25041e1367 Add dynamic text replacement. Closes #223. 2026-04-25 15:00:36 -07:00
Jack Kingsman b3fe717416 Correct packet sum for repeater error rate. Closes #225. 2026-04-25 14:48:44 -07:00
Jack Kingsman 9a4e78c504 Show RX error percentage 2026-04-25 14:01:39 -07:00
Jack Kingsman d436de67a2 Merge pull request #224 from jkingsman/repeater-error-count
Repeater error count
2026-04-25 13:54:42 -07:00
Jack Kingsman 89cee49725 Actually bump lib 2026-04-25 13:45:43 -07:00
Jack Kingsman b37ce89c96 Add repeater telemetry error count 2026-04-25 13:45:17 -07:00
Jack Kingsman f0b7842c60 Merge pull request #221 from jkingsman/dependabot/npm_and_yarn/frontend/npm_and_yarn-754666cf41
Bump postcss from 8.5.8 to 8.5.10 in /frontend in the npm_and_yarn group across 1 directory
2026-04-24 18:06:54 -07:00
dependabot[bot] 82a6553539 Bump postcss in /frontend in the npm_and_yarn group across 1 directory
Bumps the npm_and_yarn group with 1 update in the /frontend directory: [postcss](https://github.com/postcss/postcss).


Updates `postcss` from 8.5.8 to 8.5.10
- [Release notes](https://github.com/postcss/postcss/releases)
- [Changelog](https://github.com/postcss/postcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/postcss/postcss/compare/8.5.8...8.5.10)

---
updated-dependencies:
- dependency-name: postcss
  dependency-version: 8.5.10
  dependency-type: direct:development
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-24 21:10:08 +00:00
80 changed files with 1922 additions and 3141 deletions
+1
@@ -25,6 +25,7 @@ references/
 # ancillary LLM files
 .claude/
+.codex
 # local Docker compose files
 docker-compose.yml
-4
@@ -22,7 +22,6 @@ A web interface for MeshCore mesh radio networks. The backend connects to a Mesh
 Ancillary AGENTS.md files which should generally not be reviewed unless specific work is being performed on those features include:
 - `app/fanout/AGENTS_fanout.md` - Fanout bus architecture (MQTT, bots, webhooks, Apprise, SQS)
-- `app/tcp_proxy/AGENTS_tcp_proxy.md` - TCP companion protocol proxy (emulates a MeshCore radio for remote clients)
 - `frontend/src/components/visualizer/AGENTS_packet_visualizer.md` - Packet visualizer (force-directed graph, advert-path identity, layout engine)
 ## Architecture Overview
@@ -508,9 +507,6 @@ mc.subscribe(EventType.ACK, handler)
 | `MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE` | `false` | Disable channel-slot reuse and force `set_channel(...)` before every channel send, even on serial/BLE |
 | `MESHCORE_LOAD_WITH_AUTOEVICT` | `false` | Enable autoevict contact loading: sets `AUTO_ADD_OVERWRITE_OLDEST` on the radio so adds never fail with TABLE_FULL, skips the removal phase during reconcile, and allows blind loading when `get_contacts` fails. Loaded contacts are not radio-favorited and may be evicted by new adverts when the table is full. |
 | `MESHCORE_ENABLE_LOCAL_PRIVATE_KEY_EXPORT` | `false` | Enable `GET /api/radio/private-key` to return the in-memory private key as hex. Disabled by default; only enable on a trusted network where you need to retrieve the key (e.g. for backup or migration). |
-| `MESHCORE_TCP_PROXY_ENABLED` | `false` | Enable the MeshCore TCP companion protocol proxy (see `app/tcp_proxy/AGENTS_tcp_proxy.md`) |
-| `MESHCORE_TCP_PROXY_BIND` | `0.0.0.0` | Bind address for the TCP proxy server |
-| `MESHCORE_TCP_PROXY_PORT` | `5001` | Port for the TCP proxy server |
 **Note:** Runtime app settings are stored in the database (`app_settings` table), not environment variables. These include `max_radio_contacts`, `auto_decrypt_dm_on_advert`, `advert_interval`, `last_advert_time`, `last_message_times`, `flood_scope`, `blocked_keys`, `blocked_names`, `discovery_blocked_types`, `tracked_telemetry_repeaters`, `auto_resend_channel`, and `telemetry_interval_hours`. `max_radio_contacts` is the configured radio contact capacity baseline used by background maintenance: favorites reload first, non-favorite fill targets about 80% of that value, and full offload/reload triggers around 95% occupancy. They are configured via `GET/PATCH /api/settings`. MQTT, bot, webhook, Apprise, and SQS configs are stored in the `fanout_configs` table, managed via `/api/fanout`. If the radio's channel slots appear unstable or another client is mutating them underneath this app, operators can force the old always-reconfigure send path with `MESHCORE_FORCE_CHANNEL_SLOT_RECONFIGURE=true`.
+15
@@ -1,3 +1,18 @@
+## [3.13.0] - 2026-04-30
+
+* Feature: Error counts included in repeater telemetry
+* Feature: RX error rate + percentage surfaced and tracked for repeaters
+* Feature: Dynamic as-you-type text replacement for Cyrillic byte optimization
+* Feature: Permit hourly checks for direct/routed repeaters
+* Feature: Allow newlines in input
+* Feature: Packet-send radio time added to packet analyzer
+* Feature: Enable forced plaintext for Apprise
+* Bugfix: Less annoying MQTT failure notifications with backoff
+* Bugfix: Don't obscure input; use dvh everywhere
+* Bugfix: Clearer save button for advert interval
+* Misc: Library updates
+* Misc: Rewrite 5xx to 4xx to avoid issues with proxies that don't react well to 503/504
+
 ## [3.12.3] - 2026-04-24
 * Feature: Customizable Apprise strings
+22 -13
@@ -9,26 +9,35 @@ COPY frontend/package.json frontend/package-lock.json frontend/.npmrc ./
 RUN npm ci
 COPY frontend/ ./
-RUN VITE_COMMIT_HASH=${COMMIT_HASH} npm run build
+RUN VITE_COMMIT_HASH=${COMMIT_HASH} npm run build \
+    && find dist -name '*.map' -delete
 
-# Stage 2: Python runtime
+# Stage 2: Install Python dependencies (uv stays in this stage only)
+FROM python:3.13-slim AS python-deps
+WORKDIR /app
+COPY --from=ghcr.io/astral-sh/uv:0.6 /uv /usr/local/bin/uv
+COPY pyproject.toml uv.lock ./
+RUN uv sync --frozen --no-dev
+
+# Stage 3: Final runtime (no uv, no source maps)
 FROM python:3.13-slim
 ARG COMMIT_HASH=unknown
 WORKDIR /app
-ENV COMMIT_HASH=${COMMIT_HASH}
+ENV COMMIT_HASH=${COMMIT_HASH} \
+    PATH="/app/.venv/bin:$PATH"
 
-# Install uv
-COPY --from=ghcr.io/astral-sh/uv:0.6 /uv /usr/local/bin/uv
-
-# Copy dependency files first for layer caching
-COPY pyproject.toml uv.lock ./
-
-# Install dependencies (no dev/test deps)
-RUN uv sync --frozen --no-dev
+# Copy installed venv from deps stage
+COPY --from=python-deps /app/.venv ./.venv
+
+# Copy dependency metadata (pyproject.toml needed by app for version info)
+COPY pyproject.toml ./
 
 # Copy application code
 COPY app/ ./app/
@@ -36,7 +45,7 @@ COPY app/ ./app/
 # Copy license attributions
 COPY LICENSES.md ./
 
-# Copy built frontend from first stage
+# Copy built frontend from first stage (source maps already stripped)
 COPY --from=frontend-builder /build/dist ./frontend/dist
 
 # Create data directory for SQLite database
@@ -44,5 +53,5 @@ RUN mkdir -p /app/data
 EXPOSE 8000
 
-# Run the application (we retain root for max compatibility)
-CMD ["uv", "run", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
+# Run uvicorn directly from the venv (no uv needed at runtime)
+CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
+1 -1
@@ -330,7 +330,7 @@ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
 </details>
 
-### meshcore (2.3.2) — MIT
+### meshcore (2.3.7) — MIT
 <details>
 <summary>Full license text</summary>
-24
@@ -39,30 +39,6 @@ Import via `PUT /api/radio/private-key` is always available regardless of this s
 The Radio Settings config export/import feature uses these endpoints. When export is disabled, config exports will omit the private key and show a notice.
 
-## MeshCore TCP Proxy
-RemoteTerm can emulate a MeshCore companion radio over TCP, allowing MeshCore clients (mobile apps, meshcore-cli, meshcore-ha) to connect to it as if it were a directly-connected radio.
-| Variable | Default | Description |
-|----------|---------|-------------|
-| `MESHCORE_TCP_PROXY_ENABLED` | `false` | Enable the TCP companion protocol proxy |
-| `MESHCORE_TCP_PROXY_BIND` | `0.0.0.0` | Bind address for the proxy TCP server |
-| `MESHCORE_TCP_PROXY_PORT` | `5001` | Port for the proxy TCP server |
-Once enabled, MeshCore clients can connect:
-```bash
-meshcore-cli --tcp <host>:5001
-```
-**How it works:** The proxy translates the MeshCore companion binary protocol into in-process RemoteTerm operations. Contacts, channels, and messages come from the RemoteTerm database. Outgoing messages are sent through RemoteTerm's send orchestration (with radio lock, retries, and ACK tracking). Incoming messages are pushed to connected clients in real time.
-**Limitations:**
-- Only favorite contacts are synced to clients
-- Only favorite channels are pre-loaded into slots; clients can load additional channels via SET_CHANNEL (local to the proxy session, does not modify RemoteTerm channel configuration)
-- DMs receive an immediate synthetic ACK; actual delivery retries are handled server-side by RemoteTerm
-- Radio configuration changes (SET_NAME, SET_LATLON) are applied to the real radio
-
 ## Contact Loading Issues
 RemoteTerm loads favorite and recently active contacts onto the radio so that the radio can automatically acknowledge incoming DMs on your behalf. To do this, it first enumerates the radio's existing contact table, then reconciles it with the desired working set.
+1 -6
@@ -55,7 +55,6 @@ app/
 │   ├── send.py              # pywebpush wrapper (async via thread executor)
 │   └── manager.py           # Push dispatch: filter, build payload, concurrent send
 ├── fanout/                  # Fanout bus: MQTT, bots, webhooks, Apprise, SQS (see fanout/AGENTS_fanout.md)
-├── tcp_proxy/               # MeshCore TCP companion protocol proxy (see tcp_proxy/AGENTS_tcp_proxy.md)
 ├── telemetry_interval.py    # Shared telemetry interval math for tracked-repeater scheduler
 ├── path_utils.py            # Path hex rendering and hop-width helpers
 ├── region_scope.py          # Normalize/validate regional flood-scope values
@@ -427,11 +426,7 @@ tests/
 ├── test_telemetry_interval.py    # Telemetry interval scheduling math
 ├── test_version_info.py          # Version/build metadata resolution
 ├── test_websocket.py             # WS manager broadcast/cleanup
-├── test_websocket_route.py       # WS endpoint lifecycle
-├── test_tcp_proxy_protocol.py    # TCP proxy frame parsing and helpers
-├── test_tcp_proxy_encoder.py     # TCP proxy binary encoding
-├── test_tcp_proxy_session.py     # TCP proxy session command handlers
-└── test_tcp_proxy_integration.py # TCP proxy end-to-end frame exchange
+└── test_websocket_route.py       # WS endpoint lifecycle
 ```
 ## Errata & Known Non-Issues
## Errata & Known Non-Issues ## Errata & Known Non-Issues
-3
@@ -31,9 +31,6 @@ class Settings(BaseSettings):
     skip_post_connect_sync: bool = False
     basic_auth_username: str = ""
     basic_auth_password: str = ""
-    tcp_proxy_enabled: bool = False
-    tcp_proxy_bind: str = "0.0.0.0"
-    tcp_proxy_port: int = 5001
 
     @model_validator(mode="after")
     def validate_transport_exclusivity(self) -> "Settings":
+93 -24
@@ -11,6 +11,9 @@ from app.path_utils import split_path_hex
 logger = logging.getLogger(__name__)
 
+_MAX_SEND_ATTEMPTS = 3
+_RETRY_DELAY_S = 2
+
 DEFAULT_BODY_FORMAT_DM = "**DM:** {sender_name}: {text} **via:** [{hops_backticked}]"
 DEFAULT_BODY_FORMAT_CHANNEL = (
     "**{channel_name}:** {sender_name}: {text} **via:** [{hops_backticked}]"
@@ -18,6 +21,12 @@ DEFAULT_BODY_FORMAT_CHANNEL = (
 _DEFAULT_BODY_FORMAT_DM_NO_PATH = "**DM:** {sender_name}: {text}"
 _DEFAULT_BODY_FORMAT_CHANNEL_NO_PATH = "**{channel_name}:** {sender_name}: {text}"
 
+# Plain-text variants (no markdown formatting)
+DEFAULT_BODY_FORMAT_DM_PLAIN = "DM: {sender_name}: {text} via: [{hops}]"
+DEFAULT_BODY_FORMAT_CHANNEL_PLAIN = "{channel_name}: {sender_name}: {text} via: [{hops}]"
+_DEFAULT_BODY_FORMAT_DM_NO_PATH_PLAIN = "DM: {sender_name}: {text}"
+_DEFAULT_BODY_FORMAT_CHANNEL_NO_PATH_PLAIN = "{channel_name}: {sender_name}: {text}"
+
 # Variables available for user format strings
 FORMAT_VARIABLES = (
     "type",
@@ -130,10 +139,17 @@ def _apply_format(fmt: str, variables: dict[str, str]) -> str:
 def _format_body(
     data: dict,
     *,
-    body_format_dm: str = DEFAULT_BODY_FORMAT_DM,
-    body_format_channel: str = DEFAULT_BODY_FORMAT_CHANNEL,
+    body_format_dm: str | None = None,
+    body_format_channel: str | None = None,
+    markdown: bool = True,
 ) -> str:
     """Build a notification body from message data using format strings."""
+    if body_format_dm is None:
+        body_format_dm = DEFAULT_BODY_FORMAT_DM if markdown else DEFAULT_BODY_FORMAT_DM_PLAIN
+    if body_format_channel is None:
+        body_format_channel = (
+            DEFAULT_BODY_FORMAT_CHANNEL if markdown else DEFAULT_BODY_FORMAT_CHANNEL_PLAIN
+        )
     variables = _build_template_vars(data)
     msg_type = data.get("type", "")
     fmt = body_format_dm if msg_type == "PRIV" else body_format_channel
@@ -141,13 +157,21 @@ def _format_body(
         return _apply_format(fmt, variables)
     except Exception:
         logger.warning("Apprise format string error, falling back to default")
-        default = DEFAULT_BODY_FORMAT_DM if msg_type == "PRIV" else DEFAULT_BODY_FORMAT_CHANNEL
+        if markdown:
+            default = DEFAULT_BODY_FORMAT_DM if msg_type == "PRIV" else DEFAULT_BODY_FORMAT_CHANNEL
+        else:
+            default = (
+                DEFAULT_BODY_FORMAT_DM_PLAIN
+                if msg_type == "PRIV"
+                else DEFAULT_BODY_FORMAT_CHANNEL_PLAIN
+            )
         return _apply_format(default, variables)
 
-def _send_sync(urls_raw: str, body: str, *, preserve_identity: bool) -> bool:
+def _send_sync(urls_raw: str, body: str, *, preserve_identity: bool, markdown: bool = True) -> bool:
     """Send notification synchronously via Apprise. Returns True on success."""
     import apprise as apprise_lib
+    from apprise import NotifyFormat
 
     urls = _parse_urls(urls_raw)
     if not urls:
@@ -159,7 +183,8 @@ def _send_sync(urls_raw: str, body: str, *, preserve_identity: bool) -> bool:
         url = _normalize_discord_url(url)
         notifier.add(url)
 
-    return bool(notifier.notify(title="", body=body))
+    body_fmt = NotifyFormat.MARKDOWN if markdown else NotifyFormat.TEXT
+    return bool(notifier.notify(title="", body=body, body_format=body_fmt))
 
 class AppriseModule(FanoutModule):
@@ -178,6 +203,7 @@ class AppriseModule(FanoutModule):
             return
 
         preserve_identity = self.config.get("preserve_identity", True)
+        markdown = self.config.get("markdown_format", True)
 
         # Read format strings; treat empty/whitespace as unset (use default).
         # Fall back to legacy include_path for pre-migration configs.
@@ -186,30 +212,73 @@ class AppriseModule(FanoutModule):
         if body_format_dm is None or body_format_channel is None:
             include_path = self.config.get("include_path", True)
             if body_format_dm is None:
-                body_format_dm = (
-                    DEFAULT_BODY_FORMAT_DM if include_path else _DEFAULT_BODY_FORMAT_DM_NO_PATH
-                )
+                if markdown:
+                    body_format_dm = (
+                        DEFAULT_BODY_FORMAT_DM if include_path else _DEFAULT_BODY_FORMAT_DM_NO_PATH
+                    )
+                else:
+                    body_format_dm = (
+                        DEFAULT_BODY_FORMAT_DM_PLAIN
+                        if include_path
+                        else _DEFAULT_BODY_FORMAT_DM_NO_PATH_PLAIN
+                    )
             if body_format_channel is None:
-                body_format_channel = (
-                    DEFAULT_BODY_FORMAT_CHANNEL
-                    if include_path
-                    else _DEFAULT_BODY_FORMAT_CHANNEL_NO_PATH
-                )
+                if markdown:
+                    body_format_channel = (
+                        DEFAULT_BODY_FORMAT_CHANNEL
+                        if include_path
+                        else _DEFAULT_BODY_FORMAT_CHANNEL_NO_PATH
+                    )
+                else:
+                    body_format_channel = (
+                        DEFAULT_BODY_FORMAT_CHANNEL_PLAIN
+                        if include_path
+                        else _DEFAULT_BODY_FORMAT_CHANNEL_NO_PATH_PLAIN
+                    )
 
         body = _format_body(
-            data, body_format_dm=body_format_dm, body_format_channel=body_format_channel
+            data,
+            body_format_dm=body_format_dm,
+            body_format_channel=body_format_channel,
+            markdown=markdown,
         )
 
-        try:
-            success = await asyncio.to_thread(
-                _send_sync, urls, body, preserve_identity=preserve_identity
-            )
-            self._set_last_error(None if success else "Apprise notify returned failure")
-            if not success:
-                logger.warning("Apprise notification failed for module %s", self.config_id)
-        except Exception as exc:
-            self._set_last_error(str(exc))
-            logger.exception("Apprise send error for module %s", self.config_id)
+        last_exc: Exception | None = None
+        for attempt in range(_MAX_SEND_ATTEMPTS):
+            try:
+                success = await asyncio.to_thread(
+                    _send_sync,
+                    urls,
+                    body,
+                    preserve_identity=preserve_identity,
+                    markdown=markdown,
+                )
+                if success:
+                    self._set_last_error(None)
+                    return
+                logger.warning(
+                    "Apprise notification failed for module %s (attempt %d/%d)",
+                    self.config_id,
+                    attempt + 1,
+                    _MAX_SEND_ATTEMPTS,
+                )
+            except Exception as exc:
+                last_exc = exc
+                logger.warning(
+                    "Apprise send error for module %s (attempt %d/%d): %s",
+                    self.config_id,
+                    attempt + 1,
+                    _MAX_SEND_ATTEMPTS,
+                    exc,
+                )
+            if attempt < _MAX_SEND_ATTEMPTS - 1:
+                await asyncio.sleep(_RETRY_DELAY_S)
+
+        # All attempts exhausted
+        if last_exc is not None:
+            self._set_last_error(str(last_exc))
+        else:
+            self._set_last_error("Apprise notify returned failure")
 
     @property
     def status(self) -> str:
+1 -1
@@ -245,7 +245,7 @@ def _get_client_version() -> str:
 class CommunityMqttPublisher(BaseMqttPublisher):
     """Manages the community MQTT connection and publishes raw packets."""
 
-    _backoff_max = 60
+    _backoff_max = 3600
     _log_prefix = "Community MQTT"
     _not_configured_timeout: float | None = 30
+1 -1
@@ -27,7 +27,7 @@ class PrivateMqttSettings(Protocol):
 class MqttPublisher(BaseMqttPublisher):
     """Manages an MQTT connection and publishes mesh network events."""
 
-    _backoff_max = 30
+    _backoff_max = 3600
     _log_prefix = "MQTT"
 
     def _is_configured(self) -> bool:
+8 -3
@@ -65,6 +65,7 @@ class BaseMqttPublisher(ABC):
         self.connected: bool = False
         self.integration_name: str = ""
         self._last_error: str | None = None
+        self._error_notified: bool = False
 
     def set_integration_name(self, name: str) -> None:
         """Attach the configured fanout-module name for operator-facing logs."""
@@ -104,6 +105,7 @@ class BaseMqttPublisher(ABC):
         self._client = None
         self.connected = False
         self._last_error = None
+        self._error_notified = False
 
     async def restart(self, settings: object) -> None:
         """Called when settings change — stop + start."""
@@ -217,6 +219,7 @@ class BaseMqttPublisher(ABC):
         self._client = client
         self.connected = True
         self._last_error = None
+        self._error_notified = False
         backoff = _BACKOFF_MIN
         title, detail = self._on_connected(settings)
@@ -281,9 +284,11 @@ class BaseMqttPublisher(ABC):
             )
             return
 
-        title, detail = self._on_error()
-        broadcast_error(title, detail)
-        _broadcast_health()
+        if not self._error_notified:
+            title, detail = self._on_error()
+            broadcast_error(title, detail)
+            _broadcast_health()
+            self._error_notified = True
         logger.warning(
             "%s connection error. This is usually transient network noise; "
             "if it self-resolves, it is generally not a concern: %s "
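The three `_backoff_max` bumps in this compare (60/30/30 → 3600) cap a doubling reconnect backoff at one hour. A minimal sketch of that schedule, assuming a doubling factor and a 5-second floor (the real `_BACKOFF_MIN` constant is not shown in these diffs):

```python
BACKOFF_MIN = 5      # assumed floor; the actual _BACKOFF_MIN is not visible in the diff
BACKOFF_MAX = 3600   # one hour, matching the new _backoff_max


def next_backoff(current: float) -> float:
    """Double the reconnect delay after each failed attempt, capped at BACKOFF_MAX."""
    return min(current * 2, BACKOFF_MAX)


# Walk the schedule: 5, 10, 20, ... then pinned at 3600 once the cap is hit.
delays = []
d = BACKOFF_MIN
for _ in range(12):
    delays.append(d)
    d = next_backoff(d)
print(delays)
```

With a 5s floor the delay reaches the one-hour ceiling after about ten failures and stays there, which is why the companion `_error_notified` flag matters: without it, each hourly retry would re-toast the same connection error.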
+10 -1
@@ -81,6 +81,15 @@ _REPEATER_SENSORS: list[dict[str, Any]] = [
         "unit": None,
         "precision": 0,
     },
+    {
+        "field": "recv_errors",
+        "name": "RX Errors",
+        "object_id": "recv_errors",
+        "device_class": None,
+        "state_class": "total_increasing",
+        "unit": None,
+        "precision": 0,
+    },
     {
         "field": "uptime_seconds",
         "name": "Uptime",
@@ -307,7 +316,7 @@ def _device_payload(
 class _HaMqttPublisher(BaseMqttPublisher):
     """Thin MQTT lifecycle wrapper for the HA discovery module."""
 
-    _backoff_max = 30
+    _backoff_max = 3600
     _log_prefix = "HA-MQTT"
 
     def __init__(self) -> None:
def __init__(self) -> None: def __init__(self) -> None:
+11 -24
@@ -2,14 +2,13 @@ import logging
 import sys
 
 # ---------------------------------------------------------------------------
-# Windows event-loop advisory for MQTT fanout and TCP proxy
+# Windows event-loop advisory for MQTT fanout
 # ---------------------------------------------------------------------------
 # On Windows, uvicorn's default event loop (ProactorEventLoop) does not
-# implement add_reader()/add_writer(), which paho-mqtt (via aiomqtt) and
-# asyncio.start_server (TCP proxy) require. The loop is already created by
-# the time this module is imported, so we cannot switch it here. Log a
-# prominent warning so Windows operators know to start uvicorn with the
-# selector loop policy set before import.
+# implement add_reader()/add_writer(), which paho-mqtt (via aiomqtt) requires.
+# We cannot fix this from inside the app — the loop is already created by the
+# time this module is imported. Log a prominent warning so Windows operators
+# who want MQTT know to add ``--loop none`` to their uvicorn command.
 # ---------------------------------------------------------------------------
 if sys.platform == "win32":
     import asyncio as _asyncio
@@ -22,15 +21,12 @@ if sys.platform == "win32":
             " NOTE FOR WINDOWS USERS\n" + "!" * 78 + "\n"
             "\n"
             " The running event loop is ProactorEventLoop, which is not\n"
-            " compatible with MQTT fanout or the TCP proxy.\n"
+            " compatible with MQTT fanout (aiomqtt / paho-mqtt).\n"
             "\n"
-            " If you use either feature, restart with:\n"
+            " If you use MQTT integrations, restart with --loop none:\n"
             "\n"
-            ' python -c "import asyncio; asyncio.set_event_loop_policy('
-            'asyncio.WindowsSelectorEventLoopPolicy())" & '
-            "uv run uvicorn app.main:app [... options ...]\n"
-            "\n"
-            " Or add --loop asyncio to the uvicorn command.\n"
+            " uv run uvicorn app.main:app \033[1m--loop none\033[0m"
+            " [... other options ...]\n"
             "\n"
             " Everything else works fine as-is.\n"
             "\n" + "!" * 78 + "\n",
@@ -134,21 +130,12 @@ async def lifespan(app: FastAPI):
     except Exception:
         logger.exception("Failed to start fanout modules")
 
-    if server_settings.tcp_proxy_enabled:
-        from app.tcp_proxy import start_tcp_proxy
-
-        await start_tcp_proxy()
-
     startup_radio_task = asyncio.create_task(_startup_radio_connect_and_setup())
     app.state.startup_radio_task = startup_radio_task
 
     yield
     logger.info("Shutting down")
 
-    if server_settings.tcp_proxy_enabled:
-        from app.tcp_proxy import stop_tcp_proxy
-
-        await stop_tcp_proxy()
-
     if startup_radio_task and not startup_radio_task.done():
         startup_radio_task.cancel()
         try:
@@ -189,8 +176,8 @@ app.add_middleware(
 @app.exception_handler(RadioDisconnectedError)
 async def radio_disconnected_handler(request: Request, exc: RadioDisconnectedError):
-    """Return 503 when a radio disconnect race occurs during an operation."""
-    return JSONResponse(status_code=503, content={"detail": "Radio not connected"})
+    """Return 423 when a radio disconnect race occurs during an operation."""
+    return JSONResponse(status_code=423, content={"detail": "Radio not connected"})
 
 @app.middleware("http")
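The rewritten advisory points Windows operators at uvicorn's `--loop none` flag. The same outcome can be sketched with stdlib asyncio alone: switch to the selector event loop policy before any loop exists. This is a sketch of the general technique, not code from the repo; the platform guard matters because `WindowsSelectorEventLoopPolicy` only exists on Windows builds of Python.

```python
import asyncio
import sys


def use_selector_loop_on_windows() -> bool:
    """On Windows, select SelectorEventLoop so add_reader()/add_writer()
    (needed by paho-mqtt via aiomqtt) are available. Must be called before
    any event loop is created. Returns True if the policy was switched."""
    if sys.platform == "win32":
        asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
        return True
    # Non-Windows default loops already support add_reader()/add_writer().
    return False


switched = use_selector_loop_on_windows()
print(switched)  # False on Linux/macOS, True on Windows
```

As the diff's comment notes, this must run before the server creates its loop, which is exactly why the app can only warn rather than fix it after import.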
+20
@@ -0,0 +1,20 @@
+import logging
+
+import aiosqlite
+
+logger = logging.getLogger(__name__)
+
+
+async def migrate(conn: aiosqlite.Connection) -> None:
+    """Add telemetry_routed_hourly boolean column to app_settings."""
+    tables_cursor = await conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
+    if "app_settings" not in {row[0] for row in await tables_cursor.fetchall()}:
+        await conn.commit()
+        return
+
+    col_cursor = await conn.execute("PRAGMA table_info(app_settings)")
+    columns = {row[1] for row in await col_cursor.fetchall()}
+    if "telemetry_routed_hourly" not in columns:
+        await conn.execute(
+            "ALTER TABLE app_settings ADD COLUMN telemetry_routed_hourly INTEGER DEFAULT 0"
+        )
+    await conn.commit()
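The new migration is idempotent: it checks for the table, then for the column, before issuing `ALTER TABLE`. The same pattern can be exercised synchronously with the stdlib `sqlite3` module (a stand-in for aiosqlite here; the table and column names mirror the diff):

```python
import sqlite3


def migrate(conn: sqlite3.Connection) -> None:
    """Add telemetry_routed_hourly to app_settings if missing (idempotent)."""
    tables = {
        row[0]
        for row in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
    }
    if "app_settings" not in tables:
        return  # nothing to migrate on a fresh database
    columns = {row[1] for row in conn.execute("PRAGMA table_info(app_settings)")}
    if "telemetry_routed_hourly" not in columns:
        conn.execute(
            "ALTER TABLE app_settings ADD COLUMN telemetry_routed_hourly INTEGER DEFAULT 0"
        )
    conn.commit()


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_settings (id INTEGER PRIMARY KEY)")
migrate(conn)
migrate(conn)  # second run is a no-op, not a "duplicate column" error
cols = {row[1] for row in conn.execute("PRAGMA table_info(app_settings)")}
print("telemetry_routed_hourly" in cols)  # True
```

The column check is what makes re-running the migration safe: SQLite's `ALTER TABLE ... ADD COLUMN` fails if the column already exists, so guarding on `PRAGMA table_info` output avoids that.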
+10
@@ -448,6 +448,8 @@ class RawPacketDecryptedInfo(BaseModel):
     sender: str | None = None
     channel_key: str | None = None
     contact_key: str | None = None
+    sender_timestamp: int | None = None
+    message: str | None = None
 
 class RawPacketBroadcast(BaseModel):
@@ -540,6 +542,7 @@ class RepeaterStatusResponse(BaseModel):
     flood_dups: int = Field(description="Duplicate flood packets")
     direct_dups: int = Field(description="Duplicate direct packets")
     full_events: int = Field(description="Full event queue count")
+    recv_errors: int | None = Field(default=None, description="Radio-level RX packet errors")
     telemetry_history: list["TelemetryHistoryEntry"] = Field(
         default_factory=list, description="Recent telemetry history snapshots"
     )
@@ -854,6 +857,13 @@ class AppSettings(BaseModel):
             "tracked repeaters so daily checks stay under a 24/day ceiling."
         ),
     )
+    telemetry_routed_hourly: bool = Field(
+        default=False,
+        description=(
+            "When enabled, tracked repeaters with a direct or routed (non-flood) "
+            "path are polled every hour instead of on the normal scheduled interval."
+        ),
+    )
     auto_resend_channel: bool = Field(
         default=False,
         description=(
+6
@@ -366,6 +366,8 @@ async def process_raw_packet(
sender=result["sender"], sender=result["sender"],
channel_key=result.get("channel_key"), channel_key=result.get("channel_key"),
contact_key=result.get("contact_key"), contact_key=result.get("contact_key"),
sender_timestamp=result.get("sender_timestamp"),
message=result.get("message"),
) )
if result["decrypted"] if result["decrypted"]
else None, else None,
@@ -428,6 +430,8 @@ async def _process_group_text(
"sender": decrypted.sender, "sender": decrypted.sender,
"message_id": msg_id, # None if duplicate, msg_id if new "message_id": msg_id, # None if duplicate, msg_id if new
"channel_key": channel.key, "channel_key": channel.key,
"sender_timestamp": decrypted.timestamp,
"message": decrypted.message,
} }
# Couldn't decrypt with any known key # Couldn't decrypt with any known key
@@ -694,6 +698,8 @@ async def _process_direct_message(
"sender": contact.name or contact.public_key[:12], "sender": contact.name or contact.public_key[:12],
"message_id": msg_id, "message_id": msg_id,
"contact_key": contact.public_key, "contact_key": contact.public_key,
"sender_timestamp": result.timestamp,
"message": result.message,
} }
# Couldn't decrypt with any known contact # Couldn't decrypt with any known contact
+39 -11
@@ -1821,6 +1821,7 @@ async def _collect_repeater_telemetry(mc: MeshCore, contact: Contact) -> bool:
"flood_dups": status.get("flood_dups", 0), "flood_dups": status.get("flood_dups", 0),
"direct_dups": status.get("direct_dups", 0), "direct_dups": status.get("direct_dups", 0),
"full_events": status.get("full_evts", 0), "full_events": status.get("full_evts", 0),
"recv_errors": status.get("recv_errors"),
} }
# Best-effort LPP sensor fetch — failure here does not fail the overall # Best-effort LPP sensor fetch — failure here does not fail the overall
@@ -1889,8 +1890,13 @@ async def _collect_repeater_telemetry(mc: MeshCore, contact: Contact) -> bool:
return False return False
async def _run_telemetry_cycle() -> None: async def _run_telemetry_cycle(*, routed_only: bool = False) -> None:
"""Collect one telemetry sample from every tracked repeater.""" """Collect one telemetry sample from tracked repeaters.
When *routed_only* is True, only repeaters whose effective route is
``"direct"`` or ``"override"`` (i.e. not ``"flood"``) are collected.
This is used by the hourly routed-path fast-poll feature.
"""
if not radio_manager.is_connected: if not radio_manager.is_connected:
logger.debug("Telemetry collect: radio not connected, skipping cycle") logger.debug("Telemetry collect: radio not connected, skipping cycle")
return return
@@ -1900,9 +1906,7 @@ async def _run_telemetry_cycle() -> None:
if not tracked: if not tracked:
return return
logger.info("Telemetry collect: starting cycle for %d repeater(s)", len(tracked)) candidates: list[tuple[str, Contact]] = []
collected = 0
for pub_key in tracked: for pub_key in tracked:
contact = await ContactRepository.get_by_key(pub_key) contact = await ContactRepository.get_by_key(pub_key)
if not contact or contact.type != 2: if not contact or contact.type != 2:
@@ -1911,7 +1915,24 @@ async def _run_telemetry_cycle() -> None:
pub_key[:12], pub_key[:12],
) )
continue continue
if routed_only and (not contact.effective_route or contact.effective_route.path_len < 0):
continue
candidates.append((pub_key, contact))
if not candidates:
if routed_only:
logger.debug("Telemetry collect: no routed repeaters to poll this hour")
return
label = "routed" if routed_only else "full"
logger.info(
"Telemetry collect: starting %s cycle for %d repeater(s)",
label,
len(candidates),
)
collected = 0
for _pub_key, contact in candidates:
try: try:
async with radio_manager.radio_operation( async with radio_manager.radio_operation(
"telemetry_collect", "telemetry_collect",
@@ -1923,13 +1944,14 @@ async def _run_telemetry_cycle() -> None:
except RadioOperationBusyError: except RadioOperationBusyError:
logger.debug( logger.debug(
"Telemetry collect: radio busy, skipping %s", "Telemetry collect: radio busy, skipping %s",
pub_key[:12], contact.public_key[:12],
) )
logger.info( logger.info(
"Telemetry collect: cycle complete, %d/%d successful", "Telemetry collect: %s cycle complete, %d/%d successful",
label,
collected, collected,
len(tracked), len(candidates),
) )
@@ -1959,9 +1981,15 @@ async def _maybe_run_scheduled_cycle(now: datetime) -> None:
effective_hours = clamp_telemetry_interval(app_settings.telemetry_interval_hours, tracked_count) effective_hours = clamp_telemetry_interval(app_settings.telemetry_interval_hours, tracked_count)
if effective_hours <= 0: if effective_hours <= 0:
return return
if now.hour % effective_hours != 0:
return is_normal_cycle = now.hour % effective_hours == 0
await _run_telemetry_cycle()
if is_normal_cycle:
# Normal scheduled boundary: collect ALL tracked repeaters.
await _run_telemetry_cycle()
elif app_settings.telemetry_routed_hourly:
# Hourly routed-path fast-poll: only repeaters with a non-flood route.
await _run_telemetry_cycle(routed_only=True)
async def _telemetry_collect_loop() -> None: async def _telemetry_collect_loop() -> None:
+15 -1
@@ -42,7 +42,7 @@ class AppSettingsRepository:
                     advert_interval, last_advert_time, flood_scope,
                     blocked_keys, blocked_names, discovery_blocked_types,
                     tracked_telemetry_repeaters, auto_resend_channel,
-                    telemetry_interval_hours
+                    telemetry_interval_hours, telemetry_routed_hourly
                 FROM app_settings WHERE id = 1
                 """
             ) as cursor:

@@ -113,6 +113,12 @@ class AppSettingsRepository:
         except (KeyError, TypeError, ValueError):
             telemetry_interval_hours = DEFAULT_TELEMETRY_INTERVAL_HOURS

+        # Parse telemetry_routed_hourly boolean
+        try:
+            telemetry_routed_hourly = bool(row["telemetry_routed_hourly"])
+        except (KeyError, TypeError):
+            telemetry_routed_hourly = False
+
         return AppSettings(
             max_radio_contacts=row["max_radio_contacts"],
             auto_decrypt_dm_on_advert=bool(row["auto_decrypt_dm_on_advert"]),

@@ -126,6 +132,7 @@ class AppSettingsRepository:
             tracked_telemetry_repeaters=tracked_telemetry_repeaters,
             auto_resend_channel=auto_resend_channel,
             telemetry_interval_hours=telemetry_interval_hours,
+            telemetry_routed_hourly=telemetry_routed_hourly,
         )

     @staticmethod

@@ -144,6 +151,7 @@ class AppSettingsRepository:
         tracked_telemetry_repeaters: list[str] | None = None,
         auto_resend_channel: bool | None = None,
         telemetry_interval_hours: int | None = None,
+        telemetry_routed_hourly: bool | None = None,
     ) -> None:
         """Apply field updates using an already-acquired connection.

@@ -201,6 +209,10 @@ class AppSettingsRepository:
             updates.append("telemetry_interval_hours = ?")
             params.append(telemetry_interval_hours)
+        if telemetry_routed_hourly is not None:
+            updates.append("telemetry_routed_hourly = ?")
+            params.append(1 if telemetry_routed_hourly else 0)
         if updates:
             query = f"UPDATE app_settings SET {', '.join(updates)} WHERE id = 1"
             async with conn.execute(query, params):

@@ -229,6 +241,7 @@ class AppSettingsRepository:
         tracked_telemetry_repeaters: list[str] | None = None,
         auto_resend_channel: bool | None = None,
         telemetry_interval_hours: int | None = None,
+        telemetry_routed_hourly: bool | None = None,
     ) -> AppSettings:
         """Update app settings. Only provided fields are updated."""
         async with db.tx() as conn:

@@ -246,6 +259,7 @@ class AppSettingsRepository:
                 tracked_telemetry_repeaters=tracked_telemetry_repeaters,
                 auto_resend_channel=auto_resend_channel,
                 telemetry_interval_hours=telemetry_interval_hours,
+                telemetry_routed_hourly=telemetry_routed_hourly,
             )
         return await AppSettingsRepository._get_in_conn(conn)
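The repository builds its UPDATE statement dynamically: only non-None keyword arguments become SET clauses, and booleans are stored as 0/1 integers for SQLite. The pattern, extracted into a standalone sketch (the function name is illustrative, not from the repo):

```python
def build_update(fields: dict) -> tuple[str, list]:
    # Mirror the repository: skip None (meaning "not provided"),
    # and persist booleans as SQLite-friendly 0/1 integers.
    updates, params = [], []
    for name, value in fields.items():
        if value is None:
            continue
        updates.append(f"{name} = ?")
        params.append(int(value) if isinstance(value, bool) else value)
    if not updates:
        return "", []  # the real code simply skips execution in this case
    query = f"UPDATE app_settings SET {', '.join(updates)} WHERE id = 1"
    return query, params

q, p = build_update({"telemetry_interval_hours": 6, "telemetry_routed_hourly": True})
print(q)  # → UPDATE app_settings SET telemetry_interval_hours = ?, telemetry_routed_hourly = ? WHERE id = 1
print(p)  # → [6, 1]
```

Note that `False` must still produce a SET clause (with parameter 0); only `None` means "leave unchanged", which is why the repo checks `is not None` rather than truthiness.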
+6 -6
@@ -66,11 +66,11 @@ async def _resolve_contact_or_404(
 async def _ensure_on_radio(mc, contact: Contact) -> None:
-    """Add a contact to the radio for routing, raising 500 on failure."""
+    """Add a contact to the radio for routing, raising 422 on failure."""
     add_result = await mc.commands.add_contact(contact.to_radio_dict())
     if add_result is not None and add_result.type == EventType.ERROR:
         raise HTTPException(
-            status_code=500, detail=f"Failed to add contact to radio: {add_result.payload}"
+            status_code=422, detail=f"Failed to add contact to radio: {add_result.payload}"
         )

@@ -452,7 +452,7 @@ async def request_trace(public_key: str) -> TraceResponse:
     )
     if result.type == EventType.ERROR:
-        raise HTTPException(status_code=500, detail=f"Failed to send trace: {result.payload}")
+        raise HTTPException(status_code=422, detail=f"Failed to send trace: {result.payload}")

     # Wait for the matching TRACE_DATA event
     event = await mc.wait_for_event(

@@ -462,7 +462,7 @@ async def request_trace(public_key: str) -> TraceResponse:
     )
     if event is None:
-        raise HTTPException(status_code=504, detail="No trace response heard")
+        raise HTTPException(status_code=408, detail="No trace response heard")

     trace = event.payload
     path = trace.get("path", [])

@@ -506,7 +506,7 @@ async def request_path_discovery(public_key: str) -> PathDiscoveryResponse:
     result = await mc.commands.send_path_discovery(contact.public_key)
     if result.type == EventType.ERROR:
         raise HTTPException(
-            status_code=500,
+            status_code=422,
             detail=f"Failed to send path discovery: {result.payload}",
         )

@@ -518,7 +518,7 @@ async def request_path_discovery(public_key: str) -> PathDiscoveryResponse:
         await response_task
     if event is None:
-        raise HTTPException(status_code=504, detail="No path discovery response heard")
+        raise HTTPException(status_code=408, detail="No path discovery response heard")

     payload = event.payload
     forward_path = str(payload.get("out_path") or "")
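Taken together, these route hunks implement the "full rewrite of 5xx => 4xx" from the commit log, so reverse proxies in front of RemoteTerm stop treating radio-side conditions as backend outages. The mapping, summarized as a sketch (this table is a reading of the diffs above, not code from the repo):

```python
# Old -> new HTTP status codes used throughout this changeset.
STATUS_REWRITE = {
    500: 422,  # radio rejected/failed a command -> Unprocessable Content
    502: 422,  # push delivery failed upstream
    503: 423,  # radio/VAPID not initialized yet -> Locked
    504: 408,  # no response heard in time -> Request Timeout
}

def rewritten(old: int) -> int:
    """Post-changeset status for a pre-changeset one; others pass through."""
    return STATUS_REWRITE.get(old, old)

print(rewritten(504))  # → 408
print(rewritten(400))  # → 400 (client errors were already 4xx)
```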
+4
@@ -274,6 +274,10 @@ def _validate_apprise_config(config: dict) -> None:
                 status_code=400, detail=f"Invalid format string in {field}"
             ) from None

+    markdown_format = config.get("markdown_format")
+    if markdown_format is not None:
+        config["markdown_format"] = bool(markdown_format)
+

 def _validate_webhook_config(config: dict) -> None:
     """Validate webhook config blob."""
+4
@@ -128,11 +128,15 @@ async def get_raw_packet(packet_id: int) -> RawPacketDetail:
             sender=message.sender_name,
             channel_key=message.conversation_key,
             contact_key=message.sender_key,
+            sender_timestamp=message.sender_timestamp,
+            message=message.text,
         )
     else:
         decrypted_info = RawPacketDecryptedInfo(
             sender=message.sender_name,
             contact_key=message.conversation_key,
+            sender_timestamp=message.sender_timestamp,
+            message=message.text,
         )

     return RawPacketDetail(
+5 -5
@@ -48,7 +48,7 @@ async def vapid_public_key() -> VapidPublicKeyResponse:
"""Return the VAPID public key for browser PushManager.subscribe().""" """Return the VAPID public key for browser PushManager.subscribe()."""
key = get_vapid_public_key() key = get_vapid_public_key()
if not key: if not key:
raise HTTPException(status_code=503, detail="VAPID keys not initialized") raise HTTPException(status_code=423, detail="VAPID keys not initialized")
return VapidPublicKeyResponse(public_key=key) return VapidPublicKeyResponse(public_key=key)
@@ -103,7 +103,7 @@ async def test_push(subscription_id: str) -> dict:
vapid_key = get_vapid_private_key() vapid_key = get_vapid_private_key()
if not vapid_key: if not vapid_key:
raise HTTPException(status_code=503, detail="VAPID keys not initialized") raise HTTPException(status_code=423, detail="VAPID keys not initialized")
payload = json.dumps( payload = json.dumps(
{ {
@@ -127,7 +127,7 @@ async def test_push(subscription_id: str) -> dict:
) )
return {"status": "sent"} return {"status": "sent"}
except TimeoutError: except TimeoutError:
raise HTTPException(status_code=504, detail="Push delivery timed out") from None raise HTTPException(status_code=408, detail="Push delivery timed out") from None
except WebPushException as e: except WebPushException as e:
status_code = getattr(getattr(e, "response", None), "status_code", 0) status_code = getattr(getattr(e, "response", None), "status_code", 0)
if status_code in (403, 404, 410): if status_code in (403, 404, 410):
@@ -143,10 +143,10 @@ async def test_push(subscription_id: str) -> dict:
"Re-enable push from a conversation header.", "Re-enable push from a conversation header.",
) from None ) from None
logger.warning("Test push failed: %s", e) logger.warning("Test push failed: %s", e)
raise HTTPException(status_code=502, detail=f"Push delivery failed: {e}") from None raise HTTPException(status_code=422, detail=f"Push delivery failed: {e}") from None
except Exception as e: except Exception as e:
logger.warning("Test push failed: %s", e) logger.warning("Test push failed: %s", e)
raise HTTPException(status_code=502, detail=f"Push delivery failed: {e}") from None raise HTTPException(status_code=422, detail=f"Push delivery failed: {e}") from None
# ── Global push conversation management ────────────────────────────────── # ── Global push conversation management ──────────────────────────────────
+15 -15
@@ -338,7 +338,7 @@ async def get_radio_config() -> RadioConfigResponse:
     info = mc.self_info
     if not info:
-        raise HTTPException(status_code=503, detail="Radio info not available")
+        raise HTTPException(status_code=423, detail="Radio info not available")

     adv_loc_policy = info.get("adv_loc_policy", 1)
     advert_location_source: AdvertLocationSource = "off" if adv_loc_policy == 0 else "current"

@@ -380,7 +380,7 @@ async def update_radio_config(update: RadioConfigUpdate) -> RadioConfigResponse:
     except PathHashModeUnsupportedError as exc:
         raise HTTPException(status_code=400, detail=str(exc)) from exc
     except RadioCommandRejectedError as exc:
-        raise HTTPException(status_code=500, detail=str(exc)) from exc
+        raise HTTPException(status_code=422, detail=str(exc)) from exc

     return await get_radio_config()

@@ -430,7 +430,7 @@ async def set_private_key(update: PrivateKeyUpdate) -> dict:
             export_and_store_private_key_fn=export_and_store_private_key,
         )
     except (RadioCommandRejectedError, KeystoreRefreshError) as exc:
-        raise HTTPException(status_code=500, detail=str(exc)) from exc
+        raise HTTPException(status_code=422, detail=str(exc)) from exc

     return {"status": "ok"}

@@ -454,7 +454,7 @@ async def send_advertisement(request: RadioAdvertiseRequest | None = None) -> di
         success = await do_send_advertisement(mc, force=True, mode=mode)
     if not success:
-        raise HTTPException(status_code=500, detail=f"Failed to send {mode} advertisement")
+        raise HTTPException(status_code=422, detail=f"Failed to send {mode} advertisement")
     return {"status": "ok"}

@@ -486,7 +486,7 @@ async def discover_mesh(request: RadioDiscoveryRequest) -> RadioDiscoveryRespons
             tag=tag,
         )
         if send_result is None or send_result.type == EventType.ERROR:
-            raise HTTPException(status_code=500, detail="Failed to start mesh discovery")
+            raise HTTPException(status_code=422, detail="Failed to start mesh discovery")

         deadline = _monotonic() + DISCOVERY_WINDOW_SECONDS
         results_by_key: dict[str, RadioDiscoveryResult] = {}

@@ -538,7 +538,7 @@ async def trace_path(request: RadioTraceRequest) -> RadioTraceResponse:
     async with radio_manager.radio_operation("radio_trace", pause_polling=True) as mc:
         local_public_key = str((mc.self_info or {}).get("public_key") or "").lower()
         if len(local_public_key) != 64:
-            raise HTTPException(status_code=503, detail="Local radio public key is unavailable")
+            raise HTTPException(status_code=423, detail="Local radio public key is unavailable")
         local_name = (mc.self_info or {}).get("name")

         response_task = asyncio.create_task(

@@ -555,13 +555,13 @@ async def trace_path(request: RadioTraceRequest) -> RadioTraceResponse:
             flags=trace_flags,
         )
         if send_result is None or send_result.type == EventType.ERROR:
-            raise HTTPException(status_code=500, detail="Failed to send trace")
+            raise HTTPException(status_code=422, detail="Failed to send trace")

         timeout_seconds = _trace_timeout_seconds(send_result)
         try:
             event = await asyncio.wait_for(response_task, timeout=timeout_seconds)
         except TimeoutError as exc:
-            raise HTTPException(status_code=504, detail="No trace response heard") from exc
+            raise HTTPException(status_code=408, detail="No trace response heard") from exc
         finally:
             if not response_task.done():
                 response_task.cancel()

@@ -569,12 +569,12 @@ async def trace_path(request: RadioTraceRequest) -> RadioTraceResponse:
                 await response_task

     if event is None:
-        raise HTTPException(status_code=504, detail="No trace response heard")
+        raise HTTPException(status_code=408, detail="No trace response heard")

     payload = event.payload if isinstance(event.payload, dict) else {}
     path_len = payload.get("path_len")
     if not isinstance(path_len, int):
-        raise HTTPException(status_code=500, detail="Trace response was malformed")
+        raise HTTPException(status_code=422, detail="Trace response was malformed")

     raw_path = payload.get("path")
     path_nodes = raw_path if isinstance(raw_path, list) else []

@@ -588,7 +588,7 @@ async def trace_path(request: RadioTraceRequest) -> RadioTraceResponse:
     hashed_nodes = path_nodes[:-1] if final_local_node is not None else path_nodes
     if len(hashed_nodes) < len(trace_nodes):
-        raise HTTPException(status_code=500, detail="Trace response was incomplete")
+        raise HTTPException(status_code=422, detail="Trace response was incomplete")

     nodes: list[RadioTraceNode] = []
     for index, trace_node in enumerate(trace_nodes):

@@ -641,13 +641,13 @@ async def _attempt_reconnect() -> dict:
     except Exception as e:
         logger.exception("Post-connect setup failed after reconnect")
         raise HTTPException(
-            status_code=503,
+            status_code=423,
             detail=f"Radio connected but setup failed: {e}",
         ) from e
     if not success:
         raise HTTPException(
-            status_code=503, detail="Failed to reconnect. Check radio connection and power."
+            status_code=423, detail="Failed to reconnect. Check radio connection and power."
        )
     return {"status": "ok", "message": "Reconnected successfully", "connected": True}

@@ -702,14 +702,14 @@ async def reconnect_radio() -> dict:
         logger.info("Radio connected but setup incomplete, retrying setup")
         try:
             if not await _prepare_connected(broadcast_on_success=True):
-                raise HTTPException(status_code=503, detail="Radio connection is paused")
+                raise HTTPException(status_code=423, detail="Radio connection is paused")
             return {"status": "ok", "message": "Setup completed", "connected": True}
         except HTTPException:
             raise
         except Exception as e:
             logger.exception("Post-connect setup failed")
             raise HTTPException(
-                status_code=503,
+                status_code=423,
                 detail=f"Radio connected but setup failed: {e}",
             ) from e
+3 -2
@@ -113,7 +113,7 @@ async def repeater_status(public_key: str) -> RepeaterStatusResponse:
logger.debug("LPP sensor fetch failed for %s (non-fatal): %s", public_key[:12], e) logger.debug("LPP sensor fetch failed for %s (non-fatal): %s", public_key[:12], e)
if status is None: if status is None:
raise HTTPException(status_code=504, detail="No status response from repeater") raise HTTPException(status_code=408, detail="No status response from repeater")
response = RepeaterStatusResponse( response = RepeaterStatusResponse(
battery_volts=status.get("bat", 0) / 1000.0, battery_volts=status.get("bat", 0) / 1000.0,
@@ -133,6 +133,7 @@ async def repeater_status(public_key: str) -> RepeaterStatusResponse:
flood_dups=status.get("flood_dups", 0), flood_dups=status.get("flood_dups", 0),
direct_dups=status.get("direct_dups", 0), direct_dups=status.get("direct_dups", 0),
full_events=status.get("full_evts", 0), full_events=status.get("full_evts", 0),
recv_errors=status.get("recv_errors"),
) )
# Record to telemetry history as a JSON blob (best-effort) # Record to telemetry history as a JSON blob (best-effort)
@@ -221,7 +222,7 @@ async def repeater_lpp_telemetry(public_key: str) -> RepeaterLppTelemetryRespons
) )
if telemetry is None: if telemetry is None:
raise HTTPException(status_code=504, detail="No telemetry response from repeater") raise HTTPException(status_code=408, detail="No telemetry response from repeater")
sensors: list[LppSensor] = [] sensors: list[LppSensor] = []
for entry in telemetry: for entry in telemetry:
+3 -2
@@ -58,7 +58,7 @@ async def room_status(public_key: str) -> RepeaterStatusResponse:
     status = await mc.commands.req_status_sync(contact.public_key, timeout=10, min_timeout=5)
     if status is None:
-        raise HTTPException(status_code=504, detail="No status response from room server")
+        raise HTTPException(status_code=408, detail="No status response from room server")

     return RepeaterStatusResponse(
         battery_volts=status.get("bat", 0) / 1000.0,

@@ -78,6 +78,7 @@ async def room_status(public_key: str) -> RepeaterStatusResponse:
         flood_dups=status.get("flood_dups", 0),
         direct_dups=status.get("direct_dups", 0),
         full_events=status.get("full_evts", 0),
+        recv_errors=status.get("recv_errors"),
     )

@@ -97,7 +98,7 @@ async def room_lpp_telemetry(public_key: str) -> RepeaterLppTelemetryResponse:
     )
     if telemetry is None:
-        raise HTTPException(status_code=504, detail="No telemetry response from room server")
+        raise HTTPException(status_code=408, detail="No telemetry response from room server")

     sensors = [
         LppSensor(
+1 -1
@@ -291,7 +291,7 @@ async def send_contact_cli_command(
     if send_result.type == EventType.ERROR:
         raise HTTPException(
-            status_code=500, detail=f"Failed to send command: {send_result.payload}"
+            status_code=422, detail=f"Failed to send command: {send_result.payload}"
         )

     response_event = await fetch_contact_cli_response(mc, contact.public_key[:12])
+44 -5
@@ -73,6 +73,13 @@ class AppSettingsUpdate(BaseModel):
"based on the current tracked-repeater count." "based on the current tracked-repeater count."
), ),
) )
telemetry_routed_hourly: bool | None = Field(
default=None,
description=(
"When enabled, tracked repeaters with a direct or routed (non-flood) "
"path are polled every hour instead of on the normal scheduled interval."
),
)
class BlockKeyRequest(BaseModel): class BlockKeyRequest(BaseModel):
@@ -126,7 +133,18 @@ class TelemetrySchedule(BaseModel):
max_tracked: int = Field(description="Maximum number of repeaters that can be tracked") max_tracked: int = Field(description="Maximum number of repeaters that can be tracked")
next_run_at: int | None = Field( next_run_at: int | None = Field(
default=None, default=None,
description="Unix timestamp (UTC seconds) of the next scheduled cycle", description="Unix timestamp (UTC seconds) of the next scheduled flood cycle",
)
routed_hourly: bool = Field(
default=False,
description="Whether hourly routed/direct-path telemetry is enabled",
)
next_routed_run_at: int | None = Field(
default=None,
description=(
"Unix timestamp (UTC seconds) of the next hourly routed/direct check, "
"or None when routed_hourly is off or no repeaters are tracked"
),
) )
@@ -140,20 +158,27 @@ class TrackedTelemetryResponse(BaseModel):
schedule: TelemetrySchedule = Field(description="Current scheduling state") schedule: TelemetrySchedule = Field(description="Current scheduling state")
def _build_schedule(tracked_count: int, preferred_hours: int | None) -> TelemetrySchedule: def _build_schedule(
tracked_count: int,
preferred_hours: int | None,
routed_hourly: bool = False,
) -> TelemetrySchedule:
pref = ( pref = (
preferred_hours preferred_hours
if preferred_hours in TELEMETRY_INTERVAL_OPTIONS_HOURS if preferred_hours in TELEMETRY_INTERVAL_OPTIONS_HOURS
else DEFAULT_TELEMETRY_INTERVAL_HOURS else DEFAULT_TELEMETRY_INTERVAL_HOURS
) )
effective = clamp_telemetry_interval(pref, tracked_count) effective = clamp_telemetry_interval(pref, tracked_count)
has_tracked = tracked_count > 0
return TelemetrySchedule( return TelemetrySchedule(
preferred_hours=pref, preferred_hours=pref,
effective_hours=effective, effective_hours=effective,
options=legal_interval_options(tracked_count), options=legal_interval_options(tracked_count),
tracked_count=tracked_count, tracked_count=tracked_count,
max_tracked=MAX_TRACKED_TELEMETRY_REPEATERS, max_tracked=MAX_TRACKED_TELEMETRY_REPEATERS,
next_run_at=next_run_timestamp_utc(effective) if tracked_count > 0 else None, next_run_at=next_run_timestamp_utc(effective) if has_tracked else None,
routed_hourly=routed_hourly,
next_routed_run_at=(next_run_timestamp_utc(1) if has_tracked and routed_hourly else None),
) )
@@ -216,6 +241,11 @@ async def update_settings(update: AppSettingsUpdate) -> AppSettings:
logger.info("Updating telemetry_interval_hours to %d", raw_interval) logger.info("Updating telemetry_interval_hours to %d", raw_interval)
kwargs["telemetry_interval_hours"] = raw_interval kwargs["telemetry_interval_hours"] = raw_interval
# Telemetry routed hourly
if update.telemetry_routed_hourly is not None:
logger.info("Updating telemetry_routed_hourly to %s", update.telemetry_routed_hourly)
kwargs["telemetry_routed_hourly"] = update.telemetry_routed_hourly
# Flood scope # Flood scope
flood_scope_changed = False flood_scope_changed = False
if update.flood_scope is not None: if update.flood_scope is not None:
@@ -328,7 +358,11 @@ async def toggle_tracked_telemetry(request: TrackedTelemetryRequest) -> TrackedT
     return TrackedTelemetryResponse(
         tracked_telemetry_repeaters=new_list,
         names=await _resolve_names(new_list),
-        schedule=_build_schedule(len(new_list), settings.telemetry_interval_hours),
+        schedule=_build_schedule(
+            len(new_list),
+            settings.telemetry_interval_hours,
+            settings.telemetry_routed_hourly,
+        ),
     )

 # Validate it's a repeater
@@ -355,7 +389,11 @@ async def toggle_tracked_telemetry(request: TrackedTelemetryRequest) -> TrackedT
     return TrackedTelemetryResponse(
         tracked_telemetry_repeaters=new_list,
         names=await _resolve_names(new_list),
-        schedule=_build_schedule(len(new_list), settings.telemetry_interval_hours),
+        schedule=_build_schedule(
+            len(new_list),
+            settings.telemetry_interval_hours,
+            settings.telemetry_routed_hourly,
+        ),
     )
@@ -371,4 +409,5 @@ async def get_telemetry_schedule() -> TelemetrySchedule:
     return _build_schedule(
         len(app_settings.tracked_telemetry_repeaters),
         app_settings.telemetry_interval_hours,
+        app_settings.telemetry_routed_hourly,
     )
+16 -16
@@ -159,7 +159,7 @@ async def send_channel_message_with_effective_scope(
             override_result.payload,
         )
         raise HTTPException(
-            status_code=500,
+            status_code=422,
             detail=(
                 f"Failed to apply regional override {override_scope!r} before {action_label}: "
                 f"{override_result.payload}"
@@ -189,7 +189,7 @@ async def send_channel_message_with_effective_scope(
             phm_result.payload,
         )
         raise HTTPException(
-            status_code=500,
+            status_code=422,
             detail=(
                 f"Failed to apply path hash mode override before {action_label}: "
                 f"{phm_result.payload}"
@@ -233,7 +233,7 @@ async def send_channel_message_with_effective_scope(
             set_result.payload,
         )
         raise HTTPException(
-            status_code=500,
+            status_code=422,
             detail=f"Failed to configure channel on radio before {action_label}",
         )
     radio_manager.note_channel_slot_loaded(channel_key, channel_slot)
@@ -256,7 +256,7 @@ async def send_channel_message_with_effective_scope(
             action_label,
             channel.name,
         )
-        raise HTTPException(status_code=504, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
+        raise HTTPException(status_code=408, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
     if send_result.type == EventType.ERROR:
         logger.error(
             "Radio returned error during %s for channel %s: %s",
@@ -598,10 +598,10 @@ async def send_direct_message_to_contact(
             "No response from radio after direct send to %s; send outcome is unknown",
             contact.public_key[:12],
         )
-        raise HTTPException(status_code=504, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
+        raise HTTPException(status_code=408, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
     if result.type == EventType.ERROR:
-        raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")
+        raise HTTPException(status_code=422, detail=f"Failed to send message: {result.payload}")

     message = await create_outgoing_direct_message(
         conversation_key=contact.public_key.lower(),
@@ -613,7 +613,7 @@ async def send_direct_message_to_contact(
         )
         if message is None:
             raise HTTPException(
-                status_code=500,
+                status_code=422,
                 detail="Failed to store outgoing message - unexpected duplicate",
             )
     finally:
@@ -626,7 +626,7 @@ async def send_direct_message_to_contact(
         )
     if sent_at is None or sender_timestamp is None or message is None or result is None:
-        raise HTTPException(status_code=500, detail="Failed to store outgoing message")
+        raise HTTPException(status_code=422, detail="Failed to store outgoing message")

     await contact_repository.update_last_contacted(contact.public_key.lower(), sent_at)
@@ -791,7 +791,7 @@ async def send_channel_message_to_channel(
         )
         if outgoing_message is None:
             raise HTTPException(
-                status_code=500,
+                status_code=422,
                 detail="Failed to store outgoing message - unexpected duplicate",
             )
@@ -813,11 +813,11 @@ async def send_channel_message_to_channel(
                 "No response from radio after channel send to %s; send outcome is unknown",
                 channel.name,
             )
-            raise HTTPException(status_code=504, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
+            raise HTTPException(status_code=408, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
         if result.type == EventType.ERROR:
             raise HTTPException(
-                status_code=500, detail=f"Failed to send message: {result.payload}"
+                status_code=422, detail=f"Failed to send message: {result.payload}"
             )
     except Exception:
         if outgoing_message is not None:
@@ -834,7 +834,7 @@ async def send_channel_message_to_channel(
         )
     if sent_at is None or sender_timestamp is None or outgoing_message is None:
-        raise HTTPException(status_code=500, detail="Failed to store outgoing message")
+        raise HTTPException(status_code=422, detail="Failed to store outgoing message")

     outgoing_message = await build_stored_outgoing_channel_message(
         message_id=outgoing_message.id,
@@ -928,7 +928,7 @@ async def resend_channel_message_record(
         )
         if new_message is None:
             raise HTTPException(
-                status_code=500,
+                status_code=422,
                 detail="Failed to store resent message - unexpected duplicate",
             )
@@ -949,10 +949,10 @@ async def resend_channel_message_record(
                 "No response from radio after channel resend to %s; send outcome is unknown",
                 channel.name,
             )
-            raise HTTPException(status_code=504, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
+            raise HTTPException(status_code=408, detail=NO_RADIO_RESPONSE_AFTER_SEND_DETAIL)
         if result.type == EventType.ERROR:
             raise HTTPException(
-                status_code=500,
+                status_code=422,
                 detail=f"Failed to resend message: {result.payload}",
             )
     except Exception:
@@ -971,7 +971,7 @@ async def resend_channel_message_record(
     if new_timestamp:
         if sent_at is None or new_message is None:
-            raise HTTPException(status_code=500, detail="Failed to assign resend timestamp")
+            raise HTTPException(status_code=422, detail="Failed to assign resend timestamp")
         new_message = await build_stored_outgoing_channel_message(
             message_id=new_message.id,
+3 -3
@@ -52,12 +52,12 @@ class RadioRuntime:
     def require_connected(self):
         """Return MeshCore when available, mirroring existing HTTP semantics."""
         if self.is_setup_in_progress:
-            raise HTTPException(status_code=503, detail="Radio is initializing")
+            raise HTTPException(status_code=423, detail="Radio is initializing")
         if not self.is_connected:
-            raise HTTPException(status_code=503, detail="Radio not connected")
+            raise HTTPException(status_code=423, detail="Radio not connected")
         mc = self.meshcore
         if mc is None:
-            raise HTTPException(status_code=503, detail="Radio not connected")
+            raise HTTPException(status_code=423, detail="Radio not connected")
         return mc

     @asynccontextmanager
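The status-code switch above (503 → 423) is small but easy to misread in side-by-side form. A minimal sketch of the post-change `require_connected` semantics, with a stand-in `HTTPException` since this sketch doesn't import FastAPI:

```python
# Hedged sketch: after this change, both "initializing" and "not
# connected" surface as HTTP 423 (Locked) rather than 503.
class HTTPException(Exception):  # stand-in for fastapi.HTTPException
    def __init__(self, status_code: int, detail: str):
        self.status_code = status_code
        self.detail = detail

class RadioRuntime:
    def __init__(self, setup_in_progress: bool = False, meshcore=None):
        self.is_setup_in_progress = setup_in_progress
        self.meshcore = meshcore
        self.is_connected = meshcore is not None

    def require_connected(self):
        if self.is_setup_in_progress:
            raise HTTPException(status_code=423, detail="Radio is initializing")
        if not self.is_connected or self.meshcore is None:
            raise HTTPException(status_code=423, detail="Radio not connected")
        return self.meshcore

try:
    RadioRuntime().require_connected()
except HTTPException as e:
    assert e.status_code == 423  # no radio attached -> 423, not 503
```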
-118
@@ -1,118 +0,0 @@
# TCP Proxy Architecture
MeshCore companion protocol proxy: emulates a MeshCore radio over TCP,
translating the binary companion protocol into in-process RemoteTerm
operations. MeshCore clients (mobile apps, meshcore-cli, meshcore-ha)
connect to it and interact with RemoteTerm as if it were a physical radio.
Enable with `MESHCORE_TCP_PROXY_ENABLED=true`.
## Module Map
```text
app/tcp_proxy/
├── __init__.py # start_tcp_proxy() / stop_tcp_proxy() lifecycle
├── protocol.py # Constants, FrameParser, frame helpers
├── encoder.py # Binary builders: contact, self_info, device_info
├── session.py # ProxySession: per-client command dispatch + event handlers
├── server.py # TCP server lifecycle, session registry, dispatch_event()
└── AGENTS_tcp_proxy.md # This file
```
## Protocol (protocol.py)
- Frame format: `0x3C`/`0x3E` marker + 2-byte LE length + payload
- Command constants (`CMD_*`): client → proxy (first payload byte)
- Response constants (`RESP_*`): proxy → client
- Push constants (`PUSH_*`): unsolicited proxy → client notifications
- `FrameParser`: stateful streaming frame decoder (mirrors meshcore_py `tcp_cx.py`)
- Helpers: `frame_response`, `build_ok`, `build_error`, `pad`, `encode_path_byte`
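For concreteness, the frame layout can be sketched in a few lines (a minimal reimplementation of the `frame_response` helper listed above, not the module itself):

```python
# Frame = marker byte + 2-byte little-endian length + payload.
FRAME_TX = 0x3C  # client -> radio
FRAME_RX = 0x3E  # radio -> client
RESP_OK = 0x00

def frame_response(payload: bytes) -> bytes:
    """Wrap a payload in a 0x3E frame for sending to the client."""
    return bytes([FRAME_RX]) + len(payload).to_bytes(2, "little") + payload

wire = frame_response(bytes([RESP_OK]))
assert wire == b"\x3e\x01\x00\x00"  # marker, len=1 (LE), RESP_OK byte
```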
## Encoder (encoder.py)
Stateless binary serializers that build companion-protocol payloads from
domain data. All functions return raw `bytes` (no frame wrapping).
- `build_contact` / `build_contact_from_dict`: Contact → RESP_CONTACT / PUSH_NEW_ADVERT
- `build_self_info` / `build_self_info_from_runtime`: radio config → RESP_SELF_INFO
- `build_device_info`: → RESP_DEVICE_INFO (fixed proxy identity)
## Session (session.py)
One `ProxySession` per connected TCP client. Maintains per-client state:
- **contacts**: cached favorite contacts from DB
- **channels**: cached channel list
- **channel_slots** / **key_to_idx**: bidirectional channel index ↔ key mapping
- **_msg_queue**: queued incoming messages for the pull-based delivery model
### Command Dispatch
Command byte → handler method via class-level dispatch table. Unsupported
commands return `ERR_UNSUPPORTED`.
### Message Delivery (Pull Model)
MeshCore mobile apps use a pull model for incoming messages:
1. Broadcast event arrives → session builds a V3 message frame → queues it
2. Session sends `PUSH_MSG_WAITING` (0x83) to notify the client
3. Client calls `CMD_SYNC_NEXT_MESSAGE` (0x0A) to pull the message
4. Session dequeues and sends the frame
5. Client calls again → `RESP_NO_MORE_MSGS` when queue is empty
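The five steps above condense to a small queue discipline. The class and method names below are hypothetical; the real `ProxySession` stores fully framed bytes in `_msg_queue`, and the constants match `protocol.py`:

```python
PUSH_MSG_WAITING = 0x83
RESP_NO_MORE_MSGS = 0x0A

class PullQueue:
    def __init__(self) -> None:
        self.queue: list[bytes] = []

    def on_message(self, frame: bytes) -> bytes:
        # Steps 1-2: queue the frame, then notify the client to sync.
        self.queue.append(frame)
        return bytes([PUSH_MSG_WAITING])

    def sync_next(self) -> bytes:
        # Steps 3-5: client pulls until RESP_NO_MORE_MSGS comes back.
        if self.queue:
            return self.queue.pop(0)
        return bytes([RESP_NO_MORE_MSGS])

q = PullQueue()
assert q.on_message(b"\x10msg") == b"\x83"   # PUSH_MSG_WAITING
assert q.sync_next() == b"\x10msg"            # dequeued frame
assert q.sync_next() == b"\x0a"               # RESP_NO_MORE_MSGS
```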
### DM Send Flow
1. Parse destination prefix/key from binary payload
2. Resolve to full public key via contacts cache
3. Send immediate `RESP_MSG_SENT` + `PUSH_ACK` (fake ACK) so client doesn't retry
4. Fire-and-forget `_do_send_dm()` task calls `send_direct_message_to_contact()`
5. RemoteTerm handles actual radio lock, retries, and ACK tracking
## Server (server.py)
- TCP server lifecycle (`start` / `stop`) following the `radio_stats.py` pattern
- Session registry (`register` / `unregister`)
- `dispatch_event()`: called from `broadcast_event()` in `websocket.py` for
`message`, `message_acked`, and `contact` events
## Data Flow
```
Client → TCP frame → FrameParser → ProxySession._dispatch
→ command handler → repository/service call → binary response → TCP frame
RemoteTerm event → broadcast_event → dispatch_event
→ ProxySession.on_event_* → push frame → TCP frame
```
## Integration Points
- `app/config.py`: `tcp_proxy_enabled`, `tcp_proxy_bind`, `tcp_proxy_port`
- `app/main.py`: conditional `start_tcp_proxy()` / `stop_tcp_proxy()` in lifespan
- `app/websocket.py`: `dispatch_event()` hook in `broadcast_event()` for message/ack/contact
## Design Constraints
- Never mutate RemoteTerm state from SET_CHANNEL (local slot mapping only)
- Only sync favorite contacts to clients
- Channel slots: pre-load favorites only, ERR_NOT_FOUND for empty slots
- DM sends return immediate fake ACK (RemoteTerm handles retries)
- Message delivery uses the pull model (PUSH_MSG_WAITING → SYNC_NEXT_MESSAGE)
## Config
| Variable | Default | Description |
|----------|---------|-------------|
| `MESHCORE_TCP_PROXY_ENABLED` | `false` | Enable the TCP companion protocol proxy |
| `MESHCORE_TCP_PROXY_BIND` | `0.0.0.0` | Bind address for the proxy TCP server |
| `MESHCORE_TCP_PROXY_PORT` | `5001` | Port for the proxy TCP server |
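Putting the table together, enabling the proxy is just three environment variables (the client invocation you point at the port afterwards, e.g. meshcore-cli or a mobile app, is up to you):

```shell
# Enable the TCP companion-protocol proxy with the defaults from the
# table above (bind/port shown explicitly for clarity).
export MESHCORE_TCP_PROXY_ENABLED=true
export MESHCORE_TCP_PROXY_BIND=0.0.0.0
export MESHCORE_TCP_PROXY_PORT=5001
```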
## Tests
```text
tests/
├── test_tcp_proxy_protocol.py # FrameParser, frame helpers (pure, no async)
├── test_tcp_proxy_encoder.py # Binary encoding against expected wire bytes
├── test_tcp_proxy_session.py # Command handlers with mocked radio + repos
└── test_tcp_proxy_integration.py # Real TCP server, end-to-end frame exchange
```
-28
@@ -1,28 +0,0 @@
"""MeshCore TCP companion protocol proxy.
Emulates a MeshCore companion radio over TCP, translating the binary
protocol into in-process RemoteTerm operations. Enable with
``MESHCORE_TCP_PROXY_ENABLED=true``.
"""
from __future__ import annotations
import logging
logger = logging.getLogger(__name__)
async def start_tcp_proxy() -> None:
"""Start the TCP proxy server using settings from config."""
from app.config import settings
from .server import start
await start(settings.tcp_proxy_bind, settings.tcp_proxy_port)
async def stop_tcp_proxy() -> None:
"""Stop the TCP proxy server."""
from .server import stop
await stop()
-165
@@ -1,165 +0,0 @@
"""Binary encoders that build companion-protocol response payloads.
All functions return raw ``bytes`` payloads (without frame wrapping).
The caller is responsible for framing via :func:`protocol.frame_response`.
"""
from __future__ import annotations
import struct
import time
from typing import Any
from .protocol import (
PROXY_FW_BUILD,
PROXY_FW_VER,
PROXY_FW_VERSION,
PROXY_MAX_CHANNELS,
PROXY_MAX_CONTACTS_RAW,
PROXY_MODEL,
PUSH_NEW_ADVERT,
RESP_CONTACT,
RESP_DEVICE_INFO,
RESP_SELF_INFO,
encode_path_byte,
pad,
)
def build_contact(
public_key: str,
*,
contact_type: int = 0,
favorite: bool = False,
direct_path: str | None = None,
direct_path_len: int = -1,
direct_path_hash_mode: int = -1,
name: str | None = None,
last_advert: int = 0,
lat: float = 0.0,
lon: float = 0.0,
lastmod: int | None = None,
push: bool = False,
) -> bytes:
"""Build a ``RESP_CONTACT`` (or ``PUSH_NEW_ADVERT``) payload.
Args:
push: If True, use ``PUSH_NEW_ADVERT`` (0x8A) instead of
``RESP_CONTACT`` (0x03) as the leading byte.
"""
out = bytearray()
out.append(PUSH_NEW_ADVERT if push else RESP_CONTACT)
out.extend(pad(bytes.fromhex(public_key), 32))
out.append(contact_type)
flags = 0x01 if favorite else 0x00
out.append(flags)
if direct_path_len >= 0 and direct_path_hash_mode >= 0:
out.append(encode_path_byte(direct_path_len, direct_path_hash_mode))
else:
out.append(0xFF) # no route known
path_bytes = bytes.fromhex(direct_path) if direct_path else b""
out.extend(pad(path_bytes, 64))
out.extend(pad((name or "").encode("utf-8", "replace"), 32))
out.extend(struct.pack("<I", last_advert))
out.extend(struct.pack("<i", int(lat * 1e6)))
out.extend(struct.pack("<i", int(lon * 1e6)))
out.extend(struct.pack("<I", lastmod or int(time.time())))
return bytes(out)
def build_contact_from_dict(data: dict[str, Any], *, push: bool = False) -> bytes:
"""Build a contact payload from either a ``Contact`` model dict or a
WS event ``data`` dict. Accepts both snake_case model fields and
the shapes produced by Pydantic JSON serialisation."""
return build_contact(
public_key=data["public_key"],
contact_type=data.get("type") or 0,
favorite=bool(data.get("favorite")),
direct_path=data.get("direct_path") or None,
direct_path_len=data.get("direct_path_len", -1),
direct_path_hash_mode=data.get("direct_path_hash_mode", -1),
name=data.get("name"),
last_advert=int(data.get("last_advert") or 0),
lat=float(data.get("lat") or 0),
lon=float(data.get("lon") or 0),
lastmod=int(data.get("lastmod") or data.get("first_seen") or 0) or None,
push=push,
)
def build_self_info(
*,
public_key: str = "00" * 32,
name: str = "RemoteTerm",
tx_power: int = 20,
max_tx_power: int = 22,
lat: float = 0.0,
lon: float = 0.0,
multi_acks: bool = False,
advert_loc: bool = False,
radio_freq: float = 915.0,
radio_bw: float = 250.0,
radio_sf: int = 10,
radio_cr: int = 7,
) -> bytes:
"""Build a ``RESP_SELF_INFO`` payload (response to ``CMD_APP_START``)."""
out = bytearray()
out.append(RESP_SELF_INFO)
out.append(1) # adv_type = CHAT
out.append(tx_power)
out.append(max_tx_power)
out.extend(pad(bytes.fromhex(public_key), 32))
out.extend(struct.pack("<i", int(lat * 1e6)))
out.extend(struct.pack("<i", int(lon * 1e6)))
out.append(1 if multi_acks else 0)
out.append(1 if advert_loc else 0)
out.append(0) # telemetry_mode
out.append(0) # manual_add_contacts
out.extend(struct.pack("<I", int(radio_freq * 1000)))
out.extend(struct.pack("<I", int(radio_bw * 1000)))
out.append(radio_sf)
out.append(radio_cr)
out.extend(name.encode("utf-8"))
return bytes(out)
def build_self_info_from_runtime(self_info: dict[str, Any]) -> bytes:
"""Build ``RESP_SELF_INFO`` from ``radio_runtime.self_info``."""
return build_self_info(
public_key=self_info.get("public_key") or "00" * 32,
name=self_info.get("name") or "RemoteTerm",
tx_power=self_info.get("tx_power") or 20,
max_tx_power=self_info.get("max_tx_power") or 22,
lat=float(self_info.get("adv_lat") or 0),
lon=float(self_info.get("adv_lon") or 0),
multi_acks=bool(self_info.get("multi_acks")),
advert_loc=bool(self_info.get("adv_loc_policy")),
radio_freq=float(self_info.get("radio_freq") or 915.0),
radio_bw=float(self_info.get("radio_bw") or 250.0),
radio_sf=int(self_info.get("radio_sf") or 10),
radio_cr=int(self_info.get("radio_cr") or 7),
)
def build_device_info(path_hash_mode: int = 0) -> bytes:
"""Build a ``RESP_DEVICE_INFO`` payload (response to ``CMD_DEVICE_QUERY``)."""
out = bytearray()
out.append(RESP_DEVICE_INFO)
out.append(PROXY_FW_VER)
out.append(PROXY_MAX_CONTACTS_RAW) # ×2 by reader
out.append(PROXY_MAX_CHANNELS)
out.extend(struct.pack("<I", 0)) # ble_pin
out.extend(pad(PROXY_FW_BUILD.encode(), 12))
out.extend(pad(PROXY_MODEL.encode(), 40))
out.extend(pad(PROXY_FW_VERSION.encode(), 20))
out.append(0) # repeat mode (fw v9+)
out.append(path_hash_mode) # (fw v10+)
return bytes(out)
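As a wire-layout sanity check, `build_device_info` above can be condensed into a self-contained sketch (constants copied from `protocol.py`; the fixed 82-byte total is 4 header bytes + 4-byte BLE pin + 12/40/20-byte padded strings + 2 trailing flag bytes):

```python
import struct

# Condensed copy of the deleted build_device_info, for illustration.
RESP_DEVICE_INFO = 0x0D
PROXY_FW_VER = 11
PROXY_MAX_CONTACTS_RAW = 255  # reader multiplies by 2 -> 510
PROXY_MAX_CHANNELS = 40
PROXY_FW_BUILD = "proxy"
PROXY_MODEL = "RemoteTerm Proxy"
PROXY_FW_VERSION = "v0.1.0-proxy"

def pad(data: bytes, length: int) -> bytes:
    return data[:length].ljust(length, b"\x00")

def build_device_info(path_hash_mode: int = 0) -> bytes:
    out = bytearray([RESP_DEVICE_INFO, PROXY_FW_VER,
                     PROXY_MAX_CONTACTS_RAW, PROXY_MAX_CHANNELS])
    out.extend(struct.pack("<I", 0))              # ble_pin
    out.extend(pad(PROXY_FW_BUILD.encode(), 12))
    out.extend(pad(PROXY_MODEL.encode(), 40))
    out.extend(pad(PROXY_FW_VERSION.encode(), 20))
    out.append(0)                                 # repeat mode (fw v9+)
    out.append(path_hash_mode)                    # (fw v10+)
    return bytes(out)

info = build_device_info(path_hash_mode=1)
assert len(info) == 82
assert info[0] == RESP_DEVICE_INFO and info[-1] == 1
```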
-195
@@ -1,195 +0,0 @@
"""MeshCore companion protocol constants, frame helpers, and streaming parser."""
from __future__ import annotations
# ── Frame markers ────────────────────────────────────────────────────
FRAME_TX = 0x3C # client → radio
FRAME_RX = 0x3E # radio → client
MAX_FRAME_SIZE = 300 # firmware MAX_FRAME_SIZE is 172; we allow a bit more
# ── Command types (client → proxy) ──────────────────────────────────
CMD_APP_START = 0x01
CMD_SEND_TXT_MSG = 0x02
CMD_SEND_CHANNEL_TXT_MSG = 0x03
CMD_GET_CONTACTS = 0x04
CMD_GET_DEVICE_TIME = 0x05
CMD_SET_DEVICE_TIME = 0x06
CMD_SEND_SELF_ADVERT = 0x07
CMD_SET_ADVERT_NAME = 0x08
CMD_ADD_UPDATE_CONTACT = 0x09
CMD_SYNC_NEXT_MESSAGE = 0x0A
CMD_SET_RADIO_PARAMS = 0x0B
CMD_SET_RADIO_TX_POWER = 0x0C
CMD_RESET_PATH = 0x0D
CMD_SET_ADVERT_LATLON = 0x0E
CMD_REMOVE_CONTACT = 0x0F
CMD_REBOOT = 0x13
CMD_GET_BATT_AND_STORAGE = 0x14
CMD_DEVICE_QUERY = 0x16
CMD_EXPORT_PRIVATE_KEY = 0x17
CMD_HAS_CONNECTION = 0x1C
CMD_GET_CONTACT_BY_KEY = 0x1E
CMD_GET_CHANNEL = 0x1F
CMD_SET_CHANNEL = 0x20
CMD_SET_FLOOD_SCOPE = 0x36
CMD_GET_STATS = 0x38
CMD_NAMES: dict[int, str] = {
0x01: "APP_START",
0x02: "SEND_TXT_MSG",
0x03: "SEND_CHAN_MSG",
0x04: "GET_CONTACTS",
0x05: "GET_TIME",
0x06: "SET_TIME",
0x07: "SEND_ADVERT",
0x08: "SET_NAME",
0x09: "ADD_CONTACT",
0x0A: "SYNC_MSG",
0x0B: "SET_RADIO",
0x0C: "SET_TX_POWER",
0x0D: "RESET_PATH",
0x0E: "SET_LATLON",
0x0F: "REMOVE_CONTACT",
0x13: "REBOOT",
0x14: "GET_BATTERY",
0x16: "DEVICE_QUERY",
0x17: "EXPORT_PRIV_KEY",
0x1C: "HAS_CONNECTION",
0x1E: "GET_CONTACT_BY_KEY",
0x1F: "GET_CHANNEL",
0x20: "SET_CHANNEL",
0x36: "SET_FLOOD_SCOPE",
0x38: "GET_STATS",
}
# ── Response / push types (proxy → client) ──────────────────────────
RESP_OK = 0x00
RESP_ERR = 0x01
RESP_CONTACT_START = 0x02
RESP_CONTACT = 0x03
RESP_CONTACT_END = 0x04
RESP_SELF_INFO = 0x05
RESP_MSG_SENT = 0x06
RESP_CONTACT_MSG_RECV = 0x07
RESP_CHANNEL_MSG_RECV = 0x08
RESP_CURRENT_TIME = 0x09
RESP_NO_MORE_MSGS = 0x0A
RESP_BATTERY = 0x0C
RESP_DEVICE_INFO = 0x0D
RESP_DISABLED = 0x0F
RESP_CONTACT_MSG_RECV_V3 = 0x10
RESP_CHANNEL_MSG_RECV_V3 = 0x11
RESP_CHANNEL_INFO = 0x12
PUSH_ACK = 0x82
PUSH_MSG_WAITING = 0x83
PUSH_NEW_ADVERT = 0x8A
# ── Error codes ──────────────────────────────────────────────────────
ERR_UNSUPPORTED = 1
ERR_NOT_FOUND = 2
# ── Virtual device identity ─────────────────────────────────────────
PROXY_FW_VER = 11
PROXY_MAX_CONTACTS_RAW = 255 # reader multiplies by 2 → 510
PROXY_MAX_CHANNELS = 40
PROXY_MODEL = "RemoteTerm Proxy"
PROXY_FW_VERSION = "v0.1.0-proxy"
PROXY_FW_BUILD = "proxy"
# ── Frame helpers ────────────────────────────────────────────────────
def frame_response(payload: bytes) -> bytes:
"""Wrap *payload* in a ``0x3E`` frame for sending to the client."""
return bytes([FRAME_RX]) + len(payload).to_bytes(2, "little") + payload
def build_ok(value: int | None = None) -> bytes:
"""Build a ``RESP_OK`` payload, optionally with a 4-byte LE value."""
if value is not None:
return bytes([RESP_OK]) + value.to_bytes(4, "little")
return bytes([RESP_OK])
def build_error(code: int = ERR_UNSUPPORTED) -> bytes:
"""Build a ``RESP_ERR`` payload with the given error code."""
return bytes([RESP_ERR, code])
def pad(data: bytes, length: int) -> bytes:
"""Pad or truncate *data* to exactly *length* bytes."""
return data[:length].ljust(length, b"\x00")
def encode_path_byte(hop_count: int, hash_mode: int) -> int:
"""Encode hop count + hash mode into a single packed byte.
Returns ``0xFF`` (direct / non-flood) when either value is negative.
"""
if hop_count < 0 or hash_mode < 0:
return 0xFF
return ((hash_mode & 0x03) << 6) | (hop_count & 0x3F)
# ── Streaming frame parser ──────────────────────────────────────────
class FrameParser:
"""Stateful parser for ``0x3C``-framed TCP data.
Mirrors the framing logic in ``meshcore_py`` ``tcp_cx.py``.
"""
def __init__(self) -> None:
self.header = b""
self.inframe = b""
self.frame_size = 0
self.started = False
def feed(self, data: bytes) -> list[bytes]:
"""Feed raw TCP bytes, return a list of complete payloads."""
payloads: list[bytes] = []
offset = 0
while offset < len(data):
remaining = data[offset:]
if not self.started:
needed = 3 - len(self.header)
chunk = remaining[:needed]
self.header += chunk
offset += len(chunk)
if len(self.header) < 3:
break
if self.header[0] != FRAME_TX:
self.header = b""
continue
self.frame_size = int.from_bytes(self.header[1:3], "little")
if self.frame_size > MAX_FRAME_SIZE:
self.header = b""
continue
self.started = True
else:
needed = self.frame_size - len(self.inframe)
chunk = remaining[:needed]
self.inframe += chunk
offset += len(chunk)
if len(self.inframe) >= self.frame_size:
payloads.append(self.inframe)
self.header = b""
self.inframe = b""
self.started = False
return payloads
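The key property of this parser is that it survives arbitrary TCP fragmentation. The sketch below exercises a trimmed copy of the `FrameParser` above with one frame split across three reads:

```python
# Condensed copy of the deleted FrameParser, fed a frame in pieces.
FRAME_TX = 0x3C
MAX_FRAME_SIZE = 300

class FrameParser:
    def __init__(self) -> None:
        self.header = b""
        self.inframe = b""
        self.frame_size = 0
        self.started = False

    def feed(self, data: bytes) -> list[bytes]:
        """Feed raw TCP bytes, return a list of complete payloads."""
        payloads: list[bytes] = []
        offset = 0
        while offset < len(data):
            remaining = data[offset:]
            if not self.started:
                # Accumulate the 3-byte header: marker + LE length.
                needed = 3 - len(self.header)
                chunk = remaining[:needed]
                self.header += chunk
                offset += len(chunk)
                if len(self.header) < 3:
                    break
                if self.header[0] != FRAME_TX:
                    self.header = b""
                    continue
                self.frame_size = int.from_bytes(self.header[1:3], "little")
                if self.frame_size > MAX_FRAME_SIZE:
                    self.header = b""
                    continue
                self.started = True
            else:
                # Accumulate the payload until frame_size bytes arrive.
                needed = self.frame_size - len(self.inframe)
                chunk = remaining[:needed]
                self.inframe += chunk
                offset += len(chunk)
                if len(self.inframe) >= self.frame_size:
                    payloads.append(self.inframe)
                    self.header = b""
                    self.inframe = b""
                    self.started = False
        return payloads

p = FrameParser()
assert p.feed(b"\x3c\x02") == []          # partial header
assert p.feed(b"\x00\x01") == []          # header done, half the payload
assert p.feed(b"\x02") == [b"\x01\x02"]   # payload completes
```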
-92
@@ -1,92 +0,0 @@
"""TCP server lifecycle, session registry, and broadcast event dispatch."""
from __future__ import annotations
import asyncio
import logging
from typing import Any
from .session import ProxySession
logger = logging.getLogger(__name__)
# ── Session registry ─────────────────────────────────────────────────
_sessions: set[ProxySession] = set()
_server: asyncio.Server | None = None
def register(session: ProxySession) -> None:
_sessions.add(session)
def unregister(session: ProxySession) -> None:
_sessions.discard(session)
# ── Event dispatch (called from broadcast_event) ─────────────────────
async def dispatch_event(event_type: str, data: dict[str, Any]) -> None:
"""Dispatch a broadcast event to all connected proxy sessions.
Called from :func:`app.websocket.broadcast_event` for ``message``,
``message_acked``, and ``contact`` events.
"""
for session in list(_sessions):
try:
if event_type == "message":
await session.on_event_message(data)
elif event_type == "contact":
await session.on_event_contact(data)
except Exception:
logger.exception("Error dispatching %s to %s", event_type, session.addr)
# ── TCP client handler ───────────────────────────────────────────────
async def _handle_client(
reader: asyncio.StreamReader,
writer: asyncio.StreamWriter,
) -> None:
session = ProxySession(reader, writer)
register(session)
try:
await session.run()
finally:
unregister(session)
# ── Server lifecycle ─────────────────────────────────────────────────
async def start(host: str, port: int) -> None:
"""Start the TCP proxy server."""
global _server
if _server is not None:
return
_server = await asyncio.start_server(_handle_client, host, port)
addrs = ", ".join(str(s.getsockname()) for s in _server.sockets)
logger.info("TCP proxy listening on %s", addrs)
async def stop() -> None:
"""Stop the TCP proxy server and disconnect all clients."""
global _server
if _server is None:
return
# Close all active sessions
for session in list(_sessions):
try:
session.writer.close()
except Exception:
pass
_sessions.clear()
_server.close()
await _server.wait_closed()
_server = None
logger.info("TCP proxy stopped")
-683
@@ -1,683 +0,0 @@
"""Per-client MeshCore companion protocol session.
Each connected TCP client gets its own ``ProxySession`` which:
- parses incoming 0x3C frames via :class:`protocol.FrameParser`
- dispatches commands to handler methods
- translates between binary companion payloads and in-process
repository / service calls
- receives broadcast events and queues push frames for the client
"""
from __future__ import annotations
import asyncio
import io
import logging
import random
import struct
import time
from typing import Any
from .encoder import (
build_contact_from_dict,
build_device_info,
build_self_info_from_runtime,
)
from .protocol import (
CMD_ADD_UPDATE_CONTACT,
CMD_APP_START,
CMD_DEVICE_QUERY,
CMD_EXPORT_PRIVATE_KEY,
CMD_GET_BATT_AND_STORAGE,
CMD_GET_CHANNEL,
CMD_GET_CONTACT_BY_KEY,
CMD_GET_CONTACTS,
CMD_GET_DEVICE_TIME,
CMD_HAS_CONNECTION,
CMD_NAMES,
CMD_REMOVE_CONTACT,
CMD_RESET_PATH,
CMD_SEND_CHANNEL_TXT_MSG,
CMD_SEND_SELF_ADVERT,
CMD_SEND_TXT_MSG,
CMD_SET_ADVERT_LATLON,
CMD_SET_ADVERT_NAME,
CMD_SET_CHANNEL,
CMD_SET_DEVICE_TIME,
CMD_SET_FLOOD_SCOPE,
CMD_SYNC_NEXT_MESSAGE,
ERR_NOT_FOUND,
ERR_UNSUPPORTED,
PROXY_MAX_CHANNELS,
PUSH_ACK,
PUSH_MSG_WAITING,
RESP_BATTERY,
RESP_CHANNEL_INFO,
RESP_CHANNEL_MSG_RECV_V3,
RESP_CONTACT_END,
RESP_CONTACT_MSG_RECV_V3,
RESP_CONTACT_START,
RESP_CURRENT_TIME,
RESP_DISABLED,
RESP_MSG_SENT,
RESP_NO_MORE_MSGS,
FrameParser,
build_error,
build_ok,
encode_path_byte,
frame_response,
pad,
)
logger = logging.getLogger(__name__)
class ProxySession:
"""Handles one MeshCore TCP client, translating commands to RemoteTerm
repository and service calls."""
def __init__(
self,
reader: asyncio.StreamReader,
writer: asyncio.StreamWriter,
) -> None:
self.reader = reader
self.writer = writer
self.addr = writer.get_extra_info("peername")
self.parser = FrameParser()
# Cached state
self.contacts: list[dict[str, Any]] = []
self.channels: list[dict[str, Any]] = []
# Channel index ↔ key mapping
self.channel_slots: dict[int, str] = {} # idx → key (lowercase hex)
self.key_to_idx: dict[str, int] = {} # key (lowercase) → idx
# Queued incoming messages for SYNC_NEXT_MESSAGE pull flow.
self._msg_queue: list[bytes] = []
# ── send helper ──────────────────────────────────────────────────
async def send(self, payload: bytes) -> None:
"""Frame and send a response payload."""
self.writer.write(frame_response(payload))
await self.writer.drain()
# ── main loop ────────────────────────────────────────────────────
async def run(self) -> None:
logger.info("Client connected: %s", self.addr)
try:
while True:
data = await self.reader.read(4096)
if not data:
break
for payload in self.parser.feed(data):
await self._dispatch(payload)
except (asyncio.CancelledError, ConnectionResetError):
pass
except Exception:
logger.exception("Session error [%s]", self.addr)
finally:
self.writer.close()
logger.info("Client disconnected: %s", self.addr)
# ── command dispatch ─────────────────────────────────────────────
_DISPATCH_TABLE: dict[int, str] | None = None
@classmethod
def _build_dispatch_table(cls) -> dict[int, str]:
if cls._DISPATCH_TABLE is None:
cls._DISPATCH_TABLE = {
CMD_APP_START: "_cmd_app_start",
CMD_DEVICE_QUERY: "_cmd_device_query",
CMD_GET_CONTACTS: "_cmd_get_contacts",
CMD_GET_CONTACT_BY_KEY: "_cmd_get_contact_by_key",
CMD_GET_CHANNEL: "_cmd_get_channel",
CMD_SET_CHANNEL: "_cmd_set_channel",
CMD_SEND_TXT_MSG: "_cmd_send_dm",
CMD_SEND_CHANNEL_TXT_MSG: "_cmd_send_channel",
CMD_GET_DEVICE_TIME: "_cmd_get_time",
CMD_SET_DEVICE_TIME: "_cmd_ok_stub",
CMD_SEND_SELF_ADVERT: "_cmd_advertise",
CMD_GET_BATT_AND_STORAGE: "_cmd_battery",
CMD_HAS_CONNECTION: "_cmd_has_connection",
CMD_SYNC_NEXT_MESSAGE: "_cmd_sync_next",
CMD_ADD_UPDATE_CONTACT: "_cmd_ok_stub",
CMD_REMOVE_CONTACT: "_cmd_remove_contact",
CMD_RESET_PATH: "_cmd_ok_stub",
CMD_SET_ADVERT_NAME: "_cmd_set_name",
CMD_SET_ADVERT_LATLON: "_cmd_set_latlon",
CMD_SET_FLOOD_SCOPE: "_cmd_ok_stub",
CMD_EXPORT_PRIVATE_KEY: "_cmd_disabled",
}
return cls._DISPATCH_TABLE
async def _dispatch(self, data: bytes) -> None:
if not data:
return
cmd = data[0]
name = CMD_NAMES.get(cmd, f"0x{cmd:02x}")
logger.debug("[%s] ← %s (%dB)", self.addr, name, len(data))
table = self._build_dispatch_table()
method_name = table.get(cmd)
if method_name:
handler = getattr(self, method_name)
try:
await handler(data)
except Exception:
logger.exception("[%s] Error in %s", self.addr, name)
await self.send(build_error())
else:
logger.warning("[%s] Unsupported: %s", self.addr, name)
await self.send(build_error(ERR_UNSUPPORTED))
# ── stubs ────────────────────────────────────────────────────────
async def _cmd_ok_stub(self, data: bytes) -> None:
await self.send(build_ok())
async def _cmd_disabled(self, data: bytes) -> None:
await self.send(bytes([RESP_DISABLED]))
# ── APP_START → SELF_INFO ────────────────────────────────────────
async def _cmd_app_start(self, data: bytes) -> None:
from app.repository import AppSettingsRepository, ChannelRepository, ContactRepository
from app.services.radio_runtime import radio_runtime
self.contacts = [c.model_dump() for c in await ContactRepository.get_favorites()]
self.channels = [c.model_dump() for c in await ChannelRepository.get_all()]
settings = await AppSettingsRepository.get()
lmt = settings.last_message_times or {}
self._sort_channels(lmt)
self._rebuild_slots()
mc = radio_runtime.meshcore
self_info = mc.self_info if mc else {}
await self.send(build_self_info_from_runtime(self_info or {}))
name = (self_info or {}).get("name", "?")
pubkey = (self_info or {}).get("public_key", "?" * 12)
logger.info(
"[%s] Session started — %s (%s...) | %d contacts, %d channel slots",
self.addr,
name,
pubkey[:12],
len(self.contacts),
len(self.channel_slots),
)
# ── DEVICE_QUERY → DEVICE_INFO ──────────────────────────────────
async def _cmd_device_query(self, data: bytes) -> None:
from app.services.radio_runtime import radio_runtime
mc = radio_runtime.meshcore
self_info = mc.self_info if mc else {}
# Fall back to radio_runtime.path_hash_mode which radio_lifecycle
# recovers from the raw device-info frame when self_info is missing it.
phm = (self_info or {}).get("path_hash_mode")
if phm is None:
phm = getattr(radio_runtime, "path_hash_mode", 0) or 0
await self.send(build_device_info(path_hash_mode=phm))
# ── GET_CONTACTS ─────────────────────────────────────────────────
async def _cmd_get_contacts(self, data: bytes) -> None:
from app.repository import ContactRepository
self.contacts = [c.model_dump() for c in await ContactRepository.get_favorites()]
count = len(self.contacts)
await self.send(bytes([RESP_CONTACT_START]) + count.to_bytes(4, "little"))
for c in self.contacts:
await self.send(build_contact_from_dict(c))
await self.send(bytes([RESP_CONTACT_END]) + int(time.time()).to_bytes(4, "little"))
logger.info("[%s] Sent %d contacts", self.addr, count)
# ── GET_CONTACT_BY_KEY ───────────────────────────────────────────
async def _cmd_get_contact_by_key(self, data: bytes) -> None:
if len(data) < 33:
await self.send(build_error(ERR_NOT_FOUND))
return
pubkey = data[1:33].hex()
contact = next((c for c in self.contacts if c["public_key"] == pubkey), None)
if contact is None:
await self.send(build_error(ERR_NOT_FOUND))
return
await self.send(build_contact_from_dict(contact))
# ── GET_CHANNEL → CHANNEL_INFO ───────────────────────────────────
async def _cmd_get_channel(self, data: bytes) -> None:
if len(data) < 2:
await self.send(build_error(ERR_NOT_FOUND))
return
idx = data[1]
key_hex = self.channel_slots.get(idx)
if key_hex is None:
await self.send(build_error(ERR_NOT_FOUND))
return
ch = next((c for c in self.channels if c["key"].lower() == key_hex), None)
name = (ch.get("name") or "") if ch else ""
out = bytearray()
out.append(RESP_CHANNEL_INFO)
out.append(idx)
out.extend(pad(name.encode("utf-8"), 32))
out.extend(pad(bytes.fromhex(key_hex), 16))
await self.send(bytes(out))
# ── SET_CHANNEL ──────────────────────────────────────────────────
async def _cmd_set_channel(self, data: bytes) -> None:
if len(data) < 50:
await self.send(build_error())
return
idx = data[1]
key_hex = data[34:50].hex()
# Clean up stale bidirectional mappings
old_key = self.channel_slots.get(idx)
if old_key is not None and old_key != key_hex:
self.key_to_idx.pop(old_key, None)
old_idx = self.key_to_idx.get(key_hex)
if old_idx is not None and old_idx != idx:
self.channel_slots.pop(old_idx, None)
self.channel_slots[idx] = key_hex
self.key_to_idx[key_hex] = idx
await self.send(build_ok())
# ── SEND_TXT_MSG (DM) ───────────────────────────────────────────
async def _cmd_send_dm(self, data: bytes) -> None:
buf = io.BytesIO(data)
buf.read(1) # cmd
buf.read(1) # txt_type
buf.read(1) # attempt
buf.read(4) # timestamp
remaining = buf.read()
full_key, text = self._parse_destination_and_text(remaining)
if not full_key or text is None:
logger.warning(
"[%s] Cannot resolve DM destination (remaining %dB)",
self.addr,
len(remaining),
)
await self.send(build_error(ERR_NOT_FOUND))
return
# Send immediate MSG_SENT + fake ACK — RemoteTerm handles retries.
ack_code = random.randbytes(4)
out = bytearray([RESP_MSG_SENT, 1]) # type=flood
out.extend(ack_code)
out.extend(struct.pack("<I", 5_000))
await self.send(bytes(out))
ack_frame = bytearray([PUSH_ACK])
ack_frame.extend(ack_code)
ack_frame.extend(struct.pack("<I", 100)) # fake trip_time
await self.send(bytes(ack_frame))
# Fire-and-forget the actual send
asyncio.create_task(self._do_send_dm(full_key, text))
logger.info("[%s] DM → %s...: %s", self.addr, full_key[:12], text[:40])
async def _do_send_dm(self, public_key: str, text: str) -> None:
"""Background task: send a DM through the radio via the service layer."""
try:
from app.event_handlers import track_pending_ack
from app.repository import ContactRepository, MessageRepository
from app.services.message_send import send_direct_message_to_contact
from app.services.radio_runtime import radio_runtime
from app.websocket import broadcast_event
contact = await ContactRepository.get_by_key_or_prefix(public_key)
if not contact:
logger.warning("DM send: contact %s not found", public_key[:12])
return
await send_direct_message_to_contact(
contact=contact,
text=text,
radio_manager=radio_runtime,
broadcast_fn=broadcast_event,
track_pending_ack_fn=track_pending_ack,
now_fn=time.time,
message_repository=MessageRepository,
contact_repository=ContactRepository,
)
except Exception:
logger.exception("[%s] DM send failed for %s", self.addr, public_key[:12])
def _parse_destination_and_text(self, remaining: bytes) -> tuple[str | None, str | None]:
"""Resolve destination key + text from the combined buffer.
The standard companion protocol sends a 6-byte pubkey prefix at the
start of ``remaining``, so we try prefix resolution first. Only when
prefix lookup fails do we attempt a 32-byte full-key parse (used by
``meshcore_py`` ``send_msg_with_retry``).
"""
# Standard path: 6-byte prefix — resolve against cached contacts.
if len(remaining) > 6:
prefix = remaining[:6].hex()
matches = [c["public_key"] for c in self.contacts if c["public_key"].startswith(prefix)]
if len(matches) == 1:
return matches[0], remaining[6:].decode("utf-8", "ignore")
# Extended path: 32-byte full key (send_msg_with_retry sends full
# keys). _do_send_dm resolves from the repository, not just our
# favorites cache.
if len(remaining) > 32:
candidate = remaining[:32].hex()
return candidate, remaining[32:].decode("utf-8", "ignore")
return None, None
# ── SEND_CHANNEL_TXT_MSG ─────────────────────────────────────────
async def _cmd_send_channel(self, data: bytes) -> None:
buf = io.BytesIO(data)
buf.read(1) # cmd
buf.read(1) # txt_type
channel_idx = buf.read(1)[0]
buf.read(4) # timestamp
text = buf.read().rstrip(b"\x00").decode("utf-8", "ignore")
key_hex = self.channel_slots.get(channel_idx)
if not key_hex:
logger.warning("[%s] No channel at slot %d", self.addr, channel_idx)
await self.send(build_error(ERR_NOT_FOUND))
return
# Verify the channel exists in RemoteTerm's DB before confirming.
# SET_CHANNEL is local-only, so client-loaded channels that aren't in
# the DB can't be sent on — return ERR_NOT_FOUND instead of false OK.
from app.repository import ChannelRepository
channel = await ChannelRepository.get_by_key(key_hex)
if not channel:
logger.warning("[%s] Channel %s not in DB", self.addr, key_hex[:12])
await self.send(build_error(ERR_NOT_FOUND))
return
await self.send(build_ok())
asyncio.create_task(self._do_send_channel(key_hex, text))
label = channel.name or key_hex[:8]
logger.info("[%s] Chan [%s]: %s", self.addr, label, text[:40])
async def _do_send_channel(self, channel_key: str, text: str) -> None:
"""Background task: send a channel message through the radio."""
try:
from app.repository import ChannelRepository, MessageRepository
from app.services.message_send import send_channel_message_to_channel
from app.services.radio_runtime import radio_runtime
from app.websocket import broadcast_error, broadcast_event
channel = await ChannelRepository.get_by_key(channel_key)
if not channel:
logger.warning("Channel send: key %s not found", channel_key[:12])
return
key_bytes = bytes.fromhex(channel_key)
await send_channel_message_to_channel(
channel=channel,
channel_key_upper=channel_key.upper(),
key_bytes=key_bytes,
text=text,
radio_manager=radio_runtime,
broadcast_fn=broadcast_event,
error_broadcast_fn=broadcast_error,
now_fn=time.time,
temp_radio_slot=0,
message_repository=MessageRepository,
)
except Exception:
logger.exception("[%s] Channel send failed for %s", self.addr, channel_key[:12])
# ── Simple command handlers ──────────────────────────────────────
async def _cmd_get_time(self, data: bytes) -> None:
t = int(time.time())
await self.send(bytes([RESP_CURRENT_TIME]) + t.to_bytes(4, "little"))
async def _cmd_advertise(self, data: bytes) -> None:
try:
from app.services.radio_runtime import radio_runtime
async with radio_runtime.radio_operation("proxy_advertise") as mc:
await mc.commands.send_advert(flood=True)
await self.send(build_ok())
except Exception:
logger.exception("Advertise failed")
await self.send(build_error())
async def _cmd_battery(self, data: bytes) -> None:
out = bytearray([RESP_BATTERY])
out.extend(struct.pack("<H", 0)) # no battery
await self.send(bytes(out))
async def _cmd_has_connection(self, data: bytes) -> None:
from app.services.radio_runtime import radio_runtime
val = 1 if radio_runtime.is_connected else 0
await self.send(build_ok(val))
async def _cmd_sync_next(self, data: bytes) -> None:
if self._msg_queue:
frame = self._msg_queue.pop(0)
await self.send(frame)
logger.debug(
"[%s] Delivered queued msg (%d remaining)",
self.addr,
len(self._msg_queue),
)
else:
await self.send(bytes([RESP_NO_MORE_MSGS]))
async def _cmd_remove_contact(self, data: bytes) -> None:
if len(data) < 33:
await self.send(build_error())
return
pubkey = data[1:33].hex()
self.contacts = [c for c in self.contacts if c["public_key"] != pubkey]
await self.send(build_ok())
async def _cmd_set_name(self, data: bytes) -> None:
name = data[1:].decode("utf-8", "ignore").rstrip("\x00")
try:
from app.services.radio_runtime import radio_runtime
async with radio_runtime.radio_operation("proxy_set_name") as mc:
await mc.commands.set_name(name)
await self.send(build_ok())
except Exception:
logger.exception("Set name failed")
await self.send(build_error())
async def _cmd_set_latlon(self, data: bytes) -> None:
if len(data) < 9:
await self.send(build_error())
return
lat = struct.unpack_from("<i", data, 1)[0] / 1e6
lon = struct.unpack_from("<i", data, 5)[0] / 1e6
try:
from app.services.radio_runtime import radio_runtime
async with radio_runtime.radio_operation("proxy_set_latlon") as mc:
await mc.commands.set_coords(lat, lon)
await self.send(build_ok())
except Exception:
logger.exception("Set lat/lon failed")
await self.send(build_error())
# ── Channel slot management ──────────────────────────────────────
def _sort_channels(self, last_message_times: dict[str, Any]) -> None:
"""Sort channels: favorites first, then most recently active."""
lmt = last_message_times
def key(ch: dict) -> tuple:
is_fav = 1 if ch.get("favorite") else 0
state_key = f"channel-{ch['key']}"
last_activity = lmt.get(state_key) or 0
return (-is_fav, -last_activity)
self.channels.sort(key=key)
def _rebuild_slots(self) -> None:
"""Pre-load only favorite channels into slots."""
self.channel_slots.clear()
self.key_to_idx.clear()
favorites = [ch for ch in self.channels if ch.get("favorite")]
for i, ch in enumerate(favorites[:PROXY_MAX_CHANNELS]):
k = ch["key"].lower()
self.channel_slots[i] = k
self.key_to_idx[k] = i
logger.debug("Pre-loaded %d favorite channel(s)", len(self.channel_slots))
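The sort key used by `_sort_channels` above orders favorites first, then by most recent activity. A minimal re-implementation of that ordering (the function and sample data here are illustrative, not part of the module):

```python
# Illustrative re-implementation of the _sort_channels ordering:
# favorites first, then most recently active (larger timestamp first).
def sort_channels(channels: list[dict], last_message_times: dict) -> list[dict]:
    def key(ch: dict) -> tuple:
        is_fav = 1 if ch.get("favorite") else 0
        last_activity = last_message_times.get(f"channel-{ch['key']}") or 0
        return (-is_fav, -last_activity)
    return sorted(channels, key=key)

channels = [
    {"key": "aa", "favorite": False},
    {"key": "bb", "favorite": True},  # favorite, older activity
    {"key": "cc", "favorite": True},  # favorite, newest activity
]
order = sort_channels(channels, {"channel-bb": 100, "channel-cc": 200})
assert [c["key"] for c in order] == ["cc", "bb", "aa"]
```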
# ── Broadcast event helpers ────────────────────────────────────────
@staticmethod
def _extract_path_meta(data: dict[str, Any]) -> tuple[int, int]:
"""Extract (snr_byte, path_len_byte) from a broadcast message dict.
Returns the SNR as ``int8(snr * 4)`` and path_len as the companion-
protocol packed byte ``(hash_mode << 6) | hop_count``. When no path
data is available, returns ``(0, 0)``: 0 hops at 1-byte hash mode,
which is the safest "we don't know" default for flood messages.
"""
paths = data.get("paths") or []
first = paths[0] if paths else None
# SNR — V3 field, signed int8 encoded as snr * 4
snr_raw = (first.get("snr") if first else None) or 0.0
snr_byte = max(-128, min(127, int(snr_raw * 4))) & 0xFF
if first is None:
return snr_byte, 0 # no path info → 0 hops
hop_count = first.get("path_len")
path_hex: str = first.get("path") or ""
if hop_count is None:
# Legacy: infer 1-byte hops from hex length
hop_count = len(path_hex) // 2
# Determine hash mode from path hex length and hop count
if hop_count > 0 and path_hex:
path_byte_len = len(path_hex) // 2
hash_size = path_byte_len // hop_count if hop_count else 1
hash_mode = max(0, hash_size - 1) # 1-byte → 0, 2 → 1, 3 → 2
else:
hash_mode = 0
return snr_byte, encode_path_byte(hop_count, hash_mode)
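The two wire encodings described in the docstring above — SNR as `int8(snr * 4)` and the packed path byte `(hash_mode << 6) | hop_count` — can be sanity-checked in isolation. These helpers are re-derived from the comments here, not the module's actual `encode_path_byte`:

```python
# Sketch of the companion-protocol byte encodings, re-derived from the
# _extract_path_meta docstring (not the real module helpers).
def encode_snr_byte(snr: float) -> int:
    # Signed int8 of snr*4, clamped, presented as an unsigned byte.
    return max(-128, min(127, int(snr * 4))) & 0xFF

def encode_path_byte(hop_count: int, hash_mode: int) -> int:
    # Top 2 bits: hash mode (1-byte -> 0, 2 -> 1, 3 -> 2); low 6: hops.
    return ((hash_mode & 0x3) << 6) | (hop_count & 0x3F)

def decode_path_byte(b: int) -> tuple[int, int]:
    return b & 0x3F, (b >> 6) & 0x3

assert encode_snr_byte(10.0) == 40          # +10 dB -> 40
assert encode_snr_byte(-2.5) == 246         # -10 as unsigned byte
assert decode_path_byte(encode_path_byte(3, 1)) == (3, 1)
```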
# ── Broadcast event handlers (called by server.dispatch_event) ──
async def _push_contact_from_db(self, public_key: str) -> None:
"""Fetch a contact from the DB and push it to the client so it can
display messages from senders not in the favorites cache."""
try:
from app.repository import ContactRepository
contact = await ContactRepository.get_by_key(public_key)
if not contact:
return
contact_dict = contact.model_dump()
await self.send(build_contact_from_dict(contact_dict, push=True))
self.contacts.append(contact_dict)
except Exception:
logger.debug("Failed to push contact %s from DB", public_key[:12])
async def on_event_message(self, data: dict[str, Any]) -> None:
"""Translate a broadcast ``message`` event into a queued push frame."""
if data.get("outgoing"):
return
msg_type = data.get("type")
if msg_type == "PRIV":
sender_key = data.get("conversation_key", "")
if len(sender_key) < 12:
return
# If sender isn't in our cache, fetch from DB and push to client
# so it knows who the message is from.
if not any(c["public_key"] == sender_key for c in self.contacts):
await self._push_contact_from_db(sender_key)
text = data.get("text") or ""
ts = int(data.get("sender_timestamp") or time.time())
snr_byte, path_byte = self._extract_path_meta(data)
frame = bytearray([RESP_CONTACT_MSG_RECV_V3])
frame.append(snr_byte)
frame.extend(b"\x00\x00") # reserved
frame.extend(bytes.fromhex(sender_key[:12])) # 6-byte prefix
frame.append(path_byte)
frame.append(0) # txt_type
frame.extend(struct.pack("<I", ts))
frame.extend(text.encode("utf-8"))
self._msg_queue.append(bytes(frame))
await self.send(bytes([PUSH_MSG_WAITING]))
elif msg_type == "CHAN":
conv_key = data.get("conversation_key", "").lower()
idx = self.key_to_idx.get(conv_key)
if idx is None:
return
text = data.get("text") or ""
ts = int(data.get("sender_timestamp") or time.time())
snr_byte, path_byte = self._extract_path_meta(data)
frame = bytearray([RESP_CHANNEL_MSG_RECV_V3])
frame.append(snr_byte)
frame.extend(b"\x00\x00") # reserved
frame.append(idx)
frame.append(path_byte)
frame.append(0) # txt_type
frame.extend(struct.pack("<I", ts))
frame.extend(text.encode("utf-8"))
self._msg_queue.append(bytes(frame))
await self.send(bytes([PUSH_MSG_WAITING]))
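As a sanity check on the PRIV frame layout assembled above ([code][snr][2 reserved][6-byte pubkey prefix][path byte][txt_type][u32 LE timestamp][UTF-8 text]), a hypothetical decoder could look like this — the function and field names are mine, not part of the companion protocol:

```python
import struct

# Hypothetical decoder mirroring the V3 contact-message push frame
# assembled in on_event_message; offsets follow the layout above.
def decode_contact_msg_v3(frame: bytes) -> dict:
    (ts,) = struct.unpack_from("<I", frame, 12)
    return {
        "snr_byte": frame[1],            # int8(snr * 4) as unsigned
        "prefix": frame[4:10].hex(),     # 6-byte sender pubkey prefix
        "path_byte": frame[10],          # (hash_mode << 6) | hop_count
        "txt_type": frame[11],
        "ts": ts,                        # little-endian u32 timestamp
        "text": frame[16:].decode("utf-8"),
    }
```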
async def on_event_contact(self, data: dict[str, Any]) -> None:
"""Translate a broadcast ``contact`` event into a PUSH_NEW_ADVERT."""
pubkey = data.get("public_key", "")
if len(pubkey) < 64:
return
# Only push contacts that are already in our favorites cache.
# Without this filter, a long-lived session would gradually sync
# every contact on the mesh, defeating the favorites-only policy.
existing = next((c for c in self.contacts if c["public_key"] == pubkey), None)
if existing is None:
return
try:
await self.send(build_contact_from_dict(data, push=True))
except Exception:
logger.debug("Failed to build contact push for %s", pubkey[:12])
existing.update(data)
@@ -117,15 +117,6 @@ def broadcast_event(event_type: str, data: dict, *, realtime: bool = True) -> No
     elif event_type == "contact":
         asyncio.create_task(fanout_manager.broadcast_contact(data))
 
-    # TCP proxy dispatch
-    if event_type in ("message", "message_acked", "contact"):
-        from app.config import settings
-
-        if settings.tcp_proxy_enabled:
-            from app.tcp_proxy.server import dispatch_event
-
-            asyncio.create_task(dispatch_event(event_type, data))
 
 def broadcast_error(message: str, details: str | None = None) -> None:
     """Broadcast an error notification to all connected clients.
@@ -1,12 +1,12 @@
 {
   "name": "remoteterm-meshcore-frontend",
-  "version": "3.12.0",
+  "version": "3.12.3",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "remoteterm-meshcore-frontend",
-      "version": "3.12.0",
+      "version": "3.12.3",
       "dependencies": {
         "@codemirror/lang-python": "^6.2.1",
         "@codemirror/theme-one-dark": "^6.1.3",
@@ -53,7 +53,7 @@
         "eslint": "^9.17.0",
         "eslint-plugin-react-hooks": "^5.1.0",
         "jsdom": "^25.0.0",
-        "postcss": "^8.5.6",
+        "postcss": "^8.5.10",
         "prettier": "^3.4.2",
         "tailwindcss": "^3.4.19",
         "typescript": "^5.6.3",
@@ -5619,9 +5619,9 @@
       }
     },
     "node_modules/postcss": {
-      "version": "8.5.8",
-      "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.8.tgz",
-      "integrity": "sha512-OW/rX8O/jXnm82Ey1k44pObPtdblfiuWnrd8X7GJ7emImCOstunGbXUpp7HdBrFQX6rJzn3sPT397Wp5aCwCHg==",
+      "version": "8.5.10",
+      "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.10.tgz",
+      "integrity": "sha512-pMMHxBOZKFU6HgAZ4eyGnwXF/EvPGGqUr0MnZ5+99485wwW41kW91A4LOGxSHhgugZmSChL5AlElNdwlNgcnLQ==",
       "funding": [
         {
           "type": "opencollective",
@@ -1,7 +1,7 @@
 {
   "name": "remoteterm-meshcore-frontend",
   "private": true,
-  "version": "3.12.3",
+  "version": "3.13.0",
   "type": "module",
   "scripts": {
     "dev": "vite",
@@ -61,7 +61,7 @@
     "eslint": "^9.17.0",
     "eslint-plugin-react-hooks": "^5.1.0",
     "jsdom": "^25.0.0",
-    "postcss": "^8.5.6",
+    "postcss": "^8.5.10",
     "prettier": "^3.4.2",
     "tailwindcss": "^3.4.19",
     "typescript": "^5.6.3",
@@ -4,14 +4,20 @@ import {
   useImperativeHandle,
   forwardRef,
   useRef,
+  useEffect,
   useMemo,
+  type ChangeEvent,
   type FormEvent,
   type KeyboardEvent,
 } from 'react';
-import { Input } from './ui/input';
 import { Button } from './ui/button';
 import { toast } from './ui/sonner';
 import { cn } from '@/lib/utils';
+import {
+  getTextReplaceEnabled,
+  getTextReplaceMapJson,
+  applyTextReplacements,
+} from '../utils/textReplace';
 
 // MeshCore message size limits (empirically determined from LoRa packet constraints)
 // Direct delivery allows ~156 bytes; multi-hop requires buffer for path growth.
@@ -53,19 +59,32 @@ export const MessageInput = forwardRef<MessageInputHandle, MessageInputProps>(fu
 ) {
   const [text, setText] = useState('');
   const [sending, setSending] = useState(false);
-  const inputRef = useRef<HTMLInputElement>(null);
+  const textareaRef = useRef<HTMLTextAreaElement>(null);
+
+  /** Resize textarea to fit content, clamped between 1 row and ~6 rows. */
+  const autoResize = useCallback(() => {
+    const el = textareaRef.current;
+    if (!el) return;
+    el.style.height = 'auto';
+    // Clamp: min 40px (≈1 row), max 160px (≈6 rows)
+    el.style.height = `${Math.min(el.scrollHeight, 160)}px`;
+  }, []);
 
   useImperativeHandle(ref, () => ({
     appendText: (appendedText: string) => {
       setText((prev) => prev + appendedText);
-      // Focus the input after appending
-      inputRef.current?.focus();
+      textareaRef.current?.focus();
     },
     focus: () => {
-      inputRef.current?.focus();
+      textareaRef.current?.focus();
     },
   }));
 
+  // Re-measure height whenever text changes (covers programmatic updates like appendText)
+  useEffect(() => {
+    autoResize();
+  }, [text, autoResize]);
+
   // Calculate character limits based on conversation type
   const limits = useMemo(() => {
     if (conversationType === 'contact') {
@@ -133,18 +152,44 @@ export const MessageInput = forwardRef<MessageInputHandle, MessageInputProps>(fu
       } finally {
         setSending(false);
       }
-      // Refocus after React re-enables the input
-      setTimeout(() => inputRef.current?.focus(), 0);
+      // Refocus after React re-enables the textarea
+      setTimeout(() => textareaRef.current?.focus(), 0);
     },
     [text, sending, disabled, onSend]
   );
 
+  const handleChange = useCallback((e: ChangeEvent<HTMLTextAreaElement>) => {
+    const input = e.target;
+    const raw = input.value;
+    // Skip replacement during IME / dead-key composition to avoid garbling interim input
+    if (!e.nativeEvent || (e.nativeEvent as InputEvent).isComposing) {
+      setText(raw);
+      return;
+    }
+    if (getTextReplaceEnabled()) {
+      const result = applyTextReplacements(
+        raw,
+        input.selectionStart ?? raw.length,
+        getTextReplaceMapJson()
+      );
+      if (result) {
+        setText(result.text);
+        // Schedule cursor restore after React flushes the new value
+        const pos = result.cursor;
+        requestAnimationFrame(() => input.setSelectionRange(pos, pos));
+        return;
+      }
+    }
+    setText(raw);
+  }, []);
+
   const handleKeyDown = useCallback(
-    (e: KeyboardEvent<HTMLInputElement>) => {
+    (e: KeyboardEvent<HTMLTextAreaElement>) => {
       if (e.key === 'Enter' && !e.shiftKey) {
         e.preventDefault();
         handleSubmit(e as unknown as FormEvent);
       }
+      // Shift+Enter falls through naturally and inserts a newline
     },
     [handleSubmit]
   );
@@ -162,22 +207,28 @@ export const MessageInput = forwardRef<MessageInputHandle, MessageInputProps>(fu
       onSubmit={handleSubmit}
       autoComplete="off"
     >
-      <div className="flex gap-2">
-        <Input
-          ref={inputRef}
-          type="text"
+      <div className="flex gap-2 items-end">
+        <textarea
+          ref={textareaRef}
           autoComplete="off"
          name="chat-message-input"
          aria-label={placeholder || 'Type a message'}
          data-lpignore="true"
          data-1p-ignore="true"
          data-bwignore="true"
+          rows={1}
          value={text}
-          onChange={(e) => setText(e.target.value)}
+          onChange={handleChange}
          onKeyDown={handleKeyDown}
          placeholder={placeholder || 'Type a message...'}
          disabled={disabled || sending}
-          className="flex-1 min-w-0"
+          className={cn(
+            'flex-1 min-w-0 resize-none overflow-y-auto',
+            'rounded-md border border-input bg-background px-3 py-2 text-base ring-offset-background',
+            'placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2',
+            'disabled:cursor-not-allowed disabled:opacity-50 md:text-sm'
+          )}
+          style={{ minHeight: '40px', maxHeight: '160px' }}
        />
        <Button
          type="submit"
@@ -17,7 +17,12 @@ import type { TelemetryHistoryEntry, TelemetryLppSensor, Contact } from '../../t
 const MAX_TRACKED = 8;
 
-type BuiltinMetric = 'battery_volts' | 'noise_floor_dbm' | 'packets' | 'uptime_seconds';
+type BuiltinMetric =
+  | 'battery_volts'
+  | 'noise_floor_dbm'
+  | 'packets'
+  | 'recv_errors'
+  | 'uptime_seconds';
 
 interface MetricConfig {
   label: string;
@@ -29,6 +34,7 @@ const BUILTIN_METRIC_CONFIG: Record<BuiltinMetric, MetricConfig> = {
   battery_volts: { label: 'Voltage', unit: 'V', color: '#22c55e' },
   noise_floor_dbm: { label: 'Noise Floor', unit: 'dBm', color: '#8b5cf6' },
   packets: { label: 'Packets', unit: '', color: '#0ea5e9' },
+  recv_errors: { label: 'RX Errors', unit: '', color: '#ef4444' },
   uptime_seconds: { label: 'Uptime', unit: 's', color: '#f59e0b' },
 };
@@ -148,12 +154,19 @@ export function TelemetryHistoryPane({
   const chartData = useMemo(() => {
     return entries.map((e) => {
       const d = e.data;
+      const recvErrors = d.recv_errors ?? undefined;
+      const packetsReceived = d.packets_received;
       const point: Record<string, number | undefined> = {
         timestamp: e.timestamp,
         battery_volts: d.battery_volts,
         noise_floor_dbm: d.noise_floor_dbm,
-        packets_received: d.packets_received,
+        packets_received: packetsReceived,
         packets_sent: d.packets_sent,
+        recv_errors: recvErrors,
+        recv_error_pct:
+          recvErrors != null && packetsReceived != null && packetsReceived + recvErrors > 0
+            ? +((recvErrors / (packetsReceived + recvErrors)) * 100).toFixed(2)
+            : undefined,
         uptime_seconds: d.uptime_seconds,
       };
       // Flatten LPP sensors into the point, converting units as needed
@@ -167,7 +180,11 @@ export function TelemetryHistoryPane({
   }, [entries, distanceUnit]);
 
   const dataKeys =
-    activeMetric === 'packets' ? ['packets_received', 'packets_sent'] : [activeMetric];
+    activeMetric === 'packets'
+      ? ['packets_received', 'packets_sent']
+      : activeMetric === 'recv_errors'
+        ? ['recv_errors', 'recv_error_pct']
+        : [activeMetric];
 
   const yDomain = useMemo<[number, number] | undefined>(() => {
     if (activeMetric !== 'battery_volts' || chartData.length === 0) return undefined;
@@ -178,6 +195,20 @@ export function TelemetryHistoryPane({
     return [Math.min(3, Math.floor(lo) - 1), Math.max(5, Math.ceil(hi) + 1)];
   }, [activeMetric, chartData]);
 
+  const yDomainPct = useMemo<[number, number]>(() => {
+    const MIN_SPAN = 5;
+    const values = chartData.map((d) => d.recv_error_pct).filter((v) => v != null) as number[];
+    if (values.length === 0) return [0, MIN_SPAN];
+    const lo = Math.min(...values);
+    const hi = Math.max(...values);
+    const span = hi - lo;
+    if (span >= MIN_SPAN)
+      return [Math.max(0, Math.floor(lo - span * 0.1)), Math.ceil(hi + span * 0.1)];
+    const pad = (MIN_SPAN - span) / 2;
+    const bottom = Math.max(0, Math.floor(lo - pad));
+    return [bottom, Math.ceil(bottom + MIN_SPAN)];
+  }, [chartData]);
+
   const handleToggle = async () => {
     setToggling(true);
     try {
@@ -214,16 +245,16 @@ export function TelemetryHistoryPane({
           via the repeater pane, API calls to the endpoint (
           <code className="text-[0.6875rem]">POST /api/contacts/&lt;key&gt;/repeater/status</code>
           ), or when the repeater is opted into interval telemetry polling, in which case the
-          repeater will be polled for metrics every 8 hours. You can see which repeaters are opted
-          into this flow in the{' '}
+          repeater will be polled for metrics automatically. Fetch frequency can be configured in{' '}
           <a
             href="#settings/database"
             className="underline text-primary hover:text-primary/80 transition-colors"
           >
-            Database &amp; Messaging Settings &rarr;
-          </a>{' '}
-          settings pane. A maximum of {MAX_TRACKED} repeaters may be opted into this for the sake
-          of keeping mesh congestion reasonable.
+            Database &amp; Messaging
+          </a>
+          , where you can also see which repeaters are currently opted in. A maximum of{' '}
+          {MAX_TRACKED} repeaters may be opted into this for the sake of keeping mesh congestion
+          reasonable.
         </p>
 
         {isTracked ? (
@@ -252,7 +283,7 @@ export function TelemetryHistoryPane({
             disabled={toggling}
             className="border-green-600/50 text-green-600 hover:bg-green-600/10"
           >
-            {toggling ? 'Updating...' : 'Opt Repeater into 8hr Interval Metrics Tracking'}
+            {toggling ? 'Updating...' : 'Opt Repeater into Interval Metrics Tracking'}
           </Button>
         )}
       </div>
@@ -299,7 +330,15 @@ export function TelemetryHistoryPane({
         </p>
       ) : (
         <ResponsiveContainer width="100%" height={180}>
-          <AreaChart data={chartData} margin={{ top: 4, right: 4, bottom: 0, left: -8 }}>
+          <AreaChart
+            data={chartData}
+            margin={{
+              top: 4,
+              right: activeMetric === 'recv_errors' ? 8 : 4,
+              bottom: 0,
+              left: -8,
+            }}
+          >
             <CartesianGrid strokeDasharray="3 3" stroke="hsl(var(--border))" vertical={false} />
             <XAxis
               dataKey="timestamp"
@@ -311,6 +350,7 @@ export function TelemetryHistoryPane({
               tickFormatter={formatTime}
             />
             <YAxis
+              yAxisId="left"
              domain={yDomain}
              tick={{ fontSize: 10, fill: 'hsl(var(--muted-foreground))' }}
              tickLine={false}
@@ -319,6 +359,17 @@ export function TelemetryHistoryPane({
                activeMetric === 'uptime_seconds' ? formatUptime(v) : `${v}`
              }
            />
+            {activeMetric === 'recv_errors' && (
+              <YAxis
+                yAxisId="right"
+                orientation="right"
+                domain={yDomainPct}
+                tick={{ fontSize: 10, fill: 'hsl(var(--muted-foreground))' }}
+                tickLine={false}
+                axisLine={false}
+                tickFormatter={(v) => `${v}%`}
+              />
+            )}
            <RechartsTooltip
              {...TOOLTIP_STYLE}
              cursor={{
@@ -330,6 +381,10 @@ export function TelemetryHistoryPane({
              // eslint-disable-next-line @typescript-eslint/no-explicit-any
              formatter={(value: any, name: any) => {
                const numVal = typeof value === 'number' ? value : Number(value);
+                if (activeMetric === 'recv_errors') {
+                  if (name === 'recv_error_pct') return [`${numVal}%`, 'Error Rate'];
+                  return [`${value}`, 'RX Errors'];
+                }
                const display =
                  activeMetric === 'uptime_seconds' ? formatUptime(numVal) : `${value}`;
                const suffix =
@@ -347,51 +402,44 @@ export function TelemetryHistoryPane({
                return [`${display}${suffix}`, label];
              }}
            />
-            {dataKeys.map((key, i) => (
-              <Area
-                key={key}
-                type="linear"
-                dataKey={key}
-                stroke={
-                  activeMetric === 'packets'
-                    ? i === 0
-                      ? '#0ea5e9'
-                      : '#f43f5e'
-                    : activeConfig.color
-                }
-                fill={
-                  activeMetric === 'packets'
-                    ? i === 0
-                      ? '#0ea5e9'
-                      : '#f43f5e'
-                    : activeConfig.color
-                }
-                fillOpacity={0.15}
-                strokeWidth={1.5}
-                dot={{
-                  r: 4,
-                  fill:
-                    activeMetric === 'packets'
-                      ? i === 0
-                        ? '#0ea5e9'
-                        : '#f43f5e'
-                      : activeConfig.color,
-                  strokeWidth: 1.5,
-                  stroke: 'hsl(var(--popover))',
-                }}
-                activeDot={{
-                  r: 6,
-                  fill:
-                    activeMetric === 'packets'
-                      ? i === 0
-                        ? '#0ea5e9'
-                        : '#f43f5e'
-                      : activeConfig.color,
-                  strokeWidth: 2,
-                  stroke: 'hsl(var(--popover))',
-                }}
-              />
-            ))}
+            {dataKeys.map((key, i) => {
+              const color =
+                activeMetric === 'packets'
+                  ? i === 0
+                    ? '#0ea5e9'
+                    : '#f43f5e'
+                  : activeMetric === 'recv_errors'
+                    ? i === 0
+                      ? '#ef4444'
+                      : '#f59e0b'
+                    : activeConfig.color;
+              return (
+                <Area
+                  key={key}
+                  type="linear"
+                  dataKey={key}
+                  yAxisId={
+                    activeMetric === 'recv_errors' && key === 'recv_error_pct' ? 'right' : 'left'
+                  }
+                  stroke={color}
+                  fill={color}
+                  fillOpacity={0.15}
+                  strokeWidth={1.5}
+                  dot={{
+                    r: 4,
+                    fill: color,
+                    strokeWidth: 1.5,
+                    stroke: 'hsl(var(--popover))',
+                  }}
+                  activeDot={{
+                    r: 6,
+                    fill: color,
+                    strokeWidth: 2,
+                    stroke: 'hsl(var(--popover))',
+                  }}
+                />
+              );
+            })}
          </AreaChart>
        </ResponsiveContainer>
      )}
@@ -91,6 +91,26 @@ export function TelemetryPane({
label="Duplicates"
value={`${data.flood_dups.toLocaleString()} flood / ${data.direct_dups.toLocaleString()} direct`}
/>
{data.recv_errors != null && (
<KvRow
label="RX Errors"
value={
<>
{data.recv_errors.toLocaleString()}
{data.packets_received > 0 && (
<Secondary>
(
{(
(data.recv_errors / (data.packets_received + data.recv_errors)) *
100
).toFixed(2)}
%)
</Secondary>
)}
</>
}
/>
)}
<Separator className="my-1" />
<KvRow label="TX Queue" value={data.tx_queue_len} />
<KvRow label="Debug Flags" value={data.full_events} />
@@ -92,7 +92,11 @@ export function SettingsDatabaseSection({
return () => {
  cancelled = true;
};
-}, [trackedTelemetryRepeaters.length, appSettings.telemetry_interval_hours]);
+}, [
+  trackedTelemetryRepeaters.length,
+  appSettings.telemetry_interval_hours,
+  appSettings.telemetry_routed_hourly,
+]);
useEffect(() => {
  if (trackedTelemetryRepeaters.length === 0 || telemetryFetchedRef.current) return;
@@ -346,13 +350,41 @@ export function SettingsDatabaseSection({
    restored if you drop back to a supported count.
  </p>
)}
-{schedule?.next_run_at != null && (
-  <p className="text-xs text-muted-foreground">
-    Next run at {formatTime(schedule.next_run_at)} (UTC top of hour).
-  </p>
-)}
</div>
{/* Routed hourly toggle */}
<label className="flex items-start gap-2 cursor-pointer">
<input
type="checkbox"
checked={appSettings.telemetry_routed_hourly}
onChange={() => {
const next = !appSettings.telemetry_routed_hourly;
void persistAppSettings({ telemetry_routed_hourly: next }, () => {});
}}
className="w-4 h-4 rounded border-input accent-primary mt-0.5"
/>
<div>
<span className="text-sm">Poll direct/routed-path repeaters hourly</span>
<p className="text-[0.8125rem] text-muted-foreground">
When enabled, tracked repeaters with a direct or routed path (not flood) are polled
every hour instead of on the scheduled interval above. Flood-only repeaters still
follow the normal schedule.
</p>
</div>
</label>
{schedule?.next_run_at != null && (
<p className="text-xs text-muted-foreground">
{schedule.routed_hourly ? 'Next flood run at' : 'Next run at'}{' '}
{formatTime(schedule.next_run_at)} (UTC top of hour).
</p>
)}
{schedule?.next_routed_run_at != null && (
<p className="text-xs text-muted-foreground">
Next direct/routed run at {formatTime(schedule.next_routed_run_at)} (UTC top of hour).
</p>
)}
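Both "next run" timestamps above are described as landing on the UTC top of hour. A minimal sketch of that rounding (hypothetical — the real scheduler runs server-side and is not part of this diff):

```typescript
// Hypothetical helper, not from this changeset: round a millisecond timestamp
// up to the next UTC hour boundary, always strictly after `nowMs`.
function nextTopOfHour(nowMs: number): number {
  const HOUR_MS = 3_600_000;
  return (Math.floor(nowMs / HOUR_MS) + 1) * HOUR_MS;
}
```

Because the result is computed in epoch milliseconds, the boundary is the same instant regardless of the viewer's local time zone, which is presumably why the UI labels it "(UTC top of hour)".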
{trackedTelemetryRepeaters.length === 0 ? (
  <p className="text-sm text-muted-foreground italic">
    No repeaters are being tracked. Enable tracking from a repeater's dashboard.
@@ -362,6 +394,21 @@ export function SettingsDatabaseSection({
{trackedTelemetryRepeaters.map((key) => {
  const contact = contacts.find((c) => c.public_key === key);
  const displayName = contact?.name ?? key.slice(0, 12);
const routeSource = contact?.effective_route_source ?? 'flood';
// A forced-flood override (path_len < 0) still reports source
// "override", but the actual route is flood. Check the real path.
const hasRealPath =
contact?.effective_route != null && contact.effective_route.path_len >= 0;
const routeLabel = !hasRealPath
? 'flood'
: routeSource === 'override'
? 'routed'
: routeSource === 'direct'
? 'direct'
: 'flood';
const routeColor = hasRealPath
? 'text-primary bg-primary/10'
: 'text-muted-foreground bg-muted';
const snap = latestTelemetry[key];
const d = snap?.data;
return (
@@ -369,9 +416,16 @@
<div className="flex items-center justify-between gap-2">
  <div className="flex-1 min-w-0">
    <span className="text-sm truncate block">{displayName}</span>
-<span className="text-[0.625rem] text-muted-foreground font-mono">
-  {key.slice(0, 12)}
-</span>
+<div className="flex items-center gap-1.5">
+  <span className="text-[0.625rem] text-muted-foreground font-mono">
+    {key.slice(0, 12)}
+  </span>
+  <span
+    className={`text-[0.625rem] uppercase tracking-wider px-1.5 py-0.5 rounded font-medium ${routeColor}`}
+  >
+    {routeLabel}
+  </span>
+</div>
</div>
{onToggleTrackedTelemetry && (
  <Button
@@ -287,6 +287,7 @@ const CREATE_INTEGRATION_DEFINITIONS: readonly CreateIntegrationDefinition[] = [
config: {
  urls: '',
  preserve_identity: true,
  markdown_format: true,
  body_format_dm: '**DM:** {sender_name}: {text} **via:** [{hops_backticked}]',
  body_format_channel:
    '**{channel_name}:** {sender_name}: {text} **via:** [{hops_backticked}]',
@@ -2390,6 +2391,8 @@ function ScopeSelector({
const APPRISE_DEFAULT_DM = '**DM:** {sender_name}: {text} **via:** [{hops_backticked}]';
const APPRISE_DEFAULT_CHANNEL =
  '**{channel_name}:** {sender_name}: {text} **via:** [{hops_backticked}]';
const APPRISE_DEFAULT_DM_PLAIN = 'DM: {sender_name}: {text} via: [{hops}]';
const APPRISE_DEFAULT_CHANNEL_PLAIN = '{channel_name}: {sender_name}: {text} via: [{hops}]';
const APPRISE_SAMPLE_VARS: Record<string, string> = {
  type: 'CHAN',
@@ -2420,19 +2423,32 @@ function appriseApplyFormat(fmt: string, vars: Record<string, string>): string {
  return result;
}
-/** Render a markdown-ish string into inline React elements (bold + code spans). */
+/** Render a markdown-ish string into inline React elements (bold, italic, code). */
function appriseRenderMarkdown(s: string): ReactNode[] {
  const nodes: ReactNode[] = [];
  let key = 0;
-  // Split on **bold** and `code` spans
-  const parts = s.split(/(\*\*[^*]+\*\*|`[^`]+`)/g);
+  // Split on **bold**, __bold__, *italic*, _italic_, and `code` spans.
+  // Longer delimiters first so ** and __ match before * and _.
+  const parts = s.split(/(\*\*[^*]+\*\*|__[^_]+__|`[^`]+`|\*[^*]+\*|_[^_]+_)/g);
  for (const part of parts) {
-    if (part.startsWith('**') && part.endsWith('**')) {
+    if (
+      (part.startsWith('**') && part.endsWith('**')) ||
+      (part.startsWith('__') && part.endsWith('__'))
+    ) {
      nodes.push(
        <strong key={key++} className="font-bold">
          {part.slice(2, -2)}
        </strong>
      );
+    } else if (
+      (part.startsWith('*') && part.endsWith('*')) ||
+      (part.startsWith('_') && part.endsWith('_'))
+    ) {
+      nodes.push(
+        <em key={key++} className="italic">
+          {part.slice(1, -1)}
+        </em>
+      );
    } else if (part.startsWith('`') && part.endsWith('`')) {
      nodes.push(
        <code key={key++} className="rounded bg-muted px-1 py-0.5 text-[0.6875rem] font-mono">
@@ -2446,19 +2462,29 @@ function appriseRenderMarkdown(s: string): ReactNode[] {
  return nodes;
}

-function AppriseFormatPreview({ format, vars }: { format: string; vars: Record<string, string> }) {
+function AppriseFormatPreview({
+  format,
+  vars,
+  markdown = true,
+}: {
+  format: string;
+  vars: Record<string, string>;
+  markdown?: boolean;
+}) {
  const raw = appriseApplyFormat(format, vars);
  return (
    <div className="rounded-md border border-border bg-muted/30 p-2 space-y-1.5">
{markdown && (
<div>
<span className="text-[0.625rem] uppercase tracking-wider text-muted-foreground font-medium">
Rendered (Discord, Slack, Telegram)
</span>
<p className="text-xs break-all">{appriseRenderMarkdown(raw)}</p>
</div>
)}
<div>
  <span className="text-[0.625rem] uppercase tracking-wider text-muted-foreground font-medium">
-    Rendered (Discord, Slack)
-  </span>
-  <p className="text-xs break-all">{appriseRenderMarkdown(raw)}</p>
-</div>
-<div>
-  <span className="text-[0.625rem] uppercase tracking-wider text-muted-foreground font-medium">
-    Raw (Telegram, email)
+    {markdown ? 'Raw (email, SMS)' : 'Preview'}
  </span>
  <p className="text-xs font-mono break-all text-muted-foreground">{raw}</p>
</div>
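The preview renderer's split regex (in the `appriseRenderMarkdown` hunk above) depends on alternation order: the two-character `**`/`__` patterns come before `*`/`_` so double delimiters win, and delimited spans survive `split` as whole tokens thanks to the capturing group. A quick illustration, assuming the same pattern:

```typescript
// Same alternation as appriseRenderMarkdown's split: bold/underline forms are
// listed before single-character emphasis so they match first, and the
// capturing group keeps each matched span in the split output.
const MD_SPLIT = /(\*\*[^*]+\*\*|__[^_]+__|`[^`]+`|\*[^*]+\*|_[^_]+_)/g;

const parts = '**DM:** alice: _hi_ via `a1,b2`'.split(MD_SPLIT);
// parts contains the whole tokens '**DM:**', '_hi_', and '`a1,b2`'
// interleaved with the plain-text segments between them.
```

Had `\*[^*]+\*` been listed first, `**DM:**` would be consumed as two malformed single-asterisk spans instead of one bold span.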
@@ -2483,9 +2509,11 @@ function AppriseConfigEditor({
  onChange: (config: Record<string, unknown>) => void;
  onScopeChange: (scope: Record<string, unknown>) => void;
}) {
-  const dmFormat = ((config.body_format_dm as string) || '').trim() || APPRISE_DEFAULT_DM;
-  const chanFormat =
-    ((config.body_format_channel as string) || '').trim() || APPRISE_DEFAULT_CHANNEL;
+  const markdown = config.markdown_format !== false;
+  const defaultDm = markdown ? APPRISE_DEFAULT_DM : APPRISE_DEFAULT_DM_PLAIN;
+  const defaultChan = markdown ? APPRISE_DEFAULT_CHANNEL : APPRISE_DEFAULT_CHANNEL_PLAIN;
+  const dmFormat = ((config.body_format_dm as string) || '').trim() || defaultDm;
+  const chanFormat = ((config.body_format_channel as string) || '').trim() || defaultChan;
  return (
    <div className="space-y-3">
@@ -2549,6 +2577,39 @@ function AppriseConfigEditor({
<h3 className="text-base font-semibold tracking-tight">Message Format</h3>
<label className="flex items-center gap-3 cursor-pointer">
<input
type="checkbox"
checked={markdown}
onChange={(e) => {
const md = e.target.checked;
const updates: Record<string, unknown> = { ...config, markdown_format: md };
const curDm = ((config.body_format_dm as string) || '').trim();
const curChan = ((config.body_format_channel as string) || '').trim();
if (md) {
if (!curDm || curDm === APPRISE_DEFAULT_DM_PLAIN)
updates.body_format_dm = APPRISE_DEFAULT_DM;
if (!curChan || curChan === APPRISE_DEFAULT_CHANNEL_PLAIN)
updates.body_format_channel = APPRISE_DEFAULT_CHANNEL;
} else {
if (!curDm || curDm === APPRISE_DEFAULT_DM)
updates.body_format_dm = APPRISE_DEFAULT_DM_PLAIN;
if (!curChan || curChan === APPRISE_DEFAULT_CHANNEL)
updates.body_format_channel = APPRISE_DEFAULT_CHANNEL_PLAIN;
}
onChange(updates);
}}
className="h-4 w-4 rounded border-border"
/>
<div>
<span className="text-sm">Markdown formatting</span>
<p className="text-[0.8125rem] text-muted-foreground">
If notifications fail on services like Telegram due to special characters in sender
names, disable this option.
</p>
</div>
</label>
<details className="group">
  <summary className="text-sm font-medium text-foreground cursor-pointer select-none flex items-center gap-1">
    <ChevronDown className="h-3 w-3 transition-transform group-open:rotate-0 -rotate-90" />
@@ -2604,12 +2665,12 @@ function AppriseConfigEditor({
<div className="space-y-2">
  <div className="flex items-center justify-between">
    <Label htmlFor="fanout-apprise-fmt-dm">DM format</Label>
-    {!appriseIsDefault(config.body_format_dm, APPRISE_DEFAULT_DM) && (
+    {!appriseIsDefault(config.body_format_dm, defaultDm) && (
      <button
        type="button"
        aria-label="Reset DM format to default"
        className="text-xs text-muted-foreground hover:text-foreground transition-colors"
-        onClick={() => onChange({ ...config, body_format_dm: APPRISE_DEFAULT_DM })}
+        onClick={() => onChange({ ...config, body_format_dm: defaultDm })}
      >
        Reset to default
      </button>
@@ -2618,23 +2679,23 @@ function AppriseConfigEditor({
<textarea
  id="fanout-apprise-fmt-dm"
  className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm font-mono min-h-[56px]"
-  placeholder={APPRISE_DEFAULT_DM}
+  placeholder={defaultDm}
  value={(config.body_format_dm as string) ?? ''}
  onChange={(e) => onChange({ ...config, body_format_dm: e.target.value })}
  rows={2}
/>
-<AppriseFormatPreview format={dmFormat} vars={APPRISE_SAMPLE_VARS_DM} />
+<AppriseFormatPreview format={dmFormat} vars={APPRISE_SAMPLE_VARS_DM} markdown={markdown} />
</div>
<div className="space-y-2">
  <div className="flex items-center justify-between">
    <Label htmlFor="fanout-apprise-fmt-chan">Channel format</Label>
-    {!appriseIsDefault(config.body_format_channel, APPRISE_DEFAULT_CHANNEL) && (
+    {!appriseIsDefault(config.body_format_channel, defaultChan) && (
      <button
        type="button"
        aria-label="Reset channel format to default"
        className="text-xs text-muted-foreground hover:text-foreground transition-colors"
-        onClick={() => onChange({ ...config, body_format_channel: APPRISE_DEFAULT_CHANNEL })}
+        onClick={() => onChange({ ...config, body_format_channel: defaultChan })}
      >
        Reset to default
      </button>
@@ -2643,12 +2704,12 @@ function AppriseConfigEditor({
<textarea
  id="fanout-apprise-fmt-chan"
  className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm font-mono min-h-[56px]"
-  placeholder={APPRISE_DEFAULT_CHANNEL}
+  placeholder={defaultChan}
  value={(config.body_format_channel as string) ?? ''}
  onChange={(e) => onChange({ ...config, body_format_channel: e.target.value })}
  rows={2}
/>
-<AppriseFormatPreview format={chanFormat} vars={APPRISE_SAMPLE_VARS} />
+<AppriseFormatPreview format={chanFormat} vars={APPRISE_SAMPLE_VARS} markdown={markdown} />
</div>

<Separator />
@@ -33,6 +33,13 @@ import {
  setSavedFontScale,
} from '../../utils/fontScale';
import { getAutoFocusInputEnabled, setAutoFocusInputEnabled } from '../../utils/autoFocusInput';
import {
getTextReplaceEnabled,
setTextReplaceEnabled as saveTextReplaceEnabled,
getTextReplaceMapJson,
setTextReplaceMapJson,
DEFAULT_MAP_JSON,
} from '../../utils/textReplace';
import {
  BATTERY_DISPLAY_CHANGE_EVENT,
  getShowBatteryPercent,
@@ -232,6 +239,9 @@ export function SettingsLocalSection({
const [batteryPercent, setBatteryPercent] = useState(getShowBatteryPercent);
const [batteryVoltage, setBatteryVoltage] = useState(getShowBatteryVoltage);
const [statusDotPulse, setStatusDotPulse] = useState(getStatusDotPulseEnabled);
const [textReplaceEnabled, setTextReplaceEnabled] = useState(getTextReplaceEnabled);
const [textReplaceJson, setTextReplaceJson] = useState(getTextReplaceMapJson);
const [textReplaceError, setTextReplaceError] = useState<string | null>(null);
const [fontScale, setFontScale] = useState(getSavedFontScale);
const [fontScaleSlider, setFontScaleSlider] = useState(getSavedFontScale);
const [fontScaleInput, setFontScaleInput] = useState(() => String(getSavedFontScale()));
@@ -439,6 +449,63 @@ export function SettingsLocalSection({
  </p>
</div>
</div>
<div className="rounded-md border border-border/60 p-3 space-y-2">
<div className="flex items-start gap-3">
<Checkbox
id="text-replace"
checked={textReplaceEnabled}
onCheckedChange={(checked) => {
const v = checked === true;
setTextReplaceEnabled(v);
saveTextReplaceEnabled(v);
}}
className="mt-0.5"
/>
<div className="space-y-1">
<Label htmlFor="text-replace">Replace as you Type</Label>
<p className="text-[0.8125rem] text-muted-foreground">
Automatically replace characters as you type in the message input. Define
replacements as a JSON object mapping source strings to their replacements.
</p>
</div>
</div>
{textReplaceEnabled && (
<div className="space-y-2 pl-7">
<textarea
value={textReplaceJson}
onChange={(e) => {
const val = e.target.value;
setTextReplaceJson(val);
setTextReplaceError(setTextReplaceMapJson(val));
}}
spellCheck={false}
rows={10}
className={cn(
'w-full rounded-md border bg-background px-3 py-2 text-sm font-mono',
textReplaceError ? 'border-destructive' : 'border-input'
)}
aria-label="Text replacement map (JSON)"
/>
{textReplaceError && (
<p className="text-xs text-destructive">
{textReplaceError} Changes are not saved until this is resolved.
</p>
)}
<button
type="button"
onClick={() => {
setTextReplaceJson(DEFAULT_MAP_JSON);
setTextReplaceMapJson(DEFAULT_MAP_JSON);
setTextReplaceError(null);
}}
className="inline-flex h-8 items-center justify-center rounded-md border border-input px-3 text-sm font-medium transition-colors hover:bg-accent focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
>
Reset to Default
</button>
</div>
)}
</div>
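The setting above expects a flat string-to-string JSON object. An illustrative (hypothetical) map — not the shipped DEFAULT_MAP_JSON, which per the tests elsewhere in this changeset maps Cyrillic homoglyphs to ASCII:

```typescript
// Example replacement map: keys are typed sequences, values their substitutes.
// Note no key occurs inside any value; a map that re-expanded (e.g. {"a":"aa"})
// would be rejected by the validator before saving.
const exampleMapJson = JSON.stringify({
  '->': '\u2192', // right arrow
  ':check:': '\u2713', // check mark
});
const parsed = JSON.parse(exampleMapJson) as Record<string, string>;
```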
</div>
<div className="space-y-3">
@@ -6,7 +6,10 @@
  padding: 0;
}

-html,
+html {
+  height: 100dvh;
+}
body,
#root {
  height: 100%;
@@ -111,6 +111,7 @@ beforeEach(() => {
tracked_telemetry_repeaters: [],
auto_resend_channel: false,
telemetry_interval_hours: 8,
telemetry_routed_hourly: false,
});
mockedApi.getRadioConfig.mockResolvedValue({
  public_key: 'aa'.repeat(32),
@@ -1050,6 +1051,7 @@ describe('SettingsFanoutSection', () => {
tracked_telemetry_repeaters: ['cc'.repeat(32)],
auto_resend_channel: false,
telemetry_interval_hours: 8,
telemetry_routed_hourly: false,
});
renderSection();
@@ -51,7 +51,7 @@ describe('MessageInput', () => {
}

function getInput() {
-  return screen.getByPlaceholderText('Type a message...') as HTMLInputElement;
+  return screen.getByPlaceholderText('Type a message...') as HTMLTextAreaElement;
}

function getSendButton() {
@@ -94,6 +94,8 @@ describe('buildRawPacketStatsSnapshot', () => {
sender: 'Alpha',
channel_key: null,
contact_key: '0a'.repeat(32),
sender_timestamp: null,
message: null,
  },
};
@@ -438,6 +438,7 @@ describe('RepeaterDashboard', () => {
flood_dups: 1,
direct_dups: 0,
full_events: 0,
recv_errors: 5,
  telemetry_history: [],
};
@@ -707,6 +708,7 @@ describe('RepeaterDashboard', () => {
flood_dups: 1,
direct_dups: 0,
full_events: 0,
recv_errors: null,
  telemetry_history: [liveEntry],
};
@@ -742,6 +744,7 @@ describe('RepeaterDashboard', () => {
flood_dups: 1,
direct_dups: 0,
full_events: 0,
recv_errors: null,
  telemetry_history: [{ timestamp: 1700000000, data: { battery_volts: 4.2 } }],
};
@@ -5,6 +5,7 @@ import { SettingsModal } from '../components/SettingsModal';
import type {
  AppSettings,
  AppSettingsUpdate,
Contact,
  HealthStatus,
  RadioAdvertMode,
  RadioConfig,
@@ -71,6 +72,7 @@ const baseSettings: AppSettings = {
tracked_telemetry_repeaters: [],
auto_resend_channel: false,
telemetry_interval_hours: 8,
telemetry_routed_hourly: false,
};

function renderModal(overrides?: {
@@ -89,6 +91,8 @@ function renderModal(overrides?: {
meshDiscovery?: RadioDiscoveryResponse | null;
meshDiscoveryLoadingTarget?: RadioDiscoveryTarget | null;
onDiscoverMesh?: (target: RadioDiscoveryTarget) => Promise<void>;
contacts?: Contact[];
trackedTelemetryRepeaters?: string[];
open?: boolean;
pageMode?: boolean;
externalSidebarNav?: boolean;
@@ -127,6 +131,8 @@ function renderModal(overrides?: {
onDiscoverMesh,
onHealthRefresh: vi.fn(async () => {}),
onRefreshAppSettings,
contacts: overrides?.contacts,
trackedTelemetryRepeaters: overrides?.trackedTelemetryRepeaters,
};

const view = overrides?.externalSidebarNav
@@ -794,4 +800,68 @@ describe('SettingsModal', () => {
    expect(screen.getByText('Network')).toBeInTheDocument();
  });
});
it('renders routed hourly checkbox and calls save on toggle', async () => {
const onSaveAppSettings = vi.fn(async () => {});
renderModal({
externalSidebarNav: true,
desktopSection: 'database',
onSaveAppSettings,
});
const checkbox = screen.getByRole('checkbox', {
name: /Poll direct\/routed-path repeaters hourly/i,
}) as HTMLInputElement;
expect(checkbox).toBeInTheDocument();
expect(checkbox.checked).toBe(false);
fireEvent.click(checkbox);
await waitFor(() => {
expect(onSaveAppSettings).toHaveBeenCalledWith(
expect.objectContaining({ telemetry_routed_hourly: true })
);
});
});
it('shows route badge per tracked repeater', async () => {
const directKey = 'bb'.repeat(32);
renderModal({
externalSidebarNav: true,
desktopSection: 'database',
appSettings: {
...baseSettings,
tracked_telemetry_repeaters: [directKey],
},
trackedTelemetryRepeaters: [directKey],
contacts: [
{
public_key: directKey,
name: 'DirectRepeater',
type: 2,
flags: 0,
direct_path: 'aabb',
direct_path_len: 1,
direct_path_hash_mode: 1,
last_advert: null,
lat: null,
lon: null,
last_seen: null,
on_radio: false,
favorite: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
effective_route: { path: 'aabb', path_len: 1, path_hash_mode: 1 },
effective_route_source: 'direct',
},
],
});
expect(screen.getByText('DirectRepeater')).toBeInTheDocument();
expect(screen.getByText('direct')).toBeInTheDocument();
});
});
@@ -0,0 +1,192 @@
import { describe, it, expect, beforeEach } from 'vitest';
import {
getTextReplaceEnabled,
setTextReplaceEnabled,
getTextReplaceMapJson,
setTextReplaceMapJson,
applyTextReplacements,
DEFAULT_MAP_JSON,
} from '../utils/textReplace';
beforeEach(() => {
localStorage.clear();
});
describe('enabled toggle', () => {
it('defaults to disabled', () => {
expect(getTextReplaceEnabled()).toBe(false);
});
it('persists enabled state', () => {
setTextReplaceEnabled(true);
expect(getTextReplaceEnabled()).toBe(true);
setTextReplaceEnabled(false);
expect(getTextReplaceEnabled()).toBe(false);
});
});
describe('map JSON persistence', () => {
it('returns default map when nothing stored', () => {
expect(getTextReplaceMapJson()).toBe(DEFAULT_MAP_JSON);
});
it('persists valid JSON and returns null', () => {
const json = '{"a":"b"}';
expect(setTextReplaceMapJson(json)).toBeNull();
expect(getTextReplaceMapJson()).toBe(json);
});
it('rejects invalid JSON with error string', () => {
const err = setTextReplaceMapJson('not json');
expect(err).toBeTypeOf('string');
// localStorage unchanged — still returns default
expect(getTextReplaceMapJson()).toBe(DEFAULT_MAP_JSON);
});
it('rejects arrays', () => {
expect(setTextReplaceMapJson('["a","b"]')).toBeTypeOf('string');
});
it('rejects non-string values', () => {
expect(setTextReplaceMapJson('{"a":123}')).toBeTypeOf('string');
});
it('rejects null', () => {
expect(setTextReplaceMapJson('null')).toBeTypeOf('string');
});
it('accepts empty object', () => {
expect(setTextReplaceMapJson('{}')).toBeNull();
});
});
describe('re-expansion validation', () => {
it('rejects when a key appears in its own replacement', () => {
const err = setTextReplaceMapJson(JSON.stringify({ a: 'aa' }));
expect(err).toBeTypeOf('string');
expect(err).toContain('"a"');
expect(err).toContain('"aa"');
});
it('rejects when a key appears in another replacement', () => {
const err = setTextReplaceMapJson(JSON.stringify({ a: 'X', b: 'ab' }));
expect(err).toBeTypeOf('string');
expect(err).toContain('"a"');
expect(err).toContain('"ab"');
});
it('allows replacements that do not contain any key', () => {
expect(setTextReplaceMapJson(JSON.stringify({ a: 'X', b: 'Y' }))).toBeNull();
});
it('allows the default Cyrillic map', () => {
expect(setTextReplaceMapJson(DEFAULT_MAP_JSON)).toBeNull();
});
it('does not check empty keys for re-expansion', () => {
// Empty key is silently skipped by buildReplacements, so it should not
// cause a re-expansion rejection for other entries.
expect(setTextReplaceMapJson(JSON.stringify({ '': 'x', b: 'Y' }))).toBeNull();
});
});
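The re-expansion rule these tests pin down — no non-empty key may occur inside any replacement value, since that would let one replacement's output immediately trigger another — can be sketched as follows (hypothetical helper; the shipped validator in utils/textReplace may differ in detail):

```typescript
// Hypothetical sketch of the re-expansion check exercised by the tests above.
// Returns the offending [key, value] pair, or null when the map is safe.
function findReexpansion(map: Record<string, string>): [string, string] | null {
  for (const key of Object.keys(map)) {
    if (key === '') continue; // empty keys are skipped, as the tests require
    for (const value of Object.values(map)) {
      if (value.includes(key)) return [key, value];
    }
  }
  return null;
}
```

Rejecting such maps up front is what lets the replacement pass run once over the text without risking an infinite expand-replace loop.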
describe('applyTextReplacements', () => {
const simpleMap = JSON.stringify({ a: 'X', b: 'Y' });
it('returns null when no replacements match', () => {
expect(applyTextReplacements('hello', 5, simpleMap)).toBeNull();
});
it('returns null for empty map', () => {
expect(applyTextReplacements('abc', 3, '{}')).toBeNull();
});
it('returns null for invalid JSON', () => {
expect(applyTextReplacements('abc', 3, 'broken')).toBeNull();
});
it('replaces a single character with cursor at end', () => {
const result = applyTextReplacements('a', 1, simpleMap);
expect(result).toEqual({ text: 'X', cursor: 1 });
});
it('replaces multiple characters in one pass', () => {
const result = applyTextReplacements('ab', 2, simpleMap);
expect(result).toEqual({ text: 'XY', cursor: 2 });
});
it('adjusts cursor when replacement is longer than needle', () => {
const map = JSON.stringify({ ':)': 'smiley' });
// "hello :)" cursor at end (8)
const result = applyTextReplacements('hello :)', 8, map);
expect(result).toEqual({ text: 'hello smiley', cursor: 12 });
});
it('adjusts cursor when replacement is shorter than needle', () => {
const map = JSON.stringify({ abc: 'Z' });
// "abcdef" cursor at end (6)
const result = applyTextReplacements('abcdef', 6, map);
expect(result).toEqual({ text: 'Zdef', cursor: 4 });
});
it('preserves cursor position when replacement is before cursor', () => {
const map = JSON.stringify({ a: 'XX' });
// "a_b" cursor at 2 (on 'b'), 'a' replaced with 'XX'
const result = applyTextReplacements('a_b', 2, map);
expect(result).toEqual({ text: 'XX_b', cursor: 3 });
});
it('does not adjust cursor for replacements after cursor', () => {
const map = JSON.stringify({ b: 'YY' });
// "ab" cursor at 1 (after 'a'), 'b' is after cursor
const result = applyTextReplacements('ab', 1, map);
expect(result).toEqual({ text: 'aYY', cursor: 1 });
});
it('places cursor after replacement when cursor is inside a multi-char match', () => {
const map = JSON.stringify({ abc: 'Z' });
// "abc" cursor at 2 (inside the match)
const result = applyTextReplacements('abc', 2, map);
expect(result).toEqual({ text: 'Z', cursor: 1 });
});
it('handles multiple replacements with cursor tracking', () => {
const map = JSON.stringify({ ':)': 'S' });
// ":):)" cursor at end (4) — two replacements, each shrinks by 1
const result = applyTextReplacements(':):)', 4, map);
expect(result).toEqual({ text: 'SS', cursor: 2 });
});
it('cursor between two replacements stays correct', () => {
const map = JSON.stringify({ ':)': 'S' });
// ":):)" cursor at 2 (between the two smileys)
const result = applyTextReplacements(':):)', 2, map);
expect(result).toEqual({ text: 'SS', cursor: 1 });
});
it('uses longest match first', () => {
const map = JSON.stringify({ ab: 'LONG', a: 'X' });
const result = applyTextReplacements('ab', 2, map);
expect(result).toEqual({ text: 'LONG', cursor: 4 });
});
it('ignores empty-string keys (no infinite loop)', () => {
const map = JSON.stringify({ '': 'oops', a: 'X' });
const result = applyTextReplacements('abc', 3, map);
expect(result).toEqual({ text: 'Xbc', cursor: 3 });
});
it('works with the default Cyrillic map', () => {
// "Привет" — П has no mapping, р→p, и has no mapping, в has no mapping, е→e, т has no mapping
const result = applyTextReplacements('Привет', 6, DEFAULT_MAP_JSON);
expect(result).not.toBeNull();
expect(result!.text).toBe('Пpивeт');
expect(result!.cursor).toBe(6);
});
it('handles paste with many replacements', () => {
const map = JSON.stringify({ А: 'A', В: 'B', С: 'C' });
const result = applyTextReplacements('АВС', 3, map);
expect(result).toEqual({ text: 'ABC', cursor: 3 });
});
});
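The behavior this suite describes — longest needle wins, a single left-to-right pass, and cursor positions tracked in the original string's coordinates — can be sketched as below. This is a hypothetical reconstruction from the tests, taking an already-parsed map rather than the JSON string the real utils/textReplace function accepts:

```typescript
// Hypothetical sketch of the replacement pass the tests above describe.
// Returns null when nothing changed, else the new text and adjusted cursor.
function applyReplacementsSketch(
  text: string,
  cursor: number,
  map: Record<string, string>
): { text: string; cursor: number } | null {
  // Longest needle first so 'ab' beats 'a' at the same position.
  const needles = Object.keys(map)
    .filter((k) => k.length > 0) // skip empty keys: they would never advance i
    .sort((a, b) => b.length - a.length);
  if (needles.length === 0) return null;

  let out = '';
  let newCursor = cursor;
  let changed = false;
  let i = 0;
  while (i < text.length) {
    const needle = needles.find((n) => text.startsWith(n, i));
    if (needle) {
      const replacement = map[needle];
      out += replacement;
      changed = true;
      if (cursor >= i + needle.length) {
        // Match entirely before the cursor: shift by the length delta.
        newCursor += replacement.length - needle.length;
      } else if (cursor > i) {
        // Cursor inside the match: land it just after the replacement.
        newCursor = out.length;
      }
      i += needle.length;
    } else {
      out += text[i];
      i += 1;
    }
  }
  return changed ? { text: out, cursor: newCursor } : null;
}
```

Comparing against the original `cursor` (not the shifted one) on each match is what keeps multi-replacement cases like `':):)'` correct: every match position is expressed in pre-replacement coordinates.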
@@ -343,6 +343,8 @@ export interface RawPacket {
sender: string | null;
channel_key: string | null;
contact_key: string | null;
sender_timestamp: number | null;
message: string | null;
  } | null;
}
@@ -359,6 +361,7 @@ export interface AppSettings {
tracked_telemetry_repeaters: string[];
auto_resend_channel: boolean;
telemetry_interval_hours: number;
telemetry_routed_hourly: boolean;
} }
export interface AppSettingsUpdate { export interface AppSettingsUpdate {
@@ -371,6 +374,7 @@ export interface AppSettingsUpdate {
blocked_names?: string[]; blocked_names?: string[];
discovery_blocked_types?: number[]; discovery_blocked_types?: number[];
telemetry_interval_hours?: number; telemetry_interval_hours?: number;
telemetry_routed_hourly?: boolean;
} }
export interface TelemetrySchedule { export interface TelemetrySchedule {
@@ -380,6 +384,8 @@ export interface TelemetrySchedule {
tracked_count: number; tracked_count: number;
max_tracked: number; max_tracked: number;
next_run_at: number | null; next_run_at: number | null;
routed_hourly: boolean;
next_routed_run_at: number | null;
} }
export interface TrackedTelemetryResponse { export interface TrackedTelemetryResponse {
@@ -438,6 +444,7 @@ export interface RepeaterStatusResponse {
flood_dups: number; flood_dups: number;
direct_dups: number; direct_dups: number;
full_events: number; full_events: number;
recv_errors: number | null;
telemetry_history: TelemetryHistoryEntry[]; telemetry_history: TelemetryHistoryEntry[];
} }
+50 -45
@@ -324,51 +324,56 @@ export function inspectRawPacketWithOptions(
     createPacketField('payload', `payload-${index}`, segment, structure.payload.startByte)
   );

-  const enrichedPayloadFields =
-    decoded?.isValid && decoded.payloadType === PayloadType.GroupText && decoded.payload.decoded
-      ? payloadFields.map((field) => {
-          if (field.name !== 'Ciphertext') {
-            return field;
-          }
-          const payload = decoded.payload.decoded as {
-            decrypted?: { timestamp?: number; flags?: number; sender?: string; message?: string };
-          };
-          if (!payload.decrypted?.message) {
-            return field;
-          }
-          const detailLines = [
-            payload.decrypted.timestamp != null
-              ? `Timestamp: ${formatUnixTimestamp(payload.decrypted.timestamp)}`
-              : null,
-            payload.decrypted.flags != null
-              ? `Flags: 0x${payload.decrypted.flags.toString(16).padStart(2, '0')}`
-              : null,
-            payload.decrypted.sender ? `Sender: ${payload.decrypted.sender}` : null,
-            `Message: ${payload.decrypted.message}`,
-          ].filter((line): line is string => line !== null);
-          return {
-            ...field,
-            description: describeCiphertextStructure(
-              decoded.payloadType,
-              field.endByte - field.startByte + 1,
-              field.description
-            ),
-            decryptedMessage: detailLines.join('\n'),
-          };
-        })
-      : payloadFields.map((field) => {
-          if (!decoded?.isValid || field.name !== 'Ciphertext') {
-            return field;
-          }
-          return {
-            ...field,
-            description: describeCiphertextStructure(
-              decoded.payloadType,
-              field.endByte - field.startByte + 1,
-              field.description
-            ),
-          };
-        });
+  const enrichedPayloadFields = payloadFields.map((field) => {
+    if (!decoded?.isValid || field.name !== 'Ciphertext') {
+      return field;
+    }
+
+    const withStructure = {
+      ...field,
+      description: describeCiphertextStructure(
+        decoded.payloadType,
+        field.endByte - field.startByte + 1,
+        field.description
+      ),
+    };
+
+    // GroupText: client-side decoder has the decrypted content
+    if (decoded.payloadType === PayloadType.GroupText && decoded.payload.decoded) {
+      const payload = decoded.payload.decoded as {
+        decrypted?: { timestamp?: number; flags?: number; sender?: string; message?: string };
+      };
+      if (!payload.decrypted?.message) {
+        return withStructure;
+      }
+      const detailLines = [
+        payload.decrypted.timestamp != null
+          ? `Sent (packet): ${formatUnixTimestamp(payload.decrypted.timestamp)}`
+          : null,
+        payload.decrypted.flags != null
+          ? `Flags: 0x${payload.decrypted.flags.toString(16).padStart(2, '0')}`
+          : null,
+        payload.decrypted.sender ? `Sender: ${payload.decrypted.sender}` : null,
+        `Message: ${payload.decrypted.message}`,
+      ].filter((line): line is string => line !== null);
+      return { ...withStructure, decryptedMessage: detailLines.join('\n') };
+    }
+
+    // TextMessage (DM): server-side decryption via decrypted_info
+    if (decoded.payloadType === PayloadType.TextMessage && packet.decrypted_info?.message) {
+      const info = packet.decrypted_info;
+      const detailLines = [
+        info.sender_timestamp != null
+          ? `Sent (packet): ${formatUnixTimestamp(info.sender_timestamp)}`
+          : null,
+        info.sender ? `Sender: ${info.sender}` : null,
+        `Message: ${info.message}`,
+      ].filter((line): line is string => line !== null);
+      return { ...withStructure, decryptedMessage: detailLines.join('\n') };
+    }
+
+    return withStructure;
+  });

   return {
     decoded,
+142
@@ -0,0 +1,142 @@
const ENABLED_KEY = 'remoteterm-text-replace-enabled';
const MAP_KEY = 'remoteterm-text-replace-map';
const DEFAULT_MAP: Record<string, string> = {
А: 'A',
В: 'B',
Е: 'E',
Ё: 'E',
З: '3',
К: 'K',
М: 'M',
Н: 'H',
О: 'O',
Р: 'P',
С: 'C',
Т: 'T',
Х: 'X',
Ь: 'b',
а: 'a',
е: 'e',
ё: 'e',
о: 'o',
р: 'p',
с: 'c',
у: 'y',
х: 'x',
};
export const DEFAULT_MAP_JSON = JSON.stringify(DEFAULT_MAP, null, 2);
export function getTextReplaceEnabled(): boolean {
try {
return localStorage.getItem(ENABLED_KEY) === 'true';
} catch {
return false;
}
}
export function setTextReplaceEnabled(enabled: boolean): void {
try {
if (enabled) {
localStorage.setItem(ENABLED_KEY, 'true');
} else {
localStorage.removeItem(ENABLED_KEY);
}
} catch {
// localStorage may be unavailable
}
}
export function getTextReplaceMapJson(): string {
try {
const raw = localStorage.getItem(MAP_KEY);
if (raw !== null) return raw;
} catch {
// fall through
}
return DEFAULT_MAP_JSON;
}
/** Persist the map JSON only if it's valid. Returns null on success or an error string. */
export function setTextReplaceMapJson(json: string): string | null {
try {
const parsed = JSON.parse(json);
if (typeof parsed !== 'object' || parsed === null || Array.isArray(parsed))
return 'Must be a JSON object.';
const rawEntries = Object.entries(parsed);
for (const [k, v] of rawEntries) {
if (typeof k !== 'string' || typeof v !== 'string')
return 'All keys and values must be strings.';
}
const entries = rawEntries as [string, string][];
// Check for re-expansion: no key may appear as a substring of any replacement value.
for (const [needle] of entries) {
if (needle.length === 0) continue;
for (const [, replacement] of entries) {
if (replacement.includes(needle)) {
return `Key "${needle}" appears inside replacement "${replacement}" and would re-expand on every keystroke.`;
}
}
}
localStorage.setItem(MAP_KEY, json);
return null;
} catch {
return 'Invalid JSON.';
}
}
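The re-expansion guard above rejects any map in which a key appears as a substring of a replacement value, since the map is re-applied on every keystroke and such a key would expand endlessly. A standalone sketch of just that check (the helper name `findReExpansion` is illustrative, not part of the module's exported API):

```typescript
// Returns an error string if any map key would re-expand, else null.
// Mirrors the validation loop in setTextReplaceMapJson (sketch only).
function findReExpansion(map: Record<string, string>): string | null {
  const entries = Object.entries(map);
  for (const [needle] of entries) {
    if (needle.length === 0) continue; // empty keys are ignored by the matcher
    for (const [, replacement] of entries) {
      if (replacement.includes(needle)) {
        return `Key "${needle}" appears inside replacement "${replacement}"`;
      }
    }
  }
  return null;
}
```

For example, `{ "a": "banana" }` is rejected because typing `a` would produce `banana`, which itself contains `a` and would expand again on the next pass.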
/** Build a sorted-by-length-desc array of [needle, replacement] for efficient matching. */
function buildReplacements(json: string): [string, string][] {
try {
const parsed = JSON.parse(json) as Record<string, string>;
return Object.entries(parsed)
.filter(([k]) => k.length > 0)
.sort((a, b) => b[0].length - a[0].length);
} catch {
return [];
}
}
/**
* Apply text replacements and compute the adjusted cursor position.
* Returns null if nothing changed.
*/
export function applyTextReplacements(
text: string,
cursorPos: number,
mapJson: string
): { text: string; cursor: number } | null {
const replacements = buildReplacements(mapJson);
if (replacements.length === 0) return null;
let result = '';
let newCursor = cursorPos;
let i = 0;
while (i < text.length) {
let matched = false;
for (const [needle, replacement] of replacements) {
if (text.startsWith(needle, i)) {
result += replacement;
// Adjust cursor if this match is before or spans the cursor
if (i + needle.length <= cursorPos) {
newCursor += replacement.length - needle.length;
} else if (i < cursorPos) {
// Cursor is inside this match — place it after the replacement
newCursor = result.length;
}
i += needle.length;
matched = true;
break;
}
}
if (!matched) {
result += text[i];
i++;
}
}
if (result === text) return null;
return { text: result, cursor: newCursor };
}
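To see the cursor math above in isolation: a match that ends at or before the cursor shifts the cursor by the length delta of the replacement, while a match that spans the cursor snaps it to the end of the replacement. A minimal single-entry sketch of the same scan (standalone, not importing the module; `replaceWithCursor` is a hypothetical name):

```typescript
// Standalone sketch of applyTextReplacements' cursor rule for one map entry.
function replaceWithCursor(
  text: string,
  cursor: number,
  needle: string,
  replacement: string
): { text: string; cursor: number } {
  let result = '';
  let newCursor = cursor;
  let i = 0;
  while (i < text.length) {
    if (text.startsWith(needle, i)) {
      result += replacement;
      if (i + needle.length <= cursor) {
        // Match ends at or before the cursor: shift by the size change
        newCursor += replacement.length - needle.length;
      } else if (i < cursor) {
        // Match spans the cursor: snap to the end of the replacement
        newCursor = result.length;
      }
      i += needle.length;
    } else {
      result += text[i];
      i++;
    }
  }
  return { text: result, cursor: newCursor };
}

// ":):)" with cursor at 4 — both smileys shrink by one character each
// → { text: 'SS', cursor: 2 }
```

This matches the unit tests above: two replacements before the cursor each move it back one position, while a cursor sitting between the two smileys only absorbs the first shift.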
+2 -2
@@ -1,6 +1,6 @@
[project] [project]
name = "remoteterm-meshcore" name = "remoteterm-meshcore"
version = "3.12.3" version = "3.13.0"
description = "RemoteTerm - Web interface for MeshCore radio mesh networks" description = "RemoteTerm - Web interface for MeshCore radio mesh networks"
readme = "README.md" readme = "README.md"
requires-python = ">=3.11" requires-python = ">=3.11"
@@ -12,7 +12,7 @@ dependencies = [
"httpx>=0.28.1", "httpx>=0.28.1",
"pycryptodome>=3.20.0", "pycryptodome>=3.20.0",
"pynacl>=1.5.0", "pynacl>=1.5.0",
"meshcore==2.3.2", "meshcore==2.3.7",
"aiomqtt>=2.0", "aiomqtt>=2.0",
"apprise>=1.9.8", "apprise>=1.9.8",
"boto3>=1.38.0", "boto3>=1.38.0",
+7 -6
@@ -23,8 +23,9 @@ test.describe('Channel messaging in #flightless', () => {
     // Send it
     await page.getByRole('button', { name: 'Send', exact: true }).click();

-    // Verify message appears in the message list
-    await expect(page.getByText(testMessage)).toBeVisible({ timeout: 15_000 });
+    // Verify message appears in the message list (use locator('span') to avoid
+    // matching the textarea which may briefly retain the sent text)
+    await expect(page.locator('span', { hasText: testMessage })).toBeVisible({ timeout: 15_000 });
   });

   test('outgoing message shows ack indicator', async ({ page }) => {

@@ -37,8 +38,8 @@ test.describe('Channel messaging in #flightless', () => {
     await input.fill(testMessage);
     await page.getByRole('button', { name: 'Send', exact: true }).click();

-    // Wait for the message to appear
-    const messageEl = page.getByText(testMessage);
+    // Wait for the message to appear in the message list
+    const messageEl = page.locator('span', { hasText: testMessage });
     await expect(messageEl).toBeVisible({ timeout: 15_000 });

     // Outgoing messages show either "?" (pending) or "✓" (acked)

@@ -58,7 +59,7 @@ test.describe('Channel messaging in #flightless', () => {
     await input.fill(testMessage);
     await page.getByRole('button', { name: 'Send', exact: true }).click();

-    const messageEl = page.getByText(testMessage).first();
+    const messageEl = page.locator('span', { hasText: testMessage }).first();
     await expect(messageEl).toBeVisible({ timeout: 15_000 });

     const messageContainer = messageEl.locator(

@@ -94,6 +95,6 @@ test.describe('Channel messaging in #flightless', () => {
     await expect(page.getByText('Message resent')).toBeVisible({ timeout: 10_000 });

     // Byte-perfect resend should not create a second visible row in this conversation.
-    await expect(page.getByText(testMessage)).toHaveCount(1);
+    await expect(page.locator('span', { hasText: testMessage })).toHaveCount(1);
   });
 });
+17 -17
@@ -50,7 +50,7 @@ def _patch_require_connected(mc=None, *, detail="Radio not connected"):
    if mc is None:
        return patch(
            "app.services.radio_runtime.radio_runtime.require_connected",
-           side_effect=HTTPException(status_code=503, detail=detail),
+           side_effect=HTTPException(status_code=423, detail=detail),
        )
    return patch("app.services.radio_runtime.radio_runtime.require_connected", return_value=mc)

@@ -422,11 +422,11 @@ class TestDebugEndpoint:
class TestRadioDisconnectedHandler:
-   """Test that RadioDisconnectedError maps to 503."""
+   """Test that RadioDisconnectedError maps to 423."""

    @pytest.mark.asyncio
-   async def test_disconnect_race_returns_503(self, test_db, client):
-       """If radio disconnects between require_connected() and lock acquisition, return 503."""
+   async def test_disconnect_race_returns_423(self, test_db, client):
+       """If radio disconnects between require_connected() and lock acquisition, return 423."""
        pub_key = "ab" * 32
        await _insert_contact(pub_key, "Alice")

@@ -437,7 +437,7 @@
            "/api/messages/direct", json={"destination": pub_key, "text": "Hi"}
        )
-       assert response.status_code == 503
+       assert response.status_code == 423
        assert "not connected" in response.json()["detail"].lower()

@@ -500,25 +500,25 @@ class TestMessagesEndpoint:
    @pytest.mark.asyncio
    async def test_send_direct_message_requires_connection(self, test_db, client):
-       """Sending message when disconnected returns 503."""
+       """Sending message when disconnected returns 423."""
        with _patch_require_connected():
            response = await client.post(
                "/api/messages/direct", json={"destination": "abc123", "text": "Hello"}
            )
-       assert response.status_code == 503
+       assert response.status_code == 423
        assert "not connected" in response.json()["detail"].lower()

    @pytest.mark.asyncio
    async def test_send_channel_message_requires_connection(self, test_db, client):
-       """Sending channel message when disconnected returns 503."""
+       """Sending channel message when disconnected returns 423."""
        with _patch_require_connected():
            response = await client.post(
                "/api/messages/channel",
                json={"channel_key": "0123456789ABCDEF0123456789ABCDEF", "text": "Hello"},
            )
-       assert response.status_code == 503
+       assert response.status_code == 423

    @pytest.mark.asyncio
    async def test_send_direct_message_emits_websocket_message_event(self, test_db, client):

@@ -603,8 +603,8 @@
        assert "not found" in response.json()["detail"].lower()

    @pytest.mark.asyncio
-   async def test_send_direct_message_duplicate_returns_500(self, test_db):
-       """If MessageRepository.create returns None (duplicate), returns 500."""
+   async def test_send_direct_message_duplicate_returns_422(self, test_db):
+       """If MessageRepository.create returns None (duplicate), returns 422."""
        from app.models import SendDirectMessageRequest
        from app.routers.messages import send_direct_message

@@ -636,12 +636,12 @@
            SendDirectMessageRequest(destination=pub_key, text="Hello")
        )
-       assert exc_info.value.status_code == 500
+       assert exc_info.value.status_code == 422
        assert "unexpected duplicate" in exc_info.value.detail.lower()

    @pytest.mark.asyncio
-   async def test_send_channel_message_duplicate_returns_500(self, test_db):
-       """If MessageRepository.create returns None (duplicate), returns 500."""
+   async def test_send_channel_message_duplicate_returns_422(self, test_db):
+       """If MessageRepository.create returns None (duplicate), returns 422."""
        from app.models import SendChannelMessageRequest
        from app.routers.messages import send_channel_message

@@ -672,16 +672,16 @@
            SendChannelMessageRequest(channel_key=chan_key, text="Hello")
        )
-       assert exc_info.value.status_code == 500
+       assert exc_info.value.status_code == 422
        assert "unexpected duplicate" in exc_info.value.detail.lower()

    @pytest.mark.asyncio
    async def test_resend_channel_message_requires_connection(self, test_db, client):
-       """Resend endpoint returns 503 when radio is disconnected."""
+       """Resend endpoint returns 423 when radio is disconnected."""
        with _patch_require_connected():
            response = await client.post("/api/messages/channel/1/resend")
-       assert response.status_code == 503
+       assert response.status_code == 423
        assert "not connected" in response.json()["detail"].lower()

    @pytest.mark.asyncio
+1 -1
@@ -709,7 +709,7 @@ class TestBotMessageRateLimiting:
            patch(
                "app.routers.messages.send_direct_message",
                new_callable=AsyncMock,
-               side_effect=HTTPException(status_code=500, detail="Send failed"),
+               side_effect=HTTPException(status_code=422, detail="Send failed"),
            ),
        ):
            await process_bot_response(
+2 -2
@@ -317,7 +317,7 @@ class TestPathDiscovery:
        mock_broadcast.assert_called_once_with("contact", updated.model_dump())

    @pytest.mark.asyncio
-   async def test_returns_504_when_no_response_is_heard(self, test_db, client):
+   async def test_returns_408_when_no_response_is_heard(self, test_db, client):
        await _insert_contact(KEY_A, "Alice", type=1)
        mc = MagicMock()
        mc.commands = MagicMock()

@@ -332,7 +332,7 @@
        mock_rm.radio_operation = _noop_radio_operation(mc)
        response = await client.post(f"/api/contacts/{KEY_A}/path-discovery")
-       assert response.status_code == 504
+       assert response.status_code == 408
        assert "No path discovery response heard" in response.json()["detail"]
+128
@@ -1367,6 +1367,134 @@ class TestAppriseValidation:
        assert scope["raw_packets"] == "none"
        assert scope["messages"] == "all"
def test_validate_apprise_config_accepts_markdown_format_bool(self):
from app.routers.fanout import _validate_apprise_config
_validate_apprise_config({"urls": "discord://123/abc", "markdown_format": False})
def test_validate_apprise_config_normalizes_markdown_format(self):
from app.routers.fanout import _validate_apprise_config
config: dict = {"urls": "discord://123/abc", "markdown_format": 0}
_validate_apprise_config(config)
assert config["markdown_format"] is False
def test_validate_apprise_config_works_without_markdown_format(self):
from app.routers.fanout import _validate_apprise_config
_validate_apprise_config({"urls": "discord://123/abc"})
class TestAppriseMarkdownFormat:
def test_format_body_markdown_true_uses_markdown_fallback(self):
from app.fanout.apprise_mod import _format_body
body = _format_body(
{"type": "PRIV", "text": "hi", "sender_name": "Alice"},
markdown=True,
)
assert "**DM:**" in body
def test_format_body_markdown_false_uses_plain_fallback(self):
from app.fanout.apprise_mod import _format_body
body = _format_body(
{"type": "PRIV", "text": "hi", "sender_name": "Alice"},
markdown=False,
)
assert "**" not in body
assert "DM:" in body
assert "Alice" in body
def test_format_body_markdown_false_channel(self):
from app.fanout.apprise_mod import _format_body
body = _format_body(
{"type": "CHAN", "text": "hi", "sender_name": "Bob", "channel_name": "#gen"},
markdown=False,
)
assert "**" not in body
assert "#gen:" in body
def test_send_sync_passes_markdown_body_format(self):
from unittest.mock import MagicMock, patch
with patch("app.fanout.apprise_mod.apprise_lib", create=True) as mock_lib:
mock_notifier = MagicMock()
mock_notifier.notify.return_value = True
mock_lib.Apprise.return_value = mock_notifier
with patch.dict("sys.modules", {"apprise": mock_lib}):
from app.fanout.apprise_mod import _send_sync
_send_sync("json://localhost", "test", preserve_identity=False, markdown=True)
call_kwargs = mock_notifier.notify.call_args
assert call_kwargs.kwargs.get("body_format") or call_kwargs[1].get("body_format")
def test_send_sync_passes_text_body_format_when_markdown_false(self):
from unittest.mock import MagicMock, patch
with patch("app.fanout.apprise_mod.apprise_lib", create=True) as mock_lib:
mock_notifier = MagicMock()
mock_notifier.notify.return_value = True
mock_lib.Apprise.return_value = mock_notifier
with patch.dict("sys.modules", {"apprise": mock_lib}):
from app.fanout.apprise_mod import _send_sync
_send_sync("json://localhost", "test", preserve_identity=False, markdown=False)
call_kwargs = mock_notifier.notify.call_args
assert call_kwargs.kwargs.get("body_format") or call_kwargs[1].get("body_format")
@pytest.mark.asyncio
async def test_on_message_reads_markdown_format_config(self):
from unittest.mock import patch as _patch
from app.fanout.apprise_mod import AppriseModule
mod = AppriseModule("test", {"urls": "json://localhost", "markdown_format": False})
with _patch("app.fanout.apprise_mod._send_sync", return_value=True) as mock_send:
await mod.on_message(
{"type": "PRIV", "text": "hello", "outgoing": False, "sender_name": "S_Borkin"}
)
mock_send.assert_called_once()
assert mock_send.call_args.kwargs.get("markdown") is False
@pytest.mark.asyncio
async def test_on_message_defaults_markdown_true(self):
from unittest.mock import patch as _patch
from app.fanout.apprise_mod import AppriseModule
mod = AppriseModule("test", {"urls": "json://localhost"})
with _patch("app.fanout.apprise_mod._send_sync", return_value=True) as mock_send:
await mod.on_message(
{"type": "PRIV", "text": "hello", "outgoing": False, "sender_name": "Alice"}
)
mock_send.assert_called_once()
assert mock_send.call_args.kwargs.get("markdown") is True
@pytest.mark.asyncio
async def test_on_message_markdown_false_uses_plain_default_format(self):
from unittest.mock import patch as _patch
from app.fanout.apprise_mod import AppriseModule
mod = AppriseModule("test", {"urls": "json://localhost", "markdown_format": False})
with _patch("app.fanout.apprise_mod._send_sync", return_value=True) as mock_send:
await mod.on_message(
{
"type": "CHAN",
"text": "hi",
"outgoing": False,
"sender_name": "Bob",
"channel_name": "#general",
}
)
body = mock_send.call_args[0][1]
assert "**" not in body
assert "#general:" in body
# ---------------------------------------------------------------------------
# Comprehensive scope/filter selection logic tests
+40
@@ -1580,6 +1580,46 @@ class TestFanoutAppriseIntegration:
        assert "Eve" in body_text
        assert "routed msg" in body_text
@pytest.mark.asyncio
async def test_apprise_markdown_false_delivers_plain_text(
self, apprise_capture_server, integration_db
):
"""Apprise with markdown_format=False delivers without markdown formatting."""
cfg = await FanoutConfigRepository.create(
config_type="apprise",
name="Plain Apprise",
config={
"urls": f"json://127.0.0.1:{apprise_capture_server.port}",
"markdown_format": False,
},
scope={"messages": "all", "raw_packets": "none"},
enabled=True,
)
manager = FanoutManager()
try:
await manager.load_from_db()
assert cfg["id"] in manager._modules
await manager.broadcast_message(
{
"type": "PRIV",
"conversation_key": "pk1",
"text": "hello",
"sender_name": "S_Borkin",
}
)
results = await apprise_capture_server.wait_for(1)
finally:
await manager.stop_all()
assert len(results) >= 1
body_text = str(results[0])
assert "S_Borkin" in body_text
assert "hello" in body_text
assert "**" not in body_text
# ---------------------------------------------------------------------------
# Bot lifecycle tests
+1 -1
@@ -2,4 +2,4 @@
 # run ``run_migrations`` to completion assert ``get_version == LATEST`` and
 # ``applied == LATEST - starting_version`` so only this constant needs to
 # change, not every individual assertion.
-LATEST_SCHEMA_VERSION = 60
+LATEST_SCHEMA_VERSION = 61
+2 -2
@@ -342,8 +342,8 @@ class TestConnectionLoop:
        assert sleep_args[0] == _BACKOFF_MIN
        assert sleep_args[1] == _BACKOFF_MIN * 2
        assert sleep_args[2] == _BACKOFF_MIN * 4
-       # Fourth should be capped at _backoff_max (5*8=40 > 30)
-       assert sleep_args[3] == MqttPublisher._backoff_max
+       # Fourth is still doubling (5*8=40), not yet at _backoff_max
+       assert sleep_args[3] == _BACKOFF_MIN * 8

    @pytest.mark.asyncio
    async def test_waits_for_settings_when_unconfigured(self):
+1 -1
@@ -125,7 +125,7 @@ class TestRadioDiscovery:
class TestRepeaterDiscovery:
    def test_produces_sensor_per_field(self):
        configs = _repeater_discovery_configs("mc", "ccdd11223344", "Rep1", "aabb")
-       assert len(configs) == 7  # matches _REPEATER_SENSORS length
+       assert len(configs) == 8  # matches _REPEATER_SENSORS length
        topics = [t for t, _ in configs]
        assert "homeassistant/sensor/meshcore_ccdd11223344/battery_voltage/config" in topics
+2
@@ -95,6 +95,8 @@ class TestGetRawPacket:
            "sender": "Alice",
            "channel_key": channel_key,
            "contact_key": None,
+           "sender_timestamp": 1700000000,
+           "message": "Alice: hello",
        }
+6 -6
@@ -174,8 +174,8 @@ class TestRadioOperationYield:
class TestRequireConnected:
    """Test the require_connected() FastAPI dependency."""

-   def test_raises_503_when_setup_in_progress(self):
-       """HTTPException 503 is raised when radio is connected but setup is still in progress."""
+   def test_raises_423_when_setup_in_progress(self):
+       """HTTPException 423 is raised when radio is connected but setup is still in progress."""
        from fastapi import HTTPException

        from app.services.radio_runtime import radio_runtime

@@ -188,11 +188,11 @@
        with pytest.raises(HTTPException) as exc_info:
            radio_runtime.require_connected()

-       assert exc_info.value.status_code == 503
+       assert exc_info.value.status_code == 423
        assert "initializing" in exc_info.value.detail.lower()

-   def test_raises_503_when_not_connected(self):
-       """HTTPException 503 is raised when radio is not connected."""
+   def test_raises_423_when_not_connected(self):
+       """HTTPException 423 is raised when radio is not connected."""
        from fastapi import HTTPException

        from app.services.radio_runtime import radio_runtime

@@ -205,7 +205,7 @@
        with pytest.raises(HTTPException) as exc_info:
            radio_runtime.require_connected()

-       assert exc_info.value.status_code == 503
+       assert exc_info.value.status_code == 423

    def test_returns_meshcore_when_connected_and_setup_complete(self):
        """Returns meshcore instance when radio is connected and setup is complete."""
+11 -11
@@ -131,14 +131,14 @@ class TestGetRadioConfig:
        assert response.advert_location_source == "current"

    @pytest.mark.asyncio
-   async def test_returns_503_when_self_info_missing(self):
+   async def test_returns_423_when_self_info_missing(self):
        mc = MagicMock()
        mc.self_info = None

        with patch("app.routers.radio.radio_manager.require_connected", return_value=mc):
            with pytest.raises(HTTPException) as exc:
                await get_radio_config()

-       assert exc.value.status_code == 503
+       assert exc.value.status_code == 423


class TestUpdateRadioConfig:

@@ -278,7 +278,7 @@
        with pytest.raises(HTTPException) as exc:
            await update_radio_config(RadioConfigUpdate(path_hash_mode=1))

-       assert exc.value.status_code == 500
+       assert exc.value.status_code == 422
        assert "Failed to set path hash mode" in str(exc.value.detail)
        assert radio_manager.path_hash_mode == 0
        mc.commands.send_appstart.assert_not_awaited()

@@ -339,7 +339,7 @@ class TestPrivateKeyImport:
        with pytest.raises(HTTPException) as exc:
            await set_private_key(PrivateKeyUpdate(private_key="aa" * 64))

-       assert exc.value.status_code == 500
+       assert exc.value.status_code == 422


class TestDiscoverMesh:

@@ -699,7 +699,7 @@ class TestTracePath:
        assert "not a repeater" in exc.value.detail

    @pytest.mark.asyncio
-   async def test_returns_504_when_no_trace_response_is_heard(self):
+   async def test_returns_408_when_no_trace_response_is_heard(self):
        mc = _mock_meshcore_with_info()
        repeater = Contact(
            public_key="44" * 32,

@@ -741,7 +741,7 @@
            )
        )

-       assert exc.value.status_code == 504
+       assert exc.value.status_code == 408
        assert "No trace response heard" in exc.value.detail

    @pytest.mark.asyncio

@@ -850,7 +850,7 @@
        with pytest.raises(HTTPException) as exc:
            await discover_mesh(RadioDiscoveryRequest(target="sensors"))

-       assert exc.value.status_code == 500
+       assert exc.value.status_code == 422
        assert exc.value.detail == "Failed to start mesh discovery"

    @pytest.mark.asyncio

@@ -887,7 +887,7 @@
        with pytest.raises(HTTPException) as exc:
            await set_private_key(PrivateKeyUpdate(private_key="aa" * 64))

-       assert exc.value.status_code == 500
+       assert exc.value.status_code == 422
        assert "keystore" in exc.value.detail.lower()
        # Called twice: initial attempt + one retry
        assert mock_export.await_count == 2

@@ -926,7 +926,7 @@ class TestAdvertise:
        with pytest.raises(HTTPException) as exc:
            await send_advertisement()

-       assert exc.value.status_code == 500
+       assert exc.value.status_code == 422

    @pytest.mark.asyncio
    async def test_defaults_to_flood_mode(self):

@@ -1059,7 +1059,7 @@ class TestRebootAndReconnect:
        assert result["connected"] is True

    @pytest.mark.asyncio
-   async def test_reconnect_raises_503_on_failure(self):
+   async def test_reconnect_raises_423_on_failure(self):
        mock_rm = MagicMock()
        mock_rm.is_connected = False
        mock_rm.is_reconnecting = False

@@ -1070,7 +1070,7 @@
        with pytest.raises(HTTPException) as exc:
            await reconnect_radio()

-       assert exc.value.status_code == 503
+       assert exc.value.status_code == 423

    @pytest.mark.asyncio
    async def test_disconnect_pauses_connection_attempts_and_broadcasts_health(self):
+2 -2
@@ -57,12 +57,12 @@ def test_require_connected_preserves_http_semantics():
)
with pytest.raises(HTTPException, match="Radio is initializing") as exc:
runtime.require_connected()
-assert exc.value.status_code == 503
+assert exc.value.status_code == 423
runtime = RadioRuntime(_Manager(meshcore=None, is_connected=False, is_setup_in_progress=False))
with pytest.raises(HTTPException, match="Radio not connected") as exc:
runtime.require_connected()
-assert exc.value.status_code == 503
+assert exc.value.status_code == 423
def test_require_connected_returns_fresh_meshcore_after_connectivity_check():
+363
@@ -2219,6 +2219,262 @@ class TestCollectRepeaterTelemetryLpp:
assert "lpp_sensors" not in recorded_data
class TestRunTelemetryCycleRoutedOnly:
"""Verify that _run_telemetry_cycle(routed_only=True) skips flood repeaters."""
@pytest.mark.asyncio
async def test_routed_only_skips_flood_contacts(self):
from unittest.mock import AsyncMock, MagicMock, patch
from app.models import AppSettings, Contact
from app.radio_sync import _run_telemetry_cycle
flood_key = "aa" * 32
direct_key = "bb" * 32
override_key = "cc" * 32
flood_contact = Contact(
public_key=flood_key,
name="Flood",
type=2,
direct_path=None,
direct_path_len=-1,
direct_path_hash_mode=-1,
)
direct_contact = Contact(
public_key=direct_key,
name="Direct",
type=2,
direct_path="aabb",
direct_path_len=1,
direct_path_hash_mode=1,
)
override_contact = Contact(
public_key=override_key,
name="Override",
type=2,
direct_path=None,
direct_path_len=-1,
direct_path_hash_mode=-1,
route_override_path="ccdd",
route_override_len=1,
route_override_hash_mode=1,
)
settings = AppSettings(
tracked_telemetry_repeaters=[flood_key, direct_key, override_key],
)
contact_map = {
flood_key: flood_contact,
direct_key: direct_contact,
override_key: override_contact,
}
collected_keys: list[str] = []
async def fake_get_by_key(key):
return contact_map.get(key)
async def fake_collect(mc, contact):
collected_keys.append(contact.public_key)
return True
fake_radio_manager = MagicMock()
fake_radio_manager.is_connected = True
fake_radio_manager.radio_operation = MagicMock()
# Make radio_operation an async context manager that yields a MagicMock
fake_mc = MagicMock()
class FakeRadioOp:
async def __aenter__(self):
return fake_mc
async def __aexit__(self, *args):
pass
fake_radio_manager.radio_operation.return_value = FakeRadioOp()
with (
patch(
"app.radio_sync.AppSettingsRepository.get",
new_callable=AsyncMock,
return_value=settings,
),
patch(
"app.radio_sync.ContactRepository.get_by_key",
new_callable=AsyncMock,
side_effect=fake_get_by_key,
),
patch("app.radio_sync._collect_repeater_telemetry", new=fake_collect),
patch("app.radio_sync.radio_manager", fake_radio_manager),
):
await _run_telemetry_cycle(routed_only=True)
# Flood contact should be skipped; direct and override should be collected
assert flood_key not in collected_keys
assert direct_key in collected_keys
assert override_key in collected_keys
@pytest.mark.asyncio
async def test_routed_only_skips_forced_flood_override(self):
"""A contact with a forced-flood override (path_len=-1) should be
treated as flood even though effective_route_source is 'override'."""
from unittest.mock import AsyncMock, MagicMock, patch
from app.models import AppSettings, Contact
from app.radio_sync import _run_telemetry_cycle
forced_flood_key = "aa" * 32
direct_key = "bb" * 32
forced_flood_contact = Contact(
public_key=forced_flood_key,
name="ForcedFlood",
type=2,
direct_path=None,
direct_path_len=-1,
direct_path_hash_mode=-1,
route_override_path="",
route_override_len=-1,
route_override_hash_mode=-1,
)
direct_contact = Contact(
public_key=direct_key,
name="Direct",
type=2,
direct_path="aabb",
direct_path_len=1,
direct_path_hash_mode=1,
)
# Verify the forced-flood contact reports "override" source
assert forced_flood_contact.effective_route_source == "override"
settings = AppSettings(
tracked_telemetry_repeaters=[forced_flood_key, direct_key],
)
contact_map = {forced_flood_key: forced_flood_contact, direct_key: direct_contact}
collected_keys: list[str] = []
async def fake_get_by_key(key):
return contact_map.get(key)
async def fake_collect(mc, contact):
collected_keys.append(contact.public_key)
return True
fake_radio_manager = MagicMock()
fake_radio_manager.is_connected = True
fake_mc = MagicMock()
class FakeRadioOp:
async def __aenter__(self):
return fake_mc
async def __aexit__(self, *args):
pass
fake_radio_manager.radio_operation.return_value = FakeRadioOp()
with (
patch(
"app.radio_sync.AppSettingsRepository.get",
new_callable=AsyncMock,
return_value=settings,
),
patch(
"app.radio_sync.ContactRepository.get_by_key",
new_callable=AsyncMock,
side_effect=fake_get_by_key,
),
patch("app.radio_sync._collect_repeater_telemetry", new=fake_collect),
patch("app.radio_sync.radio_manager", fake_radio_manager),
):
await _run_telemetry_cycle(routed_only=True)
# Forced-flood override should be excluded; direct should be collected
assert forced_flood_key not in collected_keys
assert direct_key in collected_keys
@pytest.mark.asyncio
async def test_full_cycle_includes_all_contacts(self):
from unittest.mock import AsyncMock, MagicMock, patch
from app.models import AppSettings, Contact
from app.radio_sync import _run_telemetry_cycle
flood_key = "aa" * 32
direct_key = "bb" * 32
flood_contact = Contact(
public_key=flood_key,
name="Flood",
type=2,
direct_path=None,
direct_path_len=-1,
direct_path_hash_mode=-1,
)
direct_contact = Contact(
public_key=direct_key,
name="Direct",
type=2,
direct_path="aabb",
direct_path_len=1,
direct_path_hash_mode=1,
)
settings = AppSettings(
tracked_telemetry_repeaters=[flood_key, direct_key],
)
contact_map = {flood_key: flood_contact, direct_key: direct_contact}
collected_keys: list[str] = []
async def fake_get_by_key(key):
return contact_map.get(key)
async def fake_collect(mc, contact):
collected_keys.append(contact.public_key)
return True
fake_radio_manager = MagicMock()
fake_radio_manager.is_connected = True
fake_mc = MagicMock()
class FakeRadioOp:
async def __aenter__(self):
return fake_mc
async def __aexit__(self, *args):
pass
fake_radio_manager.radio_operation.return_value = FakeRadioOp()
with (
patch(
"app.radio_sync.AppSettingsRepository.get",
new_callable=AsyncMock,
return_value=settings,
),
patch(
"app.radio_sync.ContactRepository.get_by_key",
new_callable=AsyncMock,
side_effect=fake_get_by_key,
),
patch("app.radio_sync._collect_repeater_telemetry", new=fake_collect),
patch("app.radio_sync.radio_manager", fake_radio_manager),
):
await _run_telemetry_cycle(routed_only=False)
# Full cycle collects both
assert flood_key in collected_keys
assert direct_key in collected_keys
# ---------------------------------------------------------------------------
# _telemetry_collect_loop — UTC modulo scheduler
# ---------------------------------------------------------------------------
@@ -2518,6 +2774,113 @@ class TestTelemetryCollectSchedulerDecision:
)
class TestRoutedHourlySchedulerDecision:
"""Verify the routed_hourly feature in _maybe_run_scheduled_cycle."""
@pytest.mark.asyncio
async def test_routed_hourly_fires_on_non_modulo_hour(self):
"""At 09:00 UTC with 8h interval and routed_hourly=True, the scheduler
should call _run_telemetry_cycle(routed_only=True)."""
import datetime as real_datetime
from unittest.mock import AsyncMock, patch
from app import radio_sync
from app.models import AppSettings
settings = AppSettings(
tracked_telemetry_repeaters=["aa" * 32],
telemetry_interval_hours=8,
telemetry_routed_hourly=True,
)
calls = []
async def fake_cycle(*, routed_only=False):
calls.append({"routed_only": routed_only})
now = real_datetime.datetime(2026, 4, 16, 9, 0, 0, tzinfo=real_datetime.UTC)
with (
patch(
"app.radio_sync.AppSettingsRepository.get",
new_callable=AsyncMock,
return_value=settings,
),
patch("app.radio_sync._run_telemetry_cycle", new=fake_cycle),
):
await radio_sync._maybe_run_scheduled_cycle(now)
assert len(calls) == 1
assert calls[0]["routed_only"] is True
@pytest.mark.asyncio
async def test_routed_hourly_disabled_skips_non_modulo_hour(self):
"""At 09:00 UTC with 8h interval and routed_hourly=False, nothing runs."""
import datetime as real_datetime
from unittest.mock import AsyncMock, patch
from app import radio_sync
from app.models import AppSettings
settings = AppSettings(
tracked_telemetry_repeaters=["aa" * 32],
telemetry_interval_hours=8,
telemetry_routed_hourly=False,
)
calls = []
async def fake_cycle(*, routed_only=False):
calls.append({"routed_only": routed_only})
now = real_datetime.datetime(2026, 4, 16, 9, 0, 0, tzinfo=real_datetime.UTC)
with (
patch(
"app.radio_sync.AppSettingsRepository.get",
new_callable=AsyncMock,
return_value=settings,
),
patch("app.radio_sync._run_telemetry_cycle", new=fake_cycle),
):
await radio_sync._maybe_run_scheduled_cycle(now)
assert len(calls) == 0
@pytest.mark.asyncio
async def test_modulo_hour_runs_full_cycle_even_with_routed_hourly(self):
"""At 16:00 UTC with 8h interval, a normal full cycle runs regardless
of whether routed_hourly is enabled, since it covers all repeaters."""
import datetime as real_datetime
from unittest.mock import AsyncMock, patch
from app import radio_sync
from app.models import AppSettings
settings = AppSettings(
tracked_telemetry_repeaters=["aa" * 32],
telemetry_interval_hours=8,
telemetry_routed_hourly=True,
)
calls = []
async def fake_cycle(*, routed_only=False):
calls.append({"routed_only": routed_only})
now = real_datetime.datetime(2026, 4, 16, 16, 0, 0, tzinfo=real_datetime.UTC)
with (
patch(
"app.radio_sync.AppSettingsRepository.get",
new_callable=AsyncMock,
return_value=settings,
),
patch("app.radio_sync._run_telemetry_cycle", new=fake_cycle),
):
await radio_sync._maybe_run_scheduled_cycle(now)
assert len(calls) == 1
assert calls[0]["routed_only"] is False
# ---------------------------------------------------------------------------
# get_contacts_selected_for_radio_sync — DM-active prioritization
# ---------------------------------------------------------------------------
+15 -13
@@ -302,7 +302,7 @@ class TestRepeaterCommandRoute:
with pytest.raises(HTTPException) as exc:
await send_repeater_command(KEY_A, CommandRequest(command="ver"))
-assert exc.value.status_code == 500
+assert exc.value.status_code == 422
mc.start_auto_message_fetching.assert_awaited_once()
@pytest.mark.asyncio
@@ -502,7 +502,7 @@ class TestTraceRoute:
with pytest.raises(HTTPException) as exc:
await request_trace(KEY_A)
-assert exc.value.status_code == 500
+assert exc.value.status_code == 422
mc.commands.send_trace.assert_awaited_once_with(
path=KEY_A[:8],
tag=1234,
@@ -510,7 +510,7 @@ class TestTraceRoute:
)
@pytest.mark.asyncio
-async def test_wait_timeout_returns_504(self, test_db):
+async def test_wait_timeout_returns_408(self, test_db):
mc = _mock_mc()
await _insert_contact(KEY_A, name="Client", contact_type=1)
mc.commands.send_trace = AsyncMock(return_value=_radio_result(EventType.OK))
@@ -524,7 +524,7 @@ class TestTraceRoute:
with pytest.raises(HTTPException) as exc:
await request_trace(KEY_A)
-assert exc.value.status_code == 504
+assert exc.value.status_code == 408
mc.commands.send_trace.assert_awaited_once_with(
path=KEY_A[:8],
tag=1234,
@@ -722,6 +722,7 @@ class TestRepeaterStatus:
"flood_dups": 10,
"direct_dups": 5,
"full_evts": 0,
+"recv_errors": 42,
}
)
@@ -741,9 +742,10 @@ class TestRepeaterStatus:
assert response.uptime_seconds == 86400
assert response.sent_flood == 100
assert response.recv_direct == 700
+assert response.recv_errors == 42
@pytest.mark.asyncio
-async def test_504_on_timeout(self, test_db):
+async def test_408_on_timeout(self, test_db):
mc = _mock_mc()
await _insert_contact(KEY_A, name="Repeater", contact_type=2)
mc.commands.req_status_sync = AsyncMock(return_value=None)
@@ -754,7 +756,7 @@ class TestRepeaterStatus:
):
with pytest.raises(HTTPException) as exc:
await repeater_status(KEY_A)
-assert exc.value.status_code == 504
+assert exc.value.status_code == 408
@pytest.mark.asyncio
async def test_400_not_repeater(self, test_db):
@@ -817,7 +819,7 @@ class TestRepeaterLppTelemetry:
assert response.sensors == []
@pytest.mark.asyncio
-async def test_504_on_timeout(self, test_db):
+async def test_408_on_timeout(self, test_db):
mc = _mock_mc()
await _insert_contact(KEY_A, name="Repeater", contact_type=2)
mc.commands.req_telemetry_sync = AsyncMock(return_value=None)
@@ -828,7 +830,7 @@ class TestRepeaterLppTelemetry:
):
with pytest.raises(HTTPException) as exc:
await repeater_lpp_telemetry(KEY_A)
-assert exc.value.status_code == 504
+assert exc.value.status_code == 408
@pytest.mark.asyncio
async def test_400_not_repeater(self, test_db):
@@ -1232,7 +1234,7 @@ class TestBatchCliFetch:
with pytest.raises(HTTPException) as exc:
await _batch_cli_fetch(contact, "test_op", [("ver", "firmware_version")])
-assert exc.value.status_code == 500
+assert exc.value.status_code == 422
assert "Failed to add contact to radio" in exc.value.detail
@pytest.mark.asyncio
@@ -1305,7 +1307,7 @@ class TestRepeaterAddContactError:
with pytest.raises(HTTPException) as exc:
await repeater_status(KEY_A)
-assert exc.value.status_code == 500
+assert exc.value.status_code == 422
assert "Failed to add contact to radio" in exc.value.detail
@pytest.mark.asyncio
@@ -1323,7 +1325,7 @@ class TestRepeaterAddContactError:
with pytest.raises(HTTPException) as exc:
await repeater_lpp_telemetry(KEY_A)
-assert exc.value.status_code == 500
+assert exc.value.status_code == 422
assert "Failed to add contact to radio" in exc.value.detail
@pytest.mark.asyncio
@@ -1341,7 +1343,7 @@ class TestRepeaterAddContactError:
with pytest.raises(HTTPException) as exc:
await repeater_neighbors(KEY_A)
-assert exc.value.status_code == 500
+assert exc.value.status_code == 422
assert "Failed to add contact to radio" in exc.value.detail
@pytest.mark.asyncio
@@ -1359,5 +1361,5 @@ class TestRepeaterAddContactError:
with pytest.raises(HTTPException) as exc:
await repeater_acl(KEY_A)
-assert exc.value.status_code == 500
+assert exc.value.status_code == 422
assert "Failed to add contact to radio" in exc.value.detail
+1
@@ -31,6 +31,7 @@ SAMPLE_STATUS = {
"flood_dups": 5,
"direct_dups": 2,
"full_events": 0,
+"recv_errors": None,
}
+2
@@ -135,6 +135,7 @@ class TestRoomStatus:
"flood_dups": 2,
"direct_dups": 1,
"full_evts": 0,
+"recv_errors": 7,
}
)
@@ -147,6 +148,7 @@ class TestRoomStatus:
assert response.battery_volts == 4.025
assert response.packets_received == 80
assert response.recv_direct == 73
+assert response.recv_errors == 7
@pytest.mark.asyncio
async def test_room_acl_maps_entries(self, test_db):
+10 -10
@@ -646,7 +646,7 @@ class TestOutgoingChannelBroadcast:
request = SendChannelMessageRequest(channel_key=chan_key, text="hello")
await send_channel_message(request)
-assert exc_info.value.status_code == 500
+assert exc_info.value.status_code == 422
assert "regional override" in exc_info.value.detail.lower()
mc.commands.set_channel.assert_not_awaited()
mc.commands.send_chan_msg.assert_not_awaited()
@@ -790,7 +790,7 @@ class TestOutgoingChannelBroadcast:
SendChannelMessageRequest(channel_key=chan_key, text="this will fail")
)
-assert exc_info.value.status_code == 500
+assert exc_info.value.status_code == 422
assert radio_manager.get_cached_channel_slot(chan_key) is None
@@ -969,7 +969,7 @@ class TestResendChannelMessage:
assert sent_timestamp == now + 1
@pytest.mark.asyncio
-async def test_resend_no_radio_response_returns_504_and_creates_no_new_row(self, test_db):
+async def test_resend_no_radio_response_returns_408_and_creates_no_new_row(self, test_db):
"""When resend returns None, report unknown outcome and create no new message row."""
mc = _make_mc(name="MyNode")
chan_key = "c1" * 16
@@ -995,7 +995,7 @@ class TestResendChannelMessage:
):
await resend_channel_message(msg_id, new_timestamp=True)
-assert exc_info.value.status_code == 504
+assert exc_info.value.status_code == 408
assert exc_info.value.detail == NO_RADIO_RESPONSE_AFTER_SEND_DETAIL
messages = await MessageRepository.get_all(
@@ -1317,7 +1317,7 @@ class TestPathHashModeOverride:
SendChannelMessageRequest(channel_key=chan_key, text="hello")
)
-assert exc_info.value.status_code == 500
+assert exc_info.value.status_code == 422
assert "path hash mode" in exc_info.value.detail.lower()
mc.commands.send_chan_msg.assert_not_awaited()
@@ -1567,7 +1567,7 @@ class TestRadioExceptionMidSend:
assert len(messages) == 0
@pytest.mark.asyncio
-async def test_dm_send_no_radio_response_returns_504_without_storing_message(self, test_db):
+async def test_dm_send_no_radio_response_returns_408_without_storing_message(self, test_db):
"""When mc.commands.send_msg() returns None, report unknown outcome and store nothing."""
mc = _make_mc()
pub_key = "ac" * 32
@@ -1584,7 +1584,7 @@ class TestRadioExceptionMidSend:
SendDirectMessageRequest(destination=pub_key, text="Did this send?")
)
-assert exc_info.value.status_code == 504
+assert exc_info.value.status_code == 408
assert exc_info.value.detail == NO_RADIO_RESPONSE_AFTER_SEND_DETAIL
messages = await MessageRepository.get_all(
@@ -1593,7 +1593,7 @@ class TestRadioExceptionMidSend:
assert len(messages) == 0
@pytest.mark.asyncio
-async def test_channel_send_no_radio_response_returns_504_without_storing_message(
+async def test_channel_send_no_radio_response_returns_408_without_storing_message(
self, test_db
):
"""When mc.commands.send_chan_msg() returns None, report unknown outcome and store nothing."""
@@ -1612,7 +1612,7 @@ class TestRadioExceptionMidSend:
SendChannelMessageRequest(channel_key=chan_key, text="Did this send?")
)
-assert exc_info.value.status_code == 504
+assert exc_info.value.status_code == 408
assert exc_info.value.detail == NO_RADIO_RESPONSE_AFTER_SEND_DETAIL
messages = await MessageRepository.get_all(
@@ -1733,7 +1733,7 @@ class TestRadioExceptionMidSend:
SendChannelMessageRequest(channel_key=chan_key_b, text="Never sent")
)
-assert exc_info.value.status_code == 500
+assert exc_info.value.status_code == 422
assert radio_manager.get_cached_channel_slot(chan_key_a) is None
assert radio_manager.get_cached_channel_slot(chan_key_b) is None
mc.commands.send_chan_msg.assert_not_called()
+63
@@ -330,3 +330,66 @@ class TestTelemetryScheduleEndpoint:
assert schedule.tracked_count == 5
assert schedule.options == [6, 8, 12, 24]
assert schedule.next_run_at is not None
class TestRoutedHourlySetting:
"""Tests for the telemetry_routed_hourly setting."""
@pytest.mark.asyncio
async def test_defaults_to_false(self, test_db):
settings = await AppSettingsRepository.get()
assert settings.telemetry_routed_hourly is False
@pytest.mark.asyncio
async def test_round_trip_via_patch(self, test_db):
result = await update_settings(AppSettingsUpdate(telemetry_routed_hourly=True))
assert result.telemetry_routed_hourly is True
result = await update_settings(AppSettingsUpdate(telemetry_routed_hourly=False))
assert result.telemetry_routed_hourly is False
@pytest.mark.asyncio
async def test_schedule_includes_routed_fields_when_enabled(self, test_db):
key = "aa" * 32
await ContactRepository.upsert(
ContactUpsert(public_key=key, name="R1", type=CONTACT_TYPE_REPEATER)
)
await AppSettingsRepository.update(
tracked_telemetry_repeaters=[key],
telemetry_routed_hourly=True,
)
schedule = await get_telemetry_schedule()
assert schedule.routed_hourly is True
assert schedule.next_routed_run_at is not None
assert schedule.next_run_at is not None
@pytest.mark.asyncio
async def test_schedule_omits_routed_run_when_disabled(self, test_db):
key = "aa" * 32
await ContactRepository.upsert(
ContactUpsert(public_key=key, name="R1", type=CONTACT_TYPE_REPEATER)
)
await AppSettingsRepository.update(
tracked_telemetry_repeaters=[key],
telemetry_routed_hourly=False,
)
schedule = await get_telemetry_schedule()
assert schedule.routed_hourly is False
assert schedule.next_routed_run_at is None
@pytest.mark.asyncio
async def test_toggle_response_carries_routed_hourly(self, test_db):
key = "bb" * 32
await ContactRepository.upsert(
ContactUpsert(public_key=key, name="R2", type=CONTACT_TYPE_REPEATER)
)
await AppSettingsRepository.update(telemetry_routed_hourly=True)
result = await toggle_tracked_telemetry(TrackedTelemetryRequest(public_key=key))
assert result.schedule.routed_hourly is True
assert result.schedule.next_routed_run_at is not None
-204
@@ -1,204 +0,0 @@
"""Tests for app.tcp_proxy.encoder — binary payload builders."""
import struct
from app.tcp_proxy.encoder import (
build_contact,
build_contact_from_dict,
build_device_info,
build_self_info,
build_self_info_from_runtime,
)
from app.tcp_proxy.protocol import (
PROXY_FW_VER,
PROXY_MAX_CHANNELS,
PROXY_MAX_CONTACTS_RAW,
PUSH_NEW_ADVERT,
RESP_CONTACT,
RESP_DEVICE_INFO,
RESP_SELF_INFO,
)
EXAMPLE_KEY = "ab" * 32 # 64-char hex → 32 bytes
# ── build_contact ────────────────────────────────────────────────────
class TestBuildContact:
def test_basic_structure(self):
payload = build_contact(EXAMPLE_KEY, name="Alice")
assert payload[0] == RESP_CONTACT
# public key at bytes 1-32
assert payload[1:33] == bytes.fromhex(EXAMPLE_KEY)
# total length: 1 + 32 + 1(type) + 1(flags) + 1(path) + 64(path) + 32(name) + 4(adv) + 4(lat) + 4(lon) + 4(lastmod) = 148
assert len(payload) == 148
def test_push_variant(self):
payload = build_contact(EXAMPLE_KEY, push=True)
assert payload[0] == PUSH_NEW_ADVERT
assert len(payload) == 148
def test_favorite_flag(self):
payload = build_contact(EXAMPLE_KEY, favorite=True)
flags_byte = payload[34] # byte 1+32+1 = 34
assert flags_byte & 0x01 == 1
def test_not_favorite(self):
payload = build_contact(EXAMPLE_KEY, favorite=False)
flags_byte = payload[34]
assert flags_byte & 0x01 == 0
def test_flood_path(self):
payload = build_contact(EXAMPLE_KEY)
path_byte = payload[35] # byte 1+32+1+1 = 35
assert path_byte == 0xFF
def test_direct_path(self):
payload = build_contact(
EXAMPLE_KEY,
direct_path="aabb",
direct_path_len=2,
direct_path_hash_mode=1,
)
path_byte = payload[35]
# mode=1 → 0x40, hops=2 → 0x02 → packed = 0x42
assert path_byte == 0x42
def test_name_truncated(self):
long_name = "A" * 50
payload = build_contact(EXAMPLE_KEY, name=long_name)
# name field is 32 bytes at offset 100 (1+32+1+1+1+64)
name_bytes = payload[100:132]
assert name_bytes == b"A" * 32
def test_lat_lon_encoding(self):
payload = build_contact(EXAMPLE_KEY, lat=45.123456, lon=-122.654321)
lat_offset = 136 # 1+32+1+1+1+64+32+4 = 136
lat = struct.unpack_from("<i", payload, lat_offset)[0]
lon = struct.unpack_from("<i", payload, lat_offset + 4)[0]
assert abs(lat - 45123456) < 2
assert abs(lon - (-122654321)) < 2
def test_contact_type(self):
payload = build_contact(EXAMPLE_KEY, contact_type=2)
assert payload[33] == 2 # type byte at offset 1+32
# ── build_contact_from_dict ──────────────────────────────────────────
class TestBuildContactFromDict:
def test_minimal_dict(self):
data = {"public_key": EXAMPLE_KEY}
payload = build_contact_from_dict(data)
assert payload[0] == RESP_CONTACT
assert len(payload) == 148
def test_full_dict(self):
data = {
"public_key": EXAMPLE_KEY,
"type": 1,
"favorite": True,
"name": "Bob",
"direct_path": "ff",
"direct_path_len": 1,
"direct_path_hash_mode": 0,
"last_advert": 1700000000,
"lat": 37.7749,
"lon": -122.4194,
"first_seen": 1699000000,
}
payload = build_contact_from_dict(data)
assert payload[33] == 1 # type
assert payload[34] & 0x01 == 1 # favorite
def test_push_flag(self):
data = {"public_key": EXAMPLE_KEY}
payload = build_contact_from_dict(data, push=True)
assert payload[0] == PUSH_NEW_ADVERT
# ── build_self_info ──────────────────────────────────────────────────
class TestBuildSelfInfo:
def test_basic_structure(self):
payload = build_self_info()
assert payload[0] == RESP_SELF_INFO
assert payload[1] == 1 # adv_type = CHAT
# minimum length: 1+1+1+1+32+4+4+1+1+1+1+4+4+1+1 + len("RemoteTerm") = 68
assert len(payload) >= 58
def test_name_appended(self):
payload = build_self_info(name="TestNode")
# name starts at offset 58
name_bytes = payload[58:]
assert name_bytes == b"TestNode"
def test_public_key_encoded(self):
payload = build_self_info(public_key=EXAMPLE_KEY)
assert payload[4:36] == bytes.fromhex(EXAMPLE_KEY)
def test_radio_params(self):
payload = build_self_info(radio_freq=868.0, radio_bw=125.0, radio_sf=12, radio_cr=8)
freq = struct.unpack_from("<I", payload, 48)[0]
bw = struct.unpack_from("<I", payload, 52)[0]
assert freq == 868000
assert bw == 125000
assert payload[56] == 12 # sf
assert payload[57] == 8 # cr
def test_multi_acks_flag(self):
on = build_self_info(multi_acks=True)
off = build_self_info(multi_acks=False)
assert on[44] == 1
assert off[44] == 0
class TestBuildSelfInfoFromRuntime:
def test_from_self_info_dict(self):
info = {
"public_key": EXAMPLE_KEY,
"name": "MyRadio",
"tx_power": 18,
"max_tx_power": 22,
"adv_lat": 40.0,
"adv_lon": -74.0,
"multi_acks": 1,
"adv_loc_policy": 1,
"radio_freq": 915.0,
"radio_bw": 250.0,
"radio_sf": 10,
"radio_cr": 7,
}
payload = build_self_info_from_runtime(info)
assert payload[0] == RESP_SELF_INFO
assert payload[58:] == b"MyRadio"
def test_missing_fields_use_defaults(self):
payload = build_self_info_from_runtime({})
assert payload[0] == RESP_SELF_INFO
assert payload[58:] == b"RemoteTerm"
# ── build_device_info ────────────────────────────────────────────────
class TestBuildDeviceInfo:
def test_basic_structure(self):
payload = build_device_info()
assert payload[0] == RESP_DEVICE_INFO
assert payload[1] == PROXY_FW_VER
assert payload[2] == PROXY_MAX_CONTACTS_RAW
assert payload[3] == PROXY_MAX_CHANNELS
def test_path_hash_mode(self):
payload = build_device_info(path_hash_mode=2)
# path_hash_mode is at offset 81 (1+1+1+1+4+12+40+20+1 = 81)
assert payload[81] == 2
def test_expected_length(self):
# fw_ver=11 → 1+1+1+1+4+12+40+20+1+1 = 82 bytes
payload = build_device_info()
assert len(payload) == 82
@@ -1,365 +0,0 @@
"""Integration tests for the TCP proxy — real asyncio TCP server + client."""
import asyncio
import pytest
from app.tcp_proxy.protocol import (
CMD_APP_START,
CMD_DEVICE_QUERY,
CMD_GET_CHANNEL,
CMD_GET_CONTACTS,
CMD_GET_DEVICE_TIME,
CMD_HAS_CONNECTION,
CMD_SET_CHANNEL,
CMD_SYNC_NEXT_MESSAGE,
FRAME_RX,
FRAME_TX,
PROXY_FW_VER,
PUSH_MSG_WAITING,
RESP_CONTACT_END,
RESP_CONTACT_START,
RESP_CURRENT_TIME,
RESP_DEVICE_INFO,
RESP_ERR,
RESP_NO_MORE_MSGS,
RESP_OK,
RESP_SELF_INFO,
)
from app.tcp_proxy.server import dispatch_event, register, unregister
from app.tcp_proxy.session import ProxySession
# ── Helpers ──────────────────────────────────────────────────────────
EXAMPLE_KEY = "ab" * 32
def _frame_cmd(payload: bytes) -> bytes:
"""Wrap a command payload in a 0x3C frame."""
return bytes([FRAME_TX]) + len(payload).to_bytes(2, "little") + payload
async def _read_response(reader: asyncio.StreamReader) -> bytes:
"""Read one 0x3E-framed response and return the payload."""
marker = await reader.readexactly(1)
assert marker[0] == FRAME_RX
size_bytes = await reader.readexactly(2)
size = int.from_bytes(size_bytes, "little")
payload = await reader.readexactly(size)
return payload
class _ProxyTestHarness:
"""Manages a real TCP proxy server for testing."""
def __init__(self):
self._server: asyncio.Server | None = None
self.port: int = 0
self.sessions: list[ProxySession] = []
async def start(self):
self._server = await asyncio.start_server(self._handle, "127.0.0.1", 0)
self.port = self._server.sockets[0].getsockname()[1]
async def stop(self):
for s in self.sessions:
try:
s.writer.close()
except Exception:
pass
self.sessions.clear()
if self._server:
self._server.close()
await self._server.wait_closed()
async def _handle(self, reader, writer):
session = ProxySession(reader, writer)
self.sessions.append(session)
register(session)
try:
await session.run()
finally:
unregister(session)
async def connect(self) -> tuple[asyncio.StreamReader, asyncio.StreamWriter]:
reader, writer = await asyncio.open_connection("127.0.0.1", self.port)
return reader, writer
@pytest.fixture
async def harness():
h = _ProxyTestHarness()
await h.start()
yield h
await h.stop()
def _mock_repos_and_runtime():
"""Return a context manager that mocks repositories and radio_runtime."""
import time
from unittest.mock import AsyncMock, MagicMock, patch
contacts = [
MagicMock(
model_dump=MagicMock(
return_value={
"public_key": EXAMPLE_KEY,
"name": "Alice",
"type": 1,
"favorite": True,
"direct_path": None,
"direct_path_len": -1,
"direct_path_hash_mode": -1,
"last_advert": 0,
"lat": 0.0,
"lon": 0.0,
"first_seen": int(time.time()),
}
)
)
]
channels = [
MagicMock(
model_dump=MagicMock(return_value={"key": "cc" * 16, "name": "test", "favorite": True})
)
]
settings_obj = MagicMock(last_message_times={})
rt = MagicMock()
rt.is_connected = True
mc = MagicMock()
mc.self_info = {
"public_key": EXAMPLE_KEY,
"name": "TestNode",
"tx_power": 20,
"max_tx_power": 22,
"adv_lat": 0.0,
"adv_lon": 0.0,
"radio_freq": 915.0,
"radio_bw": 250.0,
"radio_sf": 10,
"radio_cr": 7,
}
rt.meshcore = mc
class _Ctx:
def __enter__(self_):
self_._patches = [
patch(
"app.repository.ContactRepository.get_favorites",
new_callable=AsyncMock,
return_value=contacts,
),
patch(
"app.repository.ChannelRepository.get_all",
new_callable=AsyncMock,
return_value=channels,
),
patch(
"app.repository.AppSettingsRepository.get",
new_callable=AsyncMock,
return_value=settings_obj,
),
patch(
"app.services.radio_runtime.radio_runtime",
rt,
),
]
for p in self_._patches:
p.__enter__()
return self_
def __exit__(self_, *args):
for p in reversed(self_._patches):
p.__exit__(*args)
return _Ctx()
# ── Tests ────────────────────────────────────────────────────────────
class TestTcpProxyIntegration:
@pytest.mark.asyncio
async def test_app_start_returns_self_info(self, harness):
reader, writer = await harness.connect()
try:
with _mock_repos_and_runtime():
writer.write(_frame_cmd(bytes([CMD_APP_START]) + b"\x03" + b" " * 6 + b"test"))
await writer.drain()
resp = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp[0] == RESP_SELF_INFO
finally:
writer.close()
@pytest.mark.asyncio
async def test_device_query_returns_device_info(self, harness):
reader, writer = await harness.connect()
try:
with _mock_repos_and_runtime():
# First do APP_START to initialize session state
writer.write(_frame_cmd(bytes([CMD_APP_START]) + b"\x03" + b" " * 6 + b"test"))
await writer.drain()
await asyncio.wait_for(_read_response(reader), timeout=3)
writer.write(_frame_cmd(bytes([CMD_DEVICE_QUERY, 0x03])))
await writer.drain()
resp = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp[0] == RESP_DEVICE_INFO
assert resp[1] == PROXY_FW_VER
finally:
writer.close()
@pytest.mark.asyncio
async def test_get_contacts_flow(self, harness):
reader, writer = await harness.connect()
try:
with _mock_repos_and_runtime():
writer.write(_frame_cmd(bytes([CMD_GET_CONTACTS])))
await writer.drain()
# Should get CONTACT_START
resp1 = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp1[0] == RESP_CONTACT_START
count = int.from_bytes(resp1[1:5], "little")
assert count == 1
# One contact
resp2 = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp2[0] == 0x03 # RESP_CONTACT
# CONTACT_END
resp3 = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp3[0] == RESP_CONTACT_END
finally:
writer.close()
@pytest.mark.asyncio
async def test_get_time(self, harness):
reader, writer = await harness.connect()
try:
writer.write(_frame_cmd(bytes([CMD_GET_DEVICE_TIME])))
await writer.drain()
resp = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp[0] == RESP_CURRENT_TIME
assert len(resp) == 5
finally:
writer.close()
@pytest.mark.asyncio
async def test_has_connection(self, harness):
reader, writer = await harness.connect()
try:
with _mock_repos_and_runtime():
writer.write(_frame_cmd(bytes([CMD_HAS_CONNECTION])))
await writer.drain()
resp = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp[0] == RESP_OK
val = int.from_bytes(resp[1:5], "little")
assert val == 1
finally:
writer.close()
@pytest.mark.asyncio
async def test_empty_channel_returns_error(self, harness):
reader, writer = await harness.connect()
try:
writer.write(_frame_cmd(bytes([CMD_GET_CHANNEL, 5])))
await writer.drain()
resp = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp[0] == RESP_ERR
finally:
writer.close()
@pytest.mark.asyncio
async def test_set_then_get_channel(self, harness):
reader, writer = await harness.connect()
try:
# SET_CHANNEL: cmd(1) + idx(1) + name(32) + secret(16) = 50
name = b"mychan" + b"\x00" * 26 # 32 bytes
secret = b"\xdd" * 16
cmd = bytes([CMD_SET_CHANNEL, 2]) + name + secret
writer.write(_frame_cmd(cmd))
await writer.drain()
resp = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp[0] == RESP_OK
# GET_CHANNEL for slot 2
writer.write(_frame_cmd(bytes([CMD_GET_CHANNEL, 2])))
await writer.drain()
resp2 = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp2[0] == 0x12 # RESP_CHANNEL_INFO
assert resp2[1] == 2 # idx
finally:
writer.close()
@pytest.mark.asyncio
async def test_sync_next_empty(self, harness):
reader, writer = await harness.connect()
try:
writer.write(_frame_cmd(bytes([CMD_SYNC_NEXT_MESSAGE])))
await writer.drain()
resp = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp[0] == RESP_NO_MORE_MSGS
finally:
writer.close()
@pytest.mark.asyncio
async def test_event_dispatch_queues_message(self, harness):
reader, writer = await harness.connect()
try:
with _mock_repos_and_runtime():
# APP_START to init session
writer.write(_frame_cmd(bytes([CMD_APP_START]) + b"\x03" + b" " * 6 + b"test"))
await writer.drain()
await asyncio.wait_for(_read_response(reader), timeout=3)
# Set a channel so CHAN messages can be routed
name = b"\x00" * 32
secret = bytes.fromhex("cc" * 16)
writer.write(_frame_cmd(bytes([CMD_SET_CHANNEL, 0]) + name + secret))
await writer.drain()
await asyncio.wait_for(_read_response(reader), timeout=3)
# Simulate a broadcast event
await dispatch_event(
"message",
{
"type": "CHAN",
"outgoing": False,
"conversation_key": "cc" * 16,
"text": "hello from event",
"sender_timestamp": 1700000000,
},
)
# Should receive PUSH_MSG_WAITING
resp = await asyncio.wait_for(_read_response(reader), timeout=3)
assert resp[0] == PUSH_MSG_WAITING
# Pull the message
writer.write(_frame_cmd(bytes([CMD_SYNC_NEXT_MESSAGE])))
await writer.drain()
msg = await asyncio.wait_for(_read_response(reader), timeout=3)
assert msg[0] == 0x11 # RESP_CHANNEL_MSG_RECV_V3
finally:
writer.close()
@pytest.mark.asyncio
async def test_multiple_clients_isolated(self, harness):
r1, w1 = await harness.connect()
r2, w2 = await harness.connect()
try:
# Both can get time independently
w1.write(_frame_cmd(bytes([CMD_GET_DEVICE_TIME])))
w2.write(_frame_cmd(bytes([CMD_GET_DEVICE_TIME])))
await w1.drain()
await w2.drain()
resp1 = await asyncio.wait_for(_read_response(r1), timeout=3)
resp2 = await asyncio.wait_for(_read_response(r2), timeout=3)
assert resp1[0] == RESP_CURRENT_TIME
assert resp2[0] == RESP_CURRENT_TIME
finally:
w1.close()
w2.close()
@@ -1,180 +0,0 @@
"""Tests for app.tcp_proxy.protocol — frame parsing, helpers, constants."""
from app.tcp_proxy.protocol import (
ERR_NOT_FOUND,
ERR_UNSUPPORTED,
FRAME_RX,
FRAME_TX,
RESP_ERR,
RESP_OK,
FrameParser,
build_error,
build_ok,
encode_path_byte,
frame_response,
pad,
)
# ── frame_response ───────────────────────────────────────────────────
class TestFrameResponse:
def test_empty_payload(self):
result = frame_response(b"")
assert result == bytes([FRAME_RX, 0x00, 0x00])
def test_short_payload(self):
result = frame_response(b"\x05\x01")
assert result[0] == FRAME_RX
size = int.from_bytes(result[1:3], "little")
assert size == 2
assert result[3:] == b"\x05\x01"
def test_larger_payload(self):
payload = b"\xaa" * 200
result = frame_response(payload)
assert result[0] == FRAME_RX
size = int.from_bytes(result[1:3], "little")
assert size == 200
assert result[3:] == payload
# ── build_ok / build_error ───────────────────────────────────────────
class TestBuildOk:
def test_no_value(self):
assert build_ok() == bytes([RESP_OK])
def test_with_value(self):
result = build_ok(42)
assert result[0] == RESP_OK
assert int.from_bytes(result[1:5], "little") == 42
def test_zero_value(self):
result = build_ok(0)
assert len(result) == 5
assert int.from_bytes(result[1:5], "little") == 0
class TestBuildError:
def test_default_code(self):
assert build_error() == bytes([RESP_ERR, ERR_UNSUPPORTED])
def test_not_found(self):
assert build_error(ERR_NOT_FOUND) == bytes([RESP_ERR, ERR_NOT_FOUND])
# ── pad ──────────────────────────────────────────────────────────────
class TestPad:
def test_shorter_data(self):
result = pad(b"AB", 5)
assert result == b"AB\x00\x00\x00"
assert len(result) == 5
def test_exact_data(self):
assert pad(b"ABCDE", 5) == b"ABCDE"
def test_longer_data(self):
assert pad(b"ABCDEFGH", 5) == b"ABCDE"
def test_empty_data(self):
assert pad(b"", 3) == b"\x00\x00\x00"
# ── encode_path_byte ────────────────────────────────────────────────
class TestEncodePathByte:
def test_flood_negative_hop(self):
assert encode_path_byte(-1, 0) == 0xFF
def test_flood_negative_mode(self):
assert encode_path_byte(0, -1) == 0xFF
def test_flood_both_negative(self):
assert encode_path_byte(-1, -1) == 0xFF
def test_zero_hops_mode_zero(self):
assert encode_path_byte(0, 0) == 0x00
def test_three_hops_mode_one(self):
# mode=1 → bits 6-7 = 01 → 0x40; hops=3 → 0x03
assert encode_path_byte(3, 1) == 0x43
def test_max_hops_mode_two(self):
# mode=2 → bits 6-7 = 10 → 0x80; hops=63 → 0x3F
assert encode_path_byte(63, 2) == 0xBF
# ── FrameParser ──────────────────────────────────────────────────────
class TestFrameParser:
def test_single_complete_frame(self):
parser = FrameParser()
# 0x3C + 2-byte LE size (3) + 3 bytes payload
data = bytes([FRAME_TX, 0x03, 0x00, 0xAA, 0xBB, 0xCC])
payloads = parser.feed(data)
assert len(payloads) == 1
assert payloads[0] == b"\xaa\xbb\xcc"
def test_two_frames_in_one_chunk(self):
parser = FrameParser()
frame1 = bytes([FRAME_TX, 0x02, 0x00, 0x01, 0x02])
frame2 = bytes([FRAME_TX, 0x01, 0x00, 0xFF])
payloads = parser.feed(frame1 + frame2)
assert len(payloads) == 2
assert payloads[0] == b"\x01\x02"
assert payloads[1] == b"\xff"
def test_split_across_chunks(self):
parser = FrameParser()
full = bytes([FRAME_TX, 0x04, 0x00, 0x01, 0x02, 0x03, 0x04])
# Split in the middle of the payload
p1 = parser.feed(full[:5])
assert p1 == []
p2 = parser.feed(full[5:])
assert len(p2) == 1
assert p2[0] == b"\x01\x02\x03\x04"
def test_split_in_header(self):
parser = FrameParser()
full = bytes([FRAME_TX, 0x01, 0x00, 0xAA])
p1 = parser.feed(full[:2]) # marker + first size byte
assert p1 == []
p2 = parser.feed(full[2:]) # second size byte + payload
assert len(p2) == 1
assert p2[0] == b"\xaa"
def test_bad_marker_skipped(self):
parser = FrameParser()
junk = b"\x00\x00\x00"
good = bytes([FRAME_TX, 0x01, 0x00, 0xBB])
payloads = parser.feed(junk + good)
assert len(payloads) == 1
assert payloads[0] == b"\xbb"
def test_oversized_frame_skipped(self):
parser = FrameParser()
# Size = 400 (> MAX_FRAME_SIZE=300)
bad = bytes([FRAME_TX, 0x90, 0x01])
good = bytes([FRAME_TX, 0x01, 0x00, 0xCC])
payloads = parser.feed(bad + good)
assert len(payloads) == 1
assert payloads[0] == b"\xcc"
def test_empty_feed(self):
parser = FrameParser()
assert parser.feed(b"") == []
def test_byte_at_a_time(self):
parser = FrameParser()
full = bytes([FRAME_TX, 0x02, 0x00, 0xDE, 0xAD])
payloads = []
for b in full:
payloads.extend(parser.feed(bytes([b])))
assert len(payloads) == 1
assert payloads[0] == b"\xde\xad"
@@ -1,695 +0,0 @@
"""Tests for app.tcp_proxy.session — ProxySession command handlers."""
import asyncio
import time
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from app.tcp_proxy.protocol import (
CMD_APP_START,
CMD_DEVICE_QUERY,
CMD_GET_BATT_AND_STORAGE,
CMD_GET_CHANNEL,
CMD_GET_CONTACT_BY_KEY,
CMD_GET_CONTACTS,
CMD_GET_DEVICE_TIME,
CMD_HAS_CONNECTION,
CMD_RESET_PATH,
CMD_SEND_CHANNEL_TXT_MSG,
CMD_SEND_TXT_MSG,
CMD_SET_CHANNEL,
CMD_SYNC_NEXT_MESSAGE,
ERR_NOT_FOUND,
PROXY_FW_VER,
PUSH_MSG_WAITING,
RESP_BATTERY,
RESP_CHANNEL_MSG_RECV_V3,
RESP_CONTACT_END,
RESP_CONTACT_MSG_RECV_V3,
RESP_CONTACT_START,
RESP_CURRENT_TIME,
RESP_DEVICE_INFO,
RESP_ERR,
RESP_MSG_SENT,
RESP_NO_MORE_MSGS,
RESP_OK,
RESP_SELF_INFO,
encode_path_byte,
)
from app.tcp_proxy.session import ProxySession
EXAMPLE_KEY = "ab" * 32
# ── Helpers ──────────────────────────────────────────────────────────
def _make_session() -> tuple[ProxySession, list[bytes]]:
"""Create a ProxySession with a capturing writer."""
reader = AsyncMock(spec=asyncio.StreamReader)
writer = MagicMock(spec=asyncio.StreamWriter)
writer.get_extra_info.return_value = ("127.0.0.1", 12345)
sent: list[bytes] = []
def capture_write(data: bytes):
sent.append(data)
writer.write = capture_write
writer.drain = AsyncMock()
session = ProxySession(reader, writer)
return session, sent
def _extract_payloads(sent: list[bytes]) -> list[bytes]:
"""Extract payloads from framed response bytes."""
payloads = []
for frame in sent:
assert frame[0] == 0x3E
size = int.from_bytes(frame[1:3], "little")
payloads.append(frame[3 : 3 + size])
return payloads
def _make_contact(public_key: str = EXAMPLE_KEY, name: str = "Alice", **kw):
return MagicMock(
model_dump=MagicMock(
return_value={
"public_key": public_key,
"name": name,
"type": 1,
"favorite": True,
"direct_path": None,
"direct_path_len": -1,
"direct_path_hash_mode": -1,
"last_advert": 0,
"lat": 0.0,
"lon": 0.0,
"first_seen": int(time.time()),
**kw,
}
)
)
def _make_channel(key: str = "cc" * 16, name: str = "test", favorite: bool = True):
return MagicMock(
model_dump=MagicMock(return_value={"key": key, "name": name, "favorite": favorite})
)
def _make_settings(last_message_times=None):
return MagicMock(last_message_times=last_message_times or {})
def _mock_radio_runtime(connected: bool = True, self_info: dict | None = None):
rt = MagicMock()
rt.is_connected = connected
mc = MagicMock()
mc.self_info = self_info or {
"public_key": EXAMPLE_KEY,
"name": "TestNode",
"tx_power": 20,
"max_tx_power": 22,
"adv_lat": 0.0,
"adv_lon": 0.0,
"radio_freq": 915.0,
"radio_bw": 250.0,
"radio_sf": 10,
"radio_cr": 7,
}
rt.meshcore = mc
return rt
# ── Tests ────────────────────────────────────────────────────────────
class TestAppStart:
@pytest.mark.asyncio
async def test_sends_self_info(self):
session, sent = _make_session()
contacts = [_make_contact()]
channels = [_make_channel()]
settings = _make_settings()
rt = _mock_radio_runtime()
with (
patch("app.repository.ContactRepository") as cr,
patch("app.repository.ChannelRepository") as chr_,
patch("app.repository.AppSettingsRepository") as sr,
patch("app.services.radio_runtime.radio_runtime", rt),
):
cr.get_favorites = AsyncMock(return_value=contacts)
chr_.get_all = AsyncMock(return_value=channels)
sr.get = AsyncMock(return_value=settings)
await session._cmd_app_start(bytes([CMD_APP_START]))
payloads = _extract_payloads(sent)
assert len(payloads) == 1
assert payloads[0][0] == RESP_SELF_INFO
@pytest.mark.asyncio
async def test_populates_contacts_and_channels(self):
session, sent = _make_session()
contacts = [_make_contact(), _make_contact(public_key="cd" * 32, name="Bob")]
channels = [_make_channel(), _make_channel(key="dd" * 16, name="ch2")]
settings = _make_settings()
rt = _mock_radio_runtime()
with (
patch("app.repository.ContactRepository") as cr,
patch("app.repository.ChannelRepository") as chr_,
patch("app.repository.AppSettingsRepository") as sr,
patch("app.services.radio_runtime.radio_runtime", rt),
):
cr.get_favorites = AsyncMock(return_value=contacts)
chr_.get_all = AsyncMock(return_value=channels)
sr.get = AsyncMock(return_value=settings)
await session._cmd_app_start(bytes([CMD_APP_START]))
assert len(session.contacts) == 2
# Only favorite channels are slotted
assert len(session.channel_slots) == 2
class TestDeviceQuery:
@pytest.mark.asyncio
async def test_sends_device_info(self):
session, sent = _make_session()
rt = _mock_radio_runtime()
with patch("app.services.radio_runtime.radio_runtime", rt):
await session._cmd_device_query(bytes([CMD_DEVICE_QUERY, 0x03]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_DEVICE_INFO
assert payloads[0][1] == PROXY_FW_VER
class TestGetContacts:
@pytest.mark.asyncio
async def test_sends_start_contacts_end(self):
session, sent = _make_session()
contacts = [_make_contact()]
with patch("app.repository.ContactRepository") as cr:
cr.get_favorites = AsyncMock(return_value=contacts)
await session._cmd_get_contacts(bytes([CMD_GET_CONTACTS]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_CONTACT_START
count = int.from_bytes(payloads[0][1:5], "little")
assert count == 1
# Middle payload(s) are contacts
assert payloads[-1][0] == RESP_CONTACT_END
class TestGetContactByKey:
@pytest.mark.asyncio
async def test_found(self):
session, sent = _make_session()
session.contacts = [
{
"public_key": EXAMPLE_KEY,
"type": 1,
"name": "Alice",
"favorite": True,
"direct_path": None,
"direct_path_len": -1,
"direct_path_hash_mode": -1,
"last_advert": 0,
"lat": 0.0,
"lon": 0.0,
"first_seen": 0,
}
]
cmd = bytes([CMD_GET_CONTACT_BY_KEY]) + bytes.fromhex(EXAMPLE_KEY)
await session._cmd_get_contact_by_key(cmd)
payloads = _extract_payloads(sent)
assert len(payloads) == 1
assert payloads[0][0] == 0x03 # RESP_CONTACT
@pytest.mark.asyncio
async def test_not_found(self):
session, sent = _make_session()
session.contacts = []
cmd = bytes([CMD_GET_CONTACT_BY_KEY]) + bytes.fromhex(EXAMPLE_KEY)
await session._cmd_get_contact_by_key(cmd)
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_ERR
assert payloads[0][1] == ERR_NOT_FOUND
class TestGetChannel:
@pytest.mark.asyncio
async def test_found(self):
session, sent = _make_session()
key = "cc" * 16
session.channel_slots = {0: key}
session.channels = [{"key": key, "name": "test"}]
await session._cmd_get_channel(bytes([CMD_GET_CHANNEL, 0]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == 0x12 # RESP_CHANNEL_INFO
@pytest.mark.asyncio
async def test_empty_slot_returns_error(self):
session, sent = _make_session()
session.channel_slots = {}
await session._cmd_get_channel(bytes([CMD_GET_CHANNEL, 5]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_ERR
class TestSetChannel:
@pytest.mark.asyncio
async def test_updates_slot_mapping(self):
session, sent = _make_session()
name = b"test" + b"\x00" * 28 # 32 bytes
secret = b"\xaa" * 16
cmd = bytes([CMD_SET_CHANNEL, 3]) + name + secret
await session._cmd_set_channel(cmd)
assert session.channel_slots[3] == "aa" * 16
assert session.key_to_idx["aa" * 16] == 3
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_OK
@pytest.mark.asyncio
async def test_cleans_stale_mapping(self):
session, sent = _make_session()
# Pre-load slot 0 with key_a
session.channel_slots[0] = "aa" * 16
session.key_to_idx["aa" * 16] = 0
# Overwrite slot 0 with key_b
name = b"\x00" * 32
secret_b = b"\xbb" * 16
cmd = bytes([CMD_SET_CHANNEL, 0]) + name + secret_b
await session._cmd_set_channel(cmd)
assert session.channel_slots[0] == "bb" * 16
assert "aa" * 16 not in session.key_to_idx
class TestSendDm:
@pytest.mark.asyncio
async def test_sends_msg_sent_and_ack(self):
session, sent = _make_session()
session.contacts = [{"public_key": EXAMPLE_KEY}]
# CMD_SEND_TXT_MSG: cmd(1) + txt_type(1) + attempt(1) + ts(4) + prefix(6) + text
prefix = bytes.fromhex(EXAMPLE_KEY[:12])
cmd = (
bytes([CMD_SEND_TXT_MSG, 0, 0])
+ int(time.time()).to_bytes(4, "little")
+ prefix
+ b"Hello"
)
with patch.object(session, "_do_send_dm", new_callable=AsyncMock):
await session._cmd_send_dm(cmd)
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_MSG_SENT
assert payloads[1][0] == 0x82 # PUSH_ACK
# ACK code should match
ack_from_sent = payloads[0][2:6]
ack_from_push = payloads[1][1:5]
assert ack_from_sent == ack_from_push
@pytest.mark.asyncio
async def test_long_text_with_prefix(self):
"""6-byte prefix + long text (>26 chars) must resolve correctly."""
session, sent = _make_session()
session.contacts = [{"public_key": EXAMPLE_KEY}]
prefix = bytes.fromhex(EXAMPLE_KEY[:12])
long_text = b"A" * 50 # well over 26 chars
cmd = (
bytes([CMD_SEND_TXT_MSG, 0, 0])
+ int(time.time()).to_bytes(4, "little")
+ prefix
+ long_text
)
with patch.object(session, "_do_send_dm", new_callable=AsyncMock) as mock_send:
await session._cmd_send_dm(cmd)
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_MSG_SENT # not ERR
mock_send.assert_called_once()
call_key, call_text = mock_send.call_args[0]
assert call_key == EXAMPLE_KEY
assert call_text == "A" * 50
class TestSendChannel:
@pytest.mark.asyncio
async def test_sends_ok(self):
session, sent = _make_session()
key = "cc" * 16
session.channel_slots = {0: key}
session.channels = [{"key": key, "name": "test"}]
cmd = (
bytes([CMD_SEND_CHANNEL_TXT_MSG, 0, 0])
+ int(time.time()).to_bytes(4, "little")
+ b"Hello"
)
fake_channel = MagicMock()
fake_channel.name = "test"  # MagicMock(name=...) sets the mock's repr name, not the .name attribute
with (
patch(
"app.repository.ChannelRepository.get_by_key",
new_callable=AsyncMock,
return_value=fake_channel,
),
patch.object(session, "_do_send_channel", new_callable=AsyncMock),
):
await session._cmd_send_channel(cmd)
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_OK
class TestSimpleCommands:
@pytest.mark.asyncio
async def test_get_time(self):
session, sent = _make_session()
await session._cmd_get_time(bytes([CMD_GET_DEVICE_TIME]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_CURRENT_TIME
@pytest.mark.asyncio
async def test_battery(self):
session, sent = _make_session()
await session._cmd_battery(bytes([CMD_GET_BATT_AND_STORAGE]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_BATTERY
@pytest.mark.asyncio
async def test_has_connection(self):
session, sent = _make_session()
rt = _mock_radio_runtime(connected=True)
with patch("app.services.radio_runtime.radio_runtime", rt):
await session._cmd_has_connection(bytes([CMD_HAS_CONNECTION]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_OK
val = int.from_bytes(payloads[0][1:5], "little")
assert val == 1
@pytest.mark.asyncio
async def test_ok_stub(self):
session, sent = _make_session()
await session._cmd_ok_stub(bytes([CMD_RESET_PATH]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_OK
class TestSyncNext:
@pytest.mark.asyncio
async def test_empty_queue(self):
session, sent = _make_session()
await session._cmd_sync_next(bytes([CMD_SYNC_NEXT_MESSAGE]))
payloads = _extract_payloads(sent)
assert payloads[0][0] == RESP_NO_MORE_MSGS
@pytest.mark.asyncio
async def test_dequeues_message(self):
session, sent = _make_session()
fake_msg = bytes([0x10, 0x00, 0x00, 0x00]) + b"\xaa" * 10
session._msg_queue.append(fake_msg)
await session._cmd_sync_next(bytes([CMD_SYNC_NEXT_MESSAGE]))
payloads = _extract_payloads(sent)
assert payloads[0] == fake_msg
assert len(session._msg_queue) == 0
class TestExtractPathMeta:
"""Tests for _extract_path_meta static helper."""
def test_no_paths(self):
snr, path_byte = ProxySession._extract_path_meta({"paths": None})
assert snr == 0
assert path_byte == 0 # 0 hops, mode 0
def test_empty_paths_list(self):
snr, path_byte = ProxySession._extract_path_meta({"paths": []})
assert snr == 0
assert path_byte == 0
def test_one_byte_hops(self):
"""2 hops at 1-byte hash mode → path_byte = (0 << 6) | 2 = 0x02."""
snr, path_byte = ProxySession._extract_path_meta(
{
"paths": [{"path": "aabb", "path_len": 2, "snr": None, "rssi": None}],
}
)
assert path_byte == encode_path_byte(2, 0)
assert path_byte == 0x02
def test_two_byte_hops(self):
"""3 hops at 2-byte hash mode → path_byte = (1 << 6) | 3 = 0x43."""
snr, path_byte = ProxySession._extract_path_meta(
{
"paths": [{"path": "aabbccddee11", "path_len": 3, "snr": None, "rssi": None}],
}
)
assert path_byte == encode_path_byte(3, 1)
assert path_byte == 0x43
def test_three_byte_hops(self):
"""1 hop at 3-byte hash mode → path_byte = (2 << 6) | 1 = 0x81."""
snr, path_byte = ProxySession._extract_path_meta(
{
"paths": [{"path": "aabbcc", "path_len": 1, "snr": None, "rssi": None}],
}
)
assert path_byte == encode_path_byte(1, 2)
assert path_byte == 0x81
def test_snr_encoded(self):
"""SNR is encoded as int8(snr * 4)."""
snr, _ = ProxySession._extract_path_meta(
{
"paths": [{"path": "aa", "path_len": 1, "snr": -5.25, "rssi": -100}],
}
)
assert snr == (-21) & 0xFF # -5.25 * 4 = -21 → unsigned byte
def test_zero_hops_empty_path(self):
"""0 hops, empty path → path_byte 0."""
snr, path_byte = ProxySession._extract_path_meta(
{
"paths": [{"path": "", "path_len": 0, "snr": None, "rssi": None}],
}
)
assert path_byte == 0
def test_legacy_no_path_len(self):
"""path_len=None falls back to inferring from hex length (1-byte hops)."""
snr, path_byte = ProxySession._extract_path_meta(
{
"paths": [{"path": "aabb", "path_len": None, "snr": None, "rssi": None}],
}
)
# Inferred: 2 hops, path is 2 bytes → 1-byte hash → mode 0
assert path_byte == encode_path_byte(2, 0)
class TestEventHandlers:
@pytest.mark.asyncio
async def test_priv_message_queued(self):
session, sent = _make_session()
data = {
"type": "PRIV",
"outgoing": False,
"conversation_key": EXAMPLE_KEY,
"text": "hello",
"sender_timestamp": 1700000000,
}
await session.on_event_message(data)
assert len(session._msg_queue) == 1
payloads = _extract_payloads(sent)
assert payloads[0][0] == PUSH_MSG_WAITING
@pytest.mark.asyncio
async def test_priv_message_path_encoding(self):
"""DM frame encodes path_len byte from message path data."""
session, sent = _make_session()
data = {
"type": "PRIV",
"outgoing": False,
"conversation_key": EXAMPLE_KEY,
"text": "hi",
"sender_timestamp": 1700000000,
"paths": [{"path": "aabb", "path_len": 2, "snr": 3.0, "rssi": -80}],
}
await session.on_event_message(data)
frame = session._msg_queue[0]
assert frame[0] == RESP_CONTACT_MSG_RECV_V3
snr_byte = frame[1]
assert snr_byte == 12 # 3.0 * 4
# path_len byte is at offset 10 (after: type, snr, 2 reserved, 6 prefix)
path_byte = frame[10]
assert path_byte == encode_path_byte(2, 0) # 2 hops, 1-byte hash
@pytest.mark.asyncio
async def test_chan_message_queued(self):
session, sent = _make_session()
key = "cc" * 16
session.key_to_idx = {key: 0}
data = {
"type": "CHAN",
"outgoing": False,
"conversation_key": key.upper(), # test case normalization
"text": "hello",
"sender_timestamp": 1700000000,
}
await session.on_event_message(data)
assert len(session._msg_queue) == 1
@pytest.mark.asyncio
async def test_chan_message_path_encoding(self):
"""Channel frame encodes path_len byte correctly instead of 0xFF."""
session, sent = _make_session()
key = "cc" * 16
session.key_to_idx = {key: 0}
data = {
"type": "CHAN",
"outgoing": False,
"conversation_key": key,
"text": "hello",
"sender_timestamp": 1700000000,
"paths": [{"path": "aabbccdd", "path_len": 2, "snr": -2.5, "rssi": -90}],
}
await session.on_event_message(data)
frame = session._msg_queue[0]
assert frame[0] == RESP_CHANNEL_MSG_RECV_V3
snr_byte = frame[1]
assert snr_byte == (-10) & 0xFF # -2.5 * 4
# path_len byte is at offset 5 (after: type, snr, 2 reserved, channel_idx)
path_byte = frame[5]
assert path_byte == encode_path_byte(2, 1) # 2 hops, 2-byte hash
assert path_byte != 0xFF # Must NOT be the old wrong value
@pytest.mark.asyncio
async def test_chan_message_no_paths_defaults_zero(self):
"""Channel message with no path data uses 0 (not 0xFF)."""
session, sent = _make_session()
key = "cc" * 16
session.key_to_idx = {key: 0}
data = {
"type": "CHAN",
"outgoing": False,
"conversation_key": key,
"text": "hello",
"sender_timestamp": 1700000000,
}
await session.on_event_message(data)
frame = session._msg_queue[0]
path_byte = frame[5]
assert path_byte == 0 # 0 hops, not 0xFF
@pytest.mark.asyncio
async def test_outgoing_message_ignored(self):
session, sent = _make_session()
data = {"type": "PRIV", "outgoing": True, "conversation_key": EXAMPLE_KEY}
await session.on_event_message(data)
assert len(session._msg_queue) == 0
assert len(sent) == 0
@pytest.mark.asyncio
async def test_chan_unmapped_dropped(self):
session, sent = _make_session()
session.key_to_idx = {}
data = {
"type": "CHAN",
"outgoing": False,
"conversation_key": "ff" * 16,
"text": "hello",
"sender_timestamp": 0,
}
await session.on_event_message(data)
assert len(session._msg_queue) == 0
@pytest.mark.asyncio
async def test_contact_event_updates_existing_cache(self):
session, sent = _make_session()
# Contact must already be in favorites cache to receive pushes
session.contacts = [
{
"public_key": EXAMPLE_KEY,
"name": "Old",
"type": 1,
"favorite": True,
"direct_path": None,
"direct_path_len": -1,
"direct_path_hash_mode": -1,
"last_advert": 0,
"lat": 0.0,
"lon": 0.0,
"first_seen": 0,
}
]
data = {
"public_key": EXAMPLE_KEY,
"type": 1,
"name": "Updated",
"favorite": True,
"direct_path": None,
"direct_path_len": -1,
"direct_path_hash_mode": -1,
"last_advert": 100,
"lat": 0.0,
"lon": 0.0,
"first_seen": 0,
}
await session.on_event_contact(data)
assert len(session.contacts) == 1
assert session.contacts[0]["name"] == "Updated"
# Should have sent a PUSH_NEW_ADVERT
payloads = _extract_payloads(sent)
assert payloads[0][0] == 0x8A # PUSH_NEW_ADVERT

@pytest.mark.asyncio
async def test_contact_event_ignored_for_non_favorites(self):
session, sent = _make_session()
session.contacts = []
data = {
"public_key": EXAMPLE_KEY,
"type": 1,
"name": "Stranger",
"favorite": False,
}
await session.on_event_contact(data)
assert len(session.contacts) == 0
assert len(sent) == 0
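
# A hypothetical sketch of the frame layout the path-byte tests above rely on
# (path byte at offset 5, after type, SNR, two reserved bytes, and channel
# index). Field widths here are assumptions for illustration, not the actual
# wire format.
import struct

def _build_chan_frame_sketch(msg_type, snr, channel_idx, path_byte, payload):
    # B = type, B = snr, xx = two reserved bytes, B = channel_idx, B = path
    header = struct.pack("BBxxBB", msg_type, snr, channel_idx, path_byte)
    return header + payload

_frame = _build_chan_frame_sketch(0x01, 0, 0, 0, b"hello")
assert _frame[5] == 0  # no-path default is 0, never 0xFF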
Generated lockfile (+5 -5):

@@ -768,7 +768,7 @@ wheels = [

 [[package]]
 name = "meshcore"
-version = "2.3.2"
+version = "2.3.7"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "bleak" },
@@ -776,9 +776,9 @@ dependencies = [
     { name = "pycryptodome" },
     { name = "pyserial-asyncio-fast" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/4c/32/6e7a3e7dcc379888bc2bfcbbdf518af89e47b3697977cbfefd0b87fdf333/meshcore-2.3.2.tar.gz", hash = "sha256:98ceb8c28a8abe5b5b77f0941b30f99ba3d4fc2350f76de99b6c8a4e778dad6f", size = 69871 }
+sdist = { url = "https://files.pythonhosted.org/packages/50/d1/e45d8fa3cac24d58c3bc2523fe67b8cd00c05ea68e1704fbbaf56cb19753/meshcore-2.3.7.tar.gz", hash = "sha256:267107e09a96f7d0d63f4bdb1402d033a724baadd9c9becf9b71a458170f60bb", size = 90787 }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/db/e4/9aafcd70315e48ca1bbae2f4ad1e00a13d5ef00019c486f964b31c34c488/meshcore-2.3.2-py3-none-any.whl", hash = "sha256:7b98e6d71f2c1e1ee146dd2fe96da40eb5bf33077e34ca840557ee53b192e322", size = 53325 },
+    { url = "https://files.pythonhosted.org/packages/80/3d/ff4b5971a3210da07dc793b54af9b1231fea42dfb87e2818fdcc83e10d72/meshcore-2.3.7-py3-none-any.whl", hash = "sha256:952f028b25527155e78103d01598fa3897cccfa793ba2028a32bc36c86759f14", size = 60352 },
 ]

 [[package]]
@@ -1533,7 +1533,7 @@ wheels = [

 [[package]]
 name = "remoteterm-meshcore"
-version = "3.12.3"
+version = "3.13.0"
 source = { virtual = "." }
 dependencies = [
     { name = "aiomqtt" },
@@ -1569,7 +1569,7 @@ requires-dist = [
     { name = "boto3", specifier = ">=1.38.0" },
     { name = "fastapi", specifier = ">=0.115.0" },
     { name = "httpx", specifier = ">=0.28.1" },
-    { name = "meshcore", specifier = "==2.3.2" },
+    { name = "meshcore", specifier = "==2.3.7" },
     { name = "pycryptodome", specifier = ">=3.20.0" },
     { name = "pydantic-settings", specifier = ">=2.0.0" },
     { name = "pynacl", specifier = ">=1.5.0" },
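
A pin bump like the one above is normally regenerated rather than hand-edited. Assuming the project manages this lockfile with uv (the format matches uv's lockfile; the tooling choice is an inference, not stated in the diff), the update would look roughly like:

    uv lock --upgrade-package meshcore

after raising the `meshcore` pin in pyproject.toml, which rewrites the package's version, sdist, and wheel entries in one pass.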