forked from iarv/potato-mesh
Compare commits
28 Commits

- 5a47a8f8e4
- c13f3c913f
- 2e9b54b6cf
- 7e844be627
- b37e55c29a
- 332ba044f2
- 09a2d849ec
- a3fb9b0d5c
- 192978acf9
- 581aaea93b
- 299752a4f1
- 142c0aa539
- 78168ce3db
- 332abbc183
- c136c5cf26
- 2a65e89eee
- d6f1e7bc80
- 5ac5f3ec3f
- bb4cbfa62c
- f0d600e5d7
- e0f0a6390d
- d4a27dccf7
- 74c4596dc5
- 1f2328613c
- eeca67f6ea
- 4ae8a1cfca
- ff06129a6f
- 6d7aa4dd56
@@ -56,6 +56,9 @@ MATRIX_ROOM='#meshtastic-berlin:matrix.org'
# Debug mode (0=off, 1=on)
DEBUG=0

# Docker image architecture (linux-amd64, linux-arm64, linux-armv7)
POTATOMESH_IMAGE_ARCH=linux-amd64

# Docker Compose networking profile
# Leave unset for Linux hosts (default host networking).
# Set to "bridge" on Docker Desktop (macOS/Windows) if host networking
@@ -33,6 +33,7 @@ jobs:
        architecture:
          - { name: linux-amd64, platform: linux/amd64, label: "Linux x86_64" }
          - { name: linux-arm64, platform: linux/arm64, label: "Linux ARM64" }
          - { name: linux-armv7, platform: linux/arm/v7, label: "Linux ARMv7" }

    steps:
      - name: Checkout repository
@@ -161,11 +162,13 @@ jobs:
          echo "### 🌐 Web Application" >> $GITHUB_STEP_SUMMARY
          echo "- \`${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-web-linux-amd64:latest\` - Linux x86_64" >> $GITHUB_STEP_SUMMARY
          echo "- \`${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-web-linux-arm64:latest\` - Linux ARM64" >> $GITHUB_STEP_SUMMARY
          echo "- \`${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-web-linux-armv7:latest\` - Linux ARMv7" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          # Ingestor images
          echo "### 📡 Ingestor Service" >> $GITHUB_STEP_SUMMARY
          echo "- \`${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-ingestor-linux-amd64:latest\` - Linux x86_64" >> $GITHUB_STEP_SUMMARY
          echo "- \`${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-ingestor-linux-arm64:latest\` - Linux ARM64" >> $GITHUB_STEP_SUMMARY
          echo "- \`${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-ingestor-linux-armv7:latest\` - Linux ARMv7" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
@@ -1,103 +1,85 @@
# PotatoMesh Docker Setup
# PotatoMesh Docker Guide

## Quick Start
PotatoMesh publishes ready-to-run container images to the GitHub Packages container
registry (GHCR). You do not need to clone the repository to deploy them—Compose
will pull the latest release images for you.

```bash
./configure.sh
docker-compose up -d
docker-compose logs -f
```
## Prerequisites

- Docker Engine 24+ or Docker Desktop with the Compose plugin
- Access to `/dev/ttyACM*` (or equivalent) if you plan to attach a Meshtastic
  device to the ingestor container
- An API token that authorises the ingestor to post to your PotatoMesh instance

## Images on GHCR

| Service  | Image                                                   |
|----------|---------------------------------------------------------|
| Web UI   | `ghcr.io/l5yth/potato-mesh-web-linux-amd64:latest`      |
| Ingestor | `ghcr.io/l5yth/potato-mesh-ingestor-linux-amd64:latest` |

Images are published for every tagged release. Replace `latest` with a
specific version tag if you prefer pinned deployments.

## Configure environment

Create a `.env` file alongside your Compose file and populate the variables you
need. At a minimum you must set `API_TOKEN` so the ingestor can authenticate
against the web API.

```env
API_TOKEN=replace-with-a-strong-token
SITE_NAME=My Meshtastic Network
MESH_SERIAL=/dev/ttyACM0
```
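A strong value for `API_TOKEN` can come from any random-token generator; as a minimal sketch using only the Python standard library (the exact token format is not prescribed by PotatoMesh, any sufficiently random string works):

```python
import secrets

# Generate a URL-safe random token suitable for API_TOKEN in .env.
token = secrets.token_urlsafe(32)
print(f"API_TOKEN={token}")
```

Paste the printed line into `.env`, and give the same token to the ingestor.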
The default configuration attaches both services to the host network. This
avoids creating Docker bridge interfaces on platforms where that operation is
blocked. Access the dashboard at `http://127.0.0.1:41447` as soon as the
containers are running. On Docker Desktop (macOS/Windows) or when you prefer
traditional bridged networking, start Compose with the `bridge` profile:

Additional environment variables are optional:

- `DEFAULT_CHANNEL`, `DEFAULT_FREQUENCY`, `MAP_CENTER_LAT`, `MAP_CENTER_LON`,
  `MAX_NODE_DISTANCE_KM`, and `MATRIX_ROOM` customise the UI.
- `POTATOMESH_INSTANCE` (defaults to `http://web:41447`) lets the ingestor post
  to a remote PotatoMesh instance if you do not run both services together.
- `MESH_CHANNEL_INDEX`, `MESH_SNAPSHOT_SECS`, and `DEBUG` adjust ingestor
  behaviour.

## Docker Compose file

Use the `docker-compose.yml` file provided in the repository (or download the
[raw file from GitHub](https://raw.githubusercontent.com/l5yth/potato-mesh/main/docker-compose.yml)).
It already references the published GHCR images, defines persistent volumes for
data and logs, and includes optional bridge-profile services for environments
that require classic port mapping. Place this file in the same directory as
your `.env` file so Compose can pick up both.

## Start the stack

From the directory containing the Compose file:

```bash
COMPOSE_PROFILES=bridge docker-compose up -d
docker compose up -d
```
Access at `http://localhost:41447`

## Configuration

Edit the `.env` file or run `./configure.sh` to set:

- `API_TOKEN` - Required for ingestor authentication
- `MESH_SERIAL` - Your Meshtastic device path (e.g., `/dev/ttyACM0`)
- `SITE_NAME` - Your mesh network name
- `MAP_CENTER_LAT/LON` - Map center coordinates

## Device Setup

**Find your device:**
Docker automatically pulls the GHCR images when they are not present locally.
The dashboard becomes available at `http://127.0.0.1:41447`. Use the bridge
profile when you need to map the port explicitly:

```bash
# Linux
ls /dev/ttyACM* /dev/ttyUSB*

# macOS
ls /dev/cu.usbserial-*

# Windows (inside WSL; native Windows exposes devices as COM ports)
ls /dev/ttyS*
COMPOSE_PROFILES=bridge docker compose up -d
```

**Set permissions (Linux/macOS):**
## Updating

```bash
sudo chmod 666 /dev/ttyACM0
# Or add user to dialout group
sudo usermod -a -G dialout $USER
```
## Common Commands

```bash
# Start services
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down

# Stop and remove data
docker-compose down -v

# Update images
docker-compose pull && docker-compose up -d
docker compose pull
docker compose up -d
```

## Troubleshooting

**Device access issues:**
- **Serial device permissions (Linux/macOS):** grant access with `sudo chmod 666
  /dev/ttyACM0` or add your user to the `dialout` group.
- **Port already in use:** identify the conflicting service with `sudo lsof -i
  :41447`.
- **Viewing logs:** `docker compose logs -f` tails output from both services.

```bash
# Check device exists and permissions
ls -la /dev/ttyACM0

# Fix permissions
sudo chmod 666 /dev/ttyACM0
```

**Port conflicts:**

```bash
# Find what's using port 41447
sudo lsof -i :41447
```

**Container issues:**

```bash
# Check logs
docker-compose logs

# Restart services
docker-compose restart
```

For more Docker help, see the [Docker Compose documentation](https://docs.docker.com/compose/).
For general Docker support, consult the [Docker Compose documentation](https://docs.docker.com/compose/).
@@ -18,23 +18,6 @@ Live demo for Berlin #MediumFast: [potatomesh.net](https://potatomesh.net)



## Quick Start with Docker

```bash
./configure.sh          # Configure your setup
docker-compose up -d    # Start services
docker-compose logs -f  # View logs
```

PotatoMesh uses host networking by default so it can run on restricted
systems where Docker cannot create bridged interfaces. The web UI listens on
`http://127.0.0.1:41447` immediately without explicit port mappings. If you
are using Docker Desktop (macOS/Windows) or otherwise require bridged
networking, enable the Compose profile with:

```bash
COMPOSE_PROFILES=bridge docker-compose up -d
```

## Web App

@@ -72,6 +55,7 @@ The web app can be configured with environment variables (defaults shown):

* `MAP_CENTER_LAT` / `MAP_CENTER_LON` - default map center coordinates (default: `52.502889` / `13.404194`)
* `MAX_NODE_DISTANCE_KM` - hide nodes farther than this distance from the center (default: `137`)
* `MATRIX_ROOM` - matrix room id for a footer link (default: `#meshtastic-berlin:matrix.org`)
* `PRIVATE` - set to `1` to hide the chat UI, disable message APIs, and exclude hidden clients (default: unset)

The application derives SEO-friendly document titles, descriptions, and social
preview tags from these existing configuration values and reuses the bundled

@@ -89,10 +73,10 @@ The web app contains an API:

* GET `/api/nodes?limit=100` - returns the latest 100 nodes reported to the app
* GET `/api/positions?limit=100` - returns the latest 100 position data
* GET `/api/messages?limit=100` - returns the latest 100 messages
* GET `/api/messages?limit=100` - returns the latest 100 messages (disabled when `PRIVATE=1`)
* POST `/api/nodes` - upserts nodes provided as JSON object mapping node ids to node data (requires `Authorization: Bearer <API_TOKEN>`)
* POST `/api/messages` - appends positions provided as a JSON object or array (requires `Authorization: Bearer <API_TOKEN>`)
* POST `/api/messages` - appends messages provided as a JSON object or array (requires `Authorization: Bearer <API_TOKEN>`)
* POST `/api/positions` - appends positions provided as a JSON object or array (requires `Authorization: Bearer <API_TOKEN>`)
* POST `/api/messages` - appends messages provided as a JSON object or array (requires `Authorization: Bearer <API_TOKEN>`; disabled when `PRIVATE=1`)

The `API_TOKEN` environment variable must be set to a non-empty value and match the token supplied in the `Authorization` header for `POST` requests.
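The POST endpoints above all expect the bearer token in the `Authorization` header. As a hedged illustration (the endpoint path and header come from the API list above; the base URL, payload shape, and helper name are assumptions for the example):

```python
import json
import urllib.request


def build_post(base_url: str, path: str, payload: dict, token: str) -> urllib.request.Request:
    """Build an authorised JSON POST request for a PotatoMesh API endpoint."""
    return urllib.request.Request(
        base_url + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )


req = build_post(
    "http://127.0.0.1:41447",
    "/api/nodes",
    {"!deadbeef": {"user": {"shortName": "TEST"}}},
    "replace-with-a-strong-token",
)
print(req.full_url)                     # http://127.0.0.1:41447/api/nodes
print(req.get_header("Authorization"))  # Bearer replace-with-a-strong-token
```

Sending the request is then `urllib.request.urlopen(req)`; the server rejects it unless the token matches its own `API_TOKEN`.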
@@ -140,6 +124,11 @@ address (for example `192.168.1.20:4403`) to use the Meshtastic TCP interface.

* <https://potatomesh.net/>
* <https://vrs.kdd2105.ru/>
* <https://potatomesh.stratospire.com/>

## Docker

Looking for container deployment instructions? See the [Docker guide](DOCKER.md).

## License
@@ -62,6 +62,7 @@ MAP_CENTER_LON=$(grep "^MAP_CENTER_LON=" .env 2>/dev/null | cut -d'=' -f2- | tr
MAX_NODE_DISTANCE_KM=$(grep "^MAX_NODE_DISTANCE_KM=" .env 2>/dev/null | cut -d'=' -f2- | tr -d '"' || echo "50")
MATRIX_ROOM=$(grep "^MATRIX_ROOM=" .env 2>/dev/null | cut -d'=' -f2- | tr -d '"' || echo "")
API_TOKEN=$(grep "^API_TOKEN=" .env 2>/dev/null | cut -d'=' -f2- | tr -d '"' || echo "")
POTATOMESH_IMAGE_ARCH=$(grep "^POTATOMESH_IMAGE_ARCH=" .env 2>/dev/null | cut -d'=' -f2- | tr -d '"' || echo "linux-amd64")

echo "📍 Location Settings"
echo "-------------------"
@@ -81,6 +82,12 @@ echo "💬 Optional Settings"
echo "-------------------"
read_with_default "Matrix Room (optional, e.g., #meshtastic-berlin:matrix.org)" "$MATRIX_ROOM" MATRIX_ROOM

echo ""
echo "🛠 Docker Settings"
echo "------------------"
echo "Specify the Docker image architecture for your host (linux-amd64, linux-arm64, linux-armv7)."
read_with_default "Docker image architecture" "$POTATOMESH_IMAGE_ARCH" POTATOMESH_IMAGE_ARCH

echo ""
echo "🔐 Security Settings"
echo "-------------------"
@@ -124,6 +131,7 @@ update_env "MAP_CENTER_LON" "$MAP_CENTER_LON"
update_env "MAX_NODE_DISTANCE_KM" "$MAX_NODE_DISTANCE_KM"
update_env "MATRIX_ROOM" "\"$MATRIX_ROOM\""
update_env "API_TOKEN" "$API_TOKEN"
update_env "POTATOMESH_IMAGE_ARCH" "$POTATOMESH_IMAGE_ARCH"

# Add other common settings if they don't exist
if ! grep -q "^MESH_SERIAL=" .env; then
@@ -148,6 +156,7 @@ echo " Channel: $DEFAULT_CHANNEL"
echo "  Frequency: $DEFAULT_FREQUENCY"
echo "  Matrix Room: ${MATRIX_ROOM:-'Not set'}"
echo "  API Token: ${API_TOKEN:0:8}..."
echo "  Docker Image Arch: $POTATOMESH_IMAGE_ARCH"
echo ""
echo "🚀 You can now start PotatoMesh with:"
echo "  docker-compose up -d"
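The `grep | cut | tr` pipeline repeated above reads one key from `.env`, strips surrounding quotes, and falls back to a default when the key is missing. The same logic in Python, as a sketch (the function name and temp-file usage are illustrative, not part of the script):

```python
import tempfile


def read_env_value(path: str, key: str, default: str = "") -> str:
    """Return the value of KEY=value from an env file, quotes stripped."""
    try:
        with open(path) as fh:
            for line in fh:
                if line.startswith(f"{key}="):
                    return line.split("=", 1)[1].strip().strip('"')
    except FileNotFoundError:
        pass
    return default


# Demonstrate against a throwaway env file.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write('MATRIX_ROOM="#meshtastic-berlin:matrix.org"\n')
    env_path = fh.name

print(read_env_value(env_path, "MATRIX_ROOM"))
# #meshtastic-berlin:matrix.org
print(read_env_value(env_path, "POTATOMESH_IMAGE_ARCH", "linux-amd64"))
# linux-amd64
```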
+562 −87
@@ -24,12 +24,24 @@ entry point that performs these synchronisation tasks.

import base64
import dataclasses
import glob
import heapq
import inspect
import ipaddress
import itertools
import json, os, time, threading, signal, urllib.request, urllib.error, urllib.parse
import json
import math
import os
import re
import signal
import threading
import time
import urllib.error
import urllib.parse
import urllib.request
from collections.abc import Mapping
from functools import lru_cache
from typing import TYPE_CHECKING

from meshtastic.serial_interface import SerialInterface
from meshtastic.tcp_interface import TCPInterface
@@ -38,8 +50,16 @@ from google.protobuf.json_format import MessageToDict
from google.protobuf.message import Message as ProtoMessage
from google.protobuf.message import DecodeError
if TYPE_CHECKING:  # pragma: no cover - import only used for type checking
    from meshtastic.ble_interface import BLEInterface as _BLEInterface

# Exposed for tests and backward compatibility; resolved lazily in
# :func:`_load_ble_interface` so importing this module does not require the BLE
# extras to be installed.
BLEInterface = None

# --- Config (env overrides) ---------------------------------------------------
PORT = os.environ.get("MESH_SERIAL", "/dev/ttyACM0")
PORT = os.environ.get("MESH_SERIAL")
SNAPSHOT_SECS = int(os.environ.get("MESH_SNAPSHOT_SECS", "60"))
CHANNEL_INDEX = int(os.environ.get("MESH_CHANNEL_INDEX", "0"))
DEBUG = os.environ.get("DEBUG") == "1"
@@ -51,12 +71,57 @@ API_TOKEN = os.environ.get("API_TOKEN", "")


_DEFAULT_TCP_PORT = 4403
_DEFAULT_TCP_TARGET = "http://127.0.0.1"

_DEFAULT_SERIAL_PATTERNS = (
    "/dev/ttyACM*",
    "/dev/ttyUSB*",
    "/dev/tty.usbmodem*",
    "/dev/tty.usbserial*",
    "/dev/cu.usbmodem*",
    "/dev/cu.usbserial*",
)

_BLE_ADDRESS_RE = re.compile(r"^(?:[0-9a-fA-F]{2}:){5}[0-9a-fA-F]{2}$")
def _debug_log(message: str):
    """Print ``message`` with a UTC timestamp when ``DEBUG`` is enabled."""

    if not DEBUG:
        return

    timestamp = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
    print(f"[{timestamp}] [debug] {message}")


# Reconnect configuration: retry delays are adjustable via environment
# variables to ease testing while keeping sensible defaults in production.
_RECONNECT_INITIAL_DELAY_SECS = float(os.environ.get("MESH_RECONNECT_INITIAL", "5"))
_RECONNECT_MAX_DELAY_SECS = float(os.environ.get("MESH_RECONNECT_MAX", "60"))
_CLOSE_TIMEOUT_SECS = float(os.environ.get("MESH_CLOSE_TIMEOUT", "5"))


def _event_wait_allows_default_timeout() -> bool:
    """Return ``True`` when :func:`threading.Event.wait` accepts no timeout."""

    try:
        wait_signature = inspect.signature(threading.Event.wait)
    except (TypeError, ValueError):  # pragma: no cover - built-ins without inspect
        return True

    parameters = list(wait_signature.parameters.values())
    if len(parameters) <= 1:
        return True

    timeout_parameter = parameters[1]
    if timeout_parameter.kind in (
        inspect.Parameter.VAR_POSITIONAL,
        inspect.Parameter.VAR_KEYWORD,
    ):
        return True

    return timeout_parameter.default is not inspect._empty
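The introspection above keys off whether the second parameter of `wait` carries a default value. The same check on a stand-in function with the shape of `threading.Event.wait` (the public `inspect.Parameter.empty` is used here instead of the private `inspect._empty` the diff relies on):

```python
import inspect


def wait(self, timeout=None):
    """Stand-in with the same signature shape as threading.Event.wait."""
    return timeout


params = list(inspect.signature(wait).parameters.values())
timeout_param = params[1]
# A default of None (rather than inspect.Parameter.empty) means the
# function may be called without passing a timeout at all.
print(timeout_param.default is not inspect.Parameter.empty)  # True
```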
class _DummySerialInterface:
@@ -76,6 +141,19 @@ class _DummySerialInterface:
        pass


def _parse_ble_target(value: str) -> str | None:
    """Return an uppercase BLE MAC address when ``value`` matches the format."""

    if not value:
        return None
    value = value.strip()
    if not value:
        return None
    if _BLE_ADDRESS_RE.fullmatch(value):
        return value.upper()
    return None
def _parse_network_target(value: str) -> tuple[str, int] | None:
    """Return ``(host, port)`` when ``value`` is an IP address string.

@@ -128,8 +206,27 @@ def _parse_network_target(value: str) -> tuple[str, int] | None:
    return _validated_result(value, None)
def _create_serial_interface(port: str):
    """Return an appropriate serial interface for ``port``.

@lru_cache(maxsize=1)
def _load_ble_interface():
    """Return :class:`meshtastic.ble_interface.BLEInterface` when available."""

    global BLEInterface
    if BLEInterface is not None:
        return BLEInterface

    try:
        from meshtastic.ble_interface import BLEInterface as _resolved_interface
    except ImportError as exc:  # pragma: no cover - exercised in non-BLE envs
        raise RuntimeError(
            "BLE interface requested but the Meshtastic BLE dependencies are not installed. "
            "Install the 'meshtastic[ble]' extra to enable BLE support."
        ) from exc
    BLEInterface = _resolved_interface
    return _resolved_interface
def _create_serial_interface(port: str) -> tuple[object, str]:
    """Return an appropriate mesh interface for ``port``.

    Passing ``mock`` (case-insensitive) or an empty value skips hardware access
    and returns :class:`_DummySerialInterface`. This makes it possible to run
@@ -139,16 +236,64 @@ def _create_serial_interface(port: str):

    port_value = (port or "").strip()
    if port_value.lower() in {"", "mock", "none", "null", "disabled"}:
        if DEBUG:
            print(f"[debug] using dummy serial interface for port={port_value!r}")
        return _DummySerialInterface()
        _debug_log(f"using dummy serial interface for port={port_value!r}")
        return _DummySerialInterface(), "mock"
    ble_target = _parse_ble_target(port_value)
    if ble_target:
        _debug_log(f"using BLE interface for address={ble_target}")
        return _load_ble_interface()(address=ble_target), ble_target
    network_target = _parse_network_target(port_value)
    if network_target:
        host, tcp_port = network_target
        if DEBUG:
            print("[debug] using TCP interface for host=" f"{host!r} port={tcp_port!r}")
        return TCPInterface(hostname=host, portNumber=tcp_port)
    return SerialInterface(devPath=port_value)
        _debug_log(f"using TCP interface for host={host!r} port={tcp_port!r}")
        return (
            TCPInterface(hostname=host, portNumber=tcp_port),
            f"tcp://{host}:{tcp_port}",
        )
    _debug_log(f"using serial interface for port={port_value!r}")
    return SerialInterface(devPath=port_value), port_value
class NoAvailableMeshInterface(RuntimeError):
    """Raised when no default mesh interface can be created."""


def _default_serial_targets() -> list[str]:
    """Return a list of candidate serial device paths for auto-discovery."""

    candidates: list[str] = []
    seen: set[str] = set()
    for pattern in _DEFAULT_SERIAL_PATTERNS:
        for path in sorted(glob.glob(pattern)):
            if path not in seen:
                candidates.append(path)
                seen.add(path)
    if "/dev/ttyACM0" not in seen:
        candidates.append("/dev/ttyACM0")
    return candidates
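The discovery order is deterministic: each pattern's glob results are sorted, duplicates are dropped on first sight, and `/dev/ttyACM0` is appended as a last resort. The same merge logic over canned glob results (the fake device paths are illustrative only):

```python
def ordered_unique(groups: list[list[str]], fallback: str) -> list[str]:
    """Merge groups in order, sorting each, keeping first occurrences, then a fallback."""
    candidates: list[str] = []
    seen: set[str] = set()
    for group in groups:
        for path in sorted(group):
            if path not in seen:
                candidates.append(path)
                seen.add(path)
    if fallback not in seen:
        candidates.append(fallback)
    return candidates


print(ordered_unique([["/dev/ttyUSB1", "/dev/ttyUSB0"], ["/dev/ttyUSB0"]], "/dev/ttyACM0"))
# ['/dev/ttyUSB0', '/dev/ttyUSB1', '/dev/ttyACM0']
```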
def _create_default_interface() -> tuple[object, str]:
    """Attempt to create the default mesh interface, raising on failure."""

    errors: list[tuple[str, Exception]] = []
    for candidate in _default_serial_targets():
        try:
            return _create_serial_interface(candidate)
        except Exception as exc:  # pragma: no cover - hardware dependent
            errors.append((candidate, exc))
            _debug_log(f"failed to open serial candidate {candidate!r}: {exc}")
    try:
        return _create_serial_interface(_DEFAULT_TCP_TARGET)
    except Exception as exc:  # pragma: no cover - network dependent
        errors.append((_DEFAULT_TCP_TARGET, exc))
        _debug_log(f"failed to open TCP fallback {_DEFAULT_TCP_TARGET!r}: {exc}")
    if errors:
        summary = "; ".join(f"{target}: {error}" for target, error in errors)
        raise NoAvailableMeshInterface(
            f"no mesh interface available ({summary})"
        ) from errors[-1][1]
    raise NoAvailableMeshInterface("no mesh interface available")
# --- POST queue ----------------------------------------------------------------
@@ -157,18 +302,23 @@ _POST_QUEUE = []
_POST_QUEUE_COUNTER = itertools.count()
_POST_QUEUE_ACTIVE = False

_MESSAGE_POST_PRIORITY = 0
_POSITION_POST_PRIORITY = 10
_NODE_POST_PRIORITY = 20
_DEFAULT_POST_PRIORITY = 50
_MESSAGE_POST_PRIORITY = 10
_NEIGHBOR_POST_PRIORITY = 20
_POSITION_POST_PRIORITY = 30
_TELEMETRY_POST_PRIORITY = 40
_NODE_POST_PRIORITY = 50
_DEFAULT_POST_PRIORITY = 90

_RECEIVE_TOPICS = (
    "meshtastic.receive",
    "meshtastic.receive.text",
    "meshtastic.receive.position",
    "meshtastic.receive.POSITION_APP",
    "meshtastic.receive.user",
    "meshtastic.receive.POSITION_APP",
    "meshtastic.receive.NODEINFO_APP",
    "meshtastic.receive.NEIGHBORINFO_APP",
    "meshtastic.receive.TEXT_MESSAGE_APP",
    "meshtastic.receive.TELEMETRY_APP",
)
@@ -210,8 +360,7 @@ def _post_json(path: str, payload: dict):
        with urllib.request.urlopen(req, timeout=10) as resp:
            resp.read()
    except Exception as e:
        if DEBUG:
            print(f"[warn] POST {url} failed: {e}")
        _debug_log(f"[warn] POST {url} failed: {e}")


def _enqueue_post_json(path: str, payload: dict, priority: int):
@@ -324,7 +473,8 @@ def upsert_node(node_id, n):
    if DEBUG:
        user = _get(ndict, "user") or {}
        short = _get(user, "shortName")
        print(f"[debug] upserted node {node_id} shortName={short!r}")
        long = _get(user, "longName")
        _debug_log(f"upserted node {node_id} shortName={short!r} longName={long!r}")
# --- Message logging via PubSub -----------------------------------------------
@@ -859,7 +1009,7 @@ def store_position_packet(packet: dict, decoded: Mapping):

    position_payload = {
        "id": pkt_id,
        "node_id": node_id,
        "node_id": node_id or raw_from,
        "node_num": node_num,
        "num": node_num,
        "from_id": node_id,
@@ -890,8 +1040,160 @@ def store_position_packet(packet: dict, decoded: Mapping):
    )

    if DEBUG:
        print(
            f"[debug] stored position for {node_id} lat={latitude!r} lon={longitude!r} rx_time={rx_time}"
        _debug_log(f"stored position for {node_id} lat={latitude!r} lon={longitude!r}")
def store_telemetry_packet(packet: dict, decoded: Mapping):
    """Handle ``TELEMETRY_APP`` packets and forward them to ``/api/telemetry``."""

    telemetry_section = (
        decoded.get("telemetry") if isinstance(decoded, Mapping) else None
    )
    if not isinstance(telemetry_section, Mapping):
        return

    pkt_id = _coerce_int(_first(packet, "id", "packet_id", "packetId", default=None))
    if pkt_id is None:
        return

    raw_from = _first(packet, "fromId", "from_id", "from", default=None)
    node_id = _canonical_node_id(raw_from)
    node_num = _coerce_int(_first(decoded, "num", "node_num", default=None))
    if node_num is None:
        node_num = _node_num_from_id(node_id or raw_from)

    to_id = _first(packet, "toId", "to_id", "to", default=None)

    raw_rx_time = _first(packet, "rxTime", "rx_time", default=time.time())
    try:
        rx_time = int(raw_rx_time)
    except (TypeError, ValueError):
        rx_time = int(time.time())
    rx_iso = _iso(rx_time)

    telemetry_time = _coerce_int(_first(telemetry_section, "time", default=None))

    channel = _coerce_int(_first(decoded, "channel", default=None))
    if channel is None:
        channel = _coerce_int(_first(packet, "channel", default=None))
    if channel is None:
        channel = 0

    bitfield = _coerce_int(_first(decoded, "bitfield", default=None))
    snr = _coerce_float(_first(packet, "snr", "rx_snr", "rxSnr", default=None))
    rssi = _coerce_int(_first(packet, "rssi", "rx_rssi", "rxRssi", default=None))
    hop_limit = _coerce_int(_first(packet, "hopLimit", "hop_limit", default=None))

    portnum_raw = _first(decoded, "portnum", default=None)
    portnum = str(portnum_raw) if portnum_raw is not None else None

    payload_bytes = _extract_payload_bytes(decoded)
    payload_b64 = (
        base64.b64encode(payload_bytes).decode("ascii") if payload_bytes else None
    )
    device_metrics_section = telemetry_section.get(
        "deviceMetrics"
    ) or telemetry_section.get("device_metrics")
    device_metrics = (
        _node_to_dict(device_metrics_section)
        if isinstance(device_metrics_section, Mapping)
        else None
    )

    environment_section = telemetry_section.get(
        "environmentMetrics"
    ) or telemetry_section.get("environment_metrics")
    environment_metrics = (
        _node_to_dict(environment_section)
        if isinstance(environment_section, Mapping)
        else None
    )

    metrics_lookup = lambda mapping, *names: _first(mapping or {}, *names, default=None)

    battery_level = _coerce_float(
        metrics_lookup(device_metrics, "batteryLevel", "battery_level")
    )
    voltage = _coerce_float(metrics_lookup(device_metrics, "voltage"))
    channel_utilization = _coerce_float(
        metrics_lookup(device_metrics, "channelUtilization", "channel_utilization")
    )
    air_util_tx = _coerce_float(
        metrics_lookup(device_metrics, "airUtilTx", "air_util_tx")
    )
    uptime_seconds = _coerce_int(
        metrics_lookup(device_metrics, "uptimeSeconds", "uptime_seconds")
    )

    temperature = _coerce_float(
        metrics_lookup(
            environment_metrics,
            "temperature",
            "temperatureC",
            "temperature_c",
            "tempC",
        )
    )
    relative_humidity = _coerce_float(
        metrics_lookup(
            environment_metrics,
            "relativeHumidity",
            "relative_humidity",
            "humidity",
        )
    )
    barometric_pressure = _coerce_float(
        metrics_lookup(
            environment_metrics,
            "barometricPressure",
            "barometric_pressure",
            "pressure",
        )
    )
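`_first` is not shown in this hunk; assuming it returns the value of the first key present in the mapping (which is how every call site above uses it), the camelCase/snake_case fallback can be sketched as:

```python
def first(mapping: dict, *names: str, default=None):
    """Return the value of the first key present in mapping (assumed _first semantics)."""
    for name in names:
        if name in mapping:
            return mapping[name]
    return default


# Meshtastic decodes protobuf metrics with camelCase keys, but snake_case
# variants can appear too, hence the paired lookups above.
device_metrics = {"battery_level": 87, "voltage": 4.01}
print(first(device_metrics, "batteryLevel", "battery_level"))  # 87
print(first(device_metrics, "uptimeSeconds", "uptime_seconds"))  # None
```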
    telemetry_payload = {
        "id": pkt_id,
        "node_id": node_id,
        "node_num": node_num,
        "from_id": node_id or raw_from,
        "to_id": to_id,
        "rx_time": rx_time,
        "rx_iso": rx_iso,
        "telemetry_time": telemetry_time,
        "channel": channel,
        "portnum": portnum,
        "bitfield": bitfield,
        "snr": snr,
        "rssi": rssi,
        "hop_limit": hop_limit,
        "payload_b64": payload_b64,
    }

    if battery_level is not None:
        telemetry_payload["battery_level"] = battery_level
    if voltage is not None:
        telemetry_payload["voltage"] = voltage
    if channel_utilization is not None:
        telemetry_payload["channel_utilization"] = channel_utilization
    if air_util_tx is not None:
        telemetry_payload["air_util_tx"] = air_util_tx
    if uptime_seconds is not None:
        telemetry_payload["uptime_seconds"] = uptime_seconds
    if temperature is not None:
        telemetry_payload["temperature"] = temperature
    if relative_humidity is not None:
        telemetry_payload["relative_humidity"] = relative_humidity
    if barometric_pressure is not None:
        telemetry_payload["barometric_pressure"] = barometric_pressure

    _queue_post_json(
        "/api/telemetry", telemetry_payload, priority=_TELEMETRY_POST_PRIORITY
    )

    if DEBUG:
        _debug_log(
            f"stored telemetry for {node_id!r} battery={battery_level!r} voltage={voltage!r}"
        )
@@ -1042,7 +1344,96 @@ def store_nodeinfo_packet(packet: dict, decoded: Mapping):
    short = None
    if isinstance(user_dict, Mapping):
        short = user_dict.get("shortName")
    print(f"[debug] stored nodeinfo for {node_id} shortName={short!r}")
        long = user_dict.get("longName")
    else:
        long = None
    _debug_log(
        f"stored nodeinfo for {node_id} shortName={short!r} longName={long!r}"
    )
def store_neighborinfo_packet(packet: dict, decoded: Mapping):
    """Handle ``NEIGHBORINFO_APP`` packets for mesh health broadcasts."""

    neighbor_section = (
        decoded.get("neighborinfo") if isinstance(decoded, Mapping) else None
    )
    if not isinstance(neighbor_section, Mapping):
        return

    node_ref = _first(
        neighbor_section,
        "nodeId",
        "node_id",
        default=_first(packet, "fromId", "from_id", "from", default=None),
    )
    node_id = _canonical_node_id(node_ref)
    if node_id is None:
        return

    node_num = _coerce_int(_first(neighbor_section, "nodeId", "node_id", default=None))
    if node_num is None:
        node_num = _node_num_from_id(node_id)

    rx_time = _coerce_int(_first(packet, "rxTime", "rx_time", default=time.time()))
    if rx_time is None:
        rx_time = int(time.time())

    neighbors_payload = neighbor_section.get("neighbors")
    neighbors_iterable = (
        neighbors_payload if isinstance(neighbors_payload, list) else []
    )
    neighbor_entries: list[dict] = []
    for entry in neighbors_iterable:
        if not isinstance(entry, Mapping):
            continue
        neighbor_ref = _first(entry, "nodeId", "node_id", default=None)
        neighbor_id = _canonical_node_id(neighbor_ref)
        if neighbor_id is None:
            continue
        neighbor_num = _coerce_int(_first(entry, "nodeId", "node_id", default=None))
        if neighbor_num is None:
            neighbor_num = _node_num_from_id(neighbor_id)
        snr = _coerce_float(_first(entry, "snr", default=None))
        entry_rx_time = _coerce_int(_first(entry, "rxTime", "rx_time", default=None))
        if entry_rx_time is None:
            entry_rx_time = rx_time
        neighbor_entries.append(
            {
                "neighbor_id": neighbor_id,
                "neighbor_num": neighbor_num,
                "snr": snr,
                "rx_time": entry_rx_time,
            }
        )

    payload: dict[str, object] = {
        "node_id": node_id,
        "node_num": node_num,
        "rx_time": rx_time,
    }
    if neighbor_entries:
        payload["neighbors"] = neighbor_entries

    broadcast_interval = _coerce_int(
        _first(neighbor_section, "nodeBroadcastIntervalSecs", default=None)
    )
    if broadcast_interval is not None:
        payload["node_broadcast_interval_secs"] = broadcast_interval

    last_sent_by = _canonical_node_id(
        _first(neighbor_section, "lastSentById", "last_sent_by_id", default=None)
    )
    if last_sent_by is not None:
        payload["last_sent_by_id"] = last_sent_by

    _queue_post_json("/api/neighbors", payload, priority=_NEIGHBOR_POST_PRIORITY)

    if DEBUG:
        _debug_log(
            f"stored neighborinfo for {node_id} neighbors={len(neighbor_entries)}"
        )
def store_packet_dict(p: dict):
@@ -1060,6 +1451,16 @@ def store_packet_dict(p: dict):

    portnum_raw = _first(dec, "portnum", default=None)
    portnum = str(portnum_raw).upper() if portnum_raw is not None else None
    portnum_int = _coerce_int(portnum_raw)

    telemetry_section = dec.get("telemetry") if isinstance(dec, Mapping) else None
    if (
        portnum == "TELEMETRY_APP"
        or portnum_int == 65
        or isinstance(telemetry_section, Mapping)
    ):
        store_telemetry_packet(p, dec)
        return

    if portnum in {"5", "NODEINFO_APP"}:
        store_nodeinfo_packet(p, dec)
@@ -1069,6 +1470,11 @@ def store_packet_dict(p: dict):
        store_position_packet(p, dec)
        return

    neighborinfo_section = dec.get("neighborinfo") if isinstance(dec, Mapping) else None
    if portnum == "NEIGHBORINFO_APP" or isinstance(neighborinfo_section, Mapping):
        store_neighborinfo_packet(p, dec)
        return

    text = _first(dec, "payload.text", "text", default=None)
    encrypted = _first(dec, "payload.encrypted", "encrypted", default=None)
    if encrypted is None:
@@ -1102,7 +1508,7 @@ def store_packet_dict(p: dict):
            raw = json.dumps(p, default=str)
        except Exception:
            raw = str(p)
        print(f"[debug] packet missing from_id: {raw}")
        _debug_log(f"packet missing from_id: {raw}")

    # link metrics
    snr = _first(p, "snr", "rx_snr", "rxSnr", default=None)
@@ -1126,8 +1532,14 @@ def store_packet_dict(p: dict):
    _queue_post_json("/api/messages", msg, priority=_MESSAGE_POST_PRIORITY)

    if DEBUG:
        print(
            f"[debug] stored message from {from_id!r} to {to_id!r} ch={ch} text={text!r}"
        from_label = _canonical_node_id(from_id) or from_id
        to_label = _canonical_node_id(to_id) or to_id
        if text is None and encrypted:
            payload = "Encrypted"
        else:
            payload = text
        _debug_log(
            f"stored message from {from_label!r} to {to_label!r} ch={ch} text={payload!r}"
        )


@@ -1163,8 +1575,7 @@ def _subscribe_receive_topics() -> list[str]:
            pub.subscribe(on_receive, topic)
            subscribed.append(topic)
        except Exception as exc:  # pragma: no cover - pub may raise in prod only
            if DEBUG:
                print(f"[debug] failed to subscribe to {topic!r}: {exc}")
            _debug_log(f"failed to subscribe to {topic!r}: {exc}")
    return subscribed
@@ -1220,86 +1631,150 @@ def main():
    # Subscribe to PubSub topics (reliable in current meshtastic)
    subscribed = _subscribe_receive_topics()
    if DEBUG and subscribed:
        print(f"[debug] subscribed to receive topics: {', '.join(subscribed)}")
        _debug_log(f"subscribed to receive topics: {', '.join(subscribed)}")

    def _close_interface(iface_obj):
        if iface_obj is None:
            return
        try:
            iface_obj.close()
        except Exception:
            pass

        def _do_close():
            try:
                iface_obj.close()
            except Exception as exc:
                if DEBUG:
                    _debug_log(f"error while closing mesh interface: {exc}")

        if _CLOSE_TIMEOUT_SECS <= 0 or not _event_wait_allows_default_timeout():
            _do_close()
            return

        close_thread = threading.Thread(
            target=_do_close, name="mesh-close", daemon=True
        )
        close_thread.start()
        close_thread.join(_CLOSE_TIMEOUT_SECS)
        if close_thread.is_alive():
            print(
                "[warn] mesh interface did not close within "
                f"{_CLOSE_TIMEOUT_SECS:g}s; continuing shutdown"
            )

    iface = None
    resolved_target = None
    retry_delay = max(0.0, _RECONNECT_INITIAL_DELAY_SECS)

    stop = threading.Event()
    initial_snapshot_sent = False

    def handle_sig(*_):
    def handle_sigterm(*_):
        """Stop the daemon when a termination signal is received."""

        stop.set()

    signal.signal(signal.SIGINT, handle_sig)
    signal.signal(signal.SIGTERM, handle_sig)
    def handle_sigint(signum, frame):
        """Handle ``SIGINT`` by requesting shutdown and escalating on repeat."""

        if stop.is_set():
            signal.default_int_handler(signum, frame)
            return

        stop.set()

    signal.signal(signal.SIGINT, handle_sigint)
    signal.signal(signal.SIGTERM, handle_sigterm)

    target = INSTANCE or "(no POTATOMESH_INSTANCE)"
    configured_port = PORT
    active_candidate = configured_port
    announced_target = False
    print(
        f"Mesh daemon: nodes+messages → {target} | port={PORT} | channel={CHANNEL_INDEX}"
        f"Mesh daemon: nodes+messages → {target} | port={configured_port or 'auto'} | channel={CHANNEL_INDEX}"
    )
    while not stop.is_set():
        if iface is None:
            try:
                iface = _create_serial_interface(PORT)
                retry_delay = max(0.0, _RECONNECT_INITIAL_DELAY_SECS)
            except Exception as exc:
                print(f"[warn] failed to create mesh interface: {exc}")
                stop.wait(retry_delay)
                if _RECONNECT_MAX_DELAY_SECS > 0:
                    retry_delay = min(
                        (
                            retry_delay * 2
                            if retry_delay
                            else _RECONNECT_INITIAL_DELAY_SECS
                        ),
                        _RECONNECT_MAX_DELAY_SECS,
                    )
                continue

        try:
            nodes = getattr(iface, "nodes", {}) or {}
            node_items = _node_items_snapshot(nodes)
            if node_items is None:
                if DEBUG:
                    print(
                        "[debug] skipping node snapshot; nodes changed during iteration"
                    )
            else:
                for node_id, n in node_items:
                    try:
                        upsert_node(node_id, n)
                    except Exception as e:
                        print(
                            f"[warn] failed to update node snapshot for {node_id}: {e}"
    try:
        while not stop.is_set():
            if iface is None:
                try:
                    if active_candidate:
                        iface, resolved_target = _create_serial_interface(
                            active_candidate
                        )
                    if DEBUG:
                        print(f"[debug] node object: {n!r}")
        except Exception as e:
            print(f"[warn] failed to update node snapshot: {e}")
            _close_interface(iface)
            iface = None
            stop.wait(retry_delay)
            if _RECONNECT_MAX_DELAY_SECS > 0:
                retry_delay = min(
                    retry_delay * 2 if retry_delay else _RECONNECT_INITIAL_DELAY_SECS,
                    _RECONNECT_MAX_DELAY_SECS,
                )
            continue
                    else:
                        iface, resolved_target = _create_default_interface()
                    active_candidate = resolved_target
                    retry_delay = max(0.0, _RECONNECT_INITIAL_DELAY_SECS)
                    initial_snapshot_sent = False
                    if not announced_target and resolved_target:
                        print(f"[info] using mesh interface: {resolved_target}")
                        announced_target = True
                except NoAvailableMeshInterface as exc:
                    print(f"[error] {exc}")
                    _close_interface(iface)
                    raise SystemExit(1) from exc
                except Exception as exc:
                    candidate_desc = active_candidate or "auto"
                    print(
                        f"[warn] failed to create mesh interface ({candidate_desc}): {exc}"
                    )
                    if configured_port is None:
                        active_candidate = None
                        announced_target = False
                    stop.wait(retry_delay)
                    if _RECONNECT_MAX_DELAY_SECS > 0:
                        retry_delay = min(
                            (
                                retry_delay * 2
                                if retry_delay
                                else _RECONNECT_INITIAL_DELAY_SECS
                            ),
                            _RECONNECT_MAX_DELAY_SECS,
                        )
                    continue

            retry_delay = max(0.0, _RECONNECT_INITIAL_DELAY_SECS)
            stop.wait(SNAPSHOT_SECS)
            if not initial_snapshot_sent:
                try:
                    nodes = getattr(iface, "nodes", {}) or {}
                    node_items = _node_items_snapshot(nodes)
                    if node_items is None:
                        _debug_log(
                            "skipping node snapshot; nodes changed during iteration"
                        )
                    else:
                        processed_snapshot_item = False
                        for node_id, n in node_items:
                            processed_snapshot_item = True
                            try:
                                upsert_node(node_id, n)
                            except Exception as e:
                                print(
                                    f"[warn] failed to update node snapshot for {node_id}: {e}"
                                )
                            if DEBUG:
                                _debug_log(f"node object: {n!r}")
                        if processed_snapshot_item:
                            initial_snapshot_sent = True
                except Exception as e:
                    print(f"[warn] failed to update node snapshot: {e}")
                    _close_interface(iface)
                    iface = None
                    stop.wait(retry_delay)
                    if _RECONNECT_MAX_DELAY_SECS > 0:
                        retry_delay = min(
                            (
                                retry_delay * 2
                                if retry_delay
                                else _RECONNECT_INITIAL_DELAY_SECS
                            ),
                            _RECONNECT_MAX_DELAY_SECS,
                        )
                    continue

        _close_interface(iface)
        retry_delay = max(0.0, _RECONNECT_INITIAL_DELAY_SECS)
        stop.wait(SNAPSHOT_SECS)
    except KeyboardInterrupt:
        _debug_log("received KeyboardInterrupt; shutting down")
        stop.set()
    finally:
        _close_interface(iface)


if __name__ == "__main__":
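The reconnect loop in `main()` above doubles the retry delay after each failure and caps it at `_RECONNECT_MAX_DELAY_SECS`. The progression can be sketched in isolation (the constant names and default values here are illustrative, not the module's actual configuration):

```python
def next_retry_delay(current: float, initial: float = 1.0, maximum: float = 60.0) -> float:
    """Capped exponential backoff: double the delay, never exceed the cap."""
    if maximum <= 0:
        return current  # no cap configured; caller keeps the delay as-is
    return min(current * 2 if current else initial, maximum)

# Successive delays starting from 1s: 2, 4, 8, ... until the 60s cap holds.
delays = []
d = 1.0
for _ in range(8):
    d = next_retry_delay(d)
    delays.append(d)
```

The `if current else initial` branch mirrors the diff's `retry_delay * 2 if retry_delay else _RECONNECT_INITIAL_DELAY_SECS`, so a zero delay restarts the progression rather than staying at zero forever.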
@@ -0,0 +1,26 @@
-- Copyright (C) 2025 l5yth
--
-- Licensed under the Apache License, Version 2.0 (the "License");
-- you may not use this file except in compliance with the License.
-- You may obtain a copy of the License at
--
--     http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing, software
-- distributed under the License is distributed on an "AS IS" BASIS,
-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-- See the License for the specific language governing permissions and
-- limitations under the License.

CREATE TABLE IF NOT EXISTS neighbors (
    node_id     TEXT    NOT NULL,
    neighbor_id TEXT    NOT NULL,
    snr         REAL,
    rx_time     INTEGER NOT NULL,
    PRIMARY KEY (node_id, neighbor_id),
    FOREIGN KEY (node_id) REFERENCES nodes(node_id) ON DELETE CASCADE,
    FOREIGN KEY (neighbor_id) REFERENCES nodes(node_id) ON DELETE CASCADE
);

CREATE INDEX IF NOT EXISTS idx_neighbors_rx_time ON neighbors(rx_time);
CREATE INDEX IF NOT EXISTS idx_neighbors_neighbor_id ON neighbors(neighbor_id);
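The composite primary key above means a node re-reporting the same neighbor should replace the old reading rather than accumulate rows. A sketch of that behaviour with Python's `sqlite3` (the `nodes` table is reduced to the single column the foreign keys need; the upsert statement is an assumption about how a writer might use this schema, not code from the repo):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("PRAGMA foreign_keys = ON")
db.execute("CREATE TABLE nodes (node_id TEXT PRIMARY KEY)")
db.execute("""
    CREATE TABLE neighbors (
        node_id TEXT NOT NULL,
        neighbor_id TEXT NOT NULL,
        snr REAL,
        rx_time INTEGER NOT NULL,
        PRIMARY KEY (node_id, neighbor_id),
        FOREIGN KEY (node_id) REFERENCES nodes(node_id) ON DELETE CASCADE,
        FOREIGN KEY (neighbor_id) REFERENCES nodes(node_id) ON DELETE CASCADE
    )
""")
db.executemany("INSERT INTO nodes VALUES (?)", [("!7c5b0920",), ("!2b22accc",)])

# Upsert keeps only the freshest SNR reading per (node, neighbor) pair.
upsert = """
    INSERT INTO neighbors (node_id, neighbor_id, snr, rx_time)
    VALUES (?, ?, ?, ?)
    ON CONFLICT(node_id, neighbor_id) DO UPDATE SET
        snr = excluded.snr, rx_time = excluded.rx_time
"""
db.execute(upsert, ("!7c5b0920", "!2b22accc", -6.5, 1758884106))
db.execute(upsert, ("!7c5b0920", "!2b22accc", -5.0, 1758884200))
row = db.execute(
    "SELECT snr, rx_time FROM neighbors WHERE node_id = ?", ("!7c5b0920",)
).fetchone()
```

`ON DELETE CASCADE` on both foreign keys means dropping a node row also clears every neighbor edge that references it, in either direction.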
@@ -36,6 +36,7 @@ CREATE TABLE IF NOT EXISTS nodes (
    uptime_seconds INTEGER,
    position_time INTEGER,
    location_source TEXT,
    precision_bits INTEGER,
    latitude REAL,
    longitude REAL,
    altitude REAL
@@ -0,0 +1,43 @@
-- Copyright (C) 2025 l5yth
--
-- Licensed under the Apache License, Version 2.0 (the "License");
-- you may not use this file except in compliance with the License.
-- You may obtain a copy of the License at
--
--     http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing, software
-- distributed under the License is distributed on an "AS IS" BASIS,
-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-- See the License for the specific language governing permissions and
-- limitations under the License.

CREATE TABLE IF NOT EXISTS telemetry (
    id INTEGER PRIMARY KEY,
    node_id TEXT,
    node_num INTEGER,
    from_id TEXT,
    to_id TEXT,
    rx_time INTEGER NOT NULL,
    rx_iso TEXT NOT NULL,
    telemetry_time INTEGER,
    channel INTEGER,
    portnum TEXT,
    hop_limit INTEGER,
    snr REAL,
    rssi INTEGER,
    bitfield INTEGER,
    payload_b64 TEXT,
    battery_level REAL,
    voltage REAL,
    channel_utilization REAL,
    air_util_tx REAL,
    uptime_seconds INTEGER,
    temperature REAL,
    relative_humidity REAL,
    barometric_pressure REAL
);

CREATE INDEX IF NOT EXISTS idx_telemetry_rx_time ON telemetry(rx_time);
CREATE INDEX IF NOT EXISTS idx_telemetry_node_id ON telemetry(node_id);
CREATE INDEX IF NOT EXISTS idx_telemetry_time ON telemetry(telemetry_time);
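A common read against the telemetry table above is "latest sample per node", which the `idx_telemetry_node_id` and `idx_telemetry_rx_time` indexes support. A sketch with `sqlite3`, using a reduced column set and invented sample values:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Reduced to the columns the query needs; the full schema is in the migration.
db.execute("""
    CREATE TABLE telemetry (
        id INTEGER PRIMARY KEY,
        node_id TEXT,
        rx_time INTEGER NOT NULL,
        battery_level REAL
    )
""")
db.executemany(
    "INSERT INTO telemetry (node_id, rx_time, battery_level) VALUES (?, ?, ?)",
    [
        ("!9e95cf60", 1758024300, 101.0),
        ("!9e95cf60", 1758024900, 99.0),
        ("!2a2a2a2a", 1758024400, 58.5),
    ],
)
# Latest row per node via a correlated MAX() subquery (no window functions needed).
latest = db.execute("""
    SELECT node_id, battery_level FROM telemetry t
    WHERE rx_time = (SELECT MAX(rx_time) FROM telemetry WHERE node_id = t.node_id)
    ORDER BY node_id
""").fetchall()
```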
+2 −2
@@ -1,5 +1,5 @@
x-web-base: &web-base
  image: ghcr.io/l5yth/potato-mesh-web-linux-amd64:latest
  image: ghcr.io/l5yth/potato-mesh-web-${POTATOMESH_IMAGE_ARCH:-linux-amd64}:latest
  environment:
    SITE_NAME: ${SITE_NAME:-My Meshtastic Network}
    DEFAULT_CHANNEL: ${DEFAULT_CHANNEL:-#MediumFast}
@@ -24,7 +24,7 @@ x-web-base: &web-base
      cpus: '0.25'

x-ingestor-base: &ingestor-base
  image: ghcr.io/l5yth/potato-mesh-ingestor-linux-amd64:latest
  image: ghcr.io/l5yth/potato-mesh-ingestor-${POTATOMESH_IMAGE_ARCH:-linux-amd64}:latest
  environment:
    MESH_SERIAL: ${MESH_SERIAL:-/dev/ttyACM0}
    MESH_SNAPSHOT_SECS: ${MESH_SNAPSHOT_SECS:-60}
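Compose resolves `${POTATOMESH_IMAGE_ARCH:-linux-amd64}` with standard default-value interpolation, so the same substitution can be checked in plain shell. The image names below follow the compose file above; running it shows which tag each setting selects:

```shell
#!/bin/sh
# Unset -> the compose default (linux-amd64) applies.
unset POTATOMESH_IMAGE_ARCH
echo "ghcr.io/l5yth/potato-mesh-web-${POTATOMESH_IMAGE_ARCH:-linux-amd64}:latest"

# Set (e.g. POTATOMESH_IMAGE_ARCH=linux-arm64 in .env on an ARM host) -> the override wins.
POTATOMESH_IMAGE_ARCH=linux-arm64
echo "ghcr.io/l5yth/potato-mesh-web-${POTATOMESH_IMAGE_ARCH:-linux-amd64}:latest"
```

The same `:-` expansion covers the new `linux-armv7` tag added by the workflow change in this diff.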
@@ -0,0 +1,20 @@
[
  {
    "node_id": "!7c5b0920",
    "rx_time": 1758884186,
    "node_broadcast_interval_secs": 1800,
    "last_sent_by": "!9e99f8c0",
    "neighbors": [
      { "node_id": "!2b22accc", "snr": -6.5, "rx_time": 1758884106 },
      { "node_id": "!43ba26d0", "snr": -5.0, "rx_time": 1758884120 },
      { "node_id": "!69ba6f71", "snr": -13.0, "rx_time": 1758884135 },
      { "node_id": "!fa848384", "snr": -14.75, "rx_time": 1758884150 },
      { "node_id": "!da6a35b4", "snr": -6.5, "rx_time": 1758884165 }
    ]
  },
  {
    "node_id": "!cafebabe",
    "rx_time": 1758883200,
    "neighbors": []
  }
]
+563 −207 (file diff suppressed because it is too large)
@@ -0,0 +1,84 @@
[
  {
    "id": 1256091342,
    "node_id": "!9e95cf60",
    "from_id": "!9e95cf60",
    "to_id": "^all",
    "rx_time": 1758024300,
    "rx_iso": "2025-09-16T12:05:00Z",
    "telemetry_time": 1758024300,
    "channel": 0,
    "portnum": "TELEMETRY_APP",
    "battery_level": 101,
    "bitfield": 1,
    "payload_b64": "DTVr0mgSFQhlFQIrh0AdJb8YPyXYFSA9KJTPEg==",
    "device_metrics": {
      "batteryLevel": 101,
      "voltage": 4.224,
      "channelUtilization": 0.59666663,
      "airUtilTx": 0.03908333,
      "uptimeSeconds": 305044
    },
    "raw": {
      "device_metrics": {
        "battery_level": 101,
        "voltage": 4.224,
        "channel_utilization": 0.59666663,
        "air_util_tx": 0.03908333,
        "uptime_seconds": 305044
      }
    }
  },
  {
    "id": 2817720548,
    "node_id": "!2a2a2a2a",
    "from_id": "!2a2a2a2a",
    "to_id": "^all",
    "rx_time": 1758024400,
    "rx_iso": "2025-09-16T12:06:40Z",
    "telemetry_time": 1758024390,
    "channel": 0,
    "portnum": "TELEMETRY_APP",
    "bitfield": 1,
    "environment_metrics": {
      "temperature": 21.98,
      "relativeHumidity": 39.475586,
      "barometricPressure": 1017.8353
    },
    "raw": {
      "environment_metrics": {
        "temperature": 21.98,
        "relative_humidity": 39.475586,
        "barometric_pressure": 1017.8353
      }
    }
  },
  {
    "id": 345678901,
    "node_id": "!1234abcd",
    "from_id": "!1234abcd",
    "node_num": 305441741,
    "to_id": "^all",
    "rx_time": 1758024500,
    "rx_iso": "2025-09-16T12:08:20Z",
    "telemetry_time": 1758024450,
    "channel": 1,
    "portnum": "TELEMETRY_APP",
    "payload_b64": "AAEC",
    "device_metrics": {
      "battery_level": 58.5,
      "voltage": 3.92,
      "channel_utilization": 0.284,
      "air_util_tx": 0.051,
      "uptime_seconds": 86400
    },
    "local_stats": {
      "numPacketsTx": 1280,
      "numPacketsRx": 1425,
      "numClients": 6,
      "numNodes": 18,
      "freeHeap": 21344,
      "heapLowWater": 19876
    }
  }
]
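The `payload_b64` values in the fixture above carry the raw Telemetry protobuf, base64-encoded. Decoding the protobuf itself would need meshtastic's generated classes, so the sketch below only exercises the base64 layer, confirming the fixture string round-trips through the standard library:

```python
import base64

# Fixture value from the first telemetry entry above.
payload_b64 = "DTVr0mgSFQhlFQIrh0AdJb8YPyXYFSA9KJTPEg=="
raw = base64.b64decode(payload_b64)
# Re-encoding restores the exact fixture string.
roundtrip = base64.b64encode(raw).decode("ascii")
```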
+249 −7
@@ -48,9 +48,21 @@ def mesh_module(monkeypatch):

    tcp_interface_mod.TCPInterface = DummyTCPInterface

    ble_interface_mod = types.ModuleType("meshtastic.ble_interface")

    class DummyBLEInterface:
        def __init__(self, *_, **__):
            self.closed = False

        def close(self):
            self.closed = True

    ble_interface_mod.BLEInterface = DummyBLEInterface

    meshtastic_mod = types.ModuleType("meshtastic")
    meshtastic_mod.serial_interface = serial_interface_mod
    meshtastic_mod.tcp_interface = tcp_interface_mod
    meshtastic_mod.ble_interface = ble_interface_mod
    if real_protobuf is not None:
        meshtastic_mod.protobuf = real_protobuf

@@ -59,6 +71,7 @@ def mesh_module(monkeypatch):
        sys.modules, "meshtastic.serial_interface", serial_interface_mod
    )
    monkeypatch.setitem(sys.modules, "meshtastic.tcp_interface", tcp_interface_mod)
    monkeypatch.setitem(sys.modules, "meshtastic.ble_interface", ble_interface_mod)
    if real_protobuf is not None:
        monkeypatch.setitem(sys.modules, "meshtastic.protobuf", real_protobuf)

@@ -144,8 +157,9 @@ def test_snapshot_interval_defaults_to_60_seconds(mesh_module):
def test_create_serial_interface_allows_mock(mesh_module, value):
    mesh = mesh_module

    iface = mesh._create_serial_interface(value)
    iface, resolved = mesh._create_serial_interface(value)

    assert resolved == "mock"
    assert isinstance(iface.nodes, dict)
    iface.close()

@@ -161,9 +175,10 @@ def test_create_serial_interface_uses_serial_module(mesh_module, monkeypatch):

    monkeypatch.setattr(mesh, "SerialInterface", fake_interface)

    iface = mesh._create_serial_interface("/dev/ttyTEST")
    iface, resolved = mesh._create_serial_interface("/dev/ttyTEST")

    assert created["devPath"] == "/dev/ttyTEST"
    assert resolved == "/dev/ttyTEST"
    assert iface.nodes == {"!foo": sentinel}


@@ -178,9 +193,10 @@ def test_create_serial_interface_uses_tcp_for_ip(mesh_module, monkeypatch):

    monkeypatch.setattr(mesh, "TCPInterface", fake_tcp_interface)

    iface = mesh._create_serial_interface("192.168.1.25:4500")
    iface, resolved = mesh._create_serial_interface("192.168.1.25:4500")

    assert created == {"hostname": "192.168.1.25", "portNumber": 4500}
    assert resolved == "tcp://192.168.1.25:4500"
    assert iface.nodes == {}


@@ -195,10 +211,11 @@ def test_create_serial_interface_defaults_tcp_port(mesh_module, monkeypatch):

    monkeypatch.setattr(mesh, "TCPInterface", fake_tcp_interface)

    mesh._create_serial_interface("tcp://10.20.30.40")
    _, resolved = mesh._create_serial_interface("tcp://10.20.30.40")

    assert created["hostname"] == "10.20.30.40"
    assert created["portNumber"] == mesh._DEFAULT_TCP_PORT
    assert resolved == "tcp://10.20.30.40:4403"


def test_create_serial_interface_plain_ip(mesh_module, monkeypatch):
@@ -212,10 +229,67 @@ def test_create_serial_interface_plain_ip(mesh_module, monkeypatch):

    monkeypatch.setattr(mesh, "TCPInterface", fake_tcp_interface)

    mesh._create_serial_interface(" 192.168.50.10 ")
    _, resolved = mesh._create_serial_interface(" 192.168.50.10 ")

    assert created["hostname"] == "192.168.50.10"
    assert created["portNumber"] == mesh._DEFAULT_TCP_PORT
    assert resolved == "tcp://192.168.50.10:4403"


def test_create_serial_interface_ble(mesh_module, monkeypatch):
    mesh = mesh_module
    created = {}

    def fake_ble_interface(*, address=None, **_):
        created["address"] = address
        return SimpleNamespace(nodes={}, close=lambda: None)

    monkeypatch.setattr(mesh, "BLEInterface", fake_ble_interface)

    iface, resolved = mesh._create_serial_interface("ed:4d:9e:95:cf:60")

    assert created["address"] == "ED:4D:9E:95:CF:60"
    assert resolved == "ED:4D:9E:95:CF:60"
    assert iface.nodes == {}


def test_create_default_interface_falls_back_to_tcp(mesh_module, monkeypatch):
    mesh = mesh_module
    attempts = []

    def fake_targets():
        return ["/dev/ttyFAIL"]

    def fake_create(port):
        attempts.append(port)
        if port.startswith("/dev/tty"):
            raise RuntimeError("missing serial device")
        return SimpleNamespace(nodes={}, close=lambda: None), "tcp://127.0.0.1:4403"

    monkeypatch.setattr(mesh, "_default_serial_targets", fake_targets)
    monkeypatch.setattr(mesh, "_create_serial_interface", fake_create)

    iface, resolved = mesh._create_default_interface()

    assert attempts == ["/dev/ttyFAIL", mesh._DEFAULT_TCP_TARGET]
    assert resolved == "tcp://127.0.0.1:4403"
    assert iface.nodes == {}


def test_create_default_interface_raises_when_unavailable(mesh_module, monkeypatch):
    mesh = mesh_module

    monkeypatch.setattr(mesh, "_default_serial_targets", lambda: ["/dev/ttyFAIL"])

    def always_fail(port):
        raise RuntimeError(f"boom for {port}")

    monkeypatch.setattr(mesh, "_create_serial_interface", always_fail)

    with pytest.raises(mesh.NoAvailableMeshInterface) as exc_info:
        mesh._create_default_interface()

    assert "/dev/ttyFAIL" in str(exc_info.value)


def test_node_to_dict_handles_nested_structures(mesh_module):
@@ -368,6 +442,58 @@ def test_store_packet_dict_posts_position(mesh_module, monkeypatch):
    assert payload["raw"]["time"] == 1_758_624_189


def test_store_packet_dict_posts_neighborinfo(mesh_module, monkeypatch):
    mesh = mesh_module
    captured = []
    monkeypatch.setattr(
        mesh,
        "_queue_post_json",
        lambda path, payload, *, priority: captured.append((path, payload, priority)),
    )

    packet = {
        "id": 2049886869,
        "rxTime": 1_758_884_186,
        "fromId": "!7c5b0920",
        "decoded": {
            "portnum": "NEIGHBORINFO_APP",
            "neighborinfo": {
                "nodeId": 0x7C5B0920,
                "lastSentById": 0x9E3AA2F0,
                "nodeBroadcastIntervalSecs": 1800,
                "neighbors": [
                    {"nodeId": 0x2B2A4D51, "snr": -6.5},
                    {"nodeId": 0x437FE3E0, "snr": -2.75, "rxTime": 1_758_884_150},
                    {"nodeId": "!0badc0de", "snr": None},
                ],
            },
        },
    }

    mesh.store_packet_dict(packet)

    assert captured, "Expected POST to be triggered for neighbor info"
    path, payload, priority = captured[0]
    assert path == "/api/neighbors"
    assert priority == mesh._NEIGHBOR_POST_PRIORITY
    assert payload["node_id"] == "!7c5b0920"
    assert payload["node_num"] == 0x7C5B0920
    assert payload["rx_time"] == 1_758_884_186
    assert payload["node_broadcast_interval_secs"] == 1800
    assert payload["last_sent_by_id"] == "!9e3aa2f0"
    neighbors = payload["neighbors"]
    assert len(neighbors) == 3
    assert neighbors[0]["neighbor_id"] == "!2b2a4d51"
    assert neighbors[0]["neighbor_num"] == 0x2B2A4D51
    assert neighbors[0]["rx_time"] == 1_758_884_186
    assert neighbors[0]["snr"] == pytest.approx(-6.5)
    assert neighbors[1]["neighbor_id"] == "!437fe3e0"
    assert neighbors[1]["rx_time"] == 1_758_884_150
    assert neighbors[1]["snr"] == pytest.approx(-2.75)
    assert neighbors[2]["neighbor_id"] == "!0badc0de"
    assert neighbors[2]["neighbor_num"] == 0x0BAD_C0DE


def test_store_packet_dict_handles_nodeinfo_packet(mesh_module, monkeypatch):
    mesh = mesh_module
    captured = []
@@ -811,8 +937,9 @@ def test_main_retries_interface_creation(mesh_module, monkeypatch):
        attempts.append(port)
        if len(attempts) < 3:
            raise RuntimeError("boom")
        return iface
        return iface, port

    monkeypatch.setattr(mesh, "PORT", "/dev/ttyTEST")
    monkeypatch.setattr(mesh, "_create_serial_interface", fake_create)
    monkeypatch.setattr(mesh.threading, "Event", DummyEvent)
    monkeypatch.setattr(mesh.signal, "signal", lambda *_, **__: None)
@@ -866,13 +993,14 @@ def test_main_recreates_interface_after_snapshot_error(mesh_module, monkeypatch)

        interface = FlakyInterface(fail_first)
        interfaces.append(interface)
        return interface
        return interface, port

    upsert_calls = []

    def record_upsert(node_id, node):
        upsert_calls.append(node_id)

    monkeypatch.setattr(mesh, "PORT", "/dev/ttyTEST")
    monkeypatch.setattr(mesh, "_create_serial_interface", fake_create)
    monkeypatch.setattr(mesh, "upsert_node", record_upsert)
    monkeypatch.setattr(mesh.threading, "Event", DummyEvent)
@@ -888,6 +1016,22 @@ def test_main_recreates_interface_after_snapshot_error(mesh_module, monkeypatch)
    assert upsert_calls == ["!node"]


def test_main_exits_when_defaults_unavailable(mesh_module, monkeypatch):
    mesh = mesh_module

    def fail_default():
        raise mesh.NoAvailableMeshInterface("no interface available")

    monkeypatch.setattr(mesh, "PORT", None)
    monkeypatch.setattr(mesh, "_create_default_interface", fail_default)
    monkeypatch.setattr(mesh.signal, "signal", lambda *_, **__: None)

    with pytest.raises(SystemExit) as exc_info:
        mesh.main()

    assert exc_info.value.code == 1


def test_store_packet_dict_uses_top_level_channel(mesh_module, monkeypatch):
    mesh = mesh_module
    captured = []
@@ -979,6 +1123,104 @@ def test_store_packet_dict_includes_encrypted_payload(mesh_module, monkeypatch):
    assert priority == mesh._MESSAGE_POST_PRIORITY


def test_store_packet_dict_handles_telemetry_packet(mesh_module, monkeypatch):
    mesh = mesh_module
    captured = []
    monkeypatch.setattr(
        mesh,
        "_queue_post_json",
        lambda path, payload, *, priority: captured.append((path, payload, priority)),
    )

    packet = {
        "id": 1_256_091_342,
        "rxTime": 1_758_024_300,
        "fromId": "!9e95cf60",
        "toId": "^all",
        "decoded": {
            "portnum": "TELEMETRY_APP",
            "bitfield": 1,
            "telemetry": {
                "time": 1_758_024_300,
                "deviceMetrics": {
                    "batteryLevel": 101,
                    "voltage": 4.224,
                    "channelUtilization": 0.59666663,
                    "airUtilTx": 0.03908333,
                    "uptimeSeconds": 305044,
                },
                "localStats": {
                    "numPacketsTx": 1280,
                    "numPacketsRx": 1425,
                },
            },
            "payload": {
                "__bytes_b64__": "DTVr0mgSFQhlFQIrh0AdJb8YPyXYFSA9KJTPEg==",
            },
        },
    }

    mesh.store_packet_dict(packet)

    assert captured
    path, payload, priority = captured[0]
    assert path == "/api/telemetry"
    assert priority == mesh._TELEMETRY_POST_PRIORITY
    assert payload["id"] == 1_256_091_342
    assert payload["node_id"] == "!9e95cf60"
    assert payload["from_id"] == "!9e95cf60"
    assert payload["rx_time"] == 1_758_024_300
    assert payload["telemetry_time"] == 1_758_024_300
    assert payload["channel"] == 0
    assert payload["bitfield"] == 1
    assert payload["payload_b64"] == "DTVr0mgSFQhlFQIrh0AdJb8YPyXYFSA9KJTPEg=="
    assert payload["battery_level"] == pytest.approx(101.0)
    assert payload["voltage"] == pytest.approx(4.224)
    assert payload["channel_utilization"] == pytest.approx(0.59666663)
    assert payload["air_util_tx"] == pytest.approx(0.03908333)
    assert payload["uptime_seconds"] == 305044


def test_store_packet_dict_handles_environment_telemetry(mesh_module, monkeypatch):
    mesh = mesh_module
    captured = []
    monkeypatch.setattr(
        mesh,
        "_queue_post_json",
        lambda path, payload, *, priority: captured.append((path, payload, priority)),
    )

    packet = {
        "id": 2_817_720_548,
        "rxTime": 1_758_024_400,
        "from": 3_698_627_780,
        "decoded": {
            "portnum": "TELEMETRY_APP",
            "telemetry": {
                "time": 1_758_024_390,
                "environmentMetrics": {
                    "temperature": 21.98,
                    "relativeHumidity": 39.475586,
                    "barometricPressure": 1017.8353,
                },
            },
        },
    }

    mesh.store_packet_dict(packet)

    assert captured
    path, payload, priority = captured[0]
    assert path == "/api/telemetry"
    assert payload["id"] == 2_817_720_548
    assert payload["node_id"] == "!dc7494c4"
    assert payload["from_id"] == "!dc7494c4"
    assert payload["telemetry_time"] == 1_758_024_390
    assert payload["temperature"] == pytest.approx(21.98)
    assert payload["relative_humidity"] == pytest.approx(39.475586)
    assert payload["barometric_pressure"] == pytest.approx(1017.8353)


def test_post_queue_prioritises_messages(mesh_module, monkeypatch):
    mesh = mesh_module
    mesh._clear_post_queue()
+617
-20
@@ -98,6 +98,17 @@ MAX_NODE_DISTANCE_KM = ENV.fetch("MAX_NODE_DISTANCE_KM", "137").to_f
MATRIX_ROOM = ENV.fetch("MATRIX_ROOM", "#meshtastic-berlin:matrix.org")
DEBUG = ENV["DEBUG"] == "1"

def debug_log(message)
  return unless DEBUG

  logger = settings.logger if respond_to?(:settings)
  logger&.debug(message)
end

def private_mode?
  ENV["PRIVATE"] == "1"
end

def sanitized_string(value)
  value.to_s.strip
end
@@ -173,6 +184,37 @@ def coerce_float(value)
  end
end

def normalize_json_value(value)
  case value
  when Hash
    value.each_with_object({}) do |(key, val), memo|
      memo[key.to_s] = normalize_json_value(val)
    end
  when Array
    value.map { |element| normalize_json_value(element) }
  else
    value
  end
end

def normalize_json_object(value)
  case value
  when Hash
    normalize_json_value(value)
  when String
    trimmed = value.strip
    return nil if trimmed.empty?
    begin
      parsed = JSON.parse(trimmed)
    rescue JSON::ParserError
      return nil
    end
    parsed.is_a?(Hash) ? normalize_json_value(parsed) : nil
  else
    nil
  end
end
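The two helpers above are pure functions, so their contract is easy to check outside the app. A standalone sketch (plain Ruby plus the stdlib `json`, no Sinatra or SQLite needed) reproducing them verbatim and exercising the edge cases:

```ruby
require "json"

# Verbatim copies of the helpers above, exercised in isolation.
def normalize_json_value(value)
  case value
  when Hash
    value.each_with_object({}) { |(key, val), memo| memo[key.to_s] = normalize_json_value(val) }
  when Array
    value.map { |element| normalize_json_value(element) }
  else
    value
  end
end

def normalize_json_object(value)
  case value
  when Hash
    normalize_json_value(value)
  when String
    trimmed = value.strip
    return nil if trimmed.empty?
    begin
      parsed = JSON.parse(trimmed)
    rescue JSON::ParserError
      return nil
    end
    parsed.is_a?(Hash) ? normalize_json_value(parsed) : nil
  else
    nil
  end
end

# Symbol keys become strings, nesting is preserved:
p normalize_json_object({ deviceMetrics: { voltage: 4.1 } })
# A JSON string is parsed, but only accepted if it decodes to an object:
p normalize_json_object('{"temperature": 21.98}')
p normalize_json_object("[1, 2]")    # non-object JSON -> nil
p normalize_json_object("not json")  # unparsable -> nil
```

The `nil` returns are what let `insert_telemetry` below treat a missing, malformed, or non-object `telemetry` field uniformly.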

def sanitized_max_distance_km
  return nil unless defined?(MAX_NODE_DISTANCE_KM)

@@ -204,7 +246,13 @@ def meta_description
    summary += " on #{channel} (#{frequency})."
  end

  sentences = [summary, "Track nodes, messages, and coverage in real time."]
  activity_sentence = if private_mode?
    "Track nodes and coverage in real time."
  else
    "Track nodes, messages, and coverage in real time."
  end

  sentences = [summary, activity_sentence]
  if (distance = sanitized_max_distance_km)
    sentences << "Shows nodes within roughly #{formatted_distance_km(distance)} km of the map center."
  end
@@ -245,6 +293,7 @@ end
def open_database(readonly: false)
  SQLite3::Database.new(DB_PATH, readonly: readonly).tap do |db|
    db.busy_timeout = DB_BUSY_TIMEOUT_MS
    db.execute("PRAGMA foreign_keys = ON")
  end
end

@@ -274,8 +323,8 @@ end
def db_schema_present?
  return false unless File.exist?(DB_PATH)
  db = open_database(readonly: true)
  required = %w[nodes messages positions]
  tables = db.execute("SELECT name FROM sqlite_master WHERE type='table' AND name IN ('nodes','messages','positions')").flatten
  required = %w[nodes messages positions telemetry neighbors]
  tables = db.execute("SELECT name FROM sqlite_master WHERE type='table' AND name IN ('nodes','messages','positions','telemetry','neighbors')").flatten
  (required - tables).empty?
rescue SQLite3::Exception
  false
@@ -289,7 +338,7 @@ end
def init_db
  FileUtils.mkdir_p(File.dirname(DB_PATH))
  db = open_database
  %w[nodes messages positions].each do |schema|
  %w[nodes messages positions telemetry neighbors].each do |schema|
    sql_file = File.expand_path("../data/#{schema}.sql", __dir__)
    db.execute_batch(File.read(sql_file))
  end
@@ -299,6 +348,20 @@ end

init_db unless db_schema_present?

def ensure_schema_upgrades
  db = open_database
  node_columns = db.execute("PRAGMA table_info(nodes)").map { |row| row[1] }
  unless node_columns.include?("precision_bits")
    db.execute("ALTER TABLE nodes ADD COLUMN precision_bits INTEGER")
  end
rescue SQLite3::SQLException => e
  warn "[warn] failed to apply schema upgrade: #{e.message}"
ensure
  db&.close
end

ensure_schema_upgrades

# Retrieve recently heard nodes ordered by their last contact time.
#
# @param limit [Integer] maximum number of rows returned.
@@ -308,16 +371,26 @@ def query_nodes(limit)
  db.results_as_hash = true
  now = Time.now.to_i
  min_last_heard = now - WEEK_SECONDS
  rows = db.execute <<~SQL, [min_last_heard, limit]
    SELECT node_id, short_name, long_name, hw_model, role, snr,
           battery_level, voltage, last_heard, first_heard,
           uptime_seconds, channel_utilization, air_util_tx,
           position_time, latitude, longitude, altitude
    FROM nodes
    WHERE last_heard >= ?
    ORDER BY last_heard DESC
    LIMIT ?
  SQL
  params = [min_last_heard]
  sql = <<~SQL
    SELECT node_id, short_name, long_name, hw_model, role, snr,
           battery_level, voltage, last_heard, first_heard,
           uptime_seconds, channel_utilization, air_util_tx,
           position_time, location_source, precision_bits,
           latitude, longitude, altitude
    FROM nodes
    WHERE last_heard >= ?
  SQL
  if private_mode?
    sql += " AND (role IS NULL OR role <> 'CLIENT_HIDDEN')\n"
  end
  sql += <<~SQL
    ORDER BY last_heard DESC
    LIMIT ?
  SQL
  params << limit

  rows = db.execute(sql, params)
  rows.each do |r|
    r["role"] ||= "CLIENT"
    lh = r["last_heard"]&.to_i
@@ -328,6 +401,8 @@ def query_nodes(limit)
    r["position_time"] = pt
    r["last_seen_iso"] = Time.at(lh).utc.iso8601 if lh
    r["pos_time_iso"] = Time.at(pt).utc.iso8601 if pt
    pb = r["precision_bits"]
    r["precision_bits"] = pb.to_i if pb
  end
  rows
ensure
@@ -448,6 +523,74 @@ def query_positions(limit)
    end
    pt_val = r["position_time"]
    r["position_time_iso"] = Time.at(pt_val).utc.iso8601 if pt_val
    pb = r["precision_bits"]
    r["precision_bits"] = pb.to_i if pb
  end
  rows
ensure
  db&.close
end

def query_neighbors(limit)
  db = open_database(readonly: true)
  db.results_as_hash = true
  rows = db.execute <<~SQL, [limit]
    SELECT node_id, neighbor_id, snr, rx_time
    FROM neighbors
    ORDER BY rx_time DESC
    LIMIT ?
  SQL
  rows.each do |r|
    rx_time = coerce_integer(r["rx_time"])
    r["rx_time"] = rx_time if rx_time
    r["rx_iso"] = Time.at(rx_time).utc.iso8601 if rx_time
    r["snr"] = coerce_float(r["snr"])
  end
  rows
ensure
  db&.close
end

def query_telemetry(limit)
  db = open_database(readonly: true)
  db.results_as_hash = true
  rows = db.execute <<~SQL, [limit]
    SELECT id, node_id, node_num, from_id, to_id, rx_time, rx_iso,
           telemetry_time, channel, portnum, hop_limit, snr, rssi,
           bitfield, payload_b64, battery_level, voltage,
           channel_utilization, air_util_tx, uptime_seconds,
           temperature, relative_humidity, barometric_pressure
    FROM telemetry
    ORDER BY rx_time DESC
    LIMIT ?
  SQL
  now = Time.now.to_i
  rows.each do |r|
    rx_time = coerce_integer(r["rx_time"])
    r["rx_time"] = rx_time if rx_time
    r["rx_iso"] = Time.at(rx_time).utc.iso8601 if rx_time && string_or_nil(r["rx_iso"]).nil?

    node_num = coerce_integer(r["node_num"])
    r["node_num"] = node_num if node_num

    telemetry_time = coerce_integer(r["telemetry_time"])
    telemetry_time = nil if telemetry_time && telemetry_time > now
    r["telemetry_time"] = telemetry_time
    r["telemetry_time_iso"] = Time.at(telemetry_time).utc.iso8601 if telemetry_time

    r["channel"] = coerce_integer(r["channel"])
    r["hop_limit"] = coerce_integer(r["hop_limit"])
    r["rssi"] = coerce_integer(r["rssi"])
    r["bitfield"] = coerce_integer(r["bitfield"])
    r["snr"] = coerce_float(r["snr"])
    r["battery_level"] = coerce_float(r["battery_level"])
    r["voltage"] = coerce_float(r["voltage"])
    r["channel_utilization"] = coerce_float(r["channel_utilization"])
    r["air_util_tx"] = coerce_float(r["air_util_tx"])
    r["uptime_seconds"] = coerce_integer(r["uptime_seconds"])
    r["temperature"] = coerce_float(r["temperature"])
    r["relative_humidity"] = coerce_float(r["relative_humidity"])
    r["barometric_pressure"] = coerce_float(r["barometric_pressure"])
  end
  rows
ensure
@@ -458,6 +601,7 @@ end
#
# Returns a JSON array of stored text messages including node metadata.
get "/api/messages" do
  halt 404 if private_mode?
  content_type :json
  limit = [params["limit"]&.to_i || 200, 1000].min
  query_messages(limit).to_json
@@ -472,6 +616,24 @@ get "/api/positions" do
  query_positions(limit).to_json
end

# GET /api/neighbors
#
# Returns the most recent neighbor tuples describing mesh health.
get "/api/neighbors" do
  content_type :json
  limit = [params["limit"]&.to_i || 200, 1000].min
  query_neighbors(limit).to_json
end

# GET /api/telemetry
#
# Returns a JSON array of recorded telemetry packets.
get "/api/telemetry" do
  content_type :json
  limit = [params["limit"]&.to_i || 200, 1000].min
  query_telemetry(limit).to_json
end
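All of the read endpoints clamp the `limit` query parameter the same way: default 200, hard cap 1000. A minimal sketch of that expression with `params` stubbed as a plain Hash (an assumption for illustration; in the app it is Sinatra's request params):

```ruby
# Clamp the "limit" query parameter: default 200, hard cap 1000.
def clamped_limit(params)
  [params["limit"]&.to_i || 200, 1000].min
end

p clamped_limit({})                   # missing param -> default 200
p clamped_limit("limit" => "50")      # normal use    -> 50
p clamped_limit("limit" => "999999")  # capped        -> 1000
p clamped_limit("limit" => "abc")     # "abc".to_i is 0, so this yields 0 rows
```

Note the last case: Ruby's `String#to_i` turns non-numeric input into `0`, so a garbage `limit` silently returns an empty result set rather than an error.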

# Determine the numeric node reference for a canonical node identifier.
#
# The Meshtastic protobuf encodes the node ID as a hexadecimal string prefixed
@@ -622,9 +784,67 @@ def ensure_unknown_node(db, node_ref, fallback_num = nil, heard_time: nil)
    inserted = db.changes.positive?
  end

  if inserted
    debug_log(
      "ensure_unknown_node created hidden node_id=#{node_id} from=#{node_ref.inspect} " \
      "fallback=#{fallback_num.inspect} heard_time=#{heard_time.inspect}"
    )
  end

  inserted
end

# Ensure the node's last_seen timestamp reflects the provided receive time.
#
# @param db [SQLite3::Database] open database handle.
# @param node_ref [Object] raw identifier used to resolve the node.
# @param fallback_num [Object] optional numeric identifier.
# @param rx_time [Object] receive timestamp that should update the node.
def touch_node_last_seen(db, node_ref, fallback_num = nil, rx_time: nil, source: nil)
  timestamp = coerce_integer(rx_time)
  return unless timestamp

  node_id = nil

  parts = canonical_node_parts(node_ref, fallback_num)
  node_id, = parts if parts

  unless node_id
    trimmed = string_or_nil(node_ref)
    if trimmed
      node_id = normalize_node_id(db, trimmed) || trimmed
    elsif fallback_num
      fallback_parts = canonical_node_parts(fallback_num, nil)
      node_id, = fallback_parts if fallback_parts
    end
  end

  return unless node_id

  updated = false
  with_busy_retry do
    db.execute <<~SQL, [timestamp, timestamp, timestamp, node_id]
      UPDATE nodes
      SET last_heard = CASE
            WHEN COALESCE(last_heard, 0) >= ? THEN last_heard
            ELSE ?
          END,
          first_heard = COALESCE(first_heard, ?)
      WHERE node_id = ?
    SQL
    updated ||= db.changes.positive?
  end

  if updated
    debug_log(
      "touch_node_last_seen updated last_heard node_id=#{node_id} timestamp=#{timestamp} " \
      "source=#{(source || :unknown).inspect}"
    )
  end

  updated
end
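The CASE/COALESCE update in `touch_node_last_seen` is monotonic: `last_heard` only ever moves forward, and `first_heard` is set once and never overwritten. The same rule expressed in plain Ruby (a behavioral sketch of the SQL above, not the code path the app actually takes; the app does this inside SQLite so concurrent writers agree):

```ruby
# Mirror of the SQL semantics:
#   last_heard  = CASE WHEN COALESCE(last_heard, 0) >= ts THEN last_heard ELSE ts END
#   first_heard = COALESCE(first_heard, ts)
def touch(node, timestamp)
  node[:last_heard] = timestamp unless (node[:last_heard] || 0) >= timestamp
  node[:first_heard] ||= timestamp
  node
end

node = { last_heard: nil, first_heard: nil }
touch(node, 100)  # first packet sets both fields
touch(node, 90)   # stale packet: last_heard stays at 100, first_heard untouched
touch(node, 150)  # newer packet advances last_heard only
p node            # => {:last_heard=>150, :first_heard=>100}
```

This is why out-of-order packet delivery (common on a mesh) cannot roll a node's presence backwards.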

# Insert or update a node row with the most recent metrics.
#
# @param db [SQLite3::Database] open database handle.
@@ -673,6 +893,11 @@ def upsert_node(db, node_id, n)
    met["uptimeSeconds"],
    pt,
    pos["locationSource"],
    coerce_integer(
      pos["precisionBits"] ||
      pos["precision_bits"] ||
      pos.dig("raw", "precision_bits"),
    ),
    pos["latitude"],
    pos["longitude"],
    pos["altitude"],
@@ -681,8 +906,8 @@ def upsert_node(db, node_id, n)
  db.execute <<~SQL, row
    INSERT INTO nodes(node_id,num,short_name,long_name,macaddr,hw_model,role,public_key,is_unmessagable,is_favorite,
                      hops_away,snr,last_heard,first_heard,battery_level,voltage,channel_utilization,air_util_tx,uptime_seconds,
                      position_time,location_source,latitude,longitude,altitude)
    VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)
                      position_time,location_source,precision_bits,latitude,longitude,altitude)
    VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)
    ON CONFLICT(node_id) DO UPDATE SET
      num=excluded.num, short_name=excluded.short_name, long_name=excluded.long_name, macaddr=excluded.macaddr,
      hw_model=excluded.hw_model, role=excluded.role, public_key=excluded.public_key, is_unmessagable=excluded.is_unmessagable,
@@ -690,7 +915,7 @@ def upsert_node(db, node_id, n)
      first_heard=COALESCE(nodes.first_heard, excluded.first_heard, excluded.last_heard),
      battery_level=excluded.battery_level, voltage=excluded.voltage, channel_utilization=excluded.channel_utilization,
      air_util_tx=excluded.air_util_tx, uptime_seconds=excluded.uptime_seconds, position_time=excluded.position_time,
      location_source=excluded.location_source, latitude=excluded.latitude, longitude=excluded.longitude,
      location_source=excluded.location_source, precision_bits=excluded.precision_bits, latitude=excluded.latitude, longitude=excluded.longitude,
      altitude=excluded.altitude
    WHERE COALESCE(excluded.last_heard,0) >= COALESCE(nodes.last_heard,0)
  SQL
@@ -762,8 +987,9 @@ end
# @param latitude [Float, nil] reported latitude.
# @param longitude [Float, nil] reported longitude.
# @param altitude [Float, nil] reported altitude.
# @param precision_bits [Integer, nil] precision estimate provided by the device.
# @param snr [Float, nil] link SNR for the packet.
def update_node_from_position(db, node_id, node_num, rx_time, position_time, location_source, latitude, longitude, altitude, snr)
def update_node_from_position(db, node_id, node_num, rx_time, position_time, location_source, precision_bits, latitude, longitude, altitude, snr)
  num = coerce_integer(node_num)
  id = string_or_nil(node_id)
  if id&.start_with?("!")
@@ -784,6 +1010,7 @@ def update_node_from_position(db, node_id, node_num, rx_time, position_time, loc
  lat = coerce_float(latitude)
  lon = coerce_float(longitude)
  alt = coerce_float(altitude)
  precision = coerce_integer(precision_bits)
  snr_val = coerce_float(snr)

  row = [
@@ -793,6 +1020,7 @@ def update_node_from_position(db, node_id, node_num, rx_time, position_time, loc
    last_heard,
    pos_time,
    loc,
    precision,
    lat,
    lon,
    alt,
@@ -800,8 +1028,8 @@ def update_node_from_position(db, node_id, node_num, rx_time, position_time, loc
  ]
  with_busy_retry do
    db.execute <<~SQL, row
      INSERT INTO nodes(node_id,num,last_heard,first_heard,position_time,location_source,latitude,longitude,altitude,snr)
      VALUES (?,?,?,?,?,?,?,?,?,?)
      INSERT INTO nodes(node_id,num,last_heard,first_heard,position_time,location_source,precision_bits,latitude,longitude,altitude,snr)
      VALUES (?,?,?,?,?,?,?,?,?,?,?)
      ON CONFLICT(node_id) DO UPDATE SET
        num=COALESCE(excluded.num,nodes.num),
        snr=COALESCE(excluded.snr,nodes.snr),
@@ -818,6 +1046,12 @@ def update_node_from_position(db, node_id, node_num, rx_time, position_time, loc
          THEN excluded.location_source
          ELSE nodes.location_source
        END,
        precision_bits=CASE
          WHEN COALESCE(excluded.position_time,0) >= COALESCE(nodes.position_time,0)
            AND excluded.precision_bits IS NOT NULL
          THEN excluded.precision_bits
          ELSE nodes.precision_bits
        END,
        latitude=CASE
          WHEN COALESCE(excluded.position_time,0) >= COALESCE(nodes.position_time,0)
            AND excluded.latitude IS NOT NULL
@@ -868,6 +1102,7 @@ def insert_position(db, payload)
  node_id = canonical if canonical

  ensure_unknown_node(db, node_id || node_num, node_num, heard_time: rx_time)
  touch_node_last_seen(db, node_id || node_num, node_num, rx_time: rx_time, source: :position)

  to_id = string_or_nil(payload["to_id"] || payload["to"])

@@ -1010,6 +1245,7 @@ def insert_position(db, payload)
    rx_time,
    position_time,
    location_source,
    precision_bits,
    lat,
    lon,
    alt,
@@ -1017,6 +1253,314 @@ def insert_position(db, payload)
  )
end

def insert_neighbors(db, payload)
  return unless payload.is_a?(Hash)

  now = Time.now.to_i
  rx_time = coerce_integer(payload["rx_time"])
  rx_time = now if rx_time.nil? || rx_time > now

  raw_node_id = payload["node_id"] || payload["node"] || payload["from_id"]
  raw_node_num = coerce_integer(payload["node_num"]) || coerce_integer(payload["num"])

  canonical_parts = canonical_node_parts(raw_node_id, raw_node_num)
  if canonical_parts
    node_id, node_num, = canonical_parts
  else
    node_id = string_or_nil(raw_node_id)
    canonical = normalize_node_id(db, node_id || raw_node_num)
    node_id = canonical if canonical
    if node_id&.start_with?("!") && raw_node_num.nil?
      begin
        node_num = Integer(node_id.delete_prefix("!"), 16)
      rescue ArgumentError
        node_num = nil
      end
    else
      node_num = raw_node_num
    end
  end

  return unless node_id

  node_id = "!#{node_id.delete_prefix("!").downcase}" if node_id.start_with?("!")

  ensure_unknown_node(db, node_id || node_num, node_num, heard_time: rx_time)
  touch_node_last_seen(db, node_id || node_num, node_num, rx_time: rx_time, source: :neighborinfo)

  neighbor_entries = []
  neighbors_payload = payload["neighbors"]
  neighbors_list = neighbors_payload.is_a?(Array) ? neighbors_payload : []

  neighbors_list.each do |neighbor|
    next unless neighbor.is_a?(Hash)

    neighbor_ref = neighbor["neighbor_id"] || neighbor["node_id"] || neighbor["nodeId"] || neighbor["id"]
    neighbor_num = coerce_integer(
      neighbor["neighbor_num"] || neighbor["node_num"] || neighbor["nodeId"] || neighbor["id"],
    )

    canonical_neighbor = canonical_node_parts(neighbor_ref, neighbor_num)
    if canonical_neighbor
      neighbor_id, neighbor_num, = canonical_neighbor
    else
      neighbor_id = string_or_nil(neighbor_ref)
      canonical_neighbor_id = normalize_node_id(db, neighbor_id || neighbor_num)
      neighbor_id = canonical_neighbor_id if canonical_neighbor_id
      if neighbor_id&.start_with?("!") && neighbor_num.nil?
        begin
          neighbor_num = Integer(neighbor_id.delete_prefix("!"), 16)
        rescue ArgumentError
          neighbor_num = nil
        end
      end
    end

    next unless neighbor_id

    neighbor_id = "!#{neighbor_id.delete_prefix("!").downcase}" if neighbor_id.start_with?("!")

    entry_rx_time = coerce_integer(neighbor["rx_time"]) || rx_time
    entry_rx_time = now if entry_rx_time && entry_rx_time > now
    snr = coerce_float(neighbor["snr"])

    ensure_unknown_node(db, neighbor_id || neighbor_num, neighbor_num, heard_time: entry_rx_time)
    touch_node_last_seen(db, neighbor_id || neighbor_num, neighbor_num, rx_time: entry_rx_time, source: :neighborinfo)

    neighbor_entries << [neighbor_id, snr, entry_rx_time]
  end

  with_busy_retry do
    db.transaction do
      db.execute("DELETE FROM neighbors WHERE node_id = ?", [node_id])
      neighbor_entries.each do |neighbor_id, snr, heard_time|
        db.execute(
          <<~SQL,
            INSERT OR REPLACE INTO neighbors(node_id, neighbor_id, snr, rx_time)
            VALUES (?, ?, ?, ?)
          SQL
          [node_id, neighbor_id, snr, heard_time],
        )
      end
    end
  end
end
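`insert_neighbors` (and `update_node_from_telemetry` below) round-trip between the canonical `!`-prefixed lowercase hex node ID and its 32-bit numeric form. Extracted as a standalone sketch (the helper names here are illustrative, not names from the app):

```ruby
# Canonical string form -> numeric id, as in insert_neighbors:
#   Integer(node_id.delete_prefix("!"), 16), rescuing non-hex input.
def node_num_from_id(node_id)
  Integer(node_id.delete_prefix("!"), 16)
rescue ArgumentError
  nil
end

# Numeric id -> canonical string form, as in update_node_from_telemetry:
#   format("!%08x", num & 0xFFFFFFFF) masks to 32 bits and zero-pads.
def node_id_from_num(num)
  format("!%08x", num & 0xFFFFFFFF)
end

num = node_num_from_id("!dc7494c4")
p num                                # the numeric form of the fixture node
p node_id_from_num(num)              # => "!dc7494c4" (round-trips)
p node_num_from_id("!not-hex")       # => nil (invalid hex rejected)
p node_id_from_num(0x2b)             # => "!0000002b" (zero-padded to 8 digits)
```

The zero-padding and downcasing are what make the string form usable as a stable primary key in the `nodes` table.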

def update_node_from_telemetry(db, node_id, node_num, rx_time, metrics = {})
  num = coerce_integer(node_num)
  id = string_or_nil(node_id)
  if id&.start_with?("!")
    id = "!#{id.delete_prefix("!").downcase}"
  end
  id ||= format("!%08x", num & 0xFFFFFFFF) if num
  return unless id

  ensure_unknown_node(db, id, num, heard_time: rx_time)
  touch_node_last_seen(db, id, num, rx_time: rx_time, source: :telemetry)

  battery = coerce_float(metrics[:battery_level] || metrics["battery_level"])
  voltage = coerce_float(metrics[:voltage] || metrics["voltage"])
  channel_util = coerce_float(metrics[:channel_utilization] || metrics["channel_utilization"])
  air_util_tx = coerce_float(metrics[:air_util_tx] || metrics["air_util_tx"])
  uptime = coerce_integer(metrics[:uptime_seconds] || metrics["uptime_seconds"])

  assignments = []
  params = []

  if num
    assignments << "num = ?"
    params << num
  end

  metric_updates = {
    "battery_level" => battery,
    "voltage" => voltage,
    "channel_utilization" => channel_util,
    "air_util_tx" => air_util_tx,
    "uptime_seconds" => uptime,
  }

  metric_updates.each do |column, value|
    next if value.nil?

    assignments << "#{column} = ?"
    params << value
  end

  return if assignments.empty?

  assignments_sql = assignments.join(", ")
  params << id

  with_busy_retry do
    db.execute("UPDATE nodes SET #{assignments_sql} WHERE node_id = ?", params)
  end
end

def insert_telemetry(db, payload)
  return unless payload.is_a?(Hash)

  telemetry_id = coerce_integer(payload["id"] || payload["packet_id"])
  return unless telemetry_id

  now = Time.now.to_i
  rx_time = coerce_integer(payload["rx_time"])
  rx_time = now if rx_time.nil? || rx_time > now
  rx_iso = string_or_nil(payload["rx_iso"])
  rx_iso ||= Time.at(rx_time).utc.iso8601

  raw_node_id = payload["node_id"] || payload["from_id"] || payload["from"]
  node_id = string_or_nil(raw_node_id)
  node_id = "!#{node_id.delete_prefix("!").downcase}" if node_id&.start_with?("!")
  raw_node_num = coerce_integer(payload["node_num"]) || coerce_integer(payload["num"])

  payload_for_num = payload.dup
  payload_for_num["num"] ||= raw_node_num if raw_node_num
  node_num = resolve_node_num(node_id, payload_for_num)
  node_num ||= raw_node_num

  canonical = normalize_node_id(db, node_id || node_num)
  node_id = canonical if canonical

  from_id = string_or_nil(payload["from_id"]) || node_id
  to_id = string_or_nil(payload["to_id"] || payload["to"])

  telemetry_time = coerce_integer(payload["telemetry_time"] || payload["time"] || payload.dig("telemetry", "time"))
  telemetry_time = nil if telemetry_time && telemetry_time > now

  channel = coerce_integer(payload["channel"])
  portnum = string_or_nil(payload["portnum"])
  hop_limit = coerce_integer(payload["hop_limit"] || payload["hopLimit"])
  snr = coerce_float(payload["snr"])
  rssi = coerce_integer(payload["rssi"])
  bitfield = coerce_integer(payload["bitfield"])
  payload_b64 = string_or_nil(payload["payload_b64"] || payload["payload"])

  telemetry_section = normalize_json_object(payload["telemetry"])
  device_metrics = normalize_json_object(payload["device_metrics"] || payload["deviceMetrics"])
  device_metrics ||= normalize_json_object(telemetry_section["deviceMetrics"]) if telemetry_section&.key?("deviceMetrics")
  environment_metrics = normalize_json_object(payload["environment_metrics"] || payload["environmentMetrics"])
  environment_metrics ||= normalize_json_object(telemetry_section["environmentMetrics"]) if telemetry_section&.key?("environmentMetrics")

  fetch_metric = lambda do |map, *names|
    next nil unless map.is_a?(Hash)
    names.each do |name|
      next unless name
      key = name.to_s
      return map[key] if map.key?(key)
    end
    nil
  end

  battery_level = payload.key?("battery_level") ? payload["battery_level"] : nil
  battery_level = coerce_float(battery_level)
  battery_level ||= coerce_float(fetch_metric.call(device_metrics, :battery_level, :batteryLevel))

  voltage = payload.key?("voltage") ? payload["voltage"] : nil
  voltage = coerce_float(voltage)
  voltage ||= coerce_float(fetch_metric.call(device_metrics, :voltage))

  channel_utilization = payload.key?("channel_utilization") ? payload["channel_utilization"] : nil
  channel_utilization ||= payload["channelUtilization"] if payload.key?("channelUtilization")
  channel_utilization = coerce_float(channel_utilization)
  channel_utilization ||= coerce_float(fetch_metric.call(device_metrics, :channel_utilization, :channelUtilization))

  air_util_tx = payload.key?("air_util_tx") ? payload["air_util_tx"] : nil
  air_util_tx ||= payload["airUtilTx"] if payload.key?("airUtilTx")
  air_util_tx = coerce_float(air_util_tx)
  air_util_tx ||= coerce_float(fetch_metric.call(device_metrics, :air_util_tx, :airUtilTx))

  uptime_seconds = payload.key?("uptime_seconds") ? payload["uptime_seconds"] : nil
  uptime_seconds ||= payload["uptimeSeconds"] if payload.key?("uptimeSeconds")
  uptime_seconds = coerce_integer(uptime_seconds)
  uptime_seconds ||= coerce_integer(fetch_metric.call(device_metrics, :uptime_seconds, :uptimeSeconds))

  temperature = payload.key?("temperature") ? payload["temperature"] : nil
  temperature = coerce_float(temperature)
  temperature ||= coerce_float(fetch_metric.call(environment_metrics, :temperature, :temperatureC, :temperature_c, :tempC))

  relative_humidity = payload.key?("relative_humidity") ? payload["relative_humidity"] : nil
  relative_humidity ||= payload["relativeHumidity"] if payload.key?("relativeHumidity")
  relative_humidity ||= payload["humidity"] if payload.key?("humidity")
  relative_humidity = coerce_float(relative_humidity)
  relative_humidity ||= coerce_float(fetch_metric.call(environment_metrics, :relative_humidity, :relativeHumidity, :humidity))

  barometric_pressure = payload.key?("barometric_pressure") ? payload["barometric_pressure"] : nil
  barometric_pressure ||= payload["barometricPressure"] if payload.key?("barometricPressure")
  barometric_pressure ||= payload["pressure"] if payload.key?("pressure")
  barometric_pressure = coerce_float(barometric_pressure)
  barometric_pressure ||= coerce_float(fetch_metric.call(environment_metrics, :barometric_pressure, :barometricPressure, :pressure))

  row = [
    telemetry_id,
    node_id,
    node_num,
    from_id,
    to_id,
    rx_time,
    rx_iso,
    telemetry_time,
    channel,
    portnum,
    hop_limit,
    snr,
    rssi,
    bitfield,
    payload_b64,
    battery_level,
    voltage,
    channel_utilization,
    air_util_tx,
    uptime_seconds,
    temperature,
    relative_humidity,
    barometric_pressure,
  ]

  with_busy_retry do
    db.execute <<~SQL, row
      INSERT INTO telemetry(id,node_id,node_num,from_id,to_id,rx_time,rx_iso,telemetry_time,channel,portnum,hop_limit,snr,rssi,bitfield,payload_b64,
                            battery_level,voltage,channel_utilization,air_util_tx,uptime_seconds,temperature,relative_humidity,barometric_pressure)
      VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)
      ON CONFLICT(id) DO UPDATE SET
        node_id=COALESCE(excluded.node_id,telemetry.node_id),
        node_num=COALESCE(excluded.node_num,telemetry.node_num),
        from_id=COALESCE(excluded.from_id,telemetry.from_id),
        to_id=COALESCE(excluded.to_id,telemetry.to_id),
        rx_time=excluded.rx_time,
        rx_iso=excluded.rx_iso,
        telemetry_time=COALESCE(excluded.telemetry_time,telemetry.telemetry_time),
        channel=COALESCE(excluded.channel,telemetry.channel),
        portnum=COALESCE(excluded.portnum,telemetry.portnum),
        hop_limit=COALESCE(excluded.hop_limit,telemetry.hop_limit),
        snr=COALESCE(excluded.snr,telemetry.snr),
        rssi=COALESCE(excluded.rssi,telemetry.rssi),
        bitfield=COALESCE(excluded.bitfield,telemetry.bitfield),
        payload_b64=COALESCE(excluded.payload_b64,telemetry.payload_b64),
        battery_level=COALESCE(excluded.battery_level,telemetry.battery_level),
        voltage=COALESCE(excluded.voltage,telemetry.voltage),
        channel_utilization=COALESCE(excluded.channel_utilization,telemetry.channel_utilization),
        air_util_tx=COALESCE(excluded.air_util_tx,telemetry.air_util_tx),
        uptime_seconds=COALESCE(excluded.uptime_seconds,telemetry.uptime_seconds),
        temperature=COALESCE(excluded.temperature,telemetry.temperature),
        relative_humidity=COALESCE(excluded.relative_humidity,telemetry.relative_humidity),
        barometric_pressure=COALESCE(excluded.barometric_pressure,telemetry.barometric_pressure)
    SQL
  end

  update_node_from_telemetry(
    db,
    node_id || from_id,
    node_num,
    rx_time,
    battery_level: battery_level,
    voltage: voltage,
    channel_utilization: channel_utilization,
    air_util_tx: air_util_tx,
    uptime_seconds: uptime_seconds,
  )
end
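For each metric, `insert_telemetry` tries the flat snake_case payload key first, then the flat camelCase key, then falls back to the nested metrics map via `fetch_metric`. A trimmed-down illustration of that precedence with sample values (the payload shape here is illustrative):

```ruby
# Reduced copy of the fetch_metric lambda: first matching key name wins.
fetch_metric = lambda do |map, *names|
  next nil unless map.is_a?(Hash)
  names.each do |name|
    key = name.to_s
    return map[key] if map.key?(key)
  end
  nil
end

payload = { "temperature" => 21.98 }                       # flat snake_case key present
environment_metrics = { "relativeHumidity" => 39.475586 }  # nested camelCase only

# Flat key wins when present:
temperature = payload.key?("temperature") ? payload["temperature"] : nil
temperature ||= fetch_metric.call(environment_metrics, :temperature, :temperatureC)

# No flat key, so this falls through to the nested metrics map:
relative_humidity = payload.key?("relative_humidity") ? payload["relative_humidity"] : nil
relative_humidity ||= payload["relativeHumidity"] if payload.key?("relativeHumidity")
relative_humidity ||= fetch_metric.call(environment_metrics, :relative_humidity, :relativeHumidity, :humidity)

p temperature        # => 21.98 (from the flat payload key)
p relative_humidity  # => 39.475586 (from environmentMetrics)
```

The explicit `payload.key?` checks matter: they let an ingestor send `nil` for a flat key without that `nil` masking a value available in the nested map.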

# Insert a text message if it does not already exist.
#
# @param db [SQLite3::Database] open database handle.
@@ -1063,6 +1607,13 @@ def insert_message(db, m)
  encrypted = string_or_nil(m["encrypted"])

  ensure_unknown_node(db, from_id || raw_from_id, m["from_num"], heard_time: rx_time)
  touch_node_last_seen(
    db,
    from_id || raw_from_id || m["from_num"],
    m["from_num"],
    rx_time: rx_time,
    source: :message,
  )

  row = [
    msg_id,
@@ -1182,6 +1733,7 @@ end
#
# Accepts an array or object describing text messages and stores each entry.
post "/api/messages" do
  halt 404 if private_mode?
  require_token!
  content_type :json
  begin
@@ -1222,6 +1774,50 @@ ensure
  db&.close
end

# POST /api/neighbors
#
# Accepts an array or object describing neighbor tuples and stores each entry.
post "/api/neighbors" do
  require_token!
  content_type :json
  begin
    data = JSON.parse(read_json_body)
  rescue JSON::ParserError
    halt 400, { error: "invalid JSON" }.to_json
  end
  neighbor_payloads = data.is_a?(Array) ? data : [data]
  halt 400, { error: "too many neighbor packets" }.to_json if neighbor_payloads.size > 1000
  db = open_database
  neighbor_payloads.each do |packet|
    insert_neighbors(db, packet)
  end
  { status: "ok" }.to_json
ensure
  db&.close
end

# POST /api/telemetry
#
# Accepts an array or object describing telemetry packets and stores each entry.
post "/api/telemetry" do
  require_token!
  content_type :json
  begin
    data = JSON.parse(read_json_body)
  rescue JSON::ParserError
    halt 400, { error: "invalid JSON" }.to_json
  end
  telemetry_packets = data.is_a?(Array) ? data : [data]
  halt 400, { error: "too many telemetry packets" }.to_json if telemetry_packets.size > 1000
  db = open_database
  telemetry_packets.each do |packet|
    insert_telemetry(db, packet)
  end
  { status: "ok" }.to_json
ensure
  db&.close
end

get "/potatomesh-logo.svg" do
  # Sinatra knows the app root via settings.root (usually the directory containing app.rb)
  path = File.expand_path("potatomesh-logo.svg", settings.public_folder)
@@ -1258,5 +1854,6 @@ get "/" do
    max_node_distance_km: MAX_NODE_DISTANCE_KM,
    matrix_room: sanitized_matrix_room,
    version: APP_VERSION,
    private_mode: private_mode?,
  }
end
+385 -2
@@ -29,6 +29,8 @@ RSpec.describe "Potato Mesh Sinatra app" do
|
||||
|
||||
def with_db(readonly: false)
|
||||
db = SQLite3::Database.new(DB_PATH, readonly: readonly)
|
||||
db.busy_timeout = DB_BUSY_TIMEOUT_MS
|
||||
db.execute("PRAGMA foreign_keys = ON")
|
||||
yield db
|
||||
ensure
|
||||
db&.close
|
||||
@@ -36,9 +38,11 @@ RSpec.describe "Potato Mesh Sinatra app" do

  def clear_database
    with_db do |db|
      db.execute("DELETE FROM neighbors")
      db.execute("DELETE FROM messages")
      db.execute("DELETE FROM nodes")
      db.execute("DELETE FROM positions")
      db.execute("DELETE FROM telemetry")
    end
  end

@@ -73,6 +77,8 @@ RSpec.describe "Potato Mesh Sinatra app" do
      "latitude" => node["latitude"],
      "longitude" => node["longitude"],
      "altitude" => node["altitude"],
      "locationSource" => node["location_source"],
      "precisionBits" => node["precision_bits"],
    )
    payload["position"] = position unless position.empty?

@@ -100,6 +106,8 @@ RSpec.describe "Potato Mesh Sinatra app" do
      "channel_utilization" => node["channel_utilization"],
      "air_util_tx" => node["air_util_tx"],
      "position_time" => node["position_time"],
      "location_source" => node["location_source"],
      "precision_bits" => node["precision_bits"],
      "latitude" => node["latitude"],
      "longitude" => node["longitude"],
      "altitude" => node["altitude"],
@@ -143,6 +151,7 @@ RSpec.describe "Potato Mesh Sinatra app" do
  end

  let(:nodes_fixture) { JSON.parse(File.read(fixture_path("nodes.json"))) }
  let(:messages_fixture) { JSON.parse(File.read(fixture_path("messages.json"))) }
  let(:telemetry_fixture) { JSON.parse(File.read(fixture_path("telemetry.json"))) }
  let(:reference_time) do
    latest = nodes_fixture.map { |node| node["last_heard"] }.compact.max
    Time.at((latest || Time.now.to_i) + 1000)
@@ -150,13 +159,20 @@ RSpec.describe "Potato Mesh Sinatra app" do

  before do
    @original_token = ENV["API_TOKEN"]
    @original_private = ENV["PRIVATE"]
    ENV["API_TOKEN"] = api_token
    ENV.delete("PRIVATE")
    allow(Time).to receive(:now).and_return(reference_time)
    clear_database
  end

  after do
    ENV["API_TOKEN"] = @original_token
    if @original_private.nil?
      ENV.delete("PRIVATE")
    else
      ENV["PRIVATE"] = @original_private
    end
  end

  describe "logging configuration" do
@@ -705,7 +721,7 @@ RSpec.describe "Potato Mesh Sinatra app" do
      with_db(readonly: true) do |db|
        db.results_as_hash = true
        node_row = db.get_first_row(
-         "SELECT last_heard, position_time, latitude, longitude, altitude, location_source, snr FROM nodes WHERE node_id = ?",
+         "SELECT last_heard, position_time, latitude, longitude, altitude, location_source, precision_bits, snr FROM nodes WHERE node_id = ?",
          [node_id],
        )
        expect(node_row["last_heard"]).to eq(rx_time)
@@ -714,6 +730,7 @@ RSpec.describe "Potato Mesh Sinatra app" do
        expect_same_value(node_row["longitude"], 13.4)
        expect_same_value(node_row["altitude"], 42.0)
        expect(node_row["location_source"]).to eq("LOC_INTERNAL")
        expect(node_row["precision_bits"]).to eq(15)
        expect_same_value(node_row["snr"], -8.5)
      end
    end
@@ -843,6 +860,203 @@ RSpec.describe "Potato Mesh Sinatra app" do
    end
  end

  describe "POST /api/neighbors" do
    it "stores neighbor tuples and updates node metadata" do
      rx_time = reference_time.to_i - 120
      neighbor_rx_time = rx_time - 30
      payload = {
        "node_id" => "!abc123ef",
        "node_num" => 0xabc123ef,
        "rx_time" => rx_time,
        "neighbors" => [
          { "node_id" => "!00ff0011", "snr" => -7.5 },
          { "node_id" => 0x11223344, "snr" => 3.25, "rx_time" => neighbor_rx_time },
        ],
      }

      post "/api/neighbors", payload.to_json, auth_headers

      expect(last_response).to be_ok
      expect(JSON.parse(last_response.body)).to eq("status" => "ok")

      with_db(readonly: true) do |db|
        db.results_as_hash = true
        rows = db.execute(
          "SELECT node_id, neighbor_id, snr, rx_time FROM neighbors ORDER BY neighbor_id",
        )

        expect(rows.size).to eq(2)
        expect(rows[0]["node_id"]).to eq("!abc123ef")
        expect(rows[0]["neighbor_id"]).to eq("!00ff0011")
        expect_same_value(rows[0]["snr"], -7.5)
        expect(rows[0]["rx_time"]).to eq(rx_time)

        expect(rows[1]["node_id"]).to eq("!abc123ef")
        expect(rows[1]["neighbor_id"]).to eq("!11223344")
        expect_same_value(rows[1]["snr"], 3.25)
        expect(rows[1]["rx_time"]).to eq(neighbor_rx_time)
      end

      get "/api/neighbors"

      expect(last_response).to be_ok
      neighbors = JSON.parse(last_response.body)
      expect(neighbors.map { |row| row["neighbor_id"] }).to contain_exactly("!00ff0011", "!11223344")
      expect(neighbors.first).to include("node_id" => "!abc123ef")
      expect(neighbors.first["rx_iso"]).to be_a(String)

      with_db(readonly: true) do |db|
        db.results_as_hash = true
        node_rows = db.execute(
          "SELECT node_id, last_heard FROM nodes ORDER BY node_id",
        )

        expect(node_rows.size).to eq(3)
        origin = node_rows.find { |row| row["node_id"] == "!abc123ef" }
        expect(origin["last_heard"]).to eq(rx_time)
        neighbor_one = node_rows.find { |row| row["node_id"] == "!00ff0011" }
        expect(neighbor_one["last_heard"]).to eq(rx_time)
        neighbor_two = node_rows.find { |row| row["node_id"] == "!11223344" }
        expect(neighbor_two["last_heard"]).to eq(neighbor_rx_time)
      end
    end

    it "handles broadcasts with no neighbors" do
      rx_time = reference_time.to_i - 60
      payload = {
        "node_id" => "!cafebabe",
        "rx_time" => rx_time,
        "neighbors" => [],
      }

      post "/api/neighbors", payload.to_json, auth_headers

      expect(last_response).to be_ok
      expect(JSON.parse(last_response.body)).to eq("status" => "ok")

      with_db(readonly: true) do |db|
        count = db.get_first_value("SELECT COUNT(*) FROM neighbors")
        expect(count).to eq(0)

        db.results_as_hash = true
        row = db.get_first_row(
          "SELECT node_id, last_heard FROM nodes WHERE node_id = ?",
          ["!cafebabe"],
        )
        expect(row).not_to be_nil
        expect(row["last_heard"]).to eq(rx_time)
      end

      get "/api/neighbors"
      expect(last_response).to be_ok
      expect(JSON.parse(last_response.body)).to be_empty
    end

    it "returns 400 when more than 1000 neighbor packets are provided" do
      payload = Array.new(1001) do |i|
        { "node_id" => format("!%08x", i), "rx_time" => reference_time.to_i - i }
      end

      post "/api/neighbors", payload.to_json, auth_headers

      expect(last_response.status).to eq(400)
      expect(JSON.parse(last_response.body)).to eq("error" => "too many neighbor packets")

      with_db(readonly: true) do |db|
        count = db.get_first_value("SELECT COUNT(*) FROM neighbors")
        expect(count).to eq(0)
      end
    end
  end

  describe "POST /api/telemetry" do
    it "stores telemetry packets and updates node metrics" do
      payload = telemetry_fixture

      post "/api/telemetry", payload.to_json, auth_headers

      expect(last_response).to be_ok
      expect(JSON.parse(last_response.body)).to eq("status" => "ok")

      with_db(readonly: true) do |db|
        db.results_as_hash = true
        rows = db.execute(
          "SELECT * FROM telemetry ORDER BY id",
        )

        expect(rows.size).to eq(payload.size)

        first = rows.find { |row| row["id"] == payload[0]["id"] }
        expect(first).not_to be_nil
        expect(first["node_id"]).to eq(payload[0]["node_id"])
        expect(first["rx_time"]).to eq(payload[0]["rx_time"])
        expect_same_value(first["battery_level"], payload[0]["battery_level"])
        expect_same_value(first["voltage"], payload[0].dig("device_metrics", "voltage"))
        expect_same_value(first["channel_utilization"], payload[0].dig("device_metrics", "channelUtilization"))
        expect_same_value(first["air_util_tx"], payload[0].dig("device_metrics", "airUtilTx"))
        expect(first["uptime_seconds"]).to eq(payload[0].dig("device_metrics", "uptimeSeconds"))

        environment_row = rows.find { |row| row["id"] == payload[1]["id"] }
        expect(environment_row["temperature"]).to be_within(1e-6).of(payload[1].dig("environment_metrics", "temperature"))
        expect(environment_row["relative_humidity"]).to be_within(1e-6).of(payload[1].dig("environment_metrics", "relativeHumidity"))
        expect(environment_row["barometric_pressure"]).to be_within(1e-6).of(payload[1].dig("environment_metrics", "barometricPressure"))
      end

      with_db(readonly: true) do |db|
        db.results_as_hash = true

        metrics_node = db.get_first_row(
          "SELECT battery_level, voltage, channel_utilization, air_util_tx, uptime_seconds, last_heard, first_heard FROM nodes WHERE node_id = ?",
          [payload[0]["node_id"]],
        )
        expect_same_value(metrics_node["battery_level"], payload[0]["battery_level"])
        expect_same_value(metrics_node["voltage"], payload[0]["device_metrics"]["voltage"])
        expect_same_value(metrics_node["channel_utilization"], payload[0]["device_metrics"]["channelUtilization"])
        expect_same_value(metrics_node["air_util_tx"], payload[0]["device_metrics"]["airUtilTx"])
        expect(metrics_node["uptime_seconds"]).to eq(payload[0]["device_metrics"]["uptimeSeconds"])
        expect(metrics_node["last_heard"]).to eq(payload[0]["rx_time"])
        expect(metrics_node["first_heard"]).to eq(payload[0]["rx_time"])

        env_node = db.get_first_row(
          "SELECT last_heard, battery_level, voltage FROM nodes WHERE node_id = ?",
          [payload[1]["node_id"]],
        )
        expect(env_node["last_heard"]).to eq(payload[1]["rx_time"])
        expect(env_node["battery_level"]).to be_nil
        expect(env_node["voltage"]).to be_nil

        local_node = db.get_first_row(
          "SELECT battery_level, uptime_seconds, last_heard FROM nodes WHERE node_id = ?",
          [payload[2]["node_id"]],
        )
        expect_same_value(local_node["battery_level"], payload[2]["device_metrics"]["battery_level"])
        expect(local_node["uptime_seconds"]).to eq(payload[2]["device_metrics"]["uptime_seconds"])
        expect(local_node["last_heard"]).to eq(payload[2]["rx_time"])
      end
    end

    it "returns 400 when the payload is not valid JSON" do
      post "/api/telemetry", "{", auth_headers

      expect(last_response.status).to eq(400)
      expect(JSON.parse(last_response.body)).to eq("error" => "invalid JSON")
    end

    it "returns 400 when more than 1000 telemetry packets are provided" do
      payload = Array.new(1001) { |i| { "id" => i + 1, "rx_time" => reference_time.to_i - i } }

      post "/api/telemetry", payload.to_json, auth_headers

      expect(last_response.status).to eq(400)
      expect(JSON.parse(last_response.body)).to eq("error" => "too many telemetry packets")

      with_db(readonly: true) do |db|
        count = db.get_first_value("SELECT COUNT(*) FROM telemetry")
        expect(count).to eq(0)
      end
    end
  end
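The 400 responses exercised above mean a client holding more than 1000 packets has to split them across requests. A hypothetical client-side helper sketching that split — the constant and method names are ours, not the app's:

```ruby
require "json"

MAX_PACKETS_PER_REQUEST = 1000 # mirrors the cap the endpoints enforce

# Split an oversized packet list into request-sized JSON bodies.
def chunk_packets(packets)
  packets.each_slice(MAX_PACKETS_PER_REQUEST).map(&:to_json)
end

bodies = chunk_packets(Array.new(2500) { |i| { "id" => i + 1 } })
```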

  it "returns 400 when more than 1000 messages are provided" do
    payload = Array.new(1001) { |i| { "packet_id" => i + 1 } }

@@ -1004,6 +1218,13 @@ RSpec.describe "Potato Mesh Sinatra app" do
      expect(row["to_id"]).to eq(receiver_id)
      expect(row["text"]).to be_nil
      expect(row["encrypted"]).to eq(encrypted_b64)

      node_row = db.get_first_row(
        "SELECT last_heard FROM nodes WHERE node_id = ?",
        [sender_id],
      )

      expect(node_row["last_heard"]).to eq(payload["rx_time"])
    end

    get "/api/messages"

@@ -1014,6 +1235,44 @@ RSpec.describe "Potato Mesh Sinatra app" do
      expect(messages).to be_empty
    end

  it "updates node last_heard for plaintext messages" do
    node_id = "!plainmsg01"
    initial_first = reference_time.to_i - 600
    initial_last = reference_time.to_i - 300

    with_db do |db|
      db.execute(
        "INSERT INTO nodes(node_id, last_heard, first_heard) VALUES (?,?,?)",
        [node_id, initial_last, initial_first],
      )
    end

    rx_time = reference_time.to_i - 120
    payload = {
      "packet_id" => 888_001,
      "rx_time" => rx_time,
      "rx_iso" => Time.at(rx_time).utc.iso8601,
      "from_id" => node_id,
      "text" => "plaintext update",
    }

    post "/api/messages", payload.to_json, auth_headers

    expect(last_response).to be_ok
    expect(JSON.parse(last_response.body)).to eq("status" => "ok")

    with_db(readonly: true) do |db|
      db.results_as_hash = true
      row = db.get_first_row(
        "SELECT last_heard, first_heard FROM nodes WHERE node_id = ?",
        [node_id],
      )

      expect(row["last_heard"]).to eq(rx_time)
      expect(row["first_heard"]).to eq(initial_first)
    end
  end

  it "stores messages containing SQL control characters without executing them" do
    payload = {
      "packet_id" => 404,

@@ -1151,6 +1410,8 @@ RSpec.describe "Potato Mesh Sinatra app" do
      expect_same_value(actual_row["channel_utilization"], expected["channel_utilization"])
      expect_same_value(actual_row["air_util_tx"], expected["air_util_tx"])
      expect_same_value(actual_row["position_time"], expected["position_time"])
      expect(actual_row["location_source"]).to eq(expected["location_source"])
      expect_same_value(actual_row["precision_bits"], expected["precision_bits"])
      expect_same_value(actual_row["latitude"], expected["latitude"])
      expect_same_value(actual_row["longitude"], expected["longitude"])
      expect_same_value(actual_row["altitude"], expected["altitude"])
@@ -1211,6 +1472,38 @@ RSpec.describe "Potato Mesh Sinatra app" do
      node_aliases[num.to_s] ||= canonical
    end

    latest_rx_by_node = {}
    messages_fixture.each do |message|
      rx_time = message["rx_time"]
      next unless rx_time

      canonical = nil
      from_id = message["from_id"]

      if from_id.is_a?(String)
        trimmed = from_id.strip
        unless trimmed.empty?
          if trimmed.match?(/\A[0-9]+\z/)
            canonical = node_aliases[trimmed] || trimmed
          else
            canonical = trimmed
          end
        end
      end

      canonical ||= message.dig("node", "node_id")

      if canonical.nil?
        num = message.dig("node", "num")
        canonical = node_aliases[num.to_s] if num
      end

      next unless canonical

      existing = latest_rx_by_node[canonical]
      latest_rx_by_node[canonical] = [existing, rx_time].compact.max
    end

    messages_fixture.each do |message|
      expected = message.reject { |key, _| key == "node" }
      actual_row = actual_by_id.fetch(message["id"])
@@ -1263,7 +1556,12 @@ RSpec.describe "Potato Mesh Sinatra app" do
      expect_same_value(node_actual["snr"], node_expected["snr"])
      expect_same_value(node_actual["battery_level"], node_expected["battery_level"])
      expect_same_value(node_actual["voltage"], node_expected["voltage"])
-     expect(node_actual["last_heard"]).to eq(node_expected["last_heard"])
+     expected_last_heard = node_expected["last_heard"]
+     latest_rx = latest_rx_by_node[node_expected["node_id"]]
+     if latest_rx
+       expected_last_heard = [expected_last_heard, latest_rx].compact.max
+     end
+     expect(node_actual["last_heard"]).to eq(expected_last_heard)
      expect(node_actual["first_heard"]).to eq(node_expected["first_heard"])
      expect_same_value(node_actual["latitude"], node_expected["latitude"])
      expect_same_value(node_actual["longitude"], node_expected["longitude"])
@@ -1311,6 +1609,55 @@ RSpec.describe "Potato Mesh Sinatra app" do
    end
  end

  context "when private mode is enabled" do
    before do
      ENV["PRIVATE"] = "1"
    end

    it "returns 404 for GET /api/messages" do
      get "/api/messages"
      expect(last_response.status).to eq(404)
    end

    it "returns 404 for POST /api/messages" do
      post "/api/messages", {}.to_json, auth_headers
      expect(last_response.status).to eq(404)
    end

    it "excludes hidden clients from the nodes API" do
      now = reference_time.to_i
      with_db do |db|
        db.execute(
          "INSERT INTO nodes(node_id, short_name, long_name, hw_model, role, snr, last_heard, first_heard) VALUES(?,?,?,?,?,?,?,?)",
          ["!hidden", "hidn", "Hidden", "TBEAM", "CLIENT_HIDDEN", 0.0, now, now],
        )
        db.execute(
          "INSERT INTO nodes(node_id, short_name, long_name, hw_model, role, snr, last_heard, first_heard) VALUES(?,?,?,?,?,?,?,?)",
          ["!visible", "vis", "Visible", "TBEAM", "CLIENT", 1.0, now, now],
        )
      end

      get "/api/nodes?limit=10"

      expect(last_response).to be_ok
      nodes = JSON.parse(last_response.body)
      ids = nodes.map { |node| node["node_id"] }
      expect(ids).to include("!visible")
      expect(ids).not_to include("!hidden")
    end

    it "removes the chat interface from the homepage" do
      get "/"

      expect(last_response).to be_ok
      body = last_response.body
      expect(body).not_to include('<div id="chat"')
      expect(body).to include("const CHAT_ENABLED = false;")
      expect(body).not_to include("Track nodes, messages, and coverage in real time.")
      expect(body).to include("Track nodes and coverage in real time.")
    end
  end

  describe "GET /api/positions" do
    it "returns stored positions ordered by receive time" do
      node_id = "!specfetch"
@@ -1324,6 +1671,8 @@ RSpec.describe "Potato Mesh Sinatra app" do
        "position_time" => rx_time - 5,
        "latitude" => 52.0 + idx,
        "longitude" => 13.0 + idx,
        "location_source" => "LOC_TEST",
        "precision_bits" => 7 + idx,
        "payload_b64" => "AQI=",
      }
      post "/api/positions", payload.to_json, auth_headers
@@ -1344,7 +1693,41 @@ RSpec.describe "Potato Mesh Sinatra app" do
      expect(entry["position_time_iso"]).to eq(Time.at(rx_times.last - 5).utc.iso8601)
      expect(entry["latitude"]).to eq(53.0)
      expect(entry["longitude"]).to eq(14.0)
      expect(entry["location_source"]).to eq("LOC_TEST")
      expect(entry["precision_bits"]).to eq(8)
      expect(entry["payload_b64"]).to eq("AQI=")
    end
  end

  describe "GET /api/telemetry" do
    it "returns stored telemetry ordered by receive time" do
      post "/api/telemetry", telemetry_fixture.to_json, auth_headers
      expect(last_response).to be_ok

      get "/api/telemetry?limit=2"

      expect(last_response).to be_ok
      data = JSON.parse(last_response.body)
      expect(data.length).to eq(2)

      latest = telemetry_fixture.max_by { |entry| entry["rx_time"] }
      second_latest = telemetry_fixture.sort_by { |entry| entry["rx_time"] }[-2]

      first_entry = data.first
      expect(first_entry["id"]).to eq(latest["id"])
      expect(first_entry["node_id"]).to eq(latest["node_id"])
      expect(first_entry["rx_time"]).to eq(latest["rx_time"])
      expect(first_entry["telemetry_time"]).to eq(latest["telemetry_time"])
      expect(first_entry["telemetry_time_iso"]).to eq(Time.at(latest["telemetry_time"]).utc.iso8601)
      expect(first_entry).not_to have_key("device_metrics")
      expect_same_value(first_entry["battery_level"], latest.dig("device_metrics", "battery_level") || latest.dig("device_metrics", "batteryLevel"))

      second_entry = data.last
      expect(second_entry["id"]).to eq(second_latest["id"])
      expect(second_entry).not_to have_key("environment_metrics")
      expect(second_entry["temperature"]).to be_within(1e-6).of(second_latest["environment_metrics"]["temperature"])
      expect(second_entry["relative_humidity"]).to be_within(1e-6).of(second_latest["environment_metrics"]["relativeHumidity"])
      expect(second_entry["barometric_pressure"]).to be_within(1e-6).of(second_latest["environment_metrics"]["barometricPressure"])
    end
  end
end
+1069 −164
File diff suppressed because it is too large