Compare commits


2 Commits

Author | SHA1       | Message                                           | Date
l5y    | fa40adf454 | Default to host networking in Compose             | 2025-09-22 09:19:39 +02:00
l5y    | 6091ef92e5 | Require time library for ISO timestamp formatting | 2025-09-22 09:07:40 +02:00
17 changed files with 227 additions and 3113 deletions
+3 -3
@@ -22,6 +22,9 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install black pytest pytest-cov meshtastic
- name: Lint with black
run: |
black --check ./
- name: Test with pytest and coverage
run: |
mkdir -p reports
@@ -42,6 +45,3 @@ jobs:
token: ${{ secrets.CODECOV_TOKEN }}
files: reports/python-junit.xml
flags: python-ingestor
- name: Lint with black
run: |
black --check ./
+2 -2
@@ -29,6 +29,8 @@ jobs:
working-directory: ./web
- name: Set up dependencies
run: bundle install
- name: Run rufo
run: bundle exec rufo --check .
- name: Run tests
run: |
mkdir -p tmp/test-results
@@ -51,5 +53,3 @@ jobs:
flags: ruby-${{ matrix.ruby-version }}
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
- name: Run rufo
run: bundle exec rufo --check .
-1
@@ -65,4 +65,3 @@ reports/
# AI planning and documentation
ai_docs/
*.log
+2 -7
@@ -32,12 +32,11 @@ Edit `.env` file or run `./configure.sh` to set:
## Device Setup
**Find your device:**
```bash
# Linux
ls /dev/ttyACM* /dev/ttyUSB*
# macOS
ls /dev/cu.usbserial-*
# Windows
@@ -45,7 +44,6 @@ ls /dev/ttyS*
```
**Set permissions (Linux/macOS):**
```bash
sudo chmod 666 /dev/ttyACM0
# Or add user to dialout group
@@ -74,7 +72,6 @@ docker-compose pull && docker-compose up -d
## Troubleshooting
**Device access issues:**
```bash
# Check device exists and permissions
ls -la /dev/ttyACM0
@@ -84,14 +81,12 @@ sudo chmod 666 /dev/ttyACM0
```
**Port conflicts:**
```bash
# Find what's using port 41447
sudo lsof -i :41447
```
**Container issues:**
```bash
# Check logs
docker-compose logs
@@ -100,4 +95,4 @@ docker-compose logs
docker-compose restart
```
For more Docker help, see [Docker Compose documentation](https://docs.docker.com/compose/).
+7 -20
@@ -16,9 +16,9 @@ A simple Meshtastic-powered node dashboard for your local community. _No MQTT cl
Live demo for Berlin #MediumFast: [potatomesh.net](https://potatomesh.net)
![screenshot of the third version](./scrot-0.3.png)
![screenshot of the second version](./scrot-0.2.png)
## Quick Start with Docker
## 🐳 Quick Start with Docker
```bash
./configure.sh # Configure your setup
@@ -27,8 +27,8 @@ docker-compose logs -f # View logs
```
PotatoMesh uses host networking by default so it can run on restricted
systems where Docker cannot create bridged interfaces. The web UI listens on
`http://127.0.0.1:41447` immediately without explicit port mappings. If you
are using Docker Desktop (macOS/Windows) or otherwise require bridged
networking, enable the Compose profile with:
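The two networking modes described above can be sketched in Compose terms as follows. This is a hypothetical fragment: the service and profile names are illustrative and are not copied from the repository's `docker-compose.yml`.

```yaml
services:
  web:
    network_mode: host        # default: the UI binds 127.0.0.1:41447 directly
  web-bridged:
    profiles: ["bridge"]      # hypothetical profile name for Docker Desktop setups
    ports:
      - "41447:41447"         # explicit mapping needed once bridged networking is used
```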
@@ -73,10 +73,6 @@ The web app can be configured with environment variables (defaults shown):
* `MAX_NODE_DISTANCE_KM` - hide nodes farther than this distance from the center (default: `137`)
* `MATRIX_ROOM` - matrix room id for a footer link (default: `#meshtastic-berlin:matrix.org`)
The application derives SEO-friendly document titles, descriptions, and social
preview tags from these existing configuration values and reuses the bundled
logo for Open Graph and Twitter cards.
Example:
```bash
@@ -88,10 +84,8 @@ SITE_NAME="Meshtastic Berlin" MAP_CENTER_LAT=52.502889 MAP_CENTER_LON=13.404194
The web app contains an API:
* GET `/api/nodes?limit=100` - returns the latest 100 nodes reported to the app
* GET `/api/positions?limit=100` - returns the latest 100 position data
* GET `/api/messages?limit=100` - returns the latest 100 messages
* POST `/api/nodes` - upserts nodes provided as JSON object mapping node ids to node data (requires `Authorization: Bearer <API_TOKEN>`)
* POST `/api/positions` - appends positions provided as a JSON object or array (requires `Authorization: Bearer <API_TOKEN>`)
* POST `/api/messages` - appends messages provided as a JSON object or array (requires `Authorization: Bearer <API_TOKEN>`)
The `API_TOKEN` environment variable must be set to a non-empty value and match the token supplied in the `Authorization` header for `POST` requests.
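As a sketch of the authentication contract described above (endpoint path from this README; the instance URL and token are placeholder values), a client request could be built like this:

```python
import json
import urllib.request

API_TOKEN = "change-me"               # must match the server's API_TOKEN env var
INSTANCE = "http://127.0.0.1:41447"   # default host-networking address

def build_messages_request(messages):
    """Build an authorised POST to /api/messages (constructed, not sent)."""
    body = json.dumps(messages).encode("utf-8")
    return urllib.request.Request(
        f"{INSTANCE}/api/messages",
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
    )

req = build_messages_request([{"text": "hello mesh"}])
```

Sending it with `urllib.request.urlopen(req)` would be rejected by the server whenever the bearer token does not match its configured `API_TOKEN`.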
@@ -104,9 +98,8 @@ accepts data through the API POST endpoints. Benefit is, here multiple nodes acr
community can feed the dashboard with data. The web app keys messages and nodes
by ID, so entries are not duplicated.
For convenience, the directory `./data` contains a Python ingestor. It connects to a
Meshtastic node via serial port or to a remote device that exposes the Meshtastic TCP
interface to gather nodes and messages seen by the node.
For convenience, the directory `./data` contains a Python ingestor. It connects to a local
Meshtastic node via serial port to gather nodes and messages seen by the node.
```bash
pacman -S python
@@ -133,13 +126,7 @@ Mesh daemon: nodes+messages → http://127.0.0.1 | port=41447 | channel=0
Run the script with `POTATOMESH_INSTANCE` and `API_TOKEN` to keep updating
node records and parsing new incoming messages. Enable debug output with `DEBUG=1`,
specify the serial port with `MESH_SERIAL` (default `/dev/ttyACM0`) or set it to an IP
address (for example `192.168.1.20:4403`) to use the Meshtastic TCP interface.
## Demos
* <https://potatomesh.net/>
* <https://vrs.kdd2105.ru/>
specify the serial port with `MESH_SERIAL` (default `/dev/ttyACM0`), etc.
## License
+23 -821
@@ -22,21 +22,16 @@ them to the accompanying web API. It also provides the long-running daemon
entry point that performs these synchronisation tasks.
"""
import base64
import dataclasses
import heapq
import ipaddress
import itertools
import json, os, time, threading, signal, urllib.request, urllib.error, urllib.parse
import math
import json, os, time, threading, signal, urllib.request, urllib.error
from collections.abc import Mapping
from meshtastic.serial_interface import SerialInterface
from meshtastic.tcp_interface import TCPInterface
from pubsub import pub
from google.protobuf.json_format import MessageToDict
from google.protobuf.message import Message as ProtoMessage
from google.protobuf.message import DecodeError
# --- Config (env overrides) ---------------------------------------------------
PORT = os.environ.get("MESH_SERIAL", "/dev/ttyACM0")
@@ -50,15 +45,6 @@ API_TOKEN = os.environ.get("API_TOKEN", "")
# --- Serial interface helpers --------------------------------------------------
_DEFAULT_TCP_PORT = 4403
# Reconnect configuration: retry delays are adjustable via environment
# variables to ease testing while keeping sensible defaults in production.
_RECONNECT_INITIAL_DELAY_SECS = float(os.environ.get("MESH_RECONNECT_INITIAL", "5"))
_RECONNECT_MAX_DELAY_SECS = float(os.environ.get("MESH_RECONNECT_MAX", "60"))
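The retry cadence these settings control can be sketched as a doubling schedule capped at the maximum. This helper is illustrative only (it is not part of the module); the defaults mirror the `MESH_RECONNECT_INITIAL`/`MESH_RECONNECT_MAX` values of 5 and 60 seconds above.

```python
def backoff_delays(initial=5.0, maximum=60.0, count=5):
    """Yield reconnect delays: start at `initial`, double each retry, cap at `maximum`."""
    delay = initial
    for _ in range(count):
        yield delay
        delay = min(delay * 2, maximum) if maximum > 0 else delay * 2

schedule = list(backoff_delays())
```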
class _DummySerialInterface:
"""In-memory replacement for ``meshtastic.serial_interface.SerialInterface``.
@@ -76,58 +62,6 @@ class _DummySerialInterface:
pass
def _parse_network_target(value: str) -> tuple[str, int] | None:
"""Return ``(host, port)`` when ``value`` is an IP address string.
The ingestor accepts values such as ``192.168.1.10`` or
``tcp://192.168.1.10:4500`` for ``MESH_SERIAL`` to support Meshtastic
devices shared via TCP. Serial device paths (``/dev/ttyACM0``) are ignored
by returning ``None``.
"""
if not value:
return None
value = value.strip()
if not value:
return None
def _validated_result(host: str | None, port: int | None):
if not host:
return None
try:
ipaddress.ip_address(host)
except ValueError:
return None
return host, port or _DEFAULT_TCP_PORT
parsed_values = []
if "://" in value:
parsed_values.append(urllib.parse.urlparse(value, scheme="tcp"))
parsed_values.append(urllib.parse.urlparse(f"//{value}", scheme="tcp"))
for parsed in parsed_values:
try:
port = parsed.port
except ValueError:
port = None
result = _validated_result(parsed.hostname, port)
if result:
return result
if value.count(":") == 1 and not value.startswith("["):
host, _, port_text = value.partition(":")
try:
port = int(port_text) if port_text else None
except ValueError:
port = None
result = _validated_result(host, port)
if result:
return result
return _validated_result(value, None)
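The contract of the removed helper can be condensed into the following simplified sketch. It is not the original implementation: it omits the bare IPv6 and `host:port` fallback paths, but it shows the core rule that only IP literals (optionally with a scheme or port) become TCP targets.

```python
import ipaddress
import urllib.parse

_DEFAULT_TCP_PORT = 4403

def parse_network_target(value):
    """Condensed sketch of the removed _parse_network_target: IP[:port] -> (host, port)."""
    value = (value or "").strip()
    if not value:
        return None
    candidate = value if "://" in value else f"//{value}"
    parsed = urllib.parse.urlparse(candidate, scheme="tcp")
    try:
        port = parsed.port
    except ValueError:
        port = None
    host = parsed.hostname
    if not host:
        return None  # serial paths like /dev/ttyACM0 carry no network host
    try:
        ipaddress.ip_address(host)
    except ValueError:
        return None  # hostnames are rejected; only IP literals count
    return host, port or _DEFAULT_TCP_PORT
```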
def _create_serial_interface(port: str):
"""Return an appropriate serial interface for ``port``.
@@ -142,12 +76,6 @@ def _create_serial_interface(port: str):
if DEBUG:
print(f"[debug] using dummy serial interface for port={port_value!r}")
return _DummySerialInterface()
network_target = _parse_network_target(port_value)
if network_target:
host, tcp_port = network_target
if DEBUG:
print("[debug] using TCP interface for host=" f"{host!r} port={tcp_port!r}")
return TCPInterface(hostname=host, portNumber=tcp_port)
return SerialInterface(devPath=port_value)
@@ -157,20 +85,10 @@ _POST_QUEUE = []
_POST_QUEUE_COUNTER = itertools.count()
_POST_QUEUE_ACTIVE = False
_MESSAGE_POST_PRIORITY = 0
_POSITION_POST_PRIORITY = 10
_NODE_POST_PRIORITY = 20
_NODE_POST_PRIORITY = 0
_MESSAGE_POST_PRIORITY = 10
_DEFAULT_POST_PRIORITY = 50
_RECEIVE_TOPICS = (
"meshtastic.receive",
"meshtastic.receive.text",
"meshtastic.receive.position",
"meshtastic.receive.POSITION_APP",
"meshtastic.receive.user",
"meshtastic.receive.NODEINFO_APP",
)
def _get(obj, key, default=None):
"""Return a key or attribute value from ``obj``.
@@ -279,22 +197,9 @@ def _node_to_dict(n) -> dict:
if dataclasses.is_dataclass(value):
return {k: _convert(getattr(value, k)) for k in value.__dataclass_fields__}
if isinstance(value, ProtoMessage):
try:
return MessageToDict(
value,
preserving_proto_field_name=True,
use_integers_for_enums=False,
)
except Exception:
if hasattr(value, "to_dict"):
try:
return value.to_dict()
except Exception:
pass
try:
return json.loads(json.dumps(value, default=str))
except Exception:
return str(value)
return MessageToDict(
value, preserving_proto_field_name=True, use_integers_for_enums=False
)
if isinstance(value, bytes):
try:
return value.decode()
@@ -391,64 +296,6 @@ def _first(d, *names, default=None):
return default
def _coerce_int(value):
"""Return ``value`` converted to ``int`` when possible."""
if value is None:
return None
if isinstance(value, bool):
return int(value)
if isinstance(value, int):
return value
if isinstance(value, float):
return int(value) if math.isfinite(value) else None
if isinstance(value, (str, bytes, bytearray)):
text = value.decode() if isinstance(value, (bytes, bytearray)) else value
stripped = text.strip()
if not stripped:
return None
try:
if stripped.lower().startswith("0x"):
return int(stripped, 16)
return int(stripped, 10)
except ValueError:
try:
return int(float(stripped))
except ValueError:
return None
try:
return int(value)
except (TypeError, ValueError):
return None
def _coerce_float(value):
"""Return ``value`` converted to ``float`` when possible."""
if value is None:
return None
if isinstance(value, bool):
return float(value)
if isinstance(value, (int, float)):
result = float(value)
return result if math.isfinite(result) else None
if isinstance(value, (str, bytes, bytearray)):
text = value.decode() if isinstance(value, (bytes, bytearray)) else value
stripped = text.strip()
if not stripped:
return None
try:
result = float(stripped)
except ValueError:
return None
return result if math.isfinite(result) else None
try:
result = float(value)
except (TypeError, ValueError):
return None
return result if math.isfinite(result) else None
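The two removed coercers share one idea: accept ints, floats, decimal or hex strings, and bytes, and reject non-finite values. A condensed sketch of the integer variant (not the original code):

```python
import math

def coerce_int(value):
    """Condensed sketch of the removed _coerce_int: handles hex and decimal strings."""
    if value is None:
        return None
    if isinstance(value, bool):
        return int(value)
    if isinstance(value, float):
        return int(value) if math.isfinite(value) else None
    if isinstance(value, (bytes, bytearray)):
        value = value.decode()
    if isinstance(value, str):
        text = value.strip()
        if not text:
            return None
        try:
            return int(text, 16) if text.lower().startswith("0x") else int(text, 10)
        except ValueError:
            try:
                return int(float(text))  # "3.7" -> 3
            except ValueError:
                return None
    try:
        return int(value)
    except (TypeError, ValueError):
        return None
```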
def _pkt_to_dict(packet) -> dict:
"""Normalise a received packet into a JSON-friendly dictionary.
@@ -461,16 +308,9 @@ def _pkt_to_dict(packet) -> dict:
if isinstance(packet, dict):
return packet
if isinstance(packet, ProtoMessage):
try:
return MessageToDict(
packet, preserving_proto_field_name=True, use_integers_for_enums=False
)
except Exception:
if hasattr(packet, "to_dict"):
try:
return packet.to_dict()
except Exception:
pass
return MessageToDict(
packet, preserving_proto_field_name=True, use_integers_for_enums=False
)
# Last resort: try to read attributes
try:
return json.loads(json.dumps(packet, default=lambda o: str(o)))
@@ -478,605 +318,24 @@ def _pkt_to_dict(packet) -> dict:
return {"_unparsed": str(packet)}
def _canonical_node_id(value) -> str | None:
"""Normalise node identifiers to the canonical ``!deadbeef`` form."""
if value is None:
return None
if isinstance(value, (int, float)):
try:
num = int(value)
except (TypeError, ValueError):
return None
if num < 0:
return None
return f"!{num & 0xFFFFFFFF:08x}"
if not isinstance(value, str):
return None
trimmed = value.strip()
if not trimmed:
return None
if trimmed.startswith("^"):
return trimmed
if trimmed.startswith("!"):
body = trimmed[1:]
elif trimmed.lower().startswith("0x"):
body = trimmed[2:]
elif trimmed.isdigit():
try:
return f"!{int(trimmed, 10) & 0xFFFFFFFF:08x}"
except ValueError:
return None
else:
body = trimmed
if not body:
return None
try:
return f"!{int(body, 16) & 0xFFFFFFFF:08x}"
except ValueError:
return None
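The normalisation rule the removed function implements can be sketched as follows (omitting the `^` broadcast pass-through of the original):

```python
def canonical_node_id(value):
    """Sketch of the removed _canonical_node_id: normalise to '!deadbeef' form."""
    if value is None:
        return None
    if isinstance(value, (int, float)):
        num = int(value)
        return f"!{num & 0xFFFFFFFF:08x}" if num >= 0 else None
    text = str(value).strip()
    if not text:
        return None
    if text.startswith("!"):
        body = text[1:]
    elif text.lower().startswith("0x"):
        body = text[2:]
    elif text.isdigit():
        return f"!{int(text, 10) & 0xFFFFFFFF:08x}"  # decimal node number
    else:
        body = text
    try:
        return f"!{int(body, 16) & 0xFFFFFFFF:08x}"
    except ValueError:
        return None
```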
def _node_num_from_id(node_id) -> int | None:
"""Return the numeric node reference derived from ``node_id``."""
if node_id is None:
return None
if isinstance(node_id, (int, float)):
try:
num = int(node_id)
except (TypeError, ValueError):
return None
return num if num >= 0 else None
if not isinstance(node_id, str):
return None
trimmed = node_id.strip()
if not trimmed:
return None
if trimmed.startswith("!"):
trimmed = trimmed[1:]
if trimmed.lower().startswith("0x"):
trimmed = trimmed[2:]
try:
return int(trimmed, 16)
except ValueError:
try:
return int(trimmed, 10)
except ValueError:
return None
def _merge_mappings(base, extra):
"""Recursively merge mapping ``extra`` into ``base`` without mutation."""
base_dict: dict
if isinstance(base, Mapping):
base_dict = dict(base)
elif base:
converted_base = _node_to_dict(base)
base_dict = dict(converted_base) if isinstance(converted_base, Mapping) else {}
else:
base_dict = {}
if not isinstance(extra, Mapping):
converted_extra = _node_to_dict(extra)
if not isinstance(converted_extra, Mapping):
return base_dict
extra = converted_extra
for key, value in extra.items():
if isinstance(value, Mapping):
existing = base_dict.get(key)
base_dict[key] = _merge_mappings(existing, value)
else:
base_dict[key] = _node_to_dict(value)
return base_dict
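The merge semantics above can be illustrated with a dict-only sketch (the original additionally converts arbitrary node objects via `_node_to_dict`):

```python
def merge_mappings(base, extra):
    """Sketch of the removed recursive merge: dicts only, never mutates inputs."""
    merged = dict(base or {})
    for key, value in (extra or {}).items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_mappings(merged[key], value)  # deep-merge nested dicts
        else:
            merged[key] = value
    return merged

base = {"user": {"id": "!00000001", "shortName": "old"}}
extra = {"user": {"shortName": "new"}, "snr": 7.5}
merged = merge_mappings(base, extra)
```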
def _extract_payload_bytes(decoded_section: Mapping) -> bytes | None:
"""Extract raw payload bytes from a decoded packet section."""
if not isinstance(decoded_section, Mapping):
return None
payload = decoded_section.get("payload")
if isinstance(payload, Mapping):
data = payload.get("__bytes_b64__") or payload.get("bytes")
if isinstance(data, str):
try:
return base64.b64decode(data)
except Exception:
return None
if isinstance(payload, (bytes, bytearray)):
return bytes(payload)
if isinstance(payload, str):
try:
return base64.b64decode(payload)
except Exception:
return None
return None
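The payload shapes handled above can be condensed into this sketch, which covers the three forms a decoded section may carry: a mapping with a base64 field, raw bytes, or a base64 string.

```python
import base64

def extract_payload_bytes(decoded):
    """Sketch of the removed _extract_payload_bytes over the three payload shapes."""
    payload = decoded.get("payload") if isinstance(decoded, dict) else None
    if isinstance(payload, dict):
        data = payload.get("__bytes_b64__") or payload.get("bytes")
        return base64.b64decode(data) if isinstance(data, str) else None
    if isinstance(payload, (bytes, bytearray)):
        return bytes(payload)
    if isinstance(payload, str):
        try:
            return base64.b64decode(payload)
        except Exception:
            return None
    return None
```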
def _decode_nodeinfo_payload(payload_bytes):
"""Return a ``NodeInfo`` protobuf message parsed from ``payload_bytes``."""
if not payload_bytes:
return None
try:
from meshtastic.protobuf import mesh_pb2
except Exception:
return None
node_info = mesh_pb2.NodeInfo()
try:
node_info.ParseFromString(payload_bytes)
return node_info
except DecodeError:
try:
user_msg = mesh_pb2.User()
user_msg.ParseFromString(payload_bytes)
except DecodeError:
return None
node_info = mesh_pb2.NodeInfo()
node_info.user.CopyFrom(user_msg)
return node_info
def _nodeinfo_metrics_dict(node_info) -> dict | None:
"""Convert ``NodeInfo.device_metrics`` into a JSON-friendly mapping."""
if not node_info:
return None
metrics_field_names = {f[0].name for f in node_info.ListFields()}
if "device_metrics" not in metrics_field_names:
return None
metrics = {}
for field_desc, value in node_info.device_metrics.ListFields():
name = field_desc.name
if name == "battery_level":
metrics["batteryLevel"] = float(value)
elif name == "voltage":
metrics["voltage"] = float(value)
elif name == "channel_utilization":
metrics["channelUtilization"] = float(value)
elif name == "air_util_tx":
metrics["airUtilTx"] = float(value)
elif name == "uptime_seconds":
metrics["uptimeSeconds"] = int(value)
return metrics if metrics else None
def _nodeinfo_position_dict(node_info) -> dict | None:
"""Convert ``NodeInfo.position`` into a dictionary with decoded coordinates."""
if not node_info:
return None
field_names = {f[0].name for f in node_info.ListFields()}
if "position" not in field_names:
return None
position = {}
for field_desc, value in node_info.position.ListFields():
name = field_desc.name
if name == "latitude_i":
position["latitude"] = float(value) / 1e7
elif name == "longitude_i":
position["longitude"] = float(value) / 1e7
elif name == "altitude":
position["altitude"] = float(value)
elif name == "time":
position["time"] = int(value)
elif name == "location_source":
try:
from meshtastic.protobuf import mesh_pb2
position["locationSource"] = mesh_pb2.Position.LocSource.Name(value)
except Exception:
position["locationSource"] = value
return position if position else None
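The `1e7` scaling above follows Meshtastic's fixed-point coordinate encoding (`latitude_i`/`longitude_i` are degrees times ten million); decoding is a plain division:

```python
def decode_coordinate(scaled_i: int) -> float:
    """Convert a Meshtastic latitude_i/longitude_i fixed-point value to degrees."""
    return scaled_i / 1e7

lat = decode_coordinate(525028890)  # illustrative values matching the README's
lon = decode_coordinate(134041940)  # Berlin map-center coordinates
```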
def _nodeinfo_user_dict(node_info, decoded_user) -> dict | None:
"""Merge user details from the decoded packet and NodeInfo payload."""
user_dict = None
if node_info:
field_names = {f[0].name for f in node_info.ListFields()}
if "user" in field_names:
try:
from google.protobuf.json_format import MessageToDict
user_dict = MessageToDict(
node_info.user,
preserving_proto_field_name=False,
use_integers_for_enums=False,
)
except Exception:
user_dict = None
if isinstance(decoded_user, ProtoMessage):
try:
from google.protobuf.json_format import MessageToDict
decoded_user = MessageToDict(
decoded_user,
preserving_proto_field_name=False,
use_integers_for_enums=False,
)
except Exception:
decoded_user = _node_to_dict(decoded_user)
if isinstance(decoded_user, Mapping):
user_dict = _merge_mappings(user_dict, decoded_user)
if isinstance(user_dict, Mapping):
canonical = _canonical_node_id(user_dict.get("id"))
if canonical:
user_dict = dict(user_dict)
user_dict["id"] = canonical
return user_dict
def store_position_packet(packet: dict, decoded: Mapping):
"""Handle ``POSITION_APP`` packets and forward them to ``/api/positions``."""
node_ref = _first(packet, "fromId", "from_id", "from", default=None)
if node_ref is None:
node_ref = _first(decoded, "num", default=None)
node_id = _canonical_node_id(node_ref)
if node_id is None:
return
node_num = _coerce_int(_first(decoded, "num", default=None))
if node_num is None:
node_num = _node_num_from_id(node_id)
pkt_id = _coerce_int(_first(packet, "id", "packet_id", "packetId", default=None))
if pkt_id is None:
return
rx_time = _coerce_int(_first(packet, "rxTime", "rx_time", default=time.time()))
if rx_time is None:
rx_time = int(time.time())
to_id = _first(packet, "toId", "to_id", "to", default=None)
to_id = to_id if to_id not in {"", None} else None
position_section = decoded.get("position") if isinstance(decoded, Mapping) else None
if not isinstance(position_section, Mapping):
position_section = {}
latitude = _coerce_float(
_first(position_section, "latitude", "raw.latitude", default=None)
)
if latitude is None:
lat_i = _coerce_int(
_first(
position_section,
"latitudeI",
"latitude_i",
"raw.latitude_i",
default=None,
)
)
if lat_i is not None:
latitude = lat_i / 1e7
longitude = _coerce_float(
_first(position_section, "longitude", "raw.longitude", default=None)
)
if longitude is None:
lon_i = _coerce_int(
_first(
position_section,
"longitudeI",
"longitude_i",
"raw.longitude_i",
default=None,
)
)
if lon_i is not None:
longitude = lon_i / 1e7
altitude = _coerce_float(
_first(position_section, "altitude", "raw.altitude", default=None)
)
position_time = _coerce_int(
_first(position_section, "time", "raw.time", default=None)
)
location_source = _first(
position_section,
"locationSource",
"location_source",
"raw.location_source",
default=None,
)
location_source = (
str(location_source).strip() if location_source not in {None, ""} else None
)
precision_bits = _coerce_int(
_first(
position_section,
"precisionBits",
"precision_bits",
"raw.precision_bits",
default=None,
)
)
sats_in_view = _coerce_int(
_first(
position_section,
"satsInView",
"sats_in_view",
"raw.sats_in_view",
default=None,
)
)
pdop = _coerce_float(
_first(position_section, "PDOP", "pdop", "raw.PDOP", "raw.pdop", default=None)
)
ground_speed = _coerce_float(
_first(
position_section,
"groundSpeed",
"ground_speed",
"raw.ground_speed",
default=None,
)
)
ground_track = _coerce_float(
_first(
position_section,
"groundTrack",
"ground_track",
"raw.ground_track",
default=None,
)
)
snr = _coerce_float(_first(packet, "snr", "rx_snr", "rxSnr", default=None))
rssi = _coerce_int(_first(packet, "rssi", "rx_rssi", "rxRssi", default=None))
hop_limit = _coerce_int(_first(packet, "hopLimit", "hop_limit", default=None))
bitfield = _coerce_int(_first(decoded, "bitfield", default=None))
payload_bytes = _extract_payload_bytes(decoded)
payload_b64 = (
base64.b64encode(payload_bytes).decode("ascii") if payload_bytes else None
)
raw_section = decoded.get("raw") if isinstance(decoded, Mapping) else None
raw_payload = _node_to_dict(raw_section) if raw_section else None
if raw_payload is None and position_section:
raw_position = (
position_section.get("raw")
if isinstance(position_section, Mapping)
else None
)
if raw_position:
raw_payload = _node_to_dict(raw_position)
position_payload = {
"id": pkt_id,
"node_id": node_id,
"node_num": node_num,
"num": node_num,
"from_id": node_id,
"to_id": to_id,
"rx_time": rx_time,
"rx_iso": _iso(rx_time),
"latitude": latitude,
"longitude": longitude,
"altitude": altitude,
"position_time": position_time,
"location_source": location_source,
"precision_bits": precision_bits,
"sats_in_view": sats_in_view,
"pdop": pdop,
"ground_speed": ground_speed,
"ground_track": ground_track,
"snr": snr,
"rssi": rssi,
"hop_limit": hop_limit,
"bitfield": bitfield,
"payload_b64": payload_b64,
}
if raw_payload:
position_payload["raw"] = raw_payload
_queue_post_json(
"/api/positions", position_payload, priority=_POSITION_POST_PRIORITY
)
if DEBUG:
print(
f"[debug] stored position for {node_id} lat={latitude!r} lon={longitude!r} rx_time={rx_time}"
)
def store_nodeinfo_packet(packet: dict, decoded: Mapping):
"""Handle ``NODEINFO_APP`` packets and forward them to ``/api/nodes``."""
payload_bytes = _extract_payload_bytes(decoded)
node_info = _decode_nodeinfo_payload(payload_bytes)
decoded_user = decoded.get("user")
user_dict = _nodeinfo_user_dict(node_info, decoded_user)
node_info_fields = set()
if node_info:
node_info_fields = {field_desc.name for field_desc, _ in node_info.ListFields()}
node_id = None
if isinstance(user_dict, Mapping):
node_id = _canonical_node_id(user_dict.get("id"))
if node_id is None:
node_id = _canonical_node_id(
_first(packet, "fromId", "from_id", "from", default=None)
)
if node_id is None:
return
node_payload = {}
if user_dict:
node_payload["user"] = user_dict
node_num = None
if node_info and "num" in node_info_fields:
try:
node_num = int(node_info.num)
except (TypeError, ValueError):
node_num = None
if node_num is None:
decoded_num = decoded.get("num")
if decoded_num is not None:
try:
node_num = int(decoded_num)
except (TypeError, ValueError):
try:
node_num = int(str(decoded_num).strip(), 0)
except Exception:
node_num = None
if node_num is None:
node_num = _node_num_from_id(node_id)
if node_num is not None:
node_payload["num"] = node_num
rx_time = int(_first(packet, "rxTime", "rx_time", default=time.time()))
last_heard = None
if node_info and "last_heard" in node_info_fields:
try:
last_heard = int(node_info.last_heard)
except (TypeError, ValueError):
last_heard = None
if last_heard is None:
decoded_last_heard = decoded.get("lastHeard")
if decoded_last_heard is not None:
try:
last_heard = int(decoded_last_heard)
except (TypeError, ValueError):
last_heard = None
if last_heard is None or last_heard < rx_time:
last_heard = rx_time
node_payload["lastHeard"] = last_heard
snr = None
if node_info and "snr" in node_info_fields:
try:
snr = float(node_info.snr)
except (TypeError, ValueError):
snr = None
if snr is None:
snr = _first(packet, "snr", "rx_snr", "rxSnr", default=None)
if snr is not None:
try:
snr = float(snr)
except (TypeError, ValueError):
snr = None
if snr is not None:
node_payload["snr"] = snr
hops = None
if node_info and "hops_away" in node_info_fields:
try:
hops = int(node_info.hops_away)
except (TypeError, ValueError):
hops = None
if hops is None:
hops = decoded.get("hopsAway")
if hops is not None:
try:
hops = int(hops)
except (TypeError, ValueError):
hops = None
if hops is not None:
node_payload["hopsAway"] = hops
if node_info and "channel" in node_info_fields:
try:
node_payload["channel"] = int(node_info.channel)
except (TypeError, ValueError):
pass
if node_info and "via_mqtt" in node_info_fields:
node_payload["viaMqtt"] = bool(node_info.via_mqtt)
if node_info and "is_favorite" in node_info_fields:
node_payload["isFavorite"] = bool(node_info.is_favorite)
elif "isFavorite" in decoded:
node_payload["isFavorite"] = bool(decoded.get("isFavorite"))
if node_info and "is_ignored" in node_info_fields:
node_payload["isIgnored"] = bool(node_info.is_ignored)
if node_info and "is_key_manually_verified" in node_info_fields:
node_payload["isKeyManuallyVerified"] = bool(node_info.is_key_manually_verified)
metrics = _nodeinfo_metrics_dict(node_info)
decoded_metrics = decoded.get("deviceMetrics")
if isinstance(decoded_metrics, Mapping):
metrics = _merge_mappings(metrics, _node_to_dict(decoded_metrics))
if metrics:
node_payload["deviceMetrics"] = metrics
position = _nodeinfo_position_dict(node_info)
decoded_position = decoded.get("position")
if isinstance(decoded_position, Mapping):
position = _merge_mappings(position, _node_to_dict(decoded_position))
if position:
node_payload["position"] = position
hop_limit = _first(packet, "hopLimit", "hop_limit", default=None)
if hop_limit is not None and "hopLimit" not in node_payload:
try:
node_payload["hopLimit"] = int(hop_limit)
except (TypeError, ValueError):
pass
_queue_post_json(
"/api/nodes", {node_id: node_payload}, priority=_NODE_POST_PRIORITY
)
if DEBUG:
short = None
if isinstance(user_dict, Mapping):
short = user_dict.get("shortName")
print(f"[debug] stored nodeinfo for {node_id} shortName={short!r}")
def store_packet_dict(p: dict):
"""Persist packets extracted from a decoded payload.
"""Persist text messages extracted from a decoded packet.
Node information packets are forwarded to the ``/api/nodes`` endpoint
while text messages from the ``TEXT_MESSAGE_APP`` port continue to be
stored via ``/api/messages``. Field lookups tolerate camelCase and
snake_case variants for compatibility across Meshtastic releases.
Only packets from the ``TEXT_MESSAGE_APP`` port are forwarded to the
web API. Field lookups tolerate camelCase and snake_case variants for
compatibility across Meshtastic releases.
Args:
p: Packet dictionary produced by ``_pkt_to_dict``.
"""
dec = p.get("decoded") or {}
portnum_raw = _first(dec, "portnum", default=None)
portnum = str(portnum_raw).upper() if portnum_raw is not None else None
if portnum in {"5", "NODEINFO_APP"}:
store_nodeinfo_packet(p, dec)
return
if portnum in {"4", "POSITION_APP"}:
store_position_packet(p, dec)
return
text = _first(dec, "payload.text", "text", default=None)
encrypted = _first(dec, "payload.encrypted", "encrypted", default=None)
if encrypted is None:
encrypted = _first(p, "encrypted", default=None)
if not text and not encrypted:
return # ignore packets that lack text and encrypted payloads
if not text:
return # ignore non-text packets
# port filter: only keep packets from the TEXT_MESSAGE_APP port
portnum_raw = _first(dec, "portnum", default=None)
portnum = str(portnum_raw).upper() if portnum_raw is not None else None
if portnum and portnum not in {"1", "TEXT_MESSAGE_APP"}:
return # ignore non-text-message ports
@@ -1118,7 +377,6 @@ def store_packet_dict(p: dict):
"channel": ch,
"portnum": str(portnum) if portnum is not None else None,
"text": text,
"encrypted": encrypted,
"snr": float(snr) if snr is not None else None,
"rssi": int(rssi) if rssi is not None else None,
"hop_limit": int(hop) if hop is not None else None,
@@ -1140,11 +398,6 @@ def on_receive(packet, interface):
interface: Serial interface instance (unused).
"""
if isinstance(packet, dict):
if packet.get("_potatomesh_seen"):
return
packet["_potatomesh_seen"] = True
p = None
try:
p = _pkt_to_dict(packet)
@@ -1154,20 +407,6 @@ def on_receive(packet, interface):
print(f"[warn] failed to store packet: {e} | info: {info}")
def _subscribe_receive_topics() -> list[str]:
"""Subscribe ``on_receive`` to relevant PubSub topics."""
subscribed = []
for topic in _RECEIVE_TOPICS:
try:
pub.subscribe(on_receive, topic)
subscribed.append(topic)
except Exception as exc: # pragma: no cover - pub may raise in prod only
if DEBUG:
print(f"[debug] failed to subscribe to {topic!r}: {exc}")
return subscribed
# --- Main ---------------------------------------------------------------------
def _node_items_snapshot(nodes_obj, retries: int = 3):
"""Return a snapshot list of ``(node_id, node)`` pairs.
@@ -1218,20 +457,9 @@ def main():
"""Run the mesh synchronisation daemon."""
# Subscribe to PubSub topics (reliable in current meshtastic)
subscribed = _subscribe_receive_topics()
if DEBUG and subscribed:
print(f"[debug] subscribed to receive topics: {', '.join(subscribed)}")
pub.subscribe(on_receive, "meshtastic.receive")
def _close_interface(iface_obj):
if iface_obj is None:
return
try:
iface_obj.close()
except Exception:
pass
iface = None
retry_delay = max(0.0, _RECONNECT_INITIAL_DELAY_SECS)
iface = _create_serial_interface(PORT)
stop = threading.Event()
@@ -1248,24 +476,6 @@ def main():
f"Mesh daemon: nodes+messages → {target} | port={PORT} | channel={CHANNEL_INDEX}"
)
while not stop.is_set():
if iface is None:
try:
iface = _create_serial_interface(PORT)
retry_delay = max(0.0, _RECONNECT_INITIAL_DELAY_SECS)
except Exception as exc:
print(f"[warn] failed to create mesh interface: {exc}")
stop.wait(retry_delay)
if _RECONNECT_MAX_DELAY_SECS > 0:
retry_delay = min(
(
retry_delay * 2
if retry_delay
else _RECONNECT_INITIAL_DELAY_SECS
),
_RECONNECT_MAX_DELAY_SECS,
)
continue
try:
nodes = getattr(iface, "nodes", {}) or {}
node_items = _node_items_snapshot(nodes)
@@ -1286,20 +496,12 @@ def main():
print(f"[debug] node object: {n!r}")
except Exception as e:
print(f"[warn] failed to update node snapshot: {e}")
_close_interface(iface)
iface = None
stop.wait(retry_delay)
if _RECONNECT_MAX_DELAY_SECS > 0:
retry_delay = min(
retry_delay * 2 if retry_delay else _RECONNECT_INITIAL_DELAY_SECS,
_RECONNECT_MAX_DELAY_SECS,
)
continue
retry_delay = max(0.0, _RECONNECT_INITIAL_DELAY_SECS)
stop.wait(SNAPSHOT_SECS)
_close_interface(iface)
try:
iface.close()
except Exception:
pass
if __name__ == "__main__":
+2 -2
@@ -21,10 +21,10 @@ CREATE TABLE IF NOT EXISTS messages (
channel INTEGER,
portnum TEXT,
text TEXT,
encrypted TEXT,
snr REAL,
rssi INTEGER,
hop_limit INTEGER
hop_limit INTEGER,
raw_json TEXT
);
CREATE INDEX IF NOT EXISTS idx_messages_rx_time ON messages(rx_time);
@@ -1,4 +0,0 @@
-- Add support for encrypted messages to the existing schema.
BEGIN;
ALTER TABLE messages ADD COLUMN encrypted TEXT;
COMMIT;
-40
@@ -1,40 +0,0 @@
-- Copyright (C) 2025 l5yth
--
-- Licensed under the Apache License, Version 2.0 (the "License");
-- you may not use this file except in compliance with the License.
-- You may obtain a copy of the License at
--
-- http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing, software
-- distributed under the License is distributed on an "AS IS" BASIS,
-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-- See the License for the specific language governing permissions and
-- limitations under the License.
CREATE TABLE IF NOT EXISTS positions (
id INTEGER PRIMARY KEY,
node_id TEXT,
node_num INTEGER,
rx_time INTEGER NOT NULL,
rx_iso TEXT NOT NULL,
position_time INTEGER,
to_id TEXT,
latitude REAL,
longitude REAL,
altitude REAL,
location_source TEXT,
precision_bits INTEGER,
sats_in_view INTEGER,
pdop REAL,
ground_speed REAL,
ground_track REAL,
snr REAL,
rssi INTEGER,
hop_limit INTEGER,
bitfield INTEGER,
payload_b64 TEXT
);
CREATE INDEX IF NOT EXISTS idx_positions_rx_time ON positions(rx_time);
CREATE INDEX IF NOT EXISTS idx_positions_node_id ON positions(node_id);
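The two indexes above suggest the common access pattern: the most recent fix per node. A sketch of that query against a schema trimmed to a few of the columns listed, using the standard-library `sqlite3` module with invented sample rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE positions (
        id INTEGER PRIMARY KEY, node_id TEXT,
        rx_time INTEGER NOT NULL, latitude REAL, longitude REAL
    );
    CREATE INDEX idx_positions_rx_time ON positions(rx_time);
    CREATE INDEX idx_positions_node_id ON positions(node_id);
    -- Invented sample fixes for two nodes.
    INSERT INTO positions (node_id, rx_time, latitude, longitude) VALUES
        ('!b1fa2b07', 100, 52.51, 13.55),
        ('!b1fa2b07', 200, 52.52, 13.56),
        ('!194a7351', 150, 52.50, 13.40);
    """
)

# Latest fix per node: SQLite's bare-column rule guarantees that the
# ungrouped columns come from the row that supplied MAX(rx_time).
rows = conn.execute(
    """SELECT node_id, MAX(rx_time), latitude, longitude
       FROM positions GROUP BY node_id ORDER BY node_id"""
).fetchall()
```

The bare-column behavior with `MAX()`/`MIN()` is SQLite-specific; a portable version would use a correlated subquery or a window function instead.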
BIN
View File
Binary file not shown.
Before | Width: | Height: | Size: 952 KiB
-77
View File
@@ -1,77 +0,0 @@
#!/usr/bin/env python3
import json, os, signal, sys, time
from datetime import datetime, timezone
from meshtastic.serial_interface import SerialInterface
from meshtastic.mesh_interface import MeshInterface
from pubsub import pub
PORT = os.environ.get("MESH_SERIAL", "/dev/ttyACM0")
OUT = os.environ.get("MESH_DUMP_FILE", "meshtastic-dump.ndjson")
# line-buffered append so you can tail -f safely
f = open(OUT, "a", buffering=1, encoding="utf-8")
def now():
return datetime.now(timezone.utc).isoformat()
def write(kind, payload):
rec = {"ts": now(), "kind": kind, **payload}
f.write(json.dumps(rec, ensure_ascii=False, default=str) + "\n")
# Connect to the node
iface: MeshInterface = SerialInterface(PORT)
# Packet callback: every RF/Mesh packet the node receives/decodes lands here
def on_packet(packet, iface):
# 'packet' already includes decoded fields when available (portnum, payload, position, telemetry, etc.)
write("packet", {"packet": packet})
# Node callback: topology/metadata updates (nodeinfo, hops, lastHeard, etc.)
def on_node(node, iface):
write("node", {"node": node})
iface.onReceive = on_packet
pub.subscribe(on_node, "meshtastic.node")
# Write a little header so you know what you captured
try:
my = getattr(iface, "myInfo", None)
write(
"meta",
{
"event": "started",
"port": PORT,
"my_node_num": getattr(my, "my_node_num", None) if my else None,
},
)
except Exception as e:
write("meta", {"event": "started", "port": PORT, "error": str(e)})
# Keep the process alive until Ctrl-C
def _stop(signum, frame):
write("meta", {"event": "stopping"})
try:
try:
pub.unsubscribe(on_node, "meshtastic.node")
except Exception:
pass
iface.close()
finally:
f.close()
sys.exit(0)
signal.signal(signal.SIGINT, _stop)
signal.signal(signal.SIGTERM, _stop)
# Simple sleep loop; avoids busy-wait
while True:
time.sleep(1)
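The deleted script above writes one JSON object per line (NDJSON), which makes the capture trivially streamable. A sketch of consuming such a dump, with the two records here invented to match the script's `{"ts", "kind", ...}` shape:

```python
import io
import json

# Hypothetical capture: one JSON record per line, as the deleted script wrote.
dump = io.StringIO(
    '{"ts": "2025-09-22T07:00:00+00:00", "kind": "meta", "event": "started"}\n'
    '{"ts": "2025-09-22T07:00:05+00:00", "kind": "packet", "packet": {"id": 1}}\n'
)

# Each non-empty line is an independent JSON document.
records = [json.loads(line) for line in dump if line.strip()]
kinds = [r["kind"] for r in records]
# kinds == ["meta", "packet"]
```

Line-buffered appends (`open(..., buffering=1)`) plus one-record-per-line output are what make `tail -f` on the dump file safe, as the script's comment notes.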
+100
View File
@@ -13,6 +13,7 @@
"snr": -13.25,
"node": {
"snr": -13.25,
"raw_json": null,
"node_id": "!bba83318",
"num": 3148362520,
"short_name": "BerF",
@@ -52,6 +53,7 @@
"snr": -12.0,
"node": {
"snr": -12.0,
"raw_json": null,
"node_id": "!43b6e530",
"num": 1136059696,
"short_name": "FFSR",
@@ -91,6 +93,7 @@
"snr": -13.5,
"node": {
"snr": 11.0,
"raw_json": null,
"node_id": "!d42e18e8",
"num": 3559790824,
"short_name": "RRun",
@@ -130,6 +133,7 @@
"snr": -13.0,
"node": {
"snr": 11.0,
"raw_json": null,
"node_id": "!d42e18e8",
"num": 3559790824,
"short_name": "RRun",
@@ -169,6 +173,7 @@
"snr": 11.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -208,6 +213,7 @@
"snr": 11.25,
"node": {
"snr": 11.25,
"raw_json": null,
"node_id": "!4ed36bd0",
"num": 1322478544,
"short_name": "RDM",
@@ -247,6 +253,7 @@
"snr": 11.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -286,6 +293,7 @@
"snr": 10.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -325,6 +333,7 @@
"snr": 12.0,
"node": {
"snr": 12.0,
"raw_json": null,
"node_id": "!b03c97a4",
"num": 2956760996,
"short_name": "BLN1",
@@ -364,6 +373,7 @@
"snr": -15.0,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9eeb25ec",
"num": 2666210796,
"short_name": "25ec",
@@ -403,6 +413,7 @@
"snr": 11.25,
"node": {
"snr": 11.25,
"raw_json": null,
"node_id": "!f9b0938c",
"num": 4189098892,
"short_name": "Ed-1",
@@ -442,6 +453,7 @@
"snr": 11.25,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!6c73bf84",
"num": 1819524996,
"short_name": "ts1",
@@ -481,6 +493,7 @@
"snr": 11.25,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -520,6 +533,7 @@
"snr": 11.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -559,6 +573,7 @@
"snr": 11.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!6cf821fb",
"num": 1828200955,
"short_name": "OKP1",
@@ -598,6 +613,7 @@
"snr": 10.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!6cf821fb",
"num": 1828200955,
"short_name": "OKP1",
@@ -637,6 +653,7 @@
"snr": 10.5,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -676,6 +693,7 @@
"snr": 10.25,
"node": {
"snr": 10.25,
"raw_json": null,
"node_id": "!db2b23f4",
"num": 3677037556,
"short_name": "Eagl",
@@ -715,6 +733,7 @@
"snr": 11.25,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!6cf821fb",
"num": 1828200955,
"short_name": "OKP1",
@@ -754,6 +773,7 @@
"snr": 11.0,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -793,6 +813,7 @@
"snr": -11.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!177cfa26",
"num": 394066470,
"short_name": "lun1",
@@ -832,6 +853,7 @@
"snr": 11.25,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!9ea0c780",
"num": 2661336960,
"short_name": "nguE",
@@ -871,6 +893,7 @@
"snr": 10.75,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -910,6 +933,7 @@
"snr": 11.5,
"node": {
"snr": 11.0,
"raw_json": null,
"node_id": "!e80cda12",
"num": 3893156370,
"short_name": "mowW",
@@ -949,6 +973,7 @@
"snr": 11.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -988,6 +1013,7 @@
"snr": 11.5,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -1027,6 +1053,7 @@
"snr": 11.5,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -1066,6 +1093,7 @@
"snr": -11.75,
"node": {
"snr": -9.75,
"raw_json": null,
"node_id": "!a0cb1608",
"num": 2697664008,
"short_name": "KBV5",
@@ -1105,6 +1133,7 @@
"snr": 10.75,
"node": {
"snr": 10.25,
"raw_json": null,
"node_id": "!bcf10936",
"num": 3169913142,
"short_name": "0936",
@@ -1144,6 +1173,7 @@
"snr": 11.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -1183,6 +1213,7 @@
"snr": -13.25,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!a0cc6904",
"num": 2697750788,
"short_name": "Kdû",
@@ -1222,6 +1253,7 @@
"snr": 10.5,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -1261,6 +1293,7 @@
"snr": 11.0,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9eeb25ec",
"num": 2666210796,
"short_name": "25ec",
@@ -1300,6 +1333,7 @@
"snr": -14.0,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!a0cc6904",
"num": 2697750788,
"short_name": "Kdû",
@@ -1339,6 +1373,7 @@
"snr": 11.25,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9eeb25ec",
"num": 2666210796,
"short_name": "25ec",
@@ -1378,6 +1413,7 @@
"snr": 11.5,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9eeb25ec",
"num": 2666210796,
"short_name": "25ec",
@@ -1417,6 +1453,7 @@
"snr": 11.75,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9eeb25ec",
"num": 2666210796,
"short_name": "25ec",
@@ -1456,6 +1493,7 @@
"snr": 11.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -1495,6 +1533,7 @@
"snr": 10.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!03b9ca11",
"num": 62507537,
"short_name": "ca11",
@@ -1534,6 +1573,7 @@
"snr": 7.5,
"node": {
"snr": 10.25,
"raw_json": null,
"node_id": "!db2b23f4",
"num": 3677037556,
"short_name": "Eagl",
@@ -1573,6 +1613,7 @@
"snr": 10.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -1612,6 +1653,7 @@
"snr": 10.75,
"node": {
"snr": 10.25,
"raw_json": null,
"node_id": "!db2b23f4",
"num": 3677037556,
"short_name": "Eagl",
@@ -1651,6 +1693,7 @@
"snr": 10.75,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -1690,6 +1733,7 @@
"snr": 10.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -1729,6 +1773,7 @@
"snr": 10.5,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -1768,6 +1813,7 @@
"snr": 11.0,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!a0cc6904",
"num": 2697750788,
"short_name": "Kdû",
@@ -1807,6 +1853,7 @@
"snr": -12.25,
"node": {
"snr": -12.25,
"raw_json": null,
"node_id": "!2f945044",
"num": 798249028,
"short_name": "BND",
@@ -1846,6 +1893,7 @@
"snr": 11.0,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -1885,6 +1933,7 @@
"snr": 10.5,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9ee71c38",
"num": 2665946168,
"short_name": "1c38",
@@ -1924,6 +1973,7 @@
"snr": 10.75,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -1963,6 +2013,7 @@
"snr": 11.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -2002,6 +2053,7 @@
"snr": 10.5,
"node": {
"snr": -6.25,
"raw_json": null,
"node_id": "!7c5b0920",
"num": 2086340896,
"short_name": "FFTB",
@@ -2041,6 +2093,7 @@
"snr": 10.25,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -2080,6 +2133,7 @@
"snr": 11.25,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!9ea0c780",
"num": 2661336960,
"short_name": "nguE",
@@ -2119,6 +2173,7 @@
"snr": 10.75,
"node": {
"snr": -12.75,
"raw_json": null,
"node_id": "!0910c922",
"num": 152095010,
"short_name": "c922",
@@ -2158,6 +2213,7 @@
"snr": 11.0,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -2197,6 +2253,7 @@
"snr": 11.0,
"node": {
"snr": 11.0,
"raw_json": null,
"node_id": "!9ee71430",
"num": 2665944112,
"short_name": "FiSp",
@@ -2236,6 +2293,7 @@
"snr": 11.5,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -2275,6 +2333,7 @@
"snr": 10.75,
"node": {
"snr": 10.25,
"raw_json": null,
"node_id": "!bcf10936",
"num": 3169913142,
"short_name": "0936",
@@ -2314,6 +2373,7 @@
"snr": 11.0,
"node": {
"snr": 11.25,
"raw_json": null,
"node_id": "!16ced364",
"num": 382653284,
"short_name": "Pat",
@@ -2353,6 +2413,7 @@
"snr": 11.25,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9ee71c38",
"num": 2665946168,
"short_name": "1c38",
@@ -2392,6 +2453,7 @@
"snr": 10.5,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9ee71c38",
"num": 2665946168,
"short_name": "1c38",
@@ -2431,6 +2493,7 @@
"snr": 10.25,
"node": {
"snr": 10.0,
"raw_json": null,
"node_id": "!a3deea53",
"num": 2749295187,
"short_name": "🐸",
@@ -2470,6 +2533,7 @@
"snr": 9.0,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!9ea0c780",
"num": 2661336960,
"short_name": "nguE",
@@ -2509,6 +2573,7 @@
"snr": 11.5,
"node": {
"snr": -13.25,
"raw_json": null,
"node_id": "!bba83318",
"num": 3148362520,
"short_name": "BerF",
@@ -2548,6 +2613,7 @@
"snr": 9.25,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9ee71c38",
"num": 2665946168,
"short_name": "1c38",
@@ -2587,6 +2653,7 @@
"snr": 10.25,
"node": {
"snr": 11.0,
"raw_json": null,
"node_id": "!e80cda12",
"num": 3893156370,
"short_name": "mowW",
@@ -2626,6 +2693,7 @@
"snr": -5.0,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!a0cc6904",
"num": 2697750788,
"short_name": "Kdû",
@@ -2665,6 +2733,7 @@
"snr": 11.0,
"node": {
"snr": 11.0,
"raw_json": null,
"node_id": "!e80cda12",
"num": 3893156370,
"short_name": "mowW",
@@ -2704,6 +2773,7 @@
"snr": 0.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -2743,6 +2813,7 @@
"snr": 11.25,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -2782,6 +2853,7 @@
"snr": 11.5,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -2821,6 +2893,7 @@
"snr": 10.0,
"node": {
"snr": 11.25,
"raw_json": null,
"node_id": "!16ced364",
"num": 382653284,
"short_name": "Pat",
@@ -2860,6 +2933,7 @@
"snr": 11.0,
"node": {
"snr": -9.75,
"raw_json": null,
"node_id": "!a0cb1608",
"num": 2697664008,
"short_name": "KBV5",
@@ -2899,6 +2973,7 @@
"snr": 9.5,
"node": {
"snr": -9.75,
"raw_json": null,
"node_id": "!a0cb1608",
"num": 2697664008,
"short_name": "KBV5",
@@ -2938,6 +3013,7 @@
"snr": 10.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -2977,6 +3053,7 @@
"snr": 11.0,
"node": {
"snr": -12.0,
"raw_json": null,
"node_id": "!43b6e530",
"num": 1136059696,
"short_name": "FFSR",
@@ -3016,6 +3093,7 @@
"snr": 11.0,
"node": {
"snr": 11.0,
"raw_json": null,
"node_id": "!e80cda12",
"num": 3893156370,
"short_name": "mowW",
@@ -3055,6 +3133,7 @@
"snr": 11.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -3094,6 +3173,7 @@
"snr": 10.25,
"node": {
"snr": 11.25,
"raw_json": null,
"node_id": "!16ced364",
"num": 382653284,
"short_name": "Pat",
@@ -3133,6 +3213,7 @@
"snr": 10.5,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -3172,6 +3253,7 @@
"snr": 10.75,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
@@ -3211,6 +3293,7 @@
"snr": 11.0,
"node": {
"snr": 11.0,
"raw_json": null,
"node_id": "!abbdf3f7",
"num": 2881352695,
"short_name": "f3f7",
@@ -3250,6 +3333,7 @@
"snr": 10.5,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!c0c32348",
"num": 3234014024,
"short_name": "CooP",
@@ -3289,6 +3373,7 @@
"snr": 11.0,
"node": {
"snr": 11.25,
"raw_json": null,
"node_id": "!16ced364",
"num": 382653284,
"short_name": "Pat",
@@ -3328,6 +3413,7 @@
"snr": 10.5,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -3367,6 +3453,7 @@
"snr": -12.5,
"node": {
"snr": -9.75,
"raw_json": null,
"node_id": "!a0cb1608",
"num": 2697664008,
"short_name": "KBV5",
@@ -3406,6 +3493,7 @@
"snr": 11.0,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!da635e24",
"num": 3663945252,
"short_name": "LAN",
@@ -3445,6 +3533,7 @@
"snr": -8.75,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -3484,6 +3573,7 @@
"snr": 10.25,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!5d823fb1",
"num": 1568817073,
"short_name": "3fb1",
@@ -3523,6 +3613,7 @@
"snr": 11.25,
"node": {
"snr": -12.0,
"raw_json": null,
"node_id": "!43b6e530",
"num": 1136059696,
"short_name": "FFSR",
@@ -3562,6 +3653,7 @@
"snr": 11.0,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!849a8ba4",
"num": 2224720804,
"short_name": "MGN1",
@@ -3601,6 +3693,7 @@
"snr": -13.25,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!849a8ba4",
"num": 2224720804,
"short_name": "MGN1",
@@ -3640,6 +3733,7 @@
"snr": 10.75,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!9c93a2df",
"num": 2626921183,
"short_name": "xaRa",
@@ -3679,6 +3773,7 @@
"snr": 11.25,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9ee71c38",
"num": 2665946168,
"short_name": "1c38",
@@ -3718,6 +3813,7 @@
"snr": 11.0,
"node": {
"snr": 11.5,
"raw_json": null,
"node_id": "!9ee71c38",
"num": 2665946168,
"short_name": "1c38",
@@ -3757,6 +3853,7 @@
"snr": 11.0,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!5d823fb1",
"num": 1568817073,
"short_name": "3fb1",
@@ -3796,6 +3893,7 @@
"snr": 11.0,
"node": {
"snr": 10.5,
"raw_json": null,
"node_id": "!6c73bf84",
"num": 1819524996,
"short_name": "ts1",
@@ -3835,6 +3933,7 @@
"snr": 11.25,
"node": {
"snr": null,
"raw_json": null,
"node_id": null,
"num": null,
"short_name": null,
@@ -3874,6 +3973,7 @@
"snr": 11.25,
"node": {
"snr": 10.75,
"raw_json": null,
"node_id": "!194a7351",
"num": 424309585,
"short_name": "l5y7",
+25 -566
View File
@@ -1,4 +1,3 @@
import base64
import importlib
import sys
import types
@@ -16,15 +15,6 @@ def mesh_module(monkeypatch):
repo_root = Path(__file__).resolve().parents[1]
monkeypatch.syspath_prepend(str(repo_root))
try:
import meshtastic as real_meshtastic # type: ignore
except Exception: # pragma: no cover - dependency may be unavailable in CI
real_meshtastic = None
real_protobuf = (
getattr(real_meshtastic, "protobuf", None) if real_meshtastic else None
)
# Stub meshtastic.serial_interface.SerialInterface
serial_interface_mod = types.ModuleType("meshtastic.serial_interface")
@@ -37,30 +27,13 @@ def mesh_module(monkeypatch):
serial_interface_mod.SerialInterface = DummySerialInterface
tcp_interface_mod = types.ModuleType("meshtastic.tcp_interface")
class DummyTCPInterface:
def __init__(self, *_, **__):
self.closed = False
def close(self):
self.closed = True
tcp_interface_mod.TCPInterface = DummyTCPInterface
meshtastic_mod = types.ModuleType("meshtastic")
meshtastic_mod.serial_interface = serial_interface_mod
meshtastic_mod.tcp_interface = tcp_interface_mod
if real_protobuf is not None:
meshtastic_mod.protobuf = real_protobuf
monkeypatch.setitem(sys.modules, "meshtastic", meshtastic_mod)
monkeypatch.setitem(
sys.modules, "meshtastic.serial_interface", serial_interface_mod
)
monkeypatch.setitem(sys.modules, "meshtastic.tcp_interface", tcp_interface_mod)
if real_protobuf is not None:
monkeypatch.setitem(sys.modules, "meshtastic.protobuf", real_protobuf)
# Stub pubsub.pub
pubsub_mod = types.ModuleType("pubsub")
@@ -75,47 +48,36 @@ def mesh_module(monkeypatch):
pubsub_mod.pub = DummyPub()
monkeypatch.setitem(sys.modules, "pubsub", pubsub_mod)
# Prefer real google.protobuf modules when available, otherwise provide stubs
try:
from google.protobuf import json_format as json_format_mod # type: ignore
from google.protobuf import message as message_mod # type: ignore
except Exception: # pragma: no cover - protobuf may be missing in CI
json_format_mod = types.ModuleType("google.protobuf.json_format")
# Stub google.protobuf modules used by mesh.py
json_format_mod = types.ModuleType("google.protobuf.json_format")
def message_to_dict(obj, *_, **__):
if hasattr(obj, "to_dict"):
return obj.to_dict()
if hasattr(obj, "__dict__"):
return dict(obj.__dict__)
return {}
def message_to_dict(obj, *_, **__):
if hasattr(obj, "to_dict"):
return obj.to_dict()
if hasattr(obj, "__dict__"):
return dict(obj.__dict__)
return {}
json_format_mod.MessageToDict = message_to_dict
json_format_mod.MessageToDict = message_to_dict
message_mod = types.ModuleType("google.protobuf.message")
message_mod = types.ModuleType("google.protobuf.message")
class DummyProtoMessage:
pass
class DummyProtoMessage:
pass
class DummyDecodeError(Exception):
pass
message_mod.Message = DummyProtoMessage
message_mod.Message = DummyProtoMessage
message_mod.DecodeError = DummyDecodeError
protobuf_mod = types.ModuleType("google.protobuf")
protobuf_mod.json_format = json_format_mod
protobuf_mod.message = message_mod
protobuf_mod = types.ModuleType("google.protobuf")
protobuf_mod.json_format = json_format_mod
protobuf_mod.message = message_mod
google_mod = types.ModuleType("google")
google_mod.protobuf = protobuf_mod
google_mod = types.ModuleType("google")
google_mod.protobuf = protobuf_mod
monkeypatch.setitem(sys.modules, "google", google_mod)
monkeypatch.setitem(sys.modules, "google.protobuf", protobuf_mod)
monkeypatch.setitem(sys.modules, "google.protobuf.json_format", json_format_mod)
monkeypatch.setitem(sys.modules, "google.protobuf.message", message_mod)
else:
monkeypatch.setitem(sys.modules, "google.protobuf.json_format", json_format_mod)
monkeypatch.setitem(sys.modules, "google.protobuf.message", message_mod)
monkeypatch.setitem(sys.modules, "google", google_mod)
monkeypatch.setitem(sys.modules, "google.protobuf", protobuf_mod)
monkeypatch.setitem(sys.modules, "google.protobuf.json_format", json_format_mod)
monkeypatch.setitem(sys.modules, "google.protobuf.message", message_mod)
module_name = "data.mesh"
if module_name in sys.modules:
@@ -167,57 +129,6 @@ def test_create_serial_interface_uses_serial_module(mesh_module, monkeypatch):
assert iface.nodes == {"!foo": sentinel}
def test_create_serial_interface_uses_tcp_for_ip(mesh_module, monkeypatch):
mesh = mesh_module
created = {}
def fake_tcp_interface(*, hostname, portNumber, **_):
created["hostname"] = hostname
created["portNumber"] = portNumber
return SimpleNamespace(nodes={}, close=lambda: None)
monkeypatch.setattr(mesh, "TCPInterface", fake_tcp_interface)
iface = mesh._create_serial_interface("192.168.1.25:4500")
assert created == {"hostname": "192.168.1.25", "portNumber": 4500}
assert iface.nodes == {}
def test_create_serial_interface_defaults_tcp_port(mesh_module, monkeypatch):
mesh = mesh_module
created = {}
def fake_tcp_interface(*, hostname, portNumber, **_):
created["hostname"] = hostname
created["portNumber"] = portNumber
return SimpleNamespace(nodes={}, close=lambda: None)
monkeypatch.setattr(mesh, "TCPInterface", fake_tcp_interface)
mesh._create_serial_interface("tcp://10.20.30.40")
assert created["hostname"] == "10.20.30.40"
assert created["portNumber"] == mesh._DEFAULT_TCP_PORT
def test_create_serial_interface_plain_ip(mesh_module, monkeypatch):
mesh = mesh_module
created = {}
def fake_tcp_interface(*, hostname, portNumber, **_):
created["hostname"] = hostname
created["portNumber"] = portNumber
return SimpleNamespace(nodes={}, close=lambda: None)
monkeypatch.setattr(mesh, "TCPInterface", fake_tcp_interface)
mesh._create_serial_interface(" 192.168.50.10 ")
assert created["hostname"] == "192.168.50.10"
assert created["portNumber"] == mesh._DEFAULT_TCP_PORT
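The three TCP tests above pin down the accepted target forms: `host:port`, `tcp://host`, and a bare (possibly padded) IP. A sketch of parsing logic consistent with those expectations; the default port value here is an assumption standing in for `mesh._DEFAULT_TCP_PORT`:

```python
_DEFAULT_TCP_PORT = 4403  # assumption; stands in for mesh._DEFAULT_TCP_PORT


def parse_tcp_target(value: str) -> tuple[str, int]:
    """Split 'host[:port]', optionally prefixed with tcp://, into parts.

    Whitespace is stripped first so padded values like ' 192.168.50.10 '
    parse cleanly; a missing port falls back to the default.
    """
    target = value.strip()
    if target.startswith("tcp://"):
        target = target[len("tcp://"):]
    host, sep, port = target.partition(":")
    return host, int(port) if sep else _DEFAULT_TCP_PORT


# parse_tcp_target("192.168.1.25:4500") == ("192.168.1.25", 4500)
# parse_tcp_target("tcp://10.20.30.40") == ("10.20.30.40", _DEFAULT_TCP_PORT)
# parse_tcp_target(" 192.168.50.10 ")  == ("192.168.50.10", _DEFAULT_TCP_PORT)
```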
def test_node_to_dict_handles_nested_structures(mesh_module):
mesh = mesh_module
@@ -292,316 +203,6 @@ def test_store_packet_dict_posts_text_message(mesh_module, monkeypatch):
assert priority == mesh._MESSAGE_POST_PRIORITY
def test_store_packet_dict_posts_position(mesh_module, monkeypatch):
mesh = mesh_module
captured = []
monkeypatch.setattr(
mesh,
"_queue_post_json",
lambda path, payload, *, priority: captured.append((path, payload, priority)),
)
packet = {
"id": 200498337,
"rxTime": 1_758_624_186,
"fromId": "!b1fa2b07",
"toId": "^all",
"rxSnr": -9.5,
"rxRssi": -104,
"decoded": {
"portnum": "POSITION_APP",
"bitfield": 1,
"position": {
"latitudeI": int(52.518912 * 1e7),
"longitudeI": int(13.5512064 * 1e7),
"altitude": -16,
"time": 1_758_624_189,
"locationSource": "LOC_INTERNAL",
"precisionBits": 17,
"satsInView": 7,
"PDOP": 211,
"groundSpeed": 2,
"groundTrack": 0,
"raw": {
"latitude_i": int(52.518912 * 1e7),
"longitude_i": int(13.5512064 * 1e7),
"altitude": -16,
"time": 1_758_624_189,
},
},
"payload": {
"__bytes_b64__": "DQDATR8VAMATCBjw//////////8BJb150mgoAljTAXgCgAEAmAEHuAER",
},
},
}
mesh.store_packet_dict(packet)
assert captured, "Expected POST to be triggered for position packet"
path, payload, priority = captured[0]
assert path == "/api/positions"
assert priority == mesh._POSITION_POST_PRIORITY
assert payload["id"] == 200498337
assert payload["node_id"] == "!b1fa2b07"
assert payload["node_num"] == int("b1fa2b07", 16)
assert payload["num"] == payload["node_num"]
assert payload["rx_time"] == 1_758_624_186
assert payload["rx_iso"] == mesh._iso(1_758_624_186)
assert payload["latitude"] == pytest.approx(52.518912)
assert payload["longitude"] == pytest.approx(13.5512064)
assert payload["altitude"] == pytest.approx(-16)
assert payload["position_time"] == 1_758_624_189
assert payload["location_source"] == "LOC_INTERNAL"
assert payload["precision_bits"] == 17
assert payload["sats_in_view"] == 7
assert payload["pdop"] == pytest.approx(211.0)
assert payload["ground_speed"] == pytest.approx(2.0)
assert payload["ground_track"] == pytest.approx(0.0)
assert payload["snr"] == pytest.approx(-9.5)
assert payload["rssi"] == -104
assert payload["hop_limit"] is None
assert payload["bitfield"] == 1
assert (
payload["payload_b64"]
== "DQDATR8VAMATCBjw//////////8BJb150mgoAljTAXgCgAEAmAEHuAER"
)
assert payload["raw"]["time"] == 1_758_624_189
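The position test above builds `latitudeI`/`longitudeI` as `int(degrees * 1e7)` and expects the payload to round-trip back to degrees: Meshtastic positions carry coordinates as fixed-point integers in units of 1e-7 degree. A minimal sketch of the conversion the ingestor must perform:

```python
def degrees_from_i(value_i: int) -> float:
    """Convert a Meshtastic fixed-point coordinate (1e-7 degree units)
    back to floating-point degrees."""
    return value_i * 1e-7


lat_i = int(52.518912 * 1e7)
lon_i = int(13.5512064 * 1e7)

# int() truncation loses at most 1e-7 degree, well inside the tolerance
# that pytest.approx applies in the test above.
assert abs(degrees_from_i(lat_i) - 52.518912) < 1e-6
assert abs(degrees_from_i(lon_i) - 13.5512064) < 1e-6
```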
def test_store_packet_dict_handles_nodeinfo_packet(mesh_module, monkeypatch):
mesh = mesh_module
captured = []
monkeypatch.setattr(
mesh,
"_queue_post_json",
lambda path, payload, *, priority: captured.append((path, payload, priority)),
)
from meshtastic.protobuf import config_pb2, mesh_pb2
node_info = mesh_pb2.NodeInfo()
node_info.num = 321
user = node_info.user
user.id = "!abcd1234"
user.short_name = "LoRa"
user.long_name = "LoRa Node"
user.role = config_pb2.Config.DeviceConfig.Role.Value("CLIENT")
user.hw_model = mesh_pb2.HardwareModel.Value("TBEAM")
node_info.device_metrics.battery_level = 87
node_info.device_metrics.voltage = 3.91
node_info.device_metrics.channel_utilization = 5.5
node_info.device_metrics.air_util_tx = 0.12
node_info.device_metrics.uptime_seconds = 4321
node_info.position.latitude_i = int(52.5 * 1e7)
node_info.position.longitude_i = int(13.4 * 1e7)
node_info.position.altitude = 48
node_info.position.time = 1_700_000_050
node_info.position.location_source = mesh_pb2.Position.LocSource.Value(
"LOC_INTERNAL"
)
node_info.snr = 9.5
node_info.last_heard = 1_700_000_040
node_info.hops_away = 2
node_info.is_favorite = True
payload_b64 = base64.b64encode(node_info.SerializeToString()).decode()
packet = {
"id": 999,
"rxTime": 1_700_000_200,
"from": int("abcd1234", 16),
"rxSnr": -5.5,
"decoded": {
"portnum": "NODEINFO_APP",
"payload": {"__bytes_b64__": payload_b64},
},
}
mesh.store_packet_dict(packet)
assert captured, "Expected nodeinfo packet to trigger POST"
path, payload, priority = captured[0]
assert path == "/api/nodes"
assert priority == mesh._NODE_POST_PRIORITY
assert "!abcd1234" in payload
node_entry = payload["!abcd1234"]
assert node_entry["num"] == 321
assert node_entry["lastHeard"] == 1_700_000_200
assert node_entry["snr"] == pytest.approx(9.5)
assert node_entry["hopsAway"] == 2
assert node_entry["isFavorite"] is True
assert node_entry["user"]["shortName"] == "LoRa"
assert node_entry["deviceMetrics"]["batteryLevel"] == pytest.approx(87)
assert node_entry["deviceMetrics"]["voltage"] == pytest.approx(3.91)
assert node_entry["deviceMetrics"]["uptimeSeconds"] == 4321
assert node_entry["position"]["latitude"] == pytest.approx(52.5)
assert node_entry["position"]["longitude"] == pytest.approx(13.4)
assert node_entry["position"]["time"] == 1_700_000_050
def test_store_packet_dict_handles_user_only_nodeinfo(mesh_module, monkeypatch):
mesh = mesh_module
captured = []
monkeypatch.setattr(
mesh,
"_queue_post_json",
lambda path, payload, *, priority: captured.append((path, payload, priority)),
)
from meshtastic.protobuf import mesh_pb2
user_msg = mesh_pb2.User()
user_msg.id = "!11223344"
user_msg.short_name = "Test"
user_msg.long_name = "Test Node"
payload_b64 = base64.b64encode(user_msg.SerializeToString()).decode()
packet = {
"id": 42,
"rxTime": 1_234,
"from": int("11223344", 16),
"decoded": {
"portnum": "NODEINFO_APP",
"payload": {"__bytes_b64__": payload_b64},
"user": {
"id": "!11223344",
"shortName": "Test",
"longName": "Test Node",
"hwModel": "HELTEC_V3",
},
},
}
mesh.store_packet_dict(packet)
assert captured
_, payload, _ = captured[0]
node_entry = payload["!11223344"]
assert node_entry["lastHeard"] == 1_234
assert node_entry["user"]["longName"] == "Test Node"
assert "deviceMetrics" not in node_entry
def test_store_packet_dict_nodeinfo_merges_proto_user(mesh_module, monkeypatch):
mesh = mesh_module
captured = []
monkeypatch.setattr(
mesh,
"_queue_post_json",
lambda path, payload, *, priority: captured.append((path, payload, priority)),
)
from meshtastic.protobuf import mesh_pb2
user_msg = mesh_pb2.User()
user_msg.id = "!44556677"
user_msg.short_name = "Proto"
user_msg.long_name = "Proto User"
node_info = mesh_pb2.NodeInfo()
node_info.snr = 2.5
payload_b64 = base64.b64encode(node_info.SerializeToString()).decode()
packet = {
"id": 73,
"rxTime": 5_000,
"fromId": "!44556677",
"decoded": {
"portnum": "NODEINFO_APP",
"payload": {"__bytes_b64__": payload_b64},
"user": user_msg,
},
}
mesh.store_packet_dict(packet)
assert captured
_, payload, _ = captured[0]
node_entry = payload["!44556677"]
assert node_entry["lastHeard"] == 5_000
assert node_entry["user"]["shortName"] == "Proto"
assert node_entry["user"]["longName"] == "Proto User"
def test_store_packet_dict_nodeinfo_sanitizes_nested_proto(mesh_module, monkeypatch):
mesh = mesh_module
captured = []
monkeypatch.setattr(
mesh,
"_queue_post_json",
lambda path, payload, *, priority: captured.append((path, payload, priority)),
)
from meshtastic.protobuf import mesh_pb2
user_msg = mesh_pb2.User()
user_msg.id = "!55667788"
user_msg.short_name = "Nested"
node_info = mesh_pb2.NodeInfo()
node_info.hops_away = 1
payload_b64 = base64.b64encode(node_info.SerializeToString()).decode()
packet = {
"id": 74,
"rxTime": 6_000,
"fromId": "!55667788",
"decoded": {
"portnum": "NODEINFO_APP",
"payload": {"__bytes_b64__": payload_b64},
"user": {
"id": "!55667788",
"shortName": "Nested",
"raw": user_msg,
},
},
}
mesh.store_packet_dict(packet)
assert captured
_, payload, _ = captured[0]
node_entry = payload["!55667788"]
assert node_entry["user"]["shortName"] == "Nested"
assert isinstance(node_entry["user"]["raw"], dict)
assert node_entry["user"]["raw"]["id"] == "!55667788"
def test_store_packet_dict_nodeinfo_uses_from_id_when_user_missing(
mesh_module, monkeypatch
):
mesh = mesh_module
captured = []
monkeypatch.setattr(
mesh,
"_queue_post_json",
lambda path, payload, *, priority: captured.append((path, payload, priority)),
)
from meshtastic.protobuf import mesh_pb2
node_info = mesh_pb2.NodeInfo()
node_info.snr = 1.5
node_info.last_heard = 100
payload_b64 = base64.b64encode(node_info.SerializeToString()).decode()
packet = {
"id": 7,
"rxTime": 200,
"from": 0x01020304,
"decoded": {"portnum": 5, "payload": {"__bytes_b64__": payload_b64}},
}
mesh.store_packet_dict(packet)
assert captured
_, payload, _ = captured[0]
assert "!01020304" in payload
node_entry = payload["!01020304"]
assert node_entry["num"] == 0x01020304
assert node_entry["lastHeard"] == 200
assert node_entry["snr"] == pytest.approx(1.5)
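The test above expects node number `0x01020304` to yield the key `"!01020304"`: a node id is the node number rendered as eight lowercase, zero-padded hex digits behind a `!`. A sketch of the mapping in both directions:

```python
def node_id_from_num(num: int) -> str:
    """Render a node number as its canonical '!xxxxxxxx' id
    (eight lowercase hex digits, zero-padded)."""
    return f"!{num:08x}"


def num_from_node_id(node_id: str) -> int:
    """Inverse: parse the hex digits after the leading '!'."""
    return int(node_id.lstrip("!"), 16)


assert node_id_from_num(0x01020304) == "!01020304"
assert num_from_node_id("!b1fa2b07") == 0xB1FA2B07
```

The zero-padding matters: without `08`, small node numbers would format to short ids that never match keys like `"!01020304"`.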
def test_store_packet_dict_ignores_non_text(mesh_module, monkeypatch):
mesh = mesh_module
captured = []
@@ -618,7 +219,7 @@ def test_store_packet_dict_ignores_non_text(mesh_module, monkeypatch):
"toId": "!def",
"decoded": {
"payload": {"text": "ignored"},
"portnum": "ENVIRONMENTAL_MEASUREMENT",
"portnum": "POSITION_APP",
},
}
@@ -778,116 +379,6 @@ def test_pkt_to_dict_handles_dict_and_proto(mesh_module, monkeypatch):
assert isinstance(fallback["_unparsed"], str)
def test_main_retries_interface_creation(mesh_module, monkeypatch):
mesh = mesh_module
attempts = []
class DummyEvent:
def __init__(self):
self.wait_calls = 0
def is_set(self):
return self.wait_calls >= 3
def set(self):
self.wait_calls = 3
def wait(self, timeout):
self.wait_calls += 1
return self.is_set()
class DummyInterface:
def __init__(self):
self.closed = False
self.nodes = {}
def close(self):
self.closed = True
iface = DummyInterface()
def fake_create(port):
attempts.append(port)
if len(attempts) < 3:
raise RuntimeError("boom")
return iface
monkeypatch.setattr(mesh, "_create_serial_interface", fake_create)
monkeypatch.setattr(mesh.threading, "Event", DummyEvent)
monkeypatch.setattr(mesh.signal, "signal", lambda *_, **__: None)
monkeypatch.setattr(mesh, "SNAPSHOT_SECS", 0)
monkeypatch.setattr(mesh, "_RECONNECT_INITIAL_DELAY_SECS", 0)
monkeypatch.setattr(mesh, "_RECONNECT_MAX_DELAY_SECS", 0)
mesh.main()
assert len(attempts) == 3
assert iface.closed is True
def test_main_recreates_interface_after_snapshot_error(mesh_module, monkeypatch):
mesh = mesh_module
class DummyEvent:
def __init__(self):
self.wait_calls = 0
def is_set(self):
return self.wait_calls >= 2
def set(self):
self.wait_calls = 2
def wait(self, timeout):
self.wait_calls += 1
return self.is_set()
interfaces = []
def fake_create(port):
fail_first = not interfaces
class FlakyInterface:
def __init__(self, should_fail):
self.closed = False
self._should_fail = should_fail
self._calls = 0
@property
def nodes(self):
self._calls += 1
if self._should_fail and self._calls == 1:
raise RuntimeError("temporary failure")
return {"!node": {"id": 1}}
def close(self):
self.closed = True
interface = FlakyInterface(fail_first)
interfaces.append(interface)
return interface
upsert_calls = []
def record_upsert(node_id, node):
upsert_calls.append(node_id)
monkeypatch.setattr(mesh, "_create_serial_interface", fake_create)
monkeypatch.setattr(mesh, "upsert_node", record_upsert)
monkeypatch.setattr(mesh.threading, "Event", DummyEvent)
monkeypatch.setattr(mesh.signal, "signal", lambda *_, **__: None)
monkeypatch.setattr(mesh, "SNAPSHOT_SECS", 0)
monkeypatch.setattr(mesh, "_RECONNECT_INITIAL_DELAY_SECS", 0)
monkeypatch.setattr(mesh, "_RECONNECT_MAX_DELAY_SECS", 0)
mesh.main()
assert len(interfaces) >= 2
assert interfaces[0].closed is True
assert upsert_calls == ["!node"]
def test_store_packet_dict_uses_top_level_channel(mesh_module, monkeypatch):
mesh = mesh_module
captured = []
@@ -914,7 +405,6 @@ def test_store_packet_dict_uses_top_level_channel(mesh_module, monkeypatch):
assert payload["channel"] == 5
assert payload["portnum"] == "1"
assert payload["text"] == "hi"
assert payload["encrypted"] is None
assert payload["snr"] is None and payload["rssi"] is None
assert priority == mesh._MESSAGE_POST_PRIORITY
@@ -945,41 +435,10 @@ def test_store_packet_dict_handles_invalid_channel(mesh_module, monkeypatch):
path, payload, priority = captured[0]
assert path == "/api/messages"
assert payload["channel"] == 0
assert payload["encrypted"] is None
assert priority == mesh._MESSAGE_POST_PRIORITY
def test_store_packet_dict_includes_encrypted_payload(mesh_module, monkeypatch):
mesh = mesh_module
captured = []
monkeypatch.setattr(
mesh,
"_queue_post_json",
lambda path, payload, *, priority: captured.append((path, payload, priority)),
)
packet = {
"id": 555,
"rxTime": 111,
"from": 2988082812,
"to": "!receiver",
"channel": 8,
"encrypted": "abc123==",
}
mesh.store_packet_dict(packet)
assert captured
path, payload, priority = captured[0]
assert path == "/api/messages"
assert payload["encrypted"] == "abc123=="
assert payload["text"] is None
assert payload["from_id"] == 2988082812
assert payload["to_id"] == "!receiver"
assert priority == mesh._MESSAGE_POST_PRIORITY
def test_post_queue_prioritises_messages(mesh_module, monkeypatch):
def test_post_queue_prioritises_nodes(mesh_module, monkeypatch):
mesh = mesh_module
mesh._clear_post_queue()
calls = []
@@ -996,7 +455,7 @@ def test_post_queue_prioritises_messages(mesh_module, monkeypatch):
mesh._drain_post_queue()
assert [path for path, _ in calls] == ["/api/messages", "/api/nodes"]
assert [path for path, _ in calls] == ["/api/nodes", "/api/messages"]
def test_store_packet_dict_requires_id(mesh_module, monkeypatch):
+34 -714
View File
@@ -40,15 +40,7 @@ MAX_JSON_BODY_BYTES = begin
rescue ArgumentError
DEFAULT_MAX_JSON_BODY_BYTES
end
VERSION_FALLBACK = "v0.3.0"
def fetch_config_string(key, default)
value = ENV[key]
return default if value.nil?
trimmed = value.strip
trimmed.empty? ? default : trimmed
end
VERSION_FALLBACK = "v0.2.1"
def determine_app_version
repo_root = File.expand_path("..", __dir__)
@@ -79,149 +71,15 @@ APP_VERSION = determine_app_version
set :public_folder, File.join(__dir__, "public")
set :views, File.join(__dir__, "views")
get "/favicon.ico" do
cache_control :public, max_age: WEEK_SECONDS
ico_path = File.join(settings.public_folder, "favicon.ico")
if File.file?(ico_path)
send_file ico_path, type: "image/x-icon"
else
send_file File.join(settings.public_folder, "potatomesh-logo.svg"), type: "image/svg+xml"
end
end
SITE_NAME = fetch_config_string("SITE_NAME", "Meshtastic Berlin")
DEFAULT_CHANNEL = fetch_config_string("DEFAULT_CHANNEL", "#MediumFast")
DEFAULT_FREQUENCY = fetch_config_string("DEFAULT_FREQUENCY", "868MHz")
SITE_NAME = ENV.fetch("SITE_NAME", "Meshtastic Berlin")
DEFAULT_CHANNEL = ENV.fetch("DEFAULT_CHANNEL", "#MediumFast")
DEFAULT_FREQUENCY = ENV.fetch("DEFAULT_FREQUENCY", "868MHz")
MAP_CENTER_LAT = ENV.fetch("MAP_CENTER_LAT", "52.502889").to_f
MAP_CENTER_LON = ENV.fetch("MAP_CENTER_LON", "13.404194").to_f
MAX_NODE_DISTANCE_KM = ENV.fetch("MAX_NODE_DISTANCE_KM", "137").to_f
MATRIX_ROOM = ENV.fetch("MATRIX_ROOM", "#meshtastic-berlin:matrix.org")
DEBUG = ENV["DEBUG"] == "1"
def sanitized_string(value)
value.to_s.strip
end
def sanitized_site_name
sanitized_string(SITE_NAME)
end
def sanitized_default_channel
sanitized_string(DEFAULT_CHANNEL)
end
def sanitized_default_frequency
sanitized_string(DEFAULT_FREQUENCY)
end
def sanitized_matrix_room
value = sanitized_string(MATRIX_ROOM)
value.empty? ? nil : value
end
def string_or_nil(value)
return nil if value.nil?
str = value.is_a?(String) ? value : value.to_s
trimmed = str.strip
trimmed.empty? ? nil : trimmed
end
def coerce_integer(value)
case value
when Integer
value
when Float
value.finite? ? value.to_i : nil
when Numeric
value.to_i
when String
trimmed = value.strip
return nil if trimmed.empty?
return trimmed.to_i(16) if trimmed.match?(/\A0[xX][0-9A-Fa-f]+\z/)
return trimmed.to_i(10) if trimmed.match?(/\A-?\d+\z/)
begin
float_val = Float(trimmed)
float_val.finite? ? float_val.to_i : nil
rescue ArgumentError
nil
end
else
nil
end
end
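The coercion rules above accept integers, finite floats, decimal strings, and `0x`-prefixed hex strings. A standalone sketch of those rules (a simplified restatement for illustration, not the app's own code):

```ruby
# Sketch of the integer-coercion contract: hex strings, decimal strings,
# and float-like strings all collapse to an Integer; anything else is nil.
def coerce_integer_sketch(value)
  case value
  when Integer then value
  when Float   then value.finite? ? value.to_i : nil
  when String
    trimmed = value.strip
    return nil if trimmed.empty?
    return trimmed.to_i(16) if trimmed.match?(/\A0[xX]\h+\z/)
    return trimmed.to_i(10) if trimmed.match?(/\A-?\d+\z/)
    begin
      f = Float(trimmed)
      f.finite? ? f.to_i : nil
    rescue ArgumentError
      nil
    end
  end
end

coerce_integer_sketch("0x1F")  # => 31
coerce_integer_sketch("12.9")  # => 12
coerce_integer_sketch("")      # => nil
```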
def coerce_float(value)
case value
when Float
value.finite? ? value : nil
when Integer
value.to_f
when Numeric
value.to_f
when String
trimmed = value.strip
return nil if trimmed.empty?
begin
float_val = Float(trimmed)
float_val.finite? ? float_val : nil
rescue ArgumentError
nil
end
else
nil
end
end
def sanitized_max_distance_km
return nil unless defined?(MAX_NODE_DISTANCE_KM)
distance = MAX_NODE_DISTANCE_KM
return nil unless distance.is_a?(Numeric)
return nil unless distance.positive?
distance
end
def formatted_distance_km(distance)
format("%.1f", distance).sub(/\.0\z/, "")
end
def meta_description
site = sanitized_site_name
channel = sanitized_default_channel
frequency = sanitized_default_frequency
matrix = sanitized_matrix_room
summary = "Live Meshtastic mesh map for #{site}"
if channel.empty? && frequency.empty?
summary += "."
elsif channel.empty?
summary += " tuned to #{frequency}."
elsif frequency.empty?
summary += " on #{channel}."
else
summary += " on #{channel} (#{frequency})."
end
sentences = [summary, "Track nodes, messages, and coverage in real time."]
if (distance = sanitized_max_distance_km)
sentences << "Shows nodes within roughly #{formatted_distance_km(distance)} km of the map center."
end
sentences << "Join the community in #{matrix} on Matrix." if matrix
sentences.join(" ")
end
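For the default configuration shipped above, the sentence assembly composes like this (a hedged sketch with the defaults hard-coded as plain arguments for illustration):

```ruby
# Sketch of the meta-description assembly: summary sentence, fixed tagline,
# optional distance sentence, optional Matrix invitation.
def meta_description_sketch(site, channel, frequency, distance, matrix)
  summary = "Live Meshtastic mesh map for #{site}"
  summary += if channel.empty? && frequency.empty? then "."
             elsif channel.empty? then " tuned to #{frequency}."
             elsif frequency.empty? then " on #{channel}."
             else " on #{channel} (#{frequency})."
             end
  sentences = [summary, "Track nodes, messages, and coverage in real time."]
  if distance && distance.positive?
    km = format("%.1f", distance).sub(/\.0\z/, "")
    sentences << "Shows nodes within roughly #{km} km of the map center."
  end
  sentences << "Join the community in #{matrix} on Matrix." if matrix
  sentences.join(" ")
end

puts meta_description_sketch("Meshtastic Berlin", "#MediumFast", "868MHz",
                             137.0, "#meshtastic-berlin:matrix.org")
```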
def meta_configuration
site = sanitized_site_name
{
title: site,
name: site,
description: meta_description,
}
end
class << Sinatra::Application
def apply_logger_level!
logger = settings.logger
@@ -274,9 +132,8 @@ end
def db_schema_present?
return false unless File.exist?(DB_PATH)
db = open_database(readonly: true)
required = %w[nodes messages positions]
tables = db.execute("SELECT name FROM sqlite_master WHERE type='table' AND name IN ('nodes','messages','positions')").flatten
(required - tables).empty?
tables = db.execute("SELECT name FROM sqlite_master WHERE type='table' AND name IN ('nodes','messages')").flatten
tables.include?("nodes") && tables.include?("messages")
rescue SQLite3::Exception
false
ensure
@@ -289,7 +146,7 @@ end
def init_db
FileUtils.mkdir_p(File.dirname(DB_PATH))
db = open_database
%w[nodes messages positions].each do |schema|
%w[nodes messages].each do |schema|
sql_file = File.expand_path("../data/#{schema}.sql", __dir__)
db.execute_batch(File.read(sql_file))
end
@@ -354,17 +211,16 @@ def query_messages(limit)
SELECT m.*, n.*, m.snr AS msg_snr
FROM messages m
LEFT JOIN nodes n ON (
m.from_id IS NOT NULL AND TRIM(m.from_id) <> '' AND (
m.from_id = n.node_id OR (
m.from_id GLOB '[0-9]*' AND CAST(m.from_id AS INTEGER) = n.num
)
m.from_id = n.node_id OR (
CAST(m.from_id AS TEXT) <> '' AND
CAST(m.from_id AS TEXT) GLOB '[0-9]*' AND
CAST(m.from_id AS INTEGER) = n.num
)
)
WHERE COALESCE(TRIM(m.encrypted), '') = ''
ORDER BY m.rx_time DESC
LIMIT ?
SQL
msg_fields = %w[id rx_time rx_iso from_id to_id channel portnum text encrypted msg_snr rssi hop_limit]
msg_fields = %w[id rx_time rx_iso from_id to_id channel portnum text msg_snr rssi hop_limit]
rows.each do |r|
if DEBUG && (r["from_id"].nil? || r["from_id"].to_s.empty?)
raw = db.execute("SELECT * FROM messages WHERE id = ?", [r["id"]]).first
@@ -377,8 +233,7 @@ def query_messages(limit)
node[k] = r.delete(k)
end
r["snr"] = r.delete("msg_snr")
references = [r["from_id"]].compact
if references.any? && (node["node_id"].nil? || node["node_id"].to_s.empty?)
if r["from_id"] && (node["node_id"].nil? || node["node_id"].to_s.empty?)
lookup_keys = []
canonical = normalize_node_id(db, r["from_id"])
lookup_keys << canonical if canonical
@@ -401,16 +256,6 @@ def query_messages(limit)
end
node["role"] = "CLIENT" if node.key?("role") && (node["role"].nil? || node["role"].to_s.empty?)
r["node"] = node
canonical_from_id = string_or_nil(node["node_id"]) || string_or_nil(normalize_node_id(db, r["from_id"]))
if canonical_from_id
raw_from_id = string_or_nil(r["from_id"])
if raw_from_id.nil? || raw_from_id.match?(/\A[0-9]+\z/)
r["from_id"] = canonical_from_id
elsif raw_from_id.start_with?("!") && raw_from_id.casecmp(canonical_from_id) != 0
r["from_id"] = canonical_from_id
end
end
if DEBUG && (r["from_id"].nil? || r["from_id"].to_s.empty?)
Kernel.warn "[debug] row after processing: #{r.inspect}"
end
@@ -420,40 +265,6 @@ ensure
db&.close
end
# Retrieve recorded position packets ordered by receive time.
#
# @param limit [Integer] maximum number of rows returned.
# @return [Array<Hash>] collection of position rows formatted for the API.
def query_positions(limit)
db = open_database(readonly: true)
db.results_as_hash = true
rows = db.execute <<~SQL, [limit]
SELECT id, node_id, node_num, rx_time, rx_iso, position_time,
to_id, latitude, longitude, altitude, location_source,
precision_bits, sats_in_view, pdop, ground_speed,
ground_track, snr, rssi, hop_limit, bitfield,
payload_b64
FROM positions
ORDER BY rx_time DESC
LIMIT ?
SQL
rows.each do |r|
pt = r["position_time"]
if pt
begin
r["position_time"] = Integer(pt, 10)
rescue ArgumentError, TypeError
r["position_time"] = coerce_integer(pt)
end
end
pt_val = r["position_time"]
r["position_time_iso"] = Time.at(pt_val).utc.iso8601 if pt_val
end
rows
ensure
db&.close
end
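The derived `position_time_iso` field above is why the "Require time library for ISO timestamp formatting" commit exists: `Time#iso8601` is only defined once the stdlib `time` library is loaded. A minimal illustration:

```ruby
require "time" # Time#iso8601 is added by the stdlib "time" library

# position_time is stored as a Unix epoch integer; the API emits a
# derived ISO-8601 string alongside it, as the row loop above does.
pt_val = 200
iso = Time.at(pt_val).utc.iso8601
puts iso # => "1970-01-01T00:03:20Z"
```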
# GET /api/messages
#
# Returns a JSON array of stored text messages including node metadata.
@@ -463,15 +274,6 @@ get "/api/messages" do
query_messages(limit).to_json
end
# GET /api/positions
#
# Returns a JSON array of recorded position packets.
get "/api/positions" do
content_type :json
limit = [params["limit"]&.to_i || 200, 1000].min
query_positions(limit).to_json
end
# Determine the numeric node reference for a canonical node identifier.
#
# The Meshtastic protobuf encodes the node ID as a hexadecimal string prefixed
@@ -514,117 +316,6 @@ rescue ArgumentError
nil
end
# Determine canonical node identifiers and derived metadata for a reference.
#
# @param node_ref [Object] raw node identifier or numeric reference.
# @param fallback_num [Object] optional numeric reference used when the
# identifier does not encode the value directly.
# @return [Array(String, Integer, String), nil] tuple containing the canonical
# node ID, numeric node reference, and uppercase short identifier suffix when
# the reference can be parsed. Returns nil when the reference cannot be
# converted into a canonical ID.
def canonical_node_parts(node_ref, fallback_num = nil)
fallback = coerce_integer(fallback_num)
hex = nil
num = nil
case node_ref
when Integer
num = node_ref
when Numeric
num = node_ref.to_i
when String
trimmed = node_ref.strip
return nil if trimmed.empty?
if trimmed.start_with?("!")
hex = trimmed.delete_prefix("!")
elsif trimmed.match?(/\A0[xX][0-9A-Fa-f]+\z/)
hex = trimmed[2..].to_s
elsif trimmed.match?(/\A-?\d+\z/)
num = trimmed.to_i
elsif trimmed.match?(/\A[0-9A-Fa-f]+\z/)
hex = trimmed
else
return nil
end
when nil
num = fallback if fallback
else
return nil
end
num ||= fallback if fallback
if hex
begin
num ||= Integer(hex, 16)
rescue ArgumentError
return nil
end
elsif num
return nil if num.negative?
hex = format("%08x", num & 0xFFFFFFFF)
else
return nil
end
return nil if hex.nil? || hex.empty?
begin
parsed = Integer(hex, 16)
rescue ArgumentError
return nil
end
parsed &= 0xFFFFFFFF
canonical_hex = format("%08x", parsed)
short_id = canonical_hex[-4, 4].upcase
["!#{canonical_hex}", parsed, short_id]
end
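The canonicalisation contract documented above — any node reference maps to a `["!%08x", num, last-four-hex-uppercase]` tuple — can be sketched standalone (an illustrative simplification that skips the fallback-number and negative-value handling of the real function):

```ruby
# Sketch: integer, "!hex", "0x..." hex, or decimal-string references all
# canonicalise to the same ID / number / short-ID tuple.
def canonical_parts_sketch(ref)
  num = case ref
        when Integer then ref
        when /\A!(\h+)\z/ then $1.to_i(16)
        when /\A0[xX](\h+)\z/ then $1.to_i(16)
        when /\A\d+\z/ then ref.to_i
        end
  return nil unless num
  masked = num & 0xFFFFFFFF
  hex = format("%08x", masked)
  ["!#{hex}", masked, hex[-4, 4].upcase]
end

p canonical_parts_sketch("!1234abcd") # => ["!1234abcd", 305441741, "ABCD"]
p canonical_parts_sketch(0x01020304)  # => ["!01020304", 16909060, "0304"]
```

The "ABCD" short ID here is the same value the RSpec placeholder-node expectations check for.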
# Ensure a placeholder node entry exists for the provided identifier.
#
# Messages and telemetry can reference nodes before the daemon has received a
# full node snapshot. When this happens we create a minimal hidden entry so the
# sender can be resolved in the UI until richer metadata becomes available.
#
# @param db [SQLite3::Database] open database handle.
# @param node_ref [Object] raw identifier extracted from the payload.
# @param fallback_num [Object] optional numeric reference used when the
# identifier is missing.
def ensure_unknown_node(db, node_ref, fallback_num = nil, heard_time: nil)
parts = canonical_node_parts(node_ref, fallback_num)
return unless parts
node_id, node_num, short_id = parts
existing = db.get_first_value(
"SELECT 1 FROM nodes WHERE node_id = ? LIMIT 1",
[node_id],
)
return if existing
long_name = "Meshtastic #{short_id}"
heard_time = coerce_integer(heard_time)
inserted = false
with_busy_retry do
db.execute(
<<~SQL,
INSERT OR IGNORE INTO nodes(node_id,num,short_name,long_name,role,last_heard,first_heard)
VALUES (?,?,?,?,?,?,?)
SQL
[node_id, node_num, short_id, long_name, "CLIENT_HIDDEN", heard_time, heard_time],
)
inserted = db.changes.positive?
end
inserted
end
# Insert or update a node row with the most recent metrics.
#
# @param db [SQLite3::Database] open database handle.
@@ -635,13 +326,12 @@ def upsert_node(db, node_id, n)
met = n["deviceMetrics"] || {}
pos = n["position"] || {}
role = user["role"] || "CLIENT"
lh = coerce_integer(n["lastHeard"])
pt = coerce_integer(pos["time"])
lh = n["lastHeard"]
pt = pos["time"]
now = Time.now.to_i
pt = nil if pt && pt > now
lh = now if lh && lh > now
lh = pt if pt && (!lh || lh < pt)
lh ||= now
bool = ->(v) {
case v
when true then 1
@@ -687,7 +377,6 @@ def upsert_node(db, node_id, n)
num=excluded.num, short_name=excluded.short_name, long_name=excluded.long_name, macaddr=excluded.macaddr,
hw_model=excluded.hw_model, role=excluded.role, public_key=excluded.public_key, is_unmessagable=excluded.is_unmessagable,
is_favorite=excluded.is_favorite, hops_away=excluded.hops_away, snr=excluded.snr, last_heard=excluded.last_heard,
first_heard=COALESCE(nodes.first_heard, excluded.first_heard, excluded.last_heard),
battery_level=excluded.battery_level, voltage=excluded.voltage, channel_utilization=excluded.channel_utilization,
air_util_tx=excluded.air_util_tx, uptime_seconds=excluded.uptime_seconds, position_time=excluded.position_time,
location_source=excluded.location_source, latitude=excluded.latitude, longitude=excluded.longitude,
@@ -751,272 +440,6 @@ def prefer_canonical_sender?(message)
message.is_a?(Hash) && message.key?("packet_id") && !message.key?("id")
end
# Update or create a node entry using information from a position payload.
#
# @param db [SQLite3::Database] open database handle.
# @param node_id [String, nil] canonical node identifier when available.
# @param node_num [Integer, nil] numeric node reference if known.
# @param rx_time [Integer] time the packet was received by the gateway.
# @param position_time [Integer, nil] timestamp reported by the device.
# @param location_source [String, nil] location source flag from the packet.
# @param latitude [Float, nil] reported latitude.
# @param longitude [Float, nil] reported longitude.
# @param altitude [Float, nil] reported altitude.
# @param snr [Float, nil] link SNR for the packet.
def update_node_from_position(db, node_id, node_num, rx_time, position_time, location_source, latitude, longitude, altitude, snr)
num = coerce_integer(node_num)
id = string_or_nil(node_id)
if id&.start_with?("!")
id = "!#{id.delete_prefix("!").downcase}"
end
id ||= format("!%08x", num & 0xFFFFFFFF) if num
return unless id
now = Time.now.to_i
rx = coerce_integer(rx_time) || now
rx = now if rx && rx > now
pos_time = coerce_integer(position_time)
pos_time = nil if pos_time && pos_time > now
last_heard = [rx, pos_time].compact.max || rx
last_heard = now if last_heard && last_heard > now
loc = string_or_nil(location_source)
lat = coerce_float(latitude)
lon = coerce_float(longitude)
alt = coerce_float(altitude)
snr_val = coerce_float(snr)
row = [
id,
num,
last_heard,
last_heard,
pos_time,
loc,
lat,
lon,
alt,
snr_val,
]
with_busy_retry do
db.execute <<~SQL, row
INSERT INTO nodes(node_id,num,last_heard,first_heard,position_time,location_source,latitude,longitude,altitude,snr)
VALUES (?,?,?,?,?,?,?,?,?,?)
ON CONFLICT(node_id) DO UPDATE SET
num=COALESCE(excluded.num,nodes.num),
snr=COALESCE(excluded.snr,nodes.snr),
last_heard=MAX(COALESCE(nodes.last_heard,0),COALESCE(excluded.last_heard,0)),
first_heard=COALESCE(nodes.first_heard, excluded.first_heard, excluded.last_heard),
position_time=CASE
WHEN COALESCE(excluded.position_time,0) >= COALESCE(nodes.position_time,0)
THEN excluded.position_time
ELSE nodes.position_time
END,
location_source=CASE
WHEN COALESCE(excluded.position_time,0) >= COALESCE(nodes.position_time,0)
AND excluded.location_source IS NOT NULL
THEN excluded.location_source
ELSE nodes.location_source
END,
latitude=CASE
WHEN COALESCE(excluded.position_time,0) >= COALESCE(nodes.position_time,0)
AND excluded.latitude IS NOT NULL
THEN excluded.latitude
ELSE nodes.latitude
END,
longitude=CASE
WHEN COALESCE(excluded.position_time,0) >= COALESCE(nodes.position_time,0)
AND excluded.longitude IS NOT NULL
THEN excluded.longitude
ELSE nodes.longitude
END,
altitude=CASE
WHEN COALESCE(excluded.position_time,0) >= COALESCE(nodes.position_time,0)
AND excluded.altitude IS NOT NULL
THEN excluded.altitude
ELSE nodes.altitude
END
SQL
end
end
# Insert a position packet into the history table and refresh node metadata.
#
# @param db [SQLite3::Database] open database handle.
# @param payload [Hash] position payload provided by the data daemon.
def insert_position(db, payload)
pos_id = coerce_integer(payload["id"] || payload["packet_id"])
return unless pos_id
now = Time.now.to_i
rx_time = coerce_integer(payload["rx_time"])
rx_time = now if rx_time.nil? || rx_time > now
rx_iso = string_or_nil(payload["rx_iso"])
rx_iso ||= Time.at(rx_time).utc.iso8601
raw_node_id = payload["node_id"] || payload["from_id"] || payload["from"]
node_id = string_or_nil(raw_node_id)
node_id = "!#{node_id.delete_prefix("!").downcase}" if node_id&.start_with?("!")
raw_node_num = coerce_integer(payload["node_num"]) || coerce_integer(payload["num"])
node_id ||= format("!%08x", raw_node_num & 0xFFFFFFFF) if node_id.nil? && raw_node_num
payload_for_num = payload.is_a?(Hash) ? payload.dup : {}
payload_for_num["num"] ||= raw_node_num if raw_node_num
node_num = resolve_node_num(node_id, payload_for_num)
node_num ||= raw_node_num
canonical = normalize_node_id(db, node_id || node_num)
node_id = canonical if canonical
ensure_unknown_node(db, node_id || node_num, node_num, heard_time: rx_time)
to_id = string_or_nil(payload["to_id"] || payload["to"])
position_section = payload["position"].is_a?(Hash) ? payload["position"] : {}
lat = coerce_float(payload["latitude"]) || coerce_float(position_section["latitude"])
lon = coerce_float(payload["longitude"]) || coerce_float(position_section["longitude"])
alt = coerce_float(payload["altitude"]) || coerce_float(position_section["altitude"])
lat ||= begin
lat_i = coerce_integer(position_section["latitudeI"] || position_section["latitude_i"] || position_section.dig("raw", "latitude_i"))
lat_i ? lat_i / 1e7 : nil
end
lon ||= begin
lon_i = coerce_integer(position_section["longitudeI"] || position_section["longitude_i"] || position_section.dig("raw", "longitude_i"))
lon_i ? lon_i / 1e7 : nil
end
alt ||= coerce_float(position_section.dig("raw", "altitude"))
position_time = coerce_integer(
payload["position_time"] ||
position_section["time"] ||
position_section.dig("raw", "time"),
)
location_source = string_or_nil(
payload["location_source"] ||
payload["locationSource"] ||
position_section["location_source"] ||
position_section["locationSource"] ||
position_section.dig("raw", "location_source"),
)
precision_bits = coerce_integer(
payload["precision_bits"] ||
payload["precisionBits"] ||
position_section["precision_bits"] ||
position_section["precisionBits"] ||
position_section.dig("raw", "precision_bits"),
)
sats_in_view = coerce_integer(
payload["sats_in_view"] ||
payload["satsInView"] ||
position_section["sats_in_view"] ||
position_section["satsInView"] ||
position_section.dig("raw", "sats_in_view"),
)
pdop = coerce_float(
payload["pdop"] ||
payload["PDOP"] ||
position_section["pdop"] ||
position_section["PDOP"] ||
position_section.dig("raw", "PDOP") ||
position_section.dig("raw", "pdop"),
)
ground_speed = coerce_float(
payload["ground_speed"] ||
payload["groundSpeed"] ||
position_section["ground_speed"] ||
position_section["groundSpeed"] ||
position_section.dig("raw", "ground_speed"),
)
ground_track = coerce_float(
payload["ground_track"] ||
payload["groundTrack"] ||
position_section["ground_track"] ||
position_section["groundTrack"] ||
position_section.dig("raw", "ground_track"),
)
snr = coerce_float(payload["snr"] || payload["rx_snr"] || payload["rxSnr"])
rssi = coerce_integer(payload["rssi"] || payload["rx_rssi"] || payload["rxRssi"])
hop_limit = coerce_integer(payload["hop_limit"] || payload["hopLimit"])
bitfield = coerce_integer(payload["bitfield"])
payload_b64 = string_or_nil(payload["payload_b64"] || payload["payload"])
payload_b64 ||= string_or_nil(position_section.dig("payload", "__bytes_b64__"))
row = [
pos_id,
node_id,
node_num,
rx_time,
rx_iso,
position_time,
to_id,
lat,
lon,
alt,
location_source,
precision_bits,
sats_in_view,
pdop,
ground_speed,
ground_track,
snr,
rssi,
hop_limit,
bitfield,
payload_b64,
]
with_busy_retry do
db.execute <<~SQL, row
INSERT INTO positions(id,node_id,node_num,rx_time,rx_iso,position_time,to_id,latitude,longitude,altitude,location_source,
precision_bits,sats_in_view,pdop,ground_speed,ground_track,snr,rssi,hop_limit,bitfield,payload_b64)
VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)
ON CONFLICT(id) DO UPDATE SET
node_id=COALESCE(excluded.node_id,positions.node_id),
node_num=COALESCE(excluded.node_num,positions.node_num),
rx_time=excluded.rx_time,
rx_iso=excluded.rx_iso,
position_time=COALESCE(excluded.position_time,positions.position_time),
to_id=COALESCE(excluded.to_id,positions.to_id),
latitude=COALESCE(excluded.latitude,positions.latitude),
longitude=COALESCE(excluded.longitude,positions.longitude),
altitude=COALESCE(excluded.altitude,positions.altitude),
location_source=COALESCE(excluded.location_source,positions.location_source),
precision_bits=COALESCE(excluded.precision_bits,positions.precision_bits),
sats_in_view=COALESCE(excluded.sats_in_view,positions.sats_in_view),
pdop=COALESCE(excluded.pdop,positions.pdop),
ground_speed=COALESCE(excluded.ground_speed,positions.ground_speed),
ground_track=COALESCE(excluded.ground_track,positions.ground_track),
snr=COALESCE(excluded.snr,positions.snr),
rssi=COALESCE(excluded.rssi,positions.rssi),
hop_limit=COALESCE(excluded.hop_limit,positions.hop_limit),
bitfield=COALESCE(excluded.bitfield,positions.bitfield),
payload_b64=COALESCE(excluded.payload_b64,positions.payload_b64)
SQL
end
update_node_from_position(
db,
node_id,
node_num,
rx_time,
position_time,
location_source,
lat,
lon,
alt,
snr,
)
end
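The `latitudeI` / `longitudeI` fallbacks above exist because Meshtastic position packets carry coordinates as fixed-point integers (degrees scaled by 1e7); dividing by `1e7` recovers the decimal degrees, as in:

```ruby
# Fixed-point protobuf coordinates -> decimal degrees (the / 1e7 fallback
# above). These sample values match the app's default map center.
lat_i = 525_028_890
lon_i = 134_041_940
lat = lat_i / 1e7 # 52.502889
lon = lon_i / 1e7 # 13.404194
puts "#{lat}, #{lon}"
```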
# Insert a text message if it does not already exist.
#
# @param db [SQLite3::Database] open database handle.
@@ -1024,112 +447,54 @@ end
def insert_message(db, m)
msg_id = m["id"] || m["packet_id"]
return unless msg_id
rx_time = m["rx_time"]&.to_i || Time.now.to_i
rx_iso = m["rx_iso"] || Time.at(rx_time).utc.iso8601
raw_from_id = m["from_id"]
if raw_from_id.nil? || raw_from_id.to_s.strip.empty?
alt_from = m["from"]
raw_from_id = alt_from unless alt_from.nil? || alt_from.to_s.strip.empty?
end
trimmed_from_id = string_or_nil(raw_from_id)
canonical_from_id = string_or_nil(normalize_node_id(db, raw_from_id))
from_id = trimmed_from_id
if canonical_from_id
if from_id.nil?
from_id = canonical_from_id
elsif prefer_canonical_sender?(m)
from_id = canonical_from_id
elsif from_id.start_with?("!") && from_id.casecmp(canonical_from_id) != 0
from_id = canonical_from_id
trimmed_from_id = raw_from_id.nil? ? nil : raw_from_id.to_s.strip
trimmed_from_id = nil if trimmed_from_id&.empty?
canonical_from_id = normalize_node_id(db, raw_from_id)
use_canonical = canonical_from_id && (trimmed_from_id.nil? || prefer_canonical_sender?(m))
from_id = if use_canonical
canonical_from_id.to_s.strip
else
trimmed_from_id
end
end
raw_to_id = m["to_id"]
raw_to_id = m["to"] if raw_to_id.nil? || raw_to_id.to_s.strip.empty?
trimmed_to_id = string_or_nil(raw_to_id)
canonical_to_id = string_or_nil(normalize_node_id(db, raw_to_id))
to_id = trimmed_to_id
if canonical_to_id
if to_id.nil?
to_id = canonical_to_id
elsif to_id.start_with?("!") && to_id.casecmp(canonical_to_id) != 0
to_id = canonical_to_id
end
end
encrypted = string_or_nil(m["encrypted"])
ensure_unknown_node(db, from_id || raw_from_id, m["from_num"], heard_time: rx_time)
from_id = nil if from_id&.empty?
row = [
msg_id,
rx_time,
rx_iso,
from_id,
to_id,
m["to_id"],
m["channel"],
m["portnum"],
m["text"],
encrypted,
m["snr"],
m["rssi"],
m["hop_limit"],
]
with_busy_retry do
existing = db.get_first_row(
"SELECT from_id, to_id, encrypted FROM messages WHERE id = ?",
[msg_id],
)
existing = db.get_first_row("SELECT from_id FROM messages WHERE id = ?", [msg_id])
if existing
updates = {}
if from_id
existing_from = existing.is_a?(Hash) ? existing["from_id"] : existing[0]
existing_from_str = existing_from&.to_s
should_update = existing_from_str.nil? || existing_from_str.strip.empty?
should_update ||= existing_from != from_id
updates["from_id"] = from_id if should_update
end
if to_id
existing_to = existing.is_a?(Hash) ? existing["to_id"] : existing[1]
existing_to_str = existing_to&.to_s
should_update = existing_to_str.nil? || existing_to_str.strip.empty?
should_update ||= existing_to != to_id
updates["to_id"] = to_id if should_update
end
if encrypted
existing_encrypted = existing.is_a?(Hash) ? existing["encrypted"] : existing[2]
existing_encrypted_str = existing_encrypted&.to_s
should_update = existing_encrypted_str.nil? || existing_encrypted_str.strip.empty?
should_update ||= existing_encrypted != encrypted
updates["encrypted"] = encrypted if should_update
end
unless updates.empty?
assignments = updates.keys.map { |column| "#{column} = ?" }.join(", ")
db.execute("UPDATE messages SET #{assignments} WHERE id = ?", updates.values + [msg_id])
db.execute("UPDATE messages SET from_id = ? WHERE id = ?", [from_id, msg_id]) if should_update
end
else
begin
db.execute <<~SQL, row
INSERT INTO messages(id,rx_time,rx_iso,from_id,to_id,channel,portnum,text,encrypted,snr,rssi,hop_limit)
VALUES (?,?,?,?,?,?,?,?,?,?,?,?)
INSERT INTO messages(id,rx_time,rx_iso,from_id,to_id,channel,portnum,text,snr,rssi,hop_limit)
VALUES (?,?,?,?,?,?,?,?,?,?,?)
SQL
rescue SQLite3::ConstraintException
fallback_updates = {}
fallback_updates["from_id"] = from_id if from_id
fallback_updates["to_id"] = to_id if to_id
fallback_updates["encrypted"] = encrypted if encrypted
unless fallback_updates.empty?
assignments = fallback_updates.keys.map { |column| "#{column} = ?" }.join(", ")
db.execute("UPDATE messages SET #{assignments} WHERE id = ?", fallback_updates.values + [msg_id])
end
db.execute("UPDATE messages SET from_id = ? WHERE id = ?", [from_id, msg_id]) if from_id
end
end
end
@@ -1200,63 +565,18 @@ ensure
db&.close
end
# POST /api/positions
#
# Accepts an array or object describing position packets and stores each entry.
post "/api/positions" do
require_token!
content_type :json
begin
data = JSON.parse(read_json_body)
rescue JSON::ParserError
halt 400, { error: "invalid JSON" }.to_json
end
positions = data.is_a?(Array) ? data : [data]
halt 400, { error: "too many positions" }.to_json if positions.size > 1000
db = open_database
positions.each do |pos|
insert_position(db, pos)
end
{ status: "ok" }.to_json
ensure
db&.close
end
get "/potatomesh-logo.svg" do
# Sinatra resolves the app root via settings.root (usually the directory of app.rb)
path = File.expand_path("potatomesh-logo.svg", settings.public_folder)
# Debug output to the log (visible via docker logs)
settings.logger&.info("logo_path=#{path} exist=#{File.exist?(path)} file=#{File.file?(path)}")
halt 404, "Not Found" unless File.exist?(path) && File.readable?(path)
content_type "image/svg+xml"
last_modified File.mtime(path)
cache_control :public, max_age: 3600
send_file path
end
# GET /
#
# Renders the main site with configuration-driven defaults for the template.
get "/" do
meta = meta_configuration
response.set_cookie("theme", value: "dark", path: "/", max_age: 60 * 60 * 24 * 7, same_site: :lax) unless request.cookies["theme"]
erb :index, locals: {
site_name: meta[:name],
meta_title: meta[:title],
meta_name: meta[:name],
meta_description: meta[:description],
default_channel: sanitized_default_channel,
default_frequency: sanitized_default_frequency,
site_name: SITE_NAME,
default_channel: DEFAULT_CHANNEL,
default_frequency: DEFAULT_FREQUENCY,
map_center_lat: MAP_CENTER_LAT,
map_center_lon: MAP_CENTER_LON,
max_node_distance_km: MAX_NODE_DISTANCE_KM,
matrix_room: sanitized_matrix_room,
matrix_room: MATRIX_ROOM,
version: APP_VERSION,
}
end
Binary file not shown (image; 4.2 KiB before)
+3 -544
View File
@@ -38,7 +38,6 @@ RSpec.describe "Potato Mesh Sinatra app" do
with_db do |db|
db.execute("DELETE FROM messages")
db.execute("DELETE FROM nodes")
db.execute("DELETE FROM positions")
end
end
@@ -190,23 +189,6 @@ RSpec.describe "Potato Mesh Sinatra app" do
get "/"
expect(last_response.body).to include("#{APP_VERSION}")
end
it "includes SEO metadata from configuration" do
stub_const("SITE_NAME", "Spec Mesh Title")
stub_const("DEFAULT_CHANNEL", "#SpecChannel")
stub_const("DEFAULT_FREQUENCY", "915MHz")
stub_const("MAX_NODE_DISTANCE_KM", 120.5)
stub_const("MATRIX_ROOM", " #spec-room:example.org ")
expected_description = "Live Meshtastic mesh map for Spec Mesh Title on #SpecChannel (915MHz). Track nodes, messages, and coverage in real time. Shows nodes within roughly 120.5 km of the map center. Join the community in #spec-room:example.org on Matrix."
get "/"
expect(last_response.body).to include(%(meta name="description" content="#{expected_description}" />))
expect(last_response.body).to include('<meta property="og:title" content="Spec Mesh Title" />')
expect(last_response.body).to include('<meta property="og:site_name" content="Spec Mesh Title" />')
expect(last_response.body).to include('<meta name="twitter:image" content="http://example.org/potatomesh-logo.svg" />')
end
end
describe "database initialization" do
@@ -321,65 +303,6 @@ RSpec.describe "Potato Mesh Sinatra app" do
expect(JSON.parse(last_response.body)).to eq("error" => "invalid JSON")
end
it "updates timestamps when the payload omits lastHeard" do
node_id = "!spectime01"
payload = {
node_id => {
"user" => { "shortName" => "Spec Time" },
},
}
post "/api/nodes", payload.to_json, auth_headers
expect(last_response).to be_ok
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row(
"SELECT last_heard, first_heard FROM nodes WHERE node_id = ?",
[node_id],
)
expect(row["last_heard"]).to eq(reference_time.to_i)
expect(row["first_heard"]).to eq(reference_time.to_i)
end
end
it "preserves the original first_heard when updating nodes" do
node_id = "!spectime02"
initial_first = reference_time.to_i - 600
initial_last = reference_time.to_i - 300
with_db do |db|
db.execute(
"INSERT INTO nodes(node_id, last_heard, first_heard) VALUES (?,?,?)",
[node_id, initial_last, initial_first],
)
end
payload = {
node_id => {
"user" => { "shortName" => "Spec Update" },
"lastHeard" => reference_time.to_i,
},
}
post "/api/nodes", payload.to_json, auth_headers
expect(last_response).to be_ok
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row(
"SELECT last_heard, first_heard FROM nodes WHERE node_id = ?",
[node_id],
)
expect(row["last_heard"]).to eq(reference_time.to_i)
expect(row["first_heard"]).to eq(initial_first)
end
end
it "returns 400 when more than 1000 nodes are provided" do
payload = (0..1000).each_with_object({}) do |i, acc|
acc["node-#{i}"] = {}
@@ -473,62 +396,6 @@ RSpec.describe "Potato Mesh Sinatra app" do
end
end
describe "#ensure_unknown_node" do
it "creates a hidden placeholder with timestamps for chat notifications" do
with_db do |db|
created = ensure_unknown_node(db, "!1234abcd", nil, heard_time: reference_time.to_i)
expect(created).to be_truthy
end
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row(
<<~SQL,
SELECT short_name, long_name, role, last_heard, first_heard
FROM nodes
WHERE node_id = ?
SQL
["!1234abcd"],
)
expect(row["short_name"]).to eq("ABCD")
expect(row["long_name"]).to eq("Meshtastic ABCD")
expect(row["role"]).to eq("CLIENT_HIDDEN")
expect(row["last_heard"]).to eq(reference_time.to_i)
expect(row["first_heard"]).to eq(reference_time.to_i)
end
end
it "leaves timestamps nil when no receive time is provided" do
with_db do |db|
created = ensure_unknown_node(db, "!1111beef", nil)
expect(created).to be_truthy
end
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row(
<<~SQL,
SELECT last_heard, first_heard
FROM nodes
WHERE node_id = ?
SQL
["!1111beef"],
)
expect(row["last_heard"]).to be_nil
expect(row["first_heard"]).to be_nil
end
end
it "returns false when the node already exists" do
with_db do |db|
expect(ensure_unknown_node(db, "!0000c0de", nil)).to be_truthy
expect(ensure_unknown_node(db, "!0000c0de", nil)).to be_falsey
end
end
end
describe "POST /api/messages" do
it "persists messages from fixture data" do
import_nodes_fixture
@@ -565,41 +432,6 @@ RSpec.describe "Potato Mesh Sinatra app" do
end
end
it "creates hidden nodes for unknown message senders" do
payload = {
"id" => 9_999,
"rx_time" => reference_time.to_i,
"rx_iso" => reference_time.iso8601,
"from_id" => "!feedf00d",
"to_id" => "^all",
"channel" => 0,
"portnum" => "TEXT_MESSAGE_APP",
"text" => "Spec placeholder message",
}
post "/api/messages", payload.to_json, auth_headers
expect(last_response).to be_ok
expect(JSON.parse(last_response.body)).to eq("status" => "ok")
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row(
"SELECT node_id, num, short_name, long_name, role, last_heard, first_heard FROM nodes WHERE node_id = ?",
["!feedf00d"],
)
expect(row).not_to be_nil
expect(row["node_id"]).to eq("!feedf00d")
expect(row["num"]).to eq(0xfeedf00d)
expect(row["short_name"]).to eq("F00D")
expect(row["long_name"]).to eq("Meshtastic F00D")
expect(row["role"]).to eq("CLIENT_HIDDEN")
expect(row["last_heard"]).to eq(payload["rx_time"])
expect(row["first_heard"]).to eq(payload["rx_time"])
end
end
it "returns 400 when the payload is not valid JSON" do
post "/api/messages", "{", auth_headers
@@ -624,225 +456,6 @@ RSpec.describe "Potato Mesh Sinatra app" do
end
end
describe "POST /api/positions" do
it "stores position packets and updates node metadata" do
node_id = "!specpos01"
node_num = 0x1234_5678
initial_last_heard = reference_time.to_i - 600
node_payload = {
node_id => {
"num" => node_num,
"user" => { "shortName" => "SpecPos" },
"lastHeard" => initial_last_heard,
"position" => {
"time" => initial_last_heard - 60,
"latitude" => 52.0,
"longitude" => 13.0,
},
},
}
post "/api/nodes", node_payload.to_json, auth_headers
expect(last_response).to be_ok
rx_time = reference_time.to_i - 120
position_time = rx_time - 30
raw_payload = { "time" => position_time, "latitude_i" => (52.5 * 1e7).to_i }
position_payload = {
"id" => 9_001,
"node_id" => node_id,
"node_num" => node_num,
"rx_time" => rx_time,
"rx_iso" => Time.at(rx_time).utc.iso8601,
"to_id" => "^all",
"latitude" => 52.5,
"longitude" => 13.4,
"altitude" => 42.0,
"position_time" => position_time,
"location_source" => "LOC_INTERNAL",
"precision_bits" => 15,
"sats_in_view" => 6,
"pdop" => 2.5,
"ground_speed" => 3.2,
"ground_track" => 180.0,
"snr" => -8.5,
"rssi" => -90,
"hop_limit" => 3,
"bitfield" => 1,
"payload_b64" => "AQI=",
"raw" => raw_payload,
}
post "/api/positions", position_payload.to_json, auth_headers
expect(last_response).to be_ok
expect(JSON.parse(last_response.body)).to eq("status" => "ok")
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row("SELECT * FROM positions WHERE id = ?", [9_001])
expect(row["node_id"]).to eq(node_id)
expect(row["node_num"]).to eq(node_num)
expect(row["rx_time"]).to eq(rx_time)
expect(row["rx_iso"]).to eq(Time.at(rx_time).utc.iso8601)
expect(row["position_time"]).to eq(position_time)
expect_same_value(row["latitude"], 52.5)
expect_same_value(row["longitude"], 13.4)
expect_same_value(row["altitude"], 42.0)
expect(row["location_source"]).to eq("LOC_INTERNAL")
expect(row["precision_bits"]).to eq(15)
expect(row["sats_in_view"]).to eq(6)
expect_same_value(row["pdop"], 2.5)
expect_same_value(row["ground_speed"], 3.2)
expect_same_value(row["ground_track"], 180.0)
expect_same_value(row["snr"], -8.5)
expect(row["rssi"]).to eq(-90)
expect(row["hop_limit"]).to eq(3)
expect(row["bitfield"]).to eq(1)
expect(row["payload_b64"]).to eq("AQI=")
end
with_db(readonly: true) do |db|
db.results_as_hash = true
node_row = db.get_first_row(
"SELECT last_heard, position_time, latitude, longitude, altitude, location_source, snr FROM nodes WHERE node_id = ?",
[node_id],
)
expect(node_row["last_heard"]).to eq(rx_time)
expect(node_row["position_time"]).to eq(position_time)
expect_same_value(node_row["latitude"], 52.5)
expect_same_value(node_row["longitude"], 13.4)
expect_same_value(node_row["altitude"], 42.0)
expect(node_row["location_source"]).to eq("LOC_INTERNAL")
expect_same_value(node_row["snr"], -8.5)
end
end
it "creates node records when none exist" do
node_id = "!specnew01"
node_num = 0xfeed_cafe
rx_time = reference_time.to_i - 60
position_time = rx_time - 10
payload = {
"id" => 9_002,
"node_id" => node_id,
"node_num" => node_num,
"rx_time" => rx_time,
"rx_iso" => Time.at(rx_time).utc.iso8601,
"latitude" => 52.1,
"longitude" => 13.1,
"altitude" => 33.0,
"position_time" => position_time,
"location_source" => "LOC_EXTERNAL",
}
post "/api/positions", payload.to_json, auth_headers
expect(last_response).to be_ok
with_db(readonly: true) do |db|
db.results_as_hash = true
node_row = db.get_first_row("SELECT * FROM nodes WHERE node_id = ?", [node_id])
expect(node_row).not_to be_nil
expect(node_row["num"]).to eq(node_num)
expect(node_row["last_heard"]).to eq(rx_time)
expect(node_row["first_heard"]).to eq(rx_time)
expect(node_row["position_time"]).to eq(position_time)
expect_same_value(node_row["latitude"], 52.1)
expect_same_value(node_row["longitude"], 13.1)
expect_same_value(node_row["altitude"], 33.0)
expect(node_row["location_source"]).to eq("LOC_EXTERNAL")
end
end
it "creates hidden nodes for unknown position senders" do
payload = {
"id" => 42,
"node_id" => "!0badc0de",
"rx_time" => reference_time.to_i,
"rx_iso" => reference_time.iso8601,
"latitude" => 52.1,
"longitude" => 13.1,
}
post "/api/positions", payload.to_json, auth_headers
expect(last_response).to be_ok
expect(JSON.parse(last_response.body)).to eq("status" => "ok")
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row(
"SELECT node_id, num, short_name, long_name, role FROM nodes WHERE node_id = ?",
["!0badc0de"],
)
expect(row).not_to be_nil
expect(row["node_id"]).to eq("!0badc0de")
expect(row["num"]).to eq(0x0badc0de)
expect(row["short_name"]).to eq("C0DE")
expect(row["long_name"]).to eq("Meshtastic C0DE")
expect(row["role"]).to eq("CLIENT_HIDDEN")
end
end
it "fills first_heard when updating an existing node without one" do
node_id = "!specposfh"
rx_time = reference_time.to_i - 90
with_db do |db|
db.execute(
"INSERT INTO nodes(node_id, last_heard, first_heard) VALUES (?,?,?)",
[node_id, nil, nil],
)
end
payload = {
"id" => 51,
"node_id" => node_id,
"rx_time" => rx_time,
"latitude" => 51.5,
"longitude" => -0.12,
}
post "/api/positions", payload.to_json, auth_headers
expect(last_response).to be_ok
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row(
"SELECT last_heard, first_heard FROM nodes WHERE node_id = ?",
[node_id],
)
expect(row["last_heard"]).to eq(rx_time)
expect(row["first_heard"]).to eq(rx_time)
end
end
it "returns 400 when the payload is not valid JSON" do
post "/api/positions", "{", auth_headers
expect(last_response.status).to eq(400)
expect(JSON.parse(last_response.body)).to eq("error" => "invalid JSON")
end
it "returns 400 when more than 1000 positions are provided" do
payload = Array.new(1001) { |i| { "id" => i + 1, "rx_time" => reference_time.to_i - i } }
post "/api/positions", payload.to_json, auth_headers
expect(last_response.status).to eq(400)
expect(JSON.parse(last_response.body)).to eq("error" => "too many positions")
with_db(readonly: true) do |db|
count = db.get_first_value("SELECT COUNT(*) FROM positions")
expect(count).to eq(0)
end
end
end
it "returns 400 when more than 1000 messages are provided" do
payload = Array.new(1001) { |i| { "packet_id" => i + 1 } }
@@ -894,9 +507,7 @@ RSpec.describe "Potato Mesh Sinatra app" do
with_db(readonly: true) do |db|
db.results_as_hash = true
rows = db.execute(
"SELECT id, from_id, to_id, rx_time, rx_iso, text, encrypted FROM messages ORDER BY id",
)
rows = db.execute("SELECT id, from_id, rx_time, rx_iso, text FROM messages ORDER BY id")
expect(rows.size).to eq(2)
@@ -904,116 +515,18 @@ RSpec.describe "Potato Mesh Sinatra app" do
expect(first["id"]).to eq(101)
expect(first["from_id"]).to eq(node_id)
expect(first).not_to have_key("from_node_id")
expect(first).not_to have_key("from_node_num")
expect(first["rx_time"]).to eq(reference_time.to_i)
expect(first["rx_iso"]).to eq(reference_time.utc.iso8601)
expect(first["text"]).to eq("normalized")
expect(first).not_to have_key("to_node_id")
expect(first).not_to have_key("to_node_num")
expect(first["encrypted"]).to be_nil
expect(second["id"]).to eq(102)
expect(second["from_id"]).to be_nil
expect(second).not_to have_key("from_node_id")
expect(second).not_to have_key("from_node_num")
expect(second["rx_time"]).to eq(reference_time.to_i)
expect(second["rx_iso"]).to eq(reference_time.utc.iso8601)
expect(second["text"]).to eq("blank")
expect(second).not_to have_key("to_node_id")
expect(second).not_to have_key("to_node_num")
expect(second["encrypted"]).to be_nil
end
end
it "stores encrypted messages and resolves node references" do
sender_id = "!feedc0de"
sender_num = 0xfeedc0de
receiver_id = "!c0ffee99"
receiver_num = 0xc0ffee99
sender_node = {
"node_id" => sender_id,
"short_name" => "EncS",
"long_name" => "Encrypted Sender",
"hw_model" => "TEST",
"role" => "CLIENT",
"snr" => 5.5,
"battery_level" => 80.0,
"voltage" => 3.9,
"last_heard" => reference_time.to_i - 30,
"position_time" => reference_time.to_i - 60,
"latitude" => 52.1,
"longitude" => 13.1,
"altitude" => 42.0,
}
sender_payload = build_node_payload(sender_node)
sender_payload["num"] = sender_num
receiver_node = {
"node_id" => receiver_id,
"short_name" => "EncR",
"long_name" => "Encrypted Receiver",
"hw_model" => "TEST",
"role" => "CLIENT",
"snr" => 4.25,
"battery_level" => 75.0,
"voltage" => 3.8,
"last_heard" => reference_time.to_i - 40,
"position_time" => reference_time.to_i - 70,
"latitude" => 52.2,
"longitude" => 13.2,
"altitude" => 35.0,
}
receiver_payload = build_node_payload(receiver_node)
receiver_payload["num"] = receiver_num
post "/api/nodes", { sender_id => sender_payload }.to_json, auth_headers
expect(last_response).to be_ok
post "/api/nodes", { receiver_id => receiver_payload }.to_json, auth_headers
expect(last_response).to be_ok
encrypted_b64 = Base64.strict_encode64("secret message")
payload = {
"packet_id" => 777_001,
"rx_time" => reference_time.to_i,
"rx_iso" => reference_time.utc.iso8601,
"from_id" => sender_num.to_s,
"to_id" => receiver_id,
"channel" => 8,
"portnum" => "TEXT_MESSAGE_APP",
"encrypted" => encrypted_b64,
"snr" => -12.5,
"rssi" => -109,
"hop_limit" => 3,
}
post "/api/messages", payload.to_json, auth_headers
expect(last_response).to be_ok
expect(JSON.parse(last_response.body)).to eq("status" => "ok")
with_db(readonly: true) do |db|
db.results_as_hash = true
row = db.get_first_row(
"SELECT from_id, to_id, text, encrypted FROM messages WHERE id = ?",
[777_001],
)
expect(row["from_id"]).to eq(sender_id)
expect(row["to_id"]).to eq(receiver_id)
expect(row["text"]).to be_nil
expect(row["encrypted"]).to eq(encrypted_b64)
end
get "/api/messages"
expect(last_response).to be_ok
messages = JSON.parse(last_response.body)
expect(messages).to be_an(Array)
expect(messages).to be_empty
end
it "stores messages containing SQL control characters without executing them" do
payload = {
"packet_id" => 404,
@@ -1217,28 +730,11 @@ RSpec.describe "Potato Mesh Sinatra app" do
expect(actual_row["rx_time"]).to eq(expected["rx_time"])
expect(actual_row["rx_iso"]).to eq(expected["rx_iso"])
expected_from_id = expected["from_id"]
if expected_from_id.is_a?(String) && expected_from_id.match?(/\A[0-9]+\z/)
expected_from_id = node_aliases[expected_from_id] || expected_from_id
elsif expected_from_id.nil?
expected_from_id = message.dig("node", "node_id")
end
expect(actual_row["from_id"]).to eq(expected_from_id)
expect(actual_row).not_to have_key("from_node_id")
expect(actual_row).not_to have_key("from_node_num")
expected_to_id = expected["to_id"]
if expected_to_id.is_a?(String) && expected_to_id.match?(/\A[0-9]+\z/)
expected_to_id = node_aliases[expected_to_id] || expected_to_id
end
expect(actual_row["to_id"]).to eq(expected_to_id)
expect(actual_row).not_to have_key("to_node_id")
expect(actual_row).not_to have_key("to_node_num")
expect(actual_row["from_id"]).to eq(expected["from_id"])
expect(actual_row["to_id"]).to eq(expected["to_id"])
expect(actual_row["channel"]).to eq(expected["channel"])
expect(actual_row["portnum"]).to eq(expected["portnum"])
expect(actual_row["text"]).to eq(expected["text"])
expect(actual_row["encrypted"]).to eq(expected["encrypted"])
expect_same_value(actual_row["snr"], expected["snr"])
expect(actual_row["rssi"]).to eq(expected["rssi"])
expect(actual_row["hop_limit"]).to eq(expected["hop_limit"])
@@ -1310,41 +806,4 @@ RSpec.describe "Potato Mesh Sinatra app" do
end
end
end
describe "GET /api/positions" do
it "returns stored positions ordered by receive time" do
node_id = "!specfetch"
rx_times = [reference_time.to_i - 50, reference_time.to_i - 10]
rx_times.each_with_index do |rx_time, idx|
payload = {
"id" => 20_000 + idx,
"node_id" => node_id,
"rx_time" => rx_time,
"rx_iso" => Time.at(rx_time).utc.iso8601,
"position_time" => rx_time - 5,
"latitude" => 52.0 + idx,
"longitude" => 13.0 + idx,
"payload_b64" => "AQI=",
}
post "/api/positions", payload.to_json, auth_headers
expect(last_response).to be_ok
end
get "/api/positions?limit=1"
expect(last_response).to be_ok
data = JSON.parse(last_response.body)
expect(data.length).to eq(1)
entry = data.first
expect(entry["id"]).to eq(20_001)
expect(entry["node_id"]).to eq(node_id)
expect(entry["rx_time"]).to eq(rx_times.last)
expect(entry["rx_iso"]).to eq(Time.at(rx_times.last).utc.iso8601)
expect(entry["position_time"]).to eq(rx_times.last - 5)
expect(entry["position_time_iso"]).to eq(Time.at(rx_times.last - 5).utc.iso8601)
expect(entry["latitude"]).to eq(53.0)
expect(entry["longitude"]).to eq(14.0)
expect(entry["payload_b64"]).to eq("AQI=")
end
end
end
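The placeholder-node specs above all assert the same derivation: the short name is the last four hex digits of the node ID, uppercased; the long name prefixes that with "Meshtastic"; the numeric ID is the hex value of the ID itself; and the role is `CLIENT_HIDDEN`. A minimal sketch of that derivation (the helper name `placeholder_fields` is hypothetical — the app's real implementation may be structured differently):

```ruby
# Hypothetical sketch of the placeholder fields the specs above assert on.
# Derives all node metadata from a "!hex" node ID such as "!feedf00d".
def placeholder_fields(node_id)
  hex = node_id.delete_prefix("!")
  short = hex[-4, 4].upcase        # "!feedf00d" -> "F00D"
  {
    "num" => hex.to_i(16),         # full hex value, e.g. 0xfeedf00d
    "short_name" => short,
    "long_name" => "Meshtastic #{short}",
    "role" => "CLIENT_HIDDEN",
  }
end
```

This matches both fixtures exercised above: `"!1234abcd"` yields short name `"ABCD"` and long name `"Meshtastic ABCD"`, and `"!feedf00d"` yields `"F00D"` / `"Meshtastic F00D"` with `num == 0xfeedf00d`.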
+26 -312
@@ -18,138 +18,9 @@
<html lang="en">
<head>
<style>
:root {
--bg: #f6f3ee;
--bg2: #ffffff;
--fg: #0c0f12;
--muted: #5c6773;
--card: rgba(0,0,0,0.03);
--line: rgba(12,15,18,0.08);
--accent: #2b6cb0;
--row-alt: rgba(0,0,0,0.02);
--table-head-bg: rgba(0,0,0,0.06);
--table-head-fg: var(--fg);
}
body.dark {
--bg: #0e1418;
--bg2: #0e141b;
--fg: #e6ebf0;
--muted: #9aa7b4;
--card: rgba(255,255,255,0.04);
--line: rgba(255,255,255,0.10);
--accent: #5fa8ff;
--row-alt: rgba(255,255,255,0.05);
--table-head-bg: rgba(255,255,255,0.06);
--table-head-fg: var(--fg);
}
html, body {
background-color: var(--bg);
color: var(--fg);
background-image: var(--bg-image, none);
background-size: cover;
background-attachment: fixed;
transition: background-color 160ms ease, color 160ms ease;
}
a { color: var(--accent); }
hr { border-color: var(--line); }
.card, .panel, .box {
background: var(--card);
backdrop-filter: blur(2px);
border: 1px solid var(--line);
border-radius: 10px;
}
table { border-collapse: collapse; width: 100%; border: 1px solid
var(--line); }
thead th {
background: var(--table-head-bg);
color: var(--table-head-fg);
text-align: left;
border-bottom: 1px solid var(--line);
padding: 8px;
}
tbody td { padding: 8px; border-bottom: 1px solid var(--line); }
tbody tr:nth-child(even) td { background: var(--row-alt); }
.leaflet-container { background: transparent !important; color:
var(--fg); }
</style>
<script>
(function () {
var THEME_COOKIE_MAX_AGE = 60 * 60 * 24 * 7;
function getCookie(name) {
const m = document.cookie.match(new RegExp('(?:^|; )' +
name.replace(/([.$?*|{}()\[\]\\/+^])/g, '\\$1') + '=([^;]*)'));
return m ? decodeURIComponent(m[1]) : null;
}
function setCookie(name, value, opts) {
opts = Object.assign({ path: '/', 'max-age': THEME_COOKIE_MAX_AGE, SameSite:
'Lax' }, opts || {});
let updated = encodeURIComponent(name) + '=' +
encodeURIComponent(value);
for (const k in opts) {
updated += '; ' + k + (opts[k] === true ? '' : '=' + opts[k]);
}
document.cookie = updated;
}
function persistTheme(value) {
setCookie('theme', value, { 'max-age': THEME_COOKIE_MAX_AGE });
}
var theme = getCookie('theme');
if (theme !== 'dark' && theme !== 'light') {
theme = 'dark';
}
persistTheme(theme);
document.addEventListener('DOMContentLoaded', function () {
if (theme === 'dark') {
document.body.classList.add('dark');
} else {
document.body.classList.remove('dark');
}
var btn = document.getElementById('themeToggle');
if (btn) btn.textContent = document.body.classList.contains('dark') ?
'☀️' : '🌙';
if (typeof window.applyFiltersToAllTiles === 'function') {
window.applyFiltersToAllTiles();
}
});
window.__themeCookie = { getCookie, setCookie, persistTheme, maxAge: THEME_COOKIE_MAX_AGE };
})();
</script>
<meta name="color-scheme" content="dark light">
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width,initial-scale=1" />
<% meta_title_html = Rack::Utils.escape_html(meta_title) %>
<% meta_name_html = Rack::Utils.escape_html(meta_name) %>
<% meta_description_html = Rack::Utils.escape_html(meta_description) %>
<% request_path = request.path.to_s.empty? ? "/" : request.path %>
<% canonical_url = "#{request.base_url}#{request_path}" %>
<% canonical_html = Rack::Utils.escape_html(canonical_url) %>
<% logo_url = "#{request.base_url}/potatomesh-logo.svg" %>
<% logo_url_html = Rack::Utils.escape_html(logo_url) %>
<% logo_alt_html = Rack::Utils.escape_html("#{meta_name} logo") %>
<title><%= meta_title_html %></title>
<meta name="application-name" content="<%= meta_name_html %>" />
<meta name="apple-mobile-web-app-title" content="<%= meta_name_html %>" />
<meta name="description" content="<%= meta_description_html %>" />
<link rel="canonical" href="<%= canonical_html %>" />
<meta property="og:title" content="<%= meta_title_html %>" />
<meta property="og:site_name" content="<%= meta_name_html %>" />
<meta property="og:description" content="<%= meta_description_html %>" />
<meta property="og:type" content="website" />
<meta property="og:url" content="<%= canonical_html %>" />
<meta property="og:image" content="<%= logo_url_html %>" />
<meta property="og:image:alt" content="<%= logo_alt_html %>" />
<meta name="twitter:card" content="summary" />
<meta name="twitter:title" content="<%= meta_title_html %>" />
<meta name="twitter:description" content="<%= meta_description_html %>" />
<meta name="twitter:image" content="<%= logo_url_html %>" />
<meta name="twitter:image:alt" content="<%= logo_alt_html %>" />
<link rel="icon" type="image/x-icon" href="/favicon.ico" />
<title><%= site_name %></title>
<link rel="icon" type="image/svg+xml" href="/potatomesh-logo.svg" />
<% refresh_interval_seconds = 60 %>
<% tile_filter_light = "grayscale(1) saturate(0) brightness(0.92) contrast(1.05)" %>
@@ -210,7 +81,7 @@ var(--fg); }
.auto-refresh-toggle { display: inline-flex; align-items: center; gap: 6px; }
.controls { display: flex; gap: 8px; align-items: center; }
.controls label { display: inline-flex; align-items: center; gap: 6px; }
button { padding: 6px 10px; border: 1px solid #ccc; background: #fff; border-radius: 6px; cursor: pointer; color: var(--fg); }
button { padding: 6px 10px; border: 1px solid #ccc; background: #fff; border-radius: 6px; cursor: pointer; }
button:hover { background: #f6f6f6; }
.sort-button { padding: 0; border: none; background: none; color: inherit; font: inherit; cursor: pointer; display: inline-flex; align-items: center; gap: 4px; }
.sort-button:hover { background: none; }
@@ -219,7 +90,7 @@ var(--fg); }
th[aria-sort] .sort-indicator { opacity: 1; }
label { font-size: 14px; color: #333; }
input[type="text"] { padding: 6px 10px; border: 1px solid #ccc; border-radius: 6px; }
.legend { position: relative; background: #fff; color: var(--fg); padding: 8px 10px 10px; border: 1px solid #ccc; border-radius: 8px; font-size: 12px; line-height: 18px; min-width: 160px; box-shadow: 0 4px 16px rgba(0, 0, 0, 0.12); }
.legend { position: relative; background: #fff; padding: 8px 10px 10px; border: 1px solid #ccc; border-radius: 8px; font-size: 12px; line-height: 18px; min-width: 160px; box-shadow: 0 4px 16px rgba(0, 0, 0, 0.12); }
.legend-header { display: flex; align-items: center; justify-content: flex-start; gap: 4px; margin-bottom: 6px; font-weight: 600; }
.legend-title { font-size: 13px; }
.legend-items { display: flex; flex-direction: column; gap: 2px; }
@@ -244,7 +115,7 @@ var(--fg); }
.legend-swatch { display: inline-block; width: 12px; height: 12px; border-radius: 2px; }
.legend-hidden { display: none !important; }
.legend-toggle { margin-top: 8px; }
.legend-toggle-button { font-size: 12px; color: var(--fg); }
.legend-toggle-button { font-size: 12px; }
#map .leaflet-tile-pane,
#map .leaflet-layer,
#map .leaflet-tile.map-tiles {
@@ -260,7 +131,7 @@ var(--fg); }
}
#nodes { font-size: 12px; }
footer { position: fixed; bottom: 0; left: var(--pad); width: calc(100% - 2 * var(--pad)); background: #fafafa; border-top: 1px solid #ddd; text-align: center; font-size: 12px; padding: 4px 0; }
.info-overlay { position: fixed; inset: 0; background: rgba(0, 0, 0, 0.45); display: flex; align-items: center; justify-content: center; padding: var(--pad); z-index: 4000; }
.info-overlay { position: fixed; inset: 0; background: rgba(0, 0, 0, 0.45); display: flex; align-items: center; justify-content: center; padding: var(--pad); z-index: 1000; }
.info-overlay[hidden] { display: none; }
.info-dialog { background: #fff; color: #111; max-width: 420px; width: min(100%, 420px); border-radius: 12px; box-shadow: 0 16px 40px rgba(0, 0, 0, 0.2); position: relative; padding: 20px 24px; outline: none; }
.info-dialog:focus { outline: 2px solid #4a90e2; outline-offset: 4px; }
@@ -285,7 +156,7 @@ var(--fg); }
}
}
@media (max-width: 1024px) {
@media (max-width: 768px) {
.row { flex-direction: column; align-items: stretch; gap: var(--pad); }
.site-title img { width: 44px; height: 44px; }
.map-row { flex-direction: column; }
@@ -383,88 +254,6 @@ var(--fg); }
-webkit-filter: <%= tile_filter_dark %>;
}
</style>
<script>
(function(){
function xmur3(str){for(var i=0,h=1779033703^str.length;i<str.length;i++)h=Math.imul(h^str.charCodeAt(i),3432918353),h=h<<13|h>>>19;return function(){h=Math.imul(h^h>>>16,2246822507);h=Math.imul(h^h>>>13,3266489909);return (h^h>>>16)>>>0;};}
function mulberry32(a){return function(){var t=a+=0x6D2B79F5;t=Math.imul(t^t>>>15,t|1);t^=t+Math.imul(t^t>>>7,t|61);return((t^t>>>14)>>>0)/4294967296;}}
function genBackground(theme){
var seedInput = location.hostname + '::' + theme;
var seed = xmur3(seedInput)();
var rnd = mulberry32(seed);
var w = 1400, h = 900;
var c = document.createElement('canvas'); c.width=w; c.height=h;
var ctx = c.getContext('2d');
if(theme==='dark'){
var g = ctx.createLinearGradient(0,0,w,h);
g.addColorStop(0, '#0b1119');
g.addColorStop(1, '#121b27');
ctx.fillStyle = g; ctx.fillRect(0,0,w,h);
} else {
var g2 = ctx.createLinearGradient(0,0,w,h);
g2.addColorStop(0, '#efe8d9'); g2.addColorStop(1, '#dfe5ec');
ctx.fillStyle = g2; ctx.fillRect(0,0,w,h);
}
ctx.globalAlpha = (theme==='dark') ? 0.05 : 0.06;
for(var i=0;i<14000;i++){
var x = Math.floor(rnd()*w), y = Math.floor(rnd()*h);
var s = Math.floor(rnd()*2)+1;
ctx.fillStyle = (theme==='dark') ? '#ffffff' : '#000000';
ctx.fillRect(x,y,s,s);
}
var rad = ctx.createRadialGradient(w*0.5,h*0.5,Math.min(w,h)*0.2, w*0.5,h*0.5,Math.max(w,h)*0.7);
if(theme==='dark'){
rad.addColorStop(0,'rgba(0,0,0,0)');
rad.addColorStop(1,'rgba(0,0,0,0.20)');
} else {
rad.addColorStop(0,'rgba(255,255,255,0)');
rad.addColorStop(1,'rgba(255,255,255,0.22)');
}
ctx.globalAlpha = 1; ctx.fillStyle = rad; ctx.fillRect(0,0,w,h);
var url = c.toDataURL('image/png');
document.documentElement.style.setProperty('--bg-image', 'url('+url+')');
}
function currentTheme(){
return document.body.classList.contains('dark') ? 'dark' : 'light';
}
document.addEventListener('DOMContentLoaded', function(){
genBackground(currentTheme());
});
window.addEventListener('themechange', function(e){
var theme = e.detail && e.detail.theme || currentTheme();
genBackground(theme);
});
var obs = new MutationObserver(function(){ genBackground(currentTheme());});
obs.observe(document.documentElement, { attributes:true, attributeFilter:['class'] });
window.__regenBackground = genBackground;
})();
</script>
<style>
/* Make common wrappers transparent so the generated background is visible */
#app, main, .container, .content, .wrapper, .layout, .page, .root, body > div:first-child {
background: transparent !important;
}
/* Soften dark cards a bit to avoid heavy overlay */
body.dark .card, body.dark .panel, body.dark .box {
background: rgba(255,255,255,0.02);
border-color: rgba(255,255,255,0.07);
}
body.dark thead th {
background: rgba(255,255,255,0.04);
}
</style>
<style>
/* Dark theme: avoid any solid blocks that hide the background */
body.dark :is(#app, main, .container, .content, .wrapper, .page, .layout, .root, .section) {
background: rgba(255,255,255,0.04) !important;
}
/* Dark theme tables & boxes */
body.dark :is(.card, .panel, .box) {
background: var(--card) !important;
border-color: var(--line) !important;
}
/* Defensive: Leaflet map stays transparent */
body.dark .leaflet-container { background: transparent !important; }
</style>
</head>
<body>
<h1 class="site-title">
@@ -607,7 +396,6 @@ var(--fg); }
let lastChatDate;
const NODE_LIMIT = 1000;
const CHAT_LIMIT = 1000;
const CHAT_RECENT_WINDOW_SECONDS = 7 * 24 * 60 * 60;
const REFRESH_MS = <%= refresh_interval_seconds * 1000 %>;
refreshInfo.textContent = `<%= default_channel %> (<%= default_frequency %>) — active nodes: …`;
@@ -731,31 +519,16 @@ var(--fg); }
const MAP_CENTER = L.latLng(<%= map_center_lat %>, <%= map_center_lon %>);
const MAX_NODE_DISTANCE_KM = <%= max_node_distance_km %>;
// Firmware 2.7.10 / Android 2.7.0 roles and colors (see issue #177)
const roleColors = Object.freeze({
CLIENT_HIDDEN: '#A9CBE8',
SENSOR: '#A8D5BA',
TRACKER: '#B9DFAC',
CLIENT_MUTE: '#CDE7A9',
CLIENT: '#E8E6A1',
CLIENT_BASE: '#F6D0A6',
CLIENT: '#A8D5BA',
CLIENT_HIDDEN: '#B8DCA9',
CLIENT_MUTE: '#D2E3A2',
TRACKER: '#E8E6A1',
SENSOR: '#F4E3A3',
LOST_AND_FOUND: '#F9D4A6',
REPEATER: '#F7B7A3',
ROUTER_LATE: '#F29AA3',
ROUTER: '#E88B94',
LOST_AND_FOUND: '#C3A8E8'
});
const roleRenderOrder = Object.freeze({
CLIENT_HIDDEN: 1,
SENSOR: 2,
TRACKER: 3,
CLIENT_MUTE: 4,
CLIENT: 5,
CLIENT_BASE: 6,
REPEATER: 7,
ROUTER_LATE: 8,
ROUTER: 9,
LOST_AND_FOUND: 10
ROUTER: '#E88B94'
});
const activeRoleFilters = new Set();
@@ -780,15 +553,11 @@ var(--fg); }
return roleColors[key] || roleColors.CLIENT || '#3388ff';
}
function getRoleRenderPriority(role) {
const key = getRoleKey(role);
const priority = roleRenderOrder[key];
return typeof priority === 'number' ? priority : 0;
}
// --- Map setup ---
const map = L.map('map', { worldCopyJump: true, attributionControl: false });
const map = L.map('map', { worldCopyJump: true });
const TILE_LAYER_URL = 'https://{s}.tile.openstreetmap.fr/hot/{z}/{x}/{y}.png';
const TILE_ATTRIBUTION =
'&copy; OpenStreetMap contributors, tiles style by Humanitarian OpenStreetMap Team, hosted by OpenStreetMap France';
const TILE_FILTER_LIGHT = '<%= tile_filter_light %>';
const TILE_FILTER_DARK = '<%= tile_filter_dark %>';
@@ -838,6 +607,7 @@ var(--fg); }
const tiles = L.tileLayer(TILE_LAYER_URL, {
maxZoom: 19,
attribution: TILE_ATTRIBUTION,
className: 'map-tiles',
crossOrigin: 'anonymous'
});
@@ -1032,26 +802,17 @@ var(--fg); }
};
legendToggleControl.addTo(map);
const legendMediaQuery = window.matchMedia('(max-width: 1024px)');
const legendMediaQuery = window.matchMedia('(max-width: 768px)');
setLegendVisibility(!legendMediaQuery.matches);
legendMediaQuery.addEventListener('change', event => {
setLegendVisibility(!event.matches);
});
themeToggle.addEventListener('click', () => {
const dark = document.body.classList.toggle('dark');
const themeValue = dark ? 'dark' : 'light';
themeToggle.textContent = dark ? '☀️' : '🌙';
if (window.__themeCookie) {
if (typeof window.__themeCookie.persistTheme === 'function') {
window.__themeCookie.persistTheme(themeValue);
} else if (typeof window.__themeCookie.setCookie === 'function') {
window.__themeCookie.setCookie('theme', themeValue);
}
}
window.dispatchEvent(new CustomEvent('themechange', { detail: { theme: themeValue } }));
if (typeof window.applyFiltersToAllTiles === 'function') window.applyFiltersToAllTiles();
});
themeToggle.addEventListener('click', () => {
const dark = document.body.classList.toggle('dark');
themeToggle.textContent = dark ? '☀️' : '🌙';
applyFiltersToAllTiles();
});
let lastFocusBeforeInfo = null;
@@ -1287,27 +1048,15 @@ var(--fg); }
entries.push({ type: 'node', ts: n.first_heard ?? 0, item: n });
}
for (const m of messages || []) {
if (!m || m.encrypted) continue;
entries.push({ type: 'msg', ts: m.rx_time ?? 0, item: m });
}
const nowSeconds = Math.floor(Date.now() / 1000);
const cutoff = nowSeconds - CHAT_RECENT_WINDOW_SECONDS;
const recentEntries = entries.filter(entry => {
if (entry == null) return false;
const rawTs = entry.ts;
if (rawTs == null) return false;
const ts = typeof rawTs === 'number' ? rawTs : Number(rawTs);
if (!Number.isFinite(ts)) return false;
entry.ts = ts;
return ts >= cutoff;
});
recentEntries.sort((a, b) => {
entries.sort((a, b) => {
if (a.ts !== b.ts) return a.ts - b.ts;
return a.type === 'node' && b.type === 'msg' ? -1 : a.type === 'msg' && b.type === 'node' ? 1 : 0;
});
const frag = document.createDocumentFragment();
lastChatDate = null;
for (const entry of recentEntries) {
for (const entry of entries) {
const divider = maybeCreateDateDivider(entry.ts);
if (divider) frag.appendChild(divider);
if (entry.type === 'node') {
@@ -1357,27 +1106,6 @@ var(--fg); }
return Number.isFinite(n) ? `${n.toFixed(d)}%` : "";
}
function normalizeNodeNameValue(value) {
if (value == null) return '';
const str = String(value).trim();
return str.length ? str : '';
}
function applyNodeNameFallback(node) {
if (!node || typeof node !== 'object') return;
const short = normalizeNodeNameValue(node.short_name ?? node.shortName);
const long = normalizeNodeNameValue(node.long_name ?? node.longName);
if (short || long) return;
const nodeId = normalizeNodeNameValue(node.node_id ?? node.nodeId);
if (!nodeId) return;
const fallbackShort = nodeId.slice(-4);
const fallbackLong = `Meshtastic ${nodeId}`;
node.short_name = fallbackShort;
node.long_name = fallbackLong;
if ('shortName' in node) node.shortName = fallbackShort;
if ('longName' in node) node.longName = fallbackLong;
}
function timeHum(unixSec) {
if (!unixSec) return "";
if (unixSec < 0) return "0s";
@@ -1459,17 +1187,7 @@ var(--fg); }
function renderMap(nodes, nowSec) {
markersLayer.clearLayers();
const pts = [];
const nodesByRenderOrder = nodes
.map((node, index) => ({ node, index }))
.sort((a, b) => {
const orderA = getRoleRenderPriority(a.node && a.node.role);
const orderB = getRoleRenderPriority(b.node && b.node.role);
if (orderA !== orderB) return orderA - orderB;
return a.index - b.index;
})
.map(entry => entry.node);
for (const n of nodesByRenderOrder) {
for (const n of nodes) {
const latRaw = n.latitude, lonRaw = n.longitude;
if (latRaw == null || latRaw === '' || lonRaw == null || lonRaw === '') continue;
const lat = Number(latRaw), lon = Number(lonRaw);
@@ -1539,12 +1257,8 @@ var(--fg); }
try {
statusEl.textContent = 'refreshing…';
const nodes = await fetchNodes();
nodes.forEach(applyNodeNameFallback);
computeDistances(nodes);
const messages = await fetchMessages();
messages.forEach(message => {
if (message && message.node) applyNodeNameFallback(message.node);
});
renderChatLog(nodes, messages);
allNodes = nodes;
applyFilter();
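The `CHAT_RECENT_WINDOW_SECONDS` change in the template above filters the chat log to entries from the last seven days, coercing each timestamp to a number and dropping anything non-finite before sorting ascending by time. The same logic, sketched in Ruby for brevity (the template code is JavaScript; `recent_entries` and its argument shape are illustrative only):

```ruby
# Illustrative Ruby port of the recent-window filter added in the template:
# keep only entries whose timestamp parses to a finite number within the
# last CHAT_RECENT_WINDOW_SECONDS, then sort ascending by receive time.
CHAT_RECENT_WINDOW_SECONDS = 7 * 24 * 60 * 60

def recent_entries(entries, now)
  cutoff = now - CHAT_RECENT_WINDOW_SECONDS
  entries
    .filter_map do |entry|
      ts = entry && entry[:ts]
      ts = Float(ts, exception: false) unless ts.is_a?(Numeric)
      next if ts.nil?                       # drop missing / unparseable timestamps
      entry.merge(ts: ts) if ts >= cutoff   # drop entries older than the window
    end
    .sort_by { |e| e[:ts] }
end
```

As in the JavaScript version, string timestamps are normalized in place (the merged `ts` is numeric), so downstream rendering can compare and sort without re-parsing.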