Compare commits


7 Commits

Author SHA1 Message Date
Jorijn Schrijvershof
64cc352b80 chore(main): release 0.2.8 (#26) 2026-01-06 12:06:53 +01:00
Jorijn Schrijvershof
e37aef6c5e fix: normalize reporting outputs and chart tooltips
- render last observation with local TZ label
- show zero values in report tables
- keep report JSON raw values with explicit units
- reduce chart queries per period and tag chart line paths
- remove redundant get_bat call in companion collector
2026-01-06 11:24:47 +01:00
Jorijn Schrijvershof
81b7c6897a chore(main): release 0.2.7 (#25) 2026-01-06 09:54:44 +01:00
Jorijn Schrijvershof
a3015e2209 feat: add telemetry collection for companion and repeater nodes (#24)
Add environmental telemetry collection (temperature, humidity, barometric
pressure, voltage) from both the repeater node (over LoRa) and companion
node (local serial). Telemetry is stored in the same EAV metrics table
with `telemetry.` prefix.

Key changes:
- Add TELEMETRY_ENABLED feature flag (defaults to OFF)
- Add telemetry-specific timeout/retry settings
- Create shared telemetry.py module with extract_lpp_from_payload()
  and extract_telemetry_metrics() helpers
- Handle MeshCore API dict payload format: {'pubkey_pre': '...', 'lpp': [...]}
- Repeater: store status metrics BEFORE attempting telemetry (LoRa reliability)
- Companion: merge telemetry into single DB write (serial is reliable)
- Telemetry failures do NOT affect circuit breaker state

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 09:53:15 +01:00
Jorijn Schrijvershof
5545ce5b28 Merge pull request #23 from jorijn/release-please--branches--main--components--meshcore-stats
chore(main): release 0.2.6
2026-01-05 10:50:13 +01:00
Jorijn Schrijvershof
666ed4215f chore(main): release 0.2.6 2026-01-05 10:49:22 +01:00
Jorijn Schrijvershof
3d0d90304c fix: add tmpfs mount for fontconfig cache to fix read-only filesystem errors
The container runs with read_only: true for security hardening, but
fontconfig needs a writable cache directory. Added tmpfs mount at
/var/cache/fontconfig to allow fontconfig to write its cache without
compromising the read-only filesystem security.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-05 10:49:02 +01:00
15 changed files with 506 additions and 152 deletions

View File

@@ -1,3 +1,3 @@
{
".": "0.2.5"
".": "0.2.8"
}

View File

@@ -4,6 +4,27 @@ All notable changes to this project will be documented in this file.
This changelog is automatically generated by [release-please](https://github.com/googleapis/release-please) based on [Conventional Commits](https://www.conventionalcommits.org/).
## [0.2.8](https://github.com/jorijn/meshcore-stats/compare/v0.2.7...v0.2.8) (2026-01-06)
### Bug Fixes
* normalize reporting outputs and chart tooltips ([e37aef6](https://github.com/jorijn/meshcore-stats/commit/e37aef6c5e55d2077baf4ee35abdff0562983d69))
## [0.2.7](https://github.com/jorijn/meshcore-stats/compare/v0.2.6...v0.2.7) (2026-01-06)
### Features
* add telemetry collection for companion and repeater nodes ([#24](https://github.com/jorijn/meshcore-stats/issues/24)) ([a3015e2](https://github.com/jorijn/meshcore-stats/commit/a3015e2209781bdd7c317fa992ced6afa19efe61))
## [0.2.6](https://github.com/jorijn/meshcore-stats/compare/v0.2.5...v0.2.6) (2026-01-05)
### Bug Fixes
* add tmpfs mount for fontconfig cache to fix read-only filesystem errors ([3d0d903](https://github.com/jorijn/meshcore-stats/commit/3d0d90304cec5ebcdb34935400de31afd62e258d))
## [0.2.5](https://github.com/jorijn/meshcore-stats/compare/v0.2.4...v0.2.5) (2026-01-05)

View File

@@ -16,6 +16,8 @@ Always edit the source templates, then regenerate with `python scripts/render_si
## Running Commands
**IMPORTANT: Always activate the virtual environment before running any Python commands.**
```bash
cd /path/to/meshcore-stats
source .venv/bin/activate
@@ -354,11 +356,17 @@ All configuration via `meshcore.conf` or environment variables. The config file
### Timeouts & Retry
- `REMOTE_TIMEOUT_S`: Minimum timeout for LoRa requests (default: 10)
- `REMOTE_RETRY_ATTEMPTS`: Number of retry attempts (default: 5)
- `REMOTE_RETRY_ATTEMPTS`: Number of retry attempts (default: 2)
- `REMOTE_RETRY_BACKOFF_S`: Seconds between retries (default: 4)
- `REMOTE_CB_FAILS`: Failures before circuit breaker opens (default: 6)
- `REMOTE_CB_COOLDOWN_S`: Circuit breaker cooldown (default: 3600)
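The circuit-breaker settings above can be sketched roughly as follows. This is an illustrative model of the described behavior only; the project's actual implementation (in `src/meshmon/retry.py`, per the imports shown later in this diff) may differ in detail:

```python
import time

# Illustrative sketch, not the project's real code: after REMOTE_CB_FAILS
# consecutive failures, skip LoRa queries for REMOTE_CB_COOLDOWN_S seconds.
class CircuitBreaker:
    def __init__(self) -> None:
        self.consecutive_failures = 0
        self.open_until = 0.0

    def record_success(self) -> None:
        # Any success resets the failure count and closes the breaker.
        self.consecutive_failures = 0
        self.open_until = 0.0

    def record_failure(self, max_fails: int, cooldown_s: int) -> None:
        self.consecutive_failures += 1
        if self.consecutive_failures >= max_fails:
            self.open_until = time.time() + cooldown_s

    def is_open(self) -> bool:
        # While open, callers should skip remote queries entirely.
        return time.time() < self.open_until
```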
### Telemetry Collection
- `TELEMETRY_ENABLED`: Enable environmental telemetry collection from repeater (0/1, default: 0)
- `TELEMETRY_TIMEOUT_S`: Timeout for telemetry requests (default: 10)
- `TELEMETRY_RETRY_ATTEMPTS`: Retry attempts for telemetry (default: 2)
- `TELEMETRY_RETRY_BACKOFF_S`: Backoff between telemetry retries (default: 4)
### Intervals
- `COMPANION_STEP`: Collection interval for companion (default: 60s)
- `REPEATER_STEP`: Collection interval for repeater (default: 900s / 15min)
@@ -410,6 +418,12 @@ Metrics are classified as either **gauge** or **counter** in `src/meshmon/metric
Counter metrics are converted to rates during chart rendering by calculating deltas between consecutive readings.
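The delta calculation described above can be sketched as follows; the function name and the reset handling are illustrative assumptions, not the project's actual chart-rendering code:

```python
# Convert cumulative counter readings to per-second rates by taking deltas
# between consecutive (timestamp, value) points. Negative deltas (counter
# resets, e.g. after a device reboot) are skipped rather than plotted.
def counter_to_rates(points: list[tuple[int, float]]) -> list[tuple[int, float]]:
    rates = []
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        dt = t1 - t0
        dv = v1 - v0
        if dt <= 0 or dv < 0:  # clock glitch or counter reset
            continue
        rates.append((t1, dv / dt))
    return rates
```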
- **TELEMETRY**: Environmental sensor data (when `TELEMETRY_ENABLED=1`):
- Stored with `telemetry.` prefix: `telemetry.temperature.0`, `telemetry.humidity.0`, `telemetry.barometer.0`
- Channel number distinguishes multiple sensors of the same type
- Compound values (e.g., GPS) stored as: `telemetry.gps.0.latitude`, `telemetry.gps.0.longitude`
- Telemetry collection does NOT affect circuit breaker state
## Database Schema
Metrics are stored in a SQLite database at `data/state/metrics.db` with WAL mode enabled for concurrent access.
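A minimal sketch of that storage pattern, with an assumed table layout — the project's real schema (in `src/meshmon/db.py`, per the imports shown later in this diff) may differ:

```python
import sqlite3

# One row per (timestamp, role, metric key, value) — the EAV-style layout
# the telemetry commit describes. Table and column names are illustrative.
def init_db(path: str) -> sqlite3.Connection:
    con = sqlite3.connect(path)
    con.execute("PRAGMA journal_mode=WAL")  # concurrent readers during writes
    con.execute(
        "CREATE TABLE IF NOT EXISTS metrics ("
        " ts INTEGER NOT NULL, role TEXT NOT NULL,"
        " key TEXT NOT NULL, value REAL NOT NULL,"
        " PRIMARY KEY (ts, role, key))"
    )
    return con

def insert_metrics(con: sqlite3.Connection, ts: int, role: str,
                   metrics: dict[str, float]) -> int:
    rows = [(ts, role, k, float(v)) for k, v in metrics.items()]
    con.executemany("INSERT OR REPLACE INTO metrics VALUES (?, ?, ?, ?)", rows)
    con.commit()
    return len(rows)
```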

View File

@@ -15,7 +15,7 @@ services:
# MeshCore Stats - Data collection and rendering
# ==========================================================================
meshcore-stats:
image: ghcr.io/jorijn/meshcore-stats:0.2.5 # x-release-please-version
image: ghcr.io/jorijn/meshcore-stats:0.2.8 # x-release-please-version
container_name: meshcore-stats
restart: unless-stopped
@@ -47,6 +47,7 @@ services:
read_only: true
tmpfs:
- /tmp:noexec,nosuid,size=64m
- /var/cache/fontconfig:noexec,nosuid,size=4m
# Resource limits
deploy:

View File

@@ -102,6 +102,84 @@ Returns a single dict with all status fields.
---
## Telemetry Data
Environmental telemetry is requested via `req_telemetry_sync(contact)` and returns
Cayenne LPP-formatted sensor data. This requires `TELEMETRY_ENABLED=1` and a sensor
board attached to the repeater.
### Payload Format
Both `req_telemetry_sync()` and `get_self_telemetry()` return a dict containing the
LPP data list and a public key prefix:
```python
{
'pubkey_pre': 'a5c14f5244d6',
'lpp': [
{'channel': 0, 'type': 'temperature', 'value': 23.5},
{'channel': 0, 'type': 'humidity', 'value': 45.2},
]
}
```
The `extract_lpp_from_payload()` helper in `src/meshmon/telemetry.py` handles
extracting the `lpp` list from this wrapper format.
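A hedged sketch of what such a helper might look like; the real `extract_lpp_from_payload()` in `src/meshmon/telemetry.py` may handle more cases, and the bare-list fallback is an assumption for illustration:

```python
# Illustrative only — not the project's actual helper.
def extract_lpp_from_payload(payload):
    """Return the Cayenne LPP entry list from a telemetry payload, or None."""
    if isinstance(payload, dict):
        # MeshCore API wrapper format: {'pubkey_pre': '...', 'lpp': [...]}
        lpp = payload.get("lpp")
        return lpp if isinstance(lpp, list) else None
    if isinstance(payload, list):
        # Assumed fallback: payload may already be the bare LPP list.
        return payload
    return None
```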
### `req_telemetry_sync(contact)`
Returns sensor readings from a remote node in Cayenne LPP format:
```python
[
{'channel': 0, 'type': 'temperature', 'value': 23.5},
{'channel': 0, 'type': 'humidity', 'value': 45.2},
{'channel': 0, 'type': 'barometer', 'value': 1013.25},
{'channel': 1, 'type': 'gps', 'value': {'latitude': 51.5, 'longitude': -0.1, 'altitude': 10}},
]
```
**Common sensor types:**
| Type | Unit | Description |
|------|------|-------------|
| `temperature` | Celsius | Temperature reading |
| `humidity` | % | Relative humidity |
| `barometer` | hPa/mbar | Barometric pressure |
| `voltage` | V | Voltage reading |
| `gps` | compound | GPS with `latitude`, `longitude`, `altitude` |
**Stored as:**
- `telemetry.temperature.0` - Temperature on channel 0
- `telemetry.humidity.0` - Humidity on channel 0
- `telemetry.gps.1.latitude` - GPS latitude on channel 1
**Notes:**
- Requires environmental sensor board (BME280, BME680, etc.) on repeater
- Channel number distinguishes multiple sensors of the same type
- Not all repeaters have environmental sensors attached
- Telemetry collection does not affect circuit breaker state
- Telemetry failures are logged as warnings and do not block status collection
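The key naming above suggests a flattening step roughly like this illustrative sketch — `telemetry.<type>.<channel>` for scalars, with compound values expanded per field. This is not the project's actual `extract_telemetry_metrics()`:

```python
# Illustrative only — flatten Cayenne LPP entries into stored metric keys.
def extract_telemetry_metrics(lpp: list[dict]) -> dict[str, float]:
    metrics: dict[str, float] = {}
    for entry in lpp:
        channel = entry.get("channel")
        sensor_type = entry.get("type")
        value = entry.get("value")
        if channel is None or not sensor_type:
            continue
        if isinstance(value, dict):
            # Compound value (e.g. GPS): one key per numeric field.
            for field, v in value.items():
                if isinstance(v, (int, float)):
                    metrics[f"telemetry.{sensor_type}.{channel}.{field}"] = float(v)
        elif isinstance(value, (int, float)):
            metrics[f"telemetry.{sensor_type}.{channel}"] = float(value)
    return metrics
```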
### `get_self_telemetry()`
Returns self telemetry from the companion node's attached sensors.
Same Cayenne LPP format as `req_telemetry_sync()`.
```python
[
{'channel': 0, 'type': 'temperature', 'value': 23.5},
{'channel': 0, 'type': 'humidity', 'value': 45.2},
]
```
**Notes:**
- Requires environmental sensor board attached to companion
- Returns empty list if no sensors attached
- Uses same format as repeater telemetry
---
## Derived Metrics
These are computed at query time, not stored:

View File

@@ -113,6 +113,23 @@ RADIO_CODING_RATE=CR8
# REMOTE_CB_FAILS=6
# REMOTE_CB_COOLDOWN_S=3600
# =============================================================================
# Telemetry Collection (Environmental Sensors)
# =============================================================================
# Enable telemetry collection from repeater's environmental sensors
# (temperature, humidity, barometric pressure, etc.)
# Requires sensor board attached to repeater (e.g., BME280, BME680)
# Default: 0 (disabled)
# TELEMETRY_ENABLED=1
# Telemetry-specific timeout and retry settings. Defaults match the status
# settings; the separate knobs let you tune telemetry down if it proves
# problematic (e.g., firmware doesn't support it or the sensor board is missing).
# TELEMETRY_TIMEOUT_S=10
# TELEMETRY_RETRY_ATTEMPTS=2
# TELEMETRY_RETRY_BACKOFF_S=4
# =============================================================================
# Paths (Native installation only)
# =============================================================================

View File

@@ -27,6 +27,7 @@ from meshmon.env import get_config
from meshmon import log
from meshmon.meshcore_client import connect_with_lock, run_command
from meshmon.db import init_db, insert_metrics
from meshmon.telemetry import extract_lpp_from_payload, extract_telemetry_metrics
async def collect_companion() -> int:
@@ -73,16 +74,6 @@ async def collect_companion() -> int:
else:
log.error(f"device_query failed: {err}")
# get_bat
ok, evt_type, payload, err = await run_command(
mc, cmd.get_bat(), "get_bat"
)
if ok:
commands_succeeded += 1
log.debug(f"get_bat: {payload}")
else:
log.error(f"get_bat failed: {err}")
# get_time
ok, evt_type, payload, err = await run_command(
mc, cmd.get_time(), "get_time"
@@ -93,15 +84,26 @@ async def collect_companion() -> int:
else:
log.error(f"get_time failed: {err}")
# get_self_telemetry
# get_self_telemetry - collect environmental sensor data
# Note: The call happens regardless of telemetry_enabled for device query completeness,
# but we only extract and store metrics if the feature is enabled.
ok, evt_type, payload, err = await run_command(
mc, cmd.get_self_telemetry(), "get_self_telemetry"
)
if ok:
commands_succeeded += 1
log.debug(f"get_self_telemetry: {payload}")
# Extract and store telemetry if enabled
if cfg.telemetry_enabled:
lpp_data = extract_lpp_from_payload(payload)
if lpp_data is not None:
telemetry_metrics = extract_telemetry_metrics(lpp_data)
if telemetry_metrics:
metrics.update(telemetry_metrics)
log.debug(f"Extracted {len(telemetry_metrics)} telemetry metrics")
else:
log.error(f"get_self_telemetry failed: {err}")
# Debug level because not all devices have sensors attached - this is expected
log.debug(f"get_self_telemetry failed: {err}")
# get_custom_vars
ok, evt_type, payload, err = await run_command(
@@ -176,6 +178,10 @@ async def collect_companion() -> int:
summary_parts.append(f"rx={int(metrics['recv'])}")
if "sent" in metrics:
summary_parts.append(f"tx={int(metrics['sent'])}")
# Add telemetry count to summary if present
telemetry_count = sum(1 for k in metrics if k.startswith("telemetry."))
if telemetry_count > 0:
summary_parts.append(f"telem={telemetry_count}")
log.info(f"Companion: {', '.join(summary_parts)}")

View File

@@ -32,10 +32,10 @@ from meshmon.meshcore_client import (
get_contact_by_name,
get_contact_by_key_prefix,
extract_contact_info,
list_contacts_summary,
)
from meshmon.db import init_db, insert_metrics
from meshmon.retry import get_repeater_circuit_breaker, with_retries
from meshmon.telemetry import extract_lpp_from_payload, extract_telemetry_metrics
async def find_repeater_contact(mc: Any) -> Optional[Any]:
@@ -143,8 +143,10 @@ async def query_repeater_with_retry(
async def collect_repeater() -> int:
"""
Collect data from remote repeater node.
"""Collect data from remote repeater node.
Collects status metrics (battery, uptime, packet counters, etc.) and
optionally telemetry data (temperature, humidity, pressure) if enabled.
Returns:
Exit code (0 = success, 1 = error)
@@ -162,7 +164,8 @@ async def collect_repeater() -> int:
return 0
# Metrics to insert (firmware field names from req_status_sync)
metrics: dict[str, float] = {}
status_metrics: dict[str, float] = {}
telemetry_metrics: dict[str, float] = {}
node_name = "unknown"
status_ok = False
@@ -213,7 +216,7 @@ async def collect_repeater() -> int:
except Exception as e:
log.debug(f"Login not supported: {e}")
# Query status (using _sync version which returns payload directly)
# Phase 1: Status collection (affects circuit breaker)
# Use timeout=0 to let the device suggest timeout, with min_timeout as floor
log.debug("Querying repeater status...")
success, payload, err = await query_repeater_with_retry(
@@ -227,12 +230,12 @@ async def collect_repeater() -> int:
# Insert all numeric fields from status response
for key, value in payload.items():
if isinstance(value, (int, float)):
metrics[key] = float(value)
status_metrics[key] = float(value)
log.debug(f"req_status_sync: {payload}")
else:
log.warn(f"req_status_sync failed: {err}")
# Update circuit breaker
# Update circuit breaker based on status result
if status_ok:
cb.record_success()
log.debug("Circuit breaker: recorded success")
@@ -240,6 +243,51 @@ async def collect_repeater() -> int:
cb.record_failure(cfg.remote_cb_fails, cfg.remote_cb_cooldown_s)
log.debug(f"Circuit breaker: recorded failure ({cb.consecutive_failures}/{cfg.remote_cb_fails})")
# CRITICAL: Store status metrics immediately before attempting telemetry
# This ensures critical data is saved even if telemetry fails
if status_ok and status_metrics:
try:
inserted = insert_metrics(ts=ts, role="repeater", metrics=status_metrics)
log.debug(f"Stored {inserted} status metrics (ts={ts})")
except Exception as e:
log.error(f"Failed to store status metrics: {e}")
return 1
# Phase 2: Telemetry collection (does NOT affect circuit breaker)
if cfg.telemetry_enabled and status_ok:
log.debug("Querying repeater telemetry...")
try:
# Note: Telemetry uses its own retry settings and does NOT
# affect circuit breaker. Status success proves the link is up;
# telemetry failures are likely firmware/capability issues.
telem_success, telem_payload, telem_err = await with_retries(
lambda: cmd.req_telemetry_sync(
contact, timeout=0, min_timeout=cfg.telemetry_timeout_s
),
attempts=cfg.telemetry_retry_attempts,
backoff_s=cfg.telemetry_retry_backoff_s,
name="req_telemetry_sync",
)
if telem_success and telem_payload:
log.debug(f"req_telemetry_sync: {telem_payload}")
lpp_data = extract_lpp_from_payload(telem_payload)
if lpp_data is not None:
telemetry_metrics = extract_telemetry_metrics(lpp_data)
log.debug(f"Extracted {len(telemetry_metrics)} telemetry metrics")
# Store telemetry metrics
if telemetry_metrics:
try:
inserted = insert_metrics(ts=ts, role="repeater", metrics=telemetry_metrics)
log.debug(f"Stored {inserted} telemetry metrics")
except Exception as e:
log.warn(f"Failed to store telemetry metrics: {e}")
else:
log.warn(f"req_telemetry_sync failed: {telem_err}")
except Exception as e:
log.warn(f"Telemetry collection error (continuing): {e}")
except Exception as e:
log.error(f"Error during collection: {e}")
cb.record_failure(cfg.remote_cb_fails, cfg.remote_cb_cooldown_s)
@@ -248,28 +296,21 @@ async def collect_repeater() -> int:
# Print summary
summary_parts = [f"ts={ts}"]
if "bat" in metrics:
bat_v = metrics["bat"] / 1000.0
if "bat" in status_metrics:
bat_v = status_metrics["bat"] / 1000.0
summary_parts.append(f"bat={bat_v:.2f}V")
if "uptime" in metrics:
uptime_days = metrics["uptime"] // 86400
if "uptime" in status_metrics:
uptime_days = status_metrics["uptime"] // 86400
summary_parts.append(f"uptime={int(uptime_days)}d")
if "nb_recv" in metrics:
summary_parts.append(f"rx={int(metrics['nb_recv'])}")
if "nb_sent" in metrics:
summary_parts.append(f"tx={int(metrics['nb_sent'])}")
if "nb_recv" in status_metrics:
summary_parts.append(f"rx={int(status_metrics['nb_recv'])}")
if "nb_sent" in status_metrics:
summary_parts.append(f"tx={int(status_metrics['nb_sent'])}")
if telemetry_metrics:
summary_parts.append(f"telem={len(telemetry_metrics)}")
log.info(f"Repeater ({node_name}): {', '.join(summary_parts)}")
# Write metrics to database
if status_ok and metrics:
try:
inserted = insert_metrics(ts=ts, role="repeater", metrics=metrics)
log.debug(f"Inserted {inserted} metrics to database (ts={ts})")
except Exception as e:
log.error(f"Failed to write metrics to database: {e}")
return 1
return 0 if status_ok else 1

View File

@@ -1,3 +1,3 @@
"""MeshCore network monitoring library."""
__version__ = "0.2.5" # x-release-please-version
__version__ = "0.2.8" # x-release-please-version

View File

@@ -167,6 +167,7 @@ def load_timeseries_from_db(
end_time: datetime,
lookback: timedelta,
period: str,
all_metrics: Optional[dict[str, list[tuple[int, float]]]] = None,
) -> TimeSeries:
"""Load time series data from SQLite database.
@@ -179,6 +180,7 @@ def load_timeseries_from_db(
end_time: End of the time range (typically now)
lookback: How far back to look
period: Period name for binning config ("day", "week", etc.)
all_metrics: Optional pre-fetched metrics dict for this period
Returns:
TimeSeries with extracted data points
@@ -188,7 +190,8 @@ def load_timeseries_from_db(
end_ts = int(end_time.timestamp())
# Fetch all metrics for this role/period (returns pivoted dict)
all_metrics = get_metrics_for_period(role, start_ts, end_ts)
if all_metrics is None:
all_metrics = get_metrics_for_period(role, start_ts, end_ts)
# Get data for this specific metric
metric_data = all_metrics.get(metric, [])
@@ -379,10 +382,22 @@ def render_chart_svg(
# Plot area fill
area_color = _hex_to_rgba(theme.area)
ax.fill_between(timestamps, values, alpha=area_color[3], color=f"#{theme.line}")
area = ax.fill_between(
timestamps,
values,
alpha=area_color[3],
color=f"#{theme.line}",
)
area.set_gid("chart-area")
# Plot line
ax.plot(timestamps, values, color=f"#{theme.line}", linewidth=2)
(line,) = ax.plot(
timestamps,
values,
color=f"#{theme.line}",
linewidth=2,
)
line.set_gid("chart-line")
# Set Y-axis limits and track actual values used
if y_min is not None and y_max is not None:
@@ -458,7 +473,7 @@ def _inject_data_attributes(
Adds:
- data-metric, data-period, data-theme, data-x-start, data-x-end, data-y-min, data-y-max to root <svg>
- data-points JSON array to the chart path element
- data-points JSON array to the root <svg> and chart line path
Args:
svg: Raw SVG string
@@ -495,22 +510,35 @@ def _inject_data_attributes(
r'<svg\b',
f'<svg data-metric="{ts.metric}" data-period="{ts.period}" data-theme="{theme_name}" '
f'data-x-start="{x_start_ts}" data-x-end="{x_end_ts}" '
f'data-y-min="{y_min_val}" data-y-max="{y_max_val}"',
f'data-y-min="{y_min_val}" data-y-max="{y_max_val}" '
f'data-points="{data_points_attr}"',
svg,
count=1
)
# Add data-points to the main path element (the line, not the fill)
# Look for the second path element (first is usually the fill area)
path_count = 0
def add_data_to_path(match):
nonlocal path_count
path_count += 1
if path_count == 2: # The line path
return f'<path data-points="{data_points_attr}"'
return match.group(0)
def add_data_to_id(match):
return f'<path{match.group(1)} data-points="{data_points_attr}"'
svg = re.sub(r'<path\b', add_data_to_path, svg)
svg, count = re.subn(
r'<path([^>]*(?:id|gid)="chart-line"[^>]*)',
add_data_to_id,
svg,
count=1,
)
if count == 0:
# Look for the second path element (first is usually the fill area)
path_count = 0
def add_data_to_path(match):
nonlocal path_count
path_count += 1
if path_count == 2: # The line path
return f'<path data-points="{data_points_attr}"'
return match.group(0)
svg = re.sub(r'<path\b', add_data_to_path, svg)
return svg
@@ -558,9 +586,16 @@ def render_all_charts(
for metric in metrics:
all_stats[metric] = {}
for period in periods:
period_cfg = PERIOD_CONFIG[period]
for period in periods:
period_cfg = PERIOD_CONFIG[period]
x_end = now
x_start = now - period_cfg["lookback"]
start_ts = int(x_start.timestamp())
end_ts = int(x_end.timestamp())
all_metrics = get_metrics_for_period(role, start_ts, end_ts)
for metric in metrics:
# Load time series from database
ts = load_timeseries_from_db(
role=role,
@@ -568,6 +603,7 @@ def render_all_charts(
end_time=now,
lookback=period_cfg["lookback"],
period=period,
all_metrics=all_metrics,
)
# Calculate and store statistics
@@ -579,10 +615,6 @@ def render_all_charts(
y_min = y_range[0] if y_range else None
y_max = y_range[1] if y_range else None
# Calculate X-axis range for full period padding
x_end = now
x_start = now - period_cfg["lookback"]
# Render chart for each theme
for theme_name in themes:
theme = CHART_THEMES[theme_name]

View File

@@ -155,6 +155,14 @@ class Config:
self.remote_cb_fails = get_int("REMOTE_CB_FAILS", 6)
self.remote_cb_cooldown_s = get_int("REMOTE_CB_COOLDOWN_S", 3600)
# Telemetry collection (requires sensor board on repeater)
self.telemetry_enabled = get_bool("TELEMETRY_ENABLED", False)
# Separate settings allow tuning if telemetry proves problematic
# Defaults match status settings - tune down if needed
self.telemetry_timeout_s = get_int("TELEMETRY_TIMEOUT_S", 10)
self.telemetry_retry_attempts = get_int("TELEMETRY_RETRY_ATTEMPTS", 2)
self.telemetry_retry_backoff_s = get_int("TELEMETRY_RETRY_BACKOFF_S", 4)
# Paths (defaults are Docker container paths; native installs override via config)
self.state_dir = get_path("STATE_DIR", "/data/state")
self.out_dir = get_path("OUT_DIR", "/out")

View File

@@ -588,8 +588,8 @@ def build_page_context(
last_updated = None
last_updated_iso = None
if ts:
dt = datetime.fromtimestamp(ts)
last_updated = dt.strftime("%b %d, %Y at %H:%M UTC")
dt = datetime.fromtimestamp(ts).astimezone()
last_updated = dt.strftime("%b %d, %Y at %H:%M %Z")
last_updated_iso = dt.isoformat()
# Build metrics for sidebar
@@ -845,24 +845,24 @@ def build_monthly_table_data(
airtime = m.get("airtime", MetricStats())
# Convert mV to V for display
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": False,
"cells": [
{"value": f"{daily.date.day:02d}", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_time(bat_v_min, bat.min_time), "class": "muted"},
{"value": _fmt_val_time(bat_v_max, bat.max_time), "class": "muted"},
{"value": f"{rssi.mean:.0f}" if rssi.mean else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean else "-", "class": None},
{"value": f"{noise.mean:.0f}" if noise.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{airtime.total:,}" if airtime.total else "-", "class": None},
{"value": f"{rssi.mean:.0f}" if rssi.mean is not None else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean is not None else "-", "class": None},
{"value": f"{noise.mean:.0f}" if noise.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
{"value": f"{airtime.total:,}" if airtime.total is not None else "-", "class": None},
],
})
@@ -877,24 +877,24 @@ def build_monthly_table_data(
tx = s.get("nb_sent", MetricStats())
airtime = s.get("airtime", MetricStats())
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": True,
"cells": [
{"value": "", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_day(bat_v_min, bat.min_time), "class": "muted"},
{"value": _fmt_val_day(bat_v_max, bat.max_time), "class": "muted"},
{"value": f"{rssi.mean:.0f}" if rssi.mean else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean else "-", "class": None},
{"value": f"{noise.mean:.0f}" if noise.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{airtime.total:,}" if airtime.total else "-", "class": None},
{"value": f"{rssi.mean:.0f}" if rssi.mean is not None else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean is not None else "-", "class": None},
{"value": f"{noise.mean:.0f}" if noise.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
{"value": f"{airtime.total:,}" if airtime.total is not None else "-", "class": None},
],
})
@@ -928,21 +928,21 @@ def build_monthly_table_data(
tx = m.get("sent", MetricStats())
# Convert mV to V for display
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": False,
"cells": [
{"value": f"{daily.date.day:02d}", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_time(bat_v_min, bat.min_time), "class": "muted"},
{"value": _fmt_val_time(bat_v_max, bat.max_time), "class": "muted"},
{"value": f"{contacts.mean:.0f}" if contacts.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{contacts.mean:.0f}" if contacts.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -954,21 +954,21 @@ def build_monthly_table_data(
rx = s.get("recv", MetricStats())
tx = s.get("sent", MetricStats())
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": True,
"cells": [
{"value": "", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_day(bat_v_min, bat.min_time), "class": "muted"},
{"value": _fmt_val_day(bat_v_max, bat.max_time), "class": "muted"},
{"value": f"{contacts.mean:.0f}" if contacts.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{contacts.mean:.0f}" if contacts.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -1033,23 +1033,23 @@ def build_yearly_table_data(
tx = s.get("nb_sent", MetricStats())
# Convert mV to V
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": False,
"cells": [
{"value": str(agg.year), "class": None},
{"value": f"{monthly.month:02d}", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_day(bat_v_max, bat.max_time), "class": "muted"},
{"value": _fmt_val_day(bat_v_min, bat.min_time), "class": "muted"},
{"value": f"{rssi.mean:.0f}" if rssi.mean else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{rssi.mean:.0f}" if rssi.mean is not None else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -1062,23 +1062,23 @@ def build_yearly_table_data(
rx = s.get("nb_recv", MetricStats())
tx = s.get("nb_sent", MetricStats())
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": True,
"cells": [
{"value": "", "class": None},
{"value": "", "class": None},
-{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
-{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
+{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
+{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_month(bat_v_max, bat.max_time), "class": "muted"},
{"value": _fmt_val_month(bat_v_min, bat.min_time), "class": "muted"},
-{"value": f"{rssi.mean:.0f}" if rssi.mean else "-", "class": None},
-{"value": f"{snr.mean:.1f}" if snr.mean else "-", "class": None},
-{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
-{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
+{"value": f"{rssi.mean:.0f}" if rssi.mean is not None else "-", "class": None},
+{"value": f"{snr.mean:.1f}" if snr.mean is not None else "-", "class": None},
+{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
+{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -1113,22 +1113,22 @@ def build_yearly_table_data(
tx = s.get("sent", MetricStats())
# Convert mV to V
-bat_v_mean = bat.mean / 1000.0 if bat.mean else None
-bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
-bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
+bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
+bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
+bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": False,
"cells": [
{"value": str(agg.year), "class": None},
{"value": f"{monthly.month:02d}", "class": None},
-{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
-{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
+{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
+{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_day(bat_v_max, bat.max_time), "class": "muted"},
{"value": _fmt_val_day(bat_v_min, bat.min_time), "class": "muted"},
-{"value": f"{contacts.mean:.0f}" if contacts.mean else "-", "class": None},
-{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
-{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
+{"value": f"{contacts.mean:.0f}" if contacts.mean is not None else "-", "class": None},
+{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
+{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -1140,22 +1140,22 @@ def build_yearly_table_data(
rx = s.get("recv", MetricStats())
tx = s.get("sent", MetricStats())
-bat_v_mean = bat.mean / 1000.0 if bat.mean else None
-bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
-bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
+bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
+bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
+bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": True,
"cells": [
{"value": "", "class": None},
{"value": "", "class": None},
-{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
-{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
+{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
+{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_month(bat_v_max, bat.max_time), "class": "muted"},
{"value": _fmt_val_month(bat_v_min, bat.min_time), "class": "muted"},
-{"value": f"{contacts.mean:.0f}" if contacts.mean else "-", "class": None},
-{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
-{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
+{"value": f"{contacts.mean:.0f}" if contacts.mean is not None else "-", "class": None},
+{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
+{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
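The table-cell changes above swap truthiness checks for explicit `is not None` tests, which is what "show zero values in report tables" refers to in the commit message. A minimal sketch of the difference (`cell_truthy` and `cell_explicit` are illustrative names, not from the codebase):

```python
# Why `if value` hides legitimate zero readings while
# `if value is not None` renders them.
def cell_truthy(value):
    # 0 and 0.0 are falsy, so a real zero reading collapses to "-"
    return f"{value:.0f}" if value else "-"

def cell_explicit(value):
    # only a genuinely missing reading becomes "-"
    return f"{value:.0f}" if value is not None else "-"

print(cell_truthy(0.0))    # zero reading hidden as "-"
print(cell_explicit(0.0))  # zero reading shown as "0"
print(cell_explicit(None)) # missing reading shown as "-"
```

The same reasoning applies to the `bat.mean / 1000.0` conversions: a (theoretical) 0 mV sample should convert to 0.0 V, not be discarded as missing.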


@@ -17,17 +17,12 @@ import calendar
import json
from dataclasses import dataclass, field
from datetime import date, datetime, timedelta
from pathlib import Path
from typing import Any, Optional
from .db import get_connection, get_metrics_for_period, VALID_ROLES
from .env import get_config
from .metrics import (
is_counter_metric,
get_chart_metrics,
transform_value,
)
from . import log
def _validate_role(role: str) -> str:
@@ -59,6 +54,32 @@ def get_metrics_for_role(role: str) -> list[str]:
raise ValueError(f"Unknown role: {role}")
REPORT_UNITS_RAW = {
"battery_mv": "mV",
"bat": "mV",
"bat_pct": "%",
"uptime": "s",
"uptime_secs": "s",
"last_rssi": "dBm",
"last_snr": "dB",
"noise_floor": "dBm",
"tx_queue_len": "count",
"contacts": "count",
"recv": "packets",
"sent": "packets",
"nb_recv": "packets",
"nb_sent": "packets",
"airtime": "s",
"rx_airtime": "s",
"flood_dups": "packets",
"direct_dups": "packets",
"sent_flood": "packets",
"recv_flood": "packets",
"sent_direct": "packets",
"recv_direct": "packets",
}
@dataclass
class MetricStats:
"""Statistics for a single metric over a period.
@@ -1116,10 +1137,14 @@ def format_yearly_txt(
return format_yearly_txt_companion(agg, node_name, location)
-def _metric_stats_to_dict(stats: MetricStats) -> dict[str, Any]:
+def _metric_stats_to_dict(stats: MetricStats, metric: str) -> dict[str, Any]:
"""Convert MetricStats to JSON-serializable dict."""
result: dict[str, Any] = {"count": stats.count}
+    unit = REPORT_UNITS_RAW.get(metric)
+    if unit:
+        result["unit"] = unit
if stats.mean is not None:
result["mean"] = round(stats.mean, 4)
if stats.min_value is not None:
@@ -1144,7 +1169,7 @@ def _daily_to_dict(daily: DailyAggregate) -> dict[str, Any]:
"date": daily.date.isoformat(),
"snapshot_count": daily.snapshot_count,
"metrics": {
-ds: _metric_stats_to_dict(stats)
+ds: _metric_stats_to_dict(stats, ds)
for ds, stats in daily.metrics.items()
if stats.has_data
},
@@ -1167,7 +1192,7 @@ def monthly_to_json(agg: MonthlyAggregate) -> dict[str, Any]:
"role": agg.role,
"days_with_data": len(agg.daily),
"summary": {
-ds: _metric_stats_to_dict(stats)
+ds: _metric_stats_to_dict(stats, ds)
for ds, stats in agg.summary.items()
if stats.has_data
},
@@ -1190,7 +1215,7 @@ def yearly_to_json(agg: YearlyAggregate) -> dict[str, Any]:
"role": agg.role,
"months_with_data": len(agg.monthly),
"summary": {
-ds: _metric_stats_to_dict(stats)
+ds: _metric_stats_to_dict(stats, ds)
for ds, stats in agg.summary.items()
if stats.has_data
},
@@ -1200,7 +1225,7 @@ def yearly_to_json(agg: YearlyAggregate) -> dict[str, Any]:
"month": m.month,
"days_with_data": len(m.daily),
"summary": {
-ds: _metric_stats_to_dict(stats)
+ds: _metric_stats_to_dict(stats, ds)
for ds, stats in m.summary.items()
if stats.has_data
},
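The `REPORT_UNITS_RAW` table and the extra `metric` argument implement the "keep report JSON raw values with explicit units" part of the fix: values stay in their raw units (millivolts, seconds, packet counts) and each metric dict gains a `unit` label. A condensed sketch of the pattern (`stats_to_dict` and the trimmed unit table are simplifications for illustration):

```python
# Raw-unit labeling: values are not converted, the unit is stated instead.
REPORT_UNITS_RAW = {"bat": "mV", "last_rssi": "dBm", "recv": "packets"}

def stats_to_dict(metric, count, mean=None):
    result = {"count": count}
    unit = REPORT_UNITS_RAW.get(metric)
    if unit:  # unknown metrics simply carry no unit key
        result["unit"] = unit
    if mean is not None:
        result["mean"] = round(mean, 4)
    return result

print(stats_to_dict("bat", 288, 4012.3472))
# {'count': 288, 'unit': 'mV', 'mean': 4012.3472}
```

Keeping raw values with a declared unit means JSON consumers never have to guess whether a battery figure was pre-converted to volts.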

src/meshmon/telemetry.py (new file, 102 lines)

@@ -0,0 +1,102 @@
"""Telemetry data extraction from Cayenne LPP format."""
from typing import Any
from . import log
__all__ = ["extract_lpp_from_payload", "extract_telemetry_metrics"]
def extract_lpp_from_payload(payload: Any) -> list | None:
"""Extract LPP data list from telemetry payload.
Handles both formats returned by the MeshCore API:
- Dict format: {'pubkey_pre': '...', 'lpp': [...]}
- Direct list format: [...]
Args:
payload: Raw telemetry payload from get_self_telemetry() or req_telemetry_sync()
Returns:
The LPP data list, or None if not extractable.
"""
if payload is None:
return None
if isinstance(payload, dict):
lpp = payload.get("lpp")
if lpp is None:
log.debug("No 'lpp' key in telemetry payload dict")
return None
if not isinstance(lpp, list):
log.debug(f"Unexpected LPP data type in payload: {type(lpp).__name__}")
return None
return lpp
if isinstance(payload, list):
return payload
log.debug(f"Unexpected telemetry payload type: {type(payload).__name__}")
return None
def extract_telemetry_metrics(lpp_data: Any) -> dict[str, float]:
"""Extract numeric telemetry values from Cayenne LPP response.
Expected format:
[
{"type": "temperature", "channel": 0, "value": 23.5},
{"type": "gps", "channel": 1, "value": {"latitude": 51.5, "longitude": -0.1, "altitude": 10}}
]
Keys are formatted as:
- telemetry.{type}.{channel} for scalar values
- telemetry.{type}.{channel}.{subkey} for compound values (e.g., GPS)
Returns:
Dict mapping metric keys to float values. Invalid readings are skipped.
"""
if not isinstance(lpp_data, list):
log.warn(f"Expected list for LPP data, got {type(lpp_data).__name__}")
return {}
metrics: dict[str, float] = {}
for i, reading in enumerate(lpp_data):
if not isinstance(reading, dict):
log.debug(f"Skipping non-dict LPP reading at index {i}")
continue
sensor_type = reading.get("type")
if not isinstance(sensor_type, str) or not sensor_type.strip():
log.debug(f"Skipping reading with invalid type at index {i}")
continue
# Normalize sensor type for use as metric key component
sensor_type = sensor_type.strip().lower().replace(" ", "_")
channel = reading.get("channel", 0)
if not isinstance(channel, int):
channel = 0
value = reading.get("value")
base_key = f"telemetry.{sensor_type}.{channel}"
# Note: Check bool before int because bool is a subclass of int in Python.
# Some sensors may report digital on/off values as booleans.
if isinstance(value, bool):
metrics[base_key] = float(value)
elif isinstance(value, (int, float)):
metrics[base_key] = float(value)
elif isinstance(value, dict):
for subkey, subval in value.items():
if not isinstance(subkey, str):
continue
subkey_clean = subkey.strip().lower().replace(" ", "_")
if not subkey_clean:
continue
if isinstance(subval, bool):
metrics[f"{base_key}.{subkey_clean}"] = float(subval)
elif isinstance(subval, (int, float)):
metrics[f"{base_key}.{subkey_clean}"] = float(subval)
return metrics
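The key scheme above (`telemetry.{type}.{channel}` for scalars, an extra `.{subkey}` level for compound readings such as GPS) can be exercised with a self-contained sketch. `lpp_to_metrics` below is a simplified re-implementation for illustration only; the real module adds logging and the type/validity checks shown in the diff:

```python
# Simplified illustration of the telemetry key scheme; the real
# extract_telemetry_metrics() also validates types and logs skips.
def lpp_to_metrics(lpp_data):
    metrics = {}
    for reading in lpp_data:
        sensor = reading["type"].strip().lower().replace(" ", "_")
        channel = reading.get("channel", 0)
        base = f"telemetry.{sensor}.{channel}"
        value = reading["value"]
        if isinstance(value, dict):  # compound reading, e.g. GPS
            for sub, subval in value.items():
                metrics[f"{base}.{sub}"] = float(subval)
        else:
            metrics[base] = float(value)
    return metrics

print(lpp_to_metrics([
    {"type": "temperature", "channel": 0, "value": 23.5},
    {"type": "gps", "channel": 1,
     "value": {"latitude": 51.5, "longitude": -0.1}},
]))
# {'telemetry.temperature.0': 23.5,
#  'telemetry.gps.1.latitude': 51.5,
#  'telemetry.gps.1.longitude': -0.1}
```

Flattening compound readings into dotted keys lets GPS coordinates land in the same EAV metrics table as scalar sensors without schema changes.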


@@ -58,7 +58,8 @@
month: 'short',
day: 'numeric',
hour: '2-digit',
-minute: '2-digit'
+minute: '2-digit',
+timeZoneName: 'short'
};
// For year view, include year
@@ -180,15 +181,23 @@
const yMin = parseFloat(svg.dataset.yMin);
const yMax = parseFloat(svg.dataset.yMax);
-// Find the path with data-points
-const path = svg.querySelector('path[data-points]');
+// Find the primary line path for precise coordinates
+const path =
+svg.querySelector('path#chart-line') ||
+svg.querySelector('path[gid="chart-line"]') ||
+svg.querySelector('#chart-line path') ||
+svg.querySelector('[gid="chart-line"] path') ||
+svg.querySelector('path[data-points]');
if (!path) return;
-// Parse and cache data points and path coordinates on first access
-if (!path._dataPoints) {
+const pointsSource = path.dataset.points || svg.dataset.points;
+if (!pointsSource) return;
+// Parse and cache data points on first access
+if (!svg._dataPoints) {
try {
-const json = path.dataset.points.replace(/&quot;/g, '"');
-path._dataPoints = JSON.parse(json);
+const json = pointsSource.replace(/&quot;/g, '"');
+svg._dataPoints = JSON.parse(json);
} catch (e) {
console.warn('Failed to parse chart data:', e);
return;
@@ -220,7 +229,7 @@
const targetTs = xStart + clampedRelX * (xEnd - xStart);
// Find closest data point by timestamp
-const result = findClosestPoint(svg._dataPoints, targetTs);
+const result = findClosestPoint(svg._dataPoints, targetTs);
if (!result) return;
const { point } = result;