Compare commits

...

14 Commits

Author SHA1 Message Date
Jorijn Schrijvershof
c199ace4a2 chore(main): release 0.2.9 (#27) 2026-01-06 13:57:15 +01:00
Jorijn Schrijvershof
f7923b9434 fix: tooltip positioning and locale-aware time formatting
- Fix tooltip indicator dot appearing above chart line by using
  clipPath rect bounds instead of line path bounding box for Y positioning
- Inject data-points attribute into line path inside #chart-line group
- Refactor chart-tooltip.js with clear sections, extracted utilities,
  and centralized configuration
- Use navigator.language for date formatting to respect browser
  language preference (fixes 12h/24h format based on locale)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 13:56:36 +01:00
Jorijn Schrijvershof
c978844271 ci: add artifact-metadata permission for attestation storage records
Fixes warning: "Failed to create storage record: artifact-metadata:write
permission has been included"

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 12:15:50 +01:00
Jorijn Schrijvershof
64cc352b80 chore(main): release 0.2.8 (#26) 2026-01-06 12:06:53 +01:00
Jorijn Schrijvershof
e37aef6c5e fix: normalize reporting outputs and chart tooltips
- render last observation with local TZ label

- show zero values in report tables

- keep report JSON raw values with explicit units

- reduce chart queries per period and tag chart line paths

- remove redundant get_bat call in companion collector
2026-01-06 11:24:47 +01:00
Jorijn Schrijvershof
81b7c6897a chore(main): release 0.2.7 (#25) 2026-01-06 09:54:44 +01:00
Jorijn Schrijvershof
a3015e2209 feat: add telemetry collection for companion and repeater nodes (#24)
Add environmental telemetry collection (temperature, humidity, barometric
pressure, voltage) from both the repeater node (over LoRa) and companion
node (local serial). Telemetry is stored in the same EAV metrics table
with `telemetry.` prefix.

Key changes:
- Add TELEMETRY_ENABLED feature flag (defaults to OFF)
- Add telemetry-specific timeout/retry settings
- Create shared telemetry.py module with extract_lpp_from_payload()
  and extract_telemetry_metrics() helpers
- Handle MeshCore API dict payload format: {'pubkey_pre': '...', 'lpp': [...]}
- Repeater: store status metrics BEFORE attempting telemetry (LoRa reliability)
- Companion: merge telemetry into single DB write (serial is reliable)
- Telemetry failures do NOT affect circuit breaker state

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 09:53:15 +01:00
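The `extract_telemetry_metrics()` helper named in this commit flattens Cayenne LPP entries into `telemetry.`-prefixed EAV keys. A minimal sketch of that flattening, assuming the key layout described in the commit (`telemetry.<type>.<channel>` for scalars, one key per component for compound values) — the function body here is illustrative, not the project's actual code:

```python
def extract_telemetry_metrics(lpp_data: list[dict]) -> dict[str, float]:
    """Flatten Cayenne LPP entries into EAV keys like telemetry.temperature.0."""
    metrics: dict[str, float] = {}
    for entry in lpp_data:
        channel = entry.get("channel")
        sensor_type = entry.get("type")
        value = entry.get("value")
        if channel is None or sensor_type is None:
            continue
        if isinstance(value, (int, float)):
            metrics[f"telemetry.{sensor_type}.{channel}"] = float(value)
        elif isinstance(value, dict):
            # Compound values (e.g. GPS) fan out into one key per component
            for part, part_value in value.items():
                if isinstance(part_value, (int, float)):
                    metrics[f"telemetry.{sensor_type}.{channel}.{part}"] = float(part_value)
    return metrics

lpp = [
    {"channel": 0, "type": "temperature", "value": 23.5},
    {"channel": 1, "type": "gps", "value": {"latitude": 51.5, "longitude": -0.1}},
]
print(extract_telemetry_metrics(lpp))
```

With the sample input above this yields `telemetry.temperature.0`, `telemetry.gps.1.latitude`, and `telemetry.gps.1.longitude`.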
Jorijn Schrijvershof
5545ce5b28 Merge pull request #23 from jorijn/release-please--branches--main--components--meshcore-stats
chore(main): release 0.2.6
2026-01-05 10:50:13 +01:00
Jorijn Schrijvershof
666ed4215f chore(main): release 0.2.6 2026-01-05 10:49:22 +01:00
Jorijn Schrijvershof
3d0d90304c fix: add tmpfs mount for fontconfig cache to fix read-only filesystem errors
The container runs with read_only: true for security hardening, but
fontconfig needs a writable cache directory. Added tmpfs mount at
/var/cache/fontconfig to allow fontconfig to write its cache without
compromising the read-only filesystem security.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-05 10:49:02 +01:00
Jorijn Schrijvershof
6afc14e007 Merge pull request #21 from jorijn/release-please--branches--main--components--meshcore-stats
chore(main): release 0.2.5
2026-01-05 10:03:26 +01:00
Jorijn Schrijvershof
4c5a408604 chore(main): release 0.2.5 2026-01-05 10:00:46 +01:00
Jorijn Schrijvershof
3c5eace220 feat: add automatic serial port locking to prevent concurrent access
Implements fcntl.flock() based locking for serial transport to prevent
USB serial conflicts when collect_companion and collect_repeater run
simultaneously. This addresses Ofelia's limitation where no-overlap
only prevents a job from overlapping with itself, not other jobs.

Key changes:
- Add connect_with_lock() async context manager to meshcore_client.py
- Use non-blocking LOCK_NB with async polling to avoid freezing event loop
- Only lock for serial transport (TCP/BLE don't need it)
- 60s timeout with clear error message if lock cannot be acquired
- Update collector scripts to use new context manager
- Remove external flock from cron examples (now handled in Python)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-05 10:00:25 +01:00
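The locking strategy this commit describes — non-blocking `LOCK_NB` plus async polling so the event loop is never frozen — can be sketched roughly as follows. The real `connect_with_lock()` is an async context manager wrapping the whole connection; this sketch shows only the lock-acquisition part, and the function name and lock path are assumptions:

```python
import asyncio
import fcntl
import os
import time

async def acquire_serial_lock(lock_path: str, timeout_s: float = 60.0) -> int:
    """Poll a non-blocking flock; returns the lock fd, which the caller
    closes to release. Raises TimeoutError after timeout_s seconds."""
    fd = os.open(lock_path, os.O_CREAT | os.O_RDWR, 0o644)
    deadline = time.monotonic() + timeout_s
    while True:
        try:
            # LOCK_NB makes flock fail immediately instead of blocking,
            # so we can await between attempts instead of stalling the loop
            fcntl.flock(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
            return fd
        except BlockingIOError:
            if time.monotonic() >= deadline:
                os.close(fd)
                raise TimeoutError(f"could not lock {lock_path} within {timeout_s}s")
            await asyncio.sleep(0.5)

fd = asyncio.run(acquire_serial_lock("/tmp/demo-serial.lock", timeout_s=5))
os.close(fd)  # closing the fd releases the flock
```

Because `flock` locks are advisory and process-scoped, this only guards against the other collector script, which is exactly the Ofelia no-overlap gap the commit describes.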
Jorijn Schrijvershof
7eee23ec40 docs: fix formatting in architecture diagram 2026-01-05 09:57:54 +01:00
18 changed files with 1137 additions and 555 deletions

View File

@@ -33,6 +33,7 @@ permissions:
packages: write
id-token: write
attestations: write
artifact-metadata: write
concurrency:
group: docker-${{ github.ref }}

View File

@@ -1,3 +1,3 @@
{
".": "0.2.4"
".": "0.2.9"
}

View File

@@ -4,6 +4,51 @@ All notable changes to this project will be documented in this file.
This changelog is automatically generated by [release-please](https://github.com/googleapis/release-please) based on [Conventional Commits](https://www.conventionalcommits.org/).
## [0.2.9](https://github.com/jorijn/meshcore-stats/compare/v0.2.8...v0.2.9) (2026-01-06)
### Bug Fixes
* tooltip positioning and locale-aware time formatting ([f7923b9](https://github.com/jorijn/meshcore-stats/commit/f7923b94346c3d492e7291ecca208ab704176308))
### Continuous Integration
* add artifact-metadata permission for attestation storage records ([c978844](https://github.com/jorijn/meshcore-stats/commit/c978844271eafd35f4778d748d7c832309d1614f))
## [0.2.8](https://github.com/jorijn/meshcore-stats/compare/v0.2.7...v0.2.8) (2026-01-06)
### Bug Fixes
* normalize reporting outputs and chart tooltips ([e37aef6](https://github.com/jorijn/meshcore-stats/commit/e37aef6c5e55d2077baf4ee35abdff0562983d69))
## [0.2.7](https://github.com/jorijn/meshcore-stats/compare/v0.2.6...v0.2.7) (2026-01-06)
### Features
* add telemetry collection for companion and repeater nodes ([#24](https://github.com/jorijn/meshcore-stats/issues/24)) ([a3015e2](https://github.com/jorijn/meshcore-stats/commit/a3015e2209781bdd7c317fa992ced6afa19efe61))
## [0.2.6](https://github.com/jorijn/meshcore-stats/compare/v0.2.5...v0.2.6) (2026-01-05)
### Bug Fixes
* add tmpfs mount for fontconfig cache to fix read-only filesystem errors ([3d0d903](https://github.com/jorijn/meshcore-stats/commit/3d0d90304cec5ebcdb34935400de31afd62e258d))
## [0.2.5](https://github.com/jorijn/meshcore-stats/compare/v0.2.4...v0.2.5) (2026-01-05)
### Features
* add automatic serial port locking to prevent concurrent access ([3c5eace](https://github.com/jorijn/meshcore-stats/commit/3c5eace2207279c55401dd8fa27294d5a94bb682))
### Documentation
* fix formatting in architecture diagram ([7eee23e](https://github.com/jorijn/meshcore-stats/commit/7eee23ec40ff9441515b4ac18fbb7cd3f87fa4b5))
## [0.2.4](https://github.com/jorijn/meshcore-stats/compare/v0.2.3...v0.2.4) (2026-01-05)

View File

@@ -16,6 +16,8 @@ Always edit the source templates, then regenerate with `python scripts/render_si
## Running Commands
**IMPORTANT: Always activate the virtual environment before running any Python commands.**
```bash
cd /path/to/meshcore-stats
source .venv/bin/activate
@@ -354,11 +356,17 @@ All configuration via `meshcore.conf` or environment variables. The config file
### Timeouts & Retry
- `REMOTE_TIMEOUT_S`: Minimum timeout for LoRa requests (default: 10)
- `REMOTE_RETRY_ATTEMPTS`: Number of retry attempts (default: 5)
- `REMOTE_RETRY_ATTEMPTS`: Number of retry attempts (default: 2)
- `REMOTE_RETRY_BACKOFF_S`: Seconds between retries (default: 4)
- `REMOTE_CB_FAILS`: Failures before circuit breaker opens (default: 6)
- `REMOTE_CB_COOLDOWN_S`: Circuit breaker cooldown (default: 3600)
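The `REMOTE_CB_FAILS` / `REMOTE_CB_COOLDOWN_S` pair describes a consecutive-failure circuit breaker. A minimal sketch of that behaviour — the method names `record_success`/`record_failure` appear in the collector diff below, but this class body is an assumption, not the project's implementation:

```python
import time

class CircuitBreaker:
    """Opens after N consecutive failures; closes again after a cooldown."""

    def __init__(self) -> None:
        self.consecutive_failures = 0
        self.open_until = 0.0

    def record_success(self) -> None:
        self.consecutive_failures = 0
        self.open_until = 0.0

    def record_failure(self, max_fails: int, cooldown_s: int) -> None:
        self.consecutive_failures += 1
        if self.consecutive_failures >= max_fails:
            self.open_until = time.time() + cooldown_s

    def is_open(self) -> bool:
        return time.time() < self.open_until

cb = CircuitBreaker()
for _ in range(6):           # REMOTE_CB_FAILS=6
    cb.record_failure(6, 3600)
print(cb.is_open())          # breaker is open; skip collection until cooldown
```

While the breaker is open, the repeater collector exits early with no metrics written, which matches the "Skip collection - no metrics to write" path in the diff below.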
### Telemetry Collection
- `TELEMETRY_ENABLED`: Enable environmental telemetry collection from repeater (0/1, default: 0)
- `TELEMETRY_TIMEOUT_S`: Timeout for telemetry requests (default: 10)
- `TELEMETRY_RETRY_ATTEMPTS`: Retry attempts for telemetry (default: 2)
- `TELEMETRY_RETRY_BACKOFF_S`: Backoff between telemetry retries (default: 4)
### Intervals
- `COMPANION_STEP`: Collection interval for companion (default: 60s)
- `REPEATER_STEP`: Collection interval for repeater (default: 900s / 15min)
@@ -410,6 +418,12 @@ Metrics are classified as either **gauge** or **counter** in `src/meshmon/metric
Counter metrics are converted to rates during chart rendering by calculating deltas between consecutive readings.
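The delta-between-consecutive-readings conversion described above can be sketched as follows (an assumed shape; the project's actual conversion lives in the chart rendering code):

```python
def counter_to_rates(points: list[tuple[int, float]]) -> list[tuple[int, float]]:
    """Convert cumulative counter readings (ts, value) into per-second rates."""
    rates: list[tuple[int, float]] = []
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        dt = t1 - t0
        dv = v1 - v0
        if dt <= 0 or dv < 0:
            continue  # skip clock glitches and counter resets (e.g. reboot)
        rates.append((t1, dv / dt))
    return rates

readings = [(0, 100.0), (60, 160.0), (120, 160.0), (180, 40.0)]
print(counter_to_rates(readings))  # [(60, 1.0), (120, 0.0)] — the reset at t=180 is dropped
```

Dropping negative deltas matters for metrics like `uptime_secs` and the packet counters, which reset to zero when the node reboots.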
- **TELEMETRY**: Environmental sensor data (when `TELEMETRY_ENABLED=1`):
- Stored with `telemetry.` prefix: `telemetry.temperature.0`, `telemetry.humidity.0`, `telemetry.barometer.0`
- Channel number distinguishes multiple sensors of the same type
- Compound values (e.g., GPS) stored as: `telemetry.gps.0.latitude`, `telemetry.gps.0.longitude`
- Telemetry collection does NOT affect circuit breaker state
## Database Schema
Metrics are stored in a SQLite database at `data/state/metrics.db` with WAL mode enabled for concurrent access.
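Enabling WAL mode is a one-time pragma on the connection; a minimal sketch of opening the database that way (the helper name and `busy_timeout` value are assumptions, only the path comes from the docs above):

```python
import os
import sqlite3
import tempfile

def open_metrics_db(path: str) -> sqlite3.Connection:
    """Open the metrics DB in WAL mode so readers don't block the writer."""
    conn = sqlite3.connect(path)
    conn.execute("PRAGMA journal_mode=WAL")   # persists in the database file
    conn.execute("PRAGMA busy_timeout=5000")  # wait up to 5s on a locked db
    return conn

demo_path = os.path.join(tempfile.mkdtemp(), "metrics.db")
conn = open_metrics_db(demo_path)
print(conn.execute("PRAGMA journal_mode").fetchone()[0])  # wal
```

WAL is what lets the collectors write while `render_charts.py` reads concurrently; note it requires a file-backed database (an in-memory database silently stays in `memory` journal mode).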
@@ -694,16 +708,14 @@ meshcore-cli -s /dev/ttyACM0 reset_path "repeater name"
## Cron Setup (Example)
Use `flock` to prevent USB serial conflicts when companion and repeater collection overlap.
```cron
MESHCORE=/path/to/meshcore-stats
# Companion: every minute
* * * * * cd $MESHCORE && flock -w 60 /tmp/meshcore.lock .venv/bin/python scripts/collect_companion.py
* * * * * cd $MESHCORE && .venv/bin/python scripts/collect_companion.py
# Repeater: every 15 minutes (offset by 1 min for staggering)
1,16,31,46 * * * * cd $MESHCORE && flock -w 60 /tmp/meshcore.lock .venv/bin/python scripts/collect_repeater.py
1,16,31,46 * * * * cd $MESHCORE && .venv/bin/python scripts/collect_repeater.py
# Charts: every 5 minutes (generates SVG charts from database)
*/5 * * * * cd $MESHCORE && .venv/bin/python scripts/render_charts.py
@@ -717,7 +729,7 @@ MESHCORE=/path/to/meshcore-stats
**Notes:**
- `cd $MESHCORE` is required because paths in the config are relative to the project root
- `flock -w 60` waits up to 60 seconds for the lock, preventing USB serial conflicts
- Serial port locking is handled automatically via `fcntl.flock()` in Python (no external `flock` needed)
## Adding New Metrics

View File

@@ -184,10 +184,10 @@ Add to your crontab (`crontab -e`):
MESHCORE=/path/to/meshcore-stats
# Companion: every minute
* * * * * cd $MESHCORE && flock -w 60 /tmp/meshcore.lock .venv/bin/python scripts/collect_companion.py
* * * * * cd $MESHCORE && .venv/bin/python scripts/collect_companion.py
# Repeater: every 15 minutes
1,16,31,46 * * * * cd $MESHCORE && flock -w 60 /tmp/meshcore.lock .venv/bin/python scripts/collect_repeater.py
1,16,31,46 * * * * cd $MESHCORE && .venv/bin/python scripts/collect_repeater.py
# Charts: every 5 minutes
*/5 * * * * cd $MESHCORE && .venv/bin/python scripts/render_charts.py
@@ -328,7 +328,7 @@ docker compose restart meshcore-stats
```
┌─────────────────┐ LoRa ┌─────────────────┐
│ Companion │◄────────────►│ Repeater │
│ (USB Serial) │ │ (Remote) │
└────────┬────────┘ └─────────────────┘

View File

@@ -15,7 +15,7 @@ services:
# MeshCore Stats - Data collection and rendering
# ==========================================================================
meshcore-stats:
image: ghcr.io/jorijn/meshcore-stats:0.2.4 # x-release-please-version
image: ghcr.io/jorijn/meshcore-stats:0.2.9 # x-release-please-version
container_name: meshcore-stats
restart: unless-stopped
@@ -47,6 +47,7 @@ services:
read_only: true
tmpfs:
- /tmp:noexec,nosuid,size=64m
- /var/cache/fontconfig:noexec,nosuid,size=4m
# Resource limits
deploy:

View File

@@ -102,6 +102,84 @@ Returns a single dict with all status fields.
---
## Telemetry Data
Environmental telemetry is requested via `req_telemetry_sync(contact)` and returns
Cayenne LPP formatted sensor data. This requires `TELEMETRY_ENABLED=1` and a sensor
board attached to the repeater.
### Payload Format
Both `req_telemetry_sync()` and `get_self_telemetry()` return a dict containing the
LPP data list and a public key prefix:
```python
{
'pubkey_pre': 'a5c14f5244d6',
'lpp': [
{'channel': 0, 'type': 'temperature', 'value': 23.5},
{'channel': 0, 'type': 'humidity', 'value': 45.2},
]
}
```
The `extract_lpp_from_payload()` helper in `src/meshmon/telemetry.py` handles
extracting the `lpp` list from this wrapper format.
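A sketch of what that unwrapping might look like — the helper name and wrapper keys come from this page, but the body is illustrative rather than the actual code in `src/meshmon/telemetry.py`:

```python
from typing import Optional

def extract_lpp_from_payload(payload) -> Optional[list]:
    """Pull the LPP list out of the MeshCore dict wrapper
    ({'pubkey_pre': ..., 'lpp': [...]}), tolerating a bare list too."""
    if isinstance(payload, list):
        return payload
    if isinstance(payload, dict):
        lpp = payload.get("lpp")
        if isinstance(lpp, list):
            return lpp
    return None  # unrecognised shape: caller treats this as "no telemetry"

wrapped = {
    "pubkey_pre": "a5c14f5244d6",
    "lpp": [{"channel": 0, "type": "temperature", "value": 23.5}],
}
print(extract_lpp_from_payload(wrapped))
```

Returning `None` (rather than raising) for unrecognised shapes fits the collectors' behaviour of logging telemetry problems and moving on without touching the circuit breaker.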
### `req_telemetry_sync(contact)`
Returns sensor readings from a remote node in Cayenne LPP format:
```python
[
{'channel': 0, 'type': 'temperature', 'value': 23.5},
{'channel': 0, 'type': 'humidity', 'value': 45.2},
{'channel': 0, 'type': 'barometer', 'value': 1013.25},
{'channel': 1, 'type': 'gps', 'value': {'latitude': 51.5, 'longitude': -0.1, 'altitude': 10}},
]
```
**Common sensor types:**
| Type | Unit | Description |
|------|------|-------------|
| `temperature` | Celsius | Temperature reading |
| `humidity` | % | Relative humidity |
| `barometer` | hPa/mbar | Barometric pressure |
| `voltage` | V | Voltage reading |
| `gps` | compound | GPS with `latitude`, `longitude`, `altitude` |
**Stored as:**
- `telemetry.temperature.0` - Temperature on channel 0
- `telemetry.humidity.0` - Humidity on channel 0
- `telemetry.gps.1.latitude` - GPS latitude on channel 1
**Notes:**
- Requires environmental sensor board (BME280, BME680, etc.) on repeater
- Channel number distinguishes multiple sensors of the same type
- Not all repeaters have environmental sensors attached
- Telemetry collection does not affect circuit breaker state
- Telemetry failures are logged as warnings and do not block status collection
### `get_self_telemetry()`
Returns self telemetry from the companion node's attached sensors.
Same Cayenne LPP format as `req_telemetry_sync()`.
```python
[
{'channel': 0, 'type': 'temperature', 'value': 23.5},
{'channel': 0, 'type': 'humidity', 'value': 45.2},
]
```
**Notes:**
- Requires environmental sensor board attached to companion
- Returns empty list if no sensors attached
- Uses same format as repeater telemetry
---
## Derived Metrics
These are computed at query time, not stored:

View File

@@ -113,6 +113,23 @@ RADIO_CODING_RATE=CR8
# REMOTE_CB_FAILS=6
# REMOTE_CB_COOLDOWN_S=3600
# =============================================================================
# Telemetry Collection (Environmental Sensors)
# =============================================================================
# Enable telemetry collection from repeater's environmental sensors
# (temperature, humidity, barometric pressure, etc.)
# Requires sensor board attached to repeater (e.g., BME280, BME680)
# Default: 0 (disabled)
# TELEMETRY_ENABLED=1
# Telemetry-specific timeout and retry settings
# Defaults match status settings. Separate config allows tuning if telemetry
# proves problematic (e.g., firmware doesn't support it, sensor board missing).
# You can reduce these if telemetry collection is causing issues.
# TELEMETRY_TIMEOUT_S=10
# TELEMETRY_RETRY_ATTEMPTS=2
# TELEMETRY_RETRY_BACKOFF_S=4
# =============================================================================
# Paths (Native installation only)
# =============================================================================

View File

@@ -25,8 +25,9 @@ sys.path.insert(0, str(Path(__file__).parent.parent / "src"))
from meshmon.env import get_config
from meshmon import log
from meshmon.meshcore_client import connect_from_env, run_command
from meshmon.meshcore_client import connect_with_lock, run_command
from meshmon.db import init_db, insert_metrics
from meshmon.telemetry import extract_lpp_from_payload, extract_telemetry_metrics
async def collect_companion() -> int:
@@ -39,138 +40,132 @@ async def collect_companion() -> int:
cfg = get_config()
ts = int(time.time())
log.debug("Connecting to companion node...")
mc = await connect_from_env()
if mc is None:
log.error("Failed to connect to companion node")
return 1
# Metrics to insert (firmware field names)
metrics: dict[str, float] = {}
commands_succeeded = 0
# Commands are accessed via mc.commands
cmd = mc.commands
log.debug("Connecting to companion node...")
async with connect_with_lock() as mc:
if mc is None:
log.error("Failed to connect to companion node")
return 1
try:
# send_appstart (already called during connect, but call again to get self_info)
ok, evt_type, payload, err = await run_command(
mc, cmd.send_appstart(), "send_appstart"
)
if ok:
commands_succeeded += 1
log.debug(f"appstart: {evt_type}")
else:
log.error(f"appstart failed: {err}")
# Commands are accessed via mc.commands
cmd = mc.commands
# send_device_query
ok, evt_type, payload, err = await run_command(
mc, cmd.send_device_query(), "send_device_query"
)
if ok:
commands_succeeded += 1
log.debug(f"device_query: {payload}")
else:
log.error(f"device_query failed: {err}")
try:
# send_appstart (already called during connect, but call again to get self_info)
ok, evt_type, payload, err = await run_command(
mc, cmd.send_appstart(), "send_appstart"
)
if ok:
commands_succeeded += 1
log.debug(f"appstart: {evt_type}")
else:
log.error(f"appstart failed: {err}")
# get_bat
ok, evt_type, payload, err = await run_command(
mc, cmd.get_bat(), "get_bat"
)
if ok:
commands_succeeded += 1
log.debug(f"get_bat: {payload}")
else:
log.error(f"get_bat failed: {err}")
# send_device_query
ok, evt_type, payload, err = await run_command(
mc, cmd.send_device_query(), "send_device_query"
)
if ok:
commands_succeeded += 1
log.debug(f"device_query: {payload}")
else:
log.error(f"device_query failed: {err}")
# get_time
ok, evt_type, payload, err = await run_command(
mc, cmd.get_time(), "get_time"
)
if ok:
commands_succeeded += 1
log.debug(f"get_time: {payload}")
else:
log.error(f"get_time failed: {err}")
# get_time
ok, evt_type, payload, err = await run_command(
mc, cmd.get_time(), "get_time"
)
if ok:
commands_succeeded += 1
log.debug(f"get_time: {payload}")
else:
log.error(f"get_time failed: {err}")
# get_self_telemetry
ok, evt_type, payload, err = await run_command(
mc, cmd.get_self_telemetry(), "get_self_telemetry"
)
if ok:
commands_succeeded += 1
log.debug(f"get_self_telemetry: {payload}")
else:
log.error(f"get_self_telemetry failed: {err}")
# get_self_telemetry - collect environmental sensor data
# Note: The call happens regardless of telemetry_enabled for device query completeness,
# but we only extract and store metrics if the feature is enabled.
ok, evt_type, payload, err = await run_command(
mc, cmd.get_self_telemetry(), "get_self_telemetry"
)
if ok:
commands_succeeded += 1
log.debug(f"get_self_telemetry: {payload}")
# Extract and store telemetry if enabled
if cfg.telemetry_enabled:
lpp_data = extract_lpp_from_payload(payload)
if lpp_data is not None:
telemetry_metrics = extract_telemetry_metrics(lpp_data)
if telemetry_metrics:
metrics.update(telemetry_metrics)
log.debug(f"Extracted {len(telemetry_metrics)} telemetry metrics")
else:
# Debug level because not all devices have sensors attached - this is expected
log.debug(f"get_self_telemetry failed: {err}")
# get_custom_vars
ok, evt_type, payload, err = await run_command(
mc, cmd.get_custom_vars(), "get_custom_vars"
)
if ok:
commands_succeeded += 1
log.debug(f"get_custom_vars: {payload}")
else:
log.debug(f"get_custom_vars failed: {err}")
# get_custom_vars
ok, evt_type, payload, err = await run_command(
mc, cmd.get_custom_vars(), "get_custom_vars"
)
if ok:
commands_succeeded += 1
log.debug(f"get_custom_vars: {payload}")
else:
log.debug(f"get_custom_vars failed: {err}")
# get_contacts - count contacts
ok, evt_type, payload, err = await run_command(
mc, cmd.get_contacts(), "get_contacts"
)
if ok:
commands_succeeded += 1
contacts_count = len(payload) if payload else 0
metrics["contacts"] = float(contacts_count)
log.debug(f"get_contacts: found {contacts_count} contacts")
else:
log.error(f"get_contacts failed: {err}")
# get_contacts - count contacts
ok, evt_type, payload, err = await run_command(
mc, cmd.get_contacts(), "get_contacts"
)
if ok:
commands_succeeded += 1
contacts_count = len(payload) if payload else 0
metrics["contacts"] = float(contacts_count)
log.debug(f"get_contacts: found {contacts_count} contacts")
else:
log.error(f"get_contacts failed: {err}")
# Get statistics - these contain the main metrics
# Core stats (battery_mv, uptime_secs, errors, queue_len)
ok, evt_type, payload, err = await run_command(
mc, cmd.get_stats_core(), "get_stats_core"
)
if ok and payload and isinstance(payload, dict):
commands_succeeded += 1
# Insert all numeric fields from stats_core
for key, value in payload.items():
if isinstance(value, (int, float)):
metrics[key] = float(value)
log.debug(f"stats_core: {payload}")
# Get statistics - these contain the main metrics
# Core stats (battery_mv, uptime_secs, errors, queue_len)
ok, evt_type, payload, err = await run_command(
mc, cmd.get_stats_core(), "get_stats_core"
)
if ok and payload and isinstance(payload, dict):
commands_succeeded += 1
# Insert all numeric fields from stats_core
for key, value in payload.items():
if isinstance(value, (int, float)):
metrics[key] = float(value)
log.debug(f"stats_core: {payload}")
# Radio stats (noise_floor, last_rssi, last_snr, tx_air_secs, rx_air_secs)
ok, evt_type, payload, err = await run_command(
mc, cmd.get_stats_radio(), "get_stats_radio"
)
if ok and payload and isinstance(payload, dict):
commands_succeeded += 1
for key, value in payload.items():
if isinstance(value, (int, float)):
metrics[key] = float(value)
log.debug(f"stats_radio: {payload}")
# Radio stats (noise_floor, last_rssi, last_snr, tx_air_secs, rx_air_secs)
ok, evt_type, payload, err = await run_command(
mc, cmd.get_stats_radio(), "get_stats_radio"
)
if ok and payload and isinstance(payload, dict):
commands_succeeded += 1
for key, value in payload.items():
if isinstance(value, (int, float)):
metrics[key] = float(value)
log.debug(f"stats_radio: {payload}")
# Packet stats (recv, sent, flood_tx, direct_tx, flood_rx, direct_rx)
ok, evt_type, payload, err = await run_command(
mc, cmd.get_stats_packets(), "get_stats_packets"
)
if ok and payload and isinstance(payload, dict):
commands_succeeded += 1
for key, value in payload.items():
if isinstance(value, (int, float)):
metrics[key] = float(value)
log.debug(f"stats_packets: {payload}")
# Packet stats (recv, sent, flood_tx, direct_tx, flood_rx, direct_rx)
ok, evt_type, payload, err = await run_command(
mc, cmd.get_stats_packets(), "get_stats_packets"
)
if ok and payload and isinstance(payload, dict):
commands_succeeded += 1
for key, value in payload.items():
if isinstance(value, (int, float)):
metrics[key] = float(value)
log.debug(f"stats_packets: {payload}")
except Exception as e:
log.error(f"Error during collection: {e}")
except Exception as e:
log.error(f"Error during collection: {e}")
finally:
# Close connection
if hasattr(mc, "disconnect"):
try:
await mc.disconnect()
except Exception:
pass
# Connection closed and lock released by context manager
# Print summary
summary_parts = [f"ts={ts}"]
@@ -183,6 +178,10 @@ async def collect_companion() -> int:
summary_parts.append(f"rx={int(metrics['recv'])}")
if "sent" in metrics:
summary_parts.append(f"tx={int(metrics['sent'])}")
# Add telemetry count to summary if present
telemetry_count = sum(1 for k in metrics if k.startswith("telemetry."))
if telemetry_count > 0:
summary_parts.append(f"telem={telemetry_count}")
log.info(f"Companion: {', '.join(summary_parts)}")

View File

@@ -27,15 +27,15 @@ sys.path.insert(0, str(Path(__file__).parent.parent / "src"))
from meshmon.env import get_config
from meshmon import log
from meshmon.meshcore_client import (
connect_from_env,
connect_with_lock,
run_command,
get_contact_by_name,
get_contact_by_key_prefix,
extract_contact_info,
list_contacts_summary,
)
from meshmon.db import init_db, insert_metrics
from meshmon.retry import get_repeater_circuit_breaker, with_retries
from meshmon.telemetry import extract_lpp_from_payload, extract_telemetry_metrics
async def find_repeater_contact(mc: Any) -> Optional[Any]:
@@ -143,8 +143,10 @@ async def query_repeater_with_retry(
async def collect_repeater() -> int:
"""
Collect data from remote repeater node.
"""Collect data from remote repeater node.
Collects status metrics (battery, uptime, packet counters, etc.) and
optionally telemetry data (temperature, humidity, pressure) if enabled.
Returns:
Exit code (0 = success, 1 = error)
@@ -161,122 +163,154 @@ async def collect_repeater() -> int:
# Skip collection - no metrics to write
return 0
# Connect to companion
log.debug("Connecting to companion node...")
mc = await connect_from_env()
if mc is None:
log.error("Failed to connect to companion node")
return 1
# Metrics to insert (firmware field names from req_status_sync)
metrics: dict[str, float] = {}
status_metrics: dict[str, float] = {}
telemetry_metrics: dict[str, float] = {}
node_name = "unknown"
status_ok = False
# Commands are accessed via mc.commands
cmd = mc.commands
try:
# Initialize (appstart already called during connect)
ok, evt_type, payload, err = await run_command(
mc, cmd.send_appstart(), "send_appstart"
)
if not ok:
log.error(f"appstart failed: {err}")
# Find repeater contact
contact = await find_repeater_contact(mc)
if contact is None:
log.error("Cannot find repeater contact")
# Connect to companion
log.debug("Connecting to companion node...")
async with connect_with_lock() as mc:
if mc is None:
log.error("Failed to connect to companion node")
return 1
# Store contact info
contact_info = extract_contact_info(contact)
node_name = contact_info.get("adv_name", "unknown")
# Commands are accessed via mc.commands
cmd = mc.commands
log.debug(f"Found repeater: {node_name}")
try:
# Initialize (appstart already called during connect)
ok, evt_type, payload, err = await run_command(
mc, cmd.send_appstart(), "send_appstart"
)
if not ok:
log.error(f"appstart failed: {err}")
# Optional login (if command exists)
if cfg.repeater_password and hasattr(cmd, "send_login"):
log.debug("Attempting login...")
try:
ok, evt_type, payload, err = await run_command(
mc,
cmd.send_login(contact, cfg.repeater_password),
"send_login",
)
if ok:
log.debug("Login successful")
else:
log.debug(f"Login failed or not supported: {err}")
except Exception as e:
log.debug(f"Login not supported: {e}")
# Find repeater contact
contact = await find_repeater_contact(mc)
# Query status (using _sync version which returns payload directly)
# Use timeout=0 to let the device suggest timeout, with min_timeout as floor
log.debug("Querying repeater status...")
success, payload, err = await query_repeater_with_retry(
mc,
contact,
"req_status_sync",
lambda: cmd.req_status_sync(contact, timeout=0, min_timeout=cfg.remote_timeout_s),
)
if success and payload and isinstance(payload, dict):
status_ok = True
# Insert all numeric fields from status response
for key, value in payload.items():
if isinstance(value, (int, float)):
metrics[key] = float(value)
log.debug(f"req_status_sync: {payload}")
else:
log.warn(f"req_status_sync failed: {err}")
if contact is None:
log.error("Cannot find repeater contact")
return 1
# Update circuit breaker
if status_ok:
cb.record_success()
log.debug("Circuit breaker: recorded success")
else:
# Store contact info
contact_info = extract_contact_info(contact)
node_name = contact_info.get("adv_name", "unknown")
log.debug(f"Found repeater: {node_name}")
# Optional login (if command exists)
if cfg.repeater_password and hasattr(cmd, "send_login"):
log.debug("Attempting login...")
try:
ok, evt_type, payload, err = await run_command(
mc,
cmd.send_login(contact, cfg.repeater_password),
"send_login",
)
if ok:
log.debug("Login successful")
else:
log.debug(f"Login failed or not supported: {err}")
except Exception as e:
log.debug(f"Login not supported: {e}")
# Phase 1: Status collection (affects circuit breaker)
# Use timeout=0 to let the device suggest timeout, with min_timeout as floor
log.debug("Querying repeater status...")
success, payload, err = await query_repeater_with_retry(
mc,
contact,
"req_status_sync",
lambda: cmd.req_status_sync(contact, timeout=0, min_timeout=cfg.remote_timeout_s),
)
if success and payload and isinstance(payload, dict):
status_ok = True
# Insert all numeric fields from status response
for key, value in payload.items():
if isinstance(value, (int, float)):
status_metrics[key] = float(value)
log.debug(f"req_status_sync: {payload}")
else:
log.warn(f"req_status_sync failed: {err}")
# Update circuit breaker based on status result
if status_ok:
cb.record_success()
log.debug("Circuit breaker: recorded success")
else:
cb.record_failure(cfg.remote_cb_fails, cfg.remote_cb_cooldown_s)
log.debug(f"Circuit breaker: recorded failure ({cb.consecutive_failures}/{cfg.remote_cb_fails})")
# CRITICAL: Store status metrics immediately before attempting telemetry
# This ensures critical data is saved even if telemetry fails
if status_ok and status_metrics:
try:
inserted = insert_metrics(ts=ts, role="repeater", metrics=status_metrics)
log.debug(f"Stored {inserted} status metrics (ts={ts})")
except Exception as e:
log.error(f"Failed to store status metrics: {e}")
return 1
# Phase 2: Telemetry collection (does NOT affect circuit breaker)
if cfg.telemetry_enabled and status_ok:
log.debug("Querying repeater telemetry...")
try:
# Note: Telemetry uses its own retry settings and does NOT
# affect circuit breaker. Status success proves the link is up;
# telemetry failures are likely firmware/capability issues.
telem_success, telem_payload, telem_err = await with_retries(
lambda: cmd.req_telemetry_sync(
contact, timeout=0, min_timeout=cfg.telemetry_timeout_s
),
attempts=cfg.telemetry_retry_attempts,
backoff_s=cfg.telemetry_retry_backoff_s,
name="req_telemetry_sync",
)
if telem_success and telem_payload:
log.debug(f"req_telemetry_sync: {telem_payload}")
lpp_data = extract_lpp_from_payload(telem_payload)
if lpp_data is not None:
telemetry_metrics = extract_telemetry_metrics(lpp_data)
log.debug(f"Extracted {len(telemetry_metrics)} telemetry metrics")
# Store telemetry metrics
if telemetry_metrics:
try:
inserted = insert_metrics(ts=ts, role="repeater", metrics=telemetry_metrics)
log.debug(f"Stored {inserted} telemetry metrics")
except Exception as e:
log.warn(f"Failed to store telemetry metrics: {e}")
else:
log.warn(f"req_telemetry_sync failed: {telem_err}")
except Exception as e:
log.warn(f"Telemetry collection error (continuing): {e}")
except Exception as e:
log.error(f"Error during collection: {e}")
cb.record_failure(cfg.remote_cb_fails, cfg.remote_cb_cooldown_s)
log.debug(f"Circuit breaker: recorded failure ({cb.consecutive_failures}/{cfg.remote_cb_fails})")
except Exception as e:
log.error(f"Error during collection: {e}")
cb.record_failure(cfg.remote_cb_fails, cfg.remote_cb_cooldown_s)
finally:
# Close connection
if hasattr(mc, "disconnect"):
try:
await mc.disconnect()
except Exception:
pass
# Connection closed and lock released by context manager
# Print summary
summary_parts = [f"ts={ts}"]
if "bat" in metrics:
bat_v = metrics["bat"] / 1000.0
if "bat" in status_metrics:
bat_v = status_metrics["bat"] / 1000.0
summary_parts.append(f"bat={bat_v:.2f}V")
if "uptime" in metrics:
uptime_days = metrics["uptime"] // 86400
if "uptime" in status_metrics:
uptime_days = status_metrics["uptime"] // 86400
summary_parts.append(f"uptime={int(uptime_days)}d")
if "nb_recv" in metrics:
summary_parts.append(f"rx={int(metrics['nb_recv'])}")
if "nb_sent" in metrics:
summary_parts.append(f"tx={int(metrics['nb_sent'])}")
if "nb_recv" in status_metrics:
summary_parts.append(f"rx={int(status_metrics['nb_recv'])}")
if "nb_sent" in status_metrics:
summary_parts.append(f"tx={int(status_metrics['nb_sent'])}")
if telemetry_metrics:
summary_parts.append(f"telem={len(telemetry_metrics)}")
log.info(f"Repeater ({node_name}): {', '.join(summary_parts)}")
# Write metrics to database
if status_ok and metrics:
try:
inserted = insert_metrics(ts=ts, role="repeater", metrics=metrics)
log.debug(f"Inserted {inserted} metrics to database (ts={ts})")
except Exception as e:
log.error(f"Failed to write metrics to database: {e}")
return 1
return 0 if status_ok else 1


@@ -1,3 +1,3 @@
"""MeshCore network monitoring library."""
__version__ = "0.2.4" # x-release-please-version
__version__ = "0.2.9" # x-release-please-version


@@ -167,6 +167,7 @@ def load_timeseries_from_db(
end_time: datetime,
lookback: timedelta,
period: str,
all_metrics: Optional[dict[str, list[tuple[int, float]]]] = None,
) -> TimeSeries:
"""Load time series data from SQLite database.
@@ -179,6 +180,7 @@ def load_timeseries_from_db(
end_time: End of the time range (typically now)
lookback: How far back to look
period: Period name for binning config ("day", "week", etc.)
all_metrics: Optional pre-fetched metrics dict for this period
Returns:
TimeSeries with extracted data points
@@ -188,7 +190,8 @@ def load_timeseries_from_db(
end_ts = int(end_time.timestamp())
# Fetch all metrics for this role/period (returns pivoted dict)
all_metrics = get_metrics_for_period(role, start_ts, end_ts)
if all_metrics is None:
all_metrics = get_metrics_for_period(role, start_ts, end_ts)
# Get data for this specific metric
metric_data = all_metrics.get(metric, [])
@@ -379,10 +382,22 @@ def render_chart_svg(
# Plot area fill
area_color = _hex_to_rgba(theme.area)
ax.fill_between(timestamps, values, alpha=area_color[3], color=f"#{theme.line}")
area = ax.fill_between(
timestamps,
values,
alpha=area_color[3],
color=f"#{theme.line}",
)
area.set_gid("chart-area")
# Plot line
ax.plot(timestamps, values, color=f"#{theme.line}", linewidth=2)
(line,) = ax.plot(
timestamps,
values,
color=f"#{theme.line}",
linewidth=2,
)
line.set_gid("chart-line")
# Set Y-axis limits and track actual values used
if y_min is not None and y_max is not None:
@@ -458,7 +473,7 @@ def _inject_data_attributes(
Adds:
- data-metric, data-period, data-theme, data-x-start, data-x-end, data-y-min, data-y-max to root <svg>
- data-points JSON array to the chart path element
- data-points JSON array to the root <svg> and chart line path
Args:
svg: Raw SVG string
@@ -495,22 +510,33 @@ def _inject_data_attributes(
r'<svg\b',
f'<svg data-metric="{ts.metric}" data-period="{ts.period}" data-theme="{theme_name}" '
f'data-x-start="{x_start_ts}" data-x-end="{x_end_ts}" '
f'data-y-min="{y_min_val}" data-y-max="{y_max_val}"',
f'data-y-min="{y_min_val}" data-y-max="{y_max_val}" '
f'data-points="{data_points_attr}"',
svg,
count=1
)
# Add data-points to the main path element (the line, not the fill)
# Look for the second path element (first is usually the fill area)
path_count = 0
def add_data_to_path(match):
nonlocal path_count
path_count += 1
if path_count == 2: # The line path
return f'<path data-points="{data_points_attr}"'
return match.group(0)
# Add data-points to the line path inside the #chart-line group
# matplotlib creates <g id="chart-line"><path d="..."></g>
svg, count = re.subn(
r'(<g[^>]*id="chart-line"[^>]*>\s*<path\b)',
rf'\1 data-points="{data_points_attr}"',
svg,
count=1,
)
svg = re.sub(r'<path\b', add_data_to_path, svg)
if count == 0:
# Fallback: look for the second path element (first is usually the fill area)
path_count = 0
def add_data_to_path(match):
nonlocal path_count
path_count += 1
if path_count == 2: # The line path
return f'<path data-points="{data_points_attr}"'
return match.group(0)
svg = re.sub(r'<path\b', add_data_to_path, svg)
return svg
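The injection strategy above — target the path inside the `#chart-line` group first, and only fall back to "second `<path>` in the document" when the group is missing — can be sketched in isolation. This is a minimal standalone sketch with hypothetical SVG strings, not the full `_inject_data_attributes` (which also writes the root `<svg>` attributes):

```python
import re

def inject_points(svg: str, points_json: str) -> str:
    # Preferred: tag the <path> inside the gid'd chart-line group.
    svg, count = re.subn(
        r'(<g[^>]*id="chart-line"[^>]*>\s*<path\b)',
        rf'\1 data-points="{points_json}"',
        svg,
        count=1,
    )
    if count == 0:
        # Fallback: tag the second <path> (the first is usually the area fill).
        seen = 0
        def tag_second(match):
            nonlocal seen
            seen += 1
            if seen == 2:
                return f'<path data-points="{points_json}"'
            return match.group(0)
        svg = re.sub(r'<path\b', tag_second, svg)
    return svg

grouped = '<svg><g id="chart-line"><path d="M0 0"/></g></svg>'
flat = '<svg><path d="fill"/><path d="line"/></svg>'
print(inject_points(grouped, '[1,2]'))
print(inject_points(flat, '[1,2]'))
```

The group-id path is precise because matplotlib emits `<g id="chart-line">` only for the artist tagged via `set_gid`; the positional fallback keeps older SVGs working.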
@@ -558,9 +584,16 @@ def render_all_charts(
for metric in metrics:
all_stats[metric] = {}
for period in periods:
period_cfg = PERIOD_CONFIG[period]
for period in periods:
period_cfg = PERIOD_CONFIG[period]
x_end = now
x_start = now - period_cfg["lookback"]
start_ts = int(x_start.timestamp())
end_ts = int(x_end.timestamp())
all_metrics = get_metrics_for_period(role, start_ts, end_ts)
for metric in metrics:
# Load time series from database
ts = load_timeseries_from_db(
role=role,
@@ -568,6 +601,7 @@ def render_all_charts(
end_time=now,
lookback=period_cfg["lookback"],
period=period,
all_metrics=all_metrics,
)
# Calculate and store statistics
@@ -579,10 +613,6 @@ def render_all_charts(
y_min = y_range[0] if y_range else None
y_max = y_range[1] if y_range else None
# Calculate X-axis range for full period padding
x_end = now
x_start = now - period_cfg["lookback"]
# Render chart for each theme
for theme_name in themes:
theme = CHART_THEMES[theme_name]
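The loop reordering above trades `len(metrics) × len(periods)` database fetches for one fetch per period, shared across metrics. A toy sketch of the same hoisting, with a stubbed call counter standing in for the real `get_metrics_for_period`:

```python
calls = 0

def fetch_period(role, start, end):
    # Stub standing in for get_metrics_for_period: one query per call.
    global calls
    calls += 1
    return {"bat": [(start, 1.0)], "nb_recv": [(start, 2.0)]}

metrics, periods = ["bat", "nb_recv"], ["day", "week", "month"]

# Before: fetch inside the metric loop -> len(metrics) * len(periods) calls.
for metric in metrics:
    for period in periods:
        fetch_period("repeater", 0, 100)
before = calls

# After: fetch once per period, reuse the pivoted dict for every metric.
calls = 0
for period in periods:
    shared = fetch_period("repeater", 0, 100)
    for metric in metrics:
        series = shared.get(metric, [])  # per-metric slice, no extra query
print(before, calls)
```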


@@ -155,6 +155,14 @@ class Config:
self.remote_cb_fails = get_int("REMOTE_CB_FAILS", 6)
self.remote_cb_cooldown_s = get_int("REMOTE_CB_COOLDOWN_S", 3600)
# Telemetry collection (requires sensor board on repeater)
self.telemetry_enabled = get_bool("TELEMETRY_ENABLED", False)
# Separate settings allow tuning if telemetry proves problematic
# Defaults match status settings - tune down if needed
self.telemetry_timeout_s = get_int("TELEMETRY_TIMEOUT_S", 10)
self.telemetry_retry_attempts = get_int("TELEMETRY_RETRY_ATTEMPTS", 2)
self.telemetry_retry_backoff_s = get_int("TELEMETRY_RETRY_BACKOFF_S", 4)
# Paths (defaults are Docker container paths; native installs override via config)
self.state_dir = get_path("STATE_DIR", "/data/state")
self.out_dir = get_path("OUT_DIR", "/out")
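The telemetry settings follow the module's environment-override pattern. A hypothetical mirror of the `get_bool`/`get_int` helpers (the real implementations live in `env.py` and may differ in detail):

```python
import os

def get_bool(name, default):
    # Hypothetical mirror of the config helper: common truthy strings enable.
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in ("1", "true", "yes", "on")

def get_int(name, default):
    # Fall back to the default on missing or unparseable values.
    raw = os.environ.get(name)
    try:
        return int(raw) if raw is not None else default
    except ValueError:
        return default

os.environ["TELEMETRY_ENABLED"] = "true"
print(get_bool("TELEMETRY_ENABLED", False))
print(get_int("TELEMETRY_TIMEOUT_S", 10))  # unset, so the default applies
```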


@@ -588,8 +588,8 @@ def build_page_context(
last_updated = None
last_updated_iso = None
if ts:
dt = datetime.fromtimestamp(ts)
last_updated = dt.strftime("%b %d, %Y at %H:%M UTC")
dt = datetime.fromtimestamp(ts).astimezone()
last_updated = dt.strftime("%b %d, %Y at %H:%M %Z")
last_updated_iso = dt.isoformat()
# Build metrics for sidebar
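The `astimezone()` change is what makes `%Z` meaningful: `datetime.fromtimestamp(ts)` alone yields a naive datetime, so `%Z` renders empty. Calling `.astimezone()` on it attaches the host's local zone. A small sketch (the zone label varies by host):

```python
from datetime import datetime

ts = 1_700_000_000
# Naive local datetime -> aware datetime in the host's local zone.
dt = datetime.fromtimestamp(ts).astimezone()
print(dt.strftime("%b %d, %Y at %H:%M %Z"))
print(dt.isoformat())  # now carries a UTC offset suffix
```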
@@ -845,24 +845,24 @@ def build_monthly_table_data(
airtime = m.get("airtime", MetricStats())
# Convert mV to V for display
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": False,
"cells": [
{"value": f"{daily.date.day:02d}", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_time(bat_v_min, bat.min_time), "class": "muted"},
{"value": _fmt_val_time(bat_v_max, bat.max_time), "class": "muted"},
{"value": f"{rssi.mean:.0f}" if rssi.mean else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean else "-", "class": None},
{"value": f"{noise.mean:.0f}" if noise.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{airtime.total:,}" if airtime.total else "-", "class": None},
{"value": f"{rssi.mean:.0f}" if rssi.mean is not None else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean is not None else "-", "class": None},
{"value": f"{noise.mean:.0f}" if noise.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
{"value": f"{airtime.total:,}" if airtime.total is not None else "-", "class": None},
],
})
@@ -877,24 +877,24 @@ def build_monthly_table_data(
tx = s.get("nb_sent", MetricStats())
airtime = s.get("airtime", MetricStats())
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": True,
"cells": [
{"value": "", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_day(bat_v_min, bat.min_time), "class": "muted"},
{"value": _fmt_val_day(bat_v_max, bat.max_time), "class": "muted"},
{"value": f"{rssi.mean:.0f}" if rssi.mean else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean else "-", "class": None},
{"value": f"{noise.mean:.0f}" if noise.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{airtime.total:,}" if airtime.total else "-", "class": None},
{"value": f"{rssi.mean:.0f}" if rssi.mean is not None else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean is not None else "-", "class": None},
{"value": f"{noise.mean:.0f}" if noise.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
{"value": f"{airtime.total:,}" if airtime.total is not None else "-", "class": None},
],
})
@@ -928,21 +928,21 @@ def build_monthly_table_data(
tx = m.get("sent", MetricStats())
# Convert mV to V for display
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": False,
"cells": [
{"value": f"{daily.date.day:02d}", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_time(bat_v_min, bat.min_time), "class": "muted"},
{"value": _fmt_val_time(bat_v_max, bat.max_time), "class": "muted"},
{"value": f"{contacts.mean:.0f}" if contacts.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{contacts.mean:.0f}" if contacts.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -954,21 +954,21 @@ def build_monthly_table_data(
rx = s.get("recv", MetricStats())
tx = s.get("sent", MetricStats())
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": True,
"cells": [
{"value": "", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_day(bat_v_min, bat.min_time), "class": "muted"},
{"value": _fmt_val_day(bat_v_max, bat.max_time), "class": "muted"},
{"value": f"{contacts.mean:.0f}" if contacts.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{contacts.mean:.0f}" if contacts.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -1033,23 +1033,23 @@ def build_yearly_table_data(
tx = s.get("nb_sent", MetricStats())
# Convert mV to V
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": False,
"cells": [
{"value": str(agg.year), "class": None},
{"value": f"{monthly.month:02d}", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_day(bat_v_max, bat.max_time), "class": "muted"},
{"value": _fmt_val_day(bat_v_min, bat.min_time), "class": "muted"},
{"value": f"{rssi.mean:.0f}" if rssi.mean else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{rssi.mean:.0f}" if rssi.mean is not None else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -1062,23 +1062,23 @@ def build_yearly_table_data(
rx = s.get("nb_recv", MetricStats())
tx = s.get("nb_sent", MetricStats())
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": True,
"cells": [
{"value": "", "class": None},
{"value": "", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_month(bat_v_max, bat.max_time), "class": "muted"},
{"value": _fmt_val_month(bat_v_min, bat.min_time), "class": "muted"},
{"value": f"{rssi.mean:.0f}" if rssi.mean else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{rssi.mean:.0f}" if rssi.mean is not None else "-", "class": None},
{"value": f"{snr.mean:.1f}" if snr.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -1113,22 +1113,22 @@ def build_yearly_table_data(
tx = s.get("sent", MetricStats())
# Convert mV to V
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": False,
"cells": [
{"value": str(agg.year), "class": None},
{"value": f"{monthly.month:02d}", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_day(bat_v_max, bat.max_time), "class": "muted"},
{"value": _fmt_val_day(bat_v_min, bat.min_time), "class": "muted"},
{"value": f"{contacts.mean:.0f}" if contacts.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{contacts.mean:.0f}" if contacts.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})
@@ -1140,22 +1140,22 @@ def build_yearly_table_data(
rx = s.get("recv", MetricStats())
tx = s.get("sent", MetricStats())
bat_v_mean = bat.mean / 1000.0 if bat.mean else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value else None
bat_v_mean = bat.mean / 1000.0 if bat.mean is not None else None
bat_v_min = bat.min_value / 1000.0 if bat.min_value is not None else None
bat_v_max = bat.max_value / 1000.0 if bat.max_value is not None else None
rows.append({
"is_summary": True,
"cells": [
{"value": "", "class": None},
{"value": "", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean else "-", "class": None},
{"value": f"{bat_v_mean:.2f}" if bat_v_mean is not None else "-", "class": None},
{"value": f"{bat_pct.mean:.0f}" if bat_pct.mean is not None else "-", "class": None},
{"value": _fmt_val_month(bat_v_max, bat.max_time), "class": "muted"},
{"value": _fmt_val_month(bat_v_min, bat.min_time), "class": "muted"},
{"value": f"{contacts.mean:.0f}" if contacts.mean else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total else "-", "class": None},
{"value": f"{contacts.mean:.0f}" if contacts.mean is not None else "-", "class": None},
{"value": f"{rx.total:,}" if rx.total is not None else "-", "class": "highlight"},
{"value": f"{tx.total:,}" if tx.total is not None else "-", "class": None},
],
})


@@ -1,7 +1,10 @@
"""MeshCore client wrapper with safe command execution and contact lookup."""
import asyncio
from typing import Any, Optional, Callable, Coroutine
import fcntl
from contextlib import asynccontextmanager
from pathlib import Path
from typing import Any, AsyncIterator, Callable, Coroutine, Optional
from .env import get_config
from . import log
@@ -100,6 +103,92 @@ async def connect_from_env() -> Optional[Any]:
return None
async def _acquire_lock_async(
lock_file,
timeout: float = 60.0,
poll_interval: float = 0.1,
) -> None:
"""Acquire exclusive file lock without blocking the event loop.
Uses non-blocking LOCK_NB with async polling to avoid freezing the event loop.
Args:
lock_file: Open file handle to lock
timeout: Maximum seconds to wait for lock
poll_interval: Seconds between lock attempts
Raises:
TimeoutError: If lock cannot be acquired within timeout
"""
loop = asyncio.get_running_loop()
deadline = loop.time() + timeout
while True:
try:
fcntl.flock(lock_file.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)
return
except BlockingIOError:
if loop.time() >= deadline:
raise TimeoutError(
f"Could not acquire serial lock within {timeout}s. "
"Another process may be using the serial port."
)
await asyncio.sleep(poll_interval)
@asynccontextmanager
async def connect_with_lock(
lock_timeout: float = 60.0,
) -> AsyncIterator[Optional[Any]]:
"""Connect to MeshCore with serial port locking to prevent concurrent access.
For serial transport: Acquires exclusive file lock before connecting.
For TCP/BLE: No locking needed (protocol handles multiple connections).
Args:
lock_timeout: Maximum seconds to wait for serial lock
Yields:
MeshCore client instance, or None if connection failed
"""
cfg = get_config()
lock_file = None
mc = None
needs_lock = cfg.mesh_transport.lower() == "serial"
try:
if needs_lock:
lock_path: Path = cfg.state_dir / "serial.lock"
lock_path.parent.mkdir(parents=True, exist_ok=True)
# Use 'a' mode: doesn't truncate, creates if missing
lock_file = open(lock_path, "a")
try:
await _acquire_lock_async(lock_file, timeout=lock_timeout)
log.debug(f"Acquired serial lock: {lock_path}")
except Exception:
# If lock acquisition fails, close file before re-raising
lock_file.close()
lock_file = None
raise
mc = await connect_from_env()
yield mc
finally:
# Disconnect first (while we still hold the lock)
if mc is not None and hasattr(mc, "disconnect"):
try:
await mc.disconnect()
except Exception as e:
log.debug(f"Error during disconnect (ignored): {e}")
# Release lock by closing the file (close() auto-releases flock)
if lock_file is not None:
lock_file.close()
log.debug("Released serial lock")
async def run_command(
mc: Any,
cmd_coro: Coroutine,


@@ -17,17 +17,12 @@ import calendar
import json
from dataclasses import dataclass, field
from datetime import date, datetime, timedelta
from pathlib import Path
from typing import Any, Optional
from .db import get_connection, get_metrics_for_period, VALID_ROLES
from .env import get_config
from .metrics import (
is_counter_metric,
get_chart_metrics,
transform_value,
)
from . import log
def _validate_role(role: str) -> str:
@@ -59,6 +54,32 @@ def get_metrics_for_role(role: str) -> list[str]:
raise ValueError(f"Unknown role: {role}")
REPORT_UNITS_RAW = {
"battery_mv": "mV",
"bat": "mV",
"bat_pct": "%",
"uptime": "s",
"uptime_secs": "s",
"last_rssi": "dBm",
"last_snr": "dB",
"noise_floor": "dBm",
"tx_queue_len": "count",
"contacts": "count",
"recv": "packets",
"sent": "packets",
"nb_recv": "packets",
"nb_sent": "packets",
"airtime": "s",
"rx_airtime": "s",
"flood_dups": "packets",
"direct_dups": "packets",
"sent_flood": "packets",
"recv_flood": "packets",
"sent_direct": "packets",
"recv_direct": "packets",
}
@dataclass
class MetricStats:
"""Statistics for a single metric over a period.
@@ -1116,10 +1137,14 @@ def format_yearly_txt(
return format_yearly_txt_companion(agg, node_name, location)
def _metric_stats_to_dict(stats: MetricStats) -> dict[str, Any]:
def _metric_stats_to_dict(stats: MetricStats, metric: str) -> dict[str, Any]:
"""Convert MetricStats to JSON-serializable dict."""
result: dict[str, Any] = {"count": stats.count}
unit = REPORT_UNITS_RAW.get(metric)
if unit:
result["unit"] = unit
if stats.mean is not None:
result["mean"] = round(stats.mean, 4)
if stats.min_value is not None:
@@ -1144,7 +1169,7 @@ def _daily_to_dict(daily: DailyAggregate) -> dict[str, Any]:
"date": daily.date.isoformat(),
"snapshot_count": daily.snapshot_count,
"metrics": {
ds: _metric_stats_to_dict(stats)
ds: _metric_stats_to_dict(stats, ds)
for ds, stats in daily.metrics.items()
if stats.has_data
},
@@ -1167,7 +1192,7 @@ def monthly_to_json(agg: MonthlyAggregate) -> dict[str, Any]:
"role": agg.role,
"days_with_data": len(agg.daily),
"summary": {
ds: _metric_stats_to_dict(stats)
ds: _metric_stats_to_dict(stats, ds)
for ds, stats in agg.summary.items()
if stats.has_data
},
@@ -1190,7 +1215,7 @@ def yearly_to_json(agg: YearlyAggregate) -> dict[str, Any]:
"role": agg.role,
"months_with_data": len(agg.monthly),
"summary": {
ds: _metric_stats_to_dict(stats)
ds: _metric_stats_to_dict(stats, ds)
for ds, stats in agg.summary.items()
if stats.has_data
},
@@ -1200,7 +1225,7 @@ def yearly_to_json(agg: YearlyAggregate) -> dict[str, Any]:
"month": m.month,
"days_with_data": len(m.daily),
"summary": {
ds: _metric_stats_to_dict(stats)
ds: _metric_stats_to_dict(stats, ds)
for ds, stats in m.summary.items()
if stats.has_data
},
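Threading the metric name into `_metric_stats_to_dict` lets the JSON carry raw values with explicit units instead of pre-converted display values. A standalone sketch of the resulting shape, using an excerpt of `REPORT_UNITS_RAW`:

```python
UNITS = {"bat": "mV", "last_snr": "dB", "nb_recv": "packets"}  # excerpt

def stats_to_dict(mean, count, metric):
    # Mirrors _metric_stats_to_dict's shape: raw value plus explicit unit,
    # unit omitted when the metric is unknown.
    result = {"count": count}
    unit = UNITS.get(metric)
    if unit:
        result["unit"] = unit
    if mean is not None:
        result["mean"] = round(mean, 4)
    return result

print(stats_to_dict(4123.4567, 288, "bat"))
```

Consumers then convert for display (e.g. mV to V) themselves, so the stored JSON stays lossless.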

src/meshmon/telemetry.py (new file, 102 lines)

@@ -0,0 +1,102 @@
"""Telemetry data extraction from Cayenne LPP format."""
from typing import Any
from . import log
__all__ = ["extract_lpp_from_payload", "extract_telemetry_metrics"]
def extract_lpp_from_payload(payload: Any) -> list | None:
"""Extract LPP data list from telemetry payload.
Handles both formats returned by the MeshCore API:
- Dict format: {'pubkey_pre': '...', 'lpp': [...]}
- Direct list format: [...]
Args:
payload: Raw telemetry payload from get_self_telemetry() or req_telemetry_sync()
Returns:
The LPP data list, or None if not extractable.
"""
if payload is None:
return None
if isinstance(payload, dict):
lpp = payload.get("lpp")
if lpp is None:
log.debug("No 'lpp' key in telemetry payload dict")
return None
if not isinstance(lpp, list):
log.debug(f"Unexpected LPP data type in payload: {type(lpp).__name__}")
return None
return lpp
if isinstance(payload, list):
return payload
log.debug(f"Unexpected telemetry payload type: {type(payload).__name__}")
return None
def extract_telemetry_metrics(lpp_data: Any) -> dict[str, float]:
"""Extract numeric telemetry values from Cayenne LPP response.
Expected format:
[
{"type": "temperature", "channel": 0, "value": 23.5},
{"type": "gps", "channel": 1, "value": {"latitude": 51.5, "longitude": -0.1, "altitude": 10}}
]
Keys are formatted as:
- telemetry.{type}.{channel} for scalar values
- telemetry.{type}.{channel}.{subkey} for compound values (e.g., GPS)
Returns:
Dict mapping metric keys to float values. Invalid readings are skipped.
"""
if not isinstance(lpp_data, list):
log.warn(f"Expected list for LPP data, got {type(lpp_data).__name__}")
return {}
metrics: dict[str, float] = {}
for i, reading in enumerate(lpp_data):
if not isinstance(reading, dict):
log.debug(f"Skipping non-dict LPP reading at index {i}")
continue
sensor_type = reading.get("type")
if not isinstance(sensor_type, str) or not sensor_type.strip():
log.debug(f"Skipping reading with invalid type at index {i}")
continue
# Normalize sensor type for use as metric key component
sensor_type = sensor_type.strip().lower().replace(" ", "_")
channel = reading.get("channel", 0)
if not isinstance(channel, int):
channel = 0
value = reading.get("value")
base_key = f"telemetry.{sensor_type}.{channel}"
# Note: Check bool before int because bool is a subclass of int in Python.
# Some sensors may report digital on/off values as booleans.
if isinstance(value, bool):
metrics[base_key] = float(value)
elif isinstance(value, (int, float)):
metrics[base_key] = float(value)
elif isinstance(value, dict):
for subkey, subval in value.items():
if not isinstance(subkey, str):
continue
subkey_clean = subkey.strip().lower().replace(" ", "_")
if not subkey_clean:
continue
if isinstance(subval, bool):
metrics[f"{base_key}.{subkey_clean}"] = float(subval)
elif isinstance(subval, (int, float)):
metrics[f"{base_key}.{subkey_clean}"] = float(subval)
return metrics
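Given the module above, an LPP reading list flattens into dotted metric keys. This is a standalone re-statement of the key scheme (not an import of `meshmon.telemetry`, and without its bool/validation handling):

```python
def lpp_to_metrics(lpp):
    # telemetry.{type}.{channel} for scalar values,
    # telemetry.{type}.{channel}.{subkey} for compound values (e.g. GPS).
    out = {}
    for reading in lpp:
        sensor = reading["type"].strip().lower().replace(" ", "_")
        channel = reading.get("channel", 0)
        value = reading["value"]
        if isinstance(value, dict):
            for subkey, subval in value.items():
                out[f"telemetry.{sensor}.{channel}.{subkey}"] = float(subval)
        elif isinstance(value, (int, float)):
            out[f"telemetry.{sensor}.{channel}"] = float(value)
    return out

sample = [
    {"type": "temperature", "channel": 0, "value": 23.5},
    {"type": "gps", "channel": 1,
     "value": {"latitude": 51.5, "longitude": -0.1}},
]
print(lpp_to_metrics(sample))
```

The flat keys slot directly into the EAV metrics table used for status snapshots, so telemetry needs no new schema.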


@@ -1,142 +1,331 @@
/**
* Chart tooltip enhancement for MeshCore Stats
* Chart Tooltip Enhancement for MeshCore Stats
*
* Progressive enhancement: charts work fully without JS,
* but this adds interactive tooltips on hover.
* Progressive enhancement: charts display fully without JavaScript.
* This module adds interactive tooltips showing datetime and value on hover,
* with an indicator dot that follows the data line.
*
* Data sources:
* - Data points: path.dataset.points or svg.dataset.points (JSON array of {ts, v})
* - Time range: svg.dataset.xStart, svg.dataset.xEnd (Unix timestamps)
* - Value range: svg.dataset.yMin, svg.dataset.yMax
* - Plot bounds: Derived from clipPath rect or line path bounding box
*/
(function() {
(function () {
'use strict';
// Create tooltip element
const tooltip = document.createElement('div');
tooltip.className = 'chart-tooltip';
tooltip.innerHTML = '<div class="tooltip-time"></div><div class="tooltip-value"></div>';
document.body.appendChild(tooltip);
// ============================================================================
// Configuration
// ============================================================================
const tooltipTime = tooltip.querySelector('.tooltip-time');
const tooltipValue = tooltip.querySelector('.tooltip-value');
// Track the current indicator element
let currentIndicator = null;
let currentSvg = null;
// Metric display labels and units (using firmware field names)
const metricLabels = {
// Companion metrics
'battery_mv': { label: 'Voltage', unit: 'V', decimals: 2 },
'uptime_secs': { label: 'Uptime', unit: 'days', decimals: 2 },
'contacts': { label: 'Contacts', unit: '', decimals: 0 },
'recv': { label: 'Received', unit: '/min', decimals: 1 },
'sent': { label: 'Sent', unit: '/min', decimals: 1 },
// Repeater metrics
'bat': { label: 'Voltage', unit: 'V', decimals: 2 },
'bat_pct': { label: 'Charge', unit: '%', decimals: 0 },
'uptime': { label: 'Uptime', unit: 'days', decimals: 2 },
'last_rssi': { label: 'RSSI', unit: 'dBm', decimals: 0 },
'last_snr': { label: 'SNR', unit: 'dB', decimals: 1 },
'noise_floor': { label: 'Noise', unit: 'dBm', decimals: 0 },
'tx_queue_len': { label: 'Queue', unit: '', decimals: 0 },
'nb_recv': { label: 'Received', unit: '/min', decimals: 1 },
'nb_sent': { label: 'Sent', unit: '/min', decimals: 1 },
'airtime': { label: 'TX Air', unit: 's/min', decimals: 2 },
'rx_airtime': { label: 'RX Air', unit: 's/min', decimals: 2 },
'flood_dups': { label: 'Dropped', unit: '/min', decimals: 1 },
'direct_dups': { label: 'Dropped', unit: '/min', decimals: 1 },
'sent_flood': { label: 'Sent', unit: '/min', decimals: 1 },
'recv_flood': { label: 'Received', unit: '/min', decimals: 1 },
'sent_direct': { label: 'Sent', unit: '/min', decimals: 1 },
'recv_direct': { label: 'Received', unit: '/min', decimals: 1 },
var CONFIG = {
tooltipOffset: 15,
viewportPadding: 10,
indicatorRadius: 5,
indicatorStrokeWidth: 2,
colors: {
light: { fill: '#b45309', stroke: '#ffffff' },
dark: { fill: '#f59e0b', stroke: '#0f1114' }
}
};
/**
* Format a timestamp as a readable date/time string
* Metric display configuration keyed by firmware field name.
* Each entry defines how to format values for that metric.
*/
function formatTime(ts, period) {
const date = new Date(ts * 1000);
const options = {
var METRIC_CONFIG = {
// Companion metrics
battery_mv: { label: 'Voltage', unit: 'V', decimals: 2 },
uptime_secs: { label: 'Uptime', unit: 'days', decimals: 2 },
contacts: { label: 'Contacts', unit: '', decimals: 0 },
recv: { label: 'Received', unit: '/min', decimals: 1 },
sent: { label: 'Sent', unit: '/min', decimals: 1 },
// Repeater metrics
bat: { label: 'Voltage', unit: 'V', decimals: 2 },
bat_pct: { label: 'Charge', unit: '%', decimals: 0 },
uptime: { label: 'Uptime', unit: 'days', decimals: 2 },
last_rssi: { label: 'RSSI', unit: 'dBm', decimals: 0 },
last_snr: { label: 'SNR', unit: 'dB', decimals: 1 },
noise_floor: { label: 'Noise', unit: 'dBm', decimals: 0 },
tx_queue_len: { label: 'Queue', unit: '', decimals: 0 },
nb_recv: { label: 'Received', unit: '/min', decimals: 1 },
nb_sent: { label: 'Sent', unit: '/min', decimals: 1 },
airtime: { label: 'TX Air', unit: 's/min', decimals: 2 },
rx_airtime: { label: 'RX Air', unit: 's/min', decimals: 2 },
flood_dups: { label: 'Dropped', unit: '/min', decimals: 1 },
direct_dups: { label: 'Dropped', unit: '/min', decimals: 1 },
sent_flood: { label: 'Sent', unit: '/min', decimals: 1 },
recv_flood: { label: 'Received', unit: '/min', decimals: 1 },
sent_direct: { label: 'Sent', unit: '/min', decimals: 1 },
recv_direct: { label: 'Received', unit: '/min', decimals: 1 }
};
// ============================================================================
// Formatting Utilities
// ============================================================================
/**
* Format a Unix timestamp as a localized date/time string.
* Uses browser language preference for locale (determines 12/24 hour format).
* Includes year only for year-period charts.
*/
function formatTimestamp(timestamp, period) {
var date = new Date(timestamp * 1000);
var options = {
month: 'short',
day: 'numeric',
hour: 'numeric',
minute: '2-digit',
timeZoneName: 'short'
};
// For year view, include year
if (period === 'year') {
options.year = 'numeric';
}
// Use browser's language preference (navigator.language), not system locale
// Empty array [] or undefined would use OS regional settings instead
return date.toLocaleString(navigator.language, options);
}
/**
* Format a numeric value with the appropriate decimals and unit for a metric.
*/
function formatMetricValue(value, metric) {
var config = METRIC_CONFIG[metric] || { label: metric, unit: '', decimals: 2 };
var formatted = value.toFixed(config.decimals);
return config.unit ? formatted + ' ' + config.unit : formatted;
}
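// Illustrative example (hypothetical input): formatMetricValue(3.867, 'bat')
// yields '3.87 V'; metrics missing from METRIC_CONFIG fall back to the
// metric name's raw value with two decimals and no unit.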
// ============================================================================
// Data Point Utilities
// ============================================================================
/**
* Find the data point closest to the target timestamp.
* Returns the point object or null if no points available.
*/
function findClosestDataPoint(dataPoints, targetTimestamp) {
if (!dataPoints || dataPoints.length === 0) {
return null;
}
var closest = dataPoints[0];
var minDiff = Math.abs(closest.ts - targetTimestamp);
for (var i = 1; i < dataPoints.length; i++) {
var diff = Math.abs(dataPoints[i].ts - targetTimestamp);
if (diff < minDiff) {
minDiff = diff;
closest = dataPoints[i];
}
}
return closest;
}
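// Illustrative example: for points [{ts: 100, v: 1}, {ts: 200, v: 2}],
// findClosestDataPoint(points, 160) returns {ts: 200, v: 2} (diff 40 vs 60).
// A linear scan is adequate here since each chart holds one point per bucket.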
/**
* Parse and cache data points on an SVG element.
* Handles HTML entity encoding from server-side JSON embedding.
*/
function getDataPoints(svg, rawJson) {
if (svg._dataPoints) {
return svg._dataPoints;
}
try {
var json = rawJson.replace(/&quot;/g, '"');
svg._dataPoints = JSON.parse(json);
return svg._dataPoints;
} catch (error) {
console.warn('Chart tooltip: failed to parse data points', error);
return null;
}
}
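// Illustrative example: a server-embedded attribute value such as
// '[{&quot;ts&quot;:100,&quot;v&quot;:1}]' decodes to [{"ts":100,"v":1}]
// and parses to an array of point objects; repeat calls for the same SVG
// return the cached svg._dataPoints without re-parsing.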
// ============================================================================
// SVG Coordinate Utilities
// ============================================================================
/**
* Get and cache the plot area bounds for an SVG chart.
* Prefers the clip path rect (defines full plot area) over line path bbox
* (which only covers the actual data range).
*/
function getPlotAreaBounds(svg, fallbackPath) {
if (svg._plotArea) {
return svg._plotArea;
}
var clipRect = svg.querySelector('clipPath rect');
if (clipRect) {
svg._plotArea = {
x: parseFloat(clipRect.getAttribute('x')),
y: parseFloat(clipRect.getAttribute('y')),
width: parseFloat(clipRect.getAttribute('width')),
height: parseFloat(clipRect.getAttribute('height'))
};
} else if (fallbackPath) {
svg._plotArea = fallbackPath.getBBox();
}
return svg._plotArea;
}
/**
* Find the chart line path element within an SVG.
* Tries multiple selectors for compatibility with different SVG structures.
*/
function findLinePath(svg) {
return (
svg.querySelector('#chart-line path') ||
svg.querySelector('path#chart-line') ||
svg.querySelector('[gid="chart-line"] path') ||
svg.querySelector('path[gid="chart-line"]') ||
svg.querySelector('path[data-points]')
);
}
/**
* Convert a screen X coordinate to SVG coordinate space.
*/
function screenToSvgX(svg, clientX) {
var svgRect = svg.getBoundingClientRect();
var viewBox = svg.viewBox.baseVal;
var scale = viewBox.width / svgRect.width;
return (clientX - svgRect.left) * scale + viewBox.x;
}
/**
* Map a timestamp to an X coordinate within the plot area.
*/
function timestampToX(timestamp, xStart, xEnd, plotArea) {
var relativePosition = (timestamp - xStart) / (xEnd - xStart);
return plotArea.x + relativePosition * plotArea.width;
}
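// Illustrative example: timestampToX(150, 100, 200, { x: 40, width: 300 })
// maps the midpoint timestamp to 40 + 0.5 * 300 = 190.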
/**
* Map a value to a Y coordinate within the plot area.
* SVG Y-axis is inverted (0 at top), so higher values map to lower Y.
*/
function valueToY(value, yMin, yMax, plotArea) {
var ySpan = yMax - yMin || 1;
var relativePosition = (value - yMin) / ySpan;
return plotArea.y + plotArea.height - relativePosition * plotArea.height;
}
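// Illustrative example: with yMin = 0, yMax = 10 and a plot area of
// { y: 20, height: 100 }, the maximum value maps to y = 20 (top edge)
// and the minimum to y = 120 (bottom edge), reflecting the inverted axis.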
// ============================================================================
// Tooltip Element
// ============================================================================
var tooltip = null;
var tooltipTimeEl = null;
var tooltipValueEl = null;
/**
* Create the tooltip DOM element (called once on init).
*/
function createTooltipElement() {
tooltip = document.createElement('div');
tooltip.className = 'chart-tooltip';
tooltip.innerHTML =
'<div class="tooltip-time"></div>' + '<div class="tooltip-value"></div>';
document.body.appendChild(tooltip);
tooltipTimeEl = tooltip.querySelector('.tooltip-time');
tooltipValueEl = tooltip.querySelector('.tooltip-value');
}
/**
* Update tooltip content and position it near the cursor.
*/
function showTooltip(event, timeText, valueText) {
tooltipTimeEl.textContent = timeText;
tooltipValueEl.textContent = valueText;
var left = event.pageX + CONFIG.tooltipOffset;
var top = event.pageY + CONFIG.tooltipOffset;
// Keep tooltip within viewport
var rect = tooltip.getBoundingClientRect();
if (left + rect.width > window.innerWidth - CONFIG.viewportPadding) {
left = event.pageX - rect.width - CONFIG.tooltipOffset;
}
if (top + rect.height > window.innerHeight - CONFIG.viewportPadding) {
top = event.pageY - rect.height - CONFIG.tooltipOffset;
}
tooltip.style.left = left + 'px';
tooltip.style.top = top + 'px';
tooltip.classList.add('visible');
}
/**
* Hide the tooltip.
*/
function hideTooltip() {
tooltip.classList.remove('visible');
}
// ============================================================================
// Indicator Dot
// ============================================================================
var currentIndicator = null;
var currentIndicatorSvg = null;
/**
* Get or create the indicator circle for an SVG chart.
* Reuses existing indicator if still on the same chart.
*/
function getIndicator(svg) {
if (currentIndicatorSvg === svg && currentIndicator) {
return currentIndicator;
}
// Remove indicator from previous chart
if (currentIndicator && currentIndicator.parentNode) {
currentIndicator.parentNode.removeChild(currentIndicator);
}
// Create new indicator circle
var indicator = document.createElementNS(
'http://www.w3.org/2000/svg',
'circle'
);
indicator.setAttribute('r', CONFIG.indicatorRadius);
indicator.setAttribute('class', 'chart-indicator');
indicator.setAttribute('stroke-width', CONFIG.indicatorStrokeWidth);
indicator.style.pointerEvents = 'none';
// Apply theme-appropriate colors
var theme = svg.dataset.theme === 'dark' ? 'dark' : 'light';
indicator.setAttribute('fill', CONFIG.colors[theme].fill);
indicator.setAttribute('stroke', CONFIG.colors[theme].stroke);
svg.appendChild(indicator);
currentIndicator = indicator;
currentIndicatorSvg = svg;
return indicator;
}
/**
* Position the indicator at a specific data point.
*/
function positionIndicator(svg, dataPoint, xStart, xEnd, yMin, yMax, plotArea) {
var indicator = getIndicator(svg);
var x = timestampToX(dataPoint.ts, xStart, xEnd, plotArea);
var y = valueToY(dataPoint.v, yMin, yMax, plotArea);
indicator.setAttribute('cx', x);
indicator.setAttribute('cy', y);
indicator.style.display = '';
}
/**
* Hide the indicator dot.
*/
function hideIndicator() {
if (currentIndicator) {
currentIndicator.style.display = 'none';
}
}
// ============================================================================
// Event Handlers
// ============================================================================
/**
* Convert a touch event to a mouse-like event object.
*/
function touchToMouseEvent(touchEvent) {
var touch = touchEvent.touches[0];
return {
currentTarget: touchEvent.currentTarget,
clientX: touch.clientX,
clientY: touch.clientY,
pageX: touch.pageX,
pageY: touch.pageY
};
}
/**
* Handle pointer movement over a chart (mouse or touch).
* Finds the closest data point and updates tooltip and indicator.
*/
function handlePointerMove(event) {
var svg = event.currentTarget;
// Extract chart metadata
var metric = svg.dataset.metric;
var period = svg.dataset.period;
var xStart = parseInt(svg.dataset.xStart, 10);
var xEnd = parseInt(svg.dataset.xEnd, 10);
var yMin = parseFloat(svg.dataset.yMin);
var yMax = parseFloat(svg.dataset.yMax);
// Find the line path and data points source
var linePath = findLinePath(svg);
if (!linePath) {
return;
}
var rawPoints = linePath.dataset.points || svg.dataset.points;
if (!rawPoints) {
return;
}
// Parse data points (cached on svg element)
var dataPoints = getDataPoints(svg, rawPoints);
if (!dataPoints) {
return;
}
// Get plot area bounds (cached on svg element)
var plotArea = getPlotAreaBounds(svg, linePath);
if (!plotArea) {
return;
}
// Convert screen position to timestamp
var svgX = screenToSvgX(svg, event.clientX);
var relativeX = Math.max(0, Math.min(1, (svgX - plotArea.x) / plotArea.width));
var targetTimestamp = xStart + relativeX * (xEnd - xStart);
// Find and display closest data point
var closestPoint = findClosestDataPoint(dataPoints, targetTimestamp);
if (!closestPoint) {
return;
}
showTooltip(
event,
formatTimestamp(closestPoint.ts, period),
formatMetricValue(closestPoint.v, metric)
);
positionIndicator(svg, closestPoint, xStart, xEnd, yMin, yMax, plotArea);
}
/**
* Handle pointer leaving the chart area.
*/
function handlePointerLeave() {
hideTooltip();
hideIndicator();
}
/**
* Handle touch start event.
*/
function handleTouchStart(event) {
handlePointerMove(touchToMouseEvent(event));
}
/**
* Handle touch move event.
*/
function handleTouchMove(event) {
handlePointerMove(touchToMouseEvent(event));
}
// ============================================================================
// Initialization
// ============================================================================
/**
* Attach event listeners to all chart SVG elements.
*/
function initializeChartTooltips() {
createTooltipElement();
var chartSvgs = document.querySelectorAll('svg[data-metric][data-period]');
chartSvgs.forEach(function (svg) {
// Desktop mouse events
svg.addEventListener('mousemove', handlePointerMove);
svg.addEventListener('mouseleave', handlePointerLeave);
// Mobile touch events
svg.addEventListener('touchstart', handleTouchStart, { passive: true });
svg.addEventListener('touchmove', handleTouchMove, { passive: true });
svg.addEventListener('touchend', handlePointerLeave);
svg.addEventListener('touchcancel', handlePointerLeave);
// Visual affordance for interactivity
svg.style.cursor = 'crosshair';
// Allow vertical scrolling but prevent horizontal pan on mobile
svg.style.touchAction = 'pan-y';
});
}
// Run initialization when DOM is ready
if (document.readyState === 'loading') {
document.addEventListener('DOMContentLoaded', initializeChartTooltips);
} else {
initializeChartTooltips();
}
})();