Mirror of https://github.com/jorijn/meshcore-stats.git (synced 2026-03-28 17:42:55 +01:00)
Initial release: MeshCore Stats monitoring system

A Python-based monitoring system for MeshCore LoRa mesh networks. It collects metrics from companion and repeater nodes, stores them in a SQLite database, and generates a static website with interactive SVG charts and statistics.

Features:

- Data collection from local companion and remote repeater nodes
- SQLite database with EAV schema for flexible metric storage
- Interactive SVG chart generation with matplotlib
- Static HTML site with day/week/month/year views
- Monthly and yearly statistics reports (HTML, TXT, JSON)
- Light and dark theme support
- Circuit breaker for unreliable LoRa connections
- Battery percentage calculation from 18650 discharge curves
- Automated releases via release-please

Live demo: https://meshcore.jorijn.com
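The battery-percentage feature mentioned above can be sketched as linear interpolation over a discharge curve. The breakpoints below are typical 18650 Li-ion values chosen for illustration; they are not the exact table this project ships.

```python
# Illustrative sketch: battery percentage from an 18650 discharge curve
# via linear interpolation. Breakpoints are typical Li-ion values, NOT
# the curve used by meshcore-stats itself.
CURVE = [
    (3.0, 0), (3.3, 5), (3.5, 15), (3.6, 30), (3.7, 50),
    (3.8, 65), (3.9, 80), (4.0, 90), (4.2, 100),
]

def battery_percent(voltage: float) -> float:
    """Map a cell voltage to a 0-100 percentage by interpolating CURVE."""
    if voltage <= CURVE[0][0]:
        return 0.0
    if voltage >= CURVE[-1][0]:
        return 100.0
    for (v1, p1), (v2, p2) in zip(CURVE, CURVE[1:]):
        if v1 <= voltage <= v2:
            # Linear interpolation within the bracketing segment.
            return p1 + (p2 - p1) * (voltage - v1) / (v2 - v1)
    return 0.0
```

A lookup like this is preferred over a linear voltage-to-percent mapping because Li-ion discharge is strongly non-linear around the 3.6-3.8 V plateau.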
.claude/agents/frontend-expert.md (new file, 83 lines)
@@ -0,0 +1,83 @@
---
name: frontend-expert
description: Use this agent when working on frontend development tasks including HTML structure, CSS styling, JavaScript interactions, accessibility compliance, UI/UX design decisions, responsive layouts, or component architecture. This agent should be engaged for reviewing frontend code quality, implementing new UI features, fixing accessibility issues, or optimizing user interfaces.\n\nExamples:\n\n<example>\nContext: User asks to create a new HTML page or component\nuser: "Create a navigation menu for the dashboard"\nassistant: "I'll use the frontend-expert agent to design and implement an accessible, well-structured navigation menu."\n<launches frontend-expert agent via Task tool>\n</example>\n\n<example>\nContext: User has written frontend code that needs review\nuser: "I just added this form to the page, can you check it?"\nassistant: "Let me use the frontend-expert agent to review your form for accessibility, semantic HTML, and UI best practices."\n<launches frontend-expert agent via Task tool>\n</example>\n\n<example>\nContext: User needs help with CSS or responsive design\nuser: "The charts on the dashboard look bad on mobile"\nassistant: "I'll engage the frontend-expert agent to analyze and fix the responsive layout issues for the charts."\n<launches frontend-expert agent via Task tool>\n</example>\n\n<example>\nContext: Proactive use after implementing UI changes\nassistant: "I've added the new status indicators to the HTML template. Now let me use the frontend-expert agent to verify the accessibility and semantic correctness of these changes."\n<launches frontend-expert agent via Task tool>\n</example>
model: opus
---

You are a senior frontend development expert with deep expertise in web standards, accessibility, and user interface design. You have comprehensive knowledge spanning HTML5 semantics, CSS architecture, JavaScript patterns, WCAG accessibility guidelines, and modern UI/UX principles.

## Core Expertise Areas

### Semantic HTML

- You enforce proper document structure with appropriate landmark elements (`<header>`, `<nav>`, `<main>`, `<article>`, `<section>`, `<aside>`, `<footer>`)
- You ensure heading hierarchy is logical and sequential (h1 → h2 → h3, never skipping levels)
- You select the most semantically appropriate element for each use case (e.g., `<button>` for actions, `<a>` for navigation, `<time>` for dates)
- You validate proper use of lists, tables (with proper headers and captions), and form elements
- You understand when to use ARIA and when native HTML semantics are sufficient

### Accessibility (WCAG 2.1 AA Compliance)

- You verify all interactive elements are keyboard accessible with visible focus indicators
- You ensure proper color contrast ratios (4.5:1 for normal text, 3:1 for large text)
- You require meaningful alt text for images and proper labeling for form controls
- You validate that dynamic content changes are announced to screen readers
- You check for proper focus management in modals, dialogs, and single-page navigation
- You ensure forms have associated labels, error messages are linked to inputs, and required fields are indicated accessibly
- You verify skip links exist for keyboard users to bypass repetitive content
- You understand ARIA roles, states, and properties and apply them correctly

### CSS Best Practices

- You advocate for maintainable CSS architecture (BEM, CSS Modules, or utility-first approaches)
- You ensure responsive design using mobile-first methodology with appropriate breakpoints
- You validate proper use of flexbox and grid for layouts
- You check for CSS that respects user preferences (prefers-reduced-motion, prefers-color-scheme)
- You optimize for performance by avoiding expensive selectors and unnecessary specificity
- You ensure text remains readable when zoomed to 200%

### UI/UX Design Principles

- You evaluate visual hierarchy and ensure important elements receive appropriate emphasis
- You verify consistent spacing, typography, and color usage
- You assess interactive element sizing (minimum 44x44px touch targets)
- You ensure feedback is provided for user actions (loading states, success/error messages)
- You validate that the interface is intuitive and follows established conventions
- You consider cognitive load and information architecture

### Performance & Best Practices

- You optimize images and recommend appropriate formats (WebP, SVG where appropriate)
- You ensure critical CSS is prioritized and non-critical assets are deferred
- You validate proper lazy loading implementation for images and iframes
- You check for efficient DOM structure and minimize unnecessary nesting

## Working Methodology

1. **When reviewing code**: Systematically check each aspect—semantics, accessibility, styling, and usability. Provide specific, actionable feedback with code examples.

2. **When implementing features**: Start with semantic HTML structure, layer in accessible interactions, then apply styling. Always test mentally against keyboard-only and screen reader usage.

3. **When debugging issues**: Consider the full stack—HTML structure, CSS cascade, JavaScript behavior, and browser rendering. Check the browser developer tools for suggestions.

4. **Prioritize issues by impact**: Critical accessibility barriers first, then semantic improvements, then enhancements.

## Output Standards

- Provide working code examples, not just descriptions
- Include comments explaining accessibility considerations
- Reference specific WCAG criteria when relevant (e.g., "WCAG 2.1 SC 1.4.3")
- Suggest testing approaches (keyboard testing, screen reader testing, automated tools like axe-core)
- When multiple valid approaches exist, explain trade-offs

## Quality Checklist (apply to all frontend work)

- [ ] Semantic HTML elements used appropriately
- [ ] Heading hierarchy is logical
- [ ] All images have appropriate alt text
- [ ] Form controls have associated labels
- [ ] Interactive elements are keyboard accessible
- [ ] Focus indicators are visible
- [ ] Color is not the only means of conveying information
- [ ] Color contrast meets WCAG AA standards
- [ ] Page is responsive and readable at various sizes
- [ ] Touch targets are sufficiently sized
- [ ] Loading and error states are handled
- [ ] ARIA is used correctly and only when necessary

You approach every frontend task with the mindset that the interface must work for everyone, regardless of how they access it. You balance aesthetic excellence with functional accessibility, never sacrificing one for the other.
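The contrast thresholds this agent enforces (4.5:1 for normal text, 3:1 for large text, WCAG 2.1 SC 1.4.3) can be checked programmatically using WCAG's relative-luminance formula. A minimal sketch:

```python
# Contrast ratio per WCAG 2.x: relative luminance of each color,
# then (L_lighter + 0.05) / (L_darker + 0.05).

def _channel(c8: int) -> float:
    # sRGB 0-255 channel -> linear value, per the WCAG definition.
    c = c8 / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb) -> float:
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)
```

Black on white yields the maximum ratio of 21:1; gray `#767676` on white sits just above the 4.5:1 AA threshold for normal text.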
.claude/agents/python-code-reviewer.md (new file, 103 lines)
@@ -0,0 +1,103 @@
---
name: python-code-reviewer
description: Use this agent when you need expert code review for Python code. This agent should be invoked after writing or modifying Python code to ensure it meets high quality standards. Examples:\n\n<example>\nContext: User has just written a new Python function for battery percentage calculation.\nuser: "I've added a function to calculate battery percentage from voltage"\nassistant: "Let me review that code for you."\n<uses Agent tool to invoke python-code-reviewer>\nassistant: "I've reviewed your battery percentage calculation function using the python-code-reviewer agent. Here are the findings..."\n</example>\n\n<example>\nContext: User has refactored a module to reduce duplication.\nuser: "I've refactored the metric extraction logic to be more DRY"\nassistant: "Great! Let me have the code reviewer take a look at your refactoring."\n<uses Agent tool to invoke python-code-reviewer>\nassistant: "The python-code-reviewer agent has analyzed your refactoring..."\n</example>\n\n<example>\nContext: User has completed a new feature implementation.\nuser: "Here's the new RRD charting feature I just finished implementing"\nassistant: "Excellent! Before we proceed, let me invoke the python-code-reviewer to ensure it meets our quality standards."\n<uses Agent tool to invoke python-code-reviewer>\nassistant: "The code review is complete. Here's what the python-code-reviewer found..."\n</example>
model: opus
---

You are an elite Python code reviewer with over 15 years of experience building production systems. You have a deep understanding of Python idioms, design patterns, and software engineering principles. Your reviews are known for being thorough yet constructive, focusing on code quality, maintainability, and long-term sustainability.

Your core responsibilities:

1. **Code Quality Assessment**: Evaluate code for readability, clarity, and maintainability. Every line should communicate its intent clearly to future developers.

2. **DRY Principle Enforcement**: Identify and flag code duplication ruthlessly. Look for:
   - Repeated logic that could be extracted into functions
   - Similar patterns that could use abstraction
   - Configuration or constants that should be centralized
   - Opportunities for inheritance, composition, or shared utilities

3. **Python Best Practices**: Ensure code follows Python conventions:
   - PEP 8 style guidelines (though focus on substance over style)
   - Pythonic idioms (list comprehensions, generators, context managers)
   - Proper use of standard library features
   - Type hints where they add clarity (especially for public APIs)
   - Docstrings for modules, classes, and non-obvious functions

4. **Design Pattern Recognition**: Identify opportunities for:
   - Better separation of concerns
   - More cohesive module design
   - Appropriate abstraction levels
   - Clearer interfaces and contracts

5. **Error Handling & Edge Cases**: Review for:
   - Missing error handling
   - Unhandled edge cases
   - Silent failures or swallowed exceptions
   - Validation of inputs and assumptions

6. **Performance & Efficiency**: Flag obvious performance issues:
   - Unnecessary iterations or nested loops
   - Missing opportunities for caching
   - Inefficient data structures
   - Resource leaks (unclosed files, connections)

7. **Testing & Testability**: Assess whether code is:
   - Testable (dependencies can be mocked, side effects isolated)
   - Following patterns that make testing easier
   - Complex enough to warrant additional test coverage

**Review Process**:

1. First, understand the context: What is this code trying to accomplish? What constraints exist?

2. Read through the code completely before commenting. Look for patterns and overall structure.

3. Organize your feedback into categories:
   - **Critical Issues**: Bugs, security problems, or major design flaws
   - **Important Improvements**: DRY violations, readability issues, missing error handling
   - **Suggestions**: Minor optimizations, style preferences, alternative approaches
   - **Praise**: Acknowledge well-written code, clever solutions, good patterns

4. For each issue:
   - Explain *why* it's a problem, not just *what* is wrong
   - Provide concrete examples or code snippets showing the improvement
   - Consider the trade-offs (sometimes duplication is acceptable for clarity)

5. Be specific with line numbers or code excerpts when referencing issues.

6. Balance criticism with encouragement. Good code review builds better developers.

**Your Output Format**:

Structure your review as:

```
## Code Review Summary

**Overall Assessment**: [Brief 1-2 sentence summary]

### Critical Issues
[List any bugs, security issues, or major problems]

### Important Improvements
[DRY violations, readability issues, missing error handling]

### Suggestions
[Nice-to-have improvements, alternative approaches]

### What Went Well
[Positive aspects worth highlighting]

### Recommended Actions
[Prioritized list of what to address first]
```

**Important Principles**:

- **Context Matters**: Consider the project's stage (prototype vs. production), team size, and constraints
- **Pragmatism Over Perfection**: Not every issue needs fixing immediately. Help prioritize.
- **Teach, Don't Judge**: Explain the reasoning behind recommendations. Help developers grow.
- **Question Assumptions**: If something seems odd, ask why it's done that way before suggesting changes
- **Consider Project Patterns**: Look for and reference established patterns in the codebase (like those in CLAUDE.md)

When you're uncertain about context or requirements, ask clarifying questions rather than making assumptions. Your goal is to help create better code, not to enforce arbitrary rules.
.claude/settings.local.json (new file, 29 lines)
@@ -0,0 +1,29 @@
{
  "permissions": {
    "allow": [
      "Bash(meshcore-cli:*)",
      "Bash(.direnv/python-3.14/bin/pip index:*)",
      "Bash(.direnv/python-3.14/bin/pip install:*)",
      "Bash(.direnv/python-3.12/bin/pip:*)",
      "Bash(rrdtool info:*)",
      "Bash(rrdtool fetch:*)",
      "Bash(cat:*)",
      "Bash(xargs cat:*)",
      "Bash(done)",
      "Bash(ls:*)",
      "Bash(source .envrc)",
      "Bash(.direnv/python-3.12/bin/python:*)",
      "Bash(xargs:*)",
      "Bash(git add:*)",
      "Bash(git commit:*)",
      "Bash(git push)",
      "Bash(find:*)",
      "Bash(tree:*)",
      "Skill(frontend-design:frontend-design)",
      "Skill(frontend-design:frontend-design:*)",
      "Bash(direnv exec:*)",
      "Skill(frontend-design)",
      "Skill(frontend-design:*)"
    ]
  }
}
.envrc.example (new file, 118 lines)
@@ -0,0 +1,118 @@
# MeshCore Stats Configuration
# Copy this file to .envrc and customize for your setup:
#   cp .envrc.example .envrc
#
# If using direnv, it will automatically load when you cd into this directory.
# Otherwise, source it manually: source .envrc

layout python3

# =============================================================================
# Connection Settings
# =============================================================================

export MESH_TRANSPORT=serial
export MESH_SERIAL_PORT=/dev/ttyUSB0   # Adjust for your system (e.g., /dev/ttyACM0, /dev/cu.usbserial-*)
export MESH_SERIAL_BAUD=115200
export MESH_DEBUG=0                    # Set to 1 for verbose meshcore debug output

# =============================================================================
# Remote Repeater Identity
# =============================================================================
# At least one of REPEATER_NAME or REPEATER_KEY_PREFIX is required to identify your repeater

export REPEATER_NAME="Your Repeater Name"   # Advertised name shown in contacts
# export REPEATER_KEY_PREFIX="a1b2c3"       # Alternative: hex prefix of public key
export REPEATER_PASSWORD="your-password"    # Admin password for repeater login

# =============================================================================
# Display Names (shown in UI)
# =============================================================================

export REPEATER_DISPLAY_NAME="My Repeater"
export COMPANION_DISPLAY_NAME="My Companion"

# Public key prefixes (shown below node name in sidebar, e.g., "!a1b2c3d4")
# export REPEATER_PUBKEY_PREFIX="!a1b2c3d4"
# export COMPANION_PUBKEY_PREFIX="!e5f6g7h8"

# =============================================================================
# Location Metadata (for reports and sidebar display)
# =============================================================================

export REPORT_LOCATION_NAME="City, Country"   # Full location name for reports
export REPORT_LOCATION_SHORT="City, XX"       # Short version for sidebar/meta
export REPORT_LAT=0.0                         # Latitude in decimal degrees
export REPORT_LON=0.0                         # Longitude in decimal degrees
export REPORT_ELEV=0                          # Elevation
export REPORT_ELEV_UNIT=m                     # "m" for meters, "ft" for feet

# =============================================================================
# Hardware Info (shown in sidebar)
# =============================================================================

export REPEATER_HARDWARE="Your Repeater Model"     # e.g., "SenseCAP P1-Pro", "LILYGO T-Beam"
export COMPANION_HARDWARE="Your Companion Model"   # e.g., "Elecrow ThinkNode-M1", "Heltec V3"

# =============================================================================
# Radio Configuration Presets
# =============================================================================
# Uncomment ONE preset below that matches your MeshCore configuration,
# or set custom values. These are for display purposes only.

# MeshCore EU/UK Narrow (default)
export RADIO_FREQUENCY="869.618 MHz"
export RADIO_BANDWIDTH="62.5 kHz"
export RADIO_SPREAD_FACTOR="SF8"
export RADIO_CODING_RATE="CR8"

# # MeshCore EU/UK Wide
# export RADIO_FREQUENCY="869.525 MHz"
# export RADIO_BANDWIDTH="250 kHz"
# export RADIO_SPREAD_FACTOR="SF10"
# export RADIO_CODING_RATE="CR5"

# # MeshCore US Standard
# export RADIO_FREQUENCY="906.875 MHz"
# export RADIO_BANDWIDTH="250 kHz"
# export RADIO_SPREAD_FACTOR="SF10"
# export RADIO_CODING_RATE="CR5"

# # MeshCore US Fast
# export RADIO_FREQUENCY="906.875 MHz"
# export RADIO_BANDWIDTH="500 kHz"
# export RADIO_SPREAD_FACTOR="SF7"
# export RADIO_CODING_RATE="CR5"

# # MeshCore ANZ (Australia/New Zealand)
# export RADIO_FREQUENCY="917.0 MHz"
# export RADIO_BANDWIDTH="250 kHz"
# export RADIO_SPREAD_FACTOR="SF10"
# export RADIO_CODING_RATE="CR5"

# =============================================================================
# Intervals and Timeouts
# =============================================================================

export COMPANION_STEP=60          # Collection interval for companion (seconds)
export REPEATER_STEP=900          # Collection interval for repeater (seconds, 15min default)
export REMOTE_TIMEOUT_S=10        # Minimum timeout for LoRa requests
export REMOTE_RETRY_ATTEMPTS=2    # Number of retry attempts
export REMOTE_RETRY_BACKOFF_S=4   # Seconds between retries

# Circuit breaker settings (prevents spamming LoRa when repeater is unreachable)
export REMOTE_CB_FAILS=6          # Failures before circuit breaker opens
export REMOTE_CB_COOLDOWN_S=3600  # Cooldown period in seconds (1 hour)

# =============================================================================
# Paths
# =============================================================================

export STATE_DIR=./data/state   # SQLite database and circuit breaker state
export OUT_DIR=./out            # Generated static site output

# =============================================================================
# Optional
# =============================================================================

export REPEATER_FETCH_ACL=0   # Set to 1 to fetch ACL from repeater
.github/workflows/release-please.yml (new file, 21 lines, vendored)
@@ -0,0 +1,21 @@
name: Release Please

on:
  push:
    branches:
      - main

permissions:
  contents: write
  pull-requests: write

jobs:
  release-please:
    runs-on: ubuntu-latest
    steps:
      - name: Release Please
        uses: googleapis/release-please-action@v4
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          config-file: release-please-config.json
          manifest-file: .release-please-manifest.json
.gitignore (new file, 41 lines, vendored)
@@ -0,0 +1,41 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
.venv/
venv/
ENV/
env/
*.egg-info/
dist/
build/

# Environment
.envrc
.env

# Data directories (keep structure, ignore content)
data/snapshots/companion/**/*.json
data/snapshots/repeater/**/*.json
data/rrd/*.rrd
data/state/*

# Generated output (except .htaccess)
out/*
!out/.htaccess

# IDE
.idea/
.vscode/
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db

.direnv
.release-please-manifest.json (new file, 3 lines)
@@ -0,0 +1,3 @@
{
  ".": "0.1.0"
}
CHANGELOG.md (new file, 20 lines)
@@ -0,0 +1,20 @@
# Changelog

All notable changes to this project will be documented in this file.

This changelog is automatically generated by [release-please](https://github.com/googleapis/release-please) based on [Conventional Commits](https://www.conventionalcommits.org/).

## 0.1.0 (Initial Release)

Initial release of MeshCore Stats - a monitoring system for MeshCore LoRa mesh networks.

### Features

- Data collection from companion and repeater nodes
- SQLite database with EAV schema for flexible metric storage
- Interactive SVG chart generation with matplotlib
- Static HTML site with day/week/month/year views
- Monthly and yearly statistics reports
- Light and dark theme support
- Circuit breaker for unreliable LoRa connections
- Battery percentage calculation from voltage curves
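The EAV schema mentioned in the feature list stores one row per (node, metric, timestamp) rather than one column per metric, so new metric kinds need no schema migration. A hedged sketch of the idea using `sqlite3`; the table and column names here are illustrative, not the project's actual schema:

```python
import sqlite3

# EAV (entity-attribute-value) metrics table: entity = node,
# attribute = metric name, value keyed by timestamp.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE metrics (
        node   TEXT    NOT NULL,  -- e.g. 'companion' or 'repeater'
        metric TEXT    NOT NULL,  -- attribute name, e.g. 'battery_v'
        ts     INTEGER NOT NULL,  -- unix timestamp
        value  REAL,
        PRIMARY KEY (node, metric, ts)
    )
""")
conn.execute("INSERT INTO metrics VALUES ('repeater', 'battery_v', 1700000000, 3.91)")
conn.execute("INSERT INTO metrics VALUES ('repeater', 'noise_floor', 1700000000, -112.0)")

# Adding a new metric kind is just a new 'metric' string -- no ALTER TABLE.
rows = conn.execute(
    "SELECT metric, value FROM metrics WHERE node = 'repeater' ORDER BY metric"
).fetchall()
```

The trade-off versus a wide table is that per-chart queries filter on `(node, metric)` instead of selecting a column, which the composite primary key indexes efficiently.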
676
CLAUDE.md
Normal file
676
CLAUDE.md
Normal file
@@ -0,0 +1,676 @@
|
|||||||
|
# CLAUDE.md - MeshCore Stats Project Guide
|
||||||
|
|
||||||
|
> **Maintenance Note**: This file should always reflect the current state of the project. When making changes to the codebase (adding features, changing architecture, modifying configuration), update this document accordingly. Keep it accurate and comprehensive for future reference.
|
||||||
|
|
||||||
|
## Important: Source Files vs Generated Output
|
||||||
|
|
||||||
|
**NEVER edit files in `out/` directly** - they are generated and will be overwritten.
|
||||||
|
|
||||||
|
| Type | Source Location | Generated Output |
|
||||||
|
|------|-----------------|------------------|
|
||||||
|
| CSS | `src/meshmon/templates/styles.css` | `out/styles.css` |
|
||||||
|
| HTML templates | `src/meshmon/templates/*.html` | `out/*.html` |
|
||||||
|
| JavaScript | `src/meshmon/templates/*.js` | `out/*.js` |
|
||||||
|
|
||||||
|
Always edit the source templates, then regenerate with `direnv exec . python scripts/render_site.py`.
|
||||||
|
|
||||||
|
## Running Commands
|
||||||
|
|
||||||
|
**IMPORTANT**: Always use `direnv exec .` to run Python scripts in this project. This ensures the correct virtualenv and environment variables are loaded.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Correct way to run scripts
|
||||||
|
direnv exec . python scripts/render_site.py
|
||||||
|
|
||||||
|
# NEVER use these (virtualenv won't be loaded correctly):
|
||||||
|
# source .envrc && python ...
|
||||||
|
# .direnv/python-3.12/bin/python ...
|
||||||
|
```
|
||||||
|
|
||||||
|
## Commit Message Guidelines
|
||||||
|
|
||||||
|
This project uses [Conventional Commits](https://www.conventionalcommits.org/) with [release-please](https://github.com/googleapis/release-please) for automated releases. **Commit messages directly control versioning and changelog generation.**
|
||||||
|
|
||||||
|
### Format
|
||||||
|
|
||||||
|
```
|
||||||
|
<type>[optional scope]: <description>
|
||||||
|
|
||||||
|
[optional body]
|
||||||
|
|
||||||
|
[optional footer(s)]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Commit Types
|
||||||
|
|
||||||
|
| Type | Description | Version Bump | Changelog Section |
|
||||||
|
|------|-------------|--------------|-------------------|
|
||||||
|
| `feat` | New feature or capability | Minor (0.1.0 → 0.2.0) | Features |
|
||||||
|
| `fix` | Bug fix | Patch (0.1.0 → 0.1.1) | Bug Fixes |
|
||||||
|
| `perf` | Performance improvement | Patch | Performance Improvements |
|
||||||
|
| `docs` | Documentation only | None | Documentation |
|
||||||
|
| `style` | Code style (formatting, whitespace) | None | Styles |
|
||||||
|
| `refactor` | Code change that neither fixes nor adds | None | Code Refactoring |
|
||||||
|
| `test` | Adding or correcting tests | None | Tests |
|
||||||
|
| `chore` | Maintenance tasks, dependencies | None | Miscellaneous Chores |
|
||||||
|
| `build` | Build system or dependencies | None | Build System |
|
||||||
|
| `ci` | CI/CD configuration | None | Continuous Integration |
|
||||||
|
| `revert` | Reverts a previous commit | Varies | Reverts |
|
||||||
|
|
||||||
|
### Breaking Changes (CRITICAL)
|
||||||
|
|
||||||
|
Breaking changes trigger a **major version bump** (0.x.x → 1.0.0 or 1.x.x → 2.0.0).
|
||||||
|
|
||||||
|
**Before marking a change as breaking, carefully consider:**
|
||||||
|
|
||||||
|
1. **Does this change the database schema?**
|
||||||
|
- Schema changes are NOT breaking - migrations are applied automatically
|
||||||
|
- Data loss scenarios (dropping columns with important data) should be documented but are typically NOT breaking since migrations handle them
|
||||||
|
|
||||||
|
2. **Does this change the configuration (environment variables)?**
|
||||||
|
- Adding new optional variables is NOT breaking
|
||||||
|
- Removing or renaming required variables IS breaking
|
||||||
|
- Changing default behavior that users depend on IS breaking
|
||||||
|
|
||||||
|
3. **Does this change the output format?**
|
||||||
|
- Adding new fields to JSON output is NOT breaking
|
||||||
|
- Removing fields or changing structure IS breaking
|
||||||
|
- Changing HTML structure that external tools parse IS breaking
|
||||||
|
|
||||||
|
4. **Does this require users to take action after upgrading?**
|
||||||
|
- If users must run migrations, update configs, or modify their setup → likely breaking
|
||||||
|
|
||||||
|
**How to mark breaking changes:**
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Option 1: Add ! after the type
|
||||||
|
feat!: remove support for legacy database schema
|
||||||
|
|
||||||
|
# Option 2: Add BREAKING CHANGE footer
|
||||||
|
feat: migrate to new metrics format
|
||||||
|
|
||||||
|
BREAKING CHANGE: The old wide-table schema is no longer supported.
|
||||||
|
Run the migration script before upgrading.
|
||||||
|
```
|
||||||
|
|
||||||
|
### Examples
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Feature (minor bump)
|
||||||
|
feat: add noise floor metric to repeater charts
|
||||||
|
|
||||||
|
# Bug fix (patch bump)
|
||||||
|
fix: correct battery percentage calculation below 3.5V
|
||||||
|
|
||||||
|
# Performance (patch bump)
|
||||||
|
perf: reduce chart rendering time with data point caching
|
||||||
|
|
||||||
|
# Documentation (no bump, in changelog)
|
||||||
|
docs: add troubleshooting section for serial connection issues
|
||||||
|
|
||||||
|
# Refactor (no bump, in changelog)
|
||||||
|
refactor: extract chart rendering logic into separate module
|
||||||
|
|
||||||
|
# Chore (no bump, in changelog)
|
||||||
|
chore: update matplotlib to 3.9.0
|
||||||
|
|
||||||
|
# Breaking change (major bump)
|
||||||
|
feat!: change metrics database schema to EAV format
|
||||||
|
|
||||||
|
BREAKING CHANGE: Existing databases must be migrated using
|
||||||
|
scripts/migrate_to_eav.py before running the new version.
|
||||||
|
```

### Scopes (Optional)

Use scopes to clarify what part of the codebase is affected:

- `charts` - Chart rendering (`src/meshmon/charts.py`)
- `db` - Database operations (`src/meshmon/db.py`)
- `html` - HTML generation (`src/meshmon/html.py`)
- `reports` - Report generation (`src/meshmon/reports.py`)
- `collector` - Data collection scripts
- `deps` - Dependencies

Example: `fix(charts): prevent crash when no data points available`

### Release Process

1. Commits to `main` with conventional prefixes are analyzed automatically
2. release-please creates/updates a "Release PR" with:
   - Updated `CHANGELOG.md`
   - Updated version in `src/meshmon/__init__.py`
3. When the Release PR is merged:
   - A GitHub Release is created
   - A git tag (e.g., `v0.2.0`) is created

## Project Overview

This project monitors a MeshCore LoRa mesh network consisting of:

- **1 Companion node**: Connected via USB serial to a local machine
- **1 Remote repeater**: Reachable over LoRa from the companion

The system collects metrics, stores them in a SQLite database, and generates a static HTML dashboard with SVG charts.

## Architecture

```
┌─────────────────┐     LoRa     ┌─────────────────┐
│    Companion    │◄────────────►│     Repeater    │
│  (USB Serial)   │              │     (Remote)    │
└────────┬────────┘              └─────────────────┘
         │
         │ Serial
         ▼
┌─────────────────┐
│   Local Host    │
│  (This System)  │
└─────────────────┘

Data Flow:
Phase 1: Collect → SQLite database
Phase 2: Render  → SVG charts (from database, using matplotlib)
Phase 3: Render  → Static HTML site (inline SVG)
Phase 4: Render  → Reports (monthly/yearly statistics)
```

## Directory Structure

```
meshcore-stats/
├── src/meshmon/                 # Core library
│   ├── __init__.py
│   ├── env.py                   # Environment config parsing
│   ├── log.py                   # Logging utilities
│   ├── meshcore_client.py       # MeshCore connection wrapper
│   ├── db.py                    # SQLite database module
│   ├── retry.py                 # Retry logic & circuit breaker
│   ├── charts.py                # SVG chart rendering (matplotlib)
│   ├── html.py                  # HTML site generation
│   ├── metrics.py               # Metric type definitions (counter vs gauge)
│   ├── reports.py               # Report generation (WeeWX-style)
│   ├── battery.py               # 18650 Li-ion voltage to percentage conversion
│   ├── migrations/              # SQL schema migrations
│   │   ├── 001_initial_schema.sql
│   │   └── 002_eav_schema.sql
│   └── templates/               # Jinja2 HTML templates
│       ├── base.html            # Base template with head/meta tags
│       ├── node.html            # Dashboard page template
│       ├── report.html          # Individual report template
│       ├── report_index.html    # Reports archive template
│       ├── credit.html          # Reusable footer credit partial
│       ├── styles.css           # Source CSS stylesheet
│       └── chart-tooltip.js     # Source tooltip JavaScript
├── docs/                        # Documentation
│   ├── firmware-responses.md    # MeshCore firmware response formats
│   ├── battery-voltage-sources.md  # Battery voltage research
│   └── repeater-winter-solar-analysis.md
├── scripts/                     # Executable scripts (cron-friendly)
│   ├── collect_companion.py     # Collect metrics from companion node
│   ├── collect_repeater.py      # Collect metrics from repeater node
│   ├── render_charts.py         # Generate SVG charts from database
│   ├── render_site.py           # Generate static HTML site
│   ├── render_reports.py        # Generate monthly/yearly reports
│   └── db_maintenance.sh        # Database VACUUM/ANALYZE
├── data/
│   └── state/                   # Persistent state
│       ├── metrics.db           # SQLite database (WAL mode)
│       └── repeater_circuit.json
├── out/                         # Generated static site
│   ├── day.html                 # Repeater pages at root (entry point)
│   ├── week.html
│   ├── month.html
│   ├── year.html
│   ├── .htaccess                # Apache config (DirectoryIndex, cache control)
│   ├── styles.css               # CSS stylesheet
│   ├── chart-tooltip.js         # Progressive enhancement for chart tooltips
│   ├── companion/               # Companion pages (day/week/month/year.html)
│   ├── assets/                  # SVG chart files and statistics
│   │   ├── companion/           # {metric}_{period}_{theme}.svg, chart_stats.json
│   │   └── repeater/            # {metric}_{period}_{theme}.svg, chart_stats.json
│   └── reports/                 # Monthly/yearly statistics reports
│       ├── index.html           # Reports listing page
│       ├── repeater/            # Repeater reports by year/month
│       │   └── YYYY/
│       │       ├── index.html, report.txt, report.json   # Yearly
│       │       └── MM/
│       │           └── index.html, report.txt, report.json   # Monthly
│       └── companion/           # Same structure as repeater
└── .envrc                       # Environment configuration
```

## Configuration

All configuration via environment variables (see `.envrc`):

### Connection Settings

- `MESH_TRANSPORT`: "serial" (default), "tcp", or "ble"
- `MESH_SERIAL_PORT`: Serial port path (auto-detects if unset)
- `MESH_SERIAL_BAUD`: Baud rate (default: 115200)
- `MESH_DEBUG`: Enable meshcore debug logging (0/1)

### Repeater Identity

- `REPEATER_NAME`: Advertised name to find in contacts
- `REPEATER_KEY_PREFIX`: Alternative: hex prefix of public key
- `REPEATER_PASSWORD`: Admin password for login

### Timeouts & Retry

- `REMOTE_TIMEOUT_S`: Minimum timeout for LoRa requests (default: 10)
- `REMOTE_RETRY_ATTEMPTS`: Number of retry attempts (default: 5)
- `REMOTE_RETRY_BACKOFF_S`: Seconds between retries (default: 4)
- `REMOTE_CB_FAILS`: Failures before circuit breaker opens (default: 6)
- `REMOTE_CB_COOLDOWN_S`: Circuit breaker cooldown (default: 3600)
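
The retry variables above can be sketched as a simple loop. This is an assumed illustration of the behavior (the real logic lives in `src/meshmon/retry.py`; the `request_with_retry` name and signature are hypothetical):

```python
import time

# Sketch of the retry behavior the variables above control (assumed helper).
# Failures are spaced REMOTE_RETRY_BACKOFF_S apart; exhausting all
# REMOTE_RETRY_ATTEMPTS counts as one failure toward the circuit breaker.
def request_with_retry(send_request, attempts: int = 5, backoff_s: float = 4.0):
    last_error = None
    for attempt in range(attempts):
        try:
            return send_request()
        except TimeoutError as exc:
            last_error = exc
            if attempt < attempts - 1:
                time.sleep(backoff_s)  # wait before the next attempt
    raise last_error  # all attempts exhausted
```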

### Intervals

- `COMPANION_STEP`: Collection interval for companion (default: 60s)
- `REPEATER_STEP`: Collection interval for repeater (default: 900s / 15min)

### Location & Display

- `REPORT_LOCATION_NAME`: Full location name for reports (default: "Your Location")
- `REPORT_LOCATION_SHORT`: Short location for sidebar/meta (default: "Your Location")
- `REPORT_LAT`: Latitude in decimal degrees (default: 0.0)
- `REPORT_LON`: Longitude in decimal degrees (default: 0.0)
- `REPORT_ELEV`: Elevation (default: 0.0)
- `REPORT_ELEV_UNIT`: Elevation unit, "m" or "ft" (default: "m")
- `REPEATER_DISPLAY_NAME`: Display name for repeater in UI (default: "Repeater Node")
- `COMPANION_DISPLAY_NAME`: Display name for companion in UI (default: "Companion Node")
- `REPEATER_HARDWARE`: Repeater hardware model for sidebar (default: "LoRa Repeater")
- `COMPANION_HARDWARE`: Companion hardware model for sidebar (default: "LoRa Node")

### Radio Configuration (for display)

- `RADIO_FREQUENCY`: e.g., "869.618 MHz"
- `RADIO_BANDWIDTH`: e.g., "62.5 kHz"
- `RADIO_SPREAD_FACTOR`: e.g., "SF8"
- `RADIO_CODING_RATE`: e.g., "CR8"

### Metrics (Hardcoded)

Metrics are now hardcoded in the codebase rather than configurable via environment variables. This simplifies the system and ensures consistency between the database schema and the code.

See the "Metrics Reference" sections below for the full list of companion and repeater metrics.

## Key Dependencies

- **meshcore**: Python library for MeshCore device communication
  - Commands accessed via `mc.commands.method_name()`
  - Contacts returned as dict keyed by public key
  - Binary request `req_status_sync` returns payload directly
- **matplotlib**: SVG chart generation
- **jinja2**: HTML template rendering

## Metric Types

Metrics are classified as either **gauge** or **counter** in `src/meshmon/metrics.py`, using firmware field names directly:

- **GAUGE**: Instantaneous values
  - Companion: `battery_mv`, `bat_pct`, `contacts`, `uptime_secs`
  - Repeater: `bat`, `bat_pct`, `last_rssi`, `last_snr`, `noise_floor`, `uptime`, `tx_queue_len`
- **COUNTER**: Cumulative values that show rate of change, displayed as per-minute rates:
  - Companion: `recv`, `sent`
  - Repeater: `nb_recv`, `nb_sent`, `airtime`, `rx_airtime`, `flood_dups`, `direct_dups`, `sent_flood`, `recv_flood`, `sent_direct`, `recv_direct`

Counter metrics are converted to rates during chart rendering by calculating deltas between consecutive readings.
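
The delta calculation can be sketched as follows. This is an assumed illustration (the `counter_to_rates` helper is hypothetical, not the project's actual code); a negative delta indicates a device reboot reset the counter, so that interval is skipped:

```python
# Sketch: convert cumulative counter readings into per-minute rates, as done
# during chart rendering. A negative delta means the device rebooted and the
# counter reset, so that interval is dropped rather than plotted.
def counter_to_rates(samples: list[tuple[int, float]]) -> list[tuple[int, float]]:
    """samples: (unix_ts, counter_value) pairs, ascending by timestamp."""
    rates = []
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        delta = v1 - v0
        if delta < 0 or t1 <= t0:
            continue  # counter reset (reboot) or bad timestamps
        rates.append((t1, delta / (t1 - t0) * 60))  # per-minute rate
    return rates
```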

## Database Schema

Metrics are stored in a SQLite database at `data/state/metrics.db` with WAL mode enabled for concurrent access.

### EAV Schema (Entity-Attribute-Value)

The database uses an EAV schema for flexible metric storage. Firmware field names are stored directly, allowing new metrics to be captured automatically without schema changes.

```sql
CREATE TABLE metrics (
    ts     INTEGER NOT NULL,  -- Unix timestamp
    role   TEXT NOT NULL,     -- 'companion' or 'repeater'
    metric TEXT NOT NULL,     -- Firmware field name (e.g., 'bat', 'nb_recv')
    value  REAL,              -- Metric value
    PRIMARY KEY (ts, role, metric)
) STRICT, WITHOUT ROWID;

CREATE INDEX idx_metrics_role_ts ON metrics(role, ts);
```

**Key features:**

- Firmware field names stored as-is (no translation)
- New firmware fields captured automatically
- Renamed/dropped fields handled gracefully
- ~3.75M rows/year is well within SQLite capacity
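
A collector write against this schema can be sketched like this (an assumed illustration: the `store_response` helper is hypothetical, but the table and column names match the schema above):

```python
import sqlite3
import time

# Sketch: insert every numeric field of a firmware response dict into the
# EAV `metrics` table. Non-numeric fields (names, booleans) are skipped, so
# new firmware fields are captured automatically without schema changes.
def store_response(db: sqlite3.Connection, role: str, response: dict) -> None:
    ts = int(time.time())
    rows = [
        (ts, role, name, float(value))
        for name, value in response.items()
        if isinstance(value, (int, float)) and not isinstance(value, bool)
    ]
    db.executemany(
        "INSERT OR REPLACE INTO metrics (ts, role, metric, value) VALUES (?, ?, ?, ?)",
        rows,
    )
    db.commit()
```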

### Firmware Field Names

Collectors iterate firmware response dicts directly and insert all numeric values:

**Companion** (from `get_stats_core`, `get_stats_packets`, `get_contacts`):

- `battery_mv` - Battery in millivolts
- `uptime_secs` - Uptime in seconds
- `recv`, `sent` - Packet counters
- `contacts` - Number of contacts

**Repeater** (from `req_status_sync`):

- `bat` - Battery in millivolts
- `uptime` - Uptime in seconds
- `last_rssi`, `last_snr`, `noise_floor` - Signal metrics
- `nb_recv`, `nb_sent` - Packet counters
- `airtime`, `rx_airtime` - Airtime counters
- `tx_queue_len` - TX queue depth
- `flood_dups`, `direct_dups`, `sent_flood`, `recv_flood`, `sent_direct`, `recv_direct` - Detailed packet counters

See `docs/firmware-responses.md` for complete firmware response documentation.

### Derived Fields (Computed at Query Time)

Battery percentage (`bat_pct`) is computed at query time from voltage using the 18650 Li-ion discharge curve:

| Voltage | Percentage |
|---------|------------|
| 4.20V   | 100%       |
| 4.06V   | 90%        |
| 3.98V   | 80%        |
| 3.92V   | 70%        |
| 3.87V   | 60%        |
| 3.82V   | 50%        |
| 3.79V   | 40%        |
| 3.77V   | 30%        |
| 3.74V   | 20%        |
| 3.68V   | 10%        |
| 3.45V   | 5%         |
| 3.00V   | 0%         |

Uses piecewise linear interpolation between points. Implementation in `src/meshmon/battery.py`.
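
The interpolation can be sketched as follows (the real implementation lives in `src/meshmon/battery.py`; this function name and structure are illustrative):

```python
# Sketch of the piecewise linear interpolation over the discharge curve above.
# Voltages between curve points are interpolated linearly; values outside the
# curve are clamped to 0-100%.
CURVE = [  # (volts, percent), descending by voltage
    (4.20, 100), (4.06, 90), (3.98, 80), (3.92, 70), (3.87, 60),
    (3.82, 50), (3.79, 40), (3.77, 30), (3.74, 20), (3.68, 10),
    (3.45, 5), (3.00, 0),
]

def voltage_to_percent(volts: float) -> float:
    if volts >= CURVE[0][0]:
        return 100.0  # at or above 4.20 V
    if volts <= CURVE[-1][0]:
        return 0.0    # at or below 3.00 V
    for (v_hi, p_hi), (v_lo, p_lo) in zip(CURVE, CURVE[1:]):
        if v_lo <= volts <= v_hi:
            frac = (volts - v_lo) / (v_hi - v_lo)
            return p_lo + frac * (p_hi - p_lo)
    return 0.0  # unreachable given the clamps above
```

For example, 3.785 V sits three quarters of the way between the 3.77 V (30%) and 3.79 V (40%) points, so it maps to about 37.5%.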

### Migration System

Database migrations are stored as SQL files in `src/meshmon/migrations/` with naming convention `NNN_description.sql`. Migrations are applied automatically on database initialization.

Current migrations:

1. `001_initial_schema.sql` - Creates db_meta table and initial wide tables
2. `002_eav_schema.sql` - Migrates to EAV schema, converts field names
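
The pattern can be sketched as follows. This is an assumed illustration of the mechanism described above (the real logic lives in `src/meshmon/db.py`; the `apply_migrations` helper and the `schema_version` key are hypothetical names):

```python
import sqlite3
from pathlib import Path

# Sketch: apply NNN_description.sql files in order, tracking the highest
# applied number so re-running on an up-to-date database is a no-op.
def apply_migrations(db: sqlite3.Connection, migrations_dir: Path) -> None:
    db.execute("CREATE TABLE IF NOT EXISTS db_meta (key TEXT PRIMARY KEY, value TEXT)")
    row = db.execute("SELECT value FROM db_meta WHERE key = 'schema_version'").fetchone()
    current = int(row[0]) if row else 0
    for path in sorted(migrations_dir.glob("[0-9][0-9][0-9]_*.sql")):
        number = int(path.name[:3])
        if number <= current:
            continue  # already applied
        db.executescript(path.read_text())
        db.execute(
            "INSERT OR REPLACE INTO db_meta (key, value) VALUES ('schema_version', ?)",
            (str(number),),
        )
    db.commit()
```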

## Running the Scripts

Always source the environment first:

```bash
# Using direnv (automatic)
cd /path/to/meshcore-stats

# Or manually
source .envrc 2>/dev/null
```

### Data Collection

```bash
# Collect companion data (run every 60s)
python scripts/collect_companion.py

# Collect repeater data (run every 15min)
python scripts/collect_repeater.py
```

### Chart Rendering

```bash
# Render all SVG charts from database (day/week/month/year for all metrics)
python scripts/render_charts.py
```

Charts are rendered using matplotlib, reading directly from the SQLite database. Each chart is generated in both light and dark theme variants.

### HTML Generation

```bash
# Generate static site pages
python scripts/render_site.py
```

### Reports

Generates monthly and yearly statistics reports in HTML, TXT (WeeWX-style ASCII), and JSON formats:

```bash
# Generate all reports
python scripts/render_reports.py
```

Reports are generated for all available months/years based on database data. Output structure:

- `/reports/` - Index page listing all available reports
- `/reports/{role}/{year}/` - Yearly report (HTML, TXT, JSON)
- `/reports/{role}/{year}/{month}/` - Monthly report (HTML, TXT, JSON)

Counter metrics (rx, tx, airtime) are aggregated from absolute counter values in the database, with proper handling of device reboots (negative deltas).

## Web Dashboard UI

The static site uses a modern, responsive design with the following features:

### Site Structure

- **Repeater pages at root**: `/day.html`, `/week.html`, etc. (entry point)
- **Companion pages**: `/companion/day.html`, `/companion/week.html`, etc.
- **`.htaccess`**: Sets `DirectoryIndex day.html` so `/` loads the repeater day view

### Page Layout

1. **Header**: Site branding, node name, pubkey prefix, status indicator, last updated time
2. **Navigation**: Node switcher (Repeater/Companion) + period tabs (Day/Week/Month/Year)
3. **Metrics Bar**: Key values at a glance (Battery, Uptime, RSSI, SNR for repeater)
4. **Dashboard Grid**: Two-column layout with Snapshot table and About section
5. **Charts Grid**: Two charts per row on desktop, one on mobile

### Status Indicator

Color-coded based on data freshness:

- **Green (online)**: Data less than 30 minutes old
- **Yellow (stale)**: Data 30 minutes to 2 hours old
- **Red (offline)**: Data more than 2 hours old
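
The thresholds above boil down to a tiny classifier — a sketch with an assumed function name (the real logic lives in the site generator):

```python
# Sketch of the data-freshness thresholds above.
ONLINE_MAX_S = 30 * 60       # 30 minutes
STALE_MAX_S = 2 * 60 * 60    # 2 hours

def status_for_age(age_seconds: float) -> str:
    if age_seconds < ONLINE_MAX_S:
        return "online"   # green
    if age_seconds < STALE_MAX_S:
        return "stale"    # yellow
    return "offline"      # red
```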

### Chart Tooltips

- Progressive enhancement via `chart-tooltip.js`
- Shows datetime and value when hovering over chart data
- Works without JavaScript (charts still display, just no tooltips)
- Uses `data-points`, `data-x-start`, `data-x-end` attributes embedded in SVG

### Social Sharing

Open Graph and Twitter Card meta tags for link previews:

- `og:title`, `og:description`, `og:site_name`
- `twitter:card` (summary_large_image format)
- Role-specific descriptions

### Design System (CSS Variables)

```css
--primary: #2563eb;       /* Brand blue */
--bg: #f8fafc;            /* Page background */
--bg-elevated: #ffffff;   /* Card background */
--text: #1e293b;          /* Primary text */
--text-muted: #64748b;    /* Secondary text */
--border: #e2e8f0;        /* Borders */
--success: #16a34a;       /* Online status */
--warning: #ca8a04;       /* Stale status */
--danger: #dc2626;        /* Offline status */
```

### Responsive Breakpoints

- **< 900px**: Single column layout, stacked header
- **< 600px**: Smaller fonts, stacked table cells, horizontal scroll nav

## Chart Configuration

Charts are generated as inline SVGs using matplotlib (`src/meshmon/charts.py`).

### Rendering

- **Output**: SVG files at 800x280 pixels
- **Themes**: Light and dark variants (CSS `prefers-color-scheme` switches between them)
- **Inline**: SVGs are embedded directly in HTML for zero additional requests
- **Tooltips**: Data points embedded as JSON in SVG `data-points` attribute

### Time Aggregation (Binning)

Data points are aggregated into bins to keep chart file sizes reasonable and lines clean:

| Period | Bin Size         | ~Data Points | Pixels/Point |
|--------|------------------|--------------|--------------|
| Day    | Raw (no binning) | ~96          | ~6.7px       |
| Week   | 30 minutes       | ~336         | ~2px         |
| Month  | 2 hours          | ~372         | ~1.7px       |
| Year   | 1 day            | ~365         | ~1.8px       |
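
The binning step can be sketched like this (an assumed illustration; the real code is in `src/meshmon/charts.py` and the `bin_samples` helper is hypothetical). Samples are grouped into fixed-width bins and averaged, which is how a week of raw readings collapses to roughly 336 points at 30-minute bins:

```python
from collections import defaultdict

# Sketch: group (timestamp, value) samples into fixed-width time bins and
# average each bin, returning one point per bin at the bin's center.
def bin_samples(samples, bin_size_s):
    bins = defaultdict(list)
    for ts, value in samples:
        bins[ts // bin_size_s].append(value)
    return [
        (bin_idx * bin_size_s + bin_size_s // 2, sum(vals) / len(vals))
        for bin_idx, vals in sorted(bins.items())
    ]
```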

### Visual Style

- 2 charts per row on desktop, 1 on mobile (< 900px)
- Amber/orange line color (#b45309 light, #f59e0b dark)
- Semi-transparent area fill with solid line on top
- Min/Avg/Max statistics displayed in chart footer
- Current value displayed in chart header

### Repeater Metrics Summary

Metrics use firmware field names directly from `req_status_sync`:

| Metric | Type | Display Unit | Description |
|--------|------|--------------|-------------|
| `bat` | gauge | Voltage (V) | Battery voltage (stored in mV, displayed as V) |
| `bat_pct` | gauge | Battery (%) | Battery percentage (computed at query time) |
| `last_rssi` | gauge | RSSI (dBm) | Signal strength of last packet |
| `last_snr` | gauge | SNR (dB) | Signal-to-noise ratio |
| `noise_floor` | gauge | dBm | Background RF noise |
| `uptime` | gauge | Days | Time since reboot (seconds ÷ 86400) |
| `tx_queue_len` | gauge | Queue depth | TX queue length |
| `nb_recv` | counter | Packets/min | Total packets received |
| `nb_sent` | counter | Packets/min | Total packets transmitted |
| `airtime` | counter | Seconds/min | TX airtime rate |
| `rx_airtime` | counter | Seconds/min | RX airtime rate |
| `flood_dups` | counter | Packets/min | Flood duplicate packets |
| `direct_dups` | counter | Packets/min | Direct duplicate packets |
| `sent_flood` | counter | Packets/min | Flood packets transmitted |
| `recv_flood` | counter | Packets/min | Flood packets received |
| `sent_direct` | counter | Packets/min | Direct packets transmitted |
| `recv_direct` | counter | Packets/min | Direct packets received |

### Companion Metrics Summary

Metrics use firmware field names directly from `get_stats_*`:

| Metric | Type | Display Unit | Description |
|--------|------|--------------|-------------|
| `battery_mv` | gauge | Voltage (V) | Battery voltage (stored in mV, displayed as V) |
| `bat_pct` | gauge | Battery (%) | Battery percentage (computed at query time) |
| `contacts` | gauge | Count | Known mesh nodes |
| `uptime_secs` | gauge | Days | Time since reboot (seconds ÷ 86400) |
| `recv` | counter | Packets/min | Total packets received |
| `sent` | counter | Packets/min | Total packets transmitted |

## Circuit Breaker

The repeater collector uses a circuit breaker to avoid spamming LoRa when the repeater is unreachable:

- State stored in `data/state/repeater_circuit.json`
- After N consecutive failures, enters cooldown
- During cooldown, skips collection instead of attempting request
- Resets on successful response
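
The behavior can be sketched as a small class persisting its state to JSON. This is an assumed illustration (the real implementation is in `src/meshmon/retry.py`; the class name and the `fails`/`opened_at` field names in the state file are hypothetical):

```python
import json
import time
from pathlib import Path

# Sketch of the circuit-breaker behavior described above. Defaults mirror
# REMOTE_CB_FAILS=6 and REMOTE_CB_COOLDOWN_S=3600.
class CircuitBreaker:
    def __init__(self, state_path: Path, max_fails: int = 6, cooldown_s: int = 3600):
        self.state_path = state_path
        self.max_fails = max_fails
        self.cooldown_s = cooldown_s
        try:
            self.state = json.loads(state_path.read_text())
        except (FileNotFoundError, json.JSONDecodeError):
            self.state = {"fails": 0, "opened_at": None}

    def allow_request(self) -> bool:
        opened_at = self.state.get("opened_at")
        if opened_at is None:
            return True
        if time.time() - opened_at >= self.cooldown_s:
            self.state = {"fails": 0, "opened_at": None}  # cooldown over
            self._save()
            return True
        return False  # still cooling down: skip this collection run

    def record_failure(self) -> None:
        self.state["fails"] += 1
        if self.state["fails"] >= self.max_fails:
            self.state["opened_at"] = time.time()  # open the circuit
        self._save()

    def record_success(self) -> None:
        self.state = {"fails": 0, "opened_at": None}  # full reset
        self._save()

    def _save(self) -> None:
        self.state_path.write_text(json.dumps(self.state))
```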

## Debugging

Enable debug logging:

```bash
MESH_DEBUG=1 python scripts/collect_companion.py
```

Check circuit breaker state:

```bash
cat data/state/repeater_circuit.json
```

Test with meshcore-cli:

```bash
meshcore-cli -s /dev/ttyACM0 contacts
meshcore-cli -s /dev/ttyACM0 req_status "repeater name"
meshcore-cli -s /dev/ttyACM0 reset_path "repeater name"
```

## Known Issues

1. **Repeater not responding**: If `req_status_sync` times out after all retry attempts, the repeater may:
   - Not support binary protocol requests
   - Have incorrect admin password configured
   - Have routing issues (asymmetric path)
   - Be offline or rebooted
2. **Environment variables not loaded**: Scripts must be run with direnv active or manually source `.envrc`
3. **Empty charts**: Need at least 2 data points to display meaningful data

## Cron Setup (Example)

**Important**: Stagger companion and repeater collection to avoid USB serial conflicts.

```cron
# Companion: every minute at :00
* * * * * cd /path/to/meshcore-stats && .direnv/python-3.12/bin/python scripts/collect_companion.py

# Repeater: every 15 minutes at :01, :16, :31, :46 (offset by 1 min to avoid USB conflict)
1,16,31,46 * * * * cd /path/to/meshcore-stats && .direnv/python-3.12/bin/python scripts/collect_repeater.py

# Charts: every 5 minutes (generates SVG charts from database)
*/5 * * * * cd /path/to/meshcore-stats && .direnv/python-3.12/bin/python scripts/render_charts.py

# HTML: every 5 minutes
*/5 * * * * cd /path/to/meshcore-stats && .direnv/python-3.12/bin/python scripts/render_site.py

# Reports: daily at midnight (historical stats don't change often)
0 0 * * * cd /path/to/meshcore-stats && .direnv/python-3.12/bin/python scripts/render_reports.py
```

## Adding New Metrics

With the EAV schema, adding new metrics is simple:

1. **Automatic capture**: New numeric fields from firmware responses are automatically stored in the database. No schema changes needed.
2. **To display in charts**: Add the firmware field name to:
   - `METRIC_CONFIG` in `src/meshmon/metrics.py` (label, unit, type, transform)
   - `COMPANION_CHART_METRICS` or `REPEATER_CHART_METRICS` in `src/meshmon/metrics.py`
   - `COMPANION_CHART_GROUPS` or `REPEATER_CHART_GROUPS` in `src/meshmon/html.py`
3. **To display in reports**: Add the firmware field name to:
   - `COMPANION_REPORT_METRICS` or `REPEATER_REPORT_METRICS` in `src/meshmon/reports.py`
   - Update the report table builders in `src/meshmon/html.py` if needed
4. Regenerate charts and site.

## Changing Metric Types

Metric configuration is centralized in `src/meshmon/metrics.py` using `MetricConfig`:

```python
MetricConfig(
    label="Packets RX",    # Human-readable label
    unit="/min",           # Display unit
    type="counter",        # "gauge" or "counter"
    scale=60,              # Multiply by this for display (60 = per minute)
    transform="mv_to_v",   # Optional: apply voltage conversion
)
```

To change a metric from gauge to counter (or vice versa):

1. Update `METRIC_CONFIG` in `src/meshmon/metrics.py` - change the `type` field
2. Update `scale` if needed (counters often use scale=60 for per-minute display)
3. Regenerate charts: `direnv exec . python scripts/render_charts.py`

## Database Maintenance

The SQLite database benefits from periodic maintenance to reclaim space and update query statistics. A maintenance script is provided:

```bash
# Run database maintenance (VACUUM + ANALYZE)
./scripts/db_maintenance.sh
```

SQLite's VACUUM command acquires an exclusive lock internally. Other processes with `busy_timeout` configured will wait for maintenance to complete.
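
The connection settings this relies on can be sketched as follows (an assumed illustration; the project's actual connection setup lives in `src/meshmon/db.py` and the `open_db` name is hypothetical). WAL mode lets the collectors write while the renderers read, and `busy_timeout` makes readers wait out VACUUM instead of failing with "database is locked":

```python
import sqlite3

# Sketch: open the metrics database with WAL journaling for concurrent
# access and a busy timeout so readers wait for VACUUM to finish.
def open_db(path: str) -> sqlite3.Connection:
    db = sqlite3.connect(path, timeout=30)
    db.execute("PRAGMA journal_mode=WAL")
    db.execute("PRAGMA busy_timeout=30000")  # wait up to 30 s for locks
    return db
```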

### Recommended Cron Entry

Add to your crontab for monthly maintenance at 3 AM on the 1st:

```cron
# Database maintenance: monthly at 3 AM on the 1st
0 3 1 * * cd /path/to/meshcore-stats && ./scripts/db_maintenance.sh >> /var/log/meshcore-stats-maintenance.log 2>&1
```
21
LICENSE
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 Jorijn Schrijvershof

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
340
README.md
Normal file
@@ -0,0 +1,340 @@
# MeshCore Stats

A Python-based monitoring system for a MeshCore repeater node and its companion. Collects metrics from both devices, stores them in a SQLite database, and generates a static website with interactive SVG charts and statistics.

**Live demo:** [meshcore.jorijn.com](https://meshcore.jorijn.com)

<p>
<img src="docs/screenshot-1.png" width="49%" alt="MeshCore Stats Dashboard">
<img src="docs/screenshot-2.png" width="49%" alt="MeshCore Stats Reports">
</p>

## Features

- **Data Collection** - Collect metrics from companion (local) and repeater (remote) nodes
- **Chart Rendering** - Generate interactive SVG charts from the database using matplotlib
- **Static Site** - Generate a static HTML website with day/week/month/year views
- **Reports** - Generate monthly and yearly statistics reports

## Requirements

### Python Dependencies

- Python 3.10+
- meshcore >= 2.2.3
- pyserial >= 3.5
- jinja2 >= 3.1.0
- matplotlib >= 3.8.0

### System Dependencies

- sqlite3 (for the database maintenance script)

## Setup

### 1. Create Virtual Environment

```bash
cd /path/to/meshcore-stats
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

### 2. Configure Environment Variables

Copy the example configuration file and customize it:

```bash
cp .envrc.example .envrc
# Edit .envrc with your settings
```

The `.envrc.example` file contains all available configuration options with documentation. Key settings to configure:

- **Connection**: `MESH_SERIAL_PORT`, `MESH_TRANSPORT`
- **Repeater Identity**: `REPEATER_NAME`, `REPEATER_PASSWORD`
- **Display Names**: `REPEATER_DISPLAY_NAME`, `COMPANION_DISPLAY_NAME`
- **Location**: `REPORT_LOCATION_NAME`, `REPORT_LAT`, `REPORT_LON`, `REPORT_ELEV`
- **Hardware Info**: `REPEATER_HARDWARE`, `COMPANION_HARDWARE`
- **Radio Config**: `RADIO_FREQUENCY`, `RADIO_BANDWIDTH`, etc. (includes presets for different regions)

If using direnv:

```bash
direnv allow
```

## Usage

### Manual Execution

```bash
# Collect companion data
python scripts/collect_companion.py

# Collect repeater data
python scripts/collect_repeater.py

# Generate static site (includes chart rendering)
python scripts/render_site.py

# Generate reports
python scripts/render_reports.py
```

### Cron Setup

Add these entries to your crontab (`crontab -e`):

```cron
# MeshCore Stats - adjust paths as needed
SHELL=/bin/bash
MESHCORE_STATS=/home/user/meshcore-stats
DIRENV=/usr/bin/direnv

# Every minute: collect companion data
* * * * * cd $MESHCORE_STATS && $DIRENV exec . python scripts/collect_companion.py

# Every 15 minutes: collect repeater data
*/15 * * * * cd $MESHCORE_STATS && $DIRENV exec . python scripts/collect_repeater.py

# Every 5 minutes: render site
*/5 * * * * cd $MESHCORE_STATS && $DIRENV exec . python scripts/render_site.py

# Daily at midnight: generate reports
0 0 * * * cd $MESHCORE_STATS && $DIRENV exec . python scripts/render_reports.py

# Monthly at 3 AM on the 1st: database maintenance
0 3 1 * * cd $MESHCORE_STATS && ./scripts/db_maintenance.sh
```

### Serving the Site

The static site is generated in the `out/` directory. You can serve it with any web server:

```bash
# Simple Python server for testing
cd out && python3 -m http.server 8080

# Or configure nginx/caddy to serve the out/ directory
```

## Project Structure

```
meshcore-stats/
├── requirements.txt
├── README.md
├── .envrc.example               # Example configuration (copy to .envrc)
├── .envrc                       # Your configuration (create this)
├── src/meshmon/
│   ├── __init__.py
│   ├── env.py                   # Environment variable parsing
│   ├── log.py                   # Logging helper
│   ├── meshcore_client.py       # MeshCore connection and commands
│   ├── db.py                    # SQLite database module
│   ├── retry.py                 # Retry logic and circuit breaker
│   ├── charts.py                # Matplotlib SVG chart generation
│   ├── html.py                  # HTML rendering
│   ├── reports.py               # Report generation
│   ├── metrics.py               # Metric type definitions
│   ├── battery.py               # Battery voltage to percentage conversion
│   ├── migrations/              # SQL schema migrations
│   │   ├── 001_initial_schema.sql
│   │   └── 002_eav_schema.sql
│   └── templates/               # Jinja2 HTML templates
├── scripts/
│   ├── collect_companion.py     # Collect metrics from companion node
│   ├── collect_repeater.py      # Collect metrics from repeater node
│   ├── render_charts.py         # Generate SVG charts from database
│   ├── render_site.py           # Generate static HTML site
│   ├── render_reports.py        # Generate monthly/yearly reports
│   └── db_maintenance.sh        # Database VACUUM/ANALYZE
├── data/
│   └── state/
│       ├── metrics.db           # SQLite database (WAL mode)
|
│ └── repeater_circuit.json
|
||||||
|
└── out/ # Generated site
|
||||||
|
├── .htaccess # Apache config (DirectoryIndex, caching)
|
||||||
|
├── styles.css # Stylesheet
|
||||||
|
├── chart-tooltip.js # Chart tooltip enhancement
|
||||||
|
├── day.html # Repeater pages (entry point)
|
||||||
|
├── week.html
|
||||||
|
├── month.html
|
||||||
|
├── year.html
|
||||||
|
├── companion/
|
||||||
|
│ ├── day.html
|
||||||
|
│ ├── week.html
|
||||||
|
│ ├── month.html
|
||||||
|
│ └── year.html
|
||||||
|
└── reports/
|
||||||
|
├── index.html
|
||||||
|
├── repeater/ # YYYY/MM reports
|
||||||
|
└── companion/
|
||||||
|
```
|
||||||
|
|
||||||
|
## Chart Features
|
||||||
|
|
||||||
|
Charts are rendered as inline SVG using matplotlib with the following features:
|
||||||
|
|
||||||
|
- **Theme Support**: Automatic light/dark mode via CSS `prefers-color-scheme`
|
||||||
|
- **Interactive Tooltips**: Hover to see exact values and timestamps
|
||||||
|
- **Data Point Indicator**: Visual marker shows position on the chart line
|
||||||
|
- **Mobile Support**: Touch-friendly tooltips
|
||||||
|
- **Statistics**: Min/Avg/Max values displayed below each chart
|
||||||
|
- **Period Views**: Day, week, month, and year time ranges
|
||||||
|
|
||||||
|
## Troubleshooting
|
||||||
|
|
||||||
|
### Serial Device Not Found
|
||||||
|
|
||||||
|
If you see "No serial ports found" or connection fails:
|
||||||
|
|
||||||
|
1. Check that your device is connected:
|
||||||
|
```bash
|
||||||
|
ls -la /dev/ttyUSB* /dev/ttyACM*
|
||||||
|
```
|
||||||
|
|
||||||
|
2. Check permissions (add user to dialout group):
|
||||||
|
```bash
|
||||||
|
sudo usermod -a -G dialout $USER
|
||||||
|
# Log out and back in for changes to take effect
|
||||||
|
```
|
||||||
|
|
||||||
|
3. Try specifying the port explicitly:
|
||||||
|
```bash
|
||||||
|
export MESH_SERIAL_PORT=/dev/ttyACM0
|
||||||
|
```
|
||||||
|
|
||||||
|
4. Check dmesg for device detection:
|
||||||
|
```bash
|
||||||
|
dmesg | tail -20
|
||||||
|
```
|
||||||
|
|
||||||
|
### Repeater Not Found
|
||||||
|
|
||||||
|
If the script cannot find the repeater contact:
|
||||||
|
|
||||||
|
1. The script will print all discovered contacts - check for the correct name
|
||||||
|
2. Verify REPEATER_NAME matches exactly (case-sensitive)
|
||||||
|
3. Try using REPEATER_KEY_PREFIX instead with the first 6-12 hex chars of the public key
|
||||||
|
|
||||||
|
### Circuit Breaker
|
||||||
|
|
||||||
|
If repeater collection shows "cooldown active":
|
||||||
|
|
||||||
|
1. This is normal after multiple failed remote requests
|
||||||
|
2. Wait for the cooldown period (default 1 hour) or reset manually:
|
||||||
|
```bash
|
||||||
|
rm data/state/repeater_circuit.json
|
||||||
|
```
|
||||||
|
|
||||||
|
## Environment Variables Reference
|
||||||
|
|
||||||
|
| Variable | Default | Description |
|
||||||
|
|----------|---------|-------------|
|
||||||
|
| **Connection** | | |
|
||||||
|
| `MESH_TRANSPORT` | serial | Connection type: serial, tcp, ble |
|
||||||
|
| `MESH_SERIAL_PORT` | (auto) | Serial port path |
|
||||||
|
| `MESH_SERIAL_BAUD` | 115200 | Baud rate |
|
||||||
|
| `MESH_TCP_HOST` | localhost | TCP host |
|
||||||
|
| `MESH_TCP_PORT` | 5000 | TCP port |
|
||||||
|
| `MESH_BLE_ADDR` | - | BLE device address |
|
||||||
|
| `MESH_BLE_PIN` | - | BLE PIN |
|
||||||
|
| `MESH_DEBUG` | 0 | Enable debug output |
|
||||||
|
| **Repeater Identity** | | |
|
||||||
|
| `REPEATER_NAME` | - | Repeater advertised name |
|
||||||
|
| `REPEATER_KEY_PREFIX` | - | Repeater public key prefix |
|
||||||
|
| `REPEATER_PASSWORD` | - | Repeater login password |
|
||||||
|
| `REPEATER_FETCH_ACL` | 0 | Also fetch ACL from repeater |
|
||||||
|
| **Display Names** | | |
|
||||||
|
| `REPEATER_DISPLAY_NAME` | Repeater Node | Display name for repeater in UI |
|
||||||
|
| `COMPANION_DISPLAY_NAME` | Companion Node | Display name for companion in UI |
|
||||||
|
| **Location** | | |
|
||||||
|
| `REPORT_LOCATION_NAME` | Your Location | Full location name for reports |
|
||||||
|
| `REPORT_LOCATION_SHORT` | Your Location | Short location for sidebar/meta |
|
||||||
|
| `REPORT_LAT` | 0.0 | Latitude in decimal degrees |
|
||||||
|
| `REPORT_LON` | 0.0 | Longitude in decimal degrees |
|
||||||
|
| `REPORT_ELEV` | 0.0 | Elevation |
|
||||||
|
| `REPORT_ELEV_UNIT` | m | Elevation unit: "m" or "ft" |
|
||||||
|
| **Hardware Info** | | |
|
||||||
|
| `REPEATER_HARDWARE` | LoRa Repeater | Repeater hardware model for sidebar |
|
||||||
|
| `COMPANION_HARDWARE` | LoRa Node | Companion hardware model for sidebar |
|
||||||
|
| **Radio Config** | | |
|
||||||
|
| `RADIO_FREQUENCY` | 869.618 MHz | Radio frequency for display |
|
||||||
|
| `RADIO_BANDWIDTH` | 62.5 kHz | Radio bandwidth for display |
|
||||||
|
| `RADIO_SPREAD_FACTOR` | SF8 | Spread factor for display |
|
||||||
|
| `RADIO_CODING_RATE` | CR8 | Coding rate for display |
|
||||||
|
| **Intervals** | | |
|
||||||
|
| `COMPANION_STEP` | 60 | Companion data collection interval (seconds) |
|
||||||
|
| `REPEATER_STEP` | 900 | Repeater data collection interval (seconds) |
|
||||||
|
| `REMOTE_TIMEOUT_S` | 10 | Remote request timeout |
|
||||||
|
| `REMOTE_RETRY_ATTEMPTS` | 2 | Max retry attempts |
|
||||||
|
| `REMOTE_RETRY_BACKOFF_S` | 4 | Retry backoff delay |
|
||||||
|
| `REMOTE_CB_FAILS` | 6 | Failures before circuit opens |
|
||||||
|
| `REMOTE_CB_COOLDOWN_S` | 3600 | Circuit breaker cooldown |
|
||||||
|
| **Paths** | | |
|
||||||
|
| `STATE_DIR` | ./data/state | State file path |
|
||||||
|
| `OUT_DIR` | ./out | Output site path |
|
||||||
|
|
||||||
|
## Metrics Reference
|
||||||
|
|
||||||
|
The system uses an EAV (Entity-Attribute-Value) schema where firmware field names are stored directly in the database. This allows new metrics to be captured automatically without schema changes.
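The EAV idea can be sketched with a minimal table. This is illustrative only: the table and column names below are assumptions, and the real schema lives in `src/meshmon/migrations/` and `src/meshmon/db.py`.

```python
import sqlite3

# Illustrative EAV layout (names are assumptions, not the real schema):
# one row per (timestamp, node role, metric name).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE samples ("
    "  ts INTEGER NOT NULL,"    # entity: one collection run
    "  role TEXT NOT NULL,"     # 'companion' or 'repeater'
    "  metric TEXT NOT NULL,"   # attribute: firmware field name
    "  value REAL NOT NULL,"    # value
    "  PRIMARY KEY (ts, role, metric))"
)

def insert_metrics(ts: int, role: str, metrics: dict[str, float]) -> int:
    """Store one collection run; returns the number of rows written."""
    rows = [(ts, role, k, float(v)) for k, v in metrics.items()]
    conn.executemany("INSERT OR REPLACE INTO samples VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    return len(rows)

# A new firmware field needs no schema change - it is just a new 'metric' value:
n = insert_metrics(1735500000, "repeater", {"bat": 4047.0, "noise_floor": -118.0})
```

Because the metric name is data rather than a column, a firmware update that adds a field shows up in the database on the next collection run.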

### Repeater Metrics

| Metric | Type | Display Unit | Description |
|--------|------|--------------|-------------|
| `bat` | Gauge | Voltage (V) | Battery voltage (stored in mV, displayed as V) |
| `bat_pct` | Gauge | Battery (%) | Battery percentage (computed from voltage) |
| `last_rssi` | Gauge | RSSI (dBm) | Signal strength of last packet |
| `last_snr` | Gauge | SNR (dB) | Signal-to-noise ratio |
| `noise_floor` | Gauge | dBm | Background RF noise |
| `uptime` | Gauge | Days | Time since reboot (seconds ÷ 86400) |
| `tx_queue_len` | Gauge | Queue depth | TX queue length |
| `nb_recv` | Counter | Packets/min | Total packets received |
| `nb_sent` | Counter | Packets/min | Total packets transmitted |
| `airtime` | Counter | Seconds/min | TX airtime rate |
| `rx_airtime` | Counter | Seconds/min | RX airtime rate |
| `flood_dups` | Counter | Packets/min | Flood duplicate packets |
| `direct_dups` | Counter | Packets/min | Direct duplicate packets |
| `sent_flood` | Counter | Packets/min | Flood packets transmitted |
| `recv_flood` | Counter | Packets/min | Flood packets received |
| `sent_direct` | Counter | Packets/min | Direct packets transmitted |
| `recv_direct` | Counter | Packets/min | Direct packets received |

### Companion Metrics

| Metric | Type | Display Unit | Description |
|--------|------|--------------|-------------|
| `battery_mv` | Gauge | Voltage (V) | Battery voltage (stored in mV, displayed as V) |
| `bat_pct` | Gauge | Battery (%) | Battery percentage (computed from voltage) |
| `contacts` | Gauge | Count | Known mesh nodes |
| `uptime_secs` | Gauge | Days | Time since reboot (seconds ÷ 86400) |
| `recv` | Counter | Packets/min | Total packets received |
| `sent` | Counter | Packets/min | Total packets transmitted |

### Metric Types

- **Gauge**: Instantaneous values stored as-is (battery voltage, RSSI, queue depth)
- **Counter**: Cumulative values where the rate of change is calculated (packets, airtime). Charts display per-minute rates.

## Database

Metrics are stored in a SQLite database at `data/state/metrics.db` with WAL mode enabled for concurrent read/write access.
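WAL mode is what lets the cron jobs overlap safely: the render scripts can keep reading while a collector is mid-write. A minimal sketch of such a connection setup (the actual setup lives in `src/meshmon/db.py` and may differ):

```python
import sqlite3
import tempfile
from pathlib import Path

def open_db(path: str) -> sqlite3.Connection:
    """Open a metrics database with WAL journaling enabled.

    Sketch only - illustrates the WAL setup the README describes,
    not the exact code in src/meshmon/db.py.
    """
    conn = sqlite3.connect(path, timeout=10.0)  # wait on a busy writer
    mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
    assert mode == "wal", f"WAL not enabled: {mode}"
    conn.execute("PRAGMA synchronous=NORMAL")  # common pairing with WAL
    return conn

# WAL requires a file-backed database, so demonstrate with a temp file
with tempfile.TemporaryDirectory() as tmp:
    conn = open_db(str(Path(tmp) / "metrics.db"))
    journal = conn.execute("PRAGMA journal_mode").fetchone()[0]
    conn.close()
```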

### Schema Migrations

Database migrations are stored as SQL files in `src/meshmon/migrations/` and are applied automatically when the database is initialized. Migration files follow the naming convention `NNN_description.sql` (e.g., `001_initial_schema.sql`).
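A minimal applier for that naming convention might look like this. It is a sketch only: using SQLite's `user_version` pragma for bookkeeping is an assumption, and `src/meshmon/db.py` may track applied migrations differently.

```python
import re
import sqlite3
import tempfile
from pathlib import Path

def apply_migrations(conn: sqlite3.Connection, migrations_dir: Path) -> int:
    """Apply NNN_description.sql files in ascending numeric order.

    Tracks progress via PRAGMA user_version (an assumption, not
    necessarily what src/meshmon/db.py does). Returns how many
    migrations were applied on this call.
    """
    applied = conn.execute("PRAGMA user_version").fetchone()[0]
    count = 0
    for path in sorted(migrations_dir.glob("*.sql")):
        m = re.match(r"(\d+)_", path.name)
        if not m or int(m.group(1)) <= applied:
            continue  # unnumbered file, or already applied
        conn.executescript(path.read_text())
        conn.execute(f"PRAGMA user_version = {int(m.group(1))}")
        count += 1
    return count

# Demo against a throwaway migration directory and an in-memory database
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "001_initial_schema.sql").write_text(
        "CREATE TABLE samples (ts INTEGER, metric TEXT, value REAL);"
    )
    conn = sqlite3.connect(":memory:")
    first = apply_migrations(conn, Path(tmp))   # applies 001
    second = apply_migrations(conn, Path(tmp))  # no-op: already applied
```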

## Public Instances

A list of publicly accessible MeshCore Stats installations. Want to add yours? [Open a pull request](https://github.com/jorijn/meshcore-stats/pulls)!

| URL | Hardware | Location |
|-----|----------|----------|
| [meshcore.jorijn.com](https://meshcore.jorijn.com) | SenseCAP Solar Node P1 Pro + 6.5dBi Mikrotik antenna | Oosterhout, The Netherlands |

## License

MIT

0	data/state/.gitkeep	Normal file
120	docs/firmware-responses.md	Normal file
@@ -0,0 +1,120 @@
# MeshCore Firmware Response Reference

This document captures the actual response structures from MeshCore firmware commands.
Use this as a reference when updating collectors or adding new metrics.

> **Last verified**: 2025-12-29

---

## Companion Node

The companion node is queried via USB serial using multiple commands.

### `get_stats_core()`

Core system statistics.

```python
{
    'battery_mv': 3895,   # Battery voltage in millivolts
    'uptime_secs': 126,   # Uptime in seconds
    'errors': 0,          # Error count
    'queue_len': 0        # Message queue length
}
```

### `get_stats_packets()`

Packet counters (cumulative since boot).

```python
{
    'recv': 20,       # Total packets received
    'sent': 0,        # Total packets sent
    'flood_tx': 0,    # Flood packets transmitted
    'direct_tx': 0,   # Direct packets transmitted
    'flood_rx': 20,   # Flood packets received
    'direct_rx': 0    # Direct packets received
}
```

### `get_stats_radio()`

Radio/RF statistics.

```python
{
    'noise_floor': -113,  # Noise floor in dBm
    'last_rssi': -123,    # RSSI of last received packet (dBm)
    'last_snr': -8.5,     # SNR of last received packet (dB)
    'tx_air_secs': 0,     # Cumulative TX airtime in seconds
    'rx_air_secs': 11     # Cumulative RX airtime in seconds
}
```

### `get_bat()`

Battery/storage info (note: not the main battery source for metrics).

```python
{
    'level': 4436,     # Battery level (unclear unit)
    'used_kb': 256,    # Used storage in KB
    'total_kb': 1404   # Total storage in KB
}
```

### `get_contacts()`

Returns a dict keyed by public key. Count via `len(payload)`.

---

## Repeater Node

The repeater is queried over LoRa using the binary protocol command `req_status_sync()`.

### `req_status_sync(contact)`

Returns a single dict with all status fields.

```python
{
    'bat': 4047,            # Battery voltage in millivolts
    'uptime': 1441998,      # Uptime in seconds
    'last_rssi': -63,       # RSSI of last received packet (dBm)
    'last_snr': 12.5,       # SNR of last received packet (dB)
    'noise_floor': -118,    # Noise floor in dBm
    'tx_queue_len': 0,      # TX queue depth
    'nb_recv': 221311,      # Total packets received (counter)
    'nb_sent': 93993,       # Total packets sent (counter)
    'airtime': 64461,       # TX airtime in seconds (counter)
    'rx_airtime': 146626,   # RX airtime in seconds (counter)
    'flood_dups': 59799,    # Duplicate flood packets (counter)
    'direct_dups': 8,       # Duplicate direct packets (counter)
    'sent_flood': 92207,    # Flood packets transmitted (counter)
    'recv_flood': 216960,   # Flood packets received (counter)
    'sent_direct': 1786,    # Direct packets transmitted (counter)
    'recv_direct': 4328     # Direct packets received (counter)
}
```

---

## Derived Metrics

These are computed at query time, not stored:

| Metric | Source | Computation |
|--------|--------|-------------|
| `bat_pct` | `bat` or `battery_mv` | `voltage_to_percentage(mv / 1000)` using 18650 Li-ion discharge curve |
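The conversion amounts to piecewise-linear interpolation over a discharge curve. The curve points below are hypothetical placeholders; the real curve data and implementation live in `src/meshmon/battery.py`.

```python
# Hypothetical 18650 discharge curve points (volts, percent) - the real
# table in src/meshmon/battery.py will differ.
CURVE = [(3.0, 0.0), (3.3, 5.0), (3.6, 20.0), (3.7, 45.0),
         (3.8, 60.0), (3.9, 75.0), (4.0, 88.0), (4.2, 100.0)]

def voltage_to_percentage(volts: float) -> float:
    """Interpolate battery percentage from voltage, clamped to [0, 100]."""
    if volts <= CURVE[0][0]:
        return 0.0
    if volts >= CURVE[-1][0]:
        return 100.0
    for (v0, p0), (v1, p1) in zip(CURVE, CURVE[1:]):
        if volts <= v1:
            # Linear interpolation within this curve segment
            return p0 + (p1 - p0) * (volts - v0) / (v1 - v0)
    return 100.0

pct = voltage_to_percentage(4047 / 1000)  # 'bat' arrives in millivolts
```

A lookup curve is used because Li-ion discharge is nonlinear: the cell spends most of its capacity between roughly 3.5 V and 4.0 V, so a naive linear voltage-to-percent mapping would be badly misleading.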

---

## Notes

- **Counters** reset to 0 on device reboot
- **Millivolts to volts**: Divide by 1000 (e.g., `bat: 4047` → `4.047V`)
- Repeater fields come from a single `req_status_sync()` call
- Companion fields are spread across multiple `get_stats_*()` calls

BIN	docs/screenshot-1.png	Normal file (binary file not shown; 1.4 MiB)
BIN	docs/screenshot-2.png	Normal file (binary file not shown; 1.1 MiB)
11	out/.htaccess	Executable file
@@ -0,0 +1,11 @@
DirectoryIndex day.html index.html

# Serve text files with UTF-8 charset
AddDefaultCharset UTF-8
AddCharset UTF-8 .txt .json .html .css .js .svg

<FilesMatch "\.(html|png|json|txt)$">
    Header set Cache-Control "no-cache, no-store, must-revalidate"
    Header set Pragma "no-cache"
    Header set Expires "0"
</FilesMatch>

33	release-please-config.json	Normal file
@@ -0,0 +1,33 @@

{
  "$schema": "https://raw.githubusercontent.com/googleapis/release-please/main/schemas/config.json",
  "release-type": "python",
  "packages": {
    ".": {
      "package-name": "meshcore-stats",
      "extra-files": [
        {
          "type": "generic",
          "path": "src/meshmon/__init__.py",
          "glob": false
        }
      ],
      "changelog-sections": [
        { "type": "feat", "section": "Features" },
        { "type": "fix", "section": "Bug Fixes" },
        { "type": "perf", "section": "Performance Improvements" },
        { "type": "revert", "section": "Reverts" },
        { "type": "docs", "section": "Documentation", "hidden": false },
        { "type": "style", "section": "Styles", "hidden": false },
        { "type": "chore", "section": "Miscellaneous Chores", "hidden": false },
        { "type": "refactor", "section": "Code Refactoring", "hidden": false },
        { "type": "test", "section": "Tests", "hidden": false },
        { "type": "build", "section": "Build System", "hidden": false },
        { "type": "ci", "section": "Continuous Integration", "hidden": false }
      ],
      "bump-minor-pre-major": true,
      "bump-patch-for-minor-pre-major": true
    }
  },
  "include-component-in-tag": false,
  "include-v-in-tag": true
}

5	requirements.txt	Normal file
@@ -0,0 +1,5 @@

meshcore>=2.2.3
meshcore-cli>=1.0.0
pyserial>=3.5
jinja2>=3.1.0
matplotlib>=3.8.0

213	scripts/collect_companion.py	Executable file
@@ -0,0 +1,213 @@

#!/usr/bin/env python3
"""
Phase 1: Collect data from companion node.

Connects to the local companion node via serial and collects:
- Device info
- Battery status
- Time
- Self telemetry
- Custom vars
- Contacts list

Outputs:
- Concise summary to stdout
- Metrics written to SQLite database (EAV schema)
"""

import asyncio
import sys
import time
from pathlib import Path

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent / "src"))

from meshmon.env import get_config
from meshmon import log
from meshmon.meshcore_client import connect_from_env, run_command
from meshmon.db import init_db, insert_metrics


async def collect_companion() -> int:
    """
    Collect data from companion node.

    Returns:
        Exit code (0 = success, 1 = connection failed)
    """
    cfg = get_config()
    ts = int(time.time())

    log.debug("Connecting to companion node...")
    mc = await connect_from_env()

    if mc is None:
        log.error("Failed to connect to companion node")
        return 1

    # Metrics to insert (firmware field names)
    metrics: dict[str, float] = {}
    commands_succeeded = 0

    # Commands are accessed via mc.commands
    cmd = mc.commands

    try:
        # send_appstart (already called during connect, but call again to get self_info)
        ok, evt_type, payload, err = await run_command(
            mc, cmd.send_appstart(), "send_appstart"
        )
        if ok:
            commands_succeeded += 1
            log.debug(f"appstart: {evt_type}")
        else:
            log.error(f"appstart failed: {err}")

        # send_device_query
        ok, evt_type, payload, err = await run_command(
            mc, cmd.send_device_query(), "send_device_query"
        )
        if ok:
            commands_succeeded += 1
            log.debug(f"device_query: {payload}")
        else:
            log.error(f"device_query failed: {err}")

        # get_bat
        ok, evt_type, payload, err = await run_command(
            mc, cmd.get_bat(), "get_bat"
        )
        if ok:
            commands_succeeded += 1
            log.debug(f"get_bat: {payload}")
        else:
            log.error(f"get_bat failed: {err}")

        # get_time
        ok, evt_type, payload, err = await run_command(
            mc, cmd.get_time(), "get_time"
        )
        if ok:
            commands_succeeded += 1
            log.debug(f"get_time: {payload}")
        else:
            log.error(f"get_time failed: {err}")

        # get_self_telemetry
        ok, evt_type, payload, err = await run_command(
            mc, cmd.get_self_telemetry(), "get_self_telemetry"
        )
        if ok:
            commands_succeeded += 1
            log.debug(f"get_self_telemetry: {payload}")
        else:
            log.error(f"get_self_telemetry failed: {err}")

        # get_custom_vars
        ok, evt_type, payload, err = await run_command(
            mc, cmd.get_custom_vars(), "get_custom_vars"
        )
        if ok:
            commands_succeeded += 1
            log.debug(f"get_custom_vars: {payload}")
        else:
            log.debug(f"get_custom_vars failed: {err}")

        # get_contacts - count contacts
        ok, evt_type, payload, err = await run_command(
            mc, cmd.get_contacts(), "get_contacts"
        )
        if ok:
            commands_succeeded += 1
            contacts_count = len(payload) if payload else 0
            metrics["contacts"] = float(contacts_count)
            log.debug(f"get_contacts: found {contacts_count} contacts")
        else:
            log.error(f"get_contacts failed: {err}")

        # Get statistics - these contain the main metrics
        # Core stats (battery_mv, uptime_secs, errors, queue_len)
        ok, evt_type, payload, err = await run_command(
            mc, cmd.get_stats_core(), "get_stats_core"
        )
        if ok and payload and isinstance(payload, dict):
            commands_succeeded += 1
            # Insert all numeric fields from stats_core
            for key, value in payload.items():
                if isinstance(value, (int, float)):
                    metrics[key] = float(value)
            log.debug(f"stats_core: {payload}")

        # Radio stats (noise_floor, last_rssi, last_snr, tx_air_secs, rx_air_secs)
        ok, evt_type, payload, err = await run_command(
            mc, cmd.get_stats_radio(), "get_stats_radio"
        )
        if ok and payload and isinstance(payload, dict):
            commands_succeeded += 1
            for key, value in payload.items():
                if isinstance(value, (int, float)):
                    metrics[key] = float(value)
            log.debug(f"stats_radio: {payload}")

        # Packet stats (recv, sent, flood_tx, direct_tx, flood_rx, direct_rx)
        ok, evt_type, payload, err = await run_command(
            mc, cmd.get_stats_packets(), "get_stats_packets"
        )
        if ok and payload and isinstance(payload, dict):
            commands_succeeded += 1
            for key, value in payload.items():
                if isinstance(value, (int, float)):
                    metrics[key] = float(value)
            log.debug(f"stats_packets: {payload}")

    except Exception as e:
        log.error(f"Error during collection: {e}")

    finally:
        # Close connection
        if hasattr(mc, "disconnect"):
            try:
                await mc.disconnect()
            except Exception:
                pass

    # Print summary
    summary_parts = [f"ts={ts}"]
    if "battery_mv" in metrics:
        bat_v = metrics["battery_mv"] / 1000.0
        summary_parts.append(f"bat={bat_v:.2f}V")
    if "contacts" in metrics:
        summary_parts.append(f"contacts={int(metrics['contacts'])}")
    if "recv" in metrics:
        summary_parts.append(f"rx={int(metrics['recv'])}")
    if "sent" in metrics:
        summary_parts.append(f"tx={int(metrics['sent'])}")

    log.info(f"Companion: {', '.join(summary_parts)}")

    # Write metrics to database
    if commands_succeeded > 0 and metrics:
        try:
            inserted = insert_metrics(ts=ts, role="companion", metrics=metrics)
            log.debug(f"Inserted {inserted} metrics to database (ts={ts})")
        except Exception as e:
            log.error(f"Failed to write metrics to database: {e}")
            return 1
        return 0
    else:
        log.error("No commands succeeded or no metrics collected")
        return 1


def main():
    """Entry point."""
    # Ensure database is initialized
    init_db()

    exit_code = asyncio.run(collect_companion())
    sys.exit(exit_code)


if __name__ == "__main__":
    main()
307	scripts/collect_repeater.py	Executable file
@@ -0,0 +1,307 @@

#!/usr/bin/env python3
"""
Phase 1: Collect data from remote repeater node.

Connects to the local companion node, finds the repeater contact,
and queries it over LoRa using binary protocol.

Features:
- Circuit breaker to avoid spamming LoRa
- Retry with backoff
- Timeout handling

Outputs:
- Concise summary to stdout
- Metrics written to SQLite database (EAV schema)
"""

import asyncio
import sys
import time
from pathlib import Path
from typing import Any, Callable, Coroutine, Optional

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent / "src"))

from meshmon.env import get_config
from meshmon import log
from meshmon.meshcore_client import (
    connect_from_env,
    run_command,
    get_contact_by_name,
    get_contact_by_key_prefix,
    extract_contact_info,
    list_contacts_summary,
)
from meshmon.db import init_db, insert_metrics
from meshmon.retry import get_repeater_circuit_breaker, with_retries


async def find_repeater_contact(mc: Any) -> Optional[Any]:
    """
    Find the repeater contact by name or key prefix.

    Returns:
        Contact dict or None
    """
    cfg = get_config()

    # Get all contacts first (this populates mc.contacts)
    ok, evt_type, payload, err = await run_command(
        mc, mc.commands.get_contacts(), "get_contacts"
    )
    if not ok:
        log.error(f"Failed to get contacts: {err}")
        return None

    # payload is a dict keyed by public key, mc.contacts is also populated
    # The get_contact_by_name method searches mc._contacts
    contacts_dict = mc.contacts if hasattr(mc, "contacts") else {}
    if isinstance(payload, dict):
        contacts_dict = payload

    # Try by name first using the helper (searches mc._contacts)
    if cfg.repeater_name:
        log.debug(f"Looking for repeater by name: {cfg.repeater_name}")
        contact = get_contact_by_name(mc, cfg.repeater_name)
        if contact:
            return contact

        # Manual search in payload dict
        for pk, c in contacts_dict.items():
            if isinstance(c, dict):
                name = c.get("adv_name", "")
                if name and name.lower() == cfg.repeater_name.lower():
                    return c

    # Try by key prefix
    if cfg.repeater_key_prefix:
        log.debug(f"Looking for repeater by key prefix: {cfg.repeater_key_prefix}")
        contact = get_contact_by_key_prefix(mc, cfg.repeater_key_prefix)
        if contact:
            return contact

        # Manual search
        prefix = cfg.repeater_key_prefix.lower()
        for pk, c in contacts_dict.items():
            if pk.lower().startswith(prefix):
                return c

    # Not found - print available contacts
    log.error("Repeater contact not found")
    log.info("Available contacts:")
    for pk, c in contacts_dict.items():
        if isinstance(c, dict):
            name = c.get("adv_name", c.get("name", "unnamed"))
            key = pk[:12] if pk else ""
            log.info(f"  - {name} (key: {key}...)")

    return None


async def query_repeater_with_retry(
    mc: Any,
    contact: Any,
    command_name: str,
    command_coro_fn: Callable[[], Coroutine[Any, Any, Any]],
) -> tuple[bool, Optional[dict], Optional[str]]:
    """
    Query repeater with retry logic.

    The binary req_*_sync methods return the payload directly (or None on failure),
    not an Event object.

    Args:
        mc: MeshCore instance
        contact: Contact object
        command_name: Name for logging
        command_coro_fn: Function that returns command coroutine

    Returns:
        (success, payload, error_message)
    """
    cfg = get_config()

    async def do_query():
        result = await command_coro_fn()
        if result is None:
            raise Exception("No response received")
        return result

    success, result, exc = await with_retries(
        do_query,
        attempts=cfg.remote_retry_attempts,
        backoff_s=cfg.remote_retry_backoff_s,
        name=command_name,
    )

    if success:
        return (True, result, None)
    else:
        return (False, None, str(exc) if exc else "Failed")


async def collect_repeater() -> int:
    """
    Collect data from remote repeater node.

    Returns:
        Exit code (0 = success, 1 = error)
    """
    cfg = get_config()
    ts = int(time.time())

    # Check circuit breaker first
    cb = get_repeater_circuit_breaker()

    if cb.is_open():
        remaining = cb.cooldown_remaining()
        log.warn(f"Circuit breaker open, cooldown active ({remaining}s remaining)")
        # Skip collection - no metrics to write
        return 0

    # Connect to companion
    log.debug("Connecting to companion node...")
    mc = await connect_from_env()

    if mc is None:
        log.error("Failed to connect to companion node")
        return 1

    # Metrics to insert (firmware field names from req_status_sync)
    metrics: dict[str, float] = {}
    node_name = "unknown"
    status_ok = False

    # Commands are accessed via mc.commands
    cmd = mc.commands

    try:
        # Initialize (appstart already called during connect)
        ok, evt_type, payload, err = await run_command(
            mc, cmd.send_appstart(), "send_appstart"
        )
        if not ok:
            log.error(f"appstart failed: {err}")

        # Find repeater contact
        contact = await find_repeater_contact(mc)

        if contact is None:
            log.error("Cannot find repeater contact")
            return 1

        # Store contact info
        contact_info = extract_contact_info(contact)
        node_name = contact_info.get("adv_name", "unknown")

        log.debug(f"Found repeater: {node_name}")

        # Optional login (if command exists)
        if cfg.repeater_password and hasattr(cmd, "send_login"):
            log.debug("Attempting login...")
            try:
                ok, evt_type, payload, err = await run_command(
                    mc,
                    cmd.send_login(contact, cfg.repeater_password),
                    "send_login",
                )
                if ok:
                    log.debug("Login successful")
                else:
                    log.debug(f"Login failed or not supported: {err}")
            except Exception as e:
                log.debug(f"Login not supported: {e}")

        # Query status (using _sync version which returns payload directly)
        # Use timeout=0 to let the device suggest timeout, with min_timeout as floor
        log.debug("Querying repeater status...")
|
||||||
|
success, payload, err = await query_repeater_with_retry(
|
||||||
|
mc,
|
||||||
|
contact,
|
||||||
|
"req_status_sync",
|
||||||
|
lambda: cmd.req_status_sync(contact, timeout=0, min_timeout=cfg.remote_timeout_s),
|
||||||
|
)
|
||||||
|
if success and payload and isinstance(payload, dict):
|
||||||
|
status_ok = True
|
||||||
|
# Insert all numeric fields from status response
|
||||||
|
for key, value in payload.items():
|
||||||
|
if isinstance(value, (int, float)):
|
||||||
|
metrics[key] = float(value)
|
||||||
|
log.debug(f"req_status_sync: {payload}")
|
||||||
|
else:
|
||||||
|
log.warn(f"req_status_sync failed: {err}")
|
||||||
|
|
||||||
|
# Optional ACL query (using _sync version)
|
||||||
|
if cfg.repeater_fetch_acl:
|
||||||
|
log.debug("Querying repeater ACL...")
|
||||||
|
success, payload, err = await query_repeater_with_retry(
|
||||||
|
mc,
|
||||||
|
contact,
|
||||||
|
"req_acl_sync",
|
||||||
|
lambda: cmd.req_acl_sync(contact, timeout=0, min_timeout=cfg.remote_timeout_s),
|
||||||
|
)
|
||||||
|
if success:
|
||||||
|
log.debug(f"req_acl_sync: {payload}")
|
||||||
|
else:
|
||||||
|
log.debug(f"req_acl_sync failed: {err}")
|
||||||
|
|
||||||
|
# Update circuit breaker
|
||||||
|
if status_ok:
|
||||||
|
cb.record_success()
|
||||||
|
log.debug("Circuit breaker: recorded success")
|
||||||
|
else:
|
||||||
|
cb.record_failure(cfg.remote_cb_fails, cfg.remote_cb_cooldown_s)
|
||||||
|
log.debug(f"Circuit breaker: recorded failure ({cb.consecutive_failures}/{cfg.remote_cb_fails})")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
log.error(f"Error during collection: {e}")
|
||||||
|
cb.record_failure(cfg.remote_cb_fails, cfg.remote_cb_cooldown_s)
|
||||||
|
|
||||||
|
finally:
|
||||||
|
# Close connection
|
||||||
|
if hasattr(mc, "disconnect"):
|
||||||
|
try:
|
||||||
|
await mc.disconnect()
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
|
# Print summary
|
||||||
|
summary_parts = [f"ts={ts}"]
|
||||||
|
if "bat" in metrics:
|
||||||
|
bat_v = metrics["bat"] / 1000.0
|
||||||
|
summary_parts.append(f"bat={bat_v:.2f}V")
|
||||||
|
if "uptime" in metrics:
|
||||||
|
uptime_days = metrics["uptime"] // 86400
|
||||||
|
summary_parts.append(f"uptime={int(uptime_days)}d")
|
||||||
|
if "nb_recv" in metrics:
|
||||||
|
summary_parts.append(f"rx={int(metrics['nb_recv'])}")
|
||||||
|
if "nb_sent" in metrics:
|
||||||
|
summary_parts.append(f"tx={int(metrics['nb_sent'])}")
|
||||||
|
|
||||||
|
log.info(f"Repeater ({node_name}): {', '.join(summary_parts)}")
|
||||||
|
|
||||||
|
# Write metrics to database
|
||||||
|
if status_ok and metrics:
|
||||||
|
try:
|
||||||
|
inserted = insert_metrics(ts=ts, role="repeater", metrics=metrics)
|
||||||
|
log.debug(f"Inserted {inserted} metrics to database (ts={ts})")
|
||||||
|
except Exception as e:
|
||||||
|
log.error(f"Failed to write metrics to database: {e}")
|
||||||
|
return 1
|
||||||
|
|
||||||
|
return 0 if status_ok else 1
|
||||||
|
|
||||||
|
|
||||||
|
def main():
|
||||||
|
"""Entry point."""
|
||||||
|
# Ensure database is initialized
|
||||||
|
init_db()
|
||||||
|
|
||||||
|
exit_code = asyncio.run(collect_repeater())
|
||||||
|
sys.exit(exit_code)
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
main()
|
||||||
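The collector relies on a `with_retries` helper defined elsewhere in the repo. Its exact implementation is not shown in this diff, but the call sites above pin down its contract: run an async callable up to `attempts` times, sleep `backoff_s` between tries, and return a `(success, result, last_exception)` triple. A minimal sketch of that contract (an assumption about the helper's behavior, not the repo's actual code):

```python
import asyncio

async def with_retries(fn, attempts=3, backoff_s=0.01, name="op"):
    """Run fn() up to `attempts` times, sleeping between failures.
    Returns (success, result, last_exception), matching the callers above."""
    last_exc = None
    for i in range(attempts):
        try:
            return True, await fn(), None
        except Exception as e:
            last_exc = e
            if i < attempts - 1:
                await asyncio.sleep(backoff_s)
    return False, None, last_exc

async def demo():
    calls = {"n": 0}
    async def flaky():
        # Fails twice (like a missed LoRa response), then succeeds
        calls["n"] += 1
        if calls["n"] < 3:
            raise Exception("No response received")
        return {"bat": 3970}
    ok, payload, exc = await with_retries(flaky, attempts=3)
    print(ok, payload)  # → True {'bat': 3970}

asyncio.run(demo())
```

This is why `do_query` raises on a `None` payload: the retry wrapper only re-attempts on exceptions, so a silent `None` from a `req_*_sync` call must be converted into one.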
43  scripts/db_maintenance.sh  Executable file
@@ -0,0 +1,43 @@
#!/bin/bash
# Database maintenance script
#
# Runs VACUUM and ANALYZE on the SQLite database to compact it and
# update query statistics.
#
# Recommended: Run monthly via cron
# Example crontab entry:
# 0 3 1 * * cd /home/jorijn/apps/meshcore-stats && ./scripts/db_maintenance.sh
#
# This script will:
# 1. Run VACUUM to compact the database and reclaim space
# 2. Run ANALYZE to update query optimizer statistics
#
# Note: VACUUM acquires an exclusive lock internally. Other processes
# using busy_timeout will wait for it to complete.

set -e

SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
PROJECT_DIR="$(dirname "$SCRIPT_DIR")"
DB_PATH="$PROJECT_DIR/data/state/metrics.db"

# Check if database exists
if [ ! -f "$DB_PATH" ]; then
    echo "Database not found: $DB_PATH"
    exit 1
fi

echo "$(date '+%Y-%m-%d %H:%M:%S') Starting database maintenance..."
echo "Database: $DB_PATH"

echo "Running VACUUM..."
sqlite3 "$DB_PATH" "VACUUM;"

echo "Running ANALYZE..."
sqlite3 "$DB_PATH" "ANALYZE;"

# Get database size
DB_SIZE=$(du -h "$DB_PATH" | cut -f1)
echo "Database size after maintenance: $DB_SIZE"

echo "$(date '+%Y-%m-%d %H:%M:%S') Maintenance complete"
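The `busy_timeout` note in the script header refers to an SQLite pragma set by the writers: a connection with `busy_timeout` configured blocks for up to that many milliseconds when another process (such as this VACUUM) holds the lock, instead of failing immediately with "database is locked". A minimal illustration with Python's stdlib `sqlite3` (the 5000 ms value here is an arbitrary example, not taken from this repo's config):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Wait up to 5 seconds for a competing lock (e.g. a running VACUUM)
# to clear before raising "database is locked".
conn.execute("PRAGMA busy_timeout = 5000")
timeout_ms = conn.execute("PRAGMA busy_timeout").fetchone()[0]
print(timeout_ms)  # → 5000
conn.close()
```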
51  scripts/render_charts.py  Executable file
@@ -0,0 +1,51 @@
#!/usr/bin/env python3
"""
Phase 2: Render charts from SQLite database.

Generates SVG charts for day/week/month/year for both companion and repeater
using matplotlib, reading directly from the SQLite metrics database.
"""

import sys
from pathlib import Path

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent / "src"))

from meshmon.db import init_db, get_metric_count
from meshmon import log
from meshmon.charts import render_all_charts, save_chart_stats


def main():
    """Render all charts and save statistics."""
    # Ensure database is initialized
    init_db()

    log.info("Rendering charts from database...")

    # Check if data exists before rendering
    companion_count = get_metric_count("companion")
    repeater_count = get_metric_count("repeater")

    # Companion charts
    if companion_count > 0:
        charts, stats = render_all_charts("companion")
        save_chart_stats("companion", stats)
        log.info(f"Rendered {len(charts)} companion charts ({companion_count} data points)")
    else:
        log.warn("No companion metrics in database")

    # Repeater charts
    if repeater_count > 0:
        charts, stats = render_all_charts("repeater")
        save_chart_stats("repeater", stats)
        log.info(f"Rendered {len(charts)} repeater charts ({repeater_count} data points)")
    else:
        log.warn("No repeater metrics in database")

    log.info("Chart rendering complete")


if __name__ == "__main__":
    main()
304  scripts/render_reports.py  Executable file
@@ -0,0 +1,304 @@
#!/usr/bin/env python3
"""
Phase 4: Render reports from SQLite database.

Generates monthly and yearly statistics reports in HTML, TXT, and JSON
formats for both repeater and companion nodes.

Output structure:
out/reports/
    index.html              # Reports listing
    repeater/
        2025/
            index.html      # Yearly report (HTML)
            report.txt      # Yearly report (TXT)
            report.json     # Yearly report (JSON)
            12/
                index.html  # Monthly report (HTML)
                report.txt  # Monthly report (TXT)
                report.json # Monthly report (JSON)
    companion/
        ...                 # Same structure
"""

import calendar
import json
import sys
from pathlib import Path
from typing import Optional

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent / "src"))

from meshmon.db import init_db
from meshmon.env import get_config
from meshmon import log
from meshmon.reports import (
    LocationInfo,
    aggregate_monthly,
    aggregate_yearly,
    format_monthly_txt,
    format_yearly_txt,
    get_available_periods,
    monthly_to_json,
    yearly_to_json,
)
from meshmon.html import render_report_page, render_reports_index


def safe_write(path: Path, content: str) -> bool:
    """Write content to file with error handling.

    Args:
        path: File path to write to
        content: Content to write

    Returns:
        True if write succeeded, False otherwise
    """
    try:
        path.write_text(content, encoding="utf-8")
        return True
    except IOError as e:
        log.error(f"Failed to write {path}: {e}")
        return False


def get_node_name(role: str) -> str:
    """Get display name for a node role from configuration."""
    cfg = get_config()
    if role == "repeater":
        return cfg.repeater_display_name
    elif role == "companion":
        return cfg.companion_display_name
    return role.capitalize()


def get_location() -> LocationInfo:
    """Get location info from config."""
    cfg = get_config()
    return LocationInfo(
        name=cfg.report_location_name,
        lat=cfg.report_lat,
        lon=cfg.report_lon,
        elev=cfg.report_elev,
    )


def render_monthly_report(
    role: str,
    year: int,
    month: int,
    prev_period: Optional[tuple[int, int]] = None,
    next_period: Optional[tuple[int, int]] = None,
) -> None:
    """Render monthly report in all formats.

    Args:
        role: "companion" or "repeater"
        year: Report year
        month: Report month (1-12)
        prev_period: (year, month) of previous report, or None
        next_period: (year, month) of next report, or None
    """
    cfg = get_config()
    node_name = get_node_name(role)
    location = get_location()

    log.info(f"Aggregating {role} monthly report for {year}-{month:02d}...")
    agg = aggregate_monthly(role, year, month)

    if not agg.daily:
        log.warn(f"No data for {role} {year}-{month:02d}, skipping")
        return

    # Create output directory
    out_dir = cfg.out_dir / "reports" / role / str(year) / f"{month:02d}"
    out_dir.mkdir(parents=True, exist_ok=True)

    # Build prev/next navigation
    prev_report = None
    next_report = None
    if prev_period:
        py, pm = prev_period
        prev_report = {
            "url": f"/reports/{role}/{py}/{pm:02d}/",
            "label": f"{calendar.month_abbr[pm]} {py}",
        }
    if next_period:
        ny, nm = next_period
        next_report = {
            "url": f"/reports/{role}/{ny}/{nm:02d}/",
            "label": f"{calendar.month_abbr[nm]} {ny}",
        }

    # HTML
    html = render_report_page(agg, node_name, "monthly", prev_report, next_report)
    safe_write(out_dir / "index.html", html)

    # TXT (WeeWX-style)
    txt = format_monthly_txt(agg, node_name, location)
    safe_write(out_dir / "report.txt", txt)

    # JSON
    json_data = monthly_to_json(agg)
    safe_write(out_dir / "report.json", json.dumps(json_data, indent=2))

    log.debug(f"Wrote monthly report: {out_dir}")


def render_yearly_report(
    role: str,
    year: int,
    prev_year: Optional[int] = None,
    next_year: Optional[int] = None,
) -> None:
    """Render yearly report in all formats.

    Args:
        role: "companion" or "repeater"
        year: Report year
        prev_year: Previous year with data, or None
        next_year: Next year with data, or None
    """
    cfg = get_config()
    node_name = get_node_name(role)
    location = get_location()

    log.info(f"Aggregating {role} yearly report for {year}...")
    agg = aggregate_yearly(role, year)

    if not agg.monthly:
        log.warn(f"No data for {role} {year}, skipping")
        return

    # Create output directory
    out_dir = cfg.out_dir / "reports" / role / str(year)
    out_dir.mkdir(parents=True, exist_ok=True)

    # Build prev/next navigation
    prev_report = None
    next_report = None
    if prev_year:
        prev_report = {
            "url": f"/reports/{role}/{prev_year}/",
            "label": str(prev_year),
        }
    if next_year:
        next_report = {
            "url": f"/reports/{role}/{next_year}/",
            "label": str(next_year),
        }

    # HTML
    html = render_report_page(agg, node_name, "yearly", prev_report, next_report)
    safe_write(out_dir / "index.html", html)

    # TXT (WeeWX-style)
    txt = format_yearly_txt(agg, node_name, location)
    safe_write(out_dir / "report.txt", txt)

    # JSON
    json_data = yearly_to_json(agg)
    safe_write(out_dir / "report.json", json.dumps(json_data, indent=2))

    log.debug(f"Wrote yearly report: {out_dir}")


def build_reports_index_data() -> list[dict]:
    """Build data structure for reports index page.

    Returns:
        List of section dicts with 'role' and 'years' keys
    """
    sections = []

    for role in ["repeater", "companion"]:
        periods = get_available_periods(role)
        if not periods:
            sections.append({"role": role, "years": []})
            continue

        # Group by year
        years_data = {}
        for year, month in periods:
            if year not in years_data:
                years_data[year] = []
            years_data[year].append({
                "month": month,
                "name": calendar.month_name[month],
            })

        # Build years list, sorted descending
        years = []
        for year in sorted(years_data.keys(), reverse=True):
            years.append({
                "year": year,
                "months": sorted(years_data[year], key=lambda m: m["month"]),
            })

        sections.append({"role": role, "years": years})

    return sections


def main():
    """Generate all statistics reports."""
    # Ensure database is initialized
    init_db()

    cfg = get_config()

    log.info("Generating reports from database...")

    # Ensure base reports directory exists
    (cfg.out_dir / "reports").mkdir(parents=True, exist_ok=True)

    total_monthly = 0
    total_yearly = 0

    for role in ["repeater", "companion"]:
        periods = get_available_periods(role)
        if not periods:
            log.info(f"No data found for {role}")
            continue

        log.info(f"Found {len(periods)} months of data for {role}")

        # Sort periods chronologically for prev/next navigation
        sorted_periods = sorted(periods)

        # Render monthly reports with prev/next links
        for i, (year, month) in enumerate(sorted_periods):
            prev_period = sorted_periods[i - 1] if i > 0 else None
            next_period = sorted_periods[i + 1] if i < len(sorted_periods) - 1 else None
            render_monthly_report(role, year, month, prev_period, next_period)
            total_monthly += 1

        # Get unique years
        years = sorted(set(y for y, m in periods))

        # Render yearly reports with prev/next links
        for i, year in enumerate(years):
            prev_year = years[i - 1] if i > 0 else None
            next_year = years[i + 1] if i < len(years) - 1 else None
            render_yearly_report(role, year, prev_year, next_year)
            total_yearly += 1

    # Render reports index
    log.info("Rendering reports index...")
    sections = build_reports_index_data()
    index_html = render_reports_index(sections)
    safe_write(cfg.out_dir / "reports" / "index.html", index_html)

    log.info(
        f"Generated {total_monthly} monthly + {total_yearly} yearly reports "
        f"to {cfg.out_dir / 'reports'}"
    )
    log.info("Report generation complete")


if __name__ == "__main__":
    main()
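The prev/next report linking in `main()` is a windowed walk over a sorted list: each element gets its predecessor and successor, with `None` at the ends. Extracted as a stand-alone sketch (the helper name `neighbors` is ours, not the repo's):

```python
def neighbors(items):
    """For each element of a list, yield (prev, current, next),
    with None at the boundaries."""
    for i, cur in enumerate(items):
        prev = items[i - 1] if i > 0 else None
        nxt = items[i + 1] if i < len(items) - 1 else None
        yield prev, cur, nxt

periods = [(2025, 10), (2025, 11), (2025, 12)]
for prev, cur, nxt in neighbors(periods):
    print(prev, cur, nxt)
# → None (2025, 10) (2025, 11)
#   (2025, 10) (2025, 11) (2025, 12)
#   (2025, 11) (2025, 12) None
```

Sorting the periods first is what makes "previous" mean "chronologically previous"; the same pattern is reused for the yearly reports with plain integers.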
52  scripts/render_site.py  Executable file
@@ -0,0 +1,52 @@
#!/usr/bin/env python3
"""
Phase 3: Render static HTML site.

Generates static HTML pages using latest metrics from SQLite database
and rendered charts. Creates day/week/month/year pages for both
companion and repeater nodes.
"""

import sys
from pathlib import Path

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent / "src"))

from meshmon.db import init_db, get_latest_metrics
from meshmon.env import get_config
from meshmon import log
from meshmon.html import write_site


def main():
    """Render static site."""
    # Ensure database is initialized
    init_db()

    cfg = get_config()

    log.info("Rendering static site...")

    # Load latest metrics from database
    companion_row = get_latest_metrics("companion")
    if companion_row:
        log.debug(f"Loaded companion metrics (ts={companion_row.get('ts')})")
    else:
        log.warn("No companion metrics found in database")

    repeater_row = get_latest_metrics("repeater")
    if repeater_row:
        log.debug(f"Loaded repeater metrics (ts={repeater_row.get('ts')})")
    else:
        log.warn("No repeater metrics found in database")

    # Write site
    pages = write_site(companion_row, repeater_row)

    log.info(f"Wrote {len(pages)} pages to {cfg.out_dir}")
    log.info("Site rendering complete")


if __name__ == "__main__":
    main()
3  src/meshmon/__init__.py  Normal file
@@ -0,0 +1,3 @@
"""MeshCore network monitoring library."""

__version__ = "0.1.0"  # x-release-please-version
50  src/meshmon/battery.py  Normal file
@@ -0,0 +1,50 @@
"""Battery voltage to percentage conversion for 18650 Li-ion cells."""


# Voltage to percentage lookup table for 18650 Li-ion cells
# Based on typical discharge curve: 4.20V = 100%, 3.00V = 0%
# Source: https://www.benzoenergy.com/blog/post/what-is-the-relationship-between-voltage-and-capacity-of-18650-li-ion-battery.html
VOLTAGE_TABLE = [
    (4.20, 100),
    (4.06, 90),
    (3.98, 80),
    (3.92, 70),
    (3.87, 60),
    (3.82, 50),
    (3.79, 40),
    (3.77, 30),
    (3.74, 20),
    (3.68, 10),
    (3.45, 5),
    (3.00, 0),
]


def voltage_to_percentage(voltage: float) -> float:
    """
    Convert 18650 Li-ion battery voltage to percentage.

    Uses piecewise linear interpolation between known points
    on the discharge curve for accuracy.

    Args:
        voltage: Battery voltage in volts

    Returns:
        Estimated battery percentage (0-100)
    """
    if voltage >= 4.20:
        return 100.0
    if voltage <= 3.00:
        return 0.0

    # Find the two points to interpolate between
    for i in range(len(VOLTAGE_TABLE) - 1):
        v_high, p_high = VOLTAGE_TABLE[i]
        v_low, p_low = VOLTAGE_TABLE[i + 1]
        if v_low <= voltage <= v_high:
            # Linear interpolation
            ratio = (voltage - v_low) / (v_high - v_low)
            return p_low + ratio * (p_high - p_low)

    return 0.0
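To make the interpolation concrete: 3.845 V sits exactly halfway between the 3.82 V/50% and 3.87 V/60% table rows, so the estimate is 55%. A self-contained sketch of the same piecewise-linear lookup on an abbreviated table (the full 12-row table above works identically):

```python
# Abbreviated discharge table: (volts, percent), descending
TABLE = [(4.20, 100), (3.87, 60), (3.82, 50), (3.00, 0)]

def interp(voltage):
    """Piecewise-linear voltage-to-percent lookup, clamped at both ends."""
    if voltage >= TABLE[0][0]:
        return 100.0
    if voltage <= TABLE[-1][0]:
        return 0.0
    for (v_hi, p_hi), (v_lo, p_lo) in zip(TABLE, TABLE[1:]):
        if v_lo <= voltage <= v_hi:
            ratio = (voltage - v_lo) / (v_hi - v_lo)
            return p_lo + ratio * (p_hi - p_lo)
    return 0.0

print(round(interp(3.845), 2))  # → 55.0  (halfway between 50% and 60%)
```

The table is why this beats a single linear 3.00-4.20 V ramp: Li-ion cells spend most of their capacity in the flat 3.7-4.0 V region, so a straight line would overstate charge near the bottom and understate it near the top.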
651
src/meshmon/charts.py
Normal file
651
src/meshmon/charts.py
Normal file
@@ -0,0 +1,651 @@
|
|||||||
|
"""Matplotlib-based chart generation from SQLite database.
|
||||||
|
|
||||||
|
This module generates SVG charts with CSS variable support for theming,
|
||||||
|
reading metrics directly from the SQLite database (EAV schema).
|
||||||
|
"""
|
||||||
|
|
||||||
|
import io
|
||||||
|
import json
|
||||||
|
import re
|
||||||
|
from dataclasses import dataclass, field
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import Any, Literal, Optional
|
||||||
|
|
||||||
|
import matplotlib
|
||||||
|
matplotlib.use('Agg') # Non-interactive backend for server-side rendering
|
||||||
|
import matplotlib.pyplot as plt
|
||||||
|
import matplotlib.dates as mdates
|
||||||
|
|
||||||
|
from .db import get_metrics_for_period
|
||||||
|
from .env import get_config
|
||||||
|
from .metrics import (
|
||||||
|
get_chart_metrics,
|
||||||
|
is_counter_metric,
|
||||||
|
get_graph_scale,
|
||||||
|
transform_value,
|
||||||
|
)
|
||||||
|
from . import log
|
||||||
|
|
||||||
|
|
||||||
|
# Type alias for theme names
|
||||||
|
ThemeName = Literal["light", "dark"]
|
||||||
|
|
||||||
|
# Bin size constants (in seconds)
|
||||||
|
# These control time aggregation to keep chart data points at reasonable density
|
||||||
|
BIN_30_MINUTES = 1800 # 30 minutes in seconds
|
||||||
|
BIN_2_HOURS = 7200 # 2 hours in seconds
|
||||||
|
BIN_1_DAY = 86400 # 1 day in seconds
|
||||||
|
|
||||||
|
# Period configuration: lookback duration and aggregation bin size
|
||||||
|
# Period configuration for chart rendering
|
||||||
|
# Target: ~100-400 data points per chart for clean visualization
|
||||||
|
# Chart plot area is ~640px, so aim for 1.5-6px per point
|
||||||
|
PERIOD_CONFIG = {
|
||||||
|
"day": {
|
||||||
|
"lookback": timedelta(days=1),
|
||||||
|
"bin_seconds": None, # No binning - raw data (~96 points at 15-min intervals)
|
||||||
|
},
|
||||||
|
"week": {
|
||||||
|
"lookback": timedelta(days=7),
|
||||||
|
"bin_seconds": BIN_30_MINUTES, # 30-min bins (~336 points, ~2px per point)
|
||||||
|
},
|
||||||
|
"month": {
|
||||||
|
"lookback": timedelta(days=31),
|
||||||
|
"bin_seconds": BIN_2_HOURS, # 2-hour bins (~372 points, ~1.7px per point)
|
||||||
|
},
|
||||||
|
"year": {
|
||||||
|
"lookback": timedelta(days=365),
|
||||||
|
"bin_seconds": BIN_1_DAY, # 1-day bins (~365 points, ~1.8px per point)
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True)
|
||||||
|
class ChartTheme:
|
||||||
|
"""Color palette for chart rendering with CSS variable names."""
|
||||||
|
|
||||||
|
name: str
|
||||||
|
# Colors as hex values (without #)
|
||||||
|
background: str
|
||||||
|
canvas: str
|
||||||
|
text: str
|
||||||
|
axis: str
|
||||||
|
grid: str
|
||||||
|
line: str
|
||||||
|
area: str # Includes alpha channel
|
||||||
|
|
||||||
|
|
||||||
|
# Chart themes matching the Radio Observatory design (from redesign/chart_colors.py)
|
||||||
|
CHART_THEMES: dict[ThemeName, ChartTheme] = {
|
||||||
|
"light": ChartTheme(
|
||||||
|
name="light",
|
||||||
|
background="faf8f5", # Warm cream paper
|
||||||
|
canvas="ffffff",
|
||||||
|
text="1a1915", # Charcoal
|
||||||
|
axis="8a857a", # Muted text
|
||||||
|
grid="e8e4dc", # Subtle border
|
||||||
|
line="b45309", # Solar amber accent - burnt orange
|
||||||
|
area="b4530926", # 15% opacity fill
|
||||||
|
),
|
||||||
|
"dark": ChartTheme(
|
||||||
|
name="dark",
|
||||||
|
background="0f1114", # Deep observatory night
|
||||||
|
canvas="161a1e",
|
||||||
|
text="f0efe8", # Light text
|
||||||
|
axis="706d62", # Muted text
|
||||||
|
grid="252a30", # Subtle border
|
||||||
|
line="f59e0b", # Bright amber accent
|
||||||
|
area="f59e0b33", # 20% opacity fill
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass
|
||||||
|
class DataPoint:
|
||||||
|
"""A single data point with timestamp and value."""
|
||||||
|
timestamp: datetime
|
||||||
|
value: float
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass
|
||||||
|
class TimeSeries:
|
||||||
|
"""Time series data for a single metric."""
|
||||||
|
|
||||||
|
metric: str
|
||||||
|
role: str
|
||||||
|
period: str
|
||||||
|
points: list[DataPoint] = field(default_factory=list)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def timestamps(self) -> list[datetime]:
|
||||||
|
return [p.timestamp for p in self.points]
|
||||||
|
|
||||||
|
@property
|
||||||
|
def values(self) -> list[float]:
|
||||||
|
return [p.value for p in self.points]
|
||||||
|
|
||||||
|
@property
|
||||||
|
def is_empty(self) -> bool:
|
||||||
|
return len(self.points) == 0
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass
|
||||||
|
class ChartStatistics:
|
||||||
|
"""Statistics for a time series (min/avg/max/current)."""
|
||||||
|
|
||||||
|
min_value: Optional[float] = None
|
||||||
|
avg_value: Optional[float] = None
|
||||||
|
max_value: Optional[float] = None
|
||||||
|
current_value: Optional[float] = None
|
||||||
|
|
||||||
|
def to_dict(self) -> dict[str, Optional[float]]:
|
||||||
|
"""Convert to dict matching existing chart_stats.json format."""
|
||||||
|
return {
|
||||||
|
"min": self.min_value,
|
||||||
|
"avg": self.avg_value,
|
||||||
|
"max": self.max_value,
|
||||||
|
"current": self.current_value,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def _hex_to_rgba(hex_color: str) -> tuple[float, float, float, float]:
|
||||||
|
"""Convert hex color (without #) to RGBA tuple (0-1 range).
|
||||||
|
|
||||||
|
Accepts 6-char (RGB) or 8-char (RGBA) hex strings.
|
||||||
|
"""
|
||||||
|
r = int(hex_color[0:2], 16) / 255
|
||||||
|
g = int(hex_color[2:4], 16) / 255
|
||||||
|
b = int(hex_color[4:6], 16) / 255
|
||||||
|
a = int(hex_color[6:8], 16) / 255 if len(hex_color) >= 8 else 1.0
|
||||||
|
return (r, g, b, a)
|
||||||
|
|
||||||
|
|
||||||
|
def load_timeseries_from_db(
|
||||||
|
role: str,
|
||||||
|
metric: str,
|
||||||
|
end_time: datetime,
|
||||||
|
lookback: timedelta,
|
||||||
|
period: str,
|
||||||
|
) -> TimeSeries:
|
||||||
|
"""Load time series data from SQLite database.
|
||||||
|
|
||||||
|
Fetches metric data from EAV table, handles counter-to-rate conversion,
|
||||||
|
applies transforms (e.g., mv_to_v), and applies time binning as needed.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
        role: "companion" or "repeater"
        metric: Metric name (firmware field name, e.g., "bat", "nb_recv")
        end_time: End of the time range (typically now)
        lookback: How far back to look
        period: Period name for binning config ("day", "week", etc.)

    Returns:
        TimeSeries with extracted data points
    """
    start_time = end_time - lookback
    start_ts = int(start_time.timestamp())
    end_ts = int(end_time.timestamp())

    # Fetch all metrics for this role/period (returns pivoted dict)
    all_metrics = get_metrics_for_period(role, start_ts, end_ts)

    # Get data for this specific metric
    metric_data = all_metrics.get(metric, [])

    if not metric_data:
        return TimeSeries(metric=metric, role=role, period=period)

    is_counter = is_counter_metric(metric)
    scale = get_graph_scale(metric)

    # Convert to (datetime, value) tuples with transform applied
    raw_points: list[tuple[datetime, float]] = []
    for ts, value in metric_data:
        # Apply any configured transform (e.g., mv_to_v for battery)
        transformed_value = transform_value(metric, value)
        raw_points.append((datetime.fromtimestamp(ts), transformed_value))

    if not raw_points:
        return TimeSeries(metric=metric, role=role, period=period)

    # For counter metrics, calculate rate of change
    if is_counter:
        rate_points: list[tuple[datetime, float]] = []

        for i in range(1, len(raw_points)):
            prev_ts, prev_val = raw_points[i - 1]
            curr_ts, curr_val = raw_points[i]

            delta_val = curr_val - prev_val
            delta_secs = (curr_ts - prev_ts).total_seconds()

            if delta_secs <= 0:
                continue

            # Skip negative deltas (device reboot)
            if delta_val < 0:
                log.debug(f"Counter reset detected for {metric} at {curr_ts}")
                continue

            # Calculate per-second rate, then apply scaling (typically x60 for per-minute)
            rate = (delta_val / delta_secs) * scale
            rate_points.append((curr_ts, rate))

        raw_points = rate_points
    else:
        # For gauges, just apply scaling
        raw_points = [(ts, val * scale) for ts, val in raw_points]

    # Apply time binning if configured
    period_cfg = PERIOD_CONFIG.get(period, {})
    bin_seconds = period_cfg.get("bin_seconds")

    if bin_seconds and len(raw_points) > 1:
        raw_points = _aggregate_bins(raw_points, bin_seconds)

    # Convert to DataPoints
    points = [DataPoint(timestamp=ts, value=val) for ts, val in raw_points]

    return TimeSeries(metric=metric, role=role, period=period, points=points)

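The counter-to-rate conversion above can be sketched standalone, using plain tuples instead of the module's types. The sample values, timestamps, and `scale` below are hypothetical; a negative delta stands in for a device reboot, as in the code above.

```python
from datetime import datetime

# Hypothetical counter samples: (timestamp, cumulative count).
samples = [
    (datetime(2024, 1, 1, 0, 0), 100.0),
    (datetime(2024, 1, 1, 0, 5), 160.0),   # +60 over 300 s
    (datetime(2024, 1, 1, 0, 10), 40.0),   # counter reset (reboot) -> skipped
    (datetime(2024, 1, 1, 0, 15), 70.0),   # +30 over 300 s
]
scale = 60  # per-second rate scaled to per-minute

rates = []
for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
    dv, dt = v1 - v0, (t1 - t0).total_seconds()
    if dt <= 0 or dv < 0:   # reboots show up as negative deltas
        continue
    rates.append((t1, dv / dt * scale))

print([round(r, 6) for _, r in rates])  # [12.0, 6.0]
```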
def _aggregate_bins(
    points: list[tuple[datetime, float]],
    bin_seconds: int,
) -> list[tuple[datetime, float]]:
    """Aggregate points into time bins using mean.

    Args:
        points: List of (timestamp, value) tuples, must be sorted
        bin_seconds: Size of each bin in seconds

    Returns:
        Aggregated points, one per bin
    """
    if not points:
        return []

    bins: dict[int, list[float]] = {}

    for ts, val in points:
        # Round timestamp down to bin boundary
        epoch = int(ts.timestamp())
        bin_key = (epoch // bin_seconds) * bin_seconds

        if bin_key not in bins:
            bins[bin_key] = []
        bins[bin_key].append(val)

    # Calculate mean for each bin
    result = []
    for bin_key in sorted(bins.keys()):
        values = bins[bin_key]
        mean_val = sum(values) / len(values)
        bin_ts = datetime.fromtimestamp(bin_key + bin_seconds // 2)  # Center of bin
        result.append((bin_ts, mean_val))

    return result

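The binning behavior can be demonstrated in isolation (the base epoch and 300-second bin size below are arbitrary examples, chosen so the first two samples share a bin):

```python
from datetime import datetime

def aggregate_bins(points, bin_seconds):
    # Group values by the bin containing their timestamp.
    bins = {}
    for ts, val in points:
        bin_key = (int(ts.timestamp()) // bin_seconds) * bin_seconds
        bins.setdefault(bin_key, []).append(val)
    # One mean value per bin, stamped at the bin's center.
    return [
        (datetime.fromtimestamp(key + bin_seconds // 2),
         sum(bins[key]) / len(bins[key]))
        for key in sorted(bins)
    ]

base = 1_700_000_100  # divisible by 300, so it sits on a bin boundary
pts = [
    (datetime.fromtimestamp(base), 1.0),
    (datetime.fromtimestamp(base + 100), 3.0),  # same 300 s bin -> mean 2.0
    (datetime.fromtimestamp(base + 400), 5.0),  # next bin
]
binned = aggregate_bins(pts, 300)
print([v for _, v in binned])  # [2.0, 5.0]
```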
def calculate_statistics(ts: TimeSeries) -> ChartStatistics:
    """Calculate min/avg/max/current statistics from time series.

    Args:
        ts: Time series data

    Returns:
        ChartStatistics with calculated values
    """
    if ts.is_empty:
        return ChartStatistics()

    values = ts.values

    return ChartStatistics(
        min_value=min(values),
        avg_value=sum(values) / len(values),
        max_value=max(values),
        current_value=values[-1] if values else None,
    )

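The statistics reduce to four aggregates over the series values; a minimal sketch with made-up gauge readings:

```python
# Hypothetical gauge values, ordered oldest to newest.
values = [4.0, 3.0, 5.0, 4.0]

stats = {
    "min": min(values),
    "avg": sum(values) / len(values),
    "max": max(values),
    "current": values[-1],  # most recent sample
}
print(stats)  # {'min': 3.0, 'avg': 4.0, 'max': 5.0, 'current': 4.0}
```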
def render_chart_svg(
    ts: TimeSeries,
    theme: ChartTheme,
    width: int = 800,
    height: int = 280,
    y_min: Optional[float] = None,
    y_max: Optional[float] = None,
    x_start: Optional[datetime] = None,
    x_end: Optional[datetime] = None,
) -> str:
    """Render time series as SVG using matplotlib.

    Args:
        ts: Time series data to render
        theme: Color theme to apply
        width: Chart width in pixels
        height: Chart height in pixels
        y_min: Optional fixed Y-axis minimum
        y_max: Optional fixed Y-axis maximum
        x_start: Optional fixed X-axis start (for padding sparse data)
        x_end: Optional fixed X-axis end (for padding sparse data)

    Returns:
        SVG string with embedded data-points attribute for tooltips
    """
    # Create figure
    dpi = 100
    fig_width = width / dpi
    fig_height = height / dpi

    fig, ax = plt.subplots(figsize=(fig_width, fig_height), dpi=dpi)

    # Track actual Y-axis values for tooltip injection
    actual_y_min = y_min
    actual_y_max = y_max

    try:
        # Apply theme colors
        fig.patch.set_facecolor(f"#{theme.background}")
        ax.set_facecolor(f"#{theme.canvas}")

        # Configure axes
        ax.spines['top'].set_visible(False)
        ax.spines['right'].set_visible(False)
        ax.spines['left'].set_color(f"#{theme.grid}")
        ax.spines['bottom'].set_color(f"#{theme.grid}")

        ax.tick_params(colors=f"#{theme.axis}", labelsize=10)
        ax.yaxis.label.set_color(f"#{theme.text}")

        # Grid
        ax.grid(True, linestyle='-', alpha=0.5, color=f"#{theme.grid}")
        ax.set_axisbelow(True)

        if ts.is_empty:
            # Empty chart - just show axes
            ax.text(
                0.5, 0.5, "No data available",
                transform=ax.transAxes,
                ha='center', va='center',
                fontsize=12,
                color=f"#{theme.axis}"
            )
        else:
            timestamps = ts.timestamps
            values = ts.values

            # Plot area fill
            area_color = _hex_to_rgba(theme.area)
            ax.fill_between(timestamps, values, alpha=area_color[3], color=f"#{theme.line}")

            # Plot line
            ax.plot(timestamps, values, color=f"#{theme.line}", linewidth=2)

            # Set Y-axis limits and track actual values used
            if y_min is not None and y_max is not None:
                ax.set_ylim(y_min, y_max)
                actual_y_min, actual_y_max = y_min, y_max
            else:
                # Add some padding
                val_min, val_max = min(values), max(values)
                val_range = val_max - val_min if val_max != val_min else abs(val_max) * 0.1 or 1
                padding = val_range * 0.1
                actual_y_min = val_min - padding
                actual_y_max = val_max + padding
                ax.set_ylim(actual_y_min, actual_y_max)

        # Set X-axis limits first (before configuring ticks)
        if x_start is not None and x_end is not None:
            ax.set_xlim(x_start, x_end)

        # Format X-axis based on period (after setting limits)
        _configure_x_axis(ax, ts.period)

        # Tight layout
        plt.tight_layout(pad=0.5)

        # Render to SVG
        svg_buffer = io.StringIO()
        fig.savefig(svg_buffer, format='svg', bbox_inches='tight', pad_inches=0.1)
        svg_content = svg_buffer.getvalue()

    finally:
        # Ensure figure is closed to prevent memory leaks
        plt.close(fig)

    # Inject data-points attribute for tooltip support
    if not ts.is_empty:
        svg_content = _inject_data_attributes(
            svg_content, ts, theme.name, x_start, x_end,
            actual_y_min, actual_y_max
        )

    return svg_content

def _configure_x_axis(ax, period: str) -> None:
    """Configure X-axis formatting based on period."""
    if period == "day":
        ax.xaxis.set_major_formatter(mdates.DateFormatter('%H:%M'))
        ax.xaxis.set_major_locator(mdates.HourLocator(interval=4))
    elif period == "week":
        ax.xaxis.set_major_formatter(mdates.DateFormatter('%a'))
        ax.xaxis.set_major_locator(mdates.DayLocator())
    elif period == "month":
        ax.xaxis.set_major_formatter(mdates.DateFormatter('%d'))
        ax.xaxis.set_major_locator(mdates.DayLocator(interval=5))
    else:  # year
        ax.xaxis.set_major_formatter(mdates.DateFormatter('%b'))
        ax.xaxis.set_major_locator(mdates.MonthLocator())

    # Rotate labels for readability
    plt.setp(ax.xaxis.get_majorticklabels(), rotation=0, ha='center')

def _inject_data_attributes(
    svg: str,
    ts: TimeSeries,
    theme_name: str,
    x_start: Optional[datetime] = None,
    x_end: Optional[datetime] = None,
    y_min: Optional[float] = None,
    y_max: Optional[float] = None,
) -> str:
    """Inject data-* attributes into SVG for tooltip support.

    Adds:
    - data-metric, data-period, data-theme, data-x-start, data-x-end, data-y-min, data-y-max to root <svg>
    - data-points JSON array to the chart path element

    Args:
        svg: Raw SVG string
        ts: Time series data
        theme_name: Theme identifier
        x_start: X-axis start timestamp (for proper mouse-to-time mapping)
        x_end: X-axis end timestamp
        y_min: Y-axis minimum value
        y_max: Y-axis maximum value

    Returns:
        Modified SVG with data attributes
    """
    # Build data points array for tooltips
    data_points = [
        {"ts": int(p.timestamp.timestamp()), "v": round(p.value, 4)}
        for p in ts.points
    ]
    data_points_json = json.dumps(data_points)

    # Escape for embedding in an HTML attribute (internal double quotes -> &quot;)
    data_points_attr = data_points_json.replace('"', '&quot;')

    # Build X-axis range attributes for proper tooltip positioning
    x_start_ts = int(x_start.timestamp()) if x_start else int(ts.points[0].timestamp.timestamp())
    x_end_ts = int(x_end.timestamp()) if x_end else int(ts.points[-1].timestamp.timestamp())

    # Build Y-axis range attributes
    y_min_val = y_min if y_min is not None else min(p.value for p in ts.points)
    y_max_val = y_max if y_max is not None else max(p.value for p in ts.points)

    # Add attributes to root <svg> element
    svg = re.sub(
        r'<svg\b',
        f'<svg data-metric="{ts.metric}" data-period="{ts.period}" data-theme="{theme_name}" '
        f'data-x-start="{x_start_ts}" data-x-end="{x_end_ts}" '
        f'data-y-min="{y_min_val}" data-y-max="{y_max_val}"',
        svg,
        count=1
    )

    # Add data-points to the main path element (the line, not the fill)
    # Look for the second path element (first is usually the fill area)
    path_count = 0

    def add_data_to_path(match):
        nonlocal path_count
        path_count += 1
        if path_count == 2:  # The line path
            return f'<path data-points="{data_points_attr}"'
        return match.group(0)

    svg = re.sub(r'<path\b', add_data_to_path, svg)

    return svg

def render_all_charts(
    role: str,
    metrics: Optional[list[str]] = None,
) -> tuple[list[Path], dict[str, dict[str, dict[str, Any]]]]:
    """Render all charts for a role in both light and dark themes.

    Also collects min/avg/max/current statistics for each metric/period.

    Args:
        role: "companion" or "repeater"
        metrics: Optional list of metric names to render (for testing)

    Returns:
        Tuple of (list of generated chart paths, stats dict)
        Stats dict structure: {metric_name: {period: {min, avg, max, current}}}
    """
    if metrics is None:
        metrics = get_chart_metrics(role)

    cfg = get_config()
    charts_dir = cfg.out_dir / "assets" / role
    charts_dir.mkdir(parents=True, exist_ok=True)

    periods = ["day", "week", "month", "year"]
    themes: list[ThemeName] = ["light", "dark"]

    generated: list[Path] = []
    all_stats: dict[str, dict[str, dict[str, Any]]] = {}

    # Current time for all lookbacks
    now = datetime.now()

    # Fixed Y-axis ranges for specific metrics
    # Battery metrics use transformed values (volts, not millivolts)
    y_ranges = {
        "bat": (3.0, 4.2),         # Repeater battery (V)
        "battery_mv": (3.0, 4.2),  # Companion battery (V after transform)
        "bat_pct": (0, 100),       # Battery percentage
    }

    for metric in metrics:
        all_stats[metric] = {}

        for period in periods:
            period_cfg = PERIOD_CONFIG[period]

            # Load time series from database
            ts = load_timeseries_from_db(
                role=role,
                metric=metric,
                end_time=now,
                lookback=period_cfg["lookback"],
                period=period,
            )

            # Calculate and store statistics
            stats = calculate_statistics(ts)
            all_stats[metric][period] = stats.to_dict()

            # Get Y-axis range for this metric
            y_range = y_ranges.get(metric)
            y_min = y_range[0] if y_range else None
            y_max = y_range[1] if y_range else None

            # Calculate X-axis range for full period padding
            x_end = now
            x_start = now - period_cfg["lookback"]

            # Render chart for each theme
            for theme_name in themes:
                theme = CHART_THEMES[theme_name]

                svg_content = render_chart_svg(
                    ts=ts,
                    theme=theme,
                    y_min=y_min,
                    y_max=y_max,
                    x_start=x_start,
                    x_end=x_end,
                )

                # Save to file
                output_path = charts_dir / f"{metric}_{period}_{theme_name}.svg"
                output_path.write_text(svg_content, encoding="utf-8")
                generated.append(output_path)

                log.debug(f"Generated chart: {output_path}")

    log.info(f"Rendered {len(generated)} charts for {role}")
    return generated, all_stats

def save_chart_stats(role: str, stats: dict[str, dict[str, dict[str, Any]]]) -> Path:
    """Save chart statistics to JSON file.

    Args:
        role: "companion" or "repeater"
        stats: Stats dict from render_all_charts

    Returns:
        Path to saved JSON file
    """
    cfg = get_config()
    stats_path = cfg.out_dir / "assets" / role / "chart_stats.json"
    stats_path.parent.mkdir(parents=True, exist_ok=True)

    with open(stats_path, "w") as f:
        json.dump(stats, f, indent=2)

    log.debug(f"Saved chart stats to {stats_path}")
    return stats_path

def load_chart_stats(role: str) -> dict[str, dict[str, dict[str, Any]]]:
    """Load chart statistics from JSON file.

    Args:
        role: "companion" or "repeater"

    Returns:
        Stats dict, or empty dict if file doesn't exist
    """
    cfg = get_config()
    stats_path = cfg.out_dir / "assets" / role / "chart_stats.json"

    if not stats_path.exists():
        return {}

    try:
        with open(stats_path) as f:
            return json.load(f)
    except Exception as e:
        log.debug(f"Failed to load chart stats: {e}")
        return {}
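The save/load pair round-trips a nested `{metric: {period: {...}}}` dict through JSON. A minimal sketch of that round trip, with hypothetical values and stat keys (the exact keys come from `ChartStatistics.to_dict`, which is not shown here):

```python
import json

# Hypothetical stats payload in the shape produced by render_all_charts.
stats = {
    "bat": {
        "day": {"min": 3.81, "avg": 3.95, "max": 4.08, "current": 3.92},
    }
}

blob = json.dumps(stats, indent=2)        # what save_chart_stats writes
loaded = json.loads(blob)                 # what load_chart_stats returns
print(loaded["bat"]["day"]["current"])    # 3.92
```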
546  src/meshmon/db.py  Normal file
@@ -0,0 +1,546 @@
"""SQLite database for metrics storage.

This module provides EAV (Entity-Attribute-Value) storage for metrics
from MeshCore devices. Firmware field names are stored directly for
future-proofing - new fields are captured automatically without schema changes.

Schema design:
- Single 'metrics' table with (ts, role, metric, value) structure
- Firmware field names stored as-is (e.g., 'bat', 'nb_recv', 'battery_mv')
- bat_pct computed at query time from voltage values
- Raw counter values stored (rates computed during chart rendering)

Migration system:
- Schema version tracked in db_meta table
- Migrations stored as SQL files in src/meshmon/migrations/
- Files named: NNN_description.sql (e.g., 001_initial_schema.sql)
- Applied in order on database init
"""

import sqlite3
from collections import defaultdict
from contextlib import contextmanager
from pathlib import Path
from typing import Any, Iterator, Optional

from .battery import voltage_to_percentage
from .env import get_config
from . import log


# Path to migrations directory (relative to this file)
MIGRATIONS_DIR = Path(__file__).parent / "migrations"

# Valid role values (used to prevent SQL injection)
VALID_ROLES = ("companion", "repeater")

# Battery voltage field names by role (for bat_pct derivation)
BATTERY_FIELD = {
    "companion": "battery_mv",
    "repeater": "bat",
}

def _validate_role(role: str) -> str:
    """Validate role parameter to prevent SQL injection.

    Args:
        role: Role name to validate

    Returns:
        The validated role string

    Raises:
        ValueError: If role is not valid
    """
    if role not in VALID_ROLES:
        raise ValueError(f"Invalid role: {role!r}. Must be one of {VALID_ROLES}")
    return role

# =============================================================================
# File-based Migration System
# =============================================================================


def _get_migration_files() -> list[tuple[int, Path]]:
    """Get all migration files sorted by version number.

    Returns:
        List of (version, path) tuples sorted by version
    """
    if not MIGRATIONS_DIR.exists():
        return []

    migrations = []
    for sql_file in MIGRATIONS_DIR.glob("*.sql"):
        # Extract version number from filename (e.g., "001_initial.sql" -> 1)
        try:
            version_str = sql_file.stem.split("_")[0]
            version = int(version_str)
            migrations.append((version, sql_file))
        except (ValueError, IndexError):
            log.warn(f"Skipping invalid migration filename: {sql_file.name}")
            continue

    return sorted(migrations, key=lambda x: x[0])

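The `NNN_description.sql` naming convention can be exercised standalone; the filenames below are hypothetical examples (the real migration files live in `src/meshmon/migrations/`):

```python
# Hypothetical migration filenames, including one that doesn't match the scheme.
names = ["002_add_index.sql", "001_initial_schema.sql", "notes.sql"]

migrations = []
for name in names:
    stem = name.rsplit(".", 1)[0]
    try:
        version = int(stem.split("_")[0])  # "001_initial_schema" -> 1
        migrations.append((version, name))
    except ValueError:
        pass                               # "notes" has no numeric prefix

migrations.sort(key=lambda x: x[0])
print(migrations)  # [(1, '001_initial_schema.sql'), (2, '002_add_index.sql')]
```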
def _get_schema_version(conn: sqlite3.Connection) -> int:
    """Get current schema version from database.

    Returns 0 if db_meta table doesn't exist (fresh database).
    """
    try:
        cursor = conn.execute(
            "SELECT value FROM db_meta WHERE key = 'schema_version'"
        )
        row = cursor.fetchone()
        return int(row[0]) if row else 0
    except sqlite3.OperationalError:
        # db_meta table doesn't exist
        return 0

def _set_schema_version(conn: sqlite3.Connection, version: int) -> None:
    """Set schema version in database."""
    conn.execute(
        """
        INSERT OR REPLACE INTO db_meta (key, value)
        VALUES ('schema_version', ?)
        """,
        (str(version),)
    )

def _apply_migrations(conn: sqlite3.Connection) -> None:
    """Apply pending migrations from SQL files."""
    current_version = _get_schema_version(conn)
    migrations = _get_migration_files()

    if not migrations:
        raise RuntimeError(
            f"No migration files found in {MIGRATIONS_DIR}. "
            "Expected at least 001_initial_schema.sql"
        )

    latest_version = migrations[-1][0]

    # Apply each migration that hasn't been applied yet
    for version, sql_file in migrations:
        if version <= current_version:
            continue

        log.info(f"Applying migration {sql_file.name}")
        try:
            sql_content = sql_file.read_text()
            conn.executescript(sql_content)
            _set_schema_version(conn, version)
            conn.commit()
            log.debug(f"Migration {version} applied successfully")
        except Exception as e:
            conn.rollback()
            raise RuntimeError(
                f"Migration {sql_file.name} failed: {e}"
            ) from e

    final_version = _get_schema_version(conn)
    if final_version < latest_version:
        log.warn(
            f"Schema version {final_version} is behind latest migration {latest_version}"
        )

def get_schema_version() -> int:
    """Get current schema version from database.

    Returns:
        Current schema version, or 0 if database doesn't exist
    """
    db_path = get_db_path()
    if not db_path.exists():
        return 0

    with get_connection(readonly=True) as conn:
        return _get_schema_version(conn)

# =============================================================================
# Database Connection & Initialization
# =============================================================================


def get_db_path() -> Path:
    """Get database file path."""
    cfg = get_config()
    return cfg.state_dir / "metrics.db"

def init_db(db_path: Optional[Path] = None) -> None:
    """Initialize database with schema and apply pending migrations.

    Creates tables if they don't exist. Safe to call multiple times.
    Applies any pending migrations to bring schema up to date.

    Args:
        db_path: Optional path override (for testing)
    """
    if db_path is None:
        db_path = get_db_path()

    db_path.parent.mkdir(parents=True, exist_ok=True)

    conn = sqlite3.connect(db_path)
    try:
        # Enable optimizations
        conn.execute("PRAGMA journal_mode=WAL")
        conn.execute("PRAGMA synchronous=NORMAL")
        conn.execute("PRAGMA cache_size=-64000")  # 64MB cache
        conn.execute("PRAGMA temp_store=MEMORY")

        # Apply schema creation and migrations
        _apply_migrations(conn)

        conn.commit()

        version = _get_schema_version(conn)
        log.debug(f"Database initialized at {db_path} (schema v{version})")

    finally:
        conn.close()

@contextmanager
def get_connection(
    db_path: Optional[Path] = None,
    readonly: bool = False
) -> Iterator[sqlite3.Connection]:
    """Context manager for database connections.

    Args:
        db_path: Optional path override
        readonly: If True, open in read-only mode

    Yields:
        sqlite3.Connection with Row factory enabled
    """
    if db_path is None:
        db_path = get_db_path()

    if readonly:
        uri = f"file:{db_path}?mode=ro"
        conn = sqlite3.connect(uri, uri=True)
    else:
        conn = sqlite3.connect(db_path)

    conn.row_factory = sqlite3.Row
    # Wait up to 5 seconds if database is locked
    conn.execute("PRAGMA busy_timeout=5000")

    try:
        yield conn
        if not readonly:
            conn.commit()
    except Exception:
        if not readonly:
            conn.rollback()
        raise
    finally:
        conn.close()

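The commit-on-success / rollback-on-error pattern above can be sketched against an in-memory database (the real helper opens the configured file path and supports the read-only URI mode; the table and values here are illustrative):

```python
import sqlite3
from contextlib import contextmanager

@contextmanager
def connection(db=":memory:"):
    # Mirror of get_connection's lifecycle: Row factory, commit on
    # success, rollback on error, always close.
    conn = sqlite3.connect(db)
    conn.row_factory = sqlite3.Row
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        conn.close()

with connection() as conn:
    conn.execute(
        "CREATE TABLE metrics (ts INTEGER, role TEXT, metric TEXT, value REAL)"
    )
    conn.execute("INSERT INTO metrics VALUES (1, 'repeater', 'bat', 4012.0)")
    row = conn.execute("SELECT metric, value FROM metrics").fetchone()
    print(row["metric"], row["value"])  # bat 4012.0
```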
# =============================================================================
# Metric Insert Functions (EAV)
# =============================================================================

def insert_metric(
    ts: int,
    role: str,
    metric: str,
    value: float,
    db_path: Optional[Path] = None,
) -> bool:
    """Insert a single metric value.

    Args:
        ts: Unix timestamp
        role: 'companion' or 'repeater'
        metric: Firmware field name (e.g., 'bat', 'nb_recv')
        value: Metric value
        db_path: Optional path override

    Returns:
        True if inserted, False if duplicate (ts, role, metric)
    """
    role = _validate_role(role)

    try:
        with get_connection(db_path) as conn:
            conn.execute(
                "INSERT INTO metrics (ts, role, metric, value) VALUES (?, ?, ?, ?)",
                (ts, role, metric, value)
            )
        return True
    except sqlite3.IntegrityError as e:
        if "UNIQUE constraint failed" in str(e) or "PRIMARY KEY" in str(e):
            log.debug(f"Duplicate metric: ts={ts}, role={role}, metric={metric}")
            return False
        raise

def insert_metrics(
    ts: int,
    role: str,
    metrics: dict[str, Any],
    db_path: Optional[Path] = None,
) -> int:
    """Insert multiple metrics from a dict (e.g., firmware status response).

    Only numeric values (int, float) are inserted. Non-numeric values are skipped.

    Args:
        ts: Unix timestamp
        role: 'companion' or 'repeater'
        metrics: Dict of metric_name -> value (from firmware response)
        db_path: Optional path override

    Returns:
        Number of metrics inserted
    """
    role = _validate_role(role)
    inserted = 0

    try:
        with get_connection(db_path) as conn:
            for metric, value in metrics.items():
                # Skip None values
                if value is None:
                    continue
                # Only insert numeric values
                if not isinstance(value, (int, float)):
                    continue

                try:
                    conn.execute(
                        "INSERT INTO metrics (ts, role, metric, value) VALUES (?, ?, ?, ?)",
                        (ts, role, metric, float(value))
                    )
                    inserted += 1
                except sqlite3.IntegrityError:
                    # Duplicate, skip
                    pass

        log.debug(f"Inserted {inserted} metrics for {role} at ts={ts}")
        return inserted

    except Exception as e:
        log.error(f"Failed to insert metrics: {e}")
        raise

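The duplicate rejection relies on a uniqueness constraint over (ts, role, metric). A standalone sketch against an in-memory database; the `PRIMARY KEY` definition below is an assumption, since the real schema lives in the migration SQL files:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Assumed schema: one row per (ts, role, metric), enforced by the key.
conn.execute(
    "CREATE TABLE metrics (ts INTEGER, role TEXT, metric TEXT, value REAL, "
    "PRIMARY KEY (ts, role, metric))"
)

def insert_metric(ts, role, metric, value):
    try:
        conn.execute(
            "INSERT INTO metrics (ts, role, metric, value) VALUES (?, ?, ?, ?)",
            (ts, role, metric, value),
        )
        return True
    except sqlite3.IntegrityError:
        return False  # same (ts, role, metric) already stored

print(insert_metric(1000, "repeater", "bat", 4100.0))  # True
print(insert_metric(1000, "repeater", "bat", 4100.0))  # False (duplicate)
```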
# =============================================================================
# Metric Query Functions (EAV)
# =============================================================================

def get_metrics_for_period(
    role: str,
    start_ts: int,
    end_ts: int,
    db_path: Optional[Path] = None,
) -> dict[str, list[tuple[int, float]]]:
    """Fetch all metrics for a role within a time range.

    Returns data pivoted by metric name for easy chart rendering.
    Also computes bat_pct from battery voltage if available.

    Args:
        role: "companion" or "repeater"
        start_ts: Start timestamp (inclusive)
        end_ts: End timestamp (inclusive)
        db_path: Optional path override

    Returns:
        Dict mapping metric names to list of (timestamp, value) tuples,
        sorted by timestamp ascending.

    Raises:
        ValueError: If role is not valid
    """
    role = _validate_role(role)

    with get_connection(db_path, readonly=True) as conn:
        cursor = conn.execute(
            """
            SELECT ts, metric, value
            FROM metrics
            WHERE role = ? AND ts BETWEEN ? AND ?
            ORDER BY ts ASC
            """,
            (role, start_ts, end_ts)
        )

        result: dict[str, list[tuple[int, float]]] = defaultdict(list)
        for row in cursor:
            if row["value"] is not None:
                result[row["metric"]].append((row["ts"], row["value"]))

        # Compute bat_pct from battery voltage
        bat_field = BATTERY_FIELD.get(role)
        if bat_field and bat_field in result:
            bat_pct_data = []
            for ts, mv in result[bat_field]:
                voltage = mv / 1000.0  # Convert millivolts to volts
                pct = voltage_to_percentage(voltage)
                if pct is not None:
                    bat_pct_data.append((ts, pct))
            if bat_pct_data:
                result["bat_pct"] = bat_pct_data

        return dict(result)

def get_latest_metrics(
    role: str,
    db_path: Optional[Path] = None,
) -> Optional[dict[str, Any]]:
    """Get the most recent metrics for a role.

    Returns all metrics at the most recent timestamp as a flat dict.
    Also computes bat_pct from battery voltage.

    Args:
        role: "companion" or "repeater"
        db_path: Optional path override

    Returns:
        Dict with 'ts' and all metric values, or None if no data

    Raises:
        ValueError: If role is not valid
    """
    role = _validate_role(role)

    with get_connection(db_path, readonly=True) as conn:
        # Find the most recent timestamp for this role
        cursor = conn.execute(
            "SELECT MAX(ts) as max_ts FROM metrics WHERE role = ?",
            (role,)
        )
        row = cursor.fetchone()
        if not row or row["max_ts"] is None:
            return None

        max_ts = row["max_ts"]

        # Get all metrics at that timestamp
        cursor = conn.execute(
            "SELECT metric, value FROM metrics WHERE role = ? AND ts = ?",
            (role, max_ts)
        )

        result: dict[str, Any] = {"ts": max_ts}
        for row in cursor:
            result[row["metric"]] = row["value"]

        # Compute bat_pct from battery voltage
        bat_field = BATTERY_FIELD.get(role)
        if bat_field and bat_field in result:
            voltage = result[bat_field] / 1000.0
            result["bat_pct"] = voltage_to_percentage(voltage)

        return result

def get_metric_count(
    role: str,
    db_path: Optional[Path] = None,
) -> int:
    """Get total number of metric rows for a role.

    Args:
        role: "companion" or "repeater"
        db_path: Optional path override

    Returns:
        Number of rows

    Raises:
        ValueError: If role is not valid
    """
    role = _validate_role(role)

    with get_connection(db_path, readonly=True) as conn:
        cursor = conn.execute(
            "SELECT COUNT(*) FROM metrics WHERE role = ?",
            (role,)
        )
        return cursor.fetchone()[0]

def get_distinct_timestamps(
|
||||||
|
role: str,
|
||||||
|
db_path: Optional[Path] = None,
|
||||||
|
) -> int:
|
||||||
|
"""Get count of distinct timestamps for a role.
|
||||||
|
|
||||||
|
Useful for understanding actual sample count (vs metric row count).
|
||||||
|
|
||||||
|
Args:
|
||||||
|
role: "companion" or "repeater"
|
||||||
|
db_path: Optional path override
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Number of distinct timestamps
|
||||||
|
"""
|
||||||
|
role = _validate_role(role)
|
||||||
|
|
||||||
|
with get_connection(db_path, readonly=True) as conn:
|
||||||
|
cursor = conn.execute(
|
||||||
|
"SELECT COUNT(DISTINCT ts) FROM metrics WHERE role = ?",
|
||||||
|
(role,)
|
||||||
|
)
|
||||||
|
return cursor.fetchone()[0]
|
||||||
|
|
||||||
|
|
||||||
|
def get_available_metrics(
|
||||||
|
role: str,
|
||||||
|
db_path: Optional[Path] = None,
|
||||||
|
) -> list[str]:
|
||||||
|
"""Get list of all metric names stored for a role.
|
||||||
|
|
||||||
|
Useful for discovering what metrics are available from firmware.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
role: "companion" or "repeater"
|
||||||
|
db_path: Optional path override
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of metric names
|
||||||
|
"""
|
||||||
|
role = _validate_role(role)
|
||||||
|
|
||||||
|
with get_connection(db_path, readonly=True) as conn:
|
||||||
|
cursor = conn.execute(
|
||||||
|
"SELECT DISTINCT metric FROM metrics WHERE role = ? ORDER BY metric",
|
||||||
|
(role,)
|
||||||
|
)
|
||||||
|
return [row["metric"] for row in cursor]
|
||||||
|
|
||||||
|
|
||||||
|
def vacuum_db(db_path: Optional[Path] = None) -> None:
|
||||||
|
"""Compact database and rebuild indexes.
|
||||||
|
|
||||||
|
Should be run periodically (e.g., weekly via cron).
|
||||||
|
"""
|
||||||
|
if db_path is None:
|
||||||
|
db_path = get_db_path()
|
||||||
|
|
||||||
|
conn = sqlite3.connect(db_path)
|
||||||
|
try:
|
||||||
|
conn.execute("VACUUM")
|
||||||
|
conn.execute("ANALYZE")
|
||||||
|
log.info("Database vacuumed and analyzed")
|
||||||
|
finally:
|
||||||
|
conn.close()
|
||||||
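The query functions above pivot the EAV-style `metrics` table (one row per `role`/`ts`/`metric`/`value`) into a flat dict at read time. A minimal self-contained sketch of that two-step pivot, using an in-memory SQLite database and hypothetical sample rows (the real module's connection helpers and battery curve are not reproduced here):

```python
import sqlite3

# In-memory stand-in for the real metrics table (role, ts, metric, value).
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("CREATE TABLE metrics (role TEXT, ts INTEGER, metric TEXT, value REAL)")
rows = [
    ("repeater", 100, "bat", 4012.0),
    ("repeater", 100, "last_rssi", -97.0),
    ("repeater", 160, "bat", 4009.0),
    ("repeater", 160, "last_rssi", -95.0),
]
conn.executemany("INSERT INTO metrics VALUES (?, ?, ?, ?)", rows)

# Same two-step query shape as get_latest_metrics: newest ts, then pivot its rows.
max_ts = conn.execute(
    "SELECT MAX(ts) AS max_ts FROM metrics WHERE role = ?", ("repeater",)
).fetchone()["max_ts"]
latest = {"ts": max_ts}
for row in conn.execute(
    "SELECT metric, value FROM metrics WHERE role = ? AND ts = ?",
    ("repeater", max_ts),
):
    latest[row["metric"]] = row["value"]

print(latest)  # {'ts': 160, 'bat': 4009.0, 'last_rssi': -95.0}
```

Because each sample fans out into one row per metric, `get_metric_count` and `get_distinct_timestamps` intentionally differ: the former counts rows, the latter counts samples.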
126
src/meshmon/env.py
Normal file
@@ -0,0 +1,126 @@
"""Environment variable parsing and configuration."""

import os
from pathlib import Path
from typing import Optional


def get_str(key: str, default: Optional[str] = None) -> Optional[str]:
    """Get string env var."""
    return os.environ.get(key, default)


def get_int(key: str, default: int) -> int:
    """Get integer env var."""
    val = os.environ.get(key)
    if val is None:
        return default
    try:
        return int(val)
    except ValueError:
        return default


def get_bool(key: str, default: bool = False) -> bool:
    """Get boolean env var (0/1, true/false, yes/no)."""
    val = os.environ.get(key, "").lower()
    if not val:
        return default
    return val in ("1", "true", "yes", "on")


def get_float(key: str, default: float) -> float:
    """Get float env var."""
    val = os.environ.get(key)
    if val is None:
        return default
    try:
        return float(val)
    except ValueError:
        return default


def get_path(key: str, default: str) -> Path:
    """Get path env var, expanding user and making absolute."""
    val = os.environ.get(key, default)
    return Path(val).expanduser().resolve()


class Config:
    """Configuration loaded from environment variables."""

    def __init__(self):
        # Connection settings
        self.mesh_transport = get_str("MESH_TRANSPORT", "serial")
        self.mesh_serial_port = get_str("MESH_SERIAL_PORT")  # None = auto-detect
        self.mesh_serial_baud = get_int("MESH_SERIAL_BAUD", 115200)
        self.mesh_tcp_host = get_str("MESH_TCP_HOST", "localhost")
        self.mesh_tcp_port = get_int("MESH_TCP_PORT", 5000)
        self.mesh_ble_addr = get_str("MESH_BLE_ADDR")
        self.mesh_ble_pin = get_str("MESH_BLE_PIN")
        self.mesh_debug = get_bool("MESH_DEBUG", False)

        # Remote repeater identity
        self.repeater_name = get_str("REPEATER_NAME")
        self.repeater_key_prefix = get_str("REPEATER_KEY_PREFIX")
        self.repeater_password = get_str("REPEATER_PASSWORD")
        self.repeater_fetch_acl = get_bool("REPEATER_FETCH_ACL", False)

        # Intervals and timeouts
        self.companion_step = get_int("COMPANION_STEP", 60)
        self.repeater_step = get_int("REPEATER_STEP", 900)
        self.remote_timeout_s = get_int("REMOTE_TIMEOUT_S", 10)
        self.remote_retry_attempts = get_int("REMOTE_RETRY_ATTEMPTS", 2)
        self.remote_retry_backoff_s = get_int("REMOTE_RETRY_BACKOFF_S", 4)
        self.remote_cb_fails = get_int("REMOTE_CB_FAILS", 6)
        self.remote_cb_cooldown_s = get_int("REMOTE_CB_COOLDOWN_S", 3600)

        # Paths
        self.state_dir = get_path("STATE_DIR", "./data/state")
        self.out_dir = get_path("OUT_DIR", "./out")

        # Report location metadata
        self.report_location_name = get_str(
            "REPORT_LOCATION_NAME", "Your Location"
        )
        self.report_location_short = get_str(
            "REPORT_LOCATION_SHORT", "Your Location"
        )
        self.report_lat = get_float("REPORT_LAT", 0.0)
        self.report_lon = get_float("REPORT_LON", 0.0)
        self.report_elev = get_float("REPORT_ELEV", 0.0)
        self.report_elev_unit = get_str("REPORT_ELEV_UNIT", "m")  # "m" or "ft"

        # Node display names for UI
        self.repeater_display_name = get_str(
            "REPEATER_DISPLAY_NAME", "Repeater Node"
        )
        self.companion_display_name = get_str(
            "COMPANION_DISPLAY_NAME", "Companion Node"
        )

        # Public key prefixes for display (e.g., "!a1b2c3d4")
        self.repeater_pubkey_prefix = get_str("REPEATER_PUBKEY_PREFIX")
        self.companion_pubkey_prefix = get_str("COMPANION_PUBKEY_PREFIX")

        # Hardware info for sidebar
        self.repeater_hardware = get_str("REPEATER_HARDWARE", "LoRa Repeater")
        self.companion_hardware = get_str("COMPANION_HARDWARE", "LoRa Node")

        # Radio configuration (for display in sidebar)
        self.radio_frequency = get_str("RADIO_FREQUENCY", "869.618 MHz")
        self.radio_bandwidth = get_str("RADIO_BANDWIDTH", "62.5 kHz")
        self.radio_spread_factor = get_str("RADIO_SPREAD_FACTOR", "SF8")
        self.radio_coding_rate = get_str("RADIO_CODING_RATE", "CR8")


# Global config instance
_config: Optional[Config] = None


def get_config() -> Config:
    """Get or create the global config instance."""
    global _config
    if _config is None:
        _config = Config()
    return _config
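A deliberate property of these parsing helpers is that they never raise on bad input: a missing or malformed environment variable silently falls back to its default, so a typo in one variable cannot take the collector down. A standalone sketch of that behavior, with the two relevant helpers re-declared inline for illustration:

```python
import os
from typing import Optional

def get_int(key: str, default: int) -> int:
    # Same fallback shape as the module: missing or non-numeric -> default.
    val = os.environ.get(key)
    if val is None:
        return default
    try:
        return int(val)
    except ValueError:
        return default

def get_bool(key: str, default: bool = False) -> bool:
    # Accepts the same truthy spellings as the module: 1/true/yes/on.
    val = os.environ.get(key, "").lower()
    if not val:
        return default
    return val in ("1", "true", "yes", "on")

os.environ["COMPANION_STEP"] = "120"
os.environ["MESH_DEBUG"] = "yes"
os.environ["REPEATER_STEP"] = "often"   # malformed -> falls back to default

print(get_int("COMPANION_STEP", 60))    # 120
print(get_bool("MESH_DEBUG"))           # True
print(get_int("REPEATER_STEP", 900))    # 900
```

The trade-off is that configuration mistakes are invisible unless logged; the defaults are chosen to keep a monitoring loop safe rather than to flag operator error.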
158
src/meshmon/formatters.py
Normal file
@@ -0,0 +1,158 @@
"""Shared formatting functions for display values."""

from datetime import datetime
from typing import Any, Optional, Union

from .battery import voltage_to_percentage

Number = Union[int, float]


def format_time(ts: Optional[int]) -> str:
    """Format Unix timestamp to human readable string."""
    if ts is None:
        return "N/A"
    try:
        dt = datetime.fromtimestamp(ts)
        return dt.strftime("%Y-%m-%d %H:%M:%S")
    except (ValueError, OSError):
        return "N/A"


def format_value(value: Any) -> str:
    """Format a value for display."""
    if value is None:
        return "N/A"
    if isinstance(value, float):
        return f"{value:.2f}"
    return str(value)


def format_number(value: Optional[int]) -> str:
    """Format an integer with thousands separators."""
    if value is None:
        return "N/A"
    return f"{value:,}"


def format_duration(seconds: Optional[int]) -> str:
    """Format duration in seconds to human readable string (days, hours, minutes, seconds)."""
    if seconds is None:
        return "N/A"

    days = seconds // 86400
    hours = (seconds % 86400) // 3600
    mins = (seconds % 3600) // 60
    secs = seconds % 60

    parts = []
    if days > 0:
        parts.append(f"{days}d")
    if hours > 0 or days > 0:
        parts.append(f"{hours}h")
    if mins > 0 or hours > 0 or days > 0:
        parts.append(f"{mins}m")
    parts.append(f"{secs}s")

    return " ".join(parts)


def format_uptime(seconds: Optional[int]) -> str:
    """Format uptime seconds to human readable string (days, hours, minutes)."""
    if seconds is None:
        return "N/A"

    days = seconds // 86400
    hours = (seconds % 86400) // 3600
    mins = (seconds % 3600) // 60

    parts = []
    if days > 0:
        parts.append(f"{days}d")
    if hours > 0 or days > 0:
        parts.append(f"{hours}h")
    parts.append(f"{mins}m")

    return " ".join(parts)


def format_voltage_with_pct(mv: Optional[float]) -> str:
    """Format millivolts as voltage with battery percentage."""
    if mv is None:
        return "N/A"
    v = mv / 1000.0
    pct = voltage_to_percentage(v)
    return f"{v:.2f} V ({pct:.0f}%)"


def format_compact_number(value: Optional[Number], precision: int = 1) -> str:
    """Format a number using compact notation (k, M suffixes).

    Rules:
    - None: Returns "N/A"
    - < 1,000: Raw integer (847)
    - 1,000 - 9,999: Comma-separated (4,989)
    - 10,000 - 999,999: Compact with suffix (242.1k)
    - >= 1,000,000: Millions (1.5M)

    Args:
        value: The numeric value to format
        precision: Decimal places for compact notation (default: 1)

    Returns:
        Formatted string
    """
    if value is None:
        return "N/A"

    # Handle negative values
    if value < 0:
        return f"-{format_compact_number(abs(value), precision)}"

    if value >= 1_000_000:
        return f"{value / 1_000_000:.{precision}f}M"
    elif value >= 10_000:
        return f"{value / 1_000:.{precision}f}k"
    elif value >= 1_000:
        return f"{int(value):,}"
    else:
        return str(int(value))


def format_duration_compact(seconds: Optional[int]) -> str:
    """Format duration showing only the two most significant units.

    Uses truncation (floor), not rounding.

    Rules:
    - None: Returns "N/A"
    - 0: Returns "0s"
    - < 60s: Seconds only (45s)
    - < 1h: Minutes + seconds (45m 12s)
    - < 1d: Hours + minutes (19h 45m)
    - >= 1d: Days + hours (1d 20h)

    Args:
        seconds: Duration in seconds

    Returns:
        Formatted duration string
    """
    if seconds is None:
        return "N/A"
    if seconds == 0:
        return "0s"

    days = seconds // 86400
    hours = (seconds % 86400) // 3600
    mins = (seconds % 3600) // 60
    secs = seconds % 60

    if days > 0:
        return f"{days}d {hours}h"
    elif hours > 0:
        return f"{hours}h {mins}m"
    elif mins > 0:
        return f"{mins}m {secs}s"
    else:
        return f"{secs}s"
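The tiering rules in the compact formatters are easiest to verify at their boundaries. A self-contained sketch with both functions re-declared inline (copies of the logic shown in this file, for illustration only):

```python
from typing import Optional, Union

Number = Union[int, float]

def format_compact_number(value: Optional[Number], precision: int = 1) -> str:
    # Inline copy of the module's tiering rules.
    if value is None:
        return "N/A"
    if value < 0:
        return f"-{format_compact_number(abs(value), precision)}"
    if value >= 1_000_000:
        return f"{value / 1_000_000:.{precision}f}M"
    elif value >= 10_000:
        return f"{value / 1_000:.{precision}f}k"
    elif value >= 1_000:
        return f"{int(value):,}"
    return str(int(value))

def format_duration_compact(seconds: Optional[int]) -> str:
    # Two most significant units, truncated (floored), never rounded.
    if seconds is None:
        return "N/A"
    if seconds == 0:
        return "0s"
    days, hours = seconds // 86400, (seconds % 86400) // 3600
    mins, secs = (seconds % 3600) // 60, seconds % 60
    if days > 0:
        return f"{days}d {hours}h"
    if hours > 0:
        return f"{hours}h {mins}m"
    if mins > 0:
        return f"{mins}m {secs}s"
    return f"{secs}s"

print(format_compact_number(4989))        # 4,989
print(format_compact_number(242_100))     # 242.1k
print(format_duration_compact(158_459))   # 1d 20h (hours floored, not rounded)
```

Note the asymmetry: the 1,000-9,999 band keeps full precision with separators, while only 10,000 and above is abbreviated; this keeps small packet counts exact on the dashboard.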
1279
src/meshmon/html.py
Normal file
File diff suppressed because it is too large
31
src/meshmon/log.py
Normal file
@@ -0,0 +1,31 @@
"""Simple logging helper."""

import sys
from datetime import datetime

from .env import get_config


def _ts() -> str:
    """Get current timestamp string."""
    return datetime.now().strftime("%Y-%m-%d %H:%M:%S")


def info(msg: str) -> None:
    """Print info message to stdout."""
    print(f"[{_ts()}] {msg}")


def debug(msg: str) -> None:
    """Print debug message if MESH_DEBUG is enabled."""
    if get_config().mesh_debug:
        print(f"[{_ts()}] DEBUG: {msg}")


def error(msg: str) -> None:
    """Print error message to stderr."""
    print(f"[{_ts()}] ERROR: {msg}", file=sys.stderr)


def warn(msg: str) -> None:
    """Print warning message to stderr."""
    print(f"[{_ts()}] WARN: {msg}", file=sys.stderr)
250
src/meshmon/meshcore_client.py
Normal file
@@ -0,0 +1,250 @@
"""MeshCore client wrapper with safe command execution and contact lookup."""

import asyncio
from typing import Any, Callable, Coroutine, Optional

from . import log
from .env import get_config

# Try to import meshcore - will fail gracefully if not installed
try:
    from meshcore import MeshCore, EventType
    MESHCORE_AVAILABLE = True
except ImportError:
    MESHCORE_AVAILABLE = False
    MeshCore = None
    EventType = None


def auto_detect_serial_port() -> Optional[str]:
    """
    Auto-detect a suitable serial port for a MeshCore device.

    Prefers /dev/ttyACM* or /dev/ttyUSB* devices.
    """
    try:
        import serial.tools.list_ports
    except ImportError:
        log.error("pyserial not installed, cannot auto-detect serial port")
        return None

    ports = list(serial.tools.list_ports.comports())
    if not ports:
        log.error("No serial ports found")
        return None

    # Prefer ACM devices (CDC/ACM USB), then USB serial
    for port in ports:
        if "ttyACM" in port.device:
            log.info(f"Auto-detected serial port: {port.device} ({port.description})")
            return port.device

    for port in ports:
        if "ttyUSB" in port.device:
            log.info(f"Auto-detected serial port: {port.device} ({port.description})")
            return port.device

    # Fall back to first available
    port = ports[0]
    log.info(f"Using first available port: {port.device} ({port.description})")
    return port.device


async def connect_from_env() -> Optional[Any]:
    """
    Connect to a MeshCore device using environment configuration.

    Returns:
        MeshCore instance or None on failure
    """
    if not MESHCORE_AVAILABLE:
        log.error("meshcore library not available")
        return None

    cfg = get_config()
    transport = cfg.mesh_transport.lower()

    try:
        if transport == "serial":
            port = cfg.mesh_serial_port
            if not port:
                port = auto_detect_serial_port()
                if not port:
                    log.error("No serial port configured or detected")
                    return None

            log.debug(f"Connecting via serial: {port} @ {cfg.mesh_serial_baud}")
            mc = await MeshCore.create_serial(
                port, cfg.mesh_serial_baud, debug=cfg.mesh_debug
            )
            return mc

        elif transport == "tcp":
            log.debug(f"Connecting via TCP: {cfg.mesh_tcp_host}:{cfg.mesh_tcp_port}")
            mc = await MeshCore.create_tcp(cfg.mesh_tcp_host, cfg.mesh_tcp_port)
            return mc

        elif transport == "ble":
            if not cfg.mesh_ble_addr:
                log.error("MESH_BLE_ADDR required for BLE transport")
                return None
            log.debug(f"Connecting via BLE: {cfg.mesh_ble_addr}")
            mc = await MeshCore.create_ble(cfg.mesh_ble_addr, pin=cfg.mesh_ble_pin)
            return mc

        else:
            log.error(f"Unknown transport: {transport}")
            return None

    except Exception as e:
        log.error(f"Failed to connect: {e}")
        return None


async def run_command(
    mc: Any,
    cmd_coro: Coroutine,
    name: str,
) -> tuple[bool, Optional[str], Optional[dict], Optional[str]]:
    """
    Run a MeshCore command and capture the result.

    Args:
        mc: MeshCore instance
        cmd_coro: The command coroutine to execute
        name: Human-readable command name for logging

    Returns:
        (success, event_type_name, payload_dict, error_message)
    """
    if not MESHCORE_AVAILABLE:
        return (False, None, None, "meshcore not available")

    try:
        log.debug(f"Running command: {name}")
        event = await cmd_coro

        if event is None:
            return (False, None, None, "No response received")

        # Extract event type name
        event_type_name = None
        if hasattr(event, "type"):
            if hasattr(event.type, "name"):
                event_type_name = event.type.name
            else:
                event_type_name = str(event.type)

        # Check for error
        if EventType and hasattr(event, "type") and event.type == EventType.ERROR:
            error_msg = None
            if hasattr(event, "payload"):
                error_msg = str(event.payload)
            return (False, event_type_name, None, error_msg or "Command returned error")

        # Extract payload
        payload = None
        if hasattr(event, "payload"):
            payload = event.payload
            # Try to convert to dict if it's a custom object
            if payload is not None and not isinstance(payload, dict):
                if hasattr(payload, "__dict__"):
                    payload = vars(payload)
                elif hasattr(payload, "_asdict"):
                    payload = payload._asdict()
                else:
                    payload = {"raw": payload}

        log.debug(f"Command {name} returned: {event_type_name}")
        return (True, event_type_name, payload, None)

    except asyncio.TimeoutError:
        return (False, None, None, "Timeout")
    except Exception as e:
        return (False, None, None, str(e))


def get_contact_by_name(mc: Any, name: str) -> Optional[Any]:
    """
    Find a contact by advertised name.

    Note: This is a synchronous method on MeshCore.

    Args:
        mc: MeshCore instance
        name: The advertised name to search for

    Returns:
        Contact object or None
    """
    if not hasattr(mc, "get_contact_by_name"):
        log.warn("get_contact_by_name not available in meshcore")
        return None

    try:
        return mc.get_contact_by_name(name)
    except Exception as e:
        log.debug(f"get_contact_by_name failed: {e}")
        return None


def get_contact_by_key_prefix(mc: Any, prefix: str) -> Optional[Any]:
    """
    Find a contact by public key prefix.

    Note: This is a synchronous method on MeshCore.

    Args:
        mc: MeshCore instance
        prefix: Hex prefix of the public key

    Returns:
        Contact object or None
    """
    if not hasattr(mc, "get_contact_by_key_prefix"):
        log.warn("get_contact_by_key_prefix not available in meshcore")
        return None

    try:
        return mc.get_contact_by_key_prefix(prefix)
    except Exception as e:
        log.debug(f"get_contact_by_key_prefix failed: {e}")
        return None


def extract_contact_info(contact: Any) -> dict[str, Any]:
    """Extract useful info from a contact object or dict."""
    info = {}

    attrs = ["adv_name", "name", "pubkey_prefix", "public_key", "type", "flags"]

    # Handle dict contacts
    if isinstance(contact, dict):
        for attr in attrs:
            if attr in contact:
                val = contact[attr]
                if val is not None:
                    if isinstance(val, bytes):
                        info[attr] = val.hex()
                    else:
                        info[attr] = val
    else:
        # Handle object contacts
        for attr in attrs:
            if hasattr(contact, attr):
                val = getattr(contact, attr)
                if val is not None:
                    if isinstance(val, bytes):
                        info[attr] = val.hex()
                    else:
                        info[attr] = val

    return info


def list_contacts_summary(contacts: list) -> list[dict[str, Any]]:
    """Get summary of all contacts for debugging."""
    result = []
    for c in contacts:
        info = extract_contact_info(c)
        result.append(info)
    return result
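`extract_contact_info` normalizes whatever shape the meshcore library hands back: dicts and attribute-bearing objects are treated alike, and `bytes` fields (public keys) are hex-encoded for display. A standalone sketch with the dict/object branches re-declared inline and a hypothetical contact entry (field values here are invented for illustration):

```python
from typing import Any

def extract_contact_info(contact: Any) -> dict[str, Any]:
    # Inline copy of the normalization logic: same attrs, bytes -> hex.
    attrs = ["adv_name", "name", "pubkey_prefix", "public_key", "type", "flags"]
    info: dict[str, Any] = {}
    if isinstance(contact, dict):
        for attr in attrs:
            val = contact.get(attr)
            if val is not None:
                info[attr] = val.hex() if isinstance(val, bytes) else val
    else:
        for attr in attrs:
            val = getattr(contact, attr, None)
            if val is not None:
                info[attr] = val.hex() if isinstance(val, bytes) else val
    return info

# Hypothetical contact dict, as a firmware contact-list entry might look.
contact = {"adv_name": "Repeater-01", "public_key": b"\xa1\xb2\xc3\xd4", "type": 2}
print(extract_contact_info(contact))
# {'adv_name': 'Repeater-01', 'public_key': 'a1b2c3d4', 'type': 2}
```

Fields absent from the contact are simply omitted, which keeps the summary dicts safe to serialize for debugging regardless of firmware version.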
306
src/meshmon/metrics.py
Normal file
@@ -0,0 +1,306 @@
|
|||||||
|
"""Centralized metrics configuration.
|
||||||
|
|
||||||
|
This module defines metric display properties using firmware field names.
|
||||||
|
It is the single source of truth for:
|
||||||
|
- Metric type (gauge vs counter)
|
||||||
|
- Display labels and units
|
||||||
|
- Scaling factors for charts
|
||||||
|
- Which metrics to display per role
|
||||||
|
|
||||||
|
Firmware field names are used directly (e.g., 'bat', 'nb_recv', 'battery_mv').
|
||||||
|
See docs/firmware-responses.md for the complete field reference.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from dataclasses import dataclass
|
||||||
|
from typing import Optional
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True)
|
||||||
|
class MetricConfig:
|
||||||
|
"""Configuration for displaying a metric.
|
||||||
|
|
||||||
|
Attributes:
|
||||||
|
label: Human-readable label for charts/reports
|
||||||
|
unit: Display unit (e.g., 'V', 'dBm', '/min')
|
||||||
|
type: 'gauge' for instantaneous values, 'counter' for cumulative values
|
||||||
|
scale: Multiply raw value by this for display (e.g., 60 for per-minute)
|
||||||
|
transform: Optional transform to apply ('mv_to_v' for millivolts to volts)
|
||||||
|
"""
|
||||||
|
label: str
|
||||||
|
unit: str
|
||||||
|
type: str = "gauge"
|
||||||
|
scale: float = 1.0
|
||||||
|
transform: Optional[str] = None
|
||||||
|
|
||||||
|
|
||||||
|
# =============================================================================
|
||||||
|
# Metric Definitions (firmware field names)
|
||||||
|
# =============================================================================
|
||||||
|
|
||||||
|
METRIC_CONFIG: dict[str, MetricConfig] = {
|
||||||
|
# -------------------------------------------------------------------------
|
||||||
|
# Companion metrics (from get_stats_core, get_stats_packets, get_contacts)
|
||||||
|
# -------------------------------------------------------------------------
|
||||||
|
"battery_mv": MetricConfig(
|
||||||
|
label="Battery Voltage",
|
||||||
|
unit="V",
|
||||||
|
transform="mv_to_v",
|
||||||
|
),
|
||||||
|
"uptime_secs": MetricConfig(
|
||||||
|
label="System Uptime",
|
||||||
|
unit="days",
|
||||||
|
scale=1 / 86400,
|
||||||
|
),
|
||||||
|
"contacts": MetricConfig(
|
||||||
|
label="Known Contacts",
|
||||||
|
unit="",
|
||||||
|
),
|
||||||
|
"recv": MetricConfig(
|
||||||
|
label="Total Packets Received",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"sent": MetricConfig(
|
||||||
|
label="Total Packets Sent",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
|
||||||
|
# -------------------------------------------------------------------------
|
||||||
|
# Repeater metrics (from req_status_sync)
|
||||||
|
# -------------------------------------------------------------------------
|
||||||
|
"bat": MetricConfig(
|
||||||
|
label="Battery Voltage",
|
||||||
|
unit="V",
|
||||||
|
transform="mv_to_v",
|
||||||
|
),
|
||||||
|
"uptime": MetricConfig(
|
||||||
|
label="System Uptime",
|
||||||
|
unit="days",
|
||||||
|
scale=1 / 86400,
|
||||||
|
),
|
||||||
|
"last_rssi": MetricConfig(
|
||||||
|
label="Signal Strength (RSSI)",
|
||||||
|
unit="dBm",
|
||||||
|
),
|
||||||
|
"last_snr": MetricConfig(
|
||||||
|
label="Signal-to-Noise Ratio",
|
||||||
|
unit="dB",
|
||||||
|
),
|
||||||
|
"noise_floor": MetricConfig(
|
||||||
|
label="RF Noise Floor",
|
||||||
|
unit="dBm",
|
||||||
|
),
|
||||||
|
"tx_queue_len": MetricConfig(
|
||||||
|
label="Transmit Queue Depth",
|
||||||
|
unit="",
|
||||||
|
),
|
||||||
|
"nb_recv": MetricConfig(
|
||||||
|
label="Total Packets Received",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"nb_sent": MetricConfig(
|
||||||
|
label="Total Packets Sent",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"airtime": MetricConfig(
|
||||||
|
label="Transmit Airtime",
|
||||||
|
unit="s/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"rx_airtime": MetricConfig(
|
||||||
|
label="Receive Airtime",
|
||||||
|
unit="s/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"flood_dups": MetricConfig(
|
||||||
|
label="Flood Duplicates Dropped",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"direct_dups": MetricConfig(
|
||||||
|
label="Direct Duplicates Dropped",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"sent_flood": MetricConfig(
|
||||||
|
label="Flood Packets Sent",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"recv_flood": MetricConfig(
|
||||||
|
label="Flood Packets Received",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"sent_direct": MetricConfig(
|
||||||
|
label="Direct Packets Sent",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
"recv_direct": MetricConfig(
|
||||||
|
label="Direct Packets Received",
|
||||||
|
unit="/min",
|
||||||
|
type="counter",
|
||||||
|
scale=60,
|
||||||
|
),
|
||||||
|
|
||||||
|
# -------------------------------------------------------------------------
|
||||||
|
# Derived metrics (computed at query time, not stored in database)
|
||||||
|
# -------------------------------------------------------------------------
|
||||||
|
"bat_pct": MetricConfig(
|
||||||
|
label="Charge Level",
|
||||||
|
unit="%",
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# =============================================================================
|
||||||
|
# Metrics to display in charts (in display order)
|
||||||
|
# =============================================================================
|
||||||
|
|
||||||
|
COMPANION_CHART_METRICS = [
|
||||||
|
"battery_mv",
|
||||||
|
"bat_pct",
|
||||||
|
"uptime_secs",
|
||||||
|
"contacts",
|
||||||
|
"recv",
|
||||||
|
"sent",
|
||||||
|
]
|
||||||
|
|
||||||
|
REPEATER_CHART_METRICS = [
|
||||||
|
"bat",
|
||||||
|
"bat_pct",
|
||||||
|
"last_rssi",
|
||||||
|
"last_snr",
|
||||||
|
"noise_floor",
|
||||||
|
"uptime",
|
||||||
|
"tx_queue_len",
|
||||||
|
"nb_recv",
|
||||||
|
"nb_sent",
|
||||||
|
"airtime",
|
||||||
|
"rx_airtime",
|
||||||
|
"flood_dups",
|
||||||
|
"direct_dups",
|
||||||
|
"sent_flood",
|
||||||
|
"recv_flood",
|
||||||
|
"sent_direct",
|
||||||
|
"recv_direct",
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
# =============================================================================
|
||||||
# Helper functions
# =============================================================================


def get_chart_metrics(role: str) -> list[str]:
    """Get list of metrics to chart for a role.

    Args:
        role: 'companion' or 'repeater'

    Returns:
        List of metric names in display order
    """
    if role == "companion":
        return COMPANION_CHART_METRICS
    elif role == "repeater":
        return REPEATER_CHART_METRICS
    else:
        raise ValueError(f"Unknown role: {role}")


def get_metric_config(metric: str) -> Optional[MetricConfig]:
    """Get configuration for a metric.

    Args:
        metric: Firmware field name

    Returns:
        MetricConfig or None if metric is not configured
    """
    return METRIC_CONFIG.get(metric)


def is_counter_metric(metric: str) -> bool:
    """Check if a metric is a counter type.

    Counter metrics show rate of change (delta per time unit).
    Gauge metrics show instantaneous values.

    Args:
        metric: Firmware field name

    Returns:
        True if counter, False if gauge or unknown
    """
    config = METRIC_CONFIG.get(metric)
    return config is not None and config.type == "counter"


def get_graph_scale(metric: str) -> float:
    """Get the scaling factor for graphing a metric.

    Args:
        metric: Firmware field name

    Returns:
        Scale factor (1.0 if not configured)
    """
    config = METRIC_CONFIG.get(metric)
    return config.scale if config else 1.0


def get_metric_label(metric: str) -> str:
    """Get human-readable label for a metric.

    Args:
        metric: Firmware field name

    Returns:
        Display label or the metric name if not configured
    """
    config = METRIC_CONFIG.get(metric)
    return config.label if config else metric


def get_metric_unit(metric: str) -> str:
    """Get display unit for a metric.

    Args:
        metric: Firmware field name

    Returns:
        Unit string or empty string if not configured
    """
    config = METRIC_CONFIG.get(metric)
    return config.unit if config else ""


def transform_value(metric: str, value: float) -> float:
    """Apply any configured transform to a metric value.

    Args:
        metric: Firmware field name
        value: Raw value from database

    Returns:
        Transformed value for display
    """
    config = METRIC_CONFIG.get(metric)
    if config and config.transform == "mv_to_v":
        return value / 1000.0
    return value
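The helpers above all funnel through a `METRIC_CONFIG` lookup table defined elsewhere in the module. As a minimal, self-contained sketch of how that pattern composes (the `MetricConfig` fields shown here are inferred from the accessors above, not the module's actual definition):

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical minimal MetricConfig; the real module defines its own.
@dataclass
class MetricConfig:
    label: str
    unit: str
    type: str                     # "counter" or "gauge"
    scale: float = 1.0
    transform: Optional[str] = None


METRIC_CONFIG = {
    "battery_mv": MetricConfig("Voltage", "V", "gauge", transform="mv_to_v"),
    "nb_recv": MetricConfig("Received", "pkts", "counter"),
}


def transform_value(metric: str, value: float) -> float:
    """Millivolt readings are stored raw and converted for display."""
    config = METRIC_CONFIG.get(metric)
    if config and config.transform == "mv_to_v":
        return value / 1000.0
    return value


print(transform_value("battery_mv", 4012))  # millivolts -> volts
print(transform_value("nb_recv", 42))       # no transform configured
```

Unknown metric names fall through every helper gracefully (default label, empty unit, scale 1.0), which is what lets new firmware fields appear without code changes.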
41  src/meshmon/migrations/001_initial_schema.sql  Normal file
@@ -0,0 +1,41 @@
-- Migration 001: Initial schema
-- Creates companion_metrics and repeater_metrics tables

-- Companion metrics (6 metrics, ~525K rows/year at 60s intervals)
CREATE TABLE IF NOT EXISTS companion_metrics (
    ts INTEGER PRIMARY KEY NOT NULL,
    bat_v REAL,
    bat_pct REAL,
    contacts INTEGER,
    uptime INTEGER,
    rx INTEGER,
    tx INTEGER
) STRICT, WITHOUT ROWID;

-- Repeater metrics (17 metrics, ~35K rows/year at 15min intervals)
CREATE TABLE IF NOT EXISTS repeater_metrics (
    ts INTEGER PRIMARY KEY NOT NULL,
    bat_v REAL,
    bat_pct REAL,
    rssi INTEGER,
    snr REAL,
    uptime INTEGER,
    noise INTEGER,
    txq INTEGER,
    rx INTEGER,
    tx INTEGER,
    airtime INTEGER,
    rx_air INTEGER,
    fl_dups INTEGER,
    di_dups INTEGER,
    fl_tx INTEGER,
    fl_rx INTEGER,
    di_tx INTEGER,
    di_rx INTEGER
) STRICT, WITHOUT ROWID;

-- Schema metadata
CREATE TABLE IF NOT EXISTS db_meta (
    key TEXT PRIMARY KEY NOT NULL,
    value TEXT NOT NULL
) STRICT, WITHOUT ROWID;
97  src/meshmon/migrations/002_eav_schema.sql  Normal file
@@ -0,0 +1,97 @@
-- Migration 002: EAV metrics schema
-- Replaces wide companion_metrics and repeater_metrics tables with
-- a single flexible Entity-Attribute-Value pattern.
--
-- Benefits:
-- - New firmware fields automatically captured without schema changes
-- - Firmware field renames only require Python config updates
-- - Unified query interface for all metrics
--
-- Trade-offs:
-- - More rows (~3.75M/year vs ~560K/year)
-- - Queries filter by metric name instead of column access
-- - All values stored as REAL (no per-column type safety)

-- Create new EAV metrics table
CREATE TABLE metrics (
    ts INTEGER NOT NULL,
    role TEXT NOT NULL,      -- 'companion' or 'repeater'
    metric TEXT NOT NULL,    -- Firmware field name (e.g., 'bat', 'nb_recv')
    value REAL,              -- Metric value
    PRIMARY KEY (ts, role, metric)
) STRICT, WITHOUT ROWID;

-- Index for common query pattern: get all metrics for a role in time range
-- Primary key covers (ts, role, metric) but this helps when filtering by role first
CREATE INDEX idx_metrics_role_ts ON metrics(role, ts);

-- Migrate companion data to firmware field names
-- Note: bat_v stored as volts, convert back to millivolts (battery_mv)
INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'companion', 'battery_mv', bat_v * 1000 FROM companion_metrics WHERE bat_v IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'companion', 'uptime_secs', uptime FROM companion_metrics WHERE uptime IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'companion', 'contacts', contacts FROM companion_metrics WHERE contacts IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'companion', 'recv', rx FROM companion_metrics WHERE rx IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'companion', 'sent', tx FROM companion_metrics WHERE tx IS NOT NULL;

-- Migrate repeater data to firmware field names
-- Note: bat_v stored as volts, convert back to millivolts (bat)
INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'bat', bat_v * 1000 FROM repeater_metrics WHERE bat_v IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'uptime', uptime FROM repeater_metrics WHERE uptime IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'last_rssi', rssi FROM repeater_metrics WHERE rssi IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'last_snr', snr FROM repeater_metrics WHERE snr IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'noise_floor', noise FROM repeater_metrics WHERE noise IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'tx_queue_len', txq FROM repeater_metrics WHERE txq IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'nb_recv', rx FROM repeater_metrics WHERE rx IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'nb_sent', tx FROM repeater_metrics WHERE tx IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'airtime', airtime FROM repeater_metrics WHERE airtime IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'rx_airtime', rx_air FROM repeater_metrics WHERE rx_air IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'flood_dups', fl_dups FROM repeater_metrics WHERE fl_dups IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'direct_dups', di_dups FROM repeater_metrics WHERE di_dups IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'sent_flood', fl_tx FROM repeater_metrics WHERE fl_tx IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'recv_flood', fl_rx FROM repeater_metrics WHERE fl_rx IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'sent_direct', di_tx FROM repeater_metrics WHERE di_tx IS NOT NULL;

INSERT INTO metrics (ts, role, metric, value)
SELECT ts, 'repeater', 'recv_direct', di_rx FROM repeater_metrics WHERE di_rx IS NOT NULL;

-- Drop old wide tables
DROP TABLE companion_metrics;
DROP TABLE repeater_metrics;
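With the EAV schema above, every chart query filters on `(role, metric)` and a time range rather than selecting a dedicated column. A self-contained sketch of that access pattern against an in-memory SQLite database (`STRICT` is omitted here so the sketch also runs on SQLite builds older than 3.37; the migration itself uses `STRICT, WITHOUT ROWID`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE metrics (
    ts INTEGER NOT NULL,
    role TEXT NOT NULL,
    metric TEXT NOT NULL,
    value REAL,
    PRIMARY KEY (ts, role, metric)
) WITHOUT ROWID;
CREATE INDEX idx_metrics_role_ts ON metrics(role, ts);
""")

# A few made-up samples: two roles, mixed metrics, one row per (ts, role, metric)
rows = [
    (1000, "repeater", "bat", 4012.0),
    (1000, "repeater", "last_snr", 9.5),
    (1060, "repeater", "bat", 4008.0),
    (1060, "companion", "battery_mv", 3900.0),
]
conn.executemany(
    "INSERT INTO metrics (ts, role, metric, value) VALUES (?, ?, ?, ?)", rows
)

# The common chart query: one metric, one role, ordered over a time window
result = conn.execute(
    "SELECT ts, value FROM metrics"
    " WHERE role = ? AND metric = ? AND ts BETWEEN ? AND ?"
    " ORDER BY ts",
    ("repeater", "bat", 0, 2000),
).fetchall()
print(result)  # [(1000, 4012.0), (1060, 4008.0)]
```

Because the primary key leads with `ts`, the `idx_metrics_role_ts` index is what makes this role-first filter cheap.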
1210  src/meshmon/reports.py  Normal file
File diff suppressed because it is too large
126  src/meshmon/retry.py  Normal file
@@ -0,0 +1,126 @@
"""Retry logic and circuit breaker state management."""

import asyncio
import json
import time
from pathlib import Path
from typing import Any, Callable, Coroutine, Optional, TypeVar

from .env import get_config
from . import log

T = TypeVar("T")


class CircuitBreaker:
    """
    Simple circuit breaker for remote requests.
    State is persisted to JSON file.
    """

    def __init__(self, state_file: Path):
        self.state_file = state_file
        self.consecutive_failures = 0
        self.cooldown_until: float = 0
        self.last_success: float = 0
        self._load()

    def _load(self) -> None:
        """Load state from file."""
        if self.state_file.exists():
            try:
                data = json.loads(self.state_file.read_text())
                self.consecutive_failures = data.get("consecutive_failures", 0)
                self.cooldown_until = data.get("cooldown_until", 0)
                self.last_success = data.get("last_success", 0)
            except (json.JSONDecodeError, OSError) as e:
                log.warn(f"Failed to load circuit breaker state: {e}")

    def _save(self) -> None:
        """Save state to file."""
        self.state_file.parent.mkdir(parents=True, exist_ok=True)
        data = {
            "consecutive_failures": self.consecutive_failures,
            "cooldown_until": self.cooldown_until,
            "last_success": self.last_success,
        }
        self.state_file.write_text(json.dumps(data, indent=2), encoding="utf-8")

    def is_open(self) -> bool:
        """Check if circuit is open (in cooldown)."""
        return time.time() < self.cooldown_until

    def cooldown_remaining(self) -> int:
        """Return seconds remaining in cooldown, or 0 if not in cooldown."""
        remaining = self.cooldown_until - time.time()
        return max(0, int(remaining))

    def record_success(self) -> None:
        """Record a successful call."""
        self.consecutive_failures = 0
        self.last_success = time.time()
        self._save()

    def record_failure(self, max_failures: int, cooldown_s: int) -> None:
        """Record a failed call and potentially open the circuit."""
        self.consecutive_failures += 1
        if self.consecutive_failures >= max_failures:
            self.cooldown_until = time.time() + cooldown_s
            log.warn(
                f"Circuit breaker opened: {self.consecutive_failures} failures, "
                f"cooldown for {cooldown_s}s"
            )
        self._save()

    def to_dict(self) -> dict:
        """Return state as dict for snapshot."""
        return {
            "consecutive_failures": self.consecutive_failures,
            "cooldown_until": self.cooldown_until,
            "last_success": self.last_success,
            "is_open": self.is_open(),
            "cooldown_remaining_s": self.cooldown_remaining(),
        }


async def with_retries(
    fn: Callable[[], Coroutine[Any, Any, T]],
    attempts: int = 2,
    backoff_s: float = 4.0,
    name: str = "operation",
) -> tuple[bool, Optional[T], Optional[Exception]]:
    """
    Execute async function with retries.

    Args:
        fn: Async function to call
        attempts: Max number of attempts
        backoff_s: Seconds to wait between retries
        name: Name for logging

    Returns:
        (success, result, last_exception)
    """
    last_exception: Optional[Exception] = None

    for attempt in range(1, attempts + 1):
        try:
            result = await fn()
            if attempt > 1:
                log.info(f"{name}: succeeded on attempt {attempt}/{attempts}")
            return (True, result, None)
        except Exception as e:
            last_exception = e
            log.info(f"{name}: attempt {attempt}/{attempts} failed: {e}")
            if attempt < attempts:
                log.debug(f"{name}: retrying in {backoff_s}s...")
                await asyncio.sleep(backoff_s)

    return (False, None, last_exception)


def get_repeater_circuit_breaker() -> CircuitBreaker:
    """Get the circuit breaker for repeater requests."""
    cfg = get_config()
    state_file = cfg.state_dir / "repeater_circuit.json"
    return CircuitBreaker(state_file)
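The collector pairs `with_retries` with the circuit breaker: skip the call while the circuit is open, otherwise attempt it and record the outcome. A standalone sketch of that flow with the logging stripped out (the flaky `fetch` coroutine is a stand-in for the real MeshCore request, and the `asyncio.sleep(0.0)` backoff keeps the demo instant):

```python
import asyncio
from typing import Any, Callable, Coroutine, Optional, TypeVar

T = TypeVar("T")


async def with_retries(
    fn: Callable[[], Coroutine[Any, Any, T]],
    attempts: int = 2,
    backoff_s: float = 0.0,
) -> tuple[bool, Optional[T], Optional[Exception]]:
    """Condensed version of the module's with_retries, minus logging."""
    last_exception: Optional[Exception] = None
    for attempt in range(1, attempts + 1):
        try:
            return (True, await fn(), None)
        except Exception as e:
            last_exception = e
            if attempt < attempts:
                await asyncio.sleep(backoff_s)
    return (False, None, last_exception)


calls = 0


async def flaky_fetch() -> str:
    # Fails once, then succeeds -- simulates an unreliable LoRa link.
    global calls
    calls += 1
    if calls < 2:
        raise TimeoutError("no response")
    return "stats payload"


ok, result, err = asyncio.run(with_retries(flaky_fetch, attempts=3, backoff_s=0.0))
print(ok, result)  # True stats payload
```

In the real collector, a `(False, None, exc)` result feeds `CircuitBreaker.record_failure(...)`, and a success feeds `record_success()`, so repeated timeouts eventually open the circuit and suppress calls for the cooldown window.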
29  src/meshmon/templates/base.html  Normal file
@@ -0,0 +1,29 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>{{ title }} — MeshCore Stats</title>
    <meta name="description" content="{{ meta_description }}">

    <!-- Open Graph -->
    <meta property="og:title" content="{{ title }} — MeshCore Stats">
    <meta property="og:description" content="{{ meta_description }}">
    <meta property="og:type" content="website">
    <meta property="og:site_name" content="MeshCore Stats">
    {% if og_image %}<meta property="og:image" content="{{ og_image }}">{% endif %}

    <!-- Twitter Card -->
    <meta name="twitter:card" content="summary_large_image">
    <meta name="twitter:title" content="{{ title }} — MeshCore Stats">
    <meta name="twitter:description" content="{{ meta_description }}">

    <link rel="stylesheet" href="{{ css_path }}styles.css">
</head>
<body>
{% block body %}{% endblock %}

<!-- Chart tooltip enhancement (progressive) -->
<script src="{{ css_path }}chart-tooltip.js" defer></script>
</body>
</html>
328  src/meshmon/templates/chart-tooltip.js  Normal file
@@ -0,0 +1,328 @@
/**
 * Chart tooltip enhancement for MeshCore Stats
 *
 * Progressive enhancement: charts work fully without JS,
 * but this adds interactive tooltips on hover.
 */
(function() {
    'use strict';

    // Create tooltip element
    const tooltip = document.createElement('div');
    tooltip.className = 'chart-tooltip';
    tooltip.innerHTML = '<div class="tooltip-time"></div><div class="tooltip-value"></div>';
    document.body.appendChild(tooltip);

    const tooltipTime = tooltip.querySelector('.tooltip-time');
    const tooltipValue = tooltip.querySelector('.tooltip-value');

    // Track the current indicator element
    let currentIndicator = null;
    let currentSvg = null;

    // Metric display labels and units (using firmware field names)
    const metricLabels = {
        // Companion metrics
        'battery_mv': { label: 'Voltage', unit: 'V', decimals: 2 },
        'uptime_secs': { label: 'Uptime', unit: 'days', decimals: 2 },
        'contacts': { label: 'Contacts', unit: '', decimals: 0 },
        'recv': { label: 'Received', unit: '/min', decimals: 1 },
        'sent': { label: 'Sent', unit: '/min', decimals: 1 },

        // Repeater metrics
        'bat': { label: 'Voltage', unit: 'V', decimals: 2 },
        'bat_pct': { label: 'Charge', unit: '%', decimals: 0 },
        'uptime': { label: 'Uptime', unit: 'days', decimals: 2 },
        'last_rssi': { label: 'RSSI', unit: 'dBm', decimals: 0 },
        'last_snr': { label: 'SNR', unit: 'dB', decimals: 1 },
        'noise_floor': { label: 'Noise', unit: 'dBm', decimals: 0 },
        'tx_queue_len': { label: 'Queue', unit: '', decimals: 0 },
        'nb_recv': { label: 'Received', unit: '/min', decimals: 1 },
        'nb_sent': { label: 'Sent', unit: '/min', decimals: 1 },
        'airtime': { label: 'TX Air', unit: 's/min', decimals: 2 },
        'rx_airtime': { label: 'RX Air', unit: 's/min', decimals: 2 },
        'flood_dups': { label: 'Dropped', unit: '/min', decimals: 1 },
        'direct_dups': { label: 'Dropped', unit: '/min', decimals: 1 },
        'sent_flood': { label: 'Sent', unit: '/min', decimals: 1 },
        'recv_flood': { label: 'Received', unit: '/min', decimals: 1 },
        'sent_direct': { label: 'Sent', unit: '/min', decimals: 1 },
        'recv_direct': { label: 'Received', unit: '/min', decimals: 1 },
    };

    /**
     * Format a timestamp as a readable date/time string
     */
    function formatTime(ts, period) {
        const date = new Date(ts * 1000);
        const options = {
            month: 'short',
            day: 'numeric',
            hour: '2-digit',
            minute: '2-digit'
        };

        // For year view, include year
        if (period === 'year') {
            options.year = 'numeric';
        }

        return date.toLocaleString(undefined, options);
    }

    /**
     * Format a value with appropriate decimals and unit
     */
    function formatValue(value, metric) {
        const config = metricLabels[metric] || { label: metric, unit: '', decimals: 2 };
        const formatted = value.toFixed(config.decimals);
        return `${formatted}${config.unit ? ' ' + config.unit : ''}`;
    }

    /**
     * Find the closest data point to a timestamp, returning index too
     */
    function findClosestPoint(dataPoints, targetTs) {
        if (!dataPoints || dataPoints.length === 0) return null;

        let closestIdx = 0;
        let minDiff = Math.abs(dataPoints[0].ts - targetTs);

        for (let i = 1; i < dataPoints.length; i++) {
            const diff = Math.abs(dataPoints[i].ts - targetTs);
            if (diff < minDiff) {
                minDiff = diff;
                closestIdx = i;
            }
        }

        return { point: dataPoints[closestIdx], index: closestIdx };
    }

    /**
     * Create or get the indicator circle for an SVG
     */
    function getIndicator(svg) {
        if (currentSvg === svg && currentIndicator) {
            return currentIndicator;
        }

        // Remove old indicator if switching charts
        if (currentIndicator && currentIndicator.parentNode) {
            currentIndicator.parentNode.removeChild(currentIndicator);
        }

        // Create new indicator as an SVG circle
        const indicator = document.createElementNS('http://www.w3.org/2000/svg', 'circle');
        indicator.setAttribute('r', '5');
        indicator.setAttribute('class', 'chart-indicator');
        indicator.style.pointerEvents = 'none';

        // Get theme from SVG data attribute for color
        const theme = svg.dataset.theme;
        if (theme === 'dark') {
            indicator.setAttribute('fill', '#f59e0b');
            indicator.setAttribute('stroke', '#0f1114');
        } else {
            indicator.setAttribute('fill', '#b45309');
            indicator.setAttribute('stroke', '#ffffff');
        }
        indicator.setAttribute('stroke-width', '2');

        svg.appendChild(indicator);
        currentIndicator = indicator;
        currentSvg = svg;

        return indicator;
    }

    /**
     * Hide and clean up the indicator
     */
    function hideIndicator() {
        if (currentIndicator) {
            currentIndicator.style.display = 'none';
        }
    }

    /**
     * Position tooltip near the mouse cursor
     */
    function positionTooltip(event) {
        const offset = 15;
        let left = event.pageX + offset;
        let top = event.pageY + offset;

        // Keep tooltip on screen
        const rect = tooltip.getBoundingClientRect();
        const viewportWidth = window.innerWidth;
        const viewportHeight = window.innerHeight;

        if (left + rect.width > viewportWidth - 10) {
            left = event.pageX - rect.width - offset;
        }
        if (top + rect.height > viewportHeight - 10) {
            top = event.pageY - rect.height - offset;
        }

        tooltip.style.left = left + 'px';
        tooltip.style.top = top + 'px';
    }

    /**
     * Handle mouse move over chart SVG
     */
    function handleMouseMove(event) {
        const svg = event.currentTarget;
        const metric = svg.dataset.metric;
        const period = svg.dataset.period;
        const xStart = parseInt(svg.dataset.xStart, 10);
        const xEnd = parseInt(svg.dataset.xEnd, 10);
        const yMin = parseFloat(svg.dataset.yMin);
        const yMax = parseFloat(svg.dataset.yMax);

        // Find the path with data-points
        const path = svg.querySelector('path[data-points]');
        if (!path) return;

        // Parse and cache data points and path coordinates on first access
        if (!path._dataPoints) {
            try {
                const json = path.dataset.points.replace(/&quot;/g, '"');
                path._dataPoints = JSON.parse(json);
            } catch (e) {
                console.warn('Failed to parse chart data:', e);
                return;
            }
        }

        // Cache the path's bounding box for coordinate mapping
        if (!path._pathBox) {
            path._pathBox = path.getBBox();
        }

        const pathBox = path._pathBox;

        // Get mouse position in SVG coordinate space
        const svgRect = svg.getBoundingClientRect();
        const viewBox = svg.viewBox.baseVal;

        // Convert screen X coordinate to SVG coordinate
        const scaleX = viewBox.width / svgRect.width;
        const svgX = (event.clientX - svgRect.left) * scaleX + viewBox.x;

        // Calculate relative X position within the plot area (pathBox)
        const relX = (svgX - pathBox.x) / pathBox.width;

        // Clamp to plot area bounds
        const clampedRelX = Math.max(0, Math.min(1, relX));

        // Map relative X position to timestamp using the chart's X-axis range
        const targetTs = xStart + clampedRelX * (xEnd - xStart);

        // Find closest data point by timestamp
        const result = findClosestPoint(path._dataPoints, targetTs);
        if (!result) return;

        const { point } = result;

        // Update tooltip content
        tooltipTime.textContent = formatTime(point.ts, period);
        tooltipValue.textContent = formatValue(point.v, metric);

        // Position and show tooltip
        positionTooltip(event);
        tooltip.classList.add('visible');

        // Position the indicator at the data point
        const indicator = getIndicator(svg);

        // Calculate X position: map timestamp to path coordinate space
        const pointRelX = (point.ts - xStart) / (xEnd - xStart);
        const indicatorX = pathBox.x + pointRelX * pathBox.width;

        // Calculate Y position using the actual Y-axis range from the chart
        const ySpan = yMax - yMin || 1;
        // Y is inverted in SVG (0 at top)
        const pointRelY = 1 - (point.v - yMin) / ySpan;
        const indicatorY = pathBox.y + pointRelY * pathBox.height;

        indicator.setAttribute('cx', indicatorX);
        indicator.setAttribute('cy', indicatorY);
        indicator.style.display = '';
    }

    /**
     * Hide tooltip when leaving chart
     */
    function handleMouseLeave() {
        tooltip.classList.remove('visible');
        hideIndicator();
    }

    /**
     * Handle touch events for mobile
     */
    function handleTouchStart(event) {
        // Convert touch to mouse-like event
        const touch = event.touches[0];
        const mouseEvent = {
            currentTarget: event.currentTarget,
            clientX: touch.clientX,
            clientY: touch.clientY,
            pageX: touch.pageX,
            pageY: touch.pageY
        };

        handleMouseMove(mouseEvent);
    }

    function handleTouchMove(event) {
        const touch = event.touches[0];
        const mouseEvent = {
            currentTarget: event.currentTarget,
            clientX: touch.clientX,
            clientY: touch.clientY,
            pageX: touch.pageX,
            pageY: touch.pageY
        };

        handleMouseMove(mouseEvent);
    }

    function handleTouchEnd() {
        handleMouseLeave();
    }

    /**
     * Initialize tooltips for all chart SVGs
     */
    function initTooltips() {
        // Find all chart SVGs with data attributes
        const chartSvgs = document.querySelectorAll('svg[data-metric][data-period]');

        chartSvgs.forEach(function(svg) {
            // Mouse events for desktop
            svg.addEventListener('mousemove', handleMouseMove);
            svg.addEventListener('mouseleave', handleMouseLeave);

            // Touch events for mobile
            svg.addEventListener('touchstart', handleTouchStart, { passive: true });
            svg.addEventListener('touchmove', handleTouchMove, { passive: true });
            svg.addEventListener('touchend', handleTouchEnd);
            svg.addEventListener('touchcancel', handleTouchEnd);

            // Set cursor to indicate interactivity
            svg.style.cursor = 'crosshair';

            // Allow vertical scrolling but prevent horizontal pan on mobile
            svg.style.touchAction = 'pan-y';
        });
    }

    // Initialize when DOM is ready
    if (document.readyState === 'loading') {
        document.addEventListener('DOMContentLoaded', initTooltips);
    } else {
        initTooltips();
    }
})();
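The mouse handler above maps a cursor to a data point in two linear steps: relative X within the plot box maps to a timestamp, then the nearest sample is picked by timestamp distance. The same arithmetic in a standalone Python sketch (all numbers here are made up for illustration):

```python
def find_closest_index(points, target_ts):
    """Index of the sample whose ts is nearest target_ts (linear scan,
    mirroring findClosestPoint in chart-tooltip.js)."""
    if not points:
        return None
    return min(range(len(points)), key=lambda i: abs(points[i]["ts"] - target_ts))


# Step 1: relative X in the plot area -> timestamp
# ts = x_start + rel_x * (x_end - x_start)
x_start, x_end = 1000, 2000
rel_x = 0.25                                  # cursor 25% into the plot box
target = x_start + rel_x * (x_end - x_start)  # 1250.0

# Step 2: nearest sample by timestamp
pts = [{"ts": 1000, "v": 1.0}, {"ts": 1300, "v": 2.0}, {"ts": 1900, "v": 3.0}]
idx = find_closest_index(pts, target)
print(pts[idx])  # {'ts': 1300, 'v': 2.0}
```

The indicator dot then runs the inverse mapping (timestamp back to X, value to inverted Y) to land on the drawn line, which is why the chart only needs its axis ranges as `data-*` attributes rather than per-pixel hit data.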
5  src/meshmon/templates/credit.html  Normal file
@@ -0,0 +1,5 @@
<footer class="site-credit">
    <span>Built by <a href="https://jorijn.com" target="_blank" rel="noopener">Jorijn</a></span>
    <span class="separator" aria-hidden="true">·</span>
    <a href="https://github.com/jorijn/meshcore-stats" target="_blank" rel="noopener">Source</a>
</footer>
166  src/meshmon/templates/node.html  Normal file
@@ -0,0 +1,166 @@
{% extends "base.html" %}
{% block body %}
<div class="layout">
  <!-- Sidebar - Instrument Panel -->
  <aside class="sidebar">
    <header class="site-header">
      <div class="site-title">MeshCore Stats</div>
      <div class="site-subtitle">LoRa Mesh Observatory</div>
    </header>

    <!-- Node Selector -->
    <nav class="node-selector">
      <a href="{{ repeater_link }}"{% if role == 'repeater' %} class="active"{% endif %}>Repeater</a>
      <a href="{{ companion_link }}"{% if role == 'companion' %} class="active"{% endif %}>Companion</a>
    </nav>

    <!-- Node Info Card -->
    <div class="node-info">
      <div class="node-header">
        <div>
          <div class="node-name">{{ node_name }}</div>
          {% if pubkey_pre %}
          <div class="node-id">{{ pubkey_pre }}</div>
          {% endif %}
        </div>
        <span class="status-badge {{ status_class }}">{{ status_text }}</span>
      </div>

      <!-- Critical Metrics -->
      <div class="critical-metrics">
        {% for m in critical_metrics %}
        <div class="metric">
          <div class="metric-value">{{ m.value }}{% if m.unit %}<span class="unit">{{ m.unit }}</span>{% endif %}</div>
          <div class="metric-label">{{ m.label }}</div>
        </div>
        {% endfor %}
      </div>

      {% if secondary_metrics %}
      <!-- Secondary Metrics -->
      <div class="secondary-metrics">
        {% for m in secondary_metrics %}
        <div class="secondary-metric">
          <span class="label">{{ m.label }}</span>
          <span class="value">{{ m.value }}</span>
        </div>
        {% endfor %}
      </div>
      {% endif %}

      {% if traffic_table_rows %}
      <!-- Traffic Metrics Table -->
      <table class="traffic-table">
        <thead>
          <tr>
            <th scope="col"></th>
            <th scope="col">RX</th>
            <th scope="col">TX</th>
          </tr>
        </thead>
        <tbody>
          {% for row in traffic_table_rows %}
          <tr>
            <th scope="row">{{ row.label }}</th>
            <td{% if row.rx_raw %} title="{{ row.rx_raw | format_number }} {{ row.unit }}"{% endif %}>{{ row.rx if row.rx else '—' }}</td>
            <td{% if row.tx_raw %} title="{{ row.tx_raw | format_number }} {{ row.unit }}"{% endif %}>{{ row.tx if row.tx else '—' }}</td>
          </tr>
          {% endfor %}
        </tbody>
      </table>
      {% endif %}

      {% if node_details %}
      <!-- Node Details -->
      <div class="node-details">
        {% for d in node_details %}
        <div class="node-details-row">
          <span class="label">{{ d.label }}</span>
          <span class="value">{{ d.value }}</span>
        </div>
        {% endfor %}
      </div>
      {% endif %}

      {% if radio_config %}
      <!-- Radio Config -->
      <dl class="radio-config">
        {% for r in radio_config %}
        <dt>{{ r.label }}</dt>
        <dd>{{ r.value }}</dd>
        {% endfor %}
      </dl>
      {% endif %}
    </div>

    <!-- Last Updated -->
    <div class="last-updated">
      Last observation
      {% if last_updated %}
      <time datetime="{{ last_updated_iso }}">{{ last_updated }}</time>
      {% else %}
      <time>N/A</time>
      {% endif %}
    </div>

    <!-- Footer -->
    <footer class="site-footer">
      <a href="{{ reports_link }}">View Reports Archive</a>
    </footer>
  </aside>

  <!-- Main Content - Charts -->
  <main class="main-content">
    <!-- Period Navigation -->
    <nav class="period-nav">
      <a href="{{ base_path }}/day.html"{% if period == 'day' %} class="active"{% endif %}>Day</a>
      <a href="{{ base_path }}/week.html"{% if period == 'week' %} class="active"{% endif %}>Week</a>
      <a href="{{ base_path }}/month.html"{% if period == 'month' %} class="active"{% endif %}>Month</a>
      <a href="{{ base_path }}/year.html"{% if period == 'year' %} class="active"{% endif %}>Year</a>
    </nav>

    <header class="page-header">
      <h1 class="page-title">{{ page_title }}</h1>
      <p class="page-subtitle">{{ page_subtitle }}</p>
    </header>

    {% for group in chart_groups %}
    <section class="chart-group">
      <h2 class="chart-group-title">{{ group.title }}</h2>
      <div class="charts-grid">
        {% for chart in group.charts %}
        <article class="chart-card" data-metric="{{ chart.metric }}">
          <header class="chart-header">
            <h3 class="chart-title">{{ chart.label }}</h3>
            {% if chart.current %}
            <span class="chart-current">{{ chart.current }}</span>
            {% endif %}
          </header>
          {% if chart.use_svg %}
          <div class="chart-svg-container">
            <div class="chart-svg light-theme">{{ chart.svg_light | safe }}</div>
            <div class="chart-svg dark-theme">{{ chart.svg_dark | safe }}</div>
          </div>
          {% else %}
          <picture>
            <source srcset="{{ chart.src_dark }}" media="(prefers-color-scheme: dark)">
            <img src="{{ chart.src_light }}" alt="{{ chart.label }} over the past {{ period }}" class="chart-image" loading="lazy">
          </picture>
          {% endif %}
          {% if chart.stats %}
          <footer class="chart-footer">
            {% for stat in chart.stats %}
            <span class="chart-stat"><span class="label">{{ stat.label }}</span> <span class="value">{{ stat.value }}</span></span>
|
||||||
|
{% endfor %}
|
||||||
|
</footer>
|
||||||
|
{% endif %}
|
||||||
|
</article>
|
||||||
|
{% endfor %}
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
{% endfor %}
|
||||||
|
|
||||||
|
{% include "credit.html" %}
|
||||||
|
</main>
|
||||||
|
</div>
|
||||||
|
{% endblock %}
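The nested `chart_groups` loop above drives the entire dashboard grid. As a quick sanity check of the context shape it expects, here is a minimal Jinja2 sketch rendering a simplified version of that loop; the `title`/`charts`/`label` keys mirror the template, but the sample data is invented and the real context built by meshcore-stats will carry more fields.

```python
# Minimal sketch of the dashboard's chart-group loop, rendered with Jinja2.
# The context shape (chart_groups -> charts -> label) is inferred from the
# template; actual keys in the project may differ.
from jinja2 import Environment

env = Environment(autoescape=True)
template = env.from_string(
    "{% for group in chart_groups %}"
    "{{ group.title }}:"
    "{% for chart in group.charts %} {{ chart.label }}{% endfor %};"
    "{% endfor %}"
)
context = {
    "chart_groups": [
        {"title": "Radio", "charts": [{"label": "RSSI"}, {"label": "SNR"}]},
        {"title": "Power", "charts": [{"label": "Battery"}]},
    ]
}
print(template.render(context))  # Radio: RSSI SNR;Power: Battery;
```

An empty `charts` list simply renders the group title with no cards, so no extra guard is needed in the template.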
98  src/meshmon/templates/report.html  Normal file
@@ -0,0 +1,98 @@
{% extends "base.html" %}

{% block body %}
<div class="reports-layout">
  <header class="reports-header">
    <nav class="reports-breadcrumb">
      <a href="{{ css_path }}day.html">Dashboard</a>
      <span>/</span>
      <a href="{{ css_path }}reports/">Reports</a>
      <span>/</span>
      {% if report_type == 'monthly' %}
      <a href="{{ css_path }}reports/{{ role }}/{{ year }}/">{{ year }}</a>
      <span>/</span>
      <span>{{ month_name }}</span>
      {% else %}
      <span>{{ year }}</span>
      {% endif %}
    </nav>
    <h1 class="reports-title">{{ report_title }}</h1>
    <p class="reports-subtitle">{{ report_subtitle }}</p>
  </header>

  <!-- Location Info -->
  <dl class="location-info">
    <dt>Node</dt>
    <dd>{{ node_name }}</dd>
    <dt>Location</dt>
    <dd>{{ location_name }}</dd>
    <dt>Coordinates</dt>
    <dd>{{ coords_str }} • Elevation {{ elev }}m</dd>
  </dl>

  <!-- Data Table -->
  <div class="data-table-wrapper">
    <table class="data-table">
      <thead>
        {% if col_groups %}
        <tr class="col-group-header">
          {% for cg in col_groups %}
          <th{% if cg.colspan > 1 %} colspan="{{ cg.colspan }}"{% endif %}>{{ cg.label }}</th>
          {% endfor %}
        </tr>
        {% endif %}
        <tr>
          {% for header in table_headers %}
          <th{% if header.tooltip %} title="{{ header.tooltip }}"{% endif %}>{{ header.label }}</th>
          {% endfor %}
        </tr>
      </thead>
      <tbody>
        {% for row in table_rows %}
        {% if not row.is_summary %}
        <tr>
          {% for cell in row.cells %}
          <td{% if cell.class %} class="{{ cell.class }}"{% endif %}>{{ cell.value|safe }}</td>
          {% endfor %}
        </tr>
        {% endif %}
        {% endfor %}
      </tbody>
      <tfoot>
        {% for row in table_rows %}
        {% if row.is_summary %}
        <tr>
          {% for cell in row.cells %}
          <td{% if cell.class %} class="{{ cell.class }}"{% endif %}>{{ cell.value|safe }}</td>
          {% endfor %}
        </tr>
        {% endif %}
        {% endfor %}
      </tfoot>
    </table>
  </div>

  {% if report_type == 'yearly' and monthly_links %}
  <!-- Monthly links -->
  <section style="margin-top: var(--space-2xl);">
    <h3 style="font-size: var(--text-sm); font-weight: 500; color: var(--text-tertiary); text-transform: uppercase; letter-spacing: 0.1em; margin-bottom: var(--space-md);">Monthly Reports</h3>
    <div style="display: flex; flex-wrap: wrap; gap: var(--space-sm);">
      {% for link in monthly_links %}
      <a href="{{ link.url }}" class="archive-month">{{ link.label }}</a>
      {% endfor %}
    </div>
  </section>
  {% endif %}

  <!-- Download links -->
  <div style="margin-top: var(--space-xl); display: flex; gap: var(--space-lg); font-size: var(--text-sm);">
    <a href="report.txt" style="color: var(--text-secondary); text-decoration: none;" download="{{ download_prefix }}.txt">Download TXT</a>
    <a href="report.json" style="color: var(--text-secondary); text-decoration: none;" download="{{ download_prefix }}.json">Download JSON</a>
  </div>

  <footer class="site-footer" style="margin-top: var(--space-3xl);">
    <a href="{{ css_path }}reports/">Back to Reports Archive</a>
  </footer>

  {% include "credit.html" %}
</div>
{% endblock %}
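report.html iterates `table_rows` twice so that summary rows land in `<tfoot>` while the remaining rows fill `<tbody>`. The same split can be sketched as a single-pass partition in Python; the `is_summary` flag and `cells` key come from the template, but the sample rows are invented.

```python
# Sketch of the tbody/tfoot split the report template performs with two
# loops over table_rows: non-summary rows go to the body, summary rows to
# the footer. Row dicts here are illustrative, not the project's real shape.
def partition_rows(table_rows):
    body, footer = [], []
    for row in table_rows:
        (footer if row.get("is_summary") else body).append(row)
    return body, footer

rows = [
    {"cells": ["Jan", "1.2"], "is_summary": False},
    {"cells": ["Feb", "0.9"], "is_summary": False},
    {"cells": ["Total", "2.1"], "is_summary": True},
]
body, footer = partition_rows(rows)
print(len(body), len(footer))  # 2 1
```

Doing the partition in Python before rendering would let the template loop once per section instead of filtering inside both loops; the double loop keeps the template self-contained at the cost of a second pass.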
51  src/meshmon/templates/report_index.html  Normal file
@@ -0,0 +1,51 @@
{% extends "base.html" %}

{% block body %}
<div class="reports-layout">
  <header class="reports-header">
    <nav class="reports-breadcrumb">
      <a href="{{ css_path }}day.html">Dashboard</a>
      <span>/</span>
      <span>Reports</span>
    </nav>
    <h1 class="reports-title">Reports Archive</h1>
    <p class="reports-subtitle">Monthly and yearly statistics for the MeshCore network</p>
  </header>

  {% for section in report_sections %}
  <section style="margin-bottom: var(--space-3xl);">
    <h2 style="font-family: 'Instrument Serif', Georgia, serif; font-size: var(--text-xl); margin-bottom: var(--space-lg);">{{ section.role | capitalize }}</h2>
    <p style="font-size: var(--text-sm); color: var(--text-secondary); margin-bottom: var(--space-lg);">
      {{ section.description }}
    </p>

    {% if section.years %}
    <div class="archive-list">
      {% for year_data in section.years %}
      <article class="archive-year">
        <h3 class="archive-year-title"><a href="{{ css_path }}reports/{{ section.role }}/{{ year_data.year }}/">{{ year_data.year }}</a></h3>
        <div class="archive-months">
          {% for month_num in range(1, 13) %}
          {% set month_data = year_data.months | selectattr('month', 'equalto', month_num) | list | first %}
          {% if month_data %}
          <a href="{{ css_path }}reports/{{ section.role }}/{{ year_data.year }}/{{ '%02d' | format(month_num) }}/" class="archive-month">{{ month_abbrs[month_num] }}</a>
          {% else %}
          <span class="archive-month empty">{{ month_abbrs[month_num] }}</span>
          {% endif %}
          {% endfor %}
        </div>
      </article>
      {% endfor %}
    </div>
    {% else %}
    <p style="font-size: var(--text-sm); color: var(--text-muted);">No reports available yet.</p>
    {% endif %}
  </section>
  {% endfor %}

  <footer class="site-footer" style="margin-top: var(--space-3xl);">
    <a href="{{ css_path }}day.html">Back to Dashboard</a>
  </footer>

  {% include "credit.html" %}
</div>
{% endblock %}
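The archive grid in report_index.html decides per month whether to link to a report or render an empty placeholder via `selectattr('month', 'equalto', month_num) | list | first`. A rough Python equivalent of that lookup, for readers less familiar with Jinja's filter chain (the dict shape is assumed, not taken from the project; where Jinja's `first` yields an undefined value for an empty list, this returns `None`):

```python
# Python equivalent of the template's month lookup:
#   year_data.months | selectattr('month', 'equalto', month_num) | list | first
# Returns the first entry for that month, or None when no report exists.
def find_month(months, month_num):
    return next((m for m in months if m["month"] == month_num), None)

# Hypothetical month entries; the real context likely carries more keys.
months = [{"month": 1, "url": "01/"}, {"month": 3, "url": "03/"}]
print(find_month(months, 3))  # {'month': 3, 'url': '03/'}
print(find_month(months, 2))  # None
```

Always iterating `range(1, 13)` and checking for a hit is what lets the grid show all twelve month slots, with missing months greyed out rather than omitted.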
1107  src/meshmon/templates/styles.css  Normal file
File diff suppressed because it is too large.