Merge remote-tracking branch 'origin/dev-v3' into dev-v3

Pablo Revilla committed 2025-11-21 13:47:37 -08:00
6 changed files with 318 additions and 9 deletions

AGENTS.md (new file, 204 lines)

@@ -0,0 +1,204 @@
# AI Agent Guidelines for Meshview
This document provides context and guidelines for AI coding assistants working on the Meshview project.
## Project Overview
Meshview is a real-time monitoring and diagnostic tool for Meshtastic mesh networks. It provides web-based visualization and analysis of network activity, including:
- Real-time packet monitoring from MQTT streams
- Interactive map visualization of node locations
- Network topology graphs showing connectivity
- Message traffic analysis and conversation tracking
- Node statistics and telemetry data
- Packet inspection and traceroute analysis
## Architecture
### Core Components
1. **MQTT Reader** (`meshview/mqtt_reader.py`) - Subscribes to MQTT topics and receives mesh packets
2. **Database Manager** (`meshview/database.py`, `startdb.py`) - Handles database initialization and migrations
3. **MQTT Store** (`meshview/mqtt_store.py`) - Processes and stores packets in the database
4. **Web Server** (`meshview/web.py`, `main.py`) - Serves the web interface and API endpoints
5. **API Layer** (`meshview/web_api/api.py`) - REST API endpoints for data access
6. **Models** (`meshview/models.py`) - SQLAlchemy database models
7. **Decode Payload** (`meshview/decode_payload.py`) - Protobuf message decoding
### Technology Stack
- **Python 3.13+** - Main language
- **aiohttp** - Async web framework
- **aiomqtt** - Async MQTT client
- **SQLAlchemy (async)** - ORM with async support
- **Alembic** - Database migrations
- **Jinja2** - Template engine
- **Protobuf** - Message serialization (Meshtastic protocol)
- **SQLite/PostgreSQL** - Database backends (SQLite default, PostgreSQL via asyncpg)
### Key Patterns
- **Async/Await** - All I/O operations are asynchronous
- **Database Migrations** - Use Alembic for schema changes (see `docs/Database-Changes-With-Alembic.md`)
- **Configuration** - INI file-based config (`config.ini`, see `sample.config.ini`)
- **Modular API** - API routes separated into `meshview/web_api/` module
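The INI configuration pattern can be sketched with the stdlib `configparser` (the section and key names below are illustrative only; see `sample.config.ini` for the real schema):

```python
import configparser

# Hypothetical sections/keys for illustration -- consult sample.config.ini
# for the actual configuration schema used by meshview.
config = configparser.ConfigParser()
config.read_string("""
[server]
port = 8081

[mqtt]
topic = msh/US/#
""")

port = config.getint("server", "port")  # typed accessor
topic = config.get("mqtt", "topic")
```

In the real project the file path comes from the `--config` argument (default `config.ini`) rather than an inline string.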
## Project Structure
```
meshview/
├── alembic/ # Database migration scripts
├── docs/ # Technical documentation
├── meshview/ # Main application package
│ ├── static/ # Static web assets (HTML, JS, CSS)
│ ├── templates/ # Jinja2 HTML templates
│ ├── web_api/ # API route handlers
│ └── *.py # Core modules
├── main.py # Web server entry point
├── startdb.py # Database manager entry point
├── mvrun.py # Combined runner (starts both services)
├── config.ini # Runtime configuration
└── requirements.txt # Python dependencies
```
## Development Workflow
### Setup
1. Use Python 3.13+ virtual environment
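A plausible one-time setup, assuming the `./env` virtual-environment location that the run commands below expect:

```shell
# Create the virtual environment the run commands below expect.
python3 -m venv env
# Install dependencies into it (requirements.txt lives at the repo root).
if [ -f requirements.txt ]; then
    env/bin/pip install -r requirements.txt
fi
```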
### Running
- **Database**: `./env/bin/python startdb.py`
- **Web Server**: `./env/bin/python main.py`
- **Both**: `./env/bin/python mvrun.py`
## Code Style
- **Line length**: 100 characters (see `pyproject.toml`)
- **Linting**: Ruff (configured in `pyproject.toml`)
- **Formatting**: Ruff formatter
- **Type hints**: Preferred but not strictly required
- **Async**: Use `async def` and `await` for I/O operations
## Important Files
### Configuration
- `config.ini` - Runtime configuration (server, MQTT, database, cleanup)
- `sample.config.ini` - Template configuration file
- `alembic.ini` - Alembic migration configuration
### Database
- `meshview/models.py` - SQLAlchemy models (Packet, Node, Traceroute, etc.)
- `meshview/database.py` - Database initialization and session management
- `alembic/versions/` - Migration scripts
### Core Logic
- `meshview/mqtt_reader.py` - MQTT subscription and message reception
- `meshview/mqtt_store.py` - Packet processing and storage
- `meshview/decode_payload.py` - Protobuf decoding
- `meshview/web.py` - Web server routes and handlers
- `meshview/web_api/api.py` - REST API endpoints
### Templates
- `meshview/templates/` - Jinja2 HTML templates
- `meshview/static/` - Static files (HTML pages, JS, CSS)
## Common Tasks
### Adding a New API Endpoint
1. Add route handler in `meshview/web_api/api.py`
2. Register route in `meshview/web.py` (if needed)
3. Update `docs/API_Documentation.md` if public API
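A hedged sketch of step 1 — the endpoint path, handler name, and payload below are made up for illustration and are not part of the real meshview API:

```python
from aiohttp import web

async def api_node_count(request: web.Request) -> web.Response:
    # Hypothetical handler: real handlers query the database through async
    # SQLAlchemy sessions before building the JSON payload.
    return web.json_response({"node_count": 0})

def register_routes(app: web.Application) -> None:
    # In meshview, registration lives in meshview/web.py / meshview/web_api/.
    app.router.add_get("/api/node_count", api_node_count)

app = web.Application()
register_routes(app)
```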
### Database Schema Changes
1. Modify models in `meshview/models.py`
2. Create migration: `alembic revision --autogenerate -m "description"`
3. Review generated migration in `alembic/versions/`
4. Test migration: `alembic upgrade head`
5. **Never** modify existing migration files after they've been applied
### Adding a New Web Page
1. Create template in `meshview/templates/`
2. Add route in `meshview/web.py`
3. Add navigation link if needed (check existing templates for pattern)
4. Add static assets if needed in `meshview/static/`
### Processing New Packet Types
1. Check `meshview/decode_payload.py` for existing decoders
2. Add decoder function if new type
3. Update `meshview/mqtt_store.py` to handle new packet type
4. Update database models if new data needs storage
## Key Concepts
### Meshtastic Protocol
- Uses Protobuf for message serialization
- Packets contain various message types (text, position, telemetry, etc.)
- MQTT topics follow pattern: `msh/{region}/{subregion}/#`
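The topic pattern above uses standard MQTT wildcards (`+` matches one level, `#` matches the remainder). A minimal matcher — not meshview code; the broker does the real matching — illustrates the semantics:

```python
def topic_matches(pattern: str, topic: str) -> bool:
    """Minimal MQTT wildcard match: `+` = one level, `#` = everything after."""
    p_parts, t_parts = pattern.split("/"), topic.split("/")
    for i, p in enumerate(p_parts):
        if p == "#":
            return True
        if i >= len(t_parts) or (p != "+" and p != t_parts[i]):
            return False
    return len(p_parts) == len(t_parts)
```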
### Database Schema
- **packet** - Raw packet data
- **node** - Mesh node information
- **traceroute** - Network path information
- **packet_seen** - Packet observation records
### Real-time Updates
- Web pages use Server-Sent Events (SSE) for live updates
- Map and firehose pages auto-refresh based on config intervals
- API endpoints return JSON for programmatic access
## Best Practices
1. **Always use async/await** for database and network operations
2. **Use Alembic** for all database schema changes
3. **Follow existing patterns** - check similar code before adding new features
4. **Update documentation** - keep `docs/` and README current
5. **Test migrations** - verify migrations work both up and down
6. **Handle errors gracefully** - log errors, don't crash on bad packets
7. **Respect configuration** - use `config.ini` values, don't hardcode
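Practice 1 in concrete form — a stdlib-only sketch where the function names are hypothetical; in meshview the awaited calls are database queries and MQTT reads:

```python
import asyncio

async def fetch_node_name(node_id: int) -> str:
    # Stands in for an awaited DB query; never block the event loop here.
    await asyncio.sleep(0)
    return f"node-{node_id}"

async def main() -> list[str]:
    # Run independent I/O concurrently instead of sequentially.
    return list(await asyncio.gather(*(fetch_node_name(i) for i in range(3))))

names = asyncio.run(main())
```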
## Common Pitfalls
- **Don't modify applied migrations** - create new ones instead
- **Don't block the event loop** - use async I/O, not sync
- **Don't forget timezone handling** - timestamps are stored in UTC
- **Don't hardcode paths** - use configuration values
- **Don't ignore MQTT reconnection** - handle connection failures gracefully
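For the timezone pitfall, the safe round-trip looks like this (the microsecond epoch integer mirrors the `import_time_us` convention used elsewhere in the codebase):

```python
from datetime import datetime, timezone

# Always attach an explicit UTC tzinfo; naive datetimes are ambiguous.
now_utc = datetime.now(timezone.utc)
micros = int(now_utc.timestamp() * 1_000_000)  # import_time_us-style integer
restored = datetime.fromtimestamp(micros / 1_000_000, tz=timezone.utc)
# Convert to local time only at the display layer, e.g. now_utc.astimezone().
```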
## Resources
- **Main README**: `README.md` - Installation and basic usage
- **Docker Guide**: `README-Docker.md` - Container deployment
- **API Docs**: `docs/API_Documentation.md` - API endpoint reference
- **Migration Guide**: `docs/Database-Changes-With-Alembic.md` - Database workflow
- **Contributing**: `CONTRIBUTING.md` - Contribution guidelines
## Version Information
- **Current Version**: 3.0.0 (November 2025)
- **Python Requirement**: 3.13+
- **Key Features**: Alembic migrations, automated backups, Docker support, traceroute return paths
## Rules for robots
- Always run `ruff check` and `ruff format` after making any Python changes
---
When working on this project, prioritize:
1. Maintaining async patterns
2. Following existing code structure
3. Using proper database migrations
4. Keeping documentation updated
5. Testing changes thoroughly


```diff
@@ -1,6 +1,6 @@
 from datetime import datetime, timedelta
-from sqlalchemy import func, select, text
+from sqlalchemy import func, nullslast, select, text
 from sqlalchemy.orm import lazyload
 from meshview import database
@@ -45,7 +45,10 @@ async def get_packets(
     if before:
         q = q.where(Packet.import_time_us < before)
-    q = q.order_by(Packet.import_time_us.desc())
+    # Order by import_time_us when available, fallback to import_time for old data
+    # This handles databases where import_time_us may be NULL (old backups)
+    # First sort by import_time_us (NULLs last), then by import_time for those rows
+    q = q.order_by(nullslast(Packet.import_time_us.desc()), Packet.import_time.desc())
     if limit is not None:
         q = q.limit(limit)
```


```diff
@@ -87,7 +87,19 @@ document.addEventListener("DOMContentLoaded", async () => {
     renderedPacketIds.add(packet.id);
     packetMap.set(packet.id, packet);
-    const date = new Date(packet.import_time_us / 1000);
+    // NOTE: Temporary stopgap - fallback to import_time until old data with
+    // import_time_us=0 is migrated/cleaned up. Can be simplified once all
+    // legacy records have been updated.
+    // Fall back to import_time if import_time_us is 0, null, or undefined
+    let date;
+    if (packet.import_time_us && packet.import_time_us > 0) {
+      date = new Date(packet.import_time_us / 1000);
+    } else if (packet.import_time) {
+      date = new Date(packet.import_time);
+    } else {
+      // Last resort: use current time
+      date = new Date();
+    }
     const formattedTime = date.toLocaleTimeString([], { hour:"numeric", minute:"2-digit", second:"2-digit", hour12:true });
     const formattedDate = `${(date.getMonth()+1).toString().padStart(2,"0")}/${date.getDate().toString().padStart(2,"0")}/${date.getFullYear()}`;
     const formattedTimestamp = `${formattedTime} - ${formattedDate}`;
@@ -136,7 +148,17 @@ document.addEventListener("DOMContentLoaded", async () => {
   function renderPacketsEnsureDescending(packets, highlight=false) {
     if (!Array.isArray(packets) || packets.length===0) return;
-    const sortedDesc = packets.slice().sort((a,b)=>b.import_time_us - a.import_time_us);
+    const sortedDesc = packets.slice().sort((a,b)=>{
+      // NOTE: Temporary stopgap - fallback to import_time until old data with
+      // import_time_us=0 is migrated/cleaned up. Can be simplified once all
+      // legacy records have been updated.
+      // Sort by import_time_us with fallback to import_time
+      const aTime = (a.import_time_us && a.import_time_us > 0) ? a.import_time_us :
+                    (a.import_time ? new Date(a.import_time).getTime() * 1000 : 0);
+      const bTime = (b.import_time_us && b.import_time_us > 0) ? b.import_time_us :
+                    (b.import_time ? new Date(b.import_time).getTime() * 1000 : 0);
+      return bTime - aTime;
+    });
     for (let i=sortedDesc.length-1; i>=0; i--) renderPacket(sortedDesc[i], highlight);
   }
```


```diff
@@ -121,6 +121,10 @@ async def api_packets(request):
             "portnum": int(p.portnum) if p.portnum is not None else None,
             "payload": (p.payload or "").strip(),
             "import_time_us": p.import_time_us,
+            # NOTE: Temporary stopgap - include import_time as fallback until old data
+            # with import_time_us=0 is migrated/cleaned up. Can be removed once all
+            # legacy records have been updated.
+            "import_time": p.import_time.isoformat() if p.import_time else None,
             "channel": getattr(p.from_node, "channel", ""),
             "long_name": getattr(p.from_node, "long_name", ""),
         }
@@ -175,7 +179,10 @@ async def api_packets(request):
             ui_packets = [p for p in ui_packets if contains.lower() in p.payload.lower()]
         # --- Sort descending by import_time_us ---
-        ui_packets.sort(key=lambda p: p.import_time_us, reverse=True)
+        # Handle None values by treating them as smallest (will be sorted last)
+        ui_packets.sort(
+            key=lambda p: (p.import_time_us is not None, p.import_time_us or 0), reverse=True
+        )
         ui_packets = ui_packets[:limit]
         # --- Prepare output ---
@@ -184,6 +191,10 @@ async def api_packets(request):
         packet_dict = {
             "id": p.id,
             "import_time_us": p.import_time_us,
+            # NOTE: Temporary stopgap - include import_time as fallback until old data
+            # with import_time_us=0 is migrated/cleaned up. Can be removed once all
+            # legacy records have been updated.
+            "import_time": p.import_time.isoformat() if p.import_time else None,
             "channel": getattr(p.from_node, "channel", ""),
             "from_node_id": p.from_node_id,
             "to_node_id": p.to_node_id,
@@ -202,7 +213,32 @@ async def api_packets(request):
             packets_data.append(packet_dict)
-        return web.json_response({"packets": packets_data})
+        # Calculate latest_import_time for incremental updates
+        # NOTE: Temporary stopgap - fallback to import_time until old data with
+        # import_time_us=0 is migrated/cleaned up. Can be simplified once all
+        # legacy records have been updated.
+        # Use the highest import_time_us, with fallback to import_time
+        latest_import_time = None
+        if packets_data:
+            for p in packets_data:
+                if p.get("import_time_us") and p["import_time_us"] > 0:
+                    if latest_import_time is None or p["import_time_us"] > latest_import_time:
+                        latest_import_time = p["import_time_us"]
+                elif p.get("import_time") and latest_import_time is None:
+                    # Fallback: convert ISO string to microseconds if import_time_us is missing
+                    try:
+                        dt = datetime.datetime.fromisoformat(
+                            p["import_time"].replace("Z", "+00:00")
+                        )
+                        latest_import_time = int(dt.timestamp() * 1_000_000)
+                    except (ValueError, AttributeError):
+                        pass
+        response = {"packets": packets_data}
+        if latest_import_time is not None:
+            response["latest_import_time"] = latest_import_time
+        return web.json_response(response)
     except Exception as e:
         logger.error(f"Error in /api/packets: {e}")
```


```diff
@@ -99,9 +99,7 @@ def main():
     # Add --config runtime argument
     parser.add_argument('--config', help="Path to the configuration file.", default='config.ini')
     parser.add_argument('--pid_dir', help="PID files path.", default='.')
-    parser.add_argument(
-        '--py_exec', help="Path to the Python executable.", default='./env/bin/python'
-    )
+    parser.add_argument('--py_exec', help="Path to the Python executable.", default=sys.executable)
     args = parser.parse_args()
     # PID file paths
```


@@ -1,3 +1,49 @@
```toml
[project]
name = "meshview"
version = "3.0.0"
description = "Real-time monitoring and diagnostic tool for the Meshtastic mesh network"
readme = "README.md"
requires-python = ">=3.10"
dependencies = [
    # Core async + networking
    "aiohttp>=3.11.12,<4.0.0",
    "aiohttp-sse",
    "aiodns>=3.2.0,<4.0.0",
    "aiomqtt>=2.3.0,<3.0.0",
    "asyncpg>=0.30.0,<0.31.0",
    "aiosqlite>=0.21.0,<0.22.0",
    # Database + ORM
    "sqlalchemy[asyncio]>=2.0.38,<3.0.0",
    "alembic>=1.14.0,<2.0.0",
    # Serialization / security
    "protobuf>=5.29.3,<6.0.0",
    "cryptography>=44.0.1,<45.0.0",
    # Templates
    "Jinja2>=3.1.5,<4.0.0",
    "MarkupSafe>=3.0.2,<4.0.0",
    # Graphs / diagrams
    "pydot>=3.0.4,<4.0.0",
]

[project.optional-dependencies]
dev = [
    # Data science stack
    "numpy>=2.2.3,<3.0.0",
    "pandas>=2.2.3,<3.0.0",
    "matplotlib>=3.10.0,<4.0.0",
    "seaborn>=0.13.2,<1.0.0",
    "plotly>=6.0.0,<7.0.0",
    # Image support
    "pillow>=11.1.0,<12.0.0",
    # Debugging / profiling
    "psutil>=7.0.0,<8.0.0",
    "objgraph>=3.6.2,<4.0.0",
    # Testing
    "pytest>=8.3.4,<9.0.0",
    "pytest-aiohttp>=1.0.5,<2.0.0",
    "pytest-asyncio>=0.24.0,<1.0.0",
]

[tool.ruff]
# Linting
target-version = "py313"
```