Compare commits


78 Commits

Author SHA1 Message Date
JingleManSweep df2b9ea432 Merge pull request #35 from ipnet-mesh/claude/issue-33-20251206-1703
Add last seen time to map node labels
2025-12-06 17:18:49 +00:00
claude[bot] 55443376be Use full display name in map node labels
Update map node labels to show the full display name (friendly_name tag
→ advertised node name → public key prefix) instead of just the 2-char
public key prefix. This makes node labels consistent with the rest of
the site and easier to identify at a glance.

The backend already computes the correct display name in map.py:96-101,
so this change just uses that computed name in the label.

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 17:12:50 +00:00
claude[bot] ed7a46b1a7 Add last seen time to map node labels
Display relative time since last seen (e.g., '2m', '1h', '2d') in node
labels on the map page. This makes it easier to quickly identify how
recently nodes were active without opening the popup.

- Add formatRelativeTime() function to calculate time difference
- Update createNodeIcon() to include relative time in label
- Adjust icon size to accommodate additional text
- Format: keyPrefix (timeAgo) e.g., 'ab (5m)'

Fixes #33

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 17:05:30 +00:00
JingleManSweep c2eef3db50 Merge pull request #34 from ipnet-mesh/claude/issue-25-20251206-1646
fix: Handle empty channel_idx parameter in messages filter
2025-12-06 16:53:34 +00:00
claude[bot] 4916ea0cea fix: Handle empty channel_idx parameter in messages filter
Fixed parse error when clicking the filter button on messages screen
with "All Channels" selected. The form was sending an empty string
for channel_idx, but FastAPI expected either a valid integer or None.

Changes:
- Accept channel_idx as string in query parameter
- Parse and validate channel_idx before passing to API
- Treat empty strings as None to prevent validation errors
- Add error handling for invalid integer values

Fixes #25

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 16:49:23 +00:00
JingleManSweep e3fc7e4f07 Merge pull request #32 from ipnet-mesh/add-claude-github-actions-1765039315831
Add Claude Code GitHub Workflow
2025-12-06 16:42:13 +00:00
JingleManSweep b656bfda21 "Claude PR Assistant workflow" 2025-12-06 16:41:56 +00:00
Louis King fb7201dc2d Updates 2025-12-06 16:37:25 +00:00
Louis King 74346d9c82 Hopefully use Git tag as version on website 2025-12-06 16:32:31 +00:00
Louis King beb471fcd8 Switched to Git versioning 2025-12-06 16:28:13 +00:00
Louis King 8597052bf7 Fixed table overflows 2025-12-06 16:07:13 +00:00
Louis King f85768f661 Fixed DB migration 2025-12-06 15:46:41 +00:00
JingleManSweep 79bb4a4250 Merge pull request #23 from ipnet-mesh/claude/plan-event-deduplication-01NWhrJaWzQiGi1xLzmk1udg
Deduplication for multiple receiver nodes
2025-12-06 15:36:37 +00:00
Louis King 714c3cbbd2 Set sensible Docker tag label 2025-12-06 15:32:15 +00:00
Louis King f0531c9e40 Updated env example 2025-12-06 15:16:26 +00:00
Louis King dd0b4c73c5 More fixes 2025-12-06 15:10:03 +00:00
Louis King 78a086e1ea Updates 2025-12-06 14:51:47 +00:00
Louis King 9cd1d50bf6 Updates 2025-12-06 14:38:53 +00:00
Louis King 733342a9ec Fixed README and Compose 2025-12-06 14:21:17 +00:00
Louis King d715e4e4f0 Updates 2025-12-06 13:33:02 +00:00
Louis King 2ea04deb7e Updates 2025-12-06 12:53:29 +00:00
Claude 6e3b86a1ad Add collector-level event deduplication using content hashes
Replace presentation-layer deduplication with collector-level approach:
- Add event_hash column to messages, advertisements, trace_paths, telemetry tables
- Handlers compute content hashes and skip duplicate events at insertion time
- Use 5-minute time buckets for advertisements and telemetry
- Include Alembic migration for schema changes
2025-12-06 12:23:14 +00:00
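The skip-at-insertion behaviour described above can be sketched with an in-memory stand-in for the `event_hash` column's uniqueness check (in the real collector this would be a database constraint or lookup; names here are illustrative):

```python
import hashlib

# Toy stand-in for looking up existing event_hash values in the database
seen_hashes: set[str] = set()


def event_hash(payload: str) -> str:
    """Deterministic content hash for an event."""
    return hashlib.sha256(payload.encode()).hexdigest()


def insert_event(payload: str) -> bool:
    """Insert an event unless an identical one was already stored."""
    h = event_hash(payload)
    if h in seen_hashes:  # duplicate report from another receiver node: skip
        return False
    seen_hashes.add(h)
    return True
```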
Louis King b807932ca3 Added arm64 Docker image support 2025-12-06 12:16:05 +00:00
Claude c80986fe67 Add event deduplication at presentation layer
When multiple receiver nodes are running, the same mesh events (messages,
advertisements) are reported multiple times. This causes duplicate entries
in the Web UI.

Changes:
- Add hash_utils.py with deterministic hash functions for each event type
- Add `dedupe` parameter to messages and advertisements API endpoints (default: True)
- Update dashboard stats to use distinct counts for messages/advertisements
- Deduplicate recent advertisements and channel messages in dashboard
- Add comprehensive tests for hash utilities

Hash strategy:
- Messages: hash of text + pubkey_prefix + channel_idx + sender_timestamp + txt_type
- Advertisements: hash of public_key + name + adv_type + flags + 5-minute time bucket
2025-12-06 12:08:07 +00:00
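The hash strategy listed above can be sketched directly from the stated fields. The 5-minute bucket makes repeated advertisements within a window hash identically; the function names and the choice of SHA-256 with a `|` separator are assumptions for illustration:

```python
import hashlib


def _digest(*parts: object) -> str:
    return hashlib.sha256("|".join(str(p) for p in parts).encode()).hexdigest()


def message_hash(text, pubkey_prefix, channel_idx, sender_timestamp, txt_type) -> str:
    # Fields per the commit's hash strategy for messages
    return _digest(text, pubkey_prefix, channel_idx, sender_timestamp, txt_type)


def advertisement_hash(public_key, name, adv_type, flags, unix_ts: int) -> str:
    # 5-minute time bucket: adverts repeated within the window collide
    return _digest(public_key, name, adv_type, flags, unix_ts // 300)
```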
Louis King 3a060f77cc Updates 2025-12-06 11:46:36 +00:00
JingleManSweep 7461cf6dc1 Merge pull request #22 from ipnet-mesh/claude/add-member-node-association-01JezMA7XzvwsX37rMNoBmvo
Updates
2025-12-05 21:18:31 +00:00
Louis King 23f6c290c9 Updates 2025-12-05 21:17:34 +00:00
JingleManSweep 1ae1736391 Merge pull request #21 from ipnet-mesh/claude/add-member-node-association-01JezMA7XzvwsX37rMNoBmvo
Associate nodes with members in database
2025-12-05 21:17:20 +00:00
Claude a4b13d3456 Add member-node association support
Members can now have multiple associated nodes, each with a public_key
and node_role (e.g., 'chat', 'repeater'). This replaces the single
public_key field on members with a one-to-many relationship.

Changes:
- Add MemberNode model for member-node associations
- Update Member model to remove public_key, add nodes relationship
- Update Pydantic schemas with MemberNodeCreate/MemberNodeRead
- Update member_import.py to handle nodes list in seed files
- Update API routes to handle nodes in create/update/read operations
- Add Alembic migration to create member_nodes table and migrate data
- Update example seed file with new format
2025-12-05 20:34:09 +00:00
Louis King 0016edbdac Updates 2025-12-05 20:03:14 +00:00
Louis King 0b8fc6e707 Charts 2025-12-05 19:50:22 +00:00
Louis King d1181ae4f9 Updates 2025-12-05 19:27:06 +00:00
JingleManSweep 6b41e64b26 Merge pull request #20 from ipnet-mesh/claude/ui-improvements-ads-page-01GQvLau46crtrqftWzFie5d
UI improvements and advertisements page
2025-12-05 18:26:07 +00:00
Claude 995b066b0d Remove sender name from channel messages summary, keep only timestamp 2025-12-05 18:23:33 +00:00
Claude 0d14ed0ccc Add latest channel messages to Network dashboard
Replace the channel counts table with actual recent messages per channel:
- Added ChannelMessage schema for channel message summaries
- Dashboard API now fetches latest 5 messages for each channel with sender name lookups
- Network page displays messages grouped by channel with sender names and timestamps
- Only shows channels that have messages
2025-12-05 18:20:49 +00:00
Claude e3ce1258a8 Fix advertisements Type column by falling back to source node's adv_type
The adv_type from the Advertisement record is often null, but the linked
Node has the correct adv_type. Now falls back to source_node.adv_type
when adv.adv_type is null.
2025-12-05 18:14:16 +00:00
Claude 087a3c4c43 Messages list: swap Time/Type columns, add receiver node links
- Swap Time and Type columns (Type now first)
- Add receiver_name and receiver_friendly_name to MessageRead schema
- Update messages API to fetch receiver node names and tags
- Make Receiver column a link showing name with public key prefix
2025-12-05 18:09:29 +00:00
Claude 5077178a6d Add Type column with emoji to Advertisements list 2025-12-05 18:04:04 +00:00
Claude a44d38dad6 Update Nodes list to match Advertisements style
- Rename Name column to Node
- Remove separate Public Key column
- Show name with public key prefix below (like Advertisements list)
- Add whitespace-nowrap to Last Seen column
2025-12-05 18:03:14 +00:00
Claude 3469278fba Add node name lookups to advertisements list
- Join with Node table to get node names and tags for both source
  and receiver nodes
- Display friendly_name (from tags), node_name, or advertised name
  with priority in that order
- Show name with public key preview for both Node and Received By columns
2025-12-05 17:58:52 +00:00
Claude 89c81630c9 Link 'Powered by MeshCore Hub' to GitHub repository 2025-12-05 17:55:59 +00:00
Claude ab4a5886db Simplify advertisements table: Node, Received By, Time columns
- Remove Name and Type columns (usually null)
- Reorder columns: Node first, then Received By, then Time
- Link both Node and Received By to their node detail pages
- Show node name with public key preview when available
2025-12-05 17:53:54 +00:00
Claude ec7082e01a Fix message text indentation in messages list
Put message content inline with td tag to prevent whitespace-pre-wrap
from preserving template indentation.
2025-12-05 17:43:09 +00:00
Claude b4e7d45cf6 UI improvements: smaller hero, stats bar, advertisements page, messages fixes
- Reduce hero section size and add stats bar with node/message counts
- Add new Advertisements page with public key filtering
- Update hero navigation buttons: Dashboard, Nodes, Advertisements, Messages
- Add Advertisements to main navigation menu
- Remove Hops column from messages list (always empty)
- Display full message text with proper multi-line wrapping
2025-12-05 17:14:26 +00:00
JingleManSweep 864494c3a8 Merge pull request #19 from ipnet-mesh/claude/research-node-id-usage-019DURYQHkvodx9sV39hTmNM
Research node ID and public key usage
2025-12-05 16:59:30 +00:00
Claude 84e83a3384 Rename receiver_public_key to received_by
Shorter, cleaner field name for the receiving interface node's
public key in API responses.
2025-12-05 16:57:24 +00:00
Claude 796e303665 Remove internal UUID fields from API responses
Internal database UUIDs (id, node_id, receiver_node_id) were being
exposed in API responses. These are implementation details that should
not be visible to API consumers. The canonical identifier for nodes
is the 64-char hex public_key.

Changes:
- Remove id, node_id from NodeTagRead, NodeRead schemas
- Remove id from MemberRead schema
- Remove id, receiver_node_id, node_id from MessageRead, AdvertisementRead,
  TracePathRead, TelemetryRead schemas
- Update web map component to use public_key instead of member.id
  for owner filtering
- Update tests to not assert on removed fields
2025-12-05 16:50:21 +00:00
Louis King a5d8d586e1 Updated README 2025-12-05 12:56:12 +00:00
JingleManSweep 26239fe11f Merge pull request #18 from ipnet-mesh/claude/prepare-public-release-01AFsizAneHmWjHZD6of5MWy
Prepare repository for public release
2025-12-05 12:40:56 +00:00
Claude 0e50a9d3b0 Prepare repository for public release
- Update license from MIT to GPL-3.0-or-later in pyproject.toml
- Update project URLs from meshcore-dev to ipnet-mesh organization
- Add explicit GPL-3.0 license statement to README
- Fix AGENTS.md venv directory reference (.venv vs venv)
- Remove undocumented NETWORK_LOCATION from README
- Fix stats endpoint path in README (/api/v1/dashboard/stats)
- Clarify seed and data directory descriptions in project structure
2025-12-05 12:12:55 +00:00
JingleManSweep 9b7d8cc31b Merge pull request #17 from ipnet-mesh/claude/implement-map-page-01F5nbj6KfXDv4Dsp1KE6jJZ
Updates
2025-12-04 19:35:15 +00:00
Louis King d7152a5359 Updates 2025-12-04 19:34:18 +00:00
JingleManSweep a1cc4388ae Merge pull request #16 from ipnet-mesh/claude/implement-map-page-01F5nbj6KfXDv4Dsp1KE6jJZ
Implement Map page with node filtering
2025-12-04 19:32:24 +00:00
Claude bb0b9f05ec Add debug info to map data endpoint for troubleshooting
- Return total_nodes, nodes_with_coords, and error in response
- Display meaningful messages when no nodes or no coordinates found
- Log API errors and node counts for debugging
2025-12-04 18:37:50 +00:00
Claude fe744c7c0c Fix map markers to use inline styles and center on nodes
- Use inline styles for marker colors instead of CSS classes for reliable rendering
- Center map on node locations when data is first loaded
- Refactor filter logic to separate recentering behavior
- Update legend to use inline styles
2025-12-04 18:34:24 +00:00
Claude cf4e82503a Add filters to map page for node type, infrastructure, and owner
- Enhanced /map/data endpoint to include node role tag and member ownership
- Added client-side filtering for node type (chat, repeater, room)
- Added toggle to filter for infrastructure nodes only (role: infra)
- Added dropdown filter for member owner (nodes linked via public_key)
- Color-coded markers by node type with gold border for infrastructure
- Added legend showing marker types
- Dynamic count display showing total vs filtered nodes
2025-12-04 18:29:43 +00:00
Louis King d6346fdfde Updates 2025-12-04 18:18:36 +00:00
Louis King cf2c3350cc Updates 2025-12-04 18:10:29 +00:00
Louis King 110c701787 Updates 2025-12-04 16:37:59 +00:00
Louis King fc0dc1a448 Updates 2025-12-04 16:12:51 +00:00
Louis King d283a8c79b Updates 2025-12-04 16:00:15 +00:00
JingleManSweep f129a4e0f3 Merge pull request #15 from ipnet-mesh/feature/originator-address
Updates
2025-12-04 15:52:49 +00:00
JingleManSweep 90f6a68b9b Merge pull request #14 from ipnet-mesh/claude/update-docs-docker-agents-01B9FYrem1tEwNRkx7rCH1QU
Update docs: Docker profiles, seed data, and Members model
2025-12-04 15:46:00 +00:00
Louis King 6cf3152ef9 Updates 2025-12-04 15:45:35 +00:00
Claude 058a6e2c95 Update docs: Docker profiles, seed data, and Members model
- Update Docker Compose section: core services run by default (mqtt,
  collector, api, web), optional profiles for interfaces and utilities
- Document automatic seeding on collector startup
- Add SEED_HOME environment variable documentation
- Document new Members model and YAML seed file format
- Update node_tags format to YAML with public_key-keyed structure
- Update project structure to reflect seed/ and data/ directories
- Add CLI reference for collector seed commands
2025-12-04 15:31:04 +00:00
JingleManSweep acccdfedba Merge pull request #13 from ipnet-mesh/claude/fix-black-ci-linting-01UGvpVAnxCMthohsvzFBiQF
Fix Black linting issues in CI pipeline
2025-12-04 15:12:36 +00:00
Claude 83f3157e8b Fix Black formatting: add trailing comma in set_dispatch_callback
Black requires a trailing comma after the callback parameter when the
function signature spans multiple lines.
2025-12-04 15:06:13 +00:00
JingleManSweep c564a61cf7 Merge pull request #12 from ipnet-mesh/claude/trigger-contact-database-01Qp5vbzzE1c77Kv21wYfQbQ
Trigger Contact database before CONTACT events
2025-12-04 14:59:21 +00:00
Claude bbe8491ff1 Add info-level logging to contact handler for debugging
Change debug logging to info level so contact processing is visible
in default log output. This helps verify that contact events are
being received and processed correctly.
2025-12-04 14:39:14 +00:00
Claude 1d6a9638a1 Refactor contacts to emit individual MQTT events per contact
Instead of sending all contacts in one MQTT message, the interface
now splits the device's contacts response into individual 'contact'
events. This is more consistent with other event patterns and makes
the collector simpler.

Interface changes:
- Add _publish_contacts() to split contacts dict into individual events
- Publish each contact as 'contact' event (not 'contacts')

Collector changes:
- Rename handle_contacts to handle_contact for single contact
- Simplify handler to process one contact per message
- Register handler for 'contact' events
2025-12-04 14:34:05 +00:00
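The fan-out described above — one contacts response split into individual `'contact'` events — can be sketched as below. That the device's contacts response is a dict keyed by public key is an assumption for illustration, as is the helper name:

```python
def split_contacts(contacts: dict[str, dict]) -> list[tuple[str, dict]]:
    """Turn one contacts response into individual ('contact', payload) events."""
    events = []
    for public_key, info in contacts.items():
        payload = dict(info)
        payload["public_key"] = public_key  # carry the key into each event
        events.append(("contact", payload))
    return events
```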
Claude cf633f9f44 Update contacts handler to match actual device payload format
The device sends contact entries with different field names than
originally expected:
- adv_name (not name) for the advertised node name
- type (numeric: 0=none, 1=chat, 2=repeater, 3=room) instead of node_type

Changes:
- Update handle_contacts to extract adv_name and convert numeric type
- Add NODE_TYPE_MAP for type conversion
- Always update node name if different (not just if empty)
- Add debug logging for node updates
- Update ContactInfo schema with actual device fields
2025-12-04 14:25:11 +00:00
Claude 102e40a395 Fix event subscriptions setup timing for contact database sync
Move _setup_event_subscriptions() from run() to connect() so that
event subscriptions are active before get_contacts() is called during
device initialization. Previously, CONTACTS events were lost because
subscriptions weren't set up until run() was called.
2025-12-04 14:18:27 +00:00
Claude 241902685d Add contact database sync to interface startup
Trigger get_contacts() during receiver initialization to fetch the
device's contact database and broadcast CONTACTS events over MQTT.
This enables the collector to associate broadcast node names with
node records in the database.

Changes:
- Add get_contacts() abstract method to BaseMeshCoreDevice
- Implement get_contacts() in MeshCoreDevice using meshcore library
- Implement get_contacts() in MockMeshCoreDevice for testing
- Call get_contacts() in Receiver._initialize_device() after startup
2025-12-04 14:10:58 +00:00
JingleManSweep 2f2ae30c89 Merge pull request #11 from ipnet-mesh/claude/collector-seed-yaml-conversion-01WgpDYuzrzP5nkeC2EG9o2L
Convert collector seed mechanism from JSON to YAML
2025-12-04 01:34:47 +00:00
Louis King fff04e4b99 Updates 2025-12-04 01:33:25 +00:00
Claude df05c3a462 Convert collector seed mechanism from JSON to YAML
- Replace JSON seed files with YAML format for better readability
- Auto-detect YAML primitive types (number, boolean, string) from values
- Add automatic seed import on collector startup
- Split lat/lon into separate tags instead of combined coordinate string
- Add PyYAML dependency and types-PyYAML for type checking
- Update example/seed and contrib/seed/ipnet with clean YAML format
- Update tests to verify YAML primitive type detection
2025-12-04 01:27:03 +00:00
Louis King e2d865f200 Fix nodes page test to match template output
The test was checking for adv_type values (REPEATER, CLIENT) but the
nodes.html template doesn't display that column. Updated to check for
public key prefixes instead.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-04 01:03:05 +00:00
Louis King fa335bdb14 Updates 2025-12-04 00:59:49 +00:00
93 changed files with 5370 additions and 1513 deletions
+37 -19
@@ -5,29 +5,33 @@
# Docker Image
# ===================
# Leave empty to build from local Dockerfile, or set to use a pre-built image:
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:latest
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:main
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:v1.0.0
MESHCORE_IMAGE=
# Docker image version tag to use
# Options: latest, main, v1.0.0, etc.
IMAGE_VERSION=latest
# ===================
# Data Directory
# Data & Seed Directories
# ===================
# Base directory for all service data (collector DB, tags, members, etc.)
# Base directory for runtime data (database, etc.)
# Default: ./data (relative to docker-compose.yml location)
# Inside containers this is mapped to /data
#
# Structure:
# ${DATA_HOME}/
# ├── collector/
# │   ├── meshcore.db     # SQLite database
# │   └── tags.json       # Node tags for import
# └── web/
#     └── members.json    # Network members list
# └── meshcore.db         # SQLite database
DATA_HOME=./data
# Directory containing seed data files for import
# Default: ./seed (relative to docker-compose.yml location)
# Inside containers this is mapped to /seed
#
# Structure:
# ${SEED_HOME}/
# ├── node_tags.yaml # Node tags for import
# └── members.yaml # Network members for import
SEED_HOME=./seed
# ===================
# Common Settings
# ===================
@@ -35,12 +39,20 @@ DATA_HOME=./data
# Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
LOG_LEVEL=INFO
# MQTT Broker Settings (internal use)
# ===================
# MQTT Settings
# ===================
# MQTT Broker connection (for interface/collector/api services)
# When using the local MQTT broker (--profile mqtt), use "mqtt" as host
# When using an external broker, set the hostname/IP
MQTT_HOST=mqtt
MQTT_PORT=1883
MQTT_USERNAME=
MQTT_PASSWORD=
MQTT_PREFIX=meshcore
# External MQTT port mapping
# External port mappings for local MQTT broker (--profile mqtt only)
MQTT_EXTERNAL_PORT=1883
MQTT_WS_PORT=9001
@@ -58,6 +70,7 @@ SERIAL_PORT_SENDER=/dev/ttyUSB1
SERIAL_BAUD=115200
# Optional node address override (64-char hex string)
# Only set if you need to override the device's public key
NODE_ADDRESS=
NODE_ADDRESS_SENDER=
@@ -84,15 +97,20 @@ WEB_PORT=8080
NETWORK_NAME=MeshCore Network
NETWORK_CITY=
NETWORK_COUNTRY=
NETWORK_LOCATION=
# Radio configuration (comma-delimited)
# Format: <profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>
# Example: EU/UK Narrow,869.618MHz,62.5kHz,8,8,22dBm
NETWORK_RADIO_CONFIG=
# Contact information
NETWORK_CONTACT_EMAIL=
NETWORK_CONTACT_DISCORD=
NETWORK_CONTACT_GITHUB=
# Members file location (optional override)
# Default: ${DATA_HOME}/web/members.json
# Only set this if you want to use a different location
# MEMBERS_FILE=/custom/path/to/members.json
# Welcome text displayed on the homepage (plain text, optional)
# If not set, a default welcome message is shown
NETWORK_WELCOME_TEXT=
# ===================
# Webhook Settings
+50
@@ -0,0 +1,50 @@
name: Claude Code
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
jobs:
claude:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
(github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
actions: read # Required for Claude to read CI results on PRs
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Run Claude Code
id: claude
uses: anthropics/claude-code-action@v1
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# This is an optional setting that allows Claude to read CI results on PRs
additional_permissions: |
actions: read
# Optional: Give a custom prompt to Claude. If this is not specified, Claude will perform the instructions specified in the comment that tagged it.
# prompt: 'Update the pull request description to include a summary of changes.'
# Optional: Add claude_args to customize behavior and configuration
# See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
# or https://docs.claude.com/en/docs/claude-code/cli-reference for available options
# claude_args: '--allowed-tools Bash(gh pr:*)'
+7 -1
@@ -23,6 +23,9 @@ jobs:
steps:
- uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
@@ -51,15 +54,18 @@ jobs:
with:
context: .
file: Dockerfile
platforms: linux/amd64,linux/arm64
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
SETUPTOOLS_SCM_PRETEND_VERSION=${{ startsWith(github.ref, 'refs/tags/v') && github.ref_name || format('0.0.0.dev0+g{0}', github.sha) }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Test Docker image
if: github.event_name == 'pull_request'
run: |
docker build -t meshcore-hub-test -f Dockerfile .
docker build -t meshcore-hub-test --build-arg SETUPTOOLS_SCM_PRETEND_VERSION=0.0.0.dev0+g${{ github.sha }} -f Dockerfile .
docker run --rm meshcore-hub-test --version
docker run --rm meshcore-hub-test --help
+2
@@ -33,6 +33,7 @@ share/python-wheels/
MANIFEST
uv.lock
docker-compose.override.yml
# PyInstaller
# Usually these files are written by a python script from a template
@@ -217,3 +218,4 @@ __marimo__/
# MeshCore Hub specific
*.db
meshcore.db
src/meshcore_hub/_version.py
+1
@@ -38,3 +38,4 @@ repos:
- fastapi>=0.100.0
- alembic>=1.7.0
- types-paho-mqtt>=1.6.0
- types-PyYAML>=6.0.0
+60 -30
@@ -5,8 +5,8 @@ This document provides context and guidelines for AI coding assistants working o
## Agent Rules
* You MUST use Python (version in `.python-version` file)
* You MUST activate a Python virtual environment in the `venv` directory or create one if it does not exist:
- `ls ./venv` to check if it exists
* You MUST activate a Python virtual environment in the `.venv` directory or create one if it does not exist:
- `ls ./.venv` to check if it exists
- `python -m venv .venv` to create it
* You MUST always activate the virtual environment before running any commands
- `source .venv/bin/activate`
@@ -114,17 +114,15 @@ class NodeRead(BaseModel):
### SQLAlchemy Models
```python
from sqlalchemy import String, DateTime, ForeignKey
from sqlalchemy import String, DateTime, Text
from sqlalchemy.orm import Mapped, mapped_column, relationship
from datetime import datetime
from uuid import uuid4
from typing import Optional
from meshcore_hub.common.models.base import Base
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin
class Node(Base):
class Node(Base, UUIDMixin, TimestampMixin):
__tablename__ = "nodes"
id: Mapped[str] = mapped_column(String(36), primary_key=True, default=lambda: str(uuid4()))
public_key: Mapped[str] = mapped_column(String(64), unique=True, index=True)
name: Mapped[str | None] = mapped_column(String(255), nullable=True)
adv_type: Mapped[str | None] = mapped_column(String(20), nullable=True)
@@ -132,6 +130,18 @@ class Node(Base):
# Relationships
tags: Mapped[list["NodeTag"]] = relationship(back_populates="node", cascade="all, delete-orphan")
class Member(Base, UUIDMixin, TimestampMixin):
"""Network member model - stores info about network operators."""
__tablename__ = "members"
name: Mapped[str] = mapped_column(String(255), nullable=False)
callsign: Mapped[Optional[str]] = mapped_column(String(20), nullable=True)
role: Mapped[Optional[str]] = mapped_column(String(100), nullable=True)
description: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
contact: Mapped[Optional[str]] = mapped_column(String(255), nullable=True)
public_key: Mapped[Optional[str]] = mapped_column(String(64), nullable=True, index=True)
```
### FastAPI Routes
@@ -239,7 +249,12 @@ meshcore-hub/
│ │ ├── mqtt.py # MQTT utilities
│ │ ├── logging.py # Logging config
│ │ ├── models/ # SQLAlchemy models
│ │ │ ├── node.py # Node model
│ │ │ ├── member.py # Network member model
│ │ │ └── ...
│ │ └── schemas/ # Pydantic schemas
│ │ ├── members.py # Member API schemas
│ │ └── ...
│ ├── interface/
│ │ ├── cli.py
│ │ ├── device.py # MeshCore device wrapper
@@ -247,9 +262,10 @@ meshcore-hub/
│ │ ├── receiver.py # RECEIVER mode
│ │ └── sender.py # SENDER mode
│ ├── collector/
│ │ ├── cli.py
│ │ ├── cli.py # Collector CLI with seed commands
│ │ ├── subscriber.py # MQTT subscriber
│ │ ├── tag_import.py # Tag import from JSON
│ │ ├── tag_import.py # Tag import from YAML
│ │ ├── member_import.py # Member import from YAML
│ │ ├── handlers/ # Event handlers
│ │ └── webhook.py # Webhook dispatcher
│ ├── api/
@@ -258,11 +274,15 @@ meshcore-hub/
│ │ ├── auth.py # Authentication
│ │ ├── dependencies.py
│ │ ├── routes/ # API routes
│ │ │ ├── members.py # Member CRUD endpoints
│ │ │ └── ...
│ │ └── templates/ # Dashboard HTML
│ └── web/
│ ├── cli.py
│ ├── app.py # FastAPI app
│ ├── routes/ # Page routes
│ │ ├── members.py # Members page
│ │ └── ...
│ ├── templates/ # Jinja2 templates
│ └── static/ # CSS, JS
├── tests/
@@ -278,20 +298,17 @@ meshcore-hub/
├── etc/
│ └── mosquitto.conf # MQTT broker configuration
├── example/
│   └── data/
│       ├── collector/
│       │   └── tags.json        # Example node tags data
│       └── web/
│           └── members.json     # Example network members data
│   └── seed/                    # Example seed data files
│       ├── node_tags.yaml       # Example node tags
│       └── members.yaml         # Example network members
├── seed/                        # Seed data directory (SEED_HOME)
│   ├── node_tags.yaml           # Node tags for import
│   └── members.yaml             # Network members for import
├── data/                        # Runtime data (gitignored, DATA_HOME default)
│   ├── collector/               # Collector data
│   │   ├── meshcore.db          # SQLite database
│   │   └── tags.json            # Node tags for import
│   └── web/                     # Web data
│       └── members.json         # Network members list
│   └── collector/               # Collector data
│       └── meshcore.db          # SQLite database
├── Dockerfile                   # Docker build configuration
├── docker-compose.yml           # Docker Compose services (gitignored)
└── docker-compose.yml.example   # Docker Compose template
└── docker-compose.yml           # Docker Compose services
```
## MQTT Topic Structure
@@ -436,26 +453,39 @@ meshcore-hub interface --mode receiver --mock
See [PLAN.md](PLAN.md#configuration-environment-variables) for complete list.
Key variables:
- `DATA_HOME` - Base directory for all service data (default: `./data`)
- `DATA_HOME` - Base directory for runtime data (default: `./data`)
- `SEED_HOME` - Directory containing seed data files (default: `./seed`)
- `MQTT_HOST`, `MQTT_PORT`, `MQTT_PREFIX` - MQTT broker connection
- `DATABASE_URL` - SQLAlchemy database URL (default: `sqlite:///{DATA_HOME}/collector/meshcore.db`)
- `API_READ_KEY`, `API_ADMIN_KEY` - API authentication keys
- `LOG_LEVEL` - Logging verbosity
### Data Directory Structure
### Directory Structure
The `DATA_HOME` environment variable controls where all service data is stored:
**Seed Data (`SEED_HOME`)** - Contains initial data files for database seeding:
```
${SEED_HOME}/
├── node_tags.yaml # Node tags (keyed by public_key)
└── members.yaml # Network members list
```
**Runtime Data (`DATA_HOME`)** - Contains runtime data (gitignored):
```
${DATA_HOME}/
├── collector/
│   ├── meshcore.db      # SQLite database
│   └── tags.json        # Node tags for import
└── web/
    └── members.json     # Network members list
└── collector/
    └── meshcore.db      # SQLite database
```
Services automatically create their subdirectories if they don't exist.
### Automatic Seeding
The collector automatically imports seed data on startup if YAML files exist in `SEED_HOME`:
- `node_tags.yaml` - Node tag definitions (keyed by public_key)
- `members.yaml` - Network member definitions
Manual seeding can be triggered with: `meshcore-hub collector seed`
### Webhook Configuration
The collector supports forwarding events to external HTTP endpoints:
+5 -2
@@ -21,6 +21,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
# Build argument for version (set via CI or manually)
ARG SETUPTOOLS_SCM_PRETEND_VERSION=0.0.0+docker
# Copy project files
WORKDIR /app
COPY pyproject.toml README.md ./
@@ -28,9 +31,9 @@ COPY src/ ./src/
COPY alembic/ ./alembic/
COPY alembic.ini ./
# Install the package
# Install the package with version from build arg
RUN pip install --upgrade pip && \
pip install .
SETUPTOOLS_SCM_PRETEND_VERSION=${SETUPTOOLS_SCM_PRETEND_VERSION} pip install .
# =============================================================================
# Stage 2: Runtime - Final production image
+229 -84
@@ -62,49 +62,140 @@ MeshCore Hub provides a complete solution for monitoring, collecting, and intera
- **Web Dashboard**: Visualize network status, node locations, and message history
- **Docker Ready**: Single image with all components, easy deployment
## Getting Started
### Simple Self-Hosted Setup
The quickest way to get started is running the entire stack on a single machine with a connected MeshCore device.
**Prerequisites:**
1. Flash the [USB Companion firmware](https://meshcore.dev/) onto a compatible device (e.g., Heltec V3, T-Beam)
2. Connect the device via USB to a machine that supports Docker or Python
**Steps:**
```bash
# Clone the repository
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub
# Copy and configure environment
cp .env.example .env
# Edit .env: set SERIAL_PORT to your device (e.g., /dev/ttyUSB0 or /dev/ttyACM0)
# Start the entire stack with local MQTT broker
docker compose --profile mqtt --profile core --profile receiver up -d
# View the web dashboard
open http://localhost:8080
```
This starts all services: MQTT broker, collector, API, web dashboard, and the interface receiver that bridges your MeshCore device to the system.
### Distributed Community Setup
For larger deployments, you can separate receiver nodes from the central infrastructure. This allows multiple community members to contribute receiver coverage while hosting the backend centrally.
```
┌─────────────────────────────────────────────────────────────────────┐
│ Community Members │
│ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
│ │ Raspberry Pi │ │ Raspberry Pi │ │ Any Linux │ │
│ │ + MeshCore │ │ + MeshCore │ │ + MeshCore │ │
│ │ Device │ │ Device │ │ Device │ │
│ └──────┬───────┘ └──────┬───────┘ └──────┬───────┘ │
│ │ │ │ │
│ │ receiver profile only │ │
│ └──────────────────┼──────────────────┘ │
│ │ │
│ MQTT (port 1883) │
│ │ │
└────────────────────────────┼─────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────────────────┐
│ Community VPS / Server │
│ │
│ ┌──────────┐ ┌───────────┐ ┌─────────┐ ┌──────────────┐ │
│ │ MQTT │──▶│ Collector │──▶│ API │◀──│ Web Dashboard│ │
│ │ Broker │ │ │ │ │ │ (public) │ │
│ └──────────┘ └───────────┘ └─────────┘ └──────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────┘
```
**On each receiver node (Raspberry Pi, etc.):**
```bash
# Only run the receiver component
# Configure .env with MQTT_HOST pointing to your central server
MQTT_HOST=your-community-server.com
SERIAL_PORT=/dev/ttyUSB0
docker compose --profile receiver up -d
```
**On the central server (VPS/cloud):**
```bash
# Run the core infrastructure with local MQTT broker
docker compose --profile mqtt --profile core up -d
# Or connect to an existing MQTT broker (set MQTT_HOST in .env)
docker compose --profile core up -d
```
This architecture allows:
- Multiple receivers for better RF coverage across a geographic area
- Centralized data storage and web interface
- Community members to contribute coverage with minimal setup
- The central server to be hosted anywhere with internet access
## Quick Start
### Using Docker Compose (Recommended)
Docker Compose uses **profiles** to select which services to run:
| Profile | Services | Use Case |
|---------|----------|----------|
| `core` | collector, api, web | Central server infrastructure |
| `receiver` | interface-receiver | Receiver node (events to MQTT) |
| `sender` | interface-sender | Sender node (MQTT to device) |
| `mqtt` | mosquitto broker | Local MQTT broker (optional) |
| `mock` | interface-mock-receiver | Testing without hardware |
| `migrate` | db-migrate | One-time database migration |
| `seed` | seed | One-time seed data import |
**Note:** Most deployments connect to an external MQTT broker. Add `--profile mqtt` only if you need a local broker.
```bash
# Clone the repository
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub

# Copy and configure environment
cp .env.example .env
# Edit .env with your settings (API keys, serial port, network info)

# Create database schema
docker compose --profile migrate run --rm db-migrate

# Seed the database
docker compose --profile seed run --rm seed

# Start core services with local MQTT broker
docker compose --profile mqtt --profile core up -d

# Or connect to external MQTT (configure MQTT_HOST in .env)
docker compose --profile core up -d

# Start just the receiver (connects to MQTT_HOST from .env)
docker compose --profile receiver up -d

# View logs
docker compose logs -f

# Stop services
docker compose down
```
#### Serial Device Access
@@ -170,7 +261,8 @@ All components are configured via environment variables. Create a `.env` file or
| Variable | Default | Description |
|----------|---------|-------------|
| `DATABASE_URL` | `sqlite:///{data_home}/collector/meshcore.db` | SQLAlchemy database URL |
| `SEED_HOME` | `./seed` | Directory containing seed data files (node_tags.yaml, members.yaml) |
#### Webhook Configuration
@@ -216,7 +308,6 @@ Webhook payload format:
| `NETWORK_NAME` | `MeshCore Network` | Display name for the network |
| `NETWORK_CITY` | *(none)* | City where network is located |
| `NETWORK_COUNTRY` | *(none)* | Country code (ISO 3166-1 alpha-2) |
| `NETWORK_LOCATION` | *(none)* | Center coordinates (lat,lon) |
## CLI Reference
@@ -229,10 +320,12 @@ meshcore-hub interface --mode receiver --port /dev/ttyUSB0
meshcore-hub interface --mode sender --mock # Use mock device
# Collector component
meshcore-hub collector # Run collector (auto-seeds on startup)
meshcore-hub collector seed # Import all seed data from SEED_HOME
meshcore-hub collector import-tags # Import node tags from SEED_HOME/node_tags.yaml
meshcore-hub collector import-tags /path/to/file.yaml # Import from specific file
meshcore-hub collector import-members # Import members from SEED_HOME/members.yaml
meshcore-hub collector import-members /path/to/file.yaml # Import from specific file
# API component
meshcore-hub api --host 0.0.0.0 --port 8000
@@ -246,74 +339,124 @@ meshcore-hub db downgrade # Rollback one migration
meshcore-hub db current # Show current revision
```
## Seed Data
The collector supports seeding the database with node tags and network members on startup. Seed files are read from the `SEED_HOME` directory (default: `./seed`).
### Automatic Seeding
When the collector starts, it automatically imports seed data from YAML files if they exist:
- `{SEED_HOME}/node_tags.yaml` - Node tag definitions
- `{SEED_HOME}/members.yaml` - Network member definitions
### Manual Seeding
```bash
# Native CLI
meshcore-hub collector seed
# With Docker Compose
docker compose --profile seed up
```
### Directory Structure
```
seed/ # SEED_HOME (seed data files)
├── node_tags.yaml # Node tags for import
└── members.yaml # Network members for import
data/ # DATA_HOME (runtime data)
└── collector/
└── meshcore.db # SQLite database
```
Example seed files are provided in `example/seed/`.
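The auto-seeding decision described above reduces to a per-file existence check under `SEED_HOME`. A sketch of that logic (the handler names are illustrative, not the collector's actual function names):

```python
from pathlib import Path

# Seed files documented in this README, mapped to illustrative handler names
SEED_FILES = {
    "node_tags.yaml": "import_tags",
    "members.yaml": "import_members",
}


def plan_seed_imports(seed_home: str) -> list:
    """Return (path, handler) pairs for the seed files that actually exist."""
    home = Path(seed_home)
    return [
        (str(home / name), handler)
        for name, handler in SEED_FILES.items()
        if (home / name).is_file()
    ]
```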
## Node Tags
Node tags allow you to attach custom metadata to nodes (e.g., location, role, owner). Tags are stored in the database and returned with node data via the API.
### Node Tags YAML Format
Tags are keyed by public key in YAML format:
```yaml
# Each key is a 64-character hex public key
0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef:
  friendly_name: Gateway Node
  role: gateway
  lat: 37.7749
  lon: -122.4194
  is_online: true

fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210:
  friendly_name: Oakland Repeater
  altitude: 150
  location:
    value: "37.8044,-122.2712"
    type: coordinate
```
Tag values can be:
- **YAML primitives** (auto-detected type): strings, numbers, booleans
- **Explicit type** (for special types like coordinate):

```yaml
location:
  value: "37.7749,-122.4194"
  type: coordinate
```

Supported types: `string`, `number`, `boolean`, `coordinate`
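The auto-detection rule above can be illustrated with a small helper; `normalize_tag` is a hypothetical name used only for this sketch, not a function in meshcore-hub:

```python
from typing import Any, Tuple


def normalize_tag(raw: Any) -> Tuple[str, str]:
    """Return (value, value_type) for one tag entry.

    Plain YAML scalars get an auto-detected type; dicts carrying an
    explicit 'type' key (e.g. coordinate) pass that type through.
    """
    if isinstance(raw, dict):
        return str(raw["value"]), raw.get("type", "string")
    if isinstance(raw, bool):  # check bool first: bool is a subclass of int
        return str(raw).lower(), "boolean"
    if isinstance(raw, (int, float)):
        return str(raw), "number"
    return str(raw), "string"
```

For example, `normalize_tag(150)` yields `('150', 'number')`, while the explicit-type coordinate dict passes its declared type through unchanged.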
### Import Tags Manually
```bash
# Import from default location ({SEED_HOME}/node_tags.yaml)
meshcore-hub collector import-tags
# Import from specific file
meshcore-hub collector import-tags /path/to/node_tags.yaml
# Skip tags for nodes that don't exist
meshcore-hub collector import-tags --no-create-nodes
```
## Network Members
Network members represent the people operating nodes in your network. Members can optionally be linked to nodes via their public key.
### Members YAML Format
```yaml
members:
  - name: John Doe
    callsign: N0CALL
    role: Network Operator
    description: Example member entry
    contact: john@example.com
    public_key: 0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef
```
| Field | Required | Description |
|-------|----------|-------------|
| `name` | Yes | Member's display name |
| `callsign` | No | Amateur radio callsign |
| `role` | No | Member's role in the network |
| `description` | No | Additional description |
| `contact` | No | Contact information |
| `public_key` | No | Associated node public key (64-char hex) |
### Import Members Manually
```bash
# Import from default location ({SEED_HOME}/members.yaml)
meshcore-hub collector import-members

# Import from specific file
meshcore-hub collector import-members /path/to/members.yaml
```
### Managing Tags via API
Tags can also be managed via the REST API:
@@ -385,7 +528,7 @@ curl -X POST \
| GET | `/api/v1/trace-paths` | List trace paths |
| POST | `/api/v1/commands/send-message` | Send direct message |
| POST | `/api/v1/commands/send-channel-message` | Send channel message |
| GET | `/api/v1/dashboard/stats` | Get network statistics |
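For quick testing, the endpoints above can be called from Python with only the stdlib; the base URL below assumes a default local deployment on port 8000:

```python
import json
import urllib.request

API_BASE = "http://localhost:8000/api/v1"  # adjust for your deployment


def endpoint(base: str, path: str) -> str:
    """Join the API base with an endpoint path, normalizing slashes."""
    return f"{base.rstrip('/')}/{path.lstrip('/')}"


def get_stats() -> dict:
    """Fetch network statistics from the dashboard stats endpoint."""
    with urllib.request.urlopen(endpoint(API_BASE, "dashboard/stats")) as resp:
        return json.load(resp)
```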
## Development
@@ -393,7 +536,7 @@ curl -X POST \
```bash
# Clone and setup
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub
python -m venv .venv
source .venv/bin/activate
@@ -459,11 +602,13 @@ meshcore-hub/
├── alembic/ # Database migrations
├── etc/ # Configuration files (mosquitto.conf)
├── example/                  # Example files for testing
│   └── seed/                 # Example seed data files
│       ├── node_tags.yaml    # Example node tags
│       └── members.yaml      # Example network members
├── seed/                     # Seed data directory (SEED_HOME, copy from example/seed/)
├── data/                     # Runtime data directory (DATA_HOME, created at runtime)
├── Dockerfile                # Docker build configuration
├── docker-compose.yml        # Docker Compose services
├── PROMPT.md # Project specification
├── SCHEMAS.md # Event schema documentation
├── PLAN.md # Implementation plan
@@ -491,7 +636,7 @@ meshcore-hub/
## License
This project is licensed under the GNU General Public License v3.0 or later (GPL-3.0-or-later). See [LICENSE](LICENSE) for details.
## Acknowledgments
@@ -0,0 +1,114 @@
"""Add member_nodes association table
Revision ID: 002
Revises: 001
Create Date: 2024-12-05
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "002"
down_revision: Union[str, None] = "001"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Create member_nodes table
op.create_table(
"member_nodes",
sa.Column("id", sa.String(), nullable=False),
sa.Column("member_id", sa.String(36), nullable=False),
sa.Column("public_key", sa.String(64), nullable=False),
sa.Column("node_role", sa.String(50), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.ForeignKeyConstraint(["member_id"], ["members.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
)
op.create_index("ix_member_nodes_member_id", "member_nodes", ["member_id"])
op.create_index("ix_member_nodes_public_key", "member_nodes", ["public_key"])
op.create_index(
"ix_member_nodes_member_public_key",
"member_nodes",
["member_id", "public_key"],
)
# Migrate existing public_key data from members to member_nodes
# Get all members with a public_key
connection = op.get_bind()
members_with_keys = connection.execute(
sa.text("SELECT id, public_key FROM members WHERE public_key IS NOT NULL")
).fetchall()
    # Insert into member_nodes, generating a UUID primary key per row
    import uuid

    for member_id, public_key in members_with_keys:
        node_id = str(uuid.uuid4())
connection.execute(
sa.text(
"""
INSERT INTO member_nodes (id, member_id, public_key, created_at, updated_at)
VALUES (:id, :member_id, :public_key, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP)
"""
),
{"id": node_id, "member_id": member_id, "public_key": public_key},
)
# Drop the public_key column from members
op.drop_index("ix_members_public_key", table_name="members")
op.drop_column("members", "public_key")
def downgrade() -> None:
# Add public_key column back to members
op.add_column(
"members",
sa.Column("public_key", sa.String(64), nullable=True),
)
op.create_index("ix_members_public_key", "members", ["public_key"])
# Migrate data back - take the first node for each member
connection = op.get_bind()
member_nodes = connection.execute(
sa.text(
"""
SELECT DISTINCT member_id, public_key
FROM member_nodes
WHERE (member_id, created_at) IN (
SELECT member_id, MIN(created_at)
FROM member_nodes
GROUP BY member_id
)
"""
)
).fetchall()
for member_id, public_key in member_nodes:
connection.execute(
sa.text(
"UPDATE members SET public_key = :public_key WHERE id = :member_id"
),
{"public_key": public_key, "member_id": member_id},
)
# Drop member_nodes table
op.drop_table("member_nodes")
@@ -0,0 +1,67 @@
"""Add event_hash column to event tables for deduplication
Revision ID: 003
Revises: 002
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "003"
down_revision: Union[str, None] = "002"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Add event_hash column to messages table
op.add_column(
"messages",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_messages_event_hash", "messages", ["event_hash"])
# Add event_hash column to advertisements table
op.add_column(
"advertisements",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_advertisements_event_hash", "advertisements", ["event_hash"])
# Add event_hash column to trace_paths table
op.add_column(
"trace_paths",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_trace_paths_event_hash", "trace_paths", ["event_hash"])
# Add event_hash column to telemetry table
op.add_column(
"telemetry",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_telemetry_event_hash", "telemetry", ["event_hash"])
def downgrade() -> None:
# Remove event_hash from telemetry
op.drop_index("ix_telemetry_event_hash", table_name="telemetry")
op.drop_column("telemetry", "event_hash")
# Remove event_hash from trace_paths
op.drop_index("ix_trace_paths_event_hash", table_name="trace_paths")
op.drop_column("trace_paths", "event_hash")
# Remove event_hash from advertisements
op.drop_index("ix_advertisements_event_hash", table_name="advertisements")
op.drop_column("advertisements", "event_hash")
# Remove event_hash from messages
op.drop_index("ix_messages_event_hash", table_name="messages")
op.drop_column("messages", "event_hash")
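The `event_hash` columns above are 32 characters wide, consistent with a hex digest such as MD5; the collector defines the actual field set and algorithm, so the following is only a hedged sketch of how such a deduplication hash could be derived:

```python
import hashlib
import json


def event_hash(payload: dict) -> str:
    """Derive a 32-character dedup hash for an event payload.

    Assumed scheme: MD5 hex digest over a canonical JSON encoding,
    matching the String(32) column width added by this migration.
    Keys are sorted so logically identical events hash identically.
    """
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()
```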
@@ -0,0 +1,63 @@
"""Add event_receivers junction table for multi-receiver tracking
Revision ID: 004
Revises: 003
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "004"
down_revision: Union[str, None] = "003"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_table(
"event_receivers",
sa.Column("id", sa.String(36), primary_key=True),
sa.Column("event_type", sa.String(20), nullable=False),
sa.Column("event_hash", sa.String(32), nullable=False),
sa.Column(
"receiver_node_id",
sa.String(36),
sa.ForeignKey("nodes.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("snr", sa.Float, nullable=True),
sa.Column("received_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
sa.UniqueConstraint(
"event_hash", "receiver_node_id", name="uq_event_receivers_hash_node"
),
)
op.create_index(
"ix_event_receivers_event_hash",
"event_receivers",
["event_hash"],
)
op.create_index(
"ix_event_receivers_receiver_node_id",
"event_receivers",
["receiver_node_id"],
)
op.create_index(
"ix_event_receivers_type_hash",
"event_receivers",
["event_type", "event_hash"],
)
def downgrade() -> None:
op.drop_index("ix_event_receivers_type_hash", table_name="event_receivers")
op.drop_index("ix_event_receivers_receiver_node_id", table_name="event_receivers")
op.drop_index("ix_event_receivers_event_hash", table_name="event_receivers")
op.drop_table("event_receivers")
@@ -0,0 +1,126 @@
"""Make event_hash columns unique for race condition prevention
Revision ID: 005
Revises: 004
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
from sqlalchemy import inspect
# revision identifiers, used by Alembic.
revision: str = "005"
down_revision: Union[str, None] = "004"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def _index_exists(table_name: str, index_name: str) -> bool:
"""Check if an index exists on a table."""
bind = op.get_bind()
inspector = inspect(bind)
indexes = inspector.get_indexes(table_name)
return any(idx["name"] == index_name for idx in indexes)
def _has_unique_on_column(table_name: str, column_name: str) -> bool:
"""Check if a unique constraint or unique index exists on a column."""
bind = op.get_bind()
inspector = inspect(bind)
# Check unique constraints
uniques = inspector.get_unique_constraints(table_name)
for uq in uniques:
if column_name in uq.get("column_names", []):
return True
# Also check indexes (SQLite may create unique index instead of constraint)
indexes = inspector.get_indexes(table_name)
for idx in indexes:
if idx.get("unique") and column_name in idx.get("column_names", []):
return True
return False
def upgrade() -> None:
# Convert non-unique indexes to unique indexes for race condition prevention
# Note: SQLite handles NULL values as unique (each NULL is distinct)
# SQLite doesn't support ALTER TABLE ADD CONSTRAINT, so we use unique indexes
# Messages
if _index_exists("messages", "ix_messages_event_hash"):
op.drop_index("ix_messages_event_hash", table_name="messages")
if not _has_unique_on_column("messages", "event_hash"):
op.create_index(
"ix_messages_event_hash_unique",
"messages",
["event_hash"],
unique=True,
)
# Advertisements
if _index_exists("advertisements", "ix_advertisements_event_hash"):
op.drop_index("ix_advertisements_event_hash", table_name="advertisements")
if not _has_unique_on_column("advertisements", "event_hash"):
op.create_index(
"ix_advertisements_event_hash_unique",
"advertisements",
["event_hash"],
unique=True,
)
# Trace paths
if _index_exists("trace_paths", "ix_trace_paths_event_hash"):
op.drop_index("ix_trace_paths_event_hash", table_name="trace_paths")
if not _has_unique_on_column("trace_paths", "event_hash"):
op.create_index(
"ix_trace_paths_event_hash_unique",
"trace_paths",
["event_hash"],
unique=True,
)
# Telemetry
if _index_exists("telemetry", "ix_telemetry_event_hash"):
op.drop_index("ix_telemetry_event_hash", table_name="telemetry")
if not _has_unique_on_column("telemetry", "event_hash"):
op.create_index(
"ix_telemetry_event_hash_unique",
"telemetry",
["event_hash"],
unique=True,
)
def downgrade() -> None:
# Restore non-unique indexes
# Telemetry
if _index_exists("telemetry", "ix_telemetry_event_hash_unique"):
op.drop_index("ix_telemetry_event_hash_unique", table_name="telemetry")
if not _index_exists("telemetry", "ix_telemetry_event_hash"):
op.create_index("ix_telemetry_event_hash", "telemetry", ["event_hash"])
# Trace paths
if _index_exists("trace_paths", "ix_trace_paths_event_hash_unique"):
op.drop_index("ix_trace_paths_event_hash_unique", table_name="trace_paths")
if not _index_exists("trace_paths", "ix_trace_paths_event_hash"):
op.create_index("ix_trace_paths_event_hash", "trace_paths", ["event_hash"])
# Advertisements
if _index_exists("advertisements", "ix_advertisements_event_hash_unique"):
op.drop_index(
"ix_advertisements_event_hash_unique", table_name="advertisements"
)
if not _index_exists("advertisements", "ix_advertisements_event_hash"):
op.create_index(
"ix_advertisements_event_hash", "advertisements", ["event_hash"]
)
# Messages
if _index_exists("messages", "ix_messages_event_hash_unique"):
op.drop_index("ix_messages_event_hash_unique", table_name="messages")
if not _index_exists("messages", "ix_messages_event_hash"):
op.create_index("ix_messages_event_hash", "messages", ["event_hash"])
-10
View File
@@ -1,10 +0,0 @@
{
"members": [
{
"name": "Louis",
"callsign": "Louis",
"role": "admin",
"description": "IPNet Founder"
}
]
}
+66
View File
@@ -0,0 +1,66 @@
# IPNet Network Members
members:
  - name: Louis
    callsign: Louis
    role: admin
    description: IPNet Founder
    nodes:
      # ip2-rep01
      - public_key: 2337484665ced7e210007e9fd9db98ced0a24a6eab8b4cbe3a06b3a1cea33ca1
        node_role: repeater
      # ip2-rep02
      - public_key: 8cb01fff1afc099055af418ce5fc5e60384df9ff763c25dd7e6a5e0922e8df90
        node_role: repeater
      # ip2-rep03
      - public_key: 5b565df747913358e24d890b2227de9c35d09763746b6ec326c15ebbf9b8be3b
        node_role: repeater
      # ip2-sol01
      - public_key: 87eb9487a1a4351e986e55627b2d09c4da61f94d080eaf4d7129caef89886e25
        node_role: repeater
      # personal chat node
      - public_key: c6e0d85528b4b5d7f53aa7dded2b7e0b9c8f8a5c00acfaad47476ef5f3c7dc47
        node_role: chat
  - name: Mark
    callsign: Mark
    role: member
    description: IPNet Member
    nodes:
      - public_key: 22309435fbd9dd1f14870a1895dc854779f6b2af72b08542f6105d264a493ebe
        node_role: repeater
      - public_key: 9135986b83815ada92883358435cc6528c7db60cb647f9b6547739a1ce5eb1c8
        node_role: repeater
      - public_key: 2a4f89e766dfa1758e35a69962c1f6d352b206a5e3562a589155a3ebfe7fc2bb
        node_role: repeater
      - public_key: e790b73b2d6e377dd0f575c847f3ef42232f610eb9a19af57083fc4f647309ac
        node_role: repeater
      - public_key: d3c20d962f7384c111fbafad6fbc1c1dc0e5c3ce802fb3ee11020e8d8207ed3a
        node_role: repeater
      - public_key: b00ce9d218203e96d8557a4d59e06f5de59bbc4dcc4df9c870079d2cb8b5bd80
        node_role: repeater
  - name: CCZ
    callsign: CCZ
    role: member
    nodes:
      - public_key: e334ec5475789d542ed9e692fbeef7444a371fcc05adcbda1f47ba6a3191b459
        node_role: repeater
      - public_key: cc15fb33e98f2e098a543f516f770dc3061a1a6b30f79b84780663bf68ae6b53
        node_role: repeater
      - public_key: 20ed75ffc0f9777951716bb3d308d7f041fd2ad32fe2e998e600d0361e1fe2ac
        node_role: repeater
    description: IPNet Member
  - name: Walshie
    callsign: Walshie86
    role: member
    description: IPNet Member
    nodes:
      - public_key: bd7b5ac75f660675b39f368e1dbb6d1dbcefd8bd7a170e21a942954f67c8bf52
        node_role: repeater
      - public_key: 9cf300c40112ea34d0a59858270948b27ab6cd87e840de338f3ca782c17537b2
        node_role: repeater
  - name: Craig
    callsign: M7XCN
    role: member
    description: IPNet Member
    nodes:
      - public_key: 8accb6d0189ccaffb745ba54793e7fe3edd515edb45554325d957e48c1b9f3b3
        node_role: repeater
-613
View File
@@ -1,613 +0,0 @@
{
"2337484665ced7e210007e9fd9db98ced0a24a6eab8b4cbe3a06b3a1cea33ca1": {
"friendly_name": "IP2 Repeater 1",
"node_id": "ip2-rep01.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0357627,1.132079",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "31",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"8cb01fff1afc099055af418ce5fc5e60384df9ff763c25dd7e6a5e0922e8df90": {
"friendly_name": "IP2 Repeater 2",
"node_id": "ip2-rep02.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0390682,1.1304141",
"type": "coordinate"
},
"location_description": "Belstead Road",
"hardware": "Heltec V3",
"antenna": "McGill 6dBi Omni",
"elevation": {
"value": "44",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"5b565df747913358e24d890b2227de9c35d09763746b6ec326c15ebbf9b8be3b": {
"friendly_name": "IP2 Repeater 3",
"node_id": "ip2-rep03.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.046356,1.134661",
"type": "coordinate"
},
"location_description": "Birkfield Drive",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "52",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"780d0939f90b22d3bd7cbedcaf4e8d468a12c01886ab24b8cfa11eab2f5516c5": {
"friendly_name": "IP2 Integration 1",
"node_id": "ip2-int01.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0354539,1.1295338",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "25",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"30121dc60362c633c457ffa18f49b3e1d6823402c33709f32d7df70612250b96": {
"friendly_name": "MeshBot",
"node_id": "bot.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0354539,1.1295338",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "25",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"9135986b83815ada92883358435cc6528c7db60cb647f9b6547739a1ce5eb1c8": {
"friendly_name": "IP3 Repeater 1",
"node_id": "ip3-rep01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.045803,1.204416",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "42",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"e334ec5475789d542ed9e692fbeef7444a371fcc05adcbda1f47ba6a3191b459": {
"friendly_name": "IP3 Repeater 2",
"node_id": "ip3-rep02.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.03297,1.17543",
"type": "coordinate"
},
"location_description": "Morland Road Allotments",
"hardware": "Heltec T114",
"antenna": "Unknown",
"elevation": {
"value": "39",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"cc15fb33e98f2e098a543f516f770dc3061a1a6b30f79b84780663bf68ae6b53": {
"friendly_name": "IP3 Repeater 3",
"node_id": "ip3-rep03.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.04499,1.18149",
"type": "coordinate"
},
"location_description": "Hatfield Road",
"hardware": "Heltec V3",
"antenna": "Unknown",
"elevation": {
"value": "39",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"22309435fbd9dd1f14870a1895dc854779f6b2af72b08542f6105d264a493ebe": {
"friendly_name": "IP3 Integration 1",
"node_id": "ip3-int01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.045773,1.212808",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Heltec V3",
"antenna": "Generic 3dBi Whip",
"elevation": {
"value": "37",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"2a4f89e766dfa1758e35a69962c1f6d352b206a5e3562a589155a3ebfe7fc2bb": {
"friendly_name": "IP3 Repeater 4",
"node_id": "ip3-rep04.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.046383,1.174542",
"type": "coordinate"
},
"location_description": "Holywells",
"hardware": "Sensecap Solar",
"antenna": "Paradar 6.5dbi Omni",
"elevation": {
"value": "21",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"e790b73b2d6e377dd0f575c847f3ef42232f610eb9a19af57083fc4f647309ac": {
"friendly_name": "IP3 Repeater 5",
"node_id": "ip3-rep05.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.05252,1.17034",
"type": "coordinate"
},
"location_description": "Back Hamlet",
"hardware": "Heltec T114",
"antenna": "Paradar 6.5dBi Omni",
"elevation": {
"value": "38",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"20ed75ffc0f9777951716bb3d308d7f041fd2ad32fe2e998e600d0361e1fe2ac": {
"friendly_name": "IP3 Repeater 6",
"node_id": "ip3-rep06.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.04893,1.18965",
"type": "coordinate"
},
"location_description": "Dover Road",
"hardware": "Unknown",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "38",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"bd7b5ac75f660675b39f368e1dbb6d1dbcefd8bd7a170e21a942954f67c8bf52": {
"friendly_name": "IP8 Repeater 1",
"node_id": "rep01.ip8.ipnt.uk",
"member_id": "walshie86",
"area": "IP8",
"location": {
"value": "52.033684,1.118384",
"type": "coordinate"
},
"location_description": "Grove Hill",
"hardware": "Heltec V3",
"antenna": "McGill 3dBi Omni",
"elevation": {
"value": "13",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"9cf300c40112ea34d0a59858270948b27ab6cd87e840de338f3ca782c17537b2": {
"friendly_name": "IP8 Repeater 2",
"node_id": "rep02.ip8.ipnt.uk",
"member_id": "walshie86",
"area": "IP8",
"location": {
"value": "52.035648,1.073271",
"type": "coordinate"
},
"location_description": "Washbrook",
"hardware": "Sensecap Solar",
"elevation": {
"value": "13",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"d3c20d962f7384c111fbafad6fbc1c1dc0e5c3ce802fb3ee11020e8d8207ed3a": {
"friendly_name": "IP4 Repeater 1",
"node_id": "ip4-rep01.ipnt.uk",
"member_id": "markab",
"area": "IP4",
"location": {
"value": "52.052445,1.156882",
"type": "coordinate"
},
"location_description": "Wine Rack",
"hardware": "Heltec T114",
"antenna": "Generic 5dbi Whip",
"elevation": {
"value": "50",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"b00ce9d218203e96d8557a4d59e06f5de59bbc4dcc4df9c870079d2cb8b5bd80": {
"friendly_name": "IP4 Repeater 2",
"node_id": "ip4-rep02.ipnt.uk",
"member_id": "markab",
"area": "IP4",
"location": {
"value": "52.06217,1.18332",
"type": "coordinate"
},
"location_description": "Rushmere Road",
"hardware": "Heltec V3",
"antenna": "Paradar 5dbi Whip",
"elevation": {
"value": "35",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"8accb6d0189ccaffb745ba54793e7fe3edd515edb45554325d957e48c1b9f3b3": {
"friendly_name": "IP4 Repeater 3",
"node_id": "ip4-rep03.ipnt.uk",
"member_id": "craig",
"area": "IP4",
"location": {
"value": "52.058,1.165",
"type": "coordinate"
},
"location_description": "IP4 Area",
"hardware": "Heltec v3",
"antenna": "Generic Whip",
"elevation": {
"value": "30",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"69fb8431e7ab307513797544fab99ce53ce24c46ec2d3a11767fe70f2ca37b23": {
"friendly_name": "IP3 Test Repeater 1",
"node_id": "ip3-tst01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.041869,1.204789",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Station G2",
"antenna": "McGill 10dBi Panel",
"elevation": {
"value": "37",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "true",
"type": "boolean"
},
"mesh_role": "repeater"
}
}
+239
@@ -0,0 +1,239 @@
# IPNet Network Node Tags
# Uses YAML primitives: numbers, booleans, and strings are auto-detected
# IP2 Area Nodes
2337484665ced7e210007e9fd9db98ced0a24a6eab8b4cbe3a06b3a1cea33ca1:
friendly_name: IP2 Repeater 1
node_id: ip2-rep01.ipnt.uk
member_id: louis
area: IP2
lat: 52.0357627
lon: 1.132079
location_description: Fountains Road
hardware: Heltec V3
antenna: Paradar 8.5dBi Omni
elevation: 31
role: infra
8cb01fff1afc099055af418ce5fc5e60384df9ff763c25dd7e6a5e0922e8df90:
friendly_name: IP2 Repeater 2
node_id: ip2-rep02.ipnt.uk
member_id: louis
area: IP2
lat: 52.0390682
lon: 1.1304141
location_description: Belstead Road
hardware: Heltec V3
antenna: McGill 6dBi Omni
elevation: 44
role: infra
5b565df747913358e24d890b2227de9c35d09763746b6ec326c15ebbf9b8be3b:
friendly_name: IP2 Repeater 3
node_id: ip2-rep03.ipnt.uk
member_id: louis
area: IP2
lat: 52.046356
lon: 1.134661
location_description: Birkfield Drive
hardware: Heltec V3
antenna: Paradar 8.5dBi Omni
elevation: 52
role: infra
780d0939f90b22d3bd7cbedcaf4e8d468a12c01886ab24b8cfa11eab2f5516c5:
friendly_name: IP2 Integration 1
node_id: ip2-int01.ipnt.uk
member_id: louis
area: IP2
lat: 52.0354539
lon: 1.1295338
location_description: Fountains Road
hardware: Heltec V3
antenna: Generic 5dBi Whip
elevation: 25
role: infra
30121dc60362c633c457ffa18f49b3e1d6823402c33709f32d7df70612250b96:
friendly_name: MeshBot
node_id: bot.ipnt.uk
member_id: louis
area: IP2
lat: 52.0354539
lon: 1.1295338
location_description: Fountains Road
hardware: Heltec V3
antenna: Generic 5dBi Whip
elevation: 25
role: infra
# IP3 Area Nodes
9135986b83815ada92883358435cc6528c7db60cb647f9b6547739a1ce5eb1c8:
friendly_name: IP3 Repeater 1
node_id: ip3-rep01.ipnt.uk
member_id: markab
area: IP3
lat: 52.045803
lon: 1.204416
location_description: Brokehall
hardware: Heltec V3
antenna: Paradar 8.5dBi Omni
elevation: 42
role: infra
e334ec5475789d542ed9e692fbeef7444a371fcc05adcbda1f47ba6a3191b459:
friendly_name: IP3 Repeater 2
node_id: ip3-rep02.ipnt.uk
member_id: ccz
area: IP3
lat: 52.03297
lon: 1.17543
location_description: Morland Road Allotments
hardware: Heltec T114
antenna: Unknown
elevation: 39
role: infra
cc15fb33e98f2e098a543f516f770dc3061a1a6b30f79b84780663bf68ae6b53:
friendly_name: IP3 Repeater 3
node_id: ip3-rep03.ipnt.uk
member_id: ccz
area: IP3
lat: 52.04499
lon: 1.18149
location_description: Hatfield Road
hardware: Heltec V3
antenna: Unknown
elevation: 39
role: infra
22309435fbd9dd1f14870a1895dc854779f6b2af72b08542f6105d264a493ebe:
friendly_name: IP3 Integration 1
node_id: ip3-int01.ipnt.uk
member_id: markab
area: IP3
lat: 52.045773
lon: 1.212808
location_description: Brokehall
hardware: Heltec V3
antenna: Generic 3dBi Whip
elevation: 37
role: infra
2a4f89e766dfa1758e35a69962c1f6d352b206a5e3562a589155a3ebfe7fc2bb:
friendly_name: IP3 Repeater 4
node_id: ip3-rep04.ipnt.uk
member_id: markab
area: IP3
lat: 52.046383
lon: 1.174542
location_description: Holywells
hardware: Sensecap Solar
antenna: Paradar 6.5dBi Omni
elevation: 21
role: infra
e790b73b2d6e377dd0f575c847f3ef42232f610eb9a19af57083fc4f647309ac:
friendly_name: IP3 Repeater 5
node_id: ip3-rep05.ipnt.uk
member_id: markab
area: IP3
lat: 52.05252
lon: 1.17034
location_description: Back Hamlet
hardware: Heltec T114
antenna: Paradar 6.5dBi Omni
elevation: 38
role: infra
20ed75ffc0f9777951716bb3d308d7f041fd2ad32fe2e998e600d0361e1fe2ac:
friendly_name: IP3 Repeater 6
node_id: ip3-rep06.ipnt.uk
member_id: ccz
area: IP3
lat: 52.04893
lon: 1.18965
location_description: Dover Road
hardware: Unknown
antenna: Generic 5dBi Whip
elevation: 38
role: infra
69fb8431e7ab307513797544fab99ce53ce24c46ec2d3a11767fe70f2ca37b23:
friendly_name: IP3 Test Repeater 1
node_id: ip3-tst01.ipnt.uk
member_id: markab
area: IP3
lat: 52.041869
lon: 1.204789
location_description: Brokehall
hardware: Station G2
antenna: McGill 10dBi Panel
elevation: 37
role: infra
# IP4 Area Nodes
d3c20d962f7384c111fbafad6fbc1c1dc0e5c3ce802fb3ee11020e8d8207ed3a:
friendly_name: IP4 Repeater 1
node_id: ip4-rep01.ipnt.uk
member_id: markab
area: IP4
lat: 52.052445
lon: 1.156882
location_description: Wine Rack
hardware: Heltec T114
antenna: Generic 5dBi Whip
elevation: 50
role: infra
b00ce9d218203e96d8557a4d59e06f5de59bbc4dcc4df9c870079d2cb8b5bd80:
friendly_name: IP4 Repeater 2
node_id: ip4-rep02.ipnt.uk
member_id: markab
area: IP4
lat: 52.06217
lon: 1.18332
location_description: Rushmere Road
hardware: Heltec V3
antenna: Paradar 5dBi Whip
elevation: 35
role: infra
8accb6d0189ccaffb745ba54793e7fe3edd515edb45554325d957e48c1b9f3b3:
friendly_name: IP4 Repeater 3
node_id: ip4-rep03.ipnt.uk
member_id: craig
area: IP4
lat: 52.058
lon: 1.165
location_description: IP4 Area
hardware: Heltec V3
antenna: Generic Whip
elevation: 30
role: infra
# IP8 Area Nodes
bd7b5ac75f660675b39f368e1dbb6d1dbcefd8bd7a170e21a942954f67c8bf52:
friendly_name: IP8 Repeater 1
node_id: rep01.ip8.ipnt.uk
member_id: walshie86
area: IP8
lat: 52.033684
lon: 1.118384
location_description: Grove Hill
hardware: Heltec V3
antenna: McGill 3dBi Omni
elevation: 13
role: infra
9cf300c40112ea34d0a59858270948b27ab6cd87e840de338f3ca782c17537b2:
friendly_name: IP8 Repeater 2
node_id: rep02.ip8.ipnt.uk
member_id: walshie86
area: IP8
lat: 52.035648
lon: 1.073271
location_description: Washbrook
hardware: Sensecap Solar
elevation: 13
role: infra
-14
@@ -1,14 +0,0 @@
services:
api:
networks:
- default
- pangolin
web:
networks:
- default
- pangolin
networks:
pangolin:
external: true
+33 -27
@@ -1,10 +1,14 @@
services:
# ==========================================================================
# MQTT Broker - Eclipse Mosquitto
# MQTT Broker - Eclipse Mosquitto (optional, use --profile mqtt)
# Most users will connect to an external MQTT broker instead
# ==========================================================================
mqtt:
image: eclipse-mosquitto:2
container_name: meshcore-mqtt
profiles:
- all
- mqtt
restart: unless-stopped
ports:
- "${MQTT_EXTERNAL_PORT:-1883}:1883"
@@ -24,17 +28,15 @@ services:
# Interface Receiver - MeshCore device to MQTT bridge (events)
# ==========================================================================
interface-receiver:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-receiver
profiles:
- interface-receiver
- all
- receiver
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
devices:
- "${SERIAL_PORT:-/dev/ttyUSB0}:${SERIAL_PORT:-/dev/ttyUSB0}"
user: root # Required for device access
@@ -60,17 +62,15 @@ services:
# Interface Sender - MQTT to MeshCore device bridge (commands)
# ==========================================================================
interface-sender:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-sender
profiles:
- interface-sender
- all
- sender
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
devices:
- "${SERIAL_PORT_SENDER:-/dev/ttyUSB1}:${SERIAL_PORT_SENDER:-/dev/ttyUSB1}"
user: root # Required for device access
@@ -96,17 +96,15 @@ services:
# Interface Mock Receiver - For testing without real devices
# ==========================================================================
interface-mock-receiver:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-mock-receiver
profiles:
- all
- mock
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
environment:
- LOG_LEVEL=${LOG_LEVEL:-INFO}
- MQTT_HOST=${MQTT_HOST:-mqtt}
@@ -128,18 +126,18 @@ services:
# Collector - MQTT subscriber and database storage
# ==========================================================================
collector:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-collector
profiles:
- all
- core
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
volumes:
# Mount data directory (contains collector/meshcore.db)
- ${DATA_HOME:-./data}:/data
- ${SEED_HOME:-./seed}:/seed
environment:
- LOG_LEVEL=${LOG_LEVEL:-INFO}
- MQTT_HOST=${MQTT_HOST:-mqtt}
@@ -148,6 +146,7 @@ services:
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- DATA_HOME=/data
- SEED_HOME=/seed
# Explicitly unset to use DATA_HOME-based default path
- DATABASE_URL=
# Webhook configuration
@@ -174,15 +173,16 @@ services:
# API Server - REST API for querying data and sending commands
# ==========================================================================
api:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-api
profiles:
- all
- core
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
collector:
condition: service_started
ports:
@@ -216,11 +216,14 @@ services:
# Web Dashboard - Web interface for network visualization
# ==========================================================================
web:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-web
profiles:
- all
- core
restart: unless-stopped
depends_on:
api:
@@ -236,10 +239,11 @@ services:
- NETWORK_NAME=${NETWORK_NAME:-MeshCore Network}
- NETWORK_CITY=${NETWORK_CITY:-}
- NETWORK_COUNTRY=${NETWORK_COUNTRY:-}
- NETWORK_LOCATION=${NETWORK_LOCATION:-}
- NETWORK_RADIO_CONFIG=${NETWORK_RADIO_CONFIG:-}
- NETWORK_CONTACT_EMAIL=${NETWORK_CONTACT_EMAIL:-}
- NETWORK_CONTACT_DISCORD=${NETWORK_CONTACT_DISCORD:-}
- NETWORK_CONTACT_GITHUB=${NETWORK_CONTACT_GITHUB:-}
- NETWORK_WELCOME_TEXT=${NETWORK_WELCOME_TEXT:-}
command: ["web"]
healthcheck:
test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8080/health')"]
@@ -252,12 +256,13 @@ services:
# Database Migrations - Run Alembic migrations
# ==========================================================================
db-migrate:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-db-migrate
profiles:
- all
- migrate
volumes:
# Mount data directory (uses collector/meshcore.db)
@@ -272,12 +277,13 @@ services:
# Seed Data - Import node_tags.json and members.json from SEED_HOME
# ==========================================================================
seed:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-seed
profiles:
- all
- seed
volumes:
# Mount data directory for database (read-write)
-10
@@ -1,10 +0,0 @@
{
"members": [
{
"name": "Example Member",
"callsign": "N0CALL",
"role": "Network Operator",
"description": "Example member entry"
}
]
}
+16
@@ -0,0 +1,16 @@
# Example members seed file
# Each member can have multiple nodes with different roles (chat, repeater, etc.)
members:
- name: Example Member
callsign: N0CALL
role: Network Operator
description: Example member entry with multiple nodes
nodes:
- public_key: 0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef
node_role: chat
- public_key: fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210
node_role: repeater
- name: Simple Member
callsign: N0CALL2
role: Observer
description: Member without any nodes
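The members seed file above nests per-member node lists. A minimal loading sketch, assuming PyYAML (already a project dependency in pyproject.toml); the `Member`/`load_members` names are illustrative, not the project's actual API:

```python
# Sketch: load a members seed file shaped like the example above.
# Member, MemberNode, and load_members are illustrative names only.
from dataclasses import dataclass, field

import yaml


@dataclass
class MemberNode:
    public_key: str
    node_role: str


@dataclass
class Member:
    name: str
    callsign: str
    role: str
    description: str = ""
    nodes: list[MemberNode] = field(default_factory=list)


def load_members(text: str) -> list[Member]:
    data = yaml.safe_load(text) or {}
    members = []
    for m in data.get("members", []):
        # A member without a "nodes" key (like "Simple Member" above) gets an empty list.
        nodes = [MemberNode(n["public_key"], n["node_role"]) for n in m.get("nodes", [])]
        members.append(
            Member(m["name"], m["callsign"], m["role"], m.get("description", ""), nodes)
        )
    return members
```

The optional `nodes` key is the main reason the YAML form replaces the flat JSON members file removed above.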
-16
@@ -1,16 +0,0 @@
{
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef": {
"friendly_name": "Gateway Node",
"location": {"value": "37.7749,-122.4194", "type": "coordinate"},
"lat": {"value": "37.7749", "type": "number"},
"lon": {"value": "-122.4194", "type": "number"},
"role": "gateway"
},
"fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210": {
"friendly_name": "Oakland Repeater",
"location": {"value": "37.8044,-122.2712", "type": "coordinate"},
"lat": {"value": "37.8044", "type": "number"},
"lon": {"value": "-122.2712", "type": "number"},
"altitude": {"value": "150", "type": "number"}
}
}
+29
@@ -0,0 +1,29 @@
# Example node tags seed file
# Each key is a 64-character hex public key
#
# Tag values can be:
# - YAML primitives (auto-detected type):
# friendly_name: Gateway Node # string
# elevation: 150 # number
# is_online: true # boolean
#
# - Explicit type (for special types like coordinate):
# location:
# value: "37.7749,-122.4194"
# type: coordinate
#
# Supported types: string, number, boolean, coordinate
0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef:
friendly_name: Gateway Node
role: gateway
lat: 37.7749
lon: -122.4194
is_online: true
fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210:
friendly_name: Oakland Repeater
lat: 37.8044
lon: -122.2712
altitude: 150
is_online: false
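The comments in this seed file describe two tag-value forms: a bare YAML primitive whose type is auto-detected, and an explicit `{value, type}` mapping for special types like `coordinate`. A sketch of that normalization under those stated rules; `normalize_tag` is an illustrative name, not the project's actual API:

```python
# Sketch: normalize a tag value from the seed file into a (value, type) pair,
# covering both forms described in the comments above.
# normalize_tag is an illustrative name only.


def normalize_tag(raw: object) -> tuple[str, str]:
    if isinstance(raw, dict) and "value" in raw and "type" in raw:
        # Explicit form, e.g. {"value": "37.7749,-122.4194", "type": "coordinate"}
        return str(raw["value"]), str(raw["type"])
    if isinstance(raw, bool):  # check bool before int: bool is an int subclass
        return ("true" if raw else "false"), "boolean"
    if isinstance(raw, (int, float)):
        return str(raw), "number"
    return str(raw), "string"
```

This mirrors the old JSON file's `{"value": ..., "type": ...}` objects while letting the YAML form stay terse for the common string/number/boolean cases.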
+14 -8
@@ -1,13 +1,13 @@
[build-system]
requires = ["setuptools>=68.0", "wheel"]
requires = ["setuptools>=68.0", "wheel", "setuptools-scm>=8.0"]
build-backend = "setuptools.build_meta"
[project]
name = "meshcore-hub"
version = "0.1.0"
dynamic = ["version"]
description = "Python monorepo for managing and orchestrating MeshCore mesh networks"
readme = "README.md"
license = {text = "MIT"}
license = {text = "GPL-3.0-or-later"}
requires-python = ">=3.11"
authors = [
{name = "MeshCore Hub Contributors"}
@@ -15,7 +15,7 @@ authors = [
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
@@ -39,6 +39,7 @@ dependencies = [
"httpx>=0.25.0",
"aiosqlite>=0.19.0",
"meshcore>=2.2.0",
"pyyaml>=6.0.0",
]
[project.optional-dependencies]
@@ -51,6 +52,7 @@ dev = [
"mypy>=1.5.0",
"pre-commit>=3.4.0",
"types-paho-mqtt>=1.6.0",
"types-PyYAML>=6.0.0",
]
postgres = [
"asyncpg>=0.28.0",
@@ -61,10 +63,14 @@ postgres = [
meshcore-hub = "meshcore_hub.__main__:main"
[project.urls]
Homepage = "https://github.com/meshcore-dev/meshcore-hub"
Documentation = "https://github.com/meshcore-dev/meshcore-hub#readme"
Repository = "https://github.com/meshcore-dev/meshcore-hub"
Issues = "https://github.com/meshcore-dev/meshcore-hub/issues"
Homepage = "https://github.com/ipnet-mesh/meshcore-hub"
Documentation = "https://github.com/ipnet-mesh/meshcore-hub#readme"
Repository = "https://github.com/ipnet-mesh/meshcore-hub"
Issues = "https://github.com/ipnet-mesh/meshcore-hub/issues"
[tool.setuptools_scm]
version_file = "src/meshcore_hub/_version.py"
fallback_version = "0.0.0+unknown"
[tool.setuptools.packages.find]
where = ["src"]
+3 -1
@@ -1,3 +1,5 @@
"""MeshCore Hub - Python monorepo for managing MeshCore mesh networks."""
__version__ = "0.1.0"
from meshcore_hub._version import __version__, __version_tuple__
__all__ = ["__version__", "__version_tuple__"]
+34
@@ -174,6 +174,40 @@ def db_history() -> None:
command.history(alembic_cfg)
@db.command("stamp")
@click.option(
"--revision",
type=str,
default="head",
help="Target revision to stamp (default: head)",
)
@click.option(
"--database-url",
type=str,
default=None,
envvar="DATABASE_URL",
help="Database connection URL",
)
def db_stamp(revision: str, database_url: str | None) -> None:
"""Stamp database with revision without running migrations.
Use this to mark an existing database as up-to-date when the schema
was created before Alembic migrations were introduced.
"""
import os
from alembic import command
from alembic.config import Config
click.echo(f"Stamping database with revision: {revision}")
alembic_cfg = Config("alembic.ini")
if database_url:
os.environ["DATABASE_URL"] = database_url
command.stamp(alembic_cfg, revision)
click.echo("Database stamped successfully.")
# Health check commands for Docker HEALTHCHECK
@cli.group()
def health() -> None:
+1 -2
@@ -32,10 +32,9 @@ async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]:
# Get database URL from app state
database_url = getattr(app.state, "database_url", "sqlite:///./meshcore.db")
# Initialize database
# Initialize database (schema managed by Alembic migrations)
logger.info(f"Initializing database: {database_url}")
_db_manager = DatabaseManager(database_url)
_db_manager.create_tables()
yield
+208 -10
@@ -5,32 +5,125 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased, selectinload
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Advertisement
from meshcore_hub.common.schemas.messages import AdvertisementList, AdvertisementRead
from meshcore_hub.common.models import Advertisement, EventReceiver, Node, NodeTag
from meshcore_hub.common.schemas.messages import (
AdvertisementList,
AdvertisementRead,
ReceiverInfo,
)
router = APIRouter()
def _get_friendly_name(node: Optional[Node]) -> Optional[str]:
"""Extract friendly_name tag from a node's tags."""
if not node or not node.tags:
return None
for tag in node.tags:
if tag.key == "friendly_name":
return tag.value
return None
def _fetch_receivers_for_events(
session: DbSession,
event_type: str,
event_hashes: list[str],
) -> dict[str, list[ReceiverInfo]]:
"""Fetch receiver info for a list of events by their hashes."""
if not event_hashes:
return {}
query = (
select(
EventReceiver.event_hash,
EventReceiver.snr,
EventReceiver.received_at,
Node.id.label("node_id"),
Node.public_key,
Node.name,
)
.join(Node, EventReceiver.receiver_node_id == Node.id)
.where(EventReceiver.event_type == event_type)
.where(EventReceiver.event_hash.in_(event_hashes))
.order_by(EventReceiver.received_at)
)
results = session.execute(query).all()
receivers_by_hash: dict[str, list[ReceiverInfo]] = {}
node_ids = [r.node_id for r in results]
friendly_names: dict[str, str] = {}
if node_ids:
fn_query = (
select(NodeTag.node_id, NodeTag.value)
.where(NodeTag.node_id.in_(node_ids))
.where(NodeTag.key == "friendly_name")
)
for node_id, value in session.execute(fn_query).all():
friendly_names[node_id] = value
for row in results:
if row.event_hash not in receivers_by_hash:
receivers_by_hash[row.event_hash] = []
receivers_by_hash[row.event_hash].append(
ReceiverInfo(
node_id=row.node_id,
public_key=row.public_key,
name=row.name,
friendly_name=friendly_names.get(row.node_id),
snr=row.snr,
received_at=row.received_at,
)
)
return receivers_by_hash
@router.get("", response_model=AdvertisementList)
async def list_advertisements(
_: RequireRead,
session: DbSession,
public_key: Optional[str] = Query(None, description="Filter by public key"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> AdvertisementList:
"""List advertisements with filtering and pagination."""
# Build query
query = select(Advertisement)
# Aliases for node joins
ReceiverNode = aliased(Node)
SourceNode = aliased(Node)
# Build query with both receiver and source node joins
query = (
select(
Advertisement,
ReceiverNode.public_key.label("receiver_pk"),
ReceiverNode.name.label("receiver_name"),
ReceiverNode.id.label("receiver_id"),
SourceNode.name.label("source_name"),
SourceNode.id.label("source_id"),
SourceNode.adv_type.label("source_adv_type"),
)
.outerjoin(ReceiverNode, Advertisement.receiver_node_id == ReceiverNode.id)
.outerjoin(SourceNode, Advertisement.node_id == SourceNode.id)
)
if public_key:
query = query.where(Advertisement.public_key == public_key)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(Advertisement.received_at >= since)
@@ -45,10 +138,58 @@ async def list_advertisements(
query = query.order_by(Advertisement.received_at.desc()).offset(offset).limit(limit)
# Execute
advertisements = session.execute(query).scalars().all()
results = session.execute(query).all()
# Collect node IDs to fetch tags
node_ids = set()
for row in results:
if row.receiver_id:
node_ids.add(row.receiver_id)
if row.source_id:
node_ids.add(row.source_id)
# Fetch nodes with tags
nodes_by_id: dict[str, Node] = {}
if node_ids:
nodes_query = (
select(Node).where(Node.id.in_(node_ids)).options(selectinload(Node.tags))
)
nodes = session.execute(nodes_query).scalars().all()
nodes_by_id = {n.id: n for n in nodes}
# Fetch all receivers for these advertisements
event_hashes = [r[0].event_hash for r in results if r[0].event_hash]
receivers_by_hash = _fetch_receivers_for_events(
session, "advertisement", event_hashes
)
# Build response with node details
items = []
for row in results:
adv = row[0]
receiver_node = nodes_by_id.get(row.receiver_id) if row.receiver_id else None
source_node = nodes_by_id.get(row.source_id) if row.source_id else None
data = {
"received_by": row.receiver_pk,
"receiver_name": row.receiver_name,
"receiver_friendly_name": _get_friendly_name(receiver_node),
"public_key": adv.public_key,
"name": adv.name,
"node_name": row.source_name,
"node_friendly_name": _get_friendly_name(source_node),
"adv_type": adv.adv_type or row.source_adv_type,
"flags": adv.flags,
"received_at": adv.received_at,
"created_at": adv.created_at,
"receivers": (
receivers_by_hash.get(adv.event_hash, []) if adv.event_hash else []
),
}
items.append(AdvertisementRead(**data))
return AdvertisementList(
items=[AdvertisementRead.model_validate(a) for a in advertisements],
items=items,
total=total,
limit=limit,
offset=offset,
@@ -62,10 +203,67 @@ async def get_advertisement(
advertisement_id: str,
) -> AdvertisementRead:
"""Get a single advertisement by ID."""
query = select(Advertisement).where(Advertisement.id == advertisement_id)
advertisement = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
SourceNode = aliased(Node)
query = (
select(
Advertisement,
ReceiverNode.public_key.label("receiver_pk"),
ReceiverNode.name.label("receiver_name"),
ReceiverNode.id.label("receiver_id"),
SourceNode.name.label("source_name"),
SourceNode.id.label("source_id"),
SourceNode.adv_type.label("source_adv_type"),
)
.outerjoin(ReceiverNode, Advertisement.receiver_node_id == ReceiverNode.id)
.outerjoin(SourceNode, Advertisement.node_id == SourceNode.id)
.where(Advertisement.id == advertisement_id)
)
result = session.execute(query).one_or_none()
if not advertisement:
if not result:
raise HTTPException(status_code=404, detail="Advertisement not found")
return AdvertisementRead.model_validate(advertisement)
adv = result[0]
# Fetch nodes with tags for friendly names
node_ids = []
if result.receiver_id:
node_ids.append(result.receiver_id)
if result.source_id:
node_ids.append(result.source_id)
nodes_by_id: dict[str, Node] = {}
if node_ids:
nodes_query = (
select(Node).where(Node.id.in_(node_ids)).options(selectinload(Node.tags))
)
nodes = session.execute(nodes_query).scalars().all()
nodes_by_id = {n.id: n for n in nodes}
receiver_node = nodes_by_id.get(result.receiver_id) if result.receiver_id else None
source_node = nodes_by_id.get(result.source_id) if result.source_id else None
# Fetch receivers for this advertisement
receivers = []
if adv.event_hash:
receivers_by_hash = _fetch_receivers_for_events(
session, "advertisement", [adv.event_hash]
)
receivers = receivers_by_hash.get(adv.event_hash, [])
data = {
"received_by": result.receiver_pk,
"receiver_name": result.receiver_name,
"receiver_friendly_name": _get_friendly_name(receiver_node),
"public_key": adv.public_key,
"name": adv.name,
"node_name": result.source_name,
"node_friendly_name": _get_friendly_name(source_node),
"adv_type": adv.adv_type or result.source_adv_type,
"flags": adv.flags,
"received_at": adv.received_at,
"created_at": adv.created_at,
"receivers": receivers,
}
return AdvertisementRead(**data)
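The route changes above repeatedly fall back from an operator-assigned `friendly_name` tag to the advertised node name, and finally to a short public-key prefix. That precedence, isolated as a sketch; `choose_display_name` is an illustrative name, not a function in this codebase:

```python
# Sketch: display-name precedence applied across the API changes above.
# choose_display_name is an illustrative name only.
from typing import Optional


def choose_display_name(
    friendly_name: Optional[str],
    node_name: Optional[str],
    public_key: str,
) -> str:
    # Prefer the friendly_name tag, then the name the node advertises,
    # and finally fall back to a short public-key prefix.
    if friendly_name:
        return friendly_name
    if node_name:
        return node_name
    return public_key[:2]
```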
+221 -4
@@ -9,7 +9,15 @@ from sqlalchemy import func, select
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Advertisement, Message, Node, NodeTag
from meshcore_hub.common.schemas.messages import DashboardStats, RecentAdvertisement
from meshcore_hub.common.schemas.messages import (
ChannelMessage,
DailyActivity,
DailyActivityPoint,
DashboardStats,
MessageActivity,
NodeCountHistory,
RecentAdvertisement,
)
router = APIRouter()
@@ -74,10 +82,23 @@ async def get_stats(
.all()
)
# Get friendly_name tags for the advertised nodes
# Get node names, adv_types, and friendly_name tags for the advertised nodes
ad_public_keys = [ad.public_key for ad in recent_ads]
node_names: dict[str, str] = {}
node_adv_types: dict[str, str] = {}
friendly_names: dict[str, str] = {}
if ad_public_keys:
# Get node names and adv_types from Node table
node_query = select(Node.public_key, Node.name, Node.adv_type).where(
Node.public_key.in_(ad_public_keys)
)
for public_key, name, adv_type in session.execute(node_query).all():
if name:
node_names[public_key] = name
if adv_type:
node_adv_types[public_key] = adv_type
# Get friendly_name tags
friendly_name_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
@@ -90,9 +111,9 @@ async def get_stats(
recent_advertisements = [
RecentAdvertisement(
public_key=ad.public_key,
name=ad.name,
name=ad.name or node_names.get(ad.public_key),
friendly_name=friendly_names.get(ad.public_key),
adv_type=ad.adv_type,
adv_type=ad.adv_type or node_adv_types.get(ad.public_key),
received_at=ad.received_at,
)
for ad in recent_ads
@@ -110,6 +131,55 @@ async def get_stats(
int(channel): int(count) for channel, count in channel_results
}
# Get latest 5 messages for each channel that has messages
channel_messages: dict[int, list[ChannelMessage]] = {}
for channel_idx, _ in channel_results:
messages_query = (
select(Message)
.where(Message.message_type == "channel")
.where(Message.channel_idx == channel_idx)
.order_by(Message.received_at.desc())
.limit(5)
)
channel_msgs = session.execute(messages_query).scalars().all()
# Look up sender names for these messages
msg_prefixes = [m.pubkey_prefix for m in channel_msgs if m.pubkey_prefix]
msg_sender_names: dict[str, str] = {}
msg_friendly_names: dict[str, str] = {}
if msg_prefixes:
for prefix in set(msg_prefixes):
sender_node_query = select(Node.public_key, Node.name).where(
Node.public_key.startswith(prefix)
)
for public_key, name in session.execute(sender_node_query).all():
if name:
msg_sender_names[public_key[:12]] = name
sender_friendly_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
.where(Node.public_key.startswith(prefix))
.where(NodeTag.key == "friendly_name")
)
for public_key, value in session.execute(sender_friendly_query).all():
msg_friendly_names[public_key[:12]] = value
channel_messages[int(channel_idx)] = [
ChannelMessage(
text=m.text,
sender_name=(
msg_sender_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
sender_friendly_name=(
msg_friendly_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
pubkey_prefix=m.pubkey_prefix,
received_at=m.received_at,
)
for m in channel_msgs
]
return DashboardStats(
total_nodes=total_nodes,
active_nodes=active_nodes,
@@ -119,9 +189,156 @@ async def get_stats(
advertisements_24h=advertisements_24h,
recent_advertisements=recent_advertisements,
channel_message_counts=channel_message_counts,
channel_messages=channel_messages,
)
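The "latest 5 messages per channel" block above issues one query per channel. The same grouping can be sketched in plain Python; the helper name and tuple shapes below are illustrative only, not part of the diff:

```python
from collections import defaultdict

def latest_per_channel(messages, n=5):
    # messages: (channel_idx, received_at, text) tuples in any order.
    # Sort newest-first, then keep at most n per channel, mirroring the
    # endpoint's .order_by(received_at.desc()).limit(5) applied per channel.
    by_channel = defaultdict(list)
    for channel_idx, received_at, text in sorted(
        messages, key=lambda m: m[1], reverse=True
    ):
        bucket = by_channel[channel_idx]
        if len(bucket) < n:
            bucket.append((received_at, text))
    return dict(by_channel)
```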
@router.get("/activity", response_model=DailyActivity)
async def get_activity(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> DailyActivity:
"""Get daily advertisement activity for the specified period.
Args:
days: Number of days to include (default 30, max 90)
Returns:
Daily advertisement counts for each day in the period
"""
# Limit to max 90 days
days = min(days, 90)
now = datetime.now(timezone.utc)
start_date = (now - timedelta(days=days - 1)).replace(
hour=0, minute=0, second=0, microsecond=0
)
# Query advertisement counts grouped by date
# Use SQLite's date() function for grouping (returns string 'YYYY-MM-DD')
date_expr = func.date(Advertisement.received_at)
query = (
select(
date_expr.label("date"),
func.count().label("count"),
)
.where(Advertisement.received_at >= start_date)
.group_by(date_expr)
.order_by(date_expr)
)
results = session.execute(query).all()
# Build a dict of date -> count from results (date is already a string)
counts_by_date = {row.date: row.count for row in results}
# Generate all dates in the range, filling in zeros for missing days
data = []
for i in range(days):
date = start_date + timedelta(days=i)
date_str = date.strftime("%Y-%m-%d")
count = counts_by_date.get(date_str, 0)
data.append(DailyActivityPoint(date=date_str, count=count))
return DailyActivity(days=days, data=data)
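The zero-fill step above (one point per day, 0 for days with no rows) can be isolated as a small helper; the function name and `(date, count)` tuple output are illustrative, not from the diff:

```python
from datetime import datetime, timedelta, timezone

def fill_missing_days(counts_by_date, days, now=None):
    # counts_by_date: {"YYYY-MM-DD": count} as produced by the grouped query.
    # Emits one (date_str, count) pair per day in the window, oldest first,
    # defaulting missing days to 0 -- the same shape as DailyActivityPoint.
    now = now or datetime.now(timezone.utc)
    start = (now - timedelta(days=days - 1)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return [
        (d, counts_by_date.get(d, 0))
        for d in (
            (start + timedelta(days=i)).strftime("%Y-%m-%d") for i in range(days)
        )
    ]
```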
@router.get("/message-activity", response_model=MessageActivity)
async def get_message_activity(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> MessageActivity:
"""Get daily message activity for the specified period.
Args:
days: Number of days to include (default 30, max 90)
Returns:
Daily message counts for each day in the period
"""
days = min(days, 90)
now = datetime.now(timezone.utc)
start_date = (now - timedelta(days=days - 1)).replace(
hour=0, minute=0, second=0, microsecond=0
)
# Query message counts grouped by date
date_expr = func.date(Message.received_at)
query = (
select(
date_expr.label("date"),
func.count().label("count"),
)
.where(Message.received_at >= start_date)
.group_by(date_expr)
.order_by(date_expr)
)
results = session.execute(query).all()
counts_by_date = {row.date: row.count for row in results}
# Generate all dates in the range, filling in zeros for missing days
data = []
for i in range(days):
date = start_date + timedelta(days=i)
date_str = date.strftime("%Y-%m-%d")
count = counts_by_date.get(date_str, 0)
data.append(DailyActivityPoint(date=date_str, count=count))
return MessageActivity(days=days, data=data)
@router.get("/node-count", response_model=NodeCountHistory)
async def get_node_count_history(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> NodeCountHistory:
"""Get cumulative node count over time.
For each day, shows the total number of nodes that existed by that date
(based on their created_at timestamp).
Args:
days: Number of days to include (default 30, max 90)
Returns:
Cumulative node count for each day in the period
"""
days = min(days, 90)
now = datetime.now(timezone.utc)
start_date = (now - timedelta(days=days - 1)).replace(
hour=0, minute=0, second=0, microsecond=0
)
# Get all nodes with their creation dates
# Count nodes created on or before each date
data = []
for i in range(days):
date = start_date + timedelta(days=i)
end_of_day = date.replace(hour=23, minute=59, second=59, microsecond=999999)
date_str = date.strftime("%Y-%m-%d")
# Count nodes created on or before this date
count = (
session.execute(
select(func.count())
.select_from(Node)
.where(Node.created_at <= end_of_day)
).scalar()
or 0
)
data.append(DailyActivityPoint(date=date_str, count=count))
return NodeCountHistory(days=days, data=data)
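The loop above runs one COUNT query per day. An equivalent running-sum sketch over a single grouped-by-creation-date result is shown below; the input shapes are hypothetical, and nodes created before the window would have to seed the starting total:

```python
def cumulative_node_counts(daily_created, dates, baseline=0):
    # daily_created: {"YYYY-MM-DD": nodes created that day} from one grouped query.
    # dates: ordered day strings for the window; baseline: nodes created earlier.
    total, points = baseline, []
    for d in dates:
        total += daily_created.get(d, 0)
        points.append((d, total))
    return points
```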
@router.get("/", response_class=HTMLResponse)
async def dashboard(
request: Request,
+140 -15
@@ -2,13 +2,15 @@
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import selectinload
from meshcore_hub.api.auth import RequireAdmin, RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Member
from meshcore_hub.common.models import Member, MemberNode, Node
from meshcore_hub.common.schemas.members import (
MemberCreate,
MemberList,
MemberNodeRead,
MemberRead,
MemberUpdate,
)
@@ -16,11 +18,55 @@ from meshcore_hub.common.schemas.members import (
router = APIRouter()
def _enrich_member_nodes(
member: Member, node_info: dict[str, dict]
) -> list[MemberNodeRead]:
"""Enrich member nodes with node details from the database.
Args:
member: The member with nodes to enrich
node_info: Dict mapping public_key to node details
Returns:
List of MemberNodeRead with node details populated
"""
enriched_nodes = []
for mn in member.nodes:
info = node_info.get(mn.public_key, {})
enriched_nodes.append(
MemberNodeRead(
public_key=mn.public_key,
node_role=mn.node_role,
created_at=mn.created_at,
updated_at=mn.updated_at,
node_name=info.get("name"),
node_adv_type=info.get("adv_type"),
friendly_name=info.get("friendly_name"),
)
)
return enriched_nodes
def _member_to_read(member: Member, node_info: dict[str, dict]) -> MemberRead:
"""Convert a Member model to MemberRead with enriched node data."""
return MemberRead(
id=member.id,
name=member.name,
callsign=member.callsign,
role=member.role,
description=member.description,
contact=member.contact,
nodes=_enrich_member_nodes(member, node_info),
created_at=member.created_at,
updated_at=member.updated_at,
)
@router.get("", response_model=MemberList)
async def list_members(
_: RequireRead,
session: DbSession,
limit: int = Query(default=50, ge=1, le=100),
limit: int = Query(default=50, ge=1, le=500),
offset: int = Query(default=0, ge=0),
) -> MemberList:
"""List all members with pagination."""
@@ -28,12 +74,45 @@ async def list_members(
count_query = select(func.count()).select_from(Member)
total = session.execute(count_query).scalar() or 0
# Get members
query = select(Member).order_by(Member.name).limit(limit).offset(offset)
members = session.execute(query).scalars().all()
# Get members with nodes eagerly loaded
query = (
select(Member)
.options(selectinload(Member.nodes))
.order_by(Member.name)
.limit(limit)
.offset(offset)
)
members = list(session.execute(query).scalars().all())
# Collect all public keys from member nodes
all_public_keys = set()
for m in members:
for mn in m.nodes:
all_public_keys.add(mn.public_key)
# Fetch node info for all public keys in one query
node_info: dict[str, dict] = {}
if all_public_keys:
node_query = (
select(Node)
.options(selectinload(Node.tags))
.where(Node.public_key.in_(all_public_keys))
)
nodes = session.execute(node_query).scalars().all()
for node in nodes:
friendly_name = None
for tag in node.tags:
if tag.key == "friendly_name":
friendly_name = tag.value
break
node_info[node.public_key] = {
"name": node.name,
"adv_type": node.adv_type,
"friendly_name": friendly_name,
}
return MemberList(
items=[MemberRead.model_validate(m) for m in members],
items=[_member_to_read(m, node_info) for m in members],
total=total,
limit=limit,
offset=offset,
@@ -47,13 +126,37 @@ async def get_member(
member_id: str,
) -> MemberRead:
"""Get a specific member by ID."""
query = select(Member).where(Member.id == member_id)
query = (
select(Member).options(selectinload(Member.nodes)).where(Member.id == member_id)
)
member = session.execute(query).scalar_one_or_none()
if not member:
raise HTTPException(status_code=404, detail="Member not found")
return MemberRead.model_validate(member)
# Fetch node info for member's nodes
node_info: dict[str, dict] = {}
public_keys = [mn.public_key for mn in member.nodes]
if public_keys:
node_query = (
select(Node)
.options(selectinload(Node.tags))
.where(Node.public_key.in_(public_keys))
)
nodes = session.execute(node_query).scalars().all()
for node in nodes:
friendly_name = None
for tag in node.tags:
if tag.key == "friendly_name":
friendly_name = tag.value
break
node_info[node.public_key] = {
"name": node.name,
"adv_type": node.adv_type,
"friendly_name": friendly_name,
}
return _member_to_read(member, node_info)
@router.post("", response_model=MemberRead, status_code=201)
@@ -63,9 +166,6 @@ async def create_member(
member: MemberCreate,
) -> MemberRead:
"""Create a new member."""
# Normalize public_key to lowercase if provided
public_key = member.public_key.lower() if member.public_key else None
# Create member
new_member = Member(
name=member.name,
@@ -73,9 +173,20 @@ async def create_member(
role=member.role,
description=member.description,
contact=member.contact,
public_key=public_key,
)
session.add(new_member)
session.flush() # Get the ID for the member
# Add nodes if provided
if member.nodes:
for node_data in member.nodes:
node = MemberNode(
member_id=new_member.id,
public_key=node_data.public_key.lower(),
node_role=node_data.node_role,
)
session.add(node)
session.commit()
session.refresh(new_member)
@@ -90,7 +201,9 @@ async def update_member(
member: MemberUpdate,
) -> MemberRead:
"""Update a member."""
query = select(Member).where(Member.id == member_id)
query = (
select(Member).options(selectinload(Member.nodes)).where(Member.id == member_id)
)
existing = session.execute(query).scalar_one_or_none()
if not existing:
@@ -107,8 +220,20 @@ async def update_member(
existing.description = member.description
if member.contact is not None:
existing.contact = member.contact
if member.public_key is not None:
existing.public_key = member.public_key.lower()
# Update nodes if provided (replaces existing nodes)
if member.nodes is not None:
# Clear existing nodes
existing.nodes.clear()
# Add new nodes
for node_data in member.nodes:
node = MemberNode(
member_id=existing.id,
public_key=node_data.public_key.lower(),
node_role=node_data.node_role,
)
existing.nodes.append(node)
session.commit()
session.refresh(existing)
+186 -14
@@ -5,15 +5,95 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased, selectinload
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Message, Node, NodeTag
from meshcore_hub.common.schemas.messages import MessageList, MessageRead
from meshcore_hub.common.models import EventReceiver, Message, Node, NodeTag
from meshcore_hub.common.schemas.messages import MessageList, MessageRead, ReceiverInfo
router = APIRouter()
def _get_friendly_name(node: Optional[Node]) -> Optional[str]:
"""Extract friendly_name tag from a node's tags."""
if not node or not node.tags:
return None
for tag in node.tags:
if tag.key == "friendly_name":
return tag.value
return None
def _fetch_receivers_for_events(
session: DbSession,
event_type: str,
event_hashes: list[str],
) -> dict[str, list[ReceiverInfo]]:
"""Fetch receiver info for a list of events by their hashes.
Args:
session: Database session
event_type: Type of event ('message', 'advertisement', etc.)
event_hashes: List of event hashes to fetch receivers for
Returns:
Dict mapping event_hash to list of ReceiverInfo objects
"""
if not event_hashes:
return {}
# Query event_receivers with receiver node info
query = (
select(
EventReceiver.event_hash,
EventReceiver.snr,
EventReceiver.received_at,
Node.id.label("node_id"),
Node.public_key,
Node.name,
)
.join(Node, EventReceiver.receiver_node_id == Node.id)
.where(EventReceiver.event_type == event_type)
.where(EventReceiver.event_hash.in_(event_hashes))
.order_by(EventReceiver.received_at)
)
results = session.execute(query).all()
# Group by event_hash
receivers_by_hash: dict[str, list[ReceiverInfo]] = {}
# Get friendly names for receiver nodes
node_ids = [r.node_id for r in results]
friendly_names: dict[str, str] = {}
if node_ids:
fn_query = (
select(NodeTag.node_id, NodeTag.value)
.where(NodeTag.node_id.in_(node_ids))
.where(NodeTag.key == "friendly_name")
)
for node_id, value in session.execute(fn_query).all():
friendly_names[node_id] = value
for row in results:
if row.event_hash not in receivers_by_hash:
receivers_by_hash[row.event_hash] = []
receivers_by_hash[row.event_hash].append(
ReceiverInfo(
node_id=row.node_id,
public_key=row.public_key,
name=row.name,
friendly_name=friendly_names.get(row.node_id),
snr=row.snr,
received_at=row.received_at,
)
)
return receivers_by_hash
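The grouping at the end of this helper is the standard "bucket rows by key, preserving query order" pattern; a minimal standalone version (illustrative names, not part of the diff):

```python
from collections import defaultdict

def group_rows(rows, key):
    # Buckets rows by key(row), keeping each bucket in input order -- here the
    # query's ORDER BY received_at carries through into each hash's bucket.
    grouped = defaultdict(list)
    for row in rows:
        grouped[key(row)].append(row)
    return dict(grouped)
```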
@router.get("", response_model=MessageList)
async def list_messages(
_: RequireRead,
@@ -21,6 +101,9 @@ async def list_messages(
message_type: Optional[str] = Query(None, description="Filter by message type"),
pubkey_prefix: Optional[str] = Query(None, description="Filter by sender prefix"),
channel_idx: Optional[int] = Query(None, description="Filter by channel"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
search: Optional[str] = Query(None, description="Search in message text"),
@@ -28,8 +111,16 @@ async def list_messages(
offset: int = Query(0, ge=0, description="Page offset"),
) -> MessageList:
"""List messages with filtering and pagination."""
# Build query
query = select(Message)
# Alias for receiver node join
ReceiverNode = aliased(Node)
# Build query with receiver node join
query = select(
Message,
ReceiverNode.public_key.label("receiver_pk"),
ReceiverNode.name.label("receiver_name"),
ReceiverNode.id.label("receiver_id"),
).outerjoin(ReceiverNode, Message.receiver_node_id == ReceiverNode.id)
if message_type:
query = query.where(Message.message_type == message_type)
@@ -40,6 +131,9 @@ async def list_messages(
if channel_idx is not None:
query = query.where(Message.channel_idx == channel_idx)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(Message.received_at >= since)
@@ -57,14 +151,24 @@ async def list_messages(
query = query.order_by(Message.received_at.desc()).offset(offset).limit(limit)
# Execute
messages = session.execute(query).scalars().all()
results = session.execute(query).all()
# Look up friendly_names for senders with pubkey_prefix
pubkey_prefixes = [m.pubkey_prefix for m in messages if m.pubkey_prefix]
# Look up sender names and friendly_names for senders with pubkey_prefix
pubkey_prefixes = [r[0].pubkey_prefix for r in results if r[0].pubkey_prefix]
sender_names: dict[str, str] = {}
friendly_names: dict[str, str] = {}
if pubkey_prefixes:
# Find nodes whose public_key starts with any of these prefixes
for prefix in set(pubkey_prefixes):
# Get node name
node_query = select(Node.public_key, Node.name).where(
Node.public_key.startswith(prefix)
)
for public_key, name in session.execute(node_query).all():
if name:
sender_names[public_key[:12]] = name
# Get friendly_name tag
friendly_name_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
@@ -72,17 +176,50 @@ async def list_messages(
.where(NodeTag.key == "friendly_name")
)
for public_key, value in session.execute(friendly_name_query).all():
# Map the prefix to the friendly_name
friendly_names[public_key[:12]] = value
# Build response with friendly_names
# Collect receiver node IDs to fetch tags
receiver_ids = set()
for row in results:
if row.receiver_id:
receiver_ids.add(row.receiver_id)
# Fetch receiver nodes with tags
receivers_by_id: dict[str, Node] = {}
if receiver_ids:
receivers_query = (
select(Node)
.where(Node.id.in_(receiver_ids))
.options(selectinload(Node.tags))
)
receivers = session.execute(receivers_query).scalars().all()
receivers_by_id = {n.id: n for n in receivers}
# Fetch all receivers for these messages
event_hashes = [r[0].event_hash for r in results if r[0].event_hash]
receivers_by_hash = _fetch_receivers_for_events(session, "message", event_hashes)
# Build response with sender info and received_by
items = []
for m in messages:
for row in results:
m = row[0]
receiver_pk = row.receiver_pk
receiver_name = row.receiver_name
receiver_node = (
receivers_by_id.get(row.receiver_id) if row.receiver_id else None
)
msg_dict = {
"id": m.id,
"receiver_node_id": m.receiver_node_id,
"received_by": receiver_pk,
"receiver_name": receiver_name,
"receiver_friendly_name": _get_friendly_name(receiver_node),
"message_type": m.message_type,
"pubkey_prefix": m.pubkey_prefix,
"sender_name": (
sender_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
"sender_friendly_name": (
friendly_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
@@ -95,6 +232,9 @@ async def list_messages(
"sender_timestamp": m.sender_timestamp,
"received_at": m.received_at,
"created_at": m.created_at,
"receivers": (
receivers_by_hash.get(m.event_hash, []) if m.event_hash else []
),
}
items.append(MessageRead(**msg_dict))
@@ -113,10 +253,42 @@ async def get_message(
message_id: str,
) -> MessageRead:
"""Get a single message by ID."""
query = select(Message).where(Message.id == message_id)
message = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
query = (
select(Message, ReceiverNode.public_key.label("receiver_pk"))
.outerjoin(ReceiverNode, Message.receiver_node_id == ReceiverNode.id)
.where(Message.id == message_id)
)
result = session.execute(query).one_or_none()
if not message:
if not result:
raise HTTPException(status_code=404, detail="Message not found")
return MessageRead.model_validate(message)
message, receiver_pk = result
# Fetch receivers for this message
receivers = []
if message.event_hash:
receivers_by_hash = _fetch_receivers_for_events(
session, "message", [message.event_hash]
)
receivers = receivers_by_hash.get(message.event_hash, [])
data = {
"id": message.id,
"receiver_node_id": message.receiver_node_id,
"received_by": receiver_pk,
"message_type": message.message_type,
"pubkey_prefix": message.pubkey_prefix,
"channel_idx": message.channel_idx,
"text": message.text,
"path_len": message.path_len,
"txt_type": message.txt_type,
"signature": message.signature,
"snr": message.snr,
"sender_timestamp": message.sender_timestamp,
"received_at": message.received_at,
"created_at": message.created_at,
"receivers": receivers,
}
return MessageRead(**data)
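Senders are stored only as pubkey prefixes, so the lookups above resolve names by truncating full public keys back to the prefix length. A sketch of that mapping, with the helper name invented here and the 12-char length taken from the diff's `public_key[:12]` convention:

```python
def map_prefix_names(rows, prefix_len=12):
    # rows: (public_key, name) pairs from a Node.public_key.startswith(prefix)
    # query. Keyed by the truncated key so Message.pubkey_prefix can look it up.
    names = {}
    for public_key, name in rows:
        if name:
            names[public_key[:prefix_len]] = name
    return names
```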
+1 -1
@@ -19,7 +19,7 @@ async def list_nodes(
session: DbSession,
search: Optional[str] = Query(None, description="Search in name or public key"),
adv_type: Optional[str] = Query(None, description="Filter by advertisement type"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
limit: int = Query(50, ge=1, le=500, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> NodeList:
"""List all nodes with pagination and filtering."""
+52 -9
@@ -5,10 +5,11 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Telemetry
from meshcore_hub.common.models import Node, Telemetry
from meshcore_hub.common.schemas.messages import TelemetryList, TelemetryRead
router = APIRouter()
@@ -19,18 +20,29 @@ async def list_telemetry(
_: RequireRead,
session: DbSession,
node_public_key: Optional[str] = Query(None, description="Filter by node"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> TelemetryList:
"""List telemetry records with filtering and pagination."""
# Build query
query = select(Telemetry)
# Alias for receiver node join
ReceiverNode = aliased(Node)
# Build query with receiver node join
query = select(Telemetry, ReceiverNode.public_key.label("receiver_pk")).outerjoin(
ReceiverNode, Telemetry.receiver_node_id == ReceiverNode.id
)
if node_public_key:
query = query.where(Telemetry.node_public_key == node_public_key)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(Telemetry.received_at >= since)
@@ -45,10 +57,25 @@ async def list_telemetry(
query = query.order_by(Telemetry.received_at.desc()).offset(offset).limit(limit)
# Execute
records = session.execute(query).scalars().all()
results = session.execute(query).all()
# Build response with received_by
items = []
for tel, receiver_pk in results:
data = {
"id": tel.id,
"receiver_node_id": tel.receiver_node_id,
"received_by": receiver_pk,
"node_id": tel.node_id,
"node_public_key": tel.node_public_key,
"parsed_data": tel.parsed_data,
"received_at": tel.received_at,
"created_at": tel.created_at,
}
items.append(TelemetryRead(**data))
return TelemetryList(
items=[TelemetryRead.model_validate(t) for t in records],
items=items,
total=total,
limit=limit,
offset=offset,
@@ -62,10 +89,26 @@ async def get_telemetry(
telemetry_id: str,
) -> TelemetryRead:
"""Get a single telemetry record by ID."""
query = select(Telemetry).where(Telemetry.id == telemetry_id)
telemetry = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
query = (
select(Telemetry, ReceiverNode.public_key.label("receiver_pk"))
.outerjoin(ReceiverNode, Telemetry.receiver_node_id == ReceiverNode.id)
.where(Telemetry.id == telemetry_id)
)
result = session.execute(query).one_or_none()
if not telemetry:
if not result:
raise HTTPException(status_code=404, detail="Telemetry record not found")
return TelemetryRead.model_validate(telemetry)
tel, receiver_pk = result
data = {
"id": tel.id,
"receiver_node_id": tel.receiver_node_id,
"received_by": receiver_pk,
"node_id": tel.node_id,
"node_public_key": tel.node_public_key,
"parsed_data": tel.parsed_data,
"received_at": tel.received_at,
"created_at": tel.created_at,
}
return TelemetryRead(**data)
+60 -9
@@ -5,10 +5,11 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import TracePath
from meshcore_hub.common.models import Node, TracePath
from meshcore_hub.common.schemas.messages import TracePathList, TracePathRead
router = APIRouter()
@@ -18,14 +19,25 @@ router = APIRouter()
async def list_trace_paths(
_: RequireRead,
session: DbSession,
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> TracePathList:
"""List trace paths with filtering and pagination."""
# Build query
query = select(TracePath)
# Alias for receiver node join
ReceiverNode = aliased(Node)
# Build query with receiver node join
query = select(TracePath, ReceiverNode.public_key.label("receiver_pk")).outerjoin(
ReceiverNode, TracePath.receiver_node_id == ReceiverNode.id
)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(TracePath.received_at >= since)
@@ -41,10 +53,29 @@ async def list_trace_paths(
query = query.order_by(TracePath.received_at.desc()).offset(offset).limit(limit)
# Execute
trace_paths = session.execute(query).scalars().all()
results = session.execute(query).all()
# Build response with received_by
items = []
for tp, receiver_pk in results:
data = {
"id": tp.id,
"receiver_node_id": tp.receiver_node_id,
"received_by": receiver_pk,
"initiator_tag": tp.initiator_tag,
"path_len": tp.path_len,
"flags": tp.flags,
"auth": tp.auth,
"path_hashes": tp.path_hashes,
"snr_values": tp.snr_values,
"hop_count": tp.hop_count,
"received_at": tp.received_at,
"created_at": tp.created_at,
}
items.append(TracePathRead(**data))
return TracePathList(
items=[TracePathRead.model_validate(t) for t in trace_paths],
items=items,
total=total,
limit=limit,
offset=offset,
@@ -58,10 +89,30 @@ async def get_trace_path(
trace_path_id: str,
) -> TracePathRead:
"""Get a single trace path by ID."""
query = select(TracePath).where(TracePath.id == trace_path_id)
trace_path = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
query = (
select(TracePath, ReceiverNode.public_key.label("receiver_pk"))
.outerjoin(ReceiverNode, TracePath.receiver_node_id == ReceiverNode.id)
.where(TracePath.id == trace_path_id)
)
result = session.execute(query).one_or_none()
if not trace_path:
if not result:
raise HTTPException(status_code=404, detail="Trace path not found")
return TracePathRead.model_validate(trace_path)
tp, receiver_pk = result
data = {
"id": tp.id,
"receiver_node_id": tp.receiver_node_id,
"received_by": receiver_pk,
"initiator_tag": tp.initiator_tag,
"path_len": tp.path_len,
"flags": tp.flags,
"auth": tp.auth,
"path_hashes": tp.path_hashes,
"snr_values": tp.snr_values,
"hop_count": tp.hop_count,
"received_at": tp.received_at,
"created_at": tp.created_at,
}
return TracePathRead(**data)
+158 -89
@@ -1,9 +1,14 @@
"""CLI for the Collector component."""
from typing import TYPE_CHECKING
import click
from meshcore_hub.common.logging import configure_logging
if TYPE_CHECKING:
from meshcore_hub.common.database import DatabaseManager
@click.group(invoke_without_command=True)
@click.pass_context
@@ -137,6 +142,7 @@ def collector(
database_url=effective_db_url,
log_level=log_level,
data_home=data_home or settings.data_home,
seed_home=settings.effective_seed_home,
)
@@ -149,9 +155,13 @@ def _run_collector_service(
database_url: str,
log_level: str,
data_home: str,
seed_home: str,
) -> None:
"""Run the collector service.
On startup, automatically seeds the database from YAML files in seed_home
if they exist.
Webhooks can be configured via environment variables:
- WEBHOOK_ADVERTISEMENT_URL: Webhook for advertisement events
- WEBHOOK_MESSAGE_URL: Webhook for all message events
@@ -168,20 +178,47 @@ def _run_collector_service(
click.echo("Starting MeshCore Collector")
click.echo(f"Data home: {data_home}")
click.echo(f"Seed home: {seed_home}")
click.echo(f"MQTT: {mqtt_host}:{mqtt_port} (prefix: {prefix})")
click.echo(f"Database: {database_url}")
# Initialize database (schema managed by Alembic migrations)
from meshcore_hub.common.database import DatabaseManager
db = DatabaseManager(database_url)
# Auto-seed from seed files on startup
click.echo("")
click.echo("Checking for seed files...")
seed_home_path = Path(seed_home)
node_tags_exists = (seed_home_path / "node_tags.yaml").exists()
members_exists = (seed_home_path / "members.yaml").exists()
if node_tags_exists or members_exists:
click.echo("Running seed import...")
_run_seed_import(
seed_home=seed_home,
db=db,
create_nodes=True,
verbose=True,
)
else:
click.echo(f"No seed files found in {seed_home}")
db.dispose()
# Load webhook configuration from settings
from meshcore_hub.common.config import get_collector_settings
from meshcore_hub.collector.webhook import (
WebhookDispatcher,
create_webhooks_from_settings,
)
from meshcore_hub.common.config import get_collector_settings
settings = get_collector_settings()
webhooks = create_webhooks_from_settings(settings)
webhook_dispatcher = WebhookDispatcher(webhooks) if webhooks else None
click.echo("")
if webhook_dispatcher and webhook_dispatcher.webhooks:
click.echo(f"Webhooks configured: {len(webhooks)}")
for wh in webhooks:
@@ -191,6 +228,8 @@ def _run_collector_service(
from meshcore_hub.collector.subscriber import run_collector
click.echo("")
click.echo("Starting MQTT subscriber...")
run_collector(
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
@@ -218,6 +257,7 @@ def run_cmd(ctx: click.Context) -> None:
database_url=ctx.obj["database_url"],
log_level=ctx.obj["log_level"],
data_home=ctx.obj["data_home"],
seed_home=ctx.obj["seed_home"],
)
@@ -236,17 +276,15 @@ def seed_cmd(
"""Import seed data from SEED_HOME directory.
Looks for the following files in SEED_HOME:
- node_tags.json: Node tag definitions (keyed by public_key)
- members.json: Network member definitions
- node_tags.yaml: Node tag definitions (keyed by public_key)
- members.yaml: Network member definitions
Files that don't exist are skipped. This command is idempotent:
existing records are updated, new records are created.
SEED_HOME defaults to {DATA_HOME}/collector but can be overridden
SEED_HOME defaults to ./seed but can be overridden
with the --seed-home option or SEED_HOME environment variable.
"""
from pathlib import Path
configure_logging(level=ctx.obj["log_level"])
seed_home = ctx.obj["seed_home"]
@@ -254,50 +292,17 @@ def seed_cmd(
click.echo(f"Database: {ctx.obj['database_url']}")
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.tag_import import import_tags
from meshcore_hub.collector.member_import import import_members
# Initialize database
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
db.create_tables()
# Track what was imported
imported_any = False
# Import node tags if file exists
node_tags_file = Path(seed_home) / "node_tags.json"
if node_tags_file.exists():
click.echo(f"\nImporting node tags from: {node_tags_file}")
stats = import_tags(
file_path=str(node_tags_file),
db=db,
create_nodes=not no_create_nodes,
)
click.echo(f" Tags: {stats['created']} created, {stats['updated']} updated")
if stats["nodes_created"]:
click.echo(f" Nodes created: {stats['nodes_created']}")
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
else:
click.echo(f"\nNo node_tags.json found in {seed_home}")
# Import members if file exists
members_file = Path(seed_home) / "members.json"
if members_file.exists():
click.echo(f"\nImporting members from: {members_file}")
stats = import_members(
file_path=str(members_file),
db=db,
)
click.echo(f" Members: {stats['created']} created, {stats['updated']} updated")
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
else:
click.echo(f"\nNo members.json found in {seed_home}")
# Run seed import
imported_any = _run_seed_import(
seed_home=seed_home,
db=db,
create_nodes=not no_create_nodes,
verbose=True,
)
if not imported_any:
click.echo("\nNo seed files found. Nothing to import.")
@@ -307,6 +312,76 @@ def seed_cmd(
db.dispose()
def _run_seed_import(
seed_home: str,
db: "DatabaseManager",
create_nodes: bool = True,
verbose: bool = False,
) -> bool:
"""Run seed import from SEED_HOME directory.
Args:
seed_home: Path to seed home directory
db: Database manager instance
create_nodes: If True, create nodes that don't exist
verbose: If True, output progress messages
Returns:
True if any files were imported, False otherwise
"""
from pathlib import Path
from meshcore_hub.collector.member_import import import_members
from meshcore_hub.collector.tag_import import import_tags
imported_any = False
# Import node tags if file exists
node_tags_file = Path(seed_home) / "node_tags.yaml"
if node_tags_file.exists():
if verbose:
click.echo(f"\nImporting node tags from: {node_tags_file}")
stats = import_tags(
file_path=str(node_tags_file),
db=db,
create_nodes=create_nodes,
)
if verbose:
click.echo(
f" Tags: {stats['created']} created, {stats['updated']} updated"
)
if stats["nodes_created"]:
click.echo(f" Nodes created: {stats['nodes_created']}")
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
elif verbose:
click.echo(f"\nNo node_tags.yaml found in {seed_home}")
# Import members if file exists
members_file = Path(seed_home) / "members.yaml"
if members_file.exists():
if verbose:
click.echo(f"\nImporting members from: {members_file}")
stats = import_members(
file_path=str(members_file),
db=db,
)
if verbose:
click.echo(
f" Members: {stats['created']} created, {stats['updated']} updated"
)
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
elif verbose:
click.echo(f"\nNo members.yaml found in {seed_home}")
return imported_any
@collector.command("import-tags")
@click.argument("file", type=click.Path(), required=False, default=None)
@click.option(
@@ -321,32 +396,32 @@ def import_tags_cmd(
file: str | None,
no_create_nodes: bool,
) -> None:
"""Import node tags from a JSON file.
"""Import node tags from a YAML file.
Reads a JSON file containing tag definitions and upserts them
Reads a YAML file containing tag definitions and upserts them
into the database. Existing tags are updated, new tags are created.
FILE is the path to the JSON file containing tags.
If not provided, defaults to {SEED_HOME}/node_tags.json.
FILE is the path to the YAML file containing tags.
If not provided, defaults to {SEED_HOME}/node_tags.yaml.
Expected YAML format (keyed by public_key):
Expected JSON format (keyed by public_key):
\b
{
"0123456789abcdef...": {
"friendly_name": "My Node",
"location": {"value": "52.0,1.0", "type": "coordinate"},
"altitude": {"value": "150", "type": "number"}
}
}
0123456789abcdef...:
friendly_name: My Node
location:
value: "52.0,1.0"
type: coordinate
altitude:
value: "150"
type: number
Shorthand is also supported (string values with default type):
\b
{
"0123456789abcdef...": {
"friendly_name": "My Node",
"role": "gateway"
}
}
0123456789abcdef...:
friendly_name: My Node
role: gateway
Supported types: string, number, boolean, coordinate
"""
@@ -362,7 +437,7 @@ def import_tags_cmd(
if not Path(tags_file).exists():
click.echo(f"Tags file not found: {tags_file}")
if not file:
click.echo("Specify a file path or create the default node_tags.json.")
click.echo("Specify a file path or create the default node_tags.yaml.")
return
click.echo(f"Importing tags from: {tags_file}")
@@ -371,9 +446,8 @@ def import_tags_cmd(
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.tag_import import import_tags
# Initialize database
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
db.create_tables()
# Import tags
stats = import_tags(
@@ -407,33 +481,29 @@ def import_members_cmd(
ctx: click.Context,
file: str | None,
) -> None:
"""Import network members from a JSON file.
"""Import network members from a YAML file.
Reads a JSON file containing member definitions and upserts them
Reads a YAML file containing member definitions and upserts them
into the database. Existing members (matched by name) are updated,
new members are created.
FILE is the path to the JSON file containing members.
If not provided, defaults to {SEED_HOME}/members.json.
FILE is the path to the YAML file containing members.
If not provided, defaults to {SEED_HOME}/members.yaml.
Expected YAML format (list):
Expected JSON format (list):
\b
[
{
"name": "John Doe",
"callsign": "N0CALL",
"role": "Network Operator",
"description": "Example member"
}
]
- name: John Doe
callsign: N0CALL
role: Network Operator
description: Example member
Or with "members" key:
\b
{
"members": [
{"name": "John Doe", "callsign": "N0CALL", ...}
]
}
members:
- name: John Doe
callsign: N0CALL
"""
from pathlib import Path
@@ -447,7 +517,7 @@ def import_members_cmd(
if not Path(members_file).exists():
click.echo(f"Members file not found: {members_file}")
if not file:
click.echo("Specify a file path or create the default members.json.")
click.echo("Specify a file path or create the default members.yaml.")
return
click.echo(f"Importing members from: {members_file}")
@@ -456,9 +526,8 @@ def import_members_cmd(
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.member_import import import_members
# Initialize database
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
db.create_tables()
# Import members
stats = import_members(
@@ -19,7 +19,7 @@ def register_all_handlers(subscriber: "Subscriber") -> None:
)
from meshcore_hub.collector.handlers.trace import handle_trace_data
from meshcore_hub.collector.handlers.telemetry import handle_telemetry
from meshcore_hub.collector.handlers.contacts import handle_contacts
from meshcore_hub.collector.handlers.contacts import handle_contact
from meshcore_hub.collector.handlers.event_log import handle_event_log
# Persisted events with specific handlers
@@ -28,7 +28,7 @@ def register_all_handlers(subscriber: "Subscriber") -> None:
subscriber.register_handler("channel_msg_recv", handle_channel_message)
subscriber.register_handler("trace_data", handle_trace_data)
subscriber.register_handler("telemetry_response", handle_telemetry)
subscriber.register_handler("contacts", handle_contacts)
subscriber.register_handler("contact", handle_contact) # Individual contact events
# Informational events (logged only)
subscriber.register_handler("send_confirmed", handle_event_log)
@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Advertisement, Node
from meshcore_hub.common.hash_utils import compute_advertisement_hash
from meshcore_hub.common.models import Advertisement, Node, add_event_receiver
logger = logging.getLogger(__name__)
@@ -40,8 +42,17 @@ def handle_advertisement(
flags = payload.get("flags")
now = datetime.now(timezone.utc)
# Compute event hash for deduplication (30-second time bucket)
event_hash = compute_advertisement_hash(
public_key=adv_public_key,
name=name,
adv_type=adv_type,
flags=flags,
received_at=now,
)
with db.session_scope() as session:
# Find or create receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -55,6 +66,37 @@ def handle_advertisement(
)
session.add(receiver_node)
session.flush()
else:
receiver_node.last_seen = now
# Check if advertisement with same hash already exists
existing = session.execute(
select(Advertisement.id).where(Advertisement.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Still update advertised node's last_seen even for duplicate advertisements
node_query = select(Node).where(Node.public_key == adv_public_key)
node = session.execute(node_query).scalar_one_or_none()
if node:
node.last_seen = now
# Add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None, # Advertisements don't have SNR
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to advertisement "
f"(hash={event_hash[:8]}...)"
)
return
# Find or create advertised node
node_query = select(Node).where(Node.public_key == adv_public_key)
@@ -91,9 +133,43 @@ def handle_advertisement(
adv_type=adv_type,
flags=flags,
received_at=now,
event_hash=event_hash,
)
session.add(advertisement)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate advertisement skipped (race condition, "
f"hash={event_hash[:8]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
logger.info(
f"Stored advertisement from {name or adv_public_key[:12]!r} "
f"(type={adv_type})"
@@ -1,4 +1,4 @@
"""Handler for contacts sync events."""
"""Handler for contact sync events."""
import logging
from datetime import datetime, timezone
@@ -11,65 +11,77 @@ from meshcore_hub.common.models import Node
logger = logging.getLogger(__name__)
# Map numeric node type to string representation
NODE_TYPE_MAP = {
0: "none",
1: "chat",
2: "repeater",
3: "room",
}
def handle_contacts(
def handle_contact(
public_key: str,
event_type: str,
payload: dict[str, Any],
db: DatabaseManager,
) -> None:
"""Handle a contacts sync event.
"""Handle a single contact event.
Upserts all contacts in the contacts list.
Upserts a contact into the nodes table.
Args:
public_key: Receiver node's public key (from MQTT topic)
event_type: Event type name
payload: Contacts payload
payload: Single contact object with fields:
- public_key: Contact's public key
- adv_name: Advertised name
- type: Numeric node type (0=none, 1=chat, 2=repeater, 3=room)
db: Database manager
"""
contacts = payload.get("contacts", [])
if not contacts:
logger.debug("Empty contacts list received")
contact_key = payload.get("public_key")
if not contact_key:
logger.warning("Contact event missing public_key field")
return
# Device uses 'adv_name' for the advertised name
name = payload.get("adv_name") or payload.get("name")
logger.info(f"Processing contact: {contact_key[:12]}... adv_name={name}")
# Device uses numeric 'type' field, convert to string
raw_type = payload.get("type")
if raw_type is not None:
node_type: str | None = NODE_TYPE_MAP.get(raw_type, str(raw_type))
else:
node_type = payload.get("node_type")
now = datetime.now(timezone.utc)
created_count = 0
updated_count = 0
with db.session_scope() as session:
for contact in contacts:
contact_key = contact.get("public_key")
if not contact_key:
continue
# Find or create node
node_query = select(Node).where(Node.public_key == contact_key)
node = session.execute(node_query).scalar_one_or_none()
name = contact.get("name")
node_type = contact.get("node_type")
# Find or create node
node_query = select(Node).where(Node.public_key == contact_key)
node = session.execute(node_query).scalar_one_or_none()
if node:
# Update existing node
if name and not node.name:
node.name = name
if node_type and not node.adv_type:
node.adv_type = node_type
node.last_seen = now
updated_count += 1
else:
# Create new node
node = Node(
public_key=contact_key,
name=name,
adv_type=node_type,
first_seen=now,
last_seen=now,
if node:
# Update existing node - always update name if we have one
if name and name != node.name:
logger.info(
f"Updating node {contact_key[:12]}... "
f"name: {node.name!r} -> {name!r}"
)
session.add(node)
created_count += 1
logger.info(
f"Processed contacts sync: {created_count} new, {updated_count} updated"
)
node.name = name
if node_type and not node.adv_type:
node.adv_type = node_type
node.last_seen = now
else:
# Create new node
node = Node(
public_key=contact_key,
name=name,
adv_type=node_type,
first_seen=now,
last_seen=now,
)
session.add(node)
logger.info(f"Created node from contact: {contact_key[:12]}... ({name})")
@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Message, Node
from meshcore_hub.common.hash_utils import compute_message_hash
from meshcore_hub.common.models import Message, Node, add_event_receiver
logger = logging.getLogger(__name__)
@@ -84,8 +86,17 @@ def _handle_message(
except (ValueError, OSError):
pass
# Compute event hash for deduplication
event_hash = compute_message_hash(
text=text,
pubkey_prefix=pubkey_prefix,
channel_idx=channel_idx,
sender_timestamp=sender_timestamp,
txt_type=txt_type,
)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -102,6 +113,29 @@ def _handle_message(
else:
receiver_node.last_seen = now
# Check if message with same hash already exists
existing = session.execute(
select(Message.id).where(Message.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to message "
f"(hash={event_hash[:8]}...)"
)
return
# Create message record
message = Message(
receiver_node_id=receiver_node.id if receiver_node else None,
@@ -115,9 +149,42 @@ def _handle_message(
snr=snr,
sender_timestamp=sender_timestamp,
received_at=now,
event_hash=event_hash,
)
session.add(message)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate message skipped (race condition, hash={event_hash[:8]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
return
if message_type == "contact":
logger.info(
f"Stored contact message from {pubkey_prefix!r}: "
@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Node, Telemetry
from meshcore_hub.common.hash_utils import compute_telemetry_hash
from meshcore_hub.common.models import Node, Telemetry, add_event_receiver
logger = logging.getLogger(__name__)
@@ -49,8 +51,15 @@ def handle_telemetry(
except ValueError:
lpp_bytes = lpp_data.encode()
# Compute event hash for deduplication (30-second time bucket)
event_hash = compute_telemetry_hash(
node_public_key=node_public_key,
parsed_data=parsed_data,
received_at=now,
)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -67,6 +76,29 @@ def handle_telemetry(
else:
receiver_node.last_seen = now
# Check if telemetry with same hash already exists
existing = session.execute(
select(Telemetry.id).where(Telemetry.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to telemetry "
f"(node={node_public_key[:12]}...)"
)
return
# Find or create reporting node
reporting_node = None
if node_public_key:
@@ -92,9 +124,43 @@ def handle_telemetry(
lpp_data=lpp_bytes,
parsed_data=parsed_data,
received_at=now,
event_hash=event_hash,
)
session.add(telemetry)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate telemetry skipped (race condition, "
f"node={node_public_key[:12]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
# Log telemetry values
if parsed_data:
values = ", ".join(f"{k}={v}" for k, v in parsed_data.items())
@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Node, TracePath
from meshcore_hub.common.hash_utils import compute_trace_hash
from meshcore_hub.common.models import Node, TracePath, add_event_receiver
logger = logging.getLogger(__name__)
@@ -40,8 +42,11 @@ def handle_trace_data(
snr_values = payload.get("snr_values")
hop_count = payload.get("hop_count")
# Compute event hash for deduplication (initiator_tag is unique per trace)
event_hash = compute_trace_hash(initiator_tag=initiator_tag)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -58,6 +63,29 @@ def handle_trace_data(
else:
receiver_node.last_seen = now
# Check if trace with same hash already exists
existing = session.execute(
select(TracePath.id).where(TracePath.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None, # Trace events don't have a single SNR value
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to trace "
f"(tag={initiator_tag})"
)
return
# Create trace path record
trace_path = TracePath(
receiver_node_id=receiver_node.id if receiver_node else None,
@@ -69,7 +97,40 @@ def handle_trace_data(
snr_values=snr_values,
hop_count=hop_count,
received_at=now,
event_hash=event_hash,
)
session.add(trace_path)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate trace skipped (race condition, tag={initiator_tag})"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
logger.info(f"Stored trace data: tag={initiator_tag}, hops={hop_count}")
@@ -1,19 +1,36 @@
"""Import members from JSON file."""
"""Import members from YAML file."""
import json
import logging
from pathlib import Path
from typing import Any, Optional
import yaml
from pydantic import BaseModel, Field, field_validator
from sqlalchemy import select
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Member
from meshcore_hub.common.models import Member, MemberNode
logger = logging.getLogger(__name__)
class NodeData(BaseModel):
"""Schema for a node entry in the member import file."""
public_key: str = Field(..., min_length=64, max_length=64)
node_role: Optional[str] = Field(default=None, max_length=50)
@field_validator("public_key")
@classmethod
def validate_public_key(cls, v: str) -> str:
"""Validate and normalize public key."""
if len(v) != 64:
raise ValueError(f"public_key must be 64 characters, got {len(v)}")
if not all(c in "0123456789abcdefABCDEF" for c in v):
raise ValueError("public_key must be a valid hex string")
return v.lower()
class MemberData(BaseModel):
"""Schema for a member entry in the import file."""
@@ -22,40 +39,39 @@ class MemberData(BaseModel):
role: Optional[str] = Field(default=None, max_length=100)
description: Optional[str] = Field(default=None)
contact: Optional[str] = Field(default=None, max_length=255)
public_key: Optional[str] = Field(default=None)
@field_validator("public_key")
@classmethod
def validate_public_key(cls, v: Optional[str]) -> Optional[str]:
"""Validate and normalize public key if provided."""
if v is None:
return None
if len(v) != 64:
raise ValueError(f"public_key must be 64 characters, got {len(v)}")
if not all(c in "0123456789abcdefABCDEF" for c in v):
raise ValueError("public_key must be a valid hex string")
return v.lower()
nodes: Optional[list[NodeData]] = Field(default=None)
def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
"""Load and validate members from a JSON file.
"""Load and validate members from a YAML file.
Supports two formats:
1. List of member objects:
[{"name": "Member 1", ...}, {"name": "Member 2", ...}]
- name: Member 1
callsign: M1
nodes:
- public_key: abc123...
node_role: chat
2. Object with "members" key:
{"members": [{"name": "Member 1", ...}, ...]}
members:
- name: Member 1
callsign: M1
nodes:
- public_key: abc123...
node_role: chat
Args:
file_path: Path to the members JSON file
file_path: Path to the members YAML file
Returns:
List of validated member dictionaries
Raises:
FileNotFoundError: If file does not exist
json.JSONDecodeError: If file is not valid JSON
yaml.YAMLError: If file is not valid YAML
ValueError: If file content is invalid
"""
path = Path(file_path)
@@ -63,7 +79,7 @@ def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
raise FileNotFoundError(f"Members file not found: {file_path}")
with open(path, "r") as f:
data = json.load(f)
data = yaml.safe_load(f)
# Handle both formats
if isinstance(data, list):
@@ -73,9 +89,7 @@ def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
if not isinstance(members_list, list):
raise ValueError("'members' key must contain a list")
else:
raise ValueError(
"Members file must be a list or an object with 'members' key"
)
raise ValueError("Members file must be a list or a mapping with 'members' key")
# Validate each member
validated: list[dict[str, Any]] = []
@@ -99,13 +113,14 @@ def import_members(
file_path: str | Path,
db: DatabaseManager,
) -> dict[str, Any]:
"""Import members from a JSON file into the database.
"""Import members from a YAML file into the database.
Performs upsert operations based on name - existing members are updated,
new members are created.
new members are created. Nodes are synced (existing nodes removed and
replaced with new ones from the file).
Args:
file_path: Path to the members JSON file
file_path: Path to the members YAML file
db: Database manager instance
Returns:
@@ -150,8 +165,20 @@ def import_members(
existing.description = member_data["description"]
if member_data.get("contact") is not None:
existing.contact = member_data["contact"]
if member_data.get("public_key") is not None:
existing.public_key = member_data["public_key"]
# Sync nodes if provided
if member_data.get("nodes") is not None:
# Remove existing nodes
existing.nodes.clear()
# Add new nodes
for node_data in member_data["nodes"]:
node = MemberNode(
member_id=existing.id,
public_key=node_data["public_key"],
node_role=node_data.get("node_role"),
)
existing.nodes.append(node)
stats["updated"] += 1
logger.debug(f"Updated member: {name}")
@@ -163,9 +190,20 @@ def import_members(
role=member_data.get("role"),
description=member_data.get("description"),
contact=member_data.get("contact"),
public_key=member_data.get("public_key"),
)
session.add(new_member)
session.flush() # Get the ID for the member
# Add nodes if provided
if member_data.get("nodes"):
for node_data in member_data["nodes"]:
node = MemberNode(
member_id=new_member.id,
public_key=node_data["public_key"],
node_role=node_data.get("node_role"),
)
session.add(node)
stats["created"] += 1
logger.debug(f"Created member: {name}")
@@ -206,14 +206,16 @@ class Subscriber:
"""Start the subscriber."""
logger.info("Starting collector subscriber")
# Create database tables if needed
# Verify database connection (schema managed by Alembic migrations)
try:
self.db.create_tables()
# Test connection by getting a session
session = self.db.get_session()
session.close()
self._db_connected = True
logger.info("Database initialized")
logger.info("Database connection verified")
except Exception as e:
self._db_connected = False
logger.error(f"Failed to initialize database: {e}")
logger.error(f"Failed to connect to database: {e}")
raise
# Connect to MQTT broker
@@ -1,11 +1,11 @@
"""Import node tags from JSON file."""
"""Import node tags from YAML file."""
import json
import logging
from datetime import datetime, timezone
from pathlib import Path
from typing import Any
import yaml
from pydantic import BaseModel, Field, model_validator
from sqlalchemy import select
@@ -64,33 +64,33 @@ def validate_public_key(public_key: str) -> str:
def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
"""Load and validate tags from a JSON file.
"""Load and validate tags from a YAML file.
New format - dictionary keyed by public_key:
{
"0123456789abcdef...": {
"friendly_name": "My Node",
"location": {"value": "52.0,1.0", "type": "coordinate"},
"altitude": {"value": "150", "type": "number"}
}
}
YAML format - dictionary keyed by public_key:
0123456789abcdef...:
friendly_name: My Node
location:
value: "52.0,1.0"
type: coordinate
altitude:
value: "150"
type: number
Shorthand is allowed - string values are auto-converted:
{
"0123456789abcdef...": {
"friendly_name": "My Node"
}
}
0123456789abcdef...:
friendly_name: My Node
Args:
file_path: Path to the tags JSON file
file_path: Path to the tags YAML file
Returns:
Dictionary mapping public_key to tag dictionary
Raises:
FileNotFoundError: If file does not exist
json.JSONDecodeError: If file is not valid JSON
yaml.YAMLError: If file is not valid YAML
ValueError: If file content is invalid
"""
path = Path(file_path)
@@ -98,10 +98,10 @@ def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
raise FileNotFoundError(f"Tags file not found: {file_path}")
with open(path, "r") as f:
data = json.load(f)
data = yaml.safe_load(f)
if not isinstance(data, dict):
raise ValueError("Tags file must contain a JSON object")
raise ValueError("Tags file must contain a YAML mapping")
# Validate each entry
validated: dict[str, dict[str, Any]] = {}
@@ -117,12 +117,24 @@ def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
for tag_key, tag_value in tags.items():
if isinstance(tag_value, dict):
# Full format with value and type
raw_value = tag_value.get("value")
# Convert value to string if it's not None
str_value = str(raw_value) if raw_value is not None else None
validated_tags[tag_key] = {
"value": tag_value.get("value"),
"value": str_value,
"type": tag_value.get("type", "string"),
}
elif isinstance(tag_value, bool):
# YAML boolean - must check before int since bool is subclass of int
validated_tags[tag_key] = {
"value": str(tag_value).lower(),
"type": "boolean",
}
elif isinstance(tag_value, (int, float)):
# YAML number (int or float)
validated_tags[tag_key] = {"value": str(tag_value), "type": "number"}
elif isinstance(tag_value, str):
# Shorthand: just a string value
# String value
validated_tags[tag_key] = {"value": tag_value, "type": "string"}
elif tag_value is None:
validated_tags[tag_key] = {"value": None, "type": "string"}
@@ -140,12 +152,12 @@ def import_tags(
db: DatabaseManager,
create_nodes: bool = True,
) -> dict[str, Any]:
"""Import tags from a JSON file into the database.
"""Import tags from a YAML file into the database.
Performs upsert operations - existing tags are updated, new tags are created.
Args:
file_path: Path to the tags JSON file
file_path: Path to the tags YAML file
db: Database manager instance
create_nodes: If True, create nodes that don't exist. If False, skip tags
for non-existent nodes.
@@ -404,7 +404,7 @@ _dispatch_callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None
def set_dispatch_callback(
callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None]]
callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None]],
) -> None:
"""Set a callback for synchronous webhook dispatch.
@@ -80,7 +80,7 @@ class CollectorSettings(CommonSettings):
description="SQLAlchemy database URL (default: sqlite:///{data_home}/collector/meshcore.db)",
)
# Seed home directory - contains initial data files (node_tags.json, members.json)
# Seed home directory - contains initial data files (node_tags.yaml, members.yaml)
seed_home: str = Field(
default="./seed",
description="Directory containing seed data files (default: ./seed)",
@@ -147,17 +147,17 @@ class CollectorSettings(CommonSettings):
@property
def node_tags_file(self) -> str:
"""Get the path to node_tags.json in seed_home."""
"""Get the path to node_tags.yaml in seed_home."""
from pathlib import Path
return str(Path(self.effective_seed_home) / "node_tags.json")
return str(Path(self.effective_seed_home) / "node_tags.yaml")
@property
def members_file(self) -> str:
"""Get the path to members.json in seed_home."""
"""Get the path to members.yaml in seed_home."""
from pathlib import Path
return str(Path(self.effective_seed_home) / "members.json")
return str(Path(self.effective_seed_home) / "members.yaml")
@field_validator("database_url")
@classmethod
@@ -231,9 +231,6 @@ class WebSettings(CommonSettings):
network_country: Optional[str] = Field(
default=None, description="Network country (ISO 3166-1 alpha-2)"
)
network_location: Optional[str] = Field(
default=None, description="Network location (lat,lon)"
)
network_radio_config: Optional[str] = Field(
default=None, description="Radio configuration details"
)
@@ -243,6 +240,12 @@ class WebSettings(CommonSettings):
network_contact_discord: Optional[str] = Field(
default=None, description="Discord server link"
)
network_contact_github: Optional[str] = Field(
default=None, description="GitHub repository URL"
)
network_welcome_text: Optional[str] = Field(
default=None, description="Welcome text for homepage"
)
@property
def web_data_dir(self) -> str:
@@ -0,0 +1,142 @@
"""Event hash utilities for deduplication.
This module provides functions to compute deterministic hashes for events,
allowing deduplication when multiple receiver nodes report the same event.
"""
import hashlib
from datetime import datetime
from typing import Optional
def compute_message_hash(
text: str,
pubkey_prefix: Optional[str] = None,
channel_idx: Optional[int] = None,
sender_timestamp: Optional[datetime] = None,
txt_type: Optional[int] = None,
) -> str:
"""Compute a deterministic hash for a message.
The hash is computed from fields that uniquely identify a message's content
and sender, excluding receiver-specific data.
Args:
text: Message content
pubkey_prefix: Sender's public key prefix (12 chars)
channel_idx: Channel index for channel messages
sender_timestamp: Sender's timestamp
txt_type: Message type indicator
Returns:
32-character hex hash string
"""
# Build a canonical string from the relevant fields
parts = [
text or "",
pubkey_prefix or "",
str(channel_idx) if channel_idx is not None else "",
sender_timestamp.isoformat() if sender_timestamp else "",
str(txt_type) if txt_type is not None else "",
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
def compute_advertisement_hash(
public_key: str,
name: Optional[str] = None,
adv_type: Optional[str] = None,
flags: Optional[int] = None,
received_at: Optional[datetime] = None,
bucket_seconds: int = 30,
) -> str:
"""Compute a deterministic hash for an advertisement.
Advertisements are bucketed by time because the same node advertises
periodically; deduplication only applies within a single time window.
Args:
public_key: Advertised node's public key
name: Advertised name
adv_type: Node type
flags: Capability flags
received_at: When received (used for time bucketing)
bucket_seconds: Time bucket size in seconds (default 30)
Returns:
32-character hex hash string
"""
# Bucket the time to allow deduplication within a window
time_bucket = ""
if received_at:
# Round down to nearest bucket
epoch = int(received_at.timestamp())
bucket_epoch = (epoch // bucket_seconds) * bucket_seconds
time_bucket = str(bucket_epoch)
parts = [
public_key,
name or "",
adv_type or "",
str(flags) if flags is not None else "",
time_bucket,
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
def compute_trace_hash(initiator_tag: int) -> str:
"""Compute a deterministic hash for a trace path.
Trace paths have a unique initiator_tag that serves as the identifier.
Args:
initiator_tag: Unique trace identifier
Returns:
32-character hex hash string
"""
return hashlib.md5(str(initiator_tag).encode("utf-8")).hexdigest()
def compute_telemetry_hash(
node_public_key: str,
parsed_data: Optional[dict] = None,
received_at: Optional[datetime] = None,
bucket_seconds: int = 30,
) -> str:
"""Compute a deterministic hash for a telemetry record.
Telemetry is bucketed by time since nodes report periodically.
Args:
node_public_key: Reporting node's public key
parsed_data: Decoded sensor readings
received_at: When received (used for time bucketing)
bucket_seconds: Time bucket size in seconds (default 30)
Returns:
32-character hex hash string
"""
# Bucket the time
time_bucket = ""
if received_at:
epoch = int(received_at.timestamp())
bucket_epoch = (epoch // bucket_seconds) * bucket_seconds
time_bucket = str(bucket_epoch)
# Serialize parsed_data deterministically
data_str = ""
if parsed_data:
# Sort keys for deterministic serialization
sorted_items = sorted(parsed_data.items())
data_str = str(sorted_items)
parts = [
node_public_key,
data_str,
time_bucket,
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
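The time-bucket scheme shared by the advertisement and telemetry hashes can be sketched in isolation (a simplified stand-in, not the module's actual helpers — the real functions fold more fields into the canonical string):

```python
import hashlib
from datetime import datetime, timezone

def bucketed_hash(public_key: str, received_at: datetime, bucket_seconds: int = 30) -> str:
    """Round the timestamp down to a bucket so near-simultaneous reports collide."""
    bucket_epoch = (int(received_at.timestamp()) // bucket_seconds) * bucket_seconds
    canonical = "|".join([public_key, str(bucket_epoch)])
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()

t0 = datetime(2025, 1, 1, 12, 0, 5, tzinfo=timezone.utc)
t1 = datetime(2025, 1, 1, 12, 0, 25, tzinfo=timezone.utc)  # same 30s bucket as t0
t2 = datetime(2025, 1, 1, 12, 0, 35, tzinfo=timezone.utc)  # next bucket

# Two receivers reporting within the same bucket produce the same hash,
# so the event deduplicates; a report in the next bucket is a new event.
assert bucketed_hash("ab" * 32, t0) == bucketed_hash("ab" * 32, t1)
assert bucketed_hash("ab" * 32, t0) != bucketed_hash("ab" * 32, t2)
```

This is why `bucket_seconds` is a tuning knob: a larger bucket deduplicates more aggressively but can merge genuinely distinct periodic reports.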
@@ -9,6 +9,8 @@ from meshcore_hub.common.models.trace_path import TracePath
from meshcore_hub.common.models.telemetry import Telemetry
from meshcore_hub.common.models.event_log import EventLog
from meshcore_hub.common.models.member import Member
from meshcore_hub.common.models.member_node import MemberNode
from meshcore_hub.common.models.event_receiver import EventReceiver, add_event_receiver
__all__ = [
"Base",
@@ -21,4 +23,7 @@ __all__ = [
"Telemetry",
"EventLog",
"Member",
"MemberNode",
"EventReceiver",
"add_event_receiver",
]
@@ -58,6 +58,11 @@ class Advertisement(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (Index("ix_advertisements_received_at", "received_at"),)
@@ -0,0 +1,127 @@
"""EventReceiver model for tracking which nodes received each event."""
from datetime import datetime
from typing import TYPE_CHECKING, Optional
from uuid import uuid4
from sqlalchemy import DateTime, Float, ForeignKey, Index, String, UniqueConstraint
from sqlalchemy.dialects.sqlite import insert as sqlite_insert
from sqlalchemy.orm import Mapped, Session, mapped_column, relationship
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin, utc_now
if TYPE_CHECKING:
from meshcore_hub.common.models.node import Node
class EventReceiver(Base, UUIDMixin, TimestampMixin):
"""Junction model tracking which receivers observed each event.
This table enables multi-receiver tracking for deduplicated events.
When multiple receiver nodes observe the same mesh event, each receiver
gets an entry in this table linked by the event_hash.
Attributes:
id: UUID primary key
event_type: Type of event ('message', 'advertisement', 'trace', 'telemetry')
event_hash: Hash identifying the unique event (links to event tables)
receiver_node_id: FK to the node that received this event
snr: Signal-to-noise ratio at this receiver (if available)
received_at: When this specific receiver saw the event
created_at: Record creation timestamp
updated_at: Record update timestamp
"""
__tablename__ = "event_receivers"
event_type: Mapped[str] = mapped_column(
String(20),
nullable=False,
)
event_hash: Mapped[str] = mapped_column(
String(32),
nullable=False,
index=True,
)
receiver_node_id: Mapped[str] = mapped_column(
String(36),
ForeignKey("nodes.id", ondelete="CASCADE"),
nullable=False,
index=True,
)
snr: Mapped[Optional[float]] = mapped_column(
Float,
nullable=True,
)
received_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True),
default=utc_now,
nullable=False,
)
# Relationship to receiver node
receiver_node: Mapped["Node"] = relationship(
"Node",
foreign_keys=[receiver_node_id],
)
__table_args__ = (
UniqueConstraint(
"event_hash", "receiver_node_id", name="uq_event_receivers_hash_node"
),
Index("ix_event_receivers_type_hash", "event_type", "event_hash"),
)
def __repr__(self) -> str:
return (
f"<EventReceiver(type={self.event_type}, "
f"hash={self.event_hash[:8]}..., "
f"node={self.receiver_node_id[:8]}...)>"
)
def add_event_receiver(
session: Session,
event_type: str,
event_hash: str,
receiver_node_id: str,
snr: Optional[float] = None,
received_at: Optional[datetime] = None,
) -> bool:
"""Add a receiver to an event, handling duplicates gracefully.
Uses SQLite's INSERT ... ON CONFLICT DO NOTHING to respect the unique constraint on (event_hash, receiver_node_id).
Args:
session: SQLAlchemy session
event_type: Type of event ('message', 'advertisement', 'trace', 'telemetry')
event_hash: Hash identifying the unique event
receiver_node_id: UUID of the receiver node
snr: Signal-to-noise ratio at this receiver (optional)
received_at: When this receiver saw the event (defaults to now)
Returns:
True if a new receiver entry was added, False if it already existed.
"""
from datetime import timezone
now = received_at or datetime.now(timezone.utc)
stmt = (
sqlite_insert(EventReceiver)
.values(
id=str(uuid4()),
event_type=event_type,
event_hash=event_hash,
receiver_node_id=receiver_node_id,
snr=snr,
received_at=now,
created_at=now,
updated_at=now,
)
.on_conflict_do_nothing(index_elements=["event_hash", "receiver_node_id"])
)
result = session.execute(stmt)
# CursorResult has rowcount attribute
rowcount = getattr(result, "rowcount", 0)
return bool(rowcount and rowcount > 0)
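The duplicate-tolerant behavior relies on `rowcount` staying 0 when the conflict clause suppresses the insert. That can be reproduced against plain `sqlite3` (simplified two-column table and hypothetical names for illustration; the real model carries more columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE event_receivers ("
    "  event_hash TEXT NOT NULL,"
    "  receiver_node_id TEXT NOT NULL,"
    "  UNIQUE (event_hash, receiver_node_id)"
    ")"
)

def add_receiver(event_hash: str, node_id: str) -> bool:
    # ON CONFLICT ... DO NOTHING leaves rowcount at 0 for duplicates
    cur = conn.execute(
        "INSERT INTO event_receivers (event_hash, receiver_node_id) "
        "VALUES (?, ?) ON CONFLICT (event_hash, receiver_node_id) DO NOTHING",
        (event_hash, node_id),
    )
    return cur.rowcount > 0

assert add_receiver("abc123", "node-1") is True    # first insert succeeds
assert add_receiver("abc123", "node-1") is False   # duplicate silently ignored
assert add_receiver("abc123", "node-2") is True    # same event, different receiver
```

This mirrors the boolean contract of `add_event_receiver`: callers learn whether the receiver row was new without ever hitting an IntegrityError.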
+12 -7
@@ -1,17 +1,21 @@
"""Member model for network member information."""
from typing import Optional
from typing import TYPE_CHECKING, Optional
from sqlalchemy import String, Text
from sqlalchemy.orm import Mapped, mapped_column
from sqlalchemy.orm import Mapped, mapped_column, relationship
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin
if TYPE_CHECKING:
from meshcore_hub.common.models.member_node import MemberNode
class Member(Base, UUIDMixin, TimestampMixin):
"""Member model for network member information.
Stores information about network members/operators.
Members can have multiple associated nodes (chat, repeater, etc.).
Attributes:
id: UUID primary key
@@ -20,7 +24,7 @@ class Member(Base, UUIDMixin, TimestampMixin):
role: Member's role in the network (optional)
description: Additional description (optional)
contact: Contact information (optional)
public_key: Associated node public key (optional, 64-char hex)
nodes: List of associated MemberNode records
created_at: Record creation timestamp
updated_at: Record update timestamp
"""
@@ -47,10 +51,11 @@ class Member(Base, UUIDMixin, TimestampMixin):
String(255),
nullable=True,
)
public_key: Mapped[Optional[str]] = mapped_column(
String(64),
nullable=True,
index=True,
# Relationship to member nodes
nodes: Mapped[list["MemberNode"]] = relationship(
back_populates="member",
cascade="all, delete-orphan",
)
def __repr__(self) -> str:
@@ -0,0 +1,56 @@
"""MemberNode model for associating nodes with members."""
from typing import TYPE_CHECKING, Optional
from sqlalchemy import ForeignKey, String, Index
from sqlalchemy.orm import Mapped, mapped_column, relationship
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin
if TYPE_CHECKING:
from meshcore_hub.common.models.member import Member
class MemberNode(Base, UUIDMixin, TimestampMixin):
"""Association model linking members to their nodes.
A member can have multiple nodes (e.g., chat node, repeater).
Each node is identified by its public_key and has a role.
Attributes:
id: UUID primary key
member_id: Foreign key to the member
public_key: Node's public key (64-char hex)
node_role: Role of the node (e.g., 'chat', 'repeater')
created_at: Record creation timestamp
updated_at: Record update timestamp
"""
__tablename__ = "member_nodes"
member_id: Mapped[str] = mapped_column(
String(36),
ForeignKey("members.id", ondelete="CASCADE"),
nullable=False,
index=True,
)
public_key: Mapped[str] = mapped_column(
String(64),
nullable=False,
index=True,
)
node_role: Mapped[Optional[str]] = mapped_column(
String(50),
nullable=True,
)
# Relationship back to member
member: Mapped["Member"] = relationship(back_populates="nodes")
# Composite index for efficient lookups
__table_args__ = (
Index("ix_member_nodes_member_public_key", "member_id", "public_key"),
)
def __repr__(self) -> str:
return f"<MemberNode(member_id={self.member_id}, public_key={self.public_key[:8]}..., role={self.node_role})>"
@@ -76,6 +76,11 @@ class Message(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (
Index("ix_messages_message_type", "message_type"),
@@ -54,6 +54,11 @@ class Telemetry(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (Index("ix_telemetry_received_at", "received_at"),)
+6 -1
@@ -3,7 +3,7 @@
from datetime import datetime
from typing import Optional
from sqlalchemy import BigInteger, DateTime, ForeignKey, Index, Integer
from sqlalchemy import BigInteger, DateTime, ForeignKey, Index, Integer, String
from sqlalchemy.dialects.sqlite import JSON
from sqlalchemy.orm import Mapped, mapped_column
@@ -67,6 +67,11 @@ class TracePath(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (
Index("ix_trace_paths_initiator_tag", "initiator_tag"),
+8 -1
@@ -20,6 +20,7 @@ from meshcore_hub.common.schemas.nodes import (
NodeTagRead,
)
from meshcore_hub.common.schemas.messages import (
ReceiverInfo,
MessageRead,
MessageList,
MessageFilters,
@@ -35,6 +36,9 @@ from meshcore_hub.common.schemas.members import (
MemberRead,
MemberList,
)
from meshcore_hub.common.schemas.network import (
RadioConfig,
)
__all__ = [
# Events
@@ -54,7 +58,8 @@ __all__ = [
"NodeTagCreate",
"NodeTagUpdate",
"NodeTagRead",
# Messages
# Messages & Events
"ReceiverInfo",
"MessageRead",
"MessageList",
"MessageFilters",
@@ -67,4 +72,6 @@ __all__ = [
"MemberUpdate",
"MemberRead",
"MemberList",
# Network
"RadioConfig",
]
+38 -3
@@ -157,7 +157,16 @@ class TelemetryResponseEvent(BaseModel):
class ContactInfo(BaseModel):
"""Schema for a single contact in CONTACTS event."""
"""Schema for a single contact in CONTACTS event.
Device payload fields:
- public_key: Node's 64-char hex public key
- adv_name: Node's advertised name (device field)
- type: Numeric node type (0=none, 1=chat, 2=repeater, 3=room)
- flags: Capability flags
- last_advert: Unix timestamp of last advertisement
- adv_lat, adv_lon: GPS coordinates (if available)
"""
public_key: str = Field(
...,
@@ -165,14 +174,40 @@ class ContactInfo(BaseModel):
max_length=64,
description="Node's full public key",
)
adv_name: Optional[str] = Field(
default=None,
max_length=255,
description="Node's advertised name (from device)",
)
type: Optional[int] = Field(
default=None,
description="Numeric node type: 0=none, 1=chat, 2=repeater, 3=room",
)
flags: Optional[int] = Field(
default=None,
description="Capability/status flags bitmask",
)
last_advert: Optional[int] = Field(
default=None,
description="Unix timestamp of last advertisement",
)
adv_lat: Optional[float] = Field(
default=None,
description="GPS latitude (if available)",
)
adv_lon: Optional[float] = Field(
default=None,
description="GPS longitude (if available)",
)
# Legacy field names for backwards compatibility
name: Optional[str] = Field(
default=None,
max_length=255,
description="Node name/alias",
description="Node name/alias (legacy, prefer adv_name)",
)
node_type: Optional[str] = Field(
default=None,
description="Node type: chat, repeater, room, none",
description="Node type string (legacy, prefer type)",
)
+42 -13
@@ -6,6 +6,43 @@ from typing import Optional
from pydantic import BaseModel, Field
class MemberNodeCreate(BaseModel):
"""Schema for creating a member node association."""
public_key: str = Field(
...,
min_length=64,
max_length=64,
pattern=r"^[0-9a-fA-F]{64}$",
description="Node's public key (64-char hex)",
)
node_role: Optional[str] = Field(
default=None,
max_length=50,
description="Role of the node (e.g., 'chat', 'repeater')",
)
class MemberNodeRead(BaseModel):
"""Schema for reading a member node association."""
public_key: str = Field(..., description="Node's public key")
node_role: Optional[str] = Field(default=None, description="Role of the node")
created_at: datetime = Field(..., description="Creation timestamp")
updated_at: datetime = Field(..., description="Last update timestamp")
# Node details (populated from nodes table if available)
node_name: Optional[str] = Field(default=None, description="Node's name from DB")
node_adv_type: Optional[str] = Field(
default=None, description="Node's advertisement type"
)
friendly_name: Optional[str] = Field(
default=None, description="Node's friendly name tag"
)
class Config:
from_attributes = True
class MemberCreate(BaseModel):
"""Schema for creating a member."""
@@ -34,12 +71,9 @@ class MemberCreate(BaseModel):
max_length=255,
description="Contact information",
)
public_key: Optional[str] = Field(
nodes: Optional[list[MemberNodeCreate]] = Field(
default=None,
min_length=64,
max_length=64,
pattern=r"^[0-9a-fA-F]{64}$",
description="Associated node public key (64-char hex)",
description="List of associated nodes",
)
@@ -71,12 +105,9 @@ class MemberUpdate(BaseModel):
max_length=255,
description="Contact information",
)
public_key: Optional[str] = Field(
nodes: Optional[list[MemberNodeCreate]] = Field(
default=None,
min_length=64,
max_length=64,
pattern=r"^[0-9a-fA-F]{64}$",
description="Associated node public key (64-char hex)",
description="List of associated nodes (replaces existing nodes)",
)
@@ -89,9 +120,7 @@ class MemberRead(BaseModel):
role: Optional[str] = Field(default=None, description="Member's role")
description: Optional[str] = Field(default=None, description="Description")
contact: Optional[str] = Field(default=None, description="Contact information")
public_key: Optional[str] = Field(
default=None, description="Associated node public key"
)
nodes: list[MemberNodeRead] = Field(default=[], description="Associated nodes")
created_at: datetime = Field(..., description="Creation timestamp")
updated_at: datetime = Field(..., description="Last update timestamp")
+108 -14
@@ -6,17 +6,41 @@ from typing import Literal, Optional
from pydantic import BaseModel, Field
class ReceiverInfo(BaseModel):
"""Information about a receiver that observed an event."""
node_id: str = Field(..., description="Receiver node UUID")
public_key: str = Field(..., description="Receiver node public key")
name: Optional[str] = Field(default=None, description="Receiver node name")
friendly_name: Optional[str] = Field(
default=None, description="Receiver friendly name from tags"
)
snr: Optional[float] = Field(
default=None, description="Signal-to-noise ratio at this receiver"
)
received_at: datetime = Field(..., description="When this receiver saw the event")
class Config:
from_attributes = True
class MessageRead(BaseModel):
"""Schema for reading a message."""
id: str = Field(..., description="Message UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
receiver_name: Optional[str] = Field(default=None, description="Receiver node name")
receiver_friendly_name: Optional[str] = Field(
default=None, description="Receiver friendly name from tags"
)
message_type: str = Field(..., description="Message type (contact, channel)")
pubkey_prefix: Optional[str] = Field(
default=None, description="Sender's public key prefix (12 chars)"
)
sender_name: Optional[str] = Field(
default=None, description="Sender's advertised node name"
)
sender_friendly_name: Optional[str] = Field(
default=None, description="Sender's friendly name from node tags"
)
@@ -31,6 +55,9 @@ class MessageRead(BaseModel):
)
received_at: datetime = Field(..., description="When received by interface")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list, description="All receivers that observed this message"
)
class Config:
from_attributes = True
@@ -79,17 +106,29 @@ class MessageFilters(BaseModel):
class AdvertisementRead(BaseModel):
"""Schema for reading an advertisement."""
id: str = Field(..., description="Advertisement UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
receiver_name: Optional[str] = Field(default=None, description="Receiver node name")
receiver_friendly_name: Optional[str] = Field(
default=None, description="Receiver friendly name from tags"
)
node_id: Optional[str] = Field(default=None, description="Advertised node UUID")
public_key: str = Field(..., description="Advertised public key")
name: Optional[str] = Field(default=None, description="Advertised name")
node_name: Optional[str] = Field(
default=None, description="Node name from nodes table"
)
node_friendly_name: Optional[str] = Field(
default=None, description="Node friendly name from tags"
)
adv_type: Optional[str] = Field(default=None, description="Node type")
flags: Optional[int] = Field(default=None, description="Capability flags")
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this advertisement",
)
class Config:
from_attributes = True
@@ -107,9 +146,8 @@ class AdvertisementList(BaseModel):
class TracePathRead(BaseModel):
"""Schema for reading a trace path."""
id: str = Field(..., description="Trace path UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
initiator_tag: int = Field(..., description="Trace identifier")
path_len: Optional[int] = Field(default=None, description="Path length")
@@ -124,6 +162,10 @@ class TracePathRead(BaseModel):
hop_count: Optional[int] = Field(default=None, description="Total hops")
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this trace",
)
class Config:
from_attributes = True
@@ -141,17 +183,19 @@ class TracePathList(BaseModel):
class TelemetryRead(BaseModel):
"""Schema for reading a telemetry record."""
id: str = Field(..., description="Telemetry UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
node_id: Optional[str] = Field(default=None, description="Reporting node UUID")
node_public_key: str = Field(..., description="Reporting node public key")
parsed_data: Optional[dict] = Field(
default=None, description="Decoded sensor readings"
)
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this telemetry",
)
class Config:
from_attributes = True
@@ -176,6 +220,20 @@ class RecentAdvertisement(BaseModel):
received_at: datetime = Field(..., description="When received")
class ChannelMessage(BaseModel):
"""Schema for a channel message summary."""
text: str = Field(..., description="Message text")
sender_name: Optional[str] = Field(default=None, description="Sender name")
sender_friendly_name: Optional[str] = Field(
default=None, description="Sender friendly name"
)
pubkey_prefix: Optional[str] = Field(
default=None, description="Sender public key prefix"
)
received_at: datetime = Field(..., description="When received")
class DashboardStats(BaseModel):
"""Schema for dashboard statistics."""
@@ -194,3 +252,39 @@ class DashboardStats(BaseModel):
default_factory=dict,
description="Message count per channel",
)
channel_messages: dict[int, list[ChannelMessage]] = Field(
default_factory=dict,
description="Recent messages per channel (up to 5 each)",
)
class DailyActivityPoint(BaseModel):
"""Schema for a single day's activity count."""
date: str = Field(..., description="Date in YYYY-MM-DD format")
count: int = Field(..., description="Count for this day")
class DailyActivity(BaseModel):
"""Schema for daily advertisement activity over a period."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(
..., description="Daily advertisement counts"
)
class MessageActivity(BaseModel):
"""Schema for daily message activity over a period."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(..., description="Daily message counts")
class NodeCountHistory(BaseModel):
"""Schema for node count over time."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(
..., description="Cumulative node count per day"
)
@@ -0,0 +1,65 @@
"""Pydantic schemas for network configuration."""
from typing import Optional
from pydantic import BaseModel
class RadioConfig(BaseModel):
"""Parsed radio configuration from comma-delimited string.
Format: "<profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>"
Example: "EU/UK Narrow,869.618MHz,62.5kHz,8,8,22dBm"
"""
profile: Optional[str] = None
frequency: Optional[str] = None
bandwidth: Optional[str] = None
spreading_factor: Optional[int] = None
coding_rate: Optional[int] = None
tx_power: Optional[str] = None
@classmethod
def from_config_string(cls, config_str: Optional[str]) -> Optional["RadioConfig"]:
"""Parse a comma-delimited radio config string.
Args:
config_str: Comma-delimited string in format:
"<profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>"
Returns:
RadioConfig instance if parsing succeeds, None if input is None or empty
"""
if not config_str:
return None
parts = [p.strip() for p in config_str.split(",")]
# Pad partial configs with empty strings (converted to None below)
while len(parts) < 6:
parts.append("")
# Parse spreading factor and coding rate as integers
spreading_factor = None
coding_rate = None
try:
if parts[3]:
spreading_factor = int(parts[3])
except ValueError:
pass
try:
if parts[4]:
coding_rate = int(parts[4])
except ValueError:
pass
return cls(
profile=parts[0] or None,
frequency=parts[1] or None,
bandwidth=parts[2] or None,
spreading_factor=spreading_factor,
coding_rate=coding_rate,
tx_power=parts[5] or None,
)
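The lenient parsing rules above (pad short inputs, tolerate non-numeric SF/CR fields, map empty fields to None) can be exercised standalone. This is a plain-dict sketch of the same logic, not the Pydantic model itself:

```python
from typing import Optional

def parse_radio_config(config_str: Optional[str]) -> Optional[dict]:
    """Split "<profile>,<frequency>,<bandwidth>,<sf>,<cr>,<tx_power>" leniently."""
    if not config_str:
        return None
    parts = [p.strip() for p in config_str.split(",")]
    parts += [""] * (6 - len(parts))  # pad partial configs to six fields

    def to_int(s: str) -> Optional[int]:
        try:
            return int(s) if s else None
        except ValueError:
            return None  # non-numeric SF/CR fields degrade to None

    return {
        "profile": parts[0] or None,
        "frequency": parts[1] or None,
        "bandwidth": parts[2] or None,
        "spreading_factor": to_int(parts[3]),
        "coding_rate": to_int(parts[4]),
        "tx_power": parts[5] or None,
    }

cfg = parse_radio_config("EU/UK Narrow,869.618MHz,62.5kHz,8,8,22dBm")
assert cfg["spreading_factor"] == 8 and cfg["tx_power"] == "22dBm"
assert parse_radio_config("") is None
```

A truncated string such as `"Custom,915MHz"` still parses: the missing fields simply come back as None rather than raising.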
-3
@@ -41,8 +41,6 @@ class NodeTagUpdate(BaseModel):
class NodeTagRead(BaseModel):
"""Schema for reading a node tag."""
id: str = Field(..., description="Tag UUID")
node_id: str = Field(..., description="Parent node UUID")
key: str = Field(..., description="Tag name/key")
value: Optional[str] = Field(default=None, description="Tag value")
value_type: str = Field(..., description="Value type hint")
@@ -56,7 +54,6 @@ class NodeTagRead(BaseModel):
class NodeRead(BaseModel):
"""Schema for reading a node."""
id: str = Field(..., description="Node UUID")
public_key: str = Field(..., description="Node's 64-character hex public key")
name: Optional[str] = Field(default=None, description="Node display name")
adv_type: Optional[str] = Field(default=None, description="Advertisement type")
+33 -3
@@ -175,6 +175,17 @@ class BaseMeshCoreDevice(ABC):
"""
pass
@abstractmethod
def get_contacts(self) -> bool:
"""Fetch contacts from device contact database.
Triggers a CONTACTS event with all stored contacts from the device.
Returns:
True if request was sent successfully
"""
pass
@abstractmethod
def run(self) -> None:
"""Run the device event loop (blocking)."""
@@ -322,6 +333,10 @@ class MeshCoreDevice(BaseMeshCoreDevice):
self._connected = True
logger.info(f"Connected to MeshCore device, public_key: {self._public_key}")
# Set up event subscriptions so events can be received immediately
self._setup_event_subscriptions()
return True
except Exception as e:
@@ -521,14 +536,29 @@ class MeshCoreDevice(BaseMeshCoreDevice):
logger.error(f"Failed to start message fetching: {e}")
return False
def get_contacts(self) -> bool:
"""Fetch contacts from device contact database."""
if not self._connected or not self._mc:
logger.error("Cannot get contacts: not connected")
return False
try:
async def _get_contacts() -> None:
await self._mc.commands.get_contacts()
self._loop.run_until_complete(_get_contacts())
logger.info("Requested contacts from device")
return True
except Exception as e:
logger.error(f"Failed to get contacts: {e}")
return False
def run(self) -> None:
"""Run the device event loop."""
self._running = True
logger.info("Starting device event loop")
# Set up event subscriptions
self._setup_event_subscriptions()
# Run the async event loop
async def _run_loop() -> None:
while self._running and self._connected:
+27
@@ -280,6 +280,33 @@ class MockMeshCoreDevice(BaseMeshCoreDevice):
logger.info("Mock: Started automatic message fetching")
return True
def get_contacts(self) -> bool:
"""Fetch contacts from mock device contact database."""
if not self._connected:
logger.error("Cannot get contacts: not connected")
return False
logger.info("Mock: Requesting contacts from device")
# Generate CONTACTS event with all configured mock nodes
def send_contacts() -> None:
time.sleep(0.2)
contacts = [
{
"public_key": node.public_key,
"name": node.name,
"node_type": node.adv_type,
}
for node in self.mock_config.nodes
]
self._dispatch_event(
EventType.CONTACTS,
{"contacts": contacts},
)
threading.Thread(target=send_contacts, daemon=True).start()
return True
def run(self) -> None:
"""Run the mock device event loop."""
self._running = True
+75 -13
@@ -74,7 +74,8 @@ class Receiver:
def _initialize_device(self) -> None:
"""Initialize device after connection.
Sets the hardware clock, sends a local advertisement, and starts message fetching.
Sets the hardware clock, sends a local advertisement, starts message fetching,
and syncs the contact database.
"""
# Set device time to current Unix timestamp
current_time = int(time.time())
@@ -95,6 +96,12 @@ class Receiver:
else:
logger.warning("Failed to start automatic message fetching")
# Fetch contact database to sync known nodes
if self.device.get_contacts():
logger.info("Requested contact database sync")
else:
logger.warning("Failed to request contact database")
def _handle_event(self, event_type: EventType, payload: dict[str, Any]) -> None:
"""Handle device event and publish to MQTT.
@@ -110,6 +117,11 @@ class Receiver:
# Convert event type to MQTT topic name
event_name = event_type.value
# Special handling for CONTACTS: split into individual messages
if event_type == EventType.CONTACTS:
self._publish_contacts(payload)
return
# Publish to MQTT
self.mqtt.publish_event(
self.device.public_key,
@@ -122,6 +134,49 @@ class Receiver:
except Exception as e:
logger.error(f"Failed to publish event to MQTT: {e}")
def _publish_contacts(self, payload: dict[str, Any]) -> None:
"""Publish each contact as a separate MQTT message.
The device returns contacts as a dict keyed by public_key.
We split this into individual 'contact' events for cleaner processing.
Args:
payload: Dict of contacts keyed by public_key
"""
if not self.device.public_key:
logger.warning("Cannot publish contacts: device public key not available")
return
# Handle both formats:
# - Dict keyed by public_key (real device)
# - Dict with "contacts" array (mock device)
if "contacts" in payload:
contacts = payload["contacts"]
else:
contacts = list(payload.values())
if not contacts:
logger.debug("Empty contacts list received")
return
device_key = self.device.public_key # Capture for type narrowing
count = 0
for contact in contacts:
if not isinstance(contact, dict):
continue
try:
self.mqtt.publish_event(
device_key,
"contact", # Use singular 'contact' for individual events
contact,
)
count += 1
except Exception as e:
logger.error(f"Failed to publish contact event: {e}")
logger.info(f"Published {count} contact events to MQTT")
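The dual-format normalization in `_publish_contacts` boils down to one branch plus a type filter. A hypothetical helper for illustration (names are not from the codebase):

```python
def extract_contacts(payload: dict) -> list[dict]:
    """Normalize CONTACTS payloads: real devices send a dict keyed by
    public_key; the mock device sends {"contacts": [...]}."""
    if "contacts" in payload:
        candidates = payload["contacts"]
    else:
        candidates = list(payload.values())
    # Skip anything that is not a contact dict, as the receiver does
    return [c for c in candidates if isinstance(c, dict)]

real = {"ab" * 32: {"adv_name": "alpha"}, "cd" * 32: {"adv_name": "beta"}}
mock = {"contacts": [{"adv_name": "alpha"}]}

assert len(extract_contacts(real)) == 2
assert extract_contacts(mock) == [{"adv_name": "alpha"}]
assert extract_contacts({}) == []
```

Each dict that survives the filter is then published as a singular `contact` MQTT event, which keeps downstream consumers from needing to know which device shape produced the payload.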
def start(self) -> None:
"""Start the receiver."""
logger.info("Starting RECEIVER mode")
@@ -142,17 +197,19 @@ class Receiver:
logger.error(f"Failed to connect to MQTT broker: {e}")
raise
# Connect to device
if not self.device.connect():
self._device_connected = False
logger.error("Failed to connect to MeshCore device")
self.mqtt.stop()
self.mqtt.disconnect()
self._mqtt_connected = False
raise RuntimeError("Failed to connect to MeshCore device")
# Device should already be connected (from create_receiver)
# but handle case where start() is called directly
if not self.device.is_connected:
if not self.device.connect():
self._device_connected = False
logger.error("Failed to connect to MeshCore device")
self.mqtt.stop()
self.mqtt.disconnect()
self._mqtt_connected = False
raise RuntimeError("Failed to connect to MeshCore device")
logger.info(f"Connected to MeshCore device: {self.device.public_key}")
self._device_connected = True
logger.info(f"Connected to MeshCore device: {self.device.public_key}")
# Initialize device: set time and send local advertisement
self._initialize_device()
@@ -236,17 +293,22 @@ def create_receiver(
Returns:
Configured Receiver instance
"""
# Create device
# Create and connect device first to get public key
device = create_device(port=port, baud=baud, mock=mock, node_address=node_address)
# Create MQTT client
if not device.connect():
raise RuntimeError("Failed to connect to MeshCore device")
logger.info(f"Connected to MeshCore device: {device.public_key}")
# Create MQTT client with device's public key for unique client ID
mqtt_config = MQTTConfig(
host=mqtt_host,
port=mqtt_port,
username=mqtt_username,
password=mqtt_password,
prefix=mqtt_prefix,
client_id=f"meshcore-receiver-{device.public_key[:8] if device.public_key else 'unknown'}",
client_id=f"meshcore-receiver-{device.public_key[:12] if device.public_key else 'unknown'}",
)
mqtt_client = MQTTClient(mqtt_config)
+16 -9
@@ -200,14 +200,16 @@ class Sender:
"""Start the sender."""
logger.info("Starting SENDER mode")
# Connect to device first
if not self.device.connect():
self._device_connected = False
logger.error("Failed to connect to MeshCore device")
raise RuntimeError("Failed to connect to MeshCore device")
# Device should already be connected (from create_sender)
# but handle case where start() is called directly
if not self.device.is_connected:
if not self.device.connect():
self._device_connected = False
logger.error("Failed to connect to MeshCore device")
raise RuntimeError("Failed to connect to MeshCore device")
logger.info(f"Connected to MeshCore device: {self.device.public_key}")
self._device_connected = True
logger.info(f"Connected to MeshCore device: {self.device.public_key}")
# Connect to MQTT broker
try:
@@ -307,17 +309,22 @@ def create_sender(
Returns:
Configured Sender instance
"""
# Create device
# Create and connect device first to get public key
device = create_device(port=port, baud=baud, mock=mock, node_address=node_address)
# Create MQTT client
if not device.connect():
raise RuntimeError("Failed to connect to MeshCore device")
logger.info(f"Connected to MeshCore device: {device.public_key}")
# Create MQTT client with device's public key for unique client ID
mqtt_config = MQTTConfig(
host=mqtt_host,
port=mqtt_port,
username=mqtt_username,
password=mqtt_password,
prefix=mqtt_prefix,
client_id=f"meshcore-sender-{device.public_key[:8] if device.public_key else 'unknown'}",
client_id=f"meshcore-sender-{device.public_key[:12] if device.public_key else 'unknown'}",
)
mqtt_client = MQTTClient(mqtt_config)
+44 -16
@@ -11,6 +11,7 @@ from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates
from meshcore_hub import __version__
from meshcore_hub.common.schemas import RadioConfig
logger = logging.getLogger(__name__)
@@ -47,32 +48,42 @@ async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]:
def create_app(
api_url: str = "http://localhost:8000",
api_url: str | None = None,
api_key: str | None = None,
network_name: str = "MeshCore Network",
network_name: str | None = None,
network_city: str | None = None,
network_country: str | None = None,
network_location: tuple[float, float] | None = None,
network_radio_config: str | None = None,
network_contact_email: str | None = None,
network_contact_discord: str | None = None,
network_contact_github: str | None = None,
network_welcome_text: str | None = None,
) -> FastAPI:
"""Create and configure the web dashboard application.
When called without arguments (e.g., in reload mode), settings are loaded
from environment variables via the WebSettings class.
Args:
api_url: Base URL of the MeshCore Hub API
api_key: API key for authentication
network_name: Display name for the network
network_city: City where the network is located
network_country: Country where the network is located
network_location: (lat, lon) tuple for map centering
network_radio_config: Radio configuration description
network_contact_email: Contact email address
network_contact_discord: Discord invite/server info
network_contact_github: GitHub repository URL
network_welcome_text: Welcome text for homepage
Returns:
Configured FastAPI application
"""
# Load settings from environment if not provided
from meshcore_hub.common.config import get_web_settings
settings = get_web_settings()
app = FastAPI(
title="MeshCore Hub Dashboard",
description="Web dashboard for MeshCore network visualization",
@@ -82,16 +93,27 @@ def create_app(
redoc_url=None,
)
# Store configuration in app state
app.state.api_url = api_url
app.state.api_key = api_key
app.state.network_name = network_name
app.state.network_city = network_city
app.state.network_country = network_country
app.state.network_location = network_location or (0.0, 0.0)
app.state.network_radio_config = network_radio_config
app.state.network_contact_email = network_contact_email
app.state.network_contact_discord = network_contact_discord
# Store configuration in app state (use args if provided, else settings)
app.state.api_url = api_url or settings.api_base_url
app.state.api_key = api_key or settings.api_key
app.state.network_name = network_name or settings.network_name
app.state.network_city = network_city or settings.network_city
app.state.network_country = network_country or settings.network_country
app.state.network_radio_config = (
network_radio_config or settings.network_radio_config
)
app.state.network_contact_email = (
network_contact_email or settings.network_contact_email
)
app.state.network_contact_discord = (
network_contact_discord or settings.network_contact_discord
)
app.state.network_contact_github = (
network_contact_github or settings.network_contact_github
)
app.state.network_welcome_text = (
network_welcome_text or settings.network_welcome_text
)
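The assignments above all follow one precedence rule: an explicit argument wins, otherwise the environment-backed setting applies. A sketch of the pattern, including its one caveat:

```python
# Precedence used for the app.state configuration: explicit argument first,
# env-loaded setting second. Because `or` tests truthiness, an explicitly
# passed empty string (or 0) also falls through to the setting.
def pick(arg, setting):
    return arg or setting
```

This is fine here because every overridable value is either `None` or a non-empty string.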
# Set up templates
templates = Jinja2Templates(directory=str(TEMPLATES_DIR))
@@ -134,13 +156,19 @@ def get_templates(request: Request) -> Jinja2Templates:
def get_network_context(request: Request) -> dict:
"""Get network configuration context for templates."""
# Parse radio config from comma-delimited string
radio_config = RadioConfig.from_config_string(
request.app.state.network_radio_config
)
return {
"network_name": request.app.state.network_name,
"network_city": request.app.state.network_city,
"network_country": request.app.state.network_country,
"network_location": request.app.state.network_location,
"network_radio_config": request.app.state.network_radio_config,
"network_radio_config": radio_config,
"network_contact_email": request.app.state.network_contact_email,
"network_contact_discord": request.app.state.network_contact_discord,
"network_contact_github": request.app.state.network_contact_github,
"network_welcome_text": request.app.state.network_welcome_text,
"version": __version__,
}
+48 -47
@@ -7,21 +7,21 @@ import click
@click.option(
"--host",
type=str,
default="0.0.0.0",
default=None,
envvar="WEB_HOST",
help="Web server host",
help="Web server host (default: 0.0.0.0)",
)
@click.option(
"--port",
type=int,
default=8080,
default=None,
envvar="WEB_PORT",
help="Web server port",
help="Web server port (default: 8080)",
)
@click.option(
"--api-url",
type=str,
default="http://localhost:8000",
default=None,
envvar="API_BASE_URL",
help="API server base URL",
)
@@ -42,7 +42,7 @@ import click
@click.option(
"--network-name",
type=str,
default="MeshCore Network",
default=None,
envvar="NETWORK_NAME",
help="Network display name",
)
@@ -60,20 +60,6 @@ import click
envvar="NETWORK_COUNTRY",
help="Network country",
)
@click.option(
"--network-lat",
type=float,
default=0.0,
envvar="NETWORK_LAT",
help="Network center latitude",
)
@click.option(
"--network-lon",
type=float,
default=0.0,
envvar="NETWORK_LON",
help="Network center longitude",
)
@click.option(
"--network-radio-config",
type=str,
@@ -95,6 +81,20 @@ import click
envvar="NETWORK_CONTACT_DISCORD",
help="Discord server info",
)
@click.option(
"--network-contact-github",
type=str,
default=None,
envvar="NETWORK_CONTACT_GITHUB",
help="GitHub repository URL",
)
@click.option(
"--network-welcome-text",
type=str,
default=None,
envvar="NETWORK_WELCOME_TEXT",
help="Welcome text for homepage",
)
@click.option(
"--reload",
is_flag=True,
@@ -104,19 +104,19 @@ import click
@click.pass_context
def web(
ctx: click.Context,
host: str,
port: int,
api_url: str,
host: str | None,
port: int | None,
api_url: str | None,
api_key: str | None,
data_home: str | None,
network_name: str,
network_name: str | None,
network_city: str | None,
network_country: str | None,
network_lat: float,
network_lon: float,
network_radio_config: str | None,
network_contact_email: str | None,
network_contact_discord: str | None,
network_contact_github: str | None,
network_welcome_text: str | None,
reload: bool,
) -> None:
"""Run the web dashboard.
@@ -146,46 +146,46 @@ def web(
from meshcore_hub.common.config import get_web_settings
from meshcore_hub.web.app import create_app
# Get settings to compute effective values
# Get settings for defaults and display
settings = get_web_settings()
# Override data_home if provided
if data_home:
settings = settings.model_copy(update={"data_home": data_home})
# Use CLI args or fall back to settings
effective_host = host or settings.web_host
effective_port = port or settings.web_port
effective_data_home = data_home or settings.data_home
# Ensure web data directory exists
web_data_dir = Path(effective_data_home) / "web"
web_data_dir.mkdir(parents=True, exist_ok=True)
# Display effective settings
effective_network_name = network_name or settings.network_name
click.echo("=" * 50)
click.echo("MeshCore Hub Web Dashboard")
click.echo("=" * 50)
click.echo(f"Host: {host}")
click.echo(f"Port: {port}")
click.echo(f"Host: {effective_host}")
click.echo(f"Port: {effective_port}")
click.echo(f"Data home: {effective_data_home}")
click.echo(f"API URL: {api_url}")
click.echo(f"API key configured: {api_key is not None}")
click.echo(f"Network: {network_name}")
if network_city and network_country:
click.echo(f"Location: {network_city}, {network_country}")
if network_lat != 0.0 or network_lon != 0.0:
click.echo(f"Map center: {network_lat}, {network_lon}")
click.echo(f"API URL: {api_url or settings.api_base_url}")
click.echo(f"API key configured: {(api_key or settings.api_key) is not None}")
click.echo(f"Network: {effective_network_name}")
effective_city = network_city or settings.network_city
effective_country = network_country or settings.network_country
if effective_city and effective_country:
click.echo(f"Location: {effective_city}, {effective_country}")
click.echo(f"Reload mode: {reload}")
click.echo("=" * 50)
network_location = (network_lat, network_lon)
if reload:
# For development, use uvicorn's reload feature
click.echo("\nStarting in development mode with auto-reload...")
click.echo("Note: Using default settings for reload mode.")
click.echo("Note: Settings loaded from environment/config.")
uvicorn.run(
"meshcore_hub.web.app:create_app",
host=host,
port=port,
host=effective_host,
port=effective_port,
reload=True,
factory=True,
)
@@ -197,11 +197,12 @@ def web(
network_name=network_name,
network_city=network_city,
network_country=network_country,
network_location=network_location,
network_radio_config=network_radio_config,
network_contact_email=network_contact_email,
network_contact_discord=network_contact_discord,
network_contact_github=network_contact_github,
network_welcome_text=network_welcome_text,
)
click.echo("\nStarting web dashboard...")
uvicorn.run(app, host=host, port=port)
uvicorn.run(app, host=effective_host, port=effective_port)
+2
@@ -6,6 +6,7 @@ from meshcore_hub.web.routes.home import router as home_router
from meshcore_hub.web.routes.network import router as network_router
from meshcore_hub.web.routes.nodes import router as nodes_router
from meshcore_hub.web.routes.messages import router as messages_router
from meshcore_hub.web.routes.advertisements import router as advertisements_router
from meshcore_hub.web.routes.map import router as map_router
from meshcore_hub.web.routes.members import router as members_router
@@ -17,6 +18,7 @@ web_router.include_router(home_router)
web_router.include_router(network_router)
web_router.include_router(nodes_router)
web_router.include_router(messages_router)
web_router.include_router(advertisements_router)
web_router.include_router(map_router)
web_router.include_router(members_router)
@@ -0,0 +1,64 @@
"""Advertisements page route."""
import logging
from fastapi import APIRouter, Query, Request
from fastapi.responses import HTMLResponse
from meshcore_hub.web.app import get_network_context, get_templates
logger = logging.getLogger(__name__)
router = APIRouter()
@router.get("/advertisements", response_class=HTMLResponse)
async def advertisements_list(
request: Request,
public_key: str | None = Query(None, description="Filter by public key"),
page: int = Query(1, ge=1, description="Page number"),
limit: int = Query(50, ge=1, le=100, description="Items per page"),
) -> HTMLResponse:
"""Render the advertisements list page."""
templates = get_templates(request)
context = get_network_context(request)
context["request"] = request
# Calculate offset
offset = (page - 1) * limit
# Build query params
params: dict[str, int | str] = {"limit": limit, "offset": offset}
if public_key:
params["public_key"] = public_key
# Fetch advertisements from API
advertisements = []
total = 0
try:
response = await request.app.state.http_client.get(
"/api/v1/advertisements", params=params
)
if response.status_code == 200:
data = response.json()
advertisements = data.get("items", [])
total = data.get("total", 0)
except Exception as e:
logger.warning(f"Failed to fetch advertisements from API: {e}")
context["api_error"] = str(e)
# Calculate pagination
total_pages = (total + limit - 1) // limit if total > 0 else 1
context.update(
{
"advertisements": advertisements,
"total": total,
"page": page,
"limit": limit,
"total_pages": total_pages,
"public_key": public_key or "",
}
)
return templates.TemplateResponse("advertisements.html", context)
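The `total_pages` expression in this route is integer ceiling division, clamped to at least one page; checked in isolation:

```python
def total_pages(total: int, limit: int) -> int:
    """Pages needed to show `total` items at `limit` per page (minimum 1)."""
    return (total + limit - 1) // limit if total > 0 else 1
```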
+38
@@ -1,10 +1,14 @@
"""Home page route."""
import json
import logging
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from meshcore_hub.web.app import get_network_context, get_templates
logger = logging.getLogger(__name__)
router = APIRouter()
@@ -15,4 +19,38 @@ async def home(request: Request) -> HTMLResponse:
context = get_network_context(request)
context["request"] = request
# Fetch stats from API
stats = {
"total_nodes": 0,
"active_nodes": 0,
"total_messages": 0,
"messages_today": 0,
"total_advertisements": 0,
"advertisements_24h": 0,
}
# Fetch activity data for chart
activity = {"days": 7, "data": []}
try:
response = await request.app.state.http_client.get("/api/v1/dashboard/stats")
if response.status_code == 200:
stats = response.json()
except Exception as e:
logger.warning(f"Failed to fetch stats from API: {e}")
context["api_error"] = str(e)
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/activity", params={"days": 7}
)
if response.status_code == 200:
activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch activity from API: {e}")
context["stats"] = stats
# Pass activity data as JSON string for the chart
context["activity_json"] = json.dumps(activity)
return templates.TemplateResponse("home.html", context)
+73 -7
@@ -1,6 +1,7 @@
"""Map page route."""
import logging
from typing import Any
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse, JSONResponse
@@ -23,10 +24,39 @@ async def map_page(request: Request) -> HTMLResponse:
@router.get("/map/data")
async def map_data(request: Request) -> JSONResponse:
"""Return node location data as JSON for the map."""
nodes_with_location = []
"""Return node location data as JSON for the map.
Includes role tag, member ownership info, and all data needed for filtering.
"""
nodes_with_location: list[dict[str, Any]] = []
members_list: list[dict[str, Any]] = []
members_by_key: dict[str, dict[str, Any]] = {}
error: str | None = None
total_nodes = 0
nodes_with_coords = 0
try:
# Fetch all members to build lookup by public_key
members_response = await request.app.state.http_client.get(
"/api/v1/members", params={"limit": 500}
)
if members_response.status_code == 200:
members_data = members_response.json()
for member in members_data.get("items", []):
# Only include members with public_key (required for node ownership)
if member.get("public_key"):
member_info = {
"public_key": member.get("public_key"),
"name": member.get("name"),
"callsign": member.get("callsign"),
}
members_list.append(member_info)
members_by_key[member["public_key"]] = member_info
else:
logger.warning(
f"Failed to fetch members: status {members_response.status_code}"
)
# Fetch all nodes from API
response = await request.app.state.http_client.get(
"/api/v1/nodes", params={"limit": 500}
@@ -34,6 +64,7 @@ async def map_data(request: Request) -> JSONResponse:
if response.status_code == 200:
data = response.json()
nodes = data.get("items", [])
total_nodes = len(nodes)
# Filter nodes with location tags
for node in nodes:
@@ -41,6 +72,8 @@ async def map_data(request: Request) -> JSONResponse:
lat = None
lon = None
friendly_name = None
role = None
for tag in tags:
key = tag.get("key")
if key == "lat":
@@ -55,37 +88,70 @@ async def map_data(request: Request) -> JSONResponse:
pass
elif key == "friendly_name":
friendly_name = tag.get("value")
elif key == "role":
role = tag.get("value")
if lat is not None and lon is not None:
nodes_with_coords += 1
# Use friendly_name, then node name, then public key prefix
display_name = (
friendly_name
or node.get("name")
or node.get("public_key", "")[:12]
)
public_key = node.get("public_key")
# Find owner member if exists
owner = members_by_key.get(public_key)
nodes_with_location.append(
{
"public_key": node.get("public_key"),
"public_key": public_key,
"name": display_name,
"adv_type": node.get("adv_type"),
"lat": lat,
"lon": lon,
"last_seen": node.get("last_seen"),
"role": role,
"is_infra": role == "infra",
"owner": owner,
}
)
else:
error = f"API returned status {response.status_code}"
logger.warning(f"Failed to fetch nodes: {error}")
except Exception as e:
error = str(e)
logger.warning(f"Failed to fetch nodes for map: {e}")
# Get network center location
network_location = request.app.state.network_location
logger.info(
f"Map data: {total_nodes} total nodes, " f"{nodes_with_coords} with coordinates"
)
# Calculate center from nodes, or use default (0, 0)
center_lat = 0.0
center_lon = 0.0
if nodes_with_location:
center_lat = sum(n["lat"] for n in nodes_with_location) / len(
nodes_with_location
)
center_lon = sum(n["lon"] for n in nodes_with_location) / len(
nodes_with_location
)
return JSONResponse(
{
"nodes": nodes_with_location,
"members": members_list,
"center": {
"lat": network_location[0],
"lon": network_location[1],
"lat": center_lat,
"lon": center_lon,
},
"debug": {
"total_nodes": total_nodes,
"nodes_with_coords": nodes_with_coords,
"error": error,
},
}
)
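The map now centers on the mean of the plotted coordinates instead of a configured location. A standalone sketch of that calculation; note an arithmetic mean of lat/lon is adequate for a regional mesh but misbehaves for networks spanning the antimeridian:

```python
def map_center(nodes: list[dict]) -> tuple[float, float]:
    """Average lat/lon of nodes with coordinates; (0.0, 0.0) when empty."""
    if not nodes:
        return (0.0, 0.0)
    n = len(nodes)
    return (
        sum(node["lat"] for node in nodes) / n,
        sum(node["lon"] for node in nodes) / n,
    )
```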
+13
@@ -21,6 +21,15 @@ async def members_page(request: Request) -> HTMLResponse:
# Fetch members from API
members = []
def node_sort_key(node: dict) -> int:
"""Sort nodes: repeater first, then chat, then others."""
role = (node.get("node_role") or "").lower()
if role == "repeater":
return 0
if role == "chat":
return 1
return 2
try:
response = await request.app.state.http_client.get(
"/api/v1/members", params={"limit": 100}
@@ -28,6 +37,10 @@ async def members_page(request: Request) -> HTMLResponse:
if response.status_code == 200:
data = response.json()
members = data.get("items", [])
# Sort nodes within each member (repeater first, then chat)
for member in members:
if member.get("nodes"):
member["nodes"] = sorted(member["nodes"], key=node_sort_key)
except Exception as e:
logger.warning(f"Failed to fetch members from API: {e}")
context["api_error"] = str(e)
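The `node_sort_key` helper above relies on `sorted` being stable, so nodes with equal keys keep their API order. The same mapping can also be sketched as a dict lookup:

```python
def node_sort_key(node: dict) -> int:
    """Repeaters first, then chat nodes, then everything else (case-insensitive)."""
    role = (node.get("node_role") or "").lower()
    return {"repeater": 0, "chat": 1}.get(role, 2)

nodes = [{"node_role": "Chat"}, {"node_role": None}, {"node_role": "Repeater"}]
ordered = sorted(nodes, key=node_sort_key)
```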
+12 -4
@@ -15,7 +15,7 @@ router = APIRouter()
async def messages_list(
request: Request,
message_type: str | None = Query(None, description="Filter by message type"),
channel_idx: int | None = Query(None, description="Filter by channel"),
channel_idx: str | None = Query(None, description="Filter by channel"),
search: str | None = Query(None, description="Search in message text"),
page: int = Query(1, ge=1, description="Page number"),
limit: int = Query(50, ge=1, le=100, description="Items per page"),
@@ -28,12 +28,20 @@ async def messages_list(
# Calculate offset
offset = (page - 1) * limit
# Parse channel_idx, treating empty string as None
channel_idx_int: int | None = None
if channel_idx and channel_idx.strip():
try:
channel_idx_int = int(channel_idx)
except ValueError:
logger.warning(f"Invalid channel_idx value: {channel_idx}")
# Build query params
params: dict[str, int | str] = {"limit": limit, "offset": offset}
if message_type:
params["message_type"] = message_type
if channel_idx is not None:
params["channel_idx"] = channel_idx
if channel_idx_int is not None:
params["channel_idx"] = channel_idx_int
# Fetch messages from API
messages = []
@@ -62,7 +70,7 @@ async def messages_list(
"limit": limit,
"total_pages": total_pages,
"message_type": message_type or "",
"channel_idx": channel_idx,
"channel_idx": channel_idx_int,
"search": search or "",
}
)
+36
@@ -1,5 +1,6 @@
"""Network overview page route."""
import json
import logging
from fastapi import APIRouter, Request
@@ -30,6 +31,11 @@ async def network_overview(request: Request) -> HTMLResponse:
"channel_message_counts": {},
}
# Fetch activity data for charts (7 days)
advert_activity = {"days": 7, "data": []}
message_activity = {"days": 7, "data": []}
node_count = {"days": 7, "data": []}
try:
response = await request.app.state.http_client.get("/api/v1/dashboard/stats")
if response.status_code == 200:
@@ -38,6 +44,36 @@ async def network_overview(request: Request) -> HTMLResponse:
logger.warning(f"Failed to fetch stats from API: {e}")
context["api_error"] = str(e)
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/activity", params={"days": 7}
)
if response.status_code == 200:
advert_activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch advertisement activity from API: {e}")
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/message-activity", params={"days": 7}
)
if response.status_code == 200:
message_activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch message activity from API: {e}")
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/node-count", params={"days": 7}
)
if response.status_code == 200:
node_count = response.json()
except Exception as e:
logger.warning(f"Failed to fetch node count from API: {e}")
context["stats"] = stats
context["advert_activity_json"] = json.dumps(advert_activity)
context["message_activity_json"] = json.dumps(message_activity)
context["node_count_json"] = json.dumps(node_count)
return templates.TemplateResponse("network.html", context)
@@ -0,0 +1,152 @@
{% extends "base.html" %}
{% block title %}{{ network_name }} - Advertisements{% endblock %}
{% block content %}
<div class="flex items-center justify-between mb-6">
<h1 class="text-3xl font-bold">Advertisements</h1>
<span class="badge badge-lg">{{ total }} total</span>
</div>
{% if api_error %}
<div class="alert alert-warning mb-6">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span>Could not fetch data from API: {{ api_error }}</span>
</div>
{% endif %}
<!-- Filters -->
<div class="card bg-base-100 shadow mb-6">
<div class="card-body py-4">
<form method="GET" action="/advertisements" class="flex gap-4 flex-wrap items-end">
<div class="form-control">
<label class="label py-1">
<span class="label-text">Public Key</span>
</label>
<input type="text" name="public_key" value="{{ public_key }}" placeholder="Filter by public key..." class="input input-bordered input-sm w-80" />
</div>
<button type="submit" class="btn btn-primary btn-sm">Filter</button>
<a href="/advertisements" class="btn btn-ghost btn-sm">Clear</a>
</form>
</div>
</div>
<!-- Advertisements Table -->
<div class="overflow-x-auto overflow-y-visible bg-base-100 rounded-box shadow">
<table class="table table-zebra">
<thead>
<tr>
<th>Node</th>
<th>Type</th>
<th>Received By</th>
<th>Time</th>
</tr>
</thead>
<tbody>
{% for ad in advertisements %}
<tr class="hover">
<td>
<a href="/nodes/{{ ad.public_key }}" class="link link-hover">
{% if ad.node_friendly_name or ad.node_name or ad.name %}
<div class="font-medium">{{ ad.node_friendly_name or ad.node_name or ad.name }}</div>
<div class="text-xs font-mono opacity-70">{{ ad.public_key[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ ad.public_key[:16] }}...</span>
{% endif %}
</a>
</td>
<td>
{% if ad.adv_type and ad.adv_type|lower == 'chat' %}
<span title="Chat">💬</span>
{% elif ad.adv_type and ad.adv_type|lower == 'repeater' %}
<span title="Repeater">📡</span>
{% elif ad.adv_type and ad.adv_type|lower == 'room' %}
<span title="Room">🪧</span>
{% elif ad.adv_type %}
<span title="{{ ad.adv_type }}">📍</span>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
<td>
{% if ad.receivers and ad.receivers|length > 1 %}
<div class="dropdown dropdown-hover dropdown-end">
<label tabindex="0" class="badge badge-outline badge-sm cursor-pointer">
{{ ad.receivers|length }} receivers
</label>
<ul tabindex="0" class="dropdown-content z-[1] menu p-2 shadow bg-base-100 rounded-box w-56">
{% for recv in ad.receivers %}
<li>
<a href="/nodes/{{ recv.public_key }}" class="text-sm">
{{ recv.friendly_name or recv.name or recv.public_key[:12] + '...' }}
</a>
</li>
{% endfor %}
</ul>
</div>
{% elif ad.receivers and ad.receivers|length == 1 %}
<a href="/nodes/{{ ad.receivers[0].public_key }}" class="link link-hover">
{% if ad.receivers[0].friendly_name or ad.receivers[0].name %}
<div class="font-medium">{{ ad.receivers[0].friendly_name or ad.receivers[0].name }}</div>
<div class="text-xs font-mono opacity-70">{{ ad.receivers[0].public_key[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ ad.receivers[0].public_key[:16] }}...</span>
{% endif %}
</a>
{% elif ad.received_by %}
<a href="/nodes/{{ ad.received_by }}" class="link link-hover">
{% if ad.receiver_friendly_name or ad.receiver_name %}
<div class="font-medium">{{ ad.receiver_friendly_name or ad.receiver_name }}</div>
<div class="text-xs font-mono opacity-70">{{ ad.received_by[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ ad.received_by[:16] }}...</span>
{% endif %}
</a>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
<td class="text-sm whitespace-nowrap">
{{ ad.received_at[:19].replace('T', ' ') if ad.received_at else '-' }}
</td>
</tr>
{% else %}
<tr>
<td colspan="4" class="text-center py-8 opacity-70">No advertisements found.</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
<!-- Pagination -->
{% if total_pages > 1 %}
<div class="flex justify-center mt-6">
<div class="join">
{% if page > 1 %}
<a href="?page={{ page - 1 }}&public_key={{ public_key }}&limit={{ limit }}" class="join-item btn btn-sm">Previous</a>
{% else %}
<button class="join-item btn btn-sm btn-disabled">Previous</button>
{% endif %}
{% for p in range(1, total_pages + 1) %}
{% if p == page %}
<button class="join-item btn btn-sm btn-active">{{ p }}</button>
{% elif p == 1 or p == total_pages or (p >= page - 2 and p <= page + 2) %}
<a href="?page={{ p }}&public_key={{ public_key }}&limit={{ limit }}" class="join-item btn btn-sm">{{ p }}</a>
{% elif p == 2 or p == total_pages - 1 %}
<button class="join-item btn btn-sm btn-disabled">...</button>
{% endif %}
{% endfor %}
{% if page < total_pages %}
<a href="?page={{ page + 1 }}&public_key={{ public_key }}&limit={{ limit }}" class="join-item btn btn-sm">Next</a>
{% else %}
<button class="join-item btn btn-sm btn-disabled">Next</button>
{% endif %}
</div>
</div>
{% endif %}
{% endblock %}
+8 -2
@@ -58,6 +58,7 @@
<li><a href="/" class="{% if request.url.path == '/' %}active{% endif %}">Home</a></li>
<li><a href="/network" class="{% if request.url.path == '/network' %}active{% endif %}">Network</a></li>
<li><a href="/nodes" class="{% if '/nodes' in request.url.path %}active{% endif %}">Nodes</a></li>
<li><a href="/advertisements" class="{% if request.url.path == '/advertisements' %}active{% endif %}">Advertisements</a></li>
<li><a href="/messages" class="{% if request.url.path == '/messages' %}active{% endif %}">Messages</a></li>
<li><a href="/map" class="{% if request.url.path == '/map' %}active{% endif %}">Map</a></li>
<li><a href="/members" class="{% if request.url.path == '/members' %}active{% endif %}">Members</a></li>
@@ -75,6 +76,7 @@
<li><a href="/" class="{% if request.url.path == '/' %}active{% endif %}">Home</a></li>
<li><a href="/network" class="{% if request.url.path == '/network' %}active{% endif %}">Network</a></li>
<li><a href="/nodes" class="{% if '/nodes' in request.url.path %}active{% endif %}">Nodes</a></li>
<li><a href="/advertisements" class="{% if request.url.path == '/advertisements' %}active{% endif %}">Advertisements</a></li>
<li><a href="/messages" class="{% if request.url.path == '/messages' %}active{% endif %}">Messages</a></li>
<li><a href="/map" class="{% if request.url.path == '/map' %}active{% endif %}">Map</a></li>
<li><a href="/members" class="{% if request.url.path == '/members' %}active{% endif %}">Members</a></li>
@@ -105,10 +107,14 @@
{% endif %}
{% if network_contact_email and network_contact_discord %} | {% endif %}
{% if network_contact_discord %}
<span>Discord: {{ network_contact_discord }}</span>
<a href="{{ network_contact_discord }}" target="_blank" rel="noopener noreferrer" class="link link-hover">Discord</a>
{% endif %}
{% if (network_contact_email or network_contact_discord) and network_contact_github %} | {% endif %}
{% if network_contact_github %}
<a href="{{ network_contact_github }}" target="_blank" rel="noopener noreferrer" class="link link-hover">GitHub</a>
{% endif %}
</p>
<p class="text-xs opacity-50 mt-2">Powered by MeshCore Hub v{{ version }}</p>
<p class="text-xs opacity-50 mt-2">Powered by <a href="https://github.com/ipnet-mesh/meshcore-hub" target="_blank" rel="noopener noreferrer" class="link link-hover">MeshCore Hub</a> v{{ version }}</p>
</aside>
</footer>
+206 -24
@@ -3,42 +3,103 @@
{% block title %}{{ network_name }} - Home{% endblock %}
{% block content %}
<div class="hero min-h-[50vh] bg-base-100 rounded-box">
<div class="hero py-8 bg-base-100 rounded-box">
<div class="hero-content text-center">
<div class="max-w-2xl">
<h1 class="text-5xl font-bold">{{ network_name }}</h1>
<h1 class="text-4xl font-bold">{{ network_name }}</h1>
{% if network_city and network_country %}
<p class="py-2 text-lg opacity-70">{{ network_city }}, {{ network_country }}</p>
<p class="py-1 text-lg opacity-70">{{ network_city }}, {{ network_country }}</p>
{% endif %}
<p class="py-6">
{% if network_welcome_text %}
<p class="py-4">{{ network_welcome_text }}</p>
{% else %}
<p class="py-4">
Welcome to the {{ network_name }} mesh network dashboard.
Monitor network activity, view connected nodes, and explore message history.
</p>
{% endif %}
<div class="flex gap-4 justify-center flex-wrap">
<a href="/network" class="btn btn-primary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z" />
</svg>
View Network Stats
Dashboard
</a>
<a href="/nodes" class="btn btn-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
Browse Nodes
Nodes
</a>
<a href="/map" class="btn btn-accent">
<a href="/advertisements" class="btn btn-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 20l-5.447-2.724A1 1 0 013 16.382V5.618a1 1 0 011.447-.894L9 7m0 13l6-3m-6 3V7m6 10l4.553 2.276A1 1 0 0021 18.382V7.618a1 1 0 00-.553-.894L15 4m0 13V4m0 0L9 7" />
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
-View Map
+Advertisements
</a>
<a href="/messages" class="btn btn-info">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
Messages
</a>
</div>
</div>
</div>
</div>
-<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6 mt-8">
+<!-- Stats Cards -->
+<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6 mt-6">
<!-- Total Nodes -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-primary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
</div>
<div class="stat-title">Total Nodes</div>
<div class="stat-value text-primary">{{ stats.total_nodes }}</div>
<div class="stat-desc">All discovered nodes</div>
</div>
<!-- Advertisements (24h) -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
</div>
<div class="stat-title">Advertisements</div>
<div class="stat-value text-secondary">{{ stats.advertisements_24h }}</div>
<div class="stat-desc">Received in last 24 hours</div>
</div>
<!-- Total Messages -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
</div>
<div class="stat-title">Total Messages</div>
<div class="stat-value text-accent">{{ stats.total_messages }}</div>
<div class="stat-desc">All time</div>
</div>
<!-- Messages Today -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-info">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
</div>
<div class="stat-title">Messages Today</div>
<div class="stat-value text-info">{{ stats.messages_today }}</div>
<div class="stat-desc">Last 24 hours</div>
</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6 mt-6">
<!-- Network Info Card -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
@@ -50,11 +111,43 @@
</h2>
<div class="space-y-2">
{% if network_radio_config %}
+{% if network_radio_config.profile %}
<div class="flex justify-between">
-<span class="opacity-70">Radio Config:</span>
-<span class="font-mono">{{ network_radio_config }}</span>
+<span class="opacity-70">Profile:</span>
+<span class="font-mono">{{ network_radio_config.profile }}</span>
</div>
{% endif %}
{% if network_radio_config.frequency %}
<div class="flex justify-between">
<span class="opacity-70">Frequency:</span>
<span class="font-mono">{{ network_radio_config.frequency }}</span>
</div>
{% endif %}
{% if network_radio_config.spreading_factor %}
<div class="flex justify-between">
<span class="opacity-70">Spreading Factor:</span>
<span class="font-mono">{{ network_radio_config.spreading_factor }}</span>
</div>
{% endif %}
{% if network_radio_config.bandwidth %}
<div class="flex justify-between">
<span class="opacity-70">Bandwidth:</span>
<span class="font-mono">{{ network_radio_config.bandwidth }}</span>
</div>
{% endif %}
{% if network_radio_config.coding_rate %}
<div class="flex justify-between">
<span class="opacity-70">Coding Rate:</span>
<span class="font-mono">{{ network_radio_config.coding_rate }}</span>
</div>
{% endif %}
{% if network_radio_config.tx_power %}
<div class="flex justify-between">
<span class="opacity-70">TX Power:</span>
<span class="font-mono">{{ network_radio_config.tx_power }}</span>
</div>
{% endif %}
{% endif %}
{% if network_location and network_location != (0.0, 0.0) %}
<div class="flex justify-between">
<span class="opacity-70">Location:</span>
@@ -65,20 +158,19 @@
</div>
</div>
-<!-- Quick Links Card -->
+<!-- Network Activity Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">
<svg xmlns="http://www.w3.org/2000/svg" class="h-6 w-6" fill="none" viewBox="0 0 24 24" stroke="currentColor">
-<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1" />
+<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 12l3-3 3 3 4-4M8 21l4-4 4 4M3 4h18M4 4h16v12a1 1 0 01-1 1H5a1 1 0 01-1-1V4z" />
</svg>
-Quick Links
+Network Activity
</h2>
-<ul class="menu bg-base-200 rounded-box">
-<li><a href="/messages">Recent Messages</a></li>
-<li><a href="/nodes">All Nodes</a></li>
-<li><a href="/members">Network Members</a></li>
-</ul>
+<p class="text-sm opacity-70 mb-2">Advertisements received per day (last 7 days)</p>
+<div class="h-48">
+<canvas id="activityChart"></canvas>
+</div>
</div>
</div>
@@ -101,14 +193,22 @@
</a>
{% endif %}
{% if network_contact_discord %}
-<div class="btn btn-outline btn-sm btn-block">
+<a href="{{ network_contact_discord }}" target="_blank" rel="noopener noreferrer" class="btn btn-outline btn-sm btn-block">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4 mr-2" viewBox="0 0 24 24" fill="currentColor">
<path d="M20.317 4.3698a19.7913 19.7913 0 00-4.8851-1.5152.0741.0741 0 00-.0785.0371c-.211.3753-.4447.8648-.6083 1.2495-1.8447-.2762-3.68-.2762-5.4868 0-.1636-.3933-.4058-.8742-.6177-1.2495a.077.077 0 00-.0785-.037 19.7363 19.7363 0 00-4.8852 1.515.0699.0699 0 00-.0321.0277C.5334 9.0458-.319 13.5799.0992 18.0578a.0824.0824 0 00.0312.0561c2.0528 1.5076 4.0413 2.4228 5.9929 3.0294a.0777.0777 0 00.0842-.0276c.4616-.6304.8731-1.2952 1.226-1.9942a.076.076 0 00-.0416-.1057c-.6528-.2476-1.2743-.5495-1.8722-.8923a.077.077 0 01-.0076-.1277c.1258-.0943.2517-.1923.3718-.2914a.0743.0743 0 01.0776-.0105c3.9278 1.7933 8.18 1.7933 12.0614 0a.0739.0739 0 01.0785.0095c.1202.099.246.1981.3728.2924a.077.077 0 01-.0066.1276 12.2986 12.2986 0 01-1.873.8914.0766.0766 0 00-.0407.1067c.3604.698.7719 1.3628 1.225 1.9932a.076.076 0 00.0842.0286c1.961-.6067 3.9495-1.5219 6.0023-3.0294a.077.077 0 00.0313-.0552c.5004-5.177-.8382-9.6739-3.5485-13.6604a.061.061 0 00-.0312-.0286zM8.02 15.3312c-1.1825 0-2.1569-1.0857-2.1569-2.419 0-1.3332.9555-2.4189 2.157-2.4189 1.2108 0 2.1757 1.0952 2.1568 2.419 0 1.3332-.9555 2.4189-2.1569 2.4189zm7.9748 0c-1.1825 0-2.1569-1.0857-2.1569-2.419 0-1.3332.9554-2.4189 2.1569-2.4189 1.2108 0 2.1757 1.0952 2.1568 2.419 0 1.3332-.946 2.4189-2.1568 2.4189Z"/>
</svg>
-{{ network_contact_discord }}
-</div>
+Discord
+</a>
{% endif %}
-{% if not network_contact_email and not network_contact_discord %}
+{% if network_contact_github %}
<a href="{{ network_contact_github }}" target="_blank" rel="noopener noreferrer" class="btn btn-outline btn-sm btn-block">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4 mr-2" viewBox="0 0 24 24" fill="currentColor">
<path d="M12 0c-6.626 0-12 5.373-12 12 0 5.302 3.438 9.8 8.207 11.387.599.111.793-.261.793-.577v-2.234c-3.338.726-4.033-1.416-4.033-1.416-.546-1.387-1.333-1.756-1.333-1.756-1.089-.745.083-.729.083-.729 1.205.084 1.839 1.237 1.839 1.237 1.07 1.834 2.807 1.304 3.492.997.107-.775.418-1.305.762-1.604-2.665-.305-5.467-1.334-5.467-5.931 0-1.311.469-2.381 1.236-3.221-.124-.303-.535-1.524.117-3.176 0 0 1.008-.322 3.301 1.23.957-.266 1.983-.399 3.003-.404 1.02.005 2.047.138 3.006.404 2.291-1.552 3.297-1.23 3.297-1.23.653 1.653.242 2.874.118 3.176.77.84 1.235 1.911 1.235 3.221 0 4.609-2.807 5.624-5.479 5.921.43.372.823 1.102.823 2.222v3.293c0 .319.192.694.801.576 4.765-1.589 8.199-6.086 8.199-11.386 0-6.627-5.373-12-12-12z"/>
</svg>
GitHub
</a>
{% endif %}
+{% if not network_contact_email and not network_contact_discord and not network_contact_github %}
<p class="text-sm opacity-70">No contact information configured.</p>
{% endif %}
</div>
@@ -116,3 +216,85 @@
</div>
</div>
{% endblock %}
{% block extra_scripts %}
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script>
(function() {
const activityData = {{ activity_json | safe }};
const ctx = document.getElementById('activityChart');
if (ctx && activityData.data && activityData.data.length > 0) {
// Format dates for display (show only day/month)
const labels = activityData.data.map(d => {
const date = new Date(d.date);
return date.toLocaleDateString('en-GB', { day: 'numeric', month: 'short' });
});
const counts = activityData.data.map(d => d.count);
new Chart(ctx, {
type: 'line',
data: {
labels: labels,
datasets: [{
label: 'Advertisements',
data: counts,
borderColor: 'oklch(0.7 0.15 250)',
backgroundColor: 'oklch(0.7 0.15 250 / 0.1)',
fill: true,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: {
responsive: true,
maintainAspectRatio: false,
plugins: {
legend: {
display: false
},
tooltip: {
mode: 'index',
intersect: false,
backgroundColor: 'oklch(0.25 0 0)',
titleColor: 'oklch(0.9 0 0)',
bodyColor: 'oklch(0.9 0 0)',
borderColor: 'oklch(0.4 0 0)',
borderWidth: 1
}
},
scales: {
x: {
grid: {
color: 'oklch(0.4 0 0 / 0.2)'
},
ticks: {
color: 'oklch(0.7 0 0)',
maxRotation: 45,
minRotation: 45,
maxTicksLimit: 10
}
},
y: {
beginAtZero: true,
grid: {
color: 'oklch(0.4 0 0 / 0.2)'
},
ticks: {
color: 'oklch(0.7 0 0)',
precision: 0
}
}
},
interaction: {
mode: 'nearest',
axis: 'x',
intersect: false
}
}
});
}
})();
</script>
{% endblock %}
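Editor's note: the chart above deserializes `activity_json` into `{ data: [{date, count}, …] }`, one entry per day. A minimal server-side sketch of producing that payload — the helper name and input shape are assumptions for illustration, not the project's actual backend code:

```python
from collections import Counter
from datetime import date, timedelta

def build_activity_json(ad_dates, days=7):
    """Aggregate advertisement dates into per-day counts for the chart.

    `ad_dates` is an iterable of datetime.date values, one per advertisement
    (hypothetical input shape). Returns {"data": [{"date": "YYYY-MM-DD",
    "count": N}, ...]} covering the last `days` days, oldest first, with
    zero-filled gaps.
    """
    today = date.today()
    start = today - timedelta(days=days - 1)
    counts = Counter(d for d in ad_dates if d >= start)
    return {
        "data": [
            {
                "date": (start + timedelta(days=i)).isoformat(),
                "count": counts.get(start + timedelta(days=i), 0),
            }
            for i in range(days)
        ]
    }
```

Zero-filling matters: Chart.js plots exactly the labels it is given, so missing days would otherwise silently compress the x-axis.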
+315 -46
@@ -5,7 +5,7 @@
{% block extra_head %}
<style>
#map {
-height: calc(100vh - 250px);
+height: calc(100vh - 350px);
min-height: 400px;
border-radius: var(--rounded-box);
}
@@ -22,7 +22,45 @@
{% block content %}
<div class="flex items-center justify-between mb-6">
<h1 class="text-3xl font-bold">Node Map</h1>
-<span id="node-count" class="badge badge-lg">Loading...</span>
+<div class="flex items-center gap-2">
+<span id="node-count" class="badge badge-lg">Loading...</span>
+<span id="filtered-count" class="badge badge-lg badge-ghost hidden"></span>
+</div>
</div>
<!-- Filters -->
<div class="card bg-base-100 shadow mb-6">
<div class="card-body py-4">
<div class="flex gap-4 flex-wrap items-end">
<div class="form-control">
<label class="label py-1">
<span class="label-text">Node Type</span>
</label>
<select id="filter-type" class="select select-bordered select-sm">
<option value="">All Types</option>
<option value="chat">Chat</option>
<option value="repeater">Repeater</option>
<option value="room">Room</option>
</select>
</div>
<div class="form-control">
<label class="label py-1">
<span class="label-text">Owner</span>
</label>
<select id="filter-owner" class="select select-bordered select-sm">
<option value="">All Owners</option>
<!-- Populated dynamically -->
</select>
</div>
<div class="form-control">
<label class="label cursor-pointer gap-2 py-1">
<span class="label-text">Infrastructure Only</span>
<input type="checkbox" id="filter-infra" class="checkbox checkbox-sm checkbox-primary" />
</label>
</div>
<button id="clear-filters" class="btn btn-ghost btn-sm">Clear Filters</button>
</div>
</div>
</div>
<div class="card bg-base-100 shadow-xl">
@@ -31,69 +69,300 @@
</div>
</div>
<div class="mt-4 text-sm opacity-70">
-<p>Nodes are placed on the map based on their <code>lat</code> and <code>lon</code> tags.</p>
-<p>To add a node to the map, set its location tags via the API.</p>
<!-- Legend -->
<div class="mt-4 flex flex-wrap gap-4 items-center text-sm">
<span class="opacity-70">Legend:</span>
<div class="flex items-center gap-1">
<span class="text-lg">💬</span>
<span>Chat</span>
</div>
<div class="flex items-center gap-1">
<span class="text-lg">📡</span>
<span>Repeater</span>
</div>
<div class="flex items-center gap-1">
<span class="text-lg">🪧</span>
<span>Room</span>
</div>
<div class="flex items-center gap-1">
<span class="text-lg">📍</span>
<span>Other</span>
</div>
<div class="flex items-center gap-1">
<span class="text-lg" style="filter: drop-shadow(0 0 4px gold);">📡</span>
<span>Infrastructure (gold glow)</span>
</div>
</div>
<div class="mt-2 text-sm opacity-70">
<p>Nodes are placed on the map based on their <code>lat</code> and <code>lon</code> tags. Infrastructure nodes are tagged with <code>role: infra</code>.</p>
</div>
{% endblock %}
{% block extra_scripts %}
<script>
-// Initialize map
-const map = L.map('map').setView([{{ network_location[0] }}, {{ network_location[1] }}], 10);
+// Initialize map with world view (will be centered on nodes once loaded)
+const map = L.map('map').setView([0, 0], 2);
// Add tile layer
L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
attribution: '&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
}).addTo(map);
-// Custom marker icon
-const nodeIcon = L.divIcon({
-className: 'custom-div-icon',
-html: `<div style="background-color: oklch(var(--p)); width: 12px; height: 12px; border-radius: 50%; border: 2px solid white; box-shadow: 0 2px 4px rgba(0,0,0,0.3);"></div>`,
-iconSize: [12, 12],
-iconAnchor: [6, 6]
-});
// Store all nodes and markers
let allNodes = [];
let allMembers = [];
let markers = [];
let mapCenter = { lat: 0, lon: 0 };
// Normalize adv_type to lowercase for consistent comparison
function normalizeType(type) {
return type ? type.toLowerCase() : null;
}
// Format relative time (e.g., "2m", "1h", "2d")
function formatRelativeTime(lastSeenStr) {
if (!lastSeenStr) return null;
const lastSeen = new Date(lastSeenStr);
const now = new Date();
const diffMs = now - lastSeen;
const diffSec = Math.floor(diffMs / 1000);
const diffMin = Math.floor(diffSec / 60);
const diffHour = Math.floor(diffMin / 60);
const diffDay = Math.floor(diffHour / 24);
if (diffDay > 0) return `${diffDay}d`;
if (diffHour > 0) return `${diffHour}h`;
if (diffMin > 0) return `${diffMin}m`;
return '<1m';
}
// Get emoji marker based on node type
function getNodeEmoji(node) {
const type = normalizeType(node.adv_type);
if (type === 'chat') return '💬';
if (type === 'repeater') return '📡';
if (type === 'room') return '🪧';
return '📍';
}
// Get display name for node type
function getTypeDisplay(node) {
const type = normalizeType(node.adv_type);
if (type === 'chat') return 'Chat';
if (type === 'repeater') return 'Repeater';
if (type === 'room') return 'Room';
return type ? type.charAt(0).toUpperCase() + type.slice(1) : 'Unknown';
}
// Create marker icon for a node
function createNodeIcon(node) {
const emoji = getNodeEmoji(node);
const infraGlow = node.is_infra ? 'filter: drop-shadow(0 0 4px gold);' : '';
const displayName = node.name || '';
const relativeTime = formatRelativeTime(node.last_seen);
const timeDisplay = relativeTime ? ` (${relativeTime})` : '';
return L.divIcon({
className: 'custom-div-icon',
html: `<div style="display: flex; align-items: center; gap: 2px;">
<span style="font-size: 24px; ${infraGlow} text-shadow: 0 0 3px #1a237e, 0 0 6px #1a237e, 0 1px 2px rgba(0,0,0,0.7);">${emoji}</span>
<span style="font-size: 10px; font-weight: bold; color: #000; background: rgba(255,255,255,0.9); padding: 1px 4px; border-radius: 3px; box-shadow: 0 1px 3px rgba(0,0,0,0.3);">${displayName}${timeDisplay}</span>
</div>`,
iconSize: [82, 28],
iconAnchor: [14, 14]
});
}
// Create popup content for a node
function createPopupContent(node) {
let ownerHtml = '';
if (node.owner) {
const ownerDisplay = node.owner.callsign
? `${node.owner.name} (${node.owner.callsign})`
: node.owner.name;
ownerHtml = `<p><span class="opacity-70">Owner:</span> ${ownerDisplay}</p>`;
}
let roleHtml = '';
if (node.role) {
const roleClass = node.is_infra ? 'badge-warning' : 'badge-ghost';
roleHtml = `<p><span class="opacity-70">Role:</span> <span class="badge badge-xs ${roleClass}">${node.role}</span></p>`;
}
const emoji = getNodeEmoji(node);
const typeDisplay = getTypeDisplay(node);
return `
<div class="p-2">
<h3 class="font-bold text-lg mb-2">${emoji} ${node.name}</h3>
<div class="space-y-1 text-sm">
<p><span class="opacity-70">Type:</span> ${typeDisplay}</p>
${roleHtml}
${ownerHtml}
<p><span class="opacity-70">Key:</span> <code class="text-xs">${node.public_key.substring(0, 16)}...</code></p>
<p><span class="opacity-70">Location:</span> ${node.lat.toFixed(4)}, ${node.lon.toFixed(4)}</p>
${node.last_seen ? `<p><span class="opacity-70">Last seen:</span> ${node.last_seen.substring(0, 19).replace('T', ' ')}</p>` : ''}
</div>
<a href="/nodes/${node.public_key}" class="btn btn-primary btn-xs mt-3">View Details</a>
</div>
`;
}
// Clear all markers from map
function clearMarkers() {
markers.forEach(marker => map.removeLayer(marker));
markers = [];
}
// Core filter logic - returns filtered nodes and updates markers
function applyFiltersCore() {
const typeFilter = document.getElementById('filter-type').value;
const ownerFilter = document.getElementById('filter-owner').value;
const infraOnly = document.getElementById('filter-infra').checked;
// Filter nodes
const filteredNodes = allNodes.filter(node => {
// Type filter (case-insensitive)
if (typeFilter && normalizeType(node.adv_type) !== typeFilter) return false;
// Infrastructure filter
if (infraOnly && !node.is_infra) return false;
// Owner filter
if (ownerFilter) {
if (!node.owner || node.owner.public_key !== ownerFilter) return false;
}
return true;
});
// Clear existing markers
clearMarkers();
// Add filtered markers
filteredNodes.forEach(node => {
const marker = L.marker([node.lat, node.lon], { icon: createNodeIcon(node) }).addTo(map);
marker.bindPopup(createPopupContent(node));
markers.push(marker);
});
// Update counts
const countEl = document.getElementById('node-count');
const filteredEl = document.getElementById('filtered-count');
if (filteredNodes.length === allNodes.length) {
countEl.textContent = `${allNodes.length} nodes on map`;
filteredEl.classList.add('hidden');
} else {
countEl.textContent = `${allNodes.length} total`;
filteredEl.textContent = `${filteredNodes.length} shown`;
filteredEl.classList.remove('hidden');
}
return filteredNodes;
}
// Apply filters and recenter map on filtered nodes
function applyFilters() {
const filteredNodes = applyFiltersCore();
// Fit bounds if we have filtered nodes
if (filteredNodes.length > 0) {
const bounds = L.latLngBounds(filteredNodes.map(n => [n.lat, n.lon]));
map.fitBounds(bounds, { padding: [50, 50] });
} else if (mapCenter.lat !== 0 || mapCenter.lon !== 0) {
map.setView([mapCenter.lat, mapCenter.lon], 10);
}
}
// Apply filters without recentering (for initial load after manual center)
function applyFiltersNoRecenter() {
applyFiltersCore();
}
// Populate owner filter dropdown
function populateOwnerFilter() {
const select = document.getElementById('filter-owner');
// Get unique owners from nodes that have locations
const ownersWithNodes = new Set();
allNodes.forEach(node => {
if (node.owner) {
ownersWithNodes.add(node.owner.public_key);
}
});
// Filter members to only those who own nodes on the map
const relevantMembers = allMembers.filter(m => ownersWithNodes.has(m.public_key));
// Sort by name
relevantMembers.sort((a, b) => a.name.localeCompare(b.name));
// Add options
relevantMembers.forEach(member => {
const option = document.createElement('option');
option.value = member.public_key;
option.textContent = member.callsign
? `${member.name} (${member.callsign})`
: member.name;
select.appendChild(option);
});
}
// Clear all filters
function clearFilters() {
document.getElementById('filter-type').value = '';
document.getElementById('filter-owner').value = '';
document.getElementById('filter-infra').checked = false;
applyFilters();
}
// Event listeners for filters
document.getElementById('filter-type').addEventListener('change', applyFilters);
document.getElementById('filter-owner').addEventListener('change', applyFilters);
document.getElementById('filter-infra').addEventListener('change', applyFilters);
document.getElementById('clear-filters').addEventListener('click', clearFilters);
// Fetch and display nodes
fetch('/map/data')
.then(response => response.json())
.then(data => {
-const nodes = data.nodes;
-const center = data.center;
+allNodes = data.nodes;
+allMembers = data.members || [];
+mapCenter = data.center;
-// Update node count
-document.getElementById('node-count').textContent = `${nodes.length} nodes on map`;
// Log debug info
const debug = data.debug || {};
console.log('Map data loaded:', debug);
console.log('Sample node data:', allNodes.length > 0 ? allNodes[0] : 'No nodes');
-// Add markers for each node
-nodes.forEach(node => {
-const marker = L.marker([node.lat, node.lon], { icon: nodeIcon }).addTo(map);
-// Create popup content
-const popupContent = `
-<div class="p-2">
-<h3 class="font-bold text-lg mb-2">${node.name}</h3>
-<div class="space-y-1 text-sm">
-<p><span class="opacity-70">Type:</span> ${node.adv_type || 'Unknown'}</p>
-<p><span class="opacity-70">Key:</span> <code class="text-xs">${node.public_key.substring(0, 16)}...</code></p>
-<p><span class="opacity-70">Location:</span> ${node.lat.toFixed(4)}, ${node.lon.toFixed(4)}</p>
-${node.last_seen ? `<p><span class="opacity-70">Last seen:</span> ${node.last_seen.substring(0, 19).replace('T', ' ')}</p>` : ''}
-</div>
-<a href="/nodes/${node.public_key}" class="btn btn-primary btn-xs mt-3">View Details</a>
-</div>
-`;
-marker.bindPopup(popupContent);
-});
-// Fit bounds if we have nodes
-if (nodes.length > 0) {
-const bounds = L.latLngBounds(nodes.map(n => [n.lat, n.lon]));
-map.fitBounds(bounds, { padding: [50, 50] });
-} else if (center.lat !== 0 || center.lon !== 0) {
-// Use network center if no nodes
-map.setView([center.lat, center.lon], 10);
if (debug.error) {
document.getElementById('node-count').textContent = `Error: ${debug.error}`;
return;
}
if (debug.total_nodes === 0) {
document.getElementById('node-count').textContent = 'No nodes in database';
return;
}
if (debug.nodes_with_coords === 0) {
document.getElementById('node-count').textContent = `${debug.total_nodes} nodes (none have coordinates)`;
return;
}
// Populate owner filter
populateOwnerFilter();
// Initial display - center map on nodes if available
if (allNodes.length > 0) {
const bounds = L.latLngBounds(allNodes.map(n => [n.lat, n.lon]));
map.fitBounds(bounds, { padding: [50, 50] });
}
// Apply filters (won't re-center since we just did above)
applyFiltersNoRecenter();
})
.catch(error => {
console.error('Error loading map data:', error);
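Editor's note: the `formatRelativeTime()` helper above buckets by the largest whole unit (days, then hours, then minutes) with a `<1m` floor. The same rules expressed in Python, purely as a reference for the bucketing — the dashboard computes this client-side:

```python
def format_relative_time(delta_seconds):
    """Mirror of the client-side formatRelativeTime() bucketing (sketch).

    Largest whole unit wins: days over hours over minutes; anything under
    a minute renders as '<1m', matching the map label format.
    """
    if delta_seconds >= 86400:   # at least one full day
        return f"{delta_seconds // 86400}d"
    if delta_seconds >= 3600:    # at least one full hour
        return f"{delta_seconds // 3600}h"
    if delta_seconds >= 60:      # at least one full minute
        return f"{delta_seconds // 60}m"
    return "<1m"
```

Note the truncation semantics: 23 hours and 59 minutes is still "23h", which keeps labels compact at the cost of precision.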
+51 -43
@@ -9,43 +9,55 @@
</div>
{% if members %}
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6">
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6 items-start">
{% for member in members %}
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">
{{ member.name }}
{% if member.callsign %}
-<span class="badge badge-secondary">{{ member.callsign }}</span>
+<span class="badge badge-success">{{ member.callsign }}</span>
{% endif %}
</h2>
{% if member.role %}
<p class="text-sm opacity-70">{{ member.role }}</p>
{% endif %}
{% if member.description %}
<p class="mt-2">{{ member.description }}</p>
{% endif %}
-{% if member.email or member.discord or member.website %}
-<div class="card-actions justify-start mt-4">
-{% if member.email %}
-<a href="mailto:{{ member.email }}" class="btn btn-ghost btn-xs">
-<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
-<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M3 8l7.89 5.26a2 2 0 002.22 0L21 8M5 19h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v10a2 2 0 002 2z" />
-</svg>
-Email
-</a>
-{% endif %}
-{% if member.website %}
-<a href="{{ member.website }}" target="_blank" class="btn btn-ghost btn-xs">
-<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
-<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 12a9 9 0 01-9 9m9-9a9 9 0 00-9-9m9 9H3m9 9a9 9 0 01-9-9m9 9c1.657 0 3-4.03 3-9s-1.343-9-3-9m0 18c-1.657 0-3-4.03-3-9s1.343-9 3-9m-9 9a9 9 0 019-9" />
-</svg>
-Website
-</a>
-{% endif %}
{% if member.contact %}
<p class="text-sm mt-2">
<span class="opacity-70">Contact:</span> {{ member.contact }}
</p>
{% endif %}
{% if member.nodes %}
<div class="mt-4 space-y-2">
{% for node in member.nodes %}
{% set adv_type = node.node_adv_type or node.node_role %}
<a href="/nodes/{{ node.public_key }}" class="flex items-center gap-3 p-2 bg-base-200 rounded-lg hover:bg-base-300 transition-colors">
<span class="text-lg" title="{{ adv_type or 'Unknown' }}">
{% if adv_type and adv_type|lower == 'chat' %}
💬
{% elif adv_type and adv_type|lower == 'repeater' %}
📡
{% elif adv_type and adv_type|lower == 'room' %}
🪧
{% elif adv_type %}
📍
{% else %}
📦
{% endif %}
</span>
<div>
{% if node.friendly_name or node.node_name %}
<div class="font-medium text-sm">{{ node.friendly_name or node.node_name }}</div>
<div class="font-mono text-xs opacity-60">{{ node.public_key[:12] }}...</div>
{% else %}
<div class="font-mono text-sm">{{ node.public_key[:12] }}...</div>
{% endif %}
</div>
</a>
{% endfor %}
</div>
{% endif %}
</div>
@@ -59,31 +71,27 @@
</svg>
<div>
<h3 class="font-bold">No members configured</h3>
-<p class="text-sm">To display network members, provide a members JSON file using the <code>--members-file</code> option.</p>
+<p class="text-sm">To display network members, create a members.yaml file in your seed directory.</p>
</div>
</div>
<div class="mt-6 card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">Members File Format</h2>
-<p class="mb-4">Create a JSON file with the following structure:</p>
-<pre class="bg-base-200 p-4 rounded-box text-sm overflow-x-auto"><code>{
-"members": [
-{
-"name": "John Doe",
-"callsign": "AB1CD",
-"role": "Network Admin",
-"description": "Manages the main repeater node.",
-"email": "john@example.com",
-"website": "https://example.com"
-},
-{
-"name": "Jane Smith",
-"role": "Member",
-"description": "Regular user in the downtown area."
-}
-]
-}</code></pre>
+<p class="mb-4">Create a YAML file at <code>$SEED_HOME/members.yaml</code> with the following structure:</p>
<pre class="bg-base-200 p-4 rounded-box text-sm overflow-x-auto"><code>members:
- name: John Doe
callsign: AB1CD
role: Network Admin
description: Manages the main repeater node.
contact: john@example.com
nodes:
- public_key: abc123def456... # 64-char hex
node_role: repeater
- name: Jane Smith
role: Member
description: Regular user in the downtown area.</code></pre>
<p class="mt-4 text-sm opacity-70">Run <code>meshcore-hub collector seed</code> to import members, or they will be imported automatically on collector startup.</p>
</div>
</div>
{% endif %}
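Editor's note: the YAML sample above implies a small validation contract (a required `name`, plus optional `nodes` entries whose `public_key` is 64 hex characters). A sketch of that check — the function name and error messages are illustrative, not meshcore-hub's actual loader:

```python
def validate_members(data):
    """Minimal structural check for a parsed members.yaml (sketch).

    Expects {"members": [{"name": ..., "nodes": [{"public_key": ...}]}]}
    as in the sample above; each public_key must be 64 hex characters.
    Returns the member list, or raises ValueError on the first problem.
    """
    members = (data or {}).get("members", [])
    for member in members:
        if not member.get("name"):
            raise ValueError("each member entry requires a 'name'")
        for node in member.get("nodes", []):
            key = node.get("public_key", "")
            if len(key) != 64 or any(c not in "0123456789abcdef" for c in key.lower()):
                raise ValueError(f"invalid public_key: {key!r}")
    return members
```

Running this at seed time would surface a malformed key before it silently fails to match any node in the database.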
+52 -20
@@ -49,24 +49,21 @@
</div>
<!-- Messages Table -->
-<div class="overflow-x-auto bg-base-100 rounded-box shadow">
+<div class="overflow-x-auto overflow-y-visible bg-base-100 rounded-box shadow">
<table class="table table-zebra">
<thead>
<tr>
-<th>Time</th>
<th>Type</th>
+<th>Time</th>
<th>From/Channel</th>
<th>Message</th>
<th>Receiver</th>
<th>SNR</th>
<th>Hops</th>
</tr>
</thead>
<tbody>
{% for msg in messages %}
-<tr class="hover">
-<td class="text-xs whitespace-nowrap">
-{{ msg.received_at[:19].replace('T', ' ') if msg.received_at else '-' }}
-</td>
+<tr class="hover align-top">
<td>
{% if msg.message_type == 'channel' %}
<span class="badge badge-info badge-sm">Channel</span>
@@ -74,30 +71,65 @@
<span class="badge badge-success badge-sm">Direct</span>
{% endif %}
</td>
-<td class="text-sm">
+<td class="text-sm whitespace-nowrap">
{{ msg.received_at[:19].replace('T', ' ') if msg.received_at else '-' }}
</td>
<td class="text-sm whitespace-nowrap">
{% if msg.message_type == 'channel' %}
<span class="font-mono">CH{{ msg.channel_idx }}</span>
{% else %}
-{% if msg.sender_friendly_name %}
-<span class="font-medium">{{ msg.sender_friendly_name }}</span>
+{% if msg.sender_friendly_name or msg.sender_name %}
+<span class="font-medium">{{ msg.sender_friendly_name or msg.sender_name }}</span>
{% else %}
<span class="font-mono text-xs">{{ (msg.pubkey_prefix or '-')[:12] }}</span>
{% endif %}
{% endif %}
</td>
-<td class="truncate-cell" title="{{ msg.text }}">
-{{ msg.text or '-' }}
-</td>
-<td class="text-center">
-{% if msg.snr is not none %}
-<span class="badge badge-ghost badge-sm">{{ "%.1f"|format(msg.snr) }}</span>
+<td class="break-words max-w-md" style="white-space: pre-wrap;">{{ msg.text or '-' }}</td>
+<td>
{% if msg.receivers and msg.receivers|length > 1 %}
<div class="dropdown dropdown-hover dropdown-end">
<label tabindex="0" class="badge badge-outline badge-sm cursor-pointer">
{{ msg.receivers|length }} receivers
</label>
<ul tabindex="0" class="dropdown-content z-[1] menu p-2 shadow bg-base-100 rounded-box w-56">
{% for recv in msg.receivers %}
<li>
<a href="/nodes/{{ recv.public_key }}" class="text-sm">
<span class="flex-1">{{ recv.friendly_name or recv.name or recv.public_key[:12] + '...' }}</span>
{% if recv.snr is not none %}
<span class="badge badge-ghost badge-xs">{{ "%.1f"|format(recv.snr) }}</span>
{% endif %}
</a>
</li>
{% endfor %}
</ul>
</div>
{% elif msg.receivers and msg.receivers|length == 1 %}
<a href="/nodes/{{ msg.receivers[0].public_key }}" class="link link-hover">
{% if msg.receivers[0].friendly_name or msg.receivers[0].name %}
<div class="font-medium">{{ msg.receivers[0].friendly_name or msg.receivers[0].name }}</div>
<div class="text-xs font-mono opacity-70">{{ msg.receivers[0].public_key[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ msg.receivers[0].public_key[:16] }}...</span>
{% endif %}
</a>
{% elif msg.received_by %}
<a href="/nodes/{{ msg.received_by }}" class="link link-hover">
{% if msg.receiver_friendly_name or msg.receiver_name %}
<div class="font-medium">{{ msg.receiver_friendly_name or msg.receiver_name }}</div>
<div class="text-xs font-mono opacity-70">{{ msg.received_by[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ msg.received_by[:16] }}...</span>
{% endif %}
</a>
{% else %}
-
<span class="opacity-50">-</span>
{% endif %}
</td>
-<td class="text-center">
-{% if msg.hops is not none %}
-<span class="badge badge-ghost badge-sm">{{ msg.hops }}</span>
+<td class="text-center whitespace-nowrap">
+{% if msg.snr is not none %}
+<span class="badge badge-ghost badge-sm">{{ "%.1f"|format(msg.snr) }}</span>
{% else %}
-
{% endif %}
+193 -64
@@ -22,54 +22,54 @@
</div>
{% endif %}
<!-- Stats Cards -->
-<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6 mb-8">
-<!-- Total Nodes -->
-<div class="stat bg-base-100 rounded-box shadow">
-<div class="stat-figure text-primary">
-<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
-<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
-</svg>
<!-- Activity Charts -->
<div class="grid grid-cols-1 md:grid-cols-3 gap-6 mb-8">
<!-- Advertisements Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title text-base">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
Advertisements
</h2>
<p class="text-xs opacity-70">Per day (last 7 days)</p>
<div class="h-32">
<canvas id="advertChart"></canvas>
</div>
</div>
<div class="stat-title">Total Nodes</div>
<div class="stat-value text-primary">{{ stats.total_nodes }}</div>
<div class="stat-desc">All discovered nodes</div>
</div>
<!-- Advertisements (24h) -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
<!-- Messages Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title text-base">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
Messages
</h2>
<p class="text-xs opacity-70">Per day (last 7 days)</p>
<div class="h-32">
<canvas id="messageChart"></canvas>
</div>
</div>
<div class="stat-title">Advertisements</div>
<div class="stat-value text-secondary">{{ stats.advertisements_24h }}</div>
<div class="stat-desc">Received in last 24 hours</div>
</div>
<!-- Total Messages -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
<!-- Node Count Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title text-base">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
Total Nodes
</h2>
<p class="text-xs opacity-70">Over time (last 7 days)</p>
<div class="h-32">
<canvas id="nodeChart"></canvas>
</div>
</div>
<div class="stat-title">Total Messages</div>
<div class="stat-value text-accent">{{ stats.total_messages }}</div>
<div class="stat-desc">All time</div>
</div>
<!-- Messages Today -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-info">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
</div>
<div class="stat-title">Messages Today</div>
<div class="stat-value text-info">{{ stats.messages_today }}</div>
<div class="stat-desc">Last 24 hours</div>
</div>
</div>
@@ -90,6 +90,7 @@
<thead>
<tr>
<th>Node</th>
<th>Type</th>
<th class="text-right">Received</th>
</tr>
</thead>
@@ -97,11 +98,26 @@
{% for ad in stats.recent_advertisements %}
<tr>
<td>
<div class="font-medium">{{ ad.friendly_name or ad.name or ad.public_key[:12] + '...' }}</div>
<a href="/nodes/{{ ad.public_key }}" class="link link-hover">
<div class="font-medium">{{ ad.friendly_name or ad.name or ad.public_key[:12] + '...' }}</div>
</a>
{% if ad.friendly_name or ad.name %}
<div class="text-xs opacity-50 font-mono">{{ ad.public_key[:12] }}...</div>
{% endif %}
</td>
<td>
{% if ad.adv_type and ad.adv_type|lower == 'chat' %}
<span title="Chat">💬</span>
{% elif ad.adv_type and ad.adv_type|lower == 'repeater' %}
<span title="Repeater">📡</span>
{% elif ad.adv_type and ad.adv_type|lower == 'room' %}
<span title="Room">🪧</span>
{% elif ad.adv_type %}
<span title="{{ ad.adv_type }}">📍</span>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
<td class="text-right text-sm opacity-70">{{ ad.received_at.split('T')[1][:8] if ad.received_at else '-' }}</td>
</tr>
{% endfor %}
@@ -114,39 +130,37 @@
</div>
</div>
<!-- Channel Stats -->
<!-- Channel Messages -->
{% if stats.channel_messages %}
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">
<svg xmlns="http://www.w3.org/2000/svg" class="h-6 w-6" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 20l4-16m2 16l4-16M6 9h14M4 15h14" />
</svg>
Channel Messages
Recent Channel Messages
</h2>
{% if stats.channel_message_counts %}
<div class="overflow-x-auto">
<table class="table table-compact w-full">
<thead>
<tr>
<th>Channel</th>
<th class="text-right">Count</th>
</tr>
</thead>
<tbody>
{% for channel, count in stats.channel_message_counts.items() %}
<tr>
<td>Channel {{ channel }}</td>
<td class="text-right font-mono">{{ count }}</td>
</tr>
<div class="space-y-4">
{% for channel, messages in stats.channel_messages.items() %}
<div>
<h3 class="font-semibold text-sm mb-2 flex items-center gap-2">
<span class="badge badge-info badge-sm">CH{{ channel }}</span>
Channel {{ channel }}
</h3>
<div class="space-y-1 pl-2 border-l-2 border-base-300">
{% for msg in messages %}
<div class="text-sm">
<span class="text-xs opacity-50">{{ msg.received_at.split('T')[1][:5] if msg.received_at else '' }}</span>
<span class="break-words" style="white-space: pre-wrap;">{{ msg.text }}</span>
</div>
{% endfor %}
</tbody>
</table>
</div>
</div>
{% endfor %}
</div>
{% else %}
<p class="text-sm opacity-70">No channel messages recorded yet.</p>
{% endif %}
</div>
</div>
{% endif %}
</div>
<!-- Quick Actions -->
@@ -171,3 +185,118 @@
</a>
</div>
{% endblock %}
{% block extra_scripts %}
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script>
(function() {
const advertData = {{ advert_activity_json | safe }};
const messageData = {{ message_activity_json | safe }};
const nodeData = {{ node_count_json | safe }};
// Common chart options
const commonOptions = {
responsive: true,
maintainAspectRatio: false,
plugins: {
legend: { display: false },
tooltip: {
mode: 'index',
intersect: false,
backgroundColor: 'oklch(0.25 0 0)',
titleColor: 'oklch(0.9 0 0)',
bodyColor: 'oklch(0.9 0 0)',
borderColor: 'oklch(0.4 0 0)',
borderWidth: 1
}
},
scales: {
x: {
grid: { color: 'oklch(0.4 0 0 / 0.2)' },
ticks: { color: 'oklch(0.7 0 0)', maxRotation: 45, minRotation: 45 }
},
y: {
beginAtZero: true,
grid: { color: 'oklch(0.4 0 0 / 0.2)' },
ticks: { color: 'oklch(0.7 0 0)', precision: 0 }
}
},
interaction: { mode: 'nearest', axis: 'x', intersect: false }
};
// Helper to format dates
function formatLabels(data) {
return data.map(d => {
const date = new Date(d.date);
return date.toLocaleDateString('en-GB', { day: 'numeric', month: 'short' });
});
}
// Advertisements chart
const advertCtx = document.getElementById('advertChart');
if (advertCtx && advertData.data && advertData.data.length > 0) {
new Chart(advertCtx, {
type: 'line',
data: {
labels: formatLabels(advertData.data),
datasets: [{
label: 'Advertisements',
data: advertData.data.map(d => d.count),
borderColor: 'oklch(0.7 0.15 250)',
backgroundColor: 'oklch(0.7 0.15 250 / 0.1)',
fill: true,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: commonOptions
});
}
// Messages chart
const messageCtx = document.getElementById('messageChart');
if (messageCtx && messageData.data && messageData.data.length > 0) {
new Chart(messageCtx, {
type: 'line',
data: {
labels: formatLabels(messageData.data),
datasets: [{
label: 'Messages',
data: messageData.data.map(d => d.count),
borderColor: 'oklch(0.7 0.15 160)',
backgroundColor: 'oklch(0.7 0.15 160 / 0.1)',
fill: true,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: commonOptions
});
}
// Node count chart
const nodeCtx = document.getElementById('nodeChart');
if (nodeCtx && nodeData.data && nodeData.data.length > 0) {
new Chart(nodeCtx, {
type: 'line',
data: {
labels: formatLabels(nodeData.data),
datasets: [{
label: 'Total Nodes',
data: nodeData.data.map(d => d.count),
borderColor: 'oklch(0.7 0.15 30)',
backgroundColor: 'oklch(0.7 0.15 30 / 0.1)',
fill: true,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: commonOptions
});
}
})();
</script>
{% endblock %}
@@ -102,15 +102,40 @@
<tr>
<th>Time</th>
<th>Type</th>
<th>Name</th>
<th>Received By</th>
</tr>
</thead>
<tbody>
{% for adv in advertisements %}
<tr>
<td class="text-xs">{{ adv.received_at[:19].replace('T', ' ') if adv.received_at else '-' }}</td>
<td>{{ adv.adv_type or '-' }}</td>
<td>{{ adv.name or '-' }}</td>
<td class="text-xs whitespace-nowrap">{{ adv.received_at[:19].replace('T', ' ') if adv.received_at else '-' }}</td>
<td>
{% if adv.adv_type and adv.adv_type|lower == 'chat' %}
<span title="Chat">💬</span>
{% elif adv.adv_type and adv.adv_type|lower == 'repeater' %}
<span title="Repeater">📡</span>
{% elif adv.adv_type and adv.adv_type|lower == 'room' %}
<span title="Room">🪧</span>
{% elif adv.adv_type %}
<span title="{{ adv.adv_type }}">📍</span>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
<td>
{% if adv.received_by %}
<a href="/nodes/{{ adv.received_by }}" class="link link-hover">
{% if adv.receiver_friendly_name or adv.receiver_name %}
<div class="font-medium text-sm">{{ adv.receiver_friendly_name or adv.receiver_name }}</div>
<div class="text-xs font-mono opacity-70">{{ adv.received_by[:16] }}...</div>
{% else %}
<span class="font-mono text-xs">{{ adv.received_by[:16] }}...</span>
{% endif %}
</a>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
</tr>
{% endfor %}
</tbody>
@@ -133,12 +158,13 @@
<tr>
<th>Time</th>
<th>Data</th>
<th>Received By</th>
</tr>
</thead>
<tbody>
{% for tel in telemetry %}
<tr>
<td class="text-xs">{{ tel.received_at[:19].replace('T', ' ') if tel.received_at else '-' }}</td>
<td class="text-xs whitespace-nowrap">{{ tel.received_at[:19].replace('T', ' ') if tel.received_at else '-' }}</td>
<td class="text-xs font-mono">
{% if tel.parsed_data %}
{{ tel.parsed_data | tojson }}
@@ -146,6 +172,20 @@
-
{% endif %}
</td>
<td>
{% if tel.received_by %}
<a href="/nodes/{{ tel.received_by }}" class="link link-hover">
{% if tel.receiver_friendly_name or tel.receiver_name %}
<div class="font-medium text-sm">{{ tel.receiver_friendly_name or tel.receiver_name }}</div>
<div class="text-xs font-mono opacity-70">{{ tel.received_by[:16] }}...</div>
{% else %}
<span class="font-mono text-xs">{{ tel.received_by[:16] }}...</span>
{% endif %}
</a>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
</tr>
{% endfor %}
</tbody>
+24 -7
@@ -49,8 +49,8 @@
<table class="table table-zebra">
<thead>
<tr>
<th>Name</th>
<th>Public Key</th>
<th>Node</th>
<th>Type</th>
<th>Last Seen</th>
<th>Tags</th>
</tr>
@@ -64,13 +64,30 @@
{% endif %}
{% endfor %}
<tr class="hover">
<td class="font-medium">
<a href="/nodes/{{ node.public_key }}" class="link link-hover">{{ ns.friendly_name or node.name or '-' }}</a>
<td>
<a href="/nodes/{{ node.public_key }}" class="link link-hover">
{% if ns.friendly_name or node.name %}
<div class="font-medium">{{ ns.friendly_name or node.name }}</div>
<div class="text-xs font-mono opacity-70">{{ node.public_key[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ node.public_key[:16] }}...</span>
{% endif %}
</a>
</td>
<td class="font-mono text-xs truncate-cell" title="{{ node.public_key }}">
{{ node.public_key[:16] }}...
<td>
{% if node.adv_type and node.adv_type|lower == 'chat' %}
<span title="Chat">💬</span>
{% elif node.adv_type and node.adv_type|lower == 'repeater' %}
<span title="Repeater">📡</span>
{% elif node.adv_type and node.adv_type|lower == 'room' %}
<span title="Room">🪧</span>
{% elif node.adv_type %}
<span title="{{ node.adv_type }}">📍</span>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
<td class="text-sm">
<td class="text-sm whitespace-nowrap">
{% if node.last_seen %}
{{ node.last_seen[:19].replace('T', ' ') }}
{% else %}
-1
@@ -49,7 +49,6 @@ class TestGetAdvertisement:
)
assert response.status_code == 200
data = response.json()
assert data["id"] == sample_advertisement.id
assert data["public_key"] == sample_advertisement.public_key
def test_get_advertisement_not_found(self, client_no_auth):
+123
@@ -58,3 +58,126 @@ class TestDashboardHtml:
assert "Recent Nodes" in response.text
# The node name should appear in the table
assert sample_node.name in response.text
class TestDashboardActivity:
"""Tests for GET /dashboard/activity endpoint."""
def test_get_activity_empty(self, client_no_auth):
"""Test getting activity with empty database."""
response = client_no_auth.get("/api/v1/dashboard/activity")
assert response.status_code == 200
data = response.json()
assert data["days"] == 30
assert len(data["data"]) == 30
# All counts should be 0
for point in data["data"]:
assert point["count"] == 0
assert "date" in point
def test_get_activity_custom_days(self, client_no_auth):
"""Test getting activity with custom days parameter."""
response = client_no_auth.get("/api/v1/dashboard/activity?days=7")
assert response.status_code == 200
data = response.json()
assert data["days"] == 7
assert len(data["data"]) == 7
def test_get_activity_max_days(self, client_no_auth):
"""Test that activity is capped at 90 days."""
response = client_no_auth.get("/api/v1/dashboard/activity?days=365")
assert response.status_code == 200
data = response.json()
assert data["days"] == 90
assert len(data["data"]) == 90
def test_get_activity_with_data(self, client_no_auth, sample_advertisement):
"""Test getting activity with advertisement in database."""
response = client_no_auth.get("/api/v1/dashboard/activity")
assert response.status_code == 200
data = response.json()
# At least one day should have a count > 0
total_count = sum(point["count"] for point in data["data"])
assert total_count >= 1
class TestMessageActivity:
"""Tests for GET /dashboard/message-activity endpoint."""
def test_get_message_activity_empty(self, client_no_auth):
"""Test getting message activity with empty database."""
response = client_no_auth.get("/api/v1/dashboard/message-activity")
assert response.status_code == 200
data = response.json()
assert data["days"] == 30
assert len(data["data"]) == 30
# All counts should be 0
for point in data["data"]:
assert point["count"] == 0
assert "date" in point
def test_get_message_activity_custom_days(self, client_no_auth):
"""Test getting message activity with custom days parameter."""
response = client_no_auth.get("/api/v1/dashboard/message-activity?days=7")
assert response.status_code == 200
data = response.json()
assert data["days"] == 7
assert len(data["data"]) == 7
def test_get_message_activity_max_days(self, client_no_auth):
"""Test that message activity is capped at 90 days."""
response = client_no_auth.get("/api/v1/dashboard/message-activity?days=365")
assert response.status_code == 200
data = response.json()
assert data["days"] == 90
assert len(data["data"]) == 90
def test_get_message_activity_with_data(self, client_no_auth, sample_message):
"""Test getting message activity with message in database."""
response = client_no_auth.get("/api/v1/dashboard/message-activity")
assert response.status_code == 200
data = response.json()
# At least one day should have a count > 0
total_count = sum(point["count"] for point in data["data"])
assert total_count >= 1
class TestNodeCountHistory:
"""Tests for GET /dashboard/node-count endpoint."""
def test_get_node_count_empty(self, client_no_auth):
"""Test getting node count with empty database."""
response = client_no_auth.get("/api/v1/dashboard/node-count")
assert response.status_code == 200
data = response.json()
assert data["days"] == 30
assert len(data["data"]) == 30
# All counts should be 0
for point in data["data"]:
assert point["count"] == 0
assert "date" in point
def test_get_node_count_custom_days(self, client_no_auth):
"""Test getting node count with custom days parameter."""
response = client_no_auth.get("/api/v1/dashboard/node-count?days=7")
assert response.status_code == 200
data = response.json()
assert data["days"] == 7
assert len(data["data"]) == 7
def test_get_node_count_max_days(self, client_no_auth):
"""Test that node count is capped at 90 days."""
response = client_no_auth.get("/api/v1/dashboard/node-count?days=365")
assert response.status_code == 200
data = response.json()
assert data["days"] == 90
assert len(data["data"]) == 90
def test_get_node_count_with_data(self, client_no_auth, sample_node):
"""Test getting node count with node in database."""
response = client_no_auth.get("/api/v1/dashboard/node-count")
assert response.status_code == 200
data = response.json()
# At least one day should have a count > 0 (cumulative)
# The last day should have count >= 1
assert data["data"][-1]["count"] >= 1
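The behaviour these dashboard tests pin down — a zero-filled point per day, a default window of 30 days, and a hard cap at 90 — could be produced by a helper along these lines. This is a sketch of the assumed response shape, not the project's actual endpoint code; `build_activity_series` and `counts_by_day` are hypothetical names.

```python
from datetime import date, timedelta


def build_activity_series(counts_by_day: dict[str, int], days: int = 30) -> dict:
    """Return {'days': n, 'data': [{'date': ..., 'count': ...}]}, zero-filled."""
    days = min(days, 90)  # the tests above assert the window is capped at 90 days
    today = date.today()
    data = []
    # Walk from oldest to newest so data[-1] is today, matching the
    # "last day should have count >= 1" assertion for cumulative node counts.
    for offset in range(days - 1, -1, -1):
        d = (today - timedelta(days=offset)).isoformat()
        data.append({"date": d, "count": counts_by_day.get(d, 0)})
    return {"days": days, "data": data}
```

Days missing from `counts_by_day` come back as explicit zero points, which is what lets the Chart.js line charts render a continuous series instead of gaps.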
-1
@@ -51,7 +51,6 @@ class TestGetMessage:
response = client_no_auth.get(f"/api/v1/messages/{sample_message.id}")
assert response.status_code == 200
data = response.json()
assert data["id"] == sample_message.id
assert data["text"] == sample_message.text
def test_get_message_not_found(self, client_no_auth):
-1
@@ -45,7 +45,6 @@ class TestGetTelemetry:
response = client_no_auth.get(f"/api/v1/telemetry/{sample_telemetry.id}")
assert response.status_code == 200
data = response.json()
assert data["id"] == sample_telemetry.id
assert data["node_public_key"] == sample_telemetry.node_public_key
def test_get_telemetry_not_found(self, client_no_auth):
-1
@@ -31,7 +31,6 @@ class TestGetTracePath:
response = client_no_auth.get(f"/api/v1/trace-paths/{sample_trace_path.id}")
assert response.status_code == 200
data = response.json()
assert data["id"] == sample_trace_path.id
assert data["path_hashes"] == sample_trace_path.path_hashes
def test_get_trace_path_not_found(self, client_no_auth):
+62 -34
@@ -1,10 +1,10 @@
"""Tests for tag import functionality."""
import json
import tempfile
from pathlib import Path
import pytest
import yaml
from sqlalchemy import select
from meshcore_hub.collector.tag_import import (
@@ -62,8 +62,8 @@ class TestLoadTagsFile:
}
}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
result = load_tags_file(f.name)
@@ -84,8 +84,8 @@ class TestLoadTagsFile:
}
}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
result = load_tags_file(f.name)
@@ -104,8 +104,8 @@ class TestLoadTagsFile:
}
}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
result = load_tags_file(f.name)
@@ -118,15 +118,15 @@ class TestLoadTagsFile:
def test_file_not_found(self):
"""Test loading non-existent file."""
with pytest.raises(FileNotFoundError):
load_tags_file("/nonexistent/path/tags.json")
load_tags_file("/nonexistent/path/tags.yaml")
def test_invalid_json(self):
"""Test loading invalid JSON file."""
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
f.write("invalid json {{{")
def test_invalid_yaml(self):
"""Test loading invalid YAML file."""
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
f.write("invalid: yaml: content: [unclosed")
f.flush()
with pytest.raises(json.JSONDecodeError):
with pytest.raises(yaml.YAMLError):
load_tags_file(f.name)
Path(f.name).unlink()
@@ -135,11 +135,11 @@ class TestLoadTagsFile:
"""Test loading file with invalid schema (not a dict)."""
data = [{"public_key": "abc"}]
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
with pytest.raises(ValueError, match="must contain a JSON object"):
with pytest.raises(ValueError, match="must contain a YAML mapping"):
load_tags_file(f.name)
Path(f.name).unlink()
@@ -148,8 +148,8 @@ class TestLoadTagsFile:
"""Test loading file with invalid public key."""
data = {"invalid_key": {"tag": "value"}}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
with pytest.raises(ValueError, match="must be 64 characters"):
@@ -161,8 +161,8 @@ class TestLoadTagsFile:
"""Test loading empty tags file."""
data: dict[str, dict[str, str]] = {}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
result = load_tags_file(f.name)
@@ -195,8 +195,8 @@ class TestImportTags:
},
}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
yield f.name
@@ -270,7 +270,7 @@ class TestImportTags:
def test_import_nonexistent_file(self, db_manager):
"""Test import with non-existent file."""
stats = import_tags("/nonexistent/tags.json", db_manager)
stats = import_tags("/nonexistent/tags.yaml", db_manager)
assert stats["total"] == 0
assert len(stats["errors"]) == 1
@@ -280,8 +280,8 @@ class TestImportTags:
"""Test import with empty tags object."""
data: dict[str, dict[str, str]] = {}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
stats = import_tags(f.name, db_manager)
@@ -300,8 +300,8 @@ class TestImportTags:
}
}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
import_tags(f.name, db_manager, create_nodes=True)
@@ -323,8 +323,8 @@ class TestImportTags:
}
}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
stats = import_tags(f.name, db_manager, create_nodes=True)
@@ -339,16 +339,16 @@ class TestImportTags:
Path(f.name).unlink()
def test_import_numeric_value_converted(self, db_manager):
"""Test that numeric values are converted to strings."""
def test_import_numeric_value_detected(self, db_manager):
"""Test that YAML numeric values are detected and stored with number type."""
data = {
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef": {
"num_val": 42,
}
}
with tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False) as f:
json.dump(data, f)
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
stats = import_tags(f.name, db_manager, create_nodes=True)
@@ -359,6 +359,34 @@ class TestImportTags:
tag = session.execute(select(NodeTag)).scalar_one()
assert tag.key == "num_val"
assert tag.value == "42"
assert tag.value_type == "string"
assert tag.value_type == "number"
Path(f.name).unlink()
def test_import_boolean_value_detected(self, db_manager):
"""Test that YAML boolean values are detected and stored with boolean type."""
data = {
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef": {
"is_active": True,
"is_disabled": False,
}
}
with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as f:
yaml.dump(data, f)
f.flush()
stats = import_tags(f.name, db_manager, create_nodes=True)
assert stats["created"] == 2
with db_manager.session_scope() as session:
tags = session.execute(select(NodeTag)).scalars().all()
tag_dict = {t.key: t for t in tags}
assert tag_dict["is_active"].value == "true"
assert tag_dict["is_active"].value_type == "boolean"
assert tag_dict["is_disabled"].value == "false"
assert tag_dict["is_disabled"].value_type == "boolean"
Path(f.name).unlink()
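The value-type detection asserted above (YAML booleans stored as `"true"`/`"false"` with type `boolean`, numbers with type `number`, everything else as `string`) can be sketched with a helper like this — a hypothetical illustration, not the repository's `tag_import` code:

```python
from typing import Any


def detect_tag_value(value: Any) -> tuple[str, str]:
    """Map a parsed YAML scalar to (stored_string, value_type).

    bool must be checked before int: in Python, bool is a subclass of int,
    so an isinstance(value, int) check alone would classify True as a number.
    """
    if isinstance(value, bool):
        return ("true" if value else "false", "boolean")
    if isinstance(value, (int, float)):
        return (str(value), "number")
    return (str(value), "string")
```

This ordering is the whole point of the new YAML path: `json.load` round-tripped through strings lost the original types, whereas `yaml.dump`/`safe_load` preserves them for detection.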
+6 -6
@@ -64,8 +64,8 @@ class TestCollectorSettings:
assert settings.seed_home == "./seed"
assert settings.effective_seed_home == "seed"
# node_tags_file and members_file are derived from effective_seed_home
assert settings.node_tags_file == "seed/node_tags.json"
assert settings.members_file == "seed/members.json"
assert settings.node_tags_file == "seed/node_tags.yaml"
assert settings.members_file == "seed/members.yaml"
def test_custom_data_home(self) -> None:
"""Test that custom data_home affects effective paths."""
@@ -78,8 +78,8 @@ class TestCollectorSettings:
assert settings.collector_data_dir == "/custom/data/collector"
# seed_home is independent of data_home
assert settings.effective_seed_home == "seed"
assert settings.node_tags_file == "seed/node_tags.json"
assert settings.members_file == "seed/members.json"
assert settings.node_tags_file == "seed/node_tags.yaml"
assert settings.members_file == "seed/members.yaml"
def test_explicit_database_url_overrides(self) -> None:
"""Test that explicit database_url overrides the default."""
@@ -96,8 +96,8 @@ class TestCollectorSettings:
assert settings.seed_home == "/seed/data"
assert settings.effective_seed_home == "/seed/data"
assert settings.node_tags_file == "/seed/data/node_tags.json"
assert settings.members_file == "/seed/data/members.json"
assert settings.node_tags_file == "/seed/data/node_tags.yaml"
assert settings.members_file == "/seed/data/members.yaml"
class TestAPISettings:
+266
@@ -0,0 +1,266 @@
"""Tests for hash utilities for event deduplication."""
from datetime import datetime, timezone
from meshcore_hub.common.hash_utils import (
compute_advertisement_hash,
compute_message_hash,
compute_telemetry_hash,
compute_trace_hash,
)
class TestComputeMessageHash:
"""Tests for compute_message_hash function."""
def test_same_content_produces_same_hash(self) -> None:
"""Identical messages should produce the same hash."""
timestamp = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
hash1 = compute_message_hash(
text="Hello World",
pubkey_prefix="01ab2186c4d5",
channel_idx=4,
sender_timestamp=timestamp,
txt_type=1,
)
hash2 = compute_message_hash(
text="Hello World",
pubkey_prefix="01ab2186c4d5",
channel_idx=4,
sender_timestamp=timestamp,
txt_type=1,
)
assert hash1 == hash2
def test_different_text_produces_different_hash(self) -> None:
"""Messages with different text should have different hashes."""
timestamp = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
hash1 = compute_message_hash(
text="Hello World",
pubkey_prefix="01ab2186c4d5",
sender_timestamp=timestamp,
)
hash2 = compute_message_hash(
text="Goodbye World",
pubkey_prefix="01ab2186c4d5",
sender_timestamp=timestamp,
)
assert hash1 != hash2
def test_different_sender_produces_different_hash(self) -> None:
"""Messages from different senders should have different hashes."""
timestamp = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
hash1 = compute_message_hash(
text="Hello",
pubkey_prefix="01ab2186c4d5",
sender_timestamp=timestamp,
)
hash2 = compute_message_hash(
text="Hello",
pubkey_prefix="99ff8877aabb",
sender_timestamp=timestamp,
)
assert hash1 != hash2
def test_different_channel_produces_different_hash(self) -> None:
"""Messages on different channels should have different hashes."""
hash1 = compute_message_hash(text="Hello", channel_idx=1)
hash2 = compute_message_hash(text="Hello", channel_idx=2)
assert hash1 != hash2
def test_handles_none_values(self) -> None:
"""Hash function should handle None values gracefully."""
hash1 = compute_message_hash(
text="Test",
pubkey_prefix=None,
channel_idx=None,
sender_timestamp=None,
txt_type=None,
)
assert hash1 is not None
assert len(hash1) == 32 # MD5 hex digest length
class TestComputeAdvertisementHash:
"""Tests for compute_advertisement_hash function."""
def test_same_content_same_bucket_produces_same_hash(self) -> None:
"""Advertisements within the same time bucket should match."""
# Two times within the same 5-minute (300 second) bucket
time1 = datetime(2024, 1, 15, 10, 31, 0, tzinfo=timezone.utc)
time2 = datetime(2024, 1, 15, 10, 33, 0, tzinfo=timezone.utc)
hash1 = compute_advertisement_hash(
public_key="a" * 64,
name="Node1",
adv_type="chat",
flags=128,
received_at=time1,
bucket_seconds=300, # 5 minutes
)
hash2 = compute_advertisement_hash(
public_key="a" * 64,
name="Node1",
adv_type="chat",
flags=128,
received_at=time2,
bucket_seconds=300, # 5 minutes
)
assert hash1 == hash2
def test_different_bucket_produces_different_hash(self) -> None:
"""Advertisements in different time buckets should not match."""
# Two times in different 5-minute (300 second) buckets
time1 = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
time2 = datetime(2024, 1, 15, 10, 36, 0, tzinfo=timezone.utc)
hash1 = compute_advertisement_hash(
public_key="a" * 64,
name="Node1",
received_at=time1,
bucket_seconds=300, # 5 minutes
)
hash2 = compute_advertisement_hash(
public_key="a" * 64,
name="Node1",
received_at=time2,
bucket_seconds=300, # 5 minutes
)
assert hash1 != hash2
def test_different_public_key_produces_different_hash(self) -> None:
"""Advertisements from different nodes should have different hashes."""
time = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
hash1 = compute_advertisement_hash(
public_key="a" * 64,
received_at=time,
)
hash2 = compute_advertisement_hash(
public_key="b" * 64,
received_at=time,
)
assert hash1 != hash2
def test_configurable_bucket_size(self) -> None:
"""Bucket size should be configurable."""
time1 = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
time2 = datetime(2024, 1, 15, 10, 35, 0, tzinfo=timezone.utc)
# With 5-minute (300s) bucket, these should be in different buckets
hash1_5min = compute_advertisement_hash(
public_key="a" * 64,
received_at=time1,
bucket_seconds=300, # 5 minutes
)
hash2_5min = compute_advertisement_hash(
public_key="a" * 64,
received_at=time2,
bucket_seconds=300, # 5 minutes
)
assert hash1_5min != hash2_5min
# With 10-minute (600s) bucket, these should be in the same bucket
hash1_10min = compute_advertisement_hash(
public_key="a" * 64,
received_at=time1,
bucket_seconds=600, # 10 minutes
)
hash2_10min = compute_advertisement_hash(
public_key="a" * 64,
received_at=time2,
bucket_seconds=600, # 10 minutes
)
assert hash1_10min == hash2_10min
class TestComputeTraceHash:
"""Tests for compute_trace_hash function."""
def test_same_tag_produces_same_hash(self) -> None:
"""Same initiator_tag should produce same hash."""
hash1 = compute_trace_hash(initiator_tag=123456789)
hash2 = compute_trace_hash(initiator_tag=123456789)
assert hash1 == hash2
def test_different_tag_produces_different_hash(self) -> None:
"""Different initiator_tag should produce different hash."""
hash1 = compute_trace_hash(initiator_tag=123456789)
hash2 = compute_trace_hash(initiator_tag=987654321)
assert hash1 != hash2

class TestComputeTelemetryHash:
    """Tests for compute_telemetry_hash function."""

    def test_same_content_same_bucket_produces_same_hash(self) -> None:
        """Telemetry within the same time bucket should match."""
        time1 = datetime(2024, 1, 15, 10, 31, 0, tzinfo=timezone.utc)
        time2 = datetime(2024, 1, 15, 10, 33, 0, tzinfo=timezone.utc)
        data = {"temperature": 22.5, "humidity": 65}
        hash1 = compute_telemetry_hash(
            node_public_key="a" * 64,
            parsed_data=data,
            received_at=time1,
            bucket_seconds=300,  # 5 minutes
        )
        hash2 = compute_telemetry_hash(
            node_public_key="a" * 64,
            parsed_data=data,
            received_at=time2,
            bucket_seconds=300,  # 5 minutes
        )
        assert hash1 == hash2

    def test_different_data_produces_different_hash(self) -> None:
        """Different sensor readings should produce different hashes."""
        time = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
        hash1 = compute_telemetry_hash(
            node_public_key="a" * 64,
            parsed_data={"temperature": 22.5},
            received_at=time,
        )
        hash2 = compute_telemetry_hash(
            node_public_key="a" * 64,
            parsed_data={"temperature": 25.0},
            received_at=time,
        )
        assert hash1 != hash2

    def test_deterministic_dict_serialization(self) -> None:
        """Dict serialization should be deterministic regardless of key order."""
        time = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
        # Same data, different key order in the source dicts
        data1 = {"a": 1, "b": 2, "c": 3}
        data2 = {"c": 3, "a": 1, "b": 2}
        hash1 = compute_telemetry_hash(
            node_public_key="a" * 64,
            parsed_data=data1,
            received_at=time,
        )
        hash2 = compute_telemetry_hash(
            node_public_key="a" * 64,
            parsed_data=data2,
            received_at=time,
        )
        assert hash1 == hash2
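The implementations of the hash helpers do not appear in this diff, but the tests above pin down their contract: timestamps are quantized into fixed-size buckets, and dict serialization must be key-order independent. A minimal sketch consistent with that contract might look like the following — the function name, payload layout, and choice of SHA-256 are assumptions for illustration, not the project's actual code:

```python
import hashlib
import json
from datetime import datetime, timezone


def bucketed_telemetry_hash(
    node_public_key: str,
    parsed_data: dict,
    received_at: datetime,
    bucket_seconds: int = 300,
) -> str:
    """Hypothetical sketch of a bucketed, order-insensitive telemetry hash.

    Integer division of the epoch timestamp makes all readings within one
    bucket hash identically; json.dumps(sort_keys=True) makes the result
    independent of the source dict's key insertion order.
    """
    bucket = int(received_at.timestamp()) // bucket_seconds
    payload = json.dumps(
        {"key": node_public_key, "data": parsed_data, "bucket": bucket},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```

Under this sketch, readings at 10:31 and 10:33 share the 10:30 bucket (with `bucket_seconds=300`), so they hash identically, matching the behavior the tests assert.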
+63 -4
View File
@@ -147,6 +147,57 @@ class MockHttpClient:
            },
        }

        # Default activity response (for home page chart)
        self._responses["GET:/api/v1/dashboard/activity"] = {
            "status_code": 200,
            "json": {
                "days": 7,
                "data": [
                    {"date": "2024-01-01", "count": 10},
                    {"date": "2024-01-02", "count": 15},
                    {"date": "2024-01-03", "count": 8},
                    {"date": "2024-01-04", "count": 12},
                    {"date": "2024-01-05", "count": 20},
                    {"date": "2024-01-06", "count": 5},
                    {"date": "2024-01-07", "count": 18},
                ],
            },
        }

        # Default message activity response (for network page chart)
        self._responses["GET:/api/v1/dashboard/message-activity"] = {
            "status_code": 200,
            "json": {
                "days": 7,
                "data": [
                    {"date": "2024-01-01", "count": 5},
                    {"date": "2024-01-02", "count": 8},
                    {"date": "2024-01-03", "count": 3},
                    {"date": "2024-01-04", "count": 10},
                    {"date": "2024-01-05", "count": 7},
                    {"date": "2024-01-06", "count": 2},
                    {"date": "2024-01-07", "count": 9},
                ],
            },
        }

        # Default node count response (for network page chart)
        self._responses["GET:/api/v1/dashboard/node-count"] = {
            "status_code": 200,
            "json": {
                "days": 7,
                "data": [
                    {"date": "2024-01-01", "count": 5},
                    {"date": "2024-01-02", "count": 6},
                    {"date": "2024-01-03", "count": 7},
                    {"date": "2024-01-04", "count": 8},
                    {"date": "2024-01-05", "count": 9},
                    {"date": "2024-01-06", "count": 9},
                    {"date": "2024-01-07", "count": 10},
                ],
            },
        }

        # Health check response
        self._responses["GET:/health"] = {
            "status_code": 200,
@@ -224,7 +275,6 @@ def web_app(mock_http_client: MockHttpClient) -> Any:
        network_name="Test Network",
        network_city="Test City",
        network_country="Test Country",
        network_location=(40.7128, -74.0060),
        network_radio_config="Test Radio Config",
        network_contact_email="test@example.com",
        network_contact_discord="https://discord.gg/test",
@@ -265,7 +315,17 @@ def mock_http_client_with_members() -> MockHttpClient:
"role": "Admin",
"description": None,
"contact": "alice@example.com",
"public_key": None,
"nodes": [
{
"public_key": "abc123def456abc123def456abc123def456abc123def456abc123def456abc1",
"node_role": "chat",
"created_at": "2024-01-01T00:00:00Z",
"updated_at": "2024-01-01T00:00:00Z",
"node_name": "Alice's Node",
"node_adv_type": "chat",
"friendly_name": "Alice Chat",
}
],
"created_at": "2024-01-01T00:00:00Z",
"updated_at": "2024-01-01T00:00:00Z",
},
@@ -276,7 +336,7 @@ def mock_http_client_with_members() -> MockHttpClient:
"role": "Member",
"description": None,
"contact": None,
"public_key": None,
"nodes": [],
"created_at": "2024-01-01T00:00:00Z",
"updated_at": "2024-01-01T00:00:00Z",
},
@@ -298,7 +358,6 @@ def web_app_with_members(mock_http_client_with_members: MockHttpClient) -> Any:
        network_name="Test Network",
        network_city="Test City",
        network_country="Test Country",
        network_location=(40.7128, -74.0060),
        network_radio_config="Test Radio Config",
        network_contact_email="test@example.com",
        network_contact_discord="https://discord.gg/test",
+16 -1
View File
@@ -38,4 +38,19 @@ class TestMembersPage:
assert "Bob" in response.text
assert "W1ABC" in response.text
assert "W2XYZ" in response.text
assert "Admin" in response.text
def test_members_with_nodes_shows_node_links(
self, client_with_members: TestClient
) -> None:
"""Test that members page shows associated nodes with links."""
response = client_with_members.get("/members")
assert response.status_code == 200
# Alice has a node associated - check for friendly name display
assert "Alice Chat" in response.text
# Check for partial public key underneath
assert "abc123def456" in response.text
# Check for link to node detail page (full public key)
assert (
"/nodes/abc123def456abc123def456abc123def456abc123def456abc123def456abc1"
in response.text
)
+3 -3
View File
@@ -31,11 +31,11 @@ class TestNodesListPage:
"""Test that nodes page displays node data from API."""
response = client.get("/nodes")
assert response.status_code == 200
# Check for node data from mock
# Check for node data from mock (names and public key prefixes)
assert "Node One" in response.text
assert "Node Two" in response.text
assert "REPEATER" in response.text
assert "CLIENT" in response.text
assert "abc123" in response.text
assert "def456" in response.text
def test_nodes_displays_public_keys(
self, client: TestClient, mock_http_client: MockHttpClient