Compare commits

...

114 Commits

Author SHA1 Message Date
Louis King
cd4f0b91dc Various UI improvements 2025-12-08 22:07:46 +00:00
Louis King
a290db0491 Updated chart stats 2025-12-08 19:37:45 +00:00
Louis King
92b0b883e6 More website improvements 2025-12-08 17:07:39 +00:00
Louis King
9e621c0029 Fixed test 2025-12-08 16:42:13 +00:00
Louis King
a251f3a09f Added map to node detail page, made title consistent with emoji 2025-12-08 16:37:53 +00:00
Louis King
0fdedfe5ba Tidied Advert/Node search 2025-12-08 16:22:08 +00:00
Louis King
243a3e8521 Added truncate CLI command 2025-12-08 15:54:32 +00:00
JingleManSweep
b24a6f0894 Merge pull request #54 from ipnet-mesh/feature/more-filters
Fixed Member model
2025-12-08 15:15:04 +00:00
Louis King
57f51c741c Fixed Member model 2025-12-08 15:13:24 +00:00
Louis King
65b8418af4 Fixed last seen issue 2025-12-08 00:15:25 +00:00
JingleManSweep
89ceee8741 Merge pull request #51 from ipnet-mesh/feat/sync-receiver-contacts-on-advert
Receiver nodes now sync contacts to MQTT on every advert received
2025-12-07 23:36:11 +00:00
Louis King
64ec1a7135 Receiver nodes now sync contacts to MQTT on every advert received 2025-12-07 23:34:33 +00:00
JingleManSweep
3d632a94b1 Merge pull request #50 from ipnet-mesh/feat/remove-friendly-name
Removed friendly name support and tidied tags
2025-12-07 23:03:39 +00:00
Louis King
fbd29ff78e Removed friendly name support and tidied tags 2025-12-07 23:02:19 +00:00
Louis King
86bff07f7d Removed contrib 2025-12-07 22:22:32 +00:00
Louis King
3abd5ce3ea Updates 2025-12-07 22:18:16 +00:00
Louis King
0bf2086f16 Added screenshot 2025-12-07 22:05:34 +00:00
Louis King
40dc6647e9 Updates 2025-12-07 22:02:42 +00:00
Louis King
f4e95a254e Fixes 2025-12-07 22:00:46 +00:00
Louis King
ba43be9e62 Fixes 2025-12-07 21:58:42 +00:00
JingleManSweep
5b22ab29cf Merge pull request #49 from ipnet-mesh/fix/version-display
Fixed version display
2025-12-07 21:56:26 +00:00
Louis King
278d102064 Fixed version display 2025-12-07 21:55:10 +00:00
JingleManSweep
f0cee14bd8 Merge pull request #48 from ipnet-mesh/feature/mqtt-tls
Added support for MQTT TLS
2025-12-07 21:16:13 +00:00
Louis King
5ff8d16bcb Added support for MQTT TLS 2025-12-07 21:15:05 +00:00
JingleManSweep
e8a60d4869 Merge pull request #47 from ipnet-mesh/feature/node-cleanup
Added Node/Data cleanup
2025-12-07 20:50:09 +00:00
Louis King
84b8614e29 Updates 2025-12-06 21:42:33 +00:00
Louis King
3bc47a33bc Added data retention and node cleanup 2025-12-06 21:27:19 +00:00
Louis King
3ae8ecbd70 Updates 2025-12-06 20:46:31 +00:00
JingleManSweep
38164380af Merge pull request #41 from ipnet-mesh/claude/issue-37-20251206-1854
feat: Add MESHCORE_DEVICE_NAME config to set node name on startup
2025-12-06 19:33:32 +00:00
claude[bot]
dc3c771c76 docs: Document MESHCORE_DEVICE_NAME configuration option
Add documentation for the new MESHCORE_DEVICE_NAME environment variable
that was introduced in this PR. Updates include:

- Added to .env.example with description
- Added to Interface Settings table in README.md
- Added to CLI Reference examples in README.md
- Added to Interface configuration table in PLAN.md

🤖 Generated with [Claude Code](https://claude.ai/claude-code)

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-06 19:07:57 +00:00
claude[bot]
deb307c6ae feat: Add MESHCORE_DEVICE_NAME config to set node name on startup
- Add meshcore_device_name field to InterfaceSettings
- Implement set_name() method in device interface (real and mock)
- Update receiver to set device name during initialization if configured
- Add --device-name CLI option with MESHCORE_DEVICE_NAME env var support
- Device name is set after time sync and before advertisement broadcast

Fixes #37

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 19:00:56 +00:00
JingleManSweep
b8c8284643 Merge pull request #39 from ipnet-mesh/claude/issue-38-20251206-1840
Send flood advertisement on receiver startup
2025-12-06 18:48:50 +00:00
JingleManSweep
d310a119ed Update src/meshcore_hub/interface/receiver.py
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-12-06 18:48:03 +00:00
claude[bot]
2b307679c9 Send flood advertisement on receiver startup
Changed the startup advertisement from flood=False to flood=True
so that the device name is broadcast to the mesh network.

Fixes #38

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 18:42:53 +00:00
Louis King
6f7521951f Updates 2025-12-06 18:29:12 +00:00
Louis King
ab498292b2 Updated README with upgrading instructions 2025-12-06 17:31:10 +00:00
JingleManSweep
df2b9ea432 Merge pull request #35 from ipnet-mesh/claude/issue-33-20251206-1703
Add last seen time to map node labels
2025-12-06 17:18:49 +00:00
claude[bot]
55443376be Use full display name in map node labels
Update map node labels to show the full display name (friendly_name tag
→ advertised node name → public key prefix) instead of just the 2-char
public key prefix. This makes node labels consistent with the rest of
the site and easier to identify at a glance.

The backend already computes the correct display name in map.py:96-101,
so this change just uses that computed name in the label.

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 17:12:50 +00:00
claude[bot]
ed7a46b1a7 Add last seen time to map node labels
Display relative time since last seen (e.g., '2m', '1h', '2d') in node
labels on the map page. This makes it easier to quickly identify how
recently nodes were active without opening the popup.

- Add formatRelativeTime() function to calculate time difference
- Update createNodeIcon() to include relative time in label
- Adjust icon size to accommodate additional text
- Format: keyPrefix (timeAgo) e.g., 'ab (5m)'

Fixes #33

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 17:05:30 +00:00
JingleManSweep
c2eef3db50 Merge pull request #34 from ipnet-mesh/claude/issue-25-20251206-1646
fix: Handle empty channel_idx parameter in messages filter
2025-12-06 16:53:34 +00:00
claude[bot]
4916ea0cea fix: Handle empty channel_idx parameter in messages filter
Fixed parse error when clicking the filter button on messages screen
with "All Channels" selected. The form was sending an empty string
for channel_idx, but FastAPI expected either a valid integer or None.

Changes:
- Accept channel_idx as string in query parameter
- Parse and validate channel_idx before passing to API
- Treat empty strings as None to prevent validation errors
- Add error handling for invalid integer values

Fixes #25

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 16:49:23 +00:00
JingleManSweep
e3fc7e4f07 Merge pull request #32 from ipnet-mesh/add-claude-github-actions-1765039315831
Add Claude Code GitHub Workflow
2025-12-06 16:42:13 +00:00
JingleManSweep
b656bfda21 "Claude PR Assistant workflow" 2025-12-06 16:41:56 +00:00
Louis King
fb7201dc2d Updates 2025-12-06 16:37:25 +00:00
Louis King
74346d9c82 Hopefully use Git tag as version on website 2025-12-06 16:32:31 +00:00
Louis King
beb471fcd8 Switched to Git versioning 2025-12-06 16:28:13 +00:00
Louis King
8597052bf7 Fixed table overflows 2025-12-06 16:07:13 +00:00
Louis King
f85768f661 Fixed DB migration 2025-12-06 15:46:41 +00:00
JingleManSweep
79bb4a4250 Merge pull request #23 from ipnet-mesh/claude/plan-event-deduplication-01NWhrJaWzQiGi1xLzmk1udg
Deduplication for multiple receiver nodes
2025-12-06 15:36:37 +00:00
Louis King
714c3cbbd2 Set sensible Docker tag label 2025-12-06 15:32:15 +00:00
Louis King
f0531c9e40 Updated env example 2025-12-06 15:16:26 +00:00
Louis King
dd0b4c73c5 More fixes 2025-12-06 15:10:03 +00:00
Louis King
78a086e1ea Updates 2025-12-06 14:51:47 +00:00
Louis King
9cd1d50bf6 Updates 2025-12-06 14:38:53 +00:00
Louis King
733342a9ec Fixed README and Compose 2025-12-06 14:21:17 +00:00
Louis King
d715e4e4f0 Updates 2025-12-06 13:33:02 +00:00
Louis King
2ea04deb7e Updates 2025-12-06 12:53:29 +00:00
Claude
6e3b86a1ad Add collector-level event deduplication using content hashes
Replace presentation-layer deduplication with collector-level approach:
- Add event_hash column to messages, advertisements, trace_paths, telemetry tables
- Handlers compute content hashes and skip duplicate events at insertion time
- Use 5-minute time buckets for advertisements and telemetry
- Include Alembic migration for schema changes
2025-12-06 12:23:14 +00:00
Louis King
b807932ca3 Added arm64 Docker image support 2025-12-06 12:16:05 +00:00
Claude
c80986fe67 Add event deduplication at presentation layer
When multiple receiver nodes are running, the same mesh events (messages,
advertisements) are reported multiple times. This causes duplicate entries
in the Web UI.

Changes:
- Add hash_utils.py with deterministic hash functions for each event type
- Add `dedupe` parameter to messages and advertisements API endpoints (default: True)
- Update dashboard stats to use distinct counts for messages/advertisements
- Deduplicate recent advertisements and channel messages in dashboard
- Add comprehensive tests for hash utilities

Hash strategy:
- Messages: hash of text + pubkey_prefix + channel_idx + sender_timestamp + txt_type
- Advertisements: hash of public_key + name + adv_type + flags + 5-minute time bucket
2025-12-06 12:08:07 +00:00
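The hash strategy above can be sketched as follows; the field lists and the 5-minute bucket come from the commit message, but the separator and hash algorithm are assumptions for illustration:

```python
import hashlib

def message_hash(text, pubkey_prefix, channel_idx, sender_timestamp, txt_type):
    # Messages: identical content reported by any receiver hashes the same.
    parts = [text, pubkey_prefix, str(channel_idx), str(sender_timestamp), str(txt_type)]
    return hashlib.sha256("|".join(parts).encode()).hexdigest()

def advertisement_hash(public_key, name, adv_type, flags, epoch_seconds):
    # Advertisements: bucket the timestamp into 5-minute windows so the same
    # advert heard by several receivers within a window deduplicates.
    bucket = epoch_seconds // 300
    parts = [public_key, str(name), str(adv_type), str(flags), str(bucket)]
    return hashlib.sha256("|".join(parts).encode()).hexdigest()
```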
Louis King
3a060f77cc Updates 2025-12-06 11:46:36 +00:00
JingleManSweep
7461cf6dc1 Merge pull request #22 from ipnet-mesh/claude/add-member-node-association-01JezMA7XzvwsX37rMNoBmvo
Updates
2025-12-05 21:18:31 +00:00
Louis King
23f6c290c9 Updates 2025-12-05 21:17:34 +00:00
JingleManSweep
1ae1736391 Merge pull request #21 from ipnet-mesh/claude/add-member-node-association-01JezMA7XzvwsX37rMNoBmvo
Associate nodes with members in database
2025-12-05 21:17:20 +00:00
Claude
a4b13d3456 Add member-node association support
Members can now have multiple associated nodes, each with a public_key
and node_role (e.g., 'chat', 'repeater'). This replaces the single
public_key field on members with a one-to-many relationship.

Changes:
- Add MemberNode model for member-node associations
- Update Member model to remove public_key, add nodes relationship
- Update Pydantic schemas with MemberNodeCreate/MemberNodeRead
- Update member_import.py to handle nodes list in seed files
- Update API routes to handle nodes in create/update/read operations
- Add Alembic migration to create member_nodes table and migrate data
- Update example seed file with new format
2025-12-05 20:34:09 +00:00
Louis King
0016edbdac Updates 2025-12-05 20:03:14 +00:00
Louis King
0b8fc6e707 Charts 2025-12-05 19:50:22 +00:00
Louis King
d1181ae4f9 Updates 2025-12-05 19:27:06 +00:00
JingleManSweep
6b41e64b26 Merge pull request #20 from ipnet-mesh/claude/ui-improvements-ads-page-01GQvLau46crtrqftWzFie5d
UI improvements and advertisements page
2025-12-05 18:26:07 +00:00
Claude
995b066b0d Remove sender name from channel messages summary, keep only timestamp 2025-12-05 18:23:33 +00:00
Claude
0d14ed0ccc Add latest channel messages to Network dashboard
Replace the channel counts table with actual recent messages per channel:
- Added ChannelMessage schema for channel message summaries
- Dashboard API now fetches latest 5 messages for each channel with sender name lookups
- Network page displays messages grouped by channel with sender names and timestamps
- Only shows channels that have messages
2025-12-05 18:20:49 +00:00
Claude
e3ce1258a8 Fix advertisements Type column by falling back to source node's adv_type
The adv_type from the Advertisement record is often null, but the linked
Node has the correct adv_type. Now falls back to source_node.adv_type
when adv.adv_type is null.
2025-12-05 18:14:16 +00:00
Claude
087a3c4c43 Messages list: swap Time/Type columns, add receiver node links
- Swap Time and Type columns (Type now first)
- Add receiver_name and receiver_friendly_name to MessageRead schema
- Update messages API to fetch receiver node names and tags
- Make Receiver column a link showing name with public key prefix
2025-12-05 18:09:29 +00:00
Claude
5077178a6d Add Type column with emoji to Advertisements list 2025-12-05 18:04:04 +00:00
Claude
a44d38dad6 Update Nodes list to match Advertisements style
- Rename Name column to Node
- Remove separate Public Key column
- Show name with public key prefix below (like Advertisements list)
- Add whitespace-nowrap to Last Seen column
2025-12-05 18:03:14 +00:00
Claude
3469278fba Add node name lookups to advertisements list
- Join with Node table to get node names and tags for both source
  and receiver nodes
- Display friendly_name (from tags), node_name, or advertised name
  with priority in that order
- Show name with public key preview for both Node and Received By columns
2025-12-05 17:58:52 +00:00
Claude
89c81630c9 Link 'Powered by MeshCore Hub' to GitHub repository 2025-12-05 17:55:59 +00:00
Claude
ab4a5886db Simplify advertisements table: Node, Received By, Time columns
- Remove Name and Type columns (usually null)
- Reorder columns: Node first, then Received By, then Time
- Link both Node and Received By to their node detail pages
- Show node name with public key preview when available
2025-12-05 17:53:54 +00:00
Claude
ec7082e01a Fix message text indentation in messages list
Put message content inline with td tag to prevent whitespace-pre-wrap
from preserving template indentation.
2025-12-05 17:43:09 +00:00
Claude
b4e7d45cf6 UI improvements: smaller hero, stats bar, advertisements page, messages fixes
- Reduce hero section size and add stats bar with node/message counts
- Add new Advertisements page with public key filtering
- Update hero navigation buttons: Dashboard, Nodes, Advertisements, Messages
- Add Advertisements to main navigation menu
- Remove Hops column from messages list (always empty)
- Display full message text with proper multi-line wrapping
2025-12-05 17:14:26 +00:00
JingleManSweep
864494c3a8 Merge pull request #19 from ipnet-mesh/claude/research-node-id-usage-019DURYQHkvodx9sV39hTmNM
Research node ID and public key usage
2025-12-05 16:59:30 +00:00
Claude
84e83a3384 Rename receiver_public_key to received_by
Shorter, cleaner field name for the receiving interface node's
public key in API responses.
2025-12-05 16:57:24 +00:00
Claude
796e303665 Remove internal UUID fields from API responses
Internal database UUIDs (id, node_id, receiver_node_id) were being
exposed in API responses. These are implementation details that should
not be visible to API consumers. The canonical identifier for nodes
is the 64-char hex public_key.

Changes:
- Remove id, node_id from NodeTagRead, NodeRead schemas
- Remove id from MemberRead schema
- Remove id, receiver_node_id, node_id from MessageRead, AdvertisementRead,
  TracePathRead, TelemetryRead schemas
- Update web map component to use public_key instead of member.id
  for owner filtering
- Update tests to not assert on removed fields
2025-12-05 16:50:21 +00:00
Louis King
a5d8d586e1 Updated README 2025-12-05 12:56:12 +00:00
JingleManSweep
26239fe11f Merge pull request #18 from ipnet-mesh/claude/prepare-public-release-01AFsizAneHmWjHZD6of5MWy
Prepare repository for public release
2025-12-05 12:40:56 +00:00
Claude
0e50a9d3b0 Prepare repository for public release
- Update license from MIT to GPL-3.0-or-later in pyproject.toml
- Update project URLs from meshcore-dev to ipnet-mesh organization
- Add explicit GPL-3.0 license statement to README
- Fix AGENTS.md venv directory reference (.venv vs venv)
- Remove undocumented NETWORK_LOCATION from README
- Fix stats endpoint path in README (/api/v1/dashboard/stats)
- Clarify seed and data directory descriptions in project structure
2025-12-05 12:12:55 +00:00
JingleManSweep
9b7d8cc31b Merge pull request #17 from ipnet-mesh/claude/implement-map-page-01F5nbj6KfXDv4Dsp1KE6jJZ
Updates
2025-12-04 19:35:15 +00:00
Louis King
d7152a5359 Updates 2025-12-04 19:34:18 +00:00
JingleManSweep
a1cc4388ae Merge pull request #16 from ipnet-mesh/claude/implement-map-page-01F5nbj6KfXDv4Dsp1KE6jJZ
Implement Map page with node filtering
2025-12-04 19:32:24 +00:00
Claude
bb0b9f05ec Add debug info to map data endpoint for troubleshooting
- Return total_nodes, nodes_with_coords, and error in response
- Display meaningful messages when no nodes or no coordinates found
- Log API errors and node counts for debugging
2025-12-04 18:37:50 +00:00
Claude
fe744c7c0c Fix map markers to use inline styles and center on nodes
- Use inline styles for marker colors instead of CSS classes for reliable rendering
- Center map on node locations when data is first loaded
- Refactor filter logic to separate recentering behavior
- Update legend to use inline styles
2025-12-04 18:34:24 +00:00
Claude
cf4e82503a Add filters to map page for node type, infrastructure, and owner
- Enhanced /map/data endpoint to include node role tag and member ownership
- Added client-side filtering for node type (chat, repeater, room)
- Added toggle to filter for infrastructure nodes only (role: infra)
- Added dropdown filter for member owner (nodes linked via public_key)
- Color-coded markers by node type with gold border for infrastructure
- Added legend showing marker types
- Dynamic count display showing total vs filtered nodes
2025-12-04 18:29:43 +00:00
Louis King
d6346fdfde Updates 2025-12-04 18:18:36 +00:00
Louis King
cf2c3350cc Updates 2025-12-04 18:10:29 +00:00
Louis King
110c701787 Updates 2025-12-04 16:37:59 +00:00
Louis King
fc0dc1a448 Updates 2025-12-04 16:12:51 +00:00
Louis King
d283a8c79b Updates 2025-12-04 16:00:15 +00:00
JingleManSweep
f129a4e0f3 Merge pull request #15 from ipnet-mesh/feature/originator-address
Updates
2025-12-04 15:52:49 +00:00
JingleManSweep
90f6a68b9b Merge pull request #14 from ipnet-mesh/claude/update-docs-docker-agents-01B9FYrem1tEwNRkx7rCH1QU
Update docs: Docker profiles, seed data, and Members model
2025-12-04 15:46:00 +00:00
Louis King
6cf3152ef9 Updates 2025-12-04 15:45:35 +00:00
Claude
058a6e2c95 Update docs: Docker profiles, seed data, and Members model
- Update Docker Compose section: core services run by default (mqtt,
  collector, api, web), optional profiles for interfaces and utilities
- Document automatic seeding on collector startup
- Add SEED_HOME environment variable documentation
- Document new Members model and YAML seed file format
- Update node_tags format to YAML with public_key-keyed structure
- Update project structure to reflect seed/ and data/ directories
- Add CLI reference for collector seed commands
2025-12-04 15:31:04 +00:00
JingleManSweep
acccdfedba Merge pull request #13 from ipnet-mesh/claude/fix-black-ci-linting-01UGvpVAnxCMthohsvzFBiQF
Fix Black linting issues in CI pipeline
2025-12-04 15:12:36 +00:00
Claude
83f3157e8b Fix Black formatting: add trailing comma in set_dispatch_callback
Black requires a trailing comma after the callback parameter when the
function signature spans multiple lines.
2025-12-04 15:06:13 +00:00
JingleManSweep
c564a61cf7 Merge pull request #12 from ipnet-mesh/claude/trigger-contact-database-01Qp5vbzzE1c77Kv21wYfQbQ
Trigger Contact database before CONTACT events
2025-12-04 14:59:21 +00:00
Claude
bbe8491ff1 Add info-level logging to contact handler for debugging
Change debug logging to info level so contact processing is visible
in default log output. This helps verify that contact events are
being received and processed correctly.
2025-12-04 14:39:14 +00:00
Claude
1d6a9638a1 Refactor contacts to emit individual MQTT events per contact
Instead of sending all contacts in one MQTT message, the interface
now splits the device's contacts response into individual 'contact'
events. This is more consistent with other event patterns and makes
the collector simpler.

Interface changes:
- Add _publish_contacts() to split contacts dict into individual events
- Publish each contact as 'contact' event (not 'contacts')

Collector changes:
- Rename handle_contacts to handle_contact for single contact
- Simplify handler to process one contact per message
- Register handler for 'contact' events
2025-12-04 14:34:05 +00:00
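The splitting behaviour described above — one 'contact' event per entry instead of a single 'contacts' payload — can be sketched like this; the `publish` callback signature and payload shape are assumptions, not the interface's actual API:

```python
def publish_contacts(contacts: dict, publish) -> int:
    """Split a device contacts response into individual 'contact' events,
    calling publish(event_name, payload) once per contact. Returns the
    number of events emitted."""
    count = 0
    for public_key, contact in contacts.items():
        publish("contact", {"public_key": public_key, **contact})
        count += 1
    return count
```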
Claude
cf633f9f44 Update contacts handler to match actual device payload format
The device sends contact entries with different field names than
originally expected:
- adv_name (not name) for the advertised node name
- type (numeric: 0=none, 1=chat, 2=repeater, 3=room) instead of node_type

Changes:
- Update handle_contacts to extract adv_name and convert numeric type
- Add NODE_TYPE_MAP for type conversion
- Always update node name if different (not just if empty)
- Add debug logging for node updates
- Update ContactInfo schema with actual device fields
2025-12-04 14:25:11 +00:00
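The type mapping named in this commit can be sketched directly from the values it lists (0=none, 1=chat, 2=repeater, 3=room); the extraction helper around it is hypothetical:

```python
# Numeric contact 'type' values sent by the device, per the commit above.
NODE_TYPE_MAP = {0: None, 1: "chat", 2: "repeater", 3: "room"}

def contact_to_node_fields(contact: dict) -> dict:
    """Extract the fields the collector stores from a raw contact entry.
    The payload keys 'adv_name' and 'type' are those named in the commit."""
    return {
        "name": contact.get("adv_name"),
        "adv_type": NODE_TYPE_MAP.get(contact.get("type", 0)),
    }
```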
Claude
102e40a395 Fix event subscriptions setup timing for contact database sync
Move _setup_event_subscriptions() from run() to connect() so that
event subscriptions are active before get_contacts() is called during
device initialization. Previously, CONTACTS events were lost because
subscriptions weren't set up until run() was called.
2025-12-04 14:18:27 +00:00
Claude
241902685d Add contact database sync to interface startup
Trigger get_contacts() during receiver initialization to fetch the
device's contact database and broadcast CONTACTS events over MQTT.
This enables the collector to associate broadcast node names with
node records in the database.

Changes:
- Add get_contacts() abstract method to BaseMeshCoreDevice
- Implement get_contacts() in MeshCoreDevice using meshcore library
- Implement get_contacts() in MockMeshCoreDevice for testing
- Call get_contacts() in Receiver._initialize_device() after startup
2025-12-04 14:10:58 +00:00
JingleManSweep
2f2ae30c89 Merge pull request #11 from ipnet-mesh/claude/collector-seed-yaml-conversion-01WgpDYuzrzP5nkeC2EG9o2L
Convert collector seed mechanism from JSON to YAML
2025-12-04 01:34:47 +00:00
Louis King
fff04e4b99 Updates 2025-12-04 01:33:25 +00:00
Claude
df05c3a462 Convert collector seed mechanism from JSON to YAML
- Replace JSON seed files with YAML format for better readability
- Auto-detect YAML primitive types (number, boolean, string) from values
- Add automatic seed import on collector startup
- Split lat/lon into separate tags instead of combined coordinate string
- Add PyYAML dependency and types-PyYAML for type checking
- Update example/seed and contrib/seed/ipnet with clean YAML format
- Update tests to verify YAML primitive type detection
2025-12-04 01:27:03 +00:00
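The "auto-detect YAML primitive types" behaviour above amounts to classifying each parsed value as number, boolean, or string; a minimal sketch (function name hypothetical):

```python
def detect_tag_type(value) -> str:
    """Classify a YAML-parsed tag value as one of the three primitive
    types named in the commit above: 'number', 'boolean', or 'string'."""
    if isinstance(value, bool):  # check bool first: bool is an int subclass
        return "boolean"
    if isinstance(value, (int, float)):
        return "number"
    return "string"
```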
Louis King
e2d865f200 Fix nodes page test to match template output
The test was checking for adv_type values (REPEATER, CLIENT) but the
nodes.html template doesn't display that column. Updated to check for
public key prefixes instead.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-04 01:03:05 +00:00
Louis King
fa335bdb14 Updates 2025-12-04 00:59:49 +00:00
109 changed files with 7217 additions and 1627 deletions


@@ -5,29 +5,33 @@
# Docker Image
# ===================
# Leave empty to build from local Dockerfile, or set to use a pre-built image:
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:latest
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:main
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:v1.0.0
MESHCORE_IMAGE=
# Docker image version tag to use
# Options: latest, main, v1.0.0, etc.
IMAGE_VERSION=latest
# ===================
# Data Directory
# Data & Seed Directories
# ===================
# Base directory for all service data (collector DB, tags, members, etc.)
# Base directory for runtime data (database, etc.)
# Default: ./data (relative to docker-compose.yml location)
# Inside containers this is mapped to /data
#
# Structure:
# ${DATA_HOME}/
# ├── collector/
# │ ├── meshcore.db # SQLite database
# │ └── tags.json # Node tags for import
# └── web/
# └── members.json # Network members list
# └── meshcore.db # SQLite database
DATA_HOME=./data
# Directory containing seed data files for import
# Default: ./seed (relative to docker-compose.yml location)
# Inside containers this is mapped to /seed
#
# Structure:
# ${SEED_HOME}/
# ├── node_tags.yaml # Node tags for import
# └── members.yaml # Network members for import
SEED_HOME=./seed
# ===================
# Common Settings
# ===================
@@ -35,12 +39,25 @@ DATA_HOME=./data
# Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
LOG_LEVEL=INFO
# MQTT Broker Settings (internal use)
# ===================
# MQTT Settings
# ===================
# MQTT Broker connection (for interface/collector/api services)
# When using the local MQTT broker (--profile mqtt), use "mqtt" as host
# When using an external broker, set the hostname/IP
MQTT_HOST=mqtt
MQTT_PORT=1883
MQTT_USERNAME=
MQTT_PASSWORD=
MQTT_PREFIX=meshcore
# External MQTT port mapping
# Enable TLS/SSL for MQTT connection (default: false)
# When enabled, uses TLS with system CA certificates (e.g., for Let's Encrypt)
# Set to true for secure MQTT connections (port 8883)
MQTT_TLS=false
# External port mappings for local MQTT broker (--profile mqtt only)
MQTT_EXTERNAL_PORT=1883
MQTT_WS_PORT=9001
@@ -57,7 +74,12 @@ SERIAL_PORT_SENDER=/dev/ttyUSB1
# Baud rate for serial communication
SERIAL_BAUD=115200
# Optional device/node name to set on startup
# This name is broadcast to the mesh network in advertisements
MESHCORE_DEVICE_NAME=
# Optional node address override (64-char hex string)
# Only set if you need to override the device's public key
NODE_ADDRESS=
NODE_ADDRESS_SENDER=
@@ -84,15 +106,20 @@ WEB_PORT=8080
NETWORK_NAME=MeshCore Network
NETWORK_CITY=
NETWORK_COUNTRY=
NETWORK_LOCATION=
# Radio configuration (comma-delimited)
# Format: <profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>
# Example: EU/UK Narrow,869.618MHz,62.5kHz,8,8,22dBm
NETWORK_RADIO_CONFIG=
# Contact information
NETWORK_CONTACT_EMAIL=
NETWORK_CONTACT_DISCORD=
NETWORK_CONTACT_GITHUB=
# Members file location (optional override)
# Default: ${DATA_HOME}/web/members.json
# Only set this if you want to use a different location
# MEMBERS_FILE=/custom/path/to/members.json
# Welcome text displayed on the homepage (plain text, optional)
# If not set, a default welcome message is shown
NETWORK_WELCOME_TEXT=
# ===================
# Webhook Settings
@@ -119,3 +146,35 @@ WEBHOOK_MESSAGE_SECRET=
WEBHOOK_TIMEOUT=10.0
WEBHOOK_MAX_RETRIES=3
WEBHOOK_RETRY_BACKOFF=2.0
# ===================
# Data Retention Settings
# ===================
# Enable automatic cleanup of old event data
# When enabled, the collector runs periodic cleanup to delete old events
# Default: true
DATA_RETENTION_ENABLED=true
# Number of days to retain event data (advertisements, messages, telemetry, etc.)
# Events older than this are deleted during cleanup
# Default: 30 days
DATA_RETENTION_DAYS=30
# Hours between automatic cleanup runs (applies to both events and nodes)
# Default: 24 hours (once per day)
DATA_RETENTION_INTERVAL_HOURS=24
# ===================
# Node Cleanup Settings
# ===================
# Enable automatic cleanup of inactive nodes
# Nodes that haven't been seen (last_seen) for the specified period are removed
# Nodes with last_seen=NULL (never seen on network) are NOT removed
# Default: true
NODE_CLEANUP_ENABLED=true
# Remove nodes not seen for this many days (based on last_seen field)
# Default: 7 days
NODE_CLEANUP_DAYS=7


@@ -2,9 +2,9 @@ name: CI
on:
push:
branches: [main, master]
branches: [main]
pull_request:
branches: [main, master]
branches: [main]
jobs:
lint:

.github/workflows/claude.yml

@@ -0,0 +1,49 @@
name: Claude Code
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
jobs:
claude:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
(github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
actions: read # Required for Claude to read CI results on PRs
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Run Claude Code
id: claude
uses: anthropics/claude-code-action@v1
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# This is an optional setting that allows Claude to read CI results on PRs
additional_permissions: |
actions: read
# Optional: Give a custom prompt to Claude. If this is not specified, Claude will perform the instructions specified in the comment that tagged it.
# prompt: 'Update the pull request description to include a summary of changes.'
# Optional: Add claude_args to customize behavior and configuration
# See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
# or https://docs.claude.com/en/docs/claude-code/cli-reference for available options
# claude_args: '--allowed-tools Bash(gh pr:*)'


@@ -2,11 +2,9 @@ name: Docker
on:
push:
branches: [main, master]
branches: [main]
tags:
- "v*"
pull_request:
branches: [main, master]
env:
REGISTRY: ghcr.io
@@ -23,6 +21,9 @@ jobs:
steps:
- uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
@@ -51,15 +52,18 @@ jobs:
with:
context: .
file: Dockerfile
platforms: linux/amd64,linux/arm64
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
BUILD_VERSION=${{ github.ref_name }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Test Docker image
if: github.event_name == 'pull_request'
run: |
docker build -t meshcore-hub-test -f Dockerfile .
docker build -t meshcore-hub-test --build-arg BUILD_VERSION=${{ github.ref_name }} -f Dockerfile .
docker run --rm meshcore-hub-test --version
docker run --rm meshcore-hub-test --help

.gitignore

@@ -33,6 +33,7 @@ share/python-wheels/
MANIFEST
uv.lock
docker-compose.override.yml
# PyInstaller
# Usually these files are written by a python script from a template


@@ -38,3 +38,4 @@ repos:
- fastapi>=0.100.0
- alembic>=1.7.0
- types-paho-mqtt>=1.6.0
- types-PyYAML>=6.0.0

AGENTS.md

@@ -5,8 +5,8 @@ This document provides context and guidelines for AI coding assistants working o
## Agent Rules
* You MUST use Python (version in `.python-version` file)
* You MUST activate a Python virtual environment in the `venv` directory or create one if it does not exist:
- `ls ./venv` to check if it exists
* You MUST activate a Python virtual environment in the `.venv` directory or create one if it does not exist:
- `ls ./.venv` to check if it exists
- `python -m venv .venv` to create it
* You MUST always activate the virtual environment before running any commands
- `source .venv/bin/activate`
@@ -114,17 +114,15 @@ class NodeRead(BaseModel):
### SQLAlchemy Models
```python
from sqlalchemy import String, DateTime, ForeignKey
from sqlalchemy import String, DateTime, Text
from sqlalchemy.orm import Mapped, mapped_column, relationship
from datetime import datetime
from uuid import uuid4
from typing import Optional
from meshcore_hub.common.models.base import Base
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin
class Node(Base):
class Node(Base, UUIDMixin, TimestampMixin):
__tablename__ = "nodes"
id: Mapped[str] = mapped_column(String(36), primary_key=True, default=lambda: str(uuid4()))
public_key: Mapped[str] = mapped_column(String(64), unique=True, index=True)
name: Mapped[str | None] = mapped_column(String(255), nullable=True)
adv_type: Mapped[str | None] = mapped_column(String(20), nullable=True)
@@ -132,6 +130,18 @@ class Node(Base):
# Relationships
tags: Mapped[list["NodeTag"]] = relationship(back_populates="node", cascade="all, delete-orphan")
class Member(Base, UUIDMixin, TimestampMixin):
"""Network member model - stores info about network operators."""
__tablename__ = "members"
name: Mapped[str] = mapped_column(String(255), nullable=False)
callsign: Mapped[Optional[str]] = mapped_column(String(20), nullable=True)
role: Mapped[Optional[str]] = mapped_column(String(100), nullable=True)
description: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
contact: Mapped[Optional[str]] = mapped_column(String(255), nullable=True)
public_key: Mapped[Optional[str]] = mapped_column(String(64), nullable=True, index=True)
```
### FastAPI Routes
@@ -239,7 +249,12 @@ meshcore-hub/
│ │ ├── mqtt.py # MQTT utilities
│ │ ├── logging.py # Logging config
│ │ ├── models/ # SQLAlchemy models
│ │ │ ├── node.py # Node model
│ │ │ ├── member.py # Network member model
│ │ │ └── ...
│ │ └── schemas/ # Pydantic schemas
│ │ ├── members.py # Member API schemas
│ │ └── ...
│ ├── interface/
│ │ ├── cli.py
│ │ ├── device.py # MeshCore device wrapper
@@ -247,9 +262,11 @@ meshcore-hub/
│ │ ├── receiver.py # RECEIVER mode
│ │ └── sender.py # SENDER mode
│ ├── collector/
│ │ ├── cli.py # Collector CLI with seed commands
│ │ ├── subscriber.py # MQTT subscriber
│ │ ├── cleanup.py # Data retention/cleanup service
│ │ ├── tag_import.py # Tag import from YAML
│ │ ├── member_import.py # Member import from YAML
│ │ ├── handlers/ # Event handlers
│ │ └── webhook.py # Webhook dispatcher
│ ├── api/
@@ -258,11 +275,15 @@ meshcore-hub/
│ │ ├── auth.py # Authentication
│ │ ├── dependencies.py
│ │ ├── routes/ # API routes
│ │ │ ├── members.py # Member CRUD endpoints
│ │ │ └── ...
│ │ └── templates/ # Dashboard HTML
│ └── web/
│ ├── cli.py
│ ├── app.py # FastAPI app
│ ├── routes/ # Page routes
│ │ ├── members.py # Members page
│ │ └── ...
│ ├── templates/ # Jinja2 templates
│ └── static/ # CSS, JS
├── tests/
@@ -278,20 +299,17 @@ meshcore-hub/
├── etc/
│ └── mosquitto.conf # MQTT broker configuration
├── example/
│   └── seed/                  # Example seed data files
│       ├── node_tags.yaml     # Example node tags
│       └── members.yaml       # Example network members
├── seed/                      # Seed data directory (SEED_HOME)
│   ├── node_tags.yaml         # Node tags for import
│   └── members.yaml           # Network members for import
├── data/                      # Runtime data (gitignored, DATA_HOME default)
│   └── collector/             # Collector data
│       └── meshcore.db        # SQLite database
├── Dockerfile                 # Docker build configuration
└── docker-compose.yml         # Docker Compose services
```
## MQTT Topic Structure
@@ -436,26 +454,39 @@ meshcore-hub interface --mode receiver --mock
See [PLAN.md](PLAN.md#configuration-environment-variables) for complete list.
Key variables:
- `DATA_HOME` - Base directory for runtime data (default: `./data`)
- `SEED_HOME` - Directory containing seed data files (default: `./seed`)
- `MQTT_HOST`, `MQTT_PORT`, `MQTT_PREFIX` - MQTT broker connection
- `DATABASE_URL` - SQLAlchemy database URL (default: `sqlite:///{DATA_HOME}/collector/meshcore.db`)
- `API_READ_KEY`, `API_ADMIN_KEY` - API authentication keys
- `LOG_LEVEL` - Logging verbosity
### Directory Structure
**Seed Data (`SEED_HOME`)** - Contains initial data files for database seeding:
```
${SEED_HOME}/
├── node_tags.yaml # Node tags (keyed by public_key)
└── members.yaml # Network members list
```
**Runtime Data (`DATA_HOME`)** - Contains runtime data (gitignored):
```
${DATA_HOME}/
└── collector/             # Collector data
    └── meshcore.db        # SQLite database
```
Services automatically create their subdirectories if they don't exist.
### Automatic Seeding
The collector automatically imports seed data on startup if YAML files exist in `SEED_HOME`:
- `node_tags.yaml` - Node tag definitions (keyed by public_key)
- `members.yaml` - Network member definitions
Manual seeding can be triggered with: `meshcore-hub collector seed`
### Webhook Configuration
The collector supports forwarding events to external HTTP endpoints:
@@ -472,6 +503,48 @@ The collector supports forwarding events to external HTTP endpoints:
| `WEBHOOK_MAX_RETRIES` | Max retries on failure (default: 3) |
| `WEBHOOK_RETRY_BACKOFF` | Exponential backoff multiplier (default: 2.0) |
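The retry settings above imply an exponential backoff schedule between delivery attempts. A minimal sketch of how that schedule could be derived (the function name and the 1-second base delay are assumptions, not the actual dispatcher code):

```python
def retry_delays(max_retries: int = 3, backoff: float = 2.0,
                 base_delay: float = 1.0) -> list[float]:
    """Seconds to wait before each retry: base_delay * backoff**attempt.

    With the defaults (WEBHOOK_MAX_RETRIES=3, WEBHOOK_RETRY_BACKOFF=2.0),
    a failing delivery would be retried after 1s, 2s, and 4s before giving up.
    """
    return [base_delay * backoff ** attempt for attempt in range(max_retries)]


print(retry_delays())  # [1.0, 2.0, 4.0]
```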
### Data Retention / Cleanup Configuration
The collector supports automatic cleanup of old event data and inactive nodes:
**Event Data Cleanup:**
| Variable | Description |
|----------|-------------|
| `DATA_RETENTION_ENABLED` | Enable automatic event data cleanup (default: true) |
| `DATA_RETENTION_DAYS` | Days to retain event data (default: 30) |
| `DATA_RETENTION_INTERVAL_HOURS` | Hours between cleanup runs (default: 24) |
When enabled, the collector automatically deletes event data older than the retention period:
- Advertisements
- Messages (channel and direct)
- Telemetry
- Trace paths
- Event logs
**Node Cleanup:**
| Variable | Description |
|----------|-------------|
| `NODE_CLEANUP_ENABLED` | Enable automatic cleanup of inactive nodes (default: true) |
| `NODE_CLEANUP_DAYS` | Remove nodes not seen for this many days (default: 7) |
When enabled, the collector automatically removes nodes whose `last_seen` is older than the configured number of days. Nodes with `last_seen=NULL` (never seen on the network) are **NOT** removed, so nodes created via tag import that have never appeared on the mesh are preserved.
**Note:** Both event data and node cleanup run on the same schedule (`DATA_RETENTION_INTERVAL_HOURS`).
Manual cleanup can be triggered at any time with:
```bash
# Dry run to see what would be deleted
meshcore-hub collector cleanup --retention-days 30 --dry-run
# Live cleanup
meshcore-hub collector cleanup --retention-days 30
```
Webhook payload structure:
```json
{
@@ -551,6 +624,20 @@ On startup, the receiver performs these initialization steps:
1. Set device clock to current Unix timestamp
2. Send a local (non-flood) advertisement
3. Start automatic message fetching
4. Sync the device's contact database
### Contact Sync Behavior
The receiver syncs the device's contact database in two scenarios:
1. **Startup**: Initial sync when receiver starts
2. **Advertisement Events**: Automatic sync triggered whenever an advertisement is received from the mesh
Since advertisements are typically received every ~20 minutes, contact sync happens automatically without manual intervention. Each contact from the device is published individually to MQTT:
- Topic: `{prefix}/{device_public_key}/event/contact`
- Payload: `{public_key, adv_name, type}`
This ensures the collector's database stays current with all nodes discovered on the mesh network.
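A sketch of how the topic and payload for one synced contact could be assembled, using the topic structure and payload fields stated above (the helper name and the `contact` dict shape are assumptions about the receiver's internals):

```python
import json


def contact_event(prefix: str, device_public_key: str,
                  contact: dict) -> tuple[str, str]:
    """Build the MQTT topic and JSON payload for one synced contact."""
    # Topic: {prefix}/{device_public_key}/event/contact
    topic = f"{prefix}/{device_public_key}/event/contact"
    # Payload: {public_key, adv_name, type}
    payload = json.dumps({
        "public_key": contact["public_key"],
        "adv_name": contact["adv_name"],
        "type": contact["type"],
    })
    return topic, payload
```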
## References


@@ -28,8 +28,12 @@ COPY src/ ./src/
COPY alembic/ ./alembic/
COPY alembic.ini ./
# Build argument for version (set via CI or manually)
ARG BUILD_VERSION=dev
# Set version in _version.py and install the package
RUN sed -i "s|__version__ = \"dev\"|__version__ = \"${BUILD_VERSION}\"|" src/meshcore_hub/_version.py && \
pip install --upgrade pip && \
pip install .
# =============================================================================


@@ -481,6 +481,7 @@ ${DATA_HOME}/
| INTERFACE_MODE | RECEIVER | RECEIVER or SENDER |
| SERIAL_PORT | /dev/ttyUSB0 | Serial port path |
| SERIAL_BAUD | 115200 | Baud rate |
| MESHCORE_DEVICE_NAME | *(none)* | Device/node name set on startup |
| MOCK_DEVICE | false | Use mock device |
### Collector

README.md

@@ -2,6 +2,8 @@
Python 3.11+ platform for managing and orchestrating MeshCore mesh networks.
![MeshCore Hub Web Dashboard](docs/images/web.png)
## Overview
MeshCore Hub provides a complete solution for monitoring, collecting, and interacting with MeshCore mesh networks. It consists of multiple components that work together:
@@ -62,49 +64,140 @@ MeshCore Hub provides a complete solution for monitoring, collecting, and intera
- **Web Dashboard**: Visualize network status, node locations, and message history
- **Docker Ready**: Single image with all components, easy deployment
## Getting Started
### Simple Self-Hosted Setup
The quickest way to get started is running the entire stack on a single machine with a connected MeshCore device.
**Prerequisites:**
1. Flash the [USB Companion firmware](https://meshcore.dev/) onto a compatible device (e.g., Heltec V3, T-Beam)
2. Connect the device via USB to a machine that supports Docker or Python
**Steps:**
```bash
# Clone the repository
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub
# Copy and configure environment
cp .env.example .env
# Edit .env: set SERIAL_PORT to your device (e.g., /dev/ttyUSB0 or /dev/ttyACM0)
# Start the entire stack with local MQTT broker
docker compose --profile mqtt --profile core --profile receiver up -d
# View the web dashboard
open http://localhost:8080
```
This starts all services: MQTT broker, collector, API, web dashboard, and the interface receiver that bridges your MeshCore device to the system.
### Distributed Community Setup
For larger deployments, you can separate receiver nodes from the central infrastructure. This allows multiple community members to contribute receiver coverage while hosting the backend centrally.
```
┌─────────────────────────────────────────────────────────────────────┐
│ Community Members │
│ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
│ │ Raspberry Pi │ │ Raspberry Pi │ │ Any Linux │ │
│ │ + MeshCore │ │ + MeshCore │ │ + MeshCore │ │
│ │ Device │ │ Device │ │ Device │ │
│ └──────┬───────┘ └──────┬───────┘ └──────┬───────┘ │
│ │ │ │ │
│ │ receiver profile only │ │
│ └──────────────────┼──────────────────┘ │
│ │ │
│ MQTT (port 1883) │
│ │ │
└────────────────────────────┼─────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────────────────┐
│ Community VPS / Server │
│ │
│ ┌──────────┐ ┌───────────┐ ┌─────────┐ ┌──────────────┐ │
│ │ MQTT │──▶│ Collector │──▶│ API │◀──│ Web Dashboard│ │
│ │ Broker │ │ │ │ │ │ (public) │ │
│ └──────────┘ └───────────┘ └─────────┘ └──────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────┘
```
**On each receiver node (Raspberry Pi, etc.):**
```bash
# Only run the receiver component
# Configure .env with MQTT_HOST pointing to your central server
MQTT_HOST=your-community-server.com
SERIAL_PORT=/dev/ttyUSB0
docker compose --profile receiver up -d
```
**On the central server (VPS/cloud):**
```bash
# Run the core infrastructure with local MQTT broker
docker compose --profile mqtt --profile core up -d
# Or connect to an existing MQTT broker (set MQTT_HOST in .env)
docker compose --profile core up -d
```
This architecture allows:
- Multiple receivers for better RF coverage across a geographic area
- Centralized data storage and web interface
- Community members to contribute coverage with minimal setup
- The central server to be hosted anywhere with internet access
## Quick Start
### Using Docker Compose (Recommended)
Docker Compose uses **profiles** to select which services to run:
| Profile | Services | Use Case |
|---------|----------|----------|
| `core` | collector, api, web | Central server infrastructure |
| `receiver` | interface-receiver | Receiver node (events to MQTT) |
| `sender` | interface-sender | Sender node (MQTT to device) |
| `mqtt` | mosquitto broker | Local MQTT broker (optional) |
| `mock` | interface-mock-receiver | Testing without hardware |
| `migrate` | db-migrate | One-time database migration |
| `seed` | seed | One-time seed data import |
**Note:** Most deployments connect to an external MQTT broker. Add `--profile mqtt` only if you need a local broker.
```bash
# Clone the repository
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub
# Copy and configure environment
cp .env.example .env
# Edit .env with your settings (API keys, serial port, network info)
# Create database schema
docker compose --profile migrate run --rm db-migrate
# Seed the database
docker compose --profile seed run --rm seed
# Start core services with local MQTT broker
docker compose --profile mqtt --profile core up -d
# Or connect to external MQTT (configure MQTT_HOST in .env)
docker compose --profile core up -d
# Start just the receiver (connects to MQTT_HOST from .env)
docker compose --profile receiver up -d
# View logs
docker compose logs -f
# Stop services
docker compose down
```
#### Serial Device Access
@@ -144,6 +237,57 @@ meshcore-hub api
meshcore-hub web
```
## Updating an Existing Installation
To update MeshCore Hub to the latest version:
```bash
# Navigate to your installation directory
cd meshcore-hub
# Pull the latest code
git pull
# Pull latest Docker images
docker compose --profile all pull
# Recreate and restart services
# For receiver/sender only installs:
docker compose --profile receiver up -d --force-recreate
# For core services with MQTT:
docker compose --profile mqtt --profile core up -d --force-recreate
# For core services without local MQTT:
docker compose --profile core up -d --force-recreate
# For complete stack (all services):
docker compose --profile mqtt --profile core --profile receiver up -d --force-recreate
# View logs to verify update
docker compose logs -f
```
**Note:** Database migrations run automatically on collector startup, so no manual migration step is needed when using Docker.
For manual installations:
```bash
# Pull latest code
git pull
# Activate virtual environment
source .venv/bin/activate
# Update dependencies
pip install -e ".[dev]"
# Run database migrations
meshcore-hub db upgrade
# Restart your services
```
## Configuration
All components are configured via environment variables. Create a `.env` file or export variables:
@@ -164,17 +308,19 @@ All components are configured via environment variables. Create a `.env` file or
| `INTERFACE_MODE` | `RECEIVER` | Operating mode (RECEIVER or SENDER) |
| `SERIAL_PORT` | `/dev/ttyUSB0` | Serial port for MeshCore device |
| `SERIAL_BAUD` | `115200` | Serial baud rate |
| `MESHCORE_DEVICE_NAME` | *(none)* | Device/node name set on startup (broadcast in advertisements) |
| `MOCK_DEVICE` | `false` | Use mock device for testing |
### Collector Settings
| Variable | Default | Description |
|----------|---------|-------------|
| `DATABASE_URL` | `sqlite:///{data_home}/collector/meshcore.db` | SQLAlchemy database URL |
| `SEED_HOME` | `./seed` | Directory containing seed data files (node_tags.yaml, members.yaml) |
#### Webhook Configuration
The collector can forward certain events to external HTTP endpoints:
| Variable | Default | Description |
|----------|---------|-------------|
@@ -216,7 +362,6 @@ Webhook payload format:
| `NETWORK_NAME` | `MeshCore Network` | Display name for the network |
| `NETWORK_CITY` | *(none)* | City where network is located |
| `NETWORK_COUNTRY` | *(none)* | Country code (ISO 3166-1 alpha-2) |
| `NETWORK_LOCATION` | *(none)* | Center coordinates (lat,lon) |
## CLI Reference
@@ -226,13 +371,16 @@ meshcore-hub --help
# Interface component
meshcore-hub interface --mode receiver --port /dev/ttyUSB0
meshcore-hub interface --mode receiver --device-name "Gateway Node" # Set device name
meshcore-hub interface --mode sender --mock # Use mock device
# Collector component
meshcore-hub collector # Run collector (auto-seeds on startup)
meshcore-hub collector seed # Import all seed data from SEED_HOME
meshcore-hub collector import-tags # Import node tags from SEED_HOME/node_tags.yaml
meshcore-hub collector import-tags /path/to/file.yaml # Import from specific file
meshcore-hub collector import-members # Import members from SEED_HOME/members.yaml
meshcore-hub collector import-members /path/to/file.yaml # Import from specific file
# API component
meshcore-hub api --host 0.0.0.0 --port 8000
@@ -246,74 +394,124 @@ meshcore-hub db downgrade # Rollback one migration
meshcore-hub db current # Show current revision
```
## Seed Data
The collector supports seeding the database with node tags and network members on startup. Seed files are read from the `SEED_HOME` directory (default: `./seed`).
### Automatic Seeding
When the collector starts, it automatically imports seed data from YAML files if they exist:
- `{SEED_HOME}/node_tags.yaml` - Node tag definitions
- `{SEED_HOME}/members.yaml` - Network member definitions
### Manual Seeding
```bash
# Native CLI
meshcore-hub collector seed
# With Docker Compose
docker compose --profile seed up
```
### Directory Structure
```
seed/ # SEED_HOME (seed data files)
├── node_tags.yaml # Node tags for import
└── members.yaml # Network members for import
data/ # DATA_HOME (runtime data)
└── collector/
    └── meshcore.db     # SQLite database
```
Example seed files are provided in `example/seed/`.
## Node Tags
Node tags allow you to attach custom metadata to nodes (e.g., location, role, owner). Tags are stored in the database and returned with node data via the API.
### Node Tags YAML Format
Tags are keyed by public key in YAML format:
```yaml
# Each key is a 64-character hex public key
0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef:
friendly_name: Gateway Node
role: gateway
lat: 37.7749
lon: -122.4194
is_online: true
fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210:
friendly_name: Oakland Repeater
altitude: 150
location:
value: "37.8044,-122.2712"
type: coordinate
```
Tag values can be:
- **YAML primitives** (auto-detected type): strings, numbers, booleans
- **Explicit type** (for special types like coordinate):
```yaml
location:
value: "37.7749,-122.4194"
type: coordinate
```
Supported types: `string`, `number`, `boolean`, `coordinate`
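For the `coordinate` type, the stored value is a `"lat,lon"` string, as in the example above. A hedged sketch of parsing such a value back into floats (the helper name and range check are illustrative, not the importer's actual code):

```python
def parse_coordinate(value: str) -> tuple[float, float]:
    """Split a "lat,lon" tag value into (lat, lon) floats."""
    lat_str, lon_str = value.split(",", 1)
    lat, lon = float(lat_str), float(lon_str)
    # Reject values outside valid geographic bounds.
    if not (-90 <= lat <= 90 and -180 <= lon <= 180):
        raise ValueError(f"coordinate out of range: {value}")
    return lat, lon


print(parse_coordinate("37.8044,-122.2712"))  # (37.8044, -122.2712)
```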
### Import Tags Manually
```bash
# Import from default location ({SEED_HOME}/node_tags.yaml)
meshcore-hub collector import-tags
# Import from specific file
meshcore-hub collector import-tags /path/to/node_tags.yaml
# Skip tags for nodes that don't exist
meshcore-hub collector import-tags --no-create-nodes
```
## Network Members
Network members represent the people operating nodes in your network. Members can optionally be linked to nodes via their public key.
### Members YAML Format
```yaml
members:
- name: John Doe
callsign: N0CALL
role: Network Operator
description: Example member entry
contact: john@example.com
public_key: 0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef
```
| Field | Required | Description |
|-------|----------|-------------|
| `name` | Yes | Member's display name |
| `callsign` | No | Amateur radio callsign |
| `role` | No | Member's role in the network |
| `description` | No | Additional description |
| `contact` | No | Contact information |
| `public_key` | No | Associated node public key (64-char hex) |
### Import Members Manually
```bash
# Import from default location ({SEED_HOME}/members.yaml)
meshcore-hub collector import-members
# Import from specific file
meshcore-hub collector import-members /path/to/members.yaml
```
### Managing Tags via API
Tags can also be managed via the REST API:
@@ -385,7 +583,7 @@ curl -X POST \
| GET | `/api/v1/trace-paths` | List trace paths |
| POST | `/api/v1/commands/send-message` | Send direct message |
| POST | `/api/v1/commands/send-channel-message` | Send channel message |
| GET | `/api/v1/dashboard/stats` | Get network statistics |
## Development
@@ -393,7 +591,7 @@ curl -X POST \
```bash
# Clone and setup
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub
python -m venv .venv
source .venv/bin/activate
@@ -459,11 +657,13 @@ meshcore-hub/
├── alembic/ # Database migrations
├── etc/ # Configuration files (mosquitto.conf)
├── example/ # Example files for testing
│   └── seed/                    # Example seed data files
│       ├── node_tags.yaml       # Example node tags
│       └── members.yaml         # Example network members
├── seed/                        # Seed data directory (SEED_HOME, copy from example/seed/)
├── data/                        # Runtime data directory (DATA_HOME, created at runtime)
├── Dockerfile                   # Docker build configuration
├── docker-compose.yml           # Docker Compose services
├── PROMPT.md # Project specification
├── SCHEMAS.md # Event schema documentation
├── PLAN.md # Implementation plan
@@ -491,7 +691,7 @@ meshcore-hub/
## License
This project is licensed under the GNU General Public License v3.0 or later (GPL-3.0-or-later). See [LICENSE](LICENSE) for details.
## Acknowledgments


@@ -0,0 +1,114 @@
"""Add member_nodes association table
Revision ID: 002
Revises: 001
Create Date: 2024-12-05
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "002"
down_revision: Union[str, None] = "001"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Create member_nodes table
op.create_table(
"member_nodes",
sa.Column("id", sa.String(), nullable=False),
sa.Column("member_id", sa.String(36), nullable=False),
sa.Column("public_key", sa.String(64), nullable=False),
sa.Column("node_role", sa.String(50), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.ForeignKeyConstraint(["member_id"], ["members.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
)
op.create_index("ix_member_nodes_member_id", "member_nodes", ["member_id"])
op.create_index("ix_member_nodes_public_key", "member_nodes", ["public_key"])
op.create_index(
"ix_member_nodes_member_public_key",
"member_nodes",
["member_id", "public_key"],
)
# Migrate existing public_key data from members to member_nodes
# Get all members with a public_key
connection = op.get_bind()
members_with_keys = connection.execute(
sa.text("SELECT id, public_key FROM members WHERE public_key IS NOT NULL")
).fetchall()
# Insert into member_nodes
for member_id, public_key in members_with_keys:
# Generate a UUID for the new row
import uuid
node_id = str(uuid.uuid4())
connection.execute(
sa.text(
"""
INSERT INTO member_nodes (id, member_id, public_key, created_at, updated_at)
VALUES (:id, :member_id, :public_key, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP)
"""
),
{"id": node_id, "member_id": member_id, "public_key": public_key},
)
# Drop the public_key column from members
op.drop_index("ix_members_public_key", table_name="members")
op.drop_column("members", "public_key")
def downgrade() -> None:
# Add public_key column back to members
op.add_column(
"members",
sa.Column("public_key", sa.String(64), nullable=True),
)
op.create_index("ix_members_public_key", "members", ["public_key"])
# Migrate data back - take the first node for each member
connection = op.get_bind()
member_nodes = connection.execute(
sa.text(
"""
SELECT DISTINCT member_id, public_key
FROM member_nodes
WHERE (member_id, created_at) IN (
SELECT member_id, MIN(created_at)
FROM member_nodes
GROUP BY member_id
)
"""
)
).fetchall()
for member_id, public_key in member_nodes:
connection.execute(
sa.text(
"UPDATE members SET public_key = :public_key WHERE id = :member_id"
),
{"public_key": public_key, "member_id": member_id},
)
# Drop member_nodes table
op.drop_table("member_nodes")


@@ -0,0 +1,67 @@
"""Add event_hash column to event tables for deduplication
Revision ID: 003
Revises: 002
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "003"
down_revision: Union[str, None] = "002"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Add event_hash column to messages table
op.add_column(
"messages",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_messages_event_hash", "messages", ["event_hash"])
# Add event_hash column to advertisements table
op.add_column(
"advertisements",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_advertisements_event_hash", "advertisements", ["event_hash"])
# Add event_hash column to trace_paths table
op.add_column(
"trace_paths",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_trace_paths_event_hash", "trace_paths", ["event_hash"])
# Add event_hash column to telemetry table
op.add_column(
"telemetry",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_telemetry_event_hash", "telemetry", ["event_hash"])
def downgrade() -> None:
# Remove event_hash from telemetry
op.drop_index("ix_telemetry_event_hash", table_name="telemetry")
op.drop_column("telemetry", "event_hash")
# Remove event_hash from trace_paths
op.drop_index("ix_trace_paths_event_hash", table_name="trace_paths")
op.drop_column("trace_paths", "event_hash")
# Remove event_hash from advertisements
op.drop_index("ix_advertisements_event_hash", table_name="advertisements")
op.drop_column("advertisements", "event_hash")
# Remove event_hash from messages
op.drop_index("ix_messages_event_hash", table_name="messages")
op.drop_column("messages", "event_hash")
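The `event_hash` columns above are 32 characters wide, which suggests an MD5-style hex digest over a canonicalised event payload. The sketch below is an assumption about how such a hash could be derived for deduplication, not the collector's actual algorithm:

```python
import hashlib
import json


def event_hash(event: dict) -> str:
    """Deterministic 32-char hex digest for an event payload.

    Sorting keys and stripping whitespace makes the digest independent of
    dict ordering, so the same event from two receivers hashes identically.
    """
    canonical = json.dumps(event, sort_keys=True, separators=(",", ":"))
    return hashlib.md5(canonical.encode()).hexdigest()  # 32 hex chars
```

Combined with the unique indexes added in revision 005, a duplicate insert with the same hash fails at the database level, closing the race between concurrent receivers.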


@@ -0,0 +1,63 @@
"""Add event_receivers junction table for multi-receiver tracking
Revision ID: 004
Revises: 003
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "004"
down_revision: Union[str, None] = "003"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_table(
"event_receivers",
sa.Column("id", sa.String(36), primary_key=True),
sa.Column("event_type", sa.String(20), nullable=False),
sa.Column("event_hash", sa.String(32), nullable=False),
sa.Column(
"receiver_node_id",
sa.String(36),
sa.ForeignKey("nodes.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("snr", sa.Float, nullable=True),
sa.Column("received_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
sa.UniqueConstraint(
"event_hash", "receiver_node_id", name="uq_event_receivers_hash_node"
),
)
op.create_index(
"ix_event_receivers_event_hash",
"event_receivers",
["event_hash"],
)
op.create_index(
"ix_event_receivers_receiver_node_id",
"event_receivers",
["receiver_node_id"],
)
op.create_index(
"ix_event_receivers_type_hash",
"event_receivers",
["event_type", "event_hash"],
)
def downgrade() -> None:
op.drop_index("ix_event_receivers_type_hash", table_name="event_receivers")
op.drop_index("ix_event_receivers_receiver_node_id", table_name="event_receivers")
op.drop_index("ix_event_receivers_event_hash", table_name="event_receivers")
op.drop_table("event_receivers")


@@ -0,0 +1,126 @@
"""Make event_hash columns unique for race condition prevention
Revision ID: 005
Revises: 004
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
from sqlalchemy import inspect
# revision identifiers, used by Alembic.
revision: str = "005"
down_revision: Union[str, None] = "004"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def _index_exists(table_name: str, index_name: str) -> bool:
"""Check if an index exists on a table."""
bind = op.get_bind()
inspector = inspect(bind)
indexes = inspector.get_indexes(table_name)
return any(idx["name"] == index_name for idx in indexes)
def _has_unique_on_column(table_name: str, column_name: str) -> bool:
"""Check if a unique constraint or unique index exists on a column."""
bind = op.get_bind()
inspector = inspect(bind)
# Check unique constraints
uniques = inspector.get_unique_constraints(table_name)
for uq in uniques:
if column_name in uq.get("column_names", []):
return True
# Also check indexes (SQLite may create unique index instead of constraint)
indexes = inspector.get_indexes(table_name)
for idx in indexes:
if idx.get("unique") and column_name in idx.get("column_names", []):
return True
return False
def upgrade() -> None:
# Convert non-unique indexes to unique indexes for race condition prevention
# Note: SQLite handles NULL values as unique (each NULL is distinct)
# SQLite doesn't support ALTER TABLE ADD CONSTRAINT, so we use unique indexes
# Messages
if _index_exists("messages", "ix_messages_event_hash"):
op.drop_index("ix_messages_event_hash", table_name="messages")
if not _has_unique_on_column("messages", "event_hash"):
op.create_index(
"ix_messages_event_hash_unique",
"messages",
["event_hash"],
unique=True,
)
# Advertisements
if _index_exists("advertisements", "ix_advertisements_event_hash"):
op.drop_index("ix_advertisements_event_hash", table_name="advertisements")
if not _has_unique_on_column("advertisements", "event_hash"):
op.create_index(
"ix_advertisements_event_hash_unique",
"advertisements",
["event_hash"],
unique=True,
)
# Trace paths
if _index_exists("trace_paths", "ix_trace_paths_event_hash"):
op.drop_index("ix_trace_paths_event_hash", table_name="trace_paths")
if not _has_unique_on_column("trace_paths", "event_hash"):
op.create_index(
"ix_trace_paths_event_hash_unique",
"trace_paths",
["event_hash"],
unique=True,
)
# Telemetry
if _index_exists("telemetry", "ix_telemetry_event_hash"):
op.drop_index("ix_telemetry_event_hash", table_name="telemetry")
if not _has_unique_on_column("telemetry", "event_hash"):
op.create_index(
"ix_telemetry_event_hash_unique",
"telemetry",
["event_hash"],
unique=True,
)
def downgrade() -> None:
# Restore non-unique indexes
# Telemetry
if _index_exists("telemetry", "ix_telemetry_event_hash_unique"):
op.drop_index("ix_telemetry_event_hash_unique", table_name="telemetry")
if not _index_exists("telemetry", "ix_telemetry_event_hash"):
op.create_index("ix_telemetry_event_hash", "telemetry", ["event_hash"])
# Trace paths
if _index_exists("trace_paths", "ix_trace_paths_event_hash_unique"):
op.drop_index("ix_trace_paths_event_hash_unique", table_name="trace_paths")
if not _index_exists("trace_paths", "ix_trace_paths_event_hash"):
op.create_index("ix_trace_paths_event_hash", "trace_paths", ["event_hash"])
# Advertisements
if _index_exists("advertisements", "ix_advertisements_event_hash_unique"):
op.drop_index(
"ix_advertisements_event_hash_unique", table_name="advertisements"
)
if not _index_exists("advertisements", "ix_advertisements_event_hash"):
op.create_index(
"ix_advertisements_event_hash", "advertisements", ["event_hash"]
)
# Messages
if _index_exists("messages", "ix_messages_event_hash_unique"):
op.drop_index("ix_messages_event_hash_unique", table_name="messages")
if not _index_exists("messages", "ix_messages_event_hash"):
op.create_index("ix_messages_event_hash", "messages", ["event_hash"])


@@ -0,0 +1,39 @@
"""Make Node.last_seen nullable
Revision ID: 0b944542ccd8
Revises: 005
Create Date: 2025-12-08 00:07:49.891245+00:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "0b944542ccd8"
down_revision: Union[str, None] = "005"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
# Make Node.last_seen nullable since nodes from contact sync
# haven't actually been "seen" on the mesh yet
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.alter_column("last_seen", existing_type=sa.DATETIME(), nullable=True)
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
# Revert Node.last_seen to non-nullable
# Note: This will fail if there are NULL values in last_seen
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.alter_column("last_seen", existing_type=sa.DATETIME(), nullable=False)
# ### end Alembic commands ###


@@ -0,0 +1,111 @@
"""Add member_id field to members table
Revision ID: 03b9b2451bd9
Revises: 0b944542ccd8
Create Date: 2025-12-08 14:34:30.337799+00:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "03b9b2451bd9"
down_revision: Union[str, None] = "0b944542ccd8"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("advertisements", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_advertisements_event_hash_unique"))
batch_op.create_unique_constraint(
"uq_advertisements_event_hash", ["event_hash"]
)
with op.batch_alter_table("members", schema=None) as batch_op:
# Add member_id as nullable first to handle existing data
batch_op.add_column(
sa.Column("member_id", sa.String(length=100), nullable=True)
)
# Generate member_id for existing members based on their name
# Convert name to lowercase and replace spaces with underscores
connection = op.get_bind()
connection.execute(
sa.text(
"UPDATE members SET member_id = LOWER(REPLACE(name, ' ', '_')) WHERE member_id IS NULL"
)
)
with op.batch_alter_table("members", schema=None) as batch_op:
# Now make it non-nullable and add unique index
batch_op.alter_column("member_id", nullable=False)
batch_op.drop_index(batch_op.f("ix_members_name"))
batch_op.create_index(
batch_op.f("ix_members_member_id"), ["member_id"], unique=True
)
with op.batch_alter_table("messages", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_messages_event_hash_unique"))
batch_op.create_unique_constraint("uq_messages_event_hash", ["event_hash"])
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_nodes_public_key"))
batch_op.create_index(
batch_op.f("ix_nodes_public_key"), ["public_key"], unique=True
)
with op.batch_alter_table("telemetry", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_telemetry_event_hash_unique"))
batch_op.create_unique_constraint("uq_telemetry_event_hash", ["event_hash"])
with op.batch_alter_table("trace_paths", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_trace_paths_event_hash_unique"))
batch_op.create_unique_constraint("uq_trace_paths_event_hash", ["event_hash"])
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("trace_paths", schema=None) as batch_op:
batch_op.drop_constraint("uq_trace_paths_event_hash", type_="unique")
batch_op.create_index(
batch_op.f("ix_trace_paths_event_hash_unique"), ["event_hash"], unique=1
)
with op.batch_alter_table("telemetry", schema=None) as batch_op:
batch_op.drop_constraint("uq_telemetry_event_hash", type_="unique")
batch_op.create_index(
batch_op.f("ix_telemetry_event_hash_unique"), ["event_hash"], unique=1
)
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_nodes_public_key"))
batch_op.create_index(
batch_op.f("ix_nodes_public_key"), ["public_key"], unique=False
)
with op.batch_alter_table("messages", schema=None) as batch_op:
batch_op.drop_constraint("uq_messages_event_hash", type_="unique")
batch_op.create_index(
batch_op.f("ix_messages_event_hash_unique"), ["event_hash"], unique=1
)
with op.batch_alter_table("members", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_members_member_id"))
batch_op.create_index(batch_op.f("ix_members_name"), ["name"], unique=False)
batch_op.drop_column("member_id")
with op.batch_alter_table("advertisements", schema=None) as batch_op:
batch_op.drop_constraint("uq_advertisements_event_hash", type_="unique")
batch_op.create_index(
batch_op.f("ix_advertisements_event_hash_unique"), ["event_hash"], unique=1
)
# ### end Alembic commands ###
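
The `member_id` backfill expression in the upgrade above can be checked in isolation with `sqlite3` (the sample names are made up; the `UPDATE` is the same one the migration runs):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (name TEXT, member_id TEXT)")
conn.executemany(
    "INSERT INTO members (name) VALUES (?)",
    [("Example Member",), ("Simple Member",)],
)
# Same backfill as the migration: lowercase the name, replace spaces with underscores
conn.execute(
    "UPDATE members SET member_id = LOWER(REPLACE(name, ' ', '_')) "
    "WHERE member_id IS NULL"
)
rows = dict(conn.execute("SELECT name, member_id FROM members"))
```

This only guarantees uniqueness if member names were already unique, which the old `ix_members_name` index did not enforce; the new unique `ix_members_member_id` index will reject the migration if two names collapse to the same slug.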


@@ -0,0 +1,57 @@
"""Remove member_nodes table
Revision ID: aa1162502616
Revises: 03b9b2451bd9
Create Date: 2025-12-08 15:04:37.260923+00:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "aa1162502616"
down_revision: Union[str, None] = "03b9b2451bd9"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Drop the member_nodes table
# Nodes are now associated with members via a 'member_id' tag on the node
op.drop_table("member_nodes")
def downgrade() -> None:
# Recreate the member_nodes table if needed for rollback
op.create_table(
"member_nodes",
sa.Column("id", sa.String(length=36), nullable=False),
sa.Column("member_id", sa.String(length=36), nullable=False),
sa.Column("public_key", sa.String(length=64), nullable=False),
sa.Column("node_role", sa.String(length=50), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
sa.ForeignKeyConstraint(
["member_id"],
["members.id"],
name=op.f("fk_member_nodes_member_id_members"),
ondelete="CASCADE",
),
sa.PrimaryKeyConstraint("id", name=op.f("pk_member_nodes")),
)
op.create_index(
op.f("ix_member_nodes_member_id"), "member_nodes", ["member_id"], unique=False
)
op.create_index(
op.f("ix_member_nodes_public_key"), "member_nodes", ["public_key"], unique=False
)
op.create_index(
"ix_member_nodes_member_public_key",
"member_nodes",
["member_id", "public_key"],
unique=False,
)


@@ -1,10 +0,0 @@
{
"members": [
{
"name": "Louis",
"callsign": "Louis",
"role": "admin",
"description": "IPNet Founder"
}
]
}


@@ -1,613 +0,0 @@
{
"2337484665ced7e210007e9fd9db98ced0a24a6eab8b4cbe3a06b3a1cea33ca1": {
"friendly_name": "IP2 Repeater 1",
"node_id": "ip2-rep01.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0357627,1.132079",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "31",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"8cb01fff1afc099055af418ce5fc5e60384df9ff763c25dd7e6a5e0922e8df90": {
"friendly_name": "IP2 Repeater 2",
"node_id": "ip2-rep02.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0390682,1.1304141",
"type": "coordinate"
},
"location_description": "Belstead Road",
"hardware": "Heltec V3",
"antenna": "McGill 6dBi Omni",
"elevation": {
"value": "44",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"5b565df747913358e24d890b2227de9c35d09763746b6ec326c15ebbf9b8be3b": {
"friendly_name": "IP2 Repeater 3",
"node_id": "ip2-rep03.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.046356,1.134661",
"type": "coordinate"
},
"location_description": "Birkfield Drive",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "52",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"780d0939f90b22d3bd7cbedcaf4e8d468a12c01886ab24b8cfa11eab2f5516c5": {
"friendly_name": "IP2 Integration 1",
"node_id": "ip2-int01.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0354539,1.1295338",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "25",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"30121dc60362c633c457ffa18f49b3e1d6823402c33709f32d7df70612250b96": {
"friendly_name": "MeshBot",
"node_id": "bot.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0354539,1.1295338",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "25",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"9135986b83815ada92883358435cc6528c7db60cb647f9b6547739a1ce5eb1c8": {
"friendly_name": "IP3 Repeater 1",
"node_id": "ip3-rep01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.045803,1.204416",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "42",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"e334ec5475789d542ed9e692fbeef7444a371fcc05adcbda1f47ba6a3191b459": {
"friendly_name": "IP3 Repeater 2",
"node_id": "ip3-rep02.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.03297,1.17543",
"type": "coordinate"
},
"location_description": "Morland Road Allotments",
"hardware": "Heltec T114",
"antenna": "Unknown",
"elevation": {
"value": "39",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"cc15fb33e98f2e098a543f516f770dc3061a1a6b30f79b84780663bf68ae6b53": {
"friendly_name": "IP3 Repeater 3",
"node_id": "ip3-rep03.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.04499,1.18149",
"type": "coordinate"
},
"location_description": "Hatfield Road",
"hardware": "Heltec V3",
"antenna": "Unknown",
"elevation": {
"value": "39",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"22309435fbd9dd1f14870a1895dc854779f6b2af72b08542f6105d264a493ebe": {
"friendly_name": "IP3 Integration 1",
"node_id": "ip3-int01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.045773,1.212808",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Heltec V3",
"antenna": "Generic 3dBi Whip",
"elevation": {
"value": "37",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"2a4f89e766dfa1758e35a69962c1f6d352b206a5e3562a589155a3ebfe7fc2bb": {
"friendly_name": "IP3 Repeater 4",
"node_id": "ip3-rep04.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.046383,1.174542",
"type": "coordinate"
},
"location_description": "Holywells",
"hardware": "Sensecap Solar",
"antenna": "Paradar 6.5dbi Omni",
"elevation": {
"value": "21",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"e790b73b2d6e377dd0f575c847f3ef42232f610eb9a19af57083fc4f647309ac": {
"friendly_name": "IP3 Repeater 5",
"node_id": "ip3-rep05.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.05252,1.17034",
"type": "coordinate"
},
"location_description": "Back Hamlet",
"hardware": "Heltec T114",
"antenna": "Paradar 6.5dBi Omni",
"elevation": {
"value": "38",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"20ed75ffc0f9777951716bb3d308d7f041fd2ad32fe2e998e600d0361e1fe2ac": {
"friendly_name": "IP3 Repeater 6",
"node_id": "ip3-rep06.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.04893,1.18965",
"type": "coordinate"
},
"location_description": "Dover Road",
"hardware": "Unknown",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "38",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"bd7b5ac75f660675b39f368e1dbb6d1dbcefd8bd7a170e21a942954f67c8bf52": {
"friendly_name": "IP8 Repeater 1",
"node_id": "rep01.ip8.ipnt.uk",
"member_id": "walshie86",
"area": "IP8",
"location": {
"value": "52.033684,1.118384",
"type": "coordinate"
},
"location_description": "Grove Hill",
"hardware": "Heltec V3",
"antenna": "McGill 3dBi Omni",
"elevation": {
"value": "13",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"9cf300c40112ea34d0a59858270948b27ab6cd87e840de338f3ca782c17537b2": {
"friendly_name": "IP8 Repeater 2",
"node_id": "rep02.ip8.ipnt.uk",
"member_id": "walshie86",
"area": "IP8",
"location": {
"value": "52.035648,1.073271",
"type": "coordinate"
},
"location_description": "Washbrook",
"hardware": "Sensecap Solar",
"elevation": {
"value": "13",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"d3c20d962f7384c111fbafad6fbc1c1dc0e5c3ce802fb3ee11020e8d8207ed3a": {
"friendly_name": "IP4 Repeater 1",
"node_id": "ip4-rep01.ipnt.uk",
"member_id": "markab",
"area": "IP4",
"location": {
"value": "52.052445,1.156882",
"type": "coordinate"
},
"location_description": "Wine Rack",
"hardware": "Heltec T114",
"antenna": "Generic 5dbi Whip",
"elevation": {
"value": "50",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"b00ce9d218203e96d8557a4d59e06f5de59bbc4dcc4df9c870079d2cb8b5bd80": {
"friendly_name": "IP4 Repeater 2",
"node_id": "ip4-rep02.ipnt.uk",
"member_id": "markab",
"area": "IP4",
"location": {
"value": "52.06217,1.18332",
"type": "coordinate"
},
"location_description": "Rushmere Road",
"hardware": "Heltec V3",
"antenna": "Paradar 5dbi Whip",
"elevation": {
"value": "35",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"8accb6d0189ccaffb745ba54793e7fe3edd515edb45554325d957e48c1b9f3b3": {
"friendly_name": "IP4 Repeater 3",
"node_id": "ip4-rep03.ipnt.uk",
"member_id": "craig",
"area": "IP4",
"location": {
"value": "52.058,1.165",
"type": "coordinate"
},
"location_description": "IP4 Area",
"hardware": "Heltec v3",
"antenna": "Generic Whip",
"elevation": {
"value": "30",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"69fb8431e7ab307513797544fab99ce53ce24c46ec2d3a11767fe70f2ca37b23": {
"friendly_name": "IP3 Test Repeater 1",
"node_id": "ip3-tst01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.041869,1.204789",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Station G2",
"antenna": "McGill 10dBi Panel",
"elevation": {
"value": "37",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "true",
"type": "boolean"
},
"mesh_role": "repeater"
}
}


@@ -1,14 +0,0 @@
services:
api:
networks:
- default
- pangolin
web:
networks:
- default
- pangolin
networks:
pangolin:
external: true


@@ -1,10 +1,14 @@
services:
# ==========================================================================
# MQTT Broker - Eclipse Mosquitto
# MQTT Broker - Eclipse Mosquitto (optional, use --profile mqtt)
# Most users will connect to an external MQTT broker instead
# ==========================================================================
mqtt:
image: eclipse-mosquitto:2
container_name: meshcore-mqtt
profiles:
- all
- mqtt
restart: unless-stopped
ports:
- "${MQTT_EXTERNAL_PORT:-1883}:1883"
@@ -24,17 +28,15 @@ services:
# Interface Receiver - MeshCore device to MQTT bridge (events)
# ==========================================================================
interface-receiver:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-receiver
profiles:
- interface-receiver
- all
- receiver
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
devices:
- "${SERIAL_PORT:-/dev/ttyUSB0}:${SERIAL_PORT:-/dev/ttyUSB0}"
user: root # Required for device access
@@ -45,6 +47,7 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- SERIAL_PORT=${SERIAL_PORT:-/dev/ttyUSB0}
- SERIAL_BAUD=${SERIAL_BAUD:-115200}
- NODE_ADDRESS=${NODE_ADDRESS:-}
@@ -60,17 +63,15 @@ services:
# Interface Sender - MQTT to MeshCore device bridge (commands)
# ==========================================================================
interface-sender:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-sender
profiles:
- interface-sender
- all
- sender
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
devices:
- "${SERIAL_PORT_SENDER:-/dev/ttyUSB1}:${SERIAL_PORT_SENDER:-/dev/ttyUSB1}"
user: root # Required for device access
@@ -81,6 +82,7 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- SERIAL_PORT=${SERIAL_PORT_SENDER:-/dev/ttyUSB1}
- SERIAL_BAUD=${SERIAL_BAUD:-115200}
- NODE_ADDRESS=${NODE_ADDRESS_SENDER:-}
@@ -96,17 +98,15 @@ services:
# Interface Mock Receiver - For testing without real devices
# ==========================================================================
interface-mock-receiver:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-mock-receiver
profiles:
- all
- mock
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
environment:
- LOG_LEVEL=${LOG_LEVEL:-INFO}
- MQTT_HOST=${MQTT_HOST:-mqtt}
@@ -114,6 +114,7 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- MOCK_DEVICE=true
- NODE_ADDRESS=${NODE_ADDRESS:-0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef}
command: ["interface", "receiver", "--mock"]
@@ -128,18 +129,21 @@ services:
# Collector - MQTT subscriber and database storage
# ==========================================================================
collector:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-collector
profiles:
- all
- core
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
seed:
condition: service_completed_successfully
volumes:
# Mount data directory (contains collector/meshcore.db)
- ${DATA_HOME:-./data}:/data
- ${SEED_HOME:-./seed}:/seed
environment:
- LOG_LEVEL=${LOG_LEVEL:-INFO}
- MQTT_HOST=${MQTT_HOST:-mqtt}
@@ -147,7 +151,9 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- DATA_HOME=/data
- SEED_HOME=/seed
# Explicitly unset to use DATA_HOME-based default path
- DATABASE_URL=
# Webhook configuration
@@ -162,6 +168,12 @@ services:
- WEBHOOK_TIMEOUT=${WEBHOOK_TIMEOUT:-10.0}
- WEBHOOK_MAX_RETRIES=${WEBHOOK_MAX_RETRIES:-3}
- WEBHOOK_RETRY_BACKOFF=${WEBHOOK_RETRY_BACKOFF:-2.0}
# Data retention and cleanup configuration
- DATA_RETENTION_ENABLED=${DATA_RETENTION_ENABLED:-true}
- DATA_RETENTION_DAYS=${DATA_RETENTION_DAYS:-30}
- DATA_RETENTION_INTERVAL_HOURS=${DATA_RETENTION_INTERVAL_HOURS:-24}
- NODE_CLEANUP_ENABLED=${NODE_CLEANUP_ENABLED:-true}
- NODE_CLEANUP_DAYS=${NODE_CLEANUP_DAYS:-7}
command: ["collector"]
healthcheck:
test: ["CMD", "meshcore-hub", "health", "collector"]
@@ -174,15 +186,18 @@ services:
# API Server - REST API for querying data and sending commands
# ==========================================================================
api:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-api
profiles:
- all
- core
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
seed:
condition: service_completed_successfully
collector:
condition: service_started
ports:
@@ -197,6 +212,7 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- DATA_HOME=/data
# Explicitly unset to use DATA_HOME-based default path
- DATABASE_URL=
@@ -216,11 +232,14 @@ services:
# Web Dashboard - Web interface for network visualization
# ==========================================================================
web:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-web
profiles:
- all
- core
restart: unless-stopped
depends_on:
api:
@@ -236,10 +255,11 @@ services:
- NETWORK_NAME=${NETWORK_NAME:-MeshCore Network}
- NETWORK_CITY=${NETWORK_CITY:-}
- NETWORK_COUNTRY=${NETWORK_COUNTRY:-}
- NETWORK_LOCATION=${NETWORK_LOCATION:-}
- NETWORK_RADIO_CONFIG=${NETWORK_RADIO_CONFIG:-}
- NETWORK_CONTACT_EMAIL=${NETWORK_CONTACT_EMAIL:-}
- NETWORK_CONTACT_DISCORD=${NETWORK_CONTACT_DISCORD:-}
- NETWORK_CONTACT_GITHUB=${NETWORK_CONTACT_GITHUB:-}
- NETWORK_WELCOME_TEXT=${NETWORK_WELCOME_TEXT:-}
command: ["web"]
healthcheck:
test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8080/health')"]
@@ -252,13 +272,16 @@ services:
# Database Migrations - Run Alembic migrations
# ==========================================================================
db-migrate:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-db-migrate
profiles:
- all
- core
- migrate
restart: "no"
volumes:
# Mount data directory (uses collector/meshcore.db)
- ${DATA_HOME:-./data}:/data
@@ -272,13 +295,19 @@ services:
# Seed Data - Import node_tags.json and members.json from SEED_HOME
# ==========================================================================
seed:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-seed
profiles:
- all
- core
- seed
restart: "no"
depends_on:
db-migrate:
condition: service_completed_successfully
volumes:
# Mount data directory for database (read-write)
- ${DATA_HOME:-./data}:/data
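
The `${IMAGE_VERSION:-latest}` pattern used throughout this compose file falls back to the default whenever the variable is unset or empty. A minimal Python re-implementation of that interpolation rule (a sketch; real Compose interpolation supports more forms such as `${VAR:?err}`):

```python
import re

def interpolate(template: str, env: dict[str, str]) -> str:
    """Resolve ${VAR:-default}: use the default when VAR is unset or empty."""
    def repl(match: re.Match) -> str:
        name, default = match.group(1), match.group(2) or ""
        return env.get(name) or default
    return re.sub(r"\$\{(\w+)(?::-([^}]*))?\}", repl, template)

# Unset -> default tag; set -> pinned tag
image = interpolate("ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}", {})
pinned = interpolate(
    "ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}",
    {"IMAGE_VERSION": "v1.2.3"},
)
```

This is why an empty `IMAGE_VERSION=` line in `.env` still resolves to `latest` rather than producing an image reference with an empty tag.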

docs/images/web.png Normal file (binary image, 133 KiB)


@@ -1,10 +0,0 @@
{
"members": [
{
"name": "Example Member",
"callsign": "N0CALL",
"role": "Network Operator",
"description": "Example member entry"
}
]
}

example/seed/members.yaml Normal file

@@ -0,0 +1,14 @@
# Example members seed file
# Note: Nodes are associated with members via a 'member_id' tag on the node.
# Use node_tags.yaml to set member_id tags on nodes.
members:
- member_id: example_member
name: Example Member
callsign: N0CALL
role: Network Operator
description: Example network operator member
- member_id: simple_member
name: Simple Member
callsign: N0CALL2
role: Observer
description: Example observer member


@@ -1,16 +0,0 @@
{
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef": {
"friendly_name": "Gateway Node",
"location": {"value": "37.7749,-122.4194", "type": "coordinate"},
"lat": {"value": "37.7749", "type": "number"},
"lon": {"value": "-122.4194", "type": "number"},
"role": "gateway"
},
"fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210": {
"friendly_name": "Oakland Repeater",
"location": {"value": "37.8044,-122.2712", "type": "coordinate"},
"lat": {"value": "37.8044", "type": "number"},
"lon": {"value": "-122.2712", "type": "number"},
"altitude": {"value": "150", "type": "number"}
}
}


@@ -0,0 +1,29 @@
# Example node tags seed file
# Each key is a 64-character hex public key
#
# Tag values can be:
# - YAML primitives (auto-detected type):
# friendly_name: Gateway Node # string
# elevation: 150 # number
# is_online: true # boolean
#
# - Explicit type (for special types like coordinate):
# location:
# value: "37.7749,-122.4194"
# type: coordinate
#
# Supported types: string, number, boolean, coordinate
0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef:
friendly_name: Gateway Node
role: gateway
lat: 37.7749
lon: -122.4194
is_online: true
fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210:
friendly_name: Oakland Repeater
lat: 37.8044
lon: -122.2712
altitude: 150
is_online: false
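
The auto-detection the comments above describe might look like this in Python (function name and return shape are illustrative, not the project's actual API):

```python
def detect_tag(value):
    """Map a raw YAML seed value to a (value_string, type_name) pair.

    Sketch of the typing rules the seed format implies: bools and
    numbers are auto-detected, dicts carry an explicit type (e.g.
    coordinate), everything else falls back to string.
    """
    if isinstance(value, bool):  # check bool before int: bool is an int subclass
        return str(value).lower(), "boolean"
    if isinstance(value, (int, float)):
        return str(value), "number"
    if isinstance(value, dict):
        # Explicit {value, type} form passes through unchanged
        return str(value["value"]), value["type"]
    return str(value), "string"
```

For example, `detect_tag(150)` yields `("150", "number")`, while the explicit form `{"value": "37.7749,-122.4194", "type": "coordinate"}` keeps its declared `coordinate` type.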


@@ -4,10 +4,10 @@ build-backend = "setuptools.build_meta"
[project]
name = "meshcore-hub"
version = "0.1.0"
version = "0.0.0"
description = "Python monorepo for managing and orchestrating MeshCore mesh networks"
readme = "README.md"
license = {text = "MIT"}
license = {text = "GPL-3.0-or-later"}
requires-python = ">=3.11"
authors = [
{name = "MeshCore Hub Contributors"}
@@ -15,7 +15,7 @@ authors = [
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
@@ -39,6 +39,7 @@ dependencies = [
"httpx>=0.25.0",
"aiosqlite>=0.19.0",
"meshcore>=2.2.0",
"pyyaml>=6.0.0",
]
[project.optional-dependencies]
@@ -51,6 +52,7 @@ dev = [
"mypy>=1.5.0",
"pre-commit>=3.4.0",
"types-paho-mqtt>=1.6.0",
"types-PyYAML>=6.0.0",
]
postgres = [
"asyncpg>=0.28.0",
@@ -61,10 +63,10 @@ postgres = [
meshcore-hub = "meshcore_hub.__main__:main"
[project.urls]
-Homepage = "https://github.com/meshcore-dev/meshcore-hub"
-Documentation = "https://github.com/meshcore-dev/meshcore-hub#readme"
-Repository = "https://github.com/meshcore-dev/meshcore-hub"
-Issues = "https://github.com/meshcore-dev/meshcore-hub/issues"
+Homepage = "https://github.com/ipnet-mesh/meshcore-hub"
+Documentation = "https://github.com/ipnet-mesh/meshcore-hub#readme"
+Repository = "https://github.com/ipnet-mesh/meshcore-hub"
+Issues = "https://github.com/ipnet-mesh/meshcore-hub/issues"
[tool.setuptools.packages.find]
where = ["src"]


@@ -1,3 +1,5 @@
"""MeshCore Hub - Python monorepo for managing MeshCore mesh networks."""
-__version__ = "0.1.0"
+from meshcore_hub._version import __version__
+__all__ = ["__version__"]


@@ -174,6 +174,40 @@ def db_history() -> None:
command.history(alembic_cfg)
@db.command("stamp")
@click.option(
"--revision",
type=str,
default="head",
help="Target revision to stamp (default: head)",
)
@click.option(
"--database-url",
type=str,
default=None,
envvar="DATABASE_URL",
help="Database connection URL",
)
def db_stamp(revision: str, database_url: str | None) -> None:
"""Stamp database with revision without running migrations.
Use this to mark an existing database as up-to-date when the schema
was created before Alembic migrations were introduced.
"""
import os
from alembic import command
from alembic.config import Config
click.echo(f"Stamping database with revision: {revision}")
alembic_cfg = Config("alembic.ini")
if database_url:
os.environ["DATABASE_URL"] = database_url
command.stamp(alembic_cfg, revision)
click.echo("Database stamped successfully.")
# Health check commands for Docker HEALTHCHECK
@cli.group()
def health() -> None:


@@ -0,0 +1,8 @@
"""MeshCore Hub version information.
This file contains the version string for the package.
It can be overridden at build time by setting BUILD_VERSION environment variable.
"""
__version__ = "dev"
__all__ = ["__version__"]
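The `"dev"` default above is intended to be replaced at build time via the `BUILD_VERSION` environment variable. One plausible sketch of a CI step that stamps `_version.py` — the path and the `stamp_version` helper are illustrative, not the project's actual build script:

```python
# Illustrative build step: rewrite _version.py with the version taken
# from the BUILD_VERSION environment variable, defaulting to "dev".
import os
import pathlib
import tempfile

def stamp_version(path: pathlib.Path) -> str:
    version = os.environ.get("BUILD_VERSION", "dev")
    path.write_text(f'__version__ = "{version}"\n__all__ = ["__version__"]\n')
    return version

# Demonstrate against a temp file rather than the real module.
with tempfile.TemporaryDirectory() as tmp:
    target = pathlib.Path(tmp) / "_version.py"
    os.environ["BUILD_VERSION"] = "1.2.3"
    stamp_version(target)
    print(target.read_text().splitlines()[0])  # __version__ = "1.2.3"
```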


@@ -32,10 +32,9 @@ async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]:
# Get database URL from app state
database_url = getattr(app.state, "database_url", "sqlite:///./meshcore.db")
-# Initialize database
+# Initialize database (schema managed by Alembic migrations)
logger.info(f"Initializing database: {database_url}")
_db_manager = DatabaseManager(database_url)
-_db_manager.create_tables()
yield
@@ -53,6 +52,7 @@ def create_app(
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
cors_origins: list[str] | None = None,
) -> FastAPI:
"""Create and configure the FastAPI application.
@@ -64,6 +64,7 @@ def create_app(
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
cors_origins: Allowed CORS origins
Returns:
@@ -86,6 +87,7 @@ def create_app(
app.state.mqtt_host = mqtt_host
app.state.mqtt_port = mqtt_port
app.state.mqtt_prefix = mqtt_prefix
app.state.mqtt_tls = mqtt_tls
# Configure CORS
if cors_origins is None:


@@ -67,6 +67,13 @@ import click
envvar="MQTT_TOPIC_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
@click.option(
"--cors-origins",
type=str,
@@ -92,6 +99,7 @@ def api(
mqtt_host: str,
mqtt_port: int,
mqtt_prefix: str,
mqtt_tls: bool,
cors_origins: str | None,
reload: bool,
) -> None:
@@ -171,6 +179,7 @@ def api(
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_prefix=mqtt_prefix,
mqtt_tls=mqtt_tls,
cors_origins=origins_list,
)


@@ -57,6 +57,7 @@ def get_mqtt_client(request: Request) -> MQTTClient:
mqtt_host = getattr(request.app.state, "mqtt_host", "localhost")
mqtt_port = getattr(request.app.state, "mqtt_port", 1883)
mqtt_prefix = getattr(request.app.state, "mqtt_prefix", "meshcore")
mqtt_tls = getattr(request.app.state, "mqtt_tls", False)
# Use unique client ID to allow multiple API instances
unique_id = uuid.uuid4().hex[:8]
@@ -65,6 +66,7 @@ def get_mqtt_client(request: Request) -> MQTTClient:
port=mqtt_port,
prefix=mqtt_prefix,
client_id=f"meshcore-api-{unique_id}",
tls=mqtt_tls,
)
client = MQTTClient(config)


@@ -4,33 +4,145 @@ from datetime import datetime
from typing import Optional
from fastapi import APIRouter, HTTPException, Query
-from sqlalchemy import func, select
+from sqlalchemy import func, or_, select
from sqlalchemy.orm import aliased, selectinload
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
-from meshcore_hub.common.models import Advertisement
-from meshcore_hub.common.schemas.messages import AdvertisementList, AdvertisementRead
+from meshcore_hub.common.models import Advertisement, EventReceiver, Node, NodeTag
+from meshcore_hub.common.schemas.messages import (
+AdvertisementList,
+AdvertisementRead,
+ReceiverInfo,
+)
router = APIRouter()
def _get_tag_name(node: Optional[Node]) -> Optional[str]:
"""Extract name tag from a node's tags."""
if not node or not node.tags:
return None
for tag in node.tags:
if tag.key == "name":
return tag.value
return None
def _fetch_receivers_for_events(
session: DbSession,
event_type: str,
event_hashes: list[str],
) -> dict[str, list[ReceiverInfo]]:
"""Fetch receiver info for a list of events by their hashes."""
if not event_hashes:
return {}
query = (
select(
EventReceiver.event_hash,
EventReceiver.snr,
EventReceiver.received_at,
Node.id.label("node_id"),
Node.public_key,
Node.name,
)
.join(Node, EventReceiver.receiver_node_id == Node.id)
.where(EventReceiver.event_type == event_type)
.where(EventReceiver.event_hash.in_(event_hashes))
.order_by(EventReceiver.received_at)
)
results = session.execute(query).all()
receivers_by_hash: dict[str, list[ReceiverInfo]] = {}
node_ids = [r.node_id for r in results]
tag_names: dict[str, str] = {}
if node_ids:
tag_query = (
select(NodeTag.node_id, NodeTag.value)
.where(NodeTag.node_id.in_(node_ids))
.where(NodeTag.key == "name")
)
for node_id, value in session.execute(tag_query).all():
tag_names[node_id] = value
for row in results:
if row.event_hash not in receivers_by_hash:
receivers_by_hash[row.event_hash] = []
receivers_by_hash[row.event_hash].append(
ReceiverInfo(
node_id=row.node_id,
public_key=row.public_key,
name=row.name,
tag_name=tag_names.get(row.node_id),
snr=row.snr,
received_at=row.received_at,
)
)
return receivers_by_hash
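The grouping step at the end of `_fetch_receivers_for_events` can be sketched in isolation: the query returns flat rows (one per receiver), which are bucketed by event hash in arrival order. The sample rows here are made up for illustration:

```python
# Bucket flat (event_hash, node_id, snr) rows by event hash, keeping
# the order the rows arrive in -- the same shape the helper returns.
from collections import defaultdict

rows = [
    ("hash-a", "node-1", 7.5),
    ("hash-a", "node-2", -3.0),
    ("hash-b", "node-1", 1.25),
]
receivers_by_hash: dict[str, list[tuple[str, float]]] = defaultdict(list)
for event_hash, node_id, snr in rows:
    receivers_by_hash[event_hash].append((node_id, snr))

print(sorted(receivers_by_hash))  # ['hash-a', 'hash-b']
```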
@router.get("", response_model=AdvertisementList)
async def list_advertisements(
_: RequireRead,
session: DbSession,
search: Optional[str] = Query(
None, description="Search in name tag, node name, or public key"
),
public_key: Optional[str] = Query(None, description="Filter by public key"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> AdvertisementList:
"""List advertisements with filtering and pagination."""
-# Build query
-query = select(Advertisement)
+# Aliases for node joins
+ReceiverNode = aliased(Node)
+SourceNode = aliased(Node)
+# Build query with both receiver and source node joins
+query = (
+select(
+Advertisement,
+ReceiverNode.public_key.label("receiver_pk"),
+ReceiverNode.name.label("receiver_name"),
+ReceiverNode.id.label("receiver_id"),
+SourceNode.name.label("source_name"),
+SourceNode.id.label("source_id"),
+SourceNode.adv_type.label("source_adv_type"),
+)
+.outerjoin(ReceiverNode, Advertisement.receiver_node_id == ReceiverNode.id)
+.outerjoin(SourceNode, Advertisement.node_id == SourceNode.id)
+)
if search:
# Search in public key, advertisement name, node name, or name tag
search_pattern = f"%{search}%"
query = query.where(
or_(
Advertisement.public_key.ilike(search_pattern),
Advertisement.name.ilike(search_pattern),
SourceNode.name.ilike(search_pattern),
SourceNode.id.in_(
select(NodeTag.node_id).where(
NodeTag.key == "name", NodeTag.value.ilike(search_pattern)
)
),
)
)
if public_key:
query = query.where(Advertisement.public_key == public_key)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(Advertisement.received_at >= since)
@@ -45,10 +157,58 @@ async def list_advertisements(
query = query.order_by(Advertisement.received_at.desc()).offset(offset).limit(limit)
# Execute
-advertisements = session.execute(query).scalars().all()
+results = session.execute(query).all()
# Collect node IDs to fetch tags
node_ids = set()
for row in results:
if row.receiver_id:
node_ids.add(row.receiver_id)
if row.source_id:
node_ids.add(row.source_id)
# Fetch nodes with tags
nodes_by_id: dict[str, Node] = {}
if node_ids:
nodes_query = (
select(Node).where(Node.id.in_(node_ids)).options(selectinload(Node.tags))
)
nodes = session.execute(nodes_query).scalars().all()
nodes_by_id = {n.id: n for n in nodes}
# Fetch all receivers for these advertisements
event_hashes = [r[0].event_hash for r in results if r[0].event_hash]
receivers_by_hash = _fetch_receivers_for_events(
session, "advertisement", event_hashes
)
# Build response with node details
items = []
for row in results:
adv = row[0]
receiver_node = nodes_by_id.get(row.receiver_id) if row.receiver_id else None
source_node = nodes_by_id.get(row.source_id) if row.source_id else None
data = {
"received_by": row.receiver_pk,
"receiver_name": row.receiver_name,
"receiver_tag_name": _get_tag_name(receiver_node),
"public_key": adv.public_key,
"name": adv.name,
"node_name": row.source_name,
"node_tag_name": _get_tag_name(source_node),
"adv_type": adv.adv_type or row.source_adv_type,
"flags": adv.flags,
"received_at": adv.received_at,
"created_at": adv.created_at,
"receivers": (
receivers_by_hash.get(adv.event_hash, []) if adv.event_hash else []
),
}
items.append(AdvertisementRead(**data))
return AdvertisementList(
-items=[AdvertisementRead.model_validate(a) for a in advertisements],
+items=items,
total=total,
limit=limit,
offset=offset,
@@ -62,10 +222,67 @@ async def get_advertisement(
advertisement_id: str,
) -> AdvertisementRead:
"""Get a single advertisement by ID."""
-query = select(Advertisement).where(Advertisement.id == advertisement_id)
-advertisement = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
SourceNode = aliased(Node)
query = (
select(
Advertisement,
ReceiverNode.public_key.label("receiver_pk"),
ReceiverNode.name.label("receiver_name"),
ReceiverNode.id.label("receiver_id"),
SourceNode.name.label("source_name"),
SourceNode.id.label("source_id"),
SourceNode.adv_type.label("source_adv_type"),
)
.outerjoin(ReceiverNode, Advertisement.receiver_node_id == ReceiverNode.id)
.outerjoin(SourceNode, Advertisement.node_id == SourceNode.id)
.where(Advertisement.id == advertisement_id)
)
result = session.execute(query).one_or_none()
-if not advertisement:
+if not result:
raise HTTPException(status_code=404, detail="Advertisement not found")
-return AdvertisementRead.model_validate(advertisement)
+adv = result[0]
# Fetch nodes with tags for friendly names
node_ids = []
if result.receiver_id:
node_ids.append(result.receiver_id)
if result.source_id:
node_ids.append(result.source_id)
nodes_by_id: dict[str, Node] = {}
if node_ids:
nodes_query = (
select(Node).where(Node.id.in_(node_ids)).options(selectinload(Node.tags))
)
nodes = session.execute(nodes_query).scalars().all()
nodes_by_id = {n.id: n for n in nodes}
receiver_node = nodes_by_id.get(result.receiver_id) if result.receiver_id else None
source_node = nodes_by_id.get(result.source_id) if result.source_id else None
# Fetch receivers for this advertisement
receivers = []
if adv.event_hash:
receivers_by_hash = _fetch_receivers_for_events(
session, "advertisement", [adv.event_hash]
)
receivers = receivers_by_hash.get(adv.event_hash, [])
data = {
"received_by": result.receiver_pk,
"receiver_name": result.receiver_name,
"receiver_tag_name": _get_tag_name(receiver_node),
"public_key": adv.public_key,
"name": adv.name,
"node_name": result.source_name,
"node_tag_name": _get_tag_name(source_node),
"adv_type": adv.adv_type or result.source_adv_type,
"flags": adv.flags,
"received_at": adv.received_at,
"created_at": adv.created_at,
"receivers": receivers,
}
return AdvertisementRead(**data)


@@ -9,7 +9,15 @@ from sqlalchemy import func, select
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Advertisement, Message, Node, NodeTag
-from meshcore_hub.common.schemas.messages import DashboardStats, RecentAdvertisement
+from meshcore_hub.common.schemas.messages import (
+ChannelMessage,
+DailyActivity,
+DailyActivityPoint,
+DashboardStats,
+MessageActivity,
+NodeCountHistory,
+RecentAdvertisement,
+)
router = APIRouter()
@@ -23,6 +31,7 @@ async def get_stats(
now = datetime.now(timezone.utc)
today_start = now.replace(hour=0, minute=0, second=0, microsecond=0)
yesterday = now - timedelta(days=1)
seven_days_ago = now - timedelta(days=7)
# Total nodes
total_nodes = session.execute(select(func.count()).select_from(Node)).scalar() or 0
@@ -65,6 +74,26 @@ async def get_stats(
or 0
)
# Advertisements in last 7 days
advertisements_7d = (
session.execute(
select(func.count())
.select_from(Advertisement)
.where(Advertisement.received_at >= seven_days_ago)
).scalar()
or 0
)
# Messages in last 7 days
messages_7d = (
session.execute(
select(func.count())
.select_from(Message)
.where(Message.received_at >= seven_days_ago)
).scalar()
or 0
)
# Recent advertisements (last 10)
recent_ads = (
session.execute(
@@ -74,25 +103,38 @@ async def get_stats(
.all()
)
-# Get friendly_name tags for the advertised nodes
+# Get node names, adv_types, and name tags for the advertised nodes
ad_public_keys = [ad.public_key for ad in recent_ads]
-friendly_names: dict[str, str] = {}
+node_names: dict[str, str] = {}
+node_adv_types: dict[str, str] = {}
+tag_names: dict[str, str] = {}
if ad_public_keys:
-friendly_name_query = (
+# Get node names and adv_types from Node table
+node_query = select(Node.public_key, Node.name, Node.adv_type).where(
+Node.public_key.in_(ad_public_keys)
+)
+for public_key, name, adv_type in session.execute(node_query).all():
+if name:
+node_names[public_key] = name
+if adv_type:
+node_adv_types[public_key] = adv_type
+# Get name tags
+tag_name_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
.where(Node.public_key.in_(ad_public_keys))
-.where(NodeTag.key == "friendly_name")
+.where(NodeTag.key == "name")
)
-for public_key, value in session.execute(friendly_name_query).all():
-friendly_names[public_key] = value
+for public_key, value in session.execute(tag_name_query).all():
+tag_names[public_key] = value
recent_advertisements = [
RecentAdvertisement(
public_key=ad.public_key,
-name=ad.name,
-friendly_name=friendly_names.get(ad.public_key),
-adv_type=ad.adv_type,
+name=ad.name or node_names.get(ad.public_key),
+tag_name=tag_names.get(ad.public_key),
+adv_type=ad.adv_type or node_adv_types.get(ad.public_key),
received_at=ad.received_at,
)
for ad in recent_ads
@@ -110,18 +152,218 @@ async def get_stats(
int(channel): int(count) for channel, count in channel_results
}
# Get latest 5 messages for each channel that has messages
channel_messages: dict[int, list[ChannelMessage]] = {}
for channel_idx, _ in channel_results:
messages_query = (
select(Message)
.where(Message.message_type == "channel")
.where(Message.channel_idx == channel_idx)
.order_by(Message.received_at.desc())
.limit(5)
)
channel_msgs = session.execute(messages_query).scalars().all()
# Look up sender names for these messages
msg_prefixes = [m.pubkey_prefix for m in channel_msgs if m.pubkey_prefix]
msg_sender_names: dict[str, str] = {}
msg_tag_names: dict[str, str] = {}
if msg_prefixes:
for prefix in set(msg_prefixes):
sender_node_query = select(Node.public_key, Node.name).where(
Node.public_key.startswith(prefix)
)
for public_key, name in session.execute(sender_node_query).all():
if name:
msg_sender_names[public_key[:12]] = name
sender_tag_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
.where(Node.public_key.startswith(prefix))
.where(NodeTag.key == "name")
)
for public_key, value in session.execute(sender_tag_query).all():
msg_tag_names[public_key[:12]] = value
channel_messages[int(channel_idx)] = [
ChannelMessage(
text=m.text,
sender_name=(
msg_sender_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
sender_tag_name=(
msg_tag_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
pubkey_prefix=m.pubkey_prefix,
received_at=m.received_at,
)
for m in channel_msgs
]
return DashboardStats(
total_nodes=total_nodes,
active_nodes=active_nodes,
total_messages=total_messages,
messages_today=messages_today,
messages_7d=messages_7d,
total_advertisements=total_advertisements,
advertisements_24h=advertisements_24h,
advertisements_7d=advertisements_7d,
recent_advertisements=recent_advertisements,
channel_message_counts=channel_message_counts,
channel_messages=channel_messages,
)
@router.get("/activity", response_model=DailyActivity)
async def get_activity(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> DailyActivity:
"""Get daily advertisement activity for the specified period.
Args:
days: Number of days to include (default 30, max 90)
Returns:
Daily advertisement counts for each day in the period (excluding today)
"""
# Limit to max 90 days
days = min(days, 90)
now = datetime.now(timezone.utc)
# End at start of today (exclude today's incomplete data)
end_date = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_date = end_date - timedelta(days=days)
# Query advertisement counts grouped by date
# Use SQLite's date() function for grouping (returns string 'YYYY-MM-DD')
date_expr = func.date(Advertisement.received_at)
query = (
select(
date_expr.label("date"),
func.count().label("count"),
)
.where(Advertisement.received_at >= start_date)
.where(Advertisement.received_at < end_date)
.group_by(date_expr)
.order_by(date_expr)
)
results = session.execute(query).all()
# Build a dict of date -> count from results (date is already a string)
counts_by_date = {row.date: row.count for row in results}
# Generate all dates in the range, filling in zeros for missing days
data = []
for i in range(days):
date = start_date + timedelta(days=i)
date_str = date.strftime("%Y-%m-%d")
count = counts_by_date.get(date_str, 0)
data.append(DailyActivityPoint(date=date_str, count=count))
return DailyActivity(days=days, data=data)
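The zero-filling loop above is the crux of the activity endpoints: the GROUP BY only returns rows for days that had activity, so the response is rebuilt day by day with 0 substituted for missing dates. A minimal standalone sketch of the same pattern:

```python
# Expand sparse {date: count} query results into a dense day-by-day
# series, filling zeros for days with no activity.
from datetime import datetime, timedelta, timezone

def fill_daily_counts(counts_by_date, start, days):
    series = []
    for i in range(days):
        date_str = (start + timedelta(days=i)).strftime("%Y-%m-%d")
        series.append((date_str, counts_by_date.get(date_str, 0)))
    return series

start = datetime(2025, 12, 1, tzinfo=timezone.utc)
print(fill_daily_counts({"2025-12-02": 5}, start, 3))
# [('2025-12-01', 0), ('2025-12-02', 5), ('2025-12-03', 0)]
```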
@router.get("/message-activity", response_model=MessageActivity)
async def get_message_activity(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> MessageActivity:
"""Get daily message activity for the specified period.
Args:
days: Number of days to include (default 30, max 90)
Returns:
Daily message counts for each day in the period (excluding today)
"""
days = min(days, 90)
now = datetime.now(timezone.utc)
# End at start of today (exclude today's incomplete data)
end_date = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_date = end_date - timedelta(days=days)
# Query message counts grouped by date
date_expr = func.date(Message.received_at)
query = (
select(
date_expr.label("date"),
func.count().label("count"),
)
.where(Message.received_at >= start_date)
.where(Message.received_at < end_date)
.group_by(date_expr)
.order_by(date_expr)
)
results = session.execute(query).all()
counts_by_date = {row.date: row.count for row in results}
# Generate all dates in the range, filling in zeros for missing days
data = []
for i in range(days):
date = start_date + timedelta(days=i)
date_str = date.strftime("%Y-%m-%d")
count = counts_by_date.get(date_str, 0)
data.append(DailyActivityPoint(date=date_str, count=count))
return MessageActivity(days=days, data=data)
@router.get("/node-count", response_model=NodeCountHistory)
async def get_node_count_history(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> NodeCountHistory:
"""Get cumulative node count over time.
For each day, shows the total number of nodes that existed by that date
(based on their created_at timestamp).
Args:
days: Number of days to include (default 30, max 90)
Returns:
Cumulative node count for each day in the period (excluding today)
"""
days = min(days, 90)
now = datetime.now(timezone.utc)
# End at start of today (exclude today's incomplete data)
end_date = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_date = end_date - timedelta(days=days)
# Get all nodes with their creation dates
# Count nodes created on or before each date
data = []
for i in range(days):
date = start_date + timedelta(days=i)
end_of_day = date.replace(hour=23, minute=59, second=59, microsecond=999999)
date_str = date.strftime("%Y-%m-%d")
# Count nodes created on or before this date
count = (
session.execute(
select(func.count())
.select_from(Node)
.where(Node.created_at <= end_of_day)
).scalar()
or 0
)
data.append(DailyActivityPoint(date=date_str, count=count))
return NodeCountHistory(days=days, data=data)
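The endpoint above issues one COUNT query per day in the window. For context, the same cumulative series can be derived from sorted `created_at` timestamps in a single pass with `bisect` — a sketch of the idea under made-up data, not a claim about the project's chosen approach:

```python
# Cumulative "nodes existing by end of day" from sorted creation
# timestamps, using binary search instead of one query per day.
import bisect
from datetime import datetime

# Sorted creation timestamps stand in for Node.created_at values.
created = sorted([
    datetime(2025, 12, 1, 10, 0),
    datetime(2025, 12, 1, 18, 30),
    datetime(2025, 12, 3, 9, 15),
])

def nodes_existing_by(end_of_day: datetime) -> int:
    # Count of nodes created at or before end_of_day.
    return bisect.bisect_right(created, end_of_day)

print(nodes_existing_by(datetime(2025, 12, 2, 23, 59, 59)))  # 2
```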
@router.get("/", response_class=HTMLResponse)
async def dashboard(
request: Request,


@@ -20,7 +20,7 @@ router = APIRouter()
async def list_members(
_: RequireRead,
session: DbSession,
-limit: int = Query(default=50, ge=1, le=100),
+limit: int = Query(default=50, ge=1, le=500),
offset: int = Query(default=0, ge=0),
) -> MemberList:
"""List all members with pagination."""
@@ -28,9 +28,9 @@ async def list_members(
count_query = select(func.count()).select_from(Member)
total = session.execute(count_query).scalar() or 0
-# Get members
+# Get members ordered by name
query = select(Member).order_by(Member.name).limit(limit).offset(offset)
-members = session.execute(query).scalars().all()
+members = list(session.execute(query).scalars().all())
return MemberList(
items=[MemberRead.model_validate(m) for m in members],
@@ -63,17 +63,23 @@ async def create_member(
member: MemberCreate,
) -> MemberRead:
"""Create a new member."""
# Normalize public_key to lowercase if provided
public_key = member.public_key.lower() if member.public_key else None
# Check if member_id already exists
query = select(Member).where(Member.member_id == member.member_id)
existing = session.execute(query).scalar_one_or_none()
if existing:
raise HTTPException(
status_code=400,
detail=f"Member with member_id '{member.member_id}' already exists",
)
# Create member
new_member = Member(
member_id=member.member_id,
name=member.name,
callsign=member.callsign,
role=member.role,
description=member.description,
contact=member.contact,
public_key=public_key,
)
session.add(new_member)
session.commit()
@@ -97,6 +103,18 @@ async def update_member(
raise HTTPException(status_code=404, detail="Member not found")
# Update fields
if member.member_id is not None:
# Check if new member_id is already taken by another member
check_query = select(Member).where(
Member.member_id == member.member_id, Member.id != member_id
)
collision = session.execute(check_query).scalar_one_or_none()
if collision:
raise HTTPException(
status_code=400,
detail=f"Member with member_id '{member.member_id}' already exists",
)
existing.member_id = member.member_id
if member.name is not None:
existing.name = member.name
if member.callsign is not None:
@@ -107,8 +125,6 @@ async def update_member(
existing.description = member.description
if member.contact is not None:
existing.contact = member.contact
-if member.public_key is not None:
-existing.public_key = member.public_key.lower()
session.commit()
session.refresh(existing)


@@ -5,15 +5,95 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased, selectinload
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
-from meshcore_hub.common.models import Message, Node, NodeTag
-from meshcore_hub.common.schemas.messages import MessageList, MessageRead
+from meshcore_hub.common.models import EventReceiver, Message, Node, NodeTag
+from meshcore_hub.common.schemas.messages import MessageList, MessageRead, ReceiverInfo
router = APIRouter()
def _get_tag_name(node: Optional[Node]) -> Optional[str]:
"""Extract name tag from a node's tags."""
if not node or not node.tags:
return None
for tag in node.tags:
if tag.key == "name":
return tag.value
return None
def _fetch_receivers_for_events(
session: DbSession,
event_type: str,
event_hashes: list[str],
) -> dict[str, list[ReceiverInfo]]:
"""Fetch receiver info for a list of events by their hashes.
Args:
session: Database session
event_type: Type of event ('message', 'advertisement', etc.)
event_hashes: List of event hashes to fetch receivers for
Returns:
Dict mapping event_hash to list of ReceiverInfo objects
"""
if not event_hashes:
return {}
# Query event_receivers with receiver node info
query = (
select(
EventReceiver.event_hash,
EventReceiver.snr,
EventReceiver.received_at,
Node.id.label("node_id"),
Node.public_key,
Node.name,
)
.join(Node, EventReceiver.receiver_node_id == Node.id)
.where(EventReceiver.event_type == event_type)
.where(EventReceiver.event_hash.in_(event_hashes))
.order_by(EventReceiver.received_at)
)
results = session.execute(query).all()
# Group by event_hash
receivers_by_hash: dict[str, list[ReceiverInfo]] = {}
# Get tag names for receiver nodes
node_ids = [r.node_id for r in results]
tag_names: dict[str, str] = {}
if node_ids:
tag_query = (
select(NodeTag.node_id, NodeTag.value)
.where(NodeTag.node_id.in_(node_ids))
.where(NodeTag.key == "name")
)
for node_id, value in session.execute(tag_query).all():
tag_names[node_id] = value
for row in results:
if row.event_hash not in receivers_by_hash:
receivers_by_hash[row.event_hash] = []
receivers_by_hash[row.event_hash].append(
ReceiverInfo(
node_id=row.node_id,
public_key=row.public_key,
name=row.name,
tag_name=tag_names.get(row.node_id),
snr=row.snr,
received_at=row.received_at,
)
)
return receivers_by_hash
@router.get("", response_model=MessageList)
async def list_messages(
_: RequireRead,
@@ -21,6 +101,9 @@ async def list_messages(
message_type: Optional[str] = Query(None, description="Filter by message type"),
pubkey_prefix: Optional[str] = Query(None, description="Filter by sender prefix"),
channel_idx: Optional[int] = Query(None, description="Filter by channel"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
search: Optional[str] = Query(None, description="Search in message text"),
@@ -28,8 +111,16 @@ async def list_messages(
offset: int = Query(0, ge=0, description="Page offset"),
) -> MessageList:
"""List messages with filtering and pagination."""
-# Build query
-query = select(Message)
+# Alias for receiver node join
+ReceiverNode = aliased(Node)
+# Build query with receiver node join
+query = select(
+Message,
+ReceiverNode.public_key.label("receiver_pk"),
+ReceiverNode.name.label("receiver_name"),
+ReceiverNode.id.label("receiver_id"),
+).outerjoin(ReceiverNode, Message.receiver_node_id == ReceiverNode.id)
if message_type:
query = query.where(Message.message_type == message_type)
@@ -40,6 +131,9 @@ async def list_messages(
if channel_idx is not None:
query = query.where(Message.channel_idx == channel_idx)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(Message.received_at >= since)
@@ -57,34 +151,77 @@ async def list_messages(
query = query.order_by(Message.received_at.desc()).offset(offset).limit(limit)
# Execute
-messages = session.execute(query).scalars().all()
+results = session.execute(query).all()
-# Look up friendly_names for senders with pubkey_prefix
-pubkey_prefixes = [m.pubkey_prefix for m in messages if m.pubkey_prefix]
-friendly_names: dict[str, str] = {}
+# Look up sender names and tag names for senders with pubkey_prefix
+pubkey_prefixes = [r[0].pubkey_prefix for r in results if r[0].pubkey_prefix]
+sender_names: dict[str, str] = {}
+sender_tag_names: dict[str, str] = {}
if pubkey_prefixes:
# Find nodes whose public_key starts with any of these prefixes
for prefix in set(pubkey_prefixes):
-friendly_name_query = (
+# Get node name
+node_query = select(Node.public_key, Node.name).where(
+Node.public_key.startswith(prefix)
+)
+for public_key, name in session.execute(node_query).all():
+if name:
+sender_names[public_key[:12]] = name
+# Get name tag
+tag_name_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
.where(Node.public_key.startswith(prefix))
-.where(NodeTag.key == "friendly_name")
+.where(NodeTag.key == "name")
)
-for public_key, value in session.execute(friendly_name_query).all():
-# Map the prefix to the friendly_name
-friendly_names[public_key[:12]] = value
+for public_key, value in session.execute(tag_name_query).all():
+sender_tag_names[public_key[:12]] = value
-# Build response with friendly_names
# Collect receiver node IDs to fetch tags
receiver_ids = set()
for row in results:
if row.receiver_id:
receiver_ids.add(row.receiver_id)
# Fetch receiver nodes with tags
receivers_by_id: dict[str, Node] = {}
if receiver_ids:
receivers_query = (
select(Node)
.where(Node.id.in_(receiver_ids))
.options(selectinload(Node.tags))
)
receivers = session.execute(receivers_query).scalars().all()
receivers_by_id = {n.id: n for n in receivers}
# Fetch all receivers for these messages
event_hashes = [r[0].event_hash for r in results if r[0].event_hash]
receivers_by_hash = _fetch_receivers_for_events(session, "message", event_hashes)
# Build response with sender info and received_by
items = []
-for m in messages:
+for row in results:
+m = row[0]
receiver_pk = row.receiver_pk
receiver_name = row.receiver_name
receiver_node = (
receivers_by_id.get(row.receiver_id) if row.receiver_id else None
)
msg_dict = {
"id": m.id,
"receiver_node_id": m.receiver_node_id,
"received_by": receiver_pk,
"receiver_name": receiver_name,
"receiver_tag_name": _get_tag_name(receiver_node),
"message_type": m.message_type,
"pubkey_prefix": m.pubkey_prefix,
-"sender_friendly_name": (
-friendly_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
+"sender_name": (
+sender_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
"sender_tag_name": (
sender_tag_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
"channel_idx": m.channel_idx,
"text": m.text,
@@ -95,6 +232,9 @@ async def list_messages(
"sender_timestamp": m.sender_timestamp,
"received_at": m.received_at,
"created_at": m.created_at,
"receivers": (
receivers_by_hash.get(m.event_hash, []) if m.event_hash else []
),
}
items.append(MessageRead(**msg_dict))
@@ -113,10 +253,42 @@ async def get_message(
message_id: str,
) -> MessageRead:
"""Get a single message by ID."""
-query = select(Message).where(Message.id == message_id)
-message = session.execute(query).scalar_one_or_none()
+ReceiverNode = aliased(Node)
+query = (
+select(Message, ReceiverNode.public_key.label("receiver_pk"))
+.outerjoin(ReceiverNode, Message.receiver_node_id == ReceiverNode.id)
+.where(Message.id == message_id)
+)
+result = session.execute(query).one_or_none()
-if not message:
+if not result:
raise HTTPException(status_code=404, detail="Message not found")
-return MessageRead.model_validate(message)
+message, receiver_pk = result
# Fetch receivers for this message
receivers = []
if message.event_hash:
receivers_by_hash = _fetch_receivers_for_events(
session, "message", [message.event_hash]
)
receivers = receivers_by_hash.get(message.event_hash, [])
data = {
"id": message.id,
"receiver_node_id": message.receiver_node_id,
"received_by": receiver_pk,
"message_type": message.message_type,
"pubkey_prefix": message.pubkey_prefix,
"channel_idx": message.channel_idx,
"text": message.text,
"path_len": message.path_len,
"txt_type": message.txt_type,
"signature": message.signature,
"snr": message.snr,
"sender_timestamp": message.sender_timestamp,
"received_at": message.received_at,
"created_at": message.created_at,
"receivers": receivers,
}
return MessageRead(**data)


@@ -3,11 +3,12 @@
from typing import Optional
from fastapi import APIRouter, HTTPException, Query
-from sqlalchemy import func, select
+from sqlalchemy import func, or_, select
from sqlalchemy.orm import selectinload
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Node
from meshcore_hub.common.models import Node, NodeTag
from meshcore_hub.common.schemas.nodes import NodeList, NodeRead
router = APIRouter()
@@ -17,18 +18,31 @@ router = APIRouter()
async def list_nodes(
_: RequireRead,
session: DbSession,
search: Optional[str] = Query(None, description="Search in name or public key"),
search: Optional[str] = Query(
None, description="Search in name tag, node name, or public key"
),
adv_type: Optional[str] = Query(None, description="Filter by advertisement type"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
limit: int = Query(50, ge=1, le=500, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> NodeList:
"""List all nodes with pagination and filtering."""
# Build query
query = select(Node)
# Build base query with tags loaded
query = select(Node).options(selectinload(Node.tags))
if search:
# Search in public key, node name, or name tag
# For name tag search, we need to join with NodeTag
search_pattern = f"%{search}%"
query = query.where(
(Node.name.ilike(f"%{search}%")) | (Node.public_key.ilike(f"%{search}%"))
or_(
Node.public_key.ilike(search_pattern),
Node.name.ilike(search_pattern),
Node.id.in_(
select(NodeTag.node_id).where(
NodeTag.key == "name", NodeTag.value.ilike(search_pattern)
)
),
)
)
if adv_type:
@@ -38,7 +52,7 @@ async def list_nodes(
count_query = select(func.count()).select_from(query.subquery())
total = session.execute(count_query).scalar() or 0
# Apply pagination
# Apply pagination and ordering
query = query.order_by(Node.last_seen.desc()).offset(offset).limit(limit)
# Execute

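The three-way search above (public key OR node name OR a `name` tag) can be sketched in isolation. The table definitions below are simplified stand-ins for the real models; the point is the `or_` across direct columns plus an `IN`-subquery against the tag table, which surfaces tag hits without a join fan-out:

```python
from sqlalchemy import Column, Integer, String, create_engine, or_, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Node(Base):
    __tablename__ = "nodes"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    public_key = Column(String)

class NodeTag(Base):
    __tablename__ = "node_tags"
    id = Column(Integer, primary_key=True)
    node_id = Column(Integer)
    key = Column(String)
    value = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([
        Node(id=1, name="repeater-alpha", public_key="aa11"),
        Node(id=2, name="rx-2", public_key="bb22"),
        NodeTag(id=1, node_id=2, key="name", value="Alpha Gateway"),
    ])
    session.commit()

    pattern = "%alpha%"
    query = select(Node).where(
        or_(
            Node.public_key.ilike(pattern),
            Node.name.ilike(pattern),
            # Nodes whose "name" tag matches, via a correlated-free subquery
            Node.id.in_(
                select(NodeTag.node_id).where(
                    NodeTag.key == "name", NodeTag.value.ilike(pattern)
                )
            ),
        )
    )
    matched_ids = sorted(n.id for n in session.execute(query).scalars())
```

Node 1 matches on its own `name`, node 2 only through its tag; `ilike` keeps the match case-insensitive on both paths.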

@@ -5,10 +5,11 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Telemetry
from meshcore_hub.common.models import Node, Telemetry
from meshcore_hub.common.schemas.messages import TelemetryList, TelemetryRead
router = APIRouter()
@@ -19,18 +20,29 @@ async def list_telemetry(
_: RequireRead,
session: DbSession,
node_public_key: Optional[str] = Query(None, description="Filter by node"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> TelemetryList:
"""List telemetry records with filtering and pagination."""
# Build query
query = select(Telemetry)
# Alias for receiver node join
ReceiverNode = aliased(Node)
# Build query with receiver node join
query = select(Telemetry, ReceiverNode.public_key.label("receiver_pk")).outerjoin(
ReceiverNode, Telemetry.receiver_node_id == ReceiverNode.id
)
if node_public_key:
query = query.where(Telemetry.node_public_key == node_public_key)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(Telemetry.received_at >= since)
@@ -45,10 +57,25 @@ async def list_telemetry(
query = query.order_by(Telemetry.received_at.desc()).offset(offset).limit(limit)
# Execute
records = session.execute(query).scalars().all()
results = session.execute(query).all()
# Build response with received_by
items = []
for tel, receiver_pk in results:
data = {
"id": tel.id,
"receiver_node_id": tel.receiver_node_id,
"received_by": receiver_pk,
"node_id": tel.node_id,
"node_public_key": tel.node_public_key,
"parsed_data": tel.parsed_data,
"received_at": tel.received_at,
"created_at": tel.created_at,
}
items.append(TelemetryRead(**data))
return TelemetryList(
items=[TelemetryRead.model_validate(t) for t in records],
items=items,
total=total,
limit=limit,
offset=offset,
@@ -62,10 +89,26 @@ async def get_telemetry(
telemetry_id: str,
) -> TelemetryRead:
"""Get a single telemetry record by ID."""
query = select(Telemetry).where(Telemetry.id == telemetry_id)
telemetry = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
query = (
select(Telemetry, ReceiverNode.public_key.label("receiver_pk"))
.outerjoin(ReceiverNode, Telemetry.receiver_node_id == ReceiverNode.id)
.where(Telemetry.id == telemetry_id)
)
result = session.execute(query).one_or_none()
if not telemetry:
if not result:
raise HTTPException(status_code=404, detail="Telemetry record not found")
return TelemetryRead.model_validate(telemetry)
tel, receiver_pk = result
data = {
"id": tel.id,
"receiver_node_id": tel.receiver_node_id,
"received_by": receiver_pk,
"node_id": tel.node_id,
"node_public_key": tel.node_public_key,
"parsed_data": tel.parsed_data,
"received_at": tel.received_at,
"created_at": tel.created_at,
}
return TelemetryRead(**data)


@@ -5,10 +5,11 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import TracePath
from meshcore_hub.common.models import Node, TracePath
from meshcore_hub.common.schemas.messages import TracePathList, TracePathRead
router = APIRouter()
@@ -18,14 +19,25 @@ router = APIRouter()
async def list_trace_paths(
_: RequireRead,
session: DbSession,
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> TracePathList:
"""List trace paths with filtering and pagination."""
# Build query
query = select(TracePath)
# Alias for receiver node join
ReceiverNode = aliased(Node)
# Build query with receiver node join
query = select(TracePath, ReceiverNode.public_key.label("receiver_pk")).outerjoin(
ReceiverNode, TracePath.receiver_node_id == ReceiverNode.id
)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(TracePath.received_at >= since)
@@ -41,10 +53,29 @@ async def list_trace_paths(
query = query.order_by(TracePath.received_at.desc()).offset(offset).limit(limit)
# Execute
trace_paths = session.execute(query).scalars().all()
results = session.execute(query).all()
# Build response with received_by
items = []
for tp, receiver_pk in results:
data = {
"id": tp.id,
"receiver_node_id": tp.receiver_node_id,
"received_by": receiver_pk,
"initiator_tag": tp.initiator_tag,
"path_len": tp.path_len,
"flags": tp.flags,
"auth": tp.auth,
"path_hashes": tp.path_hashes,
"snr_values": tp.snr_values,
"hop_count": tp.hop_count,
"received_at": tp.received_at,
"created_at": tp.created_at,
}
items.append(TracePathRead(**data))
return TracePathList(
items=[TracePathRead.model_validate(t) for t in trace_paths],
items=items,
total=total,
limit=limit,
offset=offset,
@@ -58,10 +89,30 @@ async def get_trace_path(
trace_path_id: str,
) -> TracePathRead:
"""Get a single trace path by ID."""
query = select(TracePath).where(TracePath.id == trace_path_id)
trace_path = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
query = (
select(TracePath, ReceiverNode.public_key.label("receiver_pk"))
.outerjoin(ReceiverNode, TracePath.receiver_node_id == ReceiverNode.id)
.where(TracePath.id == trace_path_id)
)
result = session.execute(query).one_or_none()
if not trace_path:
if not result:
raise HTTPException(status_code=404, detail="Trace path not found")
return TracePathRead.model_validate(trace_path)
tp, receiver_pk = result
data = {
"id": tp.id,
"receiver_node_id": tp.receiver_node_id,
"received_by": receiver_pk,
"initiator_tag": tp.initiator_tag,
"path_len": tp.path_len,
"flags": tp.flags,
"auth": tp.auth,
"path_hashes": tp.path_hashes,
"snr_values": tp.snr_values,
"hop_count": tp.hop_count,
"received_at": tp.received_at,
"created_at": tp.created_at,
}
return TracePathRead(**data)


@@ -0,0 +1,225 @@
"""Data retention and cleanup service for MeshCore Hub.
This module provides functionality to delete old event data and inactive nodes
based on configured retention policies.
"""
import logging
from datetime import datetime, timedelta, timezone
from sqlalchemy import delete, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from meshcore_hub.common.models import (
Advertisement,
EventLog,
Message,
Node,
Telemetry,
TracePath,
)
logger = logging.getLogger(__name__)
class CleanupStats:
"""Statistics from a cleanup operation."""
def __init__(self) -> None:
self.advertisements_deleted: int = 0
self.messages_deleted: int = 0
self.telemetry_deleted: int = 0
self.trace_paths_deleted: int = 0
self.event_logs_deleted: int = 0
self.nodes_deleted: int = 0
self.total_deleted: int = 0
def __repr__(self) -> str:
return (
f"CleanupStats(total={self.total_deleted}, "
f"advertisements={self.advertisements_deleted}, "
f"messages={self.messages_deleted}, "
f"telemetry={self.telemetry_deleted}, "
f"trace_paths={self.trace_paths_deleted}, "
f"event_logs={self.event_logs_deleted}, "
f"nodes={self.nodes_deleted})"
)
async def cleanup_old_data(
db: AsyncSession,
retention_days: int,
dry_run: bool = False,
) -> CleanupStats:
"""Delete event data older than the retention period.
Args:
db: Database session
retention_days: Number of days to retain data
dry_run: If True, only count records without deleting
Returns:
CleanupStats object with deletion counts
"""
stats = CleanupStats()
cutoff_date = datetime.now(timezone.utc) - timedelta(days=retention_days)
logger.info(
"Starting data cleanup (dry_run=%s, retention_days=%d, cutoff=%s)",
dry_run,
retention_days,
cutoff_date.isoformat(),
)
# Clean up advertisements
stats.advertisements_deleted = await _cleanup_table(
db, Advertisement, cutoff_date, "advertisements", dry_run
)
# Clean up messages
stats.messages_deleted = await _cleanup_table(
db, Message, cutoff_date, "messages", dry_run
)
# Clean up telemetry
stats.telemetry_deleted = await _cleanup_table(
db, Telemetry, cutoff_date, "telemetry", dry_run
)
# Clean up trace paths
stats.trace_paths_deleted = await _cleanup_table(
db, TracePath, cutoff_date, "trace_paths", dry_run
)
# Clean up event logs
stats.event_logs_deleted = await _cleanup_table(
db, EventLog, cutoff_date, "event_logs", dry_run
)
stats.total_deleted = (
stats.advertisements_deleted
+ stats.messages_deleted
+ stats.telemetry_deleted
+ stats.trace_paths_deleted
+ stats.event_logs_deleted
)
if not dry_run:
await db.commit()
logger.info("Cleanup completed: %s", stats)
else:
logger.info("Cleanup dry run completed: %s", stats)
return stats
async def _cleanup_table(
db: AsyncSession,
model: type,
cutoff_date: datetime,
table_name: str,
dry_run: bool,
) -> int:
"""Delete old records from a specific table.
Args:
db: Database session
model: SQLAlchemy model class
cutoff_date: Delete records older than this date
table_name: Name of table for logging
dry_run: If True, only count without deleting
Returns:
Number of records deleted (or would be deleted in dry_run)
"""
from sqlalchemy import select
if dry_run:
# Count records that would be deleted
stmt = (
select(func.count())
.select_from(model)
.where(model.created_at < cutoff_date) # type: ignore[attr-defined]
)
result = await db.execute(stmt)
count = result.scalar() or 0
logger.debug(
"[DRY RUN] Would delete %d records from %s older than %s",
count,
table_name,
cutoff_date.isoformat(),
)
return count
else:
# Delete old records
result = await db.execute(delete(model).where(model.created_at < cutoff_date)) # type: ignore[attr-defined]
count = result.rowcount or 0 # type: ignore[attr-defined]
logger.debug(
"Deleted %d records from %s older than %s",
count,
table_name,
cutoff_date.isoformat(),
)
return count
async def cleanup_inactive_nodes(
db: AsyncSession,
inactivity_days: int,
dry_run: bool = False,
) -> int:
"""Delete nodes that haven't been seen for the specified number of days.
Only deletes nodes where last_seen is older than the cutoff date.
Nodes with last_seen=NULL are NOT deleted (never seen on network).
Args:
db: Database session
inactivity_days: Delete nodes not seen for this many days
dry_run: If True, only count without deleting
Returns:
Number of nodes deleted (or would be deleted in dry_run)
"""
cutoff_date = datetime.now(timezone.utc) - timedelta(days=inactivity_days)
logger.info(
"Starting node cleanup (dry_run=%s, inactivity_days=%d, cutoff=%s)",
dry_run,
inactivity_days,
cutoff_date.isoformat(),
)
if dry_run:
# Count nodes that would be deleted
# Only count nodes with last_seen < cutoff (excludes NULL last_seen)
stmt = (
select(func.count())
.select_from(Node)
.where(Node.last_seen < cutoff_date)
.where(Node.last_seen.isnot(None))
)
result = await db.execute(stmt)
count = result.scalar() or 0
logger.info(
"[DRY RUN] Would delete %d nodes not seen since %s",
count,
cutoff_date.isoformat(),
)
return count
else:
# Delete inactive nodes
# Only delete nodes with last_seen < cutoff (excludes NULL last_seen)
result = await db.execute(
delete(Node)
.where(Node.last_seen < cutoff_date)
.where(Node.last_seen.isnot(None))
)
await db.commit()
count = result.rowcount or 0 # type: ignore[attr-defined]
logger.info(
"Deleted %d nodes not seen since %s",
count,
cutoff_date.isoformat(),
)
return count

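The core of the retention service — a timezone-aware cutoff, a count-only dry run, and a real delete reporting `rowcount` — can be shown with plain `sqlite3` (the table name and rows here are illustrative, not the project's schema):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event_logs (id INTEGER PRIMARY KEY, created_at TEXT)")

now = datetime(2025, 12, 8, tzinfo=timezone.utc)
cutoff = now - timedelta(days=30)  # same math as cleanup_old_data
conn.executemany(
    "INSERT INTO event_logs VALUES (?, ?)",
    [
        (1, (now - timedelta(days=45)).isoformat()),  # older than cutoff
        (2, (now - timedelta(days=5)).isoformat()),   # within retention
    ],
)

# Dry run: count what would be deleted without touching the data
would_delete = conn.execute(
    "SELECT COUNT(*) FROM event_logs WHERE created_at < ?", (cutoff.isoformat(),)
).fetchone()[0]

# Live run: delete and read the affected-row count
deleted = conn.execute(
    "DELETE FROM event_logs WHERE created_at < ?", (cutoff.isoformat(),)
).rowcount
```

Storing ISO-8601 strings with a uniform UTC offset keeps the `<` comparison correct, since such strings sort lexicographically in timestamp order.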

@@ -1,9 +1,14 @@
"""CLI for the Collector component."""
from typing import TYPE_CHECKING
import click
from meshcore_hub.common.logging import configure_logging
if TYPE_CHECKING:
from meshcore_hub.common.database import DatabaseManager
@click.group(invoke_without_command=True)
@click.pass_context
@@ -42,6 +47,13 @@ from meshcore_hub.common.logging import configure_logging
envvar="MQTT_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
@click.option(
"--data-home",
type=str,
@@ -77,6 +89,7 @@ def collector(
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
data_home: str | None,
seed_home: str | None,
database_url: str | None,
@@ -120,6 +133,7 @@ def collector(
ctx.obj["mqtt_username"] = mqtt_username
ctx.obj["mqtt_password"] = mqtt_password
ctx.obj["prefix"] = prefix
ctx.obj["mqtt_tls"] = mqtt_tls
ctx.obj["data_home"] = data_home or settings.data_home
ctx.obj["seed_home"] = settings.effective_seed_home
ctx.obj["database_url"] = effective_db_url
@@ -134,9 +148,11 @@ def collector(
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
prefix=prefix,
mqtt_tls=mqtt_tls,
database_url=effective_db_url,
log_level=log_level,
data_home=data_home or settings.data_home,
seed_home=settings.effective_seed_home,
)
@@ -146,12 +162,17 @@ def _run_collector_service(
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
database_url: str,
log_level: str,
data_home: str,
seed_home: str,
) -> None:
"""Run the collector service.
Note: Seed data import should be done using the 'meshcore-hub collector seed'
command or the dedicated seed container before starting the collector service.
Webhooks can be configured via environment variables:
- WEBHOOK_ADVERTISEMENT_URL: Webhook for advertisement events
- WEBHOOK_MESSAGE_URL: Webhook for all message events
@@ -168,20 +189,22 @@ def _run_collector_service(
click.echo("Starting MeshCore Collector")
click.echo(f"Data home: {data_home}")
click.echo(f"Seed home: {seed_home}")
click.echo(f"MQTT: {mqtt_host}:{mqtt_port} (prefix: {prefix})")
click.echo(f"Database: {database_url}")
# Load webhook configuration from settings
from meshcore_hub.common.config import get_collector_settings
from meshcore_hub.collector.webhook import (
WebhookDispatcher,
create_webhooks_from_settings,
)
from meshcore_hub.common.config import get_collector_settings
settings = get_collector_settings()
webhooks = create_webhooks_from_settings(settings)
webhook_dispatcher = WebhookDispatcher(webhooks) if webhooks else None
click.echo("")
if webhook_dispatcher and webhook_dispatcher.webhooks:
click.echo(f"Webhooks configured: {len(webhooks)}")
for wh in webhooks:
@@ -191,14 +214,42 @@ def _run_collector_service(
from meshcore_hub.collector.subscriber import run_collector
# Show cleanup configuration
click.echo("")
click.echo("Cleanup configuration:")
if settings.data_retention_enabled:
click.echo(
f" Event data: Enabled (retention: {settings.data_retention_days} days)"
)
else:
click.echo(" Event data: Disabled")
if settings.node_cleanup_enabled:
click.echo(
f" Inactive nodes: Enabled (inactivity: {settings.node_cleanup_days} days)"
)
else:
click.echo(" Inactive nodes: Disabled")
if settings.data_retention_enabled or settings.node_cleanup_enabled:
click.echo(f" Interval: {settings.data_retention_interval_hours} hours")
click.echo("")
click.echo("Starting MQTT subscriber...")
run_collector(
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
database_url=database_url,
webhook_dispatcher=webhook_dispatcher,
cleanup_enabled=settings.data_retention_enabled,
cleanup_retention_days=settings.data_retention_days,
cleanup_interval_hours=settings.data_retention_interval_hours,
node_cleanup_enabled=settings.node_cleanup_enabled,
node_cleanup_days=settings.node_cleanup_days,
)
@@ -215,9 +266,11 @@ def run_cmd(ctx: click.Context) -> None:
mqtt_username=ctx.obj["mqtt_username"],
mqtt_password=ctx.obj["mqtt_password"],
prefix=ctx.obj["prefix"],
mqtt_tls=ctx.obj["mqtt_tls"],
database_url=ctx.obj["database_url"],
log_level=ctx.obj["log_level"],
data_home=ctx.obj["data_home"],
seed_home=ctx.obj["seed_home"],
)
@@ -236,17 +289,15 @@ def seed_cmd(
"""Import seed data from SEED_HOME directory.
Looks for the following files in SEED_HOME:
- node_tags.json: Node tag definitions (keyed by public_key)
- members.json: Network member definitions
- node_tags.yaml: Node tag definitions (keyed by public_key)
- members.yaml: Network member definitions
Files that don't exist are skipped. This command is idempotent -
existing records are updated, new records are created.
SEED_HOME defaults to {DATA_HOME}/collector but can be overridden
SEED_HOME defaults to ./seed but can be overridden
with the --seed-home option or SEED_HOME environment variable.
"""
from pathlib import Path
configure_logging(level=ctx.obj["log_level"])
seed_home = ctx.obj["seed_home"]
@@ -254,50 +305,17 @@ def seed_cmd(
click.echo(f"Database: {ctx.obj['database_url']}")
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.tag_import import import_tags
from meshcore_hub.collector.member_import import import_members
# Initialize database
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
db.create_tables()
# Track what was imported
imported_any = False
# Import node tags if file exists
node_tags_file = Path(seed_home) / "node_tags.json"
if node_tags_file.exists():
click.echo(f"\nImporting node tags from: {node_tags_file}")
stats = import_tags(
file_path=str(node_tags_file),
db=db,
create_nodes=not no_create_nodes,
)
click.echo(f" Tags: {stats['created']} created, {stats['updated']} updated")
if stats["nodes_created"]:
click.echo(f" Nodes created: {stats['nodes_created']}")
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
else:
click.echo(f"\nNo node_tags.json found in {seed_home}")
# Import members if file exists
members_file = Path(seed_home) / "members.json"
if members_file.exists():
click.echo(f"\nImporting members from: {members_file}")
stats = import_members(
file_path=str(members_file),
db=db,
)
click.echo(f" Members: {stats['created']} created, {stats['updated']} updated")
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
else:
click.echo(f"\nNo members.json found in {seed_home}")
# Run seed import
imported_any = _run_seed_import(
seed_home=seed_home,
db=db,
create_nodes=not no_create_nodes,
verbose=True,
)
if not imported_any:
click.echo("\nNo seed files found. Nothing to import.")
@@ -307,6 +325,79 @@ def seed_cmd(
db.dispose()
def _run_seed_import(
seed_home: str,
db: "DatabaseManager",
create_nodes: bool = True,
verbose: bool = False,
) -> bool:
"""Run seed import from SEED_HOME directory.
Args:
seed_home: Path to seed home directory
db: Database manager instance
create_nodes: If True, create nodes that don't exist
verbose: If True, output progress messages
Returns:
True if any files were imported, False otherwise
"""
from pathlib import Path
from meshcore_hub.collector.member_import import import_members
from meshcore_hub.collector.tag_import import import_tags
imported_any = False
# Import node tags if file exists
node_tags_file = Path(seed_home) / "node_tags.yaml"
if node_tags_file.exists():
if verbose:
click.echo(f"\nImporting node tags from: {node_tags_file}")
stats = import_tags(
file_path=str(node_tags_file),
db=db,
create_nodes=create_nodes,
clear_existing=True,
)
if verbose:
if stats["deleted"]:
click.echo(f" Deleted {stats['deleted']} existing tags")
click.echo(
f" Tags: {stats['created']} created, {stats['updated']} updated"
)
if stats["nodes_created"]:
click.echo(f" Nodes created: {stats['nodes_created']}")
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
elif verbose:
click.echo(f"\nNo node_tags.yaml found in {seed_home}")
# Import members if file exists
members_file = Path(seed_home) / "members.yaml"
if members_file.exists():
if verbose:
click.echo(f"\nImporting members from: {members_file}")
stats = import_members(
file_path=str(members_file),
db=db,
)
if verbose:
click.echo(
f" Members: {stats['created']} created, {stats['updated']} updated"
)
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
elif verbose:
click.echo(f"\nNo members.yaml found in {seed_home}")
return imported_any
@collector.command("import-tags")
@click.argument("file", type=click.Path(), required=False, default=None)
@click.option(
@@ -315,38 +406,46 @@ def seed_cmd(
default=False,
help="Skip tags for nodes that don't exist (default: create nodes)",
)
@click.option(
"--clear-existing",
is_flag=True,
default=False,
help="Delete all existing tags before importing",
)
@click.pass_context
def import_tags_cmd(
ctx: click.Context,
file: str | None,
no_create_nodes: bool,
clear_existing: bool,
) -> None:
"""Import node tags from a JSON file.
"""Import node tags from a YAML file.
Reads a JSON file containing tag definitions and upserts them
into the database. Existing tags are updated, new tags are created.
Reads a YAML file containing tag definitions and upserts them
into the database. By default, existing tags are updated and new tags are created.
Use --clear-existing to delete all tags before importing.
FILE is the path to the JSON file containing tags.
If not provided, defaults to {SEED_HOME}/node_tags.json.
FILE is the path to the YAML file containing tags.
If not provided, defaults to {SEED_HOME}/node_tags.yaml.
Expected YAML format (keyed by public_key):
Expected JSON format (keyed by public_key):
\b
{
"0123456789abcdef...": {
"friendly_name": "My Node",
"location": {"value": "52.0,1.0", "type": "coordinate"},
"altitude": {"value": "150", "type": "number"}
}
}
0123456789abcdef...:
friendly_name: My Node
location:
value: "52.0,1.0"
type: coordinate
altitude:
value: "150"
type: number
Shorthand is also supported (string values with default type):
\b
{
"0123456789abcdef...": {
"friendly_name": "My Node",
"role": "gateway"
}
}
0123456789abcdef...:
friendly_name: My Node
role: gateway
Supported types: string, number, boolean, coordinate
"""
@@ -362,7 +461,7 @@ def import_tags_cmd(
if not Path(tags_file).exists():
click.echo(f"Tags file not found: {tags_file}")
if not file:
click.echo("Specify a file path or create the default node_tags.json.")
click.echo("Specify a file path or create the default node_tags.yaml.")
return
click.echo(f"Importing tags from: {tags_file}")
@@ -371,20 +470,22 @@ def import_tags_cmd(
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.tag_import import import_tags
# Initialize database
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
db.create_tables()
# Import tags
stats = import_tags(
file_path=tags_file,
db=db,
create_nodes=not no_create_nodes,
clear_existing=clear_existing,
)
# Report results
click.echo("")
click.echo("Import complete:")
if stats["deleted"]:
click.echo(f" Tags deleted: {stats['deleted']}")
click.echo(f" Total tags in file: {stats['total']}")
click.echo(f" Tags created: {stats['created']}")
click.echo(f" Tags updated: {stats['updated']}")
@@ -407,33 +508,29 @@ def import_members_cmd(
ctx: click.Context,
file: str | None,
) -> None:
"""Import network members from a JSON file.
"""Import network members from a YAML file.
Reads a JSON file containing member definitions and upserts them
Reads a YAML file containing member definitions and upserts them
into the database. Existing members (matched by name) are updated,
new members are created.
FILE is the path to the JSON file containing members.
If not provided, defaults to {SEED_HOME}/members.json.
FILE is the path to the YAML file containing members.
If not provided, defaults to {SEED_HOME}/members.yaml.
Expected YAML format (list):
Expected JSON format (list):
\b
[
{
"name": "John Doe",
"callsign": "N0CALL",
"role": "Network Operator",
"description": "Example member"
}
]
- name: John Doe
callsign: N0CALL
role: Network Operator
description: Example member
Or with "members" key:
\b
{
"members": [
{"name": "John Doe", "callsign": "N0CALL", ...}
]
}
members:
- name: John Doe
callsign: N0CALL
"""
from pathlib import Path
@@ -447,7 +544,7 @@ def import_members_cmd(
if not Path(members_file).exists():
click.echo(f"Members file not found: {members_file}")
if not file:
click.echo("Specify a file path or create the default members.json.")
click.echo("Specify a file path or create the default members.yaml.")
return
click.echo(f"Importing members from: {members_file}")
@@ -456,9 +553,8 @@ def import_members_cmd(
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.member_import import import_members
# Initialize database
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
db.create_tables()
# Import members
stats = import_members(
@@ -480,3 +576,299 @@ def import_members_cmd(
click.echo(f" - {error}", err=True)
db.dispose()
@collector.command("cleanup")
@click.option(
"--retention-days",
type=int,
default=30,
envvar="DATA_RETENTION_DAYS",
help="Number of days to retain data (default: 30)",
)
@click.option(
"--dry-run",
is_flag=True,
default=False,
help="Show what would be deleted without deleting",
)
@click.pass_context
def cleanup_cmd(
ctx: click.Context,
retention_days: int,
dry_run: bool,
) -> None:
"""Manually run data cleanup to delete old events.
Deletes event data older than the retention period:
- Advertisements
- Messages (channel and direct)
- Telemetry
- Trace paths
- Event logs
Node records are never deleted - only event data.
Use --dry-run to preview what would be deleted without
actually deleting anything.
"""
import asyncio
configure_logging(level=ctx.obj["log_level"])
click.echo(f"Database: {ctx.obj['database_url']}")
click.echo(f"Retention: {retention_days} days")
click.echo(f"Mode: {'DRY RUN' if dry_run else 'LIVE'}")
click.echo("")
if dry_run:
click.echo("Running in dry-run mode - no data will be deleted.")
else:
click.echo("WARNING: This will permanently delete old event data!")
if not click.confirm("Continue?"):
click.echo("Aborted.")
return
click.echo("")
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.cleanup import cleanup_old_data
# Initialize database
db = DatabaseManager(ctx.obj["database_url"])
# Run cleanup
async def run_cleanup() -> None:
async with db.async_session() as session:
stats = await cleanup_old_data(
session,
retention_days,
dry_run=dry_run,
)
click.echo("")
click.echo("Cleanup results:")
click.echo(f" Advertisements: {stats.advertisements_deleted}")
click.echo(f" Messages: {stats.messages_deleted}")
click.echo(f" Telemetry: {stats.telemetry_deleted}")
click.echo(f" Trace paths: {stats.trace_paths_deleted}")
click.echo(f" Event logs: {stats.event_logs_deleted}")
click.echo(f" Total: {stats.total_deleted}")
if dry_run:
click.echo("")
click.echo("(Dry run - no data was actually deleted)")
asyncio.run(run_cleanup())
db.dispose()
click.echo("")
click.echo("Cleanup complete." if not dry_run else "Dry run complete.")
@collector.command("truncate")
@click.option(
"--members",
is_flag=True,
default=False,
help="Truncate members table",
)
@click.option(
"--nodes",
is_flag=True,
default=False,
help="Truncate nodes table (also clears tags, advertisements, messages, telemetry, trace paths)",
)
@click.option(
"--messages",
is_flag=True,
default=False,
help="Truncate messages table",
)
@click.option(
"--advertisements",
is_flag=True,
default=False,
help="Truncate advertisements table",
)
@click.option(
"--telemetry",
is_flag=True,
default=False,
help="Truncate telemetry table",
)
@click.option(
"--trace-paths",
is_flag=True,
default=False,
help="Truncate trace_paths table",
)
@click.option(
"--event-logs",
is_flag=True,
default=False,
help="Truncate event_logs table",
)
@click.option(
"--all",
"truncate_all",
is_flag=True,
default=False,
help="Truncate ALL tables (use with caution!)",
)
@click.option(
"--yes",
is_flag=True,
default=False,
help="Skip confirmation prompt",
)
@click.pass_context
def truncate_cmd(
ctx: click.Context,
members: bool,
nodes: bool,
messages: bool,
advertisements: bool,
telemetry: bool,
trace_paths: bool,
event_logs: bool,
truncate_all: bool,
yes: bool,
) -> None:
"""Truncate (clear) data tables.
WARNING: This permanently deletes data! Use with caution.
Examples:
# Clear members table
meshcore-hub collector truncate --members
# Clear messages and advertisements
meshcore-hub collector truncate --messages --advertisements
# Clear everything (requires confirmation)
meshcore-hub collector truncate --all
Note: Clearing nodes also clears all related data (tags, advertisements,
messages, telemetry, trace paths) due to foreign key constraints.
"""
configure_logging(level=ctx.obj["log_level"])
# Determine what to truncate
if truncate_all:
tables_to_clear = {
"members": True,
"nodes": True,
"messages": True,
"advertisements": True,
"telemetry": True,
"trace_paths": True,
"event_logs": True,
}
else:
tables_to_clear = {
"members": members,
"nodes": nodes,
"messages": messages,
"advertisements": advertisements,
"telemetry": telemetry,
"trace_paths": trace_paths,
"event_logs": event_logs,
}
# Check if any tables selected
if not any(tables_to_clear.values()):
click.echo("No tables specified. Use --help to see available options.")
return
# Show what will be cleared
click.echo("Database: " + ctx.obj["database_url"])
click.echo("")
click.echo("The following tables will be PERMANENTLY CLEARED:")
for table, should_clear in tables_to_clear.items():
if should_clear:
click.echo(f" - {table}")
if tables_to_clear.get("nodes"):
click.echo("")
click.echo(
"WARNING: Clearing nodes will also clear all related data due to foreign keys:"
)
click.echo(" - node_tags")
click.echo(" - advertisements")
click.echo(" - messages")
click.echo(" - telemetry")
click.echo(" - trace_paths")
click.echo("")
# Confirm
if not yes:
if not click.confirm(
"Are you sure you want to permanently delete this data?", default=False
):
click.echo("Aborted.")
return
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import (
Advertisement,
EventLog,
Member,
Message,
Node,
NodeTag,
Telemetry,
TracePath,
)
from sqlalchemy import delete
from sqlalchemy.engine import CursorResult
db = DatabaseManager(ctx.obj["database_url"])
with db.session_scope() as session:
# Truncate in correct order to respect foreign keys
cleared: list[str] = []
# Clear members (no dependencies)
if tables_to_clear.get("members"):
result: CursorResult = session.execute(delete(Member)) # type: ignore
cleared.append(f"members: {result.rowcount} rows")
# Clear event-specific tables first (they depend on nodes)
if tables_to_clear.get("messages"):
result = session.execute(delete(Message)) # type: ignore
cleared.append(f"messages: {result.rowcount} rows")
if tables_to_clear.get("advertisements"):
result = session.execute(delete(Advertisement)) # type: ignore
cleared.append(f"advertisements: {result.rowcount} rows")
if tables_to_clear.get("telemetry"):
result = session.execute(delete(Telemetry)) # type: ignore
cleared.append(f"telemetry: {result.rowcount} rows")
if tables_to_clear.get("trace_paths"):
result = session.execute(delete(TracePath)) # type: ignore
cleared.append(f"trace_paths: {result.rowcount} rows")
if tables_to_clear.get("event_logs"):
result = session.execute(delete(EventLog)) # type: ignore
cleared.append(f"event_logs: {result.rowcount} rows")
# Clear nodes last (this will cascade delete tags and any remaining events)
if tables_to_clear.get("nodes"):
# Delete tags first (they depend on nodes)
tag_result: CursorResult = session.execute(delete(NodeTag)) # type: ignore
cleared.append(f"node_tags: {tag_result.rowcount} rows (cascade)")
# Delete nodes (will cascade to remaining related tables)
node_result: CursorResult = session.execute(delete(Node)) # type: ignore
cleared.append(f"nodes: {node_result.rowcount} rows")
db.dispose()
click.echo("")
click.echo("Truncate complete. Cleared:")
for item in cleared:
click.echo(f" - {item}")
click.echo("")

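The table-selection logic at the top of the command (`--all` overrides the individual per-table flags, and selecting nothing is a no-op) is easy to factor out and check in isolation. A minimal sketch; `select_tables` is a hypothetical helper, not part of the codebase:

```python
def select_tables(truncate_all: bool, **flags: bool) -> dict[str, bool]:
    """Mirror truncate_cmd's selection: --all clears everything,
    otherwise honour the individual per-table flags."""
    names = [
        "members", "nodes", "messages", "advertisements",
        "telemetry", "trace_paths", "event_logs",
    ]
    if truncate_all:
        return {name: True for name in names}
    return {name: flags.get(name, False) for name in names}


assert all(select_tables(True).values())
sel = select_tables(False, messages=True, advertisements=True)
assert [t for t, on in sel.items() if on] == ["messages", "advertisements"]
assert not any(select_tables(False).values())  # nothing selected -> no-op
```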

@@ -19,7 +19,7 @@ def register_all_handlers(subscriber: "Subscriber") -> None:
)
from meshcore_hub.collector.handlers.trace import handle_trace_data
from meshcore_hub.collector.handlers.telemetry import handle_telemetry
from meshcore_hub.collector.handlers.contacts import handle_contacts
from meshcore_hub.collector.handlers.contacts import handle_contact
from meshcore_hub.collector.handlers.event_log import handle_event_log
# Persisted events with specific handlers
@@ -28,7 +28,7 @@ def register_all_handlers(subscriber: "Subscriber") -> None:
subscriber.register_handler("channel_msg_recv", handle_channel_message)
subscriber.register_handler("trace_data", handle_trace_data)
subscriber.register_handler("telemetry_response", handle_telemetry)
subscriber.register_handler("contacts", handle_contacts)
subscriber.register_handler("contact", handle_contact) # Individual contact events
# Informational events (logged only)
subscriber.register_handler("send_confirmed", handle_event_log)


@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Advertisement, Node
from meshcore_hub.common.hash_utils import compute_advertisement_hash
from meshcore_hub.common.models import Advertisement, Node, add_event_receiver
logger = logging.getLogger(__name__)
@@ -40,8 +42,17 @@ def handle_advertisement(
flags = payload.get("flags")
now = datetime.now(timezone.utc)
# Compute event hash for deduplication (30-second time bucket)
event_hash = compute_advertisement_hash(
public_key=adv_public_key,
name=name,
adv_type=adv_type,
flags=flags,
received_at=now,
)
with db.session_scope() as session:
# Find or create receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -55,6 +66,37 @@ def handle_advertisement(
)
session.add(receiver_node)
session.flush()
else:
receiver_node.last_seen = now
# Check if advertisement with same hash already exists
existing = session.execute(
select(Advertisement.id).where(Advertisement.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Still update advertised node's last_seen even for duplicate advertisements
node_query = select(Node).where(Node.public_key == adv_public_key)
node = session.execute(node_query).scalar_one_or_none()
if node:
node.last_seen = now
# Add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None, # Advertisements don't have SNR
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to advertisement "
f"(hash={event_hash[:8]}...)"
)
return
# Find or create advertised node
node_query = select(Node).where(Node.public_key == adv_public_key)
@@ -91,9 +133,43 @@ def handle_advertisement(
adv_type=adv_type,
flags=flags,
received_at=now,
event_hash=event_hash,
)
session.add(advertisement)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate advertisement skipped (race condition, "
f"hash={event_hash[:8]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
logger.info(
f"Stored advertisement from {name or adv_public_key[:12]!r} "
f"(type={adv_type})"


@@ -1,4 +1,4 @@
"""Handler for contacts sync events."""
"""Handler for contact sync events."""
import logging
from datetime import datetime, timezone
@@ -11,65 +11,79 @@ from meshcore_hub.common.models import Node
logger = logging.getLogger(__name__)
# Map numeric node type to string representation
NODE_TYPE_MAP = {
0: "none",
1: "chat",
2: "repeater",
3: "room",
}
def handle_contacts(
def handle_contact(
public_key: str,
event_type: str,
payload: dict[str, Any],
db: DatabaseManager,
) -> None:
"""Handle a contacts sync event.
"""Handle a single contact event.
Upserts all contacts in the contacts list.
Upserts a contact into the nodes table.
Args:
public_key: Receiver node's public key (from MQTT topic)
event_type: Event type name
payload: Contacts payload
payload: Single contact object with fields:
- public_key: Contact's public key
- adv_name: Advertised name
- type: Numeric node type (0=none, 1=chat, 2=repeater, 3=room)
db: Database manager
"""
contacts = payload.get("contacts", [])
if not contacts:
logger.debug("Empty contacts list received")
contact_key = payload.get("public_key")
if not contact_key:
logger.warning("Contact event missing public_key field")
return
# Device uses 'adv_name' for the advertised name
name = payload.get("adv_name") or payload.get("name")
logger.info(f"Processing contact: {contact_key[:12]}... adv_name={name}")
# Device uses numeric 'type' field, convert to string
raw_type = payload.get("type")
if raw_type is not None:
node_type: str | None = NODE_TYPE_MAP.get(raw_type, str(raw_type))
else:
node_type = payload.get("node_type")
now = datetime.now(timezone.utc)
created_count = 0
updated_count = 0
with db.session_scope() as session:
for contact in contacts:
contact_key = contact.get("public_key")
if not contact_key:
continue
# Find or create node
node_query = select(Node).where(Node.public_key == contact_key)
node = session.execute(node_query).scalar_one_or_none()
name = contact.get("name")
node_type = contact.get("node_type")
# Find or create node
node_query = select(Node).where(Node.public_key == contact_key)
node = session.execute(node_query).scalar_one_or_none()
if node:
# Update existing node
if name and not node.name:
node.name = name
if node_type and not node.adv_type:
node.adv_type = node_type
node.last_seen = now
updated_count += 1
else:
# Create new node
node = Node(
public_key=contact_key,
name=name,
adv_type=node_type,
first_seen=now,
last_seen=now,
if node:
# Update existing node - always update name if we have one
if name and name != node.name:
logger.info(
f"Updating node {contact_key[:12]}... "
f"name: {node.name!r} -> {name!r}"
)
session.add(node)
created_count += 1
logger.info(
f"Processed contacts sync: {created_count} new, {updated_count} updated"
)
node.name = name
if node_type and not node.adv_type:
node.adv_type = node_type
# Do NOT update last_seen for contact sync - only advertisement events
# should update last_seen since that's when the node was actually seen
else:
# Create new node from contact database
# Set last_seen=None since we haven't actually seen this node advertise yet
node = Node(
public_key=contact_key,
name=name,
adv_type=node_type,
first_seen=now,
last_seen=None, # Will be set when we receive an advertisement
)
session.add(node)
logger.info(f"Created node from contact: {contact_key[:12]}... ({name})")


@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Message, Node
from meshcore_hub.common.hash_utils import compute_message_hash
from meshcore_hub.common.models import Message, Node, add_event_receiver
logger = logging.getLogger(__name__)
@@ -84,8 +86,17 @@ def _handle_message(
except (ValueError, OSError):
pass
# Compute event hash for deduplication
event_hash = compute_message_hash(
text=text,
pubkey_prefix=pubkey_prefix,
channel_idx=channel_idx,
sender_timestamp=sender_timestamp,
txt_type=txt_type,
)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -102,6 +113,29 @@ def _handle_message(
else:
receiver_node.last_seen = now
# Check if message with same hash already exists
existing = session.execute(
select(Message.id).where(Message.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to message "
f"(hash={event_hash[:8]}...)"
)
return
# Create message record
message = Message(
receiver_node_id=receiver_node.id if receiver_node else None,
@@ -115,9 +149,42 @@ def _handle_message(
snr=snr,
sender_timestamp=sender_timestamp,
received_at=now,
event_hash=event_hash,
)
session.add(message)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate message skipped (race condition, hash={event_hash[:8]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
return
if message_type == "contact":
logger.info(
f"Stored contact message from {pubkey_prefix!r}: "


@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Node, Telemetry
from meshcore_hub.common.hash_utils import compute_telemetry_hash
from meshcore_hub.common.models import Node, Telemetry, add_event_receiver
logger = logging.getLogger(__name__)
@@ -49,8 +51,15 @@ def handle_telemetry(
except ValueError:
lpp_bytes = lpp_data.encode()
# Compute event hash for deduplication (30-second time bucket)
event_hash = compute_telemetry_hash(
node_public_key=node_public_key,
parsed_data=parsed_data,
received_at=now,
)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -67,6 +76,29 @@ def handle_telemetry(
else:
receiver_node.last_seen = now
# Check if telemetry with same hash already exists
existing = session.execute(
select(Telemetry.id).where(Telemetry.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to telemetry "
f"(node={node_public_key[:12]}...)"
)
return
# Find or create reporting node
reporting_node = None
if node_public_key:
@@ -92,9 +124,43 @@ def handle_telemetry(
lpp_data=lpp_bytes,
parsed_data=parsed_data,
received_at=now,
event_hash=event_hash,
)
session.add(telemetry)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate telemetry skipped (race condition, "
f"node={node_public_key[:12]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
# Log telemetry values
if parsed_data:
values = ", ".join(f"{k}={v}" for k, v in parsed_data.items())


@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Node, TracePath
from meshcore_hub.common.hash_utils import compute_trace_hash
from meshcore_hub.common.models import Node, TracePath, add_event_receiver
logger = logging.getLogger(__name__)
@@ -40,8 +42,11 @@ def handle_trace_data(
snr_values = payload.get("snr_values")
hop_count = payload.get("hop_count")
# Compute event hash for deduplication (initiator_tag is unique per trace)
event_hash = compute_trace_hash(initiator_tag=initiator_tag)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -58,6 +63,29 @@ def handle_trace_data(
else:
receiver_node.last_seen = now
# Check if trace with same hash already exists
existing = session.execute(
select(TracePath.id).where(TracePath.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None, # Trace events don't have a single SNR value
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to trace "
f"(tag={initiator_tag})"
)
return
# Create trace path record
trace_path = TracePath(
receiver_node_id=receiver_node.id if receiver_node else None,
@@ -69,7 +97,40 @@ def handle_trace_data(
snr_values=snr_values,
hop_count=hop_count,
received_at=now,
event_hash=event_hash,
)
session.add(trace_path)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate trace skipped (race condition, tag={initiator_tag})"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
logger.info(f"Stored trace data: tag={initiator_tag}, hops={hop_count}")
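The contract all four handlers rely on from `add_event_receiver` — insert a (event, receiver) association at most once and report whether a row was actually added — can be sketched against a plain set. This is a toy stand-in; the real helper writes to a junction table keyed (presumably) on event type, event hash, and receiver node id:

```python
# Toy stand-in for the event/receiver junction table.
_seen: set[tuple[str, str, int]] = set()


def add_event_receiver_stub(
    event_type: str, event_hash: str, receiver_node_id: int
) -> bool:
    """Record a receiver for an event; return False if already recorded.

    Mirrors the idempotent behaviour the handlers depend on when the
    same trace/message/advert arrives via multiple receiver nodes.
    """
    key = (event_type, event_hash, receiver_node_id)
    if key in _seen:
        return False
    _seen.add(key)
    return True


assert add_event_receiver_stub("trace", "deadbeef", 1) is True
assert add_event_receiver_stub("trace", "deadbeef", 1) is False  # duplicate
assert add_event_receiver_stub("trace", "deadbeef", 2) is True   # new receiver
```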


@@ -1,11 +1,11 @@
"""Import members from JSON file."""
"""Import members from YAML file."""
import json
import logging
from pathlib import Path
from typing import Any, Optional
from pydantic import BaseModel, Field, field_validator
import yaml
from pydantic import BaseModel, Field
from sqlalchemy import select
from meshcore_hub.common.database import DatabaseManager
@@ -15,47 +15,46 @@ logger = logging.getLogger(__name__)
class MemberData(BaseModel):
"""Schema for a member entry in the import file."""
"""Schema for a member entry in the import file.
Note: Nodes are associated with members via a 'member_id' tag on the node,
not through this schema.
"""
member_id: str = Field(..., min_length=1, max_length=100)
name: str = Field(..., min_length=1, max_length=255)
callsign: Optional[str] = Field(default=None, max_length=20)
role: Optional[str] = Field(default=None, max_length=100)
description: Optional[str] = Field(default=None)
contact: Optional[str] = Field(default=None, max_length=255)
public_key: Optional[str] = Field(default=None)
@field_validator("public_key")
@classmethod
def validate_public_key(cls, v: Optional[str]) -> Optional[str]:
"""Validate and normalize public key if provided."""
if v is None:
return None
if len(v) != 64:
raise ValueError(f"public_key must be 64 characters, got {len(v)}")
if not all(c in "0123456789abcdefABCDEF" for c in v):
raise ValueError("public_key must be a valid hex string")
return v.lower()
def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
"""Load and validate members from a JSON file.
"""Load and validate members from a YAML file.
Supports two formats:
1. List of member objects:
[{"name": "Member 1", ...}, {"name": "Member 2", ...}]
- member_id: member1
name: Member 1
callsign: M1
2. Object with "members" key:
{"members": [{"name": "Member 1", ...}, ...]}
members:
- member_id: member1
name: Member 1
callsign: M1
Args:
file_path: Path to the members JSON file
file_path: Path to the members YAML file
Returns:
List of validated member dictionaries
Raises:
FileNotFoundError: If file does not exist
json.JSONDecodeError: If file is not valid JSON
yaml.YAMLError: If file is not valid YAML
ValueError: If file content is invalid
"""
path = Path(file_path)
@@ -63,7 +62,7 @@ def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
raise FileNotFoundError(f"Members file not found: {file_path}")
with open(path, "r") as f:
data = json.load(f)
data = yaml.safe_load(f)
# Handle both formats
if isinstance(data, list):
@@ -73,15 +72,15 @@ def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
if not isinstance(members_list, list):
raise ValueError("'members' key must contain a list")
else:
raise ValueError(
"Members file must be a list or an object with 'members' key"
)
raise ValueError("Members file must be a list or a mapping with 'members' key")
# Validate each member
validated: list[dict[str, Any]] = []
for i, member in enumerate(members_list):
if not isinstance(member, dict):
raise ValueError(f"Member at index {i} must be an object")
if "member_id" not in member:
raise ValueError(f"Member at index {i} must have a 'member_id' field")
if "name" not in member:
raise ValueError(f"Member at index {i} must have a 'name' field")
@@ -99,13 +98,16 @@ def import_members(
file_path: str | Path,
db: DatabaseManager,
) -> dict[str, Any]:
"""Import members from a JSON file into the database.
"""Import members from a YAML file into the database.
Performs upsert operations based on name - existing members are updated,
Performs upsert operations based on member_id - existing members are updated,
new members are created.
Note: Nodes are associated with members via a 'member_id' tag on the node.
This import does not manage node associations.
Args:
file_path: Path to the members JSON file
file_path: Path to the members YAML file
db: Database manager instance
Returns:
@@ -134,14 +136,17 @@ def import_members(
with db.session_scope() as session:
for member_data in members_data:
try:
member_id = member_data["member_id"]
name = member_data["name"]
# Find existing member by name
query = select(Member).where(Member.name == name)
# Find existing member by member_id
query = select(Member).where(Member.member_id == member_id)
existing = session.execute(query).scalar_one_or_none()
if existing:
# Update existing member
if member_data.get("name") is not None:
existing.name = member_data["name"]
if member_data.get("callsign") is not None:
existing.callsign = member_data["callsign"]
if member_data.get("role") is not None:
@@ -150,27 +155,26 @@ def import_members(
existing.description = member_data["description"]
if member_data.get("contact") is not None:
existing.contact = member_data["contact"]
if member_data.get("public_key") is not None:
existing.public_key = member_data["public_key"]
stats["updated"] += 1
logger.debug(f"Updated member: {name}")
logger.debug(f"Updated member: {member_id} ({name})")
else:
# Create new member
new_member = Member(
member_id=member_id,
name=name,
callsign=member_data.get("callsign"),
role=member_data.get("role"),
description=member_data.get("description"),
contact=member_data.get("contact"),
public_key=member_data.get("public_key"),
)
session.add(new_member)
stats["created"] += 1
logger.debug(f"Created member: {name}")
logger.debug(f"Created member: {member_id} ({name})")
except Exception as e:
error_msg = f"Error processing member '{member_data.get('name', 'unknown')}': {e}"
error_msg = f"Error processing member '{member_data.get('member_id', 'unknown')}' ({member_data.get('name', 'unknown')}): {e}"
stats["errors"].append(error_msg)
logger.error(error_msg)
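The two accepted YAML shapes from the `load_members_file` docstring reduce to a small format-detection step once `yaml.safe_load` has run. A stdlib-only sketch of that step (`extract_members` is an illustrative extraction of the loader's branching, not a project function):

```python
from typing import Any


def extract_members(data: Any) -> list[dict[str, Any]]:
    """Mirror load_members_file's format handling: accept either a bare
    list of members or a mapping with a 'members' key."""
    if isinstance(data, list):
        return data
    if isinstance(data, dict):
        members = data.get("members")
        if not isinstance(members, list):
            raise ValueError("'members' key must contain a list")
        return members
    raise ValueError("Members file must be a list or a mapping with 'members' key")


member = {"member_id": "member1", "name": "Member 1", "callsign": "M1"}
assert extract_members([member]) == [member]           # bare list form
assert extract_members({"members": [member]}) == [member]  # mapping form
```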


@@ -6,6 +6,7 @@ The subscriber:
3. Routes events to appropriate handlers
4. Persists data to database
5. Dispatches events to configured webhooks
6. Performs scheduled data cleanup if enabled
"""
import asyncio
@@ -14,6 +15,7 @@ import signal
import threading
import time
import uuid
from datetime import datetime, timezone
from typing import Any, Callable, Optional, TYPE_CHECKING
from meshcore_hub.common.database import DatabaseManager
@@ -38,6 +40,11 @@ class Subscriber:
mqtt_client: MQTTClient,
db_manager: DatabaseManager,
webhook_dispatcher: Optional["WebhookDispatcher"] = None,
cleanup_enabled: bool = False,
cleanup_retention_days: int = 30,
cleanup_interval_hours: int = 24,
node_cleanup_enabled: bool = False,
node_cleanup_days: int = 90,
):
"""Initialize subscriber.
@@ -45,6 +52,11 @@ class Subscriber:
mqtt_client: MQTT client instance
db_manager: Database manager instance
webhook_dispatcher: Optional webhook dispatcher for event forwarding
cleanup_enabled: Enable automatic event data cleanup
cleanup_retention_days: Number of days to retain event data
cleanup_interval_hours: Hours between cleanup runs
node_cleanup_enabled: Enable automatic cleanup of inactive nodes
node_cleanup_days: Remove nodes not seen for this many days
"""
self.mqtt = mqtt_client
self.db = db_manager
@@ -59,6 +71,14 @@ class Subscriber:
self._webhook_queue: list[tuple[str, dict[str, Any], str]] = []
self._webhook_lock = threading.Lock()
self._webhook_thread: Optional[threading.Thread] = None
# Data cleanup
self._cleanup_enabled = cleanup_enabled
self._cleanup_retention_days = cleanup_retention_days
self._cleanup_interval_hours = cleanup_interval_hours
self._node_cleanup_enabled = node_cleanup_enabled
self._node_cleanup_days = node_cleanup_days
self._cleanup_thread: Optional[threading.Thread] = None
self._last_cleanup: Optional[datetime] = None
@property
def is_healthy(self) -> bool:
@@ -202,18 +222,129 @@ class Subscriber:
if self._webhook_thread.is_alive():
logger.warning("Webhook processor thread did not stop cleanly")
def _start_cleanup_scheduler(self) -> None:
"""Start background thread for periodic data cleanup."""
if not self._cleanup_enabled and not self._node_cleanup_enabled:
logger.info("Data cleanup and node cleanup are both disabled")
return
logger.info(
"Starting cleanup scheduler (interval_hours=%d)",
self._cleanup_interval_hours,
)
if self._cleanup_enabled:
logger.info(
" Event data cleanup: ENABLED (retention_days=%d)",
self._cleanup_retention_days,
)
else:
logger.info(" Event data cleanup: DISABLED")
if self._node_cleanup_enabled:
logger.info(
" Node cleanup: ENABLED (inactivity_days=%d)", self._node_cleanup_days
)
else:
logger.info(" Node cleanup: DISABLED")
def run_cleanup_loop() -> None:
"""Run async cleanup tasks in background thread."""
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
try:
while self._running:
# Check if cleanup is due
now = datetime.now(timezone.utc)
should_run = False
if self._last_cleanup is None:
# First run
should_run = True
else:
# Check if interval has passed
hours_since_last = (
now - self._last_cleanup
).total_seconds() / 3600
should_run = hours_since_last >= self._cleanup_interval_hours
if should_run:
try:
logger.info("Starting scheduled cleanup")
from meshcore_hub.collector.cleanup import (
cleanup_old_data,
cleanup_inactive_nodes,
)
# Get async session and run cleanup
async def run_cleanup() -> None:
async with self.db.async_session() as session:
# Run event data cleanup if enabled
if self._cleanup_enabled:
stats = await cleanup_old_data(
session,
self._cleanup_retention_days,
dry_run=False,
)
logger.info(
"Event cleanup completed: %s", stats
)
# Run node cleanup if enabled
if self._node_cleanup_enabled:
nodes_deleted = await cleanup_inactive_nodes(
session,
self._node_cleanup_days,
dry_run=False,
)
logger.info(
"Node cleanup completed: %d nodes deleted",
nodes_deleted,
)
loop.run_until_complete(run_cleanup())
self._last_cleanup = now
except Exception as e:
logger.error(f"Cleanup error: {e}", exc_info=True)
# Sleep for 1 hour before next check
for _ in range(3600):
if not self._running:
break
time.sleep(1)
finally:
loop.close()
logger.info("Cleanup scheduler stopped")
self._cleanup_thread = threading.Thread(
target=run_cleanup_loop, daemon=True, name="cleanup-scheduler"
)
self._cleanup_thread.start()
def _stop_cleanup_scheduler(self) -> None:
"""Stop the cleanup scheduler thread."""
if self._cleanup_thread and self._cleanup_thread.is_alive():
# Thread will exit when self._running becomes False
self._cleanup_thread.join(timeout=5.0)
if self._cleanup_thread.is_alive():
logger.warning("Cleanup scheduler thread did not stop cleanly")
def start(self) -> None:
"""Start the subscriber."""
logger.info("Starting collector subscriber")
# Create database tables if needed
# Verify database connection (schema managed by Alembic migrations)
try:
self.db.create_tables()
# Test connection by getting a session
session = self.db.get_session()
session.close()
self._db_connected = True
logger.info("Database initialized")
logger.info("Database connection verified")
except Exception as e:
self._db_connected = False
logger.error(f"Failed to initialize database: {e}")
logger.error(f"Failed to connect to database: {e}")
raise
# Connect to MQTT broker
@@ -237,6 +368,9 @@ class Subscriber:
# Start webhook processor if configured
self._start_webhook_processor()
# Start cleanup scheduler if configured
self._start_cleanup_scheduler()
# Start health reporter for Docker health checks
self._health_reporter = HealthReporter(
component="collector",
@@ -269,6 +403,9 @@ class Subscriber:
self._running = False
self._shutdown_event.set()
# Stop cleanup scheduler
self._stop_cleanup_scheduler()
# Stop webhook processor
self._stop_webhook_processor()
@@ -291,8 +428,14 @@ def create_subscriber(
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
database_url: str = "sqlite:///./meshcore.db",
webhook_dispatcher: Optional["WebhookDispatcher"] = None,
cleanup_enabled: bool = False,
cleanup_retention_days: int = 30,
cleanup_interval_hours: int = 24,
node_cleanup_enabled: bool = False,
node_cleanup_days: int = 90,
) -> Subscriber:
"""Create a configured subscriber instance.
@@ -302,8 +445,14 @@ def create_subscriber(
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
database_url: Database connection URL
webhook_dispatcher: Optional webhook dispatcher for event forwarding
cleanup_enabled: Enable automatic event data cleanup
cleanup_retention_days: Number of days to retain event data
cleanup_interval_hours: Hours between cleanup runs
node_cleanup_enabled: Enable automatic cleanup of inactive nodes
node_cleanup_days: Remove nodes not seen for this many days
Returns:
Configured Subscriber instance
@@ -317,6 +466,7 @@ def create_subscriber(
password=mqtt_password,
prefix=mqtt_prefix,
client_id=f"meshcore-collector-{unique_id}",
tls=mqtt_tls,
)
mqtt_client = MQTTClient(mqtt_config)
@@ -324,7 +474,16 @@ def create_subscriber(
db_manager = DatabaseManager(database_url)
# Create subscriber
subscriber = Subscriber(mqtt_client, db_manager, webhook_dispatcher)
subscriber = Subscriber(
mqtt_client,
db_manager,
webhook_dispatcher,
cleanup_enabled=cleanup_enabled,
cleanup_retention_days=cleanup_retention_days,
cleanup_interval_hours=cleanup_interval_hours,
node_cleanup_enabled=node_cleanup_enabled,
node_cleanup_days=node_cleanup_days,
)
# Register handlers
from meshcore_hub.collector.handlers import register_all_handlers
@@ -340,8 +499,14 @@ def run_collector(
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
database_url: str = "sqlite:///./meshcore.db",
webhook_dispatcher: Optional["WebhookDispatcher"] = None,
cleanup_enabled: bool = False,
cleanup_retention_days: int = 30,
cleanup_interval_hours: int = 24,
node_cleanup_enabled: bool = False,
node_cleanup_days: int = 90,
) -> None:
"""Run the collector (blocking).
@@ -351,8 +516,14 @@ def run_collector(
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
database_url: Database connection URL
webhook_dispatcher: Optional webhook dispatcher for event forwarding
cleanup_enabled: Enable automatic event data cleanup
cleanup_retention_days: Number of days to retain event data
cleanup_interval_hours: Hours between cleanup runs
node_cleanup_enabled: Enable automatic cleanup of inactive nodes
node_cleanup_days: Remove nodes not seen for this many days
"""
subscriber = create_subscriber(
mqtt_host=mqtt_host,
@@ -360,8 +531,14 @@ def run_collector(
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=mqtt_prefix,
mqtt_tls=mqtt_tls,
database_url=database_url,
webhook_dispatcher=webhook_dispatcher,
cleanup_enabled=cleanup_enabled,
cleanup_retention_days=cleanup_retention_days,
cleanup_interval_hours=cleanup_interval_hours,
node_cleanup_enabled=node_cleanup_enabled,
node_cleanup_days=node_cleanup_days,
)
# Set up signal handlers

View File

@@ -1,13 +1,13 @@
"""Import node tags from JSON file."""
"""Import node tags from YAML file."""
import json
import logging
from datetime import datetime, timezone
from pathlib import Path
from typing import Any
import yaml
from pydantic import BaseModel, Field, model_validator
from sqlalchemy import select
from sqlalchemy import delete, func, select
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Node, NodeTag
@@ -64,33 +64,33 @@ def validate_public_key(public_key: str) -> str:
def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
"""Load and validate tags from a JSON file.
"""Load and validate tags from a YAML file.
New format - dictionary keyed by public_key:
{
"0123456789abcdef...": {
"friendly_name": "My Node",
"location": {"value": "52.0,1.0", "type": "coordinate"},
"altitude": {"value": "150", "type": "number"}
}
}
YAML format - dictionary keyed by public_key:
0123456789abcdef...:
friendly_name: My Node
location:
value: "52.0,1.0"
type: coordinate
altitude:
value: "150"
type: number
Shorthand is allowed - string values are auto-converted:
{
"0123456789abcdef...": {
"friendly_name": "My Node"
}
}
0123456789abcdef...:
friendly_name: My Node
Args:
file_path: Path to the tags JSON file
file_path: Path to the tags YAML file
Returns:
Dictionary mapping public_key to tag dictionary
Raises:
FileNotFoundError: If file does not exist
json.JSONDecodeError: If file is not valid JSON
yaml.YAMLError: If file is not valid YAML
ValueError: If file content is invalid
"""
path = Path(file_path)
@@ -98,10 +98,10 @@ def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
raise FileNotFoundError(f"Tags file not found: {file_path}")
with open(path, "r") as f:
data = json.load(f)
data = yaml.safe_load(f)
if not isinstance(data, dict):
raise ValueError("Tags file must contain a JSON object")
raise ValueError("Tags file must contain a YAML mapping")
# Validate each entry
validated: dict[str, dict[str, Any]] = {}
@@ -117,12 +117,24 @@ def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
for tag_key, tag_value in tags.items():
if isinstance(tag_value, dict):
# Full format with value and type
raw_value = tag_value.get("value")
# Convert value to string if it's not None
str_value = str(raw_value) if raw_value is not None else None
validated_tags[tag_key] = {
"value": tag_value.get("value"),
"value": str_value,
"type": tag_value.get("type", "string"),
}
elif isinstance(tag_value, bool):
# YAML boolean - must check before int since bool is subclass of int
validated_tags[tag_key] = {
"value": str(tag_value).lower(),
"type": "boolean",
}
elif isinstance(tag_value, (int, float)):
# YAML number (int or float)
validated_tags[tag_key] = {"value": str(tag_value), "type": "number"}
elif isinstance(tag_value, str):
# Shorthand: just a string value
# String value
validated_tags[tag_key] = {"value": tag_value, "type": "string"}
elif tag_value is None:
validated_tags[tag_key] = {"value": None, "type": "string"}
@@ -139,16 +151,19 @@ def import_tags(
file_path: str | Path,
db: DatabaseManager,
create_nodes: bool = True,
clear_existing: bool = False,
) -> dict[str, Any]:
"""Import tags from a JSON file into the database.
"""Import tags from a YAML file into the database.
Performs upsert operations - existing tags are updated, new tags are created.
Optionally clears all existing tags before import.
Args:
file_path: Path to the tags JSON file
file_path: Path to the tags YAML file
db: Database manager instance
create_nodes: If True, create nodes that don't exist. If False, skip tags
for non-existent nodes.
clear_existing: If True, delete all existing tags before importing.
Returns:
Dictionary with import statistics:
@@ -157,6 +172,7 @@ def import_tags(
- updated: Number of existing tags updated
- skipped: Number of tags skipped (node not found and create_nodes=False)
- nodes_created: Number of new nodes created
- deleted: Number of existing tags deleted (if clear_existing=True)
- errors: List of error messages
"""
stats: dict[str, Any] = {
@@ -165,6 +181,7 @@ def import_tags(
"updated": 0,
"skipped": 0,
"nodes_created": 0,
"deleted": 0,
"errors": [],
}
@@ -182,6 +199,15 @@ def import_tags(
now = datetime.now(timezone.utc)
with db.session_scope() as session:
# Clear all existing tags if requested
if clear_existing:
delete_count = (
session.execute(select(func.count()).select_from(NodeTag)).scalar() or 0
)
session.execute(delete(NodeTag))
stats["deleted"] = delete_count
logger.info(f"Deleted {delete_count} existing tags")
# Cache nodes by public_key to reduce queries
node_cache: dict[str, Node] = {}
@@ -198,7 +224,8 @@ def import_tags(
node = Node(
public_key=public_key,
first_seen=now,
last_seen=now,
# last_seen is intentionally left unset (None)
# It will be set when the node is actually seen via events
)
session.add(node)
session.flush()
@@ -219,24 +246,8 @@ def import_tags(
tag_value = tag_data.get("value")
tag_type = tag_data.get("type", "string")
# Find or create tag
tag_query = select(NodeTag).where(
NodeTag.node_id == node.id,
NodeTag.key == tag_key,
)
existing_tag = session.execute(tag_query).scalar_one_or_none()
if existing_tag:
# Update existing tag
existing_tag.value = tag_value
existing_tag.value_type = tag_type
stats["updated"] += 1
logger.debug(
f"Updated tag {tag_key}={tag_value} "
f"for {public_key[:12]}..."
)
else:
# Create new tag
if clear_existing:
# When clearing, always create new tags
new_tag = NodeTag(
node_id=node.id,
key=tag_key,
@@ -249,6 +260,39 @@ def import_tags(
f"Created tag {tag_key}={tag_value} "
f"for {public_key[:12]}..."
)
else:
# Find or create tag
tag_query = select(NodeTag).where(
NodeTag.node_id == node.id,
NodeTag.key == tag_key,
)
existing_tag = session.execute(
tag_query
).scalar_one_or_none()
if existing_tag:
# Update existing tag
existing_tag.value = tag_value
existing_tag.value_type = tag_type
stats["updated"] += 1
logger.debug(
f"Updated tag {tag_key}={tag_value} "
f"for {public_key[:12]}..."
)
else:
# Create new tag
new_tag = NodeTag(
node_id=node.id,
key=tag_key,
value=tag_value,
value_type=tag_type,
)
session.add(new_tag)
stats["created"] += 1
logger.debug(
f"Created tag {tag_key}={tag_value} "
f"for {public_key[:12]}..."
)
except Exception as e:
error_msg = f"Error processing tag {tag_key} for {public_key[:12]}...: {e}"
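The shorthand handling above hinges on checking `bool` before `int`/`float`, because `bool` is a subclass of `int` in Python. A standalone sketch of the conversion rules (reimplemented here for illustration; `normalize_tag_value` is not a function in the codebase):

```python
from typing import Any, Optional


def normalize_tag_value(tag_value: Any) -> dict:
    """Convert a YAML tag value into {'value': str | None, 'type': str} form."""
    if isinstance(tag_value, dict):
        # Full format: explicit value and type keys
        raw_value = tag_value.get("value")
        return {
            "value": str(raw_value) if raw_value is not None else None,
            "type": tag_value.get("type", "string"),
        }
    if isinstance(tag_value, bool):
        # Must come before the int/float check: bool is a subclass of int
        return {"value": str(tag_value).lower(), "type": "boolean"}
    if isinstance(tag_value, (int, float)):
        return {"value": str(tag_value), "type": "number"}
    if isinstance(tag_value, str):
        return {"value": tag_value, "type": "string"}
    if tag_value is None:
        return {"value": None, "type": "string"}
    raise ValueError(f"Unsupported tag value type: {type(tag_value)!r}")
```

Reordering the `bool` and numeric branches would silently turn `true` into `"1"` with type `number`, which is the bug the comment in the diff guards against.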

View File

@@ -404,7 +404,7 @@ _dispatch_callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None
def set_dispatch_callback(
callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None]]
callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None]],
) -> None:
"""Set a callback for synchronous webhook dispatch.

View File

@@ -52,6 +52,9 @@ class CommonSettings(BaseSettings):
default=None, description="MQTT password (optional)"
)
mqtt_prefix: str = Field(default="meshcore", description="MQTT topic prefix")
mqtt_tls: bool = Field(
default=False, description="Enable TLS/SSL for MQTT connection"
)
class InterfaceSettings(CommonSettings):
@@ -70,6 +73,11 @@ class InterfaceSettings(CommonSettings):
# Mock device
mock_device: bool = Field(default=False, description="Use mock device for testing")
# Device name
meshcore_device_name: Optional[str] = Field(
default=None, description="Device/node name (optional)"
)
class CollectorSettings(CommonSettings):
"""Settings for the Collector component."""
@@ -80,7 +88,7 @@ class CollectorSettings(CommonSettings):
description="SQLAlchemy database URL (default: sqlite:///{data_home}/collector/meshcore.db)",
)
# Seed home directory - contains initial data files (node_tags.json, members.json)
# Seed home directory - contains initial data files (node_tags.yaml, members.yaml)
seed_home: str = Field(
default="./seed",
description="Directory containing seed data files (default: ./seed)",
@@ -121,6 +129,29 @@ class CollectorSettings(CommonSettings):
default=2.0, description="Retry backoff multiplier"
)
# Data retention / cleanup settings
data_retention_enabled: bool = Field(
default=True, description="Enable automatic event data cleanup"
)
data_retention_days: int = Field(
default=30, description="Number of days to retain event data", ge=1
)
data_retention_interval_hours: int = Field(
default=24,
description="Hours between automatic cleanup runs (applies to both events and nodes)",
ge=1,
)
# Node cleanup settings
node_cleanup_enabled: bool = Field(
default=True, description="Enable automatic cleanup of inactive nodes"
)
node_cleanup_days: int = Field(
default=7,
description="Remove nodes not seen for this many days (last_seen)",
ge=1,
)
@property
def collector_data_dir(self) -> str:
"""Get the collector data directory path."""
@@ -147,17 +178,17 @@ class CollectorSettings(CommonSettings):
@property
def node_tags_file(self) -> str:
"""Get the path to node_tags.json in seed_home."""
"""Get the path to node_tags.yaml in seed_home."""
from pathlib import Path
return str(Path(self.effective_seed_home) / "node_tags.json")
return str(Path(self.effective_seed_home) / "node_tags.yaml")
@property
def members_file(self) -> str:
"""Get the path to members.json in seed_home."""
"""Get the path to members.yaml in seed_home."""
from pathlib import Path
return str(Path(self.effective_seed_home) / "members.json")
return str(Path(self.effective_seed_home) / "members.yaml")
@field_validator("database_url")
@classmethod
@@ -231,9 +262,6 @@ class WebSettings(CommonSettings):
network_country: Optional[str] = Field(
default=None, description="Network country (ISO 3166-1 alpha-2)"
)
network_location: Optional[str] = Field(
default=None, description="Network location (lat,lon)"
)
network_radio_config: Optional[str] = Field(
default=None, description="Radio configuration details"
)
@@ -243,6 +271,12 @@ class WebSettings(CommonSettings):
network_contact_discord: Optional[str] = Field(
default=None, description="Discord server link"
)
network_contact_github: Optional[str] = Field(
default=None, description="GitHub repository URL"
)
network_welcome_text: Optional[str] = Field(
default=None, description="Welcome text for homepage"
)
@property
def web_data_dir(self) -> str:

View File

@@ -1,10 +1,11 @@
"""Database connection and session management."""
from contextlib import contextmanager
from typing import Generator
from contextlib import asynccontextmanager, contextmanager
from typing import AsyncGenerator, Generator
from sqlalchemy import create_engine, event
from sqlalchemy.engine import Engine
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import Session, sessionmaker
from meshcore_hub.common.models.base import Base
@@ -100,6 +101,17 @@ class DatabaseManager:
self.engine = create_database_engine(database_url, echo=echo)
self.session_factory = create_session_factory(self.engine)
# Create async engine for async operations
async_url = database_url.replace("sqlite://", "sqlite+aiosqlite://")
self.async_engine = create_async_engine(async_url, echo=echo)
from sqlalchemy.ext.asyncio import async_sessionmaker
self.async_session_factory = async_sessionmaker(
self.async_engine,
class_=AsyncSession,
expire_on_commit=False,
)
def create_tables(self) -> None:
"""Create all database tables."""
create_tables(self.engine)
@@ -138,6 +150,21 @@ class DatabaseManager:
finally:
session.close()
@asynccontextmanager
async def async_session(self) -> AsyncGenerator[AsyncSession, None]:
"""Provide an async session context manager.
Yields:
AsyncSession instance
Example:
async with db.async_session() as session:
result = await session.execute(select(Node))
await session.commit()
"""
async with self.async_session_factory() as session:
yield session
def dispose(self) -> None:
"""Dispose of the database engine and connection pool."""
self.engine.dispose()
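The async engine is derived by swapping the SQLite driver name in the URL. A quick check of that substitution (the `aiosqlite` driver string comes straight from the diff; note that non-SQLite URLs pass through unchanged, so a Postgres deployment would need its own async driver in the URL):

```python
def to_async_url(database_url: str) -> str:
    # Same substitution DatabaseManager applies: sqlite -> sqlite+aiosqlite
    return database_url.replace("sqlite://", "sqlite+aiosqlite://")


assert to_async_url("sqlite:///./meshcore.db") == "sqlite+aiosqlite:///./meshcore.db"
assert to_async_url("postgresql://user@host/db") == "postgresql://user@host/db"
```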

View File

@@ -0,0 +1,142 @@
"""Event hash utilities for deduplication.
This module provides functions to compute deterministic hashes for events,
allowing deduplication when multiple receiver nodes report the same event.
"""
import hashlib
from datetime import datetime
from typing import Optional
def compute_message_hash(
text: str,
pubkey_prefix: Optional[str] = None,
channel_idx: Optional[int] = None,
sender_timestamp: Optional[datetime] = None,
txt_type: Optional[int] = None,
) -> str:
"""Compute a deterministic hash for a message.
The hash is computed from fields that uniquely identify a message's content
and sender, excluding receiver-specific data.
Args:
text: Message content
pubkey_prefix: Sender's public key prefix (12 chars)
channel_idx: Channel index for channel messages
sender_timestamp: Sender's timestamp
txt_type: Message type indicator
Returns:
32-character hex hash string
"""
# Build a canonical string from the relevant fields
parts = [
text or "",
pubkey_prefix or "",
str(channel_idx) if channel_idx is not None else "",
sender_timestamp.isoformat() if sender_timestamp else "",
str(txt_type) if txt_type is not None else "",
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
def compute_advertisement_hash(
public_key: str,
name: Optional[str] = None,
adv_type: Optional[str] = None,
flags: Optional[int] = None,
received_at: Optional[datetime] = None,
bucket_seconds: int = 30,
) -> str:
"""Compute a deterministic hash for an advertisement.
Advertisements are bucketed by time since the same node may advertise
periodically and we want to deduplicate within a time window.
Args:
public_key: Advertised node's public key
name: Advertised name
adv_type: Node type
flags: Capability flags
received_at: When received (used for time bucketing)
bucket_seconds: Time bucket size in seconds (default 30)
Returns:
32-character hex hash string
"""
# Bucket the time to allow deduplication within a window
time_bucket = ""
if received_at:
# Round down to nearest bucket
epoch = int(received_at.timestamp())
bucket_epoch = (epoch // bucket_seconds) * bucket_seconds
time_bucket = str(bucket_epoch)
parts = [
public_key,
name or "",
adv_type or "",
str(flags) if flags is not None else "",
time_bucket,
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
def compute_trace_hash(initiator_tag: int) -> str:
"""Compute a deterministic hash for a trace path.
Trace paths have a unique initiator_tag that serves as the identifier.
Args:
initiator_tag: Unique trace identifier
Returns:
32-character hex hash string
"""
return hashlib.md5(str(initiator_tag).encode("utf-8")).hexdigest()
def compute_telemetry_hash(
node_public_key: str,
parsed_data: Optional[dict] = None,
received_at: Optional[datetime] = None,
bucket_seconds: int = 30,
) -> str:
"""Compute a deterministic hash for a telemetry record.
Telemetry is bucketed by time since nodes report periodically.
Args:
node_public_key: Reporting node's public key
parsed_data: Decoded sensor readings
received_at: When received (used for time bucketing)
bucket_seconds: Time bucket size in seconds (default 30)
Returns:
32-character hex hash string
"""
# Bucket the time
time_bucket = ""
if received_at:
epoch = int(received_at.timestamp())
bucket_epoch = (epoch // bucket_seconds) * bucket_seconds
time_bucket = str(bucket_epoch)
# Serialize parsed_data deterministically
data_str = ""
if parsed_data:
# Sort keys for deterministic serialization
sorted_items = sorted(parsed_data.items())
data_str = str(sorted_items)
parts = [
node_public_key,
data_str,
time_bucket,
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
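The time bucketing means two receivers reporting the same advertisement within the same 30-second window produce identical hashes, while reports in different windows do not. A simplified reimplementation of the bucketing for illustration (name/type/flags fields omitted):

```python
import hashlib
from datetime import datetime, timezone


def bucketed_hash(public_key: str, received_at: datetime, bucket_seconds: int = 30) -> str:
    # Round the epoch down to the nearest bucket, as compute_advertisement_hash does
    epoch = int(received_at.timestamp())
    bucket_epoch = (epoch // bucket_seconds) * bucket_seconds
    canonical = "|".join([public_key, str(bucket_epoch)])
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()


key = "ab" * 32
t1 = datetime.fromtimestamp(990, tz=timezone.utc)
t2 = datetime.fromtimestamp(1015, tz=timezone.utc)  # same 30s bucket as t1 (990)
t3 = datetime.fromtimestamp(1020, tz=timezone.utc)  # next bucket (1020)

assert bucketed_hash(key, t1) == bucketed_hash(key, t2)  # deduplicated
assert bucketed_hash(key, t1) != bucketed_hash(key, t3)  # distinct event
```

One consequence worth noting: two reports 5 seconds apart can still land in different buckets if they straddle a boundary, so this trades a small amount of missed deduplication for a deterministic, stateless hash.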

View File

@@ -9,6 +9,7 @@ from meshcore_hub.common.models.trace_path import TracePath
from meshcore_hub.common.models.telemetry import Telemetry
from meshcore_hub.common.models.event_log import EventLog
from meshcore_hub.common.models.member import Member
from meshcore_hub.common.models.event_receiver import EventReceiver, add_event_receiver
__all__ = [
"Base",
@@ -21,4 +22,6 @@ __all__ = [
"Telemetry",
"EventLog",
"Member",
"EventReceiver",
"add_event_receiver",
]

View File

@@ -58,6 +58,11 @@ class Advertisement(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (Index("ix_advertisements_received_at", "received_at"),)

View File

@@ -0,0 +1,127 @@
"""EventReceiver model for tracking which nodes received each event."""
from datetime import datetime
from typing import TYPE_CHECKING, Optional
from uuid import uuid4
from sqlalchemy import DateTime, Float, ForeignKey, Index, String, UniqueConstraint
from sqlalchemy.dialects.sqlite import insert as sqlite_insert
from sqlalchemy.orm import Mapped, Session, mapped_column, relationship
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin, utc_now
if TYPE_CHECKING:
from meshcore_hub.common.models.node import Node
class EventReceiver(Base, UUIDMixin, TimestampMixin):
"""Junction model tracking which receivers observed each event.
This table enables multi-receiver tracking for deduplicated events.
When multiple receiver nodes observe the same mesh event, each receiver
gets an entry in this table linked by the event_hash.
Attributes:
id: UUID primary key
event_type: Type of event ('message', 'advertisement', 'trace', 'telemetry')
event_hash: Hash identifying the unique event (links to event tables)
receiver_node_id: FK to the node that received this event
snr: Signal-to-noise ratio at this receiver (if available)
received_at: When this specific receiver saw the event
created_at: Record creation timestamp
updated_at: Record update timestamp
"""
__tablename__ = "event_receivers"
event_type: Mapped[str] = mapped_column(
String(20),
nullable=False,
)
event_hash: Mapped[str] = mapped_column(
String(32),
nullable=False,
index=True,
)
receiver_node_id: Mapped[str] = mapped_column(
String(36),
ForeignKey("nodes.id", ondelete="CASCADE"),
nullable=False,
index=True,
)
snr: Mapped[Optional[float]] = mapped_column(
Float,
nullable=True,
)
received_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True),
default=utc_now,
nullable=False,
)
# Relationship to receiver node
receiver_node: Mapped["Node"] = relationship(
"Node",
foreign_keys=[receiver_node_id],
)
__table_args__ = (
UniqueConstraint(
"event_hash", "receiver_node_id", name="uq_event_receivers_hash_node"
),
Index("ix_event_receivers_type_hash", "event_type", "event_hash"),
)
def __repr__(self) -> str:
return (
f"<EventReceiver(type={self.event_type}, "
f"hash={self.event_hash[:8]}..., "
f"node={self.receiver_node_id[:8]}...)>"
)
def add_event_receiver(
session: Session,
event_type: str,
event_hash: str,
receiver_node_id: str,
snr: Optional[float] = None,
received_at: Optional[datetime] = None,
) -> bool:
"""Add a receiver to an event, handling duplicates gracefully.
    Uses an upsert (``ON CONFLICT DO NOTHING``) to handle the unique constraint on (event_hash, receiver_node_id).
Args:
session: SQLAlchemy session
event_type: Type of event ('message', 'advertisement', 'trace', 'telemetry')
event_hash: Hash identifying the unique event
receiver_node_id: UUID of the receiver node
snr: Signal-to-noise ratio at this receiver (optional)
received_at: When this receiver saw the event (defaults to now)
Returns:
True if a new receiver entry was added, False if it already existed.
"""
from datetime import timezone
now = received_at or datetime.now(timezone.utc)
stmt = (
sqlite_insert(EventReceiver)
.values(
id=str(uuid4()),
event_type=event_type,
event_hash=event_hash,
receiver_node_id=receiver_node_id,
snr=snr,
received_at=now,
created_at=now,
updated_at=now,
)
.on_conflict_do_nothing(index_elements=["event_hash", "receiver_node_id"])
)
result = session.execute(stmt)
# CursorResult has rowcount attribute
rowcount = getattr(result, "rowcount", 0)
return bool(rowcount and rowcount > 0)
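The `on_conflict_do_nothing` statement compiles to SQLite upsert syntax, and the cursor's `rowcount` reveals whether a row actually landed. The same semantics can be sketched with the stdlib `sqlite3` module and `INSERT OR IGNORE` (table schema trimmed to the constraint-relevant columns; not the project's actual DDL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE event_receivers (
        event_hash TEXT NOT NULL,
        receiver_node_id TEXT NOT NULL,
        snr REAL,
        UNIQUE (event_hash, receiver_node_id)
    )"""
)


def add_receiver(event_hash, node_id, snr=None):
    # rowcount is 1 when a row was inserted, 0 when the
    # (event_hash, receiver_node_id) pair already existed
    cur = conn.execute(
        "INSERT OR IGNORE INTO event_receivers "
        "(event_hash, receiver_node_id, snr) VALUES (?, ?, ?)",
        (event_hash, node_id, snr),
    )
    return cur.rowcount > 0


assert add_receiver("abc123", "node-1", 8.5) is True
assert add_receiver("abc123", "node-1") is False  # duplicate ignored
assert add_receiver("abc123", "node-2") is True   # new receiver, same event
```

This is what lets every receiver report the same event hash without the collector having to check for duplicates first.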

View File

@@ -12,21 +12,28 @@ class Member(Base, UUIDMixin, TimestampMixin):
"""Member model for network member information.
Stores information about network members/operators.
Nodes are associated with members via a 'member_id' tag on the node.
Attributes:
id: UUID primary key
member_id: Unique member identifier (e.g., 'walshie86')
name: Member's display name
callsign: Amateur radio callsign (optional)
role: Member's role in the network (optional)
description: Additional description (optional)
contact: Contact information (optional)
public_key: Associated node public key (optional, 64-char hex)
created_at: Record creation timestamp
updated_at: Record update timestamp
"""
__tablename__ = "members"
member_id: Mapped[str] = mapped_column(
String(100),
nullable=False,
unique=True,
index=True,
)
name: Mapped[str] = mapped_column(
String(255),
nullable=False,
@@ -47,11 +54,6 @@ class Member(Base, UUIDMixin, TimestampMixin):
String(255),
nullable=True,
)
public_key: Mapped[Optional[str]] = mapped_column(
String(64),
nullable=True,
index=True,
)
def __repr__(self) -> str:
return f"<Member(id={self.id}, name={self.name}, callsign={self.callsign})>"
return f"<Member(id={self.id}, member_id={self.member_id}, name={self.name}, callsign={self.callsign})>"

View File

@@ -76,6 +76,11 @@ class Message(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (
Index("ix_messages_message_type", "message_type"),

View File

@@ -52,10 +52,10 @@ class Node(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
last_seen: Mapped[datetime] = mapped_column(
last_seen: Mapped[Optional[datetime]] = mapped_column(
DateTime(timezone=True),
default=utc_now,
nullable=False,
default=None,
nullable=True,
)
# Relationships

View File

@@ -54,6 +54,11 @@ class Telemetry(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (Index("ix_telemetry_received_at", "received_at"),)

View File

@@ -3,7 +3,7 @@
from datetime import datetime
from typing import Optional
from sqlalchemy import BigInteger, DateTime, ForeignKey, Index, Integer
from sqlalchemy import BigInteger, DateTime, ForeignKey, Index, Integer, String
from sqlalchemy.dialects.sqlite import JSON
from sqlalchemy.orm import Mapped, mapped_column
@@ -67,6 +67,11 @@ class TracePath(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (
Index("ix_trace_paths_initiator_tag", "initiator_tag"),

View File

@@ -23,6 +23,7 @@ class MQTTConfig:
client_id: Optional[str] = None
keepalive: int = 60
clean_session: bool = True
tls: bool = False
class TopicBuilder:
@@ -131,6 +132,11 @@ class MQTTClient:
self._connected = False
self._message_handlers: dict[str, list[MessageHandler]] = {}
# Set up TLS if enabled
if config.tls:
self._client.tls_set()
logger.debug("TLS/SSL enabled for MQTT connection")
# Set up authentication if provided
if config.username:
self._client.username_pw_set(config.username, config.password)
@@ -344,6 +350,7 @@ def create_mqtt_client(
password: Optional[str] = None,
prefix: str = "meshcore",
client_id: Optional[str] = None,
tls: bool = False,
) -> MQTTClient:
"""Create and configure an MQTT client.
@@ -354,6 +361,7 @@ def create_mqtt_client(
password: MQTT password (optional)
prefix: Topic prefix
client_id: Client identifier (optional)
tls: Enable TLS/SSL connection (optional)
Returns:
Configured MQTTClient instance
@@ -365,5 +373,6 @@ def create_mqtt_client(
password=password,
prefix=prefix,
client_id=client_id,
tls=tls,
)
return MQTTClient(config)
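When `tls=True`, the client calls paho's `tls_set()` with no arguments, which builds a default client-side SSL context with certificate verification against the system CA store. A minimal standalone sketch of that conditional setup (the `MQTTConfig` fields here are a subset of the diff's dataclass, and `make_ssl_context` is an illustrative helper, not project code):

```python
import ssl
from dataclasses import dataclass
from typing import Optional


@dataclass
class MQTTConfig:
    # Subset of the config fields shown in the diff
    host: str = "localhost"
    port: int = 1883
    tls: bool = False


def make_ssl_context(config: MQTTConfig) -> Optional[ssl.SSLContext]:
    # Roughly what paho's tls_set() produces when called with no arguments:
    # a default client context with hostname checking and cert verification.
    if not config.tls:
        return None
    return ssl.create_default_context()


assert make_ssl_context(MQTTConfig()) is None
assert isinstance(make_ssl_context(MQTTConfig(tls=True)), ssl.SSLContext)
```

Note that enabling TLS does not change the port: brokers conventionally serve TLS on 8883 rather than 1883, so deployments will usually want to set both options together.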

View File

@@ -20,6 +20,7 @@ from meshcore_hub.common.schemas.nodes import (
NodeTagRead,
)
from meshcore_hub.common.schemas.messages import (
ReceiverInfo,
MessageRead,
MessageList,
MessageFilters,
@@ -35,6 +36,9 @@ from meshcore_hub.common.schemas.members import (
MemberRead,
MemberList,
)
from meshcore_hub.common.schemas.network import (
RadioConfig,
)
__all__ = [
# Events
@@ -54,7 +58,8 @@ __all__ = [
"NodeTagCreate",
"NodeTagUpdate",
"NodeTagRead",
# Messages
# Messages & Events
"ReceiverInfo",
"MessageRead",
"MessageList",
"MessageFilters",
@@ -67,4 +72,6 @@ __all__ = [
"MemberUpdate",
"MemberRead",
"MemberList",
# Network
"RadioConfig",
]

View File

@@ -157,7 +157,16 @@ class TelemetryResponseEvent(BaseModel):
class ContactInfo(BaseModel):
"""Schema for a single contact in CONTACTS event."""
"""Schema for a single contact in CONTACTS event.
Device payload fields:
- public_key: Node's 64-char hex public key
- adv_name: Node's advertised name (device field)
- type: Numeric node type (0=none, 1=chat, 2=repeater, 3=room)
- flags: Capability flags
- last_advert: Unix timestamp of last advertisement
- adv_lat, adv_lon: GPS coordinates (if available)
"""
public_key: str = Field(
...,
@@ -165,14 +174,40 @@ class ContactInfo(BaseModel):
max_length=64,
description="Node's full public key",
)
adv_name: Optional[str] = Field(
default=None,
max_length=255,
description="Node's advertised name (from device)",
)
type: Optional[int] = Field(
default=None,
description="Numeric node type: 0=none, 1=chat, 2=repeater, 3=room",
)
flags: Optional[int] = Field(
default=None,
description="Capability/status flags bitmask",
)
last_advert: Optional[int] = Field(
default=None,
description="Unix timestamp of last advertisement",
)
adv_lat: Optional[float] = Field(
default=None,
description="GPS latitude (if available)",
)
adv_lon: Optional[float] = Field(
default=None,
description="GPS longitude (if available)",
)
# Legacy field names for backwards compatibility
name: Optional[str] = Field(
default=None,
max_length=255,
description="Node name/alias",
description="Node name/alias (legacy, prefer adv_name)",
)
node_type: Optional[str] = Field(
default=None,
description="Node type: chat, repeater, room, none",
description="Node type string (legacy, prefer type)",
)
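Because both the device field names (`adv_name`, numeric `type`) and the legacy names (`name`, string `node_type`) can appear in a payload, consumers need a preference order. A hedged sketch of normalizing a contact dict, using the type mapping stated in the docstring (0=none, 1=chat, 2=repeater, 3=room); the helper name is illustrative, not part of the schema:

```python
from typing import Any, Optional, Tuple

# Numeric type codes from the device payload, per the ContactInfo docstring
TYPE_NAMES = {0: "none", 1: "chat", 2: "repeater", 3: "room"}


def contact_display_fields(contact: dict) -> Tuple[Optional[str], Optional[str]]:
    """Return (name, node_type), preferring device fields over legacy ones."""
    name = contact.get("adv_name") or contact.get("name")
    if contact.get("type") is not None:
        node_type = TYPE_NAMES.get(contact["type"])
    else:
        node_type = contact.get("node_type")
    return name, node_type


assert contact_display_fields({"adv_name": "RPT-1", "type": 2}) == ("RPT-1", "repeater")
assert contact_display_fields({"name": "Old Node", "node_type": "chat"}) == ("Old Node", "chat")
```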

View File

@@ -7,8 +7,18 @@ from pydantic import BaseModel, Field
class MemberCreate(BaseModel):
"""Schema for creating a member."""
"""Schema for creating a member.
Note: Nodes are associated with members via a 'member_id' tag on the node,
not through this schema.
"""
member_id: str = Field(
...,
min_length=1,
max_length=100,
description="Unique member identifier (e.g., 'walshie86')",
)
name: str = Field(
...,
min_length=1,
@@ -34,18 +44,21 @@ class MemberCreate(BaseModel):
max_length=255,
description="Contact information",
)
public_key: Optional[str] = Field(
default=None,
min_length=64,
max_length=64,
pattern=r"^[0-9a-fA-F]{64}$",
description="Associated node public key (64-char hex)",
)
class MemberUpdate(BaseModel):
"""Schema for updating a member."""
"""Schema for updating a member.
Note: Nodes are associated with members via a 'member_id' tag on the node,
not through this schema.
"""
member_id: Optional[str] = Field(
default=None,
min_length=1,
max_length=100,
description="Unique member identifier (e.g., 'walshie86')",
)
name: Optional[str] = Field(
default=None,
min_length=1,
@@ -71,27 +84,22 @@ class MemberUpdate(BaseModel):
max_length=255,
description="Contact information",
)
public_key: Optional[str] = Field(
default=None,
min_length=64,
max_length=64,
pattern=r"^[0-9a-fA-F]{64}$",
description="Associated node public key (64-char hex)",
)
class MemberRead(BaseModel):
"""Schema for reading a member."""
"""Schema for reading a member.
Note: Nodes are associated with members via a 'member_id' tag on the node.
To find nodes for a member, query nodes with a 'member_id' tag matching this member.
"""
id: str = Field(..., description="Member UUID")
member_id: str = Field(..., description="Unique member identifier")
name: str = Field(..., description="Member's display name")
callsign: Optional[str] = Field(default=None, description="Amateur radio callsign")
role: Optional[str] = Field(default=None, description="Member's role")
description: Optional[str] = Field(default=None, description="Description")
contact: Optional[str] = Field(default=None, description="Contact information")
public_key: Optional[str] = Field(
default=None, description="Associated node public key"
)
created_at: datetime = Field(..., description="Creation timestamp")
updated_at: datetime = Field(..., description="Last update timestamp")

View File

@@ -6,19 +6,41 @@ from typing import Literal, Optional
from pydantic import BaseModel, Field
class ReceiverInfo(BaseModel):
"""Information about a receiver that observed an event."""
node_id: str = Field(..., description="Receiver node UUID")
public_key: str = Field(..., description="Receiver node public key")
name: Optional[str] = Field(default=None, description="Receiver node name")
tag_name: Optional[str] = Field(default=None, description="Receiver name from tags")
snr: Optional[float] = Field(
default=None, description="Signal-to-noise ratio at this receiver"
)
received_at: datetime = Field(..., description="When this receiver saw the event")
class Config:
from_attributes = True
class MessageRead(BaseModel):
"""Schema for reading a message."""
id: str = Field(..., description="Message UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
receiver_name: Optional[str] = Field(default=None, description="Receiver node name")
receiver_tag_name: Optional[str] = Field(
default=None, description="Receiver name from tags"
)
message_type: str = Field(..., description="Message type (contact, channel)")
pubkey_prefix: Optional[str] = Field(
default=None, description="Sender's public key prefix (12 chars)"
)
sender_friendly_name: Optional[str] = Field(
default=None, description="Sender's friendly name from node tags"
sender_name: Optional[str] = Field(
default=None, description="Sender's advertised node name"
)
sender_tag_name: Optional[str] = Field(
default=None, description="Sender's name from node tags"
)
channel_idx: Optional[int] = Field(default=None, description="Channel index")
text: str = Field(..., description="Message content")
@@ -31,6 +53,9 @@ class MessageRead(BaseModel):
)
received_at: datetime = Field(..., description="When received by interface")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list, description="All receivers that observed this message"
)
class Config:
from_attributes = True
@@ -79,17 +104,29 @@ class MessageFilters(BaseModel):
class AdvertisementRead(BaseModel):
"""Schema for reading an advertisement."""
id: str = Field(..., description="Advertisement UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
receiver_name: Optional[str] = Field(default=None, description="Receiver node name")
receiver_tag_name: Optional[str] = Field(
default=None, description="Receiver name from tags"
)
node_id: Optional[str] = Field(default=None, description="Advertised node UUID")
public_key: str = Field(..., description="Advertised public key")
name: Optional[str] = Field(default=None, description="Advertised name")
node_name: Optional[str] = Field(
default=None, description="Node name from nodes table"
)
node_tag_name: Optional[str] = Field(
default=None, description="Node name from tags"
)
adv_type: Optional[str] = Field(default=None, description="Node type")
flags: Optional[int] = Field(default=None, description="Capability flags")
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this advertisement",
)
class Config:
from_attributes = True
@@ -107,9 +144,8 @@ class AdvertisementList(BaseModel):
class TracePathRead(BaseModel):
"""Schema for reading a trace path."""
id: str = Field(..., description="Trace path UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
initiator_tag: int = Field(..., description="Trace identifier")
path_len: Optional[int] = Field(default=None, description="Path length")
@@ -124,6 +160,10 @@ class TracePathRead(BaseModel):
hop_count: Optional[int] = Field(default=None, description="Total hops")
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this trace",
)
class Config:
from_attributes = True
@@ -141,17 +181,19 @@ class TracePathList(BaseModel):
class TelemetryRead(BaseModel):
"""Schema for reading a telemetry record."""
id: str = Field(..., description="Telemetry UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
node_id: Optional[str] = Field(default=None, description="Reporting node UUID")
node_public_key: str = Field(..., description="Reporting node public key")
parsed_data: Optional[dict] = Field(
default=None, description="Decoded sensor readings"
)
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this telemetry",
)
class Config:
from_attributes = True
@@ -171,11 +213,25 @@ class RecentAdvertisement(BaseModel):
public_key: str = Field(..., description="Node public key")
name: Optional[str] = Field(default=None, description="Node name")
friendly_name: Optional[str] = Field(default=None, description="Friendly name tag")
tag_name: Optional[str] = Field(default=None, description="Name tag")
adv_type: Optional[str] = Field(default=None, description="Node type")
received_at: datetime = Field(..., description="When received")
class ChannelMessage(BaseModel):
"""Schema for a channel message summary."""
text: str = Field(..., description="Message text")
sender_name: Optional[str] = Field(default=None, description="Sender name")
sender_tag_name: Optional[str] = Field(
default=None, description="Sender name from tags"
)
pubkey_prefix: Optional[str] = Field(
default=None, description="Sender public key prefix"
)
received_at: datetime = Field(..., description="When received")
class DashboardStats(BaseModel):
"""Schema for dashboard statistics."""
@@ -183,10 +239,14 @@ class DashboardStats(BaseModel):
active_nodes: int = Field(..., description="Nodes active in last 24h")
total_messages: int = Field(..., description="Total number of messages")
messages_today: int = Field(..., description="Messages received today")
messages_7d: int = Field(default=0, description="Messages received in last 7 days")
total_advertisements: int = Field(..., description="Total advertisements")
advertisements_24h: int = Field(
default=0, description="Advertisements received in last 24h"
)
advertisements_7d: int = Field(
default=0, description="Advertisements received in last 7 days"
)
recent_advertisements: list[RecentAdvertisement] = Field(
default_factory=list, description="Last 10 advertisements"
)
@@ -194,3 +254,39 @@ class DashboardStats(BaseModel):
default_factory=dict,
description="Message count per channel",
)
channel_messages: dict[int, list[ChannelMessage]] = Field(
default_factory=dict,
description="Recent messages per channel (up to 5 each)",
)
class DailyActivityPoint(BaseModel):
"""Schema for a single day's activity count."""
date: str = Field(..., description="Date in YYYY-MM-DD format")
count: int = Field(..., description="Count for this day")
class DailyActivity(BaseModel):
"""Schema for daily advertisement activity over a period."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(
..., description="Daily advertisement counts"
)
class MessageActivity(BaseModel):
"""Schema for daily message activity over a period."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(..., description="Daily message counts")
class NodeCountHistory(BaseModel):
"""Schema for node count over time."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(
..., description="Cumulative node count per day"
)
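`NodeCountHistory.data` is described as a cumulative node count per day. A sketch of producing that series from per-day counts of newly seen nodes (the input data here is hypothetical):

```python
from itertools import accumulate

# Hypothetical per-day counts of newly first-seen nodes.
dates = ["2025-12-06", "2025-12-07", "2025-12-08"]
new_nodes = [3, 2, 4]

# A running sum turns per-day counts into the cumulative series that
# NodeCountHistory.data carries ("Cumulative node count per day").
cumulative = list(accumulate(new_nodes))
data = [{"date": d, "count": c} for d, c in zip(dates, cumulative)]
# e.g. the last point is {"date": "2025-12-08", "count": 9}
```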

View File

@@ -0,0 +1,65 @@
"""Pydantic schemas for network configuration."""
from typing import Optional
from pydantic import BaseModel
class RadioConfig(BaseModel):
"""Parsed radio configuration from comma-delimited string.
Format: "<profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>"
Example: "EU/UK Narrow,869.618MHz,62.5kHz,8,8,22dBm"
"""
profile: Optional[str] = None
frequency: Optional[str] = None
bandwidth: Optional[str] = None
spreading_factor: Optional[int] = None
coding_rate: Optional[int] = None
tx_power: Optional[str] = None
@classmethod
def from_config_string(cls, config_str: Optional[str]) -> Optional["RadioConfig"]:
"""Parse a comma-delimited radio config string.
Args:
config_str: Comma-delimited string in format:
"<profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>"
Returns:
RadioConfig instance if parsing succeeds, None if input is None or empty
"""
if not config_str:
return None
parts = [p.strip() for p in config_str.split(",")]
# Pad partial configs with empty strings (coerced to None below)
while len(parts) < 6:
parts.append("")
# Parse spreading factor and coding rate as integers
spreading_factor = None
coding_rate = None
try:
if parts[3]:
spreading_factor = int(parts[3])
except ValueError:
pass
try:
if parts[4]:
coding_rate = int(parts[4])
except ValueError:
pass
return cls(
profile=parts[0] or None,
frequency=parts[1] or None,
bandwidth=parts[2] or None,
spreading_factor=spreading_factor,
coding_rate=coding_rate,
tx_power=parts[5] or None,
)
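The parsing above can be exercised stand-alone. This sketch reimplements the same logic as a plain function (so it runs without Pydantic) and shows both a full and a partial config string:

```python
from typing import Optional


def parse_radio_config(config_str: Optional[str]) -> Optional[dict]:
    """Stand-alone sketch of RadioConfig.from_config_string's parsing."""
    if not config_str:
        return None
    parts = [p.strip() for p in config_str.split(",")]
    # Pad partial configs with empty strings, coerced to None below.
    while len(parts) < 6:
        parts.append("")

    def to_int(s: str) -> Optional[int]:
        try:
            return int(s) if s else None
        except ValueError:
            return None

    return {
        "profile": parts[0] or None,
        "frequency": parts[1] or None,
        "bandwidth": parts[2] or None,
        "spreading_factor": to_int(parts[3]),
        "coding_rate": to_int(parts[4]),
        "tx_power": parts[5] or None,
    }


full = parse_radio_config("EU/UK Narrow,869.618MHz,62.5kHz,8,8,22dBm")
partial = parse_radio_config("EU/UK Narrow,869.618MHz")
# full["spreading_factor"] == 8; partial["coding_rate"] is None
```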

View File

@@ -41,8 +41,6 @@ class NodeTagUpdate(BaseModel):
class NodeTagRead(BaseModel):
"""Schema for reading a node tag."""
id: str = Field(..., description="Tag UUID")
node_id: str = Field(..., description="Parent node UUID")
key: str = Field(..., description="Tag name/key")
value: Optional[str] = Field(default=None, description="Tag value")
value_type: str = Field(..., description="Value type hint")
@@ -56,13 +54,14 @@ class NodeTagRead(BaseModel):
class NodeRead(BaseModel):
"""Schema for reading a node."""
id: str = Field(..., description="Node UUID")
public_key: str = Field(..., description="Node's 64-character hex public key")
name: Optional[str] = Field(default=None, description="Node display name")
adv_type: Optional[str] = Field(default=None, description="Advertisement type")
flags: Optional[int] = Field(default=None, description="Capability flags")
first_seen: datetime = Field(..., description="First advertisement timestamp")
last_seen: datetime = Field(..., description="Last activity timestamp")
last_seen: Optional[datetime] = Field(
default=None, description="Last activity timestamp"
)
created_at: datetime = Field(..., description="Record creation timestamp")
updated_at: datetime = Field(..., description="Record update timestamp")
tags: list[NodeTagRead] = Field(default_factory=list, description="Node tags")
@@ -85,7 +84,7 @@ class NodeFilters(BaseModel):
search: Optional[str] = Field(
default=None,
description="Search in name or public key",
description="Search in name tag, node name, or public key",
)
adv_type: Optional[str] = Field(
default=None,

View File

@@ -51,6 +51,13 @@ def interface() -> None:
envvar="NODE_ADDRESS",
help="Override for device public key/address (hex string)",
)
@click.option(
"--device-name",
type=str,
default=None,
envvar="MESHCORE_DEVICE_NAME",
help="Device/node name (optional)",
)
@click.option(
"--mqtt-host",
type=str,
@@ -86,6 +93,13 @@ def interface() -> None:
envvar="MQTT_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
@click.option(
"--log-level",
type=click.Choice(["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]),
@@ -99,11 +113,13 @@ def run(
baud: int,
mock: bool,
node_address: str | None,
device_name: str | None,
mqtt_host: str,
mqtt_port: int,
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
log_level: str,
) -> None:
"""Run the interface component.
@@ -139,11 +155,13 @@ def run(
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
)
elif mode_upper == "SENDER":
from meshcore_hub.interface.sender import run_sender
@@ -153,11 +171,13 @@ def run(
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
)
else:
click.echo(f"Unknown mode: {mode}", err=True)
@@ -193,6 +213,13 @@ def run(
envvar="NODE_ADDRESS",
help="Override for device public key/address (hex string)",
)
@click.option(
"--device-name",
type=str,
default=None,
envvar="MESHCORE_DEVICE_NAME",
help="Device/node name (optional)",
)
@click.option(
"--mqtt-host",
type=str,
@@ -228,16 +255,25 @@ def run(
envvar="MQTT_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
def receiver(
port: str,
baud: int,
mock: bool,
node_address: str | None,
device_name: str | None,
mqtt_host: str,
mqtt_port: int,
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
) -> None:
"""Run interface in RECEIVER mode.
@@ -262,6 +298,7 @@ def receiver(
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
)
@@ -294,6 +331,13 @@ def receiver(
envvar="NODE_ADDRESS",
help="Override for device public key/address (hex string)",
)
@click.option(
"--device-name",
type=str,
default=None,
envvar="MESHCORE_DEVICE_NAME",
help="Device/node name (optional)",
)
@click.option(
"--mqtt-host",
type=str,
@@ -329,16 +373,25 @@ def receiver(
envvar="MQTT_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
def sender(
port: str,
baud: int,
mock: bool,
node_address: str | None,
device_name: str | None,
mqtt_host: str,
mqtt_port: int,
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
) -> None:
"""Run interface in SENDER mode.
@@ -363,4 +416,5 @@ def sender(
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
)

View File

@@ -164,6 +164,18 @@ class BaseMeshCoreDevice(ABC):
"""
pass
@abstractmethod
def set_name(self, name: str) -> bool:
"""Set the device's node name.
Args:
name: Node name to set
Returns:
True if name was set successfully
"""
pass
@abstractmethod
def start_message_fetching(self) -> bool:
"""Start automatic message fetching.
@@ -175,6 +187,30 @@ class BaseMeshCoreDevice(ABC):
"""
pass
@abstractmethod
def get_contacts(self) -> bool:
"""Fetch contacts from device contact database.
Triggers a CONTACTS event with all stored contacts from the device.
Note: This should only be called before the event loop is running.
Returns:
True if request was sent successfully
"""
pass
@abstractmethod
def schedule_get_contacts(self) -> bool:
"""Schedule a get_contacts request on the event loop.
This is safe to call from event handlers while the event loop is running.
Returns:
True if request was scheduled successfully
"""
pass
@abstractmethod
def run(self) -> None:
"""Run the device event loop (blocking)."""
@@ -322,6 +358,10 @@ class MeshCoreDevice(BaseMeshCoreDevice):
self._connected = True
logger.info(f"Connected to MeshCore device, public_key: {self._public_key}")
# Set up event subscriptions so events can be received immediately
self._setup_event_subscriptions()
return True
except Exception as e:
@@ -503,6 +543,24 @@ class MeshCoreDevice(BaseMeshCoreDevice):
logger.error(f"Failed to set device time: {e}")
return False
def set_name(self, name: str) -> bool:
"""Set the device's node name."""
if not self._connected or not self._mc:
logger.error("Cannot set name: not connected")
return False
try:
async def _set_name() -> None:
await self._mc.commands.set_name(name)
self._loop.run_until_complete(_set_name())
logger.info(f"Set device name to '{name}'")
return True
except Exception as e:
logger.error(f"Failed to set device name: {e}")
return False
def start_message_fetching(self) -> bool:
"""Start automatic message fetching."""
if not self._connected or not self._mc:
@@ -521,14 +579,59 @@ class MeshCoreDevice(BaseMeshCoreDevice):
logger.error(f"Failed to start message fetching: {e}")
return False
def get_contacts(self) -> bool:
"""Fetch contacts from device contact database.
Note: This method should only be called before the event loop is running
(e.g., during initialization). For calling during event processing,
use schedule_get_contacts() instead.
"""
if not self._connected or not self._mc:
logger.error("Cannot get contacts: not connected")
return False
try:
async def _get_contacts() -> None:
await self._mc.commands.get_contacts()
self._loop.run_until_complete(_get_contacts())
logger.info("Requested contacts from device")
return True
except Exception as e:
logger.error(f"Failed to get contacts: {e}")
return False
def schedule_get_contacts(self) -> bool:
"""Schedule a get_contacts request on the event loop.
This is safe to call from event handlers while the event loop is running.
The request is scheduled as a task on the event loop.
Returns:
True if request was scheduled, False if device not connected
"""
if not self._connected or not self._mc:
logger.error("Cannot get contacts: not connected")
return False
try:
async def _get_contacts() -> None:
await self._mc.commands.get_contacts()
asyncio.run_coroutine_threadsafe(_get_contacts(), self._loop)
logger.info("Scheduled contact sync request")
return True
except Exception as e:
logger.error(f"Failed to schedule get contacts: {e}")
return False
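The split between `get_contacts()` (uses `run_until_complete`, so it must run before the loop starts) and `schedule_get_contacts()` (uses `asyncio.run_coroutine_threadsafe`) reflects a standard asyncio rule: a coroutine can only be handed to a running loop from another thread via the thread-safe scheduler. A minimal illustration of that pattern, independent of the project's device class:

```python
import asyncio
import threading

results = []


async def fetch_contacts() -> None:
    # Stand-in for `await self._mc.commands.get_contacts()`.
    results.append("contacts-requested")


# Run an event loop in a background thread, as a device event loop would.
loop = asyncio.new_event_loop()
t = threading.Thread(target=loop.run_forever, daemon=True)
t.start()

# From a non-loop thread (e.g. an event handler), schedule the coroutine
# thread-safely and wait on the returned concurrent future.
future = asyncio.run_coroutine_threadsafe(fetch_contacts(), loop)
future.result(timeout=2)

loop.call_soon_threadsafe(loop.stop)
t.join(timeout=2)
```

Calling `run_until_complete` on a loop that is already running would raise `RuntimeError`, which is why the event-handler path must go through `schedule_get_contacts()`.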
def run(self) -> None:
"""Run the device event loop."""
self._running = True
logger.info("Starting device event loop")
# Set up event subscriptions
self._setup_event_subscriptions()
# Run the async event loop
async def _run_loop() -> None:
while self._running and self._connected:

View File

@@ -271,6 +271,17 @@ class MockMeshCoreDevice(BaseMeshCoreDevice):
logger.info(f"Mock: Set device time to {timestamp}")
return True
def set_name(self, name: str) -> bool:
"""Set the mock device's node name."""
if not self._connected:
logger.error("Cannot set name: not connected")
return False
logger.info(f"Mock: Set device name to '{name}'")
# Update the mock config name
self.mock_config.name = name
return True
def start_message_fetching(self) -> bool:
"""Start automatic message fetching (mock)."""
if not self._connected:
@@ -280,6 +291,44 @@ class MockMeshCoreDevice(BaseMeshCoreDevice):
logger.info("Mock: Started automatic message fetching")
return True
def get_contacts(self) -> bool:
"""Fetch contacts from mock device contact database.
Note: This should only be called before the event loop is running.
"""
if not self._connected:
logger.error("Cannot get contacts: not connected")
return False
logger.info("Mock: Requesting contacts from device")
# Generate CONTACTS event with all configured mock nodes
def send_contacts() -> None:
time.sleep(0.2)
contacts = [
{
"public_key": node.public_key,
"name": node.name,
"node_type": node.adv_type,
}
for node in self.mock_config.nodes
]
self._dispatch_event(
EventType.CONTACTS,
{"contacts": contacts},
)
threading.Thread(target=send_contacts, daemon=True).start()
return True
def schedule_get_contacts(self) -> bool:
"""Schedule a get_contacts request.
For the mock device, this is the same as get_contacts() since we
don't have a real async event loop. The contacts are sent via a thread.
"""
return self.get_contacts()
def run(self) -> None:
"""Run the mock device event loop."""
self._running = True

View File

@@ -33,15 +33,18 @@ class Receiver:
self,
device: BaseMeshCoreDevice,
mqtt_client: MQTTClient,
device_name: Optional[str] = None,
):
"""Initialize receiver.
Args:
device: MeshCore device instance
mqtt_client: MQTT client instance
device_name: Optional device/node name to set on startup
"""
self.device = device
self.mqtt = mqtt_client
self.device_name = device_name
self._running = False
self._shutdown_event = threading.Event()
self._device_connected = False
@@ -71,10 +74,14 @@ class Receiver:
"device_public_key": self.device.public_key,
}
def _initialize_device(self) -> None:
def _initialize_device(self, device_name: Optional[str] = None) -> None:
"""Initialize device after connection.
Sets the hardware clock, sends a local advertisement, and starts message fetching.
Sets the hardware clock, optionally sets device name, sends a local advertisement,
starts message fetching, and syncs the contact database.
Args:
device_name: Optional device/node name to set
"""
# Set device time to current Unix timestamp
current_time = int(time.time())
@@ -83,11 +90,18 @@ class Receiver:
else:
logger.warning("Failed to synchronize device clock")
# Send a local (non-flood) advertisement to announce presence
if self.device.send_advertisement(flood=False):
logger.info("Sent local advertisement")
# Set device name if provided
if device_name:
if self.device.set_name(device_name):
logger.info(f"Set device name to '{device_name}'")
else:
logger.warning(f"Failed to set device name to '{device_name}'")
# Send a flood advertisement to broadcast device name
if self.device.send_advertisement(flood=True):
logger.info("Sent flood advertisement")
else:
logger.warning("Failed to send local advertisement")
logger.warning("Failed to send flood advertisement")
# Start automatic message fetching
if self.device.start_message_fetching():
@@ -95,6 +109,12 @@ class Receiver:
else:
logger.warning("Failed to start automatic message fetching")
# Fetch contact database to sync known nodes
if self.device.get_contacts():
logger.info("Requested contact database sync")
else:
logger.warning("Failed to request contact database")
def _handle_event(self, event_type: EventType, payload: dict[str, Any]) -> None:
"""Handle device event and publish to MQTT.
@@ -110,6 +130,11 @@ class Receiver:
# Convert event type to MQTT topic name
event_name = event_type.value
# Special handling for CONTACTS: split into individual messages
if event_type == EventType.CONTACTS:
self._publish_contacts(payload)
return
# Publish to MQTT
self.mqtt.publish_event(
self.device.public_key,
@@ -119,9 +144,67 @@ class Receiver:
logger.debug(f"Published {event_name} event to MQTT")
# Trigger contact sync on advertisements
if event_type == EventType.ADVERTISEMENT:
self._sync_contacts()
except Exception as e:
logger.error(f"Failed to publish event to MQTT: {e}")
def _sync_contacts(self) -> None:
"""Request contact sync from device.
Called when advertisements are received to ensure the contact database
stays current with all nodes on the mesh.
"""
logger.info("Advertisement received, triggering contact sync")
success = self.device.schedule_get_contacts()
if not success:
logger.warning("Contact sync request failed")
def _publish_contacts(self, payload: dict[str, Any]) -> None:
"""Publish each contact as a separate MQTT message.
The device returns contacts as a dict keyed by public_key.
We split this into individual 'contact' events for cleaner processing.
Args:
payload: Dict of contacts keyed by public_key
"""
if not self.device.public_key:
logger.warning("Cannot publish contacts: device public key not available")
return
# Handle both formats:
# - Dict keyed by public_key (real device)
# - Dict with "contacts" array (mock device)
if "contacts" in payload:
contacts = payload["contacts"]
else:
contacts = list(payload.values())
if not contacts:
logger.debug("Empty contacts list received")
return
device_key = self.device.public_key # Capture for type narrowing
count = 0
for contact in contacts:
if not isinstance(contact, dict):
continue
try:
self.mqtt.publish_event(
device_key,
"contact", # Use singular 'contact' for individual events
contact,
)
count += 1
except Exception as e:
logger.error(f"Failed to publish contact event: {e}")
logger.info(f"Published {count} contact events to MQTT")
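The two payload shapes that `_publish_contacts` accepts can be normalized in isolation. A sketch of that normalization, with hypothetical contact data:

```python
from typing import Any


def extract_contacts(payload: dict[str, Any]) -> list[dict]:
    """Sketch of _publish_contacts' normalization: accept either a
    {"contacts": [...]} wrapper (mock device) or a dict keyed by
    public_key (real device), and return a flat list of contact dicts."""
    if "contacts" in payload:
        contacts = payload["contacts"]
    else:
        contacts = list(payload.values())
    # Skip non-dict entries, mirroring the isinstance guard in the loop.
    return [c for c in contacts if isinstance(c, dict)]


mock_style = {"contacts": [{"public_key": "ab12", "name": "alpha"}]}
real_style = {
    "ab12": {"public_key": "ab12", "name": "alpha"},
    "cd34": {"public_key": "cd34", "name": "bravo"},
}
# extract_contacts(mock_style) yields 1 contact; real_style yields 2
```

Each resulting contact is then published as its own `contact` event rather than one bulk payload, so downstream consumers can upsert nodes one at a time.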
def start(self) -> None:
"""Start the receiver."""
logger.info("Starting RECEIVER mode")
@@ -142,20 +225,22 @@ class Receiver:
logger.error(f"Failed to connect to MQTT broker: {e}")
raise
# Connect to device
if not self.device.connect():
self._device_connected = False
logger.error("Failed to connect to MeshCore device")
self.mqtt.stop()
self.mqtt.disconnect()
self._mqtt_connected = False
raise RuntimeError("Failed to connect to MeshCore device")
# Device should already be connected (from create_receiver),
# but handle the case where start() is called directly
if not self.device.is_connected:
if not self.device.connect():
self._device_connected = False
logger.error("Failed to connect to MeshCore device")
self.mqtt.stop()
self.mqtt.disconnect()
self._mqtt_connected = False
raise RuntimeError("Failed to connect to MeshCore device")
logger.info(f"Connected to MeshCore device: {self.device.public_key}")
self._device_connected = True
logger.info(f"Connected to MeshCore device: {self.device.public_key}")
# Initialize device: set time and send local advertisement
self._initialize_device()
# Initialize device: set time, optionally set name, and send local advertisement
self._initialize_device(device_name=self.device_name)
self._running = True
@@ -214,11 +299,13 @@ def create_receiver(
baud: int = 115200,
mock: bool = False,
node_address: Optional[str] = None,
device_name: Optional[str] = None,
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
) -> Receiver:
"""Create a configured receiver instance.
@@ -227,30 +314,38 @@ def create_receiver(
baud: Baud rate
mock: Use mock device
node_address: Optional override for device public key/address
device_name: Optional device/node name to set on startup
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
Returns:
Configured Receiver instance
"""
# Create device
# Create and connect device first to get public key
device = create_device(port=port, baud=baud, mock=mock, node_address=node_address)
# Create MQTT client
if not device.connect():
raise RuntimeError("Failed to connect to MeshCore device")
logger.info(f"Connected to MeshCore device: {device.public_key}")
# Create MQTT client with device's public key for unique client ID
mqtt_config = MQTTConfig(
host=mqtt_host,
port=mqtt_port,
username=mqtt_username,
password=mqtt_password,
prefix=mqtt_prefix,
client_id=f"meshcore-receiver-{device.public_key[:8] if device.public_key else 'unknown'}",
client_id=f"meshcore-receiver-{device.public_key[:12] if device.public_key else 'unknown'}",
tls=mqtt_tls,
)
mqtt_client = MQTTClient(mqtt_config)
return Receiver(device, mqtt_client)
return Receiver(device, mqtt_client, device_name=device_name)
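The client-id change above (truncating the public key at 12 characters instead of 8) can be captured as a small helper. A sketch of the scheme, with the helper name being illustrative:

```python
from typing import Optional


def client_id_for(role: str, public_key: Optional[str]) -> str:
    """Build a per-device MQTT client id: the role plus the first 12 hex
    characters of the device public key (the diff widened this from 8 to 12),
    falling back to 'unknown' when no key is available."""
    suffix = public_key[:12] if public_key else "unknown"
    return f"meshcore-{role}-{suffix}"


# e.g. client_id_for("receiver", "a1b2..." * 8) -> "meshcore-receiver-" + first 12 chars
```

Unique client ids matter because most MQTT brokers disconnect the older session when two clients connect with the same id; deriving the id from the device key lets multiple receivers share one broker safely.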
def run_receiver(
@@ -258,11 +353,13 @@ def run_receiver(
baud: int = 115200,
mock: bool = False,
node_address: Optional[str] = None,
device_name: Optional[str] = None,
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
) -> None:
"""Run the receiver (blocking).
@@ -273,22 +370,26 @@ def run_receiver(
baud: Baud rate
mock: Use mock device
node_address: Optional override for device public key/address
device_name: Optional device/node name to set on startup
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
"""
receiver = create_receiver(
port=port,
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=mqtt_prefix,
mqtt_tls=mqtt_tls,
)
# Set up signal handlers

View File

@@ -200,14 +200,16 @@ class Sender:
"""Start the sender."""
logger.info("Starting SENDER mode")
# Connect to device first
if not self.device.connect():
self._device_connected = False
logger.error("Failed to connect to MeshCore device")
raise RuntimeError("Failed to connect to MeshCore device")
# Device should already be connected (from create_sender),
# but handle the case where start() is called directly
if not self.device.is_connected:
if not self.device.connect():
self._device_connected = False
logger.error("Failed to connect to MeshCore device")
raise RuntimeError("Failed to connect to MeshCore device")
logger.info(f"Connected to MeshCore device: {self.device.public_key}")
self._device_connected = True
logger.info(f"Connected to MeshCore device: {self.device.public_key}")
# Connect to MQTT broker
try:
@@ -285,11 +287,13 @@ def create_sender(
baud: int = 115200,
mock: bool = False,
node_address: Optional[str] = None,
device_name: Optional[str] = None,
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
) -> Sender:
"""Create a configured sender instance.
@@ -298,26 +302,34 @@ def create_sender(
baud: Baud rate
mock: Use mock device
node_address: Optional override for device public key/address
device_name: Optional device/node name (not used in SENDER mode)
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
Returns:
Configured Sender instance
"""
# Create device
# Create and connect device first to get public key
device = create_device(port=port, baud=baud, mock=mock, node_address=node_address)
# Create MQTT client
if not device.connect():
raise RuntimeError("Failed to connect to MeshCore device")
logger.info(f"Connected to MeshCore device: {device.public_key}")
# Create MQTT client with device's public key for unique client ID
mqtt_config = MQTTConfig(
host=mqtt_host,
port=mqtt_port,
username=mqtt_username,
password=mqtt_password,
prefix=mqtt_prefix,
client_id=f"meshcore-sender-{device.public_key[:8] if device.public_key else 'unknown'}",
client_id=f"meshcore-sender-{device.public_key[:12] if device.public_key else 'unknown'}",
tls=mqtt_tls,
)
mqtt_client = MQTTClient(mqtt_config)
@@ -329,11 +341,13 @@ def run_sender(
baud: int = 115200,
mock: bool = False,
node_address: Optional[str] = None,
device_name: Optional[str] = None,
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
) -> None:
"""Run the sender (blocking).
@@ -344,22 +358,26 @@ def run_sender(
baud: Baud rate
mock: Use mock device
node_address: Optional override for device public key/address
device_name: Optional device/node name (not used in SENDER mode)
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
"""
sender = create_sender(
port=port,
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=mqtt_prefix,
mqtt_tls=mqtt_tls,
)
# Set up signal handlers

View File

@@ -11,6 +11,7 @@ from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates
from meshcore_hub import __version__
from meshcore_hub.common.schemas import RadioConfig
logger = logging.getLogger(__name__)
@@ -47,32 +48,42 @@ async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]:
def create_app(
api_url: str = "http://localhost:8000",
api_url: str | None = None,
api_key: str | None = None,
network_name: str = "MeshCore Network",
network_name: str | None = None,
network_city: str | None = None,
network_country: str | None = None,
network_location: tuple[float, float] | None = None,
network_radio_config: str | None = None,
network_contact_email: str | None = None,
network_contact_discord: str | None = None,
network_contact_github: str | None = None,
network_welcome_text: str | None = None,
) -> FastAPI:
"""Create and configure the web dashboard application.
When called without arguments (e.g., in reload mode), settings are loaded
from environment variables via the WebSettings class.
Args:
api_url: Base URL of the MeshCore Hub API
api_key: API key for authentication
network_name: Display name for the network
network_city: City where the network is located
network_country: Country where the network is located
network_location: (lat, lon) tuple for map centering
network_radio_config: Radio configuration description
network_contact_email: Contact email address
network_contact_discord: Discord invite/server info
network_contact_github: GitHub repository URL
network_welcome_text: Welcome text for homepage
Returns:
Configured FastAPI application
"""
# Load settings from environment if not provided
from meshcore_hub.common.config import get_web_settings
settings = get_web_settings()
app = FastAPI(
title="MeshCore Hub Dashboard",
description="Web dashboard for MeshCore network visualization",
@@ -82,16 +93,27 @@ def create_app(
redoc_url=None,
)
# Store configuration in app state
app.state.api_url = api_url
app.state.api_key = api_key
app.state.network_name = network_name
app.state.network_city = network_city
app.state.network_country = network_country
app.state.network_location = network_location or (0.0, 0.0)
app.state.network_radio_config = network_radio_config
app.state.network_contact_email = network_contact_email
app.state.network_contact_discord = network_contact_discord
# Store configuration in app state (use args if provided, else settings)
app.state.api_url = api_url or settings.api_base_url
app.state.api_key = api_key or settings.api_key
app.state.network_name = network_name or settings.network_name
app.state.network_city = network_city or settings.network_city
app.state.network_country = network_country or settings.network_country
app.state.network_radio_config = (
network_radio_config or settings.network_radio_config
)
app.state.network_contact_email = (
network_contact_email or settings.network_contact_email
)
app.state.network_contact_discord = (
network_contact_discord or settings.network_contact_discord
)
app.state.network_contact_github = (
network_contact_github or settings.network_contact_github
)
app.state.network_welcome_text = (
network_welcome_text or settings.network_welcome_text
)
# Set up templates
templates = Jinja2Templates(directory=str(TEMPLATES_DIR))
@@ -134,13 +156,19 @@ def get_templates(request: Request) -> Jinja2Templates:
def get_network_context(request: Request) -> dict:
"""Get network configuration context for templates."""
# Parse radio config from comma-delimited string
radio_config = RadioConfig.from_config_string(
request.app.state.network_radio_config
)
return {
"network_name": request.app.state.network_name,
"network_city": request.app.state.network_city,
"network_country": request.app.state.network_country,
"network_location": request.app.state.network_location,
"network_radio_config": request.app.state.network_radio_config,
"network_radio_config": radio_config,
"network_contact_email": request.app.state.network_contact_email,
"network_contact_discord": request.app.state.network_contact_discord,
"network_contact_github": request.app.state.network_contact_github,
"network_welcome_text": request.app.state.network_welcome_text,
"version": __version__,
}

View File

@@ -7,21 +7,21 @@ import click
@click.option(
"--host",
type=str,
default="0.0.0.0",
default=None,
envvar="WEB_HOST",
help="Web server host",
help="Web server host (default: 0.0.0.0)",
)
@click.option(
"--port",
type=int,
default=8080,
default=None,
envvar="WEB_PORT",
help="Web server port",
help="Web server port (default: 8080)",
)
@click.option(
"--api-url",
type=str,
default="http://localhost:8000",
default=None,
envvar="API_BASE_URL",
help="API server base URL",
)
@@ -42,7 +42,7 @@ import click
@click.option(
"--network-name",
type=str,
default="MeshCore Network",
default=None,
envvar="NETWORK_NAME",
help="Network display name",
)
@@ -60,20 +60,6 @@ import click
envvar="NETWORK_COUNTRY",
help="Network country",
)
@click.option(
"--network-lat",
type=float,
default=0.0,
envvar="NETWORK_LAT",
help="Network center latitude",
)
@click.option(
"--network-lon",
type=float,
default=0.0,
envvar="NETWORK_LON",
help="Network center longitude",
)
@click.option(
"--network-radio-config",
type=str,
@@ -95,6 +81,20 @@ import click
envvar="NETWORK_CONTACT_DISCORD",
help="Discord server info",
)
@click.option(
"--network-contact-github",
type=str,
default=None,
envvar="NETWORK_CONTACT_GITHUB",
help="GitHub repository URL",
)
@click.option(
"--network-welcome-text",
type=str,
default=None,
envvar="NETWORK_WELCOME_TEXT",
help="Welcome text for homepage",
)
@click.option(
"--reload",
is_flag=True,
@@ -104,19 +104,19 @@ import click
@click.pass_context
def web(
ctx: click.Context,
host: str,
port: int,
api_url: str,
host: str | None,
port: int | None,
api_url: str | None,
api_key: str | None,
data_home: str | None,
network_name: str,
network_name: str | None,
network_city: str | None,
network_country: str | None,
network_lat: float,
network_lon: float,
network_radio_config: str | None,
network_contact_email: str | None,
network_contact_discord: str | None,
network_contact_github: str | None,
network_welcome_text: str | None,
reload: bool,
) -> None:
"""Run the web dashboard.
@@ -146,46 +146,46 @@ def web(
from meshcore_hub.common.config import get_web_settings
from meshcore_hub.web.app import create_app
# Get settings to compute effective values
# Get settings for defaults and display
settings = get_web_settings()
# Override data_home if provided
if data_home:
settings = settings.model_copy(update={"data_home": data_home})
# Use CLI args or fall back to settings
effective_host = host or settings.web_host
effective_port = port or settings.web_port
effective_data_home = data_home or settings.data_home
# Ensure web data directory exists
web_data_dir = Path(effective_data_home) / "web"
web_data_dir.mkdir(parents=True, exist_ok=True)
# Display effective settings
effective_network_name = network_name or settings.network_name
click.echo("=" * 50)
click.echo("MeshCore Hub Web Dashboard")
click.echo("=" * 50)
click.echo(f"Host: {host}")
click.echo(f"Port: {port}")
click.echo(f"Host: {effective_host}")
click.echo(f"Port: {effective_port}")
click.echo(f"Data home: {effective_data_home}")
click.echo(f"API URL: {api_url}")
click.echo(f"API key configured: {api_key is not None}")
click.echo(f"Network: {network_name}")
if network_city and network_country:
click.echo(f"Location: {network_city}, {network_country}")
if network_lat != 0.0 or network_lon != 0.0:
click.echo(f"Map center: {network_lat}, {network_lon}")
click.echo(f"API URL: {api_url or settings.api_base_url}")
click.echo(f"API key configured: {(api_key or settings.api_key) is not None}")
click.echo(f"Network: {effective_network_name}")
effective_city = network_city or settings.network_city
effective_country = network_country or settings.network_country
if effective_city and effective_country:
click.echo(f"Location: {effective_city}, {effective_country}")
click.echo(f"Reload mode: {reload}")
click.echo("=" * 50)
network_location = (network_lat, network_lon)
if reload:
# For development, use uvicorn's reload feature
click.echo("\nStarting in development mode with auto-reload...")
click.echo("Note: Using default settings for reload mode.")
click.echo("Note: Settings loaded from environment/config.")
uvicorn.run(
"meshcore_hub.web.app:create_app",
host=host,
port=port,
host=effective_host,
port=effective_port,
reload=True,
factory=True,
)
@@ -197,11 +197,12 @@ def web(
network_name=network_name,
network_city=network_city,
network_country=network_country,
network_location=network_location,
network_radio_config=network_radio_config,
network_contact_email=network_contact_email,
network_contact_discord=network_contact_discord,
network_contact_github=network_contact_github,
network_welcome_text=network_welcome_text,
)
click.echo("\nStarting web dashboard...")
uvicorn.run(app, host=host, port=port)
uvicorn.run(app, host=effective_host, port=effective_port)

View File

@@ -6,6 +6,7 @@ from meshcore_hub.web.routes.home import router as home_router
from meshcore_hub.web.routes.network import router as network_router
from meshcore_hub.web.routes.nodes import router as nodes_router
from meshcore_hub.web.routes.messages import router as messages_router
from meshcore_hub.web.routes.advertisements import router as advertisements_router
from meshcore_hub.web.routes.map import router as map_router
from meshcore_hub.web.routes.members import router as members_router
@@ -17,6 +18,7 @@ web_router.include_router(home_router)
web_router.include_router(network_router)
web_router.include_router(nodes_router)
web_router.include_router(messages_router)
web_router.include_router(advertisements_router)
web_router.include_router(map_router)
web_router.include_router(members_router)

View File

@@ -0,0 +1,64 @@
"""Advertisements page route."""
import logging
from fastapi import APIRouter, Query, Request
from fastapi.responses import HTMLResponse
from meshcore_hub.web.app import get_network_context, get_templates
logger = logging.getLogger(__name__)
router = APIRouter()
@router.get("/advertisements", response_class=HTMLResponse)
async def advertisements_list(
request: Request,
search: str | None = Query(None, description="Search term"),
page: int = Query(1, ge=1, description="Page number"),
limit: int = Query(50, ge=1, le=100, description="Items per page"),
) -> HTMLResponse:
"""Render the advertisements list page."""
templates = get_templates(request)
context = get_network_context(request)
context["request"] = request
# Calculate offset
offset = (page - 1) * limit
# Build query params
params: dict[str, int | str] = {"limit": limit, "offset": offset}
if search:
params["search"] = search
# Fetch advertisements from API
advertisements = []
total = 0
try:
response = await request.app.state.http_client.get(
"/api/v1/advertisements", params=params
)
if response.status_code == 200:
data = response.json()
advertisements = data.get("items", [])
total = data.get("total", 0)
except Exception as e:
logger.warning(f"Failed to fetch advertisements from API: {e}")
context["api_error"] = str(e)
# Calculate pagination
total_pages = (total + limit - 1) // limit if total > 0 else 1
context.update(
{
"advertisements": advertisements,
"total": total,
"page": page,
"limit": limit,
"total_pages": total_pages,
"search": search or "",
}
)
return templates.TemplateResponse("advertisements.html", context)
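The offset and page-count arithmetic used by this route (and the other list pages) as a standalone sketch:

```python
def paginate(total: int, page: int, limit: int) -> tuple[int, int]:
    """Return (offset, total_pages) for a 1-indexed page.

    Ceiling division keeps a partial final page; an empty result
    set still reports one page so the template always renders.
    """
    offset = (page - 1) * limit
    total_pages = (total + limit - 1) // limit if total > 0 else 1
    return offset, total_pages


print(paginate(0, 1, 50))    # (0, 1)
print(paginate(101, 3, 50))  # (100, 3)
```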

View File

@@ -1,10 +1,14 @@
"""Home page route."""
import json
import logging
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from meshcore_hub.web.app import get_network_context, get_templates
logger = logging.getLogger(__name__)
router = APIRouter()
@@ -15,4 +19,38 @@ async def home(request: Request) -> HTMLResponse:
context = get_network_context(request)
context["request"] = request
# Fetch stats from API
stats = {
"total_nodes": 0,
"active_nodes": 0,
"total_messages": 0,
"messages_today": 0,
"total_advertisements": 0,
"advertisements_24h": 0,
}
# Fetch activity data for chart
activity = {"days": 7, "data": []}
try:
response = await request.app.state.http_client.get("/api/v1/dashboard/stats")
if response.status_code == 200:
stats = response.json()
except Exception as e:
logger.warning(f"Failed to fetch stats from API: {e}")
context["api_error"] = str(e)
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/activity", params={"days": 7}
)
if response.status_code == 200:
activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch activity from API: {e}")
context["stats"] = stats
# Pass activity data as JSON string for the chart
context["activity_json"] = json.dumps(activity)
return templates.TemplateResponse("home.html", context)
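Both dashboard fetches above follow the same shape: start from a default, overwrite on a 200, log and fall through otherwise. A small helper (hypothetical, not in the codebase) that could collapse the repetition:

```python
import logging

logger = logging.getLogger(__name__)


async def fetch_or_default(client, path: str, default, params=None):
    """GET a JSON endpoint, returning `default` on any failure.

    Non-200 responses and transport errors both fall back, so the
    page still renders when the API is unreachable.
    """
    try:
        response = await client.get(path, params=params)
        if response.status_code == 200:
            return response.json()
    except Exception as exc:
        logger.warning(f"Failed to fetch {path}: {exc}")
    return default
```

The route body would then reduce to a couple of `await fetch_or_default(...)` calls, at the cost of losing the per-endpoint `api_error` context flag unless that is threaded through as well.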

View File

@@ -1,6 +1,7 @@
"""Map page route."""
import logging
from typing import Any
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse, JSONResponse
@@ -23,10 +24,39 @@ async def map_page(request: Request) -> HTMLResponse:
@router.get("/map/data")
async def map_data(request: Request) -> JSONResponse:
"""Return node location data as JSON for the map."""
nodes_with_location = []
"""Return node location data as JSON for the map.
Includes role tag, member ownership info, and all data needed for filtering.
"""
nodes_with_location: list[dict[str, Any]] = []
members_list: list[dict[str, Any]] = []
members_by_key: dict[str, dict[str, Any]] = {}
error: str | None = None
total_nodes = 0
nodes_with_coords = 0
try:
# Fetch all members to build lookup by public_key
members_response = await request.app.state.http_client.get(
"/api/v1/members", params={"limit": 500}
)
if members_response.status_code == 200:
members_data = members_response.json()
for member in members_data.get("items", []):
# Only include members with public_key (required for node ownership)
if member.get("public_key"):
member_info = {
"public_key": member.get("public_key"),
"name": member.get("name"),
"callsign": member.get("callsign"),
}
members_list.append(member_info)
members_by_key[member["public_key"]] = member_info
else:
logger.warning(
f"Failed to fetch members: status {members_response.status_code}"
)
# Fetch all nodes from API
response = await request.app.state.http_client.get(
"/api/v1/nodes", params={"limit": 500}
@@ -34,6 +64,7 @@ async def map_data(request: Request) -> JSONResponse:
if response.status_code == 200:
data = response.json()
nodes = data.get("items", [])
total_nodes = len(nodes)
# Filter nodes with location tags
for node in nodes:
@@ -41,6 +72,8 @@ async def map_data(request: Request) -> JSONResponse:
lat = None
lon = None
friendly_name = None
role = None
for tag in tags:
key = tag.get("key")
if key == "lat":
@@ -55,37 +88,70 @@ async def map_data(request: Request) -> JSONResponse:
pass
elif key == "friendly_name":
friendly_name = tag.get("value")
elif key == "role":
role = tag.get("value")
if lat is not None and lon is not None:
nodes_with_coords += 1
# Use friendly_name, then node name, then public key prefix
display_name = (
friendly_name
or node.get("name")
or node.get("public_key", "")[:12]
)
public_key = node.get("public_key")
# Find owner member if exists
owner = members_by_key.get(public_key)
nodes_with_location.append(
{
"public_key": node.get("public_key"),
"public_key": public_key,
"name": display_name,
"adv_type": node.get("adv_type"),
"lat": lat,
"lon": lon,
"last_seen": node.get("last_seen"),
"role": role,
"is_infra": role == "infra",
"owner": owner,
}
)
else:
error = f"API returned status {response.status_code}"
logger.warning(f"Failed to fetch nodes: {error}")
except Exception as e:
error = str(e)
logger.warning(f"Failed to fetch nodes for map: {e}")
# Get network center location
network_location = request.app.state.network_location
logger.info(
f"Map data: {total_nodes} total nodes, {nodes_with_coords} with coordinates"
)
# Calculate center from nodes, or use default (0, 0)
center_lat = 0.0
center_lon = 0.0
if nodes_with_location:
center_lat = sum(n["lat"] for n in nodes_with_location) / len(
nodes_with_location
)
center_lon = sum(n["lon"] for n in nodes_with_location) / len(
nodes_with_location
)
return JSONResponse(
{
"nodes": nodes_with_location,
"members": members_list,
"center": {
"lat": network_location[0],
"lon": network_location[1],
"lat": center_lat,
"lon": center_lon,
},
"debug": {
"total_nodes": total_nodes,
"nodes_with_coords": nodes_with_coords,
"error": error,
},
}
)
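The center computation above is a plain arithmetic mean over the plotted nodes; as a standalone sketch:

```python
def map_center(nodes: list[dict]) -> tuple[float, float]:
    """Average the lat/lon of plotted nodes, defaulting to (0, 0).

    A mean keeps the viewport over the bulk of the network; note it
    can be skewed by a single far-away node (a median would resist
    outliers better).
    """
    if not nodes:
        return 0.0, 0.0
    lat = sum(n["lat"] for n in nodes) / len(nodes)
    lon = sum(n["lon"] for n in nodes) / len(nodes)
    return lat, lon


print(map_center([{"lat": 51.0, "lon": -2.0}, {"lat": 53.0, "lon": 0.0}]))  # (52.0, -1.0)
```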

View File

@@ -21,13 +21,74 @@ async def members_page(request: Request) -> HTMLResponse:
# Fetch members from API
members = []
def node_sort_key(node: dict) -> int:
"""Sort nodes: repeater first, then chat, then others."""
adv_type = (node.get("adv_type") or "").lower()
if adv_type == "repeater":
return 0
if adv_type == "chat":
return 1
return 2
try:
# Fetch all members
response = await request.app.state.http_client.get(
"/api/v1/members", params={"limit": 100}
)
if response.status_code == 200:
data = response.json()
members = data.get("items", [])
# Fetch all nodes with member_id tags in one query
nodes_response = await request.app.state.http_client.get(
"/api/v1/nodes", params={"has_tag": "member_id", "limit": 500}
)
# Build a map of member_id -> nodes
member_nodes_map: dict[str, list] = {}
if nodes_response.status_code == 200:
nodes_data = nodes_response.json()
all_nodes = nodes_data.get("items", [])
for node in all_nodes:
# Find member_id tag
for tag in node.get("tags", []):
if tag.get("key") == "member_id":
member_id_value = tag.get("value")
if member_id_value:
if member_id_value not in member_nodes_map:
member_nodes_map[member_id_value] = []
member_nodes_map[member_id_value].append(node)
break
# Assign nodes to members and sort
for member in members:
member_id = member.get("member_id")
if member_id and member_id in member_nodes_map:
# Sort nodes (repeater first, then chat, then by name tag)
nodes = member_nodes_map[member_id]
# Sort by advertisement type first, then by name
def full_sort_key(node: dict) -> tuple:
adv_type = (node.get("adv_type") or "").lower()
type_priority = (
0
if adv_type == "repeater"
else (1 if adv_type == "chat" else 2)
)
# Get name from tags
node_name = node.get("name") or ""
for tag in node.get("tags", []):
if tag.get("key") == "name":
node_name = tag.get("value") or node_name
break
return (type_priority, node_name.lower())
member["nodes"] = sorted(nodes, key=full_sort_key)
else:
member["nodes"] = []
except Exception as e:
logger.warning(f"Failed to fetch members from API: {e}")
context["api_error"] = str(e)
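The sort key above orders a member's nodes repeaters-first, chat second, everything else last, then alphabetically by name tag. A condensed sketch of the same ordering:

```python
_TYPE_PRIORITY = {"repeater": 0, "chat": 1}


def node_sort_key(node: dict) -> tuple[int, str]:
    """Sort repeaters first, chat second, others last, then by name.

    The display name prefers a `name` tag over the node's own name,
    mirroring the route above.
    """
    adv_type = (node.get("adv_type") or "").lower()
    name = node.get("name") or ""
    for tag in node.get("tags", []):
        if tag.get("key") == "name":
            name = tag.get("value") or name
            break
    return _TYPE_PRIORITY.get(adv_type, 2), name.lower()


nodes = [
    {"adv_type": "chat", "name": "b"},
    {"adv_type": "repeater", "name": "z"},
    {"adv_type": None, "name": "a"},
]
print([n["name"] for n in sorted(nodes, key=node_sort_key)])  # ['z', 'b', 'a']
```

Hoisting the key function out of the per-member loop (as sketched here) also avoids redefining it on every iteration.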

View File

@@ -15,7 +15,7 @@ router = APIRouter()
async def messages_list(
request: Request,
message_type: str | None = Query(None, description="Filter by message type"),
channel_idx: int | None = Query(None, description="Filter by channel"),
channel_idx: str | None = Query(None, description="Filter by channel"),
search: str | None = Query(None, description="Search in message text"),
page: int = Query(1, ge=1, description="Page number"),
limit: int = Query(50, ge=1, le=100, description="Items per page"),
@@ -28,12 +28,20 @@ async def messages_list(
# Calculate offset
offset = (page - 1) * limit
# Parse channel_idx, treating empty string as None
channel_idx_int: int | None = None
if channel_idx and channel_idx.strip():
try:
channel_idx_int = int(channel_idx)
except ValueError:
logger.warning(f"Invalid channel_idx value: {channel_idx}")
# Build query params
params: dict[str, int | str] = {"limit": limit, "offset": offset}
if message_type:
params["message_type"] = message_type
if channel_idx is not None:
params["channel_idx"] = channel_idx
if channel_idx_int is not None:
params["channel_idx"] = channel_idx_int
# Fetch messages from API
messages = []
@@ -62,7 +70,7 @@ async def messages_list(
"limit": limit,
"total_pages": total_pages,
"message_type": message_type or "",
"channel_idx": channel_idx,
"channel_idx": channel_idx_int,
"search": search or "",
}
)

View File

@@ -1,5 +1,6 @@
"""Network overview page route."""
import json
import logging
from fastapi import APIRouter, Request
@@ -30,6 +31,11 @@ async def network_overview(request: Request) -> HTMLResponse:
"channel_message_counts": {},
}
# Fetch activity data for charts (7 days)
advert_activity = {"days": 7, "data": []}
message_activity = {"days": 7, "data": []}
node_count = {"days": 7, "data": []}
try:
response = await request.app.state.http_client.get("/api/v1/dashboard/stats")
if response.status_code == 200:
@@ -38,6 +44,36 @@ async def network_overview(request: Request) -> HTMLResponse:
logger.warning(f"Failed to fetch stats from API: {e}")
context["api_error"] = str(e)
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/activity", params={"days": 7}
)
if response.status_code == 200:
advert_activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch advertisement activity from API: {e}")
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/message-activity", params={"days": 7}
)
if response.status_code == 200:
message_activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch message activity from API: {e}")
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/node-count", params={"days": 7}
)
if response.status_code == 200:
node_count = response.json()
except Exception as e:
logger.warning(f"Failed to fetch node count from API: {e}")
context["stats"] = stats
context["advert_activity_json"] = json.dumps(advert_activity)
context["message_activity_json"] = json.dumps(message_activity)
context["node_count_json"] = json.dumps(node_count)
return templates.TemplateResponse("network.html", context)
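One caveat with passing `json.dumps(...)` output into an inline `<script>` block: plain `json.dumps` does not escape `</`, so a string value containing `</script>` could terminate the tag early. A hedged sketch of a defensive serializer (helper name hypothetical; only relevant if activity data could ever carry user-supplied strings):

```python
import json


def to_script_json(data) -> str:
    """Serialize chart data for embedding inside an inline <script>.

    Escaping "</" as "<\\/" (a valid JSON escape for "/") prevents a
    value containing "</script>" from closing the surrounding tag.
    """
    return json.dumps(data).replace("</", "<\\/")


print(to_script_json({"days": 7, "data": []}))
```

Jinja2's built-in `tojson` filter performs equivalent escaping, which may be simpler than pre-serializing in the route.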

View File

@@ -0,0 +1,63 @@
/**
* MeshCore Hub - Common JavaScript Utilities
*/
/**
* Format a timestamp as relative time (e.g., "2m", "1h", "2d")
* @param {string|Date} timestamp - ISO timestamp string or Date object
* @returns {string} Relative time string, or empty string if invalid
*/
function formatRelativeTime(timestamp) {
if (!timestamp) return '';
const date = timestamp instanceof Date ? timestamp : new Date(timestamp);
if (isNaN(date.getTime())) return '';
const now = new Date();
const diffMs = now - date;
const diffSec = Math.floor(diffMs / 1000);
const diffMin = Math.floor(diffSec / 60);
const diffHour = Math.floor(diffMin / 60);
const diffDay = Math.floor(diffHour / 24);
if (diffDay > 0) return `${diffDay}d`;
if (diffHour > 0) return `${diffHour}h`;
if (diffMin > 0) return `${diffMin}m`;
return '<1m';
}
/**
* Populate all elements with data-timestamp attribute with relative time
*/
function populateRelativeTimestamps() {
document.querySelectorAll('[data-timestamp]:not([data-receiver-tooltip])').forEach(el => {
const timestamp = el.dataset.timestamp;
if (timestamp) {
el.textContent = formatRelativeTime(timestamp);
}
});
}
/**
* Populate receiver tooltip elements with name and relative time
*/
function populateReceiverTooltips() {
document.querySelectorAll('[data-receiver-tooltip]').forEach(el => {
const name = el.dataset.name || '';
const timestamp = el.dataset.timestamp;
const relTime = timestamp ? formatRelativeTime(timestamp) : '';
// Build tooltip: "NodeName (2m ago)" or just "NodeName" or just "2m ago"
let tooltip = name;
if (relTime) {
tooltip = name ? `${name} (${relTime} ago)` : `${relTime} ago`;
}
el.title = tooltip;
});
}
// Auto-populate when DOM is ready
document.addEventListener('DOMContentLoaded', () => {
populateRelativeTimestamps();
populateReceiverTooltips();
});

View File

@@ -0,0 +1,116 @@
{% extends "base.html" %}
{% block title %}{{ network_name }} - Advertisements{% endblock %}
{% block content %}
<div class="flex items-center justify-between mb-6">
<h1 class="text-3xl font-bold">Advertisements</h1>
<span class="badge badge-lg">{{ total }} total</span>
</div>
{% if api_error %}
<div class="alert alert-warning mb-6">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span>Could not fetch data from API: {{ api_error }}</span>
</div>
{% endif %}
<!-- Filters -->
<div class="card bg-base-100 shadow mb-6">
<div class="card-body py-4">
<form method="GET" action="/advertisements" class="flex gap-4 flex-wrap items-end">
<div class="form-control">
<label class="label py-1">
<span class="label-text">Search</span>
</label>
<input type="text" name="search" value="{{ search }}" placeholder="Search by name, ID, or public key..." class="input input-bordered input-sm w-80" />
</div>
<button type="submit" class="btn btn-primary btn-sm">Search</button>
<a href="/advertisements" class="btn btn-ghost btn-sm">Clear</a>
</form>
</div>
</div>
<!-- Advertisements Table -->
<div class="overflow-x-auto overflow-y-visible bg-base-100 rounded-box shadow">
<table class="table table-zebra">
<thead>
<tr>
<th>Node</th>
<th>Time</th>
<th>Receivers</th>
</tr>
</thead>
<tbody>
{% for ad in advertisements %}
<tr class="hover">
<td>
<a href="/nodes/{{ ad.public_key }}" class="link link-hover flex items-center gap-2">
<span class="text-lg" title="{{ ad.adv_type or 'Unknown' }}">{% if ad.adv_type and ad.adv_type|lower == 'chat' %}💬{% elif ad.adv_type and ad.adv_type|lower == 'repeater' %}📡{% elif ad.adv_type and ad.adv_type|lower == 'room' %}🪧{% else %}📍{% endif %}</span>
<div>
{% if ad.node_tag_name or ad.node_name or ad.name %}
<div class="font-medium">{{ ad.node_tag_name or ad.node_name or ad.name }}</div>
<div class="text-xs font-mono opacity-70">{{ ad.public_key[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ ad.public_key[:16] }}...</span>
{% endif %}
</div>
</a>
</td>
<td class="text-sm whitespace-nowrap">
{{ ad.received_at[:19].replace('T', ' ') if ad.received_at else '-' }}
</td>
<td>
{% if ad.receivers and ad.receivers|length >= 1 %}
<div class="flex gap-1">
{% for recv in ad.receivers %}
<a href="/nodes/{{ recv.public_key }}" class="text-lg hover:opacity-70" data-receiver-tooltip data-name="{{ recv.tag_name or recv.name or recv.public_key[:12] }}" data-timestamp="{{ recv.received_at }}">📡</a>
{% endfor %}
</div>
{% elif ad.received_by %}
<a href="/nodes/{{ ad.received_by }}" class="text-lg hover:opacity-70" title="{{ ad.receiver_tag_name or ad.receiver_name or ad.received_by[:12] }}">📡</a>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
</tr>
{% else %}
<tr>
<td colspan="3" class="text-center py-8 opacity-70">No advertisements found.</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
<!-- Pagination -->
{% if total_pages > 1 %}
<div class="flex justify-center mt-6">
<div class="join">
{% if page > 1 %}
<a href="?page={{ page - 1 }}&search={{ search }}&limit={{ limit }}" class="join-item btn btn-sm">Previous</a>
{% else %}
<button class="join-item btn btn-sm btn-disabled">Previous</button>
{% endif %}
{% for p in range(1, total_pages + 1) %}
{% if p == page %}
<button class="join-item btn btn-sm btn-active">{{ p }}</button>
{% elif p == 1 or p == total_pages or (p >= page - 2 and p <= page + 2) %}
<a href="?page={{ p }}&search={{ search }}&limit={{ limit }}" class="join-item btn btn-sm">{{ p }}</a>
{% elif p == 2 or p == total_pages - 1 %}
<button class="join-item btn btn-sm btn-disabled">...</button>
{% endif %}
{% endfor %}
{% if page < total_pages %}
<a href="?page={{ page + 1 }}&search={{ search }}&limit={{ limit }}" class="join-item btn btn-sm">Next</a>
{% else %}
<button class="join-item btn btn-sm btn-disabled">Next</button>
{% endif %}
</div>
</div>
{% endif %}
{% endblock %}

View File

@@ -58,6 +58,7 @@
<li><a href="/" class="{% if request.url.path == '/' %}active{% endif %}">Home</a></li>
<li><a href="/network" class="{% if request.url.path == '/network' %}active{% endif %}">Network</a></li>
<li><a href="/nodes" class="{% if '/nodes' in request.url.path %}active{% endif %}">Nodes</a></li>
<li><a href="/advertisements" class="{% if request.url.path == '/advertisements' %}active{% endif %}">Advertisements</a></li>
<li><a href="/messages" class="{% if request.url.path == '/messages' %}active{% endif %}">Messages</a></li>
<li><a href="/map" class="{% if request.url.path == '/map' %}active{% endif %}">Map</a></li>
<li><a href="/members" class="{% if request.url.path == '/members' %}active{% endif %}">Members</a></li>
@@ -75,13 +76,14 @@
<li><a href="/" class="{% if request.url.path == '/' %}active{% endif %}">Home</a></li>
<li><a href="/network" class="{% if request.url.path == '/network' %}active{% endif %}">Network</a></li>
<li><a href="/nodes" class="{% if '/nodes' in request.url.path %}active{% endif %}">Nodes</a></li>
<li><a href="/advertisements" class="{% if request.url.path == '/advertisements' %}active{% endif %}">Advertisements</a></li>
<li><a href="/messages" class="{% if request.url.path == '/messages' %}active{% endif %}">Messages</a></li>
<li><a href="/map" class="{% if request.url.path == '/map' %}active{% endif %}">Map</a></li>
<li><a href="/members" class="{% if request.url.path == '/members' %}active{% endif %}">Members</a></li>
</ul>
</div>
<div class="navbar-end">
<div class="badge badge-outline badge-sm">v{{ version }}</div>
<div class="badge badge-outline badge-sm">{{ version }}</div>
</div>
</div>
@@ -105,16 +107,23 @@
{% endif %}
{% if network_contact_email and network_contact_discord %} | {% endif %}
{% if network_contact_discord %}
<span>Discord: {{ network_contact_discord }}</span>
<a href="{{ network_contact_discord }}" target="_blank" rel="noopener noreferrer" class="link link-hover">Discord</a>
{% endif %}
{% if (network_contact_email or network_contact_discord) and network_contact_github %} | {% endif %}
{% if network_contact_github %}
<a href="{{ network_contact_github }}" target="_blank" rel="noopener noreferrer" class="link link-hover">GitHub</a>
{% endif %}
</p>
<p class="text-xs opacity-50 mt-2">Powered by MeshCore Hub v{{ version }}</p>
<p class="text-xs opacity-50 mt-2">Powered by <a href="https://github.com/ipnet-mesh/meshcore-hub" target="_blank" rel="noopener noreferrer" class="link link-hover">MeshCore Hub</a> {{ version }}</p>
</aside>
</footer>
<!-- Leaflet JS for maps -->
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
<!-- Common utilities -->
<script src="/static/js/utils.js"></script>
{% block extra_scripts %}{% endblock %}
</body>
</html>

View File

@@ -3,42 +3,91 @@
{% block title %}{{ network_name }} - Home{% endblock %}
{% block content %}
<div class="hero min-h-[50vh] bg-base-100 rounded-box">
<div class="hero py-8 bg-base-100 rounded-box">
<div class="hero-content text-center">
<div class="max-w-2xl">
<h1 class="text-5xl font-bold">{{ network_name }}</h1>
<h1 class="text-4xl font-bold">{{ network_name }}</h1>
{% if network_city and network_country %}
<p class="py-2 text-lg opacity-70">{{ network_city }}, {{ network_country }}</p>
<p class="py-1 text-lg opacity-70">{{ network_city }}, {{ network_country }}</p>
{% endif %}
<p class="py-6">
{% if network_welcome_text %}
<p class="py-4">{{ network_welcome_text }}</p>
{% else %}
<p class="py-4">
Welcome to the {{ network_name }} mesh network dashboard.
Monitor network activity, view connected nodes, and explore message history.
</p>
{% endif %}
<div class="flex gap-4 justify-center flex-wrap">
<a href="/network" class="btn btn-primary">
<a href="/network" class="btn btn-neutral">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z" />
</svg>
View Network Stats
Dashboard
</a>
<a href="/nodes" class="btn btn-secondary">
<a href="/nodes" class="btn btn-primary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
Browse Nodes
Nodes
</a>
<a href="/map" class="btn btn-accent">
<a href="/advertisements" class="btn btn-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 20l-5.447-2.724A1 1 0 013 16.382V5.618a1 1 0 011.447-.894L9 7m0 13l6-3m-6 3V7m6 10l4.553 2.276A1 1 0 0021 18.382V7.618a1 1 0 00-.553-.894L15 4m0 13V4m0 0L9 7" />
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
View Map
Advertisements
</a>
<a href="/messages" class="btn btn-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
Messages
</a>
</div>
</div>
</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6 mt-8">
<!-- Stats Cards -->
<div class="grid grid-cols-1 md:grid-cols-3 gap-6 mt-6">
<!-- Total Nodes -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-primary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
</div>
<div class="stat-title">Total Nodes</div>
<div class="stat-value text-primary">{{ stats.total_nodes }}</div>
<div class="stat-desc">All discovered nodes</div>
</div>
<!-- Advertisements (7 days) -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
</div>
<div class="stat-title">Advertisements</div>
<div class="stat-value text-secondary">{{ stats.advertisements_7d }}</div>
<div class="stat-desc">Last 7 days</div>
</div>
<!-- Messages (7 days) -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
</div>
<div class="stat-title">Messages</div>
<div class="stat-value text-accent">{{ stats.messages_7d }}</div>
<div class="stat-desc">Last 7 days</div>
</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6 mt-6">
<!-- Network Info Card -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
@@ -50,11 +99,43 @@
</h2>
<div class="space-y-2">
{% if network_radio_config %}
{% if network_radio_config.profile %}
<div class="flex justify-between">
<span class="opacity-70">Radio Config:</span>
<span class="font-mono">{{ network_radio_config }}</span>
<span class="opacity-70">Profile:</span>
<span class="font-mono">{{ network_radio_config.profile }}</span>
</div>
{% endif %}
{% if network_radio_config.frequency %}
<div class="flex justify-between">
<span class="opacity-70">Frequency:</span>
<span class="font-mono">{{ network_radio_config.frequency }}</span>
</div>
{% endif %}
{% if network_radio_config.spreading_factor %}
<div class="flex justify-between">
<span class="opacity-70">Spreading Factor:</span>
<span class="font-mono">{{ network_radio_config.spreading_factor }}</span>
</div>
{% endif %}
{% if network_radio_config.bandwidth %}
<div class="flex justify-between">
<span class="opacity-70">Bandwidth:</span>
<span class="font-mono">{{ network_radio_config.bandwidth }}</span>
</div>
{% endif %}
{% if network_radio_config.coding_rate %}
<div class="flex justify-between">
<span class="opacity-70">Coding Rate:</span>
<span class="font-mono">{{ network_radio_config.coding_rate }}</span>
</div>
{% endif %}
{% if network_radio_config.tx_power %}
<div class="flex justify-between">
<span class="opacity-70">TX Power:</span>
<span class="font-mono">{{ network_radio_config.tx_power }}</span>
</div>
{% endif %}
{% endif %}
{% if network_location and network_location != (0.0, 0.0) %}
<div class="flex justify-between">
<span class="opacity-70">Location:</span>
@@ -65,20 +146,19 @@
</div>
</div>
<!-- Quick Links Card -->
<!-- Network Activity Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">
<svg xmlns="http://www.w3.org/2000/svg" class="h-6 w-6" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1" />
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 12l3-3 3 3 4-4M8 21l4-4 4 4M3 4h18M4 4h16v12a1 1 0 01-1 1H5a1 1 0 01-1-1V4z" />
</svg>
Quick Links
Network Activity
</h2>
<ul class="menu bg-base-200 rounded-box">
<li><a href="/messages">Recent Messages</a></li>
<li><a href="/nodes">All Nodes</a></li>
<li><a href="/members">Network Members</a></li>
</ul>
<p class="text-sm opacity-70 mb-2">Advertisements received per day (last 7 days)</p>
<div class="h-48">
<canvas id="activityChart"></canvas>
</div>
</div>
</div>
@@ -101,14 +181,22 @@
</a>
{% endif %}
{% if network_contact_discord %}
<div class="btn btn-outline btn-sm btn-block">
<a href="{{ network_contact_discord }}" target="_blank" rel="noopener noreferrer" class="btn btn-outline btn-sm btn-block">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4 mr-2" viewBox="0 0 24 24" fill="currentColor">
<path d="M20.317 4.3698a19.7913 19.7913 0 00-4.8851-1.5152.0741.0741 0 00-.0785.0371c-.211.3753-.4447.8648-.6083 1.2495-1.8447-.2762-3.68-.2762-5.4868 0-.1636-.3933-.4058-.8742-.6177-1.2495a.077.077 0 00-.0785-.037 19.7363 19.7363 0 00-4.8852 1.515.0699.0699 0 00-.0321.0277C.5334 9.0458-.319 13.5799.0992 18.0578a.0824.0824 0 00.0312.0561c2.0528 1.5076 4.0413 2.4228 5.9929 3.0294a.0777.0777 0 00.0842-.0276c.4616-.6304.8731-1.2952 1.226-1.9942a.076.076 0 00-.0416-.1057c-.6528-.2476-1.2743-.5495-1.8722-.8923a.077.077 0 01-.0076-.1277c.1258-.0943.2517-.1923.3718-.2914a.0743.0743 0 01.0776-.0105c3.9278 1.7933 8.18 1.7933 12.0614 0a.0739.0739 0 01.0785.0095c.1202.099.246.1981.3728.2924a.077.077 0 01-.0066.1276 12.2986 12.2986 0 01-1.873.8914.0766.0766 0 00-.0407.1067c.3604.698.7719 1.3628 1.225 1.9932a.076.076 0 00.0842.0286c1.961-.6067 3.9495-1.5219 6.0023-3.0294a.077.077 0 00.0313-.0552c.5004-5.177-.8382-9.6739-3.5485-13.6604a.061.061 0 00-.0312-.0286zM8.02 15.3312c-1.1825 0-2.1569-1.0857-2.1569-2.419 0-1.3332.9555-2.4189 2.157-2.4189 1.2108 0 2.1757 1.0952 2.1568 2.419 0 1.3332-.9555 2.4189-2.1569 2.4189zm7.9748 0c-1.1825 0-2.1569-1.0857-2.1569-2.419 0-1.3332.9554-2.4189 2.1569-2.4189 1.2108 0 2.1757 1.0952 2.1568 2.419 0 1.3332-.946 2.4189-2.1568 2.4189Z"/>
</svg>
{{ network_contact_discord }}
</div>
Discord
</a>
{% endif %}
{% if not network_contact_email and not network_contact_discord %}
{% if network_contact_github %}
<a href="{{ network_contact_github }}" target="_blank" rel="noopener noreferrer" class="btn btn-outline btn-sm btn-block">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4 mr-2" viewBox="0 0 24 24" fill="currentColor">
<path d="M12 0c-6.626 0-12 5.373-12 12 0 5.302 3.438 9.8 8.207 11.387.599.111.793-.261.793-.577v-2.234c-3.338.726-4.033-1.416-4.033-1.416-.546-1.387-1.333-1.756-1.333-1.756-1.089-.745.083-.729.083-.729 1.205.084 1.839 1.237 1.839 1.237 1.07 1.834 2.807 1.304 3.492.997.107-.775.418-1.305.762-1.604-2.665-.305-5.467-1.334-5.467-5.931 0-1.311.469-2.381 1.236-3.221-.124-.303-.535-1.524.117-3.176 0 0 1.008-.322 3.301 1.23.957-.266 1.983-.399 3.003-.404 1.02.005 2.047.138 3.006.404 2.291-1.552 3.297-1.23 3.297-1.23.653 1.653.242 2.874.118 3.176.77.84 1.235 1.911 1.235 3.221 0 4.609-2.807 5.624-5.479 5.921.43.372.823 1.102.823 2.222v3.293c0 .319.192.694.801.576 4.765-1.589 8.199-6.086 8.199-11.386 0-6.627-5.373-12-12-12z"/>
</svg>
GitHub
</a>
{% endif %}
{% if not network_contact_email and not network_contact_discord and not network_contact_github %}
<p class="text-sm opacity-70">No contact information configured.</p>
{% endif %}
</div>
@@ -116,3 +204,85 @@
</div>
</div>
{% endblock %}
{% block extra_scripts %}
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script>
(function() {
const activityData = {{ activity_json | safe }};
const ctx = document.getElementById('activityChart');
if (ctx && activityData.data && activityData.data.length > 0) {
// Format dates for display (show only day/month)
const labels = activityData.data.map(d => {
const date = new Date(d.date);
return date.toLocaleDateString('en-GB', { day: 'numeric', month: 'short' });
});
const counts = activityData.data.map(d => d.count);
new Chart(ctx, {
type: 'line',
data: {
labels: labels,
datasets: [{
label: 'Advertisements',
data: counts,
borderColor: 'oklch(0.7 0.17 330)',
backgroundColor: 'oklch(0.7 0.17 330 / 0.1)',
fill: true,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: {
responsive: true,
maintainAspectRatio: false,
plugins: {
legend: {
display: false
},
tooltip: {
mode: 'index',
intersect: false,
backgroundColor: 'oklch(0.25 0 0)',
titleColor: 'oklch(0.9 0 0)',
bodyColor: 'oklch(0.9 0 0)',
borderColor: 'oklch(0.4 0 0)',
borderWidth: 1
}
},
scales: {
x: {
grid: {
color: 'oklch(0.4 0 0 / 0.2)'
},
ticks: {
color: 'oklch(0.7 0 0)',
maxRotation: 45,
minRotation: 45,
maxTicksLimit: 10
}
},
y: {
beginAtZero: true,
grid: {
color: 'oklch(0.4 0 0 / 0.2)'
},
ticks: {
color: 'oklch(0.7 0 0)',
precision: 0
}
}
},
interaction: {
mode: 'nearest',
axis: 'x',
intersect: false
}
}
});
}
})();
</script>
{% endblock %}


@@ -5,7 +5,7 @@
{% block extra_head %}
<style>
#map {
height: calc(100vh - 250px);
height: calc(100vh - 350px);
min-height: 400px;
border-radius: var(--rounded-box);
}
@@ -22,7 +22,39 @@
{% block content %}
<div class="flex items-center justify-between mb-6">
<h1 class="text-3xl font-bold">Node Map</h1>
<span id="node-count" class="badge badge-lg">Loading...</span>
<div class="flex items-center gap-2">
<span id="node-count" class="badge badge-lg">Loading...</span>
<span id="filtered-count" class="badge badge-lg badge-ghost hidden"></span>
</div>
</div>
<!-- Filters -->
<div class="card bg-base-100 shadow mb-6">
<div class="card-body py-4">
<div class="flex gap-4 flex-wrap items-end">
<div class="form-control">
<label class="label py-1">
<span class="label-text">Node Type</span>
</label>
<select id="filter-type" class="select select-bordered select-sm">
<option value="">All Types</option>
<option value="chat">Chat</option>
<option value="repeater">Repeater</option>
<option value="room">Room</option>
</select>
</div>
<div class="form-control">
<label class="label py-1">
<span class="label-text">Owner</span>
</label>
<select id="filter-owner" class="select select-bordered select-sm">
<option value="">All Owners</option>
<!-- Populated dynamically -->
</select>
</div>
<button id="clear-filters" class="btn btn-ghost btn-sm">Clear Filters</button>
</div>
</div>
</div>
<div class="card bg-base-100 shadow-xl">
@@ -31,69 +63,272 @@
</div>
</div>
<div class="mt-4 text-sm opacity-70">
<!-- Legend -->
<div class="mt-4 flex flex-wrap gap-4 items-center text-sm">
<span class="opacity-70">Legend:</span>
<div class="flex items-center gap-1">
<span class="text-lg">💬</span>
<span>Chat</span>
</div>
<div class="flex items-center gap-1">
<span class="text-lg">📡</span>
<span>Repeater</span>
</div>
<div class="flex items-center gap-1">
<span class="text-lg">🪧</span>
<span>Room</span>
</div>
<div class="flex items-center gap-1">
<span class="text-lg">📍</span>
<span>Other</span>
</div>
</div>
<div class="mt-2 text-sm opacity-70">
<p>Nodes are placed on the map based on their <code>lat</code> and <code>lon</code> tags.</p>
<p>To add a node to the map, set its location tags via the API.</p>
</div>
{% endblock %}
{% block extra_scripts %}
<script>
// Initialize map
const map = L.map('map').setView([{{ network_location[0] }}, {{ network_location[1] }}], 10);
// Initialize map with world view (will be centered on nodes once loaded)
const map = L.map('map').setView([0, 0], 2);
// Add tile layer
L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
attribution: '&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
}).addTo(map);
// Custom marker icon
const nodeIcon = L.divIcon({
className: 'custom-div-icon',
html: `<div style="background-color: oklch(var(--p)); width: 12px; height: 12px; border-radius: 50%; border: 2px solid white; box-shadow: 0 2px 4px rgba(0,0,0,0.3);"></div>`,
iconSize: [12, 12],
iconAnchor: [6, 6]
});
// Store all nodes and markers
let allNodes = [];
let allMembers = [];
let markers = [];
let mapCenter = { lat: 0, lon: 0 };
// Normalize adv_type to lowercase for consistent comparison
function normalizeType(type) {
return type ? type.toLowerCase() : null;
}
// formatRelativeTime is provided by /static/js/utils.js
// Get emoji marker based on node type
function getNodeEmoji(node) {
const type = normalizeType(node.adv_type);
if (type === 'chat') return '💬';
if (type === 'repeater') return '📡';
if (type === 'room') return '🪧';
return '📍';
}
// Get display name for node type
function getTypeDisplay(node) {
const type = normalizeType(node.adv_type);
if (type === 'chat') return 'Chat';
if (type === 'repeater') return 'Repeater';
if (type === 'room') return 'Room';
return type ? type.charAt(0).toUpperCase() + type.slice(1) : 'Unknown';
}
// Create marker icon for a node
function createNodeIcon(node) {
const emoji = getNodeEmoji(node);
const displayName = node.name || '';
const relativeTime = formatRelativeTime(node.last_seen);
const timeDisplay = relativeTime ? ` (${relativeTime})` : '';
return L.divIcon({
className: 'custom-div-icon',
html: `<div style="display: flex; align-items: center; gap: 2px;">
<span style="font-size: 24px; text-shadow: 0 0 3px #1a237e, 0 0 6px #1a237e, 0 1px 2px rgba(0,0,0,0.7);">${emoji}</span>
<span style="font-size: 10px; font-weight: bold; color: #000; background: rgba(255,255,255,0.9); padding: 1px 4px; border-radius: 3px; box-shadow: 0 1px 3px rgba(0,0,0,0.3);">${displayName}${timeDisplay}</span>
</div>`,
iconSize: [82, 28],
iconAnchor: [14, 14]
});
}
// Create popup content for a node
function createPopupContent(node) {
let ownerHtml = '';
if (node.owner) {
const ownerDisplay = node.owner.callsign
? `${node.owner.name} (${node.owner.callsign})`
: node.owner.name;
ownerHtml = `<p><span class="opacity-70">Owner:</span> ${ownerDisplay}</p>`;
}
let roleHtml = '';
if (node.role) {
roleHtml = `<p><span class="opacity-70">Role:</span> <span class="badge badge-xs badge-ghost">${node.role}</span></p>`;
}
const emoji = getNodeEmoji(node);
const typeDisplay = getTypeDisplay(node);
return `
<div class="p-2">
<h3 class="font-bold text-lg mb-2">${emoji} ${node.name}</h3>
<div class="space-y-1 text-sm">
<p><span class="opacity-70">Type:</span> ${typeDisplay}</p>
${roleHtml}
${ownerHtml}
<p><span class="opacity-70">Key:</span> <code class="text-xs">${node.public_key.substring(0, 16)}...</code></p>
<p><span class="opacity-70">Location:</span> ${node.lat.toFixed(4)}, ${node.lon.toFixed(4)}</p>
${node.last_seen ? `<p><span class="opacity-70">Last seen:</span> ${node.last_seen.substring(0, 19).replace('T', ' ')}</p>` : ''}
</div>
<a href="/nodes/${node.public_key}" class="btn btn-primary btn-xs mt-3">View Details</a>
</div>
`;
}
// Clear all markers from map
function clearMarkers() {
markers.forEach(marker => map.removeLayer(marker));
markers = [];
}
// Core filter logic - returns filtered nodes and updates markers
function applyFiltersCore() {
const typeFilter = document.getElementById('filter-type').value;
const ownerFilter = document.getElementById('filter-owner').value;
// Filter nodes
const filteredNodes = allNodes.filter(node => {
// Type filter (case-insensitive)
if (typeFilter && normalizeType(node.adv_type) !== typeFilter) return false;
// Owner filter
if (ownerFilter) {
if (!node.owner || node.owner.public_key !== ownerFilter) return false;
}
return true;
});
// Clear existing markers
clearMarkers();
// Add filtered markers
filteredNodes.forEach(node => {
const marker = L.marker([node.lat, node.lon], { icon: createNodeIcon(node) }).addTo(map);
marker.bindPopup(createPopupContent(node));
markers.push(marker);
});
// Update counts
const countEl = document.getElementById('node-count');
const filteredEl = document.getElementById('filtered-count');
if (filteredNodes.length === allNodes.length) {
countEl.textContent = `${allNodes.length} nodes on map`;
filteredEl.classList.add('hidden');
} else {
countEl.textContent = `${allNodes.length} total`;
filteredEl.textContent = `${filteredNodes.length} shown`;
filteredEl.classList.remove('hidden');
}
return filteredNodes;
}
// Apply filters and recenter map on filtered nodes
function applyFilters() {
const filteredNodes = applyFiltersCore();
// Fit bounds if we have filtered nodes
if (filteredNodes.length > 0) {
const bounds = L.latLngBounds(filteredNodes.map(n => [n.lat, n.lon]));
map.fitBounds(bounds, { padding: [50, 50] });
} else if (mapCenter.lat !== 0 || mapCenter.lon !== 0) {
map.setView([mapCenter.lat, mapCenter.lon], 10);
}
}
// Apply filters without recentering (for initial load after manual center)
function applyFiltersNoRecenter() {
applyFiltersCore();
}
// Populate owner filter dropdown
function populateOwnerFilter() {
const select = document.getElementById('filter-owner');
// Get unique owners from nodes that have locations
const ownersWithNodes = new Set();
allNodes.forEach(node => {
if (node.owner) {
ownersWithNodes.add(node.owner.public_key);
}
});
// Filter members to only those who own nodes on the map
const relevantMembers = allMembers.filter(m => ownersWithNodes.has(m.public_key));
// Sort by name
relevantMembers.sort((a, b) => a.name.localeCompare(b.name));
// Add options
relevantMembers.forEach(member => {
const option = document.createElement('option');
option.value = member.public_key;
option.textContent = member.callsign
? `${member.name} (${member.callsign})`
: member.name;
select.appendChild(option);
});
}
// Clear all filters
function clearFilters() {
document.getElementById('filter-type').value = '';
document.getElementById('filter-owner').value = '';
applyFilters();
}
// Event listeners for filters
document.getElementById('filter-type').addEventListener('change', applyFilters);
document.getElementById('filter-owner').addEventListener('change', applyFilters);
document.getElementById('clear-filters').addEventListener('click', clearFilters);
// Fetch and display nodes
fetch('/map/data')
.then(response => response.json())
.then(data => {
const nodes = data.nodes;
const center = data.center;
allNodes = data.nodes;
allMembers = data.members || [];
mapCenter = data.center;
// Update node count
document.getElementById('node-count').textContent = `${nodes.length} nodes on map`;
// Log debug info
const debug = data.debug || {};
console.log('Map data loaded:', debug);
console.log('Sample node data:', allNodes.length > 0 ? allNodes[0] : 'No nodes');
// Add markers for each node
nodes.forEach(node => {
const marker = L.marker([node.lat, node.lon], { icon: nodeIcon }).addTo(map);
// Create popup content
const popupContent = `
<div class="p-2">
<h3 class="font-bold text-lg mb-2">${node.name}</h3>
<div class="space-y-1 text-sm">
<p><span class="opacity-70">Type:</span> ${node.adv_type || 'Unknown'}</p>
<p><span class="opacity-70">Key:</span> <code class="text-xs">${node.public_key.substring(0, 16)}...</code></p>
<p><span class="opacity-70">Location:</span> ${node.lat.toFixed(4)}, ${node.lon.toFixed(4)}</p>
${node.last_seen ? `<p><span class="opacity-70">Last seen:</span> ${node.last_seen.substring(0, 19).replace('T', ' ')}</p>` : ''}
</div>
<a href="/nodes/${node.public_key}" class="btn btn-primary btn-xs mt-3">View Details</a>
</div>
`;
marker.bindPopup(popupContent);
});
// Fit bounds if we have nodes
if (nodes.length > 0) {
const bounds = L.latLngBounds(nodes.map(n => [n.lat, n.lon]));
map.fitBounds(bounds, { padding: [50, 50] });
} else if (center.lat !== 0 || center.lon !== 0) {
// Use network center if no nodes
map.setView([center.lat, center.lon], 10);
if (debug.error) {
document.getElementById('node-count').textContent = `Error: ${debug.error}`;
return;
}
if (debug.total_nodes === 0) {
document.getElementById('node-count').textContent = 'No nodes in database';
return;
}
if (debug.nodes_with_coords === 0) {
document.getElementById('node-count').textContent = `${debug.total_nodes} nodes (none have coordinates)`;
return;
}
// Populate owner filter
populateOwnerFilter();
// Initial display - center map on nodes if available
if (allNodes.length > 0) {
const bounds = L.latLngBounds(allNodes.map(n => [n.lat, n.lon]));
map.fitBounds(bounds, { padding: [50, 50] });
}
// Apply filters (won't re-center since we just did above)
applyFiltersNoRecenter();
})
.catch(error => {
console.error('Error loading map data:', error);


@@ -9,43 +9,57 @@
</div>
{% if members %}
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6">
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6 items-start">
{% for member in members %}
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">
{{ member.name }}
{% if member.callsign %}
<span class="badge badge-secondary">{{ member.callsign }}</span>
<span class="badge badge-success">{{ member.callsign }}</span>
{% endif %}
</h2>
{% if member.role %}
<p class="text-sm opacity-70">{{ member.role }}</p>
{% endif %}
{% if member.description %}
<p class="mt-2">{{ member.description }}</p>
{% endif %}
{% if member.email or member.discord or member.website %}
<div class="card-actions justify-start mt-4">
{% if member.email %}
<a href="mailto:{{ member.email }}" class="btn btn-ghost btn-xs">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M3 8l7.89 5.26a2 2 0 002.22 0L21 8M5 19h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v10a2 2 0 002 2z" />
</svg>
Email
</a>
{% endif %}
{% if member.website %}
<a href="{{ member.website }}" target="_blank" class="btn btn-ghost btn-xs">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 12a9 9 0 01-9 9m9-9a9 9 0 00-9-9m9 9H3m9 9a9 9 0 01-9-9m9 9c1.657 0 3-4.03 3-9s-1.343-9-3-9m0 18c-1.657 0-3-4.03-3-9s1.343-9 3-9m-9 9a9 9 0 019-9" />
</svg>
Website
</a>
{% endif %}
{% if member.contact %}
<p class="text-sm mt-2">
<span class="opacity-70">Contact:</span> {{ member.contact }}
</p>
{% endif %}
{% if member.nodes %}
<div class="mt-4 space-y-2">
{% for node in member.nodes %}
{% set adv_type = node.adv_type %}
{% set node_tag_name = node.tags|selectattr('key', 'equalto', 'name')|map(attribute='value')|first %}
{% set display_name = node_tag_name or node.name %}
<a href="/nodes/{{ node.public_key }}" class="flex items-center gap-3 p-2 bg-base-200 rounded-lg hover:bg-base-300 transition-colors">
<span class="text-lg" title="{{ adv_type or 'Unknown' }}">
{% if adv_type and adv_type|lower == 'chat' %}
💬
{% elif adv_type and adv_type|lower == 'repeater' %}
📡
{% elif adv_type and adv_type|lower == 'room' %}
🪧
{% elif adv_type %}
📍
{% else %}
📦
{% endif %}
</span>
<div>
{% if display_name %}
<div class="font-medium text-sm">{{ display_name }}</div>
<div class="font-mono text-xs opacity-60">{{ node.public_key[:12] }}...</div>
{% else %}
<div class="font-mono text-sm">{{ node.public_key[:12] }}...</div>
{% endif %}
</div>
</a>
{% endfor %}
</div>
{% endif %}
</div>
@@ -59,31 +73,29 @@
</svg>
<div>
<h3 class="font-bold">No members configured</h3>
<p class="text-sm">To display network members, provide a members JSON file using the <code>--members-file</code> option.</p>
<p class="text-sm">To display network members, create a <code>members.yaml</code> file in your seed directory.</p>
</div>
</div>
<div class="mt-6 card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">Members File Format</h2>
<p class="mb-4">Create a JSON file with the following structure:</p>
<pre class="bg-base-200 p-4 rounded-box text-sm overflow-x-auto"><code>{
"members": [
{
"name": "John Doe",
"callsign": "AB1CD",
"role": "Network Admin",
"description": "Manages the main repeater node.",
"email": "john@example.com",
"website": "https://example.com"
},
{
"name": "Jane Smith",
"role": "Member",
"description": "Regular user in the downtown area."
}
]
}</code></pre>
<p class="mb-4">Create a YAML file at <code>$SEED_HOME/members.yaml</code> with the following structure:</p>
<pre class="bg-base-200 p-4 rounded-box text-sm overflow-x-auto"><code>members:
- member_id: johndoe
name: John Doe
callsign: AB1CD
role: Network Admin
description: Manages the main repeater node.
contact: john@example.com
- member_id: janesmith
name: Jane Smith
role: Member
description: Regular user in the downtown area.</code></pre>
<p class="mt-4 text-sm opacity-70">
Run <code>meshcore-hub collector seed</code> to import members.<br/>
To associate nodes with members, add a <code>member_id</code> tag to nodes in <code>node_tags.yaml</code>.
</p>
</div>
</div>
{% endif %}
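The note above points to a <code>node_tags.yaml</code> file for linking nodes to members via a <code>member_id</code> tag. A minimal sketch of what that file might contain — the exact schema is an assumption and may differ in your deployment, so treat this as illustrative only:

```yaml
# $SEED_HOME/node_tags.yaml (hypothetical layout)
nodes:
  - public_key: a1b2c3d4e5f6   # hypothetical node public key
    tags:
      member_id: johndoe       # must match a member_id in members.yaml
      lat: "51.5074"           # optional: lat/lon tags place the node on the map
      lon: "-0.1278"
```

After editing, re-run the seed import so the collector picks up the new tags.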


@@ -49,57 +49,57 @@
</div>
<!-- Messages Table -->
<div class="overflow-x-auto bg-base-100 rounded-box shadow">
<div class="overflow-x-auto overflow-y-visible bg-base-100 rounded-box shadow">
<table class="table table-zebra">
<thead>
<tr>
<th>Time</th>
<th>Type</th>
<th>Time</th>
<th>From/Channel</th>
<th>Message</th>
<th>SNR</th>
<th>Hops</th>
<th class="text-center">SNR</th>
<th>Receivers</th>
</tr>
</thead>
<tbody>
{% for msg in messages %}
<tr class="hover">
<td class="text-xs whitespace-nowrap">
<tr class="hover align-top">
<td class="text-lg" title="{{ msg.message_type|capitalize }}">
{% if msg.message_type == 'channel' %}📻{% else %}👤{% endif %}
</td>
<td class="text-sm whitespace-nowrap">
{{ msg.received_at[:19].replace('T', ' ') if msg.received_at else '-' }}
</td>
<td>
{% if msg.message_type == 'channel' %}
<span class="badge badge-info badge-sm">Channel</span>
{% else %}
<span class="badge badge-success badge-sm">Direct</span>
{% endif %}
</td>
<td class="text-sm">
<td class="text-sm whitespace-nowrap">
{% if msg.message_type == 'channel' %}
<span class="font-mono">CH{{ msg.channel_idx }}</span>
{% else %}
{% if msg.sender_friendly_name %}
<span class="font-medium">{{ msg.sender_friendly_name }}</span>
{% if msg.sender_tag_name or msg.sender_name %}
<span class="font-medium">{{ msg.sender_tag_name or msg.sender_name }}</span>
{% else %}
<span class="font-mono text-xs">{{ (msg.pubkey_prefix or '-')[:12] }}</span>
{% endif %}
{% endif %}
</td>
<td class="truncate-cell" title="{{ msg.text }}">
{{ msg.text or '-' }}
</td>
<td class="text-center">
<td class="break-words max-w-md" style="white-space: pre-wrap;">{{ msg.text or '-' }}</td>
<td class="text-center whitespace-nowrap">
{% if msg.snr is not none %}
<span class="badge badge-ghost badge-sm">{{ "%.1f"|format(msg.snr) }}</span>
{% else %}
-
{% endif %}
</td>
<td class="text-center">
{% if msg.hops is not none %}
<span class="badge badge-ghost badge-sm">{{ msg.hops }}</span>
<td>
{% if msg.receivers and msg.receivers|length >= 1 %}
<div class="flex gap-1">
{% for recv in msg.receivers %}
<a href="/nodes/{{ recv.public_key }}" class="text-lg hover:opacity-70" data-receiver-tooltip data-name="{{ recv.tag_name or recv.name or recv.public_key[:12] }}" data-timestamp="{{ recv.received_at }}">📡</a>
{% endfor %}
</div>
{% elif msg.received_by %}
<a href="/nodes/{{ msg.received_by }}" class="text-lg hover:opacity-70" title="{{ msg.receiver_tag_name or msg.receiver_name or msg.received_by[:12] }}">📡</a>
{% else %}
-
<span class="opacity-50">-</span>
{% endif %}
</td>
</tr>


@@ -23,7 +23,7 @@
{% endif %}
<!-- Stats Cards -->
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6 mb-8">
<div class="grid grid-cols-1 md:grid-cols-3 gap-6 mb-6">
<!-- Total Nodes -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-primary">
@@ -36,7 +36,7 @@
<div class="stat-desc">All discovered nodes</div>
</div>
<!-- Advertisements (24h) -->
<!-- Advertisements (7 days) -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
@@ -44,32 +44,71 @@
</svg>
</div>
<div class="stat-title">Advertisements</div>
<div class="stat-value text-secondary">{{ stats.advertisements_24h }}</div>
<div class="stat-desc">Received in last 24 hours</div>
<div class="stat-value text-secondary">{{ stats.advertisements_7d }}</div>
<div class="stat-desc">Last 7 days</div>
</div>
<!-- Total Messages -->
<!-- Messages (7 days) -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
</div>
<div class="stat-title">Total Messages</div>
<div class="stat-value text-accent">{{ stats.total_messages }}</div>
<div class="stat-desc">All time</div>
<div class="stat-title">Messages</div>
<div class="stat-value text-accent">{{ stats.messages_7d }}</div>
<div class="stat-desc">Last 7 days</div>
</div>
</div>
<!-- Activity Charts -->
<div class="grid grid-cols-1 md:grid-cols-3 gap-6 mb-8">
<!-- Node Count Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title text-base">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
Total Nodes
</h2>
<p class="text-xs opacity-70">Over time (last 7 days)</p>
<div class="h-32">
<canvas id="nodeChart"></canvas>
</div>
</div>
</div>
<!-- Messages Today -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-info">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<!-- Advertisements Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title text-base">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
Advertisements
</h2>
<p class="text-xs opacity-70">Per day (last 7 days)</p>
<div class="h-32">
<canvas id="advertChart"></canvas>
</div>
</div>
</div>
<!-- Messages Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title text-base">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
Messages
</h2>
<p class="text-xs opacity-70">Per day (last 7 days)</p>
<div class="h-32">
<canvas id="messageChart"></canvas>
</div>
</div>
<div class="stat-title">Messages Today</div>
<div class="stat-value text-info">{{ stats.messages_today }}</div>
<div class="stat-desc">Last 24 hours</div>
</div>
</div>
@@ -90,6 +129,7 @@
<thead>
<tr>
<th>Node</th>
<th>Type</th>
<th class="text-right">Received</th>
</tr>
</thead>
@@ -97,11 +137,26 @@
{% for ad in stats.recent_advertisements %}
<tr>
<td>
<div class="font-medium">{{ ad.friendly_name or ad.name or ad.public_key[:12] + '...' }}</div>
<a href="/nodes/{{ ad.public_key }}" class="link link-hover">
<div class="font-medium">{{ ad.friendly_name or ad.name or ad.public_key[:12] + '...' }}</div>
</a>
{% if ad.friendly_name or ad.name %}
<div class="text-xs opacity-50 font-mono">{{ ad.public_key[:12] }}...</div>
{% endif %}
</td>
<td>
{% if ad.adv_type and ad.adv_type|lower == 'chat' %}
<span title="Chat">💬</span>
{% elif ad.adv_type and ad.adv_type|lower == 'repeater' %}
<span title="Repeater">📡</span>
{% elif ad.adv_type and ad.adv_type|lower == 'room' %}
<span title="Room">🪧</span>
{% elif ad.adv_type %}
<span title="{{ ad.adv_type }}">📍</span>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
<td class="text-right text-sm opacity-70">{{ ad.received_at.split('T')[1][:8] if ad.received_at else '-' }}</td>
</tr>
{% endfor %}
@@ -114,60 +169,152 @@
</div>
</div>
<!-- Channel Stats -->
<!-- Channel Messages -->
{% if stats.channel_messages %}
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">
<svg xmlns="http://www.w3.org/2000/svg" class="h-6 w-6" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 20l4-16m2 16l4-16M6 9h14M4 15h14" />
</svg>
Channel Messages
Recent Channel Messages
</h2>
{% if stats.channel_message_counts %}
<div class="overflow-x-auto">
<table class="table table-compact w-full">
<thead>
<tr>
<th>Channel</th>
<th class="text-right">Count</th>
</tr>
</thead>
<tbody>
{% for channel, count in stats.channel_message_counts.items() %}
<tr>
<td>Channel {{ channel }}</td>
<td class="text-right font-mono">{{ count }}</td>
</tr>
<div class="space-y-4">
{% for channel, messages in stats.channel_messages.items() %}
<div>
<h3 class="font-semibold text-sm mb-2 flex items-center gap-2">
<span class="badge badge-info badge-sm">CH{{ channel }}</span>
Channel {{ channel }}
</h3>
<div class="space-y-1 pl-2 border-l-2 border-base-300">
{% for msg in messages %}
<div class="text-sm">
<span class="text-xs opacity-50">{{ msg.received_at.split('T')[1][:5] if msg.received_at else '' }}</span>
<span class="break-words" style="white-space: pre-wrap;">{{ msg.text }}</span>
</div>
{% endfor %}
</tbody>
</table>
</div>
</div>
{% endfor %}
</div>
{% else %}
<p class="text-sm opacity-70">No channel messages recorded yet.</p>
{% endif %}
</div>
</div>
{% endif %}
</div>
<!-- Quick Actions -->
<div class="flex gap-4 mt-8 flex-wrap">
<a href="/nodes" class="btn btn-primary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
Browse Nodes
</a>
<a href="/messages" class="btn btn-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
View Messages
</a>
<a href="/map" class="btn btn-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 20l-5.447-2.724A1 1 0 013 16.382V5.618a1 1 0 011.447-.894L9 7m0 13l6-3m-6 3V7m6 10l4.553 2.276A1 1 0 0021 18.382V7.618a1 1 0 00-.553-.894L15 4m0 13V4m0 0L9 7" />
</svg>
View Map
</a>
</div>
{% endblock %}
{% block extra_scripts %}
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script>
(function() {
const advertData = {{ advert_activity_json | safe }};
const messageData = {{ message_activity_json | safe }};
const nodeData = {{ node_count_json | safe }};
// Common chart options
const commonOptions = {
responsive: true,
maintainAspectRatio: false,
plugins: {
legend: { display: false },
tooltip: {
mode: 'index',
intersect: false,
backgroundColor: 'oklch(0.25 0 0)',
titleColor: 'oklch(0.9 0 0)',
bodyColor: 'oklch(0.9 0 0)',
borderColor: 'oklch(0.4 0 0)',
borderWidth: 1
}
},
scales: {
x: {
grid: { color: 'oklch(0.4 0 0 / 0.2)' },
ticks: { color: 'oklch(0.7 0 0)', maxRotation: 45, minRotation: 45 }
},
y: {
beginAtZero: true,
grid: { color: 'oklch(0.4 0 0 / 0.2)' },
ticks: { color: 'oklch(0.7 0 0)', precision: 0 }
}
},
interaction: { mode: 'nearest', axis: 'x', intersect: false }
};
// Helper to format dates
function formatLabels(data) {
return data.map(d => {
const date = new Date(d.date);
return date.toLocaleDateString('en-GB', { day: 'numeric', month: 'short' });
});
}
// Advertisements chart (secondary color - pink/magenta)
const advertCtx = document.getElementById('advertChart');
if (advertCtx && advertData.data && advertData.data.length > 0) {
new Chart(advertCtx, {
type: 'line',
data: {
labels: formatLabels(advertData.data),
datasets: [{
label: 'Advertisements',
data: advertData.data.map(d => d.count),
borderColor: 'oklch(0.7 0.17 330)',
backgroundColor: 'oklch(0.7 0.17 330 / 0.1)',
fill: true,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: commonOptions
});
}
// Messages chart (accent color - teal/cyan)
const messageCtx = document.getElementById('messageChart');
if (messageCtx && messageData.data && messageData.data.length > 0) {
new Chart(messageCtx, {
type: 'line',
data: {
labels: formatLabels(messageData.data),
datasets: [{
label: 'Messages',
data: messageData.data.map(d => d.count),
borderColor: 'oklch(0.75 0.18 180)',
backgroundColor: 'oklch(0.75 0.18 180 / 0.1)',
fill: true,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: commonOptions
});
}
// Node count chart (primary color - purple/blue)
const nodeCtx = document.getElementById('nodeChart');
if (nodeCtx && nodeData.data && nodeData.data.length > 0) {
new Chart(nodeCtx, {
type: 'line',
data: {
labels: formatLabels(nodeData.data),
datasets: [{
label: 'Total Nodes',
data: nodeData.data.map(d => d.count),
borderColor: 'oklch(0.65 0.24 265)',
backgroundColor: 'oklch(0.65 0.24 265 / 0.1)',
fill: true,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: commonOptions
});
}
})();
</script>
{% endblock %}
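The chart script above formats ISO dates client-side with `toLocaleDateString('en-GB', …)`. A stdlib Python analogue of that `formatLabels` helper (the function name is an assumption for illustration; `%b` output depends on locale, English in the default C locale):

```python
from datetime import date

def format_label(iso_date: str) -> str:
    """Python analogue of the template's formatLabels helper:
    '2025-12-07' -> '7 Dec' (en-GB-style day + short month)."""
    d = date.fromisoformat(iso_date)
    return f"{d.day} {d.strftime('%b')}"

print(format_label("2025-12-07"))  # 7 Dec (in an English locale)
```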


@@ -2,19 +2,35 @@
{% block title %}{{ network_name }} - Node Details{% endblock %}
{% block extra_head %}
<style>
#node-map {
height: 300px;
border-radius: var(--rounded-box);
}
.leaflet-popup-content-wrapper {
background: oklch(var(--b1));
color: oklch(var(--bc));
}
.leaflet-popup-tip {
background: oklch(var(--b1));
}
</style>
{% endblock %}
{% block content %}
<div class="breadcrumbs text-sm mb-4">
<ul>
<li><a href="/">Home</a></li>
<li><a href="/nodes">Nodes</a></li>
{% if node %}
{% set ns = namespace(friendly_name=none) %}
{% set ns = namespace(tag_name=none) %}
{% for tag in node.tags or [] %}
{% if tag.key == 'friendly_name' %}
{% set ns.friendly_name = tag.value %}
{% if tag.key == 'name' %}
{% set ns.tag_name = tag.value %}
{% endif %}
{% endfor %}
<li>{{ ns.friendly_name or node.name or public_key[:12] + '...' }}</li>
<li>{{ ns.tag_name or node.name or public_key[:12] + '...' }}</li>
{% else %}
<li>Not Found</li>
{% endif %}
@@ -31,20 +47,28 @@
{% endif %}
{% if node %}
{% set ns = namespace(friendly_name=none) %}
{% set ns = namespace(tag_name=none) %}
{% for tag in node.tags or [] %}
{% if tag.key == 'friendly_name' %}
{% set ns.friendly_name = tag.value %}
{% if tag.key == 'name' %}
{% set ns.tag_name = tag.value %}
{% endif %}
{% endfor %}
<!-- Node Info Card -->
<div class="card bg-base-100 shadow-xl mb-6">
<div class="card-body">
<h1 class="card-title text-2xl">
{{ ns.friendly_name or node.name or 'Unnamed Node' }}
{% if node.adv_type %}
<span class="badge badge-secondary">{{ node.adv_type }}</span>
{% if node.adv_type|lower == 'chat' %}
<span title="Chat">💬</span>
{% elif node.adv_type|lower == 'repeater' %}
<span title="Repeater">📡</span>
{% elif node.adv_type|lower == 'room' %}
<span title="Room">🪧</span>
{% else %}
<span title="{{ node.adv_type }}">📍</span>
{% endif %}
{% endif %}
{{ ns.tag_name or node.name or 'Unnamed Node' }}
</h1>
<div class="grid grid-cols-1 md:grid-cols-2 gap-4 mt-4">
@@ -61,32 +85,55 @@
</div>
</div>
<!-- Tags -->
{% if node.tags %}
<div class="mt-6">
<h3 class="font-semibold opacity-70 mb-2">Tags</h3>
<div class="overflow-x-auto">
<table class="table table-compact w-full">
<thead>
<tr>
<th>Key</th>
<th>Value</th>
<th>Type</th>
</tr>
</thead>
<tbody>
{% for tag in node.tags %}
<tr>
<td class="font-mono">{{ tag.key }}</td>
<td>{{ tag.value }}</td>
<td class="opacity-70">{{ tag.value_type or 'string' }}</td>
</tr>
{% endfor %}
</tbody>
</table>
<!-- Tags and Map Grid -->
{% set ns_map = namespace(lat=none, lon=none) %}
{% for tag in node.tags or [] %}
{% if tag.key == 'lat' %}
{% set ns_map.lat = tag.value %}
{% elif tag.key == 'lon' %}
{% set ns_map.lon = tag.value %}
{% endif %}
{% endfor %}
<div class="grid grid-cols-1 {% if ns_map.lat and ns_map.lon %}lg:grid-cols-2{% endif %} gap-6 mt-6">
<!-- Tags -->
{% if node.tags %}
<div>
<h3 class="font-semibold opacity-70 mb-2">Tags</h3>
<div class="overflow-x-auto">
<table class="table table-compact w-full">
<thead>
<tr>
<th>Key</th>
<th>Value</th>
<th>Type</th>
</tr>
</thead>
<tbody>
{% for tag in node.tags %}
<tr>
<td class="font-mono">{{ tag.key }}</td>
<td>{{ tag.value }}</td>
<td class="opacity-70">{{ tag.value_type or 'string' }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
{% endif %}
<!-- Location Map -->
{% if ns_map.lat and ns_map.lon %}
<div>
<h3 class="font-semibold opacity-70 mb-2">Location</h3>
<div id="node-map" class="mb-2"></div>
<div class="text-sm opacity-70">
<p>Coordinates: {{ ns_map.lat }}, {{ ns_map.lon }}</p>
</div>
</div>
{% endif %}
</div>
{% endif %}
</div>
</div>
@@ -102,15 +149,40 @@
<tr>
<th>Time</th>
<th>Type</th>
<th>Name</th>
<th>Received By</th>
</tr>
</thead>
<tbody>
{% for adv in advertisements %}
<tr>
<td class="text-xs">{{ adv.received_at[:19].replace('T', ' ') if adv.received_at else '-' }}</td>
<td>{{ adv.adv_type or '-' }}</td>
<td>{{ adv.name or '-' }}</td>
<td class="text-xs whitespace-nowrap">{{ adv.received_at[:19].replace('T', ' ') if adv.received_at else '-' }}</td>
<td>
{% if adv.adv_type and adv.adv_type|lower == 'chat' %}
<span title="Chat">💬</span>
{% elif adv.adv_type and adv.adv_type|lower == 'repeater' %}
<span title="Repeater">📡</span>
{% elif adv.adv_type and adv.adv_type|lower == 'room' %}
<span title="Room">🪧</span>
{% elif adv.adv_type %}
<span title="{{ adv.adv_type }}">📍</span>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
<td>
{% if adv.received_by %}
<a href="/nodes/{{ adv.received_by }}" class="link link-hover">
{% if adv.receiver_tag_name or adv.receiver_name %}
<div class="font-medium text-sm">{{ adv.receiver_tag_name or adv.receiver_name }}</div>
<div class="text-xs font-mono opacity-70">{{ adv.received_by[:16] }}...</div>
{% else %}
<span class="font-mono text-xs">{{ adv.received_by[:16] }}...</span>
{% endif %}
</a>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
</tr>
{% endfor %}
</tbody>
@@ -133,12 +205,13 @@
<tr>
<th>Time</th>
<th>Data</th>
<th>Received By</th>
</tr>
</thead>
<tbody>
{% for tel in telemetry %}
<tr>
<td class="text-xs">{{ tel.received_at[:19].replace('T', ' ') if tel.received_at else '-' }}</td>
<td class="text-xs whitespace-nowrap">{{ tel.received_at[:19].replace('T', ' ') if tel.received_at else '-' }}</td>
<td class="text-xs font-mono">
{% if tel.parsed_data %}
{{ tel.parsed_data | tojson }}
@@ -146,6 +219,20 @@
-
{% endif %}
</td>
<td>
{% if tel.received_by %}
<a href="/nodes/{{ tel.received_by }}" class="link link-hover">
{% if tel.receiver_tag_name or tel.receiver_name %}
<div class="font-medium text-sm">{{ tel.receiver_tag_name or tel.receiver_name }}</div>
<div class="text-xs font-mono opacity-70">{{ tel.received_by[:16] }}...</div>
{% else %}
<span class="font-mono text-xs">{{ tel.received_by[:16] }}...</span>
{% endif %}
</a>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
</tr>
{% endfor %}
</tbody>
@@ -168,3 +255,67 @@
<a href="/nodes" class="btn btn-primary mt-4">Back to Nodes</a>
{% endif %}
{% endblock %}
{% block extra_scripts %}
{% if node %}
{% set ns_map = namespace(lat=none, lon=none, name=none) %}
{% for tag in node.tags or [] %}
{% if tag.key == 'lat' %}
{% set ns_map.lat = tag.value %}
{% elif tag.key == 'lon' %}
{% set ns_map.lon = tag.value %}
{% elif tag.key == 'name' %}
{% set ns_map.name = tag.value %}
{% endif %}
{% endfor %}
{% if ns_map.lat and ns_map.lon %}
<script>
// Initialize map centered on the node's location
const nodeLat = {{ ns_map.lat }};
const nodeLon = {{ ns_map.lon }};
const nodeName = {{ (ns_map.name or node.name or 'Unnamed Node') | tojson }};
const nodeType = {{ (node.adv_type or '') | tojson }};
const publicKey = {{ node.public_key | tojson }};
const map = L.map('node-map').setView([nodeLat, nodeLon], 15);
// Add tile layer
L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
attribution: '&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
}).addTo(map);
// Get emoji marker based on node type
function getNodeEmoji(type) {
const normalizedType = type ? type.toLowerCase() : null;
if (normalizedType === 'chat') return '💬';
if (normalizedType === 'repeater') return '📡';
if (normalizedType === 'room') return '🪧';
return '📍';
}
// Create marker icon (just the emoji, no label)
const emoji = getNodeEmoji(nodeType);
const icon = L.divIcon({
className: 'custom-div-icon',
html: `<span style="font-size: 32px; text-shadow: 0 0 3px #1a237e, 0 0 6px #1a237e, 0 1px 2px rgba(0,0,0,0.7);">${emoji}</span>`,
iconSize: [32, 32],
iconAnchor: [16, 16]
});
// Add marker
const marker = L.marker([nodeLat, nodeLon], { icon: icon }).addTo(map);
// Add popup (shown on click, not by default)
marker.bindPopup(`
<div class="p-2">
<h3 class="font-bold text-lg mb-2">${emoji} ${nodeName}</h3>
<div class="space-y-1 text-sm">
${nodeType ? `<p><span class="opacity-70">Type:</span> ${nodeType}</p>` : ''}
<p><span class="opacity-70">Coordinates:</span> ${nodeLat.toFixed(4)}, ${nodeLon.toFixed(4)}</p>
</div>
</div>
`);
</script>
{% endif %}
{% endif %}
{% endblock %}
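The type-to-emoji mapping now appears in three places: the node list, the advertisement tables, and the map marker's `getNodeEmoji`. A hedged Python sketch of that shared lookup (the function name and dict are illustrative, not part of the codebase):

```python
NODE_EMOJI = {"chat": "💬", "repeater": "📡", "room": "🪧"}

def node_emoji(adv_type):
    """Emoji for a node's advert type, mirroring the template logic:
    known types map directly, any other non-empty type gets the pin,
    and a missing type yields None (rendered as '-' in the tables)."""
    if not adv_type:
        return None
    return NODE_EMOJI.get(adv_type.lower(), "📍")

print(node_emoji("REPEATER"))  # 📡
```

Centralising the lookup like this (e.g. as a Jinja filter) would keep the three copies from drifting apart.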


@@ -25,7 +25,7 @@
<label class="label py-1">
<span class="label-text">Search</span>
</label>
<input type="text" name="search" value="{{ search }}" placeholder="Name or public key..." class="input input-bordered input-sm w-64" />
<input type="text" name="search" value="{{ search }}" placeholder="Search by name, ID, or public key..." class="input input-bordered input-sm w-80" />
</div>
<div class="form-control">
<label class="label py-1">
@@ -49,28 +49,34 @@
<table class="table table-zebra">
<thead>
<tr>
<th>Name</th>
<th>Public Key</th>
<th>Node</th>
<th>Last Seen</th>
<th>Tags</th>
</tr>
</thead>
<tbody>
{% for node in nodes %}
{% set ns = namespace(friendly_name=none) %}
{% set ns = namespace(tag_name=none) %}
{% for tag in node.tags or [] %}
{% if tag.key == 'friendly_name' %}
{% set ns.friendly_name = tag.value %}
{% if tag.key == 'name' %}
{% set ns.tag_name = tag.value %}
{% endif %}
{% endfor %}
<tr class="hover">
<td class="font-medium">
<a href="/nodes/{{ node.public_key }}" class="link link-hover">{{ ns.friendly_name or node.name or '-' }}</a>
<td>
<a href="/nodes/{{ node.public_key }}" class="link link-hover flex items-center gap-2">
<span class="text-lg" title="{{ node.adv_type or 'Unknown' }}">{% if node.adv_type and node.adv_type|lower == 'chat' %}💬{% elif node.adv_type and node.adv_type|lower == 'repeater' %}📡{% elif node.adv_type and node.adv_type|lower == 'room' %}🪧{% else %}📍{% endif %}</span>
<div>
{% if ns.tag_name or node.name %}
<div class="font-medium">{{ ns.tag_name or node.name }}</div>
<div class="text-xs font-mono opacity-70">{{ node.public_key[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ node.public_key[:16] }}...</span>
{% endif %}
</div>
</a>
</td>
<td class="font-mono text-xs truncate-cell" title="{{ node.public_key }}">
{{ node.public_key[:16] }}...
</td>
<td class="text-sm">
<td class="text-sm whitespace-nowrap">
{% if node.last_seen %}
{{ node.last_seen[:19].replace('T', ' ') }}
{% else %}
@@ -94,7 +100,7 @@
</tr>
{% else %}
<tr>
<td colspan="4" class="text-center py-8 opacity-70">No nodes found.</td>
<td colspan="3" class="text-center py-8 opacity-70">No nodes found.</td>
</tr>
{% endfor %}
</tbody>


@@ -49,7 +49,6 @@ class TestGetAdvertisement:
)
assert response.status_code == 200
data = response.json()
assert data["id"] == sample_advertisement.id
assert data["public_key"] == sample_advertisement.public_key
def test_get_advertisement_not_found(self, client_no_auth):


@@ -1,5 +1,11 @@
"""Tests for dashboard API routes."""
from datetime import datetime, timedelta, timezone
import pytest
from meshcore_hub.common.models import Advertisement, Message, Node
class TestDashboardStats:
"""Tests for GET /dashboard/stats endpoint."""
@@ -58,3 +64,185 @@ class TestDashboardHtml:
assert "Recent Nodes" in response.text
# The node name should appear in the table
assert sample_node.name in response.text
class TestDashboardActivity:
"""Tests for GET /dashboard/activity endpoint."""
@pytest.fixture
def past_advertisement(self, api_db_session):
"""Create an advertisement from yesterday (since today is excluded)."""
yesterday = datetime.now(timezone.utc) - timedelta(days=1)
advert = Advertisement(
public_key="abc123def456abc123def456abc123de",
name="TestNode",
adv_type="REPEATER",
received_at=yesterday,
)
api_db_session.add(advert)
api_db_session.commit()
api_db_session.refresh(advert)
return advert
def test_get_activity_empty(self, client_no_auth):
"""Test getting activity with empty database."""
response = client_no_auth.get("/api/v1/dashboard/activity")
assert response.status_code == 200
data = response.json()
assert data["days"] == 30
assert len(data["data"]) == 30
# All counts should be 0
for point in data["data"]:
assert point["count"] == 0
assert "date" in point
def test_get_activity_custom_days(self, client_no_auth):
"""Test getting activity with custom days parameter."""
response = client_no_auth.get("/api/v1/dashboard/activity?days=7")
assert response.status_code == 200
data = response.json()
assert data["days"] == 7
assert len(data["data"]) == 7
def test_get_activity_max_days(self, client_no_auth):
"""Test that activity is capped at 90 days."""
response = client_no_auth.get("/api/v1/dashboard/activity?days=365")
assert response.status_code == 200
data = response.json()
assert data["days"] == 90
assert len(data["data"]) == 90
def test_get_activity_with_data(self, client_no_auth, past_advertisement):
"""Test getting activity with advertisement in database.
Note: Activity endpoints exclude today's data to avoid showing
incomplete stats early in the day.
"""
response = client_no_auth.get("/api/v1/dashboard/activity")
assert response.status_code == 200
data = response.json()
# At least one day should have a count > 0
total_count = sum(point["count"] for point in data["data"])
assert total_count >= 1
class TestMessageActivity:
"""Tests for GET /dashboard/message-activity endpoint."""
@pytest.fixture
def past_message(self, api_db_session):
"""Create a message from yesterday (since today is excluded)."""
yesterday = datetime.now(timezone.utc) - timedelta(days=1)
message = Message(
message_type="direct",
pubkey_prefix="abc123",
text="Hello World",
received_at=yesterday,
)
api_db_session.add(message)
api_db_session.commit()
api_db_session.refresh(message)
return message
def test_get_message_activity_empty(self, client_no_auth):
"""Test getting message activity with empty database."""
response = client_no_auth.get("/api/v1/dashboard/message-activity")
assert response.status_code == 200
data = response.json()
assert data["days"] == 30
assert len(data["data"]) == 30
# All counts should be 0
for point in data["data"]:
assert point["count"] == 0
assert "date" in point
def test_get_message_activity_custom_days(self, client_no_auth):
"""Test getting message activity with custom days parameter."""
response = client_no_auth.get("/api/v1/dashboard/message-activity?days=7")
assert response.status_code == 200
data = response.json()
assert data["days"] == 7
assert len(data["data"]) == 7
def test_get_message_activity_max_days(self, client_no_auth):
"""Test that message activity is capped at 90 days."""
response = client_no_auth.get("/api/v1/dashboard/message-activity?days=365")
assert response.status_code == 200
data = response.json()
assert data["days"] == 90
assert len(data["data"]) == 90
def test_get_message_activity_with_data(self, client_no_auth, past_message):
"""Test getting message activity with message in database.
Note: Activity endpoints exclude today's data to avoid showing
incomplete stats early in the day.
"""
response = client_no_auth.get("/api/v1/dashboard/message-activity")
assert response.status_code == 200
data = response.json()
# At least one day should have a count > 0
total_count = sum(point["count"] for point in data["data"])
assert total_count >= 1
class TestNodeCountHistory:
"""Tests for GET /dashboard/node-count endpoint."""
@pytest.fixture
def past_node(self, api_db_session):
"""Create a node from yesterday (since today is excluded)."""
yesterday = datetime.now(timezone.utc) - timedelta(days=1)
node = Node(
public_key="abc123def456abc123def456abc123de",
name="Test Node",
adv_type="REPEATER",
first_seen=yesterday,
last_seen=yesterday,
created_at=yesterday,
)
api_db_session.add(node)
api_db_session.commit()
api_db_session.refresh(node)
return node
def test_get_node_count_empty(self, client_no_auth):
"""Test getting node count with empty database."""
response = client_no_auth.get("/api/v1/dashboard/node-count")
assert response.status_code == 200
data = response.json()
assert data["days"] == 30
assert len(data["data"]) == 30
# All counts should be 0
for point in data["data"]:
assert point["count"] == 0
assert "date" in point
def test_get_node_count_custom_days(self, client_no_auth):
"""Test getting node count with custom days parameter."""
response = client_no_auth.get("/api/v1/dashboard/node-count?days=7")
assert response.status_code == 200
data = response.json()
assert data["days"] == 7
assert len(data["data"]) == 7
def test_get_node_count_max_days(self, client_no_auth):
"""Test that node count is capped at 90 days."""
response = client_no_auth.get("/api/v1/dashboard/node-count?days=365")
assert response.status_code == 200
data = response.json()
assert data["days"] == 90
assert len(data["data"]) == 90
def test_get_node_count_with_data(self, client_no_auth, past_node):
"""Test getting node count with node in database.
Note: Activity endpoints exclude today's data to avoid showing
incomplete stats early in the day.
"""
response = client_no_auth.get("/api/v1/dashboard/node-count")
assert response.status_code == 200
data = response.json()
# At least one day should have a count > 0 (cumulative)
# The last day should have count >= 1
assert data["data"][-1]["count"] >= 1
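The behaviour these tests pin down — a default 30-day window, a hard 90-day cap, zero-filled days, and today excluded so a partial day never skews the chart — can be sketched server-side like this (the function name and signature are assumptions; the real endpoints query the database):

```python
from datetime import date, timedelta

def activity_series(counts_by_date, days=30, max_days=90):
    """Zero-filled daily counts ending yesterday (today is excluded
    to avoid showing incomplete stats), capped at max_days."""
    days = min(days, max_days)
    end = date.today() - timedelta(days=1)
    data = [
        {"date": (end - timedelta(days=offset)).isoformat(),
         "count": counts_by_date.get(end - timedelta(days=offset), 0)}
        for offset in range(days - 1, -1, -1)
    ]
    return {"days": days, "data": data}
```

A node-count variant would accumulate counts rather than report per-day totals, matching the cumulative assertion in the last test.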


@@ -51,7 +51,6 @@ class TestGetMessage:
response = client_no_auth.get(f"/api/v1/messages/{sample_message.id}")
assert response.status_code == 200
data = response.json()
assert data["id"] == sample_message.id
assert data["text"] == sample_message.text
def test_get_message_not_found(self, client_no_auth):


@@ -45,7 +45,6 @@ class TestGetTelemetry:
response = client_no_auth.get(f"/api/v1/telemetry/{sample_telemetry.id}")
assert response.status_code == 200
data = response.json()
assert data["id"] == sample_telemetry.id
assert data["node_public_key"] == sample_telemetry.node_public_key
def test_get_telemetry_not_found(self, client_no_auth):


@@ -31,7 +31,6 @@ class TestGetTracePath:
response = client_no_auth.get(f"/api/v1/trace-paths/{sample_trace_path.id}")
assert response.status_code == 200
data = response.json()
assert data["id"] == sample_trace_path.id
assert data["path_hashes"] == sample_trace_path.path_hashes
def test_get_trace_path_not_found(self, client_no_auth):


@@ -1,8 +1,11 @@
"""Fixtures for collector component tests."""
import pytest
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.ext.asyncio import async_sessionmaker
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models.base import Base
@pytest.fixture
@@ -20,3 +23,29 @@ def db_session(db_manager):
session = db_manager.get_session()
yield session
session.close()
@pytest.fixture
async def async_db_session():
"""Create an async database session for testing.
Uses a separate in-memory database with tables created inline.
"""
# Create async engine with in-memory database
engine = create_async_engine("sqlite+aiosqlite:///:memory:")
# Create tables
async with engine.begin() as conn:
await conn.run_sync(Base.metadata.create_all)
# Create session factory
async_session_maker = async_sessionmaker(
engine, class_=AsyncSession, expire_on_commit=False
)
# Provide session
async with async_session_maker() as session:
yield session
# Cleanup
await engine.dispose()
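The fixture's shape — build an in-memory engine, create tables, yield a session, dispose on exit — is the standard async-generator fixture pattern. A stdlib-only analogue, with `sqlite3` standing in for the SQLAlchemy async engine (an assumption for illustration; the table name is hypothetical):

```python
import asyncio
import sqlite3
from contextlib import asynccontextmanager

@asynccontextmanager
async def memory_db():
    # Setup: in-memory database with schema, mirroring create_all.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE nodes (public_key TEXT PRIMARY KEY, name TEXT)")
    try:
        yield conn  # the test body runs here
    finally:
        conn.close()  # teardown, mirroring engine.dispose()

async def demo():
    async with memory_db() as conn:
        conn.execute("INSERT INTO nodes VALUES ('abc123', 'Test Node')")
        row = conn.execute(
            "SELECT name FROM nodes WHERE public_key = 'abc123'"
        ).fetchone()
        return row[0]

print(asyncio.run(demo()))  # Test Node
```

Each entry gets a fresh database, so tests stay isolated just as the fixture intends.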
