Compare commits

151 Commits

Author SHA1 Message Date
JingleManSweep
624fa458ac Merge pull request #66 from ipnet-mesh/chore/fix-sqlite-path-exists
Ensure SQLite database path/subdirectories exist before initialising …
2026-01-15 17:36:58 +00:00
Louis King
309d575fc0 Ensure SQLite database path/subdirectories exist before initialising database 2026-01-15 17:32:56 +00:00
Louis King
f7b4df13a7 Added more test coverage 2026-01-12 21:00:02 +00:00
Louis King
13bae5c8d7 Added more test coverage 2026-01-12 20:34:53 +00:00
Louis King
8a6b4d8e88 Tidying 2026-01-12 20:02:45 +00:00
JingleManSweep
b67e1b5b2b Merge pull request #65 from ipnet-mesh/claude/plan-member-editor-BwkcS
Plan Member Editor for Organization Management
2026-01-12 19:59:32 +00:00
Louis King
d4e3dc0399 Local tweaks 2026-01-12 19:59:14 +00:00
Claude
7f0adfa6a7 Implement Member Editor admin interface
Add a complete CRUD interface for managing network members at /a/members,
following the proven pattern established by the Tag Editor.

Changes:
- Add member routes to admin.py (GET, POST create/update/delete)
- Create admin/members.html template with member table, forms, and modals
- Add Members navigation card to admin index page
- Include proper authentication checks and flash message handling
- Fix mypy type hints for optional form fields

The Member Editor allows admins to:
- View all network members in a sortable table
- Create new members with all fields (member_id, name, callsign, role, contact, description)
- Edit existing members via modal dialog
- Delete members with confirmation
- Client-side validation for member_id format (alphanumeric + underscore)

All backend API infrastructure (models, schemas, routes) was already implemented.
This is purely a web UI layer built on top of the existing /api/v1/members endpoints.
2026-01-12 19:41:56 +00:00
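The member_id format check mentioned above runs client-side; a server-side Python analogue of the same rule (alphanumeric plus underscore) might look like this. The function name and regex framing are illustrative assumptions.

```python
import re

# Assumed rule from the commit message: alphanumeric + underscore only.
MEMBER_ID_RE = re.compile(r"^[A-Za-z0-9_]+$")

def is_valid_member_id(member_id: str) -> bool:
    """Mirror of the client-side member_id format validation."""
    return bool(MEMBER_ID_RE.match(member_id))
```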
Claude
94b03b49d9 Add comprehensive Member Editor implementation plan
Create detailed plan for building a Member Editor admin interface at /a/members.
The plan follows the proven Tag Editor pattern and includes:

- Complete route structure for CRUD operations
- Full HTML template layout with modals and forms
- JavaScript event handlers for edit/delete actions
- Integration with existing Member API endpoints
- Testing checklist and acceptance criteria

All backend infrastructure (API, models, schemas) already exists.
This is purely a web UI implementation task estimated at 2-3 hours.
2026-01-12 19:33:13 +00:00
Louis King
20d75fe041 Add bulk copy and delete all tags for node replacement workflow
When replacing a node device, users can now:
- Copy All: Copy all tags to a new node (skips existing tags)
- Delete All: Remove all tags from a node after migration

New API endpoints:
- POST /api/v1/nodes/{pk}/tags/copy-to/{dest_pk}
- DELETE /api/v1/nodes/{pk}/tags

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 14:46:51 +00:00
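The "Copy All (skips existing tags)" behaviour can be sketched as a dictionary merge where the destination's existing keys win. This is a simplified illustration of the semantics, not the endpoint's implementation.

```python
def copy_tags(source: dict[str, str], dest: dict[str, str]) -> dict[str, str]:
    """Copy every tag from source to dest, skipping keys dest already has."""
    merged = dict(dest)
    for key, value in source.items():
        merged.setdefault(key, value)  # existing destination tags are kept
    return merged
```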
Louis King
307f3935e0 Add access denied page for unauthenticated admin access
When users try to access /a/ without valid OAuth2Proxy headers (e.g.,
GitHub account not in org), they now see a friendly 403 page instead
of a 500 error. Added authentication checks to all admin routes.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 13:34:03 +00:00
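A minimal sketch of the authentication check described above, assuming OAuth2Proxy's conventional `X-Forwarded-User` header carries the authenticated identity (the header name, lowercasing, and exception type are assumptions for illustration):

```python
class AccessDenied(Exception):
    """Rendered as a friendly 403 page instead of a 500 error."""

def require_proxy_user(headers: dict[str, str]) -> str:
    # OAuth2Proxy forwards the authenticated user in X-Forwarded-User;
    # its absence means the request did not pass an authorised session
    # (e.g. the GitHub account is not in the organisation).
    user = headers.get("x-forwarded-user", "").strip()
    if not user:
        raise AccessDenied("Sign in with an authorised GitHub account")
    return user
```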
Louis King
6901bafb02 Tidying Tag Editor layout 2026-01-11 13:13:22 +00:00
JingleManSweep
e595dc2b27 Merge pull request #63 from ipnet-mesh/claude/admin-node-tags-interface-pHbKm
Add admin interface for managing node tags
2026-01-11 12:51:56 +00:00
Louis King
ed2cf09ff3 Improve admin UI and remove unused coordinate tag type
- Replace node type badge with icon in admin tag editor
- Add Edit/Add Tags button on node detail page (when admin enabled and authenticated)
- Remove automatic seed container startup to prevent overwriting user changes
- Remove unused 'coordinate' value type from node tags (only string, number, boolean remain)
2026-01-11 12:49:34 +00:00
Claude
bec736a894 Sort node dropdown alphabetically in admin interface
Nodes in the dropdown are now sorted alphabetically by name,
with unnamed nodes appearing at the end.
2026-01-11 12:01:11 +00:00
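The sort order above (alphabetical by name, unnamed nodes last) can be expressed with a two-part sort key; a sketch with an illustrative data shape:

```python
def sort_nodes(nodes: list[dict]) -> list[dict]:
    """Alphabetical by name (case-insensitive), unnamed nodes at the end."""
    # False sorts before True, so named nodes come first.
    return sorted(
        nodes,
        key=lambda n: (n.get("name") is None, (n.get("name") or "").lower()),
    )
```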
Claude
1457360703 Use API_ADMIN_KEY for web service to enable admin operations
The web admin interface needs write permissions to create, update,
move, and delete node tags. Changed to use API_ADMIN_KEY with
fallback to API_READ_KEY if admin key is not configured.
2026-01-11 11:55:15 +00:00
Claude
d8a0f2abb8 Fix security vulnerabilities and add validation
- Fix XSS vulnerability by using data attributes instead of inline
  onclick handlers in node_tags.html template
- Fix URL injection by using urlencode for all redirect URL parameters
- Add validation to reject moves where source and destination nodes
  are the same (returns 400 Bad Request)
- Add error handling for response.json() calls that may fail
- Add missing test coverage for update endpoint error scenarios
2026-01-11 11:51:57 +00:00
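Two of the fixes above are easy to sketch in Python: encoding redirect parameters instead of concatenating them, and rejecting same-node moves. Function names here are illustrative, not the project's actual helpers.

```python
from urllib.parse import urlencode

def redirect_url(base: str, **params: str) -> str:
    """Encode redirect parameters so user-supplied values cannot
    inject extra query parameters into the URL."""
    return f"{base}?{urlencode(params)}"

def validate_move(source_pk: str, dest_pk: str) -> None:
    """Reject moves where source and destination are the same node
    (the endpoint returns 400 Bad Request in this case)."""
    if source_pk == dest_pk:
        raise ValueError("source and destination nodes are the same")
```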
Claude
367f838371 Add admin interface for managing node tags
Implement CRUD operations for NodeTags in the admin interface:

- Add NodeTagMove schema for moving tags between nodes
- Add PUT /nodes/{public_key}/tags/{key}/move API endpoint
- Add web routes at /a/node-tags for tag management
- Create admin templates with node selector and tag management UI
- Support editing, adding, moving, and deleting tags via API calls
- Add comprehensive tests for new functionality

The interface allows selecting a node from a dropdown, viewing its
tags, and performing all CRUD operations including moving a tag
to a different node without having to delete and recreate it.
2026-01-11 01:34:07 +00:00
Louis King
741dd3ce84 Initial admin commit 2026-01-11 00:42:57 +00:00
JingleManSweep
0a12f389df Merge pull request #62 from ipnet-mesh/feature/contact-gps
Store Node GPS Coordinates
2026-01-09 20:17:40 +00:00
Louis King
8240c2fd57 Initial commit 2026-01-09 20:07:36 +00:00
Louis King
38f7fe291e Add member filtering to map page using member_id tag
Change the map filter from matching nodes by public_key to using the
member_id tag system. Now populates the member dropdown with all members
from the database and filters nodes based on their member_id tag value.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 19:16:15 +00:00
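A rough Python sketch of the tag-based filter described above (the real filter runs client-side in the map page's JavaScript; field names here are illustrative):

```python
def nodes_for_member(nodes: list[dict], member_id: str) -> list[dict]:
    """Select nodes whose 'member_id' tag matches the chosen member."""
    return [
        n for n in nodes
        if n.get("tags", {}).get("member_id") == member_id
    ]
```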
JingleManSweep
e4087efbf0 Merge pull request #61 from ipnet-mesh/feature/ui-improvements
Remove SNR column from messages and add last seen to members
2026-01-08 21:25:03 +00:00
Louis King
3051984fb9 Remove SNR column from messages and add last seen to members
- Remove SNR column from messages list (no longer provided by meshcore library)
- Add relative "last seen" time to nodes on members page with tooltip
- Add populateRelativeTimeElements() utility for time elements
2026-01-08 21:23:14 +00:00
JingleManSweep
eea2c90ea4 Merge pull request #58 from ipnet-mesh/feature/ui-improvements
Add member/node filters, mobile card views, and pagination macro
2026-01-08 20:15:54 +00:00
Louis King
d52c23fc29 Add member/node filters, mobile card views, and pagination macro
- Add member_id filter to nodes and advertisements API endpoints
- Add member and node dropdowns to web list pages
- Implement responsive mobile card view for nodes and advertisements
- Extract pagination into reusable Jinja2 macro (_macros.html)
- Fix Python version in README (3.11+ -> 3.13+)
2026-01-08 20:13:49 +00:00
Louis King
a1fb71ce65 Add responsive mobile card view for messages page 2026-01-08 16:50:29 +00:00
JingleManSweep
6a5549081f Merge pull request #56 from ipnet-mesh/fix/receiver-contact-cleanup
Add contact cleanup to interface RECEIVER mode
2026-01-08 10:28:26 +00:00
Louis King
68e24ee886 Fix 2026-01-08 10:26:31 +00:00
Louis King
61d6b6287e Add contact cleanup to interface RECEIVER mode
- Add CONTACT_CLEANUP_ENABLED and CONTACT_CLEANUP_DAYS settings
- Implement remove_contact and schedule_remove_contact on device classes
- During contact sync, remove stale contacts from companion node
- Stale contacts (not advertised for > N days) not published to MQTT
- Update Python version to 3.13 across project config
- Remove brittle config tests that assumed default env values
2026-01-08 10:22:27 +00:00
Louis King
7007c84577 Updated screenshot 2025-12-08 23:45:22 +00:00
Louis King
fd928d9fea Updated diagrams 2025-12-08 23:40:52 +00:00
Louis King
68b6aa85cd Updated diagrams 2025-12-08 23:39:25 +00:00
Louis King
abbc07edb3 Updated diagrams 2025-12-08 23:37:13 +00:00
Louis King
b42add310e Updated diagrams 2025-12-08 23:36:13 +00:00
Louis King
98a5526e80 Updated diagrams 2025-12-08 23:34:28 +00:00
Louis King
db86b3198e Some minor UI improvements, updated env.example, and docs 2025-12-08 23:06:04 +00:00
Louis King
cd4f0b91dc Various UI improvements 2025-12-08 22:07:46 +00:00
Louis King
a290db0491 Updated chart stats 2025-12-08 19:37:45 +00:00
Louis King
92b0b883e6 More website improvements 2025-12-08 17:07:39 +00:00
Louis King
9e621c0029 Fixed test 2025-12-08 16:42:13 +00:00
Louis King
a251f3a09f Added map to node detail page, made title consistent with emoji 2025-12-08 16:37:53 +00:00
Louis King
0fdedfe5ba Tidied Advert/Node search 2025-12-08 16:22:08 +00:00
Louis King
243a3e8521 Added truncate CLI command 2025-12-08 15:54:32 +00:00
JingleManSweep
b24a6f0894 Merge pull request #54 from ipnet-mesh/feature/more-filters
Fixed Member model
2025-12-08 15:15:04 +00:00
Louis King
57f51c741c Fixed Member model 2025-12-08 15:13:24 +00:00
Louis King
65b8418af4 Fixed last seen issue 2025-12-08 00:15:25 +00:00
JingleManSweep
89ceee8741 Merge pull request #51 from ipnet-mesh/feat/sync-receiver-contacts-on-advert
Receiver nodes now sync contacts to MQTT on every advert received
2025-12-07 23:36:11 +00:00
Louis King
64ec1a7135 Receiver nodes now sync contacts to MQTT on every advert received 2025-12-07 23:34:33 +00:00
JingleManSweep
3d632a94b1 Merge pull request #50 from ipnet-mesh/feat/remove-friendly-name
Removed friendly name support and tidied tags
2025-12-07 23:03:39 +00:00
Louis King
fbd29ff78e Removed friendly name support and tidied tags 2025-12-07 23:02:19 +00:00
Louis King
86bff07f7d Removed contrib 2025-12-07 22:22:32 +00:00
Louis King
3abd5ce3ea Updates 2025-12-07 22:18:16 +00:00
Louis King
0bf2086f16 Added screenshot 2025-12-07 22:05:34 +00:00
Louis King
40dc6647e9 Updates 2025-12-07 22:02:42 +00:00
Louis King
f4e95a254e Fixes 2025-12-07 22:00:46 +00:00
Louis King
ba43be9e62 Fixes 2025-12-07 21:58:42 +00:00
JingleManSweep
5b22ab29cf Merge pull request #49 from ipnet-mesh/fix/version-display
Fixed version display
2025-12-07 21:56:26 +00:00
Louis King
278d102064 Fixed version display 2025-12-07 21:55:10 +00:00
JingleManSweep
f0cee14bd8 Merge pull request #48 from ipnet-mesh/feature/mqtt-tls
Added support for MQTT TLS
2025-12-07 21:16:13 +00:00
Louis King
5ff8d16bcb Added support for MQTT TLS 2025-12-07 21:15:05 +00:00
JingleManSweep
e8a60d4869 Merge pull request #47 from ipnet-mesh/feature/node-cleanup
Added Node/Data cleanup
2025-12-07 20:50:09 +00:00
Louis King
84b8614e29 Updates 2025-12-06 21:42:33 +00:00
Louis King
3bc47a33bc Added data retention and node cleanup 2025-12-06 21:27:19 +00:00
Louis King
3ae8ecbd70 Updates 2025-12-06 20:46:31 +00:00
JingleManSweep
38164380af Merge pull request #41 from ipnet-mesh/claude/issue-37-20251206-1854
feat: Add MESHCORE_DEVICE_NAME config to set node name on startup
2025-12-06 19:33:32 +00:00
claude[bot]
dc3c771c76 docs: Document MESHCORE_DEVICE_NAME configuration option
Add documentation for the new MESHCORE_DEVICE_NAME environment variable
that was introduced in this PR. Updates include:

- Added to .env.example with description
- Added to Interface Settings table in README.md
- Added to CLI Reference examples in README.md
- Added to Interface configuration table in PLAN.md

🤖 Generated with [Claude Code](https://claude.ai/claude-code)

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-06 19:07:57 +00:00
claude[bot]
deb307c6ae feat: Add MESHCORE_DEVICE_NAME config to set node name on startup
- Add meshcore_device_name field to InterfaceSettings
- Implement set_name() method in device interface (real and mock)
- Update receiver to set device name during initialization if configured
- Add --device-name CLI option with MESHCORE_DEVICE_NAME env var support
- Device name is set after time sync and before advertisement broadcast

Fixes #37

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 19:00:56 +00:00
JingleManSweep
b8c8284643 Merge pull request #39 from ipnet-mesh/claude/issue-38-20251206-1840
Send flood advertisement on receiver startup
2025-12-06 18:48:50 +00:00
JingleManSweep
d310a119ed Update src/meshcore_hub/interface/receiver.py
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-12-06 18:48:03 +00:00
claude[bot]
2b307679c9 Send flood advertisement on receiver startup
Changed the startup advertisement from flood=False to flood=True
so that the device name is broadcast to the mesh network.

Fixes #38

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 18:42:53 +00:00
Louis King
6f7521951f Updates 2025-12-06 18:29:12 +00:00
Louis King
ab498292b2 Updated README with upgrading instructions 2025-12-06 17:31:10 +00:00
JingleManSweep
df2b9ea432 Merge pull request #35 from ipnet-mesh/claude/issue-33-20251206-1703
Add last seen time to map node labels
2025-12-06 17:18:49 +00:00
claude[bot]
55443376be Use full display name in map node labels
Update map node labels to show the full display name (friendly_name tag
→ advertised node name → public key prefix) instead of just the 2-char
public key prefix. This makes node labels consistent with the rest of
the site and easier to identify at a glance.

The backend already computes the correct display name in map.py:96-101,
so this change just uses that computed name in the label.

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 17:12:50 +00:00
claude[bot]
ed7a46b1a7 Add last seen time to map node labels
Display relative time since last seen (e.g., '2m', '1h', '2d') in node
labels on the map page. This makes it easier to quickly identify how
recently nodes were active without opening the popup.

- Add formatRelativeTime() function to calculate time difference
- Update createNodeIcon() to include relative time in label
- Adjust icon size to accommodate additional text
- Format: keyPrefix (timeAgo) e.g., 'ab (5m)'

Fixes #33

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 17:05:30 +00:00
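The `formatRelativeTime()` function above is JavaScript on the map page; a Python analogue of the same '2m' / '1h' / '2d' labelling (thresholds assumed from the examples in the commit message) looks like this:

```python
def format_relative_time(seconds: int) -> str:
    """Compact relative-time label: seconds, minutes, hours, or days."""
    if seconds < 60:
        return f"{seconds}s"
    if seconds < 3600:
        return f"{seconds // 60}m"
    if seconds < 86400:
        return f"{seconds // 3600}h"
    return f"{seconds // 86400}d"
```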
JingleManSweep
c2eef3db50 Merge pull request #34 from ipnet-mesh/claude/issue-25-20251206-1646
fix: Handle empty channel_idx parameter in messages filter
2025-12-06 16:53:34 +00:00
claude[bot]
4916ea0cea fix: Handle empty channel_idx parameter in messages filter
Fixed parse error when clicking the filter button on messages screen
with "All Channels" selected. The form was sending an empty string
for channel_idx, but FastAPI expected either a valid integer or None.

Changes:
- Accept channel_idx as string in query parameter
- Parse and validate channel_idx before passing to API
- Treat empty strings as None to prevent validation errors
- Add error handling for invalid integer values

Fixes #25

Co-authored-by: JingleManSweep <jinglemansweep@users.noreply.github.com>
2025-12-06 16:49:23 +00:00
JingleManSweep
e3fc7e4f07 Merge pull request #32 from ipnet-mesh/add-claude-github-actions-1765039315831
Add Claude Code GitHub Workflow
2025-12-06 16:42:13 +00:00
JingleManSweep
b656bfda21 "Claude PR Assistant workflow" 2025-12-06 16:41:56 +00:00
Louis King
fb7201dc2d Updates 2025-12-06 16:37:25 +00:00
Louis King
74346d9c82 Hopefully use Git tag as version on website 2025-12-06 16:32:31 +00:00
Louis King
beb471fcd8 Switched to Git versioning 2025-12-06 16:28:13 +00:00
Louis King
8597052bf7 Fixed table overflows 2025-12-06 16:07:13 +00:00
Louis King
f85768f661 Fixed DB migration 2025-12-06 15:46:41 +00:00
JingleManSweep
79bb4a4250 Merge pull request #23 from ipnet-mesh/claude/plan-event-deduplication-01NWhrJaWzQiGi1xLzmk1udg
Deduplication for multiple receiver nodes
2025-12-06 15:36:37 +00:00
Louis King
714c3cbbd2 Set sensible Docker tag label 2025-12-06 15:32:15 +00:00
Louis King
f0531c9e40 Updated env example 2025-12-06 15:16:26 +00:00
Louis King
dd0b4c73c5 More fixes 2025-12-06 15:10:03 +00:00
Louis King
78a086e1ea Updates 2025-12-06 14:51:47 +00:00
Louis King
9cd1d50bf6 Updates 2025-12-06 14:38:53 +00:00
Louis King
733342a9ec Fixed README and Compose 2025-12-06 14:21:17 +00:00
Louis King
d715e4e4f0 Updates 2025-12-06 13:33:02 +00:00
Louis King
2ea04deb7e Updates 2025-12-06 12:53:29 +00:00
Claude
6e3b86a1ad Add collector-level event deduplication using content hashes
Replace presentation-layer deduplication with collector-level approach:
- Add event_hash column to messages, advertisements, trace_paths, telemetry tables
- Handlers compute content hashes and skip duplicate events at insertion time
- Use 5-minute time buckets for advertisements and telemetry
- Include Alembic migration for schema changes
2025-12-06 12:23:14 +00:00
Louis King
b807932ca3 Added arm64 Docker image support 2025-12-06 12:16:05 +00:00
Claude
c80986fe67 Add event deduplication at presentation layer
When multiple receiver nodes are running, the same mesh events (messages,
advertisements) are reported multiple times. This causes duplicate entries
in the Web UI.

Changes:
- Add hash_utils.py with deterministic hash functions for each event type
- Add `dedupe` parameter to messages and advertisements API endpoints (default: True)
- Update dashboard stats to use distinct counts for messages/advertisements
- Deduplicate recent advertisements and channel messages in dashboard
- Add comprehensive tests for hash utilities

Hash strategy:
- Messages: hash of text + pubkey_prefix + channel_idx + sender_timestamp + txt_type
- Advertisements: hash of public_key + name + adv_type + flags + 5-minute time bucket
2025-12-06 12:08:07 +00:00
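The advertisement hash strategy above (content fields plus a 5-minute time bucket) can be sketched as follows; the exact field order, separator, and digest choice are assumptions for illustration:

```python
import hashlib

BUCKET_SECONDS = 300  # 5-minute bucket, per the commit message

def advertisement_hash(
    public_key: str, name: str, adv_type: str, flags: int, timestamp: int
) -> str:
    """Deterministic content hash; the time bucket makes near-simultaneous
    reports of the same advert from different receivers collide."""
    bucket = timestamp // BUCKET_SECONDS
    payload = f"{public_key}|{name}|{adv_type}|{flags}|{bucket}"
    return hashlib.sha256(payload.encode()).hexdigest()
```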
Louis King
3a060f77cc Updates 2025-12-06 11:46:36 +00:00
JingleManSweep
7461cf6dc1 Merge pull request #22 from ipnet-mesh/claude/add-member-node-association-01JezMA7XzvwsX37rMNoBmvo
Updates
2025-12-05 21:18:31 +00:00
Louis King
23f6c290c9 Updates 2025-12-05 21:17:34 +00:00
JingleManSweep
1ae1736391 Merge pull request #21 from ipnet-mesh/claude/add-member-node-association-01JezMA7XzvwsX37rMNoBmvo
Associate nodes with members in database
2025-12-05 21:17:20 +00:00
Claude
a4b13d3456 Add member-node association support
Members can now have multiple associated nodes, each with a public_key
and node_role (e.g., 'chat', 'repeater'). This replaces the single
public_key field on members with a one-to-many relationship.

Changes:
- Add MemberNode model for member-node associations
- Update Member model to remove public_key, add nodes relationship
- Update Pydantic schemas with MemberNodeCreate/MemberNodeRead
- Update member_import.py to handle nodes list in seed files
- Update API routes to handle nodes in create/update/read operations
- Add Alembic migration to create member_nodes table and migrate data
- Update example seed file with new format
2025-12-05 20:34:09 +00:00
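The one-to-many relationship described above can be sketched with plain dataclasses (the real models are SQLAlchemy; these illustrative shapes only show the shift from a single `public_key` field to a `nodes` list):

```python
from dataclasses import dataclass, field

@dataclass
class MemberNode:
    public_key: str
    node_role: str  # e.g. 'chat', 'repeater'

@dataclass
class Member:
    member_id: str
    name: str
    # Replaces the former single public_key field on Member.
    nodes: list[MemberNode] = field(default_factory=list)
```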
Louis King
0016edbdac Updates 2025-12-05 20:03:14 +00:00
Louis King
0b8fc6e707 Charts 2025-12-05 19:50:22 +00:00
Louis King
d1181ae4f9 Updates 2025-12-05 19:27:06 +00:00
JingleManSweep
6b41e64b26 Merge pull request #20 from ipnet-mesh/claude/ui-improvements-ads-page-01GQvLau46crtrqftWzFie5d
UI improvements and advertisements page
2025-12-05 18:26:07 +00:00
Claude
995b066b0d Remove sender name from channel messages summary, keep only timestamp 2025-12-05 18:23:33 +00:00
Claude
0d14ed0ccc Add latest channel messages to Network dashboard
Replace the channel counts table with actual recent messages per channel:
- Added ChannelMessage schema for channel message summaries
- Dashboard API now fetches latest 5 messages for each channel with sender name lookups
- Network page displays messages grouped by channel with sender names and timestamps
- Only shows channels that have messages
2025-12-05 18:20:49 +00:00
Claude
e3ce1258a8 Fix advertisements Type column by falling back to source node's adv_type
The adv_type from the Advertisement record is often null, but the linked
Node has the correct adv_type. Now falls back to source_node.adv_type
when adv.adv_type is null.
2025-12-05 18:14:16 +00:00
Claude
087a3c4c43 Messages list: swap Time/Type columns, add receiver node links
- Swap Time and Type columns (Type now first)
- Add receiver_name and receiver_friendly_name to MessageRead schema
- Update messages API to fetch receiver node names and tags
- Make Receiver column a link showing name with public key prefix
2025-12-05 18:09:29 +00:00
Claude
5077178a6d Add Type column with emoji to Advertisements list 2025-12-05 18:04:04 +00:00
Claude
a44d38dad6 Update Nodes list to match Advertisements style
- Rename Name column to Node
- Remove separate Public Key column
- Show name with public key prefix below (like Advertisements list)
- Add whitespace-nowrap to Last Seen column
2025-12-05 18:03:14 +00:00
Claude
3469278fba Add node name lookups to advertisements list
- Join with Node table to get node names and tags for both source
  and receiver nodes
- Display friendly_name (from tags), node_name, or advertised name
  with priority in that order
- Show name with public key preview for both Node and Received By columns
2025-12-05 17:58:52 +00:00
Claude
89c81630c9 Link 'Powered by MeshCore Hub' to GitHub repository 2025-12-05 17:55:59 +00:00
Claude
ab4a5886db Simplify advertisements table: Node, Received By, Time columns
- Remove Name and Type columns (usually null)
- Reorder columns: Node first, then Received By, then Time
- Link both Node and Received By to their node detail pages
- Show node name with public key preview when available
2025-12-05 17:53:54 +00:00
Claude
ec7082e01a Fix message text indentation in messages list
Put message content inline with td tag to prevent whitespace-pre-wrap
from preserving template indentation.
2025-12-05 17:43:09 +00:00
Claude
b4e7d45cf6 UI improvements: smaller hero, stats bar, advertisements page, messages fixes
- Reduce hero section size and add stats bar with node/message counts
- Add new Advertisements page with public key filtering
- Update hero navigation buttons: Dashboard, Nodes, Advertisements, Messages
- Add Advertisements to main navigation menu
- Remove Hops column from messages list (always empty)
- Display full message text with proper multi-line wrapping
2025-12-05 17:14:26 +00:00
JingleManSweep
864494c3a8 Merge pull request #19 from ipnet-mesh/claude/research-node-id-usage-019DURYQHkvodx9sV39hTmNM
Research node ID and public key usage
2025-12-05 16:59:30 +00:00
Claude
84e83a3384 Rename receiver_public_key to received_by
Shorter, cleaner field name for the receiving interface node's
public key in API responses.
2025-12-05 16:57:24 +00:00
Claude
796e303665 Remove internal UUID fields from API responses
Internal database UUIDs (id, node_id, receiver_node_id) were being
exposed in API responses. These are implementation details that should
not be visible to API consumers. The canonical identifier for nodes
is the 64-char hex public_key.

Changes:
- Remove id, node_id from NodeTagRead, NodeRead schemas
- Remove id from MemberRead schema
- Remove id, receiver_node_id, node_id from MessageRead, AdvertisementRead,
  TracePathRead, TelemetryRead schemas
- Update web map component to use public_key instead of member.id
  for owner filtering
- Update tests to not assert on removed fields
2025-12-05 16:50:21 +00:00
Louis King
a5d8d586e1 Updated README 2025-12-05 12:56:12 +00:00
JingleManSweep
26239fe11f Merge pull request #18 from ipnet-mesh/claude/prepare-public-release-01AFsizAneHmWjHZD6of5MWy
Prepare repository for public release
2025-12-05 12:40:56 +00:00
Claude
0e50a9d3b0 Prepare repository for public release
- Update license from MIT to GPL-3.0-or-later in pyproject.toml
- Update project URLs from meshcore-dev to ipnet-mesh organization
- Add explicit GPL-3.0 license statement to README
- Fix AGENTS.md venv directory reference (.venv vs venv)
- Remove undocumented NETWORK_LOCATION from README
- Fix stats endpoint path in README (/api/v1/dashboard/stats)
- Clarify seed and data directory descriptions in project structure
2025-12-05 12:12:55 +00:00
JingleManSweep
9b7d8cc31b Merge pull request #17 from ipnet-mesh/claude/implement-map-page-01F5nbj6KfXDv4Dsp1KE6jJZ
Updates
2025-12-04 19:35:15 +00:00
Louis King
d7152a5359 Updates 2025-12-04 19:34:18 +00:00
JingleManSweep
a1cc4388ae Merge pull request #16 from ipnet-mesh/claude/implement-map-page-01F5nbj6KfXDv4Dsp1KE6jJZ
Implement Map page with node filtering
2025-12-04 19:32:24 +00:00
Claude
bb0b9f05ec Add debug info to map data endpoint for troubleshooting
- Return total_nodes, nodes_with_coords, and error in response
- Display meaningful messages when no nodes or no coordinates found
- Log API errors and node counts for debugging
2025-12-04 18:37:50 +00:00
Claude
fe744c7c0c Fix map markers to use inline styles and center on nodes
- Use inline styles for marker colors instead of CSS classes for reliable rendering
- Center map on node locations when data is first loaded
- Refactor filter logic to separate recentering behavior
- Update legend to use inline styles
2025-12-04 18:34:24 +00:00
Claude
cf4e82503a Add filters to map page for node type, infrastructure, and owner
- Enhanced /map/data endpoint to include node role tag and member ownership
- Added client-side filtering for node type (chat, repeater, room)
- Added toggle to filter for infrastructure nodes only (role: infra)
- Added dropdown filter for member owner (nodes linked via public_key)
- Color-coded markers by node type with gold border for infrastructure
- Added legend showing marker types
- Dynamic count display showing total vs filtered nodes
2025-12-04 18:29:43 +00:00
Louis King
d6346fdfde Updates 2025-12-04 18:18:36 +00:00
Louis King
cf2c3350cc Updates 2025-12-04 18:10:29 +00:00
Louis King
110c701787 Updates 2025-12-04 16:37:59 +00:00
Louis King
fc0dc1a448 Updates 2025-12-04 16:12:51 +00:00
Louis King
d283a8c79b Updates 2025-12-04 16:00:15 +00:00
JingleManSweep
f129a4e0f3 Merge pull request #15 from ipnet-mesh/feature/originator-address
Updates
2025-12-04 15:52:49 +00:00
JingleManSweep
90f6a68b9b Merge pull request #14 from ipnet-mesh/claude/update-docs-docker-agents-01B9FYrem1tEwNRkx7rCH1QU
Update docs: Docker profiles, seed data, and Members model
2025-12-04 15:46:00 +00:00
Louis King
6cf3152ef9 Updates 2025-12-04 15:45:35 +00:00
Claude
058a6e2c95 Update docs: Docker profiles, seed data, and Members model
- Update Docker Compose section: core services run by default (mqtt,
  collector, api, web), optional profiles for interfaces and utilities
- Document automatic seeding on collector startup
- Add SEED_HOME environment variable documentation
- Document new Members model and YAML seed file format
- Update node_tags format to YAML with public_key-keyed structure
- Update project structure to reflect seed/ and data/ directories
- Add CLI reference for collector seed commands
2025-12-04 15:31:04 +00:00
JingleManSweep
acccdfedba Merge pull request #13 from ipnet-mesh/claude/fix-black-ci-linting-01UGvpVAnxCMthohsvzFBiQF
Fix Black linting issues in CI pipeline
2025-12-04 15:12:36 +00:00
Claude
83f3157e8b Fix Black formatting: add trailing comma in set_dispatch_callback
Black requires a trailing comma after the callback parameter when the
function signature spans multiple lines.
2025-12-04 15:06:13 +00:00
JingleManSweep
c564a61cf7 Merge pull request #12 from ipnet-mesh/claude/trigger-contact-database-01Qp5vbzzE1c77Kv21wYfQbQ
Trigger Contact database before CONTACT events
2025-12-04 14:59:21 +00:00
Claude
bbe8491ff1 Add info-level logging to contact handler for debugging
Change debug logging to info level so contact processing is visible
in default log output. This helps verify that contact events are
being received and processed correctly.
2025-12-04 14:39:14 +00:00
Claude
1d6a9638a1 Refactor contacts to emit individual MQTT events per contact
Instead of sending all contacts in one MQTT message, the interface
now splits the device's contacts response into individual 'contact'
events. This is more consistent with other event patterns and makes
the collector simpler.

Interface changes:
- Add _publish_contacts() to split contacts dict into individual events
- Publish each contact as 'contact' event (not 'contacts')

Collector changes:
- Rename handle_contacts to handle_contact for single contact
- Simplify handler to process one contact per message
- Register handler for 'contact' events
2025-12-04 14:34:05 +00:00
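The interface change above (one 'contact' event per entry instead of a single 'contacts' payload) can be sketched like this; the publish signature and payload shape are illustrative assumptions:

```python
import json

def publish_contacts(publish, contacts: dict[str, dict]) -> int:
    """Split the device's contacts response into individual 'contact'
    events, matching the per-event pattern used elsewhere."""
    for public_key, contact in contacts.items():
        publish("contact", json.dumps({"public_key": public_key, **contact}))
    return len(contacts)
```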
Claude
cf633f9f44 Update contacts handler to match actual device payload format
The device sends contact entries with different field names than
originally expected:
- adv_name (not name) for the advertised node name
- type (numeric: 0=none, 1=chat, 2=repeater, 3=room) instead of node_type

Changes:
- Update handle_contacts to extract adv_name and convert numeric type
- Add NODE_TYPE_MAP for type conversion
- Always update node name if different (not just if empty)
- Add debug logging for node updates
- Update ContactInfo schema with actual device fields
2025-12-04 14:25:11 +00:00
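The numeric type conversion described above is a small lookup; a sketch using the mapping stated in the commit message (0=none, 1=chat, 2=repeater, 3=room), with an assumed fallback for unknown values:

```python
# Numeric contact types reported by the device, per the commit above.
NODE_TYPE_MAP = {0: "none", 1: "chat", 2: "repeater", 3: "room"}

def contact_node_type(contact: dict) -> str:
    """Convert the device's numeric 'type' field to a readable node type."""
    return NODE_TYPE_MAP.get(contact.get("type", 0), "none")
```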
Claude
102e40a395 Fix event subscriptions setup timing for contact database sync
Move _setup_event_subscriptions() from run() to connect() so that
event subscriptions are active before get_contacts() is called during
device initialization. Previously, CONTACTS events were lost because
subscriptions weren't set up until run() was called.
2025-12-04 14:18:27 +00:00
Claude
241902685d Add contact database sync to interface startup
Trigger get_contacts() during receiver initialization to fetch the
device's contact database and broadcast CONTACTS events over MQTT.
This enables the collector to associate broadcast node names with
node records in the database.

Changes:
- Add get_contacts() abstract method to BaseMeshCoreDevice
- Implement get_contacts() in MeshCoreDevice using meshcore library
- Implement get_contacts() in MockMeshCoreDevice for testing
- Call get_contacts() in Receiver._initialize_device() after startup
2025-12-04 14:10:58 +00:00
JingleManSweep
2f2ae30c89 Merge pull request #11 from ipnet-mesh/claude/collector-seed-yaml-conversion-01WgpDYuzrzP5nkeC2EG9o2L
Convert collector seed mechanism from JSON to YAML
2025-12-04 01:34:47 +00:00
Louis King
fff04e4b99 Updates 2025-12-04 01:33:25 +00:00
Claude
df05c3a462 Convert collector seed mechanism from JSON to YAML
- Replace JSON seed files with YAML format for better readability
- Auto-detect YAML primitive types (number, boolean, string) from values
- Add automatic seed import on collector startup
- Split lat/lon into separate tags instead of combined coordinate string
- Add PyYAML dependency and types-PyYAML for type checking
- Update example/seed and contrib/seed/ipnet with clean YAML format
- Update tests to verify YAML primitive type detection
2025-12-04 01:27:03 +00:00
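The "auto-detect YAML primitive types" step above is essentially YAML's implicit typing; a dependency-free sketch of the same resolution order (boolean, then integer, then float, else string) for illustration:

```python
def detect_primitive(value: str):
    """Mimic YAML implicit typing: boolean, number, else string."""
    lowered = value.strip().lower()
    if lowered in ("true", "false"):
        return lowered == "true"
    try:
        return int(value)
    except ValueError:
        pass
    try:
        return float(value)
    except ValueError:
        return value  # falls back to a plain string
```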
Louis King
e2d865f200 Fix nodes page test to match template output
The test was checking for adv_type values (REPEATER, CLIENT) but the
nodes.html template doesn't display that column. Updated to check for
public key prefixes instead.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-04 01:03:05 +00:00
Louis King
fa335bdb14 Updates 2025-12-04 00:59:49 +00:00
128 changed files with 11590 additions and 1914 deletions


@@ -1,110 +1,127 @@
# MeshCore Hub - Docker Compose Environment Configuration
# MeshCore Hub - Environment Configuration
# Copy this file to .env and customize values
#
# Configuration is grouped by service. Most deployments only need:
# - Common Settings (always required)
# - MQTT Settings (always required)
# - Interface Settings (for receiver/sender services)
#
# The Collector, API, and Web services typically run as a combined "core"
# profile and share the same data directory.
#
# -----------------------------------------------------------------------------
# QUICK START: Receiver/Sender Only
# -----------------------------------------------------------------------------
# For a minimal receiver or sender setup, you only need these settings:
#
# MQTT_HOST=your-mqtt-broker.example.com
# MQTT_PORT=1883
# MQTT_USERNAME=your_username
# MQTT_PASSWORD=your_password
# MQTT_TLS=false
# SERIAL_PORT=/dev/ttyUSB0
#
# Serial ports are typically /dev/ttyUSB[0-9] or /dev/ttyACM[0-9] on Linux.
# -----------------------------------------------------------------------------
# ===================
# Docker Image
# ===================
# =============================================================================
# COMMON SETTINGS
# =============================================================================
# These settings apply to all services
# Leave empty to build from local Dockerfile, or set to use a pre-built image:
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:latest
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:main
# MESHCORE_IMAGE=ghcr.io/ipnet-mesh/meshcore-hub:v1.0.0
MESHCORE_IMAGE=
# Docker image version tag to use
# Options: latest, main, v1.0.0, etc.
IMAGE_VERSION=latest
# ===================
# Data Directory
# ===================
# Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
LOG_LEVEL=INFO
# Base directory for all service data (collector DB, tags, members, etc.)
# Base directory for runtime data (database, etc.)
# Default: ./data (relative to docker-compose.yml location)
# Inside containers this is mapped to /data
#
# Structure:
# ${DATA_HOME}/
# ├── collector/
# │   ├── meshcore.db    # SQLite database
# │   └── tags.json      # Node tags for import
# └── web/
#     └── members.json   # Network members list
# └── collector/
#     └── meshcore.db    # SQLite database
DATA_HOME=./data
# ===================
# Common Settings
# ===================
# Directory containing seed data files for import
# Default: ./seed (relative to docker-compose.yml location)
# Inside containers this is mapped to /seed
#
# Structure:
# ${SEED_HOME}/
# ├── node_tags.yaml # Node tags for import
# └── members.yaml # Network members for import
SEED_HOME=./seed
# Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
LOG_LEVEL=INFO
# =============================================================================
# MQTT SETTINGS
# =============================================================================
# MQTT broker connection settings for interface, collector, and API services
# MQTT Broker Settings (internal use)
# MQTT Broker host
# When using the local MQTT broker (--profile mqtt), use "mqtt"
# When using an external broker, set the hostname/IP
MQTT_HOST=mqtt
# MQTT Broker port (default: 1883, or 8883 for TLS)
MQTT_PORT=1883
# MQTT authentication (optional)
MQTT_USERNAME=
MQTT_PASSWORD=
# MQTT topic prefix for all MeshCore messages
MQTT_PREFIX=meshcore
# External MQTT port mapping
# Enable TLS/SSL for MQTT connection
# When enabled, uses TLS with system CA certificates (e.g., for Let's Encrypt)
MQTT_TLS=false
# External port mappings for local MQTT broker (--profile mqtt only)
MQTT_EXTERNAL_PORT=1883
MQTT_WS_PORT=9001
# ===================
# Interface Settings
# ===================
# =============================================================================
# INTERFACE SETTINGS (Receiver/Sender)
# =============================================================================
# Settings for the MeshCore device interface services
# Serial port for receiver device
SERIAL_PORT=/dev/ttyUSB0
# Serial port for sender device (if separate)
# Serial port for sender device (if using separate device)
SERIAL_PORT_SENDER=/dev/ttyUSB1
# Baud rate for serial communication
SERIAL_BAUD=115200
# Optional device/node name to set on startup
# This name is broadcast to the mesh network in advertisements
MESHCORE_DEVICE_NAME=
# Optional node address override (64-char hex string)
# Only set if you need to override the device's public key
NODE_ADDRESS=
NODE_ADDRESS_SENDER=
# ===================
# API Settings
# ===================
# =============================================================================
# COLLECTOR SETTINGS
# =============================================================================
# The collector subscribes to MQTT events and stores them in the database
# External API port
API_PORT=8000
# API Keys for authentication (generate secure keys for production!)
# Example: openssl rand -hex 32
API_READ_KEY=
API_ADMIN_KEY=
# ===================
# Web Dashboard Settings
# ===================
# External web port
WEB_PORT=8080
# Network Information (displayed on web dashboard)
NETWORK_NAME=MeshCore Network
NETWORK_CITY=
NETWORK_COUNTRY=
NETWORK_LOCATION=
NETWORK_RADIO_CONFIG=
NETWORK_CONTACT_EMAIL=
NETWORK_CONTACT_DISCORD=
# Members file location (optional override)
# Default: ${DATA_HOME}/web/members.json
# Only set this if you want to use a different location
# MEMBERS_FILE=/custom/path/to/members.json
# ===================
# -------------------
# Webhook Settings
# ===================
# -------------------
# Webhooks forward mesh events to external HTTP endpoints as POST requests
# Webhook for advertisement events (node discovery)
# Events are sent as POST requests with JSON payload
WEBHOOK_ADVERTISEMENT_URL=
WEBHOOK_ADVERTISEMENT_SECRET=
# Webhook for all message events (channel and direct messages)
# Use this for a single endpoint handling all messages
WEBHOOK_MESSAGE_URL=
WEBHOOK_MESSAGE_SECRET=
@@ -119,3 +136,83 @@ WEBHOOK_MESSAGE_SECRET=
WEBHOOK_TIMEOUT=10.0
WEBHOOK_MAX_RETRIES=3
WEBHOOK_RETRY_BACKOFF=2.0
# -------------------
# Data Retention Settings
# -------------------
# Automatic cleanup of old event data (advertisements, messages, telemetry, etc.)
# Enable automatic cleanup of old event data
DATA_RETENTION_ENABLED=true
# Number of days to retain event data
# Events older than this are deleted during cleanup
DATA_RETENTION_DAYS=30
# Hours between automatic cleanup runs
# Applies to both event data and node cleanup
DATA_RETENTION_INTERVAL_HOURS=24
# -------------------
# Node Cleanup Settings
# -------------------
# Automatic removal of inactive nodes
# Enable automatic cleanup of inactive nodes
# Nodes with last_seen=NULL (never seen on network) are NOT removed
NODE_CLEANUP_ENABLED=true
# Remove nodes not seen for this many days (based on last_seen field)
NODE_CLEANUP_DAYS=7
# =============================================================================
# API SETTINGS
# =============================================================================
# REST API for querying data and sending commands
# External API port
API_PORT=8000
# API Keys for authentication
# Generate secure keys for production: openssl rand -hex 32
# Leave empty to disable authentication (not recommended for production)
API_READ_KEY=
API_ADMIN_KEY=
# =============================================================================
# WEB DASHBOARD SETTINGS
# =============================================================================
# Web interface for visualizing network status
# External web port
WEB_PORT=8080
# -------------------
# Network Information
# -------------------
# Displayed on the web dashboard homepage
# Network display name
NETWORK_NAME=MeshCore Network
# Network location
NETWORK_CITY=
NETWORK_COUNTRY=
# Radio configuration (comma-delimited)
# Format: <profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>
# Example: EU/UK Narrow,869.618MHz,62.5kHz,SF8,CR8,22dBm
NETWORK_RADIO_CONFIG=
# Welcome text displayed on the homepage (optional, plain text)
# If not set, a default welcome message is shown
NETWORK_WELCOME_TEXT=
# -------------------
# Contact Information
# -------------------
# Contact links displayed in the footer
NETWORK_CONTACT_EMAIL=
NETWORK_CONTACT_DISCORD=
NETWORK_CONTACT_GITHUB=

View File

@@ -2,9 +2,9 @@ name: CI
on:
push:
branches: [main, master]
branches: [main]
pull_request:
branches: [main, master]
branches: [main]
jobs:
lint:
@@ -16,7 +16,7 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.11"
python-version: "3.13"
- name: Install dependencies
run: |
@@ -39,7 +39,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ["3.11"]
python-version: ["3.13"]
steps:
- uses: actions/checkout@v4
@@ -60,7 +60,7 @@ jobs:
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v4
if: matrix.python-version == '3.11'
if: matrix.python-version == '3.13'
with:
files: ./coverage.xml
fail_ci_if_error: false
@@ -76,7 +76,7 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.11"
python-version: "3.13"
- name: Install build tools
run: |

49
.github/workflows/claude.yml vendored Normal file
View File

@@ -0,0 +1,49 @@
name: Claude Code
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
jobs:
claude:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
(github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
actions: read # Required for Claude to read CI results on PRs
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Run Claude Code
id: claude
uses: anthropics/claude-code-action@v1
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# This is an optional setting that allows Claude to read CI results on PRs
additional_permissions: |
actions: read
# Optional: Give a custom prompt to Claude. If this is not specified, Claude will perform the instructions specified in the comment that tagged it.
# prompt: 'Update the pull request description to include a summary of changes.'
# Optional: Add claude_args to customize behavior and configuration
# See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
# or https://docs.claude.com/en/docs/claude-code/cli-reference for available options
# claude_args: '--allowed-tools Bash(gh pr:*)'

View File

@@ -2,11 +2,9 @@ name: Docker
on:
push:
branches: [main, master]
branches: [main]
tags:
- "v*"
pull_request:
branches: [main, master]
env:
REGISTRY: ghcr.io
@@ -23,6 +21,9 @@ jobs:
steps:
- uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
@@ -51,15 +52,18 @@ jobs:
with:
context: .
file: Dockerfile
platforms: linux/amd64,linux/arm64
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
BUILD_VERSION=${{ github.ref_name }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Test Docker image
if: github.event_name == 'pull_request'
run: |
docker build -t meshcore-hub-test -f Dockerfile .
docker build -t meshcore-hub-test --build-arg BUILD_VERSION=${{ github.ref_name }} -f Dockerfile .
docker run --rm meshcore-hub-test --version
docker run --rm meshcore-hub-test --help

1
.gitignore vendored
View File

@@ -33,6 +33,7 @@ share/python-wheels/
MANIFEST
uv.lock
docker-compose.override.yml
# PyInstaller
# Usually these files are written by a python script from a template

View File

@@ -14,7 +14,7 @@ repos:
rev: 24.3.0
hooks:
- id: black
language_version: python3.11
language_version: python3.13
args: ["--line-length=88"]
- repo: https://github.com/pycqa/flake8
@@ -38,3 +38,4 @@ repos:
- fastapi>=0.100.0
- alembic>=1.7.0
- types-paho-mqtt>=1.6.0
- types-PyYAML>=6.0.0

View File

@@ -1 +1 @@
3.11
3.13

193
AGENTS.md
View File

@@ -5,8 +5,8 @@ This document provides context and guidelines for AI coding assistants working o
## Agent Rules
* You MUST use Python (version in `.python-version` file)
* You MUST activate a Python virtual environment in the `venv` directory or create one if it does not exist:
- `ls ./venv` to check if it exists
* You MUST activate a Python virtual environment in the `.venv` directory or create one if it does not exist:
- `ls ./.venv` to check if it exists
- `python -m venv .venv` to create it
* You MUST always activate the virtual environment before running any commands
- `source .venv/bin/activate`
@@ -18,7 +18,7 @@ This document provides context and guidelines for AI coding assistants working o
## Project Overview
MeshCore Hub is a Python 3.11+ monorepo for managing and orchestrating MeshCore mesh networks. It consists of five main components:
MeshCore Hub is a Python 3.13+ monorepo for managing and orchestrating MeshCore mesh networks. It consists of five main components:
- **meshcore_interface**: Serial/USB interface to MeshCore companion nodes, publishes/subscribes to MQTT
- **meshcore_collector**: Collects MeshCore events from MQTT and stores them in a database
@@ -37,7 +37,7 @@ MeshCore Hub is a Python 3.11+ monorepo for managing and orchestrating MeshCore
| Category | Technology |
|----------|------------|
| Language | Python 3.11+ |
| Language | Python 3.13+ |
| Package Management | pip with pyproject.toml |
| CLI Framework | Click |
| Configuration | Pydantic Settings |
@@ -114,17 +114,15 @@ class NodeRead(BaseModel):
### SQLAlchemy Models
```python
from sqlalchemy import String, DateTime, ForeignKey
from sqlalchemy import String, DateTime, Text
from sqlalchemy.orm import Mapped, mapped_column, relationship
from datetime import datetime
from uuid import uuid4
from typing import Optional
from meshcore_hub.common.models.base import Base
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin
class Node(Base):
class Node(Base, UUIDMixin, TimestampMixin):
__tablename__ = "nodes"
id: Mapped[str] = mapped_column(String(36), primary_key=True, default=lambda: str(uuid4()))
public_key: Mapped[str] = mapped_column(String(64), unique=True, index=True)
name: Mapped[str | None] = mapped_column(String(255), nullable=True)
adv_type: Mapped[str | None] = mapped_column(String(20), nullable=True)
@@ -132,6 +130,18 @@ class Node(Base):
# Relationships
tags: Mapped[list["NodeTag"]] = relationship(back_populates="node", cascade="all, delete-orphan")
class Member(Base, UUIDMixin, TimestampMixin):
"""Network member model - stores info about network operators."""
__tablename__ = "members"
name: Mapped[str] = mapped_column(String(255), nullable=False)
callsign: Mapped[Optional[str]] = mapped_column(String(20), nullable=True)
role: Mapped[Optional[str]] = mapped_column(String(100), nullable=True)
description: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
contact: Mapped[Optional[str]] = mapped_column(String(255), nullable=True)
public_key: Mapped[Optional[str]] = mapped_column(String(64), nullable=True, index=True)
```
### FastAPI Routes
@@ -239,7 +249,12 @@ meshcore-hub/
│ │ ├── mqtt.py # MQTT utilities
│ │ ├── logging.py # Logging config
│ │ ├── models/ # SQLAlchemy models
│ │ │ ├── node.py # Node model
│ │ │ ├── member.py # Network member model
│ │ │ └── ...
│ │ └── schemas/ # Pydantic schemas
│ │ ├── members.py # Member API schemas
│ │ └── ...
│ ├── interface/
│ │ ├── cli.py
│ │ ├── device.py # MeshCore device wrapper
@@ -247,9 +262,11 @@ meshcore-hub/
│ │ ├── receiver.py # RECEIVER mode
│ │ └── sender.py # SENDER mode
│ ├── collector/
│ │ ├── cli.py
│ │ ├── cli.py # Collector CLI with seed commands
│ │ ├── subscriber.py # MQTT subscriber
│ │ ├── tag_import.py # Tag import from JSON
│ │ ├── cleanup.py # Data retention/cleanup service
│ │ ├── tag_import.py # Tag import from YAML
│ │ ├── member_import.py # Member import from YAML
│ │ ├── handlers/ # Event handlers
│ │ └── webhook.py # Webhook dispatcher
│ ├── api/
@@ -257,12 +274,15 @@ meshcore-hub/
│ │ ├── app.py # FastAPI app
│ │ ├── auth.py # Authentication
│ │ ├── dependencies.py
│ │ ├── routes/ # API routes
│ │ └── templates/ # Dashboard HTML
│ │ ├── routes/ # API routes
│ │ │   ├── members.py # Member CRUD endpoints
│ │ └── ...
│ └── web/
│ ├── cli.py
│ ├── app.py # FastAPI app
│ ├── routes/ # Page routes
│ │ ├── members.py # Members page
│ │ └── ...
│ ├── templates/ # Jinja2 templates
│ └── static/ # CSS, JS
├── tests/
@@ -278,20 +298,17 @@ meshcore-hub/
├── etc/
│ └── mosquitto.conf # MQTT broker configuration
├── example/
│ └── data/
│     ├── collector/
│     │   └── tags.json        # Example node tags data
│     └── web/
│         └── members.json     # Example network members data
│ └── seed/                    # Example seed data files
│     ├── node_tags.yaml       # Example node tags
│     └── members.yaml         # Example network members
├── seed/                      # Seed data directory (SEED_HOME)
│   ├── node_tags.yaml         # Node tags for import
│   └── members.yaml           # Network members for import
├── data/                      # Runtime data (gitignored, DATA_HOME default)
│   ├── collector/             # Collector data
│   │   ├── meshcore.db        # SQLite database
│   │   └── tags.json          # Node tags for import
│   └── web/                   # Web data
│       └── members.json       # Network members list
│   └── collector/             # Collector data
│       └── meshcore.db        # SQLite database
├── Dockerfile                 # Docker build configuration
├── docker-compose.yml         # Docker Compose services (gitignored)
└── docker-compose.yml.example # Docker Compose template
└── docker-compose.yml         # Docker Compose services
```
## MQTT Topic Structure
@@ -436,26 +453,52 @@ meshcore-hub interface --mode receiver --mock
See [PLAN.md](PLAN.md#configuration-environment-variables) for complete list.
Key variables:
- `DATA_HOME` - Base directory for all service data (default: `./data`)
- `DATA_HOME` - Base directory for runtime data (default: `./data`)
- `SEED_HOME` - Directory containing seed data files (default: `./seed`)
- `MQTT_HOST`, `MQTT_PORT`, `MQTT_PREFIX` - MQTT broker connection
- `DATABASE_URL` - SQLAlchemy database URL (default: `sqlite:///{DATA_HOME}/collector/meshcore.db`)
- `MQTT_TLS` - Enable TLS/SSL for MQTT (default: `false`)
- `API_READ_KEY`, `API_ADMIN_KEY` - API authentication keys
- `WEB_ADMIN_ENABLED` - Enable admin interface at /a/ (default: `false`, requires auth proxy)
- `LOG_LEVEL` - Logging verbosity
### Data Directory Structure
The database defaults to `sqlite:///{DATA_HOME}/collector/meshcore.db` and does not typically need to be configured.
The `DATA_HOME` environment variable controls where all service data is stored:
### Directory Structure
**Seed Data (`SEED_HOME`)** - Contains initial data files for database seeding:
```
${SEED_HOME}/
├── node_tags.yaml # Node tags (keyed by public_key)
└── members.yaml # Network members list
```
**Runtime Data (`DATA_HOME`)** - Contains runtime data (gitignored):
```
${DATA_HOME}/
├── collector/
│   ├── meshcore.db    # SQLite database
│   └── tags.json      # Node tags for import
└── web/
    └── members.json   # Network members list
└── collector/
    └── meshcore.db    # SQLite database
```
Services automatically create their subdirectories if they don't exist.
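That automatic subdirectory creation (also the subject of the SQLite-path fix merged in this range) reduces to an idempotent `mkdir` before the database is opened — a minimal sketch, with the helper name and paths assumed from the structure above:

```python
import tempfile
from pathlib import Path


def ensure_sqlite_path(data_home: str) -> Path:
    """Create the collector subdirectory (and parents) before opening the DB."""
    db_path = Path(data_home) / "collector" / "meshcore.db"
    db_path.parent.mkdir(parents=True, exist_ok=True)  # no-op if already present
    return db_path


# Demonstrate against a throwaway directory rather than a real DATA_HOME.
with tempfile.TemporaryDirectory() as home:
    db = ensure_sqlite_path(home)
    print(db.parent.exists())  # True
```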
### Seeding
The database can be seeded with node tags and network members from YAML files in `SEED_HOME`:
- `node_tags.yaml` - Node tag definitions (keyed by public_key)
- `members.yaml` - Network member definitions
**Important:** Seeding is NOT automatic and must be run explicitly. This prevents seed files from overwriting user changes made via the admin UI.
```bash
# Native CLI
meshcore-hub collector seed
# With Docker Compose
docker compose --profile seed up
```
**Note:** Once the admin UI is enabled (`WEB_ADMIN_ENABLED=true`), tags should be managed through the web interface rather than seed files.
### Webhook Configuration
The collector supports forwarding events to external HTTP endpoints:
@@ -472,6 +515,64 @@ The collector supports forwarding events to external HTTP endpoints:
| `WEBHOOK_MAX_RETRIES` | Max retries on failure (default: 3) |
| `WEBHOOK_RETRY_BACKOFF` | Exponential backoff multiplier (default: 2.0) |
### Data Retention / Cleanup Configuration
The collector supports automatic cleanup of old event data and inactive nodes:
**Event Data Cleanup:**
| Variable | Description |
|----------|-------------|
| `DATA_RETENTION_ENABLED` | Enable automatic event data cleanup (default: true) |
| `DATA_RETENTION_DAYS` | Days to retain event data (default: 30) |
| `DATA_RETENTION_INTERVAL_HOURS` | Hours between cleanup runs (default: 24) |
When enabled, the collector automatically deletes event data older than the retention period:
- Advertisements
- Messages (channel and direct)
- Telemetry
- Trace paths
- Event logs
**Node Cleanup:**
| Variable | Description |
|----------|-------------|
| `NODE_CLEANUP_ENABLED` | Enable automatic cleanup of inactive nodes (default: true) |
| `NODE_CLEANUP_DAYS` | Remove nodes not seen for this many days (default: 7) |
When enabled, the collector automatically removes nodes where:
- `last_seen` is older than the configured number of days
- Nodes with `last_seen=NULL` (never seen on network) are **NOT** removed
- Nodes created via tag import that have never been seen on the mesh are preserved
**Note:** Both event data and node cleanup run on the same schedule (DATA_RETENTION_INTERVAL_HOURS).
**Contact Cleanup (Interface RECEIVER):**
The interface RECEIVER mode can automatically remove stale contacts from the MeshCore companion node's contact database. This prevents the companion node from resyncing old/dead contacts back to the collector, freeing up memory on the device (typically limited to ~100 contacts).
| Variable | Description |
|----------|-------------|
| `CONTACT_CLEANUP_ENABLED` | Enable automatic removal of stale contacts (default: true) |
| `CONTACT_CLEANUP_DAYS` | Remove contacts not advertised for this many days (default: 7) |
When enabled, during each contact sync the receiver checks each contact's `last_advert` timestamp:
- Contacts with `last_advert` older than `CONTACT_CLEANUP_DAYS` are removed from the device
- Stale contacts are not published to MQTT (preventing collector database pollution)
- Contacts without a `last_advert` timestamp are preserved (no removal without data)
This cleanup runs automatically whenever the receiver syncs contacts (on startup and after each advertisement event).
Manual cleanup can be triggered at any time with:
```bash
# Dry run to see what would be deleted
meshcore-hub collector cleanup --retention-days 30 --dry-run
# Live cleanup
meshcore-hub collector cleanup --retention-days 30
```
Webhook payload structure:
```json
{
@@ -486,9 +587,13 @@ Webhook payload structure:
### Common Issues
1. **MQTT Connection Failed**: Check broker is running and `MQTT_HOST`/`MQTT_PORT` are correct
2. **Database Migration Errors**: Ensure `DATABASE_URL` is correct, run `alembic upgrade head`
2. **Database Migration Errors**: Ensure `DATA_HOME` is writable, run `meshcore-hub db upgrade`
3. **Import Errors**: Ensure package is installed with `pip install -e .`
4. **Type Errors**: Run `mypy src/` to check type annotations
4. **Type Errors**: Run `pre-commit run --all-files` to check type annotations and other issues
5. **NixOS greenlet errors**: On NixOS, the pre-built greenlet wheel may fail with `libstdc++.so.6` errors. Rebuild from source:
```bash
pip install --no-binary greenlet greenlet
```
### Debugging
@@ -551,6 +656,20 @@ On startup, the receiver performs these initialization steps:
1. Set device clock to current Unix timestamp
2. Send a local (non-flood) advertisement
3. Start automatic message fetching
4. Sync the device's contact database
### Contact Sync Behavior
The receiver syncs the device's contact database in two scenarios:
1. **Startup**: Initial sync when receiver starts
2. **Advertisement Events**: Automatic sync triggered whenever an advertisement is received from the mesh
Since advertisements are typically received every ~20 minutes, contact sync happens automatically without manual intervention. Each contact from the device is published individually to MQTT:
- Topic: `{prefix}/{device_public_key}/event/contact`
- Payload: `{public_key, adv_name, type}`
This ensures the collector's database stays current with all nodes discovered on the mesh network.
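Assembling the per-contact topic and payload described above might look like this — the topic template and field names come from the text, but the helper itself is a sketch and omits any real MQTT client:

```python
import json


def contact_event(prefix: str, device_public_key: str, contact: dict) -> tuple[str, str]:
    """Build the MQTT topic and JSON payload for one synced contact."""
    topic = f"{prefix}/{device_public_key}/event/contact"
    payload = json.dumps(
        {
            "public_key": contact["public_key"],
            "adv_name": contact["adv_name"],
            "type": contact["type"],
        }
    )
    return topic, payload


topic, payload = contact_event(
    "meshcore",
    "ab" * 32,  # placeholder 64-char hex device public key
    {"public_key": "cd" * 32, "adv_name": "node-1", "type": "CLIENT"},
)
```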
## References

View File

@@ -4,7 +4,7 @@
# =============================================================================
# Stage 1: Builder - Install dependencies and build package
# =============================================================================
FROM python:3.11-slim AS builder
FROM python:3.13-slim AS builder
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1 \
@@ -28,14 +28,18 @@ COPY src/ ./src/
COPY alembic/ ./alembic/
COPY alembic.ini ./
# Install the package
RUN pip install --upgrade pip && \
# Build argument for version (set via CI or manually)
ARG BUILD_VERSION=dev
# Set version in _version.py and install the package
RUN sed -i "s|__version__ = \"dev\"|__version__ = \"${BUILD_VERSION}\"|" src/meshcore_hub/_version.py && \
pip install --upgrade pip && \
pip install .
# =============================================================================
# Stage 2: Runtime - Final production image
# =============================================================================
FROM python:3.11-slim AS runtime
FROM python:3.13-slim AS runtime
# Labels
LABEL org.opencontainers.image.title="MeshCore Hub" \

View File

@@ -481,6 +481,7 @@ ${DATA_HOME}/
| INTERFACE_MODE | RECEIVER | RECEIVER or SENDER |
| SERIAL_PORT | /dev/ttyUSB0 | Serial port path |
| SERIAL_BAUD | 115200 | Baud rate |
| MESHCORE_DEVICE_NAME | *(none)* | Device/node name set on startup |
| MOCK_DEVICE | false | Use mock device |
### Collector

486
README.md
View File

@@ -1,6 +1,8 @@
# MeshCore Hub
Python 3.11+ platform for managing and orchestrating MeshCore mesh networks.
Python 3.13+ platform for managing and orchestrating MeshCore mesh networks.
![MeshCore Hub Web Dashboard](docs/images/web.png)
## Overview
@@ -15,41 +17,45 @@ MeshCore Hub provides a complete solution for monitoring, collecting, and intera
## Architecture
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│    MeshCore     │     │    MeshCore     │     │    MeshCore     │
│    Device 1     │     │    Device 2     │     │    Device 3     │
└────────┬────────┘     └────────┬────────┘     └────────┬────────┘
         │                       │                       │
         │ Serial/USB            │ Serial/USB            │ Serial/USB
         │                       │                       │
┌────────▼────────┐     ┌────────▼────────┐     ┌────────▼────────┐
│    Interface    │     │    Interface    │     │    Interface    │
│    (RECEIVER)   │     │    (RECEIVER)   │     │    (SENDER)     │
└────────┬────────┘     └────────┬────────┘     └────────▲────────┘
         │                       │                       │
         │ Publish               │ Publish               │ Subscribe
         │                       │                       │
         └───────────┬───────────┴───────────────────────┘
                     │
              ┌──────▼──────┐
              │    MQTT     │
              │   Broker    │
              └──────┬──────┘
                     │
              ┌──────▼──────┐
              │  Collector  │
              └──────┬──────┘
                     │
              ┌──────▼──────┐
              │  Database   │
              └──────┬──────┘
                     │
         ┌───────────┴───────────┐
         │                       │
  ┌──────▼──────┐        ┌───────▼───────┐
  │     API     │◄───────│ Web Dashboard │
  └─────────────┘        └───────────────┘
```mermaid
flowchart LR
subgraph Devices["MeshCore Devices"]
D1["Device 1"]
D2["Device 2"]
D3["Device 3"]
end
subgraph Interfaces["Interface Layer"]
I1["RECEIVER"]
I2["RECEIVER"]
I3["SENDER"]
end
D1 -->|Serial| I1
D2 -->|Serial| I2
D3 -->|Serial| I3
I1 -->|Publish| MQTT
I2 -->|Publish| MQTT
MQTT -->|Subscribe| I3
MQTT["MQTT Broker"]
subgraph Backend["Backend Services"]
Collector --> Database --> API
end
MQTT --> Collector
API --> Web["Web Dashboard"]
style Devices fill:none,stroke:#0288d1,stroke-width:2px
style Interfaces fill:none,stroke:#f57c00,stroke-width:2px
style Backend fill:none,stroke:#388e3c,stroke-width:2px
style MQTT fill:none,stroke:#7b1fa2,stroke-width:3px
style Collector fill:none,stroke:#388e3c,stroke-width:2px
style Database fill:none,stroke:#c2185b,stroke-width:2px
style API fill:none,stroke:#1976d2,stroke-width:2px
style Web fill:none,stroke:#ffa000,stroke-width:2px
```
## Features
@@ -62,49 +68,141 @@ MeshCore Hub provides a complete solution for monitoring, collecting, and intera
- **Web Dashboard**: Visualize network status, node locations, and message history
- **Docker Ready**: Single image with all components, easy deployment
## Getting Started
### Simple Self-Hosted Setup
The quickest way to get started is running the entire stack on a single machine with a connected MeshCore device.
**Prerequisites:**
1. Flash the [USB Companion firmware](https://meshcore.dev/) onto a compatible device (e.g., Heltec V3, T-Beam)
2. Connect the device via USB to a machine that supports Docker or Python
**Steps:**
```bash
# Clone the repository
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub
# Copy and configure environment
cp .env.example .env
# Edit .env: set SERIAL_PORT to your device (e.g., /dev/ttyUSB0 or /dev/ttyACM0)
# Start the entire stack with local MQTT broker
docker compose --profile mqtt --profile core --profile receiver up -d
# View the web dashboard
open http://localhost:8080
```
This starts all services: MQTT broker, collector, API, web dashboard, and the interface receiver that bridges your MeshCore device to the system.
### Distributed Community Setup
For larger deployments, you can separate receiver nodes from the central infrastructure. This allows multiple community members to contribute receiver coverage while hosting the backend centrally.
```mermaid
flowchart TB
subgraph Community["Community Members"]
R1["Raspberry Pi + MeshCore"]
R2["Raspberry Pi + MeshCore"]
R3["Any Linux + MeshCore"]
end
subgraph Server["Community VPS / Server"]
MQTT["MQTT Broker"]
Collector
API
Web["Web Dashboard (public)"]
MQTT --> Collector --> API
API <--- Web
end
R1 -->|MQTT port 1883| MQTT
R2 -->|MQTT port 1883| MQTT
R3 -->|MQTT port 1883| MQTT
style Community fill:none,stroke:#0288d1,stroke-width:2px
style Server fill:none,stroke:#388e3c,stroke-width:2px
style MQTT fill:none,stroke:#7b1fa2,stroke-width:3px
style Collector fill:none,stroke:#388e3c,stroke-width:2px
style API fill:none,stroke:#1976d2,stroke-width:2px
style Web fill:none,stroke:#ffa000,stroke-width:2px
```
**On each receiver node (Raspberry Pi, etc.):**
```bash
# Only run the receiver component
# Configure .env with MQTT_HOST pointing to your central server
MQTT_HOST=your-community-server.com
SERIAL_PORT=/dev/ttyUSB0
docker compose --profile receiver up -d
```
**On the central server (VPS/cloud):**
```bash
# Run the core infrastructure with local MQTT broker
docker compose --profile mqtt --profile core up -d
# Or connect to an existing MQTT broker (set MQTT_HOST in .env)
docker compose --profile core up -d
```
This architecture allows:
- Multiple receivers for better RF coverage across a geographic area
- Centralized data storage and web interface
- Community members to contribute coverage with minimal setup
- The central server to be hosted anywhere with internet access
## Quick Start
### Using Docker Compose (Recommended)
Docker Compose supports **profiles** to selectively enable/disable components:
Docker Compose uses **profiles** to select which services to run:
| Profile | Services |
|---------|----------|
| `mqtt` | Eclipse Mosquitto MQTT broker |
| `interface-receiver` | MeshCore device receiver (events to MQTT) |
| `interface-sender` | MeshCore device sender (MQTT to device) |
| `collector` | MQTT subscriber + database storage |
| `api` | REST API server |
| `web` | Web dashboard |
| `mock` | All services with mock device (for testing) |
| `all` | All production services |
| Profile | Services | Use Case |
|---------|----------|----------|
| `core` | collector, api, web | Central server infrastructure |
| `receiver` | interface-receiver | Receiver node (events to MQTT) |
| `sender` | interface-sender | Sender node (MQTT to device) |
| `mqtt` | mosquitto broker | Local MQTT broker (optional) |
| `mock` | interface-mock-receiver | Testing without hardware |
| `migrate` | db-migrate | One-time database migration |
| `seed` | seed | One-time seed data import |
**Note:** Most deployments connect to an external MQTT broker. Add `--profile mqtt` only if you need a local broker.
```bash
# Clone the repository
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub

# Copy and configure environment
cp .env.example .env
# Edit .env with your settings (API keys, serial port, network info)

# Create database schema
docker compose --profile migrate run --rm db-migrate

# Seed the database
docker compose --profile seed run --rm seed

# Start core services with local MQTT broker
docker compose --profile mqtt --profile core up -d

# Or connect to external MQTT (configure MQTT_HOST in .env)
docker compose --profile core up -d

# Start just the receiver (connects to MQTT_HOST from .env)
docker compose --profile receiver up -d

# View logs
docker compose logs -f

# Stop services
docker compose down
```
#### Serial Device Access
```bash
meshcore-hub api
meshcore-hub web
```
## Updating an Existing Installation
To update MeshCore Hub to the latest version:
```bash
# Navigate to your installation directory
cd meshcore-hub
# Pull the latest code
git pull
# Pull latest Docker images
docker compose --profile all pull
# Recreate and restart services
# For receiver/sender only installs:
docker compose --profile receiver up -d --force-recreate
# For core services with MQTT:
docker compose --profile mqtt --profile core up -d --force-recreate
# For core services without local MQTT:
docker compose --profile core up -d --force-recreate
# For complete stack (all services):
docker compose --profile mqtt --profile core --profile receiver up -d --force-recreate
# View logs to verify update
docker compose logs -f
```
**Note:** Database migrations run automatically on collector startup, so no manual migration step is needed when using Docker.
For manual installations:
```bash
# Pull latest code
git pull
# Activate virtual environment
source .venv/bin/activate
# Update dependencies
pip install -e ".[dev]"
# Run database migrations
meshcore-hub db upgrade
# Restart your services
```
## Configuration
All components are configured via environment variables. Create a `.env` file or export variables:
| Variable | Default | Description |
|----------|---------|-------------|
| `LOG_LEVEL` | `INFO` | Logging level (DEBUG, INFO, WARNING, ERROR) |
| `DATA_HOME` | `./data` | Base directory for runtime data |
| `SEED_HOME` | `./seed` | Directory containing seed data files |
| `MQTT_HOST` | `localhost` | MQTT broker hostname |
| `MQTT_PORT` | `1883` | MQTT broker port |
| `MQTT_USERNAME` | *(none)* | MQTT username (optional) |
| `MQTT_PASSWORD` | *(none)* | MQTT password (optional) |
| `MQTT_PREFIX` | `meshcore` | Topic prefix for all MQTT messages |
| `MQTT_TLS` | `false` | Enable TLS/SSL for MQTT connection |
### Interface Settings
| Variable | Default | Description |
|----------|---------|-------------|
| `INTERFACE_MODE` | `RECEIVER` | Operating mode (RECEIVER or SENDER) |
| `SERIAL_PORT` | `/dev/ttyUSB0` | Serial port for MeshCore device |
| `SERIAL_BAUD` | `115200` | Serial baud rate |
| `MOCK_DEVICE` | `false` | Use mock device for testing |
| `MESHCORE_DEVICE_NAME` | *(none)* | Device/node name set on startup (broadcast in advertisements) |
### Collector Settings
| Variable | Default | Description |
|----------|---------|-------------|
| `DATABASE_URL` | `sqlite:///./meshcore.db` | SQLAlchemy database URL |
The database is stored in `{DATA_HOME}/collector/meshcore.db` by default.
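Because the database file lives in a subdirectory of `DATA_HOME`, the parent directory must exist before SQLite can create the file. A minimal sketch of that guard (the function name is illustrative, not the project's actual API):

```python
from pathlib import Path


def ensure_sqlite_parent(database_url: str) -> None:
    """Create parent directories for a sqlite:/// URL before first use."""
    prefix = "sqlite:///"
    if database_url.startswith(prefix):
        db_path = Path(database_url[len(prefix):])
        # mkdir is a no-op when the directory already exists
        db_path.parent.mkdir(parents=True, exist_ok=True)


# e.g. the default layout described above
ensure_sqlite_parent("sqlite:///./data/collector/meshcore.db")
```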
#### Webhook Configuration
The collector can forward certain events to external HTTP endpoints:
| Variable | Default | Description |
|----------|---------|-------------|
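A webhook endpoint is just an HTTP server accepting JSON POSTs from the collector. A minimal stdlib receiver sketch; the payload field names (`event_type`, `timestamp`) are illustrative assumptions, not the documented schema:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def summarize(payload: dict) -> str:
    """One-line summary of a forwarded event (field names are assumptions)."""
    return f"{payload.get('event_type', 'unknown')} @ {payload.get('timestamp', '?')}"


class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        print(summarize(payload))
        self.send_response(204)  # acknowledge with no body
        self.end_headers()


# To run: HTTPServer(("127.0.0.1", 8099), WebhookHandler).serve_forever()
```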
#### Data Retention
The collector automatically cleans up old event data and inactive nodes:
| Variable | Default | Description |
|----------|---------|-------------|
| `DATA_RETENTION_ENABLED` | `true` | Enable automatic cleanup of old events |
| `DATA_RETENTION_DAYS` | `30` | Days to retain event data |
| `DATA_RETENTION_INTERVAL_HOURS` | `24` | Hours between cleanup runs |
| `NODE_CLEANUP_ENABLED` | `true` | Enable removal of inactive nodes |
| `NODE_CLEANUP_DAYS` | `7` | Remove nodes not seen for this many days |
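The cleanup cycle reduces to computing a cutoff timestamp and deleting rows older than it. A sketch of the cutoff arithmetic (illustrative, not the collector's actual code):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional


def retention_cutoff(days: int = 30, now: Optional[datetime] = None) -> datetime:
    """Events older than this instant are eligible for cleanup."""
    now = now or datetime.now(timezone.utc)
    return now - timedelta(days=days)


# e.g. with DATA_RETENTION_DAYS=30, anything before `cutoff` is deleted
cutoff = retention_cutoff(30)
```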
### API Settings
| Variable | Default | Description |
|----------|---------|-------------|

### Web Settings

| Variable | Default | Description |
|----------|---------|-------------|
| `WEB_HOST` | `0.0.0.0` | Web server bind address |
| `WEB_PORT` | `8080` | Web server port |
| `API_BASE_URL` | `http://localhost:8000` | API endpoint URL |
| `WEB_ADMIN_ENABLED` | `false` | Enable admin interface at /a/ (requires auth proxy) |
| `NETWORK_NAME` | `MeshCore Network` | Display name for the network |
| `NETWORK_CITY` | *(none)* | City where network is located |
| `NETWORK_COUNTRY` | *(none)* | Country code (ISO 3166-1 alpha-2) |
| `NETWORK_LOCATION` | *(none)* | Center coordinates (lat,lon) |
| `NETWORK_RADIO_CONFIG` | *(none)* | Radio config (comma-delimited: profile,freq,bw,sf,cr,power) |
| `NETWORK_WELCOME_TEXT` | *(none)* | Custom welcome text for homepage |
| `NETWORK_CONTACT_EMAIL` | *(none)* | Contact email address |
| `NETWORK_CONTACT_DISCORD` | *(none)* | Discord server link |
| `NETWORK_CONTACT_GITHUB` | *(none)* | GitHub repository URL |
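As an example of consuming these values, `NETWORK_RADIO_CONFIG` can be split into its six positional fields. A hedged sketch: only the field order comes from the table above; the types, unit names, and the sample string are assumptions:

```python
def parse_radio_config(raw: str) -> dict:
    """Split NETWORK_RADIO_CONFIG (profile,freq,bw,sf,cr,power) into fields."""
    profile, freq, bw, sf, cr, power = raw.split(",")
    return {
        "profile": profile,
        "frequency_mhz": float(freq),       # unit name is an assumption
        "bandwidth_khz": float(bw),         # unit name is an assumption
        "spreading_factor": int(sf),
        "coding_rate": cr,                  # e.g. "4/5", kept as a string
        "tx_power_dbm": int(power),         # unit name is an assumption
    }
```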
## CLI Reference
```bash
meshcore-hub --help
# Interface component
meshcore-hub interface --mode receiver --port /dev/ttyUSB0
meshcore-hub interface --mode receiver --device-name "Gateway Node" # Set device name
meshcore-hub interface --mode sender --mock # Use mock device
# Collector component
meshcore-hub collector # Run collector
meshcore-hub collector seed # Import all seed data from SEED_HOME
meshcore-hub collector import-tags # Import node tags from SEED_HOME/node_tags.yaml
meshcore-hub collector import-tags /path/to/file.yaml # Import from specific file
meshcore-hub collector import-members # Import members from SEED_HOME/members.yaml
meshcore-hub collector import-members /path/to/file.yaml # Import from specific file
# API component
meshcore-hub api --host 0.0.0.0 --port 8000
meshcore-hub db downgrade # Rollback one migration
meshcore-hub db current # Show current revision
```
## Seed Data
The database can be seeded with node tags and network members from YAML files in the `SEED_HOME` directory (default: `./seed`).
### Running the Seed Process
Seeding is a separate process and must be run explicitly:
```bash
# Native CLI
meshcore-hub collector seed
# With Docker Compose
docker compose --profile seed up
```
This imports data from the following files (if they exist):
- `{SEED_HOME}/node_tags.yaml` - Node tag definitions
- `{SEED_HOME}/members.yaml` - Network member definitions
### Directory Structure
```
seed/ # SEED_HOME (seed data files)
├── node_tags.yaml # Node tags for import
└── members.yaml # Network members for import
data/ # DATA_HOME (runtime data)
└── collector/
└── meshcore.db # SQLite database
```
Example seed files are provided in `example/seed/`.
## Node Tags
Node tags allow you to attach custom metadata to nodes (e.g., location, role, owner). Tags are stored in the database and returned with node data via the API.
### Node Tags YAML Format
Tags are keyed by public key in YAML format:
```yaml
# Each key is a 64-character hex public key
0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef:
  friendly_name: Gateway Node
  role: gateway
  lat: 37.7749
  lon: -122.4194
  is_online: true
fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210:
  friendly_name: Oakland Repeater
  altitude: 150
```
Tag values can be:
- **YAML primitives** (auto-detected type): strings, numbers, booleans
- **Explicit type** (when you need to force a specific type):
```yaml
altitude:
  value: "150"
  type: number
```
Supported types: `string`, `number`, `boolean`
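An importer handling both value forms just needs to branch on the mapping shape. A sketch assuming the semantics described above; the collector's importer is the real implementation:

```python
from typing import Any, Union


def coerce_tag_value(raw: Union[dict, str, int, float, bool]) -> Any:
    """Resolve a tag value: primitives pass through, {value, type} is coerced."""
    if isinstance(raw, dict):
        value, declared = raw.get("value"), raw.get("type", "string")
        if declared == "number":
            return float(value)
        if declared == "boolean":
            return str(value).lower() in ("true", "1", "yes")
        return str(value)
    return raw  # YAML primitive: keep the auto-detected type
```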
### Import Tags Manually
```bash
# Import from default location ({SEED_HOME}/node_tags.yaml)
meshcore-hub collector import-tags
# Import from specific file
meshcore-hub collector import-tags /path/to/node_tags.yaml
# Skip tags for nodes that don't exist
meshcore-hub collector import-tags --no-create-nodes
```
## Network Members
Network members represent the people operating nodes in your network. Members can optionally be linked to nodes via their public key.
### Members YAML Format
```yaml
- member_id: walshie86
  name: Walshie
  callsign: Walshie86
  role: member
  description: IPNet Member
- member_id: craig
  name: Craig
  callsign: M7XCN
  role: member
  description: IPNet Member
```
| Field | Required | Description |
|-------|----------|-------------|
| `member_id` | Yes | Unique identifier for the member |
| `name` | Yes | Member's display name |
| `callsign` | No | Amateur radio callsign |
| `role` | No | Member's role in the network |
| `description` | No | Additional description |
| `contact` | No | Contact information |
| `public_key` | No | Associated node public key (64-char hex) |
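A quick way to sanity-check a `members.yaml` entry before import is to validate it against the field table above. A sketch operating on already-parsed data (e.g. from `yaml.safe_load`); the hex check for `public_key` is an assumption consistent with the 64-char hex note:

```python
REQUIRED = ("member_id", "name")
OPTIONAL = ("callsign", "role", "description", "contact", "public_key")


def validate_member(entry: dict) -> list:
    """Return a list of problems for one parsed members.yaml entry."""
    problems = [f"missing required field: {f}" for f in REQUIRED if not entry.get(f)]
    pk = entry.get("public_key")
    if pk is not None and (
        len(pk) != 64 or any(c not in "0123456789abcdef" for c in pk.lower())
    ):
        problems.append("public_key must be 64 hex characters")
    unknown = set(entry) - set(REQUIRED) - set(OPTIONAL)
    problems.extend(f"unknown field: {f}" for f in sorted(unknown))
    return problems
```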
### Import Members Manually
```bash
# Import from default location ({SEED_HOME}/members.yaml)
meshcore-hub collector import-members
# Import from specific file
meshcore-hub collector import-members /path/to/members.yaml
```
### Managing Tags via API
Tags can also be managed via the REST API:
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/v1/trace-paths` | List trace paths |
| POST | `/api/v1/commands/send-message` | Send direct message |
| POST | `/api/v1/commands/send-channel-message` | Send channel message |
| GET | `/api/v1/dashboard/stats` | Get network statistics |
## Development
```bash
# Clone and setup
git clone https://github.com/ipnet-mesh/meshcore-hub.git
cd meshcore-hub
python -m venv .venv
source .venv/bin/activate
```
### Code Quality
```bash
# Run all code quality checks (formatting, linting, type checking)
pre-commit run --all-files
```
### Creating Database Migrations
```
meshcore-hub/
├── alembic/ # Database migrations
├── etc/ # Configuration files (mosquitto.conf)
├── example/ # Example files for testing
│   └── seed/                # Example seed data files
│       ├── node_tags.yaml   # Example node tags
│       └── members.yaml     # Example network members
├── seed/                    # Seed data directory (SEED_HOME, copy from example/seed/)
├── data/                    # Runtime data directory (DATA_HOME, created at runtime)
├── Dockerfile               # Docker build configuration
├── docker-compose.yml       # Docker Compose services
├── PROMPT.md # Project specification
├── SCHEMAS.md # Event schema documentation
├── PLAN.md # Implementation plan
```

## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes
4. Run tests and quality checks (`pytest && pre-commit run --all-files`)
5. Commit your changes (`git commit -m 'Add amazing feature'`)
6. Push to the branch (`git push origin feature/amazing-feature`)
7. Open a Pull Request
## License
This project is licensed under the GNU General Public License v3.0 or later (GPL-3.0-or-later). See [LICENSE](LICENSE) for details.
## Acknowledgments

"""Add member_nodes association table
Revision ID: 002
Revises: 001
Create Date: 2024-12-05
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "002"
down_revision: Union[str, None] = "001"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Create member_nodes table
op.create_table(
"member_nodes",
sa.Column("id", sa.String(), nullable=False),
sa.Column("member_id", sa.String(36), nullable=False),
sa.Column("public_key", sa.String(64), nullable=False),
sa.Column("node_role", sa.String(50), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.ForeignKeyConstraint(["member_id"], ["members.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
)
op.create_index("ix_member_nodes_member_id", "member_nodes", ["member_id"])
op.create_index("ix_member_nodes_public_key", "member_nodes", ["public_key"])
op.create_index(
"ix_member_nodes_member_public_key",
"member_nodes",
["member_id", "public_key"],
)
# Migrate existing public_key data from members to member_nodes
# Get all members with a public_key
connection = op.get_bind()
members_with_keys = connection.execute(
sa.text("SELECT id, public_key FROM members WHERE public_key IS NOT NULL")
).fetchall()
    import uuid

    # Insert into member_nodes
    for member_id, public_key in members_with_keys:
        # Generate a UUID for the new row
        node_id = str(uuid.uuid4())
connection.execute(
sa.text(
"""
INSERT INTO member_nodes (id, member_id, public_key, created_at, updated_at)
VALUES (:id, :member_id, :public_key, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP)
"""
),
{"id": node_id, "member_id": member_id, "public_key": public_key},
)
# Drop the public_key column from members
op.drop_index("ix_members_public_key", table_name="members")
op.drop_column("members", "public_key")
def downgrade() -> None:
# Add public_key column back to members
op.add_column(
"members",
sa.Column("public_key", sa.String(64), nullable=True),
)
op.create_index("ix_members_public_key", "members", ["public_key"])
# Migrate data back - take the first node for each member
connection = op.get_bind()
member_nodes = connection.execute(
sa.text(
"""
SELECT DISTINCT member_id, public_key
FROM member_nodes
WHERE (member_id, created_at) IN (
SELECT member_id, MIN(created_at)
FROM member_nodes
GROUP BY member_id
)
"""
)
).fetchall()
for member_id, public_key in member_nodes:
connection.execute(
sa.text(
"UPDATE members SET public_key = :public_key WHERE id = :member_id"
),
{"public_key": public_key, "member_id": member_id},
)
# Drop member_nodes table
op.drop_table("member_nodes")

"""Add event_hash column to event tables for deduplication
Revision ID: 003
Revises: 002
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "003"
down_revision: Union[str, None] = "002"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Add event_hash column to messages table
op.add_column(
"messages",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_messages_event_hash", "messages", ["event_hash"])
# Add event_hash column to advertisements table
op.add_column(
"advertisements",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_advertisements_event_hash", "advertisements", ["event_hash"])
# Add event_hash column to trace_paths table
op.add_column(
"trace_paths",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_trace_paths_event_hash", "trace_paths", ["event_hash"])
# Add event_hash column to telemetry table
op.add_column(
"telemetry",
sa.Column("event_hash", sa.String(32), nullable=True),
)
op.create_index("ix_telemetry_event_hash", "telemetry", ["event_hash"])
def downgrade() -> None:
# Remove event_hash from telemetry
op.drop_index("ix_telemetry_event_hash", table_name="telemetry")
op.drop_column("telemetry", "event_hash")
# Remove event_hash from trace_paths
op.drop_index("ix_trace_paths_event_hash", table_name="trace_paths")
op.drop_column("trace_paths", "event_hash")
# Remove event_hash from advertisements
op.drop_index("ix_advertisements_event_hash", table_name="advertisements")
op.drop_column("advertisements", "event_hash")
# Remove event_hash from messages
op.drop_index("ix_messages_event_hash", table_name="messages")
op.drop_column("messages", "event_hash")
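The `event_hash` column is `String(32)`, which matches the length of an MD5 hex digest over a canonical event encoding. The real hashing lives in the collector, so this is a plausible sketch, not the project's code:

```python
import hashlib
import json


def event_hash(event: dict) -> str:
    """32-char hex digest over canonical JSON (sketch; field selection may differ).

    Sorting keys makes the hash independent of dict insertion order, so the
    same event always hashes identically regardless of producer.
    """
    canonical = json.dumps(event, sort_keys=True, separators=(",", ":"))
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()
```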

"""Add event_receivers junction table for multi-receiver tracking
Revision ID: 004
Revises: 003
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "004"
down_revision: Union[str, None] = "003"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_table(
"event_receivers",
sa.Column("id", sa.String(36), primary_key=True),
sa.Column("event_type", sa.String(20), nullable=False),
sa.Column("event_hash", sa.String(32), nullable=False),
sa.Column(
"receiver_node_id",
sa.String(36),
sa.ForeignKey("nodes.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("snr", sa.Float, nullable=True),
sa.Column("received_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
sa.UniqueConstraint(
"event_hash", "receiver_node_id", name="uq_event_receivers_hash_node"
),
)
op.create_index(
"ix_event_receivers_event_hash",
"event_receivers",
["event_hash"],
)
op.create_index(
"ix_event_receivers_receiver_node_id",
"event_receivers",
["receiver_node_id"],
)
op.create_index(
"ix_event_receivers_type_hash",
"event_receivers",
["event_type", "event_hash"],
)
def downgrade() -> None:
op.drop_index("ix_event_receivers_type_hash", table_name="event_receivers")
op.drop_index("ix_event_receivers_receiver_node_id", table_name="event_receivers")
op.drop_index("ix_event_receivers_event_hash", table_name="event_receivers")
op.drop_table("event_receivers")

"""Make event_hash columns unique for race condition prevention
Revision ID: 005
Revises: 004
Create Date: 2024-12-06
"""
from typing import Sequence, Union
from alembic import op
from sqlalchemy import inspect
# revision identifiers, used by Alembic.
revision: str = "005"
down_revision: Union[str, None] = "004"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def _index_exists(table_name: str, index_name: str) -> bool:
"""Check if an index exists on a table."""
bind = op.get_bind()
inspector = inspect(bind)
indexes = inspector.get_indexes(table_name)
return any(idx["name"] == index_name for idx in indexes)
def _has_unique_on_column(table_name: str, column_name: str) -> bool:
"""Check if a unique constraint or unique index exists on a column."""
bind = op.get_bind()
inspector = inspect(bind)
# Check unique constraints
uniques = inspector.get_unique_constraints(table_name)
for uq in uniques:
if column_name in uq.get("column_names", []):
return True
# Also check indexes (SQLite may create unique index instead of constraint)
indexes = inspector.get_indexes(table_name)
for idx in indexes:
if idx.get("unique") and column_name in idx.get("column_names", []):
return True
return False
def upgrade() -> None:
# Convert non-unique indexes to unique indexes for race condition prevention
# Note: SQLite handles NULL values as unique (each NULL is distinct)
# SQLite doesn't support ALTER TABLE ADD CONSTRAINT, so we use unique indexes
# Messages
if _index_exists("messages", "ix_messages_event_hash"):
op.drop_index("ix_messages_event_hash", table_name="messages")
if not _has_unique_on_column("messages", "event_hash"):
op.create_index(
"ix_messages_event_hash_unique",
"messages",
["event_hash"],
unique=True,
)
# Advertisements
if _index_exists("advertisements", "ix_advertisements_event_hash"):
op.drop_index("ix_advertisements_event_hash", table_name="advertisements")
if not _has_unique_on_column("advertisements", "event_hash"):
op.create_index(
"ix_advertisements_event_hash_unique",
"advertisements",
["event_hash"],
unique=True,
)
# Trace paths
if _index_exists("trace_paths", "ix_trace_paths_event_hash"):
op.drop_index("ix_trace_paths_event_hash", table_name="trace_paths")
if not _has_unique_on_column("trace_paths", "event_hash"):
op.create_index(
"ix_trace_paths_event_hash_unique",
"trace_paths",
["event_hash"],
unique=True,
)
# Telemetry
if _index_exists("telemetry", "ix_telemetry_event_hash"):
op.drop_index("ix_telemetry_event_hash", table_name="telemetry")
if not _has_unique_on_column("telemetry", "event_hash"):
op.create_index(
"ix_telemetry_event_hash_unique",
"telemetry",
["event_hash"],
unique=True,
)
def downgrade() -> None:
# Restore non-unique indexes
# Telemetry
if _index_exists("telemetry", "ix_telemetry_event_hash_unique"):
op.drop_index("ix_telemetry_event_hash_unique", table_name="telemetry")
if not _index_exists("telemetry", "ix_telemetry_event_hash"):
op.create_index("ix_telemetry_event_hash", "telemetry", ["event_hash"])
# Trace paths
if _index_exists("trace_paths", "ix_trace_paths_event_hash_unique"):
op.drop_index("ix_trace_paths_event_hash_unique", table_name="trace_paths")
if not _index_exists("trace_paths", "ix_trace_paths_event_hash"):
op.create_index("ix_trace_paths_event_hash", "trace_paths", ["event_hash"])
# Advertisements
if _index_exists("advertisements", "ix_advertisements_event_hash_unique"):
op.drop_index(
"ix_advertisements_event_hash_unique", table_name="advertisements"
)
if not _index_exists("advertisements", "ix_advertisements_event_hash"):
op.create_index(
"ix_advertisements_event_hash", "advertisements", ["event_hash"]
)
# Messages
if _index_exists("messages", "ix_messages_event_hash_unique"):
op.drop_index("ix_messages_event_hash_unique", table_name="messages")
if not _index_exists("messages", "ix_messages_event_hash"):
op.create_index("ix_messages_event_hash", "messages", ["event_hash"])
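With the unique indexes in place, concurrent writers can delegate deduplication to the database rather than check-then-insert. A self-contained demonstration against an in-memory SQLite database (table shape simplified):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (id TEXT PRIMARY KEY, event_hash TEXT)")
conn.execute(
    "CREATE UNIQUE INDEX ix_messages_event_hash_unique ON messages (event_hash)"
)

# Two writers racing to store the same event: the second insert is a no-op
conn.execute("INSERT OR IGNORE INTO messages VALUES ('a', 'deadbeef')")
conn.execute("INSERT OR IGNORE INTO messages VALUES ('b', 'deadbeef')")

count = conn.execute("SELECT COUNT(*) FROM messages").fetchone()[0]
# the unique index absorbed the duplicate, so count is 1
```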

"""Make Node.last_seen nullable
Revision ID: 0b944542ccd8
Revises: 005
Create Date: 2025-12-08 00:07:49.891245+00:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "0b944542ccd8"
down_revision: Union[str, None] = "005"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
# Make Node.last_seen nullable since nodes from contact sync
# haven't actually been "seen" on the mesh yet
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.alter_column("last_seen", existing_type=sa.DATETIME(), nullable=True)
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
# Revert Node.last_seen to non-nullable
# Note: This will fail if there are NULL values in last_seen
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.alter_column("last_seen", existing_type=sa.DATETIME(), nullable=False)
# ### end Alembic commands ###

"""Add member_id field to members table
Revision ID: 03b9b2451bd9
Revises: 0b944542ccd8
Create Date: 2025-12-08 14:34:30.337799+00:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "03b9b2451bd9"
down_revision: Union[str, None] = "0b944542ccd8"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("advertisements", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_advertisements_event_hash_unique"))
batch_op.create_unique_constraint(
"uq_advertisements_event_hash", ["event_hash"]
)
with op.batch_alter_table("members", schema=None) as batch_op:
# Add member_id as nullable first to handle existing data
batch_op.add_column(
sa.Column("member_id", sa.String(length=100), nullable=True)
)
# Generate member_id for existing members based on their name
# Convert name to lowercase and replace spaces with underscores
connection = op.get_bind()
connection.execute(
sa.text(
"UPDATE members SET member_id = LOWER(REPLACE(name, ' ', '_')) WHERE member_id IS NULL"
)
)
with op.batch_alter_table("members", schema=None) as batch_op:
# Now make it non-nullable and add unique index
batch_op.alter_column("member_id", nullable=False)
batch_op.drop_index(batch_op.f("ix_members_name"))
batch_op.create_index(
batch_op.f("ix_members_member_id"), ["member_id"], unique=True
)
with op.batch_alter_table("messages", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_messages_event_hash_unique"))
batch_op.create_unique_constraint("uq_messages_event_hash", ["event_hash"])
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_nodes_public_key"))
batch_op.create_index(
batch_op.f("ix_nodes_public_key"), ["public_key"], unique=True
)
with op.batch_alter_table("telemetry", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_telemetry_event_hash_unique"))
batch_op.create_unique_constraint("uq_telemetry_event_hash", ["event_hash"])
with op.batch_alter_table("trace_paths", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_trace_paths_event_hash_unique"))
batch_op.create_unique_constraint("uq_trace_paths_event_hash", ["event_hash"])
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("trace_paths", schema=None) as batch_op:
batch_op.drop_constraint("uq_trace_paths_event_hash", type_="unique")
batch_op.create_index(
            batch_op.f("ix_trace_paths_event_hash_unique"), ["event_hash"], unique=True
)
with op.batch_alter_table("telemetry", schema=None) as batch_op:
batch_op.drop_constraint("uq_telemetry_event_hash", type_="unique")
batch_op.create_index(
            batch_op.f("ix_telemetry_event_hash_unique"), ["event_hash"], unique=True
)
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_nodes_public_key"))
batch_op.create_index(
batch_op.f("ix_nodes_public_key"), ["public_key"], unique=False
)
with op.batch_alter_table("messages", schema=None) as batch_op:
batch_op.drop_constraint("uq_messages_event_hash", type_="unique")
batch_op.create_index(
            batch_op.f("ix_messages_event_hash_unique"), ["event_hash"], unique=True
)
with op.batch_alter_table("members", schema=None) as batch_op:
batch_op.drop_index(batch_op.f("ix_members_member_id"))
batch_op.create_index(batch_op.f("ix_members_name"), ["name"], unique=False)
batch_op.drop_column("member_id")
with op.batch_alter_table("advertisements", schema=None) as batch_op:
batch_op.drop_constraint("uq_advertisements_event_hash", type_="unique")
batch_op.create_index(
            batch_op.f("ix_advertisements_event_hash_unique"), ["event_hash"], unique=True
)
# ### end Alembic commands ###
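The `UPDATE` above derives `member_id` with `LOWER(REPLACE(name, ' ', '_'))`; the equivalent transform in Python, for reference:

```python
def derive_member_id(name: str) -> str:
    """Mirror of LOWER(REPLACE(name, ' ', '_')) used in the migration."""
    return name.lower().replace(" ", "_")
```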

"""Remove member_nodes table
Revision ID: aa1162502616
Revises: 03b9b2451bd9
Create Date: 2025-12-08 15:04:37.260923+00:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "aa1162502616"
down_revision: Union[str, None] = "03b9b2451bd9"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Drop the member_nodes table
# Nodes are now associated with members via a 'member_id' tag on the node
op.drop_table("member_nodes")
def downgrade() -> None:
# Recreate the member_nodes table if needed for rollback
op.create_table(
"member_nodes",
sa.Column("id", sa.String(length=36), nullable=False),
sa.Column("member_id", sa.String(length=36), nullable=False),
sa.Column("public_key", sa.String(length=64), nullable=False),
sa.Column("node_role", sa.String(length=50), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
sa.ForeignKeyConstraint(
["member_id"],
["members.id"],
name=op.f("fk_member_nodes_member_id_members"),
ondelete="CASCADE",
),
sa.PrimaryKeyConstraint("id", name=op.f("pk_member_nodes")),
)
op.create_index(
op.f("ix_member_nodes_member_id"), "member_nodes", ["member_id"], unique=False
)
op.create_index(
op.f("ix_member_nodes_public_key"), "member_nodes", ["public_key"], unique=False
)
op.create_index(
"ix_member_nodes_member_public_key",
"member_nodes",
["member_id", "public_key"],
unique=False,
)

"""add lat lon columns to nodes
Revision ID: 4e2e787a1660
Revises: aa1162502616
Create Date: 2026-01-09 20:04:04.273741+00:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "4e2e787a1660"
down_revision: Union[str, None] = "aa1162502616"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.add_column(sa.Column("lat", sa.Float(), nullable=True))
batch_op.add_column(sa.Column("lon", sa.Float(), nullable=True))
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("nodes", schema=None) as batch_op:
batch_op.drop_column("lon")
batch_op.drop_column("lat")
# ### end Alembic commands ###

{
"members": [
{
"name": "Louis",
"callsign": "Louis",
"role": "admin",
"description": "IPNet Founder"
}
]
}

{
"2337484665ced7e210007e9fd9db98ced0a24a6eab8b4cbe3a06b3a1cea33ca1": {
"friendly_name": "IP2 Repeater 1",
"node_id": "ip2-rep01.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0357627,1.132079",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "31",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"8cb01fff1afc099055af418ce5fc5e60384df9ff763c25dd7e6a5e0922e8df90": {
"friendly_name": "IP2 Repeater 2",
"node_id": "ip2-rep02.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0390682,1.1304141",
"type": "coordinate"
},
"location_description": "Belstead Road",
"hardware": "Heltec V3",
"antenna": "McGill 6dBi Omni",
"elevation": {
"value": "44",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"5b565df747913358e24d890b2227de9c35d09763746b6ec326c15ebbf9b8be3b": {
"friendly_name": "IP2 Repeater 3",
"node_id": "ip2-rep03.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.046356,1.134661",
"type": "coordinate"
},
"location_description": "Birkfield Drive",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "52",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"780d0939f90b22d3bd7cbedcaf4e8d468a12c01886ab24b8cfa11eab2f5516c5": {
"friendly_name": "IP2 Integration 1",
"node_id": "ip2-int01.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0354539,1.1295338",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "25",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"30121dc60362c633c457ffa18f49b3e1d6823402c33709f32d7df70612250b96": {
"friendly_name": "MeshBot",
"node_id": "bot.ipnt.uk",
"member_id": "louis",
"area": "IP2",
"location": {
"value": "52.0354539,1.1295338",
"type": "coordinate"
},
"location_description": "Fountains Road",
"hardware": "Heltec V3",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "25",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"9135986b83815ada92883358435cc6528c7db60cb647f9b6547739a1ce5eb1c8": {
"friendly_name": "IP3 Repeater 1",
"node_id": "ip3-rep01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.045803,1.204416",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Heltec V3",
"antenna": "Paradar 8.5dBi Omni",
"elevation": {
"value": "42",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"e334ec5475789d542ed9e692fbeef7444a371fcc05adcbda1f47ba6a3191b459": {
"friendly_name": "IP3 Repeater 2",
"node_id": "ip3-rep02.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.03297,1.17543",
"type": "coordinate"
},
"location_description": "Morland Road Allotments",
"hardware": "Heltec T114",
"antenna": "Unknown",
"elevation": {
"value": "39",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"cc15fb33e98f2e098a543f516f770dc3061a1a6b30f79b84780663bf68ae6b53": {
"friendly_name": "IP3 Repeater 3",
"node_id": "ip3-rep03.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.04499,1.18149",
"type": "coordinate"
},
"location_description": "Hatfield Road",
"hardware": "Heltec V3",
"antenna": "Unknown",
"elevation": {
"value": "39",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"22309435fbd9dd1f14870a1895dc854779f6b2af72b08542f6105d264a493ebe": {
"friendly_name": "IP3 Integration 1",
"node_id": "ip3-int01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.045773,1.212808",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Heltec V3",
"antenna": "Generic 3dBi Whip",
"elevation": {
"value": "37",
"type": "number"
},
"show_on_map": {
"value": "false",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "integration"
},
"2a4f89e766dfa1758e35a69962c1f6d352b206a5e3562a589155a3ebfe7fc2bb": {
"friendly_name": "IP3 Repeater 4",
"node_id": "ip3-rep04.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.046383,1.174542",
"type": "coordinate"
},
"location_description": "Holywells",
"hardware": "Sensecap Solar",
"antenna": "Paradar 6.5dbi Omni",
"elevation": {
"value": "21",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"e790b73b2d6e377dd0f575c847f3ef42232f610eb9a19af57083fc4f647309ac": {
"friendly_name": "IP3 Repeater 5",
"node_id": "ip3-rep05.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.05252,1.17034",
"type": "coordinate"
},
"location_description": "Back Hamlet",
"hardware": "Heltec T114",
"antenna": "Paradar 6.5dBi Omni",
"elevation": {
"value": "38",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"20ed75ffc0f9777951716bb3d308d7f041fd2ad32fe2e998e600d0361e1fe2ac": {
"friendly_name": "IP3 Repeater 6",
"node_id": "ip3-rep06.ipnt.uk",
"member_id": "ccz",
"area": "IP3",
"location": {
"value": "52.04893,1.18965",
"type": "coordinate"
},
"location_description": "Dover Road",
"hardware": "Unknown",
"antenna": "Generic 5dBi Whip",
"elevation": {
"value": "38",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"bd7b5ac75f660675b39f368e1dbb6d1dbcefd8bd7a170e21a942954f67c8bf52": {
"friendly_name": "IP8 Repeater 1",
"node_id": "rep01.ip8.ipnt.uk",
"member_id": "walshie86",
"area": "IP8",
"location": {
"value": "52.033684,1.118384",
"type": "coordinate"
},
"location_description": "Grove Hill",
"hardware": "Heltec V3",
"antenna": "McGill 3dBi Omni",
"elevation": {
"value": "13",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"9cf300c40112ea34d0a59858270948b27ab6cd87e840de338f3ca782c17537b2": {
"friendly_name": "IP8 Repeater 2",
"node_id": "rep02.ip8.ipnt.uk",
"member_id": "walshie86",
"area": "IP8",
"location": {
"value": "52.035648,1.073271",
"type": "coordinate"
},
"location_description": "Washbrook",
"hardware": "Sensecap Solar",
"elevation": {
"value": "13",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"d3c20d962f7384c111fbafad6fbc1c1dc0e5c3ce802fb3ee11020e8d8207ed3a": {
"friendly_name": "IP4 Repeater 1",
"node_id": "ip4-rep01.ipnt.uk",
"member_id": "markab",
"area": "IP4",
"location": {
"value": "52.052445,1.156882",
"type": "coordinate"
},
"location_description": "Wine Rack",
"hardware": "Heltec T114",
"antenna": "Generic 5dbi Whip",
"elevation": {
"value": "50",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"b00ce9d218203e96d8557a4d59e06f5de59bbc4dcc4df9c870079d2cb8b5bd80": {
"friendly_name": "IP4 Repeater 2",
"node_id": "ip4-rep02.ipnt.uk",
"member_id": "markab",
"area": "IP4",
"location": {
"value": "52.06217,1.18332",
"type": "coordinate"
},
"location_description": "Rushmere Road",
"hardware": "Heltec V3",
"antenna": "Paradar 5dbi Whip",
"elevation": {
"value": "35",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"8accb6d0189ccaffb745ba54793e7fe3edd515edb45554325d957e48c1b9f3b3": {
"friendly_name": "IP4 Repeater 3",
"node_id": "ip4-rep03.ipnt.uk",
"member_id": "craig",
"area": "IP4",
"location": {
"value": "52.058,1.165",
"type": "coordinate"
},
"location_description": "IP4 Area",
"hardware": "Heltec v3",
"antenna": "Generic Whip",
"elevation": {
"value": "30",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "true",
"type": "boolean"
},
"is_testing": {
"value": "false",
"type": "boolean"
},
"mesh_role": "repeater"
},
"69fb8431e7ab307513797544fab99ce53ce24c46ec2d3a11767fe70f2ca37b23": {
"friendly_name": "IP3 Test Repeater 1",
"node_id": "ip3-tst01.ipnt.uk",
"member_id": "markab",
"area": "IP3",
"location": {
"value": "52.041869,1.204789",
"type": "coordinate"
},
"location_description": "Brokehall",
"hardware": "Station G2",
"antenna": "McGill 10dBi Panel",
"elevation": {
"value": "37",
"type": "number"
},
"show_on_map": {
"value": "true",
"type": "boolean"
},
"is_public": {
"value": "true",
"type": "boolean"
},
"is_online": {
"value": "false",
"type": "boolean"
},
"is_testing": {
"value": "true",
"type": "boolean"
},
"mesh_role": "repeater"
}
}
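Every tag in the deleted file wraps its value in a `{"value": ..., "type": ...}` object. When migrating data like this to a plain-value format, a small coercion helper is enough; a sketch (the helper name and its type handling are assumptions, not the project's importer):

```python
import json

def coerce(entry):
    """Collapse a legacy {'value': ..., 'type': ...} wrapper to a native value.

    Plain strings (e.g. friendly_name, hardware) pass through unchanged.
    Assumption: only 'number' and 'boolean' need conversion, matching the
    types used in the file above; other types stay as strings.
    """
    if not isinstance(entry, dict):
        return entry
    value, kind = entry["value"], entry["type"]
    if kind == "number":
        return float(value) if "." in value else int(value)
    if kind == "boolean":
        return value == "true"
    return value  # e.g. 'coordinate' values remain "lat,lon" strings

legacy = json.loads("""{
  "elevation": {"value": "31", "type": "number"},
  "is_online": {"value": "true", "type": "boolean"},
  "hardware": "Heltec V3"
}""")
converted = {k: coerce(v) for k, v in legacy.items()}
print(converted)
```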



@@ -1,14 +0,0 @@
services:
api:
networks:
- default
- pangolin
web:
networks:
- default
- pangolin
networks:
pangolin:
external: true


@@ -1,10 +1,14 @@
services:
# ==========================================================================
# MQTT Broker - Eclipse Mosquitto
# MQTT Broker - Eclipse Mosquitto (optional, use --profile mqtt)
# Most users will connect to an external MQTT broker instead
# ==========================================================================
mqtt:
image: eclipse-mosquitto:2
container_name: meshcore-mqtt
profiles:
- all
- mqtt
restart: unless-stopped
ports:
- "${MQTT_EXTERNAL_PORT:-1883}:1883"
@@ -24,17 +28,15 @@ services:
# Interface Receiver - MeshCore device to MQTT bridge (events)
# ==========================================================================
interface-receiver:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-receiver
profiles:
- interface-receiver
- all
- receiver
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
devices:
- "${SERIAL_PORT:-/dev/ttyUSB0}:${SERIAL_PORT:-/dev/ttyUSB0}"
user: root # Required for device access
@@ -45,6 +47,7 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- SERIAL_PORT=${SERIAL_PORT:-/dev/ttyUSB0}
- SERIAL_BAUD=${SERIAL_BAUD:-115200}
- NODE_ADDRESS=${NODE_ADDRESS:-}
@@ -60,17 +63,15 @@ services:
# Interface Sender - MQTT to MeshCore device bridge (commands)
# ==========================================================================
interface-sender:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-sender
profiles:
- interface-sender
- all
- sender
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
devices:
- "${SERIAL_PORT_SENDER:-/dev/ttyUSB1}:${SERIAL_PORT_SENDER:-/dev/ttyUSB1}"
user: root # Required for device access
@@ -81,6 +82,7 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- SERIAL_PORT=${SERIAL_PORT_SENDER:-/dev/ttyUSB1}
- SERIAL_BAUD=${SERIAL_BAUD:-115200}
- NODE_ADDRESS=${NODE_ADDRESS_SENDER:-}
@@ -96,17 +98,15 @@ services:
# Interface Mock Receiver - For testing without real devices
# ==========================================================================
interface-mock-receiver:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-interface-mock-receiver
profiles:
- all
- mock
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
environment:
- LOG_LEVEL=${LOG_LEVEL:-INFO}
- MQTT_HOST=${MQTT_HOST:-mqtt}
@@ -114,6 +114,7 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- MOCK_DEVICE=true
- NODE_ADDRESS=${NODE_ADDRESS:-0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef}
command: ["interface", "receiver", "--mock"]
@@ -128,18 +129,21 @@ services:
# Collector - MQTT subscriber and database storage
# ==========================================================================
collector:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-collector
profiles:
- all
- core
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
db-migrate:
condition: service_completed_successfully
volumes:
# Mount data directory (contains collector/meshcore.db)
- ${DATA_HOME:-./data}:/data
- ${SEED_HOME:-./seed}:/seed
environment:
- LOG_LEVEL=${LOG_LEVEL:-INFO}
- MQTT_HOST=${MQTT_HOST:-mqtt}
@@ -147,7 +151,9 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- DATA_HOME=/data
- SEED_HOME=/seed
# Explicitly unset to use DATA_HOME-based default path
- DATABASE_URL=
# Webhook configuration
@@ -162,6 +168,12 @@ services:
- WEBHOOK_TIMEOUT=${WEBHOOK_TIMEOUT:-10.0}
- WEBHOOK_MAX_RETRIES=${WEBHOOK_MAX_RETRIES:-3}
- WEBHOOK_RETRY_BACKOFF=${WEBHOOK_RETRY_BACKOFF:-2.0}
# Data retention and cleanup configuration
- DATA_RETENTION_ENABLED=${DATA_RETENTION_ENABLED:-true}
- DATA_RETENTION_DAYS=${DATA_RETENTION_DAYS:-30}
- DATA_RETENTION_INTERVAL_HOURS=${DATA_RETENTION_INTERVAL_HOURS:-24}
- NODE_CLEANUP_ENABLED=${NODE_CLEANUP_ENABLED:-true}
- NODE_CLEANUP_DAYS=${NODE_CLEANUP_DAYS:-7}
command: ["collector"]
healthcheck:
test: ["CMD", "meshcore-hub", "health", "collector"]
@@ -174,15 +186,18 @@ services:
# API Server - REST API for querying data and sending commands
# ==========================================================================
api:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-api
profiles:
- all
- core
restart: unless-stopped
depends_on:
mqtt:
condition: service_healthy
db-migrate:
condition: service_completed_successfully
collector:
condition: service_started
ports:
@@ -197,6 +212,7 @@ services:
- MQTT_USERNAME=${MQTT_USERNAME:-}
- MQTT_PASSWORD=${MQTT_PASSWORD:-}
- MQTT_PREFIX=${MQTT_PREFIX:-meshcore}
- MQTT_TLS=${MQTT_TLS:-false}
- DATA_HOME=/data
# Explicitly unset to use DATA_HOME-based default path
- DATABASE_URL=
@@ -216,11 +232,14 @@ services:
# Web Dashboard - Web interface for network visualization
# ==========================================================================
web:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-web
profiles:
- all
- core
restart: unless-stopped
depends_on:
api:
@@ -230,16 +249,20 @@ services:
environment:
- LOG_LEVEL=${LOG_LEVEL:-INFO}
- API_BASE_URL=http://api:8000
- API_KEY=${API_READ_KEY:-}
# Use ADMIN key to allow write operations from admin interface
# Falls back to READ key if ADMIN key is not set
- API_KEY=${API_ADMIN_KEY:-${API_READ_KEY:-}}
- WEB_HOST=0.0.0.0
- WEB_PORT=8080
- WEB_ADMIN_ENABLED=${WEB_ADMIN_ENABLED:-false}
- NETWORK_NAME=${NETWORK_NAME:-MeshCore Network}
- NETWORK_CITY=${NETWORK_CITY:-}
- NETWORK_COUNTRY=${NETWORK_COUNTRY:-}
- NETWORK_LOCATION=${NETWORK_LOCATION:-}
- NETWORK_RADIO_CONFIG=${NETWORK_RADIO_CONFIG:-}
- NETWORK_CONTACT_EMAIL=${NETWORK_CONTACT_EMAIL:-}
- NETWORK_CONTACT_DISCORD=${NETWORK_CONTACT_DISCORD:-}
- NETWORK_CONTACT_GITHUB=${NETWORK_CONTACT_GITHUB:-}
- NETWORK_WELCOME_TEXT=${NETWORK_WELCOME_TEXT:-}
command: ["web"]
healthcheck:
test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8080/health')"]
@@ -252,13 +275,16 @@ services:
# Database Migrations - Run Alembic migrations
# ==========================================================================
db-migrate:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-db-migrate
profiles:
- all
- core
- migrate
restart: "no"
volumes:
# Mount data directory (uses collector/meshcore.db)
- ${DATA_HOME:-./data}:/data
@@ -269,16 +295,23 @@ services:
command: ["db", "upgrade"]
# ==========================================================================
# Seed Data - Import node_tags.json and members.json from SEED_HOME
# Seed Data - Import node_tags.yaml and members.yaml from SEED_HOME
# NOTE: This is NOT run automatically. Use --profile seed to run explicitly.
# Since tags are now managed via the admin UI, automatic seeding would
# overwrite user changes.
# ==========================================================================
seed:
image: ghcr.io/ipnet-mesh/meshcore-hub:main
image: ghcr.io/ipnet-mesh/meshcore-hub:${IMAGE_VERSION:-latest}
build:
context: .
dockerfile: Dockerfile
container_name: meshcore-seed
profiles:
- seed
restart: "no"
depends_on:
db-migrate:
condition: service_completed_successfully
volumes:
# Mount data directory for database (read-write)
- ${DATA_HOME:-./data}:/data
@@ -290,7 +323,7 @@ services:
- LOG_LEVEL=${LOG_LEVEL:-INFO}
# Explicitly unset to use DATA_HOME-based default path
- DATABASE_URL=
# Imports both node_tags.json and members.json if they exist
# Imports both node_tags.yaml and members.yaml if they exist
command: ["collector", "seed"]
# ==========================================================================

docs/images/web.png (new binary file, 124 KiB; content not shown)

@@ -1,10 +0,0 @@
{
"members": [
{
"name": "Example Member",
"callsign": "N0CALL",
"role": "Network Operator",
"description": "Example member entry"
}
]
}

example/seed/members.yaml Normal file

@@ -0,0 +1,14 @@
# Example members seed file
# Note: Nodes are associated with members via a 'member_id' tag on the node.
# Use node_tags.yaml to set member_id tags on nodes.
members:
- member_id: example_member
name: Example Member
callsign: N0CALL
role: Network Operator
description: Example network operator member
- member_id: simple_member
name: Simple Member
callsign: N0CALL2
role: Observer
description: Example observer member
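The Member Editor commit above mentions client-side validation of `member_id` as alphanumeric plus underscore; a seed file like this one benefits from the same check before import. A hedged sketch — the function name and error messages are invented, and `members` stands in for `yaml.safe_load(...)["members"]`:

```python
import re

# Same character rule the admin UI's client-side validation describes.
MEMBER_ID_RE = re.compile(r"^[A-Za-z0-9_]+$")

def validate_member(member: dict) -> list[str]:
    """Return a list of validation errors for one parsed member entry.

    A sketch only: the real seed importer lives in the collector and
    may check more fields than these two.
    """
    errors = []
    if not MEMBER_ID_RE.match(member.get("member_id", "")):
        errors.append("member_id must be alphanumeric/underscore")
    if not member.get("name"):
        errors.append("name is required")
    return errors

members = [
    {"member_id": "example_member", "name": "Example Member"},
    {"member_id": "bad id!", "name": "Broken"},
]
print([validate_member(m) for m in members])
```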


@@ -1,16 +0,0 @@
{
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef": {
"friendly_name": "Gateway Node",
"location": {"value": "37.7749,-122.4194", "type": "coordinate"},
"lat": {"value": "37.7749", "type": "number"},
"lon": {"value": "-122.4194", "type": "number"},
"role": "gateway"
},
"fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210": {
"friendly_name": "Oakland Repeater",
"location": {"value": "37.8044,-122.2712", "type": "coordinate"},
"lat": {"value": "37.8044", "type": "number"},
"lon": {"value": "-122.2712", "type": "number"},
"altitude": {"value": "150", "type": "number"}
}
}


@@ -0,0 +1,29 @@
# Example node tags seed file
# Each key is a 64-character hex public key
#
# Tag values can be:
# - YAML primitives (auto-detected type):
# friendly_name: Gateway Node # string
# elevation: 150 # number
# is_online: true # boolean
#
# - Explicit type (when you need to force a specific type):
# altitude:
# value: "150"
# type: number
#
# Supported types: string, number, boolean
0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef:
friendly_name: Gateway Node
role: gateway
lat: 37.7749
lon: -122.4194
is_online: true
fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210:
friendly_name: Oakland Repeater
lat: 37.8044
lon: -122.2712
altitude: 150
is_online: false
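The comment block in this seed file describes two equivalent ways to express a typed tag. Normalizing both forms to a `(value, type)` pair might look like the following sketch (`normalize_tag` is a hypothetical helper, not the project's importer):

```python
def normalize_tag(raw):
    """Normalize a parsed tag value to a (string_value, type_name) pair.

    Mirrors the two forms the seed file above describes: YAML primitives
    with auto-detected types, and explicit {'value': ..., 'type': ...}
    mappings. Names and behaviour are assumptions for illustration.
    """
    if isinstance(raw, dict):  # explicit form forces the stated type
        return str(raw["value"]), raw["type"]
    if isinstance(raw, bool):  # check bool before int: bool subclasses int
        return ("true" if raw else "false"), "boolean"
    if isinstance(raw, (int, float)):
        return str(raw), "number"
    return str(raw), "string"

print(normalize_tag(150))                              # → ('150', 'number')
print(normalize_tag(True))                             # → ('true', 'boolean')
print(normalize_tag({"value": "150", "type": "number"}))
print(normalize_tag("Gateway Node"))
```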


@@ -4,22 +4,21 @@ build-backend = "setuptools.build_meta"
[project]
name = "meshcore-hub"
version = "0.1.0"
version = "0.0.0"
description = "Python monorepo for managing and orchestrating MeshCore mesh networks"
readme = "README.md"
license = {text = "MIT"}
requires-python = ">=3.11"
license = {text = "GPL-3.0-or-later"}
requires-python = ">=3.13"
authors = [
{name = "MeshCore Hub Contributors"}
]
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Communications",
"Topic :: System :: Networking",
]
@@ -39,6 +38,7 @@ dependencies = [
"httpx>=0.25.0",
"aiosqlite>=0.19.0",
"meshcore>=2.2.0",
"pyyaml>=6.0.0",
]
[project.optional-dependencies]
@@ -51,6 +51,7 @@ dev = [
"mypy>=1.5.0",
"pre-commit>=3.4.0",
"types-paho-mqtt>=1.6.0",
"types-PyYAML>=6.0.0",
]
postgres = [
"asyncpg>=0.28.0",
@@ -61,10 +62,10 @@ postgres = [
meshcore-hub = "meshcore_hub.__main__:main"
[project.urls]
Homepage = "https://github.com/meshcore-dev/meshcore-hub"
Documentation = "https://github.com/meshcore-dev/meshcore-hub#readme"
Repository = "https://github.com/meshcore-dev/meshcore-hub"
Issues = "https://github.com/meshcore-dev/meshcore-hub/issues"
Homepage = "https://github.com/ipnet-mesh/meshcore-hub"
Documentation = "https://github.com/ipnet-mesh/meshcore-hub#readme"
Repository = "https://github.com/ipnet-mesh/meshcore-hub"
Issues = "https://github.com/ipnet-mesh/meshcore-hub/issues"
[tool.setuptools.packages.find]
where = ["src"]
@@ -76,7 +77,7 @@ meshcore_hub = ["py.typed"]
[tool.black]
line-length = 88
target-version = ["py311"]
target-version = ["py312"]
include = '\.pyi?$'
extend-exclude = '''
/(
@@ -95,7 +96,7 @@ extend-exclude = '''
'''
[tool.mypy]
python_version = "3.11"
python_version = "3.13"
warn_return_any = true
warn_unused_ignores = true
disallow_untyped_defs = true


@@ -1,3 +1,5 @@
"""MeshCore Hub - Python monorepo for managing MeshCore mesh networks."""
__version__ = "0.1.0"
from meshcore_hub._version import __version__
__all__ = ["__version__"]

View File

@@ -174,6 +174,40 @@ def db_history() -> None:
command.history(alembic_cfg)
@db.command("stamp")
@click.option(
"--revision",
type=str,
default="head",
help="Target revision to stamp (default: head)",
)
@click.option(
"--database-url",
type=str,
default=None,
envvar="DATABASE_URL",
help="Database connection URL",
)
def db_stamp(revision: str, database_url: str | None) -> None:
"""Stamp database with revision without running migrations.
Use this to mark an existing database as up-to-date when the schema
was created before Alembic migrations were introduced.
"""
import os
from alembic import command
from alembic.config import Config
click.echo(f"Stamping database with revision: {revision}")
alembic_cfg = Config("alembic.ini")
if database_url:
os.environ["DATABASE_URL"] = database_url
command.stamp(alembic_cfg, revision)
click.echo("Database stamped successfully.")
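Under the hood, stamping just writes the target revision into Alembic's `alembic_version` table without touching the schema. A stdlib approximation of that effect (the table layout matches Alembic's convention; everything else is simplified):

```python
import sqlite3

def stamp(conn, revision: str) -> None:
    """Record a revision in alembic_version without running migrations.

    Conceptually what `meshcore-hub db stamp` delegates to Alembic for;
    this simplified version assumes a single-head, single-row layout.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS alembic_version "
        "(version_num VARCHAR(32) NOT NULL PRIMARY KEY)"
    )
    conn.execute("DELETE FROM alembic_version")
    conn.execute("INSERT INTO alembic_version VALUES (?)", (revision,))

conn = sqlite3.connect(":memory:")
stamp(conn, "4e2e787a1660")
print(conn.execute("SELECT version_num FROM alembic_version").fetchone()[0])
```

After stamping, `alembic upgrade head` treats the database as already at that revision and runs nothing older.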
# Health check commands for Docker HEALTHCHECK
@cli.group()
def health() -> None:


@@ -0,0 +1,8 @@
"""MeshCore Hub version information.
This file contains the version string for the package.
It can be overridden at build time by setting BUILD_VERSION environment variable.
"""
__version__ = "dev"
__all__ = ["__version__"]
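The docstring says `BUILD_VERSION` can override the version at build time, but the diff doesn't show the mechanism. One hedged way a build script could resolve the value before writing `_version.py` (`resolve_version` is an assumed name, not part of the codebase):

```python
import os

def resolve_version(default: str = "dev") -> str:
    """Pick the package version, preferring BUILD_VERSION when set.

    A sketch of the build-time override described in _version.py; the
    actual mechanism (e.g. a Dockerfile step rewriting the file) is not
    shown in this diff.
    """
    return os.environ.get("BUILD_VERSION") or default

os.environ.pop("BUILD_VERSION", None)
assert resolve_version() == "dev"
os.environ["BUILD_VERSION"] = "1.2.3"
print(resolve_version())  # → 1.2.3
```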


@@ -32,10 +32,9 @@ async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]:
# Get database URL from app state
database_url = getattr(app.state, "database_url", "sqlite:///./meshcore.db")
# Initialize database
# Initialize database (schema managed by Alembic migrations)
logger.info(f"Initializing database: {database_url}")
_db_manager = DatabaseManager(database_url)
_db_manager.create_tables()
yield
@@ -53,6 +52,7 @@ def create_app(
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
cors_origins: list[str] | None = None,
) -> FastAPI:
"""Create and configure the FastAPI application.
@@ -64,6 +64,7 @@ def create_app(
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
cors_origins: Allowed CORS origins
Returns:
@@ -86,6 +87,7 @@ def create_app(
app.state.mqtt_host = mqtt_host
app.state.mqtt_port = mqtt_port
app.state.mqtt_prefix = mqtt_prefix
app.state.mqtt_tls = mqtt_tls
# Configure CORS
if cors_origins is None:


@@ -67,6 +67,13 @@ import click
envvar="MQTT_TOPIC_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
@click.option(
"--cors-origins",
type=str,
@@ -92,6 +99,7 @@ def api(
mqtt_host: str,
mqtt_port: int,
mqtt_prefix: str,
mqtt_tls: bool,
cors_origins: str | None,
reload: bool,
) -> None:
@@ -171,6 +179,7 @@ def api(
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_prefix=mqtt_prefix,
mqtt_tls=mqtt_tls,
cors_origins=origins_list,
)


@@ -57,6 +57,7 @@ def get_mqtt_client(request: Request) -> MQTTClient:
mqtt_host = getattr(request.app.state, "mqtt_host", "localhost")
mqtt_port = getattr(request.app.state, "mqtt_port", 1883)
mqtt_prefix = getattr(request.app.state, "mqtt_prefix", "meshcore")
mqtt_tls = getattr(request.app.state, "mqtt_tls", False)
# Use unique client ID to allow multiple API instances
unique_id = uuid.uuid4().hex[:8]
@@ -65,6 +66,7 @@ def get_mqtt_client(request: Request) -> MQTTClient:
port=mqtt_port,
prefix=mqtt_prefix,
client_id=f"meshcore-api-{unique_id}",
tls=mqtt_tls,
)
client = MQTTClient(config)


@@ -4,33 +4,158 @@ from datetime import datetime
from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy import func, or_, select
from sqlalchemy.orm import aliased, selectinload
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Advertisement
from meshcore_hub.common.schemas.messages import AdvertisementList, AdvertisementRead
from meshcore_hub.common.models import Advertisement, EventReceiver, Node, NodeTag
from meshcore_hub.common.schemas.messages import (
AdvertisementList,
AdvertisementRead,
ReceiverInfo,
)
router = APIRouter()
def _get_tag_name(node: Optional[Node]) -> Optional[str]:
"""Extract name tag from a node's tags."""
if not node or not node.tags:
return None
for tag in node.tags:
if tag.key == "name":
return tag.value
return None
def _fetch_receivers_for_events(
session: DbSession,
event_type: str,
event_hashes: list[str],
) -> dict[str, list[ReceiverInfo]]:
"""Fetch receiver info for a list of events by their hashes."""
if not event_hashes:
return {}
query = (
select(
EventReceiver.event_hash,
EventReceiver.snr,
EventReceiver.received_at,
Node.id.label("node_id"),
Node.public_key,
Node.name,
)
.join(Node, EventReceiver.receiver_node_id == Node.id)
.where(EventReceiver.event_type == event_type)
.where(EventReceiver.event_hash.in_(event_hashes))
.order_by(EventReceiver.received_at)
)
results = session.execute(query).all()
receivers_by_hash: dict[str, list[ReceiverInfo]] = {}
node_ids = [r.node_id for r in results]
tag_names: dict[str, str] = {}
if node_ids:
tag_query = (
select(NodeTag.node_id, NodeTag.value)
.where(NodeTag.node_id.in_(node_ids))
.where(NodeTag.key == "name")
)
for node_id, value in session.execute(tag_query).all():
tag_names[node_id] = value
for row in results:
if row.event_hash not in receivers_by_hash:
receivers_by_hash[row.event_hash] = []
receivers_by_hash[row.event_hash].append(
ReceiverInfo(
node_id=row.node_id,
public_key=row.public_key,
name=row.name,
tag_name=tag_names.get(row.node_id),
snr=row.snr,
received_at=row.received_at,
)
)
return receivers_by_hash
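`_fetch_receivers_for_events` groups flat SQL rows into per-event lists. The grouping pattern in isolation, with namedtuples standing in for result rows and the receiver records simplified to dicts:

```python
from collections import namedtuple

Row = namedtuple("Row", ["event_hash", "snr", "node_id"])

rows = [
    Row("h1", 7.25, "n1"),
    Row("h1", -3.5, "n2"),
    Row("h2", 4.0, "n1"),
]

# Same shape as _fetch_receivers_for_events: one list of receiver
# records per event hash, in row order.
receivers_by_hash: dict[str, list[dict]] = {}
for row in rows:
    receivers_by_hash.setdefault(row.event_hash, []).append(
        {"node_id": row.node_id, "snr": row.snr}
    )

print(sorted(receivers_by_hash))     # → ['h1', 'h2']
print(len(receivers_by_hash["h1"]))  # → 2
```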
@router.get("", response_model=AdvertisementList)
async def list_advertisements(
_: RequireRead,
session: DbSession,
search: Optional[str] = Query(
None, description="Search in name tag, node name, or public key"
),
public_key: Optional[str] = Query(None, description="Filter by public key"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
member_id: Optional[str] = Query(
None, description="Filter by member_id tag value of source node"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> AdvertisementList:
"""List advertisements with filtering and pagination."""
# Build query
query = select(Advertisement)
# Aliases for node joins
ReceiverNode = aliased(Node)
SourceNode = aliased(Node)
# Build query with both receiver and source node joins
query = (
select(
Advertisement,
ReceiverNode.public_key.label("receiver_pk"),
ReceiverNode.name.label("receiver_name"),
ReceiverNode.id.label("receiver_id"),
SourceNode.name.label("source_name"),
SourceNode.id.label("source_id"),
SourceNode.adv_type.label("source_adv_type"),
)
.outerjoin(ReceiverNode, Advertisement.receiver_node_id == ReceiverNode.id)
.outerjoin(SourceNode, Advertisement.node_id == SourceNode.id)
)
if search:
# Search in public key, advertisement name, node name, or name tag
search_pattern = f"%{search}%"
query = query.where(
or_(
Advertisement.public_key.ilike(search_pattern),
Advertisement.name.ilike(search_pattern),
SourceNode.name.ilike(search_pattern),
SourceNode.id.in_(
select(NodeTag.node_id).where(
NodeTag.key == "name", NodeTag.value.ilike(search_pattern)
)
),
)
)
if public_key:
query = query.where(Advertisement.public_key == public_key)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if member_id:
# Filter advertisements from nodes that have a member_id tag with the specified value
query = query.where(
SourceNode.id.in_(
select(NodeTag.node_id).where(
NodeTag.key == "member_id", NodeTag.value == member_id
)
)
)
if since:
query = query.where(Advertisement.received_at >= since)
@@ -45,10 +170,58 @@ async def list_advertisements(
query = query.order_by(Advertisement.received_at.desc()).offset(offset).limit(limit)
# Execute
advertisements = session.execute(query).scalars().all()
results = session.execute(query).all()
# Collect node IDs to fetch tags
node_ids = set()
for row in results:
if row.receiver_id:
node_ids.add(row.receiver_id)
if row.source_id:
node_ids.add(row.source_id)
# Fetch nodes with tags
nodes_by_id: dict[str, Node] = {}
if node_ids:
nodes_query = (
select(Node).where(Node.id.in_(node_ids)).options(selectinload(Node.tags))
)
nodes = session.execute(nodes_query).scalars().all()
nodes_by_id = {n.id: n for n in nodes}
# Fetch all receivers for these advertisements
event_hashes = [r[0].event_hash for r in results if r[0].event_hash]
receivers_by_hash = _fetch_receivers_for_events(
session, "advertisement", event_hashes
)
# Build response with node details
items = []
for row in results:
adv = row[0]
receiver_node = nodes_by_id.get(row.receiver_id) if row.receiver_id else None
source_node = nodes_by_id.get(row.source_id) if row.source_id else None
data = {
"received_by": row.receiver_pk,
"receiver_name": row.receiver_name,
"receiver_tag_name": _get_tag_name(receiver_node),
"public_key": adv.public_key,
"name": adv.name,
"node_name": row.source_name,
"node_tag_name": _get_tag_name(source_node),
"adv_type": adv.adv_type or row.source_adv_type,
"flags": adv.flags,
"received_at": adv.received_at,
"created_at": adv.created_at,
"receivers": (
receivers_by_hash.get(adv.event_hash, []) if adv.event_hash else []
),
}
items.append(AdvertisementRead(**data))
return AdvertisementList(
items=[AdvertisementRead.model_validate(a) for a in advertisements],
items=items,
total=total,
limit=limit,
offset=offset,
@@ -62,10 +235,67 @@ async def get_advertisement(
advertisement_id: str,
) -> AdvertisementRead:
"""Get a single advertisement by ID."""
query = select(Advertisement).where(Advertisement.id == advertisement_id)
advertisement = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
SourceNode = aliased(Node)
query = (
select(
Advertisement,
ReceiverNode.public_key.label("receiver_pk"),
ReceiverNode.name.label("receiver_name"),
ReceiverNode.id.label("receiver_id"),
SourceNode.name.label("source_name"),
SourceNode.id.label("source_id"),
SourceNode.adv_type.label("source_adv_type"),
)
.outerjoin(ReceiverNode, Advertisement.receiver_node_id == ReceiverNode.id)
.outerjoin(SourceNode, Advertisement.node_id == SourceNode.id)
.where(Advertisement.id == advertisement_id)
)
result = session.execute(query).one_or_none()
if not advertisement:
if not result:
raise HTTPException(status_code=404, detail="Advertisement not found")
return AdvertisementRead.model_validate(advertisement)
adv = result[0]
# Fetch nodes with tags for friendly names
node_ids = []
if result.receiver_id:
node_ids.append(result.receiver_id)
if result.source_id:
node_ids.append(result.source_id)
nodes_by_id: dict[str, Node] = {}
if node_ids:
nodes_query = (
select(Node).where(Node.id.in_(node_ids)).options(selectinload(Node.tags))
)
nodes = session.execute(nodes_query).scalars().all()
nodes_by_id = {n.id: n for n in nodes}
receiver_node = nodes_by_id.get(result.receiver_id) if result.receiver_id else None
source_node = nodes_by_id.get(result.source_id) if result.source_id else None
# Fetch receivers for this advertisement
receivers = []
if adv.event_hash:
receivers_by_hash = _fetch_receivers_for_events(
session, "advertisement", [adv.event_hash]
)
receivers = receivers_by_hash.get(adv.event_hash, [])
data = {
"received_by": result.receiver_pk,
"receiver_name": result.receiver_name,
"receiver_tag_name": _get_tag_name(receiver_node),
"public_key": adv.public_key,
"name": adv.name,
"node_name": result.source_name,
"node_tag_name": _get_tag_name(source_node),
"adv_type": adv.adv_type or result.source_adv_type,
"flags": adv.flags,
"received_at": adv.received_at,
"created_at": adv.created_at,
"receivers": receivers,
}
return AdvertisementRead(**data)
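Both the list and detail endpoints attach receivers with the same rule: an advertisement without an `event_hash` gets an empty list. A minimal standalone sketch of that rule (`receivers_for` is a hypothetical helper name, not part of this diff):

```python
from typing import Any, Optional


def receivers_for(
    event_hash: Optional[str],
    receivers_by_hash: dict[str, list[Any]],
) -> list[Any]:
    """Return the receivers recorded for an event, or [] when there is no hash."""
    return receivers_by_hash.get(event_hash, []) if event_hash else []
```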


@@ -9,7 +9,15 @@ from sqlalchemy import func, select
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Advertisement, Message, Node, NodeTag
from meshcore_hub.common.schemas.messages import DashboardStats, RecentAdvertisement
from meshcore_hub.common.schemas.messages import (
ChannelMessage,
DailyActivity,
DailyActivityPoint,
DashboardStats,
MessageActivity,
NodeCountHistory,
RecentAdvertisement,
)
router = APIRouter()
@@ -23,6 +31,7 @@ async def get_stats(
now = datetime.now(timezone.utc)
today_start = now.replace(hour=0, minute=0, second=0, microsecond=0)
yesterday = now - timedelta(days=1)
seven_days_ago = now - timedelta(days=7)
# Total nodes
total_nodes = session.execute(select(func.count()).select_from(Node)).scalar() or 0
@@ -65,6 +74,26 @@ async def get_stats(
or 0
)
# Advertisements in last 7 days
advertisements_7d = (
session.execute(
select(func.count())
.select_from(Advertisement)
.where(Advertisement.received_at >= seven_days_ago)
).scalar()
or 0
)
# Messages in last 7 days
messages_7d = (
session.execute(
select(func.count())
.select_from(Message)
.where(Message.received_at >= seven_days_ago)
).scalar()
or 0
)
# Recent advertisements (last 10)
recent_ads = (
session.execute(
@@ -74,25 +103,38 @@ async def get_stats(
.all()
)
# Get friendly_name tags for the advertised nodes
# Get node names, adv_types, and name tags for the advertised nodes
ad_public_keys = [ad.public_key for ad in recent_ads]
friendly_names: dict[str, str] = {}
node_names: dict[str, str] = {}
node_adv_types: dict[str, str] = {}
tag_names: dict[str, str] = {}
if ad_public_keys:
friendly_name_query = (
# Get node names and adv_types from Node table
node_query = select(Node.public_key, Node.name, Node.adv_type).where(
Node.public_key.in_(ad_public_keys)
)
for public_key, name, adv_type in session.execute(node_query).all():
if name:
node_names[public_key] = name
if adv_type:
node_adv_types[public_key] = adv_type
# Get name tags
tag_name_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
.where(Node.public_key.in_(ad_public_keys))
.where(NodeTag.key == "friendly_name")
.where(NodeTag.key == "name")
)
for public_key, value in session.execute(friendly_name_query).all():
friendly_names[public_key] = value
for public_key, value in session.execute(tag_name_query).all():
tag_names[public_key] = value
recent_advertisements = [
RecentAdvertisement(
public_key=ad.public_key,
name=ad.name,
friendly_name=friendly_names.get(ad.public_key),
adv_type=ad.adv_type,
name=ad.name or node_names.get(ad.public_key),
tag_name=tag_names.get(ad.public_key),
adv_type=ad.adv_type or node_adv_types.get(ad.public_key),
received_at=ad.received_at,
)
for ad in recent_ads
@@ -110,18 +152,218 @@ async def get_stats(
int(channel): int(count) for channel, count in channel_results
}
# Get latest 5 messages for each channel that has messages
channel_messages: dict[int, list[ChannelMessage]] = {}
for channel_idx, _ in channel_results:
messages_query = (
select(Message)
.where(Message.message_type == "channel")
.where(Message.channel_idx == channel_idx)
.order_by(Message.received_at.desc())
.limit(5)
)
channel_msgs = session.execute(messages_query).scalars().all()
# Look up sender names for these messages
msg_prefixes = [m.pubkey_prefix for m in channel_msgs if m.pubkey_prefix]
msg_sender_names: dict[str, str] = {}
msg_tag_names: dict[str, str] = {}
if msg_prefixes:
for prefix in set(msg_prefixes):
sender_node_query = select(Node.public_key, Node.name).where(
Node.public_key.startswith(prefix)
)
for public_key, name in session.execute(sender_node_query).all():
if name:
msg_sender_names[public_key[:12]] = name
sender_tag_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
.where(Node.public_key.startswith(prefix))
.where(NodeTag.key == "name")
)
for public_key, value in session.execute(sender_tag_query).all():
msg_tag_names[public_key[:12]] = value
channel_messages[int(channel_idx)] = [
ChannelMessage(
text=m.text,
sender_name=(
msg_sender_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
sender_tag_name=(
msg_tag_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
pubkey_prefix=m.pubkey_prefix,
received_at=m.received_at,
)
for m in channel_msgs
]
return DashboardStats(
total_nodes=total_nodes,
active_nodes=active_nodes,
total_messages=total_messages,
messages_today=messages_today,
messages_7d=messages_7d,
total_advertisements=total_advertisements,
advertisements_24h=advertisements_24h,
advertisements_7d=advertisements_7d,
recent_advertisements=recent_advertisements,
channel_message_counts=channel_message_counts,
channel_messages=channel_messages,
)
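The per-channel loop above issues one query per channel to fetch the latest five messages. The same "latest n per group" shape can be sketched in memory (a simplified sketch over `(channel_idx, received_at, text)` tuples; the real endpoint works on ORM rows and also resolves sender names):

```python
from collections import defaultdict


def latest_per_channel(
    messages: list[tuple[int, int, str]], n: int = 5
) -> dict[int, list[str]]:
    """Group (channel_idx, received_at, text) rows, keeping the latest n texts per channel."""
    by_channel: dict[int, list[tuple[int, str]]] = defaultdict(list)
    for channel, ts, text in messages:
        by_channel[channel].append((ts, text))
    return {
        ch: [text for _, text in sorted(rows, key=lambda r: r[0], reverse=True)[:n]]
        for ch, rows in by_channel.items()
    }
```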
@router.get("/activity", response_model=DailyActivity)
async def get_activity(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> DailyActivity:
"""Get daily advertisement activity for the specified period.
Args:
days: Number of days to include (default 30, max 90)
Returns:
Daily advertisement counts for each day in the period (excluding today)
"""
# Limit to max 90 days
days = min(days, 90)
now = datetime.now(timezone.utc)
# End at start of today (exclude today's incomplete data)
end_date = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_date = end_date - timedelta(days=days)
# Query advertisement counts grouped by date
# Use SQLite's date() function for grouping (returns string 'YYYY-MM-DD')
date_expr = func.date(Advertisement.received_at)
query = (
select(
date_expr.label("date"),
func.count().label("count"),
)
.where(Advertisement.received_at >= start_date)
.where(Advertisement.received_at < end_date)
.group_by(date_expr)
.order_by(date_expr)
)
results = session.execute(query).all()
# Build a dict of date -> count from results (date is already a string)
counts_by_date = {row.date: row.count for row in results}
# Generate all dates in the range, filling in zeros for missing days
data = []
for i in range(days):
date = start_date + timedelta(days=i)
date_str = date.strftime("%Y-%m-%d")
count = counts_by_date.get(date_str, 0)
data.append(DailyActivityPoint(date=date_str, count=count))
return DailyActivity(days=days, data=data)
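The zero-filling loop at the end of `get_activity` (and repeated in `get_message_activity`) can be isolated as a small helper. A sketch, assuming the sparse counts are keyed by `YYYY-MM-DD` strings as SQLite's `date()` returns them (`fill_daily_gaps` is a hypothetical name, not part of this diff):

```python
from datetime import date, timedelta


def fill_daily_gaps(
    counts_by_date: dict[str, int], start: date, days: int
) -> list[tuple[str, int]]:
    """Expand sparse per-day counts into a dense series, zero-filling missing days."""
    series = []
    for i in range(days):
        day = (start + timedelta(days=i)).strftime("%Y-%m-%d")
        series.append((day, counts_by_date.get(day, 0)))
    return series
```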
@router.get("/message-activity", response_model=MessageActivity)
async def get_message_activity(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> MessageActivity:
"""Get daily message activity for the specified period.
Args:
days: Number of days to include (default 30, max 90)
Returns:
Daily message counts for each day in the period (excluding today)
"""
days = min(days, 90)
now = datetime.now(timezone.utc)
# End at start of today (exclude today's incomplete data)
end_date = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_date = end_date - timedelta(days=days)
# Query message counts grouped by date
date_expr = func.date(Message.received_at)
query = (
select(
date_expr.label("date"),
func.count().label("count"),
)
.where(Message.received_at >= start_date)
.where(Message.received_at < end_date)
.group_by(date_expr)
.order_by(date_expr)
)
results = session.execute(query).all()
counts_by_date = {row.date: row.count for row in results}
# Generate all dates in the range, filling in zeros for missing days
data = []
for i in range(days):
date = start_date + timedelta(days=i)
date_str = date.strftime("%Y-%m-%d")
count = counts_by_date.get(date_str, 0)
data.append(DailyActivityPoint(date=date_str, count=count))
return MessageActivity(days=days, data=data)
@router.get("/node-count", response_model=NodeCountHistory)
async def get_node_count_history(
_: RequireRead,
session: DbSession,
days: int = 30,
) -> NodeCountHistory:
"""Get cumulative node count over time.
For each day, shows the total number of nodes that existed by that date
(based on their created_at timestamp).
Args:
days: Number of days to include (default 30, max 90)
Returns:
Cumulative node count for each day in the period (excluding today)
"""
days = min(days, 90)
now = datetime.now(timezone.utc)
# End at start of today (exclude today's incomplete data)
end_date = now.replace(hour=0, minute=0, second=0, microsecond=0)
start_date = end_date - timedelta(days=days)
# Get all nodes with their creation dates
# Count nodes created on or before each date
data = []
for i in range(days):
date = start_date + timedelta(days=i)
end_of_day = date.replace(hour=23, minute=59, second=59, microsecond=999999)
date_str = date.strftime("%Y-%m-%d")
# Count nodes created on or before this date
count = (
session.execute(
select(func.count())
.select_from(Node)
.where(Node.created_at <= end_of_day)
).scalar()
or 0
)
data.append(DailyActivityPoint(date=date_str, count=count))
return NodeCountHistory(days=days, data=data)
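`get_node_count_history` runs one COUNT query per day in the window. The counting semantics (nodes created on or before each day) reduce to this in-memory sketch (`cumulative_counts` is a hypothetical name; dates simplified to `date` objects rather than the endpoint's end-of-day timestamps):

```python
from datetime import date, timedelta


def cumulative_counts(
    created_dates: list[date], start: date, days: int
) -> list[tuple[str, int]]:
    """For each day in the window, count items created on or before that day."""
    out = []
    for i in range(days):
        day = start + timedelta(days=i)
        out.append((day.strftime("%Y-%m-%d"), sum(1 for c in created_dates if c <= day)))
    return out
```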
@router.get("/", response_class=HTMLResponse)
async def dashboard(
request: Request,


@@ -20,7 +20,7 @@ router = APIRouter()
async def list_members(
_: RequireRead,
session: DbSession,
limit: int = Query(default=50, ge=1, le=100),
limit: int = Query(default=50, ge=1, le=500),
offset: int = Query(default=0, ge=0),
) -> MemberList:
"""List all members with pagination."""
@@ -28,9 +28,9 @@ async def list_members(
count_query = select(func.count()).select_from(Member)
total = session.execute(count_query).scalar() or 0
# Get members
# Get members ordered by name
query = select(Member).order_by(Member.name).limit(limit).offset(offset)
members = session.execute(query).scalars().all()
members = list(session.execute(query).scalars().all())
return MemberList(
items=[MemberRead.model_validate(m) for m in members],
@@ -63,17 +63,23 @@ async def create_member(
member: MemberCreate,
) -> MemberRead:
"""Create a new member."""
# Normalize public_key to lowercase if provided
public_key = member.public_key.lower() if member.public_key else None
# Check if member_id already exists
query = select(Member).where(Member.member_id == member.member_id)
existing = session.execute(query).scalar_one_or_none()
if existing:
raise HTTPException(
status_code=400,
detail=f"Member with member_id '{member.member_id}' already exists",
)
# Create member
new_member = Member(
member_id=member.member_id,
name=member.name,
callsign=member.callsign,
role=member.role,
description=member.description,
contact=member.contact,
public_key=public_key,
)
session.add(new_member)
session.commit()
@@ -97,6 +103,18 @@ async def update_member(
raise HTTPException(status_code=404, detail="Member not found")
# Update fields
if member.member_id is not None:
# Check if new member_id is already taken by another member
check_query = select(Member).where(
Member.member_id == member.member_id, Member.id != member_id
)
collision = session.execute(check_query).scalar_one_or_none()
if collision:
raise HTTPException(
status_code=400,
detail=f"Member with member_id '{member.member_id}' already exists",
)
existing.member_id = member.member_id
if member.name is not None:
existing.name = member.name
if member.callsign is not None:
@@ -107,8 +125,6 @@ async def update_member(
existing.description = member.description
if member.contact is not None:
existing.contact = member.contact
if member.public_key is not None:
existing.public_key = member.public_key.lower()
session.commit()
session.refresh(existing)
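The `member_id` collision checks in `create_member` and `update_member` share one rule: the id is rejected only when it belongs to a *different* member record. A standalone sketch of that rule (`member_id_conflict` and the `owners_by_member_id` mapping are hypothetical, for illustration only):

```python
from typing import Optional


def member_id_conflict(
    owners_by_member_id: dict[str, str],
    member_id: str,
    self_id: Optional[str] = None,
) -> bool:
    """True when member_id is already taken by a member other than self_id."""
    owner = owners_by_member_id.get(member_id)
    return owner is not None and owner != self_id
```

On create, `self_id` is `None`, so any existing owner is a conflict; on update, a member keeping its own `member_id` passes.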


@@ -5,15 +5,95 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased, selectinload
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Message, Node, NodeTag
from meshcore_hub.common.schemas.messages import MessageList, MessageRead
from meshcore_hub.common.models import EventReceiver, Message, Node, NodeTag
from meshcore_hub.common.schemas.messages import MessageList, MessageRead, ReceiverInfo
router = APIRouter()
def _get_tag_name(node: Optional[Node]) -> Optional[str]:
"""Extract name tag from a node's tags."""
if not node or not node.tags:
return None
for tag in node.tags:
if tag.key == "name":
return tag.value
return None
def _fetch_receivers_for_events(
session: DbSession,
event_type: str,
event_hashes: list[str],
) -> dict[str, list[ReceiverInfo]]:
"""Fetch receiver info for a list of events by their hashes.
Args:
session: Database session
event_type: Type of event ('message', 'advertisement', etc.)
event_hashes: List of event hashes to fetch receivers for
Returns:
Dict mapping event_hash to list of ReceiverInfo objects
"""
if not event_hashes:
return {}
# Query event_receivers with receiver node info
query = (
select(
EventReceiver.event_hash,
EventReceiver.snr,
EventReceiver.received_at,
Node.id.label("node_id"),
Node.public_key,
Node.name,
)
.join(Node, EventReceiver.receiver_node_id == Node.id)
.where(EventReceiver.event_type == event_type)
.where(EventReceiver.event_hash.in_(event_hashes))
.order_by(EventReceiver.received_at)
)
results = session.execute(query).all()
# Group by event_hash
receivers_by_hash: dict[str, list[ReceiverInfo]] = {}
# Get tag names for receiver nodes
node_ids = [r.node_id for r in results]
tag_names: dict[str, str] = {}
if node_ids:
tag_query = (
select(NodeTag.node_id, NodeTag.value)
.where(NodeTag.node_id.in_(node_ids))
.where(NodeTag.key == "name")
)
for node_id, value in session.execute(tag_query).all():
tag_names[node_id] = value
for row in results:
if row.event_hash not in receivers_by_hash:
receivers_by_hash[row.event_hash] = []
receivers_by_hash[row.event_hash].append(
ReceiverInfo(
node_id=row.node_id,
public_key=row.public_key,
name=row.name,
tag_name=tag_names.get(row.node_id),
snr=row.snr,
received_at=row.received_at,
)
)
return receivers_by_hash
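The grouping step at the end of `_fetch_receivers_for_events` is a plain group-by-key over ordered rows. A minimal sketch of just that step (`group_by_hash` is a hypothetical name; payloads stand in for the `ReceiverInfo` objects):

```python
from collections import defaultdict
from typing import Any


def group_by_hash(rows: list[tuple[str, Any]]) -> dict[str, list[Any]]:
    """Group (event_hash, payload) rows into {event_hash: [payload, ...]}, preserving row order."""
    grouped: dict[str, list[Any]] = defaultdict(list)
    for event_hash, payload in rows:
        grouped[event_hash].append(payload)
    return dict(grouped)
```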
@router.get("", response_model=MessageList)
async def list_messages(
_: RequireRead,
@@ -21,6 +101,9 @@ async def list_messages(
message_type: Optional[str] = Query(None, description="Filter by message type"),
pubkey_prefix: Optional[str] = Query(None, description="Filter by sender prefix"),
channel_idx: Optional[int] = Query(None, description="Filter by channel"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
search: Optional[str] = Query(None, description="Search in message text"),
@@ -28,8 +111,16 @@ async def list_messages(
offset: int = Query(0, ge=0, description="Page offset"),
) -> MessageList:
"""List messages with filtering and pagination."""
# Build query
query = select(Message)
# Alias for receiver node join
ReceiverNode = aliased(Node)
# Build query with receiver node join
query = select(
Message,
ReceiverNode.public_key.label("receiver_pk"),
ReceiverNode.name.label("receiver_name"),
ReceiverNode.id.label("receiver_id"),
).outerjoin(ReceiverNode, Message.receiver_node_id == ReceiverNode.id)
if message_type:
query = query.where(Message.message_type == message_type)
@@ -40,6 +131,9 @@ async def list_messages(
if channel_idx is not None:
query = query.where(Message.channel_idx == channel_idx)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(Message.received_at >= since)
@@ -57,34 +151,77 @@ async def list_messages(
query = query.order_by(Message.received_at.desc()).offset(offset).limit(limit)
# Execute
messages = session.execute(query).scalars().all()
results = session.execute(query).all()
# Look up friendly_names for senders with pubkey_prefix
pubkey_prefixes = [m.pubkey_prefix for m in messages if m.pubkey_prefix]
friendly_names: dict[str, str] = {}
# Look up sender names and tag names for senders with pubkey_prefix
pubkey_prefixes = [r[0].pubkey_prefix for r in results if r[0].pubkey_prefix]
sender_names: dict[str, str] = {}
sender_tag_names: dict[str, str] = {}
if pubkey_prefixes:
# Find nodes whose public_key starts with any of these prefixes
for prefix in set(pubkey_prefixes):
friendly_name_query = (
# Get node name
node_query = select(Node.public_key, Node.name).where(
Node.public_key.startswith(prefix)
)
for public_key, name in session.execute(node_query).all():
if name:
sender_names[public_key[:12]] = name
# Get name tag
tag_name_query = (
select(Node.public_key, NodeTag.value)
.join(NodeTag, Node.id == NodeTag.node_id)
.where(Node.public_key.startswith(prefix))
.where(NodeTag.key == "friendly_name")
.where(NodeTag.key == "name")
)
for public_key, value in session.execute(friendly_name_query).all():
# Map the prefix to the friendly_name
friendly_names[public_key[:12]] = value
for public_key, value in session.execute(tag_name_query).all():
sender_tag_names[public_key[:12]] = value
# Build response with friendly_names
# Collect receiver node IDs to fetch tags
receiver_ids = set()
for row in results:
if row.receiver_id:
receiver_ids.add(row.receiver_id)
# Fetch receiver nodes with tags
receivers_by_id: dict[str, Node] = {}
if receiver_ids:
receivers_query = (
select(Node)
.where(Node.id.in_(receiver_ids))
.options(selectinload(Node.tags))
)
receivers = session.execute(receivers_query).scalars().all()
receivers_by_id = {n.id: n for n in receivers}
# Fetch all receivers for these messages
event_hashes = [r[0].event_hash for r in results if r[0].event_hash]
receivers_by_hash = _fetch_receivers_for_events(session, "message", event_hashes)
# Build response with sender info and received_by
items = []
for m in messages:
for row in results:
m = row[0]
receiver_pk = row.receiver_pk
receiver_name = row.receiver_name
receiver_node = (
receivers_by_id.get(row.receiver_id) if row.receiver_id else None
)
msg_dict = {
"id": m.id,
"receiver_node_id": m.receiver_node_id,
"received_by": receiver_pk,
"receiver_name": receiver_name,
"receiver_tag_name": _get_tag_name(receiver_node),
"message_type": m.message_type,
"pubkey_prefix": m.pubkey_prefix,
"sender_friendly_name": (
friendly_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
"sender_name": (
sender_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
"sender_tag_name": (
sender_tag_names.get(m.pubkey_prefix) if m.pubkey_prefix else None
),
"channel_idx": m.channel_idx,
"text": m.text,
@@ -95,6 +232,9 @@ async def list_messages(
"sender_timestamp": m.sender_timestamp,
"received_at": m.received_at,
"created_at": m.created_at,
"receivers": (
receivers_by_hash.get(m.event_hash, []) if m.event_hash else []
),
}
items.append(MessageRead(**msg_dict))
@@ -113,10 +253,42 @@ async def get_message(
message_id: str,
) -> MessageRead:
"""Get a single message by ID."""
query = select(Message).where(Message.id == message_id)
message = session.execute(query).scalar_one_or_none()
ReceiverNode = aliased(Node)
query = (
select(Message, ReceiverNode.public_key.label("receiver_pk"))
.outerjoin(ReceiverNode, Message.receiver_node_id == ReceiverNode.id)
.where(Message.id == message_id)
)
result = session.execute(query).one_or_none()
if not message:
if not result:
raise HTTPException(status_code=404, detail="Message not found")
return MessageRead.model_validate(message)
message, receiver_pk = result
# Fetch receivers for this message
receivers = []
if message.event_hash:
receivers_by_hash = _fetch_receivers_for_events(
session, "message", [message.event_hash]
)
receivers = receivers_by_hash.get(message.event_hash, [])
data = {
"id": message.id,
"receiver_node_id": message.receiver_node_id,
"received_by": receiver_pk,
"message_type": message.message_type,
"pubkey_prefix": message.pubkey_prefix,
"channel_idx": message.channel_idx,
"text": message.text,
"path_len": message.path_len,
"txt_type": message.txt_type,
"signature": message.signature,
"snr": message.snr,
"sender_timestamp": message.sender_timestamp,
"received_at": message.received_at,
"created_at": message.created_at,
"receivers": receivers,
}
return MessageRead(**data)


@@ -6,7 +6,13 @@ from sqlalchemy import select
from meshcore_hub.api.auth import RequireAdmin, RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Node, NodeTag
from meshcore_hub.common.schemas.nodes import NodeTagCreate, NodeTagRead, NodeTagUpdate
from meshcore_hub.common.schemas.nodes import (
NodeTagCreate,
NodeTagMove,
NodeTagRead,
NodeTagsCopyResult,
NodeTagUpdate,
)
router = APIRouter()
@@ -130,6 +136,131 @@ async def update_node_tag(
return NodeTagRead.model_validate(node_tag)
@router.put("/nodes/{public_key}/tags/{key}/move", response_model=NodeTagRead)
async def move_node_tag(
_: RequireAdmin,
session: DbSession,
public_key: str,
key: str,
data: NodeTagMove,
) -> NodeTagRead:
"""Move a node tag to a different node."""
# Check if source and destination are the same
if public_key == data.new_public_key:
raise HTTPException(
status_code=400,
detail="Source and destination nodes are the same",
)
# Find source node
source_query = select(Node).where(Node.public_key == public_key)
source_node = session.execute(source_query).scalar_one_or_none()
if not source_node:
raise HTTPException(status_code=404, detail="Source node not found")
# Find tag
tag_query = select(NodeTag).where(
(NodeTag.node_id == source_node.id) & (NodeTag.key == key)
)
node_tag = session.execute(tag_query).scalar_one_or_none()
if not node_tag:
raise HTTPException(status_code=404, detail="Tag not found")
# Find destination node
dest_query = select(Node).where(Node.public_key == data.new_public_key)
dest_node = session.execute(dest_query).scalar_one_or_none()
if not dest_node:
raise HTTPException(status_code=404, detail="Destination node not found")
# Check if tag already exists on destination node
conflict_query = select(NodeTag).where(
(NodeTag.node_id == dest_node.id) & (NodeTag.key == key)
)
conflict = session.execute(conflict_query).scalar_one_or_none()
if conflict:
raise HTTPException(
status_code=409,
detail=f"Tag '{key}' already exists on destination node",
)
# Move tag to destination node
node_tag.node_id = dest_node.id
session.commit()
session.refresh(node_tag)
return NodeTagRead.model_validate(node_tag)
@router.post(
"/nodes/{public_key}/tags/copy-to/{dest_public_key}",
response_model=NodeTagsCopyResult,
)
async def copy_all_tags(
_: RequireAdmin,
session: DbSession,
public_key: str,
dest_public_key: str,
) -> NodeTagsCopyResult:
"""Copy all tags from one node to another.
Tags that already exist on the destination node are skipped.
"""
# Check if source and destination are the same
if public_key == dest_public_key:
raise HTTPException(
status_code=400,
detail="Source and destination nodes are the same",
)
# Find source node
source_query = select(Node).where(Node.public_key == public_key)
source_node = session.execute(source_query).scalar_one_or_none()
if not source_node:
raise HTTPException(status_code=404, detail="Source node not found")
# Find destination node
dest_query = select(Node).where(Node.public_key == dest_public_key)
dest_node = session.execute(dest_query).scalar_one_or_none()
if not dest_node:
raise HTTPException(status_code=404, detail="Destination node not found")
# Get existing tags on destination node
existing_query = select(NodeTag.key).where(NodeTag.node_id == dest_node.id)
existing_keys = set(session.execute(existing_query).scalars().all())
# Copy tags
copied = 0
skipped_keys = []
for tag in source_node.tags:
if tag.key in existing_keys:
skipped_keys.append(tag.key)
continue
new_tag = NodeTag(
node_id=dest_node.id,
key=tag.key,
value=tag.value,
value_type=tag.value_type,
)
session.add(new_tag)
copied += 1
session.commit()
return NodeTagsCopyResult(
copied=copied,
skipped=len(skipped_keys),
skipped_keys=skipped_keys,
)
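`copy_all_tags` copies only the keys absent from the destination and reports the rest as skipped. That partitioning can be sketched on its own (`plan_tag_copy` is a hypothetical name, for illustration):

```python
def plan_tag_copy(
    source_keys: list[str], dest_keys: set[str]
) -> tuple[list[str], list[str]]:
    """Split source tag keys into (to_copy, skipped) based on what already exists on dest."""
    to_copy: list[str] = []
    skipped: list[str] = []
    for key in source_keys:
        (skipped if key in dest_keys else to_copy).append(key)
    return to_copy, skipped
```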
@router.delete("/nodes/{public_key}/tags/{key}", status_code=204)
async def delete_node_tag(
_: RequireAdmin,
@@ -156,3 +287,27 @@ async def delete_node_tag(
session.delete(node_tag)
session.commit()
@router.delete("/nodes/{public_key}/tags")
async def delete_all_node_tags(
_: RequireAdmin,
session: DbSession,
public_key: str,
) -> dict:
"""Delete all tags for a node."""
# Find node
node_query = select(Node).where(Node.public_key == public_key)
node = session.execute(node_query).scalar_one_or_none()
if not node:
raise HTTPException(status_code=404, detail="Node not found")
# Count and delete all tags
count = len(node.tags)
for tag in node.tags:
session.delete(tag)
session.commit()
return {"deleted": count}


@@ -3,11 +3,12 @@
from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy import func, or_, select
from sqlalchemy.orm import selectinload
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Node
from meshcore_hub.common.models import Node, NodeTag
from meshcore_hub.common.schemas.nodes import NodeList, NodeRead
router = APIRouter()
@@ -17,28 +18,52 @@ router = APIRouter()
async def list_nodes(
_: RequireRead,
session: DbSession,
search: Optional[str] = Query(None, description="Search in name or public key"),
search: Optional[str] = Query(
None, description="Search in name tag, node name, or public key"
),
adv_type: Optional[str] = Query(None, description="Filter by advertisement type"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
member_id: Optional[str] = Query(None, description="Filter by member_id tag value"),
limit: int = Query(50, ge=1, le=500, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> NodeList:
"""List all nodes with pagination and filtering."""
# Build query
query = select(Node)
# Build base query with tags loaded
query = select(Node).options(selectinload(Node.tags))
if search:
# Search in public key, node name, or name tag
# For name tag search, we need to join with NodeTag
search_pattern = f"%{search}%"
query = query.where(
(Node.name.ilike(f"%{search}%")) | (Node.public_key.ilike(f"%{search}%"))
or_(
Node.public_key.ilike(search_pattern),
Node.name.ilike(search_pattern),
Node.id.in_(
select(NodeTag.node_id).where(
NodeTag.key == "name", NodeTag.value.ilike(search_pattern)
)
),
)
)
if adv_type:
query = query.where(Node.adv_type == adv_type)
if member_id:
# Filter nodes that have a member_id tag with the specified value
query = query.where(
Node.id.in_(
select(NodeTag.node_id).where(
NodeTag.key == "member_id", NodeTag.value == member_id
)
)
)
# Get total count
count_query = select(func.count()).select_from(query.subquery())
total = session.execute(count_query).scalar() or 0
# Apply pagination
# Apply pagination and ordering
query = query.order_by(Node.last_seen.desc()).offset(offset).limit(limit)
# Execute

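The widened `list_nodes` search matches against three places: the public key, the node's own name, and its `name` tag (via the `NodeTag` subquery). An in-memory analogue of that predicate, over a simplified dict shape that is an assumption for illustration only:

```python
def matches_search(node: dict, search: str) -> bool:
    """Case-insensitive substring match against public key, node name, or the 'name' tag."""
    needle = search.lower()
    candidates = [node.get("public_key") or "", node.get("name") or ""]
    candidates += [v for k, v in node.get("tags", {}).items() if k == "name"]
    return any(needle in c.lower() for c in candidates)
```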

@@ -5,10 +5,11 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Telemetry
from meshcore_hub.common.models import Node, Telemetry
from meshcore_hub.common.schemas.messages import TelemetryList, TelemetryRead
router = APIRouter()
@@ -19,18 +20,29 @@ async def list_telemetry(
_: RequireRead,
session: DbSession,
node_public_key: Optional[str] = Query(None, description="Filter by node"),
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> TelemetryList:
"""List telemetry records with filtering and pagination."""
# Build query
query = select(Telemetry)
# Alias for receiver node join
ReceiverNode = aliased(Node)
# Build query with receiver node join
query = select(Telemetry, ReceiverNode.public_key.label("receiver_pk")).outerjoin(
ReceiverNode, Telemetry.receiver_node_id == ReceiverNode.id
)
if node_public_key:
query = query.where(Telemetry.node_public_key == node_public_key)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(Telemetry.received_at >= since)
@@ -45,10 +57,25 @@ async def list_telemetry(
query = query.order_by(Telemetry.received_at.desc()).offset(offset).limit(limit)
# Execute
results = session.execute(query).all()
# Build response with received_by
items = []
for tel, receiver_pk in results:
data = {
"id": tel.id,
"receiver_node_id": tel.receiver_node_id,
"received_by": receiver_pk,
"node_id": tel.node_id,
"node_public_key": tel.node_public_key,
"parsed_data": tel.parsed_data,
"received_at": tel.received_at,
"created_at": tel.created_at,
}
items.append(TelemetryRead(**data))
return TelemetryList(
items=items,
total=total,
limit=limit,
offset=offset,
@@ -62,10 +89,26 @@ async def get_telemetry(
telemetry_id: str,
) -> TelemetryRead:
"""Get a single telemetry record by ID."""
ReceiverNode = aliased(Node)
query = (
select(Telemetry, ReceiverNode.public_key.label("receiver_pk"))
.outerjoin(ReceiverNode, Telemetry.receiver_node_id == ReceiverNode.id)
.where(Telemetry.id == telemetry_id)
)
result = session.execute(query).one_or_none()
if not result:
raise HTTPException(status_code=404, detail="Telemetry record not found")
tel, receiver_pk = result
data = {
"id": tel.id,
"receiver_node_id": tel.receiver_node_id,
"received_by": receiver_pk,
"node_id": tel.node_id,
"node_public_key": tel.node_public_key,
"parsed_data": tel.parsed_data,
"received_at": tel.received_at,
"created_at": tel.created_at,
}
return TelemetryRead(**data)
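The outer join introduced above resolves `receiver_node_id` to a public key while still returning telemetry rows whose receiver node is unknown. A minimal SQL sketch of that behaviour, with hypothetical data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE nodes (id INTEGER PRIMARY KEY, public_key TEXT);
CREATE TABLE telemetry (id INTEGER PRIMARY KEY, receiver_node_id INTEGER);
INSERT INTO nodes VALUES (1, 'aa11');
-- One record with a known receiver, one with none.
INSERT INTO telemetry VALUES (10, 1), (11, NULL);
""")

# LEFT OUTER JOIN keeps the row with no matching receiver, yielding NULL
# for receiver_pk -- the same shape as outerjoin(ReceiverNode, ...) above.
rows = conn.execute(
    """
    SELECT t.id, n.public_key AS receiver_pk
    FROM telemetry t
    LEFT OUTER JOIN nodes n ON t.receiver_node_id = n.id
    ORDER BY t.id
    """
).fetchall()
print(rows)  # → [(10, 'aa11'), (11, None)]
```

An inner join would silently drop row 11, which is why the route builds `received_by` from an outer join and tolerates a `None` receiver key.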


@@ -5,10 +5,11 @@ from typing import Optional
from fastapi import APIRouter, HTTPException, Query
from sqlalchemy import func, select
from sqlalchemy.orm import aliased
from meshcore_hub.api.auth import RequireRead
from meshcore_hub.api.dependencies import DbSession
from meshcore_hub.common.models import Node, TracePath
from meshcore_hub.common.schemas.messages import TracePathList, TracePathRead
router = APIRouter()
@@ -18,14 +19,25 @@ router = APIRouter()
async def list_trace_paths(
_: RequireRead,
session: DbSession,
received_by: Optional[str] = Query(
None, description="Filter by receiver node public key"
),
since: Optional[datetime] = Query(None, description="Start timestamp"),
until: Optional[datetime] = Query(None, description="End timestamp"),
limit: int = Query(50, ge=1, le=100, description="Page size"),
offset: int = Query(0, ge=0, description="Page offset"),
) -> TracePathList:
"""List trace paths with filtering and pagination."""
# Alias for receiver node join
ReceiverNode = aliased(Node)
# Build query with receiver node join
query = select(TracePath, ReceiverNode.public_key.label("receiver_pk")).outerjoin(
ReceiverNode, TracePath.receiver_node_id == ReceiverNode.id
)
if received_by:
query = query.where(ReceiverNode.public_key == received_by)
if since:
query = query.where(TracePath.received_at >= since)
@@ -41,10 +53,29 @@ async def list_trace_paths(
query = query.order_by(TracePath.received_at.desc()).offset(offset).limit(limit)
# Execute
results = session.execute(query).all()
# Build response with received_by
items = []
for tp, receiver_pk in results:
data = {
"id": tp.id,
"receiver_node_id": tp.receiver_node_id,
"received_by": receiver_pk,
"initiator_tag": tp.initiator_tag,
"path_len": tp.path_len,
"flags": tp.flags,
"auth": tp.auth,
"path_hashes": tp.path_hashes,
"snr_values": tp.snr_values,
"hop_count": tp.hop_count,
"received_at": tp.received_at,
"created_at": tp.created_at,
}
items.append(TracePathRead(**data))
return TracePathList(
items=items,
total=total,
limit=limit,
offset=offset,
@@ -58,10 +89,30 @@ async def get_trace_path(
trace_path_id: str,
) -> TracePathRead:
"""Get a single trace path by ID."""
ReceiverNode = aliased(Node)
query = (
select(TracePath, ReceiverNode.public_key.label("receiver_pk"))
.outerjoin(ReceiverNode, TracePath.receiver_node_id == ReceiverNode.id)
.where(TracePath.id == trace_path_id)
)
result = session.execute(query).one_or_none()
if not result:
raise HTTPException(status_code=404, detail="Trace path not found")
tp, receiver_pk = result
data = {
"id": tp.id,
"receiver_node_id": tp.receiver_node_id,
"received_by": receiver_pk,
"initiator_tag": tp.initiator_tag,
"path_len": tp.path_len,
"flags": tp.flags,
"auth": tp.auth,
"path_hashes": tp.path_hashes,
"snr_values": tp.snr_values,
"hop_count": tp.hop_count,
"received_at": tp.received_at,
"created_at": tp.created_at,
}
return TracePathRead(**data)


@@ -0,0 +1,225 @@
"""Data retention and cleanup service for MeshCore Hub.
This module provides functionality to delete old event data and inactive nodes
based on configured retention policies.
"""
import logging
from datetime import datetime, timedelta, timezone
from sqlalchemy import delete, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from meshcore_hub.common.models import (
Advertisement,
EventLog,
Message,
Node,
Telemetry,
TracePath,
)
logger = logging.getLogger(__name__)
class CleanupStats:
"""Statistics from a cleanup operation."""
def __init__(self) -> None:
self.advertisements_deleted: int = 0
self.messages_deleted: int = 0
self.telemetry_deleted: int = 0
self.trace_paths_deleted: int = 0
self.event_logs_deleted: int = 0
self.nodes_deleted: int = 0
self.total_deleted: int = 0
def __repr__(self) -> str:
return (
f"CleanupStats(total={self.total_deleted}, "
f"advertisements={self.advertisements_deleted}, "
f"messages={self.messages_deleted}, "
f"telemetry={self.telemetry_deleted}, "
f"trace_paths={self.trace_paths_deleted}, "
f"event_logs={self.event_logs_deleted}, "
f"nodes={self.nodes_deleted})"
)
async def cleanup_old_data(
db: AsyncSession,
retention_days: int,
dry_run: bool = False,
) -> CleanupStats:
"""Delete event data older than the retention period.
Args:
db: Database session
retention_days: Number of days to retain data
dry_run: If True, only count records without deleting
Returns:
CleanupStats object with deletion counts
"""
stats = CleanupStats()
cutoff_date = datetime.now(timezone.utc) - timedelta(days=retention_days)
logger.info(
"Starting data cleanup (dry_run=%s, retention_days=%d, cutoff=%s)",
dry_run,
retention_days,
cutoff_date.isoformat(),
)
# Clean up advertisements
stats.advertisements_deleted = await _cleanup_table(
db, Advertisement, cutoff_date, "advertisements", dry_run
)
# Clean up messages
stats.messages_deleted = await _cleanup_table(
db, Message, cutoff_date, "messages", dry_run
)
# Clean up telemetry
stats.telemetry_deleted = await _cleanup_table(
db, Telemetry, cutoff_date, "telemetry", dry_run
)
# Clean up trace paths
stats.trace_paths_deleted = await _cleanup_table(
db, TracePath, cutoff_date, "trace_paths", dry_run
)
# Clean up event logs
stats.event_logs_deleted = await _cleanup_table(
db, EventLog, cutoff_date, "event_logs", dry_run
)
stats.total_deleted = (
stats.advertisements_deleted
+ stats.messages_deleted
+ stats.telemetry_deleted
+ stats.trace_paths_deleted
+ stats.event_logs_deleted
)
if not dry_run:
await db.commit()
logger.info("Cleanup completed: %s", stats)
else:
logger.info("Cleanup dry run completed: %s", stats)
return stats
async def _cleanup_table(
db: AsyncSession,
model: type,
cutoff_date: datetime,
table_name: str,
dry_run: bool,
) -> int:
"""Delete old records from a specific table.
Args:
db: Database session
model: SQLAlchemy model class
cutoff_date: Delete records older than this date
table_name: Name of table for logging
dry_run: If True, only count without deleting
Returns:
Number of records deleted (or would be deleted in dry_run)
"""
if dry_run:
# Count records that would be deleted
stmt = (
select(func.count())
.select_from(model)
.where(model.created_at < cutoff_date) # type: ignore[attr-defined]
)
result = await db.execute(stmt)
count = result.scalar() or 0
logger.debug(
"[DRY RUN] Would delete %d records from %s older than %s",
count,
table_name,
cutoff_date.isoformat(),
)
return count
else:
# Delete old records
result = await db.execute(delete(model).where(model.created_at < cutoff_date)) # type: ignore[attr-defined]
count = result.rowcount or 0 # type: ignore[attr-defined]
logger.debug(
"Deleted %d records from %s older than %s",
count,
table_name,
cutoff_date.isoformat(),
)
return count
async def cleanup_inactive_nodes(
db: AsyncSession,
inactivity_days: int,
dry_run: bool = False,
) -> int:
"""Delete nodes that haven't been seen for the specified number of days.
Only deletes nodes where last_seen is older than the cutoff date.
Nodes with last_seen=NULL are NOT deleted (never seen on network).
Args:
db: Database session
inactivity_days: Delete nodes not seen for this many days
dry_run: If True, only count without deleting
Returns:
Number of nodes deleted (or would be deleted in dry_run)
"""
cutoff_date = datetime.now(timezone.utc) - timedelta(days=inactivity_days)
logger.info(
"Starting node cleanup (dry_run=%s, inactivity_days=%d, cutoff=%s)",
dry_run,
inactivity_days,
cutoff_date.isoformat(),
)
if dry_run:
# Count nodes that would be deleted
# Only count nodes with last_seen < cutoff (excludes NULL last_seen)
stmt = (
select(func.count())
.select_from(Node)
.where(Node.last_seen < cutoff_date)
.where(Node.last_seen.isnot(None))
)
result = await db.execute(stmt)
count = result.scalar() or 0
logger.info(
"[DRY RUN] Would delete %d nodes not seen since %s",
count,
cutoff_date.isoformat(),
)
return count
else:
# Delete inactive nodes
# Only delete nodes with last_seen < cutoff (excludes NULL last_seen)
result = await db.execute(
delete(Node)
.where(Node.last_seen < cutoff_date)
.where(Node.last_seen.isnot(None))
)
await db.commit()
count = result.rowcount or 0 # type: ignore[attr-defined]
logger.info(
"Deleted %d nodes not seen since %s",
count,
cutoff_date.isoformat(),
)
return count
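The dry-run/live split used throughout the cleanup service boils down to counting versus deleting rows older than a cutoff, then reading `rowcount`. A stdlib sketch of that core (hypothetical table, ISO-8601 timestamps so string comparison orders correctly):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event_logs (id INTEGER PRIMARY KEY, created_at TEXT)")

now = datetime.now(timezone.utc)
old = (now - timedelta(days=45)).isoformat()
recent = (now - timedelta(days=1)).isoformat()
conn.executemany("INSERT INTO event_logs (created_at) VALUES (?)", [(old,), (recent,)])

# retention_days=30, as the CLI default suggests.
cutoff = (now - timedelta(days=30)).isoformat()

# Dry run: count what would be deleted, touch nothing.
would_delete = conn.execute(
    "SELECT COUNT(*) FROM event_logs WHERE created_at < ?", (cutoff,)
).fetchone()[0]

# Live run: delete and read rowcount, as _cleanup_table does.
deleted = conn.execute(
    "DELETE FROM event_logs WHERE created_at < ?", (cutoff,)
).rowcount
print(would_delete, deleted)  # → 1 1 (only the 45-day-old record)
```

The 1-day-old record survives in both modes; only the commit (or its absence) distinguishes the two paths in the real service.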


@@ -1,9 +1,14 @@
"""CLI for the Collector component."""
from typing import TYPE_CHECKING
import click
from meshcore_hub.common.logging import configure_logging
if TYPE_CHECKING:
from meshcore_hub.common.database import DatabaseManager
@click.group(invoke_without_command=True)
@click.pass_context
@@ -42,6 +47,13 @@ from meshcore_hub.common.logging import configure_logging
envvar="MQTT_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
@click.option(
"--data-home",
type=str,
@@ -77,6 +89,7 @@ def collector(
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
data_home: str | None,
seed_home: str | None,
database_url: str | None,
@@ -120,6 +133,7 @@ def collector(
ctx.obj["mqtt_username"] = mqtt_username
ctx.obj["mqtt_password"] = mqtt_password
ctx.obj["prefix"] = prefix
ctx.obj["mqtt_tls"] = mqtt_tls
ctx.obj["data_home"] = data_home or settings.data_home
ctx.obj["seed_home"] = settings.effective_seed_home
ctx.obj["database_url"] = effective_db_url
@@ -134,9 +148,11 @@ def collector(
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
prefix=prefix,
mqtt_tls=mqtt_tls,
database_url=effective_db_url,
log_level=log_level,
data_home=data_home or settings.data_home,
seed_home=settings.effective_seed_home,
)
@@ -146,12 +162,17 @@ def _run_collector_service(
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
database_url: str,
log_level: str,
data_home: str,
seed_home: str,
) -> None:
"""Run the collector service.
Note: Seed data import should be done using the 'meshcore-hub collector seed'
command or the dedicated seed container before starting the collector service.
Webhooks can be configured via environment variables:
- WEBHOOK_ADVERTISEMENT_URL: Webhook for advertisement events
- WEBHOOK_MESSAGE_URL: Webhook for all message events
@@ -168,20 +189,22 @@ def _run_collector_service(
click.echo("Starting MeshCore Collector")
click.echo(f"Data home: {data_home}")
click.echo(f"Seed home: {seed_home}")
click.echo(f"MQTT: {mqtt_host}:{mqtt_port} (prefix: {prefix})")
click.echo(f"Database: {database_url}")
# Load webhook configuration from settings
from meshcore_hub.collector.webhook import (
WebhookDispatcher,
create_webhooks_from_settings,
)
from meshcore_hub.common.config import get_collector_settings
settings = get_collector_settings()
webhooks = create_webhooks_from_settings(settings)
webhook_dispatcher = WebhookDispatcher(webhooks) if webhooks else None
click.echo("")
if webhook_dispatcher and webhook_dispatcher.webhooks:
click.echo(f"Webhooks configured: {len(webhooks)}")
for wh in webhooks:
@@ -191,14 +214,42 @@ def _run_collector_service(
from meshcore_hub.collector.subscriber import run_collector
# Show cleanup configuration
click.echo("")
click.echo("Cleanup configuration:")
if settings.data_retention_enabled:
click.echo(
f" Event data: Enabled (retention: {settings.data_retention_days} days)"
)
else:
click.echo(" Event data: Disabled")
if settings.node_cleanup_enabled:
click.echo(
f" Inactive nodes: Enabled (inactivity: {settings.node_cleanup_days} days)"
)
else:
click.echo(" Inactive nodes: Disabled")
if settings.data_retention_enabled or settings.node_cleanup_enabled:
click.echo(f" Interval: {settings.data_retention_interval_hours} hours")
click.echo("")
click.echo("Starting MQTT subscriber...")
run_collector(
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
database_url=database_url,
webhook_dispatcher=webhook_dispatcher,
cleanup_enabled=settings.data_retention_enabled,
cleanup_retention_days=settings.data_retention_days,
cleanup_interval_hours=settings.data_retention_interval_hours,
node_cleanup_enabled=settings.node_cleanup_enabled,
node_cleanup_days=settings.node_cleanup_days,
)
@@ -215,9 +266,11 @@ def run_cmd(ctx: click.Context) -> None:
mqtt_username=ctx.obj["mqtt_username"],
mqtt_password=ctx.obj["mqtt_password"],
prefix=ctx.obj["prefix"],
mqtt_tls=ctx.obj["mqtt_tls"],
database_url=ctx.obj["database_url"],
log_level=ctx.obj["log_level"],
data_home=ctx.obj["data_home"],
seed_home=ctx.obj["seed_home"],
)
@@ -236,17 +289,15 @@ def seed_cmd(
"""Import seed data from SEED_HOME directory.
Looks for the following files in SEED_HOME:
- node_tags.yaml: Node tag definitions (keyed by public_key)
- members.yaml: Network member definitions
Files that don't exist are skipped. This command is idempotent -
existing records are updated, new records are created.
SEED_HOME defaults to ./seed but can be overridden
with the --seed-home option or SEED_HOME environment variable.
"""
from pathlib import Path
configure_logging(level=ctx.obj["log_level"])
seed_home = ctx.obj["seed_home"]
@@ -254,50 +305,17 @@ def seed_cmd(
click.echo(f"Database: {ctx.obj['database_url']}")
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.tag_import import import_tags
from meshcore_hub.collector.member_import import import_members
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
# Run seed import
imported_any = _run_seed_import(
seed_home=seed_home,
db=db,
create_nodes=not no_create_nodes,
verbose=True,
)
if not imported_any:
click.echo("\nNo seed files found. Nothing to import.")
@@ -307,6 +325,79 @@ def seed_cmd(
db.dispose()
def _run_seed_import(
seed_home: str,
db: "DatabaseManager",
create_nodes: bool = True,
verbose: bool = False,
) -> bool:
"""Run seed import from SEED_HOME directory.
Args:
seed_home: Path to seed home directory
db: Database manager instance
create_nodes: If True, create nodes that don't exist
verbose: If True, output progress messages
Returns:
True if any files were imported, False otherwise
"""
from pathlib import Path
from meshcore_hub.collector.member_import import import_members
from meshcore_hub.collector.tag_import import import_tags
imported_any = False
# Import node tags if file exists
node_tags_file = Path(seed_home) / "node_tags.yaml"
if node_tags_file.exists():
if verbose:
click.echo(f"\nImporting node tags from: {node_tags_file}")
stats = import_tags(
file_path=str(node_tags_file),
db=db,
create_nodes=create_nodes,
clear_existing=True,
)
if verbose:
if stats["deleted"]:
click.echo(f" Deleted {stats['deleted']} existing tags")
click.echo(
f" Tags: {stats['created']} created, {stats['updated']} updated"
)
if stats["nodes_created"]:
click.echo(f" Nodes created: {stats['nodes_created']}")
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
elif verbose:
click.echo(f"\nNo node_tags.yaml found in {seed_home}")
# Import members if file exists
members_file = Path(seed_home) / "members.yaml"
if members_file.exists():
if verbose:
click.echo(f"\nImporting members from: {members_file}")
stats = import_members(
file_path=str(members_file),
db=db,
)
if verbose:
click.echo(
f" Members: {stats['created']} created, {stats['updated']} updated"
)
if stats["errors"]:
for error in stats["errors"]:
click.echo(f" Error: {error}", err=True)
imported_any = True
elif verbose:
click.echo(f"\nNo members.yaml found in {seed_home}")
return imported_any
@collector.command("import-tags")
@click.argument("file", type=click.Path(), required=False, default=None)
@click.option(
@@ -315,40 +406,48 @@ def seed_cmd(
default=False,
help="Skip tags for nodes that don't exist (default: create nodes)",
)
@click.option(
"--clear-existing",
is_flag=True,
default=False,
help="Delete all existing tags before importing",
)
@click.pass_context
def import_tags_cmd(
ctx: click.Context,
file: str | None,
no_create_nodes: bool,
clear_existing: bool,
) -> None:
"""Import node tags from a JSON file.
"""Import node tags from a YAML file.
Reads a YAML file containing tag definitions and upserts them
into the database. By default, existing tags are updated and new tags are created.
Use --clear-existing to delete all tags before importing.
FILE is the path to the YAML file containing tags.
If not provided, defaults to {SEED_HOME}/node_tags.yaml.
Expected YAML format (keyed by public_key):
\b
0123456789abcdef...:
friendly_name: My Node
altitude:
value: "150"
type: number
active:
value: "true"
type: boolean
Shorthand is also supported (string values with default type):
\b
0123456789abcdef...:
friendly_name: My Node
role: gateway
Supported types: string, number, boolean
"""
from pathlib import Path
@@ -362,7 +461,7 @@ def import_tags_cmd(
if not Path(tags_file).exists():
click.echo(f"Tags file not found: {tags_file}")
if not file:
click.echo("Specify a file path or create the default node_tags.json.")
click.echo("Specify a file path or create the default node_tags.yaml.")
return
click.echo(f"Importing tags from: {tags_file}")
@@ -371,20 +470,22 @@ def import_tags_cmd(
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.tag_import import import_tags
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
# Import tags
stats = import_tags(
file_path=tags_file,
db=db,
create_nodes=not no_create_nodes,
clear_existing=clear_existing,
)
# Report results
click.echo("")
click.echo("Import complete:")
if stats["deleted"]:
click.echo(f" Tags deleted: {stats['deleted']}")
click.echo(f" Total tags in file: {stats['total']}")
click.echo(f" Tags created: {stats['created']}")
click.echo(f" Tags updated: {stats['updated']}")
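The shorthand tag format documented above (a bare string standing in for a full `{value, type}` entry) can be normalized with a small helper. This is a hypothetical sketch, not code from the PR; the real importer lives in `meshcore_hub.collector.tag_import`:

```python
def normalize_tags(raw: dict) -> dict:
    """Expand shorthand scalar values into full {"value", "type"} dicts."""
    normalized = {}
    for key, val in raw.items():
        if isinstance(val, dict):
            # Full form: coerce value to a string, default type to "string".
            normalized[key] = {
                "value": str(val["value"]),
                "type": val.get("type", "string"),
            }
        else:
            # Shorthand: a bare scalar becomes a string-typed tag.
            normalized[key] = {"value": str(val), "type": "string"}
    return normalized


tags = normalize_tags({
    "friendly_name": "My Node",                      # shorthand
    "altitude": {"value": "150", "type": "number"},  # full form
})
print(tags)
```

Both spellings end up in one canonical shape, so downstream upsert logic only has to handle the full form.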
@@ -407,33 +508,29 @@ def import_members_cmd(
ctx: click.Context,
file: str | None,
) -> None:
"""Import network members from a JSON file.
"""Import network members from a YAML file.
Reads a YAML file containing member definitions and upserts them
into the database. Existing members (matched by name) are updated,
new members are created.
FILE is the path to the YAML file containing members.
If not provided, defaults to {SEED_HOME}/members.yaml.
Expected YAML format (list):
\b
- name: John Doe
callsign: N0CALL
role: Network Operator
description: Example member
Or with "members" key:
\b
members:
- name: John Doe
callsign: N0CALL
"""
from pathlib import Path
@@ -447,7 +544,7 @@ def import_members_cmd(
if not Path(members_file).exists():
click.echo(f"Members file not found: {members_file}")
if not file:
click.echo("Specify a file path or create the default members.json.")
click.echo("Specify a file path or create the default members.yaml.")
return
click.echo(f"Importing members from: {members_file}")
@@ -456,9 +553,8 @@ def import_members_cmd(
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.member_import import import_members
# Initialize database (schema managed by Alembic migrations)
db = DatabaseManager(ctx.obj["database_url"])
# Import members
stats = import_members(
@@ -480,3 +576,299 @@ def import_members_cmd(
click.echo(f" - {error}", err=True)
db.dispose()
@collector.command("cleanup")
@click.option(
"--retention-days",
type=int,
default=30,
envvar="DATA_RETENTION_DAYS",
help="Number of days to retain data (default: 30)",
)
@click.option(
"--dry-run",
is_flag=True,
default=False,
help="Show what would be deleted without deleting",
)
@click.pass_context
def cleanup_cmd(
ctx: click.Context,
retention_days: int,
dry_run: bool,
) -> None:
"""Manually run data cleanup to delete old events.
Deletes event data older than the retention period:
- Advertisements
- Messages (channel and direct)
- Telemetry
- Trace paths
- Event logs
Node records are never deleted - only event data.
Use --dry-run to preview what would be deleted without
actually deleting anything.
"""
import asyncio
configure_logging(level=ctx.obj["log_level"])
click.echo(f"Database: {ctx.obj['database_url']}")
click.echo(f"Retention: {retention_days} days")
click.echo(f"Mode: {'DRY RUN' if dry_run else 'LIVE'}")
click.echo("")
if dry_run:
click.echo("Running in dry-run mode - no data will be deleted.")
else:
click.echo("WARNING: This will permanently delete old event data!")
if not click.confirm("Continue?"):
click.echo("Aborted.")
return
click.echo("")
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.collector.cleanup import cleanup_old_data
# Initialize database
db = DatabaseManager(ctx.obj["database_url"])
# Run cleanup
async def run_cleanup() -> None:
async with db.async_session() as session:
stats = await cleanup_old_data(
session,
retention_days,
dry_run=dry_run,
)
click.echo("")
click.echo("Cleanup results:")
click.echo(f" Advertisements: {stats.advertisements_deleted}")
click.echo(f" Messages: {stats.messages_deleted}")
click.echo(f" Telemetry: {stats.telemetry_deleted}")
click.echo(f" Trace paths: {stats.trace_paths_deleted}")
click.echo(f" Event logs: {stats.event_logs_deleted}")
click.echo(f" Total: {stats.total_deleted}")
if dry_run:
click.echo("")
click.echo("(Dry run - no data was actually deleted)")
asyncio.run(run_cleanup())
db.dispose()
click.echo("")
click.echo("Cleanup complete." if not dry_run else "Dry run complete.")
@collector.command("truncate")
@click.option(
"--members",
is_flag=True,
default=False,
help="Truncate members table",
)
@click.option(
"--nodes",
is_flag=True,
default=False,
help="Truncate nodes table (also clears tags, advertisements, messages, telemetry, trace paths)",
)
@click.option(
"--messages",
is_flag=True,
default=False,
help="Truncate messages table",
)
@click.option(
"--advertisements",
is_flag=True,
default=False,
help="Truncate advertisements table",
)
@click.option(
"--telemetry",
is_flag=True,
default=False,
help="Truncate telemetry table",
)
@click.option(
"--trace-paths",
is_flag=True,
default=False,
help="Truncate trace_paths table",
)
@click.option(
"--event-logs",
is_flag=True,
default=False,
help="Truncate event_logs table",
)
@click.option(
"--all",
"truncate_all",
is_flag=True,
default=False,
help="Truncate ALL tables (use with caution!)",
)
@click.option(
"--yes",
is_flag=True,
default=False,
help="Skip confirmation prompt",
)
@click.pass_context
def truncate_cmd(
ctx: click.Context,
members: bool,
nodes: bool,
messages: bool,
advertisements: bool,
telemetry: bool,
trace_paths: bool,
event_logs: bool,
truncate_all: bool,
yes: bool,
) -> None:
"""Truncate (clear) data tables.
WARNING: This permanently deletes data! Use with caution.
Examples:
# Clear members table
meshcore-hub collector truncate --members
# Clear messages and advertisements
meshcore-hub collector truncate --messages --advertisements
# Clear everything (requires confirmation)
meshcore-hub collector truncate --all
Note: Clearing nodes also clears all related data (tags, advertisements,
messages, telemetry, trace paths) due to foreign key constraints.
"""
configure_logging(level=ctx.obj["log_level"])
# Determine what to truncate
if truncate_all:
tables_to_clear = {
"members": True,
"nodes": True,
"messages": True,
"advertisements": True,
"telemetry": True,
"trace_paths": True,
"event_logs": True,
}
else:
tables_to_clear = {
"members": members,
"nodes": nodes,
"messages": messages,
"advertisements": advertisements,
"telemetry": telemetry,
"trace_paths": trace_paths,
"event_logs": event_logs,
}
# Check if any tables selected
if not any(tables_to_clear.values()):
click.echo("No tables specified. Use --help to see available options.")
return
# Show what will be cleared
click.echo("Database: " + ctx.obj["database_url"])
click.echo("")
click.echo("The following tables will be PERMANENTLY CLEARED:")
for table, should_clear in tables_to_clear.items():
if should_clear:
click.echo(f" - {table}")
if tables_to_clear.get("nodes"):
click.echo("")
click.echo(
"WARNING: Clearing nodes will also clear all related data due to foreign keys:"
)
click.echo(" - node_tags")
click.echo(" - advertisements")
click.echo(" - messages")
click.echo(" - telemetry")
click.echo(" - trace_paths")
click.echo("")
# Confirm
if not yes:
if not click.confirm(
"Are you sure you want to permanently delete this data?", default=False
):
click.echo("Aborted.")
return
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import (
Advertisement,
EventLog,
Member,
Message,
Node,
NodeTag,
Telemetry,
TracePath,
)
from sqlalchemy import delete
from sqlalchemy.engine import CursorResult
db = DatabaseManager(ctx.obj["database_url"])
with db.session_scope() as session:
# Truncate in correct order to respect foreign keys
cleared: list[str] = []
# Clear members (no dependencies)
if tables_to_clear.get("members"):
result: CursorResult = session.execute(delete(Member)) # type: ignore
cleared.append(f"members: {result.rowcount} rows")
# Clear event-specific tables first (they depend on nodes)
if tables_to_clear.get("messages"):
result = session.execute(delete(Message)) # type: ignore
cleared.append(f"messages: {result.rowcount} rows")
if tables_to_clear.get("advertisements"):
result = session.execute(delete(Advertisement)) # type: ignore
cleared.append(f"advertisements: {result.rowcount} rows")
if tables_to_clear.get("telemetry"):
result = session.execute(delete(Telemetry)) # type: ignore
cleared.append(f"telemetry: {result.rowcount} rows")
if tables_to_clear.get("trace_paths"):
result = session.execute(delete(TracePath)) # type: ignore
cleared.append(f"trace_paths: {result.rowcount} rows")
if tables_to_clear.get("event_logs"):
result = session.execute(delete(EventLog)) # type: ignore
cleared.append(f"event_logs: {result.rowcount} rows")
# Clear nodes last (this will cascade delete tags and any remaining events)
if tables_to_clear.get("nodes"):
# Delete tags first (they depend on nodes)
tag_result: CursorResult = session.execute(delete(NodeTag)) # type: ignore
cleared.append(f"node_tags: {tag_result.rowcount} rows (cascade)")
# Delete nodes (will cascade to remaining related tables)
node_result: CursorResult = session.execute(delete(Node)) # type: ignore
cleared.append(f"nodes: {node_result.rowcount} rows")
db.dispose()
click.echo("")
click.echo("Truncate complete. Cleared:")
for item in cleared:
click.echo(f" - {item}")
click.echo("")


@@ -19,7 +19,7 @@ def register_all_handlers(subscriber: "Subscriber") -> None:
)
from meshcore_hub.collector.handlers.trace import handle_trace_data
from meshcore_hub.collector.handlers.telemetry import handle_telemetry
from meshcore_hub.collector.handlers.contacts import handle_contact
from meshcore_hub.collector.handlers.event_log import handle_event_log
# Persisted events with specific handlers
@@ -28,7 +28,7 @@ def register_all_handlers(subscriber: "Subscriber") -> None:
subscriber.register_handler("channel_msg_recv", handle_channel_message)
subscriber.register_handler("trace_data", handle_trace_data)
subscriber.register_handler("telemetry_response", handle_telemetry)
subscriber.register_handler("contacts", handle_contacts)
subscriber.register_handler("contact", handle_contact) # Individual contact events
# Informational events (logged only)
subscriber.register_handler("send_confirmed", handle_event_log)


@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Advertisement, Node
from meshcore_hub.common.hash_utils import compute_advertisement_hash
from meshcore_hub.common.models import Advertisement, Node, add_event_receiver
logger = logging.getLogger(__name__)
@@ -40,8 +42,17 @@ def handle_advertisement(
flags = payload.get("flags")
now = datetime.now(timezone.utc)
# Compute event hash for deduplication (30-second time bucket)
event_hash = compute_advertisement_hash(
public_key=adv_public_key,
name=name,
adv_type=adv_type,
flags=flags,
received_at=now,
)
with db.session_scope() as session:
# Find or create receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -55,6 +66,37 @@ def handle_advertisement(
)
session.add(receiver_node)
session.flush()
else:
receiver_node.last_seen = now
# Check if advertisement with same hash already exists
existing = session.execute(
select(Advertisement.id).where(Advertisement.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Still update advertised node's last_seen even for duplicate advertisements
node_query = select(Node).where(Node.public_key == adv_public_key)
node = session.execute(node_query).scalar_one_or_none()
if node:
node.last_seen = now
# Add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None, # Advertisements don't have SNR
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to advertisement "
f"(hash={event_hash[:8]}...)"
)
return
# Find or create advertised node
node_query = select(Node).where(Node.public_key == adv_public_key)
@@ -91,9 +133,43 @@ def handle_advertisement(
adv_type=adv_type,
flags=flags,
received_at=now,
event_hash=event_hash,
)
session.add(advertisement)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate advertisement skipped (race condition, "
f"hash={event_hash[:8]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="advertisement",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
logger.info(
f"Stored advertisement from {name or adv_public_key[:12]!r} "
f"(type={adv_type})"


@@ -1,4 +1,4 @@
"""Handler for contacts sync events."""
"""Handler for contact sync events."""
import logging
from datetime import datetime, timezone
@@ -11,65 +11,90 @@ from meshcore_hub.common.models import Node
logger = logging.getLogger(__name__)
# Map numeric node type to string representation
NODE_TYPE_MAP = {
0: "none",
1: "chat",
2: "repeater",
3: "room",
}
def handle_contacts(
def handle_contact(
public_key: str,
event_type: str,
payload: dict[str, Any],
db: DatabaseManager,
) -> None:
"""Handle a contacts sync event.
"""Handle a single contact event.
Upserts all contacts in the contacts list.
Upserts a contact into the nodes table.
Args:
public_key: Receiver node's public key (from MQTT topic)
event_type: Event type name
payload: Contacts payload
payload: Single contact object with fields:
- public_key: Contact's public key
- adv_name: Advertised name
- type: Numeric node type (0=none, 1=chat, 2=repeater, 3=room)
db: Database manager
"""
contacts = payload.get("contacts", [])
if not contacts:
logger.debug("Empty contacts list received")
contact_key = payload.get("public_key")
if not contact_key:
logger.warning("Contact event missing public_key field")
return
# Device uses 'adv_name' for the advertised name
name = payload.get("adv_name") or payload.get("name")
# GPS coordinates (optional)
lat = payload.get("adv_lat")
lon = payload.get("adv_lon")
logger.info(f"Processing contact: {contact_key[:12]}... adv_name={name}")
# Device uses numeric 'type' field, convert to string
raw_type = payload.get("type")
if raw_type is not None:
node_type: str | None = NODE_TYPE_MAP.get(raw_type, str(raw_type))
else:
node_type = payload.get("node_type")
now = datetime.now(timezone.utc)
created_count = 0
updated_count = 0
with db.session_scope() as session:
for contact in contacts:
contact_key = contact.get("public_key")
if not contact_key:
continue
# Find or create node
node_query = select(Node).where(Node.public_key == contact_key)
node = session.execute(node_query).scalar_one_or_none()
name = contact.get("name")
node_type = contact.get("node_type")
# Find or create node
node_query = select(Node).where(Node.public_key == contact_key)
node = session.execute(node_query).scalar_one_or_none()
if node:
# Update existing node
if name and not node.name:
node.name = name
if node_type and not node.adv_type:
node.adv_type = node_type
node.last_seen = now
updated_count += 1
else:
# Create new node
node = Node(
public_key=contact_key,
name=name,
adv_type=node_type,
first_seen=now,
last_seen=now,
if node:
# Update existing node - always update name if we have one
if name and name != node.name:
logger.info(
f"Updating node {contact_key[:12]}... "
f"name: {node.name!r} -> {name!r}"
)
session.add(node)
created_count += 1
logger.info(
f"Processed contacts sync: {created_count} new, {updated_count} updated"
)
node.name = name
if node_type and not node.adv_type:
node.adv_type = node_type
# Update GPS coordinates if provided
if lat is not None:
node.lat = lat
if lon is not None:
node.lon = lon
# Do NOT update last_seen for contact sync - only advertisement events
# should update last_seen since that's when the node was actually seen
else:
# Create new node from contact database
# Set last_seen=None since we haven't actually seen this node advertise yet
node = Node(
public_key=contact_key,
name=name,
adv_type=node_type,
first_seen=now,
last_seen=None, # Will be set when we receive an advertisement
lat=lat,
lon=lon,
)
session.add(node)
logger.info(f"Created node from contact: {contact_key[:12]}... ({name})")


@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Message, Node
from meshcore_hub.common.hash_utils import compute_message_hash
from meshcore_hub.common.models import Message, Node, add_event_receiver
logger = logging.getLogger(__name__)
@@ -84,8 +86,17 @@ def _handle_message(
except (ValueError, OSError):
pass
# Compute event hash for deduplication
event_hash = compute_message_hash(
text=text,
pubkey_prefix=pubkey_prefix,
channel_idx=channel_idx,
sender_timestamp=sender_timestamp,
txt_type=txt_type,
)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -102,6 +113,29 @@ def _handle_message(
else:
receiver_node.last_seen = now
# Check if message with same hash already exists
existing = session.execute(
select(Message.id).where(Message.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to message "
f"(hash={event_hash[:8]}...)"
)
return
# Create message record
message = Message(
receiver_node_id=receiver_node.id if receiver_node else None,
@@ -115,9 +149,42 @@ def _handle_message(
snr=snr,
sender_timestamp=sender_timestamp,
received_at=now,
event_hash=event_hash,
)
session.add(message)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate message skipped (race condition, hash={event_hash[:8]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="message",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=snr,
received_at=now,
)
return
if message_type == "contact":
logger.info(
f"Stored contact message from {pubkey_prefix!r}: "


@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Node, Telemetry
from meshcore_hub.common.hash_utils import compute_telemetry_hash
from meshcore_hub.common.models import Node, Telemetry, add_event_receiver
logger = logging.getLogger(__name__)
@@ -49,8 +51,15 @@ def handle_telemetry(
except ValueError:
lpp_bytes = lpp_data.encode()
# Compute event hash for deduplication (30-second time bucket)
event_hash = compute_telemetry_hash(
node_public_key=node_public_key,
parsed_data=parsed_data,
received_at=now,
)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -67,6 +76,29 @@ def handle_telemetry(
else:
receiver_node.last_seen = now
# Check if telemetry with same hash already exists
existing = session.execute(
select(Telemetry.id).where(Telemetry.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to telemetry "
f"(node={node_public_key[:12]}...)"
)
return
# Find or create reporting node
reporting_node = None
if node_public_key:
@@ -92,9 +124,43 @@ def handle_telemetry(
lpp_data=lpp_bytes,
parsed_data=parsed_data,
received_at=now,
event_hash=event_hash,
)
session.add(telemetry)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate telemetry skipped (race condition, "
f"node={node_public_key[:12]}...)"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="telemetry",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
# Log telemetry values
if parsed_data:
values = ", ".join(f"{k}={v}" for k, v in parsed_data.items())


@@ -5,9 +5,11 @@ from datetime import datetime, timezone
from typing import Any
from sqlalchemy import select
from sqlalchemy.exc import IntegrityError
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Node, TracePath
from meshcore_hub.common.hash_utils import compute_trace_hash
from meshcore_hub.common.models import Node, TracePath, add_event_receiver
logger = logging.getLogger(__name__)
@@ -40,8 +42,11 @@ def handle_trace_data(
snr_values = payload.get("snr_values")
hop_count = payload.get("hop_count")
# Compute event hash for deduplication (initiator_tag is unique per trace)
event_hash = compute_trace_hash(initiator_tag=initiator_tag)
with db.session_scope() as session:
# Find receiver node
# Find or create receiver node first (needed for both new and duplicate events)
receiver_node = None
if public_key:
receiver_query = select(Node).where(Node.public_key == public_key)
@@ -58,6 +63,29 @@ def handle_trace_data(
else:
receiver_node.last_seen = now
# Check if trace with same hash already exists
existing = session.execute(
select(TracePath.id).where(TracePath.event_hash == event_hash)
).scalar_one_or_none()
if existing:
# Event already exists - just add this receiver to the junction table
if receiver_node:
added = add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None, # Trace events don't have a single SNR value
received_at=now,
)
if added:
logger.debug(
f"Added receiver {public_key[:12]}... to trace "
f"(tag={initiator_tag})"
)
return
# Create trace path record
trace_path = TracePath(
receiver_node_id=receiver_node.id if receiver_node else None,
@@ -69,7 +97,40 @@ def handle_trace_data(
snr_values=snr_values,
hop_count=hop_count,
received_at=now,
event_hash=event_hash,
)
session.add(trace_path)
# Add first receiver to junction table
if receiver_node:
add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
# Flush to check for duplicate constraint violation (race condition)
try:
session.flush()
except IntegrityError:
# Race condition: another request inserted the same event_hash
session.rollback()
logger.debug(
f"Duplicate trace skipped (race condition, tag={initiator_tag})"
)
# Re-add receiver to existing event in a new transaction
if receiver_node:
add_event_receiver(
session=session,
event_type="trace",
event_hash=event_hash,
receiver_node_id=receiver_node.id,
snr=None,
received_at=now,
)
return
logger.info(f"Stored trace data: tag={initiator_tag}, hops={hop_count}")


@@ -1,11 +1,11 @@
"""Import members from JSON file."""
"""Import members from YAML file."""
import json
import logging
from pathlib import Path
from typing import Any, Optional
from pydantic import BaseModel, Field, field_validator
import yaml
from pydantic import BaseModel, Field
from sqlalchemy import select
from meshcore_hub.common.database import DatabaseManager
@@ -15,47 +15,46 @@ logger = logging.getLogger(__name__)
class MemberData(BaseModel):
"""Schema for a member entry in the import file."""
"""Schema for a member entry in the import file.
Note: Nodes are associated with members via a 'member_id' tag on the node,
not through this schema.
"""
member_id: str = Field(..., min_length=1, max_length=100)
name: str = Field(..., min_length=1, max_length=255)
callsign: Optional[str] = Field(default=None, max_length=20)
role: Optional[str] = Field(default=None, max_length=100)
description: Optional[str] = Field(default=None)
contact: Optional[str] = Field(default=None, max_length=255)
public_key: Optional[str] = Field(default=None)
@field_validator("public_key")
@classmethod
def validate_public_key(cls, v: Optional[str]) -> Optional[str]:
"""Validate and normalize public key if provided."""
if v is None:
return None
if len(v) != 64:
raise ValueError(f"public_key must be 64 characters, got {len(v)}")
if not all(c in "0123456789abcdefABCDEF" for c in v):
raise ValueError("public_key must be a valid hex string")
return v.lower()
def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
"""Load and validate members from a JSON file.
"""Load and validate members from a YAML file.
Supports two formats:
1. List of member objects:
[{"name": "Member 1", ...}, {"name": "Member 2", ...}]
- member_id: member1
name: Member 1
callsign: M1
2. Object with "members" key:
{"members": [{"name": "Member 1", ...}, ...]}
members:
- member_id: member1
name: Member 1
callsign: M1
Args:
file_path: Path to the members JSON file
file_path: Path to the members YAML file
Returns:
List of validated member dictionaries
Raises:
FileNotFoundError: If file does not exist
json.JSONDecodeError: If file is not valid JSON
yaml.YAMLError: If file is not valid YAML
ValueError: If file content is invalid
"""
path = Path(file_path)
@@ -63,7 +62,7 @@ def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
raise FileNotFoundError(f"Members file not found: {file_path}")
with open(path, "r") as f:
data = json.load(f)
data = yaml.safe_load(f)
# Handle both formats
if isinstance(data, list):
@@ -73,15 +72,15 @@ def load_members_file(file_path: str | Path) -> list[dict[str, Any]]:
if not isinstance(members_list, list):
raise ValueError("'members' key must contain a list")
else:
raise ValueError(
"Members file must be a list or an object with 'members' key"
)
raise ValueError("Members file must be a list or a mapping with 'members' key")
# Validate each member
validated: list[dict[str, Any]] = []
for i, member in enumerate(members_list):
if not isinstance(member, dict):
raise ValueError(f"Member at index {i} must be an object")
if "member_id" not in member:
raise ValueError(f"Member at index {i} must have a 'member_id' field")
if "name" not in member:
raise ValueError(f"Member at index {i} must have a 'name' field")
@@ -99,13 +98,16 @@ def import_members(
file_path: str | Path,
db: DatabaseManager,
) -> dict[str, Any]:
"""Import members from a JSON file into the database.
"""Import members from a YAML file into the database.
Performs upsert operations based on name - existing members are updated,
Performs upsert operations based on member_id - existing members are updated,
new members are created.
Note: Nodes are associated with members via a 'member_id' tag on the node.
This import does not manage node associations.
Args:
file_path: Path to the members JSON file
file_path: Path to the members YAML file
db: Database manager instance
Returns:
@@ -134,14 +136,17 @@ def import_members(
with db.session_scope() as session:
for member_data in members_data:
try:
member_id = member_data["member_id"]
name = member_data["name"]
# Find existing member by name
query = select(Member).where(Member.name == name)
# Find existing member by member_id
query = select(Member).where(Member.member_id == member_id)
existing = session.execute(query).scalar_one_or_none()
if existing:
# Update existing member
if member_data.get("name") is not None:
existing.name = member_data["name"]
if member_data.get("callsign") is not None:
existing.callsign = member_data["callsign"]
if member_data.get("role") is not None:
@@ -150,27 +155,26 @@ def import_members(
existing.description = member_data["description"]
if member_data.get("contact") is not None:
existing.contact = member_data["contact"]
if member_data.get("public_key") is not None:
existing.public_key = member_data["public_key"]
stats["updated"] += 1
logger.debug(f"Updated member: {name}")
logger.debug(f"Updated member: {member_id} ({name})")
else:
# Create new member
new_member = Member(
member_id=member_id,
name=name,
callsign=member_data.get("callsign"),
role=member_data.get("role"),
description=member_data.get("description"),
contact=member_data.get("contact"),
public_key=member_data.get("public_key"),
)
session.add(new_member)
stats["created"] += 1
logger.debug(f"Created member: {name}")
logger.debug(f"Created member: {member_id} ({name})")
except Exception as e:
error_msg = f"Error processing member '{member_data.get('name', 'unknown')}': {e}"
error_msg = f"Error processing member '{member_data.get('member_id', 'unknown')}' ({member_data.get('name', 'unknown')}): {e}"
stats["errors"].append(error_msg)
logger.error(error_msg)
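`load_members_file` accepts either a bare list or a mapping with a `members` key. After `yaml.safe_load`, both shapes can be normalized to the same list; a sketch of that branch (parsing omitted so the example stays dependency-free):

```python
from typing import Any

def normalize_members(data: Any) -> list[dict[str, Any]]:
    """Accept either a top-level list or a mapping with a 'members' key."""
    if isinstance(data, list):
        return data
    if isinstance(data, dict):
        members = data.get("members")
        if isinstance(members, list):
            return members
        raise ValueError("'members' key must contain a list")
    raise ValueError("Members file must be a list or a mapping with 'members' key")

flat = normalize_members([{"member_id": "m1", "name": "Member 1"}])
nested = normalize_members({"members": [{"member_id": "m1", "name": "Member 1"}]})
```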


@@ -6,6 +6,7 @@ The subscriber:
3. Routes events to appropriate handlers
4. Persists data to database
5. Dispatches events to configured webhooks
6. Performs scheduled data cleanup if enabled
"""
import asyncio
@@ -14,6 +15,7 @@ import signal
import threading
import time
import uuid
from datetime import datetime, timezone
from typing import Any, Callable, Optional, TYPE_CHECKING
from meshcore_hub.common.database import DatabaseManager
@@ -38,6 +40,11 @@ class Subscriber:
mqtt_client: MQTTClient,
db_manager: DatabaseManager,
webhook_dispatcher: Optional["WebhookDispatcher"] = None,
cleanup_enabled: bool = False,
cleanup_retention_days: int = 30,
cleanup_interval_hours: int = 24,
node_cleanup_enabled: bool = False,
node_cleanup_days: int = 90,
):
"""Initialize subscriber.
@@ -45,6 +52,11 @@ class Subscriber:
mqtt_client: MQTT client instance
db_manager: Database manager instance
webhook_dispatcher: Optional webhook dispatcher for event forwarding
cleanup_enabled: Enable automatic event data cleanup
cleanup_retention_days: Number of days to retain event data
cleanup_interval_hours: Hours between cleanup runs
node_cleanup_enabled: Enable automatic cleanup of inactive nodes
node_cleanup_days: Remove nodes not seen for this many days
"""
self.mqtt = mqtt_client
self.db = db_manager
@@ -59,6 +71,14 @@ class Subscriber:
self._webhook_queue: list[tuple[str, dict[str, Any], str]] = []
self._webhook_lock = threading.Lock()
self._webhook_thread: Optional[threading.Thread] = None
# Data cleanup
self._cleanup_enabled = cleanup_enabled
self._cleanup_retention_days = cleanup_retention_days
self._cleanup_interval_hours = cleanup_interval_hours
self._node_cleanup_enabled = node_cleanup_enabled
self._node_cleanup_days = node_cleanup_days
self._cleanup_thread: Optional[threading.Thread] = None
self._last_cleanup: Optional[datetime] = None
@property
def is_healthy(self) -> bool:
@@ -202,18 +222,129 @@ class Subscriber:
if self._webhook_thread.is_alive():
logger.warning("Webhook processor thread did not stop cleanly")
def _start_cleanup_scheduler(self) -> None:
"""Start background thread for periodic data cleanup."""
if not self._cleanup_enabled and not self._node_cleanup_enabled:
logger.info("Data cleanup and node cleanup are both disabled")
return
logger.info(
"Starting cleanup scheduler (interval_hours=%d)",
self._cleanup_interval_hours,
)
if self._cleanup_enabled:
logger.info(
" Event data cleanup: ENABLED (retention_days=%d)",
self._cleanup_retention_days,
)
else:
logger.info(" Event data cleanup: DISABLED")
if self._node_cleanup_enabled:
logger.info(
" Node cleanup: ENABLED (inactivity_days=%d)", self._node_cleanup_days
)
else:
logger.info(" Node cleanup: DISABLED")
def run_cleanup_loop() -> None:
"""Run async cleanup tasks in background thread."""
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
try:
while self._running:
# Check if cleanup is due
now = datetime.now(timezone.utc)
should_run = False
if self._last_cleanup is None:
# First run
should_run = True
else:
# Check if interval has passed
hours_since_last = (
now - self._last_cleanup
).total_seconds() / 3600
should_run = hours_since_last >= self._cleanup_interval_hours
if should_run:
try:
logger.info("Starting scheduled cleanup")
from meshcore_hub.collector.cleanup import (
cleanup_old_data,
cleanup_inactive_nodes,
)
# Get async session and run cleanup
async def run_cleanup() -> None:
async with self.db.async_session() as session:
# Run event data cleanup if enabled
if self._cleanup_enabled:
stats = await cleanup_old_data(
session,
self._cleanup_retention_days,
dry_run=False,
)
logger.info(
"Event cleanup completed: %s", stats
)
# Run node cleanup if enabled
if self._node_cleanup_enabled:
nodes_deleted = await cleanup_inactive_nodes(
session,
self._node_cleanup_days,
dry_run=False,
)
logger.info(
"Node cleanup completed: %d nodes deleted",
nodes_deleted,
)
loop.run_until_complete(run_cleanup())
self._last_cleanup = now
except Exception as e:
logger.error(f"Cleanup error: {e}", exc_info=True)
# Sleep for 1 hour before next check
for _ in range(3600):
if not self._running:
break
time.sleep(1)
finally:
loop.close()
logger.info("Cleanup scheduler stopped")
self._cleanup_thread = threading.Thread(
target=run_cleanup_loop, daemon=True, name="cleanup-scheduler"
)
self._cleanup_thread.start()
def _stop_cleanup_scheduler(self) -> None:
"""Stop the cleanup scheduler thread."""
if self._cleanup_thread and self._cleanup_thread.is_alive():
# Thread will exit when self._running becomes False
self._cleanup_thread.join(timeout=5.0)
if self._cleanup_thread.is_alive():
logger.warning("Cleanup scheduler thread did not stop cleanly")
def start(self) -> None:
"""Start the subscriber."""
logger.info("Starting collector subscriber")
# Create database tables if needed
# Verify database connection (schema managed by Alembic migrations)
try:
self.db.create_tables()
# Test connection by getting a session
session = self.db.get_session()
session.close()
self._db_connected = True
logger.info("Database initialized")
logger.info("Database connection verified")
except Exception as e:
self._db_connected = False
logger.error(f"Failed to initialize database: {e}")
logger.error(f"Failed to connect to database: {e}")
raise
# Connect to MQTT broker
@@ -237,6 +368,9 @@ class Subscriber:
# Start webhook processor if configured
self._start_webhook_processor()
# Start cleanup scheduler if configured
self._start_cleanup_scheduler()
# Start health reporter for Docker health checks
self._health_reporter = HealthReporter(
component="collector",
@@ -269,6 +403,9 @@ class Subscriber:
self._running = False
self._shutdown_event.set()
# Stop cleanup scheduler
self._stop_cleanup_scheduler()
# Stop webhook processor
self._stop_webhook_processor()
@@ -291,8 +428,14 @@ def create_subscriber(
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
database_url: str = "sqlite:///./meshcore.db",
webhook_dispatcher: Optional["WebhookDispatcher"] = None,
cleanup_enabled: bool = False,
cleanup_retention_days: int = 30,
cleanup_interval_hours: int = 24,
node_cleanup_enabled: bool = False,
node_cleanup_days: int = 90,
) -> Subscriber:
"""Create a configured subscriber instance.
@@ -302,8 +445,14 @@ def create_subscriber(
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
database_url: Database connection URL
webhook_dispatcher: Optional webhook dispatcher for event forwarding
cleanup_enabled: Enable automatic event data cleanup
cleanup_retention_days: Number of days to retain event data
cleanup_interval_hours: Hours between cleanup runs
node_cleanup_enabled: Enable automatic cleanup of inactive nodes
node_cleanup_days: Remove nodes not seen for this many days
Returns:
Configured Subscriber instance
@@ -317,6 +466,7 @@ def create_subscriber(
password=mqtt_password,
prefix=mqtt_prefix,
client_id=f"meshcore-collector-{unique_id}",
tls=mqtt_tls,
)
mqtt_client = MQTTClient(mqtt_config)
@@ -324,7 +474,16 @@ def create_subscriber(
db_manager = DatabaseManager(database_url)
# Create subscriber
subscriber = Subscriber(mqtt_client, db_manager, webhook_dispatcher)
subscriber = Subscriber(
mqtt_client,
db_manager,
webhook_dispatcher,
cleanup_enabled=cleanup_enabled,
cleanup_retention_days=cleanup_retention_days,
cleanup_interval_hours=cleanup_interval_hours,
node_cleanup_enabled=node_cleanup_enabled,
node_cleanup_days=node_cleanup_days,
)
# Register handlers
from meshcore_hub.collector.handlers import register_all_handlers
@@ -340,8 +499,14 @@ def run_collector(
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
database_url: str = "sqlite:///./meshcore.db",
webhook_dispatcher: Optional["WebhookDispatcher"] = None,
cleanup_enabled: bool = False,
cleanup_retention_days: int = 30,
cleanup_interval_hours: int = 24,
node_cleanup_enabled: bool = False,
node_cleanup_days: int = 90,
) -> None:
"""Run the collector (blocking).
@@ -351,8 +516,14 @@ def run_collector(
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
database_url: Database connection URL
webhook_dispatcher: Optional webhook dispatcher for event forwarding
cleanup_enabled: Enable automatic event data cleanup
cleanup_retention_days: Number of days to retain event data
cleanup_interval_hours: Hours between cleanup runs
node_cleanup_enabled: Enable automatic cleanup of inactive nodes
node_cleanup_days: Remove nodes not seen for this many days
"""
subscriber = create_subscriber(
mqtt_host=mqtt_host,
@@ -360,8 +531,14 @@ def run_collector(
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=mqtt_prefix,
mqtt_tls=mqtt_tls,
database_url=database_url,
webhook_dispatcher=webhook_dispatcher,
cleanup_enabled=cleanup_enabled,
cleanup_retention_days=cleanup_retention_days,
cleanup_interval_hours=cleanup_interval_hours,
node_cleanup_enabled=node_cleanup_enabled,
node_cleanup_days=node_cleanup_days,
)
# Set up signal handlers


@@ -1,13 +1,13 @@
"""Import node tags from JSON file."""
"""Import node tags from YAML file."""
import json
import logging
from datetime import datetime, timezone
from pathlib import Path
from typing import Any
import yaml
from pydantic import BaseModel, Field, model_validator
from sqlalchemy import select
from sqlalchemy import delete, func, select
from meshcore_hub.common.database import DatabaseManager
from meshcore_hub.common.models import Node, NodeTag
@@ -19,7 +19,7 @@ class TagValue(BaseModel):
"""Schema for a tag value with type."""
value: str | None = None
type: str = Field(default="string", pattern=r"^(string|number|boolean|coordinate)$")
type: str = Field(default="string", pattern=r"^(string|number|boolean)$")
class NodeTags(BaseModel):
@@ -64,33 +64,33 @@ def validate_public_key(public_key: str) -> str:
def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
"""Load and validate tags from a JSON file.
"""Load and validate tags from a YAML file.
New format - dictionary keyed by public_key:
{
"0123456789abcdef...": {
"friendly_name": "My Node",
"location": {"value": "52.0,1.0", "type": "coordinate"},
"altitude": {"value": "150", "type": "number"}
}
}
YAML format - dictionary keyed by public_key:
0123456789abcdef...:
friendly_name: My Node
location:
value: "52.0,1.0"
type: coordinate
altitude:
value: "150"
type: number
Shorthand is allowed - string values are auto-converted:
{
"0123456789abcdef...": {
"friendly_name": "My Node"
}
}
0123456789abcdef...:
friendly_name: My Node
Args:
file_path: Path to the tags JSON file
file_path: Path to the tags YAML file
Returns:
Dictionary mapping public_key to tag dictionary
Raises:
FileNotFoundError: If file does not exist
json.JSONDecodeError: If file is not valid JSON
yaml.YAMLError: If file is not valid YAML
ValueError: If file content is invalid
"""
path = Path(file_path)
@@ -98,10 +98,10 @@ def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
raise FileNotFoundError(f"Tags file not found: {file_path}")
with open(path, "r") as f:
data = json.load(f)
data = yaml.safe_load(f)
if not isinstance(data, dict):
raise ValueError("Tags file must contain a JSON object")
raise ValueError("Tags file must contain a YAML mapping")
# Validate each entry
validated: dict[str, dict[str, Any]] = {}
@@ -117,12 +117,24 @@ def load_tags_file(file_path: str | Path) -> dict[str, dict[str, Any]]:
for tag_key, tag_value in tags.items():
if isinstance(tag_value, dict):
# Full format with value and type
raw_value = tag_value.get("value")
# Convert value to string if it's not None
str_value = str(raw_value) if raw_value is not None else None
validated_tags[tag_key] = {
"value": tag_value.get("value"),
"value": str_value,
"type": tag_value.get("type", "string"),
}
elif isinstance(tag_value, bool):
# YAML boolean - must check before int since bool is subclass of int
validated_tags[tag_key] = {
"value": str(tag_value).lower(),
"type": "boolean",
}
elif isinstance(tag_value, (int, float)):
# YAML number (int or float)
validated_tags[tag_key] = {"value": str(tag_value), "type": "number"}
elif isinstance(tag_value, str):
# Shorthand: just a string value
# String value
validated_tags[tag_key] = {"value": tag_value, "type": "string"}
elif tag_value is None:
validated_tags[tag_key] = {"value": None, "type": "string"}
@@ -139,16 +151,19 @@ def import_tags(
file_path: str | Path,
db: DatabaseManager,
create_nodes: bool = True,
clear_existing: bool = False,
) -> dict[str, Any]:
"""Import tags from a JSON file into the database.
"""Import tags from a YAML file into the database.
Performs upsert operations - existing tags are updated, new tags are created.
Optionally clears all existing tags before import.
Args:
file_path: Path to the tags JSON file
file_path: Path to the tags YAML file
db: Database manager instance
create_nodes: If True, create nodes that don't exist. If False, skip tags
for non-existent nodes.
clear_existing: If True, delete all existing tags before importing.
Returns:
Dictionary with import statistics:
@@ -157,6 +172,7 @@ def import_tags(
- updated: Number of existing tags updated
- skipped: Number of tags skipped (node not found and create_nodes=False)
- nodes_created: Number of new nodes created
- deleted: Number of existing tags deleted (if clear_existing=True)
- errors: List of error messages
"""
stats: dict[str, Any] = {
@@ -165,6 +181,7 @@ def import_tags(
"updated": 0,
"skipped": 0,
"nodes_created": 0,
"deleted": 0,
"errors": [],
}
@@ -182,6 +199,15 @@ def import_tags(
now = datetime.now(timezone.utc)
with db.session_scope() as session:
# Clear all existing tags if requested
if clear_existing:
delete_count = (
session.execute(select(func.count()).select_from(NodeTag)).scalar() or 0
)
session.execute(delete(NodeTag))
stats["deleted"] = delete_count
logger.info(f"Deleted {delete_count} existing tags")
# Cache nodes by public_key to reduce queries
node_cache: dict[str, Node] = {}
@@ -198,7 +224,8 @@ def import_tags(
node = Node(
public_key=public_key,
first_seen=now,
last_seen=now,
# last_seen is intentionally left unset (None)
# It will be set when the node is actually seen via events
)
session.add(node)
session.flush()
@@ -219,24 +246,8 @@ def import_tags(
tag_value = tag_data.get("value")
tag_type = tag_data.get("type", "string")
# Find or create tag
tag_query = select(NodeTag).where(
NodeTag.node_id == node.id,
NodeTag.key == tag_key,
)
existing_tag = session.execute(tag_query).scalar_one_or_none()
if existing_tag:
# Update existing tag
existing_tag.value = tag_value
existing_tag.value_type = tag_type
stats["updated"] += 1
logger.debug(
f"Updated tag {tag_key}={tag_value} "
f"for {public_key[:12]}..."
)
else:
# Create new tag
if clear_existing:
# When clearing, always create new tags
new_tag = NodeTag(
node_id=node.id,
key=tag_key,
@@ -249,6 +260,39 @@ def import_tags(
f"Created tag {tag_key}={tag_value} "
f"for {public_key[:12]}..."
)
else:
# Find or create tag
tag_query = select(NodeTag).where(
NodeTag.node_id == node.id,
NodeTag.key == tag_key,
)
existing_tag = session.execute(
tag_query
).scalar_one_or_none()
if existing_tag:
# Update existing tag
existing_tag.value = tag_value
existing_tag.value_type = tag_type
stats["updated"] += 1
logger.debug(
f"Updated tag {tag_key}={tag_value} "
f"for {public_key[:12]}..."
)
else:
# Create new tag
new_tag = NodeTag(
node_id=node.id,
key=tag_key,
value=tag_value,
value_type=tag_type,
)
session.add(new_tag)
stats["created"] += 1
logger.debug(
f"Created tag {tag_key}={tag_value} "
f"for {public_key[:12]}..."
)
except Exception as e:
error_msg = f"Error processing tag {tag_key} for {public_key[:12]}...: {e}"
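The bool branch in the hunk above must come before the int/float branch because `bool` is a subclass of `int` in Python. A standalone, stdlib-only sketch of that normalization (the `normalize_tag` name is illustrative; values are shown as if already parsed from YAML):

```python
from typing import Any


def normalize_tag(tag_value: Any) -> dict[str, Any]:
    """Convert a shorthand YAML tag value into the full {value, type} form."""
    if isinstance(tag_value, dict):
        # Full format with value and type; coerce value to string
        raw = tag_value.get("value")
        return {
            "value": str(raw) if raw is not None else None,
            "type": tag_value.get("type", "string"),
        }
    if isinstance(tag_value, bool):
        # Must check before int: bool is a subclass of int
        return {"value": str(tag_value).lower(), "type": "boolean"}
    if isinstance(tag_value, (int, float)):
        return {"value": str(tag_value), "type": "number"}
    if isinstance(tag_value, str):
        return {"value": tag_value, "type": "string"}
    return {"value": None, "type": "string"}


print(normalize_tag(True))       # {'value': 'true', 'type': 'boolean'}
print(normalize_tag(150))        # {'value': '150', 'type': 'number'}
print(normalize_tag("My Node"))  # {'value': 'My Node', 'type': 'string'}
```

Swapping the bool and number checks would silently store `True` as `"True"` with type `number`, which is why the ordering is called out in the diff's comment.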


@@ -404,7 +404,7 @@ _dispatch_callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None
def set_dispatch_callback(
callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None]]
callback: Optional[Callable[[str, dict[str, Any], Optional[str]], None]],
) -> None:
"""Set a callback for synchronous webhook dispatch.


@@ -52,6 +52,9 @@ class CommonSettings(BaseSettings):
default=None, description="MQTT password (optional)"
)
mqtt_prefix: str = Field(default="meshcore", description="MQTT topic prefix")
mqtt_tls: bool = Field(
default=False, description="Enable TLS/SSL for MQTT connection"
)
class InterfaceSettings(CommonSettings):
@@ -70,6 +73,22 @@ class InterfaceSettings(CommonSettings):
# Mock device
mock_device: bool = Field(default=False, description="Use mock device for testing")
# Device name
meshcore_device_name: Optional[str] = Field(
default=None, description="Device/node name (optional)"
)
# Contact cleanup settings
contact_cleanup_enabled: bool = Field(
default=True,
description="Enable automatic removal of stale contacts from companion node",
)
contact_cleanup_days: int = Field(
default=7,
description="Remove contacts not advertised for this many days",
ge=1,
)
class CollectorSettings(CommonSettings):
"""Settings for the Collector component."""
@@ -80,7 +99,7 @@ class CollectorSettings(CommonSettings):
description="SQLAlchemy database URL (default: sqlite:///{data_home}/collector/meshcore.db)",
)
# Seed home directory - contains initial data files (node_tags.json, members.json)
# Seed home directory - contains initial data files (node_tags.yaml, members.yaml)
seed_home: str = Field(
default="./seed",
description="Directory containing seed data files (default: ./seed)",
@@ -121,6 +140,29 @@ class CollectorSettings(CommonSettings):
default=2.0, description="Retry backoff multiplier"
)
# Data retention / cleanup settings
data_retention_enabled: bool = Field(
default=True, description="Enable automatic event data cleanup"
)
data_retention_days: int = Field(
default=30, description="Number of days to retain event data", ge=1
)
data_retention_interval_hours: int = Field(
default=24,
description="Hours between automatic cleanup runs (applies to both events and nodes)",
ge=1,
)
# Node cleanup settings
node_cleanup_enabled: bool = Field(
default=True, description="Enable automatic cleanup of inactive nodes"
)
node_cleanup_days: int = Field(
default=7,
description="Remove nodes not seen for this many days (last_seen)",
ge=1,
)
@property
def collector_data_dir(self) -> str:
"""Get the collector data directory path."""
@@ -147,17 +189,17 @@ class CollectorSettings(CommonSettings):
@property
def node_tags_file(self) -> str:
"""Get the path to node_tags.json in seed_home."""
"""Get the path to node_tags.yaml in seed_home."""
from pathlib import Path
return str(Path(self.effective_seed_home) / "node_tags.json")
return str(Path(self.effective_seed_home) / "node_tags.yaml")
@property
def members_file(self) -> str:
"""Get the path to members.json in seed_home."""
"""Get the path to members.yaml in seed_home."""
from pathlib import Path
return str(Path(self.effective_seed_home) / "members.json")
return str(Path(self.effective_seed_home) / "members.yaml")
@field_validator("database_url")
@classmethod
@@ -211,6 +253,12 @@ class WebSettings(CommonSettings):
web_host: str = Field(default="0.0.0.0", description="Web server host")
web_port: int = Field(default=8080, description="Web server port")
# Admin interface (disabled by default for security)
web_admin_enabled: bool = Field(
default=False,
description="Enable admin interface at /a/ (requires OAuth2Proxy in front)",
)
# API connection
api_base_url: str = Field(
default="http://localhost:8000",
@@ -231,9 +279,6 @@ class WebSettings(CommonSettings):
network_country: Optional[str] = Field(
default=None, description="Network country (ISO 3166-1 alpha-2)"
)
network_location: Optional[str] = Field(
default=None, description="Network location (lat,lon)"
)
network_radio_config: Optional[str] = Field(
default=None, description="Radio configuration details"
)
@@ -243,6 +288,12 @@ class WebSettings(CommonSettings):
network_contact_discord: Optional[str] = Field(
default=None, description="Discord server link"
)
network_contact_github: Optional[str] = Field(
default=None, description="GitHub repository URL"
)
network_welcome_text: Optional[str] = Field(
default=None, description="Welcome text for homepage"
)
@property
def web_data_dir(self) -> str:


@@ -1,10 +1,11 @@
"""Database connection and session management."""
from contextlib import contextmanager
from typing import Generator
from contextlib import asynccontextmanager, contextmanager
from typing import AsyncGenerator, Generator
from sqlalchemy import create_engine, event
from sqlalchemy.engine import Engine
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import Session, sessionmaker
from meshcore_hub.common.models.base import Base
@@ -97,9 +98,29 @@ class DatabaseManager:
echo: Enable SQL query logging
"""
self.database_url = database_url
# Ensure parent directory exists for SQLite databases
if database_url.startswith("sqlite:///"):
from pathlib import Path
# Extract path from sqlite:///path/to/db.sqlite
db_path = Path(database_url.replace("sqlite:///", ""))
db_path.parent.mkdir(parents=True, exist_ok=True)
self.engine = create_database_engine(database_url, echo=echo)
self.session_factory = create_session_factory(self.engine)
# Create async engine for async operations
async_url = database_url.replace("sqlite://", "sqlite+aiosqlite://")
self.async_engine = create_async_engine(async_url, echo=echo)
from sqlalchemy.ext.asyncio import async_sessionmaker
self.async_session_factory = async_sessionmaker(
self.async_engine,
class_=AsyncSession,
expire_on_commit=False,
)
def create_tables(self) -> None:
"""Create all database tables."""
create_tables(self.engine)
@@ -138,6 +159,21 @@ class DatabaseManager:
finally:
session.close()
@asynccontextmanager
async def async_session(self) -> AsyncGenerator[AsyncSession, None]:
"""Provide an async session context manager.
Yields:
AsyncSession instance
Example:
async with db.async_session() as session:
result = await session.execute(select(Node))
await session.commit()
"""
async with self.async_session_factory() as session:
yield session
def dispose(self) -> None:
"""Dispose of the database engine and connection pool."""
self.engine.dispose()
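The SQLite handling added to `DatabaseManager.__init__` boils down to two operations: create the parent directory for `sqlite:///` URLs, and substitute the aiosqlite driver for the async engine. A stdlib-only sketch (the `prepare_sqlite_url` helper is illustrative):

```python
import tempfile
from pathlib import Path


def prepare_sqlite_url(database_url: str) -> tuple[str, str]:
    """Ensure the SQLite parent directory exists; derive the aiosqlite URL.

    Non-SQLite URLs pass through with only the async-driver substitution.
    """
    if database_url.startswith("sqlite:///"):
        # Extract path from sqlite:///path/to/db.sqlite and create parents
        db_path = Path(database_url.replace("sqlite:///", ""))
        db_path.parent.mkdir(parents=True, exist_ok=True)
    async_url = database_url.replace("sqlite://", "sqlite+aiosqlite://")
    return database_url, async_url


base = tempfile.mkdtemp()
sync_url, async_url = prepare_sqlite_url(f"sqlite:///{base}/collector/meshcore.db")
print(Path(base, "collector").is_dir())              # True: parent was created
print(async_url.startswith("sqlite+aiosqlite:///"))  # True
```

Creating the directory before `create_engine` matters because SQLite will create a missing database file, but not missing intermediate directories.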


@@ -0,0 +1,142 @@
"""Event hash utilities for deduplication.
This module provides functions to compute deterministic hashes for events,
allowing deduplication when multiple receiver nodes report the same event.
"""
import hashlib
from datetime import datetime
from typing import Optional
def compute_message_hash(
text: str,
pubkey_prefix: Optional[str] = None,
channel_idx: Optional[int] = None,
sender_timestamp: Optional[datetime] = None,
txt_type: Optional[int] = None,
) -> str:
"""Compute a deterministic hash for a message.
The hash is computed from fields that uniquely identify a message's content
and sender, excluding receiver-specific data.
Args:
text: Message content
pubkey_prefix: Sender's public key prefix (12 chars)
channel_idx: Channel index for channel messages
sender_timestamp: Sender's timestamp
txt_type: Message type indicator
Returns:
32-character hex hash string
"""
# Build a canonical string from the relevant fields
parts = [
text or "",
pubkey_prefix or "",
str(channel_idx) if channel_idx is not None else "",
sender_timestamp.isoformat() if sender_timestamp else "",
str(txt_type) if txt_type is not None else "",
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
def compute_advertisement_hash(
public_key: str,
name: Optional[str] = None,
adv_type: Optional[str] = None,
flags: Optional[int] = None,
received_at: Optional[datetime] = None,
bucket_seconds: int = 30,
) -> str:
"""Compute a deterministic hash for an advertisement.
Advertisements are bucketed by time since the same node may advertise
periodically and we want to deduplicate within a time window.
Args:
public_key: Advertised node's public key
name: Advertised name
adv_type: Node type
flags: Capability flags
received_at: When received (used for time bucketing)
bucket_seconds: Time bucket size in seconds (default 30)
Returns:
32-character hex hash string
"""
# Bucket the time to allow deduplication within a window
time_bucket = ""
if received_at:
# Round down to nearest bucket
epoch = int(received_at.timestamp())
bucket_epoch = (epoch // bucket_seconds) * bucket_seconds
time_bucket = str(bucket_epoch)
parts = [
public_key,
name or "",
adv_type or "",
str(flags) if flags is not None else "",
time_bucket,
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
def compute_trace_hash(initiator_tag: int) -> str:
"""Compute a deterministic hash for a trace path.
Trace paths have a unique initiator_tag that serves as the identifier.
Args:
initiator_tag: Unique trace identifier
Returns:
32-character hex hash string
"""
return hashlib.md5(str(initiator_tag).encode("utf-8")).hexdigest()
def compute_telemetry_hash(
node_public_key: str,
parsed_data: Optional[dict] = None,
received_at: Optional[datetime] = None,
bucket_seconds: int = 30,
) -> str:
"""Compute a deterministic hash for a telemetry record.
Telemetry is bucketed by time since nodes report periodically.
Args:
node_public_key: Reporting node's public key
parsed_data: Decoded sensor readings
received_at: When received (used for time bucketing)
bucket_seconds: Time bucket size in seconds (default 30)
Returns:
32-character hex hash string
"""
# Bucket the time
time_bucket = ""
if received_at:
epoch = int(received_at.timestamp())
bucket_epoch = (epoch // bucket_seconds) * bucket_seconds
time_bucket = str(bucket_epoch)
# Serialize parsed_data deterministically
data_str = ""
if parsed_data:
# Sort keys for deterministic serialization
sorted_items = sorted(parsed_data.items())
data_str = str(sorted_items)
parts = [
node_public_key,
data_str,
time_bucket,
]
canonical = "|".join(parts)
return hashlib.md5(canonical.encode("utf-8")).hexdigest()
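The 30-second bucketing can be seen by hashing timestamps on either side of a bucket boundary; this sketch reuses the `compute_advertisement_hash` logic from the module above:

```python
import hashlib
from datetime import datetime, timezone


def compute_advertisement_hash(public_key, name=None, adv_type=None,
                               flags=None, received_at=None, bucket_seconds=30):
    """Bucket received_at, then hash the canonical field string (as above)."""
    time_bucket = ""
    if received_at:
        epoch = int(received_at.timestamp())
        time_bucket = str((epoch // bucket_seconds) * bucket_seconds)
    parts = [public_key, name or "", adv_type or "",
             str(flags) if flags is not None else "", time_bucket]
    return hashlib.md5("|".join(parts).encode("utf-8")).hexdigest()


pk = "0123456789abcdef"
t1 = datetime(2026, 1, 15, 12, 0, 1, tzinfo=timezone.utc)
t2 = datetime(2026, 1, 15, 12, 0, 29, tzinfo=timezone.utc)  # same 30s bucket as t1
t3 = datetime(2026, 1, 15, 12, 0, 31, tzinfo=timezone.utc)  # next bucket

h1 = compute_advertisement_hash(pk, "My Node", received_at=t1)
h2 = compute_advertisement_hash(pk, "My Node", received_at=t2)
h3 = compute_advertisement_hash(pk, "My Node", received_at=t3)
print(h1 == h2)  # True: duplicates within the window collapse to one hash
print(h1 == h3)  # False: a new bucket yields a new hash
```

Two receivers reporting the same advertisement a few seconds apart therefore map to the same `event_hash`, which is what the unique constraint on the event tables depends on. (MD5 is used here only as a fast fingerprint for deduplication, not for security.)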


@@ -9,6 +9,7 @@ from meshcore_hub.common.models.trace_path import TracePath
from meshcore_hub.common.models.telemetry import Telemetry
from meshcore_hub.common.models.event_log import EventLog
from meshcore_hub.common.models.member import Member
from meshcore_hub.common.models.event_receiver import EventReceiver, add_event_receiver
__all__ = [
"Base",
@@ -21,4 +22,6 @@ __all__ = [
"Telemetry",
"EventLog",
"Member",
"EventReceiver",
"add_event_receiver",
]


@@ -58,6 +58,11 @@ class Advertisement(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (Index("ix_advertisements_received_at", "received_at"),)


@@ -0,0 +1,127 @@
"""EventReceiver model for tracking which nodes received each event."""
from datetime import datetime
from typing import TYPE_CHECKING, Optional
from uuid import uuid4
from sqlalchemy import DateTime, Float, ForeignKey, Index, String, UniqueConstraint
from sqlalchemy.dialects.sqlite import insert as sqlite_insert
from sqlalchemy.orm import Mapped, Session, mapped_column, relationship
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin, utc_now
if TYPE_CHECKING:
from meshcore_hub.common.models.node import Node
class EventReceiver(Base, UUIDMixin, TimestampMixin):
"""Junction model tracking which receivers observed each event.
This table enables multi-receiver tracking for deduplicated events.
When multiple receiver nodes observe the same mesh event, each receiver
gets an entry in this table linked by the event_hash.
Attributes:
id: UUID primary key
event_type: Type of event ('message', 'advertisement', 'trace', 'telemetry')
event_hash: Hash identifying the unique event (links to event tables)
receiver_node_id: FK to the node that received this event
snr: Signal-to-noise ratio at this receiver (if available)
received_at: When this specific receiver saw the event
created_at: Record creation timestamp
updated_at: Record update timestamp
"""
__tablename__ = "event_receivers"
event_type: Mapped[str] = mapped_column(
String(20),
nullable=False,
)
event_hash: Mapped[str] = mapped_column(
String(32),
nullable=False,
index=True,
)
receiver_node_id: Mapped[str] = mapped_column(
String(36),
ForeignKey("nodes.id", ondelete="CASCADE"),
nullable=False,
index=True,
)
snr: Mapped[Optional[float]] = mapped_column(
Float,
nullable=True,
)
received_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True),
default=utc_now,
nullable=False,
)
# Relationship to receiver node
receiver_node: Mapped["Node"] = relationship(
"Node",
foreign_keys=[receiver_node_id],
)
__table_args__ = (
UniqueConstraint(
"event_hash", "receiver_node_id", name="uq_event_receivers_hash_node"
),
Index("ix_event_receivers_type_hash", "event_type", "event_hash"),
)
def __repr__(self) -> str:
return (
f"<EventReceiver(type={self.event_type}, "
f"hash={self.event_hash[:8]}..., "
f"node={self.receiver_node_id[:8]}...)>"
)
def add_event_receiver(
session: Session,
event_type: str,
event_hash: str,
receiver_node_id: str,
snr: Optional[float] = None,
received_at: Optional[datetime] = None,
) -> bool:
"""Add a receiver to an event, handling duplicates gracefully.
Uses INSERT OR IGNORE to handle the unique constraint on (event_hash, receiver_node_id).
Args:
session: SQLAlchemy session
event_type: Type of event ('message', 'advertisement', 'trace', 'telemetry')
event_hash: Hash identifying the unique event
receiver_node_id: UUID of the receiver node
snr: Signal-to-noise ratio at this receiver (optional)
received_at: When this receiver saw the event (defaults to now)
Returns:
True if a new receiver entry was added, False if it already existed.
"""
from datetime import timezone
now = received_at or datetime.now(timezone.utc)
stmt = (
sqlite_insert(EventReceiver)
.values(
id=str(uuid4()),
event_type=event_type,
event_hash=event_hash,
receiver_node_id=receiver_node_id,
snr=snr,
received_at=now,
created_at=now,
updated_at=now,
)
.on_conflict_do_nothing(index_elements=["event_hash", "receiver_node_id"])
)
result = session.execute(stmt)
# CursorResult has rowcount attribute
rowcount = getattr(result, "rowcount", 0)
return bool(rowcount and rowcount > 0)
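The same duplicate-tolerant insert can be demonstrated with the stdlib `sqlite3` module; the schema here is trimmed for brevity and is illustrative, not the real table. SQLAlchemy's `sqlite_insert(...).on_conflict_do_nothing(...)` compiles to the `ON CONFLICT ... DO NOTHING` clause shown below (SQLite >= 3.24):

```python
import sqlite3
from uuid import uuid4

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE event_receivers (
           id TEXT PRIMARY KEY,
           event_type TEXT NOT NULL,
           event_hash TEXT NOT NULL,
           receiver_node_id TEXT NOT NULL,
           UNIQUE (event_hash, receiver_node_id)
       )"""
)


def add_event_receiver(conn, event_type, event_hash, receiver_node_id):
    """Return True if a new row was inserted, False if it already existed."""
    cur = conn.execute(
        "INSERT INTO event_receivers (id, event_type, event_hash, receiver_node_id)"
        " VALUES (?, ?, ?, ?)"
        " ON CONFLICT (event_hash, receiver_node_id) DO NOTHING",
        (str(uuid4()), event_type, event_hash, receiver_node_id),
    )
    # rowcount is 0 when the conflict clause suppressed the insert
    return cur.rowcount > 0


print(add_event_receiver(conn, "message", "abc123", "node-1"))  # True
print(add_event_receiver(conn, "message", "abc123", "node-1"))  # False
print(add_event_receiver(conn, "message", "abc123", "node-2"))  # True
```

Checking `rowcount` rather than catching `IntegrityError` keeps the happy path and the duplicate path on the same code branch, which is why the ORM version returns a bool the same way.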


@@ -12,21 +12,28 @@ class Member(Base, UUIDMixin, TimestampMixin):
"""Member model for network member information.
Stores information about network members/operators.
Nodes are associated with members via a 'member_id' tag on the node.
Attributes:
id: UUID primary key
member_id: Unique member identifier (e.g., 'walshie86')
name: Member's display name
callsign: Amateur radio callsign (optional)
role: Member's role in the network (optional)
description: Additional description (optional)
contact: Contact information (optional)
public_key: Associated node public key (optional, 64-char hex)
created_at: Record creation timestamp
updated_at: Record update timestamp
"""
__tablename__ = "members"
member_id: Mapped[str] = mapped_column(
String(100),
nullable=False,
unique=True,
index=True,
)
name: Mapped[str] = mapped_column(
String(255),
nullable=False,
@@ -47,11 +54,6 @@ class Member(Base, UUIDMixin, TimestampMixin):
String(255),
nullable=True,
)
public_key: Mapped[Optional[str]] = mapped_column(
String(64),
nullable=True,
index=True,
)
def __repr__(self) -> str:
return f"<Member(id={self.id}, name={self.name}, callsign={self.callsign})>"
return f"<Member(id={self.id}, member_id={self.member_id}, name={self.name}, callsign={self.callsign})>"


@@ -76,6 +76,11 @@ class Message(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (
Index("ix_messages_message_type", "message_type"),


@@ -3,7 +3,7 @@
from datetime import datetime
from typing import TYPE_CHECKING, Optional
from sqlalchemy import DateTime, Index, Integer, String
from sqlalchemy import DateTime, Float, Index, Integer, String
from sqlalchemy.orm import Mapped, mapped_column, relationship
from meshcore_hub.common.models.base import Base, TimestampMixin, UUIDMixin, utc_now
@@ -23,6 +23,8 @@ class Node(Base, UUIDMixin, TimestampMixin):
flags: Capability/status flags bitmask
first_seen: Timestamp of first advertisement
last_seen: Timestamp of most recent activity
lat: GPS latitude coordinate (if available)
lon: GPS longitude coordinate (if available)
created_at: Record creation timestamp
updated_at: Record update timestamp
"""
@@ -52,10 +54,18 @@ class Node(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
last_seen: Mapped[datetime] = mapped_column(
last_seen: Mapped[Optional[datetime]] = mapped_column(
DateTime(timezone=True),
default=utc_now,
nullable=False,
default=None,
nullable=True,
)
lat: Mapped[Optional[float]] = mapped_column(
Float,
nullable=True,
)
lon: Mapped[Optional[float]] = mapped_column(
Float,
nullable=True,
)
# Relationships


@@ -21,7 +21,7 @@ class NodeTag(Base, UUIDMixin, TimestampMixin):
node_id: Foreign key to nodes table
key: Tag name/key
value: Tag value (stored as text, can be JSON for typed values)
value_type: Type hint (string, number, boolean, coordinate)
value_type: Type hint (string, number, boolean)
created_at: Record creation timestamp
updated_at: Record update timestamp
"""


@@ -54,6 +54,11 @@ class Telemetry(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (Index("ix_telemetry_received_at", "received_at"),)


@@ -3,7 +3,7 @@
from datetime import datetime
from typing import Optional
from sqlalchemy import BigInteger, DateTime, ForeignKey, Index, Integer
from sqlalchemy import BigInteger, DateTime, ForeignKey, Index, Integer, String
from sqlalchemy.dialects.sqlite import JSON
from sqlalchemy.orm import Mapped, mapped_column
@@ -67,6 +67,11 @@ class TracePath(Base, UUIDMixin, TimestampMixin):
default=utc_now,
nullable=False,
)
event_hash: Mapped[Optional[str]] = mapped_column(
String(32),
nullable=True,
unique=True,
)
__table_args__ = (
Index("ix_trace_paths_initiator_tag", "initiator_tag"),


@@ -23,6 +23,7 @@ class MQTTConfig:
client_id: Optional[str] = None
keepalive: int = 60
clean_session: bool = True
tls: bool = False
class TopicBuilder:
@@ -131,6 +132,11 @@ class MQTTClient:
self._connected = False
self._message_handlers: dict[str, list[MessageHandler]] = {}
# Set up TLS if enabled
if config.tls:
self._client.tls_set()
logger.debug("TLS/SSL enabled for MQTT connection")
# Set up authentication if provided
if config.username:
self._client.username_pw_set(config.username, config.password)
@@ -344,6 +350,7 @@ def create_mqtt_client(
password: Optional[str] = None,
prefix: str = "meshcore",
client_id: Optional[str] = None,
tls: bool = False,
) -> MQTTClient:
"""Create and configure an MQTT client.
@@ -354,6 +361,7 @@ def create_mqtt_client(
password: MQTT password (optional)
prefix: Topic prefix
client_id: Client identifier (optional)
tls: Enable TLS/SSL connection (optional)
Returns:
Configured MQTTClient instance
@@ -365,5 +373,6 @@ def create_mqtt_client(
password=password,
prefix=prefix,
client_id=client_id,
tls=tls,
)
return MQTTClient(config)


@@ -20,6 +20,7 @@ from meshcore_hub.common.schemas.nodes import (
NodeTagRead,
)
from meshcore_hub.common.schemas.messages import (
ReceiverInfo,
MessageRead,
MessageList,
MessageFilters,
@@ -35,6 +36,9 @@ from meshcore_hub.common.schemas.members import (
MemberRead,
MemberList,
)
from meshcore_hub.common.schemas.network import (
RadioConfig,
)
__all__ = [
# Events
@@ -54,7 +58,8 @@ __all__ = [
"NodeTagCreate",
"NodeTagUpdate",
"NodeTagRead",
# Messages
# Messages & Events
"ReceiverInfo",
"MessageRead",
"MessageList",
"MessageFilters",
@@ -67,4 +72,6 @@ __all__ = [
"MemberUpdate",
"MemberRead",
"MemberList",
# Network
"RadioConfig",
]


@@ -157,7 +157,16 @@ class TelemetryResponseEvent(BaseModel):
class ContactInfo(BaseModel):
"""Schema for a single contact in CONTACTS event."""
"""Schema for a single contact in CONTACTS event.
Device payload fields:
- public_key: Node's 64-char hex public key
- adv_name: Node's advertised name (device field)
- type: Numeric node type (0=none, 1=chat, 2=repeater, 3=room)
- flags: Capability flags
- last_advert: Unix timestamp of last advertisement
- adv_lat, adv_lon: GPS coordinates (if available)
"""
public_key: str = Field(
...,
@@ -165,14 +174,40 @@ class ContactInfo(BaseModel):
max_length=64,
description="Node's full public key",
)
adv_name: Optional[str] = Field(
default=None,
max_length=255,
description="Node's advertised name (from device)",
)
type: Optional[int] = Field(
default=None,
description="Numeric node type: 0=none, 1=chat, 2=repeater, 3=room",
)
flags: Optional[int] = Field(
default=None,
description="Capability/status flags bitmask",
)
last_advert: Optional[int] = Field(
default=None,
description="Unix timestamp of last advertisement",
)
adv_lat: Optional[float] = Field(
default=None,
description="GPS latitude (if available)",
)
adv_lon: Optional[float] = Field(
default=None,
description="GPS longitude (if available)",
)
# Legacy field names for backwards compatibility
name: Optional[str] = Field(
default=None,
max_length=255,
description="Node name/alias",
description="Node name/alias (legacy, prefer adv_name)",
)
node_type: Optional[str] = Field(
default=None,
description="Node type: chat, repeater, room, none",
description="Node type string (legacy, prefer type)",
)


@@ -7,8 +7,18 @@ from pydantic import BaseModel, Field
class MemberCreate(BaseModel):
"""Schema for creating a member."""
"""Schema for creating a member.
Note: Nodes are associated with members via a 'member_id' tag on the node,
not through this schema.
"""
member_id: str = Field(
...,
min_length=1,
max_length=100,
description="Unique member identifier (e.g., 'walshie86')",
)
name: str = Field(
...,
min_length=1,
@@ -34,18 +44,21 @@ class MemberCreate(BaseModel):
max_length=255,
description="Contact information",
)
public_key: Optional[str] = Field(
default=None,
min_length=64,
max_length=64,
pattern=r"^[0-9a-fA-F]{64}$",
description="Associated node public key (64-char hex)",
)
class MemberUpdate(BaseModel):
"""Schema for updating a member."""
"""Schema for updating a member.
Note: Nodes are associated with members via a 'member_id' tag on the node,
not through this schema.
"""
member_id: Optional[str] = Field(
default=None,
min_length=1,
max_length=100,
description="Unique member identifier (e.g., 'walshie86')",
)
name: Optional[str] = Field(
default=None,
min_length=1,
@@ -71,27 +84,22 @@ class MemberUpdate(BaseModel):
max_length=255,
description="Contact information",
)
public_key: Optional[str] = Field(
default=None,
min_length=64,
max_length=64,
pattern=r"^[0-9a-fA-F]{64}$",
description="Associated node public key (64-char hex)",
)
class MemberRead(BaseModel):
"""Schema for reading a member."""
"""Schema for reading a member.
Note: Nodes are associated with members via a 'member_id' tag on the node.
To find nodes for a member, query nodes with a 'member_id' tag matching this member.
"""
id: str = Field(..., description="Member UUID")
member_id: str = Field(..., description="Unique member identifier")
name: str = Field(..., description="Member's display name")
callsign: Optional[str] = Field(default=None, description="Amateur radio callsign")
role: Optional[str] = Field(default=None, description="Member's role")
description: Optional[str] = Field(default=None, description="Description")
contact: Optional[str] = Field(default=None, description="Contact information")
public_key: Optional[str] = Field(
default=None, description="Associated node public key"
)
created_at: datetime = Field(..., description="Creation timestamp")
updated_at: datetime = Field(..., description="Last update timestamp")


@@ -6,19 +6,41 @@ from typing import Literal, Optional
from pydantic import BaseModel, Field
class ReceiverInfo(BaseModel):
"""Information about a receiver that observed an event."""
node_id: str = Field(..., description="Receiver node UUID")
public_key: str = Field(..., description="Receiver node public key")
name: Optional[str] = Field(default=None, description="Receiver node name")
tag_name: Optional[str] = Field(default=None, description="Receiver name from tags")
snr: Optional[float] = Field(
default=None, description="Signal-to-noise ratio at this receiver"
)
received_at: datetime = Field(..., description="When this receiver saw the event")
class Config:
from_attributes = True
class MessageRead(BaseModel):
"""Schema for reading a message."""
id: str = Field(..., description="Message UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
receiver_name: Optional[str] = Field(default=None, description="Receiver node name")
receiver_tag_name: Optional[str] = Field(
default=None, description="Receiver name from tags"
)
message_type: str = Field(..., description="Message type (contact, channel)")
pubkey_prefix: Optional[str] = Field(
default=None, description="Sender's public key prefix (12 chars)"
)
sender_friendly_name: Optional[str] = Field(
default=None, description="Sender's friendly name from node tags"
sender_name: Optional[str] = Field(
default=None, description="Sender's advertised node name"
)
sender_tag_name: Optional[str] = Field(
default=None, description="Sender's name from node tags"
)
channel_idx: Optional[int] = Field(default=None, description="Channel index")
text: str = Field(..., description="Message content")
@@ -31,6 +53,9 @@ class MessageRead(BaseModel):
)
received_at: datetime = Field(..., description="When received by interface")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list, description="All receivers that observed this message"
)
class Config:
from_attributes = True
@@ -79,17 +104,29 @@ class MessageFilters(BaseModel):
class AdvertisementRead(BaseModel):
"""Schema for reading an advertisement."""
id: str = Field(..., description="Advertisement UUID")
receiver_node_id: Optional[str] = Field(
default=None, description="Receiving interface node UUID"
received_by: Optional[str] = Field(
default=None, description="Receiving interface node public key"
)
receiver_name: Optional[str] = Field(default=None, description="Receiver node name")
receiver_tag_name: Optional[str] = Field(
default=None, description="Receiver name from tags"
)
node_id: Optional[str] = Field(default=None, description="Advertised node UUID")
public_key: str = Field(..., description="Advertised public key")
name: Optional[str] = Field(default=None, description="Advertised name")
node_name: Optional[str] = Field(
default=None, description="Node name from nodes table"
)
node_tag_name: Optional[str] = Field(
default=None, description="Node name from tags"
)
adv_type: Optional[str] = Field(default=None, description="Node type")
flags: Optional[int] = Field(default=None, description="Capability flags")
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this advertisement",
)
class Config:
from_attributes = True
@@ -107,9 +144,8 @@ class AdvertisementList(BaseModel):
class TracePathRead(BaseModel):
"""Schema for reading a trace path."""
id: str = Field(..., description="Trace path UUID")
- receiver_node_id: Optional[str] = Field(
-     default=None, description="Receiving interface node UUID"
+ received_by: Optional[str] = Field(
+     default=None, description="Receiving interface node public key"
)
initiator_tag: int = Field(..., description="Trace identifier")
path_len: Optional[int] = Field(default=None, description="Path length")
@@ -124,6 +160,10 @@ class TracePathRead(BaseModel):
hop_count: Optional[int] = Field(default=None, description="Total hops")
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this trace",
)
class Config:
from_attributes = True
@@ -141,17 +181,19 @@ class TracePathList(BaseModel):
class TelemetryRead(BaseModel):
"""Schema for reading a telemetry record."""
id: str = Field(..., description="Telemetry UUID")
- receiver_node_id: Optional[str] = Field(
-     default=None, description="Receiving interface node UUID"
+ received_by: Optional[str] = Field(
+     default=None, description="Receiving interface node public key"
)
node_id: Optional[str] = Field(default=None, description="Reporting node UUID")
node_public_key: str = Field(..., description="Reporting node public key")
parsed_data: Optional[dict] = Field(
default=None, description="Decoded sensor readings"
)
received_at: datetime = Field(..., description="When received")
created_at: datetime = Field(..., description="Record creation timestamp")
receivers: list[ReceiverInfo] = Field(
default_factory=list,
description="All receivers that observed this telemetry",
)
class Config:
from_attributes = True
@@ -171,11 +213,25 @@ class RecentAdvertisement(BaseModel):
public_key: str = Field(..., description="Node public key")
name: Optional[str] = Field(default=None, description="Node name")
- friendly_name: Optional[str] = Field(default=None, description="Friendly name tag")
+ tag_name: Optional[str] = Field(default=None, description="Name tag")
adv_type: Optional[str] = Field(default=None, description="Node type")
received_at: datetime = Field(..., description="When received")
class ChannelMessage(BaseModel):
"""Schema for a channel message summary."""
text: str = Field(..., description="Message text")
sender_name: Optional[str] = Field(default=None, description="Sender name")
sender_tag_name: Optional[str] = Field(
default=None, description="Sender name from tags"
)
pubkey_prefix: Optional[str] = Field(
default=None, description="Sender public key prefix"
)
received_at: datetime = Field(..., description="When received")
class DashboardStats(BaseModel):
"""Schema for dashboard statistics."""
@@ -183,10 +239,14 @@ class DashboardStats(BaseModel):
active_nodes: int = Field(..., description="Nodes active in last 24h")
total_messages: int = Field(..., description="Total number of messages")
messages_today: int = Field(..., description="Messages received today")
messages_7d: int = Field(default=0, description="Messages received in last 7 days")
total_advertisements: int = Field(..., description="Total advertisements")
advertisements_24h: int = Field(
default=0, description="Advertisements received in last 24h"
)
advertisements_7d: int = Field(
default=0, description="Advertisements received in last 7 days"
)
recent_advertisements: list[RecentAdvertisement] = Field(
default_factory=list, description="Last 10 advertisements"
)
@@ -194,3 +254,39 @@ class DashboardStats(BaseModel):
default_factory=dict,
description="Message count per channel",
)
channel_messages: dict[int, list[ChannelMessage]] = Field(
default_factory=dict,
description="Recent messages per channel (up to 5 each)",
)
class DailyActivityPoint(BaseModel):
"""Schema for a single day's activity count."""
date: str = Field(..., description="Date in YYYY-MM-DD format")
count: int = Field(..., description="Count for this day")
class DailyActivity(BaseModel):
"""Schema for daily advertisement activity over a period."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(
..., description="Daily advertisement counts"
)
class MessageActivity(BaseModel):
"""Schema for daily message activity over a period."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(..., description="Daily message counts")
class NodeCountHistory(BaseModel):
"""Schema for node count over time."""
days: int = Field(..., description="Number of days in the period")
data: list[DailyActivityPoint] = Field(
..., description="Cumulative node count per day"
)

View File

@@ -0,0 +1,65 @@
"""Pydantic schemas for network configuration."""
from typing import Optional
from pydantic import BaseModel
class RadioConfig(BaseModel):
"""Parsed radio configuration from comma-delimited string.
Format: "<profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>"
Example: "EU/UK Narrow,869.618MHz,62.5kHz,8,8,22dBm"
"""
profile: Optional[str] = None
frequency: Optional[str] = None
bandwidth: Optional[str] = None
spreading_factor: Optional[int] = None
coding_rate: Optional[int] = None
tx_power: Optional[str] = None
@classmethod
def from_config_string(cls, config_str: Optional[str]) -> Optional["RadioConfig"]:
"""Parse a comma-delimited radio config string.
Args:
config_str: Comma-delimited string in format:
"<profile>,<frequency>,<bandwidth>,<spreading_factor>,<coding_rate>,<tx_power>"
Returns:
RadioConfig instance if parsing succeeds, None if input is None or empty
"""
if not config_str:
return None
parts = [p.strip() for p in config_str.split(",")]
# Handle partial configs by filling with None
while len(parts) < 6:
parts.append("")
# Parse spreading factor and coding rate as integers
spreading_factor = None
coding_rate = None
try:
if parts[3]:
spreading_factor = int(parts[3])
except ValueError:
pass
try:
if parts[4]:
coding_rate = int(parts[4])
except ValueError:
pass
return cls(
profile=parts[0] or None,
frequency=parts[1] or None,
bandwidth=parts[2] or None,
spreading_factor=spreading_factor,
coding_rate=coding_rate,
tx_power=parts[5] or None,
)
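For illustration, the parsing rules above can be exercised with a plain-Python stand-in (no Pydantic; `parse_radio_config` is a hypothetical helper mirroring `from_config_string`, returning a dict instead of a model):

```python
def parse_radio_config(config_str):
    """Parse "<profile>,<frequency>,<bandwidth>,<sf>,<cr>,<tx_power>"."""
    if not config_str:
        return None
    parts = [p.strip() for p in config_str.split(",")]
    while len(parts) < 6:  # pad partial configs
        parts.append("")

    def to_int(s):
        # Non-numeric or empty fields become None rather than raising
        try:
            return int(s) if s else None
        except ValueError:
            return None

    return {
        "profile": parts[0] or None,
        "frequency": parts[1] or None,
        "bandwidth": parts[2] or None,
        "spreading_factor": to_int(parts[3]),
        "coding_rate": to_int(parts[4]),
        "tx_power": parts[5] or None,
    }

cfg = parse_radio_config("EU/UK Narrow,869.618MHz,62.5kHz,8,8,22dBm")
```

A partial string such as `"Custom,868MHz"` still parses: the missing trailing fields are padded and come back as `None`.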

View File

@@ -19,7 +19,7 @@ class NodeTagCreate(BaseModel):
default=None,
description="Tag value",
)
- value_type: Literal["string", "number", "boolean", "coordinate"] = Field(
+ value_type: Literal["string", "number", "boolean"] = Field(
default="string",
description="Value type hint",
)
@@ -32,17 +32,36 @@ class NodeTagUpdate(BaseModel):
default=None,
description="Tag value",
)
- value_type: Optional[Literal["string", "number", "boolean", "coordinate"]] = Field(
+ value_type: Optional[Literal["string", "number", "boolean"]] = Field(
default=None,
description="Value type hint",
)
class NodeTagMove(BaseModel):
"""Schema for moving a node tag to a different node."""
new_public_key: str = Field(
...,
min_length=64,
max_length=64,
description="Public key of the destination node",
)
class NodeTagsCopyResult(BaseModel):
"""Schema for bulk copy tags result."""
copied: int = Field(..., description="Number of tags copied")
skipped: int = Field(..., description="Number of tags skipped (already exist)")
skipped_keys: list[str] = Field(
default_factory=list, description="Keys of skipped tags"
)
class NodeTagRead(BaseModel):
"""Schema for reading a node tag."""
id: str = Field(..., description="Tag UUID")
node_id: str = Field(..., description="Parent node UUID")
key: str = Field(..., description="Tag name/key")
value: Optional[str] = Field(default=None, description="Tag value")
value_type: str = Field(..., description="Value type hint")
@@ -56,13 +75,16 @@ class NodeTagRead(BaseModel):
class NodeRead(BaseModel):
"""Schema for reading a node."""
id: str = Field(..., description="Node UUID")
public_key: str = Field(..., description="Node's 64-character hex public key")
name: Optional[str] = Field(default=None, description="Node display name")
adv_type: Optional[str] = Field(default=None, description="Advertisement type")
flags: Optional[int] = Field(default=None, description="Capability flags")
first_seen: datetime = Field(..., description="First advertisement timestamp")
- last_seen: datetime = Field(..., description="Last activity timestamp")
+ last_seen: Optional[datetime] = Field(
+     default=None, description="Last activity timestamp"
+ )
lat: Optional[float] = Field(default=None, description="GPS latitude coordinate")
lon: Optional[float] = Field(default=None, description="GPS longitude coordinate")
created_at: datetime = Field(..., description="Record creation timestamp")
updated_at: datetime = Field(..., description="Record update timestamp")
tags: list[NodeTagRead] = Field(default_factory=list, description="Node tags")
@@ -85,7 +107,7 @@ class NodeFilters(BaseModel):
search: Optional[str] = Field(
default=None,
- description="Search in name or public key",
+ description="Search in name tag, node name, or public key",
)
adv_type: Optional[str] = Field(
default=None,

View File

@@ -51,6 +51,13 @@ def interface() -> None:
envvar="NODE_ADDRESS",
help="Override for device public key/address (hex string)",
)
@click.option(
"--device-name",
type=str,
default=None,
envvar="MESHCORE_DEVICE_NAME",
help="Device/node name (optional)",
)
@click.option(
"--mqtt-host",
type=str,
@@ -86,6 +93,26 @@ def interface() -> None:
envvar="MQTT_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
@click.option(
"--contact-cleanup/--no-contact-cleanup",
default=True,
envvar="CONTACT_CLEANUP_ENABLED",
help="Enable/disable automatic removal of stale contacts (RECEIVER mode only)",
)
@click.option(
"--contact-cleanup-days",
type=int,
default=7,
envvar="CONTACT_CLEANUP_DAYS",
help="Remove contacts not advertised for this many days (RECEIVER mode only)",
)
@click.option(
"--log-level",
type=click.Choice(["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]),
@@ -99,11 +126,15 @@ def run(
baud: int,
mock: bool,
node_address: str | None,
device_name: str | None,
mqtt_host: str,
mqtt_port: int,
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
contact_cleanup: bool,
contact_cleanup_days: int,
log_level: str,
) -> None:
"""Run the interface component.
@@ -139,11 +170,15 @@ def run(
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
contact_cleanup_enabled=contact_cleanup,
contact_cleanup_days=contact_cleanup_days,
)
elif mode_upper == "SENDER":
from meshcore_hub.interface.sender import run_sender
@@ -153,11 +188,13 @@ def run(
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
)
else:
click.echo(f"Unknown mode: {mode}", err=True)
@@ -193,6 +230,13 @@ def run(
envvar="NODE_ADDRESS",
help="Override for device public key/address (hex string)",
)
@click.option(
"--device-name",
type=str,
default=None,
envvar="MESHCORE_DEVICE_NAME",
help="Device/node name (optional)",
)
@click.option(
"--mqtt-host",
type=str,
@@ -228,16 +272,40 @@ def run(
envvar="MQTT_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
@click.option(
"--contact-cleanup/--no-contact-cleanup",
default=True,
envvar="CONTACT_CLEANUP_ENABLED",
help="Enable/disable automatic removal of stale contacts",
)
@click.option(
"--contact-cleanup-days",
type=int,
default=7,
envvar="CONTACT_CLEANUP_DAYS",
help="Remove contacts not advertised for this many days",
)
def receiver(
port: str,
baud: int,
mock: bool,
node_address: str | None,
device_name: str | None,
mqtt_host: str,
mqtt_port: int,
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
contact_cleanup: bool,
contact_cleanup_days: int,
) -> None:
"""Run interface in RECEIVER mode.
@@ -257,11 +325,15 @@ def receiver(
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
contact_cleanup_enabled=contact_cleanup,
contact_cleanup_days=contact_cleanup_days,
)
@@ -294,6 +366,13 @@ def receiver(
envvar="NODE_ADDRESS",
help="Override for device public key/address (hex string)",
)
@click.option(
"--device-name",
type=str,
default=None,
envvar="MESHCORE_DEVICE_NAME",
help="Device/node name (optional)",
)
@click.option(
"--mqtt-host",
type=str,
@@ -329,16 +408,25 @@ def receiver(
envvar="MQTT_PREFIX",
help="MQTT topic prefix",
)
@click.option(
"--mqtt-tls",
is_flag=True,
default=False,
envvar="MQTT_TLS",
help="Enable TLS/SSL for MQTT connection",
)
def sender(
port: str,
baud: int,
mock: bool,
node_address: str | None,
device_name: str | None,
mqtt_host: str,
mqtt_port: int,
mqtt_username: str | None,
mqtt_password: str | None,
prefix: str,
mqtt_tls: bool,
) -> None:
"""Run interface in SENDER mode.
@@ -363,4 +451,5 @@ def sender(
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=prefix,
mqtt_tls=mqtt_tls,
)

View File

@@ -164,6 +164,18 @@ class BaseMeshCoreDevice(ABC):
"""
pass
@abstractmethod
def set_name(self, name: str) -> bool:
"""Set the device's node name.
Args:
name: Node name to set
Returns:
True if name was set successfully
"""
pass
@abstractmethod
def start_message_fetching(self) -> bool:
"""Start automatic message fetching.
@@ -175,6 +187,56 @@ class BaseMeshCoreDevice(ABC):
"""
pass
@abstractmethod
def get_contacts(self) -> bool:
"""Fetch contacts from device contact database.
Triggers a CONTACTS event with all stored contacts from the device.
Note: This should only be called before the event loop is running.
Returns:
True if request was sent successfully
"""
pass
@abstractmethod
def schedule_get_contacts(self) -> bool:
"""Schedule a get_contacts request on the event loop.
This is safe to call from event handlers while the event loop is running.
Returns:
True if request was scheduled successfully
"""
pass
@abstractmethod
def remove_contact(self, public_key: str) -> bool:
"""Remove a contact from the device's contact database.
Args:
public_key: The 64-character hex public key of the contact to remove
Returns:
True if contact was removed successfully
"""
pass
@abstractmethod
def schedule_remove_contact(self, public_key: str) -> bool:
"""Schedule a remove_contact request on the event loop.
This is safe to call from event handlers while the event loop is running.
Args:
public_key: The 64-character hex public key of the contact to remove
Returns:
True if request was scheduled successfully
"""
pass
@abstractmethod
def run(self) -> None:
"""Run the device event loop (blocking)."""
@@ -322,6 +384,10 @@ class MeshCoreDevice(BaseMeshCoreDevice):
self._connected = True
logger.info(f"Connected to MeshCore device, public_key: {self._public_key}")
# Set up event subscriptions so events can be received immediately
self._setup_event_subscriptions()
return True
except Exception as e:
@@ -503,6 +569,24 @@ class MeshCoreDevice(BaseMeshCoreDevice):
logger.error(f"Failed to set device time: {e}")
return False
def set_name(self, name: str) -> bool:
"""Set the device's node name."""
if not self._connected or not self._mc:
logger.error("Cannot set name: not connected")
return False
try:
async def _set_name() -> None:
await self._mc.commands.set_name(name)
self._loop.run_until_complete(_set_name())
logger.info(f"Set device name to '{name}'")
return True
except Exception as e:
logger.error(f"Failed to set device name: {e}")
return False
def start_message_fetching(self) -> bool:
"""Start automatic message fetching."""
if not self._connected or not self._mc:
@@ -521,14 +605,107 @@ class MeshCoreDevice(BaseMeshCoreDevice):
logger.error(f"Failed to start message fetching: {e}")
return False
def get_contacts(self) -> bool:
"""Fetch contacts from device contact database.
Note: This method should only be called before the event loop is running
(e.g., during initialization). For calling during event processing,
use schedule_get_contacts() instead.
"""
if not self._connected or not self._mc:
logger.error("Cannot get contacts: not connected")
return False
try:
async def _get_contacts() -> None:
await self._mc.commands.get_contacts()
self._loop.run_until_complete(_get_contacts())
logger.info("Requested contacts from device")
return True
except Exception as e:
logger.error(f"Failed to get contacts: {e}")
return False
def schedule_get_contacts(self) -> bool:
"""Schedule a get_contacts request on the event loop.
This is safe to call from event handlers while the event loop is running.
The request is scheduled as a task on the event loop.
Returns:
True if request was scheduled, False if device not connected
"""
if not self._connected or not self._mc:
logger.error("Cannot get contacts: not connected")
return False
try:
async def _get_contacts() -> None:
await self._mc.commands.get_contacts()
asyncio.run_coroutine_threadsafe(_get_contacts(), self._loop)
logger.info("Scheduled contact sync request")
return True
except Exception as e:
logger.error(f"Failed to schedule get contacts: {e}")
return False
def remove_contact(self, public_key: str) -> bool:
"""Remove a contact from the device's contact database.
Note: This method should only be called before the event loop is running
(e.g., during initialization). For calling during event processing,
use schedule_remove_contact() instead.
"""
if not self._connected or not self._mc:
logger.error("Cannot remove contact: not connected")
return False
try:
async def _remove_contact() -> None:
await self._mc.commands.remove_contact(public_key)
self._loop.run_until_complete(_remove_contact())
logger.info(f"Removed contact {public_key[:12]}...")
return True
except Exception as e:
logger.error(f"Failed to remove contact: {e}")
return False
def schedule_remove_contact(self, public_key: str) -> bool:
"""Schedule a remove_contact request on the event loop.
This is safe to call from event handlers while the event loop is running.
The request is scheduled as a task on the event loop.
Returns:
True if request was scheduled, False if device not connected
"""
if not self._connected or not self._mc:
logger.error("Cannot remove contact: not connected")
return False
try:
async def _remove_contact() -> None:
await self._mc.commands.remove_contact(public_key)
asyncio.run_coroutine_threadsafe(_remove_contact(), self._loop)
logger.debug(f"Scheduled removal of contact {public_key[:12]}...")
return True
except Exception as e:
logger.error(f"Failed to schedule remove contact: {e}")
return False
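The split between `get_contacts`/`remove_contact` (loop not yet running) and their `schedule_*` counterparts (loop running on another thread) is a standard asyncio pattern: `run_until_complete` raises `RuntimeError` on an already-running loop, so in-flight work must go through `run_coroutine_threadsafe`. A minimal self-contained sketch with illustrative names:

```python
import asyncio
import threading

async def request(tag: str) -> str:
    # Stand-in for an awaitable device command
    await asyncio.sleep(0)
    return f"done:{tag}"

results = []
loop = asyncio.new_event_loop()

# Before the loop runs (initialization): drive the coroutine directly.
results.append(loop.run_until_complete(request("init")))

# While the loop runs in another thread (event handling): schedule it
# thread-safely and wait on the returned concurrent.futures.Future.
def handler() -> None:
    fut = asyncio.run_coroutine_threadsafe(request("event"), loop)
    results.append(fut.result(timeout=2))
    loop.call_soon_threadsafe(loop.stop)

t = threading.Thread(target=handler)
t.start()
loop.run_forever()
t.join()
loop.close()
```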
def run(self) -> None:
"""Run the device event loop."""
self._running = True
logger.info("Starting device event loop")
# Set up event subscriptions
self._setup_event_subscriptions()
# Run the async event loop
async def _run_loop() -> None:
while self._running and self._connected:

View File

@@ -271,6 +271,17 @@ class MockMeshCoreDevice(BaseMeshCoreDevice):
logger.info(f"Mock: Set device time to {timestamp}")
return True
def set_name(self, name: str) -> bool:
"""Set the mock device's node name."""
if not self._connected:
logger.error("Cannot set name: not connected")
return False
logger.info(f"Mock: Set device name to '{name}'")
# Update the mock config name
self.mock_config.name = name
return True
def start_message_fetching(self) -> bool:
"""Start automatic message fetching (mock)."""
if not self._connected:
@@ -280,6 +291,68 @@ class MockMeshCoreDevice(BaseMeshCoreDevice):
logger.info("Mock: Started automatic message fetching")
return True
def get_contacts(self) -> bool:
"""Fetch contacts from mock device contact database.
Note: This should only be called before the event loop is running.
"""
if not self._connected:
logger.error("Cannot get contacts: not connected")
return False
logger.info("Mock: Requesting contacts from device")
# Generate CONTACTS event with all configured mock nodes
def send_contacts() -> None:
time.sleep(0.2)
contacts = [
{
"public_key": node.public_key,
"name": node.name,
"node_type": node.adv_type,
}
for node in self.mock_config.nodes
]
self._dispatch_event(
EventType.CONTACTS,
{"contacts": contacts},
)
threading.Thread(target=send_contacts, daemon=True).start()
return True
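The deferred CONTACTS dispatch above reduces to a generic pattern (illustrative names, not the real `MockMeshCoreDevice` API): the request returns immediately, and a daemon thread delivers the event shortly afterwards, mimicking a device round-trip.

```python
import queue
import threading
import time

events: "queue.Queue[tuple[str, dict]]" = queue.Queue()

def request_contacts(nodes: list) -> bool:
    """Return immediately; deliver a CONTACTS-style event from a thread."""
    def send() -> None:
        time.sleep(0.05)  # simulate device latency
        events.put(("CONTACTS", {"contacts": list(nodes)}))
    threading.Thread(target=send, daemon=True).start()
    return True  # request accepted; payload arrives asynchronously

accepted = request_contacts([{"public_key": "ab" * 32, "name": "node-1"}])
kind, payload = events.get(timeout=2)
```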
def schedule_get_contacts(self) -> bool:
"""Schedule a get_contacts request.
For the mock device, this is the same as get_contacts() since we
don't have a real async event loop. The contacts are sent via a thread.
"""
return self.get_contacts()
def remove_contact(self, public_key: str) -> bool:
"""Remove a contact from the mock device's contact database."""
if not self._connected:
logger.error("Cannot remove contact: not connected")
return False
# Find and remove the contact from mock_config.nodes
for i, node in enumerate(self.mock_config.nodes):
if node.public_key == public_key:
del self.mock_config.nodes[i]
logger.info(f"Mock: Removed contact {public_key[:12]}...")
return True
logger.warning(f"Mock: Contact {public_key[:12]}... not found")
return True # Return True even if not found (idempotent)
def schedule_remove_contact(self, public_key: str) -> bool:
"""Schedule a remove_contact request.
For the mock device, this is the same as remove_contact() since we
don't have a real async event loop.
"""
return self.remove_contact(public_key)
def run(self) -> None:
"""Run the mock device event loop."""
self._running = True

View File

@@ -20,6 +20,9 @@ from meshcore_hub.interface.device import (
create_device,
)
# Default contact cleanup settings
DEFAULT_CONTACT_CLEANUP_DAYS = 7
logger = logging.getLogger(__name__)
@@ -33,15 +36,24 @@ class Receiver:
self,
device: BaseMeshCoreDevice,
mqtt_client: MQTTClient,
device_name: Optional[str] = None,
contact_cleanup_enabled: bool = True,
contact_cleanup_days: int = DEFAULT_CONTACT_CLEANUP_DAYS,
):
"""Initialize receiver.
Args:
device: MeshCore device instance
mqtt_client: MQTT client instance
device_name: Optional device/node name to set on startup
contact_cleanup_enabled: Whether to remove stale contacts from device
contact_cleanup_days: Remove contacts not advertised for this many days
"""
self.device = device
self.mqtt = mqtt_client
self.device_name = device_name
self.contact_cleanup_enabled = contact_cleanup_enabled
self.contact_cleanup_days = contact_cleanup_days
self._running = False
self._shutdown_event = threading.Event()
self._device_connected = False
@@ -71,10 +83,14 @@ class Receiver:
"device_public_key": self.device.public_key,
}
- def _initialize_device(self) -> None:
+ def _initialize_device(self, device_name: Optional[str] = None) -> None:
"""Initialize device after connection.
- Sets the hardware clock, sends a local advertisement, and starts message fetching.
+ Sets the hardware clock, optionally sets device name, sends a local advertisement,
+ starts message fetching, and syncs the contact database.
Args:
device_name: Optional device/node name to set
"""
# Set device time to current Unix timestamp
current_time = int(time.time())
@@ -83,11 +99,18 @@ class Receiver:
else:
logger.warning("Failed to synchronize device clock")
- # Send a local (non-flood) advertisement to announce presence
- if self.device.send_advertisement(flood=False):
-     logger.info("Sent local advertisement")
+ # Set device name if provided
+ if device_name:
+     if self.device.set_name(device_name):
+         logger.info(f"Set device name to '{device_name}'")
+     else:
+         logger.warning(f"Failed to set device name to '{device_name}'")
+ # Send a flood advertisement to broadcast device name
+ if self.device.send_advertisement(flood=True):
+     logger.info("Sent flood advertisement")
  else:
-     logger.warning("Failed to send local advertisement")
+     logger.warning("Failed to send flood advertisement")
# Start automatic message fetching
if self.device.start_message_fetching():
@@ -95,6 +118,12 @@ class Receiver:
else:
logger.warning("Failed to start automatic message fetching")
# Fetch contact database to sync known nodes
if self.device.get_contacts():
logger.info("Requested contact database sync")
else:
logger.warning("Failed to request contact database")
def _handle_event(self, event_type: EventType, payload: dict[str, Any]) -> None:
"""Handle device event and publish to MQTT.
@@ -110,6 +139,11 @@ class Receiver:
# Convert event type to MQTT topic name
event_name = event_type.value
# Special handling for CONTACTS: split into individual messages
if event_type == EventType.CONTACTS:
self._publish_contacts(payload)
return
# Publish to MQTT
self.mqtt.publish_event(
self.device.public_key,
@@ -119,9 +153,101 @@ class Receiver:
logger.debug(f"Published {event_name} event to MQTT")
# Trigger contact sync on advertisements
if event_type == EventType.ADVERTISEMENT:
self._sync_contacts()
except Exception as e:
logger.error(f"Failed to publish event to MQTT: {e}")
def _sync_contacts(self) -> None:
"""Request contact sync from device.
Called when advertisements are received to ensure contact database
stays current with all nodes on the mesh.
"""
logger.info("Advertisement received, triggering contact sync")
success = self.device.schedule_get_contacts()
if not success:
logger.warning("Contact sync request failed")
def _publish_contacts(self, payload: dict[str, Any]) -> None:
"""Publish each contact as a separate MQTT message.
The device returns contacts as a dict keyed by public_key.
We split this into individual 'contact' events for cleaner processing.
Stale contacts (not advertised for > contact_cleanup_days) are removed
from the device and not published.
Args:
payload: Dict of contacts keyed by public_key
"""
if not self.device.public_key:
logger.warning("Cannot publish contacts: device public key not available")
return
# Handle both formats:
# - Dict keyed by public_key (real device)
# - Dict with "contacts" array (mock device)
if "contacts" in payload:
contacts = payload["contacts"]
else:
contacts = list(payload.values())
if not contacts:
logger.debug("Empty contacts list received")
return
device_key = self.device.public_key # Capture for type narrowing
current_time = int(time.time())
stale_threshold = current_time - (self.contact_cleanup_days * 24 * 60 * 60)
published_count = 0
removed_count = 0
for contact in contacts:
if not isinstance(contact, dict):
continue
public_key = contact.get("public_key")
if not public_key:
continue
# Check if contact is stale based on last_advert timestamp
# Only check if cleanup is enabled and last_advert exists
if self.contact_cleanup_enabled:
last_advert = contact.get("last_advert")
if last_advert is not None and last_advert > 0:
if last_advert < stale_threshold:
# Contact is stale - remove from device
adv_name = contact.get("adv_name", contact.get("name", ""))
logger.info(
f"Removing stale contact {public_key[:12]}... "
f"({adv_name}) - last advertised "
f"{(current_time - last_advert) // 86400} days ago"
)
self.device.schedule_remove_contact(public_key)
removed_count += 1
continue # Don't publish stale contacts
try:
self.mqtt.publish_event(
device_key,
"contact", # Use singular 'contact' for individual events
contact,
)
published_count += 1
except Exception as e:
logger.error(f"Failed to publish contact event: {e}")
if removed_count > 0:
logger.info(
f"Contact sync: published {published_count}, "
f"removed {removed_count} stale contacts"
)
else:
logger.info(f"Published {published_count} contact events to MQTT")
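The payload normalization and staleness rules in `_publish_contacts` come down to a few lines; a stand-alone sketch (`normalize_contacts` and `is_stale` are illustrative helpers, not part of the module):

```python
import time

CLEANUP_DAYS = 7  # mirrors DEFAULT_CONTACT_CLEANUP_DAYS

def normalize_contacts(payload: dict) -> list:
    """Accept either a {"contacts": [...]} wrapper (mock device)
    or a dict keyed by public_key (real device)."""
    if "contacts" in payload:
        return payload["contacts"]
    return list(payload.values())

def is_stale(contact: dict, now: int, cleanup_days: int = CLEANUP_DAYS) -> bool:
    """Stale only when a positive last_advert is older than the cutoff;
    contacts with a missing or zero timestamp are kept."""
    last_advert = contact.get("last_advert")
    if last_advert is None or last_advert <= 0:
        return False
    return last_advert < now - cleanup_days * 24 * 60 * 60

now = int(time.time())
payload = {
    "aa" * 32: {"public_key": "aa" * 32, "last_advert": now - 8 * 86400},
    "bb" * 32: {"public_key": "bb" * 32, "last_advert": now - 3600},
}
# Only the fresh contact survives; the 8-day-old one would be removed.
fresh = [c for c in normalize_contacts(payload) if not is_stale(c, now)]
```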
def start(self) -> None:
"""Start the receiver."""
logger.info("Starting RECEIVER mode")
@@ -142,20 +268,22 @@ class Receiver:
logger.error(f"Failed to connect to MQTT broker: {e}")
raise
- # Connect to device
- if not self.device.connect():
-     self._device_connected = False
-     logger.error("Failed to connect to MeshCore device")
-     self.mqtt.stop()
-     self.mqtt.disconnect()
-     self._mqtt_connected = False
-     raise RuntimeError("Failed to connect to MeshCore device")
+ # Device should already be connected (from create_receiver)
+ # but handle case where start() is called directly
+ if not self.device.is_connected:
+     if not self.device.connect():
+         self._device_connected = False
+         logger.error("Failed to connect to MeshCore device")
+         self.mqtt.stop()
+         self.mqtt.disconnect()
+         self._mqtt_connected = False
+         raise RuntimeError("Failed to connect to MeshCore device")
- logger.info(f"Connected to MeshCore device: {self.device.public_key}")
  self._device_connected = True
+ logger.info(f"Connected to MeshCore device: {self.device.public_key}")
- # Initialize device: set time and send local advertisement
- self._initialize_device()
+ # Initialize device: set time, optionally set name, and send local advertisement
+ self._initialize_device(device_name=self.device_name)
self._running = True
@@ -214,11 +342,15 @@ def create_receiver(
baud: int = 115200,
mock: bool = False,
node_address: Optional[str] = None,
device_name: Optional[str] = None,
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
contact_cleanup_enabled: bool = True,
contact_cleanup_days: int = DEFAULT_CONTACT_CLEANUP_DAYS,
) -> Receiver:
"""Create a configured receiver instance.
@@ -227,30 +359,46 @@ def create_receiver(
baud: Baud rate
mock: Use mock device
node_address: Optional override for device public key/address
device_name: Optional device/node name to set on startup
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
contact_cleanup_enabled: Whether to remove stale contacts from device
contact_cleanup_days: Remove contacts not advertised for this many days
Returns:
Configured Receiver instance
"""
- # Create device
+ # Create and connect device first to get public key
  device = create_device(port=port, baud=baud, mock=mock, node_address=node_address)
- # Create MQTT client
+ if not device.connect():
+     raise RuntimeError("Failed to connect to MeshCore device")
+ logger.info(f"Connected to MeshCore device: {device.public_key}")
+ # Create MQTT client with device's public key for unique client ID
mqtt_config = MQTTConfig(
host=mqtt_host,
port=mqtt_port,
username=mqtt_username,
password=mqtt_password,
prefix=mqtt_prefix,
- client_id=f"meshcore-receiver-{device.public_key[:8] if device.public_key else 'unknown'}",
+ client_id=f"meshcore-receiver-{device.public_key[:12] if device.public_key else 'unknown'}",
tls=mqtt_tls,
)
mqtt_client = MQTTClient(mqtt_config)
- return Receiver(device, mqtt_client)
+ return Receiver(
+     device,
+     mqtt_client,
+     device_name=device_name,
+     contact_cleanup_enabled=contact_cleanup_enabled,
+     contact_cleanup_days=contact_cleanup_days,
+ )
def run_receiver(
@@ -258,11 +406,15 @@ def run_receiver(
baud: int = 115200,
mock: bool = False,
node_address: Optional[str] = None,
device_name: Optional[str] = None,
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
contact_cleanup_enabled: bool = True,
contact_cleanup_days: int = DEFAULT_CONTACT_CLEANUP_DAYS,
) -> None:
"""Run the receiver (blocking).
@@ -273,22 +425,30 @@ def run_receiver(
baud: Baud rate
mock: Use mock device
node_address: Optional override for device public key/address
device_name: Optional device/node name to set on startup
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
contact_cleanup_enabled: Whether to remove stale contacts from device
contact_cleanup_days: Remove contacts not advertised for this many days
"""
receiver = create_receiver(
port=port,
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=mqtt_prefix,
mqtt_tls=mqtt_tls,
contact_cleanup_enabled=contact_cleanup_enabled,
contact_cleanup_days=contact_cleanup_days,
)
# Set up signal handlers


@@ -200,14 +200,16 @@ class Sender:
"""Start the sender."""
logger.info("Starting SENDER mode")
-        # Connect to device first
-        if not self.device.connect():
-            self._device_connected = False
-            logger.error("Failed to connect to MeshCore device")
-            raise RuntimeError("Failed to connect to MeshCore device")
+        # Device should already be connected (from create_sender)
+        # but handle case where start() is called directly
+        if not self.device.is_connected:
+            if not self.device.connect():
+                self._device_connected = False
+                logger.error("Failed to connect to MeshCore device")
+                raise RuntimeError("Failed to connect to MeshCore device")
-        logger.info(f"Connected to MeshCore device: {self.device.public_key}")
        self._device_connected = True
+        logger.info(f"Connected to MeshCore device: {self.device.public_key}")
# Connect to MQTT broker
try:
@@ -285,11 +287,13 @@ def create_sender(
baud: int = 115200,
mock: bool = False,
node_address: Optional[str] = None,
device_name: Optional[str] = None,
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
) -> Sender:
"""Create a configured sender instance.
@@ -298,26 +302,34 @@ def create_sender(
baud: Baud rate
mock: Use mock device
node_address: Optional override for device public key/address
device_name: Optional device/node name (not used in SENDER mode)
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
Returns:
Configured Sender instance
"""
-    # Create device
+    # Create and connect device first to get public key
    device = create_device(port=port, baud=baud, mock=mock, node_address=node_address)
-    # Create MQTT client
+    if not device.connect():
+        raise RuntimeError("Failed to connect to MeshCore device")
+    logger.info(f"Connected to MeshCore device: {device.public_key}")
+    # Create MQTT client with device's public key for unique client ID
mqtt_config = MQTTConfig(
host=mqtt_host,
port=mqtt_port,
username=mqtt_username,
password=mqtt_password,
prefix=mqtt_prefix,
-        client_id=f"meshcore-sender-{device.public_key[:8] if device.public_key else 'unknown'}",
+        client_id=f"meshcore-sender-{device.public_key[:12] if device.public_key else 'unknown'}",
tls=mqtt_tls,
)
mqtt_client = MQTTClient(mqtt_config)
@@ -329,11 +341,13 @@ def run_sender(
baud: int = 115200,
mock: bool = False,
node_address: Optional[str] = None,
device_name: Optional[str] = None,
mqtt_host: str = "localhost",
mqtt_port: int = 1883,
mqtt_username: Optional[str] = None,
mqtt_password: Optional[str] = None,
mqtt_prefix: str = "meshcore",
mqtt_tls: bool = False,
) -> None:
"""Run the sender (blocking).
@@ -344,22 +358,26 @@ def run_sender(
baud: Baud rate
mock: Use mock device
node_address: Optional override for device public key/address
device_name: Optional device/node name (not used in SENDER mode)
mqtt_host: MQTT broker host
mqtt_port: MQTT broker port
mqtt_username: MQTT username
mqtt_password: MQTT password
mqtt_prefix: MQTT topic prefix
mqtt_tls: Enable TLS/SSL for MQTT connection
"""
sender = create_sender(
port=port,
baud=baud,
mock=mock,
node_address=node_address,
device_name=device_name,
mqtt_host=mqtt_host,
mqtt_port=mqtt_port,
mqtt_username=mqtt_username,
mqtt_password=mqtt_password,
mqtt_prefix=mqtt_prefix,
mqtt_tls=mqtt_tls,
)
# Set up signal handlers
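The reworked `Sender.start()` above guards the connect call with `is_connected`, so starting a sender whose device was already connected by `create_sender()` does not reconnect. A sketch of that idempotent pattern, using a stand-in `Device` class (names are illustrative):

```python
class Device:
    """Stand-in for the MeshCore device wrapper (illustrative only)."""

    def __init__(self) -> None:
        self.is_connected = False
        self.connect_calls = 0

    def connect(self) -> bool:
        self.connect_calls += 1
        self.is_connected = True
        return True


def ensure_connected(device: Device) -> None:
    # Guard makes start() safe whether or not create_sender() already
    # connected the device: connect() only runs when actually needed.
    if not device.is_connected:
        if not device.connect():
            raise RuntimeError("Failed to connect to MeshCore device")


d = Device()
ensure_connected(d)  # first call connects
ensure_connected(d)  # second call is a no-op
assert d.connect_calls == 1
```

Without the guard, calling `start()` after `create_sender()` would open the serial device twice.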


@@ -11,6 +11,7 @@ from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates
from meshcore_hub import __version__
from meshcore_hub.common.schemas import RadioConfig
logger = logging.getLogger(__name__)
@@ -47,32 +48,44 @@ async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]:
def create_app(
-    api_url: str = "http://localhost:8000",
+    api_url: str | None = None,
    api_key: str | None = None,
-    network_name: str = "MeshCore Network",
+    admin_enabled: bool | None = None,
+    network_name: str | None = None,
network_city: str | None = None,
network_country: str | None = None,
network_location: tuple[float, float] | None = None,
network_radio_config: str | None = None,
network_contact_email: str | None = None,
network_contact_discord: str | None = None,
network_contact_github: str | None = None,
network_welcome_text: str | None = None,
) -> FastAPI:
"""Create and configure the web dashboard application.
When called without arguments (e.g., in reload mode), settings are loaded
from environment variables via the WebSettings class.
Args:
api_url: Base URL of the MeshCore Hub API
api_key: API key for authentication
admin_enabled: Enable admin interface at /a/
network_name: Display name for the network
network_city: City where the network is located
network_country: Country where the network is located
network_location: (lat, lon) tuple for map centering
network_radio_config: Radio configuration description
network_contact_email: Contact email address
network_contact_discord: Discord invite/server info
network_contact_github: GitHub repository URL
network_welcome_text: Welcome text for homepage
Returns:
Configured FastAPI application
"""
# Load settings from environment if not provided
from meshcore_hub.common.config import get_web_settings
settings = get_web_settings()
app = FastAPI(
title="MeshCore Hub Dashboard",
description="Web dashboard for MeshCore network visualization",
@@ -82,16 +95,30 @@ def create_app(
redoc_url=None,
)
-    # Store configuration in app state
-    app.state.api_url = api_url
-    app.state.api_key = api_key
-    app.state.network_name = network_name
-    app.state.network_city = network_city
-    app.state.network_country = network_country
-    app.state.network_location = network_location or (0.0, 0.0)
-    app.state.network_radio_config = network_radio_config
-    app.state.network_contact_email = network_contact_email
-    app.state.network_contact_discord = network_contact_discord
+    # Store configuration in app state (use args if provided, else settings)
+    app.state.api_url = api_url or settings.api_base_url
+    app.state.api_key = api_key or settings.api_key
+    app.state.admin_enabled = (
+        admin_enabled if admin_enabled is not None else settings.web_admin_enabled
+    )
+    app.state.network_name = network_name or settings.network_name
+    app.state.network_city = network_city or settings.network_city
+    app.state.network_country = network_country or settings.network_country
+    app.state.network_radio_config = (
+        network_radio_config or settings.network_radio_config
+    )
+    app.state.network_contact_email = (
+        network_contact_email or settings.network_contact_email
+    )
+    app.state.network_contact_discord = (
+        network_contact_discord or settings.network_contact_discord
+    )
+    app.state.network_contact_github = (
+        network_contact_github or settings.network_contact_github
+    )
+    app.state.network_welcome_text = (
+        network_welcome_text or settings.network_welcome_text
+    )
# Set up templates
templates = Jinja2Templates(directory=str(TEMPLATES_DIR))
@@ -134,13 +161,20 @@ def get_templates(request: Request) -> Jinja2Templates:
def get_network_context(request: Request) -> dict:
"""Get network configuration context for templates."""
# Parse radio config from comma-delimited string
radio_config = RadioConfig.from_config_string(
request.app.state.network_radio_config
)
return {
"network_name": request.app.state.network_name,
"network_city": request.app.state.network_city,
"network_country": request.app.state.network_country,
"network_location": request.app.state.network_location,
-        "network_radio_config": request.app.state.network_radio_config,
+        "network_radio_config": radio_config,
"network_contact_email": request.app.state.network_contact_email,
"network_contact_discord": request.app.state.network_contact_discord,
"network_contact_github": request.app.state.network_contact_github,
"network_welcome_text": request.app.state.network_welcome_text,
"admin_enabled": request.app.state.admin_enabled,
"version": __version__,
}


@@ -7,21 +7,21 @@ import click
@click.option(
"--host",
type=str,
-    default="0.0.0.0",
+    default=None,
    envvar="WEB_HOST",
-    help="Web server host",
+    help="Web server host (default: 0.0.0.0)",
)
@click.option(
"--port",
type=int,
-    default=8080,
+    default=None,
    envvar="WEB_PORT",
-    help="Web server port",
+    help="Web server port (default: 8080)",
)
@click.option(
"--api-url",
type=str,
-    default="http://localhost:8000",
+    default=None,
envvar="API_BASE_URL",
help="API server base URL",
)
@@ -42,7 +42,7 @@ import click
@click.option(
"--network-name",
type=str,
-    default="MeshCore Network",
+    default=None,
envvar="NETWORK_NAME",
help="Network display name",
)
@@ -60,20 +60,6 @@ import click
envvar="NETWORK_COUNTRY",
help="Network country",
)
-@click.option(
-    "--network-lat",
-    type=float,
-    default=0.0,
-    envvar="NETWORK_LAT",
-    help="Network center latitude",
-)
-@click.option(
-    "--network-lon",
-    type=float,
-    default=0.0,
-    envvar="NETWORK_LON",
-    help="Network center longitude",
-)
@click.option(
"--network-radio-config",
type=str,
@@ -95,6 +81,20 @@ import click
envvar="NETWORK_CONTACT_DISCORD",
help="Discord server info",
)
@click.option(
"--network-contact-github",
type=str,
default=None,
envvar="NETWORK_CONTACT_GITHUB",
help="GitHub repository URL",
)
@click.option(
"--network-welcome-text",
type=str,
default=None,
envvar="NETWORK_WELCOME_TEXT",
help="Welcome text for homepage",
)
@click.option(
"--reload",
is_flag=True,
@@ -104,19 +104,19 @@ import click
@click.pass_context
def web(
ctx: click.Context,
-    host: str,
-    port: int,
-    api_url: str,
+    host: str | None,
+    port: int | None,
+    api_url: str | None,
api_key: str | None,
data_home: str | None,
-    network_name: str,
+    network_name: str | None,
network_city: str | None,
network_country: str | None,
-    network_lat: float,
-    network_lon: float,
network_radio_config: str | None,
network_contact_email: str | None,
network_contact_discord: str | None,
network_contact_github: str | None,
network_welcome_text: str | None,
reload: bool,
) -> None:
"""Run the web dashboard.
@@ -146,46 +146,46 @@ def web(
from meshcore_hub.common.config import get_web_settings
from meshcore_hub.web.app import create_app
-    # Get settings to compute effective values
+    # Get settings for defaults and display
settings = get_web_settings()
-    # Override data_home if provided
-    if data_home:
-        settings = settings.model_copy(update={"data_home": data_home})
+    # Use CLI args or fall back to settings
+    effective_host = host or settings.web_host
+    effective_port = port or settings.web_port
+    effective_data_home = data_home or settings.data_home
# Ensure web data directory exists
web_data_dir = Path(effective_data_home) / "web"
web_data_dir.mkdir(parents=True, exist_ok=True)
# Display effective settings
effective_network_name = network_name or settings.network_name
click.echo("=" * 50)
click.echo("MeshCore Hub Web Dashboard")
click.echo("=" * 50)
-    click.echo(f"Host: {host}")
-    click.echo(f"Port: {port}")
+    click.echo(f"Host: {effective_host}")
+    click.echo(f"Port: {effective_port}")
click.echo(f"Data home: {effective_data_home}")
-    click.echo(f"API URL: {api_url}")
-    click.echo(f"API key configured: {api_key is not None}")
-    click.echo(f"Network: {network_name}")
-    if network_city and network_country:
-        click.echo(f"Location: {network_city}, {network_country}")
-    if network_lat != 0.0 or network_lon != 0.0:
-        click.echo(f"Map center: {network_lat}, {network_lon}")
+    click.echo(f"API URL: {api_url or settings.api_base_url}")
+    click.echo(f"API key configured: {(api_key or settings.api_key) is not None}")
+    click.echo(f"Network: {effective_network_name}")
+    effective_city = network_city or settings.network_city
+    effective_country = network_country or settings.network_country
+    if effective_city and effective_country:
+        click.echo(f"Location: {effective_city}, {effective_country}")
click.echo(f"Reload mode: {reload}")
click.echo("=" * 50)
-    network_location = (network_lat, network_lon)
if reload:
# For development, use uvicorn's reload feature
click.echo("\nStarting in development mode with auto-reload...")
-        click.echo("Note: Using default settings for reload mode.")
+        click.echo("Note: Settings loaded from environment/config.")
uvicorn.run(
"meshcore_hub.web.app:create_app",
-        host=host,
-        port=port,
+        host=effective_host,
+        port=effective_port,
reload=True,
factory=True,
)
@@ -197,11 +197,12 @@ def web(
network_name=network_name,
network_city=network_city,
network_country=network_country,
-        network_location=network_location,
network_radio_config=network_radio_config,
network_contact_email=network_contact_email,
network_contact_discord=network_contact_discord,
network_contact_github=network_contact_github,
network_welcome_text=network_welcome_text,
)
click.echo("\nStarting web dashboard...")
-    uvicorn.run(app, host=host, port=port)
+    uvicorn.run(app, host=effective_host, port=effective_port)
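The CLI changes above move every option to `default=None` so that an unset flag falls through to the environment variable (via Click's `envvar=`) and then to the settings default. A sketch of that resolution order; the `effective_value` helper is illustrative, since Click itself performs the envvar step:

```python
import os


def effective_value(cli_arg, env_name, fallback):
    """Resolution order the options follow: explicit flag beats
    environment variable beats built-in/settings default."""
    if cli_arg is not None:
        return cli_arg
    return os.environ.get(env_name) or fallback


os.environ.pop("WEB_HOST", None)
assert effective_value(None, "WEB_HOST", "0.0.0.0") == "0.0.0.0"

os.environ["WEB_HOST"] = "127.0.0.1"
assert effective_value(None, "WEB_HOST", "0.0.0.0") == "127.0.0.1"

# An explicit CLI flag always wins over the environment.
assert effective_value("10.0.0.5", "WEB_HOST", "0.0.0.0") == "10.0.0.5"
```

Keeping `default=None` (rather than a concrete default) is what lets the code distinguish "user passed nothing" from "user passed the default value".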


@@ -6,8 +6,10 @@ from meshcore_hub.web.routes.home import router as home_router
from meshcore_hub.web.routes.network import router as network_router
from meshcore_hub.web.routes.nodes import router as nodes_router
from meshcore_hub.web.routes.messages import router as messages_router
from meshcore_hub.web.routes.advertisements import router as advertisements_router
from meshcore_hub.web.routes.map import router as map_router
from meshcore_hub.web.routes.members import router as members_router
from meshcore_hub.web.routes.admin import router as admin_router
# Create main web router
web_router = APIRouter()
@@ -17,7 +19,9 @@ web_router.include_router(home_router)
web_router.include_router(network_router)
web_router.include_router(nodes_router)
web_router.include_router(messages_router)
web_router.include_router(advertisements_router)
web_router.include_router(map_router)
web_router.include_router(members_router)
web_router.include_router(admin_router)
__all__ = ["web_router"]


@@ -0,0 +1,591 @@
"""Admin page routes."""
import logging
from typing import Any, Optional
from urllib.parse import urlencode
from fastapi import APIRouter, Form, HTTPException, Query, Request
from fastapi.responses import HTMLResponse, RedirectResponse
from httpx import Response
from meshcore_hub.web.app import get_network_context, get_templates
def _build_redirect_url(
public_key: str,
message: Optional[str] = None,
error: Optional[str] = None,
) -> str:
"""Build a properly encoded redirect URL with optional message/error."""
params: dict[str, str] = {"public_key": public_key}
if message:
params["message"] = message
if error:
params["error"] = error
return f"/a/node-tags?{urlencode(params)}"
def _get_error_detail(response: Response) -> str:
"""Safely extract error detail from response JSON."""
try:
data: Any = response.json()
detail: str = data.get("detail", "Unknown error")
return detail
except Exception:
return "Unknown error"
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/a", tags=["admin"])
def _check_admin_enabled(request: Request) -> None:
"""Check if admin interface is enabled, raise 404 if not."""
if not getattr(request.app.state, "admin_enabled", False):
raise HTTPException(status_code=404, detail="Not Found")
def _get_auth_context(request: Request) -> dict:
"""Extract OAuth2Proxy authentication headers."""
return {
"auth_user": request.headers.get("X-Forwarded-User"),
"auth_groups": request.headers.get("X-Forwarded-Groups"),
"auth_email": request.headers.get("X-Forwarded-Email"),
"auth_username": request.headers.get("X-Forwarded-Preferred-Username"),
}
def _is_authenticated(request: Request) -> bool:
"""Check if user is authenticated via OAuth2Proxy headers."""
return bool(
request.headers.get("X-Forwarded-User")
or request.headers.get("X-Forwarded-Email")
)
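The `_is_authenticated` check above trusts identity headers injected by OAuth2Proxy; this is only safe when the reverse proxy strips any client-supplied copies of those headers before forwarding. The check in isolation:

```python
def is_authenticated(headers: dict[str, str]) -> bool:
    # OAuth2Proxy injects identity headers after a successful login;
    # either the user or the email header counts as authenticated.
    return bool(
        headers.get("X-Forwarded-User") or headers.get("X-Forwarded-Email")
    )


assert is_authenticated({"X-Forwarded-Email": "op@example.net"})
assert is_authenticated({"X-Forwarded-User": "op"})
assert not is_authenticated({})  # anonymous request -> access denied page
```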
def _require_auth(request: Request) -> None:
"""Require authentication, raise 403 if not authenticated."""
if not _is_authenticated(request):
raise HTTPException(status_code=403, detail="Access denied")
@router.get("/", response_class=HTMLResponse)
async def admin_home(request: Request) -> HTMLResponse:
"""Render the admin page with OAuth2Proxy user info."""
_check_admin_enabled(request)
templates = get_templates(request)
context = get_network_context(request)
context["request"] = request
context.update(_get_auth_context(request))
# Check if user is authenticated
if not _is_authenticated(request):
return templates.TemplateResponse(
"admin/access_denied.html", context, status_code=403
)
return templates.TemplateResponse("admin/index.html", context)
@router.get("/node-tags", response_class=HTMLResponse)
async def admin_node_tags(
request: Request,
public_key: Optional[str] = Query(None),
message: Optional[str] = Query(None),
error: Optional[str] = Query(None),
) -> HTMLResponse:
"""Admin page for managing node tags."""
_check_admin_enabled(request)
templates = get_templates(request)
context = get_network_context(request)
context["request"] = request
context.update(_get_auth_context(request))
# Check if user is authenticated
if not _is_authenticated(request):
return templates.TemplateResponse(
"admin/access_denied.html", context, status_code=403
)
# Flash messages from redirects
context["message"] = message
context["error"] = error
# Fetch all nodes for dropdown
nodes = []
try:
response = await request.app.state.http_client.get(
"/api/v1/nodes",
params={"limit": 100},
)
if response.status_code == 200:
data = response.json()
nodes = data.get("items", [])
# Sort nodes alphabetically by name (unnamed nodes at the end)
nodes.sort(
key=lambda n: (n.get("name") is None, (n.get("name") or "").lower())
)
except Exception as e:
logger.exception("Failed to fetch nodes: %s", e)
context["error"] = "Failed to fetch nodes"
context["nodes"] = nodes
context["selected_public_key"] = public_key
# Fetch tags for selected node
tags = []
selected_node = None
if public_key:
# Find the selected node in the list
for node in nodes:
if node.get("public_key") == public_key:
selected_node = node
break
try:
response = await request.app.state.http_client.get(
f"/api/v1/nodes/{public_key}/tags",
)
if response.status_code == 200:
tags = response.json()
elif response.status_code == 404:
context["error"] = "Node not found"
except Exception as e:
logger.exception("Failed to fetch tags: %s", e)
context["error"] = "Failed to fetch tags"
context["tags"] = tags
context["selected_node"] = selected_node
return templates.TemplateResponse("admin/node_tags.html", context)
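The node-dropdown sort key used above relies on tuple ordering: `False` sorts before `True`, so named nodes come first (case-insensitively) and unnamed nodes sink to the end. Checked in isolation with sample data:

```python
nodes = [
    {"public_key": "c3", "name": "Gamma"},
    {"public_key": "a1", "name": None},
    {"public_key": "b2", "name": "alpha"},
]

# (is_unnamed, lowercased_name): False < True pushes None-named nodes last.
nodes.sort(key=lambda n: (n.get("name") is None, (n.get("name") or "").lower()))

print([n["name"] for n in nodes])  # ['alpha', 'Gamma', None]
```

The `or ""` guard matters: calling `.lower()` on a missing or `None` name would otherwise raise `AttributeError`.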
@router.post("/node-tags", response_class=RedirectResponse)
async def admin_create_node_tag(
request: Request,
public_key: str = Form(...),
key: str = Form(...),
value: str = Form(""),
value_type: str = Form("string"),
) -> RedirectResponse:
"""Create a new node tag."""
_check_admin_enabled(request)
_require_auth(request)
try:
response = await request.app.state.http_client.post(
f"/api/v1/nodes/{public_key}/tags",
json={
"key": key,
"value": value or None,
"value_type": value_type,
},
)
if response.status_code == 201:
redirect_url = _build_redirect_url(
public_key, message=f"Tag '{key}' created successfully"
)
elif response.status_code == 409:
redirect_url = _build_redirect_url(
public_key, error=f"Tag '{key}' already exists"
)
elif response.status_code == 404:
redirect_url = _build_redirect_url(public_key, error="Node not found")
else:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to create tag: %s", e)
redirect_url = _build_redirect_url(public_key, error="Failed to create tag")
return RedirectResponse(url=redirect_url, status_code=303)
@router.post("/node-tags/update", response_class=RedirectResponse)
async def admin_update_node_tag(
request: Request,
public_key: str = Form(...),
key: str = Form(...),
value: str = Form(""),
value_type: str = Form("string"),
) -> RedirectResponse:
"""Update an existing node tag."""
_check_admin_enabled(request)
_require_auth(request)
try:
response = await request.app.state.http_client.put(
f"/api/v1/nodes/{public_key}/tags/{key}",
json={
"value": value or None,
"value_type": value_type,
},
)
if response.status_code == 200:
redirect_url = _build_redirect_url(
public_key, message=f"Tag '{key}' updated successfully"
)
elif response.status_code == 404:
redirect_url = _build_redirect_url(
public_key, error=f"Tag '{key}' not found"
)
else:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to update tag: %s", e)
redirect_url = _build_redirect_url(public_key, error="Failed to update tag")
return RedirectResponse(url=redirect_url, status_code=303)
@router.post("/node-tags/move", response_class=RedirectResponse)
async def admin_move_node_tag(
request: Request,
public_key: str = Form(...),
key: str = Form(...),
new_public_key: str = Form(...),
) -> RedirectResponse:
"""Move a node tag to a different node."""
_check_admin_enabled(request)
_require_auth(request)
try:
response = await request.app.state.http_client.put(
f"/api/v1/nodes/{public_key}/tags/{key}/move",
json={"new_public_key": new_public_key},
)
if response.status_code == 200:
# Redirect to the destination node after successful move
redirect_url = _build_redirect_url(
new_public_key, message=f"Tag '{key}' moved successfully"
)
elif response.status_code == 404:
# Stay on source node if not found
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
elif response.status_code == 409:
redirect_url = _build_redirect_url(
public_key, error=f"Tag '{key}' already exists on destination node"
)
else:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to move tag: %s", e)
redirect_url = _build_redirect_url(public_key, error="Failed to move tag")
return RedirectResponse(url=redirect_url, status_code=303)
@router.post("/node-tags/delete", response_class=RedirectResponse)
async def admin_delete_node_tag(
request: Request,
public_key: str = Form(...),
key: str = Form(...),
) -> RedirectResponse:
"""Delete a node tag."""
_check_admin_enabled(request)
_require_auth(request)
try:
response = await request.app.state.http_client.delete(
f"/api/v1/nodes/{public_key}/tags/{key}",
)
if response.status_code == 204:
redirect_url = _build_redirect_url(
public_key, message=f"Tag '{key}' deleted successfully"
)
elif response.status_code == 404:
redirect_url = _build_redirect_url(
public_key, error=f"Tag '{key}' not found"
)
else:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to delete tag: %s", e)
redirect_url = _build_redirect_url(public_key, error="Failed to delete tag")
return RedirectResponse(url=redirect_url, status_code=303)
@router.post("/node-tags/copy-all", response_class=RedirectResponse)
async def admin_copy_all_tags(
request: Request,
public_key: str = Form(...),
dest_public_key: str = Form(...),
) -> RedirectResponse:
"""Copy all tags from one node to another."""
_check_admin_enabled(request)
_require_auth(request)
try:
response = await request.app.state.http_client.post(
f"/api/v1/nodes/{public_key}/tags/copy-to/{dest_public_key}",
)
if response.status_code == 200:
data = response.json()
copied = data.get("copied", 0)
skipped = data.get("skipped", 0)
if skipped > 0:
message = f"Copied {copied} tag(s), skipped {skipped} existing"
else:
message = f"Copied {copied} tag(s) successfully"
# Redirect to destination node to show copied tags
redirect_url = _build_redirect_url(dest_public_key, message=message)
elif response.status_code == 400:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
elif response.status_code == 404:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
else:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to copy tags: %s", e)
redirect_url = _build_redirect_url(public_key, error="Failed to copy tags")
return RedirectResponse(url=redirect_url, status_code=303)
@router.post("/node-tags/delete-all", response_class=RedirectResponse)
async def admin_delete_all_tags(
request: Request,
public_key: str = Form(...),
) -> RedirectResponse:
"""Delete all tags from a node."""
_check_admin_enabled(request)
_require_auth(request)
try:
response = await request.app.state.http_client.delete(
f"/api/v1/nodes/{public_key}/tags",
)
if response.status_code == 200:
data = response.json()
deleted = data.get("deleted", 0)
message = f"Deleted {deleted} tag(s) successfully"
redirect_url = _build_redirect_url(public_key, message=message)
elif response.status_code == 404:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
else:
redirect_url = _build_redirect_url(
public_key, error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to delete tags: %s", e)
redirect_url = _build_redirect_url(public_key, error="Failed to delete tags")
return RedirectResponse(url=redirect_url, status_code=303)
def _build_members_redirect_url(
message: Optional[str] = None,
error: Optional[str] = None,
) -> str:
"""Build a properly encoded redirect URL for members page with optional message/error."""
params: dict[str, str] = {}
if message:
params["message"] = message
if error:
params["error"] = error
if params:
return f"/a/members?{urlencode(params)}"
return "/a/members"
@router.get("/members", response_class=HTMLResponse)
async def admin_members(
request: Request,
message: Optional[str] = Query(None),
error: Optional[str] = Query(None),
) -> HTMLResponse:
"""Admin page for managing members."""
_check_admin_enabled(request)
templates = get_templates(request)
context = get_network_context(request)
context["request"] = request
context.update(_get_auth_context(request))
# Check if user is authenticated
if not _is_authenticated(request):
return templates.TemplateResponse(
"admin/access_denied.html", context, status_code=403
)
# Flash messages from redirects
context["message"] = message
context["error"] = error
# Fetch all members
members = []
try:
response = await request.app.state.http_client.get(
"/api/v1/members",
params={"limit": 500},
)
if response.status_code == 200:
data = response.json()
members = data.get("items", [])
# Sort members alphabetically by name
members.sort(key=lambda m: m.get("name", "").lower())
except Exception as e:
logger.exception("Failed to fetch members: %s", e)
context["error"] = "Failed to fetch members"
context["members"] = members
return templates.TemplateResponse("admin/members.html", context)
@router.post("/members", response_class=RedirectResponse)
async def admin_create_member(
request: Request,
name: str = Form(...),
member_id: str = Form(...),
callsign: Optional[str] = Form(None),
role: Optional[str] = Form(None),
description: Optional[str] = Form(None),
contact: Optional[str] = Form(None),
) -> RedirectResponse:
"""Create a new member."""
_check_admin_enabled(request)
_require_auth(request)
try:
# Build request payload
payload = {
"name": name,
"member_id": member_id,
}
if callsign:
payload["callsign"] = callsign
if role:
payload["role"] = role
if description:
payload["description"] = description
if contact:
payload["contact"] = contact
response = await request.app.state.http_client.post(
"/api/v1/members",
json=payload,
)
if response.status_code == 201:
redirect_url = _build_members_redirect_url(
message=f"Member '{name}' created successfully"
)
elif response.status_code == 409:
redirect_url = _build_members_redirect_url(
error=f"Member ID '{member_id}' already exists"
)
else:
redirect_url = _build_members_redirect_url(
error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to create member: %s", e)
redirect_url = _build_members_redirect_url(error="Failed to create member")
return RedirectResponse(url=redirect_url, status_code=303)
@router.post("/members/update", response_class=RedirectResponse)
async def admin_update_member(
request: Request,
id: str = Form(...),
name: Optional[str] = Form(None),
member_id: Optional[str] = Form(None),
callsign: Optional[str] = Form(None),
role: Optional[str] = Form(None),
description: Optional[str] = Form(None),
contact: Optional[str] = Form(None),
) -> RedirectResponse:
"""Update an existing member."""
_check_admin_enabled(request)
_require_auth(request)
try:
# Build update payload (only include non-None fields)
payload: dict[str, str | None] = {}
if name is not None:
payload["name"] = name
if member_id is not None:
payload["member_id"] = member_id
if callsign is not None:
payload["callsign"] = callsign if callsign else None
if role is not None:
payload["role"] = role if role else None
if description is not None:
payload["description"] = description if description else None
if contact is not None:
payload["contact"] = contact if contact else None
response = await request.app.state.http_client.put(
f"/api/v1/members/{id}",
json=payload,
)
if response.status_code == 200:
redirect_url = _build_members_redirect_url(
message="Member updated successfully"
)
elif response.status_code == 404:
redirect_url = _build_members_redirect_url(error="Member not found")
elif response.status_code == 409:
redirect_url = _build_members_redirect_url(
error=f"Member ID '{member_id}' already exists"
)
else:
redirect_url = _build_members_redirect_url(
error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to update member: %s", e)
redirect_url = _build_members_redirect_url(error="Failed to update member")
return RedirectResponse(url=redirect_url, status_code=303)
@router.post("/members/delete", response_class=RedirectResponse)
async def admin_delete_member(
request: Request,
id: str = Form(...),
) -> RedirectResponse:
"""Delete a member."""
_check_admin_enabled(request)
_require_auth(request)
try:
response = await request.app.state.http_client.delete(
f"/api/v1/members/{id}",
)
if response.status_code == 204:
redirect_url = _build_members_redirect_url(
message="Member deleted successfully"
)
elif response.status_code == 404:
redirect_url = _build_members_redirect_url(error="Member not found")
else:
redirect_url = _build_members_redirect_url(
error=_get_error_detail(response)
)
except Exception as e:
logger.exception("Failed to delete member: %s", e)
redirect_url = _build_members_redirect_url(error="Failed to delete member")
return RedirectResponse(url=redirect_url, status_code=303)


@@ -0,0 +1,99 @@
"""Advertisements page route."""
import logging
from fastapi import APIRouter, Query, Request
from fastapi.responses import HTMLResponse
from meshcore_hub.web.app import get_network_context, get_templates
logger = logging.getLogger(__name__)
router = APIRouter()
@router.get("/advertisements", response_class=HTMLResponse)
async def advertisements_list(
request: Request,
search: str | None = Query(None, description="Search term"),
member_id: str | None = Query(None, description="Filter by member"),
public_key: str | None = Query(None, description="Filter by node public key"),
page: int = Query(1, ge=1, description="Page number"),
limit: int = Query(50, ge=1, le=100, description="Items per page"),
) -> HTMLResponse:
"""Render the advertisements list page."""
templates = get_templates(request)
context = get_network_context(request)
context["request"] = request
# Calculate offset
offset = (page - 1) * limit
# Build query params
params: dict[str, int | str] = {"limit": limit, "offset": offset}
if search:
params["search"] = search
if member_id:
params["member_id"] = member_id
if public_key:
params["public_key"] = public_key
# Fetch advertisements from API
advertisements = []
total = 0
members = []
nodes = []
try:
# Fetch members for dropdown
members_response = await request.app.state.http_client.get(
"/api/v1/members", params={"limit": 100}
)
if members_response.status_code == 200:
members = members_response.json().get("items", [])
# Fetch nodes for dropdown
nodes_response = await request.app.state.http_client.get(
"/api/v1/nodes", params={"limit": 500}
)
if nodes_response.status_code == 200:
nodes = nodes_response.json().get("items", [])
# Sort nodes alphabetically by display name
def get_node_display_name(node: dict) -> str:
for tag in node.get("tags") or []:
if tag.get("key") == "name":
return str(tag.get("value", "")).lower()
return str(node.get("name") or node.get("public_key", "")).lower()
nodes.sort(key=get_node_display_name)
response = await request.app.state.http_client.get(
"/api/v1/advertisements", params=params
)
if response.status_code == 200:
data = response.json()
advertisements = data.get("items", [])
total = data.get("total", 0)
except Exception as e:
logger.warning(f"Failed to fetch advertisements from API: {e}")
context["api_error"] = str(e)
# Calculate pagination
total_pages = (total + limit - 1) // limit if total > 0 else 1
context.update(
{
"advertisements": advertisements,
"total": total,
"page": page,
"limit": limit,
"total_pages": total_pages,
"search": search or "",
"member_id": member_id or "",
"public_key": public_key or "",
"members": members,
"nodes": nodes,
}
)
return templates.TemplateResponse("advertisements.html", context)
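The offset and page-count arithmetic in this route (`offset = (page - 1) * limit`, ceiling division for `total_pages`) is easy to sanity-check in isolation. A standalone sketch — `compute_pagination` is a hypothetical helper, not part of this commit:

```python
def compute_pagination(total: int, page: int, limit: int) -> tuple[int, int]:
    """Return (offset, total_pages) for 1-based page numbers."""
    offset = (page - 1) * limit
    # Ceiling division; an empty result set still renders one page
    total_pages = (total + limit - 1) // limit if total > 0 else 1
    return offset, total_pages
```

Note that `total_pages` is never 0, so templates can always render a "page 1 of N" control.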

View File

@@ -1,10 +1,14 @@
"""Home page route."""
import json
import logging
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from meshcore_hub.web.app import get_network_context, get_templates
logger = logging.getLogger(__name__)
router = APIRouter()
@@ -15,4 +19,49 @@ async def home(request: Request) -> HTMLResponse:
context = get_network_context(request)
context["request"] = request
# Fetch stats from API
stats = {
"total_nodes": 0,
"active_nodes": 0,
"total_messages": 0,
"messages_today": 0,
"total_advertisements": 0,
"advertisements_24h": 0,
}
# Fetch activity data for charts
advert_activity = {"days": 7, "data": []}
message_activity = {"days": 7, "data": []}
try:
response = await request.app.state.http_client.get("/api/v1/dashboard/stats")
if response.status_code == 200:
stats = response.json()
except Exception as e:
logger.warning(f"Failed to fetch stats from API: {e}")
context["api_error"] = str(e)
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/activity", params={"days": 7}
)
if response.status_code == 200:
advert_activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch activity from API: {e}")
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/message-activity", params={"days": 7}
)
if response.status_code == 200:
message_activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch message activity from API: {e}")
context["stats"] = stats
# Pass activity data as JSON strings for the chart
context["advert_activity_json"] = json.dumps(advert_activity)
context["message_activity_json"] = json.dumps(message_activity)
return templates.TemplateResponse("home.html", context)

View File

@@ -1,6 +1,7 @@
"""Map page route."""
import logging
from typing import Any
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse, JSONResponse
@@ -23,10 +24,38 @@ async def map_page(request: Request) -> HTMLResponse:
@router.get("/map/data")
async def map_data(request: Request) -> JSONResponse:
"""Return node location data as JSON for the map."""
nodes_with_location = []
"""Return node location data as JSON for the map.
Includes role tag, member ownership info, and all data needed for filtering.
"""
nodes_with_location: list[dict[str, Any]] = []
members_list: list[dict[str, Any]] = []
members_by_id: dict[str, dict[str, Any]] = {}
error: str | None = None
total_nodes = 0
nodes_with_coords = 0
try:
# Fetch all members to build lookup by member_id
members_response = await request.app.state.http_client.get(
"/api/v1/members", params={"limit": 500}
)
if members_response.status_code == 200:
members_data = members_response.json()
for member in members_data.get("items", []):
member_info = {
"member_id": member.get("member_id"),
"name": member.get("name"),
"callsign": member.get("callsign"),
}
members_list.append(member_info)
if member.get("member_id"):
members_by_id[member["member_id"]] = member_info
else:
logger.warning(
f"Failed to fetch members: status {members_response.status_code}"
)
# Fetch all nodes from API
response = await request.app.state.http_client.get(
"/api/v1/nodes", params={"limit": 500}
@@ -34,6 +63,7 @@ async def map_data(request: Request) -> JSONResponse:
if response.status_code == 200:
data = response.json()
nodes = data.get("items", [])
total_nodes = len(nodes)
# Filter nodes with location tags
for node in nodes:
@@ -41,6 +71,9 @@ async def map_data(request: Request) -> JSONResponse:
lat = None
lon = None
friendly_name = None
role = None
node_member_id = None
for tag in tags:
key = tag.get("key")
if key == "lat":
@@ -55,37 +88,75 @@ async def map_data(request: Request) -> JSONResponse:
pass
elif key == "friendly_name":
friendly_name = tag.get("value")
elif key == "role":
role = tag.get("value")
elif key == "member_id":
node_member_id = tag.get("value")
if lat is not None and lon is not None:
nodes_with_coords += 1
# Use friendly_name, then node name, then public key prefix
display_name = (
friendly_name
or node.get("name")
or node.get("public_key", "")[:12]
)
public_key = node.get("public_key")
# Find owner member by member_id tag
owner = (
members_by_id.get(node_member_id) if node_member_id else None
)
nodes_with_location.append(
{
"public_key": node.get("public_key"),
"public_key": public_key,
"name": display_name,
"adv_type": node.get("adv_type"),
"lat": lat,
"lon": lon,
"last_seen": node.get("last_seen"),
"role": role,
"is_infra": role == "infra",
"member_id": node_member_id,
"owner": owner,
}
)
else:
error = f"API returned status {response.status_code}"
logger.warning(f"Failed to fetch nodes: {error}")
except Exception as e:
error = str(e)
logger.warning(f"Failed to fetch nodes for map: {e}")
# Get network center location
network_location = request.app.state.network_location
logger.info(
f"Map data: {total_nodes} total nodes, " f"{nodes_with_coords} with coordinates"
)
# Calculate center from nodes, or use default (0, 0)
center_lat = 0.0
center_lon = 0.0
if nodes_with_location:
center_lat = sum(n["lat"] for n in nodes_with_location) / len(
nodes_with_location
)
center_lon = sum(n["lon"] for n in nodes_with_location) / len(
nodes_with_location
)
return JSONResponse(
{
"nodes": nodes_with_location,
"members": members_list,
"center": {
"lat": network_location[0],
"lon": network_location[1],
"lat": center_lat,
"lon": center_lon,
},
"debug": {
"total_nodes": total_nodes,
"nodes_with_coords": nodes_with_coords,
"error": error,
},
}
)
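The map centre above is the arithmetic mean of the plotted nodes' coordinates, falling back to `(0, 0)` when no node has a location. The same logic as a standalone sketch (`map_center` is a hypothetical helper, not part of this commit):

```python
def map_center(nodes: list[dict]) -> tuple[float, float]:
    """Average lat/lon of plotted nodes; (0.0, 0.0) when the list is empty."""
    if not nodes:
        return 0.0, 0.0
    lat = sum(n["lat"] for n in nodes) / len(nodes)
    lon = sum(n["lon"] for n in nodes) / len(nodes)
    return lat, lon
```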

View File

@@ -21,13 +21,74 @@ async def members_page(request: Request) -> HTMLResponse:
# Fetch members from API
members = []
def node_sort_key(node: dict) -> int:
"""Sort nodes: repeater first, then chat, then others."""
adv_type = (node.get("adv_type") or "").lower()
if adv_type == "repeater":
return 0
if adv_type == "chat":
return 1
return 2
try:
# Fetch all members
response = await request.app.state.http_client.get(
"/api/v1/members", params={"limit": 100}
)
if response.status_code == 200:
data = response.json()
members = data.get("items", [])
# Fetch all nodes with member_id tags in one query
nodes_response = await request.app.state.http_client.get(
"/api/v1/nodes", params={"has_tag": "member_id", "limit": 500}
)
# Build a map of member_id -> nodes
member_nodes_map: dict[str, list] = {}
if nodes_response.status_code == 200:
nodes_data = nodes_response.json()
all_nodes = nodes_data.get("items", [])
for node in all_nodes:
# Find member_id tag
for tag in node.get("tags", []):
if tag.get("key") == "member_id":
member_id_value = tag.get("value")
if member_id_value:
if member_id_value not in member_nodes_map:
member_nodes_map[member_id_value] = []
member_nodes_map[member_id_value].append(node)
break
# Assign nodes to members and sort
for member in members:
member_id = member.get("member_id")
if member_id and member_id in member_nodes_map:
# Sort nodes (repeater first, then chat, then by name tag)
nodes = member_nodes_map[member_id]
# Sort by advertisement type first, then by name
def full_sort_key(node: dict) -> tuple:
adv_type = (node.get("adv_type") or "").lower()
type_priority = (
0
if adv_type == "repeater"
else (1 if adv_type == "chat" else 2)
)
# Get name from tags
node_name = node.get("name") or ""
for tag in node.get("tags", []):
if tag.get("key") == "name":
node_name = tag.get("value") or node_name
break
return (type_priority, node_name.lower())
member["nodes"] = sorted(nodes, key=full_sort_key)
else:
member["nodes"] = []
except Exception as e:
logger.warning(f"Failed to fetch members from API: {e}")
context["api_error"] = str(e)

View File

@@ -15,7 +15,7 @@ router = APIRouter()
async def messages_list(
request: Request,
message_type: str | None = Query(None, description="Filter by message type"),
channel_idx: str | None = Query(None, description="Filter by channel"),
search: str | None = Query(None, description="Search in message text"),
page: int = Query(1, ge=1, description="Page number"),
limit: int = Query(50, ge=1, le=100, description="Items per page"),
@@ -28,12 +28,20 @@ async def messages_list(
# Calculate offset
offset = (page - 1) * limit
# Parse channel_idx, treating empty string as None
channel_idx_int: int | None = None
if channel_idx and channel_idx.strip():
try:
channel_idx_int = int(channel_idx)
except ValueError:
logger.warning(f"Invalid channel_idx value: {channel_idx}")
# Build query params
params: dict[str, int | str] = {"limit": limit, "offset": offset}
if message_type:
params["message_type"] = message_type
if channel_idx_int is not None:
params["channel_idx"] = channel_idx_int
# Fetch messages from API
messages = []
@@ -62,7 +70,7 @@ async def messages_list(
"limit": limit,
"total_pages": total_pages,
"message_type": message_type or "",
"channel_idx": channel_idx,
"channel_idx": channel_idx_int,
"search": search or "",
}
)

View File

@@ -1,5 +1,6 @@
"""Network overview page route."""
import json
import logging
from fastapi import APIRouter, Request
@@ -30,6 +31,11 @@ async def network_overview(request: Request) -> HTMLResponse:
"channel_message_counts": {},
}
# Fetch activity data for charts (7 days)
advert_activity = {"days": 7, "data": []}
message_activity = {"days": 7, "data": []}
node_count = {"days": 7, "data": []}
try:
response = await request.app.state.http_client.get("/api/v1/dashboard/stats")
if response.status_code == 200:
@@ -38,6 +44,36 @@ async def network_overview(request: Request) -> HTMLResponse:
logger.warning(f"Failed to fetch stats from API: {e}")
context["api_error"] = str(e)
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/activity", params={"days": 7}
)
if response.status_code == 200:
advert_activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch advertisement activity from API: {e}")
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/message-activity", params={"days": 7}
)
if response.status_code == 200:
message_activity = response.json()
except Exception as e:
logger.warning(f"Failed to fetch message activity from API: {e}")
try:
response = await request.app.state.http_client.get(
"/api/v1/dashboard/node-count", params={"days": 7}
)
if response.status_code == 200:
node_count = response.json()
except Exception as e:
logger.warning(f"Failed to fetch node count from API: {e}")
context["stats"] = stats
context["advert_activity_json"] = json.dumps(advert_activity)
context["message_activity_json"] = json.dumps(message_activity)
context["node_count_json"] = json.dumps(node_count)
return templates.TemplateResponse("network.html", context)

View File

@@ -16,6 +16,7 @@ async def nodes_list(
request: Request,
search: str | None = Query(None, description="Search term"),
adv_type: str | None = Query(None, description="Filter by node type"),
member_id: str | None = Query(None, description="Filter by member"),
page: int = Query(1, ge=1, description="Page number"),
limit: int = Query(20, ge=1, le=100, description="Items per page"),
) -> HTMLResponse:
@@ -33,12 +34,22 @@ async def nodes_list(
params["search"] = search
if adv_type:
params["adv_type"] = adv_type
if member_id:
params["member_id"] = member_id
# Fetch nodes from API
nodes = []
total = 0
members = []
try:
# Fetch members for dropdown
members_response = await request.app.state.http_client.get(
"/api/v1/members", params={"limit": 100}
)
if members_response.status_code == 200:
members = members_response.json().get("items", [])
response = await request.app.state.http_client.get(
"/api/v1/nodes", params=params
)
@@ -62,6 +73,8 @@ async def nodes_list(
"total_pages": total_pages,
"search": search or "",
"adv_type": adv_type or "",
"member_id": member_id or "",
"members": members,
}
)
@@ -105,12 +118,18 @@ async def node_detail(request: Request, public_key: str) -> HTMLResponse:
logger.warning(f"Failed to fetch node details from API: {e}")
context["api_error"] = str(e)
# Check if admin editing is available
admin_enabled = getattr(request.app.state, "admin_enabled", False)
auth_user = request.headers.get("X-Forwarded-User")
context.update(
{
"node": node,
"advertisements": advertisements,
"telemetry": telemetry,
"public_key": public_key,
"admin_enabled": admin_enabled,
"is_authenticated": bool(auth_user),
}
)

View File

@@ -0,0 +1,12 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg width="100%" height="100%" viewBox="0 0 134 15" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" xml:space="preserve" xmlns:serif="http://www.serif.com/" style="fill-rule:evenodd;clip-rule:evenodd;stroke-linejoin:round;stroke-miterlimit:2;">
<path d="M3.277,0.053C2.829,0.053 2.401,0.41 2.321,0.851L0.013,13.623C-0.067,14.064 0.232,14.421 0.681,14.421L3.13,14.421C3.578,14.421 4.006,14.064 4.086,13.623L5.004,8.54L6.684,13.957C6.766,14.239 7.02,14.421 7.337,14.421L10.58,14.421C10.897,14.421 11.217,14.239 11.401,13.957L15.043,8.513L14.119,13.623C14.038,14.064 14.338,14.421 14.787,14.421L17.236,14.421C17.684,14.421 18.112,14.064 18.192,13.623L20.5,0.851C20.582,0.41 20.283,0.053 19.834,0.053L16.69,0.053C16.373,0.053 16.053,0.235 15.87,0.517L9.897,9.473C9.803,9.616 9.578,9.578 9.528,9.41L7.074,0.517C6.992,0.235 6.738,0.053 6.421,0.053L3.277,0.053Z" style="fill:white;fill-rule:nonzero;"/>
<path d="M21.146,14.421C21.146,14.421 33.257,14.421 33.257,14.421C33.526,14.421 33.784,14.205 33.831,13.942L34.337,11.128C34.385,10.863 34.206,10.649 33.936,10.649L25.519,10.649C25.429,10.649 25.37,10.576 25.385,10.488L25.635,9.105C25.65,9.017 25.736,8.944 25.826,8.944L32.596,8.944C32.865,8.944 33.123,8.728 33.171,8.465L33.621,5.974C33.669,5.709 33.49,5.495 33.221,5.495L26.45,5.495C26.361,5.495 26.301,5.423 26.317,5.335L26.584,3.852C26.599,3.764 26.685,3.691 26.775,3.691L35.192,3.691C35.462,3.691 35.719,3.476 35.767,3.21L36.258,0.498C36.306,0.235 36.126,0.019 35.857,0.019L23.746,0.019C23.297,0.019 22.867,0.378 22.788,0.819L20.474,13.621C20.396,14.062 20.695,14.421 21.146,14.421Z" style="fill:white;fill-rule:nonzero;"/>
<path d="M45.926,14.419L45.926,14.421L46.346,14.421C48.453,14.421 50.465,12.742 50.839,10.67L51.081,9.327C51.456,7.256 50.05,5.576 47.943,5.576L41.455,5.576C41.186,5.576 41.007,5.363 41.054,5.097L41.218,4.192C41.266,3.927 41.524,3.713 41.793,3.713L50.569,3.713C51.018,3.713 51.446,3.356 51.526,2.915L51.9,0.85C51.98,0.407 51.68,0.05 51.232,0.05L41.638,0.05C39.531,0.05 37.519,1.73 37.145,3.801L36.88,5.267C36.505,7.339 37.91,9.018 40.018,9.018L46.506,9.018C46.775,9.018 46.954,9.231 46.907,9.497L46.785,10.176C46.737,10.441 46.479,10.655 46.21,10.655L37.189,10.655C36.741,10.655 36.313,11.012 36.233,11.453L35.841,13.621C35.761,14.062 36.061,14.419 36.51,14.419L45.926,14.419Z" style="fill:white;fill-rule:nonzero;"/>
<path d="M68.008,0.046C68.008,0.046 65.296,0.046 65.296,0.046C64.847,0.046 64.42,0.403 64.34,0.844L63.532,5.31C63.517,5.398 63.431,5.469 63.341,5.469L58.085,5.469C57.995,5.469 57.936,5.398 57.951,5.31L58.758,0.844C58.837,0.403 58.539,0.046 58.09,0.046L55.378,0.046C54.93,0.046 54.502,0.403 54.422,0.844L52.112,13.623C52.032,14.064 52.331,14.421 52.78,14.421L55.492,14.421C55.941,14.421 56.369,14.064 56.449,13.623L57.272,9.074C57.287,8.986 57.373,8.914 57.462,8.914L62.719,8.914C62.809,8.914 62.868,8.985 62.853,9.074L62.032,13.623C61.952,14.064 62.252,14.421 62.7,14.421L65.413,14.421C65.861,14.421 66.289,14.064 66.369,13.623L68.678,0.844C68.755,0.403 68.457,0.046 68.008,0.046Z" style="fill:white;fill-rule:nonzero;"/>
<path d="M72.099,14.421C72.099,14.421 80.066,14.421 80.066,14.421C80.515,14.421 80.943,14.064 81.022,13.623L81.414,11.453C81.494,11.012 81.194,10.655 80.746,10.655L73.828,10.655C73.559,10.655 73.38,10.441 73.427,10.176L74.51,4.215C74.558,3.951 74.815,3.736 75.082,3.736L82,3.736C82.448,3.736 82.876,3.379 82.956,2.938L83.34,0.817C83.42,0.376 83.12,0.019 82.672,0.019L74.724,0.019C72.622,0.019 70.614,1.691 70.236,3.757L68.965,10.665C68.587,12.738 69.99,14.421 72.099,14.421Z" style="fill:white;fill-rule:nonzero;"/>
<path d="M97.176,-0C97.176,0 88.882,0 88.882,0C86.775,0 84.763,1.68 84.389,3.751L83.139,10.67C82.765,12.741 84.169,14.421 86.277,14.421L94.571,14.421C96.678,14.421 98.69,12.741 99.064,10.67L100.314,3.751C100.689,1.68 99.284,-0 97.176,-0ZM94.798,10.178C94.75,10.443 94.492,10.657 94.223,10.657L87.978,10.657C87.709,10.657 87.529,10.443 87.577,10.178L88.659,4.192C88.707,3.927 88.964,3.713 89.234,3.713L95.477,3.713C95.747,3.713 95.926,3.927 95.878,4.192L94.798,10.178Z" style="fill:white;fill-rule:nonzero;"/>
<path d="M101.284,14.421L103.995,14.421C104.443,14.421 104.871,14.065 104.951,13.624L105.43,10.97C105.446,10.882 105.531,10.81 105.621,10.81L108.902,10.806C109.064,10.806 109.2,10.886 109.267,11.018L110.813,14.035C110.992,14.392 111.319,14.434 112.303,14.419C112.88,14.426 113.756,14.382 115.169,14.382C115.623,14.382 115.902,13.907 115.678,13.51L113.989,10.569C113.945,10.491 113.993,10.386 114.086,10.34C115.39,9.707 116.423,8.477 116.681,7.055L117.27,3.785C117.646,1.713 116.242,0.033 114.134,0.033L103.884,0.033C103.436,0.033 103.008,0.39 102.928,0.831L100.616,13.623C100.536,14.064 100.836,14.421 101.284,14.421L101.284,14.421ZM106.73,3.791C106.745,3.703 106.831,3.631 106.921,3.631L112.225,3.631C112.626,3.631 112.891,3.949 112.821,4.343L112.431,6.494C112.359,6.885 111.979,7.204 111.58,7.204L106.276,7.204C106.186,7.204 106.127,7.133 106.142,7.043L106.73,3.791Z" style="fill:white;fill-rule:nonzero;"/>
<path d="M118.277,14.421C118.277,14.421 130.388,14.421 130.388,14.421C130.657,14.421 130.915,14.205 130.963,13.942L131.468,11.128C131.516,10.863 131.337,10.649 131.068,10.649L122.65,10.649C122.56,10.649 122.501,10.576 122.516,10.488L122.766,9.105C122.781,9.017 122.867,8.944 122.957,8.944L129.728,8.944C129.997,8.944 130.254,8.728 130.302,8.465L130.753,5.974C130.801,5.709 130.621,5.495 130.352,5.495L123.581,5.495C123.492,5.495 123.432,5.423 123.448,5.335L123.715,3.852C123.73,3.764 123.816,3.691 123.906,3.691L132.324,3.691C132.593,3.691 132.851,3.476 132.898,3.21L133.389,0.498C133.437,0.235 133.257,0.019 132.988,0.019L120.877,0.019C120.428,0.019 119.999,0.378 119.919,0.819L117.605,13.621C117.527,14.062 117.827,14.421 118.277,14.421Z" style="fill:white;fill-rule:nonzero;"/>
</svg>


View File

@@ -0,0 +1,78 @@
/**
* MeshCore Hub - Common JavaScript Utilities
*/
/**
* Format a timestamp as relative time (e.g., "2m", "1h", "2d")
* @param {string|Date} timestamp - ISO timestamp string or Date object
* @returns {string} Relative time string, or empty string if invalid
*/
function formatRelativeTime(timestamp) {
if (!timestamp) return '';
const date = timestamp instanceof Date ? timestamp : new Date(timestamp);
if (isNaN(date.getTime())) return '';
const now = new Date();
const diffMs = now - date;
const diffSec = Math.floor(diffMs / 1000);
const diffMin = Math.floor(diffSec / 60);
const diffHour = Math.floor(diffMin / 60);
const diffDay = Math.floor(diffHour / 24);
if (diffDay > 0) return `${diffDay}d`;
if (diffHour > 0) return `${diffHour}h`;
if (diffMin > 0) return `${diffMin}m`;
return '<1m';
}
/**
* Populate all elements with data-timestamp attribute with relative time
*/
function populateRelativeTimestamps() {
document.querySelectorAll('[data-timestamp]:not([data-receiver-tooltip])').forEach(el => {
const timestamp = el.dataset.timestamp;
if (timestamp) {
el.textContent = formatRelativeTime(timestamp);
}
});
}
/**
* Populate receiver tooltip elements with name and relative time
*/
function populateReceiverTooltips() {
document.querySelectorAll('[data-receiver-tooltip]').forEach(el => {
const name = el.dataset.name || '';
const timestamp = el.dataset.timestamp;
const relTime = timestamp ? formatRelativeTime(timestamp) : '';
// Build tooltip: "NodeName (2m ago)" or just "NodeName" or just "2m ago"
let tooltip = name;
if (relTime) {
tooltip = name ? `${name} (${relTime} ago)` : `${relTime} ago`;
}
el.title = tooltip;
});
}
/**
* Populate <time> elements with data-relative-time attribute
* Uses the datetime attribute as the timestamp source
*/
function populateRelativeTimeElements() {
document.querySelectorAll('time[data-relative-time]').forEach(el => {
const timestamp = el.getAttribute('datetime');
if (timestamp) {
const relTime = formatRelativeTime(timestamp);
el.textContent = relTime ? `${relTime} ago` : '';
}
});
}
// Auto-populate when DOM is ready
document.addEventListener('DOMContentLoaded', () => {
populateRelativeTimestamps();
populateReceiverTooltips();
populateRelativeTimeElements();
});
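The bucketing in `formatRelativeTime` (days, then hours, then minutes, with `<1m` as the floor) mirrors this Python sketch — an illustrative analog under the assumption that the thresholds match the JS above, not code from this commit:

```python
from datetime import datetime, timedelta, timezone

def format_relative_time(then: datetime, now: datetime) -> str:
    """Bucket a past timestamp's age into 'Nd', 'Nh', 'Nm', or '<1m'."""
    diff = now - then
    if diff.days > 0:
        return f"{diff.days}d"
    hours = diff.seconds // 3600
    if hours > 0:
        return f"{hours}h"
    minutes = diff.seconds // 60
    if minutes > 0:
        return f"{minutes}m"
    return "<1m"
```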

View File

@@ -0,0 +1,47 @@
{# Reusable macros for templates #}
{#
Pagination macro
Parameters:
- page: Current page number
- total_pages: Total number of pages
- params: Dict of query parameters to preserve (e.g., {"search": "foo", "limit": 50})
#}
{% macro pagination(page, total_pages, params={}) %}
{% if total_pages > 1 %}
{% set query_parts = [] %}
{% for key, value in params.items() %}
{% if value is not none and value != '' %}
{% set _ = query_parts.append(key ~ '=' ~ value) %}
{% endif %}
{% endfor %}
{% set base_query = query_parts|join('&') %}
{% set query_prefix = '&' if base_query else '' %}
<div class="flex justify-center mt-6">
<div class="join">
{% if page > 1 %}
<a href="?page={{ page - 1 }}{{ query_prefix }}{{ base_query }}" class="join-item btn btn-sm">Previous</a>
{% else %}
<button class="join-item btn btn-sm btn-disabled">Previous</button>
{% endif %}
{% for p in range(1, total_pages + 1) %}
{% if p == page %}
<button class="join-item btn btn-sm btn-active">{{ p }}</button>
{% elif p == 1 or p == total_pages or (p >= page - 2 and p <= page + 2) %}
<a href="?page={{ p }}{{ query_prefix }}{{ base_query }}" class="join-item btn btn-sm">{{ p }}</a>
{% elif p == 2 or p == total_pages - 1 %}
<button class="join-item btn btn-sm btn-disabled">...</button>
{% endif %}
{% endfor %}
{% if page < total_pages %}
<a href="?page={{ page + 1 }}{{ query_prefix }}{{ base_query }}" class="join-item btn btn-sm">Next</a>
{% else %}
<button class="join-item btn btn-sm btn-disabled">Next</button>
{% endif %}
</div>
</div>
{% endif %}
{% endmacro %}
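The macro's windowing rule — always show page 1 and the last page, the current page ±2 as links, and an ellipsis at positions 2 and `total_pages - 1` when they fall outside the window — can be traced with this Python sketch (`page_controls` is a hypothetical helper mirroring the Jinja conditionals, not part of this commit):

```python
def page_controls(page: int, total_pages: int) -> list[str]:
    """Items the macro renders: links, the active page in brackets, or '...' gaps."""
    items: list[str] = []
    for p in range(1, total_pages + 1):
        if p == page:
            items.append(f"[{p}]")  # active page
        elif p == 1 or p == total_pages or (page - 2 <= p <= page + 2):
            items.append(str(p))    # rendered as a link
        elif p == 2 or p == total_pages - 1:
            items.append("...")     # collapsed gap
    return items
```

Note the macro silently skips pages that fall in neither the window nor the ellipsis slots, so wide gaps collapse to a single `...` near each edge.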

View File

@@ -0,0 +1,20 @@
{% extends "base.html" %}
{% block title %}{{ network_name }} - Access Denied{% endblock %}
{% block content %}
<div class="flex flex-col items-center justify-center min-h-[50vh]">
<div class="text-center">
<svg xmlns="http://www.w3.org/2000/svg" class="h-24 w-24 mx-auto text-error opacity-50 mb-6" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 15v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2zm10-10V7a4 4 0 00-8 0v4h8z" />
</svg>
<h1 class="text-3xl font-bold mb-2">Access Denied</h1>
<p class="text-lg opacity-70 mb-6">You don't have permission to access the admin area.</p>
<p class="text-sm opacity-50 mb-8">Please contact the network administrator if you believe this is an error.</p>
<div class="flex gap-4 justify-center">
<a href="/" class="btn btn-primary">Return Home</a>
<a href="/oauth2/sign_out" class="btn btn-outline">Sign Out</a>
</div>
</div>
</div>
{% endblock %}

View File

@@ -0,0 +1,70 @@
{% extends "base.html" %}
{% block title %}{{ network_name }} - Admin{% endblock %}
{% block content %}
<div class="flex items-center justify-between mb-4">
<div>
<h1 class="text-3xl font-bold">Admin</h1>
<div class="text-sm breadcrumbs">
<ul>
<li><a href="/">Home</a></li>
<li>Admin</li>
</ul>
</div>
</div>
<a href="/oauth2/sign_out" class="btn btn-outline btn-sm">Sign Out</a>
</div>
<!-- Authenticated User Info -->
<div class="flex flex-wrap items-center gap-4 text-sm opacity-70 mb-6">
{% if auth_username or auth_user %}
<span class="flex items-center gap-1.5">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
d="M16 7a4 4 0 11-8 0 4 4 0 018 0zM12 14a7 7 0 00-7 7h14a7 7 0 00-7-7z" />
</svg>
{{ auth_username or auth_user }}
</span>
{% endif %}
{% if auth_email %}
<span class="flex items-center gap-1.5">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
d="M3 8l7.89 5.26a2 2 0 002.22 0L21 8M5 19h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v10a2 2 0 002 2z" />
</svg>
{{ auth_email }}
</span>
{% endif %}
</div>
<!-- Navigation Cards -->
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4">
<a href="/a/members" class="card bg-base-100 shadow-xl hover:shadow-2xl transition-shadow">
<div class="card-body">
<h2 class="card-title">
<svg xmlns="http://www.w3.org/2000/svg" class="h-6 w-6" fill="none" viewBox="0 0 24 24"
stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
d="M17 20h5v-2a3 3 0 00-5.356-1.857M17 20H7m10 0v-2c0-.656-.126-1.283-.356-1.857M7 20H2v-2a3 3 0 015.356-1.857M7 20v-2c0-.656.126-1.283.356-1.857m0 0a5.002 5.002 0 019.288 0M15 7a3 3 0 11-6 0 3 3 0 016 0zm6 3a2 2 0 11-4 0 2 2 0 014 0zM7 10a2 2 0 11-4 0 2 2 0 014 0z" />
</svg>
Members
</h2>
<p>Manage network members and operators.</p>
</div>
</a>
<a href="/a/node-tags" class="card bg-base-100 shadow-xl hover:shadow-2xl transition-shadow">
<div class="card-body">
<h2 class="card-title">
<svg xmlns="http://www.w3.org/2000/svg" class="h-6 w-6" fill="none" viewBox="0 0 24 24"
stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
d="M7 7h.01M7 3h5c.512 0 1.024.195 1.414.586l7 7a2 2 0 010 2.828l-7 7a2 2 0 01-2.828 0l-7-7A2 2 0 013 12V7a4 4 0 014-4z" />
</svg>
Node Tags
</h2>
<p>Manage custom tags and metadata for network nodes.</p>
</div>
</a>
</div>
{% endblock %}

View File

@@ -0,0 +1,282 @@
{% extends "base.html" %}
{% block title %}{{ network_name }} - Members Admin{% endblock %}
{% block content %}
<div class="flex items-center justify-between mb-6">
<div>
<h1 class="text-3xl font-bold">Members</h1>
<div class="text-sm breadcrumbs">
<ul>
<li><a href="/">Home</a></li>
<li><a href="/a/">Admin</a></li>
<li>Members</li>
</ul>
</div>
</div>
<a href="/oauth2/sign_out" class="btn btn-outline btn-sm">Sign Out</a>
</div>
<!-- Flash Messages -->
{% if message %}
<div class="alert alert-success mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<span>{{ message }}</span>
</div>
{% endif %}
{% if error %}
<div class="alert alert-error mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M10 14l2-2m0 0l2-2m-2 2l-2-2m2 2l2 2m7-2a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<span>{{ error }}</span>
</div>
{% endif %}
<!-- Members Table -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<div class="flex justify-between items-center">
<h2 class="card-title">Network Members ({{ members|length }})</h2>
<button class="btn btn-primary btn-sm" onclick="addModal.showModal()">Add Member</button>
</div>
{% if members %}
<div class="overflow-x-auto">
<table class="table table-zebra">
<thead>
<tr>
<th>Member ID</th>
<th>Name</th>
<th>Callsign</th>
<th>Contact</th>
<th class="w-32">Actions</th>
</tr>
</thead>
<tbody>
{% for member in members %}
<tr data-member-id="{{ member.id }}"
data-member-name="{{ member.name }}"
data-member-member-id="{{ member.member_id }}"
data-member-callsign="{{ member.callsign or '' }}"
data-member-description="{{ member.description or '' }}"
data-member-contact="{{ member.contact or '' }}">
<td class="font-mono font-semibold">{{ member.member_id }}</td>
<td>{{ member.name }}</td>
<td>
{% if member.callsign %}
<span class="badge badge-primary">{{ member.callsign }}</span>
{% else %}
<span class="text-base-content/40">-</span>
{% endif %}
</td>
<td class="max-w-xs truncate" title="{{ member.contact or '' }}">{{ member.contact or '-' }}</td>
<td>
<div class="flex gap-1">
<button class="btn btn-ghost btn-xs btn-edit">
Edit
</button>
<button class="btn btn-ghost btn-xs text-error btn-delete">
Delete
</button>
</div>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="text-center py-8 text-base-content/60">
<p>No members configured yet.</p>
<p class="text-sm mt-2">Click "Add Member" to create the first member.</p>
</div>
{% endif %}
</div>
</div>
<!-- Add Modal -->
<dialog id="addModal" class="modal">
<div class="modal-box w-11/12 max-w-2xl">
<h3 class="font-bold text-lg">Add New Member</h3>
<form method="post" action="/a/members" class="py-4">
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
<div class="form-control">
<label class="label">
<span class="label-text">Member ID <span class="text-error">*</span></span>
</label>
<input type="text" name="member_id" id="add_member_id" class="input input-bordered"
placeholder="walshie86" required maxlength="50"
pattern="[a-zA-Z0-9_]+"
title="Letters, numbers, and underscores only">
<label class="label">
<span class="label-text-alt">Unique identifier (letters, numbers, underscore)</span>
</label>
</div>
<div class="form-control">
<label class="label">
<span class="label-text">Name <span class="text-error">*</span></span>
</label>
<input type="text" name="name" id="add_name" class="input input-bordered"
placeholder="John Smith" required maxlength="255">
</div>
<div class="form-control">
<label class="label">
<span class="label-text">Callsign</span>
</label>
<input type="text" name="callsign" id="add_callsign" class="input input-bordered"
placeholder="VK4ABC" maxlength="20">
</div>
<div class="form-control">
<label class="label">
<span class="label-text">Contact</span>
</label>
<input type="text" name="contact" id="add_contact" class="input input-bordered"
placeholder="john@example.com or phone number" maxlength="255">
</div>
<div class="form-control md:col-span-2">
<label class="label">
<span class="label-text">Description</span>
</label>
<textarea name="description" id="add_description" rows="3" class="textarea textarea-bordered"
placeholder="Brief description of member's role and responsibilities..."></textarea>
</div>
</div>
<div class="modal-action">
<button type="button" class="btn" onclick="addModal.close()">Cancel</button>
<button type="submit" class="btn btn-primary">Add Member</button>
</div>
</form>
</div>
<form method="dialog" class="modal-backdrop">
<button>close</button>
</form>
</dialog>
<!-- Edit Modal -->
<dialog id="editModal" class="modal">
<div class="modal-box w-11/12 max-w-2xl">
<h3 class="font-bold text-lg">Edit Member</h3>
<form method="post" action="/a/members/update" class="py-4">
<input type="hidden" name="id" id="edit_id">
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
<div class="form-control">
<label class="label">
<span class="label-text">Member ID <span class="text-error">*</span></span>
</label>
<input type="text" name="member_id" id="edit_member_id" class="input input-bordered"
required maxlength="50" pattern="[a-zA-Z0-9_]+"
title="Letters, numbers, and underscores only">
</div>
<div class="form-control">
<label class="label">
<span class="label-text">Name <span class="text-error">*</span></span>
</label>
<input type="text" name="name" id="edit_name" class="input input-bordered"
required maxlength="255">
</div>
<div class="form-control">
<label class="label">
<span class="label-text">Callsign</span>
</label>
<input type="text" name="callsign" id="edit_callsign" class="input input-bordered"
maxlength="20">
</div>
<div class="form-control">
<label class="label">
<span class="label-text">Contact</span>
</label>
<input type="text" name="contact" id="edit_contact" class="input input-bordered"
maxlength="255">
</div>
<div class="form-control md:col-span-2">
<label class="label">
<span class="label-text">Description</span>
</label>
<textarea name="description" id="edit_description" rows="3"
class="textarea textarea-bordered"></textarea>
</div>
</div>
<div class="modal-action">
<button type="button" class="btn" onclick="editModal.close()">Cancel</button>
<button type="submit" class="btn btn-primary">Save Changes</button>
</div>
</form>
</div>
<form method="dialog" class="modal-backdrop">
<button>close</button>
</form>
</dialog>
<!-- Delete Modal -->
<dialog id="deleteModal" class="modal">
<div class="modal-box">
<h3 class="font-bold text-lg">Delete Member</h3>
<form method="post" action="/a/members/delete" class="py-4">
<input type="hidden" name="id" id="delete_id">
<p class="py-4">Are you sure you want to delete member <strong id="delete_member_name"></strong>?</p>
<div class="alert alert-error mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span>This action cannot be undone.</span>
</div>
<div class="modal-action">
<button type="button" class="btn" onclick="deleteModal.close()">Cancel</button>
<button type="submit" class="btn btn-error">Delete</button>
</div>
</form>
</div>
<form method="dialog" class="modal-backdrop">
<button>close</button>
</form>
</dialog>
{% endblock %}
{% block extra_scripts %}
<script>
// Attach click handlers to the row action buttons once the DOM is ready
document.addEventListener('DOMContentLoaded', function() {
// Edit button handler
document.querySelectorAll('.btn-edit').forEach(function(btn) {
btn.addEventListener('click', function() {
var row = this.closest('tr');
document.getElementById('edit_id').value = row.dataset.memberId;
document.getElementById('edit_member_id').value = row.dataset.memberMemberId;
document.getElementById('edit_name').value = row.dataset.memberName;
document.getElementById('edit_callsign').value = row.dataset.memberCallsign;
document.getElementById('edit_description').value = row.dataset.memberDescription;
document.getElementById('edit_contact').value = row.dataset.memberContact;
editModal.showModal();
});
});
// Delete button handler
document.querySelectorAll('.btn-delete').forEach(function(btn) {
btn.addEventListener('click', function() {
var row = this.closest('tr');
document.getElementById('delete_id').value = row.dataset.memberId;
document.getElementById('delete_member_name').textContent = row.dataset.memberName;
deleteModal.showModal();
});
});
});
</script>
{% endblock %}


@@ -0,0 +1,434 @@
{% extends "base.html" %}
{% block title %}{{ network_name }} - Node Tags Admin{% endblock %}
{% block content %}
<div class="flex items-center justify-between mb-6">
<div>
<h1 class="text-3xl font-bold">Node Tags</h1>
<div class="text-sm breadcrumbs">
<ul>
<li><a href="/">Home</a></li>
<li><a href="/a/">Admin</a></li>
<li>Node Tags</li>
</ul>
</div>
</div>
<a href="/oauth2/sign_out" class="btn btn-outline btn-sm">Sign Out</a>
</div>
<!-- Flash Messages -->
{% if message %}
<div class="alert alert-success mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<span>{{ message }}</span>
</div>
{% endif %}
{% if error %}
<div class="alert alert-error mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M10 14l2-2m0 0l2-2m-2 2l-2-2m2 2l2 2m7-2a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<span>{{ error }}</span>
</div>
{% endif %}
<!-- Node Selector -->
<div class="card bg-base-100 shadow-xl mb-6">
<div class="card-body">
<h2 class="card-title">Select Node</h2>
<form method="get" action="/a/node-tags" class="flex gap-4 items-end">
<div class="form-control flex-1">
<label class="label">
<span class="label-text">Node</span>
</label>
<select name="public_key" class="select select-bordered w-full" onchange="this.form.submit()">
<option value="">-- Select a node --</option>
{% for node in nodes %}
<option value="{{ node.public_key }}" {% if node.public_key == selected_public_key %}selected{% endif %}>
{{ node.name or 'Unnamed' }} ({{ node.public_key[:8] }}...{{ node.public_key[-4:] }})
</option>
{% endfor %}
</select>
</div>
<button type="submit" class="btn btn-primary">Load Tags</button>
</form>
</div>
</div>
{% if selected_public_key and selected_node %}
<!-- Selected Node Info -->
<div class="card bg-base-100 shadow-xl mb-6">
<div class="card-body">
<div class="flex justify-between items-start">
<div class="flex items-start gap-3">
<span class="text-2xl" title="{{ selected_node.adv_type or 'Unknown' }}">{% if selected_node.adv_type and selected_node.adv_type|lower == 'chat' %}💬{% elif selected_node.adv_type and selected_node.adv_type|lower == 'repeater' %}📡{% elif selected_node.adv_type and selected_node.adv_type|lower == 'room' %}🪧{% else %}📍{% endif %}</span>
<div>
<h2 class="card-title">{{ selected_node.name or 'Unnamed Node' }}</h2>
<p class="text-sm opacity-70 font-mono">{{ selected_public_key }}</p>
</div>
</div>
<div class="flex gap-2">
{% if tags %}
<button class="btn btn-outline btn-sm" onclick="copyAllModal.showModal()">Copy All</button>
<button class="btn btn-outline btn-error btn-sm" onclick="deleteAllModal.showModal()">Delete All</button>
{% endif %}
<a href="/nodes/{{ selected_public_key }}" class="btn btn-ghost btn-sm">View Node</a>
</div>
</div>
</div>
</div>
<!-- Tags Table -->
<div class="card bg-base-100 shadow-xl mb-6">
<div class="card-body">
<h2 class="card-title">Tags ({{ tags|length }})</h2>
{% if tags %}
<div class="overflow-x-auto">
<table class="table table-zebra">
<thead>
<tr>
<th>Key</th>
<th>Value</th>
<th>Type</th>
<th>Updated</th>
<th class="w-48">Actions</th>
</tr>
</thead>
<tbody>
{% for tag in tags %}
<tr data-tag-key="{{ tag.key }}" data-tag-value="{{ tag.value or '' }}" data-tag-type="{{ tag.value_type }}">
<td class="font-mono font-semibold">{{ tag.key }}</td>
<td class="max-w-xs truncate" title="{{ tag.value or '' }}">{{ tag.value or '-' }}</td>
<td>
<span class="badge badge-ghost badge-sm">{{ tag.value_type }}</span>
</td>
<td class="text-sm opacity-70">{{ tag.updated_at[:10] if tag.updated_at else '-' }}</td>
<td>
<div class="flex gap-1">
<button class="btn btn-ghost btn-xs btn-edit">
Edit
</button>
<button class="btn btn-ghost btn-xs btn-move">
Move
</button>
<button class="btn btn-ghost btn-xs text-error btn-delete">
Delete
</button>
</div>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="text-center py-8 text-base-content/60">
<p>No tags found for this node.</p>
<p class="text-sm mt-2">Add a new tag below.</p>
</div>
{% endif %}
</div>
</div>
<!-- Add New Tag Form -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">Add New Tag</h2>
<form method="post" action="/a/node-tags" class="grid grid-cols-1 md:grid-cols-4 gap-4">
<input type="hidden" name="public_key" value="{{ selected_public_key }}">
<div class="form-control">
<label class="label">
<span class="label-text">Key</span>
</label>
<input type="text" name="key" class="input input-bordered" placeholder="tag_name" required maxlength="100">
</div>
<div class="form-control">
<label class="label">
<span class="label-text">Value</span>
</label>
<input type="text" name="value" class="input input-bordered" placeholder="tag value">
</div>
<div class="form-control">
<label class="label">
<span class="label-text">Type</span>
</label>
<select name="value_type" class="select select-bordered">
<option value="string">string</option>
<option value="number">number</option>
<option value="boolean">boolean</option>
</select>
</div>
<div class="form-control">
<label class="label">
<span class="label-text">&nbsp;</span>
</label>
<button type="submit" class="btn btn-primary">Add Tag</button>
</div>
</form>
</div>
</div>
<!-- Edit Modal -->
<dialog id="editModal" class="modal">
<div class="modal-box">
<h3 class="font-bold text-lg">Edit Tag</h3>
<form method="post" action="/a/node-tags/update" class="py-4">
<input type="hidden" name="public_key" value="{{ selected_public_key }}">
<input type="hidden" name="key" id="editKey">
<div class="form-control mb-4">
<label class="label">
<span class="label-text">Key</span>
</label>
<input type="text" id="editKeyDisplay" class="input input-bordered" disabled>
</div>
<div class="form-control mb-4">
<label class="label">
<span class="label-text">Value</span>
</label>
<input type="text" name="value" id="editValue" class="input input-bordered">
</div>
<div class="form-control mb-4">
<label class="label">
<span class="label-text">Type</span>
</label>
<select name="value_type" id="editValueType" class="select select-bordered w-full">
<option value="string">string</option>
<option value="number">number</option>
<option value="boolean">boolean</option>
</select>
</div>
<div class="modal-action">
<button type="button" class="btn" onclick="editModal.close()">Cancel</button>
<button type="submit" class="btn btn-primary">Save Changes</button>
</div>
</form>
</div>
<form method="dialog" class="modal-backdrop">
<button>close</button>
</form>
</dialog>
<!-- Move Modal -->
<dialog id="moveModal" class="modal">
<div class="modal-box">
<h3 class="font-bold text-lg">Move Tag to Another Node</h3>
<form method="post" action="/a/node-tags/move" class="py-4">
<input type="hidden" name="public_key" value="{{ selected_public_key }}">
<input type="hidden" name="key" id="moveKey">
<div class="form-control mb-4">
<label class="label">
<span class="label-text">Tag Key</span>
</label>
<input type="text" id="moveKeyDisplay" class="input input-bordered" disabled>
</div>
<div class="form-control mb-4">
<label class="label">
<span class="label-text">Destination Node</span>
</label>
<select name="new_public_key" id="moveDestination" class="select select-bordered w-full" required>
<option value="">-- Select destination node --</option>
{% for node in nodes %}
{% if node.public_key != selected_public_key %}
<option value="{{ node.public_key }}">
{{ node.name or 'Unnamed' }} ({{ node.public_key[:8] }}...{{ node.public_key[-4:] }})
</option>
{% endif %}
{% endfor %}
</select>
</div>
<div class="alert alert-warning mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span>This will move the tag from the current node to the destination node.</span>
</div>
<div class="modal-action">
<button type="button" class="btn" onclick="moveModal.close()">Cancel</button>
<button type="submit" class="btn btn-warning">Move Tag</button>
</div>
</form>
</div>
<form method="dialog" class="modal-backdrop">
<button>close</button>
</form>
</dialog>
<!-- Delete Modal -->
<dialog id="deleteModal" class="modal">
<div class="modal-box">
<h3 class="font-bold text-lg">Delete Tag</h3>
<form method="post" action="/a/node-tags/delete" class="py-4">
<input type="hidden" name="public_key" value="{{ selected_public_key }}">
<input type="hidden" name="key" id="deleteKey">
<p class="py-4">Are you sure you want to delete the tag "<span id="deleteKeyDisplay" class="font-mono font-semibold"></span>"?</p>
<div class="alert alert-error mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span>This action cannot be undone.</span>
</div>
<div class="modal-action">
<button type="button" class="btn" onclick="deleteModal.close()">Cancel</button>
<button type="submit" class="btn btn-error">Delete</button>
</div>
</form>
</div>
<form method="dialog" class="modal-backdrop">
<button>close</button>
</form>
</dialog>
<!-- Copy All Tags Modal -->
<dialog id="copyAllModal" class="modal">
<div class="modal-box">
<h3 class="font-bold text-lg">Copy All Tags to Another Node</h3>
<form method="post" action="/a/node-tags/copy-all" class="py-4">
<input type="hidden" name="public_key" value="{{ selected_public_key }}">
<p class="mb-4">Copy all {{ tags|length }} tag(s) from <strong>{{ selected_node.name or 'Unnamed' }}</strong> to another node.</p>
<div class="form-control mb-4">
<label class="label">
<span class="label-text">Destination Node</span>
</label>
<select name="dest_public_key" class="select select-bordered w-full" required>
<option value="">-- Select destination node --</option>
{% for node in nodes %}
{% if node.public_key != selected_public_key %}
<option value="{{ node.public_key }}">
{{ node.name or 'Unnamed' }} ({{ node.public_key[:8] }}...{{ node.public_key[-4:] }})
</option>
{% endif %}
{% endfor %}
</select>
</div>
<div class="alert alert-info mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<span>Tags that already exist on the destination node will be skipped. Original tags remain on this node.</span>
</div>
<div class="modal-action">
<button type="button" class="btn" onclick="copyAllModal.close()">Cancel</button>
<button type="submit" class="btn btn-primary">Copy Tags</button>
</div>
</form>
</div>
<form method="dialog" class="modal-backdrop">
<button>close</button>
</form>
</dialog>
<!-- Delete All Tags Modal -->
<dialog id="deleteAllModal" class="modal">
<div class="modal-box">
<h3 class="font-bold text-lg">Delete All Tags</h3>
<form method="post" action="/a/node-tags/delete-all" class="py-4">
<input type="hidden" name="public_key" value="{{ selected_public_key }}">
<p class="mb-4">Are you sure you want to delete all {{ tags|length }} tag(s) from <strong>{{ selected_node.name or 'Unnamed' }}</strong>?</p>
<div class="alert alert-error mb-4">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span>This action cannot be undone. All tags will be permanently deleted.</span>
</div>
<div class="modal-action">
<button type="button" class="btn" onclick="deleteAllModal.close()">Cancel</button>
<button type="submit" class="btn btn-error">Delete All Tags</button>
</div>
</form>
</div>
<form method="dialog" class="modal-backdrop">
<button>close</button>
</form>
</dialog>
{% elif selected_public_key and not selected_node %}
<div class="alert alert-warning">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span>Node not found: {{ selected_public_key }}</span>
</div>
{% else %}
<div class="card bg-base-100 shadow-xl">
<div class="card-body text-center py-12">
<svg xmlns="http://www.w3.org/2000/svg" class="h-16 w-16 mx-auto mb-4 opacity-30" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 7h.01M7 3h5c.512 0 1.024.195 1.414.586l7 7a2 2 0 010 2.828l-7 7a2 2 0 01-2.828 0l-7-7A2 2 0 013 12V7a4 4 0 014-4z" />
</svg>
<h2 class="text-xl font-semibold mb-2">Select a Node</h2>
<p class="opacity-70">Choose a node from the dropdown above to view and manage its tags.</p>
</div>
</div>
{% endif %}
{% endblock %}
{% block extra_scripts %}
<script>
// Attach click handlers to the row action buttons once the DOM is ready
document.addEventListener('DOMContentLoaded', function() {
// Edit button handler
document.querySelectorAll('.btn-edit').forEach(function(btn) {
btn.addEventListener('click', function() {
var row = this.closest('tr');
var key = row.dataset.tagKey;
var value = row.dataset.tagValue;
var valueType = row.dataset.tagType;
document.getElementById('editKey').value = key;
document.getElementById('editKeyDisplay').value = key;
document.getElementById('editValue').value = value;
document.getElementById('editValueType').value = valueType;
editModal.showModal();
});
});
// Move button handler
document.querySelectorAll('.btn-move').forEach(function(btn) {
btn.addEventListener('click', function() {
var row = this.closest('tr');
var key = row.dataset.tagKey;
document.getElementById('moveKey').value = key;
document.getElementById('moveKeyDisplay').value = key;
document.getElementById('moveDestination').selectedIndex = 0;
moveModal.showModal();
});
});
// Delete button handler
document.querySelectorAll('.btn-delete').forEach(function(btn) {
btn.addEventListener('click', function() {
var row = this.closest('tr');
var key = row.dataset.tagKey;
document.getElementById('deleteKey').value = key;
document.getElementById('deleteKeyDisplay').textContent = key;
deleteModal.showModal();
});
});
});
</script>
{% endblock %}


@@ -0,0 +1,163 @@
{% extends "base.html" %}
{% from "_macros.html" import pagination %}
{% block title %}{{ network_name }} - Advertisements{% endblock %}
{% block content %}
<div class="flex items-center justify-between mb-6">
<h1 class="text-3xl font-bold">Advertisements</h1>
<span class="badge badge-lg">{{ total }} total</span>
</div>
{% if api_error %}
<div class="alert alert-warning mb-6">
<svg xmlns="http://www.w3.org/2000/svg" class="stroke-current shrink-0 h-6 w-6" fill="none" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span>Could not fetch data from API: {{ api_error }}</span>
</div>
{% endif %}
<!-- Filters -->
<div class="card bg-base-100 shadow mb-6">
<div class="card-body py-4">
<form method="GET" action="/advertisements" class="flex gap-4 flex-wrap items-end">
<div class="form-control">
<label class="label py-1">
<span class="label-text">Search</span>
</label>
<input type="text" name="search" value="{{ search }}" placeholder="Search by name, ID, or public key..." class="input input-bordered input-sm w-80" />
</div>
{% if nodes %}
<div class="form-control">
<label class="label py-1">
<span class="label-text">Node</span>
</label>
<select name="public_key" class="select select-bordered select-sm">
<option value="">All Nodes</option>
{% for node in nodes %}
{% set ns = namespace(tag_name=none) %}
{% for tag in node.tags or [] %}
{% if tag.key == 'name' %}
{% set ns.tag_name = tag.value %}
{% endif %}
{% endfor %}
<option value="{{ node.public_key }}" {% if public_key == node.public_key %}selected{% endif %}>{{ ns.tag_name or node.name or node.public_key[:12] + '...' }}</option>
{% endfor %}
</select>
</div>
{% endif %}
{% if members %}
<div class="form-control">
<label class="label py-1">
<span class="label-text">Member</span>
</label>
<select name="member_id" class="select select-bordered select-sm">
<option value="">All Members</option>
{% for member in members %}
<option value="{{ member.member_id }}" {% if member_id == member.member_id %}selected{% endif %}>{{ member.name }}{% if member.callsign %} ({{ member.callsign }}){% endif %}</option>
{% endfor %}
</select>
</div>
{% endif %}
<div class="flex gap-2 w-full sm:w-auto">
<button type="submit" class="btn btn-primary btn-sm">Filter</button>
<a href="/advertisements" class="btn btn-ghost btn-sm">Clear</a>
</div>
</form>
</div>
</div>
<!-- Advertisements List - Mobile Card View -->
<div class="lg:hidden space-y-3">
{% for ad in advertisements %}
<a href="/nodes/{{ ad.public_key }}" class="card bg-base-100 shadow-sm block">
<div class="card-body p-3">
<div class="flex items-center justify-between gap-2">
<div class="flex items-center gap-2 min-w-0">
<span class="text-lg flex-shrink-0" title="{{ ad.adv_type or 'Unknown' }}">{% if ad.adv_type and ad.adv_type|lower == 'chat' %}💬{% elif ad.adv_type and ad.adv_type|lower == 'repeater' %}📡{% elif ad.adv_type and ad.adv_type|lower == 'room' %}🪧{% else %}📍{% endif %}</span>
<div class="min-w-0">
{% if ad.node_tag_name or ad.node_name or ad.name %}
<div class="font-medium text-sm truncate">{{ ad.node_tag_name or ad.node_name or ad.name }}</div>
<div class="text-xs font-mono opacity-60 truncate">{{ ad.public_key[:16] }}...</div>
{% else %}
<div class="font-mono text-sm truncate">{{ ad.public_key[:16] }}...</div>
{% endif %}
</div>
</div>
<div class="text-right flex-shrink-0">
<div class="text-xs opacity-60">
{{ ad.received_at[:16].replace('T', ' ') if ad.received_at else '-' }}
</div>
{% if ad.receivers and ad.receivers|length >= 1 %}
<div class="flex gap-0.5 justify-end mt-1">
{% for recv in ad.receivers %}
<span class="text-sm" title="{{ recv.tag_name or recv.name or recv.public_key[:12] }}">📡</span>
{% endfor %}
</div>
{% elif ad.received_by %}
<span class="text-sm" title="{{ ad.receiver_tag_name or ad.receiver_name or ad.received_by[:12] }}">📡</span>
{% endif %}
</div>
</div>
</div>
</a>
{% else %}
<div class="text-center py-8 opacity-70">No advertisements found.</div>
{% endfor %}
</div>
<!-- Advertisements Table - Desktop View -->
<div class="hidden lg:block overflow-x-auto overflow-y-visible bg-base-100 rounded-box shadow">
<table class="table table-zebra">
<thead>
<tr>
<th>Node</th>
<th>Time</th>
<th>Receivers</th>
</tr>
</thead>
<tbody>
{% for ad in advertisements %}
<tr class="hover">
<td>
<a href="/nodes/{{ ad.public_key }}" class="link link-hover flex items-center gap-2">
<span class="text-lg" title="{{ ad.adv_type or 'Unknown' }}">{% if ad.adv_type and ad.adv_type|lower == 'chat' %}💬{% elif ad.adv_type and ad.adv_type|lower == 'repeater' %}📡{% elif ad.adv_type and ad.adv_type|lower == 'room' %}🪧{% else %}📍{% endif %}</span>
<div>
{% if ad.node_tag_name or ad.node_name or ad.name %}
<div class="font-medium">{{ ad.node_tag_name or ad.node_name or ad.name }}</div>
<div class="text-xs font-mono opacity-70">{{ ad.public_key[:16] }}...</div>
{% else %}
<span class="font-mono text-sm">{{ ad.public_key[:16] }}...</span>
{% endif %}
</div>
</a>
</td>
<td class="text-sm whitespace-nowrap">
{{ ad.received_at[:19].replace('T', ' ') if ad.received_at else '-' }}
</td>
<td>
{% if ad.receivers and ad.receivers|length >= 1 %}
<div class="flex gap-1">
{% for recv in ad.receivers %}
<a href="/nodes/{{ recv.public_key }}" class="text-lg hover:opacity-70" data-receiver-tooltip data-name="{{ recv.tag_name or recv.name or recv.public_key[:12] }}" data-timestamp="{{ recv.received_at }}">📡</a>
{% endfor %}
</div>
{% elif ad.received_by %}
<a href="/nodes/{{ ad.received_by }}" class="text-lg hover:opacity-70" title="{{ ad.receiver_tag_name or ad.receiver_name or ad.received_by[:12] }}">📡</a>
{% else %}
<span class="opacity-50">-</span>
{% endif %}
</td>
</tr>
{% else %}
<tr>
<td colspan="3" class="text-center py-8 opacity-70">No advertisements found.</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{{ pagination(page, total_pages, {"search": search, "public_key": public_key, "member_id": member_id, "limit": limit}) }}
{% endblock %}


@@ -58,6 +58,7 @@
<li><a href="/" class="{% if request.url.path == '/' %}active{% endif %}">Home</a></li>
<li><a href="/network" class="{% if request.url.path == '/network' %}active{% endif %}">Network</a></li>
<li><a href="/nodes" class="{% if '/nodes' in request.url.path %}active{% endif %}">Nodes</a></li>
<li><a href="/advertisements" class="{% if request.url.path == '/advertisements' %}active{% endif %}">Advertisements</a></li>
<li><a href="/messages" class="{% if request.url.path == '/messages' %}active{% endif %}">Messages</a></li>
<li><a href="/map" class="{% if request.url.path == '/map' %}active{% endif %}">Map</a></li>
<li><a href="/members" class="{% if request.url.path == '/members' %}active{% endif %}">Members</a></li>
@@ -75,13 +76,14 @@
<li><a href="/" class="{% if request.url.path == '/' %}active{% endif %}">Home</a></li>
<li><a href="/network" class="{% if request.url.path == '/network' %}active{% endif %}">Network</a></li>
<li><a href="/nodes" class="{% if '/nodes' in request.url.path %}active{% endif %}">Nodes</a></li>
<li><a href="/advertisements" class="{% if request.url.path == '/advertisements' %}active{% endif %}">Advertisements</a></li>
<li><a href="/messages" class="{% if request.url.path == '/messages' %}active{% endif %}">Messages</a></li>
<li><a href="/map" class="{% if request.url.path == '/map' %}active{% endif %}">Map</a></li>
<li><a href="/members" class="{% if request.url.path == '/members' %}active{% endif %}">Members</a></li>
</ul>
</div>
<div class="navbar-end">
<div class="badge badge-outline badge-sm">v{{ version }}</div>
<div class="badge badge-outline badge-sm">{{ version }}</div>
</div>
</div>
@@ -105,16 +107,23 @@
{% endif %}
{% if network_contact_email and network_contact_discord %} | {% endif %}
{% if network_contact_discord %}
<span>Discord: {{ network_contact_discord }}</span>
<a href="{{ network_contact_discord }}" target="_blank" rel="noopener noreferrer" class="link link-hover">Discord</a>
{% endif %}
{% if (network_contact_email or network_contact_discord) and network_contact_github %} | {% endif %}
{% if network_contact_github %}
<a href="{{ network_contact_github }}" target="_blank" rel="noopener noreferrer" class="link link-hover">GitHub</a>
{% endif %}
</p>
<p class="text-xs opacity-50 mt-2">Powered by MeshCore Hub v{{ version }}</p>
<p class="text-xs opacity-50 mt-2">{% if admin_enabled %}<a href="/a/" class="link link-hover">Admin</a> | {% endif %}Powered by <a href="https://github.com/ipnet-mesh/meshcore-hub" target="_blank" rel="noopener noreferrer" class="link link-hover">MeshCore Hub</a> {{ version }}</p>
</aside>
</footer>
<!-- Leaflet JS for maps -->
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
<!-- Common utilities -->
<script src="/static/js/utils.js"></script>
{% block extra_scripts %}{% endblock %}
</body>
</html>


@@ -3,42 +3,91 @@
{% block title %}{{ network_name }} - Home{% endblock %}
{% block content %}
<div class="hero min-h-[50vh] bg-base-100 rounded-box">
<div class="hero py-8 bg-base-100 rounded-box">
<div class="hero-content text-center">
<div class="max-w-2xl">
<h1 class="text-5xl font-bold">{{ network_name }}</h1>
<h1 class="text-4xl font-bold">{{ network_name }}</h1>
{% if network_city and network_country %}
<p class="py-2 text-lg opacity-70">{{ network_city }}, {{ network_country }}</p>
<p class="py-1 text-lg opacity-70">{{ network_city }}, {{ network_country }}</p>
{% endif %}
<p class="py-6">
{% if network_welcome_text %}
<p class="py-4">{{ network_welcome_text }}</p>
{% else %}
<p class="py-4">
Welcome to the {{ network_name }} mesh network dashboard.
Monitor network activity, view connected nodes, and explore message history.
</p>
{% endif %}
<div class="flex gap-4 justify-center flex-wrap">
<a href="/network" class="btn btn-primary">
<a href="/network" class="btn btn-neutral">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z" />
</svg>
View Network Stats
Dashboard
</a>
<a href="/nodes" class="btn btn-secondary">
<a href="/nodes" class="btn btn-primary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
Browse Nodes
Nodes
</a>
<a href="/map" class="btn btn-accent">
<a href="/advertisements" class="btn btn-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 20l-5.447-2.724A1 1 0 013 16.382V5.618a1 1 0 011.447-.894L9 7m0 13l6-3m-6 3V7m6 10l4.553 2.276A1 1 0 0021 18.382V7.618a1 1 0 00-.553-.894L15 4m0 13V4m0 0L9 7" />
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
View Map
Advertisements
</a>
<a href="/messages" class="btn btn-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-5 w-5 mr-2" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
Messages
</a>
</div>
</div>
</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6 mt-8">
<!-- Stats Cards -->
<div class="grid grid-cols-1 md:grid-cols-3 gap-6 mt-6">
<!-- Total Nodes -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-primary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
</div>
<div class="stat-title">Total Nodes</div>
<div class="stat-value text-primary">{{ stats.total_nodes }}</div>
<div class="stat-desc">All discovered nodes</div>
</div>
<!-- Advertisements (7 days) -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-secondary">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5.882V19.24a1.76 1.76 0 01-3.417.592l-2.147-6.15M18 13a3 3 0 100-6M5.436 13.683A4.001 4.001 0 017 6h1.832c4.1 0 7.625-1.234 9.168-3v14c-1.543-1.766-5.067-3-9.168-3H7a3.988 3.988 0 01-1.564-.317z" />
</svg>
</div>
<div class="stat-title">Advertisements</div>
<div class="stat-value text-secondary">{{ stats.advertisements_7d }}</div>
<div class="stat-desc">Last 7 days</div>
</div>
<!-- Messages (7 days) -->
<div class="stat bg-base-100 rounded-box shadow">
<div class="stat-figure text-accent">
<svg xmlns="http://www.w3.org/2000/svg" class="h-8 w-8" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 10h.01M12 10h.01M16 10h.01M9 16H5a2 2 0 01-2-2V6a2 2 0 012-2h14a2 2 0 012 2v8a2 2 0 01-2 2h-5l-5 5v-5z" />
</svg>
</div>
<div class="stat-title">Messages</div>
<div class="stat-value text-accent">{{ stats.messages_7d }}</div>
<div class="stat-desc">Last 7 days</div>
</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6 mt-6">
<!-- Network Info Card -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
</h2>
<div class="space-y-2">
{% if network_radio_config %}
{% if network_radio_config.profile %}
<div class="flex justify-between">
<span class="opacity-70">Profile:</span>
<span class="font-mono">{{ network_radio_config.profile }}</span>
</div>
{% endif %}
{% if network_radio_config.frequency %}
<div class="flex justify-between">
<span class="opacity-70">Frequency:</span>
<span class="font-mono">{{ network_radio_config.frequency }}</span>
</div>
{% endif %}
{% if network_radio_config.bandwidth %}
<div class="flex justify-between">
<span class="opacity-70">Bandwidth:</span>
<span class="font-mono">{{ network_radio_config.bandwidth }}</span>
</div>
{% endif %}
{% if network_radio_config.spreading_factor %}
<div class="flex justify-between">
<span class="opacity-70">Spreading Factor:</span>
<span class="font-mono">{{ network_radio_config.spreading_factor }}</span>
</div>
{% endif %}
{% if network_radio_config.coding_rate %}
<div class="flex justify-between">
<span class="opacity-70">Coding Rate:</span>
<span class="font-mono">{{ network_radio_config.coding_rate }}</span>
</div>
{% endif %}
{% if network_radio_config.tx_power %}
<div class="flex justify-between">
<span class="opacity-70">TX Power:</span>
<span class="font-mono">{{ network_radio_config.tx_power }}</span>
</div>
{% endif %}
{% endif %}
{% if network_location and network_location != (0.0, 0.0) %}
<div class="flex justify-between">
<span class="opacity-70">Location:</span>
</div>
</div>
<!-- Powered by MeshCore -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body flex flex-col items-center justify-center">
<p class="text-sm opacity-70 mb-4 text-center">Our local off-grid mesh network is made possible by</p>
<a href="https://meshcore.co.uk/" target="_blank" rel="noopener noreferrer" class="hover:opacity-80 transition-opacity">
<img src="/static/img/meshcore.svg" alt="MeshCore" class="h-8" />
</a>
<p class="text-xs opacity-50 mt-4 text-center">Connecting people and things, without using the internet</p>
<div class="flex gap-2 mt-4">
<a href="https://meshcore.co.uk/" target="_blank" rel="noopener noreferrer" class="btn btn-outline btn-sm">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4 mr-1" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 12a9 9 0 01-9 9m9-9a9 9 0 00-9-9m9 9H3m9 9a9 9 0 01-9-9m9 9c1.657 0 3-4.03 3-9s-1.343-9-3-9m0 18c-1.657 0-3-4.03-3-9s1.343-9 3-9m-9 9a9 9 0 019-9" />
</svg>
Website
</a>
<a href="https://github.com/meshcore-dev/MeshCore" target="_blank" rel="noopener noreferrer" class="btn btn-outline btn-sm">
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4 mr-1" viewBox="0 0 24 24" fill="currentColor">
<path d="M12 0c-6.626 0-12 5.373-12 12 0 5.302 3.438 9.8 8.207 11.387.599.111.793-.261.793-.577v-2.234c-3.338.726-4.033-1.416-4.033-1.416-.546-1.387-1.333-1.756-1.333-1.756-1.089-.745.083-.729.083-.729 1.205.084 1.839 1.237 1.839 1.237 1.07 1.834 2.807 1.304 3.492.997.107-.775.418-1.305.762-1.604-2.665-.305-5.467-1.334-5.467-5.931 0-1.311.469-2.381 1.236-3.221-.124-.303-.535-1.524.117-3.176 0 0 1.008-.322 3.301 1.23.957-.266 1.983-.399 3.003-.404 1.02.005 2.047.138 3.006.404 2.291-1.552 3.297-1.23 3.297-1.23.653 1.653.242 2.874.118 3.176.77.84 1.235 1.911 1.235 3.221 0 4.609-2.807 5.624-5.479 5.921.43.372.823 1.102.823 2.222v3.293c0 .319.192.694.801.576 4.765-1.589 8.199-6.086 8.199-11.386 0-6.627-5.373-12-12-12z"/>
</svg>
GitHub
</a>
</div>
</div>
</div>
<!-- Network Activity Chart -->
<div class="card bg-base-100 shadow-xl">
<div class="card-body">
<h2 class="card-title">
<svg xmlns="http://www.w3.org/2000/svg" class="h-6 w-6" fill="none" viewBox="0 0 24 24" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M7 12l3-3 3 3 4-4M8 21l4-4 4 4M3 4h18M4 4h16v12a1 1 0 01-1 1H5a1 1 0 01-1-1V4z" />
</svg>
Network Activity
</h2>
<div class="space-y-2">
<p class="text-sm opacity-70 mb-2">Activity per day (last 7 days)</p>
<div class="h-48">
<canvas id="activityChart"></canvas>
</div>
</div>
</div>
</div>
{% endblock %}
{% block extra_scripts %}
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script>
(function() {
const advertData = {{ advert_activity_json | safe }};
const messageData = {{ message_activity_json | safe }};
const ctx = document.getElementById('activityChart');
if (ctx && advertData.data && advertData.data.length > 0) {
// Format dates for display (show only day/month)
const labels = advertData.data.map(d => {
const date = new Date(d.date);
return date.toLocaleDateString('en-GB', { day: 'numeric', month: 'short' });
});
const advertCounts = advertData.data.map(d => d.count);
const messageCounts = messageData.data ? messageData.data.map(d => d.count) : [];
new Chart(ctx, {
type: 'line',
data: {
labels: labels,
datasets: [{
label: 'Advertisements',
data: advertCounts,
borderColor: 'oklch(0.7 0.17 330)',
backgroundColor: 'oklch(0.7 0.17 330 / 0.1)',
fill: false,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}, {
label: 'Messages',
data: messageCounts,
borderColor: 'oklch(0.7 0.15 200)',
backgroundColor: 'oklch(0.7 0.15 200 / 0.1)',
fill: false,
tension: 0.3,
pointRadius: 2,
pointHoverRadius: 5
}]
},
options: {
responsive: true,
maintainAspectRatio: false,
plugins: {
legend: {
display: true,
position: 'bottom',
labels: {
color: 'oklch(0.7 0 0)',
boxWidth: 12,
padding: 8
}
},
tooltip: {
mode: 'index',
intersect: false,
backgroundColor: 'oklch(0.25 0 0)',
titleColor: 'oklch(0.9 0 0)',
bodyColor: 'oklch(0.9 0 0)',
borderColor: 'oklch(0.4 0 0)',
borderWidth: 1
}
},
scales: {
x: {
grid: {
color: 'oklch(0.4 0 0 / 0.2)'
},
ticks: {
color: 'oklch(0.7 0 0)',
maxRotation: 45,
minRotation: 45,
maxTicksLimit: 10
}
},
y: {
beginAtZero: true,
grid: {
color: 'oklch(0.4 0 0 / 0.2)'
},
ticks: {
color: 'oklch(0.7 0 0)',
precision: 0
}
}
},
interaction: {
mode: 'nearest',
axis: 'x',
intersect: false
}
}
});
}
})();
</script>
{% endblock %}
