208 Commits

Author SHA1 Message Date
Joel Krauska
8bea2bd744 ruff 2025-11-25 18:12:04 -08:00
Pablo Revilla
7edb7b5c38 Fix the net page as it was not showing the date information 2025-11-25 09:31:56 -08:00
Pablo Revilla
17a1265842 Fix the net page as it was not showing the date information 2025-11-25 09:05:37 -08:00
Pablo Revilla
cc03d237bb Fix the net page as it was not showing the date information 2025-11-24 19:38:46 -08:00
Pablo Revilla
06cc15a03c Fix the net page as it was not showing the date information 2025-11-24 18:46:28 -08:00
Pablo Revilla
99cb5654e4 More changes... almost ready for release.
Renamed 2 pages for easier reading.
2025-11-24 16:11:11 -08:00
Pablo Revilla
3c8fa0185e Renamed new_node to node. Shorter and descriptive. 2025-11-24 10:20:06 -08:00
Pablo Revilla
1e85aa01c6 Renamed new_node to node. Shorter and descriptive. 2025-11-24 10:16:30 -08:00
Pablo Revilla
535c5c8ada Renamed new_node to node. Shorter and descriptive. 2025-11-24 10:06:43 -08:00
Pablo Revilla
4a5b982e6f Renamed new_node to node. Shorter and descriptive. 2025-11-24 09:25:09 -08:00
Pablo Revilla
a71f371c85 Renamed new_node to node. Shorter and descriptive. 2025-11-24 09:12:47 -08:00
Pablo Revilla
4150953b96 Renamed new_node to node. Shorter and descriptive. 2025-11-24 09:09:51 -08:00
Pablo Revilla
0eed8f8001 Finishing up all the pages for the 3.0 release.
Now all pages are functional.
2025-11-21 15:41:25 -08:00
Pablo Revilla
14aabc3b10 Merge remote-tracking branch 'origin/dev-v3' into dev-v3 2025-11-21 13:47:37 -08:00
Pablo Revilla
fc01cb6a85 Finishing up all the pages for the 3.0 release.
Now all pages are functional.
2025-11-21 13:47:22 -08:00
Joel Krauska
5214b80816 another compatibility fix when _us is empty and we need to sort by BOTH old and new 2025-11-21 12:24:11 -08:00
Joel Krauska
73cd325b35 Make the robots do our bidding 2025-11-21 12:07:14 -08:00
Joel Krauska
f89686fb88 fix 0 epoch dates in /chat 2025-11-21 12:06:50 -08:00
Joel Krauska
0c89b3ec22 use sys.executable 2025-11-21 12:06:50 -08:00
Joel Krauska
9cd1975278 pyproject.toml requirements 2025-11-21 12:06:50 -08:00
Pablo Revilla
052a9460ca Finishing up all the pages for the 3.0 release.
Now all pages are functional.
2025-11-21 11:49:11 -08:00
Pablo Revilla
af6bb0fa64 Merge remote-tracking branch 'origin/dev-v3' into dev-v3 2025-11-21 11:43:07 -08:00
Pablo Revilla
8fae62e51a Finishing up all the pages for the 3.0 release.
Now all pages are functional.
2025-11-21 11:41:50 -08:00
Joel Krauska
4af1aac6ec more ruff 2025-11-21 11:37:24 -08:00
Joel Krauska
ed695684d9 fix ruff format 2025-11-21 11:36:48 -08:00
Pablo Revilla
5e0852e558 Finishing up all the pages for the 3.0 release.
Now all pages are functional.
2025-11-21 11:10:16 -08:00
Pablo Revilla
e135630f8d Finishing up all the pages for the 3.0 release.
Now all pages are functional.
2025-11-21 10:57:18 -08:00
Pablo Revilla
5f5ae75d84 Worked on /api/packet. Needed to modify
- Added new api endpoint /api/packets_seen
- Modified web.py and store.py to support changes to APIs.
- Started to work on new_node.html and new_packet.html for presentation of data.
2025-11-19 11:18:34 -08:00
Pablo Revilla
39c0dd589d Worked on /api/packet. Needed to modify
- Added new api endpoint /api/packets_seen
- Modified web.py and store.py to support changes to APIs.
- Started to work on new_node.html and new_packet.html for presentation of data.
2025-11-13 15:28:45 -08:00
Pablo Revilla
7411c7e8ee Worked on /api/packet. Needed to modify
- Store.py to read the new time data
- api.py to present the new time data
- firehose.html chat.html and map.html now use the new apis and the time is the browser local time
2025-11-10 21:51:48 -08:00
Pablo Revilla
27daa92694 Worked on /api/packet. Needed to modify
- Store.py to read the new time data
- api.py to present the new time data
- firehose.html chat.html and map.html now use the new apis and the time is the browser local time
2025-11-10 21:33:50 -08:00
Óscar García Amor
d4f251f1b6 Improves container build (#94) 2025-11-10 10:32:06 -08:00
Pablo Revilla
ac4ac9264f Worked on /api/packet. Needed to modify
- Store.py to read the new time data
- api.py to present the new time data
- firehose.html chat.html and map.html now use the new apis and the time is the browser local time
2025-11-05 20:46:12 -08:00
Pablo Revilla
4a3f205d26 Worked on /api/packet. Needed to modify
- Store.py to read the new time data
- api.py to present the new time data
- firehose.html chat.html and map.html now use the new apis and the time is the browser local time
2025-11-05 19:07:23 -08:00
Joel Krauska
9fa874762e Add us first/last timestamps to node table too 2025-11-05 16:48:29 -08:00
Joel Krauska
e0d8ceecac Alembic was blocking mqtt logs 2025-11-05 14:36:50 -08:00
Joel Krauska
67738105c8 Summary of 3.0.0 stuff 2025-11-04 21:33:08 -08:00
Joel Krauska
04e76ebd28 graphviz for dot in Container 2025-11-04 21:12:22 -08:00
Joel Krauska
04051bc00a setup-dev 2025-11-04 21:11:38 -08:00
Joel Krauska
f903c82c79 Docker Docs 2025-11-04 21:04:20 -08:00
Joel Krauska
4b9dfba03d ruff 2025-11-04 20:57:58 -08:00
Joel Krauska
70f727a6dd backups and cleanups are different 2025-11-04 20:55:55 -08:00
Joel Krauska
bc70b5c39d DB Backups 2025-11-04 20:51:50 -08:00
Jim Schrempp
a65de73b3a Traceroute Return Path logged and displayed (#97)
* traceroute returns are now logged and /packetlist now graphs the correct data for a return route
* now using alembic to update schema
* HOWTO - Alembic

---------

Co-authored-by: Joel Krauska <jkrauska@gmail.com>
2025-11-04 20:36:24 -08:00
Joel Krauska
cc053951b1 make /app owned by ap0p 2025-11-04 20:16:41 -08:00
Joel Krauska
fcff4f5849 checkout and containerfile 2025-11-04 20:04:05 -08:00
Joel Krauska
4e9f121514 fix symlink 2025-11-04 20:00:58 -08:00
Joel Krauska
c0ed5031e6 symlink 2025-11-04 20:00:01 -08:00
Joel Krauska
a6b1e30d29 auto build containers 2025-11-04 19:57:49 -08:00
Joel Krauska
24de8e73fb Container using slim/uv 2025-11-04 19:51:19 -08:00
Joel Krauska
b86af326af improve migrations and fix logging problem with mqtt 2025-11-04 16:27:18 -08:00
Joel Krauska
0a0ec5c45f remove unused loop 2025-11-03 20:23:21 -08:00
Joel Krauska
9ac045a1c5 fallback if missing config 2025-11-03 20:18:17 -08:00
Joel Krauska
e343d6aa15 mvrun work 2025-11-03 20:13:36 -08:00
Óscar García Amor
4de92da1ae Set dbcleanup.log location configurable 2025-11-03 20:07:34 -08:00
Óscar García Amor
74369deaea Improves arguments in mvrun.py 2025-11-03 20:07:34 -08:00
Joel Krauska
44671a1358 vuln 2025-11-03 20:05:48 -08:00
Joel Krauska
64261d3bc4 ruff and docs 2025-11-03 20:05:04 -08:00
Joel Krauska
fe59b42a53 break out api calls in to their own file to reduce footprint 2025-11-03 19:07:39 -08:00
Joel Krauska
f8ed76b41e alembic log format 2025-11-03 15:07:17 -08:00
Joel Krauska
7d0d704412 health endpoint 2025-11-03 15:05:28 -08:00
Joel Krauska
991794ed3d ignore other database files 2025-11-03 15:00:05 -08:00
Joel Krauska
87ade281ba update api docs 2025-11-03 14:52:34 -08:00
Joel Krauska
4c3858958b rm 2025-11-03 14:50:55 -08:00
Joel Krauska
0139169c7d more doc tidy 2025-11-03 14:50:43 -08:00
Joel Krauska
0b438366f1 add readme in docs: 2025-11-03 14:48:09 -08:00
Joel Krauska
9e38a3a394 remove old migrate script 2025-11-03 14:47:45 -08:00
Joel Krauska
cf55334165 move technical docs 2025-11-03 14:47:29 -08:00
Joel Krauska
dda94aa2cb add /version json endpoint 2025-11-03 14:26:49 -08:00
Joel Krauska
64169787b3 modify alembic to support cleaner migrations 2025-11-03 14:11:42 -08:00
Joel Krauska
fa28f6b63f Remove old index notes script -- no longer needed 2025-11-03 13:29:11 -08:00
Joel Krauska
5ca3b472a6 Store UTC int time in DB (#81)
* use UTC int time
2025-11-03 13:26:44 -08:00
Joel Krauska
8ec44ad552 Add alembic DB schema management (#86)
* Use alembic
* add creation helper
* example migration tool
2025-11-03 12:53:34 -08:00
Pablo Revilla
60ae77772d worked on making map and base all API driven 2025-11-02 11:41:15 -08:00
Pablo Revilla
ed33bfe540 worked on making map and base all API driven 2025-11-02 11:39:29 -08:00
Pablo Revilla
47a22911ca worked on making map and base all API driven 2025-11-01 18:30:26 -07:00
Pablo Revilla
d61427db8f worked on making map and base all API driven 2025-10-31 16:55:41 -07:00
Pablo Revilla
f11455eebc worked on making map and base all API driven 2025-10-31 16:55:05 -07:00
Pablo Revilla
0a548904c8 Merge remote-tracking branch 'origin/master' 2025-10-31 16:54:15 -07:00
Pablo Revilla
a0e5bb0747 worked on making map and base all API driven 2025-10-31 16:52:32 -07:00
Pablo Revilla
986ef8e4e5 Merge pull request #92 from io235/master
Add Salzburg/Austria to list of running instances
2025-10-31 11:07:59 -07:00
Pablo Revilla
54f7f1b1ce worked on making map and base all API driven 2025-10-31 07:45:48 -07:00
Io
6886a97874 Add Salzburg/Austria to list of running instances 2025-10-31 10:09:13 +00:00
Pablo Revilla
c4f2e3f24f Merge pull request #87 from jkrauska/traceLines
render traceroutes on top
2025-10-27 21:31:11 -07:00
Joel Krauska
8db8e90f80 use ruff format 2025-10-27 14:40:41 -07:00
Joel Krauska
3ea2809df0 render traceroutes on top 2025-10-27 14:38:35 -07:00
Pablo Revilla
f7f932d821 worked on making map and base all API driven 2025-10-23 13:53:41 -07:00
Pablo Revilla
ad8835a46b worked on making map and base all API driven 2025-10-22 15:57:34 -07:00
Pablo Revilla
cbe4895b2c worked on making map and base all API driven 2025-10-22 15:22:47 -07:00
Pablo Revilla
d9b1d5ac49 worked on making map and base all API driven 2025-10-22 14:31:07 -07:00
Pablo Revilla
13aa73e88f worked on making map and base all API driven 2025-10-22 09:20:08 -07:00
Pablo Revilla
58244bff09 worked on making map and base all API driven 2025-10-22 08:54:18 -07:00
Pablo Revilla
635353f3c8 worked on making map and base all API driven 2025-10-21 21:07:28 -07:00
Pablo Revilla
d5fb589665 worked on making map and base all API driven 2025-10-18 15:52:50 -07:00
Pablo Revilla
a4b51ace73 worked on making map and base all API driven 2025-10-18 15:27:13 -07:00
Pablo Revilla
75d0d9ea6a worked on making map and base all API driven 2025-10-18 15:13:08 -07:00
Pablo Revilla
c909ff58a5 Merge pull request #80 from pablorevilla-meshtastic/revert-78-10-15-25-bugs2
Revert "Add configurable channel filtering with allowlist and minimum packet threshold"
2025-10-17 18:40:48 -07:00
Pablo Revilla
a15b039a1f Revert "Add configurable channel filtering with allowlist and minimum packet threshold" 2025-10-17 18:40:22 -07:00
Pablo Revilla
d52b7d0929 Merge pull request #78 from nullrouten0/10-15-25-bugs2
Add configurable channel filtering with allowlist and minimum packet threshold
2025-10-17 16:54:06 -07:00
Nathan
d56ee8f4c5 ruff fixes 2025-10-17 15:36:36 -07:00
Pablo Revilla
52ca8a4060 Merge branch 'master' into 10-15-25-bugs2 2025-10-17 15:09:10 -07:00
Pablo Revilla
e4a6de3615 worked on making map and base all API driven 2025-10-17 14:26:08 -07:00
Pablo Revilla
3cca445cad worked on making map and base all API driven 2025-10-17 12:54:44 -07:00
Pablo Revilla
8b0c7a16e7 Start adding language support 2025-10-16 13:34:35 -07:00
Nathan
c5a1009877 added channel filtering min_packets, and allowlist, fixed javascript error, new sample.config.ini sections 2025-10-16 01:07:49 -07:00
Pablo Revilla
65ada1ba3e Merge pull request #77 from pablorevilla-meshtastic/revert-73-maphours-stacked
Revert "Maphours changes stacked with filtering additions"
2025-10-15 21:36:29 -07:00
Pablo Revilla
7f94bc0e39 Merge branch 'master' into revert-73-maphours-stacked 2025-10-15 21:36:06 -07:00
Pablo Revilla
5d687da598 Merge pull request #76 from pablorevilla-meshtastic/revert-75-10-15-25-bugs
Revert "fixed map to show only channels with locations"
2025-10-15 21:33:16 -07:00
Pablo Revilla
a002cde2d7 Revert "fixed map to show only channels with locations" 2025-10-15 21:32:50 -07:00
Nathan
954cd4653d fixing node graph selector 2025-10-15 18:02:52 -07:00
Pablo Revilla
454c8ff6e2 Start adding language support 2025-10-15 16:27:43 -07:00
Pablo Revilla
021bc54f9d Start adding language support 2025-10-15 16:25:34 -07:00
Pablo Revilla
155ef89724 Merge remote-tracking branch 'origin/master' 2025-10-15 16:24:07 -07:00
Pablo Revilla
084647eec1 Start adding language support 2025-10-15 16:23:59 -07:00
Pablo Revilla
c13a851145 Merge pull request #75 from nullrouten0/10-15-25-bugs
fixed map to show only channels with locations
2025-10-15 16:15:01 -07:00
Pablo Revilla
114cd980b9 Merge branch 'master' into 10-15-25-bugs 2025-10-15 16:14:47 -07:00
Nathan
c23a650c0d fixed map to show only channels with locations 2025-10-15 16:07:52 -07:00
Pablo Revilla
318bf83403 Revert "Maphours changes stacked with filtering additions" 2025-10-15 15:57:56 -07:00
Pablo Revilla
636ab3e976 Start adding language support 2025-10-15 15:57:39 -07:00
Pablo Revilla
ea10a656e7 Start adding language support 2025-10-15 15:31:04 -07:00
Pablo Revilla
bcd007e5e2 Merge pull request #73 from nullrouten0/maphours-stacked
Maphours changes stacked with filtering additions
2025-10-15 15:09:24 -07:00
Nathan
b35acde821 Add channel-aware activity filters and API-driven dashboards 2025-10-14 21:34:52 -07:00
Nathan
b7752bc315 Map: activity time filters 2025-10-11 21:21:10 -07:00
Nathan
257bf7ffac Add channel filters to stats, chat, and firehose views 2025-10-11 16:28:34 -07:00
Pablo Revilla
d561d1a8de Start adding language support 2025-10-10 21:37:45 -07:00
Pablo Revilla
60e7389d83 Start adding language support 2025-10-10 21:34:42 -07:00
Pablo Revilla
4ac3262544 Start adding language support 2025-10-10 21:34:36 -07:00
Pablo Revilla
87643e4bd2 Start adding language support 2025-10-10 21:32:36 -07:00
Pablo Revilla
29174a649c Start adding language support 2025-10-10 21:28:48 -07:00
Pablo Revilla
712aea5139 Start adding language support 2025-10-10 21:10:35 -07:00
Pablo Revilla
d6fadd99d0 Start adding language support 2025-10-10 21:01:43 -07:00
Pablo Revilla
ae0b0944f0 Merge pull request #68 from jkrauska/profileTop
Add database indexes for 10X improvement in page load for /top
2025-10-10 12:58:41 -07:00
Pablo Revilla
d7b830e2f7 Merge pull request #69 from jkrauska/lornet.pl
fix for loranet.pl missing gateway_id
2025-10-08 18:31:05 -07:00
Joel Krauska
4a1737ebd4 fix for loranet.pl 2025-10-07 20:13:00 -07:00
Joel Krauska
60131007df fix for ruff 2025-10-07 19:49:57 -07:00
Joel Krauska
23d66c0d67 add database indexes 2025-10-07 18:22:50 -07:00
Pablo Revilla
30ba603f66 Merge pull request #67 from jkrauska/nodeListFavorites
FEATURE: Add NodeList Favorites and Remember Map Filters
2025-10-07 16:08:52 -07:00
Pablo Revilla
9811102681 Merge pull request #66 from jkrauska/apiEdges
BUG: Fix for api/edges traceback
2025-10-07 15:54:04 -07:00
Joel Krauska
7c92b06bec use ruff format 2025-10-07 14:15:29 -07:00
Joel Krauska
adda666a39 Add Favorites and Remember Filters 2025-10-07 14:04:14 -07:00
Joel Krauska
3e673f30bc Fix for api/edges traceback 2025-10-07 13:59:20 -07:00
Pablo Revilla
beefb4c5df Merge pull request #64 from jkrauska/ruffVersionFix
Bump ruff version - fix open call from lang work
2025-10-03 21:56:19 -07:00
Joel Krauska
e1bada8378 Bump ruff version - fix open call from lang work 2025-10-03 21:34:13 -07:00
Pablo Revilla
fbd6fcb123 Merge pull request #62 from jkrauska/ruffAutomation
Automate ruff in github action
2025-10-03 21:03:53 -07:00
Pablo Revilla
5d267effa5 Remove unused code 2025-10-03 20:56:47 -07:00
Joel Krauska
e28d248cf9 Automate ruff in github action 2025-10-03 20:55:11 -07:00
Pablo Revilla
ab101dd461 Merge pull request #61 from jkrauska/jkrauska/ruffFormat
Add Ruff formatting and pre-commit hooks
2025-10-03 20:49:44 -07:00
Joel Krauska
35212d403e Merge branch 'master' into jkrauska/ruffFormat 2025-10-03 20:43:16 -07:00
Pablo Revilla
3603014fd2 Added maps coordinates to /api/config 2025-10-03 20:41:02 -07:00
Joel Krauska
e25ff22127 Add Ruff formatting and pre-commit hooks
Configure Ruff as the code formatter and linter with pre-commit hooks.
  Applied automatic formatting fixes across the entire codebase including:
  - Import sorting and organization
  - Code style consistency (spacing, line breaks, indentation)
  - String quote normalization
  - Removal of trailing whitespace and unnecessary blank lines
2025-10-03 20:38:37 -07:00
Pablo Revilla
aa9922e7fa work on error where packet ids could be duplicate and crash the loop 2025-10-03 12:54:00 -07:00
Pablo Revilla
a9b16d6c18 work on error where packet ids could be duplicate and crash the loop 2025-10-03 12:33:04 -07:00
Pablo Revilla
b4fda0bb01 Merge remote-tracking branch 'origin/master' 2025-10-03 11:59:35 -07:00
Pablo Revilla
215817abc7 Cleanup the install process 2025-10-03 11:59:21 -07:00
Pablo Revilla
f167e8780d Merge pull request #57 from jkrauska/jkrauska/startupLogging
Add structured logging and improved startup/shutdown handling
2025-10-03 08:58:31 -07:00
Joel Krauska
2723022dd5 Add structured logging and improved startup/shutdown handling
- Add consistent logging format across all modules (timestamp, file:line, PID, level)
- Add startup logging for MQTT connection, web server startup with URL display
- Add MQTT message processing metrics (count and rate logging every 10k messages)
- Add graceful shutdown handling with signal handlers and PID file cleanup
- Add configurable HTTP access log toggle via config.ini (default: disabled)
- Replace print() statements with proper logger calls throughout
- Update .gitignore to exclude PID files (meshview-db.pid, meshview-web.pid)
- Update documentation for new logging configuration options

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-01 17:49:01 -07:00
Pablo Revilla
d2d18746ef Fixed bug on edges API 2025-10-01 14:00:14 -07:00
Pablo Revilla
7146f69beb update protobuf 2025-10-01 12:07:43 -07:00
Pablo Revilla
db8703919d Merge pull request #53 from jkrauska/jkrauska/mapzoom
Add url parameters to /map to support zoomed view
2025-10-01 11:43:45 -07:00
Pablo Revilla
baeaf29df0 Merge pull request #51 from Cloud-121/master
Fix Client_BASE not showing in Mesh Graphs
2025-10-01 09:18:50 -07:00
Pablo Revilla
44ddfe7ed7 update protobuf 2025-10-01 08:08:50 -07:00
Pablo Revilla
fc28dcc53e Merge pull request #52 from jkrauska/master
Minor README Tweaks and gitignore add
2025-10-01 08:05:34 -07:00
Pablo Revilla
81a2c0c7ca update protobuf 2025-10-01 08:01:00 -07:00
Joel Krauska
c7f5467acb Add url parameters to /map to support zoomed view 2025-09-30 17:56:35 -07:00
Joel Krauska
396e5ccbf1 Minor README Tweaks and gitignore add 2025-09-30 17:17:55 -07:00
Cloud Hayes
0a522f9a19 Fix Client_BASE not showing in Mesh Graphs 2025-09-30 17:40:06 -05:00
Pablo Revilla
40c5d4e291 update protobuf 2025-09-25 13:37:55 -07:00
Pablo Revilla
550a266212 update protobuf 2025-09-24 20:28:33 -07:00
Pablo Revilla
238ac409f8 testing new integration 2025-09-24 20:16:10 -07:00
Pablo Revilla
ee640b2cec Update .gitmodules 2025-09-24 20:13:28 -07:00
Pablo Revilla
561d410e6a Updates stats with pie chart and report for channel 2025-09-24 19:55:41 -07:00
Pablo Revilla
a20dafe714 Updates stats with pie chart and report for channel 2025-09-24 19:54:33 -07:00
Pablo Revilla
3cd93c08a7 Updates stats with pie chart and report for channel 2025-09-24 19:39:06 -07:00
Pablo Revilla
11537fdef1 Merge remote-tracking branch 'origin/master' 2025-09-24 19:38:55 -07:00
Pablo Revilla
5068f7acb1 Updates stats with pie chart and report for channel 2025-09-24 17:23:12 -07:00
Pablo Revilla
85f04f485e Merge pull request #36 from madeofstown/master
New Install Procedure
2025-09-24 17:17:16 -07:00
Pablo Revilla
a094b3edd5 Merge branch 'master' into master 2025-09-24 17:17:01 -07:00
Pablo Revilla
8d7f72ac6e Updates stats with pie chart and report for channel 2025-09-22 10:25:24 -07:00
Pablo Revilla
03e198b80c Updates stats with pie chart and report for channel 2025-09-22 10:18:06 -07:00
Pablo Revilla
86b4fa6cbf Update README.md 2025-09-19 21:48:39 -07:00
Pablo Revilla
e6424e3c6d Update README.md 2025-09-19 21:48:13 -07:00
Pablo Revilla
e2c1e311b8 Update README.md 2025-09-19 11:11:25 -07:00
Pablo Revilla
02f63fca70 Work on DB cleanup tool 2025-09-19 10:50:15 -07:00
Pablo Revilla
f9a6f3dff2 Work on DB cleanup tool 2025-09-19 09:20:43 -07:00
Pablo Revilla
0da2ef841c Work on DB cleanup tool 2025-09-19 09:11:00 -07:00
Pablo Revilla
4ffd287c84 Work on DB cleanup tool 2025-09-19 08:50:10 -07:00
Pablo Revilla
ec0dd4ef03 Work on status page 2025-09-18 10:28:55 -07:00
Pablo Revilla
608fde9e9c Work on db cleanup tool 2025-09-18 10:25:27 -07:00
Pablo Revilla
7c40c64de8 Work on db cleanup tool 2025-09-18 09:45:01 -07:00
Pablo Revilla
4f4c18fa14 Work on db cleanup tool 2025-09-18 09:37:24 -07:00
Pablo Revilla
6eb1cdbd2d Work on db cleanup tool 2025-09-18 07:40:25 -07:00
Pablo Revilla
cad3051e7f Work on db cleanup tool 2025-09-18 07:38:58 -07:00
Pablo Revilla
2b9422efbc fixed spelling of variable firehouse_interval 2025-09-18 07:34:26 -07:00
Pablo Revilla
ddb691d4de fixed spelling of variable firehouse_interval 2025-09-17 23:05:05 -07:00
Pablo Revilla
bbab5fefd0 make the /api/config endpoint restrictive to what it provides. It will only show what is needed for the current code. 2025-09-17 23:01:29 -07:00
Pablo Revilla
6e223a066a make the /api/config endpoint restrictive to what it provides. It will only show what is needed for the current code. 2025-09-17 23:00:44 -07:00
Pablo Revilla
61b74473e3 make the /api/config endpoint restrictive to what it provides. It will only show what is needed for the current code. 2025-09-17 22:55:40 -07:00
Pablo Revilla
f06fa3a4a3 Added Database cleanup feature to startdb.py 2025-09-17 22:22:35 -07:00
Pablo Revilla
9d4ebc00f6 Added Database cleanup feature to startdb.py 2025-09-17 22:15:12 -07:00
Pablo Revilla
a69d1a5729 Added Database cleanup feature to startdb.py 2025-09-17 22:12:04 -07:00
madeofstown
7e3076c0e2 Update README.md
symlink target is relative to link location
2025-08-19 19:24:58 -07:00
madeofstown
e3f5c0f006 Delete meshtastic/protobuf 2025-08-19 19:14:48 -07:00
madeofstown
572e79c9ac Update .gitmodules 2025-08-19 19:10:46 -07:00
madeofstown
fb70f644e5 Update README.md
Change install procedure to mitigate broken submodule
2025-08-19 19:08:38 -07:00
madeofstown
954d6300de Update .gitignore 2025-08-19 17:35:26 -07:00
madeofstown
9ceca0eea9 Update .gitmodules 2025-08-19 17:31:36 -07:00
madeofstown
24f768f725 Merge pull request #5 from madeofstown/testing
re-add meshtastic/python submodule
2025-08-19 17:14:35 -07:00
madeofstown
89f3eade15 re-add meshtastic/python submodule 2025-08-19 17:12:40 -07:00
91 changed files with 9341 additions and 4180 deletions

.dockerignore (new file, +10 lines)

@@ -0,0 +1,10 @@
# This keeps Docker from including hostOS virtual environment folders
env/
.venv/
# Database files and backups
*.db
*.db-shm
*.db-wal
backups/
*.db.gz

.github/workflows/container.yml (new file, +52 lines)

@@ -0,0 +1,52 @@
name: Build container
on:
push:
jobs:
docker:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Docker meta
id: meta
uses: docker/metadata-action@v5
with:
# list of Docker images to use as base name for tags
images: |
ghcr.io/${{ github.repository }}
# generate Docker tags based on the following events/attributes
tags: |
type=ref,event=branch
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
type=match,pattern=v\d.\d.\d,value=latest
- name: Login to GitHub Container Registry
if: github.event_name != 'pull_request'
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Build and push
uses: docker/build-push-action@v6
with:
context: .
file: ./Containerfile
push: ${{ github.event_name != 'pull_request' }}
labels: ${{ steps.meta.outputs.labels }}
tags: ${{ steps.meta.outputs.tags }}
platforms: linux/amd64,linux/arm64
# optional cache (speeds up rebuilds)
cache-from: type=gha
cache-to: type=gha,mode=max

.github/workflows/lint.yml (new file, +39 lines)

@@ -0,0 +1,39 @@
name: Ruff
on:
pull_request:
paths:
- "**/*.py"
- "pyproject.toml"
- "ruff.toml"
- ".pre-commit-config.yaml"
jobs:
ruff:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.13"
- name: Cache Ruff
uses: actions/cache@v4
with:
path: ~/.cache/ruff
key: ruff-${{ runner.os }}-${{ hashFiles('**/pyproject.toml', '**/ruff.toml') }}
- name: Install Ruff
run: pip install "ruff==0.13.3"
# Lint (with GitHub annotation format for inline PR messages)
- name: Ruff check
run: ruff check --output-format=github .
# Fail PR if formatting is needed
- name: Ruff format (check-only)
run: ruff format --check .
# TODO: Investigate only applying to changed files and possibly apply fixes

.gitignore (+40 lines)

@@ -1,7 +1,47 @@
env/*
__pycache__/*
meshview/__pycache__/*
alembic/__pycache__/*
meshtastic/protobuf/*
# Database files
packets.db
packets*.db
*.db
*.db-shm
*.db-wal
# Database backups
backups/
*.db.gz
# Process files
meshview-db.pid
meshview-web.pid
# Config and logs
/table_details.py
config.ini
*.log
# Screenshots
screenshots/*
# Python
python/nanopb
__pycache__/
*.pyc
*.pyo
*.pyd
.Python
# IDE
.vscode/
.idea/
*.swp
*.swo
*~
# OS
.DS_Store
Thumbs.db

.gitmodules (new file, +1 line)

@@ -0,0 +1 @@

.pre-commit-config.yaml (new file, +8 lines)

@@ -0,0 +1,8 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.13.3 # pin the latest you're comfortable with
hooks:
- id: ruff
args: [--fix, --exit-non-zero-on-fix] # fail if it had to change files
- id: ruff-format

AGENTS.md (new file, +204 lines)

@@ -0,0 +1,204 @@
# AI Agent Guidelines for Meshview
This document provides context and guidelines for AI coding assistants working on the Meshview project.
## Project Overview
Meshview is a real-time monitoring and diagnostic tool for Meshtastic mesh networks. It provides web-based visualization and analysis of network activity, including:
- Real-time packet monitoring from MQTT streams
- Interactive map visualization of node locations
- Network topology graphs showing connectivity
- Message traffic analysis and conversation tracking
- Node statistics and telemetry data
- Packet inspection and traceroute analysis
## Architecture
### Core Components
1. **MQTT Reader** (`meshview/mqtt_reader.py`) - Subscribes to MQTT topics and receives mesh packets
2. **Database Manager** (`meshview/database.py`, `startdb.py`) - Handles database initialization and migrations
3. **MQTT Store** (`meshview/mqtt_store.py`) - Processes and stores packets in the database
4. **Web Server** (`meshview/web.py`, `main.py`) - Serves the web interface and API endpoints
5. **API Layer** (`meshview/web_api/api.py`) - REST API endpoints for data access
6. **Models** (`meshview/models.py`) - SQLAlchemy database models
7. **Decode Payload** (`meshview/decode_payload.py`) - Protobuf message decoding
### Technology Stack
- **Python 3.13+** - Main language
- **aiohttp** - Async web framework
- **aiomqtt** - Async MQTT client
- **SQLAlchemy (async)** - ORM with async support
- **Alembic** - Database migrations
- **Jinja2** - Template engine
- **Protobuf** - Message serialization (Meshtastic protocol)
- **SQLite/PostgreSQL** - Database backends (SQLite default, PostgreSQL via asyncpg)
### Key Patterns
- **Async/Await** - All I/O operations are asynchronous
- **Database Migrations** - Use Alembic for schema changes (see `docs/Database-Changes-With-Alembic.md`)
- **Configuration** - INI file-based config (`config.ini`, see `sample.config.ini`)
- **Modular API** - API routes separated into `meshview/web_api/` module
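The INI-driven configuration pattern can be sketched with the standard library's `configparser`; the section and key names below are illustrative assumptions, not Meshview's real schema (check `sample.config.ini` for that):

```python
import configparser

# Illustrative config text; a real deployment would call cfg.read("config.ini")
SAMPLE = """
[server]
port = 8081

[mqtt]
topic = msh/US/#
"""

def load_config(text):
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    # Pull typed values with explicit fallbacks instead of hardcoding defaults
    return {
        "port": cfg.getint("server", "port", fallback=8080),
        "topic": cfg.get("mqtt", "topic", fallback="msh/#"),
    }

settings = load_config(SAMPLE)
```

Reading through a single loader like this keeps the "respect configuration, don't hardcode" rule enforceable in one place.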
## Project Structure
```
meshview/
├── alembic/ # Database migration scripts
├── docs/ # Technical documentation
├── meshview/ # Main application package
│ ├── static/ # Static web assets (HTML, JS, CSS)
│ ├── templates/ # Jinja2 HTML templates
│ ├── web_api/ # API route handlers
│ └── *.py # Core modules
├── main.py # Web server entry point
├── startdb.py # Database manager entry point
├── mvrun.py # Combined runner (starts both services)
├── config.ini # Runtime configuration
└── requirements.txt # Python dependencies
```
## Development Workflow
### Setup
1. Use Python 3.13+ virtual environment
### Running
- **Database**: `./env/bin/python startdb.py`
- **Web Server**: `./env/bin/python main.py`
- **Both**: `./env/bin/python mvrun.py`
## Code Style
- **Line length**: 100 characters (see `pyproject.toml`)
- **Linting**: Ruff (configured in `pyproject.toml`)
- **Formatting**: Ruff formatter
- **Type hints**: Preferred but not strictly required
- **Async**: Use `async def` and `await` for I/O operations
## Important Files
### Configuration
- `config.ini` - Runtime configuration (server, MQTT, database, cleanup)
- `sample.config.ini` - Template configuration file
- `alembic.ini` - Alembic migration configuration
### Database
- `meshview/models.py` - SQLAlchemy models (Packet, Node, Traceroute, etc.)
- `meshview/database.py` - Database initialization and session management
- `alembic/versions/` - Migration scripts
### Core Logic
- `meshview/mqtt_reader.py` - MQTT subscription and message reception
- `meshview/mqtt_store.py` - Packet processing and storage
- `meshview/decode_payload.py` - Protobuf decoding
- `meshview/web.py` - Web server routes and handlers
- `meshview/web_api/api.py` - REST API endpoints
### Templates
- `meshview/templates/` - Jinja2 HTML templates
- `meshview/static/` - Static files (HTML pages, JS, CSS)
## Common Tasks
### Adding a New API Endpoint
1. Add route handler in `meshview/web_api/api.py`
2. Register route in `meshview/web.py` (if needed)
3. Update `docs/API_Documentation.md` if public API
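The handler shape can be sketched in a framework-agnostic way; the endpoint name and payload below are hypothetical, and a real Meshview handler receives an aiohttp request object rather than a plain dict:

```python
import asyncio
import json

async def api_node_count(params):
    # Hypothetical handler: a real one would await an async DB query here
    await asyncio.sleep(0)  # stand-in for awaited I/O
    count = 42  # stand-in for the query result
    return json.dumps({"node_count": count})

body = asyncio.run(api_node_count({}))
```

The key property to preserve is that every handler is `async def` and awaits its I/O, so it never blocks the event loop.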
### Database Schema Changes
1. Modify models in `meshview/models.py`
2. Create migration: `alembic revision --autogenerate -m "description"`
3. Review generated migration in `alembic/versions/`
4. Test migration: `alembic upgrade head`
5. **Never** modify existing migration files after they've been applied
### Adding a New Web Page
1. Create template in `meshview/templates/`
2. Add route in `meshview/web.py`
3. Add navigation link if needed (check existing templates for pattern)
4. Add static assets if needed in `meshview/static/`
### Processing New Packet Types
1. Check `meshview/decode_payload.py` for existing decoders
2. Add decoder function if new type
3. Update `meshview/mqtt_store.py` to handle new packet type
4. Update database models if new data needs storage
## Key Concepts
### Meshtastic Protocol
- Uses Protobuf for message serialization
- Packets contain various message types (text, position, telemetry, etc.)
- MQTT topics follow pattern: `msh/{region}/{subregion}/#`
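Topic filters use MQTT's `+` (one level) and `#` (remaining levels) wildcards; a client library such as aiomqtt performs the matching for you, but the semantics can be sketched in a few lines:

```python
def mqtt_topic_matches(topic_filter, topic):
    """Minimal MQTT topic matcher: '+' spans one level, '#' the remainder.
    Illustrative only; the MQTT client library handles this in practice."""
    flevels = topic_filter.split("/")
    tlevels = topic.split("/")
    for i, f in enumerate(flevels):
        if f == "#":
            return True  # '#' must be the last level; matches everything below
        if i >= len(tlevels):
            return False
        if f != "+" and f != tlevels[i]:
            return False
    return len(flevels) == len(tlevels)
```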
### Database Schema
- **packet** - Raw packet data
- **node** - Mesh node information
- **traceroute** - Network path information
- **packet_seen** - Packet observation records
### Real-time Updates
- Web pages use Server-Sent Events (SSE) for live updates
- Map and firehose pages auto-refresh based on config intervals
- API endpoints return JSON for programmatic access
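The SSE wire format itself is simple; here is a sketch of framing one event for the browser's `EventSource` API (the `packet` event name is hypothetical):

```python
import json

def sse_event(data, event=None):
    """Format one Server-Sent Events frame. Field names follow the SSE spec:
    an optional 'event:' line, then one 'data:' line per payload line."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    for chunk in json.dumps(data).splitlines():
        lines.append(f"data: {chunk}")
    return "\n".join(lines) + "\n\n"  # blank line terminates the frame

frame = sse_event({"id": "!abcd1234"}, event="packet")
```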
## Best Practices
1. **Always use async/await** for database and network operations
2. **Use Alembic** for all database schema changes
3. **Follow existing patterns** - check similar code before adding new features
4. **Update documentation** - keep `docs/` and README current
5. **Test migrations** - verify migrations work both up and down
6. **Handle errors gracefully** - log errors, don't crash on bad packets
7. **Respect configuration** - use `config.ini` values, don't hardcode
## Common Pitfalls
- **Don't modify applied migrations** - create new ones instead
- **Don't block the event loop** - use async I/O, not sync
- **Don't forget timezone handling** - timestamps are stored in UTC
- **Don't hardcode paths** - use configuration values
- **Don't ignore MQTT reconnection** - handle connection failures gracefully
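For the timezone pitfall in particular, the safe pattern is to convert at the edges and keep UTC epoch integers everywhere else; a sketch, with helper names that are illustrative rather than Meshview's actual API:

```python
from datetime import datetime, timezone

def to_epoch_utc(dt):
    """Store as an integer UTC epoch; naive datetimes are assumed UTC."""
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)
    return int(dt.timestamp())

def from_epoch_utc(ts):
    """Always rehydrate as timezone-aware UTC; format for display only."""
    return datetime.fromtimestamp(ts, tz=timezone.utc)

ts = to_epoch_utc(datetime(2025, 11, 3, 13, 26, 44, tzinfo=timezone.utc))
```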
## Resources
- **Main README**: `README.md` - Installation and basic usage
- **Docker Guide**: `README-Docker.md` - Container deployment
- **API Docs**: `docs/API_Documentation.md` - API endpoint reference
- **Migration Guide**: `docs/Database-Changes-With-Alembic.md` - Database workflow
- **Contributing**: `CONTRIBUTING.md` - Contribution guidelines
## Version Information
- **Current Version**: 3.0.0 (November 2025)
- **Python Requirement**: 3.13+
- **Key Features**: Alembic migrations, automated backups, Docker support, traceroute return paths
## Rules for robots
- Always run `ruff check` and `ruff format` after making changes (Python changes only)
---
When working on this project, prioritize:
1. Maintaining async patterns
2. Following existing code structure
3. Using proper database migrations
4. Keeping documentation updated
5. Testing changes thoroughly

CONTRIBUTING.md (new file)
# Contributing to Meshview
First off, thanks for taking the time to contribute! ❤️
All types of contributions are encouraged and valued. See the [Table of Contents](#table-of-contents) for ways to help and details about how this project handles contributions. Please read the relevant section before getting started — it will make things smoother for both you and the maintainers.
The Meshview community looks forward to your contributions. 🎉
> And if you like the project but don't have time to contribute code, that's fine! You can still support Meshview by:
> - ⭐ Starring the repo on GitHub
> - Talking about Meshview on social media
> - Referencing Meshview in your own project's README
> - Mentioning Meshview at local meetups or to colleagues/friends
---
## Table of Contents
- [Code of Conduct](#code-of-conduct)
- [I Have a Question](#i-have-a-question)
- [I Want to Contribute](#i-want-to-contribute)
- [Reporting Bugs](#reporting-bugs)
- [Suggesting Enhancements](#suggesting-enhancements)
- [Your First Code Contribution](#your-first-code-contribution)
- [Improving the Documentation](#improving-the-documentation)
- [Styleguides](#styleguides)
- [Commit Messages](#commit-messages)
- [Join the Project Team](#join-the-project-team)
---
## Code of Conduct
Meshview is an open and welcoming community. We want everyone to feel safe, respected, and valued.
### Our Standards
- Be respectful and considerate in all interactions.
- Welcome new contributors and help them learn.
- Provide constructive feedback, not personal attacks.
- Focus on collaboration and what benefits the community.
Unacceptable behavior includes harassment, insults, hate speech, personal attacks, or publishing others' private information without permission.
---
## I Have a Question
> Before asking, please read the [documentation](docs/README.md) if available.
1. Search the [issues list](../../issues) to see if your question has already been asked.
2. If not, open a [new issue](../../issues/new) with the **question** label.
3. Provide as much context as possible (OS, Python version, database type, etc.).
---
## I Want to Contribute
### Legal Notice
By contributing to Meshview, you agree that:
- You authored the content yourself.
- You have the necessary rights to the content.
- Your contribution can be provided under the project's license.
---
### Reporting Bugs
Before submitting a bug report:
- Make sure you're using the latest Meshview version.
- Verify the issue is not due to a misconfigured environment (SQLite/MySQL, Python version, etc.).
- Search existing [bug reports](../../issues?q=label%3Abug).
- Collect relevant information:
- Steps to reproduce
- Error messages / stack traces
- OS, Python version, and database backend
- Any logs (`meshview-db.service`, `mqtt_reader.py`, etc.)
How to report:
- Open a [new issue](../../issues/new).
- Use a **clear and descriptive title**.
- Include reproduction steps and expected vs. actual behavior.
⚠️ Security issues should **not** be reported in public issues. Instead, email us at **meshview-maintainers@proton.me**.
---
### Suggesting Enhancements
Enhancements are tracked as [issues](../../issues). Before suggesting:
- Make sure the feature doesn't already exist.
- Search for prior suggestions.
- Check that it fits Meshview's scope (mesh packet analysis, visualization, telemetry, etc.).
When submitting:
- Use a **clear and descriptive title**.
- Describe the current behavior and what you'd like to see instead.
- Include examples, screenshots, or mockups if relevant.
- Explain why it would be useful to most Meshview users.
---
### Your First Code Contribution
We love first-time contributors! 🚀
If you'd like to start coding:
1. Look for issues tagged with [good first issue](../../issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22).
2. Fork the repository and clone it locally.
3. Set up the development environment (see the README for setup commands).
4. Run the app locally.
5. Create a new branch, make your changes, commit, and push.
6. Open a pull request!
---
### Improving the Documentation
Docs are just as important as code. You can help by:
- Fixing typos or broken links.
- Clarifying confusing instructions.
- Adding examples (e.g., setting up Nginx as a reverse proxy, SQLite vs. MySQL setup).
- Writing or updating tutorials.
---
## Join the Project Team
Meshview is a community-driven project. If you consistently contribute (code, documentation, or community help), we'd love to invite you as a maintainer.
Start by contributing regularly, engaging in issues/PRs, and helping others.
---
✨ That's it! Thanks again for being part of Meshview. Every contribution matters.

Containerfile (new file)
# Build Image
# Uses python:3.13-slim because no native dependencies are needed for meshview itself
# (everything is available as a wheel)
FROM docker.io/python:3.13-slim AS meshview-build
RUN apt-get update && \
apt-get install -y --no-install-recommends curl patch && \
rm -rf /var/lib/apt/lists/*
# Add a non-root user/group
ARG APP_USER=app
RUN useradd -m -u 10001 -s /bin/bash ${APP_USER}
# Install uv and put it on PATH system-wide
RUN curl -LsSf https://astral.sh/uv/install.sh | sh \
&& install -m 0755 /root/.local/bin/uv /usr/local/bin/uv
WORKDIR /app
RUN chown -R ${APP_USER}:${APP_USER} /app
# Copy deps first for caching
COPY --chown=${APP_USER}:${APP_USER} pyproject.toml uv.lock* requirements*.txt ./
# Optional: wheels-only to avoid slow source builds
ENV UV_NO_BUILD=1
RUN uv venv /opt/venv
# RUN uv sync --frozen
ENV VIRTUAL_ENV=/opt/venv
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
RUN uv pip install --no-cache-dir --upgrade pip \
&& if [ -f requirements.txt ]; then uv pip install --only-binary=:all: -r requirements.txt; fi
# Copy app code
COPY --chown=${APP_USER}:${APP_USER} . .
# Patch config
RUN patch sample.config.ini < container/config.patch
# Clean
RUN rm -rf /app/.git* && \
rm -rf /app/.pre-commit-config.yaml && \
rm -rf /app/*.md && \
rm -rf /app/COPYING && \
rm -rf /app/Containerfile && \
rm -rf /app/Dockerfile && \
rm -rf /app/container && \
rm -rf /app/docker && \
rm -rf /app/docs && \
rm -rf /app/pyproject.toml && \
rm -rf /app/requirements.txt && \
rm -rf /app/screenshots
# Prepare /app and /opt to copy
RUN mkdir -p /meshview && \
mv /app /opt /meshview
# Use a clean container for install
FROM docker.io/python:3.13-slim
ARG APP_USER=app
COPY --from=meshview-build /meshview /
RUN apt-get update && \
apt-get install -y --no-install-recommends graphviz && \
rm -rf /var/lib/apt/lists/* && \
useradd -m -u 10001 -s /bin/bash ${APP_USER} && \
mkdir -p /etc/meshview /var/lib/meshview /var/log/meshview && \
mv /app/sample.config.ini /etc/meshview/config.ini && \
chown -R ${APP_USER}:${APP_USER} /var/lib/meshview /var/log/meshview
# Drop privileges
USER ${APP_USER}
WORKDIR /app
ENTRYPOINT [ "/opt/venv/bin/python", "mvrun.py"]
CMD ["--pid_dir", "/tmp", "--py_exec", "/opt/venv/bin/python", "--config", "/etc/meshview/config.ini" ]
EXPOSE 8081
VOLUME [ "/etc/meshview", "/var/lib/meshview", "/var/log/meshview" ]

Dockerfile (new file: symbolic link to Containerfile)

README-Docker.md (new file)
# Running MeshView with Docker
MeshView container images are built automatically and published to GitHub Container Registry.
## Quick Start
Pull and run the latest image:
```bash
docker pull ghcr.io/pablorevilla-meshtastic/meshview:latest
docker run -d \
--name meshview \
-p 8081:8081 \
-v ./config:/etc/meshview \
-v ./data:/var/lib/meshview \
-v ./logs:/var/log/meshview \
ghcr.io/pablorevilla-meshtastic/meshview:latest
```
Access the web interface at: http://localhost:8081
## Volume Mounts
The container uses three volumes for persistent data:
| Volume | Purpose | Required |
|--------|---------|----------|
| `/etc/meshview` | Configuration files | Yes |
| `/var/lib/meshview` | Database storage | Recommended |
| `/var/log/meshview` | Log files | Optional |
### Configuration Volume
Mount a directory containing your `config.ini` file:
```bash
-v /path/to/your/config:/etc/meshview
```
If no config is provided, the container will use the default `sample.config.ini`.
### Database Volume
Mount a directory to persist the SQLite database:
```bash
-v /path/to/your/data:/var/lib/meshview
```
**Important:** Without this mount, your database will be lost when the container stops.
### Logs Volume
Mount a directory to access logs from the host:
```bash
-v /path/to/your/logs:/var/log/meshview
```
## Complete Example
Create a directory structure and run:
```bash
# Create directories
mkdir -p meshview/{config,data,logs,backups}
# Copy sample config (first time only)
docker run --rm ghcr.io/pablorevilla-meshtastic/meshview:latest \
cat /etc/meshview/config.ini > meshview/config/config.ini
# Edit config.ini with your MQTT settings
nano meshview/config/config.ini
# Run the container
docker run -d \
--name meshview \
--restart unless-stopped \
-p 8081:8081 \
-v $(pwd)/meshview/config:/etc/meshview \
-v $(pwd)/meshview/data:/var/lib/meshview \
-v $(pwd)/meshview/logs:/var/log/meshview \
ghcr.io/pablorevilla-meshtastic/meshview:latest
```
## Docker Compose
Create a `docker-compose.yml`:
```yaml
version: '3.8'
services:
meshview:
image: ghcr.io/pablorevilla-meshtastic/meshview:latest
container_name: meshview
restart: unless-stopped
ports:
- "8081:8081"
volumes:
- ./config:/etc/meshview
- ./data:/var/lib/meshview
- ./logs:/var/log/meshview
- ./backups:/var/lib/meshview/backups # For database backups
environment:
- TZ=America/Los_Angeles # Set your timezone
```
Run with:
```bash
docker-compose up -d
```
## Configuration
### Minimum Configuration
Edit your `config.ini` to configure MQTT connection:
```ini
[mqtt]
server = mqtt.meshtastic.org
topics = ["msh/US/#"]
port = 1883
username =
password =
[database]
# Four slashes after the scheme make the path absolute inside the container
connection_string = sqlite+aiosqlite:////var/lib/meshview/packets.db
```
### Database Backups
To enable automatic daily backups inside the container:
```ini
[cleanup]
backup_enabled = True
backup_dir = /var/lib/meshview/backups
backup_hour = 2
backup_minute = 00
```
Then mount the backups directory:
```bash
-v $(pwd)/meshview/backups:/var/lib/meshview/backups
```
## Available Tags
| Tag | Description |
|-----|-------------|
| `latest` | Latest build from the main branch |
| `dev-v3` | Development branch |
| `v1.2.3` | Specific version tags |
## Updating
Pull the latest image and recreate the container (a plain `docker restart` would reuse the old image):
```bash
docker pull ghcr.io/pablorevilla-meshtastic/meshview:latest
docker stop meshview
docker rm meshview
# then repeat the `docker run` command from the Quick Start
```
Or with docker-compose:
```bash
docker-compose pull
docker-compose up -d
```
## Logs
View container logs:
```bash
docker logs meshview
# Follow logs
docker logs -f meshview
# Last 100 lines
docker logs --tail 100 meshview
```
## Troubleshooting
### Container won't start
Check logs:
```bash
docker logs meshview
```
### Database permission issues
Ensure the data directory is writable by the container user (`app`, UID 10001):
```bash
sudo chown -R 10001:10001 meshview/data
```
### Can't connect to MQTT
1. Check your MQTT configuration in `config.ini`
2. Verify name resolution from the container (the slim image does not ship `ping`):
```bash
docker exec meshview python -c "import socket; print(socket.gethostbyname('mqtt.meshtastic.org'))"
```
### Port already in use
Change the host port (left side):
```bash
-p 8082:8081
```
Then access at: http://localhost:8082
## Building Your Own Image
If you want to build from source:
```bash
git clone https://github.com/pablorevilla-meshtastic/meshview.git
cd meshview
docker build -f Containerfile -t meshview:local .
```
## Security Notes
- The container runs as a non-root user (`app`, UID 10001)
- No privileged access required
- Only port 8081 is exposed
- All data stored in mounted volumes
## Support
- GitHub Issues: https://github.com/pablorevilla-meshtastic/meshview/issues
- Documentation: https://github.com/pablorevilla-meshtastic/meshview

README.md (modified)
# Meshview
![Start Page](screenshots/animated.gif)
The project serves as a real-time monitoring and diagnostic tool for the Meshtastic mesh network. It provides detailed insights into network activity, including message traffic, node positions, and telemetry data.
### Version 3.0.0 update - November 2025
**Major Infrastructure Improvements:**
* **Database Migrations**: Alembic integration for safe schema upgrades and database versioning
* **Automated Backups**: Independent database backup system with gzip compression (separate from cleanup)
* **Development Tools**: Quick setup script (`setup-dev.sh`) with pre-commit hooks for code quality
* **Docker Support**: Pre-built containers now available on GitHub Container Registry with automatic builds - ogarcia
**New Features:**
* **Traceroute Return Path**: Log and display return path data for traceroute packets - jschrempp
* **Microsecond Timestamps**: Added `import_time_us` columns for higher precision time tracking
**Technical Improvements:**
* Migration from manual SQL to Alembic-managed schema
* Container images use `uv` for faster dependency installation
* Python 3.13 support with slim Debian-based images
* Documentation collection in `docs/` directory
* API routes moved to separate modules for better organization
* /version and /health endpoints added for monitoring
See [README-Docker.md](README-Docker.md) for container deployment and [docs/](docs/) for technical documentation.
### Version 2.0.7 update - September 2025
* New database maintenance capability to automatically keep a specific number of days of data.
* Added configuration for update intervals for both the Live Map and the Firehose pages.
### Version 2.0.6 update - August 2025
* New Live Map (Shows packet feed live)
* New API /api/config (See API documentation)
* New API /api/edges (See API documentation)
* Adds edges to the map (click to see traceroute and neighbours)
### Version 2.0.4 update - August 2025
* New statistic page with more data.
* New API /api/stats (See API documentation).
Samples of currently running instances:
- https://meshview.bayme.sh (SF Bay Area)
- https://www.svme.sh (Sacramento Valley)
- https://meshview.nyme.sh (New York)
- https://meshview.socalmesh.org (LA Area)
- https://map.wpamesh.net (Western Pennsylvania)
- https://meshview.chicagolandmesh.org (Chicago)
- https://meshview.mt.gt (Canadaverse)
- https://meshview.meshtastic.es (Spain)
- https://socalmesh.w4hac.com (Southern California)
- https://view.mtnme.sh (North Georgia / East Tennessee)
- https://meshview.lsinfra.de (Hessen - Germany)
- https://map.nswmesh.au (Sydney - Australia)
- https://meshview.pvmesh.org (Pioneer Valley, Massachusetts)
- https://meshview.louisianamesh.org (Louisiana)
- https://meshview.meshcolombia.co (Colombia)
- https://meshview-salzburg.jmt.gr (Salzburg / Austria)
---
## Installing
### Using Docker (Recommended)
The easiest way to run MeshView is using Docker. Pre-built images are available from GitHub Container Registry.
See **[README-Docker.md](README-Docker.md)** for complete Docker installation and usage instructions.
### Manual Installation
Requires **`python3.13`** or above.
Clone the repo from GitHub:
```bash
git clone https://github.com/pablorevilla-meshtastic/meshview.git
cd meshview
```
#### Quick Setup (Recommended)
Run the development setup script:
```bash
./setup-dev.sh
```
This will:
- Create Python virtual environment
- Install all requirements
- Install development tools (pre-commit, pytest)
- Set up pre-commit hooks for code formatting
- Create config.ini from sample
#### Manual Setup
Create a Python virtual environment:
```bash
cd meshview
python3 -m venv env
```
Install the environment requirements:
./env/bin/pip install -r requirements.txt
```
Install `graphviz` on macOS (`brew install graphviz`) or Debian/Ubuntu Linux:
```bash
sudo apt-get install graphviz
title = Bay Area Mesh
# A brief message shown on the homepage.
message = Real time data from around the bay area and beyond.
# Starting URL when loading the index page.
starting = /chat
# Enable or disable site features (as strings: "True" or "False").
nodes = True
conversations = True
map = True
top = True
# Map boundaries (used for the map view).
# Defaults will show the San Francisco Bay Area
map_top_left_lat = 39
map_top_left_lon = -123
map_bottom_right_lat = 36
map_bottom_right_lon = -121
# Update intervals in seconds; zero or a negative number disables updates.
# The default is 3 seconds.
map_interval = 3
firehose_interval = 3
# Weekly net details
weekly_net_message = Weekly Mesh check-in. We will keep it open on every Wednesday from 5:00pm for checkins. The message format should be (LONG NAME) - (CITY YOU ARE IN) #BayMeshNet.
net_tag = #BayMeshNet
# -------------------------
# MQTT Broker Configuration
# -------------------------
server = mqtt.bayme.sh
# Topics to subscribe to (as JSON-like list, but still a string).
topics = ["msh/US/bayarea/#", "msh/US/CA/mrymesh/#", "msh/US/CA/sacvalley"]
# Port used by MQTT (typically 1883 for unencrypted).
port = 1883
password = large4cats
[database]
# SQLAlchemy connection string. This one uses SQLite with asyncio support.
connection_string = sqlite+aiosqlite:///packets.db
# -------------------------
# Database Cleanup Configuration
# -------------------------
[cleanup]
# Enable or disable daily cleanup
enabled = False
# Number of days to keep records in the database
days_to_keep = 14
# Time to run daily cleanup (24-hour format)
hour = 2
minute = 00
# Run VACUUM after cleanup
vacuum = False
# -------------------------
# Logging Configuration
# -------------------------
[logging]
# Enable or disable HTTP access logs from the web server
# When disabled, request logs like "GET /api/chat" will not appear
# Application logs (errors, startup messages, etc.) are unaffected
# Set to True to enable, False to disable (default: False)
access_log = False
# Database cleanup logfile location
db_cleanup_logfile = dbcleanup.log
```
---
## Running Meshview
Start the database manager:
```bash
./env/bin/python startdb.py
Open in your browser: http://localhost:8081/
## Running Meshview with `mvrun.py`
- `mvrun.py` starts both `startdb.py` and `main.py` in separate threads and merges the output.
- It accepts several command-line arguments for flexible deployment.
```bash
./env/bin/python mvrun.py
```
**Command-line options:**
- `--config CONFIG` - Path to the configuration file (default: `config.ini`)
- `--pid_dir PID_DIR` - Directory for PID files (default: `.`)
- `--py_exec PY_EXEC` - Path to the Python executable (default: `./env/bin/python`)
**Examples:**
```bash
# Use a specific config file
./env/bin/python mvrun.py --config /etc/meshview/config.ini
# Store PID files in a specific directory
./env/bin/python mvrun.py --pid_dir /var/run/meshview
# Use a different Python executable
./env/bin/python mvrun.py --py_exec /usr/bin/python3
```
---
## Setting Up Systemd Services (Ubuntu)
```bash
sudo systemctl daemon-reload
```
## 5. Database Maintenance
### Database maintenance can now be done via the script itself. Here is the relevant section from the configuration file:
- Simple to set up
- It will not drop any packets
```ini
# -------------------------
# Database Cleanup Configuration
# -------------------------
[cleanup]
# Enable or disable daily cleanup
enabled = False
# Number of days to keep records in the database
days_to_keep = 14
# Time to run daily cleanup (24-hour format)
hour = 2
minute = 00
# Run VACUUM after cleanup
vacuum = False
# -------------------------
# Logging Configuration
# -------------------------
[logging]
# Enable or disable HTTP access logs from the web server
access_log = False
# Database cleanup logfile location
db_cleanup_logfile = dbcleanup.log
```
Once the changes are saved, restart the script for them to take effect.
### Alternatively, cleanup can be scheduled via your OS
- Create and save the bash script below (modify /path/to/file/ to the correct path).
- Name it `cleanup.sh`.
- Make it executable.
Add the schedule to the bottom of the file (modify /path/to/file/ to the correct path).
Check the log file to verify that the script ran at the scheduled time.
---
## Testing
MeshView includes a test suite using pytest. For detailed testing documentation, see [README-testing.md](README-testing.md).
Quick start:
```bash
./env/bin/pytest tests/test_api_simple.py -v
```
---
## Technical Documentation
For more detailed technical documentation including database migrations, architecture details, and advanced topics, see the [docs/](docs/) directory.

alembic.ini (new file)
# A generic, single database configuration.
[alembic]
# path to migration scripts
# Use forward slashes (/) also on windows to provide an os agnostic path
script_location = alembic
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can be installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =
# max length of characters to apply to the "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions
# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
# version_path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
version_path_separator = os
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
# sqlalchemy.url will be set programmatically from meshview config
# sqlalchemy.url = driver://user:pass@localhost/dbname
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME
# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = --fix REVISION_SCRIPT_FILENAME
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = INFO
handlers = console
qualname =
[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(asctime)s %(filename)s:%(lineno)d [pid:%(process)d] %(levelname)s - %(message)s
datefmt = %Y-%m-%d %H:%M:%S

alembic/README (new file)
Generic single-database configuration.

alembic/env.py (new file)
import asyncio
from logging.config import fileConfig
from sqlalchemy import pool
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config
from alembic import context
# Import models metadata for autogenerate support
from meshview.models import Base
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
# Use disable_existing_loggers=False to preserve app logging configuration
if config.config_file_name is not None:
fileConfig(config.config_file_name, disable_existing_loggers=False)
# Add your model's MetaData object here for 'autogenerate' support
target_metadata = Base.metadata
def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)
with context.begin_transaction():
context.run_migrations()
def do_run_migrations(connection: Connection) -> None:
"""Run migrations with the given connection."""
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
async def run_async_migrations() -> None:
"""Run migrations in async mode."""
# Get configuration section
configuration = config.get_section(config.config_ini_section, {})
# If sqlalchemy.url is not set in alembic.ini, try to get it from meshview config
if "sqlalchemy.url" not in configuration:
try:
from meshview.config import CONFIG
configuration["sqlalchemy.url"] = CONFIG["database"]["connection_string"]
except Exception:
# Fallback to a default for initial migration creation
configuration["sqlalchemy.url"] = "sqlite+aiosqlite:///packets.db"
connectable = async_engine_from_config(
configuration,
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
async with connectable.connect() as connection:
await connection.run_sync(do_run_migrations)
await connectable.dispose()
def run_migrations_online() -> None:
"""Run migrations in 'online' mode with async support."""
try:
# Event loop is already running, schedule and run the coroutine
import concurrent.futures
with concurrent.futures.ThreadPoolExecutor() as pool:
pool.submit(lambda: asyncio.run(run_async_migrations())).result()
except RuntimeError:
# No event loop running, create one
asyncio.run(run_async_migrations())
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()

alembic/script.py.mako (new file)
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
${upgrades if upgrades else "pass"}
def downgrade() -> None:
${downgrades if downgrades else "pass"}

(new migration file)
"""Add example table
Revision ID: 1717fa5c6545
Revises: add_time_us_cols
Create Date: 2025-10-26 20:59:04.347066
"""
from collections.abc import Sequence
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = '1717fa5c6545'
down_revision: str | None = 'add_time_us_cols'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade() -> None:
"""Create example table with sample columns."""
op.create_table(
'example',
sa.Column('id', sa.Integer(), nullable=False, primary_key=True, autoincrement=True),
sa.Column('name', sa.String(length=100), nullable=False),
sa.Column('description', sa.Text(), nullable=True),
sa.Column('value', sa.Float(), nullable=True),
sa.Column('is_active', sa.Boolean(), nullable=False, server_default='1'),
sa.Column(
'created_at', sa.DateTime(), nullable=False, server_default=sa.text('CURRENT_TIMESTAMP')
),
sa.Column('updated_at', sa.DateTime(), nullable=True),
sa.PrimaryKeyConstraint('id'),
)
# Create an index on the name column for faster lookups
op.create_index('idx_example_name', 'example', ['name'])
def downgrade() -> None:
"""Remove example table."""
op.drop_index('idx_example_name', table_name='example')
op.drop_table('example')

(new migration file)
"""Add first_seen_us and last_seen_us to node table
Revision ID: 2b5a61bb2b75
Revises: ac311b3782a1
Create Date: 2025-11-05 15:19:13.446724
"""
from collections.abc import Sequence
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = '2b5a61bb2b75'
down_revision: str | None = 'ac311b3782a1'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade() -> None:
# Add microsecond epoch timestamp columns for first and last seen times
op.add_column('node', sa.Column('first_seen_us', sa.BigInteger(), nullable=True))
op.add_column('node', sa.Column('last_seen_us', sa.BigInteger(), nullable=True))
op.create_index('idx_node_first_seen_us', 'node', ['first_seen_us'], unique=False)
op.create_index('idx_node_last_seen_us', 'node', ['last_seen_us'], unique=False)
def downgrade() -> None:
# Remove the microsecond epoch timestamp columns and their indexes
op.drop_index('idx_node_last_seen_us', table_name='node')
op.drop_index('idx_node_first_seen_us', table_name='node')
op.drop_column('node', 'last_seen_us')
op.drop_column('node', 'first_seen_us')


@@ -0,0 +1,31 @@
"""add route_return to traceroute
Revision ID: ac311b3782a1
Revises: 1717fa5c6545
Create Date: 2025-11-04 20:28:33.174137
"""
from collections.abc import Sequence
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = 'ac311b3782a1'
down_revision: str | None = '1717fa5c6545'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade() -> None:
# Add route_return column to traceroute table
with op.batch_alter_table('traceroute', schema=None) as batch_op:
batch_op.add_column(sa.Column('route_return', sa.LargeBinary(), nullable=True))
def downgrade() -> None:
# Remove route_return column from traceroute table
with op.batch_alter_table('traceroute', schema=None) as batch_op:
batch_op.drop_column('route_return')


@@ -0,0 +1,74 @@
"""add import_time_us columns
Revision ID: add_time_us_cols
Revises: c88468b7ab0b
Create Date: 2025-11-03 14:10:00.000000
"""
from collections.abc import Sequence
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = 'add_time_us_cols'
down_revision: str | None = 'c88468b7ab0b'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade() -> None:
# Check if columns already exist, add them if they don't
conn = op.get_bind()
inspector = sa.inspect(conn)
# Add import_time_us to packet table
packet_columns = [col['name'] for col in inspector.get_columns('packet')]
if 'import_time_us' not in packet_columns:
with op.batch_alter_table('packet', schema=None) as batch_op:
batch_op.add_column(sa.Column('import_time_us', sa.BigInteger(), nullable=True))
op.create_index(
'idx_packet_import_time_us', 'packet', [sa.text('import_time_us DESC')], unique=False
)
op.create_index(
'idx_packet_from_node_time_us',
'packet',
['from_node_id', sa.text('import_time_us DESC')],
unique=False,
)
# Add import_time_us to packet_seen table
packet_seen_columns = [col['name'] for col in inspector.get_columns('packet_seen')]
if 'import_time_us' not in packet_seen_columns:
with op.batch_alter_table('packet_seen', schema=None) as batch_op:
batch_op.add_column(sa.Column('import_time_us', sa.BigInteger(), nullable=True))
op.create_index(
'idx_packet_seen_import_time_us', 'packet_seen', ['import_time_us'], unique=False
)
# Add import_time_us to traceroute table
traceroute_columns = [col['name'] for col in inspector.get_columns('traceroute')]
if 'import_time_us' not in traceroute_columns:
with op.batch_alter_table('traceroute', schema=None) as batch_op:
batch_op.add_column(sa.Column('import_time_us', sa.BigInteger(), nullable=True))
op.create_index(
'idx_traceroute_import_time_us', 'traceroute', ['import_time_us'], unique=False
)
def downgrade() -> None:
# Drop indexes and columns
op.drop_index('idx_traceroute_import_time_us', table_name='traceroute')
with op.batch_alter_table('traceroute', schema=None) as batch_op:
batch_op.drop_column('import_time_us')
op.drop_index('idx_packet_seen_import_time_us', table_name='packet_seen')
with op.batch_alter_table('packet_seen', schema=None) as batch_op:
batch_op.drop_column('import_time_us')
op.drop_index('idx_packet_from_node_time_us', table_name='packet')
op.drop_index('idx_packet_import_time_us', table_name='packet')
with op.batch_alter_table('packet', schema=None) as batch_op:
batch_op.drop_column('import_time_us')


@@ -0,0 +1,160 @@
"""Initial migration
Revision ID: c88468b7ab0b
Revises:
Create Date: 2025-10-26 20:56:50.285200
"""
from collections.abc import Sequence
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = 'c88468b7ab0b'
down_revision: str | None = None
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
# Get connection and inspector to check what exists
conn = op.get_bind()
inspector = sa.inspect(conn)
existing_tables = inspector.get_table_names()
# Create node table if it doesn't exist
if 'node' not in existing_tables:
op.create_table(
'node',
sa.Column('id', sa.String(), nullable=False),
sa.Column('node_id', sa.BigInteger(), nullable=True),
sa.Column('long_name', sa.String(), nullable=True),
sa.Column('short_name', sa.String(), nullable=True),
sa.Column('hw_model', sa.String(), nullable=True),
sa.Column('firmware', sa.String(), nullable=True),
sa.Column('role', sa.String(), nullable=True),
sa.Column('last_lat', sa.BigInteger(), nullable=True),
sa.Column('last_long', sa.BigInteger(), nullable=True),
sa.Column('channel', sa.String(), nullable=True),
sa.Column('last_update', sa.DateTime(), nullable=True),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('node_id'),
)
op.create_index('idx_node_node_id', 'node', ['node_id'], unique=False)
# Create packet table if it doesn't exist
if 'packet' not in existing_tables:
op.create_table(
'packet',
sa.Column('id', sa.BigInteger(), nullable=False),
sa.Column('portnum', sa.Integer(), nullable=True),
sa.Column('from_node_id', sa.BigInteger(), nullable=True),
sa.Column('to_node_id', sa.BigInteger(), nullable=True),
sa.Column('payload', sa.LargeBinary(), nullable=True),
sa.Column('import_time', sa.DateTime(), nullable=True),
sa.Column('import_time_us', sa.BigInteger(), nullable=True),
sa.Column('channel', sa.String(), nullable=True),
sa.PrimaryKeyConstraint('id'),
)
op.create_index('idx_packet_from_node_id', 'packet', ['from_node_id'], unique=False)
op.create_index('idx_packet_to_node_id', 'packet', ['to_node_id'], unique=False)
op.create_index(
'idx_packet_import_time', 'packet', [sa.text('import_time DESC')], unique=False
)
op.create_index(
'idx_packet_import_time_us', 'packet', [sa.text('import_time_us DESC')], unique=False
)
op.create_index(
'idx_packet_from_node_time',
'packet',
['from_node_id', sa.text('import_time DESC')],
unique=False,
)
op.create_index(
'idx_packet_from_node_time_us',
'packet',
['from_node_id', sa.text('import_time_us DESC')],
unique=False,
)
# Create packet_seen table if it doesn't exist
if 'packet_seen' not in existing_tables:
op.create_table(
'packet_seen',
sa.Column('packet_id', sa.BigInteger(), nullable=False),
sa.Column('node_id', sa.BigInteger(), nullable=False),
sa.Column('rx_time', sa.BigInteger(), nullable=False),
sa.Column('hop_limit', sa.Integer(), nullable=True),
sa.Column('hop_start', sa.Integer(), nullable=True),
sa.Column('channel', sa.String(), nullable=True),
sa.Column('rx_snr', sa.Float(), nullable=True),
sa.Column('rx_rssi', sa.Integer(), nullable=True),
sa.Column('topic', sa.String(), nullable=True),
sa.Column('import_time', sa.DateTime(), nullable=True),
sa.Column('import_time_us', sa.BigInteger(), nullable=True),
sa.ForeignKeyConstraint(
['packet_id'],
['packet.id'],
),
sa.PrimaryKeyConstraint('packet_id', 'node_id', 'rx_time'),
)
op.create_index('idx_packet_seen_node_id', 'packet_seen', ['node_id'], unique=False)
op.create_index('idx_packet_seen_packet_id', 'packet_seen', ['packet_id'], unique=False)
op.create_index(
'idx_packet_seen_import_time_us', 'packet_seen', ['import_time_us'], unique=False
)
# Create traceroute table if it doesn't exist
if 'traceroute' not in existing_tables:
op.create_table(
'traceroute',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('packet_id', sa.BigInteger(), nullable=True),
sa.Column('gateway_node_id', sa.BigInteger(), nullable=True),
sa.Column('done', sa.Boolean(), nullable=True),
sa.Column('route', sa.LargeBinary(), nullable=True),
sa.Column('import_time', sa.DateTime(), nullable=True),
sa.Column('import_time_us', sa.BigInteger(), nullable=True),
sa.ForeignKeyConstraint(
['packet_id'],
['packet.id'],
),
sa.PrimaryKeyConstraint('id'),
)
op.create_index('idx_traceroute_import_time', 'traceroute', ['import_time'], unique=False)
op.create_index(
'idx_traceroute_import_time_us', 'traceroute', ['import_time_us'], unique=False
)
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
# Drop traceroute table and indexes
op.drop_index('idx_traceroute_import_time_us', table_name='traceroute')
op.drop_index('idx_traceroute_import_time', table_name='traceroute')
op.drop_table('traceroute')
# Drop packet_seen table and indexes
op.drop_index('idx_packet_seen_import_time_us', table_name='packet_seen')
op.drop_index('idx_packet_seen_packet_id', table_name='packet_seen')
op.drop_index('idx_packet_seen_node_id', table_name='packet_seen')
op.drop_table('packet_seen')
# Drop packet table and indexes
op.drop_index('idx_packet_from_node_time_us', table_name='packet')
op.drop_index('idx_packet_from_node_time', table_name='packet')
op.drop_index('idx_packet_import_time_us', table_name='packet')
op.drop_index('idx_packet_import_time', table_name='packet')
op.drop_index('idx_packet_to_node_id', table_name='packet')
op.drop_index('idx_packet_from_node_id', table_name='packet')
op.drop_table('packet')
# Drop node table and indexes
op.drop_index('idx_node_node_id', table_name='node')
op.drop_table('node')
# ### end Alembic commands ###

container/build-container.sh Executable file

@@ -0,0 +1,57 @@
#!/bin/sh
#
# build-container.sh
#
# Script to build MeshView container images
set -e
# Default values
IMAGE_NAME="meshview"
TAG="latest"
CONTAINERFILE="Containerfile"
# Parse arguments
while [ $# -gt 0 ]; do
case "$1" in
--tag|-t)
TAG="$2"
shift 2
;;
--name|-n)
IMAGE_NAME="$2"
shift 2
;;
--file|-f)
CONTAINERFILE="$2"
shift 2
;;
--help|-h)
echo "Usage: $0 [OPTIONS]"
echo ""
echo "Options:"
echo " -t, --tag TAG Tag for the image (default: latest)"
echo " -n, --name NAME Image name (default: meshview)"
echo " -f, --file FILE Containerfile path (default: Containerfile)"
echo " -h, --help Show this help"
exit 0
;;
*)
echo "Unknown option: $1"
echo "Use --help for usage information"
exit 1
;;
esac
done
echo "Building MeshView container image..."
echo " Image: ${IMAGE_NAME}:${TAG}"
echo " Containerfile: ${CONTAINERFILE}"
echo ""
# Build the container
docker build -f "${CONTAINERFILE}" -t "${IMAGE_NAME}:${TAG}" .
echo ""
echo "Build complete!"
echo "Run with: docker run --rm -p 8081:8081 ${IMAGE_NAME}:${TAG}"

container/config.patch Normal file

@@ -0,0 +1,37 @@
diff --git a/sample.config.ini b/sample.config.ini
index 0e64980..494685c 100644
--- a/sample.config.ini
+++ b/sample.config.ini
@@ -3,7 +3,7 @@
# -------------------------
[server]
# The address to bind the server to. Use * to listen on all interfaces.
-bind = *
+bind = 0.0.0.0
# Port to run the web server on.
port = 8081
@@ -64,7 +64,7 @@ net_tag = #BayMeshNet
# -------------------------
[mqtt]
# MQTT server hostname or IP.
-server = mqtt.bayme.sh
+server = mqtt.meshtastic.org
# Topics to subscribe to (as JSON-like list, but still a string).
topics = ["msh/US/bayarea/#", "msh/US/CA/mrymesh/#", "msh/US/CA/sacvalley"]
@@ -82,7 +82,7 @@ password = large4cats
# -------------------------
[database]
# SQLAlchemy connection string. This one uses SQLite with asyncio support.
-connection_string = sqlite+aiosqlite:///packets.db
+connection_string = sqlite+aiosqlite:////var/lib/meshview/packets.db
# -------------------------
@@ -110,4 +110,4 @@ vacuum = False
# Set to True to enable, False to disable (default: False)
access_log = False
# Database cleanup logfile
-db_cleanup_logfile = dbcleanup.log
+db_cleanup_logfile = /var/log/meshview/dbcleanup.log

create_example_migration.py Executable file

@@ -0,0 +1,52 @@
#!/usr/bin/env python3
"""
Script to create a blank migration for manual editing.
Usage:
./env/bin/python create_example_migration.py
This creates an empty migration file that you can manually edit to add
custom migration logic (data migrations, complex schema changes, etc.)
Unlike create_migration.py which auto-generates from model changes,
this creates a blank template for you to fill in.
"""
import os
import sys
# Add current directory to path
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
from alembic.config import Config
from alembic import command
# Create Alembic config
alembic_cfg = Config("alembic.ini")
# Set database URL from meshview config
try:
from meshview.config import CONFIG
database_url = CONFIG["database"]["connection_string"]
alembic_cfg.set_main_option("sqlalchemy.url", database_url)
print(f"Using database URL from config: {database_url}")
except Exception as e:
print(f"Warning: Could not load meshview config: {e}")
print("Using default database URL")
alembic_cfg.set_main_option("sqlalchemy.url", "sqlite+aiosqlite:///packets.db")
# Generate blank migration
try:
print("Creating blank migration for manual editing...")
command.revision(alembic_cfg, autogenerate=False, message="Manual migration")
print("✓ Successfully created blank migration!")
print("\nNow edit the generated file in alembic/versions/")
print("Add your custom upgrade() and downgrade() logic")
except Exception as e:
print(f"✗ Error creating migration: {e}")
import traceback
traceback.print_exc()
sys.exit(1)

create_migration.py Executable file

@@ -0,0 +1,58 @@
#!/usr/bin/env python3
"""
Helper script to create Alembic migrations from SQLAlchemy model changes.
Usage:
./env/bin/python create_migration.py
This will:
1. Load your current models from meshview/models.py
2. Compare them to the current database schema
3. Auto-generate a migration with the detected changes
4. Save the migration to alembic/versions/
After running this, review the generated migration file before committing!
"""
import os
import sys
# Add current directory to path
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
from alembic.config import Config
from alembic import command
# Create Alembic config
alembic_cfg = Config("alembic.ini")
# Set database URL from meshview config
try:
from meshview.config import CONFIG
database_url = CONFIG["database"]["connection_string"]
alembic_cfg.set_main_option("sqlalchemy.url", database_url)
print(f"Using database URL from config: {database_url}")
except Exception as e:
print(f"Warning: Could not load meshview config: {e}")
print("Using default database URL")
alembic_cfg.set_main_option("sqlalchemy.url", "sqlite+aiosqlite:///packets.db")
# Generate migration
try:
print("\nComparing models to current database schema...")
print("Generating migration...\n")
command.revision(alembic_cfg, autogenerate=True, message="Auto-generated migration")
print("\n✓ Successfully created migration!")
print("\nNext steps:")
print("1. Review the generated file in alembic/versions/")
print("2. Edit the migration message/logic if needed")
print("3. Test the migration: ./env/bin/alembic upgrade head")
print("4. Commit the migration file to version control")
except Exception as e:
print(f"\n✗ Error creating migration: {e}")
import traceback
traceback.print_exc()
sys.exit(1)


@@ -1,28 +0,0 @@
FROM python:3.12-slim
# Set work directory
WORKDIR /app
# Install system dependencies (graphviz required, git for cloning)
RUN apt-get update && \
apt-get install -y --no-install-recommends git graphviz && \
rm -rf /var/lib/apt/lists/*
# Clone the repo with submodules
RUN git clone --recurse-submodules https://github.com/pablorevilla-meshtastic/meshview.git /app
# Create virtual environment
RUN python -m venv /app/env
# Upgrade pip and install requirements in venv
RUN /app/env/bin/pip install --no-cache-dir --upgrade pip && \
/app/env/bin/pip install --no-cache-dir -r /app/requirements.txt
# Copy sample config
RUN cp /app/sample.config.ini /app/config.ini
# Expose port
EXPOSE 8081
# Run the app via venv
CMD ["/app/env/bin/python", "/app/mvrun.py"]


@@ -1,44 +1,36 @@
# MeshView Docker Container
This Dockerfile builds a containerized version of the [MeshView](https://github.com/pablorevilla-meshtastic/meshview) application. It uses a lightweight Python environment and sets up the required virtual environment as expected by the application.
> **Note:** This directory contains legacy Docker build files.
>
> **For current Docker usage instructions, please see [README-Docker.md](../README-Docker.md) in the project root.**
## Image Details
## Current Approach
- **Base Image**: `python:3.12-slim`
- **Working Directory**: `/app`
- **Python Virtual Environment**: `/app/env`
Pre-built container images are automatically built and published to GitHub Container Registry:
```bash
docker pull ghcr.io/pablorevilla-meshtastic/meshview:latest
```
See **[README-Docker.md](../README-Docker.md)** for:
- Quick start instructions
- Volume mount configuration
- Docker Compose examples
- Backup configuration
- Troubleshooting
## Legacy Build (Not Recommended)
If you need to build your own image for development:
```bash
# From project root
docker build -f Containerfile -t meshview:local .
```
The current Containerfile uses:
- **Base Image**: `python:3.13-slim` (Debian-based)
- **Build tool**: `uv` for fast dependency installation
- **User**: Non-root user `app` (UID 10001)
- **Exposed Port**: `8081`
## Build Instructions
Build the Docker image:
```bash
docker build -t meshview-docker .
```
## Run Instructions
Run the container:
```bash
docker run -d --name meshview-docker -p 8081:8081 meshview-docker
```
This maps container port `8081` to your host. The application runs via:
```bash
/app/env/bin/python /app/mvrun.py
```
## Web Interface
Once the container is running, you can access the MeshView web interface by visiting:
http://localhost:8081
If running on a remote server, replace `localhost` with the host's IP or domain name:
http://<host>:8081
Ensure that port `8081` is open and not blocked by a firewall or security group.
- **Volumes**: `/etc/meshview`, `/var/lib/meshview`, `/var/log/meshview`

docs/ALEMBIC_SETUP.md Normal file

@@ -0,0 +1,361 @@
# Alembic Database Migration Setup
This document describes the automatic database migration system implemented for MeshView using Alembic.
## Overview
The system provides automatic database schema migrations with coordination between the writer app (startdb.py) and reader app (web.py):
- **Writer App**: Automatically runs pending migrations on startup
- **Reader App**: Waits for migrations to complete before starting
## Architecture
### Key Components
1. **`meshview/migrations.py`** - Migration management utilities
- `run_migrations()` - Runs pending migrations (writer app)
- `wait_for_migrations()` - Waits for schema to be current (reader app)
- `is_database_up_to_date()` - Checks schema version
- Migration status tracking table
2. **`alembic/`** - Alembic migration directory
- `env.py` - Configured for async SQLAlchemy support
- `versions/` - Migration scripts directory
- `alembic.ini` - Alembic configuration
3. **Modified Apps**:
- `startdb.py` - Writer app that runs migrations before MQTT ingestion
- `meshview/web.py` - Reader app that waits for schema updates
## How It Works - Automatic In-Place Updates
### ✨ Fully Automatic Operation
**No manual migration commands needed!** The database schema updates automatically when you:
1. Deploy new code with migration files
2. Restart the applications
### Writer App (startdb.py) Startup Sequence
1. Initialize database connection
2. Create migration status tracking table
3. Set "migration in progress" flag
4. **🔄 Automatically run any pending Alembic migrations** (synchronously)
- Detects current schema version
- Compares to latest available migration
- Runs all pending migrations in sequence
- Updates database schema in place
5. Clear "migration in progress" flag
6. Start MQTT ingestion and other tasks
### Reader App (web.py) Startup Sequence
1. Initialize database connection
2. **Check database schema version**
3. If not up to date:
- Wait up to 60 seconds (30 retries × 2 seconds)
- Check every 2 seconds for schema updates
- Automatically proceeds once writer completes migrations
4. Once schema is current, start web server
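The reader's retry behavior can be sketched as a small polling helper (a minimal sketch for illustration; the actual logic lives in `meshview/migrations.py`):

```python
import time

def wait_for_schema(is_up_to_date, retries=30, delay=2.0):
    # Poll the schema check until it passes or the window expires
    # (30 retries x 2 seconds = the 60 second window described above).
    for _ in range(retries):
        if is_up_to_date():
            return True
        time.sleep(delay)
    return False

# Simulate a writer that finishes its migrations on the third check.
checks = iter([False, False, True])
print(wait_for_schema(lambda: next(checks), retries=5, delay=0.0))  # True
```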
### 🎯 Key Point: Zero Manual Steps
When you deploy new code with migrations:
```bash
# Just start the apps - migrations happen automatically!
./env/bin/python startdb.py # Migrations run here automatically
./env/bin/python main.py # Waits for migrations, then starts
```
**The database updates itself!** No need to run `alembic upgrade` manually.
### Coordination
The apps coordinate using:
- **Alembic version table** (`alembic_version`) - Tracks current schema version
- **Migration status table** (`migration_status`) - Optional flag for "in progress" state
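For a quick look at the coordination state, the `alembic_version` table can be queried directly. An illustrative sketch, using an in-memory database as a stand-in for `packets.db`:

```python
import sqlite3

# The table and column names (alembic_version.version_num) are the ones
# Alembic itself maintains; everything else here is just a stand-in.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alembic_version (version_num VARCHAR(32) NOT NULL)")
conn.execute("INSERT INTO alembic_version VALUES ('c88468b7ab0b')")
row = conn.execute("SELECT version_num FROM alembic_version").fetchone()
print(row[0])  # c88468b7ab0b -> the current schema revision
```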
## Creating New Migrations
### Using the helper script:
```bash
./env/bin/python create_migration.py
```
### Manual creation:
```bash
./env/bin/alembic revision --autogenerate -m "Description of changes"
```
This will:
1. Compare current database schema with SQLAlchemy models
2. Generate a migration script in `alembic/versions/`
3. Automatically detect most schema changes
### Manual migration (advanced):
```bash
./env/bin/alembic revision -m "Manual migration"
```
Then edit the generated file to add custom migration logic.
## Running Migrations
### Automatic (Recommended)
Migrations run automatically when the writer app starts:
```bash
./env/bin/python startdb.py
```
### Manual
To run migrations manually:
```bash
./env/bin/alembic upgrade head
```
To downgrade:
```bash
./env/bin/alembic downgrade -1 # Go back one version
./env/bin/alembic downgrade base # Go back to beginning
```
## Checking Migration Status
Check current database version:
```bash
./env/bin/alembic current
```
View migration history:
```bash
./env/bin/alembic history
```
## Benefits
1. **Zero Manual Intervention**: Migrations run automatically on startup
2. **Safe Coordination**: Reader won't connect to incompatible schema
3. **Version Control**: All schema changes tracked in git
4. **Rollback Capability**: Can downgrade if needed
5. **Auto-generation**: Most migrations created automatically from model changes
## Migration Workflow
### Development Process
1. **Modify SQLAlchemy models** in `meshview/models.py`
2. **Create migration**:
```bash
./env/bin/python create_migration.py
```
3. **Review generated migration** in `alembic/versions/`
4. **Test migration**:
- Stop all apps
- Start writer app (migrations run automatically)
- Start reader app (waits for schema to be current)
5. **Commit migration** to version control
### Production Deployment
1. **Deploy new code** with migration scripts
2. **Start writer app** - Migrations run automatically
3. **Start reader app** - Waits for migrations, then starts
4. **Monitor logs** for migration success
## Troubleshooting
### Migration fails
Check logs in writer app for error details. To manually fix:
```bash
./env/bin/alembic current # Check current version
./env/bin/alembic history # View available versions
./env/bin/alembic upgrade head # Try manual upgrade
```
### Reader app won't start (timeout)
Check if writer app is running and has completed migrations:
```bash
./env/bin/alembic current
```
### Reset to clean state
⚠️ **Warning: This will lose all data**
```bash
rm packets.db # Or your database file
./env/bin/alembic upgrade head # Create fresh schema
```
## File Structure
```
meshview/
├── alembic.ini # Alembic configuration
├── alembic/
│ ├── env.py # Async-enabled migration runner
│ ├── script.py.mako # Migration template
│ └── versions/ # Migration scripts
│ └── c88468b7ab0b_initial_migration.py
├── meshview/
│ ├── models.py # SQLAlchemy models (source of truth)
│ ├── migrations.py # Migration utilities
│ ├── mqtt_database.py # Writer database connection
│ └── database.py # Reader database connection
├── startdb.py # Writer app (runs migrations)
├── main.py # Entry point for reader app
└── create_migration.py # Helper script for creating migrations
```
## Configuration
Database URL is read from `config.ini`:
```ini
[database]
connection_string = sqlite+aiosqlite:///packets.db
```
Alembic automatically uses this configuration through `meshview/migrations.py`.
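As a sketch of what that lookup amounts to (the real loading is done by `meshview/config.py`), the connection string can be read with `configparser`:

```python
import configparser

# Minimal sketch: parse the [database] section shown above and pull
# out the SQLAlchemy connection string.
cfg = configparser.ConfigParser()
cfg.read_string("""
[database]
connection_string = sqlite+aiosqlite:///packets.db
""")
url = cfg["database"]["connection_string"]
print(url)  # sqlite+aiosqlite:///packets.db
```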
## Important Notes
1. **Always test migrations** in development before deploying to production
2. **Backup database** before running migrations in production
3. **Check for data loss** - Some migrations may require data migration logic
4. **Coordinate deployments** - Start writer before readers in multi-instance setups
5. **Monitor logs** during first startup after deployment
## Example Migrations
### Example 1: Generated Initial Migration
Here's what an auto-generated migration looks like (from comparing models to database):
```python
"""Initial migration
Revision ID: c88468b7ab0b
Revises:
Create Date: 2025-10-26 20:56:50.285200
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers
revision = 'c88468b7ab0b'
down_revision = None
branch_labels = None
depends_on = None
def upgrade() -> None:
# Upgrade operations
op.create_table('node',
sa.Column('id', sa.String(), nullable=False),
sa.Column('node_id', sa.BigInteger(), nullable=True),
# ... more columns
sa.PrimaryKeyConstraint('id')
)
def downgrade() -> None:
# Downgrade operations
op.drop_table('node')
```
### Example 2: Manual Migration Adding a New Table
We've included an example migration (`1717fa5c6545_add_example_table.py`) that demonstrates how to manually create a new table:
```python
"""Add example table
Revision ID: 1717fa5c6545
Revises: c88468b7ab0b
Create Date: 2025-10-26 20:59:04.347066
"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision: str = '1717fa5c6545'
down_revision: Union[str, None] = 'c88468b7ab0b'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Create example table with sample columns."""
op.create_table(
'example',
sa.Column('id', sa.Integer(), nullable=False, primary_key=True, autoincrement=True),
sa.Column('name', sa.String(length=100), nullable=False),
sa.Column('description', sa.Text(), nullable=True),
sa.Column('value', sa.Float(), nullable=True),
sa.Column('is_active', sa.Boolean(), nullable=False, server_default='1'),
sa.Column('created_at', sa.DateTime(), nullable=False,
server_default=sa.text('CURRENT_TIMESTAMP')),
sa.Column('updated_at', sa.DateTime(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
# Create an index on the name column for faster lookups
op.create_index('idx_example_name', 'example', ['name'])
def downgrade() -> None:
"""Remove example table."""
op.drop_index('idx_example_name', table_name='example')
op.drop_table('example')
```
**Key features demonstrated:**
- Various column types (Integer, String, Text, Float, Boolean, DateTime)
- Primary key with autoincrement
- Nullable and non-nullable columns
- Server defaults (for timestamps and booleans)
- Creating indexes
- Proper downgrade that reverses all changes
**To test this migration:**
```bash
# Apply the migration
./env/bin/alembic upgrade head
# Check it was applied
./env/bin/alembic current
# Verify table was created
sqlite3 packets.db "SELECT sql FROM sqlite_master WHERE type='table' AND name='example';"
# Roll back the migration
./env/bin/alembic downgrade -1
# Verify table was removed
sqlite3 packets.db "SELECT name FROM sqlite_master WHERE type='table' AND name='example';"
```
**To remove this example migration** (after testing):
```bash
# First make sure you're not on this revision
./env/bin/alembic downgrade c88468b7ab0b
# Then delete the migration file
rm alembic/versions/1717fa5c6545_add_example_table.py
```
## References
- [Alembic Documentation](https://alembic.sqlalchemy.org/)
- [SQLAlchemy Documentation](https://docs.sqlalchemy.org/)
- [Async SQLAlchemy](https://docs.sqlalchemy.org/en/20/orm/extensions/asyncio.html)


@@ -111,12 +111,29 @@ Returns a list of packets with optional filters.
---
### Notes
- All timestamps (`import_time`, `last_seen`) are returned in ISO 8601 format.
- `portnum` is an integer representing the packet type.
- `payload` is always a UTF-8 decoded string.
---
## 4. Channels API
### GET `/api/channels`
Returns a list of channels seen in a given time period.
**Query Parameters**
- `period_type` (optional, string): Time granularity (`hour` or `day`). Default: `hour`.
- `length` (optional, int): Number of periods to look back. Default: `24`.
**Response Example**
```json
{
"channels": ["LongFast", "MediumFast", "ShortFast"]
}
```
---
## 5. Statistics API
### GET `/api/stats`
Retrieve packet statistics aggregated by time periods, with optional filtering.
@@ -157,3 +174,171 @@ Retrieve packet statistics aggregated by time periods, with optional filtering.
// more entries...
]
}
```
---
## 6. Edges API
### GET `/api/edges`
Returns network edges (connections between nodes) based on traceroutes and neighbor info.
**Query Parameters**
- `type` (optional, string): Filter by edge type (`traceroute` or `neighbor`). If omitted, returns both types.
**Response Example**
```json
{
"edges": [
{
"from": 12345678,
"to": 87654321,
"type": "traceroute"
},
{
"from": 11111111,
"to": 22222222,
"type": "neighbor"
}
]
}
```
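Client-side, this payload is easy to post-process. A hypothetical sketch (not part of the API) that builds an adjacency list from the example above:

```python
import json
from collections import defaultdict

# Turn the /api/edges response into a from-node -> [to-nodes] mapping.
payload = json.loads("""
{"edges": [
  {"from": 12345678, "to": 87654321, "type": "traceroute"},
  {"from": 11111111, "to": 22222222, "type": "neighbor"}
]}
""")
adjacency = defaultdict(list)
for edge in payload["edges"]:
    adjacency[edge["from"]].append(edge["to"])
print(dict(adjacency))  # {12345678: [87654321], 11111111: [22222222]}
```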
---
## 7. Configuration API
### GET `/api/config`
Returns the current site configuration (safe subset exposed to clients).
**Response Example**
```json
{
"site": {
"domain": "meshview.example.com",
"language": "en",
"title": "Bay Area Mesh",
"message": "Real time data from around the bay area",
"starting": "/chat",
"nodes": "true",
"conversations": "true",
"everything": "true",
"graphs": "true",
"stats": "true",
"net": "true",
"map": "true",
"top": "true",
"map_top_left_lat": 39.0,
"map_top_left_lon": -123.0,
"map_bottom_right_lat": 36.0,
"map_bottom_right_lon": -121.0,
"map_interval": 3,
"firehose_interval": 3,
"weekly_net_message": "Weekly Mesh check-in message.",
"net_tag": "#BayMeshNet",
"version": "2.0.8 ~ 10-22-25"
},
"mqtt": {
"server": "mqtt.bayme.sh",
"topics": ["msh/US/bayarea/#"]
},
"cleanup": {
"enabled": "false",
"days_to_keep": "14",
"hour": "2",
"minute": "0",
"vacuum": "false"
}
}
```
---
## 8. Language/Translations API
### GET `/api/lang`
Returns translation strings for the UI.
**Query Parameters**
- `lang` (optional, string): Language code (e.g., `en`, `es`). Defaults to site language setting.
- `section` (optional, string): Specific section to retrieve translations for.
**Response Example (full)**
```json
{
"chat": {
"title": "Chat",
"send": "Send"
},
"map": {
"title": "Map",
"zoom_in": "Zoom In"
}
}
```
**Response Example (section-specific)**
Request: `/api/lang?section=chat`
```json
{
"title": "Chat",
"send": "Send"
}
```
---
## 9. Health Check API
### GET `/health`
Health check endpoint for monitoring, load balancers, and orchestration systems.
**Response Example (Healthy)**
```json
{
"status": "healthy",
"timestamp": "2025-11-03T14:30:00.123456Z",
"version": "3.0.0",
"git_revision": "6416978",
"database": "connected",
"database_size": "853.03 MB",
"database_size_bytes": 894468096
}
```
**Response Example (Unhealthy)**
Status Code: `503 Service Unavailable`
```json
{
"status": "unhealthy",
"timestamp": "2025-11-03T14:30:00.123456Z",
"version": "2.0.8",
"git_revision": "6416978",
"database": "disconnected"
}
```
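A monitoring probe only needs the `status` and `database` fields. A hypothetical sketch of the pass/fail decision, using the field names from the examples above:

```python
import json

# Classify a /health response body; field names taken from the
# healthy/unhealthy examples above.
def is_healthy(body: str) -> bool:
    doc = json.loads(body)
    return doc.get("status") == "healthy" and doc.get("database") == "connected"

print(is_healthy('{"status": "healthy", "database": "connected"}'))       # True
print(is_healthy('{"status": "unhealthy", "database": "disconnected"}'))  # False
```

Note that the endpoint also signals failure via the `503` status code, so a probe can rely on either.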
---
## 10. Version API
### GET `/version`
Returns detailed version information including semver, release date, and git revision.
**Response Example**
```json
{
"version": "2.0.8",
"release_date": "2025-10-22",
"git_revision": "6416978a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6q",
"git_revision_short": "6416978"
}
```
---
## Notes
- All timestamps (`import_time`, `last_seen`) are returned in ISO 8601 format.
- `portnum` is an integer representing the packet type.
- `payload` is always a UTF-8 decoded string.
- Node IDs are integers (e.g., `12345678`).


@@ -0,0 +1,146 @@
# Database Changes With Alembic
This guide explains how to make database schema changes in MeshView using Alembic migrations.
## Overview
When you need to add, modify, or remove columns from database tables, you must:
1. Update the SQLAlchemy model
2. Create an Alembic migration
3. Let the system automatically apply the migration
## Step-by-Step Process
### 1. Update the Model
Edit `meshview/models.py` to add/modify the column in the appropriate model class:
```python
class Traceroute(Base):
__tablename__ = "traceroute"
id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
# ... existing columns ...
route_return: Mapped[bytes] = mapped_column(nullable=True) # New column
```
### 2. Create an Alembic Migration
Generate a new migration file with a descriptive message:
```bash
./env/bin/alembic revision -m "add route_return to traceroute"
```
This creates a new file in `alembic/versions/` with a unique revision ID.
### 3. Fill in the Migration
Edit the generated migration file to implement the actual database changes:
```python
def upgrade() -> None:
# Add route_return column to traceroute table
with op.batch_alter_table('traceroute', schema=None) as batch_op:
batch_op.add_column(sa.Column('route_return', sa.LargeBinary(), nullable=True))
def downgrade() -> None:
# Remove route_return column from traceroute table
with op.batch_alter_table('traceroute', schema=None) as batch_op:
batch_op.drop_column('route_return')
```
### 4. Migration Runs Automatically
When you restart the application with `mvrun.py`:
1. The writer process (`startdb.py`) starts up
2. It checks if the database schema is up to date
3. If new migrations are pending, it runs them automatically
4. The reader process (web server) waits for migrations to complete before starting
**No manual migration command is needed** - the application handles this automatically on startup.
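The writer/reader hand-off can be pictured with a small status table. The names below are illustrative; MeshView's actual coordination table and schema may differ:

```python
import sqlite3

def writer_mark_migrations_done(conn):
    """Writer process: record that all pending migrations have been applied."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS migration_status "
        "(id INTEGER PRIMARY KEY CHECK (id = 1), complete INTEGER NOT NULL)"
    )
    conn.execute("INSERT OR REPLACE INTO migration_status (id, complete) VALUES (1, 1)")
    conn.commit()

def reader_migrations_done(conn):
    """Reader process: poll this until it returns True, then start serving."""
    try:
        row = conn.execute("SELECT complete FROM migration_status WHERE id = 1").fetchone()
    except sqlite3.OperationalError:  # table not created yet
        return False
    return bool(row and row[0])
```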
### 5. Commit Both Files
Add both files to git:
```bash
git add meshview/models.py
git add alembic/versions/ac311b3782a1_add_route_return_to_traceroute.py
git commit -m "Add route_return column to traceroute table"
```
## Important Notes
### SQLite Compatibility
Always use `batch_alter_table` for SQLite compatibility:
```python
with op.batch_alter_table('table_name', schema=None) as batch_op:
batch_op.add_column(...)
```
SQLite has limited ALTER TABLE support, and `batch_alter_table` works around these limitations.
### Migration Process
- **Writer process** (`startdb.py`): Runs migrations on startup
- **Reader process** (web server in `main.py`): Waits for migrations to complete
- Migrations are checked and applied every time the application starts
- The system uses a migration status table to coordinate between processes
### Common Column Types
```python
# Integer
column: Mapped[int] = mapped_column(BigInteger, nullable=True)
# String
column: Mapped[str] = mapped_column(nullable=True)
# Bytes/Binary
column: Mapped[bytes] = mapped_column(nullable=True)
# DateTime
column: Mapped[datetime] = mapped_column(nullable=True)
# Boolean
column: Mapped[bool] = mapped_column(nullable=True)
# Float
column: Mapped[float] = mapped_column(nullable=True)
```
### Migration File Location
Migrations are stored in: `alembic/versions/`
Each migration file includes:
- Revision ID (unique identifier)
- Down revision (previous migration in chain)
- Create date
- `upgrade()` function (applies changes)
- `downgrade()` function (reverts changes)
## Troubleshooting
### Migration Not Running
If migrations don't run automatically:
1. Check that the database is writable
2. Look for errors in the startup logs
3. Verify the migration chain is correct (each migration references the previous one)
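Step 3 can be automated. A self-contained sketch (not part of MeshView) that walks a map of `revision -> down_revision` and flags a broken or branched chain:

```python
def validate_chain(revisions):
    """Given {revision: down_revision} (None marks the root), return the
    ordered chain from root to head, or raise ValueError if it is broken."""
    children, root = {}, None
    for rev, down in revisions.items():
        if down is None:
            if root is not None:
                raise ValueError("multiple root revisions")
            root = rev
        elif down in children:
            raise ValueError(f"branch at revision {down}")
        else:
            children[down] = rev
    if root is None:
        raise ValueError("no root revision")
    chain = [root]
    while chain[-1] in children:
        chain.append(children[chain[-1]])
    if len(chain) != len(revisions):
        raise ValueError("disconnected revisions in chain")
    return chain
```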
### Manual Migration (Not Recommended)
If you need to manually run migrations for debugging:
```bash
./env/bin/alembic upgrade head
```
However, the application normally handles this automatically.

docs/README.md Normal file

@@ -0,0 +1,14 @@
# Technical Documentation
This directory contains technical documentation for MeshView that goes beyond initial setup and basic usage.
These documents are intended for developers, contributors, and advanced users who need deeper insight into the system's architecture, database migrations, API endpoints, and internal workings.
## Contents
- [ALEMBIC_SETUP.md](ALEMBIC_SETUP.md) - Database migration setup and management
- [TIMESTAMP_MIGRATION.md](TIMESTAMP_MIGRATION.md) - Details on timestamp schema changes
- [API_Documentation.md](API_Documentation.md) - REST API endpoints and usage
- [CODE_IMPROVEMENTS.md](CODE_IMPROVEMENTS.md) - Suggested code improvements and refactoring ideas
For initial setup and basic usage instructions, please see the main [README.md](../README.md) in the root directory.

docs/TIMESTAMP_MIGRATION.md Normal file

@@ -0,0 +1,193 @@
# High-Resolution Timestamp Migration
This document describes the implementation of GitHub issue #55: storing high-resolution timestamps as integers in the database for improved performance and query efficiency.
## Overview
The meshview database now stores timestamps in two formats:
1. **TEXT format** (`import_time`): Human-readable ISO8601 format with microseconds (e.g., `2025-03-12 04:15:56.058038`)
2. **INTEGER format** (`import_time_us`): Microseconds since Unix epoch (1970-01-01 00:00:00 UTC)
The dual format approach provides:
- **Backward compatibility**: Existing `import_time` TEXT columns remain unchanged
- **Performance**: Fast integer comparisons and math operations
- **Precision**: Microsecond resolution for accurate timing
- **Efficiency**: Compact storage and fast indexed lookups
## Database Changes
### New Columns Added
Three tables have new `import_time_us` columns:
1. **packet.import_time_us** (INTEGER)
- Stores when the packet was imported into the database
- Indexed for fast queries
2. **packet_seen.import_time_us** (INTEGER)
- Stores when the packet_seen record was imported
- Indexed for performance
3. **traceroute.import_time_us** (INTEGER)
- Stores when the traceroute was imported
- Indexed for fast lookups
### New Indexes
The following indexes were created for optimal query performance:
```sql
CREATE INDEX idx_packet_import_time_us ON packet(import_time_us DESC);
CREATE INDEX idx_packet_from_node_time_us ON packet(from_node_id, import_time_us DESC);
CREATE INDEX idx_packet_seen_import_time_us ON packet_seen(import_time_us);
CREATE INDEX idx_traceroute_import_time_us ON traceroute(import_time_us);
```
## Migration Process
### For Existing Databases
Run the migration script to add the new columns and populate them from existing data:
```bash
python migrate_add_timestamp_us.py [database_path]
```
If no path is provided, it defaults to `packets.db` in the current directory.
The migration script:
1. Checks if migration is needed (idempotent)
2. Adds `import_time_us` columns to the three tables
3. Populates the new columns from existing `import_time` values
4. Creates indexes for optimal performance
5. Verifies the migration completed successfully
### For New Databases
New databases created with the updated schema will automatically include the `import_time_us` columns. The MQTT store module populates both columns when inserting new records.
## Code Changes
### Models (meshview/models.py)
The ORM models now include the new `import_time_us` fields:
```python
class Packet(Base):
import_time: Mapped[datetime] = mapped_column(nullable=True)
import_time_us: Mapped[int] = mapped_column(BigInteger, nullable=True)
```
### MQTT Store (meshview/mqtt_store.py)
The data ingestion logic now populates both timestamp columns using UTC time:
```python
now = datetime.datetime.now(datetime.timezone.utc)
now_us = int(now.timestamp() * 1_000_000)
# Both columns are populated
import_time=now,
import_time_us=now_us,
```
**Important**: All new timestamps use UTC (Coordinated Universal Time) for consistency across time zones.
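A small helper keeps the two values consistent at every insert point (a sketch; in MeshView the values are computed inline in `mqtt_store.py` as shown above):

```python
import datetime

def dual_timestamps():
    """Return a matching (aware UTC datetime, microseconds-since-epoch) pair."""
    now = datetime.datetime.now(datetime.timezone.utc)
    return now, int(now.timestamp() * 1_000_000)
```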
## Using the New Timestamps
### Example Queries
**Query packets from the last 7 days:**
```sql
-- Old way (slower)
SELECT * FROM packet
WHERE import_time >= datetime('now', '-7 days');
-- New way (faster)
SELECT * FROM packet
WHERE import_time_us >= (strftime('%s', 'now', '-7 days') * 1000000);
```
**Query packets in a specific time range:**
```sql
SELECT * FROM packet
WHERE import_time_us BETWEEN 1759254380000000 AND 1759254390000000;
```
**Calculate time differences (in microseconds):**
```sql
SELECT
id,
(import_time_us - LAG(import_time_us) OVER (ORDER BY import_time_us)) / 1000000.0 as seconds_since_last
FROM packet
LIMIT 10;
```
### Converting Timestamps
**From datetime to microseconds (UTC):**
```python
import datetime
now = datetime.datetime.now(datetime.timezone.utc)
now_us = int(now.timestamp() * 1_000_000)
```
**From microseconds to datetime:**
```python
import datetime
timestamp_us = 1759254380813451
dt = datetime.datetime.fromtimestamp(timestamp_us / 1_000_000)
```
**In SQL queries:**
```sql
-- Datetime to microseconds
SELECT CAST((strftime('%s', import_time) || substr(import_time, 21, 6)) AS INTEGER);
-- Microseconds to datetime (approximate)
SELECT datetime(import_time_us / 1000000, 'unixepoch');
```
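Putting the Python conversions together: a round trip preserves microsecond precision exactly when the fractional part is handled as an integer rather than through float division (helper names here are illustrative):

```python
import datetime

def to_us(dt):
    """Aware UTC datetime -> integer microseconds since the Unix epoch."""
    return int(dt.timestamp()) * 1_000_000 + dt.microsecond

def from_us(ts_us):
    """Integer microseconds -> aware UTC datetime, without float rounding."""
    secs, us = divmod(ts_us, 1_000_000)
    return datetime.datetime.fromtimestamp(secs, tz=datetime.timezone.utc).replace(microsecond=us)
```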
## Performance Benefits
The integer timestamp columns provide significant performance improvements:
1. **Faster comparisons**: Integer comparisons are much faster than string/datetime comparisons
2. **Smaller index size**: Integer indexes are more compact than datetime indexes
3. **Range queries**: BETWEEN operations on integers are highly optimized
4. **Math operations**: Easy to calculate time differences, averages, etc.
5. **Sorting**: Integer sorting is faster than datetime sorting
## Backward Compatibility
The original `import_time` TEXT columns remain unchanged:
- Existing code continues to work
- Human-readable timestamps still available
- Gradual migration to new columns possible
- No breaking changes for existing queries
## Future Work
Future improvements could include:
- Migrating queries to use `import_time_us` columns
- Deprecating the TEXT `import_time` columns (after transition period)
- Adding helper functions for timestamp conversion
- Creating views that expose both formats
## Testing
The migration was tested on a production database with:
- 132,466 packet records
- 1,385,659 packet_seen records
- 28,414 traceroute records
All records were successfully migrated with microsecond precision preserved.
## References
- GitHub Issue: #55 - Storing High-Resolution Timestamps in SQLite
- SQLite datetime functions: https://www.sqlite.org/lang_datefunc.html
- Python datetime module: https://docs.python.org/3/library/datetime.html


@@ -1,12 +1,12 @@
import asyncio
from meshview import web
async def main():
async def main():
async with asyncio.TaskGroup() as tg:
tg.create_task(
web.run_server()
)
tg.create_task(web.run_server())
if __name__ == '__main__':
asyncio.run(main())

File diff suppressed because one or more lines are too long


@@ -116,6 +116,13 @@ class Config(google.protobuf.message.Message):
but should not be given priority over other routers in order to avoid unnecessaraily
consuming hops.
"""
CLIENT_BASE: Config.DeviceConfig._Role.ValueType # 12
"""
Description: Treats packets from or to favorited nodes as ROUTER, and all other packets as CLIENT.
Technical Details: Used for stronger attic/roof nodes to distribute messages more widely
from weaker, indoor, or less-well-positioned nodes. Recommended for users with multiple nodes
where one CLIENT_BASE acts as a more powerful base station, such as an attic/roof node.
"""
class Role(_Role, metaclass=_RoleEnumTypeWrapper):
"""
@@ -200,6 +207,13 @@ class Config(google.protobuf.message.Message):
but should not be given priority over other routers in order to avoid unnecessaraily
consuming hops.
"""
CLIENT_BASE: Config.DeviceConfig.Role.ValueType # 12
"""
Description: Treats packets from or to favorited nodes as ROUTER, and all other packets as CLIENT.
Technical Details: Used for stronger attic/roof nodes to distribute messages more widely
from weaker, indoor, or less-well-positioned nodes. Recommended for users with multiple nodes
where one CLIENT_BASE acts as a more powerful base station, such as an attic/roof node.
"""
class _RebroadcastMode:
ValueType = typing.NewType("ValueType", builtins.int)
@@ -1048,12 +1062,12 @@ class Config(google.protobuf.message.Message):
"""
OLED_SH1107: Config.DisplayConfig._OledType.ValueType # 3
"""
Can not be auto detected but set by proto. Used for 128x128 screens
"""
OLED_SH1107_128_64: Config.DisplayConfig._OledType.ValueType # 4
"""
Can not be auto detected but set by proto. Used for 128x64 screens
"""
OLED_SH1107_128_128: Config.DisplayConfig._OledType.ValueType # 4
"""
Can not be auto detected but set by proto. Used for 128x128 screens
"""
class OledType(_OledType, metaclass=_OledTypeEnumTypeWrapper):
"""
@@ -1074,12 +1088,12 @@ class Config(google.protobuf.message.Message):
"""
OLED_SH1107: Config.DisplayConfig.OledType.ValueType # 3
"""
Can not be auto detected but set by proto. Used for 128x128 screens
"""
OLED_SH1107_128_64: Config.DisplayConfig.OledType.ValueType # 4
"""
Can not be auto detected but set by proto. Used for 128x64 screens
"""
OLED_SH1107_128_128: Config.DisplayConfig.OledType.ValueType # 4
"""
Can not be auto detected but set by proto. Used for 128x128 screens
"""
class _DisplayMode:
ValueType = typing.NewType("ValueType", builtins.int)


@@ -13,7 +13,7 @@ _sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n#meshtastic/protobuf/device_ui.proto\x12\x13meshtastic.protobuf\"\xda\x04\n\x0e\x44\x65viceUIConfig\x12\x0f\n\x07version\x18\x01 \x01(\r\x12\x19\n\x11screen_brightness\x18\x02 \x01(\r\x12\x16\n\x0escreen_timeout\x18\x03 \x01(\r\x12\x13\n\x0bscreen_lock\x18\x04 \x01(\x08\x12\x15\n\rsettings_lock\x18\x05 \x01(\x08\x12\x10\n\x08pin_code\x18\x06 \x01(\r\x12)\n\x05theme\x18\x07 \x01(\x0e\x32\x1a.meshtastic.protobuf.Theme\x12\x15\n\ralert_enabled\x18\x08 \x01(\x08\x12\x16\n\x0e\x62\x61nner_enabled\x18\t \x01(\x08\x12\x14\n\x0cring_tone_id\x18\n \x01(\r\x12/\n\x08language\x18\x0b \x01(\x0e\x32\x1d.meshtastic.protobuf.Language\x12\x34\n\x0bnode_filter\x18\x0c \x01(\x0b\x32\x1f.meshtastic.protobuf.NodeFilter\x12:\n\x0enode_highlight\x18\r \x01(\x0b\x32\".meshtastic.protobuf.NodeHighlight\x12\x18\n\x10\x63\x61libration_data\x18\x0e \x01(\x0c\x12*\n\x08map_data\x18\x0f \x01(\x0b\x32\x18.meshtastic.protobuf.Map\x12\x36\n\x0c\x63ompass_mode\x18\x10 \x01(\x0e\x32 .meshtastic.protobuf.CompassMode\x12\x18\n\x10screen_rgb_color\x18\x11 \x01(\r\x12\x1b\n\x13is_clockface_analog\x18\x12 \x01(\x08\"\xa7\x01\n\nNodeFilter\x12\x16\n\x0eunknown_switch\x18\x01 \x01(\x08\x12\x16\n\x0eoffline_switch\x18\x02 \x01(\x08\x12\x19\n\x11public_key_switch\x18\x03 \x01(\x08\x12\x11\n\thops_away\x18\x04 \x01(\x05\x12\x17\n\x0fposition_switch\x18\x05 \x01(\x08\x12\x11\n\tnode_name\x18\x06 \x01(\t\x12\x0f\n\x07\x63hannel\x18\x07 \x01(\x05\"~\n\rNodeHighlight\x12\x13\n\x0b\x63hat_switch\x18\x01 \x01(\x08\x12\x17\n\x0fposition_switch\x18\x02 \x01(\x08\x12\x18\n\x10telemetry_switch\x18\x03 \x01(\x08\x12\x12\n\niaq_switch\x18\x04 \x01(\x08\x12\x11\n\tnode_name\x18\x05 \x01(\t\"=\n\x08GeoPoint\x12\x0c\n\x04zoom\x18\x01 \x01(\x05\x12\x10\n\x08latitude\x18\x02 \x01(\x05\x12\x11\n\tlongitude\x18\x03 \x01(\x05\"U\n\x03Map\x12+\n\x04home\x18\x01 \x01(\x0b\x32\x1d.meshtastic.protobuf.GeoPoint\x12\r\n\x05style\x18\x02 
\x01(\t\x12\x12\n\nfollow_gps\x18\x03 \x01(\x08*>\n\x0b\x43ompassMode\x12\x0b\n\x07\x44YNAMIC\x10\x00\x12\x0e\n\nFIXED_RING\x10\x01\x12\x12\n\x0e\x46REEZE_HEADING\x10\x02*%\n\x05Theme\x12\x08\n\x04\x44\x41RK\x10\x00\x12\t\n\x05LIGHT\x10\x01\x12\x07\n\x03RED\x10\x02*\xa9\x02\n\x08Language\x12\x0b\n\x07\x45NGLISH\x10\x00\x12\n\n\x06\x46RENCH\x10\x01\x12\n\n\x06GERMAN\x10\x02\x12\x0b\n\x07ITALIAN\x10\x03\x12\x0e\n\nPORTUGUESE\x10\x04\x12\x0b\n\x07SPANISH\x10\x05\x12\x0b\n\x07SWEDISH\x10\x06\x12\x0b\n\x07\x46INNISH\x10\x07\x12\n\n\x06POLISH\x10\x08\x12\x0b\n\x07TURKISH\x10\t\x12\x0b\n\x07SERBIAN\x10\n\x12\x0b\n\x07RUSSIAN\x10\x0b\x12\t\n\x05\x44UTCH\x10\x0c\x12\t\n\x05GREEK\x10\r\x12\r\n\tNORWEGIAN\x10\x0e\x12\r\n\tSLOVENIAN\x10\x0f\x12\r\n\tUKRAINIAN\x10\x10\x12\r\n\tBULGARIAN\x10\x11\x12\x16\n\x12SIMPLIFIED_CHINESE\x10\x1e\x12\x17\n\x13TRADITIONAL_CHINESE\x10\x1f\x42\x63\n\x13\x63om.geeksville.meshB\x0e\x44\x65viceUIProtosZ\"github.com/meshtastic/go/generated\xaa\x02\x14Meshtastic.Protobufs\xba\x02\x00\x62\x06proto3')
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n#meshtastic/protobuf/device_ui.proto\x12\x13meshtastic.protobuf\"\xda\x04\n\x0e\x44\x65viceUIConfig\x12\x0f\n\x07version\x18\x01 \x01(\r\x12\x19\n\x11screen_brightness\x18\x02 \x01(\r\x12\x16\n\x0escreen_timeout\x18\x03 \x01(\r\x12\x13\n\x0bscreen_lock\x18\x04 \x01(\x08\x12\x15\n\rsettings_lock\x18\x05 \x01(\x08\x12\x10\n\x08pin_code\x18\x06 \x01(\r\x12)\n\x05theme\x18\x07 \x01(\x0e\x32\x1a.meshtastic.protobuf.Theme\x12\x15\n\ralert_enabled\x18\x08 \x01(\x08\x12\x16\n\x0e\x62\x61nner_enabled\x18\t \x01(\x08\x12\x14\n\x0cring_tone_id\x18\n \x01(\r\x12/\n\x08language\x18\x0b \x01(\x0e\x32\x1d.meshtastic.protobuf.Language\x12\x34\n\x0bnode_filter\x18\x0c \x01(\x0b\x32\x1f.meshtastic.protobuf.NodeFilter\x12:\n\x0enode_highlight\x18\r \x01(\x0b\x32\".meshtastic.protobuf.NodeHighlight\x12\x18\n\x10\x63\x61libration_data\x18\x0e \x01(\x0c\x12*\n\x08map_data\x18\x0f \x01(\x0b\x32\x18.meshtastic.protobuf.Map\x12\x36\n\x0c\x63ompass_mode\x18\x10 \x01(\x0e\x32 .meshtastic.protobuf.CompassMode\x12\x18\n\x10screen_rgb_color\x18\x11 \x01(\r\x12\x1b\n\x13is_clockface_analog\x18\x12 \x01(\x08\"\xa7\x01\n\nNodeFilter\x12\x16\n\x0eunknown_switch\x18\x01 \x01(\x08\x12\x16\n\x0eoffline_switch\x18\x02 \x01(\x08\x12\x19\n\x11public_key_switch\x18\x03 \x01(\x08\x12\x11\n\thops_away\x18\x04 \x01(\x05\x12\x17\n\x0fposition_switch\x18\x05 \x01(\x08\x12\x11\n\tnode_name\x18\x06 \x01(\t\x12\x0f\n\x07\x63hannel\x18\x07 \x01(\x05\"~\n\rNodeHighlight\x12\x13\n\x0b\x63hat_switch\x18\x01 \x01(\x08\x12\x17\n\x0fposition_switch\x18\x02 \x01(\x08\x12\x18\n\x10telemetry_switch\x18\x03 \x01(\x08\x12\x12\n\niaq_switch\x18\x04 \x01(\x08\x12\x11\n\tnode_name\x18\x05 \x01(\t\"=\n\x08GeoPoint\x12\x0c\n\x04zoom\x18\x01 \x01(\x05\x12\x10\n\x08latitude\x18\x02 \x01(\x05\x12\x11\n\tlongitude\x18\x03 \x01(\x05\"U\n\x03Map\x12+\n\x04home\x18\x01 \x01(\x0b\x32\x1d.meshtastic.protobuf.GeoPoint\x12\r\n\x05style\x18\x02 
\x01(\t\x12\x12\n\nfollow_gps\x18\x03 \x01(\x08*>\n\x0b\x43ompassMode\x12\x0b\n\x07\x44YNAMIC\x10\x00\x12\x0e\n\nFIXED_RING\x10\x01\x12\x12\n\x0e\x46REEZE_HEADING\x10\x02*%\n\x05Theme\x12\x08\n\x04\x44\x41RK\x10\x00\x12\t\n\x05LIGHT\x10\x01\x12\x07\n\x03RED\x10\x02*\xb4\x02\n\x08Language\x12\x0b\n\x07\x45NGLISH\x10\x00\x12\n\n\x06\x46RENCH\x10\x01\x12\n\n\x06GERMAN\x10\x02\x12\x0b\n\x07ITALIAN\x10\x03\x12\x0e\n\nPORTUGUESE\x10\x04\x12\x0b\n\x07SPANISH\x10\x05\x12\x0b\n\x07SWEDISH\x10\x06\x12\x0b\n\x07\x46INNISH\x10\x07\x12\n\n\x06POLISH\x10\x08\x12\x0b\n\x07TURKISH\x10\t\x12\x0b\n\x07SERBIAN\x10\n\x12\x0b\n\x07RUSSIAN\x10\x0b\x12\t\n\x05\x44UTCH\x10\x0c\x12\t\n\x05GREEK\x10\r\x12\r\n\tNORWEGIAN\x10\x0e\x12\r\n\tSLOVENIAN\x10\x0f\x12\r\n\tUKRAINIAN\x10\x10\x12\r\n\tBULGARIAN\x10\x11\x12\t\n\x05\x43ZECH\x10\x12\x12\x16\n\x12SIMPLIFIED_CHINESE\x10\x1e\x12\x17\n\x13TRADITIONAL_CHINESE\x10\x1f\x42\x63\n\x13\x63om.geeksville.meshB\x0e\x44\x65viceUIProtosZ\"github.com/meshtastic/go/generated\xaa\x02\x14Meshtastic.Protobufs\xba\x02\x00\x62\x06proto3')
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
@@ -26,7 +26,7 @@ if _descriptor._USE_C_DESCRIPTORS == False:
_globals['_THEME']._serialized_start=1177
_globals['_THEME']._serialized_end=1214
_globals['_LANGUAGE']._serialized_start=1217
_globals['_LANGUAGE']._serialized_end=1514
_globals['_LANGUAGE']._serialized_end=1525
_globals['_DEVICEUICONFIG']._serialized_start=61
_globals['_DEVICEUICONFIG']._serialized_end=663
_globals['_NODEFILTER']._serialized_start=666


@@ -165,6 +165,10 @@ class _LanguageEnumTypeWrapper(google.protobuf.internal.enum_type_wrapper._EnumT
"""
Bulgarian
"""
CZECH: _Language.ValueType # 18
"""
Czech
"""
SIMPLIFIED_CHINESE: _Language.ValueType # 30
"""
Simplified Chinese (experimental)
@@ -251,6 +255,10 @@ BULGARIAN: Language.ValueType # 17
"""
Bulgarian
"""
CZECH: Language.ValueType # 18
"""
Czech
"""
SIMPLIFIED_CHINESE: Language.ValueType # 30
"""
Simplified Chinese (experimental)

File diff suppressed because one or more lines are too long


@@ -486,6 +486,14 @@ class _HardwareModelEnumTypeWrapper(google.protobuf.internal.enum_type_wrapper._
MeshSolar is an integrated power management and communication solution designed for outdoor low-power devices.
https://heltec.org/project/meshsolar/
"""
T_ECHO_LITE: _HardwareModel.ValueType # 109
"""
Lilygo T-Echo Lite
"""
HELTEC_V4: _HardwareModel.ValueType # 110
"""
New Heltec LoRA32 with ESP32-S3 CPU
"""
PRIVATE_HW: _HardwareModel.ValueType # 255
"""
------------------------------------------------------------------------------------------------------------------------------------------
@@ -955,6 +963,14 @@ HELTEC_MESH_SOLAR: HardwareModel.ValueType # 108
MeshSolar is an integrated power management and communication solution designed for outdoor low-power devices.
https://heltec.org/project/meshsolar/
"""
T_ECHO_LITE: HardwareModel.ValueType # 109
"""
Lilygo T-Echo Lite
"""
HELTEC_V4: HardwareModel.ValueType # 110
"""
New Heltec LoRA32 with ESP32-S3 CPU
"""
PRIVATE_HW: HardwareModel.ValueType # 255
"""
------------------------------------------------------------------------------------------------------------------------------------------

File diff suppressed because one or more lines are too long


@@ -824,6 +824,7 @@ class ModuleConfig(google.protobuf.message.Message):
ENABLED_FIELD_NUMBER: builtins.int
SENDER_FIELD_NUMBER: builtins.int
SAVE_FIELD_NUMBER: builtins.int
CLEAR_ON_REBOOT_FIELD_NUMBER: builtins.int
enabled: builtins.bool
"""
Enable the Range Test Module
@@ -837,14 +838,20 @@ class ModuleConfig(google.protobuf.message.Message):
Bool value indicating that this node should save a RangeTest.csv file.
ESP32 Only
"""
clear_on_reboot: builtins.bool
"""
Bool indicating that the node should cleanup / destroy it's RangeTest.csv file.
ESP32 Only
"""
def __init__(
self,
*,
enabled: builtins.bool = ...,
sender: builtins.int = ...,
save: builtins.bool = ...,
clear_on_reboot: builtins.bool = ...,
) -> None: ...
def ClearField(self, field_name: typing.Literal["enabled", b"enabled", "save", b"save", "sender", b"sender"]) -> None: ...
def ClearField(self, field_name: typing.Literal["clear_on_reboot", b"clear_on_reboot", "enabled", b"enabled", "save", b"save", "sender", b"sender"]) -> None: ...
@typing.final
class TelemetryConfig(google.protobuf.message.Message):

File diff suppressed because one or more lines are too long


@@ -199,6 +199,10 @@ class _TelemetrySensorTypeEnumTypeWrapper(google.protobuf.internal.enum_type_wra
"""
SEN5X PM SENSORS
"""
TSL2561: _TelemetrySensorType.ValueType # 44
"""
TSL2561 light sensor
"""
class TelemetrySensorType(_TelemetrySensorType, metaclass=_TelemetrySensorTypeEnumTypeWrapper):
"""
@@ -381,6 +385,10 @@ SEN5X: TelemetrySensorType.ValueType # 43
"""
SEN5X PM SENSORS
"""
TSL2561: TelemetrySensorType.ValueType # 44
"""
TSL2561 light sensor
"""
global___TelemetrySensorType = TelemetrySensorType
@typing.final

meshview/__version__.py Normal file

@@ -0,0 +1,57 @@
"""Version information for MeshView."""
import subprocess
from pathlib import Path
__version__ = "3.0.0"
__release_date__ = "2025-11-05"
def get_git_revision():
"""Get the current git revision hash."""
try:
repo_dir = Path(__file__).parent.parent
result = subprocess.run(
["git", "rev-parse", "HEAD"],
capture_output=True,
text=True,
check=True,
cwd=repo_dir,
)
return result.stdout.strip()
except (subprocess.CalledProcessError, FileNotFoundError):
return "unknown"
def get_git_revision_short():
"""Get the short git revision hash."""
try:
repo_dir = Path(__file__).parent.parent
result = subprocess.run(
["git", "rev-parse", "--short", "HEAD"],
capture_output=True,
text=True,
check=True,
cwd=repo_dir,
)
return result.stdout.strip()
except (subprocess.CalledProcessError, FileNotFoundError):
return "unknown"
def get_version_info():
"""Get complete version information."""
return {
"version": __version__,
"release_date": __release_date__,
"git_revision": get_git_revision(),
"git_revision_short": get_git_revision_short(),
}
# Cache git info at import time for performance
_git_revision = get_git_revision()
_git_revision_short = get_git_revision_short()
# Full version string for display
__version_string__ = f"{__version__} ~ {__release_date__}"


@@ -1,9 +1,11 @@
import configparser
import argparse
import configparser
# Parse command-line arguments
parser = argparse.ArgumentParser(description="MeshView Configuration Loader")
parser.add_argument("--config", type=str, default="config.ini", help="Path to config.ini file (default: config.ini)")
parser.add_argument(
"--config", type=str, default="config.ini", help="Path to config.ini file (default: config.ini)"
)
args = parser.parse_args()
# Initialize config parser
@@ -12,4 +14,3 @@ if not config_parser.read(args.config):
raise FileNotFoundError(f"Config file '{args.config}' not found! Ensure the file exists.")
CONFIG = {section: dict(config_parser.items(section)) for section in config_parser.sections()}


@@ -1,32 +1,24 @@
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine
from meshview import models
from sqlalchemy.ext.asyncio import async_sessionmaker
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession
engine = None
async_session = None
def init_database(database_connection_string, read_only=False):
def init_database(database_connection_string):
global engine, async_session
kwargs = {"echo": False}
if database_connection_string.startswith("sqlite"):
if read_only:
# Ensure SQLite is opened in read-only mode
database_connection_string += "?mode=ro"
kwargs["connect_args"] = {"uri": True}
else:
kwargs["connect_args"] = {"timeout": 60}
else:
kwargs["pool_size"] = 20
kwargs["max_overflow"] = 50
# Ensure SQLite is opened in read-only mode
database_connection_string += "?mode=ro"
kwargs["connect_args"] = {"uri": True}
engine = create_async_engine(database_connection_string, **kwargs)
async_session = async_sessionmaker( bind=engine,
class_=AsyncSession,
expire_on_commit=False,
)
async_session = async_sessionmaker(
bind=engine,
class_=AsyncSession,
expire_on_commit=False,
)
async def create_tables():
async with engine.begin() as conn:


@@ -1,16 +1,16 @@
from meshtastic.protobuf.mqtt_pb2 import MapReport
from meshtastic.protobuf.portnums_pb2 import PortNum
from google.protobuf.message import DecodeError
from meshtastic.protobuf.mesh_pb2 import (
Position,
MeshPacket,
NeighborInfo,
NodeInfo,
User,
Position,
RouteDiscovery,
Routing,
MeshPacket,
User,
)
from meshtastic.protobuf.mqtt_pb2 import MapReport
from meshtastic.protobuf.portnums_pb2 import PortNum
from meshtastic.protobuf.telemetry_pb2 import Telemetry
from google.protobuf.message import DecodeError
def text_message(payload):
@@ -25,7 +25,7 @@ DECODE_MAP = {
PortNum.TRACEROUTE_APP: RouteDiscovery.FromString,
PortNum.ROUTING_APP: Routing.FromString,
PortNum.TEXT_MESSAGE_APP: text_message,
PortNum.MAP_REPORT_APP: MapReport.FromString
PortNum.MAP_REPORT_APP: MapReport.FromString,
}

meshview/lang/en.json Normal file

@@ -0,0 +1,141 @@
{
"base": {
"chat": "Chat",
"nodes": "Nodes",
"everything": "See Everything",
"graphs": "Mesh Graphs",
"net": "Weekly Net",
"map": "Live Map",
"stats": "Stats",
"top": "Top Traffic Nodes",
"footer": "Visit <strong><a href=\"https://github.com/pablorevilla-meshtastic/meshview\">Meshview</a></strong> on GitHub",
"node id": "Node id",
"go to node": "Go to Node",
"all": "All",
"portnum_options": {
"1": "Text Message",
"3": "Position",
"4": "Node Info",
"67": "Telemetry",
"70": "Traceroute",
"71": "Neighbor Info"
}
},
"chat": {
"replying_to": "Replying to:",
"view_packet_details": "View packet details"
},
"nodelist": {
"search_placeholder": "Search by name or ID...",
"all_roles": "All Roles",
"all_channels": "All Channels",
"all_hw_models": "All HW Models",
"all_firmware": "All Firmware",
"export_csv": "Export CSV",
"clear_filters": "Clear Filters",
"showing": "Showing",
"nodes": "nodes",
"short": "Short",
"long_name": "Long Name",
"hw_model": "HW Model",
"firmware": "Firmware",
"role": "Role",
"last_lat": "Last Latitude",
"last_long": "Last Longitude",
"channel": "Channel",
"last_update": "Last Update",
"loading_nodes": "Loading nodes...",
"no_nodes": "No nodes found",
"error_nodes": "Error loading nodes"
},
"net": {
"number_of_checkins": "Number of Check-ins:",
"view_packet_details": "View packet details",
"view_all_packets_from_node": "View all packets from this node",
"no_packets_found": "No packets found."
},
"map": {
"channel": "Channel:",
"model": "Model:",
"role": "Role:",
"last_seen": "Last seen:",
"firmware": "Firmware:",
"show_routers_only": "Show Routers Only",
"share_view": "Share This View"
},
"stats":
{
"mesh_stats_summary": "Mesh Statistics - Summary (all available in Database)",
"total_nodes": "Total Nodes",
"total_packets": "Total Packets",
"total_packets_seen": "Total Packets Seen",
"packets_per_day_all": "Packets per Day - All Ports (Last 14 Days)",
"packets_per_day_text": "Packets per Day - Text Messages (Port 1, Last 14 Days)",
"packets_per_hour_all": "Packets per Hour - All Ports",
"packets_per_hour_text": "Packets per Hour - Text Messages (Port 1)",
"packet_types_last_24h": "Packet Types - Last 24 Hours",
"hardware_breakdown": "Hardware Breakdown",
"role_breakdown": "Role Breakdown",
"channel_breakdown": "Channel Breakdown",
"expand_chart": "Expand Chart",
"export_csv": "Export CSV",
"all_channels": "All Channels",
"node_id": "Node ID"
},
"top":
{
"top_traffic_nodes": "Top Traffic Nodes (last 24 hours)",
"chart_description_1": "This chart shows a bell curve (normal distribution) based on the total \"Times Seen\" values for all nodes. It helps visualize how frequently nodes are heard, relative to the average.",
"chart_description_2": "This \"Times Seen\" value is the closest that we can get to Mesh utilization by node.",
"mean_label": "Mean:",
"stddev_label": "Standard Deviation:",
"long_name": "Long Name",
"short_name": "Short Name",
"channel": "Channel",
"packets_sent": "Packets Sent",
"times_seen": "Times Seen",
"seen_percent": "Seen % of Mean",
"no_nodes": "No top traffic nodes available."
},
"nodegraph":
{
"channel_label": "Channel:",
"search_node_placeholder": "Search node...",
"search_button": "Search",
"long_name_label": "Long Name:",
"short_name_label": "Short Name:",
"role_label": "Role:",
"hw_model_label": "Hardware Model:",
"node_not_found": "Node not found in current channel!"
},
"firehose":
{
"live_feed": "📡 Live Feed",
"pause": "Pause",
"resume": "Resume",
"time": "Time",
"packet_id": "Packet ID",
"from": "From",
"to": "To",
"port": "Port",
"links": "Links",
"unknown_app": "UNKNOWN APP",
"text_message": "Text Message",
"position": "Position",
"node_info": "Node Info",
"routing": "Routing",
"administration": "Administration",
"waypoint": "Waypoint",
"store_forward": "Store Forward",
"telemetry": "Telemetry",
"trace_route": "Trace Route",
"neighbor_info": "Neighbor Info",
"direct_to_mqtt": "direct to MQTT",
"all": "All",
"map": "Map",
"graph": "Graph"
}
}

meshview/lang/es.json Normal file

@@ -0,0 +1,142 @@
{
"base": {
"conversations": "Conversaciones",
"nodes": "Nodos",
"everything": "Mostrar Todo",
"graph": "Gráficos de la Malla",
"net": "Red Semanal",
"map": "Mapa en Vivo",
"stats": "Estadísticas",
"top": "Nodos con Mayor Tráfico",
"footer": "Visita <strong><a href=\"https://github.com/pablorevilla-meshtastic/meshview\">Meshview</a></strong> en Github.",
"node id": "ID de Nodo",
"go to node": "Ir al nodo",
"all": "Todos",
"portnum_options": {
"1": "Mensaje de Texto",
"3": "Ubicación",
"4": "Información del Nodo",
"67": "Telemetría",
"70": "Traceroute",
"71": "Información de Vecinos"
}
},
"chat": {
"replying_to": "Respondiendo a:",
"view_packet_details": "Ver detalles del paquete"
},
"nodelist": {
"search_placeholder": "Buscar por nombre o ID...",
"all_roles": "Todos los Roles",
"all_channels": "Todos los Canales",
"all_hw_models": "Todos los Modelos",
"all_firmware": "Todo el Firmware",
"export_csv": "Exportar CSV",
"clear_filters": "Limpiar Filtros",
"showing": "Mostrando",
"nodes": "nodos",
"short": "Corto",
"long_name": "Largo",
"hw_model": "Modelo",
"firmware": "Firmware",
"role": "Rol",
"last_lat": "Última Latitud",
"last_long": "Última Longitud",
"channel": "Canal",
"last_update": "Última Actualización",
"loading_nodes": "Cargando nodos...",
"no_nodes": "No se encontraron nodos",
"error_nodes": "Error al cargar nodos"
},
"net": {
"number_of_checkins": "Número de registros:",
"view_packet_details": "Ver detalles del paquete",
"view_all_packets_from_node": "Ver todos los paquetes de este nodo",
"no_packets_found": "No se encontraron paquetes."
},
"map": {
"channel": "Canal:",
"model": "Modelo:",
"role": "Rol:",
"last_seen": "Visto por última vez:",
"firmware": "Firmware:",
"show_routers_only": "Mostrar solo enrutadores",
"share_view": "Compartir esta vista"
},
"stats": {
"mesh_stats_summary": "Estadísticas de la Malla - Resumen (completas en la base de datos)",
"total_nodes": "Nodos Totales",
"total_packets": "Paquetes Totales",
"total_packets_seen": "Paquetes Totales Vistos",
"packets_per_day_all": "Paquetes por Día - Todos los Puertos (Últimos 14 Días)",
"packets_per_day_text": "Paquetes por Día - Mensajes de Texto (Puerto 1, Últimos 14 Días)",
"packets_per_hour_all": "Paquetes por Hora - Todos los Puertos",
"packets_per_hour_text": "Paquetes por Hora - Mensajes de Texto (Puerto 1)",
"packet_types_last_24h": "Tipos de Paquetes - Últimas 24 Horas",
"hardware_breakdown": "Distribución de Hardware",
"role_breakdown": "Distribución de Roles",
"channel_breakdown": "Distribución de Canales",
"expand_chart": "Ampliar Gráfico",
"export_csv": "Exportar CSV",
"all_channels": "Todos los Canales"
},
"top": {
"top_traffic_nodes": "Tráfico (últimas 24 horas)",
"chart_description_1": "Este gráfico muestra una curva normal (distribución normal) basada en el valor total de \"Veces Visto\" para todos los nodos. Ayuda a visualizar con qué frecuencia se detectan los nodos en relación con el promedio.",
"chart_description_2": "Este valor de \"Veces Visto\" es lo más aproximado que tenemos al nivel de uso de la malla por nodo.",
"mean_label": "Media:",
"stddev_label": "Desviación Estándar:",
"long_name": "Nombre Largo",
"short_name": "Nombre Corto",
"channel": "Canal",
"packets_sent": "Paquetes Enviados",
"times_seen": "Veces Visto",
"seen_percent": "% Visto respecto a la Media",
"no_nodes": "No hay nodos con mayor tráfico disponibles."
},
"nodegraph":
{
"channel_label": "Canal:",
"search_placeholder": "Buscar nodo...",
"search_button": "Buscar",
"long_name_label": "Nombre completo:",
"short_name_label": "Nombre corto:",
"role_label": "Rol:",
"hw_model_label": "Modelo de hardware:",
"traceroute": "Traceroute",
"neighbor": "Vecino",
"other": "Otro",
"unknown": "Desconocido",
"node_not_found": "¡Nodo no encontrado en el canal actual!"
},
"firehose":
{
"live_feed": "📡 Flujo en Vivo",
"pause": "Pausar",
"resume": "Continuar",
"time": "Hora",
"packet_id": "ID del Paquete",
"from": "De",
"to": "Para",
"port": "Puerto",
"links": "Enlaces",
"unknown_app": "APLICACIÓN DESCONOCIDA",
"text_message": "Mensaje de Texto",
"position": "Posición",
"node_info": "Información del Nodo",
"routing": "Enrutamiento",
"administration": "Administración",
"waypoint": "Punto de Ruta",
"store_forward": "Almacenar y Reenviar",
"telemetry": "Telemetría",
"trace_route": "Rastreo de Ruta",
"neighbor_info": "Información de Vecinos",
"direct_to_mqtt": "Directo a MQTT",
"all": "Todos",
"map": "Mapa",
"graph": "Gráfico"
}
}

243
meshview/migrations.py Normal file

@@ -0,0 +1,243 @@
"""
Database migration management for MeshView.
This module provides utilities for:
- Running Alembic migrations programmatically
- Checking database schema versions
- Coordinating migrations between writer and reader apps
"""
import asyncio
import logging
from pathlib import Path
from alembic.config import Config
from alembic.runtime.migration import MigrationContext
from alembic.script import ScriptDirectory
from sqlalchemy import text
from sqlalchemy.ext.asyncio import AsyncEngine
from alembic import command
logger = logging.getLogger(__name__)
def get_alembic_config(database_url: str) -> Config:
"""
Get Alembic configuration with the database URL set.
Args:
database_url: SQLAlchemy database connection string
Returns:
Configured Alembic Config object
"""
# Get the alembic.ini path (in project root)
alembic_ini = Path(__file__).parent.parent / "alembic.ini"
config = Config(str(alembic_ini))
config.set_main_option("sqlalchemy.url", database_url)
return config
async def get_current_revision(engine: AsyncEngine) -> str | None:
"""
Get the current database schema revision.
Args:
engine: Async SQLAlchemy engine
Returns:
Current revision string, or None if no migrations applied
"""
async with engine.connect() as connection:
def _get_revision(conn):
context = MigrationContext.configure(conn)
return context.get_current_revision()
revision = await connection.run_sync(_get_revision)
return revision
async def get_head_revision(database_url: str) -> str | None:
"""
Get the head (latest) revision from migration scripts.
Args:
database_url: Database connection string
Returns:
Head revision string, or None if no migrations exist
"""
config = get_alembic_config(database_url)
script_dir = ScriptDirectory.from_config(config)
head = script_dir.get_current_head()
return head
async def is_database_up_to_date(engine: AsyncEngine, database_url: str) -> bool:
"""
Check if database is at the latest schema version.
Args:
engine: Async SQLAlchemy engine
database_url: Database connection string
Returns:
True if database is up to date, False otherwise
"""
current = await get_current_revision(engine)
head = await get_head_revision(database_url)
# If there are no migrations yet, consider it up to date
if head is None:
return True
return current == head
def run_migrations(database_url: str) -> None:
"""
Run all pending migrations to bring database up to date.
This is a synchronous operation that runs Alembic migrations.
Should be called by the writer app on startup.
Args:
database_url: Database connection string
"""
logger.info("Running database migrations...")
import sys
sys.stdout.flush()
config = get_alembic_config(database_url)
try:
# Run migrations to head
logger.info("Calling alembic upgrade command...")
sys.stdout.flush()
command.upgrade(config, "head")
logger.info("Database migrations completed successfully")
sys.stdout.flush()
except Exception as e:
logger.error(f"Error running migrations: {e}")
raise
async def wait_for_migrations(
engine: AsyncEngine, database_url: str, max_retries: int = 30, retry_delay: int = 2
) -> bool:
"""
Wait for database migrations to complete.
This should be called by the reader app to wait until
the database schema is up to date before proceeding.
Args:
engine: Async SQLAlchemy engine
database_url: Database connection string
max_retries: Maximum number of retry attempts
retry_delay: Seconds to wait between retries
Returns:
True if database is up to date, False if max retries exceeded
"""
for attempt in range(max_retries):
try:
if await is_database_up_to_date(engine, database_url):
logger.info("Database schema is up to date")
return True
current = await get_current_revision(engine)
head = await get_head_revision(database_url)
logger.info(
f"Database schema not up to date (current: {current}, head: {head}). "
f"Waiting... (attempt {attempt + 1}/{max_retries})"
)
await asyncio.sleep(retry_delay)
except Exception as e:
logger.warning(
f"Error checking database version (attempt {attempt + 1}/{max_retries}): {e}"
)
await asyncio.sleep(retry_delay)
logger.error(f"Database schema not up to date after {max_retries} attempts")
return False
async def create_migration_status_table(engine: AsyncEngine) -> None:
"""
Create a simple status table for migration coordination.
This table can be used to signal when migrations are in progress.
Args:
engine: Async SQLAlchemy engine
"""
async with engine.begin() as conn:
await conn.execute(
text("""
CREATE TABLE IF NOT EXISTS migration_status (
id INTEGER PRIMARY KEY CHECK (id = 1),
in_progress BOOLEAN NOT NULL DEFAULT 0,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
)
""")
)
# Insert initial row if not exists
await conn.execute(
text("""
INSERT OR IGNORE INTO migration_status (id, in_progress)
VALUES (1, 0)
""")
)
async def set_migration_in_progress(engine: AsyncEngine, in_progress: bool) -> None:
"""
Set the migration in-progress flag.
Args:
engine: Async SQLAlchemy engine
in_progress: True if migration is in progress, False otherwise
"""
async with engine.begin() as conn:
await conn.execute(
text("""
UPDATE migration_status
SET in_progress = :in_progress,
updated_at = CURRENT_TIMESTAMP
WHERE id = 1
"""),
{"in_progress": in_progress},
)
async def is_migration_in_progress(engine: AsyncEngine) -> bool:
"""
Check if a migration is currently in progress.
Args:
engine: Async SQLAlchemy engine
Returns:
True if migration is in progress, False otherwise
"""
try:
async with engine.connect() as conn:
result = await conn.execute(
text("SELECT in_progress FROM migration_status WHERE id = 1")
)
row = result.fetchone()
return bool(row[0]) if row else False
except Exception:
# If table doesn't exist or query fails, assume no migration in progress
return False
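The writer/reader coordination above reduces to a bounded retry loop: the reader polls the schema check until it reports success or retries run out. A self-contained sketch of that loop, where `wait_until` and `schema_ready` are illustrative stand-ins for `wait_for_migrations` and `is_database_up_to_date`:

```python
import asyncio

async def wait_until(check, max_retries: int = 30, retry_delay: float = 0.01) -> bool:
    # Poll `check` until it reports success, sleeping between attempts,
    # in the shape of wait_for_migrations() above.
    for _attempt in range(max_retries):
        if await check():
            return True
        await asyncio.sleep(retry_delay)
    return False

state = {"calls": 0}

async def schema_ready() -> bool:
    # Stand-in for is_database_up_to_date(): reports True on the third check.
    state["calls"] += 1
    return state["calls"] >= 3

ok = asyncio.run(wait_until(schema_ready))
```

With the stand-in above, the loop returns True after three polls; the real caller would log progress and fall through to `return False` after `max_retries` attempts.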

View File

@@ -1,8 +1,8 @@
from datetime import datetime
from sqlalchemy.orm import DeclarativeBase, foreign
from sqlalchemy import BigInteger, ForeignKey, Index, desc
from sqlalchemy.ext.asyncio import AsyncAttrs
from sqlalchemy.orm import mapped_column, relationship, Mapped
from sqlalchemy import ForeignKey, BigInteger, Index, desc
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship
class Base(AsyncAttrs, DeclarativeBase):
@@ -23,9 +23,13 @@ class Node(Base):
last_long: Mapped[int] = mapped_column(BigInteger, nullable=True)
channel: Mapped[str] = mapped_column(nullable=True)
last_update: Mapped[datetime] = mapped_column(nullable=True)
first_seen_us: Mapped[int] = mapped_column(BigInteger, nullable=True)
last_seen_us: Mapped[int] = mapped_column(BigInteger, nullable=True)
__table_args__ = (
Index("idx_node_node_id", "node_id"),
Index("idx_node_first_seen_us", "first_seen_us"),
Index("idx_node_last_seen_us", "last_seen_us"),
)
def to_dict(self):
@@ -52,12 +56,17 @@ class Packet(Base):
)
payload: Mapped[bytes] = mapped_column(nullable=True)
import_time: Mapped[datetime] = mapped_column(nullable=True)
import_time_us: Mapped[int] = mapped_column(BigInteger, nullable=True)
channel: Mapped[str] = mapped_column(nullable=True)
__table_args__ = (
Index("idx_packet_from_node_id", "from_node_id"),
Index("idx_packet_to_node_id", "to_node_id"),
Index("idx_packet_import_time", desc("import_time")),
Index("idx_packet_import_time_us", desc("import_time_us")),
# Composite index for /top endpoint performance - filters by from_node_id AND import_time
Index("idx_packet_from_node_time", "from_node_id", desc("import_time")),
Index("idx_packet_from_node_time_us", "from_node_id", desc("import_time_us")),
)
@@ -78,14 +87,19 @@ class PacketSeen(Base):
rx_rssi: Mapped[int] = mapped_column(nullable=True)
topic: Mapped[str] = mapped_column(nullable=True)
import_time: Mapped[datetime] = mapped_column(nullable=True)
import_time_us: Mapped[int] = mapped_column(BigInteger, nullable=True)
__table_args__ = (
Index("idx_packet_seen_node_id", "node_id"),
# Index for /top endpoint performance - JOIN on packet_id
Index("idx_packet_seen_packet_id", "packet_id"),
Index("idx_packet_seen_import_time_us", "import_time_us"),
)
class Traceroute(Base):
__tablename__ = "traceroute"
id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
packet_id = mapped_column(ForeignKey("packet.id"))
packet: Mapped["Packet"] = relationship(
@@ -95,3 +109,10 @@ class Traceroute(Base):
done: Mapped[bool] = mapped_column(nullable=True)
route: Mapped[bytes] = mapped_column(nullable=True)
import_time: Mapped[datetime] = mapped_column(nullable=True)
route_return: Mapped[bytes] = mapped_column(nullable=True)
import_time_us: Mapped[int] = mapped_column(BigInteger, nullable=True)
__table_args__ = (
Index("idx_traceroute_import_time", "import_time"),
Index("idx_traceroute_import_time_us", "import_time_us"),
)
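The new `*_us` columns store UTC timestamps as integer microseconds since the epoch, computed in the store as `int(now.timestamp() * 1_000_000)`. A round-trip sketch of that encoding (`to_us`/`from_us` are illustrative helper names, not part of the codebase):

```python
from datetime import datetime, timezone

def to_us(dt: datetime) -> int:
    # Encode an aware datetime as integer microseconds since the epoch,
    # matching what the import_time_us / first_seen_us / last_seen_us columns store.
    return int(dt.timestamp() * 1_000_000)

def from_us(us: int) -> datetime:
    # Decode back to an aware UTC datetime.
    return datetime.fromtimestamp(us / 1_000_000, tz=timezone.utc)

dt = datetime(2025, 11, 21, 12, 0, 0, tzinfo=timezone.utc)
us = to_us(dt)
roundtrip = from_us(us)
```

Storing a plain BigInteger sidesteps SQLite's lack of a native timezone-aware datetime type and makes the descending indexes above cheap integer comparisons.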

View File

@@ -1,15 +1,16 @@
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine
from meshview import models
from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker
def init_database(database_connection_string):
global engine, async_session
kwargs = {}
if not database_connection_string.startswith('sqlite'):
kwargs['pool_size'] = 20
kwargs['max_overflow'] = 50
engine = create_async_engine(database_connection_string, echo=False, connect_args={"timeout": 60})
engine = create_async_engine(
database_connection_string, echo=False, connect_args={"timeout": 900}
)
async_session = async_sessionmaker(engine, expire_on_commit=False)
async def create_tables():
async with engine.begin() as conn:
await conn.run_sync(models.Base.metadata.create_all)

View File

@@ -1,13 +1,25 @@
import base64
import asyncio
import base64
import logging
import random
import time
import aiomqtt
from google.protobuf.message import DecodeError
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from google.protobuf.message import DecodeError
from meshtastic.protobuf.mqtt_pb2 import ServiceEnvelope
KEY = base64.b64decode("1PG7OiApB1nwvP+rz05pAQ==")
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s %(filename)s:%(lineno)d [pid:%(process)d] %(levelname)s - %(message)s",
datefmt="%Y-%m-%d %H:%M:%S",
)
logger = logging.getLogger(__name__)
def decrypt(packet):
if packet.HasField("decoded"):
@@ -27,6 +39,8 @@ def decrypt(packet):
async def get_topic_envelopes(mqtt_server, mqtt_port, topics, mqtt_user, mqtt_passwd):
identifier = str(random.getrandbits(16))
msg_count = 0
start_time = None
while True:
try:
async with aiomqtt.Client(
@@ -36,10 +50,15 @@ async def get_topic_envelopes(mqtt_server, mqtt_port, topics, mqtt_user, mqtt_pa
password=mqtt_passwd,
identifier=identifier,
) as client:
logger.info(f"Connected to MQTT broker at {mqtt_server}:{mqtt_port}")
for topic in topics:
print(f"Subscribing to: {topic}")
logger.info(f"Subscribing to: {topic}")
await client.subscribe(topic)
# Reset start time when connected
if start_time is None:
start_time = time.time()
async for msg in client.messages:
try:
envelope = ServiceEnvelope.FromString(msg.payload)
@@ -52,11 +71,23 @@ async def get_topic_envelopes(mqtt_server, mqtt_port, topics, mqtt_user, mqtt_pa
continue
# Skip packets from specific node
# FIXME: make this configurable as a list of node IDs to skip
if getattr(envelope.packet, "from", None) == 2144342101:
continue
msg_count += 1
# FIXME: make this interval configurable or time based
if (
msg_count % 10000 == 0
): # Log notice every 10000 messages (approx every hour at 3/sec)
elapsed_time = time.time() - start_time
msg_rate = msg_count / elapsed_time if elapsed_time > 0 else 0
logger.info(
f"Processed {msg_count} messages so far... ({msg_rate:.2f} msg/sec)"
)
yield msg.topic.value, envelope
except aiomqtt.MqttError as e:
print(f"MQTT error: {e}, reconnecting in 1s...")
logger.error(f"MQTT error: {e}, reconnecting in 1s...")
await asyncio.sleep(1)
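The consumer loop above survives broker drops by catching `aiomqtt.MqttError`, sleeping, and reconnecting. The same reconnect shape in a self-contained sketch (`resilient_stream` and the fake `connect` are illustrative, with `ConnectionError` standing in for the MQTT error):

```python
import asyncio

async def resilient_stream(connect, max_failures: int = 3):
    # Reconnect loop shaped like get_topic_envelopes(): on a transport
    # error, back off briefly and connect again instead of crashing.
    failures = 0
    while True:
        try:
            async for item in connect():
                yield item
            return  # clean end of stream
        except ConnectionError:
            failures += 1
            if failures >= max_failures:
                raise
            await asyncio.sleep(0.01)

attempts = {"n": 0}

async def connect():
    # Fake broker session: the first connection attempt fails,
    # the second delivers two messages and closes cleanly.
    attempts["n"] += 1
    if attempts["n"] == 1:
        raise ConnectionError("broker unreachable")
    for msg in ("a", "b"):
        yield msg

async def main():
    return [m async for m in resilient_stream(connect)]

msgs = asyncio.run(main())
```

The real loop retries forever rather than capping failures; the cap here only keeps the sketch terminating.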

View File

@@ -1,38 +1,45 @@
import datetime
import re
from sqlalchemy import select
from sqlalchemy import update
from sqlalchemy.dialects.sqlite import insert as sqlite_insert
from meshtastic.protobuf.config_pb2 import Config
from meshtastic.protobuf.mesh_pb2 import HardwareModel
from meshtastic.protobuf.portnums_pb2 import PortNum
from meshtastic.protobuf.mesh_pb2 import User, HardwareModel
from meshview import mqtt_database
from meshview import decode_payload
from meshview.models import Packet, PacketSeen, Node, Traceroute
from meshview import decode_payload, mqtt_database
from meshview.models import Node, Packet, PacketSeen, Traceroute
async def process_envelope(topic, env):
# If the received packet is a MAP_REPORT_APP, update the node table
# with the reported firmware version and identity fields
if env.packet.decoded.portnum == PortNum.MAP_REPORT_APP:
# Extract the node ID from the packet and format the user ID
node_id = getattr(env.packet, "from")
user_id = f"!{node_id:0{8}x}"
# Decode the MAP report payload
map_report = decode_payload.decode_payload(PortNum.MAP_REPORT_APP, env.packet.decoded.payload)
map_report = decode_payload.decode_payload(
PortNum.MAP_REPORT_APP, env.packet.decoded.payload
)
# Establish an asynchronous database session
async with mqtt_database.async_session() as session:
try:
hw_model = HardwareModel.Name(map_report.hw_model) if hasattr(HardwareModel, 'Name') else "unknown"
role = Config.DeviceConfig.Role.Name(map_report.role) if hasattr(Config.DeviceConfig.Role,
'Name') else "unknown"
node = (await session.execute(select(Node).where(Node.node_id == node_id))).scalar_one_or_none()
hw_model = (
HardwareModel.Name(map_report.hw_model)
if hasattr(HardwareModel, "Name")
else "unknown"
)
role = (
Config.DeviceConfig.Role.Name(map_report.role)
if hasattr(Config.DeviceConfig.Role, "Name")
else "unknown"
)
node = (
await session.execute(select(Node).where(Node.node_id == node_id))
).scalar_one_or_none()
now = datetime.datetime.now(datetime.UTC)
now_us = int(now.timestamp() * 1_000_000)
# Some nodes might have uplink disabled for the default channel
# and only be sending map reports, so check if it exists yet
if node:
node.node_id = node_id
node.long_name = map_report.long_name
@@ -43,53 +50,81 @@ async def process_envelope(topic, env):
node.last_lat = map_report.latitude_i
node.last_long = map_report.longitude_i
node.firmware = map_report.firmware_version
node.last_update = datetime.datetime.now()
node.last_update = now
node.last_seen_us = now_us
if node.first_seen_us is None:
node.first_seen_us = now_us
else:
node = Node(
id=user_id, node_id=node_id,
long_name=map_report.long_name, short_name=map_report.short_name,
hw_model=hw_model, role=role, channel=env.channel_id,
id=user_id,
node_id=node_id,
long_name=map_report.long_name,
short_name=map_report.short_name,
hw_model=hw_model,
role=role,
channel=env.channel_id,
firmware=map_report.firmware_version,
last_lat=map_report.latitude_i, last_long=map_report.longitude_i,
last_update=datetime.datetime.now(),
last_lat=map_report.latitude_i,
last_long=map_report.longitude_i,
last_update=now,
first_seen_us=now_us,
last_seen_us=now_us,
)
session.add(node)
except Exception as e:
print(f"Error processing MAP_REPORT_APP: {e}")
# Commit the changes to the database
await session.commit()
# Ignore any packet that does not have an ID
if not env.packet.id:
return
async with mqtt_database.async_session() as session:
# --- Packet insert with ON CONFLICT DO NOTHING
result = await session.execute(select(Packet).where(Packet.id == env.packet.id))
new_packet = False
# FIXME: Not Used
# new_packet = False
packet = result.scalar_one_or_none()
if not packet:
new_packet = True
packet = Packet(
id=env.packet.id,
portnum=env.packet.decoded.portnum,
from_node_id=getattr(env.packet, "from"),
to_node_id=env.packet.to,
payload=env.packet.SerializeToString(),
import_time=datetime.datetime.now(),
channel=env.channel_id,
# FIXME: Not Used
# new_packet = True
now = datetime.datetime.now(datetime.UTC)
now_us = int(now.timestamp() * 1_000_000)
stmt = (
sqlite_insert(Packet)
.values(
id=env.packet.id,
portnum=env.packet.decoded.portnum,
from_node_id=getattr(env.packet, "from"),
to_node_id=env.packet.to,
payload=env.packet.SerializeToString(),
import_time=now,
import_time_us=now_us,
channel=env.channel_id,
)
.on_conflict_do_nothing(index_elements=["id"])
)
session.add(packet)
await session.execute(stmt)
# --- PacketSeen (no conflict handling here, normal insert)
if not env.gateway_id:
print("WARNING: Missing gateway_id, skipping PacketSeen entry")
# Most likely a misconfiguration of an MQTT publisher?
return
else:
node_id = int(env.gateway_id[1:], 16)
result = await session.execute(
select(PacketSeen).where(
PacketSeen.packet_id == env.packet.id,
PacketSeen.node_id == int(env.gateway_id[1:], 16),
PacketSeen.node_id == node_id,
PacketSeen.rx_time == env.packet.rx_time,
)
)
seen = None
if not result.scalar_one_or_none():
now = datetime.datetime.now(datetime.UTC)
now_us = int(now.timestamp() * 1_000_000)
seen = PacketSeen(
packet_id=env.packet.id,
node_id=int(env.gateway_id[1:], 16),
@@ -100,20 +135,40 @@ async def process_envelope(topic, env):
hop_limit=env.packet.hop_limit,
hop_start=env.packet.hop_start,
topic=topic,
import_time=datetime.datetime.now(),
import_time=now,
import_time_us=now_us,
)
session.add(seen)
# --- NODEINFO_APP handling
if env.packet.decoded.portnum == PortNum.NODEINFO_APP:
try:
user = decode_payload.decode_payload(PortNum.NODEINFO_APP, env.packet.decoded.payload)
user = decode_payload.decode_payload(
PortNum.NODEINFO_APP, env.packet.decoded.payload
)
if user and user.id:
node_id = int(user.id[1:], 16) if user.id[0] == "!" else None
hw_model = HardwareModel.Name(user.hw_model) if user.hw_model in HardwareModel.values() else f"unknown({user.hw_model})"
role = Config.DeviceConfig.Role.Name(user.role) if hasattr(Config.DeviceConfig.Role,
'Name') else "unknown"
if user.id[0] == "!" and re.fullmatch(r"[0-9a-fA-F]+", user.id[1:]):
node_id = int(user.id[1:], 16)
else:
node_id = None
node = (await session.execute(select(Node).where(Node.id == user.id))).scalar_one_or_none()
hw_model = (
HardwareModel.Name(user.hw_model)
if user.hw_model in HardwareModel.values()
else f"unknown({user.hw_model})"
)
role = (
Config.DeviceConfig.Role.Name(user.role)
if hasattr(Config.DeviceConfig.Role, "Name")
else "unknown"
)
node = (
await session.execute(select(Node).where(Node.id == user.id))
).scalar_one_or_none()
now = datetime.datetime.now(datetime.UTC)
now_us = int(now.timestamp() * 1_000_000)
if node:
node.node_id = node_id
@@ -122,48 +177,67 @@ async def process_envelope(topic, env):
node.hw_model = hw_model
node.role = role
node.channel = env.channel_id
node.last_update = datetime.datetime.now()
node.last_update = now
node.last_seen_us = now_us
if node.first_seen_us is None:
node.first_seen_us = now_us
else:
node = Node(
id=user.id, node_id=node_id,
long_name=user.long_name, short_name=user.short_name,
hw_model=hw_model, role=role, channel=env.channel_id,
last_update=datetime.datetime.now(),
id=user.id,
node_id=node_id,
long_name=user.long_name,
short_name=user.short_name,
hw_model=hw_model,
role=role,
channel=env.channel_id,
last_update=now,
first_seen_us=now_us,
last_seen_us=now_us,
)
session.add(node)
except Exception as e:
print(f"Error processing NODEINFO_APP: {e}")
# --- POSITION_APP handling
if env.packet.decoded.portnum == PortNum.POSITION_APP:
position = decode_payload.decode_payload(
PortNum.POSITION_APP, env.packet.decoded.payload
)
if position and position.latitude_i and position.longitude_i:
from_node_id = getattr(env.packet, 'from')
node = (await session.execute(select(Node).where(Node.node_id == from_node_id))).scalar_one_or_none()
from_node_id = getattr(env.packet, "from")
node = (
await session.execute(select(Node).where(Node.node_id == from_node_id))
).scalar_one_or_none()
if node:
now = datetime.datetime.now(datetime.UTC)
now_us = int(now.timestamp() * 1_000_000)
node.last_lat = position.latitude_i
node.last_long = position.longitude_i
node.last_update = now
node.last_seen_us = now_us
if node.first_seen_us is None:
node.first_seen_us = now_us
session.add(node)
# --- TRACEROUTE_APP (no conflict handling, normal insert)
if env.packet.decoded.portnum == PortNum.TRACEROUTE_APP:
packet_id = None
if env.packet.decoded.want_response:
packet_id = env.packet.id
else:
result = await session.execute(select(Packet).where(Packet.id == env.packet.decoded.request_id))
if result.scalar_one_or_none():
packet_id = env.packet.decoded.request_id
packet_id = env.packet.id
if packet_id is not None:
session.add(Traceroute(
packet_id=packet_id,
route=env.packet.decoded.payload,
done=not env.packet.decoded.want_response,
gateway_node_id=int(env.gateway_id[1:], 16),
import_time=datetime.datetime.now(),
))
now = datetime.datetime.now(datetime.UTC)
now_us = int(now.timestamp() * 1_000_000)
session.add(
Traceroute(
packet_id=packet_id,
route=env.packet.decoded.payload,
done=not env.packet.decoded.want_response,
gateway_node_id=int(env.gateway_id[1:], 16),
import_time=now,
import_time_us=now_us,
)
)
await session.commit()
if new_packet:
await packet.awaitable_attrs.to_node
await packet.awaitable_attrs.from_node
# if new_packet:
# await packet.awaitable_attrs.to_node
# await packet.awaitable_attrs.from_node
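Node IDs arrive in text form as `"!"`-prefixed hex (both `gateway_id` and `user.id`), and the NODEINFO_APP handler above validates the hex with `re.fullmatch` before calling `int(..., 16)`. That parse, extracted into a hypothetical helper:

```python
import re

def parse_node_id(user_id: str):
    # Accept only "!"-prefixed hex IDs, as the NODEINFO_APP handler does;
    # anything else yields None rather than a bogus integer.
    if user_id.startswith("!") and re.fullmatch(r"[0-9a-fA-F]+", user_id[1:]):
        return int(user_id[1:], 16)
    return None
```

For example, `parse_node_id("!000000ff")` gives 255, while a malformed ID such as `"!xyz"` or an unprefixed `"ff"` gives None.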

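The `sqlite_insert(...).on_conflict_do_nothing(index_elements=["id"])` statement in the packet handler renders to SQLite's `INSERT ... ON CONFLICT(id) DO NOTHING`, so a packet replayed by a second gateway leaves the first-seen row untouched. The effect in plain `sqlite3` (SQLite 3.24+):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE packet (id INTEGER PRIMARY KEY, portnum INTEGER)")
# First arrival of packet 1 inserts normally.
conn.execute("INSERT INTO packet VALUES (1, 4) ON CONFLICT(id) DO NOTHING")
# A duplicate arrival with different metadata is silently dropped.
conn.execute("INSERT INTO packet VALUES (1, 3) ON CONFLICT(id) DO NOTHING")
rows = conn.execute("SELECT id, portnum FROM packet").fetchall()
```

This replaces the earlier SELECT-then-insert pattern with a single statement, avoiding a race when multiple gateways report the same packet concurrently.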
View File

@@ -1,6 +1,6 @@
import asyncio
import contextlib
from collections import defaultdict
import asyncio
waiting_node_ids_events = defaultdict(set)
@@ -36,11 +36,13 @@ def create_event(node_id):
def remove_event(node_event):
waiting_node_ids_events[node_event.node_id].remove(node_event)
def notify_packet(node_id, packet):
for event in waiting_node_ids_events[node_id]:
event.packets.append(packet)
event.set()
def notify_uplinked(node_id, packet):
for event in waiting_node_ids_events[node_id]:
event.uplinked.append(packet)

View File

@@ -0,0 +1,164 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<title>Mesh Nodes Population Heatmap</title>
<link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css" />
<style>
body { margin: 0; background: #000; }
#map { height: 100vh; width: 100%; }
#legend {
position: absolute; bottom: 10px; right: 10px;
background: rgba(0,0,0,0.8);
color: white; padding: 10px 14px;
font-family: monospace; font-size: 13px;
border-radius: 5px; z-index: 1000;
box-shadow: 0 0 10px rgba(0,0,0,0.6);
}
.legend-item { display: flex; align-items: center; margin-bottom: 5px; }
.legend-color { width: 18px; height: 18px; margin-right: 6px; border-radius: 3px; }
</style>
</head>
<body>
<div id="map"></div>
<div id="legend"></div>
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
<script src="https://unpkg.com/leaflet.heat/dist/leaflet-heat.js"></script>
<script>
const map = L.map("map");
L.tileLayer("https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png", {
maxZoom: 19,
attribution: "© OpenStreetMap"
}).addTo(map);
let heatLayer = null;
let nodeCoords = [];
let hoverTooltip = L.tooltip({
permanent: false,
direction: "top",
className: "node-tooltip"
});
// --- Legend ---
const legend = document.getElementById("legend");
const legendItems = [
{ color: "#0000ff", label: "Low" },
{ color: "#8000ff", label: "Moderate" },
{ color: "#00ffff", label: "Elevated" },
{ color: "#00ff00", label: "High" },
{ color: "#ffff00", label: "Very High" },
{ color: "#ff0000", label: "Congested?" }
];
legendItems.forEach(item => {
const div = document.createElement("div");
div.className = "legend-item";
const colorBox = document.createElement("div");
colorBox.className = "legend-color";
colorBox.style.background = item.color;
const label = document.createElement("span");
label.textContent = item.label;
div.appendChild(colorBox);
div.appendChild(label);
legend.appendChild(div);
});
// --- Load nodes and create heatmap ---
async function loadNodes() {
try {
const res = await fetch("/api/nodes?days_active=3");
if (!res.ok) throw new Error(`HTTP error ${res.status}`);
const data = await res.json();
const nodes = data.nodes || [];
nodeCoords = [];
const heatPoints = [];
nodes.forEach(node => {
const lat = node.last_lat / 1e7;
const lng = node.last_long / 1e7;
if (lat && lng && !isNaN(lat) && !isNaN(lng)) {
nodeCoords.push([lat, lng]);
heatPoints.push([lat, lng, 1.0]); // equal weight per node
}
});
if (heatLayer) map.removeLayer(heatLayer);
heatLayer = L.heatLayer(heatPoints, {
radius: 18, // smaller circles
blur: 10, // slightly tighter glow
maxZoom: 15,
minOpacity: 0.4,
gradient: {
0.0: "#0000ff", // deep blue
0.2: "#8000ff", // purple
0.4: "#00ffff", // cyan
0.6: "#00ff00", // green
0.8: "#ffff00", // yellow
0.9: "#ff8000", // orange
1.0: "#ff0000" // red
}
}).addTo(map);
await setMapBoundsFromConfig();
} catch (err) {
console.error("Failed to load nodes:", err);
}
}
// --- Map bounds ---
async function setMapBoundsFromConfig() {
try {
const res = await fetch("/api/config");
const config = await res.json();
const topLat = parseFloat(config.site.map_top_left_lat);
const topLon = parseFloat(config.site.map_top_left_lon);
const bottomLat = parseFloat(config.site.map_bottom_right_lat);
const bottomLon = parseFloat(config.site.map_bottom_right_lon);
if ([topLat, topLon, bottomLat, bottomLon].some(v => isNaN(v))) {
throw new Error("Map bounds contain NaN");
}
map.fitBounds([[topLat, topLon], [bottomLat, bottomLon]]);
} catch (err) {
console.error("Failed to load map bounds from config:", err);
map.setView([37.77, -122.42], 9);
}
}
// --- Count nearby nodes ---
function countNearbyNodes(latlng, radiusMeters) {
let count = 0;
const latR = radiusMeters / 111320; // meters per degree lat
const lngR = radiusMeters / (111320 * Math.cos(latlng.lat * Math.PI / 180));
for (const [lat, lng] of nodeCoords) {
if (Math.abs(lat - latlng.lat) <= latR && Math.abs(lng - latlng.lng) <= lngR)
count++;
}
return count;
}
// --- Tooltip on hover ---
map.on("mousemove", e => {
if (!nodeCoords.length) return;
const zoom = map.getZoom();
const radiusMeters = 2000 / Math.pow(2, zoom - 10); // dynamic nearness by zoom
const count = countNearbyNodes(e.latlng, radiusMeters);
if (count > 0) {
hoverTooltip
.setLatLng(e.latlng)
.setContent(`${count} nodes nearby (${radiusMeters.toFixed(0)}m radius)`)
.addTo(map);
} else {
map.closeTooltip(hoverTooltip);
}
});
loadNodes();
</script>
</body>
</html>
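`countNearbyNodes()` above approximates a metric radius with a degree bounding box: roughly 111,320 m per degree of latitude, with the longitude span widened by 1/cos(latitude). The same arithmetic in Python (the coordinate list and query point are made up for illustration):

```python
import math

M_PER_DEG_LAT = 111_320  # rough meters per one degree of latitude

def count_nearby(coords, lat, lng, radius_m):
    # Bounding-box proximity test mirroring countNearbyNodes():
    # a degree of longitude covers fewer meters away from the equator.
    lat_r = radius_m / M_PER_DEG_LAT
    lng_r = radius_m / (M_PER_DEG_LAT * math.cos(math.radians(lat)))
    return sum(
        1
        for nlat, nlng in coords
        if abs(nlat - lat) <= lat_r and abs(nlng - lng) <= lng_r
    )

coords = [(37.77, -122.42), (37.771, -122.421), (38.50, -121.50)]
n_near = count_nearby(coords, 37.77, -122.42, 200)
```

A square box over-counts relative to a true circle, but for a hover tooltip the error is acceptable and the test stays O(n) with no trig per node beyond one cosine.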

225
meshview/static/kiosk.html Normal file

@@ -0,0 +1,225 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<title>Mesh Nodes Live Map</title>
<link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css" crossorigin=""/>
<style>
body { margin: 0; font-family: monospace; background: #121212; color: #eee; }
#map { height: 100vh; width: 100%; }
#legend {
position: absolute;
bottom: 10px;
right: 10px;
background: white; /* changed from rgba(0,0,0,0.8) to white */
color: black; /* text color black */
padding: 10px;
border-radius: 5px;
z-index: 1000;
font-size: 13px;
line-height: 1.5;
border: 1px solid #ccc; /* optional: subtle border for white bg */
}
#filter-container { margin-bottom: 6px; text-align: left; }
.filter-checkbox { margin-right: 4px; }
.blinking-tooltip {
background: white;
color: black;
border: 1px solid #000;
border-radius: 4px;
padding: 2px 5px;
}
</style>
</head>
<body>
<div id="map"></div>
<div id="legend">
<div id="filter-container"></div>
</div>
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js" crossorigin></script>
<script src="https://unpkg.com/leaflet-polylinedecorator@1.6.0/dist/leaflet.polylinedecorator.js" crossorigin></script>
<script>
(async function(){
// --- Load config ---
let config = {};
try {
const res = await fetch('/api/config');
config = await res.json();
} catch(err){ console.error('Failed to load config', err); }
const mapInterval = Number(config.site?.map_interval) || 3;
const bayAreaBounds = [
[Number(config.site?.map_top_left_lat), Number(config.site?.map_top_left_lon)],
[Number(config.site?.map_bottom_right_lat), Number(config.site?.map_bottom_right_lon)]
];
// --- Initialize map ---
const map = L.map('map');
L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', { maxZoom: 19 }).addTo(map);
map.fitBounds(bayAreaBounds);
// --- Utilities ---
const palette = ["#e6194b","#4363d8","#f58231","#911eb4","#46f0f0","#f032e6","#bcf60c","#fabebe","#008080","#e6beff","#9a6324","#fffac8","#800000","#aaffc3","#808000","#ffd8b1","#000075","#808080"];
const colorMap = new Map(); let nextColorIndex=0;
function hashToColor(str){
if(colorMap.has(str)) return colorMap.get(str);
const color = palette[nextColorIndex % palette.length];
colorMap.set(str, color); nextColorIndex++;
return color;
}
function timeAgo(dateStr){
const diff = Date.now() - new Date(dateStr);
const s=Math.floor(diff/1000), m=Math.floor(s/60), h=Math.floor(m/60), d=Math.floor(h/24);
if(d>0) return d+'d'; if(h>0) return h+'h'; if(m>0) return m+'m'; return s+'s';
}
function isInvalidCoord(node){ return !node || !node.last_lat || !node.last_long; }
// --- Load nodes ---
let nodes = [];
try {
const res = await fetch('/api/nodes');
const data = await res.json();
nodes = data.nodes || [];
} catch(err){ console.error('Failed to load nodes', err); }
const markers = {};
const markerById = {}; // Keyed by numeric node_id for packets
const nodeMap = new Map(); // Keyed by numeric node_id
const channels = new Set();
const activeBlinks = new Map();
const portMap = {1:"Text",67:"Telemetry",3:"Position",70:"Traceroute",4:"Node Info",71:"Neighbour Info",73:"Map Report"};
nodes.forEach(node=>{
if(isInvalidCoord(node)) return;
const lat = node.last_lat/1e7;
const lng = node.last_long/1e7;
const isRouter = node.role.toLowerCase().includes("router");
channels.add(node.channel);
nodeMap.set(node.node_id,node);
const color = hashToColor(node.channel);
const marker = L.circleMarker([lat,lng],{radius:isRouter?9:7,color:"white",fillColor:color,fillOpacity:1,weight:0.7}).addTo(map);
marker.nodeId = node.node_id;
marker.originalColor = color;
markerById[node.node_id]=marker;
let popupContent=`<b>${node.long_name} (${node.short_name})</b><br>
<b>Channel:</b> ${node.channel}<br>
<b>Model:</b> ${node.hw_model}<br>
<b>Role:</b> ${node.role}<br>`;
if(node.last_update) popupContent+=`<b>Last seen:</b> ${timeAgo(node.last_update)}<br>`;
if(node.firmware) popupContent+=`<b>Firmware:</b> ${node.firmware}<br>`;
marker.on('click', e=>{
e.originalEvent.stopPropagation();
marker.bindPopup(popupContent).openPopup();
setTimeout(()=>marker.closePopup(),3000);
});
if(!markers[node.channel]) markers[node.channel]=[];
markers[node.channel].push({marker,isRouter});
});
// --- Filters ---
const filterContainer=document.getElementById('filter-container');
channels.forEach(channel=>{
const id=`filter-${channel.replace(/\s+/g,'-').toLowerCase()}`;
const color=hashToColor(channel);
const label=document.createElement('label');
label.style.color=color;
label.innerHTML=`<input type="checkbox" class="filter-checkbox" id="${id}" checked> ${channel}`;
filterContainer.appendChild(label);
});
function updateMarkers(){
nodes.forEach(node=>{
const id=`filter-${node.channel.replace(/\s+/g,'-').toLowerCase()}`;
const checkbox=document.getElementById(id);
const marker=markerById[node.node_id];
if(marker) marker.setStyle({fillOpacity: checkbox.checked?1:0});
});
localStorage.setItem('meshview_map_filters', JSON.stringify({
channels: Array.from(channels).reduce((obj,c)=>{ obj[c]=document.getElementById(`filter-${c.replace(/\s+/g,'-').toLowerCase()}`).checked; return obj; },{})
}));
}
document.querySelectorAll(".filter-checkbox").forEach(input=>input.addEventListener("change",updateMarkers));
// Load saved filters
const savedFilters=JSON.parse(localStorage.getItem('meshview_map_filters')||'{}');
if(savedFilters.channels){
Object.keys(savedFilters.channels).forEach(c=>{
const checkbox=document.getElementById(`filter-${c.replace(/\s+/g,'-').toLowerCase()}`);
if(checkbox) checkbox.checked=savedFilters.channels[c];
});
}
updateMarkers();
// --- Packet blinking ---
function blinkNode(marker,longName,portnum){
if(!map.hasLayer(marker)) return;
if(activeBlinks.has(marker)){
clearInterval(activeBlinks.get(marker));
marker.setStyle({fillColor: marker.originalColor});
if(marker.tooltip) map.removeLayer(marker.tooltip);
}
let count=0;
const portName=portMap[portnum]||`Port ${portnum}`;
const tooltip=L.tooltip({permanent:true,direction:'top',offset:[0,-marker.options.radius-5],className:'blinking-tooltip'})
.setContent(`${longName} (${portName})`).setLatLng(marker.getLatLng());
tooltip.addTo(map); marker.tooltip=tooltip;
const interval=setInterval(()=>{
if(map.hasLayer(marker)){
marker.setStyle({fillColor:count%2===0?'yellow':marker.originalColor});
marker.bringToFront();
}
count++;
if(count>7){ clearInterval(interval); marker.setStyle({fillColor:marker.originalColor}); map.removeLayer(tooltip); activeBlinks.delete(marker); }
},500);
activeBlinks.set(marker,interval);
}
let lastImportTime=null;
async function fetchLatestPacket(){
try{
const res=await fetch(`/api/packets?limit=1`);
const data=await res.json();
lastImportTime=data.packets?.[0]?.import_time || new Date().toISOString();
}catch(err){ console.error(err); }
}
async function fetchNewPackets(){
if(!lastImportTime) return;
try{
const res=await fetch(`/api/packets?since=${lastImportTime}`);
const data=await res.json();
if(!data.packets || !data.packets.length) return;
let latest=lastImportTime;
data.packets.forEach(packet=>{
if(packet.import_time && packet.import_time>latest) latest=packet.import_time;
const marker=markerById[packet.from_node_id];
const nodeData=nodeMap.get(packet.from_node_id);
if(marker && nodeData) blinkNode(marker,nodeData.long_name,packet.portnum);
});
lastImportTime=latest;
}catch(err){ console.error(err); }
}
if(mapInterval>0){ fetchLatestPacket(); setInterval(fetchNewPackets,mapInterval*1000); }
})();
</script>
</body>
</html>


@@ -18,7 +18,6 @@
.legend-item { display: flex; align-items: center; margin-bottom: 4px; }
.legend-color { width: 16px; height: 16px; margin-right: 6px; border-radius: 4px; }
/* Floating pulse label style */
.pulse-label span {
background: rgba(0,0,0,0.6);
padding: 2px 4px;
@@ -35,166 +34,176 @@
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
<script>
  const map = L.map("map");
  L.tileLayer("https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png", { maxZoom: 19, attribution: "© OpenStreetMap" }).addTo(map);

  const nodeMarkers = new Map();
  let lastPacketTime = null;

  const portColors = {
    1: "red",
    67: "cyan",
    3: "orange",
    70: "purple",
    4: "yellow",
    71: "brown",
    73: "pink"
  };
  const portLabels = {
    1: "Text",
    67: "Telemetry",
    3: "Position",
    70: "Traceroute",
    4: "Node Info",
    71: "Neighbour Info",
    73: "Map Report"
  };
  function getPulseColor(portnum) { return portColors[portnum] || "green"; }

  // Legend
  const legend = document.getElementById("legend");
  for (const [port, color] of Object.entries(portColors)) {
    const item = document.createElement("div");
    item.className = "legend-item";
    const colorBox = document.createElement("div");
    colorBox.className = "legend-color";
    colorBox.style.background = color;
    const label = document.createElement("span");
    label.textContent = `${portLabels[port] || "Custom"} (${port})`;
    item.appendChild(colorBox);
    item.appendChild(label);
    legend.appendChild(item);
  }

  // Pulse marker
  function pulseMarker(marker, highlightColor = "red") {
    if (!marker || marker.activePulse) return;
    marker.activePulse = true;

    const originalColor = marker.options.originalColor;
    const originalRadius = marker.options.originalRadius;
    marker.bringToFront();

    const nodeInfo = marker.options.nodeInfo || {};
    const portLabel = marker.currentPortLabel || "";
    const displayName = `${nodeInfo.long_name || nodeInfo.short_name || "Unknown"}${portLabel ? ` (<i>${portLabel}</i>)` : ""}`;

    marker.bindTooltip(displayName, {
      permanent: true,
      direction: 'top',
      className: 'pulse-label',
      offset: [0, -10],
      html: true
    }).openTooltip();

    const flashDuration = 2000, fadeDuration = 1000, flashInterval = 100, maxRadius = originalRadius + 5;
    let flashTime = 0;

    const flashTimer = setInterval(() => {
      flashTime += flashInterval;
      const isOn = (flashTime / flashInterval) % 2 === 0;
      marker.setStyle({ fillColor: isOn ? highlightColor : originalColor, radius: isOn ? maxRadius : originalRadius });

      if (flashTime >= flashDuration) {
        clearInterval(flashTimer);
        const fadeStart = performance.now();
        function fade(now) {
          const t = Math.min((now - fadeStart) / fadeDuration, 1);
          const radius = originalRadius + (maxRadius - originalRadius) * (1 - t);
          marker.setStyle({ fillColor: highlightColor, radius: radius, fillOpacity: 1 });
          if (t < 1) requestAnimationFrame(fade);
          else {
            marker.setStyle({ fillColor: originalColor, radius: originalRadius, fillOpacity: 1 });
            marker.unbindTooltip();
            marker.activePulse = false;
          }
        }
        requestAnimationFrame(fade);
      }
    }, flashInterval);
  }

  // --- Load nodes ---
  async function loadNodes() {
    try {
      const res = await fetch("/api/nodes");
      if (!res.ok) throw new Error(`HTTP error ${res.status}`);
      const data = await res.json();
      const nodes = data.nodes || [];
      nodes.forEach(node => {
        const lat = node.last_lat / 1e7;
        const lng = node.last_long / 1e7;
        if (lat && lng && !isNaN(lat) && !isNaN(lng)) {
          const color = "blue";
          const marker = L.circleMarker([lat, lng], {
            radius: 7, color: "white", fillColor: color, fillOpacity: 1, weight: 0.7
          }).addTo(map);
          marker.options.originalColor = color;
          marker.options.originalRadius = 7;
          marker.options.nodeInfo = node;
          marker.bindPopup(`<b>${node.long_name || node.short_name || "Unknown"}</b><br>ID: ${node.node_id}<br>Role: ${node.role}`);
          nodeMarkers.set(node.node_id, marker);
        } else {
          nodeMarkers.set(node.node_id, { options: { nodeInfo: node } });
        }
      });
      const markersWithCoords = Array.from(nodeMarkers.values()).filter(m => m instanceof L.CircleMarker);
      if (markersWithCoords.length > 0) await setMapBoundsFromConfig();
      else map.setView([37.77, -122.42], 9);
    } catch (err) {
      console.error("Failed to load nodes:", err);
    }
  }

  // --- Map bounds ---
  async function setMapBoundsFromConfig() {
    try {
      const res = await fetch("/api/config");
      const config = await res.json();
      const topLat = parseFloat(config.site.map_top_left_lat);
      const topLon = parseFloat(config.site.map_top_left_lon);
      const bottomLat = parseFloat(config.site.map_bottom_right_lat);
      const bottomLon = parseFloat(config.site.map_bottom_right_lon);
      if ([topLat, topLon, bottomLat, bottomLon].some(v => isNaN(v))) {
        throw new Error("Map bounds contain NaN");
      }
      map.fitBounds([[topLat, topLon], [bottomLat, bottomLon]]);
    } catch (err) {
      console.error("Failed to load map bounds from config:", err);
      map.setView([37.77, -122.42], 9);
    }
  }

  // --- Poll packets ---
  async function pollPackets() {
    try {
      let url = "/api/packets?limit=10";
      if (lastPacketTime) url += `&since=${lastPacketTime}`;
      const res = await fetch(url);
      if (!res.ok) throw new Error(`HTTP error ${res.status}`);
      const data = await res.json();
      const packets = data.packets || [];
      if (packets.length) lastPacketTime = packets[0].import_time;
      packets.forEach(pkt => {
        const marker = nodeMarkers.get(pkt.from_node_id);
        const nodeName = marker?.options?.nodeInfo?.short_name || marker?.options?.nodeInfo?.long_name || "Unknown";
        console.log(`Packet received: port=${pkt.portnum}, node=${nodeName}`);
        if (marker instanceof L.CircleMarker) { // only real markers
          marker.currentPortLabel = portLabels[pkt.portnum] || `${pkt.portnum}`;
          pulseMarker(marker, getPulseColor(pkt.portnum));
        }
      });
    } catch (err) {
      console.error("Failed to fetch packets:", err);
    }
  }

  // --- Initialize ---
  loadNodes().then(() => setInterval(pollPackets, 1000));
</script>
</body>
</html>


@@ -1,10 +1,12 @@
from datetime import datetime, timedelta
from sqlalchemy import and_, func, or_, select, text
from sqlalchemy.orm import lazyload
from meshview import database, models
from meshview.models import Node, Packet, PacketSeen, Traceroute
async def get_node(node_id):
async with database.async_session() as session:
result = await session.execute(select(Node).where(Node.node_id == node_id))
@@ -22,30 +24,65 @@ async def get_fuzzy_nodes(query):
return result.scalars()
async def get_packets(
    from_node_id=None,
    to_node_id=None,
    node_id=None,  # legacy: match either from OR to
    portnum=None,
    after=None,
    contains=None,  # NEW: SQL-level substring match
    limit=50,
):
    """
    SQLAlchemy 2.0 async ORM version.
    Supports strict from/to/node filtering, substring payload search,
    portnum, since, and limit.
    """
    async with database.async_session() as session:
        stmt = select(models.Packet)
        conditions = []

        # Strict FROM filter
        if from_node_id is not None:
            conditions.append(models.Packet.from_node_id == from_node_id)

        # Strict TO filter
        if to_node_id is not None:
            conditions.append(models.Packet.to_node_id == to_node_id)

        # Legacy node ID filter: match either direction
        if node_id is not None:
            conditions.append(
                or_(models.Packet.from_node_id == node_id, models.Packet.to_node_id == node_id)
            )

        # Port filter
        if portnum is not None:
            conditions.append(models.Packet.portnum == portnum)

        # Timestamp filter
        if after is not None:
            conditions.append(models.Packet.import_time_us > after)

        # Case-insensitive substring search on UTF-8 payload (stored as BLOB)
        if contains:
            contains_lower = contains.lower()
            conditions.append(func.lower(models.Packet.payload).like(f"%{contains_lower}%"))

        # Apply all conditions
        if conditions:
            stmt = stmt.where(and_(*conditions))

        # Order newest → oldest
        stmt = stmt.order_by(models.Packet.import_time_us.desc())

        # Apply limit
        stmt = stmt.limit(limit)

        result = await session.execute(stmt)
        return result.scalars().all()
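The optional-filter pattern in `get_packets` (collect one condition per supplied argument, then AND them all) can be sketched without SQLAlchemy. The sketch below is illustrative only; `build_filter` and the dict keys are my names, not part of the module:

```python
def build_filter(from_node_id=None, to_node_id=None, node_id=None, portnum=None):
    # Collect one predicate per supplied filter, mirroring how get_packets
    # collects SQLAlchemy conditions, then AND them together with all().
    conditions = []
    if from_node_id is not None:
        conditions.append(lambda p: p["from"] == from_node_id)
    if to_node_id is not None:
        conditions.append(lambda p: p["to"] == to_node_id)
    if node_id is not None:  # legacy: match either direction
        conditions.append(lambda p: node_id in (p["from"], p["to"]))
    if portnum is not None:
        conditions.append(lambda p: p["portnum"] == portnum)
    # No filters supplied -> all([]) is True, i.e. everything matches,
    # just as get_packets with no arguments returns unrestricted rows.
    return lambda p: all(c(p) for c in conditions)
```

With no arguments the returned predicate accepts every packet, matching the unfiltered branch of the real query builder.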
async def get_packets_from(node_id=None, portnum=None, since=None, limit=500):
@@ -53,9 +90,7 @@ async def get_packets_from(node_id=None, portnum=None, since=None, limit=500):
q = select(Packet)
if node_id:
        q = q.where(Packet.from_node_id == node_id)
if portnum:
q = q.where(Packet.portnum == portnum)
if since:
@@ -71,15 +106,6 @@ async def get_packet(packet_id):
return result.scalar_one_or_none()
async def get_uplinked_packets(node_id, portnum=None):
async with database.async_session() as session:
q = select(Packet).join(PacketSeen).where(PacketSeen.node_id == node_id).order_by(Packet.import_time.desc()).limit(500)
if portnum:
q = q.where(Packet.portnum == portnum)
result = await session.execute(q)
return result.scalars()
async def get_packets_seen(packet_id):
async with database.async_session() as session:
result = await session.execute(
@@ -93,36 +119,41 @@ async def get_packets_seen(packet_id):
async def has_packets(node_id, portnum):
async with database.async_session() as session:
        return bool(
            (
                await session.execute(
                    select(Packet.id).where(Packet.from_node_id == node_id).limit(1)
                )
            ).scalar()
        )
async def get_traceroute(packet_id):
async with database.async_session() as session:
        result = await session.execute(
            select(Traceroute)
            .where(Traceroute.packet_id == packet_id)
            .order_by(Traceroute.import_time)
        )
        return result.scalars()
async def get_traceroutes(since):
async with database.async_session() as session:
        stmt = (
            select(Traceroute)
            .join(Packet)
            .where(Traceroute.import_time > since)
            .order_by(Traceroute.import_time)
        )
        stream = await session.stream_scalars(stmt)
        async for tr in stream:
            yield tr
async def get_mqtt_neighbors(since):
async with database.async_session() as session:
result = await session.execute(
select(PacketSeen, Packet)
.join(Packet)
.where(
(PacketSeen.hop_limit == PacketSeen.hop_start)
@@ -137,23 +168,6 @@ async def get_mqtt_neighbors(since):
return result
# We count the total number of packets
# This is to be used by /stats in web.py
async def get_total_packet_count():
    async with database.async_session() as session:
        q = select(func.count(Packet.id))  # Use SQLAlchemy's func to count packets
        result = await session.execute(q)
        return result.scalar()  # Return the total count of packets

# We count the total number of seen packets
async def get_total_packet_seen_count():
    async with database.async_session() as session:
        q = select(func.count(PacketSeen.node_id))  # Use SQLAlchemy's func to count PacketSeen rows
        result = await session.execute(q)
        return result.scalar()  # Return the total count of seen packets
async def get_total_node_count(channel: str = None) -> int:
try:
async with database.async_session() as session:
@@ -174,7 +188,8 @@ async def get_total_node_count(channel: str = None) -> int:
async def get_top_traffic_nodes():
try:
async with database.async_session() as session:
            result = await session.execute(
                text("""
SELECT
n.node_id,
n.long_name,
@@ -189,18 +204,22 @@ async def get_top_traffic_nodes():
GROUP BY n.node_id, n.long_name, n.short_name
HAVING total_packets_sent > 0
ORDER BY total_times_seen DESC;
"""))
""")
)
rows = result.fetchall()
            nodes = [
                {
                    'node_id': row[0],
                    'long_name': row[1],
                    'short_name': row[2],
                    'channel': row[3],
                    'total_packets_sent': row[4],
                    'total_times_seen': row[5],
                }
                for row in rows
            ]
return nodes
except Exception as e:
@@ -208,7 +227,6 @@ async def get_top_traffic_nodes():
return []
async def get_node_traffic(node_id: int):
try:
async with database.async_session() as session:
@@ -223,15 +241,19 @@ async def get_node_traffic(node_id: int):
AND packet.import_time >= DATETIME('now', 'localtime', '-24 hours')
GROUP BY packet.portnum
ORDER BY packet_count DESC;
"""), {"node_id": node_id}
"""),
{"node_id": node_id},
)
# Map the result to include node.long_name and packet data
            traffic_data = [
                {
                    "long_name": row[0],  # node.long_name
                    "portnum": row[1],  # packet.portnum
                    "packet_count": row[2],  # COUNT(*) as packet_count
                }
                for row in result.all()
            ]
return traffic_data
@@ -241,7 +263,6 @@ async def get_node_traffic(node_id: int):
return []
async def get_nodes(role=None, channel=None, hw_model=None, days_active=None):
"""
Fetches nodes from the database based on optional filtering criteria.
@@ -256,7 +277,7 @@ async def get_nodes(role=None, channel=None, hw_model=None, days_active=None):
"""
try:
async with database.async_session() as session:
            # print(channel)  # Debugging output (consider replacing with logging)
# Start with a base query selecting all nodes
query = select(Node)
@@ -283,7 +304,7 @@ async def get_nodes(role=None, channel=None, hw_model=None, days_active=None):
nodes = result.scalars().all()
return nodes # Return the list of nodes
    except Exception:
print("error reading DB") # Consider using logging instead of print
return [] # Return an empty list in case of failure
@@ -294,7 +315,7 @@ async def get_packet_stats(
channel: str | None = None,
portnum: int | None = None,
to_node: int | None = None,
    from_node: int | None = None,
):
now = datetime.now()
@@ -308,13 +329,10 @@ async def get_packet_stats(
raise ValueError("period_type must be 'hour' or 'day'")
async with database.async_session() as session:
        q = select(
            func.strftime(time_format, Packet.import_time).label('period'),
            func.count().label('count'),
        ).where(Packet.import_time >= start_time)
# Filters
if channel:
@@ -338,5 +356,161 @@ async def get_packet_stats(
"portnum": portnum,
"to_node": to_node,
"from_node": from_node,
"data": data
"data": data,
}
async def get_channels_in_period(period_type: str = "hour", length: int = 24):
"""
Returns a sorted list of distinct channels used in packets over a given period.
period_type: "hour" or "day"
length: number of hours or days to look back
"""
now_us = int(datetime.utcnow().timestamp() * 1_000_000)
if period_type == "hour":
delta_us = length * 3600 * 1_000_000
elif period_type == "day":
delta_us = length * 86400 * 1_000_000
else:
raise ValueError("period_type must be 'hour' or 'day'")
start_us = now_us - delta_us
async with database.async_session() as session:
stmt = (
select(Packet.channel)
.where(Packet.import_time_us >= start_us)
.distinct()
.order_by(Packet.channel)
)
result = await session.execute(stmt)
channels = [ch for ch in result.scalars().all() if ch is not None]
return channels
async def get_total_packet_count(
period_type: str | None = None,
length: int | None = None,
channel: str | None = None,
from_node: int | None = None,
to_node: int | None = None,
):
"""
Count total packets, with ALL filters optional.
If no filters -> return ALL packets ever.
Uses import_time_us (microseconds).
"""
# CASE 1: no filters -> count everything
if (
period_type is None
and length is None
and channel is None
and from_node is None
and to_node is None
):
async with database.async_session() as session:
q = select(func.count(Packet.id))
res = await session.execute(q)
return res.scalar() or 0
# CASE 2: filtered mode -> compute time window using import_time_us
now_us = int(datetime.now().timestamp() * 1_000_000)
if period_type is None:
period_type = "day"
if length is None:
length = 1
if period_type == "hour":
start_time_us = now_us - (length * 3600 * 1_000_000)
elif period_type == "day":
start_time_us = now_us - (length * 86400 * 1_000_000)
else:
raise ValueError("period_type must be 'hour' or 'day'")
async with database.async_session() as session:
q = select(func.count(Packet.id)).where(Packet.import_time_us >= start_time_us)
if channel:
q = q.where(func.lower(Packet.channel) == channel.lower())
if from_node:
q = q.where(Packet.from_node_id == from_node)
if to_node:
q = q.where(Packet.to_node_id == to_node)
res = await session.execute(q)
return res.scalar() or 0
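The hour/day lookback arithmetic above (a `length` of units converted into an `import_time_us` threshold in integer microseconds) can be sketched as a standalone helper; `window_start_us` is my name for it, not part of the module:

```python
def window_start_us(now_us: int, period_type: str, length: int) -> int:
    # Timestamps are stored as integer microseconds (import_time_us), so the
    # window start is "now" minus length units, scaled from seconds to µs.
    seconds_per_unit = {"hour": 3600, "day": 86400}
    if period_type not in seconds_per_unit:
        raise ValueError("period_type must be 'hour' or 'day'")
    return now_us - length * seconds_per_unit[period_type] * 1_000_000
```

Both `get_total_packet_count` and `get_total_packet_seen_count` apply this same computation so their filtered counts cover identical time windows.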
async def get_total_packet_seen_count(
packet_id: int | None = None,
period_type: str | None = None,
length: int | None = None,
channel: str | None = None,
from_node: int | None = None,
to_node: int | None = None,
):
"""
Count total PacketSeen rows.
- If packet_id is provided -> count only that packet's seen entries.
- Otherwise match EXACT SAME FILTERS as get_total_packet_count.
Uses import_time_us for time window.
"""
# SPECIAL CASE: direct packet_id lookup
if packet_id is not None:
async with database.async_session() as session:
q = select(func.count(PacketSeen.packet_id)).where(PacketSeen.packet_id == packet_id)
res = await session.execute(q)
return res.scalar() or 0
# No filters -> return ALL seen entries
if (
period_type is None
and length is None
and channel is None
and from_node is None
and to_node is None
):
async with database.async_session() as session:
q = select(func.count(PacketSeen.packet_id))
res = await session.execute(q)
return res.scalar() or 0
# Compute time window
now_us = int(datetime.now().timestamp() * 1_000_000)
if period_type is None:
period_type = "day"
if length is None:
length = 1
if period_type == "hour":
start_time_us = now_us - (length * 3600 * 1_000_000)
elif period_type == "day":
start_time_us = now_us - (length * 86400 * 1_000_000)
else:
raise ValueError("period_type must be 'hour' or 'day'")
# JOIN Packet so we can apply identical filters
async with database.async_session() as session:
q = (
select(func.count(PacketSeen.packet_id))
.join(Packet, Packet.id == PacketSeen.packet_id)
.where(Packet.import_time_us >= start_time_us)
)
if channel:
q = q.where(func.lower(Packet.channel) == channel.lower())
if from_node:
q = q.where(Packet.from_node_id == from_node)
if to_node:
q = q.where(Packet.to_node_id == to_node)
res = await session.execute(q)
return res.scalar() or 0


@@ -1,26 +1,27 @@
<!doctype html>
<html lang="en" data-bs-theme="dark">
<head>
<title>Meshview</title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<!-- Scripts -->
<script src="https://unpkg.com/htmx.org@1.9.11" integrity="sha384-0gxUXCCR8yv9FM2b+U3FDbsKthCI66oH5IA9fHppQq9DDMHuMauqq1ZHBpJxQ0J0" crossorigin="anonymous"></script>
<script src="https://unpkg.com/htmx.org@1.9.11/dist/ext/sse.js" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/js/bootstrap.bundle.min.js" crossorigin="anonymous"></script>
<!-- Stylesheets -->
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css" rel="stylesheet" crossorigin="anonymous">
<link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css" crossorigin=""/>
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js" crossorigin=""></script>
{% block head %}{% endblock %}
<style>
body {
opacity: 0;
transition: opacity 0.3s ease-in-out;
}
body.ready {
opacity: 1;
}
.htmx-indicator {
opacity: 0;
transition: opacity 500ms ease-in;
@@ -28,47 +29,174 @@
.htmx-request .htmx-indicator {
opacity: 1;
}
#search_form {
z-index: 4000;
}
#details_map {
width: 100%;
height: 500px;
}
{% block css %}{% endblock %}
</style>
</head>
<body>
{% set site = site_config.get("site", {}) %}
<br>
<div style="text-align:center" id="site-header"></div>
<div style="text-align:center" id="site-message"></div>
<div style="text-align:center" id="site-menu"></div>
<br>
{% block body %}{% endblock %}
{% include "search_form.html" %}
<br>
<div style="text-align:center" id="footer" data-translate="footer"></div>
<div style="text-align:center"><small id="site-version">ver. unknown</small></div>
<br>
<script>
// --- Shared Promises ---
if (!window._siteConfigPromise) {
window._siteConfigPromise = (async () => {
try {
const res = await fetch("/api/config");
const cfg = await res.json();
window._siteConfig = cfg;
console.log("Loaded config:", cfg);
return cfg;
} catch (err) {
console.error("Failed to load /api/config:", err);
return {};
}
})();
}
// --- Load language AFTER config ---
if (!window._langPromise) {
window._langPromise = (async () => {
try {
const cfg = await window._siteConfigPromise;
const site = cfg.site || {};
const userLang = site.language || "en";
const section = "base";
const url = `/api/lang?lang=${userLang}&section=${section}`;
const res = await fetch(url);
const lang = await res.json();
window._lang = lang;
console.log(`Loaded language (${userLang}):`, lang);
return lang;
} catch (err) {
console.error("Failed to load language:", err);
return {};
}
})();
}
// --- Translation Helper ---
function applyTranslations(dict) {
document.querySelectorAll("[data-translate]").forEach(el => {
const key = el.dataset.translate;
const value = dict[key];
if (!value) return;
if (el.placeholder) {
el.placeholder = value;
} else if (el.tagName === "INPUT" && el.value) {
el.value = value;
} else if (key === "footer") {
el.innerHTML = value;
} else {
el.textContent = value;
}
});
}
// --- Fill portnum select dynamically ---
function fillPortnumSelect(dict, selectedValue) {
const sel = document.getElementById("portnum_select");
if (!sel) return;
const portOptions = dict.portnum_options || {};
sel.innerHTML = "";
const allOption = document.createElement("option");
allOption.value = "";
allOption.textContent = dict.all || "All";
if (!selectedValue) allOption.selected = true;
sel.appendChild(allOption);
for (const [val, label] of Object.entries(portOptions)) {
const opt = document.createElement("option");
opt.value = val;
opt.textContent = label;
if (parseInt(val) === parseInt(selectedValue)) {
opt.selected = true;
}
sel.appendChild(opt);
}
}
// --- Main Init ---
async function initializePage() {
try {
const [cfg, lang] = await Promise.all([
window._siteConfigPromise,
window._langPromise
]);
const dict = lang || {};
const site = cfg.site || {};
// Title
document.title = "Meshview - " + (site.title || "");
// Header & Message
document.getElementById("site-header").innerHTML =
`<strong>${site.title || ""} ${site.domain ? "(" + site.domain + ")" : ""}</strong>`;
document.getElementById("site-message").textContent = site.message || "";
// Menu
const menu = document.getElementById("site-menu");
if (menu) {
const items = [];
const keys = ["nodes", "chat", "everything", "graphs", "net", "map", "stats", "top"];
const urls = ["/nodelist", "/chat", "/firehose", "/nodegraph", "/net", "/map", "/stats", "/top"];
for (let i = 0; i < keys.length; i++) {
const key = keys[i];
if (site[key] === "true") {
items.push(`<a href="${urls[i]}">${dict[key] || key}</a>`);
}
}
menu.innerHTML = items.join("&nbsp;-&nbsp;");
}
// Version
document.getElementById("site-version").textContent =
"ver. " + (site.version || "unknown");
// Apply translations
applyTranslations(dict);
fillPortnumSelect(dict, "{{ portnum or '' }}");
document.body.classList.add("ready");
} catch (err) {
console.error("Failed to initialize page:", err);
document.body.classList.add("ready");
}
}
document.addEventListener("DOMContentLoaded", initializePage);
</script>
</body>
</html>
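The base template above caches `/api/config` behind `window._siteConfigPromise`: the first caller starts the fetch and every later caller awaits the same promise, so the request is made at most once per page load. The pattern is plain async memoization; a minimal sketch, with a hypothetical `memoizeAsync` helper name that is not part of the repo:

```javascript
// Async memoization: the first call starts the work, later calls
// reuse the same in-flight (or settled) promise, so fn runs once.
function memoizeAsync(fn) {
  let promise = null;
  return () => (promise ??= fn());
}

// Usage mirroring base.html's config loader (the endpoint is faked here):
let calls = 0;
const getConfig = memoizeAsync(async () => { calls += 1; return { site: {} }; });
getConfig();
getConfig(); // reuses the first promise; the loader body runs only once
```

The same idea covers `window._langPromise`, which additionally awaits the config promise before picking the language.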

View File

@@ -1,16 +0,0 @@
<div id="buttons" class="btn-group" role="group">
<a
role="button"
class="btn {{ 'btn-primary' if packet_event == 'packet' else 'btn-secondary'}}"
href="/packet_list/{{node_id}}?{{query_string}}"
>
TX/RX
</a>
<a
role="button"
class="btn {{ 'btn-primary' if packet_event == 'uplinked' else 'btn-secondary'}}"
href="/uplinked_list/{{node_id}}?{{query_string}}"
>
Uplinked
</a>
</div>

View File

@@ -3,176 +3,210 @@
{% block css %}
.timestamp {
min-width: 10em;
color: #ccc;
}
.chat-packet:nth-of-type(odd) { background-color: #3a3a3a; }
.chat-packet {
border-bottom: 1px solid #555;
padding: 3px 6px;
border-radius: 6px;
margin: 0;
}
/* Same column spacing as before */
.chat-packet > [class^="col-"] {
padding-left: 10px !important;
padding-right: 10px !important;
padding-top: 1px !important;
padding-bottom: 1px !important;
}
.chat-packet:nth-of-type(even) { background-color: #333333; }
.channel {
font-style: italic;
color: #bbb;
}
.channel a {
font-style: normal;
color: #999;
}
@keyframes flash {
0% { background-color: #ffe066; }
100% { background-color: inherit; }
}
.chat-packet.flash { animation: flash 3.5s ease-out; }
/* Nested reply style below the message */
.replying-to {
font-size: 0.8em;
color: #aaa;
margin-top: 2px;
padding-left: 10px;
}
.replying-to .reply-preview { color: #aaa; }
{% endblock %}
{% block body %}
<div id="chat-container" class="mt-3">
<!-- ⭐ CHAT TITLE WITH ICON, aligned to container ⭐ -->
<div class="container px-2">
<h2 data-translate="chat_title" style="color:white; margin:0 0 10px 0;">
💬 Chat
</h2>
</div>
<div class="container" id="chat-log"></div>
</div>
<script>
document.addEventListener("DOMContentLoaded", async () => {
const chatContainer = document.querySelector("#chat-log");
if (!chatContainer) return console.error("#chat-log not found");
let lastTime = null;
const renderedPacketIds = new Set();
const packetMap = new Map(); // store all packets we've seen
let chatLang = {};
function applyTranslations(dict, root = document) {
root.querySelectorAll("[data-translate]").forEach(el => {
const key = el.dataset.translate;
const val = dict[key];
if (!val) return;
if (el.placeholder) el.placeholder = val;
else if (el.tagName === "INPUT" && el.value) el.value = val;
else if (key === "footer") el.innerHTML = val;
else el.textContent = val;
});
}
function escapeHtml(text) {
const div = document.createElement("div");
div.textContent = text ?? "";
return div.innerHTML;
}
function renderPacket(packet, highlight = false) {
if (renderedPacketIds.has(packet.id)) return;
renderedPacketIds.add(packet.id);
packetMap.set(packet.id, packet);
let date;
if (packet.import_time_us && packet.import_time_us > 0) {
date = new Date(packet.import_time_us / 1000);
} else if (packet.import_time) {
date = new Date(packet.import_time);
} else {
date = new Date();
}
const formattedTime = date.toLocaleTimeString([], {
hour:"numeric",
minute:"2-digit",
second:"2-digit",
hour12:true
});
const formattedDate =
`${(date.getMonth()+1).toString().padStart(2,"0")}/` +
`${date.getDate().toString().padStart(2,"0")}/` +
`${date.getFullYear()}`;
const formattedTimestamp = `${formattedTime} - ${formattedDate}`;
let replyHtml = "";
if (packet.reply_id) {
const parent = packetMap.get(packet.reply_id);
const replyPrefix = `<i data-translate="replying_to"></i>`;
if (parent) {
replyHtml = `
<div class="replying-to">
${replyPrefix}
<strong>${escapeHtml((parent.long_name || "").trim() || `Node ${parent.from_node_id}`)}</strong>:
${escapeHtml(parent.payload || "")}
</div>`;
} else {
replyHtml = `
<div class="replying-to">
${replyPrefix}
<a href="/packet/${packet.reply_id}">${packet.reply_id}</a>
</div>`;
}
}
const div = document.createElement("div");
div.className = "row chat-packet" + (highlight ? " flash" : "");
div.dataset.packetId = packet.id;
div.innerHTML = `
<span class="col-2 timestamp" title="${packet.import_time_us}">${formattedTimestamp}</span>
<span class="col-2 channel">
<a href="/packet/${packet.id}" title="${chatLang.view_packet_details || 'View details'}">🔎</a>
${escapeHtml(packet.channel || "")}
</span>
<span class="col-3 nodename">
<a href="/node/${packet.from_node_id}">
${escapeHtml((packet.long_name || "").trim() || `Node ${packet.from_node_id}`)}
</a>
</span>
<span class="col-5 message">${escapeHtml(packet.payload)}${replyHtml}</span>
`;
chatContainer.prepend(div);
applyTranslations(chatLang, div);
if (highlight) setTimeout(() => div.classList.remove("flash"), 2500);
}
function renderPacketsEnsureDescending(packets, highlight=false) {
if (!Array.isArray(packets) || packets.length===0) return;
const sortedDesc = packets.slice().sort((a,b)=>{
const aTime =
(a.import_time_us && a.import_time_us > 0)
? a.import_time_us
: (a.import_time ? new Date(a.import_time).getTime() * 1000 : 0);
const bTime =
(b.import_time_us && b.import_time_us > 0)
? b.import_time_us
: (b.import_time ? new Date(b.import_time).getTime() * 1000 : 0);
return bTime - aTime;
});
for (let i=sortedDesc.length-1; i>=0; i--) renderPacket(sortedDesc[i], highlight);
}
async function fetchInitial() {
try {
const resp = await fetch("/api/packets?portnum=1&limit=100");
const data = await resp.json();
if (data?.packets?.length) renderPacketsEnsureDescending(data.packets);
lastTime = data?.latest_import_time || lastTime;
} catch(err){ console.error("Initial fetch error:", err); }
}
async function fetchUpdates() {
try {
const url = new URL("/api/packets?portnum=1", window.location.origin);
url.searchParams.set("limit","100");
if (lastTime) url.searchParams.set("since", lastTime);
const resp = await fetch(url);
const data = await resp.json();
if (data?.packets?.length) renderPacketsEnsureDescending(data.packets, true);
lastTime = data?.latest_import_time || lastTime;
} catch(err){ console.error("Fetch updates error:", err); }
}
async function loadChatLang() {
try {
const cfg = await window._siteConfigPromise;
const langCode = cfg?.site?.language || "en";
const res = await fetch(`/api/lang?lang=${langCode}&section=chat`);
chatLang = await res.json();
applyTranslations(chatLang);
} catch(err){ console.error("Chat translation load failed:", err); }
}
await Promise.all([loadChatLang(), fetchInitial()]);
setInterval(fetchUpdates, 5000);
});
</script>
{% endblock %}
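The renderer above has to cope with two timestamp shapes: a microsecond `import_time_us` field (preferred when positive, which also sidesteps the 0-epoch dates mentioned in the commit log) and an ISO `import_time` string. That fallback chain can be isolated as a small pure helper; `toMillis` is a hypothetical name for illustration, not a function in the template:

```javascript
// Normalize a packet timestamp to milliseconds since the epoch:
// prefer a positive import_time_us (microseconds), else parse the
// import_time ISO string, else fall back to "now".
function toMillis(packet) {
  if (packet.import_time_us && packet.import_time_us > 0) {
    return packet.import_time_us / 1000;           // µs -> ms
  }
  if (packet.import_time) {
    return new Date(packet.import_time).getTime(); // ISO string -> ms
  }
  return Date.now();                               // last resort
}
```

Both `renderPacket` (for display) and the descending sort in `renderPacketsEnsureDescending` apply this same preference order, just at different scales (the sort compares in microseconds).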

View File

@@ -1,7 +0,0 @@
<datalist
id="node_options"
>
{% for option in node_options %}
<option value="{{option.id}}">{{option.id}} -- {{option.long_name}} ({{option.short_name}})</option>
{% endfor %}
</datalist>

View File

@@ -1,101 +1,312 @@
{% extends "base.html" %}
{% block css %}
.container {
max-width: 900px;
margin: 0 auto;
}
.container {
margin: 0 auto;
padding: 10px;
}
#pause-button {
white-space: nowrap;
padding: 4px 10px;
font-size: 0.9rem;
border-radius: 6px;
}
#pause-button {
white-space: nowrap;
padding: 2px 8px;
font-size: 0.85rem;
}
.packet-table {
width: 100%;
border-collapse: collapse;
font-size: 0.85rem;
color: #e4e9ee;
}
.packet-table th, .packet-table td {
border: 1px solid #3a3f44;
padding: 6px 10px;
text-align: left;
}
.packet-table th {
background-color: #1f2226;
font-weight: bold;
}
.packet-table tr:nth-of-type(odd) { background-color: #272b2f; }
.packet-table tr:nth-of-type(even) { background-color: #212529; }
.port-tag {
display: inline-block;
padding: 1px 6px;
border-radius: 6px;
font-size: 0.75rem;
font-weight: 500;
color: #fff;
}
.port-0 { background-color: #6c757d; }
.port-1 { background-color: #007bff; }
.port-3 { background-color: #28a745; }
.port-4 { background-color: #ffc107; }
.port-5 { background-color: #dc3545; }
.port-6 { background-color: #20c997; }
.port-65 { background-color: #ff66b3; }
.port-67 { background-color: #17a2b8; }
.port-70 { background-color: #6f42c1; }
.port-71 { background-color: #fd7e14; }
.to-mqtt { font-style: italic; color: #aaa; }
.payload-row { display: none; background-color: #1b1e22; }
.payload-cell {
padding: 8px 12px;
font-family: monospace;
white-space: pre-wrap;
color: #b0bec5;
border-top: none;
}
.packet-table tr.expanded + .payload-row { display: table-row; }
.toggle-btn {
cursor: pointer;
color: #aaa;
margin-right: 6px;
font-weight: bold;
}
.toggle-btn:hover { color: #fff; }
/* Link next to port tag */
.inline-link {
margin-left: 6px;
font-weight: bold;
text-decoration: none;
color: #9fd4ff;
}
.inline-link:hover {
color: #c7e6ff;
}
{% endblock %}
{% block body %}
<div class="container">
<form class="d-flex align-items-center justify-content-between mb-3">
<h5 class="mb-0" data-translate-lang="live_feed">📡 Live Feed</h5>
<button type="button" id="pause-button" class="btn btn-sm btn-outline-secondary" data-translate-lang="pause">Pause</button>
</form>
<table class="packet-table">
<thead>
<tr>
<th data-translate-lang="time">Time</th>
<th data-translate-lang="packet_id">Packet ID</th>
<th data-translate-lang="from">From</th>
<th data-translate-lang="to">To</th>
<th data-translate-lang="port">Port</th>
</tr>
</thead>
<tbody id="packet_list"></tbody>
</table>
</div>
<script>
let lastImportTimeUs = null;
let updatesPaused = false;
let nodeMap = {};
let updateInterval = 3000;
let firehoseTranslations = {};
function applyTranslations(translations, root=document) {
root.querySelectorAll("[data-translate-lang]").forEach(el => {
const key = el.dataset.translateLang;
if (translations[key]) el.textContent = translations[key];
});
}
async function loadTranslations() {
try {
const cfg = await window._siteConfigPromise;
const langCode = cfg?.site?.language || "en";
const res = await fetch(`/api/lang?lang=${langCode}&section=firehose`);
firehoseTranslations = await res.json();
applyTranslations(firehoseTranslations, document);
} catch (err) {
console.error("Firehose translation load failed:", err);
}
}
const PORT_MAP = {
0: "UNKNOWN APP",
1: "Text Message",
3: "Position",
4: "Node Info",
5: "Routing",
6: "Administration",
8: "Waypoint",
65: "Store Forward",
67: "Telemetry",
70: "Trace Route",
71: "Neighbor Info",
};
const PORT_COLORS = {
0: "#6c757d",
1: "#007bff",
3: "#28a745",
4: "#ffc107",
5: "#dc3545",
6: "#20c997",
65: "#6610f2",
67: "#17a2b8",
68: "#fd7e14",
69: "#6f42c1",
70: "#ff4444",
71: "#ff66cc",
72: "#00cc99",
73: "#9999ff",
74: "#cc00cc",
75: "#ffbb33",
76: "#00bcd4",
77: "#8bc34a",
78: "#795548"
};
// Load node names
async function loadNodes() {
const res = await fetch("/api/nodes");
if (!res.ok) return;
const data = await res.json();
for (const n of data.nodes || []) {
nodeMap[n.node_id] = n.long_name || n.short_name || n.id || n.node_id;
}
nodeMap[4294967295] = "All";
}
function nodeName(id) {
if (id === 4294967295) return `<span class="to-mqtt">All</span>`;
return nodeMap[id] || id;
}
function portLabel(portnum, payload, linksHtml) {
const name = PORT_MAP[portnum] || "Unknown";
const color = PORT_COLORS[portnum] || "#6c757d";
const safePayload = payload ? payload.replace(/"/g, "&quot;") : "";
return `
<span class="port-tag" style="background-color:${color}" title="${safePayload}">
${name}
</span>
<span class="text-secondary">(${portnum})</span>
${linksHtml || ""}
`;
}
function formatLocalTime(importTimeUs) {
const ms = importTimeUs / 1000;
const date = new Date(ms);
return date.toLocaleTimeString([], { hour: "2-digit", minute: "2-digit", second: "2-digit" });
}
async function configureFirehose() {
try {
const cfg = await window._siteConfigPromise;
const intervalSec = cfg?.site?.firehose_interval;
if (intervalSec && !isNaN(intervalSec)) updateInterval = parseInt(intervalSec) * 1000;
} catch (err) {
console.warn("Failed to read firehose interval:", err);
}
}
async function fetchUpdates() {
if (updatesPaused) return;
const url = new URL("/api/packets", window.location.origin);
if (lastImportTimeUs) url.searchParams.set("since", lastImportTimeUs);
url.searchParams.set("limit", 50);
try {
const res = await fetch(url);
if (!res.ok) return;
const data = await res.json();
const packets = data.packets || [];
if (!packets.length) return;
const list = document.getElementById("packet_list");
for (const pkt of packets.reverse()) {
const from = pkt.from_node_id === 4294967295
? `<span class="to-mqtt">All</span>`
: `<a href="/node/${pkt.from_node_id}" style="text-decoration:underline; color:inherit;">${nodeMap[pkt.from_node_id] || pkt.from_node_id}</a>`;
const to = pkt.to_node_id === 1
? `<span class="to-mqtt">direct to MQTT</span>`
: pkt.to_node_id === 4294967295
? `<span class="to-mqtt">All</span>`
: `<a href="/node/${pkt.to_node_id}" style="text-decoration:underline; color:inherit;">${nodeMap[pkt.to_node_id] || pkt.to_node_id}</a>`;
// Inline link next to port tag
let inlineLinks = "";
if (pkt.portnum === 3 && pkt.payload) {
const latMatch = pkt.payload.match(/latitude_i:\s*(-?\d+)/);
const lonMatch = pkt.payload.match(/longitude_i:\s*(-?\d+)/);
if (latMatch && lonMatch) {
const lat = parseInt(latMatch[1]) / 1e7;
const lon = parseInt(lonMatch[1]) / 1e7;
inlineLinks += ` <a class="inline-link" href="https://www.google.com/maps?q=${lat},${lon}" target="_blank">📍</a>`;
}
}
if (pkt.portnum === 70) {
let traceId = pkt.id;
const match = (pkt.payload || "").match(/ID:\s*(\d+)/i);
if (match) traceId = match[1];
inlineLinks += ` <a class="inline-link" href="/graph/traceroute/${traceId}" target="_blank">⮕</a>`;
}
const safePayload = (pkt.payload || "").replace(/</g, "&lt;").replace(/>/g, "&gt;");
const localTime = formatLocalTime(pkt.import_time_us);
const html = `
<tr class="packet-row" data-id="${pkt.id}">
<td>${localTime}</td>
<td><span class="toggle-btn">▶</span> <a href="/packet/${pkt.id}" style="text-decoration:underline; color:inherit;">${pkt.id}</a></td>
<td>${from}</td>
<td>${to}</td>
<td>${portLabel(pkt.portnum, pkt.payload, inlineLinks)}</td>
</tr>
<tr class="payload-row">
<td colspan="5" class="payload-cell">${safePayload}</td>
</tr>`;
list.insertAdjacentHTML("afterbegin", html);
}
while (list.rows.length > 400) list.deleteRow(-1);
lastImportTimeUs = packets[packets.length - 1].import_time_us;
} catch (err) {
console.error("Packet fetch failed:", err);
}
}
// --- Initialize ---
document.addEventListener("DOMContentLoaded", async () => {
const pauseBtn = document.getElementById("pause-button");
pauseBtn.addEventListener("click", () => {
updatesPaused = !updatesPaused;
pauseBtn.textContent = updatesPaused
? (firehoseTranslations.resume || "Resume")
: (firehoseTranslations.pause || "Pause");
});
document.addEventListener("click", (e) => {
const btn = e.target.closest(".toggle-btn");
if (!btn) return;
const row = btn.closest(".packet-row");
row.classList.toggle("expanded");
btn.textContent = row.classList.contains("expanded") ? "▼" : "▶";
});
await loadTranslations();
await configureFirehose();
await loadNodes();
fetchUpdates();
setInterval(fetchUpdates, updateInterval);
});
</script>
{% endblock %}
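The firehose's `fetchUpdates` above polls incrementally: it sends the newest `import_time_us` it has seen back as the `since` parameter, so each request returns only packets newer than that cursor. Advancing the cursor monotonically is the part worth getting right; a sketch with an illustrative helper name (`advanceCursor` is not in the template):

```javascript
// Return the new polling cursor: the maximum import_time_us across
// the batch, never moving backwards from the previous cursor.
function advanceCursor(packets, since) {
  let latest = since || 0;
  for (const p of packets) {
    if (p.import_time_us > latest) latest = p.import_time_us;
  }
  return latest;
}
```

Keeping the cursor monotonic means a batch that arrives out of order (or empty) can never cause already-seen packets to be re-fetched on the next poll.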

View File

@@ -1,47 +1,36 @@
{% extends "base.html" %}
{% block css %}
.legend { background:white;padding:8px;line-height:1.5;border-radius:5px;box-shadow:0 0 10px rgba(0,0,0,0.3);font-size:14px;color:black; }
.legend i { width:12px;height:12px;display:inline-block;margin-right:6px;border-radius:50%; }
#filter-container { text-align:center;margin-top:10px; }
.filter-checkbox { margin:0 10px; }
#share-button, #reset-filters-button {
padding:5px 15px;border:none;border-radius:4px;font-size:14px;cursor:pointer;color:white;
}
#share-button { margin-left:20px; background-color:#4CAF50; }
#share-button:hover { background-color:#45a049; }
#share-button:active { background-color:#3d8b40; }
#reset-filters-button { margin-left:10px; background-color:#f44336; }
#reset-filters-button:hover { background-color:#da190b; }
#reset-filters-button:active { background-color:#c41e0d; }
.blinking-tooltip { background:white;color:black;border:1px solid black;border-radius:4px;padding:2px 5px; }
{% endblock %}
{% block body %}
<div id="map" style="width:100%;height:calc(100vh - 270px)"></div>
<div id="filter-container">
<input type="checkbox" class="filter-checkbox" id="filter-routers-only"> Show Routers Only
</div>
<div style="text-align:center;margin-top:5px;">
<button id="share-button" onclick="shareCurrentView()">🔗 Share This View</button>
<button id="reset-filters-button" onclick="resetFiltersToDefaults()">↺ Reset Filters To Defaults</button>
</div>
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"
integrity="sha256-20nQCchB9co0qIjJZRGuk2/Z9VM+kNiyxNV1lvTlZBo="
crossorigin></script>
<script>
// ---------------------- Map Initialization ----------------------
var map = L.map('map');
L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', { maxZoom:19, attribution:'&copy; OpenStreetMap' }).addTo(map);
// ---------------------- Globals ----------------------
var nodes=[], markers={}, markerById={}, nodeMap = new Map();
var edgesData=[], edgeLayer = L.layerGroup().addTo(map), selectedNodeId = null;
var activeBlinks = new Map(), lastImportTime = null;
var mapInterval = 0;
const portMap = {1:"Text",67:"Telemetry",3:"Position",70:"Traceroute",4:"Node Info",71:"Neighbour Info",73:"Map Report"};
const palette = ["#e6194b","#4363d8","#f58231","#911eb4","#46f0f0","#f032e6","#bcf60c","#fabebe","#008080","#e6beff","#9a6324","#fffac8","#800000","#aaffc3","#808000","#ffd8b1","#000075","#808080"];
const colorMap = new Map(); let nextColorIndex = 0;
const channelSet = new Set();
// ---------------------- Helpers ----------------------
function timeAgo(date){ const diff=Date.now()-new Date(date), s=Math.floor(diff/1000), m=Math.floor(s/60), h=Math.floor(m/60), d=Math.floor(h/24); return d>0?d+"d":h>0?h+"h":m>0?m+"m":s+"s"; }
function hashToColor(str){ if(colorMap.has(str)) return colorMap.get(str); const c=palette[nextColorIndex++%palette.length]; colorMap.set(str,c); return c; }
function isInvalidCoord(n){ return !n||!n.lat||!n.long||n.lat===0||n.long===0||Number.isNaN(n.lat)||Number.isNaN(n.long); }
// ---------------------- Packet Fetching ----------------------
function fetchLatestPacket(){
fetch(`/api/packets?limit=1`)
.then(r=>r.json())
.then(data=>{
lastImportTime=data.packets?.[0]?.import_time_us||0;
})
.catch(console.error);
}
function fetchNewPackets(){
if(mapInterval <= 0) return;
if(lastImportTime===null) return;
const url = new URL(`/api/packets`, window.location.origin);
url.searchParams.set("since", lastImportTime);
url.searchParams.set("limit", 50);
fetch(url)
.then(r=>r.json())
.then(data=>{
if(!data.packets || data.packets.length===0) return;
let latest = lastImportTime;
data.packets.forEach(pkt=>{
if(pkt.import_time_us > latest) latest = pkt.import_time_us;
const marker = markerById[pkt.from_node_id];
const nodeData = nodeMap.get(pkt.from_node_id);
if(marker && nodeData) blinkNode(marker,nodeData.long_name,pkt.portnum);
});
lastImportTime = latest;
})
.catch(console.error);
}
// ---------------------- Polling ----------------------
let packetInterval=null;
function startPacketFetcher(){
if(mapInterval<=0) return;
if(!packetInterval){
fetchLatestPacket();
packetInterval=setInterval(fetchNewPackets,mapInterval*1000);
}
}
function stopPacketFetcher(){
if(packetInterval){
clearInterval(packetInterval);
packetInterval=null;
}
}
document.addEventListener("visibilitychange",()=>{
document.hidden?stopPacketFetcher():startPacketFetcher();
});
// ---------------------- WAIT FOR CONFIG ----------------------
async function waitForConfig() {
while (typeof window._siteConfigPromise === "undefined") {
console.log("Waiting for _siteConfigPromise...");
await new Promise(r => setTimeout(r, 100));
}
try {
const cfg = await window._siteConfigPromise;
if (!cfg || !cfg.site) throw new Error("Config missing site object");
return cfg.site;
} catch (err) {
console.error("Error loading site config:", err);
return {};
}
}
const nodeMap = new Map();
nodes.forEach(n => nodeMap.set(n.id, n));
function isInvalidCoord(node) {
if (!node) return true;
let {lat, long} = node;
return !lat || !long || lat === 0 || long === 0 || Number.isNaN(lat) || Number.isNaN(long);
}
// ---------------------- Load Config & Start Polling ----------------------
async function initMapPolling() {
try {
const site = await waitForConfig();
mapInterval = parseInt(site.map_interval, 10) || 0;
// ---- Check URL params ----
const params = new URLSearchParams(window.location.search);
const lat = parseFloat(params.get('lat'));
const lng = parseFloat(params.get('lng'));
const zoom = parseInt(params.get('zoom'), 10);
if (!isNaN(lat) && !isNaN(lng) && !isNaN(zoom)) {
map.setView([lat, lng], zoom);
window.configBoundsApplied = true;
setTimeout(() => map.invalidateSize(), 100);
} else {
const topLeft = [parseFloat(site.map_top_left_lat), parseFloat(site.map_top_left_lon)];
const bottomRight = [parseFloat(site.map_bottom_right_lat), parseFloat(site.map_bottom_right_lon)];
if (topLeft.every(isFinite) && bottomRight.every(isFinite)) {
map.fitBounds([topLeft, bottomRight]);
window.configBoundsApplied = true;
setTimeout(() => map.invalidateSize(), 100);
}
}
if (mapInterval > 0) {
console.log(`Starting map polling every ${mapInterval}s`);
startPacketFetcher();
} else {
console.log("Map polling disabled (map_interval=0)");
}
} catch (err) {
console.error("Failed to load /api/config:", err);
}
}
// ---- Marker Plotting ----
initMapPolling();
// ---------------------- Load Nodes + Edges ----------------------
fetch('/api/nodes?days_active=3').then(r=>r.json()).then(data=>{
if(!data.nodes) return;
nodes = data.nodes.map(n=>({
key: n.node_id!==null?n.node_id:n.id,
id: n.id,
node_id: n.node_id,
lat: n.last_lat?n.last_lat/1e7:null,
long: n.last_long?n.last_long/1e7:null,
long_name: n.long_name||"",
short_name: n.short_name||"",
channel: n.channel||"",
hw_model: n.hw_model||"",
role: n.role||"",
firmware: n.firmware||"",
last_update: n.last_update||"",
isRouter: n.role? n.role.toLowerCase().includes("router"):false
}));
nodes.forEach(n=>{ nodeMap.set(n.key,n); if(n.channel) channelSet.add(n.channel); });
renderNodesOnMap();
createChannelFilters();
return fetch('/api/edges');
}).then(r=>r?r.json():null).then(data=>{
if(data && data.edges) edgesData=data.edges;
}).catch(console.error);
// ---------------------- Render Nodes ----------------------
function renderNodesOnMap(){
const bounds = L.latLngBounds();
nodes.forEach(node=>{
if(isInvalidCoord(node)) return;
const color = hashToColor(node.channel);
const opts = { radius: node.isRouter?9:7, color:"white", fillColor:color, fillOpacity:1, weight:0.7 };
const marker = L.circleMarker([node.lat,node.long],opts).addTo(map);
marker.nodeId = node.key;
marker.originalColor = color;
markerById[node.key] = marker;
const popup = `<b><a href="/node/${node.node_id}">${node.long_name}</a> (${node.short_name})</b><br>
<b>Channel:</b> ${node.channel}<br>
<b>Model:</b> ${node.hw_model}<br>
<b>Role:</b> ${node.role}<br>
${node.last_update? `<b>Last seen:</b> ${timeAgo(node.last_update)}<br>`:""}
${node.firmware? `<b>Firmware:</b> ${node.firmware}<br>`:""}`;
marker.on('click',()=>{ onNodeClick(node); marker.bindPopup(popup).openPopup(); setTimeout(()=>marker.closePopup(),3000); });
bounds.extend(marker.getLatLng());
});
if(!window.configBoundsApplied && bounds.isValid()){
map.fitBounds(bounds);
setTimeout(()=>map.invalidateSize(),100);
}
}
// ---------------------- Render Edges ----------------------
function onNodeClick(node){
selectedNodeId = node.key;
edgeLayer.clearLayers();
edgesData.forEach(edge=>{
if(edge.from!==node.key && edge.to!==node.key) return;
const f=nodeMap.get(edge.from), t=nodeMap.get(edge.to);
if(!f||!t||isInvalidCoord(f)||isInvalidCoord(t)) return;
const color=edge.type==="neighbor"?"gray":"orange";
const l=L.polyline([[f.lat,f.long],[t.lat,t.long]],{color,weight:3}).addTo(edgeLayer);
if(edge.type==="traceroute"){
L.polylineDecorator(l,{patterns:[{offset:'100%',repeat:0,symbol:L.Symbol.arrowHead({pixelSize:5,polygon:false,pathOptions:{stroke:true,color}})}]}).addTo(edgeLayer);
}
});
}
map.on('click',e=>{ if(!e.originalEvent.target.classList.contains('leaflet-interactive')){ edgeLayer.clearLayers(); selectedNodeId=null; } });
// ---------------------- Packet Blinking ----------------------
function blinkNode(marker,longName,portnum){
if(!map.hasLayer(marker)) return;
if(activeBlinks.has(marker)){ clearInterval(activeBlinks.get(marker)); marker.setStyle({fillColor:marker.originalColor}); if(marker.tooltip) map.removeLayer(marker.tooltip); }
let blinkCount=0;
const portName = portMap[portnum]||`Port ${portnum}`;
const tooltip = L.tooltip({permanent:true,direction:'top',offset:[0,-marker.options.radius-5],className:'blinking-tooltip'})
.setContent(`${longName} (${portName})`).setLatLng(marker.getLatLng()).addTo(map);
marker.tooltip = tooltip;
const interval = setInterval(()=>{
if(map.hasLayer(marker)){ marker.setStyle({fillColor: blinkCount%2===0?'yellow':marker.originalColor}); marker.bringToFront(); }
blinkCount++;
if(blinkCount>7){ clearInterval(interval); marker.setStyle({fillColor:marker.originalColor}); map.removeLayer(tooltip); activeBlinks.delete(marker); }
},500);
activeBlinks.set(marker,interval);
}
// ---------------------- Channel Filters ----------------------
function createChannelFilters(){
const filterContainer = document.getElementById("filter-container");
const savedState = JSON.parse(localStorage.getItem("mapFilters") || "{}");
channelSet.forEach(channel=>{
const checkbox = document.createElement("input");
checkbox.type = "checkbox";
checkbox.className = "filter-checkbox";
checkbox.id = `filter-channel-${channel}`;
checkbox.checked = savedState[channel] !== false;
checkbox.addEventListener("change", saveFiltersToLocalStorage);
checkbox.addEventListener("change", updateNodeVisibility);
filterContainer.appendChild(checkbox);
const label = document.createElement("label");
label.htmlFor = checkbox.id;
label.innerText = channel;
label.style.color = hashToColor(channel);
filterContainer.appendChild(label);
});
const routerOnly = document.getElementById("filter-routers-only");
routerOnly.checked = savedState["routersOnly"] || false;
routerOnly.addEventListener("change", saveFiltersToLocalStorage);
routerOnly.addEventListener("change", updateNodeVisibility);
updateNodeVisibility();
}
// ---- Edges ----
var edgeLayer = L.layerGroup().addTo(map);
var edgesData = null;
let selectedNodeId = null;
function saveFiltersToLocalStorage(){
const state = {};
channelSet.forEach(ch => {
const el = document.getElementById(`filter-channel-${ch}`);
state[ch] = el.checked;
});
state["routersOnly"] = document.getElementById("filter-routers-only").checked;
localStorage.setItem("mapFilters", JSON.stringify(state));
}
function updateNodeVisibility(){
const showRoutersOnly = document.getElementById("filter-routers-only").checked;
const activeChannels = Array.from(channelSet).filter(ch=>document.getElementById(`filter-channel-${ch}`).checked);
nodes.forEach(n=>{
const marker = markerById[n.key];
if(marker){
const visible = (!showRoutersOnly || n.isRouter) && activeChannels.includes(n.channel);
if(visible) map.addLayer(marker); else map.removeLayer(marker);
}
});
}
// ---- Blinking Nodes ----
var activeBlinks = new Map();
// ---------------------- Share / Reset ----------------------
function shareCurrentView() {
const center = map.getCenter();
const zoom = map.getZoom();
const lat = center.lat.toFixed(6);
const lng = center.lng.toFixed(6);
const shareUrl = `${window.location.origin}/map?lat=${lat}&lng=${lng}&zoom=${zoom}`;
navigator.clipboard.writeText(shareUrl).then(() => {
const button = document.getElementById('share-button');
const originalText = button.textContent;
button.textContent = '✓ Link Copied!';
button.style.backgroundColor = '#2196F3';
setTimeout(() => {
button.textContent = originalText;
button.style.backgroundColor = '#4CAF50';
}, 2000);
}).catch(() => alert('Share this link:\n' + shareUrl));
}
function resetFiltersToDefaults(){
document.getElementById("filter-routers-only").checked = false;
channelSet.forEach(ch=>document.getElementById(`filter-channel-${ch}`).checked = true);
saveFiltersToLocalStorage();
updateNodeVisibility();
}
</script>
{% endblock %}
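The packet-polling code in the map template above keeps a `lastImportTime` cursor in epoch microseconds and only advances it past packets it has already seen. That cursor rule can be sketched in isolation; the helper name `advanceCursor` is illustrative and not part of the codebase:

```javascript
// Advance a microsecond cursor past a batch of packets, mirroring
// fetchNewPackets(): only packets newer than the cursor move it forward,
// so older or duplicate packets never rewind it.
function advanceCursor(lastImportTimeUs, packets) {
  let latest = lastImportTimeUs;
  for (const pkt of packets) {
    if (typeof pkt.import_time_us === "number" && pkt.import_time_us > latest) {
      latest = pkt.import_time_us;
    }
  }
  return latest;
}

// Example: the cursor jumps to the newest packet and ignores older ones.
const next = advanceCursor(1000, [
  { import_time_us: 900 },   // older than cursor: ignored
  { import_time_us: 1500 },  // newest: becomes the cursor
  { import_time_us: 1200 },
]);
console.log(next); // 1500
```

Passing the updated cursor back as the `since` query parameter on the next poll is what keeps each fetch incremental.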


@@ -1,54 +1,184 @@
{% extends "base.html" %}
{% block css %}
.timestamp { min-width: 10em; color: #ccc; }
.chat-packet:nth-of-type(odd) { background-color: #3a3a3a; }
.chat-packet {
border-bottom: 1px solid #555;
padding: 3px 6px;
border-radius: 6px;
margin: 0;
}
.chat-packet > [class^="col-"] {
padding-left: 10px !important;
padding-right: 10px !important;
padding-top: 1px !important;
padding-bottom: 1px !important;
}
.chat-packet:nth-of-type(even) { background-color: #333333; }
.channel { font-style: italic; color: #bbb; }
.channel a { font-style: normal; color: #999; }
@keyframes flash { 0% { background-color: #ffe066; } 100% { background-color: inherit; } }
.chat-packet.flash { animation: flash 3.5s ease-out; }
.replying-to { font-size: 0.8em; color: #aaa; margin-top: 2px; padding-left: 10px; }
.replying-to .reply-preview { color: #aaa; }
#weekly-message { margin: 15px 0; font-weight: bold; color: #ffeb3b; }
#total-count { margin-bottom: 10px; font-style: italic; color: #ccc; }
{% endblock %}
{% block body %}
<div class="container">
<div id="weekly-message">Loading weekly message...</div>
<div id="total-count">Total messages: 0</div>
<h5>Number of Check-ins: {{ packets|length }}</h5>
<div id="chat-container">
<div class="container" id="chat-log"></div>
</div>
</div>
<script>
document.addEventListener("DOMContentLoaded", async () => {
const chatContainer = document.querySelector("#chat-log");
const totalCountEl = document.querySelector("#total-count");
const weeklyMessageEl = document.querySelector("#weekly-message");
if (!chatContainer || !totalCountEl || !weeklyMessageEl) {
console.error("Required elements not found");
return;
}
const renderedPacketIds = new Set();
const packetMap = new Map();
let chatTranslations = {};
let netTag = "";
function updateTotalCount() {
totalCountEl.textContent = `Total messages: ${renderedPacketIds.size}`;
}
function escapeHtml(text) {
const div = document.createElement("div");
div.textContent = text ?? "";
return div.innerHTML;
}
function applyTranslations(translations, root = document) {
root.querySelectorAll("[data-translate-lang]").forEach(el => {
const key = el.dataset.translateLang;
if (translations[key]) el.textContent = translations[key];
});
root.querySelectorAll("[data-translate-lang-title]").forEach(el => {
const key = el.dataset.translateLangTitle;
if (translations[key]) el.title = translations[key];
});
}
function renderPacket(packet) {
if (renderedPacketIds.has(packet.id)) return;
renderedPacketIds.add(packet.id);
packetMap.set(packet.id, packet);
const date = new Date(packet.import_time_us / 1000);
const formattedTime = date.toLocaleTimeString([], { hour: "numeric", minute: "2-digit", second: "2-digit", hour12: true });
const formattedDate = `${(date.getMonth() + 1).toString().padStart(2, "0")}/${date.getDate().toString().padStart(2, "0")}/${date.getFullYear()}`;
const formattedTimestamp = `${formattedTime} - ${formattedDate}`;
let replyHtml = "";
if (packet.reply_id) {
const parent = packetMap.get(packet.reply_id);
if (parent) {
replyHtml = `<div class="replying-to">
<div class="reply-preview">
<i data-translate-lang="replying_to"></i>
<strong>${escapeHtml((parent.long_name || "").trim() || `Node ${parent.from_node_id}`)}</strong>:
${escapeHtml(parent.payload || "")}
</div>
</div>`;
} else {
replyHtml = `<div class="replying-to">
<i data-translate-lang="replying_to"></i>
<a href="/packet/${packet.reply_id}">${packet.reply_id}</a>
</div>`;
}
}
const div = document.createElement("div");
div.className = "row chat-packet";
div.dataset.packetId = packet.id;
div.innerHTML = `
<span class="col-2 timestamp" title="${packet.import_time_us}">${formattedTimestamp}</span>
<span class="col-2 channel">
<a href="/packet/${packet.id}" data-translate-lang-title="view_packet_details">✉️</a>
${escapeHtml(packet.channel || "")}
</span>
<span class="col-3 nodename">
<a href="/packet_list/${packet.from_node_id}">
${escapeHtml((packet.long_name || "").trim() || `Node ${packet.from_node_id}`)}
</a>
</span>
<span class="col-5 message">${escapeHtml(packet.payload)}${replyHtml}</span>
`;
chatContainer.prepend(div);
applyTranslations(chatTranslations, div);
updateTotalCount();
}
function renderPacketsEnsureDescending(packets) {
if (!Array.isArray(packets) || packets.length === 0) return;
const sortedDesc = packets.slice().sort((a, b) => b.import_time_us - a.import_time_us);
for (let i = sortedDesc.length - 1; i >= 0; i--) renderPacket(sortedDesc[i]);
}
async function fetchInitialPackets(tag) {
if (!tag) {
console.warn("No net_tag defined, skipping packet fetch.");
return;
}
try {
console.log("Fetching packets for netTag:", tag);
const sixDaysAgoMs = Date.now() - (6 * 24 * 60 * 60 * 1000);
const sinceUs = Math.floor(sixDaysAgoMs * 1000);
const resp = await fetch(`/api/packets?portnum=1&contains=${encodeURIComponent(tag)}&since=${sinceUs}`);
const data = await resp.json();
console.log("Packets received:", data?.packets?.length);
if (data?.packets?.length) renderPacketsEnsureDescending(data.packets);
} catch (err) {
console.error("Initial fetch error:", err);
}
}
async function loadTranslations(cfg) {
try {
const langCode = cfg?.site?.language || "en";
const res = await fetch(`/api/lang?lang=${langCode}&section=chat`);
chatTranslations = await res.json();
applyTranslations(chatTranslations, document);
} catch (err) {
console.error("Chat translation load failed:", err);
}
}
// --- MAIN LOGIC ---
try {
const cfg = await window._siteConfigPromise; // ✅ Already fetched by base.html
const site = cfg?.site || {};
// Populate from config
netTag = site.net_tag || "";
weeklyMessageEl.textContent = site.weekly_net_message || "Weekly message not set.";
await loadTranslations(cfg);
await fetchInitialPackets(netTag);
} catch (err) {
console.error("Initialization failed:", err);
weeklyMessageEl.textContent = "Failed to load site config.";
}
});
</script>
{% endblock %}
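The chat script above treats `import_time_us` as an epoch timestamp in microseconds: it divides by 1000 before handing it to `Date`, and multiplies milliseconds by 1000 when building the `since` query parameter. The two conversions, as a standalone sketch (helper names are illustrative):

```javascript
// import_time_us is epoch MICROseconds; JS Date wants milliseconds,
// and the /api/packets "since" parameter (as used above) wants microseconds.
function usToDate(importTimeUs) {
  return new Date(importTimeUs / 1000); // µs -> ms
}
function msToUs(epochMs) {
  return Math.floor(epochMs * 1000);    // ms -> µs
}

// Example: 2021-01-01T00:00:00Z is 1609459200000 ms since the epoch.
const us = msToUs(1609459200000);
console.log(us);                         // 1609459200000000
console.log(usToDate(us).toISOString()); // "2021-01-01T00:00:00.000Z"
```

Mixing the two units up by a factor of 1000 silently yields dates decades off, so keeping the conversion in one place is worth the two tiny helpers.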



@@ -1,58 +0,0 @@
{% extends "base.html" %}
{% block css %}
#node_info {
height:100%;
}
#map{
height:100%;
min-height: 400px;
}
#packet_details{
height: 95vh;
overflow: scroll;
top: 3em;
}
div.tab-pane > dl {
display: inline-block;
}
{% endblock %}
{% block body %}
{% include "search_form.html" %}
<div class="row">
<div class="col mb-3">
<div class="card" id="node_info">
{% if node %}
<div class="card-header">
{{node.long_name}}
</div>
<div class="card-body">
<dl >
<dt>ShortName</dt>
<dd>{{node.short_name}}</dd>
<dt>HW Model</dt>
<dd>{{node.hw_model}}</dd>
<dt>Role</dt>
<dd>{{node.role}}</dd>
</dl>
</div>
{% else %}
<div class="card-body">
A NodeInfo has not been seen.
</div>
{% endif %}
</div>
</div>
<div class="row">
<div class="col">
{% include 'packet_list.html' %}
</div>
</div>
<div class="col mb-3">
<div id="map"></div>
</div>
</div>
{% endblock %}


@@ -1,263 +0,0 @@
{% macro graph(name) %}
<div id="{{name}}Chart" style="width: 100%; height: 100%;"></div>
{% endmacro %}
<!-- Download and Expand buttons -->
<div class="d-flex justify-content-end mb-2">
<button class="btn btn-sm btn-outline-light me-2" id="downloadCsvBtn">Download CSV</button>
<button class="btn btn-sm btn-outline-light" data-bs-toggle="modal" data-bs-target="#fullChartModal">Expand</button>
</div>
<!-- Tab Navigation -->
<ul class="nav nav-tabs" role="tablist">
{% for name in [
"power", "utilization", "temperature", "humidity", "pressure",
"iaq", "wind_speed", "wind_direction", "power_metrics", "neighbors"
] %}
<li class="nav-item" role="presentation">
<button class="nav-link {% if loop.first %}active{% endif %}" data-bs-toggle="tab" data-bs-target="#{{name}}Tab" type="button" role="tab">{{ name | capitalize }}</button>
</li>
{% endfor %}
</ul>
<!-- Tab Content -->
<div class="tab-content mt-3" style="height: 40vh;">
{% for name in [
"power", "utilization", "temperature", "humidity", "pressure",
"iaq", "wind_speed", "wind_direction", "power_metrics", "neighbors"
] %}
<div class="tab-pane fade {% if loop.first %}show active{% endif %}" id="{{name}}Tab" role="tabpanel" style="height: 100%;">
{{ graph(name) | safe }}
</div>
{% endfor %}
</div>
<!-- Fullscreen Modal -->
<div class="modal fade" id="fullChartModal" tabindex="-1" aria-labelledby="fullChartModalLabel" aria-hidden="true">
<div class="modal-dialog modal-fullscreen">
<div class="modal-content bg-dark text-white">
<div class="modal-header">
<h5 class="modal-title" id="fullChartModalLabel">Full Graph</h5>
<button type="button" class="btn-close btn-close-white" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body" style="height: 100vh;">
<div id="fullChartContainer" style="width: 100%; height: 100%;"></div>
</div>
</div>
</div>
</div>
<!-- ECharts Library -->
<script src="https://cdn.jsdelivr.net/npm/echarts@5/dist/echarts.min.js"></script>
<script>
document.addEventListener("DOMContentLoaded", function () {
let currentChart = null;
let currentChartName = null;
let currentChartData = null;
let fullChart = null;
async function loadChart(name, targetDiv) {
currentChartName = name;
const chartDiv = document.getElementById(targetDiv);
if (!chartDiv) return;
try {
const resp = await fetch(`/graph/${name}_json/{{ node_id }}`);
if (!resp.ok) throw new Error(`Failed to load data for ${name}`);
const data = await resp.json();
// Reverse for chronological order
data.timestamps.reverse();
data.series.forEach(s => s.data.reverse());
const formattedDates = data.timestamps.map(t => {
const d = new Date(t);
return `${(d.getMonth() + 1).toString().padStart(2, '0')}-${d.getDate().toString().padStart(2, '0')}-${d.getFullYear().toString().slice(-2)}`;
});
currentChartData = {
...data,
timestamps: formattedDates
};
const chart = echarts.init(chartDiv);
const isDualAxis = name === 'power';
chart.setOption({
tooltip: {
trigger: 'axis',
formatter: function (params) {
return params.map(p => {
const label = p.seriesName.toLowerCase();
const unit = label.includes('volt') ? 'V' : label.includes('battery') ? '%' : '';
return `${p.marker} ${p.seriesName}: ${p.data}${unit}`;
}).join('<br>');
}
},
xAxis: {
type: 'category',
data: formattedDates,
axisLabel: { color: '#fff', rotate: 45 },
},
yAxis: isDualAxis ? [
{
type: 'value',
name: 'Battery (%)',
min: 0,
max: 120,
position: 'left',
axisLabel: { color: '#fff' },
nameTextStyle: { color: '#fff' }
},
{
type: 'value',
name: 'Voltage (V)',
min: 0,
max: 6,
position: 'right',
axisLabel: { color: '#fff' },
nameTextStyle: { color: '#fff' }
}
] : {
type: 'value',
axisLabel: { color: '#fff' },
},
series: data.series.map(s => ({
name: s.name,
type: 'line',
data: s.data,
smooth: true,
connectNulls: true,
showSymbol: false,
yAxisIndex: isDualAxis && s.name.toLowerCase().includes('volt') ? 1 : 0,
})),
legend: { textStyle: { color: '#fff' } }
});
return chart;
} catch (err) {
console.error(err);
currentChartData = null;
currentChartName = null;
chartDiv.innerHTML = `<div class="text-white text-center mt-5">Error loading ${name} data.</div>`;
return null;
}
}
// Load first chart
const firstTabBtn = document.querySelector('.nav-tabs button.nav-link.active');
if (firstTabBtn) {
const name = firstTabBtn.textContent.toLowerCase();
const chartId = `${name}Chart`;
loadChart(name, chartId).then(chart => currentChart = chart);
}
// On tab switch
document.querySelectorAll('.nav-tabs button.nav-link').forEach(button => {
button.addEventListener('shown.bs.tab', event => {
const name = event.target.textContent.toLowerCase();
const chartId = `${name}Chart`;
loadChart(name, chartId).then(chart => currentChart = chart);
});
});
// CSV Download
document.getElementById('downloadCsvBtn').addEventListener('click', () => {
if (!currentChartData || !currentChartName) {
alert("Chart data not loaded yet.");
return;
}
const { timestamps, series } = currentChartData;
let csv = 'Date,' + series.map(s => s.name).join(',') + '\n';
for (let i = 0; i < timestamps.length; i++) {
const row = [timestamps[i]];
for (const s of series) {
row.push(s.data[i] != null ? s.data[i] : '');
}
csv += row.join(',') + '\n';
}
const blob = new Blob([csv], { type: 'text/csv' });
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = `${currentChartName}_{{ node_id }}.csv`;
a.click();
URL.revokeObjectURL(url);
});
// Fullscreen modal chart
document.getElementById('fullChartModal').addEventListener('shown.bs.modal', () => {
if (!currentChartData || !currentChartName) return;
if (!fullChart) {
fullChart = echarts.init(document.getElementById('fullChartContainer'));
}
const isDualAxis = currentChartName === 'power';
fullChart.setOption({
title: { text: currentChartName.charAt(0).toUpperCase() + currentChartName.slice(1), textStyle: { color: '#fff' } },
tooltip: {
trigger: 'axis',
formatter: function (params) {
return params.map(p => {
const label = p.seriesName.toLowerCase();
const unit = label.includes('volt') ? 'V' : label.includes('battery') ? '%' : '';
return `${p.marker} ${p.seriesName}: ${p.data}${unit}`;
}).join('<br>');
}
},
xAxis: {
type: 'category',
data: currentChartData.timestamps,
axisLabel: { color: '#fff', rotate: 45 },
},
yAxis: isDualAxis ? [
{
type: 'value',
name: 'Battery (%)',
min: 0,
max: 120,
position: 'left',
axisLabel: { color: '#fff' },
nameTextStyle: { color: '#fff' }
},
{
type: 'value',
name: 'Voltage (V)',
min: 0,
max: 6,
position: 'right',
axisLabel: { color: '#fff' },
nameTextStyle: { color: '#fff' }
}
] : {
type: 'value',
axisLabel: { color: '#fff' },
},
series: currentChartData.series.map(s => ({
name: s.name,
type: 'line',
data: s.data,
smooth: true,
connectNulls: true,
showSymbol: false,
yAxisIndex: isDualAxis && s.name.toLowerCase().includes('volt') ? 1 : 0,
})),
legend: { textStyle: { color: '#fff' } }
});
fullChart.resize();
});
window.addEventListener('resize', () => {
if (fullChart) fullChart.resize();
if (currentChart) currentChart.resize();
});
});
</script>
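The CSV download handler above builds a header row from the series names and then one row per timestamp, substituting an empty cell for missing samples. The same row-building, as a pure-function sketch (`buildCsv` is an illustrative name, and note it does not quote values containing commas):

```javascript
// Build the same CSV the download handler produces: a "Date,..." header
// of series names, then one comma-joined row per timestamp, with null
// data points rendered as empty cells.
function buildCsv(timestamps, series) {
  let csv = "Date," + series.map(s => s.name).join(",") + "\n";
  for (let i = 0; i < timestamps.length; i++) {
    const row = [timestamps[i]];
    for (const s of series) row.push(s.data[i] != null ? s.data[i] : "");
    csv += row.join(",") + "\n";
  }
  return csv;
}

const csv = buildCsv(["01-01-25", "01-02-25"], [
  { name: "Battery", data: [95, null] },
  { name: "Voltage", data: [4.1, 4.0] },
]);
console.log(csv);
// Date,Battery,Voltage
// 01-01-25,95,4.1
// 01-02-25,,4
```

Keeping the builder pure makes it easy to test separately from the `Blob`/`URL.createObjectURL` download plumbing.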


@@ -1,109 +0,0 @@
{% extends "base.html" %}
{% block css %}
.table-title {
font-size: 2rem;
text-align: center;
margin-bottom: 20px;
}
.traffic-table {
width: 50%;
border-collapse: collapse;
margin: 0 auto;
font-family: Arial, sans-serif;
}
.traffic-table th,
.traffic-table td {
padding: 10px 15px;
text-align: left;
border: 1px solid #474b4e;
}
.traffic-table th {
background-color: #272b2f;
color: white;
}
.traffic:nth-of-type(odd) {
background-color: #272b2f; /* Lighter than #2a2a2a */
}
.traffic {
border: 1px solid #474b4e;
padding: 8px;
margin-bottom: 4px;
border-radius: 8px;
}
.traffic:nth-of-type(even) {
background-color: #212529; /* Slightly lighter than the previous #181818 */
}
.footer {
text-align: center;
margin-top: 20px;
}
{% endblock %}
{% block body %}
<section>
<h2 class="table-title">
{% if traffic %}
{{ traffic[0].long_name }} (last 24 hours)
{% else %}
No Traffic Data Available
{% endif %}
</h2>
<table class="traffic-table">
<thead>
<tr>
<th>Port Number</th>
<th>Packet Count</th>
</tr>
</thead>
<tbody>
{% for port in traffic %}
<tr class="traffic">
<td>
{% if port.portnum == 1 %}
TEXT_MESSAGE_APP
{% elif port.portnum == 3 %}
POSITION_APP
{% elif port.portnum == 4 %}
NODEINFO_APP
{% elif port.portnum == 5 %}
ROUTING_APP
{% elif port.portnum == 8 %}
WAYPOINT_APP
{% elif port.portnum == 67 %}
TELEMETRY_APP
{% elif port.portnum == 70 %}
TRACEROUTE_APP
{% elif port.portnum == 71 %}
NEIGHBORINFO_APP
{% elif port.portnum == 73 %}
MAP_REPORT_APP
{% elif port.portnum == 0 %}
UNKNOWN_APP
{% else %}
{{ port.portnum }}
{% endif %}
</td>
<td>{{ port.packet_count }}</td>
</tr>
{% else %}
<tr>
<td colspan="2">No traffic data available for this node.</td>
</tr>
{% endfor %}
</tbody>
</table>
</section>
<footer class="footer">
<a href="/top">Back to Top Nodes</a>
</footer>
{% endblock %}
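The template above maps Meshtastic port numbers to app names with a Jinja `if`/`elif` chain. The same lookup can be sketched as a plain object in JavaScript (a hedged sketch: the numeric values are copied from the template, but the `PORTNUM_NAMES`/`portnumName` identifiers are mine, not part of the codebase):

```javascript
// Port-number → app-name table, values taken from the template above.
const PORTNUM_NAMES = {
  0: "UNKNOWN_APP",
  1: "TEXT_MESSAGE_APP",
  3: "POSITION_APP",
  4: "NODEINFO_APP",
  5: "ROUTING_APP",
  8: "WAYPOINT_APP",
  67: "TELEMETRY_APP",
  70: "TRACEROUTE_APP",
  71: "NEIGHBORINFO_APP",
  73: "MAP_REPORT_APP",
};

// Fall back to the raw number for ports the table does not know,
// mirroring the template's {% else %} branch.
function portnumName(portnum) {
  return PORTNUM_NAMES[portnum] ?? String(portnum);
}
```

A dictionary lookup like this is easier to keep in sync than a template branch per port, which may be why later pages in this series move the logic into JavaScript.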


@@ -13,6 +13,8 @@
border-radius: 10px;
box-shadow: 0 4px 8px rgba(0,0,0,0.1);
}
/* Search UI */
.search-container {
position: absolute;
bottom: 100px;
@@ -37,6 +39,8 @@
.search-container button:hover {
background-color: #0056b3;
}
/* Node info box */
#node-info {
position: absolute;
bottom: 10px;
@@ -52,6 +56,8 @@
max-height: 250px;
overflow-y: auto;
}
/* Legend */
#legend {
position: absolute;
bottom: 10px;
@@ -67,9 +73,6 @@
}
.legend-category {
margin-right: 10px;
code {
color: inherit;
}
}
.legend-box {
display: inline-block;
@@ -77,22 +80,23 @@
height: 12px;
margin-right: 5px;
border-radius: 3px;
&.circle {
border-radius: 6px;
}
}
.circle { border-radius: 6px; }
{% endblock %}
{% block body %}
<div id="mynetwork"></div>
<!-- SEARCH + FILTER -->
<div class="search-container">
<label for="channel-select" style="color:#333;">Channel:</label>
<label style="color:#333;">Channel:</label>
<select id="channel-select" onchange="filterByChannel()"></select>
<input type="text" id="node-search" placeholder="Search node...">
<button onclick="searchNode()">Search</button>
</div>
<!-- INFO BOX -->
<div id="node-info">
<b>Long Name:</b> <span id="node-long-name"></span><br>
<b>Short Name:</b> <span id="node-short-name"></span><br>
@@ -100,190 +104,280 @@
<b>Hardware Model:</b> <span id="node-hw-model"></span>
</div>
<!-- LEGEND -->
<div id="legend">
<div class="legend-category">
<div><span class="legend-box" style="background-color: #ff5733"></span> Traceroute</div>
<div><span class="legend-box" style="background-color: #049acd"></span> Neighbor</div>
<div><span class="legend-box" style="background-color: #ff5733"></span>Traceroute</div>
<div><span class="legend-box" style="background-color: #049acd"></span>Neighbor</div>
</div>
<div class="legend-category">
<div><span class="legend-box circle" style="background-color: #ff5733"></span> <code>ROUTER</code></div>
<div><span class="legend-box circle" style="background-color: #b65224"></span> <code>ROUTER_LATE</code></div>
<div><span class="legend-box circle" style="background-color: #ff5733"></span><code>ROUTER</code></div>
<div><span class="legend-box circle" style="background-color: #b65224"></span><code>ROUTER_LATE</code></div>
</div>
<div class="legend-category">
<div><span class="legend-box circle" style="background-color: #007bff"></span> <code>CLIENT</code></div>
<div><span class="legend-box circle" style="background-color: #00c3ff"></span> <code>CLIENT_MUTE</code></div>
<div><span class="legend-box circle" style="background-color: #007bff"></span><code>CLIENT</code></div>
<div><span class="legend-box circle" style="background-color: #00c3ff"></span><code>CLIENT_MUTE</code></div>
</div>
<div class="legend-category">
<div><span class="legend-box circle" style="background-color: #ffbf00"></span> Other</div>
<div><span class="legend-box circle" style="background-color: #6c757d"></span> Unknown</div>
<div><span class="legend-box circle" style="background-color: #049acd"></span><code>CLIENT_BASE</code></div>
<div><span class="legend-box circle" style="background-color: #ffbf00"></span>Other</div>
</div>
<div class="legend-category">
<div><span class="legend-box circle" style="background-color: #6c757d"></span>Unknown</div>
</div>
</div>
<script>
const chart = echarts.init(document.getElementById('mynetwork'));
// Initialize ECharts
const chart = echarts.init(document.getElementById("mynetwork"));
// -----------------------------------
// COLOR + ROLE HELPERS
// -----------------------------------
const colors = {
edge: {
traceroute: '#ff5733',
neighbor: '#049acd',
},
edge: { traceroute:"#ff5733", neighbor:"#049acd" },
role: {
ROUTER: '#ff5733',
ROUTER_LATE: '#b65224',
CLIENT: '#007bff',
CLIENT_MUTE: '#00c3ff',
other: '#ffbf00',
unknown: '#6c757d',
ROUTER:"#ff5733",
ROUTER_LATE:"#b65224",
CLIENT:"#007bff",
CLIENT_MUTE:"#00c3ff",
CLIENT_BASE:"#049acd",
other:"#ffbf00",
unknown:"#6c757d"
},
selection: '#ff8c00',
selection:"#ff8c00"
};
function getRoleColor(role) {
if (!role) return colors.role.unknown;
return colors.role[role] || colors.role.other;
}
function getSymbolSize (role) {
switch (role) {
case 'ROUTER': return 30;
case 'ROUTER_LATE': return 30;
case 'CLIENT': return 15;
case 'CLIENT_MUTE': return 7;
default: return 15; // Unknown or other roles
function getRoleColor(role) { return colors.role[role] || colors.role.other; }
function getSymbolSize(role) {
switch(role){
case "ROUTER":
case "ROUTER_LATE": return 30;
case "CLIENT_BASE": return 18;
case "CLIENT": return 15;
case "CLIENT_MUTE": return 7;
default: return 15;
}
}
function getLabel (role, short_name, long_name) {
if (role === 'ROUTER') return long_name;
if (role === 'ROUTER_LATE') return long_name;
if (role === 'CLIENT') return short_name;
if (role === 'CLIENT_MUTE') return short_name;
return short_name || '';
function getLabel(role, shortName, longName) {
if (role === "ROUTER" || role === "ROUTER_LATE") return longName;
return shortName || "";
}
// --- Nodes ---
const nodes = [
{% for node in nodes %}
{
name: "{{ node.node_id }}", // node_id as string
value: getLabel({{node.role | tojson}}, {{node.short_name | tojson }}, {{node.long_name | tojson}}), // display label
symbol: 'circle',
symbolSize: getSymbolSize({{node.role | tojson}}),
itemStyle: { color: getRoleColor({{node.role | tojson}}), opacity:1 },
label: { show:true, position:'right', color:'#333', fontSize:12, formatter: (p)=>p.data.value },
long_name: {{ node.long_name | tojson }},
short_name: {{ node.short_name | tojson }},
role: {{ node.role | tojson }},
hw_model: {{ node.hw_model | tojson }},
channel: {{ node.channel | tojson }}
},
{% endfor %}
];
// --- Edges ---
const edges = [
{% for edge in edges %}
{
source: "{{ edge.from }}", // edge source as string
target: "{{ edge.to }}", // edge target as string
originalColor: colors.edge[{{edge.type | tojson}}],
lineStyle: {
color: colors.edge[{{edge.type | tojson}}],
width: {{edge.weight | tojson}},
},
},
{% endfor %}
];
// -----------------------------------
// STATE
// -----------------------------------
let nodes = [];
let edges = [];
let filteredNodes = [];
let filteredEdges = [];
let selectedChannel = 'LongFast';
let lastSelectedNode = null;
let selectedChannel = null;
// -----------------------------------
// LOAD NODES + EDGES FROM API
// -----------------------------------
async function loadData() {
// 1. Load nodes
const n = await fetch("/api/nodes").then(r => r.json());
nodes = n.nodes.map(x => ({
name: String(x.node_id),
node_id: x.node_id,
long_name: x.long_name,
short_name: x.short_name,
hw_model: x.hw_model,
role: x.role,
channel: x.channel,
labelValue: getLabel(x.role, x.short_name, x.long_name),
symbolSize: getSymbolSize(x.role),
itemStyle: {
color: getRoleColor(x.role),
opacity: 1
},
label: {
show: true,
position: "right",
color: "#333",
fontSize: 12,
formatter: p => p.data.labelValue
}
}));
const allNodeIDs = new Set(nodes.map(n => n.name));
// 2. Load edges
const e = await fetch("/api/edges").then(r => r.json());
// Only keep edges that reference valid nodes
edges = e.edges
.filter(ed =>
allNodeIDs.has(String(ed.from)) &&
allNodeIDs.has(String(ed.to))
)
.map(ed => ({
source: String(ed.from),
target: String(ed.to),
edgeType: ed.type,
originalColor: colors.edge[ed.type] || "#ccc",
lineStyle: {
width: 2,
color: colors.edge[ed.type] || "#ccc",
opacity: 1
}
}));
// 3. Determine which nodes are actually used in edges
const usedNodeIDs = new Set();
edges.forEach(e => {
usedNodeIDs.add(e.source);
usedNodeIDs.add(e.target);
});
// 4. Remove unused (no-edge) nodes
nodes = nodes.filter(n => usedNodeIDs.has(n.name));
// 5. Double safety: remove any edges referencing removed nodes
edges = edges.filter(e =>
usedNodeIDs.has(e.source) && usedNodeIDs.has(e.target)
);
// 6. Now ready to build dropdown & render
populateChannelDropdown();
}
// -----------------------------------
// CHANNEL FILTER
// -----------------------------------
function populateChannelDropdown() {
const sel = document.getElementById('channel-select');
const unique = [...new Set(nodes.map(n=>n.channel).filter(Boolean))].sort();
unique.forEach(ch=>{
const opt = document.createElement('option');
opt.value=ch; opt.text=ch;
if(ch==='LongFast') opt.selected=true;
const sel = document.getElementById("channel-select");
const chans = [...new Set(nodes.map(n => n.channel))].sort();
chans.forEach(ch => {
const opt = document.createElement("option");
opt.value = ch;
opt.text = ch;
sel.appendChild(opt);
});
selectedChannel = sel.value;
selectedChannel = chans[0];
sel.value = selectedChannel;
filterByChannel();
}
function filterByChannel() {
selectedChannel = document.getElementById('channel-select').value;
filteredNodes = nodes.filter(n=>n.channel===selectedChannel);
const nodeSet = new Set(filteredNodes.map(n=>n.name));
filteredEdges = edges.filter(e=>nodeSet.has(e.source) && nodeSet.has(e.target));
lastSelectedNode=null;
selectedChannel = document.getElementById("channel-select").value;
filteredNodes = nodes.filter(n => n.channel === selectedChannel);
const allowed = new Set(filteredNodes.map(n => n.name));
filteredEdges = edges.filter(e => allowed.has(e.source) && allowed.has(e.target));
lastSelectedNode = null;
updateChart();
}
// -----------------------------------
// FORCE GRAPH UPDATE
// -----------------------------------
function updateChart() {
const updatedNodes = filteredNodes.map(node=>{
let opacity=1, color=getRoleColor(node.role), borderColor='transparent', borderWidth=6;
if(lastSelectedNode){
const connected = filteredEdges.some(e=>
(e.source===node.name && e.target===lastSelectedNode) ||
(e.target===node.name && e.source===lastSelectedNode)
const updatedNodes = filteredNodes.map(n => {
let opacity = 1;
let borderColor = "transparent";
if (lastSelectedNode) {
const connected = filteredEdges.some(
e => (e.source === n.name && e.target === lastSelectedNode) ||
(e.target === n.name && e.source === lastSelectedNode)
);
if(node.name === lastSelectedNode) {
opacity=1;
borderColor=colors.selection;
if (n.name === lastSelectedNode) {
borderColor = colors.selection;
} else if (!connected) {
opacity = 0.3;
}
else if(connected) opacity=1;
else opacity=0.4;
}
return {...node, itemStyle:{...node.itemStyle, color,opacity, borderColor, borderWidth}};
return {
...n,
itemStyle: { ...n.itemStyle, opacity, borderColor, borderWidth: 6 }
};
});
const updatedEdges = filteredEdges.map(edge=>{
let opacity=0.1, width=edge.lineStyle.width;
if(lastSelectedNode){
const connected = edge.source===lastSelectedNode || edge.target===lastSelectedNode;
opacity=connected?1:0.05; width=edge.lineStyle.width;
}
return {...edge, lineStyle:{color:edge.originalColor||'#d3d3d3', width, opacity}};
const updatedEdges = filteredEdges.map(e => {
const connected =
lastSelectedNode &&
(e.source === lastSelectedNode || e.target === lastSelectedNode);
return {
...e,
lineStyle: { ...e.lineStyle, opacity: connected ? 1 : 0.1 }
};
});
chart.setOption({series:[{type:'graph', layout:'force', data:updatedNodes, links:updatedEdges, roam:true, force:{repulsion:200, edgeLength:[80,120]}}]});
chart.setOption({
animation: false,
series: [{
type: "graph",
layout: "force",
roam: true,
data: updatedNodes,
links: updatedEdges,
force: { repulsion: 200, edgeLength: [80, 120] }
}]
});
}
chart.on('click', function(params){
if(params.dataType==='node') updateSelectedNode(params.data.name);
else{
lastSelectedNode=null; updateChart();
document.getElementById('node-long-name').innerText='';
document.getElementById('node-short-name').innerText='';
document.getElementById('node-role').innerText='';
document.getElementById('node-hw-model').innerText='';
// -----------------------------------
// CLICK EVENTS
// -----------------------------------
chart.on("click", function(params){
if (params.dataType === "node") {
updateSelectedNode(params.data.name);
} else {
lastSelectedNode = null;
updateChart();
document.getElementById("node-long-name").innerText = "";
document.getElementById("node-short-name").innerText = "";
document.getElementById("node-role").innerText = "";
document.getElementById("node-hw-model").innerText = "";
}
});
function updateSelectedNode(selNode){
lastSelectedNode=selNode; updateChart();
const n = filteredNodes.find(x=>x.name===selNode);
if(n){
document.getElementById('node-long-name').innerText=n.long_name;
document.getElementById('node-short-name').innerText=n.short_name;
document.getElementById('node-role').innerText=n.role;
document.getElementById('node-hw-model').innerText=n.hw_model;
}
function updateSelectedNode(id) {
lastSelectedNode = id;
updateChart();
const n = filteredNodes.find(n => n.name === id);
if (!n) return;
document.getElementById("node-long-name").innerText = n.long_name;
document.getElementById("node-short-name").innerText = n.short_name;
document.getElementById("node-role").innerText = n.role;
document.getElementById("node-hw-model").innerText = n.hw_model;
}
function searchNode(){
const q = document.getElementById('node-search').value.toLowerCase().trim();
if(!q) return;
const found = filteredNodes.find(n=>n.name.toLowerCase().includes(q) || n.long_name.toLowerCase().includes(q) || n.short_name.toLowerCase().includes(q));
if(found) updateSelectedNode(found.name);
// -----------------------------------
// SEARCH
// -----------------------------------
function searchNode() {
const q = document.getElementById("node-search").value.toLowerCase().trim();
if (!q) return;
const found = filteredNodes.find(n =>
n.name.toLowerCase().includes(q) ||
(n.long_name || "").toLowerCase().includes(q) ||
(n.short_name || "").toLowerCase().includes(q)
);
if (found) updateSelectedNode(found.name);
else alert("Node not found in current channel!");
}
populateChannelDropdown();
window.addEventListener('resize', ()=>chart.resize());
// -----------------------------------
loadData();
window.addEventListener("resize", () => chart.resize());
</script>
{% endblock %}
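The new `loadData()` above prunes the graph in two passes: it drops edges that reference unknown nodes, then drops nodes that appear in no surviving edge. That logic can be isolated as a pure function (a sketch under assumed shapes: `pruneGraph` and the `{node_id}` / `{from, to}` record fields follow the API payloads used above, but the function itself is mine):

```javascript
// Prune a node/edge set the way loadData() does:
// 1) keep only edges whose both endpoints are known nodes,
// 2) keep only nodes that are used by at least one surviving edge.
function pruneGraph(nodes, edges) {
  const known = new Set(nodes.map(n => String(n.node_id)));
  const keptEdges = edges.filter(
    e => known.has(String(e.from)) && known.has(String(e.to))
  );
  const used = new Set();
  keptEdges.forEach(e => {
    used.add(String(e.from));
    used.add(String(e.to));
  });
  const keptNodes = nodes.filter(n => used.has(String(n.node_id)));
  return { nodes: keptNodes, edges: keptEdges };
}
```

Doing both passes before the first `chart.setOption()` keeps ECharts from rendering orphan vertices or edges that point at missing data.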


@@ -84,28 +84,48 @@ select, .export-btn, .search-box, .clear-btn {
font-weight: bold;
color: white;
}
.favorite-star {
cursor: pointer;
font-size: 1.2em;
user-select: none;
transition: color 0.2s;
}
.favorite-star:hover {
transform: scale(1.2);
}
.favorite-star.active {
color: #ffd700;
}
.favorites-btn {
background-color: #ffd700;
color: #000;
border: none;
}
.favorites-btn:hover {
background-color: #ffed4e;
}
.favorites-btn.active {
background-color: #ff6b6b;
color: white;
}
{% endblock %}
{% block body %}
<div class="filter-container">
<input type="text" id="search-box" class="search-box" placeholder="Search by name or ID..." />
<select id="role-filter">
<option value="">All Roles</option>
</select>
<select id="channel-filter">
<option value="">All Channels</option>
</select>
<select id="hw-filter">
<option value="">All HW Models</option>
</select>
<select id="firmware-filter">
<option value="">All Firmware</option>
</select>
<select id="role-filter"><option value="">All Roles</option></select>
<select id="channel-filter"><option value="">All Channels</option></select>
<select id="hw-filter"><option value="">All HW Models</option></select>
<select id="firmware-filter"><option value="">All Firmware</option></select>
<button class="favorites-btn" id="favorites-btn">⭐ Show Favorites</button>
<button class="export-btn" id="export-btn">Export CSV</button>
<button class="clear-btn" id="clear-btn">Clear Filters</button>
</div>
@@ -118,7 +138,7 @@ select, .export-btn, .search-box, .clear-btn {
<table>
<thead>
<tr>
<th>Short<span class="sort-icon"></span></th>
<th>Short <span class="sort-icon"></span></th>
<th>Long Name <span class="sort-icon"></span></th>
<th>HW Model <span class="sort-icon"></span></th>
<th>Firmware <span class="sort-icon"></span></th>
@@ -126,24 +146,75 @@ select, .export-btn, .search-box, .clear-btn {
<th>Last Latitude <span class="sort-icon"></span></th>
<th>Last Longitude <span class="sort-icon"></span></th>
<th>Channel <span class="sort-icon"></span></th>
<th>Last Update <span class="sort-icon"></span></th>
<th>Last Seen <span class="sort-icon"></span></th>
<th> </th>
</tr>
</thead>
<tbody id="node-table-body">
<tr><td colspan="9" style="text-align:center; color:white;">Loading nodes...</td></tr>
<tr><td colspan="10" style="text-align:center; color:white;">Loading nodes...</td></tr>
</tbody>
</table>
</div>
<script>
// =====================================================
// GLOBALS
// =====================================================
let allNodes = [];
let sortColumn = "short_name"; // default sorted column
let sortAsc = true; // default ascending
let sortColumn = "short_name";
let sortAsc = true;
let showOnlyFavorites = false;
// Declare headers and keyMap BEFORE any function that uses them
const headers = document.querySelectorAll("thead th");
const keyMap = ["short_name","long_name","hw_model","firmware","role","last_lat","last_long","channel","last_update"];
const keyMap = [
"short_name","long_name","hw_model","firmware","role",
"last_lat","last_long","channel","last_seen_us"
];
// =====================================================
// FAVORITES SYSTEM (localStorage)
// =====================================================
function getFavorites() {
const favorites = localStorage.getItem('nodelist_favorites');
return favorites ? JSON.parse(favorites) : [];
}
function saveFavorites(favs) {
localStorage.setItem('nodelist_favorites', JSON.stringify(favs));
}
function toggleFavorite(nodeId) {
let favs = getFavorites();
const idx = favs.indexOf(nodeId);
if (idx >= 0) favs.splice(idx, 1);
else favs.push(nodeId);
saveFavorites(favs);
}
function isFavorite(nodeId) {
return getFavorites().includes(nodeId);
}
// =====================================================
// "TIME AGO" FORMATTER
// =====================================================
function timeAgo(usTimestamp) {
if (!usTimestamp) return "N/A";
const ms = usTimestamp / 1000;
const diff = Date.now() - ms;
if (diff < 60000) return "just now";
const mins = Math.floor(diff / 60000);
if (mins < 60) return `${mins} min ago`;
const hrs = Math.floor(mins / 60);
if (hrs < 24) return `${hrs} hr ago`;
const days = Math.floor(hrs / 24);
return `${days} day${days > 1 ? "s" : ""} ago`;
}
// =====================================================
// DOM LOADED: FETCH NODES
// =====================================================
document.addEventListener("DOMContentLoaded", async function() {
const tbody = document.getElementById("node-table-body");
const roleFilter = document.getElementById("role-filter");
@@ -154,17 +225,20 @@ document.addEventListener("DOMContentLoaded", async function() {
const countSpan = document.getElementById("node-count");
const exportBtn = document.getElementById("export-btn");
const clearBtn = document.getElementById("clear-btn");
const favoritesBtn = document.getElementById("favorites-btn");
try {
const response = await fetch("/api/nodes?days_active=3");
if (!response.ok) throw new Error("Failed to fetch nodes");
const data = await response.json();
const res = await fetch("/api/nodes?days_active=3");
if (!res.ok) throw new Error("Failed to fetch nodes");
const data = await res.json();
allNodes = data.nodes;
populateFilters(allNodes);
renderTable(allNodes);
updateSortIcons();
} catch (err) {
tbody.innerHTML = `<tr><td colspan="9" style="text-align:center; color:red;">Error loading nodes: ${err.message}</td></tr>`;
tbody.innerHTML = `<tr><td colspan="10" style="text-align:center; color:red;">Error loading nodes: ${err.message}</td></tr>`;
}
roleFilter.addEventListener("change", applyFilters);
@@ -174,94 +248,141 @@ document.addEventListener("DOMContentLoaded", async function() {
searchBox.addEventListener("input", applyFilters);
exportBtn.addEventListener("click", exportToCSV);
clearBtn.addEventListener("click", clearFilters);
favoritesBtn.addEventListener("click", toggleFavoritesFilter);
// STAR CLICK HANDLER
tbody.addEventListener("click", e => {
if (e.target.classList.contains('favorite-star')) {
const nodeId = parseInt(e.target.dataset.nodeId);
const isFav = isFavorite(nodeId);
if (isFav) {
e.target.classList.remove("active");
e.target.textContent = "☆";
} else {
e.target.classList.add("active");
e.target.textContent = "★";
}
toggleFavorite(nodeId);
}
});
// SORTING
headers.forEach((th, index) => {
th.addEventListener("click", () => {
const key = keyMap[index];
let key = keyMap[index];
sortAsc = (sortColumn === key) ? !sortAsc : true;
sortColumn = key;
applyFilters(); // apply filters and sort
applyFilters();
});
});
// =====================================================
// FILTER POPULATION
// =====================================================
function populateFilters(nodes) {
const roles = new Set();
const channels = new Set();
const hws = new Set();
const firmwares = new Set();
const roles = new Set(), channels = new Set(), hws = new Set(), fws = new Set();
nodes.forEach(n => {
if (n.role) roles.add(n.role);
if (n.channel) channels.add(n.channel);
if (n.hw_model) hws.add(n.hw_model);
if (n.firmware) firmwares.add(n.firmware);
if (n.firmware) fws.add(n.firmware);
});
fillSelect(roleFilter, roles);
fillSelect(channelFilter, channels);
fillSelect(hwFilter, hws);
fillSelect(firmwareFilter, firmwares);
fillSelect(firmwareFilter, fws);
}
function fillSelect(select, values) {
[...values].sort().forEach(v => {
const option = document.createElement("option");
option.value = v;
option.textContent = v;
select.appendChild(option);
const opt = document.createElement("option");
opt.value = v;
opt.textContent = v;
select.appendChild(opt);
});
}
// =====================================================
// FAVORITES FILTER
// =====================================================
function toggleFavoritesFilter() {
showOnlyFavorites = !showOnlyFavorites;
favoritesBtn.textContent = showOnlyFavorites ? "⭐ Show All" : "⭐ Show Favorites";
favoritesBtn.classList.toggle("active", showOnlyFavorites);
applyFilters();
}
// =====================================================
// APPLY FILTERS + SORT
// =====================================================
function applyFilters() {
const searchTerm = searchBox.value.trim().toLowerCase();
let filtered = allNodes.filter(node => {
const roleMatch = !roleFilter.value || node.role === roleFilter.value;
const channelMatch = !channelFilter.value || node.channel === channelFilter.value;
const hwMatch = !hwFilter.value || node.hw_model === hwFilter.value;
const firmwareMatch = !firmwareFilter.value || node.firmware === firmwareFilter.value;
let filtered = allNodes.filter(n => {
const roleMatch = !roleFilter.value || n.role === roleFilter.value;
const channelMatch = !channelFilter.value || n.channel === channelFilter.value;
const hwMatch = !hwFilter.value || n.hw_model === hwFilter.value;
const fwMatch = !firmwareFilter.value || n.firmware === firmwareFilter.value;
const searchMatch =
!searchTerm ||
(n.long_name && n.long_name.toLowerCase().includes(searchTerm)) ||
(n.short_name && n.short_name.toLowerCase().includes(searchTerm)) ||
n.node_id.toString().includes(searchTerm);
const searchMatch = !searchTerm ||
(node.long_name && node.long_name.toLowerCase().includes(searchTerm)) ||
(node.short_name && node.short_name.toLowerCase().includes(searchTerm)) ||
(node.node_id && node.node_id.toString().includes(searchTerm)) ||
(node.id && node.id.toString().includes(searchTerm));
const favMatch = !showOnlyFavorites || isFavorite(n.node_id);
return roleMatch && channelMatch && hwMatch && firmwareMatch && searchMatch;
return roleMatch && channelMatch && hwMatch && fwMatch && searchMatch && favMatch;
});
if (sortColumn) {
filtered = sortNodes(filtered, sortColumn, sortAsc);
}
filtered = sortNodes(filtered, sortColumn, sortAsc);
renderTable(filtered);
updateSortIcons();
}
// =====================================================
// RENDER TABLE
// =====================================================
function renderTable(nodes) {
tbody.innerHTML = "";
if (!nodes.length) {
tbody.innerHTML = '<tr><td colspan="9" style="text-align:center; color:white;">No nodes found</td></tr>';
} else {
nodes.forEach(node => {
const row = document.createElement("tr");
row.innerHTML = `
<td>${node.short_name || "N/A"}</td>
<td><a href="/packet_list/${node.node_id}">${node.long_name || "N/A"}</a></td>
<td>${node.hw_model || "N/A"}</td>
<td>${node.firmware || "N/A"}</td>
<td>${node.role || "N/A"}</td>
<td>${node.last_lat ? (node.last_lat / 1e7).toFixed(7) : "N/A"}</td>
<td>${node.last_long ? (node.last_long / 1e7).toFixed(7) : "N/A"}</td>
<td>${node.channel || "N/A"}</td>
<td>${node.last_update ? new Date(node.last_update).toLocaleString() : "N/A"}</td>
`;
tbody.appendChild(row);
});
tbody.innerHTML = `<tr><td colspan="10" style="text-align:center; color:white;">No nodes found</td></tr>`;
countSpan.textContent = 0;
return;
}
nodes.forEach(node => {
const isFav = isFavorite(node.node_id);
const star = isFav ? "★" : "☆";
const row = document.createElement("tr");
row.innerHTML = `
<td>${node.short_name || "N/A"}</td>
<td><a href="/node/${node.node_id}">${node.long_name || "N/A"}</a></td>
<td>${node.hw_model || "N/A"}</td>
<td>${node.firmware || "N/A"}</td>
<td>${node.role || "N/A"}</td>
<td>${node.last_lat ? (node.last_lat / 1e7).toFixed(7) : "N/A"}</td>
<td>${node.last_long ? (node.last_long / 1e7).toFixed(7) : "N/A"}</td>
<td>${node.channel || "N/A"}</td>
<td>${timeAgo(node.last_seen_us)}</td>
<td style="text-align:center;">
<span class="favorite-star ${isFav ? "active" : ""}" data-node-id="${node.node_id}">${star}</span>
</td>
`;
tbody.appendChild(row);
});
countSpan.textContent = nodes.length;
}
// =====================================================
// CLEAR FILTERS
// =====================================================
function clearFilters() {
roleFilter.value = "";
channelFilter.value = "";
@@ -270,56 +391,63 @@ document.addEventListener("DOMContentLoaded", async function() {
searchBox.value = "";
sortColumn = "short_name";
sortAsc = true;
showOnlyFavorites = false;
favoritesBtn.textContent = "⭐ Show Favorites";
favoritesBtn.classList.remove("active");
renderTable(allNodes);
updateSortIcons();
}
// =====================================================
// EXPORT CSV
// =====================================================
function exportToCSV() {
const rows = [];
const headersText = Array.from(headers).map(th => `"${th.innerText.replace(/▲|▼/g,'')}"`);
rows.push(headersText.join(","));
const headerList = Array.from(headers).map(h => `"${h.innerText.replace(/▲|▼/g,'')}"`);
rows.push(headerList.join(","));
const visibleRows = tbody.querySelectorAll("tr");
visibleRows.forEach(tr => {
if (tr.children.length === 9) {
const row = Array.from(tr.children).map(td => `"${td.innerText.replace(/"/g, '""')}"`);
rows.push(row.join(","));
}
const trs = tbody.querySelectorAll("tr");
trs.forEach(tr => {
const cells = Array.from(tr.children).map(td => `"${td.innerText.replace(/"/g,'""')}"`);
rows.push(cells.join(","));
});
const csvContent = "data:text/csv;charset=utf-8,\uFEFF" + rows.join("\n");
const link = document.createElement("a");
link.href = encodeURI(csvContent);
const dateStr = new Date().toISOString().slice(0,10);
link.download = `nodes_list_${dateStr}.csv`;
link.click();
const csv = "data:text/csv;charset=utf-8,\uFEFF" + rows.join("\n");
const a = document.createElement("a");
a.href = encodeURI(csv);
a.download = "nodelist.csv";
a.click();
}
// =====================================================
// SORT NODES
// =====================================================
function sortNodes(nodes, key, asc) {
return [...nodes].sort((a, b) => {
let valA = a[key] || "";
let valB = b[key] || "";
let A = a[key];
let B = b[key];
if (key === "last_lat" || key === "last_long") {
valA = Number(valA) || 0;
valB = Number(valB) || 0;
}
if (key === "last_update") {
valA = valA ? new Date(valA).getTime() : 0;
valB = valB ? new Date(valB).getTime() : 0;
// special handling for timestamp
if (key === "last_seen_us") {
A = A || 0;
B = B || 0;
}
if (valA < valB) return asc ? -1 : 1;
if (valA > valB) return asc ? 1 : -1;
if (A < B) return asc ? -1 : 1;
if (A > B) return asc ? 1 : -1;
return 0;
});
}
// =====================================================
// SORT ICONS
// =====================================================
function updateSortIcons() {
headers.forEach((th, index) => {
headers.forEach((th, i) => {
const span = th.querySelector(".sort-icon");
if (!span) return;
span.textContent = (keyMap[index] === sortColumn) ? (sortAsc ? "▲" : "▼") : "";
span.textContent = keyMap[i] === sortColumn ? (sortAsc ? "▲" : "▼") : "";
});
}
});
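The "Last Seen" column above is rendered by `timeAgo()`, which takes a microsecond epoch timestamp. A standalone version is sketched below; the injectable `now` parameter is my addition (the original reads `Date.now()` directly), added only so the bucketing can be exercised deterministically:

```javascript
// Convert a microsecond epoch timestamp into a coarse "N units ago"
// string, matching the buckets used in the node list page above.
function timeAgo(usTimestamp, now = Date.now()) {
  if (!usTimestamp) return "N/A";
  const diff = now - usTimestamp / 1000; // µs → ms
  if (diff < 60000) return "just now";
  const mins = Math.floor(diff / 60000);
  if (mins < 60) return `${mins} min ago`;
  const hrs = Math.floor(mins / 60);
  if (hrs < 24) return `${hrs} hr ago`;
  const days = Math.floor(hrs / 24);
  return `${days} day${days > 1 ? "s" : ""} ago`;
}
```

Keeping the raw microsecond value in `last_seen_us` (rather than a preformatted string) is also what lets `sortNodes()` sort the column numerically.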


@@ -1,69 +1,478 @@
<div class="card mt-2">
<div class="card-header">
{% set from_me = packet.from_node_id == node_id %}
{% set to_me = packet.to_node_id == node_id %}
<span {% if from_me %} class="fw-bold" {% endif %}>
{{packet.from_node.long_name}}(
{%- if not from_me -%}
<a href="/node_search?q={{packet.from_node_id|node_id_to_hex}}">
{%- endif -%}
{{packet.from_node_id|node_id_to_hex}}
{%- if not from_me -%}
</a>
{%- endif -%}
)
</span>
<span {% if to_me %} class="fw-bold" {% endif %}>
{{packet.to_node.long_name}}(
{%- if not to_me -%}
<a hx-target="#node" href="/node_search?q={{packet.to_node_id|node_id_to_hex}}">
{%- endif -%}
{{packet.to_node_id|node_id_to_hex}}
{%- if not to_me -%}
</a>
{%- endif -%}
)
</span>
</div>
<div class="card-body">
<div class="card-title">
{{packet.id}}
<a href="/packet/{{packet.id}}">🔎</a>
</div>
<div class="card-text text-start">
<dl>
<dt>Import Time</dt>
<dd>{{packet.import_time.strftime('%-I:%M:%S %p - %m-%d-%Y')}}</dd>
<dt>packet</dt>
<dd><pre>{{packet.data}}</pre></dd>
<dt>payload</dt>
<dd>
{% if packet.pretty_payload %}
<div>{{packet.pretty_payload}}</div>
{% endif %}
{% if packet.raw_mesh_packet and packet.raw_mesh_packet.decoded and packet.raw_mesh_packet.decoded.reply_id %}
<i>(Replying to: <a href="/packet/{{ packet.raw_mesh_packet.decoded.reply_id }}">{{ packet.raw_mesh_packet.decoded.reply_id }}</a>)</i>
{% endif %}
{% if packet.raw_mesh_packet.decoded and packet.raw_mesh_packet.decoded.portnum == 70 %}
<ul>
{% for node_id in packet.raw_payload.route %}
<li><a
href="/packet_list/{{node_id}}"
>
{{node_id | node_id_to_hex}}
</a>
</li>
{% endfor %}
</ul>
{% if packet.raw_mesh_packet.decoded.want_response %}
<a href="/graph/traceroute/{{packet.id}}">graph</a>
{% else %}
<a href="/graph/traceroute/{{packet.raw_mesh_packet.decoded.request_id}}">graph</a>
{% endif %}
{% endif %}
<pre>{{packet.payload}}</pre>
</dd>
</dl>
{% extends "base.html" %}
{% block title %}Packet Details{%endblock%}
{% block css %}
{{ super() }}
<style>
/* --- Packet page container --- */
.packet-container {
max-width: 900px;
margin: 0 auto;
padding: 20px 15px;
font-family: "JetBrains Mono", monospace;
}
/* --- Packet Details Card --- */
.packet-card .card-body { padding: 26px 30px; }
.packet-card {
background-color: #1e1f22;
border: 1px solid #3a3a3a;
border-radius: 12px;
color: #ddd;
margin-top: 35px;
box-shadow: 0 0 20px rgba(0,0,0,0.35);
overflow: hidden;
}
.packet-card .card-header {
background: linear-gradient(90deg, #2c2f35, #25262a);
border-bottom: 1px solid #3f3f3f;
font-weight: 600;
font-size: 1.1em;
padding: 14px 18px;
color: #e2e6ea;
display: flex;
justify-content: space-between;
align-items: center;
}
/* --- Map --- */
#map {
width: 100%;
height: 640px;
border-radius: 10px;
margin-top: 20px;
border: 1px solid #333;
display: none;
}
/* --- SOURCE MARKER (slightly bigger) --- */
.source-marker {
width: 24px;
height: 24px;
background: rgba(255,0,0,0.55);
border: 3px solid #ff0000;
border-radius: 50%;
box-shadow: 0 0 6px rgba(255,0,0,0.7);
}
/* --- Seen Table --- */
.seen-table {
border-collapse: separate;
border-spacing: 0 6px;
font-size: 0.92em;
}
.seen-table thead th {
background-color: #2a2b2f;
color: #e2e2e2;
padding: 10px 12px;
border: none !important;
text-transform: uppercase;
font-size: 0.75em;
}
.seen-table tbody td {
background: #323338;
color: #f0f0f0;
border-top: 1px solid #4a4c4f !important;
border-bottom: 1px solid #4a4c4f !important;
padding: 10px 12px !important;
}
.seen-table tbody tr:hover td { background-color: #3a3c41 !important; }
.seen-table tbody tr td:first-child {
border-left: 1px solid #4a4c4f;
border-top-left-radius: 8px;
border-bottom-left-radius: 8px;
}
.seen-table tbody tr td:last-child {
border-right: 1px solid #4a4c4f;
border-top-right-radius: 8px;
border-bottom-right-radius: 8px;
}
</style>
{% endblock %}
{% block body %}
<div class="container mt-4 mb-5 packet-container">
<div id="loading">Loading packet information...</div>
<div id="packet-card" class="packet-card d-none"></div>
<div id="map"></div>
<div id="seen-container" class="mt-4 d-none">
<h5 style="color:#ccc; margin:15px 0 10px 0;">
📡 Seen By <span id="seen-count" style="color:#4da6ff;"></span>
</h5>
<div class="table-responsive">
<table class="table table-dark table-sm seen-table">
<thead>
<tr>
<th>Gateway</th>
<th>RSSI</th>
<th>SNR</th>
<th>Hop</th>
<th>Channel</th>
<th>Time</th>
</tr>
</thead>
<tbody id="seen-table-body"></tbody>
</table>
</div>
</div>
</div>
<script>
document.addEventListener("DOMContentLoaded", async () => {
const packetCard = document.getElementById("packet-card");
const loading = document.getElementById("loading");
const mapDiv = document.getElementById("map");
const seenContainer = document.getElementById("seen-container");
const seenTableBody = document.getElementById("seen-table-body");
const seenCountSpan = document.getElementById("seen-count");
/* ---------------------------------------------
Identify packet ID
----------------------------------------------*/
const match = window.location.pathname.match(/\/packet\/(\d+)/);
if (!match) {
loading.textContent = "Invalid packet URL";
return;
}
const packetId = match[1];
/* PORT NAME MAP */
const PORT_NAMES = {
0:"UNKNOWN APP",
1:"Text",
3:"Position",
4:"Node Info",
5:"Routing",
6:"Admin",
67:"Telemetry",
70:"Traceroute",
71:"Neighbor"
};
/* ---------------------------------------------
Fetch packet
----------------------------------------------*/
const packetRes = await fetch(`/api/packets?packet_id=${packetId}`);
const packetData = await packetRes.json();
if (!packetData.packets.length) {
loading.textContent = "Packet not found.";
return;
}
const p = packetData.packets[0];
/* ---------------------------------------------
Fetch all nodes
----------------------------------------------*/
const nodesRes = await fetch("/api/nodes");
const nodesData = await nodesRes.json();
const nodeLookup = {};
(nodesData.nodes || []).forEach(n => nodeLookup[n.node_id] = n);
const fromNodeObj = nodeLookup[p.from_node_id];
const toNodeObj = nodeLookup[p.to_node_id];
const fromNodeLabel = fromNodeObj?.long_name || p.from_node_id;
const toNodeLabel =
p.to_node_id == 4294967295 ? "All" : (toNodeObj?.long_name || p.to_node_id);
/* ---------------------------------------------
Parse payload for lat/lon if this *packet* is a position packet
----------------------------------------------*/
let lat = null, lon = null;
const parsed = {};
if (p.payload?.includes(":")) {
p.payload.split("\n").forEach(line => {
const [k, v] = line.split(":").map(x=>x.trim());
if (k && v !== undefined) {
parsed[k] = v;
if (k === "latitude_i") lat = Number(v) / 1e7;
if (k === "longitude_i") lon = Number(v) / 1e7;
}
});
}
/* ---------------------------------------------
Render packet header & details
----------------------------------------------*/
const time = p.import_time_us
? new Date(p.import_time_us / 1000).toLocaleString()
: "—";
const telemetryExtras = [];
if (parsed.PDOP) telemetryExtras.push(`PDOP: ${parsed.PDOP}`);
if (parsed.sats_in_view) telemetryExtras.push(`Sats: ${parsed.sats_in_view}`);
if (parsed.ground_speed) telemetryExtras.push(`Speed: ${parsed.ground_speed}`);
if (parsed.altitude) telemetryExtras.push(`Altitude: ${parsed.altitude}`);
packetCard.innerHTML = `
<div class="card-header">
<span>Packet ID: <i>${p.id}</i></span>
<small>${time}</small>
</div>
<div class="card-body">
<dl>
<dt>From Node:</dt>
<dd><a href="/node/${p.from_node_id}">${fromNodeLabel}</a></dd>
<dt>To Node:</dt>
<dd>${
p.to_node_id === 4294967295
? `<i>All</i>`
: p.to_node_id === 1
? `<i>Direct to MQTT</i>`
: `<a href="/node/${p.to_node_id}">${toNodeLabel}</a>`
}</dd>
<dt>Channel:</dt><dd>${p.channel ?? "—"}</dd>
<dt>Port:</dt>
<dd><i>${PORT_NAMES[p.portnum] || "UNKNOWN APP"}</i> (${p.portnum})</dd>
<dt>Raw Payload:</dt>
<dd><pre>${escapeHtml(p.payload ?? "—")}</pre></dd>
${
telemetryExtras.length
? `<dt>Decoded Telemetry</dt>
<dd><pre>${telemetryExtras.join("\n")}</pre></dd>`
: ""
}
${
lat && lon
? `<dt>Location:</dt><dd>${lat.toFixed(6)}, ${lon.toFixed(6)}</dd>`
: ""
}
</dl>
</div>
`;
loading.classList.add("d-none");
packetCard.classList.remove("d-none");
/* ---------------------------------------------
Map initialization
----------------------------------------------*/
const map = L.map("map");
mapDiv.style.display = "block";
L.tileLayer("https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png", {
maxZoom: 19
}).addTo(map);
const allBounds = [];
/* ---------------------------------------------
ALWAYS SHOW SOURCE POSITION
Priority:
1) position from packet payload
2) fallback: last_lat/last_long from /api/nodes
----------------------------------------------*/
let srcLat = lat;
let srcLon = lon;
if ((!srcLat || !srcLon) && fromNodeObj?.last_lat && fromNodeObj?.last_long) {
srcLat = fromNodeObj.last_lat / 1e7;
srcLon = fromNodeObj.last_long / 1e7;
}
if (srcLat && srcLon) {
allBounds.push([srcLat, srcLon]);
const sourceIcon = L.divIcon({
html: `<div class="source-marker"></div>`,
className: "",
iconSize: [26, 26],
iconAnchor: [13, 13]
});
const sourceMarker = L.marker([srcLat, srcLon], {
icon: sourceIcon,
zIndexOffset: 9999
}).addTo(map);
sourceMarker.bindPopup(`
<div style="font-size:0.9em">
<b>Packet Source</b><br>
Lat: ${srcLat.toFixed(6)}<br>
Lon: ${srcLon.toFixed(6)}<br>
From Node: ${fromNodeLabel}<br>
Channel: ${p.channel ?? "—"}<br>
Port: ${PORT_NAMES[p.portnum] || "UNKNOWN"} (${p.portnum})
</div>
`);
} else {
map.setView([0,0], 2);
}
/* ---------------------------------------------
Color for hop indicator markers (warm → cold)
----------------------------------------------*/
function hopColor(hopValue){
const colors = [
"#ff3b30",
"#ff6b22",
"#ff9f0c",
"#ffd60a",
"#87d957",
"#57d9c4",
"#3db2ff",
"#1e63ff"
];
let h = Number(hopValue);
if (isNaN(h)) return "#aaa";
if (h < 0) h = 0;
if (h > 7) h = 7;
return colors[h];
}
/* Distance helper */
function haversine(lat1,lon1,lat2,lon2){
const R=6371;
const dLat=(lat2-lat1)*Math.PI/180;
const dLon=(lon2-lon1)*Math.PI/180;
const a=Math.sin(dLat/2)**2+
Math.cos(lat1*Math.PI/180)*
Math.cos(lat2*Math.PI/180)*
Math.sin(dLon/2)**2;
return R*(2*Math.atan2(Math.sqrt(a),Math.sqrt(1-a)));
}
/* ---------------------------------------------
Fetch packets_seen
----------------------------------------------*/
const seenRes = await fetch(`/api/packets_seen/${packetId}`);
const seenData = await seenRes.json();
const seenList = seenData.seen ?? [];
/* sort by hop_start descending (warm → cold) */
const seenSorted = seenList.slice().sort((a,b)=>{
const A=a.hop_start??-999;
const B=b.hop_start??-999;
return B-A;
});
if (seenSorted.length){
seenContainer.classList.remove("d-none");
seenCountSpan.textContent=`(${seenSorted.length} gateways)`;
}
/* ---------------------------------------------
Gateway markers and seen table
----------------------------------------------*/
seenTableBody.innerHTML = seenSorted.map(s=>{
const node=nodeLookup[s.node_id];
const label=node?(node.long_name||node.node_id):s.node_id;
const timeStr = s.import_time_us
? new Date(s.import_time_us/1000).toLocaleTimeString()
: "—";
if(node?.last_lat && node.last_long){
const rlat=node.last_lat/1e7;
const rlon=node.last_long/1e7;
allBounds.push([rlat,rlon]);
const start = Number(s.hop_start ?? 0);
const limit = Number(s.hop_limit ?? 0);
const hopValue = start - limit;
const color = hopColor(hopValue);
const iconHtml = `
<div style="
background:${color};
width:24px;
height:24px;
border-radius:50%;
display:flex;
align-items:center;
justify-content:center;
color:white;
font-size:11px;
font-weight:700;
border:2px solid rgba(0,0,0,0.35);
box-shadow:0 0 5px rgba(0,0,0,0.45);
">${hopValue}</div>`;
const marker=L.marker([rlat,rlon],{
icon:L.divIcon({
html:iconHtml,
className:"",
iconSize:[24,24],
iconAnchor:[12,12]
})
}).addTo(map);
let distKm=null,distMi=null;
if(srcLat&&srcLon){
distKm=haversine(srcLat,srcLon,rlat,rlon);
distMi=distKm*0.621371;
}
marker.bindPopup(`
<div style="font-size:0.9em">
<b>${node?.long_name || s.node_id}</b><br>
Node ID: <a href="/node/${s.node_id}">${s.node_id}</a><br>
HW: ${node?.hw_model ?? "—"}<br>
Channel: ${s.channel ?? "—"}<br><br>
<b>Signal</b><br>
RSSI: ${s.rx_rssi ?? "—"}<br>
SNR: ${s.rx_snr ?? "—"}<br><br>
<b>Hops</b>: ${hopValue}<br>
<b>Distance</b><br>
${
distKm
? `${distKm.toFixed(2)} km (${distMi.toFixed(2)} mi)`
: "—"
}
</div>
`);
}
return `
<tr>
<td><a href="/node/${s.node_id}">${label}</a></td>
<td>${s.rx_rssi ?? "—"}</td>
<td>${s.rx_snr ?? "—"}</td>
<td>${s.hop_start ?? "—"} / ${s.hop_limit ?? "—"}</td>
<td>${s.channel ?? "—"}</td>
<td>${timeStr}</td>
</tr>`;
}).join("");
/* ---------------------------------------------
Fit map to all markers
----------------------------------------------*/
if(allBounds.length>0){
map.fitBounds(allBounds,{padding:[40,40]});
}
/* ---------------------------------------------
Escape HTML
----------------------------------------------*/
function escapeHtml(unsafe) {
return (unsafe??"").replace(/[&<"'>]/g,m=>({
"&":"&amp;",
"<":"&lt;",
">":"&gt;",
"\"":"&quot;",
"'":"&#039;"
})[m]);
}
});
</script>
{% endblock %}


@@ -1,132 +0,0 @@
<div id="details_map"></div>
{% for seen in packets_seen %}
<div class="card mt-2">
<div class="card-header">
{{seen.node.long_name}}(
<a hx-target="#node" href="/node_search?q={{seen.node_id|node_id_to_hex}}">
{{seen.node_id|node_id_to_hex}}
</a>
)
</div>
<div class="card-body">
<div class="card-text text-start">
<dl>
<dt>Import Time</dt>
<dd>{{seen.import_time.strftime('%-I:%M:%S %p - %m-%d-%Y')}}</dd>
<dt>rx_time</dt>
<dd>{{seen.rx_time|format_timestamp}}</dd>
<dt>hop_limit</dt>
<dd>{{seen.hop_limit}}</dd>
<dt>hop_start</dt>
<dd>{{seen.hop_start}}</dd>
<dt>channel</dt>
<dd>{{seen.channel}}</dd>
<dt>rx_snr</dt>
<dd>{{seen.rx_snr}}</dd>
<dt>rx_rssi</dt>
<dd>{{seen.rx_rssi}}</dd>
<dt>topic</dt>
<dd>{{seen.topic}}</dd>
</dl>
</div>
</div>
</div>
{% endfor %}
{% if map_center %}
<script>
var details_map = L.map('details_map').setView({{ map_center | tojson }}, 8);
var markers = L.featureGroup();
markers.addTo(details_map);
L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
maxZoom: 15,
attribution: '&copy; <a href="http://www.openstreetmap.org/copyright">OpenStreetMap</a>'
}).addTo(details_map);
function getDistanceInMiles(latlng1, latlng2) {
var meters = latlng1.distanceTo(latlng2);
return meters * 0.000621371;
}
{% if from_node_cord %}
var fromNodeLatLng = L.latLng({{ from_node_cord | tojson }});
var fromNode = L.circleMarker(fromNodeLatLng, {
radius: 10,
color: 'red',
weight: 1,
fillColor: 'red',
fillOpacity: .4
}).addTo(markers);
fromNode.bindPopup(`
Sent by: <b>{{node.long_name}}</b><br/>
<b>Short:</b> {{node.short_name}}<br/>
<b>Channel:</b> {{node.channel}}<br/>
<b>Hardware:</b> {{node.hw_model}}<br/>
<b>Role:</b> {{node.role}}<br/>
<b>Firmware:</b> {{node.firmware}}<br/>
<b>Coordinates:</b> [{{node.last_lat}}, {{node.last_long}}]
`, { permanent: false, direction: 'top', opacity: 0.9 });
{% endif %}
{% for u in uplinked_nodes %}
var uplinkNodeLatLng = L.latLng([{{ u.lat }}, {{ u.long }}]);
{% if from_node_cord %}
var distanceMiles = getDistanceInMiles(fromNodeLatLng, uplinkNodeLatLng).toFixed(1);
{% endif %}
var node = L.marker(uplinkNodeLatLng, {
icon: L.divIcon({
className: 'text-icon',
html: `<div style="font-size: 12px; color: white; font-weight: bold; display: flex; justify-content: center; align-items: center; height: 16px; width: 16px; border-radius: 50%; background-color: blue; border: 1px solid blue;">{{u.hops}}</div>`,
iconSize: [16, 16],
iconAnchor: [8, 8]
})
}).addTo(markers);
node.setZIndexOffset({{u.hops}}*-1);
node.bindPopup(`
Heard by: <b>{{u.long_name}}</b><br>
<b>{{ u.short_name }}</b><br/>
<b>Hops:</b> {{ u.hops }}<br/>
<b>SNR:</b> {{ u.snr }}<br/>
<b>RSSI:</b> {{ u.rssi }}<br/>
{% if from_node_cord %}
<b>Distance:</b> ${distanceMiles} miles <br/>
{% endif %}
<b>Coordinates:</b> [{{u.lat}}, {{u.long}}]
`, { permanent: false, direction: 'top', opacity: 0.9 });
{% endfor %}
if (markers.getLayers().length > 0) {
details_map.fitBounds(markers.getBounds().pad(0.1), { animate: true });
}
var legend = L.control({ position: 'bottomleft' });
legend.onAdd = function(map) {
var div = L.DomUtil.create('div', 'info legend');
div.style.background = 'white';
div.style.padding = '8px';
div.style.border = '1px solid black';
div.style.borderRadius = '5px';
div.style.boxShadow = '0 0 5px rgba(0,0,0,0.3)';
div.style.color = 'black';
div.style.textAlign = 'left';
div.innerHTML = `
<b>Legend</b><br>
<svg width="20" height="20">
<circle cx="8" cy="8" r="6" fill="blue" stroke="blue" stroke-width="1" fill-opacity="0.9"/>
</svg> Receiving Node (Number is hop count)<br>
<svg width="20" height="20">
<circle cx="10" cy="10" r="8" fill="red" stroke="red" stroke-width="1" fill-opacity="0.4"/>
</svg> Sending Node<br>
`;
return div;
};
legend.addTo(details_map);
</script>
{% endif %}


@@ -1,24 +0,0 @@
{% extends "base.html" %}
{% block css %}
/* Set the maximum width of the page to 900px */
.container {
max-width: 900px;
margin: 0 auto; /* Center the content horizontally */
}
{% endblock %}
{% block body %}
<div class="container">
<div class="row">
<div>
{% include 'packet.html' %}
</div>
<div
id="packet_details"
hx-get="/packet_details/{{packet.id}}"
hx-trigger="load"
>
</div>
</div>
</div>
{% endblock %}


@@ -1,7 +0,0 @@
<div class="col" id="packet_list">
{% for packet in packets %}
{% include 'packet.html' %}
{% else %}
No packets found.
{% endfor %}
</div>


@@ -1,21 +0,0 @@
{% extends "base.html" %}
{% block body %}
{% include "search_form.html" %}
<ul>
{% for node in nodes %}
<li>
<a href="/packet_list/{{node.node_id}}?{{query_string}}">
{{node.node_id | node_id_to_hex}}
{% if node.long_name %}
{{node.short_name}} &mdash; {{node.long_name}}
{% endif %}
</a>
</li>
{% endfor %}
</ul>
{% endblock %}


@@ -1,44 +0,0 @@
<form
class="container p-2 sticky-top mx-auto"
id="search_form"
action="/node_search"
>
<div class="row">
<input
class="col m-2"
id="q"
type="text"
name="q"
placeholder="Node id"
autocomplete="off"
list="node_options"
value="{{raw_node_id}}"
hx-trigger="input delay:100ms"
hx-get="/node_match"
hx-target="#node_options"
/>
{% include "datalist.html" %}
{% set options = {
1: "Text Message",
3: "Position",
4: "Node Info",
67: "Telemetry",
70: "Traceroute",
71: "Neighbor Info",
}
%}
<select name="portnum" class="col-2 m-2">
<option
value = ""
{% if portnum not in options %}selected{% endif %}
>All</option>
{% for value, name in options.items() %}
<option
value="{{value}}"
{% if value == portnum %}selected{% endif %}
>{{ name }}</option>
{% endfor %}
</select>
<input type="submit" value="Go to Node" class="col-2 m-2" />
</div>
</form>


@@ -62,6 +62,29 @@
.expand-btn:hover { background-color: #666; }
.export-btn:hover { background-color: #777; }
/* Summary cards at top */
.summary-card {
background-color: #1f2124;
border: 1px solid #474b4e;
padding: 10px 15px;
margin-bottom: 15px;
border-radius: 8px;
}
.summary-count {
font-size: 18px;
color: #66bb6a;
font-weight: bold;
}
#channelSelect {
margin-bottom: 8px;
padding: 4px 6px;
background:#444;
color:#fff;
border:none;
border-radius:4px;
}
{% endblock %}
{% block head %}
@@ -70,106 +93,105 @@
{% block body %}
<div class="main-container">
<h2 class="main-header">Mesh Statistics - Hourly Packet Counts (Last 24 Hours)</h2>
<h2 class="main-header" data-translate-lang="mesh_stats_summary">
Mesh Statistics - Summary (all available in Database)
</h2>
{# Daily Charts #}
<!-- Summary cards now fully driven by API + JS -->
<div class="summary-container" style="display:flex; justify-content:space-between; gap:10px; margin-bottom:20px;">
<div class="summary-card" style="flex:1;">
<p data-translate-lang="total_nodes">Total Nodes</p>
<div class="summary-count" id="summary_nodes">0</div>
</div>
<div class="summary-card" style="flex:1;">
<p data-translate-lang="total_packets">Total Packets</p>
<div class="summary-count" id="summary_packets">0</div>
</div>
<div class="summary-card" style="flex:1;">
<p data-translate-lang="total_packets_seen">Total Packets Seen</p>
<div class="summary-count" id="summary_seen">0</div>
</div>
</div>
<!-- Daily Charts -->
<div class="card-section">
<p class="section-header">Packets per Day - All Ports (Last 14 Days)</p>
<p class="section-header" data-translate-lang="packets_per_day_all">
Packets per Day - All Ports (Last 14 Days)
</p>
<div id="total_daily_all" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_daily_all">Expand Chart</button>
<button class="export-btn" data-chart="chart_daily_all">Export CSV</button>
<button class="expand-btn" data-chart="chart_daily_all" data-translate-lang="expand_chart">Expand Chart</button>
<button class="export-btn" data-chart="chart_daily_all" data-translate-lang="export_csv">Export CSV</button>
<div id="chart_daily_all" class="chart"></div>
</div>
<!-- Packet Types Pie Chart with Channel Selector -->
<div class="card-section">
<p class="section-header">Packets per Day - Text Messages (Port 1, Last 14 Days)</p>
<p class="section-header" data-translate-lang="packet_types_last_24h">
Packet Types - Last 24 Hours
</p>
<select id="channelSelect">
<option value="" data-translate-lang="all_channels">All Channels</option>
</select>
<button class="expand-btn" data-chart="chart_packet_types" data-translate-lang="expand_chart">Expand Chart</button>
<button class="export-btn" data-chart="chart_packet_types" data-translate-lang="export_csv">Export CSV</button>
<div id="chart_packet_types" class="chart"></div>
</div>
<div class="card-section">
<p class="section-header" data-translate-lang="packets_per_day_text">
Packets per Day - Text Messages (Port 1, Last 14 Days)
</p>
<div id="total_daily_portnum_1" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_daily_portnum_1">Expand Chart</button>
<button class="export-btn" data-chart="chart_daily_portnum_1">Export CSV</button>
<button class="expand-btn" data-chart="chart_daily_portnum_1" data-translate-lang="expand_chart">Expand Chart</button>
<button class="export-btn" data-chart="chart_daily_portnum_1" data-translate-lang="export_csv">Export CSV</button>
<div id="chart_daily_portnum_1" class="chart"></div>
</div>
{# Hourly Charts #}
<!-- Hourly Charts -->
<div class="card-section">
<p class="section-header">Packets per Hour - All Ports</p>
<p class="section-header" data-translate-lang="packets_per_hour_all">
Packets per Hour - All Ports
</p>
<div id="total_hourly_all" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_hourly_all">Expand Chart</button>
<button class="export-btn" data-chart="chart_hourly_all">Export CSV</button>
<button class="expand-btn" data-chart="chart_hourly_all" data-translate-lang="expand_chart">Expand Chart</button>
<button class="export-btn" data-chart="chart_hourly_all" data-translate-lang="export_csv">Export CSV</button>
<div id="chart_hourly_all" class="chart"></div>
</div>
<div class="card-section">
<p class="section-header">Packets per Hour - Text Messages (Port 1)</p>
<p class="section-header" data-translate-lang="packets_per_hour_text">
Packets per Hour - Text Messages (Port 1)
</p>
<div id="total_portnum_1" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_portnum_1">Expand Chart</button>
<button class="export-btn" data-chart="chart_portnum_1">Export CSV</button>
<button class="expand-btn" data-chart="chart_portnum_1" data-translate-lang="expand_chart">Expand Chart</button>
<button class="export-btn" data-chart="chart_portnum_1" data-translate-lang="export_csv">Export CSV</button>
<div id="chart_portnum_1" class="chart"></div>
</div>
<!-- Node breakdown charts -->
<div class="card-section">
<p class="section-header">Packets per Hour - Position (Port 3)</p>
<div id="total_portnum_3" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_portnum_3">Expand Chart</button>
<button class="export-btn" data-chart="chart_portnum_3">Export CSV</button>
<div id="chart_portnum_3" class="chart"></div>
</div>
<div class="card-section">
<p class="section-header">Packets per Hour - Node Info (Port 4)</p>
<div id="total_portnum_4" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_portnum_4">Expand Chart</button>
<button class="export-btn" data-chart="chart_portnum_4">Export CSV</button>
<div id="chart_portnum_4" class="chart"></div>
</div>
<div class="card-section">
<p class="section-header">Packets per Hour - Telemetry (Port 67)</p>
<div id="total_portnum_67" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_portnum_67">Expand Chart</button>
<button class="export-btn" data-chart="chart_portnum_67">Export CSV</button>
<div id="chart_portnum_67" class="chart"></div>
</div>
<div class="card-section">
<p class="section-header">Packets per Hour - Traceroute (Port 70)</p>
<div id="total_portnum_70" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_portnum_70">Expand Chart</button>
<button class="export-btn" data-chart="chart_portnum_70">Export CSV</button>
<div id="chart_portnum_70" class="chart"></div>
</div>
<div class="card-section">
<p class="section-header">Packets per Hour - Neighbor Info (Port 71)</p>
<div id="total_portnum_71" class="total-count">Total: 0</div>
<button class="expand-btn" data-chart="chart_portnum_71">Expand Chart</button>
<button class="export-btn" data-chart="chart_portnum_71">Export CSV</button>
<div id="chart_portnum_71" class="chart"></div>
</div>
{# Node breakdown charts #}
<div class="card-section">
<p class="section-header">Hardware Breakdown</p>
<button class="expand-btn" data-chart="chart_hw_model">Expand Chart</button>
<button class="export-btn" data-chart="chart_hw_model">Export CSV</button>
<p class="section-header" data-translate-lang="hardware_breakdown">Hardware Breakdown</p>
<button class="expand-btn" data-chart="chart_hw_model" data-translate-lang="expand_chart">Expand Chart</button>
<button class="export-btn" data-chart="chart_hw_model" data-translate-lang="export_csv">Export CSV</button>
<div id="chart_hw_model" class="chart"></div>
</div>
<div class="card-section">
<p class="section-header">Role Breakdown</p>
<button class="expand-btn" data-chart="chart_role">Expand Chart</button>
<button class="export-btn" data-chart="chart_role">Export CSV</button>
<p class="section-header" data-translate-lang="role_breakdown">Role Breakdown</p>
<button class="expand-btn" data-chart="chart_role" data-translate-lang="expand_chart">Expand Chart</button>
<button class="export-btn" data-chart="chart_role" data-translate-lang="export_csv">Export CSV</button>
<div id="chart_role" class="chart"></div>
</div>
<div class="card-section">
<p class="section-header">Channel Breakdown</p>
<button class="expand-btn" data-chart="chart_channel">Expand Chart</button>
<button class="export-btn" data-chart="chart_channel">Export CSV</button>
<p class="section-header" data-translate-lang="channel_breakdown">Channel Breakdown</p>
<button class="expand-btn" data-chart="chart_channel" data-translate-lang="expand_chart">Expand Chart</button>
<button class="export-btn" data-chart="chart_channel" data-translate-lang="export_csv">Export CSV</button>
<div id="chart_channel" class="chart"></div>
</div>
</div>
{# Modal for expanded charts #}
<!-- Modal for expanded charts -->
<div id="chartModal" style="display:none; position:fixed; top:0; left:0; width:100%; height:100%;
background:rgba(0,0,0,0.7); z-index:1000; justify-content:center; align-items:center;">
<div style="position:relative; width:80%; max-width:1000px; height:80%;
@@ -183,11 +205,21 @@
</div>
<script>
const PORTNUM_LABELS = {
1: "Text Messages",
3: "Position",
4: "Node Info",
67: "Telemetry",
70: "Traceroute",
71: "Neighbor Info"
};
// --- Fetch & Processing ---
async function fetchStats(period_type,length,portnum=null){
async function fetchStats(period_type,length,portnum=null,channel=null){
try{
let url=`/api/stats?period_type=${period_type}&length=${length}`;
if(portnum!==null) url+=`&portnum=${portnum}`;
if(channel) url+=`&channel=${channel}`;
const res=await fetch(url);
if(!res.ok) return [];
const json=await res.json();
@@ -200,7 +232,19 @@ async function fetchNodes(){
const res=await fetch("/api/nodes");
const json=await res.json();
return json.nodes||[];
}catch{return [];}
}catch{
return [];
}
}
async function fetchChannels(){
try{
const res = await fetch("/api/channels");
const json = await res.json();
return json.channels || [];
}catch{
return [];
}
}
function processCountField(nodes,field){
@@ -230,21 +274,35 @@ function prepareTopN(data,n=20){
}
// --- Chart Rendering ---
function renderChart(domId,data,type,color,isHourly){
function renderChart(domId,data,type,color){
const el=document.getElementById(domId);
if(!el) return;
const chart=echarts.init(el);
const periods=data.map(d=>d.period!=null?d.period.toString():'');
const counts=data.map(d=>d.count??d.packet_count??0);
const option={
chart.setOption({
backgroundColor:'#272b2f',
tooltip:{trigger:'axis'},
grid:{left:'6%', right:'6%', bottom:'18%'},
xAxis:{type:'category', data:periods, axisLine:{lineStyle:{color:'#aaa'}}, axisLabel:{rotate:45,color:'#ccc'}},
yAxis:{type:'value', axisLine:{lineStyle:{color:'#aaa'}}, axisLabel:{color:'#ccc'}},
series:[{data:counts,type:type,smooth:type==='line',itemStyle:{color:color}, areaStyle:type==='line'?{}:undefined}]
};
chart.setOption(option);
xAxis:{
type:'category',
data:periods,
axisLine:{lineStyle:{color:'#aaa'}},
axisLabel:{rotate:45,color:'#ccc'}
},
yAxis:{
type:'value',
axisLine:{lineStyle:{color:'#aaa'}},
axisLabel:{color:'#ccc'}
},
series:[{
data:counts,
type:type,
smooth:type==='line',
itemStyle:{color:color},
areaStyle:type==='line'?{}:undefined
}]
});
return chart;
}
@@ -253,63 +311,163 @@ function renderPieChart(elId,data,name){
if(!el) return;
const chart=echarts.init(el);
const top20=prepareTopN(data,20);
const option={
chart.setOption({
backgroundColor:"#272b2f",
tooltip:{trigger:"item", formatter: params=>`${params.name}: ${Math.round(params.percent)}% (${params.value})`},
tooltip:{
trigger:"item",
formatter: params=>`${params.name}: ${Math.round(params.percent)}% (${params.value})`
},
series:[{
name:name, type:"pie", radius:["30%","70%"], center:["50%","50%"],
name:name,
type:"pie",
radius:["30%","70%"],
center:["50%","50%"],
avoidLabelOverlap:true,
itemStyle:{borderRadius:6,borderColor:"#272b2f",borderWidth:2},
label:{show:true,formatter:"{b}\n{d}%", color:"#ccc", fontSize:10},
labelLine:{show:true,length:10,length2:6},
itemStyle:{
borderRadius:6,
borderColor:"#272b2f",
borderWidth:2
},
label:{
show:true,
formatter:"{b}\n{d}%",
color:"#ccc",
fontSize:10
},
labelLine:{
show:true,
length:10,
length2:6
},
data:top20
}]
};
chart.setOption(option);
});
return chart;
}
// --- Packet Type Pie Chart ---
async function fetchPacketTypeBreakdown(channel=null) {
const portnums = [1,3,4,67,70,71];
const requests = portnums.map(async pn => {
const data = await fetchStats('hour',24,pn,channel);
const total = (data || []).reduce((sum,d)=>sum+(d.count??d.packet_count??0),0);
return {portnum: pn, count: total};
});
const allData = await fetchStats('hour',24,null,channel);
const totalAll = allData.reduce((sum,d)=>sum+(d.count??d.packet_count??0),0);
const results = await Promise.all(requests);
const trackedTotal = results.reduce((sum,d)=>sum+d.count,0);
const other = Math.max(totalAll - trackedTotal,0);
if(other>0) results.push({portnum:"other", count:other});
return results;
}
// --- Init ---
let chartHourlyAll, chartPortnum1, chartPortnum3, chartPortnum4, chartPortnum67, chartPortnum70, chartPortnum71;
let chartDailyAll, chartDailyPortnum1;
let chartHwModel, chartRole, chartChannel;
let chartPacketTypes;
async function init(){
// Channel selector
const channels = await fetchChannels();
const select = document.getElementById("channelSelect");
channels.forEach(ch=>{
const opt = document.createElement("option");
opt.value = ch;
opt.textContent = ch;
select.appendChild(opt);
});
// Daily all ports
const dailyAllData=await fetchStats('day',14);
updateTotalCount('total_daily_all',dailyAllData);
chartDailyAll=renderChart('chart_daily_all',dailyAllData,'line','#66bb6a',false);
chartDailyAll=renderChart('chart_daily_all',dailyAllData,'line','#66bb6a');
// Daily port 1
const dailyPort1Data=await fetchStats('day',14,1);
updateTotalCount('total_daily_portnum_1',dailyPort1Data);
chartDailyPortnum1=renderChart('chart_daily_portnum_1',dailyPort1Data,'bar','#ff5722',false);
chartDailyPortnum1=renderChart('chart_daily_portnum_1',dailyPort1Data,'bar','#ff5722');
// Hourly all ports
const hourlyAllData=await fetchStats('hour',24);
updateTotalCount('total_hourly_all',hourlyAllData);
chartHourlyAll=renderChart('chart_hourly_all',hourlyAllData,'bar','#03dac6',true);
chartHourlyAll=renderChart('chart_hourly_all',hourlyAllData,'bar','#03dac6');
// Hourly per port
const portnums=[1,3,4,67,70,71];
const colors=['#ff5722','#2196f3','#9c27b0','#ffeb3b','#795548','#4caf50'];
const domIds=['chart_portnum_1','chart_portnum_3','chart_portnum_4','chart_portnum_67','chart_portnum_70','chart_portnum_71'];
const totalIds=['total_portnum_1','total_portnum_3','total_portnum_4','total_portnum_67','total_portnum_70','total_portnum_71'];
const allData=await Promise.all(portnums.map(pn=>fetchStats('hour',24,pn)));
for(let i=0;i<portnums.length;i++){
updateTotalCount(totalIds[i],allData[i]);
window['chartPortnum'+portnums[i]]=renderChart(domIds[i],allData[i],'bar',colors[i],true);
window['chartPortnum'+portnums[i]]=renderChart(domIds[i],allData[i],'bar',colors[i]);
}
// Nodes for breakdown + summary node count
const nodes=await fetchNodes();
chartHwModel=renderPieChart("chart_hw_model",processCountField(nodes,"hw_model"),"Hardware");
chartRole=renderPieChart("chart_role",processCountField(nodes,"role"),"Role");
chartChannel=renderPieChart("chart_channel",processCountField(nodes,"channel"),"Channel");
const summaryNodesEl = document.getElementById("summary_nodes");
if (summaryNodesEl) {
summaryNodesEl.textContent = nodes.length.toLocaleString();
}
// Packet types pie
const packetTypesData = await fetchPacketTypeBreakdown();
const formatted = packetTypesData
.filter(d=>d.count>0)
.map(d=>({
name: d.portnum==="other"
? "Other"
: (PORTNUM_LABELS[d.portnum]||`Port ${d.portnum}`),
value: d.count
}));
chartPacketTypes = renderPieChart("chart_packet_types",formatted,"Packet Types (Last 24h)");
// Total packet + total seen from /api/stats/count
try {
const countsRes = await fetch("/api/stats/count");
if (countsRes.ok) {
const countsJson = await countsRes.json();
const elPackets = document.getElementById("summary_packets");
const elSeen = document.getElementById("summary_seen");
if (elPackets) {
elPackets.textContent = (countsJson.total_packets || 0).toLocaleString();
}
if (elSeen) {
elSeen.textContent = (countsJson.total_seen || 0).toLocaleString();
}
}
} catch (err) {
console.error("Failed to load /api/stats/count:", err);
}
}
// --- Resize ---
window.addEventListener('resize',()=>{
[chartHourlyAll,chartPortnum1,chartPortnum3,chartPortnum4,chartPortnum67,chartPortnum70,chartPortnum71,
chartDailyAll,chartDailyPortnum1,chartHwModel,chartRole,chartChannel].forEach(c=>c?.resize());
[
chartHourlyAll,
chartPortnum1,
chartPortnum3,
chartPortnum4,
chartPortnum67,
chartPortnum70,
chartPortnum71,
chartDailyAll,
chartDailyPortnum1,
chartHwModel,
chartRole,
chartChannel,
chartPacketTypes
].forEach(c=>c?.resize());
});
// --- Modal ---
const modal=document.getElementById("chartModal");
const modalChartEl=document.getElementById("modalChart");
let modalChart=null;
@@ -317,25 +475,22 @@ let modalChart=null;
document.querySelectorAll(".expand-btn").forEach(btn=>{
btn.addEventListener("click",()=>{
const chartId=btn.getAttribute("data-chart");
const sourceChart=echarts.getInstanceByDom(document.getElementById(chartId));
if(!sourceChart)return;
modal.style.display="flex";
if(modalChart) modalChart.dispose();
modalChart=echarts.init(modalChartEl);
modalChart.setOption(sourceChart.getOption());
});
});
document.getElementById("closeModal").addEventListener("click",()=>{
modal.style.display="none";
modalChart?.dispose();
modalChart=null;
});
// --- CSV Export ---
function downloadCSV(filename,rows){
const csvContent=rows.map(r=>r.map(v=>`"${v}"`).join(",")).join("\n");
const blob=new Blob([csvContent],{type:"text/csv;charset=utf-8;"});
const link=document.createElement("a");
link.href=URL.createObjectURL(blob);
@@ -349,13 +504,13 @@ document.querySelectorAll(".export-btn").forEach(btn=>{
btn.addEventListener("click",()=>{
const chartId=btn.getAttribute("data-chart");
const chart=echarts.getInstanceByDom(document.getElementById(chartId));
if(!chart)return;
const option=chart.getOption();
let rows=[];
if(option.series[0].type==="bar"||option.series[0].type==="line"){
const xData=option.xAxis[0].data;
const yData=option.series[0].data;
rows.push(["Period","Count"]);
for(let i=0;i<xData.length;i++) rows.push([xData[i],yData[i]]);
}
if(option.series[0].type==="pie"){
@@ -370,6 +525,54 @@ document.querySelectorAll(".export-btn").forEach(btn=>{
});
});
document.getElementById("channelSelect").addEventListener("change", async (e)=>{
const channel = e.target.value;
const packetTypesData = await fetchPacketTypeBreakdown(channel);
const formatted = packetTypesData
.filter(d=>d.count>0)
.map(d=>({
name: d.portnum==="other"
? "Other"
: (PORTNUM_LABELS[d.portnum]||`Port ${d.portnum}`),
value: d.count
}));
chartPacketTypes?.dispose();
chartPacketTypes = renderPieChart("chart_packet_types",formatted,"Packet Types (Last 24h)");
});
// Kick everything off
init();
// --- Load config and translations ---
async function loadConfigAndTranslations() {
let langCode = "en";
try {
const resConfig = await fetch("/api/config");
const cfg = await resConfig.json();
window.site_config = cfg;
langCode = cfg?.site?.language || "en";
} catch(err) {
console.error("Failed to load /api/config:", err);
window.site_config = { site: { language: "en" } };
}
try {
const resLang = await fetch(`/api/lang?lang=${langCode}&section=stats`);
window.statsTranslations = await resLang.json();
} catch(err) {
console.error("Stats translation load failed:", err);
window.statsTranslations = {};
}
// Apply translations
const t = window.statsTranslations || {};
document.querySelectorAll("[data-translate-lang]").forEach(el=>{
const key = el.getAttribute("data-translate-lang");
if(t[key]) el.textContent = t[key];
});
}
// Call after init
loadConfigAndTranslations();
</script>
{% endblock %}


@@ -1,80 +0,0 @@
{% extends "base.html" %}
{% block css %}
#packet_details {
height: 95vh;
overflow: auto;
}
.main-container, .container {
max-width: 600px;
margin: 0 auto;
text-align: center;
}
.card-section {
background-color: #272b2f;
border: 1px solid #474b4e;
padding: 15px 20px;
margin-bottom: 10px;
border-radius: 10px;
transition: background-color 0.2s ease;
}
.card-section:hover {
background-color: #2f3338;
}
.section-header {
font-size: 16px;
margin: 0;
font-weight: 500;
}
.section-value {
font-weight: 700;
color: #03dac6;
}
.percentage {
font-size: 12px;
color: #ffeb3b;
font-weight: 400;
}
.main-header {
font-size: 22px;
margin-bottom: 20px;
font-weight: 600;
}
{% endblock %}
{% block body %}
<div class="main-container">
<h2 class="main-header">Mesh Statistics</h2>
<!-- Section for Total Nodes -->
<div class="card-section">
<p class="section-header">
Total Active Nodes (24 hours): <br>
<span class="section-value">{{ "{:,}".format(total_nodes) }}</span>
</p>
</div>
<!-- Section for Total Packets -->
<div class="card-section">
<p class="section-header">
Total Packets (14 days):
<span class="section-value">{{ "{:,}".format(total_packets) }}</span>
</p>
</div>
<!-- Section for Total MQTT Reports -->
<div class="card-section">
<p class="section-header">
Total MQTT Reports (14 days):
<span class="section-value">{{ "{:,}".format(total_packets_seen) }}</span>
</p>
</div>
</div>
{% endblock %}
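The `"{:,}".format(...)` calls above use Python's thousands-separator format spec; the same call in plain Python:

```python
# Same formatting the template applies to the node/packet totals.
assert "{:,}".format(1234567) == "1,234,567"
assert "{:,}".format(0) == "0"
```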


@@ -2,252 +2,283 @@
{% block css %}
<style>
/* General table styling */
table {
width: 100%;
border-collapse: collapse;
margin-top: 20px;
}
body { background-color: #121212; color: #ddd; }
h1 { text-align: center; margin-top: 20px; color: #fff; }
table th, table td {
padding: 12px;
text-align: left;
border: 1px solid #ddd;
cursor: pointer;
}
table th {
background-color: #333;
color: #fff;
font-weight: bold;
}
table tbody tr:nth-child(odd) {
background-color: #272b2f;
}
table tbody tr:nth-child(even) {
background-color: #212529;
}
table tbody tr:hover {
background-color: #555;
}
table td {
color: #ddd;
}
table th, table td {
border-radius: 8px;
}
@media (max-width: 768px) {
table th, table td {
padding: 8px;
}
}
.top-container {
max-width: 1100px;
margin: 25px auto;
padding: 0 15px;
}
h1 {
text-align: center;
color: #ddd;
margin-top: 20px;
margin-bottom: 20px;
}
.filter-bar {
display: flex;
flex-wrap: wrap;
align-items: center;
gap: 10px;
margin-bottom: 15px;
}
#bellCurveChart {
height: 400px;
width: 100%;
max-width: 100%;
margin-top: 30px;
}
.filter-bar select {
background-color: #1f2327;
border: 1px solid #444;
color: #ddd;
}
#stats {
text-align: center;
margin-top: 20px;
color: #fff;
}
table {
width: 100%;
border-collapse: collapse;
margin-top: 10px;
}
select {
margin: 10px auto;
display: block;
padding: 6px 10px;
border-radius: 6px;
background-color: #333;
color: #fff;
border: 1px solid #555;
}
table th, table td {
padding: 12px;
border: 1px solid #444;
text-align: left;
color: #ddd;
}
.filter-bar {
display: flex;
flex-direction: row;
align-items: center;
gap: 20px;
}
table th { background-color: #333; }
table tbody tr:nth-child(odd) { background-color: #272b2f; }
table tbody tr:nth-child(even) { background-color: #212529; }
table tbody tr:hover { background-color: #555; cursor: pointer; }
.node-link {
color: #9fd4ff;
text-decoration: none;
}
.node-link:hover { text-decoration: underline; }
.good-x { color: #81ff81; font-weight: bold; }
.ok-x { color: #e8e86d; font-weight: bold; }
.bad-x { color: #ff6464; font-weight: bold; }
</style>
{% endblock %}
{% block body %}
<h1>Top Nodes Traffic</h1>
<div id="stats">
<p>This chart shows a bell curve (normal distribution) based on the total <strong>"Times Seen"</strong> values for all nodes. It helps visualize how frequently nodes are heard, relative to the average.</p>
<p>This "Times Seen" value is the closest that we can get to Mesh utilization by node.</p>
<p><strong>Mean:</strong> <span id="mean"></span> - <strong>Standard Deviation:</strong> <span id="stdDev"></span></p>
<div class="top-container">
<div class="filter-bar">
<div>
<label for="channelFilter">Channel:</label>
<select id="channelFilter" class="form-select form-select-sm" style="width:auto;"></select>
</div>
<div>
<label for="nodeSearch">Search:</label>
<input id="nodeSearch" type="text" class="form-control form-control-sm"
placeholder="Search nodes..."
style="width:180px; display:inline-block;">
</div>
</div>
<!-- Chart -->
<div id="bellCurveChart"></div>
<!-- Table -->
{% if nodes %}
<div class="container">
<table id="trafficTable">
<thead>
<tr>
<th onclick="sortTable(0)">Long Name</th>
<th onclick="sortTable(1)">Short Name</th>
<th onclick="sortTable(2)">Channel</th>
<th onclick="sortTable(3)">Packets Sent</th>
<th onclick="sortTable(4)">Times Seen</th>
<th onclick="sortTable(5)">Seen % of Mean</th>
</tr>
</thead>
<tbody></tbody>
</table>
</div>
{% else %}
<p style="text-align: center;">No top traffic nodes available.</p>
{% endif %}
<script src="https://cdn.jsdelivr.net/npm/echarts@5.3.2/dist/echarts.min.js"></script>
<div class="table-responsive">
<table id="nodesTable">
<thead>
<tr>
<th>Long Name</th>
<th>Short Name</th>
<th>Channel</th>
<th>Sent (24h)</th>
<th>Seen (24h)</th>
<th>Avg Gateways</th>
</tr>
</thead>
<tbody></tbody>
</table>
</div>
</div>
<script>
const nodes = {{ nodes | tojson }};
let filteredNodes = [];
let allNodes = [];
// Chart & Stats
const chart = echarts.init(document.getElementById('bellCurveChart'));
const meanEl = document.getElementById('mean');
const stdEl = document.getElementById('stdDev');
async function loadChannels() {
try {
const res = await fetch("/api/channels");
const data = await res.json();
const channels = data.channels || [];
// Populate Channel Dropdown (without "All"), default to "LongFast"
const channelSet = new Set();
nodes.forEach(n => channelSet.add(n.channel));
const dropdown = document.getElementById('channelFilter');
const sortedChannels = [...channelSet].sort();
const select = document.getElementById("channelFilter");
sortedChannels.forEach(channel => {
const option = document.createElement('option');
option.value = channel;
option.textContent = channel;
if (channel === "LongFast") {
option.selected = true;
// Default LongFast first
if (channels.includes("LongFast")) {
const opt = document.createElement("option");
opt.value = "LongFast";
opt.textContent = "LongFast";
select.appendChild(opt);
}
for (const ch of channels) {
if (ch === "LongFast") continue;
const opt = document.createElement("option");
opt.value = ch;
opt.textContent = ch;
select.appendChild(opt);
}
select.addEventListener("change", renderTable);
} catch (err) {
console.error("Error loading channels:", err);
}
}
dropdown.appendChild(option);
});
// Default to LongFast filter on load
filteredNodes = nodes.filter(n => n.channel === "LongFast");
async function loadNodes() {
try {
const res = await fetch("/api/nodes");
const data = await res.json();
allNodes = data.nodes || [];
} catch (err) {
console.error("Error loading nodes:", err);
}
}
// Filter change handler
dropdown.addEventListener('change', () => {
const val = dropdown.value;
filteredNodes = nodes.filter(n => n.channel === val);
updateTable();
updateStatsAndChart();
});
async function fetchNodeStats(nodeId) {
try {
const url = `/api/stats/count?from_node=${nodeId}&period_type=day&length=1`;
const res = await fetch(url);
const data = await res.json();
// Normal distribution function
function normalDistribution(x, mean, stdDev) {
return (1 / (stdDev * Math.sqrt(2 * Math.PI))) * Math.exp(-0.5 * Math.pow((x - mean) / stdDev, 2));
}
const sent = data.total_packets || 0;
const seen = data.total_seen || 0;
const avg = seen / Math.max(sent, 1);
// Update table based on filteredNodes
function updateTable() {
const tbody = document.querySelector('#trafficTable tbody');
return {
sent,
seen,
avg: avg
};
} catch (err) {
console.error("Stat error", err);
return { sent: 0, seen: 0, avg: 0 };
}
}
function avgClass(v) {
if (v >= 10) return "good-x"; // Very strong node
if (v >= 2) return "ok-x"; // Normal node
return "bad-x"; // Weak node
}
async function renderTable() {
const tbody = document.querySelector("#nodesTable tbody");
tbody.innerHTML = "";
const mean = filteredNodes.reduce((sum, n) => sum + n.total_times_seen, 0) / (filteredNodes.length || 1);
const channel = document.getElementById("channelFilter").value;
const searchText = document.getElementById("nodeSearch").value.trim().toLowerCase();
for (const node of filteredNodes) {
const percent = mean > 0 ? ((node.total_times_seen / mean) * 100).toFixed(1) + "%" : "0%";
const row = `<tr>
<td><a href="/packet_list/${node.node_id}">${node.long_name}</a></td>
<td>${node.short_name}</td>
<td>${node.channel}</td>
<td><a href="/top?node_id=${node.node_id}">${node.total_packets_sent}</a></td>
<td>${node.total_times_seen}</td>
<td>${percent}</td>
</tr>`;
tbody.insertAdjacentHTML('beforeend', row);
}
}
// Filter nodes by channel FIRST
let filtered = allNodes.filter(n => n.channel === channel);
// Update chart & stats
function updateStatsAndChart() {
const timesSeen = filteredNodes.map(n => n.total_times_seen);
const mean = timesSeen.reduce((sum, v) => sum + v, 0) / (timesSeen.length || 1);
const stdDev = Math.sqrt(timesSeen.reduce((sum, v) => sum + Math.pow(v - mean, 2), 0) / (timesSeen.length || 1));
meanEl.textContent = mean.toFixed(2);
stdEl.textContent = stdDev.toFixed(2);
const min = Math.min(...timesSeen);
const max = Math.max(...timesSeen);
const step = (max - min) / 100;
const xData = [], yData = [];
for (let x = min; x <= max; x += step) {
xData.push(x);
yData.push(normalDistribution(x, mean, stdDev));
// Then apply search
if (searchText !== "") {
filtered = filtered.filter(n =>
(n.long_name && n.long_name.toLowerCase().includes(searchText)) ||
(n.short_name && n.short_name.toLowerCase().includes(searchText)) ||
String(n.node_id).includes(searchText)
);
}
const option = {
animation: false,
tooltip: { trigger: 'axis' },
xAxis: {
name: 'Total Times Seen',
type: 'value',
min, max
},
yAxis: {
name: 'Probability Density',
type: 'value',
},
series: [{
data: xData.map((x, i) => [x, yData[i]]),
type: 'line',
smooth: true,
color: 'blue',
lineStyle: { width: 3 }
}]
};
chart.setOption(option);
chart.resize();
}
// --- Create placeholder rows ---
const rowRefs = filtered.map(n => {
const tr = document.createElement("tr");
tr.addEventListener("click", () => {
window.location.href = `/node/${n.node_id}`;
});
// Sorting
function sortTable(n) {
const table = document.getElementById("trafficTable");
const rows = Array.from(table.rows).slice(1);
const header = table.rows[0].cells[n];
const isNumeric = !isNaN(rows[0].cells[n].innerText.replace('%', ''));
let sortedRows = rows.sort((a, b) => {
const valA = isNumeric ? parseFloat(a.cells[n].innerText.replace('%', '')) : a.cells[n].innerText.toLowerCase();
const valB = isNumeric ? parseFloat(b.cells[n].innerText.replace('%', '')) : b.cells[n].innerText.toLowerCase();
return valA > valB ? 1 : -1;
const tdLong = document.createElement("td");
const a = document.createElement("a");
a.href = `/node/${n.node_id}`;
a.textContent = n.long_name || n.node_id;
a.className = "node-link";
a.addEventListener("click", e => e.stopPropagation());
tdLong.appendChild(a);
const tdShort = document.createElement("td");
tdShort.textContent = n.short_name || "";
const tdChannel = document.createElement("td");
tdChannel.textContent = n.channel || "";
const tdSent = document.createElement("td");
tdSent.textContent = "Loading...";
const tdSeen = document.createElement("td");
tdSeen.textContent = "Loading...";
const tdAvg = document.createElement("td");
tdAvg.textContent = "Loading...";
tr.appendChild(tdLong);
tr.appendChild(tdShort);
tr.appendChild(tdChannel);
tr.appendChild(tdSent);
tr.appendChild(tdSeen);
tr.appendChild(tdAvg);
tbody.appendChild(tr);
return { node: n, tr, tdSent, tdSeen, tdAvg };
});
if (header.getAttribute('data-sort-direction') === 'asc') {
sortedRows.reverse();
header.setAttribute('data-sort-direction', 'desc');
} else {
header.setAttribute('data-sort-direction', 'asc');
}
// --- Stats fetch ---
const statsList = await Promise.all(
rowRefs.map(ref => fetchNodeStats(ref.node.node_id))
);
const tbody = table.tBodies[0];
sortedRows.forEach(row => tbody.appendChild(row));
// --- Update + cleanup empty nodes ---
let combined = rowRefs.map((ref, i) => {
const stats = statsList[i];
ref.tdSent.textContent = stats.sent;
ref.tdSeen.textContent = stats.seen;
ref.tdAvg.innerHTML = `<span class="${avgClass(stats.avg)}">${stats.avg.toFixed(1)}</span>`;
return {
tr: ref.tr,
sent: stats.sent,
seen: stats.seen
};
});
// Remove nodes with no traffic
combined = combined.filter(r => !(r.sent === 0 && r.seen === 0));
// Sort by traffic (seen)
combined.sort((a, b) => b.seen - a.seen);
// Rebuild table
tbody.innerHTML = "";
for (const r of combined) {
tbody.appendChild(r.tr);
}
}
// Initialize
updateTable();
updateStatsAndChart();
window.addEventListener('resize', () => chart.resize());
(async () => {
await loadNodes();
await loadChannels();
document.getElementById("channelFilter").value = "LongFast";
document.getElementById("nodeSearch").addEventListener("input", renderTable);
renderTable();
})();
</script>
{% endblock %}
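The Avg Gateways column above divides `total_seen` by `total_packets` with a floor of one, and colours the result with fixed thresholds; the same computation sketched in Python:

```python
def avg_gateways(sent: int, seen: int) -> float:
    # Mirrors fetchNodeStats(): total_seen / max(total_packets, 1),
    # so a node that sent nothing never divides by zero.
    return seen / max(sent, 1)

def avg_class(v: float) -> str:
    # Same thresholds as the template's avgClass().
    if v >= 10:
        return "good-x"   # very strong node
    if v >= 2:
        return "ok-x"     # normal node
    return "bad-x"        # weak node
```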


@@ -1,94 +0,0 @@
{% block head %}
<script src="https://cdn.jsdelivr.net/npm/echarts/dist/echarts.min.js"></script>
{% endblock %}
{% block body %}
<div id="mynetwork" style="width: 100%; height: 800px;"></div>
<script type="text/javascript">
const chart = echarts.init(document.getElementById('mynetwork'));
const rawNodes = {{ chart_data['nodes'] | tojson }};
const rawEdges = {{ chart_data['edges'] | tojson }};
// Build DAG layout
const layers = {};
const nodeDepth = {};
// Organize nodes into layers by hop count
for (const edge of rawEdges) {
const { source, target } = edge;
if (!(source in nodeDepth)) nodeDepth[source] = 0;
const nextDepth = nodeDepth[source] + 1;
nodeDepth[target] = Math.max(nodeDepth[target] || 0, nextDepth);
}
for (const node of rawNodes) {
const depth = nodeDepth[node.name] || 0;
if (!(depth in layers)) layers[depth] = [];
layers[depth].push(node);
}
// Position nodes manually
const chartNodes = [];
const layerKeys = Object.keys(layers).sort((a, b) => +a - +b);
const verticalSpacing = 100;
const horizontalSpacing = 180;
layerKeys.forEach((depth, layerIndex) => {
const layer = layers[depth];
const y = layerIndex * verticalSpacing;
const xStart = -(layer.length - 1) * horizontalSpacing / 2;
layer.forEach((node, i) => {
chartNodes.push({
...node,
x: xStart + i * horizontalSpacing,
y: y,
itemStyle: {
color: '#dddddd',
borderColor: '#222',
borderWidth: 2,
},
label: {
show: true,
position: 'inside',
color: '#000',
fontSize: 12,
formatter: node.short_name || node.name,
},
});
});
});
const chartEdges = rawEdges.map(edge => ({
source: edge.source,
target: edge.target,
lineStyle: {
color: edge.originalColor || '#ccc',
width: 2,
type: 'solid',
},
}));
const option = {
backgroundColor: '#fff',
tooltip: {},
animation: false,
series: [{
type: 'graph',
layout: 'none',
coordinateSystem: null,
data: chartNodes,
links: chartEdges,
roam: true,
edgeSymbol: ['none', 'arrow'],
edgeSymbolSize: [0, 10],
lineStyle: {
curveness: 0,
},
}],
};
chart.setOption(option);
</script>
{% endblock %}
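The DAG layout above assigns each node a layer from the edge list in a single pass; the same depth assignment in Python (note that, like the JS original, it is order-sensitive rather than a full fixpoint):

```python
def layer_depths(edges):
    # One pass over the edge list, as in the template: a source defaults to
    # depth 0 and each target sits at least one layer below its source.
    depth = {}
    for source, target in edges:
        depth.setdefault(source, 0)
        depth[target] = max(depth.get(target, 0), depth[source] + 1)
    return depth
```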

File diff suppressed because it is too large


@@ -0,0 +1 @@
"""Web submodule for MeshView API endpoints."""

meshview/web_api/api.py Normal file

@@ -0,0 +1,692 @@
"""API endpoints for MeshView."""
import datetime
import json
import logging
import os
from aiohttp import web
from sqlalchemy import text
from meshtastic.protobuf.portnums_pb2 import PortNum
from meshview import database, decode_payload, store
from meshview.__version__ import __version__, _git_revision_short, get_version_info
from meshview.config import CONFIG
logger = logging.getLogger(__name__)
# Will be set by web.py during initialization
Packet = None
SEQ_REGEX = None
LANG_DIR = None
# Create dedicated route table for API endpoints
routes = web.RouteTableDef()
def init_api_module(packet_class, seq_regex, lang_dir):
"""Initialize API module with dependencies from main web module."""
global Packet, SEQ_REGEX, LANG_DIR
Packet = packet_class
SEQ_REGEX = seq_regex
LANG_DIR = lang_dir
@routes.get("/api/channels")
async def api_channels(request: web.Request):
period_type = request.query.get("period_type", "hour")
try:
length = int(request.query.get("length", 24))
except ValueError:
length = 24
try:
channels = await store.get_channels_in_period(period_type, length)
return web.json_response({"channels": channels})
except Exception as e:
return web.json_response({"channels": [], "error": str(e)})
@routes.get("/api/nodes")
async def api_nodes(request):
try:
# Optional query parameters
role = request.query.get("role")
channel = request.query.get("channel")
hw_model = request.query.get("hw_model")
days_active = request.query.get("days_active")
if days_active:
try:
days_active = int(days_active)
except ValueError:
days_active = None
# Fetch nodes from database
nodes = await store.get_nodes(
role=role, channel=channel, hw_model=hw_model, days_active=days_active
)
# Prepare the JSON response
nodes_data = []
for n in nodes:
nodes_data.append(
{
"id": getattr(n, "id", None),
"node_id": n.node_id,
"long_name": n.long_name,
"short_name": n.short_name,
"hw_model": n.hw_model,
"firmware": n.firmware,
"role": n.role,
"last_lat": getattr(n, "last_lat", None),
"last_long": getattr(n, "last_long", None),
"channel": n.channel,
# "last_update": n.last_update.isoformat(),
"last_seen_us": n.last_seen_us,
}
)
return web.json_response({"nodes": nodes_data})
except Exception as e:
logger.error(f"Error in /api/nodes: {e}")
return web.json_response({"error": "Failed to fetch nodes"}, status=500)
@routes.get("/api/packets")
async def api_packets(request):
try:
# --- Parse query parameters ---
packet_id_str = request.query.get("packet_id")
limit_str = request.query.get("limit", "50")
since_str = request.query.get("since")
portnum_str = request.query.get("portnum")
contains = request.query.get("contains")
# NEW — explicit filters
from_node_id_str = request.query.get("from_node_id")
to_node_id_str = request.query.get("to_node_id")
node_id_str = request.query.get("node_id") # legacy: match either from/to
# --- If a packet_id is provided, return only that packet ---
if packet_id_str:
try:
packet_id = int(packet_id_str)
except ValueError:
return web.json_response({"error": "Invalid packet_id format"}, status=400)
packet = await store.get_packet(packet_id)
if not packet:
return web.json_response({"packets": []})
p = Packet.from_model(packet)
data = {
"id": p.id,
"from_node_id": p.from_node_id,
"to_node_id": p.to_node_id,
"portnum": int(p.portnum) if p.portnum is not None else None,
"payload": (p.payload or "").strip(),
"import_time_us": p.import_time_us,
"import_time": p.import_time.isoformat() if p.import_time else None,
"channel": getattr(p.from_node, "channel", ""),
"long_name": getattr(p.from_node, "long_name", ""),
}
return web.json_response({"packets": [data]})
# --- Parse limit ---
try:
limit = min(max(int(limit_str), 1), 100)
except ValueError:
limit = 50
# --- Parse since timestamp ---
since = None
if since_str:
try:
since = int(since_str)
except ValueError:
logger.warning(f"Invalid 'since' value (expected microseconds): {since_str}")
# --- Parse portnum ---
portnum = None
if portnum_str:
try:
portnum = int(portnum_str)
except ValueError:
logger.warning(f"Invalid portnum: {portnum_str}")
# --- Parse node filters ---
from_node_id = None
to_node_id = None
node_id = None # legacy: match either from/to
if from_node_id_str:
try:
from_node_id = int(from_node_id_str, 0)
except ValueError:
logger.warning(f"Invalid from_node_id: {from_node_id_str}")
if to_node_id_str:
try:
to_node_id = int(to_node_id_str, 0)
except ValueError:
logger.warning(f"Invalid to_node_id: {to_node_id_str}")
if node_id_str:
try:
node_id = int(node_id_str, 0)
except ValueError:
logger.warning(f"Invalid node_id: {node_id_str}")
# --- Fetch packets using explicit filters ---
packets = await store.get_packets(
from_node_id=from_node_id,
to_node_id=to_node_id,
node_id=node_id,
portnum=portnum,
after=since,
contains=contains,
limit=limit,
)
ui_packets = [Packet.from_model(p) for p in packets]
# --- Text message filtering ---
if portnum == PortNum.TEXT_MESSAGE_APP:
ui_packets = [p for p in ui_packets if p.payload and not SEQ_REGEX.fullmatch(p.payload)]
if contains:
ui_packets = [p for p in ui_packets if contains.lower() in p.payload.lower()]
# --- Sort descending by import_time_us ---
ui_packets.sort(
key=lambda p: (p.import_time_us is not None, p.import_time_us or 0), reverse=True
)
ui_packets = ui_packets[:limit]
# --- Build JSON output ---
packets_data = []
for p in ui_packets:
packet_dict = {
"id": p.id,
"import_time_us": p.import_time_us,
"import_time": p.import_time.isoformat() if p.import_time else None,
"channel": getattr(p.from_node, "channel", ""),
"from_node_id": p.from_node_id,
"to_node_id": p.to_node_id,
"portnum": int(p.portnum),
"long_name": getattr(p.from_node, "long_name", ""),
"payload": (p.payload or "").strip(),
}
reply_id = getattr(
getattr(getattr(p, "raw_mesh_packet", None), "decoded", None),
"reply_id",
None,
)
if reply_id:
packet_dict["reply_id"] = reply_id
packets_data.append(packet_dict)
# --- Latest import_time for incremental fetch ---
latest_import_time = None
if packets_data:
for p in packets_data:
if p.get("import_time_us") and p["import_time_us"] > 0:
latest_import_time = max(latest_import_time or 0, p["import_time_us"])
elif p.get("import_time") and latest_import_time is None:
try:
dt = datetime.datetime.fromisoformat(
p["import_time"].replace("Z", "+00:00")
)
latest_import_time = int(dt.timestamp() * 1_000_000)
except Exception:
pass
response = {"packets": packets_data}
if latest_import_time is not None:
response["latest_import_time"] = latest_import_time
return web.json_response(response)
except Exception as e:
logger.error(f"Error in /api/packets: {e}")
return web.json_response({"error": "Failed to fetch packets"}, status=500)
@routes.get("/api/stats")
async def api_stats(request):
"""
Enhanced stats endpoint:
- Supports global stats (existing behavior)
- Supports per-node stats using ?node=<node_id>
returning both sent AND seen counts in the specified period
"""
allowed_periods = {"hour", "day"}
period_type = request.query.get("period_type", "hour").lower()
if period_type not in allowed_periods:
return web.json_response(
{"error": f"Invalid period_type. Must be one of {allowed_periods}"},
status=400,
)
try:
length = int(request.query.get("length", 24))
except ValueError:
return web.json_response({"error": "length must be an integer"}, status=400)
# NEW: optional combined node stats
node_str = request.query.get("node")
if node_str:
try:
node_id = int(node_str)
except ValueError:
return web.json_response({"error": "node must be an integer"}, status=400)
# Fetch sent packets
sent = await store.get_packet_stats(
period_type=period_type,
length=length,
from_node=node_id,
)
# Fetch seen packets
seen = await store.get_packet_stats(
period_type=period_type,
length=length,
to_node=node_id,
)
return web.json_response(
{
"node_id": node_id,
"period_type": period_type,
"length": length,
"sent": sent.get("total", 0),
"seen": seen.get("total", 0),
}
)
# ---- Existing full stats mode (unchanged) ----
channel = request.query.get("channel")
def parse_int_param(name):
value = request.query.get(name)
if value is not None:
try:
return int(value)
except ValueError:
raise web.HTTPBadRequest(
text=json.dumps({"error": f"{name} must be an integer"}),
content_type="application/json",
) from None
return None
portnum = parse_int_param("portnum")
to_node = parse_int_param("to_node")
from_node = parse_int_param("from_node")
stats = await store.get_packet_stats(
period_type=period_type,
length=length,
channel=channel,
portnum=portnum,
to_node=to_node,
from_node=from_node,
)
return web.json_response(stats)
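The descending sort in `/api/packets` uses a tuple key so packets without an `import_time_us` compare cleanly; the same trick in isolation:

```python
def sort_key(import_time_us):
    # (has_timestamp, value): real timestamps outrank None, and the
    # "or 0" keeps the second element an int so comparisons never raise.
    return (import_time_us is not None, import_time_us or 0)

rows = [3, None, 7, 1]
rows.sort(key=sort_key, reverse=True)  # → [7, 3, 1, None]
```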
@routes.get("/api/stats/count")
async def api_stats_count(request):
"""
Returns packet and packet_seen totals.
Behavior:
• If no filters → total packets ever + total seen ever
• If filters → apply window/channel/from/to + packet_id
"""
# -------- Parse request parameters --------
packet_id_str = request.query.get("packet_id")
packet_id = None
if packet_id_str:
try:
packet_id = int(packet_id_str)
except ValueError:
return web.json_response({"error": "packet_id must be integer"}, status=400)
period_type = request.query.get("period_type")
length_str = request.query.get("length")
length = None
if length_str:
try:
length = int(length_str)
except ValueError:
return web.json_response({"error": "length must be integer"}, status=400)
channel = request.query.get("channel")
def parse_int(name):
value = request.query.get(name)
if value is None:
return None
try:
return int(value)
except ValueError:
raise web.HTTPBadRequest(
text=json.dumps({"error": f"{name} must be integer"}),
content_type="application/json",
) from None
from_node = parse_int("from_node")
to_node = parse_int("to_node")
# -------- Case 1: NO FILTERS → return global totals --------
no_filters = (
period_type is None
and length is None
and channel is None
and from_node is None
and to_node is None
and packet_id is None
)
if no_filters:
total_packets = await store.get_total_packet_count()
total_seen = await store.get_total_packet_seen_count()
return web.json_response({"total_packets": total_packets, "total_seen": total_seen})
# -------- Case 2: Apply filters → compute totals --------
total_packets = await store.get_total_packet_count(
period_type=period_type,
length=length,
channel=channel,
from_node=from_node,
to_node=to_node,
)
total_seen = await store.get_total_packet_seen_count(
packet_id=packet_id,
period_type=period_type,
length=length,
channel=channel,
from_node=from_node,
to_node=to_node,
)
return web.json_response({"total_packets": total_packets, "total_seen": total_seen})
@routes.get("/api/edges")
async def api_edges(request):
since = datetime.datetime.now() - datetime.timedelta(hours=48)
filter_type = request.query.get("type")
edges = {}
# Only build traceroute edges if requested
if filter_type in (None, "traceroute"):
async for tr in store.get_traceroutes(since):
try:
route = decode_payload.decode_payload(PortNum.TRACEROUTE_APP, tr.route)
except Exception as e:
logger.error(f"Error decoding Traceroute {tr.id}: {e}")
continue
path = [tr.packet.from_node_id] + list(route.route)
path.append(tr.packet.to_node_id if tr.done else tr.gateway_node_id)
for a, b in zip(path, path[1:], strict=False):
edges[(a, b)] = "traceroute"
# Only build neighbor edges if requested
if filter_type in (None, "neighbor"):
packets = await store.get_packets(portnum=PortNum.NEIGHBORINFO_APP, after=since)
for packet in packets:
try:
_, neighbor_info = decode_payload.decode(packet)
for node in neighbor_info.neighbors:
edges.setdefault((node.node_id, packet.from_node_id), "neighbor")
except Exception as e:
logger.error(
f"Error decoding NeighborInfo packet {getattr(packet, 'id', '?')}: {e}"
)
# Convert edges dict to list format for JSON response
edges_list = [
{"from": frm, "to": to, "type": edge_type} for (frm, to), edge_type in edges.items()
]
return web.json_response({"edges": edges_list})
@routes.get("/api/config")
async def api_config(request):
try:
# ------------------ Helpers ------------------
def get(section, key, default=None):
"""Safe getter for both dict and ConfigParser."""
if isinstance(section, dict):
return section.get(key, default)
return section.get(key, fallback=default)
def get_bool(section, key, default=False):
val = get(section, key, default)
if isinstance(val, bool):
return "true" if val else "false"
if isinstance(val, str):
return "true" if val.lower() in ("1", "true", "yes", "on") else "false"
return "true" if bool(val) else "false"
def get_float(section, key, default=0.0):
try:
return float(get(section, key, default))
except Exception:
return float(default)
def get_int(section, key, default=0):
try:
return int(get(section, key, default))
except Exception:
return default
def get_str(section, key, default=""):
val = get(section, key, default)
return str(val) if val is not None else str(default)
# ------------------ SITE ------------------
site = CONFIG.get("site", {})
safe_site = {
"domain": get_str(site, "domain", ""),
"language": get_str(site, "language", "en"),
"title": get_str(site, "title", ""),
"message": get_str(site, "message", ""),
"starting": get_str(site, "starting", "/chat"),
"nodes": get_bool(site, "nodes", True),
"chat": get_bool(site, "chat", True),
"everything": get_bool(site, "everything", True),
"graphs": get_bool(site, "graphs", True),
"stats": get_bool(site, "stats", True),
"net": get_bool(site, "net", True),
"map": get_bool(site, "map", True),
"top": get_bool(site, "top", True),
"map_top_left_lat": get_float(site, "map_top_left_lat", 39.0),
"map_top_left_lon": get_float(site, "map_top_left_lon", -123.0),
"map_bottom_right_lat": get_float(site, "map_bottom_right_lat", 36.0),
"map_bottom_right_lon": get_float(site, "map_bottom_right_lon", -121.0),
"map_interval": get_int(site, "map_interval", 3),
"firehose_interval": get_int(site, "firehose_interval", 3),
"weekly_net_message": get_str(
site, "weekly_net_message", "Weekly Mesh check-in message."
),
"net_tag": get_str(site, "net_tag", "#BayMeshNet"),
"version": str(__version__),
}
# ------------------ MQTT ------------------
mqtt = CONFIG.get("mqtt", {})
topics_raw = get(mqtt, "topics", [])
if isinstance(topics_raw, str):
try:
topics = json.loads(topics_raw)
except Exception:
topics = [topics_raw]
elif isinstance(topics_raw, list):
topics = topics_raw
else:
topics = []
safe_mqtt = {
"server": get_str(mqtt, "server", ""),
"topics": topics,
}
# ------------------ CLEANUP ------------------
cleanup = CONFIG.get("cleanup", {})
safe_cleanup = {
"enabled": get_bool(cleanup, "enabled", False),
"days_to_keep": get_str(cleanup, "days_to_keep", "14"),
"hour": get_str(cleanup, "hour", "2"),
"minute": get_str(cleanup, "minute", "0"),
"vacuum": get_bool(cleanup, "vacuum", False),
}
safe_config = {
"site": safe_site,
"mqtt": safe_mqtt,
"cleanup": safe_cleanup,
}
return web.json_response(safe_config)
except Exception as e:
return web.json_response({"error": str(e)}, status=500)
@routes.get("/api/lang")
async def api_lang(request):
# Language from ?lang=xx, fallback to config, then to "en"
lang_code = request.query.get("lang") or CONFIG.get("site", {}).get("language", "en")
section = request.query.get("section")
lang_file = os.path.join(LANG_DIR, f"{lang_code}.json")
if not os.path.exists(lang_file):
lang_file = os.path.join(LANG_DIR, "en.json")
# Load JSON translations
with open(lang_file, encoding="utf-8") as f:
translations = json.load(f)
if section:
section = section.lower()
if section in translations:
return web.json_response(translations[section])
else:
return web.json_response(
{"error": f"Section '{section}' not found in {lang_code}"}, status=404
)
# if no section requested → return full translation file
return web.json_response(translations)
@routes.get("/health")
async def health_check(request):
"""Health check endpoint for monitoring and load balancers."""
health_status = {
"status": "healthy",
"timestamp": datetime.datetime.now(datetime.UTC).isoformat(),
"version": __version__,
"git_revision": _git_revision_short,
}
# Check database connectivity
try:
async with database.async_session() as session:
await session.execute(text("SELECT 1"))
health_status["database"] = "connected"
except Exception as e:
logger.error(f"Database health check failed: {e}")
health_status["database"] = "disconnected"
health_status["status"] = "unhealthy"
return web.json_response(health_status, status=503)
# Get database file size
try:
db_url = CONFIG.get("database", {}).get("connection_string", "")
# Extract file path from SQLite connection string (e.g., "sqlite+aiosqlite:///packets.db")
if "sqlite" in db_url.lower():
db_path = db_url.split("///")[-1].split("?")[0]
if os.path.exists(db_path):
db_size_bytes = os.path.getsize(db_path)
# Convert to human-readable format
if db_size_bytes < 1024:
health_status["database_size"] = f"{db_size_bytes} B"
elif db_size_bytes < 1024 * 1024:
health_status["database_size"] = f"{db_size_bytes / 1024:.2f} KB"
elif db_size_bytes < 1024 * 1024 * 1024:
health_status["database_size"] = f"{db_size_bytes / (1024 * 1024):.2f} MB"
else:
health_status["database_size"] = (
f"{db_size_bytes / (1024 * 1024 * 1024):.2f} GB"
)
health_status["database_size_bytes"] = db_size_bytes
except Exception as e:
logger.warning(f"Failed to get database size: {e}")
# Don't fail health check if we can't get size
return web.json_response(health_status)
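The size-bucketing logic in the `/health` handler above can be factored into a helper; a minimal sketch (`humanize_bytes` is our name, not part of the codebase) that mirrors the endpoint's B/KB/MB/GB thresholds:

```python
def humanize_bytes(n: int) -> str:
    # Same thresholds as the /health endpoint: B, KB, MB, GB
    if n < 1024:
        return f"{n} B"
    if n < 1024 ** 2:
        return f"{n / 1024:.2f} KB"
    if n < 1024 ** 3:
        return f"{n / 1024 ** 2:.2f} MB"
    return f"{n / 1024 ** 3:.2f} GB"
```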
@routes.get("/version")
async def version_endpoint(request):
"""Return version information including semver and git revision."""
try:
version_info = get_version_info()
return web.json_response(version_info)
except Exception as e:
logger.error(f"Error in /version: {e}")
return web.json_response({"error": "Failed to fetch version info"}, status=500)
@routes.get("/api/packets_seen/{packet_id}")
async def api_packets_seen(request):
try:
# --- Validate packet_id ---
try:
packet_id = int(request.match_info["packet_id"])
except (KeyError, ValueError):
return web.json_response(
{"error": "Invalid or missing packet_id"},
status=400,
)
# --- Fetch list using your helper ---
rows = await store.get_packets_seen(packet_id)
items = []
for row in rows:
items.append(
{
"packet_id": row.packet_id,
"node_id": row.node_id,
"rx_time": row.rx_time,
"hop_limit": row.hop_limit,
"hop_start": row.hop_start,
"channel": row.channel,
"rx_snr": row.rx_snr,
"rx_rssi": row.rx_rssi,
"topic": row.topic,
"import_time": (row.import_time.isoformat() if row.import_time else None),
"import_time_us": row.import_time_us,
}
)
return web.json_response({"seen": items})
except Exception:
logger.exception("Error in /api/packets_seen")
return web.json_response(
{"error": "Internal server error"},
status=500,
)

mvrun.py

@@ -1,41 +1,139 @@
import argparse
import logging
import os
import signal
import subprocess
import sys
import threading
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s %(filename)s:%(lineno)d [pid:%(process)d] %(levelname)s - %(message)s",
datefmt="%Y-%m-%d %H:%M:%S",
)
logger = logging.getLogger(__name__)
# Global list to track running processes
running_processes = []
pid_files = []
def cleanup_pid_file(pid_file):
"""Remove a PID file if it exists"""
if os.path.exists(pid_file):
try:
os.remove(pid_file)
logger.info(f"Removed PID file {pid_file}")
except Exception as e:
logger.error(f"Error removing PID file {pid_file}: {e}")
def signal_handler(sig, frame):
"""Handle Ctrl-C gracefully"""
logger.info("Received interrupt signal (Ctrl-C), shutting down gracefully...")
# Terminate all running processes
for process in running_processes:
if process and process.poll() is None: # Process is still running
try:
logger.info(f"Terminating process PID {process.pid}")
process.terminate()
# Give it a moment to terminate gracefully
try:
process.wait(timeout=5)
logger.info(f"Process PID {process.pid} terminated successfully")
except subprocess.TimeoutExpired:
logger.warning(f"Process PID {process.pid} did not terminate, forcing kill")
process.kill()
process.wait()
except Exception as e:
logger.error(f"Error terminating process PID {process.pid}: {e}")
# Clean up PID files
for pid_file in pid_files:
cleanup_pid_file(pid_file)
logger.info("Shutdown complete")
sys.exit(0)
# Run python in subprocess
def run_script(python_executable, script_name, pid_file, *args):
process = None
try:
# Combine the interpreter, script name, and arguments ('-u' for unbuffered output)
command = [python_executable, '-u', script_name] + list(args)
# Run the subprocess (output goes directly to console for real-time viewing)
process = subprocess.Popen(command)
# Track the process globally
running_processes.append(process)
# Write PID to file
with open(pid_file, 'w') as f:
f.write(str(process.pid))
logger.info(f"Started {script_name} with PID {process.pid}, written to {pid_file}")
# Wait for the process to complete
process.wait()
except Exception as e:
logger.error(f"Error running {script_name}: {e}")
finally:
# Clean up PID file when process exits
cleanup_pid_file(pid_file)
# Parse runtime argument (--config) and start subprocess threads
def main():
# Register signal handler for Ctrl-C
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
parser = argparse.ArgumentParser(
description="Helper script to run the database and web frontend in separate threads."
)
# Add --config runtime argument
parser.add_argument('--config', help="Path to the configuration file.", default='config.ini')
parser.add_argument('--pid_dir', help="PID files path.", default='.')
parser.add_argument('--py_exec', help="Path to the Python executable.", default=sys.executable)
args = parser.parse_args()
# PID file paths
db_pid_file = os.path.join(args.pid_dir, 'meshview-db.pid')
web_pid_file = os.path.join(args.pid_dir, 'meshview-web.pid')
# Track PID files globally for cleanup
pid_files.append(db_pid_file)
pid_files.append(web_pid_file)
# Database Thread
dbthrd = threading.Thread(
target=run_script, args=(args.py_exec, 'startdb.py', db_pid_file, '--config', args.config)
)
# Web server thread
webthrd = threading.Thread(
target=run_script, args=(args.py_exec, 'main.py', web_pid_file, '--config', args.config)
)
# Start Meshview subprocess threads
logger.info(f"Starting Meshview with config: {args.config}")
logger.info("Starting database thread...")
dbthrd.start()
logger.info("Starting web server thread...")
webthrd.start()
try:
dbthrd.join()
webthrd.join()
except KeyboardInterrupt:
# This shouldn't be reached due to signal handler, but just in case
signal_handler(signal.SIGINT, None)
if __name__ == '__main__':
main()
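The PID-file handling in `run_script` and `cleanup_pid_file` can be exercised on its own; a stdlib-only sketch under a throwaway temporary directory:

```python
import os
import tempfile

def write_pid_file(path, pid):
    # As in run_script: record the child's PID for external tooling
    with open(path, "w") as f:
        f.write(str(pid))

def cleanup_pid_file(path):
    # As in mvrun.py: remove the PID file if it still exists
    if os.path.exists(path):
        os.remove(path)

pid_file = os.path.join(tempfile.mkdtemp(), "meshview-db.pid")
write_pid_file(pid_file, 12345)
content = open(pid_file).read()
cleanup_pid_file(pid_file)
```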

pyproject.toml (new file)

@@ -0,0 +1,59 @@
[project]
name = "meshview"
version = "3.0.0"
description = "Real-time monitoring and diagnostic tool for the Meshtastic mesh network"
readme = "README.md"
requires-python = ">=3.10"
dependencies = [
# Core async + networking
"aiohttp>=3.11.12,<4.0.0",
"aiohttp-sse",
"aiodns>=3.2.0,<4.0.0",
"aiomqtt>=2.3.0,<3.0.0",
"asyncpg>=0.30.0,<0.31.0",
"aiosqlite>=0.21.0,<0.22.0",
# Database + ORM
"sqlalchemy[asyncio]>=2.0.38,<3.0.0",
"alembic>=1.14.0,<2.0.0",
# Serialization / security
"protobuf>=5.29.3,<6.0.0",
"cryptography>=44.0.1,<45.0.0",
# Templates
"Jinja2>=3.1.5,<4.0.0",
"MarkupSafe>=3.0.2,<4.0.0",
# Graphs / diagrams
"pydot>=3.0.4,<4.0.0",
]
[project.optional-dependencies]
dev = [
# Data science stack
"numpy>=2.2.3,<3.0.0",
"pandas>=2.2.3,<3.0.0",
"matplotlib>=3.10.0,<4.0.0",
"seaborn>=0.13.2,<1.0.0",
"plotly>=6.0.0,<7.0.0",
# Image support
"pillow>=11.1.0,<12.0.0",
# Debugging / profiling
"psutil>=7.0.0,<8.0.0",
"objgraph>=3.6.2,<4.0.0",
# Testing
"pytest>=8.3.4,<9.0.0",
"pytest-aiohttp>=1.0.5,<2.0.0",
"pytest-asyncio>=0.24.0,<1.0.0",
]
[tool.ruff]
# Linting
target-version = "py313"
line-length = 100
extend-exclude = ["build", "dist", ".venv"]
[tool.ruff.lint]
select = ["E", "F", "I", "UP", "B"] # pick your rulesets
ignore = ["E501"] # example; let formatter handle line length
[tool.ruff.format]
quote-style = "preserve"
indent-style = "space"


@@ -12,6 +12,7 @@ aiosqlite~=0.21.0
# Database + ORM
sqlalchemy[asyncio]~=2.0.38
alembic~=1.14.0
# Serialization / security
protobuf~=5.29.3
@@ -42,3 +43,8 @@ pillow~=11.1.0
# Debugging / profiling
psutil~=7.0.0
objgraph~=3.6.2
# Testing
pytest~=8.3.4
pytest-aiohttp~=1.0.5
pytest-asyncio~=0.24.0


@@ -22,6 +22,9 @@ acme_challenge =
# The domain name of your site.
domain =
# Select language
language = en
# Site title to show in the browser title bar and headers.
title = Bay Area Mesh
@@ -56,11 +59,6 @@ firehose_interal=3
weekly_net_message = Weekly Mesh check-in. We will keep it open on every Wednesday from 5:00pm for checkins. The message format should be (LONG NAME) - (CITY YOU ARE IN) #BayMeshNet.
net_tag = #BayMeshNet
# Updates intervals in seconds, zero or negative number means no updates
# defaults will be 3 seconds
map_interval=3
firehose_interal=3
# -------------------------
# MQTT Broker Configuration
# -------------------------
@@ -85,3 +83,40 @@ password = large4cats
[database]
# SQLAlchemy connection string. This one uses SQLite with asyncio support.
connection_string = sqlite+aiosqlite:///packets.db
# -------------------------
# Database Cleanup Configuration
# -------------------------
[cleanup]
# Enable or disable daily cleanup
enabled = False
# Number of days to keep records in the database
days_to_keep = 14
# Time to run daily cleanup (24-hour format)
hour = 2
minute = 00
# Run VACUUM after cleanup
vacuum = False
# Enable database backups (independent of cleanup)
backup_enabled = False
# Directory to store database backups (relative or absolute path)
backup_dir = ./backups
# Time to run daily backup (24-hour format)
# If not specified, uses cleanup hour/minute
backup_hour = 2
backup_minute = 00
# -------------------------
# Logging Configuration
# -------------------------
[logging]
# Enable or disable HTTP access logs from the web server
# When disabled, request logs like "GET /api/chat" will not appear
# Application logs (errors, startup messages, etc.) are unaffected
# Set to True to enable, False to disable (default: False)
access_log = False
# Database cleanup logfile
db_cleanup_logfile = dbcleanup.log
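The `[cleanup]` keys above are read as booleans and integers; a minimal sketch of how `configparser` coerces them (note that `minute = 00` parses as the integer 0):

```python
import configparser

# Inline copy of the sample [cleanup] section for illustration
sample = """
[cleanup]
enabled = False
days_to_keep = 14
hour = 2
minute = 00
vacuum = False
"""

cfg = configparser.ConfigParser()
cfg.read_string(sample)

enabled = cfg.getboolean("cleanup", "enabled")
days = cfg.getint("cleanup", "days_to_keep")
minute = cfg.getint("cleanup", "minute")
```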

setup-dev.sh (new, executable)

@@ -0,0 +1,84 @@
#!/bin/bash
#
# setup-dev.sh
#
# Development environment setup script for MeshView
# This script sets up the Python virtual environment and installs development tools
set -e
echo "Setting up MeshView development environment..."
echo ""
# Check if uv is installed
if ! command -v uv &> /dev/null; then
echo "Error: 'uv' is not installed."
echo "Install it with: curl -LsSf https://astral.sh/uv/install.sh | sh"
exit 1
fi
# Create virtual environment if it doesn't exist
if [ ! -d "env" ]; then
echo "Creating Python virtual environment with uv..."
uv venv env
echo "✓ Virtual environment created"
else
echo "✓ Virtual environment already exists"
fi
# Install requirements
echo ""
echo "Installing requirements..."
uv pip install -r requirements.txt
echo "✓ Requirements installed"
# Install development tools
echo ""
echo "Installing development tools..."
uv pip install pre-commit pytest pytest-asyncio pytest-aiohttp
echo "✓ Development tools installed"
# Install pre-commit hooks
echo ""
echo "Installing pre-commit hooks..."
./env/bin/pre-commit install
echo "✓ Pre-commit hooks installed"
# Install graphviz check
echo ""
if command -v dot &> /dev/null; then
echo "✓ graphviz is installed"
else
echo "⚠ Warning: graphviz is not installed"
echo " Install it with:"
echo " macOS: brew install graphviz"
echo " Debian: sudo apt-get install graphviz"
fi
# Create config.ini if it doesn't exist
echo ""
if [ ! -f "config.ini" ]; then
echo "Creating config.ini from sample..."
cp sample.config.ini config.ini
echo "✓ config.ini created"
echo " Edit config.ini to configure your MQTT and site settings"
else
echo "✓ config.ini already exists"
fi
echo ""
echo "=========================================="
echo "Development environment setup complete!"
echo "=========================================="
echo ""
echo "Next steps:"
echo " 1. Edit config.ini with your MQTT settings"
echo " 2. Run: ./env/bin/python mvrun.py"
echo " 3. Open: http://localhost:8081"
echo ""
echo "Pre-commit hooks are now active:"
echo " - Ruff will auto-format and fix issues before each commit"
echo " - If files are changed, you'll need to git add and commit again"
echo ""
echo "Run tests with: ./env/bin/pytest tests/"
echo ""


@@ -1,51 +1,330 @@
import asyncio
import datetime
import gzip
import json
import logging
import shutil
from pathlib import Path
from sqlalchemy import delete
from meshview import migrations, models, mqtt_database, mqtt_reader, mqtt_store
from meshview.config import CONFIG
# -------------------------
# Basic logging configuration
# -------------------------
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s %(filename)s:%(lineno)d [pid:%(process)d] %(levelname)s - %(message)s",
datefmt="%Y-%m-%d %H:%M:%S",
)
# -------------------------
# Logging for cleanup
# -------------------------
cleanup_logger = logging.getLogger("dbcleanup")
cleanup_logger.setLevel(logging.INFO)
cleanup_logfile = CONFIG.get("logging", {}).get("db_cleanup_logfile", "dbcleanup.log")
file_handler = logging.FileHandler(cleanup_logfile)
file_handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s [%(levelname)s] %(message)s')
file_handler.setFormatter(formatter)
cleanup_logger.addHandler(file_handler)
# -------------------------
# Helper functions
# -------------------------
def get_bool(config, section, key, default=False):
return str(config.get(section, {}).get(key, default)).lower() in ("1", "true", "yes", "on")
def get_int(config, section, key, default=0):
try:
return int(config.get(section, {}).get(key, default))
except ValueError:
return default
# -------------------------
# Shared DB lock
# -------------------------
db_lock = asyncio.Lock()
# -------------------------
# Database backup function
# -------------------------
async def backup_database(database_url: str, backup_dir: str = ".") -> None:
"""
Create a compressed backup of the database file.
Args:
database_url: SQLAlchemy connection string
backup_dir: Directory to store backups (default: current directory)
"""
try:
# Extract database file path from connection string
# Format: sqlite+aiosqlite:///path/to/db.db
if not database_url.startswith("sqlite"):
cleanup_logger.warning("Backup only supported for SQLite databases")
return
db_path = database_url.split("///", 1)[1] if "///" in database_url else None
if not db_path:
cleanup_logger.error("Could not extract database path from connection string")
return
db_file = Path(db_path)
if not db_file.exists():
cleanup_logger.error(f"Database file not found: {db_file}")
return
# Create backup directory if it doesn't exist
backup_path = Path(backup_dir)
backup_path.mkdir(parents=True, exist_ok=True)
# Generate backup filename with timestamp
timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
backup_filename = f"{db_file.stem}_backup_{timestamp}.db.gz"
backup_file = backup_path / backup_filename
cleanup_logger.info(f"Creating backup: {backup_file}")
# Copy and compress the database file
with open(db_file, 'rb') as f_in:
with gzip.open(backup_file, 'wb', compresslevel=9) as f_out:
shutil.copyfileobj(f_in, f_out)
# Get file sizes for logging
original_size = db_file.stat().st_size / (1024 * 1024) # MB
compressed_size = backup_file.stat().st_size / (1024 * 1024) # MB
compression_ratio = (1 - compressed_size / original_size) * 100 if original_size > 0 else 0
cleanup_logger.info(
f"Backup created successfully: {backup_file.name} "
f"({original_size:.2f} MB -> {compressed_size:.2f} MB, "
f"{compression_ratio:.1f}% compression)"
)
except Exception as e:
cleanup_logger.error(f"Error creating database backup: {e}")
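The compress-and-copy step in `backup_database` is plain `gzip` plus `shutil.copyfileobj`; a self-contained sketch against a throwaway stand-in file:

```python
import gzip
import os
import shutil
import tempfile

# A highly compressible stand-in for the SQLite file
src = os.path.join(tempfile.mkdtemp(), "packets.db")
with open(src, "wb") as f:
    f.write(b"\x00" * 100_000)

# Copy the file through a gzip stream, as backup_database does
dst = src + ".gz"
with open(src, "rb") as f_in, gzip.open(dst, "wb", compresslevel=9) as f_out:
    shutil.copyfileobj(f_in, f_out)

restored = gzip.open(dst, "rb").read()
```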
# -------------------------
# Database backup scheduler
# -------------------------
async def daily_backup_at(hour: int = 2, minute: int = 0, backup_dir: str = "."):
while True:
now = datetime.datetime.now()
next_run = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
if next_run <= now:
next_run += datetime.timedelta(days=1)
delay = (next_run - now).total_seconds()
cleanup_logger.info(f"Next backup scheduled at {next_run}")
await asyncio.sleep(delay)
database_url = CONFIG["database"]["connection_string"]
await backup_database(database_url, backup_dir)
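Both schedulers compute the delay to the next HH:MM the same way; pulled out as a pure function (the `seconds_until` name is ours) it is easy to check:

```python
import datetime

def seconds_until(now: datetime.datetime, hour: int, minute: int) -> float:
    # Next occurrence of hour:minute; roll to tomorrow if already passed
    nxt = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if nxt <= now:
        nxt += datetime.timedelta(days=1)
    return (nxt - now).total_seconds()
```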
# -------------------------
# Database cleanup using ORM
# -------------------------
async def daily_cleanup_at(
hour: int = 2,
minute: int = 0,
days_to_keep: int = 14,
vacuum_db: bool = True,
wait_for_backup: bool = False,
):
while True:
now = datetime.datetime.now()
next_run = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
if next_run <= now:
next_run += datetime.timedelta(days=1)
delay = (next_run - now).total_seconds()
cleanup_logger.info(f"Next cleanup scheduled at {next_run}")
await asyncio.sleep(delay)
# If backup is enabled, wait a bit to let backup complete first
if wait_for_backup:
cleanup_logger.info("Waiting 60 seconds for backup to complete...")
await asyncio.sleep(60)
# Local-time cutoff as string for SQLite DATETIME comparison
cutoff = (datetime.datetime.now() - datetime.timedelta(days=days_to_keep)).strftime(
"%Y-%m-%d %H:%M:%S"
)
cleanup_logger.info(f"Running cleanup for records older than {cutoff}...")
try:
async with db_lock: # Pause ingestion
cleanup_logger.info("Ingestion paused for cleanup.")
async with mqtt_database.async_session() as session:
# -------------------------
# Packet
# -------------------------
result = await session.execute(
delete(models.Packet).where(models.Packet.import_time < cutoff)
)
cleanup_logger.info(f"Deleted {result.rowcount} rows from Packet")
# -------------------------
# PacketSeen
# -------------------------
result = await session.execute(
delete(models.PacketSeen).where(models.PacketSeen.import_time < cutoff)
)
cleanup_logger.info(f"Deleted {result.rowcount} rows from PacketSeen")
# -------------------------
# Traceroute
# -------------------------
result = await session.execute(
delete(models.Traceroute).where(models.Traceroute.import_time < cutoff)
)
cleanup_logger.info(f"Deleted {result.rowcount} rows from Traceroute")
# -------------------------
# Node
# -------------------------
result = await session.execute(
delete(models.Node).where(models.Node.last_update < cutoff)
)
cleanup_logger.info(f"Deleted {result.rowcount} rows from Node")
await session.commit()
if vacuum_db:
cleanup_logger.info("Running VACUUM...")
async with mqtt_database.engine.begin() as conn:
await conn.exec_driver_sql("VACUUM;")
cleanup_logger.info("VACUUM completed.")
cleanup_logger.info("Cleanup completed successfully.")
cleanup_logger.info("Ingestion resumed after cleanup.")
except Exception as e:
cleanup_logger.error(f"Error during cleanup: {e}")
# -------------------------
# MQTT loading
# -------------------------
async def load_database_from_mqtt(
mqtt_server: str,
mqtt_port: int,
topics: list,
mqtt_user: str | None = None,
mqtt_passwd: str | None = None,
):
async for topic, env in mqtt_reader.get_topic_envelopes(
mqtt_server, mqtt_port, topics, mqtt_user, mqtt_passwd
):
async with db_lock: # Block if cleanup is running
await mqtt_store.process_envelope(topic, env)
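The `db_lock` handoff — cleanup holds the lock for its whole pass while ingestion blocks on each envelope — can be demonstrated with a toy writer and cleaner:

```python
import asyncio

events = []

async def ingest(lock):
    # Stand-in for load_database_from_mqtt: take the lock per item
    for _ in range(2):
        async with lock:
            events.append("ingest")
        await asyncio.sleep(0)

async def cleanup(lock):
    # Stand-in for daily_cleanup_at: hold the lock for the whole pass
    async with lock:
        events.append("cleanup-start")
        await asyncio.sleep(0.01)
        events.append("cleanup-end")

async def demo():
    lock = asyncio.Lock()
    await asyncio.gather(cleanup(lock), ingest(lock))

asyncio.run(demo())
```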
# -------------------------
# Main function
# -------------------------
async def main():
logger = logging.getLogger(__name__)
# Initialize database
database_url = CONFIG["database"]["connection_string"]
mqtt_database.init_database(database_url)
# Create migration status table
await migrations.create_migration_status_table(mqtt_database.engine)
# Set migration in progress flag
await migrations.set_migration_in_progress(mqtt_database.engine, True)
logger.info("Migration status set to 'in progress'")
try:
# Check if migrations are needed before running them
logger.info("Checking for pending database migrations...")
if await migrations.is_database_up_to_date(mqtt_database.engine, database_url):
logger.info("Database schema is already up to date, skipping migrations")
else:
logger.info("Database schema needs updating, running migrations...")
migrations.run_migrations(database_url)
logger.info("Database migrations completed")
# Create tables if needed (for backwards compatibility)
logger.info("Creating database tables...")
await mqtt_database.create_tables()
logger.info("Database tables created")
finally:
# Clear migration in progress flag
logger.info("Clearing migration status...")
await migrations.set_migration_in_progress(mqtt_database.engine, False)
logger.info("Migration status cleared - database ready")
mqtt_user = CONFIG["mqtt"].get("username") or None
mqtt_passwd = CONFIG["mqtt"].get("password") or None
mqtt_topics = json.loads(CONFIG["mqtt"]["topics"])
cleanup_enabled = get_bool(CONFIG, "cleanup", "enabled", False)
cleanup_days = get_int(CONFIG, "cleanup", "days_to_keep", 14)
vacuum_db = get_bool(CONFIG, "cleanup", "vacuum", False)
cleanup_hour = get_int(CONFIG, "cleanup", "hour", 2)
cleanup_minute = get_int(CONFIG, "cleanup", "minute", 0)
backup_enabled = get_bool(CONFIG, "cleanup", "backup_enabled", False)
backup_dir = CONFIG.get("cleanup", {}).get("backup_dir", "./backups")
backup_hour = get_int(CONFIG, "cleanup", "backup_hour", cleanup_hour)
backup_minute = get_int(CONFIG, "cleanup", "backup_minute", cleanup_minute)
logger.info(f"Starting MQTT ingestion from {CONFIG['mqtt']['server']}:{CONFIG['mqtt']['port']}")
if cleanup_enabled:
logger.info(
f"Daily cleanup enabled: keeping {cleanup_days} days of data at {cleanup_hour:02d}:{cleanup_minute:02d}"
)
if backup_enabled:
logger.info(
f"Daily backups enabled: storing in {backup_dir} at {backup_hour:02d}:{backup_minute:02d}"
)
async with asyncio.TaskGroup() as tg:
tg.create_task(
load_database_from_mqtt(
CONFIG["mqtt"]["server"],
int(CONFIG["mqtt"]["port"]),
mqtt_topics,
mqtt_user,
mqtt_passwd,
)
)
# Start backup task if enabled
if backup_enabled:
tg.create_task(daily_backup_at(backup_hour, backup_minute, backup_dir))
# Start cleanup task if enabled (waits for backup if both run at same time)
if cleanup_enabled:
wait_for_backup = (
backup_enabled
and (backup_hour == cleanup_hour)
and (backup_minute == cleanup_minute)
)
tg.create_task(
daily_cleanup_at(
cleanup_hour, cleanup_minute, cleanup_days, vacuum_db, wait_for_backup
)
)
if not cleanup_enabled and not backup_enabled:
cleanup_logger.info("Daily cleanup and backups are both disabled by configuration.")
# -------------------------
# Entry point
# -------------------------
if __name__ == '__main__':
asyncio.run(main())