Compare commits

...

106 Commits

Author SHA1 Message Date
SpudGunMan
09515b9bc0 Update tictactoe.py 2025-12-27 19:40:05 -08:00
SpudGunMan
9b8c9d80c8 Update hamtest.py 2025-12-27 19:40:02 -08:00
SpudGunMan
8ee838f5c6 Update hangman.py 2025-12-27 19:39:34 -08:00
SpudGunMan
757d6d30b8 fix 2025-12-27 19:38:06 -08:00
SpudGunMan
1ee785d388 fix 2025-12-27 19:32:58 -08:00
SpudGunMan
c3284f0a0f fix Turn Counter 2025-12-27 16:21:17 -08:00
SpudGunMan
bdcc479360 enhance
fix turn count at 0
2025-12-27 15:39:27 -08:00
SpudGunMan
b1444b24e4 Update mmind.py 2025-12-27 15:26:21 -08:00
SpudGunMan
aef67da492 alertDuration
customize the check for API alerts
2025-12-27 15:06:56 -08:00
SpudGunMan
b8b8145447 enhance
@SnyderMesh Thanks for Idea
2025-12-26 15:33:11 -08:00
dependabot[bot]
42a4842a5b Bump docker/login-action
Bumps [docker/login-action](https://github.com/docker/login-action) from 28fdb31ff34708d19615a74d67103ddc2ea9725c to 6862ffc5ab2cdb4405cf318a62a6f4c066e2298b.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](28fdb31ff3...6862ffc5ab)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-version: 6862ffc5ab2cdb4405cf318a62a6f4c066e2298b
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-22 11:35:11 -08:00
SpudGunMan
201591d469 fix dopewar replay bug
reference https://github.com/SpudGunMan/meshing-around/issues/274 thanks MJTheis

Co-Authored-By: MJTheis <232630404+mjtheis@users.noreply.github.com>
2025-12-21 18:27:26 -08:00
dependabot[bot]
4ecdc7b108 Bump docker/metadata-action (#273) 2025-12-09 09:16:24 -08:00
Kelly
3f78bf7a67 Merge pull request #272 from SpudGunMan/dependabot/github_actions/actions/checkout-6 2025-12-09 09:15:48 -08:00
dependabot[bot]
8af21b760c Bump actions/checkout from 5 to 6
Bumps [actions/checkout](https://github.com/actions/checkout) from 5 to 6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-24 11:21:13 +00:00
SpudGunMan
ea3ed46e86 Update locationdata.py 2025-11-18 13:08:52 -08:00
SpudGunMan
d78d6acd1e Update locationdata.py 2025-11-17 14:33:55 -08:00
SpudGunMan
e9b483f4e8 Update locationdata.py 2025-11-16 19:30:30 -08:00
SpudGunMan
94660e7993 Update space.py 2025-11-15 06:27:17 -08:00
SpudGunMan
12aeaef250 fix hop doh hops! 2025-11-12 23:01:19 -08:00
Kelly
2a6f76ab5b Merge pull request #268 from SpudGunMan/lab
Lab
2025-11-12 21:58:31 -08:00
SpudGunMan
05df1e1a3c logsoff 2025-11-12 21:58:00 -08:00
SpudGunMan
38131b4180 cleanup 2025-11-12 21:53:27 -08:00
SpudGunMan
397c39b13d relay node 2025-11-12 21:33:36 -08:00
SpudGunMan
af7dfe8a51 Update README.md 2025-11-12 19:53:20 -08:00
SpudGunMan
d5d163aab9 logs on 2025-11-12 19:48:12 -08:00
Kelly
58cc3e4314 Merge pull request #267 from SpudGunMan/lab
Lab
2025-11-12 19:47:11 -08:00
SpudGunMan
3274dfdbc0 logs off 2025-11-12 19:45:27 -08:00
SpudGunMan
84a1a163d3 Update system.py 2025-11-12 19:34:46 -08:00
SpudGunMan
289eb70738 cleanup 2025-11-12 19:17:49 -08:00
SpudGunMan
a6d51e41bf clean 2025-11-12 19:08:41 -08:00
SpudGunMan
a63020bbb7 space
space
2025-11-12 19:02:46 -08:00
SpudGunMan
2416e73fbf hop refactor for new proto 2025-11-12 17:27:09 -08:00
SpudGunMan
f87f34f8bf Update icad_tone.py 2025-11-12 17:16:39 -08:00
SpudGunMan
eaed034d20 Revert "Update icad_tone.py"
This reverts commit 14b876b989.
2025-11-12 17:13:03 -08:00
SpudGunMan
ec9ac1b1fe Revert "Update icad_tone.py"
This reverts commit c79f3cdfbc.
2025-11-12 17:12:59 -08:00
SpudGunMan
e84ce13878 Revert "Update icad_tone.py"
This reverts commit c31947194e.
2025-11-12 17:12:49 -08:00
SpudGunMan
a5fc8aca82 fix hops 2025-11-12 17:11:14 -08:00
SpudGunMan
c31947194e Update icad_tone.py 2025-11-12 16:21:20 -08:00
SpudGunMan
c79f3cdfbc Update icad_tone.py 2025-11-12 16:18:26 -08:00
SpudGunMan
14b876b989 Update icad_tone.py 2025-11-12 16:09:31 -08:00
SpudGunMan
2cc5b23753 Update icad_tone.py 2025-11-12 15:48:19 -08:00
SpudGunMan
a5b0fda3ac Update icad_tone.py 2025-11-12 13:08:30 -08:00
SpudGunMan
9c5c332e01 Update icad_tone.py 2025-11-12 12:27:45 -08:00
SpudGunMan
ac5e96e463 Update icad_tone.py 2025-11-12 12:26:33 -08:00
SpudGunMan
0ce7deb740 Update icad_tone.py 2025-11-12 12:23:04 -08:00
SpudGunMan
a60333318b Update icad_tone.py 2025-11-12 12:19:38 -08:00
SpudGunMan
665acaa904 Update icad_tone.py 2025-11-12 12:10:18 -08:00
SpudGunMan
0aa8bccd04 Update README.md 2025-11-12 12:07:33 -08:00
SpudGunMan
2e5e8a7589 Update README.md 2025-11-12 11:52:48 -08:00
SpudGunMan
e3e6393bad icad_tone_alerts
icad_tone_alerts
2025-11-12 11:50:40 -08:00
SpudGunMan
be38588292 Update system.py 2025-11-12 10:17:32 -08:00
SpudGunMan
14fb3f9cb6 lab logging 2025-11-11 23:58:12 -08:00
Kelly
c40cd86592 Merge pull request #265 from SpudGunMan/lab
Lab
2025-11-11 23:56:01 -08:00
SpudGunMan
69df48957e clear Logs 2025-11-11 23:54:17 -08:00
SpudGunMan
e29573ebc0 Update system.py 2025-11-11 22:24:52 -08:00
SpudGunMan
13b9b75f86 Update system.py 2025-11-11 22:24:44 -08:00
SpudGunMan
0bfe908391 Update system.py 2025-11-11 22:21:14 -08:00
SpudGunMan
5baee422c2 Update system.py 2025-11-11 22:18:22 -08:00
SpudGunMan
38ff05fd40 logs 2025-11-11 21:22:28 -08:00
SpudGunMan
e1def5422a cleanup 2025-11-11 21:13:41 -08:00
SpudGunMan
93031010cb enhance output data for solar report 2025-11-11 21:05:58 -08:00
SpudGunMan
21e614ab8e enhance solar with NOAA radio weather 2025-11-11 19:39:04 -08:00
SpudGunMan
a5322867e3 Update pong_bot.py 2025-11-11 16:58:19 -08:00
SpudGunMan
2863a64ec8 Update mesh_bot.py 2025-11-11 16:57:30 -08:00
SpudGunMan
678fde7b2c logs 2025-11-11 16:45:29 -08:00
Kelly
ec0f9f966c Merge pull request #264 from SpudGunMan/lab
Lab
2025-11-11 16:44:28 -08:00
SpudGunMan
fd114301f6 logs 2025-11-11 16:43:52 -08:00
SpudGunMan
1778cb6feb Update system.py 2025-11-11 16:38:13 -08:00
SpudGunMan
fc7ca37184 Update system.py 2025-11-11 16:35:16 -08:00
SpudGunMan
fe2110ca2b Update system.py 2025-11-11 16:30:56 -08:00
SpudGunMan
179113e83a Update system.py 2025-11-11 14:46:51 -08:00
SpudGunMan
79348be644 debug 2025-11-11 14:12:55 -08:00
SpudGunMan
35c6232b0c enhance 2025-11-11 13:40:48 -08:00
SpudGunMan
2aa7ffb0e8 packet debugging 2025-11-11 13:35:16 -08:00
SpudGunMan
a7060bc516 cleanup 2025-11-11 13:23:55 -08:00
SpudGunMan
998d979d71 batter channel assign 2025-11-11 13:13:06 -08:00
Kelly
cdfb451d67 Merge pull request #262 from SpudGunMan/dependabot/github_actions/docker/metadata-action-8d8c7c12f7b958582a5cb82ba16d5903cb27976a
Bump docker/metadata-action from 032a4b3bda1b716928481836ac5bfe36e1feaad6 to 8d8c7c12f7b958582a5cb82ba16d5903cb27976a
2025-11-11 13:05:16 -08:00
Kelly
994405955a Merge pull request #263 from SpudGunMan/lab
Lab
2025-11-11 13:03:39 -08:00
SpudGunMan
17d92dc78d Update system.py 2025-11-11 13:02:03 -08:00
SpudGunMan
d0e33f943f logs 2025-11-11 12:53:24 -08:00
dependabot[bot]
f55c7311fa Bump docker/metadata-action
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 032a4b3bda1b716928481836ac5bfe36e1feaad6 to 8d8c7c12f7b958582a5cb82ba16d5903cb27976a.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](032a4b3bda...8d8c7c12f7)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-version: 8d8c7c12f7b958582a5cb82ba16d5903cb27976a
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-10 12:24:13 +00:00
SpudGunMan
5f4f832af6 Update install.sh 2025-11-09 17:00:26 -08:00
SpudGunMan
c3e8f4a93e fix install embedded 2025-11-09 16:59:52 -08:00
SpudGunMan
e72b3c191e Update system.py 2025-11-09 16:43:22 -08:00
SpudGunMan
3774b8407b refactor with new API 2025-11-09 16:29:36 -08:00
SpudGunMan
0e074a6885 Update custom_scheduler.template 2025-11-09 15:54:00 -08:00
SpudGunMan
c8800d837f Update config.template 2025-11-09 15:51:52 -08:00
SpudGunMan
289ada1fc0 refactor 2025-11-09 15:45:09 -08:00
SpudGunMan
e64e60358d enhance 2025-11-09 15:33:32 -08:00
SpudGunMan
658fb33b69 enhance log 2025-11-09 15:29:52 -08:00
SpudGunMan
1568d026f2 flood packet work
debug on
2025-11-09 15:23:26 -08:00
SpudGunMan
232bf98efd bug highfly
Highfly: error: unsupported operand type(s) for -: 'str' and 'int'
2025-11-09 15:17:02 -08:00
SpudGunMan
3cce938334 ChannelAwarness 2025-11-09 15:00:21 -08:00
SpudGunMan
e6a17d9258 patch DM for welcome and LLM 2025-11-09 13:19:36 -08:00
SpudGunMan
d403e4c8c0 Update game_serve.py 2025-11-09 13:18:36 -08:00
SpudGunMan
ae19c5b83f Update battleship_vid.py 2025-11-09 13:18:17 -08:00
SpudGunMan
532efda9e8 Update config.template 2025-11-08 19:48:23 -08:00
SpudGunMan
d20eab03e9 Update README.md 2025-11-08 18:52:08 -08:00
SpudGunMan
df43b61a0c Update install.sh 2025-11-08 12:13:15 -08:00
SpudGunMan
862347cbec refactor leader fly and speed 2025-11-08 11:37:01 -08:00
SpudGunMan
9c412b8328 leaderboard airspeed 2025-11-08 11:24:00 -08:00
SpudGunMan
c3fcacd64b Update locationdata.py 2025-11-08 11:08:16 -08:00
SpudGunMan
68b5de2950 one is a lonely number 2025-11-08 11:04:16 -08:00
SpudGunMan
f578ba6084 Update yolo_vision.py 2025-11-07 12:44:56 -08:00
SpudGunMan
961bb3abba Tesseract 2025-11-07 12:34:17 -08:00
26 changed files with 867 additions and 367 deletions

View File

@@ -25,10 +25,10 @@ jobs:
       #
       steps:
         - name: Checkout repository
-          uses: actions/checkout@v5
+          uses: actions/checkout@v6
       # Uses the `docker/login-action` action to log in to the Container registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
         - name: Log in to the Container registry
-          uses: docker/login-action@28fdb31ff34708d19615a74d67103ddc2ea9725c
+          uses: docker/login-action@6862ffc5ab2cdb4405cf318a62a6f4c066e2298b
           with:
             registry: ${{ env.REGISTRY }}
             username: ${{ github.actor }}
@@ -36,7 +36,7 @@ jobs:
       # This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about) to extract tags and labels that will be applied to the specified image. The `id` "meta" allows the output of this step to be referenced in a subsequent step. The `images` value provides the base name for the tags and labels.
         - name: Extract metadata (tags, labels) for Docker
           id: meta
-          uses: docker/metadata-action@032a4b3bda1b716928481836ac5bfe36e1feaad6
+          uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051
           with:
             images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
       # This step uses the `docker/build-push-action` action to build the image, based on your repository's `Dockerfile`. If the build succeeds, it pushes the image to GitHub Packages.

View File

@@ -52,6 +52,7 @@ Mesh Bot is a feature-rich Python bot designed to enhance your [Meshtastic](http
 - **Customizable Triggers**: Use proximity events for creative applications like "king of the hill" or 🧭 geocache games by adjusting the alert cycle.
 - **High Flying Alerts**: Receive notifications when nodes with high altitude are detected on the mesh.
 - **Voice/Command Triggers**: Activate bot functions using keywords or voice commands (see [Voice Commands](#voice-commands-vox) for "Hey Chirpy!" support).
+- **YOLOv5 Alerts**: Use camera modules to detect objects or perform OCR.
 ### EAS Alerts
 - **FEMA iPAWS/EAS Alerts**: Receive Emergency Alerts from FEMA via API on internet-connected nodes.
@@ -72,6 +73,7 @@ Mesh Bot is a feature-rich Python bot designed to enhance your [Meshtastic](http
 - **WSJT-X Integration**: Monitor WSJT-X (FT8, FT4, WSPR, etc.) decode messages and forward them to the mesh network with optional callsign filtering.
 - **JS8Call Integration**: Monitor JS8Call messages and forward them to the mesh network with optional callsign filtering.
 - **Meshages TTS**: The bot can speak mesh messages aloud using [KittenTTS](https://github.com/KittenML/KittenTTS). Enable this feature to have important alerts and messages read out loud on your device—ideal for hands-free operation or accessibility. See [radio.md](modules/radio.md) for setup instructions.
+- **Offline Tone-Out Decoder**: Decode fire tone-out and DTMF signals and send alerts to the mesh.
 ### Asset Tracking, Check-In/Check-Out, and Inventory Management
 Advanced check-in/check-out and asset tracking for people and equipment—ideal for accountability, safety monitoring, and logistics (e.g., Radio-Net, FEMA, trailhead groups). Admin approval workflows, GPS location capture, and overdue alerts. The integrated inventory and point-of-sale (POS) system enables item management, sales tracking, cart-based transactions, and daily reporting, for swaps, emergency supply management, and field operations, maker-places.
@@ -100,7 +102,7 @@ Advanced check-in/check-out and asset tracking for people and equipment—ideal
 - **Automatic Message Chunking**: Messages over 160 characters are automatically split to ensure reliable delivery across multiple hops.
 ## Getting Started
-This project is developed on Linux (specifically a Raspberry Pi) but should work on any platform where the [Meshtastic protobuf API](https://meshtastic.org/docs/software/python/cli/) modules are supported, and with any compatible [Meshtastic](https://meshtastic.org/docs/getting-started/) hardware. For pico or low-powered devices, see projects for embedding, armbian or [buildroot](https://github.com/buildroot-meshtastic/buildroot-meshtastic), also see [femtofox](https://github.com/noon92/femtofox) for running on luckfox hardware. If you need a local console consider the [firefly](https://github.com/pdxlocations/firefly) project.
+This project is developed on Linux (specifically a Raspberry Pi) but should work on any platform where the [Meshtastic protobuf API](https://meshtastic.org/docs/software/python/cli/) modules are supported, and with any compatible [Meshtastic](https://meshtastic.org/docs/getting-started/) hardware; however, it is **recommended to use the latest firmware code**. For pico or low-powered devices, see projects for embedding, armbian or [buildroot](https://github.com/buildroot-meshtastic/buildroot-meshtastic), also see [femtofox](https://github.com/noon92/femtofox) for running on luckfox hardware. If you need a local console consider the [firefly](https://github.com/pdxlocations/firefly) project.
 🥔 Please use responsibly and follow local rulings for such equipment. This project captures packets, logs them, and handles over the air communications which can include PII such as GPS locations.
@@ -172,6 +174,7 @@ For testing and feature ideas on Discord and GitHub, if its stable its thanks to
 - **mrpatrick1991**: For OG Docker configurations. 💻
 - **A-c0rN**: Assistance with iPAWS and 🚨
 - **Mike O'Connell/skrrt**: For [eas_alert_parser](etc/eas_alert_parser.py) enhanced by **sheer.cold**
+- **dadud**: For the idea behind [etc/icad_tone.py](etc/icad_tone.py)
 - **WH6GXZ nurse dude**: Volcano Alerts 🌋
 - **mikecarper**: hamtest, leading to quiz etc.. 📋
 - **c.merphy360**: high altitude alerts. 🚀

View File

@@ -79,31 +79,29 @@ kiwixURL = http://127.0.0.1:8080
# Kiwix library name (e.g., wikipedia_en_100_nopic_2025-09)
kiwixLibraryName = wikipedia_en_100_nopic_2025-09
# Enable ollama LLM see more at https://ollama.com
# Enable local Ollama LLM integration; set True for any LLM support
ollama = False
# Ollama model to use (defaults to gemma3:270m) gemma2 is good for older SYSTEM prompt
# ollamaModel = gemma3:latest
# ollamaModel = gemma2:2b
# server instance to use (defaults to local machine install)
# Ollama server instance to use (defaults to local machine install)
ollamaHostName = http://localhost:11434
# Produce LLM replies to messages that aren't commands?
# If False, the LLM only replies to the "ask:" and "askai" commands.
llmReplyToNonCommands = True
# if True, the input is sent raw to the LLM, if False uses SYSTEM prompt
rawLLMQuery = True
# Enable Wikipedia/Kiwix integration with LLM for RAG (Retrieval Augmented Generation)
# When enabled, LLM will automatically search Wikipedia/Kiwix and include context in responses
llmUseWikiContext = False
# Use OpenWebUI instead of direct Ollama API (enables advanced RAG features)
# Use OpenWebUI instead of the direct Ollama API (still leave ollama = True)
useOpenWebUI = False
# OpenWebUI server URL (e.g., http://localhost:3000)
openWebUIURL = http://localhost:3000
# OpenWebUI API key/token (required when useOpenWebUI is True)
openWebUIAPIKey =
# Ollama model to use (defaults to gemma3:270m); gemma2 is good for the older SYSTEM prompt
# ollamaModel is used for both Ollama and OpenWebUI; with useOpenWebUI it is just the model name
# ollamaModel = gemma3:latest
# ollamaModel = gemma2:2b
# If True, the query is sent raw to the LLM; if False, the internal SYSTEM prompt is used
rawLLMQuery = True
# If False, the LLM only replies to the "ask:" and "askai" commands; otherwise DMs automatically go to the LLM
llmReplyToNonCommands = True
# Enable Wikipedia/Kiwix integration with LLM for RAG (Retrieval Augmented Generation)
# When enabled, LLM will automatically search Wikipedia/Kiwix and include context in responses
llmUseWikiContext = False
# StoreForward Enabled and Limits
StoreForward = True
StoreLimit = 3
@@ -325,6 +323,7 @@ value =
# interval to use when time is not set (e.g. every 2 days)
interval =
# time of day in 24:00 hour format when value is 'day' and interval is not set
# Processes run at :00, :20, :40; vary the 20-minute offsets to avoid collisions
time =
[radioMon]

View File

@@ -97,3 +97,36 @@ Run this script to monitor the camera feed and generate alerts for detected and
---
## icad_tone.py
**Purpose:**
`icad_tone.py` is a utility script for detecting fire and EMS radio tones using the [icad_tone_detection](https://github.com/thegreatcodeholio/icad_tone_detection) library. It analyzes audio from a live stream, soundcard, or WAV file, identifies various tone types (such as two-tone, long tone, hi/low, pulsed, MDC, and DTMF), and writes detected alerts to `alert.txt` for integration with Mesh Bot or Meshtastic.
**Usage:**
Run the script from the command line, specifying a WAV file for offline analysis or configuring it to listen to a stream or soundcard for real-time monitoring.
```sh
python etc/icad_tone.py --wav path/to/file.wav
```
Or, for live monitoring (after setting `HTTP_STREAM_URL` in the script):
```sh
python etc/icad_tone.py
```
**What it does:**
- Loads audio from a stream, soundcard, or WAV file.
- Uses `icad_tone_detection` to analyze audio for tone patterns.
- Prints raw detection results and summaries to the console.
- Writes a summary of detected tones to `alert.txt` (overwriting each time).
- Handles errors and missing dependencies gracefully.
**Configuration:**
- `ALERT_FILE_PATH`: Path to the alert output file (default: `alert.txt`).
- `AUDIO_SOURCE`: Set to `"http"` for streaming or `"soundcard"` for local audio input.
- `HTTP_STREAM_URL`: URL of the audio stream (required if using HTTP source).
- `SAMPLE_RATE`, `INPUT_CHANNELS`, `CHUNK_DURATION`: Audio processing parameters.
**Note:**
- Requires the `icad_tone_detection` dependency, plus `pydub`, `requests`, `sounddevice`, and `numpy`.
- Set `HTTP_STREAM_URL` to a valid stream if using HTTP mode.
- Intended for experimental or hobbyist use; may require customization for your workflow.
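Downstream, Mesh Bot is expected to pick up whatever `icad_tone.py` last wrote to `alert.txt`. A minimal polling sketch of such a consumer (hypothetical: the function name `poll_alert` and the modification-time approach are illustrative, not the bot's actual mechanism):

```python
import os

ALERT_FILE = "alert.txt"  # same path icad_tone.py writes to (assumes a shared working directory)

def poll_alert(last_mtime=0.0):
    """Return (alert_text, mtime) when alert.txt changed since last_mtime, else (None, last_mtime)."""
    try:
        mtime = os.path.getmtime(ALERT_FILE)
    except OSError:
        return None, last_mtime  # file not written yet
    if mtime > last_mtime:
        with open(ALERT_FILE) as f:
            return f.read().strip(), mtime
    return None, last_mtime
```

Because the decoder overwrites `alert.txt` on every detection, a consumer has to track the modification time (or content) to avoid re-sending the same alert.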

View File

@@ -15,6 +15,7 @@ def setup_custom_schedules(send_message, tell_joke, welcome_message, handle_wxc,
5. Make sure to uncomment (delete the single #) the example schedules down at the end of the file to enable them
Python is sensitive to indentation so be careful when editing this file.
https://thonny.org is included on pi's image and is a simple IDE to use for editing python files.
6. System tasks run every 20 minutes; try to avoid overlapping schedules to reduce rapid-fire API issues (use offsets like 8:05)
Available functions you can import and use, be sure they are enabled modules in config.ini:
- tell_joke() - Returns a random joke
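The 20-minute collision advice above can be sketched as a small helper (hypothetical: `staggered_time` is not part of the template, just an illustration of picking minutes away from the :00/:20/:40 system-task slots):

```python
# System tasks fire at :00, :20, :40; schedule custom jobs off those minutes
SYSTEM_TASK_MINUTES = {0, 20, 40}

def staggered_time(hour, preferred_minute=5):
    """Return an HH:MM string whose minute avoids the 20-minute system-task slots."""
    minute = preferred_minute
    while minute % 60 in SYSTEM_TASK_MINUTES:
        minute += 5  # nudge to the next 5-minute mark
    return f"{hour:02d}:{minute % 60:02d}"

# e.g., schedule a daily job at staggered_time(8) instead of "08:00"
```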

etc/icad_tone.py (new file, +222 lines)
View File

@@ -0,0 +1,222 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# icad_tone.py - uses icad_tone_detection, for fire and EMS tone detection
# https://github.com/thegreatcodeholio/icad_tone_detection
# output to alert.txt for meshing-around bot
# 2025 K7MHI Kelly Keeton

# ---------------------------
# User Configuration Section
# ---------------------------
ALERT_FILE_PATH = "alert.txt"  # Path to alert log file, or None to disable logging
AUDIO_SOURCE = "soundcard"     # "soundcard" for mic/line-in, "http" for stream
HTTP_STREAM_URL = ""           # Set to your stream URL if using "http"
SAMPLE_RATE = 16000            # Audio sample rate (Hz)
INPUT_CHANNELS = 1             # Number of input channels (1=mono)
MIN_SAMPLES = 4096             # Minimum samples per detection window (increase for better accuracy)
STREAM_BUFFER = 32000          # Number of bytes to buffer before detection (for MP3 streams)
INPUT_DEVICE = 0               # Set to device index or name, or None for default
# ---------------------------

import sys
import time
from icad_tone_detection import tone_detect
from pydub import AudioSegment
import requests
import sounddevice as sd
import numpy as np
import argparse
import io
import warnings

warnings.filterwarnings("ignore", message="nperseg = .* is greater than input length")


def write_alert(message):
    if ALERT_FILE_PATH:
        try:
            with open(ALERT_FILE_PATH, "w") as f:  # overwrite each time
                f.write(message + "\n")
        except Exception as e:
            print(f"Error writing to alert file: {e}", file=sys.stderr)


def detect_and_alert(audio_data, sample_rate):
    try:
        result = tone_detect(audio_data, sample_rate)
    except Exception as e:
        print(f"Detection error: {e}", file=sys.stderr)
        return
    # Only print if something is detected
    if result and any(getattr(result, t, []) for t in [
        "two_tone_result", "long_result", "hi_low_result", "pulsed_result", "mdc_result", "dtmf_result"
    ]):
        print("Raw detection result:", result)
    # Prepare alert summary for all relevant tone types
    summary = []
    if hasattr(result, "dtmf_result") and result.dtmf_result:
        for dtmf in result.dtmf_result:
            summary.append(f"DTMF Digit: {dtmf.get('digit', '?')} | Duration: {dtmf.get('length', '?')}s")
    if hasattr(result, "hi_low_result") and result.hi_low_result:
        for hl in result.hi_low_result:
            summary.append(
                f"Hi/Low Alternations: {hl.get('alternations', '?')} | Duration: {hl.get('length', '?')}s"
            )
    if hasattr(result, "mdc_result") and result.mdc_result:
        for mdc in result.mdc_result:
            summary.append(
                f"MDC UnitID: {mdc.get('unitID', '?')} | Op: {mdc.get('op', '?')} | Duration: {mdc.get('length', '?')}s"
            )
    if hasattr(result, "pulsed_result") and result.pulsed_result:
        for pl in result.pulsed_result:
            summary.append(
                f"Pulsed Tone: {pl.get('detected', '?')}Hz | Cycles: {pl.get('cycles', '?')} | Duration: {pl.get('length', '?')}s"
            )
    if hasattr(result, "two_tone_result") and result.two_tone_result:
        for tt in result.two_tone_result:
            summary.append(
                f"Two-Tone: {tt.get('detected', ['?','?'])[0]}Hz/{tt.get('detected', ['?','?'])[1]}Hz | Tone A: {tt.get('tone_a_length', '?')}s | Tone B: {tt.get('tone_b_length', '?')}s"
            )
    if hasattr(result, "long_result") and result.long_result:
        for lt in result.long_result:
            summary.append(
                f"Long Tone: {lt.get('detected', '?')}Hz | Duration: {lt.get('length', '?')}s"
            )
    if summary:
        write_alert("\n".join(summary))


def get_supported_sample_rate(device, channels=1):
    # Try common sample rates
    for rate in [44100, 48000, 16000, 8000]:
        try:
            sd.check_input_settings(device=device, channels=channels, samplerate=rate)
            return rate
        except Exception:
            continue
    return None


def main():
    global SAMPLE_RATE  # may be updated below when a supported rate is auto-detected
    print("="*80)
    print(" iCAD Tone Decoder for Meshing-Around Booting Up!")
    if AUDIO_SOURCE == "soundcard":
        try:
            if INPUT_DEVICE is not None:
                sd.default.device = INPUT_DEVICE
                device_info = sd.query_devices(INPUT_DEVICE, kind='input')
            else:
                device_info = sd.query_devices(sd.default.device, kind='input')
            device_name = device_info['name']
            # Detect supported sample rate
            detected_rate = get_supported_sample_rate(sd.default.device, INPUT_CHANNELS)
            if detected_rate:
                SAMPLE_RATE = detected_rate
            else:
                print("No supported sample rate found, using default.", file=sys.stderr)
        except Exception:
            device_name = "Unknown"
        print(f" Mode: Soundcard | Device: {device_name} | Sample Rate: {SAMPLE_RATE} Hz | Channels: {INPUT_CHANNELS}")
    elif AUDIO_SOURCE == "http":
        print(f" Mode: HTTP Stream | URL: {HTTP_STREAM_URL} | Buffer: {STREAM_BUFFER} bytes")
    else:
        print(f" Mode: {AUDIO_SOURCE}")
    print("="*80)
    time.sleep(1)

    parser = argparse.ArgumentParser(description="ICAD Tone Detection")
    parser.add_argument("--wav", type=str, help="Path to WAV file for detection")
    args = parser.parse_args()

    if args.wav:
        print(f"Processing WAV file: {args.wav}")
        try:
            audio = AudioSegment.from_file(args.wav)
            if audio.channels > 1:
                audio = audio.set_channels(1)
            print(f"AudioSegment: channels={audio.channels}, frame_rate={audio.frame_rate}, duration={len(audio)}ms")
            detect_and_alert(audio, audio.frame_rate)
        except Exception as e:
            print(f"Error processing WAV file: {e}", file=sys.stderr)
        return

    print("Starting ICAD Tone Detection...")
    if AUDIO_SOURCE == "http":
        if not HTTP_STREAM_URL or HTTP_STREAM_URL.startswith("http://your-stream-url-here"):
            print("ERROR: Please set a valid HTTP_STREAM_URL or provide a WAV file using --wav option.", file=sys.stderr)
            sys.exit(2)
        print(f"Listening to HTTP stream: {HTTP_STREAM_URL}")
        try:
            response = requests.get(HTTP_STREAM_URL, stream=True, timeout=10)
            buffer = io.BytesIO()
            try:
                for chunk in response.iter_content(chunk_size=4096):
                    buffer.write(chunk)
                    # Use STREAM_BUFFER for detection window
                    if buffer.tell() > STREAM_BUFFER:
                        buffer.seek(0)
                        audio = AudioSegment.from_file(buffer, format="mp3")
                        if audio.channels > 1:
                            audio = audio.set_channels(1)
                        # --- Simple audio level detection ---
                        samples = np.array(audio.get_array_of_samples())
                        if samples.dtype != np.float32:
                            samples = samples.astype(np.float32) / 32767.0  # Normalize to -1..1
                        rms = np.sqrt(np.mean(samples**2))
                        if rms > 0.01:
                            print(f"Audio detected! RMS: {rms:.3f} ", end='\r')
                        if rms > 0.5:
                            print(f"WARNING: Audio too loud! RMS: {rms:.3f} ", end='\r')
                        # --- End audio level detection ---
                        detect_and_alert(audio, audio.frame_rate)
                        buffer = io.BytesIO()
            except KeyboardInterrupt:
                print("\nStopped by user.")
                sys.exit(0)
        except requests.exceptions.RequestException as e:
            print(f"Connection error: {e}", file=sys.stderr)
            sys.exit(3)
        except Exception as e:
            print(f"Error processing HTTP stream: {e}", file=sys.stderr)
            sys.exit(4)
    elif AUDIO_SOURCE == "soundcard":
        print("Listening to audio device:")
        buffer = np.array([], dtype=np.float32)
        min_samples = MIN_SAMPLES  # Use configured minimum samples

        def callback(indata, frames, time_info, status):
            nonlocal buffer
            try:
                samples = indata[:, 0]
                buffer = np.concatenate((buffer, samples))
                # --- Simple audio level detection ---
                rms = np.sqrt(np.mean(samples**2))
                if rms > 0.01:
                    print(f"Audio detected! RMS: {rms:.3f} ", end='\r')
                if rms > 0.5:
                    print(f"WARNING: Audio too loud! RMS: {rms:.3f} ", end='\r')
                # --- End audio level detection ---
                # Only process when buffer is large enough
                while buffer.size >= min_samples:
                    int_samples = np.int16(buffer[:min_samples] * 32767)
                    audio = AudioSegment(
                        data=int_samples.tobytes(),
                        sample_width=2,
                        frame_rate=SAMPLE_RATE,
                        channels=1
                    )
                    detect_and_alert(audio, SAMPLE_RATE)
                    buffer = buffer[min_samples:]  # keep remainder for next window
            except Exception as e:
                print(f"Callback error: {e}", file=sys.stderr)

        try:
            with sd.InputStream(samplerate=SAMPLE_RATE, channels=INPUT_CHANNELS, dtype='float32', callback=callback):
                print("Press Ctrl+C to stop.")
                import signal
                signal.pause()  # Wait for Ctrl+C, keeps CPU usage minimal
        except KeyboardInterrupt:
            print("Stopped by user.")
        except Exception as e:
            print(f"Error accessing soundcard: {e}", file=sys.stderr)
            sys.exit(5)
    else:
        print("Unknown AUDIO_SOURCE. Set to 'http' or 'soundcard'.", file=sys.stderr)
        sys.exit(6)


if __name__ == "__main__":
    main()

View File

@@ -1,9 +1,9 @@
 #!/usr/bin/env python3
 # YOLOv5 Object Detection with Movement Tracking using Raspberry Pi AI Camera or USB Webcam
 # YOLOv5 Requirements: yolo5 https://docs.ultralytics.com/yolov5/quickstart_tutorial/
-# PiCamera2 Requirements: picamera2 https://github.com/raspberrypi/picamera2
-# PiCamera2 may need `sudo apt install imx500-all` on Raspberry Pi OS
+# PiCamera2 Requirements: picamera2 https://github.com/raspberrypi/picamera2 `sudo apt install imx500-all`
 # NVIDIA GPU PyTorch: https://developer.nvidia.com/cuda-downloads
+# OCR with Tesseract: https://tesseract-ocr.github.io/tessdoc/Installation.html `sudo apt-get install tesseract-ocr`
 # Adjust settings below as needed, intended for meshing-around alert.txt output to meshtastic
 # 2025 K7MHI Kelly Keeton
@@ -16,6 +16,10 @@ MOVEMENT_THRESHOLD = 50 # Pixels to consider as movement (adjust as needed)
 IGNORE_STATIONARY = True # Whether to ignore stationary objects in output
 ALERT_FUSE_COUNT = 5 # Number of consecutive detections before alerting
 ALERT_FILE_PATH = "alert.txt" # e.g., "/opt/meshing-around/alert.txt" or None for no file output
+OCR_PROCESSING_ENABLED = True # Whether to perform OCR on detected objects
+SAVE_EVIDENCE_IMAGES = True # Whether to save evidence images when OCR text is found in bbox
+EVIDENCE_IMAGE_DIR = "." # Change to desired directory, e.g., "/opt/meshing-around/data/images"
+EVIDENCE_IMAGE_PATTERN = "evidence_{timestamp}.png"
 try:
     import torch # YOLOv5 https://docs.ultralytics.com/yolov5/quickstart_tutorial/
@@ -24,7 +28,10 @@ try:
     import time
     import warnings
     import sys
+    import os
+    import datetime
+    if OCR_PROCESSING_ENABLED:
+        import pytesseract # pip install pytesseract
     if PI_CAM:
         from picamera2 import Picamera2 # pip install picamera2
@@ -60,10 +67,10 @@ else:
     cap.set(cv2.CAP_PROP_FRAME_WIDTH, cam_res[0])
     cap.set(cv2.CAP_PROP_FRAME_HEIGHT, cam_res[1])
-print("="*40)
-print(f" Sentinal Vision 3000 Booting Up!")
-print(f" Model: {YOLO_MODEL} | Camera: {CAMERA_TYPE} | Resolution: {RESOLUTION}")
-print("="*40)
+print("="*80)
+print(f" Sentinal Vision 3000 Booting Up!")
+print(f" Model: {YOLO_MODEL} | Camera: {CAMERA_TYPE} | Resolution: {RESOLUTION} | OCR: {'Enabled' if OCR_PROCESSING_ENABLED else 'Disabled'}")
+print("="*80)
 time.sleep(1)
 def alert_output(msg, alert_file_path=ALERT_FILE_PATH):
@@ -74,6 +81,22 @@ def alert_output(msg, alert_file_path=ALERT_FILE_PATH):
         with open(alert_file_path, "w") as f: # Use "a" to append instead of overwrite
             f.write(msg_no_time + "\n")
+def extract_text_from_bbox(img, bbox):
+    try:
+        cropped = img.crop((bbox[0], bbox[1], bbox[2], bbox[3]))
+        text = pytesseract.image_to_string(cropped, config="--psm 7")
+        text_stripped = text.strip()
+        if text_stripped and SAVE_EVIDENCE_IMAGES:
+            timestamp = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
+            image_path = os.path.join(EVIDENCE_IMAGE_DIR, EVIDENCE_IMAGE_PATTERN.format(timestamp=timestamp))
+            cropped.save(image_path)
+            print(f"Saved evidence image: {image_path}")
+        return f"{text_stripped}"
+    except Exception as e:
+        print(f"Error during OCR: {e}")
+        print("More at https://tesseract-ocr.github.io/tessdoc/Installation.html")
+        return False
 try:
     i = 0 # Frame counter if zero will be infinite
     system_normal_printed = False # system nominal flag, if true disables printing
@@ -134,23 +157,40 @@ try:
                     if fuse_counters[obj_id] < ALERT_FUSE_COUNT:
                         continue # Don't alert yet
+                    # OCR on detected region
+                    bbox = [row['xmin'], row['ymin'], row['xmax'], row['ymax']]
+                    if OCR_PROCESSING_ENABLED:
+                        ocr_text = extract_text_from_bbox(img, bbox)
                     if prev_x is not None:
                         delta = x_center - prev_x
                         if abs(delta) < MOVEMENT_THRESHOLD:
                             direction = "stationary"
                             if IGNORE_STATIONARY:
                                 if obj_id not in __builtins__.stationary_reported:
-                                    alert_output(f"[{timestamp}] {count} {row['name']} {direction}")
+                                    msg = f"[{timestamp}] {count} {row['name']} {direction}"
+                                    if OCR_PROCESSING_ENABLED and ocr_text:
+                                        msg += f" | OCR: {ocr_text}"
+                                    alert_output(msg)
                                     __builtins__.stationary_reported.add(obj_id)
                             else:
-                                alert_output(f"[{timestamp}] {count} {row['name']} {direction}")
+                                msg = f"[{timestamp}] {count} {row['name']} {direction}"
+                                if OCR_PROCESSING_ENABLED and ocr_text:
+                                    msg += f" | OCR: {ocr_text}"
+                                alert_output(msg)
                         else:
                             direction = "moving right" if delta > 0 else "moving left"
-                            alert_output(f"[{timestamp}] {count} {row['name']} {direction}")
+                            msg = f"[{timestamp}] {count} {row['name']} {direction}"
+                            if OCR_PROCESSING_ENABLED and ocr_text:
+                                msg += f" | OCR: {ocr_text}"
+                            alert_output(msg)
                             __builtins__.stationary_reported.discard(obj_id)
                     else:
                         direction = "detected"
-                        alert_output(f"[{timestamp}] {count} {row['name']} {direction}")
+                        msg = f"[{timestamp}] {count} {row['name']} {direction}"
+                        if OCR_PROCESSING_ENABLED and ocr_text:
+                            msg += f" | OCR: {ocr_text}"
+                        alert_output(msg)
         # Reset fuse counters for objects not detected in this frame
         for obj_id in list(__builtins__.fuse_counters.keys()):

View File

@@ -56,6 +56,14 @@ if [[ $NOPE -eq 1 ]]; then
sudo rm -f /etc/systemd/system/ollama.service
sudo rm -rf /usr/local/bin/ollama
sudo rm -rf ~/.ollama
# remove ollama service account if exists
if id ollama &>/dev/null; then
sudo userdel ollama || true
fi
# remove ollama group if exists
if getent group ollama &>/dev/null; then
sudo groupdel ollama || true
fi
echo "Ollama removed."
else
echo "Ollama not removed."
@@ -110,6 +118,7 @@ if [[ "$program_path" != "/opt/meshing-around" ]]; then
if [[ $(echo "$move" | grep -i "^y") ]]; then
sudo mv "$program_path" /opt/meshing-around
cd /opt/meshing-around
sudo git config --global --add safe.directory /opt/meshing-around
printf "\nProject moved to /opt/meshing-around.\n"
printf "Please re-run the installer from the new location.\n"
exit 0
@@ -449,8 +458,8 @@ if [[ $(echo "${embedded}" | grep -i "^n") ]]; then
printf "sudo systemctl disable %s.service\n" "$service" >> install_notes.txt
printf "sudo systemctl disable %s.service\n" "$service" >> install_notes.txt
printf "\n older cron statement to run the report generator hourly:\n" >> install_notes.txt
printf "0 * * * * /usr/bin/python3 $program_path/etc/report_generator5.py" >> install_notes.txt
printf " to edit crontab run 'crontab -e'\n" >> install_notes.txt
#printf "0 * * * * /usr/bin/python3 $program_path/etc/report_generator5.py" >> install_notes.txt
#printf " to edit crontab run 'crontab -e'\n" >> install_notes.txt
printf "\nmesh_bot_reporting.timer installed to run daily at 4:20 am\n" >> install_notes.txt
printf "Check timer status: systemctl status mesh_bot_reporting.timer\n" >> install_notes.txt
printf "List all timers: systemctl list-timers\n" >> install_notes.txt
@@ -475,21 +484,6 @@ else
# add service dependency for meshtasticd into service file
#replace="s|After=network.target|After=network.target meshtasticd.service|g"
# Set up the meshing around service
sudo cp /opt/meshing-around/etc/$service.service /etc/systemd/system/$service.service
sudo systemctl daemon-reload
sudo systemctl enable $service.service
sudo systemctl start $service.service
sudo systemctl daemon-reload
# # check if the cron job already exists
# if ! crontab -l | grep -q "$chronjob"; then
# # add the cron job to run the report_generator5.py script
# (crontab -l 2>/dev/null; echo "$chronjob") | crontab -
# printf "\nAdded cron job to run report_generator5.py\n"
# else
# printf "\nCron job already exists, skipping\n"
# fi
# document the service install
printf "Reference following commands:\n\n" > install_notes.txt
printf "sudo systemctl status %s.service\n" "$service" >> install_notes.txt
@@ -500,8 +494,8 @@ else
printf "sudo systemctl stop %s.service\n" "$service" >> install_notes.txt
printf "sudo systemctl disable %s.service\n" "$service" >> install_notes.txt
printf "older crontab to run the report generator hourly:" >> install_notes.txt
printf "0 * * * * /usr/bin/python3 $program_path/etc/report_generator5.py" >> install_notes.txt
printf " to edit crontab run 'crontab -e'" >> install_notes.txt
#printf "0 * * * * /usr/bin/python3 $program_path/etc/report_generator5.py" >> install_notes.txt
#printf " to edit crontab run 'crontab -e'" >> install_notes.txt
printf "\nmesh_bot_reporting.timer installed to run daily at 4:20 am\n" >> install_notes.txt
printf "Check timer status: systemctl status mesh_bot_reporting.timer\n" >> install_notes.txt
printf "List all timers: systemctl list-timers\n" >> install_notes.txt

View File

@@ -109,7 +109,7 @@ def auto_response(message, snr, rssi, hop, pkiStatus, message_from_id, channel_n
"setsms": lambda: handle_sms( message_from_id, message),
"sitrep": lambda: handle_lheard(message, message_from_id, deviceID, isDM),
"sms:": lambda: handle_sms(message_from_id, message),
"solar": lambda: drap_xray_conditions() + "\n" + solar_conditions(),
"solar": lambda: drap_xray_conditions() + "\n" + solar_conditions() + "\n" + get_noaa_scales_summary(),
"sun": lambda: handle_sun(message_from_id, deviceID, channel_number),
"survey": lambda: surveyHandler(message, message_from_id, deviceID),
"s:": lambda: surveyHandler(message, message_from_id, deviceID),
@@ -287,10 +287,12 @@ def handle_ping(message_from_id, deviceID, message, hop, snr, rssi, isDM, chann
#flood
msg += " [F]"
if (float(snr) != 0 or float(rssi) != 0) and "Hops" not in hop:
if (float(snr) != 0 or float(rssi) != 0) and "Hop" not in hop:
msg += f"\nSNR:{snr} RSSI:{rssi}"
elif "Hops" in hop:
msg += f"\n{hop}🐇 "
elif "Hop" in hop:
# janky, remove the words Gateway, Direct, or MQTT if present
hop = hop.replace("Gateway", "").replace("Direct", "").replace("MQTT", "").strip()
msg += f"\n{hop} "
if "@" in message:
msg = msg + " @" + message.split("@")[1]
@@ -660,7 +662,7 @@ def handle_llm(message_from_id, channel_number, deviceID, message, publicChannel
if not any(node['nodeID'] == message_from_id and node['welcome'] == True for node in seenNodes):
if (channel_number == publicChannel and my_settings.antiSpam) or my_settings.useDMForResponse:
# send via DM
send_message(my_settings.welcome_message, channel_number, message_from_id, deviceID)
send_message(my_settings.welcome_message, 0, message_from_id, deviceID)
else:
# send via channel
send_message(my_settings.welcome_message, channel_number, 0, deviceID)
@@ -693,7 +695,7 @@ def handle_llm(message_from_id, channel_number, deviceID, message, publicChannel
if msg != '':
if (channel_number == publicChannel and my_settings.antiSpam) or my_settings.useDMForResponse:
# send via DM
send_message(msg, channel_number, message_from_id, deviceID)
send_message(msg, 0, message_from_id, deviceID)
else:
# send via channel
send_message(msg, channel_number, 0, deviceID)
@@ -736,11 +738,6 @@ def handleDopeWars(message, nodeID, rxNode):
if p.get('userID') == nodeID:
p['last_played'] = time.time()
msg = playDopeWars(nodeID, message)
# if message starts with 'e'xit remove player from tracker
if message.lower().startswith('e'):
dwPlayerTracker[:] = [p for p in dwPlayerTracker if p.get('userID') != nodeID]
msg = 'You have exited Dope Wars.'
return msg
def handle_gTnW(chess = False):
@@ -1909,24 +1906,38 @@ def onReceive(packet, interface):
# check if the packet has a channel flag use it ## FIXME needs to be channel hash lookup
if packet.get('channel'):
channel_number = packet.get('channel')
# get channel name from channel number from connected devices
for device in channel_list:
if device["interface_id"] == rxNode:
device_channels = device['channels']
for chan_name, info in device_channels.items():
if info['number'] == channel_number:
channel_name = chan_name
channel_name = "unknown"
try:
res = resolve_channel_name(channel_number, rxNode, interface)
if res:
try:
channel_name, _ = res
except Exception:
channel_name = "unknown"
else:
# Search all interfaces for this channel
cache = build_channel_cache()
found_on_other = None
for device in cache:
for chan_name, info in device.get("channels", {}).items():
if str(info.get('number')) == str(channel_number) or str(info.get('hash')) == str(channel_number):
found_on_other = device.get("interface_id")
found_chan_name = chan_name
break
if found_on_other:
break
# get channel hashes for the interface
device = next((d for d in channel_list if d["interface_id"] == rxNode), None)
if device:
# Find the channel name whose hash matches channel_number
for chan_name, info in device['channels'].items():
if info['hash'] == channel_number:
print(f"Matched channel hash {info['hash']} to channel name {chan_name}")
channel_name = chan_name
break
if found_on_other and found_on_other != rxNode:
logger.debug(
f"System: Received Packet on Channel:{channel_number} ({found_chan_name}) on Interface:{rxNode}, but this channel is configured on Interface:{found_on_other}"
)
except Exception as e:
logger.debug(f"System: channel resolution error: {e}")
#debug channel info
# if "unknown" in str(channel_name):
# logger.debug(f"System: Received Packet on Channel:{channel_number} on Interface:{rxNode}")
# else:
# logger.debug(f"System: Received Packet on Channel:{channel_number} Name:{channel_name} on Interface:{rxNode}")
# check if the packet has a simulator flag
simulator_flag = packet.get('decoded', {}).get('simulator', False)
@@ -2015,25 +2026,32 @@ def onReceive(packet, interface):
else:
hop_count = hop_away
if hop == "" and hop_count > 0:
if hop_count > 0:
# set hop string from calculated hop count
hop = f"{hop_count} Hop" if hop_count == 1 else f"{hop_count} Hops"
if hop_start == hop_limit and "lora" in str(transport_mechanism).lower() and (snr != 0 or rssi != 0):
if hop_start == hop_limit and "lora" in str(transport_mechanism).lower() and (snr != 0 or rssi != 0) and hop_count == 0:
# 2.7+ firmware direct hop over LoRa
hop = "Direct"
if ((hop_start == 0 and hop_limit >= 0) or via_mqtt or ("mqtt" in str(transport_mechanism).lower())):
if via_mqtt or "mqtt" in str(transport_mechanism).lower():
hop = "MQTT"
elif hop == "" and hop_count == 0 and (snr != 0 or rssi != 0):
# came in over UDP but carried signal info, so treat it as a gateway
hop = "Gateway"
elif "unknown" in str(transport_mechanism).lower() and (snr == 0 and rssi == 0):
# almost certainly sourced from a UDP-like host
via_mqtt = True
elif "udp" in str(transport_mechanism).lower():
hop = "Gateway"
if hop in ("MQTT", "Gateway") and hop_count > 0:
hop = f"{hop_count} Hops"
hop = f" {hop_count} Hops"
# Add relay node info if present
if packet.get('relayNode') is not None:
relay_val = packet['relayNode']
last_byte = relay_val & 0xFF
if last_byte == 0x00:
hex_val = 'OldFW'
else:
hex_val = f"{last_byte:02X}"
hop += f" Relay:{hex_val}"
if enableHopLogs:
logger.debug(f"System: Packet HopDebugger: hop_away:{hop_away} hop_limit:{hop_limit} hop_start:{hop_start} calculated_hop_count:{hop_count} final_hop_value:{hop} via_mqtt:{via_mqtt} transport_mechanism:{transport_mechanism} Hostname:{rxNodeHostName}")

View File
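The relay-node suffix added in the hunk above keeps only the low byte of `relayNode`, and treats `0x00` as a marker for older firmware that does not report the relay. A minimal standalone sketch of that formatting (function name is illustrative, not from the repo):

```python
def format_relay_suffix(relay_val):
    """Format the relayNode field as the bot does: keep the low byte,
    showing 'OldFW' when it is 0x00 (firmware that predates relay reporting)."""
    last_byte = relay_val & 0xFF
    hex_val = 'OldFW' if last_byte == 0x00 else f"{last_byte:02X}"
    return f" Relay:{hex_val}"

print(format_relay_suffix(0x12345678))  # low byte is 0x78
print(format_relay_suffix(0xABCD00))    # low byte is 0x00 -> OldFW
```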

@@ -12,6 +12,9 @@ CELL_SIZE = 40
BOARD_MARGIN = 50
STATUS_WIDTH = 320
latest_battleship_board = None
latest_battleship_meta = None
def draw_board(screen, board, top_left, cell_size, show_ships=False):
font = pygame.font.Font(None, 28)
x0, y0 = top_left
@@ -92,6 +95,11 @@ def battleship_visual_main(game):
pygame.quit()
sys.exit()
def parse_battleship_message(msg):
# Expected payload:
# MBSP|label|timestamp|nodeID|deviceID|sessionID|status|shotsFired|boardType|shipsStatus|boardString
print("Parsing Battleship message:", msg)
# if __name__ == "__main__":
# # Example: create a new game and show the boards
# game = Battleship(vs_ai=True)

View File

@@ -5,6 +5,7 @@ import random
import time
import pickle
from modules.log import logger
from modules.settings import dwPlayerTracker
# Global variables
total_days = 7 # number of days or rotations the player has to play
@@ -391,6 +392,13 @@ def endGameDw(nodeID):
return msg
if cash < starting_cash:
msg = "You lost money, better go get a real job.💸"
# remove player from all trackers and databases
dwPlayerTracker[:] = [p for p in dwPlayerTracker if p.get('userID') != nodeID]
dwCashDb[:] = [p for p in dwCashDb if p.get('userID') != nodeID]
dwInventoryDb[:] = [p for p in dwInventoryDb if p.get('userID') != nodeID]
dwLocationDb[:] = [p for p in dwLocationDb if p.get('userID') != nodeID]
dwGameDayDb[:] = [p for p in dwGameDayDb if p.get('userID') != nodeID]
return msg

View File
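The `tracker[:] = [...]` slice assignments in `endGameDw` above filter each database in place rather than rebinding the name, so any other module holding a reference to the same list sees the purge. A small illustration with a hypothetical tracker list:

```python
# hypothetical player tracker, shaped like dwPlayerTracker entries
tracker = [{'userID': 1}, {'userID': 2}, {'userID': 1}]
alias = tracker  # another module holding the same list object

node_id = 1
# slice assignment mutates the existing list instead of creating a new one
tracker[:] = [p for p in tracker if p.get('userID') != node_id]

print(tracker)            # only userID 2 remains
print(alias is tracker)   # True: the alias sees the filtered list too
```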

@@ -5,6 +5,7 @@ import random
import time
import pickle
from modules.log import logger
from modules.settings import golfTracker
# Clubs setup
driver_distances = list(range(230, 280, 5))

View File

@@ -10,6 +10,7 @@ import json
import random
import os
from modules.log import logger
from modules.settings import hamtestTracker
class HamTest:
def __init__(self):

View File

@@ -3,6 +3,7 @@ from modules.log import logger, getPrettyTime
import os
import json
import random
from modules.settings import hangmanTracker
class Hangman:
WORDS = [

View File

@@ -211,7 +211,7 @@ def compareCodeMMind(secret_code, user_guess, nodeID):
def playGameMMind(diff, secret_code, turn_count, nodeID, message):
msg = ''
won = False
if turn_count <= 10:
if turn_count < 11:
user_guess = getGuessMMind(diff, message, nodeID)
if user_guess == "XXXX":
msg += f"Invalid guess. Please enter 4 valid color letters.\n🔴🟢🔵🔴 is RGBR"
@@ -240,7 +240,7 @@ def playGameMMind(diff, secret_code, turn_count, nodeID, message):
# reset turn count in tracker
for i in range(len(mindTracker)):
if mindTracker[i]['nodeID'] == nodeID:
mindTracker[i]['turns'] = 0
mindTracker[i]['turns'] = 1
mindTracker[i]['secret_code'] = ''
mindTracker[i]['cmd'] = 'new'
@@ -277,6 +277,7 @@ def start_mMind(nodeID, message):
if mindTracker[i]['nodeID'] == nodeID:
mindTracker[i]['cmd'] = 'makeCode'
mindTracker[i]['diff'] = diff
mindTracker[i]['turns'] = 1
# Return color message to player
msg += chooseDifficultyMMind(message.lower()[0])
return msg

View File

@@ -4,6 +4,7 @@
import random
import time
import modules.settings as my_settings
from modules.settings import tictactoeTracker
useSynchCompression = True
if useSynchCompression:
@@ -16,9 +17,14 @@ class TicTacToe:
if getattr(my_settings, "disable_emojis_in_games", False):
self.X = "X"
self.O = "O"
self.digit_emojis = None
else:
self.X = "❌"
self.O = "⭕️"
# Unicode keycap emoji digits 1️⃣-9️⃣
self.digit_emojis = [
"1️⃣", "2️⃣", "3️⃣", "4️⃣", "5️⃣", "6️⃣", "7️⃣", "8️⃣", "9️⃣"
]
self.display_module = display_module
self.game = {}
self.win_lines_3d = self.generate_3d_win_lines()
@@ -73,7 +79,13 @@ class TicTacToe:
row = []
for j in range(3):
cell = b[i*3+j]
row.append(cell if cell != " " else str(i*3+j+1))
if cell != " ":
row.append(cell)
else:
if self.digit_emojis:
row.append(self.digit_emojis[i*3+j])
else:
row.append(str(i*3+j+1))
s += " | ".join(row) + "\n"
return s
return ""
@@ -147,10 +159,24 @@ class TicTacToe:
msg = self.new_game(nodeID, new_mode, g["channel"], g["deviceID"])
return msg
try:
pos = int(input_msg)
except Exception:
return f"Enter a number between 1 and {max_pos}."
# Accept emoji digits as input
pos = None
# Try to match emoji digits if enabled
if self.digit_emojis:
try:
# Remove variation selectors for matching
normalized_input = input_msg.replace("\ufe0f", "")
for idx, emoji in enumerate(self.digit_emojis[:max_pos]):
if normalized_input == emoji.replace("\ufe0f", ""):
pos = idx + 1
break
except Exception:
pass
if pos is None:
try:
pos = int(input_msg)
except Exception:
return f"Enter a number or emoji between 1 and {max_pos}."
if not self.make_move(nodeID, pos):
return f"Invalid move! Pick 1-{max_pos}:"

View File
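The move parser above matches keycap digits only after stripping the U+FE0F variation selector, because clients may send a keycap like 3️⃣ with or without it. A standalone sketch of that normalization, assuming a three-cell board for brevity:

```python
# keycap emojis are digit + optional U+FE0F (variation selector) + U+20E3 (keycap)
DIGIT_EMOJIS = ["1\ufe0f\u20e3", "2\ufe0f\u20e3", "3\ufe0f\u20e3"]

def parse_position(input_msg, max_pos=3):
    """Return a 1-based board position from an emoji keycap or a plain digit."""
    normalized = input_msg.replace("\ufe0f", "")  # drop variation selectors
    for idx, emoji in enumerate(DIGIT_EMOJIS[:max_pos]):
        if normalized == emoji.replace("\ufe0f", ""):
            return idx + 1
    try:
        return int(input_msg)
    except ValueError:
        return None

print(parse_position("2\u20e3"))        # keycap sent without U+FE0F
print(parse_position("3\ufe0f\u20e3"))  # keycap sent with U+FE0F
print(parse_position("1"))              # plain digit still works
```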

@@ -4,6 +4,7 @@ import random
import time
import pickle
from modules.log import logger, getPrettyTime
from modules.settings import vpTracker
vpStartingCash = 20
# Define the Card class
@@ -260,6 +261,7 @@ class PlayerVP:
def getLastCmdVp(nodeID):
global vpTracker
last_cmd = ""
for i in range(len(vpTracker)):
if vpTracker[i]['nodeID'] == nodeID:
@@ -267,6 +269,7 @@ def getLastCmdVp(nodeID):
return last_cmd
def setLastCmdVp(nodeID, cmd):
global vpTracker
for i in range(len(vpTracker)):
if vpTracker[i]['nodeID'] == nodeID:
vpTracker[i]['cmd'] = cmd

View File

@@ -261,7 +261,7 @@ def get_NOAAweather(lat=0, lon=0, unit=0, report_days=None):
logger.warning("Location:Error fetching weather data from NOAA for location")
return my_settings.ERROR_FETCHING_DATA
except Exception as e:
logger.warning(f"Location:Error fetching weather data error: {Exception}")
logger.warning(f"Location:Error fetching weather data malformed: {type(e).__name__}: {e}")
return my_settings.ERROR_FETCHING_DATA
# get the forecast URL from the JSON response
weather_json = weather_data.json()
@@ -272,7 +272,7 @@ def get_NOAAweather(lat=0, lon=0, unit=0, report_days=None):
logger.warning("Location:Error fetching weather forecast from NOAA")
return my_settings.ERROR_FETCHING_DATA
except Exception as e:
logger.warning(f"Location:Error fetching weather data error: {Exception}")
logger.warning(f"Location:Error fetching weather data missing: {type(e).__name__}: {e}")
return my_settings.ERROR_FETCHING_DATA
# from periods, get the detailedForecast from number of days in NOAAforecastDuration
@@ -409,10 +409,10 @@ def getWeatherAlertsNOAA(lat=0, lon=0, useDefaultLatLon=False):
try:
alert_data = requests.get(alert_url, timeout=my_settings.urlTimeoutSeconds)
if not alert_data.ok:
logger.warning("Location:Error fetching weather alerts from NOAA")
logger.warning("Location:Error fetching weather alerts from NOAA bad data")
return my_settings.ERROR_FETCHING_DATA
except Exception:
logger.warning(f"Location:Error fetching weather data error: {Exception}")
except Exception as e:
logger.warning(f"Location:Error fetching weather alert request error: {type(e).__name__}: {e}")
return my_settings.ERROR_FETCHING_DATA
alerts = ""
@@ -509,10 +509,10 @@ def getActiveWeatherAlertsDetailNOAA(lat=0, lon=0):
try:
alert_data = requests.get(alert_url, timeout=my_settings.urlTimeoutSeconds)
if not alert_data.ok:
logger.warning("Location:Error fetching weather alerts from NOAA")
logger.warning("Location:Error fetching weather alerts from NOAA bad data")
return my_settings.ERROR_FETCHING_DATA
except Exception:
logger.warning(f"Location:Error fetching weather data error: {Exception}")
except Exception as e:
logger.warning(f"Location:Error fetching active weather alert request error: {type(e).__name__}: {e}")
return my_settings.ERROR_FETCHING_DATA
alerts = ""
@@ -1031,79 +1031,132 @@ def distance(lat=0,lon=0,nodeID=0, reset=False):
return msg
def get_openskynetwork(lat=0, lon=0, altitude=0, node_altitude=0, altitude_window=1000):
def get_openskynetwork(lat=0, lon=0, altitude=0, node_altitude=0, altitude_window=900):
"""
Returns the aircraft dict from OpenSky Network closest in altitude (within altitude_window meters)
to the given node_altitude. If no aircraft found, returns my_settings.NO_ALERTS.
"""
if lat == 0 and lon == 0:
return False
def _to_float(v):
try:
# handle numeric and numeric-strings, treat empty/'N/A' as None
if v is None:
return None
if isinstance(v, (int, float)):
return float(v)
s = str(v).strip()
if s == "" or s.upper() == "N/A":
return None
return float(s)
except Exception:
return None
box_size = 0.45 # approx 50km
lamin = lat - box_size
lamax = lat + box_size
lomin = lon - box_size
lomax = lon + box_size
opensky_url = (
f"https://opensky-network.org/api/states/all?lamin={lamin}&lomin={lomin}"
f"&lamax={lamax}&lomax={lomax}"
)
try:
aircraft_data = requests.get(opensky_url, timeout=my_settings.urlTimeoutSeconds)
if not aircraft_data.ok:
# basic input validation/coercion
try:
lat = float(lat)
lon = float(lon)
except Exception:
return False
try:
node_altitude = _to_float(node_altitude) or 0.0
except Exception:
node_altitude = 0.0
if lat == 0 and lon == 0:
return False
box_size = 0.45 # approx 50km
lamin = lat - box_size
lamax = lat + box_size
lomin = lon - box_size
lomax = lon + box_size
opensky_url = (
f"https://opensky-network.org/api/states/all?lamin={lamin}&lomin={lomin}"
f"&lamax={lamax}&lomax={lomax}"
)
try:
aircraft_data = requests.get(opensky_url, timeout=my_settings.urlTimeoutSeconds)
if not aircraft_data.ok:
logger.warning("Location:Error fetching aircraft data from OpenSky Network")
return False
except (requests.exceptions.RequestException):
logger.warning("Location:Error fetching aircraft data from OpenSky Network")
return False
except (requests.exceptions.RequestException):
logger.warning("Location:Error fetching aircraft data from OpenSky Network")
return False
aircraft_json = aircraft_data.json()
if 'states' not in aircraft_json or not aircraft_json['states']:
return False
aircraft_json = aircraft_data.json()
if 'states' not in aircraft_json or not aircraft_json['states']:
return False
aircraft_list = aircraft_json['states']
logger.debug(f"Location: OpenSky Network: Found {len(aircraft_list)} possible aircraft in area")
closest = None
min_diff = float('inf')
for aircraft in aircraft_list:
try:
callsign = aircraft[1].strip() if aircraft[1] else "N/A"
origin_country = aircraft[2]
velocity = aircraft[9]
true_track = aircraft[10]
vertical_rate = aircraft[11]
sensors = aircraft[12]
baro_altitude = aircraft[7]
geo_altitude = aircraft[13]
squawk = aircraft[14] if len(aircraft) > 14 else "N/A"
except Exception as e:
logger.debug("Location:Error extracting aircraft data from OpenSky Network")
continue
aircraft_list = aircraft_json['states']
logger.debug(f"Location: OpenSky Network: Found {len(aircraft_list)} possible aircraft in area")
closest = None
min_diff = float('inf')
# Prefer geo_altitude, fallback to baro_altitude
plane_alt = geo_altitude if geo_altitude is not None else baro_altitude
if plane_alt is None or node_altitude == 0:
continue
diff = abs(plane_alt - node_altitude)
if diff <= altitude_window and diff < min_diff:
min_diff = diff
closest = {
"callsign": callsign,
"origin_country": origin_country,
"velocity": velocity,
"true_track": true_track,
"vertical_rate": vertical_rate,
"sensors": sensors,
"altitude": baro_altitude,
"geo_altitude": geo_altitude,
"squawk": squawk,
if len(aircraft_list) == 1:
# Only one aircraft found; return normalized values (altitudes coerced to numbers or None)
aircraft = aircraft_list[0]
baro_alt = _to_float(aircraft[7]) # barometric altitude
geo_alt = _to_float(aircraft[13]) # geometric altitude
return {
"callsign": aircraft[1].strip() if aircraft[1] else "N/A",
"origin_country": aircraft[2] if aircraft[2] is not None else "N/A",
"velocity": aircraft[9] if aircraft[9] is not None else "N/A",
"true_track": aircraft[10] if aircraft[10] is not None else "N/A",
"vertical_rate": aircraft[11] if aircraft[11] is not None else "N/A",
"sensors": aircraft[12] if aircraft[12] is not None else "N/A",
"altitude": baro_alt,
"geo_altitude": geo_alt,
"squawk": aircraft[14] if len(aircraft) > 14 and aircraft[14] is not None else "N/A",
}
if closest:
return closest
else:
for aircraft in aircraft_list:
try:
callsign = aircraft[1].strip() if aircraft[1] else "N/A"
origin_country = aircraft[2] if aircraft[2] is not None else "N/A"
velocity = aircraft[9] if aircraft[9] is not None else "N/A"
true_track = aircraft[10] if aircraft[10] is not None else "N/A"
vertical_rate = aircraft[11] if aircraft[11] is not None else "N/A"
sensors = aircraft[12] if aircraft[12] is not None else "N/A"
baro_altitude = _to_float(aircraft[7])
geo_altitude = _to_float(aircraft[13])
squawk = aircraft[14] if len(aircraft) > 14 and aircraft[14] is not None else "N/A"
except Exception:
logger.debug("Location:Error extracting aircraft data from OpenSky Network")
continue
# Prefer geo_altitude, fallback to baro_altitude
plane_alt = geo_altitude if geo_altitude is not None else baro_altitude
# skip if we can't get a numeric plane altitude or node_altitude is zero/unset
if plane_alt is None or (node_altitude == 0 or node_altitude is None):
continue
# safe numeric diff
try:
diff = abs(float(plane_alt) - float(node_altitude))
except Exception:
continue
if diff <= altitude_window and diff < min_diff:
min_diff = diff
closest = {
"callsign": callsign,
"origin_country": origin_country,
"velocity": velocity,
"true_track": true_track,
"vertical_rate": vertical_rate,
"sensors": sensors,
"altitude": baro_altitude,
"geo_altitude": geo_altitude,
"squawk": squawk,
}
if closest:
return closest
else:
return False
except Exception as e:
logger.debug(f"SYSTEM: Location HighFly: Error processing OpenSky Network data: {e}")
return False
def log_locationData_toMap(userID, location, message):

View File
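The OpenSky query above builds a ±0.45° bounding box (~50 km, since one degree of latitude is ~111 km) and then keeps the aircraft whose altitude is nearest the node's, within the 900 m window. A pared-down sketch of that selection step — indices follow the OpenSky `states` array layout (1 = callsign, 7 = barometric altitude, 13 = geometric altitude); the sample vectors are hypothetical:

```python
def closest_by_altitude(states, node_alt, window=900):
    """Pick the state vector whose altitude is nearest node_alt within window metres.
    Prefers geometric altitude (index 13), falling back to barometric (index 7)."""
    best, best_diff = None, float('inf')
    for s in states:
        alt = s[13] if s[13] is not None else s[7]
        if alt is None:
            continue  # no usable altitude for this aircraft
        diff = abs(alt - node_alt)
        if diff <= window and diff < best_diff:
            best, best_diff = s, diff
    return best

# hypothetical trimmed state vectors, padded so the indices line up
airliner = [None, "UAL123"] + [None] * 5 + [11000.0] + [None] * 5 + [11050.0]
glider   = [None, "GLDR1"]  + [None] * 5 + [None]    + [None] * 5 + [1200.0]

print(closest_by_altitude([airliner, glider], node_alt=1000)[1])  # only GLDR1 is within 900 m
```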

@@ -323,6 +323,9 @@ try:
coastalForecastDays = config['location'].getint('coastalForecastDays', 3) # default 3 days
# location alerts
alert_duration = config['location'].getint('alertDuration', 20) # default 20 minutes
if alert_duration < 10: # the API calls need throttle time
alert_duration = 10
eAlertBroadcastEnabled = config['location'].getboolean('eAlertBroadcastEnabled', False) # old deprecated name
ipawsAlertEnabled = config['location'].getboolean('ipawsAlertEnabled', False) # default False new ^
# Keep both in sync for backward compatibility

View File
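The clamp added above enforces a 10-minute floor on `alertDuration` so the alert checks cannot poll the upstream APIs too aggressively. A minimal sketch of reading and clamping the value with `configparser`, using the same section and option names as the snippet:

```python
import configparser

config = configparser.ConfigParser()
config.read_string("""
[location]
alertDuration = 3
""")

alert_duration = config['location'].getint('alertDuration', 20)  # default 20 minutes
if alert_duration < 10:  # the API calls need throttle time
    alert_duration = 10

print(alert_duration)  # clamped up from 3
```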

@@ -37,19 +37,27 @@ def hf_band_conditions():
def solar_conditions():
# radio related solar conditions from hamqsl.com
solar_cond = ""
solar_cond = requests.get("https://www.hamqsl.com/solarxml.php", timeout=urlTimeoutSeconds)
if(solar_cond.ok):
solar_xml = xml.dom.minidom.parseString(solar_cond.text)
for i in solar_xml.getElementsByTagName("solardata"):
solar_a_index = i.getElementsByTagName("aindex")[0].childNodes[0].data
solar_k_index = i.getElementsByTagName("kindex")[0].childNodes[0].data
solar_xray = i.getElementsByTagName("xray")[0].childNodes[0].data
solar_flux = i.getElementsByTagName("solarflux")[0].childNodes[0].data
sunspots = i.getElementsByTagName("sunspots")[0].childNodes[0].data
signalnoise = i.getElementsByTagName("signalnoise")[0].childNodes[0].data
solar_cond = "A-Index: " + solar_a_index + "\nK-Index: " + solar_k_index + "\nSunspots: " + sunspots + "\nX-Ray Flux: " + solar_xray + "\nSolar Flux: " + solar_flux + "\nSignal Noise: " + signalnoise
else:
logger.error("Solar: Error fetching solar conditions")
try:
solar_cond = requests.get("https://www.hamqsl.com/solarxml.php", timeout=urlTimeoutSeconds)
if solar_cond.ok:
try:
solar_xml = xml.dom.minidom.parseString(solar_cond.text)
except Exception as e:
logger.error(f"Solar: XML parse error: {e}")
return ERROR_FETCHING_DATA
for i in solar_xml.getElementsByTagName("solardata"):
solar_a_index = i.getElementsByTagName("aindex")[0].childNodes[0].data
solar_k_index = i.getElementsByTagName("kindex")[0].childNodes[0].data
solar_xray = i.getElementsByTagName("xray")[0].childNodes[0].data
solar_flux = i.getElementsByTagName("solarflux")[0].childNodes[0].data
sunspots = i.getElementsByTagName("sunspots")[0].childNodes[0].data
signalnoise = i.getElementsByTagName("signalnoise")[0].childNodes[0].data
solar_cond = "A-Index: " + solar_a_index + "\nK-Index: " + solar_k_index + "\nSunspots: " + sunspots + "\nX-Ray Flux: " + solar_xray + "\nSolar Flux: " + solar_flux + "\nSignal Noise: " + signalnoise
else:
logger.error("Solar: Error fetching solar conditions")
solar_cond = ERROR_FETCHING_DATA
except Exception as e:
logger.error(f"Solar: Exception fetching or parsing: {e}")
solar_cond = ERROR_FETCHING_DATA
return solar_cond
@@ -68,6 +76,77 @@ def drap_xray_conditions():
xray_flux = ERROR_FETCHING_DATA
return xray_flux
def get_noaa_scales_summary():
"""
Summarize latest observed, 24-hour max, and predicted NOAA space weather scales: geomagnetic storm (G), solar radiation storm (S), and radio blackout (R).
"""
try:
response = requests.get("https://services.swpc.noaa.gov/products/noaa-scales.json", timeout=urlTimeoutSeconds)
if response.ok:
data = response.json()
today = datetime.utcnow().date()
latest_entry = None
latest_dt = None
max_g_today = None
max_g_scale = -1
predicted_g = None
predicted_g_scale = -1
# Find latest observed and 24-hour max for today
for entry in data.values():
date_str = entry.get("DateStamp")
time_str = entry.get("TimeStamp")
if date_str and time_str:
try:
dt = datetime.strptime(f"{date_str} {time_str}", "%Y-%m-%d %H:%M:%S")
g = entry.get("G", {})
g_scale = int(g.get("Scale", -1)) if g.get("Scale") else -1
# Latest observed for today
if dt.date() == today:
if latest_dt is None or dt > latest_dt:
latest_dt = dt
latest_entry = entry
# 24-hour max for today
if g_scale > max_g_scale:
max_g_scale = g_scale
max_g_today = entry
# Predicted (future)
elif dt.date() > today:
if g_scale > predicted_g_scale:
predicted_g_scale = g_scale
predicted_g = entry
except Exception:
continue
def format_entry(label, entry):
if not entry:
return f"{label}: No data"
g = entry.get("G", {})
s = entry.get("S", {})
r = entry.get("R", {})
parts = [f"{label} {g.get('Text', 'N/A')} (G:{g.get('Scale', 'N/A')})"]
# Only show storm if it's happening
if s.get("Text") and s.get("Text") != "none":
parts.append(f"Currently:{s.get('Text')} (S:{s.get('Scale', 'N/A')})")
# Only show blackout if it's not "none" or scale is not 0
if r.get("Text") and r.get("Text") != "none" and r.get("Scale") not in [None, "0", 0]:
parts.append(f"RF Blackout:{r.get('Text')} (R:{r.get('Scale', 'N/A')})")
return "\n".join(parts)
output = []
#output.append(format_entry("Latest Observed", latest_entry))
output.append(format_entry("24hrMax:", max_g_today))
output.append(format_entry("Predicted:", predicted_g))
return "\n".join(output)
else:
return NO_ALERTS
except Exception as e:
logger.warning(f"Error fetching services.swpc.noaa.gov: {e}")
return ERROR_FETCHING_DATA
def get_sun(lat=0, lon=0):
# get sunrise and sunset times using callers location or default
obs = ephem.Observer()

View File
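For context on `get_noaa_scales_summary` above: the `noaa-scales.json` product maps day-offset keys (e.g. `"-1"`, `"0"`, `"1"`) to entries carrying `DateStamp`/`TimeStamp` plus `G`/`S`/`R` scale blocks. The sketch below reproduces the `format_entry` helper and runs it on a hypothetical sample entry (field values invented, shape per the endpoint):

```python
def format_entry(label, entry):
    """Condense one NOAA scales entry, mirroring the helper above."""
    if not entry:
        return f"{label}: No data"
    g, s, r = entry.get("G", {}), entry.get("S", {}), entry.get("R", {})
    parts = [f"{label} {g.get('Text', 'N/A')} (G:{g.get('Scale', 'N/A')})"]
    # Only show a radiation storm if one is happening
    if s.get("Text") and s.get("Text") != "none":
        parts.append(f"Currently:{s.get('Text')} (S:{s.get('Scale', 'N/A')})")
    # Only show a blackout if it is not "none" and the scale is nonzero
    if r.get("Text") and r.get("Text") != "none" and r.get("Scale") not in [None, "0", 0]:
        parts.append(f"RF Blackout:{r.get('Text')} (R:{r.get('Scale', 'N/A')})")
    return "\n".join(parts)

# hypothetical sample entry in the shape the endpoint returns
sample = {
    "DateStamp": "2025-12-27", "TimeStamp": "18:00:00",
    "G": {"Scale": "2", "Text": "moderate"},
    "S": {"Scale": "0", "Text": "none"},
    "R": {"Scale": "1", "Text": "minor"},
}
print(format_entry("24hrMax:", sample))
```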

@@ -331,24 +331,6 @@ if ble_count > 1:
logger.critical(f"System: Multiple BLE interfaces detected. Only one BLE interface is allowed. Exiting")
exit()
def xor_hash(data: bytes) -> int:
"""Compute an XOR hash from bytes."""
result = 0
for char in data:
result ^= char
return result
def generate_hash(name: str, key: str) -> int:
"""Generate the channel hash by XOR-hashing the channel name and PSK"""
if key == "AQ==":
key = "1PG7OiApB1nwvP+rz05pAQ=="
replaced_key = key.replace("-", "+").replace("_", "/")
key_bytes = base64.b64decode(replaced_key.encode("utf-8"))
h_name = xor_hash(bytes(name, "utf-8"))
h_key = xor_hash(key_bytes)
result: int = h_name ^ h_key
return result
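The helpers removed above (superseded by the node's `get_channels_with_hash()` table) derive the one-byte channel hash by XOR-folding the channel name and the base64-decoded PSK; a runnable copy for reference:

```python
import base64

def xor_hash(data: bytes) -> int:
    """XOR-fold all bytes into a single value."""
    result = 0
    for b in data:
        result ^= b
    return result

def generate_hash(name: str, key: str) -> int:
    """Channel hash = xor_hash(name bytes) ^ xor_hash(psk bytes)."""
    if key == "AQ==":  # the 1-byte shorthand expands to the well-known default PSK
        key = "1PG7OiApB1nwvP+rz05pAQ=="
    key_bytes = base64.b64decode(key.replace("-", "+").replace("_", "/"))
    return xor_hash(name.encode("utf-8")) ^ xor_hash(key_bytes)

print(generate_hash("LongFast", "AQ=="))  # 8, the default LongFast channel hash
```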
# Initialize interfaces
logger.debug(f"System: Initializing Interfaces")
interface1 = interface2 = interface3 = interface4 = interface5 = interface6 = interface7 = interface8 = interface9 = None
@@ -403,44 +385,90 @@ for i in range(1, 10):
globals()[f'myNodeNum{i}'] = 777
# Fetch channel list from each device
channel_list = []
for i in range(1, 10):
if globals().get(f'interface{i}') and globals().get(f'interface{i}_enabled'):
_channel_cache = None
def build_channel_cache(force_refresh: bool = False):
"""
Build and cache channel_list from interfaces once (or when forced).
"""
global _channel_cache
if _channel_cache is not None and not force_refresh:
return _channel_cache
cache = []
for i in range(1, 10):
if not globals().get(f'interface{i}') or not globals().get(f'interface{i}_enabled'):
continue
try:
node = globals()[f'interface{i}'].getNode('^local')
channels = node.channels
channel_dict = {}
for channel in channels:
if hasattr(channel, 'role') and channel.role:
channel_name = getattr(channel.settings, 'name', '').strip()
channel_number = getattr(channel, 'index', 0)
# Only add channels with a non-empty name
if channel_name:
channel_dict[channel_name] = channel_number
channel_list.append({
"interface_id": i,
"channels": channel_dict
})
logger.debug(f"System: Fetched Channel List from Device{i}")
except Exception as e:
logger.error(f"System: Error fetching channel list from Device{i}: {e}")
# Try to use the node-provided channel/hash table if available
try:
ch_hash_table_raw = node.get_channels_with_hash()
#print(f"System: Device{i} Channel Hash Table: {ch_hash_table_raw}")
except Exception:
logger.warning(f"System: API version error; update the API with `pip3 install --upgrade meshtastic[cli]`")
ch_hash_table_raw = []
# add channel hash to channel_list
for device in channel_list:
interface_id = device["interface_id"]
interface = globals().get(f'interface{interface_id}')
for channel_name, channel_number in device["channels"].items():
psk_base64 = "AQ==" # default PSK
channel_hash = generate_hash(channel_name, psk_base64)
# add hash to the channel entry in channel_list under key 'hash'
for entry in channel_list:
if entry["interface_id"] == interface_id:
entry["channels"][channel_name] = {
"number": channel_number,
"hash": channel_hash
}
channel_dict = {}
# Use the hash table as the source of truth for channels
if isinstance(ch_hash_table_raw, list):
for entry in ch_hash_table_raw:
channel_name = entry.get("name", "").strip()
channel_number = entry.get("index")
ch_hash = entry.get("hash")
role = entry.get("role", "")
# Always add PRIMARY/SECONDARY channels, even if name is empty
if role in ("PRIMARY", "SECONDARY"):
channel_dict[channel_name if channel_name else f"Channel{channel_number}"] = {
"number": channel_number,
"hash": ch_hash
}
elif isinstance(ch_hash_table_raw, dict):
for channel_name, ch_hash in ch_hash_table_raw.items():
channel_dict[channel_name] = {"number": None, "hash": ch_hash}
# Always add the interface, even if no named channels
cache.append({"interface_id": i, "channels": channel_dict})
logger.debug(f"System: Fetched Channel List from Device{i} (cached)")
except Exception as e:
logger.debug(f"System: Error fetching channel list from Device{i}: {e}")
_channel_cache = cache
return _channel_cache
def refresh_channel_cache():
"""Force rebuild of channel cache (call only when channel config changes)."""
return build_channel_cache(force_refresh=True)
channel_list = build_channel_cache()
#print(f"System: Channel Cache Built: {channel_list}")
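The build-once cache pattern above can be sketched in isolation; this stand-in replaces the live interface queries with a hypothetical `load_channels()` stub, so only the caching/refresh behavior is shown:

```python
# Minimal sketch of the build_channel_cache() pattern: build once, reuse,
# rebuild only when force_refresh is requested. load_channels() is a
# hypothetical stand-in for querying the radio interfaces.
_cache = None

def load_channels():
    # assumed example data; the real code walks interface1..interface9
    return [{"interface_id": 1,
             "channels": {"LongFast": {"number": 0, "hash": 8}}}]

def build_cache(force_refresh=False):
    global _cache
    if _cache is not None and not force_refresh:
        return _cache          # cheap path: reuse cached list
    _cache = load_channels()   # expensive path: rebuild from devices
    return _cache

def refresh_cache():
    # call only when channel config changes
    return build_cache(force_refresh=True)
```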
#### FUN-ctions ####
def resolve_channel_name(channel_number, rxNode=1, interface_obj=None):
"""
Resolve a channel number/hash to its name using cached channel list.
"""
try:
# ensure cache exists (cheap)
cached = build_channel_cache()
# quick search in cache first (no node calls)
for device in cached:
if device.get("interface_id") == rxNode:
device_channels = device.get("channels", {}) or {}
# info is dict: {name: {'number': X, 'hash': Y}}
for chan_name, info in device_channels.items():
try:
if isinstance(info, dict):
if str(info.get('number')) == str(channel_number) or str(info.get('hash')) == str(channel_number):
return (chan_name, info.get('number') or info.get('hash'))
else:
if str(info) == str(channel_number):
return (chan_name, info)
except Exception:
continue
break # stop searching other devices
except Exception as e:
logger.debug(f"System: Error resolving channel name from cache: {e}")
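The lookup rule inside `resolve_channel_name()` — a channel entry matches when the query equals either its number or its hash, compared as strings — can be sketched standalone (cache shape assumed from the code above):

```python
# Hedged sketch of the cache search: match on number OR hash, str-compared,
# and only within the requested interface's entry.
def find_channel(cache, interface_id, query):
    for device in cache:
        if device.get("interface_id") != interface_id:
            continue
        for name, info in (device.get("channels") or {}).items():
            if str(info.get("number")) == str(query) or str(info.get("hash")) == str(query):
                return name, info.get("number")
        break  # stop after the requested interface
    return None

cache = [{"interface_id": 1,
          "channels": {"LongFast": {"number": 0, "hash": 8},
                       "admin": {"number": 1, "hash": 31}}}]
```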
def cleanup_memory():
"""Clean up memory by limiting list sizes and removing stale entries"""
@@ -976,7 +1004,7 @@ def stringSafeCheck(s, fromID=0):
if len(s) > 1000:
return False
# Check for single-character injections
single_injection_chars = [';', '|', '}', '>', ')']
single_injection_chars = [';', '|', '}', '>']
if any(c in s for c in single_injection_chars):
return False # injection character found
# Check for multi-character patterns
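The single-character screen can be sketched as below; note the diff drops `')'` from the rejected set, so that character now passes (length limit assumed from the surrounding code):

```python
# Sketch of the stringSafeCheck single-character screen after this change.
def is_string_safe(s, max_len=1000):
    if len(s) > max_len:
        return False
    injection_chars = [';', '|', '}', '>']  # ')' removed in this change
    return not any(c in s for c in injection_chars)
```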
@@ -1276,8 +1304,8 @@ def handleAlertBroadcast(deviceID=1):
if should_send_alert("overdue", overdueAlerts, min_interval=300): # 5 minutes interval for overdue alerts
send_message(overdueAlerts, emergency_responder_alert_channel, 0, emergency_responder_alert_interface)
# Only allow API call every 20 minutes
if not (clock.minute % 20 == 0 and clock.second <= 17):
# Only allow API call every alert_duration minutes at xx:00, xx:20, xx:40
if not (clock.minute % alert_duration == 0 and clock.second <= 17):
return False
# Collect alerts
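The minute-modulo gate above generalizes the old hard-coded 20-minute window: with `alert_duration = 20` the API call is permitted only in the first ~17 seconds of minutes xx:00, xx:20, xx:40. A small sketch:

```python
# Sketch of the alert_duration gate from handleAlertBroadcast().
def api_window_open(minute, second, alert_duration=20):
    return minute % alert_duration == 0 and second <= 17
```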
@@ -1320,7 +1348,7 @@ def handleAlertBroadcast(deviceID=1):
def onDisconnect(interface):
# Handle disconnection of the interface
logger.warning(f"System: Abrupt Disconnection of Interface detected")
logger.warning(f"System: Abrupt Disconnection of Interface detected, attempting reconnect...")
interface.close()
# Telemetry Functions
@@ -1466,6 +1494,7 @@ def initializeMeshLeaderboard():
'lowestBattery': {'nodeID': None, 'value': 101, 'timestamp': 0}, # 🪫
'longestUptime': {'nodeID': None, 'value': 0, 'timestamp': 0}, # 🕰️
'fastestSpeed': {'nodeID': None, 'value': 0, 'timestamp': 0}, # 🚓
'fastestAirSpeed': {'nodeID': None, 'value': 0, 'timestamp': 0}, # ✈️
'highestAltitude': {'nodeID': None, 'value': 0, 'timestamp': 0}, # 🚀
'tallestNode': {'nodeID': None, 'value': 0, 'timestamp': 0}, # 🪜
'coldestTemp': {'nodeID': None, 'value': 999, 'timestamp': 0}, # 🥶
@@ -1612,30 +1641,39 @@ def consumeMetadata(packet, rxNode=0, channel=-1):
positionMetadata[nodeID] = {}
for key in position_stats_keys:
positionMetadata[nodeID][key] = position_data.get(key, 0)
# Track fastest speed 🚓
if position_data.get('groundSpeed') is not None:
speed = position_data['groundSpeed']
if speed > meshLeaderboard['fastestSpeed']['value']:
meshLeaderboard['fastestSpeed'] = {'nodeID': nodeID, 'value': speed, 'timestamp': time.time()}
if logMetaStats:
logger.info(f"System: 🚓 New speed record: {speed} km/h from NodeID:{nodeID} ShortName:{get_name_from_number(nodeID, 'short', rxNode)}")
# Track highest altitude 🚀 (also log if over highfly_altitude threshold)
if position_data.get('altitude') is not None:
altitude = position_data['altitude']
if altitude > highfly_altitude:
if altitude > meshLeaderboard['highestAltitude']['value']:
meshLeaderboard['highestAltitude'] = {'nodeID': nodeID, 'value': altitude, 'timestamp': time.time()}
if logMetaStats:
logger.info(f"System: 🚀 New altitude record: {altitude}m from NodeID:{nodeID} ShortName:{get_name_from_number(nodeID, 'short', rxNode)}")
# Track tallest node 🪜 (under the highfly_altitude limit by 100m)
# Track altitude and speed records
if position_data.get('altitude') is not None:
altitude = position_data['altitude']
highflying = altitude > highfly_altitude
# Tallest node (below highfly_altitude - 100m)
if altitude < (highfly_altitude - 100):
if altitude > meshLeaderboard['tallestNode']['value']:
meshLeaderboard['tallestNode'] = {'nodeID': nodeID, 'value': altitude, 'timestamp': time.time()}
if logMetaStats:
logger.info(f"System: 🪜 New tallest node record: {altitude}m from NodeID:{nodeID} ShortName:{get_name_from_number(nodeID, 'short', rxNode)}")
# Highest altitude (above highfly_altitude)
if highflying:
if altitude > meshLeaderboard['highestAltitude']['value']:
meshLeaderboard['highestAltitude'] = {'nodeID': nodeID, 'value': altitude, 'timestamp': time.time()}
if logMetaStats:
logger.info(f"System: 🚀 New altitude record: {altitude}m from NodeID:{nodeID} ShortName:{get_name_from_number(nodeID, 'short', rxNode)}")
# Track speed records
if position_data.get('groundSpeed') is not None:
speed = position_data['groundSpeed']
# Fastest ground speed (not highflying)
if not highflying and speed > meshLeaderboard['fastestSpeed']['value']:
meshLeaderboard['fastestSpeed'] = {'nodeID': nodeID, 'value': speed, 'timestamp': time.time()}
if logMetaStats:
logger.info(f"System: 🚓 New speed record: {speed} km/h from NodeID:{nodeID} ShortName:{get_name_from_number(nodeID, 'short', rxNode)}")
# Fastest air speed (highflying)
elif highflying and speed > meshLeaderboard['fastestAirSpeed']['value']:
meshLeaderboard['fastestAirSpeed'] = {'nodeID': nodeID, 'value': speed, 'timestamp': time.time()}
if logMetaStats:
logger.info(f"System: ✈️ New air speed record: {speed} km/h from NodeID:{nodeID} ShortName:{get_name_from_number(nodeID, 'short', rxNode)}")
# if altitude exceeds highfly_altitude, log and message for high-flying nodes not in highfly_ignoreList
if position_data.get('altitude', 0) > highfly_altitude and highfly_enabled and str(nodeID) not in highfly_ignoreList and not isNodeBanned(nodeID):
logger.info(f"System: High Altitude {position_data['altitude']}m on Device: {rxNode} Channel: {channel} NodeID:{nodeID} Lat:{position_data.get('latitude', 0)} Lon:{position_data.get('longitude', 0)}")
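The split introduced here — speed records gated by altitude into ground (🚓) vs air (✈️) — can be condensed into a sketch (threshold and board shape assumed from the code above):

```python
# Condensed sketch of the altitude-gated speed record rule.
import time

def update_speed_record(board, node_id, speed, altitude, highfly_altitude=3000):
    highflying = altitude > highfly_altitude
    key = 'fastestAirSpeed' if highflying else 'fastestSpeed'
    if speed > board[key]['value']:
        board[key] = {'nodeID': node_id, 'value': speed, 'timestamp': time.time()}
    return board

board = {'fastestSpeed': {'nodeID': None, 'value': 0, 'timestamp': 0},
         'fastestAirSpeed': {'nodeID': None, 'value': 0, 'timestamp': 0}}
```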
@@ -1935,6 +1973,16 @@ def get_mesh_leaderboard(msg, fromID, deviceID):
result += f"🚓 Speed: {value_kmh} km/h {get_name_from_number(nodeID, 'short', 1)}\n"
else:
result += f"🚓 Speed: {value_mph} mph {get_name_from_number(nodeID, 'short', 1)}\n"
# Tallest node
if meshLeaderboard['tallestNode']['nodeID']:
nodeID = meshLeaderboard['tallestNode']['nodeID']
value_m = meshLeaderboard['tallestNode']['value']
value_ft = round(value_m * 3.28084, 0)
if use_metric:
result += f"🪜 Tallest: {int(round(value_m, 0))}m {get_name_from_number(nodeID, 'short', 1)}\n"
else:
result += f"🪜 Tallest: {int(value_ft)}ft {get_name_from_number(nodeID, 'short', 1)}\n"
# Highest altitude
if meshLeaderboard['highestAltitude']['nodeID']:
@@ -1946,15 +1994,15 @@ def get_mesh_leaderboard(msg, fromID, deviceID):
else:
result += f"🚀 Altitude: {int(value_ft)}ft {get_name_from_number(nodeID, 'short', 1)}\n"
# Tallest node
if meshLeaderboard['tallestNode']['nodeID']:
nodeID = meshLeaderboard['tallestNode']['nodeID']
value_m = meshLeaderboard['tallestNode']['value']
value_ft = round(value_m * 3.28084, 0)
# Fastest airspeed
if meshLeaderboard['fastestAirSpeed']['nodeID']:
nodeID = meshLeaderboard['fastestAirSpeed']['nodeID']
value_kmh = round(meshLeaderboard['fastestAirSpeed']['value'], 1)
value_mph = round(value_kmh / 1.60934, 1)
if use_metric:
result += f"🪜 Tallest: {int(round(value_m, 0))}m {get_name_from_number(nodeID, 'short', 1)}\n"
result += f"✈️ Airspeed: {value_kmh} km/h {get_name_from_number(nodeID, 'short', 1)}\n"
else:
result += f"🪜 Tallest: {int(value_ft)}ft {get_name_from_number(nodeID, 'short', 1)}\n"
result += f"✈️ Airspeed: {value_mph} mph {get_name_from_number(nodeID, 'short', 1)}\n"
# Coldest temperature
if meshLeaderboard['coldestTemp']['nodeID']:
@@ -2040,7 +2088,7 @@ def get_mesh_leaderboard(msg, fromID, deviceID):
result = result.strip()
if result == "📊Leaderboard📊\n":
result += "No records yet! Keep meshing! 📡"
result += "No records yet! Keep meshing! 📡\nFirmware 2.7+: `Broadcast Device Metrics` in Telemetry Config needs to be enabled for full use. Ideally not on AQ=="
return result
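The unit conversions the leaderboard output relies on use the factors visible in the code above (1 mile = 1.60934 km, 1 m = 3.28084 ft); as a standalone sketch:

```python
# Sketch of the leaderboard unit conversions.
def kmh_to_mph(kmh):
    return round(kmh / 1.60934, 1)

def m_to_ft(m):
    return int(round(m * 3.28084, 0))
```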


@@ -174,7 +174,9 @@ class TestBot(unittest.TestCase):
self.assertIsInstance(haha, str)
def test_tictactoe_initial_and_move(self):
from games.tictactoe import tictactoe
from games.tictactoe import TicTacToe
# Create an instance (no display module required for tests)
tictactoe = TicTacToe(display_module=None)
user_id = "testuser"
# Start a new game (no move yet)
initial = tictactoe.play(user_id, "")


@@ -1,78 +0,0 @@
# modules/test_checklist.py
import os
import sys
# Add the parent directory to sys.path to allow module imports
parent_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
sys.path.insert(0, parent_path)
import unittest
from unittest.mock import patch
from checklist import process_checklist_command, initialize_checklist_database
import time
class TestProcessChecklistCommand(unittest.TestCase):
def setUp(self):
# Always start with a fresh DB
initialize_checklist_database()
# Patch settings for consistent test behavior
patcher1 = patch('modules.checklist.reverse_in_out', False)
patcher2 = patch('modules.checklist.bbs_ban_list', [])
patcher3 = patch('modules.checklist.bbs_admin_list', ['999'])
self.mock_reverse = patcher1.start()
self.mock_ban = patcher2.start()
self.mock_admin = patcher3.start()
self.addCleanup(patcher1.stop)
self.addCleanup(patcher2.stop)
self.addCleanup(patcher3.stop)
def test_checkin_command(self):
result = process_checklist_command(1, "checkin test note", name="TESTUSER", location=["loc"])
self.assertIn("Checked✅In: TESTUSER", result)
def test_checkout_command(self):
# First checkin
process_checklist_command(1, "checkin test note", name="TESTUSER", location=["loc"])
# Then checkout
result = process_checklist_command(1, "checkout", name="TESTUSER", location=["loc"])
self.assertIn("Checked⌛Out: TESTUSER", result)
def test_checkin_with_interval(self):
result = process_checklist_command(1, "checkin 15 hiking", name="TESTUSER", location=["loc"])
self.assertIn("monitoring every 15min", result)
def test_checkout_all(self):
# Multiple checkins
process_checklist_command(1, "checkin note1", name="TESTUSER", location=["loc"])
process_checklist_command(1, "checkin note2", name="TESTUSER", location=["loc"])
result = process_checklist_command(1, "checkout all", name="TESTUSER", location=["loc"])
self.assertIn("Checked out", result)
self.assertIn("check-ins for TESTUSER", result)
def test_checklistapprove_nonadmin(self):
process_checklist_command(1, "checkin foo", name="FOO", location=["loc"])
result = process_checklist_command(2, "checklistapprove 1", name="NOTADMIN", location=["loc"])
self.assertNotIn("approved", result)
def test_checklistdeny_nonadmin(self):
process_checklist_command(1, "checkin foo", name="FOO", location=["loc"])
result = process_checklist_command(2, "checklistdeny 1", name="NOTADMIN", location=["loc"])
self.assertNotIn("denied", result)
def test_help_command(self):
result = process_checklist_command(1, "checklist ?", name="TESTUSER", location=["loc"])
self.assertIn("Command: checklist", result)
def test_checklist_listing(self):
process_checklist_command(1, "checkin foo", name="FOO", location=["loc"])
result = process_checklist_command(1, "checklist", name="FOO", location=["loc"])
self.assertIsInstance(result, str)
self.assertIn("checked-In", result)
def test_invalid_command(self):
result = process_checklist_command(1, "foobar", name="FOO", location=["loc"])
self.assertEqual(result, "Invalid command.")
if __name__ == "__main__":
unittest.main()


@@ -104,12 +104,12 @@ def handle_ping(message_from_id, deviceID, message, hop, snr, rssi, isDM, chann
#flood
msg += " [F]"
if (float(snr) != 0 or float(rssi) != 0) and "Hops" not in hop:
if (float(snr) != 0 or float(rssi) != 0) and "Hop" not in hop:
msg += f"\nSNR:{snr} RSSI:{rssi}"
elif "Hops" in hop:
msg += f"\n{hop}🐇 "
else:
msg += "\nflood route"
elif "Hop" in hop:
# janky: strip the words Gateway, Direct, or MQTT if present
hop = hop.replace("Gateway", "").replace("Direct", "").replace("MQTT", "").strip()
msg += f"\n{hop} "
if "@" in message:
msg = msg + " @" + message.split("@")[1]
@@ -279,24 +279,38 @@ def onReceive(packet, interface):
# if the packet has a channel flag, use it ## FIXME: needs to be a channel-hash lookup
if packet.get('channel'):
channel_number = packet.get('channel')
# get channel name from channel number from connected devices
for device in channel_list:
if device["interface_id"] == rxNode:
device_channels = device['channels']
for chan_name, info in device_channels.items():
if info['number'] == channel_number:
channel_name = chan_name
channel_name = "unknown"
try:
res = resolve_channel_name(channel_number, rxNode, interface)
if res:
try:
channel_name, _ = res
except Exception:
channel_name = "unknown"
else:
# Search all interfaces for this channel
cache = build_channel_cache()
found_on_other = None
for device in cache:
for chan_name, info in device.get("channels", {}).items():
if str(info.get('number')) == str(channel_number) or str(info.get('hash')) == str(channel_number):
found_on_other = device.get("interface_id")
found_chan_name = chan_name
break
if found_on_other:
break
if found_on_other and found_on_other != rxNode:
logger.debug(
f"System: Received Packet on Channel:{channel_number} ({found_chan_name}) on Interface:{rxNode}, but this channel is configured on Interface:{found_on_other}"
)
except Exception as e:
logger.debug(f"System: channel resolution error: {e}")
# get channel hashes for the interface
device = next((d for d in channel_list if d["interface_id"] == rxNode), None)
if device:
# Find the channel name whose hash matches channel_number
for chan_name, info in device['channels'].items():
if info['hash'] == channel_number:
print(f"Matched channel hash {info['hash']} to channel name {chan_name}")
channel_name = chan_name
break
#debug channel info
# if "unknown" in str(channel_name):
# logger.debug(f"System: Received Packet on Channel:{channel_number} on Interface:{rxNode}")
# else:
# logger.debug(f"System: Received Packet on Channel:{channel_number} Name:{channel_name} on Interface:{rxNode}")
# check if the packet has a simulator flag
simulator_flag = packet.get('decoded', {}).get('simulator', False)
@@ -370,25 +384,32 @@ def onReceive(packet, interface):
else:
hop_count = hop_away
if hop == "" and hop_count > 0:
if hop_count > 0:
# set hop string from calculated hop count
hop = f"{hop_count} Hop" if hop_count == 1 else f"{hop_count} Hops"
if hop_start == hop_limit and "lora" in str(transport_mechanism).lower() and (snr != 0 or rssi != 0):
if hop_start == hop_limit and "lora" in str(transport_mechanism).lower() and (snr != 0 or rssi != 0) and hop_count == 0:
# 2.7+ firmware direct hop over LoRa
hop = "Direct"
if ((hop_start == 0 and hop_limit >= 0) or via_mqtt or ("mqtt" in str(transport_mechanism).lower())):
if via_mqtt or "mqtt" in str(transport_mechanism).lower():
hop = "MQTT"
elif hop == "" and hop_count == 0 and (snr != 0 or rssi != 0):
# came in via UDP but signal info is present, so the Gateway path was used
hop = "Gateway"
elif "unknown" in str(transport_mechanism).lower() and (snr == 0 and rssi == 0):
# definitely sourced from a UDP-like host
via_mqtt = True
elif "udp" in str(transport_mechanism).lower():
hop = "Gateway"
if hop in ("MQTT", "Gateway") and hop_count > 0:
hop = f"{hop_count} Hops"
hop = f" {hop_count} Hops"
# Add relay node info if present
if packet.get('relayNode') is not None:
relay_val = packet['relayNode']
last_byte = relay_val & 0xFF
if last_byte == 0x00:
hex_val = 'FF'
else:
hex_val = f"{last_byte:02X}"
hop += f" (Relay:{hex_val})"
if my_settings.enableHopLogs:
logger.debug(f"System: Packet HopDebugger: hop_away:{hop_away} hop_limit:{hop_limit} hop_start:{hop_start} calculated_hop_count:{hop_count} final_hop_value:{hop} via_mqtt:{via_mqtt} transport_mechanism:{transport_mechanism} Hostname:{rxNodeHostName}")
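The relay-node suffix logic added above takes the low byte of `relayNode`, formats it as two uppercase hex digits, and maps `0x00` to `'FF'`; isolated as a sketch:

```python
# Sketch of the relayNode suffix: low byte as 2-digit hex, 0x00 -> 'FF'.
def relay_suffix(relay_val):
    last_byte = relay_val & 0xFF
    hex_val = 'FF' if last_byte == 0x00 else f"{last_byte:02X}"
    return f" (Relay:{hex_val})"
```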


@@ -13,4 +13,9 @@ This is not a full turnkey setup for Docker yet.
`docker compose run ollama`
`docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=http://127.0.0.1:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main`
### Other Stuff
A handy tool for RAG creation with open-webui:
- https://github.com/microsoft/markitdown


@@ -30,6 +30,7 @@ except ImportError:
try:
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
from modules.games.tictactoe_vid import handle_tictactoe_payload, ttt_main
from modules.games.battleship_vid import parse_battleship_message
except Exception as e:
print(f"Error importing modules: {e}\nRun this program from the main project directory, e.g. 'python3 script/game_serve.py'")
exit(1)
@@ -130,6 +131,13 @@ def on_private_app(packet: mesh_pb2.MeshPacket, addr=None):
add_seen_message(msg_tuple)
handle_tictactoe_payload(packet_payload, from_id=packet_from_id)
print(f"[Channel: {rx_channel}] [Port: {port_name}] Tic-Tac-Toe Message payload:", packet_payload)
elif packet_payload.startswith("MBSP:"):
packet_payload = packet_payload[5:] # remove 'MBSP:'
msg_tuple = (getattr(packet, 'from', None), packet.to, packet_payload)
if msg_tuple not in seen_messages:
add_seen_message(msg_tuple)
#parse_battleship_message(packet_payload, from_id=packet_from_id)
print(f"[Channel: {rx_channel}] [Port: {port_name}] Battleship Message payload:", packet_payload)
else:
msg_tuple = (getattr(packet, 'from', None), packet.to, packet_payload)
if msg_tuple not in seen_messages:
@@ -169,7 +177,7 @@ def main():
print(r"""
___
/ \
| HOT | Mesh Bot Display Server v0.9.5
| HOT | Mesh Bot Display Server v0.9.5b
| TOT | (aka tot-bot)
\___/