forked from iarv/mc-webui

32 Commits
main ... v1

Author SHA1 Message Date
MarekWo
fd4818cfad fix(ui): Remove CSS rule that stacked channel buttons vertically
Remove nested @media (max-width: 400px) rule that forced btn-group
to flex-direction: column, causing buttons to stack on mobile.
Also remove now-unused .list-group-item small styles (channel keys
no longer shown).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 07:35:56 +01:00
MarekWo
71f292d843 feat(ui): Mark-all-read confirmation dialog and compact channel list
Add confirm() dialog before marking all messages as read, showing
list of unread channels with counts. Remove channel key/ID from
Manage Channels modal to save vertical space on mobile.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 07:27:35 +01:00
MarekWo
7a4f4d3161 feat(notifications): Channel mute toggle and mark-all-as-read bell button
Add ability to mute notifications for individual channels via Manage
Channels modal (bell/bell-slash toggle button). Muted channels are
excluded from unread badge counts, browser notifications, and app icon
badge. Bell icon click now marks all channels as read in bulk.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 22:00:40 +01:00
MarekWo
ad478a8d47 feat(ui): Add @me filter button, DM filter push-down, and DM FAB toggle
- Add person icon button in filter bar that inserts the current device
  name into the search field, for filtering own messages
- DM filter bar already benefits from the CSS sibling push-down rule
  added in previous commit (same class names used)
- Add collapsible FAB toggle to DM view, same pattern as channel chat

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 09:05:47 +01:00
MarekWo
6310c41934 feat(ui): FAB toggle, filter bar layout fix, and filter @mentions
- Add collapsible FAB container with chevron toggle button to
  temporarily hide floating action buttons that overlap messages
- Make filter bar push messages down instead of overlaying the first
  matched message (CSS sibling selector adds padding-top)
- Add @mentions autocomplete to filter search bar - typing @ shows
  contact list dropdown, selecting inserts plain name (not @[] format)
  so all messages from/mentioning that user are found

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 08:47:00 +01:00
MarekWo
000c4f6884 fix: Make container port match FLASK_PORT for custom port configurations
Previously, the internal container port was hardcoded to 5000, so setting
FLASK_PORT to a different value would break the port mapping and healthcheck.

Credit: Tymo3

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 08:01:12 +01:00
MarekWo
2f82c589c7 feat(watchdog): Hardware USB bus reset for stuck LoRa devices
Implement a smart auto-detection and low-level fcntl ioctl reset
mechanism for LoRa USB devices. This 'last resort' recovery is
triggered if the meshcore-bridge container fails to recover after
3 restarts within an 8-minute window. Includes updates to the
installer, systemd service, and newly added README.

Co-Authored-By: Gemini CLI <noreply@google.com>
2026-02-22 20:15:27 +00:00
MarekWo
f1e5f39a4e fix: Reload echoes/acks after device name detection
When MC_DEVICE_NAME=auto, _load_echoes() runs with "auto.echoes.jsonl"
which doesn't exist. After actual device name is detected and paths
updated, the data was never reloaded from the correct file, leaving
incoming_paths and echo_counts empty. This caused missing path info
for all messages older than the current bridge session.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 17:57:47 +01:00
MarekWo
bcdc014965 fix: Extend sent echo_counts retention from 1h to 7 days
Same 1-hour cleanup issue as incoming_paths: sent messages lost
their analyzer links after ~1 hour because echo_counts was pruned
on every new send. Now matches .echoes.jsonl 7-day retention.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 17:24:57 +01:00
MarekWo
9ad3435609 fix: Always use attempt=0 payload for analyzer URL computation
The attempt loop (0-3) for matching incoming echo paths left
computed_payload at attempt=3 when no match was found, producing
wrong analyzer hashes. Combined with 1-hour incoming_paths cleanup
in bridge (vs 7-day .echoes.jsonl retention), this caused older
messages to lose both path info and correct analyzer links.

Two fixes:
- Compute base_payload at attempt=0 upfront for analyzer URL
- Extend incoming_paths memory cleanup from 1h to 7 days

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 17:15:12 +01:00
MarekWo
6d50391ea8 fix: Decode GPS coordinates as int32/1e6, not float
MeshCore encodes lat/lon as little-endian signed int32 divided by 1e6,
not as IEEE 754 floats. This caused all map pins to show at (0,0).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 13:45:43 +01:00
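The decode described in this commit is a two-value unpack plus scaling. A minimal standalone sketch (`decode_coords` is an illustrative helper, not code from this repo):

```python
import struct

def decode_coords(raw: bytes) -> tuple:
    """Decode MeshCore lat/lon: two little-endian signed int32 values,
    each scaled by 1e6 (i.e. stored as microdegrees)."""
    lat_i, lon_i = struct.unpack('<ii', raw[:8])
    return lat_i / 1e6, lon_i / 1e6

# Example: 52.2297 N, 21.0122 E encoded as int32 microdegrees
encoded = struct.pack('<ii', 52229700, 21012200)
lat, lon = decode_coords(encoded)
```

Reading the same eight bytes as IEEE 754 floats (`'<ff'`) instead yields denormal values near zero, which is why all map pins collapsed to (0,0) before this fix.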
MarekWo
587bc8cb9f fix: Validate GPS coordinates from advert payloads
Discard NaN, Infinity, and out-of-range lat/lon values from
struct.unpack to prevent JSON parse errors in browser.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 13:10:34 +01:00
MarekWo
247b11e1e9 feat: Enrich contacts cache with GPS coordinates and node type
- Extract lat/lon from advert payloads (struct unpack from binary)
- Store type_label and lat/lon in cache from device seed and adverts
- Show Map button for cache contacts with GPS coordinates
- Show colored type badge (CLI/REP/ROOM/SENS) for typed cache contacts
- Type filter now works for cache contacts with known type
- Change counter label from "known" to "cached"

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 12:40:42 +01:00
MarekWo
a5e767e5bf fix: Replace sort buttons with dropdown for mobile-friendly contact filters
Replaces two sort toggle buttons with a single <select> dropdown (e-commerce style)
so all 3 filter/sort controls fit on mobile screens.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 07:19:56 +01:00
MarekWo
de0108d6aa feat: Add persistent contacts cache for @mention autocomplete
Contacts cache accumulates all known node names from device contacts
and adverts into a JSONL file, so @mentions work even after contacts
are removed from the device. Background thread scans adverts every
45s and parses advert payloads to extract public keys and node names.

Existing Contacts page now shows merged view with "Cache" badge for
contacts not on device, plus source filter (All/On device/Cache only).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 17:13:36 +01:00
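The advert payload layout this scanner parses is documented in the app/contacts_cache.py diff below: public key (32 bytes), timestamp (4), signature (64), an app-flags byte, then optional GPS and a UTF-8 node name. A simplified, self-contained sketch (`parse_advert` is illustrative, not the project's exact `parse_advert_payload`):

```python
import struct

def parse_advert(raw: bytes):
    """Simplified advert parser (layout per app/contacts_cache.py):
    [0:32] public key, [32:36] timestamp, [36:100] signature,
    [100] app flags (bit 4 = location, bit 7 = name),
    then optional lat/lon (LE int32 / 1e6) and a UTF-8 node name.
    Validation of out-of-range coordinates is omitted here."""
    if len(raw) < 101:
        return None
    public_key = raw[:32].hex()
    flags = raw[100]
    lat = lon = 0.0
    offset = 101
    if flags & 0x10 and len(raw) >= 109:  # location present
        lat_i, lon_i = struct.unpack('<ii', raw[101:109])
        lat, lon = lat_i / 1e6, lon_i / 1e6
        offset += 8
    name = raw[offset:].decode('utf-8', errors='replace') if flags & 0x80 else None
    return public_key, name, lat, lon

# Synthetic advert: zeroed key/timestamp/signature, flags 0x90 = name + location
advert = (bytes(32) + bytes(4) + bytes(64) + bytes([0x90])
          + struct.pack('<ii', 52229700, 21012200) + "Node-1".encode('utf-8'))
pk, name, lat, lon = parse_advert(advert)
```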
MarekWo
0a73556c78 fix: Use bi-clipboard-data icon for analyzer (bi-flask unavailable)
bi-flask was added in Bootstrap Icons 1.12+, but project uses 1.11.2.
Replace with bi-clipboard-data which is available and conveys
"data analysis".

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 08:24:43 +01:00
MarekWo
5a7a9476f8 feat: Always show analyzer link for incoming msgs + flask icon
Generate analyzer_url from computed pkt_payload for all incoming
channel messages, not just those with echo path matches. This means
the analyzer button appears even when no route paths were captured.

Also change analyzer button icon from bi-search (magnifying glass)
to bi-flask (lab flask) to better convey "analysis/inspection".

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 08:20:10 +01:00
MarekWo
68b2166445 fix: Use unstripped raw_text for pkt_payload computation
The parser's .strip() was removing trailing whitespace from message
text, but the encrypted radio payload includes those trailing spaces.
This caused pkt_payload mismatches for messages ending with spaces
(e.g., "Dzień dobry "). Use original unstripped text for raw_text.

Also add debug logging for unmatched messages to help diagnose
remaining edge cases.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 08:09:28 +01:00
MarekWo
28148d32d8 feat: Deterministic echo-to-message matching via pkt_payload computation
Replace unreliable timestamp-based heuristic (±10s window) with exact
cryptographic matching for incoming channel message routes. Compute
pkt_payload by reconstructing the AES-128-ECB encrypted packet from
message data (sender_timestamp, txt_type, text) + channel secret, then
match against echo data by exact key lookup.

Also accumulate ALL route paths per message (previously only last path
was kept due to dict overwrite), and display them in a multi-path popup
showing SNR and hops for each route.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 07:29:49 +01:00
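The plaintext block that gets AES-encrypted can be sketched as follows. The flags packing (txt_type in the upper six bits, the attempt counter in the lower two) mirrors the `compute_pkt_payload` helper added in this commit, but `grp_txt_plaintext` here is an illustrative reconstruction with the AES/HMAC steps omitted:

```python
import struct

def grp_txt_plaintext(sender_timestamp: int, txt_type: int, text: str, attempt: int = 0) -> bytes:
    """Plaintext that gets AES-128-ECB encrypted for a GRP_TXT packet:
    timestamp (4 bytes LE) + flags (1 byte) + UTF-8 text + null terminator,
    zero-padded to a 16-byte block boundary. The flags byte packs txt_type
    in the upper six bits and the attempt counter (0-3) in the lower two."""
    flags = ((txt_type & 0x3F) << 2) | (attempt & 0x03)
    pt = struct.pack('<I', sender_timestamp) + bytes([flags]) + text.encode('utf-8') + b'\x00'
    pad_len = (16 - len(pt) % 16) % 16
    return pt + b'\x00' * pad_len

block = grp_txt_plaintext(1708500000, 0, "hi")
```

Because the attempt counter lives inside the encrypted flags byte, the matcher must try all four attempt values before declaring a miss — hence the attempt loop in the api.py diff below.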
MarekWo
2ed3dc3758 feat: Add unknown delivery status indicator + update docs
Add clickable "?" icon on DMs without ACK, showing a popup
explaining that delivery is unknown (mobile-friendly).
Update README and the user guide with new features (Analyzer links,
DM delivery tracking).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 08:14:58 +01:00
MarekWo
235c74338d fix: Skip redundant DM refreshes once delivery ACK is confirmed
Stop scheduling further post-send reloads as soon as the last
own message shows a delivery checkmark.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 07:51:36 +01:00
MarekWo
cdd28e66fc fix: Auto-refresh DM view after send to show delivery status
Add two extra delayed reloads (6s, 15s) after sending a DM,
matching the channel chat pattern, so ACK checkmarks appear
without needing to send another message.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 09:39:14 +01:00
MarekWo
7a960f2556 feat: Add DM delivery tracking via ACK packet detection
Bridge captures ACK packets from meshcli stdout (json_log_rx),
persists to .acks.jsonl, and exposes /ack_status endpoint.
Delivery status is merged server-side into DM messages and
displayed as a green checkmark with SNR/route tooltip.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 09:30:33 +01:00
MarekWo
cf537628cf feat: Add MeshCore Analyzer link button to channel messages
Compute packet_hash from pkt_payload (SHA-256 of type byte + payload)
and generate analyzer.letsmesh.net links. Button appears on both sent
and received messages when echo data is available.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 08:26:43 +01:00
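The hash computation is small enough to sketch. The `GRP_TXT_TYPE_BYTE = 0x05` constant and the 16-character uppercase truncation follow the `compute_analyzer_url` helper in this commit's api.py diff, while `analyzer_link` itself is an illustrative name:

```python
import hashlib

GRP_TXT_TYPE_BYTE = 0x05  # packet type byte for group text messages

def analyzer_link(pkt_payload_hex: str) -> str:
    """packet_hash = SHA-256 over (type byte + raw payload bytes),
    truncated to 16 hex chars and uppercased, appended to the
    analyzer.letsmesh.net packet URL."""
    raw = bytes([GRP_TXT_TYPE_BYTE]) + bytes.fromhex(pkt_payload_hex)
    packet_hash = hashlib.sha256(raw).hexdigest()[:16].upper()
    return f"https://analyzer.letsmesh.net/packets?packet_hash={packet_hash}"

url = analyzer_link("deadbeef")
```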
MarekWo
4bb33a7346 upg: Meshcore-cli upgrade to 1.4.2 2026-02-15 16:06:49 +01:00
MarekWo
eb303c35ad fix: Filter meshcli prompt lines to eliminate false WARN results
Prompt lines (DeviceName|* ...) and summary lines (> N contacts)
are normal meshcli output, not format changes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-15 10:22:37 +01:00
MarekWo
bb0937e52a fix: Show unparsed line content in WARN messages for easier diagnosis
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-15 10:20:40 +01:00
MarekWo
527204ea87 fix: Support piped execution for compat checker (env vars instead of argparse)
The script runs from host piped into the container, so argparse
doesn't work with stdin. Use env vars (BRIDGE_URL, FULL) as primary
config with fallback CLI arg parsing.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-15 10:16:26 +01:00
MarekWo
47877fb9e1 feat: Add meshcore-cli compatibility checker script
Diagnostic tool that tests all meshcli commands and response formats
used by mc-webui against a running bridge instance, detecting breaking
changes early when updating meshcore-cli versions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-15 10:05:40 +01:00
MarekWo
35c47de624 fix: Update advert/echo log paths when device name is detected
When MC_DEVICE_NAME=auto, the bridge initially creates log files as
auto.adverts.jsonl and auto.echoes.jsonl. After detecting the real
device name, it now renames them and updates paths. Also adds
echoes_log to the /health endpoint.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-09 11:35:31 +01:00
MarekWo
f35b4ebe95 fix: Retry device name detection when bridge is not ready at startup
The background thread now retries with exponential backoff (5s→60s)
instead of giving up after 3 attempts. Also accepts detected device
name from bridge even when bridge health status is unhealthy.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-09 11:14:41 +01:00
MarekWo
1d8449138d docs: Add troubleshooting section for unresponsive device (firmware corruption)
Documented the bridge crash-loop scenario where the MeshCore device
serial port connects but firmware doesn't respond to commands,
including symptoms, what doesn't help, and the fix (re-flash firmware).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-09 10:33:32 +01:00
29 changed files with 2311 additions and 277 deletions

.gitignore — 1 change

@@ -102,3 +102,4 @@ docs/UI-Contact-Management-MVP-v2.md
docs/TEST-PLAN-Contact-Management-v2.md
docs/github-discussion-*.md
docs/github-response-spaces-in-device-name.md
+docs/check-compat-howto.md


@@ -22,7 +22,9 @@ A lightweight web interface for meshcore-cli, providing browser-based access to
- **Message archives** - Automatic daily archiving with browse-by-date selector
- **Interactive Console** - Direct meshcli command execution via WebSocket
- **@Mentions autocomplete** - Type @ to see contact suggestions with fuzzy search
-- **Echo tracking** - "Heard X repeats" with repeater IDs for sent messages, route path for incoming messages (persisted across restarts)
+- **Echo tracking** - "Heard X repeats" with repeater IDs for sent messages, all route paths for incoming messages with deterministic payload matching (persisted across restarts)
+- **MeshCore Analyzer** - View packet details on analyzer.letsmesh.net directly from channel messages
+- **DM delivery tracking** - ACK-based delivery confirmation with SNR and route info
- **PWA support** - Browser notifications and installable app (experimental)
- **Full offline support** - Works without internet (local Bootstrap, icons, emoji picker)
@@ -308,6 +310,8 @@ sudo ~/mc-webui/scripts/updater/install.sh --uninstall
- [x] Interactive Console - Direct meshcli access via WebSocket with command history
- [x] Contact Map - View contacts with GPS coordinates on OpenStreetMap (Leaflet)
- [x] Echo Tracking - "Heard X repeats" badge for sent channel messages
+- [x] MeshCore Analyzer - Packet analysis links on channel messages (analyzer.letsmesh.net)
+- [x] DM Delivery Tracking - ACK-based delivery checkmarks with SNR/route details

### Next Steps

app/contacts_cache.py — new file, 273 lines

@@ -0,0 +1,273 @@
"""
Contacts Cache - Persistent storage of all known node names + public keys.

Stores every node name ever seen (from device contacts and adverts),
so @mention autocomplete works even for removed contacts.

File format: JSONL ({device_name}.contacts_cache.jsonl)
Each line: {"public_key": "...", "name": "...", "first_seen": ts, "last_seen": ts,
            "source": "advert"|"device", "lat": float, "lon": float, "type_label": "CLI"|"REP"|...}
"""
import json
import logging
import math
import struct
import time
from pathlib import Path
from threading import Lock

from app.config import config, runtime_config

logger = logging.getLogger(__name__)

_cache_lock = Lock()
_cache: dict = {}  # {public_key: {name, first_seen, last_seen, source}}
_cache_loaded = False
_adverts_offset = 0  # File offset for incremental advert scanning


def _get_cache_path() -> Path:
    device_name = runtime_config.get_device_name()
    return Path(config.MC_CONFIG_DIR) / f"{device_name}.contacts_cache.jsonl"


def _get_adverts_path() -> Path:
    device_name = runtime_config.get_device_name()
    return Path(config.MC_CONFIG_DIR) / f"{device_name}.adverts.jsonl"


def load_cache() -> dict:
    """Load cache from disk into memory. Returns copy of cache dict."""
    global _cache, _cache_loaded
    with _cache_lock:
        if _cache_loaded:
            return _cache.copy()
        cache_path = _get_cache_path()
        _cache = {}
        if not cache_path.exists():
            _cache_loaded = True
            logger.info("Contacts cache file does not exist yet")
            return _cache.copy()
        try:
            with open(cache_path, 'r', encoding='utf-8') as f:
                for line in f:
                    line = line.strip()
                    if not line:
                        continue
                    try:
                        entry = json.loads(line)
                        pk = entry.get('public_key', '').lower()
                        if pk:
                            _cache[pk] = entry
                    except json.JSONDecodeError:
                        continue
            _cache_loaded = True
            logger.info(f"Loaded contacts cache: {len(_cache)} entries")
        except Exception as e:
            logger.error(f"Failed to load contacts cache: {e}")
            _cache_loaded = True
        return _cache.copy()


def save_cache() -> bool:
    """Write full cache to disk (atomic write)."""
    with _cache_lock:
        cache_path = _get_cache_path()
        try:
            cache_path.parent.mkdir(parents=True, exist_ok=True)
            temp_file = cache_path.with_suffix('.tmp')
            with open(temp_file, 'w', encoding='utf-8') as f:
                for entry in _cache.values():
                    f.write(json.dumps(entry, ensure_ascii=False) + '\n')
            temp_file.replace(cache_path)
            logger.debug(f"Saved contacts cache: {len(_cache)} entries")
            return True
        except Exception as e:
            logger.error(f"Failed to save contacts cache: {e}")
            return False


def upsert_contact(public_key: str, name: str, source: str = "advert",
                   lat: float = 0.0, lon: float = 0.0, type_label: str = "") -> bool:
    """Add or update a contact in the cache. Returns True if cache was modified."""
    pk = public_key.lower()
    now = int(time.time())
    with _cache_lock:
        existing = _cache.get(pk)
        if existing:
            changed = False
            if name and name != existing.get('name'):
                existing['name'] = name
                changed = True
            # Update lat/lon if new values are non-zero
            if lat != 0.0 or lon != 0.0:
                if lat != existing.get('lat') or lon != existing.get('lon'):
                    existing['lat'] = lat
                    existing['lon'] = lon
                    changed = True
            # Update type_label if provided and not already set
            if type_label and type_label != existing.get('type_label'):
                existing['type_label'] = type_label
                changed = True
            existing['last_seen'] = now
            return changed
        else:
            if not name:
                return False
            entry = {
                'public_key': pk,
                'name': name,
                'first_seen': now,
                'last_seen': now,
                'source': source,
            }
            if lat != 0.0 or lon != 0.0:
                entry['lat'] = lat
                entry['lon'] = lon
            if type_label:
                entry['type_label'] = type_label
            _cache[pk] = entry
            return True


def get_all_contacts() -> list:
    """Get all cached contacts as a list of dicts (shallow copies)."""
    with _cache_lock:
        return [entry.copy() for entry in _cache.values()]


def get_all_names() -> list:
    """Get all unique non-empty contact names sorted alphabetically."""
    with _cache_lock:
        return sorted(set(
            entry['name'] for entry in _cache.values()
            if entry.get('name')
        ))


def parse_advert_payload(pkt_payload_hex: str):
    """
    Parse advert pkt_payload to extract public_key, node_name, and GPS coordinates.

    Layout of pkt_payload (byte offsets):
      [0:32]   Public Key (32 bytes = 64 hex chars)
      [32:36]  Timestamp (4 bytes)
      [36:100] Signature (64 bytes)
      [100]    App Flags (1 byte) - bit 4: Location, bit 7: Name
      [101+]   If Location (bit 4): Lat (4 bytes, LE int32/1e6) + Lon (4 bytes, LE int32/1e6)
               If Name (bit 7): Node name (UTF-8, variable length)

    Returns:
        (public_key_hex, node_name, lat, lon) or (None, None, 0, 0) on failure
    """
    try:
        raw = bytes.fromhex(pkt_payload_hex)
        if len(raw) < 101:
            return None, None, 0.0, 0.0
        public_key = pkt_payload_hex[:64].lower()
        app_flags = raw[100]
        has_location = bool(app_flags & 0x10)  # bit 4
        has_name = bool(app_flags & 0x80)      # bit 7
        lat, lon = 0.0, 0.0
        name_offset = 101
        if has_location:
            if len(raw) >= 109:
                lat_i, lon_i = struct.unpack('<ii', raw[101:109])
                lat, lon = lat_i / 1e6, lon_i / 1e6
                # Validate: discard NaN, Infinity, and out-of-range values
                if (math.isnan(lat) or math.isnan(lon) or
                        math.isinf(lat) or math.isinf(lon) or
                        not (-90 <= lat <= 90) or not (-180 <= lon <= 180)):
                    lat, lon = 0.0, 0.0
            name_offset += 8
        if not has_name:
            return public_key, None, lat, lon
        if name_offset >= len(raw):
            return public_key, None, lat, lon
        name_bytes = raw[name_offset:]
        node_name = name_bytes.decode('utf-8', errors='replace').rstrip('\x00')
        return public_key, node_name if node_name else None, lat, lon
    except Exception:
        return None, None, 0.0, 0.0


def scan_new_adverts() -> int:
    """
    Scan .adverts.jsonl for new entries since last scan.
    Returns number of new/updated contacts.
    """
    global _adverts_offset
    adverts_path = _get_adverts_path()
    if not adverts_path.exists():
        return 0
    updated = 0
    try:
        with open(adverts_path, 'r', encoding='utf-8') as f:
            f.seek(_adverts_offset)
            for line in f:
                line = line.strip()
                if not line:
                    continue
                try:
                    advert = json.loads(line)
                    pkt_payload = advert.get('pkt_payload', '')
                    if not pkt_payload:
                        continue
                    pk, name, lat, lon = parse_advert_payload(pkt_payload)
                    if pk and name:
                        if upsert_contact(pk, name, source="advert", lat=lat, lon=lon):
                            updated += 1
                except json.JSONDecodeError:
                    continue
            _adverts_offset = f.tell()
    except Exception as e:
        logger.error(f"Failed to scan adverts: {e}")
    if updated > 0:
        save_cache()
        logger.info(f"Contacts cache updated: {updated} new/changed entries")
    return updated


_TYPE_LABELS = {1: 'CLI', 2: 'REP', 3: 'ROOM', 4: 'SENS'}


def initialize_from_device(contacts_detailed: dict):
    """
    Seed cache from /api/contacts/detailed response dict.
    Called once at startup if cache file doesn't exist.

    Args:
        contacts_detailed: dict of {public_key: {adv_name, type, adv_lat, adv_lon, ...}} from meshcli
    """
    added = 0
    for pk, details in contacts_detailed.items():
        name = details.get('adv_name', '')
        lat = details.get('adv_lat', 0.0) or 0.0
        lon = details.get('adv_lon', 0.0) or 0.0
        type_label = _TYPE_LABELS.get(details.get('type'), '')
        if upsert_contact(pk, name, source="device", lat=lat, lon=lon, type_label=type_label):
            added += 1
    if added > 0:
        save_cache()
        logger.info(f"Initialized contacts cache from device: {added} contacts")


@@ -6,6 +6,7 @@ import logging
import re
import shlex
import threading
import time
import requests
from flask import Flask, request as flask_request
from flask_socketio import SocketIO, emit
@@ -15,6 +16,7 @@ from app.routes.api import api_bp
from app.version import VERSION_STRING, GIT_BRANCH
from app.archiver.manager import schedule_daily_archiving
from app.meshcore.cli import fetch_device_name_from_bridge
from app.contacts_cache import load_cache, scan_new_adverts, initialize_from_device
# Commands that require longer timeout (in seconds)
SLOW_COMMANDS = {
@@ -88,13 +90,53 @@ def create_app():
    else:
        logger.info("Archive scheduler disabled")

-    # Fetch device name from bridge in background thread
+    # Fetch device name from bridge in background thread (with retry)
    def init_device_name():
        device_name, source = fetch_device_name_from_bridge()
        runtime_config.set_device_name(device_name, source)
        # If we got a fallback name, keep retrying in background
        retry_delay = 5
        max_delay = 60
        while source == "fallback":
            time.sleep(retry_delay)
            device_name, source = fetch_device_name_from_bridge()
            if source != "fallback":
                runtime_config.set_device_name(device_name, source)
                logger.info(f"Device name resolved after retry: {device_name}")
                break
            retry_delay = min(retry_delay * 2, max_delay)

    threading.Thread(target=init_device_name, daemon=True).start()

    # Background thread: contacts cache initialization and periodic advert scanning
    def init_contacts_cache():
        # Wait for device name to resolve
        time.sleep(10)
        cache = load_cache()
        # Seed from device contacts if cache is empty
        if not cache:
            try:
                from app.routes.api import get_contacts_detailed_cached
                success, contacts, error = get_contacts_detailed_cached()
                if success and contacts:
                    initialize_from_device(contacts)
                    logger.info("Contacts cache seeded from device")
            except Exception as e:
                logger.error(f"Failed to seed contacts cache: {e}")
        # Periodic advert scan loop
        while True:
            time.sleep(45)
            try:
                scan_new_adverts()
            except Exception as e:
                logger.error(f"Contacts cache scan error: {e}")

    threading.Thread(target=init_contacts_cache, daemon=True).start()

    logger.info(f"mc-webui started - device: {config.MC_DEVICE_NAME}")
    logger.info(f"Messages file: {config.msgs_file_path}")
    logger.info(f"Serial port: {config.MC_SERIAL_PORT}")


@@ -397,6 +397,36 @@ def send_dm(recipient: str, text: str) -> Tuple[bool, str]:
    return success, stdout or stderr


def check_dm_delivery(ack_codes: list) -> Tuple[bool, Dict, str]:
    """
    Check delivery status for sent DMs by their expected_ack codes.

    Args:
        ack_codes: List of expected_ack hex strings from SENT_MSG log entries

    Returns:
        Tuple of (success, ack_status_dict, error_message)
        ack_status_dict maps ack_code -> ack_info dict or None
    """
    try:
        response = requests.get(
            f"{config.MC_BRIDGE_URL.replace('/cli', '/ack_status')}",
            params={'ack_codes': ','.join(ack_codes)},
            timeout=DEFAULT_TIMEOUT
        )
        if response.status_code != 200:
            return False, {}, f"Bridge error: {response.status_code}"
        data = response.json()
        return data.get('success', False), data.get('acks', {}), ''
    except requests.exceptions.ConnectionError:
        return False, {}, 'Cannot connect to bridge'
    except Exception as e:
        return False, {}, str(e)


# =============================================================================
# Contact Management (Existing & Pending Contacts)
# =============================================================================
@@ -996,7 +1026,7 @@ def fetch_device_name_from_bridge(max_retries: int = 3, retry_delay: float = 2.0
            response = requests.get(bridge_health_url, timeout=5)
            if response.status_code == 200:
                data = response.json()
-                if data.get('status') == 'healthy':
+                if data.get('status') == 'healthy' or data.get('device_name_source') == 'detected':
                    device_name = data.get('device_name')
                    source = data.get('device_name_source', 'unknown')
                    if device_name:


@@ -37,7 +37,8 @@ def parse_message(line: Dict, allowed_channels: Optional[List[int]] = None) -> O
        return None

    timestamp = line.get('timestamp', 0)
-    text = line.get('text', '').strip()
+    raw_text = line.get('text', '')
+    text = raw_text.strip()
    if not text:
        return None
@@ -69,7 +70,10 @@ def parse_message(line: Dict, allowed_channels: Optional[List[int]] = None) -> O
        'is_own': is_own,
        'snr': line.get('SNR'),
        'path_len': line.get('path_len'),
-        'channel_idx': channel_idx
+        'channel_idx': channel_idx,
+        'sender_timestamp': line.get('sender_timestamp'),
+        'txt_type': line.get('txt_type', 0),
+        'raw_text': raw_text
    }
@@ -440,7 +444,8 @@ def _parse_sent_msg(line: Dict) -> Optional[Dict]:
        'is_own': True,
        'txt_type': txt_type,
        'conversation_id': conversation_id,
-        'dedup_key': dedup_key
+        'dedup_key': dedup_key,
+        'expected_ack': line.get('expected_ack'),
    }


@@ -24,8 +24,9 @@ READ_STATUS_FILE = Path(config.MC_CONFIG_DIR) / '.read_status.json'
def _get_default_status():
    """Get default read status structure"""
    return {
-        'channels': {},  # {"0": timestamp, "1": timestamp, ...}
-        'dm': {}  # {"name_User1": timestamp, "pk_abc123": timestamp, ...}
+        'channels': {},       # {"0": timestamp, "1": timestamp, ...}
+        'dm': {},             # {"name_User1": timestamp, "pk_abc123": timestamp, ...}
+        'muted_channels': []  # [2, 5, 7] - channel indices with muted notifications
    }
@@ -50,11 +51,13 @@ def load_read_status():
            logger.warning("Invalid read status structure, resetting")
            return _get_default_status()

-        # Ensure both keys exist
+        # Ensure all keys exist
        if 'channels' not in status:
            status['channels'] = {}
        if 'dm' not in status:
            status['dm'] = {}
+        if 'muted_channels' not in status:
+            status['muted_channels'] = []

        logger.debug(f"Loaded read status: {len(status['channels'])} channels, {len(status['dm'])} DM conversations")
        return status
@@ -196,3 +199,78 @@ def get_dm_last_seen(conversation_id):
    except Exception as e:
        logger.error(f"Error getting last seen for DM {conversation_id}: {e}")
        return 0


def get_muted_channels():
    """
    Get list of muted channel indices.

    Returns:
        list: List of muted channel indices (integers)
    """
    try:
        status = load_read_status()
        return status.get('muted_channels', [])
    except Exception as e:
        logger.error(f"Error getting muted channels: {e}")
        return []


def set_channel_muted(channel_idx, muted):
    """
    Set mute state for a channel.

    Args:
        channel_idx (int): Channel index
        muted (bool): True to mute, False to unmute

    Returns:
        bool: True if successful
    """
    try:
        status = load_read_status()
        muted_list = status.get('muted_channels', [])
        channel_idx = int(channel_idx)
        if muted and channel_idx not in muted_list:
            muted_list.append(channel_idx)
        elif not muted and channel_idx in muted_list:
            muted_list.remove(channel_idx)
        status['muted_channels'] = muted_list
        success = save_read_status(status)
        if success:
            logger.info(f"Channel {channel_idx} {'muted' if muted else 'unmuted'}")
        return success
    except Exception as e:
        logger.error(f"Error setting mute for channel {channel_idx}: {e}")
        return False


def mark_all_channels_read(channel_timestamps):
    """
    Mark all channels as read in bulk.

    Args:
        channel_timestamps (dict): {"0": timestamp, "1": timestamp, ...}

    Returns:
        bool: True if successful
    """
    try:
        status = load_read_status()
        for channel_key, timestamp in channel_timestamps.items():
            status['channels'][str(channel_key)] = int(timestamp)
        success = save_read_status(status)
        if success:
            logger.info(f"Marked {len(channel_timestamps)} channels as read")
        return success
    except Exception as e:
        logger.error(f"Error marking all channels as read: {e}")
        return False


@@ -2,12 +2,16 @@
REST API endpoints for mc-webui
"""
import hashlib
import hmac as hmac_mod
import logging
import json
import re
import base64
import struct
import time
import requests
from Crypto.Cipher import AES
from datetime import datetime
from io import BytesIO
from pathlib import Path
@@ -15,6 +19,7 @@ from flask import Blueprint, jsonify, request, send_file
from app.meshcore import cli, parser
from app.config import config, runtime_config
from app.archiver import manager as archive_manager
from app.contacts_cache import get_all_names, get_all_contacts
logger = logging.getLogger(__name__)
@@ -33,6 +38,43 @@ _contacts_detailed_cache_timestamp = 0
CONTACTS_DETAILED_CACHE_TTL = 60  # seconds

ANALYZER_BASE_URL = 'https://analyzer.letsmesh.net/packets?packet_hash='
GRP_TXT_TYPE_BYTE = 0x05


def compute_analyzer_url(pkt_payload):
    """Compute MeshCore Analyzer URL from a hex-encoded pkt_payload."""
    try:
        raw = bytes([GRP_TXT_TYPE_BYTE]) + bytes.fromhex(pkt_payload)
        packet_hash = hashlib.sha256(raw).hexdigest()[:16].upper()
        return f"{ANALYZER_BASE_URL}{packet_hash}"
    except (ValueError, TypeError):
        return None


def compute_pkt_payload(channel_secret_hex, sender_timestamp, txt_type, text, attempt=0):
    """Compute pkt_payload from message data + channel secret.

    Reconstructs the encrypted GRP_TXT payload:
    channel_hash(1) + HMAC-MAC(2) + AES-128-ECB(plaintext)
    where plaintext = timestamp(4 LE) + flags(1) + text(UTF-8) + null + zero-pad.
    """
    secret = bytes.fromhex(channel_secret_hex)
    flags = ((txt_type & 0x3F) << 2) | (attempt & 0x03)
    plaintext = struct.pack('<I', sender_timestamp) + bytes([flags]) + text.encode('utf-8') + b'\x00'
    # Pad to AES block boundary (16 bytes)
    pad_len = (16 - len(plaintext) % 16) % 16
    plaintext += b'\x00' * pad_len
    # AES-128-ECB encrypt
    cipher = AES.new(secret[:16], AES.MODE_ECB)
    ciphertext = cipher.encrypt(plaintext)
    # HMAC-SHA256 truncated to 2 bytes
    mac = hmac_mod.new(secret, ciphertext, hashlib.sha256).digest()[:2]
    # Channel hash: first byte of SHA256(secret)
    chan_hash = hashlib.sha256(secret).digest()[0:1]
    return (chan_hash + mac + ciphertext).hex()
def get_channels_cached(force_refresh=False):
"""
Get channels with caching to reduce USB/meshcli calls.
@@ -321,27 +363,52 @@ def get_messages():
abs(msg['timestamp'] - ec['timestamp']) < 5):
msg['echo_count'] = ec['count']
msg['echo_paths'] = ec.get('paths', [])
pkt = ec.get('pkt_payload')
if pkt:
msg['analyzer_url'] = compute_analyzer_url(pkt)
break
# Merge incoming paths into received messages
# Match by timestamp proximity + path_len confirmation
# Deterministic matching via computed pkt_payload
incoming_by_payload = {ip['pkt_payload']: ip for ip in incoming_paths}
# Get channel secrets for payload computation
_, channels = get_channels_cached()
channel_secrets = {ch['index']: ch['key'] for ch in (channels or [])}
for msg in messages:
if not msg.get('is_own'):
best_match = None
best_delta = 10 # max 10 second window
for ip in incoming_paths:
delta = abs(msg['timestamp'] - ip['timestamp'])
if delta < best_delta:
# Prefer matches where path_len also matches
if msg.get('path_len') == ip.get('path_len'):
best_match = ip
best_delta = delta
elif best_match is None:
# Fallback: timestamp-only match
best_match = ip
best_delta = delta
if best_match:
msg['path'] = best_match['path']
if not msg.get('is_own') and msg.get('sender_timestamp') and msg.get('channel_idx') in channel_secrets:
secret = channel_secrets[msg['channel_idx']]
# Always compute attempt=0 payload for analyzer URL
base_payload = compute_pkt_payload(
secret, msg['sender_timestamp'],
msg.get('txt_type', 0), msg.get('raw_text', ''), attempt=0
)
msg['analyzer_url'] = compute_analyzer_url(base_payload)
# Try all 4 attempt values for path matching
matched = False
for attempt in range(4):
try:
computed_payload = compute_pkt_payload(
secret, msg['sender_timestamp'],
msg.get('txt_type', 0), msg.get('raw_text', ''), attempt
)
except Exception:
break
if computed_payload in incoming_by_payload:
entry = incoming_by_payload[computed_payload]
msg['paths'] = entry.get('paths', [])
matched = True
break
if not matched and incoming_by_payload:
raw = msg.get('raw_text', '')
logger.debug(
f"Echo mismatch: ts={msg.get('sender_timestamp')} "
f"ch={msg.get('channel_idx')} "
f"text_bytes={len(raw.encode('utf-8'))} "
f"base_payload={base_payload[:16]}... "
f"text_preview={raw[:40]!r}"
)
except Exception as e:
logger.debug(f"Echo data fetch failed (non-critical): {e}")
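The loop above tries all four `attempt` values because the resend counter is baked into the encrypted payload but is not stored with the decoded message. A minimal sketch of that lookup, with `compute` standing in for `compute_pkt_payload` so it stays self-contained:

```python
def match_incoming_paths(msg, secret, incoming_by_payload, compute):
    """Recompute the payload for attempt 0..3 and return the matching paths, if any."""
    for attempt in range(4):
        payload = compute(secret, msg['sender_timestamp'],
                          msg.get('txt_type', 0), msg.get('raw_text', ''), attempt)
        if payload in incoming_by_payload:
            return incoming_by_payload[payload].get('paths', [])
    return None  # no deterministic match; caller logs a debug line
```

With a stub `compute` this is easy to exercise; in the real code a miss falls through to the echo-mismatch debug log rather than raising.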
@@ -506,6 +573,46 @@ def get_contacts():
}), 500
@api_bp.route('/contacts/cached', methods=['GET'])
def get_cached_contacts():
"""
Get all known contacts from persistent cache (superset of device contacts).
Includes contacts seen via adverts even after removal from device.
Query params:
?format=names - Return just name strings for @mentions (default)
?format=full - Return full cache entries with public_key, timestamps, etc.
"""
try:
fmt = request.args.get('format', 'names')
if fmt == 'full':
contacts = get_all_contacts()
# Add public_key_prefix for display
for c in contacts:
c['public_key_prefix'] = c.get('public_key', '')[:12]
return jsonify({
'success': True,
'contacts': contacts,
'count': len(contacts)
}), 200
else:
names = get_all_names()
return jsonify({
'success': True,
'contacts': names,
'count': len(names)
}), 200
except Exception as e:
logger.error(f"Error getting cached contacts: {e}")
return jsonify({
'success': False,
'error': str(e),
'contacts': []
}), 500
def _filter_contacts_by_criteria(contacts: list, criteria: dict) -> list:
"""
Filter contacts based on cleanup criteria.
@@ -1434,6 +1541,10 @@ def get_messages_updates():
if ts > last_seen_ts:
channel_stats[ch_idx]['unread_count'] += 1
# Get muted channels to exclude from total
from app import read_status as rs
muted_channels = set(rs.get_muted_channels())
# Build response
updates = []
total_unread = 0
@@ -1445,7 +1556,10 @@ def get_messages_updates():
last_seen_ts = last_seen.get(channel_idx, 0)
has_updates = stats['latest_timestamp'] > last_seen_ts
unread_count = stats['unread_count'] if has_updates else 0
total_unread += unread_count
# Only count unmuted channels toward total
if channel_idx not in muted_channels:
total_unread += unread_count
updates.append({
'index': channel_idx,
@@ -1458,7 +1572,8 @@ def get_messages_updates():
return jsonify({
'success': True,
'channels': updates,
'total_unread': total_unread
'total_unread': total_unread,
'muted_channels': list(muted_channels)
}), 200
except Exception as e:
@@ -1571,6 +1686,23 @@ def get_dm_messages():
elif msg['direction'] == 'outgoing' and msg.get('recipient'):
display_name = msg['recipient']
# Merge delivery status from ACK tracking
ack_codes = [m['expected_ack'] for m in messages
if m.get('direction') == 'outgoing' and m.get('expected_ack')]
if ack_codes:
try:
success_ack, acks, _ = cli.check_dm_delivery(ack_codes)
if success_ack:
for msg in messages:
ack_code = msg.get('expected_ack')
if ack_code and acks.get(ack_code):
ack_info = acks[ack_code]
msg['status'] = 'delivered'
msg['delivery_snr'] = ack_info.get('snr')
msg['delivery_route'] = ack_info.get('route')
except Exception as e:
logger.debug(f"ACK status fetch failed (non-critical): {e}")
return jsonify({
'success': True,
'conversation_id': conversation_id,
@@ -2445,7 +2577,8 @@ def get_read_status_api():
return jsonify({
'success': True,
'channels': status['channels'],
'dm': status['dm']
'dm': status['dm'],
'muted_channels': status.get('muted_channels', [])
}), 200
except Exception as e:
@@ -2454,7 +2587,8 @@ def get_read_status_api():
'success': False,
'error': str(e),
'channels': {},
'dm': {}
'dm': {},
'muted_channels': []
}), 500
@@ -2821,6 +2955,65 @@ def mark_read_api():
}), 500
@api_bp.route('/read_status/mark_all_read', methods=['POST'])
def mark_all_read_api():
"""Mark all channels as read in bulk."""
try:
from app import read_status
data = request.get_json()
if not data or 'channels' not in data:
return jsonify({'success': False, 'error': 'Missing channel timestamps'}), 400
success = read_status.mark_all_channels_read(data['channels'])
if success:
return jsonify({'success': True, 'message': 'All channels marked as read'}), 200
else:
return jsonify({'success': False, 'error': 'Failed to save'}), 500
except Exception as e:
logger.error(f"Error marking all as read: {e}")
return jsonify({'success': False, 'error': str(e)}), 500
@api_bp.route('/channels/muted', methods=['GET'])
def get_muted_channels_api():
"""Get list of muted channel indices."""
try:
from app import read_status
muted = read_status.get_muted_channels()
return jsonify({'success': True, 'muted_channels': muted}), 200
except Exception as e:
logger.error(f"Error getting muted channels: {e}")
return jsonify({'success': False, 'error': str(e)}), 500
@api_bp.route('/channels/<int:index>/mute', methods=['POST'])
def set_channel_muted_api(index):
"""Set mute state for a channel."""
try:
from app import read_status
data = request.get_json()
if data is None or 'muted' not in data:
return jsonify({'success': False, 'error': 'Missing muted field'}), 400
success = read_status.set_channel_muted(index, data['muted'])
if success:
return jsonify({
'success': True,
'message': f'Channel {index} {"muted" if data["muted"] else "unmuted"}'
}), 200
else:
return jsonify({'success': False, 'error': 'Failed to save'}), 500
except Exception as e:
logger.error(f"Error setting channel mute: {e}")
return jsonify({'success': False, 'error': str(e)}), 500
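Muting is purely a notification-side filter: muted channels still receive messages and keep their per-channel counts, but are skipped when summing the badge total (the same rule the `/messages/updates` hunk applies server-side). A one-liner sketch of that exclusion:

```python
def total_unread(unread_by_channel, muted):
    """Sum unread counts, skipping muted channel indices (mirrors the badge logic)."""
    return sum(count for idx, count in unread_by_channel.items() if idx not in muted)
```

The same computation is duplicated client-side in `checkAndNotify()`, `updateAppBadge()`, and `updateUnreadBadges()`, so the badge, browser notification, and bell stay consistent.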
# ============================================================
# Console History API
# ============================================================

View File

@@ -342,28 +342,10 @@ main {
padding: 0.75rem;
}
/* Modal: Smaller channel keys on mobile */
.list-group-item small {
font-size: 0.65rem;
word-break: break-all;
}
/* Modal: Compact channel list */
.modal .list-group-item {
padding: 0.5rem;
}
/* Modal: Stack buttons vertically on very small screens */
@media (max-width: 400px) {
.btn-group {
flex-direction: column;
}
.btn-group .btn {
border-radius: 0.375rem !important;
margin-bottom: 0.25rem;
}
}
}
/* Loading State */
@@ -536,6 +518,30 @@ main {
color: #dc3545;
}
.dm-status.unknown {
color: #adb5bd;
}
.dm-status-unknown {
cursor: pointer;
position: relative;
}
.dm-delivery-popup {
position: absolute;
bottom: 100%;
right: 0;
background-color: #333;
color: #fff;
padding: 0.35rem 0.6rem;
border-radius: 0.375rem;
font-size: 0.7rem;
white-space: nowrap;
z-index: 1000;
pointer-events: none;
margin-bottom: 0.25rem;
}
/* DM Action Buttons */
.dm-actions {
display: flex;
@@ -846,6 +852,48 @@ main {
transition: transform 0.2s ease;
}
/* FAB toggle button (smaller, semi-transparent) */
.fab-toggle {
width: 32px;
height: 32px;
background: rgba(108, 117, 125, 0.6);
color: white;
font-size: 0.85rem;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.2);
}
.fab-toggle:hover {
background: rgba(108, 117, 125, 0.9);
}
/* Collapsed state - hide all FABs except toggle */
.fab-container.collapsed .fab:not(.fab-toggle) {
opacity: 0;
pointer-events: none;
transform: scale(0);
height: 0;
width: 0;
margin: 0;
overflow: hidden;
}
.fab-container.collapsed {
gap: 0;
}
/* Smooth transitions for collapse */
.fab-container .fab:not(.fab-toggle) {
transition: transform 0.2s ease, box-shadow 0.2s ease, opacity 0.2s ease, height 0.2s ease, width 0.2s ease;
}
.fab-toggle i {
transition: transform 0.2s ease;
}
.fab-container.collapsed .fab-toggle i {
transform: rotate(180deg);
}
/* Mobile optimization */
@media (max-width: 768px) {
.fab-container {
@@ -859,6 +907,12 @@ main {
height: 48px;
font-size: 1.25rem;
}
.fab-toggle {
width: 28px;
height: 28px;
font-size: 0.75rem;
}
}
/* =============================================================================
@@ -1128,22 +1182,39 @@ main {
position: relative;
}
/* Path popup (mobile-friendly tooltip) */
/* Path popup (mobile-friendly, multi-path) */
.path-popup {
position: absolute;
bottom: 100%;
left: 50%;
transform: translateX(-50%);
left: 0;
background-color: #333;
color: #fff;
padding: 0.35rem 0.6rem;
padding: 0.4rem 0.6rem;
border-radius: 0.375rem;
font-size: 0.7rem;
white-space: nowrap;
white-space: normal;
z-index: 1000;
pointer-events: none;
pointer-events: auto;
margin-bottom: 4px;
box-shadow: 0 2px 8px rgba(0, 0, 0, 0.25);
min-width: 180px;
max-width: 320px;
}
.path-popup .path-entry {
padding: 0.15rem 0;
border-bottom: 1px solid rgba(255, 255, 255, 0.15);
word-break: break-all;
}
.path-popup .path-entry:last-child {
border-bottom: none;
}
.path-popup .path-detail {
display: block;
opacity: 0.7;
font-size: 0.6rem;
}
/* =============================================================================
@@ -1179,6 +1250,15 @@ main {
visibility: visible;
}
/* Push messages container down when filter bar is visible */
.messages-container {
transition: padding-top 0.3s ease;
}
.filter-bar.visible ~ .messages-container {
padding-top: calc(1rem + 52px) !important; /* 52px ≈ filter bar height (vertical padding + 36px input + border) */
}
/* Filter bar inner layout */
.filter-bar-inner {
display: flex;
@@ -1187,7 +1267,6 @@ main {
}
.filter-bar-input {
flex: 1;
border-radius: 0.375rem;
border: 1px solid #ced4da;
padding: 0.5rem 0.75rem;
@@ -1214,6 +1293,15 @@ main {
transition: background-color 0.15s ease;
}
.filter-bar-btn-me {
background-color: #e7f1ff;
color: #0d6efd;
}
.filter-bar-btn-me:hover {
background-color: #cfe2ff;
}
.filter-bar-btn-clear {
background-color: #f8f9fa;
color: #6c757d;
@@ -1266,6 +1354,24 @@ main {
display: block;
}
/* Filter input wrapper for mentions popup positioning */
.filter-input-wrapper {
flex: 1;
position: relative;
}
.filter-input-wrapper .filter-bar-input {
width: 100%;
}
/* Filter mentions popup - appears below input (not above like message input) */
.filter-mentions-popup {
bottom: auto !important;
top: 100% !important;
margin-top: 0.25rem;
margin-bottom: 0;
}
/* Mobile responsive filter bar */
@media (max-width: 576px) {
.filter-bar {

View File

@@ -11,6 +11,7 @@ let currentChannelIdx = 0; // Current active channel (0 = Public)
let availableChannels = []; // List of channels from API
let lastSeenTimestamps = {}; // Track last seen message timestamp per channel
let unreadCounts = {}; // Track unread message counts per channel
let mutedChannels = new Set(); // Channel indices with muted notifications
// DM state (for badge updates on main page)
let dmLastSeenTimestamps = {}; // Track last seen DM timestamp per conversation
@@ -324,6 +325,9 @@ document.addEventListener('DOMContentLoaded', async function() {
// Initialize filter functionality
initializeFilter();
// Initialize FAB toggle
initializeFabToggle();
// Setup auto-refresh immediately after messages are displayed
// Don't wait for geo cache - it's not needed for auto-refresh
setupAutoRefresh();
@@ -715,13 +719,16 @@ function createMessageElement(msg) {
if (msg.path_len !== undefined && msg.path_len !== null) {
metaInfo += ` | Hops: ${msg.path_len}`;
}
if (msg.path) {
const segments = msg.path.match(/.{1,2}/g) || [];
const fullPath = segments.join(' \u2192 ');
if (msg.paths && msg.paths.length > 0) {
// Show first path inline (shortest/first arrival)
const firstPath = msg.paths[0];
const segments = firstPath.path ? firstPath.path.match(/.{1,2}/g) || [] : [];
const shortPath = segments.length > 4
? `${segments[0]}\u2192...\u2192${segments[segments.length - 1]}`
: segments.join('\u2192');
metaInfo += ` | <span class="path-info" onclick="showPathPopup(this, '${fullPath}')">Route: ${shortPath}</span>`;
const pathsData = encodeURIComponent(JSON.stringify(msg.paths));
const routeLabel = msg.paths.length > 1 ? `Route (${msg.paths.length})` : 'Route';
metaInfo += ` | <span class="path-info" onclick="showPathsPopup(this, '${pathsData}')">${routeLabel}: ${shortPath}</span>`;
}
if (msg.is_own) {
@@ -746,6 +753,11 @@ function createMessageElement(msg) {
<div class="message-content">${processMessageContent(msg.content)}</div>
<div class="message-actions justify-content-end">
${echoDisplay}
${msg.analyzer_url ? `
<button class="btn btn-outline-secondary btn-msg-action" onclick="window.open('${msg.analyzer_url}', 'meshcore-analyzer')" title="View in Analyzer">
<i class="bi bi-clipboard-data"></i>
</button>
` : ''}
<button class="btn btn-outline-secondary btn-msg-action" onclick='resendMessage(${JSON.stringify(msg.content)})' title="Resend">
<i class="bi bi-arrow-repeat"></i>
</button>
@@ -785,6 +797,11 @@ function createMessageElement(msg) {
<i class="bi bi-geo-alt"></i>
</button>
` : ''}
${msg.analyzer_url ? `
<button class="btn btn-outline-secondary btn-msg-action" onclick="window.open('${msg.analyzer_url}', 'meshcore-analyzer')" title="View in Analyzer">
<i class="bi bi-clipboard-data"></i>
</button>
` : ''}
</div>
</div>
</div>
@@ -898,22 +915,34 @@ function resendMessage(content) {
}
/**
* Show path popup on tap (mobile-friendly alternative to tooltip)
* Show paths popup on tap (mobile-friendly, shows all routes)
*/
function showPathPopup(element, fullPath) {
function showPathsPopup(element, encodedPaths) {
// Remove any existing popup
const existing = document.querySelector('.path-popup');
if (existing) existing.remove();
const paths = JSON.parse(decodeURIComponent(encodedPaths));
const popup = document.createElement('div');
popup.className = 'path-popup';
popup.textContent = `Path: ${fullPath}`;
let html = '';
paths.forEach((p, i) => {
const segments = p.path ? p.path.match(/.{1,2}/g) || [] : [];
const fullRoute = segments.join(' \u2192 ');
const snr = p.snr !== null && p.snr !== undefined ? `${p.snr.toFixed(1)} dB` : '?';
const hops = p.path_len !== null && p.path_len !== undefined ? p.path_len : segments.length;
html += `<div class="path-entry">${fullRoute}<span class="path-detail">SNR: ${snr} | Hops: ${hops}</span></div>`;
});
popup.innerHTML = html;
element.style.position = 'relative';
element.appendChild(popup);
// Auto-dismiss after 4 seconds or on outside tap
// Auto-dismiss after 8 seconds or on outside tap
const dismiss = () => popup.remove();
setTimeout(dismiss, 4000);
setTimeout(dismiss, 8000);
document.addEventListener('click', function handler(e) {
if (!element.contains(e.target)) {
dismiss();
@@ -1326,8 +1355,13 @@ let previousPendingCount = 0;
* Check if we should send notification based on count changes
*/
function checkAndNotify() {
// Calculate current totals
const currentTotalUnread = Object.values(unreadCounts).reduce((sum, count) => sum + count, 0);
// Calculate current totals (exclude muted channels)
let currentTotalUnread = 0;
for (const [idx, count] of Object.entries(unreadCounts)) {
if (!mutedChannels.has(parseInt(idx))) {
currentTotalUnread += count;
}
}
// Get DM unread count from badge
const dmBadge = document.querySelector('.fab-badge-dm');
@@ -1367,8 +1401,13 @@ function updateAppBadge() {
return;
}
// Calculate total unread
const channelUnread = Object.values(unreadCounts).reduce((sum, count) => sum + count, 0);
// Calculate total unread (exclude muted channels)
let channelUnread = 0;
for (const [idx, count] of Object.entries(unreadCounts)) {
if (!mutedChannels.has(parseInt(idx))) {
channelUnread += count;
}
}
const dmBadge = document.querySelector('.fab-badge-dm');
const dmUnread = dmBadge ? parseInt(dmBadge.textContent) || 0 : 0;
@@ -1837,7 +1876,11 @@ async function loadLastSeenTimestampsFromServer() {
for (const [key, value] of Object.entries(data.channels)) {
lastSeenTimestamps[parseInt(key)] = value;
}
console.log('Loaded channel read status from server:', lastSeenTimestamps);
// Load muted channels
if (data.muted_channels) {
mutedChannels = new Set(data.muted_channels);
}
console.log('Loaded channel read status from server:', lastSeenTimestamps, 'muted:', [...mutedChannels]);
} else {
console.warn('Failed to load read status from server, using empty state');
lastSeenTimestamps = {};
@@ -1885,6 +1928,51 @@ async function markChannelAsRead(channelIdx, timestamp) {
updateUnreadBadges();
}
/**
* Mark all channels as read (bell icon click)
*/
async function markAllChannelsRead() {
// Build list of channels with unread messages
const unreadChannels = [];
for (const [idx, count] of Object.entries(unreadCounts)) {
if (count > 0) {
const channel = availableChannels.find(ch => ch.index === parseInt(idx));
const name = channel ? channel.name : `Channel ${idx}`;
unreadChannels.push({ idx, count, name });
}
}
if (unreadChannels.length === 0) return;
// Show confirmation dialog with list of unread channels
const channelList = unreadChannels.map(ch => ` - ${ch.name} (${ch.count})`).join('\n');
if (!confirm(`Mark all messages as read?\n\nUnread channels:\n${channelList}`)) return;
// Collect latest timestamps
const now = Math.floor(Date.now() / 1000);
const timestamps = {};
for (const { idx } of unreadChannels) {
timestamps[idx] = now;
lastSeenTimestamps[parseInt(idx)] = now;
unreadCounts[idx] = 0;
}
// Update UI immediately
updateUnreadBadges();
// Save to server
try {
await fetch('/api/read_status/mark_all_read', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ channels: timestamps })
});
} catch (error) {
console.error('Error marking all as read:', error);
}
}
/**
* Check for new messages across all channels
*/
@@ -1921,6 +2009,11 @@ async function checkForUpdates() {
unreadCounts[channel.index] = channel.unread_count;
});
// Sync muted channels from server
if (data.muted_channels) {
mutedChannels = new Set(data.muted_channels);
}
// Update UI badges
updateUnreadBadges();
@@ -1957,8 +2050,8 @@ function updateUnreadBadges() {
// Get base channel name (remove existing badge if any)
let channelName = option.textContent.replace(/\s*\(\d+\)$/, '');
// Add badge if there are unread messages and it's not the current channel
if (unreadCount > 0 && channelIdx !== currentChannelIdx) {
// Add badge if there are unread messages, not current channel, and not muted
if (unreadCount > 0 && channelIdx !== currentChannelIdx && !mutedChannels.has(channelIdx)) {
option.textContent = `${channelName} (${unreadCount})`;
} else {
option.textContent = channelName;
@@ -1966,8 +2059,13 @@ function updateUnreadBadges() {
});
}
// Update notification bell
const totalUnread = Object.values(unreadCounts).reduce((sum, count) => sum + count, 0);
// Update notification bell (exclude muted channels)
let totalUnread = 0;
for (const [idx, count] of Object.entries(unreadCounts)) {
if (!mutedChannels.has(parseInt(idx))) {
totalUnread += count;
}
}
updateNotificationBell(totalUnread);
// Update app icon badge
@@ -2237,13 +2335,17 @@ function displayChannelsList(channels) {
const isPublic = channel.index === 0;
const isMuted = mutedChannels.has(channel.index);
item.innerHTML = `
<div>
<strong>${escapeHtml(channel.name)}</strong>
<br>
<small class="text-muted font-monospace">${channel.key}</small>
</div>
<div class="btn-group btn-group-sm">
<button class="btn ${isMuted ? 'btn-secondary' : 'btn-outline-secondary'}"
onclick="toggleChannelMute(${channel.index})"
title="${isMuted ? 'Unmute notifications' : 'Mute notifications'}">
<i class="bi ${isMuted ? 'bi-bell-slash' : 'bi-bell'}"></i>
</button>
<button class="btn btn-outline-primary" onclick="shareChannel(${channel.index})" title="Share">
<i class="bi bi-share"></i>
</button>
@@ -2259,6 +2361,37 @@ function displayChannelsList(channels) {
});
}
/**
* Toggle mute state for a channel
*/
async function toggleChannelMute(index) {
const newMuted = !mutedChannels.has(index);
try {
const response = await fetch(`/api/channels/${index}/mute`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ muted: newMuted })
});
const data = await response.json();
if (data.success) {
if (newMuted) {
mutedChannels.add(index);
} else {
mutedChannels.delete(index);
}
// Refresh modal list and badges
loadChannelsList();
updateUnreadBadges();
} else {
showNotification('Failed to update mute state', 'danger');
}
} catch (error) {
showNotification('Failed to update mute state', 'danger');
}
}
/**
* Delete channel
*/
@@ -2770,19 +2903,35 @@ async function loadContactsForMentions() {
}
try {
const response = await fetch('/api/contacts');
const response = await fetch('/api/contacts/cached');
const data = await response.json();
if (data.success && data.contacts) {
mentionsCache = data.contacts;
mentionsCacheTimestamp = now;
console.log(`[mentions] Cached ${mentionsCache.length} contacts`);
console.log(`[mentions] Cached ${mentionsCache.length} contacts from cache`);
}
} catch (error) {
console.error('[mentions] Error loading contacts:', error);
}
}
// =============================================================================
// FAB Toggle (Collapse/Expand)
// =============================================================================
function initializeFabToggle() {
const toggle = document.getElementById('fabToggle');
const container = document.getElementById('fabContainer');
if (!toggle || !container) return;
toggle.addEventListener('click', () => {
container.classList.toggle('collapsed');
const isCollapsed = container.classList.contains('collapsed');
toggle.title = isCollapsed ? 'Show buttons' : 'Hide buttons';
});
}
// =============================================================================
// Chat Filter Functionality
// =============================================================================
@@ -2809,9 +2958,27 @@ function initializeFilter() {
openFilterBar();
});
// Filter as user types (debounced)
// "Filter my messages" button - inserts current device name
const filterMeBtn = document.getElementById('filterMeBtn');
if (filterMeBtn) {
filterMeBtn.addEventListener('click', () => {
const deviceName = window.MC_CONFIG?.deviceName || '';
if (deviceName) {
filterInput.value = deviceName;
applyFilter(deviceName);
filterInput.focus();
}
});
}
// Filter as user types (debounced) - also check for @mentions
let filterTimeout = null;
filterInput.addEventListener('input', () => {
// Check for @mention trigger
if (handleFilterMentionInput(filterInput)) {
return; // Don't apply filter while picking a mention
}
clearTimeout(filterTimeout);
filterTimeout = setTimeout(() => {
applyFilter(filterInput.value);
@@ -2822,6 +2989,7 @@ function initializeFilter() {
filterClearBtn.addEventListener('click', () => {
filterInput.value = '';
applyFilter('');
hideFilterMentionsPopup();
filterInput.focus();
});
@@ -2830,11 +2998,30 @@ function initializeFilter() {
closeFilterBar();
});
// Keyboard shortcuts
// Keyboard shortcuts (with mentions navigation support)
filterInput.addEventListener('keydown', (e) => {
if (e.key === 'Escape') {
closeFilterBar();
// If filter mentions popup is active, handle navigation
if (filterMentionActive) {
if (handleFilterMentionKeydown(e)) return;
}
if (e.key === 'Escape') {
if (filterMentionActive) {
hideFilterMentionsPopup();
e.preventDefault();
} else {
closeFilterBar();
}
}
});
// Close filter mentions on blur
filterInput.addEventListener('blur', () => {
setTimeout(hideFilterMentionsPopup, 200);
});
// Preload contacts when filter bar is focused
filterInput.addEventListener('focus', () => {
loadContactsForMentions();
});
// Global keyboard shortcut: Ctrl+F to open filter
@@ -2871,6 +3058,7 @@ function closeFilterBar() {
filterBar.classList.remove('visible');
filterActive = false;
hideFilterMentionsPopup();
// Reset filter
filterInput.value = '';
@@ -3001,6 +3189,164 @@ function getMessageId(messageEl) {
return 'msg_' + children.indexOf(messageEl);
}
// =============================================================================
// Filter Mentions Autocomplete
// =============================================================================
let filterMentionActive = false;
let filterMentionStartPos = -1;
let filterMentionSelectedIndex = 0;
/**
* Handle input in filter bar to detect @mention trigger
* @returns {boolean} true if in mention mode (caller should skip filter apply)
*/
function handleFilterMentionInput(input) {
const cursorPos = input.selectionStart;
const text = input.value;
const textBeforeCursor = text.substring(0, cursorPos);
const lastAtPos = textBeforeCursor.lastIndexOf('@');
if (lastAtPos >= 0) {
const textAfterAt = textBeforeCursor.substring(lastAtPos + 1);
// No whitespace after @ means we're typing a mention
if (!/[\s\n]/.test(textAfterAt)) {
filterMentionStartPos = lastAtPos;
filterMentionActive = true;
showFilterMentionsPopup(textAfterAt);
return true;
}
}
if (filterMentionActive) {
hideFilterMentionsPopup();
}
return false;
}
/**
* Handle keyboard navigation in filter mentions popup
* @returns {boolean} true if the key was handled
*/
function handleFilterMentionKeydown(e) {
const popup = document.getElementById('filterMentionsPopup');
if (!popup) return false;
const items = popup.querySelectorAll('.mention-item');
if (items.length === 0) return false;
switch (e.key) {
case 'ArrowDown':
e.preventDefault();
filterMentionSelectedIndex = Math.min(filterMentionSelectedIndex + 1, items.length - 1);
updateFilterMentionHighlight(items);
return true;
case 'ArrowUp':
e.preventDefault();
filterMentionSelectedIndex = Math.max(filterMentionSelectedIndex - 1, 0);
updateFilterMentionHighlight(items);
return true;
case 'Enter':
case 'Tab':
if (items.length > 0 && filterMentionSelectedIndex < items.length) {
e.preventDefault();
const selected = items[filterMentionSelectedIndex];
if (selected && selected.dataset.contact) {
selectFilterMentionContact(selected.dataset.contact);
}
return true;
}
break;
}
return false;
}
/**
* Show filter mentions popup with filtered contacts
*/
function showFilterMentionsPopup(query) {
const popup = document.getElementById('filterMentionsPopup');
const list = document.getElementById('filterMentionsList');
// Ensure contacts are loaded
loadContactsForMentions();
const filtered = filterContacts(query);
if (filtered.length === 0) {
list.innerHTML = '<div class="mentions-empty">No contacts found</div>';
popup.classList.remove('hidden');
return;
}
if (filterMentionSelectedIndex >= filtered.length) {
filterMentionSelectedIndex = 0;
}
list.innerHTML = filtered.map((contact, index) => {
const highlighted = index === filterMentionSelectedIndex ? 'highlighted' : '';
const escapedName = escapeHtml(contact);
return `<div class="mention-item ${highlighted}" data-contact="${escapedName}" data-index="${index}">
<span class="mention-item-name">${escapedName}</span>
</div>`;
}).join('');
list.querySelectorAll('.mention-item').forEach(item => {
item.addEventListener('click', function() {
selectFilterMentionContact(this.dataset.contact);
});
});
popup.classList.remove('hidden');
}
/**
* Hide filter mentions popup
*/
function hideFilterMentionsPopup() {
const popup = document.getElementById('filterMentionsPopup');
if (popup) popup.classList.add('hidden');
filterMentionActive = false;
filterMentionStartPos = -1;
filterMentionSelectedIndex = 0;
}
/**
* Update highlight in filter mentions popup
*/
function updateFilterMentionHighlight(items) {
items.forEach((item, index) => {
if (index === filterMentionSelectedIndex) {
item.classList.add('highlighted');
item.scrollIntoView({ block: 'nearest' });
} else {
item.classList.remove('highlighted');
}
});
}
/**
* Select a contact from filter mentions and insert plain name
*/
function selectFilterMentionContact(contactName) {
const input = document.getElementById('filterInput');
const text = input.value;
// Replace from @ position to cursor with plain contact name
const beforeMention = text.substring(0, filterMentionStartPos);
const afterCursor = text.substring(input.selectionStart);
input.value = beforeMention + contactName + afterCursor;
// Set cursor position after the name
const newCursorPos = filterMentionStartPos + contactName.length;
input.setSelectionRange(newCursorPos, newCursorPos);
hideFilterMentionsPopup();
input.focus();
// Trigger filter with the new value
applyFilter(input.value);
}
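The mention trigger in `handleFilterMentionInput` reduces to one rule: take the text before the cursor, find the last `@`, and treat everything after it as the query unless it contains whitespace. A language-neutral sketch of that rule (in Python, matching the backend's language):

```python
def mention_query(text, cursor):
    """Return the in-progress @mention before the cursor, or None if not in one."""
    before = text[:cursor]
    at = before.rfind('@')
    if at < 0:
        return None
    candidate = before[at + 1:]
    # Whitespace after the @ means the mention was completed or abandoned.
    return candidate if not any(c.isspace() for c in candidate) else None
```

An empty string is a valid result (the user just typed `@`), which is why the popup shows the full contact list in that case.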
/**
* Clear filter state when messages are reloaded
* Called from displayMessages()

View File

@@ -839,6 +839,14 @@ function attachExistingEventListeners() {
});
}
// Source filter (device / cache only)
const sourceFilter = document.getElementById('sourceFilter');
if (sourceFilter) {
sourceFilter.addEventListener('change', () => {
applySortAndFilters();
});
}
// Type filter
const typeFilter = document.getElementById('typeFilter');
if (typeFilter) {
@@ -847,18 +855,15 @@ function attachExistingEventListeners() {
});
}
// Sort buttons
const sortByName = document.getElementById('sortByName');
if (sortByName) {
sortByName.addEventListener('click', () => {
handleSortChange('name');
});
}
const sortByLastAdvert = document.getElementById('sortByLastAdvert');
if (sortByLastAdvert) {
sortByLastAdvert.addEventListener('click', () => {
handleSortChange('last_advert');
// Sort dropdown
const sortSelect = document.getElementById('sortSelect');
if (sortSelect) {
sortSelect.addEventListener('change', () => {
const lastUnderscore = sortSelect.value.lastIndexOf('_');
sortBy = sortSelect.value.substring(0, lastUnderscore);
sortOrder = sortSelect.value.substring(lastUnderscore + 1);
updateURLWithSortParams();
applySortAndFilters();
});
}
@@ -1613,7 +1618,6 @@ async function loadExistingContacts() {
const emptyEl = document.getElementById('existingEmpty');
const listEl = document.getElementById('existingList');
const errorEl = document.getElementById('existingError');
const counterEl = document.getElementById('contactsCounter');
// Show loading state
if (loadingEl) loadingEl.style.display = 'block';
@@ -1622,30 +1626,55 @@ async function loadExistingContacts() {
if (errorEl) errorEl.style.display = 'none';
try {
const response = await fetch('/api/contacts/detailed');
const data = await response.json();
// Fetch device contacts and cached contacts in parallel
const [deviceResponse, cacheResponse] = await Promise.all([
fetch('/api/contacts/detailed'),
fetch('/api/contacts/cached?format=full')
]);
const deviceData = await deviceResponse.json();
const cacheData = await cacheResponse.json();
if (loadingEl) loadingEl.style.display = 'none';
if (data.success) {
existingContacts = data.contacts || [];
if (deviceData.success) {
const deviceContacts = deviceData.contacts || [];
const cachedContacts = (cacheData.success && cacheData.contacts) ? cacheData.contacts : [];
// Mark device contacts
const deviceKeySet = new Set(deviceContacts.map(c => c.public_key));
deviceContacts.forEach(c => { c.on_device = true; });
// Add cache-only contacts (not on device)
const cacheOnlyContacts = cachedContacts
.filter(c => !deviceKeySet.has(c.public_key))
.map(c => ({
name: c.name || 'Unknown',
public_key: c.public_key,
public_key_prefix: c.public_key_prefix || c.public_key.substring(0, 12),
type_label: c.type_label || '',
adv_lat: c.lat || 0,
adv_lon: c.lon || 0,
last_seen: c.last_seen || 0,
on_device: false,
source: c.source || 'cache'
}));
existingContacts = [...deviceContacts, ...cacheOnlyContacts];
filteredContacts = [...existingContacts];
// Update counter badge (in navbar)
updateCounter(data.count, data.limit);
// Update counter badge
updateCounter(deviceData.count, deviceData.limit, cachedContacts.length);
if (existingContacts.length === 0) {
// Show empty state
if (emptyEl) emptyEl.style.display = 'block';
} else {
// Apply filters and sort
applySortAndFilters();
}
} else {
console.error('Failed to load existing contacts:', data.error);
console.error('Failed to load existing contacts:', deviceData.error);
if (errorEl) {
const errorMsg = document.getElementById('existingErrorMessage');
if (errorMsg) errorMsg.textContent = data.error || 'Failed to load contacts';
if (errorMsg) errorMsg.textContent = deviceData.error || 'Failed to load contacts';
errorEl.style.display = 'block';
}
}
@@ -1660,11 +1689,15 @@ async function loadExistingContacts() {
}
}
function updateCounter(count, limit) {
function updateCounter(count, limit, totalKnown) {
const counterEl = document.getElementById('contactsCounter');
if (!counterEl) return;
counterEl.textContent = `${count} / ${limit}`;
let text = `${count} / ${limit}`;
if (totalKnown && totalKnown > count) {
text += ` (${totalKnown} cached)`;
}
counterEl.textContent = text;
counterEl.style.display = 'inline-block';
// Remove all counter classes
@@ -1691,31 +1724,11 @@ function parseSortParamsFromURL() {
console.log('Parsed sort params:', { sortBy, sortOrder });
// Update UI to reflect current sort
updateSortUI();
}
function handleSortChange(newSortBy) {
if (sortBy === newSortBy) {
// Toggle order
sortOrder = sortOrder === 'asc' ? 'desc' : 'asc';
} else {
// Change sort field
sortBy = newSortBy;
// Set default order for new field
sortOrder = newSortBy === 'name' ? 'asc' : 'desc';
// Update sort dropdown to reflect current sort
const sortSelect = document.getElementById('sortSelect');
if (sortSelect) {
sortSelect.value = `${sortBy}_${sortOrder}`;
}
console.log('Sort changed to:', { sortBy, sortOrder });
// Update URL parameters
updateURLWithSortParams();
// Update UI
updateSortUI();
// Re-apply filters and sort
applySortAndFilters();
}
function updateURLWithSortParams() {
@@ -1725,48 +1738,32 @@ function updateURLWithSortParams() {
window.history.replaceState({}, '', url);
}
function updateSortUI() {
// Update sort button active states and icons
const sortButtons = document.querySelectorAll('.sort-btn');
sortButtons.forEach(btn => {
const btnSort = btn.dataset.sort;
const icon = btn.querySelector('i');
if (btnSort === sortBy) {
// Active button
btn.classList.add('active');
if (icon) {
icon.className = sortOrder === 'asc' ? 'bi bi-sort-up' : 'bi bi-sort-down';
}
} else {
// Inactive button
btn.classList.remove('active');
if (icon) {
icon.className = 'bi bi-sort-down'; // Default icon
}
}
});
}
function applySortAndFilters() {
const searchInput = document.getElementById('searchInput');
const typeFilter = document.getElementById('typeFilter');
const sourceFilter = document.getElementById('sourceFilter');
const searchTerm = searchInput ? searchInput.value.toLowerCase() : '';
const selectedType = typeFilter ? typeFilter.value : 'ALL';
const selectedSource = sourceFilter ? sourceFilter.value : 'ALL';
// First, filter contacts
filteredContacts = existingContacts.filter(contact => {
// Type filter
if (selectedType !== 'ALL' && contact.type_label !== selectedType) {
return false;
// Source filter
if (selectedSource === 'DEVICE' && !contact.on_device) return false;
if (selectedSource === 'CACHE' && contact.on_device) return false;
// Type filter (cache-only contacts have no type_label)
if (selectedType !== 'ALL') {
if (!contact.type_label || contact.type_label !== selectedType) {
return false;
}
}
// Search filter (name or public_key_prefix)
if (searchTerm) {
const nameMatch = contact.name.toLowerCase().includes(searchTerm);
const keyMatch = contact.public_key_prefix.toLowerCase().includes(searchTerm);
const keyMatch = (contact.public_key_prefix || '').toLowerCase().includes(searchTerm);
return nameMatch || keyMatch;
}
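The filter chain above applies three checks in order: source (on-device vs cache-only), type label, then a name/key-prefix search. A minimal Python sketch of the same predicate (field names taken from the contact objects in this diff; the function name is hypothetical):

```python
def matches_filters(contact, source="ALL", type_label="ALL", term=""):
    """Mirror of the JS filter logic: source, then type, then search.

    `contact` is a dict with on_device, type_label, name, public_key_prefix.
    """
    # Source filter: DEVICE keeps on-device contacts, CACHE keeps the rest
    if source == "DEVICE" and not contact.get("on_device"):
        return False
    if source == "CACHE" and contact.get("on_device"):
        return False
    # Type filter: cache-only contacts have no type_label, so they never match
    if type_label != "ALL" and contact.get("type_label") != type_label:
        return False
    # Search filter on name or public key prefix (prefix may be missing)
    if term:
        term = term.lower()
        name = contact.get("name", "").lower()
        key = (contact.get("public_key_prefix") or "").lower()
        return term in name or term in key
    return True
```

Note that a non-`ALL` type filter intentionally hides cache-only contacts, since their type is unknown.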
@@ -1922,26 +1919,23 @@ function createExistingContactCard(contact, index) {
nameDiv.appendChild(lockIndicator);
}
// Type badge - use type_label if available, fall back to "Cache" for unknown type
const typeBadge = document.createElement('span');
typeBadge.className = 'badge type-badge';
typeBadge.textContent = contact.type_label;
// Color-code by type
switch (contact.type_label) {
case 'CLI':
typeBadge.classList.add('bg-primary');
break;
case 'REP':
typeBadge.classList.add('bg-success');
break;
case 'ROOM':
typeBadge.classList.add('bg-info');
break;
case 'SENS':
typeBadge.classList.add('bg-warning');
break;
default:
typeBadge.classList.add('bg-secondary');
if (contact.type_label) {
typeBadge.textContent = contact.type_label;
switch (contact.type_label) {
case 'CLI': typeBadge.classList.add('bg-primary'); break;
case 'REP': typeBadge.classList.add('bg-success'); break;
case 'ROOM': typeBadge.classList.add('bg-info'); break;
case 'SENS': typeBadge.classList.add('bg-warning'); break;
default: typeBadge.classList.add('bg-secondary');
}
} else {
typeBadge.textContent = 'Cache';
typeBadge.classList.add('bg-secondary');
typeBadge.title = 'Not on device - type unknown';
}
infoRow.appendChild(nameDiv);
@@ -1998,7 +1992,7 @@ function createExistingContactCard(contact, index) {
const actionsDiv = document.createElement('div');
actionsDiv.className = 'd-flex gap-2 mt-2';
// Map button (only if GPS coordinates available)
// Map button - for ANY contact with GPS coordinates
if (contact.adv_lat && contact.adv_lon && (contact.adv_lat !== 0 || contact.adv_lon !== 0)) {
const mapBtn = document.createElement('button');
mapBtn.className = 'btn btn-sm btn-outline-primary';
@@ -2007,27 +2001,27 @@ function createExistingContactCard(contact, index) {
actionsDiv.appendChild(mapBtn);
}
// Protect button
const protectBtn = document.createElement('button');
protectBtn.className = isProtected ? 'btn btn-sm btn-warning' : 'btn btn-sm btn-outline-warning';
protectBtn.innerHTML = isProtected
? '<i class="bi bi-lock-fill"></i> Protected'
: '<i class="bi bi-shield"></i> Protect';
protectBtn.onclick = () => toggleContactProtection(contact.public_key, protectBtn);
actionsDiv.appendChild(protectBtn);
// Protect & Delete buttons (only for device contacts)
if (contact.on_device !== false) {
const protectBtn = document.createElement('button');
protectBtn.className = isProtected ? 'btn btn-sm btn-warning' : 'btn btn-sm btn-outline-warning';
protectBtn.innerHTML = isProtected
? '<i class="bi bi-lock-fill"></i> Protected'
: '<i class="bi bi-shield"></i> Protect';
protectBtn.onclick = () => toggleContactProtection(contact.public_key, protectBtn);
actionsDiv.appendChild(protectBtn);
// Delete button (disabled if protected)
const deleteBtn = document.createElement('button');
deleteBtn.className = 'btn btn-sm btn-outline-danger';
deleteBtn.innerHTML = '<i class="bi bi-trash"></i> Delete';
deleteBtn.onclick = () => showDeleteModal(contact);
deleteBtn.disabled = isProtected;
if (isProtected) {
deleteBtn.title = 'Cannot delete protected contact';
const deleteBtn = document.createElement('button');
deleteBtn.className = 'btn btn-sm btn-outline-danger';
deleteBtn.innerHTML = '<i class="bi bi-trash"></i> Delete';
deleteBtn.onclick = () => showDeleteModal(contact);
deleteBtn.disabled = isProtected;
if (isProtected) {
deleteBtn.title = 'Cannot delete protected contact';
}
actionsDiv.appendChild(deleteBtn);
}
actionsDiv.appendChild(deleteBtn);
// Assemble card
card.appendChild(infoRow);
card.appendChild(keyDiv);

View File

@@ -55,6 +55,9 @@ document.addEventListener('DOMContentLoaded', async function() {
// Initialize filter functionality
initializeDmFilter();
// Initialize FAB toggle
initializeDmFabToggle();
// Setup auto-refresh
setupAutoRefresh();
});
@@ -424,13 +427,20 @@ function displayMessages(messages) {
// Status icon for own messages
let statusIcon = '';
if (msg.is_own && msg.status) {
const icons = {
'pending': '<i class="bi bi-clock dm-status pending" title="Sending..."></i>',
'delivered': '<i class="bi bi-check2 dm-status delivered" title="Delivered"></i>',
'timeout': '<i class="bi bi-x-circle dm-status timeout" title="Not delivered"></i>'
};
statusIcon = icons[msg.status] || '';
if (msg.is_own) {
if (msg.status === 'delivered') {
let title = 'Delivered';
if (msg.delivery_snr !== null && msg.delivery_snr !== undefined) {
title += `, SNR: ${msg.delivery_snr.toFixed(1)} dB`;
}
if (msg.delivery_route) title += ` (${msg.delivery_route})`;
statusIcon = `<i class="bi bi-check2 dm-status delivered" title="${title}"></i>`;
} else if (msg.status === 'pending') {
statusIcon = '<i class="bi bi-clock dm-status pending" title="Sending..."></i>';
} else {
// No ACK received — show clickable "?" with explanation
statusIcon = `<span class="dm-status-unknown" onclick="showDeliveryInfo(this)"><i class="bi bi-question-circle dm-status unknown"></i></span>`;
}
}
// Metadata for incoming messages
@@ -507,8 +517,22 @@ async function sendMessage() {
updateCharCounter();
showNotification('Message sent', 'success');
// Reload messages after short delay
setTimeout(() => loadMessages(), 1000);
// Reload messages to show sent message + ACK delivery status
// Stop early once the last own message gets a delivery checkmark
const ackRefreshDelays = [1000, 6000, 15000];
let ackRefreshIdx = 0;
const scheduleAckRefresh = () => {
if (ackRefreshIdx >= ackRefreshDelays.length) return;
const delay = ackRefreshDelays[ackRefreshIdx++];
setTimeout(async () => {
await loadMessages();
const ownMsgs = document.querySelectorAll('#dmMessagesList .dm-message.own');
const lastOwn = ownMsgs.length > 0 ? ownMsgs[ownMsgs.length - 1] : null;
const delivered = lastOwn && lastOwn.querySelector('.dm-status.delivered');
if (!delivered) scheduleAckRefresh();
}, delay);
};
scheduleAckRefresh();
} else {
showNotification('Failed to send: ' + data.error, 'danger');
}
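The refresh schedule above polls at 1 s, 6 s, and 15 s and stops early once the last own message shows a delivered checkmark. A language-neutral model of this bounded-retry-with-early-exit pattern (function names here are illustrative, not from the codebase):

```python
def refresh_until_delivered(delays, poll, sleep=lambda s: None):
    """Run poll() after each delay; stop early once poll() reports delivery.

    `poll` stands in for loadMessages() plus checking whether the last own
    message carries a .dm-status.delivered icon. Returns the attempt count.
    """
    attempts = 0
    for delay in delays:
        sleep(delay)   # in the UI this is a chained setTimeout(delay)
        attempts += 1
        if poll():     # delivered checkmark found -> stop refreshing
            break
    return attempts
```

With `delays=[1, 6, 15]` this gives at most three refreshes, matching the JS above.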
@@ -608,6 +632,29 @@ function resendMessage(content) {
input.focus();
}
/**
* Show delivery info popup (mobile-friendly, same pattern as showPathPopup)
*/
function showDeliveryInfo(element) {
const existing = document.querySelector('.dm-delivery-popup');
if (existing) existing.remove();
const popup = document.createElement('div');
popup.className = 'dm-delivery-popup';
popup.textContent = 'Delivery unknown \u2014 no ACK received. Message may still have been delivered.';
element.style.position = 'relative';
element.appendChild(popup);
const dismiss = () => popup.remove();
setTimeout(dismiss, 5000);
document.addEventListener('click', function handler(e) {
if (!element.contains(e.target)) {
dismiss();
document.removeEventListener('click', handler);
}
});
}
/**
* Setup emoji picker
*/
@@ -888,6 +935,21 @@ let dmFilterActive = false;
let currentDmFilterQuery = '';
let originalDmMessageContents = new Map();
/**
* Initialize DM FAB toggle (collapse/expand)
*/
function initializeDmFabToggle() {
const toggle = document.getElementById('dmFabToggle');
const container = document.getElementById('dmFabContainer');
if (!toggle || !container) return;
toggle.addEventListener('click', () => {
container.classList.toggle('collapsed');
const isCollapsed = container.classList.contains('collapsed');
toggle.title = isCollapsed ? 'Show buttons' : 'Hide buttons';
});
}
/**
* Initialize DM filter functionality
*/

View File

@@ -38,7 +38,7 @@
{% endif %}
</span>
<div class="d-flex align-items-center gap-2">
<div id="notificationBell" class="btn btn-outline-light btn-sm position-relative" style="cursor: default;" title="Unread messages">
<div id="notificationBell" class="btn btn-outline-light btn-sm position-relative" style="cursor: pointer;" onclick="markAllChannelsRead()" title="Mark all as read">
<i class="bi bi-bell"></i>
</div>
<select id="channelSelector" class="form-select form-select-sm" style="width: auto; min-width: 100px;" title="Select channel">

View File

@@ -29,6 +29,13 @@
<!-- Filter and Sort Toolbar -->
<div class="filter-sort-toolbar">
<!-- Source Filter -->
<select class="form-select" id="sourceFilter">
<option value="ALL">All sources</option>
<option value="DEVICE">On device</option>
<option value="CACHE">Cache only</option>
</select>
<!-- Type Filter -->
<select class="form-select" id="typeFilter">
<option value="ALL">All Types</option>
@@ -38,17 +45,13 @@
<option value="SENS">SENS</option>
</select>
<!-- Sort Buttons -->
<div class="sort-buttons">
<button class="sort-btn" data-sort="name" id="sortByName" title="Sort by contact name">
<span>Name</span>
<i class="bi bi-sort-down"></i>
</button>
<button class="sort-btn active" data-sort="last_advert" id="sortByLastAdvert" title="Sort by last advertisement time">
<span>Last advert</span>
<i class="bi bi-sort-down"></i>
</button>
</div>
<!-- Sort Dropdown -->
<select class="form-select" id="sortSelect">
<option value="last_advert_desc">Last advert ↓</option>
<option value="last_advert_asc">Last advert ↑</option>
<option value="name_asc">Name A→Z</option>
<option value="name_desc">Name Z→A</option>
</select>
</div>
<!-- Loading State -->

View File

@@ -340,38 +340,6 @@
font-size: 0.85rem;
}
.sort-buttons {
display: flex;
gap: 0.375rem;
}
.sort-btn {
display: flex;
align-items: center;
gap: 0.2rem;
padding: 0.35rem 0.5rem;
background: #f8f9fa;
border: 1px solid #dee2e6;
border-radius: 0.375rem;
cursor: pointer;
font-size: 0.85rem;
transition: all 0.2s;
white-space: nowrap;
}
.sort-btn:hover {
background: #e9ecef;
}
.sort-btn.active {
background-color: #0d6efd;
color: white;
border-color: #0d6efd;
}
.sort-btn i {
font-size: 0.85rem;
}
/* NEW: Back buttons */
.back-buttons {

View File

@@ -149,8 +149,11 @@
</div>
</div>
<!-- Floating Action Button for Filter -->
<div class="fab-container">
<!-- Floating Action Buttons -->
<div class="fab-container" id="dmFabContainer">
<button class="fab fab-toggle" id="dmFabToggle" title="Hide buttons">
<i class="bi bi-chevron-right"></i>
</button>
<button class="fab fab-filter" id="dmFilterFab" title="Filter Messages">
<i class="bi bi-funnel-fill"></i>
</button>

View File

@@ -76,7 +76,16 @@
<!-- Filter bar overlay -->
<div id="filterBar" class="filter-bar">
<div class="filter-bar-inner">
<input type="text" id="filterInput" class="filter-bar-input" placeholder="Filter messages..." autocomplete="off">
<div class="filter-input-wrapper">
<input type="text" id="filterInput" class="filter-bar-input" placeholder="Filter messages..." autocomplete="off">
<!-- Filter mentions autocomplete popup -->
<div id="filterMentionsPopup" class="mentions-popup filter-mentions-popup hidden">
<div class="mentions-list" id="filterMentionsList"></div>
</div>
</div>
<button type="button" id="filterMeBtn" class="filter-bar-btn filter-bar-btn-me" title="Filter my messages">
<i class="bi bi-person-fill"></i>
</button>
<span id="filterMatchCount" class="filter-match-count"></span>
<button type="button" id="filterClearBtn" class="filter-bar-btn filter-bar-btn-clear" title="Clear">
<i class="bi bi-x"></i>
@@ -153,7 +162,10 @@
</div>
<!-- Floating Action Buttons -->
<div class="fab-container">
<div class="fab-container" id="fabContainer">
<button class="fab fab-toggle" id="fabToggle" title="Hide buttons">
<i class="bi bi-chevron-right"></i>
</button>
<button class="fab fab-filter" id="filterFab" title="Filter Messages">
<i class="bi bi-funnel-fill"></i>
</button>

View File

@@ -37,7 +37,7 @@ services:
container_name: mc-webui
restart: unless-stopped
ports:
- "${FLASK_PORT:-5000}:5000"
- "${FLASK_PORT:-5000}:${FLASK_PORT:-5000}"
volumes:
- "${MC_CONFIG_DIR}:/root/.config/meshcore:rw"
- "${MC_ARCHIVE_DIR:-./archive}:/root/.archive/meshcore:rw"
@@ -60,7 +60,7 @@ services:
networks:
- meshcore-net
healthcheck:
test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:5000/api/status')"]
test: ["CMD", "python", "-c", "import urllib.request, os; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"FLASK_PORT\", \"5000\")}/api/status')"]
interval: 30s
timeout: 10s
retries: 3
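Both changed lines rely on the same fallback rule: use `FLASK_PORT` when set, otherwise `5000`. The healthcheck one-liner expands to roughly this (sketch; `health_url` is an illustrative name):

```python
import os

def health_url(env):
    # Same default logic as the compose healthcheck:
    # os.environ.get("FLASK_PORT", "5000") falls back when the var is unset.
    port = env.get("FLASK_PORT", "5000")
    return f"http://localhost:{port}/api/status"
```

This keeps the in-container port consistent with the published port mapping `${FLASK_PORT:-5000}:${FLASK_PORT:-5000}`.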

View File

@@ -5,6 +5,7 @@ Common issues and solutions for mc-webui.
## Table of Contents
- [Common Issues](#common-issues)
- [Device Not Responding](#device-not-responding-bridge-crash-loop)
- [Docker Commands](#docker-commands)
- [Testing Bridge API](#testing-bridge-api)
- [Backup and Restore](#backup-and-restore)
@@ -114,6 +115,39 @@ The 2-container architecture resolves common USB timeout/deadlock problems:
---
### Device not responding (bridge crash-loop)
**Symptoms:**
- `meshcore-bridge` container shows `unhealthy` status
- Bridge logs show repeated `no_event_received` errors and restarts:
```
ERROR:meshcore:Error while querying device: Event(type=<EventType.ERROR: 'command_error'>, payload={'reason': 'no_event_received'})
meshcli process died (exit code: 0)
Attempting to restart meshcli session...
```
- Device name not detected, falls back to `auto.msgs` (file not found)
- All commands (`infos`, `contacts`, etc.) time out
**What this means:**
The serial connection to the USB adapter (e.g. CP2102) is working, but the MeshCore device firmware is not responding to protocol commands. The device boots (serial port connects), but the application code is not running properly.
**What does NOT help:**
- Restarting Docker containers
- Restarting the host machine
- USB reset or USB power cycle (only resets the USB-to-UART adapter, not the MeshCore radio module)
**Fix: Re-flash the firmware**
The MeshCore device firmware is likely corrupted. Re-flash the latest firmware using the MeshCore Flasher:
1. Download the latest firmware from [MeshCore releases](https://github.com/ripplebiz/MeshCore/releases)
2. Flash using [MeshCore Flasher](https://flasher.meshcore.co) or esptool
3. Restart mc-webui: `docker compose up -d`
Causes include a power failure during an OTA update, flash memory corruption, and other hardware anomalies.
---
### Bridge connection errors
```bash

View File

@@ -166,8 +166,9 @@ Access the Direct Messages feature:
### Message Status Indicators
- **Delivered** (green checkmark) - Recipient confirmed receipt (ACK). Tap/hover for SNR and route details
- **Unknown** (gray question mark) - No ACK received. Message may still have been delivered — ACK packets are often lost over multi-hop routes. Tap the icon for details
- **Pending** (clock icon, yellow) - Message sent, awaiting delivery confirmation
- Note: Due to meshcore-cli limitations, we cannot track actual delivery status
### DM Notifications

View File

@@ -7,6 +7,7 @@ The Container Watchdog is a systemd service that monitors Docker containers and
- **Health monitoring** - Checks container status every 30 seconds
- **Automatic restart** - Restarts containers that become unhealthy
- **Auto-start stopped containers** - Starts containers that have stopped (configurable)
- **Hardware USB reset** - Performs a low-level USB bus reset if the LoRa device freezes (detected after 3 failed container restarts within 8 minutes)
- **Diagnostic logging** - Captures container logs before restart for troubleshooting
- **HTTP status endpoint** - Query container status via HTTP API
- **Restart history** - Tracks all automatic restarts with timestamps
@@ -82,6 +83,7 @@ If you need to customize the behavior, the service supports these environment va
| `LOG_FILE` | `/var/log/mc-webui-watchdog.log` | Path to log file |
| `HTTP_PORT` | `5051` | HTTP status port (0 to disable) |
| `AUTO_START` | `true` | Start stopped containers (set to `false` to disable) |
| `USB_DEVICE_PATH` | *(auto-detected)* | Path to the LoRa device (e.g., `/dev/bus/usb/001/002`) for hardware USB bus reset |
To modify defaults, create an override file:
```bash

View File

@@ -13,7 +13,7 @@ RUN apt-get update && apt-get install -y \
&& rm -rf /var/lib/apt/lists/*
# Install meshcore-cli (from PyPI)
RUN pip install --no-cache-dir meshcore-cli==1.3.21
RUN pip install --no-cache-dir meshcore-cli==1.4.2
# Copy bridge application
COPY requirements.txt .

View File

@@ -153,12 +153,46 @@ class MeshCLISession:
self.echo_lock = threading.Lock()
self.echo_log_path = self.config_dir / f"{device_name}.echoes.jsonl"
# Load persisted echo data from disk
# ACK tracking for DM delivery status
self.acks = {} # ack_code -> {snr, rssi, route, path, ts}
self.acks_file = self.config_dir / f"{device_name}.acks.jsonl"
# Load persisted data from disk
self._load_echoes()
self._load_acks()
# Start session
self._start_session()
def _update_log_paths(self, new_name):
"""Update advert/echo/ack log paths after device name detection, renaming existing files."""
new_advert = self.config_dir / f"{new_name}.adverts.jsonl"
new_echo = self.config_dir / f"{new_name}.echoes.jsonl"
new_acks = self.config_dir / f"{new_name}.acks.jsonl"
# Rename existing files if they use the old (configured) name
for old_path, new_path in [
(self.advert_log_path, new_advert),
(self.echo_log_path, new_echo),
(self.acks_file, new_acks),
]:
if old_path != new_path and old_path.exists() and not new_path.exists():
try:
old_path.rename(new_path)
logger.info(f"Renamed {old_path.name} -> {new_path.name}")
except OSError as e:
logger.warning(f"Failed to rename {old_path.name}: {e}")
self.advert_log_path = new_advert
self.echo_log_path = new_echo
self.acks_file = new_acks
logger.info(f"Log paths updated for device: {new_name}")
# Reload echo and ACK data from the correct files
# (initial _load_echoes/_load_acks may have failed with the "auto" name)
self._load_echoes()
self._load_acks()
def _start_session(self):
"""Start meshcli process and worker threads"""
logger.info(f"Starting meshcli session on {self.serial_port}")
@@ -276,6 +310,7 @@ class MeshCLISession:
if 'name' in data:
self.detected_name = data['name']
logger.info(f"Detected device name from .infos: {self.detected_name}")
self._update_log_paths(self.detected_name)
self.name_detection_done.set()
return
except json.JSONDecodeError:
@@ -311,6 +346,7 @@ class MeshCLISession:
if name_part:
self.detected_name = name_part
logger.info(f"Detected device name from prompt: {self.detected_name}")
self._update_log_paths(self.detected_name)
self.name_detection_done.set()
# Try to parse as JSON advert
@@ -324,6 +360,12 @@ class MeshCLISession:
self._process_echo(echo_data)
continue
# Try to parse as ACK packet (for DM delivery tracking)
ack_data = self._parse_ack_packet(line)
if ack_data:
self._process_ack(ack_data)
continue
# Otherwise, append to current CLI response
self._append_to_current_response(line)
@@ -543,24 +585,29 @@ class MeshCLISession:
logger.info(f"Echo: correlated pkt_payload with sent message, first path: {path}")
return
# Not a sent echo -> store as incoming message path
self.incoming_paths[pkt_payload] = {
# Not a sent echo -> accumulate as incoming message path
if pkt_payload not in self.incoming_paths:
self.incoming_paths[pkt_payload] = {
'paths': [],
'first_ts': current_time,
}
self.incoming_paths[pkt_payload]['paths'].append({
'path': path,
'snr': echo_data.get('snr'),
'path_len': echo_data.get('path_len'),
'timestamp': current_time,
}
'ts': current_time,
})
self._save_echo({
'type': 'rx_echo', 'pkt_payload': pkt_payload,
'path': path, 'snr': echo_data.get('snr'),
'path_len': echo_data.get('path_len')
})
logger.debug(f"Echo: stored incoming path {path} (path_len={echo_data.get('path_len')})")
logger.debug(f"Echo: stored incoming path {path} (path_len={echo_data.get('path_len')}, total paths: {len(self.incoming_paths[pkt_payload]['paths'])})")
# Cleanup old incoming paths (> 1 hour)
cutoff = current_time - 3600
# Cleanup old incoming paths (> 7 days, matching .echoes.jsonl retention)
cutoff = current_time - (7 * 24 * 3600)
self.incoming_paths = {k: v for k, v in self.incoming_paths.items()
if v['timestamp'] > cutoff}
if v['first_ts'] > cutoff}
def register_pending_echo(self, channel_idx, timestamp):
"""Register a sent message for echo tracking."""
@@ -570,8 +617,8 @@ class MeshCLISession:
'channel_idx': channel_idx,
'pkt_payload': None
}
# Cleanup old echo counts (> 1 hour)
cutoff = time.time() - 3600
# Cleanup old echo counts (> 7 days, matching .echoes.jsonl retention)
cutoff = time.time() - (7 * 24 * 3600)
self.echo_counts = {k: v for k, v in self.echo_counts.items()
if v['timestamp'] > cutoff}
logger.debug(f"Registered pending echo for channel {channel_idx}")
@@ -642,13 +689,18 @@ class MeshCLISession:
loaded_sent += 1
elif echo_type == 'rx_echo':
self.incoming_paths[pkt_payload] = {
if pkt_payload not in self.incoming_paths:
self.incoming_paths[pkt_payload] = {
'paths': [],
'first_ts': ts,
}
loaded_incoming += 1
self.incoming_paths[pkt_payload]['paths'].append({
'path': record.get('path', ''),
'snr': record.get('snr'),
'path_len': record.get('path_len'),
'timestamp': ts,
}
loaded_incoming += 1
'ts': ts,
})
# Rewrite file with only recent records (compact)
with open(self.echo_log_path, 'w', encoding='utf-8') as f:
@@ -660,6 +712,99 @@ class MeshCLISession:
except Exception as e:
logger.error(f"Failed to load echoes: {e}")
# =========================================================================
# ACK tracking for DM delivery status
# =========================================================================
def _parse_ack_packet(self, line):
"""Parse ACK JSON packet from stdout, return data dict or None."""
try:
data = json.loads(line)
if isinstance(data, dict) and data.get("payload_typename") == "ACK":
return {
'ack_code': data.get('pkt_payload'),
'snr': data.get('snr'),
'rssi': data.get('rssi'),
'route': data.get('route_typename'),
'path': data.get('path', ''),
'path_len': data.get('path_len', 0),
}
except (json.JSONDecodeError, ValueError):
pass
return None
def _process_ack(self, ack_data):
"""Process an ACK packet: store delivery confirmation."""
ack_code = ack_data.get('ack_code')
if not ack_code:
return
# Only store the first ACK per code (ignore duplicates from multi_acks)
if ack_code in self.acks:
logger.debug(f"ACK duplicate ignored: code={ack_code}")
return
record = {
'ack_code': ack_code,
'snr': ack_data.get('snr'),
'rssi': ack_data.get('rssi'),
'route': ack_data.get('route'),
'path': ack_data.get('path', ''),
'ts': time.time(),
}
self.acks[ack_code] = record
self._save_ack(record)
logger.info(f"ACK received: code={ack_code}, snr={ack_data.get('snr')}, route={ack_data.get('route')}")
def _save_ack(self, record):
"""Append ACK record to .acks.jsonl file."""
try:
with open(self.acks_file, 'a', encoding='utf-8') as f:
f.write(json.dumps(record, ensure_ascii=False) + '\n')
except Exception as e:
logger.error(f"Failed to save ACK: {e}")
def _load_acks(self):
"""Load ACK data from .acks.jsonl on startup with 7-day cleanup."""
if not self.acks_file.exists():
return
cutoff = time.time() - (7 * 24 * 3600) # 7 days
kept_lines = []
loaded = 0
try:
with open(self.acks_file, 'r', encoding='utf-8') as f:
for line in f:
line = line.strip()
if not line:
continue
try:
record = json.loads(line)
except json.JSONDecodeError:
continue
ts = record.get('ts', 0)
if ts < cutoff:
continue # Skip old records
kept_lines.append(line)
ack_code = record.get('ack_code')
if ack_code:
self.acks[ack_code] = record
loaded += 1
# Rewrite file with only recent records (compact)
with open(self.acks_file, 'w', encoding='utf-8') as f:
for line in kept_lines:
f.write(line + '\n')
logger.info(f"Loaded ACKs from disk: {loaded} records (kept {len(kept_lines)})")
except Exception as e:
logger.error(f"Failed to load ACKs: {e}")
def _log_advert(self, json_line):
"""Log advert JSON to .jsonl file with timestamp"""
try:
@@ -878,6 +1023,7 @@ def health():
'serial_port': MC_SERIAL_PORT,
'serial_port_source': SERIAL_PORT_SOURCE,
'advert_log': str(meshcli_session.advert_log_path) if meshcli_session else None,
'echoes_log': str(meshcli_session.echo_log_path) if meshcli_session else None,
'device_name': device_name,
'device_name_source': name_source
}), 200
@@ -1253,11 +1399,13 @@ def get_echo_counts():
{
"success": true,
"echo_counts": [
{"timestamp": 1706500000.123, "channel_idx": 0, "count": 3, "paths": ["5e", "d1", "a3"]},
{"timestamp": 1706500000.123, "channel_idx": 0, "count": 3, "paths": ["5e", "d1", "a3"], "pkt_payload": "abcd..."},
...
],
"incoming_paths": [
{"timestamp": 1706500000.456, "path": "8a40a605", "path_len": 4, "snr": 11.0},
{"pkt_payload": "efgh...", "timestamp": 1706500000.456, "paths": [
{"path": "8a40a605", "path_len": 4, "snr": 11.0, "ts": 1706500000.456}, ...
]},
...
]
}
@@ -1272,16 +1420,16 @@ def get_echo_counts():
'timestamp': data['timestamp'],
'channel_idx': data['channel_idx'],
'count': len(data['paths']),
'paths': list(data['paths'])
'paths': list(data['paths']),
'pkt_payload': pkt_payload,
})
incoming = []
for pkt_payload, data in meshcli_session.incoming_paths.items():
incoming.append({
'timestamp': data['timestamp'],
'path': data['path'],
'path_len': data.get('path_len'),
'snr': data.get('snr'),
'pkt_payload': pkt_payload,
'timestamp': data['first_ts'],
'paths': data['paths'],
})
return jsonify({
@@ -1291,6 +1439,40 @@ def get_echo_counts():
}), 200
# =============================================================================
# ACK tracking endpoint for DM delivery status
# =============================================================================
@app.route('/ack_status', methods=['GET'])
def get_ack_status():
"""
Get ACK status for sent DMs by their expected_ack codes.
Query params:
ack_codes: comma-separated list of expected_ack hex codes
Response JSON:
{
"success": true,
"acks": {
"544a4d8f": {"snr": 13.0, "rssi": -32, "route": "DIRECT", "ts": 1706500000.123},
"ff3b55ce": null
}
}
"""
if not meshcli_session:
return jsonify({'success': False, 'error': 'Not initialized'}), 503
requested = request.args.get('ack_codes', '')
codes = [c.strip() for c in requested.split(',') if c.strip()]
result = {}
for code in codes:
result[code] = meshcli_session.acks.get(code)
return jsonify({'success': True, 'acks': result}), 200
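On the consumer side, a client queries this endpoint with a comma-separated `ack_codes` parameter and receives either a record or `null` per code, as the docstring shows. A small sketch of building the query and extracting delivered codes (helper names are hypothetical; the response shape matches the example above):

```python
def build_ack_query(codes):
    """Build the ack_codes query value: comma-separated hex codes."""
    return ",".join(c.strip() for c in codes if c.strip())

def delivered_acks(response):
    """From an /ack_status body, keep only codes that were actually ACKed.

    Codes without an ACK come back as null/None per the endpoint docstring.
    """
    if not response.get("success"):
        return {}
    return {code: info for code, info in response.get("acks", {}).items()
            if info is not None}
```

The frontend can poll with this until every pending DM's code is either ACKed or aged out.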
# =============================================================================
# WebSocket handlers for console
# =============================================================================

View File

@@ -23,6 +23,9 @@ Pillow==10.1.0
# HTTP Client for MeshCore Bridge communication
requests==2.31.0
# Cryptography for pkt_payload computation (AES-128-ECB)
pycryptodome==3.21.0
# WebSocket support for console (threading mode - no gevent needed)
flask-socketio==5.3.6
python-socketio==5.10.0

scripts/check_compat.py Normal file
View File

@@ -0,0 +1,558 @@
#!/usr/bin/env python3
"""
meshcore-cli compatibility checker for mc-webui
Tests all meshcli commands and response formats used by mc-webui
against the currently running meshcore-bridge instance.
Usage (from host, piped into mc-webui container):
cd ~/mc-webui
cat scripts/check_compat.py | docker compose exec -T mc-webui python -
# Full mode (includes advert test):
cat scripts/check_compat.py | docker compose exec -T mc-webui env FULL=1 python -
"""
import json
import os
import re
import sys
import time
import requests
DEFAULT_BRIDGE_URL = "http://meshcore-bridge:5001"
# Expected fields in .contacts JSON response (per contact entry)
EXPECTED_CONTACT_FIELDS = {
"public_key", "type", "adv_name", "flags",
"out_path_len", "out_path", "last_advert",
"adv_lat", "adv_lon", "lastmod"
}
# Valid contact types in text format
VALID_CONTACT_TYPES = {"CLI", "REP", "ROOM", "SENS"}
# Expected fields in /health response
EXPECTED_HEALTH_FIELDS = {
"status", "serial_port", "device_name", "device_name_source"
}
# Channel line format: "0: Public [8b3387e9c5cdea6ac9e5edbaa115cd72]"
CHANNEL_REGEX = re.compile(r'^(\d+):\s+(.+?)\s+\[([a-f0-9]{32})\]$')
# Contacts text format: columns separated by 2+ spaces
CONTACTS_SPLIT_REGEX = re.compile(r'\s{2,}')
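The two formats these regexes target can be exercised directly; a quick sketch of parsing the documented channel line shape (the `parse_channel_line` helper is illustrative, not part of the script):

```python
import re

# Same patterns as defined in check_compat.py above
CHANNEL_REGEX = re.compile(r'^(\d+):\s+(.+?)\s+\[([a-f0-9]{32})\]$')
CONTACTS_SPLIT_REGEX = re.compile(r'\s{2,}')

def parse_channel_line(line):
    """Parse '0: Public [8b33...]' into (index, name, 32-hex key) or None."""
    m = CHANNEL_REGEX.match(line.strip())
    if not m:
        return None
    return int(m.group(1)), m.group(2), m.group(3)
```

The lazy `(.+?)` keeps the channel name from swallowing the bracketed key, and `\s{2,}` splits the contacts text format on its column gaps of two or more spaces.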
class CompatChecker:
"""Checks meshcore-cli compatibility with mc-webui"""
PASS = "PASS"
WARN = "WARN"
FAIL = "FAIL"
SKIP = "SKIP"
ERROR = "ERROR"
def __init__(self, bridge_url, full_mode=False):
self.bridge_url = bridge_url.rstrip('/')
self.full_mode = full_mode
self.results = []
def run_command(self, args, timeout=10):
"""Send command to bridge /cli endpoint. Returns parsed JSON response."""
resp = requests.post(
f"{self.bridge_url}/cli",
json={"args": args, "timeout": timeout},
headers={"Connection": "close"},
timeout=timeout + 5
)
resp.raise_for_status()
return resp.json()
def add(self, status, category, detail):
"""Record a test result."""
self.results.append((status, category, detail))
# ── Test methods ──────────────────────────────────────────────
def test_health(self):
"""Test GET /health endpoint"""
cat = "Bridge Health"
try:
resp = requests.get(f"{self.bridge_url}/health", timeout=5)
resp.raise_for_status()
data = resp.json()
missing = EXPECTED_HEALTH_FIELDS - set(data.keys())
if missing:
self.add(self.FAIL, cat, f"missing fields: {', '.join(sorted(missing))}")
return
if data["status"] != "healthy":
self.add(self.FAIL, cat, f"status={data['status']} (expected 'healthy')")
return
extra = set(data.keys()) - EXPECTED_HEALTH_FIELDS - {
"serial_port_source", "advert_log", "echoes_log"
}
detail = f"status=healthy, device={data['device_name']}"
if extra:
self.add(self.WARN, cat, f"{detail} (new fields: {', '.join(sorted(extra))})")
else:
self.add(self.PASS, cat, detail)
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_device_info(self):
"""Test infos and .infos commands"""
for cmd in ["infos", ".infos"]:
cat = f"Device Info ({cmd})"
try:
data = self.run_command([cmd], timeout=5)
if not data.get("success"):
self.add(self.FAIL, cat, f"command failed: {data.get('stderr', '')}")
continue
stdout = data.get("stdout", "").strip()
if not stdout:
self.add(self.FAIL, cat, "empty response")
continue
# Try to parse JSON from output
json_obj = self._extract_json_object(stdout)
if json_obj is None:
self.add(self.FAIL, cat, "no JSON object found in response")
continue
if "name" not in json_obj:
self.add(self.FAIL, cat, f"'name' field missing from JSON (keys: {', '.join(json_obj.keys())})")
else:
self.add(self.PASS, cat, f"JSON valid, name='{json_obj['name']}'")
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_contacts_text(self):
"""Test contacts command (text format)"""
cat = "Contacts (text)"
try:
data = self.run_command(["contacts"])
if not data.get("success"):
self.add(self.FAIL, cat, f"command failed: {data.get('stderr', '')}")
return
stdout = data.get("stdout", "").strip()
if not stdout:
self.add(self.WARN, cat, "empty response (no contacts on device)")
return
# Parse using same logic as cli.py parse_contacts()
type_counts = {"CLI": 0, "REP": 0, "ROOM": 0, "SENS": 0}
parsed = 0
unparsed_lines = []
for line in stdout.split('\n'):
line_stripped = line.strip()
if not line_stripped or line_stripped.startswith('---') or \
line.lower().startswith('contact') or line.startswith('INFO:') or \
self._is_prompt_line(line_stripped):
continue
parts = CONTACTS_SPLIT_REGEX.split(line)
if len(parts) >= 2:
contact_type = parts[1].strip()
if contact_type in VALID_CONTACT_TYPES:
type_counts[contact_type] += 1
parsed += 1
continue
unparsed_lines.append(line_stripped[:60])
if parsed == 0:
self.add(self.FAIL, cat, "no contacts parsed - format may have changed")
if unparsed_lines:
self.add(self.FAIL, cat, f"unparsed lines: {unparsed_lines[:3]}")
return
types_str = ", ".join(f"{k}={v}" for k, v in type_counts.items() if v > 0)
detail = f"{parsed} contacts parsed, types: {types_str}"
if unparsed_lines:
self.add(self.WARN, cat, f"{detail} ({len(unparsed_lines)} unparsed lines: {unparsed_lines[:3]})")
else:
self.add(self.PASS, cat, detail)
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_contacts_json(self):
"""Test .contacts command (JSON format)"""
cat = "Contacts (JSON)"
try:
data = self.run_command([".contacts"])
if not data.get("success"):
self.add(self.FAIL, cat, f"command failed: {data.get('stderr', '')}")
return
stdout = data.get("stdout", "").strip()
if not stdout:
self.add(self.WARN, cat, "empty response (no contacts on device)")
return
# Parse JSON using brace-matching (same as cli.py)
json_obj = self._extract_json_object(stdout)
if json_obj is None:
self.add(self.FAIL, cat, "no JSON object found in response")
return
if not isinstance(json_obj, dict):
self.add(self.FAIL, cat, f"expected dict, got {type(json_obj).__name__}")
return
if len(json_obj) == 0:
self.add(self.WARN, cat, "JSON valid but empty (no contacts)")
return
# Check fields in first contact entry
first_key = next(iter(json_obj))
first_contact = json_obj[first_key]
if not isinstance(first_contact, dict):
self.add(self.FAIL, cat, f"contact entry is {type(first_contact).__name__}, expected dict")
return
actual_fields = set(first_contact.keys())
missing = EXPECTED_CONTACT_FIELDS - actual_fields
extra = actual_fields - EXPECTED_CONTACT_FIELDS
detail = f"{len(json_obj)} contacts, all expected fields present"
if missing:
self.add(self.FAIL, cat, f"missing fields: {', '.join(sorted(missing))}")
elif extra:
self.add(self.WARN, cat, f"{len(json_obj)} contacts OK (new fields: {', '.join(sorted(extra))})")
else:
self.add(self.PASS, cat, detail)
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_contact_info(self):
"""Test apply_to t=1 contact_info command"""
cat = "Contact Info (apply_to)"
try:
data = self.run_command(["apply_to", "t=1", "contact_info"])
if not data.get("success"):
self.add(self.FAIL, cat, f"command failed: {data.get('stderr', '')}")
return
stdout = data.get("stdout", "").strip()
if not stdout:
self.add(self.WARN, cat, "empty response (no CLI contacts)")
return
# contact_info returns multiple JSON objects (one per contact)
json_count = 0
for line in stdout.split('\n'):
line = line.strip()
if line.startswith('{'):
try:
json.loads(line)
json_count += 1
except json.JSONDecodeError:
pass
if json_count > 0:
self.add(self.PASS, cat, f"{json_count} contact info entries parsed")
else:
# Try brace-matching for multi-line JSON
json_obj = self._extract_json_object(stdout)
if json_obj is not None:
self.add(self.PASS, cat, "contact info JSON parsed (multi-line)")
else:
self.add(self.WARN, cat, "command succeeded but no JSON found in output")
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_channels(self):
"""Test get_channels command"""
cat = "Channels"
try:
data = self.run_command(["get_channels"])
if not data.get("success"):
self.add(self.FAIL, cat, f"command failed: {data.get('stderr', '')}")
return
stdout = data.get("stdout", "").strip()
if not stdout:
self.add(self.FAIL, cat, "empty response (device should have at least Public channel)")
return
channels = []
unparsed = []
for line in stdout.split('\n'):
line = line.strip()
if not line or self._is_prompt_line(line):
continue
match = CHANNEL_REGEX.match(line)
if match:
channels.append({
'index': int(match.group(1)),
'name': match.group(2),
'key': match.group(3)
})
else:
unparsed.append(line[:60])
if not channels:
self.add(self.FAIL, cat, "no channels parsed - format may have changed")
if unparsed:
self.add(self.FAIL, cat, f"unparsed lines: {unparsed[:3]}")
return
names = ", ".join(f"{c['name']}(#{c['index']})" for c in channels)
detail = f"{len(channels)} channels: {names}"
if unparsed:
self.add(self.WARN, cat, f"{detail} ({len(unparsed)} unparsed lines: {unparsed[:3]})")
else:
self.add(self.PASS, cat, detail)
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_recv(self):
"""Test recv command (short timeout)"""
cat = "Recv"
try:
# Use short timeout - we just want to verify the command is accepted
data = self.run_command(["recv"], timeout=5)
if not data.get("success"):
stderr = data.get("stderr", "")
# Timeout is acceptable for recv (no new messages)
if "timeout" in stderr.lower():
self.add(self.PASS, cat, "command accepted (timed out - no new messages)")
else:
self.add(self.FAIL, cat, f"command failed: {stderr}")
return
stdout = data.get("stdout", "").strip()
if stdout:
self.add(self.PASS, cat, f"command accepted ({len(stdout.split(chr(10)))} lines)")
else:
self.add(self.PASS, cat, "command accepted (no new messages)")
except requests.exceptions.Timeout:
# Timeout is acceptable for recv
self.add(self.PASS, cat, "command accepted (HTTP timeout - normal for recv)")
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_settings(self):
"""Test set commands used during bridge initialization"""
settings = [
(["set", "json_log_rx", "on"], "Settings (json_log_rx)"),
(["set", "print_adverts", "on"], "Settings (print_adverts)"),
(["msgs_subscribe"], "Settings (msgs_subscribe)"),
]
for args, cat in settings:
try:
data = self.run_command(args, timeout=5)
if data.get("success"):
self.add(self.PASS, cat, "accepted")
else:
stderr = data.get("stderr", "")
stdout = data.get("stdout", "")
# Some settings return output but bridge marks as timeout
if "timeout" in stderr.lower() and not stdout:
self.add(self.WARN, cat, "possible timeout (no output)")
else:
self.add(self.FAIL, cat, f"failed: {stderr or stdout}")
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_pending_contacts(self):
"""Test GET /pending_contacts bridge endpoint"""
cat = "Pending Contacts"
try:
resp = requests.get(f"{self.bridge_url}/pending_contacts", timeout=10)
resp.raise_for_status()
data = resp.json()
if "success" not in data:
self.add(self.FAIL, cat, "response missing 'success' field")
return
if data.get("success"):
contacts = data.get("contacts", data.get("pending", []))
self.add(self.PASS, cat, f"endpoint OK ({len(contacts)} pending)")
else:
self.add(self.WARN, cat, f"endpoint returned success=false: {data.get('error', '')}")
except Exception as e:
self.add(self.ERROR, cat, str(e))
def test_advert(self):
"""Test advert command (has network side-effect)"""
cat = "Advert"
if not self.full_mode:
self.add(self.SKIP, cat, "skipped (use --full or FULL=1 to enable)")
return
try:
data = self.run_command(["advert"], timeout=10)
if data.get("success"):
self.add(self.PASS, cat, "advertisement sent")
else:
self.add(self.FAIL, cat, f"failed: {data.get('stderr', '')}")
except Exception as e:
self.add(self.ERROR, cat, str(e))
# ── Helpers ───────────────────────────────────────────────────
@staticmethod
def _is_prompt_line(line):
"""Check if line is a meshcli prompt or summary (not actual data)."""
# Prompt lines: "DeviceName|* command" or "DeviceName|*"
if '|*' in line:
return True
# Summary lines: "> 310 contacts in device"
if line.startswith('>'):
return True
return False
def _extract_json_object(self, text):
"""Extract first complete JSON object from text using brace-matching."""
depth = 0
start_idx = None
for i, char in enumerate(text):
if char == '{':
if depth == 0:
start_idx = i
depth += 1
elif char == '}':
depth -= 1
if depth == 0 and start_idx is not None:
try:
return json.loads(text[start_idx:i + 1])
except json.JSONDecodeError:
start_idx = None
continue
return None
def _get_meshcli_version(self):
"""Try to get meshcore-cli version from bridge container."""
try:
data = self.run_command(["version"], timeout=5)
if data.get("success") and data.get("stdout"):
return data["stdout"].strip()
except Exception:
pass
return "unknown"
# ── Main runner ───────────────────────────────────────────────
def run_all(self):
"""Run all tests and print report. Returns exit code."""
print()
print("meshcore-cli Compatibility Report")
print("=" * 50)
print(f"Bridge URL: {self.bridge_url}")
print(f"Mode: {'full' if self.full_mode else 'safe (read-only)'}")
print(f"Timestamp: {time.strftime('%Y-%m-%d %H:%M:%S')}")
print()
# Check bridge is reachable first
try:
requests.get(f"{self.bridge_url}/health", timeout=3)
except Exception as e:
print(f"[ERROR] Cannot reach bridge at {self.bridge_url}: {e}")
print()
print("Make sure meshcore-bridge is running:")
print(" docker compose ps")
print(" docker compose logs meshcore-bridge")
return 1
# Run all tests
tests = [
self.test_health,
self.test_device_info,
self.test_contacts_text,
self.test_contacts_json,
self.test_contact_info,
self.test_channels,
self.test_recv,
self.test_settings,
self.test_pending_contacts,
self.test_advert,
]
for test in tests:
test()
# Print results
for status, category, detail in self.results:
print(f"[{status:5s}] {category} - {detail}")
# Summary
counts = {s: 0 for s in [self.PASS, self.WARN, self.FAIL, self.SKIP, self.ERROR]}
for status, _, _ in self.results:
counts[status] += 1
total_tests = counts[self.PASS] + counts[self.FAIL] + counts[self.ERROR]
print()
print(f"Result: {counts[self.PASS]}/{total_tests} PASS", end="")
if counts[self.WARN]:
print(f", {counts[self.WARN]} WARN", end="")
if counts[self.FAIL]:
print(f", {counts[self.FAIL]} FAIL", end="")
if counts[self.ERROR]:
print(f", {counts[self.ERROR]} ERROR", end="")
if counts[self.SKIP]:
print(f", {counts[self.SKIP]} SKIP", end="")
print()
has_failures = counts[self.FAIL] > 0 or counts[self.ERROR] > 0
if has_failures:
print()
print("COMPATIBILITY ISSUES DETECTED - review FAIL/ERROR results above")
return 1 if has_failures else 0
def main():
bridge_url = os.environ.get("BRIDGE_URL", DEFAULT_BRIDGE_URL)
full_mode = os.environ.get("FULL", "").lower() in ("1", "true", "yes")
# Support --bridge-url and --full from command line too
args = sys.argv[1:]
i = 0
while i < len(args):
if args[i] == "--bridge-url" and i + 1 < len(args):
bridge_url = args[i + 1]
i += 2
elif args[i] == "--full":
full_mode = True
i += 1
elif args[i] in ("-h", "--help"):
print("Usage: check_compat.py [--bridge-url URL] [--full]")
print(f" --bridge-url Bridge URL (default: {DEFAULT_BRIDGE_URL})")
print(" Or set BRIDGE_URL env var")
print(" --full Include tests with network side-effects")
print(" Or set FULL=1 env var")
print()
print("Run from host:")
print(" cat scripts/check_compat.py | docker compose exec -T mc-webui python -")
print(" cat scripts/check_compat.py | docker compose exec -T mc-webui env FULL=1 python -")
sys.exit(0)
else:
i += 1
checker = CompatChecker(bridge_url, full_mode)
sys.exit(checker.run_all())
if __name__ == "__main__":
main()


@@ -0,0 +1,21 @@
# mc-webui Container Watchdog
The `watchdog` service is a utility that runs on the host machine hosting the Docker containers for the `mc-webui` project. It continuously monitors the health of the application's containers, in particular the `meshcore-bridge` container, which handles the physical connection to the LoRa device (such as a Heltec V3 or V4).
## Key Capabilities
- **Automated Restarts:** If a container becomes `unhealthy` or crashes, the watchdog automatically restarts it to restore service without human intervention.
- **Hardware USB Bus Reset:** If the `meshcore-bridge` container fails to recover after three restarts within eight minutes (e.g., because the LoRa device itself has frozen), the watchdog simulates a physical disconnect and reconnect of the device via a low-level USB bus reset, which clears most hardware lockups.
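The reset itself is a single `USBDEVFS_RESET` ioctl issued against the device's `/dev/bus/usb/BBB/DDD` node. A minimal standalone sketch (function names here are illustrative; the real logic, including auto-detection of the node from the serial port, lives in `scripts/watchdog/watchdog.py`):

```python
import fcntl
import os

USBDEVFS_RESET = 0x5514  # ioctl request code from <linux/usbdevice_fs.h>

def usb_bus_path(busnum: int, devnum: int) -> str:
    """Format the usbfs node the reset ioctl is issued against."""
    return f"/dev/bus/usb/{busnum:03d}/{devnum:03d}"

def reset_device(device_path: str) -> None:
    """Ask the kernel to reset and re-enumerate the device (requires root)."""
    fd = os.open(device_path, os.O_WRONLY)
    try:
        fcntl.ioctl(fd, USBDEVFS_RESET, 0)
    finally:
        os.close(fd)
```

After a successful reset the kernel re-enumerates the device, so the serial port may briefly disappear before the bridge container can reattach to it.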
## Installation / Update
You can install or update the watchdog by running the provided installer script with root privileges:
```bash
cd ~/mc-webui/scripts/watchdog
sudo ./install.sh
```
## Detailed Documentation
For full details on configuration, logs, troubleshooting, and more advanced features, please refer to the main [Container Watchdog Documentation](../../docs/watchdog.md) located in the `docs` folder.


@@ -98,6 +98,7 @@ Environment=CHECK_INTERVAL=30
Environment=LOG_FILE=${LOG_FILE}
Environment=HTTP_PORT=5051
Environment=AUTO_START=true
Environment=USB_DEVICE_PATH=${USB_DEVICE_PATH}
ExecStart=/usr/bin/python3 -u ${SCRIPT_DIR}/watchdog.py
Restart=always
RestartSec=10
@@ -144,6 +145,7 @@ echo "Features:"
echo " - Checks container health every 30 seconds"
echo " - Automatically restarts unhealthy containers"
echo " - Saves diagnostic logs before restart"
echo " - Performs hardware USB bus reset if LoRa device is stuck"
echo ""
echo "Useful commands:"
echo " systemctl status $SERVICE_NAME # Check service status"


@@ -11,6 +11,7 @@ Environment=MCWEBUI_DIR=/home/marek/mc-webui
Environment=CHECK_INTERVAL=30
Environment=LOG_FILE=/var/log/mc-webui-watchdog.log
Environment=HTTP_PORT=5051
Environment=USB_DEVICE_PATH=
ExecStart=/usr/bin/python3 -u /home/marek/mc-webui/scripts/watchdog/watchdog.py
Restart=always
RestartSec=10


@@ -26,6 +26,7 @@ import json
import subprocess
import threading
import time
import fcntl
from datetime import datetime
from http.server import HTTPServer, BaseHTTPRequestHandler
from pathlib import Path
@@ -59,6 +60,107 @@ def log(message: str, level: str = 'INFO'):
print(f"[{timestamp}] [ERROR] Failed to write to log file: {e}")
# USB Device Reset Constant
USBDEVFS_RESET = 21780 # 0x5514
def auto_detect_usb_device() -> str:
"""Attempt to auto-detect the physical USB device path (e.g., /dev/bus/usb/001/002) for LoRa."""
env_file = os.path.join(MCWEBUI_DIR, '.env')
serial_port = 'auto'
if os.path.exists(env_file):
try:
with open(env_file, 'r') as f:
for line in f:
if line.startswith('MC_SERIAL_PORT='):
serial_port = line.split('=', 1)[1].strip().strip('"\'')
break
except Exception as e:
log(f"Failed to read .env file for serial port: {e}", "WARN")
if serial_port.lower() == 'auto':
by_id_path = Path('/dev/serial/by-id')
if by_id_path.exists():
devices = list(by_id_path.iterdir())
if len(devices) == 1:
serial_port = str(devices[0])
elif len(devices) > 1:
log("Multiple serial devices found, cannot auto-detect USB device for reset", "WARN")
return None
else:
log("No serial devices found in /dev/serial/by-id", "WARN")
return None
else:
log("/dev/serial/by-id does not exist", "WARN")
return None
if not serial_port or not os.path.exists(serial_port):
log(f"Serial port {serial_port} not found", "WARN")
return None
try:
# Resolve symlink to get actual tty device (e.g., /dev/ttyACM0)
real_tty = os.path.realpath(serial_port)
tty_name = os.path.basename(real_tty)
# Find USB bus and dev number via sysfs
sysfs_path = f"/sys/class/tty/{tty_name}/device"
if not os.path.exists(sysfs_path):
log(f"Sysfs path {sysfs_path} not found", "WARN")
return None
usb_dev_dir = os.path.dirname(os.path.realpath(sysfs_path))
busnum_file = os.path.join(usb_dev_dir, "busnum")
devnum_file = os.path.join(usb_dev_dir, "devnum")
if os.path.exists(busnum_file) and os.path.exists(devnum_file):
with open(busnum_file) as f:
busnum = int(f.read().strip())
with open(devnum_file) as f:
devnum = int(f.read().strip())
return f"/dev/bus/usb/{busnum:03d}/{devnum:03d}"
log("Could not find busnum/devnum files in sysfs", "WARN")
return None
except Exception as e:
log(f"Error during USB device auto-detection: {e}", "ERROR")
return None
def reset_usb_device():
"""Perform a hardware USB bus reset on the LoRa device."""
device_path = os.environ.get('USB_DEVICE_PATH')
if not device_path:
device_path = auto_detect_usb_device()
if not device_path:
log("Cannot perform USB reset: device path could not be determined", "WARN")
return False
log(f"Performing hardware USB bus reset on {device_path}", "WARN")
try:
with open(device_path, 'w') as fd:
fcntl.ioctl(fd, USBDEVFS_RESET, 0)
log("USB bus reset successful", "INFO")
return True
except Exception as e:
log(f"USB reset failed: {e}", "ERROR")
return False
def count_recent_restarts(container_name: str, minutes: int = 8) -> int:
"""Count how many times a container was restarted in the last N minutes due to unhealthiness."""
cutoff_time = time.time() - (minutes * 60)
count = 0
for entry in restart_history:
if entry.get('container') == container_name and 'restart_success' in entry:
try:
dt = datetime.fromisoformat(entry['timestamp'])
if dt.timestamp() >= cutoff_time:
count += 1
except ValueError:
pass
return count
def run_docker_command(args: list, timeout: int = 30) -> tuple:
"""Run docker command and return (success, stdout, stderr)."""
try:
@@ -216,6 +318,14 @@ def handle_unhealthy_container(container_name: str, status: dict):
except Exception as e:
log(f"Failed to save diagnostic info: {e}", 'ERROR')
# Check if we should do a USB reset for meshcore-bridge
if container_name == 'meshcore-bridge':
recent_restarts = count_recent_restarts(container_name, minutes=8)
if recent_restarts >= 3:
log(f"{container_name} has been restarted {recent_restarts} times in the last 8 minutes. Attempting hardware USB reset.", "WARN")
if reset_usb_device():
time.sleep(2) # Give OS time to re-enumerate the device before Docker brings it back
# Restart the container
restart_success = restart_container(container_name)