Codex Refactor -- Make things more manageable and LLM-friendly

Jack Kingsman
2026-03-10 12:26:30 -07:00
committed by GitHub
83 changed files with 7172 additions and 4313 deletions
+29 -15
@@ -28,28 +28,30 @@ Ancillary AGENTS.md files which should generally not be reviewed unless specific
```
┌──────────────────────────────────────────────────────────────┐
│ Frontend (React)                                             │
│ ┌──────────┐ ┌──────────┐ ┌───────────┐ ┌──────────────────┐ │
│ │ StatusBar│ │ Sidebar  │ │MessageList│ │   MessageInput   │ │
│ └──────────┘ └──────────┘ └───────────┘ └──────────────────┘ │
│ ┌──────────────────────────────────────────────────────────┐ │
│ │ CrackerPanel (global collapsible, WebGPU cracking)       │ │
│ └──────────────────────────────────────────────────────────┘ │
│                           │                                  │
│ useWebSocket ←──── Real-time updates                         │
│                           │                                  │
│ api.ts ←──── REST API calls                                  │
└───────────────────────────┼──────────────────────────────────┘
                            │ HTTP + WebSocket (/api/*)
┌───────────────────────────┼──────────────────────────────────┐
│ Backend (FastAPI)                                            │
│ ┌──────────┐  ┌──────────┐  ┌──────────────┐  ┌───────────┐  │
│ │ Routers  │→ │ Services │→ │ Repositories │→ │ SQLite DB │  │
│ └──────────┘  └──────────┘  └──────────────┘  └───────────┘  │
│      ↓                                        ┌───────────┐  │
│ ┌──────────────────────────┐ └──────────────→ │ WebSocket │  │
│ │ Radio runtime seam +     │                  │  Manager  │  │
│ │ RadioManager lifecycle   │                  └───────────┘  │
│ │ / event adapters         │                                 │
│ └──────────────────────────┘                                 │
└───────────────────────────┼──────────────────────────────────┘
                            │ Serial / TCP / BLE
                     ┌──────┴──────┐
```
@@ -91,6 +93,15 @@ Ancillary AGENTS.md files which should generally not be reviewed unless specific
5. **Offline-capable**: Radio operates independently; server syncs when connected
6. **Auto-reconnect**: Background monitor detects disconnection and attempts reconnection
## Code Ethos
- Prefer fewer, stronger modules over many tiny wrapper files.
- Split code only when the new module owns a real invariant, workflow, or contract.
- Avoid "enterprise" indirection layers whose main job is forwarding, renaming, or prop bundling.
- For this repo, "locally dense but semantically obvious" is better than context scattered across many files.
- Use typed contracts at important boundaries such as API payloads, WebSocket events, and repository writes.
- Refactors should be behavior-preserving slices with tests around the moved seam, not aesthetic reshuffles.
## Intentional Security Design Decisions
The following are **deliberate design choices**, not bugs. They are documented in the README with appropriate warnings. Do not "fix" these or flag them as vulnerabilities.
@@ -142,7 +153,7 @@ MeshCore firmware can encode path hops as 1-byte, 2-byte, or 3-byte identifiers.
1. User types message → clicks send
2. `api.sendChannelMessage()` → POST to backend
-3. Backend calls `radio_manager.meshcore.commands.send_chan_msg()`
+3. Backend route delegates to service-layer send orchestration, which acquires the radio lock and calls MeshCore commands
4. Message stored in database with `outgoing=true`
5. For direct messages: ACK tracked; for channel: repeat detection
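The steps above can be sketched as follows; `radio` and `db` are placeholder stubs for illustration, not the repo's real interfaces:

```python
import asyncio

# Illustrative sketch of steps 2-4 above (hypothetical names throughout).
async def send_channel_message(radio, db, channel_idx: int, text: str) -> dict:
    await radio.send_chan_msg(channel_idx, text)   # step 3: radio command via the service layer
    message = {"text": text, "outgoing": True}     # step 4: persisted as outgoing
    message["id"] = await db.insert(message)
    return message

class FakeRadio:
    async def send_chan_msg(self, idx, text):
        self.sent = (idx, text)

class FakeDb:
    async def insert(self, row):
        return 1

msg = asyncio.run(send_channel_message(FakeRadio(), FakeDb(), 0, "hi"))
print(msg)  # {'text': 'hi', 'outgoing': True, 'id': 1}
```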
@@ -162,6 +173,7 @@ This message-layer echo/path handling is independent of raw-packet storage dedup
│ ├── AGENTS.md # Backend documentation
│ ├── main.py # App entry, lifespan
│ ├── routers/ # API endpoints
│ ├── services/ # Shared backend orchestration/domain services, including radio_runtime access seam
│ ├── packet_processor.py # Raw packet pipeline, dedup, path handling
│ ├── repository/ # Database CRUD (contacts, channels, messages, raw_packets, settings, fanout)
│ ├── event_handlers.py # Radio events
@@ -171,7 +183,7 @@ This message-layer echo/path handling is independent of raw-packet storage dedup
├── frontend/ # React frontend
│ ├── AGENTS.md # Frontend documentation
│ ├── src/
-│   │   ├── App.tsx        # Main component
+│   │   ├── App.tsx        # Frontend composition entry (hooks → AppShell)
│ │ ├── api.ts # REST client
│ │ ├── useWebSocket.ts # WebSocket hook
│ │ └── components/
@@ -250,6 +262,8 @@ Key test files:
- `tests/test_messages_search.py` - Message search, around endpoint, forward pagination
- `tests/test_rx_log_data.py` - on_rx_log_data event handler integration
- `tests/test_ack_tracking_wiring.py` - DM ACK tracking extraction and wiring
- `tests/test_radio_lifecycle_service.py` - Radio reconnect/setup orchestration helpers
- `tests/test_radio_commands_service.py` - Radio config/private-key service workflows
- `tests/test_health_mqtt_status.py` - Health endpoint MQTT status field
- `tests/test_community_mqtt.py` - Community MQTT publisher (JWT, packet format, hash, broadcast)
- `tests/test_radio_sync.py` - Radio sync, periodic tasks, and contact offload back to the radio
+31 -9
@@ -11,6 +11,14 @@ Keep it aligned with `app/` source files and router behavior.
- MeshCore Python library (`meshcore` from PyPI)
- PyCryptodome
## Code Ethos
- Prefer strong domain modules over layers of pass-through helpers.
- Split code when the new module owns real policy, not just a nicer name.
- Avoid wrapper services around globals unless they materially improve testability or reduce coupling.
- Keep workflows locally understandable; do not scatter one reasoning unit across several files without a clear contract.
- Typed write/read contracts are preferred over loose dict-shaped repository inputs.
## Backend Map
```text
@@ -19,13 +27,22 @@ app/
├── config.py # Env-driven runtime settings
├── database.py # SQLite connection + base schema + migration runner
├── migrations.py # Schema migrations (SQLite user_version)
-├── models.py             # Pydantic request/response models
+├── models.py             # Pydantic request/response models and typed write contracts (for example ContactUpsert)
├── repository/ # Data access layer (contacts, channels, messages, raw_packets, settings, fanout)
-├── radio.py              # RadioManager + auto-reconnect monitor
+├── services/             # Shared orchestration/domain services
+│   ├── messages.py       # Shared message creation, dedup, ACK application
+│   ├── message_send.py   # Direct send, channel send, resend workflows
+│   ├── dm_ack_tracker.py # Pending DM ACK state
+│   ├── contact_reconciliation.py  # Prefix-claim, sender-key backfill, name-history wiring
+│   ├── radio_lifecycle.py  # Post-connect setup and reconnect/setup helpers
+│   ├── radio_commands.py   # Radio config/private-key command workflows
+│   └── radio_runtime.py    # Router/dependency seam over the global RadioManager
+├── radio.py              # RadioManager transport/session state + lock management
├── radio_sync.py # Polling, sync, periodic advertisement loop
├── decoder.py # Packet parsing/decryption
├── packet_processor.py # Raw packet pipeline, dedup, path handling
├── event_handlers.py # MeshCore event subscriptions and ACK tracking
├── events.py # Typed WS event payload serialization
├── websocket.py # WS manager + broadcast helpers
├── fanout/ # Fanout bus: MQTT, bots, webhooks, Apprise (see fanout/AGENTS_fanout.md)
├── dependencies.py # Shared FastAPI dependency providers
```
@@ -53,13 +70,13 @@ app/
1. Radio emits events.
2. `on_rx_log_data` stores raw packet and tries decrypt/pipeline handling.
-3. Decrypted messages are inserted into `messages` and broadcast over WS.
+3. Shared message-domain services create/update `messages` and shape WS payloads.
4. `CONTACT_MSG_RECV` is a fallback DM path when packet pipeline cannot decrypt.
### Outgoing messages
-1. Send endpoints in `routers/messages.py` call MeshCore commands.
-2. Message is persisted as outgoing.
+1. Send endpoints in `routers/messages.py` validate requests and delegate to `services/message_send.py`.
+2. Service-layer send workflows call MeshCore commands, persist outgoing messages, and wire ACK tracking.
3. Endpoint broadcasts WS `message` event so all live clients update.
4. ACK/repeat updates arrive later as `message_acked` events.
5. Channel resend (`POST /messages/channel/{id}/resend`) strips the sender name prefix by exact match against the current radio name. This assumes the radio name hasn't changed between the original send and the resend. Name changes require an explicit radio config update and are rare, but the `new_timestamp=true` resend path has no time window, so a mismatch is possible if the name was changed between the original send and a later resend.
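The exact-match strip described in step 5 amounts to the following (hypothetical helper name, not the repo's code):

```python
def strip_sender_prefix(text: str, radio_name: str) -> str:
    """Remove the 'Name: ' prefix only when it exactly matches the current radio name."""
    prefix = f"{radio_name}: "
    return text[len(prefix):] if text.startswith(prefix) else text

print(strip_sender_prefix("Alice: hello", "Alice"))  # hello
print(strip_sender_prefix("Alice: hello", "Bob"))    # Alice: hello  (name changed → prefix kept)
```

The second case is exactly the documented mismatch: after a radio rename, the old prefix no longer matches and is resent verbatim.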
@@ -67,9 +84,10 @@ app/
### Connection lifecycle
- `RadioManager.start_connection_monitor()` checks health every 5s.
-- Monitor reconnect path runs `post_connect_setup()` before broadcasting healthy state.
-- Manual reconnect/reboot endpoints call `reconnect()` then `post_connect_setup()`.
-- Setup includes handler registration, key export, time sync, contact/channel sync, polling/advert tasks.
+- `RadioManager.post_connect_setup()` delegates to `services/radio_lifecycle.py`.
+- Routers, startup/lifespan code, fanout helpers, and `radio_sync.py` should reach radio state through `services/radio_runtime.py`, not by importing `app.radio.radio_manager` directly.
+- Shared reconnect/setup helpers in `services/radio_lifecycle.py` are used by startup, the monitor, and manual reconnect/reboot flows before broadcasting healthy state.
+- Setup still includes handler registration, key export, time sync, contact/channel sync, polling/advert tasks.
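The ordering contract in this lifecycle — setup must finish before a healthy state is broadcast — can be sketched like this (a simplified stand-in for the shared helper, not the repo's implementation):

```python
import asyncio

# Every entry point (startup, monitor, manual reconnect) runs setup BEFORE
# reporting healthy, so clients never see "connected" with handlers missing.
async def reconnect_and_prepare(runtime) -> bool:
    if not await runtime.reconnect():
        return False
    await runtime.post_connect_setup()   # handlers, key export, time sync, syncs
    runtime.broadcast_health(True)       # healthy state only after setup completes
    return True

class FakeRuntime:
    def __init__(self):
        self.calls = []
    async def reconnect(self):
        self.calls.append("reconnect")
        return True
    async def post_connect_setup(self):
        self.calls.append("setup")
    def broadcast_health(self, ok):
        self.calls.append(f"health:{ok}")

rt = FakeRuntime()
assert asyncio.run(reconnect_and_prepare(rt))
print(rt.calls)  # ['reconnect', 'setup', 'health:True']
```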
## Important Behaviors
@@ -215,7 +233,7 @@ app/
- `error` — toast notification (reconnect failure, missing private key, etc.)
- `success` — toast notification (historical decrypt complete, etc.)
-Initial WS connect sends `health` only. Contacts/channels are loaded by REST.
+Backend WS sends go through typed serialization in `events.py`. Initial WS connect sends `health` only. Contacts/channels are loaded by REST.
Client sends `"ping"` text; server replies `{"type":"pong"}`.
## Data Model Notes
@@ -230,6 +248,8 @@ Main tables:
- `contact_name_history` (tracks name changes over time)
- `app_settings`
Repository writes should prefer typed models such as `ContactUpsert` over ad hoc dict payloads when adding or updating schema-coupled data.
`app_settings` fields in active model:
- `max_radio_contacts`
- `favorites`
@@ -289,6 +309,8 @@ tests/
├── test_packet_pipeline.py # End-to-end packet processing
├── test_packets_router.py # Packets router endpoints (decrypt, maintenance)
├── test_radio.py # RadioManager, serial detection
├── test_radio_commands_service.py # Radio config/private-key service workflows
├── test_radio_lifecycle_service.py # Reconnect/setup orchestration helpers
├── test_real_crypto.py # Real cryptographic operations
├── test_radio_operation.py # radio_operation() context manager
├── test_radio_router.py # Radio router endpoints
+3 -12
@@ -1,17 +1,8 @@
"""Shared dependencies for FastAPI routers."""
from fastapi import HTTPException
-from app.radio import radio_manager
+from app.services.radio_runtime import radio_runtime as radio_manager
def require_connected():
-    """Dependency that ensures radio is connected and returns meshcore instance.
-
-    Raises HTTPException 503 if radio is not connected.
-    """
-    if getattr(radio_manager, "is_setup_in_progress", False) is True:
-        raise HTTPException(status_code=503, detail="Radio is initializing")
-    if not radio_manager.is_connected or radio_manager.meshcore is None:
-        raise HTTPException(status_code=503, detail="Radio not connected")
-    return radio_manager.meshcore
+    """Dependency that ensures radio is connected and returns meshcore instance."""
+    return radio_manager.require_connected()
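The checks that moved behind `require_connected()` amount to roughly the following (a hand-rolled sketch of the seam, not the repo's exact code; the real router layer converts the error into an HTTP 503):

```python
class RadioUnavailableError(Exception):
    """Mapped to HTTP 503 at the router layer (sketch)."""

class RadioRuntime:
    def __init__(self, meshcore=None, is_connected=False, is_setup_in_progress=False):
        self.meshcore = meshcore
        self.is_connected = is_connected
        self.is_setup_in_progress = is_setup_in_progress

    def require_connected(self):
        # Initialization wins over "not connected" so clients see a clearer reason.
        if self.is_setup_in_progress:
            raise RadioUnavailableError("Radio is initializing")
        if not self.is_connected or self.meshcore is None:
            raise RadioUnavailableError("Radio not connected")
        return self.meshcore

try:
    RadioRuntime().require_connected()
except RadioUnavailableError as exc:
    print(exc)  # Radio not connected
```

Centralizing the guard on the runtime object means routers depend on one method instead of re-checking three attributes.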
+38 -78
@@ -4,14 +4,18 @@ from typing import TYPE_CHECKING
from meshcore import EventType
-from app.models import CONTACT_TYPE_REPEATER, Contact, Message, MessagePath
+from app.models import CONTACT_TYPE_REPEATER, Contact, ContactUpsert
from app.packet_processor import process_raw_packet
from app.repository import (
    AmbiguousPublicKeyPrefixError,
    ContactNameHistoryRepository,
    ContactRepository,
    MessageRepository,
)
from app.services import dm_ack_tracker
from app.services.contact_reconciliation import (
    claim_prefix_messages_for_contact,
    record_contact_name_and_reconcile,
)
from app.services.messages import create_fallback_direct_message, increment_ack_and_broadcast
from app.websocket import broadcast_event
if TYPE_CHECKING:
@@ -22,33 +26,17 @@ logger = logging.getLogger(__name__)
# Track active subscriptions so we can unsubscribe before re-registering
# This prevents handler duplication after reconnects
_active_subscriptions: list["Subscription"] = []
-# Track pending ACKs: expected_ack_code -> (message_id, timestamp, timeout_ms)
-_pending_acks: dict[str, tuple[int, float, int]] = {}
+_pending_acks = dm_ack_tracker._pending_acks
def track_pending_ack(expected_ack: str, message_id: int, timeout_ms: int) -> None:
-    """Track a pending ACK for a direct message."""
-    _pending_acks[expected_ack] = (message_id, time.time(), timeout_ms)
-    logger.debug(
-        "Tracking pending ACK %s for message %d (timeout %dms)",
-        expected_ack,
-        message_id,
-        timeout_ms,
-    )
+    """Compatibility wrapper for pending DM ACK tracking."""
+    dm_ack_tracker.track_pending_ack(expected_ack, message_id, timeout_ms)
def cleanup_expired_acks() -> None:
-    """Remove expired pending ACKs."""
-    now = time.time()
-    expired = []
-    for code, (_msg_id, created_at, timeout_ms) in _pending_acks.items():
-        if now - created_at > (timeout_ms / 1000) * 2:  # 2x timeout as buffer
-            expired.append(code)
-    for code in expired:
-        del _pending_acks[code]
-        logger.debug("Expired pending ACK %s", code)
+    """Compatibility wrapper for expiring stale DM ACK entries."""
+    dm_ack_tracker.cleanup_expired_acks()
async def on_contact_message(event: "Event") -> None:
@@ -90,7 +78,7 @@ async def on_contact_message(event: "Event") -> None:
sender_pubkey = contact.public_key.lower()
# Promote any prefix-stored messages to this full key
-    await MessageRepository.claim_prefix_messages(sender_pubkey)
+    await claim_prefix_messages_for_contact(public_key=sender_pubkey, log=logger)
# Skip messages from repeaters - they only send CLI responses, not chat messages.
# CLI responses are handled by the command endpoint and txt_type filter above.
@@ -108,21 +96,21 @@ async def on_contact_message(event: "Event") -> None:
sender_name = contact.name if contact else None
path = payload.get("path")
path_len = payload.get("path_len")
-    msg_id = await MessageRepository.create(
-        msg_type="PRIV",
-        text=payload.get("text", ""),
+    message = await create_fallback_direct_message(
+        conversation_key=sender_pubkey,
+        text=payload.get("text", ""),
        sender_timestamp=sender_timestamp,
        received_at=received_at,
        path=path,
        path_len=path_len,
        txt_type=txt_type,
        signature=payload.get("signature"),
-        sender_key=sender_pubkey,
        sender_name=sender_name,
+        sender_key=sender_pubkey,
+        broadcast_fn=broadcast_event,
    )
-    if msg_id is None:
+    if message is None:
# Already handled by packet processor (or exact duplicate) - nothing more to do
logger.debug("DM from %s already processed by packet processor", sender_pubkey[:12])
return
@@ -131,31 +119,6 @@ async def on_contact_message(event: "Event") -> None:
# (likely because private key export is not available)
logger.debug("DM from %s handled by event handler (fallback path)", sender_pubkey[:12])
-    # Build paths array for broadcast
-    paths = (
-        [MessagePath(path=path or "", received_at=received_at, path_len=path_len)]
-        if path is not None
-        else None
-    )
-
-    # Broadcast the new message
-    broadcast_event(
-        "message",
-        Message(
-            id=msg_id,
-            type="PRIV",
-            conversation_key=sender_pubkey,
-            text=payload.get("text", ""),
-            sender_timestamp=sender_timestamp,
-            received_at=received_at,
-            paths=paths,
-            txt_type=txt_type,
-            signature=payload.get("signature"),
-            sender_key=sender_pubkey,
-            sender_name=sender_name,
-        ).model_dump(),
-    )
# Update contact last_contacted (contact was already fetched above)
if contact:
await ContactRepository.update_last_contacted(sender_pubkey, received_at)
@@ -265,30 +228,29 @@ async def on_new_contact(event: "Event") -> None:
logger.debug("New contact: %s", public_key[:12])
-    contact_data = {
-        **Contact.from_radio_dict(public_key.lower(), payload, on_radio=True),
-        "last_seen": int(time.time()),
-    }
-    await ContactRepository.upsert(contact_data)
+    contact_upsert = ContactUpsert.from_radio_dict(public_key.lower(), payload, on_radio=True)
+    contact_upsert.last_seen = int(time.time())
+    await ContactRepository.upsert(contact_upsert)
# Record name history if contact has a name
adv_name = payload.get("adv_name")
if adv_name:
-        await ContactNameHistoryRepository.record_name(
-            public_key.lower(), adv_name, int(time.time())
-        )
-        backfilled = await MessageRepository.backfill_channel_sender_key(public_key, adv_name)
-        if backfilled > 0:
-            logger.info(
-                "Backfilled sender_key on %d channel message(s) for %s",
-                backfilled,
-                adv_name,
-            )
+        await record_contact_name_and_reconcile(
+            public_key=public_key,
+            contact_name=adv_name,
+            timestamp=int(time.time()),
+            log=logger,
+        )
# Read back from DB so the broadcast includes all fields (last_contacted,
# last_read_at, etc.) matching the REST Contact shape exactly.
db_contact = await ContactRepository.get_by_key(public_key)
-    broadcast_event("contact", (db_contact.model_dump() if db_contact else contact_data))
+    broadcast_event(
+        "contact",
+        (
+            db_contact.model_dump()
+            if db_contact
+            else Contact(**contact_upsert.model_dump(exclude_none=True)).model_dump()
+        ),
+    )
async def on_ack(event: "Event") -> None:
@@ -304,15 +266,13 @@ async def on_ack(event: "Event") -> None:
cleanup_expired_acks()
-    if ack_code in _pending_acks:
-        message_id, _, _ = _pending_acks.pop(ack_code)
+    message_id = dm_ack_tracker.pop_pending_ack(ack_code)
+    if message_id is not None:
logger.info("ACK received for message %d", message_id)
-        ack_count = await MessageRepository.increment_ack_count(message_id)
        # DM ACKs don't carry path data, so paths is intentionally omitted.
        # The frontend's mergePendingAck handles the missing field correctly,
        # preserving any previously known paths.
-        broadcast_event("message_acked", {"message_id": message_id, "ack_count": ack_count})
+        await increment_ack_and_broadcast(message_id=message_id, broadcast_fn=broadcast_event)
else:
logger.debug("ACK code %s does not match any pending messages", ack_code)
+107
@@ -0,0 +1,107 @@
"""Typed WebSocket event contracts and serialization helpers."""

import json
import logging
from typing import Any, Literal

from pydantic import TypeAdapter
from typing_extensions import NotRequired, TypedDict

from app.models import Channel, Contact, Message, MessagePath, RawPacketBroadcast
from app.routers.health import HealthResponse

logger = logging.getLogger(__name__)

WsEventType = Literal[
    "health",
    "message",
    "contact",
    "channel",
    "contact_deleted",
    "channel_deleted",
    "raw_packet",
    "message_acked",
    "error",
    "success",
]


class ContactDeletedPayload(TypedDict):
    public_key: str


class ChannelDeletedPayload(TypedDict):
    key: str


class MessageAckedPayload(TypedDict):
    message_id: int
    ack_count: int
    paths: NotRequired[list[MessagePath]]


class ToastPayload(TypedDict):
    message: str
    details: NotRequired[str]


WsEventPayload = (
    HealthResponse
    | Message
    | Contact
    | Channel
    | ContactDeletedPayload
    | ChannelDeletedPayload
    | RawPacketBroadcast
    | MessageAckedPayload
    | ToastPayload
)

_PAYLOAD_ADAPTERS: dict[WsEventType, TypeAdapter[Any]] = {
    "health": TypeAdapter(HealthResponse),
    "message": TypeAdapter(Message),
    "contact": TypeAdapter(Contact),
    "channel": TypeAdapter(Channel),
    "contact_deleted": TypeAdapter(ContactDeletedPayload),
    "channel_deleted": TypeAdapter(ChannelDeletedPayload),
    "raw_packet": TypeAdapter(RawPacketBroadcast),
    "message_acked": TypeAdapter(MessageAckedPayload),
    "error": TypeAdapter(ToastPayload),
    "success": TypeAdapter(ToastPayload),
}


def validate_ws_event_payload(event_type: str, data: Any) -> WsEventPayload | Any:
    """Validate known WebSocket payloads; pass unknown events through unchanged."""
    adapter = _PAYLOAD_ADAPTERS.get(event_type)  # type: ignore[arg-type]
    if adapter is None:
        return data
    return adapter.validate_python(data)


def dump_ws_event(event_type: str, data: Any) -> str:
    """Serialize a WebSocket event envelope with validation for known event types."""
    adapter = _PAYLOAD_ADAPTERS.get(event_type)  # type: ignore[arg-type]
    if adapter is None:
        return json.dumps({"type": event_type, "data": data})
    try:
        validated = adapter.validate_python(data)
        payload = adapter.dump_python(validated, mode="json")
        return json.dumps({"type": event_type, "data": payload})
    except Exception:
        logger.exception(
            "Failed to validate WebSocket payload for event %s; falling back to raw JSON envelope",
            event_type,
        )
        return json.dumps({"type": event_type, "data": data})


def dump_ws_event_payload(event_type: str, data: Any) -> Any:
    """Return the JSON-serializable payload for a WebSocket event."""
    adapter = _PAYLOAD_ADAPTERS.get(event_type)  # type: ignore[arg-type]
    if adapter is None:
        return data
    validated = adapter.validate_python(data)
    return adapter.dump_python(validated, mode="json")
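The validate-known / pass-through-unknown dispatch in this module can be distilled to a stdlib-only sketch (the real module uses Pydantic `TypeAdapter`s; the validator functions here are hypothetical):

```python
import json

def make_dumper(validators: dict):
    """Build a dump function from a registry of per-event validators (sketch)."""
    def dump_ws_event(event_type: str, data):
        validate = validators.get(event_type)
        if validate is None:
            # Unknown event types pass through unchanged.
            return json.dumps({"type": event_type, "data": data})
        try:
            payload = validate(data)
        except Exception:
            payload = data  # fall back to the raw envelope rather than drop the event
        return json.dumps({"type": event_type, "data": payload})
    return dump_ws_event

def check_ack(data):
    if not isinstance(data["message_id"], int):
        raise TypeError("message_id must be int")
    return data

dump = make_dumper({"message_acked": check_ack})
print(dump("message_acked", {"message_id": 7, "ack_count": 1}))
print(dump("mystery", [1, 2]))  # {"type": "mystery", "data": [1, 2]}
```

The fallback branch mirrors the module's design choice: a malformed payload is logged and sent raw instead of silently disappearing from live clients.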
+7 -5
@@ -244,7 +244,7 @@ def _build_radio_info() -> str:
Matches the reference format: ``"freq,bw,sf,cr"`` (comma-separated raw
values). Falls back to ``"0,0,0,0"`` when unavailable.
"""
-    from app.radio import radio_manager
+    from app.services.radio_runtime import radio_runtime as radio_manager
try:
if radio_manager.meshcore and radio_manager.meshcore.self_info:
@@ -329,7 +329,7 @@ class CommunityMqttPublisher(BaseMqttPublisher):
def _build_client_kwargs(self, settings: object) -> dict[str, Any]:
s: CommunityMqttSettings = settings # type: ignore[assignment]
from app.keystore import get_private_key, get_public_key
-        from app.radio import radio_manager
+        from app.services.radio_runtime import radio_runtime as radio_manager
private_key = get_private_key()
public_key = get_public_key()
@@ -401,7 +401,8 @@ class CommunityMqttPublisher(BaseMqttPublisher):
if self._cached_device_info is not None:
return self._cached_device_info
-        from app.radio import RadioDisconnectedError, RadioOperationBusyError, radio_manager
+        from app.radio import RadioDisconnectedError, RadioOperationBusyError
+        from app.services.radio_runtime import radio_runtime as radio_manager
fallback = {"model": "unknown", "firmware_version": "unknown"}
try:
@@ -448,7 +449,8 @@ class CommunityMqttPublisher(BaseMqttPublisher):
) < _STATS_MIN_CACHE_SECS and self._cached_stats is not None:
return self._cached_stats
-        from app.radio import RadioDisconnectedError, RadioOperationBusyError, radio_manager
+        from app.radio import RadioDisconnectedError, RadioOperationBusyError
+        from app.services.radio_runtime import radio_runtime as radio_manager
try:
async with radio_manager.radio_operation("community_stats_fetch", blocking=False) as mc:
@@ -489,7 +491,7 @@ class CommunityMqttPublisher(BaseMqttPublisher):
) -> None:
"""Build and publish the enriched retained status message."""
from app.keystore import get_public_key
-        from app.radio import radio_manager
+        from app.services.radio_runtime import radio_runtime as radio_manager
public_key = get_public_key()
if public_key is None:
+1 -1
@@ -25,7 +25,7 @@ _BACKOFF_MIN = 5
def _broadcast_health() -> None:
"""Push updated health (including MQTT status) to all WS clients."""
-    from app.radio import radio_manager
+    from app.services.radio_runtime import radio_runtime as radio_manager
from app.websocket import broadcast_health
broadcast_health(radio_manager.is_connected, radio_manager.connection_info)
+1 -1
@@ -109,7 +109,7 @@ async def _publish_community_packet(
"""Format and publish a raw packet to the community broker."""
try:
from app.keystore import get_public_key
-        from app.radio import radio_manager
+        from app.services.radio_runtime import radio_runtime as radio_manager
public_key = get_public_key()
if public_key is None:
+3 -6
@@ -10,7 +10,7 @@ from fastapi.responses import JSONResponse
from app.config import setup_logging
from app.database import db
from app.frontend_static import register_frontend_missing_fallback, register_frontend_static_routes
-from app.radio import RadioDisconnectedError, radio_manager
+from app.radio import RadioDisconnectedError
from app.radio_sync import (
stop_message_polling,
stop_periodic_advert,
@@ -30,6 +30,7 @@ from app.routers import (
statistics,
ws,
)
+from app.services.radio_runtime import radio_runtime as radio_manager
setup_logging()
logger = logging.getLogger(__name__)
@@ -38,12 +39,8 @@ logger = logging.getLogger(__name__)
async def _startup_radio_connect_and_setup() -> None:
"""Connect/setup the radio in the background so HTTP serving can start immediately."""
try:
-        connected = await radio_manager.reconnect(broadcast_on_success=False)
+        connected = await radio_manager.reconnect_and_prepare(broadcast_on_success=True)
        if connected:
-            await radio_manager.post_connect_setup()
-            from app.websocket import broadcast_health
-            broadcast_health(True, radio_manager.connection_info)
            logger.info("Connected to radio")
else:
logger.warning("Failed to connect to radio on startup")
+68 -26
@@ -5,6 +5,64 @@ from pydantic import BaseModel, Field
from app.path_utils import normalize_contact_route
class ContactUpsert(BaseModel):
    """Typed write contract for contacts persisted to SQLite."""

    public_key: str = Field(description="Public key (64-char hex)")
    name: str | None = None
    type: int = 0
    flags: int = 0
    last_path: str | None = None
    last_path_len: int = -1
    out_path_hash_mode: int | None = None
    route_override_path: str | None = None
    route_override_len: int | None = None
    route_override_hash_mode: int | None = None
    last_advert: int | None = None
    lat: float | None = None
    lon: float | None = None
    last_seen: int | None = None
    on_radio: bool | None = None
    last_contacted: int | None = None
    first_seen: int | None = None

    @classmethod
    def from_contact(cls, contact: "Contact", **changes) -> "ContactUpsert":
        return cls.model_validate(
            {
                **contact.model_dump(exclude={"last_read_at"}),
                **changes,
            }
        )

    @classmethod
    def from_radio_dict(
        cls, public_key: str, radio_data: dict, on_radio: bool = False
    ) -> "ContactUpsert":
        """Convert radio contact data to the contact-row write shape."""
        last_path, last_path_len, out_path_hash_mode = normalize_contact_route(
            radio_data.get("out_path"),
            radio_data.get("out_path_len", -1),
            radio_data.get(
                "out_path_hash_mode",
                -1 if radio_data.get("out_path_len", -1) == -1 else 0,
            ),
        )
        return cls(
            public_key=public_key,
            name=radio_data.get("adv_name"),
            type=radio_data.get("type", 0),
            flags=radio_data.get("flags", 0),
            last_path=last_path,
            last_path_len=last_path_len,
            out_path_hash_mode=out_path_hash_mode,
            lat=radio_data.get("adv_lat"),
            lon=radio_data.get("adv_lon"),
            last_advert=radio_data.get("last_advert"),
            on_radio=on_radio,
        )
class Contact(BaseModel):
public_key: str = Field(description="Public key (64-char hex)")
name: str | None = None
@@ -61,34 +119,18 @@ class Contact(BaseModel):
"last_advert": self.last_advert if self.last_advert is not None else 0,
}
    def to_upsert(self, **changes) -> ContactUpsert:
        """Convert the stored contact to the repository's write contract."""
        return ContactUpsert.from_contact(self, **changes)

    @staticmethod
    def from_radio_dict(public_key: str, radio_data: dict, on_radio: bool = False) -> dict:
-        """Convert radio contact data to database format dict.
-
-        This is the inverse of to_radio_dict(), used when syncing contacts
-        from radio to database.
-        """
-        last_path, last_path_len, out_path_hash_mode = normalize_contact_route(
-            radio_data.get("out_path"),
-            radio_data.get("out_path_len", -1),
-            radio_data.get(
-                "out_path_hash_mode",
-                -1 if radio_data.get("out_path_len", -1) == -1 else 0,
-            ),
-        )
-        return {
-            "public_key": public_key,
-            "name": radio_data.get("adv_name"),
-            "type": radio_data.get("type", 0),
-            "flags": radio_data.get("flags", 0),
-            "last_path": last_path,
-            "last_path_len": last_path_len,
-            "out_path_hash_mode": out_path_hash_mode,
-            "lat": radio_data.get("adv_lat"),
-            "lon": radio_data.get("adv_lon"),
-            "last_advert": radio_data.get("last_advert"),
-            "on_radio": on_radio,
-        }
+        """Backward-compatible dict wrapper over ContactUpsert.from_radio_dict()."""
+        return ContactUpsert.from_radio_dict(
+            public_key,
+            radio_data,
+            on_radio=on_radio,
+        ).model_dump()
class CreateContactRequest(BaseModel):
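The `out_path_hash_mode` defaulting rule buried inside `from_radio_dict` above is easy to misread; isolated as a hypothetical helper, it is:

```python
def default_out_path_hash_mode(radio_data: dict) -> int:
    # No route recorded (out_path_len == -1) → hash mode unset (-1);
    # otherwise default to mode 0 unless the radio supplied an explicit value.
    fallback = -1 if radio_data.get("out_path_len", -1) == -1 else 0
    return radio_data.get("out_path_hash_mode", fallback)

print(default_out_path_hash_mode({}))                   # -1 (no route, no mode)
print(default_out_path_hash_mode({"out_path_len": 2}))  # 0  (route known, mode defaulted)
print(default_out_path_hash_mode({"out_path_len": 2, "out_path_hash_mode": 1}))  # 1
```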
+51 -298
@@ -30,19 +30,24 @@ from app.decoder import (
from app.keystore import get_private_key, get_public_key, has_private_key
from app.models import (
    CONTACT_TYPE_REPEATER,
-    Message,
-    MessagePath,
+    Contact,
+    ContactUpsert,
    RawPacketBroadcast,
    RawPacketDecryptedInfo,
)
from app.repository import (
    ChannelRepository,
    ContactAdvertPathRepository,
    ContactNameHistoryRepository,
    ContactRepository,
    MessageRepository,
    RawPacketRepository,
)
from app.services.contact_reconciliation import record_contact_name_and_reconcile
from app.services.messages import (
    create_dm_message_from_decrypted as _create_dm_message_from_decrypted,
)
from app.services.messages import (
    create_message_from_decrypted as _create_message_from_decrypted,
)
from app.websocket import broadcast_error, broadcast_event
logger = logging.getLogger(__name__)
@@ -50,77 +55,6 @@ logger = logging.getLogger(__name__)
_raw_observation_counter = count(1)
-async def _handle_duplicate_message(
-    packet_id: int,
-    msg_type: str,
-    conversation_key: str,
-    text: str,
-    sender_timestamp: int,
-    path: str | None,
-    received: int,
-    path_len: int | None = None,
-) -> None:
-    """Handle a duplicate message by updating paths/acks on the existing record.
-
-    Called when MessageRepository.create returns None (INSERT OR IGNORE hit a duplicate).
-    Looks up the existing message, adds the new path, increments ack count for outgoing
-    messages, and broadcasts the update to clients.
-    """
-    existing_msg = await MessageRepository.get_by_content(
-        msg_type=msg_type,
-        conversation_key=conversation_key,
-        text=text,
-        sender_timestamp=sender_timestamp,
-    )
-    if not existing_msg:
-        label = "message" if msg_type == "CHAN" else "DM"
-        logger.warning(
-            "Duplicate %s for %s but couldn't find existing",
-            label,
-            conversation_key[:12],
-        )
-        return
-
-    logger.debug(
-        "Duplicate %s for %s (msg_id=%d, outgoing=%s) - adding path",
-        msg_type,
-        conversation_key[:12],
-        existing_msg.id,
-        existing_msg.outgoing,
-    )
-
-    # Add path if provided
-    if path is not None:
-        paths = await MessageRepository.add_path(existing_msg.id, path, received, path_len)
-    else:
-        # Get current paths for broadcast
-        paths = existing_msg.paths or []
-
-    # Increment ack count for outgoing messages (echo confirmation)
-    if existing_msg.outgoing:
-        ack_count = await MessageRepository.increment_ack_count(existing_msg.id)
-    else:
-        ack_count = existing_msg.acked
-
-    # Only broadcast when something actually changed:
-    # - outgoing: ack count was incremented
-    # - path provided: a new path entry was appended
-    # The path=None case happens for direct-delivery DMs (0-hop, no routing bytes).
-    # A non-outgoing duplicate with no new path changes nothing in the DB, so skip.
-    if existing_msg.outgoing or path is not None:
-        broadcast_event(
-            "message_acked",
-            {
-                "message_id": existing_msg.id,
-                "ack_count": ack_count,
-                "paths": [p.model_dump() for p in paths] if paths else [],
-            },
-        )
-
-    # Mark this packet as decrypted
-    await RawPacketRepository.mark_decrypted(packet_id, existing_msg.id)
async def create_message_from_decrypted(
    packet_id: int,
    channel_key: str,
@@ -133,95 +67,21 @@ async def create_message_from_decrypted(
    channel_name: str | None = None,
    realtime: bool = True,
) -> int | None:
-    """Create a message record from decrypted channel packet content.
-
-    This is the shared logic for storing decrypted channel messages,
-    used by both real-time packet processing and historical decryption.
-
-    Args:
-        packet_id: ID of the raw packet being processed
-        channel_key: Hex string channel key
-        channel_name: Channel name (e.g. "#general"), for bot context
-        sender: Sender name (will be prefixed to message) or None
-        message_text: The decrypted message content
-        timestamp: Sender timestamp from the packet
-        received_at: When the packet was received (defaults to now)
-        path: Hex-encoded routing path
-        realtime: If False, skip fanout dispatch (used for historical decryption)
-
-    Returns the message ID if created, None if duplicate.
-    """
-    received = received_at or int(time.time())
-
-    # Format the message text with sender prefix if present
-    text = f"{sender}: {message_text}" if sender else message_text
-
-    # Normalize channel key to uppercase for consistency
-    channel_key_normalized = channel_key.upper()
-
-    # Resolve sender_key: look up contact by exact name match
-    resolved_sender_key: str | None = None
-    if sender:
-        candidates = await ContactRepository.get_by_name(sender)
-        if len(candidates) == 1:
-            resolved_sender_key = candidates[0].public_key
-
-    # Try to create message - INSERT OR IGNORE handles duplicates atomically
-    msg_id = await MessageRepository.create(
-        msg_type="CHAN",
-        text=text,
-        conversation_key=channel_key_normalized,
-        sender_timestamp=timestamp,
-        received_at=received,
+    """Store a decrypted channel message via the shared message service."""
+    return await _create_message_from_decrypted(
+        packet_id=packet_id,
+        channel_key=channel_key,
+        sender=sender,
+        message_text=message_text,
+        timestamp=timestamp,
+        received_at=received_at,
        path=path,
        path_len=path_len,
-        sender_name=sender,
-        sender_key=resolved_sender_key,
    )
-    if msg_id is None:
-        # Duplicate message detected - this happens when:
-        # 1. Our own outgoing message echoes back (flood routing)
-        # 2. Same message arrives via multiple paths before first is committed
-        # In either case, add the path to the existing message.
-        await _handle_duplicate_message(
-            packet_id, "CHAN", channel_key_normalized, text, timestamp, path, received, path_len
-        )
-        return None
-
-    logger.info("Stored channel message %d for channel %s", msg_id, channel_key_normalized[:8])
-
-    # Mark the raw packet as decrypted
-    await RawPacketRepository.mark_decrypted(packet_id, msg_id)
-
-    # Build paths array for broadcast
-    # Use "is not None" to include empty string (direct/0-hop messages)
-    paths = (
-        [MessagePath(path=path or "", received_at=received, path_len=path_len)]
-        if path is not None
else None
)
# Broadcast new message to connected clients (and fanout modules when realtime)
broadcast_event(
"message",
Message(
id=msg_id,
type="CHAN",
conversation_key=channel_key_normalized,
text=text,
sender_timestamp=timestamp,
received_at=received,
paths=paths,
sender_name=sender,
sender_key=resolved_sender_key,
channel_name=channel_name,
).model_dump(),
channel_name=channel_name,
realtime=realtime,
broadcast_fn=broadcast_event,
)
return msg_id
async def create_dm_message_from_decrypted(
packet_id: int,
@@ -234,111 +94,20 @@ async def create_dm_message_from_decrypted(
outgoing: bool = False,
realtime: bool = True,
) -> int | None:
"""Create a message record from decrypted direct message packet content.
This is the shared logic for storing decrypted direct messages,
used by both real-time packet processing and historical decryption.
Args:
packet_id: ID of the raw packet being processed
decrypted: DecryptedDirectMessage from decoder
their_public_key: The contact's full 64-char public key (conversation_key)
our_public_key: Our public key (to determine direction), or None
received_at: When the packet was received (defaults to now)
path: Hex-encoded routing path
outgoing: Whether this is an outgoing message (we sent it)
realtime: If False, skip fanout dispatch (used for historical decryption)
Returns the message ID if created, None if duplicate.
"""
# Check if sender is a repeater - repeaters only send CLI responses, not chat messages.
# CLI responses are handled by the command endpoint, not stored in chat history.
contact = await ContactRepository.get_by_key(their_public_key)
if contact and contact.type == CONTACT_TYPE_REPEATER:
logger.debug(
"Skipping message from repeater %s (CLI responses not stored): %s",
their_public_key[:12],
(decrypted.message or "")[:50],
)
return None
received = received_at or int(time.time())
# conversation_key is always the other party's public key
conversation_key = their_public_key.lower()
# Resolve sender name for incoming messages (used for name-based blocking)
sender_name = contact.name if contact and not outgoing else None
# Try to create message - INSERT OR IGNORE handles duplicates atomically
msg_id = await MessageRepository.create(
msg_type="PRIV",
text=decrypted.message,
conversation_key=conversation_key,
sender_timestamp=decrypted.timestamp,
received_at=received,
"""Store a decrypted direct message via the shared message service."""
return await _create_dm_message_from_decrypted(
packet_id=packet_id,
decrypted=decrypted,
their_public_key=their_public_key,
our_public_key=our_public_key,
received_at=received_at,
path=path,
path_len=path_len,
outgoing=outgoing,
sender_key=conversation_key if not outgoing else None,
sender_name=sender_name,
)
if msg_id is None:
# Duplicate message detected
await _handle_duplicate_message(
packet_id,
"PRIV",
conversation_key,
decrypted.message,
decrypted.timestamp,
path,
received,
path_len,
)
return None
logger.info(
"Stored direct message %d for contact %s (outgoing=%s)",
msg_id,
conversation_key[:12],
outgoing,
)
# Mark the raw packet as decrypted
await RawPacketRepository.mark_decrypted(packet_id, msg_id)
# Build paths array for broadcast
paths = (
[MessagePath(path=path or "", received_at=received, path_len=path_len)]
if path is not None
else None
)
# Broadcast new message to connected clients (and fanout modules when realtime)
sender_name = contact.name if contact and not outgoing else None
broadcast_event(
"message",
Message(
id=msg_id,
type="PRIV",
conversation_key=conversation_key,
text=decrypted.message,
sender_timestamp=decrypted.timestamp,
received_at=received,
paths=paths,
outgoing=outgoing,
sender_name=sender_name,
sender_key=conversation_key if not outgoing else None,
).model_dump(),
realtime=realtime,
broadcast_fn=broadcast_event,
)
# Update contact's last_contacted timestamp (for sorting)
await ContactRepository.update_last_contacted(conversation_key, received)
return msg_id
async def run_historical_dm_decryption(
private_key_bytes: bytes,
@@ -722,46 +491,27 @@ async def _process_advertisement(
hop_count=new_path_len,
)
# Record name history
if advert.name:
await ContactNameHistoryRepository.record_name(
public_key=advert.public_key.lower(),
name=advert.name,
timestamp=timestamp,
)
contact_upsert = ContactUpsert(
public_key=advert.public_key.lower(),
name=advert.name,
type=contact_type,
lat=advert.lat,
lon=advert.lon,
last_advert=advert.timestamp if advert.timestamp > 0 else timestamp,
last_seen=timestamp,
last_path=path_hex,
last_path_len=path_len,
out_path_hash_mode=out_path_hash_mode,
first_seen=timestamp, # COALESCE in upsert preserves existing value
)
contact_data = {
"public_key": advert.public_key.lower(),
"name": advert.name,
"type": contact_type,
"lat": advert.lat,
"lon": advert.lon,
"last_advert": advert.timestamp if advert.timestamp > 0 else timestamp,
"last_seen": timestamp,
"last_path": path_hex,
"last_path_len": path_len,
"out_path_hash_mode": out_path_hash_mode,
"first_seen": timestamp, # COALESCE in upsert preserves existing value
}
await ContactRepository.upsert(contact_data)
claimed = await MessageRepository.claim_prefix_messages(advert.public_key.lower())
if claimed > 0:
logger.info(
"Claimed %d prefix DM message(s) for contact %s",
claimed,
advert.public_key[:12],
)
if advert.name:
backfilled = await MessageRepository.backfill_channel_sender_key(
advert.public_key, advert.name
)
if backfilled > 0:
logger.info(
"Backfilled sender_key on %d channel message(s) for %s",
backfilled,
advert.name,
)
await ContactRepository.upsert(contact_upsert)
await record_contact_name_and_reconcile(
public_key=advert.public_key,
contact_name=advert.name,
timestamp=timestamp,
log=logger,
)
# Read back from DB so the broadcast includes all fields (last_contacted,
# last_read_at, flags, on_radio, etc.) matching the REST Contact shape exactly.
@@ -769,7 +519,10 @@ async def _process_advertisement(
if db_contact:
broadcast_event("contact", db_contact.model_dump())
else:
broadcast_event("contact", contact_data)
broadcast_event(
"contact",
Contact(**contact_upsert.model_dump(exclude_none=True)).model_dump(),
)
# For new contacts, optionally attempt to decrypt any historical DMs we may have stored
# This is controlled by the auto_decrypt_dm_on_advert setting
+6 -207

@@ -217,146 +217,10 @@ class RadioManager:
self._release_operation_lock(name)
async def post_connect_setup(self) -> None:
"""Full post-connection setup: handlers, key export, sync, advertisements, polling.
Called after every successful connection or reconnection.
Idempotent — safe to call repeatedly (periodic tasks have start guards).
"""
"""Run shared post-connection orchestration after transport setup succeeds."""
from app.services.radio_lifecycle import run_post_connect_setup
from app.event_handlers import register_event_handlers
from app.keystore import export_and_store_private_key
from app.radio_sync import (
drain_pending_messages,
send_advertisement,
start_message_polling,
start_periodic_advert,
start_periodic_sync,
sync_and_offload_all,
sync_radio_time,
)
if not self._meshcore:
return
if self._setup_lock is None:
self._setup_lock = asyncio.Lock()
async with self._setup_lock:
if not self._meshcore:
return
self._setup_in_progress = True
self._setup_complete = False
mc = self._meshcore
try:
# Register event handlers (no radio I/O, just callback setup)
register_event_handlers(mc)
# Hold the operation lock for all radio I/O during setup.
# This prevents user-initiated operations (send message, etc.)
# from interleaving commands on the serial link.
await self._acquire_operation_lock("post_connect_setup", blocking=True)
try:
await export_and_store_private_key(mc)
# Sync radio clock with system time
await sync_radio_time(mc)
# Apply flood scope from settings (best-effort; older firmware
# may not support set_flood_scope)
from app.region_scope import normalize_region_scope
from app.repository import AppSettingsRepository
app_settings = await AppSettingsRepository.get()
scope = normalize_region_scope(app_settings.flood_scope)
try:
await mc.commands.set_flood_scope(scope if scope else "")
logger.info("Applied flood_scope=%r", scope or "(disabled)")
except Exception as exc:
logger.warning(
"set_flood_scope failed (firmware may not support it): %s", exc
)
# Query path hash mode support (best-effort; older firmware won't report it).
# If the library's parsed payload is missing path_hash_mode (e.g. stale
# .pyc on WSL2 Windows mounts), fall back to raw-frame extraction.
reader = mc._reader
_original_handle_rx = reader.handle_rx
_captured_frame: list[bytes] = []
async def _capture_handle_rx(data: bytearray) -> None:
from meshcore.packets import PacketType
if len(data) > 0 and data[0] == PacketType.DEVICE_INFO.value:
_captured_frame.append(bytes(data))
return await _original_handle_rx(data)
reader.handle_rx = _capture_handle_rx
self.path_hash_mode = 0
self.path_hash_mode_supported = False
try:
device_query = await mc.commands.send_device_query()
if device_query and "path_hash_mode" in device_query.payload:
self.path_hash_mode = device_query.payload["path_hash_mode"]
self.path_hash_mode_supported = True
elif _captured_frame:
# Raw-frame fallback: byte 1 = fw_ver, byte 81 = path_hash_mode
raw = _captured_frame[-1]
fw_ver = raw[1] if len(raw) > 1 else 0
if fw_ver >= 10 and len(raw) >= 82:
self.path_hash_mode = raw[81]
self.path_hash_mode_supported = True
logger.warning(
"path_hash_mode=%d extracted from raw frame "
"(stale .pyc? try: rm %s)",
self.path_hash_mode,
getattr(
__import__("meshcore.reader", fromlist=["reader"]),
"__cached__",
"meshcore __pycache__/reader.*.pyc",
),
)
if self.path_hash_mode_supported:
logger.info("Path hash mode: %d (supported)", self.path_hash_mode)
else:
logger.debug("Firmware does not report path_hash_mode")
except Exception as exc:
logger.debug("Failed to query path_hash_mode: %s", exc)
finally:
reader.handle_rx = _original_handle_rx
# Sync contacts/channels from radio to DB and clear radio
logger.info("Syncing and offloading radio data...")
result = await sync_and_offload_all(mc)
logger.info("Sync complete: %s", result)
# Send advertisement to announce our presence (if enabled and not throttled)
if await send_advertisement(mc):
logger.info("Advertisement sent")
else:
logger.debug("Advertisement skipped (disabled or throttled)")
# Drain any messages that were queued before we connected.
# This must happen BEFORE starting auto-fetch, otherwise both
# compete on get_msg() with interleaved radio I/O.
drained = await drain_pending_messages(mc)
if drained > 0:
logger.info("Drained %d pending message(s)", drained)
await mc.start_auto_message_fetching()
logger.info("Auto message fetching started")
finally:
self._release_operation_lock("post_connect_setup")
# Start background tasks AFTER releasing the operation lock.
# These tasks acquire their own locks when they need radio access.
start_periodic_sync()
start_periodic_advert()
start_message_polling()
self._setup_complete = True
finally:
self._setup_in_progress = False
logger.info("Post-connect setup complete")
await run_post_connect_setup(self)
@property
def meshcore(self) -> MeshCore | None:
@@ -516,77 +380,12 @@ class RadioManager:
async def start_connection_monitor(self) -> None:
"""Start background task to monitor connection and auto-reconnect."""
from app.services.radio_lifecycle import connection_monitor_loop
if self._reconnect_task is not None:
return
async def monitor_loop():
from app.websocket import broadcast_health
CHECK_INTERVAL_SECONDS = 5
UNRESPONSIVE_THRESHOLD = 3
consecutive_setup_failures = 0
while True:
try:
await asyncio.sleep(CHECK_INTERVAL_SECONDS)
current_connected = self.is_connected
# Detect status change
if self._last_connected and not current_connected:
# Connection lost
logger.warning("Radio connection lost, broadcasting status change")
broadcast_health(False, self._connection_info)
self._last_connected = False
consecutive_setup_failures = 0
if not current_connected:
# Attempt reconnection on every loop while disconnected
if not self.is_reconnecting and await self.reconnect(
broadcast_on_success=False
):
await self.post_connect_setup()
broadcast_health(True, self._connection_info)
self._last_connected = True
consecutive_setup_failures = 0
elif not self._last_connected and current_connected:
# Connection restored (might have reconnected automatically).
# Always run setup before reporting healthy.
logger.info("Radio connection restored")
await self.post_connect_setup()
broadcast_health(True, self._connection_info)
self._last_connected = True
consecutive_setup_failures = 0
elif current_connected and not self._setup_complete:
# Transport connected but setup incomplete — retry
logger.info("Retrying post-connect setup...")
await self.post_connect_setup()
broadcast_health(True, self._connection_info)
consecutive_setup_failures = 0
except asyncio.CancelledError:
# Task is being cancelled, exit cleanly
break
except Exception as e:
consecutive_setup_failures += 1
if consecutive_setup_failures == UNRESPONSIVE_THRESHOLD:
logger.error(
"Post-connect setup has failed %d times in a row. "
"The radio port appears open but the radio is not "
"responding to commands. Common causes: another "
"process has the serial port open (check for other "
"RemoteTerm instances, serial monitors, etc.), the "
"firmware is in repeater mode (not client), or the "
"radio needs a power cycle. Will keep retrying.",
consecutive_setup_failures,
)
elif consecutive_setup_failures < UNRESPONSIVE_THRESHOLD:
logger.exception("Error in connection monitor, continuing: %s", e)
# After the threshold, silently retry (avoid log spam)
self._reconnect_task = asyncio.create_task(monitor_loop())
self._reconnect_task = asyncio.create_task(connection_monitor_loop(self))
logger.info("Radio connection monitor started")
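The monitor loop that this refactor extracts throttles its logging: full tracebacks for the first few consecutive failures, one loud diagnostic at the threshold, then silent retries. The gating logic reduces to a small function (recording strings here instead of calling a logger):

```python
UNRESPONSIVE_THRESHOLD = 3
records: list[str] = []

def report_failure(consecutive: int) -> None:
    if consecutive == UNRESPONSIVE_THRESHOLD:
        records.append("error-summary")      # one loud diagnosis with likely causes
    elif consecutive < UNRESPONSIVE_THRESHOLD:
        records.append("exception-detail")   # full traceback while failures are novel
    # past the threshold: keep retrying silently to avoid log spam

for attempt in range(1, 6):  # five consecutive failures
    report_failure(attempt)

assert records == ["exception-detail", "exception-detail", "error-summary"]
```

Any successful setup resets the counter, so a new outage starts logging in detail again.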
async def stop_connection_monitor(self) -> None:
+12 -22
@@ -17,15 +17,16 @@ from contextlib import asynccontextmanager
from meshcore import EventType, MeshCore
from app.event_handlers import cleanup_expired_acks
from app.models import Contact
from app.radio import RadioOperationBusyError, radio_manager
from app.models import Contact, ContactUpsert
from app.radio import RadioOperationBusyError
from app.repository import (
AmbiguousPublicKeyPrefixError,
AppSettingsRepository,
ChannelRepository,
ContactRepository,
MessageRepository,
)
from app.services.contact_reconciliation import reconcile_contact_messages
from app.services.radio_runtime import radio_runtime as radio_manager
logger = logging.getLogger(__name__)
@@ -154,26 +155,13 @@ async def sync_and_offload_contacts(mc: MeshCore) -> dict:
for public_key, contact_data in contacts.items():
# Save to database
await ContactRepository.upsert(
Contact.from_radio_dict(public_key, contact_data, on_radio=False)
ContactUpsert.from_radio_dict(public_key, contact_data, on_radio=False)
)
await reconcile_contact_messages(
public_key=public_key,
contact_name=contact_data.get("adv_name"),
log=logger,
)
claimed = await MessageRepository.claim_prefix_messages(public_key.lower())
if claimed > 0:
logger.info(
"Claimed %d prefix DM message(s) for contact %s",
claimed,
public_key[:12],
)
adv_name = contact_data.get("adv_name")
if adv_name:
backfilled = await MessageRepository.backfill_channel_sender_key(
public_key, adv_name
)
if backfilled > 0:
logger.info(
"Backfilled sender_key on %d channel message(s) for %s",
backfilled,
adv_name,
)
synced += 1
# Remove from radio
@@ -757,6 +745,7 @@ async def sync_recent_contacts_to_radio(force: bool = False, mc: MeshCore | None
# If caller provided a MeshCore instance, use it directly (caller holds the lock)
if mc is not None:
_last_contact_sync = now
assert mc is not None
return await _sync_contacts_to_radio_inner(mc)
if not radio_manager.is_connected or radio_manager.meshcore is None:
@@ -769,6 +758,7 @@ async def sync_recent_contacts_to_radio(force: bool = False, mc: MeshCore | None
blocking=False,
) as mc:
_last_contact_sync = now
assert mc is not None
return await _sync_contacts_to_radio_inner(mc)
except RadioOperationBusyError:
logger.debug("Skipping contact sync to radio: radio busy")
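The "radio busy" skip above depends on the radio-operation context manager raising instead of waiting when `blocking=False`. A minimal sketch of that non-blocking lock protocol (lock and handle are stand-ins for the real `RadioManager` internals):

```python
import asyncio
from contextlib import asynccontextmanager

class RadioOperationBusyError(Exception):
    """Raised when the radio lock can't be acquired without waiting."""

_lock = asyncio.Lock()

@asynccontextmanager
async def radio_operation(name: str, blocking: bool = True):
    if not blocking and _lock.locked():
        raise RadioOperationBusyError(name)
    async with _lock:
        yield "mc"  # stand-in for the MeshCore handle

results: list[str] = []

async def periodic_sync() -> None:
    try:
        async with radio_operation("contact_sync", blocking=False):
            results.append("synced")
    except RadioOperationBusyError:
        results.append("skipped: radio busy")

async def main() -> None:
    async with _lock:            # simulate a long-running user operation
        await periodic_sync()    # must not queue up behind it
    await periodic_sync()        # lock free now, so the sync runs

asyncio.run(main())
assert results == ["skipped: radio busy", "synced"]
```

Background tasks use the non-blocking form so they yield to user-initiated sends rather than interleaving commands on the serial link.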
+31 -18
@@ -1,4 +1,5 @@
import time
from collections.abc import Mapping
from typing import Any
from app.database import db
@@ -7,6 +8,7 @@ from app.models import (
ContactAdvertPath,
ContactAdvertPathSummary,
ContactNameHistory,
ContactUpsert,
)
from app.path_utils import first_hop_hex, normalize_contact_route, normalize_route_override
@@ -22,17 +24,28 @@ class AmbiguousPublicKeyPrefixError(ValueError):
class ContactRepository:
@staticmethod
async def upsert(contact: dict[str, Any]) -> None:
def _coerce_contact_upsert(
contact: ContactUpsert | Contact | Mapping[str, Any],
) -> ContactUpsert:
if isinstance(contact, ContactUpsert):
return contact
if isinstance(contact, Contact):
return contact.to_upsert()
return ContactUpsert.model_validate(contact)
@staticmethod
async def upsert(contact: ContactUpsert | Contact | Mapping[str, Any]) -> None:
contact_row = ContactRepository._coerce_contact_upsert(contact)
last_path, last_path_len, out_path_hash_mode = normalize_contact_route(
contact.get("last_path"),
contact.get("last_path_len", -1),
contact.get("out_path_hash_mode"),
contact_row.last_path,
contact_row.last_path_len,
contact_row.out_path_hash_mode,
)
route_override_path, route_override_len, route_override_hash_mode = (
normalize_route_override(
contact.get("route_override_path"),
contact.get("route_override_len"),
contact.get("route_override_hash_mode"),
contact_row.route_override_path,
contact_row.route_override_len,
contact_row.route_override_hash_mode,
)
)
@@ -70,23 +83,23 @@ class ContactRepository:
first_seen = COALESCE(contacts.first_seen, excluded.first_seen)
""",
(
contact.get("public_key", "").lower(),
contact.get("name"),
contact.get("type", 0),
contact.get("flags", 0),
contact_row.public_key.lower(),
contact_row.name,
contact_row.type,
contact_row.flags,
last_path,
last_path_len,
out_path_hash_mode,
route_override_path,
route_override_len,
route_override_hash_mode,
contact.get("last_advert"),
contact.get("lat"),
contact.get("lon"),
contact.get("last_seen", int(time.time())),
contact.get("on_radio"),
contact.get("last_contacted"),
contact.get("first_seen"),
contact_row.last_advert,
contact_row.lat,
contact_row.lon,
contact_row.last_seen if contact_row.last_seen is not None else int(time.time()),
contact_row.on_radio,
contact_row.last_contacted,
contact_row.first_seen,
),
)
await db.conn.commit()
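The upsert above centers on the `first_seen = COALESCE(contacts.first_seen, excluded.first_seen)` clause quoted in the hunk: mutable fields take the incoming values while the original first-sighting timestamp is preserved. A runnable sketch of that SQLite pattern (schema trimmed to the relevant columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE contacts (public_key TEXT PRIMARY KEY, name TEXT, "
    "last_seen INTEGER, first_seen INTEGER)"
)

def upsert(public_key: str, name: str, seen: int) -> None:
    conn.execute(
        """INSERT INTO contacts (public_key, name, last_seen, first_seen)
           VALUES (?, ?, ?, ?)
           ON CONFLICT (public_key) DO UPDATE SET
               name = excluded.name,
               last_seen = excluded.last_seen,
               first_seen = COALESCE(contacts.first_seen, excluded.first_seen)""",
        (public_key, name, seen, seen),
    )

upsert("ab12", "node-1", 100)
upsert("ab12", "node-1-renamed", 200)  # later advert: name/last_seen move forward
row = conn.execute("SELECT name, last_seen, first_seen FROM contacts").fetchone()
assert row == ("node-1-renamed", 200, 100)  # first_seen preserved from first sighting
```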
+1 -1
@@ -7,10 +7,10 @@ from pydantic import BaseModel, Field
from app.dependencies import require_connected
from app.models import Channel, ChannelDetail, ChannelMessageCounts, ChannelTopSender
from app.radio import radio_manager
from app.radio_sync import upsert_channel_from_radio_slot
from app.region_scope import normalize_region_scope
from app.repository import ChannelRepository, MessageRepository
from app.services.radio_runtime import radio_runtime as radio_manager
from app.websocket import broadcast_event
logger = logging.getLogger(__name__)
+23 -60
@@ -12,13 +12,13 @@ from app.models import (
ContactAdvertPathSummary,
ContactDetail,
ContactRoutingOverrideRequest,
ContactUpsert,
CreateContactRequest,
NearestRepeater,
TraceResponse,
)
from app.packet_processor import start_historical_dm_decryption
from app.path_utils import parse_explicit_hop_route
from app.radio import radio_manager
from app.repository import (
AmbiguousPublicKeyPrefixError,
ContactAdvertPathRepository,
@@ -26,6 +26,8 @@ from app.repository import (
ContactRepository,
MessageRepository,
)
from app.services.contact_reconciliation import reconcile_contact_messages
from app.services.radio_runtime import radio_runtime as radio_manager
logger = logging.getLogger(__name__)
@@ -132,23 +134,7 @@ async def create_contact(
if existing:
# Update name if provided
if request.name:
await ContactRepository.upsert(
{
"public_key": existing.public_key,
"name": request.name,
"type": existing.type,
"flags": existing.flags,
"last_path": existing.last_path,
"last_path_len": existing.last_path_len,
"out_path_hash_mode": existing.out_path_hash_mode,
"last_advert": existing.last_advert,
"lat": existing.lat,
"lon": existing.lon,
"last_seen": existing.last_seen,
"on_radio": existing.on_radio,
"last_contacted": existing.last_contacted,
}
)
await ContactRepository.upsert(existing.to_upsert(name=request.name))
refreshed = await ContactRepository.get_by_key(request.public_key)
if refreshed is not None:
existing = refreshed
@@ -163,42 +149,26 @@ async def create_contact(
# Create new contact
lower_key = request.public_key.lower()
contact_data = {
"public_key": lower_key,
"name": request.name,
"type": 0, # Unknown
"flags": 0,
"last_path": None,
"last_path_len": -1,
"out_path_hash_mode": -1,
"last_advert": None,
"lat": None,
"lon": None,
"last_seen": None,
"on_radio": False,
"last_contacted": None,
}
await ContactRepository.upsert(contact_data)
contact_upsert = ContactUpsert(
public_key=lower_key,
name=request.name,
out_path_hash_mode=-1,
on_radio=False,
)
await ContactRepository.upsert(contact_upsert)
logger.info("Created contact %s", lower_key[:12])
# Promote any prefix-stored messages to this full key
claimed = await MessageRepository.claim_prefix_messages(lower_key)
if claimed > 0:
logger.info("Claimed %d prefix messages for contact %s", claimed, lower_key[:12])
# Backfill sender_key on channel messages that match this contact's name
if request.name:
backfilled = await MessageRepository.backfill_channel_sender_key(lower_key, request.name)
if backfilled > 0:
logger.info(
"Backfilled sender_key on %d channel message(s) for %s", backfilled, request.name
)
await reconcile_contact_messages(
public_key=lower_key,
contact_name=request.name,
log=logger,
)
# Trigger historical decryption if requested
if request.try_historical:
await start_historical_dm_decryption(background_tasks, lower_key, request.name)
return Contact(**contact_data)
return Contact(**contact_upsert.model_dump())
@router.get("/{public_key}/detail", response_model=ContactDetail)
@@ -315,21 +285,14 @@ async def sync_contacts_from_radio() -> dict:
for public_key, contact_data in contacts.items():
lower_key = public_key.lower()
await ContactRepository.upsert(
Contact.from_radio_dict(lower_key, contact_data, on_radio=True)
ContactUpsert.from_radio_dict(lower_key, contact_data, on_radio=True)
)
synced_keys.append(lower_key)
claimed = await MessageRepository.claim_prefix_messages(lower_key)
if claimed > 0:
logger.info("Claimed %d prefix DM message(s) for contact %s", claimed, public_key[:12])
adv_name = contact_data.get("adv_name")
if adv_name:
backfilled = await MessageRepository.backfill_channel_sender_key(lower_key, adv_name)
if backfilled > 0:
logger.info(
"Backfilled sender_key on %d channel message(s) for %s",
backfilled,
adv_name,
)
await reconcile_contact_messages(
public_key=lower_key,
contact_name=contact_data.get("adv_name"),
log=logger,
)
count += 1
# Clear on_radio for contacts not found on the radio
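Across this refactor, the per-callsite claim/backfill blocks collapse into one `reconcile_contact_messages` call. A hedged sketch of what such a helper plausibly consolidates, with the repository surface reduced to an injected fake so the flow is testable in isolation:

```python
import asyncio
import logging

async def reconcile_contact_messages(*, public_key: str, contact_name, log, repo):
    """Claim prefix-keyed DMs and backfill channel sender keys for one contact."""
    claimed = await repo.claim_prefix_messages(public_key.lower())
    if claimed:
        log.info("Claimed %d prefix DM message(s) for %s", claimed, public_key[:12])
    backfilled = 0
    if contact_name:
        backfilled = await repo.backfill_channel_sender_key(public_key, contact_name)
        if backfilled:
            log.info("Backfilled sender_key on %d channel message(s) for %s",
                     backfilled, contact_name)
    return claimed, backfilled

class FakeMessageRepo:
    async def claim_prefix_messages(self, key: str) -> int:
        return 2  # pretend two prefix-stored DMs now match this full key

    async def backfill_channel_sender_key(self, key: str, name: str) -> int:
        return 1  # pretend one channel message gained a sender_key

result = asyncio.run(reconcile_contact_messages(
    public_key="AB12CD", contact_name="node-1",
    log=logging.getLogger("demo"), repo=FakeMessageRepo(),
))
assert result == (2, 1)
```

Centralizing this keeps the advert handler, radio sync, and contact routes from drifting apart in their logging and edge cases.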
+3 -1
@@ -5,8 +5,8 @@ from fastapi import APIRouter
from pydantic import BaseModel
from app.config import settings
from app.radio import radio_manager
from app.repository import RawPacketRepository
from app.services.radio_runtime import radio_runtime as radio_manager
router = APIRouter(tags=["health"])
@@ -53,6 +53,8 @@ async def build_health_data(radio_connected: bool, connection_info: str | None)
setup_complete = getattr(radio_manager, "is_setup_complete", radio_connected)
if not isinstance(setup_complete, bool):
setup_complete = radio_connected
if not radio_connected:
setup_complete = False
radio_initializing = bool(radio_connected and (setup_in_progress or not setup_complete))
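The flag logic above boils down to: a disconnected radio is never "setup complete", and "initializing" means the transport is up but setup has not finished. Pulled out as a small function for the truth table:

```python
def health_flags(connected: bool, in_progress: bool, complete: bool):
    if not connected:
        complete = False  # new guard: disconnection invalidates setup state
    initializing = bool(connected and (in_progress or not complete))
    return complete, initializing

assert health_flags(False, True, True) == (False, False)  # disconnected: neither flag
assert health_flags(True, True, False) == (False, True)   # setup currently running
assert health_flags(True, False, False) == (False, True)  # setup pending or failed
assert health_flags(True, False, True) == (True, False)   # fully ready
```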
+36 -351
@@ -1,9 +1,7 @@
import logging
import time
from typing import Any
from fastapi import APIRouter, HTTPException, Query
from meshcore import EventType
from app.dependencies import require_connected
from app.event_handlers import track_pending_ack
@@ -13,114 +11,19 @@ from app.models import (
SendChannelMessageRequest,
SendDirectMessageRequest,
)
from app.radio import radio_manager
from app.region_scope import normalize_region_scope
from app.repository import AmbiguousPublicKeyPrefixError, AppSettingsRepository, MessageRepository
from app.services.message_send import (
resend_channel_message_record,
send_channel_message_to_channel,
send_direct_message_to_contact,
)
from app.services.radio_runtime import radio_runtime as radio_manager
from app.websocket import broadcast_error, broadcast_event
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/messages", tags=["messages"])
async def _send_channel_message_with_effective_scope(
*,
mc,
channel,
key_bytes: bytes,
text: str,
timestamp_bytes: bytes,
action_label: str,
) -> Any:
"""Send a channel message, temporarily overriding flood scope when configured."""
override_scope = normalize_region_scope(channel.flood_scope_override)
baseline_scope = ""
if override_scope:
settings = await AppSettingsRepository.get()
baseline_scope = normalize_region_scope(settings.flood_scope)
if override_scope and override_scope != baseline_scope:
logger.info(
"Temporarily applying channel flood_scope override for %s: %r",
channel.name,
override_scope,
)
override_result = await mc.commands.set_flood_scope(override_scope)
if override_result is not None and override_result.type == EventType.ERROR:
logger.warning(
"Failed to apply channel flood_scope override for %s: %s",
channel.name,
override_result.payload,
)
raise HTTPException(
status_code=500,
detail=(
f"Failed to apply regional override {override_scope!r} before {action_label}: "
f"{override_result.payload}"
),
)
try:
set_result = await mc.commands.set_channel(
channel_idx=TEMP_RADIO_SLOT,
channel_name=channel.name,
channel_secret=key_bytes,
)
if set_result.type == EventType.ERROR:
logger.warning(
"Failed to set channel on radio slot %d before %s: %s",
TEMP_RADIO_SLOT,
action_label,
set_result.payload,
)
raise HTTPException(
status_code=500,
detail=f"Failed to configure channel on radio before {action_label}",
)
return await mc.commands.send_chan_msg(
chan=TEMP_RADIO_SLOT,
msg=text,
timestamp=timestamp_bytes,
)
finally:
if override_scope and override_scope != baseline_scope:
try:
restore_result = await mc.commands.set_flood_scope(
baseline_scope if baseline_scope else ""
)
if restore_result is not None and restore_result.type == EventType.ERROR:
logger.error(
"Failed to restore baseline flood_scope after sending to %s: %s",
channel.name,
restore_result.payload,
)
broadcast_error(
"Regional override restore failed",
(
f"Sent to {channel.name}, but restoring flood scope failed. "
"The radio may still be region-scoped. Consider rebooting the radio."
),
)
else:
logger.debug(
"Restored baseline flood_scope after channel send: %r",
baseline_scope or "(disabled)",
)
except Exception:
logger.exception(
"Failed to restore baseline flood_scope after sending to %s",
channel.name,
)
broadcast_error(
"Regional override restore failed",
(
f"Sent to {channel.name}, but restoring flood scope failed. "
"The radio may still be region-scoped. Consider rebooting the radio."
),
)
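The scope-override helper above follows an apply/act/restore shape: set the temporary flood scope only when it differs from the baseline, and restore the baseline in `finally` even when the send fails. The skeleton in isolation, against a fake radio (the real code also broadcasts an error if the restore itself fails):

```python
import asyncio

class FakeRadio:
    def __init__(self) -> None:
        self.flood_scope = ""

    async def set_flood_scope(self, scope: str) -> None:
        self.flood_scope = scope

async def send_with_override(radio, override: str, baseline: str, send):
    # Apply the override only when it actually changes the effective scope...
    if override and override != baseline:
        await radio.set_flood_scope(override)
    try:
        return await send()
    finally:
        # ...and restore unconditionally, even when the send itself raises.
        if override and override != baseline:
            await radio.set_flood_scope(baseline)

radio = FakeRadio()

async def failing_send():
    raise RuntimeError("radio error")

async def main() -> None:
    try:
        await send_with_override(radio, "US/CA", "", failing_send)
    except RuntimeError:
        pass

asyncio.run(main())
assert radio.flood_scope == ""  # baseline restored despite the failure
```

The differs-from-baseline check also avoids two redundant radio commands on every send for channels without an override.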
@router.get("/around/{message_id}", response_model=MessagesAroundResponse)
async def get_messages_around(
message_id: int,
@@ -206,80 +109,16 @@ async def send_direct_message(request: SendDirectMessageRequest) -> Message:
status_code=404, detail=f"Contact not found in database: {request.destination}"
)
# Always add/update the contact on radio before sending.
# The library cache (get_contact_by_key_prefix) can be stale after radio reboot,
# so we can't rely on it to know if the firmware has the contact.
# add_contact is idempotent - updates if exists, adds if not.
contact_data = db_contact.to_radio_dict()
async with radio_manager.radio_operation("send_direct_message") as mc:
logger.debug("Ensuring contact %s is on radio before sending", db_contact.public_key[:12])
add_result = await mc.commands.add_contact(contact_data)
if add_result.type == EventType.ERROR:
logger.warning("Failed to add contact to radio: %s", add_result.payload)
# Continue anyway - might still work if contact exists
# Get the contact from the library cache (may have updated info like path)
contact = mc.get_contact_by_key_prefix(db_contact.public_key[:12])
if not contact:
contact = contact_data
logger.info("Sending direct message to %s", db_contact.public_key[:12])
# Capture timestamp BEFORE sending so we can pass the same value to both the radio
# and the database. This ensures consistency for deduplication.
now = int(time.time())
result = await mc.commands.send_msg(
dst=contact,
msg=request.text,
timestamp=now,
)
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")
# Store outgoing message
message_id = await MessageRepository.create(
msg_type="PRIV",
return await send_direct_message_to_contact(
contact=db_contact,
text=request.text,
conversation_key=db_contact.public_key.lower(),
sender_timestamp=now,
received_at=now,
outgoing=True,
radio_manager=radio_manager,
broadcast_fn=broadcast_event,
track_pending_ack_fn=track_pending_ack,
now_fn=time.time,
message_repository=MessageRepository,
contact_repository=ContactRepository,
)
if message_id is None:
raise HTTPException(
status_code=500,
detail="Failed to store outgoing message - unexpected duplicate",
)
# Update last_contacted for the contact
await ContactRepository.update_last_contacted(db_contact.public_key.lower(), now)
# Track the expected ACK for this message
expected_ack = result.payload.get("expected_ack")
suggested_timeout: int = result.payload.get("suggested_timeout", 10000) # default 10s
if expected_ack:
ack_code = expected_ack.hex() if isinstance(expected_ack, bytes) else expected_ack
track_pending_ack(ack_code, message_id, suggested_timeout)
logger.debug("Tracking ACK %s for message %d", ack_code, message_id)
message = Message(
id=message_id,
type="PRIV",
conversation_key=db_contact.public_key.lower(),
text=request.text,
sender_timestamp=now,
received_at=now,
outgoing=True,
acked=0,
)
# Broadcast so all connected clients (not just sender) see the outgoing message immediately.
# Fanout modules (including bots) are triggered via broadcast_event's realtime dispatch.
broadcast_event("message", message.model_dump())
return message
# Temporary radio slot used for sending channel messages
@@ -317,98 +156,19 @@ async def send_channel_message(request: SendChannelMessageRequest) -> Message:
TEMP_RADIO_SLOT,
expected_hash,
)
channel_key_upper = request.channel_key.upper()
message_id: int | None = None
now: int | None = None
radio_name: str = ""
text_with_sender: str = request.text
our_public_key: str | None = None
async with radio_manager.radio_operation("send_channel_message") as mc:
radio_name = mc.self_info.get("name", "") if mc.self_info else ""
our_public_key = (mc.self_info.get("public_key") or None) if mc.self_info else None
text_with_sender = f"{radio_name}: {request.text}" if radio_name else request.text
logger.info("Sending channel message to %s: %s", db_channel.name, request.text[:50])
# Capture timestamp BEFORE sending so we can pass the same value to both the radio
# and the database. This ensures the echo's timestamp matches our stored message
# for proper deduplication.
now = int(time.time())
timestamp_bytes = now.to_bytes(4, "little")
result = await _send_channel_message_with_effective_scope(
mc=mc,
channel=db_channel,
key_bytes=key_bytes,
text=request.text,
timestamp_bytes=timestamp_bytes,
action_label="sending message",
)
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")
# Store outgoing immediately after send to avoid a race where
# our own echo lands before persistence.
message_id = await MessageRepository.create(
msg_type="CHAN",
text=text_with_sender,
conversation_key=channel_key_upper,
sender_timestamp=now,
received_at=now,
outgoing=True,
sender_name=radio_name or None,
sender_key=our_public_key,
)
if message_id is None:
raise HTTPException(
status_code=500,
detail="Failed to store outgoing message - unexpected duplicate",
)
# Broadcast immediately so all connected clients see the message promptly.
# This ensures the message exists in frontend state when echo-driven
# `message_acked` events arrive.
broadcast_event(
"message",
Message(
id=message_id,
type="CHAN",
conversation_key=channel_key_upper,
text=text_with_sender,
sender_timestamp=now,
received_at=now,
outgoing=True,
acked=0,
sender_name=radio_name or None,
sender_key=our_public_key,
channel_name=db_channel.name,
).model_dump(),
)
if message_id is None or now is None:
raise HTTPException(status_code=500, detail="Failed to store outgoing message")
acked_count, paths = await MessageRepository.get_ack_and_paths(message_id)
message = Message(
id=message_id,
type="CHAN",
conversation_key=channel_key_upper,
text=text_with_sender,
sender_timestamp=now,
received_at=now,
outgoing=True,
acked=acked_count,
paths=paths,
sender_name=radio_name or None,
sender_key=our_public_key,
channel_name=db_channel.name,
return await send_channel_message_to_channel(
channel=db_channel,
channel_key_upper=request.channel_key.upper(),
key_bytes=key_bytes,
text=request.text,
radio_manager=radio_manager,
broadcast_fn=broadcast_event,
error_broadcast_fn=broadcast_error,
now_fn=time.time,
temp_radio_slot=TEMP_RADIO_SLOT,
message_repository=MessageRepository,
)
return message
RESEND_WINDOW_SECONDS = 30
@@ -453,89 +213,14 @@ async def resend_channel_message(
if not db_channel:
raise HTTPException(status_code=404, detail=f"Channel {msg.conversation_key} not found")
# Choose timestamp: original for byte-perfect, fresh for new-timestamp
if new_timestamp:
now = int(time.time())
timestamp_bytes = now.to_bytes(4, "little")
else:
timestamp_bytes = msg.sender_timestamp.to_bytes(4, "little")
try:
key_bytes = bytes.fromhex(msg.conversation_key)
except ValueError:
raise HTTPException(
status_code=400, detail=f"Invalid channel key format: {msg.conversation_key}"
) from None
resend_public_key: str | None = None
async with radio_manager.radio_operation("resend_channel_message") as mc:
# Strip sender prefix: DB stores "RadioName: message" but radio needs "message"
radio_name = mc.self_info.get("name", "") if mc.self_info else ""
resend_public_key = (mc.self_info.get("public_key") or None) if mc.self_info else None
text_to_send = msg.text
if radio_name and text_to_send.startswith(f"{radio_name}: "):
text_to_send = text_to_send[len(f"{radio_name}: ") :]
result = await _send_channel_message_with_effective_scope(
mc=mc,
channel=db_channel,
key_bytes=key_bytes,
text=text_to_send,
timestamp_bytes=timestamp_bytes,
action_label="resending message",
)
if result.type == EventType.ERROR:
raise HTTPException(
status_code=500, detail=f"Failed to resend message: {result.payload}"
)
# For new-timestamp resend, create a new message row and broadcast it
if new_timestamp:
new_msg_id = await MessageRepository.create(
msg_type="CHAN",
text=msg.text,
conversation_key=msg.conversation_key,
sender_timestamp=now,
received_at=now,
outgoing=True,
sender_name=radio_name or None,
sender_key=resend_public_key,
)
if new_msg_id is None:
# Timestamp-second collision (same text+channel within the same second).
# The radio already transmitted, so log and return the original ID rather
# than surfacing a 500 for a message that was successfully sent over the air.
logger.warning(
"Duplicate timestamp collision resending message %d — radio sent but DB row not created",
message_id,
)
return {"status": "ok", "message_id": message_id}
broadcast_event(
"message",
Message(
id=new_msg_id,
type="CHAN",
conversation_key=msg.conversation_key,
text=msg.text,
sender_timestamp=now,
received_at=now,
outgoing=True,
acked=0,
sender_name=radio_name or None,
sender_key=resend_public_key,
channel_name=db_channel.name,
).model_dump(),
)
logger.info(
"Resent channel message %d as new message %d to %s",
message_id,
new_msg_id,
db_channel.name,
)
return {"status": "ok", "message_id": new_msg_id}
logger.info("Resent channel message %d to %s", message_id, db_channel.name)
return {"status": "ok", "message_id": message_id}
return await resend_channel_message_record(
message=msg,
channel=db_channel,
new_timestamp=new_timestamp,
radio_manager=radio_manager,
broadcast_fn=broadcast_event,
error_broadcast_fn=broadcast_error,
now_fn=time.time,
temp_radio_slot=TEMP_RADIO_SLOT,
message_repository=MessageRepository,
)
@@ -1,18 +1,34 @@
import logging
from fastapi import APIRouter, HTTPException
from meshcore import EventType
from pydantic import BaseModel, Field
from app.dependencies import require_connected
from app.radio import radio_manager
from app.radio_sync import send_advertisement as do_send_advertisement
from app.radio_sync import sync_radio_time
from app.services.radio_commands import (
KeystoreRefreshError,
PathHashModeUnsupportedError,
RadioCommandRejectedError,
apply_radio_config_update,
import_private_key_and_refresh_keystore,
)
from app.services.radio_runtime import radio_runtime as radio_manager
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/radio", tags=["radio"])
async def _prepare_connected(*, broadcast_on_success: bool) -> None:
await radio_manager.prepare_connected(broadcast_on_success=broadcast_on_success)
async def _reconnect_and_prepare(*, broadcast_on_success: bool) -> bool:
return await radio_manager.reconnect_and_prepare(
broadcast_on_success=broadcast_on_success,
)
class RadioSettings(BaseModel):
freq: float = Field(description="Frequency in MHz")
bw: float = Field(description="Bandwidth in kHz")
@@ -87,57 +103,18 @@ async def update_radio_config(update: RadioConfigUpdate) -> RadioConfigResponse:
require_connected()
async with radio_manager.radio_operation("update_radio_config") as mc:
if update.name is not None:
logger.info("Setting radio name to %s", update.name)
await mc.commands.set_name(update.name)
if update.lat is not None or update.lon is not None:
current_info = mc.self_info
lat = update.lat if update.lat is not None else current_info.get("adv_lat", 0.0)
lon = update.lon if update.lon is not None else current_info.get("adv_lon", 0.0)
logger.info("Setting radio coordinates to %f, %f", lat, lon)
await mc.commands.set_coords(lat=lat, lon=lon)
if update.tx_power is not None:
logger.info("Setting TX power to %d dBm", update.tx_power)
await mc.commands.set_tx_power(val=update.tx_power)
if update.radio is not None:
logger.info(
"Setting radio params: freq=%f MHz, bw=%f kHz, sf=%d, cr=%d",
update.radio.freq,
update.radio.bw,
update.radio.sf,
update.radio.cr,
try:
await apply_radio_config_update(
mc,
update,
path_hash_mode_supported=radio_manager.path_hash_mode_supported,
set_path_hash_mode=lambda mode: setattr(radio_manager, "path_hash_mode", mode),
sync_radio_time_fn=sync_radio_time,
)
await mc.commands.set_radio(
freq=update.radio.freq,
bw=update.radio.bw,
sf=update.radio.sf,
cr=update.radio.cr,
)
if update.path_hash_mode is not None:
if not radio_manager.path_hash_mode_supported:
raise HTTPException(
status_code=400, detail="Firmware does not support path hash mode setting"
)
logger.info("Setting path hash mode to %d", update.path_hash_mode)
result = await mc.commands.set_path_hash_mode(update.path_hash_mode)
if result is not None and result.type == EventType.ERROR:
raise HTTPException(
status_code=500,
detail=f"Failed to set path hash mode: {result.payload}",
)
radio_manager.path_hash_mode = update.path_hash_mode
# Sync time with system clock
await sync_radio_time(mc)
# Re-fetch self_info so the response reflects the changes we just made.
# Commands like set_name() write to flash but don't update the cached
# self_info — send_appstart() triggers a fresh SELF_INFO from the radio.
await mc.commands.send_appstart()
except PathHashModeUnsupportedError as exc:
raise HTTPException(status_code=400, detail=str(exc)) from exc
except RadioCommandRejectedError as exc:
raise HTTPException(status_code=500, detail=str(exc)) from exc
return await get_radio_config()
@@ -154,30 +131,16 @@ async def set_private_key(update: PrivateKeyUpdate) -> dict:
logger.info("Importing private key")
async with radio_manager.radio_operation("import_private_key") as mc:
result = await mc.commands.import_private_key(key_bytes)
if result.type == EventType.ERROR:
raise HTTPException(
status_code=500, detail=f"Failed to import private key: {result.payload}"
)
# Re-export from radio so the server-side keystore uses the new key
# for DM decryption immediately, rather than waiting for reconnect.
from app.keystore import export_and_store_private_key
keystore_refreshed = await export_and_store_private_key(mc)
if not keystore_refreshed:
logger.warning("Keystore refresh failed after import, retrying once")
keystore_refreshed = await export_and_store_private_key(mc)
if not keystore_refreshed:
raise HTTPException(
status_code=500,
detail=(
"Private key imported on radio, but server-side keystore "
"refresh failed. Reconnect to apply the new key for DM decryption."
),
)
try:
await import_private_key_and_refresh_keystore(
mc,
key_bytes,
export_and_store_private_key_fn=export_and_store_private_key,
)
except (RadioCommandRejectedError, KeystoreRefreshError) as exc:
raise HTTPException(status_code=500, detail=str(exc)) from exc
return {"status": "ok"}
@@ -214,14 +177,8 @@ async def _attempt_reconnect() -> dict:
"connected": False,
}
success = await radio_manager.reconnect()
if not success:
raise HTTPException(
status_code=503, detail="Failed to reconnect. Check radio connection and power."
)
try:
await radio_manager.post_connect_setup()
success = await _reconnect_and_prepare(broadcast_on_success=True)
except Exception as e:
logger.exception("Post-connect setup failed after reconnect")
raise HTTPException(
@@ -229,6 +186,11 @@ async def _attempt_reconnect() -> dict:
detail=f"Radio connected but setup failed: {e}",
) from e
if not success:
raise HTTPException(
status_code=503, detail="Failed to reconnect. Check radio connection and power."
)
return {"status": "ok", "message": "Reconnected successfully", "connected": True}
@@ -266,7 +228,7 @@ async def reconnect_radio() -> dict:
logger.info("Radio connected but setup incomplete, retrying setup")
try:
await radio_manager.post_connect_setup()
await _prepare_connected(broadcast_on_success=True)
return {"status": "ok", "message": "Setup completed", "connected": True}
except Exception as e:
logger.exception("Post-connect setup failed")
@@ -6,13 +6,13 @@ import time
from fastapi import APIRouter
from app.models import UnreadCounts
from app.radio import radio_manager
from app.repository import (
AppSettingsRepository,
ChannelRepository,
ContactRepository,
MessageRepository,
)
from app.services.radio_runtime import radio_runtime as radio_manager
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/read-state", tags=["read-state"])
@@ -25,9 +25,9 @@ from app.models import (
RepeaterRadioSettingsResponse,
RepeaterStatusResponse,
)
from app.radio import radio_manager
from app.repository import ContactRepository
from app.routers.contacts import _ensure_on_radio, _resolve_contact_or_404
from app.services.radio_runtime import radio_runtime as radio_manager
if TYPE_CHECKING:
from meshcore.events import Event
@@ -132,7 +132,7 @@ async def update_settings(update: AppSettingsUpdate) -> AppSettings:
# Apply flood scope to radio immediately if changed
if flood_scope_changed:
from app.radio import radio_manager
from app.services.radio_runtime import radio_runtime as radio_manager
if radio_manager.is_connected:
try:
@@ -4,8 +4,8 @@ import logging
from fastapi import APIRouter, WebSocket, WebSocketDisconnect
from app.radio import radio_manager
from app.routers.health import build_health_data
from app.services.radio_runtime import radio_runtime as radio_manager
from app.websocket import ws_manager
logger = logging.getLogger(__name__)
@@ -0,0 +1 @@
"""Backend service-layer helpers."""
@@ -0,0 +1,115 @@
"""Shared contact/message reconciliation helpers."""
import logging
from app.repository import ContactNameHistoryRepository, MessageRepository
logger = logging.getLogger(__name__)
async def claim_prefix_messages_for_contact(
*,
public_key: str,
message_repository=MessageRepository,
log: logging.Logger | None = None,
) -> int:
"""Promote prefix-key DMs to a resolved full public key."""
normalized_key = public_key.lower()
claimed = await message_repository.claim_prefix_messages(normalized_key)
if claimed > 0:
(log or logger).info(
"Claimed %d prefix DM message(s) for contact %s",
claimed,
normalized_key[:12],
)
return claimed
async def backfill_channel_sender_for_contact(
*,
public_key: str,
contact_name: str | None,
message_repository=MessageRepository,
log: logging.Logger | None = None,
) -> int:
"""Backfill channel sender attribution once a contact name is known."""
if not contact_name:
return 0
normalized_key = public_key.lower()
backfilled = await message_repository.backfill_channel_sender_key(
normalized_key,
contact_name,
)
if backfilled > 0:
(log or logger).info(
"Backfilled sender_key on %d channel message(s) for %s",
backfilled,
contact_name,
)
return backfilled
async def reconcile_contact_messages(
*,
public_key: str,
contact_name: str | None,
message_repository=MessageRepository,
log: logging.Logger | None = None,
) -> tuple[int, int]:
"""Apply message reconciliation once a contact's identity is resolved."""
claimed = await claim_prefix_messages_for_contact(
public_key=public_key,
message_repository=message_repository,
log=log,
)
backfilled = await backfill_channel_sender_for_contact(
public_key=public_key,
contact_name=contact_name,
message_repository=message_repository,
log=log,
)
return claimed, backfilled
async def record_contact_name(
*,
public_key: str,
contact_name: str | None,
timestamp: int,
contact_name_history_repository=ContactNameHistoryRepository,
) -> bool:
"""Record contact name history when a non-empty name is available."""
if not contact_name:
return False
await contact_name_history_repository.record_name(
public_key.lower(),
contact_name,
timestamp,
)
return True
async def record_contact_name_and_reconcile(
*,
public_key: str,
contact_name: str | None,
timestamp: int,
message_repository=MessageRepository,
contact_name_history_repository=ContactNameHistoryRepository,
log: logging.Logger | None = None,
) -> tuple[int, int]:
"""Record name history, then reconcile message identity for the contact."""
await record_contact_name(
public_key=public_key,
contact_name=contact_name,
timestamp=timestamp,
contact_name_history_repository=contact_name_history_repository,
)
return await reconcile_contact_messages(
public_key=public_key,
contact_name=contact_name,
message_repository=message_repository,
log=log,
)
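The reconciliation helpers above take their repositories as keyword defaults (`message_repository=MessageRepository`), which is the seam that makes them testable without a database. A minimal sketch of that seam, using a hypothetical stub repository and a condensed mirror of `reconcile_contact_messages` (the real helper lives in the module above):

```python
import asyncio

# Hypothetical stub standing in for app.repository.MessageRepository.
# The return values here are made up for illustration.
class FakeMessageRepository:
    @staticmethod
    async def claim_prefix_messages(key: str) -> int:
        return 2  # pretend two prefix-keyed DMs were promoted to the full key

    @staticmethod
    async def backfill_channel_sender_key(key: str, name: str) -> int:
        return 1  # pretend one channel message gained sender attribution

async def reconcile_contact_messages(*, public_key, contact_name, message_repository):
    # Condensed mirror of the helper above: claim prefix DMs, then
    # backfill channel sender attribution once a name is known.
    claimed = await message_repository.claim_prefix_messages(public_key.lower())
    backfilled = 0
    if contact_name:
        backfilled = await message_repository.backfill_channel_sender_key(
            public_key.lower(), contact_name
        )
    return claimed, backfilled

result = asyncio.run(
    reconcile_contact_messages(
        public_key="ABCDEF0123456789",
        contact_name="Alice",
        message_repository=FakeMessageRepository,
    )
)
assert result == (2, 1)
```

The same pattern applies to `record_contact_name_and_reconcile`: swapping the repository argument is all a unit test needs.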
@@ -0,0 +1,43 @@
"""Shared pending ACK tracking for outgoing direct messages."""
import logging
import time
logger = logging.getLogger(__name__)
PendingAck = tuple[int, float, int]
_pending_acks: dict[str, PendingAck] = {}
def track_pending_ack(expected_ack: str, message_id: int, timeout_ms: int) -> None:
"""Track an expected ACK code for an outgoing direct message."""
_pending_acks[expected_ack] = (message_id, time.time(), timeout_ms)
logger.debug(
"Tracking pending ACK %s for message %d (timeout %dms)",
expected_ack,
message_id,
timeout_ms,
)
def cleanup_expired_acks() -> None:
"""Remove stale pending ACK entries."""
now = time.time()
expired_codes = [
code
for code, (_message_id, created_at, timeout_ms) in _pending_acks.items()
if now - created_at > (timeout_ms / 1000) * 2
]
for code in expired_codes:
del _pending_acks[code]
logger.debug("Expired pending ACK %s", code)
def pop_pending_ack(ack_code: str) -> int | None:
"""Claim the tracked message ID for an ACK code if present."""
pending = _pending_acks.pop(ack_code, None)
if pending is None:
return None
message_id, _, _ = pending
return message_id
@@ -0,0 +1,352 @@
"""Shared send/resend orchestration for outgoing messages."""
import logging
from collections.abc import Callable
from typing import Any
from fastapi import HTTPException
from meshcore import EventType
from app.region_scope import normalize_region_scope
from app.repository import AppSettingsRepository, ContactRepository, MessageRepository
from app.services.messages import (
build_message_model,
create_outgoing_channel_message,
create_outgoing_direct_message,
)
logger = logging.getLogger(__name__)
BroadcastFn = Callable[..., Any]
TrackAckFn = Callable[[str, int, int], None]
NowFn = Callable[[], float]
async def send_channel_message_with_effective_scope(
*,
mc,
channel,
key_bytes: bytes,
text: str,
timestamp_bytes: bytes,
action_label: str,
temp_radio_slot: int,
error_broadcast_fn: BroadcastFn,
app_settings_repository=AppSettingsRepository,
) -> Any:
"""Send a channel message, temporarily overriding flood scope when configured."""
override_scope = normalize_region_scope(channel.flood_scope_override)
baseline_scope = ""
if override_scope:
settings = await app_settings_repository.get()
baseline_scope = normalize_region_scope(settings.flood_scope)
if override_scope and override_scope != baseline_scope:
logger.info(
"Temporarily applying channel flood_scope override for %s: %r",
channel.name,
override_scope,
)
override_result = await mc.commands.set_flood_scope(override_scope)
if override_result is not None and override_result.type == EventType.ERROR:
logger.warning(
"Failed to apply channel flood_scope override for %s: %s",
channel.name,
override_result.payload,
)
raise HTTPException(
status_code=500,
detail=(
f"Failed to apply regional override {override_scope!r} before {action_label}: "
f"{override_result.payload}"
),
)
try:
set_result = await mc.commands.set_channel(
channel_idx=temp_radio_slot,
channel_name=channel.name,
channel_secret=key_bytes,
)
if set_result.type == EventType.ERROR:
logger.warning(
"Failed to set channel on radio slot %d before %s: %s",
temp_radio_slot,
action_label,
set_result.payload,
)
raise HTTPException(
status_code=500,
detail=f"Failed to configure channel on radio before {action_label}",
)
return await mc.commands.send_chan_msg(
chan=temp_radio_slot,
msg=text,
timestamp=timestamp_bytes,
)
finally:
if override_scope and override_scope != baseline_scope:
try:
restore_result = await mc.commands.set_flood_scope(
baseline_scope if baseline_scope else ""
)
if restore_result is not None and restore_result.type == EventType.ERROR:
logger.error(
"Failed to restore baseline flood_scope after sending to %s: %s",
channel.name,
restore_result.payload,
)
error_broadcast_fn(
"Regional override restore failed",
(
f"Sent to {channel.name}, but restoring flood scope failed. "
"The radio may still be region-scoped. Consider rebooting the radio."
),
)
else:
logger.debug(
"Restored baseline flood_scope after channel send: %r",
baseline_scope or "(disabled)",
)
except Exception:
logger.exception(
"Failed to restore baseline flood_scope after sending to %s",
channel.name,
)
error_broadcast_fn(
"Regional override restore failed",
(
f"Sent to {channel.name}, but restoring flood scope failed. "
"The radio may still be region-scoped. Consider rebooting the radio."
),
)
async def send_direct_message_to_contact(
*,
contact,
text: str,
radio_manager,
broadcast_fn: BroadcastFn,
track_pending_ack_fn: TrackAckFn,
now_fn: NowFn,
message_repository=MessageRepository,
contact_repository=ContactRepository,
) -> Any:
"""Send a direct message and persist/broadcast the outgoing row."""
contact_data = contact.to_radio_dict()
async with radio_manager.radio_operation("send_direct_message") as mc:
logger.debug("Ensuring contact %s is on radio before sending", contact.public_key[:12])
add_result = await mc.commands.add_contact(contact_data)
if add_result.type == EventType.ERROR:
logger.warning("Failed to add contact to radio: %s", add_result.payload)
cached_contact = mc.get_contact_by_key_prefix(contact.public_key[:12])
if not cached_contact:
cached_contact = contact_data
logger.info("Sending direct message to %s", contact.public_key[:12])
now = int(now_fn())
result = await mc.commands.send_msg(
dst=cached_contact,
msg=text,
timestamp=now,
)
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")
message = await create_outgoing_direct_message(
conversation_key=contact.public_key.lower(),
text=text,
sender_timestamp=now,
received_at=now,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if message is None:
raise HTTPException(
status_code=500,
detail="Failed to store outgoing message - unexpected duplicate",
)
await contact_repository.update_last_contacted(contact.public_key.lower(), now)
expected_ack = result.payload.get("expected_ack")
suggested_timeout: int = result.payload.get("suggested_timeout", 10000)
if expected_ack:
ack_code = expected_ack.hex() if isinstance(expected_ack, bytes) else expected_ack
track_pending_ack_fn(ack_code, message.id, suggested_timeout)
logger.debug("Tracking ACK %s for message %d", ack_code, message.id)
return message
async def send_channel_message_to_channel(
*,
channel,
channel_key_upper: str,
key_bytes: bytes,
text: str,
radio_manager,
broadcast_fn: BroadcastFn,
error_broadcast_fn: BroadcastFn,
now_fn: NowFn,
temp_radio_slot: int,
message_repository=MessageRepository,
) -> Any:
"""Send a channel message and persist/broadcast the outgoing row."""
message_id: int | None = None
now: int | None = None
radio_name = ""
our_public_key: str | None = None
text_with_sender = text
async with radio_manager.radio_operation("send_channel_message") as mc:
radio_name = mc.self_info.get("name", "") if mc.self_info else ""
our_public_key = (mc.self_info.get("public_key") or None) if mc.self_info else None
text_with_sender = f"{radio_name}: {text}" if radio_name else text
logger.info("Sending channel message to %s: %s", channel.name, text[:50])
now = int(now_fn())
timestamp_bytes = now.to_bytes(4, "little")
result = await send_channel_message_with_effective_scope(
mc=mc,
channel=channel,
key_bytes=key_bytes,
text=text,
timestamp_bytes=timestamp_bytes,
action_label="sending message",
temp_radio_slot=temp_radio_slot,
error_broadcast_fn=error_broadcast_fn,
)
if result.type == EventType.ERROR:
raise HTTPException(status_code=500, detail=f"Failed to send message: {result.payload}")
outgoing_message = await create_outgoing_channel_message(
conversation_key=channel_key_upper,
text=text_with_sender,
sender_timestamp=now,
received_at=now,
sender_name=radio_name or None,
sender_key=our_public_key,
channel_name=channel.name,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if outgoing_message is None:
raise HTTPException(
status_code=500,
detail="Failed to store outgoing message - unexpected duplicate",
)
message_id = outgoing_message.id
if message_id is None or now is None:
raise HTTPException(status_code=500, detail="Failed to store outgoing message")
acked_count, paths = await message_repository.get_ack_and_paths(message_id)
return build_message_model(
message_id=message_id,
msg_type="CHAN",
conversation_key=channel_key_upper,
text=text_with_sender,
sender_timestamp=now,
received_at=now,
paths=paths,
outgoing=True,
acked=acked_count,
sender_name=radio_name or None,
sender_key=our_public_key,
channel_name=channel.name,
)
async def resend_channel_message_record(
*,
message,
channel,
new_timestamp: bool,
radio_manager,
broadcast_fn: BroadcastFn,
error_broadcast_fn: BroadcastFn,
now_fn: NowFn,
temp_radio_slot: int,
message_repository=MessageRepository,
) -> dict[str, Any]:
"""Resend a stored outgoing channel message."""
try:
key_bytes = bytes.fromhex(message.conversation_key)
except ValueError:
raise HTTPException(
status_code=400,
detail=f"Invalid channel key format: {message.conversation_key}",
) from None
now: int | None = None
if new_timestamp:
now = int(now_fn())
timestamp_bytes = now.to_bytes(4, "little")
else:
timestamp_bytes = message.sender_timestamp.to_bytes(4, "little")
resend_public_key: str | None = None
radio_name = ""
async with radio_manager.radio_operation("resend_channel_message") as mc:
radio_name = mc.self_info.get("name", "") if mc.self_info else ""
resend_public_key = (mc.self_info.get("public_key") or None) if mc.self_info else None
text_to_send = message.text
if radio_name and text_to_send.startswith(f"{radio_name}: "):
text_to_send = text_to_send[len(f"{radio_name}: ") :]
result = await send_channel_message_with_effective_scope(
mc=mc,
channel=channel,
key_bytes=key_bytes,
text=text_to_send,
timestamp_bytes=timestamp_bytes,
action_label="resending message",
temp_radio_slot=temp_radio_slot,
error_broadcast_fn=error_broadcast_fn,
)
if result.type == EventType.ERROR:
raise HTTPException(
status_code=500,
detail=f"Failed to resend message: {result.payload}",
)
if new_timestamp:
if now is None:
raise HTTPException(status_code=500, detail="Failed to assign resend timestamp")
new_message = await create_outgoing_channel_message(
conversation_key=message.conversation_key,
text=message.text,
sender_timestamp=now,
received_at=now,
sender_name=radio_name or None,
sender_key=resend_public_key,
channel_name=channel.name,
broadcast_fn=broadcast_fn,
message_repository=message_repository,
)
if new_message is None:
logger.warning(
"Duplicate timestamp collision resending message %d — radio sent but DB row not created",
message.id,
)
return {"status": "ok", "message_id": message.id}
logger.info(
"Resent channel message %d as new message %d to %s",
message.id,
new_message.id,
channel.name,
)
return {"status": "ok", "message_id": new_message.id}
logger.info("Resent channel message %d to %s", message.id, channel.name)
return {"status": "ok", "message_id": message.id}
@@ -0,0 +1,447 @@
import logging
import time
from collections.abc import Callable
from typing import TYPE_CHECKING, Any
from app.models import CONTACT_TYPE_REPEATER, Message, MessagePath
from app.repository import ContactRepository, MessageRepository, RawPacketRepository
if TYPE_CHECKING:
from app.decoder import DecryptedDirectMessage
logger = logging.getLogger(__name__)
BroadcastFn = Callable[..., Any]
def build_message_paths(
path: str | None,
received_at: int,
path_len: int | None = None,
) -> list[MessagePath] | None:
"""Build the single-path list used by message payloads."""
return (
[MessagePath(path=path or "", received_at=received_at, path_len=path_len)]
if path is not None
else None
)
def build_message_model(
*,
message_id: int,
msg_type: str,
conversation_key: str,
text: str,
sender_timestamp: int | None,
received_at: int,
paths: list[MessagePath] | None = None,
txt_type: int = 0,
signature: str | None = None,
sender_key: str | None = None,
outgoing: bool = False,
acked: int = 0,
sender_name: str | None = None,
channel_name: str | None = None,
) -> Message:
"""Build a Message model with the canonical backend payload shape."""
return Message(
id=message_id,
type=msg_type,
conversation_key=conversation_key,
text=text,
sender_timestamp=sender_timestamp,
received_at=received_at,
paths=paths,
txt_type=txt_type,
signature=signature,
sender_key=sender_key,
outgoing=outgoing,
acked=acked,
sender_name=sender_name,
channel_name=channel_name,
)
def broadcast_message(
*,
message: Message,
broadcast_fn: BroadcastFn,
realtime: bool | None = None,
) -> None:
"""Broadcast a message payload, preserving the caller's broadcast signature."""
payload = message.model_dump()
if realtime is None:
broadcast_fn("message", payload)
else:
broadcast_fn("message", payload, realtime=realtime)
def broadcast_message_acked(
*,
message_id: int,
ack_count: int,
paths: list[MessagePath] | None,
broadcast_fn: BroadcastFn,
) -> None:
"""Broadcast a message_acked payload."""
broadcast_fn(
"message_acked",
{
"message_id": message_id,
"ack_count": ack_count,
"paths": [path.model_dump() for path in paths] if paths else [],
},
)
async def increment_ack_and_broadcast(
*,
message_id: int,
broadcast_fn: BroadcastFn,
) -> int:
"""Increment a message's ACK count and broadcast the update."""
ack_count = await MessageRepository.increment_ack_count(message_id)
broadcast_fn("message_acked", {"message_id": message_id, "ack_count": ack_count})
return ack_count
async def handle_duplicate_message(
*,
packet_id: int,
msg_type: str,
conversation_key: str,
text: str,
sender_timestamp: int,
path: str | None,
received_at: int,
path_len: int | None = None,
broadcast_fn: BroadcastFn,
) -> None:
"""Handle a duplicate message by updating paths/acks on the existing record."""
existing_msg = await MessageRepository.get_by_content(
msg_type=msg_type,
conversation_key=conversation_key,
text=text,
sender_timestamp=sender_timestamp,
)
if not existing_msg:
label = "message" if msg_type == "CHAN" else "DM"
logger.warning(
"Duplicate %s for %s but couldn't find existing",
label,
conversation_key[:12],
)
return
logger.debug(
"Duplicate %s for %s (msg_id=%d, outgoing=%s) - adding path",
msg_type,
conversation_key[:12],
existing_msg.id,
existing_msg.outgoing,
)
if path is not None:
paths = await MessageRepository.add_path(existing_msg.id, path, received_at, path_len)
else:
paths = existing_msg.paths or []
if existing_msg.outgoing:
ack_count = await MessageRepository.increment_ack_count(existing_msg.id)
else:
ack_count = existing_msg.acked
if existing_msg.outgoing or path is not None:
broadcast_message_acked(
message_id=existing_msg.id,
ack_count=ack_count,
paths=paths,
broadcast_fn=broadcast_fn,
)
await RawPacketRepository.mark_decrypted(packet_id, existing_msg.id)
async def create_message_from_decrypted(
*,
packet_id: int,
channel_key: str,
sender: str | None,
message_text: str,
timestamp: int,
received_at: int | None = None,
path: str | None = None,
path_len: int | None = None,
channel_name: str | None = None,
realtime: bool = True,
broadcast_fn: BroadcastFn,
) -> int | None:
"""Store and broadcast a decrypted channel message."""
received = received_at or int(time.time())
text = f"{sender}: {message_text}" if sender else message_text
channel_key_normalized = channel_key.upper()
resolved_sender_key: str | None = None
if sender:
candidates = await ContactRepository.get_by_name(sender)
if len(candidates) == 1:
resolved_sender_key = candidates[0].public_key
msg_id = await MessageRepository.create(
msg_type="CHAN",
text=text,
conversation_key=channel_key_normalized,
sender_timestamp=timestamp,
received_at=received,
path=path,
path_len=path_len,
sender_name=sender,
sender_key=resolved_sender_key,
)
if msg_id is None:
await handle_duplicate_message(
packet_id=packet_id,
msg_type="CHAN",
conversation_key=channel_key_normalized,
text=text,
sender_timestamp=timestamp,
path=path,
received_at=received,
path_len=path_len,
broadcast_fn=broadcast_fn,
)
return None
logger.info("Stored channel message %d for channel %s", msg_id, channel_key_normalized[:8])
await RawPacketRepository.mark_decrypted(packet_id, msg_id)
broadcast_message(
message=build_message_model(
message_id=msg_id,
msg_type="CHAN",
conversation_key=channel_key_normalized,
text=text,
sender_timestamp=timestamp,
received_at=received,
paths=build_message_paths(path, received, path_len),
sender_name=sender,
sender_key=resolved_sender_key,
channel_name=channel_name,
),
broadcast_fn=broadcast_fn,
realtime=realtime,
)
return msg_id
async def create_dm_message_from_decrypted(
*,
packet_id: int,
decrypted: "DecryptedDirectMessage",
their_public_key: str,
our_public_key: str | None,
received_at: int | None = None,
path: str | None = None,
path_len: int | None = None,
outgoing: bool = False,
realtime: bool = True,
broadcast_fn: BroadcastFn,
) -> int | None:
"""Store and broadcast a decrypted direct message."""
contact = await ContactRepository.get_by_key(their_public_key)
if contact and contact.type == CONTACT_TYPE_REPEATER:
logger.debug(
"Skipping message from repeater %s (CLI responses not stored): %s",
their_public_key[:12],
(decrypted.message or "")[:50],
)
return None
received = received_at or int(time.time())
conversation_key = their_public_key.lower()
sender_name = contact.name if contact and not outgoing else None
msg_id = await MessageRepository.create(
msg_type="PRIV",
text=decrypted.message,
conversation_key=conversation_key,
sender_timestamp=decrypted.timestamp,
received_at=received,
path=path,
path_len=path_len,
outgoing=outgoing,
sender_key=conversation_key if not outgoing else None,
sender_name=sender_name,
)
if msg_id is None:
await handle_duplicate_message(
packet_id=packet_id,
msg_type="PRIV",
conversation_key=conversation_key,
text=decrypted.message,
sender_timestamp=decrypted.timestamp,
path=path,
received_at=received,
path_len=path_len,
broadcast_fn=broadcast_fn,
)
return None
logger.info(
"Stored direct message %d for contact %s (outgoing=%s)",
msg_id,
conversation_key[:12],
outgoing,
)
await RawPacketRepository.mark_decrypted(packet_id, msg_id)
broadcast_message(
message=build_message_model(
message_id=msg_id,
msg_type="PRIV",
conversation_key=conversation_key,
text=decrypted.message,
sender_timestamp=decrypted.timestamp,
received_at=received,
paths=build_message_paths(path, received, path_len),
outgoing=outgoing,
sender_name=sender_name,
sender_key=conversation_key if not outgoing else None,
),
broadcast_fn=broadcast_fn,
realtime=realtime,
)
await ContactRepository.update_last_contacted(conversation_key, received)
return msg_id
async def create_fallback_direct_message(
*,
conversation_key: str,
text: str,
sender_timestamp: int,
received_at: int,
path: str | None,
path_len: int | None,
txt_type: int,
signature: str | None,
sender_name: str | None,
sender_key: str | None,
broadcast_fn: BroadcastFn,
message_repository=MessageRepository,
) -> Message | None:
"""Store and broadcast a CONTACT_MSG_RECV fallback direct message."""
msg_id = await message_repository.create(
msg_type="PRIV",
text=text,
conversation_key=conversation_key,
sender_timestamp=sender_timestamp,
received_at=received_at,
path=path,
path_len=path_len,
txt_type=txt_type,
signature=signature,
sender_key=sender_key,
sender_name=sender_name,
)
if msg_id is None:
return None
message = build_message_model(
message_id=msg_id,
msg_type="PRIV",
conversation_key=conversation_key,
text=text,
sender_timestamp=sender_timestamp,
received_at=received_at,
paths=build_message_paths(path, received_at, path_len),
txt_type=txt_type,
signature=signature,
sender_key=sender_key,
sender_name=sender_name,
)
broadcast_message(message=message, broadcast_fn=broadcast_fn)
return message
async def create_outgoing_direct_message(
*,
conversation_key: str,
text: str,
sender_timestamp: int,
received_at: int,
broadcast_fn: BroadcastFn,
message_repository=MessageRepository,
) -> Message | None:
"""Store and broadcast an outgoing direct message."""
msg_id = await message_repository.create(
msg_type="PRIV",
text=text,
conversation_key=conversation_key,
sender_timestamp=sender_timestamp,
received_at=received_at,
outgoing=True,
)
if msg_id is None:
return None
message = build_message_model(
message_id=msg_id,
msg_type="PRIV",
conversation_key=conversation_key,
text=text,
sender_timestamp=sender_timestamp,
received_at=received_at,
outgoing=True,
acked=0,
)
broadcast_message(message=message, broadcast_fn=broadcast_fn)
return message
async def create_outgoing_channel_message(
*,
conversation_key: str,
text: str,
sender_timestamp: int,
received_at: int,
sender_name: str | None,
sender_key: str | None,
channel_name: str | None,
broadcast_fn: BroadcastFn,
message_repository=MessageRepository,
) -> Message | None:
"""Store and broadcast an outgoing channel message."""
msg_id = await message_repository.create(
msg_type="CHAN",
text=text,
conversation_key=conversation_key,
sender_timestamp=sender_timestamp,
received_at=received_at,
outgoing=True,
sender_name=sender_name,
sender_key=sender_key,
)
if msg_id is None:
return None
message = build_message_model(
message_id=msg_id,
msg_type="CHAN",
conversation_key=conversation_key,
text=text,
sender_timestamp=sender_timestamp,
received_at=received_at,
outgoing=True,
acked=0,
sender_name=sender_name,
sender_key=sender_key,
channel_name=channel_name,
)
broadcast_message(message=message, broadcast_fn=broadcast_fn)
return message
+102
@@ -0,0 +1,102 @@
import logging
from collections.abc import Awaitable, Callable
from typing import Any
from meshcore import EventType
logger = logging.getLogger(__name__)
class RadioCommandServiceError(RuntimeError):
"""Base error for reusable radio command workflows."""
class PathHashModeUnsupportedError(RadioCommandServiceError):
"""Raised when firmware does not support path hash mode updates."""
class RadioCommandRejectedError(RadioCommandServiceError):
"""Raised when the radio reports an error for a command."""
class KeystoreRefreshError(RadioCommandServiceError):
"""Raised when server-side keystore refresh fails after import."""
async def apply_radio_config_update(
mc,
update,
*,
path_hash_mode_supported: bool,
set_path_hash_mode: Callable[[int], None],
sync_radio_time_fn: Callable[[Any], Awaitable[Any]],
) -> None:
"""Apply a validated radio-config update to the connected radio."""
if update.name is not None:
logger.info("Setting radio name to %s", update.name)
await mc.commands.set_name(update.name)
if update.lat is not None or update.lon is not None:
current_info = mc.self_info
lat = update.lat if update.lat is not None else current_info.get("adv_lat", 0.0)
lon = update.lon if update.lon is not None else current_info.get("adv_lon", 0.0)
logger.info("Setting radio coordinates to %f, %f", lat, lon)
await mc.commands.set_coords(lat=lat, lon=lon)
if update.tx_power is not None:
logger.info("Setting TX power to %d dBm", update.tx_power)
await mc.commands.set_tx_power(val=update.tx_power)
if update.radio is not None:
logger.info(
"Setting radio params: freq=%f MHz, bw=%f kHz, sf=%d, cr=%d",
update.radio.freq,
update.radio.bw,
update.radio.sf,
update.radio.cr,
)
await mc.commands.set_radio(
freq=update.radio.freq,
bw=update.radio.bw,
sf=update.radio.sf,
cr=update.radio.cr,
)
if update.path_hash_mode is not None:
if not path_hash_mode_supported:
raise PathHashModeUnsupportedError("Firmware does not support path hash mode setting")
logger.info("Setting path hash mode to %d", update.path_hash_mode)
result = await mc.commands.set_path_hash_mode(update.path_hash_mode)
if result is not None and result.type == EventType.ERROR:
raise RadioCommandRejectedError(f"Failed to set path hash mode: {result.payload}")
set_path_hash_mode(update.path_hash_mode)
await sync_radio_time_fn(mc)
# Commands like set_name() write to flash but don't update cached self_info.
# send_appstart() forces a fresh SELF_INFO so the response reflects changes.
await mc.commands.send_appstart()
async def import_private_key_and_refresh_keystore(
mc,
key_bytes: bytes,
*,
export_and_store_private_key_fn: Callable[[Any], Awaitable[bool]],
) -> None:
"""Import a private key and refresh the in-memory keystore immediately."""
result = await mc.commands.import_private_key(key_bytes)
if result.type == EventType.ERROR:
raise RadioCommandRejectedError(f"Failed to import private key: {result.payload}")
keystore_refreshed = await export_and_store_private_key_fn(mc)
if not keystore_refreshed:
logger.warning("Keystore refresh failed after import, retrying once")
keystore_refreshed = await export_and_store_private_key_fn(mc)
if not keystore_refreshed:
raise KeystoreRefreshError(
"Private key imported on radio, but server-side keystore refresh failed. "
"Reconnect to apply the new key for DM decryption."
)
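The import flow above retries the keystore refresh exactly once before surfacing `KeystoreRefreshError`. The retry shape in isolation (a sketch; `refresh` stands in for `export_and_store_private_key_fn`):

```python
import asyncio

class KeystoreRefreshError(RuntimeError):
    pass

async def refresh_with_one_retry(refresh) -> None:
    # Mirrors the post-import flow: one retry, then a hard error that
    # tells the operator the radio-side import itself succeeded.
    if await refresh():
        return
    if await refresh():
        return
    raise KeystoreRefreshError("keystore refresh failed after retry")

# Succeeds on the second attempt.
attempts = []

async def flaky() -> bool:
    attempts.append(1)
    return len(attempts) >= 2

asyncio.run(refresh_with_one_retry(flaky))
```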
+221
@@ -0,0 +1,221 @@
import asyncio
import logging
logger = logging.getLogger(__name__)
async def run_post_connect_setup(radio_manager) -> None:
"""Run shared radio initialization after a transport connection succeeds."""
from app.event_handlers import register_event_handlers
from app.keystore import export_and_store_private_key
from app.radio_sync import (
drain_pending_messages,
send_advertisement,
start_message_polling,
start_periodic_advert,
start_periodic_sync,
sync_and_offload_all,
sync_radio_time,
)
if not radio_manager.meshcore:
return
if radio_manager._setup_lock is None:
radio_manager._setup_lock = asyncio.Lock()
async with radio_manager._setup_lock:
if not radio_manager.meshcore:
return
radio_manager._setup_in_progress = True
radio_manager._setup_complete = False
mc = radio_manager.meshcore
try:
# Register event handlers (no radio I/O, just callback setup)
register_event_handlers(mc)
# Hold the operation lock for all radio I/O during setup.
# This prevents user-initiated operations (send message, etc.)
# from interleaving commands on the serial link.
await radio_manager._acquire_operation_lock("post_connect_setup", blocking=True)
try:
await export_and_store_private_key(mc)
# Sync radio clock with system time
await sync_radio_time(mc)
# Apply flood scope from settings (best-effort; older firmware
# may not support set_flood_scope)
from app.region_scope import normalize_region_scope
from app.repository import AppSettingsRepository
app_settings = await AppSettingsRepository.get()
scope = normalize_region_scope(app_settings.flood_scope)
try:
await mc.commands.set_flood_scope(scope if scope else "")
logger.info("Applied flood_scope=%r", scope or "(disabled)")
except Exception as exc:
logger.warning("set_flood_scope failed (firmware may not support it): %s", exc)
# Query path hash mode support (best-effort; older firmware won't report it).
# If the library's parsed payload is missing path_hash_mode (e.g. stale
# .pyc on WSL2 Windows mounts), fall back to raw-frame extraction.
reader = mc._reader
_original_handle_rx = reader.handle_rx
_captured_frame: list[bytes] = []
async def _capture_handle_rx(data: bytearray) -> None:
from meshcore.packets import PacketType
if len(data) > 0 and data[0] == PacketType.DEVICE_INFO.value:
_captured_frame.append(bytes(data))
return await _original_handle_rx(data)
reader.handle_rx = _capture_handle_rx
radio_manager.path_hash_mode = 0
radio_manager.path_hash_mode_supported = False
try:
device_query = await mc.commands.send_device_query()
if device_query and "path_hash_mode" in device_query.payload:
radio_manager.path_hash_mode = device_query.payload["path_hash_mode"]
radio_manager.path_hash_mode_supported = True
elif _captured_frame:
# Raw-frame fallback: byte 1 = fw_ver, byte 81 = path_hash_mode
raw = _captured_frame[-1]
fw_ver = raw[1] if len(raw) > 1 else 0
if fw_ver >= 10 and len(raw) >= 82:
radio_manager.path_hash_mode = raw[81]
radio_manager.path_hash_mode_supported = True
logger.warning(
"path_hash_mode=%d extracted from raw frame "
"(stale .pyc? try: rm %s)",
radio_manager.path_hash_mode,
getattr(
__import__("meshcore.reader", fromlist=["reader"]),
"__cached__",
"meshcore __pycache__/reader.*.pyc",
),
)
if radio_manager.path_hash_mode_supported:
logger.info("Path hash mode: %d (supported)", radio_manager.path_hash_mode)
else:
logger.debug("Firmware does not report path_hash_mode")
except Exception as exc:
logger.debug("Failed to query path_hash_mode: %s", exc)
finally:
reader.handle_rx = _original_handle_rx
# Sync contacts/channels from radio to DB and clear radio
logger.info("Syncing and offloading radio data...")
result = await sync_and_offload_all(mc)
logger.info("Sync complete: %s", result)
# Send advertisement to announce our presence (if enabled and not throttled)
if await send_advertisement(mc):
logger.info("Advertisement sent")
else:
logger.debug("Advertisement skipped (disabled or throttled)")
# Drain any messages that were queued before we connected.
# This must happen BEFORE starting auto-fetch, otherwise both
# compete on get_msg() with interleaved radio I/O.
drained = await drain_pending_messages(mc)
if drained > 0:
logger.info("Drained %d pending message(s)", drained)
await mc.start_auto_message_fetching()
logger.info("Auto message fetching started")
finally:
radio_manager._release_operation_lock("post_connect_setup")
# Start background tasks AFTER releasing the operation lock.
# These tasks acquire their own locks when they need radio access.
start_periodic_sync()
start_periodic_advert()
start_message_polling()
radio_manager._setup_complete = True
finally:
radio_manager._setup_in_progress = False
logger.info("Post-connect setup complete")
async def prepare_connected_radio(radio_manager, *, broadcast_on_success: bool = True) -> None:
"""Finish setup for an already-connected radio and optionally broadcast health."""
from app.websocket import broadcast_health
await radio_manager.post_connect_setup()
radio_manager._last_connected = True
if broadcast_on_success:
broadcast_health(True, radio_manager.connection_info)
async def reconnect_and_prepare_radio(
radio_manager,
*,
broadcast_on_success: bool = True,
) -> bool:
"""Reconnect the transport, then run post-connect setup before reporting healthy."""
connected = await radio_manager.reconnect(broadcast_on_success=False)
if not connected:
return False
await prepare_connected_radio(radio_manager, broadcast_on_success=broadcast_on_success)
return True
async def connection_monitor_loop(radio_manager) -> None:
"""Monitor radio health and keep transport/setup state converged."""
from app.websocket import broadcast_health
check_interval_seconds = 5
unresponsive_threshold = 3
consecutive_setup_failures = 0
while True:
try:
await asyncio.sleep(check_interval_seconds)
current_connected = radio_manager.is_connected
if radio_manager._last_connected and not current_connected:
logger.warning("Radio connection lost, broadcasting status change")
broadcast_health(False, radio_manager.connection_info)
radio_manager._last_connected = False
consecutive_setup_failures = 0
if not current_connected:
if not radio_manager.is_reconnecting and await reconnect_and_prepare_radio(
radio_manager,
broadcast_on_success=True,
):
consecutive_setup_failures = 0
elif not radio_manager._last_connected and current_connected:
logger.info("Radio connection restored")
await prepare_connected_radio(radio_manager, broadcast_on_success=True)
consecutive_setup_failures = 0
elif current_connected and not radio_manager.is_setup_complete:
logger.info("Retrying post-connect setup...")
await prepare_connected_radio(radio_manager, broadcast_on_success=True)
consecutive_setup_failures = 0
except asyncio.CancelledError:
break
except Exception as e:
consecutive_setup_failures += 1
if consecutive_setup_failures == unresponsive_threshold:
logger.error(
"Post-connect setup has failed %d times in a row. "
"The radio port appears open but the radio is not "
"responding to commands. Common causes: another "
"process has the serial port open (check for other "
"RemoteTerm instances, serial monitors, etc.), the "
"firmware is in repeater mode (not client), or the "
"radio needs a power cycle. Will keep retrying.",
consecutive_setup_failures,
)
elif consecutive_setup_failures < unresponsive_threshold:
logger.exception("Error in connection monitor, continuing: %s", e)
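The monitor's failure handling escalates exactly once: full tracebacks while below the threshold, one loud "radio unresponsive" error when the threshold is reached, then quiet retries. The decision rule in isolation:

```python
def classify_failure(consecutive_failures: int, threshold: int = 3) -> str:
    # == threshold: emit the one-time escalated error with guidance.
    # < threshold: log the exception with traceback and keep going.
    # > threshold: stay quiet while the loop keeps retrying.
    if consecutive_failures == threshold:
        return "escalate"
    if consecutive_failures < threshold:
        return "log_exception"
    return "quiet"

sequence = [classify_failure(n) for n in (1, 2, 3, 4)]
```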
+91
@@ -0,0 +1,91 @@
"""Shared access seam over the global RadioManager instance.
This module deliberately keeps behavior thin and forwarding-only. The goal is
to reduce direct `app.radio.radio_manager` imports across routers and helpers
without changing radio lifecycle, lock, or connection semantics.
"""
from collections.abc import Callable
from contextlib import asynccontextmanager
from typing import Any
from fastapi import HTTPException
import app.radio as radio_module
class RadioRuntime:
"""Thin forwarding wrapper around the process-global RadioManager."""
def __init__(self, manager_or_getter=None):
if manager_or_getter is None:
self._manager_getter: Callable[[], Any] = lambda: radio_module.radio_manager
elif callable(manager_or_getter):
self._manager_getter = manager_or_getter
else:
self._manager_getter = lambda: manager_or_getter
@property
def manager(self) -> Any:
return self._manager_getter()
def __getattr__(self, name: str) -> Any:
"""Forward unknown attributes to the current global manager."""
return getattr(self.manager, name)
@staticmethod
def _is_local_runtime_attr(name: str) -> bool:
return name.startswith("_") or hasattr(RadioRuntime, name)
def __setattr__(self, name: str, value: Any) -> None:
if self._is_local_runtime_attr(name):
object.__setattr__(self, name, value)
return
setattr(self.manager, name, value)
def __delattr__(self, name: str) -> None:
if self._is_local_runtime_attr(name):
object.__delattr__(self, name)
return
delattr(self.manager, name)
def require_connected(self):
"""Return MeshCore when available, mirroring existing HTTP semantics."""
if self.is_setup_in_progress:
raise HTTPException(status_code=503, detail="Radio is initializing")
if not self.is_connected:
raise HTTPException(status_code=503, detail="Radio not connected")
mc = self.meshcore
if mc is None:
raise HTTPException(status_code=503, detail="Radio not connected")
return mc
@asynccontextmanager
async def radio_operation(self, name: str, **kwargs):
async with self.manager.radio_operation(name, **kwargs) as mc:
yield mc
async def start_connection_monitor(self) -> None:
await self.manager.start_connection_monitor()
async def stop_connection_monitor(self) -> None:
await self.manager.stop_connection_monitor()
async def disconnect(self) -> None:
await self.manager.disconnect()
async def prepare_connected(self, *, broadcast_on_success: bool = True) -> None:
from app.services.radio_lifecycle import prepare_connected_radio
await prepare_connected_radio(self.manager, broadcast_on_success=broadcast_on_success)
async def reconnect_and_prepare(self, *, broadcast_on_success: bool = True) -> bool:
from app.services.radio_lifecycle import reconnect_and_prepare_radio
return await reconnect_and_prepare_radio(
self.manager,
broadcast_on_success=broadcast_on_success,
)
radio_runtime = RadioRuntime()
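Because `RadioRuntime` resolves the manager through a getter on every access, call sites keep working even when the process-global instance is swapped. A stand-alone sketch of that forwarding pattern with a stub manager (not the real RadioManager):

```python
class StubManager:
    def __init__(self, name: str):
        self.name = name
        self.is_connected = True

class Runtime:
    # Same forwarding shape as RadioRuntime: resolve the manager lazily
    # through a getter so swapping the global instance is transparent.
    def __init__(self, getter):
        object.__setattr__(self, "_getter", getter)

    def __getattr__(self, name):
        return getattr(self._getter(), name)

    def __setattr__(self, name, value):
        setattr(self._getter(), name, value)

current = StubManager("first")
runtime = Runtime(lambda: current)
assert runtime.name == "first"

current = StubManager("second")  # swap the "global" manager
assert runtime.name == "second"  # the runtime follows the swap

runtime.is_connected = False     # attribute writes also forward
```

The real class additionally carves out local attributes (names starting with `_` or defined on the class) so the wrapper's own state never leaks onto the manager.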
+4 -3
@@ -1,12 +1,13 @@
"""WebSocket manager for real-time updates."""
import asyncio
import json
import logging
from typing import Any
from fastapi import WebSocket
from app.events import dump_ws_event
logger = logging.getLogger(__name__)
# Timeout for individual WebSocket send operations (seconds)
@@ -45,7 +46,7 @@ class WebSocketManager:
if not self.active_connections:
return
message = json.dumps({"type": event_type, "data": data})
message = dump_ws_event(event_type, data)
# Copy connection list under lock to avoid holding lock during I/O
async with self._lock:
@@ -81,7 +82,7 @@ class WebSocketManager:
async def send_personal(self, websocket: WebSocket, event_type: str, data: Any) -> None:
"""Send an event to a specific client."""
message = json.dumps({"type": event_type, "data": data})
message = dump_ws_event(event_type, data)
try:
await websocket.send_text(message)
except Exception as e:
+65 -10
@@ -16,15 +16,25 @@ Keep it aligned with `frontend/src` source code.
- `meshcore-hashtag-cracker` + `nosleep.js` (channel cracker)
- Multibyte-aware decoder build published as `meshcore-decoder-multibyte-patch`
## Code Ethos
- Prefer fewer, stronger modules over many thin wrappers.
- Split code only when the new hook/component owns a real invariant or workflow.
- Keep one reasoning unit readable in one place, even if that file is moderately large.
- Avoid dedicated files whose main job is pass-through, prop bundling, or renaming.
- For this repo, "locally dense but semantically obvious" is better than indirection-heavy "clean architecture".
- When refactoring, preserve behavior first and add tests around the seam being moved.
## Frontend Map
```text
frontend/src/
├── main.tsx # React entry point (StrictMode, root render)
├── App.tsx # App shell and orchestration
├── App.tsx # Data/orchestration entry that wires hooks into AppShell
├── api.ts # Typed REST client
├── types.ts # Shared TS contracts
├── useWebSocket.ts # WS lifecycle + event dispatch
├── wsEvents.ts # Typed WS event parsing / discriminated union
├── messageCache.ts # Conversation-scoped cache
├── prefetch.ts # Consumes prefetched API promises started in index.html
├── index.css # Global styles/utilities
@@ -34,13 +44,27 @@ frontend/src/
│ └── utils.ts # cn() — clsx + tailwind-merge helper
├── hooks/
│ ├── index.ts # Central re-export of all hooks
│ ├── useConversationMessages.ts # Fetch, pagination, dedup, ACK buffering
│ ├── useConversationActions.ts # Send/resend/trace/block conversation actions
│ ├── useConversationNavigation.ts # Search target, selection reset, and info-pane navigation state
│ ├── useConversationMessages.ts # Conversation timeline loading, cache restore, jump-target loading, pagination, dedup, pending ACK buffering
│ ├── useUnreadCounts.ts # Unread counters, mentions, recent-sort timestamps
│ ├── useRealtimeAppState.ts # WebSocket event application and reconnect recovery
│ ├── useAppShell.ts # App-shell view state (settings/sidebar/modals/cracker)
│ ├── useRepeaterDashboard.ts # Repeater dashboard state (login, panes, console, retries)
│ ├── useRadioControl.ts # Radio health/config state, reconnection
│ ├── useAppSettings.ts # Settings, favorites, preferences migration
│ ├── useConversationRouter.ts # URL hash → active conversation routing
│ └── useContactsAndChannels.ts # Contact/channel loading, creation, deletion
├── components/
│ ├── AppShell.tsx # App-shell layout: status, sidebar, search/settings panes, cracker, modals
│ ├── ConversationPane.tsx # Active conversation surface selection (map/raw/repeater/chat/empty)
│ ├── visualizer/
│ │ ├── useVisualizerData3D.ts # Packet→graph data pipeline, repeat aggregation, simulation state
│ │ ├── useVisualizer3DScene.ts # Three.js scene lifecycle, buffers, hover/pin interaction
│ │ ├── VisualizerControls.tsx # Visualizer legends and control panel overlay
│ │ ├── VisualizerTooltip.tsx # Hover/pin node detail overlay
│ │ └── shared.ts # Graph node/link types and shared rendering helpers
│ └── ...
├── utils/
│ ├── urlHash.ts # Hash parsing and encoding
│ ├── conversationState.ts # State keys, in-memory + localStorage helpers
@@ -136,10 +160,15 @@ frontend/src/
├── searchView.test.tsx
├── useConversationMessages.test.ts
├── useConversationMessages.race.test.ts
├── useConversationNavigation.test.ts
├── useAppShell.test.ts
├── useRepeaterDashboard.test.ts
├── useContactsAndChannels.test.ts
├── useRealtimeAppState.test.ts
├── useUnreadCounts.test.ts
├── useWebSocket.dispatch.test.ts
└── useWebSocket.lifecycle.test.ts
├── useWebSocket.lifecycle.test.ts
└── wsEvents.test.ts
```
@@ -147,19 +176,42 @@ frontend/src/
### State ownership
`App.tsx` orchestrates high-level state and delegates to hooks:
`App.tsx` is now a thin composition entrypoint over the hook layer. `AppShell.tsx` owns shell layout/composition:
- local label banner
- status bar
- desktop/mobile sidebar container
- search/settings surface switching
- global cracker mount/focus behavior
- new-message modal and info panes
High-level state is delegated to hooks:
- `useAppShell`: app-shell view state (settings section, sidebar, cracker, new-message modal)
- `useRadioControl`: radio health/config state, reconnect/reboot polling
- `useAppSettings`: settings CRUD, favorites, preferences migration
- `useContactsAndChannels`: contact/channel lists, creation, deletion
- `useConversationRouter`: URL hash → active conversation routing
- `useConversationMessages`: fetch, pagination, dedup/update helpers
- `useConversationNavigation`: search target, conversation selection reset, and info-pane state
- `useConversationActions`: send/resend/trace/block handlers and channel override updates
- `useConversationMessages`: conversation switch loading, cache restore, jump-target loading, pagination, dedup/update helpers, and pending ACK buffering
- `useUnreadCounts`: unread counters, mention tracking, recent-sort timestamps
- `useRealtimeAppState`: typed WS event application, reconnect recovery, cache/unread coordination
- `useRepeaterDashboard`: repeater dashboard state (login, pane data/retries, console, actions)
`App.tsx` intentionally still does the final `AppShell` prop assembly. That composition layer is considered acceptable here because it keeps the shell contract visible in one place and avoids a prop-bundling hook with little original logic.
`ConversationPane.tsx` owns the main active-conversation surface branching:
- empty state
- map view
- visualizer
- raw packet feed
- repeater dashboard
- normal chat chrome (`ChatHeader` + `MessageList` + `MessageInput`)
### Initial load + realtime
- Initial data: REST fetches (`api.ts`) for config/settings/channels/contacts/unreads.
- WebSocket: realtime deltas/events.
- On reconnect, the app refetches channels and contacts, refreshes unread counts, and reconciles the active conversation to recover disconnect-window drift.
- On WS connect, backend sends `health` only; contacts/channels still come from REST.
### New Message modal
@@ -176,6 +228,7 @@ frontend/src/
### Visualizer behavior
- `VisualizerView.tsx` hosts `PacketVisualizer3D.tsx` (desktop split-pane and mobile tabs).
- `PacketVisualizer3D.tsx` is now a thin composition shell over visualizer-specific hooks/components in `components/visualizer/`.
- `PacketVisualizer3D` uses persistent Three.js geometries for links/highlights/particles and updates typed-array buffers in-place per frame.
- Packet repeat aggregation keys prefer decoder `messageHash` (path-insensitive), with hash fallback for malformed packets.
- Raw-packet decoding in `RawPacketList.tsx` and `visualizerUtils.ts` relies on the multibyte-aware decoder fork; keep frontend packet parsing aligned with backend `path_utils.py`.
@@ -193,6 +246,7 @@ frontend/src/
- Auto reconnect (3s) with cleanup guard on unmount.
- Heartbeat ping every 30s.
- Incoming JSON is parsed through `wsEvents.ts`, which returns a typed discriminated union for known events and a centralized `unknown` fallback.
- Event handlers: `health`, `message`, `contact`, `raw_packet`, `message_acked`, `contact_deleted`, `channel_deleted`, `error`, `success`, `pong` (ignored).
- For `raw_packet` events, use `observation_id` as event identity; `id` is a storage reference and may repeat.
@@ -276,7 +330,7 @@ Clicking a contact's avatar in `ChatHeader` or `MessageList` opens a `ContactInf
- Nearest repeaters (resolved from first-hop path prefixes)
- Recent advert paths
State: `infoPaneContactKey` in App.tsx controls open/close. Live contact data from WebSocket updates is preferred over the initial detail snapshot.
State: `useConversationNavigation` controls open/close via `infoPaneContactKey`. Live contact data from WebSocket updates is preferred over the initial detail snapshot.
## Channel Info Pane
@@ -288,11 +342,11 @@ Clicking a channel name in `ChatHeader` opens a `ChannelInfoPane` sheet (right d
- First message date
- Top senders in last 24h (name + count)
State: `infoPaneChannelKey` in App.tsx controls open/close. Live channel data from the `channels` array is preferred over the initial detail snapshot.
State: `useConversationNavigation` controls open/close via `infoPaneChannelKey`. Live channel data from the `channels` array is preferred over the initial detail snapshot.
## Repeater Dashboard
For repeater contacts (`type=2`), App.tsx renders `RepeaterDashboard` instead of the normal chat UI (ChatHeader + MessageList + MessageInput).
For repeater contacts (`type=2`), `ConversationPane.tsx` renders `RepeaterDashboard` instead of the normal chat UI (ChatHeader + MessageList + MessageInput).
**Login**: `RepeaterLogin` component — password or guest login via `POST /api/contacts/{key}/repeater/login`.
@@ -308,9 +362,10 @@ All state is managed by `useRepeaterDashboard` hook. State resets on conversatio
The `SearchView` component (`components/SearchView.tsx`) provides full-text search across all DMs and channel messages. Key behaviors:
- **State**: `targetMessageId` in `App.tsx` drives the jump-to-message flow. When a search result is clicked, `handleNavigateToMessage` sets `targetMessageId` and switches to the target conversation.
- **State**: `targetMessageId` is shared between `useConversationNavigation` and `useConversationMessages`. When a search result is clicked, `handleNavigateToMessage` sets the target ID and switches to the target conversation.
- **Same-conversation clear**: when `targetMessageId` is cleared after the target is reached, the hook preserves the around-loaded mid-history view instead of replacing it with the latest page.
- **Persistence**: `SearchView` stays mounted after first open using the same `hidden` class pattern as `CrackerPanel`, preserving search state when navigating to results.
- **Jump-to-message**: `useConversationMessages` accepts optional `targetMessageId`. When set, it calls `api.getMessagesAround()` instead of normal fetch, loading context around the target message. `MessageList` scrolls to the target via `data-message-id` attribute and applies a `message-highlight` CSS animation.
- **Jump-to-message**: `useConversationMessages` handles optional `targetMessageId` by calling `api.getMessagesAround()` instead of the normal latest-page fetch, loading context around the target message. `MessageList` scrolls to the target via `data-message-id` attribute and applies a `message-highlight` CSS animation.
- **Bidirectional pagination**: After jumping mid-history, `hasNewerMessages` enables forward pagination via `fetchNewerMessages`. The scroll-to-bottom button calls `jumpToBottom` (re-fetches latest page) instead of just scrolling.
- **WS message suppression**: When `hasNewerMessages` is true, incoming WS messages for the active conversation are not added to the message list (the user is viewing historical context, not the latest page).
+226 -840
File diff suppressed because it is too large
+292
@@ -0,0 +1,292 @@
import { lazy, Suspense, useRef, type ComponentProps } from 'react';
import { StatusBar } from './StatusBar';
import { Sidebar } from './Sidebar';
import { ConversationPane } from './ConversationPane';
import { NewMessageModal } from './NewMessageModal';
import { ContactInfoPane } from './ContactInfoPane';
import { ChannelInfoPane } from './ChannelInfoPane';
import { Toaster } from './ui/sonner';
import { Sheet, SheetContent, SheetDescription, SheetHeader, SheetTitle } from './ui/sheet';
import {
SETTINGS_SECTION_LABELS,
SETTINGS_SECTION_ORDER,
type SettingsSection,
} from './settings/settingsConstants';
import { getContrastTextColor, type LocalLabel } from '../utils/localLabel';
import type { CrackerPanelProps } from './CrackerPanel';
import type { SearchViewProps } from './SearchView';
import type { SettingsModalProps } from './SettingsModal';
import { cn } from '@/lib/utils';
const SettingsModal = lazy(() =>
import('./SettingsModal').then((m) => ({ default: m.SettingsModal }))
);
const CrackerPanel = lazy(() =>
import('./CrackerPanel').then((m) => ({ default: m.CrackerPanel }))
);
const SearchView = lazy(() => import('./SearchView').then((m) => ({ default: m.SearchView })));
type SidebarProps = ComponentProps<typeof Sidebar>;
type ConversationPaneProps = ComponentProps<typeof ConversationPane>;
type NewMessageModalProps = Omit<ComponentProps<typeof NewMessageModal>, 'open' | 'onClose'>;
type ContactInfoPaneProps = ComponentProps<typeof ContactInfoPane>;
type ChannelInfoPaneProps = ComponentProps<typeof ChannelInfoPane>;
interface AppShellProps {
localLabel: LocalLabel;
showNewMessage: boolean;
showSettings: boolean;
settingsSection: SettingsSection;
sidebarOpen: boolean;
showCracker: boolean;
onSettingsSectionChange: (section: SettingsSection) => void;
onSidebarOpenChange: (open: boolean) => void;
onCrackerRunningChange: (running: boolean) => void;
onToggleSettingsView: () => void;
onCloseSettingsView: () => void;
onCloseNewMessage: () => void;
onLocalLabelChange: (label: LocalLabel) => void;
statusProps: Pick<ComponentProps<typeof StatusBar>, 'health' | 'config'>;
sidebarProps: SidebarProps;
conversationPaneProps: ConversationPaneProps;
searchProps: SearchViewProps;
settingsProps: Omit<
SettingsModalProps,
'open' | 'pageMode' | 'externalSidebarNav' | 'desktopSection' | 'onClose' | 'onLocalLabelChange'
>;
crackerProps: Omit<CrackerPanelProps, 'visible' | 'onRunningChange'>;
newMessageModalProps: NewMessageModalProps;
contactInfoPaneProps: ContactInfoPaneProps;
channelInfoPaneProps: ChannelInfoPaneProps;
}
export function AppShell({
localLabel,
showNewMessage,
showSettings,
settingsSection,
sidebarOpen,
showCracker,
onSettingsSectionChange,
onSidebarOpenChange,
onCrackerRunningChange,
onToggleSettingsView,
onCloseSettingsView,
onCloseNewMessage,
onLocalLabelChange,
statusProps,
sidebarProps,
conversationPaneProps,
searchProps,
settingsProps,
crackerProps,
newMessageModalProps,
contactInfoPaneProps,
channelInfoPaneProps,
}: AppShellProps) {
const searchMounted = useRef(false);
if (conversationPaneProps.activeConversation?.type === 'search') {
searchMounted.current = true;
}
const crackerMounted = useRef(false);
if (showCracker) {
crackerMounted.current = true;
}
const settingsSidebarContent = (
<nav
className="sidebar w-60 h-full min-h-0 overflow-hidden bg-card border-r border-border flex flex-col"
aria-label="Settings"
>
<div className="flex justify-between items-center px-3 py-2.5 border-b border-border">
<h2 className="text-[10px] uppercase tracking-wider text-muted-foreground font-medium">
Settings
</h2>
<button
type="button"
onClick={onCloseSettingsView}
className="flex items-center gap-1 px-2 py-1 rounded text-xs bg-status-connected/15 border border-status-connected/30 text-status-connected hover:bg-status-connected/25 transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring"
title="Back to conversations"
aria-label="Back to conversations"
>
&larr; Back to Chat
</button>
</div>
<div className="flex-1 min-h-0 overflow-y-auto py-1 [contain:layout_paint]">
{SETTINGS_SECTION_ORDER.map((section) => (
<button
key={section}
type="button"
className={cn(
'w-full px-3 py-2 text-left text-[13px] border-l-2 border-transparent hover:bg-accent transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-inset',
settingsSection === section && 'bg-accent border-l-primary'
)}
aria-current={settingsSection === section ? 'true' : undefined}
onClick={() => onSettingsSectionChange(section)}
>
{SETTINGS_SECTION_LABELS[section]}
</button>
))}
</div>
</nav>
);
const activeSidebarContent = showSettings ? (
settingsSidebarContent
) : (
<Sidebar {...sidebarProps} />
);
return (
<div className="flex flex-col h-full">
<a
href="#main-content"
className="sr-only focus:not-sr-only focus:absolute focus:z-50 focus:p-2 focus:bg-primary focus:text-primary-foreground"
>
Skip to content
</a>
{localLabel.text && (
<div
style={{
backgroundColor: localLabel.color,
color: getContrastTextColor(localLabel.color),
}}
className="px-4 py-1 text-center text-sm font-medium"
>
{localLabel.text}
</div>
)}
<StatusBar
health={statusProps.health}
config={statusProps.config}
settingsMode={showSettings}
onSettingsClick={onToggleSettingsView}
onMenuClick={showSettings ? undefined : () => onSidebarOpenChange(true)}
/>
<div className="flex flex-1 overflow-hidden">
<div className="hidden md:block min-h-0 overflow-hidden">{activeSidebarContent}</div>
<Sheet open={sidebarOpen} onOpenChange={onSidebarOpenChange}>
<SheetContent side="left" className="w-[280px] p-0 flex flex-col" hideCloseButton>
<SheetHeader className="sr-only">
<SheetTitle>Navigation</SheetTitle>
<SheetDescription>Sidebar navigation</SheetDescription>
</SheetHeader>
<div className="flex-1 overflow-hidden">{activeSidebarContent}</div>
</SheetContent>
</Sheet>
<main id="main-content" className="flex-1 flex flex-col bg-background min-w-0">
<div
className={cn(
'flex-1 flex flex-col min-h-0',
(showSettings || conversationPaneProps.activeConversation?.type === 'search') &&
'hidden'
)}
>
<ConversationPane {...conversationPaneProps} />
</div>
{searchMounted.current && (
<div
className={cn(
'flex-1 flex flex-col min-h-0',
(conversationPaneProps.activeConversation?.type !== 'search' || showSettings) &&
'hidden'
)}
>
<Suspense
fallback={
<div className="flex-1 flex items-center justify-center text-muted-foreground">
Loading search...
</div>
}
>
<SearchView {...searchProps} />
</Suspense>
</div>
)}
{showSettings && (
<div className="flex-1 flex flex-col min-h-0">
<h2 className="flex justify-between items-center px-4 py-2.5 border-b border-border font-semibold text-base">
<span>Radio & Settings</span>
<span className="text-sm text-muted-foreground hidden md:inline">
{SETTINGS_SECTION_LABELS[settingsSection]}
</span>
</h2>
<div className="flex-1 min-h-0 overflow-hidden">
<Suspense
fallback={
<div className="flex-1 flex items-center justify-center p-8 text-muted-foreground">
Loading settings...
</div>
}
>
<SettingsModal
{...settingsProps}
open={showSettings}
pageMode
externalSidebarNav
desktopSection={settingsSection}
onClose={onCloseSettingsView}
onLocalLabelChange={onLocalLabelChange}
/>
</Suspense>
</div>
</div>
)}
</main>
</div>
<div
ref={(el) => {
if (showCracker && el) {
const focusable = el.querySelector<HTMLElement>('input, button:not([disabled])');
if (focusable) {
setTimeout(() => focusable.focus(), 210);
}
}
}}
className={cn(
'border-t border-border bg-background transition-all duration-200 overflow-hidden',
showCracker ? 'h-[275px]' : 'h-0'
)}
>
{crackerMounted.current && (
<Suspense
fallback={
<div className="flex items-center justify-center h-full text-muted-foreground">
Loading cracker...
</div>
}
>
<CrackerPanel
{...crackerProps}
visible={showCracker}
onRunningChange={onCrackerRunningChange}
/>
</Suspense>
)}
</div>
<NewMessageModal
{...newMessageModalProps}
open={showNewMessage}
onClose={onCloseNewMessage}
onSelectConversation={(conv) => {
newMessageModalProps.onSelectConversation(conv);
onCloseNewMessage();
}}
/>
<ContactInfoPane {...contactInfoPaneProps} />
<ChannelInfoPane {...channelInfoPaneProps} />
<Toaster position="top-right" />
</div>
);
}
+46 -1
@@ -6,6 +6,35 @@ interface ContactAvatarProps {
size?: number;
contactType?: number;
clickable?: boolean;
variant?: 'default' | 'corrupt';
}
function CorruptAvatarGraphic({ size }: { size: number }) {
return (
<svg
data-testid="corrupt-avatar"
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 24 56"
width={Math.round(size * 0.6)}
height={Math.round(size * 0.94)}
shapeRendering="crispEdges"
aria-hidden="true"
>
<path fill="#F8E8F8" d="M8 0v24H0v32h24V0" />
<path
fill="#807098"
d="M12 0h2v1h1v1h1v1h-1v1h-2V3h-1V0zm3 0h1v1h-1V0zm2 0h2v1h-1v1h-1V0zm4 0h2v1h-2V0zm-2 1h2v1h-2V1zm4 0h1v3h-2v1h-1V3h1V2h1V1zm-6 2h1v2h-1V3zm2 0h1v2h-1V3zM9 4h4v1H9V4zm5 1h3v4h-1V6h-2V5zm4 0h1v1h-1V5zm2 0h1v1h1V5h2v1h-1v2h-1v1h-2v1h-1V9h-1V7h1V6h1V5zM8 7h1v1H8V7zm0 3h2v1h1v1H9v-1H8v-1zm13 0h1v1h1v1h-2v-2zm2 0h1v1h-1v-1zm-11 1h1v1h-1v-1zm2 0h1v1h-1v-1zm3 0h1v1h-1v-1zm2 0h1v1h-1v-1zM8 12h1v1H8v-1zm3 0h1v1h-1v-1zm2 0h1v1h-1v-1zm2 0h1v1h2v1h1v1h-2v-1h-3v-1h1v-1zm-5 1h1v1h-1v-1zm2 0h1v1h-1v-1zm7 0h2v1h-2v-1zm3 0h2v1h-2v-1zM8 15h1v1H8v-1zm11 0h2v1h1v1h-2v-1h-1v-1zm4 0h1v1h-1v-1zm-13 1h1v1h-1v-1zm2 0h1v1h-1v-1zm2 0h4v1h-4v-1zm-2 2h1v1h-1v-1zm3 0h1v1h1v1h-3v-1h1v-1zm4 0h4v1h-2v1h-3v-1h1v-1zM9 19h1v1H9v-1zm2 0h1v1h-1v-1zm10 1h3v2h-1v-1h-2v-1zm-11 1h2v2H9v-1h1v-1zm4 0h2v3h1v-1h1v2h1v1h-2v-1h-2v-1h-1v-1h-1v-1h1v-1zm6 0h1v1h-1v-1zm-8 2h1v1h-1v-1zm7 0h2v1h-1v1h-1v-2zm4 0h1v1h-1v-1zM0 24h8v3H7v2h3v1H8v1H7v-1H1v-1h5v-4H0v-1zm9 0h3v1H9v-1zm4 0h1v1h-1v-1zm8 0h1v1h-1v-1zm-1 1h1v1h-1v-1zm2 0h2v3h-1v-2h-1v-1zm-6 1h1v2h-1v-2zm3 0h1v2h-1v-2zm2 0h1v2h-1v-2zm-11 2h2v1h1v1h-2v-1h-1v-1zm7 0h2v1h-2v-1zm3 0h1v1h-1v-1zm2 0h1v1h-1v-1zm-6 1h1v2h-1v-2zm3 0h1v1h1v-1h1v2h1v1h-1v2h-1v-1h-1v1h-1v-1h-2v-1h2v-1h-1v-1h1v-1zm4 0h1v1h-1v-1zm-12 2h1v1h-1v-1zm2 0h1v1h-1v-1zm2 0h1v1h-1v-1zM5 32h1v1H5v-1zm3 1h2v1H8v-1zm3 0h1v1h-1v-1zm3 0h1v1h-1v-1zm9 0h1v1h-1v-1zM6 34h2v1H7v1H6v1h1v1H1v-1H0v-1h1v-1h1v1h1v-1h1v1h1v-1h1v-1zm11 0h2v1h-1v1h1v-1h1v1h1v1h-5v1h-1v-2h2v-2zm5 0h1v1h-1v-1zM8 35h4v1h1v2h-3v-1H7v-1h1v-1zm5 0h2v1h-2v-1zm9 1h1v1h-1v-1zm-1 1h1v1h-1v-1zM2 39h1v2H2v-2zm2 0h1v2H4v-2zm2 0h1v2H6v-2zm2 0h2v2H9v-1H8v-1zm3 0h1v1h-1v-1zm4 0h1v1h-1v-1zm7 0h1v1h-1v-1zm-5 1h1v1h-1v-1zm2 0h1v1h-1v-1zm2 0h1v1h-1v-1zM1 41h1v1H1v-1zm2 0h1v1H3v-1zm2 0h1v1H5v-1zm2 0h1v1H7v-1zm3 0h2v2h-1v-1h-1v-1zm4 0h2v1h7v1h-1v1h-1v-1h-1v3h-4v-1h1v-2h-2v-1h-1v-1zm9 0h1v1h-1v-1zm-10 2h2v1h1v1h-4v1h1v1h1v-1h1v1h1v1h-4v-1h-1v-3h2v-1zm10 0h1v1h-1v-1zM0 45h10v3H8v-2H0v-1zm22 0h1v1h-1v-1zm-1 1h1v1h-1v-1zm2 0h1v2h-1v-2zM4 
48h4v2H7v-1H6v1H5v-1H4v-1zm6 0h2v3h-1v-1h-1v-2zm6 0h7v1h-7v-1zM0 49h4v1h1v1H0v-2zm13 0h1v1h-1v-1zm2 0h1v1h2v1h-3v-2zm-9 1h1v2H6v-2zm2 0h2v1h1v1h1v1H7v-1h1v-2zm11 0h5v1h-1v1h-1v-1h-1v1h-3v-1h1v-1zm-7 1h3v1h-3v-1zM0 52h6v1h1v1H5v-1H4v1h1v1h1v1H4v-1H0v-3zm12 1h1v1h-1v-1zm2 0h1v3h-2v-2h1v-1zm2 0h2v3h-1v-1h-1v-2zm6 0h2v1h-1v1h-2v-1h1v-1zM7 54h1v1H7v-1zm12 0h1v1h-1v-1zM8 55h4v1H8v-1zm15 0h1v1h-1v-1z"
/>
<path
fill="#181010"
d="M8 0h1v1H8V0zm0 6h1v1H8V6zm2 0h1v1h-1V6zm2 0h1v1h-1V6zm2 0h1v1h-1V6zM8 9h1v1H8V9zm3 0h1v1h-1V9zm2 0h1v1h-1V9zm2 0h1v1h-1V9zm6 0h1v1h-1V9zm2 0h1v1h-1V9zM8 11h1v1H8v-1zm9 1h5v1h-5v-1zm6 0h1v1h-1v-1zm-13 2h4v1h-4v-1zm9 0h4v1h-4v-1zm-4 1h1v1h-1v-1zm2 0h2v1h-2v-1zm-5 2h1v1h-1v-1zm4 0h2v1h-2v-1zm3 0h2v1h-2v-1zm3 0h1v1h-1v-1zM8 18h1v2H8v-2zm1 2h1v1H9v-1zm2 0h2v1h-2v-1zm4 0h1v1h-1v-1zm1 2h4v1h-4v-1zm6 0h2v1h-2v-1zM9 25h1v1H9v-1zm2 0h5v1h-1v1h-1v-1h-1v1h1v1h-2v-1h-1v-2zM0 26h1v1H0v-1zm8 0h1v1h1v1H8v-2zm7 1h1v1h-1v-1zm-8 1h1v1H7v-1zm-7 2h1v1H0v-1zm9 0h2v1H9v-1zm5 0h2v1h-2v-1zM1 31h5v1H1v-1zm6 0h1v1H7v-1zm-7 1h1v2H0v-2zm9 0h7v1H9v-1zm14 0h1v1h-1v-1zM2 33h1v1H2v-1zm2 0h1v1H4v-1zm3 0h1v1H7v-1zm1 1h2v1H8v-1zm4 0h1v1h-1v-1zM0 35h1v1H0v-1zm23 1h1v1h-1v-1zm-1 1h1v2h-1v-2zM2 38h1v1H2v-1zm2 0h1v1H4v-1zm2 0h1v1H6v-1zm2 0h4v1H8v-1zm6 0h1v1h-1v-1zm3 1h1v1h-1v-1zm2 0h1v1h-1v-1zm2 0h1v1h-1v-1zm2 0h1v1h-1v-1zm-10 1h1v1h-1v-1zm9 0h1v1h-1v-1zM8 42h1v1H8v-1zm10 4h3v2h-2v-1h-1v-1zm-1 1h1v1h-1v-1zm-1 2h1v1h-1v-1zm7 0h1v1h-1v-1zm-5 3h3v1h-3v-1zm4 0h1v1h-1v-1z"
/>
<path
fill="#F0B088"
d="M8 5h1v1H8V5zm1 1h1v1h1V6h1v1h1V6h1v1h1V6h1v2H9V6zM8 8h1v1H8V8zm1 1h2v1H9V9zm3 0h1v1h-1V9zm2 0h1v1h-1V9zm6 0h1v1h-1V9zm2 0h1v1h-1V9zm-6 3h1v1h-1v-1zm6 0h1v1h-1v-1zM9 14h1v1h4v-1h2v1h-1v1H9v-2zm14 0h1v1h-1v-1zm-7 1h1v1h-1v-1zm-8 1h1v1h3v1H8v-2zm5 1h3v1h-3v-1zm5 0h1v1h-1v-1zm3 0h1v1h-1v-1zm2 0h1v1h-1v-1zM8 20h1v1H8v-1zm2 0h1v1h-1v-1zm3 0h2v1h-2v-1zm-5 2h1v1H8v-1zm12 0h2v1h-2v-1zm-10 3h1v2h1v1h-2v-1H9v-1h1v-1zm3 1h1v1h-1v-1zm2 0h1v1h-1v-1zm-1 1h1v1h-1v-1zM0 29h1v1H0v-1zm11 1h3v1h-3v-1zm-5 1h1v1H6v-1zm-5 2h1v1H1v-1zm2 0h1v1H3v-1zm2 0h2v1H5v-1zm5 0h1v1h1v1h-2v-2zm3 1h3v1h-3v-1zM0 37h1v1H0v-1zm8 0h2v1H8v-1zm15 0h1v1h-1v-1zM1 38h1v1H1v-1zm2 0h1v1H3v-1zm2 0h1v1H5v-1zm2 0h1v1H7v-1zm5 0h2v1h-2v-1zm3 0h7v1h-1v1h-1v-1h-1v1h-1v-1h-1v1h-1v-1h-1v-1zM0 39h1v2H0v-2zm10 0h1v1h-1v-1zm-2 1h1v1H8v-1zm3 0h2v1h-2v-1zm3 0h2v1h-2v-1zm9 0h1v1h-1v-1zm-7 1h7v1h-7v-1zm7 1h1v1h-1v-1zm-3 3h1v1h-1v-1zm-4 1h2v1h-1v1h-1v-2zm2 1h1v1h-1v-1zm-1 2h6v1h-6v-1zm-1 3h2v1h-2v-1zm5 0h1v1h-1v-1zm2 0h1v1h-1v-1z"
/>
</svg>
);
}
export function ContactAvatar({
@@ -14,14 +43,30 @@ export function ContactAvatar({
size = 28,
contactType,
clickable,
variant = 'default',
}: ContactAvatarProps) {
if (variant === 'corrupt') {
return (
<div
className={`flex items-center justify-center rounded-md flex-shrink-0 select-none bg-black/10${clickable ? ' cursor-pointer' : ''}`}
style={{
width: size,
height: size,
}}
aria-hidden="true"
>
<CorruptAvatarGraphic size={size} />
</div>
);
}
const avatar = getContactAvatar(name, publicKey, contactType);
return (
<div
className={`flex items-center justify-center rounded-full font-semibold flex-shrink-0 select-none${clickable ? ' cursor-pointer' : ''}`}
style={{
backgroundColor: avatar.background,
background: avatar.background,
color: avatar.textColor,
width: size,
height: size,
@@ -0,0 +1,219 @@
import { lazy, Suspense, useMemo, type Ref } from 'react';
import { ChatHeader } from './ChatHeader';
import { MessageInput, type MessageInputHandle } from './MessageInput';
import { MessageList } from './MessageList';
import { RawPacketList } from './RawPacketList';
import type {
Channel,
Contact,
Conversation,
Favorite,
HealthStatus,
Message,
RawPacket,
RadioConfig,
} from '../types';
import { CONTACT_TYPE_REPEATER } from '../types';
const RepeaterDashboard = lazy(() =>
import('./RepeaterDashboard').then((m) => ({ default: m.RepeaterDashboard }))
);
const MapView = lazy(() => import('./MapView').then((m) => ({ default: m.MapView })));
const VisualizerView = lazy(() =>
import('./VisualizerView').then((m) => ({ default: m.VisualizerView }))
);
interface ConversationPaneProps {
activeConversation: Conversation | null;
contacts: Contact[];
channels: Channel[];
rawPackets: RawPacket[];
config: RadioConfig | null;
health: HealthStatus | null;
favorites: Favorite[];
messages: Message[];
messagesLoading: boolean;
loadingOlder: boolean;
hasOlderMessages: boolean;
targetMessageId: number | null;
hasNewerMessages: boolean;
loadingNewer: boolean;
messageInputRef: Ref<MessageInputHandle>;
onTrace: () => Promise<void>;
onToggleFavorite: (type: 'channel' | 'contact', id: string) => Promise<void>;
onDeleteContact: (publicKey: string) => Promise<void>;
onDeleteChannel: (key: string) => Promise<void>;
onSetChannelFloodScopeOverride: (channelKey: string, floodScopeOverride: string) => Promise<void>;
onOpenContactInfo: (publicKey: string, fromChannel?: boolean) => void;
onOpenChannelInfo: (channelKey: string) => void;
onSenderClick: (sender: string) => void;
onLoadOlder: () => Promise<void>;
onResendChannelMessage: (messageId: number, newTimestamp?: boolean) => Promise<void>;
onTargetReached: () => void;
onLoadNewer: () => Promise<void>;
onJumpToBottom: () => void;
onSendMessage: (text: string) => Promise<void>;
}
function LoadingPane({ label }: { label: string }) {
return (
<div className="flex-1 flex items-center justify-center text-muted-foreground">{label}</div>
);
}
export function ConversationPane({
activeConversation,
contacts,
channels,
rawPackets,
config,
health,
favorites,
messages,
messagesLoading,
loadingOlder,
hasOlderMessages,
targetMessageId,
hasNewerMessages,
loadingNewer,
messageInputRef,
onTrace,
onToggleFavorite,
onDeleteContact,
onDeleteChannel,
onSetChannelFloodScopeOverride,
onOpenContactInfo,
onOpenChannelInfo,
onSenderClick,
onLoadOlder,
onResendChannelMessage,
onTargetReached,
onLoadNewer,
onJumpToBottom,
onSendMessage,
}: ConversationPaneProps) {
const activeContactIsRepeater = useMemo(() => {
if (!activeConversation || activeConversation.type !== 'contact') return false;
const contact = contacts.find((candidate) => candidate.public_key === activeConversation.id);
return contact?.type === CONTACT_TYPE_REPEATER;
}, [activeConversation, contacts]);
if (!activeConversation) {
return (
<div className="flex-1 flex items-center justify-center text-muted-foreground">
Select a conversation or start a new one
</div>
);
}
if (activeConversation.type === 'map') {
return (
<>
<h2 className="flex justify-between items-center px-4 py-2.5 border-b border-border font-semibold text-base">
Node Map
</h2>
<div className="flex-1 overflow-hidden">
<Suspense fallback={<LoadingPane label="Loading map..." />}>
<MapView contacts={contacts} focusedKey={activeConversation.mapFocusKey} />
</Suspense>
</div>
</>
);
}
if (activeConversation.type === 'visualizer') {
return (
<Suspense fallback={<LoadingPane label="Loading visualizer..." />}>
<VisualizerView packets={rawPackets} contacts={contacts} config={config} />
</Suspense>
);
}
if (activeConversation.type === 'raw') {
return (
<>
<h2 className="flex justify-between items-center px-4 py-2.5 border-b border-border font-semibold text-base">
Raw Packet Feed
</h2>
<div className="flex-1 overflow-hidden">
<RawPacketList packets={rawPackets} />
</div>
</>
);
}
if (activeConversation.type === 'search') {
return null;
}
if (activeContactIsRepeater) {
return (
<Suspense fallback={<LoadingPane label="Loading dashboard..." />}>
<RepeaterDashboard
key={activeConversation.id}
conversation={activeConversation}
contacts={contacts}
favorites={favorites}
radioLat={config?.lat ?? null}
radioLon={config?.lon ?? null}
radioName={config?.name ?? null}
onTrace={onTrace}
onToggleFavorite={onToggleFavorite}
onDeleteContact={onDeleteContact}
/>
</Suspense>
);
}
return (
<>
<ChatHeader
conversation={activeConversation}
contacts={contacts}
channels={channels}
config={config}
favorites={favorites}
onTrace={onTrace}
onToggleFavorite={onToggleFavorite}
onSetChannelFloodScopeOverride={onSetChannelFloodScopeOverride}
onDeleteChannel={onDeleteChannel}
onDeleteContact={onDeleteContact}
onOpenContactInfo={onOpenContactInfo}
onOpenChannelInfo={onOpenChannelInfo}
/>
<MessageList
key={activeConversation.id}
messages={messages}
contacts={contacts}
loading={messagesLoading}
loadingOlder={loadingOlder}
hasOlderMessages={hasOlderMessages}
onSenderClick={activeConversation.type === 'channel' ? onSenderClick : undefined}
onLoadOlder={onLoadOlder}
onResendChannelMessage={
activeConversation.type === 'channel' ? onResendChannelMessage : undefined
}
radioName={config?.name}
config={config}
onOpenContactInfo={onOpenContactInfo}
targetMessageId={targetMessageId}
onTargetReached={onTargetReached}
hasNewerMessages={hasNewerMessages}
loadingNewer={loadingNewer}
onLoadNewer={onLoadNewer}
onJumpToBottom={onJumpToBottom}
/>
<MessageInput
ref={messageInputRef}
onSend={onSendMessage}
disabled={!health?.radio_connected}
conversationType={activeConversation.type}
senderName={config?.name}
placeholder={
!health?.radio_connected ? 'Radio not connected' : `Message ${activeConversation.name}...`
}
/>
</>
);
}
+1 -1
@@ -22,7 +22,7 @@ interface QueueItem {
status: 'pending' | 'cracking' | 'cracked' | 'failed';
}
interface CrackerPanelProps {
export interface CrackerPanelProps {
packets: RawPacket[];
channels: Channel[];
onChannelCreate: (name: string, key: string) => Promise<void>;
+100 -11
@@ -148,6 +148,23 @@ function HopCountBadge({ paths, onClick, variant }: HopCountBadgeProps) {
}
const RESEND_WINDOW_SECONDS = 30;
const CORRUPT_SENDER_LABEL = '<No name -- corrupt packet?>';
function hasUnexpectedControlChars(text: string): boolean {
for (const char of text) {
const code = char.charCodeAt(0);
if (
(code >= 0 && code <= 8) ||
code === 11 ||
code === 12 ||
(code >= 14 && code <= 31) ||
code === 127
) {
return true;
}
}
return false;
}
export function MessageList({
messages,
@@ -400,6 +417,17 @@ export function MessageList({
return contacts.find((c) => c.name === name) || null;
};
const isCorruptUnnamedChannelMessage = (msg: Message, parsedSender: string | null): boolean => {
return (
msg.type === 'CHAN' &&
!msg.outgoing &&
!msg.sender_name &&
!msg.sender_key &&
!parsedSender &&
hasUnexpectedControlChars(msg.text)
);
};
// Build sender info for path modal
const getSenderInfo = (
msg: Message,
@@ -415,6 +443,32 @@ export function MessageList({
pathHashMode: contact.out_path_hash_mode,
};
}
if (msg.type === 'CHAN') {
const senderName = msg.sender_name || parsedSender;
const senderContact =
(msg.sender_key
? contacts.find((candidate) => candidate.public_key === msg.sender_key)
: null) || (senderName ? getContactByName(senderName) : null);
if (senderContact) {
return {
name: senderContact.name || senderName || senderContact.public_key.slice(0, 12),
publicKeyOrPrefix: senderContact.public_key,
lat: senderContact.lat,
lon: senderContact.lon,
pathHashMode: senderContact.out_path_hash_mode,
};
}
if (senderName || msg.sender_key) {
return {
name: senderName || msg.sender_key || 'Unknown',
publicKeyOrPrefix: msg.sender_key || msg.conversation_key || '',
lat: null,
lon: null,
pathHashMode: null,
};
}
}
// For channel messages, try to find contact by parsed sender name
if (parsedSender) {
const senderContact = getContactByName(parsedSender);
@@ -455,10 +509,17 @@ export function MessageList({
}
// Helper to get a unique sender key for grouping messages
const getSenderKey = (msg: Message, sender: string | null): string => {
const getSenderKey = (
msg: Message,
senderName: string | null,
isCorruptChannelMessage: boolean
): string => {
if (msg.outgoing) return '__outgoing__';
if (msg.type === 'PRIV' && msg.conversation_key) return msg.conversation_key;
return sender || '__unknown__';
if (msg.sender_key) return `key:${msg.sender_key}`;
if (senderName) return `name:${senderName}`;
if (isCorruptChannelMessage) return `corrupt:${msg.id}`;
return '__unknown__';
};
return (
@@ -487,17 +548,36 @@ export function MessageList({
const { sender, content } = isRepeater
? { sender: null, content: msg.text }
: parseSenderFromText(msg.text);
const channelSenderName = msg.type === 'CHAN' ? msg.sender_name || sender : null;
const channelSenderContact =
msg.type === 'CHAN' && channelSenderName ? getContactByName(channelSenderName) : null;
const isCorruptChannelMessage = isCorruptUnnamedChannelMessage(msg, sender);
const displaySender = msg.outgoing
? 'You'
: contact?.name || sender || msg.conversation_key?.slice(0, 8) || 'Unknown';
: contact?.name ||
channelSenderName ||
(isCorruptChannelMessage
? CORRUPT_SENDER_LABEL
: msg.conversation_key?.slice(0, 8) || 'Unknown');
const canClickSender = !msg.outgoing && onSenderClick && displaySender !== 'Unknown';
const canClickSender =
!msg.outgoing &&
onSenderClick &&
displaySender !== 'Unknown' &&
displaySender !== CORRUPT_SENDER_LABEL;
// Determine if we should show avatar (first message in a chunk from same sender)
const currentSenderKey = getSenderKey(msg, sender);
const currentSenderKey = getSenderKey(msg, channelSenderName, isCorruptChannelMessage);
const prevMsg = sortedMessages[index - 1];
const prevParsedSender = prevMsg ? parseSenderFromText(prevMsg.text).sender : null;
const prevSenderKey = prevMsg
? getSenderKey(prevMsg, parseSenderFromText(prevMsg.text).sender)
? getSenderKey(
prevMsg,
prevMsg.type === 'CHAN'
? prevMsg.sender_name || prevParsedSender
: prevParsedSender,
isCorruptUnnamedChannelMessage(prevMsg, prevParsedSender)
)
: null;
const isFirstInGroup = currentSenderKey !== prevSenderKey;
const showAvatar = !msg.outgoing && isFirstInGroup;
@@ -506,16 +586,24 @@ export function MessageList({
// Get avatar info for incoming messages
let avatarName: string | null = null;
let avatarKey: string = '';
let avatarVariant: 'default' | 'corrupt' = 'default';
if (!msg.outgoing) {
if (msg.type === 'PRIV' && msg.conversation_key) {
// DM: use conversation_key (sender's public key)
avatarName = contact?.name || null;
avatarKey = msg.conversation_key;
} else if (sender) {
// Channel message: try to find contact by name, or use sender name as pseudo-key
const senderContact = getContactByName(sender);
avatarName = sender;
avatarKey = senderContact?.public_key || `name:${sender}`;
} else if (isCorruptChannelMessage) {
avatarName = CORRUPT_SENDER_LABEL;
avatarKey = `corrupt:${msg.id}`;
avatarVariant = 'corrupt';
} else {
// Channel message: use stored sender identity first, then parsed/fallback display name
avatarName =
channelSenderName || (displaySender !== 'Unknown' ? displaySender : null);
avatarKey =
msg.sender_key ||
channelSenderContact?.public_key ||
(avatarName ? `name:${avatarName}` : `message:${msg.id}`);
}
}
@@ -547,6 +635,7 @@ export function MessageList({
publicKey={avatarKey}
size={32}
clickable={!!onOpenContactInfo}
variant={avatarVariant}
/>
</span>
)}
File diff suppressed because it is too large.
@@ -70,7 +70,7 @@ export function RepeaterDashboard({
return (
<div className="flex-1 flex flex-col min-h-0">
{/* Header */}
<header className="flex justify-between items-center px-4 py-2.5 border-b border-border gap-2">
<header className="flex justify-between items-start sm:items-center px-4 py-2.5 border-b border-border gap-2">
<span className="flex flex-wrap items-baseline gap-x-2 min-w-0 flex-1">
<span className="flex-shrink-0 font-semibold text-base">{conversation.name}</span>
<span
@@ -95,7 +95,7 @@ export function RepeaterDashboard({
size="sm"
onClick={loadAll}
disabled={anyLoading}
className="text-xs border-success text-success hover:bg-success/10 hover:text-success"
className="h-7 px-2 text-[11px] leading-none border-success text-success hover:bg-success/10 hover:text-success sm:h-8 sm:px-3 sm:text-xs"
>
{anyLoading ? 'Loading...' : 'Load All'}
</Button>
+1 -1
@@ -26,7 +26,7 @@ export interface SearchNavigateTarget {
conversation_name: string;
}
interface SearchViewProps {
export interface SearchViewProps {
contacts: Contact[];
channels: Channel[];
onNavigateToMessage: (target: SearchNavigateTarget) => void;
+1 -1
@@ -37,7 +37,7 @@ interface SettingsModalBaseProps {
onToggleBlockedName?: (name: string) => void;
}
type SettingsModalProps = SettingsModalBaseProps &
export type SettingsModalProps = SettingsModalBaseProps &
(
| { externalSidebarNav: true; desktopSection: SettingsSection }
| { externalSidebarNav?: false; desktopSection?: never }
@@ -12,7 +12,7 @@ The visualizer displays:
## Architecture
### Data Layer (`useVisualizerData3D` hook)
### Data Layer (`components/visualizer/useVisualizerData3D.ts`)
The custom hook manages all graph state and simulation logic:
@@ -39,6 +39,8 @@ Packets → Parse → Aggregate by key → Observation window → Publish → An
### Rendering Layer (Three.js)
Scene creation, render-loop updates, raycasting hover, and click-to-pin interaction live in `components/visualizer/useVisualizer3DScene.ts`.
- `THREE.WebGLRenderer` + `CSS2DRenderer` (text labels overlaid on 3D scene)
- `OrbitControls` for camera interaction (orbit, pan, zoom)
- `THREE.Mesh` with `SphereGeometry` per node + `CSS2DObject` labels
@@ -46,15 +48,20 @@ Packets → Parse → Aggregate by key → Observation window → Publish → An
- `THREE.Points` with vertex colors for particles (persistent geometry + circular sprite texture)
- `THREE.Raycaster` for hover/click detection on node spheres
### Shared Utilities (`utils/visualizerUtils.ts`)
### Shared Utilities
Types, constants, and pure functions shared across the codebase:
- `components/visualizer/shared.ts`
- Graph-specific types: `GraphNode`, `GraphLink`, `NodeMeshData`
- Shared rendering helpers: node colors, relative-time formatting, typed-array growth helpers
- `utils/visualizerUtils.ts`
- Packet parsing, identity helpers, ambiguous repeater heuristics, constants shared across visualizer code
- Types: `NodeType`, `PacketLabel`, `Particle`, `ObservedPath`, `PendingPacket`, `ParsedPacket`, `TrafficObservation`, `RepeaterTrafficData`, `RepeaterSplitAnalysis`
- Constants: `COLORS`, `PARTICLE_COLOR_MAP`, `PARTICLE_SPEED`, `DEFAULT_OBSERVATION_WINDOW_SEC`, traffic thresholds, `PACKET_LEGEND_ITEMS`
- Functions: `hashString` (from `utils/contactAvatar.ts`), `parsePacket`, `getPacketLabel`, `generatePacketKey`, `getLinkId`, `getNodeType`, `dedupeConsecutive`, `analyzeRepeaterTraffic`, `recordTrafficObservation`
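As an illustration of the pure-function style these utilities follow, a minimal `dedupeConsecutive` might look like the sketch below (assumed behavior — collapsing back-to-back repeats in an observed path — not necessarily the repo's exact implementation):

```typescript
// Collapse consecutive duplicate hops in an observed path, e.g. a
// repeater prefix that appears twice in a row. Sketch only; the real
// dedupeConsecutive in utils/visualizerUtils.ts may differ.
function dedupeConsecutive<T>(items: readonly T[]): T[] {
  const result: T[] = [];
  for (const item of items) {
    if (result.length === 0 || result[result.length - 1] !== item) {
      result.push(item);
    }
  }
  return result;
}

console.log(dedupeConsecutive(['a1', 'a1', 'b2', 'b2', 'a1'])); // ['a1', 'b2', 'a1']
```

Note that only *consecutive* repeats are removed: a hop that reappears later in the path (like `'a1'` above) is kept, which matters when a packet legitimately revisits a repeater prefix.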
### UI Overlays
`GraphNode` and `GraphLink` are defined locally in the component — they extend `SimulationNodeDatum3D` and `SimulationLinkDatum` from `d3-force-3d`.
- `components/visualizer/VisualizerControls.tsx`
- Legends, settings toggles, repulsion/speed controls, reset/stretch actions
- `components/visualizer/VisualizerTooltip.tsx`
- Hovered/pinned node metadata and neighbor list
### Type Declarations (`types/d3-force-3d.d.ts`)
@@ -0,0 +1,317 @@
import { Checkbox } from '../ui/checkbox';
import { PACKET_LEGEND_ITEMS } from '../../utils/visualizerUtils';
import { NODE_LEGEND_ITEMS } from './shared';
interface VisualizerControlsProps {
showControls: boolean;
setShowControls: (value: boolean) => void;
fullScreen?: boolean;
onFullScreenChange?: (fullScreen: boolean) => void;
showAmbiguousPaths: boolean;
setShowAmbiguousPaths: (value: boolean) => void;
showAmbiguousNodes: boolean;
setShowAmbiguousNodes: (value: boolean) => void;
useAdvertPathHints: boolean;
setUseAdvertPathHints: (value: boolean) => void;
splitAmbiguousByTraffic: boolean;
setSplitAmbiguousByTraffic: (value: boolean) => void;
observationWindowSec: number;
setObservationWindowSec: (value: number) => void;
pruneStaleNodes: boolean;
setPruneStaleNodes: (value: boolean) => void;
pruneStaleMinutes: number;
setPruneStaleMinutes: (value: number) => void;
letEmDrift: boolean;
setLetEmDrift: (value: boolean) => void;
autoOrbit: boolean;
setAutoOrbit: (value: boolean) => void;
chargeStrength: number;
setChargeStrength: (value: number) => void;
particleSpeedMultiplier: number;
setParticleSpeedMultiplier: (value: number) => void;
nodeCount: number;
linkCount: number;
onExpandContract: () => void;
onClearAndReset: () => void;
}
export function VisualizerControls({
showControls,
setShowControls,
fullScreen,
onFullScreenChange,
showAmbiguousPaths,
setShowAmbiguousPaths,
showAmbiguousNodes,
setShowAmbiguousNodes,
useAdvertPathHints,
setUseAdvertPathHints,
splitAmbiguousByTraffic,
setSplitAmbiguousByTraffic,
observationWindowSec,
setObservationWindowSec,
pruneStaleNodes,
setPruneStaleNodes,
pruneStaleMinutes,
setPruneStaleMinutes,
letEmDrift,
setLetEmDrift,
autoOrbit,
setAutoOrbit,
chargeStrength,
setChargeStrength,
particleSpeedMultiplier,
setParticleSpeedMultiplier,
nodeCount,
linkCount,
onExpandContract,
onClearAndReset,
}: VisualizerControlsProps) {
return (
<>
{showControls && (
<div className="absolute bottom-4 left-4 bg-background/80 backdrop-blur-sm rounded-lg p-3 text-xs border border-border z-10">
<div className="flex gap-6">
<div className="flex flex-col gap-1.5">
<div className="text-muted-foreground font-medium mb-1">Packets</div>
{PACKET_LEGEND_ITEMS.map((item) => (
<div key={item.label} className="flex items-center gap-2">
<div
className="w-5 h-5 rounded-full flex items-center justify-center text-[8px] font-bold text-white"
style={{ backgroundColor: item.color }}
>
{item.label}
</div>
<span>{item.description}</span>
</div>
))}
</div>
<div className="flex flex-col gap-1.5">
<div className="text-muted-foreground font-medium mb-1">Nodes</div>
{NODE_LEGEND_ITEMS.map((item) => (
<div key={item.label} className="flex items-center gap-2">
<div
className="rounded-full"
style={{
width: item.size,
height: item.size,
backgroundColor: item.color,
}}
/>
<span>{item.label}</span>
</div>
))}
</div>
</div>
</div>
)}
<div
className={`absolute top-4 left-4 bg-background/80 backdrop-blur-sm rounded-lg p-3 text-xs border border-border z-10 transition-opacity ${!showControls ? 'opacity-40 hover:opacity-100' : ''}`}
>
<div className="flex flex-col gap-2">
<div className="flex flex-col gap-2">
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={showControls}
onCheckedChange={(c) => setShowControls(c === true)}
/>
<span title="Toggle legends and controls visibility">Show controls</span>
</label>
{onFullScreenChange && (
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={!fullScreen}
onCheckedChange={(c) => onFullScreenChange(c !== true)}
/>
<span title="Show or hide the packet feed sidebar">Show packet feed sidebar</span>
</label>
)}
</div>
{showControls && (
<>
<div className="border-t border-border pt-2 mt-1 flex flex-col gap-2">
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={showAmbiguousPaths}
onCheckedChange={(c) => setShowAmbiguousPaths(c === true)}
/>
<span title="Show placeholder nodes for repeaters when the 1-byte prefix matches multiple contacts">
Show ambiguous repeaters
</span>
</label>
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={showAmbiguousNodes}
onCheckedChange={(c) => setShowAmbiguousNodes(c === true)}
/>
<span title="Show placeholder nodes for senders/recipients when only a 1-byte prefix is known">
Show ambiguous sender/recipient
</span>
</label>
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={useAdvertPathHints}
onCheckedChange={(c) => setUseAdvertPathHints(c === true)}
disabled={!showAmbiguousPaths}
/>
<span
title="Use stored repeater advert paths to assign likely identity labels for ambiguous repeater nodes"
className={!showAmbiguousPaths ? 'text-muted-foreground' : ''}
>
Use repeater advert-path identity hints
</span>
</label>
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={splitAmbiguousByTraffic}
onCheckedChange={(c) => setSplitAmbiguousByTraffic(c === true)}
disabled={!showAmbiguousPaths}
/>
<span
title="Split ambiguous repeaters into separate nodes based on traffic patterns (prev→next). Helps identify colliding prefixes representing different physical nodes, but requires enough traffic to disambiguate."
className={!showAmbiguousPaths ? 'text-muted-foreground' : ''}
>
Heuristically group repeaters by traffic pattern
</span>
</label>
<div className="flex items-center gap-2">
<label
htmlFor="observation-window-3d"
className="text-muted-foreground"
title="How long to wait for duplicate packets via different paths before animating"
>
Ack/echo listen window:
</label>
<input
id="observation-window-3d"
type="number"
min="1"
max="60"
value={observationWindowSec}
onChange={(e) =>
setObservationWindowSec(
Math.max(1, Math.min(60, parseInt(e.target.value, 10) || 1))
)
}
className="w-12 px-1 py-0.5 bg-background border border-border rounded text-xs text-center"
/>
<span className="text-muted-foreground">sec</span>
</div>
<div className="border-t border-border pt-2 mt-1 flex flex-col gap-2">
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={pruneStaleNodes}
onCheckedChange={(c) => setPruneStaleNodes(c === true)}
/>
<span title="Automatically remove nodes with no traffic within the configured window to keep the mesh manageable">
Only show recently heard/in-a-path nodes
</span>
</label>
{pruneStaleNodes && (
<div className="flex items-center gap-2 pl-6">
<label
htmlFor="prune-window"
className="text-muted-foreground whitespace-nowrap"
>
Window:
</label>
<input
id="prune-window"
type="number"
min={1}
max={60}
value={pruneStaleMinutes}
onChange={(e) => {
const v = parseInt(e.target.value, 10);
if (!isNaN(v) && v >= 1 && v <= 60) setPruneStaleMinutes(v);
}}
className="w-14 rounded border border-border bg-background px-2 py-0.5 text-sm"
/>
<span className="text-muted-foreground" aria-hidden="true">
min
</span>
</div>
)}
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={letEmDrift}
onCheckedChange={(c) => setLetEmDrift(c === true)}
/>
<span title="When enabled, the graph continuously reorganizes itself into a better layout">
Let &apos;em drift
</span>
</label>
<label className="flex items-center gap-2 cursor-pointer">
<Checkbox
checked={autoOrbit}
onCheckedChange={(c) => setAutoOrbit(c === true)}
/>
<span title="Automatically orbit the camera around the scene">
Orbit the mesh
</span>
</label>
<div className="flex flex-col gap-1 mt-1">
<label
htmlFor="viz-repulsion"
className="text-muted-foreground"
title="How strongly nodes repel each other. Higher values spread nodes out more."
>
Repulsion: {Math.abs(chargeStrength)}
</label>
<input
id="viz-repulsion"
type="range"
min="50"
max="2500"
value={Math.abs(chargeStrength)}
onChange={(e) => setChargeStrength(-parseInt(e.target.value, 10))}
className="w-full h-2 bg-border rounded-lg appearance-none cursor-pointer accent-primary"
/>
</div>
<div className="flex flex-col gap-1 mt-1">
<label
htmlFor="viz-packet-speed"
className="text-muted-foreground"
title="How fast particles travel along links. Higher values make packets move faster."
>
Packet speed: {particleSpeedMultiplier}x
</label>
<input
id="viz-packet-speed"
type="range"
min="1"
max="5"
step="0.5"
value={particleSpeedMultiplier}
onChange={(e) => setParticleSpeedMultiplier(parseFloat(e.target.value))}
className="w-full h-2 bg-border rounded-lg appearance-none cursor-pointer accent-primary"
/>
</div>
</div>
<button
onClick={onExpandContract}
className="mt-1 px-3 py-1.5 bg-primary/20 hover:bg-primary/30 text-primary rounded text-xs transition-colors"
title="Expand nodes apart then contract back - can help untangle the graph"
>
Oooh Big Stretch!
</button>
<button
onClick={onClearAndReset}
className="mt-1 px-3 py-1.5 bg-yellow-500/20 hover:bg-yellow-500/30 text-yellow-500 rounded text-xs transition-colors"
title="Clear all nodes and links from the visualization - packets are preserved"
>
Clear &amp; Reset
</button>
</div>
<div className="border-t border-border pt-2 mt-1">
<div>Nodes: {nodeCount}</div>
<div>Links: {linkCount}</div>
</div>
</>
)}
</div>
</div>
</>
);
}
@@ -0,0 +1,73 @@
import type { GraphNode } from './shared';
import { formatRelativeTime } from './shared';
interface VisualizerTooltipProps {
activeNodeId: string | null;
nodes: Map<string, GraphNode>;
neighborIds: string[];
}
export function VisualizerTooltip({ activeNodeId, nodes, neighborIds }: VisualizerTooltipProps) {
if (!activeNodeId) return null;
const node = nodes.get(activeNodeId);
if (!node) return null;
const neighbors = neighborIds
.map((nid) => {
const neighbor = nodes.get(nid);
if (!neighbor) return null;
const displayName =
neighbor.name || (neighbor.type === 'self' ? 'Me' : neighbor.id.slice(0, 8));
return { id: nid, name: displayName, ambiguousNames: neighbor.ambiguousNames };
})
.filter((neighbor): neighbor is NonNullable<typeof neighbor> => neighbor !== null);
return (
<div className="absolute top-4 right-4 bg-background/90 backdrop-blur-sm rounded-lg p-3 text-xs border border-border z-10 max-w-72 max-h-[calc(100%-2rem)] overflow-y-auto">
<div className="flex flex-col gap-1">
<div className="font-medium">
{node.name || (node.type === 'self' ? 'Me' : node.id.slice(0, 8))}
</div>
<div className="text-muted-foreground">ID: {node.id}</div>
<div className="text-muted-foreground">
Type: {node.type}
{node.isAmbiguous ? ' (ambiguous)' : ''}
</div>
{node.probableIdentity && (
<div className="text-muted-foreground">Probably: {node.probableIdentity}</div>
)}
{node.ambiguousNames && node.ambiguousNames.length > 0 && (
<div className="text-muted-foreground">
{node.probableIdentity ? 'Other possible: ' : 'Possible: '}
{node.ambiguousNames.join(', ')}
</div>
)}
{node.type !== 'self' && (
<div className="text-muted-foreground border-t border-border pt-1 mt-1">
<div>Last active: {formatRelativeTime(node.lastActivity)}</div>
{node.lastActivityReason && <div>Reason: {node.lastActivityReason}</div>}
</div>
)}
{neighbors.length > 0 && (
<div className="text-muted-foreground border-t border-border pt-1 mt-1">
<div className="mb-0.5">Traffic exchanged with:</div>
<ul className="pl-3 flex flex-col gap-0.5">
{neighbors.map((neighbor) => (
<li key={neighbor.id}>
{neighbor.name}
{neighbor.ambiguousNames && neighbor.ambiguousNames.length > 0 && (
<span className="text-muted-foreground/60">
{' '}
({neighbor.ambiguousNames.join(', ')})
</span>
)}
</li>
))}
</ul>
</div>
)}
</div>
</div>
);
}
@@ -0,0 +1,83 @@
import * as THREE from 'three';
import type { SimulationLinkDatum } from 'd3-force';
import type { SimulationNodeDatum3D } from 'd3-force-3d';
import type { CSS2DObject } from 'three/examples/jsm/renderers/CSS2DRenderer.js';
import type { NodeType } from '../../utils/visualizerUtils';
export interface GraphNode extends SimulationNodeDatum3D {
id: string;
name: string | null;
type: NodeType;
isAmbiguous: boolean;
lastActivity: number;
lastActivityReason?: string;
lastSeen?: number | null;
probableIdentity?: string | null;
ambiguousNames?: string[];
}
export interface GraphLink extends SimulationLinkDatum<GraphNode> {
source: string | GraphNode;
target: string | GraphNode;
lastActivity: number;
}
export interface NodeMeshData {
mesh: THREE.Mesh;
label: CSS2DObject;
labelDiv: HTMLDivElement;
}
export const NODE_COLORS = {
self: 0x22c55e,
repeater: 0x3b82f6,
client: 0xffffff,
ambiguous: 0x9ca3af,
} as const;
export const NODE_LEGEND_ITEMS = [
{ color: '#22c55e', label: 'You', size: 14 },
{ color: '#3b82f6', label: 'Repeater', size: 10 },
{ color: '#ffffff', label: 'Node', size: 10 },
{ color: '#9ca3af', label: 'Ambiguous', size: 10 },
] as const;
export function getBaseNodeColor(node: Pick<GraphNode, 'type' | 'isAmbiguous'>): number {
if (node.type === 'self') return NODE_COLORS.self;
if (node.type === 'repeater') return NODE_COLORS.repeater;
return node.isAmbiguous ? NODE_COLORS.ambiguous : NODE_COLORS.client;
}
/**
 * Return a fresh Float32Array sized by repeated doubling (minimum 12 floats)
 * until it fits requiredLength. Note: the old contents are NOT copied —
 * callers swap the buffer into the geometry and rewrite it in full each frame.
 */
export function growFloat32Buffer(current: Float32Array, requiredLength: number): Float32Array {
let nextLength = Math.max(12, current.length);
while (nextLength < requiredLength) {
nextLength *= 2;
}
return new Float32Array(nextLength);
}
export function arraysEqual(a: string[], b: string[]): boolean {
if (a.length !== b.length) return false;
for (let i = 0; i < a.length; i++) {
if (a[i] !== b[i]) return false;
}
return true;
}
export function formatRelativeTime(timestamp: number): string {
const seconds = Math.floor((Date.now() - timestamp) / 1000);
if (seconds < 5) return 'just now';
if (seconds < 60) return `${seconds}s ago`;
const minutes = Math.floor(seconds / 60);
const secs = seconds % 60;
return secs > 0 ? `${minutes}m ${secs}s ago` : `${minutes}m ago`;
}
/**
 * Coerce a packet timestamp to epoch milliseconds. Values above ~1e12 are
 * assumed to already be in ms; smaller positive values are treated as epoch
 * seconds. Missing or invalid timestamps fall back to "now".
 */
export function normalizePacketTimestampMs(timestamp: number | null | undefined): number {
if (!Number.isFinite(timestamp) || !timestamp || timestamp <= 0) {
return Date.now();
}
const ts = Number(timestamp);
return ts > 1_000_000_000_000 ? ts : ts * 1000;
}
@@ -0,0 +1,578 @@
import { useEffect, useRef, useState, type RefObject } from 'react';
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';
import { CSS2DObject, CSS2DRenderer } from 'three/examples/jsm/renderers/CSS2DRenderer.js';
import { COLORS, getLinkId } from '../../utils/visualizerUtils';
import type { VisualizerData3D } from './useVisualizerData3D';
import { arraysEqual, getBaseNodeColor, growFloat32Buffer, type NodeMeshData } from './shared';
interface UseVisualizer3DSceneArgs {
containerRef: RefObject<HTMLDivElement | null>;
data: VisualizerData3D;
autoOrbit: boolean;
}
interface UseVisualizer3DSceneResult {
hoveredNodeId: string | null;
hoveredNeighborIds: string[];
pinnedNodeId: string | null;
}
export function useVisualizer3DScene({
containerRef,
data,
autoOrbit,
}: UseVisualizer3DSceneArgs): UseVisualizer3DSceneResult {
const rendererRef = useRef<THREE.WebGLRenderer | null>(null);
const cssRendererRef = useRef<CSS2DRenderer | null>(null);
const sceneRef = useRef<THREE.Scene | null>(null);
const cameraRef = useRef<THREE.PerspectiveCamera | null>(null);
const controlsRef = useRef<OrbitControls | null>(null);
const nodeMeshesRef = useRef<Map<string, NodeMeshData>>(new Map());
const raycastTargetsRef = useRef<THREE.Mesh[]>([]);
const linkLineRef = useRef<THREE.LineSegments | null>(null);
const highlightLineRef = useRef<THREE.LineSegments | null>(null);
const particlePointsRef = useRef<THREE.Points | null>(null);
const particleTextureRef = useRef<THREE.Texture | null>(null);
const linkPositionBufferRef = useRef<Float32Array>(new Float32Array(0));
const highlightPositionBufferRef = useRef<Float32Array>(new Float32Array(0));
const particlePositionBufferRef = useRef<Float32Array>(new Float32Array(0));
const particleColorBufferRef = useRef<Float32Array>(new Float32Array(0));
const raycasterRef = useRef(new THREE.Raycaster());
const mouseRef = useRef(new THREE.Vector2());
const dataRef = useRef(data);
const [hoveredNodeId, setHoveredNodeId] = useState<string | null>(null);
const hoveredNodeIdRef = useRef<string | null>(null);
const [hoveredNeighborIds, setHoveredNeighborIds] = useState<string[]>([]);
const hoveredNeighborIdsRef = useRef<string[]>([]);
const pinnedNodeIdRef = useRef<string | null>(null);
const [pinnedNodeId, setPinnedNodeId] = useState<string | null>(null);
useEffect(() => {
dataRef.current = data;
}, [data]);
useEffect(() => {
const container = containerRef.current;
if (!container) return;
const scene = new THREE.Scene();
scene.background = new THREE.Color(COLORS.background);
sceneRef.current = scene;
const camera = new THREE.PerspectiveCamera(60, 1, 1, 5000);
camera.position.set(0, 0, 400);
cameraRef.current = camera;
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setPixelRatio(window.devicePixelRatio);
container.appendChild(renderer.domElement);
rendererRef.current = renderer;
const texSize = 64;
const texCanvas = document.createElement('canvas');
texCanvas.width = texSize;
texCanvas.height = texSize;
const texCtx = texCanvas.getContext('2d');
if (!texCtx) {
renderer.dispose();
if (renderer.domElement.parentNode) {
renderer.domElement.parentNode.removeChild(renderer.domElement);
}
return;
}
const gradient = texCtx.createRadialGradient(
texSize / 2,
texSize / 2,
0,
texSize / 2,
texSize / 2,
texSize / 2
);
gradient.addColorStop(0, 'rgba(255,255,255,1)');
gradient.addColorStop(0.5, 'rgba(255,255,255,0.8)');
gradient.addColorStop(1, 'rgba(255,255,255,0)');
texCtx.fillStyle = gradient;
texCtx.fillRect(0, 0, texSize, texSize);
const particleTexture = new THREE.CanvasTexture(texCanvas);
particleTextureRef.current = particleTexture;
const cssRenderer = new CSS2DRenderer();
cssRenderer.domElement.style.position = 'absolute';
cssRenderer.domElement.style.top = '0';
cssRenderer.domElement.style.left = '0';
cssRenderer.domElement.style.pointerEvents = 'none';
cssRenderer.domElement.style.zIndex = '1';
container.appendChild(cssRenderer.domElement);
cssRendererRef.current = cssRenderer;
const controls = new OrbitControls(camera, renderer.domElement);
controls.enableDamping = true;
controls.dampingFactor = 0.1;
controls.minDistance = 50;
controls.maxDistance = 2000;
controlsRef.current = controls;
const linkGeometry = new THREE.BufferGeometry();
const linkMaterial = new THREE.LineBasicMaterial({
color: COLORS.link,
transparent: true,
opacity: 0.6,
});
const linkSegments = new THREE.LineSegments(linkGeometry, linkMaterial);
linkSegments.visible = false;
scene.add(linkSegments);
linkLineRef.current = linkSegments;
const highlightGeometry = new THREE.BufferGeometry();
const highlightMaterial = new THREE.LineBasicMaterial({
color: 0xffd700,
transparent: true,
opacity: 1,
linewidth: 2,
});
const highlightSegments = new THREE.LineSegments(highlightGeometry, highlightMaterial);
highlightSegments.visible = false;
scene.add(highlightSegments);
highlightLineRef.current = highlightSegments;
const particleGeometry = new THREE.BufferGeometry();
const particleMaterial = new THREE.PointsMaterial({
size: 20,
map: particleTexture,
vertexColors: true,
sizeAttenuation: true,
transparent: true,
opacity: 0.9,
depthWrite: false,
});
const particlePoints = new THREE.Points(particleGeometry, particleMaterial);
particlePoints.visible = false;
scene.add(particlePoints);
particlePointsRef.current = particlePoints;
const rect = container.getBoundingClientRect();
renderer.setSize(rect.width, rect.height);
cssRenderer.setSize(rect.width, rect.height);
camera.aspect = rect.width / rect.height;
camera.updateProjectionMatrix();
const observer = new ResizeObserver((entries) => {
for (const entry of entries) {
const { width, height } = entry.contentRect;
if (width === 0 || height === 0) continue;
renderer.setSize(width, height);
cssRenderer.setSize(width, height);
camera.aspect = width / height;
camera.updateProjectionMatrix();
}
});
observer.observe(container);
const nodeMeshes = nodeMeshesRef.current;
return () => {
observer.disconnect();
controls.dispose();
renderer.dispose();
if (renderer.domElement.parentNode) {
renderer.domElement.parentNode.removeChild(renderer.domElement);
}
if (cssRenderer.domElement.parentNode) {
cssRenderer.domElement.parentNode.removeChild(cssRenderer.domElement);
}
for (const nd of nodeMeshes.values()) {
nd.mesh.remove(nd.label);
nd.labelDiv.remove();
scene.remove(nd.mesh);
nd.mesh.geometry.dispose();
(nd.mesh.material as THREE.Material).dispose();
}
nodeMeshes.clear();
raycastTargetsRef.current = [];
if (linkLineRef.current) {
scene.remove(linkLineRef.current);
linkLineRef.current.geometry.dispose();
(linkLineRef.current.material as THREE.Material).dispose();
linkLineRef.current = null;
}
if (highlightLineRef.current) {
scene.remove(highlightLineRef.current);
highlightLineRef.current.geometry.dispose();
(highlightLineRef.current.material as THREE.Material).dispose();
highlightLineRef.current = null;
}
if (particlePointsRef.current) {
scene.remove(particlePointsRef.current);
particlePointsRef.current.geometry.dispose();
(particlePointsRef.current.material as THREE.Material).dispose();
particlePointsRef.current = null;
}
particleTexture.dispose();
particleTextureRef.current = null;
linkPositionBufferRef.current = new Float32Array(0);
highlightPositionBufferRef.current = new Float32Array(0);
particlePositionBufferRef.current = new Float32Array(0);
particleColorBufferRef.current = new Float32Array(0);
sceneRef.current = null;
cameraRef.current = null;
rendererRef.current = null;
cssRendererRef.current = null;
controlsRef.current = null;
};
}, [containerRef]);
useEffect(() => {
const controls = controlsRef.current;
if (!controls) return;
controls.autoRotate = autoOrbit;
controls.autoRotateSpeed = -0.5;
}, [autoOrbit]);
useEffect(() => {
const renderer = rendererRef.current;
const camera = cameraRef.current;
if (!renderer || !camera) return;
const onMouseMove = (event: MouseEvent) => {
const rect = renderer.domElement.getBoundingClientRect();
mouseRef.current.x = ((event.clientX - rect.left) / rect.width) * 2 - 1;
mouseRef.current.y = -((event.clientY - rect.top) / rect.height) * 2 + 1;
};
let mouseDownPos = { x: 0, y: 0 };
const onMouseDown = (event: MouseEvent) => {
mouseDownPos = { x: event.clientX, y: event.clientY };
};
    const onMouseUp = (event: MouseEvent) => {
      const dx = event.clientX - mouseDownPos.x;
      const dy = event.clientY - mouseDownPos.y;
      // Treat as an orbit drag, not a click, if the mouse moved more than 5px.
      if (dx * dx + dy * dy > 25) return;
const rect = renderer.domElement.getBoundingClientRect();
const clickMouse = new THREE.Vector2(
((event.clientX - rect.left) / rect.width) * 2 - 1,
-((event.clientY - rect.top) / rect.height) * 2 + 1
);
const raycaster = raycasterRef.current;
raycaster.setFromCamera(clickMouse, camera);
const intersects = raycaster.intersectObjects(raycastTargetsRef.current, false);
const clickedObject = intersects[0]?.object as THREE.Mesh | undefined;
const clickedId = (clickedObject?.userData?.nodeId as string | undefined) ?? null;
      // Clicking the pinned node unpins it; clicking a different node re-pins;
      // clicking empty space clears the pin.
if (clickedId === pinnedNodeIdRef.current) {
pinnedNodeIdRef.current = null;
setPinnedNodeId(null);
} else if (clickedId) {
pinnedNodeIdRef.current = clickedId;
setPinnedNodeId(clickedId);
} else {
pinnedNodeIdRef.current = null;
setPinnedNodeId(null);
}
};
renderer.domElement.addEventListener('mousemove', onMouseMove);
renderer.domElement.addEventListener('mousedown', onMouseDown);
renderer.domElement.addEventListener('mouseup', onMouseUp);
return () => {
renderer.domElement.removeEventListener('mousemove', onMouseMove);
renderer.domElement.removeEventListener('mousedown', onMouseDown);
renderer.domElement.removeEventListener('mouseup', onMouseUp);
};
}, []);
useEffect(() => {
const scene = sceneRef.current;
const camera = cameraRef.current;
const renderer = rendererRef.current;
const cssRenderer = cssRendererRef.current;
const controls = controlsRef.current;
if (!scene || !camera || !renderer || !cssRenderer || !controls) return;
let running = true;
const animate = () => {
if (!running) return;
requestAnimationFrame(animate);
controls.update();
const { nodes, links, particles } = dataRef.current;
const currentNodeIds = new Set<string>();
for (const node of nodes.values()) {
currentNodeIds.add(node.id);
let nd = nodeMeshesRef.current.get(node.id);
if (!nd) {
const isSelf = node.type === 'self';
const radius = isSelf ? 12 : 6;
const geometry = new THREE.SphereGeometry(radius, 16, 12);
const material = new THREE.MeshBasicMaterial({ color: getBaseNodeColor(node) });
const mesh = new THREE.Mesh(geometry, material);
mesh.userData.nodeId = node.id;
scene.add(mesh);
const labelDiv = document.createElement('div');
labelDiv.style.color = node.isAmbiguous ? COLORS.ambiguous : '#e5e7eb';
labelDiv.style.fontSize = '11px';
labelDiv.style.fontFamily = 'sans-serif';
labelDiv.style.textAlign = 'center';
labelDiv.style.whiteSpace = 'nowrap';
labelDiv.style.textShadow = '0 0 4px #000, 0 0 2px #000';
const label = new CSS2DObject(labelDiv);
label.position.set(0, -(radius + 6), 0);
mesh.add(label);
nd = { mesh, label, labelDiv };
nodeMeshesRef.current.set(node.id, nd);
raycastTargetsRef.current.push(mesh);
}
nd.mesh.position.set(node.x ?? 0, node.y ?? 0, node.z ?? 0);
const labelColor = node.isAmbiguous ? COLORS.ambiguous : '#e5e7eb';
if (nd.labelDiv.style.color !== labelColor) {
nd.labelDiv.style.color = labelColor;
}
const labelText = node.name || (node.type === 'self' ? 'Me' : node.id.slice(0, 8));
if (nd.labelDiv.textContent !== labelText) {
nd.labelDiv.textContent = labelText;
}
}
for (const [id, nd] of nodeMeshesRef.current) {
if (!currentNodeIds.has(id)) {
nd.mesh.remove(nd.label);
nd.labelDiv.remove();
scene.remove(nd.mesh);
nd.mesh.geometry.dispose();
(nd.mesh.material as THREE.Material).dispose();
const meshIdx = raycastTargetsRef.current.indexOf(nd.mesh);
if (meshIdx >= 0) raycastTargetsRef.current.splice(meshIdx, 1);
nodeMeshesRef.current.delete(id);
}
}
raycasterRef.current.setFromCamera(mouseRef.current, camera);
const intersects = raycasterRef.current.intersectObjects(raycastTargetsRef.current, false);
const hitObject = intersects[0]?.object as THREE.Mesh | undefined;
const hitId = (hitObject?.userData?.nodeId as string | undefined) ?? null;
if (hitId !== hoveredNodeIdRef.current) {
hoveredNodeIdRef.current = hitId;
setHoveredNodeId(hitId);
}
      // A pinned (clicked) node takes precedence over a merely hovered one.
      const activeId = pinnedNodeIdRef.current ?? hoveredNodeIdRef.current;
const visibleLinks = [];
for (const link of links.values()) {
const { sourceId, targetId } = getLinkId(link);
if (currentNodeIds.has(sourceId) && currentNodeIds.has(targetId)) {
visibleLinks.push(link);
}
}
const connectedIds = activeId ? new Set<string>([activeId]) : null;
const linkLine = linkLineRef.current;
if (linkLine) {
const geometry = linkLine.geometry as THREE.BufferGeometry;
const requiredLength = visibleLinks.length * 6;
if (linkPositionBufferRef.current.length < requiredLength) {
linkPositionBufferRef.current = growFloat32Buffer(
linkPositionBufferRef.current,
requiredLength
);
geometry.setAttribute(
'position',
new THREE.BufferAttribute(linkPositionBufferRef.current, 3).setUsage(
THREE.DynamicDrawUsage
)
);
}
const highlightLine = highlightLineRef.current;
if (highlightLine && highlightPositionBufferRef.current.length < requiredLength) {
highlightPositionBufferRef.current = growFloat32Buffer(
highlightPositionBufferRef.current,
requiredLength
);
(highlightLine.geometry as THREE.BufferGeometry).setAttribute(
'position',
new THREE.BufferAttribute(highlightPositionBufferRef.current, 3).setUsage(
THREE.DynamicDrawUsage
)
);
}
const positions = linkPositionBufferRef.current;
const hlPositions = highlightPositionBufferRef.current;
let idx = 0;
let hlIdx = 0;
for (const link of visibleLinks) {
const { sourceId, targetId } = getLinkId(link);
const sNode = nodes.get(sourceId);
const tNode = nodes.get(targetId);
if (!sNode || !tNode) continue;
const sx = sNode.x ?? 0;
const sy = sNode.y ?? 0;
const sz = sNode.z ?? 0;
const tx = tNode.x ?? 0;
const ty = tNode.y ?? 0;
const tz = tNode.z ?? 0;
positions[idx++] = sx;
positions[idx++] = sy;
positions[idx++] = sz;
positions[idx++] = tx;
positions[idx++] = ty;
positions[idx++] = tz;
if (activeId && (sourceId === activeId || targetId === activeId)) {
connectedIds?.add(sourceId === activeId ? targetId : sourceId);
hlPositions[hlIdx++] = sx;
hlPositions[hlIdx++] = sy;
hlPositions[hlIdx++] = sz;
hlPositions[hlIdx++] = tx;
hlPositions[hlIdx++] = ty;
hlPositions[hlIdx++] = tz;
}
}
const positionAttr = geometry.getAttribute('position') as THREE.BufferAttribute | undefined;
if (positionAttr) {
positionAttr.needsUpdate = true;
}
geometry.setDrawRange(0, idx / 3);
linkLine.visible = idx > 0;
if (highlightLine) {
const hlGeometry = highlightLine.geometry as THREE.BufferGeometry;
const hlAttr = hlGeometry.getAttribute('position') as THREE.BufferAttribute | undefined;
if (hlAttr) {
hlAttr.needsUpdate = true;
}
hlGeometry.setDrawRange(0, hlIdx / 3);
highlightLine.visible = hlIdx > 0;
}
}
      // Advance particle progress and compact the array in place, dropping
      // particles that have reached the end of their link.
      let writeIdx = 0;
for (let readIdx = 0; readIdx < particles.length; readIdx++) {
const particle = particles[readIdx];
particle.progress += particle.speed;
if (particle.progress <= 1) {
particles[writeIdx++] = particle;
}
}
particles.length = writeIdx;
const particlePoints = particlePointsRef.current;
if (particlePoints) {
const geometry = particlePoints.geometry as THREE.BufferGeometry;
const requiredLength = particles.length * 3;
if (particlePositionBufferRef.current.length < requiredLength) {
particlePositionBufferRef.current = growFloat32Buffer(
particlePositionBufferRef.current,
requiredLength
);
geometry.setAttribute(
'position',
new THREE.BufferAttribute(particlePositionBufferRef.current, 3).setUsage(
THREE.DynamicDrawUsage
)
);
}
if (particleColorBufferRef.current.length < requiredLength) {
particleColorBufferRef.current = growFloat32Buffer(
particleColorBufferRef.current,
requiredLength
);
geometry.setAttribute(
'color',
new THREE.BufferAttribute(particleColorBufferRef.current, 3).setUsage(
THREE.DynamicDrawUsage
)
);
}
const pPositions = particlePositionBufferRef.current;
const pColors = particleColorBufferRef.current;
const color = new THREE.Color();
let visibleCount = 0;
for (const p of particles) {
if (p.progress < 0) continue;
if (!currentNodeIds.has(p.fromNodeId) || !currentNodeIds.has(p.toNodeId)) continue;
const fromNode = nodes.get(p.fromNodeId);
const toNode = nodes.get(p.toNodeId);
if (!fromNode || !toNode) continue;
const t = p.progress;
const x = (fromNode.x ?? 0) + ((toNode.x ?? 0) - (fromNode.x ?? 0)) * t;
const y = (fromNode.y ?? 0) + ((toNode.y ?? 0) - (fromNode.y ?? 0)) * t;
const z = (fromNode.z ?? 0) + ((toNode.z ?? 0) - (fromNode.z ?? 0)) * t;
pPositions[visibleCount * 3] = x;
pPositions[visibleCount * 3 + 1] = y;
pPositions[visibleCount * 3 + 2] = z;
color.set(p.color);
pColors[visibleCount * 3] = color.r;
pColors[visibleCount * 3 + 1] = color.g;
pColors[visibleCount * 3 + 2] = color.b;
visibleCount++;
}
const posAttr = geometry.getAttribute('position') as THREE.BufferAttribute | undefined;
const colorAttr = geometry.getAttribute('color') as THREE.BufferAttribute | undefined;
if (posAttr) posAttr.needsUpdate = true;
if (colorAttr) colorAttr.needsUpdate = true;
geometry.setDrawRange(0, visibleCount);
particlePoints.visible = visibleCount > 0;
}
const nextNeighbors = connectedIds
? Array.from(connectedIds)
.filter((id) => id !== activeId)
.sort()
: [];
if (!arraysEqual(hoveredNeighborIdsRef.current, nextNeighbors)) {
hoveredNeighborIdsRef.current = nextNeighbors;
setHoveredNeighborIds(nextNeighbors);
}
for (const [id, nd] of nodeMeshesRef.current) {
const node = nodes.get(id);
if (!node) continue;
const mat = nd.mesh.material as THREE.MeshBasicMaterial;
if (id === activeId) {
mat.color.set(0xffd700);
} else if (connectedIds?.has(id)) {
mat.color.set(0xfff0b3);
} else {
mat.color.set(getBaseNodeColor(node));
}
}
renderer.render(scene, camera);
cssRenderer.render(scene, camera);
};
animate();
return () => {
running = false;
};
}, []);
return { hoveredNodeId, hoveredNeighborIds, pinnedNodeId };
}
@@ -0,0 +1,922 @@
import { useCallback, useEffect, useMemo, useRef, useState } from 'react';
import {
forceCenter,
forceLink,
forceManyBody,
forceSimulation,
forceX,
forceY,
forceZ,
type ForceLink3D,
type Simulation3D,
} from 'd3-force-3d';
import { PayloadType } from '@michaelhart/meshcore-decoder';
import {
CONTACT_TYPE_REPEATER,
type Contact,
type ContactAdvertPathSummary,
type RadioConfig,
type RawPacket,
} from '../../types';
import { getRawPacketObservationKey } from '../../utils/rawPacketIdentity';
import {
type Particle,
type PendingPacket,
type RepeaterTrafficData,
PARTICLE_COLOR_MAP,
PARTICLE_SPEED,
analyzeRepeaterTraffic,
buildAmbiguousRepeaterLabel,
buildAmbiguousRepeaterNodeId,
dedupeConsecutive,
generatePacketKey,
getNodeType,
getPacketLabel,
parsePacket,
recordTrafficObservation,
} from '../../utils/visualizerUtils';
import { type GraphLink, type GraphNode, normalizePacketTimestampMs } from './shared';
export interface UseVisualizerData3DOptions {
packets: RawPacket[];
contacts: Contact[];
config: RadioConfig | null;
repeaterAdvertPaths: ContactAdvertPathSummary[];
showAmbiguousPaths: boolean;
showAmbiguousNodes: boolean;
useAdvertPathHints: boolean;
splitAmbiguousByTraffic: boolean;
chargeStrength: number;
letEmDrift: boolean;
particleSpeedMultiplier: number;
observationWindowSec: number;
pruneStaleNodes: boolean;
pruneStaleMinutes: number;
}
export interface VisualizerData3D {
nodes: Map<string, GraphNode>;
links: Map<string, GraphLink>;
particles: Particle[];
stats: { processed: number; animated: number; nodes: number; links: number };
expandContract: () => void;
clearAndReset: () => void;
}
export function useVisualizerData3D({
packets,
contacts,
config,
repeaterAdvertPaths,
showAmbiguousPaths,
showAmbiguousNodes,
useAdvertPathHints,
splitAmbiguousByTraffic,
chargeStrength,
letEmDrift,
particleSpeedMultiplier,
observationWindowSec,
pruneStaleNodes,
pruneStaleMinutes,
}: UseVisualizerData3DOptions): VisualizerData3D {
const nodesRef = useRef<Map<string, GraphNode>>(new Map());
const linksRef = useRef<Map<string, GraphLink>>(new Map());
const particlesRef = useRef<Particle[]>([]);
const simulationRef = useRef<Simulation3D<GraphNode, GraphLink> | null>(null);
const processedRef = useRef<Set<string>>(new Set());
const pendingRef = useRef<Map<string, PendingPacket>>(new Map());
const timersRef = useRef<Map<string, ReturnType<typeof setTimeout>>>(new Map());
const trafficPatternsRef = useRef<Map<string, RepeaterTrafficData>>(new Map());
const speedMultiplierRef = useRef(particleSpeedMultiplier);
const observationWindowRef = useRef(observationWindowSec * 1000);
const stretchRafRef = useRef<number | null>(null);
const [stats, setStats] = useState({ processed: 0, animated: 0, nodes: 0, links: 0 });
const contactIndex = useMemo(() => {
const byPrefix12 = new Map<string, Contact>();
const byName = new Map<string, Contact>();
const byPrefix = new Map<string, Contact[]>();
for (const contact of contacts) {
const prefix12 = contact.public_key.slice(0, 12).toLowerCase();
byPrefix12.set(prefix12, contact);
if (contact.name && !byName.has(contact.name)) {
byName.set(contact.name, contact);
}
      // Index every prefix length (1–12 hex chars) so short, possibly
      // ambiguous prefixes seen in packets can be matched against contacts.
      for (let len = 1; len <= 12; len++) {
const prefix = prefix12.slice(0, len);
const matches = byPrefix.get(prefix);
if (matches) {
matches.push(contact);
} else {
byPrefix.set(prefix, [contact]);
}
}
}
return { byPrefix12, byName, byPrefix };
}, [contacts]);
const advertPathIndex = useMemo(() => {
const byRepeater = new Map<string, ContactAdvertPathSummary['paths']>();
for (const summary of repeaterAdvertPaths) {
const key = summary.public_key.slice(0, 12).toLowerCase();
byRepeater.set(key, summary.paths);
}
return { byRepeater };
}, [repeaterAdvertPaths]);
useEffect(() => {
speedMultiplierRef.current = particleSpeedMultiplier;
}, [particleSpeedMultiplier]);
useEffect(() => {
observationWindowRef.current = observationWindowSec * 1000;
}, [observationWindowSec]);
useEffect(() => {
const sim = forceSimulation<GraphNode, GraphLink>([])
.numDimensions(3)
.force(
'link',
forceLink<GraphNode, GraphLink>([])
.id((d) => d.id)
.distance(120)
.strength(0.3)
)
.force(
'charge',
forceManyBody<GraphNode>()
.strength((d) => (d.id === 'self' ? -1200 : -200))
.distanceMax(800)
)
.force('center', forceCenter(0, 0, 0))
.force(
'selfX',
forceX<GraphNode>(0).strength((d) => (d.id === 'self' ? 0.1 : 0))
)
.force(
'selfY',
forceY<GraphNode>(0).strength((d) => (d.id === 'self' ? 0.1 : 0))
)
.force(
'selfZ',
forceZ<GraphNode>(0).strength((d) => (d.id === 'self' ? 0.1 : 0))
)
.alphaDecay(0.02)
.velocityDecay(0.5)
.alphaTarget(0.03);
simulationRef.current = sim;
return () => {
sim.stop();
};
}, []);
useEffect(() => {
const sim = simulationRef.current;
if (!sim) return;
sim.force(
'charge',
forceManyBody<GraphNode>()
.strength((d) => (d.id === 'self' ? chargeStrength * 6 : chargeStrength))
.distanceMax(800)
);
sim.alpha(0.3).restart();
}, [chargeStrength]);
useEffect(() => {
const sim = simulationRef.current;
if (!sim) return;
sim.alphaTarget(letEmDrift ? 0.05 : 0);
}, [letEmDrift]);
const syncSimulation = useCallback(() => {
const sim = simulationRef.current;
if (!sim) return;
const nodes = Array.from(nodesRef.current.values());
const links = Array.from(linksRef.current.values());
sim.nodes(nodes);
const linkForce = sim.force('link') as ForceLink3D<GraphNode, GraphLink> | undefined;
linkForce?.links(links);
sim.alpha(0.15).restart();
setStats((prev) =>
prev.nodes === nodes.length && prev.links === links.length
? prev
: { ...prev, nodes: nodes.length, links: links.length }
);
}, []);
useEffect(() => {
if (!nodesRef.current.has('self')) {
nodesRef.current.set('self', {
id: 'self',
name: config?.name || 'Me',
type: 'self',
isAmbiguous: false,
lastActivity: Date.now(),
x: 0,
y: 0,
z: 0,
});
syncSimulation();
}
}, [config, syncSimulation]);
useEffect(() => {
processedRef.current.clear();
const selfNode = nodesRef.current.get('self');
nodesRef.current.clear();
if (selfNode) nodesRef.current.set('self', selfNode);
linksRef.current.clear();
particlesRef.current = [];
pendingRef.current.clear();
timersRef.current.forEach((t) => clearTimeout(t));
timersRef.current.clear();
trafficPatternsRef.current.clear();
setStats({ processed: 0, animated: 0, nodes: selfNode ? 1 : 0, links: 0 });
syncSimulation();
}, [
showAmbiguousPaths,
showAmbiguousNodes,
useAdvertPathHints,
splitAmbiguousByTraffic,
syncSimulation,
]);
const addNode = useCallback(
(
id: string,
name: string | null,
type: GraphNode['type'],
isAmbiguous: boolean,
probableIdentity?: string | null,
ambiguousNames?: string[],
lastSeen?: number | null,
activityAtMs?: number
) => {
const activityAt = activityAtMs ?? Date.now();
const existing = nodesRef.current.get(id);
if (existing) {
existing.lastActivity = Math.max(existing.lastActivity, activityAt);
if (name) existing.name = name;
if (probableIdentity !== undefined) existing.probableIdentity = probableIdentity;
if (ambiguousNames) existing.ambiguousNames = ambiguousNames;
if (lastSeen !== undefined) existing.lastSeen = lastSeen;
} else {
// Seed new nodes at a uniformly random point on a spherical shell
// (radius 80-180) so they don't all spawn stacked at the origin.
const theta = Math.random() * Math.PI * 2;
const phi = Math.acos(2 * Math.random() - 1);
const r = 80 + Math.random() * 100;
nodesRef.current.set(id, {
id,
name,
type,
isAmbiguous,
lastActivity: activityAt,
probableIdentity,
lastSeen,
ambiguousNames,
x: r * Math.sin(phi) * Math.cos(theta),
y: r * Math.sin(phi) * Math.sin(theta),
z: r * Math.cos(phi),
});
}
},
[]
);
const addLink = useCallback((sourceId: string, targetId: string, activityAtMs?: number) => {
const activityAt = activityAtMs ?? Date.now();
// Links are undirected: sorting the endpoints yields one key per node pair.
const key = [sourceId, targetId].sort().join('->');
const existing = linksRef.current.get(key);
if (existing) {
existing.lastActivity = Math.max(existing.lastActivity, activityAt);
} else {
linksRef.current.set(key, { source: sourceId, target: targetId, lastActivity: activityAt });
}
}, []);
const publishPacket = useCallback((packetKey: string) => {
const pending = pendingRef.current.get(packetKey);
if (!pending) return;
pendingRef.current.delete(packetKey);
timersRef.current.delete(packetKey);
// Skip spawning particles while the tab is hidden; the links were already recorded.
if (document.hidden) return;
for (const path of pending.paths) {
const dedupedPath = dedupeConsecutive(path.nodes);
if (dedupedPath.length < 2) continue;
for (let i = 0; i < dedupedPath.length - 1; i++) {
particlesRef.current.push({
linkKey: [dedupedPath[i], dedupedPath[i + 1]].sort().join('->'),
// Negative progress staggers hop i, so particles traverse the path in hop order.
progress: -i,
speed: PARTICLE_SPEED * speedMultiplierRef.current,
color: PARTICLE_COLOR_MAP[pending.label],
label: pending.label,
fromNodeId: dedupedPath[i],
toNodeId: dedupedPath[i + 1],
});
}
}
}, []);
const pickLikelyRepeaterByAdvertPath = useCallback(
(candidates: Contact[], nextPrefix: string | null) => {
const nextHop = nextPrefix?.toLowerCase() ?? null;
const scored = candidates
.map((candidate) => {
const prefix12 = candidate.public_key.slice(0, 12).toLowerCase();
const paths = advertPathIndex.byRepeater.get(prefix12) ?? [];
let matchScore = 0;
let totalScore = 0;
for (const path of paths) {
totalScore += path.heard_count;
const pathNextHop = path.next_hop?.toLowerCase() ?? null;
if (pathNextHop === nextHop) {
matchScore += path.heard_count;
}
}
return { candidate, matchScore, totalScore };
})
.filter((entry) => entry.totalScore > 0)
.sort(
(a, b) =>
b.matchScore - a.matchScore ||
b.totalScore - a.totalScore ||
a.candidate.public_key.localeCompare(b.candidate.public_key)
);
if (scored.length === 0) return null;
const top = scored[0];
const second = scored[1] ?? null;
// Require at least two corroborating observations, and a 2x lead over the
// runner-up, before treating the top candidate as the likely repeater.
if (top.matchScore < 2) return null;
if (second && top.matchScore < second.matchScore * 2) return null;
return top.candidate;
},
[advertPathIndex]
);
const resolveNode = useCallback(
(
source: { type: 'prefix' | 'pubkey' | 'name'; value: string },
isRepeater: boolean,
showAmbiguous: boolean,
myPrefix: string | null,
activityAtMs: number,
trafficContext?: { packetSource: string | null; nextPrefix: string | null }
): string | null => {
if (source.type === 'pubkey') {
if (source.value.length < 12) return null;
const nodeId = source.value.slice(0, 12).toLowerCase();
if (myPrefix && nodeId === myPrefix) return 'self';
const contact = contactIndex.byPrefix12.get(nodeId);
addNode(
nodeId,
contact?.name || null,
getNodeType(contact),
false,
undefined,
undefined,
contact?.last_seen,
activityAtMs
);
return nodeId;
}
if (source.type === 'name') {
const contact = contactIndex.byName.get(source.value) ?? null;
if (contact) {
const nodeId = contact.public_key.slice(0, 12).toLowerCase();
if (myPrefix && nodeId === myPrefix) return 'self';
addNode(
nodeId,
contact.name,
getNodeType(contact),
false,
undefined,
undefined,
contact.last_seen,
activityAtMs
);
return nodeId;
}
const nodeId = `name:${source.value}`;
addNode(
nodeId,
source.value,
'client',
false,
undefined,
undefined,
undefined,
activityAtMs
);
return nodeId;
}
const lookupValue = source.value.toLowerCase();
const matches = contactIndex.byPrefix.get(lookupValue) ?? [];
const contact = matches.length === 1 ? matches[0] : null;
if (contact) {
const nodeId = contact.public_key.slice(0, 12).toLowerCase();
if (myPrefix && nodeId === myPrefix) return 'self';
addNode(
nodeId,
contact.name,
getNodeType(contact),
false,
undefined,
undefined,
contact.last_seen,
activityAtMs
);
return nodeId;
}
if (showAmbiguous) {
const filtered = isRepeater
? matches.filter((c) => c.type === CONTACT_TYPE_REPEATER)
: matches.filter((c) => c.type !== CONTACT_TYPE_REPEATER);
if (filtered.length === 1) {
const c = filtered[0];
const nodeId = c.public_key.slice(0, 12).toLowerCase();
addNode(
nodeId,
c.name,
getNodeType(c),
false,
undefined,
undefined,
c.last_seen,
activityAtMs
);
return nodeId;
}
if (filtered.length > 1 || (filtered.length === 0 && isRepeater)) {
const names = filtered.map((c) => c.name || c.public_key.slice(0, 8));
const lastSeen = filtered.reduce(
(max, c) => (c.last_seen && (!max || c.last_seen > max) ? c.last_seen : max),
null as number | null
);
let nodeId = buildAmbiguousRepeaterNodeId(lookupValue);
let displayName = buildAmbiguousRepeaterLabel(lookupValue);
let probableIdentity: string | null = null;
let ambiguousNames = names.length > 0 ? names : undefined;
if (useAdvertPathHints && isRepeater && trafficContext) {
const normalizedNext = trafficContext.nextPrefix?.toLowerCase() ?? null;
const likely = pickLikelyRepeaterByAdvertPath(filtered, normalizedNext);
if (likely) {
const likelyName = likely.name || likely.public_key.slice(0, 12).toUpperCase();
probableIdentity = likelyName;
displayName = likelyName;
ambiguousNames = filtered
.filter((c) => c.public_key !== likely.public_key)
.map((c) => c.name || c.public_key.slice(0, 8));
}
}
if (splitAmbiguousByTraffic && isRepeater && trafficContext) {
const normalizedNext = trafficContext.nextPrefix?.toLowerCase() ?? null;
if (trafficContext.packetSource) {
recordTrafficObservation(
trafficPatternsRef.current,
lookupValue,
trafficContext.packetSource,
normalizedNext
);
}
const trafficData = trafficPatternsRef.current.get(lookupValue);
if (trafficData) {
const analysis = analyzeRepeaterTraffic(trafficData);
if (analysis.shouldSplit && normalizedNext) {
nodeId = buildAmbiguousRepeaterNodeId(lookupValue, normalizedNext);
if (!probableIdentity) {
displayName = buildAmbiguousRepeaterLabel(lookupValue, normalizedNext);
}
}
}
}
addNode(
nodeId,
displayName,
isRepeater ? 'repeater' : 'client',
true,
probableIdentity,
ambiguousNames,
lastSeen,
activityAtMs
);
return nodeId;
}
}
return null;
},
[
contactIndex,
addNode,
useAdvertPathHints,
pickLikelyRepeaterByAdvertPath,
splitAmbiguousByTraffic,
]
);
const buildPath = useCallback(
(
parsed: ReturnType<typeof parsePacket>,
packet: RawPacket,
myPrefix: string | null,
activityAtMs: number
): string[] => {
if (!parsed) return [];
const path: string[] = [];
let packetSource: string | null = null;
if (parsed.payloadType === PayloadType.Advert && parsed.advertPubkey) {
const nodeId = resolveNode(
{ type: 'pubkey', value: parsed.advertPubkey },
false,
false,
myPrefix,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
} else if (parsed.payloadType === PayloadType.AnonRequest && parsed.anonRequestPubkey) {
const nodeId = resolveNode(
{ type: 'pubkey', value: parsed.anonRequestPubkey },
false,
false,
myPrefix,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
} else if (parsed.payloadType === PayloadType.TextMessage && parsed.srcHash) {
if (myPrefix && parsed.srcHash.toLowerCase() === myPrefix) {
path.push('self');
packetSource = 'self';
} else {
const nodeId = resolveNode(
{ type: 'prefix', value: parsed.srcHash },
false,
showAmbiguousNodes,
myPrefix,
activityAtMs
);
if (nodeId) {
path.push(nodeId);
packetSource = nodeId;
}
}
} else if (parsed.payloadType === PayloadType.GroupText) {
const senderName = parsed.groupTextSender || packet.decrypted_info?.sender;
if (senderName) {
const resolved = resolveNode(
{ type: 'name', value: senderName },
false,
false,
myPrefix,
activityAtMs
);
if (resolved) {
path.push(resolved);
packetSource = resolved;
}
}
}
for (let i = 0; i < parsed.pathBytes.length; i++) {
const hexPrefix = parsed.pathBytes[i];
const nextPrefix = parsed.pathBytes[i + 1] || null;
const nodeId = resolveNode(
{ type: 'prefix', value: hexPrefix },
true,
showAmbiguousPaths,
myPrefix,
activityAtMs,
{ packetSource, nextPrefix }
);
if (nodeId) path.push(nodeId);
}
if (parsed.payloadType === PayloadType.TextMessage && parsed.dstHash) {
if (myPrefix && parsed.dstHash.toLowerCase() === myPrefix) {
path.push('self');
} else {
const nodeId = resolveNode(
{ type: 'prefix', value: parsed.dstHash },
false,
showAmbiguousNodes,
myPrefix,
activityAtMs
);
if (nodeId) path.push(nodeId);
else path.push('self');
}
} else if (path.length > 0) {
path.push('self');
}
if (path.length > 0 && path[path.length - 1] !== 'self') {
path.push('self');
}
return dedupeConsecutive(path);
},
[resolveNode, showAmbiguousPaths, showAmbiguousNodes]
);
useEffect(() => {
let newProcessed = 0;
let newAnimated = 0;
let needsUpdate = false;
const myPrefix = config?.public_key?.slice(0, 12).toLowerCase() || null;
for (const packet of packets) {
const observationKey = getRawPacketObservationKey(packet);
if (processedRef.current.has(observationKey)) continue;
processedRef.current.add(observationKey);
newProcessed++;
// Bound the dedupe set: once it exceeds 1000 keys, keep only the newest 500.
if (processedRef.current.size > 1000) {
processedRef.current = new Set(Array.from(processedRef.current).slice(-500));
}
const parsed = parsePacket(packet.data);
if (!parsed) continue;
const packetActivityAt = normalizePacketTimestampMs(packet.timestamp);
const path = buildPath(parsed, packet, myPrefix, packetActivityAt);
if (path.length < 2) continue;
const label = getPacketLabel(parsed.payloadType);
for (let i = 0; i < path.length; i++) {
const n = nodesRef.current.get(path[i]);
if (n && n.id !== 'self') {
n.lastActivityReason = i === 0 ? `${label} source` : `Relayed ${label}`;
}
}
for (let i = 0; i < path.length - 1; i++) {
if (path[i] !== path[i + 1]) {
addLink(path[i], path[i + 1], packetActivityAt);
needsUpdate = true;
}
}
const packetKey = generatePacketKey(parsed, packet);
const now = Date.now();
const existing = pendingRef.current.get(packetKey);
if (existing && now < existing.expiresAt) {
existing.paths.push({ nodes: path, snr: packet.snr ?? null, timestamp: now });
} else {
const existingTimer = timersRef.current.get(packetKey);
if (existingTimer) {
clearTimeout(existingTimer);
}
const windowMs = observationWindowRef.current;
pendingRef.current.set(packetKey, {
key: packetKey,
label: getPacketLabel(parsed.payloadType),
paths: [{ nodes: path, snr: packet.snr ?? null, timestamp: now }],
firstSeen: now,
expiresAt: now + windowMs,
});
timersRef.current.set(
packetKey,
setTimeout(() => publishPacket(packetKey), windowMs)
);
}
// Safety valve: if pending packets pile up, drop the 50 oldest entries.
if (pendingRef.current.size > 100) {
const entries = Array.from(pendingRef.current.entries())
.sort((a, b) => a[1].firstSeen - b[1].firstSeen)
.slice(0, 50);
for (const [key] of entries) {
const timer = timersRef.current.get(key);
if (timer) {
clearTimeout(timer);
}
timersRef.current.delete(key);
pendingRef.current.delete(key);
}
}
newAnimated++;
}
if (needsUpdate) syncSimulation();
if (newProcessed > 0) {
setStats((prev) => ({
...prev,
processed: prev.processed + newProcessed,
animated: prev.animated + newAnimated,
}));
}
}, [packets, config, buildPath, addLink, syncSimulation, publishPacket]);
const expandContract = useCallback(() => {
const sim = simulationRef.current;
if (!sim) return;
if (stretchRafRef.current !== null) {
cancelAnimationFrame(stretchRafRef.current);
stretchRafRef.current = null;
}
const startChargeStrength = chargeStrength;
const peakChargeStrength = -5000;
const startLinkStrength = 0.3;
const minLinkStrength = 0.02;
const expandDuration = 1000;
const holdDuration = 2000;
const contractDuration = 1000;
const startTime = performance.now();
const animate = (now: number) => {
const elapsed = now - startTime;
let currentChargeStrength: number;
let currentLinkStrength: number;
if (elapsed < expandDuration) {
const t = elapsed / expandDuration;
currentChargeStrength =
startChargeStrength + (peakChargeStrength - startChargeStrength) * t;
currentLinkStrength = startLinkStrength + (minLinkStrength - startLinkStrength) * t;
} else if (elapsed < expandDuration + holdDuration) {
currentChargeStrength = peakChargeStrength;
currentLinkStrength = minLinkStrength;
} else if (elapsed < expandDuration + holdDuration + contractDuration) {
const t = (elapsed - expandDuration - holdDuration) / contractDuration;
currentChargeStrength = peakChargeStrength + (startChargeStrength - peakChargeStrength) * t;
currentLinkStrength = minLinkStrength + (startLinkStrength - minLinkStrength) * t;
} else {
sim.force(
'charge',
forceManyBody<GraphNode>()
.strength((d) => (d.id === 'self' ? startChargeStrength * 6 : startChargeStrength))
.distanceMax(800)
);
sim.force(
'link',
forceLink<GraphNode, GraphLink>(Array.from(linksRef.current.values()))
.id((d) => d.id)
.distance(120)
.strength(startLinkStrength)
);
sim.alpha(0.3).restart();
stretchRafRef.current = null;
return;
}
sim.force(
'charge',
forceManyBody<GraphNode>()
.strength((d) => (d.id === 'self' ? currentChargeStrength * 6 : currentChargeStrength))
.distanceMax(800)
);
sim.force(
'link',
forceLink<GraphNode, GraphLink>(Array.from(linksRef.current.values()))
.id((d) => d.id)
.distance(120)
.strength(currentLinkStrength)
);
sim.alpha(0.5).restart();
stretchRafRef.current = requestAnimationFrame(animate);
};
stretchRafRef.current = requestAnimationFrame(animate);
}, [chargeStrength]);
const clearAndReset = useCallback(() => {
if (stretchRafRef.current !== null) {
cancelAnimationFrame(stretchRafRef.current);
stretchRafRef.current = null;
}
for (const timer of timersRef.current.values()) {
clearTimeout(timer);
}
timersRef.current.clear();
pendingRef.current.clear();
processedRef.current.clear();
trafficPatternsRef.current.clear();
particlesRef.current.length = 0;
linksRef.current.clear();
const selfNode = nodesRef.current.get('self');
nodesRef.current.clear();
if (selfNode) {
selfNode.x = 0;
selfNode.y = 0;
selfNode.z = 0;
selfNode.vx = 0;
selfNode.vy = 0;
selfNode.vz = 0;
selfNode.lastActivity = Date.now();
nodesRef.current.set('self', selfNode);
}
const sim = simulationRef.current;
if (sim) {
sim.nodes(Array.from(nodesRef.current.values()));
const linkForce = sim.force('link') as ForceLink3D<GraphNode, GraphLink> | undefined;
linkForce?.links([]);
sim.alpha(0.3).restart();
}
setStats({ processed: 0, animated: 0, nodes: 1, links: 0 });
}, []);
useEffect(() => {
const stretchRaf = stretchRafRef;
const timers = timersRef.current;
const pending = pendingRef.current;
return () => {
if (stretchRaf.current !== null) {
cancelAnimationFrame(stretchRaf.current);
}
for (const timer of timers.values()) {
clearTimeout(timer);
}
timers.clear();
pending.clear();
};
}, []);
useEffect(() => {
if (!pruneStaleNodes) return;
const staleMs = pruneStaleMinutes * 60 * 1000;
const pruneIntervalMs = 1000;
const interval = setInterval(() => {
const cutoff = Date.now() - staleMs;
let pruned = false;
for (const [id, node] of nodesRef.current) {
if (id === 'self') continue;
if (node.lastActivity < cutoff) {
nodesRef.current.delete(id);
pruned = true;
}
}
if (pruned) {
for (const [key, link] of linksRef.current) {
const sourceId = typeof link.source === 'string' ? link.source : link.source.id;
const targetId = typeof link.target === 'string' ? link.target : link.target.id;
if (!nodesRef.current.has(sourceId) || !nodesRef.current.has(targetId)) {
linksRef.current.delete(key);
}
}
syncSimulation();
}
}, pruneIntervalMs);
return () => clearInterval(interval);
}, [pruneStaleNodes, pruneStaleMinutes, syncSimulation]);
return useMemo(
() => ({
nodes: nodesRef.current,
links: linksRef.current,
particles: particlesRef.current,
stats,
expandContract,
clearAndReset,
}),
[stats, expandContract, clearAndReset]
);
}
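Two small pieces of logic the hook leans on can be sketched in isolation: the undirected link key built inline at the `addLink` and particle call sites, and `dedupeConsecutive`, which is imported from elsewhere in the module (the body below is an assumption matching how the hook uses it):

```typescript
// Hypothetical standalone versions of two helpers used by useVisualizerData3D.

// Sorting the endpoints makes A->B and B->A share one link entry.
function linkKey(a: string, b: string): string {
  return [a, b].sort().join('->');
}

// Collapse runs of the same node id so a path never hops a node to itself.
function dedupeConsecutive(ids: string[]): string[] {
  return ids.filter((id, i) => i === 0 || id !== ids[i - 1]);
}

// Both directions of traffic map onto the same undirected link.
const k1 = linkKey('abc123', 'self');
const k2 = linkKey('self', 'abc123');

// Repeated consecutive hops collapse to single entries.
const path = dedupeConsecutive(['a', 'a', 'b', 'b', 'b', 'self']);
```

This is why `publishPacket` can build a particle's `linkKey` independently of which direction `addLink` originally recorded the pair in.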
+4
@@ -2,6 +2,10 @@ export { useUnreadCounts } from './useUnreadCounts';
export { useConversationMessages, getMessageContentKey } from './useConversationMessages';
export { useRadioControl } from './useRadioControl';
export { useRepeaterDashboard } from './useRepeaterDashboard';
export { useAppShell } from './useAppShell';
export { useAppSettings } from './useAppSettings';
export { useConversationRouter } from './useConversationRouter';
export { useContactsAndChannels } from './useContactsAndChannels';
export { useRealtimeAppState } from './useRealtimeAppState';
export { useConversationActions } from './useConversationActions';
export { useConversationNavigation } from './useConversationNavigation';
+77
@@ -0,0 +1,77 @@
import { startTransition, useCallback, useState } from 'react';
import { getLocalLabel, type LocalLabel } from '../utils/localLabel';
import type { SettingsSection } from '../components/settings/settingsConstants';
interface UseAppShellResult {
showNewMessage: boolean;
showSettings: boolean;
settingsSection: SettingsSection;
sidebarOpen: boolean;
showCracker: boolean;
crackerRunning: boolean;
localLabel: LocalLabel;
setSettingsSection: (section: SettingsSection) => void;
setSidebarOpen: (open: boolean) => void;
setCrackerRunning: (running: boolean) => void;
setLocalLabel: (label: LocalLabel) => void;
handleCloseSettingsView: () => void;
handleToggleSettingsView: () => void;
handleOpenNewMessage: () => void;
handleCloseNewMessage: () => void;
handleToggleCracker: () => void;
}
export function useAppShell(): UseAppShellResult {
const [showNewMessage, setShowNewMessage] = useState(false);
const [showSettings, setShowSettings] = useState(false);
const [settingsSection, setSettingsSection] = useState<SettingsSection>('radio');
const [sidebarOpen, setSidebarOpen] = useState(false);
const [showCracker, setShowCracker] = useState(false);
const [crackerRunning, setCrackerRunning] = useState(false);
const [localLabel, setLocalLabel] = useState(getLocalLabel);
const handleCloseSettingsView = useCallback(() => {
startTransition(() => setShowSettings(false));
setSidebarOpen(false);
}, []);
const handleToggleSettingsView = useCallback(() => {
startTransition(() => {
setShowSettings((prev) => !prev);
});
setSidebarOpen(false);
}, []);
const handleOpenNewMessage = useCallback(() => {
setShowNewMessage(true);
setSidebarOpen(false);
}, []);
const handleCloseNewMessage = useCallback(() => {
setShowNewMessage(false);
}, []);
const handleToggleCracker = useCallback(() => {
setShowCracker((prev) => !prev);
}, []);
return {
showNewMessage,
showSettings,
settingsSection,
sidebarOpen,
showCracker,
crackerRunning,
localLabel,
setSettingsSection,
setSidebarOpen,
setCrackerRunning,
setLocalLabel,
handleCloseSettingsView,
handleToggleSettingsView,
handleOpenNewMessage,
handleCloseNewMessage,
handleToggleCracker,
};
}
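Every view-opening handler in `useAppShell` pairs the visibility change with closing the mobile sidebar. A pure model of that invariant (illustration only; the real hook holds these flags in `useState` and wraps the settings toggle in `startTransition`):

```typescript
// Minimal model of the shell-state invariant: opening or toggling a
// primary view always closes the sidebar. Field names mirror the hook.
interface ShellState {
  showSettings: boolean;
  showNewMessage: boolean;
  sidebarOpen: boolean;
}

function toggleSettings(s: ShellState): ShellState {
  return { ...s, showSettings: !s.showSettings, sidebarOpen: false };
}

function openNewMessage(s: ShellState): ShellState {
  return { ...s, showNewMessage: true, sidebarOpen: false };
}

const start: ShellState = { showSettings: false, showNewMessage: false, sidebarOpen: true };
const afterToggle = toggleSettings(start);
const afterOpen = openNewMessage(start);
```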
@@ -0,0 +1,156 @@
import { useCallback, type MutableRefObject, type RefObject } from 'react';
import { api } from '../api';
import * as messageCache from '../messageCache';
import { toast } from '../components/ui/sonner';
import type { MessageInputHandle } from '../components/MessageInput';
import type { Channel, Conversation, Message } from '../types';
interface UseConversationActionsArgs {
activeConversation: Conversation | null;
activeConversationRef: MutableRefObject<Conversation | null>;
setChannels: React.Dispatch<React.SetStateAction<Channel[]>>;
addMessageIfNew: (msg: Message) => boolean;
jumpToBottom: () => void;
handleToggleBlockedKey: (key: string) => Promise<void>;
handleToggleBlockedName: (name: string) => Promise<void>;
messageInputRef: RefObject<MessageInputHandle | null>;
}
interface UseConversationActionsResult {
handleSendMessage: (text: string) => Promise<void>;
handleResendChannelMessage: (messageId: number, newTimestamp?: boolean) => Promise<void>;
handleSetChannelFloodScopeOverride: (
channelKey: string,
floodScopeOverride: string
) => Promise<void>;
handleSenderClick: (sender: string) => void;
handleTrace: () => Promise<void>;
handleBlockKey: (key: string) => Promise<void>;
handleBlockName: (name: string) => Promise<void>;
}
export function useConversationActions({
activeConversation,
activeConversationRef,
setChannels,
addMessageIfNew,
jumpToBottom,
handleToggleBlockedKey,
handleToggleBlockedName,
messageInputRef,
}: UseConversationActionsArgs): UseConversationActionsResult {
const mergeChannelIntoList = useCallback(
(updated: Channel) => {
setChannels((prev) => {
const existingIndex = prev.findIndex((channel) => channel.key === updated.key);
if (existingIndex === -1) {
return [...prev, updated].sort((a, b) => a.name.localeCompare(b.name));
}
const next = [...prev];
next[existingIndex] = updated;
return next;
});
},
[setChannels]
);
const handleSendMessage = useCallback(
async (text: string) => {
if (!activeConversation) return;
const conversationId = activeConversation.id;
const sent =
activeConversation.type === 'channel'
? await api.sendChannelMessage(activeConversation.id, text)
: await api.sendDirectMessage(activeConversation.id, text);
if (activeConversationRef.current?.id === conversationId) {
addMessageIfNew(sent);
}
},
[activeConversation, activeConversationRef, addMessageIfNew]
);
const handleResendChannelMessage = useCallback(
async (messageId: number, newTimestamp?: boolean) => {
try {
await api.resendChannelMessage(messageId, newTimestamp);
toast.success(newTimestamp ? 'Message resent with new timestamp' : 'Message resent');
} catch (err) {
toast.error('Failed to resend', {
description: err instanceof Error ? err.message : 'Unknown error',
});
}
},
[]
);
const handleSetChannelFloodScopeOverride = useCallback(
async (channelKey: string, floodScopeOverride: string) => {
try {
const updated = await api.setChannelFloodScopeOverride(channelKey, floodScopeOverride);
mergeChannelIntoList(updated);
toast.success(
updated.flood_scope_override ? 'Regional override saved' : 'Regional override cleared'
);
} catch (err) {
toast.error('Failed to update regional override', {
description: err instanceof Error ? err.message : 'Unknown error',
});
}
},
[mergeChannelIntoList]
);
const handleSenderClick = useCallback(
(sender: string) => {
messageInputRef.current?.appendText(`@[${sender}] `);
},
[messageInputRef]
);
const handleTrace = useCallback(async () => {
if (!activeConversation || activeConversation.type !== 'contact') return;
toast('Trace started...');
try {
const result = await api.requestTrace(activeConversation.id);
const parts: string[] = [];
if (result.remote_snr !== null) parts.push(`Remote SNR: ${result.remote_snr.toFixed(1)} dB`);
if (result.local_snr !== null) parts.push(`Local SNR: ${result.local_snr.toFixed(1)} dB`);
const detail = parts.join(', ');
toast.success(detail ? `Trace complete! ${detail}` : 'Trace complete!');
} catch (err) {
toast.error('Trace failed', {
description: err instanceof Error ? err.message : 'Unknown error',
});
}
}, [activeConversation]);
const handleBlockKey = useCallback(
async (key: string) => {
await handleToggleBlockedKey(key);
messageCache.clear();
jumpToBottom();
},
[handleToggleBlockedKey, jumpToBottom]
);
const handleBlockName = useCallback(
async (name: string) => {
await handleToggleBlockedName(name);
messageCache.clear();
jumpToBottom();
},
[handleToggleBlockedName, jumpToBottom]
);
return {
handleSendMessage,
handleResendChannelMessage,
handleSetChannelFloodScopeOverride,
handleSenderClick,
handleTrace,
handleBlockKey,
handleBlockName,
};
}
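The `mergeChannelIntoList` updater can be read as a pure function over the channel list (`Channel` is trimmed here to the two fields the logic touches; the real type has more):

```typescript
// Standalone sketch of mergeChannelIntoList: replace in place when the key
// already exists, otherwise append and re-sort by name.
interface ChannelLite {
  key: string;
  name: string;
}

function mergeChannel(channels: ChannelLite[], updated: ChannelLite): ChannelLite[] {
  const existingIndex = channels.findIndex((c) => c.key === updated.key);
  if (existingIndex === -1) {
    return [...channels, updated].sort((a, b) => a.name.localeCompare(b.name));
  }
  const next = [...channels];
  next[existingIndex] = updated;
  return next;
}

const merged = mergeChannel(
  [{ key: 'a', name: 'Alpha' }, { key: 'c', name: 'Charlie' }],
  { key: 'b', name: 'Bravo' }
);
const renamed = mergeChannel(merged, { key: 'b', name: 'Zulu' });
```

Note the asymmetry, which the hook shares: a brand-new channel is inserted in name order, but updating an existing entry keeps its position, so a rename is not re-sorted until the next insertion.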
+106 -184
@@ -1,11 +1,19 @@
import {
useCallback,
useEffect,
useRef,
useState,
type Dispatch,
type MutableRefObject,
type SetStateAction,
} from 'react';
import { toast } from '../components/ui/sonner';
import { api, isAbortError } from '../api';
import * as messageCache from '../messageCache';
import type { Conversation, Message, MessagePath } from '../types';
const MAX_PENDING_ACKS = 500;
const MESSAGE_PAGE_SIZE = 200;
interface PendingAckUpdate {
ackCount: number;
@@ -64,8 +72,8 @@ interface UseConversationMessagesResult {
hasOlderMessages: boolean;
hasNewerMessages: boolean;
loadingNewer: boolean;
hasNewerMessagesRef: MutableRefObject<boolean>;
setMessages: Dispatch<SetStateAction<Message[]>>;
fetchOlderMessages: () => Promise<void>;
fetchNewerMessages: () => Promise<void>;
jumpToBottom: () => void;
@@ -74,17 +82,14 @@ interface UseConversationMessagesResult {
triggerReconcile: () => void;
}
function isMessageConversation(conversation: Conversation | null): conversation is Conversation {
return !!conversation && !['raw', 'map', 'visualizer', 'search'].includes(conversation.type);
}
export function useConversationMessages(
activeConversation: Conversation | null,
targetMessageId?: number | null
): UseConversationMessagesResult {
// Track seen message content for deduplication
const seenMessageContent = useRef<Set<string>>(new Set());
@@ -92,31 +97,6 @@ export function useConversationMessages(
// Buffer latest ACK state by message_id and apply when the message arrives.
const pendingAcksRef = useRef<Map<number, PendingAckUpdate>>(new Map());
const setPendingAck = useCallback(
(messageId: number, ackCount: number, paths?: MessagePath[]) => {
const existing = pendingAcksRef.current.get(messageId);
@@ -148,33 +128,57 @@ export function useConversationMessages(
...(pending.paths !== undefined && { paths: pending.paths }),
};
}, []);
const [messages, setMessages] = useState<Message[]>([]);
const [messagesLoading, setMessagesLoading] = useState(false);
const [loadingOlder, setLoadingOlder] = useState(false);
const [hasOlderMessages, setHasOlderMessages] = useState(false);
const [hasNewerMessages, setHasNewerMessages] = useState(false);
const [loadingNewer, setLoadingNewer] = useState(false);
const abortControllerRef = useRef<AbortController | null>(null);
const fetchingConversationIdRef = useRef<string | null>(null);
const messagesRef = useRef<Message[]>([]);
const hasOlderMessagesRef = useRef(false);
const hasNewerMessagesRef = useRef(false);
const prevConversationIdRef = useRef<string | null>(null);
useEffect(() => {
messagesRef.current = messages;
}, [messages]);
useEffect(() => {
hasOlderMessagesRef.current = hasOlderMessages;
}, [hasOlderMessages]);
useEffect(() => {
hasNewerMessagesRef.current = hasNewerMessages;
}, [hasNewerMessages]);
const syncSeenContent = useCallback(
(nextMessages: Message[]) => {
seenMessageContent.current.clear();
for (const msg of nextMessages) {
seenMessageContent.current.add(getMessageContentKey(msg));
}
},
[seenMessageContent]
);
const fetchLatestMessages = useCallback(
async (showLoading = false, signal?: AbortSignal) => {
if (!isMessageConversation(activeConversation)) {
setMessages([]);
setHasOlderMessages(false);
return;
}
// Track which conversation we're fetching for
const conversationId = activeConversation.id;
if (showLoading) {
setMessagesLoading(true);
// Clear messages first so MessageList resets scroll state for new conversation
setMessages([]);
}
try {
const data = await api.getMessages(
{
@@ -185,24 +189,15 @@ export function useConversationMessages(
signal
);
// Check if this response is still for the current conversation
// This handles the race where the conversation changed while awaiting
if (fetchingConversationIdRef.current !== conversationId) {
// Stale response - conversation changed while we were fetching
return;
}
const messagesWithPendingAck = data.map((msg) => applyPendingAck(msg));
setMessages(messagesWithPendingAck);
// Track seen content for new messages
seenMessageContent.current.clear();
for (const msg of messagesWithPendingAck) {
seenMessageContent.current.add(getMessageContentKey(msg));
}
// If we got a full page, there might be more
syncSeenContent(messagesWithPendingAck);
setHasOlderMessages(messagesWithPendingAck.length >= MESSAGE_PAGE_SIZE);
} catch (err) {
// Don't show error toast for aborted requests (user switched conversations)
if (isAbortError(err)) {
return;
}
@@ -216,22 +211,46 @@ export function useConversationMessages(
}
}
},
[activeConversation, applyPendingAck]
[activeConversation, applyPendingAck, syncSeenContent]
);
const reconcileFromBackend = useCallback(
(conversation: Conversation, signal: AbortSignal) => {
const conversationId = conversation.id;
api
.getMessages(
{
type: conversation.type === 'channel' ? 'CHAN' : 'PRIV',
conversation_key: conversationId,
limit: MESSAGE_PAGE_SIZE,
},
signal
)
.then((data) => {
if (fetchingConversationIdRef.current !== conversationId) return;
const dataWithPendingAck = data.map((msg) => applyPendingAck(msg));
const merged = messageCache.reconcile(messagesRef.current, dataWithPendingAck);
if (!merged) return;
setMessages(merged);
syncSeenContent(merged);
if (dataWithPendingAck.length >= MESSAGE_PAGE_SIZE) {
setHasOlderMessages(true);
}
})
.catch((err) => {
if (isAbortError(err)) return;
console.debug('Background reconciliation failed:', err);
});
},
[applyPendingAck, syncSeenContent]
);
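The callback above relies on `messageCache.reconcile` returning `null` when the cached list already matches the backend page, so the happy path triggers zero rerenders. A standalone sketch of that contract (the field-by-field comparison here is an assumption; the real implementation lives in `messageCache`):

```typescript
// Hypothetical sketch of the reconcile contract: return null when the cached
// list already matches the backend page (no setMessages call), otherwise
// return the backend page so state updates exactly once.
interface CachedMsg {
  id: number;
  text: string;
  acked: number;
}

function reconcileSketch(cached: CachedMsg[], backend: CachedMsg[]): CachedMsg[] | null {
  const consistent =
    cached.length === backend.length &&
    cached.every(
      (msg, i) =>
        msg.id === backend[i].id && msg.text === backend[i].text && msg.acked === backend[i].acked
    );
  return consistent ? null : backend;
}
```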
// Fetch older messages (cursor-based pagination)
const fetchOlderMessages = useCallback(async () => {
if (
!activeConversation ||
activeConversation.type === 'raw' ||
loadingOlder ||
!hasOlderMessages
)
return;
if (!isMessageConversation(activeConversation) || loadingOlder || !hasOlderMessages) return;
const conversationId = activeConversation.id;
// Get the true oldest message as cursor for the next page
const oldestMessage = messages.reduce(
(oldest, msg) => {
if (!oldest) return msg;
@@ -253,20 +272,16 @@ export function useConversationMessages(
before_id: oldestMessage.id,
});
// Guard against stale response if the user switched conversations mid-request
if (fetchingConversationIdRef.current !== conversationId) return;
const dataWithPendingAck = data.map((msg) => applyPendingAck(msg));
if (dataWithPendingAck.length > 0) {
// Append the older page (results come sorted DESC, so older messages belong at the end)
setMessages((prev) => [...prev, ...dataWithPendingAck]);
// Track seen content
for (const msg of dataWithPendingAck) {
seenMessageContent.current.add(getMessageContentKey(msg));
}
}
// If we got fewer than a full page, there are no more messages
setHasOlderMessages(dataWithPendingAck.length >= MESSAGE_PAGE_SIZE);
} catch (err) {
console.error('Failed to fetch older messages:', err);
@@ -276,21 +291,12 @@ export function useConversationMessages(
} finally {
setLoadingOlder(false);
}
}, [activeConversation, loadingOlder, hasOlderMessages, messages, applyPendingAck]);
}, [activeConversation, applyPendingAck, hasOlderMessages, loadingOlder, messages]);
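The reduce above scans for the true oldest message to use as the `before_id` cursor. In isolation the pattern looks like this (a sketch; the comparator body is truncated in the diff, so the tie-break on lower id is an assumption):

```typescript
// Sketch of cursor selection for backward pagination: pick the message with
// the smallest sender_timestamp, tie-breaking on the smaller id.
// (Tie-break rule is an assumption; the real comparator is truncated above.)
interface CursorMsg {
  id: number;
  sender_timestamp: number;
}

function oldestCursor(messages: CursorMsg[]): CursorMsg | null {
  return messages.reduce<CursorMsg | null>((oldest, msg) => {
    if (!oldest) return msg;
    if (msg.sender_timestamp < oldest.sender_timestamp) return msg;
    if (msg.sender_timestamp === oldest.sender_timestamp && msg.id < oldest.id) return msg;
    return oldest;
  }, null);
}
```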
// Fetch newer messages (forward cursor pagination)
const fetchNewerMessages = useCallback(async () => {
if (
!activeConversation ||
activeConversation.type === 'raw' ||
loadingNewer ||
!hasNewerMessages
)
return;
if (!isMessageConversation(activeConversation) || loadingNewer || !hasNewerMessages) return;
const conversationId = activeConversation.id;
// Get the newest message as forward cursor
const newestMessage = messages.reduce(
(newest, msg) => {
if (!newest) return msg;
@@ -315,11 +321,10 @@ export function useConversationMessages(
if (fetchingConversationIdRef.current !== conversationId) return;
const dataWithPendingAck = data.map((msg) => applyPendingAck(msg));
// Deduplicate against already-seen messages (WS race)
const newMessages = dataWithPendingAck.filter(
(msg) => !seenMessageContent.current.has(getMessageContentKey(msg))
);
if (newMessages.length > 0) {
setMessages((prev) => [...prev, ...newMessages]);
for (const msg of newMessages) {
@@ -335,105 +340,43 @@ export function useConversationMessages(
} finally {
setLoadingNewer(false);
}
}, [activeConversation, loadingNewer, hasNewerMessages, messages, applyPendingAck]);
}, [activeConversation, applyPendingAck, hasNewerMessages, loadingNewer, messages]);
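The WS-race filter above dedupes by message content rather than database id, since a WebSocket push and a paginated fetch can deliver the same message under different ids. A minimal sketch of that pattern (the real key comes from `getMessageContentKey`; the key shape here is an assumption):

```typescript
// Assumed content key: enough fields to identify "the same message" across a
// WebSocket delivery and a REST page, regardless of row id.
type KeyedMsg = {
  type: 'PRIV' | 'CHAN';
  conversation_key: string;
  sender_timestamp: number;
  text: string;
};

function contentKey(msg: KeyedMsg): string {
  return `${msg.type}:${msg.conversation_key}:${msg.sender_timestamp}:${msg.text}`;
}

// Drop messages whose content has already been seen, then record the rest.
function dedupeAgainstSeen(incoming: KeyedMsg[], seen: Set<string>): KeyedMsg[] {
  const fresh = incoming.filter((msg) => !seen.has(contentKey(msg)));
  for (const msg of fresh) seen.add(contentKey(msg));
  return fresh;
}
```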
// Jump to bottom: re-fetch latest page, clear hasNewerMessages
const jumpToBottom = useCallback(() => {
if (!activeConversation) return;
setHasNewerMessages(false);
// Invalidate cache so fetchLatestMessages does a fresh load
messageCache.remove(activeConversation.id);
fetchMessages(true);
}, [activeConversation, fetchMessages]);
void fetchLatestMessages(true);
}, [activeConversation, fetchLatestMessages]);
// Trigger a background reconciliation for the current conversation.
// Used after WebSocket reconnects to silently recover any missed messages.
const triggerReconcile = useCallback(() => {
const conv = activeConversation;
if (
!conv ||
conv.type === 'raw' ||
conv.type === 'map' ||
conv.type === 'visualizer' ||
conv.type === 'search'
)
return;
if (!isMessageConversation(activeConversation)) return;
const controller = new AbortController();
reconcileFromBackend(conv, controller.signal);
}, [activeConversation]); // eslint-disable-line react-hooks/exhaustive-deps
reconcileFromBackend(activeConversation, controller.signal);
}, [activeConversation, reconcileFromBackend]);
// Background reconciliation: silently fetch from backend after a cache restore
// and only update state if something differs (missed WS message, stale ack, etc.).
// No-ops on the happy path — zero rerenders when cache is already consistent.
function reconcileFromBackend(conversation: Conversation, signal: AbortSignal) {
const conversationId = conversation.id;
api
.getMessages(
{
type: conversation.type === 'channel' ? 'CHAN' : 'PRIV',
conversation_key: conversationId,
limit: MESSAGE_PAGE_SIZE,
},
signal
)
.then((data) => {
// Stale check — conversation may have changed while awaiting
if (fetchingConversationIdRef.current !== conversationId) return;
const dataWithPendingAck = data.map((msg) => applyPendingAck(msg));
const merged = messageCache.reconcile(messagesRef.current, dataWithPendingAck);
if (!merged) return; // Cache was consistent — no rerender
setMessages(merged);
seenMessageContent.current.clear();
for (const msg of merged) {
seenMessageContent.current.add(getMessageContentKey(msg));
}
if (dataWithPendingAck.length >= MESSAGE_PAGE_SIZE) {
setHasOlderMessages(true);
}
})
.catch((err) => {
if (isAbortError(err)) return;
// Silent failure — we already have cached data
console.debug('Background reconciliation failed:', err);
});
}
// Fetch messages when conversation changes, with proper cancellation and caching
useEffect(() => {
// Abort any previous in-flight request
if (abortControllerRef.current) {
abortControllerRef.current.abort();
}
const prevId = prevConversationIdRef.current;
// Track which conversation we're now on
const newId = activeConversation?.id ?? null;
const conversationChanged = prevId !== newId;
fetchingConversationIdRef.current = newId;
prevConversationIdRef.current = newId;
// When targetMessageId goes from a value to null (onTargetReached cleared it)
// but the conversation hasn't changed, the around-loaded messages are already
// displayed — do nothing. Without this guard the effect would re-enter the
// normal fetch path and replace the mid-history view with the latest page.
// Preserve around-loaded context on the same conversation when search clears targetMessageId.
if (!conversationChanged && !targetMessageId) {
return;
}
// Reset loadingOlder/loadingNewer — the previous conversation's in-flight
// fetch is irrelevant now (its stale-check will discard the response).
setLoadingOlder(false);
setLoadingNewer(false);
if (conversationChanged) {
setHasNewerMessages(false);
}
// Save outgoing conversation to cache only when actually leaving it, and
// only if we were on the latest page (mid-history views would restore stale
// partial data on switch-back).
if (
conversationChanged &&
prevId &&
@@ -447,29 +390,20 @@ export function useConversationMessages(
});
}
// Clear state for non-message views
if (
!activeConversation ||
activeConversation.type === 'raw' ||
activeConversation.type === 'map' ||
activeConversation.type === 'visualizer' ||
activeConversation.type === 'search'
) {
if (!isMessageConversation(activeConversation)) {
setMessages([]);
setHasOlderMessages(false);
return;
}
// Create AbortController for this conversation's fetch (cache reconcile or full fetch)
const controller = new AbortController();
abortControllerRef.current = controller;
// Jump-to-message: skip cache and load messages around the target
if (targetMessageId) {
setMessagesLoading(true);
setMessages([]);
const msgType = activeConversation.type === 'channel' ? 'CHAN' : 'PRIV';
api
void api
.getMessagesAround(
targetMessageId,
msgType as 'PRIV' | 'CHAN',
@@ -480,10 +414,7 @@ export function useConversationMessages(
if (fetchingConversationIdRef.current !== activeConversation.id) return;
const withAcks = response.messages.map((msg) => applyPendingAck(msg));
setMessages(withAcks);
seenMessageContent.current.clear();
for (const msg of withAcks) {
seenMessageContent.current.add(getMessageContentKey(msg));
}
syncSeenContent(withAcks);
setHasOlderMessages(response.has_older);
setHasNewerMessages(response.has_newer);
})
@@ -496,30 +427,21 @@ export function useConversationMessages(
setMessagesLoading(false);
});
} else {
// Check cache for the new conversation
const cached = messageCache.get(activeConversation.id);
if (cached) {
// Restore from cache instantly — no spinner
setMessages(cached.messages);
seenMessageContent.current = new Set(cached.seenContent);
setHasOlderMessages(cached.hasOlderMessages);
setMessagesLoading(false);
// Silently reconcile with backend in case we missed a WS message
reconcileFromBackend(activeConversation, controller.signal);
} else {
// Not cached — full fetch with spinner
fetchMessages(true, controller.signal);
void fetchLatestMessages(true, controller.signal);
}
}
// Cleanup: abort request if conversation changes or component unmounts
return () => {
controller.abort();
};
// NOTE: Intentionally omitting fetchLatestMessages and activeConversation from deps:
// - fetchLatestMessages is recreated when activeConversation changes, which would cause infinite loops
// - activeConversation object identity changes on every render; we only care about id/type
// - We use fetchingConversationIdRef and AbortController to handle stale responses safely
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [activeConversation?.id, activeConversation?.type, targetMessageId]);
@@ -555,7 +477,7 @@ export function useConversationMessages(
return true;
},
[applyPendingAck]
[applyPendingAck, messagesRef, setMessages]
);
// Update a message's ack count and paths
@@ -592,7 +514,7 @@ export function useConversationMessages(
return prev;
});
},
[setPendingAck]
[messagesRef, setMessages, setPendingAck]
);
return {
@@ -0,0 +1,112 @@
import { useCallback, useState, type Dispatch, type SetStateAction } from 'react';
import type { SearchNavigateTarget } from '../components/SearchView';
import type { Channel, Conversation } from '../types';
interface UseConversationNavigationArgs {
channels: Channel[];
handleSelectConversation: (conv: Conversation) => void;
}
interface UseConversationNavigationResult {
targetMessageId: number | null;
setTargetMessageId: Dispatch<SetStateAction<number | null>>;
infoPaneContactKey: string | null;
infoPaneFromChannel: boolean;
infoPaneChannelKey: string | null;
handleOpenContactInfo: (publicKey: string, fromChannel?: boolean) => void;
handleCloseContactInfo: () => void;
handleOpenChannelInfo: (channelKey: string) => void;
handleCloseChannelInfo: () => void;
handleSelectConversationWithTargetReset: (
conv: Conversation,
options?: { preserveTarget?: boolean }
) => void;
handleNavigateToChannel: (channelKey: string) => void;
handleNavigateToMessage: (target: SearchNavigateTarget) => void;
}
export function useConversationNavigation({
channels,
handleSelectConversation,
}: UseConversationNavigationArgs): UseConversationNavigationResult {
const [targetMessageId, setTargetMessageId] = useState<number | null>(null);
const [infoPaneContactKey, setInfoPaneContactKey] = useState<string | null>(null);
const [infoPaneFromChannel, setInfoPaneFromChannel] = useState(false);
const [infoPaneChannelKey, setInfoPaneChannelKey] = useState<string | null>(null);
const handleOpenContactInfo = useCallback((publicKey: string, fromChannel?: boolean) => {
setInfoPaneContactKey(publicKey);
setInfoPaneFromChannel(fromChannel ?? false);
}, []);
const handleCloseContactInfo = useCallback(() => {
setInfoPaneContactKey(null);
}, []);
const handleOpenChannelInfo = useCallback((channelKey: string) => {
setInfoPaneChannelKey(channelKey);
}, []);
const handleCloseChannelInfo = useCallback(() => {
setInfoPaneChannelKey(null);
}, []);
const handleSelectConversationWithTargetReset = useCallback(
(conv: Conversation, options?: { preserveTarget?: boolean }) => {
if (conv.type !== 'search' && !options?.preserveTarget) {
setTargetMessageId(null);
}
handleSelectConversation(conv);
},
[handleSelectConversation]
);
const handleNavigateToChannel = useCallback(
(channelKey: string) => {
const channel = channels.find((item) => item.key === channelKey);
if (!channel) {
return;
}
handleSelectConversationWithTargetReset({
type: 'channel',
id: channel.key,
name: channel.name,
});
setInfoPaneContactKey(null);
},
[channels, handleSelectConversationWithTargetReset]
);
const handleNavigateToMessage = useCallback(
(target: SearchNavigateTarget) => {
const convType = target.type === 'CHAN' ? 'channel' : 'contact';
setTargetMessageId(target.id);
handleSelectConversationWithTargetReset(
{
type: convType,
id: target.conversation_key,
name: target.conversation_name,
},
{ preserveTarget: true }
);
},
[handleSelectConversationWithTargetReset]
);
return {
targetMessageId,
setTargetMessageId,
infoPaneContactKey,
infoPaneFromChannel,
infoPaneChannelKey,
handleOpenContactInfo,
handleCloseContactInfo,
handleOpenChannelInfo,
handleCloseChannelInfo,
handleSelectConversationWithTargetReset,
handleNavigateToChannel,
handleNavigateToMessage,
};
}
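The reset rule inside `handleSelectConversationWithTargetReset` can be restated as a pure decision, which makes the intent easier to check (a sketch mirroring the guard above):

```typescript
// Pure restatement of the jump-target reset rule: every conversation selection
// clears the target, except search views and explicit preserveTarget requests
// (search navigation sets the target itself and must not have it wiped).
type ConvType = 'channel' | 'contact' | 'raw' | 'map' | 'visualizer' | 'search';

function shouldClearTarget(convType: ConvType, preserveTarget?: boolean): boolean {
  return convType !== 'search' && !preserveTarget;
}
```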
+264
@@ -0,0 +1,264 @@
import {
useCallback,
useMemo,
type Dispatch,
type MutableRefObject,
type SetStateAction,
} from 'react';
import { api } from '../api';
import * as messageCache from '../messageCache';
import type { UseWebSocketOptions } from '../useWebSocket';
import { toast } from '../components/ui/sonner';
import { getStateKey } from '../utils/conversationState';
import { mergeContactIntoList } from '../utils/contactMerge';
import { appendRawPacketUnique } from '../utils/rawPacketIdentity';
import { getMessageContentKey } from './useConversationMessages';
import type {
Channel,
Contact,
Conversation,
HealthStatus,
Message,
MessagePath,
RawPacket,
} from '../types';
interface UseRealtimeAppStateArgs {
prevHealthRef: MutableRefObject<HealthStatus | null>;
setHealth: Dispatch<SetStateAction<HealthStatus | null>>;
fetchConfig: () => void | Promise<void>;
setRawPackets: Dispatch<SetStateAction<RawPacket[]>>;
triggerReconcile: () => void;
refreshUnreads: () => Promise<void>;
setChannels: Dispatch<SetStateAction<Channel[]>>;
fetchAllContacts: () => Promise<Contact[]>;
setContacts: Dispatch<SetStateAction<Contact[]>>;
blockedKeysRef: MutableRefObject<string[]>;
blockedNamesRef: MutableRefObject<string[]>;
activeConversationRef: MutableRefObject<Conversation | null>;
hasNewerMessagesRef: MutableRefObject<boolean>;
addMessageIfNew: (msg: Message) => boolean;
trackNewMessage: (msg: Message) => void;
incrementUnread: (stateKey: string, hasMention?: boolean) => void;
checkMention: (text: string) => boolean;
pendingDeleteFallbackRef: MutableRefObject<boolean>;
setActiveConversation: (conv: Conversation | null) => void;
updateMessageAck: (messageId: number, ackCount: number, paths?: MessagePath[]) => void;
maxRawPackets?: number;
}
function isMessageBlocked(msg: Message, blockedKeys: string[], blockedNames: string[]): boolean {
if (msg.outgoing) {
return false;
}
if (blockedKeys.length > 0) {
if (msg.type === 'PRIV' && blockedKeys.includes(msg.conversation_key.toLowerCase())) {
return true;
}
if (
msg.type === 'CHAN' &&
msg.sender_key &&
blockedKeys.includes(msg.sender_key.toLowerCase())
) {
return true;
}
}
return blockedNames.length > 0 && !!msg.sender_name && blockedNames.includes(msg.sender_name);
}
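`isMessageBlocked` is pure, so its precedence (outgoing never blocked, then key match, then name match) can be exercised in isolation. A trimmed restatement (the message shape is reduced to the fields the guard reads; blocked keys are assumed pre-lowercased, as the `.toLowerCase()` comparisons imply):

```typescript
// Trimmed message shape: only the fields the block guard reads.
type BlockMsg = {
  outgoing: boolean;
  type: 'PRIV' | 'CHAN';
  conversation_key: string;
  sender_key: string | null;
  sender_name: string | null;
};

// Same precedence as isMessageBlocked: outgoing messages are never blocked;
// PRIV matches on conversation key, CHAN on sender key; names match exactly.
function isBlocked(msg: BlockMsg, blockedKeys: string[], blockedNames: string[]): boolean {
  if (msg.outgoing) return false;
  if (blockedKeys.length > 0) {
    if (msg.type === 'PRIV' && blockedKeys.includes(msg.conversation_key.toLowerCase())) {
      return true;
    }
    if (msg.type === 'CHAN' && msg.sender_key && blockedKeys.includes(msg.sender_key.toLowerCase())) {
      return true;
    }
  }
  return blockedNames.length > 0 && !!msg.sender_name && blockedNames.includes(msg.sender_name);
}
```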
function isActiveConversationMessage(
activeConversation: Conversation | null,
msg: Message
): boolean {
if (!activeConversation) return false;
if (msg.type === 'CHAN' && activeConversation.type === 'channel') {
return msg.conversation_key === activeConversation.id;
}
if (msg.type === 'PRIV' && activeConversation.type === 'contact') {
return msg.conversation_key === activeConversation.id;
}
return false;
}
export function useRealtimeAppState({
prevHealthRef,
setHealth,
fetchConfig,
setRawPackets,
triggerReconcile,
refreshUnreads,
setChannels,
fetchAllContacts,
setContacts,
blockedKeysRef,
blockedNamesRef,
activeConversationRef,
hasNewerMessagesRef,
addMessageIfNew,
trackNewMessage,
incrementUnread,
checkMention,
pendingDeleteFallbackRef,
setActiveConversation,
updateMessageAck,
maxRawPackets = 500,
}: UseRealtimeAppStateArgs): UseWebSocketOptions {
const mergeChannelIntoList = useCallback(
(updated: Channel) => {
setChannels((prev) => {
const existingIndex = prev.findIndex((channel) => channel.key === updated.key);
if (existingIndex === -1) {
return [...prev, updated].sort((a, b) => a.name.localeCompare(b.name));
}
const next = [...prev];
next[existingIndex] = updated;
return next;
});
},
[setChannels]
);
return useMemo(
() => ({
onHealth: (data: HealthStatus) => {
const prev = prevHealthRef.current;
prevHealthRef.current = data;
setHealth(data);
const initializationCompleted =
prev !== null &&
prev.radio_connected &&
prev.radio_initializing &&
data.radio_connected &&
!data.radio_initializing;
if (prev !== null && prev.radio_connected !== data.radio_connected) {
if (data.radio_connected) {
toast.success('Radio connected', {
description: data.connection_info
? `Connected via ${data.connection_info}`
: undefined,
});
fetchConfig();
} else {
toast.error('Radio disconnected', {
description: 'Check radio connection and power',
});
}
}
if (initializationCompleted) {
fetchConfig();
}
},
onError: (error: { message: string; details?: string }) => {
toast.error(error.message, {
description: error.details,
});
},
onSuccess: (success: { message: string; details?: string }) => {
toast.success(success.message, {
description: success.details,
});
},
onReconnect: () => {
setRawPackets([]);
triggerReconcile();
refreshUnreads();
api.getChannels().then(setChannels).catch(console.error);
fetchAllContacts()
.then((data) => setContacts(data))
.catch(console.error);
},
onMessage: (msg: Message) => {
if (isMessageBlocked(msg, blockedKeysRef.current, blockedNamesRef.current)) {
return;
}
const isForActiveConversation = isActiveConversationMessage(
activeConversationRef.current,
msg
);
if (isForActiveConversation && !hasNewerMessagesRef.current) {
addMessageIfNew(msg);
}
trackNewMessage(msg);
const contentKey = getMessageContentKey(msg);
if (!isForActiveConversation) {
const isNew = messageCache.addMessage(msg.conversation_key, msg, contentKey);
if (!msg.outgoing && isNew) {
let stateKey: string | null = null;
if (msg.type === 'CHAN' && msg.conversation_key) {
stateKey = getStateKey('channel', msg.conversation_key);
} else if (msg.type === 'PRIV' && msg.conversation_key) {
stateKey = getStateKey('contact', msg.conversation_key);
}
if (stateKey) {
incrementUnread(stateKey, checkMention(msg.text));
}
}
}
},
onContact: (contact: Contact) => {
setContacts((prev) => mergeContactIntoList(prev, contact));
},
onChannel: (channel: Channel) => {
mergeChannelIntoList(channel);
},
onContactDeleted: (publicKey: string) => {
setContacts((prev) => prev.filter((c) => c.public_key !== publicKey));
messageCache.remove(publicKey);
const active = activeConversationRef.current;
if (active?.type === 'contact' && active.id === publicKey) {
pendingDeleteFallbackRef.current = true;
setActiveConversation(null);
}
},
onChannelDeleted: (key: string) => {
setChannels((prev) => prev.filter((c) => c.key !== key));
messageCache.remove(key);
const active = activeConversationRef.current;
if (active?.type === 'channel' && active.id === key) {
pendingDeleteFallbackRef.current = true;
setActiveConversation(null);
}
},
onRawPacket: (packet: RawPacket) => {
setRawPackets((prev) => appendRawPacketUnique(prev, packet, maxRawPackets));
},
onMessageAcked: (messageId: number, ackCount: number, paths?: MessagePath[]) => {
updateMessageAck(messageId, ackCount, paths);
messageCache.updateAck(messageId, ackCount, paths);
},
}),
[
activeConversationRef,
addMessageIfNew,
blockedKeysRef,
blockedNamesRef,
checkMention,
fetchAllContacts,
fetchConfig,
hasNewerMessagesRef,
incrementUnread,
maxRawPackets,
mergeChannelIntoList,
pendingDeleteFallbackRef,
prevHealthRef,
refreshUnreads,
setActiveConversation,
setChannels,
setContacts,
setHealth,
setRawPackets,
trackNewMessage,
triggerReconcile,
updateMessageAck,
]
);
}
+195
@@ -0,0 +1,195 @@
import React from 'react';
import { render, screen, waitFor } from '@testing-library/react';
import { beforeEach, describe, expect, it, vi } from 'vitest';
import { ConversationPane } from '../components/ConversationPane';
import type {
Channel,
Contact,
Conversation,
Favorite,
HealthStatus,
Message,
RadioConfig,
} from '../types';
vi.mock('../components/ChatHeader', () => ({
ChatHeader: () => <div data-testid="chat-header" />,
}));
vi.mock('../components/MessageList', () => ({
MessageList: () => <div data-testid="message-list" />,
}));
vi.mock('../components/MessageInput', () => ({
MessageInput: React.forwardRef((_props, ref) => {
React.useImperativeHandle(ref, () => ({ appendText: vi.fn() }));
return <div data-testid="message-input" />;
}),
}));
vi.mock('../components/RawPacketList', () => ({
RawPacketList: () => <div data-testid="raw-packet-list" />,
}));
vi.mock('../components/RepeaterDashboard', () => ({
RepeaterDashboard: () => <div data-testid="repeater-dashboard" />,
}));
vi.mock('../components/MapView', () => ({
MapView: () => <div data-testid="map-view" />,
}));
vi.mock('../components/VisualizerView', () => ({
VisualizerView: () => <div data-testid="visualizer-view" />,
}));
const config: RadioConfig = {
public_key: 'aa'.repeat(32),
name: 'Radio',
lat: 1,
lon: 2,
tx_power: 17,
max_tx_power: 22,
radio: { freq: 910.525, bw: 62.5, sf: 7, cr: 5 },
path_hash_mode: 0,
path_hash_mode_supported: true,
};
const health: HealthStatus = {
status: 'ok',
radio_connected: true,
radio_initializing: false,
connection_info: 'serial',
database_size_mb: 1,
oldest_undecrypted_timestamp: null,
fanout_statuses: {},
bots_disabled: false,
};
const channel: Channel = {
key: '8B3387E9C5CDEA6AC9E5EDBAA115CD72',
name: 'Public',
is_hashtag: false,
on_radio: false,
last_read_at: null,
};
const message: Message = {
id: 1,
type: 'CHAN',
conversation_key: channel.key,
text: 'hello',
sender_timestamp: 1700000000,
received_at: 1700000001,
paths: null,
txt_type: 0,
signature: null,
sender_key: null,
outgoing: false,
acked: 0,
sender_name: null,
};
function createProps(overrides: Partial<React.ComponentProps<typeof ConversationPane>> = {}) {
return {
activeConversation: null as Conversation | null,
contacts: [] as Contact[],
channels: [channel],
rawPackets: [],
config,
health,
favorites: [] as Favorite[],
messages: [message],
messagesLoading: false,
loadingOlder: false,
hasOlderMessages: false,
targetMessageId: null,
hasNewerMessages: false,
loadingNewer: false,
messageInputRef: { current: null },
onTrace: vi.fn(async () => {}),
onToggleFavorite: vi.fn(async () => {}),
onDeleteContact: vi.fn(async () => {}),
onDeleteChannel: vi.fn(async () => {}),
onSetChannelFloodScopeOverride: vi.fn(async () => {}),
onOpenContactInfo: vi.fn(),
onOpenChannelInfo: vi.fn(),
onSenderClick: vi.fn(),
onLoadOlder: vi.fn(async () => {}),
onResendChannelMessage: vi.fn(async () => {}),
onTargetReached: vi.fn(),
onLoadNewer: vi.fn(async () => {}),
onJumpToBottom: vi.fn(),
onSendMessage: vi.fn(async () => {}),
...overrides,
};
}
describe('ConversationPane', () => {
beforeEach(() => {
vi.clearAllMocks();
});
it('renders the empty state when no conversation is active', () => {
render(<ConversationPane {...createProps()} />);
expect(screen.getByText('Select a conversation or start a new one')).toBeInTheDocument();
});
it('renders repeater dashboard instead of chat chrome for repeater contacts', async () => {
render(
<ConversationPane
{...createProps({
activeConversation: {
type: 'contact',
id: 'bb'.repeat(32),
name: 'Repeater',
},
contacts: [
{
public_key: 'bb'.repeat(32),
name: 'Repeater',
type: 2,
flags: 0,
last_path: null,
last_path_len: 0,
out_path_hash_mode: 0,
last_advert: null,
lat: null,
lon: null,
last_seen: null,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
},
],
})}
/>
);
expect(await screen.findByTestId('repeater-dashboard')).toBeInTheDocument();
expect(screen.queryByTestId('message-list')).not.toBeInTheDocument();
});
it('renders chat chrome for normal channel conversations', async () => {
render(
<ConversationPane
{...createProps({
activeConversation: {
type: 'channel',
id: channel.key,
name: channel.name,
},
})}
/>
);
await waitFor(() => {
expect(screen.getByTestId('chat-header')).toBeInTheDocument();
expect(screen.getByTestId('message-list')).toBeInTheDocument();
expect(screen.getByTestId('message-input')).toBeInTheDocument();
});
});
});
+64
@@ -0,0 +1,64 @@
import { render, screen } from '@testing-library/react';
import { describe, expect, it } from 'vitest';
import { MessageList } from '../components/MessageList';
import type { Message } from '../types';
function createMessage(overrides: Partial<Message> = {}): Message {
return {
id: 1,
type: 'CHAN',
conversation_key: 'C3B889530D4F02DB5662EA13C417F530',
text: 'Alice: hello world',
sender_timestamp: 1700000000,
received_at: 1700000001,
paths: null,
txt_type: 0,
signature: null,
sender_key: null,
outgoing: false,
acked: 0,
sender_name: null,
...overrides,
};
}
describe('MessageList channel sender rendering', () => {
it('renders explicit corrupt placeholder and warning avatar for unnamed corrupt channel packets', () => {
render(
<MessageList
messages={[
createMessage({
text: "Nv\x0ek\x16ɩ'\x7fg:",
sender_name: null,
sender_key: null,
}),
]}
contacts={[]}
loading={false}
/>
);
expect(screen.getByText('<No name -- corrupt packet?>')).toBeInTheDocument();
expect(screen.getByTestId('corrupt-avatar')).toBeInTheDocument();
});
it('prefers stored sender_name for channel messages even when text is not sender-prefixed', () => {
render(
<MessageList
messages={[
createMessage({
text: 'garbled payload with no sender prefix',
sender_name: 'Alice',
sender_key: 'ab'.repeat(32),
}),
]}
contacts={[]}
loading={false}
/>
);
expect(screen.getByText('Alice')).toBeInTheDocument();
expect(screen.getByText('A')).toBeInTheDocument();
});
});
+48
@@ -0,0 +1,48 @@
import { act, renderHook } from '@testing-library/react';
import { describe, expect, it } from 'vitest';
import { useAppShell } from '../hooks/useAppShell';
describe('useAppShell', () => {
it('opens new-message modal and closes the sidebar', () => {
const { result } = renderHook(() => useAppShell());
act(() => {
result.current.setSidebarOpen(true);
result.current.handleOpenNewMessage();
});
expect(result.current.showNewMessage).toBe(true);
expect(result.current.sidebarOpen).toBe(false);
});
it('toggles settings mode and closes the sidebar', () => {
const { result } = renderHook(() => useAppShell());
act(() => {
result.current.setSidebarOpen(true);
result.current.handleToggleSettingsView();
});
expect(result.current.showSettings).toBe(true);
expect(result.current.sidebarOpen).toBe(false);
act(() => {
result.current.handleCloseSettingsView();
});
expect(result.current.showSettings).toBe(false);
});
it('toggles the cracker shell without affecting sidebar state', () => {
const { result } = renderHook(() => useAppShell());
act(() => {
result.current.setSidebarOpen(true);
result.current.handleToggleCracker();
});
expect(result.current.showCracker).toBe(true);
expect(result.current.sidebarOpen).toBe(true);
});
});
@@ -0,0 +1,146 @@
import { act, renderHook } from '@testing-library/react';
import { beforeEach, describe, expect, it, vi } from 'vitest';
import { useConversationActions } from '../hooks/useConversationActions';
import type { Channel, Conversation, Message } from '../types';
const mocks = vi.hoisted(() => ({
api: {
requestTrace: vi.fn(),
resendChannelMessage: vi.fn(),
sendChannelMessage: vi.fn(),
sendDirectMessage: vi.fn(),
setChannelFloodScopeOverride: vi.fn(),
},
messageCache: {
clear: vi.fn(),
},
toast: {
success: vi.fn(),
error: vi.fn(),
},
}));
vi.mock('../api', () => ({
api: mocks.api,
}));
vi.mock('../messageCache', () => mocks.messageCache);
vi.mock('../components/ui/sonner', () => ({
toast: mocks.toast,
}));
const publicChannel: Channel = {
key: '8B3387E9C5CDEA6AC9E5EDBAA115CD72',
name: 'Public',
is_hashtag: false,
on_radio: false,
last_read_at: null,
};
const sentMessage: Message = {
id: 42,
type: 'CHAN',
conversation_key: publicChannel.key,
text: 'hello mesh',
sender_timestamp: 1700000000,
received_at: 1700000001,
paths: null,
txt_type: 0,
signature: null,
sender_key: null,
outgoing: true,
acked: 0,
sender_name: 'Radio',
};
function createArgs(overrides: Partial<Parameters<typeof useConversationActions>[0]> = {}) {
const activeConversation: Conversation = {
type: 'channel',
id: publicChannel.key,
name: publicChannel.name,
};
return {
activeConversation,
activeConversationRef: { current: activeConversation },
setChannels: vi.fn(),
addMessageIfNew: vi.fn(() => true),
jumpToBottom: vi.fn(),
handleToggleBlockedKey: vi.fn(async () => {}),
handleToggleBlockedName: vi.fn(async () => {}),
messageInputRef: { current: { appendText: vi.fn() } },
...overrides,
};
}
describe('useConversationActions', () => {
beforeEach(() => {
vi.clearAllMocks();
});
it('appends a sent message when the user is still in the same conversation', async () => {
mocks.api.sendChannelMessage.mockResolvedValue(sentMessage);
const args = createArgs();
const { result } = renderHook(() => useConversationActions(args));
await act(async () => {
await result.current.handleSendMessage(sentMessage.text);
});
expect(mocks.api.sendChannelMessage).toHaveBeenCalledWith(publicChannel.key, sentMessage.text);
expect(args.addMessageIfNew).toHaveBeenCalledWith(sentMessage);
});
it('does not append a sent message after the active conversation changes', async () => {
let resolveSend: ((message: Message) => void) | null = null;
mocks.api.sendChannelMessage.mockImplementation(
() =>
new Promise<Message>((resolve) => {
resolveSend = resolve;
})
);
const args = createArgs();
const { result } = renderHook(() => useConversationActions(args));
await act(async () => {
const sendPromise = result.current.handleSendMessage(sentMessage.text);
args.activeConversationRef.current = {
type: 'contact',
id: 'aa'.repeat(32),
name: 'Alice',
};
resolveSend?.(sentMessage);
await sendPromise;
});
expect(args.addMessageIfNew).not.toHaveBeenCalled();
});
it('clears cached messages and jumps to the latest page after blocking a key', async () => {
const args = createArgs();
const { result } = renderHook(() => useConversationActions(args));
await act(async () => {
await result.current.handleBlockKey('cc'.repeat(32));
});
expect(args.handleToggleBlockedKey).toHaveBeenCalledWith('cc'.repeat(32));
expect(mocks.messageCache.clear).toHaveBeenCalledTimes(1);
expect(args.jumpToBottom).toHaveBeenCalledTimes(1);
});
it('appends sender mentions into the message input', () => {
const args = createArgs();
const { result } = renderHook(() => useConversationActions(args));
act(() => {
result.current.handleSenderClick('Alice');
});
expect(args.messageInputRef.current?.appendText).toHaveBeenCalledWith('@[Alice] ');
});
});
@@ -392,4 +392,65 @@ describe('useConversationMessages forward pagination', () => {
expect(result.current.messages).toHaveLength(1);
expect(result.current.messages[0].text).toBe('latest-msg');
});
it('preserves around-loaded messages when the jump target is cleared in the same conversation', async () => {
const conv: Conversation = { type: 'channel', id: 'ch1', name: 'Channel' };
const aroundMessages = [
createMessage({
id: 4,
conversation_key: 'ch1',
text: 'older-context',
sender_timestamp: 1700000004,
received_at: 1700000004,
}),
createMessage({
id: 5,
conversation_key: 'ch1',
text: 'target-message',
sender_timestamp: 1700000005,
received_at: 1700000005,
}),
createMessage({
id: 6,
conversation_key: 'ch1',
text: 'newer-context',
sender_timestamp: 1700000006,
received_at: 1700000006,
}),
];
mockGetMessagesAround.mockResolvedValueOnce({
messages: aroundMessages,
has_older: true,
has_newer: true,
});
const { result, rerender } = renderHook<
ReturnType<typeof useConversationMessages>,
{ conv: Conversation; target: number | null }
>(({ conv, target }) => useConversationMessages(conv, target), {
initialProps: { conv, target: 5 },
});
await waitFor(() => expect(result.current.messagesLoading).toBe(false));
expect(result.current.messages.map((message) => message.text)).toEqual([
'older-context',
'target-message',
'newer-context',
]);
expect(mockGetMessages).not.toHaveBeenCalled();
rerender({ conv, target: null });
await waitFor(() =>
expect(result.current.messages.map((message) => message.text)).toEqual([
'older-context',
'target-message',
'newer-context',
])
);
expect(mockGetMessages).not.toHaveBeenCalled();
expect(result.current.hasNewerMessages).toBe(true);
});
});
@@ -0,0 +1,82 @@
import { act, renderHook } from '@testing-library/react';
import { describe, expect, it, vi } from 'vitest';
import { useConversationNavigation } from '../hooks/useConversationNavigation';
import type { Channel } from '../types';
const publicChannel: Channel = {
key: '8B3387E9C5CDEA6AC9E5EDBAA115CD72',
name: 'Public',
is_hashtag: false,
on_radio: false,
last_read_at: null,
};
function createArgs(overrides: Partial<Parameters<typeof useConversationNavigation>[0]> = {}) {
return {
channels: [publicChannel],
handleSelectConversation: vi.fn(),
...overrides,
};
}
describe('useConversationNavigation', () => {
it('resets the jump target when switching to a non-search conversation', () => {
const args = createArgs();
const { result } = renderHook(() => useConversationNavigation(args));
act(() => {
result.current.setTargetMessageId(10);
result.current.handleSelectConversationWithTargetReset({
type: 'contact',
id: 'aa'.repeat(32),
name: 'Alice',
});
});
expect(result.current.targetMessageId).toBeNull();
expect(args.handleSelectConversation).toHaveBeenCalledWith({
type: 'contact',
id: 'aa'.repeat(32),
name: 'Alice',
});
});
it('preserves the jump target when navigating from search results', () => {
const args = createArgs();
const { result } = renderHook(() => useConversationNavigation(args));
act(() => {
result.current.handleNavigateToMessage({
id: 321,
type: 'CHAN',
conversation_key: publicChannel.key,
conversation_name: publicChannel.name,
});
});
expect(result.current.targetMessageId).toBe(321);
expect(args.handleSelectConversation).toHaveBeenCalledWith({
type: 'channel',
id: publicChannel.key,
name: publicChannel.name,
});
});
it('closes the contact info pane when navigating to a channel', () => {
const args = createArgs();
const { result } = renderHook(() => useConversationNavigation(args));
act(() => {
result.current.handleOpenContactInfo('bb'.repeat(32), true);
result.current.handleNavigateToChannel(publicChannel.key);
});
expect(result.current.infoPaneContactKey).toBeNull();
expect(args.handleSelectConversation).toHaveBeenCalledWith({
type: 'channel',
id: publicChannel.key,
name: publicChannel.name,
});
});
});
@@ -0,0 +1,216 @@
import { act, renderHook, waitFor } from '@testing-library/react';
import { beforeEach, describe, expect, it, vi } from 'vitest';
import { useRealtimeAppState } from '../hooks/useRealtimeAppState';
import type { Channel, Contact, Conversation, HealthStatus, Message, RawPacket } from '../types';
const mocks = vi.hoisted(() => ({
api: {
getChannels: vi.fn(),
},
toast: {
success: vi.fn(),
error: vi.fn(),
},
messageCache: {
addMessage: vi.fn(),
remove: vi.fn(),
updateAck: vi.fn(),
},
}));
vi.mock('../api', () => ({
api: mocks.api,
}));
vi.mock('../components/ui/sonner', () => ({
toast: mocks.toast,
}));
vi.mock('../messageCache', () => mocks.messageCache);
const publicChannel: Channel = {
key: '8B3387E9C5CDEA6AC9E5EDBAA115CD72',
name: 'Public',
is_hashtag: false,
on_radio: false,
last_read_at: null,
};
const incomingDm: Message = {
id: 7,
type: 'PRIV',
conversation_key: 'aa'.repeat(32),
text: 'hello',
sender_timestamp: 1700000000,
received_at: 1700000001,
paths: null,
txt_type: 0,
signature: null,
sender_key: 'aa'.repeat(32),
outgoing: false,
acked: 0,
sender_name: 'Alice',
};
function createRealtimeArgs(overrides: Partial<Parameters<typeof useRealtimeAppState>[0]> = {}) {
const setHealth = vi.fn();
const setRawPackets = vi.fn();
const setChannels = vi.fn();
const setContacts = vi.fn();
return {
args: {
prevHealthRef: { current: null as HealthStatus | null },
setHealth,
fetchConfig: vi.fn(),
setRawPackets,
triggerReconcile: vi.fn(),
refreshUnreads: vi.fn(async () => {}),
setChannels,
fetchAllContacts: vi.fn(async () => [] as Contact[]),
setContacts,
blockedKeysRef: { current: [] as string[] },
blockedNamesRef: { current: [] as string[] },
activeConversationRef: { current: null as Conversation | null },
hasNewerMessagesRef: { current: false },
addMessageIfNew: vi.fn(),
trackNewMessage: vi.fn(),
incrementUnread: vi.fn(),
checkMention: vi.fn(() => false),
pendingDeleteFallbackRef: { current: false },
setActiveConversation: vi.fn(),
updateMessageAck: vi.fn(),
...overrides,
},
fns: {
setHealth,
setRawPackets,
setChannels,
setContacts,
},
};
}
describe('useRealtimeAppState', () => {
beforeEach(() => {
vi.clearAllMocks();
mocks.api.getChannels.mockResolvedValue([publicChannel]);
});
it('reconnect clears raw packets and refetches channels/contacts/unreads', async () => {
const contacts: Contact[] = [
{
public_key: 'bb'.repeat(32),
name: 'Bob',
type: 1,
flags: 0,
last_path: null,
last_path_len: 0,
out_path_hash_mode: 0,
last_advert: null,
lat: null,
lon: null,
last_seen: null,
on_radio: false,
last_contacted: null,
last_read_at: null,
first_seen: null,
},
];
const { args, fns } = createRealtimeArgs({
fetchAllContacts: vi.fn(async () => contacts),
});
const { result } = renderHook(() => useRealtimeAppState(args));
act(() => {
result.current.onReconnect?.();
});
await waitFor(() => {
expect(args.triggerReconcile).toHaveBeenCalledTimes(1);
expect(args.refreshUnreads).toHaveBeenCalledTimes(1);
expect(mocks.api.getChannels).toHaveBeenCalledTimes(1);
expect(args.fetchAllContacts).toHaveBeenCalledTimes(1);
expect(fns.setRawPackets).toHaveBeenCalledWith([]);
expect(fns.setChannels).toHaveBeenCalledWith([publicChannel]);
expect(fns.setContacts).toHaveBeenCalledWith(contacts);
});
});
it('tracks unread state for a new non-active incoming message', () => {
mocks.messageCache.addMessage.mockReturnValue(true);
const { args } = createRealtimeArgs({
checkMention: vi.fn(() => true),
});
const { result } = renderHook(() => useRealtimeAppState(args));
act(() => {
result.current.onMessage?.(incomingDm);
});
expect(args.addMessageIfNew).not.toHaveBeenCalled();
expect(args.trackNewMessage).toHaveBeenCalledWith(incomingDm);
expect(mocks.messageCache.addMessage).toHaveBeenCalledWith(
incomingDm.conversation_key,
incomingDm,
expect.any(String)
);
expect(args.incrementUnread).toHaveBeenCalledWith(
`contact-${incomingDm.conversation_key}`,
true
);
});
it('deleting the active contact clears it and marks fallback recovery pending', () => {
const pendingDeleteFallbackRef = { current: false };
const activeConversationRef = {
current: {
type: 'contact',
id: incomingDm.conversation_key,
name: 'Alice',
} satisfies Conversation,
};
const { args, fns } = createRealtimeArgs({
activeConversationRef,
pendingDeleteFallbackRef,
});
const { result } = renderHook(() => useRealtimeAppState(args));
act(() => {
result.current.onContactDeleted?.(incomingDm.conversation_key);
});
expect(fns.setContacts).toHaveBeenCalledWith(expect.any(Function));
expect(mocks.messageCache.remove).toHaveBeenCalledWith(incomingDm.conversation_key);
expect(args.setActiveConversation).toHaveBeenCalledWith(null);
expect(pendingDeleteFallbackRef.current).toBe(true);
});
it('appends raw packets using observation identity dedup', () => {
const { args, fns } = createRealtimeArgs();
const packet: RawPacket = {
id: 1,
observation_id: 2,
timestamp: 1700000000,
data: 'aabb',
payload_type: 'GROUP_TEXT',
snr: 7.5,
rssi: -80,
decrypted: false,
decrypted_info: null,
};
const { result } = renderHook(() => useRealtimeAppState(args));
act(() => {
result.current.onRawPacket?.(packet);
});
expect(fns.setRawPackets).toHaveBeenCalledWith(expect.any(Function));
});
});
@@ -0,0 +1,78 @@
import { render, screen } from '@testing-library/react';
import { describe, expect, it, vi } from 'vitest';
import { VisualizerTooltip } from '../components/visualizer/VisualizerTooltip';
import type { GraphNode } from '../components/visualizer/shared';
function createNode(overrides: Partial<GraphNode> & Pick<GraphNode, 'id' | 'type'>): GraphNode {
return {
id: overrides.id,
type: overrides.type,
name: overrides.name ?? null,
isAmbiguous: overrides.isAmbiguous ?? false,
lastActivity: overrides.lastActivity ?? Date.now(),
x: overrides.x ?? 0,
y: overrides.y ?? 0,
z: overrides.z ?? 0,
probableIdentity: overrides.probableIdentity,
ambiguousNames: overrides.ambiguousNames,
lastActivityReason: overrides.lastActivityReason,
};
}
describe('VisualizerTooltip', () => {
it('renders nothing without an active node', () => {
const { container } = render(
<VisualizerTooltip activeNodeId={null} nodes={new Map()} neighborIds={[]} />
);
expect(container).toBeEmptyDOMElement();
});
it('renders ambiguous node details and neighbors', () => {
vi.useFakeTimers();
vi.setSystemTime(new Date('2026-03-10T22:00:00Z'));
const node = createNode({
id: '?32',
type: 'repeater',
name: 'Likely Relay',
isAmbiguous: true,
probableIdentity: 'Likely Relay',
ambiguousNames: ['Relay A', 'Relay B'],
lastActivity: new Date('2026-03-10T21:58:30Z').getTime(),
lastActivityReason: 'Relayed GT',
});
const neighbor = createNode({
id: 'abcd1234ef56',
type: 'client',
name: 'Neighbor Node',
ambiguousNames: ['Alt Neighbor'],
});
render(
<VisualizerTooltip
activeNodeId={node.id}
nodes={
new Map([
[node.id, node],
[neighbor.id, neighbor],
])
}
neighborIds={[neighbor.id]}
/>
);
expect(screen.getByText('Likely Relay')).toBeInTheDocument();
expect(screen.getByText('ID: ?32')).toBeInTheDocument();
expect(screen.getByText('Type: repeater (ambiguous)')).toBeInTheDocument();
expect(screen.getByText('Probably: Likely Relay')).toBeInTheDocument();
expect(screen.getByText('Other possible: Relay A, Relay B')).toBeInTheDocument();
expect(screen.getByText('Last active: 1m 30s ago')).toBeInTheDocument();
expect(screen.getByText('Reason: Relayed GT')).toBeInTheDocument();
expect(screen.getByText('Neighbor Node')).toBeInTheDocument();
expect(screen.getByText('(Alt Neighbor)')).toBeInTheDocument();
vi.useRealTimers();
});
});
@@ -0,0 +1,41 @@
import { describe, expect, it } from 'vitest';
import { parseWsEvent } from '../wsEvents';
describe('wsEvents', () => {
it('parses contact_deleted events', () => {
const event = parseWsEvent(
JSON.stringify({ type: 'contact_deleted', data: { public_key: 'aa' } })
);
expect(event).toEqual({
type: 'contact_deleted',
data: { public_key: 'aa' },
});
});
it('parses channel_deleted events', () => {
const event = parseWsEvent(JSON.stringify({ type: 'channel_deleted', data: { key: 'bb' } }));
expect(event).toEqual({
type: 'channel_deleted',
data: { key: 'bb' },
});
});
it('returns unknown events with rawType preserved', () => {
const event = parseWsEvent(JSON.stringify({ type: 'mystery', data: { ok: true } }));
expect(event).toEqual({
type: 'unknown',
rawType: 'mystery',
data: { ok: true },
});
});
it('rejects invalid envelopes', () => {
expect(() => parseWsEvent(JSON.stringify({ data: {} }))).toThrow(
'Invalid WebSocket event envelope'
);
});
});
@@ -1,10 +1,6 @@
import { useEffect, useRef, useCallback } from 'react';
import type { Channel, HealthStatus, Contact, Message, MessagePath, RawPacket } from './types';
interface WebSocketMessage {
type: string;
data: unknown;
}
import { parseWsEvent } from './wsEvents';
interface ErrorEvent {
message: string;
@@ -16,7 +12,7 @@ interface SuccessEvent {
details?: string;
}
interface UseWebSocketOptions {
export interface UseWebSocketOptions {
onHealth?: (health: HealthStatus) => void;
onMessage?: (message: Message) => void;
onContact?: (contact: Contact) => void;
@@ -92,7 +88,7 @@ export function useWebSocket(options: UseWebSocketOptions) {
ws.onmessage = (event) => {
try {
const msg: WebSocketMessage = JSON.parse(event.data);
const msg = parseWsEvent(event.data);
// Access handlers through ref to always use current versions
const handlers = optionsRef.current;
@@ -136,8 +132,8 @@ export function useWebSocket(options: UseWebSocketOptions) {
case 'pong':
// Heartbeat response, ignore
break;
default:
console.warn('Unknown WebSocket message type:', msg.type);
case 'unknown':
console.warn('Unknown WebSocket message type:', msg.rawType);
}
} catch (e) {
console.error('Failed to parse WebSocket message:', e);
@@ -0,0 +1,78 @@
import type { Channel, Contact, HealthStatus, Message, MessagePath, RawPacket } from './types';
export interface MessageAckedPayload {
message_id: number;
ack_count: number;
paths?: MessagePath[];
}
export interface ContactDeletedPayload {
public_key: string;
}
export interface ChannelDeletedPayload {
key: string;
}
export interface ToastPayload {
message: string;
details?: string;
}
export type KnownWsEvent =
| { type: 'health'; data: HealthStatus }
| { type: 'message'; data: Message }
| { type: 'contact'; data: Contact }
| { type: 'channel'; data: Channel }
| { type: 'contact_deleted'; data: ContactDeletedPayload }
| { type: 'channel_deleted'; data: ChannelDeletedPayload }
| { type: 'raw_packet'; data: RawPacket }
| { type: 'message_acked'; data: MessageAckedPayload }
| { type: 'error'; data: ToastPayload }
| { type: 'success'; data: ToastPayload }
| { type: 'pong'; data?: null };
export interface UnknownWsEvent {
type: 'unknown';
rawType: string;
data: unknown;
}
export type ParsedWsEvent = KnownWsEvent | UnknownWsEvent;
interface RawWsEnvelope {
type?: unknown;
data?: unknown;
}
export function parseWsEvent(raw: string): ParsedWsEvent {
const parsed: RawWsEnvelope = JSON.parse(raw);
if (!parsed || typeof parsed !== 'object' || typeof parsed.type !== 'string') {
throw new Error('Invalid WebSocket event envelope');
}
switch (parsed.type) {
case 'health':
case 'message':
case 'contact':
case 'channel':
case 'contact_deleted':
case 'channel_deleted':
case 'raw_packet':
case 'message_acked':
case 'error':
case 'success':
return {
type: parsed.type,
data: parsed.data,
} as KnownWsEvent;
case 'pong':
return { type: 'pong', data: parsed.data as null | undefined };
default:
return {
type: 'unknown',
rawType: parsed.type,
data: parsed.data,
};
}
}
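For reference, the envelope-validation pattern in `parseWsEvent` can be exercised standalone. The sketch below inlines a trimmed-down parser with only a few event names; the payload shapes are illustrative stand-ins, not the project's real `types.ts` definitions:

```typescript
// Minimal standalone sketch of the envelope-parsing pattern above.
// Event names mirror parseWsEvent; payload types are illustrative only.
type ParsedEvent =
  | { type: 'pong'; data?: null }
  | { type: 'message'; data: unknown }
  | { type: 'unknown'; rawType: string; data: unknown };

function parseEnvelope(raw: string): ParsedEvent {
  const parsed: { type?: unknown; data?: unknown } = JSON.parse(raw);
  if (!parsed || typeof parsed !== 'object' || typeof parsed.type !== 'string') {
    throw new Error('Invalid WebSocket event envelope');
  }
  switch (parsed.type) {
    case 'pong':
      return { type: 'pong', data: parsed.data as null | undefined };
    case 'message':
      return { type: 'message', data: parsed.data };
    default:
      // Unrecognized types are preserved (not dropped) so callers can
      // still log the original event name via rawType.
      return { type: 'unknown', rawType: parsed.type, data: parsed.data };
  }
}

const known = parseEnvelope(JSON.stringify({ type: 'message', data: { text: 'hi' } }));
const mystery = parseEnvelope(JSON.stringify({ type: 'mystery', data: {} }));
console.log(known.type);   // 'message'
console.log(mystery.type); // 'unknown'
```

Returning an `unknown` variant instead of throwing keeps the consumer's `switch` exhaustive while still letting truly malformed envelopes (no string `type`) fail loudly, which is the behavior the tests above assert.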
@@ -9,6 +9,7 @@ import time
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from fastapi import HTTPException
from app.radio import radio_manager
from app.repository import (
@@ -29,6 +30,15 @@ def _reset_radio_state():
radio_manager._operation_lock = prev_lock
def _patch_require_connected(mc=None, *, detail="Radio not connected"):
if mc is None:
return patch(
"app.dependencies.radio_manager.require_connected",
side_effect=HTTPException(status_code=503, detail=detail),
)
return patch("app.dependencies.radio_manager.require_connected", return_value=mc)
async def _insert_contact(public_key, name="Alice", **overrides):
"""Insert a contact into the test database."""
data = {
@@ -102,10 +112,7 @@ class TestRadioDisconnectedHandler:
# require_connected() passes, but _meshcore is None when radio_operation() checks
radio_manager._meshcore = None
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = True
mock_rm.meshcore = MagicMock()
with _patch_require_connected(MagicMock()):
response = await client.post(
"/api/messages/direct", json={"destination": pub_key, "text": "Hi"}
)
@@ -120,10 +127,7 @@ class TestMessagesEndpoint:
@pytest.mark.asyncio
async def test_send_direct_message_requires_connection(self, test_db, client):
"""Sending message when disconnected returns 503."""
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = False
mock_rm.meshcore = None
with _patch_require_connected():
response = await client.post(
"/api/messages/direct", json={"destination": "abc123", "text": "Hello"}
)
@@ -134,10 +138,7 @@ class TestMessagesEndpoint:
@pytest.mark.asyncio
async def test_send_channel_message_requires_connection(self, test_db, client):
"""Sending channel message when disconnected returns 503."""
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = False
mock_rm.meshcore = None
with _patch_require_connected():
response = await client.post(
"/api/messages/channel",
json={"channel_key": "0123456789ABCDEF0123456789ABCDEF", "text": "Hello"},
@@ -164,12 +165,9 @@ class TestMessagesEndpoint:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_rm,
_patch_require_connected(mock_mc),
patch("app.routers.messages.broadcast_event") as mock_broadcast,
):
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
response = await client.post(
"/api/messages/direct",
json={"destination": pub_key, "text": "Hello"},
@@ -202,13 +200,10 @@ class TestMessagesEndpoint:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_rm,
_patch_require_connected(mock_mc),
patch("app.decoder.calculate_channel_hash", return_value="abcd"),
patch("app.routers.messages.broadcast_event") as mock_broadcast,
):
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
response = await client.post(
"/api/messages/channel",
json={"channel_key": chan_key, "text": "Hello room"},
@@ -226,10 +221,7 @@ class TestMessagesEndpoint:
mock_mc = MagicMock()
mock_mc.get_contact_by_key_prefix.return_value = None
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post(
"/api/messages/direct", json={"destination": "nonexistent", "text": "Hello"}
)
@@ -257,11 +249,9 @@ class TestMessagesEndpoint:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_rm,
_patch_require_connected(mock_mc),
patch("app.routers.messages.MessageRepository") as mock_msg_repo,
):
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
# Simulate duplicate - create returns None
mock_msg_repo.create = AsyncMock(return_value=None)
@@ -294,11 +284,9 @@ class TestMessagesEndpoint:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_rm,
_patch_require_connected(mock_mc),
patch("app.routers.messages.MessageRepository") as mock_msg_repo,
):
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
# Simulate duplicate - create returns None
mock_msg_repo.create = AsyncMock(return_value=None)
@@ -315,10 +303,7 @@ class TestMessagesEndpoint:
@pytest.mark.asyncio
async def test_resend_channel_message_requires_connection(self, test_db, client):
"""Resend endpoint returns 503 when radio is disconnected."""
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = False
mock_rm.meshcore = None
with _patch_require_connected():
response = await client.post("/api/messages/channel/1/resend")
assert response.status_code == 503
@@ -353,10 +338,7 @@ class TestMessagesEndpoint:
)
radio_manager._meshcore = mock_mc
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post(f"/api/messages/channel/{msg_id}/resend")
assert response.status_code == 200
@@ -394,10 +376,7 @@ class TestMessagesEndpoint:
mock_mc.commands.set_channel = AsyncMock()
mock_mc.commands.send_chan_msg = AsyncMock()
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post(f"/api/messages/channel/{msg_id}/resend")
assert response.status_code == 400
@@ -414,10 +393,7 @@ class TestMessagesEndpoint:
mock_mc.commands.set_channel = AsyncMock()
mock_mc.commands.send_chan_msg = AsyncMock()
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post("/api/messages/channel/999999/resend")
assert response.status_code == 404
@@ -9,6 +9,7 @@ from contextlib import asynccontextmanager
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from fastapi import HTTPException
from meshcore import EventType
from app.radio import radio_manager
@@ -55,6 +56,15 @@ def _make_error_response():
return result
def _patch_require_connected(mc=None, *, detail="Radio not connected"):
if mc is None:
return patch(
"app.dependencies.radio_manager.require_connected",
side_effect=HTTPException(status_code=503, detail=detail),
)
return patch("app.dependencies.radio_manager.require_connected", return_value=mc)
@asynccontextmanager
async def _noop_radio_operation(mc):
"""No-op radio_operation context manager that yields mc."""
@@ -83,11 +93,9 @@ class TestSyncChannelsFromRadio:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_dep_rm,
_patch_require_connected(mock_mc),
patch("app.routers.channels.radio_manager") as mock_ch_rm,
):
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
mock_ch_rm.radio_operation = lambda desc: _noop_radio_operation(mock_mc)
response = await client.post("/api/channels/sync?max_channels=5")
@@ -119,11 +127,9 @@ class TestSyncChannelsFromRadio:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_dep_rm,
_patch_require_connected(mock_mc),
patch("app.routers.channels.radio_manager") as mock_ch_rm,
):
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
mock_ch_rm.radio_operation = lambda desc: _noop_radio_operation(mock_mc)
response = await client.post("/api/channels/sync?max_channels=5")
@@ -146,11 +152,9 @@ class TestSyncChannelsFromRadio:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_dep_rm,
_patch_require_connected(mock_mc),
patch("app.routers.channels.radio_manager") as mock_ch_rm,
):
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
mock_ch_rm.radio_operation = lambda desc: _noop_radio_operation(mock_mc)
response = await client.post("/api/channels/sync?max_channels=3")
@@ -178,11 +182,9 @@ class TestSyncChannelsFromRadio:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_dep_rm,
_patch_require_connected(mock_mc),
patch("app.routers.channels.radio_manager") as mock_ch_rm,
):
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
mock_ch_rm.radio_operation = lambda desc: _noop_radio_operation(mock_mc)
await client.post("/api/channels/sync?max_channels=3")
@@ -193,10 +195,7 @@ class TestSyncChannelsFromRadio:
@pytest.mark.asyncio
async def test_sync_requires_connection(self, test_db, client):
"""Sync returns 503 when radio is not connected."""
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = False
mock_rm.meshcore = None
with _patch_require_connected():
response = await client.post("/api/channels/sync")
assert response.status_code == 503
@@ -216,11 +215,9 @@ class TestSyncChannelsFromRadio:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_dep_rm,
_patch_require_connected(mock_mc),
patch("app.routers.channels.radio_manager") as mock_ch_rm,
):
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
mock_ch_rm.radio_operation = lambda desc: _noop_radio_operation(mock_mc)
await client.post("/api/channels/sync?max_channels=3")
@@ -246,11 +243,9 @@ class TestSyncChannelsFromRadio:
radio_manager._meshcore = mock_mc
with (
patch("app.dependencies.radio_manager") as mock_dep_rm,
_patch_require_connected(mock_mc),
patch("app.routers.channels.radio_manager") as mock_ch_rm,
):
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
mock_ch_rm.radio_operation = lambda desc: _noop_radio_operation(mock_mc)
response = await client.post("/api/channels/sync?max_channels=3")
@@ -0,0 +1,72 @@
"""Tests for shared contact/message reconciliation helpers."""
import pytest
from app.repository import ContactNameHistoryRepository, ContactRepository, MessageRepository
from app.services.contact_reconciliation import (
claim_prefix_messages_for_contact,
record_contact_name_and_reconcile,
)
@pytest.mark.asyncio
async def test_claim_prefix_messages_for_contact_promotes_prefix_dm(test_db):
public_key = "aa" * 32
await ContactRepository.upsert({"public_key": public_key, "name": "Alice", "type": 1})
await MessageRepository.create(
msg_type="PRIV",
text="hello",
conversation_key=public_key[:12],
sender_timestamp=1000,
received_at=1000,
)
claimed = await claim_prefix_messages_for_contact(public_key=public_key)
assert claimed == 1
messages = await MessageRepository.get_all(conversation_key=public_key)
assert len(messages) == 1
assert messages[0].conversation_key == public_key
@pytest.mark.asyncio
async def test_record_contact_name_and_reconcile_records_history_and_backfills(test_db):
public_key = "bb" * 32
channel_key = "CC" * 16
await ContactRepository.upsert({"public_key": public_key, "name": "Alice", "type": 1})
await MessageRepository.create(
msg_type="PRIV",
text="dm",
conversation_key=public_key[:12],
sender_timestamp=1000,
received_at=1000,
)
await MessageRepository.create(
msg_type="CHAN",
text="Alice: hello",
conversation_key=channel_key,
sender_timestamp=1001,
received_at=1001,
sender_name="Alice",
)
claimed, backfilled = await record_contact_name_and_reconcile(
public_key=public_key,
contact_name="Alice",
timestamp=1234,
)
assert claimed == 1
assert backfilled == 1
history = await ContactNameHistoryRepository.get_history(public_key)
assert len(history) == 1
assert history[0].name == "Alice"
assert history[0].first_seen == 1234
assert history[0].last_seen == 1234
messages = await MessageRepository.get_all(msg_type="CHAN", conversation_key=channel_key)
assert len(messages) == 1
assert messages[0].sender_key == public_key
@@ -10,6 +10,7 @@ from contextlib import asynccontextmanager
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from fastapi import HTTPException
from meshcore import EventType
from app.radio import radio_manager
@@ -31,6 +32,15 @@ def _noop_radio_operation(mc=None):
return _ctx
def _patch_require_connected(mc=None, *, detail="Radio not connected"):
if mc is None:
return patch(
"app.dependencies.radio_manager.require_connected",
side_effect=HTTPException(status_code=503, detail=detail),
)
return patch("app.dependencies.radio_manager.require_connected", return_value=mc)
@pytest.fixture(autouse=True)
def _reset_radio_state():
"""Save/restore radio_manager state so tests don't leak."""
@@ -505,10 +515,7 @@ class TestSyncContacts:
mock_mc.commands.get_contacts = AsyncMock(return_value=mock_result)
radio_manager._meshcore = mock_mc
with patch("app.dependencies.radio_manager") as mock_dep_rm:
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post("/api/contacts/sync")
assert response.status_code == 200
@@ -521,10 +528,7 @@ class TestSyncContacts:
@pytest.mark.asyncio
async def test_sync_requires_connection(self, test_db, client):
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = False
mock_rm.meshcore = None
with _patch_require_connected():
response = await client.post("/api/contacts/sync")
assert response.status_code == 503
@@ -547,10 +551,7 @@ class TestSyncContacts:
mock_mc.commands.get_contacts = AsyncMock(return_value=mock_result)
radio_manager._meshcore = mock_mc
with patch("app.dependencies.radio_manager") as mock_dep_rm:
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post("/api/contacts/sync")
assert response.status_code == 200
@@ -771,10 +772,7 @@ class TestAddRemoveRadio:
mock_mc.commands.add_contact = AsyncMock(return_value=mock_result)
radio_manager._meshcore = mock_mc
with patch("app.dependencies.radio_manager") as mock_dep_rm:
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post(f"/api/contacts/{KEY_A}/add-to-radio")
assert response.status_code == 200
@@ -800,10 +798,7 @@ class TestAddRemoveRadio:
mock_mc.commands.add_contact = AsyncMock(return_value=mock_result)
radio_manager._meshcore = mock_mc
with patch("app.dependencies.radio_manager") as mock_dep_rm:
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post(f"/api/contacts/{KEY_A}/add-to-radio")
assert response.status_code == 200
@@ -821,10 +816,7 @@ class TestAddRemoveRadio:
mock_mc.get_contact_by_key_prefix = MagicMock(return_value=MagicMock()) # On radio
radio_manager._meshcore = mock_mc
with patch("app.dependencies.radio_manager") as mock_dep_rm:
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post(f"/api/contacts/{KEY_A}/add-to-radio")
assert response.status_code == 200
@@ -842,10 +834,7 @@ class TestAddRemoveRadio:
mock_mc.commands.remove_contact = AsyncMock(return_value=mock_result)
radio_manager._meshcore = mock_mc
with patch("app.dependencies.radio_manager") as mock_dep_rm:
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post(f"/api/contacts/{KEY_A}/remove-from-radio")
assert response.status_code == 200
@@ -857,10 +846,7 @@ class TestAddRemoveRadio:
@pytest.mark.asyncio
async def test_add_requires_connection(self, test_db, client):
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = False
mock_rm.meshcore = None
with _patch_require_connected():
response = await client.post(f"/api/contacts/{KEY_A}/add-to-radio")
assert response.status_code == 503
@@ -869,10 +855,7 @@ class TestAddRemoveRadio:
async def test_remove_not_found(self, test_db, client):
mock_mc = MagicMock()
with patch("app.dependencies.radio_manager") as mock_dep_rm:
mock_dep_rm.is_connected = True
mock_dep_rm.meshcore = mock_mc
with _patch_require_connected(mock_mc):
response = await client.post(f"/api/contacts/{KEY_A}/remove-from-radio")
assert response.status_code == 404
+154
@@ -0,0 +1,154 @@
from unittest.mock import AsyncMock, MagicMock
import pytest
from meshcore import EventType
from app.routers.radio import RadioConfigUpdate, RadioSettings
from app.services.radio_commands import (
KeystoreRefreshError,
PathHashModeUnsupportedError,
RadioCommandRejectedError,
apply_radio_config_update,
import_private_key_and_refresh_keystore,
)
def _radio_result(event_type=EventType.OK, payload=None):
result = MagicMock()
result.type = event_type
result.payload = payload or {}
return result
def _mock_meshcore_with_info():
mc = MagicMock()
mc.self_info = {
"adv_lat": 10.0,
"adv_lon": 20.0,
}
mc.commands = MagicMock()
mc.commands.set_name = AsyncMock()
mc.commands.set_coords = AsyncMock()
mc.commands.set_tx_power = AsyncMock()
mc.commands.set_radio = AsyncMock()
mc.commands.set_path_hash_mode = AsyncMock(return_value=_radio_result())
mc.commands.send_appstart = AsyncMock()
mc.commands.import_private_key = AsyncMock(return_value=_radio_result())
return mc
class TestApplyRadioConfigUpdate:
@pytest.mark.asyncio
async def test_updates_requested_fields_and_refreshes_info(self):
mc = _mock_meshcore_with_info()
sync_radio_time_fn = AsyncMock()
set_path_hash_mode = MagicMock()
update = RadioConfigUpdate(
name="NodeUpdated",
lat=1.23,
tx_power=17,
radio=RadioSettings(freq=910.525, bw=62.5, sf=7, cr=5),
path_hash_mode=1,
)
await apply_radio_config_update(
mc,
update,
path_hash_mode_supported=True,
set_path_hash_mode=set_path_hash_mode,
sync_radio_time_fn=sync_radio_time_fn,
)
mc.commands.set_name.assert_awaited_once_with("NodeUpdated")
mc.commands.set_coords.assert_awaited_once_with(lat=1.23, lon=20.0)
mc.commands.set_tx_power.assert_awaited_once_with(val=17)
mc.commands.set_radio.assert_awaited_once_with(freq=910.525, bw=62.5, sf=7, cr=5)
mc.commands.set_path_hash_mode.assert_awaited_once_with(1)
set_path_hash_mode.assert_called_once_with(1)
sync_radio_time_fn.assert_awaited_once_with(mc)
mc.commands.send_appstart.assert_awaited_once()
@pytest.mark.asyncio
async def test_rejects_unsupported_path_hash_mode(self):
mc = _mock_meshcore_with_info()
update = RadioConfigUpdate(path_hash_mode=1)
with pytest.raises(PathHashModeUnsupportedError):
await apply_radio_config_update(
mc,
update,
path_hash_mode_supported=False,
set_path_hash_mode=MagicMock(),
sync_radio_time_fn=AsyncMock(),
)
mc.commands.set_path_hash_mode.assert_not_awaited()
mc.commands.send_appstart.assert_not_awaited()
@pytest.mark.asyncio
async def test_raises_when_radio_rejects_path_hash_mode(self):
mc = _mock_meshcore_with_info()
mc.commands.set_path_hash_mode = AsyncMock(
return_value=_radio_result(EventType.ERROR, {"error": "nope"})
)
update = RadioConfigUpdate(path_hash_mode=1)
set_path_hash_mode = MagicMock()
with pytest.raises(RadioCommandRejectedError):
await apply_radio_config_update(
mc,
update,
path_hash_mode_supported=True,
set_path_hash_mode=set_path_hash_mode,
sync_radio_time_fn=AsyncMock(),
)
set_path_hash_mode.assert_not_called()
mc.commands.send_appstart.assert_not_awaited()
class TestImportPrivateKeyAndRefreshKeystore:
@pytest.mark.asyncio
async def test_rejects_radio_error(self):
mc = _mock_meshcore_with_info()
mc.commands.import_private_key = AsyncMock(
return_value=_radio_result(EventType.ERROR, {"error": "failed"})
)
export_fn = AsyncMock(return_value=True)
with pytest.raises(RadioCommandRejectedError):
await import_private_key_and_refresh_keystore(
mc,
b"\xaa" * 64,
export_and_store_private_key_fn=export_fn,
)
export_fn.assert_not_awaited()
@pytest.mark.asyncio
async def test_retries_keystore_refresh_once(self):
mc = _mock_meshcore_with_info()
export_fn = AsyncMock(side_effect=[False, True])
await import_private_key_and_refresh_keystore(
mc,
b"\xaa" * 64,
export_and_store_private_key_fn=export_fn,
)
mc.commands.import_private_key.assert_awaited_once_with(b"\xaa" * 64)
assert export_fn.await_count == 2
@pytest.mark.asyncio
async def test_raises_when_keystore_refresh_fails_twice(self):
mc = _mock_meshcore_with_info()
export_fn = AsyncMock(return_value=False)
with pytest.raises(KeystoreRefreshError):
await import_private_key_and_refresh_keystore(
mc,
b"\xaa" * 64,
export_and_store_private_key_fn=export_fn,
)
assert export_fn.await_count == 2
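The retry contract these tests pin down can be sketched roughly as follows. The function and exception names mirror `app.services.radio_commands`, but everything else is an assumption: the error messages are invented, payload handling is omitted, and a plain `"OK"` string stands in for `meshcore.EventType.OK` to keep the sketch dependency-free.

```python
import asyncio


class RadioCommandRejectedError(Exception):
    """Raised when the radio returns an error result for a command."""


class KeystoreRefreshError(Exception):
    """Raised when the keystore could not be refreshed even after a retry."""


async def import_private_key_and_refresh_keystore(
    mc, private_key, *, export_and_store_private_key_fn
):
    # Import the key on the radio first; a rejected import must not touch
    # the keystore at all (export_fn.assert_not_awaited in the tests).
    result = await mc.commands.import_private_key(private_key)
    if getattr(result, "type", None) != "OK":  # stand-in for EventType.OK
        raise RadioCommandRejectedError("radio rejected private key import")
    # Refresh the local keystore, retrying exactly once before giving up,
    # which is why the tests assert export_fn.await_count == 2 in both the
    # retry-success and the hard-failure case.
    for _ in range(2):
        if await export_and_store_private_key_fn():
            return
    raise KeystoreRefreshError("keystore refresh failed after retry")
```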
+84
@@ -0,0 +1,84 @@
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from app.services.radio_lifecycle import prepare_connected_radio, reconnect_and_prepare_radio
class TestPrepareConnectedRadio:
@pytest.mark.asyncio
async def test_runs_setup_then_broadcasts_health(self):
radio_manager = MagicMock()
radio_manager._last_connected = False
radio_manager.connection_info = "TCP: test:4000"
call_order: list[str] = []
async def _setup():
call_order.append("setup")
radio_manager.post_connect_setup = AsyncMock(side_effect=_setup)
with patch("app.websocket.broadcast_health") as mock_broadcast:
await prepare_connected_radio(radio_manager, broadcast_on_success=True)
assert call_order == ["setup"]
assert radio_manager._last_connected is True
mock_broadcast.assert_called_once_with(True, "TCP: test:4000")
@pytest.mark.asyncio
async def test_can_skip_broadcast(self):
radio_manager = MagicMock()
radio_manager._last_connected = False
radio_manager.connection_info = "TCP: test:4000"
radio_manager.post_connect_setup = AsyncMock()
with patch("app.websocket.broadcast_health") as mock_broadcast:
await prepare_connected_radio(radio_manager, broadcast_on_success=False)
assert radio_manager._last_connected is True
mock_broadcast.assert_not_called()
class TestReconnectAndPrepareRadio:
@pytest.mark.asyncio
async def test_reconnects_without_early_health_broadcast(self):
radio_manager = MagicMock()
radio_manager._last_connected = False
radio_manager.connection_info = "Serial: /dev/ttyUSB0"
reconnect_calls: list[bool] = []
call_order: list[str] = []
async def _reconnect(*, broadcast_on_success: bool):
reconnect_calls.append(broadcast_on_success)
call_order.append("reconnect")
return True
async def _setup():
call_order.append("setup")
radio_manager.reconnect = AsyncMock(side_effect=_reconnect)
radio_manager.post_connect_setup = AsyncMock(side_effect=_setup)
with patch("app.websocket.broadcast_health") as mock_broadcast:
result = await reconnect_and_prepare_radio(radio_manager, broadcast_on_success=True)
assert result is True
assert reconnect_calls == [False]
assert call_order == ["reconnect", "setup"]
assert radio_manager._last_connected is True
mock_broadcast.assert_called_once_with(True, "Serial: /dev/ttyUSB0")
@pytest.mark.asyncio
async def test_returns_false_without_running_setup_when_reconnect_fails(self):
radio_manager = MagicMock()
radio_manager.reconnect = AsyncMock(return_value=False)
radio_manager.post_connect_setup = AsyncMock()
with patch("app.websocket.broadcast_health") as mock_broadcast:
result = await reconnect_and_prepare_radio(radio_manager, broadcast_on_success=True)
assert result is False
radio_manager.post_connect_setup.assert_not_awaited()
mock_broadcast.assert_not_called()
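A minimal sketch of the ordering contract these lifecycle tests enforce, under stated assumptions: the function names mirror `app.services.radio_lifecycle`, but `broadcast_health` here is a local placeholder for `app.websocket.broadcast_health` (which the tests patch), and the real implementation almost certainly does more.

```python
import asyncio


def broadcast_health(connected, connection_info):
    """Placeholder for app.websocket.broadcast_health (patched in the tests)."""


async def prepare_connected_radio(radio_manager, *, broadcast_on_success):
    # Setup must complete before the manager is marked connected or any
    # health update goes out (call_order == ["setup"] in the tests).
    await radio_manager.post_connect_setup()
    radio_manager._last_connected = True
    if broadcast_on_success:
        broadcast_health(True, radio_manager.connection_info)


async def reconnect_and_prepare_radio(radio_manager, *, broadcast_on_success):
    # Suppress the manager's own success broadcast: health is announced
    # only after post-connect setup succeeds, never on the raw reconnect
    # (reconnect_calls == [False] in the tests). A failed reconnect skips
    # setup entirely and broadcasts nothing.
    if not await radio_manager.reconnect(broadcast_on_success=False):
        return False
    await prepare_connected_radio(
        radio_manager, broadcast_on_success=broadcast_on_success
    )
    return True
```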
+20 -15
@@ -7,6 +7,11 @@ import pytest
from app.radio import RadioDisconnectedError, RadioOperationBusyError, radio_manager
from app.radio_sync import is_polling_paused
from app.services.radio_runtime import RadioRuntime
def _runtime(manager):
return RadioRuntime(lambda: manager)
@pytest.fixture(autouse=True)
@@ -180,11 +185,11 @@ class TestRequireConnected:
from app.dependencies import require_connected
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_connected = True
mock_rm.meshcore = MagicMock()
mock_rm.is_setup_in_progress = True
manager = MagicMock()
manager.is_connected = True
manager.meshcore = MagicMock()
manager.is_setup_in_progress = True
with patch("app.dependencies.radio_manager", _runtime(manager)):
with pytest.raises(HTTPException) as exc_info:
require_connected()
@@ -197,11 +202,11 @@ class TestRequireConnected:
from app.dependencies import require_connected
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_setup_in_progress = False
mock_rm.is_connected = False
mock_rm.meshcore = None
manager = MagicMock()
manager.is_setup_in_progress = False
manager.is_connected = False
manager.meshcore = None
with patch("app.dependencies.radio_manager", _runtime(manager)):
with pytest.raises(HTTPException) as exc_info:
require_connected()
@@ -212,11 +217,11 @@ class TestRequireConnected:
from app.dependencies import require_connected
mock_mc = MagicMock()
with patch("app.dependencies.radio_manager") as mock_rm:
mock_rm.is_setup_in_progress = False
mock_rm.is_connected = True
mock_rm.meshcore = mock_mc
manager = MagicMock()
manager.is_setup_in_progress = False
manager.is_connected = True
manager.meshcore = mock_mc
with patch("app.dependencies.radio_manager", _runtime(manager)):
result = require_connected()
assert result is mock_mc
+13 -6
@@ -22,6 +22,7 @@ from app.routers.radio import (
set_private_key,
update_radio_config,
)
from app.services.radio_runtime import RadioRuntime
def _radio_result(event_type=EventType.OK, payload=None):
@@ -41,6 +42,10 @@ def _noop_radio_operation(mc=None):
return _ctx
def _runtime(manager):
return RadioRuntime(lambda: manager)
@pytest.fixture(autouse=True)
def _reset_radio_state():
"""Save/restore radio_manager state so tests don't leak."""
@@ -301,7 +306,7 @@ class TestAdvertise:
isolated_manager._meshcore = MagicMock()
with (
patch("app.routers.radio.require_connected"),
patch("app.routers.radio.radio_manager", isolated_manager),
patch("app.routers.radio.radio_manager", _runtime(isolated_manager)),
patch(
"app.routers.radio.do_send_advertisement",
new_callable=AsyncMock,
@@ -324,7 +329,7 @@ class TestRebootAndReconnect:
mock_rm.meshcore = mock_mc
mock_rm.radio_operation = _noop_radio_operation(mock_mc)
with patch("app.routers.radio.radio_manager", mock_rm):
with patch("app.routers.radio.radio_manager", _runtime(mock_rm)):
result = await reboot_radio()
assert result["status"] == "ok"
@@ -338,7 +343,7 @@ class TestRebootAndReconnect:
mock_rm.is_reconnecting = True
mock_rm.radio_operation = _noop_radio_operation()
with patch("app.routers.radio.radio_manager", mock_rm):
with patch("app.routers.radio.radio_manager", _runtime(mock_rm)):
result = await reboot_radio()
assert result["status"] == "pending"
@@ -353,8 +358,9 @@ class TestRebootAndReconnect:
mock_rm.reconnect = AsyncMock(return_value=True)
mock_rm.post_connect_setup = AsyncMock()
mock_rm.radio_operation = _noop_radio_operation()
mock_rm.connection_info = "TCP: test:4000"
with patch("app.routers.radio.radio_manager", mock_rm):
with patch("app.routers.radio.radio_manager", _runtime(mock_rm)):
result = await reboot_radio()
assert result["status"] == "ok"
@@ -367,8 +373,9 @@ class TestRebootAndReconnect:
mock_rm = MagicMock()
mock_rm.is_connected = True
mock_rm.radio_operation = _noop_radio_operation()
mock_rm.is_setup_complete = True
with patch("app.routers.radio.radio_manager", mock_rm):
with patch("app.routers.radio.radio_manager", _runtime(mock_rm)):
result = await reconnect_radio()
assert result["status"] == "ok"
@@ -382,7 +389,7 @@ class TestRebootAndReconnect:
mock_rm.reconnect = AsyncMock(return_value=False)
mock_rm.radio_operation = _noop_radio_operation()
with patch("app.routers.radio.radio_manager", mock_rm):
with patch("app.routers.radio.radio_manager", _runtime(mock_rm)):
with pytest.raises(HTTPException) as exc:
await reconnect_radio()
+116
@@ -0,0 +1,116 @@
from contextlib import asynccontextmanager
from unittest.mock import AsyncMock
import pytest
from fastapi import HTTPException
from app.services.radio_runtime import RadioRuntime
class _Manager:
def __init__(
self,
*,
meshcore=None,
is_connected=False,
is_reconnecting=False,
is_setup_in_progress=False,
is_setup_complete=False,
connection_info=None,
path_hash_mode=0,
path_hash_mode_supported=False,
):
self.meshcore = meshcore
self.is_connected = is_connected
self.is_reconnecting = is_reconnecting
self.is_setup_in_progress = is_setup_in_progress
self.is_setup_complete = is_setup_complete
self.connection_info = connection_info
self.path_hash_mode = path_hash_mode
self.path_hash_mode_supported = path_hash_mode_supported
self.calls: list[tuple[str, dict]] = []
@asynccontextmanager
async def radio_operation(self, name: str, **kwargs):
self.calls.append((name, kwargs))
yield self.meshcore
def test_uses_latest_manager_from_getter():
first = _Manager(meshcore="mc1", is_connected=True, connection_info="first")
second = _Manager(meshcore="mc2", is_connected=True, connection_info="second")
current = {"manager": first}
runtime = RadioRuntime(lambda: current["manager"])
assert runtime.connection_info == "first"
assert runtime.require_connected() == "mc1"
current["manager"] = second
assert runtime.connection_info == "second"
assert runtime.require_connected() == "mc2"
def test_require_connected_preserves_http_semantics():
runtime = RadioRuntime(
_Manager(meshcore=None, is_connected=True, is_setup_in_progress=True),
)
with pytest.raises(HTTPException, match="Radio is initializing") as exc:
runtime.require_connected()
assert exc.value.status_code == 503
runtime = RadioRuntime(_Manager(meshcore=None, is_connected=False, is_setup_in_progress=False))
with pytest.raises(HTTPException, match="Radio not connected") as exc:
runtime.require_connected()
assert exc.value.status_code == 503
def test_require_connected_returns_fresh_meshcore_after_connectivity_check():
old_meshcore = object()
new_meshcore = object()
class _SwappingManager:
def __init__(self):
self._meshcore = old_meshcore
self.is_setup_in_progress = False
@property
def is_connected(self):
self._meshcore = new_meshcore
return True
@property
def meshcore(self):
return self._meshcore
runtime = RadioRuntime(_SwappingManager())
assert runtime.require_connected() is new_meshcore
@pytest.mark.asyncio
async def test_radio_operation_delegates_to_current_manager():
manager = _Manager(meshcore="meshcore", is_connected=True)
runtime = RadioRuntime(manager)
async with runtime.radio_operation("sync_contacts", pause_polling=True) as mc:
assert mc == "meshcore"
assert manager.calls == [("sync_contacts", {"pause_polling": True})]
@pytest.mark.asyncio
async def test_lifecycle_passthrough_methods_delegate_to_current_manager():
manager = _Manager(meshcore="meshcore", is_connected=True)
manager.start_connection_monitor = AsyncMock()
manager.stop_connection_monitor = AsyncMock()
manager.disconnect = AsyncMock()
runtime = RadioRuntime(manager)
await runtime.start_connection_monitor()
await runtime.stop_connection_monitor()
await runtime.disconnect()
manager.start_connection_monitor.assert_awaited_once()
manager.stop_connection_monitor.assert_awaited_once()
manager.disconnect.assert_awaited_once()
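The runtime seam these tests exercise can be sketched as below. The class name mirrors `app.services.radio_runtime.RadioRuntime`, but the body is an assumption assembled from the test contracts: the `callable()` check for accepting either a manager or a getter, the `__getattr__` passthrough, and a dependency-free `ServiceUnavailable` standing in for `fastapi.HTTPException(status_code=503)`.

```python
class ServiceUnavailable(Exception):
    """Dependency-free stand-in for fastapi.HTTPException(status_code=503)."""

    def __init__(self, detail):
        super().__init__(detail)
        self.status_code = 503
        self.detail = detail


class RadioRuntime:
    # Accepts either a manager instance or a zero-arg getter, so every
    # access resolves against whichever manager is current at call time
    # (test_uses_latest_manager_from_getter).
    def __init__(self, manager_or_getter):
        if callable(manager_or_getter):
            self._get_manager = manager_or_getter
        else:
            self._get_manager = lambda: manager_or_getter

    @property
    def _manager(self):
        return self._get_manager()

    def __getattr__(self, name):
        # Passthrough for connection_info, is_connected, radio_operation,
        # lifecycle coroutines, and anything else on the current manager.
        return getattr(self._manager, name)

    def require_connected(self):
        manager = self._manager
        if manager.is_setup_in_progress:
            raise ServiceUnavailable("Radio is initializing")
        if not manager.is_connected:
            raise ServiceUnavailable("Radio not connected")
        # Read meshcore only after the connectivity check, so a manager
        # that swaps its instance during the check yields the fresh one
        # (test_require_connected_returns_fresh_meshcore_...).
        return manager.meshcore
```

One caveat of this sketch: a callable manager (e.g. a bare `MagicMock`) would be mistaken for a getter, which may be why the router tests always wrap mocks in `lambda: manager`.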
+32
@@ -4,6 +4,7 @@ from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from app.models import Contact, ContactUpsert
from app.repository import (
ContactAdvertPathRepository,
ContactNameHistoryRepository,
@@ -643,3 +644,34 @@ class TestMessageRepositoryGetById:
result = await MessageRepository.get_by_id(999999)
assert result is None
class TestContactRepositoryUpsertContracts:
@pytest.mark.asyncio
async def test_accepts_contact_upsert_model(self, test_db):
await ContactRepository.upsert(
ContactUpsert(public_key="aa" * 32, name="Alice", type=1, on_radio=False)
)
contact = await ContactRepository.get_by_key("aa" * 32)
assert contact is not None
assert contact.name == "Alice"
assert contact.type == 1
@pytest.mark.asyncio
async def test_accepts_contact_model(self, test_db):
await ContactRepository.upsert(
Contact(
public_key="bb" * 32,
name="Bob",
type=2,
on_radio=True,
out_path_hash_mode=-1,
)
)
contact = await ContactRepository.get_by_key("bb" * 32)
assert contact is not None
assert contact.name == "Bob"
assert contact.type == 2
assert contact.on_radio is True
+25
@@ -1,6 +1,7 @@
"""Tests for WebSocket manager functionality."""
import asyncio
import json
from unittest.mock import AsyncMock, patch
import pytest
@@ -245,3 +246,27 @@ class TestBroadcastEventFanout:
mock_ws.broadcast.assert_called_once()
mock_fm.broadcast_raw.assert_called_once_with({"data": "ff00"})
class TestTypedEventSerialization:
"""Tests for typed websocket event serialization."""
def test_dump_ws_event_preserves_optional_message_acked_shape(self):
from app.events import dump_ws_event
serialized = dump_ws_event("message_acked", {"message_id": 7, "ack_count": 2})
assert json.loads(serialized) == {
"type": "message_acked",
"data": {"message_id": 7, "ack_count": 2},
}
def test_dump_ws_event_falls_back_to_raw_payload_when_validation_fails(self):
from app.events import dump_ws_event
serialized = dump_ws_event("message_acked", {"ack_count": 2})
assert json.loads(serialized) == {
"type": "message_acked",
"data": {"ack_count": 2},
}
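The fallback contract these serialization tests pin down can be sketched as follows. The real `app.events.dump_ws_event` presumably validates against typed models; this sketch swaps in a hypothetical required-field registry purely to illustrate the two branches, so the registry name and contents are assumptions.

```python
import json

# Hypothetical stand-in for the app's typed event models: the fields a
# valid payload must carry for each event type.
_EVENT_REQUIRED_FIELDS = {
    "message_acked": {"message_id", "ack_count"},
}


def dump_ws_event(event_type, data):
    required = _EVENT_REQUIRED_FIELDS.get(event_type)
    if required is not None and required <= data.keys():
        # Validation passed: emit only the modeled fields, as a typed
        # model dump would.
        payload = {field: data[field] for field in required}
    else:
        # Validation failed (or the event is untyped): fall back to the
        # raw payload so the event is still broadcast rather than dropped.
        payload = data
    return json.dumps({"type": event_type, "data": payload})
```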