Mirror of https://github.com/jkingsman/Remote-Terminal-for-MeshCore.git — synced 2026-03-28 17:43:05 +01:00
Compare commits: `fanout_bus...2.7.9` (55 commits)
Author and date columns did not survive the mirror; the 55 commit SHAs, in order:

| SHA1 |
|---|
| `9fbdbaa174` |
| `e99e522573` |
| `9d806c608b` |
| `5a9489eff1` |
| `beb28b1f31` |
| `7d688fa5f8` |
| `09b68c37ba` |
| `df7dbad73d` |
| `060fb1ef59` |
| `b14e99ff24` |
| `77523c1b15` |
| `9673b25ab3` |
| `2732506f3c` |
| `523fe3e28e` |
| `3663db6ed3` |
| `5832fbd2c9` |
| `655066ed73` |
| `5cb5c2ad25` |
| `69a6922827` |
| `806252ec7e` |
| `2236132df4` |
| `564cd65496` |
| `b3625b4937` |
| `20af50585b` |
| `d776f3d09b` |
| `075debc51b` |
| `34318e4814` |
| `48dab293ae` |
| `76d11b01a7` |
| `69c812cfd4` |
| `2257c091e8` |
| `55fb2390de` |
| `0b91fb18bd` |
| `8948f2e504` |
| `5c413bf949` |
| `b0ffa28e46` |
| `f97c846378` |
| `e0d7c8a083` |
| `11ce2be5fa` |
| `1fc041538e` |
| `0ac8e97ea2` |
| `e6743d2098` |
| `f472ff7cab` |
| `7ac220aee1` |
| `43e38ecc5b` |
| `99eddfc2ef` |
| `f9eb6ebd98` |
| `8f59606867` |
| `d214da41c6` |
| `da22eb5c48` |
| `94546f90a4` |
| `f82cadb4e1` |
| `f60c656566` |
| `b5e2a4c269` |
| `d4f73d318a` |
**AGENTS.md** (29 changed lines)
```diff
@@ -112,10 +112,20 @@ Frontend packet-feed consumers should treat `observation_id` as the dedup/render
 To improve repeater disambiguation in the network visualizer, the backend stores recent unique advertisement paths per contact in a dedicated table (`contact_advert_paths`).
 
 - This is independent of raw-packet payload deduplication.
-- Paths are keyed per contact + path, with `heard_count`, `first_seen`, and `last_seen`.
+- Paths are keyed per contact + path + hop count, with `heard_count`, `first_seen`, and `last_seen`.
 - Only the N most recent unique paths are retained per contact (currently 10).
 - See `frontend/src/components/AGENTS_packet_visualizer.md` § "Advert-Path Identity Hints" for how the visualizer consumes this data.
 
+## Path Hash Modes
+
+MeshCore firmware can encode path hops as 1-byte, 2-byte, or 3-byte identifiers.
+
+- `path_hash_mode` values are `0` = 1-byte, `1` = 2-byte, `2` = 3-byte.
+- `GET /api/radio/config` exposes both the current `path_hash_mode` and `path_hash_mode_supported`.
+- `PATCH /api/radio/config` may update `path_hash_mode` only when the connected firmware supports it.
+- Contacts persist `out_path_hash_mode` separately from `last_path` so contact sync and DM send paths can round-trip correctly even when hop bytes are ambiguous.
+- `path_len` in API payloads is always hop count, not byte count. The actual path byte length is `hop_count * hash_size`.
+
 ## Data Flow
 
 ### Incoming Messages
```
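The hop-count vs. byte-count rule in the added section above is just arithmetic; a minimal sketch (helper names here are hypothetical — the project's real helpers live in `app/path_utils.py`):

```python
# Sketch of the hop-count vs. byte-count relationship: path_len is hops,
# and wire length is hops * bytes-per-hop derived from path_hash_mode.

def hash_size_for_mode(path_hash_mode: int) -> int:
    """Bytes per hop identifier: 0 -> 1 byte, 1 -> 2 bytes, 2 -> 3 bytes."""
    if path_hash_mode not in (0, 1, 2):
        raise ValueError(f"unknown path_hash_mode: {path_hash_mode}")
    return path_hash_mode + 1

def path_byte_length(hop_count: int, path_hash_mode: int) -> int:
    """`path_len` in API payloads is hop count; wire length is hops * hash size."""
    return hop_count * hash_size_for_mode(path_hash_mode)

assert path_byte_length(4, 0) == 4   # four 1-byte hops
assert path_byte_length(4, 2) == 12  # four 3-byte hops
```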
```diff
@@ -239,6 +249,7 @@ Key test files:
 - `tests/test_ack_tracking_wiring.py` - DM ACK tracking extraction and wiring
 - `tests/test_health_mqtt_status.py` - Health endpoint MQTT status field
+- `tests/test_community_mqtt.py` - Community MQTT publisher (JWT, packet format, hash, broadcast)
 - `tests/test_radio_sync.py` - Radio sync, periodic tasks, and contact offload back to the radio
 - `tests/test_real_crypto.py` - Real cryptographic operations
 - `tests/test_disable_bots.py` - MESHCORE_DISABLE_BOTS=true feature
```
```diff
@@ -251,7 +262,7 @@ npm run test:run
 
 ### Before Completing Changes
 
-**Always run `./scripts/all_quality.sh` before finishing any changes.** This runs all linting, formatting, type checking, tests, and builds sequentially, catching type mismatches, breaking changes, and compilation errors.
+**Always run `./scripts/all_quality.sh` before finishing any changes that have modified code or tests.** This runs all linting, formatting, type checking, tests, and builds sequentially, catching type mismatches, breaking changes, and compilation errors. This is not necessary for docs-only changes.
 
 ## API Summary
```
```diff
@@ -260,8 +271,8 @@ All endpoints are prefixed with `/api` (e.g., `/api/health`).
 | Method | Endpoint | Description |
 |--------|----------|-------------|
 | GET | `/api/health` | Connection status, fanout statuses, bots_disabled flag |
-| GET | `/api/radio/config` | Radio configuration |
-| PATCH | `/api/radio/config` | Update name, location, radio params |
+| GET | `/api/radio/config` | Radio configuration, including `path_hash_mode` and `path_hash_mode_supported` |
+| PATCH | `/api/radio/config` | Update name, location, radio params, and `path_hash_mode` when supported |
 | PUT | `/api/radio/private-key` | Import private key to radio |
 | POST | `/api/radio/advertise` | Send advertisement |
 | POST | `/api/radio/reboot` | Reboot radio or reconnect if disconnected |
```
```diff
@@ -367,6 +378,8 @@ All external integrations are managed through the fanout bus (`app/fanout/`). Ea
 
 `broadcast_event()` in `websocket.py` dispatches `message` and `raw_packet` events to the fanout manager. See `app/fanout/AGENTS_fanout.md` for full architecture details.
 
+Community MQTT forwards raw packets only. Its derived `path` field, when present on direct packets, is a comma-separated list of hop identifiers as reported by the packet format. Token width therefore varies with the packet's path hash mode; it is intentionally not a flat per-byte rendering.
+
 ### Server-Side Decryption
 
 The server can decrypt packets using stored keys, both in real-time and for historical packets.
```
```diff
@@ -414,3 +427,11 @@ mc.subscribe(EventType.ACK, handler)
 Byte-perfect channel retries are user-triggered via `POST /api/messages/channel/{message_id}/resend` and are allowed for 30 seconds after the original send.
 
+**Transport mutual exclusivity:** Only one of `MESHCORE_SERIAL_PORT`, `MESHCORE_TCP_HOST`, or `MESHCORE_BLE_ADDRESS` may be set. If none are set, serial auto-detection is used.
+
+## Errata & Known Non-Issues
+
+### `meshcore_py` advert parsing can crash on malformed/truncated RF log packets
+
+The vendored MeshCore Python reader's `LOG_DATA` advert path assumes the decoded advert payload always contains at least 101 bytes of advert body and reads the flags byte with `pk_buf.read(1)[0]` without a length guard. If a malformed or truncated RF log frame slips through, `MessageReader.handle_rx()` can fail with `IndexError: index out of range` from `meshcore/reader.py` while parsing payload type `0x04` (advert).
+
+This does not indicate database corruption or a message-store bug. It is a parser-hardening gap in `meshcore_py`: the reader does not fully mirror firmware-side packet/path validation before attempting advert decode. The practical effect is usually a one-off asyncio task failure for that packet while later packets continue processing normally.
```
**CHANGELOG.md** (43 changed lines)
```diff
@@ -1,3 +1,46 @@
+## [2.7.9] - 2026-03-08
+
+Bugfix: Don't obscure new integration dropdown on session boundary
+
+## [2.7.8] - 2026-03-08
+
+
+## [2.7.8] - 2026-03-08
+
+Bugfix: Improve frontend asset resolution and fixup the build/push script
+
+## [2.7.1] - 2026-03-08
+
+Bugfix: Fix historical DM packet length passing
+Misc: Follow better inclusion patterns for the patched meshcore-decoder and just publish the dang package
+Misc: Patch a bewildering browser quirk that cause large raw packet lists to extend past the bottom of the page
+
+## [2.7.0] - 2026-03-08
+
+Feature: Multibyte path support
+Feature: Add multibyte statistics to statistics pane
+Feature: Add path bittage to contact info pane
+Feature: Put tools in a collapsible
+
+## [2.6.1] - 2026-03-08
+
+Misc: Fix busted docker builds; we don't have a 2.6.0 build sorry
+
+## [2.6.0] - 2026-03-08
+
+Feature: A11y improvements
+Feature: New themes
+Feature: Backfill channel sender identity when available
+Feature: Modular fanout bus, including Webhooks, more customizable community MQTT, and Apprise
+Bugfix: Unreads now respect blocklist
+Bugfix: Unreads can't accumulate on an open thread
+Bugfix: Channel name in broadcasts
+Bugfix: Add missing httpx dependency
+Bugfix: Improvements to radio startup frontend-blocking time and radio status reporting
+Misc: Improved button signage for app movement
+Misc: Test, performance, and documentation improvements
+
 ## [2.5.0] - 2026-03-05
 
 Feature: Far better accessibility across the app (with far to go)
```
```diff
@@ -5,7 +5,7 @@ ARG COMMIT_HASH=unknown
 
 WORKDIR /build
 
-COPY frontend/package.json ./
+COPY frontend/package.json frontend/.npmrc ./
 RUN npm install
 
 COPY frontend/ ./
```
```diff
@@ -144,7 +144,7 @@ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
 
 </details>
 
-### meshcore (2.2.5) — MIT
+### meshcore (2.2.29) — MIT
 
 <details>
 <summary>Full license text</summary>
@@ -779,7 +779,7 @@ SOFTWARE.
 
 </details>
 
-### @uiw/react-codemirror (4.25.7) — MIT
+### @uiw/react-codemirror (4.25.8) — MIT
 
 *License file not found in package.*
 
@@ -1141,7 +1141,7 @@ SOFTWARE.
 
 </details>
 
-### meshcore-hashtag-cracker (1.7.0) — MIT
+### meshcore-hashtag-cracker (1.11.0) — MIT
 
 <details>
 <summary>Full license text</summary>
```
```diff
@@ -9,6 +9,7 @@ Backend server + browser interface for MeshCore mesh radio networks. Connect you
 * Access your radio remotely over your network or VPN
 * Search for hashtag room names for channels you don't have keys for yet
 * Forward packets to MQTT brokers (private: decrypted messages and/or raw packets; community aggregators like LetsMesh.net: raw packets only)
+* Use the more recent 1.14 firmwares which support multibyte pathing in all traffic and display systems within the app
 * Visualize the mesh as a map or node set, view repeater stats, and more!
 
 **Warning:** This app has no auth, and is for trusted environments only. _Do not put this on an untrusted network, or open it to the public._ The bots can execute arbitrary Python code which means anyone on your network can, too. To completely disable the bot system, start the server with `MESHCORE_DISABLE_BOTS=true` — this prevents all bot execution and blocks bot configuration changes via the API. If you need access control, consider using a reverse proxy like Nginx, or extending FastAPI; access control and user management are outside the scope of this app.
```
```diff
@@ -20,7 +20,7 @@ app/
 ├── database.py        # SQLite connection + base schema + migration runner
 ├── migrations.py      # Schema migrations (SQLite user_version)
 ├── models.py          # Pydantic request/response models
-├── repository/        # Data access layer (contacts, channels, messages, raw_packets, settings)
+├── repository/        # Data access layer (contacts, channels, messages, raw_packets, settings, fanout)
 ├── radio.py           # RadioManager + auto-reconnect monitor
 ├── radio_sync.py      # Polling, sync, periodic advertisement loop
 ├── decoder.py         # Packet parsing/decryption
```
```diff
@@ -29,6 +29,7 @@ app/
 ├── websocket.py       # WS manager + broadcast helpers
 ├── fanout/            # Fanout bus: MQTT, bots, webhooks, Apprise (see fanout/AGENTS_fanout.md)
 ├── dependencies.py    # Shared FastAPI dependency providers
+├── path_utils.py      # Path hex rendering and hop-width helpers
 ├── keystore.py        # Ephemeral private/public key storage for DM decryption
 ├── frontend_static.py # Mount/serve built frontend (production)
 └── routers/
```
```diff
@@ -72,6 +73,13 @@ app/
 
 ## Important Behaviors
 
+### Multibyte routing
+
+- Packet `path_len` values are hop counts, not byte counts.
+- Hop width comes from the packet or radio `path_hash_mode`: `0` = 1-byte, `1` = 2-byte, `2` = 3-byte.
+- Contacts persist `out_path_hash_mode` in the database so contact sync and outbound DM routing reuse the exact stored mode instead of inferring from path bytes.
+- `contact_advert_paths` identity is `(public_key, path_hex, path_len)` because the same hex bytes can represent different routes at different hop widths.
+
 ### Read/unread state
 
 - Server is source of truth (`contacts.last_read_at`, `channels.last_read_at`).
```
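The identity rule in the multibyte-routing bullets — same hex bytes, different hop widths, distinct routes — can be shown with an in-memory sketch (names hypothetical; the real dedup lives in the `contact_advert_paths` table):

```python
# Sketch of the (public_key, path_hex, path_len) identity: hop count is part
# of the key, so identical bytes at different widths are separate routes.

seen: dict[tuple[str, str, int], int] = {}

def record_advert_path(public_key: str, path_hex: str, hop_count: int) -> None:
    key = (public_key, path_hex, hop_count)
    seen[key] = seen.get(key, 0) + 1  # analogous to heard_count

# "aabb" as two 1-byte hops vs. one 2-byte hop are different routes:
record_advert_path("PK1", "aabb", 2)
record_advert_path("PK1", "aabb", 1)
assert len(seen) == 2
```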
```diff
@@ -107,6 +115,7 @@ app/
 - Configs stored in `fanout_configs` table, managed via `GET/POST/PATCH/DELETE /api/fanout`.
 - `broadcast_event()` in `websocket.py` dispatches to the fanout manager for `message` and `raw_packet` events.
 - Each integration is a `FanoutModule` with scope-based filtering.
+- Community MQTT publishes raw packets only, but its derived `path` field for direct packets is emitted as comma-separated hop identifiers, not flat path bytes.
 - See `app/fanout/AGENTS_fanout.md` for full architecture details.
 
 ## API Surface (all under `/api`)
```
```diff
@@ -115,8 +124,8 @@ app/
 - `GET /health`
 
 ### Radio
-- `GET /radio/config`
-- `PATCH /radio/config`
+- `GET /radio/config` — includes `path_hash_mode` and `path_hash_mode_supported`
+- `PATCH /radio/config` — may update `path_hash_mode` (`0..2`) when firmware supports it
 - `PUT /radio/private-key`
 - `POST /radio/advertise`
 - `POST /radio/reboot`
```
```diff
@@ -209,11 +218,11 @@ Client sends `"ping"` text; server replies `{"type":"pong"}`.
 ## Data Model Notes
 
 Main tables:
-- `contacts` (includes `first_seen` for contact age tracking)
+- `contacts` (includes `first_seen` for contact age tracking and `out_path_hash_mode` for route round-tripping)
 - `channels`
 - `messages` (includes `sender_name`, `sender_key` for per-contact channel message attribution)
 - `raw_packets`
-- `contact_advert_paths` (recent unique advertisement paths per contact)
+- `contact_advert_paths` (recent unique advertisement paths per contact, keyed by contact + path bytes + hop count)
 - `contact_name_history` (tracks name changes over time)
 - `app_settings`
```
````diff
@@ -288,6 +297,10 @@ tests/
 ├── test_send_messages.py           # Outgoing messages, bot triggers, concurrent sends
 ├── test_settings_router.py         # Settings endpoints, advert validation
 ├── test_statistics.py              # Statistics aggregation
+├── test_channel_sender_backfill.py # Sender key backfill for channel messages
+├── test_fanout_hitlist.py          # Fanout-related hitlist regression tests
+├── test_main_startup.py            # App startup and lifespan
+├── test_path_utils.py              # Path hex rendering helpers
 ├── test_websocket.py               # WS manager broadcast/cleanup
 └── test_websocket_route.py         # WS endpoint lifecycle
 ```
````
```diff
@@ -15,6 +15,7 @@ CREATE TABLE IF NOT EXISTS contacts (
     flags INTEGER DEFAULT 0,
     last_path TEXT,
     last_path_len INTEGER DEFAULT -1,
+    out_path_hash_mode INTEGER DEFAULT 0,
     last_advert INTEGER,
     lat REAL,
     lon REAL,
@@ -70,7 +71,7 @@ CREATE TABLE IF NOT EXISTS contact_advert_paths (
     first_seen INTEGER NOT NULL,
     last_seen INTEGER NOT NULL,
     heard_count INTEGER NOT NULL DEFAULT 1,
-    UNIQUE(public_key, path_hex),
+    UNIQUE(public_key, path_hex, path_len),
     FOREIGN KEY (public_key) REFERENCES contacts(public_key)
 );
```
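The widened `UNIQUE(public_key, path_hex, path_len)` key can be exercised with a minimal in-memory demo. Column subset and upsert SQL are illustrative, not the project's actual repository code:

```python
# Demonstrates the composite unique key: re-hearing the same (bytes, hop count)
# bumps heard_count; the same bytes at a different hop count is a new row.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE contact_advert_paths (
           public_key TEXT NOT NULL,
           path_hex TEXT NOT NULL,
           path_len INTEGER NOT NULL,
           first_seen INTEGER NOT NULL,
           last_seen INTEGER NOT NULL,
           heard_count INTEGER NOT NULL DEFAULT 1,
           UNIQUE(public_key, path_hex, path_len))"""
)

def upsert(pk: str, path_hex: str, path_len: int) -> None:
    now = int(time.time())
    conn.execute(
        """INSERT INTO contact_advert_paths
               (public_key, path_hex, path_len, first_seen, last_seen)
           VALUES (?, ?, ?, ?, ?)
           ON CONFLICT(public_key, path_hex, path_len)
           DO UPDATE SET heard_count = heard_count + 1,
                         last_seen = excluded.last_seen""",
        (pk, path_hex, path_len, now, now),
    )

upsert("PK1", "aabb", 2)
upsert("PK1", "aabb", 2)  # same route, same hop count -> heard_count = 2
upsert("PK1", "aabb", 1)  # same bytes, different hop count -> new row
rows = conn.execute(
    "SELECT path_len, heard_count FROM contact_advert_paths"
).fetchall()
assert sorted(rows) == [(1, 1), (2, 2)]
```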
```diff
@@ -79,9 +79,10 @@ class PacketInfo:
     route_type: RouteType
     payload_type: PayloadType
     payload_version: int
-    path_length: int
-    path: bytes  # The routing path (empty if path_length is 0)
+    path_length: int  # Decoded hop count (not the raw wire byte)
+    path: bytes  # The routing path bytes (empty if path_length is 0)
     payload: bytes
+    path_hash_size: int = 1  # Bytes per hop: 1, 2, or 3
 
 
 def calculate_channel_hash(channel_key: bytes) -> str:
```
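The header-byte layout that feeds `PacketInfo` (shown in the parsing code below: low 2 bits route type, next 4 bits payload type, top 2 bits version) can be sketched in isolation:

```python
# Sketch of the header-byte bit layout used by the decoder.

def decode_header(header: int) -> tuple[int, int, int]:
    route_type = header & 0x03          # bits 0-1
    payload_type = (header >> 2) & 0x0F # bits 2-5
    payload_version = (header >> 6) & 0x03  # bits 6-7
    return route_type, payload_type, payload_version

# 0b01_0100_10: version 1, payload type 4 (advert), route type 2
assert decode_header(0b01010010) == (2, 4, 1)
```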
```diff
@@ -100,86 +101,36 @@ def extract_payload(raw_packet: bytes) -> bytes | None:
     Packet structure:
     - Byte 0: header (route_type, payload_type, version)
     - For TRANSPORT routes: bytes 1-4 are transport codes
-    - Next byte: path_length
-    - Next path_length bytes: path data
+    - Next byte: path byte (packed as [hash_mode:2][hop_count:6])
+    - Next hop_count * hash_size bytes: path data
     - Remaining: payload
 
     Returns the payload bytes, or None if packet is malformed.
     """
-    if len(raw_packet) < 2:
-        return None
+    from app.path_utils import parse_packet_envelope
 
-    try:
-        header = raw_packet[0]
-        route_type = header & 0x03
-        offset = 1
-
-        # Skip transport codes if present (TRANSPORT_FLOOD=0, TRANSPORT_DIRECT=3)
-        if route_type in (0x00, 0x03):
-            if len(raw_packet) < offset + 4:
-                return None
-            offset += 4
-
-        # Get path length
-        if len(raw_packet) < offset + 1:
-            return None
-        path_length = raw_packet[offset]
-        offset += 1
-
-        # Skip path data
-        if len(raw_packet) < offset + path_length:
-            return None
-        offset += path_length
-
-        # Rest is payload
-        return raw_packet[offset:]
-    except (ValueError, IndexError):
-        return None
+    envelope = parse_packet_envelope(raw_packet)
+    return envelope.payload if envelope is not None else None
 
 
 def parse_packet(raw_packet: bytes) -> PacketInfo | None:
     """Parse a raw packet and extract basic info."""
-    if len(raw_packet) < 2:
+    from app.path_utils import parse_packet_envelope
+
+    envelope = parse_packet_envelope(raw_packet)
+    if envelope is None:
         return None
 
     try:
-        header = raw_packet[0]
-        route_type = RouteType(header & 0x03)
-        payload_type = PayloadType((header >> 2) & 0x0F)
-        payload_version = (header >> 6) & 0x03
-
-        offset = 1
-
-        # Skip transport codes if present
-        if route_type in (RouteType.TRANSPORT_FLOOD, RouteType.TRANSPORT_DIRECT):
-            if len(raw_packet) < offset + 4:
-                return None
-            offset += 4
-
-        # Get path length
-        if len(raw_packet) < offset + 1:
-            return None
-        path_length = raw_packet[offset]
-        offset += 1
-
-        # Extract path data
-        if len(raw_packet) < offset + path_length:
-            return None
-        path = raw_packet[offset : offset + path_length]
-        offset += path_length
-
-        # Rest is payload
-        payload = raw_packet[offset:]
-
         return PacketInfo(
-            route_type=route_type,
-            payload_type=payload_type,
-            payload_version=payload_version,
-            path_length=path_length,
-            path=path,
-            payload=payload,
+            route_type=RouteType(envelope.route_type),
+            payload_type=PayloadType(envelope.payload_type),
+            payload_version=envelope.payload_version,
+            path_length=envelope.hop_count,
+            path_hash_size=envelope.hash_size,
+            path=envelope.path,
+            payload=envelope.payload,
        )
-    except (ValueError, IndexError):
+    except ValueError:
        return None
```
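The packed path byte the new docstring describes (`[hash_mode:2][hop_count:6]`) can be sketched; the real parser is `parse_packet_envelope` in `app/path_utils.py`, and the assumption that the mode occupies the high two bits is inferred from that notation:

```python
# Hedged sketch of the packed path byte: two mode bits (assumed high bits),
# six hop-count bits. Wire path length is hops * (mode + 1) bytes.

def unpack_path_byte(path_byte: int) -> tuple[int, int]:
    hash_mode = (path_byte >> 6) & 0x03  # 0 -> 1-byte hops, 1 -> 2, 2 -> 3
    hop_count = path_byte & 0x3F
    return hash_mode, hop_count

def path_wire_length(path_byte: int) -> int:
    hash_mode, hop_count = unpack_path_byte(path_byte)
    return hop_count * (hash_mode + 1)

assert unpack_path_byte(0x00) == (0, 0)         # no hops
assert unpack_path_byte(0b01_000011) == (1, 3)  # three 2-byte hops
assert path_wire_length(0b01_000011) == 6
```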
```diff
@@ -106,13 +106,16 @@ async def on_contact_message(event: "Event") -> None:
     ts = payload.get("sender_timestamp")
     sender_timestamp = ts if ts is not None else received_at
     sender_name = contact.name if contact else None
+    path = payload.get("path")
+    path_len = payload.get("path_len")
     msg_id = await MessageRepository.create(
         msg_type="PRIV",
         text=payload.get("text", ""),
         conversation_key=sender_pubkey,
         sender_timestamp=sender_timestamp,
         received_at=received_at,
-        path=payload.get("path"),
+        path=path,
+        path_len=path_len,
         txt_type=txt_type,
         signature=payload.get("signature"),
         sender_key=sender_pubkey,
```
```diff
@@ -129,8 +132,11 @@ async def on_contact_message(event: "Event") -> None:
         logger.debug("DM from %s handled by event handler (fallback path)", sender_pubkey[:12])
 
     # Build paths array for broadcast
-    path = payload.get("path")
-    paths = [MessagePath(path=path or "", received_at=received_at)] if path is not None else None
+    paths = (
+        [MessagePath(path=path or "", received_at=received_at, path_len=path_len)]
+        if path is not None
+        else None
+    )
 
     # Broadcast the new message
     broadcast_event(
```
```diff
@@ -205,6 +211,7 @@ async def on_path_update(event: "Event") -> None:
     # so if path fields are absent here we treat this as informational only.
     path = payload.get("path")
     path_len = payload.get("path_len")
+    path_hash_mode = payload.get("path_hash_mode")
     if path is None or path_len is None:
         logger.debug(
             "PATH_UPDATE for %s has no path payload, skipping DB update", contact.public_key[:12]
```
```diff
@@ -219,7 +226,28 @@ async def on_path_update(event: "Event") -> None:
         )
         return
 
-    await ContactRepository.update_path(contact.public_key, str(path), normalized_path_len)
+    normalized_path_hash_mode: int | None
+    if path_hash_mode is None:
+        # Legacy firmware/library payloads only support 1-byte hop hashes.
+        normalized_path_hash_mode = -1 if normalized_path_len == -1 else 0
+    else:
+        normalized_path_hash_mode = None
+        try:
+            normalized_path_hash_mode = int(path_hash_mode)
+        except (TypeError, ValueError):
+            logger.warning(
+                "Invalid path_hash_mode in PATH_UPDATE for %s: %r",
+                contact.public_key[:12],
+                path_hash_mode,
+            )
+            normalized_path_hash_mode = None
+
+    await ContactRepository.update_path(
+        contact.public_key,
+        str(path),
+        normalized_path_len,
+        normalized_path_hash_mode,
+    )
 
 
 async def on_new_contact(event: "Event") -> None:
```
```diff
@@ -57,6 +57,8 @@ Wraps `MqttPublisher` from `app/fanout/mqtt.py`. Config blob:
 Wraps `CommunityMqttPublisher` from `app/fanout/community_mqtt.py`. Config blob:
 - `broker_host`, `broker_port`, `iata`, `email`
 - Only publishes raw packets (on_message is a no-op)
+- The published `raw` field is always the original packet hex.
+- When a direct packet includes a `path` field, it is emitted as comma-separated hop identifiers exactly as the packet reports them. Token width varies with the packet's path hash mode (`1`, `2`, or `3` bytes per hop); there is no legacy flat per-byte companion field.
 
 ### bot (bot.py)
 Wraps bot code execution via `app/fanout/bot_exec.py`. Config blob:
```
```diff
@@ -7,6 +7,7 @@ import logging
 from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit
 
 from app.fanout.base import FanoutModule
+from app.path_utils import split_path_hex
 
 logger = logging.getLogger(__name__)
 
@@ -45,9 +46,12 @@ def _format_body(data: dict, *, include_path: bool) -> str:
     if include_path:
         paths = data.get("paths")
         if paths and isinstance(paths, list) and len(paths) > 0:
-            path_str = paths[0].get("path", "") if isinstance(paths[0], dict) else ""
+            first_path = paths[0] if isinstance(paths[0], dict) else {}
+            path_str = first_path.get("path", "")
+            path_len = first_path.get("path_len")
         else:
             path_str = None
+            path_len = None
 
         if msg_type == "PRIV" and path_str is None:
             via = " **via:** [`direct`]"
@@ -56,7 +60,8 @@ def _format_body(data: dict, *, include_path: bool) -> str:
             if path_str == "":
                 via = " **via:** [`direct`]"
             else:
-                hops = [path_str[i : i + 2] for i in range(0, len(path_str), 2)]
+                hop_count = path_len if isinstance(path_len, int) else len(path_str) // 2
+                hops = split_path_hex(path_str, hop_count)
                 if hops:
                     hop_list = ", ".join(f"`{h}`" for h in hops)
                     via = f" **via:** [{hop_list}]"
```
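A hedged sketch of what `split_path_hex(path_str, hop_count)` appears to do in the diff above — split a hex path into equal-width per-hop tokens. The real helper lives in `app/path_utils.py` and may handle edge cases differently:

```python
# Hypothetical reconstruction of split_path_hex: equal-width hop tokens.

def split_path_hex(path_hex: str, hop_count: int) -> list[str]:
    if not path_hex or hop_count <= 0:
        return []
    if len(path_hex) % hop_count != 0:
        return []  # width must divide evenly; real helper may differ here
    width = len(path_hex) // hop_count
    return [path_hex[i : i + width] for i in range(0, len(path_hex), width)]

assert split_path_hex("aabbcc", 3) == ["aa", "bb", "cc"]  # three 1-byte hops
assert split_path_hex("aabbcc", 1) == ["aabbcc"]          # one 3-byte hop
```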
```diff
@@ -24,6 +24,7 @@ import aiomqtt
 import nacl.bindings
 
 from app.fanout.mqtt_base import BaseMqttPublisher
+from app.path_utils import parse_packet_envelope, split_path_hex
 
 logger = logging.getLogger(__name__)
```
```diff
@@ -52,8 +53,15 @@ class CommunityMqttSettings(Protocol):
     community_mqtt_enabled: bool
     community_mqtt_broker_host: str
     community_mqtt_broker_port: int
+    community_mqtt_transport: str
+    community_mqtt_use_tls: bool
+    community_mqtt_tls_verify: bool
+    community_mqtt_auth_mode: str
+    community_mqtt_username: str
+    community_mqtt_password: str
     community_mqtt_iata: str
     community_mqtt_email: str
+    community_mqtt_token_audience: str
 
 
 def _base64url_encode(data: bytes) -> str:
```
```diff
@@ -135,35 +143,17 @@ def _calculate_packet_hash(raw_bytes: bytes) -> str:
         return "0" * 16
 
     try:
-        header = raw_bytes[0]
-        payload_type = (header >> 2) & 0x0F
-        route_type = header & 0x03
-
-        # Transport codes present for TRANSPORT_FLOOD (0) and TRANSPORT_DIRECT (3)
-        has_transport = route_type in (0x00, 0x03)
-
-        offset = 1  # Past header
-        if has_transport:
-            offset += 4  # Skip 4 bytes of transport codes
-
-        # Read path_len (1 byte on wire). Invalid/truncated packets map to zero hash.
-        if offset >= len(raw_bytes):
+        envelope = parse_packet_envelope(raw_bytes)
+        if envelope is None:
             return "0" * 16
-        path_len = raw_bytes[offset]
-        offset += 1
-
-        # Skip past path to get to payload. Invalid/truncated packets map to zero hash.
-        if len(raw_bytes) < offset + path_len:
-            return "0" * 16
-        payload_start = offset + path_len
-        payload_data = raw_bytes[payload_start:]
 
-        # Hash: payload_type(1 byte) [+ path_len as uint16_t LE for TRACE] + payload_data
+        # Hash: payload_type(1 byte) [+ path_byte as uint16_t LE for TRACE] + payload_data
+        # IMPORTANT: TRACE hash uses the raw wire byte (not decoded hop count) to match firmware.
         hash_obj = hashlib.sha256()
-        hash_obj.update(bytes([payload_type]))
-        if payload_type == 9:  # PAYLOAD_TYPE_TRACE
-            hash_obj.update(path_len.to_bytes(2, byteorder="little"))
-        hash_obj.update(payload_data)
+        hash_obj.update(bytes([envelope.payload_type]))
+        if envelope.payload_type == 9:  # PAYLOAD_TYPE_TRACE
+            hash_obj.update(envelope.path_byte.to_bytes(2, byteorder="little"))
+        hash_obj.update(envelope.payload)
 
         return hash_obj.hexdigest()[:16].upper()
     except Exception:
```
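The hash recipe in the comments above can be sketched in isolation. Field extraction is assumed already done by an envelope parser; values here are supplied directly, so this is a shape sketch, not the project's `_calculate_packet_hash`:

```python
# sha256(payload_type [+ raw path byte as uint16 LE for TRACE] + payload),
# truncated to 16 uppercase hex chars.
import hashlib

PAYLOAD_TYPE_TRACE = 9

def packet_hash(payload_type: int, payload: bytes, path_byte: int = 0) -> str:
    h = hashlib.sha256()
    h.update(bytes([payload_type]))
    if payload_type == PAYLOAD_TYPE_TRACE:
        # TRACE hashes the raw wire byte, not the decoded hop count.
        h.update(path_byte.to_bytes(2, byteorder="little"))
    h.update(payload)
    return h.hexdigest()[:16].upper()

assert len(packet_hash(4, b"\x01\x02")) == 16
# For TRACE, the path byte participates in the hash:
assert packet_hash(9, b"", path_byte=0x41) != packet_hash(9, b"", path_byte=0x01)
```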
```diff
@@ -184,38 +174,15 @@ def _decode_packet_fields(raw_bytes: bytes) -> tuple[str, str, str, list[str], i
     payload_type: int | None = None
 
     try:
-        if len(raw_bytes) < 2:
+        envelope = parse_packet_envelope(raw_bytes)
+        if envelope is None or envelope.payload_version != 0:
             return route, packet_type, payload_len, path_values, payload_type
 
-        header = raw_bytes[0]
-        payload_version = (header >> 6) & 0x03
-        if payload_version != 0:
-            return route, packet_type, payload_len, path_values, payload_type
-
-        route_type = header & 0x03
-        has_transport = route_type in (0x00, 0x03)
-
-        offset = 1
-        if has_transport:
-            offset += 4
-
-        if len(raw_bytes) <= offset:
-            return route, packet_type, payload_len, path_values, payload_type
-
-        path_len = raw_bytes[offset]
-        offset += 1
-
-        if len(raw_bytes) < offset + path_len:
-            return route, packet_type, payload_len, path_values, payload_type
-
-        path_bytes = raw_bytes[offset : offset + path_len]
-        offset += path_len
-
-        payload_type = (header >> 2) & 0x0F
-        route = _ROUTE_MAP.get(route_type, "U")
+        payload_type = envelope.payload_type
+        route = _ROUTE_MAP.get(envelope.route_type, "U")
         packet_type = str(payload_type)
-        payload_len = str(max(0, len(raw_bytes) - offset))
-        path_values = [f"{b:02x}" for b in path_bytes]
+        payload_len = str(len(envelope.payload))
+        path_values = split_path_hex(envelope.path.hex(), envelope.hop_count)
 
         return route, packet_type, payload_len, path_values, payload_type
     except Exception:
```
```diff
@@ -327,11 +294,18 @@ class CommunityMqttPublisher(BaseMqttPublisher):
         await super().start(settings)
 
     def _on_not_configured(self) -> None:
-        from app.keystore import has_private_key
+        from app.keystore import get_public_key, has_private_key
         from app.websocket import broadcast_error
 
         s: CommunityMqttSettings | None = self._settings
-        if s and not has_private_key() and not self._key_unavailable_warned:
+        auth_mode = getattr(s, "community_mqtt_auth_mode", "token") if s else "token"
+        if (
+            s
+            and auth_mode == "token"
+            and get_public_key() is not None
+            and not has_private_key()
+            and not self._key_unavailable_warned
+        ):
             broadcast_error(
                 "Community MQTT unavailable",
                 "Radio firmware does not support private key export.",
```
```diff
@@ -340,10 +314,17 @@ class CommunityMqttPublisher(BaseMqttPublisher):
 
     def _is_configured(self) -> bool:
         """Check if community MQTT is enabled and keys are available."""
-        from app.keystore import has_private_key
+        from app.keystore import get_public_key, has_private_key
 
         s: CommunityMqttSettings | None = self._settings
-        return bool(s and s.community_mqtt_enabled and has_private_key())
+        if not s or not s.community_mqtt_enabled:
+            return False
+        if get_public_key() is None:
+            return False
+        auth_mode = getattr(s, "community_mqtt_auth_mode", "token")
+        if auth_mode == "token":
+            return has_private_key()
+        return True
 
     def _build_client_kwargs(self, settings: object) -> dict[str, Any]:
         s: CommunityMqttSettings = settings  # type: ignore[assignment]
```
```diff
@@ -352,19 +333,23 @@ class CommunityMqttPublisher(BaseMqttPublisher):
 
         private_key = get_private_key()
         public_key = get_public_key()
-        assert private_key is not None and public_key is not None  # guaranteed by _pre_connect
+        assert public_key is not None  # guaranteed by _pre_connect
 
         pubkey_hex = public_key.hex().upper()
         broker_host = s.community_mqtt_broker_host or _DEFAULT_BROKER
         broker_port = s.community_mqtt_broker_port or _DEFAULT_PORT
-        jwt_token = _generate_jwt_token(
-            private_key,
-            public_key,
-            audience=broker_host,
-            email=s.community_mqtt_email or "",
-        )
+        transport = s.community_mqtt_transport or "websockets"
+        use_tls = bool(s.community_mqtt_use_tls)
+        tls_verify = bool(s.community_mqtt_tls_verify)
+        auth_mode = s.community_mqtt_auth_mode or "token"
+        secure_connection = use_tls and tls_verify
 
-        tls_context = ssl.create_default_context()
+        tls_context: ssl.SSLContext | None = None
+        if use_tls:
+            tls_context = ssl.create_default_context()
+            if not tls_verify:
+                tls_context.check_hostname = False
+                tls_context.verify_mode = ssl.CERT_NONE
 
         device_name = ""
         if radio_manager.meshcore and radio_manager.meshcore.self_info:
@@ -380,16 +365,30 @@ class CommunityMqttPublisher(BaseMqttPublisher):
             }
         )
 
-        return {
+        kwargs: dict[str, Any] = {
             "hostname": broker_host,
             "port": broker_port,
-            "transport": "websockets",
+            "transport": transport,
             "tls_context": tls_context,
-            "websocket_path": "/",
-            "username": f"v1_{pubkey_hex}",
-            "password": jwt_token,
             "will": aiomqtt.Will(status_topic, offline_payload, retain=True),
         }
+        if auth_mode == "token":
+            assert private_key is not None
+            token_audience = (s.community_mqtt_token_audience or "").strip() or broker_host
+            jwt_token = _generate_jwt_token(
+                private_key,
+                public_key,
+                audience=token_audience,
+                email=(s.community_mqtt_email or "") if secure_connection else "",
+            )
+            kwargs["username"] = f"v1_{pubkey_hex}"
+            kwargs["password"] = jwt_token
+        elif auth_mode == "password":
+            kwargs["username"] = s.community_mqtt_username or None
+            kwargs["password"] = s.community_mqtt_password or None
+        if transport == "websockets":
+            kwargs["websocket_path"] = "/"
+        return kwargs
 
     def _on_connected(self, settings: object) -> tuple[str, str]:
         s: CommunityMqttSettings = settings  # type: ignore[assignment]
```
@@ -543,7 +542,9 @@ class CommunityMqttPublisher(BaseMqttPublisher):
|
||||
if not self.connected:
|
||||
logger.info("Community MQTT publish failure detected, reconnecting")
|
||||
return True
|
||||
if elapsed >= _TOKEN_RENEWAL_THRESHOLD:
|
||||
s: CommunityMqttSettings | None = self._settings
|
||||
auth_mode = getattr(s, "community_mqtt_auth_mode", "token") if s else "token"
|
||||
if auth_mode == "token" and elapsed >= _TOKEN_RENEWAL_THRESHOLD:
|
||||
logger.info("Community MQTT JWT nearing expiry, reconnecting")
|
||||
return True
|
||||
return False
|
||||
@@ -551,9 +552,11 @@ class CommunityMqttPublisher(BaseMqttPublisher):
|
||||
async def _pre_connect(self, settings: object) -> bool:
|
||||
from app.keystore import get_private_key, get_public_key
|
||||
|
||||
s: CommunityMqttSettings = settings # type: ignore[assignment]
|
||||
auth_mode = s.community_mqtt_auth_mode or "token"
|
||||
private_key = get_private_key()
|
||||
public_key = get_public_key()
|
||||
if private_key is None or public_key is None:
|
||||
if public_key is None or (auth_mode == "token" and private_key is None):
|
||||
# Keys not available yet, wait for settings change or key export
|
||||
self.connected = False
|
||||
self._version_event.clear()
|
||||
|
||||
@@ -4,6 +4,7 @@ from __future__ import annotations
|
||||
|
||||
import logging
|
||||
import re
|
||||
import string
|
||||
from types import SimpleNamespace
|
||||
from typing import Any
|
||||
|
||||
@@ -13,6 +14,37 @@ from app.fanout.community_mqtt import CommunityMqttPublisher, _format_raw_packet
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
_IATA_RE = re.compile(r"^[A-Z]{3}$")
|
||||
_DEFAULT_PACKET_TOPIC_TEMPLATE = "meshcore/{IATA}/{PUBLIC_KEY}/packets"
|
||||
_TOPIC_TEMPLATE_FIELD_CANONICAL = {
|
||||
"iata": "IATA",
|
||||
"public_key": "PUBLIC_KEY",
|
||||
}
|
||||
|
||||
|
||||
def _normalize_topic_template(topic_template: str) -> str:
|
||||
"""Normalize packet topic template fields to canonical uppercase placeholders."""
|
||||
template = topic_template.strip() or _DEFAULT_PACKET_TOPIC_TEMPLATE
|
||||
parts: list[str] = []
|
||||
try:
|
||||
parsed = string.Formatter().parse(template)
|
||||
for literal_text, field_name, format_spec, conversion in parsed:
|
||||
parts.append(literal_text)
|
||||
if field_name is None:
|
||||
continue
|
||||
normalized_field = _TOPIC_TEMPLATE_FIELD_CANONICAL.get(field_name.lower())
|
||||
if normalized_field is None:
|
||||
raise ValueError(f"Unsupported topic template field(s): {field_name}")
|
||||
replacement = ["{", normalized_field]
|
||||
if conversion:
|
||||
replacement.extend(["!", conversion])
|
||||
if format_spec:
|
||||
replacement.extend([":", format_spec])
|
||||
replacement.append("}")
|
||||
parts.append("".join(replacement))
|
||||
except ValueError:
|
||||
raise
|
||||
|
||||
return "".join(parts)
|
||||
|
||||
|
||||
def _config_to_settings(config: dict) -> SimpleNamespace:
|
||||
@@ -21,11 +53,24 @@ def _config_to_settings(config: dict) -> SimpleNamespace:
|
||||
community_mqtt_enabled=True,
|
||||
community_mqtt_broker_host=config.get("broker_host", "mqtt-us-v1.letsmesh.net"),
|
||||
community_mqtt_broker_port=config.get("broker_port", 443),
|
||||
community_mqtt_transport=config.get("transport", "websockets"),
|
||||
community_mqtt_use_tls=config.get("use_tls", True),
|
||||
community_mqtt_tls_verify=config.get("tls_verify", True),
|
||||
community_mqtt_auth_mode=config.get("auth_mode", "token"),
|
||||
community_mqtt_username=config.get("username", ""),
|
||||
community_mqtt_password=config.get("password", ""),
|
||||
community_mqtt_iata=config.get("iata", ""),
|
||||
community_mqtt_email=config.get("email", ""),
|
||||
community_mqtt_token_audience=config.get("token_audience", ""),
|
||||
)
|
||||
|
||||
|
||||
def _render_packet_topic(topic_template: str, *, iata: str, public_key: str) -> str:
|
||||
"""Render the configured raw-packet publish topic."""
|
||||
template = _normalize_topic_template(topic_template)
|
||||
return template.format(IATA=iata, PUBLIC_KEY=public_key)
|
||||
|
||||
|
||||
class MqttCommunityModule(FanoutModule):
|
||||
"""Wraps a CommunityMqttPublisher for community packet sharing."""
|
||||
|
||||
@@ -81,7 +126,11 @@ async def _publish_community_packet(
|
||||
if not _IATA_RE.fullmatch(iata):
|
||||
logger.debug("Community MQTT: skipping publish — no valid IATA code configured")
|
||||
return
|
||||
topic = f"meshcore/{iata}/{pubkey_hex}/packets"
|
||||
topic = _render_packet_topic(
|
||||
str(config.get("topic_template", _DEFAULT_PACKET_TOPIC_TEMPLATE)),
|
||||
iata=iata,
|
||||
public_key=pubkey_hex,
|
||||
)
|
||||
|
||||
await publisher.publish(topic, packet)
|
||||
|
||||
|
||||
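The `_normalize_topic_template` change above canonicalizes user-supplied placeholder casing by walking the template with `string.Formatter`. A simplified standalone sketch of the same approach (it drops the conversion/format-spec handling the real helper preserves):

```python
import string

# Mirrors _TOPIC_TEMPLATE_FIELD_CANONICAL from the diff.
FIELDS = {"iata": "IATA", "public_key": "PUBLIC_KEY"}


def normalize(template: str) -> str:
    """Canonicalize {iata}/{Public_Key}-style placeholders to uppercase."""
    parts = []
    for literal, field, _spec, _conv in string.Formatter().parse(template):
        parts.append(literal)
        if field is None:
            continue  # trailing literal with no placeholder
        canonical = FIELDS.get(field.lower())
        if canonical is None:
            raise ValueError(f"Unsupported topic template field(s): {field}")
        parts.append("{" + canonical + "}")
    return "".join(parts)


print(normalize("meshcore/{iata}/{Public_Key}/packets"))
# meshcore/{IATA}/{PUBLIC_KEY}/packets
```

Because the normalized template only ever contains `{IATA}` and `{PUBLIC_KEY}`, the later `template.format(IATA=..., PUBLIC_KEY=...)` call in `_render_packet_topic` cannot hit a missing-key error.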
33 app/main.py
@@ -1,3 +1,4 @@
+import asyncio
 import logging
 from contextlib import asynccontextmanager
 from pathlib import Path
@@ -34,6 +35,22 @@ setup_logging()
 logger = logging.getLogger(__name__)
 
 
+async def _startup_radio_connect_and_setup() -> None:
+    """Connect/setup the radio in the background so HTTP serving can start immediately."""
+    try:
+        connected = await radio_manager.reconnect(broadcast_on_success=False)
+        if connected:
+            await radio_manager.post_connect_setup()
+            from app.websocket import broadcast_health
+
+            broadcast_health(True, radio_manager.connection_info)
+            logger.info("Connected to radio")
+        else:
+            logger.warning("Failed to connect to radio on startup")
+    except Exception as e:
+        logger.warning("Failed to connect to radio on startup: %s", e)
+
+
 @asynccontextmanager
 async def lifespan(app: FastAPI):
     """Manage database and radio connection lifecycle."""
@@ -47,13 +64,6 @@ async def lifespan(app: FastAPI):
 
     await ensure_default_channels()
 
-    try:
-        await radio_manager.connect()
-        logger.info("Connected to radio")
-        await radio_manager.post_connect_setup()
-    except Exception as e:
-        logger.warning("Failed to connect to radio on startup: %s", e)
-
     # Always start connection monitor (even if initial connection failed)
     await radio_manager.start_connection_monitor()
@@ -65,9 +75,18 @@
     except Exception as e:
         logger.warning("Failed to start fanout modules: %s", e)
 
+    startup_radio_task = asyncio.create_task(_startup_radio_connect_and_setup())
+    app.state.startup_radio_task = startup_radio_task
+
     yield
 
     logger.info("Shutting down")
+    if startup_radio_task and not startup_radio_task.done():
+        startup_radio_task.cancel()
+        try:
+            await startup_radio_task
+        except asyncio.CancelledError:
+            pass
     await fanout_manager.stop_all()
     await radio_manager.stop_connection_monitor()
     await stop_message_polling()
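The lifespan change moves radio setup into a background task so HTTP serving starts immediately, then cancels the task on shutdown and swallows the resulting `CancelledError`. That pattern can be exercised in isolation; the names below are illustrative stand-ins, not the app's real API:

```python
import asyncio


async def slow_setup() -> str:
    # Stand-in for radio connect + post-connect setup.
    await asyncio.sleep(3600)
    return "connected"


async def serve_and_shutdown() -> str:
    # Launch setup in the background; request serving would begin right away.
    task = asyncio.create_task(slow_setup())
    await asyncio.sleep(0)  # "serve" briefly while setup is still pending

    # Shutdown: cancel the unfinished task and await it so the cancellation
    # is fully delivered, matching the lifespan() teardown in the diff.
    if not task.done():
        task.cancel()
        try:
            await task
        except asyncio.CancelledError:
            pass
    return "clean shutdown"


print(asyncio.run(serve_and_shutdown()))
# clean shutdown
```

Awaiting the cancelled task before exiting the lifespan avoids a "Task was destroyed but it is pending!" warning at interpreter shutdown.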
@@ -303,6 +303,20 @@ async def run_migrations(conn: aiosqlite.Connection) -> int:
         await set_version(conn, 38)
         applied += 1
 
+    # Migration 39: Persist contacts.out_path_hash_mode for multibyte path round-tripping
+    if version < 39:
+        logger.info("Applying migration 39: add contacts.out_path_hash_mode")
+        await _migrate_039_add_contact_out_path_hash_mode(conn)
+        await set_version(conn, 39)
+        applied += 1
+
+    # Migration 40: Distinguish advert paths by hop count as well as bytes
+    if version < 40:
+        logger.info("Applying migration 40: rebuild contact_advert_paths uniqueness with path_len")
+        await _migrate_040_rebuild_contact_advert_paths_identity(conn)
+        await set_version(conn, 40)
+        applied += 1
+
     if applied > 0:
         logger.info(
             "Applied %d migration(s), schema now at version %d", applied, await get_version(conn)
@@ -438,39 +452,14 @@ async def _migrate_004_add_payload_hash_column(conn: aiosqlite.Connection) -> None:
 
 def _extract_payload_for_hash(raw_packet: bytes) -> bytes | None:
     """
-    Extract payload from a raw packet for hashing (migration-local copy of decoder logic).
+    Extract payload from a raw packet for hashing using canonical framing validation.
 
     Returns the payload bytes, or None if packet is malformed.
     """
-    if len(raw_packet) < 2:
-        return None
+    from app.path_utils import parse_packet_envelope
 
-    try:
-        header = raw_packet[0]
-        route_type = header & 0x03
-        offset = 1
-
-        # Skip transport codes if present (TRANSPORT_FLOOD=0, TRANSPORT_DIRECT=3)
-        if route_type in (0x00, 0x03):
-            if len(raw_packet) < offset + 4:
-                return None
-            offset += 4
-
-        # Get path length
-        if len(raw_packet) < offset + 1:
-            return None
-        path_length = raw_packet[offset]
-        offset += 1
-
-        # Skip path bytes
-        if len(raw_packet) < offset + path_length:
-            return None
-        offset += path_length
-
-        # Rest is payload (may be empty, matching decoder.py behavior)
-        return raw_packet[offset:]
-    except (IndexError, ValueError):
-        return None
+    envelope = parse_packet_envelope(raw_packet)
+    return envelope.payload if envelope is not None else None
 
 
 async def _migrate_005_backfill_payload_hashes(conn: aiosqlite.Connection) -> None:
@@ -620,38 +609,14 @@ async def _migrate_006_replace_path_len_with_path(conn: aiosqlite.Connection) -> None:
 
 def _extract_path_from_packet(raw_packet: bytes) -> str | None:
     """
-    Extract path hex string from a raw packet (migration-local copy of decoder logic).
+    Extract path hex string from a raw packet using canonical framing validation.
 
     Returns the path as a hex string, or None if packet is malformed.
     """
-    if len(raw_packet) < 2:
-        return None
-
-    try:
-        header = raw_packet[0]
-        route_type = header & 0x03
-        offset = 1
-
-        # Skip transport codes if present (TRANSPORT_FLOOD=0, TRANSPORT_DIRECT=3)
-        if route_type in (0x00, 0x03):
-            if len(raw_packet) < offset + 4:
-                return None
-            offset += 4
-
-        # Get path length
-        if len(raw_packet) < offset + 1:
-            return None
-        path_length = raw_packet[offset]
-        offset += 1
-
-        # Extract path bytes
-        if len(raw_packet) < offset + path_length:
-            return None
-        path_bytes = raw_packet[offset : offset + path_length]
-
-        return path_bytes.hex()
-    except (IndexError, ValueError):
-        return None
+    from app.path_utils import parse_packet_envelope
+
+    envelope = parse_packet_envelope(raw_packet)
+    return envelope.path.hex() if envelope is not None else None
 
 
 async def _migrate_007_backfill_message_paths(conn: aiosqlite.Connection) -> None:
@@ -1678,7 +1643,7 @@ async def _migrate_026_rename_advert_paths_table(conn: aiosqlite.Connection) -> None:
                 first_seen INTEGER NOT NULL,
                 last_seen INTEGER NOT NULL,
                 heard_count INTEGER NOT NULL DEFAULT 1,
-                UNIQUE(public_key, path_hex),
+                UNIQUE(public_key, path_hex, path_len),
                 FOREIGN KEY (public_key) REFERENCES contacts(public_key)
             )
         """
@@ -1702,7 +1667,7 @@ async def _migrate_026_rename_advert_paths_table(conn: aiosqlite.Connection) -> None:
                 first_seen INTEGER NOT NULL,
                 last_seen INTEGER NOT NULL,
                 heard_count INTEGER NOT NULL DEFAULT 1,
-                UNIQUE(public_key, path_hex),
+                UNIQUE(public_key, path_hex, path_len),
                 FOREIGN KEY (public_key) REFERENCES contacts(public_key)
             )
         """
@@ -2280,3 +2245,140 @@ async def _migrate_038_drop_legacy_columns(conn: aiosqlite.Connection) -> None:
             raise
 
     await conn.commit()
+
+
+async def _migrate_039_add_contact_out_path_hash_mode(conn: aiosqlite.Connection) -> None:
+    """Add contacts.out_path_hash_mode and backfill legacy rows.
+
+    Historical databases predate multibyte routing support. Backfill rules:
+      - contacts with last_path_len = -1 are flood routes -> out_path_hash_mode = -1
+      - all other existing contacts default to 0 (1-byte legacy hop identifiers)
+    """
+    cursor = await conn.execute(
+        "SELECT name FROM sqlite_master WHERE type='table' AND name='contacts'"
+    )
+    if await cursor.fetchone() is None:
+        await conn.commit()
+        return
+
+    column_cursor = await conn.execute("PRAGMA table_info(contacts)")
+    columns = {row[1] for row in await column_cursor.fetchall()}
+
+    added_column = False
+
+    try:
+        await conn.execute(
+            "ALTER TABLE contacts ADD COLUMN out_path_hash_mode INTEGER NOT NULL DEFAULT 0"
+        )
+        added_column = True
+        logger.debug("Added out_path_hash_mode to contacts table")
+    except aiosqlite.OperationalError as e:
+        if "duplicate column name" in str(e).lower():
+            logger.debug("contacts.out_path_hash_mode already exists, skipping add")
+        else:
+            raise
+
+    if "last_path_len" not in columns:
+        await conn.commit()
+        return
+
+    if added_column:
+        await conn.execute(
+            """
+            UPDATE contacts
+            SET out_path_hash_mode = CASE
+                WHEN last_path_len = -1 THEN -1
+                ELSE 0
+            END
+            """
+        )
+    else:
+        await conn.execute(
+            """
+            UPDATE contacts
+            SET out_path_hash_mode = CASE
+                WHEN last_path_len = -1 THEN -1
+                ELSE 0
+            END
+            WHERE out_path_hash_mode NOT IN (-1, 0, 1, 2)
+               OR (last_path_len = -1 AND out_path_hash_mode != -1)
+            """
+        )
+    await conn.commit()
+
+
+async def _migrate_040_rebuild_contact_advert_paths_identity(
+    conn: aiosqlite.Connection,
+) -> None:
+    """Rebuild contact_advert_paths so uniqueness includes path_len.
+
+    Multi-byte routing can produce the same path_hex bytes with a different hop count,
+    which changes the hop boundaries and therefore the semantic next-hop identity.
+    """
+    cursor = await conn.execute(
+        "SELECT name FROM sqlite_master WHERE type='table' AND name='contact_advert_paths'"
+    )
+    if await cursor.fetchone() is None:
+        await conn.execute(
+            """
+            CREATE TABLE IF NOT EXISTS contact_advert_paths (
+                id INTEGER PRIMARY KEY AUTOINCREMENT,
+                public_key TEXT NOT NULL,
+                path_hex TEXT NOT NULL,
+                path_len INTEGER NOT NULL,
+                first_seen INTEGER NOT NULL,
+                last_seen INTEGER NOT NULL,
+                heard_count INTEGER NOT NULL DEFAULT 1,
+                UNIQUE(public_key, path_hex, path_len),
+                FOREIGN KEY (public_key) REFERENCES contacts(public_key)
+            )
+            """
+        )
+        await conn.execute("DROP INDEX IF EXISTS idx_contact_advert_paths_recent")
+        await conn.execute(
+            "CREATE INDEX IF NOT EXISTS idx_contact_advert_paths_recent "
+            "ON contact_advert_paths(public_key, last_seen DESC)"
+        )
+        await conn.commit()
+        return
+
+    await conn.execute(
+        """
+        CREATE TABLE contact_advert_paths_new (
+            id INTEGER PRIMARY KEY AUTOINCREMENT,
+            public_key TEXT NOT NULL,
+            path_hex TEXT NOT NULL,
+            path_len INTEGER NOT NULL,
+            first_seen INTEGER NOT NULL,
+            last_seen INTEGER NOT NULL,
+            heard_count INTEGER NOT NULL DEFAULT 1,
+            UNIQUE(public_key, path_hex, path_len),
+            FOREIGN KEY (public_key) REFERENCES contacts(public_key)
+        )
+        """
+    )
+
+    await conn.execute(
+        """
+        INSERT INTO contact_advert_paths_new
+            (public_key, path_hex, path_len, first_seen, last_seen, heard_count)
+        SELECT
+            public_key,
+            path_hex,
+            path_len,
+            MIN(first_seen),
+            MAX(last_seen),
+            SUM(heard_count)
+        FROM contact_advert_paths
+        GROUP BY public_key, path_hex, path_len
+        """
+    )
+
+    await conn.execute("DROP TABLE contact_advert_paths")
+    await conn.execute("ALTER TABLE contact_advert_paths_new RENAME TO contact_advert_paths")
+    await conn.execute("DROP INDEX IF EXISTS idx_contact_advert_paths_recent")
+    await conn.execute(
+        "CREATE INDEX IF NOT EXISTS idx_contact_advert_paths_recent "
+        "ON contact_advert_paths(public_key, last_seen DESC)"
+    )
+    await conn.commit()
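Migration 40's rebuild collapses rows that become duplicates under the new `(public_key, path_hex, path_len)` identity by taking `MIN(first_seen)`, `MAX(last_seen)`, and `SUM(heard_count)`. A small in-memory demonstration of that merge (sample rows are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE p (public_key TEXT, path_hex TEXT, path_len INT,"
    " first_seen INT, last_seen INT, heard_count INT)"
)
conn.executemany("INSERT INTO p VALUES (?,?,?,?,?,?)", [
    ("AA", "0102", 2, 100, 200, 3),  # two 1-byte hops
    ("AA", "0102", 2, 150, 400, 2),  # same identity -> merged with the row above
    ("AA", "0102", 1, 120, 300, 1),  # same bytes, one 2-byte hop -> kept separate
])

# The same SELECT shape migration 40 uses to fill contact_advert_paths_new.
rows = conn.execute(
    "SELECT public_key, path_hex, path_len,"
    " MIN(first_seen), MAX(last_seen), SUM(heard_count)"
    " FROM p GROUP BY public_key, path_hex, path_len ORDER BY path_len"
).fetchall()
print(rows)
# [('AA', '0102', 1, 120, 300, 1), ('AA', '0102', 2, 100, 400, 5)]
```

Grouping on `path_len` as well as `path_hex` is what lets the two `0102` interpretations survive as distinct paths.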
@@ -2,6 +2,8 @@ from typing import Literal
 
 from pydantic import BaseModel, Field
 
+from app.path_utils import normalize_contact_route
+
 
 class Contact(BaseModel):
     public_key: str = Field(description="Public key (64-char hex)")
@@ -10,6 +12,7 @@ class Contact(BaseModel):
     flags: int = 0
     last_path: str | None = None
     last_path_len: int = -1
+    out_path_hash_mode: int = 0
     last_advert: int | None = None
     lat: float | None = None
     lon: float | None = None
@@ -25,13 +28,19 @@ class Contact(BaseModel):
         The radio API uses different field names (adv_name, out_path, etc.)
         than our database schema (name, last_path, etc.).
         """
+        last_path, last_path_len, out_path_hash_mode = normalize_contact_route(
+            self.last_path,
+            self.last_path_len,
+            self.out_path_hash_mode,
+        )
         return {
             "public_key": self.public_key,
             "adv_name": self.name or "",
             "type": self.type,
             "flags": self.flags,
-            "out_path": self.last_path or "",
-            "out_path_len": self.last_path_len,
+            "out_path": last_path,
+            "out_path_len": last_path_len,
+            "out_path_hash_mode": out_path_hash_mode,
             "adv_lat": self.lat if self.lat is not None else 0.0,
             "adv_lon": self.lon if self.lon is not None else 0.0,
             "last_advert": self.last_advert if self.last_advert is not None else 0,
@@ -44,13 +53,22 @@ class Contact(BaseModel):
         This is the inverse of to_radio_dict(), used when syncing contacts
         from radio to database.
         """
+        last_path, last_path_len, out_path_hash_mode = normalize_contact_route(
+            radio_data.get("out_path"),
+            radio_data.get("out_path_len", -1),
+            radio_data.get(
+                "out_path_hash_mode",
+                -1 if radio_data.get("out_path_len", -1) == -1 else 0,
+            ),
+        )
         return {
             "public_key": public_key,
             "name": radio_data.get("adv_name"),
             "type": radio_data.get("type", 0),
             "flags": radio_data.get("flags", 0),
-            "last_path": radio_data.get("out_path"),
-            "last_path_len": radio_data.get("out_path_len", -1),
+            "last_path": last_path,
+            "last_path_len": last_path_len,
+            "out_path_hash_mode": out_path_hash_mode,
             "lat": radio_data.get("adv_lat"),
             "lon": radio_data.get("adv_lon"),
             "last_advert": radio_data.get("last_advert"),
@@ -79,7 +97,8 @@ class ContactAdvertPath(BaseModel):
     path: str = Field(description="Hex-encoded routing path (empty string for direct)")
     path_len: int = Field(description="Number of hops in the path")
     next_hop: str | None = Field(
-        default=None, description="First hop toward us (2-char hex), or null for direct"
+        default=None,
+        description="First hop toward us as a full hop identifier, or null for direct",
     )
     first_seen: int = Field(description="Unix timestamp of first observation")
     last_seen: int = Field(description="Unix timestamp of most recent observation")
@@ -176,8 +195,12 @@ class ChannelDetail(BaseModel):
 class MessagePath(BaseModel):
     """A single path that a message took to reach us."""
 
-    path: str = Field(description="Hex-encoded routing path (2 chars per hop)")
+    path: str = Field(description="Hex-encoded routing path")
     received_at: int = Field(description="Unix timestamp when this path was received")
+    path_len: int | None = Field(
+        default=None,
+        description="Hop count. None = legacy (infer as len(path)//2, i.e. 1-byte hops)",
+    )
 
 
 class Message(BaseModel):
@@ -490,6 +513,16 @@ class ContactActivityCounts(BaseModel):
     last_week: int
 
 
+class PathHashWidthStats(BaseModel):
+    total_packets: int
+    single_byte: int
+    double_byte: int
+    triple_byte: int
+    single_byte_pct: float
+    double_byte_pct: float
+    triple_byte_pct: float
+
+
 class StatisticsResponse(BaseModel):
     busiest_channels_24h: list[BusyChannel]
     contact_count: int
@@ -503,3 +536,4 @@ class StatisticsResponse(BaseModel):
     total_outgoing: int
     contacts_heard: ContactActivityCounts
     repeaters_heard: ContactActivityCounts
+    path_hash_width_24h: PathHashWidthStats
@@ -58,6 +58,7 @@ async def _handle_duplicate_message(
     sender_timestamp: int,
     path: str | None,
     received: int,
+    path_len: int | None = None,
 ) -> None:
     """Handle a duplicate message by updating paths/acks on the existing record.
 
@@ -90,7 +91,7 @@
 
     # Add path if provided
     if path is not None:
-        paths = await MessageRepository.add_path(existing_msg.id, path, received)
+        paths = await MessageRepository.add_path(existing_msg.id, path, received, path_len)
     else:
         # Get current paths for broadcast
        paths = existing_msg.paths or []
@@ -128,6 +129,7 @@ async def create_message_from_decrypted(
     timestamp: int,
     received_at: int | None = None,
     path: str | None = None,
+    path_len: int | None = None,
     channel_name: str | None = None,
     realtime: bool = True,
 ) -> int | None:
@@ -172,6 +174,7 @@
             sender_timestamp=timestamp,
             received_at=received,
             path=path,
+            path_len=path_len,
             sender_name=sender,
             sender_key=resolved_sender_key,
         )
@@ -182,7 +185,7 @@
         # 2. Same message arrives via multiple paths before first is committed
         # In either case, add the path to the existing message.
         await _handle_duplicate_message(
-            packet_id, "CHAN", channel_key_normalized, text, timestamp, path, received
+            packet_id, "CHAN", channel_key_normalized, text, timestamp, path, received, path_len
         )
         return None
 
@@ -193,7 +196,11 @@
 
     # Build paths array for broadcast
     # Use "is not None" to include empty string (direct/0-hop messages)
-    paths = [MessagePath(path=path or "", received_at=received)] if path is not None else None
+    paths = (
+        [MessagePath(path=path or "", received_at=received, path_len=path_len)]
+        if path is not None
+        else None
+    )
 
     # Broadcast new message to connected clients (and fanout modules when realtime)
     broadcast_event(
@@ -223,6 +230,7 @@ async def create_dm_message_from_decrypted(
     our_public_key: str | None,
     received_at: int | None = None,
     path: str | None = None,
+    path_len: int | None = None,
     outgoing: bool = False,
     realtime: bool = True,
 ) -> int | None:
@@ -270,6 +278,7 @@
             sender_timestamp=decrypted.timestamp,
             received_at=received,
             path=path,
+            path_len=path_len,
             outgoing=outgoing,
             sender_key=conversation_key if not outgoing else None,
             sender_name=sender_name,
@@ -285,6 +294,7 @@
             decrypted.timestamp,
             path,
             received,
+            path_len,
         )
         return None
 
@@ -299,7 +309,11 @@
         await RawPacketRepository.mark_decrypted(packet_id, msg_id)
 
     # Build paths array for broadcast
-    paths = [MessagePath(path=path or "", received_at=received)] if path is not None else None
+    paths = (
+        [MessagePath(path=path or "", received_at=received, path_len=path_len)]
+        if path is not None
+        else None
+    )
 
     # Broadcast new message to connected clients (and fanout modules when realtime)
     sender_name = contact.name if contact and not outgoing else None
@@ -383,6 +397,7 @@ async def run_historical_dm_decryption(
             # Extract path from the raw packet for storage
             packet_info = parse_packet(packet_data)
             path_hex = packet_info.path.hex() if packet_info else None
+            path_len = packet_info.path_length if packet_info else None
 
             msg_id = await create_dm_message_from_decrypted(
                 packet_id=packet_id,
@@ -391,6 +406,7 @@
                 our_public_key=our_public_key_bytes.hex(),
                 received_at=packet_timestamp,
                 path=path_hex,
+                path_len=path_len,
                 outgoing=outgoing,
                 realtime=False,  # Historical decryption should not trigger fanout
             )
@@ -606,6 +622,7 @@ async def _process_group_text(
         timestamp=decrypted.timestamp,
         received_at=timestamp,
         path=packet_info.path.hex() if packet_info else None,
+        path_len=packet_info.path_length if packet_info else None,
     )
 
     return {
@@ -674,9 +691,11 @@ async def _process_advertisement(
         assert existing is not None  # Guaranteed by the conditions that set use_existing_path
         path_len = existing.last_path_len if existing.last_path_len is not None else -1
         path_hex = existing.last_path or ""
+        out_path_hash_mode = existing.out_path_hash_mode
     else:
         path_len = new_path_len
         path_hex = new_path_hex
+        out_path_hash_mode = packet_info.path_hash_size - 1
 
     logger.debug(
         "Parsed advertisement from %s: %s (role=%d, lat=%s, lon=%s, path_len=%d)",
@@ -700,6 +719,7 @@
             path_hex=new_path_hex,
             timestamp=timestamp,
             max_paths=10,
+            hop_count=new_path_len,
         )
 
     # Record name history
@@ -720,6 +740,7 @@
         "last_seen": timestamp,
         "last_path": path_hex,
         "last_path_len": path_len,
+        "out_path_hash_mode": out_path_hash_mode,
        "first_seen": timestamp,  # COALESCE in upsert preserves existing value
     }
 
@@ -872,6 +893,7 @@ async def _process_direct_message(
         our_public_key=our_public_key.hex(),
         received_at=timestamp,
         path=packet_info.path.hex() if packet_info else None,
+        path_len=packet_info.path_length if packet_info else None,
         outgoing=is_outgoing,
     )
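The advertisement handler now records `hop_count` alongside the raw path bytes because identical hex can represent different hop boundaries. This standalone copy of the `split_path_hex` logic from `app/path_utils.py` (added in this compare) shows why the hop count matters:

```python
def split_path_hex(path_hex: str, hop_count: int) -> list[str]:
    # Mirrors app.path_utils.split_path_hex: divide the hex evenly by hop
    # count, falling back to legacy 2-char (1-byte) chunks when inconsistent.
    if not path_hex or hop_count <= 0:
        return []
    chars_per_hop = len(path_hex) // hop_count
    if chars_per_hop < 2 or chars_per_hop % 2 or chars_per_hop * hop_count != len(path_hex):
        return [path_hex[i:i + 2] for i in range(0, len(path_hex), 2)]
    return [path_hex[i:i + chars_per_hop] for i in range(0, len(path_hex), chars_per_hop)]


print(split_path_hex("0a1b2c", 3))  # ['0a', '1b', '2c']  three 1-byte hops
print(split_path_hex("0a1b2c", 1))  # ['0a1b2c']          one 3-byte hop
```

Same six hex characters, two different next-hop identities, which is exactly the ambiguity migration 40's `(path_hex, path_len)` uniqueness resolves.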
204 app/path_utils.py Normal file
@@ -0,0 +1,204 @@
+"""
+Centralized helpers for MeshCore multi-byte path encoding.
+
+The path_len wire byte is packed as [hash_mode:2][hop_count:6]:
+  - hash_size = (hash_mode) + 1 → 1, 2, or 3 bytes per hop
+  - hop_count = lower 6 bits → 0–63 hops
+  - wire bytes = hop_count × hash_size
+
+Mode 3 (hash_size=4) is reserved and rejected.
+"""
+
+from dataclasses import dataclass
+
+MAX_PATH_SIZE = 64
+
+
+@dataclass(frozen=True)
+class ParsedPacketEnvelope:
+    """Canonical packet framing parse matching MeshCore Packet::readFrom()."""
+
+    header: int
+    route_type: int
+    payload_type: int
+    payload_version: int
+    path_byte: int
+    hop_count: int
+    hash_size: int
+    path_byte_len: int
+    path: bytes
+    payload: bytes
+    payload_offset: int
+
+
+def decode_path_byte(path_byte: int) -> tuple[int, int]:
+    """Decode a packed path byte into (hop_count, hash_size).
+
+    Returns:
+        (hop_count, hash_size) where hash_size is 1, 2, or 3.
+
+    Raises:
+        ValueError: If hash_mode is 3 (reserved).
+    """
+    hash_mode = (path_byte >> 6) & 0x03
+    if hash_mode == 3:
+        raise ValueError(f"Reserved path hash mode 3 (path_byte=0x{path_byte:02X})")
+    hop_count = path_byte & 0x3F
+    hash_size = hash_mode + 1
+    return hop_count, hash_size
+
+
+def path_wire_len(hop_count: int, hash_size: int) -> int:
+    """Wire byte length of path data."""
+    return hop_count * hash_size
+
+
+def validate_path_byte(path_byte: int) -> tuple[int, int, int]:
+    """Validate a packed path byte using firmware-equivalent rules.
+
+    Returns:
+        (hop_count, hash_size, byte_len)
+
+    Raises:
+        ValueError: If the encoding uses reserved mode 3 or exceeds MAX_PATH_SIZE.
+    """
+    hop_count, hash_size = decode_path_byte(path_byte)
+    byte_len = path_wire_len(hop_count, hash_size)
+    if byte_len > MAX_PATH_SIZE:
+        raise ValueError(
+            f"Invalid path length {byte_len} bytes exceeds MAX_PATH_SIZE={MAX_PATH_SIZE}"
+        )
+    return hop_count, hash_size, byte_len
+
+
+def parse_packet_envelope(raw_packet: bytes) -> ParsedPacketEnvelope | None:
+    """Parse packet framing using firmware Packet::readFrom() semantics.
+
+    Validation matches the firmware's path checks:
+      - reserved mode 3 is invalid
+      - hop_count * hash_size must not exceed MAX_PATH_SIZE
+      - at least one payload byte must remain after the path
+    """
+    if len(raw_packet) < 2:
+        return None
+
+    try:
+        header = raw_packet[0]
+        route_type = header & 0x03
+        payload_type = (header >> 2) & 0x0F
+        payload_version = (header >> 6) & 0x03
+
+        offset = 1
+        if route_type in (0x00, 0x03):
+            if len(raw_packet) < offset + 4:
+                return None
+            offset += 4
+
+        if len(raw_packet) < offset + 1:
+            return None
+        path_byte = raw_packet[offset]
+        offset += 1
+
+        hop_count, hash_size, path_byte_len = validate_path_byte(path_byte)
+        if len(raw_packet) < offset + path_byte_len:
+            return None
+
+        path = raw_packet[offset : offset + path_byte_len]
+        offset += path_byte_len
+
+        if offset >= len(raw_packet):
+            return None
+
+        return ParsedPacketEnvelope(
+            header=header,
+            route_type=route_type,
+            payload_type=payload_type,
+            payload_version=payload_version,
+            path_byte=path_byte,
+            hop_count=hop_count,
+            hash_size=hash_size,
+            path_byte_len=path_byte_len,
+            path=path,
+            payload=raw_packet[offset:],
+            payload_offset=offset,
+        )
+    except (IndexError, ValueError):
+        return None
+
+
+def split_path_hex(path_hex: str, hop_count: int) -> list[str]:
+    """Split a hex path string into per-hop chunks using the known hop count.
+
+    If hop_count is 0 or the hex length doesn't divide evenly, falls back
+    to 2-char (1-byte) chunks for backward compatibility.
+    """
+    if not path_hex or hop_count <= 0:
+        return []
+    chars_per_hop = len(path_hex) // hop_count
+    if chars_per_hop < 2 or chars_per_hop % 2 != 0 or chars_per_hop * hop_count != len(path_hex):
+        # Inconsistent — fall back to legacy 2-char split
+        return [path_hex[i : i + 2] for i in range(0, len(path_hex), 2)]
+    return [path_hex[i : i + chars_per_hop] for i in range(0, len(path_hex), chars_per_hop)]
+
+
+def first_hop_hex(path_hex: str, hop_count: int) -> str | None:
+    """Extract the first hop identifier from a path hex string.
+
+    Returns None for empty/direct paths.
+    """
+    hops = split_path_hex(path_hex, hop_count)
+    return hops[0] if hops else None
+
+
+def normalize_contact_route(
+    path_hex: str | None,
+    path_len: int | None,
+    out_path_hash_mode: int | None,
+) -> tuple[str, int, int]:
+    """Normalize stored contact route fields.
+
+    Handles legacy/bad rows where the packed wire path byte was stored directly
+    in `last_path_len` (sometimes as a signed byte, e.g. `-125` for `0x83`).
+    Returns `(path_hex, hop_count, hash_mode)`.
+    """
+    normalized_path = path_hex or ""
|
||||
|
||||
try:
|
||||
normalized_len = int(path_len) if path_len is not None else -1
|
||||
except (TypeError, ValueError):
|
||||
normalized_len = -1
|
||||
|
||||
try:
|
||||
normalized_mode = int(out_path_hash_mode) if out_path_hash_mode is not None else None
|
||||
except (TypeError, ValueError):
|
||||
normalized_mode = None
|
||||
|
||||
if normalized_len < -1 or normalized_len > 63:
|
||||
packed = normalized_len & 0xFF
|
||||
if packed == 0xFF:
|
||||
return "", -1, -1
|
||||
decoded_mode = (packed >> 6) & 0x03
|
||||
if decoded_mode != 0x03:
|
||||
normalized_len = packed & 0x3F
|
||||
normalized_mode = decoded_mode
|
||||
|
||||
if normalized_len == -1:
|
||||
return "", -1, -1
|
||||
|
||||
if normalized_mode not in (0, 1, 2):
|
||||
normalized_mode = 0
|
||||
|
||||
if normalized_path:
|
||||
bytes_per_hop = normalized_mode + 1
|
||||
actual_bytes = len(normalized_path) // 2
|
||||
expected_bytes = normalized_len * bytes_per_hop
|
||||
if actual_bytes > expected_bytes >= 0:
|
||||
normalized_path = normalized_path[: expected_bytes * 2]
|
||||
elif (
|
||||
actual_bytes < expected_bytes
|
||||
and bytes_per_hop > 0
|
||||
and actual_bytes % bytes_per_hop == 0
|
||||
):
|
||||
normalized_len = actual_bytes // bytes_per_hop
|
||||
|
||||
return normalized_path, normalized_len, normalized_mode
|
||||
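As a hedged, self-contained sketch of the packed path-byte layout these helpers rely on: the bit layout below (low 6 bits = hop count, top 2 bits = hash mode, hash width = mode + 1 bytes, mode 3 reserved) is inferred from `normalize_contact_route` above; the real `decode_path_byte` lives outside this diff and may differ in detail, and the hex path used here is made up.

```python
def decode_path_byte(path_byte: int) -> tuple[int, int]:
    """Return (hop_count, hash_size) from a packed wire path byte (inferred layout)."""
    mode = (path_byte >> 6) & 0x03  # top two bits select hash mode
    if mode == 0x03:
        raise ValueError("reserved hash mode 3")
    return path_byte & 0x3F, mode + 1  # low six bits are the hop count


def split_path_hex(path_hex: str, hop_count: int) -> list[str]:
    """Split a hex path into per-hop chunks (mirrors the helper above)."""
    if not path_hex or hop_count <= 0:
        return []
    chars_per_hop = len(path_hex) // hop_count
    if chars_per_hop < 2 or chars_per_hop % 2 != 0 or chars_per_hop * hop_count != len(path_hex):
        return [path_hex[i : i + 2] for i in range(0, len(path_hex), 2)]
    return [path_hex[i : i + chars_per_hop] for i in range(0, len(path_hex), chars_per_hop)]


# 0x83 -> mode 2 (3-byte hashes), 3 hops, so 9 wire bytes of path data
hops, width = decode_path_byte(0x83)
print(hops, width)  # 3 3
print(split_path_hex("aabbccddeeff112233", hops))  # ['aabbcc', 'ddeeff', '112233']
```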
52
app/radio.py
@@ -128,6 +128,8 @@ class RadioManager:
        self._setup_lock: asyncio.Lock | None = None
        self._setup_in_progress: bool = False
        self._setup_complete: bool = False
        self.path_hash_mode: int = 0
        self.path_hash_mode_supported: bool = False

    async def _acquire_operation_lock(
        self,
@@ -272,6 +274,54 @@ class RadioManager:
                    "set_flood_scope failed (firmware may not support it): %s", exc
                )

        # Query path hash mode support (best-effort; older firmware won't report it).
        # If the library's parsed payload is missing path_hash_mode (e.g. stale
        # .pyc on WSL2 Windows mounts), fall back to raw-frame extraction.
        reader = mc._reader
        _original_handle_rx = reader.handle_rx
        _captured_frame: list[bytes] = []

        async def _capture_handle_rx(data: bytearray) -> None:
            from meshcore.packets import PacketType

            if len(data) > 0 and data[0] == PacketType.DEVICE_INFO.value:
                _captured_frame.append(bytes(data))
            return await _original_handle_rx(data)

        reader.handle_rx = _capture_handle_rx
        self.path_hash_mode = 0
        self.path_hash_mode_supported = False
        try:
            device_query = await mc.commands.send_device_query()
            if device_query and "path_hash_mode" in device_query.payload:
                self.path_hash_mode = device_query.payload["path_hash_mode"]
                self.path_hash_mode_supported = True
            elif _captured_frame:
                # Raw-frame fallback: byte 1 = fw_ver, byte 81 = path_hash_mode
                raw = _captured_frame[-1]
                fw_ver = raw[1] if len(raw) > 1 else 0
                if fw_ver >= 10 and len(raw) >= 82:
                    self.path_hash_mode = raw[81]
                    self.path_hash_mode_supported = True
                    logger.warning(
                        "path_hash_mode=%d extracted from raw frame "
                        "(stale .pyc? try: rm %s)",
                        self.path_hash_mode,
                        getattr(
                            __import__("meshcore.reader", fromlist=["reader"]),
                            "__cached__",
                            "meshcore __pycache__/reader.*.pyc",
                        ),
                    )
            if self.path_hash_mode_supported:
                logger.info("Path hash mode: %d (supported)", self.path_hash_mode)
            else:
                logger.debug("Firmware does not report path_hash_mode")
        except Exception as exc:
            logger.debug("Failed to query path_hash_mode: %s", exc)
        finally:
            reader.handle_rx = _original_handle_rx

        # Sync contacts/channels from radio to DB and clear radio
        logger.info("Syncing and offloading radio data...")
        result = await sync_and_offload_all(mc)
@@ -412,6 +462,8 @@ class RadioManager:
            await self._meshcore.disconnect()
            self._meshcore = None
        self._setup_complete = False
        self.path_hash_mode = 0
        self.path_hash_mode_supported = False
        logger.debug("Radio disconnected")

    async def reconnect(self, *, broadcast_on_success: bool = True) -> bool:
@@ -30,6 +30,21 @@ from app.repository import (
logger = logging.getLogger(__name__)


def _contact_sync_debug_fields(contact: Contact) -> dict[str, object]:
    """Return key contact fields for sync failure diagnostics."""
    return {
        "type": contact.type,
        "flags": contact.flags,
        "last_path": contact.last_path,
        "last_path_len": contact.last_path_len,
        "out_path_hash_mode": contact.out_path_hash_mode,
        "last_advert": contact.last_advert,
        "lat": contact.lat,
        "lon": contact.lon,
        "on_radio": contact.on_radio,
    }


async def upsert_channel_from_radio_slot(payload: dict, *, on_radio: bool) -> str | None:
    """Parse a radio channel-slot payload and upsert to the database.

@@ -664,7 +679,8 @@ async def _sync_contacts_to_radio_inner(mc: MeshCore) -> dict:
            continue

        try:
            result = await mc.commands.add_contact(contact.to_radio_dict())
            radio_contact_payload = contact.to_radio_dict()
            result = await mc.commands.add_contact(radio_contact_payload)
            if result.type == EventType.OK:
                loaded += 1
                await ContactRepository.set_on_radio(contact.public_key, True)
@@ -687,7 +703,14 @@ async def _sync_contacts_to_radio_inner(mc: MeshCore) -> dict:
                )
        except Exception as e:
            failed += 1
            logger.warning("Error loading contact %s: %s", contact.public_key[:12], e)
            logger.warning(
                "Error loading contact %s with fields=%s radio_payload=%s: %s",
                contact.public_key[:12],
                _contact_sync_debug_fields(contact),
                locals().get("radio_contact_payload"),
                e,
                exc_info=True,
            )

    if loaded > 0 or failed > 0:
        logger.info(
@@ -8,6 +8,7 @@ from app.models import (
    ContactAdvertPathSummary,
    ContactNameHistory,
)
from app.path_utils import first_hop_hex, normalize_contact_route


class AmbiguousPublicKeyPrefixError(ValueError):
@@ -22,18 +23,26 @@ class AmbiguousPublicKeyPrefixError(ValueError):
class ContactRepository:
    @staticmethod
    async def upsert(contact: dict[str, Any]) -> None:
        last_path, last_path_len, out_path_hash_mode = normalize_contact_route(
            contact.get("last_path"),
            contact.get("last_path_len", -1),
            contact.get("out_path_hash_mode"),
        )

        await db.conn.execute(
            """
            INSERT INTO contacts (public_key, name, type, flags, last_path, last_path_len,
                                  last_advert, lat, lon, last_seen, on_radio, last_contacted,
                                  first_seen)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
                                  out_path_hash_mode,
                                  last_advert, lat, lon, last_seen,
                                  on_radio, last_contacted, first_seen)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
            ON CONFLICT(public_key) DO UPDATE SET
                name = COALESCE(excluded.name, contacts.name),
                type = CASE WHEN excluded.type = 0 THEN contacts.type ELSE excluded.type END,
                flags = excluded.flags,
                last_path = COALESCE(excluded.last_path, contacts.last_path),
                last_path_len = excluded.last_path_len,
                out_path_hash_mode = excluded.out_path_hash_mode,
                last_advert = COALESCE(excluded.last_advert, contacts.last_advert),
                lat = COALESCE(excluded.lat, contacts.lat),
                lon = COALESCE(excluded.lon, contacts.lon),
@@ -47,8 +56,9 @@ class ContactRepository:
                contact.get("name"),
                contact.get("type", 0),
                contact.get("flags", 0),
                contact.get("last_path"),
                contact.get("last_path_len", -1),
                last_path,
                last_path_len,
                out_path_hash_mode,
                contact.get("last_advert"),
                contact.get("lat"),
                contact.get("lon"),
@@ -63,13 +73,19 @@ class ContactRepository:
    @staticmethod
    def _row_to_contact(row) -> Contact:
        """Convert a database row to a Contact model."""
        last_path, last_path_len, out_path_hash_mode = normalize_contact_route(
            row["last_path"],
            row["last_path_len"],
            row["out_path_hash_mode"],
        )
        return Contact(
            public_key=row["public_key"],
            name=row["name"],
            type=row["type"],
            flags=row["flags"],
            last_path=row["last_path"],
            last_path_len=row["last_path_len"],
            last_path=last_path,
            last_path_len=last_path_len,
            out_path_hash_mode=out_path_hash_mode,
            last_advert=row["last_advert"],
            lat=row["lat"],
            lon=row["lon"],
@@ -200,10 +216,28 @@ class ContactRepository:
        return [ContactRepository._row_to_contact(row) for row in rows]

    @staticmethod
    async def update_path(public_key: str, path: str, path_len: int) -> None:
    async def update_path(
        public_key: str,
        path: str,
        path_len: int,
        out_path_hash_mode: int | None = None,
    ) -> None:
        normalized_path, normalized_path_len, normalized_hash_mode = normalize_contact_route(
            path,
            path_len,
            out_path_hash_mode,
        )
        await db.conn.execute(
            "UPDATE contacts SET last_path = ?, last_path_len = ?, last_seen = ? WHERE public_key = ?",
            (path, path_len, int(time.time()), public_key.lower()),
            """UPDATE contacts SET last_path = ?, last_path_len = ?,
               out_path_hash_mode = COALESCE(?, out_path_hash_mode),
               last_seen = ? WHERE public_key = ?""",
            (
                normalized_path,
                normalized_path_len,
                normalized_hash_mode,
                int(time.time()),
                public_key.lower(),
            ),
        )
        await db.conn.commit()

@@ -287,10 +321,11 @@ class ContactAdvertPathRepository:
    @staticmethod
    def _row_to_path(row) -> ContactAdvertPath:
        path = row["path_hex"] or ""
        next_hop = path[:2].lower() if len(path) >= 2 else None
        path_len = row["path_len"]
        next_hop = first_hop_hex(path, path_len)
        return ContactAdvertPath(
            path=path,
            path_len=row["path_len"],
            path_len=path_len,
            next_hop=next_hop,
            first_seen=row["first_seen"],
            last_seen=row["last_seen"],
@@ -303,6 +338,7 @@ class ContactAdvertPathRepository:
        path_hex: str,
        timestamp: int,
        max_paths: int = 10,
        hop_count: int | None = None,
    ) -> None:
        """
        Upsert a unique advert path observation for a contact and prune to N most recent.
@@ -312,16 +348,15 @@ class ContactAdvertPathRepository:

        normalized_key = public_key.lower()
        normalized_path = path_hex.lower()
        path_len = len(normalized_path) // 2
        path_len = hop_count if hop_count is not None else len(normalized_path) // 2

        await db.conn.execute(
            """
            INSERT INTO contact_advert_paths
                (public_key, path_hex, path_len, first_seen, last_seen, heard_count)
            VALUES (?, ?, ?, ?, ?, 1)
            ON CONFLICT(public_key, path_hex) DO UPDATE SET
            ON CONFLICT(public_key, path_hex, path_len) DO UPDATE SET
                last_seen = MAX(contact_advert_paths.last_seen, excluded.last_seen),
                path_len = excluded.path_len,
                heard_count = contact_advert_paths.heard_count + 1
            """,
            (normalized_key, normalized_path, path_len, timestamp, timestamp),
@@ -332,8 +367,8 @@ class ContactAdvertPathRepository:
            """
            DELETE FROM contact_advert_paths
            WHERE public_key = ?
              AND path_hex NOT IN (
                SELECT path_hex
              AND id NOT IN (
                SELECT id
                FROM contact_advert_paths
                WHERE public_key = ?
                ORDER BY last_seen DESC, heard_count DESC, path_len ASC, path_hex ASC
@@ -26,6 +26,7 @@ class MessageRepository:
        conversation_key: str,
        sender_timestamp: int | None = None,
        path: str | None = None,
        path_len: int | None = None,
        txt_type: int = 0,
        signature: str | None = None,
        outgoing: bool = False,
@@ -43,7 +44,10 @@ class MessageRepository:
        # Convert single path to paths array format
        paths_json = None
        if path is not None:
            paths_json = json.dumps([{"path": path, "received_at": received_at}])
            entry: dict = {"path": path, "received_at": received_at}
            if path_len is not None:
                entry["path_len"] = path_len
            paths_json = json.dumps([entry])

        cursor = await db.conn.execute(
            """
@@ -74,7 +78,10 @@ class MessageRepository:

    @staticmethod
    async def add_path(
        message_id: int, path: str, received_at: int | None = None
        message_id: int,
        path: str,
        received_at: int | None = None,
        path_len: int | None = None,
    ) -> list[MessagePath]:
        """Add a new path to an existing message.

@@ -85,7 +92,10 @@ class MessageRepository:

        # Atomic append: use json_insert to avoid read-modify-write race when
        # multiple duplicate packets arrive concurrently for the same message.
        new_entry = json.dumps({"path": path, "received_at": ts})
        entry: dict = {"path": path, "received_at": ts}
        if path_len is not None:
            entry["path_len"] = path_len
        new_entry = json.dumps(entry)
        await db.conn.execute(
            """UPDATE messages SET paths = json_insert(
                COALESCE(paths, '[]'), '$[#]', json(?)
@@ -5,6 +5,7 @@ from typing import Any, Literal

from app.database import db
from app.models import AppSettings, Favorite
from app.path_utils import parse_packet_envelope

logger = logging.getLogger(__name__)

@@ -269,6 +270,53 @@ class StatisticsRepository:
            "last_week": row["last_week"] or 0,
        }

    @staticmethod
    async def _path_hash_width_24h() -> dict[str, int | float]:
        """Count parsed raw packets from the last 24h by hop hash width."""
        now = int(time.time())
        cursor = await db.conn.execute(
            "SELECT data FROM raw_packets WHERE timestamp >= ?",
            (now - SECONDS_24H,),
        )
        rows = await cursor.fetchall()

        single_byte = 0
        double_byte = 0
        triple_byte = 0

        for row in rows:
            envelope = parse_packet_envelope(bytes(row["data"]))
            if envelope is None:
                continue
            if envelope.hash_size == 1:
                single_byte += 1
            elif envelope.hash_size == 2:
                double_byte += 1
            elif envelope.hash_size == 3:
                triple_byte += 1

        total_packets = single_byte + double_byte + triple_byte
        if total_packets == 0:
            return {
                "total_packets": 0,
                "single_byte": 0,
                "double_byte": 0,
                "triple_byte": 0,
                "single_byte_pct": 0.0,
                "double_byte_pct": 0.0,
                "triple_byte_pct": 0.0,
            }

        return {
            "total_packets": total_packets,
            "single_byte": single_byte,
            "double_byte": double_byte,
            "triple_byte": triple_byte,
            "single_byte_pct": (single_byte / total_packets) * 100,
            "double_byte_pct": (double_byte / total_packets) * 100,
            "triple_byte_pct": (triple_byte / total_packets) * 100,
        }

    @staticmethod
    async def get_all() -> dict:
        """Aggregate all statistics from existing tables."""
@@ -348,6 +396,7 @@ class StatisticsRepository:
        # Activity windows
        contacts_heard = await StatisticsRepository._activity_counts(contact_type=2, exclude=True)
        repeaters_heard = await StatisticsRepository._activity_counts(contact_type=2)
        path_hash_width_24h = await StatisticsRepository._path_hash_width_24h()

        return {
            "busiest_channels_24h": busiest_channels_24h,
@@ -362,4 +411,5 @@ class StatisticsRepository:
            "total_outgoing": total_outgoing,
            "contacts_heard": contacts_heard,
            "repeaters_heard": repeaters_heard,
            "path_hash_width_24h": path_hash_width_24h,
        }
@@ -110,6 +110,7 @@ async def create_contact(
            "flags": existing.flags,
            "last_path": existing.last_path,
            "last_path_len": existing.last_path_len,
            "out_path_hash_mode": existing.out_path_hash_mode,
            "last_advert": existing.last_advert,
            "lat": existing.lat,
            "lon": existing.lon,
@@ -139,6 +140,7 @@ async def create_contact(
            "flags": 0,
            "last_path": None,
            "last_path_len": -1,
            "out_path_hash_mode": -1,
            "last_advert": None,
            "lat": None,
            "lon": None,
@@ -204,8 +206,8 @@ async def get_contact_detail(public_key: str) -> ContactDetail:
    # Compute nearest repeaters from first-hop prefixes in advert paths
    first_hop_stats: dict[str, dict] = {}  # prefix -> {heard_count, path_len, last_seen}
    for p in advert_paths:
        if p.path and len(p.path) >= 2:
            prefix = p.path[:2].lower()
        prefix = p.next_hop
        if prefix:
            if prefix not in first_hop_stats:
                first_hop_stats[prefix] = {
                    "heard_count": 0,
@@ -462,7 +464,7 @@ async def reset_contact_path(public_key: str) -> dict:
    """Reset a contact's routing path to flood."""
    contact = await _resolve_contact_or_404(public_key)

    await ContactRepository.update_path(contact.public_key, "", -1)
    await ContactRepository.update_path(contact.public_key, "", -1, -1)
    logger.info("Reset path to flood for %s", contact.public_key[:12])

    # Push the updated path to radio if connected and contact is on radio
@@ -2,6 +2,7 @@

import logging
import re
import string

from fastapi import APIRouter, HTTPException
from pydantic import BaseModel, Field
@@ -15,6 +16,48 @@ router = APIRouter(prefix="/fanout", tags=["fanout"])
_VALID_TYPES = {"mqtt_private", "mqtt_community", "bot", "webhook", "apprise"}

_IATA_RE = re.compile(r"^[A-Z]{3}$")
_DEFAULT_COMMUNITY_MQTT_TOPIC_TEMPLATE = "meshcore/{IATA}/{PUBLIC_KEY}/packets"
_DEFAULT_COMMUNITY_MQTT_BROKER_HOST = "mqtt-us-v1.letsmesh.net"
_DEFAULT_COMMUNITY_MQTT_BROKER_PORT = 443
_DEFAULT_COMMUNITY_MQTT_TRANSPORT = "websockets"
_DEFAULT_COMMUNITY_MQTT_AUTH_MODE = "token"
_COMMUNITY_MQTT_TEMPLATE_FIELD_CANONICAL = {
    "iata": "IATA",
    "public_key": "PUBLIC_KEY",
}
_ALLOWED_COMMUNITY_MQTT_TRANSPORTS = {"tcp", "websockets"}
_ALLOWED_COMMUNITY_MQTT_AUTH_MODES = {"token", "password", "none"}


def _normalize_community_topic_template(topic_template: str) -> str:
    """Normalize Community MQTT topic template placeholders to canonical uppercase form."""
    template = topic_template.strip() or _DEFAULT_COMMUNITY_MQTT_TOPIC_TEMPLATE
    parts: list[str] = []
    try:
        parsed = string.Formatter().parse(template)
        for literal_text, field_name, format_spec, conversion in parsed:
            parts.append(literal_text)
            if field_name is None:
                continue
            normalized_field = _COMMUNITY_MQTT_TEMPLATE_FIELD_CANONICAL.get(field_name.lower())
            if normalized_field is None:
                raise HTTPException(
                    status_code=400,
                    detail=(
                        f"topic_template may only use {{IATA}} and {{PUBLIC_KEY}}; got {field_name}"
                    ),
                )
            replacement = ["{", normalized_field]
            if conversion:
                replacement.extend(["!", conversion])
            if format_spec:
                replacement.extend([":", format_spec])
            replacement.append("}")
            parts.append("".join(replacement))
    except ValueError as exc:
        raise HTTPException(status_code=400, detail=f"Invalid topic_template: {exc}") from None

    return "".join(parts)


class FanoutConfigCreate(BaseModel):
@@ -43,6 +86,46 @@ def _validate_mqtt_private_config(config: dict) -> None:

def _validate_mqtt_community_config(config: dict) -> None:
    """Validate mqtt_community config blob. Normalizes IATA to uppercase."""
    broker_host = str(config.get("broker_host", _DEFAULT_COMMUNITY_MQTT_BROKER_HOST)).strip()
    if not broker_host:
        broker_host = _DEFAULT_COMMUNITY_MQTT_BROKER_HOST
    config["broker_host"] = broker_host

    port = config.get("broker_port", _DEFAULT_COMMUNITY_MQTT_BROKER_PORT)
    if not isinstance(port, int) or port < 1 or port > 65535:
        raise HTTPException(status_code=400, detail="broker_port must be between 1 and 65535")
    config["broker_port"] = port

    transport = str(config.get("transport", _DEFAULT_COMMUNITY_MQTT_TRANSPORT)).strip().lower()
    if transport not in _ALLOWED_COMMUNITY_MQTT_TRANSPORTS:
        raise HTTPException(
            status_code=400,
            detail="transport must be 'websockets' or 'tcp'",
        )
    config["transport"] = transport
    config["use_tls"] = bool(config.get("use_tls", True))
    config["tls_verify"] = bool(config.get("tls_verify", True))

    auth_mode = str(config.get("auth_mode", _DEFAULT_COMMUNITY_MQTT_AUTH_MODE)).strip().lower()
    if auth_mode not in _ALLOWED_COMMUNITY_MQTT_AUTH_MODES:
        raise HTTPException(
            status_code=400,
            detail="auth_mode must be 'token', 'password', or 'none'",
        )
    config["auth_mode"] = auth_mode
    username = str(config.get("username", "")).strip()
    password = str(config.get("password", "")).strip()
    if auth_mode == "password" and (not username or not password):
        raise HTTPException(
            status_code=400,
            detail="username and password are required when auth_mode is 'password'",
        )
    config["username"] = username
    config["password"] = password

    token_audience = str(config.get("token_audience", "")).strip()
    config["token_audience"] = token_audience

    iata = config.get("iata", "").upper().strip()
    if not iata or not _IATA_RE.fullmatch(iata):
        raise HTTPException(
@@ -51,6 +134,14 @@ def _validate_mqtt_community_config(config: dict) -> None:
        )
    config["iata"] = iata

    topic_template = str(
        config.get("topic_template", _DEFAULT_COMMUNITY_MQTT_TOPIC_TEMPLATE)
    ).strip()
    if not topic_template:
        topic_template = _DEFAULT_COMMUNITY_MQTT_TOPIC_TEMPLATE

    config["topic_template"] = _normalize_community_topic_template(topic_template)


def _validate_bot_config(config: dict) -> None:
    """Validate bot config blob (syntax-check the code)."""
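`_normalize_community_topic_template` above works because `string.Formatter().parse()` walks a template as (literal, field, spec, conversion) tuples, so placeholders can be case-canonicalized without touching literal text. A hedged, standalone sketch of that mechanism (plain `ValueError` stands in for the route's `HTTPException`):

```python
import string

CANONICAL = {"iata": "IATA", "public_key": "PUBLIC_KEY"}

def normalize_template(template: str) -> str:
    """Rewrite {iata}/{public_key} placeholders to canonical uppercase form."""
    parts: list[str] = []
    for literal, field, spec, conv in string.Formatter().parse(template):
        parts.append(literal)
        if field is None:  # trailing literal-only segment
            continue
        canonical = CANONICAL.get(field.lower())
        if canonical is None:
            raise ValueError(f"unsupported placeholder: {field}")
        piece = "{" + canonical
        if conv:
            piece += "!" + conv
        if spec:
            piece += ":" + spec
        parts.append(piece + "}")
    return "".join(parts)

print(normalize_template("meshcore/{iata}/{public_key}/packets"))
# meshcore/{IATA}/{PUBLIC_KEY}/packets
```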
@@ -14,6 +14,7 @@ router = APIRouter(tags=["health"])
class HealthResponse(BaseModel):
    status: str
    radio_connected: bool
    radio_initializing: bool = False
    connection_info: str | None
    database_size_mb: float
    oldest_undecrypted_timestamp: int | None
@@ -45,9 +46,20 @@ async def build_health_data(radio_connected: bool, connection_info: str | None)
    except Exception:
        pass

    setup_in_progress = getattr(radio_manager, "is_setup_in_progress", False)
    if not isinstance(setup_in_progress, bool):
        setup_in_progress = False

    setup_complete = getattr(radio_manager, "is_setup_complete", radio_connected)
    if not isinstance(setup_complete, bool):
        setup_complete = radio_connected

    radio_initializing = bool(radio_connected and (setup_in_progress or not setup_complete))

    return {
        "status": "ok" if radio_connected else "degraded",
        "status": "ok" if radio_connected and not radio_initializing else "degraded",
        "radio_connected": radio_connected,
        "radio_initializing": radio_initializing,
        "connection_info": connection_info,
        "database_size_mb": db_size_mb,
        "oldest_undecrypted_timestamp": oldest_ts,
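The health-status derivation above reduces to a small truth table: a connected radio that is still setting up counts as "initializing", and only a connected, fully set-up radio reports "ok". A hedged, standalone restatement (the function name is illustrative; only the boolean logic mirrors the hunk):

```python
def health_status(radio_connected: bool, setup_in_progress: bool, setup_complete: bool) -> tuple[str, bool]:
    """Return (status, radio_initializing) per the logic in build_health_data."""
    radio_initializing = bool(radio_connected and (setup_in_progress or not setup_complete))
    status = "ok" if radio_connected and not radio_initializing else "degraded"
    return status, radio_initializing

print(health_status(True, False, True))    # ('ok', False)       connected + set up
print(health_status(True, True, False))    # ('degraded', True)  connected, still initializing
print(health_status(False, False, False))  # ('degraded', False) disconnected
```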
@@ -71,6 +71,7 @@ async def _run_historical_channel_decryption(
            timestamp=result.timestamp,
            received_at=packet_timestamp,
            path=path_hex,
            path_len=packet_info.path_length if packet_info else None,
            realtime=False,  # Historical decryption should not trigger fanout
        )

@@ -28,6 +28,12 @@ class RadioConfigResponse(BaseModel):
    tx_power: int = Field(description="Transmit power in dBm")
    max_tx_power: int = Field(description="Maximum transmit power in dBm")
    radio: RadioSettings
    path_hash_mode: int = Field(
        default=0, description="Path hash mode (0=1-byte, 1=2-byte, 2=3-byte)"
    )
    path_hash_mode_supported: bool = Field(
        default=False, description="Whether firmware supports path hash mode setting"
    )


class RadioConfigUpdate(BaseModel):
@@ -36,6 +42,12 @@ class RadioConfigUpdate(BaseModel):
    lon: float | None = None
    tx_power: int | None = Field(default=None, description="Transmit power in dBm")
    radio: RadioSettings | None = None
    path_hash_mode: int | None = Field(
        default=None,
        ge=0,
        le=2,
        description="Path hash mode (0=1-byte, 1=2-byte, 2=3-byte)",
    )


class PrivateKeyUpdate(BaseModel):
@@ -64,6 +76,8 @@ async def get_radio_config() -> RadioConfigResponse:
            sf=info.get("radio_sf", 0),
            cr=info.get("radio_cr", 0),
        ),
        path_hash_mode=radio_manager.path_hash_mode,
        path_hash_mode_supported=radio_manager.path_hash_mode_supported,
    )


@@ -103,6 +117,20 @@ async def update_radio_config(update: RadioConfigUpdate) -> RadioConfigResponse:
            cr=update.radio.cr,
        )

    if update.path_hash_mode is not None:
        if not radio_manager.path_hash_mode_supported:
            raise HTTPException(
                status_code=400, detail="Firmware does not support path hash mode setting"
            )
        logger.info("Setting path hash mode to %d", update.path_hash_mode)
        result = await mc.commands.set_path_hash_mode(update.path_hash_mode)
        if result is not None and result.type == EventType.ERROR:
            raise HTTPException(
                status_code=500,
                detail=f"Failed to set path hash mode: {result.payload}",
            )
        radio_manager.path_hash_mode = update.path_hash_mode

    # Sync time with system clock
    await sync_radio_time(mc)


@@ -13,7 +13,7 @@ services:
    ################################################
    # Set your serial device for passthrough here! #
    ################################################
    devices:
      - /dev/ttyUSB0:/dev/ttyUSB0
      - /dev/ttyACM0:/dev/ttyUSB0

    environment:
      MESHCORE_DATABASE_PATH: data/meshcore.db
1
frontend/.npmrc
Normal file
@@ -0,0 +1 @@
install-links=true
@@ -12,7 +12,9 @@ Keep it aligned with `frontend/src` source code.
- Tailwind utility classes + local CSS (`index.css`, `styles.css`)
- Sonner (toasts)
- Leaflet / react-leaflet (map)
- `@michaelhart/meshcore-decoder` installed via npm alias to `meshcore-decoder-multibyte-patch`
- `meshcore-hashtag-cracker` + `nosleep.js` (channel cracker)
- Multibyte-aware decoder build published as `meshcore-decoder-multibyte-patch`

## Frontend Map

@@ -138,6 +140,7 @@ frontend/src/
├── useContactsAndChannels.test.ts
├── useWebSocket.dispatch.test.ts
└── useWebSocket.lifecycle.test.ts

```

## Architecture Notes
@@ -175,11 +178,17 @@ frontend/src/
- `VisualizerView.tsx` hosts `PacketVisualizer3D.tsx` (desktop split-pane and mobile tabs).
- `PacketVisualizer3D` uses persistent Three.js geometries for links/highlights/particles and updates typed-array buffers in-place per frame.
- Packet repeat aggregation keys prefer decoder `messageHash` (path-insensitive), with hash fallback for malformed packets.
- Raw-packet decoding in `RawPacketList.tsx` and `visualizerUtils.ts` relies on the multibyte-aware decoder fork; keep frontend packet parsing aligned with backend `path_utils.py`.
- Raw packet events carry both:
  - `id`: backend storage row identity (payload-level dedup)
  - `observation_id`: realtime per-arrival identity (session fidelity)
- Packet feed/visualizer render keys and dedup logic should use `observation_id` (fallback to `id` only for older payloads).

### Radio settings behavior

- `SettingsRadioSection.tsx` surfaces `path_hash_mode` only when `config.path_hash_mode_supported` is true.
- Frontend `path_len` fields are hop counts, not raw byte lengths; multibyte path rendering must use the accompanying metadata before splitting hop identifiers.

## WebSocket (`useWebSocket.ts`)

- Auto reconnect (3s) with cleanup guard on unmount.
21
frontend/lib/meshcore-decoder/LICENSE.md
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2025 Michael Hart <michaelhart@michaelhart.me> (https://github.com/michaelhart)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
453
frontend/lib/meshcore-decoder/README.md
Normal file
@@ -0,0 +1,453 @@
# MeshCore Decoder

A TypeScript library for decoding MeshCore mesh networking packets with full cryptographic support. Uses WebAssembly (WASM) for Ed25519 key derivation through the [orlp/ed25519 library](https://github.com/orlp/ed25519).

This powers the [MeshCore Packet Analyzer](https://analyzer.letsme.sh/).

## Features

- **Packet Decoding**: Decode MeshCore packets
- **Built-in Decryption**: Decrypt GroupText, TextMessage, and other encrypted payloads
- **Developer Friendly**: TypeScript-first, with full type safety and the portability of JavaScript

## Installation

### Install to a single project

```bash
npm install @michaelhart/meshcore-decoder
```

### Install the CLI (globally)

```bash
npm install -g @michaelhart/meshcore-decoder
```

## Quick Start

```typescript
import {
  MeshCoreDecoder,
  PayloadType,
  Utils,
  DecodedPacket,
  AdvertPayload
} from '@michaelhart/meshcore-decoder';

// Decode a MeshCore packet
const hexData: string = '11007E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C94006CE7CF682E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E60992A076D50238C5B8F85757375354522F50756765744D65736820436F75676172';

const packet: DecodedPacket = MeshCoreDecoder.decode(hexData);

console.log(`Route Type: ${Utils.getRouteTypeName(packet.routeType)}`);
console.log(`Payload Type: ${Utils.getPayloadTypeName(packet.payloadType)}`);
console.log(`Message Hash: ${packet.messageHash}`);

if (packet.payloadType === PayloadType.Advert && packet.payload.decoded) {
  const advert: AdvertPayload = packet.payload.decoded as AdvertPayload;
  console.log(`Device Name: ${advert.appData.name}`);
  console.log(`Device Role: ${Utils.getDeviceRoleName(advert.appData.deviceRole)}`);
  if (advert.appData.location) {
    console.log(`Location: ${advert.appData.location.latitude}, ${advert.appData.location.longitude}`);
  }
}
```

## Full Packet Structure Example

Here's what a complete decoded packet looks like:

```typescript
import { MeshCoreDecoder, DecodedPacket } from '@michaelhart/meshcore-decoder';

const hexData: string = '11007E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C94006CE7CF682E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E60992A076D50238C5B8F85757375354522F50756765744D65736820436F75676172';

const packet: DecodedPacket = MeshCoreDecoder.decode(hexData);

console.log(JSON.stringify(packet, null, 2));
```

**Output:**
```json
{
  "messageHash": "F9C060FE",
  "routeType": 1,
  "payloadType": 4,
  "payloadVersion": 0,
  "pathLength": 0,
  "path": null,
  "payload": {
    "raw": "7E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C94006CE7CF682E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E60992A076D50238C5B8F85757375354522F50756765744D65736820436F75676172",
    "decoded": {
      "type": 4,
      "version": 0,
      "isValid": true,
      "publicKey": "7E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C9400",
      "timestamp": 1758455660,
      "signature": "2E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E609",
      "appData": {
        "flags": 146,
        "deviceRole": 2,
        "hasLocation": true,
        "hasName": true,
        "location": {
          "latitude": 47.543968,
          "longitude": -122.108616
        },
        "name": "WW7STR/PugetMesh Cougar"
      }
    }
  },
  "totalBytes": 134,
  "isValid": true
}
```
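As a cross-check on the output above, the single header byte `0x11` appears to pack the route type, payload type, and payload version. A sketch of that inferred bit layout (an assumption reverse-read from the decoded values, not taken from the library's source):

```typescript
// Inferred layout of the first packet byte, consistent with the
// decoded output above (0x11 -> routeType 1, payloadType 4, version 0).
// This is an assumption for illustration; the library's own decoder
// is the authority on the real field layout.
function parseHeaderByte(header: number) {
  return {
    routeType: header & 0x03,           // low two bits
    payloadType: (header >> 2) & 0x0f,  // next four bits
    payloadVersion: (header >> 6) & 0x03,
  };
}
```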
## Packet Support

| Value | Name | Description | Decoding | Decryption | Segment Analysis |
|-------|------|-------------|----------|------------|------------------|
| `0x00` | Request | Request (destination/source hashes + MAC) | ✅ | 🚧 | ✅ |
| `0x01` | Response | Response to REQ or ANON_REQ | ✅ | 🚧 | ✅ |
| `0x02` | Plain text message | Plain text message | ✅ | 🚧 | ✅ |
| `0x03` | Acknowledgment | Acknowledgment | ✅ | N/A | ✅ |
| `0x04` | Node advertisement | Node advertisement | ✅ | N/A | ✅ |
| `0x05` | Group text message | Group text message | ✅ | ✅ | ✅ |
| `0x06` | Group datagram | Group datagram | 🚧 | 🚧 | 🚧 |
| `0x07` | Anonymous request | Anonymous request | ✅ | 🚧 | ✅ |
| `0x08` | Returned path | Returned path | ✅ | N/A | ✅ |
| `0x09` | Trace | Trace a path, collecting SNR for each hop | ✅ | N/A | ✅ |
| `0x0A` | Multi-part packet | Packet is part of a sequence of packets | 🚧 | 🚧 | 🚧 |
| `0x0F` | Custom packet | Custom packet (raw bytes, custom encryption) | 🚧 | 🚧 | 🚧 |

**Legend:**
- ✅ Fully implemented
- 🚧 Planned/In development
- `N/A` Not applicable

Some packet types not yet supported here may not exist in MeshCore yet, or I have yet to observe them on the mesh.
## Decryption Support

Simply provide your channel secret keys and the library handles everything else:

```typescript
import {
  MeshCoreDecoder,
  PayloadType,
  CryptoKeyStore,
  DecodedPacket,
  GroupTextPayload
} from '@michaelhart/meshcore-decoder';

// Create a key store with channel secret keys
const keyStore: CryptoKeyStore = MeshCoreDecoder.createKeyStore({
  channelSecrets: [
    '8b3387e9c5cdea6ac9e5edbaa115cd72', // Public channel (channel hash 11)
    'ff2b7d74e8d20f71505bda9ea8d59a1c', // A different channel's secret
  ]
});

const groupTextHexData: string = '...'; // Your encrypted GroupText packet hex

// Decode encrypted GroupText message
const encryptedPacket: DecodedPacket = MeshCoreDecoder.decode(groupTextHexData, { keyStore });

if (encryptedPacket.payloadType === PayloadType.GroupText && encryptedPacket.payload.decoded) {
  const groupText: GroupTextPayload = encryptedPacket.payload.decoded as GroupTextPayload;

  if (groupText.decrypted) {
    console.log(`Sender: ${groupText.decrypted.sender}`);
    console.log(`Message: ${groupText.decrypted.message}`);
    console.log(`Timestamp: ${new Date(groupText.decrypted.timestamp * 1000).toISOString()}`);
  } else {
    console.log('Message encrypted (no key available)');
  }
}
```

The library automatically:
- Calculates channel hashes from your secret keys using SHA256
- Handles hash collisions (multiple keys with the same first byte) by trying all matching keys
- Verifies message authenticity using HMAC-SHA256
- Decrypts using AES-128 ECB
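The channel-hash and collision-handling steps above can be sketched as follows. This assumes, as the described behavior suggests, that the one-byte channel hash comes from a SHA-256 digest of the raw secret-key bytes; the exact derivation lives inside the library, so treat `channelHashOf` as illustrative:

```typescript
import { createHash } from "node:crypto";

// Hypothetical sketch: derive a one-byte channel hash from a channel
// secret by taking the first byte of SHA-256 over the raw key bytes.
// The library does this internally; this only illustrates why two
// secrets can collide (same first digest byte).
function channelHashOf(secretHex: string): string {
  const digest = createHash("sha256").update(Buffer.from(secretHex, "hex")).digest();
  return digest[0].toString(16).padStart(2, "0").toUpperCase();
}

// On a collision, every secret whose hash matches the packet's
// channel hash is a decryption candidate and gets tried in turn.
function candidateKeys(secrets: string[], packetChannelHash: string): string[] {
  return secrets.filter(s => channelHashOf(s) === packetChannelHash.toUpperCase());
}
```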
## Packet Structure Analysis

For detailed packet analysis and debugging, use `analyzeStructure()` to get byte-level breakdowns:

```typescript
import { MeshCoreDecoder, PacketStructure } from '@michaelhart/meshcore-decoder';

console.log('=== Packet Breakdown ===');
const hexData: string = '11007E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C94006CE7CF682E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E60992A076D50238C5B8F85757375354522F50756765744D65736820436F75676172';

console.log('Packet length:', hexData.length);
console.log('Expected bytes:', hexData.length / 2);

const structure: PacketStructure = MeshCoreDecoder.analyzeStructure(hexData);
console.log('\nMain segments:');
structure.segments.forEach((seg, i) => {
  console.log(`${i+1}. ${seg.name} (bytes ${seg.startByte}-${seg.endByte}): ${seg.value}`);
});

console.log('\nPayload segments:');
structure.payload.segments.forEach((seg, i) => {
  console.log(`${i+1}. ${seg.name} (bytes ${seg.startByte}-${seg.endByte}): ${seg.value}`);
  console.log(`   Description: ${seg.description}`);
});
```

**Output:**
```
=== Packet Breakdown ===
Packet length: 268
Expected bytes: 134

Main segments:
1. Header (bytes 0-0): 0x11
2. Path Length (bytes 1-1): 0x00
3. Payload (bytes 2-133): 7E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C94006CE7CF682E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E60992A076D50238C5B8F85757375354522F50756765744D65736820436F75676172

Payload segments:
1. Public Key (bytes 0-31): 7E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C9400
   Description: Ed25519 public key
2. Timestamp (bytes 32-35): 6CE7CF68
   Description: 1758455660 (2025-09-21T11:54:20Z)
3. Signature (bytes 36-99): 2E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E609
   Description: Ed25519 signature
4. App Flags (bytes 100-100): 92
   Description: Binary: 10010010 | Bits 0-3 (Role): Room server | Bit 4 (Location): Yes | Bit 5 (Feature1): No | Bit 6 (Feature2): No | Bit 7 (Name): Yes
5. Latitude (bytes 101-104): A076D502
   Description: 47.543968° (47.543968)
6. Longitude (bytes 105-108): 38C5B8F8
   Description: -122.108616° (-122.108616)
7. Node Name (bytes 109-131): 5757375354522F50756765744D65736820436F75676172
   Description: Node name: "WW7STR/PugetMesh Cougar"
```

The `analyzeStructure()` method provides:
- **Header breakdown** with bit-level field analysis
- **Byte-accurate segments** with start/end positions
- **Payload field parsing** for all supported packet types
- **Human-readable descriptions** for each field
## Ed25519 Key Derivation

The library includes MeshCore-compatible Ed25519 key derivation using the exact orlp/ed25519 algorithm via WebAssembly:

```typescript
import { Utils } from '@michaelhart/meshcore-decoder';

// Derive public key from MeshCore private key (64-byte format)
const privateKey = '18469d6140447f77de13cd8d761e605431f52269fbff43b0925752ed9e6745435dc6a86d2568af8b70d3365db3f88234760c8ecc645ce469829bc45b65f1d5d5';

const publicKey = await Utils.derivePublicKey(privateKey);
console.log('Derived Public Key:', publicKey);
// Output: 4852B69364572B52EFA1B6BB3E6D0ABED4F389A1CBFBB60A9BBA2CCE649CAF0E

// Validate a key pair
const isValid = await Utils.validateKeyPair(privateKey, publicKey);
console.log('Key pair valid:', isValid); // true
```
### Command Line Interface

For quick analysis from the terminal, install globally and use the CLI:

```bash
# Install globally
npm install -g @michaelhart/meshcore-decoder

# Analyze a packet
meshcore-decoder 11007E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C94006CE7CF682E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E60992A076D50238C5B8F85757375354522F50756765744D65736820436F75676172

# With decryption (provide channel secrets)
meshcore-decoder 150011C3C1354D619BAE9590E4D177DB7EEAF982F5BDCF78005D75157D9535FA90178F785D --key 8b3387e9c5cdea6ac9e5edbaa115cd72

# Show detailed structure analysis
meshcore-decoder --structure 11007E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C94006CE7CF682E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E60992A076D50238C5B8F85757375354522F50756765744D65736820436F75676172

# JSON output
meshcore-decoder --json 11007E7662676F7F0850A8A355BAAFBFC1EB7B4174C340442D7D7161C9474A2C94006CE7CF682E58408DD8FCC51906ECA98EBF94A037886BDADE7ECD09FD92B839491DF3809C9454F5286D1D3370AC31A34593D569E9A042A3B41FD331DFFB7E18599CE1E60992A076D50238C5B8F85757375354522F50756765744D65736820436F75676172

# Derive public key from MeshCore private key
meshcore-decoder derive-key 18469d6140447f77de13cd8d761e605431f52269fbff43b0925752ed9e6745435dc6a86d2568af8b70d3365db3f88234760c8ecc645ce469829bc45b65f1d5d5

# Validate key pair
meshcore-decoder derive-key 18469d6140447f77de13cd8d761e605431f52269fbff43b0925752ed9e6745435dc6a86d2568af8b70d3365db3f88234760c8ecc645ce469829bc45b65f1d5d5 --validate 4852b69364572b52efa1b6bb3e6d0abed4f389a1cbfbb60a9bba2cce649caf0e

# Key derivation with JSON output
meshcore-decoder derive-key 18469d6140447f77de13cd8d761e605431f52269fbff43b0925752ed9e6745435dc6a86d2568af8b70d3365db3f88234760c8ecc645ce469829bc45b65f1d5d5 --json
```
## Using with Angular

The library works in Angular (and other browser-based) applications but requires additional configuration for WASM support and browser compatibility.

### 1. Configure Assets in `angular.json`

Add the WASM files to your Angular assets configuration:

```json
{
  "projects": {
    "your-app": {
      "architect": {
        "build": {
          "options": {
            "assets": [
              // ... your existing assets ...
              {
                "glob": "orlp-ed25519.*",
                "input": "./node_modules/@michaelhart/meshcore-decoder/lib",
                "output": "assets/"
              }
            ]
          }
        }
      }
    }
  }
}
```

### 2. Create a WASM Service

Create `src/app/services/meshcore-wasm.ts`:

```typescript
import { Injectable } from '@angular/core';
import { BehaviorSubject } from 'rxjs';

@Injectable({ providedIn: 'root' })
export class MeshCoreWasmService {
  private wasm: any = null;
  public ready = new BehaviorSubject<boolean>(false);

  constructor() {
    this.loadWasm();
  }

  private async loadWasm() {
    try {
      const jsResponse = await fetch('/assets/orlp-ed25519.js');
      const jsText = await jsResponse.text();

      const script = document.createElement('script');
      script.textContent = jsText;
      document.head.appendChild(script);

      this.wasm = await (window as any).OrlpEd25519({
        locateFile: (path: string) => path === 'orlp-ed25519.wasm' ? '/assets/orlp-ed25519.wasm' : path
      });

      this.ready.next(true);
    } catch (error) {
      console.error('WASM load failed:', error);
      this.ready.next(false);
    }
  }

  derivePublicKey(privateKeyHex: string): string | null {
    if (!this.wasm) return null;

    const privateKeyBytes = this.hexToBytes(privateKeyHex);
    const privateKeyPtr = 1024;
    const publicKeyPtr = 1088;

    this.wasm.HEAPU8.set(privateKeyBytes, privateKeyPtr);

    const result = this.wasm.ccall('orlp_derive_public_key', 'number', ['number', 'number'], [publicKeyPtr, privateKeyPtr]);

    if (result === 0) {
      const publicKeyBytes = this.wasm.HEAPU8.subarray(publicKeyPtr, publicKeyPtr + 32);
      return this.bytesToHex(publicKeyBytes);
    }

    return null;
  }

  private hexToBytes(hex: string): Uint8Array {
    const bytes = new Uint8Array(hex.length / 2);
    for (let i = 0; i < hex.length; i += 2) {
      bytes[i / 2] = parseInt(hex.substr(i, 2), 16);
    }
    return bytes;
  }

  private bytesToHex(bytes: Uint8Array): string {
    return Array.from(bytes).map(b => b.toString(16).padStart(2, '0')).join('').toUpperCase();
  }
}
```

### 3. Basic Usage

```typescript
import { MeshCorePacketDecoder } from '@michaelhart/meshcore-decoder';
import { MeshCoreWasmService } from './services/meshcore-wasm';

// Basic packet decoding (works immediately)
const packet = MeshCorePacketDecoder.decode(hexData);

// Key derivation (wait for WASM)
wasmService.ready.subscribe(isReady => {
  if (isReady) {
    const publicKey = wasmService.derivePublicKey(privateKeyHex);
  }
});
```

### Angular/Browser: Important Notes

- **WASM Loading**: The library uses WebAssembly for Ed25519 key derivation. This requires proper asset configuration and a service to handle async WASM loading.
- **Browser Compatibility**: The library automatically detects the environment, using the Web Crypto API in browsers and the `crypto` module in Node.js.
- **Async Operations**: Key derivation is async due to WASM loading. Always wait for the `WasmService.ready` observable.
- **Error Handling**: WASM operations may fail in some environments. Always wrap them in try-catch blocks.
## Development

```bash
# Install dependencies
npm install

# Run tests
npm test

# Run tests in watch mode
npm run test:watch

# Build for production
npm run build

# Development with ts-node
npm run dev
```

## License

MIT License

Copyright (c) 2025 Michael Hart <michaelhart@michaelhart.me> (https://github.com/michaelhart)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
3
frontend/lib/meshcore-decoder/dist/cli.d.ts
vendored
Normal file
@@ -0,0 +1,3 @@
#!/usr/bin/env node
export {};
//# sourceMappingURL=cli.d.ts.map
1
frontend/lib/meshcore-decoder/dist/cli.d.ts.map
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"cli.d.ts","sourceRoot":"","sources":["../src/cli.ts"],"names":[],"mappings":""}
409
frontend/lib/meshcore-decoder/dist/cli.js
vendored
Normal file
@@ -0,0 +1,409 @@
#!/usr/bin/env node
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
    var ownKeys = function(o) {
        ownKeys = Object.getOwnPropertyNames || function (o) {
            var ar = [];
            for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
            return ar;
        };
        return ownKeys(o);
    };
    return function (mod) {
        if (mod && mod.__esModule) return mod;
        var result = {};
        if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
        __setModuleDefault(result, mod);
        return result;
    };
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const packet_decoder_1 = require("./decoder/packet-decoder");
const enums_1 = require("./types/enums");
const enum_names_1 = require("./utils/enum-names");
const index_1 = require("./index");
const commander_1 = require("commander");
const chalk_1 = __importDefault(require("chalk"));
const packageJson = __importStar(require("../package.json"));
commander_1.program
    .name('meshcore-decoder')
    .description('CLI tool for decoding MeshCore packets')
    .version(packageJson.version);
// Default decode command
commander_1.program
    .command('decode', { isDefault: true })
    .description('Decode a MeshCore packet')
    .argument('<hex>', 'Hex string of the packet to decode')
    .option('-k, --key <keys...>', 'Channel secret keys for decryption (hex)')
    .option('-j, --json', 'Output as JSON instead of formatted text')
    .option('-s, --structure', 'Show detailed packet structure analysis')
    .action(async (hex, options) => {
    try {
        // Clean up hex input
        const cleanHex = hex.replace(/\s+/g, '').replace(/^0x/i, '');
        // Create key store if keys provided
        let keyStore;
        if (options.key && options.key.length > 0) {
            keyStore = packet_decoder_1.MeshCorePacketDecoder.createKeyStore({
                channelSecrets: options.key
            });
        }
        // Decode packet with signature verification
        const packet = await packet_decoder_1.MeshCorePacketDecoder.decodeWithVerification(cleanHex, { keyStore });
        if (options.json) {
            // JSON output
            if (options.structure) {
                const structure = await packet_decoder_1.MeshCorePacketDecoder.analyzeStructureWithVerification(cleanHex, { keyStore });
                console.log(JSON.stringify({ packet, structure }, null, 2));
            }
            else {
                console.log(JSON.stringify(packet, null, 2));
            }
        }
        else {
            // Formatted output
            console.log(chalk_1.default.cyan('=== MeshCore Packet Analysis ===\n'));
            if (!packet.isValid) {
                console.log(chalk_1.default.red('❌ Invalid Packet'));
                if (packet.errors) {
                    packet.errors.forEach(error => console.log(chalk_1.default.red(`  ${error}`)));
                }
            }
            else {
                console.log(chalk_1.default.green('✅ Valid Packet'));
            }
            console.log(`${chalk_1.default.bold('Message Hash:')} ${packet.messageHash}`);
            console.log(`${chalk_1.default.bold('Route Type:')} ${(0, enum_names_1.getRouteTypeName)(packet.routeType)}`);
            console.log(`${chalk_1.default.bold('Payload Type:')} ${(0, enum_names_1.getPayloadTypeName)(packet.payloadType)}`);
            console.log(`${chalk_1.default.bold('Total Bytes:')} ${packet.totalBytes}`);
            if (packet.path && packet.path.length > 0) {
                console.log(`${chalk_1.default.bold('Path:')} ${packet.path.join(' → ')}`);
            }
            // Show payload details (even for invalid packets)
            if (packet.payload.decoded) {
                console.log(chalk_1.default.cyan('\n=== Payload Details ==='));
                showPayloadDetails(packet.payload.decoded);
            }
            // Exit with error code if packet is invalid
            if (!packet.isValid) {
                process.exit(1);
            }
            // Show structure if requested
            if (options.structure) {
                const structure = await packet_decoder_1.MeshCorePacketDecoder.analyzeStructureWithVerification(cleanHex, { keyStore });
                console.log(chalk_1.default.cyan('\n=== Packet Structure ==='));
                console.log(chalk_1.default.yellow('\nMain Segments:'));
                structure.segments.forEach((seg, i) => {
                    console.log(`${i + 1}. ${chalk_1.default.bold(seg.name)} (bytes ${seg.startByte}-${seg.endByte}): ${seg.value}`);
                    if (seg.description) {
                        console.log(`   ${chalk_1.default.dim(seg.description)}`);
                    }
                });
                if (structure.payload.segments.length > 0) {
                    console.log(chalk_1.default.yellow('\nPayload Segments:'));
                    structure.payload.segments.forEach((seg, i) => {
                        console.log(`${i + 1}. ${chalk_1.default.bold(seg.name)} (bytes ${seg.startByte}-${seg.endByte}): ${seg.value}`);
                        console.log(`   ${chalk_1.default.dim(seg.description)}`);
                    });
                }
            }
        }
    }
    catch (error) {
        console.error(chalk_1.default.red('Error:'), error.message);
        process.exit(1);
    }
});
function showPayloadDetails(payload) {
    switch (payload.type) {
        case enums_1.PayloadType.Advert:
            const advert = payload;
            console.log(`${chalk_1.default.bold('Device Role:')} ${(0, enum_names_1.getDeviceRoleName)(advert.appData.deviceRole)}`);
            if (advert.appData.name) {
                console.log(`${chalk_1.default.bold('Device Name:')} ${advert.appData.name}`);
            }
            if (advert.appData.location) {
                console.log(`${chalk_1.default.bold('Location:')} ${advert.appData.location.latitude}, ${advert.appData.location.longitude}`);
            }
            console.log(`${chalk_1.default.bold('Timestamp:')} ${new Date(advert.timestamp * 1000).toISOString()}`);
            // Show signature verification status
            if (advert.signatureValid !== undefined) {
                if (advert.signatureValid) {
                    console.log(`${chalk_1.default.bold('Signature:')} ${chalk_1.default.green('✅ Valid Ed25519 signature')}`);
                }
                else {
                    console.log(`${chalk_1.default.bold('Signature:')} ${chalk_1.default.red('❌ Invalid Ed25519 signature')}`);
                    if (advert.signatureError) {
                        console.log(`${chalk_1.default.bold('Error:')} ${chalk_1.default.red(advert.signatureError)}`);
                    }
                }
            }
            else {
                console.log(`${chalk_1.default.bold('Signature:')} ${chalk_1.default.yellow('⚠️ Not verified (use async verification)')}`);
            }
            break;
        case enums_1.PayloadType.GroupText:
            const groupText = payload;
            console.log(`${chalk_1.default.bold('Channel Hash:')} ${groupText.channelHash}`);
            if (groupText.decrypted) {
                console.log(chalk_1.default.green('🔓 Decrypted Message:'));
                if (groupText.decrypted.sender) {
                    console.log(`${chalk_1.default.bold('Sender:')} ${groupText.decrypted.sender}`);
                }
                console.log(`${chalk_1.default.bold('Message:')} ${groupText.decrypted.message}`);
                console.log(`${chalk_1.default.bold('Timestamp:')} ${new Date(groupText.decrypted.timestamp * 1000).toISOString()}`);
            }
            else {
                console.log(chalk_1.default.yellow('🔒 Encrypted (no key available)'));
                console.log(`${chalk_1.default.bold('Ciphertext:')} ${groupText.ciphertext.substring(0, 32)}...`);
            }
            break;
        case enums_1.PayloadType.Trace:
            const trace = payload;
            console.log(`${chalk_1.default.bold('Trace Tag:')} ${trace.traceTag}`);
            console.log(`${chalk_1.default.bold('Auth Code:')} ${trace.authCode}`);
            if (trace.snrValues && trace.snrValues.length > 0) {
                console.log(`${chalk_1.default.bold('SNR Values:')} ${trace.snrValues.map(snr => `${snr.toFixed(1)}dB`).join(', ')}`);
            }
            break;
        default:
            console.log(`${chalk_1.default.bold('Type:')} ${(0, enum_names_1.getPayloadTypeName)(payload.type)}`);
            console.log(`${chalk_1.default.bold('Valid:')} ${payload.isValid ? '✅' : '❌'}`);
    }
}
// Add key derivation command
commander_1.program
    .command('derive-key')
    .description('Derive Ed25519 public key from MeshCore private key')
    .argument('<private-key>', '64-byte private key in hex format')
    .option('-v, --validate <public-key>', 'Validate against expected public key')
    .option('-j, --json', 'Output as JSON')
    .action(async (privateKeyHex, options) => {
    try {
        // Clean up hex input
        const cleanPrivateKey = privateKeyHex.replace(/\s+/g, '').replace(/^0x/i, '');
        if (cleanPrivateKey.length !== 128) {
            console.error(chalk_1.default.red('❌ Error: Private key must be exactly 64 bytes (128 hex characters)'));
            process.exit(1);
        }
        if (options.json) {
            // JSON output
            const result = {
                privateKey: cleanPrivateKey,
                derivedPublicKey: await index_1.Utils.derivePublicKey(cleanPrivateKey)
            };
            if (options.validate) {
                const cleanExpectedKey = options.validate.replace(/\s+/g, '').replace(/^0x/i, '');
                result.expectedPublicKey = cleanExpectedKey;
                result.isValid = await index_1.Utils.validateKeyPair(cleanPrivateKey, cleanExpectedKey);
                result.match = result.derivedPublicKey.toLowerCase() === cleanExpectedKey.toLowerCase();
            }
            console.log(JSON.stringify(result, null, 2));
        }
        else {
            // Formatted output
            console.log(chalk_1.default.cyan('=== MeshCore Ed25519 Key Derivation ===\n'));
            console.log(chalk_1.default.bold('Private Key (64 bytes):'));
            console.log(chalk_1.default.gray(cleanPrivateKey));
            console.log();
            console.log(chalk_1.default.bold('Derived Public Key (32 bytes):'));
            const derivedKey = await index_1.Utils.derivePublicKey(cleanPrivateKey);
            console.log(chalk_1.default.green(derivedKey));
            console.log();
            if (options.validate) {
                const cleanExpectedKey = options.validate.replace(/\s+/g, '').replace(/^0x/i, '');
                console.log(chalk_1.default.bold('Expected Public Key:'));
                console.log(chalk_1.default.gray(cleanExpectedKey));
                console.log();
                const match = derivedKey.toLowerCase() === cleanExpectedKey.toLowerCase();
                console.log(chalk_1.default.bold('Validation:'));
                console.log(match ? chalk_1.default.green('Keys match') : chalk_1.default.red('Keys do not match'));
                if (!match) {
                    process.exit(1);
                }
            }
            console.log(chalk_1.default.green('Key derivation completed successfully'));
        }
    }
    catch (error) {
        const errorMessage = error instanceof Error ? error.message : 'Unknown error';
        if (options.json) {
            console.log(JSON.stringify({ error: errorMessage }, null, 2));
        }
        else {
            console.error(chalk_1.default.red(`Error: ${errorMessage}`));
        }
        process.exit(1);
    }
});
// Add auth-token command
commander_1.program
    .command('auth-token')
    .description('Generate JWT authentication token signed with Ed25519 private key')
    .argument('<public-key>', '32-byte public key in hex format')
    .argument('<private-key>', '64-byte private key in hex format')
    .option('-e, --exp <seconds>', 'Token expiration in seconds from now (default: 86400 = 24 hours)', '86400')
    .option('-c, --claims <json>', 'Additional claims as JSON object (e.g., \'{"aud":"mqtt.example.com","sub":"device-123"}\')')
    .option('-j, --json', 'Output as JSON')
    .action(async (publicKeyHex, privateKeyHex, options) => {
    try {
        const { createAuthToken } = await Promise.resolve().then(() => __importStar(require('./utils/auth-token')));
        // Clean up hex inputs
        const cleanPublicKey = publicKeyHex.replace(/\s+/g, '').replace(/^0x/i, '');
        const cleanPrivateKey = privateKeyHex.replace(/\s+/g, '').replace(/^0x/i, '');
        if (cleanPublicKey.length !== 64) {
            console.error(chalk_1.default.red('❌ Error: Public key must be exactly 32 bytes (64 hex characters)'));
|
||||
process.exit(1);
|
||||
}
|
||||
if (cleanPrivateKey.length !== 128) {
|
||||
console.error(chalk_1.default.red('❌ Error: Private key must be exactly 64 bytes (128 hex characters)'));
|
||||
process.exit(1);
|
||||
}
|
||||
const expSeconds = parseInt(options.exp);
|
||||
const iat = Math.floor(Date.now() / 1000);
|
||||
const exp = iat + expSeconds;
|
||||
const payload = {
|
||||
publicKey: cleanPublicKey.toUpperCase(),
|
||||
iat,
|
||||
exp
|
||||
};
|
||||
// Parse and merge additional claims if provided
|
||||
if (options.claims) {
|
||||
try {
|
||||
const additionalClaims = JSON.parse(options.claims);
|
||||
Object.assign(payload, additionalClaims);
|
||||
}
|
||||
catch (e) {
|
||||
console.error(chalk_1.default.red('❌ Error: Invalid JSON in --claims option'));
|
||||
process.exit(1);
|
||||
}
|
||||
}
|
||||
const token = await createAuthToken(payload, cleanPrivateKey, cleanPublicKey.toUpperCase());
|
||||
if (options.json) {
|
||||
console.log(JSON.stringify({
|
||||
token,
|
||||
payload
|
||||
}, null, 2));
|
||||
}
|
||||
else {
|
||||
console.log(token);
|
||||
}
|
||||
}
|
||||
catch (error) {
|
||||
const errorMessage = error instanceof Error ? error.message : 'Unknown error';
|
||||
if (options.json) {
|
||||
console.log(JSON.stringify({ error: errorMessage }, null, 2));
|
||||
}
|
||||
else {
|
||||
console.error(chalk_1.default.red(`Error: ${errorMessage}`));
|
||||
}
|
||||
process.exit(1);
|
||||
}
|
||||
});
|
||||
// Add verify-token command
|
||||
commander_1.program
|
||||
.command('verify-token')
|
||||
.description('Verify JWT authentication token')
|
||||
.argument('<token>', 'JWT token to verify')
|
||||
.option('-p, --public-key <key>', 'Expected public key in hex format (optional)')
|
||||
.option('-j, --json', 'Output as JSON')
|
||||
.action(async (token, options) => {
|
||||
try {
|
||||
const { verifyAuthToken } = await Promise.resolve().then(() => __importStar(require('./utils/auth-token')));
|
||||
const cleanToken = token.trim();
|
||||
let expectedPublicKey;
|
||||
if (options.publicKey) {
|
||||
const cleanKey = options.publicKey.replace(/\s+/g, '').replace(/^0x/i, '').toUpperCase();
|
||||
if (cleanKey.length !== 64) {
|
||||
console.error(chalk_1.default.red('❌ Error: Public key must be exactly 32 bytes (64 hex characters)'));
|
||||
process.exit(1);
|
||||
}
|
||||
expectedPublicKey = cleanKey;
|
||||
}
|
||||
const payload = await verifyAuthToken(cleanToken, expectedPublicKey);
|
||||
if (payload) {
|
||||
const now = Math.floor(Date.now() / 1000);
|
||||
const isExpired = payload.exp && now > payload.exp;
|
||||
const timeToExpiry = payload.exp ? payload.exp - now : null;
|
||||
if (options.json) {
|
||||
console.log(JSON.stringify({
|
||||
valid: true,
|
||||
expired: isExpired,
|
||||
payload,
|
||||
timeToExpiry
|
||||
}, null, 2));
|
||||
}
|
||||
else {
|
||||
console.log(chalk_1.default.green('✅ Token is valid'));
|
||||
console.log(chalk_1.default.cyan('\nPayload:'));
|
||||
console.log(` Public Key: ${payload.publicKey}`);
|
||||
console.log(` Issued At: ${new Date(payload.iat * 1000).toISOString()} (${payload.iat})`);
|
||||
if (payload.exp) {
|
||||
console.log(` Expires At: ${new Date(payload.exp * 1000).toISOString()} (${payload.exp})`);
|
||||
if (isExpired) {
|
||||
console.log(chalk_1.default.red(` Status: EXPIRED`));
|
||||
}
|
||||
else {
|
||||
console.log(chalk_1.default.green(` Status: Valid for ${timeToExpiry} more seconds`));
|
||||
}
|
||||
}
|
||||
// Show any additional claims
|
||||
const standardClaims = ['publicKey', 'iat', 'exp'];
|
||||
const customClaims = Object.keys(payload).filter(k => !standardClaims.includes(k));
|
||||
if (customClaims.length > 0) {
|
||||
console.log(chalk_1.default.cyan('\nCustom Claims:'));
|
||||
customClaims.forEach(key => {
|
||||
console.log(` ${key}: ${JSON.stringify(payload[key])}`);
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
else {
|
||||
if (options.json) {
|
||||
console.log(JSON.stringify({
|
||||
valid: false,
|
||||
error: 'Token verification failed'
|
||||
}, null, 2));
|
||||
}
|
||||
else {
|
||||
console.error(chalk_1.default.red('❌ Token verification failed'));
|
||||
console.error(chalk_1.default.yellow('Possible reasons:'));
|
||||
console.error(' - Invalid signature');
|
||||
console.error(' - Token format is incorrect');
|
||||
console.error(' - Public key mismatch (if --public-key was provided)');
|
||||
}
|
||||
process.exit(1);
|
||||
}
|
||||
}
|
||||
catch (error) {
|
||||
const errorMessage = error instanceof Error ? error.message : 'Unknown error';
|
||||
if (options.json) {
|
||||
console.log(JSON.stringify({ valid: false, error: errorMessage }, null, 2));
|
||||
}
|
||||
else {
|
||||
console.error(chalk_1.default.red(`Error: ${errorMessage}`));
|
||||
}
|
||||
process.exit(1);
|
||||
}
|
||||
});
|
||||
commander_1.program.parse();
|
||||
//# sourceMappingURL=cli.js.map
|
||||
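The `auth-token` command above emits a standard three-part JWT whose payload carries `publicKey`, `iat`, and `exp`. As a quick sanity check, the payload segment can be inspected without signature verification. This is a minimal sketch, not part of the CLI; the token assembled below is fabricated for illustration (the header and signature segments are placeholders):

```javascript
// Decode the middle (payload) segment of a JWT without verifying it.
// Useful for eyeballing the iat/exp claims produced by `auth-token`.
function decodeJwtPayload(token) {
    const [, payloadB64] = token.split('.');
    return JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf8'));
}

// Fabricated example token: only the payload segment matters here.
const fakePayload = Buffer.from(
    JSON.stringify({ publicKey: 'AB'.repeat(32), iat: 0, exp: 86400 })
).toString('base64url');
const claims = decodeJwtPayload(`eyJhbGciOiJFZERTQSJ9.${fakePayload}.sig`);
console.log(claims.exp - claims.iat); // 86400
```

Note this performs no verification at all; the `verify-token` command (or `verifyAuthToken`) is the authoritative check.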
frontend/lib/meshcore-decoder/dist/cli.js.map (vendored, new file, +1 line)
File diff suppressed because one or more lines are too long
frontend/lib/meshcore-decoder/dist/crypto/channel-crypto.d.ts (vendored, new file, +15 lines)
@@ -0,0 +1,15 @@
import { DecryptionResult } from '../types/crypto';
export declare class ChannelCrypto {
    /**
     * Decrypt GroupText message using MeshCore algorithm:
     * - HMAC-SHA256 verification with 2-byte MAC
     * - AES-128 ECB decryption
     */
    static decryptGroupTextMessage(ciphertext: string, cipherMac: string, channelKey: string): DecryptionResult;
    /**
     * Calculate MeshCore channel hash from secret key
     * Returns the first byte of SHA256(secret) as hex string
     */
    static calculateChannelHash(secretKeyHex: string): string;
}
//# sourceMappingURL=channel-crypto.d.ts.map
frontend/lib/meshcore-decoder/dist/crypto/channel-crypto.d.ts.map (vendored, new file, +1 line)
@@ -0,0 +1 @@
{"version":3,"file":"channel-crypto.d.ts","sourceRoot":"","sources":["../../src/crypto/channel-crypto.ts"],"names":[],"mappings":"AAIA,OAAO,EAAE,gBAAgB,EAAE,MAAM,iBAAiB,CAAC;AAGnD,qBAAa,aAAa;IACxB;;;;OAIG;IACH,MAAM,CAAC,uBAAuB,CAC5B,UAAU,EAAE,MAAM,EAClB,SAAS,EAAE,MAAM,EACjB,UAAU,EAAE,MAAM,GACjB,gBAAgB;IAuFnB;;;OAGG;IACH,MAAM,CAAC,oBAAoB,CAAC,YAAY,EAAE,MAAM,GAAG,MAAM;CAK1D"}
frontend/lib/meshcore-decoder/dist/crypto/channel-crypto.js (vendored, new file, +94 lines)
@@ -0,0 +1,94 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.ChannelCrypto = void 0;
const crypto_js_1 = require("crypto-js");
const hex_1 = require("../utils/hex");
class ChannelCrypto {
    /**
     * Decrypt GroupText message using MeshCore algorithm:
     * - HMAC-SHA256 verification with 2-byte MAC
     * - AES-128 ECB decryption
     */
    static decryptGroupTextMessage(ciphertext, cipherMac, channelKey) {
        try {
            // convert hex strings to byte arrays
            const channelKey16 = (0, hex_1.hexToBytes)(channelKey);
            const macBytes = (0, hex_1.hexToBytes)(cipherMac);
            // MeshCore uses 32-byte channel secret: 16-byte key + 16 zero bytes
            const channelSecret = new Uint8Array(32);
            channelSecret.set(channelKey16, 0);
            // Step 1: Verify HMAC-SHA256 using full 32-byte channel secret
            const calculatedMac = (0, crypto_js_1.HmacSHA256)(crypto_js_1.enc.Hex.parse(ciphertext), crypto_js_1.enc.Hex.parse((0, hex_1.bytesToHex)(channelSecret)));
            const calculatedMacBytes = (0, hex_1.hexToBytes)(calculatedMac.toString(crypto_js_1.enc.Hex));
            const calculatedMacFirst2 = calculatedMacBytes.slice(0, 2);
            if (calculatedMacFirst2[0] !== macBytes[0] || calculatedMacFirst2[1] !== macBytes[1]) {
                return { success: false, error: 'MAC verification failed' };
            }
            // Step 2: Decrypt using AES-128 ECB with first 16 bytes of channel secret
            const keyWords = crypto_js_1.enc.Hex.parse(channelKey);
            const ciphertextWords = crypto_js_1.enc.Hex.parse(ciphertext);
            const decrypted = crypto_js_1.AES.decrypt(crypto_js_1.lib.CipherParams.create({ ciphertext: ciphertextWords }), keyWords, { mode: crypto_js_1.mode.ECB, padding: crypto_js_1.pad.NoPadding });
            const decryptedBytes = (0, hex_1.hexToBytes)(decrypted.toString(crypto_js_1.enc.Hex));
            if (!decryptedBytes || decryptedBytes.length < 5) {
                return { success: false, error: 'Decrypted content too short' };
            }
            // parse MeshCore format: timestamp(4) + flags(1) + message_text
            const timestamp = decryptedBytes[0] |
                (decryptedBytes[1] << 8) |
                (decryptedBytes[2] << 16) |
                (decryptedBytes[3] << 24);
            const flagsAndAttempt = decryptedBytes[4];
            // extract message text with UTF-8 decoding
            const messageBytes = decryptedBytes.slice(5);
            const decoder = new TextDecoder('utf-8');
            let messageText = decoder.decode(messageBytes);
            // remove null terminator if present
            const nullIndex = messageText.indexOf('\0');
            if (nullIndex >= 0) {
                messageText = messageText.substring(0, nullIndex);
            }
            // parse sender and message (format: "sender: message")
            const colonIndex = messageText.indexOf(': ');
            let sender;
            let content;
            if (colonIndex > 0 && colonIndex < 50) {
                const potentialSender = messageText.substring(0, colonIndex);
                if (!/[:\[\]]/.test(potentialSender)) {
                    sender = potentialSender;
                    content = messageText.substring(colonIndex + 2);
                }
                else {
                    content = messageText;
                }
            }
            else {
                content = messageText;
            }
            return {
                success: true,
                data: {
                    timestamp,
                    flags: flagsAndAttempt,
                    sender,
                    message: content
                }
            };
        }
        catch (error) {
            return { success: false, error: error instanceof Error ? error.message : 'Decryption failed' };
        }
    }
    /**
     * Calculate MeshCore channel hash from secret key
     * Returns the first byte of SHA256(secret) as hex string
     */
    static calculateChannelHash(secretKeyHex) {
        const hash = (0, crypto_js_1.SHA256)(crypto_js_1.enc.Hex.parse(secretKeyHex));
        const hashBytes = (0, hex_1.hexToBytes)(hash.toString(crypto_js_1.enc.Hex));
        return hashBytes[0].toString(16).padStart(2, '0');
    }
}
exports.ChannelCrypto = ChannelCrypto;
//# sourceMappingURL=channel-crypto.js.map
frontend/lib/meshcore-decoder/dist/crypto/channel-crypto.js.map (vendored, new file, +1 line)
@@ -0,0 +1 @@
{"version":3,"file":"channel-crypto.js","sourceRoot":"","sources":["../../src/crypto/channel-crypto.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAEd,yCAAyE;AAEzE,sCAAsD;AAEtD,MAAa,aAAa;IACxB;;;;OAIG;IACH,MAAM,CAAC,uBAAuB,CAC5B,UAAkB,EAClB,SAAiB,EACjB,UAAkB;QAElB,IAAI,CAAC;YACH,qCAAqC;YACrC,MAAM,YAAY,GAAG,IAAA,gBAAU,EAAC,UAAU,CAAC,CAAC;YAC5C,MAAM,QAAQ,GAAG,IAAA,gBAAU,EAAC,SAAS,CAAC,CAAC;YAEvC,oEAAoE;YACpE,MAAM,aAAa,GAAG,IAAI,UAAU,CAAC,EAAE,CAAC,CAAC;YACzC,aAAa,CAAC,GAAG,CAAC,YAAY,EAAE,CAAC,CAAC,CAAC;YAEnC,+DAA+D;YAC/D,MAAM,aAAa,GAAG,IAAA,sBAAU,EAAC,eAAG,CAAC,GAAG,CAAC,KAAK,CAAC,UAAU,CAAC,EAAE,eAAG,CAAC,GAAG,CAAC,KAAK,CAAC,IAAA,gBAAU,EAAC,aAAa,CAAC,CAAC,CAAC,CAAC;YACtG,MAAM,kBAAkB,GAAG,IAAA,gBAAU,EAAC,aAAa,CAAC,QAAQ,CAAC,eAAG,CAAC,GAAG,CAAC,CAAC,CAAC;YACvE,MAAM,mBAAmB,GAAG,kBAAkB,CAAC,KAAK,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;YAE3D,IAAI,mBAAmB,CAAC,CAAC,CAAC,KAAK,QAAQ,CAAC,CAAC,CAAC,IAAI,mBAAmB,CAAC,CAAC,CAAC,KAAK,QAAQ,CAAC,CAAC,CAAC,EAAE,CAAC;gBACrF,OAAO,EAAE,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,yBAAyB,EAAE,CAAC;YAC9D,CAAC;YAED,0EAA0E;YAC1E,MAAM,QAAQ,GAAG,eAAG,CAAC,GAAG,CAAC,KAAK,CAAC,UAAU,CAAC,CAAC;YAC3C,MAAM,eAAe,GAAG,eAAG,CAAC,GAAG,CAAC,KAAK,CAAC,UAAU,CAAC,CAAC;YAElD,MAAM,SAAS,GAAG,eAAG,CAAC,OAAO,CAC3B,eAAG,CAAC,YAAY,CAAC,MAAM,CAAC,EAAE,UAAU,EAAE,eAAe,EAAE,CAAC,EACxD,QAAQ,EACR,EAAE,IAAI,EAAE,gBAAI,CAAC,GAAG,EAAE,OAAO,EAAE,eAAG,CAAC,SAAS,EAAE,CAC3C,CAAC;YAEF,MAAM,cAAc,GAAG,IAAA,gBAAU,EAAC,SAAS,CAAC,QAAQ,CAAC,eAAG,CAAC,GAAG,CAAC,CAAC,CAAC;YAE/D,IAAI,CAAC,cAAc,IAAI,cAAc,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACjD,OAAO,EAAE,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,6BAA6B,EAAE,CAAC;YAClE,CAAC;YAED,gEAAgE;YAChE,MAAM,SAAS,GAAG,cAAc,CAAC,CAAC,CAAC;gBAClB,CAAC,cAAc,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC;gBACxB,CAAC,cAAc,CAAC,CAAC,CAAC,IAAI,EAAE,CAAC;gBACzB,CAAC,cAAc,CAAC,CAAC,CAAC,IAAI,EAAE,CAAC,CAAC;YAE3C,MAAM,eAAe,GAAG,cAAc,CAAC,CAAC,CAAC,CAAC;YAE1C,2CAA2C;YAC3C,MAAM,YAAY,GAAG,cAAc,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;YAC7C,MAAM,OAAO,GAAG,IAAI,WAAW,CAAC,OAAO,CAAC,CAAC;YACzC,IAAI,WAAW,GAAG,OAAO,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC;YAE/C,oCAAoC;YACpC,MAAM,SAAS,GAAG,WAAW,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;YAC5C,IAAI,SAAS,IAAI,CAAC,EAAE,CAAC;gBACnB,WAAW,GAAG,WAAW,CAAC,SAAS,CAAC,CAAC,EAAE,SAAS,CAAC,CAAC;YACpD,CAAC;YAED,uDAAuD;YACvD,MAAM,UAAU,GAAG,WAAW,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;YAC7C,IAAI,MAA0B,CAAC;YAC/B,IAAI,OAAe,CAAC;YAEpB,IAAI,UAAU,GAAG,CAAC,IAAI,UAAU,GAAG,EAAE,EAAE,CAAC;gBACtC,MAAM,eAAe,GAAG,WAAW,CAAC,SAAS,CAAC,CAAC,EAAE,UAAU,CAAC,CAAC;gBAC7D,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,eAAe,CAAC,EAAE,CAAC;oBACrC,MAAM,GAAG,eAAe,CAAC;oBACzB,OAAO,GAAG,WAAW,CAAC,SAAS,CAAC,UAAU,GAAG,CAAC,CAAC,CAAC;gBAClD,CAAC;qBAAM,CAAC;oBACN,OAAO,GAAG,WAAW,CAAC;gBACxB,CAAC;YACH,CAAC;iBAAM,CAAC;gBACN,OAAO,GAAG,WAAW,CAAC;YACxB,CAAC;YAED,OAAO;gBACL,OAAO,EAAE,IAAI;gBACb,IAAI,EAAE;oBACJ,SAAS;oBACT,KAAK,EAAE,eAAe;oBACtB,MAAM;oBACN,OAAO,EAAE,OAAO;iBACjB;aACF,CAAC;QACJ,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO,EAAE,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,mBAAmB,EAAE,CAAC;QACjG,CAAC;IACH,CAAC;IAID;;;OAGG;IACH,MAAM,CAAC,oBAAoB,CAAC,YAAoB;QAC9C,MAAM,IAAI,GAAG,IAAA,kBAAM,EAAC,eAAG,CAAC,GAAG,CAAC,KAAK,CAAC,YAAY,CAAC,CAAC,CAAC;QACjD,MAAM,SAAS,GAAG,IAAA,gBAAU,EAAC,IAAI,CAAC,QAAQ,CAAC,eAAG,CAAC,GAAG,CAAC,CAAC,CAAC;QACrD,OAAO,SAAS,CAAC,CAAC,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,QAAQ,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC;IACpD,CAAC;CACF;AA1GD,sCA0GC"}
frontend/lib/meshcore-decoder/dist/crypto/ed25519-verifier.d.ts (vendored, new file, +48 lines)
@@ -0,0 +1,48 @@
export declare class Ed25519SignatureVerifier {
    /**
     * Verify an Ed25519 signature for MeshCore advertisement packets
     *
     * According to MeshCore protocol, the signed message for advertisements is:
     * timestamp (4 bytes LE) + flags (1 byte) + location (8 bytes LE, if present) + name (variable, if present)
     */
    static verifyAdvertisementSignature(publicKeyHex: string, signatureHex: string, timestamp: number, appDataHex: string): Promise<boolean>;
    /**
     * Construct the signed message for MeshCore advertisements
     * According to MeshCore source (Mesh.cpp lines 242-248):
     * Format: public_key (32 bytes) + timestamp (4 bytes LE) + app_data (variable length)
     */
    private static constructAdvertSignedMessage;
    /**
     * Get a human-readable description of what was signed
     */
    static getSignedMessageDescription(publicKeyHex: string, timestamp: number, appDataHex: string): string;
    /**
     * Get the hex representation of the signed message for debugging
     */
    static getSignedMessageHex(publicKeyHex: string, timestamp: number, appDataHex: string): string;
    /**
     * Derive Ed25519 public key from orlp/ed25519 private key format
     * This implements the same algorithm as orlp/ed25519's ed25519_derive_pub()
     *
     * @param privateKeyHex - 64-byte private key in hex format (orlp/ed25519 format)
     * @returns 32-byte public key in hex format
     */
    static derivePublicKey(privateKeyHex: string): Promise<string>;
    /**
     * Derive Ed25519 public key from orlp/ed25519 private key format (synchronous version)
     * This implements the same algorithm as orlp/ed25519's ed25519_derive_pub()
     *
     * @param privateKeyHex - 64-byte private key in hex format (orlp/ed25519 format)
     * @returns 32-byte public key in hex format
     */
    static derivePublicKeySync(privateKeyHex: string): string;
    /**
     * Validate that a private key correctly derives to the expected public key
     *
     * @param privateKeyHex - 64-byte private key in hex format
     * @param expectedPublicKeyHex - Expected 32-byte public key in hex format
     * @returns true if the private key derives to the expected public key
     */
    static validateKeyPair(privateKeyHex: string, expectedPublicKeyHex: string): Promise<boolean>;
}
//# sourceMappingURL=ed25519-verifier.d.ts.map
frontend/lib/meshcore-decoder/dist/crypto/ed25519-verifier.d.ts.map (vendored, new file, +1 line)
@@ -0,0 +1 @@
{"version":3,"file":"ed25519-verifier.d.ts","sourceRoot":"","sources":["../../src/crypto/ed25519-verifier.ts"],"names":[],"mappings":"AAyEA,qBAAa,wBAAwB;IACnC;;;;;OAKG;WACU,4BAA4B,CACvC,YAAY,EAAE,MAAM,EACpB,YAAY,EAAE,MAAM,EACpB,SAAS,EAAE,MAAM,EACjB,UAAU,EAAE,MAAM,GACjB,OAAO,CAAC,OAAO,CAAC;IAkBnB;;;;OAIG;IACH,OAAO,CAAC,MAAM,CAAC,4BAA4B;IAuB3C;;OAEG;IACH,MAAM,CAAC,2BAA2B,CAChC,YAAY,EAAE,MAAM,EACpB,SAAS,EAAE,MAAM,EACjB,UAAU,EAAE,MAAM,GACjB,MAAM;IAIT;;OAEG;IACH,MAAM,CAAC,mBAAmB,CACxB,YAAY,EAAE,MAAM,EACpB,SAAS,EAAE,MAAM,EACjB,UAAU,EAAE,MAAM,GACjB,MAAM;IAMT;;;;;;OAMG;WACU,eAAe,CAAC,aAAa,EAAE,MAAM,GAAG,OAAO,CAAC,MAAM,CAAC;IAepE;;;;;;OAMG;IACH,MAAM,CAAC,mBAAmB,CAAC,aAAa,EAAE,MAAM,GAAG,MAAM;IAezD;;;;;;OAMG;WACU,eAAe,CAAC,aAAa,EAAE,MAAM,EAAE,oBAAoB,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,CAAC;CAQpG"}
frontend/lib/meshcore-decoder/dist/crypto/ed25519-verifier.js (vendored, new file, +217 lines)
@@ -0,0 +1,217 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
    var ownKeys = function(o) {
        ownKeys = Object.getOwnPropertyNames || function (o) {
            var ar = [];
            for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
            return ar;
        };
        return ownKeys(o);
    };
    return function (mod) {
        if (mod && mod.__esModule) return mod;
        var result = {};
        if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
        __setModuleDefault(result, mod);
        return result;
    };
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.Ed25519SignatureVerifier = void 0;
const ed25519 = __importStar(require("@noble/ed25519"));
const hex_1 = require("../utils/hex");
const orlp_ed25519_wasm_1 = require("./orlp-ed25519-wasm");
// Cross-platform SHA-512 implementation
async function sha512Hash(data) {
    // Browser environment - use Web Crypto API
    if (typeof globalThis !== 'undefined' && globalThis.crypto && globalThis.crypto.subtle) {
        const hashBuffer = await globalThis.crypto.subtle.digest('SHA-512', data);
        return new Uint8Array(hashBuffer);
    }
    // Node.js environment - use crypto module
    if (typeof require !== 'undefined') {
        try {
            const { createHash } = require('crypto');
            return createHash('sha512').update(data).digest();
        }
        catch (error) {
            // Fallback for environments where require is not available
        }
    }
    throw new Error('No SHA-512 implementation available');
}
function sha512HashSync(data) {
    // Node.js environment - use crypto module
    if (typeof require !== 'undefined') {
        try {
            const { createHash } = require('crypto');
            return createHash('sha512').update(data).digest();
        }
        catch (error) {
            // Fallback
        }
    }
    // Browser environment fallback - use crypto-js for sync operation
    try {
        const CryptoJS = require('crypto-js');
        const wordArray = CryptoJS.lib.WordArray.create(data);
        const hash = CryptoJS.SHA512(wordArray);
        const hashBytes = new Uint8Array(64);
        // Convert CryptoJS hash to Uint8Array
        for (let i = 0; i < 16; i++) {
            const word = hash.words[i] || 0;
            hashBytes[i * 4] = (word >>> 24) & 0xff;
            hashBytes[i * 4 + 1] = (word >>> 16) & 0xff;
            hashBytes[i * 4 + 2] = (word >>> 8) & 0xff;
            hashBytes[i * 4 + 3] = word & 0xff;
        }
        return hashBytes;
    }
    catch (error) {
        // Final fallback - this should not happen since crypto-js is a dependency
        throw new Error('No SHA-512 implementation available for synchronous operation');
    }
}
// Set up SHA-512 for @noble/ed25519
ed25519.etc.sha512Async = sha512Hash;
// Always set up sync version - @noble/ed25519 requires it
// It will throw in browser environments, which @noble/ed25519 can handle
try {
    ed25519.etc.sha512Sync = sha512HashSync;
}
catch (error) {
    console.debug('Could not set up synchronous SHA-512:', error);
}
class Ed25519SignatureVerifier {
    /**
     * Verify an Ed25519 signature for MeshCore advertisement packets
     *
     * According to MeshCore protocol, the signed message for advertisements is:
     * timestamp (4 bytes LE) + flags (1 byte) + location (8 bytes LE, if present) + name (variable, if present)
     */
    static async verifyAdvertisementSignature(publicKeyHex, signatureHex, timestamp, appDataHex) {
        try {
            // Convert hex strings to Uint8Arrays
            const publicKey = (0, hex_1.hexToBytes)(publicKeyHex);
            const signature = (0, hex_1.hexToBytes)(signatureHex);
            const appData = (0, hex_1.hexToBytes)(appDataHex);
            // Construct the signed message according to MeshCore format
            const message = this.constructAdvertSignedMessage(publicKeyHex, timestamp, appData);
            // Verify the signature using noble-ed25519
            return await ed25519.verify(signature, message, publicKey);
        }
        catch (error) {
            console.error('Ed25519 signature verification failed:', error);
            return false;
        }
    }
    /**
     * Construct the signed message for MeshCore advertisements
     * According to MeshCore source (Mesh.cpp lines 242-248):
     * Format: public_key (32 bytes) + timestamp (4 bytes LE) + app_data (variable length)
     */
    static constructAdvertSignedMessage(publicKeyHex, timestamp, appData) {
        const publicKey = (0, hex_1.hexToBytes)(publicKeyHex);
        // Timestamp (4 bytes, little-endian)
        const timestampBytes = new Uint8Array(4);
        timestampBytes[0] = timestamp & 0xFF;
        timestampBytes[1] = (timestamp >> 8) & 0xFF;
        timestampBytes[2] = (timestamp >> 16) & 0xFF;
        timestampBytes[3] = (timestamp >> 24) & 0xFF;
        // Concatenate: public_key + timestamp + app_data
        const message = new Uint8Array(32 + 4 + appData.length);
        message.set(publicKey, 0);
        message.set(timestampBytes, 32);
        message.set(appData, 36);
        return message;
    }
    /**
     * Get a human-readable description of what was signed
     */
    static getSignedMessageDescription(publicKeyHex, timestamp, appDataHex) {
        return `Public Key: ${publicKeyHex} + Timestamp: ${timestamp} (${new Date(timestamp * 1000).toISOString()}) + App Data: ${appDataHex}`;
    }
    /**
     * Get the hex representation of the signed message for debugging
     */
    static getSignedMessageHex(publicKeyHex, timestamp, appDataHex) {
        const appData = (0, hex_1.hexToBytes)(appDataHex);
        const message = this.constructAdvertSignedMessage(publicKeyHex, timestamp, appData);
        return (0, hex_1.bytesToHex)(message);
    }
    /**
     * Derive Ed25519 public key from orlp/ed25519 private key format
     * This implements the same algorithm as orlp/ed25519's ed25519_derive_pub()
     *
     * @param privateKeyHex - 64-byte private key in hex format (orlp/ed25519 format)
     * @returns 32-byte public key in hex format
     */
    static async derivePublicKey(privateKeyHex) {
        try {
            const privateKeyBytes = (0, hex_1.hexToBytes)(privateKeyHex);
            if (privateKeyBytes.length !== 64) {
                throw new Error(`Invalid private key length: expected 64 bytes, got ${privateKeyBytes.length}`);
            }
            // Use the orlp/ed25519 WebAssembly implementation
            return await (0, orlp_ed25519_wasm_1.derivePublicKey)(privateKeyHex);
        }
        catch (error) {
            throw new Error(`Failed to derive public key: ${error instanceof Error ? error.message : 'Unknown error'}`);
        }
    }
    /**
     * Derive Ed25519 public key from orlp/ed25519 private key format (synchronous version)
     * This implements the same algorithm as orlp/ed25519's ed25519_derive_pub()
     *
     * @param privateKeyHex - 64-byte private key in hex format (orlp/ed25519 format)
     * @returns 32-byte public key in hex format
     */
    static derivePublicKeySync(privateKeyHex) {
        try {
            const privateKeyBytes = (0, hex_1.hexToBytes)(privateKeyHex);
            if (privateKeyBytes.length !== 64) {
                throw new Error(`Invalid private key length: expected 64 bytes, got ${privateKeyBytes.length}`);
            }
            // Note: WASM operations are async, so this sync version throws an error
            throw new Error('Synchronous key derivation not supported with WASM. Use derivePublicKey() instead.');
        }
        catch (error) {
            throw new Error(`Failed to derive public key: ${error instanceof Error ? error.message : 'Unknown error'}`);
        }
    }
    /**
     * Validate that a private key correctly derives to the expected public key
     *
     * @param privateKeyHex - 64-byte private key in hex format
     * @param expectedPublicKeyHex - Expected 32-byte public key in hex format
     * @returns true if the private key derives to the expected public key
     */
    static async validateKeyPair(privateKeyHex, expectedPublicKeyHex) {
        try {
            return await (0, orlp_ed25519_wasm_1.validateKeyPair)(privateKeyHex, expectedPublicKeyHex);
        }
        catch (error) {
            return false;
        }
    }
}
exports.Ed25519SignatureVerifier = Ed25519SignatureVerifier;
//# sourceMappingURL=ed25519-verifier.js.map
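`constructAdvertSignedMessage` above concatenates public_key (32 bytes), timestamp (4 bytes little-endian), and app_data. The byte layout can be sketched independently with a `DataView`; the values below are illustrative only:

```javascript
// Build the byte string that MeshCore signs for an advertisement:
// public_key (32) || timestamp (4, little-endian) || app_data.
function buildAdvertSignedMessage(publicKey, timestamp, appData) {
    const msg = new Uint8Array(32 + 4 + appData.length);
    msg.set(publicKey, 0);
    new DataView(msg.buffer).setUint32(32, timestamp, true); // true = little-endian
    msg.set(appData, 36);
    return msg;
}

const msg = buildAdvertSignedMessage(new Uint8Array(32).fill(0xAA), 0x01020304, Uint8Array.of(0x90));
console.log(msg.length); // 37
console.log([...msg.slice(32, 36)]); // [4, 3, 2, 1] - least-significant byte first
```

The little-endian order matters: signing the same fields with a big-endian timestamp produces a different message and the signature check fails.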
frontend/lib/meshcore-decoder/dist/crypto/ed25519-verifier.js.map (vendored, new file, +1 line)
File diff suppressed because one or more lines are too long
frontend/lib/meshcore-decoder/dist/crypto/key-manager.d.ts (vendored, new file, +23 lines)
@@ -0,0 +1,23 @@
import { CryptoKeyStore } from '../types/crypto';
export declare class MeshCoreKeyStore implements CryptoKeyStore {
    nodeKeys: Map<string, string>;
    private channelHashToKeys;
    constructor(initialKeys?: {
        channelSecrets?: string[];
        nodeKeys?: Record<string, string>;
    });
    addNodeKey(publicKey: string, privateKey: string): void;
    hasChannelKey(channelHash: string): boolean;
    hasNodeKey(publicKey: string): boolean;
    /**
     * Get all channel keys that match the given channel hash (handles collisions)
     */
    getChannelKeys(channelHash: string): string[];
    getNodeKey(publicKey: string): string | undefined;
    /**
     * Add channel keys by secret keys (new simplified API)
     * Automatically calculates channel hashes
     */
    addChannelSecrets(secretKeys: string[]): void;
}
//# sourceMappingURL=key-manager.d.ts.map
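Because the channel hash is a single byte (256 possible values), distinct channel secrets can share a hash, which is why `getChannelKeys` returns an array rather than a single key. The hash-to-keys bookkeeping amounts to a multimap; a minimal sketch with invented placeholder secrets:

```javascript
// Minimal hash -> [secrets] multimap, mirroring the collision handling
// that MeshCoreKeyStore declares above. The keys are made-up examples.
const channelHashToKeys = new Map();

function addChannelKey(hash, secretHex) {
    const list = channelHashToKeys.get(hash) ?? [];
    list.push(secretHex);
    channelHashToKeys.set(hash, list);
}

addChannelKey('11', 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa');
addChannelKey('11', 'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb'); // same one-byte hash: collision
console.log(channelHashToKeys.get('11').length); // 2
```

A decoder holding both secrets would try each candidate in turn and let the 2-byte MAC check decide which one actually decrypts the message.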
frontend/lib/meshcore-decoder/dist/crypto/key-manager.d.ts.map (vendored, new file, +1 line)
@@ -0,0 +1 @@
{"version":3,"file":"key-manager.d.ts","sourceRoot":"","sources":["../../src/crypto/key-manager.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,cAAc,EAAE,MAAM,iBAAiB,CAAC;AAGjD,qBAAa,gBAAiB,YAAW,cAAc;IAC9C,QAAQ,EAAE,GAAG,CAAC,MAAM,EAAE,MAAM,CAAC,CAAa;IAGjD,OAAO,CAAC,iBAAiB,CAA+B;gBAE5C,WAAW,CAAC,EAAE;QACxB,cAAc,CAAC,EAAE,MAAM,EAAE,CAAC;QAC1B,QAAQ,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;KACnC;IAYD,UAAU,CAAC,SAAS,EAAE,MAAM,EAAE,UAAU,EAAE,MAAM,GAAG,IAAI;IAKvD,aAAa,CAAC,WAAW,EAAE,MAAM,GAAG,OAAO;IAK3C,UAAU,CAAC,SAAS,EAAE,MAAM,GAAG,OAAO;IAKtC;;OAEG;IACH,cAAc,CAAC,WAAW,EAAE,MAAM,GAAG,MAAM,EAAE;IAK7C,UAAU,CAAC,SAAS,EAAE,MAAM,GAAG,MAAM,GAAG,SAAS;IAKjD;;;OAGG;IACH,iBAAiB,CAAC,UAAU,EAAE,MAAM,EAAE,GAAG,IAAI;CAW9C"}
frontend/lib/meshcore-decoder/dist/crypto/key-manager.js (vendored, new file, 60 lines)
@@ -0,0 +1,60 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.MeshCoreKeyStore = void 0;
const channel_crypto_1 = require("./channel-crypto");
class MeshCoreKeyStore {
    constructor(initialKeys) {
        this.nodeKeys = new Map();
        // internal map for hash -> multiple keys (collision handling)
        this.channelHashToKeys = new Map();
        if (initialKeys?.channelSecrets) {
            this.addChannelSecrets(initialKeys.channelSecrets);
        }
        if (initialKeys?.nodeKeys) {
            Object.entries(initialKeys.nodeKeys).forEach(([pubKey, privKey]) => {
                this.addNodeKey(pubKey, privKey);
            });
        }
    }
    addNodeKey(publicKey, privateKey) {
        const normalizedPubKey = publicKey.toUpperCase();
        this.nodeKeys.set(normalizedPubKey, privateKey);
    }
    hasChannelKey(channelHash) {
        const normalizedHash = channelHash.toLowerCase();
        return this.channelHashToKeys.has(normalizedHash);
    }
    hasNodeKey(publicKey) {
        const normalizedPubKey = publicKey.toUpperCase();
        return this.nodeKeys.has(normalizedPubKey);
    }
    /**
     * Get all channel keys that match the given channel hash (handles collisions)
     */
    getChannelKeys(channelHash) {
        const normalizedHash = channelHash.toLowerCase();
        return this.channelHashToKeys.get(normalizedHash) || [];
    }
    getNodeKey(publicKey) {
        const normalizedPubKey = publicKey.toUpperCase();
        return this.nodeKeys.get(normalizedPubKey);
    }
    /**
     * Add channel keys by secret keys (new simplified API)
     * Automatically calculates channel hashes
     */
    addChannelSecrets(secretKeys) {
        for (const secretKey of secretKeys) {
            const channelHash = channel_crypto_1.ChannelCrypto.calculateChannelHash(secretKey).toLowerCase();
            // Handle potential hash collisions
            if (!this.channelHashToKeys.has(channelHash)) {
                this.channelHashToKeys.set(channelHash, []);
            }
            this.channelHashToKeys.get(channelHash).push(secretKey);
        }
    }
}
exports.MeshCoreKeyStore = MeshCoreKeyStore;
//# sourceMappingURL=key-manager.js.map
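The key store above maps one channel hash to possibly several secrets, since short hashes can collide. The following is a minimal standalone sketch of that collision-tolerant mapping; `fakeHash` is a hypothetical stand-in for `ChannelCrypto.calculateChannelHash`, which is not reproduced here.

```javascript
// Sketch of the hash -> [keys] collision handling used by MeshCoreKeyStore.
// `fakeHash` is an illustrative stand-in, NOT the real channel-hash function.
const fakeHash = (secret) => secret.slice(0, 2).toLowerCase();

class ChannelKeyMap {
  constructor() {
    this.hashToKeys = new Map();
  }
  addSecrets(secrets) {
    for (const secret of secrets) {
      const hash = fakeHash(secret);
      if (!this.hashToKeys.has(hash)) {
        this.hashToKeys.set(hash, []);
      }
      this.hashToKeys.get(hash).push(secret); // colliding secrets accumulate
    }
  }
  getKeys(hash) {
    return this.hashToKeys.get(hash.toLowerCase()) || [];
  }
}

const map = new ChannelKeyMap();
map.addSecrets(['AB1111', 'AB2222', 'CD3333']); // the two 'AB…' secrets collide
console.log(map.getKeys('AB')); // → ['AB1111', 'AB2222']
```

A decryptor can then try every key returned for a hash until one authenticates, which is why `getChannelKeys` returns an array rather than a single key.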
frontend/lib/meshcore-decoder/dist/crypto/key-manager.js.map (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"version":3,"file":"key-manager.js","sourceRoot":"","sources":["../../src/crypto/key-manager.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAGd,qDAAiD;AAEjD,MAAa,gBAAgB;IAM3B,YAAY,WAGX;QARM,aAAQ,GAAwB,IAAI,GAAG,EAAE,CAAC;QAEjD,8DAA8D;QACtD,sBAAiB,GAAG,IAAI,GAAG,EAAoB,CAAC;QAMtD,IAAI,WAAW,EAAE,cAAc,EAAE,CAAC;YAChC,IAAI,CAAC,iBAAiB,CAAC,WAAW,CAAC,cAAc,CAAC,CAAC;QACrD,CAAC;QAED,IAAI,WAAW,EAAE,QAAQ,EAAE,CAAC;YAC1B,MAAM,CAAC,OAAO,CAAC,WAAW,CAAC,QAAQ,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,MAAM,EAAE,OAAO,CAAC,EAAE,EAAE;gBACjE,IAAI,CAAC,UAAU,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;YACnC,CAAC,CAAC,CAAC;QACL,CAAC;IACH,CAAC;IAED,UAAU,CAAC,SAAiB,EAAE,UAAkB;QAC9C,MAAM,gBAAgB,GAAG,SAAS,CAAC,WAAW,EAAE,CAAC;QACjD,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,gBAAgB,EAAE,UAAU,CAAC,CAAC;IAClD,CAAC;IAED,aAAa,CAAC,WAAmB;QAC/B,MAAM,cAAc,GAAG,WAAW,CAAC,WAAW,EAAE,CAAC;QACjD,OAAO,IAAI,CAAC,iBAAiB,CAAC,GAAG,CAAC,cAAc,CAAC,CAAC;IACpD,CAAC;IAED,UAAU,CAAC,SAAiB;QAC1B,MAAM,gBAAgB,GAAG,SAAS,CAAC,WAAW,EAAE,CAAC;QACjD,OAAO,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,gBAAgB,CAAC,CAAC;IAC7C,CAAC;IAED;;OAEG;IACH,cAAc,CAAC,WAAmB;QAChC,MAAM,cAAc,GAAG,WAAW,CAAC,WAAW,EAAE,CAAC;QACjD,OAAO,IAAI,CAAC,iBAAiB,CAAC,GAAG,CAAC,cAAc,CAAC,IAAI,EAAE,CAAC;IAC1D,CAAC;IAED,UAAU,CAAC,SAAiB;QAC1B,MAAM,gBAAgB,GAAG,SAAS,CAAC,WAAW,EAAE,CAAC;QACjD,OAAO,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,gBAAgB,CAAC,CAAC;IAC7C,CAAC;IAED;;;OAGG;IACH,iBAAiB,CAAC,UAAoB;QACpC,KAAK,MAAM,SAAS,IAAI,UAAU,EAAE,CAAC;YACnC,MAAM,WAAW,GAAG,8BAAa,CAAC,oBAAoB,CAAC,SAAS,CAAC,CAAC,WAAW,EAAE,CAAC;YAEhF,mCAAmC;YACnC,IAAI,CAAC,IAAI,CAAC,iBAAiB,CAAC,GAAG,CAAC,WAAW,CAAC,EAAE,CAAC;gBAC7C,IAAI,CAAC,iBAAiB,CAAC,GAAG,CAAC,WAAW,EAAE,EAAE,CAAC,CAAC;YAC9C,CAAC;YACD,IAAI,CAAC,iBAAiB,CAAC,GAAG,CAAC,WAAW,CAAE,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;QAC3D,CAAC;IACH,CAAC;CACF;AAhED,4CAgEC"}
frontend/lib/meshcore-decoder/dist/crypto/orlp-ed25519-wasm.d.ts (vendored, new file, 34 lines)
@@ -0,0 +1,34 @@
/**
 * Derive Ed25519 public key from private key using the exact orlp/ed25519 algorithm
 *
 * @param privateKeyHex - 64-byte private key in hex format (orlp/ed25519 format)
 * @returns 32-byte public key in hex format
 */
export declare function derivePublicKey(privateKeyHex: string): Promise<string>;
/**
 * Validate that a private key and public key pair match using orlp/ed25519
 *
 * @param privateKeyHex - 64-byte private key in hex format
 * @param expectedPublicKeyHex - 32-byte public key in hex format
 * @returns true if the keys match, false otherwise
 */
export declare function validateKeyPair(privateKeyHex: string, expectedPublicKeyHex: string): Promise<boolean>;
/**
 * Sign a message using Ed25519 with orlp/ed25519 implementation
 *
 * @param messageHex - Message to sign in hex format
 * @param privateKeyHex - 64-byte private key in hex format (orlp/ed25519 format)
 * @param publicKeyHex - 32-byte public key in hex format
 * @returns 64-byte signature in hex format
 */
export declare function sign(messageHex: string, privateKeyHex: string, publicKeyHex: string): Promise<string>;
/**
 * Verify an Ed25519 signature using orlp/ed25519 implementation
 *
 * @param signatureHex - 64-byte signature in hex format
 * @param messageHex - Message that was signed in hex format
 * @param publicKeyHex - 32-byte public key in hex format
 * @returns true if signature is valid, false otherwise
 */
export declare function verify(signatureHex: string, messageHex: string, publicKeyHex: string): Promise<boolean>;
//# sourceMappingURL=orlp-ed25519-wasm.d.ts.map
frontend/lib/meshcore-decoder/dist/crypto/orlp-ed25519-wasm.d.ts.map (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"version":3,"file":"orlp-ed25519-wasm.d.ts","sourceRoot":"","sources":["../../src/crypto/orlp-ed25519-wasm.ts"],"names":[],"mappings":"AAgBA;;;;;GAKG;AACH,wBAAsB,eAAe,CAAC,aAAa,EAAE,MAAM,GAAG,OAAO,CAAC,MAAM,CAAC,CAiC5E;AAED;;;;;;GAMG;AACH,wBAAsB,eAAe,CAAC,aAAa,EAAE,MAAM,EAAE,oBAAoB,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,CAAC,CAoC3G;AAED;;;;;;;GAOG;AACH,wBAAsB,IAAI,CAAC,UAAU,EAAE,MAAM,EAAE,aAAa,EAAE,MAAM,EAAE,YAAY,EAAE,MAAM,GAAG,OAAO,CAAC,MAAM,CAAC,CAuC3G;AAED;;;;;;;GAOG;AACH,wBAAsB,MAAM,CAAC,YAAY,EAAE,MAAM,EAAE,UAAU,EAAE,MAAM,EAAE,YAAY,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,CAAC,CAsC7G"}
frontend/lib/meshcore-decoder/dist/crypto/orlp-ed25519-wasm.js (vendored, new file, 150 lines)
@@ -0,0 +1,150 @@
"use strict";
// WebAssembly wrapper for orlp/ed25519 key derivation
// This provides the exact orlp algorithm for JavaScript
Object.defineProperty(exports, "__esModule", { value: true });
exports.derivePublicKey = derivePublicKey;
exports.validateKeyPair = validateKeyPair;
exports.sign = sign;
exports.verify = verify;
const hex_1 = require("../utils/hex");
// Import the generated WASM module
const OrlpEd25519 = require('../../lib/orlp-ed25519.js');
/**
 * Get a fresh WASM instance
 * Loads a fresh instance each time because the WASM module could behave unpredictably otherwise
 */
async function getWasmInstance() {
    return await OrlpEd25519();
}
/**
 * Derive Ed25519 public key from private key using the exact orlp/ed25519 algorithm
 *
 * @param privateKeyHex - 64-byte private key in hex format (orlp/ed25519 format)
 * @returns 32-byte public key in hex format
 */
async function derivePublicKey(privateKeyHex) {
    const wasmModule = await getWasmInstance();
    const privateKeyBytes = (0, hex_1.hexToBytes)(privateKeyHex);
    if (privateKeyBytes.length !== 64) {
        throw new Error(`Invalid private key length: expected 64 bytes, got ${privateKeyBytes.length}`);
    }
    // Allocate memory buffers directly in WASM heap
    const privateKeyPtr = 1024; // Use fixed memory locations
    const publicKeyPtr = 1024 + 64;
    // Copy private key to WASM memory
    wasmModule.HEAPU8.set(privateKeyBytes, privateKeyPtr);
    // Call the orlp key derivation function
    const result = wasmModule.ccall('orlp_derive_public_key', 'number', ['number', 'number'], [publicKeyPtr, privateKeyPtr]);
    if (result !== 0) {
        throw new Error('orlp key derivation failed: invalid private key');
    }
    // Read the public key from WASM memory
    const publicKeyBytes = new Uint8Array(32);
    publicKeyBytes.set(wasmModule.HEAPU8.subarray(publicKeyPtr, publicKeyPtr + 32));
    return (0, hex_1.bytesToHex)(publicKeyBytes);
}
/**
 * Validate that a private key and public key pair match using orlp/ed25519
 *
 * @param privateKeyHex - 64-byte private key in hex format
 * @param expectedPublicKeyHex - 32-byte public key in hex format
 * @returns true if the keys match, false otherwise
 */
async function validateKeyPair(privateKeyHex, expectedPublicKeyHex) {
    try {
        const wasmModule = await getWasmInstance();
        const privateKeyBytes = (0, hex_1.hexToBytes)(privateKeyHex);
        const expectedPublicKeyBytes = (0, hex_1.hexToBytes)(expectedPublicKeyHex);
        if (privateKeyBytes.length !== 64) {
            return false;
        }
        if (expectedPublicKeyBytes.length !== 32) {
            return false;
        }
        // Allocate memory buffers directly in WASM heap
        const privateKeyPtr = 2048; // Use different fixed memory locations
        const publicKeyPtr = 2048 + 64;
        // Copy keys to WASM memory
        wasmModule.HEAPU8.set(privateKeyBytes, privateKeyPtr);
        wasmModule.HEAPU8.set(expectedPublicKeyBytes, publicKeyPtr);
        // Call the validation function (note: C function expects public_key first, then private_key)
        const result = wasmModule.ccall('orlp_validate_keypair', 'number', ['number', 'number'], [publicKeyPtr, privateKeyPtr]);
        return result === 1;
    }
    catch (error) {
        // Invalid hex strings or other errors should return false
        return false;
    }
}
/**
 * Sign a message using Ed25519 with orlp/ed25519 implementation
 *
 * @param messageHex - Message to sign in hex format
 * @param privateKeyHex - 64-byte private key in hex format (orlp/ed25519 format)
 * @param publicKeyHex - 32-byte public key in hex format
 * @returns 64-byte signature in hex format
 */
async function sign(messageHex, privateKeyHex, publicKeyHex) {
    const wasmModule = await getWasmInstance();
    const messageBytes = (0, hex_1.hexToBytes)(messageHex);
    const privateKeyBytes = (0, hex_1.hexToBytes)(privateKeyHex);
    const publicKeyBytes = (0, hex_1.hexToBytes)(publicKeyHex);
    if (privateKeyBytes.length !== 64) {
        throw new Error(`Invalid private key length: expected 64 bytes, got ${privateKeyBytes.length}`);
    }
    if (publicKeyBytes.length !== 32) {
        throw new Error(`Invalid public key length: expected 32 bytes, got ${publicKeyBytes.length}`);
    }
    // Allocate memory buffers with large gaps to avoid conflicts with scratch space
    const messagePtr = 100000;
    const privateKeyPtr = 200000;
    const publicKeyPtr = 300000;
    const signaturePtr = 400000;
    // Copy data to WASM memory
    wasmModule.HEAPU8.set(messageBytes, messagePtr);
    wasmModule.HEAPU8.set(privateKeyBytes, privateKeyPtr);
    wasmModule.HEAPU8.set(publicKeyBytes, publicKeyPtr);
    // Call orlp_sign
    wasmModule.ccall('orlp_sign', 'void', ['number', 'number', 'number', 'number', 'number'], [signaturePtr, messagePtr, messageBytes.length, publicKeyPtr, privateKeyPtr]);
    // Read signature
    const signatureBytes = new Uint8Array(64);
    signatureBytes.set(wasmModule.HEAPU8.subarray(signaturePtr, signaturePtr + 64));
    return (0, hex_1.bytesToHex)(signatureBytes);
}
/**
 * Verify an Ed25519 signature using orlp/ed25519 implementation
 *
 * @param signatureHex - 64-byte signature in hex format
 * @param messageHex - Message that was signed in hex format
 * @param publicKeyHex - 32-byte public key in hex format
 * @returns true if signature is valid, false otherwise
 */
async function verify(signatureHex, messageHex, publicKeyHex) {
    try {
        const wasmModule = await getWasmInstance();
        const signatureBytes = (0, hex_1.hexToBytes)(signatureHex);
        const messageBytes = (0, hex_1.hexToBytes)(messageHex);
        const publicKeyBytes = (0, hex_1.hexToBytes)(publicKeyHex);
        if (signatureBytes.length !== 64) {
            return false;
        }
        if (publicKeyBytes.length !== 32) {
            return false;
        }
        // Allocate memory buffers with large gaps to avoid conflicts with scratch space
        const messagePtr = 500000;
        const signaturePtr = 600000;
        const publicKeyPtr = 700000;
        // Copy data to WASM memory
        wasmModule.HEAPU8.set(signatureBytes, signaturePtr);
        wasmModule.HEAPU8.set(messageBytes, messagePtr);
        wasmModule.HEAPU8.set(publicKeyBytes, publicKeyPtr);
        // Call the orlp verify function
        const result = wasmModule.ccall('orlp_verify', 'number', ['number', 'number', 'number', 'number'], [signaturePtr, messagePtr, messageBytes.length, publicKeyPtr]);
        return result === 1;
    }
    catch (error) {
        return false;
    }
}
//# sourceMappingURL=orlp-ed25519-wasm.js.map
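The wrapper above follows the usual Emscripten convention: the module's linear memory is exposed as one big `Uint8Array` (`HEAPU8`), arguments are copied in at agreed byte offsets, and results are copied back out before the heap is reused. This standalone sketch imitates that copy-in/copy-out pattern with a plain `Uint8Array` standing in for the WASM heap; the "result write" is simulated, since no real WASM call happens here.

```javascript
// Stand-in for wasmModule.HEAPU8: a plain linear byte buffer.
const heap = new Uint8Array(2048);

const privateKeyPtr = 1024;      // fixed location, as in derivePublicKey above
const publicKeyPtr = 1024 + 64;  // result buffer placed directly after it

// Copy the argument into the "heap" at its agreed offset.
const privateKey = new Uint8Array(64).fill(0xAA);
heap.set(privateKey, privateKeyPtr);

// A real call would run here (ccall('orlp_derive_public_key', ...));
// simulate the module writing a 32-byte result at publicKeyPtr.
heap.fill(0x5F, publicKeyPtr, publicKeyPtr + 32);

// Copy the result OUT of the heap so it survives later heap reuse;
// subarray() alone would alias the heap, so set() makes a real copy.
const publicKey = new Uint8Array(32);
publicKey.set(heap.subarray(publicKeyPtr, publicKeyPtr + 32));
console.log(publicKey.every((b) => b === 0x5F)); // → true
```

Copying the result out (rather than keeping a `subarray` view) matters because the vendored code reuses fixed offsets across calls, and a view would silently change when the heap is overwritten.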
frontend/lib/meshcore-decoder/dist/crypto/orlp-ed25519-wasm.js.map (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"version":3,"file":"orlp-ed25519-wasm.js","sourceRoot":"","sources":["../../src/crypto/orlp-ed25519-wasm.ts"],"names":[],"mappings":";AAAA,sDAAsD;AACtD,wDAAwD;;AAqBxD,0CAiCC;AASD,0CAoCC;AAUD,oBAuCC;AAUD,wBAsCC;AAlMD,sCAAsD;AAEtD,mCAAmC;AACnC,MAAM,WAAW,GAAG,OAAO,CAAC,2BAA2B,CAAC,CAAC;AAEzD;;;GAGG;AACH,KAAK,UAAU,eAAe;IAC5B,OAAO,MAAM,WAAW,EAAE,CAAC;AAC7B,CAAC;AAED;;;;;GAKG;AACI,KAAK,UAAU,eAAe,CAAC,aAAqB;IACzD,MAAM,UAAU,GAAG,MAAM,eAAe,EAAE,CAAC;IAE3C,MAAM,eAAe,GAAG,IAAA,gBAAU,EAAC,aAAa,CAAC,CAAC;IAElD,IAAI,eAAe,CAAC,MAAM,KAAK,EAAE,EAAE,CAAC;QAClC,MAAM,IAAI,KAAK,CAAC,sDAAsD,eAAe,CAAC,MAAM,EAAE,CAAC,CAAC;IAClG,CAAC;IAED,gDAAgD;IAChD,MAAM,aAAa,GAAG,IAAI,CAAC,CAAC,6BAA6B;IACzD,MAAM,YAAY,GAAG,IAAI,GAAG,EAAE,CAAC;IAE/B,kCAAkC;IAClC,UAAU,CAAC,MAAM,CAAC,GAAG,CAAC,eAAe,EAAE,aAAa,CAAC,CAAC;IAEtD,wCAAwC;IACxC,MAAM,MAAM,GAAG,UAAU,CAAC,KAAK,CAC7B,wBAAwB,EACxB,QAAQ,EACR,CAAC,QAAQ,EAAE,QAAQ,CAAC,EACpB,CAAC,YAAY,EAAE,aAAa,CAAC,CAC9B,CAAC;IAEF,IAAI,MAAM,KAAK,CAAC,EAAE,CAAC;QACjB,MAAM,IAAI,KAAK,CAAC,iDAAiD,CAAC,CAAC;IACrE,CAAC;IAED,uCAAuC;IACvC,MAAM,cAAc,GAAG,IAAI,UAAU,CAAC,EAAE,CAAC,CAAC;IAC1C,cAAc,CAAC,GAAG,CAAC,UAAU,CAAC,MAAM,CAAC,QAAQ,CAAC,YAAY,EAAE,YAAY,GAAG,EAAE,CAAC,CAAC,CAAC;IAEhF,OAAO,IAAA,gBAAU,EAAC,cAAc,CAAC,CAAC;AACpC,CAAC;AAED;;;;;;GAMG;AACI,KAAK,UAAU,eAAe,CAAC,aAAqB,EAAE,oBAA4B;IACvF,IAAI,CAAC;QACH,MAAM,UAAU,GAAG,MAAM,eAAe,EAAE,CAAC;QAE3C,MAAM,eAAe,GAAG,IAAA,gBAAU,EAAC,aAAa,CAAC,CAAC;QAClD,MAAM,sBAAsB,GAAG,IAAA,gBAAU,EAAC,oBAAoB,CAAC,CAAC;QAEhE,IAAI,eAAe,CAAC,MAAM,KAAK,EAAE,EAAE,CAAC;YAClC,OAAO,KAAK,CAAC;QACf,CAAC;QAED,IAAI,sBAAsB,CAAC,MAAM,KAAK,EAAE,EAAE,CAAC;YACzC,OAAO,KAAK,CAAC;QACf,CAAC;QAED,gDAAgD;QAChD,MAAM,aAAa,GAAG,IAAI,CAAC,CAAC,uCAAuC;QACnE,MAAM,YAAY,GAAG,IAAI,GAAG,EAAE,CAAC;QAE/B,2BAA2B;QAC3B,UAAU,CAAC,MAAM,CAAC,GAAG,CAAC,eAAe,EAAE,aAAa,CAAC,CAAC;QACtD,UAAU,CAAC,MAAM,CAAC,GAAG,CAAC,sBAAsB,EAAE,YAAY,CAAC,CAAC;QAE5D,6FAA6F;QAC7F,MAAM,MAAM,GAAG,UAAU,CAAC,KAAK,CAC7B,uBAAuB,EACvB,QAAQ,EACR,CAAC,QAAQ,EAAE,QAAQ,CAAC,EACpB,CAAC,YAAY,EAAE,aAAa,CAAC,CAC9B,CAAC;QAEF,OAAO,M
AAM,KAAK,CAAC,CAAC;IACtB,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,0DAA0D;QAC1D,OAAO,KAAK,CAAC;IACf,CAAC;AACH,CAAC;AAED;;;;;;;GAOG;AACI,KAAK,UAAU,IAAI,CAAC,UAAkB,EAAE,aAAqB,EAAE,YAAoB;IACxF,MAAM,UAAU,GAAG,MAAM,eAAe,EAAE,CAAC;IAE3C,MAAM,YAAY,GAAG,IAAA,gBAAU,EAAC,UAAU,CAAC,CAAC;IAC5C,MAAM,eAAe,GAAG,IAAA,gBAAU,EAAC,aAAa,CAAC,CAAC;IAClD,MAAM,cAAc,GAAG,IAAA,gBAAU,EAAC,YAAY,CAAC,CAAC;IAEhD,IAAI,eAAe,CAAC,MAAM,KAAK,EAAE,EAAE,CAAC;QAClC,MAAM,IAAI,KAAK,CAAC,sDAAsD,eAAe,CAAC,MAAM,EAAE,CAAC,CAAC;IAClG,CAAC;IAED,IAAI,cAAc,CAAC,MAAM,KAAK,EAAE,EAAE,CAAC;QACjC,MAAM,IAAI,KAAK,CAAC,qDAAqD,cAAc,CAAC,MAAM,EAAE,CAAC,CAAC;IAChG,CAAC;IAED,gFAAgF;IAChF,MAAM,UAAU,GAAG,MAAM,CAAC;IAC1B,MAAM,aAAa,GAAG,MAAM,CAAC;IAC7B,MAAM,YAAY,GAAG,MAAM,CAAC;IAC5B,MAAM,YAAY,GAAG,MAAM,CAAC;IAE5B,2BAA2B;IAC3B,UAAU,CAAC,MAAM,CAAC,GAAG,CAAC,YAAY,EAAE,UAAU,CAAC,CAAC;IAChD,UAAU,CAAC,MAAM,CAAC,GAAG,CAAC,eAAe,EAAE,aAAa,CAAC,CAAC;IACtD,UAAU,CAAC,MAAM,CAAC,GAAG,CAAC,cAAc,EAAE,YAAY,CAAC,CAAC;IAEpD,iBAAiB;IACjB,UAAU,CAAC,KAAK,CACd,WAAW,EACX,MAAM,EACN,CAAC,QAAQ,EAAE,QAAQ,EAAE,QAAQ,EAAE,QAAQ,EAAE,QAAQ,CAAC,EAClD,CAAC,YAAY,EAAE,UAAU,EAAE,YAAY,CAAC,MAAM,EAAE,YAAY,EAAE,aAAa,CAAC,CAC7E,CAAC;IAEF,iBAAiB;IACjB,MAAM,cAAc,GAAG,IAAI,UAAU,CAAC,EAAE,CAAC,CAAC;IAC1C,cAAc,CAAC,GAAG,CAAC,UAAU,CAAC,MAAM,CAAC,QAAQ,CAAC,YAAY,EAAE,YAAY,GAAG,EAAE,CAAC,CAAC,CAAC;IAEhF,OAAO,IAAA,gBAAU,EAAC,cAAc,CAAC,CAAC;AACpC,CAAC;AAED;;;;;;;GAOG;AACI,KAAK,UAAU,MAAM,CAAC,YAAoB,EAAE,UAAkB,EAAE,YAAoB;IACzF,IAAI,CAAC;QACH,MAAM,UAAU,GAAG,MAAM,eAAe,EAAE,CAAC;QAE3C,MAAM,cAAc,GAAG,IAAA,gBAAU,EAAC,YAAY,CAAC,CAAC;QAChD,MAAM,YAAY,GAAG,IAAA,gBAAU,EAAC,UAAU,CAAC,CAAC;QAC5C,MAAM,cAAc,GAAG,IAAA,gBAAU,EAAC,YAAY,CAAC,CAAC;QAEhD,IAAI,cAAc,CAAC,MAAM,KAAK,EAAE,EAAE,CAAC;YACjC,OAAO,KAAK,CAAC;QACf,CAAC;QAED,IAAI,cAAc,CAAC,MAAM,KAAK,EAAE,EAAE,CAAC;YACjC,OAAO,KAAK,CAAC;QACf,CAAC;QAED,gFAAgF;QAChF,MAAM,UAAU,GAAG,MAAM,CAAC;QAC1B,MAAM,YAAY,GAAG,MAAM,CAAC;QAC5B,MAAM,YAAY,GAAG,MAAM,CAAC;QAE5B,2BAA2B;QAC3B,UAAU,CAAC,MAAM,CAAC,GAAG,CAAC,cAAc,EAAE,YAAY,CAAC,CAAC;QACpD,UAAU,CAAC,MAAM,CAAC,GAAG,CAA
C,YAAY,EAAE,UAAU,CAAC,CAAC;QAChD,UAAU,CAAC,MAAM,CAAC,GAAG,CAAC,cAAc,EAAE,YAAY,CAAC,CAAC;QAEpD,gCAAgC;QAChC,MAAM,MAAM,GAAG,UAAU,CAAC,KAAK,CAC7B,aAAa,EACb,QAAQ,EACR,CAAC,QAAQ,EAAE,QAAQ,EAAE,QAAQ,EAAE,QAAQ,CAAC,EACxC,CAAC,YAAY,EAAE,UAAU,EAAE,YAAY,CAAC,MAAM,EAAE,YAAY,CAAC,CAC9D,CAAC;QAEF,OAAO,MAAM,KAAK,CAAC,CAAC;IACtB,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,OAAO,KAAK,CAAC;IACf,CAAC;AACH,CAAC"}
frontend/lib/meshcore-decoder/dist/decoder/packet-decoder.d.ts (vendored, new file, 51 lines)
@@ -0,0 +1,51 @@
import { DecodedPacket, PacketStructure } from '../types/packet';
import { DecryptionOptions, ValidationResult, CryptoKeyStore } from '../types/crypto';
export declare class MeshCorePacketDecoder {
    /**
     * Decode a raw packet from hex string
     */
    static decode(hexData: string, options?: DecryptionOptions): DecodedPacket;
    /**
     * Decode a raw packet from hex string with signature verification for advertisements
     */
    static decodeWithVerification(hexData: string, options?: DecryptionOptions): Promise<DecodedPacket>;
    /**
     * Analyze packet structure for detailed breakdown
     */
    static analyzeStructure(hexData: string, options?: DecryptionOptions): PacketStructure;
    /**
     * Analyze packet structure for detailed breakdown with signature verification for advertisements
     */
    static analyzeStructureWithVerification(hexData: string, options?: DecryptionOptions): Promise<PacketStructure>;
    /**
     * Internal unified parsing method
     */
    private static parseInternal;
    /**
     * Internal unified parsing method with signature verification for advertisements
     */
    private static parseInternalAsync;
    /**
     * Validate packet format without full decoding
     */
    static validate(hexData: string): ValidationResult;
    /**
     * Calculate message hash for a packet
     */
    static calculateMessageHash(bytes: Uint8Array, routeType: number, payloadType: number, payloadVersion: number): string;
    /**
     * Create a key store for decryption
     */
    static createKeyStore(initialKeys?: {
        channelSecrets?: string[];
        nodeKeys?: Record<string, string>;
    }): CryptoKeyStore;
    /**
     * Decode a path_len byte into hash size, hop count, and total byte length.
     * Firmware reference: Packet.h lines 79-83
     * Bits 7:6 = hash size selector: (path_len >> 6) + 1 = 1, 2, or 3 bytes per hop
     * Bits 5:0 = hop count (0-63)
     */
    private static decodePathLenByte;
}
//# sourceMappingURL=packet-decoder.d.ts.map
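The bit layouts documented in these decoder files — header bits 0-1 route type, 2-5 payload type, 6-7 version; path_len bits 7:6 hash-size selector, bits 5:0 hop count — can be checked with plain bit math. This standalone sketch mirrors that decoding logic as described; it is an illustration, not the vendored implementation itself.

```javascript
// Decode the MeshCore header byte, matching the field split used by
// parseInternal: bits 0-1 route type, 2-5 payload type, 6-7 version.
function decodeHeader(header) {
  return {
    routeType: header & 0x03,
    payloadType: (header >> 2) & 0x0f,
    payloadVersion: (header >> 6) & 0x03,
  };
}

// Decode the path_len byte: bits 7:6 select the per-hop hash size
// ((b >> 6) + 1 bytes), bits 5:0 carry the hop count (0-63).
// A hash size of 4 (bits 7:6 = 11) is reserved and rejected by the decoder.
function decodePathLenByte(b) {
  const hashSize = ((b >> 6) & 0x03) + 1;
  const hopCount = b & 0x3f;
  return { hashSize, hopCount, byteLength: hashSize * hopCount };
}

console.log(decodeHeader(0x91));      // → { routeType: 1, payloadType: 4, payloadVersion: 2 }
console.log(decodePathLenByte(0x42)); // → { hashSize: 2, hopCount: 2, byteLength: 4 }
```

For example, `0x42` is binary `01000010`: the top two bits `01` select 2-byte hop hashes, the low six bits give 2 hops, so the path occupies 4 bytes.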
frontend/lib/meshcore-decoder/dist/decoder/packet-decoder.d.ts.map (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"version":3,"file":"packet-decoder.d.ts","sourceRoot":"","sources":["../../src/decoder/packet-decoder.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,aAAa,EAAE,eAAe,EAAkD,MAAM,iBAAiB,CAAC;AAIjH,OAAO,EAAE,iBAAiB,EAAE,gBAAgB,EAAE,cAAc,EAAE,MAAM,iBAAiB,CAAC;AAatF,qBAAa,qBAAqB;IAChC;;OAEG;IACH,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,iBAAiB,GAAG,aAAa;IAK1E;;OAEG;WACU,sBAAsB,CAAC,OAAO,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,iBAAiB,GAAG,OAAO,CAAC,aAAa,CAAC;IAKzG;;OAEG;IACH,MAAM,CAAC,gBAAgB,CAAC,OAAO,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,iBAAiB,GAAG,eAAe;IAKtF;;OAEG;WACU,gCAAgC,CAAC,OAAO,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,iBAAiB,GAAG,OAAO,CAAC,eAAe,CAAC;IAKrH;;OAEG;IACH,OAAO,CAAC,MAAM,CAAC,aAAa;IA6Y5B;;OAEG;mBACkB,kBAAkB;IA2CvC;;OAEG;IACH,MAAM,CAAC,QAAQ,CAAC,OAAO,EAAE,MAAM,GAAG,gBAAgB;IAmDlD;;OAEG;IACH,MAAM,CAAC,oBAAoB,CAAC,KAAK,EAAE,UAAU,EAAE,SAAS,EAAE,MAAM,EAAE,WAAW,EAAE,MAAM,EAAE,cAAc,EAAE,MAAM,GAAG,MAAM;IAkDtH;;OAEG;IACH,MAAM,CAAC,cAAc,CAAC,WAAW,CAAC,EAAE;QAClC,cAAc,CAAC,EAAE,MAAM,EAAE,CAAC;QAC1B,QAAQ,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;KACnC,GAAG,cAAc;IAIlB;;;;;OAKG;IACH,OAAO,CAAC,MAAM,CAAC,iBAAiB;CAMjC"}
frontend/lib/meshcore-decoder/dist/decoder/packet-decoder.js (vendored, new file, 576 lines)
@@ -0,0 +1,576 @@
|
||||
"use strict";
|
||||
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
|
||||
// MIT License
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.MeshCorePacketDecoder = void 0;
|
||||
const enums_1 = require("../types/enums");
|
||||
const hex_1 = require("../utils/hex");
|
||||
const enum_names_1 = require("../utils/enum-names");
|
||||
const key_manager_1 = require("../crypto/key-manager");
|
||||
const advert_1 = require("./payload-decoders/advert");
|
||||
const trace_1 = require("./payload-decoders/trace");
|
||||
const group_text_1 = require("./payload-decoders/group-text");
|
||||
const request_1 = require("./payload-decoders/request");
|
||||
const response_1 = require("./payload-decoders/response");
|
||||
const anon_request_1 = require("./payload-decoders/anon-request");
|
||||
const ack_1 = require("./payload-decoders/ack");
|
||||
const path_1 = require("./payload-decoders/path");
|
||||
const text_message_1 = require("./payload-decoders/text-message");
|
||||
const control_1 = require("./payload-decoders/control");
|
||||
class MeshCorePacketDecoder {
|
||||
/**
|
||||
* Decode a raw packet from hex string
|
||||
*/
|
||||
static decode(hexData, options) {
|
||||
const result = this.parseInternal(hexData, false, options);
|
||||
return result.packet;
|
||||
}
|
||||
/**
|
||||
* Decode a raw packet from hex string with signature verification for advertisements
|
||||
*/
|
||||
static async decodeWithVerification(hexData, options) {
|
||||
const result = await this.parseInternalAsync(hexData, false, options);
|
||||
return result.packet;
|
||||
}
|
||||
/**
|
||||
* Analyze packet structure for detailed breakdown
|
||||
*/
|
||||
static analyzeStructure(hexData, options) {
|
||||
const result = this.parseInternal(hexData, true, options);
|
||||
return result.structure;
|
||||
}
|
||||
/**
|
||||
* Analyze packet structure for detailed breakdown with signature verification for advertisements
|
||||
*/
|
||||
static async analyzeStructureWithVerification(hexData, options) {
|
||||
const result = await this.parseInternalAsync(hexData, true, options);
|
||||
return result.structure;
|
||||
}
|
||||
/**
|
||||
* Internal unified parsing method
|
||||
*/
|
||||
static parseInternal(hexData, includeStructure, options) {
|
||||
const bytes = (0, hex_1.hexToBytes)(hexData);
|
||||
const segments = [];
|
||||
if (bytes.length < 2) {
|
||||
const errorPacket = {
|
||||
messageHash: '',
|
||||
routeType: enums_1.RouteType.Flood,
|
||||
payloadType: enums_1.PayloadType.RawCustom,
|
||||
payloadVersion: enums_1.PayloadVersion.Version1,
|
||||
pathLength: 0,
|
||||
path: null,
|
||||
payload: { raw: '', decoded: null },
|
||||
totalBytes: bytes.length,
|
||||
isValid: false,
|
||||
errors: ['Packet too short (minimum 2 bytes required)']
|
||||
};
|
||||
const errorStructure = {
|
||||
segments: [],
|
||||
totalBytes: bytes.length,
|
||||
rawHex: hexData.toUpperCase(),
|
||||
messageHash: '',
|
||||
payload: {
|
||||
segments: [],
|
||||
hex: '',
|
||||
startByte: 0,
|
||||
type: 'Unknown'
|
||||
}
|
||||
};
|
||||
return { packet: errorPacket, structure: errorStructure };
|
||||
}
|
||||
try {
|
||||
let offset = 0;
|
||||
// parse header
|
||||
const header = bytes[0];
|
||||
const routeType = header & 0x03;
|
||||
const payloadType = (header >> 2) & 0x0F;
|
||||
const payloadVersion = (header >> 6) & 0x03;
|
||||
if (includeStructure) {
|
||||
segments.push({
|
||||
name: 'Header',
|
||||
description: 'Header byte breakdown',
|
||||
startByte: 0,
|
||||
endByte: 0,
|
||||
value: `0x${header.toString(16).padStart(2, '0')}`,
|
||||
headerBreakdown: {
|
||||
fullBinary: header.toString(2).padStart(8, '0'),
|
||||
fields: [
|
||||
{
|
||||
bits: '0-1',
|
||||
field: 'Route Type',
|
||||
value: (0, enum_names_1.getRouteTypeName)(routeType),
|
||||
binary: (header & 0x03).toString(2).padStart(2, '0')
|
||||
},
|
||||
{
|
||||
bits: '2-5',
|
||||
field: 'Payload Type',
|
||||
value: (0, enum_names_1.getPayloadTypeName)(payloadType),
|
||||
binary: ((header >> 2) & 0x0F).toString(2).padStart(4, '0')
|
||||
},
|
||||
{
|
||||
bits: '6-7',
|
||||
field: 'Version',
|
||||
value: payloadVersion.toString(),
|
||||
binary: ((header >> 6) & 0x03).toString(2).padStart(2, '0')
|
||||
}
|
||||
]
|
||||
}
|
||||
});
|
||||
}
|
||||
offset = 1;
|
||||
// handle transport codes
|
||||
let transportCodes;
|
||||
if (routeType === enums_1.RouteType.TransportFlood || routeType === enums_1.RouteType.TransportDirect) {
|
||||
if (bytes.length < offset + 4) {
|
||||
throw new Error('Packet too short for transport codes');
|
||||
}
|
||||
const code1 = bytes[offset] | (bytes[offset + 1] << 8);
|
||||
const code2 = bytes[offset + 2] | (bytes[offset + 3] << 8);
|
||||
transportCodes = [code1, code2];
|
||||
if (includeStructure) {
|
||||
const transportCode = (bytes[offset]) | (bytes[offset + 1] << 8) | (bytes[offset + 2] << 16) | (bytes[offset + 3] << 24);
|
||||
segments.push({
|
||||
name: 'Transport Code',
|
||||
description: 'Used for Direct/Response routing',
|
||||
startByte: offset,
|
||||
endByte: offset + 3,
|
||||
value: `0x${transportCode.toString(16).padStart(8, '0')}`
|
||||
});
|
||||
}
|
||||
offset += 4;
|
||||
}
|
||||
// parse path length byte (encodes hash size and hop count)
|
||||
// Bits 7:6 = hash size selector: (path_len >> 6) + 1 = 1, 2, or 3 bytes per hop
|
||||
// Bits 5:0 = hop count (0-63)
|
||||
if (bytes.length < offset + 1) {
|
||||
throw new Error('Packet too short for path length');
|
||||
}
|
||||
const pathLenByte = bytes[offset];
|
||||
const { hashSize: pathHashSize, hopCount: pathHopCount, byteLength: pathByteLength } = this.decodePathLenByte(pathLenByte);
|
||||
if (pathHashSize === 4) {
|
||||
throw new Error('Invalid path length byte: reserved hash size (bits 7:6 = 11)');
|
||||
}
|
||||
if (includeStructure) {
|
||||
const hashDesc = pathHashSize > 1 ? ` × ${pathHashSize}-byte hashes (${pathByteLength} bytes)` : '';
|
||||
let pathLengthDescription;
|
||||
if (pathHopCount === 0) {
|
||||
pathLengthDescription = pathHashSize > 1 ? `No path data (${pathHashSize}-byte hash mode)` : 'No path data';
|
||||
}
|
||||
else if (routeType === enums_1.RouteType.Direct || routeType === enums_1.RouteType.TransportDirect) {
|
||||
pathLengthDescription = `${pathHopCount} hops${hashDesc} of routing instructions (decreases as packet travels)`;
|
||||
}
|
||||
else if (routeType === enums_1.RouteType.Flood || routeType === enums_1.RouteType.TransportFlood) {
|
||||
pathLengthDescription = `${pathHopCount} hops${hashDesc} showing route taken (increases as packet floods)`;
|
||||
}
|
||||
else {
|
||||
pathLengthDescription = `Path contains ${pathHopCount} hops${hashDesc}`;
|
||||
}
|
||||
segments.push({
|
||||
name: 'Path Length',
|
||||
description: pathLengthDescription,
|
||||
startByte: offset,
endByte: offset,
value: `0x${pathLenByte.toString(16).padStart(2, '0')}`,
headerBreakdown: {
fullBinary: pathLenByte.toString(2).padStart(8, '0'),
fields: [
{
bits: '6-7',
field: 'Hash Size',
value: `${pathHashSize} byte${pathHashSize > 1 ? 's' : ''} per hop`,
binary: ((pathLenByte >> 6) & 0x03).toString(2).padStart(2, '0')
},
{
bits: '0-5',
field: 'Hop Count',
value: `${pathHopCount} hop${pathHopCount !== 1 ? 's' : ''}`,
binary: (pathLenByte & 63).toString(2).padStart(6, '0')
}
]
}
});
}
offset += 1;
if (bytes.length < offset + pathByteLength) {
throw new Error('Packet too short for path data');
}
// convert path data to grouped hex strings (one entry per hop)
const pathBytes = bytes.subarray(offset, offset + pathByteLength);
let path = null;
if (pathHopCount > 0) {
path = [];
for (let i = 0; i < pathHopCount; i++) {
const hopBytes = pathBytes.subarray(i * pathHashSize, (i + 1) * pathHashSize);
path.push((0, hex_1.bytesToHex)(hopBytes));
}
}
if (includeStructure && pathHopCount > 0) {
if (payloadType === enums_1.PayloadType.Trace) {
// TRACE packets have SNR values in path (always single-byte entries)
const snrValues = [];
for (let i = 0; i < pathByteLength; i++) {
const snrRaw = bytes[offset + i];
const snrSigned = snrRaw > 127 ? snrRaw - 256 : snrRaw;
const snrDb = snrSigned / 4.0;
snrValues.push(`${snrDb.toFixed(2)}dB (0x${snrRaw.toString(16).padStart(2, '0')})`);
}
segments.push({
name: 'Path SNR Data',
description: `SNR values collected during trace: ${snrValues.join(', ')}`,
startByte: offset,
endByte: offset + pathByteLength - 1,
value: (0, hex_1.bytesToHex)(bytes.slice(offset, offset + pathByteLength))
});
}
else {
let pathDescription = 'Routing path information';
if (routeType === enums_1.RouteType.Direct || routeType === enums_1.RouteType.TransportDirect) {
pathDescription = `Routing instructions (${pathHashSize}-byte hashes stripped at each hop as packet travels to destination)`;
}
else if (routeType === enums_1.RouteType.Flood || routeType === enums_1.RouteType.TransportFlood) {
pathDescription = `Historical route taken (${pathHashSize}-byte hashes added as packet floods through network)`;
}
segments.push({
name: 'Path Data',
description: pathDescription,
startByte: offset,
endByte: offset + pathByteLength - 1,
value: (0, hex_1.bytesToHex)(bytes.slice(offset, offset + pathByteLength))
});
}
}
offset += pathByteLength;
// extract payload
const payloadBytes = bytes.subarray(offset);
const payloadHex = (0, hex_1.bytesToHex)(payloadBytes);
if (includeStructure && bytes.length > offset) {
segments.push({
name: 'Payload',
description: `${(0, enum_names_1.getPayloadTypeName)(payloadType)} payload data`,
startByte: offset,
endByte: bytes.length - 1,
value: (0, hex_1.bytesToHex)(bytes.slice(offset))
});
}
// decode payload based on type and optionally get segments in one pass
let decodedPayload = null;
const payloadSegments = [];
if (payloadType === enums_1.PayloadType.Advert) {
const result = advert_1.AdvertPayloadDecoder.decode(payloadBytes, {
includeSegments: includeStructure,
segmentOffset: 0
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments;
}
}
else if (payloadType === enums_1.PayloadType.Trace) {
const result = trace_1.TracePayloadDecoder.decode(payloadBytes, path, {
includeSegments: includeStructure,
segmentOffset: 0 // Payload segments are relative to payload start
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments; // Remove from decoded payload to keep it clean
}
}
else if (payloadType === enums_1.PayloadType.GroupText) {
const result = group_text_1.GroupTextPayloadDecoder.decode(payloadBytes, {
...options,
includeSegments: includeStructure,
segmentOffset: 0
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments;
}
}
else if (payloadType === enums_1.PayloadType.Request) {
const result = request_1.RequestPayloadDecoder.decode(payloadBytes, {
includeSegments: includeStructure,
segmentOffset: 0 // Payload segments are relative to payload start
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments;
}
}
else if (payloadType === enums_1.PayloadType.Response) {
const result = response_1.ResponsePayloadDecoder.decode(payloadBytes, {
includeSegments: includeStructure,
segmentOffset: 0 // Payload segments are relative to payload start
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments;
}
}
else if (payloadType === enums_1.PayloadType.AnonRequest) {
const result = anon_request_1.AnonRequestPayloadDecoder.decode(payloadBytes, {
includeSegments: includeStructure,
segmentOffset: 0
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments;
}
}
else if (payloadType === enums_1.PayloadType.Ack) {
const result = ack_1.AckPayloadDecoder.decode(payloadBytes, {
includeSegments: includeStructure,
segmentOffset: 0
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments;
}
}
else if (payloadType === enums_1.PayloadType.Path) {
decodedPayload = path_1.PathPayloadDecoder.decode(payloadBytes);
}
else if (payloadType === enums_1.PayloadType.TextMessage) {
const result = text_message_1.TextMessagePayloadDecoder.decode(payloadBytes, {
includeSegments: includeStructure,
segmentOffset: 0
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments;
}
}
else if (payloadType === enums_1.PayloadType.Control) {
const result = control_1.ControlPayloadDecoder.decode(payloadBytes, {
includeSegments: includeStructure,
segmentOffset: 0
});
decodedPayload = result;
if (result?.segments) {
payloadSegments.push(...result.segments);
delete result.segments;
}
}
// if no segments were generated and we need structure, show basic payload info
if (includeStructure && payloadSegments.length === 0 && bytes.length > offset) {
payloadSegments.push({
name: `${(0, enum_names_1.getPayloadTypeName)(payloadType)} Payload`,
description: `Raw ${(0, enum_names_1.getPayloadTypeName)(payloadType)} payload data (${payloadBytes.length} bytes)`,
startByte: 0,
endByte: payloadBytes.length - 1,
value: (0, hex_1.bytesToHex)(payloadBytes)
});
}
// calculate message hash
const messageHash = this.calculateMessageHash(bytes, routeType, payloadType, payloadVersion);
const packet = {
messageHash,
routeType,
payloadType,
payloadVersion,
transportCodes,
pathLength: pathHopCount,
...(pathHashSize > 1 ? { pathHashSize } : {}),
path,
payload: {
raw: payloadHex,
decoded: decodedPayload
},
totalBytes: bytes.length,
isValid: true
};
const structure = {
segments,
totalBytes: bytes.length,
rawHex: hexData.toUpperCase(),
messageHash,
payload: {
segments: payloadSegments,
hex: payloadHex,
startByte: offset,
type: (0, enum_names_1.getPayloadTypeName)(payloadType)
}
};
return { packet, structure };
}
catch (error) {
const errorPacket = {
messageHash: '',
routeType: enums_1.RouteType.Flood,
payloadType: enums_1.PayloadType.RawCustom,
payloadVersion: enums_1.PayloadVersion.Version1,
pathLength: 0,
path: null,
payload: { raw: '', decoded: null },
totalBytes: bytes.length,
isValid: false,
errors: [error instanceof Error ? error.message : 'Unknown decoding error']
};
const errorStructure = {
segments: [],
totalBytes: bytes.length,
rawHex: hexData.toUpperCase(),
messageHash: '',
payload: {
segments: [],
hex: '',
startByte: 0,
type: 'Unknown'
}
};
return { packet: errorPacket, structure: errorStructure };
}
}
/**
* Internal unified parsing method with signature verification for advertisements
*/
static async parseInternalAsync(hexData, includeStructure, options) {
// First do the regular parsing
const result = this.parseInternal(hexData, includeStructure, options);
// If it's an advertisement, verify the signature
if (result.packet.payloadType === enums_1.PayloadType.Advert && result.packet.payload.decoded) {
try {
const advertPayload = result.packet.payload.decoded;
const verifiedAdvert = await advert_1.AdvertPayloadDecoder.decodeWithVerification((0, hex_1.hexToBytes)(result.packet.payload.raw), {
includeSegments: includeStructure,
segmentOffset: 0
});
if (verifiedAdvert) {
// Update the payload with signature verification results
result.packet.payload.decoded = verifiedAdvert;
// If the advertisement signature is invalid, mark the whole packet as invalid
if (!verifiedAdvert.isValid) {
result.packet.isValid = false;
result.packet.errors = verifiedAdvert.errors || ['Invalid advertisement signature'];
}
// Update structure segments if needed
if (includeStructure && verifiedAdvert.segments) {
result.structure.payload.segments = verifiedAdvert.segments;
delete verifiedAdvert.segments;
}
}
}
catch (error) {
console.error('Signature verification failed:', error);
}
}
return result;
}
/**
* Validate packet format without full decoding
*/
static validate(hexData) {
const bytes = (0, hex_1.hexToBytes)(hexData);
const errors = [];
if (bytes.length < 2) {
errors.push('Packet too short (minimum 2 bytes required)');
return { isValid: false, errors };
}
try {
let offset = 1; // Skip header
// check transport codes
const header = bytes[0];
const routeType = header & 0x03;
if (routeType === enums_1.RouteType.TransportFlood || routeType === enums_1.RouteType.TransportDirect) {
if (bytes.length < offset + 4) {
errors.push('Packet too short for transport codes');
}
offset += 4;
}
// check path length
if (bytes.length < offset + 1) {
errors.push('Packet too short for path length');
}
else {
const pathLenByte = bytes[offset];
const { hashSize, byteLength } = this.decodePathLenByte(pathLenByte);
offset += 1;
if (hashSize === 4) {
errors.push('Invalid path length byte: reserved hash size (bits 7:6 = 11)');
}
if (bytes.length < offset + byteLength) {
errors.push('Packet too short for path data');
}
offset += byteLength;
}
// check if we have payload data
if (offset >= bytes.length) {
errors.push('No payload data found');
}
}
catch (error) {
errors.push(error instanceof Error ? error.message : 'Validation error');
}
return { isValid: errors.length === 0, errors: errors.length > 0 ? errors : undefined };
}
/**
* Calculate message hash for a packet
*/
static calculateMessageHash(bytes, routeType, payloadType, payloadVersion) {
// for TRACE packets, use the trace tag as hash
if (payloadType === enums_1.PayloadType.Trace && bytes.length >= 13) {
let offset = 1;
// skip transport codes if present
if (routeType === enums_1.RouteType.TransportFlood || routeType === enums_1.RouteType.TransportDirect) {
offset += 4;
}
// skip path data (decode path_len byte for multi-byte hops)
if (bytes.length > offset) {
const { byteLength } = this.decodePathLenByte(bytes[offset]);
offset += 1 + byteLength;
}
// extract trace tag
if (bytes.length >= offset + 4) {
const traceTag = (bytes[offset]) | (bytes[offset + 1] << 8) | (bytes[offset + 2] << 16) | (bytes[offset + 3] << 24);
return (0, hex_1.numberToHex)(traceTag, 8);
}
}
// for other packets, create hash from constant parts
const constantHeader = (payloadType << 2) | (payloadVersion << 6);
let offset = 1;
// skip transport codes if present
if (routeType === enums_1.RouteType.TransportFlood || routeType === enums_1.RouteType.TransportDirect) {
offset += 4;
}
// skip path data (decode path_len byte for multi-byte hops)
if (bytes.length > offset) {
const { byteLength } = this.decodePathLenByte(bytes[offset]);
offset += 1 + byteLength;
}
const payloadData = bytes.slice(offset);
const hashInput = [constantHeader, ...Array.from(payloadData)];
// generate hash
let hash = 0;
for (let i = 0; i < hashInput.length; i++) {
hash = ((hash << 5) - hash + hashInput[i]) & 0xffffffff;
}
return (0, hex_1.numberToHex)(hash, 8);
}
/**
* Create a key store for decryption
*/
static createKeyStore(initialKeys) {
return new key_manager_1.MeshCoreKeyStore(initialKeys);
}
/**
* Decode a path_len byte into hash size, hop count, and total byte length.
* Firmware reference: Packet.h lines 79-83
* Bits 7:6 = hash size selector: (path_len >> 6) + 1 = 1, 2, or 3 bytes per hop
* Bits 5:0 = hop count (0-63)
*/
static decodePathLenByte(pathLenByte) {
const hashSize = (pathLenByte >> 6) + 1;
const hopCount = pathLenByte & 63;
return { hashSize, hopCount, byteLength: hopCount * hashSize };
}
}
exports.MeshCorePacketDecoder = MeshCorePacketDecoder;
//# sourceMappingURL=packet-decoder.js.map
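The `decodePathLenByte` rule above (bits 7:6 select the per-hop hash size, bits 5:0 the hop count) can be exercised as a standalone sketch; `decodePathLen` is an illustrative name, not part of the library's API:

```javascript
// Sketch of the path_len byte rule used by MeshCorePacketDecoder.decodePathLenByte:
// bits 7:6 -> (value >> 6) + 1 bytes per hop hash; bits 5:0 -> hop count (0-63).
function decodePathLen(pathLenByte) {
    const hashSize = (pathLenByte >> 6) + 1; // 1, 2, or 3 (selector 11 is reserved)
    const hopCount = pathLenByte & 63;
    return { hashSize, hopCount, byteLength: hopCount * hashSize };
}

// 0x83 = 0b10_000011 -> selector 2 (+1 = 3 bytes/hop), 3 hops, 9 path bytes total
console.log(decodePathLen(0x83)); // { hashSize: 3, hopCount: 3, byteLength: 9 }
console.log(decodePathLen(0x05)); // { hashSize: 1, hopCount: 5, byteLength: 5 }
```

Note how a plain hop count below 64 with the top two bits clear degenerates to the common one-byte-per-hop case.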
1 frontend/lib/meshcore-decoder/dist/decoder/packet-decoder.js.map (vendored, new file)
File diff suppressed because one or more lines are too long
11 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/ack.d.ts (vendored, new file)
@@ -0,0 +1,11 @@
import { AckPayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
export declare class AckPayloadDecoder {
static decode(payload: Uint8Array, options?: {
includeSegments?: boolean;
segmentOffset?: number;
}): AckPayload & {
segments?: PayloadSegment[];
} | null;
}
//# sourceMappingURL=ack.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/ack.d.ts.map (vendored, new file)
@@ -0,0 +1 @@
{"version":3,"file":"ack.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/ack.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,UAAU,EAAE,MAAM,sBAAsB,CAAC;AAClD,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAIpD,qBAAa,iBAAiB;IAC5B,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,UAAU,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI;CA2EzJ"}
78 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/ack.js (vendored, new file)
@@ -0,0 +1,78 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.AckPayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
class AckPayloadDecoder {
static decode(payload, options) {
try {
// Based on MeshCore payloads.md - Ack payload structure:
// - checksum (4 bytes) - CRC checksum of message timestamp, text, and sender pubkey
if (payload.length < 4) {
const result = {
type: enums_1.PayloadType.Ack,
version: enums_1.PayloadVersion.Version1,
isValid: false,
errors: ['Ack payload too short (minimum 4 bytes for checksum)'],
checksum: ''
};
if (options?.includeSegments) {
result.segments = [{
name: 'Invalid Ack Data',
description: 'Ack payload too short (minimum 4 bytes required for checksum)',
startByte: options.segmentOffset || 0,
endByte: (options.segmentOffset || 0) + payload.length - 1,
value: (0, hex_1.bytesToHex)(payload)
}];
}
return result;
}
const segments = [];
const segmentOffset = options?.segmentOffset || 0;
// parse checksum (4 bytes as hex)
const checksum = (0, hex_1.bytesToHex)(payload.subarray(0, 4));
if (options?.includeSegments) {
segments.push({
name: 'Checksum',
description: `CRC checksum of message timestamp, text, and sender pubkey: 0x${checksum}`,
startByte: segmentOffset,
endByte: segmentOffset + 3,
value: checksum
});
}
// any additional data (if present)
if (options?.includeSegments && payload.length > 4) {
segments.push({
name: 'Additional Data',
description: 'Extra data in Ack payload',
startByte: segmentOffset + 4,
endByte: segmentOffset + payload.length - 1,
value: (0, hex_1.bytesToHex)(payload.subarray(4))
});
}
const result = {
type: enums_1.PayloadType.Ack,
version: enums_1.PayloadVersion.Version1,
isValid: true,
checksum
};
if (options?.includeSegments) {
result.segments = segments;
}
return result;
}
catch (error) {
return {
type: enums_1.PayloadType.Ack,
version: enums_1.PayloadVersion.Version1,
isValid: false,
errors: [error instanceof Error ? error.message : 'Failed to decode Ack payload'],
checksum: ''
};
}
}
}
exports.AckPayloadDecoder = AckPayloadDecoder;
//# sourceMappingURL=ack.js.map
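The Ack decoder above essentially renders the first four payload bytes as a hex checksum string. A minimal standalone sketch of that step (`ackChecksumHex` is an illustrative name; the library's own `bytesToHex` helper is assumed to behave like this lowercase-hex version):

```javascript
// Render the 4-byte Ack checksum as a hex string, mirroring what
// AckPayloadDecoder does with bytesToHex(payload.subarray(0, 4)).
function ackChecksumHex(payload) {
    if (payload.length < 4) {
        throw new Error('Ack payload too short (minimum 4 bytes for checksum)');
    }
    return Array.from(payload.subarray(0, 4))
        .map((b) => b.toString(16).padStart(2, '0'))
        .join('');
}

const payload = Uint8Array.from([0xde, 0xad, 0xbe, 0xef, 0x01]);
console.log(ackChecksumHex(payload)); // "deadbeef"
```

Any bytes beyond the first four are what the decoder reports as the "Additional Data" segment.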
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/ack.js.map (vendored, new file)
@@ -0,0 +1 @@
{"version":3,"file":"ack.js","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/ack.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAId,6CAAgE;AAChE,yCAA6C;AAE7C,MAAa,iBAAiB;IAC5B,MAAM,CAAC,MAAM,CAAC,OAAmB,EAAE,OAA+D;QAChG,IAAI,CAAC;YACH,yDAAyD;YACzD,oFAAoF;YAEpF,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACvB,MAAM,MAAM,GAAiD;oBAC3D,IAAI,EAAE,mBAAW,CAAC,GAAG;oBACrB,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,sDAAsD,CAAC;oBAChE,QAAQ,EAAE,EAAE;iBACb,CAAC;gBAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;oBAC7B,MAAM,CAAC,QAAQ,GAAG,CAAC;4BACjB,IAAI,EAAE,kBAAkB;4BACxB,WAAW,EAAE,+DAA+D;4BAC5E,SAAS,EAAE,OAAO,CAAC,aAAa,IAAI,CAAC;4BACrC,OAAO,EAAE,CAAC,OAAO,CAAC,aAAa,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;4BAC1D,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC;yBAC3B,CAAC,CAAC;gBACL,CAAC;gBAED,OAAO,MAAM,CAAC;YAChB,CAAC;YAED,MAAM,QAAQ,GAAqB,EAAE,CAAC;YACtC,MAAM,aAAa,GAAG,OAAO,EAAE,aAAa,IAAI,CAAC,CAAC;YAElD,kCAAkC;YAClC,MAAM,QAAQ,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC;YACpD,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,UAAU;oBAChB,WAAW,EAAE,iEAAiE,QAAQ,EAAE;oBACxF,SAAS,EAAE,aAAa;oBACxB,OAAO,EAAE,aAAa,GAAG,CAAC;oBAC1B,KAAK,EAAE,QAAQ;iBAChB,CAAC,CAAC;YACL,CAAC;YAED,mCAAmC;YACnC,IAAI,OAAO,EAAE,eAAe,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACnD,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,iBAAiB;oBACvB,WAAW,EAAE,2BAA2B;oBACxC,SAAS,EAAE,aAAa,GAAG,CAAC;oBAC5B,OAAO,EAAE,aAAa,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;oBAC3C,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;iBACvC,CAAC,CAAC;YACL,CAAC;YAED,MAAM,MAAM,GAAiD;gBAC3D,IAAI,EAAE,mBAAW,CAAC,GAAG;gBACrB,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,IAAI;gBACb,QAAQ;aACT,CAAC;YAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,MAAM,CAAC,QAAQ,GAAG,QAAQ,CAAC;YAC7B,CAAC;YAED,OAAO,MAAM,CAAC;QAChB,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,GAAG;gBACrB,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,CAAC,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,8BAA8B,CAAC;gBACjF,QAAQ,EAAE,EAAE;aACb,CAAC;QACJ,CAAC;IACH,CAAC;CACF;AA5ED,8CA4EC"}
24 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/advert.d.ts (vendored, new file)
@@ -0,0 +1,24 @@
import { AdvertPayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
export declare class AdvertPayloadDecoder {
static decode(payload: Uint8Array, options?: {
includeSegments?: boolean;
segmentOffset?: number;
}): AdvertPayload & {
segments?: PayloadSegment[];
} | null;
/**
* Decode advertisement payload with signature verification
*/
static decodeWithVerification(payload: Uint8Array, options?: {
includeSegments?: boolean;
segmentOffset?: number;
}): Promise<AdvertPayload & {
segments?: PayloadSegment[];
} | null>;
private static parseDeviceRole;
private static readUint32LE;
private static readInt32LE;
private static sanitizeControlCharacters;
}
//# sourceMappingURL=advert.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/advert.d.ts.map (vendored, new file)
@@ -0,0 +1 @@
{"version":3,"file":"advert.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/advert.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,aAAa,EAAE,MAAM,sBAAsB,CAAC;AACrD,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAMpD,qBAAa,oBAAoB;IAC/B,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,aAAa,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI;IAuL3J;;OAEG;WACU,sBAAsB,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,OAAO,CAAC,aAAa,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI,CAAC;IA4C1L,OAAO,CAAC,MAAM,CAAC,eAAe;IAW9B,OAAO,CAAC,MAAM,CAAC,YAAY;IAO3B,OAAO,CAAC,MAAM,CAAC,WAAW;IAM1B,OAAO,CAAC,MAAM,CAAC,yBAAyB;CAKzC"}
244 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/advert.js (vendored, new file)
@@ -0,0 +1,244 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.AdvertPayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
const enum_names_1 = require("../../utils/enum-names");
const ed25519_verifier_1 = require("../../crypto/ed25519-verifier");
class AdvertPayloadDecoder {
static decode(payload, options) {
try {
// start of appdata section: public_key(32) + timestamp(4) + signature(64) + flags(1) = 101 bytes
if (payload.length < 101) {
const result = {
type: enums_1.PayloadType.Advert,
version: enums_1.PayloadVersion.Version1,
isValid: false,
errors: ['Advertisement payload too short'],
publicKey: '',
timestamp: 0,
signature: '',
appData: {
flags: 0,
deviceRole: enums_1.DeviceRole.ChatNode,
hasLocation: false,
hasName: false
}
};
if (options?.includeSegments) {
result.segments = [{
name: 'Invalid Advert Data',
description: 'Advert payload too short (minimum 101 bytes required)',
startByte: options.segmentOffset || 0,
endByte: (options.segmentOffset || 0) + payload.length - 1,
value: (0, hex_1.bytesToHex)(payload)
}];
}
return result;
}
const segments = [];
const segmentOffset = options?.segmentOffset || 0;
let currentOffset = 0;
// parse advertisement structure from payloads.md
const publicKey = (0, hex_1.bytesToHex)(payload.subarray(currentOffset, currentOffset + 32));
if (options?.includeSegments) {
segments.push({
name: 'Public Key',
description: 'Ed25519 public key',
startByte: segmentOffset + currentOffset,
endByte: segmentOffset + currentOffset + 31,
value: publicKey
});
}
currentOffset += 32;
const timestamp = this.readUint32LE(payload, currentOffset);
if (options?.includeSegments) {
const timestampDate = new Date(timestamp * 1000);
segments.push({
name: 'Timestamp',
description: `${timestamp} (${timestampDate.toISOString().slice(0, 19)}Z)`,
startByte: segmentOffset + currentOffset,
endByte: segmentOffset + currentOffset + 3,
value: (0, hex_1.bytesToHex)(payload.subarray(currentOffset, currentOffset + 4))
});
}
currentOffset += 4;
const signature = (0, hex_1.bytesToHex)(payload.subarray(currentOffset, currentOffset + 64));
if (options?.includeSegments) {
segments.push({
name: 'Signature',
description: 'Ed25519 signature',
startByte: segmentOffset + currentOffset,
endByte: segmentOffset + currentOffset + 63,
value: signature
});
}
currentOffset += 64;
const flags = payload[currentOffset];
if (options?.includeSegments) {
const binaryStr = flags.toString(2).padStart(8, '0');
const deviceRole = this.parseDeviceRole(flags);
const roleName = (0, enum_names_1.getDeviceRoleName)(deviceRole);
const flagDesc = ` | Bits 0-3 (Role): ${roleName} | Bit 4 (Location): ${!!(flags & enums_1.AdvertFlags.HasLocation) ? 'Yes' : 'No'} | Bit 7 (Name): ${!!(flags & enums_1.AdvertFlags.HasName) ? 'Yes' : 'No'}`;
segments.push({
name: 'App Flags',
description: `Binary: ${binaryStr}${flagDesc}`,
startByte: segmentOffset + currentOffset,
endByte: segmentOffset + currentOffset,
value: flags.toString(16).padStart(2, '0').toUpperCase()
});
}
currentOffset += 1;
const advert = {
type: enums_1.PayloadType.Advert,
version: enums_1.PayloadVersion.Version1,
isValid: true,
publicKey,
timestamp,
signature,
appData: {
flags,
deviceRole: this.parseDeviceRole(flags),
hasLocation: !!(flags & enums_1.AdvertFlags.HasLocation),
hasName: !!(flags & enums_1.AdvertFlags.HasName)
}
};
let offset = currentOffset;
// location data (if HasLocation flag is set)
if (flags & enums_1.AdvertFlags.HasLocation && payload.length >= offset + 8) {
const lat = this.readInt32LE(payload, offset) / 1000000;
const lon = this.readInt32LE(payload, offset + 4) / 1000000;
advert.appData.location = {
latitude: Math.round(lat * 1000000) / 1000000, // Keep precision
longitude: Math.round(lon * 1000000) / 1000000
};
if (options?.includeSegments) {
segments.push({
name: 'Latitude',
description: `${lat}° (${lat})`,
startByte: segmentOffset + offset,
endByte: segmentOffset + offset + 3,
value: (0, hex_1.bytesToHex)(payload.subarray(offset, offset + 4))
});
segments.push({
name: 'Longitude',
description: `${lon}° (${lon})`,
startByte: segmentOffset + offset + 4,
endByte: segmentOffset + offset + 7,
value: (0, hex_1.bytesToHex)(payload.subarray(offset + 4, offset + 8))
});
}
offset += 8;
}
// skip feature fields for now (HasFeature1, HasFeature2)
if (flags & enums_1.AdvertFlags.HasFeature1)
offset += 2;
if (flags & enums_1.AdvertFlags.HasFeature2)
offset += 2;
// name data (if HasName flag is set)
if (flags & enums_1.AdvertFlags.HasName && payload.length > offset) {
const nameBytes = payload.subarray(offset);
const rawName = new TextDecoder('utf-8').decode(nameBytes).replace(/\0.*$/, '');
advert.appData.name = this.sanitizeControlCharacters(rawName) || rawName;
if (options?.includeSegments) {
segments.push({
name: 'Node Name',
description: `Node name: "${advert.appData.name}"`,
startByte: segmentOffset + offset,
endByte: segmentOffset + payload.length - 1,
value: (0, hex_1.bytesToHex)(nameBytes)
});
}
}
if (options?.includeSegments) {
advert.segments = segments;
}
return advert;
}
catch (error) {
return {
type: enums_1.PayloadType.Advert,
version: enums_1.PayloadVersion.Version1,
isValid: false,
errors: [error instanceof Error ? error.message : 'Failed to decode advertisement payload'],
publicKey: '',
timestamp: 0,
signature: '',
appData: {
flags: 0,
deviceRole: enums_1.DeviceRole.ChatNode,
hasLocation: false,
hasName: false
}
};
}
}
/**
* Decode advertisement payload with signature verification
*/
static async decodeWithVerification(payload, options) {
// First decode normally
const advert = this.decode(payload, options);
if (!advert || !advert.isValid) {
return advert;
}
// Perform signature verification
try {
// Extract app_data from the payload (everything after public_key + timestamp + signature)
const appDataStart = 32 + 4 + 64; // public_key + timestamp + signature
const appDataBytes = payload.subarray(appDataStart);
const appDataHex = (0, hex_1.bytesToHex)(appDataBytes);
const signatureValid = await ed25519_verifier_1.Ed25519SignatureVerifier.verifyAdvertisementSignature(advert.publicKey, advert.signature, advert.timestamp, appDataHex);
advert.signatureValid = signatureValid;
if (!signatureValid) {
advert.signatureError = 'Ed25519 signature verification failed';
advert.isValid = false;
if (!advert.errors) {
advert.errors = [];
}
advert.errors.push('Invalid Ed25519 signature');
}
}
catch (error) {
advert.signatureValid = false;
advert.signatureError = error instanceof Error ? error.message : 'Signature verification error';
advert.isValid = false;
if (!advert.errors) {
advert.errors = [];
}
advert.errors.push('Signature verification failed: ' + (error instanceof Error ? error.message : 'Unknown error'));
}
return advert;
}
static parseDeviceRole(flags) {
const roleValue = flags & 0x0F;
switch (roleValue) {
case 0x01: return enums_1.DeviceRole.ChatNode;
case 0x02: return enums_1.DeviceRole.Repeater;
case 0x03: return enums_1.DeviceRole.RoomServer;
case 0x04: return enums_1.DeviceRole.Sensor;
default: return enums_1.DeviceRole.ChatNode;
}
}
static readUint32LE(buffer, offset) {
return buffer[offset] |
(buffer[offset + 1] << 8) |
(buffer[offset + 2] << 16) |
(buffer[offset + 3] << 24);
}
static readInt32LE(buffer, offset) {
const value = this.readUint32LE(buffer, offset);
// convert unsigned to signed
return value > 0x7FFFFFFF ? value - 0x100000000 : value;
}
static sanitizeControlCharacters(value) {
if (!value)
return null;
const sanitized = value.trim().replace(/[\x00-\x1F\x7F]/g, '');
return sanitized || null;
}
}
exports.AdvertPayloadDecoder = AdvertPayloadDecoder;
//# sourceMappingURL=advert.js.map
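The advert flag byte the decoder relies on packs the device role into bits 0-3, with HasLocation and HasName as individual bits; the bit positions below restate the "Bit 4 (Location)" / "Bit 7 (Name)" labels from the segment description above, and `parseAdvertFlags` is an illustrative standalone sketch, not the library's API:

```javascript
// Sketch of the advert flag byte: bits 0-3 = device role,
// bit 4 = HasLocation, bit 7 = HasName.
const HAS_LOCATION = 0x10; // bit 4
const HAS_NAME = 0x80;     // bit 7

function parseAdvertFlags(flags) {
    return {
        role: flags & 0x0f,
        hasLocation: (flags & HAS_LOCATION) !== 0,
        hasName: (flags & HAS_NAME) !== 0,
    };
}

// 0x92 = 0b1001_0010 -> role 2 (Repeater), location present, name present
console.log(parseAdvertFlags(0x92)); // { role: 2, hasLocation: true, hasName: true }
```

Role values map onto the `parseDeviceRole` switch above (0x01 ChatNode, 0x02 Repeater, 0x03 RoomServer, 0x04 Sensor).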
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/advert.js.map (vendored, new file)
File diff suppressed because one or more lines are too long
11 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/anon-request.d.ts vendored Normal file
@@ -0,0 +1,11 @@
import { AnonRequestPayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
export declare class AnonRequestPayloadDecoder {
    static decode(payload: Uint8Array, options?: {
        includeSegments?: boolean;
        segmentOffset?: number;
    }): AnonRequestPayload & {
        segments?: PayloadSegment[];
    } | null;
}
//# sourceMappingURL=anon-request.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/anon-request.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"anon-request.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/anon-request.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,kBAAkB,EAAE,MAAM,sBAAsB,CAAC;AAC1D,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAIpD,qBAAa,yBAAyB;IACpC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,kBAAkB,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI;CA8HjK"}
123 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/anon-request.js vendored Normal file
@@ -0,0 +1,123 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.AnonRequestPayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
class AnonRequestPayloadDecoder {
    static decode(payload, options) {
        try {
            // Based on MeshCore payloads.md - AnonRequest payload structure:
            // - destination_hash (1 byte)
            // - sender_public_key (32 bytes)
            // - cipher_mac (2 bytes)
            // - ciphertext (rest of payload)
            if (payload.length < 35) {
                const result = {
                    type: enums_1.PayloadType.AnonRequest,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: ['AnonRequest payload too short (minimum 35 bytes: dest + public key + MAC)'],
                    destinationHash: '',
                    senderPublicKey: '',
                    cipherMac: '',
                    ciphertext: '',
                    ciphertextLength: 0
                };
                if (options?.includeSegments) {
                    result.segments = [{
                            name: 'Invalid AnonRequest Data',
                            description: 'AnonRequest payload too short (minimum 35 bytes required: 1 for dest hash + 32 for public key + 2 for MAC)',
                            startByte: options.segmentOffset || 0,
                            endByte: (options.segmentOffset || 0) + payload.length - 1,
                            value: (0, hex_1.bytesToHex)(payload)
                        }];
                }
                return result;
            }
            const segments = [];
            const segmentOffset = options?.segmentOffset || 0;
            let offset = 0;
            // Parse destination hash (1 byte)
            const destinationHash = (0, hex_1.byteToHex)(payload[0]);
            if (options?.includeSegments) {
                segments.push({
                    name: 'Destination Hash',
                    description: `First byte of destination node public key: 0x${destinationHash}`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: destinationHash
                });
            }
            offset += 1;
            // Parse sender public key (32 bytes)
            const senderPublicKey = (0, hex_1.bytesToHex)(payload.subarray(1, 33));
            if (options?.includeSegments) {
                segments.push({
                    name: 'Sender Public Key',
                    description: `Ed25519 public key of the sender (32 bytes)`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 31,
                    value: senderPublicKey
                });
            }
            offset += 32;
            // Parse cipher MAC (2 bytes)
            const cipherMac = (0, hex_1.bytesToHex)(payload.subarray(33, 35));
            if (options?.includeSegments) {
                segments.push({
                    name: 'Cipher MAC',
                    description: `MAC for encrypted data verification (2 bytes)`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 1,
                    value: cipherMac
                });
            }
            offset += 2;
            // Parse ciphertext (remaining bytes)
            const ciphertext = (0, hex_1.bytesToHex)(payload.subarray(35));
            if (options?.includeSegments && payload.length > 35) {
                segments.push({
                    name: 'Ciphertext',
                    description: `Encrypted message data (${payload.length - 35} bytes). Contains encrypted plaintext with this structure:
• Timestamp (4 bytes) - send time as unix timestamp
• Sync Timestamp (4 bytes) - room server only, sender's "sync messages SINCE x" timestamp
• Password (remaining bytes) - password for repeater/room`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + payload.length - 1,
                    value: ciphertext
                });
            }
            const result = {
                type: enums_1.PayloadType.AnonRequest,
                version: enums_1.PayloadVersion.Version1,
                isValid: true,
                destinationHash,
                senderPublicKey,
                cipherMac,
                ciphertext,
                ciphertextLength: payload.length - 35
            };
            if (options?.includeSegments) {
                result.segments = segments;
            }
            return result;
        }
        catch (error) {
            return {
                type: enums_1.PayloadType.AnonRequest,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: [error instanceof Error ? error.message : 'Failed to decode AnonRequest payload'],
                destinationHash: '',
                senderPublicKey: '',
                cipherMac: '',
                ciphertext: '',
                ciphertextLength: 0
            };
        }
    }
}
exports.AnonRequestPayloadDecoder = AnonRequestPayloadDecoder;
//# sourceMappingURL=anon-request.js.map
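The fixed AnonRequest wire layout parsed above (dest_hash(1) + sender_public_key(32) + cipher_mac(2) + ciphertext) can be sketched without the library. `bytesToHex` and `sliceAnonRequest` here are local stand-ins, not the vendored utilities:

```javascript
// Local stand-in for the library's hex util.
const bytesToHex = (b) => Array.from(b, (x) => x.toString(16).padStart(2, '0')).join('');

// Slice the fixed-offset fields of an AnonRequest payload (assumed layout from the
// comments in anon-request.js: 1 + 32 + 2 header bytes, then ciphertext).
function sliceAnonRequest(payload) {
  if (payload.length < 35) return null; // dest + pubkey + MAC minimum
  return {
    destinationHash: bytesToHex(payload.subarray(0, 1)),
    senderPublicKey: bytesToHex(payload.subarray(1, 33)),
    cipherMac: bytesToHex(payload.subarray(33, 35)),
    ciphertextLength: payload.length - 35,
  };
}

const demo = new Uint8Array(40); // 35 header bytes + 5 bytes of ciphertext
demo[0] = 0xab;
const parsed = sliceAnonRequest(demo);
console.log(parsed.destinationHash);  // "ab"
console.log(parsed.ciphertextLength); // 5
```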
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/anon-request.js.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"anon-request.js","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/anon-request.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAId,6CAAgE;AAChE,yCAAwD;AAExD,MAAa,yBAAyB;IACpC,MAAM,CAAC,MAAM,CAAC,OAAmB,EAAE,OAA+D;QAChG,IAAI,CAAC;YACH,iEAAiE;YACjE,8BAA8B;YAC9B,iCAAiC;YACjC,yBAAyB;YACzB,iCAAiC;YAEjC,IAAI,OAAO,CAAC,MAAM,GAAG,EAAE,EAAE,CAAC;gBACxB,MAAM,MAAM,GAAyD;oBACnE,IAAI,EAAE,mBAAW,CAAC,WAAW;oBAC7B,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,2EAA2E,CAAC;oBACrF,eAAe,EAAE,EAAE;oBACnB,eAAe,EAAE,EAAE;oBACnB,SAAS,EAAE,EAAE;oBACb,UAAU,EAAE,EAAE;oBACd,gBAAgB,EAAE,CAAC;iBACpB,CAAC;gBAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;oBAC7B,MAAM,CAAC,QAAQ,GAAG,CAAC;4BACjB,IAAI,EAAE,0BAA0B;4BAChC,WAAW,EAAE,4GAA4G;4BACzH,SAAS,EAAE,OAAO,CAAC,aAAa,IAAI,CAAC;4BACrC,OAAO,EAAE,CAAC,OAAO,CAAC,aAAa,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;4BAC1D,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC;yBAC3B,CAAC,CAAC;gBACL,CAAC;gBAED,OAAO,MAAM,CAAC;YAChB,CAAC;YAED,MAAM,QAAQ,GAAqB,EAAE,CAAC;YACtC,MAAM,aAAa,GAAG,OAAO,EAAE,aAAa,IAAI,CAAC,CAAC;YAClD,IAAI,MAAM,GAAG,CAAC,CAAC;YAEf,kCAAkC;YAClC,MAAM,eAAe,GAAG,IAAA,eAAS,EAAC,OAAO,CAAC,CAAC,CAAC,CAAC,CAAC;YAE9C,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,kBAAkB;oBACxB,WAAW,EAAE,gDAAgD,eAAe,EAAE;oBAC9E,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,eAAe;iBACvB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,qCAAqC;YACrC,MAAM,eAAe,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC;YAE5D,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,mBAAmB;oBACzB,WAAW,EAAE,6CAA6C;oBAC1D,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM,GAAG,EAAE;oBACpC,KAAK,EAAE,eAAe;iBACvB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,EAAE,CAAC;YAEb,6BAA6B;YAC7B,MAAM,SAAS,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,EAAE,EAAE,EAAE,CAAC,CAAC,CAAC;YAEvD,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,+C
AA+C;oBAC5D,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM,GAAG,CAAC;oBACnC,KAAK,EAAE,SAAS;iBACjB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,qCAAqC;YACrC,MAAM,UAAU,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC;YAEpD,IAAI,OAAO,EAAE,eAAe,IAAI,OAAO,CAAC,MAAM,GAAG,EAAE,EAAE,CAAC;gBACpD,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,2BAA2B,OAAO,CAAC,MAAM,GAAG,EAAE;;;0DAGX;oBAChD,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;oBAC3C,KAAK,EAAE,UAAU;iBAClB,CAAC,CAAC;YACL,CAAC;YAED,MAAM,MAAM,GAAyD;gBACnE,IAAI,EAAE,mBAAW,CAAC,WAAW;gBAC7B,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,IAAI;gBACb,eAAe;gBACf,eAAe;gBACf,SAAS;gBACT,UAAU;gBACV,gBAAgB,EAAE,OAAO,CAAC,MAAM,GAAG,EAAE;aACtC,CAAC;YAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,MAAM,CAAC,QAAQ,GAAG,QAAQ,CAAC;YAC7B,CAAC;YAED,OAAO,MAAM,CAAC;QAChB,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,WAAW;gBAC7B,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,CAAC,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,sCAAsC,CAAC;gBACzF,eAAe,EAAE,EAAE;gBACnB,eAAe,EAAE,EAAE;gBACnB,SAAS,EAAE,EAAE;gBACb,UAAU,EAAE,EAAE;gBACd,gBAAgB,EAAE,CAAC;aACpB,CAAC;QACJ,CAAC;IACH,CAAC;CACF;AA/HD,8DA+HC"}
16 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/control.d.ts vendored Normal file
@@ -0,0 +1,16 @@
import { ControlPayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
export declare class ControlPayloadDecoder {
    static decode(payload: Uint8Array, options?: {
        includeSegments?: boolean;
        segmentOffset?: number;
    }): (ControlPayload & {
        segments?: PayloadSegment[];
    }) | null;
    private static decodeDiscoverReq;
    private static decodeDiscoverResp;
    private static parseTypeFilter;
    private static createErrorPayload;
    private static readUint32LE;
}
//# sourceMappingURL=control.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/control.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"control.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/control.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,cAAc,EAAyD,MAAM,sBAAsB,CAAC;AAC7G,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAKpD,qBAAa,qBAAqB;IAChC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,CAAC,cAAc,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,CAAC,GAAG,IAAI;IAsB9J,OAAO,CAAC,MAAM,CAAC,iBAAiB;IAoHhC,OAAO,CAAC,MAAM,CAAC,kBAAkB;IAwHjC,OAAO,CAAC,MAAM,CAAC,eAAe;IAS9B,OAAO,CAAC,MAAM,CAAC,kBAAkB;IAgCjC,OAAO,CAAC,MAAM,CAAC,YAAY;CAM5B"}
279 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/control.js vendored Normal file
@@ -0,0 +1,279 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.ControlPayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
const enum_names_1 = require("../../utils/enum-names");
class ControlPayloadDecoder {
    static decode(payload, options) {
        try {
            if (payload.length < 1) {
                return this.createErrorPayload('Control payload too short (minimum 1 byte required)', payload, options);
            }
            const rawFlags = payload[0];
            const subType = rawFlags & 0xF0; // upper 4 bits
            switch (subType) {
                case enums_1.ControlSubType.NodeDiscoverReq:
                    return this.decodeDiscoverReq(payload, options);
                case enums_1.ControlSubType.NodeDiscoverResp:
                    return this.decodeDiscoverResp(payload, options);
                default:
                    return this.createErrorPayload(`Unknown control sub-type: 0x${subType.toString(16).padStart(2, '0')}`, payload, options);
            }
        }
        catch (error) {
            return this.createErrorPayload(error instanceof Error ? error.message : 'Failed to decode control payload', payload, options);
        }
    }
    static decodeDiscoverReq(payload, options) {
        const segments = [];
        const segmentOffset = options?.segmentOffset ?? 0;
        // Minimum size: flags(1) + type_filter(1) + tag(4) = 6 bytes
        if (payload.length < 6) {
            const result = {
                type: enums_1.PayloadType.Control,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: ['DISCOVER_REQ payload too short (minimum 6 bytes required)'],
                subType: enums_1.ControlSubType.NodeDiscoverReq,
                rawFlags: payload[0],
                prefixOnly: false,
                typeFilter: 0,
                typeFilterNames: [],
                tag: 0,
                since: 0
            };
            if (options?.includeSegments) {
                result.segments = [{
                        name: 'Invalid DISCOVER_REQ Data',
                        description: 'DISCOVER_REQ payload too short (minimum 6 bytes required)',
                        startByte: segmentOffset,
                        endByte: segmentOffset + payload.length - 1,
                        value: (0, hex_1.bytesToHex)(payload)
                    }];
            }
            return result;
        }
        let offset = 0;
        // Byte 0: flags - upper 4 bits is sub_type (0x8), lowest bit is prefix_only
        const rawFlags = payload[offset];
        const prefixOnly = (rawFlags & 0x01) !== 0;
        if (options?.includeSegments) {
            segments.push({
                name: 'Flags',
                description: `Sub-type: DISCOVER_REQ (0x8) | Prefix Only: ${prefixOnly}`,
                startByte: segmentOffset + offset,
                endByte: segmentOffset + offset,
                value: rawFlags.toString(16).padStart(2, '0').toUpperCase()
            });
        }
        offset += 1;
        // Byte 1: type_filter - bit for each ADV_TYPE_*
        const typeFilter = payload[offset];
        const typeFilterNames = this.parseTypeFilter(typeFilter);
        if (options?.includeSegments) {
            segments.push({
                name: 'Type Filter',
                description: `Filter mask: 0b${typeFilter.toString(2).padStart(8, '0')} | Types: ${typeFilterNames.length > 0 ? typeFilterNames.join(', ') : 'None'}`,
                startByte: segmentOffset + offset,
                endByte: segmentOffset + offset,
                value: typeFilter.toString(16).padStart(2, '0').toUpperCase()
            });
        }
        offset += 1;
        // Bytes 2-5: tag (uint32, little endian)
        const tag = this.readUint32LE(payload, offset);
        if (options?.includeSegments) {
            segments.push({
                name: 'Tag',
                description: `Random tag for response matching: 0x${tag.toString(16).padStart(8, '0')}`,
                startByte: segmentOffset + offset,
                endByte: segmentOffset + offset + 3,
                value: (0, hex_1.bytesToHex)(payload.slice(offset, offset + 4))
            });
        }
        offset += 4;
        // Optional: Bytes 6-9: since (uint32, little endian) - epoch timestamp
        let since = 0;
        if (payload.length >= offset + 4) {
            since = this.readUint32LE(payload, offset);
            if (options?.includeSegments) {
                const sinceDate = since > 0 ? new Date(since * 1000).toISOString().slice(0, 19) + 'Z' : 'N/A';
                segments.push({
                    name: 'Since',
                    description: `Filter timestamp: ${since} (${sinceDate})`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 3,
                    value: (0, hex_1.bytesToHex)(payload.slice(offset, offset + 4))
                });
            }
        }
        const result = {
            type: enums_1.PayloadType.Control,
            version: enums_1.PayloadVersion.Version1,
            isValid: true,
            subType: enums_1.ControlSubType.NodeDiscoverReq,
            rawFlags,
            prefixOnly,
            typeFilter,
            typeFilterNames,
            tag,
            since
        };
        if (options?.includeSegments) {
            result.segments = segments;
        }
        return result;
    }
    static decodeDiscoverResp(payload, options) {
        const segments = [];
        const segmentOffset = options?.segmentOffset ?? 0;
        // Minimum size: flags(1) + snr(1) + tag(4) + pubkey(8 for prefix) = 14 bytes
        if (payload.length < 14) {
            const result = {
                type: enums_1.PayloadType.Control,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: ['DISCOVER_RESP payload too short (minimum 14 bytes required)'],
                subType: enums_1.ControlSubType.NodeDiscoverResp,
                rawFlags: payload.length > 0 ? payload[0] : 0,
                nodeType: enums_1.DeviceRole.Unknown,
                nodeTypeName: 'Unknown',
                snr: 0,
                tag: 0,
                publicKey: '',
                publicKeyLength: 0
            };
            if (options?.includeSegments) {
                result.segments = [{
                        name: 'Invalid DISCOVER_RESP Data',
                        description: 'DISCOVER_RESP payload too short (minimum 14 bytes required)',
                        startByte: segmentOffset,
                        endByte: segmentOffset + payload.length - 1,
                        value: (0, hex_1.bytesToHex)(payload)
                    }];
            }
            return result;
        }
        let offset = 0;
        // Byte 0: flags - upper 4 bits is sub_type (0x9), lower 4 bits is node_type
        const rawFlags = payload[offset];
        const nodeType = (rawFlags & 0x0F);
        const nodeTypeName = (0, enum_names_1.getDeviceRoleName)(nodeType);
        if (options?.includeSegments) {
            segments.push({
                name: 'Flags',
                description: `Sub-type: DISCOVER_RESP (0x9) | Node Type: ${nodeTypeName}`,
                startByte: segmentOffset + offset,
                endByte: segmentOffset + offset,
                value: rawFlags.toString(16).padStart(2, '0').toUpperCase()
            });
        }
        offset += 1;
        // Byte 1: snr (signed int8, represents SNR * 4)
        const snrRaw = payload[offset];
        const snrSigned = snrRaw > 127 ? snrRaw - 256 : snrRaw;
        const snr = snrSigned / 4.0;
        if (options?.includeSegments) {
            segments.push({
                name: 'SNR',
                description: `Inbound SNR: ${snr.toFixed(2)} dB (raw: ${snrRaw}, signed: ${snrSigned})`,
                startByte: segmentOffset + offset,
                endByte: segmentOffset + offset,
                value: snrRaw.toString(16).padStart(2, '0').toUpperCase()
            });
        }
        offset += 1;
        // Bytes 2-5: tag (uint32, little endian) - reflected from request
        const tag = this.readUint32LE(payload, offset);
        if (options?.includeSegments) {
            segments.push({
                name: 'Tag',
                description: `Reflected tag from request: 0x${tag.toString(16).padStart(8, '0')}`,
                startByte: segmentOffset + offset,
                endByte: segmentOffset + offset + 3,
                value: (0, hex_1.bytesToHex)(payload.slice(offset, offset + 4))
            });
        }
        offset += 4;
        // Remaining bytes: public key (8 bytes for prefix, 32 bytes for full)
        const remainingBytes = payload.length - offset;
        const publicKeyLength = remainingBytes;
        const publicKeyBytes = payload.slice(offset, offset + publicKeyLength);
        const publicKey = (0, hex_1.bytesToHex)(publicKeyBytes);
        if (options?.includeSegments) {
            const keyType = publicKeyLength === 32 ? 'Full Public Key' : 'Public Key Prefix';
            segments.push({
                name: keyType,
                description: `${keyType} (${publicKeyLength} bytes)`,
                startByte: segmentOffset + offset,
                endByte: segmentOffset + offset + publicKeyLength - 1,
                value: publicKey
            });
        }
        const result = {
            type: enums_1.PayloadType.Control,
            version: enums_1.PayloadVersion.Version1,
            isValid: true,
            subType: enums_1.ControlSubType.NodeDiscoverResp,
            rawFlags,
            nodeType,
            nodeTypeName,
            snr,
            tag,
            publicKey,
            publicKeyLength
        };
        if (options?.includeSegments) {
            result.segments = segments;
        }
        return result;
    }
    static parseTypeFilter(filter) {
        const types = [];
        if (filter & (1 << enums_1.DeviceRole.ChatNode))
            types.push('Chat');
        if (filter & (1 << enums_1.DeviceRole.Repeater))
            types.push('Repeater');
        if (filter & (1 << enums_1.DeviceRole.RoomServer))
            types.push('Room');
        if (filter & (1 << enums_1.DeviceRole.Sensor))
            types.push('Sensor');
        return types;
    }
    static createErrorPayload(error, payload, options) {
        const result = {
            type: enums_1.PayloadType.Control,
            version: enums_1.PayloadVersion.Version1,
            isValid: false,
            errors: [error],
            subType: enums_1.ControlSubType.NodeDiscoverReq,
            rawFlags: payload.length > 0 ? payload[0] : 0,
            prefixOnly: false,
            typeFilter: 0,
            typeFilterNames: [],
            tag: 0,
            since: 0
        };
        if (options?.includeSegments) {
            result.segments = [{
                    name: 'Invalid Control Data',
                    description: error,
                    startByte: options.segmentOffset ?? 0,
                    endByte: (options.segmentOffset ?? 0) + payload.length - 1,
                    value: (0, hex_1.bytesToHex)(payload)
                }];
        }
        return result;
    }
    static readUint32LE(buffer, offset) {
        return (buffer[offset] |
            (buffer[offset + 1] << 8) |
            (buffer[offset + 2] << 16) |
            (buffer[offset + 3] << 24)) >>> 0; // >>> 0 to ensure unsigned
    }
}
exports.ControlPayloadDecoder = ControlPayloadDecoder;
//# sourceMappingURL=control.js.map
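The bit layouts that control.js decodes can be sketched in isolation: the sub-type occupies the upper nibble of the flags byte (DISCOVER_REQ keeps `prefix_only` in bit 0, DISCOVER_RESP keeps the node type in the lower nibble), the type filter is one bit per device role, and SNR is a signed int8 carrying SNR × 4. The role bit positions below (Chat=1, Repeater=2, Room=3, Sensor=4) are an assumption matching the `1 << DeviceRole.*` usage above:

```javascript
// Assumed DeviceRole numeric values, mirroring parseTypeFilter's bit positions.
const ROLES = { 1: 'Chat', 2: 'Repeater', 3: 'Room', 4: 'Sensor' };

// Split a Control flags byte into its sub-type nibble and low bits.
function parseControlFlags(rawFlags) {
  return {
    subType: rawFlags & 0xF0,             // e.g. 0x80 DISCOVER_REQ, 0x90 DISCOVER_RESP
    lowNibble: rawFlags & 0x0F,           // node type for DISCOVER_RESP
    prefixOnly: (rawFlags & 0x01) !== 0,  // meaningful for DISCOVER_REQ
  };
}

// Expand the one-byte type filter into role names.
function parseTypeFilter(filter) {
  return Object.entries(ROLES)
    .filter(([bit]) => filter & (1 << Number(bit)))
    .map(([, name]) => name);
}

// Recover dB from the raw SNR byte (signed int8, scaled by 4).
function decodeSnr(snrRaw) {
  const signed = snrRaw > 127 ? snrRaw - 256 : snrRaw;
  return signed / 4;
}

console.log(parseControlFlags(0x81)); // subType 0x80 (DISCOVER_REQ), prefixOnly true
console.log(parseTypeFilter(0b00000110)); // [ 'Chat', 'Repeater' ]
console.log(decodeSnr(0xF8)); // -2
```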
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/control.js.map vendored Normal file
File diff suppressed because one or more lines are too long
12 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/group-text.d.ts vendored Normal file
@@ -0,0 +1,12 @@
import { GroupTextPayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
import { DecryptionOptions } from '../../types/crypto';
export declare class GroupTextPayloadDecoder {
    static decode(payload: Uint8Array, options?: DecryptionOptions & {
        includeSegments?: boolean;
        segmentOffset?: number;
    }): GroupTextPayload & {
        segments?: PayloadSegment[];
    } | null;
}
//# sourceMappingURL=group-text.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/group-text.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"group-text.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/group-text.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,gBAAgB,EAAE,MAAM,sBAAsB,CAAC;AACxD,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAEpD,OAAO,EAAE,iBAAiB,EAAE,MAAM,oBAAoB,CAAC;AAIvD,qBAAa,uBAAuB;IAClC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE,iBAAiB,GAAG;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,gBAAgB,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI;CAyHnL"}
118 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/group-text.js vendored Normal file
@@ -0,0 +1,118 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.GroupTextPayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const channel_crypto_1 = require("../../crypto/channel-crypto");
const hex_1 = require("../../utils/hex");
class GroupTextPayloadDecoder {
    static decode(payload, options) {
        try {
            if (payload.length < 3) {
                const result = {
                    type: enums_1.PayloadType.GroupText,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: ['GroupText payload too short (need at least channel_hash(1) + MAC(2))'],
                    channelHash: '',
                    cipherMac: '',
                    ciphertext: '',
                    ciphertextLength: 0
                };
                if (options?.includeSegments) {
                    result.segments = [{
                            name: 'Invalid GroupText Data',
                            description: 'GroupText payload too short (minimum 3 bytes required)',
                            startByte: options.segmentOffset || 0,
                            endByte: (options.segmentOffset || 0) + payload.length - 1,
                            value: (0, hex_1.bytesToHex)(payload)
                        }];
                }
                return result;
            }
            const segments = [];
            const segmentOffset = options?.segmentOffset || 0;
            let offset = 0;
            // channel hash (1 byte) - first byte of SHA256 of channel's shared key
            const channelHash = (0, hex_1.byteToHex)(payload[offset]);
            if (options?.includeSegments) {
                segments.push({
                    name: 'Channel Hash',
                    description: 'First byte of SHA256 of channel\'s shared key',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: channelHash
                });
            }
            offset += 1;
            // MAC (2 bytes) - message authentication code
            const cipherMac = (0, hex_1.bytesToHex)(payload.subarray(offset, offset + 2));
            if (options?.includeSegments) {
                segments.push({
                    name: 'Cipher MAC',
                    description: 'MAC for encrypted data',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 1,
                    value: cipherMac
                });
            }
            offset += 2;
            // ciphertext (remaining bytes) - encrypted message
            const ciphertext = (0, hex_1.bytesToHex)(payload.subarray(offset));
            if (options?.includeSegments && payload.length > offset) {
                segments.push({
                    name: 'Ciphertext',
                    description: 'Encrypted message content (timestamp + flags + message)',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + payload.length - 1,
                    value: ciphertext
                });
            }
            const groupText = {
                type: enums_1.PayloadType.GroupText,
                version: enums_1.PayloadVersion.Version1,
                isValid: true,
                channelHash,
                cipherMac,
                ciphertext,
                ciphertextLength: payload.length - 3
            };
            // attempt decryption if key store is provided
            if (options?.keyStore && options.keyStore.hasChannelKey(channelHash)) {
                // try all possible keys for this hash (handles collisions)
                const channelKeys = options.keyStore.getChannelKeys(channelHash);
                for (const channelKey of channelKeys) {
                    const decryptionResult = channel_crypto_1.ChannelCrypto.decryptGroupTextMessage(ciphertext, cipherMac, channelKey);
                    if (decryptionResult.success && decryptionResult.data) {
                        groupText.decrypted = {
                            timestamp: decryptionResult.data.timestamp,
                            flags: decryptionResult.data.flags,
                            sender: decryptionResult.data.sender,
                            message: decryptionResult.data.message
                        };
                        break; // stop trying keys once we find one that works
                    }
                }
            }
            if (options?.includeSegments) {
                groupText.segments = segments;
            }
            return groupText;
        }
        catch (error) {
            return {
                type: enums_1.PayloadType.GroupText,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: [error instanceof Error ? error.message : 'Failed to decode GroupText payload'],
                channelHash: '',
                cipherMac: '',
                ciphertext: '',
                ciphertextLength: 0
            };
        }
    }
}
exports.GroupTextPayloadDecoder = GroupTextPayloadDecoder;
//# sourceMappingURL=group-text.js.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/group-text.js.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"group-text.js","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/group-text.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAId,6CAAgE;AAEhE,gEAA4D;AAC5D,yCAAwD;AAExD,MAAa,uBAAuB;IAClC,MAAM,CAAC,MAAM,CAAC,OAAmB,EAAE,OAAmF;QACpH,IAAI,CAAC;YACH,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACvB,MAAM,MAAM,GAAuD;oBACjE,IAAI,EAAE,mBAAW,CAAC,SAAS;oBAC3B,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,sEAAsE,CAAC;oBAChF,WAAW,EAAE,EAAE;oBACf,SAAS,EAAE,EAAE;oBACb,UAAU,EAAE,EAAE;oBACd,gBAAgB,EAAE,CAAC;iBACpB,CAAC;gBAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;oBAC7B,MAAM,CAAC,QAAQ,GAAG,CAAC;4BACjB,IAAI,EAAE,wBAAwB;4BAC9B,WAAW,EAAE,wDAAwD;4BACrE,SAAS,EAAE,OAAO,CAAC,aAAa,IAAI,CAAC;4BACrC,OAAO,EAAE,CAAC,OAAO,CAAC,aAAa,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;4BAC1D,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC;yBAC3B,CAAC,CAAC;gBACL,CAAC;gBAED,OAAO,MAAM,CAAC;YAChB,CAAC;YAED,MAAM,QAAQ,GAAqB,EAAE,CAAC;YACtC,MAAM,aAAa,GAAG,OAAO,EAAE,aAAa,IAAI,CAAC,CAAC;YAClD,IAAI,MAAM,GAAG,CAAC,CAAC;YAEf,uEAAuE;YACvE,MAAM,WAAW,GAAG,IAAA,eAAS,EAAC,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC;YAC/C,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,cAAc;oBACpB,WAAW,EAAE,+CAA+C;oBAC5D,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,WAAW;iBACnB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,8CAA8C;YAC9C,MAAM,SAAS,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,EAAE,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC;YACnE,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,wBAAwB;oBACrC,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM,GAAG,CAAC;oBACnC,KAAK,EAAE,SAAS;iBACjB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,mDAAmD;YACnD,MAAM,UAAU,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC;YACxD,IAAI,OAAO,EAAE,eAAe,IAAI,OAAO,CAAC,MAAM,GAAG,MAAM,EAAE,CAAC;gBACxD,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,yDAAyD;oBACtE,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG
,OAAO,CAAC,MAAM,GAAG,CAAC;oBAC3C,KAAK,EAAE,UAAU;iBAClB,CAAC,CAAC;YACL,CAAC;YAED,MAAM,SAAS,GAAuD;gBACpE,IAAI,EAAE,mBAAW,CAAC,SAAS;gBAC3B,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,IAAI;gBACb,WAAW;gBACX,SAAS;gBACT,UAAU;gBACV,gBAAgB,EAAE,OAAO,CAAC,MAAM,GAAG,CAAC;aACrC,CAAC;YAEF,8CAA8C;YAC9C,IAAI,OAAO,EAAE,QAAQ,IAAI,OAAO,CAAC,QAAQ,CAAC,aAAa,CAAC,WAAW,CAAC,EAAE,CAAC;gBACrE,2DAA2D;gBAC3D,MAAM,WAAW,GAAG,OAAO,CAAC,QAAQ,CAAC,cAAc,CAAC,WAAW,CAAC,CAAC;gBAEjE,KAAK,MAAM,UAAU,IAAI,WAAW,EAAE,CAAC;oBACrC,MAAM,gBAAgB,GAAG,8BAAa,CAAC,uBAAuB,CAC5D,UAAU,EACV,SAAS,EACT,UAAU,CACX,CAAC;oBAEF,IAAI,gBAAgB,CAAC,OAAO,IAAI,gBAAgB,CAAC,IAAI,EAAE,CAAC;wBACtD,SAAS,CAAC,SAAS,GAAG;4BACpB,SAAS,EAAE,gBAAgB,CAAC,IAAI,CAAC,SAAS;4BAC1C,KAAK,EAAE,gBAAgB,CAAC,IAAI,CAAC,KAAK;4BAClC,MAAM,EAAE,gBAAgB,CAAC,IAAI,CAAC,MAAM;4BACpC,OAAO,EAAE,gBAAgB,CAAC,IAAI,CAAC,OAAO;yBACvC,CAAC;wBACF,MAAM,CAAC,+CAA+C;oBACxD,CAAC;gBACH,CAAC;YACH,CAAC;YAED,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,SAAS,CAAC,QAAQ,GAAG,QAAQ,CAAC;YAChC,CAAC;YAED,OAAO,SAAS,CAAC;QACnB,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,SAAS;gBAC3B,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,CAAC,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,oCAAoC,CAAC;gBACvF,WAAW,EAAE,EAAE;gBACf,SAAS,EAAE,EAAE;gBACb,UAAU,EAAE,EAAE;gBACd,gBAAgB,EAAE,CAAC;aACpB,CAAC;QACJ,CAAC;IACH,CAAC;CACF;AA1HD,0DA0HC"}
5 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/path.d.ts vendored Normal file
@@ -0,0 +1,5 @@
import { PathPayload } from '../../types/payloads';
export declare class PathPayloadDecoder {
    static decode(payload: Uint8Array): PathPayload | null;
}
//# sourceMappingURL=path.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/path.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"path.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/path.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,WAAW,EAAE,MAAM,sBAAsB,CAAC;AAInD,qBAAa,kBAAkB;IAC7B,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,GAAG,WAAW,GAAG,IAAI;CA6FvD"}
97 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/path.js vendored Normal file
@@ -0,0 +1,97 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.PathPayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
class PathPayloadDecoder {
    static decode(payload) {
        try {
            // Based on MeshCore payloads.md - Path payload structure:
            // - path_len (1 byte, encoded: bits 7:6 = hash size selector, bits 5:0 = hop count)
            // - path (variable length) - list of node hashes (pathHashSize bytes each)
            // - extra_type (1 byte) - bundled payload type
            // - extra (rest of data) - bundled payload content
            if (payload.length < 2) {
                return {
                    type: enums_1.PayloadType.Path,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: ['Path payload too short (minimum 2 bytes: path length + extra type)'],
                    pathLength: 0,
                    pathHashes: [],
                    extraType: 0,
                    extraData: ''
                };
            }
            const pathLenByte = payload[0];
            const pathHashSize = (pathLenByte >> 6) + 1;
            const pathHopCount = pathLenByte & 63;
            const pathByteLength = pathHopCount * pathHashSize;
            if (pathHashSize === 4) {
                return {
                    type: enums_1.PayloadType.Path,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: ['Invalid path length byte: reserved hash size (bits 7:6 = 11)'],
                    pathLength: 0,
                    pathHashes: [],
                    extraType: 0,
                    extraData: ''
                };
            }
            if (payload.length < 1 + pathByteLength + 1) {
                return {
                    type: enums_1.PayloadType.Path,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: [`Path payload too short (need ${1 + pathByteLength + 1} bytes for path length + path + extra type)`],
                    pathLength: pathHopCount,
                    ...(pathHashSize > 1 ? { pathHashSize } : {}),
                    pathHashes: [],
                    extraType: 0,
                    extraData: ''
                };
            }
            // Parse path hashes (pathHashSize bytes each)
            const pathHashes = [];
            for (let i = 0; i < pathHopCount; i++) {
                const hashStart = 1 + i * pathHashSize;
                const hashBytes = payload.subarray(hashStart, hashStart + pathHashSize);
                pathHashes.push((0, hex_1.bytesToHex)(hashBytes));
            }
            // Parse extra type (1 byte after path)
            const extraType = payload[1 + pathByteLength];
            // Parse extra data (remaining bytes)
            let extraData = '';
            if (payload.length > 1 + pathByteLength + 1) {
                extraData = (0, hex_1.bytesToHex)(payload.subarray(1 + pathByteLength + 1));
            }
            return {
                type: enums_1.PayloadType.Path,
                version: enums_1.PayloadVersion.Version1,
                isValid: true,
                pathLength: pathHopCount,
                ...(pathHashSize > 1 ? { pathHashSize } : {}),
                pathHashes,
                extraType,
                extraData
            };
        }
        catch (error) {
            return {
                type: enums_1.PayloadType.Path,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: [error instanceof Error ? error.message : 'Failed to decode Path payload'],
                pathLength: 0,
                pathHashes: [],
                extraType: 0,
                extraData: ''
            };
        }
    }
}
exports.PathPayloadDecoder = PathPayloadDecoder;
//# sourceMappingURL=path.js.map
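The `path_len` encoding parsed above (bits 7:6 select the per-hop hash size, bits 5:0 carry the hop count) can be sketched standalone. This is an illustrative reimplementation of that bit layout, not an import of the vendored decoder; the function name `parsePathLen` is made up for the example.

```javascript
// Hypothetical helper mirroring the decoder's path_len parsing.
function parsePathLen(pathLenByte) {
  const pathHashSize = (pathLenByte >> 6) + 1; // bits 7:6 + 1 => hash size in bytes (value 4 is reserved)
  const pathHopCount = pathLenByte & 63;       // bits 5:0 => number of hops in the path
  return { pathHashSize, pathHopCount };
}

// 0x43 = 0b01_000011 -> 2-byte hashes, 3 hops
console.log(parsePathLen(0x43)); // { pathHashSize: 2, pathHopCount: 3 }
```

The path bytes that follow are then `pathHopCount * pathHashSize` long, which is exactly the `pathByteLength` bound the decoder checks before slicing hashes.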
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/path.js.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"path.js","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/path.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAGd,6CAAgE;AAChE,yCAA6C;AAE7C,MAAa,kBAAkB;IAC7B,MAAM,CAAC,MAAM,CAAC,OAAmB;QAC/B,IAAI,CAAC;YACH,0DAA0D;YAC1D,oFAAoF;YACpF,2EAA2E;YAC3E,+CAA+C;YAC/C,mDAAmD;YAEnD,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACvB,OAAO;oBACL,IAAI,EAAE,mBAAW,CAAC,IAAI;oBACtB,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,oEAAoE,CAAC;oBAC9E,UAAU,EAAE,CAAC;oBACb,UAAU,EAAE,EAAE;oBACd,SAAS,EAAE,CAAC;oBACZ,SAAS,EAAE,EAAE;iBACd,CAAC;YACJ,CAAC;YAED,MAAM,WAAW,GAAG,OAAO,CAAC,CAAC,CAAC,CAAC;YAC/B,MAAM,YAAY,GAAG,CAAC,WAAW,IAAI,CAAC,CAAC,GAAG,CAAC,CAAC;YAC5C,MAAM,YAAY,GAAG,WAAW,GAAG,EAAE,CAAC;YACtC,MAAM,cAAc,GAAG,YAAY,GAAG,YAAY,CAAC;YAEnD,IAAI,YAAY,KAAK,CAAC,EAAE,CAAC;gBACvB,OAAO;oBACL,IAAI,EAAE,mBAAW,CAAC,IAAI;oBACtB,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,8DAA8D,CAAC;oBACxE,UAAU,EAAE,CAAC;oBACb,UAAU,EAAE,EAAE;oBACd,SAAS,EAAE,CAAC;oBACZ,SAAS,EAAE,EAAE;iBACd,CAAC;YACJ,CAAC;YAED,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,GAAG,cAAc,GAAG,CAAC,EAAE,CAAC;gBAC5C,OAAO;oBACL,IAAI,EAAE,mBAAW,CAAC,IAAI;oBACtB,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,gCAAgC,CAAC,GAAG,cAAc,GAAG,CAAC,6CAA6C,CAAC;oBAC7G,UAAU,EAAE,YAAY;oBACxB,GAAG,CAAC,YAAY,GAAG,CAAC,CAAC,CAAC,CAAC,EAAE,YAAY,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC;oBAC7C,UAAU,EAAE,EAAE;oBACd,SAAS,EAAE,CAAC;oBACZ,SAAS,EAAE,EAAE;iBACd,CAAC;YACJ,CAAC;YAED,8CAA8C;YAC9C,MAAM,UAAU,GAAa,EAAE,CAAC;YAChC,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,YAAY,EAAE,CAAC,EAAE,EAAE,CAAC;gBACtC,MAAM,SAAS,GAAG,CAAC,GAAG,CAAC,GAAG,YAAY,CAAC;gBACvC,MAAM,SAAS,GAAG,OAAO,CAAC,QAAQ,CAAC,SAAS,EAAE,SAAS,GAAG,YAAY,CAAC,CAAC;gBACxE,UAAU,CAAC,IAAI,CAAC,IAAA,gBAAU,EAAC,SAAS,CAAC,CAAC,CAAC;YACzC,CAAC;YAED,uCAAuC;YACvC,MAAM,SAAS,GAAG,OAAO,CAAC,CAAC,GAAG,cAAc,CAAC,CAAC;YAE9C,qCAAqC;YACrC,IAAI,SAAS,GAAG,EAAE,CAAC;YACnB,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,GAAG,cAAc,GAAG,CAAC,EAAE,CAAC;gBAC5C,SAAS,GAAG,IAAA
,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,CAAC,GAAG,cAAc,GAAG,CAAC,CAAC,CAAC,CAAC;YACnE,CAAC;YAED,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,IAAI;gBACtB,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,IAAI;gBACb,UAAU,EAAE,YAAY;gBACxB,GAAG,CAAC,YAAY,GAAG,CAAC,CAAC,CAAC,CAAC,EAAE,YAAY,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC;gBAC7C,UAAU;gBACV,SAAS;gBACT,SAAS;aACV,CAAC;QACJ,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,IAAI;gBACtB,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,CAAC,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,+BAA+B,CAAC;gBAClF,UAAU,EAAE,CAAC;gBACb,UAAU,EAAE,EAAE;gBACd,SAAS,EAAE,CAAC;gBACZ,SAAS,EAAE,EAAE;aACd,CAAC;QACJ,CAAC;IACH,CAAC;CACF;AA9FD,gDA8FC"}
11 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/request.d.ts vendored Normal file
@@ -0,0 +1,11 @@
import { RequestPayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
export declare class RequestPayloadDecoder {
    static decode(payload: Uint8Array, options?: {
        includeSegments?: boolean;
        segmentOffset?: number;
    }): RequestPayload & {
        segments?: PayloadSegment[];
    } | null;
}
//# sourceMappingURL=request.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/request.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"request.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/request.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,cAAc,EAAE,MAAM,sBAAsB,CAAC;AACtD,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAIpD,qBAAa,qBAAqB;IAChC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,cAAc,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI;CAqI7J"}
129 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/request.js vendored Normal file
@@ -0,0 +1,129 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.RequestPayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
class RequestPayloadDecoder {
    static decode(payload, options) {
        try {
            // Based on MeshCore payloads.md - Request payload structure:
            // - destination hash (1 byte)
            // - source hash (1 byte)
            // - cipher MAC (2 bytes)
            // - ciphertext (rest of payload) - contains encrypted timestamp, request type, and request data
            if (payload.length < 4) {
                const result = {
                    type: enums_1.PayloadType.Request,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: ['Request payload too short (minimum 4 bytes: dest hash + source hash + MAC)'],
                    timestamp: 0,
                    requestType: enums_1.RequestType.GetStats,
                    requestData: '',
                    destinationHash: '',
                    sourceHash: '',
                    cipherMac: '',
                    ciphertext: ''
                };
                if (options?.includeSegments) {
                    result.segments = [{
                            name: 'Invalid Request Data',
                            description: 'Request payload too short (minimum 4 bytes required: 1 for dest hash + 1 for source hash + 2 for MAC)',
                            startByte: options.segmentOffset || 0,
                            endByte: (options.segmentOffset || 0) + payload.length - 1,
                            value: (0, hex_1.bytesToHex)(payload)
                        }];
                }
                return result;
            }
            const segments = [];
            const segmentOffset = options?.segmentOffset || 0;
            let offset = 0;
            // Parse destination hash (1 byte)
            const destinationHash = (0, hex_1.bytesToHex)(payload.subarray(offset, offset + 1));
            if (options?.includeSegments) {
                segments.push({
                    name: 'Destination Hash',
                    description: `First byte of destination node public key: 0x${destinationHash}`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: destinationHash
                });
            }
            offset += 1;
            // Parse source hash (1 byte)
            const sourceHash = (0, hex_1.bytesToHex)(payload.subarray(offset, offset + 1));
            if (options?.includeSegments) {
                segments.push({
                    name: 'Source Hash',
                    description: `First byte of source node public key: 0x${sourceHash}`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: sourceHash
                });
            }
            offset += 1;
            // Parse cipher MAC (2 bytes)
            const cipherMac = (0, hex_1.bytesToHex)(payload.subarray(offset, offset + 2));
            if (options?.includeSegments) {
                segments.push({
                    name: 'Cipher MAC',
                    description: `MAC for encrypted data verification (2 bytes)`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 1,
                    value: cipherMac
                });
            }
            offset += 2;
            // Parse ciphertext (remaining bytes)
            const ciphertext = (0, hex_1.bytesToHex)(payload.subarray(offset));
            if (options?.includeSegments && payload.length > offset) {
                segments.push({
                    name: 'Ciphertext',
                    description: `Encrypted message data (${payload.length - offset} bytes). Contains encrypted plaintext with this structure:
• Timestamp (4 bytes) - send time as unix timestamp
• Request Type (1 byte) - type of request (GetStats, GetTelemetryData, etc.)
• Request Data (remaining bytes) - additional request-specific data`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + payload.length - 1,
                    value: ciphertext
                });
            }
            const result = {
                type: enums_1.PayloadType.Request,
                version: enums_1.PayloadVersion.Version1,
                isValid: true,
                timestamp: 0, // Encrypted, cannot be parsed without decryption
                requestType: enums_1.RequestType.GetStats, // Encrypted, cannot be determined without decryption
                requestData: '',
                destinationHash,
                sourceHash,
                cipherMac,
                ciphertext
            };
            if (options?.includeSegments) {
                result.segments = segments;
            }
            return result;
        }
        catch (error) {
            return {
                type: enums_1.PayloadType.Request,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: [error instanceof Error ? error.message : 'Failed to decode request payload'],
                timestamp: 0,
                requestType: enums_1.RequestType.GetStats,
                requestData: '',
                destinationHash: '',
                sourceHash: '',
                cipherMac: '',
                ciphertext: ''
            };
        }
    }
}
exports.RequestPayloadDecoder = RequestPayloadDecoder;
//# sourceMappingURL=request.js.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/request.js.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"request.js","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/request.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAId,6CAA6E;AAC7E,yCAA6C;AAE7C,MAAa,qBAAqB;IAChC,MAAM,CAAC,MAAM,CAAC,OAAmB,EAAE,OAA+D;QAChG,IAAI,CAAC;YACH,6DAA6D;YAC7D,8BAA8B;YAC9B,yBAAyB;YACzB,yBAAyB;YACzB,gGAAgG;YAEhG,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACvB,MAAM,MAAM,GAAqD;oBAC/D,IAAI,EAAE,mBAAW,CAAC,OAAO;oBACzB,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,4EAA4E,CAAC;oBACtF,SAAS,EAAE,CAAC;oBACZ,WAAW,EAAE,mBAAW,CAAC,QAAQ;oBACjC,WAAW,EAAE,EAAE;oBACf,eAAe,EAAE,EAAE;oBACnB,UAAU,EAAE,EAAE;oBACd,SAAS,EAAE,EAAE;oBACb,UAAU,EAAE,EAAE;iBACf,CAAC;gBAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;oBAC7B,MAAM,CAAC,QAAQ,GAAG,CAAC;4BACjB,IAAI,EAAE,sBAAsB;4BAC5B,WAAW,EAAE,uGAAuG;4BACpH,SAAS,EAAE,OAAO,CAAC,aAAa,IAAI,CAAC;4BACrC,OAAO,EAAE,CAAC,OAAO,CAAC,aAAa,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;4BAC1D,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC;yBAC3B,CAAC,CAAC;gBACL,CAAC;gBAED,OAAO,MAAM,CAAC;YAChB,CAAC;YAED,MAAM,QAAQ,GAAqB,EAAE,CAAC;YACtC,MAAM,aAAa,GAAG,OAAO,EAAE,aAAa,IAAI,CAAC,CAAC;YAClD,IAAI,MAAM,GAAG,CAAC,CAAC;YAEf,kCAAkC;YAClC,MAAM,eAAe,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,EAAE,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC;YAEzE,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,kBAAkB;oBACxB,WAAW,EAAE,gDAAgD,eAAe,EAAE;oBAC9E,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,eAAe;iBACvB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,6BAA6B;YAC7B,MAAM,UAAU,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,EAAE,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC;YAEpE,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,aAAa;oBACnB,WAAW,EAAE,2CAA2C,UAAU,EAAE;oBACpE,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,UAAU;iBAClB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,6BAA6B;YAC7B,MAAM,SAAS,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,EAAE,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC;YAEnE,IAAI,OAA
O,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,+CAA+C;oBAC5D,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM,GAAG,CAAC;oBACnC,KAAK,EAAE,SAAS;iBACjB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,qCAAqC;YACrC,MAAM,UAAU,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC;YAExD,IAAI,OAAO,EAAE,eAAe,IAAI,OAAO,CAAC,MAAM,GAAG,MAAM,EAAE,CAAC;gBACxD,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,2BAA2B,OAAO,CAAC,MAAM,GAAG,MAAM;;;oEAGL;oBAC1D,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;oBAC3C,KAAK,EAAE,UAAU;iBAClB,CAAC,CAAC;YACL,CAAC;YAED,MAAM,MAAM,GAAqD;gBAC/D,IAAI,EAAE,mBAAW,CAAC,OAAO;gBACzB,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,IAAI;gBACb,SAAS,EAAE,CAAC,EAAE,iDAAiD;gBAC/D,WAAW,EAAE,mBAAW,CAAC,QAAQ,EAAE,qDAAqD;gBACxF,WAAW,EAAE,EAAE;gBACf,eAAe;gBACf,UAAU;gBACV,SAAS;gBACT,UAAU;aACX,CAAC;YAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,MAAM,CAAC,QAAQ,GAAG,QAAQ,CAAC;YAC7B,CAAC;YAED,OAAO,MAAM,CAAC;QAChB,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,OAAO;gBACzB,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,CAAC,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,kCAAkC,CAAC;gBACrF,SAAS,EAAE,CAAC;gBACZ,WAAW,EAAE,mBAAW,CAAC,QAAQ;gBACjC,WAAW,EAAE,EAAE;gBACf,eAAe,EAAE,EAAE;gBACnB,UAAU,EAAE,EAAE;gBACd,SAAS,EAAE,EAAE;gBACb,UAAU,EAAE,EAAE;aACf,CAAC;QACJ,CAAC;IACH,CAAC;CAEF;AAtID,sDAsIC"}
11 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/response.d.ts vendored Normal file
@@ -0,0 +1,11 @@
import { ResponsePayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
export declare class ResponsePayloadDecoder {
    static decode(payload: Uint8Array, options?: {
        includeSegments?: boolean;
        segmentOffset?: number;
    }): ResponsePayload & {
        segments?: PayloadSegment[];
    } | null;
}
//# sourceMappingURL=response.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/response.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"response.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/response.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,eAAe,EAAE,MAAM,sBAAsB,CAAC;AACvD,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAIpD,qBAAa,sBAAsB;IACjC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,eAAe,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI;CAuH9J"}
120 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/response.js vendored Normal file
@@ -0,0 +1,120 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.ResponsePayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
class ResponsePayloadDecoder {
    static decode(payload, options) {
        try {
            // Based on MeshCore payloads.md - Response payload structure:
            // - destination_hash (1 byte)
            // - source_hash (1 byte)
            // - cipher_mac (2 bytes)
            // - ciphertext (rest of payload)
            if (payload.length < 4) {
                const result = {
                    type: enums_1.PayloadType.Response,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: ['Response payload too short (minimum 4 bytes: dest + source + MAC)'],
                    destinationHash: '',
                    sourceHash: '',
                    cipherMac: '',
                    ciphertext: '',
                    ciphertextLength: 0
                };
                if (options?.includeSegments) {
                    result.segments = [{
                            name: 'Invalid Response Data',
                            description: 'Response payload too short (minimum 4 bytes required)',
                            startByte: options.segmentOffset || 0,
                            endByte: (options.segmentOffset || 0) + payload.length - 1,
                            value: (0, hex_1.bytesToHex)(payload)
                        }];
                }
                return result;
            }
            const segments = [];
            const segmentOffset = options?.segmentOffset || 0;
            let offset = 0;
            // Destination Hash (1 byte)
            const destinationHash = (0, hex_1.byteToHex)(payload[offset]);
            if (options?.includeSegments) {
                segments.push({
                    name: 'Destination Hash',
                    description: 'First byte of destination node public key',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: destinationHash
                });
            }
            offset += 1;
            // source hash (1 byte)
            const sourceHash = (0, hex_1.byteToHex)(payload[offset]);
            if (options?.includeSegments) {
                segments.push({
                    name: 'Source Hash',
                    description: 'First byte of source node public key',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: sourceHash
                });
            }
            offset += 1;
            // cipher MAC (2 bytes)
            const cipherMac = (0, hex_1.bytesToHex)(payload.subarray(offset, offset + 2));
            if (options?.includeSegments) {
                segments.push({
                    name: 'Cipher MAC',
                    description: 'MAC for encrypted data in next field',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 1,
                    value: cipherMac
                });
            }
            offset += 2;
            // ciphertext (remaining bytes)
            const ciphertext = (0, hex_1.bytesToHex)(payload.subarray(offset));
            if (options?.includeSegments && payload.length > offset) {
                segments.push({
                    name: 'Ciphertext',
                    description: 'Encrypted response data (tag + content)',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + payload.length - 1,
                    value: ciphertext
                });
            }
            const result = {
                type: enums_1.PayloadType.Response,
                version: enums_1.PayloadVersion.Version1,
                isValid: true,
                destinationHash,
                sourceHash,
                cipherMac,
                ciphertext,
                ciphertextLength: payload.length - 4
            };
            if (options?.includeSegments) {
                result.segments = segments;
            }
            return result;
        }
        catch (error) {
            return {
                type: enums_1.PayloadType.Response,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: [error instanceof Error ? error.message : 'Failed to decode response payload'],
                destinationHash: '',
                sourceHash: '',
                cipherMac: '',
                ciphertext: '',
                ciphertextLength: 0
            };
        }
    }
}
exports.ResponsePayloadDecoder = ResponsePayloadDecoder;
//# sourceMappingURL=response.js.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/response.js.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"response.js","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/response.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAId,6CAAgE;AAChE,yCAAwD;AAExD,MAAa,sBAAsB;IACjC,MAAM,CAAC,MAAM,CAAC,OAAmB,EAAE,OAA+D;QAChG,IAAI,CAAC;YACH,8DAA8D;YAC9D,8BAA8B;YAC9B,yBAAyB;YACzB,yBAAyB;YACzB,iCAAiC;YAEjC,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACvB,MAAM,MAAM,GAAsD;oBAChE,IAAI,EAAE,mBAAW,CAAC,QAAQ;oBAC1B,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,mEAAmE,CAAC;oBAC7E,eAAe,EAAE,EAAE;oBACnB,UAAU,EAAE,EAAE;oBACd,SAAS,EAAE,EAAE;oBACb,UAAU,EAAE,EAAE;oBACd,gBAAgB,EAAE,CAAC;iBACpB,CAAC;gBAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;oBAC7B,MAAM,CAAC,QAAQ,GAAG,CAAC;4BACjB,IAAI,EAAE,uBAAuB;4BAC7B,WAAW,EAAE,uDAAuD;4BACpE,SAAS,EAAE,OAAO,CAAC,aAAa,IAAI,CAAC;4BACrC,OAAO,EAAE,CAAC,OAAO,CAAC,aAAa,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;4BAC1D,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC;yBAC3B,CAAC,CAAC;gBACL,CAAC;gBAED,OAAO,MAAM,CAAC;YAChB,CAAC;YAED,MAAM,QAAQ,GAAqB,EAAE,CAAC;YACtC,MAAM,aAAa,GAAG,OAAO,EAAE,aAAa,IAAI,CAAC,CAAC;YAClD,IAAI,MAAM,GAAG,CAAC,CAAC;YAEf,4BAA4B;YAC5B,MAAM,eAAe,GAAG,IAAA,eAAS,EAAC,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC;YACnD,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,kBAAkB;oBACxB,WAAW,EAAE,2CAA2C;oBACxD,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,eAAe;iBACvB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,uBAAuB;YACvB,MAAM,UAAU,GAAG,IAAA,eAAS,EAAC,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC;YAC9C,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,aAAa;oBACnB,WAAW,EAAE,sCAAsC;oBACnD,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,UAAU;iBAClB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,uBAAuB;YACvB,MAAM,SAAS,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,EAAE,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC;YACnE,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,sCAAsC;oBACnD,SAAS,EAAE,aAAa,GAAG,MAAM;oBACj
C,OAAO,EAAE,aAAa,GAAG,MAAM,GAAG,CAAC;oBACnC,KAAK,EAAE,SAAS;iBACjB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,+BAA+B;YAC/B,MAAM,UAAU,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC;YACxD,IAAI,OAAO,EAAE,eAAe,IAAI,OAAO,CAAC,MAAM,GAAG,MAAM,EAAE,CAAC;gBACxD,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,yCAAyC;oBACtD,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;oBAC3C,KAAK,EAAE,UAAU;iBAClB,CAAC,CAAC;YACL,CAAC;YAED,MAAM,MAAM,GAAsD;gBAChE,IAAI,EAAE,mBAAW,CAAC,QAAQ;gBAC1B,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,IAAI;gBACb,eAAe;gBACf,UAAU;gBACV,SAAS;gBACT,UAAU;gBACV,gBAAgB,EAAE,OAAO,CAAC,MAAM,GAAG,CAAC;aACrC,CAAC;YAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,MAAM,CAAC,QAAQ,GAAG,QAAQ,CAAC;YAC7B,CAAC;YAED,OAAO,MAAM,CAAC;QAChB,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,QAAQ;gBAC1B,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,CAAC,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,mCAAmC,CAAC;gBACtF,eAAe,EAAE,EAAE;gBACnB,UAAU,EAAE,EAAE;gBACd,SAAS,EAAE,EAAE;gBACb,UAAU,EAAE,EAAE;gBACd,gBAAgB,EAAE,CAAC;aACpB,CAAC;QACJ,CAAC;IACH,CAAC;CACF;AAxHD,wDAwHC"}
11 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/text-message.d.ts vendored Normal file
@@ -0,0 +1,11 @@
import { TextMessagePayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
export declare class TextMessagePayloadDecoder {
    static decode(payload: Uint8Array, options?: {
        includeSegments?: boolean;
        segmentOffset?: number;
    }): TextMessagePayload & {
        segments?: PayloadSegment[];
    } | null;
}
//# sourceMappingURL=text-message.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/text-message.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"text-message.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/text-message.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,kBAAkB,EAAE,MAAM,sBAAsB,CAAC;AAC1D,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAIpD,qBAAa,yBAAyB;IACpC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,kBAAkB,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI;CAuHjK"}
120 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/text-message.js vendored Normal file
@@ -0,0 +1,120 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.TextMessagePayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
class TextMessagePayloadDecoder {
    static decode(payload, options) {
        try {
            // Based on MeshCore payloads.md - TextMessage payload structure:
            // - destination_hash (1 byte)
            // - source_hash (1 byte)
            // - cipher_mac (2 bytes)
            // - ciphertext (rest of payload)
            if (payload.length < 4) {
                const result = {
                    type: enums_1.PayloadType.TextMessage,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: ['TextMessage payload too short (minimum 4 bytes: dest + source + MAC)'],
                    destinationHash: '',
                    sourceHash: '',
                    cipherMac: '',
                    ciphertext: '',
                    ciphertextLength: 0
                };
                if (options?.includeSegments) {
                    result.segments = [{
                            name: 'Invalid TextMessage Data',
                            description: 'TextMessage payload too short (minimum 4 bytes required)',
                            startByte: options.segmentOffset || 0,
                            endByte: (options.segmentOffset || 0) + payload.length - 1,
                            value: (0, hex_1.bytesToHex)(payload)
                        }];
                }
                return result;
            }
            const segments = [];
            const segmentOffset = options?.segmentOffset || 0;
            let offset = 0;
            // Destination Hash (1 byte)
            const destinationHash = (0, hex_1.byteToHex)(payload[offset]);
            if (options?.includeSegments) {
                segments.push({
                    name: 'Destination Hash',
                    description: 'First byte of destination node public key',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: destinationHash
                });
            }
            offset += 1;
            // Source Hash (1 byte)
            const sourceHash = (0, hex_1.byteToHex)(payload[offset]);
            if (options?.includeSegments) {
                segments.push({
                    name: 'Source Hash',
                    description: 'First byte of source node public key',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: sourceHash
                });
            }
            offset += 1;
            // Cipher MAC (2 bytes)
            const cipherMac = (0, hex_1.bytesToHex)(payload.subarray(offset, offset + 2));
            if (options?.includeSegments) {
                segments.push({
                    name: 'Cipher MAC',
                    description: 'MAC for encrypted data in next field',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 1,
                    value: cipherMac
                });
            }
            offset += 2;
            // Ciphertext (remaining bytes)
            const ciphertext = (0, hex_1.bytesToHex)(payload.subarray(offset));
            if (options?.includeSegments && payload.length > offset) {
                segments.push({
                    name: 'Ciphertext',
                    description: 'Encrypted message data (timestamp + message text)',
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + payload.length - 1,
                    value: ciphertext
                });
            }
            const result = {
                type: enums_1.PayloadType.TextMessage,
                version: enums_1.PayloadVersion.Version1,
                isValid: true,
                destinationHash,
                sourceHash,
                cipherMac,
                ciphertext,
                ciphertextLength: payload.length - 4
            };
            if (options?.includeSegments) {
                result.segments = segments;
            }
            return result;
        }
        catch (error) {
            return {
                type: enums_1.PayloadType.TextMessage,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: [error instanceof Error ? error.message : 'Failed to decode TextMessage payload'],
                destinationHash: '',
                sourceHash: '',
                cipherMac: '',
                ciphertext: '',
                ciphertextLength: 0
            };
        }
    }
}
exports.TextMessagePayloadDecoder = TextMessagePayloadDecoder;
//# sourceMappingURL=text-message.js.map
|
||||
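The wire layout that `TextMessagePayloadDecoder` walks above can be sketched as a tiny standalone parser: a 1-byte destination hash, a 1-byte source hash, a 2-byte cipher MAC, then ciphertext to the end of the payload. This is an illustrative sketch, not the library's API; `toHex` and `sketchDecodeTextMessage` are hypothetical helper names.

```javascript
// Illustrative sketch of the TextMessage payload layout parsed above:
// [dest hash: 1 B][source hash: 1 B][cipher MAC: 2 B][ciphertext: rest]
// Helper names are stand-ins, not the meshcore-decoder API.
const toHex = (bytes) =>
  Array.from(bytes, (b) => b.toString(16).padStart(2, '0').toUpperCase()).join('');

function sketchDecodeTextMessage(payload) {
  if (payload.length < 4) {
    return { isValid: false, errors: ['payload shorter than 4-byte header'] };
  }
  return {
    isValid: true,
    destinationHash: toHex(payload.subarray(0, 1)),
    sourceHash: toHex(payload.subarray(1, 2)),
    cipherMac: toHex(payload.subarray(2, 4)),
    ciphertext: toHex(payload.subarray(4)),
    ciphertextLength: payload.length - 4, // mirrors `payload.length - 4` above
  };
}

const decoded = sketchDecodeTextMessage(
  Uint8Array.from([0xab, 0xcd, 0x12, 0x34, 0xde, 0xad])
);
// decoded.destinationHash === 'AB', decoded.cipherMac === '1234',
// decoded.ciphertext === 'DEAD', decoded.ciphertextLength === 2
```

Note how the fixed 4-byte header is why the real decoder reports `ciphertextLength: payload.length - 4` and returns `ciphertextLength: 0` on error.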
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/text-message.js.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"text-message.js","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/text-message.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAId,6CAAgE;AAChE,yCAAwD;AAExD,MAAa,yBAAyB;IACpC,MAAM,CAAC,MAAM,CAAC,OAAmB,EAAE,OAA+D;QAChG,IAAI,CAAC;YACH,iEAAiE;YACjE,8BAA8B;YAC9B,yBAAyB;YACzB,yBAAyB;YACzB,iCAAiC;YAEjC,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACvB,MAAM,MAAM,GAAyD;oBACnE,IAAI,EAAE,mBAAW,CAAC,WAAW;oBAC7B,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,sEAAsE,CAAC;oBAChF,eAAe,EAAE,EAAE;oBACnB,UAAU,EAAE,EAAE;oBACd,SAAS,EAAE,EAAE;oBACb,UAAU,EAAE,EAAE;oBACd,gBAAgB,EAAE,CAAC;iBACpB,CAAC;gBAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;oBAC7B,MAAM,CAAC,QAAQ,GAAG,CAAC;4BACjB,IAAI,EAAE,0BAA0B;4BAChC,WAAW,EAAE,0DAA0D;4BACvE,SAAS,EAAE,OAAO,CAAC,aAAa,IAAI,CAAC;4BACrC,OAAO,EAAE,CAAC,OAAO,CAAC,aAAa,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;4BAC1D,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC;yBAC3B,CAAC,CAAC;gBACL,CAAC;gBAED,OAAO,MAAM,CAAC;YAChB,CAAC;YAED,MAAM,QAAQ,GAAqB,EAAE,CAAC;YACtC,MAAM,aAAa,GAAG,OAAO,EAAE,aAAa,IAAI,CAAC,CAAC;YAClD,IAAI,MAAM,GAAG,CAAC,CAAC;YAEf,4BAA4B;YAC5B,MAAM,eAAe,GAAG,IAAA,eAAS,EAAC,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC;YACnD,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,kBAAkB;oBACxB,WAAW,EAAE,2CAA2C;oBACxD,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,eAAe;iBACvB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,uBAAuB;YACvB,MAAM,UAAU,GAAG,IAAA,eAAS,EAAC,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC;YAC9C,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,aAAa;oBACnB,WAAW,EAAE,sCAAsC;oBACnD,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,UAAU;iBAClB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,uBAAuB;YACvB,MAAM,SAAS,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,EAAE,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC;YACnE,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,sCAAsC;oBACnD,SAAS,EAAE,aAAa,GAAG,MA
AM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM,GAAG,CAAC;oBACnC,KAAK,EAAE,SAAS;iBACjB,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,+BAA+B;YAC/B,MAAM,UAAU,GAAG,IAAA,gBAAU,EAAC,OAAO,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC;YACxD,IAAI,OAAO,EAAE,eAAe,IAAI,OAAO,CAAC,MAAM,GAAG,MAAM,EAAE,CAAC;gBACxD,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,YAAY;oBAClB,WAAW,EAAE,mDAAmD;oBAChE,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;oBAC3C,KAAK,EAAE,UAAU;iBAClB,CAAC,CAAC;YACL,CAAC;YAED,MAAM,MAAM,GAAyD;gBACnE,IAAI,EAAE,mBAAW,CAAC,WAAW;gBAC7B,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,IAAI;gBACb,eAAe;gBACf,UAAU;gBACV,SAAS;gBACT,UAAU;gBACV,gBAAgB,EAAE,OAAO,CAAC,MAAM,GAAG,CAAC;aACrC,CAAC;YAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,MAAM,CAAC,QAAQ,GAAG,QAAQ,CAAC;YAC7B,CAAC;YAED,OAAO,MAAM,CAAC;QAChB,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,WAAW;gBAC7B,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,CAAC,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,sCAAsC,CAAC;gBACzF,eAAe,EAAE,EAAE;gBACnB,UAAU,EAAE,EAAE;gBACd,SAAS,EAAE,EAAE;gBACb,UAAU,EAAE,EAAE;gBACd,gBAAgB,EAAE,CAAC;aACpB,CAAC;QACJ,CAAC;IACH,CAAC;CACF;AAxHD,8DAwHC"}
12 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/trace.d.ts vendored Normal file
@@ -0,0 +1,12 @@
import { TracePayload } from '../../types/payloads';
import { PayloadSegment } from '../../types/packet';
export declare class TracePayloadDecoder {
    static decode(payload: Uint8Array, pathData?: string[] | null, options?: {
        includeSegments?: boolean;
        segmentOffset?: number;
    }): TracePayload & {
        segments?: PayloadSegment[];
    } | null;
    private static readUint32LE;
}
//# sourceMappingURL=trace.d.ts.map
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/trace.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"trace.d.ts","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/trace.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,YAAY,EAAE,MAAM,sBAAsB,CAAC;AACpD,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AAIpD,qBAAa,mBAAmB;IAC9B,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,UAAU,EAAE,QAAQ,CAAC,EAAE,MAAM,EAAE,GAAG,IAAI,EAAE,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,OAAO,CAAC;QAAC,aAAa,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,YAAY,GAAG;QAAE,QAAQ,CAAC,EAAE,cAAc,EAAE,CAAA;KAAE,GAAG,IAAI;IAuItL,OAAO,CAAC,MAAM,CAAC,YAAY;CAM5B"}
136 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/trace.js vendored Normal file
@@ -0,0 +1,136 @@
"use strict";
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
Object.defineProperty(exports, "__esModule", { value: true });
exports.TracePayloadDecoder = void 0;
const enums_1 = require("../../types/enums");
const hex_1 = require("../../utils/hex");
class TracePayloadDecoder {
    static decode(payload, pathData, options) {
        try {
            if (payload.length < 9) {
                const result = {
                    type: enums_1.PayloadType.Trace,
                    version: enums_1.PayloadVersion.Version1,
                    isValid: false,
                    errors: ['Trace payload too short (need at least tag(4) + auth(4) + flags(1))'],
                    traceTag: '00000000',
                    authCode: 0,
                    flags: 0,
                    pathHashes: []
                };
                if (options?.includeSegments) {
                    result.segments = [{
                            name: 'Invalid Trace Data',
                            description: 'Trace payload too short (minimum 9 bytes required)',
                            startByte: options.segmentOffset || 0,
                            endByte: (options.segmentOffset || 0) + payload.length - 1,
                            value: (0, hex_1.bytesToHex)(payload)
                        }];
                }
                return result;
            }
            let offset = 0;
            const segments = [];
            const segmentOffset = options?.segmentOffset || 0;
            // Trace Tag (4 bytes) - unique identifier
            const traceTagRaw = this.readUint32LE(payload, offset);
            const traceTag = (0, hex_1.numberToHex)(traceTagRaw, 8);
            if (options?.includeSegments) {
                segments.push({
                    name: 'Trace Tag',
                    description: `Unique identifier for this trace: 0x${traceTagRaw.toString(16).padStart(8, '0')}`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 3,
                    value: (0, hex_1.bytesToHex)(payload.slice(offset, offset + 4))
                });
            }
            offset += 4;
            // Auth Code (4 bytes) - authentication/verification code
            const authCode = this.readUint32LE(payload, offset);
            if (options?.includeSegments) {
                segments.push({
                    name: 'Auth Code',
                    description: `Authentication/verification code: ${authCode}`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset + 3,
                    value: (0, hex_1.bytesToHex)(payload.slice(offset, offset + 4))
                });
            }
            offset += 4;
            // Flags (1 byte) - application-defined control flags
            const flags = payload[offset];
            if (options?.includeSegments) {
                segments.push({
                    name: 'Flags',
                    description: `Application-defined control flags: 0x${flags.toString(16).padStart(2, '0')} (${flags.toString(2).padStart(8, '0')}b)`,
                    startByte: segmentOffset + offset,
                    endByte: segmentOffset + offset,
                    value: flags.toString(16).padStart(2, '0').toUpperCase()
                });
            }
            offset += 1;
            // remaining bytes are path hashes (node hashes in the trace path)
            const pathHashes = [];
            const pathHashesStart = offset;
            while (offset < payload.length) {
                pathHashes.push((0, hex_1.byteToHex)(payload[offset]));
                offset++;
            }
            if (options?.includeSegments && pathHashes.length > 0) {
                const pathHashesDisplay = pathHashes.join(' ');
                segments.push({
                    name: 'Path Hashes',
                    description: `Node hashes in trace path: ${pathHashesDisplay}`,
                    startByte: segmentOffset + pathHashesStart,
                    endByte: segmentOffset + payload.length - 1,
                    value: (0, hex_1.bytesToHex)(payload.slice(pathHashesStart))
                });
            }
            // extract SNR values from path field for TRACE packets
            let snrValues;
            if (pathData && pathData.length > 0) {
                snrValues = pathData.map(hexByte => {
                    const byteValue = parseInt(hexByte, 16);
                    // convert unsigned byte to signed int8 (SNR values are stored as signed int8 * 4)
                    const snrSigned = byteValue > 127 ? byteValue - 256 : byteValue;
                    return snrSigned / 4.0; // convert to dB
                });
            }
            const result = {
                type: enums_1.PayloadType.Trace,
                version: enums_1.PayloadVersion.Version1,
                isValid: true,
                traceTag,
                authCode,
                flags,
                pathHashes,
                snrValues
            };
            if (options?.includeSegments) {
                result.segments = segments;
            }
            return result;
        }
        catch (error) {
            return {
                type: enums_1.PayloadType.Trace,
                version: enums_1.PayloadVersion.Version1,
                isValid: false,
                errors: [error instanceof Error ? error.message : 'Failed to decode trace payload'],
                traceTag: '00000000',
                authCode: 0,
                flags: 0,
                pathHashes: []
            };
        }
    }
    static readUint32LE(buffer, offset) {
        return buffer[offset] |
            (buffer[offset + 1] << 8) |
            (buffer[offset + 2] << 16) |
            (buffer[offset + 3] << 24);
    }
}
exports.TracePayloadDecoder = TracePayloadDecoder;
//# sourceMappingURL=trace.js.map
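The fixed-size head of the Trace payload above (4-byte little-endian tag, 4-byte little-endian auth code, 1-byte flags) and the quarter-dB SNR encoding its `pathData` branch decodes can be exercised in isolation. This is a hedged sketch under the layout shown above, not the library API; unlike the decoder's `readUint32LE`, this version adds `>>> 0` so tags with the top bit set stay unsigned.

```javascript
// Sketch of the Trace payload head parsed above:
// [trace tag: 4 B LE][auth code: 4 B LE][flags: 1 B][path hashes: rest]
function readUint32LE(buf, off) {
  // `>>> 0` forces an unsigned result; the plain OR chain in the decoder
  // above yields a signed 32-bit value when buf[off + 3] >= 0x80.
  return (buf[off] | (buf[off + 1] << 8) | (buf[off + 2] << 16) | (buf[off + 3] << 24)) >>> 0;
}

// SNR values ride in the packet path field as signed int8 quarter-dB steps.
function snrFromPathByte(byte) {
  const signed = byte > 127 ? byte - 256 : byte; // uint8 -> int8
  return signed / 4.0; // quarter-dB steps -> dB
}

const payload = Uint8Array.from([
  0x78, 0x56, 0x34, 0x12, // trace tag = 0x12345678 (little-endian)
  0x01, 0x00, 0x00, 0x00, // auth code = 1
  0x05,                   // flags
  0xaa, 0xbb,             // two path hashes
]);
const traceTag = readUint32LE(payload, 0); // 0x12345678
const authCode = readUint32LE(payload, 4); // 1
const flags = payload[8];                  // 5
const snr = snrFromPathByte(0xff);         // -1 / 4 = -0.25 dB
```

The 9-byte minimum the decoder enforces (`payload.length < 9`) is exactly this fixed head: tag(4) + auth(4) + flags(1).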
1 frontend/lib/meshcore-decoder/dist/decoder/payload-decoders/trace.js.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"trace.js","sourceRoot":"","sources":["../../../src/decoder/payload-decoders/trace.ts"],"names":[],"mappings":";AAAA,mFAAmF;AACnF,cAAc;;;AAId,6CAAgE;AAChE,yCAAqE;AAErE,MAAa,mBAAmB;IAC9B,MAAM,CAAC,MAAM,CAAC,OAAmB,EAAE,QAA0B,EAAE,OAA+D;QAC5H,IAAI,CAAC;YACH,IAAI,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACvB,MAAM,MAAM,GAAmD;oBAC7D,IAAI,EAAE,mBAAW,CAAC,KAAK;oBACvB,OAAO,EAAE,sBAAc,CAAC,QAAQ;oBAChC,OAAO,EAAE,KAAK;oBACd,MAAM,EAAE,CAAC,qEAAqE,CAAC;oBAC/E,QAAQ,EAAE,UAAU;oBACpB,QAAQ,EAAE,CAAC;oBACX,KAAK,EAAE,CAAC;oBACR,UAAU,EAAE,EAAE;iBACf,CAAC;gBAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;oBAC7B,MAAM,CAAC,QAAQ,GAAG,CAAC;4BACjB,IAAI,EAAE,oBAAoB;4BAC1B,WAAW,EAAE,oDAAoD;4BACjE,SAAS,EAAE,OAAO,CAAC,aAAa,IAAI,CAAC;4BACrC,OAAO,EAAE,CAAC,OAAO,CAAC,aAAa,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;4BAC1D,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC;yBAC3B,CAAC,CAAC;gBACL,CAAC;gBAED,OAAO,MAAM,CAAC;YAChB,CAAC;YAED,IAAI,MAAM,GAAG,CAAC,CAAC;YACf,MAAM,QAAQ,GAAqB,EAAE,CAAC;YACtC,MAAM,aAAa,GAAG,OAAO,EAAE,aAAa,IAAI,CAAC,CAAC;YAElD,0CAA0C;YAC1C,MAAM,WAAW,GAAG,IAAI,CAAC,YAAY,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;YACvD,MAAM,QAAQ,GAAG,IAAA,iBAAW,EAAC,WAAW,EAAE,CAAC,CAAC,CAAC;YAE7C,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,WAAW;oBACjB,WAAW,EAAE,uCAAuC,WAAW,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,QAAQ,CAAC,CAAC,EAAE,GAAG,CAAC,EAAE;oBAC/F,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM,GAAG,CAAC;oBACnC,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC,KAAK,CAAC,MAAM,EAAE,MAAM,GAAG,CAAC,CAAC,CAAC;iBACrD,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,2DAA2D;YAC3D,MAAM,QAAQ,GAAG,IAAI,CAAC,YAAY,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;YAEpD,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,WAAW;oBACjB,WAAW,EAAE,qCAAqC,QAAQ,EAAE;oBAC5D,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM,GAAG,CAAC;oBACnC,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC,KAAK,CAAC,MAAM,EAAE,MAAM,GAAG,CAAC,CAAC,CAAC;iBACrD,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,qDAAqD;YACrD,MAAM,KAAK,GAAG,OAAO,CAAC,MAA
M,CAAC,CAAC;YAE9B,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,OAAO;oBACb,WAAW,EAAE,wCAAwC,KAAK,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,QAAQ,CAAC,CAAC,EAAE,GAAG,CAAC,KAAK,KAAK,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,QAAQ,CAAC,CAAC,EAAE,GAAG,CAAC,IAAI;oBACnI,SAAS,EAAE,aAAa,GAAG,MAAM;oBACjC,OAAO,EAAE,aAAa,GAAG,MAAM;oBAC/B,KAAK,EAAE,KAAK,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,QAAQ,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,WAAW,EAAE;iBACzD,CAAC,CAAC;YACL,CAAC;YACD,MAAM,IAAI,CAAC,CAAC;YAEZ,kEAAkE;YAClE,MAAM,UAAU,GAAa,EAAE,CAAC;YAChC,MAAM,eAAe,GAAG,MAAM,CAAC;YAC/B,OAAO,MAAM,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC;gBAC/B,UAAU,CAAC,IAAI,CAAC,IAAA,eAAS,EAAC,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC;gBAC5C,MAAM,EAAE,CAAC;YACX,CAAC;YAED,IAAI,OAAO,EAAE,eAAe,IAAI,UAAU,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACtD,MAAM,iBAAiB,GAAG,UAAU,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;gBAC/C,QAAQ,CAAC,IAAI,CAAC;oBACZ,IAAI,EAAE,aAAa;oBACnB,WAAW,EAAE,8BAA8B,iBAAiB,EAAE;oBAC9D,SAAS,EAAE,aAAa,GAAG,eAAe;oBAC1C,OAAO,EAAE,aAAa,GAAG,OAAO,CAAC,MAAM,GAAG,CAAC;oBAC3C,KAAK,EAAE,IAAA,gBAAU,EAAC,OAAO,CAAC,KAAK,CAAC,eAAe,CAAC,CAAC;iBAClD,CAAC,CAAC;YACL,CAAC;YAED,uDAAuD;YACvD,IAAI,SAA+B,CAAC;YACpC,IAAI,QAAQ,IAAI,QAAQ,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC;gBACpC,SAAS,GAAG,QAAQ,CAAC,GAAG,CAAC,OAAO,CAAC,EAAE;oBACjC,MAAM,SAAS,GAAG,QAAQ,CAAC,OAAO,EAAE,EAAE,CAAC,CAAC;oBACxC,kFAAkF;oBAClF,MAAM,SAAS,GAAG,SAAS,GAAG,GAAG,CAAC,CAAC,CAAC,SAAS,GAAG,GAAG,CAAC,CAAC,CAAC,SAAS,CAAC;oBAChE,OAAO,SAAS,GAAG,GAAG,CAAC,CAAC,gBAAgB;gBAC1C,CAAC,CAAC,CAAC;YACL,CAAC;YAED,MAAM,MAAM,GAAmD;gBAC7D,IAAI,EAAE,mBAAW,CAAC,KAAK;gBACvB,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,IAAI;gBACb,QAAQ;gBACR,QAAQ;gBACR,KAAK;gBACL,UAAU;gBACV,SAAS;aACV,CAAC;YAEF,IAAI,OAAO,EAAE,eAAe,EAAE,CAAC;gBAC7B,MAAM,CAAC,QAAQ,GAAG,QAAQ,CAAC;YAC7B,CAAC;YAED,OAAO,MAAM,CAAC;QAChB,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO;gBACL,IAAI,EAAE,mBAAW,CAAC,KAAK;gBACvB,OAAO,EAAE,sBAAc,CAAC,QAAQ;gBAChC,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,CAAC,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,gCAAgC,CAAC;gBACnF,QAAQ,EAAE,UAAU;gBACpB
,QAAQ,EAAE,CAAC;gBACX,KAAK,EAAE,CAAC;gBACR,UAAU,EAAE,EAAE;aACf,CAAC;QACJ,CAAC;IACH,CAAC;IAGO,MAAM,CAAC,YAAY,CAAC,MAAkB,EAAE,MAAc;QAC5D,OAAO,MAAM,CAAC,MAAM,CAAC;YACnB,CAAC,MAAM,CAAC,MAAM,GAAG,CAAC,CAAC,IAAI,CAAC,CAAC;YACzB,CAAC,MAAM,CAAC,MAAM,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC;YAC1B,CAAC,MAAM,CAAC,MAAM,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,CAAC;IAC/B,CAAC;CACF;AA9ID,kDA8IC"}
36 frontend/lib/meshcore-decoder/dist/index.d.ts vendored Normal file
@@ -0,0 +1,36 @@
export { MeshCorePacketDecoder } from './decoder/packet-decoder';
export { MeshCorePacketDecoder as MeshCoreDecoder } from './decoder/packet-decoder';
export type { DecodedPacket, PacketStructure, PacketSegment, PayloadSegment, HeaderBreakdown } from './types/packet';
export type { BasePayload, AdvertPayload, TracePayload, GroupTextPayload, RequestPayload, TextMessagePayload, AnonRequestPayload, AckPayload, PathPayload, ResponsePayload, ControlPayloadBase, ControlDiscoverReqPayload, ControlDiscoverRespPayload, ControlPayload, PayloadData } from './types/payloads';
export type { CryptoKeyStore, DecryptionOptions, DecryptionResult, ValidationResult } from './types/crypto';
export { RouteType, PayloadType, PayloadVersion, DeviceRole, AdvertFlags, RequestType, ControlSubType } from './types/enums';
export { MeshCoreKeyStore } from './crypto/key-manager';
export { ChannelCrypto } from './crypto/channel-crypto';
export { Ed25519SignatureVerifier } from './crypto/ed25519-verifier';
export { hexToBytes, bytesToHex, byteToHex, numberToHex } from './utils/hex';
export { getRouteTypeName, getPayloadTypeName, getPayloadVersionName, getDeviceRoleName, getRequestTypeName, getControlSubTypeName } from './utils/enum-names';
export { createAuthToken, verifyAuthToken, parseAuthToken, decodeAuthTokenPayload } from './utils/auth-token';
export type { AuthTokenPayload, AuthToken } from './utils/auth-token';
import * as AuthTokenUtils from './utils/auth-token';
import { derivePublicKey, validateKeyPair, sign, verify } from './crypto/orlp-ed25519-wasm';
export declare const Utils: {
    derivePublicKey: typeof derivePublicKey;
    validateKeyPair: typeof validateKeyPair;
    sign: typeof sign;
    verify: typeof verify;
    createAuthToken(payload: AuthTokenUtils.AuthTokenPayload, privateKeyHex: string, publicKeyHex: string): Promise<string>;
    verifyAuthToken(token: string, expectedPublicKeyHex?: string): Promise<AuthTokenUtils.AuthTokenPayload | null>;
    parseAuthToken(token: string): AuthTokenUtils.AuthToken | null;
    decodeAuthTokenPayload(token: string): AuthTokenUtils.AuthTokenPayload | null;
    byteToHex(byte: number): string;
    bytesToHex(bytes: Uint8Array): string;
    numberToHex(num: number, padLength?: number): string;
    hexToBytes(hex: string): Uint8Array;
    getRouteTypeName(routeType: import("./types/enums").RouteType): string;
    getPayloadTypeName(payloadType: import("./types/enums").PayloadType): string;
    getPayloadVersionName(version: import("./types/enums").PayloadVersion): string;
    getDeviceRoleName(role: import("./types/enums").DeviceRole): string;
    getRequestTypeName(requestType: import("./types/enums").RequestType): string;
    getControlSubTypeName(subType: import("./types/enums").ControlSubType): string;
};
//# sourceMappingURL=index.d.ts.map
1 frontend/lib/meshcore-decoder/dist/index.d.ts.map vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":"AAIA,OAAO,EAAE,qBAAqB,EAAE,MAAM,0BAA0B,CAAC;AACjE,OAAO,EAAE,qBAAqB,IAAI,eAAe,EAAE,MAAM,0BAA0B,CAAC;AAGpF,YAAY,EAAE,aAAa,EAAE,eAAe,EAAE,aAAa,EAAE,cAAc,EAAE,eAAe,EAAE,MAAM,gBAAgB,CAAC;AACrH,YAAY,EACV,WAAW,EACX,aAAa,EACb,YAAY,EACZ,gBAAgB,EAChB,cAAc,EACd,kBAAkB,EAClB,kBAAkB,EAClB,UAAU,EACV,WAAW,EACX,eAAe,EACf,kBAAkB,EAClB,yBAAyB,EACzB,0BAA0B,EAC1B,cAAc,EACd,WAAW,EACZ,MAAM,kBAAkB,CAAC;AAC1B,YAAY,EAAE,cAAc,EAAE,iBAAiB,EAAE,gBAAgB,EAAE,gBAAgB,EAAE,MAAM,gBAAgB,CAAC;AAG5G,OAAO,EACL,SAAS,EACT,WAAW,EACX,cAAc,EACd,UAAU,EACV,WAAW,EACX,WAAW,EACX,cAAc,EACf,MAAM,eAAe,CAAC;AAGvB,OAAO,EAAE,gBAAgB,EAAE,MAAM,sBAAsB,CAAC;AACxD,OAAO,EAAE,aAAa,EAAE,MAAM,yBAAyB,CAAC;AACxD,OAAO,EAAE,wBAAwB,EAAE,MAAM,2BAA2B,CAAC;AAGrE,OAAO,EAAE,UAAU,EAAE,UAAU,EAAE,SAAS,EAAE,WAAW,EAAE,MAAM,aAAa,CAAC;AAC7E,OAAO,EACL,gBAAgB,EAChB,kBAAkB,EAClB,qBAAqB,EACrB,iBAAiB,EACjB,kBAAkB,EAClB,qBAAqB,EACtB,MAAM,oBAAoB,CAAC;AAC5B,OAAO,EACL,eAAe,EACf,eAAe,EACf,cAAc,EACd,sBAAsB,EACvB,MAAM,oBAAoB,CAAC;AAC5B,YAAY,EAAE,gBAAgB,EAAE,SAAS,EAAE,MAAM,oBAAoB,CAAC;AAItE,OAAO,KAAK,cAAc,MAAM,oBAAoB,CAAC;AACrD,OAAO,EAAE,eAAe,EAAE,eAAe,EAAE,IAAI,EAAE,MAAM,EAAE,MAAM,4BAA4B,CAAC;AAE5F,eAAO,MAAM,KAAK;;;;;;;;;;;;;;;;;;;CAQjB,CAAC"}
91 frontend/lib/meshcore-decoder/dist/index.js vendored Normal file
@@ -0,0 +1,91 @@
"use strict";
// MeshCore Packet Decoder
// Copyright (c) 2025 Michael Hart: https://github.com/michaelhart/meshcore-decoder
// MIT License
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
    var ownKeys = function(o) {
        ownKeys = Object.getOwnPropertyNames || function (o) {
            var ar = [];
            for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
            return ar;
        };
        return ownKeys(o);
    };
    return function (mod) {
        if (mod && mod.__esModule) return mod;
        var result = {};
        if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
        __setModuleDefault(result, mod);
        return result;
    };
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.Utils = exports.decodeAuthTokenPayload = exports.parseAuthToken = exports.verifyAuthToken = exports.createAuthToken = exports.getControlSubTypeName = exports.getRequestTypeName = exports.getDeviceRoleName = exports.getPayloadVersionName = exports.getPayloadTypeName = exports.getRouteTypeName = exports.numberToHex = exports.byteToHex = exports.bytesToHex = exports.hexToBytes = exports.Ed25519SignatureVerifier = exports.ChannelCrypto = exports.MeshCoreKeyStore = exports.ControlSubType = exports.RequestType = exports.AdvertFlags = exports.DeviceRole = exports.PayloadVersion = exports.PayloadType = exports.RouteType = exports.MeshCoreDecoder = exports.MeshCorePacketDecoder = void 0;
var packet_decoder_1 = require("./decoder/packet-decoder");
Object.defineProperty(exports, "MeshCorePacketDecoder", { enumerable: true, get: function () { return packet_decoder_1.MeshCorePacketDecoder; } });
var packet_decoder_2 = require("./decoder/packet-decoder");
Object.defineProperty(exports, "MeshCoreDecoder", { enumerable: true, get: function () { return packet_decoder_2.MeshCorePacketDecoder; } });
// Enum exports
var enums_1 = require("./types/enums");
Object.defineProperty(exports, "RouteType", { enumerable: true, get: function () { return enums_1.RouteType; } });
Object.defineProperty(exports, "PayloadType", { enumerable: true, get: function () { return enums_1.PayloadType; } });
Object.defineProperty(exports, "PayloadVersion", { enumerable: true, get: function () { return enums_1.PayloadVersion; } });
Object.defineProperty(exports, "DeviceRole", { enumerable: true, get: function () { return enums_1.DeviceRole; } });
Object.defineProperty(exports, "AdvertFlags", { enumerable: true, get: function () { return enums_1.AdvertFlags; } });
Object.defineProperty(exports, "RequestType", { enumerable: true, get: function () { return enums_1.RequestType; } });
Object.defineProperty(exports, "ControlSubType", { enumerable: true, get: function () { return enums_1.ControlSubType; } });
// Crypto exports
var key_manager_1 = require("./crypto/key-manager");
Object.defineProperty(exports, "MeshCoreKeyStore", { enumerable: true, get: function () { return key_manager_1.MeshCoreKeyStore; } });
var channel_crypto_1 = require("./crypto/channel-crypto");
Object.defineProperty(exports, "ChannelCrypto", { enumerable: true, get: function () { return channel_crypto_1.ChannelCrypto; } });
var ed25519_verifier_1 = require("./crypto/ed25519-verifier");
Object.defineProperty(exports, "Ed25519SignatureVerifier", { enumerable: true, get: function () { return ed25519_verifier_1.Ed25519SignatureVerifier; } });
// Utility exports
var hex_1 = require("./utils/hex");
Object.defineProperty(exports, "hexToBytes", { enumerable: true, get: function () { return hex_1.hexToBytes; } });
Object.defineProperty(exports, "bytesToHex", { enumerable: true, get: function () { return hex_1.bytesToHex; } });
Object.defineProperty(exports, "byteToHex", { enumerable: true, get: function () { return hex_1.byteToHex; } });
Object.defineProperty(exports, "numberToHex", { enumerable: true, get: function () { return hex_1.numberToHex; } });
var enum_names_1 = require("./utils/enum-names");
Object.defineProperty(exports, "getRouteTypeName", { enumerable: true, get: function () { return enum_names_1.getRouteTypeName; } });
Object.defineProperty(exports, "getPayloadTypeName", { enumerable: true, get: function () { return enum_names_1.getPayloadTypeName; } });
Object.defineProperty(exports, "getPayloadVersionName", { enumerable: true, get: function () { return enum_names_1.getPayloadVersionName; } });
Object.defineProperty(exports, "getDeviceRoleName", { enumerable: true, get: function () { return enum_names_1.getDeviceRoleName; } });
Object.defineProperty(exports, "getRequestTypeName", { enumerable: true, get: function () { return enum_names_1.getRequestTypeName; } });
Object.defineProperty(exports, "getControlSubTypeName", { enumerable: true, get: function () { return enum_names_1.getControlSubTypeName; } });
var auth_token_1 = require("./utils/auth-token");
Object.defineProperty(exports, "createAuthToken", { enumerable: true, get: function () { return auth_token_1.createAuthToken; } });
Object.defineProperty(exports, "verifyAuthToken", { enumerable: true, get: function () { return auth_token_1.verifyAuthToken; } });
Object.defineProperty(exports, "parseAuthToken", { enumerable: true, get: function () { return auth_token_1.parseAuthToken; } });
Object.defineProperty(exports, "decodeAuthTokenPayload", { enumerable: true, get: function () { return auth_token_1.decodeAuthTokenPayload; } });
const EnumUtils = __importStar(require("./utils/enum-names"));
const HexUtils = __importStar(require("./utils/hex"));
const AuthTokenUtils = __importStar(require("./utils/auth-token"));
const orlp_ed25519_wasm_1 = require("./crypto/orlp-ed25519-wasm");
exports.Utils = {
    ...EnumUtils,
    ...HexUtils,
    ...AuthTokenUtils,
    derivePublicKey: orlp_ed25519_wasm_1.derivePublicKey,
    validateKeyPair: orlp_ed25519_wasm_1.validateKeyPair,
    sign: orlp_ed25519_wasm_1.sign,
    verify: orlp_ed25519_wasm_1.verify
};
//# sourceMappingURL=index.js.map
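The `exports.Utils` aggregation above relies on object-spread merge order: later spreads and explicit keys win on name collisions, which is how the WASM `sign`/`verify` take precedence over anything the spread namespaces might provide. A minimal sketch with stand-in objects (the module contents here are invented for illustration):

```javascript
// Sketch of the spread-merge pattern used for `Utils` above.
// These namespace objects are stand-ins, not the real modules.
const EnumUtils = { getRouteTypeName: (t) => `route-${t}` };
const HexUtils = { byteToHex: (b) => b.toString(16).padStart(2, '0') };
const wasm = { sign: () => 'wasm-sign' };

const Utils = {
  ...EnumUtils,     // earlier spreads lose on collisions...
  ...HexUtils,
  sign: wasm.sign,  // ...and explicit keys always win
};
```

Because plain spread copies values at merge time, this pattern snapshots the namespaces; it does not preserve the live-binding getters that the `Object.defineProperty` re-exports above set up.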
Some files were not shown because too many files have changed in this diff.