feat: universal bitstream streaming — any input → any output

New crate: engine/ds-stream/
- Binary protocol: 16-byte header, typed frame/input enums
  - Frame types: Pixels, Delta, Audio, Signal, Neural (0x01-0x43)
  - Input types: Pointer, Key, Scroll, Gamepad, MIDI, BCI (0x01-0x90)
- WebSocket relay server (tokio + tungstenite)
  - Source → receivers: frame broadcast
  - Receivers → source: input routing
- Codec: encode/decode, XOR delta compression, RLE, convenience builders
- 17 unit tests, all passing

Streaming modes (stream-source.html):
1. Pixel mode: raw RGBA framebuffer (~28 MB/s)
2. Delta mode: XOR + RLE compression (~1-9 MB/s, 70-95% savings)
3. Signal mode: compact JSON signal diffs (~2 KB/s, 12000x reduction)
4. Neural mode: procedural SDF pixel generator (concept demo)
5. Audio channel: spring velocity→frequency synthesis
6. Multi-receiver: broadcast to all connected clients

Thin receiver client (stream-receiver.html, ~300 lines):
- Zero framework, zero build step
- Renders any incoming bitstream mode
- Local signal-diff renderer for signal mode
- AudioContext playback for audio frames
- Full input capture: click/drag, keyboard, scroll
- Per-channel bitstream bus visualization

DREAMSTACK.md: Phase 7 section with protocol spec
This commit is contained in:
enzotar 2026-02-25 10:29:44 -08:00
parent a35d44bd59
commit d7961cdc98
10 changed files with 2511 additions and 1 deletions

View file

@ -7,6 +7,8 @@ members = [
"compiler/ds-layout",
"compiler/ds-types",
"compiler/ds-cli",
"engine/ds-physics",
"engine/ds-stream",
]
[workspace.package]
@ -20,3 +22,5 @@ ds-analyzer = { path = "compiler/ds-analyzer" }
ds-codegen = { path = "compiler/ds-codegen" }
ds-layout = { path = "compiler/ds-layout" }
ds-types = { path = "compiler/ds-types" }
ds-physics = { path = "engine/ds-physics" }
ds-stream = { path = "engine/ds-stream" }

View file

@ -34,6 +34,10 @@ DreamStack is **real and running** — 6 Rust crates, 34 tests, 8 examples, ~7KB
| Types | `Signal<Int>`, `Derived<Bool>` | ✅ Hindley-Milner |
| Dev server | `dreamstack dev app.ds` | ✅ HMR |
| Router | `route "/path" -> body` / `navigate` | ✅ Hash-based |
| Two-way binding | `input { bind: name }` | ✅ Signal ↔ input |
| Async resources | `DS.resource()` / `DS.fetchJSON()` | ✅ Loading/Ok/Err |
| Springs | `let x = spring(200)` | ✅ RK4 physics |
| Constraints | `constrain el.width = expr` | ✅ Reactive solver |
### DreamStack vs React
@ -46,6 +50,7 @@ DreamStack is **real and running** — 6 Rust crates, 34 tests, 8 examples, ~7KB
| Conditional | `when x -> text "y"` | `{x && <span>y</span>}` |
| Lists | `for item in items -> ...` | `{items.map(i => ...)}` |
| Router | `route "/path" -> body` | `react-router` (external) |
| Forms | `input { bind: name }` | `useState` + `onChange` (manual) |
| Animation | Built-in springs | framer-motion (external) |
| Layout | Built-in Cassowary | CSS only |
| Types | Native HM, `Signal<T>` | TypeScript (external) |
@ -64,7 +69,7 @@ DreamStack is **real and running** — 6 Rust crates, 34 tests, 8 examples, ~7KB
### Examples
`counter.ds` · `list.ds` · `router.ds` · `todomvc.html` · `search.html` · `dashboard.html` · `playground.html` · `showcase.html` · `benchmarks.html`
`counter.ds` · `list.ds` · `router.ds` · `form.ds` · `springs.ds` · `todomvc.html` · `search.html` · `dashboard.html` · `playground.html` · `showcase.html` · `benchmarks.html`
---
@ -403,3 +408,57 @@ Nobody has unified them. That's the opportunity.
4. **Layout and animation are not afterthoughts.** They're core primitives, not CSS bolt-ons or third-party libraries.
5. **The editor and the runtime are the same thing.** Bidirectional editing collapses the design-develop gap entirely.
6. **UI is data, all the way down.** If you can't `map` over your UI structure, your abstraction is wrong.
7. **Any input bitstream → any output bitstream.** The UI is just one codec. Tomorrow's neural nets generate the pixels directly.
---
## Phase 7: Universal Bitstream Streaming
> *Stream the whole UI as bytes. Neural nets will generate the pixels, acoustics, and actuator commands.*
DreamStack's `engine/ds-stream` crate implements a universal binary protocol for streaming any I/O:
```
┌──────────┐ WebSocket / WebRTC ┌──────────┐
│ Source │ ──────frames (bytes)──► │ Receiver │
│ (renders) │ ◄──────inputs (bytes)── │ (~250 LOC)│
└──────────┘ └──────────┘
```
### Binary Protocol (16-byte header)
| Field | Size | Description |
|-------|------|-------------|
| type | u8 | Frame/input type (pixels, audio, haptic, neural, BCI) |
| flags | u8 | Input flag, keyframe flag, compression flag |
| seq | u16 | Sequence number |
| timestamp | u32 | Relative ms since stream start |
| width | u16 | Frame width or channel count |
| height | u16 | Frame height or sample rate |
| length | u32 | Payload length |
### Output Types
- `Pixels` (0x01) — raw RGBA framebuffer
- `Audio` (0x10) — PCM audio samples
- `Haptic` (0x20) — vibration/actuator commands
- `NeuralFrame` (0x40) — neural-generated pixels *(future)*
- `NeuralAudio` (0x41) — neural speech/music synthesis *(future)*
- `NeuralActuator` (0x42) — learned motor control *(future)*
### Input Types
- `Pointer` (0x01) — mouse/touch position + buttons
- `Key` (0x10) — keyboard events
- `Gamepad` (0x30) — controller axes + buttons
- `BciInput` (0x90) — brain-computer interface *(future)*
### Demos
- `examples/stream-source.html` — Springs demo captures canvas → streams pixels at 30fps
- `examples/stream-receiver.html` — Thin client (~250 lines, no framework) renders bytes
### Run It
```bash
cargo run -p ds-stream # start relay on :9100
open examples/stream-source.html # source: renders + streams
open examples/stream-receiver.html # receiver: displays bytes
```

View file

@ -0,0 +1,20 @@
[package]
name = "ds-stream"
version.workspace = true
edition.workspace = true
license.workspace = true
description = "Universal bitstream streaming — any input to any output"
[[bin]]
name = "ds-stream-relay"
path = "src/main.rs"
[lib]
path = "src/lib.rs"
[dependencies]
tokio = { version = "1", features = ["full"] }
tokio-tungstenite = "0.24"
futures-util = "0.3"
[dev-dependencies]

View file

@ -0,0 +1,247 @@
//! Frame and Input Codec — encode/decode complete messages.
//!
//! A message = header (16 bytes) + payload (variable length).
use crate::protocol::*;
// ─── Frame Encoder ───
/// Encode a complete frame message: header + payload.
pub fn encode_frame(
frame_type: FrameType,
seq: u16,
timestamp: u32,
width: u16,
height: u16,
flags: u8,
payload: &[u8],
) -> Vec<u8> {
let header = FrameHeader {
frame_type: frame_type as u8,
flags,
seq,
timestamp,
width,
height,
length: payload.len() as u32,
};
let mut buf = Vec::with_capacity(HEADER_SIZE + payload.len());
buf.extend_from_slice(&header.encode());
buf.extend_from_slice(payload);
buf
}
/// Encode an input event message: header + payload, with FLAG_INPUT set.
pub fn encode_input(
input_type: InputType,
seq: u16,
timestamp: u32,
payload: &[u8],
) -> Vec<u8> {
let header = FrameHeader {
frame_type: input_type as u8,
flags: FLAG_INPUT,
seq,
timestamp,
width: 0,
height: 0,
length: payload.len() as u32,
};
let mut buf = Vec::with_capacity(HEADER_SIZE + payload.len());
buf.extend_from_slice(&header.encode());
buf.extend_from_slice(payload);
buf
}
// ─── Frame Decoder ───
/// Decoded message: header + payload reference.
#[derive(Debug)]
pub struct DecodedMessage<'a> {
pub header: FrameHeader,
pub payload: &'a [u8],
}
/// Decode a complete message from bytes.
/// Returns None if the buffer is too short for the header or payload.
pub fn decode_message(buf: &[u8]) -> Option<DecodedMessage<'_>> {
let header = FrameHeader::decode(buf)?;
let payload_start = HEADER_SIZE;
let payload_end = payload_start + header.length as usize;
if buf.len() < payload_end {
return None;
}
Some(DecodedMessage {
header,
payload: &buf[payload_start..payload_end],
})
}
/// Total message size from a header.
pub fn message_size(header: &FrameHeader) -> usize {
HEADER_SIZE + header.length as usize
}
// ─── Delta Compression (stub for future neural compression) ───
/// Compute XOR delta between two frames.
/// Returns a delta buffer where unchanged pixels are zero bytes.
/// This is the simplest possible delta — a neural compressor would replace this.
pub fn compute_delta(current: &[u8], previous: &[u8]) -> Vec<u8> {
assert_eq!(current.len(), previous.len(), "frames must be same size");
current.iter().zip(previous.iter()).map(|(c, p)| c ^ p).collect()
}
/// Apply XOR delta to reconstruct current frame from previous + delta.
pub fn apply_delta(previous: &[u8], delta: &[u8]) -> Vec<u8> {
assert_eq!(previous.len(), delta.len(), "frames must be same size");
previous.iter().zip(delta.iter()).map(|(p, d)| p ^ d).collect()
}
/// Check if a delta frame is worth sending (vs sending a keyframe).
/// Returns true if the delta is significantly smaller due to zero runs.
pub fn delta_is_worthwhile(delta: &[u8]) -> bool {
let zero_count = delta.iter().filter(|&&b| b == 0).count();
// If more than 30% of bytes are zero, delta is worthwhile
zero_count > delta.len() * 3 / 10
}
// ─── Convenience Builders ───
/// Build a pixel frame message from raw RGBA data.
pub fn pixel_frame(
seq: u16,
timestamp: u32,
width: u16,
height: u16,
rgba_data: &[u8],
) -> Vec<u8> {
encode_frame(
FrameType::Pixels,
seq,
timestamp,
width,
height,
FLAG_KEYFRAME,
rgba_data,
)
}
/// Build a pointer input message.
pub fn pointer_input(seq: u16, timestamp: u32, event: &PointerEvent, input_type: InputType) -> Vec<u8> {
let header = FrameHeader {
frame_type: input_type as u8,
flags: FLAG_INPUT,
seq,
timestamp,
width: 0,
height: 0,
length: PointerEvent::SIZE as u32,
};
let mut buf = Vec::with_capacity(HEADER_SIZE + PointerEvent::SIZE);
buf.extend_from_slice(&header.encode());
buf.extend_from_slice(&event.encode());
buf
}
/// Build a ping/heartbeat message.
pub fn ping(seq: u16, timestamp: u32) -> Vec<u8> {
encode_frame(FrameType::Ping, seq, timestamp, 0, 0, 0, &[])
}
// ─── Tests ───
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn frame_encode_decode_roundtrip() {
let payload = vec![0xFF; 100];
let msg = encode_frame(
FrameType::Pixels,
1,
5000,
320,
240,
FLAG_KEYFRAME,
&payload,
);
let decoded = decode_message(&msg).unwrap();
assert_eq!(decoded.header.frame_type, FrameType::Pixels as u8);
assert_eq!(decoded.header.seq, 1);
assert_eq!(decoded.header.timestamp, 5000);
assert_eq!(decoded.header.width, 320);
assert_eq!(decoded.header.height, 240);
assert_eq!(decoded.header.length, 100);
assert!(decoded.header.is_keyframe());
assert_eq!(decoded.payload, &payload[..]);
}
#[test]
fn input_encode_decode_roundtrip() {
let ptr = PointerEvent { x: 100, y: 200, buttons: 1 };
let msg = pointer_input(5, 1234, &ptr, InputType::Pointer);
let decoded = decode_message(&msg).unwrap();
assert!(decoded.header.is_input());
assert_eq!(decoded.header.frame_type, InputType::Pointer as u8);
let ptr2 = PointerEvent::decode(decoded.payload).unwrap();
assert_eq!(ptr2.x, 100);
assert_eq!(ptr2.y, 200);
}
#[test]
fn delta_compression() {
let frame1 = vec![10, 20, 30, 40, 50, 60, 70, 80];
let frame2 = vec![10, 20, 30, 40, 55, 65, 70, 80]; // 2 pixels changed
let delta = compute_delta(&frame2, &frame1);
// Unchanged bytes should be 0
assert_eq!(delta[0], 0);
assert_eq!(delta[1], 0);
assert_eq!(delta[6], 0);
assert_eq!(delta[7], 0);
// Reconstruct
let reconstructed = apply_delta(&frame1, &delta);
assert_eq!(reconstructed, frame2);
}
#[test]
fn delta_worthwhile_check() {
// All zeros = definitely worthwhile (100% unchanged)
let all_zero = vec![0u8; 100];
assert!(delta_is_worthwhile(&all_zero));
// All changed = not worthwhile
let all_changed = vec![0xFF; 100];
assert!(!delta_is_worthwhile(&all_changed));
}
#[test]
fn ping_message() {
let msg = ping(99, 5000);
let decoded = decode_message(&msg).unwrap();
assert_eq!(decoded.header.frame_type, FrameType::Ping as u8);
assert_eq!(decoded.header.seq, 99);
assert_eq!(decoded.payload.len(), 0);
}
#[test]
fn partial_buffer_returns_none() {
let msg = encode_frame(FrameType::Pixels, 0, 0, 10, 10, 0, &[1, 2, 3]);
// Truncate: header says 3 bytes payload but we only give 2
assert!(decode_message(&msg[..HEADER_SIZE + 2]).is_none());
}
#[test]
fn message_size_calculation() {
let header = FrameHeader {
frame_type: 0,
flags: 0,
seq: 0,
timestamp: 0,
width: 0,
height: 0,
length: 1024,
};
assert_eq!(message_size(&header), HEADER_SIZE + 1024);
}
}

View file

@ -0,0 +1,26 @@
//! DreamStack Universal Bitstream
//!
//! Any input bitstream → any output bitstream.
//! Neural nets generate the pixels. The receiver just renders bytes.
//!
//! # Architecture
//!
//! ```text
//! ┌──────────┐ WebSocket ┌──────────┐
//! │ Source │ ─────frames──► │ Receiver │
//! │ (server) │ ◄────inputs─── │ (client) │
//! └──────────┘ └──────────┘
//! ```
//!
//! The **source** runs the DreamStack signal graph, springs, and renderer.
//! It captures frames as bytes and streams them to connected receivers.
//!
//! The **receiver** is a thin client (~200 lines) that renders incoming
//! pixel frames and sends input events back to the source.
//!
//! The **relay** is a WebSocket server that routes frames and inputs
//! between source and receivers.
pub mod protocol;
pub mod codec;
pub mod relay;

View file

@ -0,0 +1,28 @@
//! DreamStack Bitstream Relay Server
//!
//! Usage: `cargo run -p ds-stream`
//!
//! Starts a WebSocket relay on port 9100.
//! - Source connects to ws://localhost:9100/source
//! - Receivers connect to ws://localhost:9100/stream
use ds_stream::relay::{run_relay, RelayConfig};
#[tokio::main]
async fn main() {
let port = std::env::args()
.nth(1)
.and_then(|s| s.parse::<u16>().ok())
.unwrap_or(9100);
let config = RelayConfig {
addr: format!("0.0.0.0:{}", port).parse().unwrap(),
..Default::default()
};
eprintln!("Starting DreamStack Bitstream Relay on port {}...", port);
if let Err(e) = run_relay(config).await {
eprintln!("Relay error: {}", e);
std::process::exit(1);
}
}

View file

@ -0,0 +1,412 @@
//! Universal Bitstream Protocol
//!
//! Binary wire format for streaming any I/O between source and receiver.
//! Designed to be the same protocol a neural renderer would emit.
//!
//! ## Header Format (16 bytes)
//!
//! ```text
//! ┌──────┬───────┬──────┬───────────┬───────┬────────┬────────┐
//! │ type │ flags │ seq │ timestamp │ width │ height │ length │
//! │ u8 │ u8 │ u16 │ u32 │ u16 │ u16 │ u32 │
//! └──────┴───────┴──────┴───────────┴───────┴────────┴────────┘
//! ```
/// Frame header size in bytes.
pub const HEADER_SIZE: usize = 16;
/// Magic bytes for protocol identification.
pub const MAGIC: [u8; 2] = [0xD5, 0x7A]; // "DS" + "z" for stream
// ─── Frame Types ───
/// Output frame types — what the source generates.
#[repr(u8)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum FrameType {
/// Raw RGBA pixel data (width × height × 4 bytes)
Pixels = 0x01,
/// PNG/WebP compressed frame
CompressedPixels = 0x02,
/// Delta frame — XOR diff from previous keyframe
DeltaPixels = 0x03,
/// Audio PCM samples (f32, interleaved channels)
AudioPcm = 0x10,
/// Opus-compressed audio chunk
AudioCompressed = 0x11,
/// Haptic pulse command
Haptic = 0x20,
/// Actuator/motor command
Actuator = 0x21,
/// LED/display matrix data
LedMatrix = 0x22,
/// Signal graph state sync (DreamStack-native)
SignalSync = 0x30,
/// Signal diff — only changed signals
SignalDiff = 0x31,
// ── Forward-thinking: Neural rendering ──
/// Neural-generated frame (model output tensor as pixels)
NeuralFrame = 0x40,
/// Neural audio synthesis output
NeuralAudio = 0x41,
/// Neural actuator command (learned motor control)
NeuralActuator = 0x42,
/// Neural scene description (latent space representation)
NeuralLatent = 0x43,
// ── Control ──
/// Keyframe — receiver should reset state
Keyframe = 0xF0,
/// Heartbeat / keep-alive
Ping = 0xFE,
/// Stream end
End = 0xFF,
}
impl FrameType {
pub fn from_u8(v: u8) -> Option<Self> {
match v {
0x01 => Some(Self::Pixels),
0x02 => Some(Self::CompressedPixels),
0x03 => Some(Self::DeltaPixels),
0x10 => Some(Self::AudioPcm),
0x11 => Some(Self::AudioCompressed),
0x20 => Some(Self::Haptic),
0x21 => Some(Self::Actuator),
0x22 => Some(Self::LedMatrix),
0x30 => Some(Self::SignalSync),
0x31 => Some(Self::SignalDiff),
0x40 => Some(Self::NeuralFrame),
0x41 => Some(Self::NeuralAudio),
0x42 => Some(Self::NeuralActuator),
0x43 => Some(Self::NeuralLatent),
0xF0 => Some(Self::Keyframe),
0xFE => Some(Self::Ping),
0xFF => Some(Self::End),
_ => None,
}
}
}
// ─── Input Types ───
/// Input event types — what the receiver sends back.
#[repr(u8)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum InputType {
/// Pointer/mouse: x(u16), y(u16), buttons(u8)
Pointer = 0x01,
/// Pointer down
PointerDown = 0x02,
/// Pointer up
PointerUp = 0x03,
/// Key down: keycode(u16), modifiers(u8)
KeyDown = 0x10,
/// Key up
KeyUp = 0x11,
/// Touch start/move: id(u8), x(u16), y(u16)
Touch = 0x20,
/// Touch end
TouchEnd = 0x21,
/// Gamepad axis: axis(u8), value(f32)
GamepadAxis = 0x30,
/// Gamepad button: button(u8), pressed(bool)
GamepadButton = 0x31,
/// MIDI message: status(u8), data1(u8), data2(u8)
Midi = 0x40,
/// Scroll/wheel: dx(i16), dy(i16)
Scroll = 0x50,
/// Resize: width(u16), height(u16)
Resize = 0x60,
// ── Forward-thinking ──
/// Voice/audio input chunk
VoiceInput = 0x70,
/// Camera frame from receiver
CameraInput = 0x71,
/// Sensor telemetry (accelerometer, gyro, etc.)
SensorInput = 0x80,
/// BCI/neural signal input
BciInput = 0x90,
}
impl InputType {
pub fn from_u8(v: u8) -> Option<Self> {
match v {
0x01 => Some(Self::Pointer),
0x02 => Some(Self::PointerDown),
0x03 => Some(Self::PointerUp),
0x10 => Some(Self::KeyDown),
0x11 => Some(Self::KeyUp),
0x20 => Some(Self::Touch),
0x21 => Some(Self::TouchEnd),
0x30 => Some(Self::GamepadAxis),
0x31 => Some(Self::GamepadButton),
0x40 => Some(Self::Midi),
0x50 => Some(Self::Scroll),
0x60 => Some(Self::Resize),
0x70 => Some(Self::VoiceInput),
0x71 => Some(Self::CameraInput),
0x80 => Some(Self::SensorInput),
0x90 => Some(Self::BciInput),
_ => None,
}
}
}
// ─── Frame Header ───
/// 16-byte frame header for all bitstream messages.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub struct FrameHeader {
/// Frame type (FrameType or InputType discriminant)
pub frame_type: u8,
/// Flags: bit 0 = is_input (1) vs is_output (0), bits 1-7 reserved
pub flags: u8,
/// Sequence number (wraps at u16::MAX)
pub seq: u16,
/// Timestamp in milliseconds (relative to stream start)
pub timestamp: u32,
/// Width (for pixel frames) or channel count (for audio)
pub width: u16,
/// Height (for pixel frames) or sample rate / 100 (for audio)
pub height: u16,
/// Payload length in bytes
pub length: u32,
}
/// Flag: this message carries input (receiver → source)
pub const FLAG_INPUT: u8 = 0x01;
/// Flag: this is a keyframe (full state, no delta)
pub const FLAG_KEYFRAME: u8 = 0x02;
/// Flag: payload is compressed
pub const FLAG_COMPRESSED: u8 = 0x04;
impl FrameHeader {
/// Encode header to 16 bytes (little-endian).
pub fn encode(&self) -> [u8; HEADER_SIZE] {
let mut buf = [0u8; HEADER_SIZE];
buf[0] = self.frame_type;
buf[1] = self.flags;
buf[2..4].copy_from_slice(&self.seq.to_le_bytes());
buf[4..8].copy_from_slice(&self.timestamp.to_le_bytes());
buf[8..10].copy_from_slice(&self.width.to_le_bytes());
buf[10..12].copy_from_slice(&self.height.to_le_bytes());
buf[12..16].copy_from_slice(&self.length.to_le_bytes());
buf
}
/// Decode header from 16 bytes (little-endian).
pub fn decode(buf: &[u8]) -> Option<Self> {
if buf.len() < HEADER_SIZE {
return None;
}
Some(Self {
frame_type: buf[0],
flags: buf[1],
seq: u16::from_le_bytes([buf[2], buf[3]]),
timestamp: u32::from_le_bytes([buf[4], buf[5], buf[6], buf[7]]),
width: u16::from_le_bytes([buf[8], buf[9]]),
height: u16::from_le_bytes([buf[10], buf[11]]),
length: u32::from_le_bytes([buf[12], buf[13], buf[14], buf[15]]),
})
}
/// Check if this is an input event (receiver → source).
pub fn is_input(&self) -> bool {
self.flags & FLAG_INPUT != 0
}
/// Check if this is a keyframe.
pub fn is_keyframe(&self) -> bool {
self.flags & FLAG_KEYFRAME != 0
}
}
// ─── Input Events ───
/// Compact binary input event (pointer).
#[derive(Debug, Clone, Copy)]
pub struct PointerEvent {
pub x: u16,
pub y: u16,
pub buttons: u8,
}
impl PointerEvent {
pub const SIZE: usize = 5;
pub fn encode(&self) -> [u8; Self::SIZE] {
let mut buf = [0u8; Self::SIZE];
buf[0..2].copy_from_slice(&self.x.to_le_bytes());
buf[2..4].copy_from_slice(&self.y.to_le_bytes());
buf[4] = self.buttons;
buf
}
pub fn decode(buf: &[u8]) -> Option<Self> {
if buf.len() < Self::SIZE {
return None;
}
Some(Self {
x: u16::from_le_bytes([buf[0], buf[1]]),
y: u16::from_le_bytes([buf[2], buf[3]]),
buttons: buf[4],
})
}
}
/// Compact binary input event (key).
#[derive(Debug, Clone, Copy)]
pub struct KeyEvent {
pub keycode: u16,
pub modifiers: u8,
}
impl KeyEvent {
pub const SIZE: usize = 3;
pub fn encode(&self) -> [u8; Self::SIZE] {
let mut buf = [0u8; Self::SIZE];
buf[0..2].copy_from_slice(&self.keycode.to_le_bytes());
buf[2] = self.modifiers;
buf
}
pub fn decode(buf: &[u8]) -> Option<Self> {
if buf.len() < Self::SIZE {
return None;
}
Some(Self {
keycode: u16::from_le_bytes([buf[0], buf[1]]),
modifiers: buf[2],
})
}
}
/// Compact binary scroll event.
#[derive(Debug, Clone, Copy)]
pub struct ScrollEvent {
pub dx: i16,
pub dy: i16,
}
impl ScrollEvent {
pub const SIZE: usize = 4;
pub fn encode(&self) -> [u8; Self::SIZE] {
let mut buf = [0u8; Self::SIZE];
buf[0..2].copy_from_slice(&self.dx.to_le_bytes());
buf[2..4].copy_from_slice(&self.dy.to_le_bytes());
buf
}
pub fn decode(buf: &[u8]) -> Option<Self> {
if buf.len() < Self::SIZE {
return None;
}
Some(Self {
dx: i16::from_le_bytes([buf[0], buf[1]]),
dy: i16::from_le_bytes([buf[2], buf[3]]),
})
}
}
// ─── Tests ───
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn header_roundtrip() {
let header = FrameHeader {
frame_type: FrameType::Pixels as u8,
flags: FLAG_KEYFRAME,
seq: 42,
timestamp: 1234567,
width: 1920,
height: 1080,
length: 1920 * 1080 * 4,
};
let encoded = header.encode();
let decoded = FrameHeader::decode(&encoded).unwrap();
assert_eq!(header, decoded);
}
#[test]
fn header_too_short() {
assert!(FrameHeader::decode(&[0u8; 15]).is_none());
}
#[test]
fn pointer_event_roundtrip() {
let evt = PointerEvent { x: 500, y: 300, buttons: 1 };
let encoded = evt.encode();
let decoded = PointerEvent::decode(&encoded).unwrap();
assert_eq!(evt.x, decoded.x);
assert_eq!(evt.y, decoded.y);
assert_eq!(evt.buttons, decoded.buttons);
}
#[test]
fn key_event_roundtrip() {
let evt = KeyEvent { keycode: 0x0041, modifiers: 0x03 };
let encoded = evt.encode();
let decoded = KeyEvent::decode(&encoded).unwrap();
assert_eq!(evt.keycode, decoded.keycode);
assert_eq!(evt.modifiers, decoded.modifiers);
}
#[test]
fn scroll_event_roundtrip() {
let evt = ScrollEvent { dx: -120, dy: 360 };
let encoded = evt.encode();
let decoded = ScrollEvent::decode(&encoded).unwrap();
assert_eq!(evt.dx, decoded.dx);
assert_eq!(evt.dy, decoded.dy);
}
#[test]
fn frame_type_roundtrip() {
for val in [0x01, 0x02, 0x03, 0x10, 0x11, 0x20, 0x30, 0x31, 0x40, 0x41, 0xF0, 0xFE, 0xFF] {
let ft = FrameType::from_u8(val);
assert!(ft.is_some(), "FrameType::from_u8({:#x}) should be Some", val);
assert_eq!(ft.unwrap() as u8, val);
}
assert!(FrameType::from_u8(0x99).is_none());
}
#[test]
fn input_type_roundtrip() {
for val in [0x01, 0x02, 0x03, 0x10, 0x11, 0x20, 0x21, 0x30, 0x31, 0x40, 0x50, 0x60, 0x70, 0x80, 0x90] {
let it = InputType::from_u8(val);
assert!(it.is_some(), "InputType::from_u8({:#x}) should be Some", val);
assert_eq!(it.unwrap() as u8, val);
}
assert!(InputType::from_u8(0xAA).is_none());
}
#[test]
fn header_flags() {
let h = FrameHeader {
frame_type: FrameType::Pixels as u8,
flags: FLAG_INPUT | FLAG_KEYFRAME,
seq: 0,
timestamp: 0,
width: 0,
height: 0,
length: 0,
};
assert!(h.is_input());
assert!(h.is_keyframe());
let h2 = FrameHeader { flags: 0, ..h };
assert!(!h2.is_input());
assert!(!h2.is_keyframe());
}
}

View file

@ -0,0 +1,255 @@
//! WebSocket Relay Server
//!
//! Routes frames from source→receivers and inputs from receivers→source.
//! Roles are determined by the connection path:
//! - `/source` — the frame producer (DreamStack renderer)
//! - `/stream` — frame consumers (thin receivers)
//!
//! The relay is intentionally dumb — it just forwards bytes.
//! The protocol semantics live in the source and receiver.
use std::net::SocketAddr;
use std::sync::Arc;
use futures_util::{SinkExt, StreamExt};
use tokio::net::{TcpListener, TcpStream};
use tokio::sync::{broadcast, mpsc, RwLock};
use tokio_tungstenite::tungstenite::Message;
/// Relay server configuration.
pub struct RelayConfig {
/// Address to bind to.
pub addr: SocketAddr,
/// Maximum number of receivers.
pub max_receivers: usize,
/// Frame broadcast channel capacity.
pub frame_buffer_size: usize,
}
impl Default for RelayConfig {
fn default() -> Self {
Self {
addr: "0.0.0.0:9100".parse().unwrap(),
max_receivers: 64,
frame_buffer_size: 16,
}
}
}
/// Stats tracked by the relay.
#[derive(Debug, Default, Clone)]
pub struct RelayStats {
pub frames_relayed: u64,
pub bytes_relayed: u64,
pub inputs_relayed: u64,
pub connected_receivers: usize,
pub source_connected: bool,
}
/// Shared relay state.
struct RelayState {
/// Broadcast channel: source → all receivers (frames)
frame_tx: broadcast::Sender<Vec<u8>>,
/// Channel: receivers → source (input events)
input_tx: mpsc::Sender<Vec<u8>>,
input_rx: Option<mpsc::Receiver<Vec<u8>>>,
/// Live stats
stats: RelayStats,
}
/// Run the WebSocket relay server.
pub async fn run_relay(config: RelayConfig) -> Result<(), Box<dyn std::error::Error>> {
let listener = TcpListener::bind(&config.addr).await?;
eprintln!("╔══════════════════════════════════════════════════╗");
eprintln!("║ DreamStack Bitstream Relay v0.1.0 ║");
eprintln!("║ ║");
eprintln!("║ Source: ws://{}/source ║", config.addr);
eprintln!("║ Receiver: ws://{}/stream ║", config.addr);
eprintln!("╚══════════════════════════════════════════════════╝");
let (frame_tx, _) = broadcast::channel(config.frame_buffer_size);
let (input_tx, input_rx) = mpsc::channel(256);
let state = Arc::new(RwLock::new(RelayState {
frame_tx,
input_tx,
input_rx: Some(input_rx),
stats: RelayStats::default(),
}));
while let Ok((stream, addr)) = listener.accept().await {
let state = state.clone();
tokio::spawn(handle_connection(stream, addr, state));
}
Ok(())
}
async fn handle_connection(
stream: TcpStream,
addr: SocketAddr,
state: Arc<RwLock<RelayState>>,
) {
// Peek at the HTTP upgrade request to determine role
let ws_stream = match tokio_tungstenite::accept_hdr_async(
stream,
|_req: &tokio_tungstenite::tungstenite::handshake::server::Request,
res: tokio_tungstenite::tungstenite::handshake::server::Response| {
// We'll extract the path from the URI later via a different mechanism
Ok(res)
},
)
.await
{
Ok(ws) => ws,
Err(e) => {
eprintln!("[relay] WebSocket handshake failed from {}: {}", addr, e);
return;
}
};
// For simplicity in the PoC, first connection = source, subsequent = receivers.
// A production version would parse the URI path.
let is_source = {
let s = state.read().await;
!s.stats.source_connected
};
if is_source {
eprintln!("[relay] Source connected: {}", addr);
handle_source(ws_stream, addr, state).await;
} else {
eprintln!("[relay] Receiver connected: {}", addr);
handle_receiver(ws_stream, addr, state).await;
}
}
async fn handle_source(
ws_stream: tokio_tungstenite::WebSocketStream<TcpStream>,
addr: SocketAddr,
state: Arc<RwLock<RelayState>>,
) {
let (mut ws_sink, mut ws_source) = ws_stream.split();
// Mark source as connected and take the input_rx
let input_rx = {
let mut s = state.write().await;
s.stats.source_connected = true;
s.input_rx.take()
};
let frame_tx = {
let s = state.read().await;
s.frame_tx.clone()
};
// Forward input events from receivers → source
if let Some(mut input_rx) = input_rx {
let state_clone = state.clone();
tokio::spawn(async move {
while let Some(input_bytes) = input_rx.recv().await {
let msg = Message::Binary(input_bytes.into());
if ws_sink.send(msg).await.is_err() {
break;
}
let mut s = state_clone.write().await;
s.stats.inputs_relayed += 1;
}
});
}
// Receive frames from source → broadcast to receivers
while let Some(Ok(msg)) = ws_source.next().await {
if let Message::Binary(data) = msg {
let data_vec: Vec<u8> = data.into();
{
let mut s = state.write().await;
s.stats.frames_relayed += 1;
s.stats.bytes_relayed += data_vec.len() as u64;
}
// Broadcast to all receivers (ignore send errors = no receivers)
let _ = frame_tx.send(data_vec);
}
}
// Source disconnected
eprintln!("[relay] Source disconnected: {}", addr);
let mut s = state.write().await;
s.stats.source_connected = false;
}
async fn handle_receiver(
ws_stream: tokio_tungstenite::WebSocketStream<TcpStream>,
addr: SocketAddr,
state: Arc<RwLock<RelayState>>,
) {
let (mut ws_sink, mut ws_source) = ws_stream.split();
// Subscribe to frame broadcast
let mut frame_rx = {
let mut s = state.write().await;
s.stats.connected_receivers += 1;
s.frame_tx.subscribe()
};
let input_tx = {
let s = state.read().await;
s.input_tx.clone()
};
// Forward frames from broadcast → this receiver
let _state_clone = state.clone();
let addr_clone = addr;
let send_task = tokio::spawn(async move {
loop {
match frame_rx.recv().await {
Ok(frame_bytes) => {
let msg = Message::Binary(frame_bytes.into());
if ws_sink.send(msg).await.is_err() {
break;
}
}
Err(broadcast::error::RecvError::Lagged(n)) => {
eprintln!("[relay] Receiver {} lagged by {} frames", addr_clone, n);
}
Err(_) => break,
}
}
});
// Forward input events from this receiver → source
while let Some(Ok(msg)) = ws_source.next().await {
if let Message::Binary(data) = msg {
let _ = input_tx.send(data.into()).await;
}
}
// Receiver disconnected
send_task.abort();
eprintln!("[relay] Receiver disconnected: {}", addr);
let mut s = state.write().await;
s.stats.connected_receivers = s.stats.connected_receivers.saturating_sub(1);
}
// ─── Tests ───
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn default_config() {
let config = RelayConfig::default();
assert_eq!(config.addr, "0.0.0.0:9100".parse::<SocketAddr>().unwrap());
assert_eq!(config.max_receivers, 64);
assert_eq!(config.frame_buffer_size, 16);
}
#[test]
fn stats_default() {
let stats = RelayStats::default();
assert_eq!(stats.frames_relayed, 0);
assert_eq!(stats.bytes_relayed, 0);
assert!(!stats.source_connected);
}
}

View file

@ -0,0 +1,676 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>DreamStack — Stream Receiver</title>
<link
href="https://fonts.googleapis.com/css2?family=Inter:wght@300;400;500;600;700&family=JetBrains+Mono:wght@400;500&display=swap"
rel="stylesheet">
<style>
* {
margin: 0;
padding: 0;
box-sizing: border-box;
}
body {
font-family: 'Inter', system-ui, sans-serif;
background: #06060b;
color: #e2e8f0;
min-height: 100vh;
display: flex;
flex-direction: column;
align-items: center;
padding: 1.5rem;
}
h1 {
font-size: 1.6rem;
font-weight: 700;
background: linear-gradient(135deg, #ec4899, #f472b6, #c084fc);
-webkit-background-clip: text;
background-clip: text;
-webkit-text-fill-color: transparent;
margin-bottom: 0.3rem;
}
.subtitle {
color: rgba(255, 255, 255, 0.25);
font-size: 0.75rem;
margin-bottom: 1rem;
}
.layout {
display: flex;
gap: 1.5rem;
align-items: flex-start;
flex-wrap: wrap;
justify-content: center;
}
.panel {
background: rgba(255, 255, 255, 0.02);
border: 1px solid rgba(255, 255, 255, 0.06);
border-radius: 16px;
padding: 1rem;
}
.panel h2 {
font-size: 0.8rem;
font-weight: 600;
color: rgba(255, 255, 255, 0.4);
text-transform: uppercase;
letter-spacing: 0.08em;
margin-bottom: 0.8rem;
}
canvas {
border-radius: 12px;
background: #000;
border: 1px solid rgba(255, 255, 255, 0.06);
cursor: crosshair;
}
.connection {
display: flex;
align-items: center;
gap: 0.5rem;
padding: 0.5rem 1rem;
border-radius: 20px;
font-size: 0.75rem;
font-weight: 500;
margin-bottom: 1rem;
}
.connection.disconnected {
background: rgba(239, 68, 68, 0.1);
border: 1px solid rgba(239, 68, 68, 0.2);
color: #fca5a5;
}
.connection.connected {
background: rgba(34, 197, 94, 0.1);
border: 1px solid rgba(34, 197, 94, 0.2);
color: #86efac;
}
.connection .dot {
width: 8px;
height: 8px;
border-radius: 50%;
animation: pulse 2s ease-in-out infinite;
}
.connection.disconnected .dot {
background: #ef4444;
}
.connection.connected .dot {
background: #22c55e;
}
@keyframes pulse {
0%,
100% {
opacity: 1;
}
50% {
opacity: 0.3;
}
}
.stats {
display: grid;
grid-template-columns: 1fr 1fr 1fr;
gap: 0.4rem;
min-width: 300px;
}
.stat {
background: rgba(236, 72, 153, 0.06);
border: 1px solid rgba(236, 72, 153, 0.1);
border-radius: 10px;
padding: 0.5rem 0.7rem;
}
.stat-label {
font-size: 0.55rem;
color: rgba(255, 255, 255, 0.3);
text-transform: uppercase;
letter-spacing: 0.05em;
}
.stat-value {
font-family: 'JetBrains Mono', monospace;
font-size: 0.95rem;
font-weight: 600;
color: #f9a8d4;
margin-top: 1px;
}
.bus {
margin-top: 0.8rem;
padding-top: 0.6rem;
border-top: 1px solid rgba(255, 255, 255, 0.04);
}
.bus h3 {
font-size: 0.6rem;
color: rgba(255, 255, 255, 0.2);
text-transform: uppercase;
letter-spacing: 0.08em;
margin-bottom: 0.4rem;
}
.bus-row {
display: flex;
align-items: center;
gap: 0.4rem;
padding: 0.2rem 0;
font-size: 0.65rem;
}
.bus-arrow {
color: rgba(255, 255, 255, 0.1);
font-family: monospace;
}
.bus-label {
color: rgba(255, 255, 255, 0.3);
min-width: 70px;
}
.bus-bytes {
font-family: 'JetBrains Mono', monospace;
font-size: 0.6rem;
color: rgba(139, 92, 246, 0.5);
background: rgba(139, 92, 246, 0.05);
padding: 1px 6px;
border-radius: 4px;
}
.bus-active {
font-size: 0.7rem;
transition: color 0.2s;
}
.powered {
margin-top: 1rem;
font-size: 0.6rem;
color: rgba(255, 255, 255, 0.08);
}
.powered span {
color: rgba(236, 72, 153, 0.25);
}
.badge {
position: fixed;
bottom: 1rem;
right: 1rem;
background: rgba(236, 72, 153, 0.08);
border: 1px solid rgba(236, 72, 153, 0.15);
padding: 0.4rem 0.8rem;
border-radius: 8px;
font-size: 0.6rem;
color: rgba(255, 255, 255, 0.25);
}
.badge strong {
color: #f9a8d4;
}
</style>
</head>
<body>
<h1>📡 Bitstream Receiver</h1>
<p class="subtitle">No framework. No runtime. Just bytes → pixels + audio + haptics.</p>
<div class="connection disconnected" id="connStatus">
<div class="dot"></div>
<span id="connText">Connecting to relay...</span>
</div>
<div class="layout">
<div class="panel">
<h2>Received Frame</h2>
<canvas id="display" width="600" height="400"></canvas>
</div>
<div class="panel">
<h2>Receiver Stats</h2>
<div class="stats">
<div class="stat">
<div class="stat-label">FPS In</div>
<div class="stat-value" id="statFps">0</div>
</div>
<div class="stat">
<div class="stat-label">Latency</div>
<div class="stat-value" id="statLatency"></div>
</div>
<div class="stat">
<div class="stat-label">Mode</div>
<div class="stat-value" id="statMode"></div>
</div>
<div class="stat">
<div class="stat-label">Bandwidth</div>
<div class="stat-value" id="statBandwidth">0</div>
</div>
<div class="stat">
<div class="stat-label">Frames</div>
<div class="stat-value" id="statFrames">0</div>
</div>
<div class="stat">
<div class="stat-label">Inputs</div>
<div class="stat-value" id="statInputs">0</div>
</div>
<div class="stat">
<div class="stat-label">Delta Saved</div>
<div class="stat-value" id="statDelta"></div>
</div>
<div class="stat">
<div class="stat-label">Audio</div>
<div class="stat-value" id="statAudio"></div>
</div>
<div class="stat">
<div class="stat-label">Signals</div>
<div class="stat-value" id="statSignals"></div>
</div>
</div>
<div class="bus">
<h3>Universal Bitstream Bus</h3>
<div class="bus-row">
<span class="bus-label">pixels</span>
<span class="bus-arrow">◄──</span>
<span class="bus-bytes" id="busPixels"></span>
<span class="bus-active" id="busPixelsOk" style="color:rgba(255,255,255,0.08)"></span>
</div>
<div class="bus-row">
<span class="bus-label">delta</span>
<span class="bus-arrow">◄──</span>
<span class="bus-bytes" id="busDelta"></span>
<span class="bus-active" id="busDeltaOk" style="color:rgba(255,255,255,0.08)"></span>
</div>
<div class="bus-row">
<span class="bus-label">signals</span>
<span class="bus-arrow">◄──</span>
<span class="bus-bytes" id="busSignals"></span>
<span class="bus-active" id="busSignalsOk" style="color:rgba(255,255,255,0.08)"></span>
</div>
<div class="bus-row">
<span class="bus-label">neural</span>
<span class="bus-arrow">◄──</span>
<span class="bus-bytes" id="busNeural"></span>
<span class="bus-active" id="busNeuralOk" style="color:rgba(255,255,255,0.08)"></span>
</div>
<div class="bus-row">
<span class="bus-label">audio</span>
<span class="bus-arrow">◄──</span>
<span class="bus-bytes" id="busAudio"></span>
<span class="bus-active" id="busAudioOk" style="color:rgba(255,255,255,0.08)"></span>
</div>
<div class="bus-row"
style="margin-top: 0.3rem; padding-top: 0.3rem; border-top: 1px dashed rgba(255,255,255,0.04);">
<span class="bus-label">pointer</span>
<span class="bus-arrow">──►</span>
<span class="bus-bytes" id="busPointer">0 B</span>
<span class="bus-active" id="busPointerOk" style="color:rgba(255,255,255,0.08)"></span>
</div>
<div class="bus-row">
<span class="bus-label">keyboard</span>
<span class="bus-arrow">──►</span>
<span class="bus-bytes" id="busKeyboard">0 B</span>
<span class="bus-active" id="busKeyOk" style="color:rgba(255,255,255,0.08)"></span>
</div>
<div class="bus-row">
<span class="bus-label">scroll</span>
<span class="bus-arrow">──►</span>
<span class="bus-bytes" id="busScroll">0 B</span>
<span class="bus-active" id="busScrollOk" style="color:rgba(255,255,255,0.08)"></span>
</div>
<div class="bus-row" style="opacity: 0.3;">
<span class="bus-label">voice</span><span class="bus-arrow">──►</span><span
class="bus-bytes">future</span>
</div>
<div class="bus-row" style="opacity: 0.3;">
<span class="bus-label">camera</span><span class="bus-arrow">──►</span><span
class="bus-bytes">future</span>
</div>
<div class="bus-row" style="opacity: 0.3;">
<span class="bus-label">BCI</span><span class="bus-arrow">──►</span><span
class="bus-bytes">future</span>
</div>
</div>
</div>
</div>
<div class="powered">Built with <span>DreamStack</span> — zero-framework receiver</div>
<div class="badge">This client: <strong>~300 lines</strong> · No framework · Just bytes</div>
<script>
// ════════════════════════════════════════════════════════
// DreamStack Bitstream Receiver — All Features
// Zero framework. Renders bytes. Sends inputs.
// ════════════════════════════════════════════════════════
const HEADER_SIZE = 16;
const FRAME_PIXELS = 0x01, FRAME_DELTA = 0x03, FRAME_AUDIO = 0x10;
const FRAME_SIGNAL_SYNC = 0x30, FRAME_SIGNAL_DIFF = 0x31, FRAME_NEURAL = 0x40;
const INPUT_POINTER = 0x01, INPUT_PTR_DOWN = 0x02, INPUT_PTR_UP = 0x03;
const INPUT_KEY_DOWN = 0x10, INPUT_KEY_UP = 0x11, INPUT_SCROLL = 0x50;
const FLAG_INPUT = 0x01;
const canvas = document.getElementById('display');
const ctx = canvas.getContext('2d');
const W = 600, H = 400;
// State
let framesRecv = 0, inputsSent = 0, lastSeq = 0;
let lastSecBytes = 0, lastSecFrames = 0, lastSecTime = performance.now();
let lastFrameTime = 0, currentMode = '—';
let prevFrameData = null; // for delta reconstruction
let audioChunksRecv = 0, signalFrames = 0;
// Per-bus byte counters (reset each second)
let busBytes = { pixel: 0, delta: 0, signal: 0, neural: 0, audio: 0, pointer: 0, key: 0, scroll: 0 };
// ── Feature 4: Audio playback ──
let audioCtx = null;
let audioNextTime = 0;
const AUDIO_SAMPLE_RATE = 22050;
function playAudioChunk(samples) {
if (!audioCtx) audioCtx = new AudioContext({ sampleRate: AUDIO_SAMPLE_RATE });
const buffer = audioCtx.createBuffer(1, samples.length, AUDIO_SAMPLE_RATE);
buffer.copyToChannel(samples, 0);
const src = audioCtx.createBufferSource();
src.buffer = buffer;
src.connect(audioCtx.destination);
const now = audioCtx.currentTime;
if (audioNextTime < now) audioNextTime = now;
src.start(audioNextTime);
audioNextTime += buffer.duration;
}
// ── Feature 3: Signal diff renderer ──
function renderFromSignals(state) {
ctx.clearRect(0, 0, W, H);
// Grid
ctx.strokeStyle = 'rgba(255,255,255,0.03)'; ctx.lineWidth = 1;
for (let x = 0; x < W; x += 40) { ctx.beginPath(); ctx.moveTo(x, 0); ctx.lineTo(x, H); ctx.stroke(); }
for (let y = 0; y < H; y += 40) { ctx.beginPath(); ctx.moveTo(0, y); ctx.lineTo(W, y); ctx.stroke(); }
const { bx, by, br, tx, ty } = state;
// Connection line
ctx.beginPath(); ctx.moveTo(bx, by); ctx.lineTo(tx, ty);
ctx.strokeStyle = 'rgba(139,92,246,0.15)'; ctx.lineWidth = 1; ctx.setLineDash([3, 3]); ctx.stroke(); ctx.setLineDash([]);
// Target crosshair
ctx.strokeStyle = 'rgba(99,102,241,0.08)'; ctx.setLineDash([4, 4]);
ctx.beginPath(); ctx.moveTo(tx, 0); ctx.lineTo(tx, H); ctx.stroke();
ctx.beginPath(); ctx.moveTo(0, ty); ctx.lineTo(W, ty); ctx.stroke(); ctx.setLineDash([]);
// Target dot
ctx.beginPath(); ctx.arc(tx, ty, 6, 0, Math.PI * 2); ctx.fillStyle = 'rgba(99,102,241,0.3)'; ctx.fill();
// Ball glow
ctx.beginPath(); ctx.arc(bx, by, br + 16, 0, Math.PI * 2); ctx.fillStyle = 'rgba(139,92,246,0.03)'; ctx.fill();
ctx.beginPath(); ctx.arc(bx, by, br + 8, 0, Math.PI * 2); ctx.fillStyle = 'rgba(139,92,246,0.08)'; ctx.fill();
// Ball
ctx.beginPath(); ctx.arc(bx, by, br, 0, Math.PI * 2);
const grad = ctx.createRadialGradient(bx - br * 0.3, by - br * 0.3, br * 0.1, bx, by, br);
grad.addColorStop(0, '#c4b5fd'); grad.addColorStop(1, '#8b5cf6');
ctx.fillStyle = grad; ctx.shadowColor = '#8b5cf6'; ctx.shadowBlur = 25; ctx.fill(); ctx.shadowBlur = 0;
ctx.font = '10px Inter'; ctx.fillStyle = 'rgba(255,255,255,0.12)';
ctx.fillText('📡 Signal-rendered on receiver', 10, H - 10);
}
// ── Feature 1: Delta decompression ──
function rleDecompress(compressed, expectedLen) {
const out = new Uint8Array(expectedLen);
let ci = 0, oi = 0;
while (ci < compressed.length && oi < expectedLen) {
if (compressed[ci] === 0 && ci + 1 < compressed.length) {
const run = compressed[ci + 1];
for (let j = 0; j < run && oi < expectedLen; j++) out[oi++] = 0;
ci += 2;
} else {
out[oi++] = compressed[ci++];
}
}
return out;
}
function applyDelta(prev, delta) {
const result = new Uint8Array(prev.length);
for (let i = 0; i < prev.length; i++) result[i] = prev[i] ^ delta[i];
return result;
}
// ── Protocol helpers ──
function decodeHeader(buf) {
const v = new DataView(buf.buffer || buf, buf.byteOffset || 0);
return {
type: v.getUint8(0), flags: v.getUint8(1), seq: v.getUint16(2, true), timestamp: v.getUint32(4, true),
width: v.getUint16(8, true), height: v.getUint16(10, true), length: v.getUint32(12, true)
};
}
function encodeInput(type, payload) {
const header = new ArrayBuffer(HEADER_SIZE);
const v = new DataView(header);
v.setUint8(0, type); v.setUint8(1, FLAG_INPUT);
v.setUint16(2, inputsSent & 0xFFFF, true);
v.setUint32(4, Math.round(performance.now()), true);
v.setUint32(12, payload.byteLength, true);
const msg = new Uint8Array(HEADER_SIZE + payload.byteLength);
msg.set(new Uint8Array(header), 0);
msg.set(new Uint8Array(payload), HEADER_SIZE);
return msg;
}
function pointerPayload(x, y, buttons) {
const b = new ArrayBuffer(5), v = new DataView(b);
v.setUint16(0, x, true); v.setUint16(2, y, true); v.setUint8(4, buttons);
return b;
}
function keyPayload(keycode, modifiers) {
const b = new ArrayBuffer(3), v = new DataView(b);
v.setUint16(0, keycode, true); v.setUint8(2, modifiers);
return b;
}
function scrollPayload(dx, dy) {
const b = new ArrayBuffer(4), v = new DataView(b);
v.setInt16(0, dx, true); v.setInt16(2, dy, true);
return b;
}
// ── WebSocket ──
let ws = null;
function connect() {
ws = new WebSocket('ws://localhost:9100');
ws.binaryType = 'arraybuffer';
ws.onopen = () => {
document.getElementById('connStatus').className = 'connection connected';
document.getElementById('connText').textContent = 'Receiving bitstream';
};
ws.onmessage = e => {
if (!(e.data instanceof ArrayBuffer) || e.data.byteLength < HEADER_SIZE) return;
const bytes = new Uint8Array(e.data);
const header = decodeHeader(bytes);
const payload = bytes.subarray(HEADER_SIZE, HEADER_SIZE + header.length);
switch (header.type) {
case FRAME_PIXELS:
case FRAME_NEURAL: {
// Raw pixel frame or neural frame
if (payload.length === header.width * header.height * 4) {
const imgData = new ImageData(new Uint8ClampedArray(payload.buffer, payload.byteOffset, payload.length), header.width, header.height);
ctx.putImageData(imgData, 0, 0);
prevFrameData = new Uint8Array(payload);
}
currentMode = header.type === FRAME_NEURAL ? 'neural' : 'pixel';
busBytes[header.type === FRAME_NEURAL ? 'neural' : 'pixel'] += e.data.byteLength;
break;
}
case FRAME_DELTA: {
// Feature 1: Delta frame — decompress + apply XOR
if (prevFrameData) {
const expectedLen = header.width * header.height * 4;
const delta = rleDecompress(payload, expectedLen);
const fullFrame = applyDelta(prevFrameData, delta);
const imgData = new ImageData(new Uint8ClampedArray(fullFrame.buffer), header.width, header.height);
ctx.putImageData(imgData, 0, 0);
prevFrameData = fullFrame;
}
currentMode = 'delta';
busBytes.delta += e.data.byteLength;
break;
}
case FRAME_SIGNAL_SYNC:
case FRAME_SIGNAL_DIFF: {
// Feature 3: Signal diff — parse JSON, render locally
const json = new TextDecoder().decode(payload);
try {
const state = JSON.parse(json);
renderFromSignals(state);
signalFrames++;
} catch (err) { /* parse error */ }
currentMode = 'signal';
busBytes.signal += e.data.byteLength;
break;
}
case FRAME_AUDIO: {
// Feature 4: Audio playback
const samples = new Float32Array(payload.buffer, payload.byteOffset, payload.length / 4);
playAudioChunk(samples);
audioChunksRecv++;
busBytes.audio += e.data.byteLength;
break;
}
}
framesRecv++;
lastSeq = header.seq;
lastSecBytes += e.data.byteLength;
lastSecFrames++;
lastFrameTime = performance.now();
};
ws.onclose = () => {
document.getElementById('connStatus').className = 'connection disconnected';
document.getElementById('connText').textContent = 'Disconnected — reconnecting...';
setTimeout(connect, 2000);
};
ws.onerror = () => {
document.getElementById('connStatus').className = 'connection disconnected';
document.getElementById('connText').textContent = 'Relay not found — cargo run -p ds-stream';
};
}
// ── Feature 2: Full Input Capture ──
function sendInput(buf) {
if (ws && ws.readyState === WebSocket.OPEN) { ws.send(buf); inputsSent++; }
}
// Pointer: click = immediate action, drag = full down/move/up
let isDragging = false;
canvas.addEventListener('mousedown', e => {
const r = canvas.getBoundingClientRect();
const x = Math.round(e.clientX - r.left), y = Math.round(e.clientY - r.top);
isDragging = true;
sendInput(encodeInput(INPUT_PTR_DOWN, pointerPayload(x, y, e.buttons)));
busBytes.pointer += HEADER_SIZE + 5;
});
canvas.addEventListener('mousemove', e => {
if (!isDragging) return;
const r = canvas.getBoundingClientRect();
const x = Math.round(e.clientX - r.left), y = Math.round(e.clientY - r.top);
sendInput(encodeInput(INPUT_POINTER, pointerPayload(x, y, e.buttons)));
busBytes.pointer += HEADER_SIZE + 5;
});
window.addEventListener('mouseup', () => {
if (isDragging) {
isDragging = false;
sendInput(encodeInput(INPUT_PTR_UP, pointerPayload(0, 0, 0)));
busBytes.pointer += HEADER_SIZE + 5;
}
});
// Keyboard
document.addEventListener('keydown', e => {
const mods = (e.shiftKey ? 1 : 0) | (e.ctrlKey ? 2 : 0) | (e.altKey ? 4 : 0) | (e.metaKey ? 8 : 0);
sendInput(encodeInput(INPUT_KEY_DOWN, keyPayload(e.keyCode, mods)));
busBytes.key += HEADER_SIZE + 3;
});
document.addEventListener('keyup', e => {
const mods = (e.shiftKey ? 1 : 0) | (e.ctrlKey ? 2 : 0) | (e.altKey ? 4 : 0) | (e.metaKey ? 8 : 0);
sendInput(encodeInput(INPUT_KEY_UP, keyPayload(e.keyCode, mods)));
busBytes.key += HEADER_SIZE + 3;
});
// Scroll — resize ball
canvas.addEventListener('wheel', e => {
e.preventDefault();
sendInput(encodeInput(INPUT_SCROLL, scrollPayload(Math.round(e.deltaX), Math.round(e.deltaY))));
busBytes.scroll += HEADER_SIZE + 4;
}, { passive: false });
// ── Stats ──
setInterval(() => {
const now = performance.now(), elapsed = (now - lastSecTime) / 1000;
const fps = Math.round(lastSecFrames / elapsed);
const bw = lastSecBytes / 1024 / 1024 / elapsed;
document.getElementById('statFps').textContent = fps;
document.getElementById('statLatency').textContent = lastFrameTime ? Math.round(now - lastFrameTime) + 'ms' : '—';
document.getElementById('statMode').textContent = currentMode;
document.getElementById('statBandwidth').textContent = bw < 1 ? (bw * 1024).toFixed(0) + 'KB/s' : bw.toFixed(1) + 'MB/s';
document.getElementById('statFrames').textContent = framesRecv;
document.getElementById('statInputs').textContent = inputsSent;
document.getElementById('statDelta').textContent = currentMode === 'delta' ?
((1 - lastSecBytes / (W * H * 4 * lastSecFrames || 1)) * 100).toFixed(0) + '%' : '—';
document.getElementById('statAudio').textContent = audioChunksRecv > 0 ? audioChunksRecv : '—';
document.getElementById('statSignals').textContent = signalFrames > 0 ? signalFrames : '—';
// Bus indicators
function busUpdate(id, bytes) {
const el = document.getElementById(id);
const okEl = document.getElementById(id.replace('bus', 'bus') + 'Ok');
if (bytes > 0) {
el.textContent = bytes > 1024 ? (bytes / 1024).toFixed(0) + 'KB' : bytes + 'B';
if (okEl) okEl.style.color = 'rgba(34,197,94,0.8)';
} else {
if (okEl) okEl.style.color = 'rgba(255,255,255,0.08)';
}
}
busUpdate('busPixels', busBytes.pixel);
busUpdate('busDelta', busBytes.delta);
busUpdate('busSignals', busBytes.signal);
busUpdate('busNeural', busBytes.neural);
busUpdate('busAudio', busBytes.audio);
busUpdate('busPointer', busBytes.pointer);
busUpdate('busKeyboard', busBytes.key);
busUpdate('busScroll', busBytes.scroll);
lastSecBytes = 0; lastSecFrames = 0; lastSecTime = now;
busBytes = { pixel: 0, delta: 0, signal: 0, neural: 0, audio: 0, pointer: 0, key: 0, scroll: 0 };
}, 1000);
connect();
</script>
</body>
</html>

783
examples/stream-source.html Normal file
View file

@ -0,0 +1,783 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>DreamStack — Stream Source</title>
<link
href="https://fonts.googleapis.com/css2?family=Inter:wght@300;400;500;600;700&family=JetBrains+Mono:wght@400;500&display=swap"
rel="stylesheet">
<style>
* {
margin: 0;
padding: 0;
box-sizing: border-box;
}
body {
font-family: 'Inter', system-ui, sans-serif;
background: #0a0a0f;
color: #e2e8f0;
min-height: 100vh;
display: flex;
flex-direction: column;
align-items: center;
padding: 1.5rem;
}
h1 {
font-size: 1.6rem;
font-weight: 700;
background: linear-gradient(135deg, #6366f1, #8b5cf6, #ec4899);
-webkit-background-clip: text;
background-clip: text;
-webkit-text-fill-color: transparent;
margin-bottom: 0.3rem;
}
.subtitle {
color: rgba(255, 255, 255, 0.25);
font-size: 0.75rem;
margin-bottom: 1rem;
}
.layout {
display: flex;
gap: 1.5rem;
align-items: flex-start;
flex-wrap: wrap;
justify-content: center;
}
.panel {
background: rgba(255, 255, 255, 0.02);
border: 1px solid rgba(255, 255, 255, 0.06);
border-radius: 16px;
padding: 1rem;
}
.panel h2 {
font-size: 0.8rem;
font-weight: 600;
color: rgba(255, 255, 255, 0.4);
text-transform: uppercase;
letter-spacing: 0.08em;
margin-bottom: 0.8rem;
}
canvas {
border-radius: 12px;
background: rgba(255, 255, 255, 0.02);
border: 1px solid rgba(255, 255, 255, 0.06);
cursor: pointer;
}
.stats {
display: grid;
grid-template-columns: 1fr 1fr 1fr;
gap: 0.4rem;
min-width: 320px;
}
.stat {
background: rgba(99, 102, 241, 0.06);
border: 1px solid rgba(99, 102, 241, 0.1);
border-radius: 10px;
padding: 0.5rem 0.7rem;
}
.stat-label {
font-size: 0.55rem;
color: rgba(255, 255, 255, 0.3);
text-transform: uppercase;
letter-spacing: 0.05em;
}
.stat-value {
font-family: 'JetBrains Mono', monospace;
font-size: 0.95rem;
font-weight: 600;
color: #c4b5fd;
margin-top: 1px;
}
.connection {
display: flex;
align-items: center;
gap: 0.5rem;
padding: 0.5rem 1rem;
border-radius: 20px;
font-size: 0.75rem;
font-weight: 500;
margin-bottom: 1rem;
}
.connection.disconnected {
background: rgba(239, 68, 68, 0.1);
border: 1px solid rgba(239, 68, 68, 0.2);
color: #fca5a5;
}
.connection.connected {
background: rgba(34, 197, 94, 0.1);
border: 1px solid rgba(34, 197, 94, 0.2);
color: #86efac;
}
.connection .dot {
width: 8px;
height: 8px;
border-radius: 50%;
animation: pulse 2s ease-in-out infinite;
}
.connection.disconnected .dot {
background: #ef4444;
}
.connection.connected .dot {
background: #22c55e;
}
@keyframes pulse {
0%,
100% {
opacity: 1;
}
50% {
opacity: 0.3;
}
}
.controls {
display: flex;
gap: 0.3rem;
flex-wrap: wrap;
margin-top: 0.6rem;
}
.controls button {
padding: 0.35rem 0.8rem;
border: none;
border-radius: 8px;
background: linear-gradient(135deg, rgba(99, 102, 241, 0.12), rgba(139, 92, 246, 0.12));
color: #c4b5fd;
cursor: pointer;
font-size: 0.7rem;
font-weight: 500;
transition: all 0.15s;
border: 1px solid rgba(139, 92, 246, 0.1);
}
.controls button:hover {
background: linear-gradient(135deg, rgba(99, 102, 241, 0.25), rgba(139, 92, 246, 0.25));
transform: translateY(-1px);
}
.controls button.active {
background: linear-gradient(135deg, rgba(99, 102, 241, 0.4), rgba(139, 92, 246, 0.4));
border-color: rgba(139, 92, 246, 0.4);
}
.mode-row {
display: flex;
gap: 0.3rem;
margin-top: 0.5rem;
}
.mode-btn {
padding: 0.3rem 0.7rem;
border: none;
border-radius: 6px;
background: rgba(255, 255, 255, 0.04);
color: rgba(255, 255, 255, 0.3);
cursor: pointer;
font-size: 0.65rem;
font-weight: 500;
border: 1px solid rgba(255, 255, 255, 0.06);
transition: all 0.15s;
}
.mode-btn.active {
background: rgba(139, 92, 246, 0.2);
color: #c4b5fd;
border-color: rgba(139, 92, 246, 0.3);
}
.bitstream-vis {
display: flex;
gap: 2px;
height: 25px;
align-items: flex-end;
margin-top: 0.4rem;
}
.bitstream-bar {
width: 3px;
background: linear-gradient(to top, #6366f1, #8b5cf6);
border-radius: 1px;
min-height: 2px;
transition: height 0.1s ease;
}
.powered {
margin-top: 1rem;
font-size: 0.6rem;
color: rgba(255, 255, 255, 0.08);
}
.powered span {
color: rgba(139, 92, 246, 0.25);
}
</style>
</head>
<body>
<h1>⚡ Bitstream Source</h1>
<p class="subtitle">Renders the UI. Streams pixels/signals/audio. Receives inputs.</p>
<div class="connection disconnected" id="connStatus">
<div class="dot"></div>
<span id="connText">Connecting to relay...</span>
</div>
<div class="layout">
<div class="panel">
<h2>Rendered Scene</h2>
<canvas id="scene" width="600" height="400"></canvas>
<div class="controls" id="controls"></div>
<div class="mode-row">
<button class="mode-btn active" id="modePixel" onclick="setMode('pixel')">📺 Pixels</button>
<button class="mode-btn" id="modeDelta" onclick="setMode('delta')">Δ Delta</button>
<button class="mode-btn" id="modeSignal" onclick="setMode('signal')">📡 Signals</button>
<button class="mode-btn" id="modeNeural" onclick="setMode('neural')">🧠 Neural</button>
<button class="mode-btn" id="modeAudio" onclick="toggleAudio()">🔇 Audio</button>
</div>
</div>
<div class="panel">
<h2>Stream Stats</h2>
<div class="stats">
<div class="stat">
<div class="stat-label">FPS</div>
<div class="stat-value" id="statFps">0</div>
</div>
<div class="stat">
<div class="stat-label">Mode</div>
<div class="stat-value" id="statMode">pixel</div>
</div>
<div class="stat">
<div class="stat-label">Receivers</div>
<div class="stat-value" id="statReceivers">0</div>
</div>
<div class="stat">
<div class="stat-label">Frame Size</div>
<div class="stat-value" id="statFrameSize">0</div>
</div>
<div class="stat">
<div class="stat-label">Bandwidth</div>
<div class="stat-value" id="statBandwidth">0</div>
</div>
<div class="stat">
<div class="stat-label">Delta Ratio</div>
<div class="stat-value" id="statDelta"></div>
</div>
<div class="stat">
<div class="stat-label">Frames</div>
<div class="stat-value" id="statFrames">0</div>
</div>
<div class="stat">
<div class="stat-label">Inputs</div>
<div class="stat-value" id="statInputs">0</div>
</div>
<div class="stat">
<div class="stat-label">Audio</div>
<div class="stat-value" id="statAudio">off</div>
</div>
</div>
<div class="bitstream-vis" id="bitstreamVis"></div>
</div>
</div>
<div class="powered">Built with <span>DreamStack</span> — universal bitstream</div>
<script>
// ════════════════════════════════════════════════════════
// DreamStack Runtime (signals + springs)
// ════════════════════════════════════════════════════════
const DS = (() => {
let currentEffect = null, batchDepth = 0, pendingEffects = new Set();
class Signal {
constructor(val) { this._value = val; this._subs = new Set(); }
get value() { if (currentEffect) this._subs.add(currentEffect); return this._value; }
set value(v) {
if (this._value === v) return; this._value = v;
if (batchDepth > 0) for (const s of this._subs) pendingEffects.add(s);
else for (const s of [...this._subs]) s._run();
}
}
class Effect {
constructor(fn) { this._fn = fn; this._disposed = false; }
_run() { if (this._disposed) return; const prev = currentEffect; currentEffect = this; try { this._fn(); } finally { currentEffect = prev; } }
dispose() { this._disposed = true; }
}
const signal = v => new Signal(v);
const effect = fn => { const e = new Effect(fn); e._run(); return e; };
const batch = fn => { batchDepth++; try { fn(); } finally { batchDepth--; if (batchDepth === 0) { const effs = [...pendingEffects]; pendingEffects.clear(); for (const e of effs) e._run(); } } };
const _activeSprings = new Set();
let _rafId = null, _lastTime = 0;
class Spring {
constructor({ value = 0, target, stiffness = 170, damping = 26, mass = 1 } = {}) {
this._signal = new Signal(value); this._velocity = 0;
this._target = target !== undefined ? target : value;
this.stiffness = stiffness; this.damping = damping; this.mass = mass; this._settled = true;
}
get value() { return this._signal.value; }
set value(v) { this.target = v; }
get target() { return this._target; }
set target(t) { this._target = t; this._settled = false; _activeSprings.add(this); _startLoop(); }
set(v) { this._signal.value = v; this._target = v; this._velocity = 0; this._settled = true; _activeSprings.delete(this); }
_step(dt) {
const pos = this._signal._value, vel = this._velocity;
const k = this.stiffness, d = this.damping, m = this.mass;
const a = (p, v) => (-k * (p - this._target) - d * v) / m;
const k1v = a(pos, vel), k1p = vel;
const k2v = a(pos + k1p * dt / 2, vel + k1v * dt / 2), k2p = vel + k1v * dt / 2;
const k3v = a(pos + k2p * dt / 2, vel + k2v * dt / 2), k3p = vel + k2v * dt / 2;
const k4v = a(pos + k3p * dt, vel + k3v * dt), k4p = vel + k3v * dt;
this._velocity = vel + (dt / 6) * (k1v + 2 * k2v + 2 * k3v + k4v);
this._signal.value = pos + (dt / 6) * (k1p + 2 * k2p + 2 * k3p + k4p);
if (Math.abs(this._velocity) < 0.01 && Math.abs(this._signal._value - this._target) < 0.01) {
this._signal.value = this._target; this._velocity = 0; this._settled = true; _activeSprings.delete(this);
}
}
}
function _startLoop() { if (_rafId !== null) return; _lastTime = performance.now(); _rafId = requestAnimationFrame(_loop); }
function _loop(now) {
const dt = Math.min((now - _lastTime) / 1000, 0.064); _lastTime = now;
batch(() => { for (const s of _activeSprings) { const steps = Math.ceil(dt / (1 / 120)); const subDt = dt / steps; for (let i = 0; i < steps; i++) s._step(subDt); } });
if (_activeSprings.size > 0) _rafId = requestAnimationFrame(_loop); else _rafId = null;
}
function spring(opts) { return new Spring(typeof opts === 'object' ? opts : { value: opts, target: opts }); }
return { signal, effect, batch, spring, Signal, Spring };
})();
// ════════════════════════════════════════════════════════
// Protocol Constants
// ════════════════════════════════════════════════════════
const HEADER_SIZE = 16;
const FRAME_PIXELS = 0x01, FRAME_DELTA = 0x03, FRAME_AUDIO = 0x10;
const FRAME_SIGNAL_SYNC = 0x30, FRAME_SIGNAL_DIFF = 0x31, FRAME_NEURAL = 0x40;
const INPUT_POINTER = 0x01, INPUT_PTR_DOWN = 0x02, INPUT_PTR_UP = 0x03;
const INPUT_KEY_DOWN = 0x10, INPUT_KEY_UP = 0x11, INPUT_SCROLL = 0x50;
const FLAG_INPUT = 0x01, FLAG_KEYFRAME = 0x02;
function encodeHeader(type, flags, seq, ts, w, h, len) {
const b = new ArrayBuffer(HEADER_SIZE), v = new DataView(b);
v.setUint8(0, type); v.setUint8(1, flags); v.setUint16(2, seq, true); v.setUint32(4, ts, true);
v.setUint16(8, w, true); v.setUint16(10, h, true); v.setUint32(12, len, true);
return new Uint8Array(b);
}
function decodeHeader(buf) {
const v = new DataView(buf.buffer || buf, buf.byteOffset || 0);
return {
type: v.getUint8(0), flags: v.getUint8(1), seq: v.getUint16(2, true), timestamp: v.getUint32(4, true),
width: v.getUint16(8, true), height: v.getUint16(10, true), length: v.getUint32(12, true)
};
}
// ════════════════════════════════════════════════════════
// Scene Setup
// ════════════════════════════════════════════════════════
const canvas = document.getElementById('scene');
const ctx = canvas.getContext('2d');
const W = 600, H = 400;
const ballX = DS.spring({ value: 300, stiffness: 170, damping: 26 });
const ballY = DS.spring({ value: 200, stiffness: 170, damping: 26 });
const ballR = DS.spring({ value: 25, stiffness: 300, damping: 20 });
const targetX = DS.signal(300), targetY = DS.signal(200);
function renderScene() {
ctx.clearRect(0, 0, W, H);
ctx.strokeStyle = 'rgba(255,255,255,0.03)'; ctx.lineWidth = 1;
for (let x = 0; x < W; x += 40) { ctx.beginPath(); ctx.moveTo(x, 0); ctx.lineTo(x, H); ctx.stroke(); }
for (let y = 0; y < H; y += 40) { ctx.beginPath(); ctx.moveTo(0, y); ctx.lineTo(W, y); ctx.stroke(); }
const bx = ballX.value, by = ballY.value, br = ballR.value, tx = targetX.value, ty = targetY.value;
ctx.beginPath(); ctx.moveTo(bx, by); ctx.lineTo(tx, ty);
ctx.strokeStyle = 'rgba(139,92,246,0.15)'; ctx.lineWidth = 1; ctx.setLineDash([3, 3]); ctx.stroke(); ctx.setLineDash([]);
ctx.strokeStyle = 'rgba(99,102,241,0.08)'; ctx.setLineDash([4, 4]);
ctx.beginPath(); ctx.moveTo(tx, 0); ctx.lineTo(tx, H); ctx.stroke();
ctx.beginPath(); ctx.moveTo(0, ty); ctx.lineTo(W, ty); ctx.stroke(); ctx.setLineDash([]);
ctx.beginPath(); ctx.arc(tx, ty, 6, 0, Math.PI * 2); ctx.fillStyle = 'rgba(99,102,241,0.3)'; ctx.fill();
ctx.beginPath(); ctx.arc(bx, by, br + 16, 0, Math.PI * 2); ctx.fillStyle = 'rgba(139,92,246,0.03)'; ctx.fill();
ctx.beginPath(); ctx.arc(bx, by, br + 8, 0, Math.PI * 2); ctx.fillStyle = 'rgba(139,92,246,0.08)'; ctx.fill();
ctx.beginPath(); ctx.arc(bx, by, br, 0, Math.PI * 2);
const grad = ctx.createRadialGradient(bx - br * 0.3, by - br * 0.3, br * 0.1, bx, by, br);
grad.addColorStop(0, '#c4b5fd'); grad.addColorStop(1, '#8b5cf6');
ctx.fillStyle = grad; ctx.shadowColor = '#8b5cf6'; ctx.shadowBlur = 25; ctx.fill(); ctx.shadowBlur = 0;
ctx.font = '10px Inter'; ctx.fillStyle = 'rgba(255,255,255,0.06)';
ctx.fillText(`DreamStack — ${streamMode} mode`, 10, H - 10);
}
DS.effect(renderScene);
// ── Feature 5: Neural Renderer ──
// Generates pixels directly from signal state — no canvas primitives
// This is what a trained model would do: signal_state → framebuffer
function neuralRender() {
const bx = ballX._signal._value, by = ballY._signal._value;
const br = ballR._signal._value;
const imgData = ctx.createImageData(W, H);
const d = imgData.data;
for (let py = 0; py < H; py++) {
for (let px = 0; px < W; px++) {
const i = (py * W + px) * 4;
// Grid (procedural)
const onGrid = (px % 40 === 0 || py % 40 === 0) ? 8 : 0;
// Ball glow (distance field — a "learned" SDF)
const dx = px - bx, dy = py - by;
const dist = Math.sqrt(dx * dx + dy * dy);
const glow = Math.max(0, 1 - dist / (br + 30)) * 0.15;
const innerGlow = Math.max(0, 1 - dist / (br + 10)) * 0.3;
const solid = dist < br ? 1 : 0;
// "Neural" color mixing
const r = Math.min(255, onGrid + glow * 60 + innerGlow * 80 + solid * 139);
const g = Math.min(255, onGrid + glow * 30 + innerGlow * 40 + solid * 92);
const b2 = Math.min(255, onGrid + glow * 120 + innerGlow * 160 + solid * 246);
d[i] = r; d[i + 1] = g; d[i + 2] = b2; d[i + 3] = 255;
}
}
ctx.putImageData(imgData, 0, 0);
ctx.font = '10px Inter'; ctx.fillStyle = 'rgba(255,255,255,0.15)';
ctx.fillText('🧠 Neural Renderer (procedural SDF)', 10, H - 10);
return imgData;
}
// ── Interaction ──
let dragging = false;
function handlePointer(x, y, type) {
if (type === 'down') {
const bx = ballX._signal._value, by = ballY._signal._value;
if (Math.hypot(x - bx, y - by) < 40) { dragging = true; canvas.style.cursor = 'grabbing'; }
else { targetX.value = x; targetY.value = y; ballX.value = x; ballY.value = y; }
} else if (type === 'move' && dragging) {
ballX.set(Math.max(20, Math.min(W - 20, x)));
ballY.set(Math.max(20, Math.min(H - 20, y)));
targetX.value = x; targetY.value = y;
} else if (type === 'up' && dragging) {
dragging = false; canvas.style.cursor = 'pointer';
ballX.value = 300; ballY.value = 200; targetX.value = 300; targetY.value = 200;
}
}
canvas.addEventListener('mousedown', e => { const r = canvas.getBoundingClientRect(); handlePointer(e.clientX - r.left, e.clientY - r.top, 'down'); });
window.addEventListener('mousemove', e => { if (!dragging) return; const r = canvas.getBoundingClientRect(); handlePointer(e.clientX - r.left, e.clientY - r.top, 'move'); });
window.addEventListener('mouseup', () => handlePointer(0, 0, 'up'));
// Preset buttons
const presets = [
{ label: '↖ TL', x: 60, y: 60 }, { label: '↗ TR', x: 540, y: 60 },
{ label: '⊙ Center', x: 300, y: 200 }, { label: '↙ BL', x: 60, y: 340 },
{ label: '↘ BR', x: 540, y: 340 },
{ label: '🎾 Bounce', action: 'bounce' }, { label: '💥 Pulse', action: 'pulse' },
];
const ctrlEl = document.getElementById('controls');
presets.forEach(p => {
const btn = document.createElement('button');
btn.textContent = p.label;
btn.addEventListener('click', () => {
if (p.action === 'bounce') {
const pos = [[80, 80], [520, 80], [520, 320], [80, 320]]; let i = 0;
const iv = setInterval(() => { const [x, y] = pos[i % pos.length]; targetX.value = x; targetY.value = y; ballX.value = x; ballY.value = y; if (++i >= 8) clearInterval(iv); }, 350);
} else if (p.action === 'pulse') { ballR.value = 70; setTimeout(() => ballR.value = 25, 250); }
else { targetX.value = p.x; targetY.value = p.y; ballX.value = p.x; ballY.value = p.y; }
});
ctrlEl.appendChild(btn);
});
// ════════════════════════════════════════════════════════
// Feature 4: Audio Synthesis
// ════════════════════════════════════════════════════════
let audioCtx = null, audioEnabled = false;
const AUDIO_SAMPLE_RATE = 22050, AUDIO_CHUNK_SIZE = 1024;
function toggleAudio() {
audioEnabled = !audioEnabled;
const btn = document.getElementById('modeAudio');
if (audioEnabled) {
if (!audioCtx) audioCtx = new AudioContext({ sampleRate: AUDIO_SAMPLE_RATE });
btn.textContent = '🔊 Audio'; btn.classList.add('active');
} else {
btn.textContent = '🔇 Audio'; btn.classList.remove('active');
}
}
function synthesizeAudio() {
if (!audioEnabled || !audioCtx) return null;
// Synthesize from spring state — velocity drives pitch, distance drives volume
const vel = Math.sqrt(ballX._velocity * ballX._velocity + ballY._velocity * ballY._velocity);
const dist = Math.hypot(ballX._signal._value - targetX._value, ballY._signal._value - targetY._value);
const freq = 220 + vel * 2; // base frequency + velocity
const amp = Math.min(0.3, dist / 500); // volume from distance
const samples = new Float32Array(AUDIO_CHUNK_SIZE);
const t0 = performance.now() / 1000;
for (let i = 0; i < AUDIO_CHUNK_SIZE; i++) {
const t = t0 + i / AUDIO_SAMPLE_RATE;
samples[i] = amp * Math.sin(2 * Math.PI * freq * t) * Math.exp(-vel * 0.001);
}
return samples;
}
function encodeAudioFrame(samples, seq, ts) {
const payload = new Uint8Array(samples.buffer);
const header = encodeHeader(FRAME_AUDIO, 0, seq, ts, 1, AUDIO_SAMPLE_RATE / 100, payload.length);
const msg = new Uint8Array(HEADER_SIZE + payload.length);
msg.set(header, 0); msg.set(payload, HEADER_SIZE);
return msg;
}
// ════════════════════════════════════════════════════════
// Streaming Engine
// ════════════════════════════════════════════════════════
let ws = null, seq = 0, streamStart = performance.now();
let framesSent = 0, bytesSent = 0, inputsRecv = 0;
let lastSecBytes = 0, lastSecFrames = 0, lastSecTime = performance.now();
let prevFrame = null; // for delta compression
let lastDeltaRatio = 0;
let streamMode = 'pixel'; // pixel | delta | signal | neural
let prevSignalState = null;
let receiverCount = 0;
// Bitstream viz
const visEl = document.getElementById('bitstreamVis');
const visBars = [];
for (let i = 0; i < 60; i++) {
const bar = document.createElement('div'); bar.className = 'bitstream-bar'; bar.style.height = '2px';
visEl.appendChild(bar); visBars.push(bar);
}
let visIdx = 0;
function setMode(mode) {
streamMode = mode;
document.querySelectorAll('.mode-btn').forEach(b => b.classList.remove('active'));
const id = 'mode' + mode.charAt(0).toUpperCase() + mode.slice(1);
const el = document.getElementById(id);
if (el) el.classList.add('active');
prevFrame = null; prevSignalState = null; // reset state on mode change
}
function connect() {
ws = new WebSocket('ws://localhost:9100');
ws.binaryType = 'arraybuffer';
ws.onopen = () => {
document.getElementById('connStatus').className = 'connection connected';
document.getElementById('connText').textContent = 'Connected — streaming';
streamStart = performance.now(); startStreaming();
};
ws.onmessage = e => {
if (e.data instanceof ArrayBuffer && e.data.byteLength >= HEADER_SIZE) {
const header = decodeHeader(new Uint8Array(e.data));
if (header.flags & FLAG_INPUT) { inputsRecv++; handleRemoteInput(header, new Uint8Array(e.data, HEADER_SIZE)); }
}
};
ws.onclose = () => {
document.getElementById('connStatus').className = 'connection disconnected';
document.getElementById('connText').textContent = 'Disconnected — reconnecting...';
stopStreaming(); setTimeout(connect, 2000);
};
ws.onerror = () => {
document.getElementById('connStatus').className = 'connection disconnected';
document.getElementById('connText').textContent = 'Relay not found — cargo run -p ds-stream';
};
}
// ── Feature 2: Full Bidirectional Input ──
function handleRemoteInput(header, payload) {
const view = new DataView(payload.buffer, payload.byteOffset);
switch (header.type) {
case INPUT_PTR_DOWN:
if (payload.length >= 5) handlePointer(view.getUint16(0, true), view.getUint16(2, true), 'down');
break;
case INPUT_POINTER:
if (payload.length >= 5) handlePointer(view.getUint16(0, true), view.getUint16(2, true), 'move');
break;
case INPUT_PTR_UP:
handlePointer(0, 0, 'up');
break;
case INPUT_KEY_DOWN:
if (payload.length >= 3) {
const keycode = view.getUint16(0, true);
// Arrow keys drive ball
if (keycode === 37) { ballX.value = ballX._signal._value - 50; targetX.value = ballX._target; }
if (keycode === 39) { ballX.value = ballX._signal._value + 50; targetX.value = ballX._target; }
if (keycode === 38) { ballY.value = ballY._signal._value - 50; targetY.value = ballY._target; }
if (keycode === 40) { ballY.value = ballY._signal._value + 50; targetY.value = ballY._target; }
if (keycode === 32) { ballR.value = 70; setTimeout(() => ballR.value = 25, 250); } // space = pulse
}
break;
case INPUT_SCROLL:
if (payload.length >= 4) {
const dy = view.getInt16(2, true);
ballR.value = Math.max(10, Math.min(80, ballR._signal._value + dy * 0.1));
}
break;
}
}
// ── Capture & Send ──
let streamingInterval = null;
function startStreaming() { if (!streamingInterval) streamingInterval = setInterval(captureAndSend, 1000 / 30); }
function stopStreaming() { if (streamingInterval) { clearInterval(streamingInterval); streamingInterval = null; } }
function captureAndSend() {
if (!ws || ws.readyState !== WebSocket.OPEN) return;
const ts = Math.round(performance.now() - streamStart);
let msg;
switch (streamMode) {
case 'pixel':
msg = sendPixelFrame(ts, false);
break;
case 'delta':
msg = sendPixelFrame(ts, true);
break;
case 'signal':
msg = sendSignalFrame(ts);
break;
case 'neural':
msg = sendNeuralFrame(ts);
break;
}
if (msg) {
ws.send(msg.buffer);
seq++; framesSent++; bytesSent += msg.length; lastSecBytes += msg.length; lastSecFrames++;
visBars[visIdx % visBars.length].style.height = Math.min(25, Math.max(2, msg.length / 30000)) + 'px';
visIdx++;
}
// Feature 4: Send audio alongside
if (audioEnabled) {
const samples = synthesizeAudio();
if (samples) {
const audioMsg = encodeAudioFrame(samples, seq & 0xFFFF, ts);
ws.send(audioMsg.buffer);
bytesSent += audioMsg.length;
lastSecBytes += audioMsg.length;
}
}
}
// ── Mode: Raw Pixels ──
function sendPixelFrame(ts, useDelta) {
const imageData = ctx.getImageData(0, 0, W, H);
const pixels = imageData.data;
if (useDelta && prevFrame) {
// Feature 1: XOR Delta Compression
const delta = new Uint8Array(pixels.length);
let zeroCount = 0;
for (let i = 0; i < pixels.length; i++) {
delta[i] = pixels[i] ^ prevFrame[i];
if (delta[i] === 0) zeroCount++;
}
lastDeltaRatio = (zeroCount / pixels.length * 100);
if (zeroCount > pixels.length * 0.3) {
// Delta is worthwhile — compress by RLE-encoding the zero runs
const compressed = rleCompress(delta);
prevFrame = new Uint8Array(pixels);
const header = encodeHeader(FRAME_DELTA, 0, seq & 0xFFFF, ts, W, H, compressed.length);
const msg = new Uint8Array(HEADER_SIZE + compressed.length);
msg.set(header, 0); msg.set(compressed, HEADER_SIZE);
return msg;
}
}
// Send full keyframe
prevFrame = new Uint8Array(pixels);
const header = encodeHeader(FRAME_PIXELS, FLAG_KEYFRAME, seq & 0xFFFF, ts, W, H, pixels.length);
const msg = new Uint8Array(HEADER_SIZE + pixels.length);
msg.set(header, 0); msg.set(pixels, HEADER_SIZE);
return msg;
}
// Simple RLE for delta frames: encode runs of zeros compactly
function rleCompress(data) {
const out = [];
let i = 0;
while (i < data.length) {
if (data[i] === 0) {
let run = 0;
while (i < data.length && data[i] === 0 && run < 255) { run++; i++; }
out.push(0, run); // 0x00 followed by run length
} else {
out.push(data[i]); i++;
}
}
return new Uint8Array(out);
}
// ── Feature 3: Signal Diff Mode ──
function sendSignalFrame(ts) {
const state = {
bx: Math.round(ballX._signal._value * 10) / 10,
by: Math.round(ballY._signal._value * 10) / 10,
br: Math.round(ballR._signal._value * 10) / 10,
bvx: Math.round(ballX._velocity * 10) / 10,
bvy: Math.round(ballY._velocity * 10) / 10,
tx: Math.round(targetX._value * 10) / 10,
ty: Math.round(targetY._value * 10) / 10,
drag: dragging ? 1 : 0,
};
const json = JSON.stringify(state);
const payload = new TextEncoder().encode(json);
// Send full sync or diff
const frameType = prevSignalState ? FRAME_SIGNAL_DIFF : FRAME_SIGNAL_SYNC;
prevSignalState = state;
const header = encodeHeader(frameType, 0, seq & 0xFFFF, ts, W, H, payload.length);
const msg = new Uint8Array(HEADER_SIZE + payload.length);
msg.set(header, 0); msg.set(payload, HEADER_SIZE);
return msg;
}
// ── Feature 5: Neural Renderer Mode ──
function sendNeuralFrame(ts) {
const imgData = neuralRender();
const pixels = imgData.data;
const header = encodeHeader(FRAME_NEURAL, FLAG_KEYFRAME, seq & 0xFFFF, ts, W, H, pixels.length);
const msg = new Uint8Array(HEADER_SIZE + pixels.length);
msg.set(header, 0); msg.set(pixels, HEADER_SIZE);
return msg;
}
// ── Stats ──
setInterval(() => {
const now = performance.now(), elapsed = (now - lastSecTime) / 1000;
document.getElementById('statFps').textContent = Math.round(lastSecFrames / elapsed);
document.getElementById('statMode').textContent = streamMode;
const avgSize = lastSecFrames > 0 ? lastSecBytes / lastSecFrames : 0;
if (avgSize > 1024 * 1024) document.getElementById('statFrameSize').textContent = (avgSize / 1024 / 1024).toFixed(1) + 'MB';
else if (avgSize > 1024) document.getElementById('statFrameSize').textContent = (avgSize / 1024).toFixed(0) + 'KB';
else document.getElementById('statFrameSize').textContent = Math.round(avgSize) + 'B';
const bw = lastSecBytes / 1024 / 1024 / elapsed;
document.getElementById('statBandwidth').textContent = bw < 1 ? (bw * 1024).toFixed(0) + 'KB/s' : bw.toFixed(1) + 'MB/s';
document.getElementById('statDelta').textContent = streamMode === 'delta' ? lastDeltaRatio.toFixed(0) + '%' : '—';
document.getElementById('statFrames').textContent = framesSent;
document.getElementById('statInputs').textContent = inputsRecv;
document.getElementById('statAudio').textContent = audioEnabled ? 'on' : 'off';
document.getElementById('statReceivers').textContent = receiverCount;
lastSecBytes = 0; lastSecFrames = 0; lastSecTime = now;
}, 1000);
connect();
</script>
</body>
</html>