Grounds
TypeScript implementation of Relish, a compact binary serialization format.
What is Relish?
Relish is a Type-Length-Value (TLV) encoding format designed by Alex Gaynor. It provides:
- Compact binary representation - smaller than JSON, competitive with Protocol Buffers
- Self-describing format - can decode without schema (for debugging)
- Schema-driven usage - type-safe encoding with TypeScript schemas
- Streaming support - encode and decode incrementally
For the complete format specification, see the Relish Spec.
Packages
Grounds provides three packages:
- @grounds/core - Low-level T[L]V encoding and decoding
- @grounds/schema - TypeBox-based schema definitions with codecs
- @grounds/stream - Streaming encode/decode utilities
Quick Example
import { RStruct, RString, RU32, field, createCodec } from "@grounds/schema";
// Define a schema
const UserSchema = RStruct({
name: field(0, RString()),
age: field(1, RU32()),
});
// Create a codec
const codec = createCodec(UserSchema);
// Encode
codec.encode({ name: "Alice", age: 30 }).match(
(bytes) => console.log("Encoded:", bytes.length, "bytes"),
(err) => console.error("Failed:", err.message),
);
Getting Started
New to Grounds? Start with Installation to set up your project.
Learn More
- Relish announcement blog post - Background and design rationale
- Relish specification - Wire format details
- Relish reference implementation - Rust implementation
Installation
Install Grounds packages using npm or pnpm.
Core Package Only
For low-level encoding without schema support:
npm install @grounds/core
# or
pnpm add @grounds/core
With Schema Support (Recommended)
For type-safe schema-driven serialization:
npm install @grounds/schema @sinclair/typebox luxon
# or
pnpm add @grounds/schema @sinclair/typebox luxon
The schema package includes @grounds/core as a dependency.
Peer dependencies:
- @sinclair/typebox - TypeBox for schema definitions
- luxon - DateTime handling for timestamps
With Streaming
For streaming encode/decode:
npm install @grounds/stream
# or
pnpm add @grounds/stream
TypeScript Configuration
Grounds is written in TypeScript and provides full type definitions. For best results, use strict TypeScript settings:
{
"compilerOptions": {
"strict": true,
"noUncheckedIndexedAccess": true
}
}
Next Steps
Continue to First Encode to write your first serialization code.
First Encode
Let’s encode your first value using the low-level core API.
Basic Encoding
The encode function takes a tagged value and returns a Result:
// examples/core/encode-match.ts
// Demonstrates: Basic encoding with .match() for result handling
import { encode, String_ } from "@grounds/core";
// Encode a string value
const result = encode(String_("hello world"));
// Use .match() to handle success and error cases
result.match(
(bytes) => {
console.log("Encoded successfully!");
console.log("Bytes:", bytes);
console.log("Length:", bytes.length, "bytes");
},
(err) => {
console.error("Encoding failed:", err.message);
},
);
Run this example:
tsx examples/core/encode-match.ts
Understanding the Result
Grounds uses neverthrow for error handling. The encode function returns Result<Uint8Array, EncodeError>.
Use .match() to handle both success and error cases:
- Success: receive the encoded Uint8Array
- Error: receive an EncodeError with a code and message
What’s in the Bytes?
The encoded bytes contain:
- Type byte (1 byte) - identifies the value type (e.g., 0x0e for String)
- Length (1-5 bytes) - varint encoding of the payload length
- Payload - the actual data (e.g., UTF-8 string bytes)
For complete wire format details, see the Relish specification.
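The layout above can be sketched by hand. This is an illustrative reconstruction using the type code and single-byte varint length described above, not the library's encoder:

```typescript
// Hand-built sketch of the layout described above (illustrative only):
// type byte, varint length, then the UTF-8 payload. For "hello world"
// the length (11) fits in a single varint byte.
const payload = new TextEncoder().encode("hello world");
const bytes = new Uint8Array([0x0e, payload.length, ...payload]);

console.log(bytes.length); // 13 bytes total: 1 type + 1 length + 11 payload
console.log(bytes[0].toString(16)); // "e" - the String type code
console.log(new TextDecoder().decode(bytes.subarray(2))); // "hello world"
```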
Next Steps
Learn about Encoding in depth, or jump to Schema for type-safe serialization.
Encoding
The core package provides low-level encoding for all Relish types.
Basic Encoding
Encode a value using .match() to handle the result:
// examples/core/encode-match.ts
// Demonstrates: Basic encoding with .match() for result handling
import { encode, String_ } from "@grounds/core";
// Encode a string value
const result = encode(String_("hello world"));
// Use .match() to handle success and error cases
result.match(
(bytes) => {
console.log("Encoded successfully!");
console.log("Bytes:", bytes);
console.log("Length:", bytes.length, "bytes");
},
(err) => {
console.error("Encoding failed:", err.message);
},
);
Transforming Results
Use .map() to transform successful results without unwrapping:
// examples/core/encode-transform.ts
// Demonstrates: Transforming successful results with .map()
import { encode, U32 } from "@grounds/core";
// Encode a value and transform the result to hex string
const hexResult = encode(U32(42)).map((bytes) => Buffer.from(bytes).toString("hex"));
// Use .unwrapOr() to get value with fallback
const hex = hexResult.unwrapOr("encoding failed");
console.log("Encoding 42 as U32:");
console.log("Hex:", hex);
// Can also use .match() on the transformed result
hexResult.match(
(hexString) => console.log("Success! Hex bytes:", hexString),
(err) => console.error("Failed:", err.message),
);
Collections
Encode arrays and maps:
// examples/core/encode-collections.ts
// Demonstrates: Encoding arrays and maps
import { encode, decode, Array_, Map_, TypeCode } from "@grounds/core";
// Encode an array of values (primitive U8 elements use raw numbers)
const arrayResult = encode(Array_(TypeCode.U8, [1, 2, 3]));
arrayResult.match(
(bytes) => console.log("Array encoded:", bytes.length, "bytes"),
(err) => console.error("Array encoding failed:", err.message),
);
// Roundtrip to verify
arrayResult
.andThen((bytes) => decode(bytes))
.match(
(decoded) => console.log("Array decoded:", decoded),
(err) => console.error("Array decode failed:", err.message),
);
// Encode a map (primitive String elements use raw strings)
const mapResult = encode(
Map_(
TypeCode.String,
TypeCode.String,
new Map([
["name", "Alice"],
["age", "30"],
]),
),
);
mapResult.match(
(bytes) => console.log("Map encoded:", bytes.length, "bytes"),
(err) => console.error("Map encoding failed:", err.message),
);
Tagged Values
Every value is tagged with its type code (see Relish specification for complete details):
| Type | Code | JavaScript Type |
|---|---|---|
| Null | 0x00 | null |
| Bool | 0x01 | boolean |
| u8-u128 | 0x02-0x06 | number / bigint |
| i8-i128 | 0x07-0x0b | number / bigint |
| f32/f64 | 0x0c-0x0d | number |
| String | 0x0e | string |
| Array | 0x0f | Array<T> |
| Map | 0x10 | Map<K, V> |
Next Steps
Learn about Decoding to deserialize bytes back to values.
Decoding
Decode bytes back to typed values.
Roundtrip with .andThen()
Chain encoding and decoding operations:
// examples/core/encode-roundtrip.ts
// Demonstrates: Chaining encode and decode with .andThen()
import { encode, decode, String_ } from "@grounds/core";
// Chain encode -> decode using .andThen()
// If encode fails, decode is skipped and the error propagates
const roundtrip = encode(String_("hello world")).andThen((bytes) => decode(bytes));
// Handle the final result
roundtrip.match(
(value) => {
console.log("Roundtrip successful!");
console.log("Original: hello world");
console.log("Decoded:", value);
},
(err) => {
console.error("Roundtrip failed:", err.message);
},
);
The .andThen() method chains fallible operations. If encoding fails, decoding is skipped and the error propagates.
Decoding Standalone
You can also decode bytes directly:
import { decode } from "@grounds/core";
// Decode some bytes
decode(bytes).match(
(value) => console.log("Decoded:", value),
(err) => console.error("Failed:", err.message),
);
Type Information
Decoded values include their type code:
decode(bytes).match(
(value) => {
console.log("Type:", value.type); // e.g., TypeCode.String
console.log("Value:", value.value); // e.g., "hello"
},
(err) => console.error(err.message),
);
Next Steps
Learn about Error Handling for robust error management.
Error Handling
Grounds uses neverthrow for type-safe error handling.
Handling Errors with .match()
The .match() method provides exhaustive handling of success and error cases:
// examples/core/encode-error.ts
// Demonstrates: Handling encoding errors with .match() and .mapErr()
import { encode, Struct, String_ } from "@grounds/core";
// Attempt to encode a struct with an invalid field ID (>= 128)
// The Relish wire format requires field IDs to have bit 7 clear (0-127)
const invalidStruct = Struct(new Map([[128, String_("This field ID is invalid")]]));
const result = encode(invalidStruct);
// Use .match() to inspect the error
result.match(
(bytes) => {
console.log("Unexpected success:", bytes);
},
(err) => {
console.log("Expected error occurred!");
console.log("Error message:", err.message);
},
);
// Use .mapErr() to add context to errors
const contextualResult = encode(invalidStruct).mapErr((err) => ({
originalMessage: err.message,
context: "Failed while encoding user profile struct",
}));
contextualResult.match(
() => {},
(err) => {
console.log("\nWith added context:");
console.log("Context:", err.context);
console.log("Original message:", err.originalMessage);
},
);
Error Types
EncodeError
Thrown when encoding fails:
- code - Error code string (e.g., "OVERFLOW", "INVALID_TYPE")
- message - Human-readable error description
DecodeError
Thrown when decoding fails:
- code - Error code string (e.g., "UNEXPECTED_EOF", "INVALID_TYPE_CODE")
- message - Human-readable error description
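A handler can branch on the code string. This sketch uses only the example codes listed above; the full set may differ, so a real handler should keep a default branch:

```typescript
// Sketch of branching on decode error codes. The code strings here are the
// examples from this page; unknown codes fall through to the raw message.
type DecodeError = { code: string; message: string };

function describeDecodeError(err: DecodeError): string {
  switch (err.code) {
    case "UNEXPECTED_EOF":
      return "input ended mid-value; wait for more bytes";
    case "INVALID_TYPE_CODE":
      return "unknown type byte; input is not valid Relish";
    default:
      return err.message;
  }
}

console.log(describeDecodeError({ code: "UNEXPECTED_EOF", message: "eof" }));
```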
Adding Context with .mapErr()
Use .mapErr() to add context to errors without changing the error type:
encode(value)
.mapErr((err) => ({
...err,
context: "Failed while encoding user profile",
}))
.match(
(bytes) => {
/* success */
},
(err) => console.error(err.context, "-", err.message),
);
Chaining with .andThen()
When chaining operations, errors propagate automatically:
encode(value)
.andThen((bytes) => decode(bytes)) // skipped if encode fails
.match(
(decoded) => console.log("Success:", decoded),
(err) => console.error("Failed:", err.message),
);
Next Steps
Ready for type-safe schemas? Continue to Schema Structs.
Structs
Define structured data types with RStruct and field().
Defining a Struct
Use RStruct to define a schema with named fields:
// Demonstrates: Defining struct schemas with RStruct and field()
import { RStruct, RString, RU32, RBool, field } from "@grounds/schema";
import type { Static } from "@sinclair/typebox";
// Define a User schema
// Each field has a numeric ID (for wire format) and a type
const UserSchema = RStruct({
id: field(0, RU32()),
name: field(1, RString()),
active: field(2, RBool()),
});
// Static<typeof Schema> extracts the TypeScript type
type User = Static<typeof UserSchema>;
// TypeScript now knows the exact shape
const user: User = {
id: 12345,
name: "Alice",
active: true,
};
console.log("User schema defined successfully");
console.log("User object:", user);
console.log("TypeScript infers: { id: number, name: string, active: boolean }");
Field IDs
Each field has a numeric ID used in the wire format. Field IDs:
- Must be unique within a struct
- Are used for encoding (not the field name)
- Allow schema evolution (add new IDs, deprecate old ones)
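For example, a later schema revision can add a field under a fresh ID while leaving existing IDs untouched. This is a sketch; the optional email field is hypothetical:

```typescript
import { RStruct, RString, RU32, ROptional, field } from "@grounds/schema";

// V1 of the schema.
const UserSchemaV1 = RStruct({
  id: field(0, RU32()),
  name: field(1, RString()),
});

// V2 adds a hypothetical optional field under a new ID. IDs 0 and 1 keep
// their meaning, so V1 payloads remain readable. Renaming a field is safe
// because only the numeric ID goes on the wire, never the name.
const UserSchemaV2 = RStruct({
  id: field(0, RU32()),
  name: field(1, RString()),
  email: field(2, ROptional(RString())),
});
```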
Type Inference
Use Static<typeof Schema> to extract the TypeScript type:
import type { Static } from "@sinclair/typebox";
type User = Static<typeof UserSchema>;
// { id: number; name: string; active: boolean }
Available Field Types
| Schema Type | TypeScript Type | Relish Type |
|---|---|---|
| RString() | string | String |
| RBool() | boolean | Bool |
| RU8() - RU128() | number / bigint | u8 - u128 |
| RI8() - RI128() | number / bigint | i8 - i128 |
| RF32(), RF64() | number | f32, f64 |
| RTimestamp() | DateTime | Timestamp |
| ROptional(T) | T \| null | Optional wrapper |
| RArray(T) | Array<T> | Array |
Next Steps
Learn about Enums for tagged unions, or Codecs for serialization.
Enums
Define tagged unions with REnum and variant().
Defining an Enum
Use REnum to define a schema with named variants:
// Demonstrates: Defining enum schemas with REnum and variant()
import { REnum, RStruct, RString, RU32, field, variant } from "@grounds/schema";
import type { Static } from "@sinclair/typebox";
// Define struct schemas for each variant
const TextMessageSchema = RStruct({
content: field(0, RString()),
sender: field(1, RString()),
});
const ImageMessageSchema = RStruct({
url: field(0, RString()),
width: field(1, RU32()),
height: field(2, RU32()),
});
// Define an enum with named variants
// Each variant has a numeric ID (for wire format) and a schema
const _MessageSchema = REnum({
text: variant(0, TextMessageSchema),
image: variant(1, ImageMessageSchema),
});
// Extract types for each variant
type TextMessage = Static<typeof TextMessageSchema>;
type ImageMessage = Static<typeof ImageMessageSchema>;
// Create instances of each variant
const textMsg: TextMessage = { content: "Hello!", sender: "Alice" };
const imageMsg: ImageMessage = { url: "https://example.com/img.png", width: 800, height: 600 };
console.log("Enum schema defined successfully");
console.log("Text message:", textMsg);
console.log("Image message:", imageMsg);
Variant IDs
Each variant has a numeric ID used in the wire format:
- Must be unique within an enum
- Determines which variant is encoded
- Allows schema evolution (add new variants)
Variant Types
Variants can contain any schema type:
const ResultSchema = REnum({
success: variant(
0,
RStruct({
data: field(0, RString()),
}),
),
error: variant(
1,
RStruct({
code: field(0, RU32()),
message: field(1, RString()),
}),
),
});
Discrimination
After decoding, use type guards or discriminator fields to narrow the type:
function isTextMessage(msg: unknown): msg is TextMessage {
return typeof msg === "object" && msg !== null && "content" in msg;
}
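A self-contained sketch of narrowing with such a guard, using stand-in types that mirror the example above:

```typescript
// Minimal stand-ins for the decoded variant shapes from the example above.
type TextMessage = { content: string; sender: string };
type ImageMessage = { url: string; width: number; height: number };

function isTextMessage(msg: unknown): msg is TextMessage {
  return typeof msg === "object" && msg !== null && "content" in msg;
}

// After decoding, the value is only known loosely; the guard narrows it.
const decoded: unknown = { content: "Hello!", sender: "Alice" };
if (isTextMessage(decoded)) {
  // Inside this branch TypeScript knows decoded is a TextMessage.
  console.log(decoded.content.toUpperCase());
}
```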
Next Steps
Learn about Codecs for encoding and decoding.
Codecs
Create type-safe encoders and decoders with createCodec.
Creating a Codec
Use createCodec to create an encoder/decoder pair:
// Demonstrates: Creating and using codecs for encode/decode
import { RStruct, RString, RU32, field, createCodec } from "@grounds/schema";
import type { Static } from "@sinclair/typebox";
// Define a schema
const UserSchema = RStruct({
id: field(0, RU32()),
name: field(1, RString()),
});
type User = Static<typeof UserSchema>;
// Create a codec from the schema
const userCodec = createCodec(UserSchema);
// Create a user object
const user: User = { id: 42, name: "Bob" };
// Encode and decode using .andThen() chaining
userCodec
.encode(user)
.andThen((bytes) => {
console.log("Encoded:", bytes.length, "bytes");
console.log("Hex:", Buffer.from(bytes).toString("hex"));
return userCodec.decode(bytes);
})
.match(
(decoded) => {
console.log("Decoded:", decoded);
console.log("Roundtrip successful!");
},
(err) => {
console.error("Failed:", err.message);
},
);
Codec API
A codec provides two methods:
encode(value)
Encodes a value to bytes:
const result: Result<Uint8Array, EncodeError> = codec.encode(value);
decode(bytes)
Decodes bytes to a value:
const result: Result<T, DecodeError> = codec.decode(bytes);
Chaining Operations
Use .andThen() for roundtrip operations:
codec
.encode(value)
.andThen((bytes) => codec.decode(bytes))
.match(
(decoded) => console.log("Success:", decoded),
(err) => console.error("Failed:", err.message),
);
Type Safety
The codec enforces types at compile time:
const userCodec = createCodec(UserSchema);
// TypeScript error: missing 'name' property
userCodec.encode({ id: 1 });
// TypeScript error: 'age' does not exist in the schema type
userCodec.encode({ id: 1, name: "Alice", age: "thirty" });
Next Steps
Learn about Optional Fields for nullable values.
Optional Fields
Handle nullable values with ROptional.
Defining Optional Fields
Use ROptional to wrap any schema type:
// Demonstrates: Optional fields with ROptional and null handling
import { RStruct, RString, ROptional, field, createCodec } from "@grounds/schema";
import type { Static } from "@sinclair/typebox";
// Define a schema with optional fields
// Optional fields use null for absent values (not undefined)
const ProfileSchema = RStruct({
name: field(0, RString()),
bio: field(1, ROptional(RString())),
website: field(2, ROptional(RString())),
});
type Profile = Static<typeof ProfileSchema>;
const codec = createCodec(ProfileSchema);
// Profile with all fields
const fullProfile: Profile = {
name: "Alice",
bio: "Software developer",
website: "https://alice.dev",
};
// Profile with some fields null
const minimalProfile: Profile = {
name: "Bob",
bio: null,
website: null,
};
// Encode and decode both
console.log("Full profile:");
codec
.encode(fullProfile)
.andThen((bytes) => codec.decode(bytes))
.match(
(decoded) => console.log(" Decoded:", decoded),
(err) => console.error(" Failed:", err.message),
);
console.log("\nMinimal profile:");
codec
.encode(minimalProfile)
.andThen((bytes) => codec.decode(bytes))
.match(
(decoded) => console.log(" Decoded:", decoded),
(err) => console.error(" Failed:", err.message),
);
Null Semantics
Grounds uses null for absent values (not undefined):
type Profile = {
name: string; // required
bio: string | null; // optional
};
// Valid
const profile: Profile = { name: "Alice", bio: null };
// TypeScript error: undefined is not assignable
const profile: Profile = { name: "Alice", bio: undefined };
Wire Format
Optional fields are encoded as:
- Present: Normal encoding of the inner value
- Absent: Encoded as Null type (1 byte)
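The cost difference can be sketched at the byte level. The constants below are illustrative (taken from the type-code table in these docs), not the library's encoder:

```typescript
// Illustrative encoding of an optional string per the rules above: absent
// costs one Null type byte; present costs the normal type/length/payload.
const NULL_TYPE = 0x00;
const STRING_TYPE = 0x0e;

function encodeOptionalString(value: string | null): Uint8Array {
  if (value === null) return new Uint8Array([NULL_TYPE]); // 1 byte
  const payload = new TextEncoder().encode(value);
  return new Uint8Array([STRING_TYPE, payload.length, ...payload]);
}

console.log(encodeOptionalString(null).length); // 1
console.log(encodeOptionalString("hi").length); // 4: type + length + 2 bytes
```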
Nested Optionals
You can nest optionals for complex scenarios:
const Schema = RStruct({
// Optional array
tags: field(0, ROptional(RArray(RString()))),
// Array of optional strings
notes: field(1, RArray(ROptional(RString()))),
});
Next Steps
Continue to Streaming for incremental encoding.
Async Generators
Stream encode and decode with async generators.
Streaming Encode/Decode
Use encodeIterable and decodeIterable for streaming:
// examples/stream/async-generators.ts
// Demonstrates: Streaming encode/decode with async generators
import { encodeIterable, decodeIterable } from "@grounds/stream";
import { String_, U32, Bool, type RelishValue, type DecodedValue } from "@grounds/core";
// Generate values using an async generator
async function* generateValues(): AsyncGenerator<RelishValue> {
yield String_("hello");
yield U32(42);
yield Bool(true);
}
// Encode values to byte chunks
async function example(): Promise<void> {
const chunks: Array<Uint8Array> = [];
// encodeIterable yields Result<Uint8Array, EncodeError> for each value
for await (const result of encodeIterable(generateValues())) {
result.match(
(bytes) => chunks.push(bytes),
(err) => console.error("Encode error:", err.message),
);
}
console.log("Encoded", chunks.length, "chunks");
// Decode chunks back to values
async function* yieldChunks(): AsyncGenerator<Uint8Array> {
for (const chunk of chunks) {
yield chunk;
}
}
const values: Array<DecodedValue> = [];
// decodeIterable yields Result<DecodedValue, DecodeError> for each value
for await (const result of decodeIterable(yieldChunks())) {
result.match(
(value) => values.push(value),
(err) => console.error("Decode error:", err.message),
);
}
console.log("Decoded", values.length, "values");
console.log("Values:", values);
}
await example();
encodeIterable
Encodes values from an async generator:
async function* values(): AsyncGenerator<RelishValue> {
yield { type: TypeCode.String, value: "hello" };
yield { type: TypeCode.U32, value: 42 };
}
for await (const result of encodeIterable(values())) {
result.match(
(bytes) => sendChunk(bytes),
(err) => console.error(err),
);
}
decodeIterable
Decodes values from an async generator of byte chunks:
async function* chunks(): AsyncGenerator<Uint8Array> {
yield await receiveChunk();
}
for await (const result of decodeIterable(chunks())) {
result.match(
(value) => processValue(value),
(err) => console.error(err),
);
}
Error Handling
Each yielded item is a Result, allowing per-item error handling:
- Continue processing after recoverable errors
- Accumulate errors for batch reporting
- Stop on first error if needed
Next Steps
Learn about Web Streams for the WHATWG Streams API.
For type-safe streaming with schemas, see Schema with Async Generators.
Web Streams
Use the WHATWG Streams API for encode/decode pipelines.
Transform Streams
Use createEncoderStream and createDecoderStream:
// examples/stream/web-streams.ts
// Demonstrates: Web Streams API for encode/decode pipelines
import { createEncoderStream, createDecoderStream } from "@grounds/stream";
import { Null, Bool, String_, type RelishValue, type DecodedValue } from "@grounds/core";
async function example(): Promise<void> {
// Create values to stream
const values: Array<RelishValue> = [Null, Bool(true), String_("streaming!")];
// Create a readable stream of values
const valueStream = new ReadableStream<RelishValue>({
start(controller) {
for (const v of values) {
controller.enqueue(v);
}
controller.close();
},
});
// Pipe through encoder to get byte chunks
const encodedStream = valueStream.pipeThrough(createEncoderStream());
// Collect encoded chunks
const chunks: Array<Uint8Array> = [];
const reader = encodedStream.getReader();
while (true) {
const { done, value } = await reader.read();
if (done) break;
chunks.push(value);
}
console.log("Encoded to", chunks.length, "chunks");
// Create readable stream from chunks
const chunkStream = new ReadableStream<Uint8Array>({
start(controller) {
for (const c of chunks) {
controller.enqueue(c);
}
controller.close();
},
});
// Pipe through decoder to get values back
const decodedStream = chunkStream.pipeThrough(createDecoderStream());
// Collect decoded values
const decoded: Array<DecodedValue> = [];
const decodedReader = decodedStream.getReader();
while (true) {
const { done, value } = await decodedReader.read();
if (done) break;
decoded.push(value);
}
console.log("Decoded", decoded.length, "values");
console.log("Values:", decoded);
}
await example();
createEncoderStream
Creates a TransformStream that encodes RelishValue to Uint8Array:
const encoderStream = createEncoderStream();
valueStream.pipeThrough(encoderStream).pipeTo(networkSink);
createDecoderStream
Creates a TransformStream that decodes Uint8Array to RelishValue:
const decoderStream = createDecoderStream();
networkSource.pipeThrough(decoderStream).pipeTo(valueHandler);
Pipeline Composition
Chain multiple transforms:
sourceStream.pipeThrough(createEncoderStream()).pipeThrough(compressionStream).pipeTo(networkSink);
Browser Compatibility
Web Streams are supported in:
- Modern browsers (Chrome, Firefox, Safari, Edge)
- Node.js 16+ (with --experimental-fetch) or Node 18+
- Deno
- Cloudflare Workers
Next Steps
For type-safe streaming with schemas, see Schema with Web Streams.
See the Reference section for wire format details.
Schema with Async Generators
Stream encode and decode with type-safe schemas using async generators.
Type-Safe Streaming
Use createCodec for schema-aware encoding and decoding:
// examples/stream/schema-async-generators.ts
// Demonstrates: Schema-aware streaming with async generators using toRelish
import { RStruct, RString, field, createCodec } from "@grounds/schema";
import { type Static } from "@sinclair/typebox";
// Define a Message schema using RStruct
const MessageSchema = RStruct({
sender: field(0, RString()),
content: field(1, RString()),
});
type Message = Static<typeof MessageSchema>;
// Generate typed message values using an async generator
async function* generateMessages(): AsyncGenerator<Message> {
yield { sender: "alice", content: "hello" };
yield { sender: "bob", content: "world" };
yield { sender: "charlie", content: "how are you?" };
}
// Example: Encode typed messages using codec, then decode back
async function example(): Promise<void> {
console.log("=== Schema-aware Async Generators ===\n");
// Create a codec for the Message schema
// This provides encode/decode with full type safety
const codec = createCodec(MessageSchema);
// Step 1: Encode messages using the codec
console.log("Encoding messages...");
const chunks: Array<Uint8Array> = [];
for await (const message of generateMessages()) {
const encodeResult = codec.encode(message);
encodeResult.match(
(bytes) => {
chunks.push(bytes);
console.log(` Encoded message from ${message.sender}: ${bytes.length} bytes`);
},
(err) => console.error(" Encode error:", err.message),
);
}
console.log(`\nSuccessfully encoded ${chunks.length} messages\n`);
// Step 2: Decode bytes back to typed messages
console.log("Decoding messages...");
const decodedMessages: Array<Message> = [];
for (const chunk of chunks) {
const decodeResult = codec.decode(chunk);
decodeResult.match(
(message) => {
decodedMessages.push(message);
console.log(` Decoded message from ${message.sender}: "${message.content}"`);
},
(err) => {
console.error(" Decode error:", err.message);
},
);
}
console.log(`\nSuccessfully decoded ${decodedMessages.length} messages\n`);
// Step 3: Verify round-trip
console.log("=== Results ===");
console.log("Original messages:");
for await (const msg of generateMessages()) {
console.log(` ${msg.sender}: ${msg.content}`);
}
console.log("\nDecoded messages:");
for (const msg of decodedMessages) {
console.log(` ${msg.sender}: ${msg.content}`);
}
// Verify all match
let allMatch = true;
for await (const origMsg of generateMessages()) {
const found = decodedMessages.find(
(m) => m.sender === origMsg.sender && m.content === origMsg.content,
);
if (!found) {
allMatch = false;
break;
}
}
console.log(`\nRound-trip successful: ${allMatch ? "YES" : "NO"}`);
}
await example();
Creating a Codec
Define a schema and create a typed codec:
import { RStruct, RString, field, createCodec } from "@grounds/schema";
import { type Static } from "@sinclair/typebox";
const MessageSchema = RStruct({
sender: field(0, RString()),
content: field(1, RString()),
});
type Message = Static<typeof MessageSchema>;
const codec = createCodec(MessageSchema);
The codec provides:
- codec.encode(value): encodes typed values to Uint8Array
- codec.decode(bytes): decodes bytes to typed values
- Full TypeScript type inference from the schema
Encoding with Codecs
Encode typed messages using the codec:
async function* generateMessages(): AsyncGenerator<Message> {
yield { sender: "alice", content: "hello" };
yield { sender: "bob", content: "world" };
}
const chunks: Array<Uint8Array> = [];
for await (const message of generateMessages()) {
const result = codec.encode(message);
result.match(
(bytes) => chunks.push(bytes),
(err) => console.error(err.message),
);
}
Decoding with Codecs
Decode bytes back to typed messages:
const decodedMessages: Array<Message> = [];
for (const chunk of chunks) {
const result = codec.decode(chunk);
result.match(
(message) => decodedMessages.push(message),
(err) => console.error(err.message),
);
}
Type Safety
TypeScript enforces schema types:
- codec.encode() accepts only values matching the schema type
- codec.decode() returns values with the correct TypeScript type
- Compile-time errors prevent type mismatches
- No manual type casting required
Error Handling
Codecs return Result types for explicit error handling:
- Encoding errors for invalid schema values
- Decoding errors for malformed binary data
- Per-message error handling in streams
- Continue processing after recoverable errors
Next Steps
Learn about Schema with Web Streams for the WHATWG Streams API with type safety.
Schema with Web Streams
Use the WHATWG Streams API with type-safe schema encoding and decoding.
Type-Safe Transform Streams
Use createSchemaEncoderStream and createSchemaDecoderStream:
// examples/stream/schema-web-streams.ts
// Demonstrates: Schema-aware Web Streams with automatic type conversion
import { createSchemaEncoderStream, createSchemaDecoderStream } from "@grounds/stream";
import { RStruct, RString, field } from "@grounds/schema";
import { type Static } from "@sinclair/typebox";
// Define a Message schema using RStruct
const MessageSchema = RStruct({
sender: field(0, RString()),
content: field(1, RString()),
});
type Message = Static<typeof MessageSchema>;
async function example(): Promise<void> {
console.log("=== Schema-aware Web Streams ===\n");
// Create typed Message values
const messages: Array<Message> = [
{ sender: "alice", content: "hello" },
{ sender: "bob", content: "world" },
{ sender: "charlie", content: "how are you?" },
];
console.log("Original messages:");
for (const msg of messages) {
console.log(` ${msg.sender}: ${msg.content}`);
}
console.log();
// Step 1: Create a readable stream of typed Message values
const messageStream = new ReadableStream<Message>({
start(controller) {
for (const msg of messages) {
controller.enqueue(msg);
}
controller.close();
},
});
// Step 2: Pipe through schema encoder (accepts Message, outputs Uint8Array)
console.log("Encoding messages through schema encoder stream...");
const encodedStream = messageStream.pipeThrough(createSchemaEncoderStream(MessageSchema));
// Collect encoded chunks
const chunks: Array<Uint8Array> = [];
const encodedReader = encodedStream.getReader();
while (true) {
const { done, value } = await encodedReader.read();
if (done) break;
chunks.push(value);
console.log(` Encoded chunk: ${value.length} bytes`);
}
console.log(`Total encoded: ${chunks.length} chunks\n`);
// Step 3: Create readable stream from encoded chunks
const chunkStream = new ReadableStream<Uint8Array>({
start(controller) {
for (const chunk of chunks) {
controller.enqueue(chunk);
}
controller.close();
},
});
// Step 4: Pipe through schema decoder (accepts Uint8Array, outputs typed Message)
console.log("Decoding messages through schema decoder stream...");
const decodedStream = chunkStream.pipeThrough(createSchemaDecoderStream(MessageSchema));
// Collect decoded typed values
const decodedMessages: Array<Message> = [];
const decodedReader = decodedStream.getReader();
while (true) {
const { done, value } = await decodedReader.read();
if (done) break;
decodedMessages.push(value);
console.log(` Decoded message from ${value.sender}: "${value.content}"`);
}
console.log(`\nSuccessfully decoded ${decodedMessages.length} messages\n`);
// Step 5: Verify round-trip
console.log("=== Verification ===");
console.log("Decoded messages:");
for (const msg of decodedMessages) {
console.log(` ${msg.sender}: ${msg.content}`);
}
// Check if all messages match
let allMatch = true;
for (let i = 0; i < messages.length; i++) {
if (
messages[i]?.sender !== decodedMessages[i]?.sender ||
messages[i]?.content !== decodedMessages[i]?.content
) {
allMatch = false;
break;
}
}
console.log(`\nRound-trip successful: ${allMatch ? "YES" : "NO"}`);
}
await example();
createSchemaEncoderStream
Creates a TransformStream that encodes typed values to Uint8Array:
import { createSchemaEncoderStream } from "@grounds/stream";
import { RStruct, RString, field } from "@grounds/schema";
const MessageSchema = RStruct({
sender: field(0, RString()),
content: field(1, RString()),
});
const encoderStream = createSchemaEncoderStream(MessageSchema);
// Stream accepts typed Message values, outputs Uint8Array
messageStream.pipeThrough(encoderStream).pipeTo(networkSink);
The encoder stream:
- Accepts values matching the schema type
- Automatically converts to Relish wire format
- Outputs Uint8Array chunks
- Validates values against the schema
createSchemaDecoderStream
Creates a TransformStream that decodes Uint8Array to typed values:
const decoderStream = createSchemaDecoderStream(MessageSchema);
// Stream accepts Uint8Array, outputs typed Message values
networkSource.pipeThrough(decoderStream).pipeTo(messageHandler);
The decoder stream:
- Accepts Uint8Array chunks
- Parses the Relish wire format
- Outputs typed values matching the schema
- Validates decoded data against schema
Pipeline Composition
Chain schema transforms with other streams:
sourceStream
.pipeThrough(createSchemaEncoderStream(MessageSchema))
.pipeThrough(compressionStream)
.pipeTo(networkSink);
networkSource
.pipeThrough(decompressionStream)
.pipeThrough(createSchemaDecoderStream(MessageSchema))
.pipeTo(messageHandler);
Type Safety
TypeScript enforces schema types throughout pipelines:
- Encoder input must match schema type
- Decoder output is typed by schema
- Compile-time errors for type mismatches
- No manual validation needed
Browser Compatibility
Schema streams work in the same environments as basic streams:
- Modern browsers (Chrome, Firefox, Safari, Edge)
- Node.js 16+ (with --experimental-fetch) or Node.js 18+
- Deno
- Cloudflare Workers
Next Steps
See the Reference section for wire format details.
Compressed Streams
Combine schema-aware streams with the Web Streams CompressionStream API for efficient data transfer.
Full Example
This example demonstrates encoding log entries, compressing them, then decompressing and decoding to verify the roundtrip:
// examples/stream/compressed-streams.ts
// Demonstrates: Combining schema-aware streams with Web Streams CompressionStream API
//
// Usage: pnpm exec tsx examples/stream/compressed-streams.ts [algorithm] [count]
// algorithm: "gzip" (default), "deflate", or "deflate-raw"
// count: number of log entries to generate (default: 20)
//
// Examples:
// pnpm exec tsx examples/stream/compressed-streams.ts
// pnpm exec tsx examples/stream/compressed-streams.ts deflate
// pnpm exec tsx examples/stream/compressed-streams.ts gzip 100
import { createSchemaEncoderStream, createSchemaDecoderStream } from "@grounds/stream";
import { RStruct, RString, RTimestamp, RMap, field } from "@grounds/schema";
import { type Static } from "@sinclair/typebox";
import { DateTime } from "luxon";
import { faker } from "@faker-js/faker";
// Define a LogEntry schema - realistic for streaming + compression scenarios
const LogEntrySchema = RStruct({
  timestamp: field(0, RTimestamp()),
  level: field(1, RString()),
  message: field(2, RString()),
  source: field(3, RString()),
  attributes: field(4, RMap(RString(), RString())),
});
type LogEntry = Static<typeof LogEntrySchema>;
// Supported compression algorithms
// gzip, deflate, deflate-raw: Standard Web Streams API (all runtimes)
// zstd: Bun only (not available in browsers or Node.js)
type CompressionAlgorithm = "gzip" | "deflate" | "deflate-raw" | "zstd";
const VALID_ALGORITHMS: ReadonlyArray<CompressionAlgorithm> = [
  "gzip",
  "deflate",
  "deflate-raw",
  "zstd",
];

function isValidAlgorithm(value: string): value is CompressionAlgorithm {
  return VALID_ALGORITHMS.includes(value as CompressionAlgorithm);
}
// Generate sample log entries with varied data using faker
function generateLogEntries(count: number): Array<LogEntry> {
  const levels = ["info", "warn", "error", "debug"] as const;
  const sources = ["api", "auth", "db", "cache", "worker"] as const;
  const entries: Array<LogEntry> = [];
  const baseTime = DateTime.now().toUTC();
  for (let i = 0; i < count; i++) {
    entries.push({
      timestamp: baseTime.plus({ seconds: i }),
      level: faker.helpers.arrayElement(levels),
      message: faker.hacker.phrase(),
      source: faker.helpers.arrayElement(sources),
      attributes: new Map([
        ["requestId", faker.string.uuid()],
        ["userId", faker.string.nanoid(10)],
        ["duration", `${faker.number.int({ min: 1, max: 5000 })}ms`],
        ["ip", faker.internet.ipv4()],
        ["userAgent", faker.internet.userAgent()],
      ]),
    });
  }
  return entries;
}
// Format bytes with thousands separator
function formatBytes(bytes: number): string {
  return bytes.toLocaleString();
}
// Verify that decoded entries match the originals
// Note: Relish timestamps are Unix seconds, so millisecond precision is lost
function verifyRoundtrip(originals: Array<LogEntry>, decoded: Array<LogEntry>): boolean {
  if (originals.length !== decoded.length) {
    return false;
  }
  for (let i = 0; i < originals.length; i++) {
    const original = originals[i];
    const dec = decoded[i];
    if (!original || !dec) {
      return false;
    }
    // Compare at second precision (Relish truncates to seconds)
    const originalSeconds = Math.floor(original.timestamp.toSeconds());
    const decodedSeconds = Math.floor(dec.timestamp.toSeconds());
    if (
      originalSeconds !== decodedSeconds ||
      original.level !== dec.level ||
      original.message !== dec.message ||
      original.source !== dec.source
    ) {
      return false;
    }
    // Compare attributes map
    if (original.attributes.size !== dec.attributes.size) {
      return false;
    }
    for (const [key, value] of original.attributes) {
      if (dec.attributes.get(key) !== value) {
        return false;
      }
    }
  }
  return true;
}
async function main(): Promise<void> {
  // Parse CLI arguments
  const algorithmArg = process.argv[2] ?? "gzip";
  const countArg = process.argv[3] ?? "20";

  if (!isValidAlgorithm(algorithmArg)) {
    console.error(
      `Invalid algorithm: "${algorithmArg}". Must be one of: ${VALID_ALGORITHMS.join(", ")}`,
    );
    process.exit(1);
  }

  const count = parseInt(countArg, 10);
  if (isNaN(count) || count <= 0) {
    console.error(`Invalid count: "${countArg}". Must be a positive integer.`);
    process.exit(1);
  }

  const algorithm: CompressionAlgorithm = algorithmArg;
  console.log(`Compression algorithm: ${algorithm}`);
  console.log(`Generating ${count} log entries...\n`);

  // Generate sample data
  const entries = generateLogEntries(count);
  console.log("Pipeline: encode → compress → decompress → decode\n");

  // Step 1: Create source stream from log entries
  const sourceStream = new ReadableStream<LogEntry>({
    start(controller) {
      for (const entry of entries) {
        controller.enqueue(entry);
      }
      controller.close();
    },
  });

  // Step 2: Encode → Compress pipeline
  // Schema encoder outputs Uint8Array, which feeds directly into CompressionStream
  // Note: Type assertions needed because:
  //   1. DOM types don't include "zstd" (Bun-only algorithm)
  //   2. CompressionStream types don't perfectly align with Web Streams generics
  const compressedStream = sourceStream
    .pipeThrough(createSchemaEncoderStream(LogEntrySchema))
    .pipeThrough(
      new CompressionStream(algorithm as CompressionFormat) as unknown as TransformStream<
        Uint8Array,
        Uint8Array
      >,
    );

  // Step 3: Collect compressed bytes to measure size
  const compressedChunks: Array<Uint8Array> = [];
  const compressedReader = compressedStream.getReader();
  while (true) {
    const { done, value } = await compressedReader.read();
    if (done) break;
    compressedChunks.push(value);
  }

  // Calculate sizes
  const compressedSize = compressedChunks.reduce((sum, chunk) => sum + chunk.length, 0);

  // Step 4: Decompress → Decode pipeline
  const compressedDataStream = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of compressedChunks) {
        controller.enqueue(chunk);
      }
      controller.close();
    },
  });

  // Note: Type assertions for same reasons as CompressionStream above
  const decodedStream = compressedDataStream
    .pipeThrough(
      new DecompressionStream(algorithm as CompressionFormat) as unknown as TransformStream<
        Uint8Array,
        Uint8Array
      >,
    )
    .pipeThrough(createSchemaDecoderStream(LogEntrySchema));

  // Step 5: Collect decoded entries and measure uncompressed size
  const decodedEntries: Array<LogEntry> = [];
  let uncompressedSize = 0;

  // We need to re-encode to get uncompressed size (or calculate from original)
  // For simplicity, we'll encode the original entries without compression
  const measureStream = new ReadableStream<LogEntry>({
    start(controller) {
      for (const entry of entries) {
        controller.enqueue(entry);
      }
      controller.close();
    },
  });
  const measureEncodedStream = measureStream.pipeThrough(createSchemaEncoderStream(LogEntrySchema));
  const measureReader = measureEncodedStream.getReader();
  while (true) {
    const { done, value } = await measureReader.read();
    if (done) break;
    uncompressedSize += value.length;
  }

  // Collect decoded entries
  const decodedReader = decodedStream.getReader();
  while (true) {
    const { done, value } = await decodedReader.read();
    if (done) break;
    decodedEntries.push(value);
  }

  // Calculate compression ratio
  const ratio = ((1 - compressedSize / uncompressedSize) * 100).toFixed(1);
  console.log(`Uncompressed size: ${formatBytes(uncompressedSize)} bytes`);
  console.log(`Compressed size: ${formatBytes(compressedSize)} bytes`);
  console.log(`Compression ratio: ${ratio}%\n`);

  // Step 6: Verify roundtrip
  const allMatch = verifyRoundtrip(entries, decodedEntries);
  if (allMatch) {
    console.log(`✓ All ${decodedEntries.length} entries decoded successfully`);
    console.log("✓ Roundtrip verification passed");
  } else {
    console.log("✗ Roundtrip verification FAILED");
    process.exit(1);
  }
}

await main();
Composing with CompressionStream
Since createSchemaEncoderStream outputs Uint8Array and CompressionStream accepts Uint8Array, they compose directly:
import { createSchemaEncoderStream } from "@grounds/stream";
import { RStruct, RString, field } from "@grounds/schema";
const MessageSchema = RStruct({
  sender: field(0, RString()),
  content: field(1, RString()),
});

// Encode → Compress pipeline
sourceStream
  .pipeThrough(createSchemaEncoderStream(MessageSchema))
  .pipeThrough(new CompressionStream("gzip"))
  .pipeTo(networkSink);
No adapters or conversion needed—standard Web Streams composition.
Decompression Pipeline
The reverse pipeline decompresses then decodes:
import { createSchemaDecoderStream } from "@grounds/stream";
// Decompress → Decode pipeline
networkSource
  .pipeThrough(new DecompressionStream("gzip"))
  .pipeThrough(createSchemaDecoderStream(MessageSchema))
  .pipeTo(messageHandler);
Supported Algorithms
| Algorithm | Description | Runtime Support |
|---|---|---|
| gzip | Most compatible; includes header and checksum | All runtimes |
| deflate | Deflate with zlib header | All runtimes |
| deflate-raw | Raw deflate, no header | All runtimes |
| zstd | Fast, high compression ratio | Bun only |
Note: Brotli is not currently supported by any runtime’s native CompressionStream API.
Compression Ratios
Compression effectiveness depends on your data. The example uses faker to generate realistic, varied log entries:
- 20 log entries: ~55% compression (5,763 → 2,576 bytes)
- 50 log entries: ~58% compression (14,000 → 5,900 bytes)
- 100 log entries: ~60% compression (28,000 → 11,200 bytes)
Highly repetitive data (same messages, same IDs) would compress even better. Real-world data typically falls somewhere in between.
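The percentage figures above are bytes saved relative to the uncompressed size. A small helper (a local convenience, not part of any Grounds package) makes the formula explicit:

```typescript
// Percent of bytes saved: (1 - compressed / uncompressed) * 100
function compressionRatio(uncompressedBytes: number, compressedBytes: number): string {
  return `${((1 - compressedBytes / uncompressedBytes) * 100).toFixed(1)}%`;
}

// The 20-entry figure from above, 5,763 → 2,576 bytes:
// compressionRatio(5763, 2576) → "55.3%"
```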
TypeScript Considerations
The TypeScript DOM types for CompressionStream don’t perfectly align with Web Streams generics. You may need type assertions:
// Type assertion for TypeScript compatibility
.pipeThrough(
  new CompressionStream("gzip") as unknown as TransformStream<
    Uint8Array,
    Uint8Array
  >,
)
This is a TypeScript type definition issue, not a runtime problem.
Runtime Compatibility
CompressionStream is supported in:
- Chrome 80+
- Firefox 113+
- Safari 16.4+
- Edge 80+
- Node.js 18+
- Deno
- Bun (includes zstd support)
For older environments, consider polyfills like compression-streams-polyfill.
Next Steps
- See Schema with Web Streams for more on typed streaming
- See the Reference section for wire format details
API Reference
Complete API documentation is available in the TypeDoc-generated reference.
Packages
- @grounds/core - Low-level Relish wire format encoding and decoding
- @grounds/schema - TypeBox schema integration with compile-time validation
- @grounds/stream - Streaming utilities for encoding and decoding
What’s in the API Reference?
The API reference includes detailed documentation for all exported functions, classes, types, and interfaces:
- Function signatures with parameter types and return types
- Class constructors, methods, and properties
- Type aliases and interfaces
- Code examples and usage patterns
- Cross-references to related APIs
★ Tip: Use the search function in the API documentation (top-right) to quickly find specific functions, types, or classes.
Type Codes
Complete reference of Relish type codes.
For the authoritative specification, see the Relish Spec.
Primitive Types
| Type | Code | JavaScript | Notes |
|---|---|---|---|
| Null | 0x00 | null | No payload |
| Bool | 0x01 | boolean | 0x00 = false, 0xFF = true |
| u8 | 0x02 | number | 1 byte unsigned |
| u16 | 0x03 | number | 2 bytes little-endian |
| u32 | 0x04 | number | 4 bytes little-endian |
| u64 | 0x05 | bigint | 8 bytes little-endian |
| u128 | 0x06 | bigint | 16 bytes little-endian |
| i8 | 0x07 | number | 1 byte signed |
| i16 | 0x08 | number | 2 bytes little-endian |
| i32 | 0x09 | number | 4 bytes little-endian |
| i64 | 0x0a | bigint | 8 bytes little-endian |
| i128 | 0x0b | bigint | 16 bytes little-endian |
| f32 | 0x0c | number | IEEE 754 single |
| f64 | 0x0d | number | IEEE 754 double |
| Timestamp | 0x13 | bigint / DateTime | Unix seconds (8 bytes) |
Variable-Length Types
| Type | Code | JavaScript | Notes |
|---|---|---|---|
| String | 0x0e | string | UTF-8 encoded |
| Array | 0x0f | Array<T> | Length-prefixed elements |
| Map | 0x10 | Map<K, V> | Length-prefixed key-value pairs |
Composite Types
| Type | Code | JavaScript | Notes |
|---|---|---|---|
| Struct | 0x11 | object | Field ID + value pairs |
| Enum | 0x12 | tagged union | Variant ID + value |
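For debugging raw buffers, the tables above can be collapsed into a lookup map. This is a local convenience copied from the tables, not a Grounds export:

```typescript
// Relish type codes from the tables above, keyed by name.
const TYPE_CODES = {
  Null: 0x00,
  Bool: 0x01,
  U8: 0x02,
  U16: 0x03,
  U32: 0x04,
  U64: 0x05,
  U128: 0x06,
  I8: 0x07,
  I16: 0x08,
  I32: 0x09,
  I64: 0x0a,
  I128: 0x0b,
  F32: 0x0c,
  F64: 0x0d,
  String: 0x0e,
  Array: 0x0f,
  Map: 0x10,
  Struct: 0x11,
  Enum: 0x12,
  Timestamp: 0x13,
} as const;

// Reverse lookup: first byte of an encoded value → type name
function typeName(code: number): string | undefined {
  return (Object.entries(TYPE_CODES) as Array<[string, number]>).find(
    ([, c]) => c === code,
  )?.[0];
}
```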
Reserved
Bit 7 (0x80) is reserved for future use.
Next Steps
See Wire Format for encoding details.
Wire Format
Relish encoding structure and format details.
For the authoritative specification, see the Relish Spec.
T[L]V Structure
Every Relish value is encoded as Type-[Length]-Value:
┌──────────┬──────────────┬─────────────┐
│ Type (1) │ Length (1-5) │ Value (N) │
└──────────┴──────────────┴─────────────┘
Type Byte
Single byte identifying the value type (0x00-0x13).
Bit 7 is reserved for future use.
Length (Varsize)
Variable-length encoding of payload size:
- Bit 0 = 0: 7-bit length (0-127 bytes) in single byte
- Bit 0 = 1: 4-byte little-endian length (up to 2³¹-1 bytes)
Examples:
- 0x0a → 5 bytes (5 << 1 = 10, bit 0 = 0)
- 0x01 0x00 0x01 0x00 0x00 → 128 bytes (bit 0 = 1, then LE u32)
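One reading consistent with both examples above, sketched in TypeScript: short lengths are stored as length << 1 in a single byte, and the long form sets bit 0 in the first byte and stores length << 1 in the following little-endian u32. Treat this as an illustration only and defer to the Relish spec for the authoritative rules.

```typescript
// Sketch of the varsize length encoding, per the reading stated above.
function encodeVarsize(length: number): Uint8Array {
  if (length < 128) {
    return new Uint8Array([length << 1]); // bit 0 clear: 7-bit form
  }
  const out = new Uint8Array(5);
  out[0] = 0x01; // bit 0 set: long form, u32 follows
  new DataView(out.buffer).setUint32(1, length << 1, true); // little-endian
  return out;
}

function decodeVarsize(bytes: Uint8Array): number {
  const first = bytes[0]!;
  if ((first & 0x01) === 0) {
    return first >> 1; // 7-bit form
  }
  const view = new DataView(bytes.buffer, bytes.byteOffset);
  return view.getUint32(1, true) >> 1; // long form
}

// encodeVarsize(5)   → [0x0a]
// encodeVarsize(128) → [0x01, 0x00, 0x01, 0x00, 0x00]
```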
Value
Type-specific encoding. All integers are little-endian.
Encoding Examples
String “hi”
0x0e # Type: String
0x04 # Length: 2 bytes (2 << 1 = 4)
0x68 0x69 # Value: "hi" in UTF-8
u32 value 42
0x04 # Type: u32
0x08 # Length: 4 bytes (4 << 1 = 8)
0x2a 0x00 0x00 0x00 # Value: 42 in little-endian
Bool true
0x01 # Type: Bool
0x02 # Length: 1 byte (1 << 1 = 2)
0xff # Value: true
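The byte-by-byte examples above can be reproduced by hand. A minimal sketch, assuming payloads under 128 bytes so the single-byte (length << 1) varsize form applies; these helper names are illustrative, not Grounds APIs:

```typescript
// Hand-rolled T[L]V encoders matching the three examples above.
function encodeStringTlv(s: string): Uint8Array {
  const payload = new TextEncoder().encode(s); // UTF-8
  return new Uint8Array([0x0e, payload.length << 1, ...payload]);
}

function encodeU32Tlv(n: number): Uint8Array {
  const out = new Uint8Array(6);
  out[0] = 0x04; // Type: u32
  out[1] = 4 << 1; // Length: 4 bytes
  new DataView(out.buffer).setUint32(2, n, true); // little-endian value
  return out;
}

function encodeBoolTlv(b: boolean): Uint8Array {
  return new Uint8Array([0x01, 1 << 1, b ? 0xff : 0x00]);
}

// encodeStringTlv("hi") → [0x0e, 0x04, 0x68, 0x69]
// encodeU32Tlv(42)      → [0x04, 0x08, 0x2a, 0x00, 0x00, 0x00]
// encodeBoolTlv(true)   → [0x01, 0x02, 0xff]
```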
Struct Encoding
Structs encode as a sequence of (field_id, value) pairs:
┌───────────────┬───────────────┐
│ Struct (0x11) │ Length        │
├───────────────┼───────────────┤
│ Field ID (u8) │ Value (T[L]V) │
│ Field ID (u8) │ Value (T[L]V) │
│ ...           │ ...           │
└───────────────┴───────────────┘
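Per the layout above, a struct payload is just the concatenation of one-byte field IDs and nested T[L]V values, with the whole payload length-prefixed like any other value. A hedged sketch (short varsize form assumed; `encodeStructTlv` is an illustrative helper, not a Grounds API):

```typescript
// Assemble a struct from (fieldId, alreadyEncodedTlvValue) pairs.
// Assumes the total payload stays under 128 bytes (single-byte varsize).
function encodeStructTlv(fields: Array<[number, Uint8Array]>): Uint8Array {
  const payload: number[] = [];
  for (const [id, tlv] of fields) {
    payload.push(id, ...tlv); // field ID byte, then the nested T[L]V
  }
  return new Uint8Array([0x11, payload.length << 1, ...payload]);
}

// Example: a struct whose field 0 holds the String "hi"
// (the nested TLV [0x0e, 0x04, 0x68, 0x69] is from the wire-format examples):
// encodeStructTlv([[0, new Uint8Array([0x0e, 0x04, 0x68, 0x69])]])
//   → [0x11, 0x0a, 0x00, 0x0e, 0x04, 0x68, 0x69]
```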
Enum Encoding
Enums encode as variant ID followed by value:
┌──────────────┬───────────────┐
│ Enum (0x12)  │ Length        │
├──────────────┼───────────────┤
│ Variant ID   │ Value (T[L]V) │
└──────────────┴───────────────┘
Learn More
- Relish announcement - Design rationale
- Relish specification - Authoritative spec
- Relish reference implementation - Rust implementation