Rename Lossless/Lossy to Hyperview/View #6

Merged
lentil merged 4 commits from chore/rename-as-hyperview into main 2025-07-09 14:37:37 -05:00
68 changed files with 862 additions and 746 deletions
Showing only changes of commit 50ac2b7b35

View File

@@ -42,7 +42,7 @@ npm run example-app
 - DeltaV2 is the current format (DeltaV1 is legacy)
 2. **Views**: Different ways to interpret the delta stream:
-- **Lossless View**: Stores all deltas without conflict resolution
+- **Hyperview View**: Stores all deltas without conflict resolution
 - **Lossy Views**: Apply conflict resolution (e.g., Last-Write-Wins)
 - Custom resolvers can be implemented
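
The renamed classes line up with the usage exercised in the test diffs below. As a minimal sketch (assembled from calls that appear in this PR, not a complete program), the hyperview ingests raw deltas and a resolver produces the conflict-resolved view:

```typescript
import { RhizomeNode } from '@src/node';
import { Hyperview } from '@src/views/hyperview';
import { createDelta } from '@src/core/delta-builder';
import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers';

// The hyperview keeps every delta; the resolver applies conflict resolution on read.
const node = new RhizomeNode();
const hyperview = new Hyperview(node);

hyperview.ingestDelta(
  createDelta('writer1', 'host1')
    .withTimestamp(1000)
    .setProperty('entity1', 'score', 100, 'collection')
    .buildV1()
);

const view = new TimestampResolver(hyperview); // formerly a "Lossy" view
const result = view.resolve();
// Expected, per the resolver tests in this PR: result!['entity1'].properties.score === 100
```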
@@ -60,7 +60,7 @@ npm run example-app
 - `src/node.ts`: Main `RhizomeNode` class orchestrating all components
 - `src/delta.ts`: Delta data structures and conversion logic
-- `src/lossless.ts`: Core lossless view implementation
+- `src/hyperview.ts`: Core hyperview view implementation
 - `src/collection-basic.ts`: Basic collection implementation
 - `src/http/api.ts`: REST API endpoints
 - `src/pub-sub.ts`: Network communication layer
@@ -78,7 +78,7 @@ The HTTP API provides RESTful endpoints:
 - `GET/PUT /collection/:name/:id` - Entity operations
 - `GET /peers` - Peer information
 - `GET /deltas/stats` - Delta statistics
-- `GET /lossless/:entityId` - Raw delta access
+- `GET /hyperview/:entityId` - Raw delta access
 ### Important Implementation Notes
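
A hedged sketch of a client call against the renamed raw-delta endpoint; the host and port are assumptions, not the project's documented defaults:

```typescript
async function fetchRawDeltas(entityId: string): Promise<unknown> {
  // localhost:3000 is a placeholder; substitute the address the node's HTTP API listens on.
  const res = await fetch(`http://localhost:3000/hyperview/${entityId}`);
  return res.json();
}
```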

View File

@@ -150,13 +150,13 @@ curl -s -X PUT -H 'content-type:application/json' -d @/tmp/user.json http://loca
 | Peering | Yes | Implemented with ZeroMQ and/or Libp2p. Libp2p solves more problems. |
 | Schemas | Not really | Currently very thin layer allowing TypedCollections |
 | Relationships | No | Supporting relational algebra among domain entities |
-| Views | Yes | Lossless: Map the `targetContext`s as properties of domain entities. |
+| Views | Yes | Hyperview: Map the `targetContext`s as properties of domain entities. |
 | | | Lossy: Use a delta filter and a resolver function to produce a view. |
 | | | Currently using functions rather than JSON-Logic expressions. |
 | Functions | No | Arbitrary subscribers to delta stream (that can also emit deltas?) |
 | Tests | Yes | We are set up to run unit tests and multi-node tests |
 | Identity | Sort of | We have an identity service via Libp2p |
-| Contexts | No | Each context may involve different lossy functions and delta filters |
+| Contexts | No | Each context may involve different view functions and delta filters |
 | HTTP API | Yes | Basic peering info and entity CRUD |
 If we express views and filter rules as JSON-Logic, we can easily include them in records.
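
Since filters are currently plain functions, JSON-Logic is only a direction here, not an existing API. A hedged sketch of what a serializable delta filter might look like (the property names `creator` and `timestamp` are illustrative, not the actual delta schema):

```typescript
// A JSON-Logic expression is plain data, so it could be stored in a record
// alongside the view definition and evaluated against each incoming delta.
const recentDeltasByWriter1 = {
  and: [
    { '==': [{ var: 'creator' }, 'writer1'] },
    { '>': [{ var: 'timestamp' }, 1000] },
  ],
};
```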

View File

@@ -90,7 +90,7 @@ graph TD
 - Design storage layer (Mnesia/ETS)
 - **View System**
-- Implement Lossy/Lossless views
+- Implement Lossy/Hyperview views
 - Create resolver framework
 - Add caching layer
@@ -237,7 +237,7 @@ sequenceDiagram
 - [ ] Define delta types and operations
 - [ ] Implement DeltaBuilder
 - [ ] Basic storage with Mnesia/ETS
-- [ ] View system with Lossy/Lossless support
+- [ ] View system with Lossy/Hyperview support
 ### 2. Distributed Foundation
 - [ ] Node discovery and membership

View File

@@ -2,10 +2,10 @@
 - [x] Organize tests?
 - [x] More documentation in docs/
-- [ ] Rename/consolidate, lossless view() and compose() --> composeView()
+- [x] Rename/consolidate, hyperview view() and compose() --> composeView()
-- [ ] Rename Lossless to HyperView
+- [x] Rename Hyperview to HyperView
-- [ ] Rename Lossy to View
+- [x] Rename Lossy to View
-- [ ] Consider whether we should use collapsed deltas
+- [x] Consider whether we should use collapsed deltas
 - [ ] Improve ergonomics of declaring multiple entity properties in one delta
 - [x] Use dotenv so we can more easily manage the local dev test environment
-- [ ] Create test helpers to reduce boilerplate
+- [x] Create test helpers to reduce boilerplate

View File

@@ -1,5 +1,5 @@
 # Test structure
-- before test, initialize node and lossless view
+- before test, initialize node and hyperview view
 - when test begins, create and ingest a series of deltas
 - instantiate a resolver, in this case using custom resolver plugins
 - call the resolver's initializer with the view

View File

@@ -1,5 +1,5 @@
 import { RhizomeNode } from '@src';
-import { Lossless } from '@src/views/lossless';
+import { Hyperview } from '@src/views/hyperview';
 import { Delta } from '@src/core/delta';
 import { createDelta } from '@src/core/delta-builder';
 import { CustomResolver } from '@src/views/resolvers/custom-resolvers';
@@ -29,12 +29,12 @@ export async function testResolverWithPlugins<T extends TestPluginMap>(
 // Setup test environment
 const node = new RhizomeNode();
-const lossless = new Lossless(node);
+const hyperview = new Hyperview(node);
-const view = new CustomResolver(lossless, plugins);
+const view = new CustomResolver(hyperview, plugins);
-// Ingest all deltas through the lossless instance
+// Ingest all deltas through the hyperview instance
 for (const delta of deltas) {
-lossless.ingestDelta(delta);
+hyperview.ingestDelta(delta);
 }
 // Get the resolved view

View File

@@ -1,4 +1,4 @@
-import { LosslessViewOne } from '@src/views/lossless';
+import { HyperviewViewOne } from '@src/views/hyperview';
 import {
 SchemaBuilder,
 PrimitiveSchemas,
@@ -153,12 +153,12 @@ describe('Schema System', () => {
 expect(schemaRegistry.hasCircularDependencies()).toBe(true);
 });
-test('should validate lossless views against schemas', () => {
+test('should validate hyperview views against schemas', () => {
 const userSchema = CommonSchemas.User();
 schemaRegistry.register(userSchema);
-// Create a valid lossless view
+// Create a valid hyperview view
-const validView: LosslessViewOne = {
+const validView: HyperviewViewOne = {
 id: 'user123',
 propertyDeltas: {
 name: [
@@ -179,7 +179,7 @@ describe('Schema System', () => {
 expect(result.errors).toHaveLength(0);
 // Test invalid view (missing required property)
-const invalidView: LosslessViewOne = {
+const invalidView: HyperviewViewOne = {
 id: 'user456',
 propertyDeltas: {
 age: [
@@ -212,7 +212,7 @@ describe('Schema System', () => {
 schemaRegistry.register(schema);
 // Valid types
-const validView: LosslessViewOne = {
+const validView: HyperviewViewOne = {
 id: 'test1',
 propertyDeltas: {
 stringProp: [
@@ -240,7 +240,7 @@ describe('Schema System', () => {
 expect(validResult.valid).toBe(true);
 // Invalid types
-const invalidView: LosslessViewOne = {
+const invalidView: HyperviewViewOne = {
 id: 'test2',
 propertyDeltas: {
 stringProp: [
@@ -331,7 +331,7 @@ describe('Schema System', () => {
 .addPointer('users', 'user3', 'email')
 .addPointer('email', 'invalid@test.com')
 .buildV1();
-node.lossless.ingestDelta(invalidDelta);
+node.hyperview.ingestDelta(invalidDelta);
 const stats = collection.getValidationStats();
 expect(stats.totalEntities).toBe(3);
@@ -355,11 +355,11 @@ describe('Schema System', () => {
 const invalidDelta = createDelta(node.config.creator, node.config.peerId)
 .setProperty('user3', 'age', 'not-a-number', 'users')
 .buildV1();
-node.lossless.ingestDelta(invalidDelta);
+node.hyperview.ingestDelta(invalidDelta);
 debug(`Manually ingested invalid delta: ${JSON.stringify(invalidDelta)}`)
-debug(`Lossless view: ${JSON.stringify(node.lossless.compose(), null, 2)}`)
+debug(`Hyperview view: ${JSON.stringify(node.hyperview.compose(), null, 2)}`)
 const validIds = collection.getValidEntities();
 expect(validIds).toContain('user1');
@@ -371,7 +371,7 @@ describe('Schema System', () => {
 expect(invalidEntities[0].entityId).toBe('user3');
 });
-test('should apply schema to lossless views', async () => {
+test('should apply schema to hyperview views', async () => {
 const userSchema = CommonSchemas.User();
 const collection = new TypedCollectionImpl<{
 name: string;

View File

@@ -1,7 +1,7 @@
 import { createDelta } from '@src/core/delta-builder';
 import {
 RhizomeNode,
-Lossless,
+Hyperview,
 SumResolver,
 CustomResolver,
 LastWriteWinsPlugin,
@@ -13,28 +13,28 @@ const debug = Debug('rz:test:performance');
 describe('Concurrent Write Scenarios', () => {
 let node: RhizomeNode;
-let lossless: Lossless;
+let hyperview: Hyperview;
 beforeEach(() => {
 node = new RhizomeNode();
-lossless = new Lossless(node);
+hyperview = new Hyperview(node);
 });
 describe('Simultaneous Writes with Same Timestamp', () => {
 test('should handle simultaneous writes using last-write-wins resolver', () => {
-const resolver = new TimestampResolver(lossless);
+const resolver = new TimestampResolver(hyperview);
 const timestamp = 1000;
 // Simulate two writers updating the same property at the exact same time
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withId('delta-a')
 .withTimestamp(timestamp)
 .setProperty('entity1', 'score', 100, 'collection')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer2', 'host2')
+hyperview.ingestDelta(createDelta('writer2', 'host2')
 .withId('delta-b')
 .withTimestamp(timestamp) // Same timestamp
 .setProperty('entity1', 'score', 200, 'collection')
@@ -52,16 +52,16 @@ describe('Concurrent Write Scenarios', () => {
 test('should handle simultaneous writes using timestamp resolver with tie-breaking', () => {
 const timestamp = 1000;
-const resolver = new TimestampResolver(lossless, 'creator-id');
+const resolver = new TimestampResolver(hyperview, 'creator-id');
-lossless.ingestDelta(createDelta('writer_z', 'host1') // Lexicographically later
+hyperview.ingestDelta(createDelta('writer_z', 'host1') // Lexicographically later
 .withId('delta-a')
 .withTimestamp(timestamp)
 .setProperty('entity1', 'score', 100, 'collection')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer_a', 'host2') // Lexicographically earlier
+hyperview.ingestDelta(createDelta('writer_a', 'host2') // Lexicographically earlier
 .withId('delta-b')
 .withTimestamp(timestamp) // Same timestamp
 .setProperty('entity1', 'score', 200, 'collection')
@@ -76,23 +76,23 @@ describe('Concurrent Write Scenarios', () => {
 });
 test('should handle multiple writers with aggregation resolver', () => {
-const resolver = new SumResolver(lossless, ['points']);
+const resolver = new SumResolver(hyperview, ['points']);
 // Multiple writers add values simultaneously
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(1000)
 .setProperty('entity1', 'points', 10, 'collection')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer2', 'host2')
+hyperview.ingestDelta(createDelta('writer2', 'host2')
 .withTimestamp(1000) // Same timestamp
 .setProperty('entity1', 'points', 20, 'collection')
 .buildV1()
 );
 // Third writer adds another value
-lossless.ingestDelta(createDelta('writer3', 'host3')
+hyperview.ingestDelta(createDelta('writer3', 'host3')
 .withTimestamp(1000) // Same timestamp
 .setProperty('entity1', 'points', 30, 'collection')
 .buildV1()
@@ -108,10 +108,10 @@ describe('Concurrent Write Scenarios', () => {
 describe('Out-of-Order Write Arrival', () => {
 test('should handle writes arriving out of chronological order', () => {
-const resolver = new TimestampResolver(lossless);
+const resolver = new TimestampResolver(hyperview);
 // Newer delta arrives first
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(2000)
 .addPointer('collection', 'entity1', 'value')
 .addPointer('value', 'newer')
@@ -119,7 +119,7 @@ describe('Concurrent Write Scenarios', () => {
 );
 // Older delta arrives later
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(1000)
 .addPointer('collection', 'entity1', 'value')
 .addPointer('value', 'older')
@@ -134,24 +134,24 @@ describe('Concurrent Write Scenarios', () => {
 });
 test('should maintain correct aggregation despite out-of-order arrival', () => {
-const resolver = new SumResolver(lossless, ['score']);
+const resolver = new SumResolver(hyperview, ['score']);
 // Add deltas in reverse chronological order
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(3000)
 .addPointer('collection', 'entity1', 'score')
 .addPointer('score', 30)
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(1000)
 .addPointer('collection', 'entity1', 'score')
 .addPointer('score', 10)
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(2000)
 .addPointer('collection', 'entity1', 'score')
 .addPointer('score', 20)
@@ -168,7 +168,7 @@ describe('Concurrent Write Scenarios', () => {
 describe('High-Frequency Concurrent Updates', () => {
 test('should handle rapid concurrent updates to the same entity', () => {
-const resolver = new SumResolver(lossless, ['counter']);
+const resolver = new SumResolver(hyperview, ['counter']);
 const baseTimestamp = 1000;
 const numWriters = 10;
@@ -177,7 +177,7 @@ describe('Concurrent Write Scenarios', () => {
 // Simulate multiple writers making rapid updates
 for (let writer = 0; writer < numWriters; writer++) {
 for (let write = 0; write < writesPerWriter; write++) {
-lossless.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
+hyperview.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
 .withTimestamp(baseTimestamp + write)
 .addPointer('collection', 'entity1', 'counter')
 .addPointer('counter', 1)
@@ -194,7 +194,7 @@ describe('Concurrent Write Scenarios', () => {
 });
 test('should handle concurrent updates to multiple properties', () => {
-const resolver = new CustomResolver(lossless, {
+const resolver = new CustomResolver(hyperview, {
 name: new LastWriteWinsPlugin(),
 score: new LastWriteWinsPlugin()
 });
@@ -202,14 +202,14 @@ describe('Concurrent Write Scenarios', () => {
 const timestamp = 1000;
 // Writer 1 updates name and score
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(timestamp)
 .addPointer('collection', 'entity1', 'name')
 .addPointer('name', 'alice')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(timestamp + 1)
 .addPointer('collection', 'entity1', 'score')
 .addPointer('score', 100)
@@ -217,14 +217,14 @@ describe('Concurrent Write Scenarios', () => {
 );
 // Writer 2 updates name and score concurrently
-lossless.ingestDelta(createDelta('writer2', 'host2')
+hyperview.ingestDelta(createDelta('writer2', 'host2')
 .withTimestamp(timestamp + 2)
 .addPointer('collection', 'entity1', 'name')
 .addPointer('name', 'bob')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer2', 'host2')
+hyperview.ingestDelta(createDelta('writer2', 'host2')
 .withTimestamp(timestamp + 3)
 .addPointer('collection', 'entity1', 'score')
 .addPointer('score', 200)
@@ -241,13 +241,13 @@ describe('Concurrent Write Scenarios', () => {
 describe('Cross-Entity Concurrent Writes', () => {
 test('should handle concurrent writes to different entities', () => {
-const resolver = new TimestampResolver(lossless);
+const resolver = new TimestampResolver(hyperview);
 const timestamp = 1000;
 // Multiple writers updating different entities simultaneously
 for (let i = 0; i < 5; i++) {
-lossless.ingestDelta(createDelta(`writer${i}`, `host${i}`)
+hyperview.ingestDelta(createDelta(`writer${i}`, `host${i}`)
 .withTimestamp(timestamp)
 .addPointer('collection', `entity${i}`, 'value')
 .addPointer('value', (i + 1) * 10)
@@ -266,28 +266,28 @@ describe('Concurrent Write Scenarios', () => {
 });
 test('should handle mixed entity and property conflicts', () => {
-const resolver = new CustomResolver(lossless, {
+const resolver = new CustomResolver(hyperview, {
 votes: new MajorityVotePlugin(),
 status: new LastWriteWinsPlugin()
 });
 const timestamp = 1000;
 // Entity1: Multiple writers competing for same property
-lossless.ingestDelta(createDelta('writer1', 'host1')
+hyperview.ingestDelta(createDelta('writer1', 'host1')
 .withTimestamp(timestamp)
 .addPointer('collection', 'entity1', 'votes')
 .addPointer('votes', 'option_a')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer2', 'host2')
+hyperview.ingestDelta(createDelta('writer2', 'host2')
 .withTimestamp(timestamp)
 .addPointer('collection', 'entity1', 'votes')
 .addPointer('votes', 'option_a')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('writer3', 'host3')
+hyperview.ingestDelta(createDelta('writer3', 'host3')
 .withTimestamp(timestamp)
 .addPointer('collection', 'entity1', 'votes')
 .addPointer('votes', 'option_b')
@@ -295,7 +295,7 @@ describe('Concurrent Write Scenarios', () => {
 );
 // Entity2: Single writer, no conflict
-lossless.ingestDelta(createDelta('writer4', 'host4')
+hyperview.ingestDelta(createDelta('writer4', 'host4')
 .withTimestamp(timestamp)
 .addPointer('collection', 'entity2', 'status')
 .addPointer('status', 'active')
@@ -312,7 +312,7 @@ describe('Concurrent Write Scenarios', () => {
 describe('Stress Testing', () => {
 test('should handle large number of concurrent writes efficiently', () => {
-const resolver = new SumResolver(lossless, ['score']);
+const resolver = new SumResolver(hyperview, ['score']);
 const numEntities = 100;
 const numWritersPerEntity = 10;
@@ -321,7 +321,7 @@ describe('Concurrent Write Scenarios', () => {
 // Generate a large number of concurrent writes
 for (let entity = 0; entity < numEntities; entity++) {
 for (let writer = 0; writer < numWritersPerEntity; writer++) {
-lossless.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
+hyperview.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
 .withTimestamp(baseTimestamp + Math.floor(Math.random() * 1000))
 .addPointer('collection', `entity${entity}`, 'score')
 .addPointer('score', Math.floor(Math.random() * 100))
@@ -344,14 +344,14 @@ describe('Concurrent Write Scenarios', () => {
 });
 test('should maintain consistency under rapid updates and resolution calls', () => {
-const resolver = new SumResolver(lossless, ['counter']);
+const resolver = new SumResolver(hyperview, ['counter']);
 const entityId = 'stress-test-entity';
 let updateCount = 0;
 // Add initial deltas
 for (let i = 0; i < 50; i++) {
-lossless.ingestDelta(createDelta(
+hyperview.ingestDelta(createDelta(
 `writer${i % 5}`,
 `host${i % 3}`
 )
@@ -370,7 +370,7 @@ describe('Concurrent Write Scenarios', () => {
 // Add more deltas and verify consistency
 for (let i = 0; i < 25; i++) {
-lossless.ingestDelta(createDelta('late-writer', 'late-host')
+hyperview.ingestDelta(createDelta('late-writer', 'late-host')
 .withTimestamp(2000 + i)
 .addPointer('collection', entityId, 'counter')
 .addPointer('counter', 2)

View File

@@ -83,7 +83,7 @@ describe('Nested Object Resolution Performance', () => {
 const friendshipDelta = createDelta(node.config.creator, node.config.peerId)
 .setProperty(userId, 'friends', friendId, 'users')
 .buildV1();
-node.lossless.ingestDelta(friendshipDelta);
+node.hyperview.ingestDelta(friendshipDelta);
 }
 }
@@ -96,7 +96,7 @@ describe('Nested Object Resolution Performance', () => {
 const followDelta = createDelta(node.config.creator, node.config.peerId)
 .setProperty(userId, 'followers', followerId, 'users')
 .buildV1();
-node.lossless.ingestDelta(followDelta);
+node.hyperview.ingestDelta(followDelta);
 }
 }
@@ -107,7 +107,7 @@ describe('Nested Object Resolution Performance', () => {
 const mentorshipDelta = createDelta(node.config.creator, node.config.peerId)
 .setProperty(userId, 'mentor', mentorId, 'users')
 .buildV1();
-node.lossless.ingestDelta(mentorshipDelta);
+node.hyperview.ingestDelta(mentorshipDelta);
 }
 }
@@ -116,7 +116,7 @@ describe('Nested Object Resolution Performance', () => {
 // Test resolution performance for a user with many connections
 const testUserId = userIds[50]; // Pick a user in the middle
-const userViews = node.lossless.compose([testUserId]);
+const userViews = node.hyperview.compose([testUserId]);
 const userView = userViews[testUserId];
 const startResolution = performance.now();
@@ -124,7 +124,7 @@ describe('Nested Object Resolution Performance', () => {
 const nestedView = schemaRegistry.applySchemaWithNesting(
 userView,
 'network-user',
-node.lossless,
+node.hyperview,
 { maxDepth: 2 }
 );
@@ -197,7 +197,7 @@ describe('Nested Object Resolution Performance', () => {
 const linkDelta = createDelta(node.config.creator, node.config.peerId)
 .setProperty(currentId, 'next', nextId, 'users')
 .buildV1();
-node.lossless.ingestDelta(linkDelta);
+node.hyperview.ingestDelta(linkDelta);
 }
 const setupTime = performance.now() - startSetup;
@@ -205,7 +205,7 @@ describe('Nested Object Resolution Performance', () => {
 // Test resolution from the start of the chain
 const firstUserId = userIds[0];
-const userViews = node.lossless.compose([firstUserId]);
+const userViews = node.hyperview.compose([firstUserId]);
 const userView = userViews[firstUserId];
 const startResolution = performance.now();
@@ -213,7 +213,7 @@ describe('Nested Object Resolution Performance', () => {
 const nestedView = schemaRegistry.applySchemaWithNesting(
 userView,
 'chain-user',
-node.lossless,
+node.hyperview,
 { maxDepth: 5 } // Should resolve 5 levels deep
 );
@@ -292,7 +292,7 @@ describe('Nested Object Resolution Performance', () => {
 .addPointer('users', userId, 'connections')
 .addPointer('connections', connectedId)
 .buildV1();
-node.lossless.ingestDelta(connectionDelta);
+node.hyperview.ingestDelta(connectionDelta);
 }
 }
@@ -301,7 +301,7 @@ describe('Nested Object Resolution Performance', () => {
 // Test resolution performance with circular references
 const testUserId = userIds[0];
-const userViews = node.lossless.compose([testUserId]);
+const userViews = node.hyperview.compose([testUserId]);
 const userView = userViews[testUserId];
 const startResolution = performance.now();
@@ -309,7 +309,7 @@ describe('Nested Object Resolution Performance', () => {
 const nestedView = schemaRegistry.applySchemaWithNesting(
 userView,
 'circular-user',
-node.lossless,
+node.hyperview,
 { maxDepth: 3 }
 );

View File

@@ -1,13 +1,13 @@
 /**
-* Tests for lossless view compose() and decompose() bidirectional conversion
+* Tests for hyperview view compose() and decompose() bidirectional conversion
-* Ensures that deltas can be composed into lossless views and decomposed back
+* Ensures that deltas can be composed into hyperview views and decomposed back
 * to the original deltas with all pointer relationships preserved.
 */
 import { RhizomeNode } from '@src/node';
 import { createDelta } from '@src/core/delta-builder';
-describe('Lossless View Compose/Decompose', () => {
+describe('Hyperview View Compose/Decompose', () => {
 let node: RhizomeNode;
 beforeEach(() => {
@@ -29,10 +29,10 @@ describe('Lossless View Compose/Decompose', () => {
 ];
 // Ingest the deltas
-nameDeltas.forEach(delta => node.lossless.ingestDelta(delta));
+nameDeltas.forEach(delta => node.hyperview.ingestDelta(delta));
-// Compose lossless view
+// Compose hyperview view
-const composed = node.lossless.compose(['alice']);
+const composed = node.hyperview.compose(['alice']);
 const aliceView = composed['alice'];
 expect(aliceView).toBeDefined();
@@ -41,7 +41,7 @@ describe('Lossless View Compose/Decompose', () => {
 expect(aliceView.propertyDeltas.email).toHaveLength(1);
 // Decompose back to deltas
-const decomposed = node.lossless.decompose(aliceView);
+const decomposed = node.hyperview.decompose(aliceView);
 expect(decomposed).toHaveLength(2);
@@ -73,12 +73,12 @@ describe('Lossless View Compose/Decompose', () => {
 .addPointer('intensity', 8)
 .buildV1();
-node.lossless.ingestDelta(relationshipDelta);
+node.hyperview.ingestDelta(relationshipDelta);
 // Compose and decompose
-const composed = node.lossless.compose(['alice']);
+const composed = node.hyperview.compose(['alice']);
 const aliceView = composed['alice'];
-const decomposed = node.lossless.decompose(aliceView);
+const decomposed = node.hyperview.decompose(aliceView);
 expect(decomposed).toHaveLength(1);
 const reconstituted = decomposed[0];
@@ -119,16 +119,16 @@ describe('Lossless View Compose/Decompose', () => {
 .addPointer('friend', 'bob', 'friends')
 .buildV1();
-[aliceDelta, bobDelta, friendshipDelta].forEach(d => node.lossless.ingestDelta(d));
+[aliceDelta, bobDelta, friendshipDelta].forEach(d => node.hyperview.ingestDelta(d));
 // Compose Alice's view
-const composed = node.lossless.compose(['alice']);
+const composed = node.hyperview.compose(['alice']);
 const aliceView = composed['alice'];
 expect(aliceView.propertyDeltas.friends).toHaveLength(1);
 // Decompose and verify the friendship delta is correctly reconstructed
-const decomposed = node.lossless.decompose(aliceView);
+const decomposed = node.hyperview.decompose(aliceView);
 const friendshipReconstituted = decomposed.find(d =>
 d.pointers.some(p => p.localContext === 'friend')
 );
@@ -152,10 +152,10 @@ describe('Lossless View Compose/Decompose', () => {
 .addPointer('name', 'Alice')
 .buildV1();
-node.lossless.ingestDelta(originalDelta);
+node.hyperview.ingestDelta(originalDelta);
-const composed = node.lossless.compose(['alice']);
+const composed = node.hyperview.compose(['alice']);
-const decomposed = node.lossless.decompose(composed['alice']);
+const decomposed = node.hyperview.decompose(composed['alice']);
 expect(decomposed).toHaveLength(1);
 const reconstituted = decomposed[0];
@@ -184,15 +184,15 @@ describe('Lossless View Compose/Decompose', () => {
 .buildV1()
 ];
-nameDeltas.forEach(d => node.lossless.ingestDelta(d));
+nameDeltas.forEach(d => node.hyperview.ingestDelta(d));
-const composed = node.lossless.compose(['alice']);
+const composed = node.hyperview.compose(['alice']);
 const aliceView = composed['alice'];
 // Should have 3 deltas for the name property
 expect(aliceView.propertyDeltas.name).toHaveLength(3);
-const decomposed = node.lossless.decompose(aliceView);
+const decomposed = node.hyperview.decompose(aliceView);
 // Should decompose back to 3 separate deltas
 expect(decomposed).toHaveLength(3);

View File

@@ -1,6 +1,6 @@
 import { createDelta } from '@src/core/delta-builder';
 import { DeltaV1, DeltaV2 } from '@src/core/delta';
-import { Lossless } from '@src/views/lossless';
+import { Hyperview } from '@src/views/hyperview';
 import { RhizomeNode } from '@src/node';
 import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers';
@@ -45,10 +45,10 @@ describe('DeltaBuilder', () => {
 });
 // Verify that the entity property resolves correctly
-const lossless = new Lossless(node);
+const hyperview = new Hyperview(node);
-const lossy = new TimestampResolver(lossless);
+const view = new TimestampResolver(hyperview);
-lossless.ingestDelta(delta);
+hyperview.ingestDelta(delta);
-const result = lossy.resolve();
+const result = view.resolve();
 expect(result).toBeDefined();
 expect(result!['entity-1'].properties.name).toBe('Test Entity');
 });
@@ -70,10 +70,10 @@ describe('DeltaBuilder', () => {
 });
 // Verify that the entity property resolves correctly
-const lossless = new Lossless(node);
+const hyperview = new Hyperview(node);
-const lossy = new TimestampResolver(lossless);
+const view = new TimestampResolver(hyperview);
-lossless.ingestDelta(delta);
+hyperview.ingestDelta(delta);
-const result = lossy.resolve();
+const result = view.resolve();
 expect(result).toBeDefined();
 expect(result!['entity-1'].properties.name).toBe('Test Entity');
 });
@@ -170,10 +170,10 @@ describe('DeltaBuilder', () => {
 expect(delta.pointers).toHaveProperty('target', 'user-2');
 expect(delta.pointers).toHaveProperty('type', 'follows');
-const lossless = new Lossless(node);
+const hyperview = new Hyperview(node);
-const lossy = new TimestampResolver(lossless);
+const view = new TimestampResolver(hyperview);
-lossless.ingestDelta(delta);
+hyperview.ingestDelta(delta);
-const result = lossy.resolve([relId]);
+const result = view.resolve([relId]);
 expect(result).toBeDefined();
 expect(result![relId]).toMatchObject({
 properties: {
@@ -200,10 +200,10 @@ describe('DeltaBuilder', () => {
 expect(delta.pointers).toHaveProperty('type', 'follows');
 expect(delta.pointers).toHaveProperty('version', 1);
-const lossless = new Lossless(node);
+const hyperview = new Hyperview(node);
-const lossy = new TimestampResolver(lossless);
+const view = new TimestampResolver(hyperview);
-lossless.ingestDelta(delta);
+hyperview.ingestDelta(delta);
-const result = lossy.resolve([relId]);
+const result = view.resolve([relId]);
 expect(result).toBeDefined();
 expect(result![relId]).toMatchObject({
 properties: {

View File

@@ -2,17 +2,17 @@ import Debug from 'debug';
 import { createDelta } from '@src/core/delta-builder';
 import { NegationHelper } from '@src/features';
 import { RhizomeNode } from '@src/node';
-import { Lossless } from '@src/views';
+import { Hyperview } from '@src/views';
 const debug = Debug('rz:negation:test');
 describe('Negation System', () => {
 let node: RhizomeNode;
-let lossless: Lossless;
+let hyperview: Hyperview;
 beforeEach(() => {
 node = new RhizomeNode();
-lossless = new Lossless(node);
+hyperview = new Hyperview(node);
 });
 describe('Negation Helper', () => {
@@ -179,8 +179,8 @@ describe('Negation System', () => {
 });
 });
-describe('Lossless View Integration', () => {
+describe('Hyperview View Integration', () => {
-test('should filter negated deltas in lossless views', () => {
+test('should filter negated deltas in hyperview views', () => {
 // Create original delta
 const originalDelta = createDelta('user1', 'host1')
 .setProperty('user123', 'name', 'Alice')
@@ -198,12 +198,12 @@ describe('Negation System', () => {
 .buildV1();
 // Ingest all deltas
-lossless.ingestDelta(originalDelta);
+hyperview.ingestDelta(originalDelta);
-lossless.ingestDelta(negationDelta);
+hyperview.ingestDelta(negationDelta);
-lossless.ingestDelta(nonNegatedDelta);
+hyperview.ingestDelta(nonNegatedDelta);
 // Get view - should only show non-negated delta
-const view = lossless.compose(['user123']);
+const view = hyperview.compose(['user123']);
 expect(view.user123).toBeDefined();
@@ -220,11 +220,11 @@ describe('Negation System', () => {
 const negation1 = createDelta('mod1', 'host1').negate(originalDelta.id).buildV1();
 const negation2 = createDelta('mod2', 'host1').negate(originalDelta.id).buildV1();
-lossless.ingestDelta(originalDelta);
+hyperview.ingestDelta(originalDelta);
-lossless.ingestDelta(negation1);
+hyperview.ingestDelta(negation1);
-lossless.ingestDelta(negation2);
+hyperview.ingestDelta(negation2);
-const view = lossless.compose(['post1']);
+const view = hyperview.compose(['post1']);
 // Original delta should be negated (not visible)
 expect(view.post1).toBeUndefined();
@@ -241,11 +241,11 @@ describe('Negation System', () => {
 const negation1 = createDelta('mod1', 'host1').negate(delta1.id).buildV1();
-lossless.ingestDelta(delta1);
+hyperview.ingestDelta(delta1);
-lossless.ingestDelta(delta2);
+hyperview.ingestDelta(delta2);
-lossless.ingestDelta(negation1);
+hyperview.ingestDelta(negation1);
-const stats = lossless.getNegationStats('article1');
+const stats = hyperview.getNegationStats('article1');
 expect(stats.totalDeltas).toBe(3);
 expect(stats.negationDeltas).toBe(1);
@@ -262,10 +262,10 @@ describe('Negation System', () => {
 const negationDelta = createDelta('admin', 'host1').negate(originalDelta.id).buildV1();
-lossless.ingestDelta(originalDelta);
+hyperview.ingestDelta(originalDelta);
-lossless.ingestDelta(negationDelta);
+hyperview.ingestDelta(negationDelta);
-const negations = lossless.getNegationDeltas('task1');
+const negations = hyperview.getNegationDeltas('task1');
 expect(negations).toHaveLength(1);
 expect(negations[0].id).toBe(negationDelta.id);
 expect(negations[0].creator).toBe('admin');
@@ -275,7 +275,7 @@ describe('Negation System', () => {
 const transactionId = 'tx-negation';
 // Create transaction declaration
-lossless.ingestDelta(createDelta('system', 'host1')
+hyperview.ingestDelta(createDelta('system', 'host1')
 .declareTransaction(transactionId, 2)
 .buildV1()
 );
@@ -294,11 +294,11 @@ describe('Negation System', () => {
 targetContext: 'deltas'
 });
-lossless.ingestDelta(originalDelta);
+hyperview.ingestDelta(originalDelta);
-lossless.ingestDelta(negationDelta);
+hyperview.ingestDelta(negationDelta);
 // Transaction should complete, but original delta should be negated
-const view = lossless.compose(['post1']);
+const view = hyperview.compose(['post1']);
 expect(view.post1).toBeUndefined(); // No visible deltas
 });
@@ -321,11 +321,11 @@ describe('Negation System', () => {
 .setProperty('post1', 'content', 'Edited post')
 .buildV1();
-lossless.ingestDelta(postDelta);
+hyperview.ingestDelta(postDelta);
-lossless.ingestDelta(negationDelta);
+hyperview.ingestDelta(negationDelta);
-lossless.ingestDelta(editDelta);
+hyperview.ingestDelta(editDelta);
-const view = lossless.compose(['post1']);
+const view = hyperview.compose(['post1']);
 // Should show edited content (edit happened after negation)
 expect(view.post1).toBeDefined();
@@ -341,10 +341,10 @@ describe('Negation System', () => {
 test('should handle negation of non-existent deltas', () => {
 const negationDelta = createDelta('moderator', 'host1').negate('non-existent-delta-id').buildV1();
-lossless.ingestDelta(negationDelta);
+hyperview.ingestDelta(negationDelta);
 // Should not crash and stats should reflect the orphaned negation
-const stats = lossless.getNegationStats('entity1');
+const stats = hyperview.getNegationStats('entity1');
 expect(stats.negationDeltas).toBe(0); // No negations for this entity
 });
@@ -357,16 +357,16 @@ describe('Negation System', () => {
 const negationDelta = createDelta('admin', 'host1').negate(selfRefDelta.id).buildV1();
-lossless.ingestDelta(selfRefDelta);
+hyperview.ingestDelta(selfRefDelta);
-lossless.ingestDelta(negationDelta);
+hyperview.ingestDelta(negationDelta);
-const view = lossless.compose(['node1']);
+const view = hyperview.compose(['node1']);
 expect(view.node1).toBeUndefined(); // Should be negated
 });
 test('should handle multiple direct negations of the same delta', () => {
 const testNode = new RhizomeNode();
-const testLossless = new Lossless(testNode);
+const testHyperview = new Hyperview(testNode);
 // Create the original delta
 const originalDelta = createDelta('user1', 'host1')
@@ -378,18 +378,18 @@ describe('Negation System', () => {
 const negation2 = createDelta('user3', 'host1').negate(originalDelta.id).buildV1();
 // Process all deltas
-testLossless.ingestDelta(originalDelta);
+testHyperview.ingestDelta(originalDelta);
-testLossless.ingestDelta(negation1);
+testHyperview.ingestDelta(negation1);
-testLossless.ingestDelta(negation2);
+testHyperview.ingestDelta(negation2);
 // Get the view after processing all deltas
-const view = testLossless.compose(['entity2']);
+const view = testHyperview.compose(['entity2']);
 // The original delta should be negated (not in view) because it has two direct negations
 expect(view.entity2).toBeUndefined();
 // Verify the stats
-const stats = testLossless.getNegationStats('entity2');
+const stats = testHyperview.getNegationStats('entity2');
 expect(stats.negationDeltas).toBe(2);
 expect(stats.negatedDeltas).toBe(1);
 expect(stats.effectiveDeltas).toBe(0);
@@ -397,7 +397,7 @@ describe('Negation System', () => {
 test('should handle complex negation chains', () => {
 const testNode = new RhizomeNode();
-const testLossless = new Lossless(testNode);
+const testHyperview = new Hyperview(testNode);
 // Create the original delta
 const deltaA = createDelta('user1', 'host1')
@@ -415,13 +415,13 @@ describe('Negation System', () => {
 debug('Delta D (negates C): %s', deltaD.id);
 // Process all deltas in order
-testLossless.ingestDelta(deltaA);
+testHyperview.ingestDelta(deltaA);
-testLossless.ingestDelta(deltaB);
+testHyperview.ingestDelta(deltaB);
-testLossless.ingestDelta(deltaC);
+testHyperview.ingestDelta(deltaC);
-testLossless.ingestDelta(deltaD);
+testHyperview.ingestDelta(deltaD);
 // Get the view after processing all deltas
-const view = testLossless.compose(['entity3']);
+const view = testHyperview.compose(['entity3']);
 // The original delta should be negated because:
 // - B negates A
@@ -433,7 +433,7 @@ describe('Negation System', () => {
 const allDeltas = [deltaA, deltaB, deltaC, deltaD];
 // Get the stats
-const stats = testLossless.getNegationStats('entity3');
+const stats = testHyperview.getNegationStats('entity3');
 const isANegated = NegationHelper.isDeltaNegated(deltaA.id, allDeltas);
 const isBNegated = NegationHelper.isDeltaNegated(deltaB.id, allDeltas);
 const isCNegated = NegationHelper.isDeltaNegated(deltaC.id, allDeltas);
@@ -470,7 +470,7 @@ describe('Negation System', () => {
 test('should handle multiple independent negations', () => {
 const testNode = new RhizomeNode();
-const testLossless = new Lossless(testNode);
+const testHyperview = new Hyperview(testNode);
 // Create two independent deltas
 const delta1 = createDelta('user1', 'host1')
@@ -486,19 +486,19 @@ describe('Negation System', () => {
 const negation2 = createDelta('user4', 'host1').negate(delta2.id).buildV1();
 // Process all deltas
-testLossless.ingestDelta(delta1);
+testHyperview.ingestDelta(delta1);
-testLossless.ingestDelta(delta2);
+testHyperview.ingestDelta(delta2);
-testLossless.ingestDelta(negation1);
+testHyperview.ingestDelta(negation1);
-testLossless.ingestDelta(negation2);
+testHyperview.ingestDelta(negation2);
 // Get the view after processing all deltas
-const view = testLossless.compose(['entity4']);
+const view = testHyperview.compose(['entity4']);
 // Both deltas should be negated
 expect(view.entity4).toBeUndefined();
 // Verify the stats
-const stats = testLossless.getNegationStats('entity4');
+const stats = testHyperview.getNegationStats('entity4');
 expect(stats.negationDeltas).toBe(2);
 expect(stats.negatedDeltas).toBe(2);
 expect(stats.effectiveDeltas).toBe(0);

View File

@@ -1,15 +1,15 @@
 import { createDelta } from '@src/core/delta-builder';
-import { Lossless } from '@src/views';
+import { Hyperview } from '@src/views';
 import { RhizomeNode } from '@src/node';
 import { DeltaFilter } from '@src/core';
 describe('Transactions', () => {
 let node: RhizomeNode;
-let lossless: Lossless;
+let hyperview: Hyperview;
 beforeEach(() => {
 node = new RhizomeNode();
-lossless = new Lossless(node);
+hyperview = new Hyperview(node);
 });
 describe('Transaction-based filtering', () => {
@@ -34,12 +34,12 @@ describe('Transactions', () => {
 .buildV1();
 // Ingest transaction declaration and first two deltas
-lossless.ingestDelta(txDeclaration);
+hyperview.ingestDelta(txDeclaration);
-lossless.ingestDelta(delta1);
+hyperview.ingestDelta(delta1);
-lossless.ingestDelta(delta2);
+hyperview.ingestDelta(delta2);
 // View should be empty because transaction is incomplete (2/3 deltas)
-const view = lossless.compose(['user123']);
+const view = hyperview.compose(['user123']);
 expect(view.user123).toBeUndefined();
 // Add the third delta to complete the transaction
@@ -48,10 +48,10 @@ describe('Transactions', () => {
 .setProperty('user123', 'email', 'alice@example.com')
 .buildV1();
-lossless.ingestDelta(delta3);
+hyperview.ingestDelta(delta3);
 // Now the view should include all deltas from the completed transaction
-const completeView = lossless.compose(['user123']);
+const completeView = hyperview.compose(['user123']);
 expect(completeView.user123).toBeDefined();
 expect(completeView.user123.propertyDeltas.name).toHaveLength(1);
 expect(completeView.user123.propertyDeltas.age).toHaveLength(1);
@@ -63,57 +63,57 @@ describe('Transactions', () => {
 const tx2 = 'tx-002';
 // Declare two transactions
-lossless.ingestDelta(createDelta('system', 'host1')
+hyperview.ingestDelta(createDelta('system', 'host1')
 .declareTransaction(tx1, 2)
 .buildV1()
 );
-lossless.ingestDelta(createDelta('system', 'host1')
+hyperview.ingestDelta(createDelta('system', 'host1')
 .declareTransaction(tx2, 2)
 .buildV1()
 );
 // Add deltas for both transactions
-lossless.ingestDelta(createDelta('user1', 'host1')
+hyperview.ingestDelta(createDelta('user1', 'host1')
 .inTransaction(tx1)
 .setProperty('order1', 'status', 'pending')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('user2', 'host2')
+hyperview.ingestDelta(createDelta('user2', 'host2')
 .inTransaction(tx2)
 .setProperty('order2', 'status', 'shipped')
 .buildV1()
 );
 // Neither transaction is complete
-let view = lossless.compose(['order1', 'order2']);
+let view = hyperview.compose(['order1', 'order2']);
 expect(view.order1).toBeUndefined();
 expect(view.order2).toBeUndefined();
 // Complete tx1
-lossless.ingestDelta(createDelta('user1', 'host1')
+hyperview.ingestDelta(createDelta('user1', 'host1')
 .inTransaction(tx1)
 .setProperty('order1', 'total', 100)
 .buildV1()
 );
 // tx1 is complete, tx2 is not
-view = lossless.compose(['order1', 'order2']);
+view = hyperview.compose(['order1', 'order2']);
 expect(view.order1).toBeDefined();
 expect(view.order1.propertyDeltas.status).toHaveLength(1);
 expect(view.order1.propertyDeltas.total).toHaveLength(1);
 expect(view.order2).toBeUndefined();
 // Complete tx2
-lossless.ingestDelta(createDelta('user2', 'host2')
+hyperview.ingestDelta(createDelta('user2', 'host2')
 .inTransaction(tx2)
 .setProperty('order2', 'tracking', 'TRACK123')
 .buildV1()
 );
 // Both transactions complete
-view = lossless.compose(['order1', 'order2']);
+view = hyperview.compose(['order1', 'order2']);
 expect(view.order1).toBeDefined();
 expect(view.order2).toBeDefined();
 expect(view.order2.propertyDeltas.status).toHaveLength(1);
@@ -124,19 +124,19 @@ describe('Transactions', () => {
 const transactionId = 'tx-filter-test';
 // Create transaction with 2 deltas
-lossless.ingestDelta(createDelta('system', 'host1')
+hyperview.ingestDelta(createDelta('system', 'host1')
 .declareTransaction(transactionId, 2)
 .buildV1()
 );
 // Add both deltas
-lossless.ingestDelta(createDelta('user1', 'host1')
+hyperview.ingestDelta(createDelta('user1', 'host1')
 .inTransaction(transactionId)
 .setProperty('doc1', 'type', 'report')
 .buildV1()
 );
-lossless.ingestDelta(createDelta('user2', 'host2')
+hyperview.ingestDelta(createDelta('user2', 'host2')
 .inTransaction(transactionId)
 .setProperty('doc1', 'author', 'Bob')
 .buildV1()
@@ -147,7 +147,7 @@ describe('Transactions', () => {
 // With incomplete transaction, nothing should show
// But once complete, the filter should still apply // But once complete, the filter should still apply
const view = lossless.compose(['doc1'], userFilter); const view = hyperview.compose(['doc1'], userFilter);
// Even though transaction is complete, only delta from user1 should appear // Even though transaction is complete, only delta from user1 should appear
expect(view.doc1).toBeDefined(); expect(view.doc1).toBeDefined();
@ -159,13 +159,13 @@ describe('Transactions', () => {
const transactionId = 'tx-multi-entity'; const transactionId = 'tx-multi-entity';
// Transaction that updates multiple entities atomically // Transaction that updates multiple entities atomically
lossless.ingestDelta(createDelta('system', 'host1') hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 3) .declareTransaction(transactionId, 3)
.buildV1() .buildV1()
); );
// Transfer money from account1 to account2 // Transfer money from account1 to account2
lossless.ingestDelta(createDelta('bank', 'host1') hyperview.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId) .inTransaction(transactionId)
.addPointer('balance', 'account1', 'balance') .addPointer('balance', 'account1', 'balance')
.addPointer('value', 900) .addPointer('value', 900)
@ -173,7 +173,7 @@ describe('Transactions', () => {
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('bank', 'host1') hyperview.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId) .inTransaction(transactionId)
.addPointer('balance', 'account2', 'balance') .addPointer('balance', 'account2', 'balance')
.addPointer('value', 1100) .addPointer('value', 1100)
@ -182,12 +182,12 @@ describe('Transactions', () => {
); );
// Transaction incomplete - no entities should show updates // Transaction incomplete - no entities should show updates
let view = lossless.compose(['account1', 'account2']); let view = hyperview.compose(['account1', 'account2']);
expect(view.account1).toBeUndefined(); expect(view.account1).toBeUndefined();
expect(view.account2).toBeUndefined(); expect(view.account2).toBeUndefined();
// Complete transaction with audit log // Complete transaction with audit log
lossless.ingestDelta(createDelta('bank', 'host1') hyperview.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId) .inTransaction(transactionId)
.addPointer('transfer', 'transfer123', 'details') .addPointer('transfer', 'transfer123', 'details')
.addPointer('from', 'account1') .addPointer('from', 'account1')
@ -197,7 +197,7 @@ describe('Transactions', () => {
); );
// All entities should now be visible // All entities should now be visible
view = lossless.compose(['account1', 'account2', 'transfer123']); view = hyperview.compose(['account1', 'account2', 'transfer123']);
expect(view.account1).toBeDefined(); expect(view.account1).toBeDefined();
expect(view.account1.propertyDeltas.balance).toHaveLength(1); expect(view.account1.propertyDeltas.balance).toHaveLength(1);
expect(view.account2).toBeDefined(); expect(view.account2).toBeDefined();
@ -211,12 +211,12 @@ describe('Transactions', () => {
const updateEvents: Array<{ entityId: string, deltaIds: string[] }> = []; const updateEvents: Array<{ entityId: string, deltaIds: string[] }> = [];
// Listen for update events // Listen for update events
lossless.eventStream.on('updated', (entityId: string, deltaIds: string[]) => { hyperview.eventStream.on('updated', (entityId: string, deltaIds: string[]) => {
updateEvents.push({ entityId, deltaIds }); updateEvents.push({ entityId, deltaIds });
}); });
// Create transaction // Create transaction
lossless.ingestDelta(createDelta('system', 'host1') hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2) .declareTransaction(transactionId, 2)
.buildV1() .buildV1()
); );
@ -226,7 +226,7 @@ describe('Transactions', () => {
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('entity1', 'field1', 'value1') .setProperty('entity1', 'field1', 'value1')
.buildV1(); .buildV1();
lossless.ingestDelta(delta1); hyperview.ingestDelta(delta1);
// No events should be emitted yet // No events should be emitted yet
expect(updateEvents).toHaveLength(0); expect(updateEvents).toHaveLength(0);
@ -236,7 +236,7 @@ describe('Transactions', () => {
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('entity1', 'field2', 'value2') .setProperty('entity1', 'field2', 'value2')
.buildV1(); .buildV1();
lossless.ingestDelta(delta2); hyperview.ingestDelta(delta2);
// Wait for async event processing // Wait for async event processing
await new Promise(resolve => setTimeout(resolve, 10)); await new Promise(resolve => setTimeout(resolve, 10));
@ -256,20 +256,20 @@ describe('Transactions', () => {
const transactionId = 'tx-wait'; const transactionId = 'tx-wait';
// Create transaction // Create transaction
lossless.ingestDelta(createDelta('system', 'host1') hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2) .declareTransaction(transactionId, 2)
.buildV1() .buildV1()
); );
// Add first delta // Add first delta
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('job1', 'status', 'processing') .setProperty('job1', 'status', 'processing')
.buildV1() .buildV1()
); );
// Start waiting for transaction // Start waiting for transaction
const waitPromise = lossless.transactions.waitFor(transactionId); const waitPromise = hyperview.transactions.waitFor(transactionId);
let isResolved = false; let isResolved = false;
waitPromise.then(() => { isResolved = true; }); waitPromise.then(() => { isResolved = true; });
@ -278,7 +278,7 @@ describe('Transactions', () => {
expect(isResolved).toBe(false); expect(isResolved).toBe(false);
// Complete transaction // Complete transaction
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('job1', 'status', 'completed') .setProperty('job1', 'status', 'completed')
.buildV1() .buildV1()
@ -289,7 +289,7 @@ describe('Transactions', () => {
expect(isResolved).toBe(true); expect(isResolved).toBe(true);
// View should show completed transaction // View should show completed transaction
const view = lossless.compose(['job1']); const view = hyperview.compose(['job1']);
expect(view.job1).toBeDefined(); expect(view.job1).toBeDefined();
expect(view.job1.propertyDeltas.status).toHaveLength(2); expect(view.job1.propertyDeltas.status).toHaveLength(2);
}); });
@ -302,14 +302,14 @@ describe('Transactions', () => {
.buildV1(); .buildV1();
const updateEvents: string[] = []; const updateEvents: string[] = [];
lossless.eventStream.on('updated', (entityId: string) => { hyperview.eventStream.on('updated', (entityId: string) => {
updateEvents.push(entityId); updateEvents.push(entityId);
}); });
lossless.ingestDelta(regularDelta); hyperview.ingestDelta(regularDelta);
// Should immediately appear in view // Should immediately appear in view
const view = lossless.compose(['user456']); const view = hyperview.compose(['user456']);
expect(view.user456).toBeDefined(); expect(view.user456).toBeDefined();
expect(view.user456.propertyDeltas.name).toHaveLength(1); expect(view.user456.propertyDeltas.name).toHaveLength(1);
@ -323,29 +323,29 @@ describe('Transactions', () => {
const transactionId = 'tx-resize'; const transactionId = 'tx-resize';
// Initially declare transaction with size 2 // Initially declare transaction with size 2
lossless.ingestDelta(createDelta('system', 'host1') hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2) .declareTransaction(transactionId, 2)
.buildV1() .buildV1()
); );
// Add 2 deltas // Add 2 deltas
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('cart1', 'items', 'item1') .setProperty('cart1', 'items', 'item1')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('cart1', 'items', 'item2') .setProperty('cart1', 'items', 'item2')
.buildV1() .buildV1()
); );
// Transaction should be complete // Transaction should be complete
expect(lossless.transactions.isComplete(transactionId)).toBe(true); expect(hyperview.transactions.isComplete(transactionId)).toBe(true);
// View should show the cart // View should show the cart
const view = lossless.compose(['cart1']); const view = hyperview.compose(['cart1']);
expect(view.cart1).toBeDefined(); expect(view.cart1).toBeDefined();
}); });
@ -353,30 +353,30 @@ describe('Transactions', () => {
const transactionId = 'tx-no-size'; const transactionId = 'tx-no-size';
// Add delta with transaction reference but no size declaration // Add delta with transaction reference but no size declaration
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('entity1', 'data', 'test') .setProperty('entity1', 'data', 'test')
.buildV1() .buildV1()
); );
// Transaction should not be complete (no size) // Transaction should not be complete (no size)
expect(lossless.transactions.isComplete(transactionId)).toBe(false); expect(hyperview.transactions.isComplete(transactionId)).toBe(false);
// Delta should not appear in view // Delta should not appear in view
const view = lossless.compose(['entity1']); const view = hyperview.compose(['entity1']);
expect(view.entity1).toBeUndefined(); expect(view.entity1).toBeUndefined();
// Declare size after the fact // Declare size after the fact
lossless.ingestDelta(createDelta('system', 'host1') hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 1) .declareTransaction(transactionId, 1)
.buildV1() .buildV1()
); );
// Now transaction should be complete // Now transaction should be complete
expect(lossless.transactions.isComplete(transactionId)).toBe(true); expect(hyperview.transactions.isComplete(transactionId)).toBe(true);
// And delta should appear in view // And delta should appear in view
const viewAfter = lossless.compose(['entity1']); const viewAfter = hyperview.compose(['entity1']);
expect(viewAfter.entity1).toBeDefined(); expect(viewAfter.entity1).toBeDefined();
}); });
}); });
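For readers skimming this file's diff, the transaction flow these tests exercise condenses to roughly the sketch below. Every call used here (createDelta, declareTransaction, inTransaction, ingestDelta, transactions.isComplete, compose) appears verbatim in the tests above; the ids 'tx-demo' and 'job1' and the status values are illustrative only.

import { createDelta } from '@src/core/delta-builder';
import { Hyperview } from '@src/views';
import { RhizomeNode } from '@src/node';

const hyperview = new Hyperview(new RhizomeNode());

// Declare a transaction that will contain exactly two deltas.
hyperview.ingestDelta(createDelta('system', 'host1')
  .declareTransaction('tx-demo', 2)
  .buildV1()
);

// First delta: the composed view stays empty while the transaction is incomplete.
hyperview.ingestDelta(createDelta('user1', 'host1')
  .inTransaction('tx-demo')
  .setProperty('job1', 'status', 'started')
  .buildV1()
);

// Second delta completes the transaction, making both deltas visible at once.
hyperview.ingestDelta(createDelta('user1', 'host1')
  .inTransaction('tx-demo')
  .setProperty('job1', 'status', 'done')
  .buildV1()
);

hyperview.transactions.isComplete('tx-demo'); // true
const view = hyperview.compose(['job1']);     // view.job1.propertyDeltas.status now holds both deltas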

View File

@ -1,5 +1,5 @@
import { QueryEngine } from '@src/query'; import { QueryEngine } from '@src/query';
import { Lossless } from '@src/views'; import { Hyperview } from '@src/views';
import { DefaultSchemaRegistry } from '@src/schema'; import { DefaultSchemaRegistry } from '@src/schema';
import { SchemaBuilder, PrimitiveSchemas } from '@src/schema'; import { SchemaBuilder, PrimitiveSchemas } from '@src/schema';
import { CommonSchemas } from '../../../util/schemas'; import { CommonSchemas } from '../../../util/schemas';
@ -8,7 +8,7 @@ import { RhizomeNode } from '@src/node';
describe('Query Engine', () => { describe('Query Engine', () => {
let queryEngine: QueryEngine; let queryEngine: QueryEngine;
let lossless: Lossless; let hyperview: Hyperview;
let schemaRegistry: DefaultSchemaRegistry; let schemaRegistry: DefaultSchemaRegistry;
let rhizomeNode: RhizomeNode; let rhizomeNode: RhizomeNode;
@ -19,9 +19,9 @@ describe('Query Engine', () => {
requestBindPort: 4003 requestBindPort: 4003
}); });
lossless = rhizomeNode.lossless; hyperview = rhizomeNode.hyperview;
schemaRegistry = new DefaultSchemaRegistry(); schemaRegistry = new DefaultSchemaRegistry();
queryEngine = new QueryEngine(lossless, schemaRegistry); queryEngine = new QueryEngine(hyperview, schemaRegistry);
// Register test schemas // Register test schemas
schemaRegistry.register(CommonSchemas.User()); schemaRegistry.register(CommonSchemas.User());
@ -53,7 +53,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now()) .withTimestamp(Date.now())
.setProperty(id, 'name', name, 'user') .setProperty(id, 'name', name, 'user')
.buildV1(); .buildV1();
lossless.ingestDelta(nameDelta); hyperview.ingestDelta(nameDelta);
// Add age if provided // Add age if provided
if (age !== undefined) { if (age !== undefined) {
@ -62,7 +62,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now()) .withTimestamp(Date.now())
.setProperty(id, 'age', age, 'user') .setProperty(id, 'age', age, 'user')
.buildV1(); .buildV1();
lossless.ingestDelta(ageDelta); hyperview.ingestDelta(ageDelta);
} }
// Add email if provided // Add email if provided
@ -72,7 +72,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now()) .withTimestamp(Date.now())
.setProperty(id, 'email', email, 'user') .setProperty(id, 'email', email, 'user')
.buildV1(); .buildV1();
lossless.ingestDelta(emailDelta); hyperview.ingestDelta(emailDelta);
} }
} }
@ -83,7 +83,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now()) .withTimestamp(Date.now())
.setProperty(id, 'title', title, 'post') .setProperty(id, 'title', title, 'post')
.buildV1(); .buildV1();
lossless.ingestDelta(titleDelta); hyperview.ingestDelta(titleDelta);
// Author delta // Author delta
const authorDelta = createDelta('test', 'test-host') const authorDelta = createDelta('test', 'test-host')
@ -91,7 +91,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now()) .withTimestamp(Date.now())
.setProperty(id, 'author', author, 'post') .setProperty(id, 'author', author, 'post')
.buildV1(); .buildV1();
lossless.ingestDelta(authorDelta); hyperview.ingestDelta(authorDelta);
// Published delta // Published delta
const publishedDelta = createDelta('test', 'test-host') const publishedDelta = createDelta('test', 'test-host')
@ -99,7 +99,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now()) .withTimestamp(Date.now())
.setProperty(id, 'published', published, 'post') .setProperty(id, 'published', published, 'post')
.buildV1(); .buildV1();
lossless.ingestDelta(publishedDelta); hyperview.ingestDelta(publishedDelta);
// Views delta // Views delta
const viewsDelta = createDelta('test', 'test-host') const viewsDelta = createDelta('test', 'test-host')
@ -107,7 +107,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now()) .withTimestamp(Date.now())
.setProperty(id, 'views', views, 'post') .setProperty(id, 'views', views, 'post')
.buildV1(); .buildV1();
lossless.ingestDelta(viewsDelta); hyperview.ingestDelta(viewsDelta);
} }
describe('Basic Query Operations', () => { describe('Basic Query Operations', () => {

View File

@ -1,12 +1,12 @@
import {DeltaFilter} from '@src/core'; import {DeltaFilter} from '@src/core';
import {Lossless} from '@src/views'; import {Hyperview} from '@src/views';
import {RhizomeNode} from '@src/node'; import {RhizomeNode} from '@src/node';
import {createDelta} from '@src/core/delta-builder'; import {createDelta} from '@src/core/delta-builder';
describe('Lossless', () => { describe('Hyperview', () => {
const node = new RhizomeNode(); const node = new RhizomeNode();
test('creates a lossless view of keanu as neo in the matrix', () => { test('creates a hyperview view of keanu as neo in the matrix', () => {
const delta = createDelta('a', 'h') const delta = createDelta('a', 'h')
.addPointer('actor', 'keanu', 'roles') .addPointer('actor', 'keanu', 'roles')
.addPointer('role', 'neo', 'actor') .addPointer('role', 'neo', 'actor')
@ -35,11 +35,11 @@ describe('Lossless', () => {
target: "usd" target: "usd"
}]); }]);
const lossless = new Lossless(node); const hyperview = new Hyperview(node);
lossless.ingestDelta(delta); hyperview.ingestDelta(delta);
expect(lossless.compose()).toMatchObject({ expect(hyperview.compose()).toMatchObject({
keanu: { keanu: {
referencedAs: ["actor"], referencedAs: ["actor"],
propertyDeltas: { propertyDeltas: {
@ -100,11 +100,11 @@ describe('Lossless', () => {
.addPointer('salary_currency', 'usd') .addPointer('salary_currency', 'usd')
.buildV2(); .buildV2();
const lossless = new Lossless(node); const hyperview = new Hyperview(node);
lossless.ingestDelta(delta); hyperview.ingestDelta(delta);
expect(lossless.compose()).toMatchObject({ expect(hyperview.compose()).toMatchObject({
keanu: { keanu: {
referencedAs: ["actor"], referencedAs: ["actor"],
propertyDeltas: { propertyDeltas: {
@ -157,18 +157,18 @@ describe('Lossless', () => {
}); });
describe('can filter deltas', () => { describe('can filter deltas', () => {
const lossless = new Lossless(node); const hyperview = new Hyperview(node);
beforeAll(() => { beforeAll(() => {
// First delta // First delta
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('A', 'H') createDelta('A', 'H')
.setProperty('ace', 'value', '1', 'ace') .setProperty('ace', 'value', '1', 'ace')
.buildV1() .buildV1()
); );
// Second delta // Second delta
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('B', 'H') createDelta('B', 'H')
// 10 11j 12q 13k 14a // 10 11j 12q 13k 14a
// .addPointer('14', 'ace', 'value') // .addPointer('14', 'ace', 'value')
@ -176,7 +176,7 @@ describe('Lossless', () => {
.buildV1() .buildV1()
); );
expect(lossless.compose()).toMatchObject({ expect(hyperview.compose()).toMatchObject({
ace: { ace: {
referencedAs: ["ace"], referencedAs: ["ace"],
propertyDeltas: { propertyDeltas: {
@ -205,7 +205,7 @@ describe('Lossless', () => {
return creator === 'A' && host === 'H'; return creator === 'A' && host === 'H';
}; };
expect(lossless.compose(undefined, filter)).toMatchObject({ expect(hyperview.compose(undefined, filter)).toMatchObject({
ace: { ace: {
referencedAs: ["ace"], referencedAs: ["ace"],
propertyDeltas: { propertyDeltas: {
@ -221,7 +221,7 @@ describe('Lossless', () => {
} }
}); });
expect(lossless.compose(["ace"], filter)).toMatchObject({ expect(hyperview.compose(["ace"], filter)).toMatchObject({
ace: { ace: {
referencedAs: ["ace"], referencedAs: ["ace"],
propertyDeltas: { propertyDeltas: {
@ -239,18 +239,18 @@ describe('Lossless', () => {
}); });
test('filter with transactions', () => { test('filter with transactions', () => {
const losslessT = new Lossless(node); const hyperviewT = new Hyperview(node);
const transactionId = 'tx-filter-test'; const transactionId = 'tx-filter-test';
// Declare transaction with 3 deltas // Declare transaction with 3 deltas
losslessT.ingestDelta( hyperviewT.ingestDelta(
createDelta('system', 'H') createDelta('system', 'H')
.declareTransaction(transactionId, 3) .declareTransaction(transactionId, 3)
.buildV1() .buildV1()
); );
// A1: First delta from creator A // A1: First delta from creator A
losslessT.ingestDelta( hyperviewT.ingestDelta(
createDelta('A', 'H') createDelta('A', 'H')
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('process1', 'status', 'started', 'step') .setProperty('process1', 'status', 'started', 'step')
@ -258,7 +258,7 @@ describe('Lossless', () => {
); );
// B: Delta from creator B // B: Delta from creator B
losslessT.ingestDelta( hyperviewT.ingestDelta(
createDelta('B', 'H') createDelta('B', 'H')
.inTransaction(transactionId) .inTransaction(transactionId)
.setProperty('process1', 'status', 'processing', 'step') .setProperty('process1', 'status', 'processing', 'step')
@ -266,11 +266,11 @@ describe('Lossless', () => {
); );
// Transaction incomplete - nothing should show // Transaction incomplete - nothing should show
const incompleteView = losslessT.compose(['process1']); const incompleteView = hyperviewT.compose(['process1']);
expect(incompleteView.process1).toBeUndefined(); expect(incompleteView.process1).toBeUndefined();
// A2: Second delta from creator A completes transaction // A2: Second delta from creator A completes transaction
losslessT.ingestDelta( hyperviewT.ingestDelta(
createDelta('A', 'H') createDelta('A', 'H')
.inTransaction(transactionId) .inTransaction(transactionId)
.addPointer('step', 'process1', 'status') .addPointer('step', 'process1', 'status')
@ -279,13 +279,13 @@ describe('Lossless', () => {
); );
// All deltas visible now // All deltas visible now
const completeView = losslessT.compose(['process1']); const completeView = hyperviewT.compose(['process1']);
expect(completeView.process1).toBeDefined(); expect(completeView.process1).toBeDefined();
expect(completeView.process1.propertyDeltas.status).toHaveLength(3); expect(completeView.process1.propertyDeltas.status).toHaveLength(3);
// Filter by creator A only // Filter by creator A only
const filterA: DeltaFilter = ({creator}) => creator === 'A'; const filterA: DeltaFilter = ({creator}) => creator === 'A';
const filteredView = losslessT.compose(['process1'], filterA); const filteredView = hyperviewT.compose(['process1'], filterA);
expect(filteredView.process1).toBeDefined(); expect(filteredView.process1).toBeDefined();
expect(filteredView.process1.propertyDeltas.status).toHaveLength(2); expect(filteredView.process1.propertyDeltas.status).toHaveLength(2);
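Condensed for reference, the delta-filtering pattern this file relies on is a compose call with a DeltaFilter predicate. The sketch below reuses the imports shown at the top of this file; the creator names and property values are illustrative.

import { DeltaFilter } from '@src/core';
import { Hyperview } from '@src/views';
import { RhizomeNode } from '@src/node';
import { createDelta } from '@src/core/delta-builder';

const hyperview = new Hyperview(new RhizomeNode());

hyperview.ingestDelta(createDelta('A', 'H').setProperty('ace', 'value', '1', 'ace').buildV1());
hyperview.ingestDelta(createDelta('B', 'H').setProperty('ace', 'value', '2', 'ace').buildV1());

// Only deltas whose creator is 'A' contribute to the composed view.
const onlyA: DeltaFilter = ({ creator }) => creator === 'A';
const filtered = hyperview.compose(['ace'], onlyA);
// filtered.ace.propertyDeltas.value contains just the delta from creator 'A'.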

View File

@ -1,12 +1,12 @@
import Debug from 'debug'; import Debug from 'debug';
import { PointerTarget } from "@src/core/delta"; import { PointerTarget } from "@src/core/delta";
import { Lossless, LosslessViewOne } from "@src/views/lossless"; import { Hyperview, HyperviewViewOne } from "@src/views/hyperview";
import { Lossy } from "@src/views/lossy"; import { Lossy } from "@src/views/view";
import { RhizomeNode } from "@src/node"; import { RhizomeNode } from "@src/node";
import { valueFromDelta } from "@src/views/lossless"; import { valueFromDelta } from "@src/views/hyperview";
import { latestFromCollapsedDeltas } from "@src/views/resolvers/timestamp-resolvers"; import { latestFromCollapsedDeltas } from "@src/views/resolvers/timestamp-resolvers";
import { createDelta } from "@src/core/delta-builder"; import { createDelta } from "@src/core/delta-builder";
const debug = Debug('rz:test:lossy'); const debug = Debug('rz:test:view');
type Role = { type Role = {
actor: PointerTarget, actor: PointerTarget,
@ -21,9 +21,9 @@ type Summary = {
class Summarizer extends Lossy<Summary> { class Summarizer extends Lossy<Summary> {
private readonly debug: debug.Debugger; private readonly debug: debug.Debugger;
constructor(lossless: Lossless) { constructor(hyperview: Hyperview) {
super(lossless); super(hyperview);
this.debug = Debug('rz:test:lossy:summarizer'); this.debug = Debug('rz:test:view:summarizer');
} }
initializer(): Summary { initializer(): Summary {
@ -37,9 +37,9 @@ class Summarizer extends Lossy<Summary> {
// it's really not CRDT, it likely depends on the order of the pointers. // it's really not CRDT, it likely depends on the order of the pointers.
// TODO: Prove with failing test // TODO: Prove with failing test
reducer(acc: Summary, cur: LosslessViewOne): Summary { reducer(acc: Summary, cur: HyperviewViewOne): Summary {
this.debug(`Processing view for entity ${cur.id} (referenced as: ${cur.referencedAs?.join(', ')})`); this.debug(`Processing view for entity ${cur.id} (referenced as: ${cur.referencedAs?.join(', ')})`);
this.debug(`lossless view:`, JSON.stringify(cur)); this.debug(`hyperview view:`, JSON.stringify(cur));
if (cur.referencedAs?.includes("role")) { if (cur.referencedAs?.includes("role")) {
this.debug(`Found role entity: ${cur.id}`); this.debug(`Found role entity: ${cur.id}`);
@ -92,12 +92,12 @@ class Summarizer extends Lossy<Summary> {
describe('Lossy', () => { describe('Lossy', () => {
describe('use a provided initializer, reducer, and resolver to resolve entity views', () => { describe('use a provided initializer, reducer, and resolver to resolve entity views', () => {
const node = new RhizomeNode(); const node = new RhizomeNode();
const lossless = new Lossless(node); const hyperview = new Hyperview(node);
const lossy = new Summarizer(lossless); const view = new Summarizer(hyperview);
beforeAll(() => { beforeAll(() => {
lossless.ingestDelta(createDelta('a', 'h') hyperview.ingestDelta(createDelta('a', 'h')
.addPointer('actor', 'keanu', 'roles') .addPointer('actor', 'keanu', 'roles')
.addPointer('role', 'neo', 'actor') .addPointer('role', 'neo', 'actor')
.addPointer('film', 'the_matrix', 'cast') .addPointer('film', 'the_matrix', 'cast')
@ -108,7 +108,7 @@ describe('Lossy', () => {
}); });
test('example summary', () => { test('example summary', () => {
const result = lossy.resolve(); const result = view.resolve();
debug('result', result); debug('result', result);
expect(result).toEqual({ expect(result).toEqual({
roles: [{ roles: [{

View File

@ -85,10 +85,10 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('salary', 15000000) .addPointer('salary', 15000000)
.addPointer('contract_date', '1999-03-31') .addPointer('contract_date', '1999-03-31')
.buildV1(); .buildV1();
node.lossless.ingestDelta(castingDelta); node.hyperview.ingestDelta(castingDelta);
// Test from Keanu's perspective // Test from Keanu's perspective
const keanuViews = node.lossless.compose(['keanu']); const keanuViews = node.hyperview.compose(['keanu']);
const keanuView = keanuViews['keanu']; const keanuView = keanuViews['keanu'];
expect(keanuView.propertyDeltas.filmography).toBeDefined(); expect(keanuView.propertyDeltas.filmography).toBeDefined();
@ -97,7 +97,7 @@ describe('Multi-Pointer Delta Resolution', () => {
const nestedKeanuView = schemaRegistry.applySchemaWithNesting( const nestedKeanuView = schemaRegistry.applySchemaWithNesting(
keanuView, keanuView,
'actor', 'actor',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );
@ -117,13 +117,13 @@ describe('Multi-Pointer Delta Resolution', () => {
} }
// Test from Matrix's perspective // Test from Matrix's perspective
const matrixViews = node.lossless.compose(['matrix']); const matrixViews = node.hyperview.compose(['matrix']);
const matrixView = matrixViews['matrix']; const matrixView = matrixViews['matrix'];
const nestedMatrixView = schemaRegistry.applySchemaWithNesting( const nestedMatrixView = schemaRegistry.applySchemaWithNesting(
matrixView, matrixView,
'movie', 'movie',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );
@ -169,16 +169,16 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('since', '2020-01-15') .addPointer('since', '2020-01-15')
.addPointer('intensity', 8) .addPointer('intensity', 8)
.buildV1(); .buildV1();
node.lossless.ingestDelta(relationshipDelta); node.hyperview.ingestDelta(relationshipDelta);
// Test from Alice's perspective // Test from Alice's perspective
const aliceViews = node.lossless.compose(['alice']); const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice']; const aliceView = aliceViews['alice'];
const nestedAliceView = schemaRegistry.applySchemaWithNesting( const nestedAliceView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'person', 'person',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );
@ -244,16 +244,16 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('budget', 50000) .addPointer('budget', 50000)
.addPointer('deadline', '2024-06-01') .addPointer('deadline', '2024-06-01')
.buildV1(); .buildV1();
node.lossless.ingestDelta(collaborationDelta); node.hyperview.ingestDelta(collaborationDelta);
// Test from project's perspective // Test from project's perspective
const projectViews = node.lossless.compose(['website']); const projectViews = node.hyperview.compose(['website']);
const projectView = projectViews['website']; const projectView = projectViews['website'];
const nestedProjectView = schemaRegistry.applySchemaWithNesting( const nestedProjectView = schemaRegistry.applySchemaWithNesting(
projectView, projectView,
'project', 'project',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );

View File

@ -59,10 +59,10 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends') .addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob') .addPointer('friends', 'bob')
.buildV1(); .buildV1();
node.lossless.ingestDelta(friendshipDelta); node.hyperview.ingestDelta(friendshipDelta);
// Get Alice's lossless view // Get Alice's hyperview view
const aliceViews = node.lossless.compose(['alice']); const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice']; const aliceView = aliceViews['alice'];
expect(aliceView).toBeDefined(); expect(aliceView).toBeDefined();
@ -73,7 +73,7 @@ describe('Nested Object Resolution', () => {
const nestedView = schemaRegistry.applySchemaWithNesting( const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'user', 'user',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );
@ -107,15 +107,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends') .addPointer('users', 'alice', 'friends')
.addPointer('friends', 'nonexistent') .addPointer('friends', 'nonexistent')
.buildV1(); .buildV1();
node.lossless.ingestDelta(friendshipDelta); node.hyperview.ingestDelta(friendshipDelta);
const aliceViews = node.lossless.compose(['alice']); const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice']; const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting( const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'user', 'user',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );
@ -158,23 +158,23 @@ describe('Nested Object Resolution', () => {
.addPointer('deep-users', 'alice', 'mentor') .addPointer('deep-users', 'alice', 'mentor')
.addPointer('mentor', 'bob') .addPointer('mentor', 'bob')
.buildV1(); .buildV1();
node.lossless.ingestDelta(mentorshipDelta1); node.hyperview.ingestDelta(mentorshipDelta1);
// Bob's mentor is Charlie // Bob's mentor is Charlie
const mentorshipDelta2 = createDelta(node.config.creator, node.config.peerId) const mentorshipDelta2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('deep-users', 'bob', 'mentor') .addPointer('deep-users', 'bob', 'mentor')
.addPointer('mentor', 'charlie') .addPointer('mentor', 'charlie')
.buildV1(); .buildV1();
node.lossless.ingestDelta(mentorshipDelta2); node.hyperview.ingestDelta(mentorshipDelta2);
const aliceViews = node.lossless.compose(['alice']); const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice']; const aliceView = aliceViews['alice'];
// Test with maxDepth = 1 (should only resolve Alice and Bob) // Test with maxDepth = 1 (should only resolve Alice and Bob)
const shallowView = schemaRegistry.applySchemaWithNesting( const shallowView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'deep-user', 'deep-user',
node.lossless, node.hyperview,
{ maxDepth: 1 } { maxDepth: 1 }
); );
@ -197,7 +197,7 @@ describe('Nested Object Resolution', () => {
const deepView = schemaRegistry.applySchemaWithNesting( const deepView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'deep-user', 'deep-user',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );
@ -234,22 +234,22 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends') .addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob') .addPointer('friends', 'bob')
.buildV1(); .buildV1();
node.lossless.ingestDelta(friendship1); node.hyperview.ingestDelta(friendship1);
const friendship2 = createDelta(node.config.creator, node.config.peerId) const friendship2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('users', 'bob', 'friends') .addPointer('users', 'bob', 'friends')
.addPointer('friends', 'alice') .addPointer('friends', 'alice')
.buildV1(); .buildV1();
node.lossless.ingestDelta(friendship2); node.hyperview.ingestDelta(friendship2);
const aliceViews = node.lossless.compose(['alice']); const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice']; const aliceView = aliceViews['alice'];
// Should handle circular reference without infinite recursion // Should handle circular reference without infinite recursion
const nestedView = schemaRegistry.applySchemaWithNesting( const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'user', 'user',
node.lossless, node.hyperview,
{ maxDepth: 3 } { maxDepth: 3 }
); );
@ -275,15 +275,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends') .addPointer('users', 'alice', 'friends')
.addPointer('friends', 'alice') .addPointer('friends', 'alice')
.buildV1(); .buildV1();
node.lossless.ingestDelta(selfFriendship); node.hyperview.ingestDelta(selfFriendship);
const aliceViews = node.lossless.compose(['alice']); const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice']; const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting( const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'user', 'user',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );
@ -311,21 +311,21 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends') .addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob') .addPointer('friends', 'bob')
.buildV1(); .buildV1();
node.lossless.ingestDelta(friendship1); node.hyperview.ingestDelta(friendship1);
const friendship2 = createDelta(node.config.creator, node.config.peerId) const friendship2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('users', 'alice', 'friends') .addPointer('users', 'alice', 'friends')
.addPointer('friends', 'charlie') .addPointer('friends', 'charlie')
.buildV1(); .buildV1();
node.lossless.ingestDelta(friendship2); node.hyperview.ingestDelta(friendship2);
const aliceViews = node.lossless.compose(['alice']); const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice']; const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting( const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'user', 'user',
node.lossless, node.hyperview,
{ maxDepth: 2 } { maxDepth: 2 }
); );
@ -373,15 +373,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends') .addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob') .addPointer('friends', 'bob')
.buildV1(); .buildV1();
node.lossless.ingestDelta(friendship); node.hyperview.ingestDelta(friendship);
const aliceViews = node.lossless.compose(['alice']); const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice']; const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting( const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView, aliceView,
'user', 'user',
node.lossless, node.hyperview,
{ maxDepth: 3 } { maxDepth: 3 }
); );
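The nesting tests above repeat a single pattern: compose a hyperview for one entity, then apply a schema to it with a depth limit. A minimal sketch of that pattern follows; it assumes a DefaultSchemaRegistry with a 'user' schema already registered (as this file's setup presumably does), while the ingestDelta, compose, and applySchemaWithNesting calls match the tests verbatim.

import { RhizomeNode } from '@src/node';
import { DefaultSchemaRegistry } from '@src/schema';
import { createDelta } from '@src/core/delta-builder';

const node = new RhizomeNode();
const schemaRegistry = new DefaultSchemaRegistry();
// Assumption: a 'user' schema is registered on schemaRegistry before this point.

node.hyperview.ingestDelta(createDelta(node.config.creator, node.config.peerId)
  .addPointer('users', 'alice', 'friends')
  .addPointer('friends', 'bob')
  .buildV1()
);

const aliceView = node.hyperview.compose(['alice'])['alice'];
const nested = schemaRegistry.applySchemaWithNesting(
  aliceView,
  'user',
  node.hyperview,
  { maxDepth: 2 }
);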

View File

@ -1,6 +1,6 @@
import { import {
RhizomeNode, RhizomeNode,
Lossless, Hyperview,
AggregationResolver, AggregationResolver,
MinResolver, MinResolver,
MaxResolver, MaxResolver,
@ -13,31 +13,31 @@ import { createDelta } from "@src/core/delta-builder";
describe('Aggregation Resolvers', () => { describe('Aggregation Resolvers', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
describe('Basic Aggregation', () => { describe('Basic Aggregation', () => {
test('should aggregate numbers using min resolver', () => { test('should aggregate numbers using min resolver', () => {
const minResolver = new MinResolver(lossless, ['score']); const minResolver = new MinResolver(hyperview, ['score']);
// Add first entity with score 10 // Add first entity with score 10
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection') .setProperty('entity1', 'score', 10, 'collection')
.buildV1() .buildV1()
); );
// Add second entity with score 5 // Add second entity with score 5
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 5, 'collection') .setProperty('entity2', 'score', 5, 'collection')
.buildV1() .buildV1()
); );
// Add third entity with score 15 // Add third entity with score 15
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity3', 'score', 15, 'collection') .setProperty('entity3', 'score', 15, 'collection')
.buildV1() .buildV1()
); );
@ -52,20 +52,20 @@ describe('Aggregation Resolvers', () => {
}); });
test('should aggregate numbers using max resolver', () => { test('should aggregate numbers using max resolver', () => {
const maxResolver = new MaxResolver(lossless, ['score']); const maxResolver = new MaxResolver(hyperview, ['score']);
// Add deltas for entities // Add deltas for entities
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection') .setProperty('entity1', 'score', 10, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 5, 'collection') .setProperty('entity2', 'score', 5, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity3', 'score', 15, 'collection') .setProperty('entity3', 'score', 15, 'collection')
.buildV1() .buildV1()
); );
@ -79,22 +79,22 @@ describe('Aggregation Resolvers', () => {
}); });
test('should aggregate numbers using sum resolver', () => { test('should aggregate numbers using sum resolver', () => {
const sumResolver = new SumResolver(lossless, ['value']); const sumResolver = new SumResolver(hyperview, ['value']);
// Add first value for entity1 // Add first value for entity1
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection') .setProperty('entity1', 'value', 10, 'collection')
.buildV1() .buildV1()
); );
// Add second value for entity1 (should sum) // Add second value for entity1 (should sum)
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 20, 'collection') .setProperty('entity1', 'value', 20, 'collection')
.buildV1() .buildV1()
); );
// Add value for entity2 // Add value for entity2
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'value', 5, 'collection') .setProperty('entity2', 'value', 5, 'collection')
.buildV1() .buildV1()
); );
@ -107,21 +107,21 @@ describe('Aggregation Resolvers', () => {
}); });
test('should aggregate numbers using average resolver', () => { test('should aggregate numbers using average resolver', () => {
const avgResolver = new AverageResolver(lossless, ['score']); const avgResolver = new AverageResolver(hyperview, ['score']);
// Add multiple scores for entity1 // Add multiple scores for entity1
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection') .setProperty('entity1', 'score', 10, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 20, 'collection') .setProperty('entity1', 'score', 20, 'collection')
.buildV1() .buildV1()
); );
// Single value for entity2 // Single value for entity2
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 30, 'collection') .setProperty('entity2', 'score', 30, 'collection')
.buildV1() .buildV1()
); );
@ -134,21 +134,21 @@ describe('Aggregation Resolvers', () => {
}); });
test('should count values using count resolver', () => { test('should count values using count resolver', () => {
const countResolver = new CountResolver(lossless, ['visits']); const countResolver = new CountResolver(hyperview, ['visits']);
// Add multiple visit deltas for entity1 // Add multiple visit deltas for entity1
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'visits', 1, 'collection') .setProperty('entity1', 'visits', 1, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'visits', 1, 'collection') .setProperty('entity1', 'visits', 1, 'collection')
.buildV1() .buildV1()
); );
// Single visit for entity2 // Single visit for entity2
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'visits', 1, 'collection') .setProperty('entity2', 'visits', 1, 'collection')
.buildV1() .buildV1()
); );
@ -163,40 +163,40 @@ describe('Aggregation Resolvers', () => {
describe('Custom Aggregation Configuration', () => { describe('Custom Aggregation Configuration', () => {
test('should handle mixed aggregation types', () => { test('should handle mixed aggregation types', () => {
const resolver = new AggregationResolver(lossless, { const resolver = new AggregationResolver(hyperview, {
min_val: 'min' as AggregationType, min_val: 'min' as AggregationType,
max_val: 'max' as AggregationType, max_val: 'max' as AggregationType,
sum_val: 'sum' as AggregationType sum_val: 'sum' as AggregationType
}); });
// Add first set of values // Add first set of values
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'min_val', 10, 'collection') .setProperty('entity1', 'min_val', 10, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'max_val', 5, 'collection') .setProperty('entity1', 'max_val', 5, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'sum_val', 3, 'collection') .setProperty('entity1', 'sum_val', 3, 'collection')
.buildV1() .buildV1()
); );
// Add second set of values // Add second set of values
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'min_val', 5, 'collection') .setProperty('entity1', 'min_val', 5, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'max_val', 15, 'collection') .setProperty('entity1', 'max_val', 15, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'sum_val', 7, 'collection') .setProperty('entity1', 'sum_val', 7, 'collection')
.buildV1() .buildV1()
); );
@ -212,25 +212,25 @@ describe('Aggregation Resolvers', () => {
}); });
test('should ignore non-numeric values', () => { test('should ignore non-numeric values', () => {
const resolver = new AggregationResolver(lossless, { const resolver = new AggregationResolver(hyperview, {
score: 'sum' as AggregationType, score: 'sum' as AggregationType,
name: 'count' as AggregationType name: 'count' as AggregationType
}); });
// Add numeric value // Add numeric value
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection') .setProperty('entity1', 'score', 10, 'collection')
.buildV1() .buildV1()
); );
// Add non-numeric value (string) // Add non-numeric value (string)
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'name', 'test', 'collection') .setProperty('entity1', 'name', 'test', 'collection')
.buildV1() .buildV1()
); );
// Add another numeric value // Add another numeric value
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 20, 'collection') .setProperty('entity1', 'score', 20, 'collection')
.buildV1() .buildV1()
); );
@ -244,9 +244,9 @@ describe('Aggregation Resolvers', () => {
}); });
test('should handle empty value arrays', () => { test('should handle empty value arrays', () => {
const sumResolver = new SumResolver(lossless, ['score']); const sumResolver = new SumResolver(hyperview, ['score']);
// Create entity with non-aggregated property // Create entity with non-aggregated property
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'name', 'test', 'collection') .setProperty('entity1', 'name', 'test', 'collection')
.buildV1() .buildV1()
); );
@ -261,9 +261,9 @@ describe('Aggregation Resolvers', () => {
describe('Edge Cases', () => { describe('Edge Cases', () => {
test('should handle single value aggregations', () => { test('should handle single value aggregations', () => {
const avgResolver = new AverageResolver(lossless, ['value']); const avgResolver = new AverageResolver(hyperview, ['value']);
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 42, 'collection') .setProperty('entity1', 'value', 42, 'collection')
.buildV1() .buildV1()
); );
@ -275,14 +275,14 @@ describe('Aggregation Resolvers', () => {
}); });
test('should handle zero values', () => { test('should handle zero values', () => {
const sumResolver = new SumResolver(lossless, ['value']); const sumResolver = new SumResolver(hyperview, ['value']);
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 0, 'collection') .setProperty('entity1', 'value', 0, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection') .setProperty('entity1', 'value', 10, 'collection')
.buildV1() .buildV1()
); );
@ -294,14 +294,14 @@ describe('Aggregation Resolvers', () => {
}); });
test('should handle negative values', () => { test('should handle negative values', () => {
const minResolver = new MinResolver(lossless, ['value']); const minResolver = new MinResolver(hyperview, ['value']);
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', -5, 'collection') .setProperty('entity1', 'value', -5, 'collection')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('test', 'host1') hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection') .setProperty('entity1', 'value', 10, 'collection')
.buildV1() .buildV1()
); );
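For quick reference, an aggregation resolver is constructed over a Hyperview plus the property names it should aggregate. The sketch below assumes SumResolver is re-exported from '@src' (the import's module path is not visible in this hunk) and that it exposes the same resolve() entry point as the Lossy-based Summarizer shown earlier; the ingestDelta calls match the tests verbatim.

import { RhizomeNode, Hyperview, SumResolver } from '@src'; // assumed re-exports
import { createDelta } from '@src/core/delta-builder';

const hyperview = new Hyperview(new RhizomeNode());
const sumResolver = new SumResolver(hyperview, ['value']);

hyperview.ingestDelta(createDelta('test', 'host1')
  .setProperty('entity1', 'value', 10, 'collection')
  .buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
  .setProperty('entity1', 'value', 20, 'collection')
  .buildV1()
);

// Assumed entry point, mirroring Lossy.resolve() in the Summarizer test above.
const totals = sumResolver.resolve(); // entity1's 'value' should sum to 30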

View File

@ -1,17 +1,17 @@
import { describe, test, expect, beforeEach } from '@jest/globals'; import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless, createDelta } from '@src'; import { RhizomeNode, Hyperview, createDelta } from '@src';
import { CollapsedDelta } from '@src/views/lossless'; import { CollapsedDelta } from '@src/views/hyperview';
import { CustomResolver, ResolverPlugin } from '@src/views/resolvers/custom-resolvers'; import { CustomResolver, ResolverPlugin } from '@src/views/resolvers/custom-resolvers';
type PropertyTypes = string | number | boolean | null; type PropertyTypes = string | number | boolean | null;
describe('Basic Dependency Resolution', () => { describe('Basic Dependency Resolution', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
test('should resolve dependencies in correct order', () => { test('should resolve dependencies in correct order', () => {
@ -51,20 +51,20 @@ describe('Basic Dependency Resolution', () => {
} }
} }
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
first: new FirstPlugin(), first: new FirstPlugin(),
second: new SecondPlugin() second: new SecondPlugin()
}); });
// Add some data // Add some data
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('test1', 'first', 'hello', 'test') .setProperty('test1', 'first', 'hello', 'test')
.buildV1() .buildV1()
); );
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(2000) .withTimestamp(2000)
.setProperty('test1', 'second', 'world', 'test') .setProperty('test1', 'second', 'world', 'test')

View File

@ -1,17 +1,17 @@
import { describe, test, expect, beforeEach } from '@jest/globals'; import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless } from '@src'; import { RhizomeNode, Hyperview } from '@src';
import { CollapsedDelta } from '@src/views/lossless'; import { CollapsedDelta } from '@src/views/hyperview';
import { CustomResolver, ResolverPlugin } from '@src/views/resolvers/custom-resolvers'; import { CustomResolver, ResolverPlugin } from '@src/views/resolvers/custom-resolvers';
type PropertyTypes = string | number | boolean | null; type PropertyTypes = string | number | boolean | null;
describe('Circular Dependency Detection', () => { describe('Circular Dependency Detection', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
test('should detect circular dependencies', () => { test('should detect circular dependencies', () => {
@ -53,7 +53,7 @@ describe('Circular Dependency Detection', () => {
// Should throw an error when circular dependencies are detected // Should throw an error when circular dependencies are detected
expect(() => { expect(() => {
new CustomResolver(lossless, { new CustomResolver(hyperview, {
'a': new PluginA(), 'a': new PluginA(),
'b': new PluginB() 'b': new PluginB()
}); });
@ -84,7 +84,7 @@ describe('Circular Dependency Detection', () => {
// Should detect the circular dependency: a -> c -> b -> a // Should detect the circular dependency: a -> c -> b -> a
expect(() => { expect(() => {
new CustomResolver(lossless, { new CustomResolver(hyperview, {
'a': new PluginA(), 'a': new PluginA(),
'b': new PluginB(), 'b': new PluginB(),
'c': new PluginC() 'c': new PluginC()

View File

@ -1,5 +1,5 @@
import { describe, test, expect, beforeEach } from '@jest/globals'; import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless, createDelta, CollapsedDelta } from '@src'; import { RhizomeNode, Hyperview, createDelta, CollapsedDelta } from '@src';
import { import {
CustomResolver, CustomResolver,
DependencyStates, DependencyStates,
@ -9,11 +9,11 @@ import { PropertyTypes } from '@src/core/types';
describe('Edge Cases', () => { describe('Edge Cases', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
test('should handle null values', () => { test('should handle null values', () => {
@ -45,11 +45,11 @@ describe('Edge Cases', () => {
} }
} }
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
value: new NullSafeLastWriteWinsPlugin() value: new NullSafeLastWriteWinsPlugin()
}); });
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('test2', 'value', null, 'test') .setProperty('test2', 'value', null, 'test')
@ -95,19 +95,19 @@ describe('Edge Cases', () => {
} }
} }
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
value: new ConcurrentUpdatePlugin() value: new ConcurrentUpdatePlugin()
}); });
// Two updates with the same timestamp // Two updates with the same timestamp
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('test2', 'value', null, 'test') .setProperty('test2', 'value', null, 'test')
.buildV1() .buildV1()
); );
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user2', 'host2') createDelta('user2', 'host2')
.withTimestamp(1000) // Same timestamp .withTimestamp(1000) // Same timestamp
.setProperty('test2', 'value', 'xylophone', 'test') .setProperty('test2', 'value', 'xylophone', 'test')
@ -149,13 +149,13 @@ describe('Edge Cases', () => {
} }
} }
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
counter: new CounterPlugin() counter: new CounterPlugin()
}); });
// Add 1000 updates // Add 1000 updates
for (let i = 0; i < 1000; i++) { for (let i = 0; i < 1000; i++) {
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000 + i) .withTimestamp(1000 + i)
.setProperty('test3', 'counter', i, 'test') .setProperty('test3', 'counter', i, 'test')
@ -193,7 +193,7 @@ describe('Edge Cases', () => {
} }
} }
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
missing: new MissingPropertyPlugin() missing: new MissingPropertyPlugin()
}); });

View File

@ -1,6 +1,6 @@
import { PropertyID } from '@src/core/types'; import { PropertyID } from '@src/core/types';
import { describe, test, expect, beforeEach } from '@jest/globals'; import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless, createDelta } from '@src'; import { RhizomeNode, Hyperview, createDelta } from '@src';
import { import {
CustomResolver, CustomResolver,
LastWriteWinsPlugin, LastWriteWinsPlugin,
@ -10,7 +10,7 @@ import {
ResolverPlugin ResolverPlugin
} from '@src/views/resolvers/custom-resolvers'; } from '@src/views/resolvers/custom-resolvers';
import Debug from 'debug'; import Debug from 'debug';
const debug = Debug('rz:test:lossless'); const debug = Debug('rz:test:hyperview');
// A simple plugin that depends on other plugins // A simple plugin that depends on other plugins
class AveragePlugin<Targets extends PropertyID> extends ResolverPlugin<{ initialized: boolean }> { class AveragePlugin<Targets extends PropertyID> extends ResolverPlugin<{ initialized: boolean }> {
@ -49,15 +49,15 @@ class AveragePlugin<Targets extends PropertyID> extends ResolverPlugin<{ initial
describe('Multiple Plugins Integration', () => { describe('Multiple Plugins Integration', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
test('should handle multiple plugins with dependencies', () => { test('should handle multiple plugins with dependencies', () => {
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
temperature: new LastWriteWinsPlugin(), temperature: new LastWriteWinsPlugin(),
maxTemp: new MaxPlugin('temperature'), maxTemp: new MaxPlugin('temperature'),
minTemp: new MinPlugin('temperature'), minTemp: new MinPlugin('temperature'),
@ -67,7 +67,7 @@ describe('Multiple Plugins Integration', () => {
// Add some temperature readings // Add some temperature readings
const readings = [22, 25, 18, 30, 20]; const readings = [22, 25, 18, 30, 20];
readings.forEach((temp, index) => { readings.forEach((temp, index) => {
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('sensor1', 'host1') createDelta('sensor1', 'host1')
.withTimestamp(1000 + index * 1000) .withTimestamp(1000 + index * 1000)
.setProperty('room1', 'temperature', temp, 'sensors') .setProperty('room1', 'temperature', temp, 'sensors')
@ -89,7 +89,7 @@ describe('Multiple Plugins Integration', () => {
}); });
test('should handle multiple entities with different plugins', () => { test('should handle multiple entities with different plugins', () => {
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
name: new LastWriteWinsPlugin(), name: new LastWriteWinsPlugin(),
tags: new ConcatenationPlugin(), tags: new ConcatenationPlugin(),
score: new MaxPlugin() score: new MaxPlugin()
@ -97,7 +97,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting first delta`); debug(`Creating and ingesting first delta`);
// Add data for entity1 // Add data for entity1
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('entity1', 'name', 'Test Entity', 'test-name') .setProperty('entity1', 'name', 'Test Entity', 'test-name')
@ -107,7 +107,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting second delta`); debug(`Creating and ingesting second delta`);
// Add more tags to entity1 // Add more tags to entity1
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(2000) .withTimestamp(2000)
.setProperty('entity1', 'tags', 'tag2', 'test') .setProperty('entity1', 'tags', 'tag2', 'test')
@ -116,7 +116,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting third delta`); debug(`Creating and ingesting third delta`);
// Add data for entity2 // Add data for entity2
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('entity2', 'score', 85, 'test') .setProperty('entity2', 'score', 85, 'test')
@ -125,7 +125,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting fourth delta`); debug(`Creating and ingesting fourth delta`);
// Update score for entity2 // Update score for entity2
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(2000) .withTimestamp(2000)
.setProperty('entity2', 'score', 90, 'test') .setProperty('entity2', 'score', 90, 'test')

View File

@ -1,5 +1,5 @@
import { describe, test, expect, beforeEach } from '@jest/globals'; import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless, createDelta, CollapsedDelta } from '@src'; import { RhizomeNode, Hyperview, createDelta, CollapsedDelta } from '@src';
import { import {
CustomResolver, CustomResolver,
ResolverPlugin, ResolverPlugin,
@ -51,20 +51,20 @@ type LifecycleTestState = {
describe('Plugin Lifecycle', () => { describe('Plugin Lifecycle', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
test('should call initialize, update, and resolve in order', () => { test('should call initialize, update, and resolve in order', () => {
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
test: new LifecycleTestPlugin() test: new LifecycleTestPlugin()
}); });
// Add some data // Add some data
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('test1', 'test', 'value1') .setProperty('test1', 'test', 'value1')
@ -92,12 +92,12 @@ describe('Plugin Lifecycle', () => {
}); });
test('should handle multiple updates correctly', () => { test('should handle multiple updates correctly', () => {
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
test: new LifecycleTestPlugin() test: new LifecycleTestPlugin()
}); });
// First update // First update
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('test2', 'test', 'value1') .setProperty('test2', 'test', 'value1')
@ -105,7 +105,7 @@ describe('Plugin Lifecycle', () => {
); );
// Second update // Second update
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(2000) .withTimestamp(2000)
.setProperty('test2', 'test', 'value2') .setProperty('test2', 'test', 'value2')
@ -132,7 +132,7 @@ describe('Plugin Lifecycle', () => {
}); });
test('should handle empty state', () => { test('should handle empty state', () => {
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
test: new LifecycleTestPlugin() test: new LifecycleTestPlugin()
}); });

View File

@ -1,7 +1,7 @@
import { describe, test, expect } from '@jest/globals'; import { describe, test, expect } from '@jest/globals';
import { ResolverPlugin, DependencyStates } from '@src/views/resolvers/custom-resolvers'; import { ResolverPlugin, DependencyStates } from '@src/views/resolvers/custom-resolvers';
import { PropertyTypes } from '@src/core/types'; import { PropertyTypes } from '@src/core/types';
import type { CollapsedDelta } from '@src/views/lossless'; import type { CollapsedDelta } from '@src/views/hyperview';
import { testResolverWithPlugins, createTestDelta } from '@test-helpers/resolver-test-helper'; import { testResolverWithPlugins, createTestDelta } from '@test-helpers/resolver-test-helper';
class CountPlugin extends ResolverPlugin<{ count: number }> { class CountPlugin extends ResolverPlugin<{ count: number }> {

View File

@ -1,6 +1,6 @@
import { describe, test, expect, beforeEach } from '@jest/globals'; import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode } from '@src'; import { RhizomeNode } from '@src';
import { Lossless } from '@src/views/lossless'; import { Hyperview } from '@src/views/hyperview';
import { CustomResolver } from '@src/views/resolvers/custom-resolvers'; import { CustomResolver } from '@src/views/resolvers/custom-resolvers';
import { ResolverPlugin } from '@src/views/resolvers/custom-resolvers/plugin'; import { ResolverPlugin } from '@src/views/resolvers/custom-resolvers/plugin';
// import Debug from 'debug'; // import Debug from 'debug';
@ -25,11 +25,11 @@ class TestPlugin extends ResolverPlugin<unknown> {
describe('CustomResolver', () => { describe('CustomResolver', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
describe('buildDependencyGraph', () => { describe('buildDependencyGraph', () => {
@ -42,7 +42,7 @@ describe('CustomResolver', () => {
}; };
// Act // Act
const resolver = new CustomResolver(lossless, plugins); const resolver = new CustomResolver(hyperview, plugins);
const graph = resolver.dependencyGraph; const graph = resolver.dependencyGraph;
@ -65,7 +65,7 @@ describe('CustomResolver', () => {
}; };
// Act // Act
const resolver = new CustomResolver(lossless, plugins); const resolver = new CustomResolver(hyperview, plugins);
// Access private method for testing // Access private method for testing
const graph = resolver.dependencyGraph; const graph = resolver.dependencyGraph;
@ -86,7 +86,7 @@ describe('CustomResolver', () => {
// Act & Assert // Act & Assert
expect(() => { expect(() => {
new CustomResolver(lossless, plugins); new CustomResolver(hyperview, plugins);
}).toThrow('Dependency nonexistent not found for plugin a'); }).toThrow('Dependency nonexistent not found for plugin a');
}); });
@ -99,7 +99,7 @@ describe('CustomResolver', () => {
}; };
// Act // Act
const resolver = new CustomResolver(lossless, plugins); const resolver = new CustomResolver(hyperview, plugins);
// Access private method for testing // Access private method for testing
const graph = resolver.dependencyGraph; const graph = resolver.dependencyGraph;
@ -125,7 +125,7 @@ describe('CustomResolver', () => {
// Act & Assert // Act & Assert
expect(() => { expect(() => {
new CustomResolver(lossless, plugins); new CustomResolver(hyperview, plugins);
}).toThrow('Circular dependency detected in plugin dependencies'); }).toThrow('Circular dependency detected in plugin dependencies');
}); });
}); });

View File

@ -1,6 +1,6 @@
import Debug from "debug"; import Debug from "debug";
import { createDelta } from '@src/core/delta-builder'; import { createDelta } from '@src/core/delta-builder';
import { Lossless, RhizomeNode } from '@src'; import { Hyperview, RhizomeNode } from '@src';
import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers'; import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers';
const debug = Debug('rz:test:last-write-wins'); const debug = Debug('rz:test:last-write-wins');
@ -11,24 +11,24 @@ describe('Last write wins', () => {
describe('given that two separate writes occur', () => { describe('given that two separate writes occur', () => {
const node = new RhizomeNode(); const node = new RhizomeNode();
const lossless = new Lossless(node); const hyperview = new Hyperview(node);
const lossy = new TimestampResolver(lossless); const view = new TimestampResolver(hyperview);
beforeAll(() => { beforeAll(() => {
lossless.ingestDelta(createDelta('a', 'h') hyperview.ingestDelta(createDelta('a', 'h')
.setProperty('broccoli', 'want', 95, 'vegetable') .setProperty('broccoli', 'want', 95, 'vegetable')
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('a', 'h') hyperview.ingestDelta(createDelta('a', 'h')
.setProperty('broccoli', 'want', 90, 'vegetable') .setProperty('broccoli', 'want', 90, 'vegetable')
.buildV1() .buildV1()
); );
}); });
test('our resolver should return the most recently written value', () => { test('our resolver should return the most recently written value', () => {
const result = lossy.resolve(["broccoli"]); const result = view.resolve(["broccoli"]);
debug('result', result); debug('result', result);
expect(result).toMatchObject({ expect(result).toMatchObject({
broccoli: { broccoli: {

View File

@ -1,5 +1,5 @@
import { RhizomeNode, Lossless, createDelta } from "@src"; import { RhizomeNode, Hyperview, createDelta } from "@src";
import { CollapsedDelta } from "@src/views/lossless"; import { CollapsedDelta } from "@src/views/hyperview";
import { import {
CustomResolver, CustomResolver,
ResolverPlugin, ResolverPlugin,
@ -10,11 +10,11 @@ import { PropertyTypes } from '@src/core/types';
describe('State Visibility', () => { describe('State Visibility', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
// A test plugin that records which states it sees // A test plugin that records which states it sees
@ -88,10 +88,10 @@ describe('State Visibility', () => {
prop2: spy2 prop2: spy2
} as const; } as const;
const resolver = new CustomResolver(lossless, config); const resolver = new CustomResolver(hyperview, config);
// Add some data // Add some data
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('entity1', 'prop1', 'value1', 'entity-prop1') .setProperty('entity1', 'prop1', 'value1', 'entity-prop1')
@ -127,10 +127,10 @@ describe('State Visibility', () => {
dependsOn: dependency dependsOn: dependency
} as const; } as const;
const resolver = new CustomResolver(lossless, config); const resolver = new CustomResolver(hyperview, config);
// Add some data // Add some data
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('entity1', 'dependsOn', 'baseValue', 'prop1') .setProperty('entity1', 'dependsOn', 'baseValue', 'prop1')
@ -157,14 +157,14 @@ describe('State Visibility', () => {
const lastWrite = new LastWriteWinsPlugin(); const lastWrite = new LastWriteWinsPlugin();
const other = new LastWriteWinsPlugin(); const other = new LastWriteWinsPlugin();
const resolver = new CustomResolver(lossless, { const resolver = new CustomResolver(hyperview, {
dependent: dependent, dependent: dependent,
dependsOn: lastWrite, dependsOn: lastWrite,
other: other // Not declared as a dependency other: other // Not declared as a dependency
}); });
// Add some data // Add some data
lossless.ingestDelta( hyperview.ingestDelta(
createDelta('user1', 'host1') createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.setProperty('entity1', 'dependsOn', 'baseValue', 'prop1') .setProperty('entity1', 'dependsOn', 'baseValue', 'prop1')
@ -214,7 +214,7 @@ describe('State Visibility', () => {
} }
expect(() => { expect(() => {
new CustomResolver(lossless, { new CustomResolver(hyperview, {
bad: new PluginWithBadDeps() bad: new PluginWithBadDeps()
}); });
}).toThrow("Dependency nonexistent not found for plugin bad"); }).toThrow("Dependency nonexistent not found for plugin bad");

View File

@ -1,6 +1,6 @@
import { import {
RhizomeNode, RhizomeNode,
Lossless, Hyperview,
TimestampResolver, TimestampResolver,
CreatorIdTimestampResolver, CreatorIdTimestampResolver,
DeltaIdTimestampResolver, DeltaIdTimestampResolver,
@ -13,19 +13,19 @@ const debug = Debug('rz:test:timestamp-resolvers');
describe('Timestamp Resolvers', () => { describe('Timestamp Resolvers', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let hyperview: Hyperview;
beforeEach(() => { beforeEach(() => {
node = new RhizomeNode(); node = new RhizomeNode();
lossless = new Lossless(node); hyperview = new Hyperview(node);
}); });
describe('Basic Timestamp Resolution', () => { describe('Basic Timestamp Resolution', () => {
test('should resolve by most recent timestamp', () => { test('should resolve by most recent timestamp', () => {
const resolver = new TimestampResolver(lossless); const resolver = new TimestampResolver(hyperview);
// Add older delta // Add older delta
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta1') .withId('delta1')
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -34,7 +34,7 @@ describe('Timestamp Resolvers', () => {
); );
// Add newer delta // Add newer delta
lossless.ingestDelta(createDelta('user2', 'host2') hyperview.ingestDelta(createDelta('user2', 'host2')
.withId('delta2') .withId('delta2')
.withTimestamp(2000) .withTimestamp(2000)
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -50,10 +50,10 @@ describe('Timestamp Resolvers', () => {
}); });
test('should handle multiple entities with different timestamps', () => { test('should handle multiple entities with different timestamps', () => {
const resolver = new TimestampResolver(lossless); const resolver = new TimestampResolver(hyperview);
// Entity1 - older value // Entity1 - older value
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'value') .addPointer('collection', 'entity1', 'value')
.addPointer('value', 100) .addPointer('value', 100)
@ -61,7 +61,7 @@ describe('Timestamp Resolvers', () => {
); );
// Entity2 - newer value // Entity2 - newer value
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withTimestamp(2000) .withTimestamp(2000)
.addPointer('collection', 'entity2', 'value') .addPointer('collection', 'entity2', 'value')
.addPointer('value', 200) .addPointer('value', 200)
@ -78,10 +78,10 @@ describe('Timestamp Resolvers', () => {
describe('Tie-Breaking Strategies', () => { describe('Tie-Breaking Strategies', () => {
test('should break ties using creator-id strategy', () => { test('should break ties using creator-id strategy', () => {
const resolver = new CreatorIdTimestampResolver(lossless); const resolver = new CreatorIdTimestampResolver(hyperview);
// Two deltas with same timestamp, different creators // Two deltas with same timestamp, different creators
lossless.ingestDelta(createDelta('user_z', 'host1') hyperview.ingestDelta(createDelta('user_z', 'host1')
.withId('delta1') .withId('delta1')
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -89,7 +89,7 @@ describe('Timestamp Resolvers', () => {
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('user_a', 'host1') hyperview.ingestDelta(createDelta('user_a', 'host1')
.withId('delta2') .withId('delta2')
.withTimestamp(1000) // Same timestamp .withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -105,10 +105,10 @@ describe('Timestamp Resolvers', () => {
}); });
test('should break ties using delta-id strategy', () => { test('should break ties using delta-id strategy', () => {
const resolver = new DeltaIdTimestampResolver(lossless); const resolver = new DeltaIdTimestampResolver(hyperview);
// Two deltas with same timestamp, different delta IDs // Two deltas with same timestamp, different delta IDs
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta_a') // Lexicographically earlier .withId('delta_a') // Lexicographically earlier
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -116,7 +116,7 @@ describe('Timestamp Resolvers', () => {
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta_z') // Lexicographically later .withId('delta_z') // Lexicographically later
.withTimestamp(1000) // Same timestamp .withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -132,10 +132,10 @@ describe('Timestamp Resolvers', () => {
}); });
test('should break ties using host-id strategy', () => { test('should break ties using host-id strategy', () => {
const resolver = new HostIdTimestampResolver(lossless); const resolver = new HostIdTimestampResolver(hyperview);
// Two deltas with same timestamp, different hosts // Two deltas with same timestamp, different hosts
lossless.ingestDelta(createDelta('user1', 'host_z') // Lexicographically later hyperview.ingestDelta(createDelta('user1', 'host_z') // Lexicographically later
.withId('delta1') .withId('delta1')
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -143,7 +143,7 @@ describe('Timestamp Resolvers', () => {
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('user1', 'host_a') // Lexicographically earlier hyperview.ingestDelta(createDelta('user1', 'host_a') // Lexicographically earlier
.withId('delta2') .withId('delta2')
.withTimestamp(1000) // Same timestamp .withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -159,10 +159,10 @@ describe('Timestamp Resolvers', () => {
}); });
test('should break ties using lexicographic strategy with string values', () => { test('should break ties using lexicographic strategy with string values', () => {
const resolver = new LexicographicTimestampResolver(lossless); const resolver = new LexicographicTimestampResolver(hyperview);
// Two deltas with same timestamp, different string values // Two deltas with same timestamp, different string values
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta1') .withId('delta1')
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'name') .addPointer('collection', 'entity1', 'name')
@ -170,7 +170,7 @@ describe('Timestamp Resolvers', () => {
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta2') .withId('delta2')
.withTimestamp(1000) // Same timestamp .withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'name') .addPointer('collection', 'entity1', 'name')
@ -186,10 +186,10 @@ describe('Timestamp Resolvers', () => {
}); });
test('should break ties using lexicographic strategy with numeric values (falls back to delta ID)', () => { test('should break ties using lexicographic strategy with numeric values (falls back to delta ID)', () => {
const resolver = new LexicographicTimestampResolver(lossless); const resolver = new LexicographicTimestampResolver(hyperview);
// Two deltas with same timestamp, numeric values (should fall back to delta ID comparison) // Two deltas with same timestamp, numeric values (should fall back to delta ID comparison)
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta_a') // Lexicographically earlier .withId('delta_a') // Lexicographically earlier
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -197,7 +197,7 @@ describe('Timestamp Resolvers', () => {
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta_z') // Lexicographically later .withId('delta_z') // Lexicographically later
.withTimestamp(1000) // Same timestamp .withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -215,11 +215,11 @@ describe('Timestamp Resolvers', () => {
describe('Complex Tie-Breaking Scenarios', () => { describe('Complex Tie-Breaking Scenarios', () => {
test('should handle multiple properties with different tie-breaking outcomes', () => { test('should handle multiple properties with different tie-breaking outcomes', () => {
const creatorResolver = new CreatorIdTimestampResolver(lossless); const creatorResolver = new CreatorIdTimestampResolver(hyperview);
const deltaResolver = new DeltaIdTimestampResolver(lossless); const deltaResolver = new DeltaIdTimestampResolver(hyperview);
// Add deltas for multiple properties with same timestamp // Add deltas for multiple properties with same timestamp
lossless.ingestDelta(createDelta('user_a', 'host1') hyperview.ingestDelta(createDelta('user_a', 'host1')
.withId('delta_z') .withId('delta_z')
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'name') .addPointer('collection', 'entity1', 'name')
@ -227,7 +227,7 @@ describe('Timestamp Resolvers', () => {
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('user_z', 'host1') hyperview.ingestDelta(createDelta('user_z', 'host1')
.withId('delta_a') .withId('delta_a')
.withTimestamp(1000) // Same timestamp .withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'name') .addPointer('collection', 'entity1', 'name')
@ -249,10 +249,10 @@ describe('Timestamp Resolvers', () => {
}); });
test('should work consistently with timestamp priority over tie-breaking', () => { test('should work consistently with timestamp priority over tie-breaking', () => {
const resolver = new CreatorIdTimestampResolver(lossless); const resolver = new CreatorIdTimestampResolver(hyperview);
// Add older delta with "better" tie-breaking attributes // Add older delta with "better" tie-breaking attributes
lossless.ingestDelta(createDelta('user_z', 'host1') hyperview.ingestDelta(createDelta('user_z', 'host1')
.withId('delta_z') // Would win in delta ID tie-breaking .withId('delta_z') // Would win in delta ID tie-breaking
.withTimestamp(1000) // Older timestamp .withTimestamp(1000) // Older timestamp
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -261,7 +261,7 @@ describe('Timestamp Resolvers', () => {
); );
// Add newer delta with "worse" tie-breaking attributes // Add newer delta with "worse" tie-breaking attributes
lossless.ingestDelta(createDelta('user_a', 'host1') hyperview.ingestDelta(createDelta('user_a', 'host1')
.withId('delta_a') // Would lose in delta ID tie-breaking .withId('delta_a') // Would lose in delta ID tie-breaking
.withTimestamp(2000) // Newer timestamp .withTimestamp(2000) // Newer timestamp
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')
@ -279,8 +279,8 @@ describe('Timestamp Resolvers', () => {
describe('Edge Cases', () => { describe('Edge Cases', () => {
test('should handle single delta correctly', () => { test('should handle single delta correctly', () => {
const resolver = new TimestampResolver(lossless, 'creator-id'); const resolver = new TimestampResolver(hyperview, 'creator-id');
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta1') .withId('delta1')
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'value') .addPointer('collection', 'entity1', 'value')
@ -295,9 +295,9 @@ describe('Timestamp Resolvers', () => {
}); });
test('should handle mixed value types correctly', () => { test('should handle mixed value types correctly', () => {
const resolver = new TimestampResolver(lossless); const resolver = new TimestampResolver(hyperview);
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta1') .withId('delta1')
.withTimestamp(1000) .withTimestamp(1000)
.addPointer('collection', 'entity1', 'name') .addPointer('collection', 'entity1', 'name')
@ -305,7 +305,7 @@ describe('Timestamp Resolvers', () => {
.buildV1() .buildV1()
); );
lossless.ingestDelta(createDelta('user1', 'host1') hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta2') .withId('delta2')
.withTimestamp(1001) .withTimestamp(1001)
.addPointer('collection', 'entity1', 'score') .addPointer('collection', 'entity1', 'score')

View File

@ -11,7 +11,7 @@ classDiagram
-requestReply: RequestReply -requestReply: RequestReply
-httpServer: HttpServer -httpServer: HttpServer
-deltaStream: DeltaStream -deltaStream: DeltaStream
-lossless: Lossless -hyperview: Hyperview
-peers: Peers -peers: Peers
-queryEngine: QueryEngine -queryEngine: QueryEngine
-storageQueryEngine: StorageQueryEngine -storageQueryEngine: StorageQueryEngine
@ -27,15 +27,15 @@ classDiagram
+pointers: PointerV1[] +pointers: PointerV1[]
} }
class Lossless { class Hyperview {
-domainEntities: Map<DomainEntityID, LosslessEntity> -domainEntities: Map<DomainEntityID, HyperviewEntity>
-transactions: Transactions -transactions: Transactions
+view(ids: DomainEntityID[]): LosslessViewMany +view(ids: DomainEntityID[]): HyperviewViewMany
+compose(ids: DomainEntityID[]): LosslessViewMany +compose(ids: DomainEntityID[]): HyperviewViewMany
} }
class QueryEngine { class QueryEngine {
-lossless: Lossless -hyperview: Hyperview
-schemaRegistry: SchemaRegistry -schemaRegistry: SchemaRegistry
+query(schemaId: SchemaID, filter?: JsonLogic): Promise<SchemaAppliedViewWithNesting[]> +query(schemaId: SchemaID, filter?: JsonLogic): Promise<SchemaAppliedViewWithNesting[]>
} }
@ -69,23 +69,23 @@ classDiagram
%% Relationships %% Relationships
RhizomeNode --> DeltaStream RhizomeNode --> DeltaStream
RhizomeNode --> Lossless RhizomeNode --> Hyperview
RhizomeNode --> QueryEngine RhizomeNode --> QueryEngine
RhizomeNode --> StorageQueryEngine RhizomeNode --> StorageQueryEngine
RhizomeNode --> SchemaRegistry RhizomeNode --> SchemaRegistry
RhizomeNode --> DeltaStorage RhizomeNode --> DeltaStorage
Lossless --> Transactions Hyperview --> Transactions
Lossless --> LosslessEntity Hyperview --> HyperviewEntity
QueryEngine --> SchemaRegistry QueryEngine --> SchemaRegistry
QueryEngine --> Lossless QueryEngine --> Hyperview
StorageQueryEngine --> DeltaStorage StorageQueryEngine --> DeltaStorage
StorageQueryEngine --> SchemaRegistry StorageQueryEngine --> SchemaRegistry
DeltaStream --> Delta DeltaStream --> Delta
Lossless --> Delta Hyperview --> Delta
DockerOrchestrator --> ContainerManager DockerOrchestrator --> ContainerManager
DockerOrchestrator --> NetworkManager DockerOrchestrator --> NetworkManager
@ -103,7 +103,7 @@ classDiagram
- Represents atomic changes in the system - Represents atomic changes in the system
- Contains pointers to entities and their properties - Contains pointers to entities and their properties
3. **Lossless**: Manages the lossless view of data 3. **Hyperview**: Manages the hyperview view of data
- Maintains the complete history of deltas - Maintains the complete history of deltas
- Provides methods to view and compose entity states - Provides methods to view and compose entity states
@ -126,7 +126,7 @@ classDiagram
## Data Flow ## Data Flow
1. Deltas are received through the DeltaStream 1. Deltas are received through the DeltaStream
2. Lossless processes and stores these deltas 2. Hyperview processes and stores these deltas
3. Queries can be made through either QueryEngine (in-memory) or StorageQueryEngine (persisted) 3. Queries can be made through either QueryEngine (in-memory) or StorageQueryEngine (persisted)
4. The system maintains consistency through the schema system 4. The system maintains consistency through the schema system
5. In distributed mode, DockerOrchestrator manages multiple node instances 5. In distributed mode, DockerOrchestrator manages multiple node instances
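The flow in steps 1-3 can be sketched end to end using the APIs exercised by this PR's test suites (test-style path aliases; networking, error handling, and persistence omitted):

```typescript
import { RhizomeNode, Hyperview, createDelta } from '@src';
import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers';

// 2. The hyperview processes and stores incoming deltas
const node = new RhizomeNode();
const hyperview = new Hyperview(node);

// 1. A delta arrives (built locally here rather than received via the DeltaStream)
hyperview.ingestDelta(
  createDelta('user1', 'host1')
    .withTimestamp(1000)
    .setProperty('entity1', 'name', 'Alice', 'users')
    .buildV1()
);

// 3. A resolver queries the hyperview and produces a conflict-resolved result
const resolver = new TimestampResolver(hyperview);
const result = resolver.resolve(['entity1']);
```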

View File

@ -10,11 +10,11 @@ The `CustomResolver` class is the main entry point for the Custom Resolver syste
class CustomResolver { class CustomResolver {
/** /**
* Creates a new CustomResolver instance * Creates a new CustomResolver instance
* @param view The lossless view to resolve * @param view The hyperview view to resolve
* @param config Plugin configuration * @param config Plugin configuration
*/ */
constructor( constructor(
private readonly view: LosslessView, private readonly view: HyperviewView,
private readonly config: ResolverConfig private readonly config: ResolverConfig
); );
@ -48,7 +48,7 @@ class CustomResolver {
Creates a new instance of the CustomResolver. Creates a new instance of the CustomResolver.
**Parameters:** **Parameters:**
- `view: LosslessView` - The lossless view containing the data to resolve - `view: HyperviewView` - The hyperview view containing the data to resolve
- `config: ResolverConfig` - Configuration object mapping property IDs to their resolver plugins - `config: ResolverConfig` - Configuration object mapping property IDs to their resolver plugins
**Example:** **Example:**
@ -146,10 +146,10 @@ The resolver may throw the following errors:
```typescript ```typescript
import { CustomResolver, LastWriteWinsPlugin } from './resolver'; import { CustomResolver, LastWriteWinsPlugin } from './resolver';
import { LosslessView } from '../lossless-view'; import { HyperviewView } from '../hyperview-view';
// Create a lossless view with some data // Create a hyperview view with some data
const view = new LosslessView(); const view = new HyperviewView();
// ... add data to the view ... // ... add data to the view ...
// Configure the resolver // Configure the resolver

View File

@ -22,22 +22,26 @@ The `CustomResolver` system provides a flexible framework for resolving property
## Getting Started ## Getting Started
```typescript ```typescript
import { CustomResolver, LastWriteWinsPlugin } from './resolver'; import { CustomResolver } from '../src/views/resolvers/custom-resolvers';
import { LosslessView } from '../lossless-view'; import { LastWriteWinsPlugin } from '../src/views/resolvers/custom-resolvers/plugins';
import { Hyperview } from '../src/views/hyperview';
import { RhizomeNode } from '../src/node';
// Create a lossless view // Create a hyperview backed by a local node
const view = new LosslessView(); const node = new RhizomeNode();
const hyperview = new Hyperview(node);
// Create a resolver with a last-write-wins strategy // Create a resolver with a last-write-wins strategy
const resolver = new CustomResolver(view, { const resolver = new CustomResolver(hyperview, {
myProperty: new LastWriteWinsPlugin() myProperty: new LastWriteWinsPlugin()
}); });
// Process updates // Process updates through the hyperview view
// ... // hyperview.ingestDelta(delta);
// Get resolved values // Get resolved values for specific entities
const result = resolver.resolve(); const result = resolver.resolve(['entity1', 'entity2']);
// Or get all resolved values
const allResults = resolver.resolveAll();
``` ```
## Next Steps ## Next Steps

View File

@ -78,11 +78,11 @@ Create tests to verify your plugin's behavior:
```typescript ```typescript
describe('DiscountedPricePlugin', () => { describe('DiscountedPricePlugin', () => {
let view: LosslessView; let view: HyperviewView;
let resolver: CustomResolver; let resolver: CustomResolver;
beforeEach(() => { beforeEach(() => {
view = new LosslessView(); view = new HyperviewView();
resolver = new CustomResolver(view, { resolver = new CustomResolver(view, {
basePrice: new LastWriteWinsPlugin(), basePrice: new LastWriteWinsPlugin(),
discount: new LastWriteWinsPlugin(), discount: new LastWriteWinsPlugin(),

View File

@ -1,15 +1,150 @@
# Resolvers (Views) # Views and Resolvers
The workhorse of this system is likely going to be our lossy views. ## Core Concepts
This is where the computation likely generally occurs.
So, let's talk about how to create a view. ### Views
A lossy view initializes from a given lossless view. A `View` (previously known as `Lossy`) is a derived, computed representation of your data that provides efficient access to resolved values. It's built on top of the `Hyperview` (previously `Lossless`) storage layer, which maintains the complete history of all deltas.
The lossless view dispatches events when entity properties are updated.
View semantics are similar to map-reduce, resolvers in Redux, etc. ```typescript
// Basic View implementation
export abstract class View<Accumulator, Result = Accumulator> {
// Core methods
abstract reducer(acc: Accumulator, cur: HyperviewOne): Accumulator;
resolver?(acc: Accumulator, entityIds: DomainEntityID[]): Result;
// Built-in functionality
resolve(entityIds?: DomainEntityID[]): Result | undefined;
}
```
The key is to identify your accumulator object. ### Hyperview
Your algorithm SHOULD be implemented so that the reducer is a pure function.
All state must therefore be stored in the accumulator. `Hyperview` (previously `Lossless`) maintains the complete, immutable history of all deltas and provides the foundation for building derived views.
```typescript
// Basic Hyperview usage
const hyperview = new Hyperview(rhizomeNode);
const view = new MyCustomView(hyperview);
const result = view.resolve(['entity1', 'entity2']);
```
## Creating Custom Resolvers
### Basic Resolver Pattern
1. **Define your accumulator type**: This holds the state of your view
2. **Implement the reducer**: Pure function that updates the accumulator
3. **Optionally implement resolver**: Transforms the accumulator into the final result
```typescript
class CountResolver extends View<number> {
reducer(acc: number, view: HyperviewOne): number {
return acc + Object.keys(view.propertyDeltas).length;
}
initializer() { return 0; }
}
```
### Custom Resolver with Dependencies
Resolvers can depend on other resolvers using the `CustomResolver` system:
```typescript
// Define resolver plugins
const resolver = new CustomResolver(hyperview, {
totalItems: new CountPlugin(),
average: new AveragePlugin()
});
// Get resolved values
const result = resolver.resolve(['entity1', 'entity2']);
```
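The tests in this PR wire dependent plugins by passing the source property name to the plugin constructor; a minimal sketch of that configuration (import path as used in the test files):

```typescript
import {
  CustomResolver,
  LastWriteWinsPlugin,
  MaxPlugin,
  MinPlugin
} from '@src/views/resolvers/custom-resolvers';

// `hyperview` is an existing Hyperview instance, as in the snippets above
const resolver = new CustomResolver(hyperview, {
  temperature: new LastWriteWinsPlugin(),  // independent property
  maxTemp: new MaxPlugin('temperature'),   // depends on 'temperature'
  minTemp: new MinPlugin('temperature')    // depends on 'temperature'
});

const stats = resolver.resolve(['room1']); // resolved values for entity 'room1'
```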
## Built-in Resolvers
### Last Write Wins
Keeps the most recent value based on timestamp:
```typescript
const resolver = new CustomResolver(hyperview, {
status: new LastWriteWinsPlugin()
});
```
### First Write Wins
Keeps the first value that was written:
```typescript
const resolver = new CustomResolver(hyperview, {
initialStatus: new FirstWriteWinsPlugin()
});
```
### Aggregation Resolvers
Built-in aggregators for common operations:
- `SumPlugin`: Sum of numeric values
- `AveragePlugin`: Average of numeric values
- `CountPlugin`: Count of values
- `MinPlugin`: Minimum value
- `MaxPlugin`: Maximum value
## Advanced Topics
### Performance Considerations
1. **Efficient Reducers**: Keep reducers pure and fast
2. **Selective Updates**: Only process changed entities
3. **Memoization**: Cache expensive computations
### Common Patterns
1. **Derived Properties**: Calculate values based on other properties
2. **State Machines**: Track state transitions over time
3. **Validation**: Enforce data consistency rules
## Best Practices
1. **Keep Reducers Pure**: Avoid side effects in reducers
2. **Use Strong Typing**: Leverage TypeScript for type safety
3. **Handle Edge Cases**: Consider empty states and error conditions
4. **Profile Performance**: Monitor and optimize hot paths
## Examples
### Simple Counter
```typescript
class CounterView extends View<number> {
reducer(acc: number, view: HyperviewOne): number {
return acc + (view.propertyDeltas['increment']?.length ?? 0); // a missing property contributes 0 instead of resetting the count
}
initializer() { return 0; }
}
```
### Running Average
```typescript
class RunningAverageView extends View<{sum: number, count: number}, number> {
reducer(acc: {sum: number, count: number}, view: HyperviewOne): {sum: number, count: number} {
// How to extract the numeric value depends on your delta shape; a hypothetical
// helper is used here purely for illustration.
const value = extractNumericValue(view);
return {
sum: acc.sum + value,
count: acc.count + 1
};
}
resolver(acc: {sum: number, count: number}): number {
return acc.count > 0 ? acc.sum / acc.count : 0;
}
initializer() { return {sum: 0, count: 0}; }
}
```
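A possible way to use the view above, following the constructor and `resolve` pattern shown earlier in this document (and assuming the value-extraction helper noted in the reducer):

```typescript
// Usage sketch: build the view over an existing Hyperview and resolve one entity
const averages = new RunningAverageView(hyperview);
const sensorAverage = averages.resolve(['sensor1']); // number | undefined
```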

View File

@ -56,7 +56,7 @@ const unsafeDelta = createDelta('peer1', 'peer1')
.buildV1(); .buildV1();
// This will be ingested without validation // This will be ingested without validation
node.lossless.ingestDelta(unsafeDelta); node.hyperview.ingestDelta(unsafeDelta);
// 5. Check validation status after the fact // 5. Check validation status after the fact
const stats = collection.getValidationStats(); const stats = collection.getValidationStats();

View File

@ -84,9 +84,9 @@ const delta = createTestDelta('user1', 'host1')
## How It Works ## How It Works
1. Creates a new `Lossless` instance for the test 1. Creates a new `Hyperview` instance for the test
2. Sets up a `CustomResolver` with the provided plugins 2. Sets up a `CustomResolver` with the provided plugins
3. Ingests all provided deltas into the `Lossless` instance 3. Ingests all provided deltas into the `Hyperview` instance
4. Retrieves a view for the specified entity 4. Retrieves a view for the specified entity
5. Processes the view through the resolver 5. Processes the view through the resolver
6. Calls the `expectedResult` callback with the resolved entity 6. Calls the `expectedResult` callback with the resolved entity
@ -97,4 +97,4 @@ const delta = createTestDelta('user1', 'host1')
- The helper handles all setup and teardown of test resources - The helper handles all setup and teardown of test resources
- Use `createTestDelta` for consistent delta creation in tests - Use `createTestDelta` for consistent delta creation in tests
- The helper ensures type safety between the resolver and the expected result type - The helper ensures type safety between the resolver and the expected result type
- Each test gets a fresh `Lossless` instance automatically - Each test gets a fresh `Hyperview` instance automatically

View File

@ -18,8 +18,8 @@ type User = {
(async () => { (async () => {
const rhizomeNode = new RhizomeNode(); const rhizomeNode = new RhizomeNode();
// Enable API to read lossless view // Enable API to read hyperview view
rhizomeNode.httpServer.httpApi.serveLossless(); rhizomeNode.httpServer.httpApi.serveHyperview();
const users = new BasicCollection("user"); const users = new BasicCollection("user");
users.rhizomeConnect(rhizomeNode); users.rhizomeConnect(rhizomeNode);

View File

@ -136,10 +136,10 @@ smalltalk type messaging structure on top of the database
note dimensions of attack surface note dimensions of attack surface
layers: layers:
primitives - what's a delta, schema, materialized view, lossy view primitives - what's a delta, schema, materialized view, view
delta store - allows you to persist deltas and query over the delta stream delta store - allows you to persist deltas and query over the delta stream
materialized view store - lossy snapshot(s) materialized view store - view snapshot(s)
lossy bindings - e.g. graphql, define what a user looks like, that gets application bindings view bindings - e.g. graphql, define what a user looks like, that gets application bindings
e.g. where your resolvers come in, so that your fields aren't all arrays, i.e. you resolve conflicts e.g. where your resolvers come in, so that your fields aren't all arrays, i.e. you resolve conflicts
-- idea: diff tools -- idea: diff tools

View File

@ -34,10 +34,10 @@ Pre-built technologies? LevelDB
Can use LevelDB to store deltas Can use LevelDB to store deltas
Structure can correspond to our desired ontology Structure can correspond to our desired ontology
Layers for Layers for
- primitives - what's a delta, schema, materialized view, lossy view - primitives - what's a delta, schema, materialized view, view
- delta store - allows you to persist deltas and query over the delta stream - delta store - allows you to persist deltas and query over the delta stream
- materialized view store - lossy snapshot(s) - materialized view store - view snapshot(s)
- lossy bindings - e.g. graphql, define what a user looks like, that gets application bindings - view bindings - e.g. graphql, define what a user looks like, that gets application bindings
e.g. where your resolvers come in, so that your fields aren't all arrays, i.e. you resolve conflicts e.g. where your resolvers come in, so that your fields aren't all arrays, i.e. you resolve conflicts
Protocol involving ZeroMQ-- Protocol involving ZeroMQ--

View File

@ -1,11 +1,11 @@
> myk: > myk:
> I think so far this seems mostly on point, but I'd focus on building the bridge between Domain Entity (lossy representation) <-> Lossless Representation <-> Delta[] I think > I think so far this seems mostly on point, but I'd focus on building the bridge between Domain Entity (view representation) <-> Hyperview Representation <-> Delta[] I think
> the tricky stuff comes in with, like, how do you take an undifferentiated stream of deltas, a query and a schema > the tricky stuff comes in with, like, how do you take an undifferentiated stream of deltas, a query and a schema
> and filter / merge those deltas into the lossless tree structure you need in order to then reduce into a lossy domain node > and filter / merge those deltas into the hyperview tree structure you need in order to then reduce into a view domain node
> if that part of the flow works then the rest becomes fairly self-evident > if that part of the flow works then the rest becomes fairly self-evident
> a "lossless representation" is basically a DAG/Tree that starts with a root node whose properties each contain the deltas that assign values to them, where the delta may have a pointer up to "this" and then down to some related domain node, which gets interpolated into the tree instead of just referenced, and it has its properties contain the deltas that target it, etc > a "hyperview representation" is basically a DAG/Tree that starts with a root node whose properties each contain the deltas that assign values to them, where the delta may have a pointer up to "this" and then down to some related domain node, which gets interpolated into the tree instead of just referenced, and it has its properties contain the deltas that target it, etc
> so you need both the ID of the root node (the thing being targeted by one or more deltas) as well as the schema to apply to determine which contexts on that target to include (target_context effectively becomes a property on the domain entity, right?), as well as which schema to apply to included referenced entities, etc. > so you need both the ID of the root node (the thing being targeted by one or more deltas) as well as the schema to apply to determine which contexts on that target to include (target_context effectively becomes a property on the domain entity, right?), as well as which schema to apply to included referenced entities, etc.
> so it's what keeps you from returning the whole stream of deltas, while still allowing you to follow arbitrary edges > so it's what keeps you from returning the whole stream of deltas, while still allowing you to follow arbitrary edges
@ -31,7 +31,7 @@ Example delta:
target: "usd" target: "usd"
}] }]
Lossless transformation: Hyperview transformation:
{ {
keanu: { keanu: {

View File

@ -9,7 +9,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
1. **Deltas are relationships**: Every delta with pointers already expresses relationships 1. **Deltas are relationships**: Every delta with pointers already expresses relationships
2. **Patterns, not structure**: We're recognizing patterns in how deltas connect entities 2. **Patterns, not structure**: We're recognizing patterns in how deltas connect entities
3. **Perspective-driven**: Different views/resolvers can interpret the same deltas differently 3. **Perspective-driven**: Different views/resolvers can interpret the same deltas differently
4. **No single truth**: Competing deltas are resolved by application-level lossy resolvers 4. **No single truth**: Competing deltas are resolved by application-level view resolvers
5. **Time-aware**: All queries are inherently temporal, showing different relationships at different times 5. **Time-aware**: All queries are inherently temporal, showing different relationships at different times
## Current State ✅ ## Current State ✅
@ -17,7 +17,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
- **All tests passing**: 21/21 suites, 183/183 tests (100%) - **All tests passing**: 21/21 suites, 183/183 tests (100%)
- **Delta system**: Fully functional with pointers expressing relationships - **Delta system**: Fully functional with pointers expressing relationships
- **Negation system**: Can invalidate deltas (and thus relationships) - **Negation system**: Can invalidate deltas (and thus relationships)
- **Query system**: Basic traversal of lossless views - **Query system**: Basic traversal of hyperview views
- **Schema system**: Can describe entity structures - **Schema system**: Can describe entity structures
- **Resolver system**: Application-level interpretation of deltas - **Resolver system**: Application-level interpretation of deltas
@ -120,7 +120,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
class PatternResolver { class PatternResolver {
// Interpret deltas matching certain patterns // Interpret deltas matching certain patterns
resolveWithPatterns(entityId, patterns) { resolveWithPatterns(entityId, patterns) {
const deltas = this.lossless.getDeltasForEntity(entityId); const deltas = this.hyperview.getDeltasForEntity(entityId);
return { return {
entity: entityId, entity: entityId,

View File

@ -2,7 +2,7 @@
## Executive Summary ## Executive Summary
The rhizome-node implementation demonstrates strong alignment with core spec concepts but lacks implementation and testing for several advanced features. The fundamental delta → lossless → lossy transformation pipeline is well-implemented, while query systems, relational features, and advanced conflict resolution remain unimplemented. The rhizome-node implementation demonstrates strong alignment with core spec concepts but lacks implementation and testing for several advanced features. The fundamental delta → hyperview → view transformation pipeline is well-implemented, while query systems, relational features, and advanced conflict resolution remain unimplemented.
## Core Concept Alignment ## Core Concept Alignment
@ -13,13 +13,13 @@ The rhizome-node implementation demonstrates strong alignment with core spec con
- **Implementation**: Correctly implements both V1 (array) and V2 (object) formats - **Implementation**: Correctly implements both V1 (array) and V2 (object) formats
- **Tests**: Basic format conversion tested, but validation gaps exist - **Tests**: Basic format conversion tested, but validation gaps exist
2. **Lossless Views** 2. **Hyperview Views**
- **Spec**: Full inventory of all deltas composing an object - **Spec**: Full inventory of all deltas composing an object
- **Implementation**: `LosslessViewDomain` correctly accumulates deltas by entity/property - **Implementation**: `HyperviewViewDomain` correctly accumulates deltas by entity/property
- **Tests**: Good coverage of basic transformation, filtering by creator/host - **Tests**: Good coverage of basic transformation, filtering by creator/host
3. **Lossy Views** 3. **Lossy Views**
- **Spec**: Compression of lossless views using resolution strategies - **Spec**: Compression of hyperview views using resolution strategies
- **Implementation**: Initializer/reducer/resolver pattern provides flexibility - **Implementation**: Initializer/reducer/resolver pattern provides flexibility
- **Tests**: Domain-specific example (Role/Actor/Film) demonstrates concept - **Tests**: Domain-specific example (Role/Actor/Film) demonstrates concept
@ -127,4 +127,4 @@ The rhizome-node implementation demonstrates strong alignment with core spec con
## Conclusion ## Conclusion
The rhizome-node implementation successfully captures the core concepts of the spec but requires significant work to achieve full compliance. The foundation is solid, with the delta/lossless/lossy pipeline working as designed. However, advanced features like queries, schemas, and sophisticated conflict resolution remain unimplemented. The test suite would benefit from expanded coverage of edge cases, validation, and distributed system scenarios. The rhizome-node implementation successfully captures the core concepts of the spec but requires significant work to achieve full compliance. The foundation is solid, with the delta/hyperview/view pipeline working as designed. However, advanced features like queries, schemas, and sophisticated conflict resolution remain unimplemented. The test suite would benefit from expanded coverage of edge cases, validation, and distributed system scenarios.

spec.md
View File

@ -8,11 +8,11 @@
* a `primitive` is a literal string, number or boolean value whose meaning is not tied up in its being a reference to a larger whole. * a `primitive` is a literal string, number or boolean value whose meaning is not tied up in its being a reference to a larger whole.
* An `object` is a composite object whose entire existence is encoded as the set of deltas that reference it. An object is identified by a unique `reference`, and every delta that includes that `reference` is asserting a claim about some property of that object. * An `object` is a composite object whose entire existence is encoded as the set of deltas that reference it. An object is identified by a unique `reference`, and every delta that includes that `reference` is asserting a claim about some property of that object.
* A `negation` is a specific kind of delta that includes a pointer with the name `negates`, a `target` reference to another delta, and a `context` called `negated_by`. * A `negation` is a specific kind of delta that includes a pointer with the name `negates`, a `target` reference to another delta, and a `context` called `negated_by`.
* a `schema` represents a template by which an `object` can be compiled into a `lossless view`. A schema specifies which properties of that object are included, and it specifies schemas for the objects references by the deltas within those properties. A schema must terminate in primitive schemas to avoid an infinite regress. * a `schema` represents a template by which an `object` can be compiled into a `hyperview view`. A schema specifies which properties of that object are included, and it specifies schemas for the objects references by the deltas within those properties. A schema must terminate in primitive schemas to avoid an infinite regress.
* For instance, a `lossless view` "User" of a user may include references to friends. If those friends are in turn encoded as instances of the "User" schema then all of *their* friends would be fully encoded, etc. * For instance, a `hyperview view` "User" of a user may include references to friends. If those friends are in turn encoded as instances of the "User" schema then all of *their* friends would be fully encoded, etc.
* This could lead to circular references and arbitrarily deep nesting, which runs into the problem of "returning the entire graph". So our schema should specify, for instance, that the "friends" field apply the "Summary" schema to referenced users rather than the "User" schema, where the "Summary" schema simply resolves to username and photo. * This could lead to circular references and arbitrarily deep nesting, which runs into the problem of "returning the entire graph". So our schema should specify, for instance, that the "friends" field apply the "Summary" schema to referenced users rather than the "User" schema, where the "Summary" schema simply resolves to username and photo.
* A `lossless view` is a representation of an `object` that includes a full inventory of all of the deltas that compose that object. So for instance, a lossless view of the object representing the user "Alice" might include `alice.name`, which contains an array of all deltas with a pointer whose `target` is the ID of Alice and whose context is `name`. Such deltas would likely include a second pointer with the name `name` and the target a primitive string "Alice", for instance. * A `hyperview view` is a representation of an `object` that includes a full inventory of all of the deltas that compose that object. So for instance, a hyperview view of the object representing the user "Alice" might include `alice.name`, which contains an array of all deltas with a pointer whose `target` is the ID of Alice and whose context is `name`. Such deltas would likely include a second pointer with the name `name` and the target a primitive string "Alice", for instance.
* A `lossless view` may also include nested delta/object layering. Consider `alice.friends`, which would include all deltas asserting friendship between Alice and some other person. Each such delta would reference a different friend object. In a lossless view, these references would be expanded to contain lossless views of those friends. Schemas, as defined above, would be applied to constrain tree depth and avoid infinite regress. * A `hyperview view` may also include nested delta/object layering. Consider `alice.friends`, which would include all deltas asserting friendship between Alice and some other person. Each such delta would reference a different friend object. In a hyperview view, these references would be expanded to contain hyperview views of those friends. Schemas, as defined above, would be applied to constrain tree depth and avoid infinite regress.
* A `lossy view` is a compression of a `lossless view` that removes delta information and flattens the structure into a standard domain object, typically in JSON. So instead of `alice.name` resolving to a list of deltas that assert the object's name it might simply resolve to `"alice"`. * A `view` is a compression of a `hyperview view` that removes delta information and flattens the structure into a standard domain object, typically in JSON. So instead of `alice.name` resolving to a list of deltas that assert the object's name it might simply resolve to `"alice"`.
* Note that in a lossless view any property of an object necessarily resolves to a set of deltas, even if it's an empty set, because we cannot anticipate how many deltas exist that assert values on that context. * Note that in a hyperview view any property of an object necessarily resolves to a set of deltas, even if it's an empty set, because we cannot anticipate how many deltas exist that assert values on that context.
* In collapsing a lossless view into a lossy view we may specify `resolution strategies` on each field of the schema. A resolution strategy takes as input the set of all deltas targeting that context and returns as output the value in the lossy view. So if we have 15 deltas asserting the value for an object's name, our resolution strategy may simply say "return the target of the `name` pointer associated with the most recent delta", or it may say "return an array of names". If the value is numeric it may say "take the max" or "take the min" or "take the average". * In collapsing a hyperview view into a view we may specify `resolution strategies` on each field of the schema. A resolution strategy takes as input the set of all deltas targeting that context and returns as output the value in the view. So if we have 15 deltas asserting the value for an object's name, our resolution strategy may simply say "return the target of the `name` pointer associated with the most recent delta", or it may say "return an array of names". If the value is numeric it may say "take the max" or "take the min" or "take the average".
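To make the spec's collapse step concrete, here is a minimal TypeScript sketch using simplified stand-in types rather than the project's actual `Delta` and `HyperviewViewOne` classes; the `timeCreated` and `value` fields are assumptions chosen purely for illustration.

```typescript
// Minimal sketch with simplified stand-in types (not the real rhizome-node classes).
// A hyperview keeps every delta per property; a resolution strategy collapses them.
type ToyDelta = { id: string; timeCreated: number; value: string | number | boolean };
type ToyHyperview = { id: string; propertyDeltas: Record<string, ToyDelta[]> };

// Resolution strategy: "return the value of the most recent delta" (last-write-wins).
function resolveLastWrite(entity: ToyHyperview, property: string) {
  const deltas = entity.propertyDeltas[property] ?? []; // a property always resolves to a set, possibly empty
  if (deltas.length === 0) return undefined;
  return deltas.reduce((a, b) => (b.timeCreated > a.timeCreated ? b : a)).value;
}

const alice: ToyHyperview = {
  id: 'alice',
  propertyDeltas: {
    name: [
      { id: 'd1', timeCreated: 1, value: 'Alice' },
      { id: 'd2', timeCreated: 2, value: 'Alice B.' },
    ],
  },
};

console.log(resolveLastWrite(alice, 'name')); // "Alice B."
```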

View File

@ -18,7 +18,7 @@ export abstract class Collection<View> {
rhizomeNode?: RhizomeNode; rhizomeNode?: RhizomeNode;
name: string; name: string;
eventStream = new EventEmitter(); eventStream = new EventEmitter();
lossy?: View; view?: View;
constructor(name: string) { constructor(name: string) {
this.name = name; this.name = name;
@ -34,7 +34,7 @@ export abstract class Collection<View> {
this.initializeView(); this.initializeView();
// Listen for completed transactions, and emit updates to event stream // Listen for completed transactions, and emit updates to event stream
this.rhizomeNode.lossless.eventStream.on("updated", (id) => { this.rhizomeNode.hyperview.eventStream.on("updated", (id) => {
// TODO: Filter so we only get members of our collection // TODO: Filter so we only get members of our collection
// TODO: Resolver / Delta Filter? // TODO: Resolver / Delta Filter?
@ -128,7 +128,7 @@ export abstract class Collection<View> {
} }
debug(`Getting ids for collection ${this.name}`) debug(`Getting ids for collection ${this.name}`)
const ids = new Set<string>(); const ids = new Set<string>();
for (const [entityId, names] of this.rhizomeNode.lossless.referencedAs.entries()) { for (const [entityId, names] of this.rhizomeNode.hyperview.referencedAs.entries()) {
if (names.has(this.name)) { if (names.has(this.name)) {
ids.add(entityId); ids.add(entityId);
} }
@ -160,7 +160,7 @@ export abstract class Collection<View> {
); );
const ingested = new Promise<boolean>((resolve) => { const ingested = new Promise<boolean>((resolve) => {
this.rhizomeNode!.lossless.eventStream.on("updated", (id: DomainEntityID) => { this.rhizomeNode!.hyperview.eventStream.on("updated", (id: DomainEntityID) => {
if (id === entityId) resolve(true); if (id === entityId) resolve(true);
}) })
}); });
@ -178,7 +178,7 @@ export abstract class Collection<View> {
await this.rhizomeNode!.deltaStream.publishDelta(delta); await this.rhizomeNode!.deltaStream.publishDelta(delta);
// ingest the delta as though we had received it from a peer // ingest the delta as though we had received it from a peer
this.rhizomeNode!.lossless.ingestDelta(delta); this.rhizomeNode!.hyperview.ingestDelta(delta);
}); });
// Return updated view of this entity // Return updated view of this entity

View File

@ -7,20 +7,20 @@ import {Collection} from '../collections/collection-abstract';
import {TimestampResolver} from '../views/resolvers/timestamp-resolvers'; import {TimestampResolver} from '../views/resolvers/timestamp-resolvers';
export class BasicCollection extends Collection<TimestampResolver> { export class BasicCollection extends Collection<TimestampResolver> {
declare lossy?: TimestampResolver; declare view?: TimestampResolver;
initializeView() { initializeView() {
if (!this.rhizomeNode) throw new Error('not connected to rhizome'); if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.lossy = new TimestampResolver(this.rhizomeNode.lossless); this.view = new TimestampResolver(this.rhizomeNode.hyperview);
} }
resolve( resolve(
id: string id: string
) { ) {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome'); if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.lossy) throw new Error('lossy view not initialized'); if (!this.view) throw new Error('view view not initialized');
const res = this.lossy.resolve([id]) || {}; const res = this.view.resolve([id]) || {};
return res[id]; return res[id];
} }
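Following the pattern above, a hedged sketch of a custom collection that swaps in an aggregation view in place of the timestamp resolver; the import path for the aggregation resolvers and the `score` property are assumptions.

```typescript
import { Collection } from '../collections/collection-abstract';
import { SumResolver } from '../views/resolvers/aggregation-resolvers';

// Sketch only: a collection whose view sums numeric 'score' deltas per entity,
// mirroring the BasicCollection shape shown above.
export class ScoreCollection extends Collection<SumResolver> {
  declare view?: SumResolver;

  initializeView() {
    if (!this.rhizomeNode) throw new Error('not connected to rhizome');
    this.view = new SumResolver(this.rhizomeNode.hyperview, ['score']);
  }

  resolve(id: string) {
    if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
    if (!this.view) throw new Error('view not initialized');
    const res = this.view.resolve([id]) || {};
    return res[id];
  }
}
```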

View File

@ -6,20 +6,20 @@ class RelationalView extends TimestampResolver {
} }
export class RelationalCollection extends Collection<RelationalView> { export class RelationalCollection extends Collection<RelationalView> {
declare lossy?: RelationalView; declare view?: RelationalView;
initializeView() { initializeView() {
if (!this.rhizomeNode) throw new Error('not connected to rhizome'); if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.lossy = new RelationalView(this.rhizomeNode.lossless); this.view = new RelationalView(this.rhizomeNode.hyperview);
} }
resolve( resolve(
id: string id: string
): ResolvedViewOne | undefined { ): ResolvedViewOne | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome'); if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.lossy) throw new Error('lossy view not initialized'); if (!this.view) throw new Error('view view not initialized');
const res = this.lossy.resolve([id]) || {}; const res = this.view.resolve([id]) || {};
return res[id]; return res[id];
} }

View File

@ -10,7 +10,7 @@ import {
SchemaApplicationOptions SchemaApplicationOptions
} from '../schema/schema'; } from '../schema/schema';
import { DefaultSchemaRegistry } from '../schema/schema-registry'; import { DefaultSchemaRegistry } from '../schema/schema-registry';
import { LosslessViewOne } from '../views/lossless'; import { HyperviewViewOne } from '../views/hyperview';
import { DomainEntityID } from '../core/types'; import { DomainEntityID } from '../core/types';
import { EntityProperties } from '../core/entity'; import { EntityProperties } from '../core/entity';
import { createDelta } from '@src/core'; import { createDelta } from '@src/core';
@ -58,38 +58,38 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
initializeView(): void { initializeView(): void {
if (!this.rhizomeNode) throw new Error('not connected to rhizome'); if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.lossy = new TimestampResolver(this.rhizomeNode.lossless); this.view = new TimestampResolver(this.rhizomeNode.hyperview);
} }
resolve(id: string): ResolvedViewOne | undefined { resolve(id: string): ResolvedViewOne | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome'); if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.lossy) throw new Error('lossy view not initialized'); if (!this.view) throw new Error('view view not initialized');
const res = this.lossy.resolve([id]) || {}; const res = this.view.resolve([id]) || {};
return res[id]; return res[id];
} }
// Validate an entity against the schema // Validate an entity against the schema
validate(entity: T): SchemaValidationResult { validate(entity: T): SchemaValidationResult {
// Convert entity to a mock lossless view for validation // Convert entity to a mock hyperview view for validation
const mockLosslessView: LosslessViewOne = { const mockHyperviewView: HyperviewViewOne = {
id: 'validation-mock', id: 'validation-mock',
referencedAs: [], referencedAs: [],
propertyDeltas: {}, propertyDeltas: {},
}; };
for (const [key, value] of Object.entries(entity)) { for (const [key, value] of Object.entries(entity)) {
mockLosslessView.propertyDeltas[key] = [createDelta('validation', 'validation') mockHyperviewView.propertyDeltas[key] = [createDelta('validation', 'validation')
.addPointer(key, value as string) .addPointer(key, value as string)
.buildV1(), .buildV1(),
]; ];
} }
return this.schemaRegistry.validate('validation-mock', this.schema.id, mockLosslessView); return this.schemaRegistry.validate('validation-mock', this.schema.id, mockHyperviewView);
} }
// Apply schema to a lossless view // Apply schema to a hyperview view
apply(view: LosslessViewOne): SchemaAppliedView { apply(view: HyperviewViewOne): SchemaAppliedView {
return this.schemaRegistry.applySchema(view, this.schema.id, this.applicationOptions); return this.schemaRegistry.applySchema(view, this.schema.id, this.applicationOptions);
} }
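A hedged sketch of the mock-view trick used by `validate()` above, written as a standalone helper; it assumes a schema with id `'user'` has already been registered on the registry via `register()`.

```typescript
import { createDelta } from '@src/core';
import { DefaultSchemaRegistry } from '../schema/schema-registry';
import { HyperviewViewOne } from '../views/hyperview';

// Sketch: validate a plain object against a registered schema by wrapping each
// property in a synthetic delta, mirroring TypedCollectionImpl.validate() above.
export function validateAgainstSchema(
  registry: DefaultSchemaRegistry,
  schemaId: string,
  entity: Record<string, unknown>,
) {
  const mockView: HyperviewViewOne = {
    id: 'validation-mock',
    referencedAs: [],
    propertyDeltas: {},
  };
  for (const [key, value] of Object.entries(entity)) {
    mockView.propertyDeltas[key] = [
      createDelta('validation', 'validation').addPointer(key, value as string).buildV1(),
    ];
  }
  return registry.validate('validation-mock', schemaId, mockView);
}

// Usage (assumes registry.register({...}) was called for 'user' beforehand):
// const result = validateAgainstSchema(registry, 'user', { name: 'Alice' });
// console.log(result.valid);
```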
@ -97,10 +97,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
getValidatedView(entityId: DomainEntityID): SchemaAppliedView | undefined { getValidatedView(entityId: DomainEntityID): SchemaAppliedView | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome'); if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId]; const hyperviewView = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!losslessView) return undefined; if (!hyperviewView) return undefined;
return this.apply(losslessView); return this.apply(hyperviewView);
} }
// Get all entities in this collection with schema validation // Get all entities in this collection with schema validation
@ -169,10 +169,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
for (const entityId of entityIds) { for (const entityId of entityIds) {
if (!this.rhizomeNode) continue; if (!this.rhizomeNode) continue;
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId]; const hyperviewView = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!losslessView) continue; if (!hyperviewView) continue;
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView); const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperviewView);
if (validationResult.valid) { if (validationResult.valid) {
stats.validEntities++; stats.validEntities++;
@ -200,16 +200,16 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
debug(`No rhizome node connected`) debug(`No rhizome node connected`)
return []; return [];
} }
const losslessView = this.rhizomeNode.lossless.compose(this.getIds()); const hyperviewView = this.rhizomeNode.hyperview.compose(this.getIds());
if (!losslessView) { if (!hyperviewView) {
debug(`No lossless view found`) debug(`No hyperview view found`)
return []; return [];
} }
debug(`getValidEntities, losslessView: ${JSON.stringify(losslessView, null, 2)}`) debug(`getValidEntities, hyperviewView: ${JSON.stringify(hyperviewView, null, 2)}`)
debug(`Validating ${this.getIds().length} entities`) debug(`Validating ${this.getIds().length} entities`)
return this.getIds().filter(entityId => { return this.getIds().filter(entityId => {
debug(`Validating entity ${entityId}`) debug(`Validating entity ${entityId}`)
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView[entityId]); const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperviewView[entityId]);
debug(`Validation result for entity ${entityId}: ${JSON.stringify(validationResult)}`) debug(`Validation result for entity ${entityId}: ${JSON.stringify(validationResult)}`)
return validationResult.valid; return validationResult.valid;
}); });
@ -221,10 +221,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
const invalid: Array<{ entityId: DomainEntityID; errors: string[] }> = []; const invalid: Array<{ entityId: DomainEntityID; errors: string[] }> = [];
for (const entityId of this.getIds()) { for (const entityId of this.getIds()) {
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId]; const hyperviewView = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!losslessView) continue; if (!hyperviewView) continue;
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView); const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperviewView);
if (!validationResult.valid) { if (!validationResult.valid) {
invalid.push({ invalid.push({
entityId, entityId,

View File

@ -1,24 +0,0 @@
// A delta represents an assertion from a given (perspective/POV/context).
// So we want it to be fluent to express these in the local context,
// and propagated as deltas in a configurable manner; i.e. configurable batches or immediate
// import {Delta} from '../core/types';
export class Entity {
}
export class Context {
}
export class Assertion {
}
export class View {
}
export class User {
}
export function example() {
}

View File

@ -1,5 +1,4 @@
export * from './delta'; export * from './delta';
export * from './delta-builder'; export * from './delta-builder';
export * from './types'; export * from './types';
export * from './context';
export { Entity } from './entity'; export { Entity } from './entity';

View File

@ -1,7 +1,7 @@
import Debug from "debug"; import Debug from "debug";
import EventEmitter from "events"; import EventEmitter from "events";
import {Delta, DeltaID} from "../core/delta"; import {Delta, DeltaID} from "../core/delta";
import {Lossless} from "../views/lossless"; import {Hyperview} from "../views/hyperview";
import {DomainEntityID, TransactionID} from "../core/types"; import {DomainEntityID, TransactionID} from "../core/types";
const debug = Debug('rz:transactions'); const debug = Debug('rz:transactions');
@ -72,7 +72,7 @@ export class Transactions {
transactions = new Map<TransactionID, Transaction>(); transactions = new Map<TransactionID, Transaction>();
eventStream = new EventEmitter(); eventStream = new EventEmitter();
constructor(readonly lossless: Lossless) {} constructor(readonly hyperview: Hyperview) {}
get(id: TransactionID): Transaction | undefined { get(id: TransactionID): Transaction | undefined {
return this.transactions.get(id); return this.transactions.get(id);
@ -116,7 +116,7 @@ export class Transactions {
{ {
const {transactionId, size} = getTransactionSize(delta) || {}; const {transactionId, size} = getTransactionSize(delta) || {};
if (transactionId && size) { if (transactionId && size) {
debug(`[${this.lossless.rhizomeNode.config.peerId}]`, `Transaction ${transactionId} has size ${size}`); debug(`[${this.hyperview.rhizomeNode.config.peerId}]`, `Transaction ${transactionId} has size ${size}`);
this.setSize(transactionId, size as number); this.setSize(transactionId, size as number);

View File

@ -71,8 +71,8 @@ export class HttpApi {
res.json(this.rhizomeNode.peers.peers.length); res.json(this.rhizomeNode.peers.peers.length);
}); });
// Initialize lossless and query endpoints // Initialize hyperview and query endpoints
this.serveLossless(); this.serveHyperview();
this.serveQuery(); this.serveQuery();
} }
@ -116,17 +116,17 @@ export class HttpApi {
}); });
} }
serveLossless() { serveHyperview() {
// Get all domain entity IDs. TODO: This won't scale // Get all domain entity IDs. TODO: This won't scale
this.router.get('/lossless/ids', (_req: express.Request, res: express.Response) => { this.router.get('/hyperview/ids', (_req: express.Request, res: express.Response) => {
res.json({ res.json({
ids: Array.from(this.rhizomeNode.lossless.domainEntities.keys()) ids: Array.from(this.rhizomeNode.hyperview.domainEntities.keys())
}); });
}); });
// Get all transaction IDs. TODO: This won't scale // Get all transaction IDs. TODO: This won't scale
this.router.get('/transaction/ids', (_req: express.Request, res: express.Response) => { this.router.get('/transaction/ids', (_req: express.Request, res: express.Response) => {
const set = this.rhizomeNode.lossless.referencedAs.get("_transaction"); const set = this.rhizomeNode.hyperview.referencedAs.get("_transaction");
res.json({ res.json({
ids: set ? Array.from(set.values()) : [] ids: set ? Array.from(set.values()) : []
}); });
@ -135,7 +135,7 @@ export class HttpApi {
// View a single transaction // View a single transaction
this.router.get('/transaction/:id', (req: express.Request, res: express.Response) => { this.router.get('/transaction/:id', (req: express.Request, res: express.Response) => {
const {params: {id}} = req; const {params: {id}} = req;
const v = this.rhizomeNode.lossless.compose([id]); const v = this.rhizomeNode.hyperview.compose([id]);
const ent = v[id]; const ent = v[id];
if (!ent.referencedAs?.includes("_transaction")) { if (!ent.referencedAs?.includes("_transaction")) {
res.status(400).json({error: "Entity is not a transaction", id}); res.status(400).json({error: "Entity is not a transaction", id});
@ -144,19 +144,19 @@ export class HttpApi {
res.json({ res.json({
...ent, ...ent,
isComplete: this.rhizomeNode.lossless.transactions.isComplete(id) isComplete: this.rhizomeNode.hyperview.transactions.isComplete(id)
}); });
}); });
// Get a lossless view of a single domain entity // Get a hyperview view of a single domain entity
this.router.get('/lossless/:id', (req: express.Request, res: express.Response) => { this.router.get('/hyperview/:id', (req: express.Request, res: express.Response) => {
const {params: {id}} = req; const {params: {id}} = req;
const v = this.rhizomeNode.lossless.compose([id]); const v = this.rhizomeNode.hyperview.compose([id]);
const ent = v[id]; const ent = v[id];
res.json({ res.json({
...ent, ...ent,
isComplete: this.rhizomeNode.lossless.transactions.isComplete(id) isComplete: this.rhizomeNode.hyperview.transactions.isComplete(id)
}); });
}); });
} }
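For completeness, a hedged sketch of calling the renamed endpoints from TypeScript; the host, port, and any router prefix are assumptions and should be read from the node's HTTP API configuration.

```typescript
// Sketch only: the base URL below is an assumption.
const BASE = 'http://localhost:3000/api';

export async function inspectEntity(id: string) {
  // Raw hyperview of a single entity, plus transaction completeness.
  const view = await (await fetch(`${BASE}/hyperview/${id}`)).json();
  console.log(view.propertyDeltas, view.isComplete);

  // All known domain entity IDs (flagged in the handler above as not scalable).
  const { ids } = await (await fetch(`${BASE}/hyperview/ids`)).json();
  console.log(ids);
}
```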

View File

@ -2,7 +2,7 @@ import Debug from 'debug';
import {CREATOR, HTTP_API_ADDR, HTTP_API_ENABLE, HTTP_API_PORT, PEER_ID, PUBLISH_BIND_ADDR, PUBLISH_BIND_HOST, PUBLISH_BIND_PORT, REQUEST_BIND_ADDR, REQUEST_BIND_HOST, REQUEST_BIND_PORT, SEED_PEERS, STORAGE_TYPE, STORAGE_PATH} from './config'; import {CREATOR, HTTP_API_ADDR, HTTP_API_ENABLE, HTTP_API_PORT, PEER_ID, PUBLISH_BIND_ADDR, PUBLISH_BIND_HOST, PUBLISH_BIND_PORT, REQUEST_BIND_ADDR, REQUEST_BIND_HOST, REQUEST_BIND_PORT, SEED_PEERS, STORAGE_TYPE, STORAGE_PATH} from './config';
import {DeltaStream, parseAddressList, PeerAddress, Peers, PubSub, RequestReply} from './network'; import {DeltaStream, parseAddressList, PeerAddress, Peers, PubSub, RequestReply} from './network';
import {HttpServer} from './http/index'; import {HttpServer} from './http/index';
import {Lossless} from './views'; import {Hyperview} from './views';
import {QueryEngine, StorageQueryEngine} from './query'; import {QueryEngine, StorageQueryEngine} from './query';
import {DefaultSchemaRegistry} from './schema'; import {DefaultSchemaRegistry} from './schema';
import {DeltaQueryStorage, StorageFactory, StorageConfig} from './storage'; import {DeltaQueryStorage, StorageFactory, StorageConfig} from './storage';
@ -29,7 +29,7 @@ export class RhizomeNode {
requestReply: RequestReply; requestReply: RequestReply;
httpServer: HttpServer; httpServer: HttpServer;
deltaStream: DeltaStream; deltaStream: DeltaStream;
lossless: Lossless; hyperview: Hyperview;
peers: Peers; peers: Peers;
queryEngine: QueryEngine; queryEngine: QueryEngine;
storageQueryEngine: StorageQueryEngine; storageQueryEngine: StorageQueryEngine;
@ -70,14 +70,14 @@ export class RhizomeNode {
this.httpServer = new HttpServer(this); this.httpServer = new HttpServer(this);
this.deltaStream = new DeltaStream(this); this.deltaStream = new DeltaStream(this);
this.peers = new Peers(this); this.peers = new Peers(this);
this.lossless = new Lossless(this); this.hyperview = new Hyperview(this);
this.schemaRegistry = new DefaultSchemaRegistry(); this.schemaRegistry = new DefaultSchemaRegistry();
// Initialize storage backend // Initialize storage backend
this.deltaStorage = StorageFactory.create(this.config.storage!); this.deltaStorage = StorageFactory.create(this.config.storage!);
// Initialize query engines (both lossless-based and storage-based) // Initialize query engines (both hyperview-based and storage-based)
this.queryEngine = new QueryEngine(this.lossless, this.schemaRegistry); this.queryEngine = new QueryEngine(this.hyperview, this.schemaRegistry);
this.storageQueryEngine = new StorageQueryEngine(this.deltaStorage, this.schemaRegistry); this.storageQueryEngine = new StorageQueryEngine(this.deltaStorage, this.schemaRegistry);
} }
@ -91,10 +91,10 @@ export class RhizomeNode {
}: { syncOnStart?: boolean } = {}): Promise<void> { }: { syncOnStart?: boolean } = {}): Promise<void> {
debug(`[${this.config.peerId}]`, 'Starting node (waiting for ready)...'); debug(`[${this.config.peerId}]`, 'Starting node (waiting for ready)...');
// Connect our lossless view to the delta stream // Connect our hyperview view to the delta stream
this.deltaStream.subscribeDeltas(async (delta) => { this.deltaStream.subscribeDeltas(async (delta) => {
// Ingest into lossless view // Ingest into hyperview view
this.lossless.ingestDelta(delta); this.hyperview.ingestDelta(delta);
// Also store in persistent storage // Also store in persistent storage
try { try {
@ -150,11 +150,11 @@ export class RhizomeNode {
} }
/** /**
* Sync existing lossless view data to persistent storage * Sync existing hyperview view data to persistent storage
* Useful for migrating from memory-only to persistent storage * Useful for migrating from memory-only to persistent storage
*/ */
async syncToStorage(): Promise<void> { async syncToStorage(): Promise<void> {
debug(`[${this.config.peerId}]`, 'Syncing lossless view to storage'); debug(`[${this.config.peerId}]`, 'Syncing hyperview view to storage');
const allDeltas = this.deltaStream.deltasAccepted; const allDeltas = this.deltaStream.deltasAccepted;
let synced = 0; let synced = 0;
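A hedged sketch of the ingest path wired up in `start()` above: build a delta, publish it, and ingest it locally the way the subscribe handler does. The `createDelta` argument semantics and the relative import paths are assumptions based on usage elsewhere in this diff.

```typescript
import { RhizomeNode } from './node';
import { createDelta } from './core';

// Sketch: publish a delta to peers and apply it to the local hyperview,
// mirroring the deltaStream.subscribeDeltas handler shown above.
export async function publishAndIngest(node: RhizomeNode) {
  const delta = createDelta(node.config.peerId, node.config.peerId) // (creator, host) assumed
    .addPointer('name', 'Alice')
    .buildV1();

  await node.deltaStream.publishDelta(delta); // send to peers
  node.hyperview.ingestDelta(delta);          // apply locally, as the subscribe handler does
}
```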

View File

@ -2,7 +2,7 @@ import jsonLogic from 'json-logic-js';
const { apply, is_logic } = jsonLogic; const { apply, is_logic } = jsonLogic;
import Debug from 'debug'; import Debug from 'debug';
import { SchemaRegistry, SchemaID, ObjectSchema } from '../schema/schema'; import { SchemaRegistry, SchemaID, ObjectSchema } from '../schema/schema';
import { Lossless, LosslessViewMany, LosslessViewOne, valueFromDelta } from '../views/lossless'; import { Hyperview, HyperviewViewMany, HyperviewViewOne, valueFromDelta } from '../views/hyperview';
import { DomainEntityID } from '../core/types'; import { DomainEntityID } from '../core/types';
import { Delta, DeltaFilter } from '../core/delta'; import { Delta, DeltaFilter } from '../core/delta';
@ -31,14 +31,14 @@ export interface QueryOptions {
} }
export interface QueryResult { export interface QueryResult {
entities: LosslessViewMany; entities: HyperviewViewMany;
totalFound: number; totalFound: number;
limited: boolean; limited: boolean;
} }
export class QueryEngine { export class QueryEngine {
constructor( constructor(
private lossless: Lossless, private hyperview: Hyperview,
private schemaRegistry: SchemaRegistry private schemaRegistry: SchemaRegistry
) {} ) {}
@ -96,12 +96,12 @@ export class QueryEngine {
const candidateEntityIds = this.discoverEntitiesBySchema(schemaId); const candidateEntityIds = this.discoverEntitiesBySchema(schemaId);
debug(`Found ${candidateEntityIds.length} candidate entities for schema ${schemaId}`); debug(`Found ${candidateEntityIds.length} candidate entities for schema ${schemaId}`);
// 2. Compose lossless views for all candidates // 2. Compose hyperview views for all candidates
const allViews = this.lossless.compose(candidateEntityIds, options.deltaFilter); const allViews = this.hyperview.compose(candidateEntityIds, options.deltaFilter);
debug(`Composed ${Object.keys(allViews).length} lossless views`); debug(`Composed ${Object.keys(allViews).length} hyperview views`);
// 3. Apply JSON Logic filter if provided // 3. Apply JSON Logic filter if provided
let filteredViews: LosslessViewMany = allViews; let filteredViews: HyperviewViewMany = allViews;
if (filter) { if (filter) {
filteredViews = this.applyJsonLogicFilter(allViews, filter, schemaId); filteredViews = this.applyJsonLogicFilter(allViews, filter, schemaId);
@ -132,10 +132,10 @@ export class QueryEngine {
/** /**
* Query for a single entity by ID with schema validation * Query for a single entity by ID with schema validation
*/ */
async queryOne(schemaId: SchemaID, entityId: DomainEntityID): Promise<LosslessViewOne | null> { async queryOne(schemaId: SchemaID, entityId: DomainEntityID): Promise<HyperviewViewOne | null> {
debug(`Querying single entity ${entityId} with schema ${schemaId}`); debug(`Querying single entity ${entityId} with schema ${schemaId}`);
const views = this.lossless.compose([entityId]); const views = this.hyperview.compose([entityId]);
const view = views[entityId]; const view = views[entityId];
if (!view) { if (!view) {
@ -165,7 +165,7 @@ export class QueryEngine {
// Strategy: Find entities that have deltas for the schema's required properties // Strategy: Find entities that have deltas for the schema's required properties
const requiredProperties = schema.requiredProperties || []; const requiredProperties = schema.requiredProperties || [];
const allEntityIds = Array.from(this.lossless.domainEntities.keys()); const allEntityIds = Array.from(this.hyperview.domainEntities.keys());
if (requiredProperties.length === 0) { if (requiredProperties.length === 0) {
// No required properties - return all entities // No required properties - return all entities
@ -177,7 +177,7 @@ export class QueryEngine {
const candidateEntities: DomainEntityID[] = []; const candidateEntities: DomainEntityID[] = [];
for (const entityId of allEntityIds) { for (const entityId of allEntityIds) {
const entity = this.lossless.domainEntities.get(entityId); const entity = this.hyperview.domainEntities.get(entityId);
if (!entity) continue; if (!entity) continue;
// Check if entity has deltas for all required properties // Check if entity has deltas for all required properties
@ -195,28 +195,28 @@ export class QueryEngine {
} }
/** /**
* Apply JSON Logic filter to lossless views * Apply JSON Logic filter to hyperview views
* This requires converting each lossless view to a queryable object * This requires converting each hyperview view to a queryable object
*/ */
private applyJsonLogicFilter( private applyJsonLogicFilter(
views: LosslessViewMany, views: HyperviewViewMany,
filter: JsonLogic, filter: JsonLogic,
schemaId: SchemaID schemaId: SchemaID
): LosslessViewMany { ): HyperviewViewMany {
const schema = this.schemaRegistry.get(schemaId); const schema = this.schemaRegistry.get(schemaId);
if (!schema) { if (!schema) {
debug(`Cannot filter without schema ${schemaId}`); debug(`Cannot filter without schema ${schemaId}`);
return views; return views;
} }
const filteredViews: LosslessViewMany = {}; const filteredViews: HyperviewViewMany = {};
let hasFilterErrors = false; let hasFilterErrors = false;
const filterErrors: string[] = []; const filterErrors: string[] = [];
for (const [entityId, view] of Object.entries(views)) { for (const [entityId, view] of Object.entries(views)) {
try { try {
// Convert lossless view to queryable object using schema // Convert hyperview view to queryable object using schema
const queryableObject = this.losslessViewToQueryableObject(view, schema); const queryableObject = this.hyperviewViewToQueryableObject(view, schema);
// Apply JSON Logic filter // Apply JSON Logic filter
const matches = apply(filter, queryableObject); const matches = apply(filter, queryableObject);
@ -246,16 +246,16 @@ export class QueryEngine {
} }
/** /**
* Convert a lossless view to a queryable object based on schema * Convert a hyperview view to a queryable object based on schema
* Uses simple resolution strategies for now * Uses simple resolution strategies for now
*/ */
private losslessViewToQueryableObject(view: LosslessViewOne, schema: ObjectSchema): Record<string, unknown> { private hyperviewViewToQueryableObject(view: HyperviewViewOne, schema: ObjectSchema): Record<string, unknown> {
const obj: Record<string, unknown> = { const obj: Record<string, unknown> = {
id: view.id, id: view.id,
_referencedAs: view.referencedAs _referencedAs: view.referencedAs
}; };
// Convert each schema property from lossless view deltas // Convert each schema property from hyperview view deltas
for (const [propertyId, propertySchema] of Object.entries(schema.properties)) { for (const [propertyId, propertySchema] of Object.entries(schema.properties)) {
const deltas = view.propertyDeltas[propertyId] || []; const deltas = view.propertyDeltas[propertyId] || [];
@ -315,7 +315,7 @@ export class QueryEngine {
/** /**
* Check if an entity matches a schema (basic validation) * Check if an entity matches a schema (basic validation)
*/ */
private entityMatchesSchema(view: LosslessViewOne, schemaId: SchemaID): boolean { private entityMatchesSchema(view: HyperviewViewOne, schemaId: SchemaID): boolean {
const schema = this.schemaRegistry.get(schemaId); const schema = this.schemaRegistry.get(schemaId);
if (!schema) return false; if (!schema) return false;
@ -337,7 +337,7 @@ export class QueryEngine {
* Get statistics about queryable entities * Get statistics about queryable entities
*/ */
getStats() { getStats() {
const totalEntities = this.lossless.domainEntities.size; const totalEntities = this.hyperview.domainEntities.size;
const registeredSchemas = this.schemaRegistry.list().length; const registeredSchemas = this.schemaRegistry.list().length;
return { return {
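A hedged usage sketch of the renamed query engine; the relative import path and the `'user'` schema id are assumptions, and the node is expected to have already ingested deltas for the entity.

```typescript
import { RhizomeNode } from '../node';

// Sketch: queryOne returns a HyperviewViewOne (raw per-property delta sets) or null.
export async function findUser(node: RhizomeNode, entityId: string) {
  const view = await node.queryEngine.queryOne('user', entityId);
  if (!view) return undefined;
  return {
    id: view.id,
    properties: Object.keys(view.propertyDeltas), // which contexts have deltas
  };
}
```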

View File

@ -14,9 +14,9 @@ import {
SchemaApplicationOptions, SchemaApplicationOptions,
ResolutionContext ResolutionContext
} from '../schema/schema'; } from '../schema/schema';
import { Lossless, LosslessViewOne } from '../views/lossless'; import { Hyperview, HyperviewViewOne } from '../views/hyperview';
import { DomainEntityID, PropertyID, PropertyTypes } from '../core/types'; import { DomainEntityID, PropertyID, PropertyTypes } from '../core/types';
import { CollapsedDelta } from '../views/lossless'; import { CollapsedDelta } from '../views/hyperview';
import { Delta } from '@src/core'; import { Delta } from '@src/core';
const debug = Debug('rz:schema-registry'); const debug = Debug('rz:schema-registry');
@ -103,7 +103,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
} }
} }
validate(entityId: DomainEntityID, schemaId: SchemaID, view: LosslessViewOne): SchemaValidationResult { validate(entityId: DomainEntityID, schemaId: SchemaID, view: HyperviewViewOne): SchemaValidationResult {
const schema = this.get(schemaId); const schema = this.get(schemaId);
if (!schema) { if (!schema) {
return { return {
@ -282,7 +282,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
} }
applySchema( applySchema(
view: LosslessViewOne, view: HyperviewViewOne,
schemaId: SchemaID, schemaId: SchemaID,
options: SchemaApplicationOptions = {} options: SchemaApplicationOptions = {}
): SchemaAppliedView { ): SchemaAppliedView {
@ -333,9 +333,9 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
* Resolves references to other entities according to schema specifications * Resolves references to other entities according to schema specifications
*/ */
applySchemaWithNesting( applySchemaWithNesting(
view: LosslessViewOne, view: HyperviewViewOne,
schemaId: SchemaID, schemaId: SchemaID,
losslessView: Lossless, hyperviewView: Hyperview,
options: SchemaApplicationOptions = {} options: SchemaApplicationOptions = {}
): SchemaAppliedViewWithNesting { ): SchemaAppliedViewWithNesting {
const { maxDepth = 3, includeMetadata = true, strictValidation = false } = options; const { maxDepth = 3, includeMetadata = true, strictValidation = false } = options;
@ -344,16 +344,16 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
return this.resolveNestedView( return this.resolveNestedView(
view, view,
schemaId, schemaId,
losslessView, hyperviewView,
resolutionContext, resolutionContext,
{ includeMetadata, strictValidation } { includeMetadata, strictValidation }
); );
} }
private resolveNestedView( private resolveNestedView(
view: LosslessViewOne, view: HyperviewViewOne,
schemaId: SchemaID, schemaId: SchemaID,
losslessView: Lossless, hyperviewView: Hyperview,
context: ResolutionContext, context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean } options: { includeMetadata: boolean; strictValidation: boolean }
): SchemaAppliedViewWithNesting { ): SchemaAppliedViewWithNesting {
@ -401,7 +401,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedViews = this.resolveReferenceProperty( const nestedViews = this.resolveReferenceProperty(
deltas, deltas,
referenceSchema, referenceSchema,
losslessView, hyperviewView,
context.withDepth(context.currentDepth + 1), context.withDepth(context.currentDepth + 1),
options, options,
view.id view.id
@ -415,7 +415,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedViews = this.resolveReferenceProperty( const nestedViews = this.resolveReferenceProperty(
deltas, deltas,
referenceSchema, referenceSchema,
losslessView, hyperviewView,
context.withDepth(context.currentDepth + 1), context.withDepth(context.currentDepth + 1),
options, options,
view.id view.id
@ -449,7 +449,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
private resolveReferenceProperty( private resolveReferenceProperty(
deltas: Delta[], deltas: Delta[],
referenceSchema: ReferenceSchema, referenceSchema: ReferenceSchema,
losslessView: Lossless, hyperviewView: Hyperview,
context: ResolutionContext, context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean }, options: { includeMetadata: boolean; strictValidation: boolean },
parentEntityId: string parentEntityId: string
@ -469,7 +469,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
delta, delta,
parentEntityId, parentEntityId,
referenceSchema.targetSchema, referenceSchema.targetSchema,
losslessView, hyperviewView,
context, context,
options options
); );
@ -480,8 +480,8 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const referenceIds = this.extractReferenceIdsFromDelta(delta, parentEntityId); const referenceIds = this.extractReferenceIdsFromDelta(delta, parentEntityId);
for (const referenceId of referenceIds) { for (const referenceId of referenceIds) {
try { try {
// Get the referenced entity's lossless view // Get the referenced entity's hyperview view
const referencedViews = losslessView.compose([referenceId]); const referencedViews = hyperviewView.compose([referenceId]);
const referencedView = referencedViews[referenceId]; const referencedView = referencedViews[referenceId];
if (referencedView) { if (referencedView) {
@ -489,7 +489,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedView = this.resolveNestedView( const nestedView = this.resolveNestedView(
referencedView, referencedView,
referenceSchema.targetSchema, referenceSchema.targetSchema,
losslessView, hyperviewView,
context, context,
options options
); );
@ -514,7 +514,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
delta: Delta, delta: Delta,
parentEntityId: string, parentEntityId: string,
targetSchema: SchemaID, targetSchema: SchemaID,
losslessView: Lossless, hyperviewView: Hyperview,
context: ResolutionContext, context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean } options: { includeMetadata: boolean; strictValidation: boolean }
): SchemaAppliedViewWithNesting | null { ): SchemaAppliedViewWithNesting | null {
@ -536,7 +536,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
// Count entity references vs scalars // Count entity references vs scalars
if (typeof target === 'string') { if (typeof target === 'string') {
const referencedViews = losslessView.compose([target]); const referencedViews = hyperviewView.compose([target]);
if (referencedViews[target]) { if (referencedViews[target]) {
entityReferenceCount++; entityReferenceCount++;
} else { } else {
@ -568,7 +568,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
if (typeof target === 'string') { if (typeof target === 'string') {
// Try to resolve as entity reference // Try to resolve as entity reference
try { try {
const referencedViews = losslessView.compose([target]); const referencedViews = hyperviewView.compose([target]);
const referencedView = referencedViews[target]; const referencedView = referencedViews[target];
if (referencedView) { if (referencedView) {
@ -576,7 +576,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedView = this.resolveNestedView( const nestedView = this.resolveNestedView(
referencedView, referencedView,
targetSchema, targetSchema,
losslessView, hyperviewView,
context, context,
options options
); );
@ -601,14 +601,14 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
if (typeof target === 'string') { if (typeof target === 'string') {
// Try to resolve as entity reference // Try to resolve as entity reference
try { try {
const referencedViews = losslessView.compose([target]); const referencedViews = hyperviewView.compose([target]);
const referencedView = referencedViews[target]; const referencedView = referencedViews[target];
if (referencedView) { if (referencedView) {
const nestedView = this.resolveNestedView( const nestedView = this.resolveNestedView(
referencedView, referencedView,
targetSchema, targetSchema,
losslessView, hyperviewView,
context, context,
options options
); );
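A hedged sketch of the nested application path above; the `'user'` schema id, the depth limit, and the import paths are assumptions.

```typescript
import { RhizomeNode } from '../node';
import { DefaultSchemaRegistry } from './schema-registry';

// Sketch: compose an entity's hyperview and expand its reference properties
// through the registry, capped by maxDepth to avoid walking the whole graph.
export function nestedView(node: RhizomeNode, registry: DefaultSchemaRegistry, id: string) {
  const view = node.hyperview.compose([id])[id];
  if (!view) return undefined;
  return registry.applySchemaWithNesting(view, 'user', node.hyperview, {
    maxDepth: 2,          // e.g. user -> friends, but not friends-of-friends
    includeMetadata: true,
  });
}
```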

View File

@ -1,6 +1,6 @@
import { DomainEntityID, PropertyID, PropertyTypes } from "../core/types"; import { DomainEntityID, PropertyID, PropertyTypes } from "../core/types";
import { LosslessViewOne } from "../views/lossless"; import { HyperviewViewOne } from "../views/hyperview";
import { CollapsedDelta } from "../views/lossless"; import { CollapsedDelta } from "../views/hyperview";
// Base schema types // Base schema types
export type SchemaID = string; export type SchemaID = string;
@ -52,7 +52,7 @@ export interface SchemaRegistry {
register(schema: ObjectSchema): void; register(schema: ObjectSchema): void;
get(id: SchemaID): ObjectSchema | undefined; get(id: SchemaID): ObjectSchema | undefined;
list(): ObjectSchema[]; list(): ObjectSchema[];
validate(entityId: DomainEntityID, schemaId: SchemaID, view: LosslessViewOne): SchemaValidationResult; validate(entityId: DomainEntityID, schemaId: SchemaID, view: HyperviewViewOne): SchemaValidationResult;
} }
// Validation result types // Validation result types
@ -76,7 +76,7 @@ export interface SchemaApplicationOptions {
strictValidation?: boolean; strictValidation?: boolean;
} }
// Applied schema result - a lossless view filtered through a schema // Applied schema result - a hyperview view filtered through a schema
export interface SchemaAppliedView { export interface SchemaAppliedView {
id: DomainEntityID; id: DomainEntityID;
schemaId: SchemaID; schemaId: SchemaID;
@ -105,7 +105,7 @@ export interface SchemaAppliedViewWithNesting extends SchemaAppliedView {
export interface TypedCollection<T> { export interface TypedCollection<T> {
schema: ObjectSchema; schema: ObjectSchema;
validate(entity: T): SchemaValidationResult; validate(entity: T): SchemaValidationResult;
apply(view: LosslessViewOne): SchemaAppliedView; apply(view: HyperviewViewOne): SchemaAppliedView;
} }
// Built-in schema helpers // Built-in schema helpers

View File

@ -9,7 +9,7 @@ import {Transactions} from '../features/transactions';
import {DomainEntityID, PropertyID, PropertyTypes, TransactionID, ViewMany} from "../core/types"; import {DomainEntityID, PropertyID, PropertyTypes, TransactionID, ViewMany} from "../core/types";
import {Negation} from '../features/negation'; import {Negation} from '../features/negation';
import {NegationHelper} from '../features/negation'; import {NegationHelper} from '../features/negation';
const debug = Debug('rz:lossless'); const debug = Debug('rz:hyperview');
export type CollapsedPointer = {[key: PropertyID]: PropertyTypes}; export type CollapsedPointer = {[key: PropertyID]: PropertyTypes};
@ -49,7 +49,7 @@ export function valueFromDelta(
} }
// TODO: Store property deltas as references to reduce memory footprint // TODO: Store property deltas as references to reduce memory footprint
export type LosslessViewOne = { export type HyperviewViewOne = {
id: DomainEntityID, id: DomainEntityID,
referencedAs?: string[]; referencedAs?: string[];
propertyDeltas: { propertyDeltas: {
@ -57,21 +57,21 @@ export type LosslessViewOne = {
} }
} }
export type CollapsedViewOne = Omit<LosslessViewOne, 'propertyDeltas'> & { export type CollapsedViewOne = Omit<HyperviewViewOne, 'propertyDeltas'> & {
propertyCollapsedDeltas: { propertyCollapsedDeltas: {
[key: PropertyID]: CollapsedDelta[] [key: PropertyID]: CollapsedDelta[]
} }
}; };
export type LosslessViewMany = ViewMany<LosslessViewOne>; export type HyperviewViewMany = ViewMany<HyperviewViewOne>;
export type CollapsedViewMany = ViewMany<CollapsedViewOne>; export type CollapsedViewMany = ViewMany<CollapsedViewOne>;
class LosslessEntityMap extends Map<DomainEntityID, LosslessEntity> {}; class HyperviewEntityMap extends Map<DomainEntityID, HyperviewEntity> {};
class LosslessEntity { class HyperviewEntity {
properties = new Map<PropertyID, Set<Delta>>(); properties = new Map<PropertyID, Set<Delta>>();
constructor(readonly lossless: Lossless, readonly id: DomainEntityID) {} constructor(readonly hyperview: Hyperview, readonly id: DomainEntityID) {}
addDelta(delta: Delta | DeltaV2) { addDelta(delta: Delta | DeltaV2) {
// Convert DeltaV2 to DeltaV1 if needed // Convert DeltaV2 to DeltaV1 if needed
@ -93,7 +93,7 @@ class LosslessEntity {
propertyDeltas.add(delta); propertyDeltas.add(delta);
} }
debug(`[${this.lossless.rhizomeNode.config.peerId}]`, `entity ${this.id} added delta:`, JSON.stringify(delta)); debug(`[${this.hyperview.rhizomeNode.config.peerId}]`, `entity ${this.id} added delta:`, JSON.stringify(delta));
} }
toJSON() { toJSON() {
@ -103,14 +103,14 @@ class LosslessEntity {
} }
return { return {
id: this.id, id: this.id,
referencedAs: Array.from(this.lossless.referencedAs.get(this.id) ?? []), referencedAs: Array.from(this.hyperview.referencedAs.get(this.id) ?? []),
properties properties
}; };
} }
} }
export class Lossless { export class Hyperview {
domainEntities = new LosslessEntityMap(); domainEntities = new HyperviewEntityMap();
transactions: Transactions; transactions: Transactions;
eventStream = new EventEmitter(); eventStream = new EventEmitter();
@ -160,7 +160,7 @@ export class Lossless {
for (const entityId of affectedEntities) { for (const entityId of affectedEntities) {
let ent = this.domainEntities.get(entityId); let ent = this.domainEntities.get(entityId);
if (!ent) { if (!ent) {
ent = new LosslessEntity(this, entityId); ent = new HyperviewEntity(this, entityId);
this.domainEntities.set(entityId, ent); this.domainEntities.set(entityId, ent);
} }
// Add negation delta to the entity // Add negation delta to the entity
@ -189,7 +189,7 @@ export class Lossless {
let ent = this.domainEntities.get(target); let ent = this.domainEntities.get(target);
if (!ent) { if (!ent) {
ent = new LosslessEntity(this, target); ent = new HyperviewEntity(this, target);
this.domainEntities.set(target, ent); this.domainEntities.set(target, ent);
} }
@ -208,7 +208,7 @@ export class Lossless {
return transactionId; return transactionId;
} }
decompose(view: LosslessViewOne): Delta[] { decompose(view: HyperviewViewOne): Delta[] {
const allDeltas: Delta[] = []; const allDeltas: Delta[] = [];
const seenDeltaIds = new Set<DeltaID>(); const seenDeltaIds = new Set<DeltaID>();
@ -225,8 +225,8 @@ export class Lossless {
return allDeltas; return allDeltas;
} }
compose(entityIds?: DomainEntityID[], deltaFilter?: DeltaFilter): LosslessViewMany { compose(entityIds?: DomainEntityID[], deltaFilter?: DeltaFilter): HyperviewViewMany {
const view: LosslessViewMany = {}; const view: HyperviewViewMany = {};
entityIds = entityIds ?? Array.from(this.domainEntities.keys()); entityIds = entityIds ?? Array.from(this.domainEntities.keys());
for (const entityId of entityIds) { for (const entityId of entityIds) {
@ -391,4 +391,4 @@ export class Lossless {
} }
// TODO: point-in-time queries // TODO: point-in-time queries
} }
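A hedged round-trip sketch over the renamed `Hyperview` class: `compose()` builds per-entity views of all deltas, and `decompose()` recovers the underlying deltas; the import path is an assumption.

```typescript
import { RhizomeNode } from '../node';

// Sketch: inspect an entity's raw hyperview and recover its deltas.
export function roundTrip(node: RhizomeNode, entityId: string) {
  const view = node.hyperview.compose([entityId])[entityId];
  if (!view) return [];

  // Every property resolves to an array of deltas, possibly empty.
  for (const [property, deltas] of Object.entries(view.propertyDeltas)) {
    console.log(property, deltas.length);
  }

  return node.hyperview.decompose(view); // the deltas that compose this entity
}
```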

View File

@ -1,3 +1,3 @@
export * from './lossless'; export * from './hyperview';
export * from './lossy'; export * from './view';
export * from './resolvers'; export * from './resolvers';

View File

@ -1,7 +1,7 @@
import { Lossless, LosslessViewOne } from "../lossless"; import { Hyperview, HyperviewViewOne } from "../hyperview";
import { Lossy } from '../lossy'; import { Lossy } from '../view';
import { DomainEntityID, PropertyID, ViewMany } from "../../core/types"; import { DomainEntityID, PropertyID, ViewMany } from "../../core/types";
import { valueFromDelta } from "../lossless"; import { valueFromDelta } from "../hyperview";
import { EntityRecord, EntityRecordMany } from "@src/core/entity"; import { EntityRecord, EntityRecordMany } from "@src/core/entity";
export type AggregationType = 'min' | 'max' | 'sum' | 'average' | 'count'; export type AggregationType = 'min' | 'max' | 'sum' | 'average' | 'count';
@ -53,13 +53,13 @@ function aggregateValues(values: number[], type: AggregationType): number {
export class AggregationResolver extends Lossy<Accumulator, Result> { export class AggregationResolver extends Lossy<Accumulator, Result> {
constructor( constructor(
lossless: Lossless, hyperview: Hyperview,
private config: AggregationConfig private config: AggregationConfig
) { ) {
super(lossless); super(hyperview);
} }
reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator { reducer(acc: Accumulator, cur: HyperviewViewOne): Accumulator {
if (!acc[cur.id]) { if (!acc[cur.id]) {
acc[cur.id] = { id: cur.id, properties: {} }; acc[cur.id] = { id: cur.id, properties: {} };
} }
@ -111,41 +111,41 @@ export class AggregationResolver extends Lossy<Accumulator, Result> {
// Convenience classes for common aggregation types // Convenience classes for common aggregation types
export class MinResolver extends AggregationResolver { export class MinResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) { constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {}; const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'min'); properties.forEach(prop => config[prop] = 'min');
super(lossless, config); super(hyperview, config);
} }
} }
export class MaxResolver extends AggregationResolver { export class MaxResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) { constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {}; const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'max'); properties.forEach(prop => config[prop] = 'max');
super(lossless, config); super(hyperview, config);
} }
} }
export class SumResolver extends AggregationResolver { export class SumResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) { constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {}; const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'sum'); properties.forEach(prop => config[prop] = 'sum');
super(lossless, config); super(hyperview, config);
} }
} }
export class AverageResolver extends AggregationResolver { export class AverageResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) { constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {}; const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'average'); properties.forEach(prop => config[prop] = 'average');
super(lossless, config); super(hyperview, config);
} }
} }
export class CountResolver extends AggregationResolver { export class CountResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) { constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {}; const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'count'); properties.forEach(prop => config[prop] = 'count');
super(lossless, config); super(hyperview, config);
} }
} }
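A hedged sketch of the renamed aggregation resolvers in use; the `'score'` property and the relative import paths are assumptions.

```typescript
import { RhizomeNode } from '../../node';
import { SumResolver, AverageResolver } from './aggregation-resolvers';

// Sketch: each resolver is a view over the node's hyperview that folds
// numeric property deltas per entity.
export function scoreSummary(node: RhizomeNode, ids: string[]) {
  const sum = new SumResolver(node.hyperview, ['score']);
  const avg = new AverageResolver(node.hyperview, ['score']);
  return {
    totals: sum.resolve(ids),
    averages: avg.resolve(ids),
  };
}
```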

View File

@ -1,5 +1,5 @@
import { PropertyID, PropertyTypes } from "../../../core/types"; import { PropertyID, PropertyTypes } from "../../../core/types";
import { CollapsedDelta } from "../../lossless"; import { CollapsedDelta } from "../../hyperview";
import Debug from 'debug'; import Debug from 'debug';
const debug = Debug('rz:custom-resolver:plugin'); const debug = Debug('rz:custom-resolver:plugin');

View File

@ -1,5 +1,5 @@
import { PropertyTypes } from "../../../../core/types"; import { PropertyTypes } from "../../../../core/types";
import { CollapsedDelta } from "../../../../views/lossless"; import { CollapsedDelta } from "../../../hyperview";
import { ResolverPlugin } from "../plugin"; import { ResolverPlugin } from "../plugin";
import Debug from 'debug'; import Debug from 'debug';
const debug = Debug('rz:concatenation-plugin'); const debug = Debug('rz:concatenation-plugin');

View File

@ -1,5 +1,5 @@
import { PropertyTypes } from "../../../../core/types"; import { PropertyTypes } from "../../../../core/types";
import { CollapsedDelta } from "../../../lossless"; import { CollapsedDelta } from "../../../hyperview";
import { ResolverPlugin } from "../plugin"; import { ResolverPlugin } from "../plugin";
type FirstWriteWinsState = { type FirstWriteWinsState = {

View File

@ -1,5 +1,5 @@
import { PropertyTypes } from "../../../../core/types"; import { PropertyTypes } from "../../../../core/types";
import { CollapsedDelta } from "../../../lossless"; import { CollapsedDelta } from "../../../hyperview";
import { ResolverPlugin } from "../plugin"; import { ResolverPlugin } from "../plugin";
type LastWriteWinsState = { type LastWriteWinsState = {

View File

@ -1,5 +1,5 @@
import { PropertyTypes } from "@src/core/types"; import { PropertyTypes } from "@src/core/types";
import { CollapsedDelta } from "@src/views/lossless"; import { CollapsedDelta } from "@src/views/hyperview";
import { ResolverPlugin, DependencyStates } from "../plugin"; import { ResolverPlugin, DependencyStates } from "../plugin";
type RunningAverageState = { type RunningAverageState = {

View File

@ -1,5 +1,5 @@
import { Lossless, LosslessViewOne } from "../../lossless"; import { Hyperview, HyperviewViewOne } from "../../hyperview";
import { Lossy } from '../../lossy'; import { Lossy } from '../../view';
import { DomainEntityID, PropertyID, PropertyTypes } from "../../../core/types"; import { DomainEntityID, PropertyID, PropertyTypes } from "../../../core/types";
import { ResolverPlugin, DependencyStates } from "./plugin"; import { ResolverPlugin, DependencyStates } from "./plugin";
import { EntityRecord } from "@src/core/entity"; import { EntityRecord } from "@src/core/entity";
@ -46,14 +46,14 @@ export class CustomResolver extends Lossy<Accumulator, Result> {
/** /**
* Creates a new CustomResolver instance * Creates a new CustomResolver instance
* @param lossless - The Lossless instance to use for delta tracking * @param hyperview - The Hyperview instance to use for delta tracking
* @param config - A mapping of property IDs to their resolver plugins * @param config - A mapping of property IDs to their resolver plugins
*/ */
constructor( constructor(
lossless: Lossless, hyperview: Hyperview,
config: PluginMap config: PluginMap
) { ) {
super(lossless); super(hyperview);
this.config = config; this.config = config;
this.buildDependencyGraph(); this.buildDependencyGraph();
this.executionOrder = this.calculateExecutionOrder(); this.executionOrder = this.calculateExecutionOrder();
@ -244,7 +244,7 @@ export class CustomResolver extends Lossy<Accumulator, Result> {
/** /**
* Update the state with new deltas from the view * Update the state with new deltas from the view
*/ */
reducer(acc: Accumulator, {id: entityId, propertyDeltas}: LosslessViewOne): Accumulator { reducer(acc: Accumulator, {id: entityId, propertyDeltas}: HyperviewViewOne): Accumulator {
debug(`Processing deltas for entity: ${entityId}`); debug(`Processing deltas for entity: ${entityId}`);
debug('Property deltas:', JSON.stringify(propertyDeltas)); debug('Property deltas:', JSON.stringify(propertyDeltas));

View File

@ -1,6 +1,6 @@
import { EntityProperties } from "../../core/entity"; import { EntityProperties } from "../../core/entity";
import { Lossless, CollapsedDelta, valueFromDelta, LosslessViewOne } from "../lossless"; import { Hyperview, CollapsedDelta, valueFromDelta, HyperviewViewOne } from "../hyperview";
import { Lossy } from '../lossy'; import { Lossy } from '../view';
import { DomainEntityID, PropertyID, PropertyTypes, Timestamp, ViewMany } from "../../core/types"; import { DomainEntityID, PropertyID, PropertyTypes, Timestamp, ViewMany } from "../../core/types";
import Debug from 'debug'; import Debug from 'debug';
@ -77,13 +77,13 @@ function compareWithTieBreaking(
export class TimestampResolver extends Lossy<Accumulator, Result> { export class TimestampResolver extends Lossy<Accumulator, Result> {
constructor( constructor(
lossless: Lossless, hyperview: Hyperview,
private tieBreakingStrategy: TieBreakingStrategy = 'delta-id' private tieBreakingStrategy: TieBreakingStrategy = 'delta-id'
) { ) {
super(lossless); super(hyperview);
} }
reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator { reducer(acc: Accumulator, cur: HyperviewViewOne): Accumulator {
if (!acc[cur.id]) { if (!acc[cur.id]) {
acc[cur.id] = { id: cur.id, properties: {} }; acc[cur.id] = { id: cur.id, properties: {} };
} }
@ -138,26 +138,26 @@ export class TimestampResolver extends Lossy<Accumulator, Result> {
// Convenience classes for different tie-breaking strategies // Convenience classes for different tie-breaking strategies
export class CreatorIdTimestampResolver extends TimestampResolver { export class CreatorIdTimestampResolver extends TimestampResolver {
constructor(lossless: Lossless) { constructor(hyperview: Hyperview) {
super(lossless, 'creator-id'); super(hyperview, 'creator-id');
} }
} }
export class DeltaIdTimestampResolver extends TimestampResolver { export class DeltaIdTimestampResolver extends TimestampResolver {
constructor(lossless: Lossless) { constructor(hyperview: Hyperview) {
super(lossless, 'delta-id'); super(hyperview, 'delta-id');
} }
} }
export class HostIdTimestampResolver extends TimestampResolver { export class HostIdTimestampResolver extends TimestampResolver {
constructor(lossless: Lossless) { constructor(hyperview: Hyperview) {
super(lossless, 'host-id'); super(hyperview, 'host-id');
} }
} }
export class LexicographicTimestampResolver extends TimestampResolver { export class LexicographicTimestampResolver extends TimestampResolver {
constructor(lossless: Lossless) { constructor(hyperview: Hyperview) {
super(lossless, 'lexicographic'); super(hyperview, 'lexicographic');
} }
} }
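The convenience subclasses above only pin the tie-breaking strategy. A brief sketch of how they might be wired up; the module path and the `hyperview` instance are assumptions, the constructor shapes come from the hunk.

```typescript
// Sketch only -- the import path is assumed, not part of this commit.
import { Hyperview } from '../hyperview';
import { TimestampResolver, CreatorIdTimestampResolver } from './timestamp-resolvers';

declare const hyperview: Hyperview; // provided by the running node

// Two equivalent ways to get last-write-wins resolution with creator-id tie-breaking:
const explicit = new TimestampResolver(hyperview, 'creator-id');
const shorthand = new CreatorIdTimestampResolver(hyperview);

// Both reduce HyperviewViewOne updates into per-entity property maps and break
// equal-timestamp conflicts deterministically using the chosen strategy.
```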

View File

@ -1,12 +1,12 @@
// We have the lossless transformation of the delta stream. // We have the hyperview transformation of the delta stream.
// We want to enable transformations from the lossless view, // We want to enable transformations from the hyperview,
// into various possible "lossy" views that combine or exclude some information. // into various possible views that combine or exclude some information.
import Debug from 'debug'; import Debug from 'debug';
import {Delta, DeltaFilter, DeltaID} from "../core/delta"; import {Delta, DeltaFilter, DeltaID} from "../core/delta";
import {Lossless, LosslessViewOne} from "./lossless"; import {Hyperview, HyperviewViewOne} from "./hyperview";
import {DomainEntityID, PropertyID, PropertyTypes, ViewMany} from "../core/types"; import {DomainEntityID, PropertyID, PropertyTypes, ViewMany} from "../core/types";
const debug = Debug('rz:lossy'); const debug = Debug('rz:view');
type PropertyMap = Record<PropertyID, PropertyTypes>; type PropertyMap = Record<PropertyID, PropertyTypes>;
@ -17,20 +17,20 @@ export type LossyViewOne<T = PropertyMap> = {
export type LossyViewMany<T = PropertyMap> = ViewMany<LossyViewOne<T>>; export type LossyViewMany<T = PropertyMap> = ViewMany<LossyViewOne<T>>;
// We support incremental updates of lossy models. // We support incremental updates of view models.
export abstract class Lossy<Accumulator, Result = Accumulator> { export abstract class Lossy<Accumulator, Result = Accumulator> {
deltaFilter?: DeltaFilter; deltaFilter?: DeltaFilter;
private accumulator?: Accumulator; private accumulator?: Accumulator;
initializer?(): Accumulator; initializer?(): Accumulator;
abstract reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator; abstract reducer(acc: Accumulator, cur: HyperviewViewOne): Accumulator;
resolver?(acc: Accumulator, entityIds: DomainEntityID[]): Result; resolver?(acc: Accumulator, entityIds: DomainEntityID[]): Result;
constructor( constructor(
readonly lossless: Lossless, readonly hyperview: Hyperview,
) { ) {
this.lossless.eventStream.on("updated", (id, deltaIds) => { this.hyperview.eventStream.on("updated", (id, deltaIds) => {
debug(`[${this.lossless.rhizomeNode.config.peerId}] entity ${id} updated, deltaIds:`, debug(`[${this.hyperview.rhizomeNode.config.peerId}] entity ${id} updated, deltaIds:`,
JSON.stringify(deltaIds)); JSON.stringify(deltaIds));
this.ingestUpdate(id, deltaIds); this.ingestUpdate(id, deltaIds);
@ -46,16 +46,16 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
if (!this.deltaFilter) return true; if (!this.deltaFilter) return true;
return this.deltaFilter(delta); return this.deltaFilter(delta);
}; };
const losslessPartial = this.lossless.compose([entityId], combinedFilter); const hyperviewPartial = this.hyperview.compose([entityId], combinedFilter);
if (!losslessPartial) { if (!hyperviewPartial) {
// This should not happen; this should only be called after the lossless view has been updated // This should not happen; this should only be called after the hyperview has been updated
console.error(`Lossless view for entity ${entityId} not found`); console.error(`Hyperview for entity ${entityId} not found`);
return; return;
} }
const latest = this.accumulator || this.initializer?.() || {} as Accumulator; const latest = this.accumulator || this.initializer?.() || {} as Accumulator;
this.accumulator = this.reducer(latest, losslessPartial[entityId]); this.accumulator = this.reducer(latest, hyperviewPartial[entityId]);
} }
// Resolve the current state of the view // Resolve the current state of the view
@ -65,7 +65,7 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
} }
if (!entityIds) { if (!entityIds) {
entityIds = Array.from(this.lossless.domainEntities.keys()); entityIds = Array.from(this.hyperview.domainEntities.keys());
} }
if (!this.resolver) { if (!this.resolver) {
@ -76,3 +76,5 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
} }
} }
// "Lossy" can simply be called "View"
export abstract class View<Accumulator, Result> extends Lossy<Accumulator, Result> {};
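To make the reducer/resolver contract concrete, here is a minimal sketch of a subclass of the new `View` alias. The import paths, the `resolve()` call, and the assumption that `propertyDeltas` maps each property to an array of deltas are mine, not the commit's.

```typescript
// Illustrative subclass only -- paths and the propertyDeltas shape are assumptions.
import { Hyperview, HyperviewViewOne } from './hyperview';
import { View } from './view';

type Counts = Record<string, Record<string, number>>;

// A trivial view that counts how many deltas touch each property of each entity.
export class DeltaCountView extends View<Counts, Counts> {
  initializer(): Counts {
    return {};
  }

  reducer(acc: Counts, { id, propertyDeltas }: HyperviewViewOne): Counts {
    acc[id] ??= {};
    for (const [property, deltas] of Object.entries(propertyDeltas)) {
      acc[id][property] = (acc[id][property] ?? 0) + deltas.length;
    }
    return acc;
  }

  resolver(acc: Counts): Counts {
    return acc;
  }
}

// Usage (assuming a resolve() entry point on the base class):
// const counts = new DeltaCountView(hyperview).resolve();
```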

View File

@ -7,7 +7,7 @@
- **Delta Lifecycle**: - **Delta Lifecycle**:
- Creation via `DeltaBuilder` - Creation via `DeltaBuilder`
- Propagation through `DeltaStream` - Propagation through `DeltaStream`
- Storage in `Lossless` view - Storage in `Hyperview` view
- Transformation in `Lossy` views - Transformation in `Lossy` views
### 2. Network Layer ### 2. Network Layer
@ -22,7 +22,7 @@
### 3. Storage ### 3. Storage
- **In-Memory Storage**: - **In-Memory Storage**:
- `Lossless` view maintains complete delta history - `Hyperview` view maintains complete delta history
- `Lossy` views provide optimized access patterns - `Lossy` views provide optimized access patterns
- **Persistence**: - **Persistence**:
- LevelDB integration - LevelDB integration
@ -49,8 +49,8 @@
- `request-reply.ts`: Direct node communication - `request-reply.ts`: Direct node communication
3. **Views**: 3. **Views**:
- `lossless.ts`: Complete delta history - `hyperview.ts`: Complete delta history
- `lossy.ts`: Derived, optimized views - `view.ts`: Derived, optimized views
4. **Schema**: 4. **Schema**:
- `schema.ts`: Type definitions - `schema.ts`: Type definitions


todo.md
View File

@ -12,7 +12,7 @@ This document tracks work needed to achieve full specification compliance, organ
- [x] Add validation for pointer consistency - [x] Add validation for pointer consistency
### 1.2 Complete Transaction Support ✅ (mostly) ### 1.2 Complete Transaction Support ✅ (mostly)
- [x] Implement transaction-based filtering in lossless views - [x] Implement transaction-based filtering in hyperviews
- [x] Add transaction grouping in delta streams - [x] Add transaction grouping in delta streams
- [x] Test atomic transaction operations - [x] Test atomic transaction operations
- [ ] Add transaction rollback capabilities (deferred - not critical for spec parity) - [ ] Add transaction rollback capabilities (deferred - not critical for spec parity)
@ -29,8 +29,8 @@ This document tracks work needed to achieve full specification compliance, organ
### 2.1 Negation Deltas ✅ ### 2.1 Negation Deltas ✅
- [x] Implement negation delta type with "negates" pointer - [x] Implement negation delta type with "negates" pointer
- [x] Add "negated_by" context handling - [x] Add "negated_by" context handling
- [x] Update lossless view to handle negations - [x] Update hyperview to handle negations
- [x] Update lossy resolvers to respect negations - [x] Update view resolvers to respect negations
- [x] Add comprehensive negation tests - [x] Add comprehensive negation tests
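As a rough illustration of what the negation items above imply: a negation delta cancels an earlier delta via a "negates" pointer, surfaced on the target as "negated_by". The pointer field names below are hypothetical; only the "negates"/"negated_by" vocabulary comes from the checklist.

```typescript
// Hypothetical delta shapes -- field names are illustrative, not the actual Delta schema.
const assertion = {
  id: 'delta-a',
  pointers: [{ localContext: 'title', target: 'movie-1', targetContext: 'title' }],
};

// The negation delta references the delta it cancels through a "negates" pointer;
// the hyperview exposes this on the target as "negated_by", and view resolvers
// skip negated deltas when collapsing property values.
const negation = {
  id: 'delta-b',
  pointers: [{ localContext: 'negates', target: assertion.id, targetContext: 'negated_by' }],
};

console.log(negation.pointers[0].target); // 'delta-a'
```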
### 2.2 Advanced Conflict Resolution ✅ ### 2.2 Advanced Conflict Resolution ✅
@ -50,7 +50,7 @@ This document tracks work needed to achieve full specification compliance, organ
### 3.1 Query Engine Foundation ✅ ### 3.1 Query Engine Foundation ✅
- [x] Implement JSON Logic parser (using json-logic-js) - [x] Implement JSON Logic parser (using json-logic-js)
- [x] Create query planner for lossless views - [x] Create query planner for hyperviews
- [x] Add query execution engine (QueryEngine class) - [x] Add query execution engine (QueryEngine class)
- [x] Implement schema-driven entity discovery - [x] Implement schema-driven entity discovery
- [x] Enable the skipped query tests - [x] Enable the skipped query tests
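Phase 3.1 builds on the real `json-logic-js` package. A tiny, standalone sketch of the kind of rule the query engine evaluates against resolved entity properties; the rule and data are invented for illustration.

```typescript
import jsonLogic from 'json-logic-js';

// Select entities whose resolved "age" property exceeds 21.
const rule = { '>': [{ var: 'age' }, 21] };

console.log(jsonLogic.apply(rule, { age: 30 })); // true
console.log(jsonLogic.apply(rule, { age: 18 })); // false
```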
@ -91,7 +91,7 @@ This document tracks work needed to achieve full specification compliance, organ
### 4.4 Schema-as-Deltas (Meta-Schema System) ### 4.4 Schema-as-Deltas (Meta-Schema System)
- [ ] Define schema entities that are stored as deltas in the system - [ ] Define schema entities that are stored as deltas in the system
- [ ] Implement schema queries that return schema instances from lossless views - [ ] Implement schema queries that return schema instances from hyperviews
- [ ] Create schema evolution through delta mutations - [ ] Create schema evolution through delta mutations
- [ ] Add temporal schema queries (schema time-travel) - [ ] Add temporal schema queries (schema time-travel)
- [ ] Build schema conflict resolution for competing schema definitions - [ ] Build schema conflict resolution for competing schema definitions
@ -186,7 +186,7 @@ This document tracks work needed to achieve full specification compliance, organ
1. **Phase 1** ✅ - These are foundational requirements 1. **Phase 1** ✅ - These are foundational requirements
2. **Phase 2.1 (Negation)** ✅ - Core spec feature that affects all views 2. **Phase 2.1 (Negation)** ✅ - Core spec feature that affects all views
3. **Phase 2.2 (Resolvers)** ✅ - Needed for proper lossy views 3. **Phase 2.2 (Resolvers)** ✅ - Needed for proper views
4. **Phase 2.3 (Nesting)** ✅ - Depends on schemas and queries 4. **Phase 2.3 (Nesting)** ✅ - Depends on schemas and queries
5. **Phase 3 (Query)** ✅ - Unlocks powerful data access 5. **Phase 3 (Query)** ✅ - Unlocks powerful data access
6. **Phase 4 (Relational)** - Builds on query system 6. **Phase 4 (Relational)** - Builds on query system